For HARRI to drive autonomously, he needs an intelligent driving system that reacts and makes decisions independently, based on information received from the environment. To this end, he has been equipped with the necessary sensor technology: twelve cameras, four laser-based LIDAR systems for measuring distance and speed, and ultrasonic sensors for precise detection of anything in close proximity. In addition, there are sensors for longitudinal and lateral acceleration.
The highly efficient software, based on AUTOSAR Classic and Adaptive, was developed specifically by our software specialists. It deals with the diverse problems HARRI will encounter on the road: recognizing the environment (objects and obstacles), localizing and positioning the vehicle, and strategic maneuver planning. On top of this come higher-level functions such as planning trajectories and motion paths.
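The pipeline described above — perception, localization, planning — can be sketched in a few lines. This is a deliberately simplified illustration with hypothetical names and thresholds, not HARRI's actual AUTOSAR-based software:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical, heavily simplified pipeline stages for illustration only.

@dataclass
class Obstacle:
    x: float  # meters ahead of the vehicle
    y: float  # meters left (+) / right (-) of the vehicle axis

@dataclass
class Pose:
    x: float
    y: float
    heading: float  # radians

def detect_obstacles(raw_sensor_frame: list) -> List[Obstacle]:
    """Perception: turn raw sensor detections into obstacle objects."""
    return [Obstacle(x=d[0], y=d[1]) for d in raw_sensor_frame]

def localize(gps_fix: tuple, imu_heading: float) -> Pose:
    """Localization: combine a GPS position with the IMU heading."""
    return Pose(x=gps_fix[0], y=gps_fix[1], heading=imu_heading)

def plan(pose: Pose, obstacles: List[Obstacle]) -> str:
    """Strategic planning: pick a maneuver based on the situation."""
    blocked = any(o.x < 10.0 and abs(o.y) < 1.5 for o in obstacles)
    return "stop" if blocked else "follow_lane"

# One cycle of the pipeline with made-up sensor values
obstacles = detect_obstacles([(5.0, 0.2), (30.0, -4.0)])
pose = localize((12.3, 45.6), imu_heading=0.05)
maneuver = plan(pose, obstacles)
print(maneuver)  # → stop (an obstacle 5 m ahead, near the vehicle axis)
```

In a real system each stage runs as its own software component; the point here is only the data flow from sensors to maneuver decision.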
Complex control systems
An additional challenge is meaningfully evaluating the enormous amounts of data delivered by the sensors and GPS for the individual functions, and aggregating them in real time into an overall composite image. Take longitudinal and lateral guidance, for example: to keep HARRI securely on the trajectory, steering, powertrain, and brakes must interact optimally with the sensor and navigation technology. This complex control system calls not only for enormous computing power but also for very sophisticated algorithms.
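The lateral-guidance idea can be made concrete with a minimal sketch: a PD controller that steers a point-mass vehicle back onto a straight reference trajectory. The gains and the simplified yaw dynamics are illustrative assumptions, not HARRI's actual control design:

```python
import math

# Minimal sketch of lateral guidance: a PD controller steering the
# vehicle back onto a straight reference trajectory (y = 0).
# Gains and the simplified yaw dynamics are illustrative only.

def pd_steering(cross_track_error: float, heading_error: float,
                kp: float = 0.8, kd: float = 1.5) -> float:
    """Return a steering command from lateral offset and heading error."""
    return -(kp * cross_track_error + kd * heading_error)

# Simulate a vehicle that starts 1 m off the trajectory
y, heading = 1.0, 0.0       # lateral offset (m), heading error (rad)
speed, dt = 10.0, 0.05      # m/s, s
for _ in range(400):        # 20 s of simulated driving
    steer = pd_steering(y, heading)
    heading += steer * dt                  # simplified yaw dynamics
    y += speed * math.sin(heading) * dt    # lateral motion

print(round(abs(y), 3))  # remaining lateral error: close to 0
```

A production controller would of course model the vehicle dynamics properly and coordinate with powertrain and brakes; the sketch only shows why sensor feedback and control must run in a tight real-time loop.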
The future is about machine learning
For a while to come, HARRI will move only around our own testing grounds, where he is being prepared for public road traffic. Considerable learning progress is still required before he not only senses objects but also unambiguously identifies them and derives the appropriate driving maneuvers. Machine learning is the key to enabling him to interpret unknown data as well and to manage situations safely and reliably.
- ADC (Autonomous Drive Domain Controller) with algorithms developed in-house
- AI-based environment and object detection/recognition
- Sensor data fusion
- Trajectory planning
- Longitudinal and lateral guidance/control
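Of the capabilities listed above, sensor data fusion is perhaps the easiest to illustrate in miniature. The sketch below fuses two noisy distance estimates (say, LIDAR and camera) by weighting each with the inverse of its variance — the core idea behind Kalman-style fusion. All numbers are invented for the example:

```python
# Illustrative sketch of sensor data fusion: combine two independent,
# noisy distance estimates by inverse-variance weighting. The sensor
# values and variances are made up for the example.

def fuse(z1: float, var1: float, z2: float, var2: float):
    """Variance-weighted fusion of two independent measurements."""
    w1 = var2 / (var1 + var2)   # trust z1 more when var1 is small
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused, fused_var

lidar = (20.1, 0.01)   # precise distance estimate (m, variance)
camera = (21.0, 0.25)  # noisier estimate from vision
d, v = fuse(lidar[0], lidar[1], camera[0], camera[1])
print(round(d, 3), round(v, 4))  # fused estimate sits near the LIDAR value
```

Note that the fused variance is smaller than either input variance — combining sensors yields a more confident estimate than any single sensor alone, which is exactly why the data streams are aggregated into one composite image.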