Environment perception for autonomous driving

Chair: Institute of Control Theory and Systems Engineering

Supervisors: Manuel Schmidt, Niklas Stannartz

Start date: 11.10.2018

Maximum number of participants: 8

Description: Nowadays, autonomous driving is no longer just a futuristic dream but may become a reality in the coming years. The automotive industry is continuously working on new Advanced Driver Assistance Systems (ADAS) that will initially assist the driver in various safety-critical situations and eventually take over the driving task completely. Besides established car manufacturers such as Daimler and Tesla, IT giants like Google in particular are extensively testing their prototype vehicles at the moment.

The basis for autonomous driving is enabling the vehicle to “see” its environment like a human. This visual sense is provided by environmental sensors such as camera, radar and, more recently, lidar sensors, which deliver an extremely high-resolution image of the environment even at night or in bad weather, where camera sensors fail. A complementary use of different sensor technologies is therefore mandatory for an intelligent and accident-free autonomous car.
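To illustrate why complementary sensors help, the following is a minimal sketch of combining two independent range measurements of the same object by inverse-variance weighting, a standard building block of sensor fusion. The sensor roles and noise values are purely illustrative assumptions, not part of the project description.

```python
def fuse(z1, var1, z2, var2):
    """Fuse two independent measurements of the same quantity.

    Each measurement is weighted by the inverse of its variance; the
    fused variance is smaller than either input, which is why combining
    e.g. a coarse-but-robust radar reading with a precise lidar reading
    improves the overall estimate.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)
    return z_fused, var_fused

# Assumed example: radar range (noisy) and lidar range (precise)
z, var = fuse(10.4, 0.25, 10.1, 0.04)
print(f"fused range: {z:.2f} m, variance: {var:.3f}")
```

The fused estimate lands closer to the more trustworthy (lower-variance) sensor, and its variance is below both inputs.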

This project group aims at developing algorithms that enable the vehicle to “see” and “understand” its environment. These algorithms will be developed and tested in the driving simulator as well as in the institute's prototype vehicle. Possible work packages are:

- Localization and mapping: To perform autonomous driving tasks, the vehicle has to know exactly where it is relative to its environment. This usually involves generating a map as well as localizing the vehicle within this map. Here, state-of-the-art algorithms from the field of robotics show the most promising results, which can be evaluated using an on-board RTK-GPS system with centimeter precision.

- Object detection, classification, fusion and tracking: In order to interact appropriately in traffic, the vehicle needs to perceive the dynamic as well as the static “participants” of its environment, such as other vehicles, pedestrians, traffic lights and lane markings. Not only must the exact positions of these objects be tracked, but their type must also be determined. Finally, the measurements from all sensors have to be fused.
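A common baseline for the tracking part of the second work package is a Kalman filter with a constant-velocity motion model. The sketch below tracks a single object's position and velocity from noisy position measurements; all matrices, noise levels and the simulation setup are illustrative assumptions, not the project's prescribed method.

```python
import numpy as np

dt = 0.1  # sample time in seconds (assumed)

# State vector: [position, velocity]; constant-velocity motion model
F = np.array([[1.0, dt],
              [0.0, 1.0]])    # state transition matrix
H = np.array([[1.0, 0.0]])    # measurement model: position only
Q = np.diag([0.01, 0.1])      # process noise covariance (assumed)
R = np.array([[0.25]])        # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle for a single position measurement z."""
    # Predict: propagate state and covariance through the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the measurement
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Simulate noisy position measurements of an object moving at 1 m/s
rng = np.random.default_rng(0)
x = np.array([[0.0], [0.0]])  # initial state guess
P = np.eye(2)                 # initial state covariance
for k in range(50):
    true_pos = 1.0 * (k + 1) * dt
    z = np.array([[true_pos + rng.normal(0.0, 0.5)]])
    x, P = kalman_step(x, P, z)

print(f"estimated position: {x[0, 0]:.2f} m, velocity: {x[1, 0]:.2f} m/s")
```

After 50 steps the filter's position and velocity estimates converge towards the simulated ground truth (5 m travelled at 1 m/s); the same predict/update structure extends to multi-object tracking and to fusing measurements from several sensors.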

Skills in Matlab, Python and ROS are desirable, as well as an interest in automotive technologies.