Environment perception for automated driving

Chair: Institute of Control Theory and Systems Engineering

Supervisor: Niklas Stannartz,

Start: 01.04.2020

Maximum number of participants: 6

Description: Nowadays, autonomous driving is no longer just a futuristic dream, but may become a reality within the next few years. The automotive industry is continuously working on new Advanced Driver Assistance Systems (ADAS) that will initially assist the driver in various safety-critical situations and eventually take over the driving task completely. Besides established car manufacturers such as Daimler and Tesla, IT giants such as Google in particular are currently testing their prototype vehicles extensively.
The basis for autonomous driving is to make the vehicle “see” and “understand” its environment like a human. This visual sense is provided by environmental sensors such as camera, radar and, more recently, lidar sensors, which deliver an extremely high-resolution image of the environment even at night or in bad weather, where camera sensors fail. The complementary use of different sensor technologies is therefore mandatory for an intelligent and accident-free autonomous car.
The Institute of Control Theory and Systems Engineering owns a test vehicle equipped with state-of-the-art camera and lidar sensors that generate a 360° surround view of the environment. Furthermore, it is equipped with computing hardware capable of processing the measurement data in real time. The ROS framework is used for the communication between the sensors and the computing hardware.
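To illustrate how the measurement data can be accessed through ROS, the following minimal Python (rospy) sketch subscribes to a lidar point cloud; the topic name /velodyne_points is an assumption and may differ from the test vehicle's actual configuration.

```python
#!/usr/bin/env python
# Minimal ROS subscriber sketch. The topic name "/velodyne_points" is an
# assumed example; the test vehicle's actual lidar topic may differ.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def cloud_callback(msg):
    # Extract the (x, y, z) coordinates of all valid points in the scan
    points = list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("Received %d lidar points", len(points))

if __name__ == "__main__":
    rospy.init_node("lidar_listener")
    rospy.Subscriber("/velodyne_points", PointCloud2, cloud_callback)
    rospy.spin()  # process incoming messages until shutdown
```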
This project group aims at the development of algorithms that enable the vehicle to “see” and “understand” its environment. The students will work directly with the software and hardware of the test vehicle. Possible work packages are:
• Object detection and tracking within lidar point cloud data: The Matlab Sensor Fusion Toolbox includes functions for processing point cloud data in order to detect and track other vehicles. C code can be generated from the detection and tracking algorithms, which should then be integrated into the existing ROS framework.
• Commissioning of the Mobileye development kit: The test vehicle is additionally equipped with a Mobileye camera, a commercially available sensor kit that implements ADAS functions such as lane departure warning and the detection of other vehicles. The camera sensor shall be commissioned using available ROS drivers.
• Visualization: ROS already provides tools for the visualization of measurement data. The current visualization shall be enhanced such that detected objects (from the lidar or the Mobileye camera) are visualized intuitively (a minimal sketch is given after this list).
• Further work packages may include sensor fusion, synchronization or calibration as well as the application of machine learning algorithms for environment perception.
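As a hedged starting point for the visualization work package, the sketch below publishes a detected object as a bounding-box marker that RViz can display; the topic name, coordinate frame and object dimensions are illustrative assumptions rather than the project's actual interface.

```python
#!/usr/bin/env python
# Sketch: publish a detected object as an RViz bounding-box marker.
# Topic name, frame_id and dimensions are illustrative assumptions.
import rospy
from visualization_msgs.msg import Marker

def make_box_marker(x, y, z):
    marker = Marker()
    marker.header.frame_id = "base_link"   # assumed vehicle frame
    marker.header.stamp = rospy.Time.now()
    marker.ns = "detected_objects"
    marker.id = 0
    marker.type = Marker.CUBE
    marker.action = Marker.ADD
    marker.pose.position.x = x
    marker.pose.position.y = y
    marker.pose.position.z = z
    marker.pose.orientation.w = 1.0
    marker.scale.x, marker.scale.y, marker.scale.z = 4.5, 1.8, 1.5  # rough car size in metres
    marker.color.g, marker.color.a = 0.8, 0.8   # semi-transparent green
    marker.lifetime = rospy.Duration(0.2)       # marker disappears if not refreshed
    return marker

if __name__ == "__main__":
    rospy.init_node("object_marker_demo")
    pub = rospy.Publisher("detected_objects_markers", Marker, queue_size=10)
    rate = rospy.Rate(5)
    while not rospy.is_shutdown():
        pub.publish(make_box_marker(10.0, 0.0, 0.75))  # dummy detection 10 m ahead
        rate.sleep()
```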

The project group will take place!

Participants: 180418, 215802, 215901, 183983, 185725, 175192, 187071, 198363,