Robust people detection through sensor data fusion
 
Application domains such as automotive, smart buildings and industrial automation require reliable, robust, real-time detection and tracking of nearby persons under all weather and lighting conditions, including fog, dirt, rain and direct sunlight.
 
Video cameras can detect persons accurately, but their performance degrades quickly under poor lighting conditions. Radars are much more resilient to these impairments, but their angular resolution is limited.
Fusing radar and camera detections combines the benefits of both: it ensures detection and tracking of persons under a diversity of environmental and weather conditions.
 
In this demo, imec presents real-time sensor fusion on a resource-constrained embedded platform. Machine learning has been applied to both camera-based and radar-based people detection, the latter being relatively unexplored terrain. Both the stereo-camera-based and the radar-based real-time detection algorithms have been implemented on an off-the-shelf embedded NVIDIA Jetson TX1 platform. On the same platform, a fusion tracker combines the track information from the individual sensors into a single robust and reliable track of each person within sensor range. The demo shows person detection in real time under varying environmental conditions, mimicking a variety of weather conditions.
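As an illustration of how a fusion tracker might combine per-sensor estimates, the sketch below fuses two independent position estimates by inverse-variance weighting, so that the more confident sensor dominates the fused result. The function name, the scalar variance model and the example numbers are illustrative assumptions for this sketch, not imec's published implementation.

```python
def fuse_tracks(pos_cam, var_cam, pos_radar, var_radar):
    """Fuse two independent 1-D position estimates by inverse-variance
    weighting. A lower variance (higher confidence) gives a sensor more
    weight; the fused variance is smaller than either input variance.
    This is a simplified stand-in for a full fusion tracker, which would
    typically also handle track association and temporal filtering.
    """
    w_cam = 1.0 / var_cam        # camera weight: high when camera is confident
    w_radar = 1.0 / var_radar    # radar weight: high when radar is confident
    fused_pos = (w_cam * pos_cam + w_radar * pos_radar) / (w_cam + w_radar)
    fused_var = 1.0 / (w_cam + w_radar)
    return fused_pos, fused_var

# Example: camera and radar disagree but are equally confident,
# so the fused estimate lands halfway between them.
pos, var = fuse_tracks(0.0, 1.0, 2.0, 1.0)
```

In a lighting-degraded scene the camera variance would rise, pulling the fused track toward the radar estimate, which matches the complementary behaviour described above.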
 

Copyright imec 2017