Embedded perception system for real-time 4D scene analysis
Published: 7 August 2019
With the growing number of autonomous systems, the demand for environment perception in embedded systems is exploding. These systems integrate a wide variety of sensors and perception functions, and they often model the near environment with a collection of mostly independent, task-specific functions. The goal of this work is to design a new embedded perception system that takes advantage of several sensors and temporal measurements to generate a 3D model understandable by a higher-level application. It could, for example, generate a 3D mesh of a scene annotated with semantic and dynamic information. The targeted application domain is extended reality. First, the candidate will develop a golden applicative 3D modeling pipeline on a PC, based on the latest algorithms. Next, the candidate will define an embedded system with several sensors and adapt the algorithms to minimize energy consumption and reduce execution latency.
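To make the pipeline concrete, the first stage of such a multi-sensor 3D modeling pipeline could look like the sketch below: depth maps from a camera are back-projected through a pinhole model and fused into a single world-frame point cloud using per-frame poses. This is only an illustrative sketch, not the actual system described in the offer; the function names, the pinhole intrinsics, and the use of depth maps as input are all assumptions.

```python
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) to a camera-frame 3D point cloud
    using a pinhole camera model. Illustrative helper, not part of the offer."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def fuse(frames, poses, intrinsics):
    """Accumulate per-frame point clouds into one world-frame cloud.
    `poses` are 4x4 camera-to-world transforms (one per frame)."""
    fx, fy, cx, cy = intrinsics
    clouds = []
    for depth, pose in zip(frames, poses):
        pts = backproject(depth, fx, fy, cx, cy)
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        clouds.append((homo @ pose.T)[:, :3])            # transform to world frame
    return np.vstack(clouds)

# Usage: one 4x4 depth frame at 1 m, identity pose -> 16 points at z = 1.
cloud = fuse([np.ones((4, 4))], [np.eye(4)], (1.0, 1.0, 2.0, 2.0))
```

In a real embedded version, this dense per-pixel fusion would likely be replaced by a sparser, incrementally updated representation (e.g. a voxel grid or TSDF) to bound memory and latency, with semantic labels attached downstream.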