Systems for autonomous driving use multiple environment sensors. To simulate the sensors in a HIL setup for sensor fusion and function testing, it is essential to accurately synchronize the stimulation of the individual sensors. The dSPACE Environment Sensor Interface Unit supports the time-correlated feeding of raw sensor data to one or more sensor ECUs.
One of the greatest challenges in developing functions for automated driving is validating the environment sensors. The dSPACE Environment Sensor Interface (ESI) Unit is the flexible answer to the question of how to feed broadband digital signals into camera, radar, and lidar ECUs while maintaining time correlation and low latencies. Because many development departments use the NVIDIA® DRIVE™ PX2 and AGX platforms in their predevelopment and research for automated driving, dSPACE also offers the ESI Unit in an out-of-the-box variant that is preconfigured for these NVIDIA® platforms and can be used to insert camera raw data. The dSPACE solutions for sensor and environment simulation can be used directly with the preconfigured ESI Unit to validate functions for automated driving.
For the development and validation of environment sensors such as radar, camera, and lidar, and more generally for the validation of ADAS/AD functions, support for a range of ECU interfaces for data insertion is essential. In addition to testing based on over-the-air methods and object lists, the insertion of raw data or target lists is of utmost importance for validating perception and fusion algorithms that work on raw data. The Environment Sensor Interface (ESI) Unit supports all relevant sensor interfaces and is well suited for both closed-loop and open-loop testing. Advanced sensor simulation in combination with the ESI Unit makes it easy to provide synthetic sensor data under realistic conditions and with low latencies. This is useful for validating functions for autonomous driving in hardware-in-the-loop (HIL) simulation, in both closed- and open-loop scenarios. If RTMaps is used as well, recorded sensor data can be replayed conveniently.
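In open-loop operation, recorded data is replayed to the ECU with the original inter-frame timing. The following minimal sketch illustrates this kind of timestamp-paced replay in Python; it is an illustration only, and the `replay` and `send` names are hypothetical rather than part of the RTMaps or ESI Unit tool chain.

```python
# Minimal sketch of timestamp-paced, open-loop replay of recorded sensor frames.
# All names are hypothetical; this is not the RTMaps or ESI Unit API.
import time

def replay(frames, send):
    """Replay (timestamp_s, payload) frames so that the inter-frame timing
    matches the original recording as closely as possible."""
    if not frames:
        return
    t0_rec = frames[0][0]           # first recorded timestamp
    t0_wall = time.monotonic()      # wall-clock reference at replay start
    for t_rec, payload in frames:
        # Sleep until the same relative time has elapsed as in the recording.
        delay = (t0_wall + (t_rec - t0_rec)) - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        send(payload)               # hand the raw frame to the injection interface

# Example: three dummy camera frames recorded 33 ms apart (~30 fps).
recorded = [(0.000, b"frame0"), (0.033, b"frame1"), (0.066, b"frame2")]
replay(recorded, send=lambda data: print(f"injected {len(data)} bytes"))
```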
The Environment Sensor Interface Unit supports the injection of raw data and target lists for HIL tests of camera, radar, and lidar ECUs as well as central processing units for autonomous driving. Thanks to its flexible and scalable architecture, the ESI Unit supports, for example, lidar point cloud data injection via 10 Gigabit Ethernet, radar raw data injection via MIPI CSI-2, and camera raw data injection via TI FPD-Link III or Maxim GMSL. To meet the requirements of next-generation ECUs, the ESI Unit can be configured to simulate the latest sensors, including FPGA-based sensor models. A single ESI Unit simulates up to twelve sensors synchronously and supports more than 50 Gbit/s of aggregated bandwidth. Multiple combined ESI Units let you test functions for autonomous driving with dozens of different sensors. Special customer requirements and functions can be implemented directly on the ESI Unit thanks to the powerful Xilinx® UltraScale+™ FPGA.
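As an illustration of what raw data injection over Ethernet can look like from the feeding side, the sketch below chunks a synthetic lidar point cloud into UDP datagrams, each carrying a frame timestamp and a sequence number. The endpoint address and packet layout are assumptions made for this example; the actual ESI Unit interfaces and protocols are configured through the dSPACE tool chain.

```python
# Minimal sketch of streaming a lidar point cloud to an injection device over UDP.
# Address, port, and packet layout are assumptions for illustration only.
import socket
import struct
import time

ESI_ADDR = ("192.168.140.10", 50000)   # hypothetical injection endpoint
CHUNK_POINTS = 300                      # points per UDP datagram

def send_point_cloud(points, timestamp_ns):
    """Send (x, y, z, intensity) float32 points, chunked into UDP datagrams,
    each prefixed with the frame timestamp and a running sequence number."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq, start in enumerate(range(0, len(points), CHUNK_POINTS)):
        chunk = points[start:start + CHUNK_POINTS]
        header = struct.pack("<QI", timestamp_ns, seq)   # 8-byte timestamp, 4-byte sequence
        payload = b"".join(struct.pack("<4f", *p) for p in chunk)
        sock.sendto(header + payload, ESI_ADDR)
    sock.close()

# Example: a small synthetic frame of 1,000 points stamped with the current time.
frame = [(float(i) * 0.01, 0.0, 1.5, 0.8) for i in range(1000)]
send_point_cloud(frame, time.monotonic_ns())
```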
As the NVIDIA® DRIVE PX2 and AGX platforms are widely used in predevelopment and research for autonomous driving functions, dSPACE provides a preconfigured ESI Unit for camera raw data injection. In combination with dSPACE AURELION (sensor simulation) and RTMaps (data replay), this off-the-shelf variant can simulate up to four cameras (Sekonix SF332x).
| Parameter | Specification |
|---|---|
| FPGA | Xilinx® UltraScale+™ |
| Memory | |
| Sensor interfaces | |
| Input interfaces | |
| Design | |
| Cooling | |
| Power supply | |
| Weight | |
| Size | |
| Camera | TI FPD-Link III, Maxim GMSL |
| Radar | MIPI CSI-2 |
| Lidar | 10 Gigabit Ethernet |
1) Available on request.