
Environment Sensor Interface Unit

Feeding digital image data into camera ECUs

Systems for autonomous driving use multiple environment sensors. To simulate the sensors in a HIL setup for sensor fusion and function testing, it is essential to accurately synchronize the stimulation of the individual sensors. The dSPACE Environment Sensor Interface Unit supports the time-correlated feeding of raw sensor data to one or more camera ECUs.  

ADAS and autonomous driving typically require multiple camera, radar, and lidar sensors. Today, camera boxes are commonly used for testing camera-based systems: the cameras are stimulated "over-the-air" with the aid of monitor images. This approach often has considerable disadvantages, such as limited monitor contrast, distortion caused by lenses, or a complex setup with stereo or multiple cameras. By feeding raw sensor image data directly to camera ECUs, the Environment Sensor Interface Unit overcomes these typical drawbacks during testing. In addition, it supports the time-correlated stimulation of a high number of individual sensors.

Key Benefits

In a HIL setup, the Environment Sensor Interface Unit stimulates camera sensors with raw data. For long-range video interfaces, it is connected directly to the image processing unit of the camera sensor; for short-range interfaces, the connection is made via fiber-optic cable and an optional dSPACE plug-on device (ESI-POD). The modular system supports all common interface types, such as GMSL, FPD-Link III, or CSI2, and is configured individually according to customer requirements. The powerful FPGA handles data preparation (e.g., image manipulation) and ensures time-correlated raw data transmission to a high number of simultaneously connected camera sensors. This means that typical scenarios for autonomous driving with a stereo camera, four surround-view cameras, and a driver camera can be realized with a single Environment Sensor Interface Unit.

Depending on the use case, the image data can come from different sources. In open-loop simulation, it is typically recorded data that is played back with RTMaps; alternatively, you can use an open-loop simulation setup with ASM/MotionDesk. In closed-loop HIL simulation, on the other hand, the data is generated by MotionDesk on the basis of GPU models with a high dynamic range of up to 24 bit1). Support for radar and lidar raw data is currently under development. The Environment Sensor Interface Unit is prepared for mounting in a HIL system (19-inch rack).

1)  Available on request.

Technical Specifications

Environment Sensor Interface Unit (ESI Unit)

  • Xilinx Zynq UltraScale+ FPGA with up to 15.9 Gbit/s aggregated data rate
  • Time-correlated data feeding to a high number of sensors in parallel
  • Image processing and manipulation, e.g., simulation of defective pixels or columns
  • Flexible adaptation of video interfaces via FMC plug-in modules (FPD-Link III, GMSL, 1G/10G1) Ethernet, GigE Vision1), Ethernet AVB1), ...)
  • Up to four HDMI interfaces (resolution up to 2560×1600 pixels, refresh rate up to 120 Hz)
  • Ultra-low-latency optical interfaces to connect dSPACE plug-on devices
  • High Dynamic Range (HDR) support with 8 to 241) bit
  • Ethernet interface for configuration and feedback data
  • Customer-specific interfaces on request
  • Generic hardware and firmware, reusable for future projects
  • Connecting multiple ESI Units allows an even larger number of sensors to be used
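The image manipulation mentioned above (e.g., simulation of defective pixels or columns) is performed in the FPGA of the ESI Unit. Purely for illustration, the following host-side sketch shows the same idea on a raw frame; the function name, frame format, and 12-bit value range are assumptions for this example, not the ESI Unit's actual API:

```python
import numpy as np

def inject_defects(frame, dead_pixels=(), stuck_columns=(), stuck_value=4095):
    """Simulate sensor defects on a raw image frame (H x W array).

    dead_pixels: iterable of (row, col) positions forced to 0 (dead pixel).
    stuck_columns: iterable of column indices forced to stuck_value
                   (a fully saturated column for a 12-bit sensor).
    """
    out = frame.copy()  # leave the original frame untouched
    for r, c in dead_pixels:
        out[r, c] = 0
    for c in stuck_columns:
        out[:, c] = stuck_value
    return out

# Example: a uniform 12-bit raw frame with one dead pixel and one stuck column
frame = np.full((4, 6), 2048, dtype=np.uint16)
faulty = inject_defects(frame, dead_pixels=[(1, 2)], stuck_columns=[4])
```

Injecting such defects into the stimulation stream lets the test verify how the camera ECU's image processing reacts to sensor faults.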

Optional plug-on device (ESI-POD)

  • Short-range video interfaces (HiSPI, CSI2, LVDS, Parallel)
  • Interface to sensor ECU (I2C, SPI), e.g., to control field of view or exposure time
  • Customer-specific interfaces on request
  • Potential-free power supply
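As an illustration of the I2C control path to the sensor ECU, the sketch below encodes a 16-bit register write in the common "16-bit register address, 16-bit value" layout. The register address and value are invented for the example and do not correspond to any documented dSPACE or sensor register map:

```python
def i2c_write_payload(reg_addr, value):
    """Encode a 16-bit register write as the byte sequence sent after the
    I2C device address: register address (big-endian 16-bit) followed by
    the register value (big-endian 16-bit)."""
    return bytes([
        (reg_addr >> 8) & 0xFF, reg_addr & 0xFF,  # register address, MSB first
        (value >> 8) & 0xFF, value & 0xFF,        # register value, MSB first
    ])

# Hypothetical exposure-time register 0x3012 set to 0x0400
payload = i2c_write_payload(0x3012, 0x0400)
```

In a real setup, this payload would be written over the ESI-POD's I2C interface to adjust sensor parameters such as exposure time during a test run.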


1)  Available on request.

Use Cases