A digital twin, a true-to-life depiction of reality within a simulation, can be used to carry out high-precision virtual test drives. In contrast to gaming applications, in which the human eye can be fooled, the vehicle development process always requires physically correct computations. The new dSPACE solution for sensor-realistic simulation offers next-generation visualization and realistic sensors (camera, radar, lidar), thus making it possible to test and validate driving functions in real time or even faster.
As realistically as possible: Starting in the summer of 2021, a completely new solution by dSPACE will be available to users to integrate top-grade visualization and cutting-edge, realistic sensors into their processes for developing and validating driving functions. This applies to virtually any conceivable use case. The dSPACE solution for sensor-realistic simulation can be used across various stages of the development process, for example, in hardware-in-the-loop (HIL) testing, in software-in-the-loop (SIL) testing, and even for parallel validation in the cloud. In particular, the solution is suited for developing AI-based features and generating training data, including the training and testing of artificial neural networks. The data for the sensor-realistic simulation can be computed both on stationary computers and in the cloud. The solution supports Windows® and Linux, as well as Docker containers for Linux systems.
Profile: dSPACE Solution for Sensor-Realistic Simulation
- High-resolution visualization, including realistic lighting and weather effects
- High-quality 3-D assets developed by dSPACE, such as vehicles, e-scooters, and pedestrians
- Realistic camera, lidar, and radar models
- Linux and Docker support
- Interfaces for integrating third-party sensor models
Camera sensor model with high-fidelity graphics and lighting effects, configurable and realistic lens profiles, options for modifying images, and configurable color filters for raw sensor data.
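To make the idea of configurable color filters for raw sensor data more concrete, here is a minimal sketch, not taken from the dSPACE product, of how a camera model might mosaic a rendered RGB image through a Bayer color filter array (RGGB) to emulate the single-channel raw output of a real imager. The function name and the pattern layout are illustrative assumptions.

```python
import numpy as np

def apply_bayer_filter(rgb, pattern="RGGB"):
    """Mosaic an RGB image (H x W x 3) into single-channel raw sensor data.

    Illustrative sketch only: each pixel keeps the one color channel
    selected by the 2x2 filter pattern tiled across the sensor.
    """
    h, w, _ = rgb.shape
    channel_index = {"R": 0, "G": 1, "B": 2}
    raw = np.zeros((h, w), dtype=rgb.dtype)
    # Map the four letters of the pattern onto a 2x2 tile of channel indices.
    layout = [[channel_index[pattern[0]], channel_index[pattern[1]]],
              [channel_index[pattern[2]], channel_index[pattern[3]]]]
    for dy in range(2):
        for dx in range(2):
            # Every second pixel in each direction samples the same channel.
            raw[dy::2, dx::2] = rgb[dy::2, dx::2, layout[dy][dx]]
    return raw

# Example: a 2x2 image where each pixel keeps only its filter's channel.
img = np.array([[[10, 20, 30], [40, 50, 60]],
                [[70, 80, 90], [100, 110, 120]]], dtype=np.uint8)
raw = apply_bayer_filter(img)  # [[10, 50], [80, 120]]
```

A real camera pipeline would additionally model lens distortion, noise, and exposure, which the configurable lens profiles mentioned above hint at.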
Radar sensor model with polarimetric calculation of the radar channel, consideration of mirroring reflections and diffuse scattering, multi-path propagation, and adaptive ray launching to interact with every object within the detection range of the sensor. The parameterizable models for generating raw radar data and compiling target lists and object lists simulate the real sensors precisely.
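The effect of multi-path propagation can be illustrated with the textbook two-ray model, a deliberately simplified stand-in for the polarimetric channel calculation described above: the received field is the coherent sum of the direct path and a mirrored path, which produces the characteristic multipath fading pattern. All names, the fixed reflection coefficient, and the 77 GHz wavelength are illustrative assumptions, not the dSPACE radar model.

```python
import numpy as np

def two_ray_power(h_tx, h_rx, distance, wavelength, refl_coeff=-1.0):
    """Relative received power for a direct plus a ground-mirrored path.

    Textbook two-ray sketch: amplitudes fall off as 1/d and the two
    complex contributions are summed coherently, so they interfere
    constructively or destructively depending on the path difference.
    """
    d_direct = np.hypot(distance, h_tx - h_rx)
    d_mirror = np.hypot(distance, h_tx + h_rx)  # path via the reflection point
    k = 2 * np.pi / wavelength                  # wavenumber
    field = (np.exp(-1j * k * d_direct) / d_direct
             + refl_coeff * np.exp(-1j * k * d_mirror) / d_mirror)
    return np.abs(field) ** 2

# ~77 GHz automotive radar; power oscillates with distance due to fading.
p = two_ray_power(h_tx=0.5, h_rx=0.5, distance=50.0, wavelength=0.0039)
```

Setting `refl_coeff=0.0` removes the mirrored path and recovers plain free-space attenuation, which makes the contribution of the second ray easy to isolate.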
Lidar sensor model with point cloud or raw data output, support for scanning and flash-based sensors, and an ego-motion feature for rotating sensor devices.
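As a rough illustration of what point cloud output and ego-motion handling for a rotating sensor involve, the following sketch converts the spherical returns of a sweep (azimuth, elevation, range) into Cartesian points and shifts each point by the ego displacement accumulated since the start of the sweep. The constant-velocity assumption and all names are illustrative; this is not the dSPACE lidar model.

```python
import numpy as np

def to_point_cloud(azimuth, elevation, distance):
    """Convert spherical sensor returns into an N x 3 Cartesian point cloud."""
    x = distance * np.cos(elevation) * np.cos(azimuth)
    y = distance * np.cos(elevation) * np.sin(azimuth)
    z = distance * np.sin(elevation)
    return np.stack([x, y, z], axis=-1)

def ego_motion_correct(points, timestamps, velocity_xyz):
    """Express all returns in the common sweep-start frame.

    A rotating lidar measures its beams at slightly different times; a
    static object seen later in the sweep appears shifted by the ego
    displacement v * t, which this simple correction adds back.
    """
    return points + np.outer(timestamps, velocity_xyz)

az = np.array([0.0, np.pi / 2])    # two beams, a quarter turn apart
el = np.zeros(2)
rng = np.array([10.0, 10.0])
pts = to_point_cloud(az, el, rng)  # approximately [[10, 0, 0], [0, 10, 0]]
t = np.array([0.0, 0.025])         # seconds into the sweep
corrected = ego_motion_correct(pts, t, np.array([20.0, 0.0, 0.0]))  # 20 m/s forward
```

Flash lidars capture all returns at once and would skip the per-beam timestamp correction entirely.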
Efficient Inner Workings
The 3-D rendering engine, the high-precision dSPACE simulation models, and the realistic 3-D assets allow for the precise simulation of sensors (camera, radar, lidar), environments, weather conditions, lighting effects (daytime, nighttime), and materials. The new solution represents a significant advancement and will more than replace the current MotionDesk and Sensor Simulation software in the future. It combines a wide range of features into a single product.
High-Precision Sensor Models
The complex vehicle environment is simulated in great detail to validate different types of sensors with the help of models, with all sensor data being computed in real time. In addition, it is possible to integrate third-party sensor models.
Easy Integration into the Tool Environment
The new dSPACE solution can be integrated seamlessly into many other dSPACE tools, such as ASM simulation models, ModelDesk for parameterization, the VEOS simulation platform, the RTMaps development environment for perception algorithms, the scenario generator, and the Environment Sensor Interface (ESI) Unit. Even the upcoming simulation platform SIMPHERA will be compatible with the solution for sensor-realistic simulation.
Mr. Seiger, this new solution for sensor-realistic simulation is a completely new development. How was it initiated?
Our customers’ requirements in terms of sensor simulation have increased substantially. In the past, sensors were largely tested and validated using object lists, especially in the ADAS/AD context. But today that is no longer enough. Even though camera-based tests, for example, were effective with MotionDesk and Sensor Simulation, there were limitations, and the former technology could not be used for realistic, high-resolution graphics without investing a great deal of time and effort.
Where can users really notice the leap in performance of the new solution for sensor-realistic simulation?
The new underlying technology is specifically designed for performance and high-resolution graphics. This enables us to simulate realistic sensors in real time. We certainly draw on technology from the gaming industry as well. However, while the entertainment industry relies on tricks to create certain visual effects for the human eye, the ECUs used by our customers call for physical correctness. Our aim is to compute the necessary adjustments and extensions so efficiently that we can make them available in real time.
How can you ensure that the computed sensor data delivers the correct results?
We are continuously verifying our models by means of an iterative process. This involves comparisons with values from the literature as well as with sensor data recorded in real life. It is here that we can see the strengths of our partnerships, for example, with the companies Hella and Velodyne, with whom we design, conduct, and analyze experiments. The results flow into the product through an agile development process, allowing our customers to reap the benefits straight away.