Global automobile manufacturers and technology suppliers rely on dSPACE to put the idea of autonomous driving into practice. We provide the required simulation and validation solutions, including SIL, HIL, prototyping, data logging, data replay, data enrichment, sensor realism, scenario-based testing, and scenario generation, as well as data and test management. Our portfolio includes solutions for use on a PC, on a HIL simulator, or in the cloud. We also offer consulting services if required.
To help you put the idea of autonomous driving on the road, dSPACE offers comprehensive solutions and services for data-driven development and validation. This ensures seamless, efficient data processing at all development stages, from data logging to release or sign-off tests.
The chain of effects in autonomous driving generally consists of several processing stages. First, the raw sensor data has to be preprocessed (perception). The goal is to detect features, static and dynamic objects, and free spaces in the vehicle's environment on the basis of individual images or reflection points. In the subsequent stage, the results are merged into a consistent environment model (data fusion). Time synchronization and correlation of the sensor data are essential for this. In addition, the vehicle's exact location and lane position must be determined based on a high-definition map (localization).
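As a rough illustration of these stages, the sketch below models perception output and a simplified fusion step in Python. The data structures, names, and the fixed time-skew threshold are illustrative assumptions for this article, not dSPACE interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    """A single object hypothesis produced by one sensor (perception stage). Hypothetical structure."""
    sensor_id: str
    timestamp: float               # seconds, on a common, synchronized clock
    position: tuple[float, float]  # x, y in vehicle coordinates [m]
    object_class: str              # e.g. "car", "pedestrian", "free_space"

@dataclass
class EnvironmentModel:
    """Consistent environment model produced by the fusion stage (hypothetical structure)."""
    timestamp: float
    objects: list[Detection] = field(default_factory=list)

def fuse(detections: list[Detection], max_time_skew: float = 0.05) -> EnvironmentModel:
    """Merge detections from all sensors into one environment model.

    Detections whose timestamps deviate too much from the reference time are
    discarded, standing in for the time synchronization and correlation step
    described above.
    """
    if not detections:
        return EnvironmentModel(timestamp=0.0)
    reference_time = max(d.timestamp for d in detections)
    synchronized = [d for d in detections if reference_time - d.timestamp <= max_time_skew]
    return EnvironmentModel(timestamp=reference_time, objects=synchronized)
```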
Based on the environment model, the situation around the vehicle is analyzed, potential driving trajectories are planned, a decision for a specific maneuver is made, and longitudinal and lateral control is executed.
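The downstream stages can be pictured in a similarly simplified way: the sketch below selects the lowest-cost candidate trajectory (maneuver decision) and derives basic longitudinal and lateral control commands. All names, the cost values, and the one-second control horizon are hypothetical simplifications, not part of any real planning stack.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    """A candidate driving trajectory (hypothetical, simplified)."""
    maneuver: str              # e.g. "keep_lane", "change_lane_left"
    cost: float                # lower is better (safety, comfort, progress)
    target_speed: float        # [m/s]
    target_curvature: float    # [1/m]

@dataclass
class ControlCommand:
    acceleration: float        # longitudinal control [m/s^2]
    steering_curvature: float  # lateral control [1/m]

def plan_and_control(candidates: list[Trajectory], current_speed: float) -> ControlCommand:
    """Pick the lowest-cost maneuver and derive simple control commands."""
    best = min(candidates, key=lambda t: t.cost)               # maneuver decision
    acceleration = (best.target_speed - current_speed) / 1.0   # naive 1 s speed-tracking horizon
    return ControlCommand(acceleration=acceleration,
                          steering_curvature=best.target_curvature)
```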
A detailed and comprehensive simulation of the real world is the basis for successful validation. Suitable sensor models and the integration of real sensors into the test environment play an important role. The range of sensor models extends from technology-independent variants, which generate object lists directly from information provided by the environment model, to phenomenological or physical models, which are typically calculated on a high-performance GPU and feed raw data to connected real sensors such as cameras or radars. There are different integration options for sensors, depending on the type of data and the layer to be stimulated. These options extend to direct stimulation of the sensor front end, either over the air (e.g., for radar) or via an RF cable with GNSS (Global Navigation Satellite System) or V2X (Vehicle-to-X) signals. Using the real sensors in the test environment is often indispensable, since signal preprocessing, sensor data fusion, and environment model creation in the sensor's control unit have a significant impact on the chain of effects.
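A technology-independent sensor model of the kind mentioned above can be pictured as a simple geometric filter over the simulated environment: the sketch below returns an object list containing only the ground-truth objects within the sensor's range and field of view. The parameter values and names are assumptions chosen for illustration, not dSPACE sensor model interfaces.

```python
import math
from dataclasses import dataclass

@dataclass
class GroundTruthObject:
    """Object known to the simulation's environment model (hypothetical structure)."""
    object_id: int
    x: float   # [m], sensor coordinates, x pointing in the viewing direction
    y: float   # [m], sensor coordinates

def ideal_object_list(objects: list[GroundTruthObject],
                      max_range: float = 150.0,
                      fov_deg: float = 120.0) -> list[GroundTruthObject]:
    """Technology-independent sensor model: filter ground truth by range and field of view."""
    half_fov = math.radians(fov_deg) / 2.0
    visible = []
    for obj in objects:
        distance = math.hypot(obj.x, obj.y)
        bearing = math.atan2(obj.y, obj.x)
        if distance <= max_range and abs(bearing) <= half_fov:
            visible.append(obj)
    return visible
```

Phenomenological and physical models replace this geometric filter with detailed signal propagation and sensor front-end effects, which is why they are typically computed on a GPU and can feed raw data to real sensor hardware.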
An artificial-intelligence-assisted one-stop solution for the data-driven development of autonomous vehicles (AVs), from data recording and enrichment to the generation of real-world-based scenarios for large-scale simulation.
Rapid Prototyping: Developing perception, fusion, and application algorithms using dSPACE prototyping systems and RTMaps
MIL/SIL Simulation: Testing automated driving functions via model-in-the-loop (MIL) or software-in-the-loop (SIL) simulation on standard PCs or PC clusters
HIL Simulation: Testing automated driving systems and complete chains of effects in the laboratory
Overview of Tools: A well-coordinated tool chain with tools that interact smoothly throughout all development steps
Videos: A selection of dSPACE videos on advanced driver assistance systems (ADAS) and autonomous driving
dSPACE Consulting: dSPACE Consulting offers consultancy projects to support you in defining and optimizing processes throughout all phases of ECU development, regardless of whether dSPACE tools are used.