Highly automated driving is the focus of many automobile manufacturers' development activities. Requirements such as a redundant 360° surround view based on numerous heterogeneous sensors, high-precision positioning, and vehicle connectivity are challenging topics for tool suppliers as well.

The dSPACE response to this is an end-to-end tool chain for autonomous driving from a single source. Unique rapid prototyping solutions, consisting of high-performance platforms and a tailored software environment, allow complete multisensor applications to be developed in the vehicle, from perception and fusion algorithms to real-time control. The significant increase in testing effort can be managed only by moving tests forward to software-in-the-loop (SIL) simulation. PC clusters enable high test throughput by massively parallelizing compute nodes and simulations while maximizing scalability. For release tests, hardware-in-the-loop (HIL) simulation remains indispensable. One of its greatest challenges is integrating real environment sensors, such as camera, radar, or lidar, along with the sensor fusion. dSPACE offers a complete range of integration options, from simple rest bus simulation and raw data feeds to over-the-air stimulation.
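
To illustrate the scaling idea behind SIL test campaigns on PC clusters, here is a minimal Python sketch that distributes independent scenario simulations across worker processes. The scenario IDs and run_scenario() are hypothetical placeholders, not a dSPACE API; a real setup would dispatch jobs to cluster nodes rather than local processes.

```python
# Minimal sketch: parallelizing independent SIL scenario runs.
# run_scenario() and the pass/fail verdict are illustrative placeholders.
from concurrent.futures import ProcessPoolExecutor

def run_scenario(scenario_id: int) -> dict:
    """Run one SIL simulation of a driving scenario and return its verdict."""
    # Placeholder: load the scenario, step the simulation, evaluate criteria.
    return {"scenario": scenario_id, "passed": True}

def run_test_campaign(scenario_ids, workers: int = 8):
    # Each worker executes independent simulations, so throughput scales
    # roughly with the number of available cores or compute nodes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_scenario, scenario_ids))

if __name__ == "__main__":
    results = run_test_campaign(range(1000))
    failed = [r for r in results if not r["passed"]]
    print(f"{len(results)} scenarios run, {len(failed)} failed")
```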

The chain of effects in autonomous driving generally consists of several processing stages. First, the sensors' raw data is preprocessed (perception). The goal is to detect features, static and dynamic objects, and free spaces in the vehicle's environment on the basis of single images or reflection points. In the subsequent stage, the results are merged into a consistent environment model (data fusion). For this, time synchronization and correlation of the sensor data are important. In addition, the exact location and lane position of the vehicle must be known, based on a high-definition map (localization).
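
The fusion step can be illustrated with a small sketch: detections from different sensors are brought onto a common time base and associated by temporal and spatial proximity into fused objects. The data types, time window, and distance gate below are illustrative assumptions, not a specific dSPACE algorithm.

```python
# Minimal sketch: time-correlating object lists from heterogeneous sensors
# and merging them into one consistent environment model.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "camera", "radar", "lidar"
    timestamp: float   # seconds, common time base after synchronization
    x: float           # position in vehicle coordinates (m)
    y: float

def fuse(detections, time_window=0.05, dist_gate=1.5):
    """Group detections that agree in time and space into fused objects."""
    fused = []
    for det in sorted(detections, key=lambda d: d.timestamp):
        for obj in fused:
            ref = obj[-1]
            if (abs(det.timestamp - ref.timestamp) <= time_window
                    and (det.x - ref.x) ** 2 + (det.y - ref.y) ** 2
                        <= dist_gate ** 2):
                obj.append(det)   # same physical object, seen by another sensor
                break
        else:
            fused.append([det])   # new object in the environment model
    return fused
```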

Based on the environment model, the situation around the vehicle is analyzed, potential driving trajectories are planned, the decision for a specific maneuver is made, and the longitudinal and lateral control is executed.
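
As a rough illustration of these downstream stages, the following sketch scores candidate trajectories, picks a maneuver, and derives simple longitudinal and lateral setpoints. The cost weights and proportional control laws are placeholder assumptions, not a production planning or control algorithm.

```python
# Minimal sketch: maneuver decision and longitudinal/lateral control.
def select_trajectory(candidates):
    """Pick the candidate with the lowest combined safety/comfort cost."""
    return min(candidates, key=lambda t: 10.0 * t["collision_risk"]
                                         + 1.0 * t["lateral_jerk"])

def control_step(trajectory, ego_speed, ego_lateral_offset):
    # Longitudinal: proportional speed control toward the planned speed.
    accel_cmd = 0.5 * (trajectory["target_speed"] - ego_speed)
    # Lateral: proportional steering toward the planned lane position.
    steer_cmd = -0.1 * (ego_lateral_offset - trajectory["target_offset"])
    return accel_cmd, steer_cmd
```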

A detailed and comprehensive simulation of the real world is the basis for successful validation. Suitable sensor models and the integration of real sensors into the test environment play an important role here. The range of sensor models extends from technology-independent variants, which generate object lists directly from the information provided by the environment model, to phenomenological and physical models, which are typically calculated on a high-performance GPU and feed raw data to the connected real sensors, such as a camera or radar. Depending on the type of data and the layer to be stimulated, there are different integration options for the sensors. These extend as far as direct stimulation of the sensor front end, either over-the-air, as with radar, or via an RF cable carrying GNSS (Global Navigation Satellite System) or V2X (Vehicle-to-X) signals. Using the real sensors in the test environment is often indispensable, since the signal preprocessing, the sensor data fusion, and the creation of the environment model in the sensor's control unit have a deep impact on the chain of effects.
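
A technology-independent sensor model of the kind mentioned above can be approximated very simply: filter the simulation's ground-truth object list by field of view and range, then add measurement noise so downstream fusion sees realistic input. The geometry and noise parameters below are illustrative assumptions.

```python
# Minimal sketch: a technology-independent sensor model that derives an
# object list directly from the environment model's ground truth.
import math
import random

def ideal_object_list(ground_truth, fov_deg=120.0, max_range=150.0,
                      pos_noise=0.2):
    """Return the subset of ground-truth objects the sensor would report."""
    visible = []
    for x, y in ground_truth:           # positions in sensor coordinates (m)
        rng = math.hypot(x, y)
        bearing = math.degrees(math.atan2(y, x))
        if rng <= max_range and abs(bearing) <= fov_deg / 2:
            # Gaussian position noise stands in for measurement uncertainty.
            visible.append((x + random.gauss(0, pos_noise),
                            y + random.gauss(0, pos_noise)))
    return visible
```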

Rapid Prototyping

Development of perception, fusion, and application algorithms using dSPACE prototyping systems and RTMaps.

MIL/SIL Simulation

Automated testing of driving functions by means of model-in-the-loop (MIL) or software-in-the-loop (SIL) simulation on standard PCs or PC clusters.

HIL Simulation

Testing autonomous driving systems and the complete chain of effects in the laboratory.

Test Drives in Real Vehicles

Recording time-correlated data from environment sensors and bus networks during test drives in real vehicles.

Tool Overview

A well-coordinated tool chain for smooth interaction throughout all development steps.

Videos

dSPACE videos on advanced driver assistance systems (ADAS) and autonomous driving.