How Is Autonomous Driving Transforming dSPACE?

Published: June 4, 2019

At the dSPACE Tech Day in Novi, Michigan, USA, André Rolfsmeier, Lead Product Manager, Advanced Applications and Technologies at dSPACE GmbH, gave insights into how dSPACE is evolving in the course of autonomous driving.

André Rolfsmeier, Lead Product Manager, Advanced Applications and Technologies, dSPACE GmbH

As the transportation industry moves closer to Level 3, 4, and 5 autonomous systems, OEMs are accelerating their efforts to establish driving functions that are based on predictive 360-degree environment perception and redundant sensing, making it possible for vehicles to drive safely in real traffic situations.

Predictive and 360° redundant sensing

Sensor technologies are rapidly evolving to support autonomous driving, but these advancements also bring new challenges for OEMs, such as on-demand software, over-the-air software updates, big data processing, validation of operational safety, and homologation, to name only a few.

In this new, highly dynamic and agile development environment, artificial intelligence (AI) will play a fundamental role. For example, AI will be used in the vehicle to process and analyze sensor data, predict driving situations, and determine the vehicle's reaction.

This image identifies some of the common challenges related to the development of autonomous driving.

All of these developments are rapidly transforming the transportation industry, and dSPACE is also investing heavily in research and development.

“We are changing our strategies to quickly respond to these new and upcoming requirements,” said André Rolfsmeier, Lead Product Manager, Advanced Applications and Technologies, dSPACE GmbH. “We are adapting our tool chain and developing new solutions to accelerate the implementation of autonomous driving and – in the end – to increase our customers’ productivity.” 

Current Support for ADAS and AD

Core competencies for advanced driver assistance systems (ADAS) and autonomous driving (AD) that are currently available as part of the dSPACE product portfolio range from model-in-the-loop (MIL) simulation and prototyping of driving functions to software-in-the-loop (SIL) and hardware-in-the-loop (HIL) simulation and test management. These core competencies include dedicated solutions for ADAS and AD applications.

dSPACE's current portfolio for ADAS and AD applications.

The partnership with Intempora lets dSPACE integrate the RTMaps software development environment into its tool chain, especially to address demanding multisensor applications. RTMaps enables users to accurately time-stamp, record, synchronize, and play back data from multiple sensors and vehicle buses. In addition, RTMaps provides a graphical environment that allows perception and fusion algorithms to be implemented easily based on C/C++, Python or Simulink code. 
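
As a rough illustration of this workflow (and explicitly not the actual RTMaps API), the following minimal Python sketch shows the general idea of time-stamping samples from several sources and replaying them as a single, time-ordered stream. All class and function names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Any, List
import heapq
import time


@dataclass(order=True)
class Sample:
    timestamp: float                      # capture time in seconds
    source: str = field(compare=False)    # e.g. "camera", "radar", "can_bus"
    payload: Any = field(compare=False)   # raw sensor frame or bus message


class Recorder:
    """Collects time-stamped samples from several sources and replays them in order."""

    def __init__(self) -> None:
        self._samples: List[Sample] = []

    def record(self, source: str, payload: Any) -> None:
        # Stamp each sample on arrival and keep the buffer ordered by time.
        heapq.heappush(self._samples, Sample(time.monotonic(), source, payload))

    def replay(self):
        # Yield samples in capture order, i.e. a time-synchronized playback.
        while self._samples:
            yield heapq.heappop(self._samples)


rec = Recorder()
rec.record("camera", b"<frame 0>")
rec.record("radar", [{"range_m": 42.0, "azimuth_deg": -3.1}])
rec.record("can_bus", {"speed_kmh": 57.5})
for sample in rec.replay():
    print(f"{sample.timestamp:.6f}  {sample.source}: {sample.payload!r}")
```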

“The world’s first driverless production vehicle (NAVYA ARMA) was developed using RTMaps,” said Rolfsmeier. “RTMaps provides an extensive library for sensors and data processing. As a result, camera, radar, lidar, and GNSS sensors, for example, can be integrated quickly and easily.” Rolfsmeier said dSPACE is also heavily investing in SIL and HIL technologies.

“SIL is not a dream of the future,” he said. “It’s already being heavily used by our customers.”

Rolfsmeier mentioned that, for example, development engineers at BMW, VW, and Jaguar Land Rover use SIL and virtualization to design and test algorithms in the early development phase. He also noted that MAN Truck & Bus is using camera- and radar-in-the-loop setups for ADAS development.

Creating a Scalable Computer Platform for Scenario-Based Testing

With regard to developing Level 3, 4, and 5 autonomous systems and implementing new testing methods, Rolfsmeier said companies not only have to follow a model-based development path, but also a data- and scenario-driven path.

“You have to be able to virtually test millions of scenarios on a scalable computer platform environment,” he said.

To support thousands of different driving scenarios, dSPACE offers a cluster simulation solution that uses multiple PCs and the dSPACE VEOS platform. Testing is performed in the customer’s laboratory using PC clusters, and in the near future it will even be possible in the cloud. Once the measurement data has been captured and stored, test drives can also be replayed in a SIL or HIL setup that matches the real environment. With this setup, 20 or more sensors can be integrated simultaneously, and the sensor and vehicle bus data can be replayed time-synchronously.
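
To make the idea of cluster simulation more concrete, the hypothetical Python sketch below distributes many independent virtual test drives across a pool of worker processes on one machine. The scenario format, the pass/fail criterion, and all names are invented for this example; they are not part of dSPACE VEOS.

```python
from concurrent.futures import ProcessPoolExecutor


def run_scenario(scenario: dict) -> dict:
    """Stand-in for one closed-loop simulation run; returns a pass/fail verdict."""
    ego_speed = scenario["ego_speed_kmh"]
    gap = scenario["cut_in_gap_m"]
    # Toy criterion: the (hypothetical) braking function must keep a minimum gap margin.
    passed = gap / max(ego_speed, 1.0) > 0.3
    return {"id": scenario["id"], "passed": passed}


def main() -> None:
    # Generate a large batch of parameter variations of one cut-in scenario.
    scenarios = [
        {"id": i, "ego_speed_kmh": 80 + (i % 40), "cut_in_gap_m": 10 + (i % 25)}
        for i in range(1000)
    ]
    with ProcessPoolExecutor() as pool:       # one worker per CPU core
        results = list(pool.map(run_scenario, scenarios))
    failed = [r["id"] for r in results if not r["passed"]]
    print(f"{len(scenarios) - len(failed)} passed, {len(failed)} failed")


if __name__ == "__main__":
    main()
```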

“Our vision is for you to take your measurement data and from this data generate simulation scenarios that very closely resemble the real environment,” Rolfsmeier said. “When these scenarios become available, you can use a new testing method called scenario-based testing. You will be able to up-scale your testing capabilities and performance using cloud capabilities.”
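
The following sketch hints at how a simulation scenario could be derived from measurement data: a simple cut-in event is detected in a recorded drive log and turned into a parameterized scenario description. Field names and thresholds are illustrative assumptions, not actual dSPACE tooling.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class LogSample:
    t: float               # time in seconds
    ego_speed_kmh: float
    lead_gap_m: float      # distance to the vehicle directly ahead


@dataclass
class CutInScenario:
    start_time: float
    ego_speed_kmh: float
    cut_in_gap_m: float


def extract_cut_in(log: List[LogSample], gap_drop_m: float = 15.0) -> Optional[CutInScenario]:
    """Flags the first sudden drop in the lead gap as a cut-in scenario."""
    for prev, cur in zip(log, log[1:]):
        if prev.lead_gap_m - cur.lead_gap_m > gap_drop_m:
            return CutInScenario(cur.t, cur.ego_speed_kmh, cur.lead_gap_m)
    return None


# A three-sample drive log in which another vehicle cuts in at t = 0.2 s.
log = [LogSample(0.0, 100.0, 60.0), LogSample(0.1, 100.0, 59.0), LogSample(0.2, 99.0, 25.0)]
print(extract_cut_in(log))
# CutInScenario(start_time=0.2, ego_speed_kmh=99.0, cut_in_gap_m=25.0)
```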

“At dSPACE, we are also moving into the area of cloud simulation and setting up an AI team,” said Rolfsmeier. “We are increasingly working in a highly dynamic, agile way to address these use cases within a reasonable time frame.” 

A snapshot of how dSPACE is transforming.

Addressing the Important Issue of Homologation

It is impossible to perform real test drives for every conceivable scenario. Therefore, simulation will be essential for validation. dSPACE supports customers in using sensor front-end and data replay tests as well as SIL/HIL simulation for release testing and homologation. In this context, the ISO 26262 functional safety standard, Safety of the Intended Functionality (SOTIF), and UN ECE regulations are vital. dSPACE has gained valuable experience with this topic in several projects.

Other Areas of Work in Progress

To support closed-loop simulation, adequate simulation models are required. dSPACE provides realistic simulation models for the environment, traffic, vehicles, and all sensor types (e.g., camera, radar, lidar). The goal is to provide sensor models at all the different fidelity levels required to test specific use cases, for example, perception, fusion, planning, and motion control algorithms. Particularly for validating perception, dSPACE is working on new ways to improve physics-based simulation based on ray tracing.
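
As a small, hypothetical illustration of what such different model levels can look like, the sketch below exposes the same ground-truth object once as an ideal object list (as might be used for planning and motion control tests) and once as a noisy detection (as might be used for fusion tests). The classes and noise parameters are assumptions made for this example.

```python
import random
from dataclasses import dataclass


@dataclass
class GroundTruthObject:
    x_m: float          # longitudinal position relative to the ego vehicle
    y_m: float          # lateral position
    speed_kmh: float


def ideal_object_list(obj: GroundTruthObject) -> dict:
    """Object-list level for planning/motion-control tests: perfect perception."""
    return {"x_m": obj.x_m, "y_m": obj.y_m, "speed_kmh": obj.speed_kmh}


def noisy_detection(obj: GroundTruthObject, sigma_m: float = 0.3) -> dict:
    """Detection level for fusion tests: adds measurement noise and dropouts."""
    if random.random() < 0.05:          # 5 % missed detection
        return {}
    return {
        "x_m": obj.x_m + random.gauss(0.0, sigma_m),
        "y_m": obj.y_m + random.gauss(0.0, sigma_m),
        "speed_kmh": obj.speed_kmh + random.gauss(0.0, 1.0),
    }


truth = GroundTruthObject(x_m=35.0, y_m=-0.4, speed_kmh=92.0)
print(ideal_object_list(truth))
print(noisy_detection(truth))
```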

“We are able to simulate the realistic behavior of radar and lidar beams in real time and even faster,” Rolfsmeier said. “To test central processing units in a SIL or HIL environment, it is essential to accurately synchronize the data from the various environment sensors. We are also improving our camera sensor model with a photo-realistic simulation environment. You can change weather conditions (in a simulated scenario) at the click of a button.”
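
To give a rough sense of the geometry behind physics-based beam simulation, here is a deliberately tiny ray-casting sketch: lidar beams are cast from the sensor and intersected with a single sphere to produce range returns. It illustrates only the underlying principle; real-time radar and lidar simulation involves far more, such as material properties, multipath effects, and GPU acceleration.

```python
import math
from typing import Optional, Tuple

Vec = Tuple[float, float, float]


def ray_sphere_range(origin: Vec, direction: Vec, center: Vec, radius: float) -> Optional[float]:
    """Distance to the first hit of a unit-direction ray on a sphere, or None for a miss."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # the beam misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None


# Sweep a horizontal fan of beams over a 1 m sphere placed 20 m ahead of the sensor.
sensor: Vec = (0.0, 0.0, 1.0)
target: Vec = (20.0, 0.0, 1.0)
for az_deg in range(-10, 11, 2):
    az = math.radians(az_deg)
    beam: Vec = (math.cos(az), math.sin(az), 0.0)
    rng = ray_sphere_range(sensor, beam, target, 1.0)
    label = "no return" if rng is None else f"{rng:.2f} m"
    print(f"azimuth {az_deg:+3d} deg: {label}")
```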
