Traffic in urban areas places enormous demands on autonomous vehicles. Virtual test drives are essential to prove their safety. But how do you achieve realistic tests?

From SAE Level 2 onward, a sophisticated electronic control unit takes over the driving functions in automated driving. Validating these electronic control units is crucial for vehicle manufacturers so that their customers can rely on their vehicles to master critical traffic situations with confidence. ZF offers a state-of-the-art validation service for electronic control units for advanced driver assistance systems and autonomous driving (ADAS/AD).

Take Each Step Safely: See, Think, Act

ADAS/AD electronic control units have powerful processing stages that enable automated driving:

  • Perception (See): Processing of data from environment sensors such as camera, radar, lidar, GNSS, and ultrasound. The sensors collect information about the vehicle's surroundings and send it to the electronic control unit, which analyzes and interprets it.
  • Decision-making and movement planning (Think): Evaluating the collected data and, based on this, making intelligent decisions for vehicle control. The electronic control unit plans the optimum route, responds to traffic situations, and acts proactively to ensure safe and efficient driving.
  • Actuator control (Act): Ensuring that commands are accurately transmitted to the relevant actuators to generate the desired vehicle response.
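The three stages above form a closed loop that runs continuously while the vehicle is in motion. The following minimal sketch illustrates that data flow; all names, thresholds, and data structures are illustrative assumptions, not ZF's actual software interfaces:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    camera: list   # objects detected by camera processing (assumed format)
    radar: list    # radar tracks
    lidar: list    # lidar point-cloud clusters

def perceive(frame: SensorFrame) -> list:
    """See: fuse the per-sensor detections into one environment model."""
    return frame.camera + frame.radar + frame.lidar

def plan(objects: list) -> str:
    """Think: derive a maneuver decision from the fused environment model."""
    # Hypothetical rule: brake if any object is closer than 10 m.
    return "brake" if any(o.get("distance_m", 1e9) < 10 for o in objects) else "cruise"

def act(decision: str) -> dict:
    """Act: translate the decision into actuator commands."""
    return {"brake":  {"throttle": 0.0, "brake": 0.8},
            "cruise": {"throttle": 0.3, "brake": 0.0}}[decision]

frame = SensorFrame(camera=[{"type": "pedestrian", "distance_m": 8.0}],
                    radar=[], lidar=[])
commands = act(plan(perceive(frame)))  # pedestrian at 8 m triggers braking
```

In a real electronic control unit, each stage is far more elaborate (sensor fusion, trajectory optimization, redundant actuation paths), but the see-think-act structure remains the same.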

Validating Electronic Control Units Using HIL Simulation

"The ADAS/AD electronic control units are validated through the use of innovative technologies such as hardware-in-the-loop (HIL) simulation," reports Oliver Maschmann from ZF. This creates a virtual world in which a virtual vehicle, also known as a digital twin, acts realistically. Sensor-realistic simulations are carried out in this simulated environment in order to provide the electronic control unit's environment sensors with precise data. "ZF uses dSPACE simulators optimized for real-time sensor simulation, which are characterized in particular by the synchronous provision of data for different sensor types," adds Maschmann. 

The HIL simulator generates the environment for radar, lidar, and camera sensors, including their front ends, in real time and synchronously, and makes it available to the ADAS/AD electronic control unit. The real-time simulation of the 3D world and the sensor models, including all physical effects, is calculated in AURELION. The vehicle simulation is generated with the Automotive Simulation Models (ASM) tool suite.
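The key property emphasized above is synchronicity: every sensor model must be evaluated for the same simulation time before the data reaches the electronic control unit. The following sketch shows that idea in the abstract; the function names and the 100 Hz step are assumptions for illustration, not the dSPACE or ZF APIs:

```python
STEP_S = 0.01  # assumed fixed real-time step (100 Hz)

# Placeholder sensor models; real ones render the full 3D world physics.
def render_camera(t): return {"t": t, "sensor": "camera", "frame": f"img_{t:.2f}"}
def render_radar(t):  return {"t": t, "sensor": "radar",  "targets": []}
def render_lidar(t):  return {"t": t, "sensor": "lidar",  "points": []}

def hil_step(t):
    """One synchronous HIL step: all sensor models are evaluated for the
    *same* simulation time t before the data is sent to the ECU."""
    outputs = [render_camera(t), render_radar(t), render_lidar(t)]
    assert len({o["t"] for o in outputs}) == 1  # synchronous by construction
    return outputs

timeline = [hil_step(i * STEP_S) for i in range(3)]  # three steps: t = 0.00, 0.01, 0.02
```

On a real simulator, keeping this step deterministic under hard real-time constraints is precisely what the hardware is optimized for.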

Safe Functions for Autonomous Driving

Safe autonomous driving is one of the biggest challenges facing the automotive industry. Clearly defined application areas – the operational design domain (ODD) – and safety standards such as ISO 26262 and SOTIF (ISO 21448) play an important role in the validation solution. Compliance with these standards ensures that the vehicles meet the highest safety requirements and demonstrates that development and validation are carried out according to the state of the art. This requires extensive tests and simulations with variable and even critical scenarios to show that the vehicles can respond appropriately even in unforeseen situations. The simulation also supports the development and validation of brand-specific aspects in the control systems.

Operational Design Domain (ODD)

An ODD definition describes the specific operating conditions under which an automated driving system should function properly. It defines which operating parameters the automated vehicle must be able to handle, for example, weather conditions, infrastructure, location, time of day, and other influences on the driving function. The ODD is therefore an important part of a vehicle's safety concept.
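Because the ODD enumerates explicit operating parameters, it can be expressed as a machine-checkable set of limits. The sketch below illustrates that idea; the parameter names and values are invented for illustration and do not represent any real ODD specification:

```python
from dataclasses import dataclass

@dataclass
class ODD:
    max_speed_kph: float
    weather: set        # permitted weather conditions
    daylight_only: bool
    road_types: set     # permitted infrastructure elements

# Hypothetical urban ODD covering the kinds of parameters named above.
URBAN_ODD = ODD(max_speed_kph=50.0,
                weather={"clear", "rain"},
                daylight_only=False,
                road_types={"urban", "tunnel", "roundabout"})

def inside_odd(odd: ODD, speed_kph, weather, is_day, road_type) -> bool:
    """Return True only if the current conditions lie within the ODD."""
    return (speed_kph <= odd.max_speed_kph
            and weather in odd.weather
            and (is_day or not odd.daylight_only)
            and road_type in odd.road_types)

ok  = inside_odd(URBAN_ODD, 45.0, "rain", False, "tunnel")  # within the ODD
out = inside_odd(URBAN_ODD, 45.0, "snow", True, "urban")    # snow is not covered
```

A check like this can gate both the automated driving function at runtime and the scenario selection during validation.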

Simplified representation of the transport network in the Friedrichshafen region.

Creating a Virtual World

The Friedrichshafen traffic region was selected to develop a particularly realistic safety concept for autonomous vehicles. It comprises 988 roads and 500 intersections and includes roundabouts, multilane roads, and tunnels. The ODD defines, among other things, the area in which the autonomous vehicle may be used. To start simulations in precisely this ODD at an early stage, an exact digital replica of the traffic network first had to be created in the OpenDrive format.
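OpenDrive describes a road network as XML: each road carries a reference line built from geometry elements, plus lane, elevation, and signal data. The fragment below generates a deliberately tiny example of that structure with Python's standard library; it shows only a small subset of the format, whereas a real HD network like the Friedrichshafen one carries far more detail:

```python
import xml.etree.ElementTree as ET

# Build a minimal OpenDrive document: one straight 100 m road.
root = ET.Element("OpenDRIVE")
road = ET.SubElement(root, "road",
                     {"name": "sample", "length": "100.0",
                      "id": "1", "junction": "-1"})  # -1: not part of a junction
plan = ET.SubElement(road, "planView")
geo = ET.SubElement(plan, "geometry",
                    {"s": "0.0", "x": "0.0", "y": "0.0",
                     "hdg": "0.0", "length": "100.0"})
ET.SubElement(geo, "line")  # straight reference-line segment

xml_text = ET.tostring(root, encoding="unicode")
```

In practice, such files are produced by map-conversion toolchains from HD map data rather than written by hand, which is exactly the step the measurement drives described below support.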

Development of a Digital Transport Network

The transport network was essentially built on the basis of high-definition (HD) maps. All roads with their lanes, lane widths, intersections, and tunnels were taken into account. Missing sections and data were determined with measurement drives and added to the digital HD traffic network. 

Construction of a Realistic Virtual 3D World

An important step in the development and validation of automated vehicles is the creation of a static, virtual, three-dimensional test environment. This high-precision 3D world is required to check the environment detection with radar, lidar, and camera, and to enable localization. 
The realistic construction of the 3D world is complemented by measurement drives. Here, the real world is first digitally captured and recorded using environment sensors. In the next step, the sensor data is labeled to identify objects and their properties according to defined object classes (road markings, traffic signs, buildings, etc.).

The labeled sensor data is then converted into 3D objects in combination with the HD map data and provided with surface texturing and material properties to enable an accurate reconstruction of the real objects. A workflow consisting of labeling tools from understand.ai and 3D design tools is used for high-precision implementation and is offered by dSPACE as a 3D service. The exact positioning of the peripheral buildings on the mapped road network is also part of the 3D service. A particular advantage of this process is that no additional time-consuming reference measurements are required. 
The recorded measurement data can be used for various purposes as part of the validation project: 

  • Creating a 3D world (digital twin) of the static environment
  • Validating the perception by comparing it with the labeled data
  • Validating the sensor models by comparing the real scenario with the virtual scenario
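The conversion from labeled sensor data to textured 3D objects described above can be sketched as a simple data flow. The class names, fields, and material assignments below are assumptions for illustration, not the actual understand.ai or dSPACE data schema:

```python
from dataclasses import dataclass

@dataclass
class LabeledObject:
    object_class: str   # e.g. "traffic_sign", "building", "road_marking"
    position: tuple     # (x, y, z) in map coordinates
    dimensions: tuple   # (length, width, height) in meters

@dataclass
class Asset3D:
    object_class: str
    position: tuple
    dimensions: tuple
    texture: str        # surface texturing for sensor-realistic rendering
    material: str       # material property relevant for radar/lidar models

# Hypothetical per-class texture/material lookup.
MATERIALS = {"traffic_sign": ("retroreflective_sheet", "aluminum"),
             "building":     ("facade_photo", "concrete")}

def to_asset(obj: LabeledObject) -> Asset3D:
    """Turn a labeled detection into a 3D asset with surface properties."""
    texture, material = MATERIALS.get(obj.object_class, ("generic", "generic"))
    return Asset3D(obj.object_class, obj.position, obj.dimensions,
                   texture, material)

sign = LabeledObject("traffic_sign", (12.0, 4.5, 2.1), (0.02, 0.6, 0.6))
asset = to_asset(sign)  # carries the material data the sensor simulation needs
```

The material attribute is the important part for sensor realism: a radar model reacts very differently to an aluminum sign than to a concrete facade.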

Comparison of simulation and real data: The real vehicle environment (left) is transferred semi-automatically into a detailed digital twin (right) using the dSPACE service for 3D reconstruction. In the reconstruction process based on iterative annotation and generation, realistic 3D models corresponding to the real data are generated on the basis of simple annotations. Non-annotated details are added on the basis of procedural decisions. The digital twin also contains the surface properties relevant for the AURELION sensor simulation.

Evaluating the Virtual 3D World

ZF assessed the suitability of the transport network, including the static virtual 3D world, on the basis of various criteria such as accuracy, attention to detail, and realism.

Using reference points, ZF demonstrated a match between the virtual and real worlds within the permissible deviations. The accuracy was sufficient to test the localization algorithms in control units. ZF also carried out plausibility checks with real electronic control units, for example, to validate the modeled GNSS coordinates and the relevant points of the traffic infrastructure. The tools and services from dSPACE enable ZF to build realistic OpenDrive road networks for the simulation and validation of ADAS/AD control units. The required accuracy is achieved, and the sensor data can be efficiently converted into a 3D world. With the 3D service from dSPACE, ZF can offer a powerful service for the validation of ADAS/AD.
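A reference-point check of this kind reduces to comparing the positions of matched landmarks in the real and virtual worlds against a tolerance. The coordinates and the 0.2 m tolerance below are made-up illustration values, not ZF's actual acceptance criteria:

```python
import math

TOLERANCE_M = 0.2  # assumed permissible deviation per reference point

# Hypothetical matched landmarks, given as planar map coordinates in meters.
real_points    = {"church_tower": (512340.1, 5275102.8),
                  "bridge_pier":  (512890.7, 5274655.3)}
virtual_points = {"church_tower": (512340.2, 5275102.7),
                  "bridge_pier":  (512890.6, 5274655.4)}

def deviations(real, virtual):
    """Euclidean distance between each real/virtual reference-point pair."""
    return {name: math.dist(real[name], virtual[name]) for name in real}

def within_tolerance(real, virtual, tol=TOLERANCE_M):
    """True if every reference point matches within the tolerance."""
    return all(d <= tol for d in deviations(real, virtual).values())

ok = within_tolerance(real_points, virtual_points)  # ~0.14 m offsets pass
```

The same comparison, applied to GNSS coordinates and traffic-infrastructure points, is what the plausibility checks with real electronic control units rely on.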
 

Drive Off Virtually: With ADAS/AD into the Urban Digital Traffic Region

With the establishment of a static virtual HD test environment, an essential milestone has been reached for the virtual validation of ADAS/AD in urban traffic regions. The structure of the virtual OpenDrive road network and the 3D world provide a solid basis for development. In addition, further and even more complex application possibilities are currently being implemented, for example, the integration of dynamic traffic scenarios in the 3D world, the integration of vehicle-to-everything (V2X), and connected services to test future vehicle functions and systems.

Impressions of the virtualized transport region 

Courtesy of ZF

 

dSPACE MAGAZINE, PUBLISHED JULY 2024

More Information

  • ADAS & Autonomous Driving
    Open, end-to-end simulation and validation solution to empower safe automated driving.

  • Scenario Generation
    The Scenario Generation Solution provided by dSPACE enables the creation of realistic scenarios based on recorded real-world data for the large-scale testing of functions for autonomous driving.

  • AURELION
    Integrate realistic sensor data to test and validate your perception and driving functions with AURELION, the software solution for sensor-realistic simulation.

  • ZF: AI-in-the-Loop
    Presenting a new test system for validating an autonomous, AI-based vehicle using realistic sensor simulation.

  • ZF: The 7th Sense for Highly Automated Vehicles
    Validation of connected automated vehicles with complex vehicle-to-everything (V2X) traffic scenarios.
