Dr. Klaus Lamberg, Senior Product Manager HIL Testing, dSPACE GmbH
In the early years of hardware-in-the-loop (HIL) simulation, about 20 years ago, everything was about real-time simulation. The main challenge was how to bring an engine or vehicle dynamics model to life in real-time. Today, HIL simulation and testing is much more than just real-time simulation. It is about distributed model development and integration processes, managing HIL farms, automated testing, and ensuring functional safety. So, is real-time a relic from the early years, or is it still a critical factor?
In my opinion, the answer is: It is a critical factor now more than ever, and this article will explain why.
First, all classical electronic control units (ECUs) are essentially controllers that measure input signals and produce outputs in order to drive physical systems to desired states and along desired trajectories. For example, ECUs run an engine economically, brake a car safely, or drive it autonomously. Therefore, from the ECU hardware and software perspective, the HIL system, while executing the real-time dynamic model of the physical system, closes the control loop. This is why we call it a closed-loop simulation.
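To make the closed-loop idea concrete, here is a minimal sketch in which a software "ECU" (a simple proportional controller) and a first-order plant model are stepped in lockstep, the way an HIL system closes the loop in real time. All function names, gains, and time constants are illustrative, not part of any dSPACE API.

```python
# Minimal closed-loop sketch: a toy "ECU" (P controller) and a
# first-order plant model stepped in lockstep. The HIL system plays
# the role of plant_step(); the ECU only ever sees simulated sensors.
# All names and values are illustrative.

def ecu_step(setpoint, measurement, kp=2.0):
    """Toy controller: proportional control toward the setpoint."""
    return kp * (setpoint - measurement)

def plant_step(state, actuator, dt, tau=0.05):
    """First-order plant: d(state)/dt = (actuator - state) / tau."""
    return state + dt * (actuator - state) / tau

def simulate(setpoint=1.0, dt=0.001, steps=500):
    state = 0.0
    for _ in range(steps):
        u = ecu_step(setpoint, state)      # ECU reads simulated sensor
        state = plant_step(state, u, dt)   # plant reacts to ECU output
    return state

final = simulate()   # settles near the closed-loop equilibrium
```

With these illustrative values the loop settles at kp/(kp+1) of the setpoint, i.e. around 2/3 — the point is the lockstep exchange of actuator and sensor values, not the control quality.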
The simulation of the physical system – the plant model – must be as realistic as necessary to make the ECU "think" it is embedded in the real environment. This requires the simulation to represent the exact dynamic behavior of the physical system. We cannot test a dynamic stability controller acting in milliseconds if we are not able to also simulate the entire vehicle dynamics in milliseconds or less. This is where real-time comes into play. The closed-loop HIL system requires the simulation to react to ECU outputs and provide simulated sensor feedback at the ECU inputs as quickly and precisely as the real system would. If a real system responds to a valve or throttle actuation within a millisecond, the simulation must do so as well. In fact, the simulation must run at least twice as fast as the ECU samples its inputs, but we won't dive too deep into digital signal processing theory here. This is why the simulation must run in real time, and no compromise is acceptable here.
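The "twice as fast" rule of thumb can be turned into a simple sizing calculation for the simulation step. The factor of two (rooted in the sampling theorem) and the ECU rates below are illustrative; real projects often oversample more aggressively.

```python
# Rough sizing of the simulation step: to give the ECU a signal that
# looks continuous, run the plant model at least twice as fast as the
# ECU task. The 2x factor and the rates are illustrative only.

def max_sim_step(ecu_rate_hz, oversampling=2):
    """Largest allowed simulation step (in seconds) for an ECU rate."""
    return 1.0 / (ecu_rate_hz * oversampling)

step = max_sim_step(1000)   # 1 kHz ECU task
# step == 0.0005 -> the plant model must finish each step in <= 0.5 ms
```

A 1 kHz ECU task thus forces the plant model to complete every step within half a millisecond, deterministically, on every single step.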
This is demanding, and it becomes even more challenging with stricter emissions regulations, highly dynamic electrified powertrain applications, more sophisticated performance algorithms in vehicle dynamics, braking, and active safety, and with increasing security demands, for example, in autonomous driving.
One way to meet these requirements is to select sufficiently fast processors for the simulation. Today's real-time processors, with multi-core architectures and higher clock rates, offer high performance and can run even large system models with sample times in the microsecond range. But since both the size of models and the model fidelity required for highly accurate simulation keep growing, we must be able to scale the processing power in HIL systems. For this, we can choose from two options:
First, we can build a multiprocessor system and distribute the simulation models across interconnected multi-core computers.
The second option matters especially in electromobility, where we see the need for very fast and even more time-accurate simulation due to the very fast control loops of electric motors with power electronics. Fast processors are very helpful here, but they may not be enough. FPGA technology closes this gap, since it allows algorithms and models to be implemented directly in hardware, making the fastest cycle times and response times possible.
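A back-of-the-envelope calculation shows why electric-motor applications push beyond CPUs. A power-electronics current loop running at 100 kHz leaves only 10 µs per simulation step — tight for a processor with interrupt and I/O latencies, but comfortable for logic clocked at typical FPGA rates. The loop rates and the 100 MHz clock below are illustrative assumptions, not figures for any specific product.

```python
# Timing budget sketch for electric-motor HIL. All rates illustrative.

def step_budget_us(loop_rate_hz):
    """Time budget per simulation step, in microseconds."""
    return 1e6 / loop_rate_hz

def fpga_cycles(budget_us, clock_mhz=100):
    """Clock cycles available within the budget on an FPGA at clock_mhz."""
    return int(budget_us * clock_mhz)

engine_budget = step_budget_us(1_000)     # classic engine model: 1000 us
emotor_budget = step_budget_us(100_000)   # e-motor current loop:   10 us
cycles_available = fpga_cycles(emotor_budget)  # cycles at 100 MHz
```

Even at a modest 100 MHz, the FPGA still has on the order of a thousand clock cycles inside a 10 µs window, and it spends none of them on operating-system overhead.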
With autonomous driving, there are even more trends that are technologically challenging, but not unsolvable. First of all, autonomous driving brings along new sensor technologies, such as camera, radar, lidar, and ultrasound. HIL systems for autonomous driving ECUs must be able to simulate these new types of sensors and generate their respective behavior and signal patterns, for example, surface reflections and sensor point clouds. This is where GPUs come in: they excel at the parallel processing of data-intensive workloads, which camera, radar, and lidar simulation naturally are.
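The GPU fit becomes obvious when you look at the structure of sensor simulation: every beam or pixel is an independent calculation. The toy "lidar" below fans beams across a field of view and intersects each ray with a flat wall — one self-contained computation per beam, exactly the embarrassingly parallel pattern GPUs accelerate. The geometry and parameters are invented for illustration.

```python
# Toy "lidar" sweep: intersect each beam with a wall at x = wall_x and
# emit an (x, y) hit point. Each beam is independent of the others,
# which is why this maps so well onto GPU parallelism. Illustrative only.
import math

def lidar_scan(wall_x=10.0, n_beams=5, fov_deg=60.0):
    """Return (x, y) hit points of beams fanned across the field of view."""
    points = []
    for i in range(n_beams):
        angle = math.radians(-fov_deg / 2 + i * fov_deg / (n_beams - 1))
        r = wall_x / math.cos(angle)          # range along the beam
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

cloud = lidar_scan()   # 5 points, all on the wall at x = 10
```

A real sensor model adds reflectivity, noise, and occlusion per beam, but the per-beam independence — and thus the GPU suitability — stays the same.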
Second, communication technologies are growing rapidly in both number and bandwidth. Autonomous driving ECUs are part of a large and complex communication network inside and outside the vehicle. Communication technologies and protocols, such as CAN, CAN FD, LIN, FlexRay, Automotive Ethernet, Wi-Fi, DSRC, 5G, GPS, and so on, must be simulated. This is classically called restbus simulation, since the communication and behavior of the other bus and network participants must also be simulated in real time. Bus communication has evolved from exchanging simple messages and signals over the CAN bus at 125 or 500 kbit/s to service-based and secured onboard communication over 10-Gbit Automotive Ethernet. And the amount of in-vehicle communication has grown dramatically over recent years. This requires computing power. Depending on the HIL application, restbus simulation can consume 50% or more of the computing bandwidth of the entire real-time application. So, this becomes another critical aspect of HIL simulation, where we see the demand for processing power increasing dramatically.
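To give a feeling for what restbus simulation involves at the lowest level, the sketch below packs two hypothetical signals into an 8-byte CAN payload and lists the cyclic transmission deadlines the HIL system would have to meet in real time. The signal layout, scaling, and cycle time are invented for illustration; production tools generate this from the network description database.

```python
# Restbus sketch: pack two made-up signals into an 8-byte CAN payload
# and compute the cyclic send deadlines the real-time system must hit.
# Layout and timing are illustrative, not from any real network database.
import struct

def pack_engine_frame(rpm, temp_c):
    """rpm as big-endian unsigned 16-bit, temperature as signed 8-bit,
    remaining 5 bytes padded to fill the classic 8-byte CAN payload."""
    return struct.pack(">Hb5x", rpm, temp_c)

def cyclic_schedule(period_ms, horizon_ms):
    """Transmission deadlines for one cyclic message within a horizon."""
    return [t for t in range(0, horizon_ms, period_ms)]

payload = pack_engine_frame(rpm=3000, temp_c=90)
deadlines = cyclic_schedule(period_ms=10, horizon_ms=50)  # every 10 ms
```

Now multiply this by hundreds of messages, thousands of signals, and several bus systems per vehicle, and the 50%-of-bandwidth figure above stops being surprising.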
For these reasons, it is easy to see why real-time simulation is more critical to HIL testing than ever. Meeting today's and future real-time demands and varying computation needs requires multi-core computing systems that can be interconnected into multiprocessor systems and expanded with FPGA and GPU boards, all kept time-coherent and supported by a seamless software environment. This is why dSPACE is continuously working on providing new processor boards with more processing power, optimizing interprocessor and I/O communication, and introducing technologies such as FPGAs and GPUs into our HIL testing portfolio. Providing the fastest real-time performance in combination with the most comprehensive set of tools, solutions, and services is what makes our HIL customers so successful. dSPACE is committed to being your partner in simulation and validation.