Nowadays, many ADAS applications rely on reliable environment detection by camera or radar. To test the corresponding ECUs by means of simulation, the real sensors must be stimulated in a suitable way on the HIL simulator. A typical approach for release tests is a camera box: enclosed in a chamber, the camera sensor is stimulated by a monitor displaying 3-D scenarios. The images are rendered on a PC with a graphics card on the basis of data from a vehicle and environment model. Rendering must run in real time, at a high frame rate and with sufficient detail accuracy, so that the camera control unit can recognize and classify objects correctly.

For applications that use additional sensor types such as radar, sensor fusion takes place in the control unit: an algorithm combines the objects detected by camera and radar into a unified environment model. This requires that the data streams of both sensors reach the control unit correlated in time. In a HIL test environment, the real-time simulation ensures this: stimulating the camera from MotionDesk and feeding object lists to the radar control unit are time-synchronized.
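The timing requirement can be sketched as follows: both sensor paths derive their timestamps from one shared simulation clock, so the camera frames and radar object lists stay correlated in time. This is only an illustrative Python sketch; the data classes and the assumed 40 Hz step are hypothetical, not dSPACE or MotionDesk APIs.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class CameraFrame:
    t: float        # simulation time [s], taken from the shared clock
    frame_id: int   # frame counter for the rendered 3-D scenario

@dataclass
class RadarObjectList:
    t: float                  # same shared simulation time [s]
    objects: List[Dict]       # detected objects fed to the radar ECU

def run_simulation(duration_s: float, step_s: float = 0.025
                   ) -> Tuple[List[CameraFrame], List[RadarObjectList]]:
    """Advance one shared clock and, in each step, emit a camera frame
    and a radar object list stamped with the identical time (40 Hz is
    an assumed rate for illustration)."""
    camera_stream: List[CameraFrame] = []
    radar_stream: List[RadarObjectList] = []
    for k in range(int(round(duration_s / step_s))):
        t = k * step_s
        camera_stream.append(CameraFrame(t=t, frame_id=k))
        radar_stream.append(RadarObjectList(
            t=t,
            objects=[{"id": 1, "range_m": 50.0 - 10.0 * t}]))
    return camera_stream, radar_stream

camera, radar = run_simulation(duration_s=1.0)
# Every camera frame has a radar object list with the identical timestamp,
# so a fusion algorithm in the ECU can match them without interpolation.
assert all(c.t == r.t for c, r in zip(camera, radar))
```

Because both streams are stamped from the same clock inside one real-time loop, the fusion algorithm in the control unit receives data that is correlated in time by construction rather than re-aligned afterwards.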