Reliable environment recognition, precise localization, and well-coordinated vehicle-driver interaction are essential prerequisites for highly automated and autonomous driving. The large volume of sensor and communication data is typically gathered in a central control unit, where it is preprocessed and classified using complex algorithms, e.g., from the field of artificial intelligence, and then merged into a unified environment model. This model is used in subsequent steps to determine the driving trajectory and the vehicle's longitudinal and lateral control in real time. The combination of MicroAutoBox II and Embedded SPU provides more than sufficient resources for this: with various sensor and communication interfaces, a high-performance CPU/GPU combination for perception and fusion algorithms, and a real-time unit for vehicle control and function monitoring, it is ideally suited for rapid prototyping of autonomous driving functions in the vehicle.
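The data path described above, in which redundant sensor readings are fused into a single environment-model estimate that then feeds real-time control, can be sketched in very simplified form. The `Measurement` type, the sensor values, and the control gain below are purely illustrative assumptions and not part of the dSPACE toolchain; the fusion step uses inverse-variance weighting as a stand-in for the more complex perception and fusion algorithms mentioned.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    position: float   # lateral offset from lane center, in meters (illustrative)
    variance: float   # sensor noise variance (illustrative)

def fuse(measurements):
    """Inverse-variance weighted fusion of redundant sensor readings
    into a single environment-model estimate."""
    weights = [1.0 / m.variance for m in measurements]
    total = sum(weights)
    position = sum(w * m.position for w, m in zip(weights, measurements)) / total
    return position, 1.0 / total  # fused estimate and its (reduced) variance

def lateral_control(offset, gain=0.5):
    """Proportional steering command driving the lateral offset toward zero;
    a placeholder for the real-time lateral control running on the real-time unit."""
    return -gain * offset

# A camera and a lidar both observe the lateral offset; the more precise
# lidar reading dominates the fused estimate.
camera = Measurement(position=0.40, variance=0.04)
lidar = Measurement(position=0.36, variance=0.01)

fused_offset, fused_var = fuse([camera, lidar])
steer = lateral_control(fused_offset)
```

In a real setup, this loop would run cyclically under real-time constraints, with perception and fusion on the CPU/GPU and the control law on the real-time unit; the sketch only shows the logical data flow from measurements to a control command.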