Hybrid Cloud Data Replay Solution

  • High-performance data replay validation solution that combines the strengths of the Microsoft® Azure® cloud, Equinix's International Business Exchange™ (IBX®) global data center infrastructure, and dSPACE test solutions
  • Scalable, continuously available, and cost-optimized
  • High-quality, industry-proven systems
  • Smooth data flow and transfer between different geographical locations
  • A single web-based access point for each user, with no need to manage the underlying data transfer

Bridging Data Flow Gaps Between IT and the Development and Validation of Autonomous Driving

The technology collaboration between Microsoft, Equinix, and dSPACE (including tools from the dSPACE group company Intempora) has resulted in a powerful solution that combines the best of three worlds: it enables highly automated, consistent data replay testing on both software and hardware platforms, and it lets globally distributed development teams work efficiently.

The Solution

The new hybrid solution consists of:

Microsoft
  • ExpressRoute private network connection that provides ~100 Gbps connectivity between cloud services and HIL systems in the Equinix data center
  • Azure Data Lake Storage Gen2, built on Azure Blob Storage: a hierarchical namespace that serves multiple petabytes of data while sustaining hundreds of gigabits of throughput
  • Azure compute VMs: both Linux and Windows, on GPU, CPU, or FPGA instances
dSPACE
  • Data management and selection application based on IVS (Intempora Validation Suite), as a single interface for software and hardware data replay, enabling easy switching among different test methods
  • Data replay HIL system, based on the modular SCALEXIO hardware and the Environment Sensor Interface (ESI) Unit, with support for all relevant sensor interfaces and networks
  • RTMaps development and validation environment for multisensor applications: Acquire data from all sensors, and time-stamp, develop, and integrate processing and data fusion algorithms.
Equinix
  • Data center, interconnection, and digital services
  • Process data at the edge, reduce data backhaul costs, and avoid data privacy and regulatory restrictions
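To put the figures above in perspective, a back-of-the-envelope calculation shows how long moving a petabyte-class data set over the ~100 Gbps ExpressRoute link would take. This is a hedged sketch: the bandwidth figure is the nominal value quoted above, and real-world rates depend on protocol overhead, concurrency, and link utilization, here modeled only by a simple efficiency factor.

```python
def transfer_time_hours(data_petabytes: float, link_gbps: float,
                        efficiency: float = 1.0) -> float:
    """Estimate transfer time in hours for a data set over a network link.

    data_petabytes: data volume in petabytes (1 PB = 8 * 10^6 gigabits)
    link_gbps:      nominal link bandwidth in gigabits per second
    efficiency:     fraction of the nominal bandwidth actually sustained
    """
    gigabits = data_petabytes * 8 * 1_000_000  # 1 PB = 10^6 GB = 8 * 10^6 Gb
    seconds = gigabits / (link_gbps * efficiency)
    return seconds / 3600

# 1 PB at the nominal 100 Gbps takes roughly a day:
print(f"{transfer_time_hours(1, 100):.1f} h")  # prints "22.2 h"
```

Even at full nominal bandwidth, petabyte-scale transfers take many hours, which is why keeping the HIL systems colocated with the data in the Equinix data center, rather than shipping data out, matters for this architecture.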

Workflow and User Experience

  1. Data logging fleets let you continuously gather new traffic scenarios around the globe.
  2. High-bandwidth data ingestion interfaces allow easy integration of the logged data into the data pipeline and rapid data transfer.
  3. Data organization and enrichment with all relevant metadata, such as data logger properties, used sensor set, etc., make the data searchable.
  4. Immediate access and search of the relevant scenarios from anywhere.
  5. Local downloads of sample data sets from the cloud. Develop, debug, and create test Docker containers locally. Push new software algorithms and versions (SuT, Docker containers) to the data management system.
  6. Batch postprocessing and testing of the selected scenarios against the newest software version (software data replay, benchmarking, etc.).
  7. Batch hardware data replay testing: Once a software version is defined based on the previous software data replay testing phase, it is deployed to a hardware platform (DuT). The same traffic scenarios used in software data replay testing are then reused to test the DuT and analyze the deviation in performance and quality due to the hardware dependencies.
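The metadata-driven part of this workflow can be sketched in miniature: recordings are enriched with metadata such as logger properties and sensor set (step 3), the catalog is searched for relevant scenarios (step 4), and the selection is batch-tested against a software version (step 6). All field names and functions below are illustrative assumptions, not the actual IVS data model or API.

```python
from dataclasses import dataclass, field

@dataclass
class Recording:
    # Illustrative metadata fields; the real data model is IVS-specific.
    scenario_id: str
    logger: str
    sensors: set = field(default_factory=set)
    tags: set = field(default_factory=set)

# A tiny in-memory stand-in for the searchable data catalog (step 3).
catalog = [
    Recording("rec-001", "logger-A", {"camera", "lidar"}, {"urban", "rain"}),
    Recording("rec-002", "logger-B", {"camera"}, {"highway"}),
    Recording("rec-003", "logger-A", {"camera", "radar"}, {"urban"}),
]

def select_scenarios(catalog, required_sensors=frozenset(),
                     required_tags=frozenset()):
    """Step 4: find recordings whose sensor set and tags cover the query."""
    return [r for r in catalog
            if required_sensors <= r.sensors and required_tags <= r.tags]

def batch_replay(recordings, software_version):
    """Step 6: placeholder for replaying each scenario against a build."""
    return {r.scenario_id: f"replayed with {software_version}"
            for r in recordings}

urban = select_scenarios(catalog, required_sensors={"camera"},
                         required_tags={"urban"})
results = batch_replay(urban, "sut-v2.3")
print(sorted(results))  # prints "['rec-001', 'rec-003']"
```

Because the same scenario selection is reused for the hardware replay phase (step 7), software and hardware runs can be compared per `scenario_id` to isolate deviations caused by hardware dependencies.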

Contact Information

Do you have any questions? Our experts are always happy to help. Simply contact us.
