CES 2023 was electrifying, engaging, and lived up to its billing as "the most influential tech event in the world." Although CES has historically been known for exhibiting consumer electronics, the automotive industry is a fast-growing segment of the show, and AMD did not miss its chance to demonstrate our broad range of automotive applications.
To kick off the show, our partner ECARX announced on January 5 that its next-generation digital cockpit, powered by AMD, is set to feature in the smart brand's all-electric production vehicles, launching from 2024. ECARX joined us in our booth to demonstrate its Digital Cockpit technology.
ECARX Flagship Immersive Automotive Digital Cockpit Computing Platform
ECARX demonstrated its first independently designed and developed flagship high-performance computing platform for next-generation in-vehicle digital cockpits, powered by AMD technology. This immersive cockpit experience showcased a full 3D human-machine interaction interface, multiple ultra-high-definition 4K displays, and AAA high-end gaming entertainment.
The demo paired AMD Radeon™ RX 6000 Series GPUs with ECARX hardware and software.
You can learn more about this exciting global strategic collaboration from our announcement in August 2022.
We also showcased the broad capability of our AMD processors with the following AMD demo.
AMD has brought a PC-class experience into the vehicle, showcasing the digital cockpit capabilities of AMD processors. Powered by a single Ryzen™ Embedded processor, the demo consisted of a central display with an interactive HMI, a cluster display, and two additional displays running rear-seat video entertainment. The interactive HMI was developed on Unreal Engine 5.1, a PC-class gaming engine not typically supported on mobile processors. AMD embedded processors can run the latest gaming engines with features typically seen only in high-end desktop gaming systems.
The demo consisted of a Sapphire FS-FP6 embedded motherboard powered by an AMD Ryzen™ Embedded V2000 processor and Siili Automotive Digital Cockpit HMI software, paired with four touchscreen displays. The center screen utilized the latest Unreal Engine for HMI visualization of a self-driving vehicle, rendered on the embedded GPU.
Additional key automotive applications include driver monitoring, 3D surround view, and forward camera vision. AMD came ready with the following two demos:
Dynamic Function Exchange with Driver Monitoring & 3D Surround View - Xylon
AMD collaborated with Xylon to develop a novel multi-function ADAS design featuring driver monitoring and 3D surround view capabilities. This is no ordinary design, as it leverages the unique adaptive-computing capability known as Dynamic Function Exchange (DFX), available throughout the AMD portfolio of adaptive SoCs.
DFX gives the ability to change an ECU's internal hardware architecture and functionality in just milliseconds; this demo "instantaneously" changed cameras and processing datapaths to switch between driver monitoring and 3D surround view in a single domain-oriented device.
DFX enables time multiplexing of hardware accelerator pipelines for mutually exclusive ADAS features, allowing smaller devices, smaller PCBs, and reduced configuration storage space, which translates to cost, size, and power reductions.
Finally, with the supported Isolation Design Flow and on-chip monitoring features, customers also gain improved functional safety and the ability to deploy future updates.
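To make the time-multiplexing idea concrete, here is a minimal conceptual sketch in Python. It is not AMD's DFX API or Xylon's design; the class and function names are hypothetical illustrations of the core pattern: one reconfigurable "slot" that holds exactly one of several mutually exclusive functions at a time, swapped at runtime.

```python
# Conceptual sketch of DFX-style time multiplexing (hypothetical names,
# not AMD's actual API). One reconfigurable partition holds exactly one
# ADAS function at a time; switching functions is a fast runtime swap.

class ReconfigurableSlot:
    """Models a single DFX partition on an adaptive SoC."""

    def __init__(self, bitstreams):
        # name -> handler standing in for a partial bitstream's pipeline
        self.bitstreams = bitstreams
        self.active = None

    def load(self, name):
        # In real DFX this writes a partial bitstream into the fabric in
        # milliseconds; here it is simply selecting the active handler.
        if name not in self.bitstreams:
            raise ValueError(f"no bitstream for {name!r}")
        self.active = name

    def process(self, frame):
        # Only the currently loaded function can run.
        return self.bitstreams[self.active](frame)


slot = ReconfigurableSlot({
    "driver_monitoring": lambda f: f"DMS({f})",
    "surround_view":     lambda f: f"3D-SV({f})",
})
slot.load("driver_monitoring")
print(slot.process("cam0"))   # DMS(cam0)
slot.load("surround_view")    # "instantaneous" function exchange
print(slot.process("cam1"))   # 3D-SV(cam1)
```

Because only one function occupies the slot at any moment, the device need only be sized for the larger of the two pipelines rather than their sum, which is where the cost, size, and power savings come from.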
Forward Camera - Motovis
This forward-camera proof-of-concept demo showed how AMD can help deliver improved range (>180 m), a wider field of view (120° horizontal), and enhanced safety in driver-assist systems (targeting up to SAE L2+). Combining the Zynq™ UltraScale+™ MPSoC, Omnivision's 8-megapixel imager, and Motovis' deep-learning networks, it showcased how OEMs can innovate faster on future-ready forward-camera designs.
The demo proved you could build a scalable, auto-qualified forward-camera solution with adaptable AI that is functional-safety-enabled and production-ready.
Automated Park Assist (APA) – AISIN
In November, AMD announced that the automotive-grade Zynq® UltraScale+™ MPSoC platform had been selected to power the Aisin automated park-assist platform. At CES, we were excited to unveil a video highlighting how Aisin's parking-assist technology is bringing fully autonomous parking to market, utilizing the unique capabilities of adaptive computing technology from AMD.
Our final automotive demo showed off some of our cutting-edge AI capabilities for the automotive space. It showcased a fully integrated, accelerated video analytics pipeline from ingest to inference, configured entirely through the GStreamer framework. Featuring 16 simultaneous HD video inputs, the pipeline processed each frame with an average of 10 deep-learning models, alongside H.264 decode and computer-vision functions.
The demo was powered by the recently announced AMD Alveo™ V70 accelerator card (a 75 W, half-height half-length, single-slot x16 PCIe® card with 16 GB of DDR4) running the Vitis™ Video Analytics SDK with the Vitis AI backend.
The Alveo V70 accelerator card is equipped with the same AMD XDNA AI Engine architecture as the Automotive-focused Versal® AI Edge series, which is going into production this year.
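For readers unfamiliar with GStreamer, such pipelines are assembled by chaining elements from source to sink. The following is a rough, generic configuration sketch of a single ingest-and-decode branch using stock software elements; it is not the demo's actual configuration, and in the real demo the Vitis Video Analytics SDK supplies hardware-accelerated decode and inference plugins in place of the software stand-ins shown here.

```shell
# Generic GStreamer decode branch (illustrative configuration only;
# avdec_h264 and fakesink are software stand-ins for the SDK's
# hardware-accelerated decode and inference elements).
gst-launch-1.0 \
  filesrc location=input.h264 ! h264parse ! avdec_h264 ! \
  videoconvert ! fakesink
```

Scaling to the demo's 16 simultaneous HD inputs means instantiating one such branch per stream, with the accelerator card handling the decode and inference stages for all of them.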
This technology was just too cool to capture via image! You’ll have to check it out at our next event!
Overall, AMD showcased a solid portfolio of automotive applications at CES 2023! You won’t want to miss our next automotive demonstrations at Embedded World 2023!
For more information on our x86 Automotive products please click here.
For more information on our Adaptive SoCs please click here.