Project Overview:
This project (ELKH-PoC-2023-054) develops an integrated edge cloud platform and a collaborative, AI-driven immersive driver-assistance system designed for low-latency, safety-critical applications. The core objective is to enable real-time perception, decision support, and intuitive visualization by bringing computation closer to end users through GPU-accelerated edge infrastructure.
The platform integrates edge-native microservice orchestration, an AI-based proactive resource scaling mechanism, and a microservice controller that dynamically optimizes the software structure at runtime. The controller continuously adapts the composition and placement of microservices based on workload characteristics, network conditions, and latency constraints, ensuring efficient and responsive operation for delay-sensitive applications. Together with edge AI services, these components form a unified architecture for executing real-time, compute-intensive workloads at the network edge.
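To give a feel for the kind of decision the microservice controller makes, the sketch below shows a minimal, latency-aware placement heuristic. It is purely illustrative: the class names, data structures, and the greedy strategy are assumptions for this example, not the project's actual controller logic.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EdgeNode:
    name: str
    gpu_free: float        # free GPU share on the node (0.0 - 1.0)
    rtt_ms: float          # measured round-trip time to the user/vehicle

@dataclass
class Microservice:
    name: str
    gpu_demand: float      # required GPU share
    latency_budget_ms: float

def place(service: Microservice, nodes: list[EdgeNode]) -> Optional[EdgeNode]:
    """Greedy placement: among nodes that meet the latency budget and have
    enough free GPU capacity, pick the one with the most headroom."""
    candidates = [n for n in nodes
                  if n.rtt_ms <= service.latency_budget_ms
                  and n.gpu_free >= service.gpu_demand]
    if not candidates:
        return None                      # no feasible node: e.g. trigger scaling instead
    best = max(candidates, key=lambda n: n.gpu_free - service.gpu_demand)
    best.gpu_free -= service.gpu_demand  # reserve capacity on the chosen node
    return best

# Example: an object-detection service with a 20 ms latency budget
nodes = [EdgeNode("edge-a", gpu_free=0.6, rtt_ms=8.0),
         EdgeNode("edge-b", gpu_free=0.9, rtt_ms=35.0)]
detector = Microservice("object-detector", gpu_demand=0.4, latency_budget_ms=20.0)
print(place(detector, nodes))            # -> edge-a (edge-b violates the budget)
```

In the platform itself such decisions are re-evaluated continuously as workload and network measurements change; the sketch only shows the shape of a single placement step.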
On top of this platform, we implemented an immersive driver-assistance application that detects and predicts potentially dangerous traffic situations and presents them to the driver using XR (Extended Reality) technologies.
The system was validated in outdoor experiments on a 5G test network, interconnected with GPU-equipped edge servers. In demonstration scenarios involving a vehicle and a cyclist, the driver received real-time hazard warnings on a Microsoft HoloLens 2 device, including the predicted position and motion of the cyclist visualized as an XR overlay. Perception and decision logic were executed on the edge cloud, and contextual information was integrated using NDS.Live HD map data.
Beyond driver assistance, the developed platform is applicable to a broader class of low-latency, AI-enabled edge applications, such as smart mobility, immersive interaction, and safety-critical cyber-physical systems.
Contacts:
Balázs Sonkoly - sonkoly.balazs@vik.bme.hu
László Toka - toka.laszlo@vik.bme.hu
Balázs Fodor - balazs.fodor@edu.bme.hu
Marcell Szabó - marcell.szabo@edu.bme.hu
Bálint György Nagy - nagy.balint.gyorgy@vik.bme.hu
Architecture:
The developed system represents a TRL6-level prototype, demonstrated in a relevant real-world environment, including outdoor experiments on the BME Infopark Campus, a 5G test network (VKE testbed), and GPU-accelerated edge servers. The architecture shown here illustrates the experimental demo setup used to validate the end-to-end operation of the platform, from edge AI-based perception to immersive XR-based visualization for the driver.
The key functional components demonstrated in the architecture:
Collaborative Environment Perception and Dynamic HD Map Construction
XR-Based Visual Feedback for the Driver (HoloLens 2)
The system enables real-time collaborative environment perception by processing sensor data originating from vehicles and other traffic participants, such as cyclists. Using edge-based AI services, the platform detects objects, models their motion, and predicts potentially hazardous situations in real time. Crucially, perception and interpretation are executed on local edge resources rather than in a remote cloud, significantly reducing end-to-end latency and improving responsiveness. The resulting contextual information is continuously integrated into a dynamic HD map representation, supporting accurate and up-to-date situational awareness.
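As a rough illustration of the perception-to-prediction step, the sketch below extrapolates tracked objects with a constant-velocity model and flags a hazard when the predicted vehicle and cyclist positions come closer than a safety radius within a short horizon. The names, the motion model, and the thresholds are illustrative assumptions, not the project's actual edge AI services.

```python
import math
from dataclasses import dataclass

@dataclass
class Track:
    """Tracked traffic participant in a local map frame (metres, m/s)."""
    x: float
    y: float
    vx: float
    vy: float

def predict(t: Track, dt: float) -> tuple[float, float]:
    """Constant-velocity extrapolation of the track position after dt seconds."""
    return t.x + t.vx * dt, t.y + t.vy * dt

def hazard(vehicle: Track, cyclist: Track, horizon_s: float = 3.0,
           step_s: float = 0.1, safety_radius_m: float = 2.0) -> bool:
    """Return True if the predicted paths come within the safety radius
    at any time step inside the prediction horizon."""
    steps = int(horizon_s / step_s)
    for i in range(1, steps + 1):
        dt = i * step_s
        vx, vy = predict(vehicle, dt)
        cx, cy = predict(cyclist, dt)
        if math.hypot(vx - cx, vy - cy) < safety_radius_m:
            return True
    return False

# Example: vehicle heading east, cyclist crossing from the south
car = Track(x=0.0, y=0.0, vx=10.0, vy=0.0)
bike = Track(x=25.0, y=-12.5, vx=0.0, vy=5.0)
print(hazard(car, bike))   # -> True: predicted positions nearly coincide at ~2.5 s
```

In the demonstrated system, detections arriving from multiple participants feed such predictions, and the resulting hazard assessments are what get written into the dynamic HD map and forwarded to the driver.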
To present the generated context information in an intuitive way, the project developed an XR-based visualization framework that delivers real-time feedback directly into the driver’s field of view. In the demonstrated scenarios, the driver received live hazard warnings on a Microsoft HoloLens 2 device, including the predicted position and motion trajectory of an approaching cyclist. Integration with NDS.Live HD map data ensured spatially accurate placement of visual cues, enabling precise and immersive representation of potential hazards.
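Anchoring a warning in the driver's field of view requires mapping geographic (HD-map) coordinates into the headset's local frame. The sketch below shows one common way to approximate this: converting a WGS84 hazard position into east/north offsets relative to a calibrated reference origin, using an equirectangular approximation that is adequate over the short distances of a driving scene. The function, coordinates, and parameter names are hypothetical; the project's actual NDS.Live/HoloLens 2 integration is not reproduced here.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius, used as a spherical approximation

def geo_to_local_enu(lat_deg: float, lon_deg: float,
                     ref_lat_deg: float, ref_lon_deg: float) -> tuple[float, float]:
    """Approximate east/north offsets (metres) of a point relative to a
    reference origin; valid for the short ranges typical of a driving scene."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
    east = (lon - ref_lon) * math.cos(ref_lat) * EARTH_RADIUS_M
    north = (lat - ref_lat) * EARTH_RADIUS_M
    return east, north

# Example: offset of a cyclist warning relative to the calibrated XR origin
ref = (47.4726, 19.0598)       # hypothetical reference point
cyclist = (47.4728, 19.0601)   # hypothetical hazard position from the HD map
east_m, north_m = geo_to_local_enu(*cyclist, *ref)
print(f"overlay offset: east={east_m:.1f} m, north={north_m:.1f} m")
```

The resulting offsets can then be applied to the headset's spatial anchor so that the warning appears at the real-world location of the hazard rather than at a fixed screen position.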
Demo Videos:
The following videos showcase key experimental demonstrations of the developed system in action. They highlight how edge-based AI services and immersive XR visualization work together to support real-time, cooperative driver assistance.
The demos were recorded during outdoor experiments on a 5G test network, where perception and decision-making were executed on GPU-accelerated edge servers, and hazard information was delivered to the driver via a Microsoft HoloLens 2 device. The videos illustrate both the system-level behavior and the user-facing immersive experience under realistic conditions.