The automotive industry is constantly evolving, developing new technology to make driving a more intuitive and enjoyable experience. If current trends are any indication, augmented reality (AR) could be the next big thing, helping car manufacturers create a new generation of innovative navigation solutions that bring human-like perception to vehicles.
According to a report by Fortune Business Insights, the global AR in automotive market is projected to grow from $4.51 billion in 2021 to $14.44 billion in 2028.
By overlaying computer-generated information on what the driver is looking at, such technology can enrich reality and enhance the user’s driving experience.
Integrating AR into the driving experience can help reduce cognitive load on drivers. Instead of having to translate abstract information they see on the vehicle’s display to what they see in the real world, they have explicit information displayed to them. With a pointer marking the exact turn they must take or the exact spot they need to stop at, the world is given more context and it becomes easier to understand.
Similarly, AR helps with situational awareness by warning drivers of hazards on the road and displaying other such alerts, enabling them to react quickly and avoid accidents. In this way, AR serves as a useful addition to existing advanced driver-assistance systems (ADAS).
What’s more, with AR, drivers can also see relevant location information such as real-world points of interest (POIs) along the way, be it parking spaces or nearby gas stations. By overlaying details such as POI ratings, AR makes it possible to view information while driving that wouldn’t be immediately visible otherwise.
The first generation of AI-powered AR vehicles gives the driver an indirect view of reality using AR video: virtual information is superimposed over a live video feed shown in the vehicle’s cluster or center display. This approach benefits from computer vision functionality with real-time road perception.
The user can also view the world directly through head-up displays (HUDs), which present data from gauges, navigation prompts and additional infotainment information such as the song that’s currently playing. AR HUDs often project information on the windshield of the car, overlaid onto the road ahead, keeping the driver’s eyes on the road and minimizing distractions. The offering of aftermarket HUDs also makes it possible to retrofit the tech to modernize existing vehicles.
However, even as OEMs rush to gain competitive advantage by integrating AR in their cars, the costs and extensive timelines required to create automotive-grade products mean that the technology remains inaccessible to the mass market.
Several factors need to be kept in mind in order to ensure a seamless AR driving experience.
Factoring in user understanding of the relation between AR objects and reality is key. Depth cues play a vital role here, especially when it comes to detecting and visualizing occlusion, according to TomTom Senior Interaction Designer Paul Schouten.
Occlusion commonly refers to a situation in which an object being tracked is blocked (occluded) by another object from a given viewpoint. In AR, handling occlusion correctly is essential to creating an immersive experience. Real-world objects shouldn’t be hidden behind virtual overlays; where they overlap, the affected virtual elements, or parts of them, should be rendered differently from the rest (e.g., darker), which is still a challenge at this stage in the technology’s development, Schouten says. Getting this wrong can distract the driver from the real world, putting them in an unsafe position.
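The occlusion logic described above can be sketched in a few lines. This is a minimal illustration, not Phiar's or TomTom's actual rendering pipeline: it assumes the perception stack supplies a per-pixel depth estimate for the real scene, and the function name, thresholds and render modes are all hypothetical.

```python
def resolve_occlusion(virtual_depth_m, real_depth_m):
    """Decide how to render one pixel of a virtual AR element.

    virtual_depth_m -- distance of the virtual object at this pixel
    real_depth_m    -- estimated distance of the real-world surface at the
                       same pixel (None if no depth estimate is available)

    Returns "draw" (virtual pixel visible), "dim" (a real object is just
    in front, so darken the virtual pixel to hint at the occlusion), or
    "hide" (a real object is well in front, so drop the virtual pixel).
    """
    if real_depth_m is None:
        return "draw"  # no depth estimate: render normally
    if virtual_depth_m <= real_depth_m:
        return "draw"  # virtual object is in front of the real surface
    # Real object is closer than the virtual one: dim if the gap is
    # small (to preserve context), otherwise hide entirely.
    return "dim" if virtual_depth_m - real_depth_m < 2.0 else "hide"
```

In practice the same test runs per pixel against a depth map, but the decision at each pixel is exactly this comparison.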
Virtual elements must also be highly visible and stand out from real-world objects, regardless of external conditions. This is easy to achieve with AR video, but trickier when it comes to HUDs, where light (AR elements) is being projected onto light (the driver’s vision of the outside world). When it’s bright outside, virtual objects are likely to be less visible.
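One common way to keep projected elements legible is to scale HUD luminance with ambient light. The sketch below is purely illustrative: the function, the log-scale mapping and the luminance range are assumptions, not a documented HUD API.

```python
import math

def hud_brightness(ambient_lux, min_nits=100.0, max_nits=10000.0):
    """Scale HUD luminance with ambient light so virtual elements stay
    visible against a bright background (illustrative sketch only).

    Ambient light spans roughly 1 lux (night) to 100,000 lux (direct
    sunlight), so we map it onto the display's luminance range using a
    log scale rather than a linear one.
    """
    t = math.log10(max(ambient_lux, 1.0)) / 5.0  # 1 lux -> 0, 1e5 lux -> 1
    t = min(max(t, 0.0), 1.0)
    return min_nits + t * (max_nits - min_nits)
```

At night the HUD drops to its minimum luminance; in direct sunlight it drives the display at full output, which is exactly when "light projected onto light" is hardest to see.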
Distance is a crucial factor too. While the computer can easily recognize nearby objects, projecting information about them is of little use to someone driving at high speed. Schouten stresses that navigation information especially should be displayed well in advance, so the driver is fully prepared for any situation the road has in store.
Breakthrough computer vision-powered spatial AI from Phiar, a company that develops AI specifically for automotive applications, combined with TomTom’s advanced navigation software, delivers a seamless in-vehicle AR experience.
Phiar’s lightweight Spatial-AI Engine and Mobility AR Engine enable vehicles to understand their environment better in real time. With the ability to run on any mainstream in-vehicle infotainment platform, Phiar’s real-time perception ensures object recognition and tracking, depth estimation, lane detection, semantic segmentation and 3D localization.
According to Phiar CEO Gene Karshenboym, the company makes it easy for OEMs and Tier 1 suppliers to integrate its SDK and offer AR capabilities, thanks to an easy-to-integrate platform that utilizes the vehicle’s existing IVI SoC, camera and sensors.
With a wide range of functions to perform and information to provide, prioritizing and structuring what’s displayed to the driver is essential in order to avoid cognitive overload and keep their attention on the road.
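Prioritization of this kind can be sketched as a ranking over display candidates, where safety alerts always outrank navigation cues, which in turn outrank POIs, and only a few items are shown at once. The categories, ordering and cap below are illustrative assumptions, not TomTom's actual UI logic.

```python
# Lower number = higher priority; categories are illustrative.
PRIORITY = {"hazard": 0, "maneuver": 1, "lane": 2, "poi": 3}

def select_display_items(items, max_items=3):
    """Keep only the highest-priority items to limit cognitive load.

    items -- list of dicts, each with a "kind" key naming its category
    Returns at most max_items items, highest priority first.
    """
    ranked = sorted(items, key=lambda it: PRIORITY[it["kind"]])
    return ranked[:max_items]
```

Because the sort is stable, items of equal priority keep their original order, so two POIs along the route stay in route order.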
To achieve this informational balance, TomTom’s SDK provides easy-to-integrate Navigation UI components, which ensure that the right information is displayed at the right place at the right time, says TomTom Senior Product Manager Automotive Mark Rietveld.
By taking a holistic approach in their design process, TomTom and Phiar promise AR elements that can add value within any possible screen configuration. In this way, AR becomes an integral part of the driving experience rather than an unnecessary technology that adds clutter.
TomTom and Phiar provide the Phiar AI/AR component pre-integrated and tested with TomTom’s NavKit2 SDK.