There’s no such thing as mapless automated driving

Snigdha Bansal
Staff writer
Jan 05, 2026 · 12 min read

With AI taking on a larger role in automated driving (AD), expectations of what autonomous vehicles should be capable of are rising. Vehicles are expected to go beyond reacting to what's ahead and even predict what will happen next. Amid this growing reliance on neural networks, maps have increasingly been labeled optional, or even obsolete. But as with any new technology, there's an evolutionary path to travel before things work perfectly. Safe automated driving still depends on contextual awareness and a grounded understanding of what lies ahead: how the road is shaped, how traffic typically behaves and what situations are likely to unfold beyond the range of onboard sensors.

While AI-driven systems are powerful, using them without a form of ground truth can introduce new risks, from misinterpretation to predictions that aren’t fully grounded in reality. That’s where maps play a critical role: by providing real-world context, they anchor automated systems in reality, helping ensure that predictions align with what’s actually there rather than with assumption. As the industry debates which sensors and data sources deliver the greatest return on investment, one question remains central: how important are maps for autonomous vehicles, really?

Traditionally, AD has relied on rule-based models, where the vehicle follows strict instructions pre-programmed by engineers to maintain safety. For example: “if a pedestrian is detected within 20 meters, apply the brakes.” Such models rely on fixed logic and conditions anticipated by engineers. While these systems work for the conditions they’ve been programmed for, any surprise on the road can throw them off. But with the influx of neural networks in recent years, AD’s ability to deal with the unexpected is improving.
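The fixed logic described above can be sketched in a few lines. This is a minimal illustration, not any real AD codebase; the function name and every threshold beyond the article's 20-meter example are assumptions:

```python
from typing import Optional

def rule_based_brake_decision(pedestrian_distance_m: Optional[float],
                              speed_kph: float) -> str:
    """Return a driving action from hard-coded, engineer-written rules."""
    if pedestrian_distance_m is not None and pedestrian_distance_m < 20.0:
        return "apply_brakes"       # the article's example rule
    if speed_kph > 130.0:
        return "ease_off_throttle"  # another fixed, pre-programmed condition
    return "continue"

print(rule_based_brake_decision(15.0, 50.0))  # -> apply_brakes
print(rule_based_brake_decision(25.0, 50.0))  # -> continue (just outside the rule)
```

The brittleness is visible in the last line: a pedestrian at 25 meters, or one the detector misses entirely, falls outside the rule, which is exactly the kind of surprise that throws rule-based systems off.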

AI automation and looking ahead with maps 

As AD technology develops, the industry is beginning to divide between those who believe AD needs a map and those who don't. Today’s AD systems rely heavily on sensors, especially in the perception layer, the part of the stack that turns raw sensor data into an understanding of the vehicle’s environment. To act safely, however, a vehicle relies on more than just sensors: the perception stack also incorporates data from maps.

In most advanced driving systems today, digital maps provide structure and context that sensors alone cannot always capture, especially when their view is obscured by unfavorable weather such as rain or snow. They tell the vehicle which lane it should be in, what the speed limit is, how intersections are shaped, where road edges are and what to expect beyond the range of its cameras.
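As a rough sketch of that interplay, a system might fall back on the map's stored speed limit when the camera's sign reading is missing or low-confidence, say in heavy snow. Every name and the confidence threshold here are illustrative assumptions, not TomTom's API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignDetection:
    speed_limit_kph: int
    confidence: float  # 0.0-1.0 from the camera's sign classifier

def effective_speed_limit(detection: Optional[SignDetection],
                          map_speed_limit_kph: int) -> int:
    """Prefer a confident camera reading; otherwise trust the map."""
    if detection is not None and detection.confidence >= 0.6:
        return detection.speed_limit_kph  # fresh observation wins
    return map_speed_limit_kph            # the map fills the sensing gap

print(effective_speed_limit(SignDetection(80, 0.9), 100))  # -> 80
print(effective_speed_limit(None, 100))  # -> 100 (sign obscured by snow)
```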

In the perception stack, if LiDAR gives you the immediate view of the ‘here’ and ‘now’, maps really help you understand what’s coming next.

Andrew Hart

CEO, SBD Automotive


Hart believes that supporting the vehicle’s understanding of the world with maps is key to building trust in the vehicle; a lack of that trust remains a barrier to the advancement of AD technology.

At the same time, thanks to AI, a new generation of automated driving software is emerging: end-to-end (E2E) AI models. Instead of splitting autonomy into separate modules for perception, prediction and planning, E2E models process sensor data within a single large neural network, allowing vehicles to make more dynamic decisions. Unlike traditional modular AD systems, these AI-driven models have a simpler architecture and can adapt and learn from their own experiences of the world. Similar to large language models (LLMs), these networks ingest sensor data and output driving decisions. Impressive as this sounds, the inscrutable, ‘black-box’ nature of the technology has raised safety concerns: because the models’ internal decision logic is so hard to inspect, it can be extremely difficult to understand why a vehicle acted the way it did, especially if it behaved unsafely.
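The architectural difference can be caricatured in a few lines of Python. These are stubs with assumed names, far removed from a real stack, but they show why the modular pipeline can be inspected step by step while the E2E model is a single opaque function:

```python
def perceive(frame):
    # Stub perception: report one pedestrian at the distance given in the frame.
    return [{"type": "pedestrian", "distance_m": frame["ped_m"]}]

def predict(objects):
    # Stub prediction: assume each object closes by 1 m over the horizon.
    return [{**o, "distance_m": o["distance_m"] - 1.0} for o in objects]

def plan(futures):
    # Stub planning: brake if anything is predicted within 20 m.
    return "brake" if any(f["distance_m"] < 20.0 for f in futures) else "cruise"

def modular_stack(frame):
    """Traditional AD: perception -> prediction -> planning, each inspectable."""
    return plan(predict(perceive(frame)))

def end_to_end_stack(frame, model):
    """E2E AD: one network maps raw sensor data straight to a decision.
    Simpler interface, but the intermediate reasoning is opaque."""
    return model(frame)

print(modular_stack({"ped_m": 15.0}))  # -> brake
```

If the modular stack brakes unexpectedly, an engineer can check the perception, prediction and planning outputs in turn; with the E2E model, there are no such intermediate results to examine, which is the 'black-box' concern in a nutshell.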

Modern AI systems can also misclassify objects, produce false positives or conjure scenarios that don’t exist in real life, sometimes described as ‘hallucinations’. In the case of autonomous vehicles, mislabeling an object or hallucinating a pedestrian could have catastrophic consequences.

Can autonomous vehicles go mapless? 

With E2E models reshaping the AD landscape, some companies position them as less dependent on maps, or even entirely map-free. The growing perception is that E2E models can ‘learn the map’ internally from camera data, making supporting maps optional. Michael Harrell, TomTom’s SVP Engineering Maps, disagrees.

Harrell explains that the recent breakthroughs powering today’s automation systems are built on two major advances in AI: self-supervised learning (SSL) and vision transformers (ViTs). SSL allows models to teach themselves by discovering patterns in enormous amounts of unlabeled data, far more than humans could ever annotate manually. This technique, widely used in generative AI, is now accelerating AD technologies by improving how vehicles understand raw sensor input. SSL helps fuse data from cameras, radar and LiDAR into a unified picture of the environment, making it easier to detect objects and understand what’s happening. It also improves trajectory prediction, helping the vehicle anticipate the future movements of objects and other vehicles.

ViTs, meanwhile, are deep learning models that process images and video to capture subtle cues traditional models often miss. In AD, they can interpret context to read a pedestrian’s body language and judge whether they’ll unexpectedly step onto the street, or whether a car’s slight drift signals that it might change lanes or that its driver is not paying full attention. ViTs can use past frames of video to predict what may happen in future ones, giving vehicles a stronger sense of unfolding events on the road.

But Harrell stresses that even with these advancements, the models are not infallible. Their predictions can be highly detailed, but they can’t always be trusted completely.

“Just like we don’t want LLMs to make things up, we don’t want these AD models predicting fantasyland ahead. We want them to predict reality. And the map is what grounds reality,” he says. 
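Harrell's grounding idea can be sketched as a filter: trust a model's predicted trajectory only while it stays on road the map knows exists. The lane is reduced here to a straight corridor, and every name and number is an illustrative assumption:

```python
def within_mapped_lane(point, centerline_y=0.0, half_width_m=1.8):
    """True if a predicted (x, y) point stays inside the mapped lane corridor."""
    _, y = point
    return abs(y - centerline_y) <= half_width_m

def ground_trajectory(predicted_points):
    """Cut a predicted trajectory short at the first point the map rules out,
    rather than letting the model 'predict fantasyland'."""
    grounded = []
    for p in predicted_points:
        if not within_mapped_lane(p):
            break  # the map says this can't be drivable road
        grounded.append(p)
    return grounded

print(ground_trajectory([(0, 0.2), (5, 0.9), (10, 3.5)]))
# -> [(0, 0.2), (5, 0.9)]  (the off-map point is discarded)
```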
Maps allow the vehicle to communicate its ‘thinking’ to the driver and passengers, helping maintain trust in the vehicle.

Harrell adds that maps remain essential not just for how the vehicle perceives the world, but also for how it communicates that understanding back to the human inside. Even if an E2E AI model is making the predictions, the system still needs a way to show the driver what it ‘sees’, what it plans to do next and how it interprets the road ahead.
“How do you display that? Through the map onto the screen,” he explains. “The map is the interface between the model and the operator, so it still has to exist.” 

To support both the vehicle’s intelligence and the driver’s trust, automakers need a map that is detailed enough for machine reasoning and clear enough for human interpretation — a map that is both technically robust and constantly up to date. 

Automating mapmaking to accelerate vehicular automation 

Building a robust, up-to-date map is no easy feat, and modern automation demands data at a scale and richness that the traditional way of mapmaking cannot deliver; it needs an overhaul, and fast. Maps have historically required manual updates, making them slow, expensive and difficult to scale. With the role of maps in autonomous vehicles evolving, TomTom has developed a new approach to mapmaking that uses AI, including foundational vision models, to automate map creation and updates, ensuring high precision and frequent refreshes.

Instead of relying only on survey vehicles, sensors and manual editing, TomTom’s Orbis Maps for Automation uses a combination of foundational models and in-house AI to automatically process multiple data sources and generate a highly detailed 3D map of every road, down to lane-level precision.

This foundation is further strengthened by TomTom’s latest launch, the ADAS SDK, which makes it easier for carmakers to integrate Orbis map intelligence directly into their vehicles’ AD systems.

 

The latest example of this approach is CARIAD, the automotive software company of the Volkswagen Group, which has selected Orbis Maps for Automation as a core component of its AD systems. By integrating Orbis map data into its AD stack, CARIAD will strengthen how its vehicles understand and anticipate their surroundings, using real-world spatial context to support functions ranging from Intelligent Speed Assistance to hands-free driving. In doing so, the map acts as a stabilizing layer — helping automated systems navigate complex driving scenarios in a predictable, human-like manner. 

The map in ‘mapless’ 

While the role of maps is evolving, they remain a critical part of how automated vehicles understand and navigate the world, even in so-called ‘mapless’ systems. Sensors can only interpret what they see, and a huge part of driving safely is preparing for the unexpected. For today’s AI-driven AD systems, context matters even more: accurate positioning, lane-level geometry, traffic rules, elevations and typical speeds all help an onboard model interpret situations faster and more reliably, and make decisions with more confidence. Maps act as a shared memory, a condensed record of countless vehicles’ real-life ‘experiences’ on a given road, supporting perception rather than replacing it.

‘Mapless’ automation is a transitional point of view. The path to scalable autonomy isn’t about choosing between maps and perception; it’s about combining them intelligently. Even as AI changes how maps are used in AD, they continue to be important for autonomous vehicles, whether to operate more safely, to anchor their understanding of the world or to maintain trust between the machine and the human behind the wheel. With expertise in both mapmaking and the automotive industry, and at the helm of this change, TomTom believes Orbis Maps can become a key component of the future of perception, making AD systems easier to develop, more affordable to build, and safer and more trustworthy for everyone on the road.
