The population boom in Boise has led to subsequent urban development, requiring mapmakers to take note and ensure digital maps reflect these changes quickly and accurately.
Government data can also be a helpful resource. For example, every month the US Postal Service adds new addresses that have started receiving mail to its records. These monthly updates alert mapmakers to potential changes in the world that need to be mapped.
“The US government actively tracks migration from state to state, and individual municipalities also provide us with sources to feed into our maps,” says King. “In this case, we have been working with the state of Idaho for several years. When they released state-wide geographical data a little over two years ago, we were able to zoom in on Boise and compare the localities with what our maps recognized. As the speed of development increased, so did our focus on the area.”
Once these areas have been identified, the question of how to edit the map arises. While mapmakers generally automate map editing to keep the process smooth, automation isn’t always the best choice. It might not work as well in areas that develop rapidly, like Boise, as it does in what King calls “maintenance geographies” — established areas like New York City (NYC) or Amsterdam where the volume of map changes is much lower.
Where automation fails
Automation enables changes to be added to the base map without much human intervention. Essentially, new location data containing additions, modifications or deletions arrives as a data set, which is then merged wholesale into the current map.
Once the data has been ingested, the map can be cleaned up for minor inaccuracies such as misspelled street names. This works reasonably well in a city like NYC, which is already more or less established; changes usually amount to, at most, the closure of a road or the opening of a new shop.
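The fully automated flow described above can be sketched in a few lines. This is an illustrative toy, not TomTom’s pipeline: the record shapes, IDs and the known-misspellings table are all hypothetical, chosen only to show a change set being merged wholesale and then swept for minor inaccuracies.

```python
# Hypothetical sketch of fully automated ingestion: a change set of
# additions, modifications and deletions is applied to the base map
# wholesale, then a simple cleanup pass fixes known minor errors.
# Data structures and the fixes table are illustrative only.

base_map = {
    "road-1": {"name": "Mian Street"},   # misspelled on ingest
    "road-2": {"name": "Oak Avenue"},
}

change_set = {
    "add":    {"road-3": {"name": "Elm Drive"}},
    "modify": {"road-2": {"name": "Oak Ave"}},
    "delete": ["road-0"],                # ids absent from the map are ignored
}

KNOWN_FIXES = {"Mian Street": "Main Street"}  # illustrative cleanup table

def ingest(base, changes):
    base.update(changes["add"])
    for road_id, attrs in changes["modify"].items():
        base.setdefault(road_id, {}).update(attrs)
    for road_id in changes["delete"]:
        base.pop(road_id, None)
    # minor-inaccuracy cleanup, e.g. misspelled street names
    for attrs in base.values():
        attrs["name"] = KNOWN_FIXES.get(attrs["name"], attrs["name"])
    return base

updated = ingest(base_map, change_set)
```

Note that nothing in this flow questions the quality of the incoming data set — which is exactly the weakness the article goes on to describe.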
However, King explains, a map is like a patchwork quilt of data — stitched together from several different sources that vary significantly in quality but can work together with careful crafting.
Lumping sources together and automating their ingestion without analyzing the quality of each would risk degrading the map’s data and sacrificing quality. At the same time, examining every new change by hand makes the process painfully slow, leaving valuable source material unused or letting it age until it’s no longer useful.
One step closer
To address the downsides of automation and update maps for boomtowns more quickly than manual processes allow, King and his colleagues created a new tool. Known as the Proactive Sourcing tool (PAS), it compares incoming source data containing changes against what the map currently reflects.
To put it simply, if every delivery from a source is compared to the original base map, there will always be a lot of differences, TomTom Regional Sourcing Specialist Thomas Byker tells me. So, PAS takes the newest data from a source and compares it against the last delivered version; you then only see the smaller changes the source provider has added within that time period. This means the editing process can remain hyper-focused on the changes freshly happening in a specific area.
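The delta idea Byker describes can be sketched as a simple diff between two deliveries. This is an assumption-laden illustration, not the PAS implementation: the deliveries are modeled as plain dictionaries keyed by a hypothetical road ID.

```python
# Illustrative sketch of the PAS delta idea: instead of diffing a fresh
# delivery against the whole base map, diff it against the previous
# delivery from the same source, so only what changed in the interim
# surfaces for editing. Record shapes are hypothetical.

def delta(previous, newest):
    """Return additions, deletions and modifications between two deliveries."""
    added    = {k: v for k, v in newest.items() if k not in previous}
    removed  = {k: v for k, v in previous.items() if k not in newest}
    modified = {k: newest[k] for k in newest.keys() & previous.keys()
                if newest[k] != previous[k]}
    return {"added": added, "removed": removed, "modified": modified}

previous_delivery = {"road-1": "Main St", "road-2": "Oak Ave"}
newest_delivery   = {"road-1": "Main St", "road-2": "Oak Avenue",
                     "road-3": "Elm Dr"}

# Only road-2 (renamed) and road-3 (new) surface for review;
# the unchanged road-1 never reaches an editor's queue.
changes = delta(previous_delivery, newest_delivery)
```

Comparing consecutive deliveries rather than delivery-versus-base-map is what keeps the change list small enough for humans to stay focused on a fast-growing area.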
This new approach promises faster returns than complete automation. When it comes to accuracy, maps updated by PAS using source material can only be as accurate as the source data itself, which is highly variable. In rapidly growing suburban areas especially, obstructions like tree cover combined with mountainous topography can make it difficult to quickly assess the quality of source data, according to TomTom Regional Sourcing Specialist Kurt McClure.
So, while PAS could help keep up with how fast cities like Boise expand, there is still scope to address accuracy. As it turns out, that’s possible using another in-house map editor.
A collaborative approach to map editing
Since October 2020, King’s team has been using Vertex, a visual map editor designed to ensure both freshness and overall quality of map data for TomTom and its partners.
“We wanted to take a more proactive approach to map editing. We saw the need for a tool to process numerous high-quality leads and sources at a faster speed. So, we provided a semi-automated solution that empowers mapmakers instead of having them depend on automated processes, especially in high-growth areas,” says Nochumson, who manages the day-to-day development of the tool.
Using a combination of local knowledge, aerial imagery and probe data, Vertex automatically proposes map updates to human editors — who then have the option to accept or reject them.
King’s team already has data from countless different sources made available for ingestion by the PAS tool. By importing it into Vertex as an editable layer, this data can act as a base for the proposed changes.
Think of it as a dish with ingredients sourced from different places. Map data prepared by PAS, when entered into what is called the Automated Road Tool (ART) layer in Vertex, results in map changes for editors to consider. Editors can also adjust the dish to their taste by adding missing street names or correcting wrong ones.
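The accept-or-reject workflow described above can be sketched as a review loop over proposed changes. Everything here is hypothetical — the `Proposal` class, the decision callback and the sample roads are invented for illustration and are not the Vertex or ART API.

```python
# Hypothetical sketch of the semi-automated review loop: proposals
# generated from the ART layer go to a human editor, who accepts or
# rejects each one and may touch up attributes such as street names.
# Classes and callbacks are illustrative, not the Vertex API.

from dataclasses import dataclass, field

@dataclass
class Proposal:
    road_id: str
    geometry: list                    # e.g. a list of (lat, lon) points
    attrs: dict = field(default_factory=dict)

def review(proposals, decide):
    """Apply an editor's decision function to each proposed change."""
    accepted = []
    for p in proposals:
        verdict = decide(p)           # "accept", "reject", or edited attrs
        if verdict == "reject":
            continue
        if isinstance(verdict, dict): # editor adjusted e.g. the name
            p.attrs.update(verdict)
        accepted.append(p)
    return accepted

proposals = [
    Proposal("road-9", [(43.60, -116.20), (43.61, -116.20)], {"name": ""}),
    Proposal("road-10", [(43.70, -116.30)], {"name": "Gost Road"}),
]

# Example editor: fill in a missing name, reject a badly sourced road
def editor(p):
    if p.attrs.get("name") == "":
        return {"name": "Quail Ridge Dr"}   # invented name for illustration
    return "reject"

accepted = review(proposals, editor)
```

The point of the design, as the article frames it, is that automation does the proposing while a human keeps the final say — the opposite of the fully automated merge that risked degrading quality.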
In a little over a year, the ART has generated over 50,000 kilometers of new road updates for editors to consider, not only in Boise but also in other similar geographies like Denton, Texas. This is a vast improvement over complete automation, the method mapmakers relied on earlier, which risked maps going stale in areas like Boise due to the varying quality of source data.