UX Prototyping in 4 Dimensions

Paul Schouten
Interaction Designer
Jul 07, 2021 · 4 min read

How corona made me add two dimensions to conventional 2D design

The beginning of 2020 was a thrill. I bought my first apartment, got a really cool new position at TomTom and… a pandemic hit us all.

As you probably know all too well, being quarantined at home wasn't fun. But on the bright side, it did come with a lot of extra spare time, which I decided to spend on learning new skills.

Half-Life: Alyx's amazing VR gameplay

Around the same time, a new chapter of my favorite game franchise was launched: Half-Life: Alyx, which made me super hyped for VR. And so, I set myself the challenge of building my own VR game during the lockdown. It was a decent challenge, as I had no prior knowledge of how to make video games at all, and the only programming skills I had were a bit of HTML and JavaScript coding for UI prototyping.

The great thing about working at TomTom is that they really support personal development. I knew I could use the same skill set in my new position, and therefore I got the opportunity to also spend some working hours on developing these skills.

So, in 2020 I basically learned to code in C# through working with Unity.

Adding the third dimension 

For my first 'VR game challenge', I decided to rebuild the apartment I had just bought in VR. Not really a game, but it would force me to learn the basics. Unity's learning platform helped me a lot with this, kudos to them for making it freely available when the pandemic started. When I signed the contract for the apartment, I only had a 2D top-view blueprint of the place, not even a 3D rendering or impression (only my imagination). But several weeks later I was walking through my future apartment in VR! And boy, that was interesting…
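To give an idea of how low-tech that first version was, here is a rough Unity C# sketch of the kind of script that extrudes a 2D top-view plan into walkable 3D walls. The coordinates and dimensions below are made up for illustration, not my actual floor plan:

```csharp
using UnityEngine;

// Extrudes a 2D top-view floor plan into simple 3D walls.
// The wall segments below are hypothetical; in practice you would
// trace them from the blueprint.
public class BlueprintExtruder : MonoBehaviour
{
    const float WallHeight = 2.6f;    // meters
    const float WallThickness = 0.1f;

    // Each wall is a 2D line segment (x, z) taken from the top view.
    readonly Vector2[,] walls =
    {
        { new Vector2(0, 0), new Vector2(8, 0) }, // south wall
        { new Vector2(8, 0), new Vector2(8, 5) }, // east wall
        { new Vector2(8, 5), new Vector2(0, 5) }, // north wall
        { new Vector2(0, 5), new Vector2(0, 0) }, // west wall
    };

    void Start()
    {
        for (int i = 0; i < walls.GetLength(0); i++)
        {
            Vector2 a = walls[i, 0], b = walls[i, 1];
            Vector2 mid = (a + b) * 0.5f;

            // A stretched cube is enough for a low-fidelity walkthrough.
            var wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
            wall.transform.position = new Vector3(mid.x, WallHeight * 0.5f, mid.y);
            wall.transform.localScale =
                new Vector3(Vector2.Distance(a, b), WallHeight, WallThickness);
            // Face the cube's thin axis perpendicular to the segment.
            wall.transform.rotation = Quaternion.LookRotation(
                new Vector3(a.y - b.y, 0, b.x - a.x));
        }
    }
}
```

Drop a script like this in an empty scene, add a VR rig or first-person camera, and you can already walk through the plan.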

Decorating my new apartment in 3D

It made me see things I hadn't realized before: how opening doors would affect the space, the effect of daylight through the windows, what furniture would be possible. Seeing it in 3D added a whole new dimension!

My new position at TomTom also forced me to think in that third dimension, as I would be focusing on new technologies, one of which is Augmented Reality.

I started that project by photoshopping several screenshots taken from dash cam footage, adding AR elements on top of those 2D images. But once you start to see the designs in 3D, on a virtual road, you stumble upon things you didn't imagine when looking at them in 2D: for instance, the distance of AR objects, their viewing angle, readability, color, and more.
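As an example of the kind of fix this leads to, here is a minimal Unity C# sketch (not our actual production code) that keeps an AR label facing the driver and scales it with distance, so it stays readable wherever it sits on the virtual road:

```csharp
using UnityEngine;

// Keeps a world-space AR label readable from the driver's seat:
// it always faces the camera and scales with distance so its
// apparent size stays roughly constant. A sketch, not production code.
public class ArLabelBillboard : MonoBehaviour
{
    public Camera viewer;               // e.g. the driver camera
    public float sizeAtOneMeter = 0.05f;

    void LateUpdate()
    {
        if (viewer == null) viewer = Camera.main;
        if (viewer == null) return;

        // Billboard: point the label's forward axis away from the camera.
        transform.rotation = Quaternion.LookRotation(
            transform.position - viewer.transform.position);

        // Counteract perspective shrink so text stays legible at range.
        float distance = Vector3.Distance(
            transform.position, viewer.transform.position);
        transform.localScale = Vector3.one * sizeAtOneMeter * distance;
    }
}
```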

Seeing 3D navigation manifest itself in the 3D space

The use of 3D in our design process didn't stop there. At TomTom, we design navigation experiences for multiple screens in the car. Seeing how information is distributed over those screens while sitting in the virtual car is very important. Is the information visible to the participant? How often, and for how long, does the user need to glance at the screen?

This is of course possible in VR, but a modular simulator setup is a bit easier on the stomach for our test participants. And with eye tracking hardware, we get great insights.
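A simplified sketch of how such glance measurements can be logged in Unity C#: cast a ray along the gaze direction and accumulate the time it dwells on each screen. The eye tracker API is vendor-specific, so the gaze source below is a stand-in (a head or tracker transform), and the "Screen" tag is an assumed project setting:

```csharp
using UnityEngine;

// Accumulates per-screen glance time from a gaze ray. The eye tracking
// hardware API is vendor-specific; gazeSource here is a stand-in for
// the tracker's signal. Assumes display colliders are tagged "Screen".
public class GlanceLogger : MonoBehaviour
{
    public Transform gazeSource;   // head or eye-tracker transform
    float glanceStart;
    string currentScreen;

    void Update()
    {
        var ray = new Ray(gazeSource.position, gazeSource.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 5f) &&
            hit.collider.CompareTag("Screen"))
        {
            if (currentScreen != hit.collider.name)
            {
                EndGlance();                      // left the previous screen
                currentScreen = hit.collider.name;
                glanceStart = Time.time;
            }
        }
        else
        {
            EndGlance();                          // gaze is off all screens
        }
    }

    void EndGlance()
    {
        if (currentScreen != null)
            Debug.Log($"Glance at {currentScreen}: {Time.time - glanceStart:F2}s");
        currentScreen = null;
    }
}
```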

Low fidelity setup of our test-driving environment, where space and location of screens are tested using eye tracking

Adding the fourth dimension (time)

Besides seeing our ideas in 3D, we can now also experience them in a highly dynamic context. This is essential for us, as we design interfaces that will be used in dynamic driving situations.

As you drive through the world, the interface changes accordingly. But the UI elements are also influenced by events happening in that world. What about getting a pedestrian warning while the UI is showing a guidance instruction? How does that work in AR when elements overlap? How does the test driver respond to it, and where do they look? I think you get an idea of how important this is for designing interfaces that support driver safety, one of TomTom's core missions.
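To make this concrete, here is a minimal sketch of the kind of arbitration logic involved when events compete for the same AR space; the event names and priorities are hypothetical, and the real system is more involved:

```csharp
using System.Collections.Generic;
using UnityEngine;

// When a pedestrian warning and a guidance instruction compete for
// the same AR space, show only the highest-priority event.
// A minimal sketch; priorities and names are made up.
public class HudEventArbiter : MonoBehaviour
{
    public enum Priority { Guidance = 0, SpeedWarning = 1, PedestrianWarning = 2 }

    class HudEvent
    {
        public Priority priority;
        public string message;
        public float expiresAt;
    }

    readonly List<HudEvent> active = new List<HudEvent>();

    public void Raise(Priority priority, string message, float duration)
    {
        active.Add(new HudEvent {
            priority = priority,
            message = message,
            expiresAt = Time.time + duration
        });
    }

    void Update()
    {
        active.RemoveAll(e => e.expiresAt <= Time.time);
        if (active.Count == 0) return;

        // Show only the most urgent event; lower-priority ones wait or drop.
        HudEvent top = active[0];
        foreach (var e in active)
            if (e.priority > top.priority) top = e;

        Debug.Log($"HUD shows: {top.message}"); // stand-in for the AR render
    }
}
```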

Dynamic events cause dynamic UI

To support this dynamic nature, it was important to create a 'living' city. For the first iteration we didn't base it on actual map data, so I got the awesome opportunity to design my own city, which turned out to be a lot of fun: creating different city areas, adding cars that randomly take turns and change lanes, building a pedestrian system, and implementing traffic rules such as traffic lights. A lot of work in hindsight, but programming is addictive!
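As a taste of what those systems look like, here is a minimal Unity C# traffic light cycling through its phases; the durations are made up. Cars in the city can check a light's current phase before entering the intersection:

```csharp
using System.Collections;
using UnityEngine;

// One intersection's traffic light cycling through its phases.
// Durations are arbitrary; the 'living' city runs many small
// systems like this for lights, lane changes and pedestrians.
public class TrafficLight : MonoBehaviour
{
    public enum Phase { Green, Amber, Red }
    public Phase Current { get; private set; } = Phase.Red;

    // Unity runs Start as a coroutine when it returns IEnumerator.
    IEnumerator Start()
    {
        while (true)
        {
            Current = Phase.Green; yield return new WaitForSeconds(12f);
            Current = Phase.Amber; yield return new WaitForSeconds(3f);
            Current = Phase.Red;   yield return new WaitForSeconds(15f);
        }
    }
}
```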

Driving through my own city

Another important aspect of time: changing weather effects and the day/night cycle. This is very relevant for UI design, and especially for AR, as visibility is directly affected by the environmental lighting.
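A simple way to get such a cycle in Unity, sketched below with an arbitrary cycle length, is to rotate the scene's directional light over time and dim it once the sun dips below the horizon:

```csharp
using UnityEngine;

// Drives a directional light through a full day in a few minutes,
// so AR legibility can be checked under changing environmental light.
// The cycle length is an arbitrary choice for testing.
public class DayNightCycle : MonoBehaviour
{
    public Light sun;                  // the scene's directional light
    public float dayLengthSeconds = 240f;

    void Update()
    {
        float t = (Time.time % dayLengthSeconds) / dayLengthSeconds; // 0..1
        // Rotate the sun a full 360 degrees around an east-west axis.
        sun.transform.rotation = Quaternion.Euler(t * 360f - 90f, 170f, 0f);
        // Dim the light when the sun is below the horizon.
        sun.intensity =
            Mathf.Clamp01(Vector3.Dot(-sun.transform.forward, Vector3.up)) * 1.2f;
    }
}
```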

A fifth dimension?

I don't think adding a fifth dimension will be our next step; I don't even know what that would mean for us…

We are, however, working on a more advanced driving simulator setup and on using real map data in our environment, to increase realism even further. So, expect more from us soon!
