
Tapping into zero User Interface and voice-controlled devices

Kardo Ayoub, Digital Creative Lead at The Team, looks at the evolution of User Interaction with devices and how the increased adoption of voice-controlled devices will change the design game.

For the past fifteen years I have been working in the design industry. Over that time I have had many job titles, ranging from webmaster and web designer to interaction designer and user experience designer, et cetera, et cetera.

I did my degree in Computer Visualisation and Interaction Design. Although I have always enjoyed the creative side of my career, the usability and interaction design side has interested me more.

My first proper project at university was to design a visitor information interface for a kiosk at the V&A. We had a 1024x768px touch screen and a physical scroll ball; you couldn’t do swipes back then. The challenge was to fit all the buttons, features and content on the screen and make the platform as intuitive as possible.

Over the years this hasn’t changed much. Screen resolutions, browser capability and computing power may have improved, but the primary way of interacting with a computer or device has remained a screen-based interface and physical inputs.

So far this has served its purpose, but it’s starting to feel a little clumsy. That’s because, for many years, tech companies have focused on improving the experience of using a device rather than the way we interact with technology.

Take a mobile phone, for example. My first mobile was an Ericsson A1018s; it was so big I used to call it “The Fridge”. Then Nokia came out with the much smaller 3310, with Snake installed on it, and I thought we had reached the summit and the world of tech couldn’t go any further. It was a masterpiece and a technological miracle.

Fast forward a few years and the iPhone was born, which changed the game completely, and we have been blessed with many fantastic devices ever since.

In the process we have invented new input methods and interaction patterns, but one thing has remained constant throughout the years: the way we interact with all these devices. We perform a physical action and receive our results on a flat 2D screen.

However, this is starting to change. A few years ago the ‘internet of things’ movement began: connected devices talking to each other to perform some of our daily tasks without us needing to get involved at all. And soon, with the increased adoption of voice-controlled devices, the design game will change again, this time more drastically.

As humans we use voice as our primary method of communication and for most of us, it’s the easiest way to express our emotions and intent. So, it only makes sense to use voice to interact with computers and get them to do what we want.

Let’s look at how different things could be if we used voice-assisted technology to book a holiday, for example…

At the moment we need to use a mobile or desktop to access a browser or an app, search for holidays, make a choice, select dates, book, enter payment details and pay.

We then organise accommodation and transport to and from the airport through a phone, app or browser.

We might also need to sort out transport or car hire, and then the process starts all over again.

But what if we could complete most of the process using our voice? What if we could speak to Alexa or Google Home just the same way we used to talk to travel agents 10 or 15 years ago?

What if Alexa could find you your ideal holiday because she knows exactly what you like, based on your past holidays, books you’ve read, songs you’ve played, food recipes you’ve looked up or – this is creepy but already happening – conversations you’ve had?

What if you could ask Alexa to find you a week-long holiday for two with a budget of £1,000, and let her do most of the legwork, from finding destinations and accommodation options to completing the entire process, including your travel insurance?
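To make that concrete, here is a minimal sketch of how a voice assistant might turn that spoken request into something a booking service can act on. It assumes a hypothetical structure and function (HolidayRequest, search_holidays); it is not any real travel or Alexa API, just an illustration of the slots hiding inside one sentence.

```python
# Hypothetical sketch: "find me a holiday for two, for a week, budget £1,000"
# broken down into structured slots a booking backend could act on.
from dataclasses import dataclass


@dataclass
class HolidayRequest:
    travellers: int          # "for two"
    budget_gbp: int          # "a budget of £1,000"
    duration_nights: int     # "for a week"
    include_insurance: bool = True  # "including your travel insurance"


def search_holidays(request: HolidayRequest) -> list[str]:
    # Placeholder: a real skill would call a holiday-search service here
    # and read the top options back to the user for confirmation.
    return [
        f"Option {i}: {request.duration_nights} nights for "
        f"{request.travellers} travellers, within £{request.budget_gbp}"
        for i in range(1, 4)
    ]


if __name__ == "__main__":
    spoken_request = HolidayRequest(travellers=2, budget_gbp=1000, duration_nights=7)
    for option in search_holidays(spoken_request):
        print(option)
```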

Of course, you would still need to see pictures of where you will be going and staying before confirming, which would require a screen of some sort: the equivalent of a travel agent showing you a brochure before sorting everything else out for you.

So, what does all this mean for us designers? What would happen to us if there weren’t any pixels to push around?

Luckily for us, we will still be very relevant. In fact, maybe more relevant than ever before. You see, designing a zero UI, or screen-less interface, requires a substantial amount of user experience research and design. I’d argue that interaction design matters even more here than it does for conventional user interfaces. Designers can no longer rely on shiny devices and glossy interfaces to hide design flaws. Interaction design has to work out how users phrase and construct statements, and how they would converse with the device, to make it feel as natural, and as human, as possible.
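As a rough illustration of why phrasing matters so much, many different utterances need to resolve to the same underlying intent. The sketch below is purely hypothetical (the intent names and patterns are made up, not any specific platform’s API), but it shows the kind of mapping a voice interaction designer has to think through.

```python
# Hypothetical illustration: several ways of saying the same thing
# should all resolve to a single canonical intent.
import re

BOOK_HOLIDAY_PATTERNS = [
    r"book (me )?a holiday",
    r"find (me )?a (holiday|trip|getaway)",
    r"i('d| would) like to go on holiday",
    r"plan a (holiday|trip) for (me|us)",
]


def resolve_intent(utterance: str) -> str:
    # Map whatever the user actually said to one canonical intent name.
    for pattern in BOOK_HOLIDAY_PATTERNS:
        if re.search(pattern, utterance.lower()):
            return "BookHolidayIntent"
    return "FallbackIntent"  # ask the user to rephrase


print(resolve_intent("Alexa, find me a getaway for next month"))  # BookHolidayIntent
print(resolve_intent("What's the weather like?"))                 # FallbackIntent
```

In practice a designer would gather far more real phrasings than this, but the principle stands: the conversation, not the screen, is the interface being designed.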

Designed the right way, voice-controlled devices can give brands a voice and a personality, enabling them to get closer to consumers, to be present in the environment where those consumers feel most comfortable, and to interact with them in the most natural way possible, without a glass barrier.
