We have seen many innovative changes to user interfaces (UI) in the past decade, with systems such as the touchscreen iPhone, commercial Virtual Reality (VR) headsets, and advances in Augmented Reality (AR).

Michael Poh of Hongkiat, a site covering all things tech, writes in a 2018 blog post about what next-generation UI may look like. Poh argues that we are highly likely to see gesturing in future UI. He notes that “in gesture recognition, the input comes in the form of hand or any other bodily motion to perform computing tasks, which to date are still input via device, touch screen or voice. The addition of the z-axis to our existing two-dimensional UI will undoubtedly improve the human-computer interaction experience.” He also links to a TED Talk that shows a prototype of what gesturing will look like while also discussing the future of UI:
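To make Poh’s point concrete, here is a minimal sketch of what adding the z-axis to gesture input might look like in code. The function name, coordinate convention, and gesture labels are my own illustrative assumptions, not any real gesture-recognition API:

```python
# Hypothetical sketch: classifying a simple gesture from tracked
# (x, y, z) hand positions. Convention (an assumption): x grows
# rightward, y grows upward, and z grows away from the screen,
# so a decrease in z is a "push" toward the display.

def classify_gesture(points):
    """Classify a gesture from a sequence of (x, y, z) positions."""
    if len(points) < 2:
        return "none"
    x0, y0, z0 = points[0]
    x1, y1, z1 = points[-1]
    dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
    # Pick the axis with the largest net displacement.
    axis, delta = max((("x", dx), ("y", dy), ("z", dz)),
                      key=lambda p: abs(p[1]))
    if axis == "x":
        return "swipe-right" if delta > 0 else "swipe-left"
    if axis == "y":
        return "swipe-up" if delta > 0 else "swipe-down"
    return "pull" if delta > 0 else "push"
```

The “push” and “pull” cases are exactly the z-axis dimension Poh describes: a depth gesture that a flat touchscreen cannot capture.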

Danielle Reid claims that “the user interface is becoming the world around us” in her Toptal blog. She describes Google’s Advanced Technology and Projects (ATAP) group’s development of Project Soli, which uses a miniature radar to track the motion of the human hand. This is another form of gestural interaction.

Project Soli: the latest in the evolution of UI design
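As a rough intuition for radar-based motion tracking, the sketch below estimates a hand’s radial velocity from successive range (distance) readings. This is a toy model under my own assumptions; Project Soli’s actual pipeline uses millimetre-wave Doppler signal processing, and the function names and threshold here are illustrative only:

```python
# Toy illustration of radar-style motion sensing: inferring motion
# from how the sensed distance to the hand changes over time.
# Not Project Soli's real API or signal chain.

def radial_velocities(ranges, dt):
    """Per-sample radial velocities (m/s) from range samples (m)."""
    return [(r1 - r0) / dt for r0, r1 in zip(ranges, ranges[1:])]

def motion_label(ranges, dt, threshold=0.05):
    """Label the overall motion as 'approach', 'retreat', or 'still'."""
    vs = radial_velocities(ranges, dt)
    v = sum(vs) / max(len(vs), 1)
    if v < -threshold:
        return "approach"  # hand moving toward the sensor
    if v > threshold:
        return "retreat"   # hand moving away from the sensor
    return "still"
```

Even this crude range-rate idea hints at why radar suits gestural UI: it senses motion directly, without cameras or physical contact.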

She claims that “screen-based UIs are slowly disappearing” and that in this “‘new world’ it will be more about designing experiences, not UIs”.
