Published on July 14th, 2020 | by Emergent Enterprise
How AI, ML, and AR Will Change the Face of Design
Because of the capabilities of technologies like artificial intelligence and augmented reality, designers will need to adopt a new approach to how they want users to experience a product or a service. That’s the perspective of Clive “Max” Maxfield at DesignNews, where he shares some of his thoughts on the new and challenging demands of design in the AI/AR age. It really does come down to a change in thinking, as the UX now happens alongside the user as they interact with the product, service, and interface. The designer has to think in a more non-linear way that places a high priority on the context of the end user.
Photo Source: Pixabay
The combination of artificial intelligence (AI), machine learning (ML), and augmented reality (AR) will change the face of design in the not-so-distant future.
It can be a funny old world sometimes. Here I am, quivering on the edge of my command seat, ensconced in the Pleasure Dome (my office), poised to pen this column on how the combination of artificial intelligence (AI), machine learning (ML), and augmented reality (AR) will change the face of design in the not-so-distant future. At the same time, I have just commenced the process of creating my first AI application, which is really focusing my attention and making me think about all of the areas in which I could use a little AI/AR help.
Before we plunge headfirst with gusto and abandon into the use of AI/AR in the context of electronic design, let us first set the scene a little (for the purposes of this column, we will take AI to encompass ML). I, personally, am very excited by all of this, because I truly believe that the combination of AI and AR is going to change the way in which we interface with our systems, the world, and each other.
One thing that’s important to note is the fact that we are still in the very early days of AI and AR. When Charles Babbage (1791–1871) commenced work on his Analytical Engine in the 1830s, he thought of this machine only in the context of performing mathematical calculations, which he very much disliked doing by hand. It’s fascinating to me that Babbage’s assistant, Augusta Ada Lovelace (1815–1852), mused on something akin to AI. In fact, Ada wrote about the possibility of computers using numbers as symbols to represent things like musical notes, and she went so far as to speculate about machines one day “having the ability to compose elaborate and scientific pieces of music of any degree of complexity or extent.”
The founding event of the field of artificial intelligence as we know and love it today was the Dartmouth Workshop, which took place in 1956. Following this meeting, a humongous amount of work took place over the years. Sad to relate, however, AI largely remained in the realm of academia until around the 2010s, at which point a combination of algorithmic developments and advances in processing technologies caused it to explode onto the scene.
In the 2014 version of the Gartner Hype Cycle, AI (in the form of ML) wasn’t even considered to be a blip on the horizon. Just one year later, in the 2015 edition of the Hype Cycle, ML had already crested the “Peak of Inflated Expectations.” The point is that this was only five years ago at the time of this writing. Today, AI pops up all over the place. For example, the Nebo handwriting recognition app on my iPad Pro uses multiple artificial neural networks (ANNs) to decipher notes I’ve made for myself that are so cryptic even I cannot decipher them without Nebo’s help. Meanwhile, my Subaru Crosstrek uses binocular cameras and machine vision to take control of the steering wheel and brakes to prevent me from wandering out of my lane or crashing into the car in front.
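To give a flavor of what an artificial neural network actually does under the hood, here is a minimal, hand-wired sketch of a feed-forward network that distinguishes a vertical bar from a horizontal bar in a tiny 3x3 "image." To be clear, this is purely illustrative: the weights are chosen by hand, and a real handwriting-recognition model like the ones in Nebo learns millions of weights from training data.

```python
import numpy as np

def relu(x):
    """Rectified linear unit, the most common ANN activation function."""
    return np.maximum(0.0, x)

def classify(image):
    """Classify a 3x3 binary image as 'vertical' or 'horizontal'.

    A toy two-layer feed-forward network with hand-chosen weights
    (illustrative only -- real networks learn these from data).
    """
    x = np.asarray(image, dtype=float).ravel()  # flatten to 9 inputs

    # Hidden layer: one unit detects the middle column, one the middle row.
    W1 = np.zeros((2, 9))
    W1[0, [1, 4, 7]] = 1.0   # middle-column detector
    W1[1, [3, 4, 5]] = 1.0   # middle-row detector

    # Bias of -2.5 means a unit fires only if all three of its pixels are set.
    h = relu(W1 @ x - 2.5)

    # Output layer: whichever detector fired wins.
    score = h[0] - h[1]
    return "vertical" if score > 0 else "horizontal"

vertical   = [[0, 1, 0],
              [0, 1, 0],
              [0, 1, 0]]
horizontal = [[0, 0, 0],
              [1, 1, 1],
              [0, 0, 0]]
```

Even at this toy scale, the essential structure is the same as in production networks: a weighted sum of inputs, a nonlinearity, and a final layer that turns scores into a decision.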
Think of how far we’ve come from the first point-contact transistor in 1947 to silicon chips containing tens of billions of transistors today. The fact that this is a tad over 70 years (which really isn’t long in the scheme of things) is deceptive, because technology is evolving at an exponential rate, to the extent that it’s almost impossible to predict where we will be in as little as ten years’ time. All I can say is, “Don’t blink!”
Sometimes the general public latches onto a term that is perhaps not the optimum choice. Such is the case with AR, which — as we previously indicated — is short for “augmented reality.” As its name suggests, AR refers to an interactive experience of a real-world environment in which objects that reside in the real world are enhanced by computer-generated perceptual information.