I realize the irony of publishing this in the week that Intel announced it was pulling the plug on its smart glasses project, Vaunt, but I think that the smartphone’s days are numbered, and smart glasses are going to be the thing that kills it.
If you’re anything like me, you break out in a cold sweat whenever your phone’s battery drops below 10%. The idea of going without a phone would feel like losing a limb. An invisible digital limb, but still, a limb.
So the idea of not having a phone anymore seems like a totally alien concept, but I’ve seen some emerging technology recently that’s convinced me that we’re only a few years away from the end of the smartphone.
Now, I don’t mean that in five years’ time the smartphone will be gone completely; that would be madness. I just mean that the technology that’s going to kill it is already here, and in five years we’ll have heard the first death knell. Let me explain.
Getting to where we are now
It was 2007 when Steve Jobs first walked out on stage in his black roll-neck, comfortable trainers, iPhone 1st gen in his hand (albeit not a fully functioning one) and changed the industry. That’s just over ten years ago. To put it another way, that’s the year Pixar’s Ratatouille came out.
And 2007 wasn’t even when it went mainstream. At the time I thought: ‘You don’t need a smartphone. A phone only really needs to be able to make calls, send texts, and play these sweet, sweet polyphonic ringtones.’
And so, clutching on to my Nokia 3510 (yes, I know, it could access the internet, but this was back in the days that internet on a phone cost four million pounds a minute) and my strongly held beliefs, I stared down the oncoming wave of smartphone adoption and dug in my heels.
Little did I know that in a few short years, everything I’d known about phones, about connecting to the internet, about socializing, and most importantly about ringtones would be totally changed.
Over those years, phone screens have got bigger and bigger, taking up more and more of the handset’s real estate, until we’ve reached the point where manufacturers are carving a ‘notch’ into the display to house essential components rather than keep even a thin bezel.
Where we’re going
The only reasonable way to go from here is a screen beyond the screen: a display with no physical object confining it. And the easiest way to achieve that is to bring the screen closer to your eye, creating the illusion of size.
Now, I know what you’re thinking: ‘Andrew, we already have smart glasses. And they’re not good.’
The AR glasses that we’ve had so far have undoubtedly proved useful (admittedly more on the business than consumer side of things) and have taken the bold first steps. The first ever smartphone wasn’t widely adopted either.
But the building blocks are there, and not just in terms of the display. We’ve got bone-conduction audio built into running headphones, letting you listen to music while still clearly hearing the world around you, and EEG headsets that could let you control your smart glasses using just your brain.
Now, EEG control is the bit of the equation that is the furthest away from a technology that we’re used to using, but that doesn’t mean it’s a long way from being a commercially available product.
I was recently in Dubai for the GESF education conference where I flew a drone with my brain using a commercially available EEG headset. A version of this headset was used to control a Formula One car, so the idea of controlling an electronic device that plays Spotify and makes calls really isn’t that far-fetched.
And this ties into a current wave in electronics: the move away from the unnatural methods with which we interact with our phones. Tapping, swiping and typing may feel natural, but so does driving stick if you’re used to it, and there’s nothing more unnatural (and 20th-century) than pulling levers, pushing pedals and twisting wheels just to make a machine work.
There’s a reason that talking to a voice assistant is more satisfying than tapping on a screen (when it works), as it’s a more natural process.
Imagine if the next step of that was just thinking what you want to happen and it happens. A notification pops up in the corner of your field of vision and you’re able to ‘think’ it away. It sounds sci-fi, but the truth is, we’re really not that far away.
Emotiv, the makers of the drone-flying EEG headset mentioned above, are currently working on a version of the headset that could well take the form of glasses. In the company’s words:

“The research headset, it’s not the most convenient thing to put on your head. But there are a couple of things that people don’t mind putting on their head. And it’s going to come out soon.”
Just a little Moore of Moore’s law
While the last few years have seen claims that we’ve reached the end of Moore’s law (the observation that the number of transistors per square inch on an integrated circuit doubles roughly every two years), technology is undoubtedly still developing rapidly, while phone improvements seem, well, iterative.
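To give a back-of-the-envelope feel for what that doubling rate implies, here’s a minimal sketch; the starting density and time spans are purely illustrative, not measured figures:

```python
# Illustrative sketch of Moore's law: transistor density
# doubling roughly every two years.

def projected_density(start_density: float, years: float, doubling_period: float = 2.0) -> float:
    """Project transistor density after `years`, doubling every `doubling_period` years."""
    return start_density * 2 ** (years / doubling_period)

# Hypothetical starting point: 100 million transistors per square inch.
start = 100e6
for years in (2, 4, 10):
    print(f"after {years:>2} years: {projected_density(start, years):,.0f} transistors/sq in")
```

Ten years at that pace is five doublings, a 32x jump, which is why even a few years of continued progress matters so much for miniaturizing glasses-sized hardware.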
It’s been years since a phone came out that was a genuine game-changer, and that doesn’t match up with the progress of technology. The rate of technological development we’re currently experiencing means the gap between the laughable and the commonplace is shrinking.
The difference between ten years ago and tomorrow is the same as the difference between tomorrow and three years from now. Then that gap again will basically be a year. Then a few months.
Before you know it, we’ve had generations of industry-changing technology in five years. My numbers are rough, but you get the point.
With rumors that Apple is working on smart glasses, Facebook confirming it’s thinking about using EEG control to send messages, and Microsoft filing patents for a mind-controlled Windows app, it’s entirely possible that the next big leap in this tech is only a couple of years away.
Current EEG lets you control a cursor with your brain, which would make for a pretty laborious texting process, especially if you just want to fire off some swift banter. But a research headset recently created by MIT could fix that problem by monitoring ‘subvocalizations’: the imperceptible muscle signals sent from your brain to your mouth when you speak in your head.
I’m aware these technologies aren’t ready yet, but let’s not forget: neither was the iPhone. I’m certainly excited about the possibility of a new game-changing technology, especially with all the advancements being made in computer vision, which could mean smart glasses able to identify what you’re looking at and turn you into a real-life RoboCop.