AI might be all the rage, but for consumers its real-world benefit is hampered by a lack of visibility into our environment, something smart glasses could solve.
Few of us are willing to roam the streets holding up a smartphone camera to guide the way, whether for fear of crime, an accident, or simply looking like a tourist. That’s why smart glasses are the logical next technology to deliver future AI services.
With Ray-Ban’s £299 Meta AI glasses showing the world how it can be done (check out the whimsical Super Bowl advert), the race to replace the smartphone is on. Recent updates to Meta AI help with reminders, planning and brainstorming, and it is a more natural conversationalist than in previous editions. And, of course, the glasses take prescription lenses for people who actually need them.
The smartest glasses framing up for launch
There are many products currently under development, including Samsung’s secret Android project and whatever is taking so long in Apple’s labs. These concepts pack in cameras, microphones, and speakers or bone conduction technology to provide sound, along with cutting-edge connectivity features. Because the AI does its work in the cloud, the glasses don’t need the processing power and battery of previous bulky versions.
That means lighter and, hopefully, cheaper models are on the horizon once the first generation or two see the novelty wear off. If they take off as a hot consumer item in the way smartphones and watches have, they will help with shopping, navigation and more, all without the need to fixate on a screen.
Alongside the traditional tech brands, there are plenty of new competitors taking a different approach. Halliday recently raised $3.5 million on Kickstarter for its invisible display system. China’s Meizu is proud of its real-time translation feature, and others are using materials science, display innovation and low-power technology to create further innovative approaches.
Aside from making glasses cooler, they could help in more everyday situations for everyone: we’d love a pair that could identify someone saying hello whom you haven’t seen in months or years, while also improving life for people with sensory health conditions and other limiting issues.
Solving the tech problems first
One reason many of these projects have taken so long is the need to fit all human head shapes and sizes, and then operate alongside our binocular vision. These are problems Samsung, Apple and others are working hard on.
Do read Apple’s new patent that aims to solve the vision issue! “Apple’s patent application describes using an optical architecture that includes at least one sensor in the bridge or frame of smartglasses that can directly measure the binocular boresight of the two displays relative to each other. The sensor(s) can be complemented by a projector having an oversized field-of-view (FoV) and capable of performing electronic image shifting to correct for detected misalignment.”
With billions invested, and products needing to hit the market in the next couple of years to avoid being left behind, the rest of the 2020s will be a very exciting time for smart glasses, before they are likely phased out by buttonhole cameras, jewellery-mounted devices or those delightful orbs seen in 2017’s The Circle movie (trailer).

Whatever the hardware, the AI back-end is already developing at a phenomenal rate, with AIs holding natural conversations and understanding ever more of our world. By the time smart glasses are a common consumer purchase, it will be hard to tell a conversation with an AI apart from one with another person.
This creates great opportunities for marketers, brands and businesses to build a more “natural” relationship with customers and prospects, rather than relying on a hit-and-miss series of adverts or messages that remain largely woeful, even in the connected age.