By Bobby Jefferson on Wednesday, 23 October 2024
Category: Tech News

AR Glasses Have AI Now, But What Does That Mean?

Key Takeaways

AR glasses paired with AI could identify and provide data about individuals using facial recognition technology.
AI advancements enhance AR glasses to augment reality by showing detailed information in real time.
Combining AI with AR glasses can create a personal AI chatbot that learns and remembers daily habits, raising privacy concerns.

Augmented Reality (AR) glasses are becoming affordable and compact enough to be desirable wearable gadgets. At the same time, AI technology has advanced to the point where what these glasses can do is supercharged almost beyond imagination. With AI integrated into AR glasses, what does this mean for you, and for all of us?

Recognizing Faces (and Pulling Their Data)

In October 2024, a pair of Harvard students modified Meta Ray-Ban smart glasses so that they could identify and pull up information about anyone they saw in public. Basically, they rigged up a system where AI facial recognition compares a person's face to social media profiles and then feeds their name and any other available details back to the wearer. This meant they could walk up to almost anyone, know their name and other details about them, and even pretend to have met them before.
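The students' exact pipeline isn't public code, but the core step (comparing a captured face against a set of labeled photos) can be sketched with an off-the-shelf library. Below is a minimal illustration using the open-source face_recognition library; the gallery, file paths, and names are hypothetical, and the social-media scraping side of their project is deliberately left out.

```python
# Minimal face-matching sketch (illustrative only, hypothetical files/names).
import face_recognition

# Hypothetical gallery: photos you already have, labeled by name.
gallery = {
    "Alice Example": "photos/alice.jpg",
    "Bob Example": "photos/bob.jpg",
}

# Pre-compute one 128-dimensional encoding per known face.
known_names, known_encodings = [], []
for name, path in gallery.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos where no face was detected
        known_names.append(name)
        known_encodings.append(encodings[0])

# Encode faces in a frame captured by the glasses' camera (hypothetical file).
frame = face_recognition.load_image_file("captures/unknown_person.jpg")
for unknown in face_recognition.face_encodings(frame):
    distances = face_recognition.face_distance(known_encodings, unknown)
    best = distances.argmin()
    if distances[best] < 0.6:  # the library's common default tolerance
        print(f"Match: {known_names[best]} (distance {distances[best]:.2f})")
    else:
        print("No match in gallery")
```

The point is how little code the matching step takes; the hard (and ethically fraught) part is assembling the labeled photo database in the first place.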


Of course, none of the fundamental technology is new. Governments, companies, and even individuals can already do this with video footage or photos. What makes this special is the combination of glasses people wear on their face and the power of modern machine learning in the cloud, and, increasingly, on local devices. Either way, there's a good case for actively protecting yourself from facial recognition.


Recognizing faces and telling you about the people around you is just a subset of the larger set of tricks AI can enable in AR glasses. Ultimately, the entire point of augmented reality is to, well, augment reality, and traditionally that's meant enhancing what our eyes can see of the world. Think of all those movies where we see the world through the eyes of a machine, or of someone wearing what's essentially sci-fi AR glasses: information pops up about whatever they're looking at.

Again, this is something we've had for a while, but recent improvements in AI technology have made a huge difference to how useful and accurate it is. I've spent a lot of time feeding images into ChatGPT, and it's excellent at knowing what it's looking at. Perhaps more importantly, newer AI could help these enhancements pop up only when the context is right, rather than flooding you with everything or waiting for you to ask for every little thing manually.
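For a rough sense of what "feeding images" into a model looks like today, here's how a single camera frame could be sent to a vision-capable model through the OpenAI Python SDK. The model name, file name, and prompt are placeholders, and a real pair of glasses would obviously do this continuously rather than one file at a time.

```python
# Sketch: ask a vision-capable model what a captured frame shows.
# Assumes the OPENAI_API_KEY environment variable is set; file name is hypothetical.
import base64
from openai import OpenAI

client = OpenAI()

with open("scene.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder for whatever vision model you use
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What am I looking at, and is there anything worth knowing about it?"},
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)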

You may have heard of a controversial AI feature in Windows 11 known as "Recall," which effectively records what happens on your screen and then uses AI to analyze the images, so you can ask a chatbot about anything you did on your computer or use that data for new tasks.

Combining AI with AR smart glasses means you could have the equivalent feature for your life beyond the computer screen. Your personal AI chatbot could see what you're seeing and build a working memory of your habits, common problems, and so on. It would also mean you could offload some of your own memory onto the AI and ask it about things you don't remember.

This immediately brings up a million privacy concerns, and they all need to be addressed, but at the same time, this could be a true killer feature for wearables like these. Giving the software a view of what you actually do day-to-day could be useful, as long as there's an effective kill-switch for those times you don't want it watching.
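Neither Recall's internals nor any shipping glasses feature works exactly like this, but the basic idea is simple enough to sketch: store timestamped descriptions of what the glasses see, let the wearer search them later, and respect an explicit capture toggle as the kill switch. Everything below (class name, method names, example entries) is made up for illustration.

```python
# Toy sketch of a "memory for your life" log with a capture kill switch.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class LifeLog:
    capture_enabled: bool = True                          # the wearer's kill switch
    entries: list[tuple[datetime, str]] = field(default_factory=list)

    def record(self, description: str) -> None:
        """Store a scene description (e.g. produced by a vision model)."""
        if self.capture_enabled:
            self.entries.append((datetime.now(), description))

    def recall(self, keyword: str) -> list[str]:
        """Return logged moments that mention the keyword."""
        return [
            f"{ts:%Y-%m-%d %H:%M}: {text}"
            for ts, text in self.entries
            if keyword.lower() in text.lower()
        ]


log = LifeLog()
log.record("Left the car keys on the kitchen counter")
log.capture_enabled = False                 # wearer pauses capture
log.record("Private conversation")          # ignored while paused
log.capture_enabled = True
print(log.recall("keys"))                   # "Where did I leave my keys?"
```

A real product would pair something like this with semantic search and a language model on top, but the privacy question is the same either way: who holds that log, and how easy is it to switch off?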

Helping You Hands-On With Your Work

In the late 2000s, BMW released a concept video showing how AR could help a mechanic know exactly what to do when working on one of its cars. The software would guide them through each step and show how to perform the different tasks.

A modern pair of AR smart glasses driven by AI technology could help you in the same way, but of course, current and future AI technology is much more flexible and could be adapted to many different areas. So if you're trying to install RAM in your laptop, or need to figure out how to fix your pool filter, you could get the right advice on-screen whenever you need it.


The Dawn of "Present" AI Companions?

The last big thing I foresee in the short- to medium-term future is the rise of AR chatbots. Right now, you can have an AI chatbot living in your phone as a voice, or perhaps as a virtual face on a screen. You can have one just a word away thanks to wireless buds or smart speakers. With AR technology, especially more advanced forms of mixed and extended reality, your AI assistant can gain a virtual presence in the space around you. Imagine it sitting in a chair across from you, or walking next to you in a store, helping you pick out shoes or find the best deals.

AR glasses could completely change the way we interact with AI chatbots, and make them feel like "real" people. At least until you try to shake their hands! Although some might argue that the actual glasses themselves aren't quite ready for prime time yet.

(Originally posted by Sydney Butler)