How AR and AI integration is enhancing user experience

As AR technology continues to evolve, we can expect more devices to incorporate AI for enhanced functionality.

Desi Aleksandrova, Marketing Communications Manager


Augmented reality (AR) technology interests me, both as a potential consumer and as a marketing professional in the field. Working at a company that develops AR dimming technologies keeps me informed about industry advancements – knowledge that will undoubtedly be useful whenever I’m ready to buy a pair of AR glasses for myself.

While my company’s focus is on hardware solutions for improving users’ visual experience and comfort, I am equally intrigued by other developments that are impacting AR glasses adoption. Just as with any electronic device, the synergy between hardware and software is crucial in delivering a great user experience and driving widespread adoption.

The integration of AI and AR

A conversation with FlexEnable’s CRO, Erin McDowell, last year on AR trends brought up the topic of the integration of AR and artificial intelligence (AI). Erin believes that “AI will evolve AR and VR into more than game-playing and looking at your screens through a device – for example, a headset could learn your patterns, anticipate what you might want to know, and provide relevant guidance and information.”

Erin’s prediction doesn’t surprise me – the integration of AI into various professional domains is becoming increasingly prevalent, and it’s evident how it could improve efficiency and productivity for almost anyone. Considering AR glasses are often envisioned as extensions of our smartphones, with forecasts even suggesting they may eventually supplant them, it’s not hard to imagine how the fusion of AI and AR holds significant appeal for consumers.

Device makers are already integrating AI into AR – this inspired me to explore some of the ways in which the combination of these two novel technologies will improve the user experience.

Enhanced contextual information

One of the uses that got me excited is the enhanced contextual information that users can receive about their surroundings. This would be particularly useful for regular travellers or when shopping. By leveraging computer vision and object recognition algorithms, the glasses can identify objects, landmarks, or people in the user’s field of view and overlay relevant information on the display. For example, users can receive instant translations of foreign text, historical facts about landmarks, or information about products in a store, enhancing their understanding and interaction with the environment.
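To make the recognise-then-overlay flow above concrete, here is a minimal sketch in Python. It is purely illustrative: a real pair of AR glasses would run a computer-vision model on camera frames, whereas here the recogniser and the knowledge base are stubbed with lookup tables so the pipeline is runnable on its own.

```python
# Hypothetical sketch of the contextual-overlay pipeline: recognise what
# is in view, then fetch and display relevant information about it.
from typing import Optional

# Stubbed knowledge base (a real system would query live data sources).
KNOWLEDGE_BASE = {
    "eiffel_tower": "Completed in 1889 for the Paris World's Fair.",
    "stop_sign": "Translation: ARRET (French road sign).",
}

def recognise(frame: str) -> Optional[str]:
    """Stub for an object-recognition model: maps a camera frame to a label."""
    return frame if frame in KNOWLEDGE_BASE else None

def overlay_for(frame: str) -> str:
    """Build the text the glasses would overlay for the current frame."""
    label = recognise(frame)
    if label is None:
        return ""  # nothing recognised: show no overlay
    return KNOWLEDGE_BASE[label]
```

Calling `overlay_for("eiffel_tower")` would surface the stored landmark fact, while an unrecognised frame yields an empty overlay.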

Natural interaction and gesture recognition

This function will allow users to interact with virtual content in a more intuitive and natural way. By tracking hand movements and gestures, AI algorithms can interpret user inputs and manipulate virtual objects or interfaces accordingly. This enables hands-free interaction with AR content, enhancing usability and convenience, especially in scenarios where physical input devices are impractical or cumbersome.
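The gesture-driven interaction described above boils down to mapping recognised gestures onto UI actions. The sketch below assumes a hypothetical gesture classifier (real glasses would derive gestures from hand-tracking landmarks via an ML model) and shows only the runnable dispatch step.

```python
# Hypothetical gesture-to-action dispatch. The gesture label would come
# from a hand-tracking model; here it is passed in directly.
GESTURE_ACTIONS = {
    "pinch": "select virtual object",
    "swipe_left": "previous panel",
    "open_palm": "open menu",
}

def handle_gesture(gesture: str) -> str:
    """Map a recognised gesture to a UI action; ignore unknown input."""
    return GESTURE_ACTIONS.get(gesture, "no-op")
```

A "pinch" would select the virtual object under the user's hand, while an unrecognised gesture is safely ignored.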

Personalised assistance and recommendations

By analysing user data, such as past interactions, browsing history, and location, AI algorithms can provide customised recommendations for nearby attractions, restaurants, or activities. Additionally, AI can assist users with tasks like navigation, scheduling or shopping, by anticipating their preferences and offering relevant suggestions in real-time.
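As a toy illustration of preference-based recommendation, the sketch below ranks nearby places by how often their tags appear in the user's interaction history. The data shapes and scoring rule are my own assumptions; production systems use far richer signals and models.

```python
# Hypothetical preference-based ranking: score nearby places by overlap
# with tags the user has engaged with before.
from collections import Counter

def recommend(history_tags, nearby_places, top_n=2):
    """Rank places by how many of their tags the user has engaged with."""
    prefs = Counter(history_tags)  # tag -> engagement count
    scored = sorted(
        nearby_places,
        key=lambda place: sum(prefs[tag] for tag in place["tags"]),
        reverse=True,
    )
    return [place["name"] for place in scored[:top_n]]
```

For a user whose history is dominated by art-related interactions, an art museum would outrank a coffee shop in the returned list.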

Augmented collaboration and communication

By integrating features such as real-time language translation, spatial annotation, and remote assistance, AI-enhanced AR glasses allow users to communicate seamlessly across language barriers, annotate shared visual content, and receive expert guidance or support from remote collaborators. This will enhance teamwork, knowledge sharing, and productivity in various professional and social settings.

AI-powered AR glasses

Some AR glasses already incorporate AI technology, enabling the use cases above to some extent, or are available to developers working on AI functionality.

For example, HoloLens 2, developed by Microsoft for enterprise applications, integrates AI for various purposes such as spatial mapping, hand tracking, and voice recognition.

Magic Leap 2 Developer Edition, on the other hand, aims to provide a platform for developers to create AI-optimised augmented reality visualisations for healthcare, as well as AI-driven apps tailored to the manufacturing industry.

In April 2024, Vuzix announced that it had entered into a partnership with IT services company Perview, which will offer its customers voice-driven AI solutions using Vuzix AR smart glasses to solve business problems through hands-free user interaction.

Similarly, in October 2023 Ray-Ban and Meta introduced the Ray-Ban Meta smart glasses collection, which allows users to engage with Meta AI using their voice.

These are just a few examples, and as AR technology continues to evolve, we can expect more devices to incorporate AI for enhanced functionality and user experience. It seems to me that the combination of AR glasses and AI has huge potential, and we will see more innovations that further improve this synergy. While I’m not ready to buy my first pair of AR glasses just yet, I am confident that once I do, they’ll be an essential tool for me.