
Meta’s new smart glasses fail to convince in real-world tests

By Staff

Sep 18, 2025

Meta has unveiled smart glasses with a screen in the right lens, allowing users to read WhatsApp messages, view maps, or translate conversations directly on their face.

These AI glasses, described by the company as the most advanced in the world, mark the first time Meta has integrated a display into its smart Ray-Bans.

Mark Zuckerberg envisions these high-tech glasses as the future of portable computing, stating at the launch event that they are “the only form factor where you can let AI see what you see, hear what you hear”.

The glasses go on sale on 30 September for $799 (£587). The display is operated using a neural band worn around the user’s wrist, which tracks hand movements.

The display on the new Meta Ray-Bans can show wearers directions in their lenses. Pic: Meta

The glasses feature touch controls that allow users to adjust volume, zoom the camera, and close the display with simple gestures. Future updates will enable users to write texts by drawing letters in the air.

“The band’s ability to detect signals is remarkable – it can measure movement before it is visually noticeable,” said a Meta spokesperson.
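
Meta has not published how the band’s software works, so the following is a purely illustrative sketch: the control scheme described above (small hand movements mapped to actions like adjusting volume, zooming the camera, and closing the display) might be wired up along these lines. Every class name, threshold, and gesture label here is a hypothetical stand-in, not Meta’s actual API.

```python
# Purely hypothetical sketch: Meta has not published the neural band's API.
# It only illustrates the idea of mapping wrist-muscle (EMG) signals to
# display actions like the ones described in the article.

from dataclasses import dataclass

@dataclass
class EmgSample:
    """One window of surface-EMG readings from the wristband (invented)."""
    channels: list[float]  # per-electrode activation levels, 0.0 to 1.0

def classify_gesture(sample: EmgSample) -> str:
    """Toy classifier; a real system would use a trained model, which is
    how EMG can register intended movement before it is visible."""
    peak = max(sample.channels)
    if peak < 0.2:
        return "none"          # below noise floor: no gesture
    if sample.channels.index(peak) == 0:
        return "pinch"
    if sample.channels.index(peak) == 1:
        return "thumb_swipe"
    return "fist"

# Gesture-to-action table mirroring the controls described above.
ACTIONS = {
    "pinch": "close_display",
    "thumb_swipe": "adjust_volume",
    "fist": "zoom_camera",
}

def handle(sample: EmgSample) -> str | None:
    return ACTIONS.get(classify_gesture(sample))

print(handle(EmgSample(channels=[0.9, 0.1, 0.3])))  # -> close_display
```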

Or pop up with incoming WhatsApp messages. Pic: Meta

The company claims that the glasses are designed to enhance users’ awareness of their surroundings and keep them engaged with the world around them. However, my personal experience trying the glasses at a Meta event last month was quite the opposite.

I found myself so engrossed in the display that during an interview with Ankit Brahmbhatt, director of product management at Meta, I realized I was inadvertently watching a game on my lens.

Tech reporter Mickey Carroll talks to Mr Brahmbhatt

I admitted this to Mr Brahmbhatt and questioned whether the glasses would truly facilitate better face-to-face interactions or if people would just appear more engaged because they weren’t staring at their phones.

“We acknowledge that we don’t have all the answers yet. As with the early days of smartphones, or any new technology, many aspects evolve over time,” Mr Brahmbhatt responded.

“Our belief is that these glasses will help users be more present and engaged. With AI glasses, users can truly immerse themselves in the moment.”

Mr Brahmbhatt in his pair of Meta Ray-Ban Displays

Despite the potential benefits, having a screen in one’s field of vision can be distracting, as research has shown that our brains struggle to focus on multiple tasks simultaneously.

One notable study in the Applied Cognitive Psychology journal revealed that individuals walking while using their phones failed to notice a clown riding a unicycle in front of them.

This phenomenon is known as inattentional blindness.

“People are constantly distracted by devices,” said Professor Gemma Briggs, a professor of applied cognitive psychology at the Open University, though she notes the risks are far greater in some situations, such as driving.

The display will automatically turn off when it detects the user is driving, but there is nothing to stop wearers turning it straight back on, which Professor Briggs warns could lead to dangerously distracted drivers.

Meta insists the glasses will keep wearers engaged in the moment, but Professor Briggs’ research shows that any form of phone use while driving, hands-free or not, makes a collision around four times more likely, and leaves drivers less likely to notice hazards and slower to react to the ones they do see.
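
Meta has not said how the driving cut-off is implemented. As a minimal sketch of the behaviour the article describes, with an invented speed threshold and invented method names, the logic below shows why the safeguard is only advisory: the override path leaves the final decision with the wearer.

```python
# Hypothetical sketch of the auto-off-while-driving behaviour described
# above; the speed threshold and all names are invented for illustration.

class DisplayController:
    def __init__(self) -> None:
        self.display_on = False
        self.user_override = False  # wearer re-enabled the display mid-drive

    def on_motion_update(self, speed_mps: float) -> None:
        """Turn the display off when vehicle-like speed is detected."""
        driving = speed_mps > 6.0  # ~13 mph; made-up example threshold
        if driving and not self.user_override:
            self.display_on = False

    def user_turns_display_on(self) -> None:
        """Nothing stops the wearer turning the display back on."""
        self.display_on = True
        self.user_override = True

ctrl = DisplayController()
ctrl.user_turns_display_on()
ctrl.on_motion_update(speed_mps=15.0)  # driving, but the override wins
print(ctrl.display_on)                 # True: the safeguard is advisory only
```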

For some, the idea of having a phone strapped to their face is already distracting enough, let alone adding a display in their glasses. Pic: Meta

Despite Meta’s assurances that its glasses promote engagement, some may find the idea of a screen constantly in their vision overwhelming, given the distractions smartphones already pose.
