
The Meta Ray-Ban AI glasses serve as assistive technology: users can send texts and make calls, capture images, and have the built-in camera read text aloud to them. Because the glasses operate hands-free, users can access their phone seamlessly, which helps people with visual or mobility disabilities navigate more easily.
The Meta glasses work under the social model of disability, bridging communication gaps that can arise from having to use a mobile phone directly. Meta has built specific features to help people with disabilities understand the world around them. The glasses' translate feature reads text for the user: if a person with a visual disability cannot read the text in front of them, they can verbally ask Meta's AI to read it, and the speakers built into the glasses recite it aloud. Users can also integrate the glasses with the Be My Eyes app, which connects them to a volunteer who can help them make out their surroundings. The volunteer aids the user directly by accessing the glasses' camera and microphone.

For users with auditory disabilities, the Meta glasses use their private in-lens screen to live-caption conversations happening in front of them. The screen also displays live updates for incoming messages and calls. This visual aid makes communication easier when the other person does not know American Sign Language (ASL). Haptic feedback notifies users of new notifications, so they can rely less on auditory signals that may be hard to recognize. Meta is also developing software to interpret ASL for those who want to communicate with someone who is deaf, creating a two-way street for communication both in the real world and online. The version of the glasses with the screen also comes with a wristband for navigating the interface: the wristband captures the user's hand gestures and translates them into commands. This helps people who have difficulty with mobility or with a phone's keys, since they can not only communicate verbally but also use their natural motions to access their phone.

Viewed through the models of disability, the Meta glasses primarily aid their users in a social sense: the software meets users' needs by giving them more access to communication with the world around them. When examined as a functional solution, however, there is one major barrier: pricing. The most basic model runs $379 USD, while the most advanced version sells for $799. A user may be able to buy a cheaper version of the glasses, but those models lack the private captioning screen and the gesture-capturing wristband, so only users who can afford the higher-end pair benefit fully. Another drawback as a functional solution is battery life: while the device is helpful, some users complain that the battery runs out quickly relative to the amount of time they need to use it.
Meta's glasses help users with disabilities communicate with the world around them. They answer users' direct needs to better interact with text and speech, and they are less focused on correcting the disability itself than on finding solutions that make life more equitable. Whether those solutions are within reach for people of every economic background, however, remains an open question.