For Global Accessibility Awareness Day, Meta is highlighting its commitment to accessibility across its products. Ray-Ban Meta glasses, with built-in AI, provide a hands-free experience that benefits the blind and low-vision community: users can capture photos, send messages, make calls, and receive real-time assistance from Meta AI. A new feature offers customized, detailed responses based on the user's environment; it is rolling out soon in the U.S. and Canada, with plans for global expansion. Additionally, the "Call a Volunteer" feature, developed with Be My Eyes, will connect blind and low-vision users with sighted volunteers for assistance.
Meta is also exploring wristband devices that use surface electromyography (sEMG) to improve human-computer interaction for people with physical disabilities. These wristbands read muscle signals at the wrist and translate them into computer input, aiding people with limited mobility. Research collaborations, including one with Carnegie Mellon University, focus on enabling users with hand paralysis to use these controls effectively.
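To make the idea of turning muscle signals into input concrete, here is a minimal, illustrative sketch (not Meta's implementation, and the signal values and threshold are invented): a gesture can be detected from an sEMG stream by computing the short-window root-mean-square (RMS) amplitude and flagging a rising edge where it crosses a threshold.

```python
import math

def rms(window):
    """Root-mean-square amplitude of a window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_activations(samples, window_size=8, threshold=0.5):
    """Return sample indices where the windowed RMS first crosses the
    threshold (rising edges only) -- candidate gesture events that a
    system could map to, e.g., a click. Parameters are illustrative."""
    events = []
    active = False
    for i in range(len(samples) - window_size + 1):
        level = rms(samples[i:i + window_size])
        if level >= threshold and not active:
            events.append(i)  # rising edge: new muscle activation
            active = True
        elif level < threshold:
            active = False
    return events

# Synthetic signal: rest noise, a burst of muscle activity, rest again.
signal = [0.05] * 20 + [0.9] * 16 + [0.05] * 20
print(detect_activations(signal))  # prints [15]
```

Real sEMG decoding is far more involved (filtering, multi-channel features, learned classifiers), but the same pipeline shape applies: sample, extract features, map to discrete input events.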
To enhance communication in the metaverse, Meta is adding live captions and live speech features to its extended reality products. Live captions convert spoken words into text, while live speech transforms text into synthetic audio, serving users with different communication preferences. The Llama AI models are also being used to build tools such as a WhatsApp chatbot that translates American Sign Language (ASL), facilitating communication between Deaf and hearing individuals.
Meta remains dedicated to developing features that foster connections and address the diverse needs of its global user base.