Microsoft and Be My Eyes collaborate on a first-of-its-kind program to make AI models more inclusive by closing the “accuracy gap” for over 340 million people in the blind and low-vision community

Be My Eyes will furnish Microsoft with unique video datasets to train AI models on disability effectively.


What you need to know

Be My Eyes announced it’s working with Microsoft “to make AI models more inclusive for the over 340 million people around the world who are blind or have low vision.” For context, Be My Eyes is a mobile app that connects users with vision impairments to sighted assistants for visual interpretations.

According to the company, integrating AI into the app will make it more effective and efficient:

“Publicly available datasets used to train AI models often lack accessibility context and can fail to reflect the lived experience of people who are blind or have low vision. This disability data desert risks an AI-prevalent future of inherent bias and inaccessibility that repeats the mistakes made during the evolution of the internet, but with the potential impact being even more significant.”

The company further indicated that disability, specifically in the blind and low-vision community, is seemingly left out of the development of AI models. Consequently, AI models struggle to handle disability-related tasks.

The AI revolution will be “more inclusive”

As you may know, training data is crucial to the effectiveness and efficiency of an AI model. However, “disability is often underrepresented or incorrectly categorized in datasets used to train AI.” This limits the utility of AI models and can introduce bias.

Now, Be My Eyes says it will provide Microsoft with unique video datasets to support effective training of AI models on disability. The videos capture the day-to-day challenges visually impaired people face, and they could help bridge the “accuracy gap” that limits the usefulness of AI models for the blind and low-vision community.

Privacy and security continue to raise concerns as generative AI becomes more prevalent. Be My Eyes' first-of-its-kind collaboration with Microsoft promises to maintain privacy by scrubbing all personal data from the video metadata, including all user, account, and personally identifiable information (PII), before sharing the content with the tech giant. Likewise, users can opt out of Be My Eyes' new campaign, preventing their personal information and data from being shared.
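For illustration only, here is a minimal sketch of what that kind of metadata scrubbing could look like. The record layout and field names (such as "user_id" and "account_email") are assumptions for the example, not Be My Eyes' actual schema or pipeline.

```python
# Illustrative only: a hypothetical metadata-scrubbing step of the kind
# described above. Field names and record structure are assumptions.
from copy import deepcopy

# Hypothetical set of metadata keys treated as personally identifiable.
PII_FIELDS = {"user_id", "account_email", "phone", "gps", "device_serial"}

def scrub_metadata(record: dict) -> dict:
    """Return a copy of a video-metadata record with PII fields removed."""
    cleaned = deepcopy(record)
    for field in PII_FIELDS:
        cleaned.pop(field, None)  # drop the field if present
    return cleaned

sample = {
    "video_id": "clip_0001",
    "duration_s": 42,
    "user_id": "u-98231",          # removed before sharing
    "account_email": "a@b.com",    # removed before sharing
    "scene_tags": ["kitchen", "appliance"],
}
print(scrub_metadata(sample))
# {'video_id': 'clip_0001', 'duration_s': 42, 'scene_tags': ['kitchen', 'appliance']}
```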

According to Be My Eyes:

“The video datasets represent the lived experience of the blind and low vision community and will be used to improve the accuracy and precision of scene understanding and descriptions, with the goal of increasing the utility of AI for the blind and low-vision community.”

Towards the end of 2023, Microsoft announced its partnership with Be My Eyes to make its Disability Answer Desk customer service department more accessible to visually impaired users. The app features an AI-powered tool dubbed Be My AI, powered by OpenAI’s GPT-4 model, which provides users with a vivid description of an image.
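To give a sense of the underlying capability, the sketch below shows how a GPT-4-class vision model can be prompted to describe an image for a blind user via OpenAI's chat completions API. This is not Be My AI's actual implementation; the model name, prompt, and image URL are assumptions for the example.

```python
# Illustrative sketch: asking a GPT-4-class vision model for an image
# description. Not the Be My Eyes implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # assumed GPT-4-class vision model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe this photo in vivid detail for a blind user."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/photo.jpg"}},  # placeholder image
            ],
        }
    ],
)
print(response.choices[0].message.content)
```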

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You’ll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.