Microsoft’s annual Ability Summit recently showcased the tech giant’s commitment to advancing accessibility through AI. Promising new accessibility features for Windows Copilot, Azure AI, and other Microsoft applications, the event aimed to help bridge the disability divide.
Highlights included new Copilot features supporting live captions and Narrator options, set to roll out by late March 2024. Notably, Copilot’s natural language processing capabilities will make navigation easier for users with disability, simplifying tasks like interpreting colour-coded charts.
Additionally, Microsoft unveiled upgrades such as the Accessibility Assistant, which helps content creators produce accessible material. Azure AI took centre stage, powering applications like ‘Seeing AI’, designed to assist the blind community with everyday tasks and to provide improved audio descriptions through enhanced machine vision.
Crucially, Microsoft pledged to democratise AI development, enabling developers of all skill levels, including those with disability, to participate. This inclusivity fosters the creation of AI-driven accessibility solutions by individuals with lived experience, promising to assist even more people in the future.
Moreover, Microsoft’s collaboration with Answer ALS and the ALS Therapy Development Institute shows its dedication to applying AI to impactful causes such as ALS research. By making vast amounts of clinical and genomic data accessible via Azure, Microsoft aims to accelerate progress toward a cure for ALS, exemplifying technology’s potential to drive positive change.
To find out more, visit the Microsoft Ability Summit homepage for all the updates.