Google Maps has been undergoing significant enhancements lately, particularly with the integration of AI-powered features aimed at improving navigation experiences. The most recent upgrade, unveiled at the Mobile World Congress (MWC) 2024, focuses on enhancing screen reader support for Lens within Google Maps.
Screen reader support for Lens was initially introduced last year, allowing users to utilise their smartphone cameras to scan their surroundings, with Google providing audible feedback on what is detected. This feature has been a game-changer for individuals who are blind or have low vision, as well as for those navigating unfamiliar territories, especially when language barriers are present.
By pointing the camera at a specific location or business, Google Maps can now provide a wealth of relevant information audibly. This includes details such as business hours, average review ratings, and even directions to the location.
To access this feature, users simply need to tap the camera icon in Google Maps’ search bar and lift their phones to scan the surroundings. It’s essential that TalkBack, Android’s built-in screen reader, is enabled; it can be switched on in the Accessibility menu within Android’s settings.
In addition to the Google Maps update, Android Auto’s AI summaries feature is also expanding to more Android phones. Originally launched alongside the Samsung Galaxy S24, the feature aims to minimise distractions by summarising texts or group chats while driving. It can also suggest relevant replies and actions, such as navigating to a shared location or sharing one’s estimated time of arrival with contacts.
While some may argue for minimising distractions altogether, these features cater to individuals who need to stay connected while on the move. Both updates are currently rolling out to Android phones and should be available to users soon, if not already.
To read more, check out MSN’s article on Google Maps’ newest accessibility upgrade.