How to Use Your Smartphone to Cope with Vision Loss

To make similar changes to Google Assistant, go to Settings > Google > Settings for Google Apps > Search, Assistant & Voice and select Google Assistant. Tap Lock screen to turn on Assistant responses on the lock screen. If you scroll down, you can also adjust the sensitivity, enable Continued Conversation, and choose which notifications you want Google Assistant to give you.

How to identify objects, doors and distances

First launched in 2019, the Lookout app for Android lets you point your camera at an object to find out what it is. This smart app can help you sort mail, find groceries, count money, read food labels, and perform many other tasks. It has different modes for specific scenarios:

Text mode is for signs or mail (short text).

Documents mode can read you an entire handwritten letter or a full page of text.

Images mode uses Google’s latest machine learning models to give you an audio description of an image.

Food Label mode can scan barcodes and identify food.

Currency mode identifies denominations of various currencies.

Explore mode highlights objects and text around you as you move the camera.

These AI-powered features work offline, without a Wi-Fi or data connection, and the app supports multiple languages.
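
If you’re curious how features like this work under the hood, the core building block is on-device optical character recognition. Below is a minimal sketch of that idea. Note that Lookout is an Android app built on Google’s own models; Apple’s Vision framework is used here purely as an illustration, to keep one language across the examples in this article.

```swift
import UIKit
import Vision

// A minimal sketch of on-device text recognition, the general technique behind
// Lookout's text mode. Illustrative only; this is not Lookout's actual code.
func readShortText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }
    let request = VNRecognizeTextRequest { request, _ in
        // Each observation is one detected line of text; keep the best guess.
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        completion(lines)
    }
    request.recognitionLevel = .fast // suits live scanning of short text, like signs
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

In a real accessibility app, the recognized lines would be handed to a speech synthesizer (such as AVSpeechSynthesizer) so they are read aloud rather than displayed.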

Apple has something similar built into its Magnifier app, though it relies on a combination of the camera, on-device machine learning, and LiDAR. Unfortunately, LiDAR is only available on Pro iPhones (iPhone 12 Pro or later), the 12.9-inch iPad Pro (4th generation or later), and the 11-inch iPad Pro (2nd generation or later). If you have one, open the app, tap the gear icon, select Settings, and add Detection Mode to your controls. There are three options:

People Detection warns you when people are nearby and can tell you how far away they are.

Door Detection does the same for doors, and can also add an outline in your preferred color, give you information about the door’s color, material, and shape, and describe decorations, signs, or text (such as opening hours). This video shows a range of Apple accessibility features, including Door Detection, in action.

Image Descriptions can identify many objects around you using on-screen text, speech, or both. If you use VoiceOver, you can also go to Settings > Accessibility > VoiceOver > VoiceOver Recognition > Image Descriptions and turn it on, so that Detection Mode describes what’s in the images you point your iPhone at, such as paintings.

You don’t need Wi-Fi or a data connection to use these features. You can adjust things like the distance at which detection triggers, and whether you want sound, haptic, or speech feedback, via the Detectors section at the bottom of Settings in the Magnifier app.
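
For developers wondering how this kind of detection fits together, here is a rough sketch of the people-distance idea, combining ARKit’s LiDAR scene depth with a Vision people detector. It is an illustrative approximation, not Apple’s actual Magnifier implementation.

```swift
import ARKit
import Vision

// A rough sketch of people detection with LiDAR distance, assuming a device
// with scene-depth support. Illustrative only; not Apple's Magnifier code.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not available on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; a real app would throttle this work.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let request = VNDetectHumanRectanglesRequest { request, _ in
            guard let person = (request.results as? [VNHumanObservation])?.first else { return }
            // Sample the depth map at the center of the person's bounding box.
            let center = CGPoint(x: person.boundingBox.midX, y: person.boundingBox.midY)
            if let meters = Self.depth(at: center, in: depthMap) {
                print(String(format: "Person about %.1f meters away", Double(meters)))
            }
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }

    // Reads one Float32 depth value (in meters) from the LiDAR depth buffer.
    // Vision coordinates are normalized with the origin at the bottom left.
    static func depth(at point: CGPoint, in depthMap: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let x = min(Int(point.x * CGFloat(width)), width - 1)
        let y = min(Int((1 - point.y) * CGFloat(height)), height - 1)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float32.self)
        return row[x]
    }
}
```

A real app would throttle the per-frame detection, smooth the depth readings, and announce the result with speech and haptics rather than printing it.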

How to take better selfies

Guided Frame is a brand-new feature that works with TalkBack but is currently only available on the Google Pixel 7 and 7 Pro. It helps blind and low-vision people take the perfect selfie through a combination of precise audio guidance (move right, left, up, down, forward, or backward), high-contrast visual animations, and haptic feedback (different combinations of vibrations). The feature tells you how many people are in the frame, and when you hit the sweet spot (which the team identified using machine learning), it starts a countdown before taking the shot.
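
Google hasn’t published Guided Frame’s internals, but the basic loop (detect a face, compare its position with the center of the frame, and tell the user which way to move) is easy to sketch. The framingHint function below is hypothetical and uses Apple’s Vision face detector, chosen only for consistency with the other examples in this article.

```swift
import UIKit
import Vision

// A sketch of Guided Frame-style coaching: find a face, compare it to the
// center of the frame, and produce a spoken-style hint. Illustrative only;
// Guided Frame uses Google's own on-device models, not Apple's Vision.
func framingHint(for image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return completion("No image") }
    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let face = (request.results as? [VNFaceObservation])?.first else {
            return completion("No face detected")
        }
        let box = face.boundingBox // normalized, origin at the bottom left
        var hints: [String] = []
        // Directions are relative to the image; a real selfie app would
        // account for the mirrored front-camera preview.
        if box.midX < 0.4 { hints.append("move right") }
        if box.midX > 0.6 { hints.append("move left") }
        if box.midY < 0.4 { hints.append("move up") }
        if box.midY > 0.6 { hints.append("move down") }
        if box.width < 0.25 { hints.append("move closer") }
        completion(hints.isEmpty ? "Hold still" : hints.joined(separator: ", "))
    }
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```

In a real implementation the hint would be spoken aloud and paired with haptic feedback, which is what Guided Frame does through TalkBack.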

The Buddy Controller feature on iPhone (iOS 16 and later) lets two controllers drive a single-player game, as if you were playing co-op. You can potentially help visually impaired friends or family members when they get stuck in a game (be sure to ask first). To enable it, connect two controllers and go to Settings > General > Game Controller > Buddy Controller.
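
Buddy Controller itself is a system feature with no public API that I’m aware of, so there is nothing to code against directly. But if you want to confirm that both controllers are actually paired before flipping the switch, a small sketch using Apple’s GameController framework can list what’s connected:

```swift
import GameController

// Lists connected game controllers and watches for new connections. Purely a
// diagnostic sketch; enabling Buddy Controller happens in Settings, not code.
func watchControllers() {
    for controller in GCController.controllers() {
        print("Connected: \(controller.vendorName ?? "Unknown controller")")
    }
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect,
        object: nil,
        queue: .main
    ) { notification in
        if let controller = notification.object as? GCController {
            print("New controller: \(controller.vendorName ?? "Unknown controller")")
        }
    }
}
```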

While this guide can’t cover every feature that can help with visual impairment, here are a few final tips that you might find helpful.

You can get voice guidance when you’re out and about on your Android phone or iPhone, and it should be turned on by default. If you’re using Google Maps, tap your profile picture in the top-right corner, select Settings > Navigation settings, and choose the Guidance volume you want.

Both Google Maps and Apple Maps offer a feature that overlays your route on your surroundings in real time when you lift your phone. For Apple Maps, go to Settings > Maps > Walking (under Directions) and make sure Raise to View is turned on. For Google Maps, go to Settings > Navigation settings and scroll down to make sure Live View is turned on under Walking options.

If you’re browsing the web on an Android device, you can always ask Google Assistant to read a web page by saying: “OK Google, read it.”

You can find more helpful tips on how technology can help people with vision loss from the Royal National Institute of Blind People (RNIB). For video tutorials on some of the features we’ve discussed, we recommend visiting the Hadley website and trying the workshops (you will need to register).
