The latest version of iOS adds a few smart features intended for people with hearing and vision impairments, some of which may be helpful to just about anybody.
The most compelling new feature is perhaps Sound Recognition, which creates a notification whenever the phone detects one of a long list of common noises that users might want to be aware of. Sirens, dog barks, smoke alarms, car horns, doorbells, running water, appliance beeps — the list is pretty extensive. A company called Furenexo made a device that did this years ago, but it’s nice to have it built in.
Users can have notifications go to their Apple Watch as well, in case they don’t want to keep checking their phone to see whether the oven has come up to temperature. Apple is working on adding more human and animal sounds, so the system has room to grow.
The utility of this feature for hearing-impaired folks is obvious, but it’s also nice for anyone who gets lost in their music or podcast and forgets they’ve let the dog out or that they’re expecting a package.
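Apple hasn’t published how Sound Recognition is implemented, but the public SoundAnalysis framework (available since iOS 13) lets a third-party app do something broadly similar with its own Core ML sound classifier. A rough sketch, where SirenClassifier stands in for a hypothetical sound-classification model:

import AVFoundation
import SoundAnalysis

// Receives classification results from the analyzer.
class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)")  // e.g. post a local notification here
    }
}

let engine = AVAudioEngine()
let format = engine.inputNode.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()

// SirenClassifier is a placeholder for any Core ML sound-classification model.
let request = try SNClassifySoundRequest(mlModel: SirenClassifier().model)
try analyzer.add(request, withObserver: observer)

// Feed microphone buffers into the analyzer as they arrive.
engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
    analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
}
try engine.start()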
Also new in the audio department is what Apple is calling a “personal audiogram,” which amounts to a custom EQ setting based on how well you hear different frequencies. It’s not a medical tool (this isn’t for diagnosing hearing loss or anything), but a handful of audio tests can tell whether certain frequencies need to be boosted or dampened. Unfortunately, for whatever reason, the feature only works with Apple-branded headphones.
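Apple hasn’t detailed how the audiogram is applied under the hood, but the effect it describes (boosting or cutting particular frequency bands) is exactly what a parametric EQ does, and iOS exposes one publicly as AVAudioUnitEQ. A minimal sketch, with the per-band gains standing in for hypothetical test results:

import AVFoundation

// Hypothetical per-band gains (dB) derived from a listening test.
let bandGains: [(frequency: Float, gain: Float)] = [
    (250, 0), (1_000, 2), (4_000, 6), (8_000, 9)  // boost highs this user hears poorly
]

let eq = AVAudioUnitEQ(numberOfBands: bandGains.count)
for (band, setting) in zip(eq.bands, bandGains) {
    band.filterType = .parametric
    band.frequency = setting.frequency
    band.bandwidth = 1.0        // width in octaves
    band.gain = setting.gain
    band.bypass = false
}

// Wire the EQ between a player and the output.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.attach(eq)
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)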
Real Time Text is an accessibility standard that essentially sends text chat over voice-call protocols, letting nonverbal people hold seamless conversations and reach emergency services. iPhones have supported it for some time, but now users don’t need to be in the calling app for it to work: take a call while playing a game or watching a video, and the conversation will appear in notifications.
One last feature intended for the hearing impaired is an under-the-hood change to group FaceTime calls. Normally the video automatically switches to whoever is speaking, but sign language is silent, so the view never shifts to a signing participant. That changes in iOS 14, where the phone recognizes the motions as sign language (though not any specific signs) and duly switches the view to that person.
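Apple hasn’t said how that detection works. As a point of reference, though, iOS 14’s public Vision framework does ship a hand-pose detector, and an “is someone signing?” heuristic could plausibly start from something like it. A crude sketch; the heuristic is my guess, not Apple’s method:

import Vision

// Crude heuristic: treat a frame with detected hands as possible signing.
// VNDetectHumanHandPoseRequest is real iOS 14 API; the heuristic is invented.
func framePossiblyContainsSigning(pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
    guard let hands = request.results, !hands.isEmpty else { return false }
    // A real detector would track motion across many frames, not inspect one.
    return true
}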
VoiceOver makeover
Apple’s accessibility features for those with low or no vision are solid, but there’s always room to grow. VoiceOver, the smart screen-reading feature that’s been around for more than a decade now, has been enhanced with a machine learning model that can recognize more interface items even when they haven’t been properly labeled, including in third-party apps and content. This is making its way to the desktop as well, but not quite yet.
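That model exists to cover apps that never labeled their controls in the first place; developers can spare it the guesswork by labeling elements directly with UIKit’s long-standing accessibility API. For example:

import UIKit

let playButton = UIButton(type: .system)
playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

// Without a label, VoiceOver (or the new recognition model) has to guess
// what this icon-only button does. With one, it simply reads it out.
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Plays the current episode"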
iOS’s descriptive chops have also been upgraded: by analyzing a photo’s contents, it can now describe them in a richer way. For instance, instead of saying “two people sitting,” it might say “two people sitting at a bar having a drink,” or instead of “dog in a field,” “a golden retriever playing in a field on a sunny day.” Well, I’m not 100 percent sure it can get the breed right, but you get the idea.
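Those richer captions come from Apple’s own private models; for comparison, the public Vision framework already hands back coarse tags of the “dog” and “field” variety via VNClassifyImageRequest. A quick sketch:

import Vision

// Coarse public-API tagging; Apple's VoiceOver captions go well beyond this.
func labels(for image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    let observations = (request.results as? [VNClassificationObservation]) ?? []
    return observations.filter { $0.confidence > 0.5 }.map { $0.identifier }
}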
The Magnifier and Rotor controls have been beefed up as well, and large chunks of Braille text will now auto-pan.
Developers with vision impairments will be happy to hear that Swift and Xcode have received lots of new VoiceOver options, and that Apple has made sure common tasks like code completion and navigation are accessible.
Back tappin’
The “back tap” is a feature new to Apple devices but familiar to Android users, who have seen things like it on Pixel phones and other devices. It enables users to tap the back of the phone two or three times to activate a shortcut — super handy for invoking the screen reader while your other hand is holding the dog’s leash or a cup of tea.
As you can imagine, the feature is useful to just about anyone, since you can customize it to perform all sorts of shortcuts or tasks. Unfortunately the feature is for now limited to phones with Face ID, which leaves iPhone 8 and SE users, among others, out in the cold. It’s hard to imagine any secret tap-detection hardware is involved; it’s almost certain the feature uses the accelerometers that have been in iPhones since the very beginning.
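As an illustration of that point, any app can already watch for sharp accelerometer spikes with the long-standing CoreMotion framework, which is roughly the raw signal a back-tap detector would start from; Apple presumably layers smarter filtering on top. A crude sketch, with the threshold and debounce window as guesses:

import CoreMotion

let motion = CMMotionManager()
var lastSpike = Date.distantPast

// Poll the accelerometer and treat a brief, sharp jolt as a candidate tap.
// The 2.5g threshold and 0.5s debounce are invented values, not Apple's.
motion.accelerometerUpdateInterval = 1.0 / 100.0
motion.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
    if magnitude > 2.5, Date().timeIntervalSince(lastSpike) > 0.5 {
        lastSpike = Date()
        print("Possible tap")  // a real detector would distinguish double vs. triple taps
    }
}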
Apple is no stranger to holding certain features hostage for no particular reason, such as the notification expansions that aren’t possible on a brand-new phone like the SE. But doing so with a feature intended for accessibility is unusual. The company did not rule out the possibility that back tap would make its way to button-bearing devices, but would not commit to the idea either. Hopefully this useful feature will be more widely available soon, but only time will tell.
Source: TechCrunch https://ift.tt/3fXLsw9