

Apple announced a range of accessibility features designed for people with limited mobility, vision, hearing, and cognitive impairments. These features will be available through software updates later this year. One of the most interesting is AssistiveTouch on Apple Watch, which lets people with limb differences navigate the user interface without touching the display. iPhone and iPad users will also get new accessibility options. Additionally, Apple has announced a new sign language interpreting service called SignTime, which will be available for communicating with AppleCare and retail customer care.

With AssistiveTouch on watchOS, Apple Watch users can control a cursor on the display through a range of hand gestures, such as a pinch or a clench. According to Apple, the Apple Watch will use built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, to detect subtle differences in muscle movement and tendon activity.

With the new support for gesture control via AssistiveTouch, people with limb differences can more easily answer incoming calls, control an on-screen motion pointer, and access Notification Center and Control Center – all on an Apple Watch – without touching the screen or moving the Digital Crown. However, the company has not disclosed which Apple Watch models will be compatible with the new features.

In addition to the gesture controls on the Apple Watch, iPadOS is getting support for third-party eye-tracking devices so that users can control an iPad with their eyes. According to Apple, compatible MFi (Made for iPad) devices will track where a person is looking on the screen and move the pointer to follow their gaze. This lets users perform various actions on the iPad, including typing, without touching the screen.

Apple is also updating its pre-installed screen reader, VoiceOver, with the ability to explore more details in images, including text, tabular data, and other objects. With Markup, users can also add their own descriptions to images for a personalized feel.


Apple is updating VoiceOver with the ability to learn more about pictures
Photo Credit: Apple

For neurodiverse people or anyone distracted by everyday noises, Apple is adding background sounds – balanced, bright, and dark noise, as well as ocean, rain, and stream sounds – that play continuously in the background to mask unwanted ambient or external noise. These will “help users focus, stay calm, or rest,” Apple said.

Apple is also letting non-speaking users with limited mobility replace physical buttons and switches with mouth sounds such as a click, pop, or “ee” sound. Users can also customize display and text size settings for each app. In addition, new Memoji customizations will represent users with oxygen tubing, cochlear implants, and a soft helmet for headgear.


Apple’s Memoji customizations get cochlear implants, oxygen tubing, and a soft helmet for headgear
Photo Credit: Apple

In addition to the primary software changes, Apple is expanding its MFi (Made for iPhone) hearing devices programme to include support for new bidirectional hearing aids. The next-generation models from MFi partners will be available later this year, the company said.

Apple is also adding support for audiograms – charts that show the results of a hearing test – to Headphone Accommodations. Users can import their hearing test results into Headphone Accommodations to amplify quiet sounds and adjust certain frequencies to suit their hearing.

Apple has not given a specific timeline for when users can expect the new features to reach their devices. However, more details could be revealed at the Apple Worldwide Developers Conference (WWDC) next month.

Apple will also roll out its SignTime service for communicating with AppleCare and retail customer support using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, and French Sign Language (LSF) in France, directly through a Web browser. It will also be available at physical Apple Store locations, letting customers remotely access a sign language interpreter without booking in advance.


Apple introduces the SignTime sign language interpreting service for easy communication with service personnel
Photo Credit: Apple

The SignTime service will initially be available in the US, the UK, and France starting May 20. Apple plans to expand it to additional countries in the future, with details to be announced at a later date.


We’re covering everything from Apple this week – iPad Pro, iMac, Apple TV 4K, and AirTag – on Orbital, the Gadgets 360 podcast. Orbital is available on Apple Podcasts, Google Podcasts, Spotify, and wherever you get your podcasts.
