
Global Accessibility Awareness Day (GAAD) lands on the third Thursday in May – this year that's May 15. Its mission? To spark conversations, get us all thinking, and help everyone learn more about digital access and inclusion. It's not just a trend – it's about equality and making sure everyone can reach the same opportunities in the digital world. And, of course, Apple, as one of the biggest names in technology, is not about to sit on the sidelines. The company has announced a series of accessibility features that will roll out later this year, likely at WWDC 2025 in June.
We’ll take a closer look at some of the new features coming to iOS 19, macOS 16, and other updates.
What’s New in iOS 19? A Sneak Peek at the Upcoming Accessibility Features
Let’s focus first on the new iOS 19 updates for iPhone and iPad. There are a lot of accessibility changes, and each one affects how you use the device differently.
1. A New Way to Control Your Device with Head Tracking
iOS 19 introduces Head Tracking, a feature that lets users control an iPhone or iPad using head movements. Similar to last year’s eye tracking, this new feature recognizes facial gestures when users raise their eyebrows, open their mouths, or even smile to perform specific actions. For example, a smile could open the Control Center, while opening your mouth could take you back to the home screen.

We are really impressed by the announcement of this feature. You can only imagine how much easier it will make using the iPhone for people with mobility impairments, especially those who struggle to interact with touchscreens. Set an iPhone or iPad on a table or tripod and run the whole thing – no taps needed.
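Apple hasn't said how Head Tracking is built under the hood, but facial-gesture detection of this kind is already possible with ARKit's face tracking, which reports "blend shape" values for expressions like a smile or an open jaw. Here's a minimal sketch of the general idea – our own illustration, not Apple's code – that maps two gestures to placeholder actions:

```swift
import ARKit

// Minimal sketch: detect "smile" and "open mouth" gestures via ARKit face
// tracking blend shapes. Illustrative only -- not Apple's Head Tracking code.
final class GestureDetector: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Blend shape values range from 0 (neutral) to 1 (fully expressed).
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0

        if smile > 0.7 {
            print("Smile detected -> e.g. open Control Center")
        } else if jawOpen > 0.7 {
            print("Mouth open detected -> e.g. go to the Home Screen")
        }
    }
}
```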
2. Know When Your Name is Called
Name Recognition builds on the existing Sound Recognition functionality. The iPhone listens for the user's name and sends a notification when someone calls it. It's a huge help for anyone who's deaf or hard of hearing – no more wondering if a friend across the room is trying to get your attention. Sure, it won't work miracles in a noisy subway car, but in quieter settings – game night, dinner with friends, hanging out at home – it helps. Users stay in the moment and never miss being called by name or a beat of the conversation.

3. Share Accessibility Settings with Others
iOS 19 will make it dead-simple to share your custom accessibility setup with someone else – think AirDrop, but for your tailored settings. Just hit Share, pick a nearby iPhone or iPad, and a prompt pops up on their screen. They tap Accept (or Decline) and boom – their device instantly matches your tweaks. Ideal for quickly syncing preferences with a caregiver, family member, or friend, so everyone gets the same experience without digging through menus.

4. Accessibility Reader
iOS 19’s Accessibility Reader is a system-wide reading mode that makes text easier to digest for folks with dyslexia, low vision, or other reading difficulties. It can be accessed from any app and tweaked to suit your eyes – font size, line spacing, background color, and even contrast. It also hooks into the Magnifier app (more on that in the macOS section below), so users can point their camera at printed text – think menus or books – and have it displayed in their chosen accessible format. It's perfect for anyone who needs a little extra help making sense of written content.

5. Full Braille Access
iOS 19 introduces Braille Access, which turns the iPhone into a full-blown Braille notetaker. With Braille Screen Input or a connected Braille display, users can navigate the device and jot down notes in Braille. A built-in app launcher lets users open any app by typing its name in Braille, and Nemeth Braille support makes math and science calculations possible right on the device. Available across iPhone, iPad, and other Apple devices, this feature empowers users to tackle everyday tasks independently. We see this upcoming update as just as important as the new Head Tracking feature, as it totally transforms how people can use their devices.

6. Transparency in the App Store
With iOS 19, Apple rolls out Accessibility Nutrition Labels in the App Store. These labels highlight the accessibility features each app supports, like VoiceOver, captions, reduced motion, and more. Users will be able to see this info right in the app’s description, making it easier to choose apps that fit their needs. Now, users with disabilities won’t have to guess if an app will work for them – they can instantly see which accessibility features are included.

7. Control Devices with Your Mind (BCI)
Switch Control for Brain-Computer Interfaces (BCI) is one of the most exciting and innovative features coming to iOS 19. This tool will enable users with severe mobility impairments to control their devices using brain signals. In collaboration with Synchron, iOS 19 will integrate BCI with Switch Control, allowing users to interact with their devices through an implanted device that translates brain waves into actions.
Synchron announced the integration on X:
“Thoughts are now an input device. Today, @Apple announced its new BCI Human Interface Device (HID) protocol – and Synchron is proud to be the first brain-computer interface company to achieve native integration with iPhone, iPad, and Apple Vision Pro.”
— Synchron (@synchroninc), May 13, 2025
While we don’t know if this feature will be available in 2025, Apple’s public announcement is already a huge step forward in the evolution of assistive technology. The concept of controlling a device with just brain signals may sound like science fiction, but the reality of it coming to iOS is incredibly exciting.
8. New Experience with Background Sounds
In iOS 19, Background Sounds will receive an upgrade with more customization options. Users can tweak the EQ settings to suit their needs, whether it’s winding down, zeroing in on work, or drifting off to sleep. Apple also added a timer so the sounds automatically fade out after a set period – perfect for those who like to nod off to white noise without it running all night. Plus, you can now build automations in the Shortcuts app to launch background sounds at specific times or in response to certain actions. These upgrades give anyone, especially folks with sensory sensitivities, a simple way to craft a calm, personalized soundscape whenever they need it.
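Apple hasn't described how the new fade-out timer works internally, but the basic idea – loop an ambient sound, then fade it to silence after a set period – is easy to picture. Here's a rough sketch using AVFoundation; the sound file name and durations are placeholders of our own:

```swift
import AVFoundation

// Rough sketch of "background sound with a fade-out timer".
// "rain.m4a" and the durations are placeholders, not Apple's assets or values.
final class BackgroundSoundPlayer {
    private var player: AVAudioPlayer?

    func play(minutes: Double, fadeOutSeconds: TimeInterval = 30) throws {
        let url = Bundle.main.url(forResource: "rain", withExtension: "m4a")!
        player = try AVAudioPlayer(contentsOf: url)
        player?.numberOfLoops = -1      // loop until stopped
        player?.play()

        // After the chosen period, fade the volume to zero instead of cutting off.
        DispatchQueue.main.asyncAfter(deadline: .now() + minutes * 60) { [weak self] in
            self?.player?.setVolume(0, fadeDuration: fadeOutSeconds)
            DispatchQueue.main.asyncAfter(deadline: .now() + fadeOutSeconds) {
                self?.player?.stop()
            }
        }
    }
}
```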

9. Personal Voice Improvements
First introduced in iOS 17, Personal Voice will be upgraded in iOS 19. This feature lets users create a custom version of their voice for text-to-speech, and the setup is now faster and sounds more natural. Instead of long recording sessions, users only need to speak 10 short phrases – done in about a minute. The system then uses on-device machine learning to build a voice that's smoother and more lifelike than before.
This update makes a real difference for users dealing with speech loss from conditions like ALS or other neurological disorders. The result is a more personal, authentic-sounding voice that helps users communicate in a way that still feels like them, not a robot.
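For developers, the neat part is that third-party apps can already speak with a user's Personal Voice through Apple's public speech APIs, provided the user has created one and grants permission. A minimal sketch of that flow:

```swift
import AVFoundation

// Minimal sketch: speak text with the user's Personal Voice, if they have
// created one and granted this app access. Uses Apple's public speech APIs.
final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }

            // Find the first installed voice flagged as a Personal Voice.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice   // falls back to the default voice if nil
            self?.synthesizer.speak(utterance)
        }
    }
}
```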
10. Feature to Reduce Motion Sickness
Now, let’s talk about something a bit less serious but equally exciting – Vehicle Motion Cues. This feature, initially introduced in iOS 18, is designed to help users who get motion sickness when using their iPhone or iPad in a moving car. In iOS 19, Apple will take Vehicle Motion Cues even further by letting users customize the animated dots that appear whenever the device senses movement. It might sound like a small tweak, but for anyone who’s ever felt queasy in the seat, it’s a real lifesaver – no more pulling over every few miles or rifling through motion-sickness remedies just to scroll through a text.
11. New Shortcut Item
The Hold That Thought shortcut is designed to help users stay on track with their tasks. It captures your current activity or thought and lets you recall it later when you're ready to refocus. It's especially handy for people who get distracted easily, such as those with ADHD, because it records a thought or an ongoing task with minimal effort.
When activated, the shortcut takes a screenshot of your current screen and adds a note describing what you were doing. It also saves whatever is on the clipboard at that moment. Later, when you're ready to continue, you simply open the saved entry, review what you were doing, and pick up where you left off.
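The shortcut itself is assembled in the Shortcuts app rather than written in code, but the capture step is simple to picture. Here's a hypothetical Swift sketch of the same idea – the CapturedThought type and its storage are our own invention, not how Apple's shortcut works:

```swift
import UIKit

// Hypothetical sketch of the "capture now, resume later" idea behind the
// Hold That Thought shortcut. The type and storage here are our own invention.
struct CapturedThought: Codable {
    let note: String          // what you were doing
    let clipboard: String?    // clipboard contents at capture time
    let capturedAt: Date
}

enum ThoughtStore {
    private static let key = "held.thoughts"

    static func capture(note: String) {
        let thought = CapturedThought(note: note,
                                      clipboard: UIPasteboard.general.string,
                                      capturedAt: Date())
        var saved = load()
        saved.append(thought)
        if let data = try? JSONEncoder().encode(saved) {
            UserDefaults.standard.set(data, forKey: key)
        }
    }

    static func load() -> [CapturedThought] {
        guard let data = UserDefaults.standard.data(forKey: key),
              let saved = try? JSONDecoder().decode([CapturedThought].self, from: data)
        else { return [] }
        return saved
    }
}
```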
Accessible Features for Other Apple Devices
Now that we’ve explored the new accessibility features coming to iPhone and iPad, we can’t help but mention a few other innovations Apple has rolled out for its other devices, including Mac, Apple Watch, and Vision Pro.
Accessibility Updates for Mac
One of the standout accessibility updates in macOS 16 is the new Magnifier app, designed to help users with low vision interact with the physical world more easily. Using the Mac’s built-in camera or a connected iPhone, the app zooms in on objects, text, or anything in the environment. What makes this Magnifier unique is its real-time zooming and text recognition features. For example, you can zoom in on something like a printed page or a document on a whiteboard and have the text automatically read aloud.

This description only scratches the surface of what the feature can do, so we recommend checking out the video Apple published on their website. It demonstrates how the Magnifier can transform learning, work, and daily tasks on the Mac.
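Apple hasn't shared how the Magnifier's read-aloud is implemented, but the general pipeline – recognize text in a camera frame, then speak it – can be sketched with the public Vision and AVFoundation frameworks. This is our own illustration, not Apple's code:

```swift
import Vision
import AVFoundation

// Sketch of an OCR-then-speak pipeline: recognize text in an image and read it
// aloud. Illustrative only -- not Apple's Magnifier implementation.
let synthesizer = AVSpeechSynthesizer()

func readAloud(imageAt url: URL) throws {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the best candidate string from each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        let utterance = AVSpeechUtterance(string: lines.joined(separator: " "))
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])
}
```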
Along with the Magnifier, macOS 16 also introduces Vehicle Motion Cues, which we covered in detail earlier in the iOS section. It works the same way on Mac, so we'll skip the repetition and move on to the other devices.
Live Listen with Captions for Apple Watch
One of the biggest watchOS 12 updates is Live Listen with real-time captions. It's a big deal for anyone who's deaf or hard of hearing – the Apple Watch basically becomes a mini communication hub. Live Listen lets the watch act as a remote microphone, picking up nearby sound and streaming it to AirPods or compatible hearing aids, while captions pop up on the wrist in real time so users never miss a word.
It pairs naturally with Name Recognition, which we covered in the iOS section: with the Apple Watch, the user can follow the whole conversation, not just be notified when their name is called.
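Real-time captioning like this generally rests on streaming speech recognition. As a rough illustration of the technique (not Apple's Live Listen implementation), here's how live transcription looks with the public Speech framework, assuming the user has already granted microphone and speech-recognition permission:

```swift
import Speech
import AVFoundation

// Rough sketch of live captions: stream microphone audio into the Speech
// framework and print partial transcriptions as they arrive.
// Assumes SFSpeechRecognizer.requestAuthorization and mic access were granted.
final class LiveCaptioner {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        request.shouldReportPartialResults = true

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Print each updated transcription as a running caption.
        task = recognizer?.recognitionTask(with: request) { result, error in
            if let result {
                print("Caption: \(result.bestTranscription.formattedString)")
            }
            if error != nil { self.audioEngine.stop() }
        }
    }
}
```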
Spatial Computing Accessibility through Vision Pro
The Vision Pro marks Apple’s entry into the world of spatial computing, bringing some significant accessibility updates in visionOS 3. Apple aims to offer low-vision users an entirely new way to interact with the world. Thanks to its advanced camera system, the Vision Pro will feature an upgraded Zoom function that allows users to magnify anything in their environment just by looking at it. This zoom feature isn’t limited to apps or digital content – it works in the real world and enables users to zoom in on objects, text, or faces around them while wearing the headset.
In addition, the Vision Pro will include Live Recognition, a feature that leverages the device’s sensors and AI to recognize and describe the user’s surroundings in real-time. Vision Pro reads aloud recognized objects (doors, furniture, signs), so users can easily find their way around unfamiliar spaces. For people with severe vision loss, this level of interaction with the physical world is nothing short of revolutionary.
Final Thoughts
Those are all the announcements we have from Apple for now; the full reveal of each accessibility feature will likely come at WWDC 2025 in June. That said, since some of these features feel ahead of their time, not everything in this announcement may ship right away. The feedback from users so far has been overwhelmingly positive, which shows that Apple is heading in the right direction.
Before we end, we want to remind you about Global Accessibility Awareness Day. It’s an important cause, so talk about it and share a post on X (formerly Twitter), Instagram, or whatever social media platform you use. And remember, don’t just talk about accessibility once a year – make it part of the conversation daily.