June 5, 2013

Magical Moments


By Luis Pérez

We can all remember our “magical” moments with technology: our first email, our first text message, the first time we used our voice to interact with a device. For me, one of those magical moments occurred the first time I heard Alex speak. Alex, it turns out, is not a person, but the default voice of the VoiceOver screen reader that ships with every Mac.

A screen reader is software that allows someone who is unable to see the display to receive the information in another format such as synthesized speech or Braille. While I did not yet need a screen reader when I first heard Alex, I knew the day would come when I would have to rely on this technology.

Just three years earlier, I had been diagnosed with a condition known as retinitis pigmentosa, or RP for short. RP causes progressive vision loss, beginning with night blindness and continuing through the loss of peripheral and, finally, central vision. In my case, the progression has been slow but steady, and today I have less than ten degrees of central vision left. While scientists are hard at work on a cure for RP, none exists at the time of writing. Without one, there is a good chance that I will lose my remaining vision over time.

My encounter with Alex could not have come at a better time. The three years since I had been given the news of my impending vision loss had been extremely difficult. The first thing to go was my ability to drive, and with it, some of my independence. In fact, I had found out that I had RP following a series of car accidents. Losing the ability to drive also meant that I had to rethink other areas of my life: How was I going to get to work? How was I going to provide for my family? What kind of future would I have as a person with a visual disability? Following my diagnosis, I struggled to come to terms with my disability, moving between denial and anger that this was really happening to me. My denial showed itself in my reluctance to use any kind of aid that reminded me of my vision loss, whether it was my white cane (which I often conveniently “forgot” at home) or the assistive technology that had been recommended by my rehabilitation specialists.

In addition to dealing with my disability on an emotional level, some of my resistance to assistive technology stemmed from frustration with its poor quality. The first time I heard a screen reader, I remember thinking to myself, “surely we can do better than this.” Well, as it turns out, a group at Apple was thinking the same thing, and their efforts led to the creation of Alex, the voice introduced in Mac OS X 10.4, also known as Tiger. Alex is more than just a voice, though. It is an intelligent system that can tell apart words that are spelled the same but mean different things depending on context. For example, it can discern the different uses of the word “wind” in “she felt the strong wind on her face” and “he stopped to wind his clock.” Similar screen reader technology, collectively known as VoiceOver, has been incorporated throughout Apple’s product lineup, including its iOS devices, the Apple TV, and even the iPod shuffle, a device that has no screen at all (VoiceOver on the shuffle lets runners get information such as battery status without having to stop and look at the device).
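If you have a Mac handy, you can hear this for yourself. The following is a minimal sketch, in modern Swift (which postdates this article), of asking Alex to read both sentences aloud through AppKit’s NSSpeechSynthesizer; the voice identifier is Alex’s standard one, and the variable names are purely illustrative.

```swift
import AppKit

// A quick way to hear Alex's context-sensitive pronunciation:
// the same spelling, "wind", comes out differently in each sentence.
let voice = NSSpeechSynthesizer.VoiceName(rawValue: "com.apple.speech.synthesis.voice.Alex")
if let alex = NSSpeechSynthesizer(voice: voice) {
    _ = alex.startSpeaking("She felt the strong wind on her face. He stopped to wind his clock.")
    // startSpeaking(_:) returns immediately, so keep the script alive long enough to listen.
    RunLoop.current.run(until: Date(timeIntervalSinceNow: 8))
}
```

The homograph handling happens entirely inside the synthesizer; the code above only hands it the text.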

In addition to VoiceOver, I often use Zoom for magnification and Invert Colors for a higher-contrast display that makes text easier for me to read. On my Mac, I have the cursor enlarged to its maximum size, which makes it easier to find on the screen without straining my eyes. The blind are not the only ones to benefit from Apple’s commitment to accessibility, however. Both OS X and iOS include a range of other features for those with hearing, motor and learning challenges. These features are built in, rather than requiring the purchase and installation of additional software. Just as important, all of them have been developed by a mainstream technology company rather than by an assistive technology one. Personally, this removed a lot of the stigma that assistive technology had for me. Slowly, I became more comfortable with it because I was using the same device as everyone else, and this made it easier to accept it as an integral part of my life. As I continued to explore VoiceOver and the other accessibility features built into my Mac (and later my iPhone and iPad), I started to redirect the effort I had been spending denying my disability into actually being productive, and to see my disability as an asset rather than something that would hold me back in life.

It was then that I started to thrive. I enrolled in graduate school to pursue first a Master’s degree and later my doctorate, which I recently completed. Along the way, I was named an Apple Distinguished Educator, and I have used that platform to share my passion for accessibility and Apple products at events around the country as well as through my online presence. As I travel, my iPhone is always with me, and I use it not only for productivity but also to take photos. Despite my fading eyesight, I have become part of a global movement known as iPhoneography, in which photographers capture, edit and share photos right from their mobile devices. I am confident that even if I lose my eyesight completely, I will be able to continue my photography with the help of my iPhone. The Camera app is compatible with VoiceOver, and with the face detection built into the app, I know I will be able to capture portraits of my daughter as she grows up even if I can barely see the screen. Just a few years ago, if someone had told me that I would someday be traveling around the country independently and doing what I do, I would have told them they were crazy. That is how great a change Apple’s technologies have made in my life and in my way of thinking.

Without a doubt, my life today is better because of Apple’s commitment to accessibility. However, the full promise of technology for people with disabilities like mine remains on the horizon. Until app developers and content creators incorporate accessibility practices into their workflows more consistently, the full potential of technologies such as VoiceOver will not be realized. One common problem people with visual disabilities encounter when trying out a new app is that not all of the buttons needed to navigate it are properly labeled. Turning on VoiceOver then results in the user hearing a series of “button, button, button” without any indication of what each button does. Without these labels, which can easily be added during the design process with Apple’s development tools, we are locked out of apps we could be using to complete our school work, stay up to date with the news, access online services and more.
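To give a sense of how small the fix is, here is a minimal sketch of labeling an icon-only button with UIKit’s accessibilityLabel property. The article predates Swift, and the controller, image name and label text below are hypothetical; only the accessibility properties themselves are the real UIKit API.

```swift
import UIKit

final class PlayerViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        // An icon-only button gives VoiceOver no text to read,
        // so it is announced as just "button" unless we label it.
        playButton.setImage(UIImage(named: "play-icon"), for: .normal)
        playButton.accessibilityLabel = "Play"
        // An optional hint describes what happens when the control is activated.
        playButton.accessibilityHint = "Plays the current track"
        view.addSubview(playButton)
    }
}
```

With that one extra line, VoiceOver announces “Play, button” instead of an anonymous “button.”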

The same thing happens when ebook publishers leave out accessibility descriptions (also known as alternative text) for images, charts and other graphical content. Without these descriptions, those of us with visual disabilities miss out on information that is important to understanding the rest of the content. Building in VoiceOver compatibility for apps and other content such as ebooks benefits more than just blind users. People with cognitive or motor disabilities who rely on switch hardware also benefit, because many of the newer switch interfaces use VoiceOver to communicate between the switch and the iOS device.
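In an ebook the description lives in the markup as alternative text; in an app, the analogous fix is a one-line description on the image view. A minimal sketch, with a hypothetical image and wording:

```swift
import UIKit

// Image views are not accessibility elements by default, so an
// undescribed chart is simply skipped over by VoiceOver.
let chartView = UIImageView(image: UIImage(named: "monthly-sales-chart"))
chartView.isAccessibilityElement = true
chartView.accessibilityLabel = "Bar chart of monthly sales, peaking in June"
```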

Incorporating accessibility into the design workflow brings a number of benefits for developers and content creators as well. For one, they gain access to a new source of revenue from a group of people who are becoming increasingly tech savvy as they realize the potential of technology for improving their lives. About one in five people in the U.S. has a disability, and I would argue that any developer or content creator with good business sense would be foolish to turn away these potential customers. That figure does not even include people with age-related difficulties who, while not classified as having a disability, have similar needs. Furthermore, as universal design advocate Wendy Chisholm has stated, “Google is a blind and deaf user with millions of friends and dollars to spend.” Google and other search engines cannot see images or watch videos, but they can read the text in accessibility descriptions and closed captions. Adding these accessibility features could improve the searchability of your content, bringing your message to a wider audience.

Aside from these practical benefits, adding accessibility to your apps and content is the right thing to do and sets you apart as a socially responsible developer. By creating accessible content, you become, in some ways, part of a new civil rights movement for the 21st century and beyond. As more of our daily lives move to the virtual world of the Web and apps, building in accessibility will ensure that people with disabilities are not excluded from the same educational and employment opportunities as everyone else.

As my eyesight has diminished, my “vision” has become clearer. Like my narrowing field of view, my work has become more focused: educating people about the power of technology to improve the lives of people of all levels of ability. One of the more rewarding aspects of this work is witnessing that “magical” moment when someone is able, for the first time, to access his or her email or read a book independently with the help of accessibility features.

Working together, companies like Apple, the developer community, ebook authors and educators like me can help ensure those magical moments happen for more people as we go forward.

Buy The Loop magazine on the App Store from your iPhone or iPad.