Android is another smart phone platform that is gaining popularity. As of January of this year, 7 percent of cell phone consumers in the United States had Android phones, and that number is steadily increasing. Many blind and visually impaired individuals wonder whether Android is accessible. The answer is that it is partially accessible, but more work needs to be done before the Android platform can be considered fully accessible (Burton).
Types of Phones
The two most popular phones at present are the Droid, available through Verizon Wireless, and the Nexus One, an unlocked smart phone that can be used on both the T-Mobile and AT&T networks. Although these are the most popular models, this article will not focus on just these two; instead, it will discuss the typical features of an Android phone. Because technology changes so quickly, there is little value in reviewing the accessibility of the Android platform through specific phone models. Rather, particular models will be mentioned periodically to provide examples for the statements made in this article (Burton).
Some Android phones have physical QWERTY keyboards, such as the Droid. Others have only a virtual keyboard on a touch screen. Some have track balls for navigation, like the Nexus One. Others have a D-pad, such as the one found on the Droid. Along with the various virtual buttons that may appear on the touch screen, Android phones are all designed so that four buttons are always in the same place: home, back, search, and menu (Burton).
Who Is Working on the Accessibility Aspect of Android?
A division of Google called the Eyes Free Project is working on the accessibility aspect of the Android platform. The project is led by T.V. Raman, a blind scientist, and his colleagues. The purpose of this project is to improve accessibility for those who have vision problems, as well as to make life easier for sighted individuals who cannot look at their phones at a particular time (Burton).
When the project first began, only certain apps were available, notably the talking caller ID, talking dialer, and talking compass apps. Accessibility improved dramatically when the first Android screen reader, known as TalkBack, was released in October of 2009. This screen reader is enhanced by two additional apps, SoundBack and KickBack, which provide nonspoken feedback, such as beeps, vibrations, and clicks, as users interact with their phones. These apps can be enabled by going to the settings menu and then to the accessibility option, and they will remain enabled unless users turn them off manually. Sighted assistance is necessary for activating these applications, as there is no way to activate them independently as of yet (Burton).
The Eyes Free Project is working to make things easier. Recently, it has bundled all of these applications into one shell, referred to as the Marvin Shell. The Marvin Shell can be set as the home screen, where it serves as the launching pad for all applications, especially those made by Eyes Free. There is no additional price to be paid for these accessibility tools; they come straight out of the box on all Android phones running Android 1.6 or later (Burton).
Accessibility and the Android: How It Works
The Marvin Shell is laid out like a three-by-three grid, similar to the dialing pad on a phone. Each number on the grid has a specific function: the one key reports signal strength, the two key checks the date and time, the three key checks the battery level, the six key tells users where they are located, and the eight key launches applications. To find these keys, users place a finger in the middle of the screen, which automatically puts them on the five key. There is no need to be precise, because the Marvin Shell uses what is called relative positioning: wherever the finger lands is treated as the five key, and users find the other keys by sliding the finger from that spot. For instance, suppose a user wants to launch an application. The user places a finger in the middle of the screen; even if the finger is not exactly centered, the five key will be wherever it lands. Since the eight key sits directly below the five on a standard telephone keypad, the user slides the finger down from the five to the eight and then lifts it. When the finger is lifted from the eight key, the user is taken to where applications can be launched (Burton).
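For technically minded readers, the relative-positioning idea described above can be sketched in a few lines of code. This is only an illustration of the concept, not the Eyes Free implementation; the grid layout and the step size in pixels are assumptions.

```python
# Minimal sketch of "relative positioning": the point where the finger
# first lands is treated as the 5 key, and the key entered depends only
# on the direction the finger slides before lifting. The STEP size is
# an illustrative assumption.

# Standard telephone keypad as (column, row) offsets from the 5 key.
KEYPAD = {
    (-1, -1): "1", (0, -1): "2", (1, -1): "3",
    (-1,  0): "4", (0,  0): "5", (1,  0): "6",
    (-1,  1): "7", (0,  1): "8", (1,  1): "9",
    (0,  2): "0",  # zero sits two rows below five
}

STEP = 60  # assumed pixels per grid cell


def key_for_slide(down, up):
    """Return the key entered by sliding from `down` to `up` (x, y in pixels)."""
    dx = round((up[0] - down[0]) / STEP)
    dy = round((up[1] - down[1]) / STEP)
    return KEYPAD.get((dx, dy))  # None if the slide leaves the grid


# Wherever the finger lands counts as the 5 key:
assert key_for_slide((200, 300), (200, 300)) == "5"
# Sliding roughly one cell straight down reaches the 8 key:
assert key_for_slide((200, 300), (205, 362)) == "8"
```

Because only the slide's direction and distance matter, not the absolute touch point, a blind user never has to hunt for a fixed on-screen target.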
Dialing a Number Using Marvin’s Talking Dialer
Most individuals who use smart phones have the numbers they typically call saved in their contacts list. Occasionally, however, a number that is not saved in the contacts list will need to be dialed. When that happens, the process is relatively simple.
Let’s say, for instance, that the user wants to dial an 800 number. The user starts out on the home screen and presses a button at the bottom right corner of the screen called search. Once the search button has been pressed, the talking dialer is activated, and the user can dial numbers using relative positioning, just as when navigating the home screen. The user places a finger in the middle of the screen for the five key, then slides the finger downward until one click is heard. One click indicates that the finger is on the eight key; lifting the finger causes the number eight to be spoken, confirming that it has been entered. To dial the zero, the user again places a finger in the middle of the screen, starting at the five key as usual, and slides it down until two clicks are heard, indicating the zero key. Lifting the finger causes the zero to be spoken. The user continues this process until the entire number has been entered. The user can then tap the search button once to hear all the digits read back; tapping it again places the call (Burton).
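The dial-then-confirm flow described above can be modeled as a small state machine: digits are echoed as they are entered, the first press of the search button reads the whole number back, and a second press places the call. The class and method names below are illustrative assumptions, not part of the Eyes Free software.

```python
# Toy model of the talking dialer's enter / read-back / call flow.
class TalkingDialer:
    def __init__(self):
        self.digits = []
        self.spoken = []        # everything the dialer would speak aloud
        self.confirmed = False  # has the number been read back yet?

    def enter(self, digit):
        self.digits.append(digit)
        self.spoken.append(digit)  # each digit is echoed as it is entered
        self.confirmed = False     # new digit, so read-back is needed again

    def press_search(self):
        number = "".join(self.digits)
        if not self.confirmed:
            # First press: read back everything entered so far.
            self.spoken.append(number)
            self.confirmed = True
            return "read back " + number
        # Second press in a row: place the call.
        return "call " + number


dialer = TalkingDialer()
for d in "800":
    dialer.enter(d)
assert dialer.press_search() == "read back 800"
assert dialer.press_search() == "call 800"
```

The two-press design gives the user a chance to catch a mistake before the call actually goes out.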
If the user enters a wrong number during the dialing process, he can shake the phone once to delete the wrong digit. If the user wants to erase all of the numbers that have been entered, he will need to shake the phone twice (Burton).
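The shake gestures can be sketched the same way: one shake deletes the last digit, and two shakes in quick succession clear everything. The half-second double-shake window below is an assumed value; the article does not specify how quickly the two shakes must follow each other.

```python
# Sketch of shake-to-delete, assuming shakes arrive as timestamps in
# seconds. Two shakes within `window` seconds count as a double shake.
def apply_shakes(digits, shake_times, window=0.5):
    """Apply a sequence of shakes to a list of entered digits."""
    digits = list(digits)
    i = 0
    while i < len(shake_times):
        if i + 1 < len(shake_times) and shake_times[i + 1] - shake_times[i] <= window:
            digits.clear()  # double shake: erase everything
            i += 2
        else:
            if digits:
                digits.pop()  # single shake: delete the last digit
            i += 1
    return digits


assert apply_shakes("8001", [0.0]) == ["8", "0", "0"]  # one shake
assert apply_shakes("8001", [0.0, 0.3]) == []          # double shake
```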
When the call is finished, the user will need to touch a virtual button that is midway up the screen. The director of the Eyes Free Project recommends putting a sticker on the other side of the phone, in the same place the button appears on the screen, in order to find it reliably. Being able to find this button will also enable the user to swipe a finger to answer or ignore calls (Burton).
Phones that Work Best with Eyes Free
The phones that work best with Eyes Free are those with physical QWERTY keyboards. Phones that have only a touch screen and a virtual keyboard are not the best choice, as they will not let users access all functions properly. With a physical QWERTY keyboard, users can access pretty much all of the phone’s functions, with the exception of applications that are not accessible. Virtual keyboards on the touch screen models limit the functions users can access. For instance, users may not be able to reach certain apps because they cannot get to all of them using the stroke dialer. In addition, if a user must call an automated system, phones with only virtual keyboards will not allow access to the prompts using the Marvin Shell’s talking dialer (Burton). Users who wish to help improve accessibility may want to have both types of phones on hand, so they can get the entire picture and provide detailed suggestions to the Eyes Free Project.
There are still some accessibility problems that need to be worked out. One major problem for many blind and visually impaired individuals is that Internet browsing and emailing are still not accessible. Furthermore, the talking dialer will not allow users to interact with automated systems in which buttons must be pressed to follow prompts. There are also some bugs to iron out. For example, the talking caller ID on the Nexus One will speak the number of the previous caller rather than that of the person calling at present. And when users press the power button to wake the phone from sleep, they are told to press the home button, when they actually just have to swipe a finger across the bottom of the screen from left to right (Burton).
The development of accessibility for Android is heading in a positive direction. If the progress we have seen in the past year continues, we will see a major change in the smart phone industry concerning accessibility, and the blind and visually impaired will have many more options. Since the Android platform is open source, any blind or visually impaired individual with a technical background is encouraged to help with the project in order to make drastic improvements to accessibility. Those who are less technically savvy can also lend a helping hand by demoing an Android phone and providing suggestions to the project. If we as a community stick together on this, we can make positive changes by leaps and bounds.
Burton, Darren. “Can an Android Make Your Mobile Phone Accessible?” AccessWorld. American
Foundation for the Blind, May 2010. Web. 13 July 2010. http://www.afb.org/afbpress/