Assistance for blind passengers at Heathrow Airport: will it improve?

April 7, 2013

If you follow me on Twitter (@kirankaja12), you may have noticed a bunch of tweets from me in the past few weeks about the rather appalling level of assistance provided by Heathrow Airport to blind passengers. Sorry for flooding your timelines! Under EC Regulation 1107/2006, it is now the responsibility of the airport to provide any special assistance required by so-called “Persons with Reduced Mobility” or “PRM” passengers. This rather all-encompassing category may include persons with disabilities such as blind and partially sighted people, passengers who are deaf or hard of hearing and wheelchair users, as well as elderly people who may have problems walking long distances or cannot manage steps. The regulation of course applies to airports within the European Union. Outside the EU, the responsibility to provide assistance usually lies with the airline. Almost all airports in the EU outsource the special assistance activity to a third-party contractor. In the case of Heathrow Airport, this is a company called Omniserv.

So what on earth is the problem?

I use Heathrow Airport quite a bit because it is by far the most convenient airport for me. It is also the home of British Airways, which in my opinion is the best airline for blind passengers, primarily because BA cabin crew are well trained to assist passengers with visual difficulties. I mostly travel for work, with occasional leisure trips.

Over the past year or so, I have noticed that the assistance levels at Heathrow Airport have gone downhill. The issues I face on a regular basis include:

  • No assistance provided when my flight arrives into Heathrow, in spite of booking assistance in advance with the airline. In a number of instances, there was no one from Omniserv to help me off the plane and guide me out of the terminal building. On several occasions, the cabin crew kindly walked me out even though this is not their responsibility. On others, I had to wait a long time after everyone had disembarked for someone to pick me up. A couple of times, I was left in a rather quiet part of the terminal building and asked to wait for assistance, with no information as to when they would be picking me up and no easy way to contact a member of staff.
  • Heathrow Airport advises passengers requiring assistance to arrive at least two hours before flight departure. I diligently follow this, but they would then make me wait for a long time before helping me through security and taking me to the departure gate. On occasion, I have had to wait for up to 50 minutes, with no update as to what was causing the delay or when someone was likely to arrive. Their service level agreement states that 97% of passengers who have pre-booked assistance will be helped within 10 minutes and 99% within 17 minutes. Just like any other passenger, I may want to have a coffee or a meal before I get on a long flight. But if they take a long time to help me through security, there won’t be any time left for me to do this.
  • While most Omniserv staff are very helpful and provide a good level of service, I have seen enough instances of unprofessional and inappropriate behaviour. They would loudly argue with each other right in front of passengers about whose responsibility it is to help someone, fail to communicate important information such as flight delays, refuse to escort blind passengers to a coffee shop or other retail establishment within the airport when they are clearly required to do so, and so on.
  • Of course, there is that old bugbear of insisting that blind passengers sit in a wheelchair when they are able to walk just fine. I flatly refuse. There is absolutely no reason why blind people need to sit in a wheelchair unless they specifically wish to do so. Admittedly, this hasn’t happened to me at Heathrow of late. They offer a wheelchair, but they don’t insist too hard when I politely refuse it.

So what did I do?

In addition to ranting about it on Twitter?

I am aware that Heathrow is a huge airport with high passenger volumes and that there will be times when resources are stretched thin. I expect, and am happy to put up with, reasonable delays. However, when they consistently fail to show up to escort people off an arriving plane in spite of pre-booked assistance, I thought it was time to start complaining.

In September 2012, when no assistance was provided to me on arriving into Heathrow from Glasgow, I first wrote to British Airways explaining what happened and asking why no assistance was provided. They got back to me saying that under the EU regulation, it is no longer their responsibility and that I should write to Heathrow Airport directly. So I sent strongly worded feedback to Heathrow Airport via their website. I didn’t receive any reply initially, so I sent a second round of strongly worded feedback. I then received an apology from an Omniserv customer services person. They gave me a vague reason about not having the correct information for my flight and informed me that they would improve their processes. However, the problem persisted. Meanwhile, I also started hearing about other blind people encountering similar problems. Omniserv again failed to assist me off a plane last month when I was coming back from a two and a half week work trip to the US. As usual, I booked assistance in advance and informed British Airways that I am blind. This information was passed on to Omniserv and Heathrow. In spite of all this, they not only failed to escort me off the plane but also failed to respond when they were told that a blind passenger was waiting to be helped. I had to find my way on my own within the terminal building until I luckily encountered an Omniserv employee who was just coming off a break. Once more, I sent a written complaint to both British Airways and Heathrow Airport with exact details of what happened, asking for an explanation.

So why did Heathrow/Omniserv screw it up?

It is the stupid computer’s fault!

Without going into too much detail, Omniserv claims that I was not met at the plane because of a problem with their computer system. My booking apparently had two special assistance codes, “blind” and “WCHR”. The latter denotes that I have no problems other than not being able to walk long distances. If a booking has more than one special code, the Omniserv system retains only one of them, and in my case it dropped “blind”. Since they don’t necessarily have to assist every “WCHR” passenger in person, on this occasion they chose not to send a member of staff to the plane I was arriving on. I am rather puzzled that such a glaring problem with the computer system went undiscovered in the two and a half years Omniserv has been operating at Heathrow Airport.
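
To make the failure mode concrete, here is a minimal sketch of how such a bug might look. This is purely hypothetical Java; I obviously have no knowledge of Omniserv’s actual system, and every name in it is invented:

    import java.util.Arrays;
    import java.util.List;

    // Purely hypothetical illustration of the bug described above;
    // not Omniserv's actual code. All names are invented.
    class AssistanceBooking {
        // The flaw: a single field where a booking may carry several codes.
        private String assistanceCode;

        void importCodes(List<String> codesFromAirline) {
            // Each assignment overwrites the previous one,
            // so only the last code in the booking survives.
            for (String code : codesFromAirline) {
                assistanceCode = code;
            }
        }

        boolean mustMeetAtAircraft() {
            // "WCHR" passengers aren't necessarily met at the aircraft door,
            // so if "blind" was overwritten, no one is sent to the plane.
            return "blind".equalsIgnoreCase(assistanceCode);
        }

        public static void main(String[] args) {
            AssistanceBooking booking = new AssistanceBooking();
            booking.importCodes(Arrays.asList("blind", "WCHR"));
            System.out.println(booking.mustMeetAtAircraft()); // prints false
        }
    }

Feed this both codes and only “WCHR” survives, which matches the explanation I was given. The obvious fix is to store a list of codes rather than a single one, a point I will come back to below.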

Is Heathrow/Omniserv going to do anything about it?

Your guess is as good as mine!

After a number of email conversations and a couple of meetings with various people at Omniserv and Heathrow Airport, I have now been assured that they will take a number of steps to improve the special assistance service at Heathrow Airport. I attended a focus group for blind users of Heathrow on Wednesday 4th April, where I met other blind travellers who had had similar experiences. Again, without going into too much detail, here are some of the measures Omniserv and Heathrow told us about that will hopefully lead to some improvements.

  • Generating regular extra reports from their computer systems to trawl through bookings with multiple assistance codes and identify blind passengers. This is to work around the current limitation of their computer system mentioned above. However, as I observed earlier, I am surprised that this issue has gone undetected for so long. They would surely be better off fixing the system so that it retains all codes and not just one.
  • Identifying and retraining a core group of Omniserv staff who will primarily be assisting blind and partially sighted people. They feel this is important because blind passengers constitute less than 1% of all special assistance requests at Heathrow. This means that most Omniserv staff members assist blind passengers only infrequently and therefore may not recall their training. Given the scale of special assistance operations at Heathrow, this may sound logical to Omniserv, but we will need to see how it works in practice. What happens when no one from this core group is on shift while a blind passenger is travelling through Heathrow? They should also ensure that staff are retrained in disability awareness on a regular basis.
  • Working with airlines operating out of Heathrow to ensure that they correctly pass on special assistance requests to Omniserv. Given that passengers can book their journeys directly with airlines or through thousands of travel agents scattered all over the world, I am not entirely sure how successful this exercise will be. Omniserv should still be prepared for special assistance requests that haven’t been notified in advance.

While both Omniserv and Heathrow Airport seem genuinely interested in improving service levels, it remains to be seen whether there will be any real change in the next few months. More importantly, if they do manage to improve the service, they should ensure that it is maintained consistently.

Is there anything that blind people can do?

In addition to talking about it on Twitter, that is.

Definitely. There are a few things that we as blind passengers can do to help improve the situation.

Book assistance when possible

According to EC Regulation 1107/2006, you don’t strictly have to book assistance in advance. Even though booking at least 48 hours in advance is recommended, the airport is required to make all “reasonable efforts” to assist you even if you arrive at the airport without prior notification. Do remember, though, that airports like Heathrow can get extremely busy at certain times and no one has unlimited resources. Early notification helps the airport schedule its resources correctly. So if you are booking your ticket in advance, do make sure that you ask the airline or travel agent to book assistance as well. You should also let them know the type of disability you have so that they can book the right type of assistance. Remember that they have this weird system of special assistance codes that may influence the type of assistance you get.

I know that some of us don’t really want to book assistance, for a number of reasons. I always booked assistance in advance and still ended up getting horrible service, so there is a tendency not to bother when you notice that the service is equally bad when you do book.

However, there is no clear definition of what “reasonable effort” actually means. So if you don’t notify them, an airport can claim that it could not accommodate you in spite of making all “reasonable efforts”. Further, if you do get a poor level of service, you are on much firmer ground when complaining to the service provider if you booked in advance.

Of course, there will always be instances where it isn’t possible to notify in advance. If you have to fly at short notice for whatever reason, most airports should generally accommodate you. But airline journeys, unlike train or bus journeys, tend to be booked well in advance, so I would urge you to take a few minutes to book assistance whenever possible.

A number of airlines let you book assistance online, typically either at the time of booking the ticket or later via the airline’s website. In Europe, I was able to book assistance online with Finnair, Brussels Airlines and Air France KLM. Annoyingly, British Airways still requires you to call their customer care number to book special assistance, and wait times can sometimes be 20-25 minutes. I have been asking @British_Airways on Twitter to provide an online special assistance booking facility for a while now. If you think this would be useful, please tweet them and let them know.

I booked in advance but I still got crappy service!

You don’t have to put up with it. Take a few minutes to complain to the airport that is at fault. Explain that in spite of requesting assistance in advance, you weren’t given the assistance you required within a reasonable time. You can contact Heathrow Airport via their feedback form at http://www.heathrowairport.com/help/contact-us/feedback, or you can tweet them @HeathrowAirport. You can usually find contact details for other airports on their respective websites. You should also send a complaint to the airline you were flying with. Because it is the airport that is responsible for providing assistance, the airline will usually forward your complaint to the relevant airport and forget about it. But it may at least add weight to your complaint to the airport.

It is important to provide feedback when things go wrong. Even if you travel only occasionally, there is absolutely no reason why you should put up with inferior service. As long as we as customers don’t hold them accountable, airports and even airlines will make no effort to mend their ways. Similarly, we as passengers should make the effort to provide positive feedback when someone provides a very good level of support.

When you are writing to an airport, provide as much detail as you can remember about the incident. For instance, it is far better to tell them that you waited 45 minutes before you were offered assistance (if that is the case) than to just say there was a long delay. Do remember that you need to allow a reasonable time before assistance can be provided. If I have booked assistance in advance, I typically allow for a 15 minute delay at large airports before I begin to get annoyed. I am not for a moment suggesting that we should all start keeping track of the time we had to wait for assistance. It is certainly not my favourite pastime while waiting to board a plane. But any substantial facts are helpful when complaining to an airport.

What did I miss?

Quite a bit I am sure!

Let me confess that I am not an expert in accessible travel or anything of that sort, nor am I enough of a legal expert to fully understand EC Regulation 1107/2006. I just happen to be a blind guy who travels a fair bit and has been unlucky enough to have had some rather unpleasant experiences, particularly at Heathrow Airport.

I don’t know enough about all the challenges that people with other disabilities face while travelling by air, especially wheelchair users like @Christiane, whose input on Twitter is much appreciated.

So dear readers, if you happen to have any experiences of special assistance at Heathrow or other airports, please do share them via the comments form below. If you prefer, you can also talk to me on Twitter @kirankaja12 or email me on kirankaja12 at gmail dot com.

Happy travels, and may you always get the assistance you need!

Android JellyBean Accessibility Continued: Books, Music, Movies and YouTube

August 12, 2012

This is a post about a few more features in Android JellyBean that might be of interest to the accessibility community. If you haven’t done so already, you may want to read my earlier blog post Random thoughts on Android Jellybean and Google Nexus 7 Accessibility. I received a lot of very good feedback on that blog post. Thank you very much for your comments and recommendations.

Google Books

I was pleasantly surprised to find that an ebook of Only Time Will Tell by Jeffrey Archer was included for free on my Nexus 7 device. Or at least I think it is free! I don’t remember intentionally purchasing it, but it is perhaps best to check my credit card statement anyway. Nevertheless, the ebook is on my device and I wanted to read it: Jeffrey Archer is one of my favourite authors, and although I have an unabridged audio version of this book, I haven’t started it yet.

Ebooks purchased from Google are available in an app called “Play Books”. I suspect the name “Play Books” comes from “Play Store” which is Android’s online store for apps, music, ebooks and other media content. “Play Books” sounds like a weird name for an ebook application though.

There is a dock with a number of app icons located at the bottom of the home screen. This dock is available on all pages of the home screen. Play Books is one of the apps that can be launched from the dock.

When the Play Books app is launched, the first screen contains the list of ebooks that have been purchased on the Play Books store. I believe Google also distributes books that are out of copyright and these are free to download. Any such free downloads should also appear in this list.

When the Play Books app is launched for the first time, it may take a few seconds for the list of available books to be populated; I suspect the device acquires the list from the Play Books online store. Further, an internet connection is required to open and read these ebooks, although there is an option to download them to the device as well.

I have had mixed results navigating through the list of books on this screen. Sometimes, the book titles are read out when using the left and right swipe gestures. But more often, I have to explore the screen to get to them.

Assuming that Talkback is running and explore by touch is active, double tapping a book in the list will open it. More importantly, Talkback will start reading the book aloud. Do remember that the book has to be downloaded from Google if opening it for the first time. So it may be a few seconds before Talkback starts to read the book. I do have to say though that the default TTS engine in JellyBean isn’t really suited for reading books. Especially not fiction. I may have to invest in a better sounding TTS engine.

If you let it, Talkback will continue to read to the end of the book. As mentioned in my previous blog post, there is no command or gesture to stop/pause and resume speech in Talkback. But I may have luckily discovered a workaround for this. When you want to pause speech, tap and hold with a single finger anywhere on the screen. Talkback will pause reading the book as long as your finger is touching the screen. To resume reading, just lift your finger. Very handy for answering a quick phone call for instance.

The only other way to pause or stop reading is to go back to the previous screen (the list of books) or to leave the Play Books app completely. Unfortunately, when Talkback is reading a book, it is almost impossible to get to the “Back”, “Home” and “Recent Apps” soft buttons at the bottom of the screen. When I touch these buttons, I can hear the earcon indicating that I am on one of them, but Talkback continues to read the book. It may read the name of the soft button I touched much later, once it has finished reading a chunk of text, but by then I may have moved away from that button. Fortunately, Talkback has a set of four shortcut gestures that come in handy here. The Home button can be activated by swiping up and then left, and for the Back button it is swipe down and then left. These shortcut gesture assignments can be altered from Apps > Settings > Talkback > Settings > Shortcut Gestures.

Moving to the next and previous pages is the only navigation capability available within a book when using Talkback. A two finger swipe up moves to the next page and a two finger swipe down moves to the previous page, and Talkback will resume reading from there. Using the right swipe gesture to read the next chunk of text doesn’t work reliably: the gesture works for the first couple of attempts with some books, while with others it doesn’t work at all. Changing the reading level to character, word or paragraph doesn’t seem to have any effect.

The books I tried to read do appear to have a table of contents. Although I haven’t verified this with a sighted user, I am fairly certain that it is possible to jump to a particular chapter from the table of contents. This is, however, not possible with Talkback. Of course, when Talkback starts reading from the beginning of the page, it dutifully reads through the table of contents. Listening to something like “Chapter 1 Chapter 2 Chapter 3” until chapter 70 or so can get annoying rather quickly. So I use the move-to-next-page gesture a number of times in an attempt to find the beginning of the text, with mixed results.

If for some reason you don’t want Talkback to automatically start reading books, you can turn this off in Play Books settings. In the application’s main screen, find the “More Options” button. Activating this button opens a menu; activate “Settings” from this menu, then look for the “Automatically read outloud” checkbox and uncheck it. If you are a Talkback user, you will need a really compelling reason to disable this option: I haven’t found an alternative way to read any book with it disabled.

For partially sighted users, there are options to change the font type and increase or decrease the font size. There appear to be six different fonts. However, I cannot really test whether the different font size options are suitable for users with low vision. These display options are available through a popup menu that can be opened by swiping up from the Home button, although doing this with Talkback isn’t an easy task. Finally, according to the help documentation for Google Play Books, some ebooks may only be offered as original scanned pages, which means they won’t have text reflow capabilities. Users will apparently be notified of this when purchasing or downloading the ebook. I haven’t come across such an ebook on Google as yet, but I am not sure if Talkback will be able to read a book out loud if it only contains scanned pages.

I do have to admit that, for fiction at least, I prefer to listen to a human-narrated version of a book rather than a text to speech engine. However, I am still persisting with the ebook version of “Only Time Will Tell”.

The most important thing, however, is that visually impaired people now have a much greater choice of reading material that is partly or fully accessible. In addition to Google Books on Android, iBooks on iOS and Blio on both platforms, Adobe Digital Editions on PC and Mac is now screen reader accessible as well.

More Gestures for Google Books

After I published this blog post, I was informed about a few more gestures and commands for reading books in the Play Books app. These don’t appear to be documented online, so most of what is written below is from my own experience of using them.

When a book is open and Talkback is reading it, double tapping with a single finger pauses speech. Double tap with a single finger again to resume, and Talkback will start reading the book from where it was paused. I hadn’t discovered this in my initial testing because everywhere else in the Android platform, a single finger double tap activates things, so I didn’t expect it to perform the conceptually different function of pausing and resuming speech. No matter how hard I try to think logically, a single finger double tap activating items throughout the platform but doing something entirely different in a single application is hard to comprehend.

It is also possible to jump to the next and previous pages by touching and then double tapping the right and left corners of the screen respectively. For example, to go to the next page, you would first touch the screen towards the right corner. Nothing happens at this stage, but double tapping the screen now will take you to the next page and Talkback will continue reading from there.

Earlier, I said that a single finger double tap will pause and resume speech. This would obviously lead readers to conclude that after moving to the next or previous page, they can use a double tap to pause speech. Unfortunately, that isn’t true. The single finger double tap is now set to moving to the next or previous page (depending on which function was selected), and subsequent double taps will keep performing this new function. To make the single finger double tap pause and resume speech again, you have to touch the middle of the screen and then double tap. This returns the gesture to its former state, which is to pause or resume reading.

Perhaps this can be explained in a slightly simpler way. Imagine that when a book is opened, there are three huge buttons on the screen in a row from left to right. The leftmost button, towards the left edge of the screen, is the “Previous Page” button. The middle button is the “Pause/Resume Reading” button. And the rightmost button, towards the right edge of the screen, is the “Next Page” button. To activate one of these buttons, you first select it by touching it and then double tap. Once you activate a button, subsequent single finger double taps will continue to activate that button. In order to activate a different button, you have to touch it first.
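
For the technically minded, the same mental model can be written down as a simple hit test. The following is purely illustrative Java, not how Play Books is actually implemented:

    // Purely illustrative; not Play Books' actual implementation.
    // The screen behaves like three invisible buttons arranged in a row.
    class ReaderZones {
        static String zoneForTouch(float x, float screenWidth) {
            if (x < screenWidth / 3) {
                return "Previous Page";        // left third of the screen
            } else if (x > 2 * screenWidth / 3) {
                return "Next Page";            // right third of the screen
            } else {
                return "Pause/Resume Reading"; // middle third
            }
        }
    }
    // A single finger double tap "presses" whichever zone was touched last
    // and keeps pressing that same zone until you touch a different one.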

It would have been very helpful if all of this were clearly documented in an appropriate location; the Accessibility section of the Nexus 7 manual would have been a good place. Most users, especially if they are new to the platform, would have a very hard time figuring out something like this on their own.

Music and Movies

There is not much to say about these applications. The music playback application is called “Google Music”, which is probably a more obvious title than “Play Books”. Again, Play Music can be launched from the dock on the home screen. Unfortunately, Google’s online music store and music-in-the-cloud services are not available in the UK, so I wasn’t able to test them. However, I did manage to copy a number of music tracks onto the Nexus 7. The Play Music app recognised them immediately and I was able to play them with not much trouble. The playback controls are located at the bottom of the screen, just above the “Back”, “Home” and “Recent apps” soft buttons. Playback controls are also available on the lock screen and are usable with Talkback. I haven’t figured out a way to rewind and fast forward inside a track.

The app for movie playback is a different story altogether. Not surprisingly, it is called “Play Movies” and is also available on the dock. Using Talkback, I can start playback of a movie, but I haven’t found a way to pause it; I needed sighted help to accomplish this. None of the shortcut gestures work while a movie is playing. Double tapping the screen brings up an unlabelled image control, and I tried double tapping this to bring up the playback controls with no success. Perhaps there is a way to pause or stop playback when using Talkback. However, I am not too interested in watching movies, so I didn’t spend too much time looking for a solution.

Movies can be purchased on the Google Play online movie store. I was happy to note that the movies I purchased did contain closed captions. It is also possible to select the font size for caption display from “small”, “medium”, “large” and “huge”. There is no audio description for these movies though.

Finally, I noticed a number of unlabelled image controls scattered all over both the Play Music and Play Movies apps. I am not sure what these controls do.

YouTube

The YouTube app is available in the “Apps” folder. It is mostly usable with Talkback except for a couple of challenges. Once you start playback of a YouTube video, a Talkback user needs to double tap the video to bring up the playback controls in order to pause it. I have found a method that works for me more often than not. When the video is playing, find a control named “Info” and swipe left once from it. There won’t be any speech feedback, but you are now in the right spot to bring up the playback controls; I presume the focus is on the video itself. Double tap the screen, and the playback controls become visible, although focus doesn’t move to them. Swipe right once to move to the “Pause Video” button. This is a bit unwieldy, but it does work for me. I also tried remembering the position of the video playback area, double tapping it and then remembering the location of the “Pause Video” button, but I admit I haven’t been successful.

I suspect the YouTube interface on a 7 inch tablet is a little crowded, and it is perhaps easier to access the controls on a smaller screen. It will also be interesting to see how accessible the new standalone YouTube app for iOS is. If it uses the same UI design as the Android tablet app, it may not be a pleasant experience for blind users.

As always, any comments, suggestions, tips and app recommendations are very welcome. Please leave a comment below or get in touch with me on Twitter @kirankaja12.

Random thoughts on Android Jellybean and Google Nexus 7 Accessibility

August 2, 2012

If you follow developments in mobile technology, you may already know that a new version of the Android operating system called “JellyBean” was launched last month along with a new 7 inch tablet, the Google Nexus 7. It was also announced that JellyBean comes with a number of accessibility improvements. To learn more about these improvements, either watch the hour-long video or read the announcement on the Eyes-Free Android blog.

I wanted to test the new accessibility improvements in JellyBean, so I ordered a Google Nexus 7. I was happy to note that the help documentation for the Nexus 7 has a section dedicated to accessibility features. Anyone considering the purchase of an Android JellyBean device should read through this information.

In spite of reading through this section and doing a bit of research in advance, I ran into a couple of problems while setting up the new Google Nexus 7. I thought I would document them here along with the workarounds I found.

Turning the device on for the first time

If you are blind and do not have sighted assistance, don’t be in a hurry to unbox your Nexus 7 and turn it on. It would help to prepare a bit.

Firstly, you will need to have a pair of headphones ready. You don’t have to use them right from the start: the Nexus 7 does have an external speaker. But it would be very hard to complete the setup without them, as you will see shortly. Secondly, it helps to have a flat surface like a desk: there is a gesture to turn on Accessibility mode, which fires up Talkback with Explore by Touch active, and this gesture is rather unreliable and seems to work best when the device is placed flat on a desk.

If you are holding the Nexus 7 in portrait mode, the power button and the volume rocker are along the right edge of the device towards the top. The 3.5mm headphone socket is along the bottom edge towards the right. Don’t forget to connect the device to a power source before you start: the micro USB port is along the bottom edge towards the middle, and the device seems to ship with an almost empty battery.

When you turn the device on by pressing the power button, you will be placed on a welcome screen. You can turn on the Accessibility mode right here: touch the screen with two fingers placed slightly apart and hold for 4 seconds. Within the first second or so, the device should recognise the gesture and ask you to continue holding for a few more seconds to turn on Accessibility mode. Don’t remove your fingers from the screen, and no matter what you do, don’t let a third finger touch the screen during these few seconds. I did this exact thing by mistake, and the device said “cancelling accessibility mode” and wouldn’t talk to me at all for the next 3 hours no matter what I tried.

It appears that the gesture works better when you use both hands, with one finger touching towards the top half of the screen while the other touches towards the bottom half. If for some unfortunate reason you end up in a situation where the device says “cancelling accessibility mode”, you can try the same two finger gesture again. Even though the device may not speak, you may hear a click sound after 4 seconds or so; at this point the Accessibility mode may come up on its own. If not, try drawing a rectangle on the screen in the clockwise direction. This gesture is supposed to enable Talkback and explore by touch, although it didn’t work in my case. As a last resort, you may try the two finger gesture again a few times. You might just be lucky!

Of course, if you have the chance, you can also set up the device with sighted help and then turn on Talkback from Apps > Settings > Accessibility > Talkback. A number of users on Twitter and on the Eyes-Free mailing list seem to be having issues with the gesture to turn on Accessibility mode. I tried the two finger gesture and the rectangle gesture countless times and spent close to three hours before one of them somehow worked. I hope this issue will be addressed in the near future. Turning on Accessibility mode with hardware keys might be a better approach for most users, since the large majority of Android devices have at least a power button and volume keys.

Assuming you got the two finger gesture to work and have Accessibility mode turned on, you will be placed in the explore by touch tutorial. It only has three steps, but I found it very useful: it covers the important gestures you will need to use Talkback with explore by touch enabled. Once the tutorial is finished, you return to the welcome screen, where you pick a language and activate the “Start” button to begin setting up the device.

When you get to the screen to enter the password for the Wi-Fi network, look for the checkbox which says “show password” or “display password” and check it. If you don’t do this, all characters on the virtual keyboard are read as “dot” by Talkback. Unfortunately, when entering your Google account details, the “show password” option isn’t available (for obvious reasons). This is where the headphones come in handy: in order to enter the password with the virtual keyboard, you need to connect them. Once headphones are connected, the characters on the virtual keyboard are announced correctly; otherwise, you will just hear “dot” no matter which character you touch.

This is great for security, but I don’t think it is wise to enable it by default. What if I don’t have a pair of headphones handy when I am setting up the device? The behaviour can be turned off by checking the “speak passwords” checkbox in Apps > Settings > Accessibility, but it isn’t possible to get to settings from the initial setup screen. I just don’t think this is a smart usability decision.

Accessibility mode gesture isn’t available on all devices

It is important to remember that the two finger tap and hold gesture to turn on Accessibility mode may not be available on all devices. Both of the current Google devices, the Nexus 7 tablet and the Galaxy Nexus smartphone, support it. On other Android devices from manufacturers like HTC, Samsung, Motorola and so on, the components required for accessibility support have to be either downloaded from the Google Play app store or enabled in Apps > Settings > Accessibility. In addition, these manufacturers may customise the Android user interface with skins, which may or may not work with Talkback and other accessibility features. For now, it is best to stick to Google devices running stock Android (with no customisation) if you need to use a screen reader. There are ways to install stock Android builds on non-Google devices, but this is not easily accomplished.

New gestures

Explore by touch mode was introduced in Ice Cream Sandwich (Android 4.0) and has been improved in JellyBean. In addition to touch exploration, where you move your finger along the screen and Talkback announces the items you touch, you can now use gestures to navigate through items. For example, swiping right and swiping left move focus to the next and previous item on the screen respectively. Double tap anywhere on the screen to activate the last focused item.

While you can get to any item on the screen by touching it, provided you remember its exact position, a combination of touch exploration and swipe gestures is in my opinion the most efficient way to navigate a touch screen. The introduction of swipe gestures is an excellent addition to Android accessibility.

A list of all accessibility gestures in Android 4.1 (JellyBean) is available under the accessibility section of the Nexus 7 help documentation. By default, the left and right swipe gestures move focus by item, but you can change this to move by character, word or paragraph. Swipe down with one finger and, in the same gesture, swipe up; Talkback will say “read by character”. You can now use the left and right swipe gestures to read the current item character by character. To return to the default navigation mode (moving by item), keep using the swipe down and up gesture until Talkback says “default reading level”. Swiping up and then down toggles backwards through the available navigation modes.

There are a couple of very important gestures missing and I haven’t found an alternative way to perform these actions:

  • A gesture to pause or stop speech. This doesn’t seem to be possible on the Nexus 7 tablet, although Marco Zehe has noticed that tapping his Galaxy Nexus on the top edge usually silences speech. I suspect this is down to the proximity sensor on the phone. But really, a screen reader should have a pause speech command or gesture.

  • A gesture to read from the current item to the end. I thought this was a basic requirement for a screen reader, but there doesn’t seem to be a way to do it. For example, if I am reading a long email thread in the Gmail app, I have to read the entire body of the message in one go or read by character/word/paragraph. Further, reading anything remotely lengthy in a web browser is extremely painful without this command.

A word about navigating lists

It is best to illustrate my point with an example. When you open the Apps > Settings screen, Talkback informs you that you are on a list showing 18 items. There are actually 24 items in the Settings screen on my Nexus 7, but only 18 of them are visible on the screen. To get to the other six items, you have to scroll manually with scrolling gestures: place two fingers on the screen and move up. You can also swipe right and then left in the same gesture to go to the next page of a list. If you do not do this and simply continue navigating with left/right swipe gestures, you will keep cycling through the 18 visible items in the list and the other items on the screen.

I wanted to point this out because this behaviour is contrary to how most other user interfaces work, especially for screen reader users. When navigation commands are used, list controls tend to automatically scroll to display hidden items.

Talkback’s behaviour is particularly confusing because when you move focus to a list control, it only announces the number of items being displayed and fails to mention how many items there are in the list in total. In the Settings example, instead of just saying “list showing 18 items”, it would be useful for Talkback to say “list showing 18 of 24 items”. Without this information, users will always have to attempt to scroll the list manually just to make sure they are not missing any items.
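
Until Talkback does this itself, app developers could approximate it. Here is a minimal sketch of one possible workaround, assuming a standard ListView; this is my own suggestion rather than anything Google recommends, and it would have to be re-run every time the list scrolls:

    import android.widget.ListView;

    // Sketch of a possible developer-side workaround (my own suggestion,
    // not an official recommendation): put the total item count into the
    // list's content description so Talkback can announce it.
    class ListDescriber {
        static void describe(ListView list) {
            int total = list.getAdapter().getCount();
            int visible = list.getLastVisiblePosition()
                    - list.getFirstVisiblePosition() + 1;
            list.setContentDescription(
                    "List showing " + visible + " of " + total + " items");
        }
    }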

Virtual Keyboard

Using the default Android keyboard in JellyBean is straightforward. When you encounter an input field, double tap on it to bring up the keyboard. Move your finger along the keys, and when you have found the character you would like to insert, just lift your finger. Talkback reads out the inserted character at a lower pitch. On the Nexus 7, I notice that I have to be precise with my finger placement: if I don’t lift my finger at the exact centre of a character, it doesn’t get inserted. To insert numbers, switch to the “Symbols” keyboard by tapping the “Symbols” toggle on the bottom left of the keyboard.

Do remember that you will need headphones to enter passwords if you haven’t turned on “speak passwords” in Apps > Settings > Accessibility.

Entering a PIN on the lock screen

If the Nexus 7 needs a PIN to unlock, the way to enter it is slightly different from the usual way you enter text. On the lock screen, there is an unlabelled input field and, immediately to its right, an equally unlabelled button. This button needs to be activated by double tapping before entering the PIN. To insert the digits, just tap them on the telephone-style virtual keypad and lift your finger; don’t double tap. Even though there is no feedback, the digits are entered. When you are done entering all the digits, tap the enter key to the right of the digit 0 on the keypad. On the other hand, if the device is locked with a password, you double tap the unlabelled input field, which brings up a standard virtual keyboard.

Voice recognition

Voice recognition is one reason why I keep trying to switch to Android. It does an excellent job recognising my accent and is much more accurate than Siri on the iPhone. Voice recognition is available throughout Android: you can speak into almost all input fields.

Before you start using voice recognition though, make sure you connect a pair of headphones. Trying to use voice recognition when Talkback is running and speaking out of the built-in speaker is a recipe for disaster!

When you are composing a text message or an email, look for the “voice input” button on the keyboard, between the “Symbols” toggle and the spacebar towards the bottom left corner. Tap it and lift your finger, and you will hear an earcon notifying you that the device is ready for you to speak. Speak normally and, when you are done, wait a few seconds; the device plays another earcon to indicate that it has paused recognition. You will also notice that the keyboard now has only one item, which says “tap to speak”. Double tap this button to start recognition again. To get the normal keyboard back, you need to activate the Back button at the bottom left of the device and then double tap the input field again. This process is a bit clumsy, but I haven’t found an alternative.

When you are speaking, Talkback starts reading out words as they are recognised and entered into the input field, which can be a bit distracting. And by now you can guess why I suggest using headphones: while the device is in voice input mode, Talkback reads out the text inserted by the voice recognition feature, and the voice recognition feature then recognises Talkback’s speech output and inserts the same words again. The result is an infinite loop of Talkback reading the same words and voice recognition inserting them over and over. This is rather funny to watch but makes for a horrid user experience.

In spite of this, I still say that Android’s voice recognition feature is fantastic. I can probably put up with having to use headphones for the recognition accuracy it gives me, especially with my Indian accent.

Further, you can also use Google Voice Search for simple requests such as asking for local weather information. If Voice Search finds a simple answer, it speaks it out; if it can’t find an answer to your question, it just displays Google search results. Google Voice Search is located on the home screen’s dock. Surprisingly, Talkback doesn’t interfere with voice recognition when using Voice Search, so it can be used even without headphones.

Google Chrome

Talkback now works with Google Chrome, which is the default browser on Android. It recognises various HTML elements such as lists, headings and landmarks, but there is no way to navigate or jump among them. The available reading levels are still character, word and paragraph.

Further, the explore by touch feature doesn’t work reliably on web pages. Navigating with left and right swipe gestures works fine, but exploring a page by moving a finger along the screen doesn’t work reliably on all pages.

There is an “enhance web accessibility” option in Apps > Settings > Accessibility which needs to be turned on for Google to insert certain scripts that are supposed to improve the accessibility of web pages. If you are having problems on the web, it is worth checking that this option is enabled.

Other nice features

App developers should always test their apps with Talkback, and JellyBean makes this a little easier. There is an option to display speech output under Apps > Settings > Accessibility > Talkback > Settings > Developer Settings. Enable it, and a bubble on the screen displays whatever Talkback is speaking.
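
The speech display also makes it easy to spot controls that have no label for Talkback to announce. The usual fix is a one-liner; here is a generic sketch (the button and the label text are made up for illustration):

    import android.widget.ImageButton;

    // Generic sketch: give an image control a label so Talkback has
    // something meaningful to announce. The same can be done in layout
    // XML with the android:contentDescription attribute.
    class Labelling {
        static void label(ImageButton playButton) {
            playButton.setContentDescription("Play");
        }
    }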

I was also happy to notice that the default email app on Android and the Gmail app have been made accessible. They now work with Talkback for the most part, except for minor annoyances: the read/unread status of a message isn’t spoken by Talkback, and the body of an email is displayed in a web view, which means explore by touch doesn’t work correctly. One has to use the left/right swipe gestures to navigate and read the message. But I can read email from my work account as well as Gmail. There is also K-9 Mail, which is supposed to be an accessible email client, but I haven’t used it in a while.

I have been trying to find an accessible Twitter client for Android. UberSocial seems to be the answer, but the app displays timelines in a web view, which makes it a bit difficult to use.

I am sure I missed a few great features, but as the title of the article indicates, these are random thoughts and not a full review. There are other accessibility features I haven’t tried: there is a large text feature in the Accessibility options, though I am not sure how it is supposed to work, and there are a number of third-party accessibility services which I may test when I have the time.

If you have any tips or tricks or recommendations on apps, please get in touch on Twitter @kirankaja12.

I expect more from Technology because of Jobs

October 6, 2011

Words are never enough to express such a loss, and I have never been good at writing anyway. But I feel compelled to say a few things about how Steve Jobs has profoundly altered my expectations of technology, especially as a blind person.

I started using a mobile phone in late 2000. It was a Nokia 3310. I was 19 at the time and living in India. The phone was a gift from my uncle in Singapore. All I could do with it was make and receive calls. I had to remember telephone numbers, of course, because I couldn’t use the phone book. Text messages, reminders and everything else were equally unusable. But surprisingly, I didn’t mind in the least. I had a device that I could use to contact friends and family any time and anywhere I wanted. For a while, I felt like I had the greatest thing in the world.

In 2011, however, I own at least four different so-called smartphones, and as a blind person, I can use most of their excellent features and functionality. The real game changer for me is undoubtedly the iPhone with its built-in accessibility features. In spite of all its limitations and Apple’s walled garden approach, the iPhone has drastically improved my access to information, and for that, I am eternally grateful to Steve Jobs.

I grew up in a south Indian city called Hyderabad, which had no real infrastructure to support blind people, or people with any other disabilities for that matter. Access to information and books in alternative formats was unheard of for most of my younger years. I read my first proper braille book when I was sixteen, thanks to RNIB’s National Library for the Blind in the UK. As a result, I started devouring information when my parents bought me a computer, a flatbed scanner and Kurzweil 1000 software. I believed for a long time that I couldn’t live without my computer for any length of time. But the iPhone changed all this. I can do almost everything on the iPhone now, including listening to books. I am still catching up on all the reading I missed when I was young. Survival would be tough without the iPhone.

I don’t believe Apple’s motives behind including built-in accessibility features in their products are entirely altruistic. I am sure legal considerations were a significant factor in setting the accessibility initiative at Apple in motion. Does anyone remember the VPAT for the 1st generation iPhone? It implied that blind people could use the device, with exceptions.

But the reasons are immaterial. Once Apple decided to build in accessibility, all of us benefited from Steve Jobs’ vision of doing things right. I have no doubt that he was directly and/or indirectly responsible for the high level of accessibility we see in iOS devices these days. It wouldn’t have been possible without continued support from top management. Stevie Wonder was entirely right to thank Jobs for this.

Unfortunately, the increased accessibility of iOS devices has greatly raised my expectations. It is hard to digest the fact that other smartphone and tablet devices and platforms offer very little by way of accessibility. If it weren’t for the iPhone, I would have been happy to wait for a third-party company to develop assistive technology software for one or two of these devices. Instead, I now expect other smartphones and tablets to provide these accessibility features by default.

Although I am frequently disappointed by the lack of such features, I am thankful to Steve Jobs for opening my eyes (figuratively speaking, of course) to the fact that I can definitely expect more from technology. Thank you, Steve, for treating persons with disabilities fairly. I sincerely pray that Apple continues your legacy. Rest in peace.

Can tablets replace traditional notetakers?

August 25, 2011

Thanks to my employer, I recently began using an iPad 2 for testing. Although I have been using the iPhone for over a year, I should admit that I am not entirely comfortable entering large amounts of text on a touchscreen. While it can certainly be done, it requires a significantly higher level of effort for a blind person using VoiceOver. I am sure some of you will disagree.

So, I invested in a 69 pound keyboard case for the iPad 2 manufactured by Kensington. This is a leather folio style keyboard case that turns the iPad 2 into a netbook-like device and makes entering text much easier. Although there are downsides to this combination, I am extremely happy with the solution. I also think it has the potential to replace the very expensive dedicated notetakers used by blind people.

Why not a Netbook instead of this combination?

A very valid question raised by some. Firstly, iOS with its built-in accessibility features is a far superior experience on a small device like this. It doesn’t take me more than 4-5 seconds to wake the device and start using it; pressing any key on the keyboard brings the iPad out of standby. Contrast this with the boot time of Windows 7. Even when Windows is set to standby instead of shutting down, I find that assistive technology, particularly Jaws, does not function efficiently after Windows resumes from standby. Often, I have to shut down Jaws and launch it again for it to work properly. I believe screen readers that don’t use mirror drivers, like NVDA, may be immune to this.

Secondly, the battery on the iPad lasts for about 10-11 hours of continuous use, and I did not notice any significant reduction in spite of using the keyboard, which connects to the iPad via Bluetooth. This is significantly longer than most netbooks, although some netbooks claim to offer similar battery life.

Thirdly, I get access to all the apps and content on iOS that made iPhone and iPad such popular devices.

Finally, there is something to be said for using a mainstream device with built-in accessibility features!

About the keyboard case

As I said earlier, the case is manufactured by Kensington and is called the 2nd generation leather folio keyboard case for iPad 2. It is like a hardcover book: when opened, the keyboard is on the left and the slot for sliding in the iPad is on the right, with Velcro straps to hold the iPad securely. When used with the keyboard, the iPad is invariably in landscape mode. The keyboard is not full-sized but is more like what you would find on a netbook; the keys are widely spaced, though, and the key travel is quite good. A rubber membrane on the keyboard makes it spill proof. There is a physical on/off toggle switch and, since the keyboard connects via Bluetooth, a tactilely discernible pairing button. Holding this button down for a few seconds puts the keyboard in pairing mode; the iPad recognises the keyboard and, when connecting, asks you to enter a 4 digit PIN. The process is all very simple.

The keyboard has a rechargeable battery which lasts a long time, and there is a mini USB port for charging it. I have had the keyboard for close to a month now and it is still running on the initial charge.

However, there is one downside to this case. The base of the iPad rests above the keyboard at a roughly 45 degree angle, and other than a few bumpy lines, there is no real support to hold the side with the iPad in that position. It always feels as if the iPad will slide onto the keyboard, especially when travelling in a car. But so far it seems to hold up fine. Undoubtedly, there are other keyboard cases without this minor inconvenience, but this one sounded like a good deal for the price.

A number of users also prefer to use the Apple Wireless keyboard with iPads and iPhones. But the Apple keyboard is a bit too big for my liking. Also, I prefer to keep the keyboard and iPad combination together in one piece.

Do I still use the touch screen?

Certainly! Most of the time, it is faster to activate certain items on the touch screen, especially in those apps I regularly use. Navigating via the keyboard is just another alternative.

So can this combination be an alternative to notetakers?

Hard to say at the moment. But an iPad with a keyboard or a keyboard case combo certainly has the potential to replace dedicated notetakers for blind users.

Firstly, this combination is much cheaper. A dedicated notetaker, even without a braille display, can cost upwards of 2000 pounds, whereas the cheapest iPad with 16GB of memory and wifi costs 379 pounds. Add another 60-70 pounds for a keyboard or keyboard case, and the entire solution costs less than 500 pounds.

Further, the iPad has the latest technology: a good quality web browser, WiFi and/or 3G connectivity, access to email and social networking apps and so on. There are even apps to read DAISY books, and iBooks provides access to a large number of ebooks as well.

Of course there are limitations. One of the main advantages of dedicated notetakers is that their software has been optimised for blind users. One of the drawbacks of the iPad right now is the lack of fully accessible word processing and spreadsheet apps; both Pages and Numbers (both from Apple) are not fully accessible. Although there are other apps such as Plain Text and Evernote, they do not provide access to text formatting information, among other things. So, while I can take notes and write long documents, I cannot format them properly. Even if these apps did provide this information, the iOS Accessibility API, and VoiceOver in particular, would need to be enhanced to parse the information and provide it to users. Further, there are inherent limitations in iOS, such as the lack of native support for zip archives, and the fact that I cannot save my email attachments drives me crazy. I have to rely heavily on cloud storage services such as Dropbox.

However, I am very optimistic that at least some of the accessibility limitations of iOS will be addressed in the future. If that happens, the need for dedicated notetaker devices may decrease. Perhaps this is already happening to some extent; do remember that Freedom Scientific hasn’t updated the PAC Mate Omni for a really long time.

I look forward to the day when blind people can make the best use of mainstream technology on par with their sighted counterparts.

What about braille?

I have a Braille Connect 40 Bluetooth portable braille display, which I mainly use with my computer. However, I can connect it to my iPhone or iPad and operate them completely from the braille display. I can’t think of any cheaper solution. Refreshable braille sadly continues to be expensive technology.

P.S: Most of this blog post has been created on the iPad using a free WordPress app.

Miscellaneous Assistive Technology tips

April 28, 2011

Welcome to my new blog post after two and a half years. Be warned that my writing abilities might be a bit rusty.

This post is a collection of tips and tricks that I discovered myself or learnt from other assistive technology users. My intention is to keep adding tips to this post as and when I discover them, so if you have anything useful, please pass it along. I use Jaws 12 as my primary screen reader on an Apple MacBook Pro running Windows 7 natively. Before you ask: yes, I do use Mac OS as well, but I find that VoiceOver is not mature enough for me to do my job effectively. I hope this situation changes in the near future, but that is a topic for another blog post. On to the tips now…

Outlook 2010: Shortcut for viewing the current message in browser

If you are using Outlook 2007 or above with a screen reader, you may already be frustrated by Microsoft’s decision to use Word for displaying email message content. So, most screen reader users generally open any email message that is in HTML format in the browser. There are a number of ways of doing this. In fact, Jaws 12 suggests that I use Alt+H followed by the letter ‘A’ followed by the letter ‘V’, but for some strange reason this doesn’t seem to work for me.

But I discovered that “View in Browser” is one of the icons in the Quick Access Toolbar in Outlook 2010, and that Outlook automatically assigns keyboard shortcuts to icons in the Quick Access Toolbar. These shortcuts are Alt followed by the numbers ‘1’, ‘2’ and so on: Alt followed by ‘1’ activates the first icon or item in the Quick Access Toolbar, Alt followed by ‘2’ activates the second item, and so on. Since “View in Browser” is the 6th item in my Quick Access Toolbar, Alt followed by ‘6’ opens the current email message in the browser for me. Note that you have to press Alt first, release it and then press ‘6’.

This may be a faster way of getting to the browser even if Alt+H followed by ‘A’ followed by ‘V’ works for you. It should work for most users who haven’t tinkered with their default ribbon and toolbar items. I am not sure whether it works with Outlook 2007; if you happen to use that version, please try it and let me know.

HTML & PDF: Selecting & copying text

Jaws 12 and some older versions don’t seem to do a good job of selecting and copying text. When I select a phrase or sentence on a webpage or in a PDF file in Adobe Reader and copy it to another application, only a portion of my selection is copied. There is a setting in the Jaws options which fixed the problem for me.

In Internet Explorer and Adobe Reader, press JawsKey+V to open the “Adjust Jaws options” dialog. Press ‘S’ to jump to “Select and copy”. By default this is set to “Full content using onscreen highlight”; press Spacebar to change it to “From virtual cursor”, then press Enter to dismiss the dialog. Text selection and copying should work better now. Remember to do this for both Internet Explorer and Adobe Reader. I am not sure if this applies to Firefox.

