
Android JellyBean Accessibility Continued: Books, Music, Movies and YouTube

August 12, 2012

This is a post about a few more features in Android JellyBean that might be of interest to the accessibility community. If you haven’t done so already, you may want to read my earlier blog post Random thoughts on Android Jellybean and Google Nexus 7 Accessibility. I received a lot of very good feedback on that blog post. Thank you very much for your comments and recommendations.

Google Books

I was pleasantly surprised to find that an ebook of Only Time Will Tell by Jeffrey Archer was included for free on my Nexus 7 device. Or at least I think it is free! I don’t remember intentionally purchasing it, but it is perhaps best to check my credit card statement anyway. Nevertheless, this ebook is on my device and I wanted to read it because Jeffrey Archer is one of my favourite authors and although I have an unabridged audio version of this book, I haven’t started listening to it yet.

Ebooks purchased from Google are available in an app called “Play Books”. I suspect the name “Play Books” comes from “Play Store” which is Android’s online store for apps, music, ebooks and other media content. “Play Books” sounds like a weird name for an ebook application though.

There is a dock with a number of app icons located at the bottom of the home screen. This dock is available on all pages of the home screen. Play Books is one of the apps that can be launched from the dock.

When the Play Books app is launched, the first screen contains the list of ebooks that have been purchased on the Play Books store. I believe Google also distributes books that are out of copyright and these are free to download. Any such free downloads should also appear in this list.

When the Play Books app is launched for the first time, it may take a few seconds for the list of available books to be populated. I suspect that the device will acquire the list from the Play Books online store. Further, an internet connection is required to open and read these ebooks. But there is an option to download them to the device as well.

I have had mixed results navigating through the list of books on this screen. Sometimes, the book titles are read out when using the left and right swipe gestures. But more often, I have to explore the screen to get to them.

Assuming that Talkback is running and explore by touch is active, double tapping a book in the list will open it. More importantly, Talkback will start reading the book aloud. Do remember that the book has to be downloaded from Google if opening it for the first time. So it may be a few seconds before Talkback starts to read the book. I do have to say though that the default TTS engine in JellyBean isn’t really suited for reading books. Especially not fiction. I may have to invest in a better sounding TTS engine.

If you let it, Talkback will continue to read to the end of the book. As mentioned in my previous blog post, there is no command or gesture to stop/pause and resume speech in Talkback. But I may have luckily discovered a workaround for this. When you want to pause speech, tap and hold with a single finger anywhere on the screen. Talkback will pause reading the book as long as your finger is touching the screen. To resume reading, just lift your finger. Very handy for answering a quick phone call for instance.

The only other way to pause or stop reading the book is going back to the previous screen which is the list of books or going out of the Play Books app completely. Unfortunately, when Talkback is reading the book, it is almost impossible to get to the “Back”, “Home” and “Recent Apps” soft buttons at the bottom of the screen. When I touch these buttons, I can hear the earcon indicating that I am on one of these buttons but Talkback continues to read the book. It may read the name of the soft button that I touched much later when it has completed reading a chunk of text from the book but I may have moved away from that button by then. Fortunately, Talkback has a set of four shortcut gestures that come in handy here. The Home button can be activated by swiping up and left and for the Back button, it is swipe down and left. These shortcut gesture assignments can be altered from Apps > Settings > Talkback > Settings > Shortcut Gestures.

Moving to the next and previous page is the only navigation capability available within a book when using Talkback. A two finger swipe up moves to the next page and a two finger swipe down moves to the previous page; Talkback will resume reading from the new page. Using the right swipe gesture to read the next chunk of text doesn’t work reliably. The gesture works for the first couple of attempts with some books, while with others it doesn’t work at all. Changing the reading level to character, word or paragraph doesn’t seem to have any effect.

The books I tried to read do appear to have a table of contents. Although I haven’t verified this with a sighted user, I am fairly certain that it is possible to jump to a particular chapter from the table of contents. This is, however, not possible with Talkback. Of course, when Talkback starts reading from the beginning of the page, it dutifully reads through the table of contents. Listening to something like “Chapter 1, Chapter 2, Chapter 3” all the way up to chapter 70 or so can get annoying rather quickly. So I use the move to next page gesture a number of times in an attempt to find the beginning of the text, with mixed results.

If for some reason you don’t want Talkback to automatically start reading books, you can turn this off in the Play Books settings. In the application’s main screen, find the “More Options” button. Activating this button opens up a menu; activate “Settings” from this menu. Look for the “Automatically read out loud” checkbox and uncheck it. If you are a Talkback user, you will need a really compelling reason to disable this option: I haven’t found an alternative way to read any book with it disabled.

For partially sighted users, there are options to change the font type and increase or decrease the font size. There appear to be six different fonts. However, I cannot really test whether the different font size options are suitable for users with low vision. These display options are available through a popup menu that can be opened by swiping up from the Home button, although doing this with Talkback isn’t an easy task. Finally, according to the help documentation for Google Play Books, some ebooks may only be offered with original scanned pages. This means that these ebooks won’t have text reflow capabilities. However, users will apparently be notified of this when purchasing or downloading the ebook. I haven’t come across such an ebook on Google as yet, and I am not sure if Talkback will be able to read a book out loud if it only contains scanned pages.

I do have to admit that, for fiction at least, I prefer to listen to a human-narrated version of a book rather than a text to speech engine. However, I am still persisting with the ebook version of “Only Time Will Tell”.

The most important thing however is the fact that visually impaired people now have a much greater choice of reading material that is partly or fully accessible. In addition to Google Books on Android, iBooks on iOS and Blio on both platforms, Adobe Digital Editions on PC and Mac is now screen reader accessible as well.

More Gestures for Google Books

After I published this blog post, I was informed about a few gestures and commands for reading books in the Play Books app. These don’t appear to be documented online, so most of what is written below is from my experience of using these gestures and commands.

When a book is opened and Talkback is reading it, double tapping with a single finger pauses speech. Double tap with a single finger again to resume speech, and Talkback will start reading the book from the location where it was paused. I hadn’t discovered this in my initial testing because everywhere else in the Android platform, a single finger double tap activates things. As such, I didn’t expect it to perform the conceptually different function of pausing and resuming speech. No matter how hard I try to think about it logically, a single finger double tap activating items throughout the platform but doing an entirely different thing in a single application is hard to comprehend.

It is also possible to jump to the next and previous page by touching and then double tapping the right and left edges of the screen respectively. For example, to go to the next page, you would first touch the screen towards the right edge. Nothing happens at this stage, but double tapping the screen now will take you to the next page and Talkback will continue reading from there.

Earlier, I said that a single finger double tap will pause and resume speech. This would lead readers to conclude that after moving to the next or previous page, they can use a double tap to pause speech. Unfortunately, this isn’t true. The single finger double tap is now set to moving to the next or previous page (depending on which function was last selected) and subsequent single finger double taps will continue performing that new function. In order to make a single finger double tap pause and resume speech again, you have to touch the middle of the screen and then double tap. This returns the single finger double tap gesture to its former state, which is to pause or resume reading.

Perhaps this can be explained in a slightly simpler way. Imagine that when a book is opened, there are three huge buttons on the screen from left to right in a row. The left most button towards the left edge of the screen is the “Previous Page” button. The middle button is the “Pause/Resume Reading” button. And the right most button towards the right edge of the screen is the “Next Page” button. To activate these buttons, you would first select one of them by touching it and then double tapping it. Once you activate a button, subsequent single finger double taps will continue to activate that button. In order to activate a different button, you have to touch it first.

It would have been very helpful if all of this was clearly documented in an appropriate location; the Accessibility section of the Nexus 7 manual would have been a good place to mention it. Most users, especially if they are new to the platform, would have a very hard time figuring out something like this on their own.

Music and Movies

There is not much to say about these applications. The music playback application is called “Play Music”, which is probably a more obvious title than “Play Books”. Again, Play Music can be launched from the dock on the home screen. Unfortunately, Google’s online music store and music in the cloud services are not available in the UK, so I wasn’t able to test them. However, I did manage to copy a number of music tracks onto the Nexus 7 device. The Play Music app recognised them immediately and I was able to play them without much trouble. The playback controls are located at the bottom of the screen just above the “Back”, “Home” and “Recent apps” soft buttons. Playback controls are also available on the lock screen and they are usable with Talkback. I haven’t figured out a way to rewind and fast forward inside a track.

The app for movie playback is a different story altogether. Not surprisingly, the app is called “Play Movies” and is also available on the dock. Using Talkback, I can start playback of a movie, but I haven’t found a way to pause playback and needed sighted help to accomplish this. None of the shortcut gestures work when the movie is playing. Double tapping the screen brings up an unlabelled image control; I tried double tapping this to bring up playback controls with no success. Perhaps there is a way to pause or stop playback when using Talkback. However, I am not too interested in watching movies, so I didn’t spend too much time on finding a solution.

Movies can be purchased from the Google Play online movie store. I was happy to note that the movies I purchased did contain closed captions. It is also possible to select the font size for caption display from “small”, “medium”, “large” and “huge”. There is no audio description for these movies though.

Finally, I noticed a number of unlabelled image controls all over the place in both the Play Music and Play Movies apps. However, I am not sure what these controls do.


YouTube

The YouTube app is available in the “Apps” folder. It is mostly usable with Talkback except for a couple of challenges. Once playback of a YouTube video has started, a Talkback user needs to double tap the video to bring up the playback controls in order to pause it. I have found a way that reliably works for me. When the video is playing, find a control named “Info”. From this control, swipe left once. There won’t be any speech feedback, but you are now in the right spot to bring up the playback controls; I presume the focus is on the video. Double tap the screen now. The playback controls become visible but focus doesn’t move to them. Swipe right once to move to the “Pause Video” button. This is a bit unwieldy but works for me more often than not. I did try remembering the location of the video playback area, double tapping it and then remembering the location of the “Pause Video” button, but I admit that I haven’t been successful.

I suspect the YouTube interface on a 7 inch tablet can be a little crowded. It is perhaps easier to access the controls on a smaller screen. It would also be interesting to see how accessible the new standalone YouTube app for iOS will be. If it is going to use the same UI design as the Android tablet app, it may not be a pleasant experience for blind users.

As always, any comments, suggestions, tips and app recommendations are very welcome. Please leave a comment below or get in touch with me on Twitter @kirankaja12.

Random thoughts on Android Jellybean and Google Nexus 7 Accessibility

August 2, 2012

If you follow developments in mobile technology, you may already know that a new version of the Android operating system called “JellyBean” was launched last month along with a new 7 inch tablet device called the Google Nexus 7. It was also announced that JellyBean came with a number of accessibility improvements. To know more about these improvements, either watch the hour-long video or read the announcement on the Eyes-Free Android blog.

I needed to test the new accessibility improvements in JellyBean, so I ordered a Google Nexus 7. I was happy to note that the help documentation for the Google Nexus 7 has a section dedicated to accessibility features. Anyone considering a purchase of an Android JellyBean device should read through this information.

In spite of reading through this section and doing a bit of research in advance, I ran into a couple of problems while setting up the new Google Nexus 7. I thought I would document them here with possible workarounds that I found.

Turning the device on for the first time

If you are blind and do not have sighted assistance, don’t be in a hurry to unbox your Nexus 7 and turn it on. It would help to prepare a bit.

Firstly, you will need to have a pair of headphones ready. You don’t have to use them right from the start (the Nexus 7 does have an external speaker) but it would be very hard to complete the setup without them, as you will see shortly. Secondly, it would help to have a flat surface like a desk. There is a gesture to turn on Accessibility mode, which fires up Talkback with the Explore by Touch mode active. This gesture is rather unreliable and seems to work best when the device is placed flat on a desk.

If you are holding the Nexus 7 in portrait mode, the power button and the volume rocker should be along the right edge of the device towards the top. The 3.5mm headphone socket is along the bottom edge towards the right. Don’t forget to connect the device to a power source before you start; it seems to ship with an almost empty battery. The micro USB port is along the bottom edge towards the middle.

When you turn on the device by pressing the power button, you will be placed on a welcome screen. You can turn on the Accessibility mode right here. You need to touch the screen with two fingers placed slightly apart and hold for 4 seconds. When you place two fingers, within the first second or so the device should recognise the gesture and will ask you to continue holding for a few more seconds to turn on Accessibility mode. Don’t remove your fingers from the screen and, no matter what you do, don’t let a third finger touch the screen during these few seconds. I did exactly this by mistake and the device said “cancelling accessibility mode” and wouldn’t talk to me at all for the next three hours no matter what I tried.

It appears as if the gesture works better when you use both hands with one finger touching towards the top half of the screen while the other touches towards the bottom half. If for some unfortunate reason, you end up in a situation where the device says “cancelling accessibility mode”, you can try the same two finger gesture again. Even though the device may not speak, you may hear a click sound after 4 seconds or so. At this point the Accessibility mode may come up on its own. If not, try drawing a rectangle on the screen in the clockwise direction. This gesture is supposed to enable Talkback and explore by touch although it didn’t work in my case. As a last resort, you may try the two finger gesture again a few times. You might just be lucky!

Of course, if you have the chance, you can also set up the device with sighted help and then turn on Talkback from Apps > Settings > Accessibility > Talkback. A number of users on Twitter and on the Eyes-free mailing list seem to be having issues with the gesture to turn on Accessibility mode. I tried the two finger gesture and the rectangle gesture countless times and spent close to three hours on it before it somehow worked. I hope that this issue will be addressed at some point in the near future. Turning on Accessibility mode with hardware keys might be a better way for most users; a large majority of Android devices will have at least a power button and volume control keys.

Assuming that you got the two finger gesture to work and you have the Accessibility mode turned on, you will be placed in the explore by touch tutorial. It only has three steps but I found it to be very useful. It covers the important gestures that you will need to learn to use Talkback with explore by touch mode enabled. Once the tutorial is finished, you will return to the welcome screen where you need to pick a language and activate the “Start” button to start setting up the device.

When you get to the screen to enter the password for the Wi-Fi network, look for the checkbox which says “show password” or “display password” and check it. If you don’t do this, all characters on the virtual keyboard are read as “dot” by Talkback. Unfortunately, when entering your Google account details, the “show password” option isn’t available (for obvious reasons). This is where the headphones come in handy. In order to enter the password with the virtual keyboard, you need to connect the headphones. Once headphones are connected, the characters on the virtual keyboard are announced correctly. Otherwise, you will just hear “dot” no matter which character you touch.

This is a great idea for security reasons but I don’t think it is smart to enable this by default. What if I don’t have a pair of headphones handy when I am setting up the device? This feature can be turned off by checking the “speak passwords” checkbox in Apps > Settings > Accessibility, but it isn’t possible to get to settings from the initial setup screen. I just don’t think this is a smart usability decision.

Accessibility mode gesture isn’t available on all devices

It is important to remember that the 2 finger tap and hold gesture to turn on Accessibility mode may not be available on most devices. Both of the current Google devices, the Nexus 7 tablet and the Galaxy Nexus smartphone, support this. On other Android devices from manufacturers like HTC, Samsung, Motorola and so on, the components required for accessibility support will have to be either downloaded from the Google Play app store or enabled in Apps > Settings > Accessibility. In addition, these manufacturers may also customise the user interface of Android with skins. These skins may or may not work with Talkback and other accessibility features. For now, it is best to stick to Google devices which run stock Android (with no customisation) if you need to use a screen reader. There are ways to install stock Android builds on non-Google devices but this is not easily accomplished.

New gestures

Explore by touch mode was introduced in Ice Cream Sandwich (Android 4.0) and has been improved in JellyBean. In addition to touch exploration, where you move your finger along the screen for Talkback to announce the items you have touched, you can also use gestures to navigate through these items. For example, swiping right and swiping left moves focus to the next and previous item on the screen respectively. Double tap anywhere on the screen to activate the last focussed item.

While you can get to any item on the screen by touching it if you remember the exact position, a combination of touch exploration and swipe gestures is the most efficient way to navigate a touch screen in my opinion. Introduction of swipe gestures is an excellent addition to Android Accessibility.

A list of all accessibility gestures in Android 4.1 (JellyBean) is available under the Accessibility section in the Nexus 7 help documentation. By default, the left and right swipe gestures move focus by item, but you can change this to move by character, word or paragraph. Swipe down with one finger and, in the same gesture, swipe up. Talkback will say “read by character”. You can now use the left and/or right swipe gestures to read the current item character by character. To return to the default navigation mode (moving by item), keep using the swipe down and up gesture until Talkback says “default reading level”. The swipe up and down gesture toggles backwards through the available navigation modes.

There are a couple of very important gestures missing and I haven’t found an alternative way to perform these actions:

· A gesture to pause or stop speech. This doesn’t seem to be possible on the Nexus 7 tablet but Marco Zehe has noticed that tapping his Galaxy Nexus on the top edge usually silences speech. I suspect this is because of the proximity sensor on the phone. But really, a screen reader should have a pause speech command/gesture.

· A gesture to read from the current item to the end. I thought this is a basic requirement for a screen reader. But there doesn’t seem to be a way to do this. For example, if I am reading a long email thread in the Gmail app, I have to read the entire body of the message at one go or read by character/word/paragraph. Further, reading anything remotely lengthy in a web browser is extremely painful without this command.

A word about navigating lists

It is best to illustrate my point with an example. When you open the Apps > Settings screen, Talkback will inform you that you are on a list showing 18 items. There are actually 24 items in the Settings screen on my Nexus 7 but only 18 items are visible on the screen. In order to get to the other 6 items, you will have to manually scroll with scrolling gestures. If you do not do this and continue navigating with left/right swipe gestures, you will only keep moving through these 18 items in the list and other items on the screen. In order to scroll down to the other 6 items, place two fingers on the screen and move up. You can also swipe right and swipe left in the same gesture to go to the next page in a list.

I wanted to point this out because this behaviour is contrary to how most other user interfaces work especially for screen reader users. When navigation commands are used, list controls tend to automatically scroll to display hidden items.

Talkback’s behaviour is particularly confusing because when you move focus to a list control, it only announces the number of items being displayed but fails to mention how many items there are in the list. In the Settings example for instance, instead of just saying “list showing 18 items”, it would be useful for Talkback to say “list showing 18 of 24 items”. Without this information, users will always have to attempt manually scrolling the list just to make sure they are not missing any items.

Virtual Keyboard

Using the default Android keyboard in JellyBean is straightforward. When you encounter an input field, just double tap on it to bring up the keyboard. Move your finger along the keys and when you have found the character you would like to insert, just lift your finger. Talkback will read out the inserted character at a lower pitch. On the Nexus 7, I am noticing that I have to be precise with my finger placement on the characters for them to be inserted; if I don’t lift my finger in the exact centre of the character, it doesn’t get inserted. To insert numbers, switch to the “Symbols” keyboard by tapping the “Symbols” toggle on the bottom left of the keyboard.

Do remember that you will need headphones to enter passwords if you haven’t turned on “speak passwords” in Apps > Settings > Accessibility.

Entering a Pin on the lock screen

If the Nexus 7 needs a pin to unlock, the way to enter it is slightly different to the usual way you enter text. On the lock screen, there is an input field without a label and immediately to the right is a button without a label as well. This unlabelled button needs to be activated by double tapping before entering the pin. To insert digits of the pin, just tap the digits on the telephone keypad style virtual keypad. Don’t double tap them. Just tap the digits and lift the finger. Even though there is no feedback, the digits are entered. When you are done entering all digits of the pin, tap the enter key to the right of the digit 0 on the virtual keypad. On the other hand, if the device is locked with a password, you would double tap the unlabelled input field which brings up a standard virtual keyboard.

Voice recognition

Voice recognition is one reason why I keep trying to switch to Android. It does an excellent job recognising my accent. It is much more accurate than Siri on iPhone. Voice Recognition is available throughout Android. Basically, you are able to speak into almost all input fields in Android.

Before you start using voice recognition though, make sure you connect a pair of headphones. Trying to use voice recognition when Talkback is running and speaking out of the built-in speaker is a recipe for disaster!

When you are composing a text message or an email, look for the “voice input” button on the keyboard. This is between the “Symbols” toggle and the spacebar on the bottom left corner of the device. Tap this and lift your finger and you will hear an earcon notifying you that the device is ready for you to speak. Just speak normally and when you are done, wait for a few seconds. The device will play another earcon to indicate that it has paused recognition. You will also notice that the keyboard now only has one item which says “tap to speak”. Double tap this button to start recognition again. In order to get the normal keyboard back, you need to activate the Back button on the bottom left of the device and then double tap the input field again. This process is a bit clumsy but I haven’t found an alternative.

When you are speaking, Talkback will start reading out words as they are recognised by the device and entered in the input field. This can be a bit distracting. And by now you can guess why I suggest using headphones. When the device is in voice input mode, Talkback will read the text inserted by the voice recognition feature. However, the voice recognition feature then starts to recognise Talkback’s speech output and inserts the same words again in the input field. This results in an infinite loop, with Talkback reading out the same words and the voice recognition feature inserting them over and over again. This is rather funny to watch but makes for a horrid user experience.

In spite of this, I still say that Android’s voice recognition feature is fantastic. I can probably put up with having to use headphones for the recognition accuracy it gives me especially with my Indian accent.

Further, you can also use Google Voice Search for some simple requests such as asking for local weather information and so on. If Google Voice Search finds a simple answer, it will speak out the answer. If it can’t find an answer to your question, it just displays Google search results. Google Voice Search is located on the home screen’s dock. Surprisingly, Talkback doesn’t interfere with voice recognition when using Voice Search. So this can be used even without headphones.

Google Chrome

Talkback now works with Google Chrome which is the default browser on Android. It does recognise various HTML elements such as lists, headings, landmarks and so on but there is no way to navigate/jump among these. The available reading levels are still character, word and paragraph.
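For context, Talkback can only announce these element types when the page itself marks them up; here is a minimal, entirely hypothetical page fragment showing the kind of markup involved (the headings, list and landmark role are the sorts of elements being recognised):

```html
<!-- Hypothetical fragment: a heading, an ARIA navigation landmark
     and a list, i.e. the element types a screen reader can announce -->
<h1>Accessibility on Android</h1>
<div role="navigation">
  <ul>
    <li><a href="#books">Books</a></li>
    <li><a href="#music">Music</a></li>
  </ul>
</div>
<h2 id="books">Books</h2>
<p>Ebooks purchased from Google are available in Play Books.</p>
```

Announcing these elements is only half the job, of course; jumping between them is what is still missing.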

Further, the explore by touch feature doesn’t seem to work reliably on a web page. Although navigating with the left and right swipe gestures works fine, exploring a web page by moving a finger along the screen doesn’t work on all web pages.

There is an option to “enhance web accessibility” in Apps > Settings > Accessibility which needs to be turned on for Google to insert certain scripts which are supposed to improve accessibility on web pages. If you are having problems on the web, it might be worthwhile to check if this option is enabled.

Other nice features

App developers should always test their apps with Talkback. JellyBean makes this a little easier. There is an option to display speech output under Apps > Settings > Accessibility > Talkback > Settings > Developer Settings. Enable this option and there will be a bubble on the screen which displays what Talkback is speaking.
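On a related note, the unlabelled image controls mentioned earlier are usually the result of missing content descriptions. As a sketch only, assuming a standard layout file (the id, drawable and string names here are made up), an image button can be given a label for Talkback to announce like this:

```xml
<!-- Hypothetical layout fragment: android:contentDescription gives
     Talkback a name to speak instead of "unlabelled image control" -->
<ImageButton
    android:id="@+id/pause_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_pause"
    android:contentDescription="@string/pause_playback" />
```

Testing with the speech output bubble enabled makes omissions like this immediately visible.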

I was also happy to notice that the default email app on Android and the Gmail app have been made accessible. They now work with Talkback for the most part, except for minor annoyances: the read/unread status of an email message isn’t spoken by Talkback, and the body of an email is displayed in a web view, which means explore by touch doesn’t work correctly. One has to use the left/right swipe gestures to navigate and read the message. But I can read email messages from my work account as well as Gmail. There is also K9 Mail, which is supposed to be an accessible email client, but I haven’t used it in a while.

I was trying to find an accessible Twitter client for Android. UberSocial seems to be the answer, but the app displays timelines in a web view, which makes it a bit difficult to use.

I am sure I missed a few great features but as the title of the article indicates, these are random thoughts and I am not trying to do a full review. Further, there are other accessibility features that I haven’t tried out. There is a large text feature in Accessibility options. I am not sure how this is supposed to work. And there are a number of third-party accessibility services which I may test when I have the time.

If you have any tips or tricks or recommendations on apps, please get in touch on Twitter @kirankaja12.

Miscellaneous Assistive Technology tips

April 28, 2011

Welcome to my new blog post after two and a half years. Be warned that my writing abilities might be a bit rusty.

This post is a collection of tips and tricks that I discovered myself or learnt from other assistive technology users. My intention is to keep adding tips to this post as and when I discover them. So, if you have anything useful, please pass it along. I use Jaws 12 as my primary screen reader on an Apple MacBook Pro running Windows 7 natively. Before you ask, yes, I do use Mac OS as well, but I find that VoiceOver is not mature enough for me to be able to do my job effectively. I hope that this situation changes in the near future, but that is a topic for another blog post. On to the tips now…

Outlook 2010: Shortcut for viewing the current message in browser

If you are using Outlook 2007 or above with a screen reader, you may already be frustrated by Microsoft’s decision to use Word for displaying email message content. So, most screen reader users generally open any email message that is in HTML format in the browser. There are a number of ways of doing this. In fact, Jaws 12 suggests that I use Alt+H followed by the letter ‘A’ followed by the letter ‘V’, but for some strange reason this doesn’t seem to work for me.

But I discovered that “View in Browser” is one of the icons in the Quick Access toolbar in Outlook 2010. I also found out that Outlook assigns keyboard shortcuts automatically to icons in the Quick Access Toolbar. These shortcuts are usually Alt followed by numbers ‘1’, ‘2’ and so on. Alt followed by ‘1’ activates the first icon or item in the Quick Access Toolbar, Alt followed by ‘2’ activates the second item and so on. Since “View in Browser” is the 6th item in my Quick Access Toolbar, Alt followed by ‘6’ opens the current email message in the browser for me. You have to first press Alt, release it and then press ‘6’.

This may be a faster way of getting to the browser even if Alt+H followed by ‘A’ followed by ‘V’ works for you. This should generally work for most users who haven’t tinkered with their default ribbon and toolbar items. I am not sure if this works with Outlook 2007. If you happen to use it, please try it and let me know.

HTML & PDF: Selecting & copying text

Jaws 12 and some of the older versions don’t seem to do a good job of selecting and copying text. When I select a phrase or sentence on a webpage or in a PDF file in Adobe Reader and copy it to another application, only a portion of my selection is copied. There is a setting in the Jaws verbosity options which fixed the problem for me.

In Internet Explorer and Adobe Reader, press JawsKey+V to open the “Adjust Jaws options” dialog. Press ‘S’ to jump to “Select and copy”. By default this is set to “Full content using onscreen highlight”. Press Spacebar to change it to “From virtual cursor”. Press Enter to dismiss the dialog. Text selection and copying should work better now. Remember to do this for both Internet Explorer and Adobe Reader. I am not sure if this applies to Firefox.
