Blind people using touchscreens: The issues

31 January 2014

Our tablet tests conducted in November 2013 revealed some general issues that blind users encounter when using touch input on a tablet. We describe these issues here.

This is part of our comparative test of tablets for blind users. An introduction is provided in Tablet test with blind users: Overview. Another related article covers Interface issues when using tablets with screen readers.

With screen readers activated, the operating mode of touch screens changes profoundly. While in normal operating mode a single tap is sufficient to activate elements, non-sighted touch interaction first focuses elements by traversing them with swipe gestures or exploring the screen with one finger, then activates the focused element with a double tap. Direct manipulation actions that rely on visual feedback are disabled.
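
To make the consequences of this model concrete, here is a minimal sketch in Kotlin for Android (the 'Play' view and its behaviour are illustrative assumptions on our part, not code from any of the tested apps): a clickable, labelled view is what the screen reader focuses when the user swipes to it or touches it while exploring, and the double tap then activates that focused element, no matter where on the screen the activating fingers land.

    import android.content.Context
    import android.view.View
    import android.widget.Toast

    // A clickable, labelled view: the screen reader focuses it when the user
    // swipes to it or touches it while exploring; the double tap then activates
    // the *focused* element, so the click handler below runs regardless of
    // where on the screen the activating fingers actually land.
    class PlayButton(context: Context) : View(context) {
        init {
            isClickable = true            // exposed to the screen reader as activatable
            contentDescription = "Play"   // the label the screen reader announces
            setOnClickListener {
                // reached by a sighted single tap as well as by a screen reader double tap
                Toast.makeText(context, "Playing", Toast.LENGTH_SHORT).show()
            }
        }
    }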

Unintended touches

Avoiding unintended screen touches was one of the most frequent issues novice users encountered. With several fingers hovering over the surface, it often happens that a finger other than the primary finger used for swiping and tapping accidentally touches the screen. The result is often that the user loses the current screen reader focus and may not even notice that an unintended touch was responsible. The focus cannot be restored easily (to our knowledge, no touch OS offers a gesture to undo accidental touches). The user often has to swipe through many elements or explore the screen by touch to retrieve the former focus position.

A related issue occurs when using the device hand-held: the fingertips of the hand holding the device may creep round the edge and register as touches. The problem of unintended touches becomes less significant with some practice.

Delayed reaction to gestural input

Gestural input with screen readers enabled was not uniform across the devices tested. This is what we found:

  • iPad mini / iOS 7: Generally the iPad has few issues regarding delays or unreliable input. Response times are good.
  • Google Nexus 7 / Android 4.4: The Nexus seems to react more slowly to touch input than the competition. At times, tapping or swiping causes no reaction and requires a second attempt.
  • Thinkpad Helix / Windows 8.1: Under some conditions the reaction to gestural input seems prone to errors, for example when a double-tap gesture to activate a focused element happens to land on top of another interactive element. In those cases, double-tapping may at times move the focus to this other element rather than activating the selected element.

We were not able to determine whether the differences observed were attributable more to the devices tested (screen, CPU speed) or to the respective operating systems.

Disambiguating swipe gestures and other touch gestures

Executing swipe gestures successfully was frequently a problem for novice users. Swiping must be carried out so that the touch does not register as a focus-setting touch (the fact that blind users cannot see how far their finger is from the tablet surface may be a problem here). Frequently, users carried out swipe gestures in a way that set the focus at the beginning or end of the swipe. This problem is likely to become less significant with some practice.

A related issue is that a fast double tap (especially in repeated activation attempts) may accidentally become a triple tap. In Windows 8, the triple tap calls up an often unexpected 'secondary action'.
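
The sketch below illustrates, in simplified Kotlin, the kind of disambiguation a screen reader has to perform; the thresholds and the classify function are our own illustrative assumptions, not values documented for iOS, Android or Windows. A touch that travels less than the minimum swipe distance registers as a tap (and thus sets the focus), and a further tap inside the multi-tap window turns an intended double tap into a triple tap.

    import kotlin.math.hypot

    sealed class TouchGesture {
        data class Tap(val count: Int) : TouchGesture()   // 1 = single, 2 = double, 3 = triple
        object Swipe : TouchGesture()
        object ExploreByTouch : TouchGesture()
    }

    const val SWIPE_MIN_DISTANCE_PX = 48f     // shorter movements count as taps
    const val MULTI_TAP_WINDOW_MS = 300L      // taps closer together than this are merged
    const val EXPLORE_MIN_DURATION_MS = 500L  // a long, nearly stationary touch = exploring

    // dx/dy: finger movement; tapsWithinWindow: taps already registered within
    // MULTI_TAP_WINDOW_MS of this one (so a fast extra tap raises the count).
    fun classify(dx: Float, dy: Float, durationMs: Long, tapsWithinWindow: Int): TouchGesture {
        val distance = hypot(dx, dy)
        return when {
            distance >= SWIPE_MIN_DISTANCE_PX     -> TouchGesture.Swipe
            durationMs >= EXPLORE_MIN_DURATION_MS -> TouchGesture.ExploreByTouch
            else                                  -> TouchGesture.Tap(tapsWithinWindow)
        }
    }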

Disambiguating vertical and horizontal swipe gestures

  • iPad mini / iOS 7: iOS uses vertical swipe gestures differently: for example, they may move between app regions rather than individual elements. The disambiguation was not a problem on the iPad. The iOS rotor was rarely called up accidentally (but it did happen).
  • Google Nexus 7 / Android 4.4: Android does not differentiate between horizontal and vertical swipe gestures: swiping right and swiping down have the same effect. This may simplify things at first, but it forgoes direction as a fairly basic means of differentiating gestural input.
  • Thinkpad Helix / Windows 8.1: Narrator uses a vertical swipe gesture to switch reading modes, for example, by paragraph, line, word, character, heading, link, or element. This corresponds to the modes that can be selected via the rotor in iOS. During the Windows 8 tests, users swiping horizontally to traverse elements on screen at times involuntarily changed the reading mode (see the sketch below). The unintended call-up of the iOS rotor happened far less frequently because its gesture is more specific.
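
Why a slightly slanted swipe can end up on the wrong axis is easiest to see in a simplified direction classifier; the function and the dominance ratio below are illustrative assumptions, not documented behaviour of Narrator, TalkBack or VoiceOver.

    import kotlin.math.abs

    enum class SwipeDirection { LEFT, RIGHT, UP, DOWN, AMBIGUOUS }

    // dx/dy: finger movement; y grows downwards, as on typical touch screens
    fun direction(dx: Float, dy: Float, dominance: Float = 2f): SwipeDirection = when {
        abs(dx) >= dominance * abs(dy) -> if (dx > 0) SwipeDirection.RIGHT else SwipeDirection.LEFT
        abs(dy) >= dominance * abs(dx) -> if (dy > 0) SwipeDirection.DOWN else SwipeDirection.UP
        else -> SwipeDirection.AMBIGUOUS   // near-diagonal: easily resolved to the wrong axis
    }

A system that resolves near-diagonal movements to whichever axis is merely somewhat larger will occasionally read a sloppy horizontal swipe as a vertical one, which in Narrator changes the reading mode.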

Location dependence of gestures

With Windows 8 there is an additional problem (especially for novice users): gestures have different results depending on where they are applied. A vertical swipe from the top or bottom edge of the screen often brings in menus at the screen margins. These do not always appear only on the side where the gesture was applied: the calendar displays menus in both the top and bottom margins, and swiping from the top edge on the home screen inserts a menu at the bottom. Applied far enough away from the edge, the same vertical gesture changes the navigation mode (elements, links, headings, paragraphs, lines, words, characters, tables). Similar unexpected consequences surface when swiping inwards from the right edge (which calls up the charms menu) or from the left edge (which toggles between open apps); fortunately, the latter behavior, which was quite disorienting for users, can be deactivated.