7.2 Interactive Media Production Standards

The CMHR encountered two primary questions in designing exhibits around human rights, an intangible subject matter:

  • How can we make touchscreen monitors more inclusive in their presentation and navigation of digital content and its structures?
  • How can we make non-digital content—such as static text, framed images or artefacts behind cases—inclusive?

The CMHR’s mandate for inclusive exhibit design produced a variety of designs and intentions. Ultimately, that mandate required us to develop and implement strategies to provide access to content through a Universal Keypad (UKP) offering screen and audio controls, as well as through a mobile application using low-energy Bluetooth iBeacons.

Universal Approach for Interactives

Within the larger context of accessibility and universal design, the CMHR considered what should form a set of guiding principles for users of interactive experiences. These principles were informed by standards such as the Web Content Accessibility Guidelines (WCAG) v2.0 and the Seven Principles of Universal Design, by a review of existing assistive technology interfaces and best practices, and by research from human-computer interaction (HCI). Above all, content for interactives must be semantically structured to support text-to-speech.

At their core, accessibility interfaces aim to answer these questions for the user:

  • “Where am I?”
  • “Where can I go from here?”
  • “How can I get there?” (Or: “How can I make that happen?”)

An accessible interface guides the user to act, as in:

  • “OK, button, 1 of 2, press Select to activate. Use arrow keys to move between choices.”

This guidance provides context and informs the next steps the user can take, as in:

  • “Where am I?”—We’re on the OK button.
  • “Where can I go from here?”—There’s at least one more object suggested by “1 of 2.”
  • “How can I get there?”—We can “press Select to activate” or use the arrow keys.

Accessibility interfaces must at least provide this guidance.

At the same time, this guidance begins to create a mental framework of functionality for the user. In particular, whenever you encounter an object described as a button, it’s likely that you can press Select to activate it. Likewise, when ordinality (the order of objects) is described, such as “1 of 2,” the arrow keys are likely the tool for reaching the other elements in the range.

This progression, providing context first and then an accessibility hint that builds a mental framework of functionality, is important. The real goal of any interface is to disappear for the user. This is familiar practice for sighted users, and it is no different here. Because context comes first and the hint second, the hint becomes less important as the interface receives additional use. The user can anticipate what will happen and move forward in the experience without waiting for the hint to complete. Beginners become intermediate users and then expert users, and where consistency is provided in interfaces across the Museum, users are able to focus increasingly on content rather than on figuring out how an interface works.

The formula that allows this to work is always:

  • Context, Pause, Hint

Understanding this methodology informs how to develop interfaces we had not anticipated, such as a list of songs in a media player with user controls for rewind, pause, and fast forward. Our guidance would sound something like:

  • “If Your Dad Doesn’t Have a Beard, song, 2 of 20, press Select to play song. Use arrow keys to change songs.”

Each song becomes nothing more than a button that launches a media experience, and the media player controls can be treated the same way:

  • “Rewind, button, 1 of 3, press Select to activate. Use arrow keys to navigate between controls.” and so on.

Eventually, the eyes-free user can fly along this list, pausing only long enough to hear the name of a song, hitting Select to play it, and then quickly hitting Back to return to their position in the song list. This is why statefulness in the Back functionality is important, and why consistent, explicit semantic prioritization in the speech rule is critical.

  • Rule: Name, value, type, state, ordinality, (pause) hint

In the above rule, punctuation is meaningful: the comma indicates a small pause, as is generally inserted by most text-to-speech engines when they encounter a comma in a text string. The explicit pause refers to a longer pause of 300 to 500 milliseconds that helps separate the accessibility hint from the primary speech associated with each control.

It’s worth noting that hints are the concatenation of the component-level and the container-level hints: the hint spoken for any element combines both.
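To make the rule concrete, here is a minimal sketch of how such a prompt might be assembled. The element model, field names, and pause mechanism below are hypothetical illustrations, not the Museum’s actual software:

```typescript
// Hypothetical element model; field names are illustrative only.
interface AccessibleElement {
  name: string;            // e.g., "OK"
  value?: string;          // current value, if any
  type: string;            // e.g., "button"
  state?: string;          // e.g., "selected"
  ordinality?: string;     // e.g., "1 of 2"
  hint?: string;           // component-level hint
}

const PAUSE_MS = 400; // explicit pause between context and hint (300-500 ms)

// Builds the utterance per the rule: name, value, type, state, ordinality, (pause) hint.
function buildPrompt(el: AccessibleElement, containerHint?: string): string[] {
  const context = [el.name, el.value, el.type, el.state, el.ordinality]
    .filter((part): part is string => Boolean(part))
    .join(", "); // commas yield the small pauses most TTS engines insert

  // Hints are the concatenation of component-level and container-level hints.
  const hint = [el.hint, containerHint].filter(Boolean).join(" ");
  return hint ? [context, hint] : [context];
}

// A speech queue would speak the parts with PAUSE_MS between them:
//   "OK, button, 1 of 2"  --400 ms--  "Press Select to activate. Use arrow keys..."
```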

Such an interface also exemplifies why consistent focus matters (e.g., the ability of the eyes-free user to quickly play a song, fast forward a bit in it—because they’ve memorized that it is the third control—and then return to the list, all without even pausing to hear the speech prompts).

The perhaps counterintuitive takeaway is that the more perfectly we craft the speech prompts, the sooner the eyes-free user can do without them once they become accustomed to the interface.

General navigation of content

In a greatly simplified view, each screen of content typically breaks down into two major components. We begin to imagine the structure of any screen as similar to the Document Object Model (DOM) typically followed for Web pages. As such, the interface should establish a simple hierarchy of content and apply the handful of interactions outlined in the next section.

First level: Overall view

  • Content
  • Meta Navigation
  • … loop to top (wrap tone)

Second level: Content (example)

  • Introduction
  • Filter
  • Text Container 1
  • Text Container 2
  • Video
  • Navigation Control
  • Navigation Control
  • … loop to top (wrap tone)

Second level: Meta navigation

  • Includes language selection
  • Playback speed selection
  • … loop to top (wrap tone)

Third level: Filter

  • Introduction
  • Option A
  • Option B
  • Option C
  • Apply Filter Button … loop to top (wrap tone)

Third level: Other objects (left out on purpose)

Third level: Meta navigation

  • Includes language selection
  • Playback speed selection
  • … loop to top (wrap tone)

In this basic structure, we begin to get a sense of the different rules applied in practical use: menu wrapping, filtering, and meta navigation.
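To make this hierarchy concrete, the structure above can be modelled as a simple tree of traversable containers, much like a pared-down DOM. The node shape and field names in this sketch are our own assumptions, not a CMHR specification:

```typescript
// Illustrative node shape for the screen hierarchy sketched above.
interface ScreenNode {
  label: string;
  children?: ScreenNode[];
  wraps?: boolean; // sibling lists wrap, playing a wrap tone at the seam
}

const screen: ScreenNode = {
  label: "Overall view",
  wraps: true,
  children: [
    {
      label: "Content",
      wraps: true,
      children: [
        { label: "Introduction" },
        {
          label: "Filter",
          wraps: true,
          children: [
            { label: "Introduction" },
            { label: "Option A" },
            { label: "Option B" },
            { label: "Option C" },
            { label: "Apply Filter Button" },
          ],
        },
        { label: "Text Container 1" },
        { label: "Text Container 2" },
        { label: "Video" },
        { label: "Navigation Control" },
        { label: "Navigation Control" },
      ],
    },
    {
      label: "Meta Navigation",
      wraps: true,
      children: [
        { label: "Language selection" },
        { label: "Playback speed selection" },
      ],
    },
  ],
};
```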

General interaction guidelines

Initial development process

As multiple media producers were brought on board at the CMHR, interactives were still in flux and the final design and implementation of the UKP was not yet available. Further, the understanding of how to implement accessibility was an evolving practice. As a result, each of the nine producers received an independent introduction to the topic and an individual review of their intended implementation, software stack, and desired interactions. This proved both time consuming and difficult: consistent description and understanding of intent was tricky at a distance, since most development and review was done remotely, and distributing working versions of software for review produced mixed results. Our eventual goal for the inaugural development was to accept slight variations in implementation and execution, subject to review. Given that our interface approach was somewhat bespoke, we saw an opportunity to use these variations as points of comparison that could then yield a finalized set of guidelines for use in the Museum.

It should be cautioned that the guidelines described below were developed after a post-opening review of all the interactive pieces. Every interactive received extensive hands-on testing, much debate over fine and subtle details, and, finally, agreement on a consistent approach for future delivery. These guidelines can be generalized to new interactives and accommodate the emerging trend toward multi-touch interfaces as the desired modality. In fact, the model of interaction codified in these guidelines does not depend on a particular means of navigation; it simply assumes that some form of iterative navigation is possible.

The guidelines that follow describe a number of specific use cases of interactivity and how, once our general model of interaction is accepted, an accessible interface should present each one efficiently and consistently.

Audio cues

In the same way that interfaces for sighted users offer a variety of visual affordances – hover states, animations, colour changes, etc. – good accessibility interfaces offer similar affordances through audio cues and sounds. These sounds are intended to have a light touch: short and minimally intrusive, while providing a critical layer of feedback for the vision-impaired user.

  • Rule: Audio cues are used to provide feedback for actions.

These cues aren’t to be confused with content read by text-to-speech, or with the description and hinting accompanying most interface elements; rather, they are the audio confirmation of activity as it happens: the tone when an object is selected, or the sound when the Back key cancels an interaction. The sighted user can see these changes; the vision-impaired user needs to hear them.
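A minimal sketch of how such cues might be wired up, assuming a hypothetical catalogue of cue files and browser-style audio playback:

```typescript
// Hypothetical catalogue of short, minimally intrusive cue sounds.
const cues = {
  select: "select.wav",   // an object is activated
  back: "back.wav",       // the Back key cancels or retreats
  wrap: "wrap.wav",       // a menu wraps past its seam
  terminus: "end.wav",    // a non-wrapping list reaches its end
} as const;

type Cue = keyof typeof cues;

// Assumed audio helper; any short, low-latency playback API would do.
function playCue(cue: Cue): void {
  new Audio(cues[cue]).play();
}
```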

Wrapping menus

As a general rule of thumb, navigation of an interface at a local or per-screen level shouldn’t be a series of dead-end interactions in which a user traverses a path of information and content only to have to retrace their steps to get somewhere else. The overall information architecture of an interactive may have a series of branching trees, but individual screens and “local” experiences should not.

  • Rule: Menus wrap in both directions and make a sound as they do so.

Navigation actions should always be reversible for the user to increase predictability. Pressing the next key twice, followed by pressing the previous key twice, should have the effect of returning the user to where they started. The same is true as the user gets to the end of a menu; the last item in the list of objects is followed by a wrap sound, and the next item is the first item in the list. The reverse is true: the first item, a wrap sound, followed by the last item in the list. Wrapping ensures that we follow the rule regarding all navigation actions being reversible.
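In code, the wrap reduces to modular index arithmetic plus the wrap tone. A minimal sketch, reusing the hypothetical playCue helper from the audio-cues sketch above:

```typescript
declare function playCue(name: string): void; // see the audio-cues sketch above

// Moves focus through a wrapping menu; delta is +1 (next) or -1 (previous).
function moveFocus(items: string[], index: number, delta: 1 | -1): number {
  const next = index + delta;
  if (next < 0 || next >= items.length) {
    playCue("wrap"); // the sound marks the seam between last and first items
    return (next + items.length) % items.length;
  }
  return next;
}
// Two presses of next followed by two presses of previous always return
// the user to where they started, keeping every action reversible.
```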

Automatic reading/story containers

Where short paragraphs of content exist, especially introductory paragraphs, we deliberately make the overall text interface a little less cumbersome by automatically starting playback of the selected text. This rule requires a judgment call on the length and context of these paragraphs, but the intent is convenience.

  • Rule: Text blocks of approximately 250 words or fewer should be read automatically.

In contrast, larger chunks of text comprising multiple paragraphs appear in story containers. These containers should be quick and easy for the user to traverse, with individual paragraphs acting as individual objects. Here the focus is again on convenience: where we would normally include feedback such as ordinality, such feedback is out of place in this context, since it would substantially interrupt the flow of content. Much as a sighted user can quickly scan over a body of text, we approximate the same experience for the vision-impaired user.

  • Rule: Story containers allow users to traverse multiple paragraphs of text.
    • No ordinality is conveyed in paragraphs.
    • The left and right arrows move to the previous and next paragraphs, respectively.
    • Paragraphs do not wrap (and they include a sound for reaching the terminus).
    • The up and down arrows move through the objects on the screen, leaving the story container.
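A minimal sketch of both behaviours, assuming hypothetical speak and playCue helpers:

```typescript
declare function speak(text: string): void;      // assumed TTS entry point
declare function playCue(name: string): void;    // see the audio-cues sketch above

const AUTO_READ_WORD_LIMIT = 250;

// Short blocks are read automatically on focus; longer ones become story containers.
function onTextFocused(text: string): void {
  const words = text.trim().split(/\s+/).length;
  if (words <= AUTO_READ_WORD_LIMIT) speak(text);
}

// Story container: left/right move between paragraphs, with no ordinality
// spoken and no wrapping -- just a terminus sound at either end.
function moveParagraph(paragraphs: string[], index: number, delta: 1 | -1): number {
  const next = index + delta;
  if (next < 0 || next >= paragraphs.length) {
    playCue("terminus");
    return index; // stay put; paragraphs do not wrap
  }
  speak(paragraphs[next]);
  return next;
}
```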

Playing audio/video

Audio and video are good examples of media containers that break slightly from the other interactions in the user interface. These components represent unique, self-contained interactions and begin to demonstrate how we can use the different keypad controls effectively to capture the intent of the interaction without being overly deliberate. Full accessibility is provided, but with simplicity and elegance. Further, this approach echoes the experience provided by story containers above, speaking to the general consistency desired in an interface.

  • Rule: Media containers are not broken into component parts and act cohesively.
    • The single element of focus is the media itself.
    • The Select button is the Play/Pause button for the media and toggles back and forth.
    • The left and right arrows scrub through the media; tapping moves a short distance while holding down keeps scrubbing.
    • The up and down arrows move to the previous and next objects on the screen.
    • When in accessibility mode, captions are on (this is the default case).
    • When in accessibility mode, sign language is on (this is the default case).
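A minimal sketch of the keypad handling, assuming a hypothetical media-player facade; the scrub distances shown are illustrative, not specified values:

```typescript
// Assumed media-player facade; method names are illustrative only.
declare const player: {
  toggle(): void;                // play/pause
  seekBy(seconds: number): void; // scrub forward or backward
};

type Key = "select" | "left" | "right" | "up" | "down";

// The media container is a single focusable object; the keypad drives playback.
function onMediaKey(key: Key, held: boolean): void {
  switch (key) {
    case "select":
      player.toggle();                 // Select toggles Play/Pause
      break;
    case "left":
      player.seekBy(held ? -10 : -5);  // a tap scrubs a short distance;
      break;                           // held presses repeat, continuing the scrub
    case "right":
      player.seekBy(held ? 10 : 5);
      break;
    case "up":
    case "down":
      // handled by normal object navigation, leaving the media container
      break;
  }
}
```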

Image galleries

Image galleries are very similar to story containers and media containers.

  • Rule: Image galleries are a hybrid of story and media containers.
    • The ordinality of objects is described.
    • The description of the image is automatically read.
    • The hint for navigation needs to suggest that image descriptions will be automatically read to the user.
    • The left and right arrows traverse the list of images.
    • The up and down arrows move to the previous and next objects on the screen.

There is a unique interaction here, where the hint for navigation may appear at the end of a potentially long description. We believe this to be okay – skilled users will have already figured out the interface, and the modified hint gives additional guidance about what will happen.
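A minimal sketch of the focus behaviour, assuming a hypothetical speak helper that queues utterances; the hint wording is illustrative:

```typescript
declare function speak(text: string): void; // assumed TTS entry point (queued)

interface GalleryImage {
  title: string;
  description: string;
}

// On focus, an image announces its context with ordinality, then its
// description is read automatically; the navigation hint follows the
// (potentially long) description.
function onImageFocused(images: GalleryImage[], index: number): void {
  const img = images[index];
  speak(`${img.title}, image, ${index + 1} of ${images.length}`);
  speak(img.description); // auto-read, as in story containers
  speak("Use arrow keys to move between images. Descriptions are read automatically.");
}
```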

Modifying containers

There are a handful of instances where the objects within a container may be modified through interaction – for example, sorting a set of available stories or videos by a particular theme. In these instances, the order in which users encounter these actions, and how the actions apply to the content, become important. As with most other interface elements, we want the user to understand what will happen before it actually happens. As a result, filtering mechanisms appear inline with the list of objects to be modified. And since a filtering mechanism modifies the list of objects, it is not included in the count of the collection when calculating ordinality.

  • Rule: Meta navigation specific to a content area (e.g., themes/skip/record) appears inline in the content container.
    • Actions that affect the contents of the content container become the first objects in the container (e.g., filtering).
    • For interactions that present exclusive singular choice (e.g., radio buttons), pressing Select automatically applies that choice.
    • For interactions that present multiple choices (e.g., checkboxes), an Apply button needs to be presented.
    • Actions are not included in ordinality of content.
    • The Back button cancels the previous action, and this needs to be emphasized as part of the container-level hint.
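A minimal sketch of the ordinality calculation, assuming a hypothetical object model in which actions are flagged:

```typescript
// Hypothetical object model; field names are illustrative only.
interface ContainerObject {
  label: string;
  isAction: boolean; // filters and similar actions that modify the container
}

// Actions sit first in the container but are excluded from content ordinality.
function ordinalityOf(objects: ContainerObject[], index: number): string | undefined {
  if (objects[index].isAction) return undefined; // actions carry no ordinality
  const content = objects.filter(o => !o.isAction);
  const position = content.indexOf(objects[index]) + 1;
  return `${position} of ${content.length}`;
}
```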

Back button/cancel

Where the Select button moves the user deeper into content and signals an action, the Back button provides an easy mechanism to do the reverse. Typically, the Back button will be used to retreat from a layer of content, but it also provides a similar function by acting as a Cancel button. For example, if a user is in the midst of filling out a form or a set of options that would cause a change to the content or the interface, the Back button is the graceful way to back out of the process, effectively serving the role of a Cancel button.

  • Rule: The Back button returns the user up a level in hierarchy, returning them to the point from which they left.
    • The Back button also acts as a Cancel button.

Interruptions

As a general rule, when the user takes action, whatever speech is being played should be interrupted and cancelled. Queuing of speech and audio events should not take place, because it leads to a potential disconnect between the actions of the user and the responsiveness of the interactive.

  • Rule: Whenever a user presses a button on the keypad, any speech and audio should be interrupted as part of the action.
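A minimal sketch, assuming a hypothetical speech facade with an explicit stop; the Web Speech API’s speechSynthesis.cancel() is one real-world equivalent:

```typescript
// Assumed speech facade; method names are illustrative only.
declare const speech: {
  stop(): void;            // interrupt current speech and audio
  say(text: string): void;
};

// Every keypad press first silences whatever is playing, so feedback
// always reflects the user's most recent action rather than a queue.
function onKeypadPress(handle: () => void): void {
  speech.stop(); // interrupt, never queue
  handle();      // then perform (and announce) the new action
}
```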

Progress bars

Many visual elements give sighted users constant feedback, and good accessibility interfaces need to approximate a similar experience for vision-impaired users. Providing feedback upon initial focus is a good start, but while the object remains in focus, additional feedback should be provided as it updates. The conceptual trick is taking into account the different reading speeds the user can select and choosing an interval that provides consistent feedback without interrupting or clipping the audio as the information is updated. The approach outlined below proved preferable to options such as an audio ramp (which felt inconsistent with the overall design aesthetic of the Museum) and more consistent across a range of possible interfaces.

  • Rule: Progress bars need to provide ongoing feedback at regular intervals.
    • Add a background tick every 0.5 seconds.
    • Announce updated progress every 5 seconds.
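A minimal sketch of the timing, assuming hypothetical speak and playCue helpers and an illustrative "tick" cue sound:

```typescript
declare function speak(text: string): void;     // assumed TTS entry point
declare function playCue(name: string): void;   // see the audio-cues sketch above

// While a progress bar holds focus: a background tick every 0.5 s and a
// spoken progress update every 5 s. A cleanup function is returned so the
// caller can stop both timers when focus moves on.
function startProgressFeedback(getPercent: () => number): () => void {
  const tick = setInterval(() => playCue("tick"), 500);
  const announce = setInterval(
    () => speak(`${Math.round(getPercent())} percent`),
    5000,
  );
  return () => {
    clearInterval(tick);
    clearInterval(announce);
  };
}
```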

Meta navigation

The meta navigation controls that provide an interface for both language selection and speed control are integrated throughout each layer of the interface, always anchoring a list of objects. Sighted users are able to switch between English and French content at any point while using interactives; by incorporating the meta-navigation throughout the different layers of the interactive experience, we approximate the same capability for vision-impaired users. For example, when listening to a video, the user can press the down arrow to traverse to the next object in the container, which would be additional videos followed by the meta navigation. Proceeding further, a wrap sound would play, and the user would end up at the first video in the sequence.

  • Rule: The meta navigation controls appear as the last element of every list of objects that can be traversed.