Inclusive Interaction:

Description strategies for a complex interactive science simulation

    Taliesin Smith

    • Clayton Lewis, Department of Computer Science, University of Colorado Boulder
    • Emily B. Moore, PhET Interactive Simulations, University of Colorado Boulder

      TWTC, Memorial University, 2016

What I'll Cover in This Show 'n Tell

  • Introduce the PhET Project
  • Demonstrate the Simulation
  • Design Approach & Constraints
  • Summarize Earlier Findings
  • Dive into Challenges & Strategies for Description in Interaction
  • Demonstrate & Discuss the Accessible Prototype
  • Conclusions
  • Questions & Answers
OCAD University and PhET Interactive Simulations

PhET Interactive Simulations

Several PhET Simulations.
  • Founded in 2002 by Nobel Laureate Carl Wieman
  • Created over 150 science & math simulations, or sims
  • Over 8 million downloads per year
  • Translated into 87 languages
  • Design of the sim implicitly guides students along productive learning paths (Podolefsky et al, 2013; Moore et al, 2016).

Consider a simple exploration of Balloons & Static Electricity.

Design Challenge

Target User

How might a student who cannot see the visuals, or who does not use a mouse, engage with the sim & learn alongside their peers?

Design Constraints

An accessible PhET Sim:

  1. Must be incorporated into the same sim used by sighted learners.
  2. Must not require hardware not widely available in classrooms.

Design Goals

Accessible design triad: Perceivable, Operable, Understandable.
Adapted from the WCAG 2.0 POUR Principles (Caldwell, B., Cooper, M., Guarino Reid, L.)

Question: How can we make the non-visual experience as engaging & easy to use as the visual one?

Accessible Design Features: 3 Pillars

  1. Infrastructure: a robust (Caldwell et al, WCAG), webpage-like structure that creates a semantic hierarchy (i.e., the Parallel DOM)
  2. Keyboard navigation: operation of all interactive objects via the keyboard (common key presses)
  3. Descriptions: text-based auditory descriptions read out by screen reader software as students explore & engage with the sim
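As a rough illustration of the first pillar, the Parallel DOM can be thought of as a hidden, semantic HTML mirror of the canvas-drawn sim. The sketch below is a minimal, hypothetical example; the element names, ids, & description strings are my illustrative assumptions, not PhET's actual implementation:

```javascript
// Minimal sketch of a "Parallel DOM": semantic, webpage-like HTML that
// mirrors the graphically rendered sim for screen reader users.
// All names, ids, & strings here are illustrative assumptions.
function buildParallelDOM() {
  return `
<div role="application" aria-label="Balloons and Static Electricity">
  <h1>Balloons and Static Electricity</h1>
  <h2>Play Area</h2>
  <!-- The balloon is focusable (tabindex) because it is a control,
       & carries state text because it is also a display. -->
  <div id="balloon" tabindex="0"
       aria-label="Yellow balloon, at center of play area, no net charge.">
  </div>
  <h2>Control Panel</h2>
  <button id="reset-all">Reset All</button>
</div>`.trim();
}
```

Because the structure is real HTML with headings & focusable elements, screen reader users can browse it with their usual strategies: by headings, by interactive elements, or line by line.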

Iterative Design & Evaluation

Keyboard with hands of user interacting with the sim.


  • Conducted interviews with 12 participants who were blind.
  • Prototypes contained static descriptions & keyboard navigation; dynamic descriptions were delivered live.

Analysis & Findings

  • Analyzed video recordings of participants' thoughts spoken aloud (Lewis & Reiman, 1983; Chandrashekar et al, 2006) & screen reader output.
  • Found designs to be perceivable & operable, but not always understandable (Smith, Lewis & Moore, 2016).

Related Previous Research

  • Natural language has been used successfully to make interactive science content accessible (Demir, 2008; Gerino et al, 2014; Diagram Center, 2015; Sorge et al, 2015)
  • Description guidelines (Keane & Laverent, 2014) exist for simple interactive scientific graphics

PhET Sims are Unique

  • highly interactive,
  • have complex objects, &
  • need to accommodate free exploration (many interactive learning tools have a predetermined exploration path).
Student drags balloon across play area with keyboard presses, hearing updates on position as they go.

Description for Inclusive Interaction

Three Challenges for Highly Interactive Scenarios

  1. Unpredictable event sequences
  2. Objects as controls & displays
  3. Describing state changes & effects

Description Strategy Framework

Key points of the framework graphic: I will discuss & demonstrate three categories of description: Static, Dynamic, and Interactive.

Challenge 1: Unpredictable event sequences

Blind users have varied navigational styles. They listen, then interact (Kurniawan et al, 2003; Fakrudeen et al, 2013). They can listen:

  • line by line, OR
  • to headings (black diamonds), OR
  • to interactive objects, e.g., "do a tab through" (black circles).

Strategy: Support exploratory listening

  • Descriptions are modular & hierarchical, & make sense in any order
  • During interaction, descriptions (dynamic & interactive) are announced to the learner via ARIA Live Regions (Scheuhammer et al, 2013; Pappas et al, 2014; King et al, 2016), a technical means of directing a screen reader to read out a change in the sim that is outside the current listening location (i.e., cursor focus).
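A minimal sketch of the live-region mechanism, using a plain object in place of a real DOM element; the structure & wording are illustrative assumptions, not PhET's code:

```javascript
// Sketch: an ARIA live region lets the sim "speak" a change even when the
// screen reader's reading cursor is elsewhere. A plain object stands in
// for the DOM element here; in a browser this would be markup such as
//   <p aria-live="polite"></p>
function makeLiveRegion() {
  return { ariaLive: 'polite', textContent: '' };
}

function announce(liveRegion, message) {
  // Screen readers monitor live regions & read out injected text.
  liveRegion.textContent = message;
}

const region = makeLiveRegion();
announce(region, 'Balloon picks up negative charges from sweater.');
```

With "polite" politeness, the screen reader finishes what it is currently saying before reading the alert, rather than interrupting the learner.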

Demo: Navigational styles

How screen reader users listen, before interacting.

Challenge 2: Objects as controls & displays

  • Controls, like buttons & switches: easy!
  • Display objects that solely display information, e.g., the sweater & wall: straightforward to describe (state information).

Balloon is a Complex Object

  • Balloon as a control: students actively engage with it.
  • Balloon as a display: surface shows visual indicators of electrostatic charges & the balloon's position updates independently when released.

Strategy: Incorporate three forms of descriptions

  1. While listening, the balloon has a static description to identify it, & a dynamic description containing state information: position, & amount of net charge.
  2. Additionally, once grabbed, a dynamic description describes how to move & release it.
  3. During interaction a combination of dynamic descriptions & interactive alerts connect students' actions with the changes in the sim.
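The three forms above might be composed as in this sketch; the state fields & wording are hypothetical, not the sim's actual strings:

```javascript
// Sketch: composing the balloon's description from three parts --
// static identity, dynamic state, & (once grabbed) interaction help.
// Field names & wording are illustrative assumptions.
function describeBalloon(state) {
  const identity = 'Yellow balloon, a draggable object.';
  const dynamicState = `At ${state.position}, with ${state.netCharge} net charge.`;
  const help = state.grabbed
    ? 'Use arrow keys to move the balloon. Press Space to release it.'
    : '';
  return [identity, dynamicState, help].filter(Boolean).join(' ');
}

describeBalloon({ position: 'center of play area', netCharge: 'no', grabbed: false });
```

The static identity never changes, the dynamic state is recomputed as the sim changes, & the help text appears only once the balloon is grabbed.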

Demo: Complex nature of balloon

  • Static, dynamic, & interactive.

Challenge 3: Describing state changes & effects

Rubbing has two integrated effects for sighted learners.

  1. Something happens: Negative charges are transferred from sweater to balloon.
  2. New state: distribution of charges after rubbing is different from before.

How to make both the state and the effects immediately apparent to blind learners?

Strategy: Provide two types of description to integrate state changes & effects:

  • Dynamic descriptions are updated in the PDOM & are announced as changes occur.
  • Interactive alerts, hidden in the PDOM (i.e., not part of the object's state information), announce immediate effects of the learners' actions.

Descriptions are history- & context-sensitive, allowing successive descriptions of similar rub events to be shorter than, and/or different from, the first.
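A sketch of how history sensitivity might work: the first rub gets a full explanation of the effect, later rubs a brief update. The wording & structure are my illustrative assumptions:

```javascript
// Sketch: history-sensitive alerts. The first rub event is described in
// full; repeated rubs get a shorter alert. Wording is illustrative.
function makeRubAlerter() {
  let rubCount = 0;
  return function alertForRub(chargesPickedUp) {
    rubCount += 1;
    if (rubCount === 1) {
      return 'Negative charges transfer from sweater to balloon. ' +
             `Balloon picks up ${chargesPickedUp} negative charges.`;
    }
    return `More negative charges transferred; ${chargesPickedUp} picked up.`;
  };
}

const alertForRub = makeRubAlerter();
const firstAlert = alertForRub(4);   // full explanation
const secondAlert = alertForRub(4);  // brief update
```

Keeping later alerts short avoids the verbosity that screen reader users often cite as fatiguing, while still confirming that each action had an effect.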

Demo: Rubbing balloon on sweater

  • State changes + Effects = Understandability

Description Strategy Framework Again

Static descriptions (blue column) form a browsable structured outline (black diamonds & black circles). Interactive static descriptions (black circles) encourage interaction. Dynamic descriptions (green column) keep the browsable state information up to date. Interactive alerts (orange column) connect users' actions with changes in the sim, & are not browsable.

Quotes & Comments

Navigational strategies

P1: "I generally use the arrow keys to navigate line by line."
P8 (paraphrased for brevity): "I'm new to this site, so I am first going to go through everything with the down arrow [...] I'm repeating stuff to make sure I didn't miss anything. [...] Now, I'm going to do a Tab through."

Engagement with science

P5: [After first rub on sweater] "Hmmm...I got a lot more charges than I had before."
P3: [with 2 negatively charged balloons on the sweater] "I want to see what happens if I put one balloon on top of the other." [second balloon repels and moves to the bottom of the sweater.] "What will happen if I do the same thing on the Wall?"

Recent comments from researchers who are blind at ASSETS 2016

[After using the sim for a few minutes] "I'm going to see if I can deplete the charges on the sweater!"
[After dragging the balloon to the sweater, & hearing the alerts] "Am I using the same sim as a visual person?"

Conclusion & Future Work

We have implemented screen reader support for the PhET sim Balloons and Static Electricity, & have indications that it is an engaging & inclusive experience, though further testing is needed.

In future work, we will describe two-balloon scenarios, apply this strategy to other sims, & begin to combine natural language with sound (Kramer et al, 2010) to create a more immersive environment for all students.


A big thanks to:

  • Jesse Greenberg, PhET Software Developer, for implementation & design insights
  • Our super enthusiastic participants
  • Shannon Fraser & Sambhavi Chandrashekar for assistance with interviews
  • Memorial University & CITL Media Services for space & video equipment
  • Inclusive Design Research Centre for space & video equipment

Funding for this work

  • provided in part by the National Science Foundation (DRL#1503439), the University of Colorado Boulder, & the William & Flora Hewlett Foundation.

More Information

Get in touch

  • @TaliesinSmith