Designing an Eyes-Reduced Document Skimming App for Situational Impairments

DFP student Taslim Arefin Khan and DFP faculty Joanna McGrenere and Dongwook Yoon submitted their work on "Designing an Eyes-Reduced Document Skimming App for Situational Impairments" to CHI 2020. We are thrilled to share the team's research video with the DFP community below. You can also read the team's interview with UBC Language Sciences here.

Title: Designing an Eyes-Reduced Document Skimming App for Situational Impairments

Abstract: Listening to text using read-aloud applications is a popular way for people to consume content when their visual attention is situationally impaired (e.g., commuting, walking, tired eyes). However, due to the linear nature of audio, such apps do not support skimming—a non-linear, rapid form of reading—essential for quickly grasping the gist and organization of difficult texts, like academic or professional documents. To support auditory skimming for situational impairments, we (1) identified the user needs and challenges in auditory skimming through a formative study (N=20), (2) derived the concept of “eyes-reduced” skimming that blends auditory and visual modes of reading, inspired by how our participants mixed visual and non-visual interactions, (3) generated a set of design guidelines for eyes-reduced skimming, and (4) designed and evaluated a novel audio skimming app that embodies the guidelines. Our in-situ preliminary observation study (N=6) suggested that participants were positive about our design and were able to auditorily skim documents. We discuss design implications for eyes-reduced reading, read-aloud apps, and text-to-speech engines.