Active Projects



Acceptance, perceived fairness, and mental models of driver monitoring systems

Advanced driver assistance systems (ADAS) using automation are likely to be implemented in conjunction with driver monitoring systems (DMS) to detect distracted or potentially unsafe driving behaviors. DMS are designed to mitigate driver distraction and inattention during periods of automated driving by issuing alerts or other consequences when inattention to the road is detected. Further, it appears increasingly likely that driver monitoring systems will become a required standard safety feature in all vehicles—even those without ADAS or automated features. Driver monitoring systems are themselves a form of automation—sensors and computer algorithms are designed to supplement or replace the driver's ability to monitor and regulate their own attentiveness to the roadway. Research in our lab is underway to understand human factors issues associated with driver monitoring systems. We are currently studying perceived fairness, acceptance, and mental models of driver monitoring.



Theory of sonification

As a young field, sonification—the use of nonspeech sound to convey information—has been legitimately criticized for proceeding with underdeveloped or nonexistent theory. Ongoing work in the lab explores how a theory of sonification might be achieved.


Technology and interruptions

Research in our lab is exploring the impact of auditory interruptions (alerts, alarms, etc.) and other types of interruptions from technology such as smartphones and messaging services. How and when will interruptions disrupt our ability to accomplish everyday tasks? What theoretical models of multitasking and task interruption can best explain results to date? Are interruptions from technology different from in-person interruptions? How does the availability of messaging technology impact productivity during work tasks?

Previous Projects

Auditory perception and working memory for nonverbal sounds

A number of theoretical perspectives in psychology emphasize the distinction between verbal and visuospatial processing. In other words, it is widely held that people think in words/language and images. But what about nonspeech sounds like music and environmental sounds? Do people remember sounds as sounds per se? Does the capacity to remember sounds require that the person covertly verbalize or imitate the sound (via articulation, like humming along to the tune, etc.)? Previous work has examined empirical and theoretical approaches to understanding perception and working memory for nonverbal sounds.

Auditory displays for in-vehicle technologies

Auditory displays will play an important role for drivers as in-vehicle technologies bring more diverse information into cars.  Previous work has examined the design of auditory displays for in-vehicle technologies and how auditory displays may promote situation awareness during driving, especially in emerging technologies that support automation in vehicles. 

Acceptance and perceptions of safety for self-driving cars

Self-driving cars have captured the public imagination, and people have begun to establish expectations about how the advent of self-driving cars will change their lives. The realization of self-driving cars, however, will face many human factors challenges. Researchers have only begun to examine the factors that will affect drivers' acceptance of vehicle automation, and acceptance will be critical to the ultimate success of self-driving cars. Previously we developed a scale to measure acceptance of self-driving cars, and we used the scale to begin to examine how a priori expectations about self-driving cars affect acceptance. We also examined how driving styles and other driver traits might relate to acceptance.

Audio assistive technologies in education and testing accommodations

Technology has increasingly allowed students with disabilities such as visual impairments to pursue educational opportunities in science, technology, engineering, and mathematics (STEM) fields. Many disciplines in STEM rely heavily on graphs and diagrams to represent abstract concepts. Our research has examined how sound can be used to improve the delivery of STEM educational curricula for all learners, including those with disabilities. In particular, research has suggested that auditory graphs could be used to make traditionally visual learning materials accessible to learners with a wide range of sensory capabilities. Further, high-stakes standardized testing plays an important role in determining a number of educational and career outcomes for many people. Very little research, however, has examined how accommodations affect the validity of standardized tests. Research in our lab has examined how graphs, diagrams, and tables are presented in accommodated versions of tests for people with visual impairments.

Auditory pareidolia

We studied the role of top-down perceptual processing in the perception of purported electronic voice phenomena. Our findings suggest that introducing a paranormal context results in a criterion shift for perceiving human voices in ambiguous auditory stimuli, even among participants who self-report as skeptics. Further, people showed little agreement about the content of the perceived utterances in the ambiguous stimuli.