Auditory learning using a portable real-time vocoder: Preliminary findings

Elizabeth D. Casserly, David B. Pisoni

Research output: Contribution to journal › Article



Purpose: Although auditory training has traditionally been studied in controlled laboratory settings, interest in more interactive options has been increasing. The authors examine whether such interactive training can result in short-term perceptual learning and the range of perceptual skills it impacts. Method: Experiments 1 (N = 37) and 2 (N = 21) used pre- and posttest measures of speech and nonspeech recognition to find evidence of learning (within subject) and to compare the effects of 3 kinds of training (between subject) on the perceptual abilities of adults with normal hearing listening to simulations of cochlear implant processing. Subjects were given interactive, standard lab-based, or control training experience for 1 hr between the pre- and posttest tasks (unique sets across Experiments 1 & 2). Results: Subjects receiving interactive training showed significant learning on a sentence-recognition-in-quiet task (Experiment 1), outperforming controls but not lab-trained subjects following training. Training groups did not differ significantly on any other task, even those directly involved in the interactive training experience. Conclusions: Interactive training has the potential to produce learning in 1 domain (sentence recognition in quiet), but the particulars of the present training method (short duration, high complexity) may have limited benefits to this single criterion task.

Original language: English (US)
Pages (from-to): 1001-1016
Number of pages: 16
Journal: Journal of Speech, Language, and Hearing Research
Issue number: 3
State: Published - Jun 1 2015

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
  • Speech and Hearing
