Interactive Sonification is a data analysis technique using sound, controlled interactively by a human user. It is an emerging subfield of auditory display research, which focuses on human interaction for understanding and using auditory feedback.
Interaction allows users to continuously query different "auditory views" of the data being analysed, which helps to build a more complete understanding of the data's structure. Applications range from medical analysis to sports for people with visual impairments.
More general information about the topic can be found at: http://www.interactive-sonification.org/
Cancer cell detection using sonic interfaces
Members: Dr Andy Hunt, Dr Alistair Edwards (Computer Science)
This project is funded by a 3-year grant from EPSRC (EP/C512413/1: Sonification Of Cervical Smear Data To Improve Screening Accuracy). We are working with clinical cytologists from Leeds Health Trust to improve the detection rates of cervical cancer by rendering the microscope data as sound. Many diagnostic errors occur during microscopic examination of cells that have been smeared across a glass slide. We are investigating the conversion of the slide data into real-time sound feedback that will be available to the analyst. Sound may offer a degree of cross-checking against the visual data, and may also help alert the listener to subtle changes in the data, especially data outside the current field of view.
High-level control of music synthesis for musicians
Members: Dr Al Disley, Prof David Howard, Dr Andy Hunt, Mr Tony Tew
Sound synthesis can be achieved by many methods, but none provides musicians with controls that are intuitive in musical terms; rather, they make use of low-level parameters such as fundamental frequency, filter cut-off frequency, or filter resonance. If musicians were able to make use of the high-level terms that they regularly use during their music making, such as bright-dark, poor-rich, focussed-unfocussed, and static-dynamic, they would be able to create sounds in a much more intuitive manner. This can only be achieved by investigating the links between what musicians mean when they use high-level terms and the output sounds to which they refer. This project will (a) look at how musicians describe sounds, (b) decide what sound synthesis system would best enable such sounds to be created, and (c) implement a version of that synthesis system where sound creation is controlled using high-level musical sound descriptors.
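As a rough illustration of the kind of mapping the project investigates, the sketch below translates a single high-level "brightness" descriptor into low-level synthesis parameters (filter cut-off and harmonic roll-off). The specific functions and ranges are hypothetical, chosen only to show the idea of one perceptual control driving several low-level parameters; they are not the project's actual mapping.

```python
def descriptor_to_params(brightness):
    """Map a high-level 'brightness' descriptor in [0, 1] to low-level
    synthesis parameters. The ranges below are illustrative only."""
    # Brighter sounds: raise the filter cut-off exponentially (200 Hz .. 6400 Hz).
    cutoff_hz = 200.0 * (2.0 ** (brightness * 5))
    # Brighter sounds: slower harmonic roll-off (harmonic n has amplitude 1/n^rolloff).
    rolloff = 2.0 - 1.5 * brightness
    return {"cutoff_hz": cutoff_hz, "rolloff": rolloff}

def harmonic_amplitudes(rolloff, n_harmonics=8):
    """Amplitudes of the first n harmonics for a given roll-off exponent."""
    return [1.0 / (n ** rolloff) for n in range(1, n_harmonics + 1)]
```

One perceptual slider thus moves two (or more) low-level parameters in a coordinated way, which is precisely what is hard for musicians to do by hand.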
Sonification
Members: Dr Andy Hunt
Sonification is the use of sound to render data, so that it can be interpreted by a human being for analytical purposes. It is the audio counterpart of visualisation.
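A minimal sketch of the most common approach, parameter-mapping sonification: each data value is mapped to the frequency of a tone, so rising data is heard as rising pitch. The function name and frequency range are illustrative assumptions, not a specific system described here.

```python
def sonify(data, f_min=220.0, f_max=880.0):
    """Parameter-mapping sonification: linearly map each data value to a
    tone frequency in [f_min, f_max] (one tone per data point).
    f_min/f_max default to two octaves around A3-A5, an arbitrary choice."""
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in data]
```

The resulting frequency list would then be handed to a synthesiser; the mapping itself is where the analytical design choices lie (scaling, range, and which data dimension drives which sound parameter).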
Landmine Detection with sound feedback
We have brought together an international cross-discipline academic-industrial consortium to improve the detection rates of hand-held landmine detectors, by providing rich acoustic feedback to the human de-miner.
Use of Sound for Physiotherapy Analysis and Feedback
Members: Dr Andy Hunt, Ms Sandra Pauletto
This project is funded by a 3-year grant from EPSRC (GR/S08886: Improved Data Mining through an Interactive Sonic Approach). We have worked closely with physiotherapists from Teesside University's School of Health & Social Care to develop a sound representation of electromyography (EMG) data taken from a patient's muscles. Through the sound, therapists can hear the detailed temporal response of several muscles at the same time, and they gain new insights into the problems each patient faces, as much of the microstructure and overall shape of the data is perceived very differently as sound. We are working on a system to integrate our sonification prototype with the clinical data-gathering equipment, so that patients will be able to hear a live sonic rendition of their muscular performance and use the aural information to modify their movements, aided by the therapist.
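One common way to turn an EMG channel into a sound control signal is to track its short-time RMS energy (a rough measure of muscle activation) and map that to loudness, so greater effort is heard as a louder tone. The sketch below is a hypothetical illustration of that pipeline, not the project's actual processing chain; function names and the window size are assumptions.

```python
import math

def rms_envelope(signal, window=50):
    """Short-time RMS of an EMG signal, computed over non-overlapping
    windows -- a simple estimate of muscle activation over time."""
    env = []
    for i in range(0, len(signal) - window + 1, window):
        chunk = signal[i:i + window]
        env.append(math.sqrt(sum(x * x for x in chunk) / window))
    return env

def emg_to_gain(envelope, max_activation):
    """Map activation to a loudness gain in [0, 1]: more effort = louder."""
    return [min(e / max_activation, 1.0) for e in envelope]
```

With one such gain stream per muscle, each muscle can drive its own tone, which is how several muscles can be heard simultaneously yet separately.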