Watching this video made me think of the way that electrodes in a cochlear implant are "tuned" to particular input bands individually for each user - although with many fewer notes available. This process is called mapping, and each CI user gets a unique map made by an audiologist.
It would be interesting to give some of this control to the listener in certain situations. If a particular piece of music sounded really 'wrong' with their normal map, the user might be able to create a specialised map on the spot (and then go back to their normal map afterwards) using some kind of simple graphic equaliser style interface...
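Purely as a hypothetical sketch of the idea (the electrode count, frequency range and data structure here are invented for illustration - real clinical fitting software works very differently): a map assigns each electrode a frequency band and a gain, and a listener-facing equaliser could tweak a gain temporarily while leaving the normal map untouched.

```python
def make_map(low_hz=200, high_hz=8000, n_electrodes=22):
    """Divide the input range into logarithmically spaced bands,
    one per electrode, each starting with a unit gain."""
    edges = [low_hz * (high_hz / low_hz) ** (i / n_electrodes)
             for i in range(n_electrodes + 1)]
    return [{"band": (edges[i], edges[i + 1]), "gain": 1.0}
            for i in range(n_electrodes)]

def adjust(ci_map, electrode, gain):
    """Return a temporary copy of the map with one electrode's gain
    changed, so the user's normal map is never modified in place."""
    tweaked = [dict(channel) for channel in ci_map]
    tweaked[electrode]["gain"] = gain
    return tweaked

normal_map = make_map()
# A one-off "concert map" that quietens one channel, discarded afterwards.
concert_map = adjust(normal_map, electrode=5, gain=0.7)
```

The point of `adjust` returning a copy is exactly the "go back to their normal map" step - the original mapping survives any on-the-spot experimentation.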
Tuesday, December 8, 2009
Monday, December 7, 2009
Visual representations of sound.
I'm always interested in seeing/hearing different ways of representing sound visually - here's one I found today.
Conference news Pt 1
The Music and Pitch team has just returned from Sydney, where we presented two papers at the 2nd International Conference on Music Communication Science (ICoMCS2), hosted at the University of NSW.
There were a lot of interesting papers at the conference. Some of my favourites:
1) An interesting application of sonification technology, in which a sonification of a rowing skiff's acceleration was played back to the rowers to help them smooth the power delivery of their strokes.
2) An interesting plenary presentation about "chills" in music cognition research - those moments when you feel a shiver down your spine, or the hairs on the back of your neck rise. It turns out that chills are fairly reliably and easily elicited, and are also easy to measure using skin conductance, so they can be used to investigate which parts of a piece of music most effectively cause this emotional response.
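The rowing sonification in (1) could be sketched very roughly as mapping each acceleration sample onto a pitch, so that dips in power delivery become audible as drops in frequency. All the numbers and ranges below are invented for illustration, not taken from the paper:

```python
def accel_to_pitch(accel, a_min=-2.0, a_max=4.0,
                   f_min=220.0, f_max=880.0):
    """Linearly map an acceleration value (m/s^2) onto a frequency
    range (Hz), clamping values outside the expected range."""
    a = max(a_min, min(a_max, accel))
    frac = (a - a_min) / (a_max - a_min)
    return f_min + frac * (f_max - f_min)

# One simulated stroke: the surge of the drive, then the glide between
# strokes, turned into a sequence of pitches the rowers would hear.
stroke = [0.5, 2.5, 4.0, 1.5, -0.5, -1.5]
pitches = [accel_to_pitch(a) for a in stroke]
```

A smooth stroke would produce a smooth pitch contour, so any jerkiness in the power delivery stands out to the ear immediately.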
The work that we presented was concerned with the ability to separate a melody from background notes - Jeremy presented some work showing how acoustic factors (in this case the "temporal envelope" or "impulsiveness" of a sound) can affect melody segregation, and I showed some data on how visual information (an animation of the musical score) could also improve the ability to separate the melody.
You can find abstracts for these talks, as well as our own, at the ICoMCS2 website:
http://marcs.uws.edu.au/links/ICoMusic09/program.html
Meanwhile, here's a picture of the (fairly noisy) ferry ride over to the conference each day!
Welcome
Welcome to musicalbionics.
Here we can post quick reviews of useful papers or websites, links to other researchers or companies, sound clips or videos etc.
