Interactional Sound and Music

Interactional - capable of acting on or influencing each other

The Interactional Sound and Music group in the Centre for Digital Music explores new ways of encountering sound, from interactive art to real-time data sonification.

We ask questions such as: How do we design and create engaging interactive soundscapes? What encourages collective action with, and through sound? Does music make the world go round? Can data be sonified beautifully and intuitively? How can algorithms improvise with us in real-time? What do people understand of our new forms of musical expression? How does technology change the music making process? Can we sense the imperceptible? What do diagrams sound like?

To address these questions we use techniques and models ranging from artistic intervention to controlled experiments, and from ethnographic studies and discourse analysis to distributed cognition. Our tools range from Max/MSP and SuperCollider to Java.
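One of the questions above asks whether data can be sonified intuitively. A common starting point is parameter-mapping sonification, where data values are mapped onto an audible parameter such as pitch. The sketch below is purely illustrative and not the group's own method or toolchain (their work uses environments such as Max/MSP and SuperCollider); the function name and frequency range are assumptions.

```python
def sonify(values, f_min=220.0, f_max=880.0):
    """Map a list of data values linearly onto frequencies (Hz).

    The smallest value maps to f_min, the largest to f_max, so the
    data's shape is preserved as a melodic contour. The defaults span
    two octaves (A3 to A5), a comfortable audible range.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]
```

For example, `sonify([0, 5, 10])` yields `[220.0, 550.0, 880.0]`: rising data becomes a rising pitch contour. Real sonification design involves far more than this (psychoacoustic scaling, timing, timbre), which is precisely what the research questions above probe.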

The group is led by Nick Bryan-Kinns, who focuses on the design and evaluation of collectively engaging sound experiences.
Projects in the area include:

  • Interactive real-time musical systems
  • Interactive soundscapes
  • Understanding collaborative music making
  • Interactive data sonification
  • Audio games
  • Methods for designing and evaluating auditory displays
  • Distributed music making systems
  • Musical audio analysis for real-time interaction
  • Interaction design for musical composition
  • Visualizing structured data about music
  • Auditory graphs
  • Cross-modal interaction
  • Auditory overviews
  • Spatialised sound composition
  • Collaborative auditory displays and sonification

As part of the Centre for Digital Music Platform grant we have funded commissions on interactive sound, for example exploring the interaction of pianos, algorithms, physical intervention, and performance.

Key conferences, networks, and journals in the area include: