… the algorithm for the translation of sensor data into music control data is a major artistic area; the definition of these relationships is part of the composition of a piece. Here is where one defines the expression field for the performer, which is of great influence on how the piece will be perceived. —(Waisvisz, 1999)

Artists have been making electronic musical instruments and interactive installations using sensors and computers for several decades now, yet there is still no book available that details the process, approaches and methods of mapping gestures captured by sensors to output media such as sound, light and visuals.

Tools for this kind of mapping keep evolving, and a lot of knowledge is embedded in them. In most cases, however, this knowledge is not documented outside of the implementation itself. This raises the question of how we can preserve knowledge of how a performance or instrument works when tools become obsolete, file formats are inaccessible (or undocumented) and source code is unavailable. How can we learn from what other artists have made before us? How can artists communicate about their approaches to mapping if they use different tools to do so?

Just a Question of Mapping

Ins and outs of composing with realtime data

The book aims to give an overview of the process of mapping and of common techniques that can be used in it. These methods will be described in a way that gives artists a guideline for when to use each method and how to implement it in the environment they work in. Examples of implementations of these methods will be provided separately from the book, in a repository to which readers can contribute.

The book will have two parts:

  • Part one is a general description of the approaches and methods. This can be read front to back, or can be used as a reference to look methods up.
  • Part two describes concrete works (instruments, performances and/or installations) based on interviews and an in-depth study of their implementations, with references to the methods described in the first part of the book.

Outline of part one

  1. Introduction
    • Electricity and new musical instruments
    • Beyond music
    • Arbitrary?
    • Discussing the aesthetics of mapping
  2. Views on mapping
    • Contexts
      • music
      • dance
      • media art
      • interaction design
      • science
    • Formulating the problem
  3. Starting points
    • Imagining the instrument
    • If this were my instrument
    • Sonification (or perceptualisation)
  4. Elements of an instrument
    • Sensors
    • Electronic circuits
    • Voltage controlled (modular) systems
    • Microcontrollers
    • Digital protocols
    • Wireless communication
    • Computers
    • Software & computation
    • Output
  5. Communication between the elements
    • The physical connection
    • Digital protocols used by sensors and actuators
    • Serial protocols
    • HID
    • MIDI
    • OSC
    • DMX
    • Art-Net
    • Wireless
  6. Interface
    • Semantics of the interface
    • Effort and ease of use
    • Within and out of reach
  7. What does the data look like?
    • Timescale
    • Dimensionality
    • Range, resolution, linearity and reproducibility
  8. Events and time
  9. From one range to another (code sketch after the outline)
    • Introduction
    • From your input value to a standardized range
    • Unipolar and bipolar signals
    • Inverting the range
    • From a standardized range to a parameter range
    • Nonlinear approaches
    • Segmenting the range
    • Using a table
  10. Thresholds, states and modes of behaviour (code sketch after the outline)
    • Crossing a threshold
    • False triggers
    • States and conditions
    • Combining events and datastreams
    • Modal control
    • Picking up where you left off
  11. Changes, variation and filtering (code sketch after the outline)
    • Slope
    • Realtime statistics
    • Moving average
    • Deviation from the mean
    • Envelope following
  12. Computational behaviours
  13. Exploring the mapping space
  14. Using machine learning
  15. Transitions in the output
  16. Tuning
    • Calibration (code sketch after the outline)
    • Dealing with quantisation
    • Dealing with jitter
    • Dealing with loss of signal
    • Order of processing
  17. Conclusion
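
By way of illustration, and separate from the example repository the book itself will point to, here is a minimal Python sketch of the range-mapping steps listed in chapter 9: normalizing an input value to a standardized unipolar range, optionally inverting it, converting it to a bipolar signal and scaling it to a parameter range with an optional nonlinear curve. All function names and values are illustrative placeholders, not taken from the book or from any particular tool.

```python
def normalize(value, in_min, in_max, invert=False):
    """Map a raw sensor value from [in_min, in_max] to the unipolar range [0, 1]."""
    span = in_max - in_min
    x = (value - in_min) / span if span else 0.0
    x = min(max(x, 0.0), 1.0)            # clamp readings that stray outside the range
    return 1.0 - x if invert else x      # optionally invert the range


def to_bipolar(x):
    """Convert a unipolar [0, 1] signal to a bipolar [-1, 1] signal."""
    return 2.0 * x - 1.0


def to_parameter(x, out_min, out_max, curve=1.0):
    """Map [0, 1] onto a parameter range; curve > 1 gives more
    resolution near out_min (a simple nonlinear approach)."""
    return out_min + (out_max - out_min) * (x ** curve)


# Example: a 10-bit sensor reading (0..1023) controls a filter cutoff in Hz.
raw = 612
cutoff = to_parameter(normalize(raw, 0, 1023), 80.0, 8000.0, curve=3.0)
print(round(cutoff, 1))
```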
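
In the same spirit, a sketch of one common way to handle the threshold crossings and false triggers named in chapter 10: using two thresholds (hysteresis), so that a noisy signal hovering around a single threshold does not fire repeatedly. Again, names and values are illustrative only.

```python
class Trigger:
    """Rising-edge detector with hysteresis."""

    def __init__(self, on_threshold=0.6, off_threshold=0.4):
        # Two thresholds: the signal must fall below off_threshold
        # before a new crossing of on_threshold counts as an event.
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.active = False

    def update(self, value):
        """Feed one sample; return True only on a rising crossing."""
        if not self.active and value > self.on_threshold:
            self.active = True
            return True
        if self.active and value < self.off_threshold:
            self.active = False
        return False


# Example: a noisy stream hovering around the threshold fires only once,
# then again after the signal has dropped back below the lower threshold.
trigger = Trigger()
stream = [0.1, 0.55, 0.62, 0.58, 0.61, 0.63, 0.3, 0.7]
events = [i for i, v in enumerate(stream) if trigger.update(v)]
print(events)   # fires at index 2 and again at index 7
```

In practice this is often combined with a minimum time between events (debouncing) to suppress further false triggers.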
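
For chapter 11, a sketch of two of the filtering methods listed there: a moving average and a simple envelope follower with separate attack and release coefficients. The class names and coefficients are illustrative, not prescriptive.

```python
from collections import deque


class MovingAverage:
    """Average the last `size` samples to smooth out variation."""

    def __init__(self, size=8):
        self.window = deque(maxlen=size)

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)


class EnvelopeFollower:
    """Track the magnitude of a signal with separate attack and release."""

    def __init__(self, attack=0.5, release=0.05):
        # Per-sample smoothing factors in (0, 1]; larger means faster response.
        self.attack = attack
        self.release = release
        self.level = 0.0

    def update(self, value):
        x = abs(value)                    # follow the magnitude of the signal
        coeff = self.attack if x > self.level else self.release
        self.level += coeff * (x - self.level)
        return self.level


smooth = MovingAverage(size=4)
env = EnvelopeFollower()
for v in [0.0, 0.9, -0.8, 0.1, 0.0, 0.0]:
    print(round(smooth.update(v), 3), round(env.update(v), 3))
```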
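
Finally, for the calibration step in chapter 16, a sketch of one possible approach: recording the minimum and maximum values actually observed during a calibration phase and using them to scale (and clamp) later readings. This is one option among many, and the names are again placeholders.

```python
class Calibration:
    """Learn the usable range of a sensor from observed values."""

    def __init__(self):
        self.low = None
        self.high = None

    def observe(self, value):
        """Call during the calibration phase, e.g. while the performer
        explores the full range of a gesture."""
        self.low = value if self.low is None else min(self.low, value)
        self.high = value if self.high is None else max(self.high, value)

    def apply(self, value):
        """Map a value to [0, 1] using the calibrated range, clamped."""
        if self.low is None or self.high == self.low:
            return 0.0
        x = (value - self.low) / (self.high - self.low)
        return min(max(x, 0.0), 1.0)


cal = Calibration()
for v in [312, 540, 498, 710, 455]:   # hypothetical readings while calibrating
    cal.observe(v)
print(cal.apply(600))                  # later readings scaled to the learned range
```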

Case studies for part two

  • Andi Otto’s Fello.
  • Jeff Carey’s digital instrument consisting of a joystick, a keypad, pads & faders.
  • Roosna & Flak’s ongoing explorations with dance, accelerometer sensors, sound and light since 2013.
  • STEIM’s software: Spider (Sensorlab), JunXion, LiSa and RoSa.

I am still looking for a good example of an interactive installation as a case study.