
An Action–Sound Approach to Teaching Interactive Music

Published online by Cambridge University Press:  11 July 2013

Alexander Refsum Jensenius*
Affiliation:
fourMs lab, Department of Musicology, University of Oslo, PB 1017 Blindern, 0315 Oslo, NORWAY E-mail: a.r.jensenius@imv.uio.no

Abstract

The conceptual starting point for an ‘action–sound approach’ to teaching music technology is the acknowledgment of the couplings that exist in acoustic instruments between sounding objects, sound-producing actions and the resultant sounds themselves. Digital music technologies, on the other hand, are not limited to such natural couplings, but allow for arbitrary new relationships to be created between objects, actions and sounds. The endless possibilities of such virtual action–sound relationships can be exciting and creatively inspiring, but they can also lead to frustration among performers and confusion for audiences. This paper presents the theoretical foundations for an action–sound approach to electronic instrument design and discusses the ways in which this approach has shaped the undergraduate course titled ‘Interactive Music’ at the University of Oslo. In this course, students start out by exploring various types of acoustic action–sound couplings before moving on to designing, building, performing and evaluating both analogue and digital electronic instruments from an action–sound perspective.

Type: Articles
Copyright © Cambridge University Press 2013

