Georgios Diapoulis

Continuous gestural interactions in live coding

When

Thematic Session 5: Mapping and Control (Tuesday, 15:30)

Abstract

Live coding is a performance practice in which musicians are indirectly involved with a generative music system. Notation is necessary for canonical live coding systems, where the composer-programmer interacts using a keyboard. Typing on a keyboard requires skilled serial actions; the same applies to music performance. In contrast, hybrid interfaces move away from the "standard paradigm" of live coding, which is also reflected in the performer's gestural interactions. In this workshop, I will present a hybrid live coding interface that affords continuous gestural interactions. The user provides input using a single knob, and a generative system encodes the movements into a formal language. The system design blends notions of data and computation, which offers many possibilities for developing musical AI applications. For example, the rate of change of bodily motion can be mapped to the input interface to generate many mini-languages on-the-fly. Such systems can sense kinematic movement and translate it into a computable representation, bridging pre-reflective experience and conscious processes. Aspects of completeness are addressed, and various predictive algorithms are discussed. An interactive demo will be presented.
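As a rough illustration of the kind of mapping the abstract describes, the sketch below quantizes the rate of change of a single continuous control value (e.g. a knob reading) into symbolic tokens that could feed a small formal language. It is a minimal sketch only; the function names, thresholds, and token labels are hypothetical and are not taken from the presented system.

    # Hypothetical sketch: encode a knob's rate of change as symbolic tokens.
    # All names and thresholds are illustrative, not the system in the abstract.

    from typing import Iterable, List

    def encode_knob_stream(values: Iterable[float], dt: float = 0.05) -> List[str]:
        """Map successive knob readings (0.0-1.0) to tokens based on the
        sign and magnitude of their approximate rate of change."""
        tokens = []
        prev = None
        for v in values:
            if prev is not None:
                rate = (v - prev) / dt          # finite-difference derivative
                if abs(rate) < 0.1:
                    tokens.append("hold")        # near-static gesture
                elif rate > 0:
                    tokens.append("rise-fast" if rate > 1.0 else "rise")
                else:
                    tokens.append("fall-fast" if rate < -1.0 else "fall")
            prev = v
        return tokens

    # Example: a slow rise, a sudden jump, then a quick drop
    print(encode_knob_stream([0.10, 0.12, 0.15, 0.60, 0.20]))
    # -> ['rise', 'rise', 'rise-fast', 'fall-fast']

The resulting token stream is one possible "mini-language" alphabet; changing the thresholds or sampling interval on-the-fly would yield a different language from the same physical gesture.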

Bio

I am a PhD student in Interaction Design at Chalmers University of Technology and the University of Gothenburg, working with generative algorithms for machine musicianship. I have a bachelor's in Materials Science and Technology from the U. of Crete, Heraklion, Greece, and a master's in Music, Mind and Technology from the U. of Jyväskylä, Jyväskylä, Finland. I enjoy practising and performing live coding, with a focus on gestural interactions in music performance. I build software and hardware prototypes and experiment with interactive AI and machine listening technologies.
