CUTE | 2015
2015 MASTERCLASS SERIES ON CULTURE AND TECHNOLOGY, AUGUST 28-29, 2015
CUTE is a yearly masterclass series on culture and technology organized by the Numediart research institute. We bring together a panel of world-renowned experts in various high-tech fields and have them meet audiences of all kinds (research, arts, industry) in a series of hands-on workshops.
Location and Participation
All keynotes will take place in the Academic Room.
31, Boulevard Dolez
For more information on how to reach the location, please click here.
All eNTERFACE’15 participants are automatically registered to Cute|2015.
If you do not participate in eNTERFACE’15 but still wish to attend Cute|2015, please complete the registration form here.
Prof. Petri Toiviainen
University of Jyväskylä
Introduction to the MoCap Toolbox
The MoCap Toolbox is a Matlab toolbox for the analysis and visualization of data collected with motion capture devices. It is mainly aimed at investigating music-related movement. It contains tools for preprocessing and transforming motion capture data, as well as for kinematic and kinetic analysis and visualization. Since the toolbox is available as open source, users can freely adapt its functions to their needs. The workshop will begin with an overview of the content and functionality of the toolbox, followed by hands-on work in which various motion capture data are analyzed and visualized.
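As a rough illustration of the kind of kinematic analysis the toolbox performs, velocity can be estimated from marker position samples by numerical differentiation. The toolbox itself is written in Matlab; the following is an independent Python sketch of the idea, not toolbox code, and the marker data are invented.

```python
import numpy as np

def time_derivative(positions, fps):
    """Estimate per-frame velocity from position samples.

    positions: (n_frames, n_dims) array of marker coordinates
    fps: capture rate in frames per second
    Uses central differences (one-sided at the ends).
    """
    return np.gradient(positions, 1.0 / fps, axis=0)

# A marker moving at a constant 2 units/s along x, captured at 100 fps.
t = np.arange(0, 1, 0.01)
positions = np.column_stack([2.0 * t, np.zeros_like(t), np.zeros_like(t)])

velocity = time_derivative(positions, fps=100)
speed = np.linalg.norm(velocity, axis=1)  # scalar speed per frame
```

On this synthetic linear trajectory, every frame's speed comes out as the constant 2.0; on real capture data one would typically low-pass filter the positions first, since differentiation amplifies measurement noise.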
Speaker’s Brief Bio
Petri Toiviainen is a Professor of Music at the University of Jyväskylä and the leader of the Finnish Centre of Excellence in Interdisciplinary Music Research. His research interests include music and movement, perception of rhythm and tonality, emotions in music, sound and music computing, and music visualization. He has published several articles and given a number of keynote talks on these topics, and is an editorial board member of a number of journals. He is also a co-author of several widely used software tools for music analysis, including the MIDI Toolbox, the MIRToolbox, and the Motion Capture Toolbox.
Dr. Anthony Brooks
School of Media Technology
University of Aalborg
Control from Non-Control: Digital Media Plasticity – Human Performance Plasticity
The presentation originally scheduled for 28 August at 14h00 was canceled.
Prof. Todor Todoroff will lecture in place of Dr. Anthony Brooks on the same date.
Prof. Todor Todoroff
Virtual instruments, sensors and mapping to enhance musical expressivity
This journey started in the early nineties on the ISPW, with the aim of offering an analog type of gestural control over digital sound processing algorithms and of creating playable, expressive virtual instruments, at a time when most systems still relied on the definition of a “score” rather than on playing instruments in real time.
These instruments were at first intended for electroacoustic music composers in the studio, allowing them to revive and transpose the notion of “sequence-jeu” into the digital world and providing new ways to explore more freely and thoroughly the potential of sound transformation and spatialisation algorithms. Increasingly sensitive and miniaturized sensors and better methods for tracking movement, combined with huge increases in computing power, pushed the concept towards live concerts, dance performances and interactive installations, where mapping is also used for light, video or even fire.
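A central step in such systems is mapping raw sensor readings onto synthesis parameters. The sketch below is a generic illustration of that idea in Python, not Todoroff's actual system; the sensor range, filter coefficient, and parameter names are invented for the example.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Map a raw sensor reading linearly onto a synthesis-parameter range."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)          # clamp to avoid out-of-range jumps
    return out_lo + t * (out_hi - out_lo)

def smooth(samples, alpha=0.2):
    """One-pole low-pass filter to tame sensor jitter before mapping."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# A 10-bit accelerometer axis (0..1023) driving a filter cutoff in Hz.
raw = [0, 512, 1023]
cutoff = [scale(v, 0, 1023, 100.0, 8000.0) for v in raw]
```

In practice the mapping is rarely a single linear scaling: layered, many-to-many mappings between gesture features and sound parameters are what make such instruments feel expressive rather than mechanical.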
The talk will be illustrated with some hands-on moments as well as video excerpts of dance performances, installations and concerts.
Speaker’s Brief Bio
An electrical engineer from the Université Libre de Bruxelles (ULB), Todor Todoroff received a First Prize and a Higher Degree in Electroacoustic Composition from the Royal Conservatories of Brussels and Mons.
Co-founder and president of ARTeM and FeBeME, he was a researcher at ULB, the Faculté Polytechnique de Mons and the Numediart Institute, and was the Belgian representative in the EU-COST actions “Digital Audio Effects” and “Gesture Controlled Audio Systems”.
Within ARTeM he developed hardware and software interactive systems for concerts, sound installations and dance performances, using a wide variety of sensors.
His electroacoustic music shows a special interest in sound spatialisation and in research into new forms of interaction and sound transformation. Fascinated by the dialogue with other art forms, he also composes music for film, video, dance, theatre and sound installations. He has collaborated with Belgian choreographer Michèle Noiret since 1998 and has worked with artists such as Marie-Jo Lafontaine, FOAM, Fred Vaillant, Mario Benjamin and Laura Colmenares Guerra.
His music has won prizes in several international competitions and is regularly performed at international festivals.
He has received commissions from IMEB, the Paris Opera, Art Zoyd, Musiques Nouvelles, ZKM, the Festival van Vlaanderen, etc.
Prof. Rebecca Fiebrink
Goldsmiths, University of London
Machine learning as a tool for designing embodied interactions
Machine learning algorithms can be understood not only as a set of techniques for building accurate models of data, but also as design tools that enable rapid prototyping, iterative refinement, and embodied engagement, all activities that are crucial in the design of new musical instruments and other embodied interactions. Realising the creative potential of these algorithms requires rethinking the interfaces through which people provide data and build models, providing tight interaction-feedback loops and efficient mechanisms for people to steer and explore algorithm behaviours.
I created the Wekinator software in 2009 to enable composers, game designers, and other creative practitioners to apply such an interactive approach to machine learning in their work. In this masterclass, I’ll share some of my findings from six years of observing this software in use in creative contexts, and I’ll give a demonstration of the new version of the Wekinator software (released this summer). These will serve as starting points for a discussion of the future of data and machine learning as design tools: techniques that enable more effective human creative work, and techniques that can scaffold creative work by a greater diversity of people working in different contexts.
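The interactive workflow described above can be sketched, very roughly, as training a simple model on a handful of user-demonstrated examples and immediately querying it with new input. The following is a minimal nearest-neighbour sketch in Python, not Wekinator code; the gesture names and feature values are invented.

```python
import numpy as np

def nearest_neighbour(train_x, train_y, query):
    """Classify a query by the label of its closest training example.

    Mirrors the interactive loop: the user records a few input/output
    example pairs, and the model immediately generalises them to new
    inputs, with no large dataset or long training run required.
    """
    dists = np.linalg.norm(train_x - query, axis=1)
    return train_y[int(np.argmin(dists))]

# Hypothetical 2-D sensor features for two demonstrated gestures.
examples = np.array([[0.1, 0.2], [0.2, 0.1],   # gesture "swipe"
                     [0.9, 0.8], [0.8, 0.9]])  # gesture "circle"
labels = ["swipe", "swipe", "circle", "circle"]

prediction = nearest_neighbour(examples, labels, np.array([0.85, 0.85]))
```

Because retraining on a few examples is effectively instant, the user can test the mapping, record corrective examples, and retrain in a tight loop, which is exactly the kind of iterative, embodied refinement the talk describes.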
Speaker’s Brief Bio
Rebecca Fiebrink creates new technologies for digital music and art, and she designs new ways for humans to interact with computers in creative practice. Much of her current research combines techniques from human-computer interaction, machine learning, and signal processing to allow people to apply machine learning more effectively to new problems, such as the design of new digital musical instruments and gestural interfaces for gaming and health. She is also involved in projects developing rich interactive technologies for digital humanities scholarship, and in designing new approaches to integrating the arts into computer science teaching and outreach.
Rebecca is the developer of the Wekinator system for interactive machine learning. She has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research (recently acquired by iZotope), and Smule, where she helped to build the #1 iTunes app “I am T-Pain.” An active musician, she has performed regularly with a variety of musical ensembles, including as a laptopist in Sideband, the principal flutist in the Timmins Symphony Orchestra, and the keyboardist in the University of Washington computer science rock band “The Parody Bits.” Prior to arriving at Goldsmiths, she held a faculty position at Princeton University.
Mr. James Morley
Creative Industries Community Developer
Creative re-use of digital cultural heritage – opportunities, challenges, approaches, impact
The presentation originally scheduled for 29 August at 14h00 was canceled.
The CUTE logo has been designed by Simon Gastout.