
Conference Program
Tuesday 15th JUNE
09:00 Morning Tutorials
› Hardware Hacking*
› iPhone Instrument Design*
› NIME Primer: An Overview of the First Ten Years
› Impromptu Live Coding Tutorial
› Instrument Design, the Next Generation: Musical Concepts
12:30 Lunch
14:00 Afternoon Tutorials
› Hardware Hacking*
› iPhone Instrument Design*
› Hands on Speech and Singing Synthesis
› AudioMulch Tutorial @ NIME 2010
› I-CubeX sensor products
19:00 Keynote | Alternate Embodiments / Alternate Interfaces / Stelarc
20:00 Stelarc Installation
21:30 Mobile Concert
* Full Day Tutorials

Wednesday 16th JUNE
09:00 Papers A1-A5 | Controllers and Interfaces for Musical Expression
10:30 Break
11:00 Papers B1-B5 | Software Tools and Design
12:30 Lunch with Posters / Demos N1-N20
14:00 Papers C1-C5 | Sonification
15:30 Break
16:00 Papers D1-D3 | Mobile Technologies (Full Papers)
18:30 Conference Dinner
20:30 Club Performance

Thursday 17th JUNE
09:00 Papers E1-E5 | Computational Interfaces and Methods
10:30 Break
11:00 Papers F1-F5 | Musical Mapping Strategies
12:30 Lunch with Posters / Demos O1-O20
14:00 Papers G1-G5 | New Interfaces and Robotic Music
15:30 Break
16:00 Papers H1-H4 | Mobile Technologies (Short Papers)
17:30 Concert
19:00 Concert | Ensemble Offspring
21:30 Club Performance

Friday 18th JUNE
09:00 Keynote | By Hand / Prof. Nicolas Collins
10:00 Papers J1-J5 | Controllers for Collaborative Performance
10:00 Papers K1-K5 | Augmented Instruments
11:30 Break
12:00 Panel Session
13:00 Lunch with Posters / Demos Q1-Q15
14:00 Papers M1-M4 | Sensor and Actuator Technologies
14:00 Papers L1-L5 | Gesture and Music
15:30 Break
16:00 Papers P1-P5 | Live Performance, Algorithms and Rendering
17:30 Concert (Repeat)
19:00 Concert | Charisma Ensemble and Electro-Acoustic

Keynote Addresses

Tuesday 15th JUNE 19:00 - 20:00

Alternate Embodiments / Alternate Interfaces
Stelarc

www.stelarc.va.com.au

The Articulated Head is an outcome of the Thinking Head research project. It is an alternate embodiment for an AI conversational agent: an LCD screen displaying the agent is mounted on a six-degree-of-freedom robot arm. Equipped with a stereovision system and capable of auditory localization, the robot and its attention model can detect human presence and respond appropriately with a library of physical behaviours. Coupled with the virtual and verbal behaviour of the computer head, the system generates an appropriate sense of aliveness and engagement with its interlocutor. Its intelligence results from its appropriate behaviour and conversational capabilities.

The Ear on Arm project considers an alternate architecture for the human body: surgically constructing and stem-cell growing an extra ear on the body. This ear is not so much a listening device as a transmission device. At present, the ear on the arm is only a relief of an ear. Once the helix is lifted to construct an ear flap and a soft ear lobe is grown using the artist's stem cells, a microphone will be re-inserted which, attached to a wireless transmitter, will internet-enable the ear, allowing people in other places to listen in to what the ear is hearing. A bodily structure has been replicated, relocated and will now be rewired for additional capabilities.

We already perform in multi-modal ways with Mixed Realities. Our bodies are accelerated by machines, our sensory systems extended by instruments and we visualise and manage data streams with virtual systems. There is a need to consider both alternate embodiments and alternate interfaces. Humans will increasingly interact with robot and virtual systems both in local and remote spaces. Perhaps what we need now is not a Second Life but rather a Third Life - an inverse motion capture system that allows avatars to access and actuate human surrogate bodies in the 3D space.


Friday 18th JUNE 09:00 - 10:00

By Hand
Nicolas Collins

www.nicolascollins.com

It is perhaps a general human habit to view the technological and the organic as opposites. It is certainly the case that the phrase 'Live Electronic Music' strikes many a music fan as oxymoronic or, at best, disingenuous (the laptop performer who's quite likely Facebooking). Isn't the purpose of electronics to do things for us so we don't have to do them 'live' ourselves? To record, perfect and play back performances so we can listen while cycling stationarily? To facilitate the creation of inhumanely intricate compositions that spew themselves out of speakers at the touch of a button, instead of all that messy sliding about on strings or valving spit? While there is no question that composers of tape music and computer music (and a fair number of Pop music producers as well) have long employed electronics to exactly these ends, the pre-history of NIME is punctuated by artists and inventors who sought to create not just new sounds, but new instruments for organizing and manipulating sound as well. Some of these devices are well known and well documented, and have had significant influence on subsequent musical developments. But many never passed from the hands of their makers into the wider world, were never detailed in papers at conferences, and have had word-of-mouth reputations at best.

I will discuss several of my own instruments, built and played from 1974 to the present, but poorly documented and disseminated: a self-stabilizing feedback network that responds to movement, sound and weather; sound-resonated "backwards" electric guitars; a trombone-propelled digital signal processor; software that translates the inflection of speech into piano accompaniment; a circuit that requires 12 hands to play; a candle-powered oscillator. I will illustrate some problems endemic to live electronic performance, highlight some solutions, and hopefully contribute something back into the knowledge pool of the NIME community.



PAPER SESSIONS

Wednesday 16th JUNE

Controllers and Interfaces for Musical Expression | A1-A5

A Shift Towards Iterative and Open-Source Design for Musical Interfaces
Owen Vallis, Jordan Hochenbaum and Ajay Kapur

UnitInstrument: Easy Configurable Musical Instruments
Yutaro Maruyama, Yoshinari Takegawa, Tsutomu Terada and Masahiko Tsukamoto

The Loudspeaker as Musical Instrument
Jos Mulder

An Ultrasound Based Instrument Generating Audible and Tactile Sound
Miha Ciglar

Neurohedron: A Nonlinear Sequencer Interface
Ted Hayes

 

Software Tools and Design | B1-B5

Designing Custom-made Metallophone with Concurrent Eigenanalysis
Nobuyuki Umetani, Jun Mitani and Takeo Igarashi

Freepad: A Custom Paper-based MIDI Interface
Sungkuk Chun, Andrew Hawryshkewich, Keechul Jung and Philippe Pasquier

Music Programming in Minim
John Anderson Mills III, Damien Di Fede and Nicolas Brix

An Epistemic Dimension Space for Musical Devices
Thor Magnusson

Investigating the Potential for Shared Agency Using Enactive Interfaces
A. Baki Kocaballi, Petra Gemeinboeck and Rob Saunders

 

Sonification | C1-C5

Cuebert: A New Mixing Board Concept for Musical Theatre
Noah Liebman, Michael Nagara, Jacek Spiewla and Erin Zolkosky

Dynamic Interactivity Inside the AlloSphere
Charles Roberts, Matthew Wright, JoAnn Kuchera-Morin, Lance Putnam and Graham Wakefield

Creating Meaningful Melodies from Text Messages
Florian Alt, Alireza Sahami Shirazi, Stefan Legien, Albrecht Schmidt and Julian Mennenöh

Epi-thet: A Musical Performance Installation and a Choreography of Stillness
Tim Humphrey, Madeleine Flynn and Jesse Stevens

From Mozart to MIDI: A Rule System for Expressive Articulation
Tilo Hähnel

 

Mobile Technologies (Full Papers) | D1-D3

Designing Mobile Musical Instruments and Environments with UrMus
Georg Essl and Alexander Mueller

Evolving the Mobile Phone Orchestra
Jieun Oh, Jorge Herrera, Nicholas J. Bryan, Luke Dahl and Ge Wang

Mapping Out Instruments, Affordances, and Mobiles
Atau Tanaka


Thursday 17th JUNE

Computational Interfaces and Methods | E1-E5

Composing for Improvisation with Chaotic Oscillators
Mark Havryliv

Beatback: A Real-time Interactive Percussion System for Rhythmic Practise and Exploration
Andrew Hawryshkewich, Philippe Pasquier and Arne Eigenfeldt

Style and Constraint in Electronic Musical Instruments
Michael Gurevich, Paul Stapleton and Adnan Marquez-Borbon

LUSH: An Organic Eco+Music System
Hongchan Choi and Ge Wang

TwinkleBall: A Wireless Musical Interface for Embodied Sound Media
Tomoyuki Yamaguchi, Tsukasa Kobayashi, Anna Ariga and Shuji Hashimoto

 

Musical Mapping Strategies | F1-F5

Expression and Spatial Motion: Playable Ambisonics
Joanne Cannon and Stuart Favilla

Contrary Motion: An Oppositional Interactive Music System
Nick Collins

Images as Spatial Sound Maps
Etienne Deleflie and Greg Schiemer

Relationship-Based Instrument Mapping of Multi-Point Data Streams Using a Trackpad Interface
Kevin Schlei

Instrumentalizing Synthesis Models
Lonce Wyse and Nguyen Dinh Duy

 

New Interfaces and Robotic Music | G1-G5

ScoreLight: Playing with a Human-sized Laser Pickup
Alvaro Cassinelli, Yusaku Kuribara, Alexis Zerroug, Masatoshi Ishikawa and Daito Manabe

Disky: A DIY Rotational Interface with Inherent Dynamics
Karl Yerkes, Greg Shear and Matthew Wright

Development of the Waseda Saxophonist Robot and Implementation of an Auditory Feedback Control
Jorge Solis, Klaus Petersen, Tetsuro Yamamoto, Masaki Takeuchi, Shimpei Ishikawa, Atsuo Takanishi and Kunimatsu Hashimoto

A Pedagogical Paradigm for Musical Robotics
Ajay Kapur and Michael Darling

A Robot Musician Interacting with a Human Partner through Initiative Exchange
Ye Pan, Min-Gyu Kim and Kenji Suzuki

 

Mobile Technologies (Short Papers) | H1-H4

Introducing L2Ork: Linux Laptop Orchestra
Ivica Ico Bukvic, Thomas Martin, Eric Standley and Michael Matthews

MoMu: A Mobile Music Toolkit
Nicholas J. Bryan, Jorge Herrera, Jieun Oh and Ge Wang

Sound Bounce: Physical Metaphors in Designing Mobile Music Performance
Luke Dahl and Ge Wang

Use the Force (or something): Pressure and Pressure-Like Input for Mobile Music Performance
Georg Essl, Michael Rohs and Sven Kratz


Friday 18th JUNE

Controllers for Collaborative Performance | J1-J5

Dislocated Sound: A Survey of Improvisation in Networked Audio Platforms
Roger Mills

DRILE: An Immersive Environment for Hierarchical Live-looping
Florent Berthaut, Myriam Desainte-Catherine and Martin Hachet

Hey Man, You're Invading my Personal Space! Privacy and Awareness in Collaborative Music
Robin Fencott and Nick Bryan-Kinns

Cross-Artform Performance Using Networked Interfaces: Last Man to Die's Vital LMTD
Charles Martin, Benjamin Forster and Hanna Cormick

Evaluating the Subjective Effects of Microphone Placement on Glass Instruments
Alexander Refsum Jensenius, Kjell Tore Innervik and Ivar Frounberg

 

Augmented Instruments | K1-K5

Glitch Delighter: Lighter's Flame Base Hyper-Instrument for Glitch Music in Burning the Sound Performance
Rudolfo Quintas

Augmenting the Acoustic Piano with Electromagnetic String Actuation and Continuous Key Position Sensing
Andrew McPherson and Youngmoo Kim

Developing a Hybrid Contrabass Recorder: Resistances, Expression, Gestures and Rhetoric
Cesar Marino Villavicencio Grossmann

The Bowed Tube: A Virtual Violin
Alfonso Perez Carrillo and Jordi Bonada

Multimodal Musician Recognition
Jordan Hochenbaum, Ajay Kapur and Matthew Wright

 

Gesture and Music | L1-L5

A Left Hand Gesture Caption System for Guitar Based on Capacitive Sensors
Enric Guaus, Tan Ozaslan, Eric Palacios and Josep Lluis Arcos

Support Vector Machine Learning for Gesture Signal Estimation with a Piezo-Resistive Fabric Touch Surface
Andrew Schmeder and Adrian Freed

Motion to Gesture To Sound: Mapping for Interactive Dance
Jan C. Schacher

Generative Improvisation and Interactive Music Project (GIIMP)
Ian Whalley

Searching for Cross-individual Relationships between Sound and Movement Features Using an SVM Classifier
Kristian Nymoen, Kyrre Glette, Ståle Skogstad, Jim Torresen and Alexander R. Jensenius

 

Live Performance, Algorithms and Rendering | P1-P5

"VirtualPhilharmony": A Conducting System with Heuristics of Conducting an Orchestra
Takashi Baba, Mitsuyo Hashida and Haruhiro Katayose

New Sensors and Pattern Recognition Techniques for String Instruments
Tobias Grosshauser, Ulf Großekathöfer and Thomas Hermann

Expressive Articulation for Synthetic Music Performances
Tilo Hähnel and Axel Berndt

Network Jamming: Distributed Performance Using Generative Music
Andrew R. Brown

Glass Instruments: From Pitch to Timbre
Ivar Frounberg, Kjell Tore Innervik and Alexander Refsum Jensenius

 

Sensor and Actuator Technologies | M1-M4

A Malleable Interface for Sonic Exploration
Chris Kiefer

OSC Virtual Controller
Victor Zappi, Andrea Brogni and Darwin Caldwell

Extending the Soundcard for Use with Generic DC Sensors: Demonstrated by Revisiting a Vintage ISA Design
Smilen Dimitrov

Disembodied and Collaborative Musical Interaction in the Multimodal Brain Orchestra
Sylvain Le Groux, Jonatas Manzolli and Paul F.M.J. Verschure



DEMONSTRATIONS and POSTERS

Wednesday 16th June

Session N1-N20

Designing Expressive Musical Interfaces for Tabletop Surfaces
Jordan Hochenbaum, Owen Vallis, Dimitri Diakopoulos, Jim Murphy and Ajay Kapur

Toward Algorithmic Composition of Expression in Music Using Fuzzy Logic
Wendy Suiter

Expressive Wearable Sonification and Visualisation: Design and Evaluation of a Flexible Display
Kirsty Beilharz, Andrew Vande Moere, Barbara Stiel, Claudia Calo, Martin Tomitsch and Adrian Lombard

Understanding and Evaluating User Centred Design in Wearable Expressions
Jeremiah Nugroho and Kirsty Beilharz

Online Map Interface for Creative and Interactive Music-Making
Sihwa Park, Seunghun Kim, Samuel Lee and Woon Seung Yeo

Analysis of Piano Playing Movements Spanning Multiple Touches
Aristotelis Hadjakos and Max Mühlhäuser

Designing a Shareable Musical TUI
Sebastian Heinz and Sile O'Modhrain

Visualizations and Interaction Strategies for Hybridization Interfaces
Adrian Freed, John MacCallum, Andrew Schmeder and David Wessel

ANTracks 2.0: Generative Music on Multiple Multitouch Devices
Björn Wöldecke, Christian Geiger, Holger Reckter and Florian Schulz

Hé (?): Calligraphy as a Musical Interface
Laewoo Kang and Hsin-Yi Chien

The Sponge: A Flexible Interface
Martin Marier

SurfaceMusic: Mapping Virtual Touch-based Instruments to Physical Models
Lawrence Fyfe, Sean Lynch, Carmen Hull and Sheelagh Carpendale

Mechanisms for Controlling Complex Sound Sources: Applications to Guitar Feedback Control
Aengus Martin, Sam Ferguson and Kirsty Beilharz

Wireless Sensor Data Collection Based on ZigBee Communication
Jim Torresen, Eirik Renton and Alexander Refsum Jensenius

Synchronization of Multimodal Recordings for Musical Performance Research
Javier Jaimovich and R. Benjamin Knapp

POLLEN: A Multimedia Interactive Network Installation
Giuseppe Torre, Mark O'Leary and Brian Tuohy

Irregular Incurve
Xiaoyang Feng


Thursday 17th June

Session O1-O20

Peacock: A Non-haptic 3D Performance Interface
Chikashi Miyama

Associating Emoticons with Musical Genres
Jukka Holm, Harri Holm and Jarno Seppänen

Untouchable Instrument "Peller-Min"
Yoichi Nagashima

Ground Me! An Interactive Sound Art Installation
Javier Jaimovich

Mmmmm: A Multi-modal Mobile Music Mixer
Norma Saiph Savage, Syed Reza Ali and Norma Elva Chavez

An Interactive Responsive Skin for Music Performers, AIDA
Chih-Chieh Tsai, Cha-Lin Liu and Teng-Wen Chang

Interactional Sound and Music: Listening to CSCW, Sonification, and Sound Art
Nick Bryan-Kinns, Robin Fencott, Oussama Metatla, Shahin Nabavian and Jennifer Sheridan

Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction
Ståle A. Skogstad, Alexander Refsum Jensenius and Kristian Nymoen

"playing_robot": An Interactive Sound Installation in Human-Robot Interaction Design for New Media Art
Benjamin Buch, Pieter Coussement and Lüder Schmidt

Multimodal Guitar: A Toolbox For Augmented Guitar Performances
Loïc Reboursière, Christian Frisson, Otso Lähdeoja, J. Anderson Mills III, Cécile Picard and Todor Todoroff

The GRIP MAESTRO: Idiomatic Mappings of Emotive Gestures for Control of Live Electroacoustic Music
Michael Berger

Sonic Virtual Reality Game: How Does Your Body Sound?
Kimberlee Headlee, Tatyana Koziupa and Diana Siwiak

Auditory Masquing: Wearable Sound Systems for Diegetic Character Voices
Alex Stahl and Patricia Clemens

The Ghost: An Open-Source, User Programmable MIDI Performance Controller
Paul Rothman

Towards a Taxonomy of Realtime Interfaces for Electronic Music Performance
Garth Paine

Humanaquarium: A Participatory Performance System
Robyn Taylor, Guy Schofield, John Shearer, Pierre Boulanger, Jayne Wallace and Patrick Olivier

Interactive Music Studio: The Soloist
Hyun-Soo Kim, Je-Han Yoon and Moon-Sik Jung

Surfing the Waves: Live Audio Mosaicing of an Electric Bass Performance as a Corpus Browsing Interface
Pierre Alexandre Tremblay and Diemo Schwarz

Examining the Spectator Experience
A. Cavan Fyans, Michael Gurevich and Paul Stapleton


Friday 18th June

Session Q1-Q15

Musical Exoskeletons: Experiments with a Motion Capture Suit
Nick Collins, Chris Kiefer, Zeeshan Patoli and Martin White

The Helio: A Study of Membrane Potentiometers and Long Force Sensing Resistors for Musical Interfaces
Jim Murphy, Ajay Kapur and Carl Burgin

FerroSynth: A Ferromagnetic Music Interface
Stuart Taylor and Jonathan Hook

P[a]ra[pra]xis: Towards Genuine Realtime 'Audiopoetry'
Josh Mei-Ling Dubrau and Mark Havryliv

ImprovGenerator: Online Grammatical Induction for On-the-Fly Improvisation Accompaniment
Kris M. Kitani and Hideki Koike

DeviceCycle: Rapid and Reusable Prototyping of Gestural Interfaces, Applied to Audio Browsing by Similarity
Christian Frisson, Benoit Macq, Stéphane Dupont, Xavier Siebert, Damien Tardieu and Thierry Dutoit

Reflective Haptics: Resistive Force Feedback for Musical Performances with Stylus-Controlled Instruments
Alexander Müller, Fabian Hemmert, Götz Wintergerst and Ron Jagodzinski

Revisiting Cagean Composition Methodology with a Modern Computational Implementation
Alison Mattek, Mark Freeman and Eric Humphrey

Movement in a Contemporary Dance Work and its Relation to Continuous Emotional Response
Sam Ferguson, Emery Schubert and Catherine Stevens

Gesture Controlled Virtual Instrument with Dynamic Vibrotactile Feedback
Teemu Ahmaniemi

Creating Integrated Music and Video for Dance: Lessons Learned and Lessons Ignored
Jeffrey Hass

Packages for ArtWonk: New Mathematical Tools for Composers
Warren Burt

Wiiolin: A Virtual Instrument Using the Wii Remote
Jace Miller and Tracy Hammond

The Planets
Max Meier and Max Schranner



Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia

Proceedings edited by
Kirsty Beilharz
Andrew Johnston
Sam Ferguson
Amy Yi-Chun Chen


PO Box 123 Broadway NSW 2007 Australia

ISBN: 978-0-646-53482-4