Please note:

Sonic Environments will run from Sunday 10th July until Monday 11th July.
NIME will run from Monday 11th July until Friday 15th July.

Tuesday, July 12 • 13:30 - 15:00
Demo & Poster 1


  • Towards a Perceptual Framework for Interface Design in Digital Environments for Timbre Manipulation (Sean Soraghan, Alain Renaud, Ben Supper) Table 3
  • Augmenting the iPad: The BladeAxe (Romain Michon, Julius Orion III Smith, Matthew Wright, Chris Chafe) Table 4
  • Towards a Mappable Database of Emergent Gestural Meaning (Doug Van Nort, Ian Jarvis, Michael Palumbo) Table 5
  • An Analogue Interface for Musical Robots (Jason Long, Ajay Kapur, Dale Carnegie) Table 6
  • The ‘Virtualmonium’: an instrument for classical sound diffusion over a virtual loudspeaker orchestra (Natasha Barrett, Alexander Refsum Jensenius) Table 7
  • The Smartphone Ensemble. Exploring mobile computer mediation in collaborative musical performance (Julian Jaramillo Arango, Daniel Melán Giraldo) Table 8
  • Development of Fibre Polymer Sensor Reeds for Saxophone and Clarinet (Alex Hofmann, Vasileios Chatziioannou, Alexander Mayer, Harry Hartmann) Table 9
  • PdMIs: Embedded Acoustic DMIs Expressed through 3D Printing (Oliver Hancock, Todd Cochrane) Table 10
  • Transforming 8-Bit Video Games into Musical Interfaces via Reverse Engineering and Augmentation (Benjamin Olson) Table 11
  • Musician and Mega-Machine: Compositions Driven by Real-Time Particle Collision Data from the ATLAS Detector (Juliana Cherston, Ewan Hill, Steven Goldfarb, Joseph Paradiso) Table 12
  • Mapping Everyday Objects to Digital Materiality in The Wheel Quintet: Polytempic Music and Participatory Art (Anders Lind, Daniel Nylén) Table 13
  • Haptic Music Player - Synthetic audio-tactile stimuli generation based on the notes’ pitch and instruments’ envelope mapping (Alfonso Balandra, Hironori Mitake, Shoichi Hasegawa) Table 14
  • Notation for 3D Motion Tracking Controllers: A Gametrak Case Study (Madeline Huberth, Chryssie Nanou) Table 15
  • Embodiment on a 3D tabletop musical instrument (Edgar Hemery, Sotiris Manitsaris, Fabien Moutarde) Table 16
  • Networked Virtual Environments as Collaborative Music Spaces (Cem Çakmak, Anıl Çamcı, Angus Forbes) Table 19
  • Drum-Dance-Music-Machine: Construction of a Technical Toolset for Low-Threshold Access to Collaborative Musical Performance (Christine Steinmeier, Dominic Becking, Philipp Kroos) Table 20
  • The Laptop Accordion (Aidan Meacham, Sanjay Kannan, Ge Wang) Table 21
  • Music Maker: 3D Printing and Acoustics Curriculum (Sasha Leitman, John Granzow) Table 22
  • The Hexenkessel: A Hybrid Musical Instrument for Multimedia Performances (Jacob Sello) Table 23
  • Snare Drum Performance Motion Analysis (Robert Van Rooyen, Andrew Schloss, George Tzanetakis) Table 24
  • Active Acoustic Instruments for Electronic Chamber Music (Otso Lähdeoja) Table 29
  • SensorChimes: Musical Mapping for Sensor Networks (Evan Lynch, Joseph Paradiso) Table 30
  • NAKANISYNTH: An Intuitive Freehand Drawing Waveform Synthesiser Application for iOS Devices (Paul Haimes, Kyosuke Nakanishi, Tetsuaki Baba, Kumiko Kushiyama) Table 31
  • StrumBot – An Overview of a Strumming Guitar Robot (Richard Vindriis, Dale Carnegie) Table 32
  • Unfoldings: Multiple Explorations of Sound and Space (Tim Shaw, Simon Bowen, John Bowers) Table 33
  • BlockyTalky: A Physical and Distributed Computer Music Toolkit for Kids (Benjamin Shapiro, Rebecca Fiebrink, Matthew Ahrens, Annie Kelly) Table 34
  • XronoMorph: Algorithmic Generation of Perfectly Balanced and Well-Formed Rhythms (Andrew J. Milne, Steffen A. Herff, David Bulger, William A. Sethares, Roger T. Dean) Table 35
  • Focal: An Eye-Tracking Musical Expression Controller (Stewart Greenhill, Cathie Travers) Table 36
  • The Haptic Capstans: Rotational Force Feedback for Music using a FireFader Derivative Device (Eric Sheffield, Edgar Berdahl, Andrew Pfalz) Table 1–2
  • Electromagnetically Actuated Acoustic Amplitude Modulation Synthesis (Herbert H.C. Chang, Spencer Topel) Table 17–18
  • Dooremi: a Doorway to Music (Rebecca Kleinberger, Akito Van Troyer) Table 25–26
  • A Multi-Point 2D Interface: Audio-Rate Signals for Controlling Complex Multi-Parametric Sound Synthesis (Stuart James) Table 27–28
  • Multi Rubbing Tactile Instrument (Yoichi Nagashima) Room 1.21
  • The Extended Clarinet (Carl Jörgen Normark, Robert Ek, Peter Parnes, Harald Andersson) Room 1.39
  • SpectraScore VR: Networkable virtual reality software tools for real-time composition and performance (Benedict Carey) Room 3.66


Julian Jaramillo Arango

Universidad de Caldas|Manizales|Caldas|Colombia

Alfonso Balandra

Tokyo Institute of Technology|Tokyo|Tokyo|Japan

Omer Cem Cakmak

Istanbul Technical University|Istanbul||Turkey

Benedict Carey


Juliana Cherston

MIT Media Lab|Cambridge|MA|02139

Paul Haimes

Tokyo Metropolitan University|Hino-shi|Tokyo|Japan

Alex Hofmann

Assistant Professor, University of Music and Performing Arts Vienna|Vienna||Austria
Live Electronics with Saxophone | Csound, SuperCollider | Raspberry Pi

Madeline Huberth

Stanford University|Stanford|CA|USA

Alexander Refsum Jensenius

Associate Professor, Head of Department, University of Oslo|Oslo||Norway
Alexander Refsum Jensenius (BA, MA, MSc, PhD) is a music researcher and research musician working in the fields of embodied music cognition and new interfaces for musical expression (NIME). He is currently the Head of Department of Musicology at the University of Oslo, where he also holds an associate professorship in music technology. Alexander studied informatics, mathematics, musicology, piano performance and music technology at UiO, Chalmers...

Otso Lähdeoja

University of the Arts, Sibelius Academy|Helsinki||Finland

Sasha Leitman

Stanford University|Stanford|CA|USA

Jason Long

Victoria University of Wellington|Wellington|Wellington|New Zealand

Evan Lynch

Research Assistant, MIT Media Lab

Aidan Meacham

Stanford University|Stanford|California|United States

Romain Michon

CCRMA - Stanford University|Stanford|CA|USA

Yoichi Nagashima

Professor, Shizuoka University of Art and Culture|Hamamatsu|Shizuoka|Japan

Doug Van Nort

York University|Toronto|Ontario|Canada

Ben Olson

Signal Narrative|Madison|WI|United States

Alexandra Rieger

Dartmouth|Hanover|New Hampshire|

Robert Van Rooyen

Ph.D. Candidate, University of Victoria|Victoria|British Columbia|Canada
As an experienced multidisciplinary engineer and musician, I am keenly interested in robotics that can render "human like" performances. Exploring the nuances and translating them to stochastic multidimensional motion control that can render comparable performances is of particular interest.

Dominik Schlienger

University of the Arts Helsinki|Helsinki||Finland
I'm researching kinaesthetic interfaces for spatially interactive arts. I'm working on a tracking system using acoustic localisation techniques. Currently, I'm desperate to find a Max MSP Gen~ object which does frequency-domain convolution/correlation to implement the whole system in Max. Please contact me if you have one!

Jacob Sello

HfMT Hamburg|Hamburg|Hamburg|Germany

Tim Shaw

Culture Lab, Newcastle University|Newcastle-upon-Tyne|Tyne and Wear|UK

Eric Sheffield

Louisiana State University|Baton Rouge|Louisiana|United States

Sean Soraghan

Research Engineer, ROLI|London||United Kingdom

Christine Steinmeier

Research Assistant, FH Bielefeld University of Applied Sciences|Minden||Germany

Richard Vindriis

Victoria University of Wellington|Wellington|Wellington|New Zealand

Tuesday July 12, 2016 13:30 - 15:00
Basil Jones Orchestral Hall 1.82 Queensland Conservatorium
