The Web Platform is the Universal Instrument

A session at Node PDX 2016

Tuesday 21st June, 2016

2:00pm to 2:30pm (PST)

Music as an idea, expression, commercial endeavor, and communal art is in its most volatile state since the European Renaissance. We’ve moved from the public adoption of recording technology, through the massive rise and fall of the recording industry, to a new age that was first seeded at Bell Labs in the early days of computer science.

Max Mathews encouraged a generation of computer musicians by declaring that the Nyquist-Shannon “sampling theorem shows that there are really no limits to the sounds you can make… the computer is a universal musical instrument.”

Now, with a fuller understanding of what Mathews was implying, we can take it a step further and say that the browser is the universal musical instrument. It is the most accessible, most broadly compatible runtime yet, and with the growth of Web Audio and Web MIDI standardization we’re on the verge of a new renaissance in musical collaboration and interaction.
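
To make that claim concrete, here is a minimal illustrative sketch (my own, not material from the talk): a few lines of Web Audio are enough to turn any page into a playable instrument. The `playNote` helper and its parameters are assumptions for illustration.

```javascript
// Minimal illustrative sketch: the browser as an instrument.
// A click builds an oscillator -> gain -> speakers graph with Web Audio.
const context = new AudioContext();

function playNote(frequency = 440, duration = 0.5) {
  const osc = context.createOscillator();
  const gain = context.createGain();

  osc.type = 'sine';
  osc.frequency.value = frequency; // pitch in Hz

  // Simple decay envelope so the note doesn't click when it stops.
  gain.gain.setValueAtTime(0.2, context.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, context.currentTime + duration);

  osc.connect(gain).connect(context.destination);
  osc.start();
  osc.stop(context.currentTime + duration);
}

// Browsers require a user gesture before audio can start.
document.addEventListener('click', () => {
  context.resume();
  playNote(440);
});
```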

Unfortunately, the promotion of individualism in our popular culture and the divide between developers and working artists have kept us from realizing the potential of building useful tools for distributed music collaboration, even on the web platform.

Still, I can see a world coming where community music and recorded works are identified not by regional boundaries but by distributed data regions and organic peer-to-peer networks. If Web Audio and its supporting standards stabilize, music collaboration and exposition could be made available to everyone, without hindrance from age, class, or personal ability.

The WebSound project is my iterative solution to this problem through long-term community engagement and Audio/MIDI tool versioning.

Our first endeavor is to build a few useful live performance tools enabling remote collaboration:
* Realtime Web MIDI performances streamed to a live event, enabling the performer to lead songs or compositions remotely. Achieved through an optimized VPN and peer-to-peer WebRTC DataChannels (see the sketch after this list).
* Communally performed live music making with MIDI-controlled Web Audio and WebSocket broadcasting.
* Audience interaction with the exposed parameters of a live band’s instrumentation, via broadcast methods and microcontroller installations.
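
As a rough sketch of how the first item could fit together (my own assumptions, not the WebSound implementation): capture local Web MIDI messages and forward them over a low-latency WebRTC DataChannel. The `peerConnection` object, the channel name, and the commented receiving-side `synth.handleMidi` hook are hypothetical; signaling is assumed to happen elsewhere.

```javascript
// Hypothetical sketch: forward local Web MIDI messages to a remote peer
// over a WebRTC DataChannel. Assumes `peerConnection` is an RTCPeerConnection
// already negotiated through your own signaling layer (e.g. a WebSocket
// server), which is omitted here.

const channel = peerConnection.createDataChannel('midi', {
  ordered: false,    // a late note is worse than a dropped one
  maxRetransmits: 0, // favor latency over reliability for live play
});

async function forwardLocalMidi() {
  const midiAccess = await navigator.requestMIDIAccess();

  for (const input of midiAccess.inputs.values()) {
    input.onmidimessage = (event) => {
      // event.data is a Uint8Array such as [status, note, velocity].
      if (channel.readyState === 'open') {
        channel.send(event.data);
      }
    };
  }
}

// Receiving side (sketch): the same bytes can drive a Web Audio synth.
// channel.onmessage = (event) => synth.handleMidi(new Uint8Array(event.data));

forwardLocalMidi();
```

Setting `ordered: false` and `maxRetransmits: 0` trades delivery guarantees for latency, which generally suits live performance better than reliable, in-order transport.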

About the speaker

Ben Michel

Musician and developer. I compose and perform live soundtracks, and work on open-source Web Audio & MIDI tools. Listen: http://soundcloud.com/benmichelmusic (bio from Twitter)

Next session in Main Room

3pm: I Play the JavaScript by Matt McKegg

2 attendees

  • Adron Hall
  • Ben Michel

1 tracker

  • Erik Ratcliffe


When

Time 2:00pm to 2:30pm PST

Date Tue 21st June 2016

Where

Main Room, Bossanova Ballroom

Short URL

lanyrd.com/sfchgz

Official event site

nodepdx.org
