The Intersection of Technology and Music

Technology and music have never truly been separate entities. You could say that technology has spent centuries disrupting and advancing our musical endeavors, but the truth is that the two have always been intertwined, continually feeding off each other.

Musical scores scribbled onto paper, the pressurized pipes of the organ, the record player, the synthesizer, and the iPod were all considered cutting-edge technologies in their day.

Today’s tech disruptions come primarily in the form of computers and smartphones and are truly remarkable in what they can accomplish. Here’s the latest in the love story of music and tech.

New Ways to Experience Music

Virtual reality has disrupted all sorts of 21st-century experiences. It has given a whole new meaning to the term armchair travel, and it’s changing the way students learn in the classroom. With more virtual reality apps and accessories coming onto the market, music is just one more discipline touched by this recent innovation.

The first large-scale virtual reality concerts were held as early as 2014 by acts such as Coldplay, Paul McCartney, and Jack White. Those concerts were captured in 360-degree, 3D video, and audience members could use Oculus headsets and other VR tools to connect their smartphones to the immersive feed.

These concerts were just the beginning. Several major music companies have now teamed up with VR companies to record and broadcast their concerts. This has the potential to open the world of music to a whole new (faraway) audience.

VR devices are often praised for their accessibility: with just a smartphone and as little as a cardboard viewer, people around the world can hear and experience music as though they were there in person. While the internet has made it easy to share music, VR will make it even easier to share the musical experience, hopefully leading to greater appreciation of and engagement with music worldwide.

Virtual reality technology can also complement traditional music lessons. During the 2016 TechCrunch Disrupt Hackathon, two students created a VR platform, called Teach-U: VR, that helps children learn music. Its initial design placed users in a virtual space where they could interact with drum sets and a piano, learning the rhythm and hand gestures required to play those instruments in real life. The platform allows two users to share the same virtual space at once, meaning a teacher in one location can work with a student in another; through the power of technology, they’re in the same “space.” While VR can’t replicate the value of in-person music lessons, its creators say it can help children whose schools have cut their music curriculums, or who lack the geographic or financial access to out-of-school lessons.

Seeing Sound

When combined, technology and sound continually find ways to enhance the way people experience the world.

Take artist Neil Harbisson, for example. Harbisson was born totally colorblind. To move beyond his constant state of grayscale, he had a camera installed that records the colors around him and translates those hues into sounds and music. The frequencies produced by a scene are then sent to a surgically implanted antenna, which allows him to better grasp the colorful world beyond his vision.

In 2013, Harbisson used his cyborg technology to perform the world’s first color-conducted concert. By taking photos around Barcelona and converting their colors into sounds, Harbisson and a choir created an entire musical piece based on the technology he uses to perceive the world. As part of that experiment, Harbisson released the Eyeborg app, which allows users to select colors and hear the frequencies and music they produce.
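The core idea, turning a hue into a tone, can be sketched in a few lines. Harbisson’s actual “sonochromatic” scale isn’t published as an API, so the mapping below (one turn of the color wheel spanning one octave above an arbitrary base note) is purely an illustrative assumption:

```python
def hue_to_frequency(hue_degrees, base_hz=349.23):
    """Map a hue angle (0-360) onto one octave above base_hz.

    base_hz defaults to F4 (~349 Hz), an arbitrary choice here; a
    full turn of the color wheel spans one octave (a doubling of
    frequency), so complementary colors land a tritone apart.
    """
    hue = hue_degrees % 360
    return base_hz * 2 ** (hue / 360)

# Red (0 deg) sounds the base tone; cyan (180 deg) lands a tritone higher.
print(round(hue_to_frequency(0), 2))    # 349.23
print(round(hue_to_frequency(180), 2))  # 493.89
```

Any real implementation would also need to average the hues in a camera frame and decide how to handle brightness and saturation, which this sketch ignores.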

The Eyeborg app isn’t unique on the market. EyeMusic is a smartphone app designed specifically for blind people, or for anyone who wishes to experience the world as they do. By holding a smartphone as though taking a photo, users have the app translate the people and objects in the viewfinder into a soundscape of musical notes. Once a person is trained to associate shapes and expressions with those sounds, they’re able to see a scene in their mind’s eye. As National Geographic reported, higher-pitched sounds represent objects at the top of a scene, while lower notes indicate objects closer to the ground. By combining pitch, tempo, and instrument type, the app creates a whole new language from music.
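The pitch mapping described above can be sketched simply: scan a scene column by column, and let each bright pixel become a note whose pitch rises with its height in the frame. The real EyeMusic mapping also encodes color through timbre and timing, so this stripped-down version is an assumption, not the app’s actual algorithm:

```python
def image_to_notes(image, low_hz=220.0, high_hz=880.0):
    """Turn a 2D grayscale grid (rows top-to-bottom, values 0-1)
    into a list of (column, frequency, loudness) events, scanning
    left to right the way the article describes."""
    rows = len(image)
    events = []
    for col in range(len(image[0])):
        for row in range(rows):
            level = image[row][col]
            if level > 0.5:  # treat bright pixels as sounding objects
                # top row maps to high_hz, bottom row to low_hz
                frac = 1 - row / (rows - 1) if rows > 1 else 1.0
                freq = low_hz + frac * (high_hz - low_hz)
                events.append((col, freq, level))
    return events

# A diagonal from top-left to bottom-right descends in pitch over time.
scene = [[1, 0, 0],
         [0, 1, 0],
         [0, 0, 1]]
for col, freq, _ in image_to_notes(scene):
    print(col, freq)
```

The left-to-right scan is what gives the soundscape its sense of time: objects on the left of the scene are heard before objects on the right.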

Feeling the Beat

Anyone who has ever listened to music in Windows Media Player will know how hypnotic its pulsing, colorful playback visualizations can be. The visuals depend on the music itself; volume, tempo, and timbre are all considered by the algorithm that produces the on-screen shapes.
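The basic recipe behind any such visualizer is to slice the audio into windows, reduce each window to a feature, and turn that number into geometry. Windows Media Player’s actual algorithms aren’t public, so this ASCII “bar” version is only a toy stand-in using RMS volume as the feature:

```python
import math

def rms(window):
    """Root-mean-square level of one window of audio samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def visualize(samples, window_size=8, width=20):
    """Print one ASCII bar per window, scaled to the window's RMS volume."""
    for start in range(0, len(samples) - window_size + 1, window_size):
        level = rms(samples[start:start + window_size])
        print("#" * int(level * width))

# A tone that swells from quiet to loud produces steadily growing bars.
tone = [0.1 * (i // 8 + 1) * math.sin(i) for i in range(32)]
visualize(tone)
```

A real visualizer would add more features (an FFT for timbre, onset detection for tempo) and render at the audio frame rate, but the feature-to-geometry pipeline is the same.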

While Windows Media Player visualizations have been around for decades, they represent an evolving environment where technology is more inherently linked to music.

In 2016, American musician J.Views created a new music video, which might not sound like a notable accomplishment except that it was the first music video ever to sync with a listener’s heartbeat. The way it works is pretty cool: after downloading the app, listeners place a finger against the smartphone’s camera. The app picks up subtle changes in skin tone to gauge the listener’s heartbeat, and the resulting beat serves as the metronome for the song. In addition to setting the musical tempo, the listener’s heart rate also changes the music video’s visuals.
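The camera trick described above works because a fingertip pressed over the lens brightens and darkens slightly with each pulse of blood. Counting peaks in that brightness signal yields a tempo. J.Views’ app is proprietary, so the frame rate and the naive peak detection here are illustrative assumptions:

```python
import math

def estimate_bpm(brightness, fps=30):
    """Estimate beats per minute from a per-frame brightness series
    by counting local maxima (simple peak detection, no filtering)."""
    peaks = [i for i in range(1, len(brightness) - 1)
             if brightness[i - 1] < brightness[i] >= brightness[i + 1]]
    duration_s = len(brightness) / fps
    return 60 * len(peaks) / duration_s

# Synthetic pulse: a 1.2 Hz wave sampled at 30 fps for 5 seconds should
# read ~72 BPM, a number that could then drive the song's metronome.
signal = [math.sin(2 * math.pi * 1.2 * t / 30) for t in range(150)]
print(round(estimate_bpm(signal)))  # 72
```

Real camera footage is far noisier than a clean sine wave, so a production app would smooth or band-pass the signal before counting peaks.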

This song and music video are just one track from J.Views’ most recent album, 401 Days, which was created with assistance from The DNA Project. During that project, the musician used technology to engage his audience, uploading snippets of songs as they were created and allowing listeners to download and rework certain clips. By guiding listeners through the creative process, J.Views said he hoped each song would have an extended life and inspire its listeners just as much as they inspired him.

With consumers constantly demanding more personalized experiences, this type of user-generated or user-affected music could change the way future artists create, edit, and release their work.

Written By

Jennifer Paterson is the President and Founder of California Music Studios. Jennifer has degrees from Boston University, The Royal Conservatory of Music of Toronto and the University of British Columbia. She was a recipient of The Canada Council Award to study at the well-known Royal Opera House in London, and was the principal soprano for the Boston Lyric Opera Company. Her dedication to the legitimate training of the voice and piano has made her a definite asset to the musical community of Southern California.

