Holographic Audio Visualizations


Holographic Audio Visualizations explores the world of audio visualizations. It strives to rethink the ways in which audio can be visualized while also considering the human side of how we interact with sound. Instead of translating audio inputs into binary data, Holographic Audio Visualizations uses the human body as its source of sensor data. The project was conceived by observing how we, as human beings, move and interact with sound. People tend to move their arms and hands to the rhythm of the music: depending on the intensity of the beat, some may thrust their arms in the air while others lightly tap a hand to the music. Through a combination of code and physical computing, Holographic Audio Visualizations uses holograms to visually represent sound in real time, drawing on sensor data from a bracelet worn by users to record this human interaction. It is a fun and engaging way for participants to see the sounds they are hearing.

Holographic Audio Visualizations was born out of a curiosity about the world of audio visualizations. I was fascinated by how artists could visually show something that has no visual properties; sound is heard and not seen. This curiosity evolved into a personal exploration through the means of art. I wanted to understand how humans interact with audio and the ways in which they display it; how do we become the visualizations? Exploring this curiosity beyond the medium of dance is where the project took shape.

I started my exploration by making audio-visual synthesizers. As I programmed these, I realized I did not understand what went into making a sound, and through my research I discovered that the elements behind this medium were things I did not work well with. Instead of abandoning the topic, I thought about the different ways we see and interact with audio and how to compute it without any audio inputs. As a former dancer, I knew we could see it through the body, through the movements we make when we hear and feel each beat. Combining this with my past in photography, in particular cinematic photography, I thought of ways in which I could tell the story of sound. For me, that story was the beat: how hard the beat was, the rhythm of the beat, and the way each beat made people move.

 

Physical Computing and Wearable Technology

In order to combine interactions, sounds, and visuals, I needed to connect the human body and the computer; I needed to think about physical computing, a course I took with Professor Phoenix Perry. My first step was to find the right sensor. I researched the different sensors used to track movement and found two that could potentially provide the data I needed: accelerometers, which measure how quickly movement speeds up or slows down, and gyroscopes, which measure rotation and changes in orientation. Since I wanted to tell the story of the beat, I would need to capture how fast and how hard someone was moving; I would need an accelerometer.
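The kind of measurement this implies can be sketched in plain Java (the language underlying Processing, which the project later uses). This is an illustrative assumption of how movement intensity might be derived from raw accelerometer samples, not code from the project; all names and sample values are made up:

```java
// Hedged sketch: estimating how "hard" a movement is from two
// consecutive accelerometer samples. A large jump between samples
// means a sharp, hard movement; a small one, a light tap.
public class BeatIntensity {
    // Magnitude of the change between two (x, y, z) readings.
    static double intensity(double[] prev, double[] cur) {
        double dx = cur[0] - prev[0];
        double dy = cur[1] - prev[1];
        double dz = cur[2] - prev[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    public static void main(String[] args) {
        double[] rest   = {0.0, 0.0, 1.0};  // hand at rest (gravity only)
        double[] thrust = {1.5, -0.8, 2.2}; // arm thrust in the air
        System.out.printf("%.3f%n", intensity(rest, rest));
        System.out.printf("%.3f%n", intensity(rest, thrust));
    }
}
```

A gyroscope, by contrast, would report how the wrist twists rather than how sharply it moves, which is why it was the less useful choice here.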

My next step was to think about how to attach an accelerometer to the human body. I thought of my former professor at the University of California, Santa Barbara, Professor Xárene Eskandar, who introduced me to the genre of wearable technology in her class. Wearables became my answer for connecting the human body with an accelerometer, and through it, with the computer.

As I brainstormed about wearables and music, I thought about music festivals and how attendees are usually given bracelets as their tickets. This led me to think, if I were to be making a piece for a festival, how could I get each attendee to wear an accelerometer? The answer: the bracelets. An accelerometer bracelet would provide the data about how people moved to a beat, since people have a tendency to move their hands when they dance or hear music.

Both Adafruit and LilyPad make wearable technology: boards and sensors that can be sewn into fabric using conductive thread, which carries current the same way wires do. For my bracelets I used Adafruit’s Flora V2, LilyPad’s ADXL335 accelerometer, and three-ply conductive thread. The X, Y, Z, power (+), and ground (-) pins on the accelerometer connect to the D12, D6, D9, 3.3V, and Ground pins, respectively, on the Flora. Since the Flora is not a board made by Arduino, I had to install the Adafruit AVR Boards package. Once the right board was selected, the Arduino IDE could successfully upload code to the Flora. I uploaded Arduino’s StandardFirmata sketch, which can be found in the examples that come with the downloadable Firmata library. This allows the board to send serial data and communicate with Processing, which I planned to use to create my visualizations.

 

The Visualizations

In order to understand how the accelerometer data would work with Processing, I created a series of simple sketches that included pulsating ellipses, rotating grids, and various other moving shapes. To get these working, I needed to download and import the Arduino library for Processing, open the serial port, set the pins carrying the sensor data as inputs, and translate the raw readings into usable analog values. I could then use the accelerometer’s data to control my sketches.
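The translation step can be illustrated in plain Java (Processing’s underlying language). Processing provides a map() function with this behavior; the version below is a stand-in so the example is self-contained, and the input range and the ellipse-diameter output range are assumptions for illustration, not values from the project:

```java
// Minimal sketch of remapping a raw sensor reading to a visual
// parameter. A 10-bit ADC reading (0-1023) is rescaled to an
// ellipse diameter in pixels.
public class SensorMap {
    // Equivalent of Processing's map(): linearly rescale value
    // from [inLo, inHi] to [outLo, outHi].
    static float map(float value, float inLo, float inHi,
                     float outLo, float outHi) {
        return outLo + (value - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    public static void main(String[] args) {
        int raw = 512;  // a mid-scale accelerometer reading
        float diameter = map(raw, 0, 1023, 10, 300);
        System.out.printf("%.1f%n", diameter);
    }
}
```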

As my understanding grew, so did the complexity and creativity of my sketches. I remembered seeing videos of iPhone holograms and became inspired. These holograms are made by cutting the clear cover of a CD case into an upside-down pyramid, truncated flat just above the apex, and placing it at the center of an iPhone screen playing a video of four mirrored copies of a moving object; the reflections in the four faces combine into a single floating image. Instead of playing a video, I would run a Processing sketch that used the sensor data to manipulate the hologram’s movements. Essentially, this would allow users to interact with and see audio in a physical manifestation.
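The four-way mirrored layout behind this trick comes down to placing one copy of the animation in each quadrant, rotated in 90-degree steps around the screen center so each face of the pyramid reflects its own copy. This plain-Java sketch is an assumed geometry for illustration, not the exhibited code; the screen size and offset are made-up values:

```java
// Sketch of the four-way mirrored layout used for a pyramid hologram.
public class PyramidLayout {
    // Rotate a point (x, y) about the origin by k * 90 degrees.
    static double[] rotate90(double x, double y, int k) {
        k = ((k % 4) + 4) % 4;
        switch (k) {
            case 1:  return new double[]{-y,  x};
            case 2:  return new double[]{-x, -y};
            case 3:  return new double[]{ y, -x};
            default: return new double[]{ x,  y};
        }
    }

    // Center positions for the four copies, each offset from the
    // screen center (cx, cy) toward one edge.
    static double[][] fourCopies(double cx, double cy, double offset) {
        double[][] out = new double[4][];
        for (int k = 0; k < 4; k++) {
            double[] p = rotate90(0, -offset, k); // start above center
            out[k] = new double[]{cx + p[0], cy + p[1]};
        }
        return out;
    }

    public static void main(String[] args) {
        for (double[] p : fourCopies(512, 384, 200))
            System.out.printf("%.0f %.0f%n", p[0], p[1]);
    }
}
```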

My inspiration for designing the first hologram came from Generative Design and its chapter on randomness and noise, which shows how to make a noisy landscape. This reminded me of vintage radios and how they used to display sound levels with a line that would spike up and down. In my version, the sensor data drives the noise, determining how high or low each peak in the pattern rises.
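The idea can be sketched in plain Java: a landscape line whose peaks scale with the sensor reading. A tiny hash-based value noise stands in here for Processing’s built-in noise(); the scaling factors and function names are illustrative assumptions, not the project’s actual code:

```java
// Illustrative sketch of the first hologram's idea: a 1-D "landscape"
// whose height is modulated by the normalized sensor magnitude.
public class NoiseLine {
    // Deterministic pseudo-random value in [0, 1) for integer i.
    static double hash(int i) {
        int h = i * 374761393;
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7fffffff) / 2147483648.0;
    }

    // Smoothly interpolated value noise at position x.
    static double noise(double x) {
        int i = (int) Math.floor(x);
        double f = x - i;
        double t = f * f * (3 - 2 * f); // smoothstep easing
        return hash(i) * (1 - t) + hash(i + 1) * t;
    }

    // Landscape height at column x: the sensor value (0 = still,
    // 1 = hardest movement) scales how high the peaks rise.
    static double height(double x, double sensor, double maxHeight) {
        return noise(x * 0.1) * sensor * maxHeight;
    }

    public static void main(String[] args) {
        for (int x = 0; x < 5; x++)
            System.out.printf("%.2f%n", height(x, 0.8, 120));
    }
}
```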

After creating my first hologram in Processing, I wanted to find a way to display it on an iPad for the exhibition. I also wanted to find a way to enable Bluetooth so that the user did not need to be tethered to send data. However, this is where problems arose.

 

iOS and Bluetooth

First I tried p5.js, translating my code from Processing’s Java to p5.js’ JavaScript, a language that is compatible with iOS. I found and tried a few labs from a Physical Computing course at New York University’s Interactive Telecommunications Program that used Arduino with p5.js, but I was unsuccessful because of the type of program I was trying to create and because I was not using an Arduino-brand board.

I then found someone in Japan who appeared to have successfully connected an Arduino to an iPad and run a Processing sketch using HTML, CSS, and Processing.js. I tried following his technique and ordered an iPad Camera Connector, but I still could not run my sketch because of its three-dimensional images and the type of board I was using.

My next attempt was to use one of the two Processing apps I found on the App Store, Procoding and Processing iCompiler. Processing iCompiler seemed to work fine with the three-dimensional images, but I could not open the serial port that connected the iPad with the Flora. I tried using the Flora Bluefruit LE to send serial data to my program over Bluetooth; however, the Flora Bluefruit LE is mainly made to control a component remotely rather than to send sensor data. I was able to turn Bluetooth on and see the connection through the apps Bluefruit, nRF Toolbox, nRF UART, and A Manager, but could not receive any data. I then tried using Adafruit’s Bluefruit LE Friend to bridge the connection through my computer, but still could not receive any data. If I were to continue this project in the future, the ability to send data to a program wirelessly is something I would like to achieve.

For the exhibition, however, I decided to return to tethering the wearable. I tried connecting the Flora to the iPad with a Micro-B USB cable and the Camera Connector. When I opened my code in Processing iCompiler, I found that the Flora V2 drew too much power. I connected a 105mAh and then a 2000mAh LiPo battery pack, and received an error message saying the Flora was not compatible with the iPad. I tried troubleshooting with the app Get Console, but I was only able to find the serial port, not read the data.

After I watched Gary Bennett’s YouTube video explaining how he connected an Arduino to his iPad, I understood why the Camera Connector was not working and realized I would not be able to use iPads for this project. Apple does not freely allow external electronics to work with iOS. It runs a licensing program, called the MFi Program, for those who want to experiment with this technology; if accepted, they receive a cable with a chip inside that allows external electronics to connect to iOS devices. As an alternative to getting the cable through the MFi Program, there is the Redpark TTL Serial Cable (C2-TTL), which contains the chip needed to make the connection. On one end there is a 30-pin connector, and on the other the TTL headers, which let an Arduino connect to the power, ground, TX, and RX pins, just like the Adafruit Bluefruit LE. According to Bennett, this chip is vital for the iPad to receive serial data, which explains why I was unable to do so with the Camera Connector. However, this cable is incompatible with apps from the App Store.

 

Acceptance

Since iPads were no longer a viable option, I returned to using monitors and began programming my second hologram. For this one, my inspiration came from Professor Theodoros Papatheodorou. He had mentioned particle systems, which inspired me to create particles that were in constant motion but changed their movement along the X- and Y-axes in response to the sensor data from the accelerometer. Since I would be placing “pyramids” on top of the screen to hold the holograms inside, I thought about having these particles come out of a pyramid, so it would appear to be a pyramid within a “pyramid”. Since object-oriented programming was taught in both Professor Tim Blackwell’s and Professor Theodoros Papatheodorou’s classes, the first program employs functions while this one uses both classes and functions.
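A stripped-down version of that particle idea, in plain Java, might look like this. The class shape, damping factor, and force values are illustrative assumptions, not the exhibited code:

```java
// Hedged sketch of one particle: it drifts with its own velocity,
// and accelerometer-derived forces along X and Y bend its path.
public class Particle {
    double x, y, vx, vy;

    Particle(double x, double y, double vx, double vy) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
    }

    // ax, ay: forces derived from the accelerometer's X and Y data.
    void update(double ax, double ay) {
        vx += ax;       // sensor data nudges the velocity...
        vy += ay;
        vx *= 0.98;     // ...while damping keeps the motion bounded
        vy *= 0.98;
        x += vx;
        y += vy;
    }

    public static void main(String[] args) {
        Particle p = new Particle(0, 0, 1, 0); // constant drift along X
        p.update(0, 0.5);                      // a movement pushes it along Y
        System.out.printf("%.3f %.3f%n", p.x, p.y);
    }
}
```

In the second hologram, many such objects would be spawned from the pyramid shape and updated every frame, which is where the class-based structure taught in those courses pays off over plain functions.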

Although Holographic Audio Visualizations was developed for Metasis, if I were to continue it, I would refine the bracelets to be cordless, as mentioned. I would also make bracelets that are more fashionable, in addition to the ones I have for the exhibition, which are designed to be put on conveniently in a gallery; this way users would have options. Lastly, I would want to develop even more holograms and explore the different ways I could visualize audio in the real world, in a way that appears tangible to the user.

 

http://doc.gold.ac.uk/compartsblog/index.php/work/holographic-audio-visualizations/

http://metasis.io/index.php/work/julianne-rahimi/

 
