Islands of Sound
Islands of Sound is an interactive interface where people compose their own ambient sounds by touching different natural and synthetic objects. Projected graphics reflecting each object's unique energy are generated along with the sounds.
The goal: create an interactive product that works as a wireless speaker as well as a piece of home decor.
TIME: Nov. 2018 — Dec. 2018 | 6 Weeks

TEAM: Ellie Lin, S. J. Zhang  

ROLE: Physical Computing, Experience Design, Programming, Projection Mapping

TOOL: Arduino, p5.js, Processing, ISADORA
Target Users

People who need to stay focused or to relieve stress by listening to ambient sounds

People interested in home decor
Design Goals

Build a stand-alone system

Deliver an interactive experience

Activate the space with sound and light
Future Possibilities

The use of built-in LED matrix

A larger scale for mixing multiple soundtracks

A mobile app or web server for customizing sound sources
Interaction Demonstration

Design Process

1. Brainstorming
2. Design Goals Setup
3. User Interviews
4. User Testing
5. Materials Testing
6. Sensor Testing
7. Sensory System Design
8. Programming
9. Sound Design
10. Fabrication
11. Projection Mapping
12. Installation
Ideation and Conceptualization
We challenged ourselves to design a digital–physical hybrid product that has a positive impact on the environments where people live and work. Thinking about ways to improve the quality of a living environment, we landed on light and sound. After brainstorming and interviewing, we decided to design a sound experience that engages the human senses of touch, sight, and hearing. Considering what could connect humans and the physical world in an intuitive way, we started to think about texture, graphics, and sound.
Interactive Experience Design — What to Focus On
Explore Opportunities Based on Interviews
Our first step was to figure out what kind of experience participants expected from a speaker. We ran some tests and interviewed a few people (two musicians, three designers, and five people familiar with neither speakers nor interaction design) to help us further develop the idea.

Our interviews focused on three aspects:

Reasons for listening to music
Expectations for the interface of a music player
The most intuitive way to interact with a music product

Here is what we noticed during the interviews:

Needs: Wash away distractions

Design Keywords: Simple interactions; be left aside

Solutions: On/Off states; Sound looping

Needs: Customized possibilities

Design Keywords: Friendly interactive interface

Solutions: Relationships between sounds and objects

Needs: Feedback

Design Keywords: Visual / Audio real-time feedback

Solutions: Sounds and graphics generated with the interaction
Insight I. Placement Options Do Affect Interaction
We had two placement options in mind: the product could sit directly on a table, or it could be mounted on a wall. Unsurprisingly, where the product is placed influences how people interact with it.

Option I: Tabletop

Expectations from interviewees: a meditation experience with pure, focusing sounds and simple interactions.

Challenges: space-consuming as home decor; high demands on the environment, including quietness; difficulties in projecting graphics.

Opportunities: design a table that comes with the product; project graphics on the wall behind the users, or drop the projection to avoid distractions.

Option II: Wall Mount

Expectations from interviewees: rich body movements; graphics generated on the wall during interactions; can be left aside to continue playing sounds.

Challenges: placement of the objects; selecting and mixing sound sources; mounting objects, circuits, and sensors on the wall.

Opportunities: a linear arrangement of the objects evoking a keyboard instrument; ambient sounds instead of meditation sounds; a board that can be hung up.
We went with Option II, Wall Mount, since the experience we wanted to create was less a meditation and more an ambient sound experience.
Insight II. The Most Intuitive Way of Interaction is Touching
One of the most intuitive things people do when trying to “feel” something is to touch it. When we asked participants to describe a material they liked, all of them mentioned its texture. We focused on this point and decided to build the interaction around a simple touch.

So, this is how the interaction works:

Users generate and mix sounds simply by touching different objects; graphics are generated to indicate the on/off state of each sound; and the speaker itself can be left aside, continuing to play.
System Diagram for Creating Sensory Experience
To create a complete sensory experience, we drew a system diagram to better dissect what we needed to accomplish:

· Touch sensing

· Sounds triggered by touching objects

· Graphics accompanying the sounds
Visual Inspiration for Design
Inspired by sculptures of Isamu Noguchi, works of Arnout Meijer Studio and creations of Olafur Eliasson, we decided to create an artful composition of our selected materials with programmed lighting effects and a mix of natural and synthetic sounds.
Prototyping and User Testing — How Iterations Happened
Prototyping: Choosing Materials and Capacitive Sensing
Materials: Natural and Artificial Objects

Since the texture of the chosen objects matters to the interaction experience, we thought carefully about materials. Following our concept of inspiring people with the aggregated power of nature and human activity, we chose stone, metal, wood, acrylic, and fabric as our design elements: materials with varied textures, all common natural or artificial materials that compose our physical living environment.

Capacitive Sensing: Conductive Paint and Tape

We decided to sense the change in capacitance when objects are touched.

Users don't have to touch the conductive area directly; as long as the touch point is not too far from the painted area, we get a distinct difference in the analog reading.
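The classification logic itself is simple: compare the analog reading against a calibrated baseline. Here is a minimal sketch of the idea (shown in JavaScript for illustration; the function names, threshold, and smoothing factor are hypothetical, and on the board this check runs in the Arduino sketch):

```javascript
// Classify a raw capacitive reading as touched / not touched.
// baseline: the idle reading with nothing nearby.
// threshold: hypothetical margin above the baseline that counts as a touch.
function isTouched(reading, baseline, threshold = 50) {
  // A nearby touch raises the reading well above the idle baseline.
  return reading - baseline > threshold;
}

// Re-calibrate the baseline slowly so environmental drift is absorbed
// instead of being mistaken for a touch.
function updateBaseline(baseline, reading, alpha = 0.01) {
  return baseline + alpha * (reading - baseline);
}
```

Slowly tracking the baseline is one common way to reduce (though, as we found later, not eliminate) the environmental sensitivity of capacitive sensing.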
Metal, Wood and Stone
Conductive Tape Testing
1st Round User Testing: Let’s Make the Volume Adjustable!

Interaction Behavior Varies

We set up one of each of our objects on a white foam board and stuck it on the wall. By observing how users interacted with the objects and interviewing them about their expectations, we found that some behaviors validated our hypotheses, while others pushed and inspired us to explore new ways to improve the experience.
Initially, the linear element on the board was only there for visual hierarchy, to balance the composition. Through user testing, though, we found that most people assumed the linear elements on the board were sliders.

“They look like volume bars.”

“I’d like to have volume levels projected directly on them as a visual feedback.”

Volume Control Possibility

So we chose a brass rod as our volume controller. By accumulating the capacitance readings, we could tell how long a user was holding the rod, and increase or decrease the volume accordingly.
2nd Round User Testing: Capacitive Sensing is Unreliable, Piezo is the Right Sensor

Capacitive Readings Were Unresponsive

We applied conductive tape and wired two objects for the second round of user testing. The interactive experience was disappointing due to the sensor's sensitivity to the physical environment and its variation from one user to another. We had to find another way to stabilize the sensor readings.

The volume rod worked well, since it relied only on touch duration rather than on a precise capacitance value.

SOLUTION: New sensor – Piezo

A piezo disc (knock sensor) can sense vibration when connected directly between an analog pin (+) and ground (−). It was sensitive and stable at the same time; unlike capacitive sensing, it worked almost regardless of environmental influence.

We decided to attach piezo sensors under the objects, replacing the conductive tape. The Arduino now receives far more responsive signals to trigger the subsequent instructions.
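Detecting a tap from a piezo reading amounts to a threshold plus a short cooldown, so one tap does not fire multiple triggers while the disc rings. A sketch of that logic (in JavaScript for illustration; `KNOCK_THRESHOLD` and `COOLDOWN_MS` are hypothetical values, and the real version lives in the Arduino loop):

```javascript
// Minimal knock detector for a piezo disc read from an analog pin.
const KNOCK_THRESHOLD = 100; // reading spike that counts as a tap (hypothetical)
const COOLDOWN_MS = 250;     // ignore the disc's ringing right after a knock

function makeKnockDetector() {
  let lastKnock = -Infinity;
  // Returns true exactly once per tap, given a reading and the current time.
  return function detect(reading, nowMs) {
    if (reading > KNOCK_THRESHOLD && nowMs - lastKnock > COOLDOWN_MS) {
      lastKnock = nowMs;
      return true; // e.g. toggle this object's sound on/off
    }
    return false;
  };
}
```

The cooldown is what makes the piezo feel stable in practice: the disc keeps vibrating for a moment after a tap, and without it a single touch would register as several.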
Schematic Diagram:

Till now, we had the following components:

I. Input

· Four piezo discs, each connected to a 10 MΩ fixed resistor (in parallel), pins A0–A3, and ground.

· Three pieces of foil (conductive material), each connected to a 10 MΩ fixed resistor (in series), pins D2, D4 (receive pin), D6, and D7.

II. Serial Port

· A Bluetooth module connected to 5V power, D0 (TX pin), D1 (RX pin), and ground.
3rd Round User Testing: Instructions and Headphones are Needed for the Show
WHAT WE FOUND: Unclear Instruction & Noisy Environment

· The shape of our objects confused users, since they looked like knobs. Most users didn't know how or where to start.

· We ran the third round of user testing in a noisy environment. Even though the installation itself worked fine, users could not get a satisfying sound-composing experience because of the noise.

SOLUTION: Instructions & Headphones

· We decided to project the instruction “light tap on us” on the board when it is not being interacted with.

· For the ITP Show, we decided to set up headphones instead of speakers. The composing interaction thus became a single-user experience.
Generative Graphic Design Process: Code, and Let it Dance
For the visual elements projected on the board to indicate each object's status, we programmed generative graphics in p5.js and Processing that reflect the amplitude and frequency of the sound. From pixel water ripples to fading ellipses, from single and multiple particle systems to bezier waves, we tested the visual effects on both laptop monitors and the physical projected interface, iterating several times to get closer to our expectations.
Water Ripples Effect: Object Status

We simulated a dynamic expanding-and-fading water ripple effect and mapped the amplitude of each soundtrack to the ripple radius to show each object's on/off status.

The size of the water ripples was reduced due to the limited projection surface.
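The amplitude-to-radius mapping is a single linear map, the same shape as p5's built-in `map()`. A self-contained sketch of the idea (the `MAX_RADIUS` value here is hypothetical; in our sketch it was tuned to the projection surface):

```javascript
// p5-style linear map, re-implemented so this sketch runs standalone.
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Map a track's current amplitude (0..1) to a ripple radius in pixels.
// MAX_RADIUS was kept small because of the limited projection surface.
const MAX_RADIUS = 80; // hypothetical value
function rippleRadius(amplitude) {
  return map(amplitude, 0, 1, 0, MAX_RADIUS);
}
```

In the real sketch the amplitude comes from an amplitude analyzer on each track, so a silent (volume-zero) track collapses its ripple to nothing, which is exactly the off state.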
Bezier Wave: Frequency and Amplitude

In the beginning, we tried to use particle systems to reflect the frequency and amplitude of the sounds. They turned out to be far more complicated and visually messy than we expected.
We decided to keep the graphics simple to avoid distracting users from the sound itself, and instead chose two bezier waves reflecting the low/bass and mid/high frequencies respectively.
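To keep each wave calm, the band energy driving its height can be smoothed before it reaches the curve, so the bezier breathes instead of jittering frame to frame. A sketch of that smoothing (names and the smoothing factor are illustrative; in p5.js the per-band energy would come from something like `p5.FFT`):

```javascript
// Linear interpolation, as in p5's lerp().
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// One wave per frequency band (e.g. bass, mid/high). Each frame, feed it
// the band's energy (0..255) and it eases toward the new target height.
function makeWave(maxHeight, smoothing = 0.1) {
  let height = 0;
  return function update(energy) {
    const target = (energy / 255) * maxHeight;
    height = lerp(height, target, smoothing);
    return height; // drives the wave's control-point offsets
  };
}
```

Two such instances, one per band, give the two waves their independent, slowly changing amplitudes.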
Fabrication — How We Crafted from 0 to 1
As we explored different materials, we decided to add a Bluetooth chip to our microcontroller, so we only need to power the microcontroller and the speakers. This configuration also makes it easier to move the board around. For the complete experience, we only need to plug in the power and map the projection; the finished board on its own, once powered, can also run in a sound-only mode.
Wooden Board

The board should be sturdy enough to hold the mounted objects; both acrylic and wood work fine. Since we also needed space to hide circuits and speakers in the back, something with a hollowed depth would help. Luckily we found a Lazy Susan in the size we needed, with a removable bearing base.
Woolen Cover

We found the perfect fabric to cover the surface: a light grey wool with a slight linen-like texture, soft yet stable. Taking inspiration from the construction of mattress covers, we sewed a cover with an elastic back for the board. When it is mounted or hung against the wall, there is only one noticeable seam around the edge, leaving a clean face.
Interface Design

While constructing the interface with real objects at scale, we asked a few musicians to help us lay out their ideal compositions. They told us that if we expected users to actually remember the sounds associated with the objects and to purposefully compose something, we should go for a linear arrangement reminiscent of a piano keyboard and other MIDI controllers.

We layered a few objects to give them some height, and now the board looks like a landscape from the side.
Volume Controller

Our original idea was that the volume would be adjusted as users slid their fingers along the metal rod. Due to the unpredictable nature of capacitive sensing, we could not get accurate data on where a finger was contacting the rod.

We opted for a different solution: the rod is separated into two segments, and touching one increases the volume while touching the other decreases it. We bent metal strips to fix the rod onto the board.

We drilled through the wooden board, led the wires to the back, and connected them to the Arduino. The Arduino is in turn connected to a Bluetooth chip that communicates wirelessly with a computer.
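The two-segment rod turns volume control into simple stepping logic: while a segment is held, the volume steps up or down each cycle and is clamped to a valid range. A sketch of that logic (in JavaScript for illustration; the step size and 0..1 range are hypothetical):

```javascript
// Step the volume while one of the two rod segments is held.
const VOLUME_STEP = 0.05; // hypothetical per-cycle step

function stepVolume(volume, upHeld, downHeld) {
  if (upHeld) volume += VOLUME_STEP;
  if (downHeld) volume -= VOLUME_STEP;
  // Clamp to the valid 0..1 range.
  return Math.min(1, Math.max(0, volume));
}
```

Because this only needs to know whether a segment is touched, not where, it sidesteps the imprecision of capacitive position sensing that sank the sliding design.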
Sound Mix: Building Relationship between Objects and Sounds
Given our limited time and limited ability to compose sound or music, we chose to use free sound sources for the project. This came with two problems: first, it was hard to mix four sounds with different tempos and beats; second, it is entirely up to users when to turn a sound on or off.

To deal with these two problems, we edited the sounds, matched their tempos manually, and cut them to the same duration. We also rewrote the p5.js sketch to loop every sound from the beginning at zero volume, ensuring that whatever sound a user triggers is already part of the melody.
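The always-looping trick can be sketched as mixer state: every track plays from the start, and a tap only toggles its volume, so any combination of tracks stays in sync. A minimal model (track names and the 0/1 volumes are illustrative; in the actual p5.js sketch each track would be a sound file started with `loop()` and toggled via `setVolume()`, ideally with a short ramp):

```javascript
// All tracks loop from time zero at volume 0; a tap fades a track in or out.
function makeMixer(trackNames) {
  const volumes = new Map(trackNames.map((name) => [name, 0]));
  return {
    // Called when the piezo under an object fires.
    toggle(name) {
      volumes.set(name, volumes.get(name) > 0 ? 0 : 1);
    },
    volume(name) {
      return volumes.get(name);
    },
    // Tracks the listener currently hears.
    activeTracks() {
      return trackNames.filter((name) => volumes.get(name) > 0);
    },
  };
}
```

Because no track is ever actually stopped or restarted, turning one on mid-piece always lands on the shared beat.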

We are not fully satisfied with the sounds yet; re-composition is still needed.
Projection Mapping: Light it Up
· Physically, we added a cut-out filter in front of the projector lens to block the “colorful black” rectangle that would otherwise be projected around the board.

· We used Syphon to capture the canvas from the browser in real time.

· We created a round mask to cut out what we needed from the sketch, and then fixed the skewed shape by adjusting the perspective of the output media in ISADORA.
Presentation: ITP Winter Show 2018
After five weeks of testing and pivoting, we finally put everything together. The board itself is wireless, communicating with a laptop via Bluetooth, while the laptop receives serial data from the sensors on the board. Graphics are generated in the browser, captured by Syphon, sent to ISADORA, and finally projected onto the board.

This project was selected for the Winter Show 2018, a two-day exhibition of recent creative interactive projects by students of ITP and IMA. We had the opportunity to show industry professionals as well as friends and family what we had been up to at school.

Over two days, in around eight hours of opening time, we welcomed more than two hundred guests, from gallerists and artists to high school students and advertising veterans, who listened to and interacted with our project.
Future Possibilities: Where We Can Improve
I would like to design a more complex interactive system if I had more time. A participant suggested ripple effects propagating from a user's action, both visually and sonically, which I think we could experiment with further.

Experience Design: It would improve the experience if we could run tests in real scenarios. Although we ran three rounds of playful testing at school, due to time and setup limitations we never had a chance to test in an actual living room, for instance, as our design proposal described.

Technology: We could consider a built-in LED matrix to replace the projector and make the whole system truly wireless. We also considered building a server that would let users customize the sound sources they use to compose the ambient sounds.
Feedback: What I’ve learned
Overall, it was a challenging but extremely fun design journey.

I gained hands-on experience using code to create interactive experiences and to generate sounds and graphics responding to participants' actions, which was quite different from a digital interface or architectural design project. I also had the chance to think more deeply about how digital design can blend with physical products.

I also enjoyed the teamwork: working with someone from a different background inspired me to look at a single project from various perspectives. It was also interesting to learn how a person's background affects which aspects of a design they value most. I think this will help me deliver more meaningful experience designs for clients in the future.
Thanks for making it here
Coffee or Tea?