Can we create a spiritual experience through physical computing?

physical computing: experimenting with form and function

For our physical computing project, we developed a device for a user to have a ritualistic experience with technology. There is some irony built into this idea, as many people already appear to have a set of rituals they perform every day with their smartphones and computers. Unlike traditional rituals, which typically don't provide an interactive experience, our Shrine actively responds to the user, giving feedback at each step as the user pays tribute to the forces that allow us to perceive, create, communicate and innovate.

Scope: 8 weeks

Other team members: Julian Gonzalez, Ritwik Deshpande

My role: Idea development, troubleshooter, some Arduino coding, laser cutting, building a working lift


We began our project by sketching out some pseudo-code to understand what the interactions could look like.

pseudo-code

We wanted to build our shrine around a set of needs which would be ritualized through different interactions.

Our first iteration included giving an offering of water to awaken the shrine.

We ran into a few problems with this design. The first was how to make it apparent that the user was supposed to pour water into the cup; the second was how to get the current user to empty the cup after the ritual so the next person could participate. This was also only going to be the first sequence in a number of interactions with the Shrine. We quickly realized that in our limited amount of time, we would need to shrink the interactions down to one or two in order to have a viable prototype for the midterm.

In our next round of iterating, we decided not only to make it a ritual interaction, but to create a game that would challenge the user to “balance” the ritual as a metaphor for balancing their own life. The visuals became very literal, as there would be a mouth that spoke to you and a rolling eye to balance:

This simplification of the overall design provided focus (metaphorically and literally).

But, we still had a problem. How do you get the user to know that they have to use their hands to balance the eyeball?

We agreed that visual cues from LED lights should draw the user's attention to placing their hands over the infrared sensors, and that this approach would also simplify the circuits and wiring required to perform the ritual.

We then began to sketch the project to scale to better understand how certain components would fit and where to place holes for wiring:

We also needed to write code for the Arduino that would read the analog signal from the forward-facing IR sensor to trigger the LED strips, which would be placed just below the outline of the hands. The logic behind this is simple enough, but it still required us to write pseudo-code to better understand how it should actually work. We had a breakthrough late one night:

Finally got the #code correct to use #irsensor to turn on #ledstrip #physicalcomputing Yasssss!!!!

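The sketch below captures roughly what that breakthrough looked like in code: read the forward-facing IR sensor and switch the LED strip on once a hand gets close enough. The pin numbers and trigger threshold here are placeholders, not the values from our final build.

```cpp
// Minimal sketch of the logic: read the forward-facing IR sensor and
// switch the LED strip on when someone is close enough.
// Pin numbers and the threshold are placeholders, not our final values.

const int IR_PIN = A0;              // analog output of the IR sensor (assumed wiring)
const int LED_STRIP_PIN = 9;        // strip driven through a transistor (assumed wiring)
const int TRIGGER_THRESHOLD = 400;  // raw analogRead value, tuned by trial and error

void setup() {
  pinMode(LED_STRIP_PIN, OUTPUT);
  Serial.begin(9600);               // handy for watching raw sensor values while tuning
}

void loop() {
  int reading = analogRead(IR_PIN); // 0-1023; higher means closer for our sensor

  if (reading > TRIGGER_THRESHOLD) {
    digitalWrite(LED_STRIP_PIN, HIGH);  // light the strip below the hand outlines
  } else {
    digitalWrite(LED_STRIP_PIN, LOW);
  }

  delay(50);                        // small delay to keep the readings from jittering
}
```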

We were ready to build the body of the Shrine. We decided to create a low-fidelity prototype for the first iteration and used a shoebox because it was easy to manipulate and easy to replace if we made a mistake.

low-fi shrine

Finally, we were going to be using our serial port to communicate from the Arduino to Processing, and this proved to be the most complicated part, especially getting the physics of the rolling eyeball to work in tandem with the analog readings of two IR sensors. Ritwik was tasked with solving this problem since he had the best overall understanding of the code being used. He worked extremely hard to make it work and ultimately succeeded, but the interaction didn't feel smooth enough. So he improvised and created something new, which turned out to be very pleasing to interact with using the IR sensors.
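
For reference, the Arduino half of that serial link can be as simple as streaming both IR readings as one comma-separated line per loop. This is a minimal sketch with assumed pins and baud rate, not Ritwik's actual code.

```cpp
// Rough sketch of the Arduino side of the serial link: both IR sensor
// readings are sent as one comma-separated line per loop, which the
// Processing sketch can split apart to drive the on-screen visuals.
// Pins and baud rate are assumptions.

const int LEFT_IR_PIN = A0;
const int RIGHT_IR_PIN = A1;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int leftReading = analogRead(LEFT_IR_PIN);
  int rightReading = analogRead(RIGHT_IR_PIN);

  // e.g. "512,387\n" -- one line per update, easy to parse on the other end
  Serial.print(leftReading);
  Serial.print(',');
  Serial.println(rightReading);

  delay(33);  // roughly 30 updates per second, fast enough for smooth on-screen motion
}
```

On the Processing side, a line like this can be read with readStringUntil('\n') and broken apart with split(), which is one straightforward way to map the two readings onto the visuals.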

This first round of iterating was for our midterm, where our classmates and instructor gave positive feedback and critiques on where to improve the overall design and interaction for the final project.

finals

Our goal for the final was to give the Shrine a look that not only embodied a sense of mystique but also made it feel like a real artifact. We planned to add more LED lights and IR sensors, each corresponding to the particles on the screen. We also wanted to improve the experience of controlling those particles and add another stage to them that would initiate the rise of Patient 0 with the lift. Last, we wanted to add sound to the interaction of moving the particles to create an atmospheric layer, so you would be using your senses of touch, sight, and hearing while interacting with the Shrine.

Here are the sketches that Julian designed:

After building the Shrine for the mid-term, we had a better understanding of what was necessary to bring a project like this together. We knew that we had to test components that we hadn’t built yet. The lift was going to use stepper motors to wind up wire or string to lift the platform to the top of its box, but we hadn’t assembled a stepper motor all semester, so that was the first thing we set out to do.

To save space and use less wiring, we decided to use only one stepper motor as a linear actuator. With the motor fixed in place inside the box and a bolt glued to its shaft, we could make a platform rise and fall as a captive nut traveled up and down the bolt while the motor rotated.
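
A rough sketch of the lift control, using the standard Arduino Stepper library; the step counts, pins, and speed below are illustrative guesses rather than measurements from the finished Shrine.

```cpp
#include <Stepper.h>

// Sketch of the lift logic: the stepper spins the bolt, and the captive nut
// under the platform climbs or descends the thread. Step counts, pins, and
// speed are placeholders.

const int STEPS_PER_REV = 200;   // typical for a 1.8-degree stepper
const int REVS_TO_TOP = 40;      // depends on thread pitch and travel distance

Stepper liftMotor(STEPS_PER_REV, 8, 9, 10, 11);  // driver wired to pins 8-11 (assumed)

void raisePlatform() {
  liftMotor.step(STEPS_PER_REV * REVS_TO_TOP);   // spin one way: nut travels up the bolt
}

void lowerPlatform() {
  liftMotor.step(-STEPS_PER_REV * REVS_TO_TOP);  // reverse direction: platform comes back down
}

void setup() {
  liftMotor.setSpeed(60);  // RPM; slow enough that Patient 0 rises with some drama
}

void loop() {
  // In the real build the lift was triggered by the ritual sequence;
  // here it just cycles once per loop as a demonstration.
  raisePlatform();
  delay(2000);
  lowerPlatform();
  delay(5000);
}
```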

Stepper Motor with bolt, nut and platform while Ritwik hammers away on code.


For the body of the Shrine we chose 1/4" black acrylic to give it density and a hefty feel. We would also need the laser cutter to get the precise shape and to add etchings for aesthetic effect.

The night before the final presentation we worked all day and night. Ritwik was fine-tuning the code, Julian was creating some animations, and I concentrated on the lights and the lift.

A quick synopsis of how it worked (a rough code sketch of the sequence follows the list):

I. The user turns the Shrine on by crossing the threshold of an IR sensor and is alerted by the blinking lights of the LED rings

II. Once standing in front, the user places their hands on the hand etchings and begins to move the particles on the screen

III. Once the particles are aligned, the user is given a visual cue to start the next interaction

IV. If the user completes the interaction correctly, the lift is triggered and Patient 0 rises
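
In code, that sequence boils down to a small state machine. This is a simplified sketch: the helper function names, pins, thresholds, and the 'A'/'D' serial flags from Processing are all illustrative assumptions, not the project's actual implementation.

```cpp
// Simplified state machine for the ritual sequence above.
// The helpers stand in for our sensor reads, the messages from Processing,
// and the lift control.

enum ShrineState { IDLE, AWAKENED, BALANCING, ALIGNED, RISING };
ShrineState state = IDLE;

bool userDetected()         { return analogRead(A2) > 400; }                      // forward-facing IR (assumed pin/threshold)
bool handsOnEtchings()      { return analogRead(A0) > 400 && analogRead(A1) > 400; }
bool particlesAligned()     { return Serial.read() == 'A'; }                      // flag assumed to come back from Processing
bool finalInteractionDone() { return Serial.read() == 'D'; }
void blinkLedRings()        { /* pulse the LED rings as the wake-up cue */ }
void showNextCue()          { /* visual cue for the final interaction */ }
void raiseLift()            { /* spin the stepper so Patient 0 rises */ }

void setup() {
  Serial.begin(9600);
}

void loop() {
  switch (state) {
    case IDLE:      if (userDetected())         { blinkLedRings(); state = AWAKENED; } break;
    case AWAKENED:  if (handsOnEtchings())      { state = BALANCING; }                 break;
    case BALANCING: if (particlesAligned())     { showNextCue();   state = ALIGNED; }  break;
    case ALIGNED:   if (finalInteractionDone()) { raiseLift();     state = RISING; }   break;
    case RISING:    /* ritual complete */                                              break;
  }
}
```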


Patient 0 became a tiny llama that I had found inside Julian's box of physical computing parts. I thought it could be viewed as a ritualistic symbol, and that it would be completely unexpected and a little humorous. From the reactions it elicited, I believe I made the right choice.

Explaining the process of bringing it all together during our final presentation: