“You should eat more soy beans to help your corn harvest”
Our popular visions of the future often include robot servants, iridescent architecture, space conquest, and humanity-saving heroes. These projections, while entertaining, are frequently audacious in ways that are hard to believe or relate to.
Last year’s Her offered a more refreshing, human, and believable vision of the future. Its world wasn’t very different from contemporary life; the ebbs and flows of living and its messy emotions aren’t typically developed in our futuristic movie heroes.
When prompted to create a simulation with Unity and an Oculus Rift, I kept this vision of the future in mind. Unity, a video game development environment, was used as a vehicle to speculate on the physical-digital interactions that could happen in the internet of everything, 25 years from now. I made a few assumptions:
1. Life in general won’t change too much. People will still go to their day jobs. They will still be bored.
2. Internet of everything. Household objects and architecture will be embedded with microprocessors and sensors; they’ll talk to each other and to you.
3. Handheld middleware. Handhelds won’t go away; they will evolve to serve as the primary middleware for interfacing with both the physical and digital worlds.
4. Subscription based comforts. Digital services and products will find ways to make home life easier, for a low monthly price.
5. The home is a marketing tool. With connected everything, there are so many new ways to sell you things and influence your behavior.
A video of the prototype is below.
Some of the questions that I hope this demonstration prompts are:
1. How can this type of simulation be used by designers, design technologists, and developers to evaluate design decisions, iterate on designs more quickly, communicate intent, and lower risk?
2. What other opportunities are there in the connected home for corporations and vendors to sell you things? How obvious and creepy will they be? Will we just learn to tolerate it?
3. What kinds of changes will be demanded of our architecture (both buildings and systems) by connected everything? Can embedding personalities into our objects and architecture make life better?
4. Will we still be carrying around handhelds in 25 years? What other types of wearables and devices will augment our daily lives? Do you hear the cars driving around outside? They sound like combustion engines … do you think we’ll still be hearing those in 25 years?
The value of a tool like Unity for both designers and developers is that it provides more cues to prompt these types of questions. Designers can evaluate the look, feel, and flow of an experience while collaborating with developers and design technologists on the code and development strategy. Perhaps such a process can result in shared ownership of the design intent and development strategy by all parties, including clients.
I have been exploring the ways in which physics simulations can be used to generate form and sculpt space. My final project attempts to use several steering behaviors and a simple physics engine to visualize a generative space.
This proof-of-concept experiment provides a framework for continued exploration. The sketch is initially seeded with 20 particles that are given random velocities. Each particle is aware of every other particle in the system and alters its behavior based on its proximity to them. There are also repeller objects in the sketch that push all of the particles away; I wanted to introduce elements that could be used to essentially sculpt the behavior of the system.
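The per-particle logic described above can be sketched in plain Python. This is only an illustration of the idea, not the project’s actual code; every name and parameter value below (the neighborhood radius, force weights, speed limits, and the repeller falloff) is my own assumption:

```python
import math
import random

class Particle:
    def __init__(self, pos, vel):
        self.pos = pos  # [x, y, z]
        self.vel = vel

def limit(vec, max_len):
    """Clamp a vector's magnitude to max_len."""
    mag = math.sqrt(sum(c * c for c in vec))
    if mag > max_len:
        return [c / mag * max_len for c in vec]
    return vec

def flock_step(particles, repellers, sep_w=1.5, coh_w=1.0, ali_w=1.0,
               radius=50.0, max_force=0.1, max_speed=2.0):
    """Advance the system one frame using simplified separation,
    cohesion, and alignment forces plus repeller objects.
    Positions are updated in place for brevity; a real sketch
    would double-buffer so each frame sees a consistent state."""
    for p in particles:
        sep = [0.0, 0.0, 0.0]
        coh = [0.0, 0.0, 0.0]
        ali = [0.0, 0.0, 0.0]
        count = 0
        for other in particles:
            if other is p:
                continue
            d = [a - b for a, b in zip(p.pos, other.pos)]
            dist = math.sqrt(sum(c * c for c in d)) or 1e-9
            if dist < radius:
                # separation: unit vector pointing away from each close neighbor
                sep = [s + c / dist for s, c in zip(sep, d)]
                coh = [s + c for s, c in zip(coh, other.pos)]
                ali = [s + c for s, c in zip(ali, other.vel)]
                count += 1
        accel = [0.0, 0.0, 0.0]
        if count:
            # cohesion: steer toward the local center of mass
            coh = [(c / count) - pc for c, pc in zip(coh, p.pos)]
            # alignment: steer toward the mean neighbor velocity
            ali = [a / count for a in ali]
            for force, w in ((sep, sep_w), (coh, coh_w), (ali, ali_w)):
                f = limit(force, max_force)
                accel = [a + w * c for a, c in zip(accel, f)]
        # repellers push every particle within their radius away
        for rpos, rrad in repellers:
            d = [a - b for a, b in zip(p.pos, rpos)]
            dist = math.sqrt(sum(c * c for c in d)) or 1e-9
            if dist < rrad:
                accel = [a + c / (dist * dist) for a, c in zip(accel, d)]
        p.vel = limit([v + a for v, a in zip(p.vel, accel)], max_speed)
        p.pos = [x + v for x, v in zip(p.pos, p.vel)]

# seed 20 particles with random velocities, as in the sketch
random.seed(1)
swarm = [Particle([random.uniform(-100, 100) for _ in range(3)],
                  [random.uniform(-1, 1) for _ in range(3)]) for _ in range(20)]
for _ in range(100):
    flock_step(swarm, repellers=[([0.0, 0.0, 0.0], 30.0)])
```

Note that each of the three forces is clamped to a maximum magnitude before being weighted, which is what keeps any single behavior from dominating when the GUI weights are pushed to extremes.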
I also used this project to experiment with different graphical user interfaces. The final GUI element uses three nodes to dynamically control three variables in the system: separation, cohesion, and alignment. When a flocking system like this is rendered in three dimensions, it becomes challenging to perceive the positions of the particles, so I added colored lines that trace the pathway of each particle. The color of each line segment is determined by its proximity to the drawing’s origin.
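The segment-coloring rule can be sketched as a simple mapping from a point’s distance-to-origin to an RGB value. The 200-unit falloff distance and the particular blue-to-red ramp here are hypothetical choices of mine, not the project’s:

```python
import math

def trail_color(pos, max_dist=200.0):
    """Map a 3D position's distance from the origin to an RGB trail color."""
    d = min(math.sqrt(sum(c * c for c in pos)), max_dist)
    t = d / max_dist  # 0.0 at the origin, 1.0 at max_dist or beyond
    return (int(255 * t), 64, int(255 * (1 - t)))  # blue near the origin, red far away

print(trail_color([0.0, 0.0, 0.0]))    # → (0, 64, 255)
print(trail_color([200.0, 0.0, 0.0]))  # → (255, 64, 0)
```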
GitHub repo here : https://github.com/davidptracy/FlockingVolumes
Project Proposal can be found here: http://davidptracy.com/files/NatureOfCode/09_ProjectProposal.pdf
CLICK THIS :: http://www.davidptracy.com/NatureOfCode/02/
SmartDust :: presence detecting floor system :: prototype
Last night, I prototyped the standard module by attaching (homemade) pressure sensors to the back of a standard 18″ carpet tile. By the end of the night, Joelle and I had the carpet tile pumping out location data to Spacebrew.
Has learning to build tangible interfaces changed your view of what constitutes good physical interaction, or has it strengthened your initial ideas?
As we pass the halfway point of the semester, I suppose it is a good time to look back on what I’ve personally accomplished and share some thoughts on physical interaction.
Building tangible interfaces hasn’t necessarily changed my view of what constitutes a good physical interaction; it has fortified it. I’m still on board with Chris Crawford: any exchange of energy or information between two subjects can technically qualify as an interaction, but just because an exchange is taking place doesn’t mean it’s of high quality. Listening, thinking, and speaking (receive, process, respond) are all essential parts of communication; according to Crawford, all three have to be working for quality interaction to occur.
Working with tangible interfaces has reinforced this attitude, but more than anything it has opened up new avenues for expression, introspection, design, delight, and play. Some systems and mechanics that once seemed obscure are now accessible and hackable.
In terms of creative exploration and introspection, pathways have been opened up through new means of sensing the world and forming expressive responses to it. Tangible interfaces engage more than just the eyes or the hands; using them to engage all of the senses can add dimensionality to art while enabling the artist to mix additional media into their work. Eventually I would like to explore the use of wireless communication to enable a device such as an aerial drone to experiment with photographic compositions and sound.
Additionally, the potential for tangible interfaces in the design world is great. In architecture and planning, paper (or more recently the screen) has been the frame through which designers simulate their solutions. The introduction of object-based play into the design process could lend itself not only to non-linearity through a parametric physical-digital relationship, but also to the inherently spatial issues involved in architectural design.
I can also imagine the expansion of tangible interfaces having an exciting impact on sculptural issues in art and design. Imagine something like interactive sculpting clay that allows both broad and minute modifications to digital and physical geometry; this type of input/output object could tie into the internet to enable collaborative sculpting.
Maybe there’s an idea for a final project in here somewhere …