Finally, the project is fully online! Check it out:

The full concept and development documentation is available on the website, together with photos, videos, a DIY tutorial for the interface, and the open source files of the generative audiovisual installation.

I’ve also finished the final documentation videos. Here they are:

I hope you all liked what this project has turned into. This blog was an important part of my development process, as it was an interface for communicating with several technologists and creatives who helped me achieve this result.

Thank you all for following along!


Final report

The final report, together with all its essays, is finally online. Currently it is only available in Portuguese. Soon the website will have more information in English, together with the last recorded videos of the installation and the interface prototype. I’m looking forward to it!

Some captures from the last recorded session.

I’m currently preparing two videos of the whole project in action, together with its website containing full information about it. I’m looking forward to having it all done by next month! [=

Final prototype

Prototype #3 - Documented construction

Finally! Here is the documented development process of my last physical prototype. I’ll be writing a more detailed “instructable” of this, but for now, it serves its purpose!

All design decisions were made with the whole project’s intention in mind, and also considering that this interface had to be sturdy and well-made enough to withstand rough manual handling inside an installation. Images can be clicked to zoom.


Key materials (most of them bought on

- LilyPad Arduino.

- XBee + LilyPad XBee (I’ve decided to drop Bluetooth due to the problem I previously posted about).

- 850 mAh LiPo battery.

- 5 LilyPad accelerometers.

- 4-ply conductive thread.

- Regular sewing thread.

- Needles.

- Thin wire + solder.

- Textile glue.

- Scissors, stylus, textile pencil.

- 4m x 2m cotton fabric.

- 200x200x2mm plastic plate (not on the photo).

- Strong glue (not on the photo).

- Mounting tape (not on the photo).

I started by soldering the main components. I decided to do that (instead of sewing) because it guarantees power delivery across the whole circuit, and is much stronger than sewing. As this area would end up well protected and insulated, I thought this solution would work best.

Schematic drawing on the fabric:

Hand sewing the conductive thread - my sewing machine couldn’t run the 4-ply thread on both sides because of its thickness.

I had the idea of sewing (now with the sewing machine) spaced strips of regular fabric on top of the previously sewn conductive thread. This should reinforce the hand-sewn thread’s connections, as well as protect it from fraying.

Thanks to Juliana Ferrari for lending me her sewing machine! =) 

Final sewing scheme:

Electronic components sewing:

I used textile glue to insulate the conductive thread. Together with the regular fabric already sewn on, this actually insulated all the wires without loosening them or degrading their connections (tested with a multimeter)!

The battery was the last electronic component I sewed on. I had to do it in a way that would let it be removed later without damaging the fabric. So I glued a strong two-sided adhesive onto it, and made some thread loops (for the wire), as seen in the photo.

Whole circuit sewn and already functioning:

I used a thin plastic plate under the main components to keep them from being roughly handled and squeezed. That would damage not only the prototype, but the whole experience of the project. With Matsuei’s help, I was able to work thin holes into this plastic (for sewing). The final result was greater than I expected, and served its purpose very well.

To protect the whole circuit, I used a very thin two-sided rubber adhesive (mounting tape). It’s very, very light, and its strong glue keeps it from moving (I also sewed it onto the textile, just to be sure).

To finish, I sewed it onto the textile together with foam. It also works as a “door” for accessing the components later.

Finished internal:

Exterior sewing:

With the whole internal prototype finished, it was time to begin working on the exterior. The intention was for the fabric to be as neutral and simple as possible - the attention should be only on its movement and audiovisual interaction. The physical object is only a “medium”, the middle of the communication.

So my idea was to use a raw cotton fabric for its texture and color, but mainly for the fact that it retains traces of previous folding and crushing. Plastically speaking, its visual composition is constructed by the interactors in real time, while its surface keeps the “past foldings”.

I had the help of a seamstress to give the exterior its finish, and also to sew the 3 layers together.

Last but not least, a huge thanks to Diego Morales and Vinicius Matsuei from Estúdio Meio, who helped me a lot with the photo equipment (and with shooting some of the photos), but mostly with patience and lots of encouragement! =)

Soundscape Test #2

Second soundscape recording session with the malleable interface. The SuperCollider sound synthesis code of this version was contributed and masterfully built by Ales Tsurko (a genius sound designer, I must say!), who generously agreed to participate in the project. The “composition” and triggering are the same as I built in the previous version.

It is now much closer to the whole intention, and therefore to the final version! I’m very, very thrilled with this version and where it’s heading. The next one will just have some minor changes, mainly in the triggering and the balance of sounds.

DOWNLOADABLE CODES - SuperCollider files (code and samples).
The Processing system is the same as the previous version (download it here). 

Beginning to get thrilled with the results [=. I’m missing some important fluidity in the whole thing, so I’m currently working on that. I’m also trying to reach a good equilibrium between the data from the tangible interface and the cellular automata data. I intend to keep working on the visuals this week. Next week will be for finishing the final physical prototype and dotting the i’s on the sound generation.


Processing - Textile + Verlet Physics tests

First series of visual output tests. After a lot of research, I started from Jose Sanchez’s sketch on OpenProcessing, and produced the following tests by tweaking and playing with the toxiclibs geometry libraries, together with Verlet physics concepts.

Hope to reach a good result soon - the schedule is tight! But I have to say I’m very thrilled with the results so far! Playing with generative design is awesome! o/

Some definitions and files

As I’m receiving some good external feedback (very happy indeed :D), and have found some really good contributors and helpers in the SuperCollider community, I thought it might be a good idea to post here some of what’s going on “under the hood”. First, I’d like to give (just a little) more explanation of the whole project this interface is part of.

Briefly speaking, the “ground” theme of my graduation project is the concept of virtuality and immanence. Through an audiovisual installation, I want to build an experience where interactors can understand, immanently, how virtual relations are built and shaped in a trans-relative reality, and thereby incite reflection on our own reality and on how we fold it and play with it in order to bring sense to chaos.

To achieve that, I’m using a textile interface that captures movement and shape. Its aim is to bring characteristics of the virtual reality into the physical one. Through its malleability, it works with the dualities of liquidity/solidity, extensibility/intensiveness, fragility/roughness, chaos/order. When interactors play with this surface, it changes their own reality within the installation experience, as described below.

Data from the interface is input into a digital cellular automata system. It works as if the interactors were playing with another form of life, and their actions are reflected as differences in this creature - similar to what happens conceptually in trans-virtual topology. The cellular automaton then generates sounds and visuals for the interactors, closing the “feedback” loop through this communication and aiming to build in them a higher understanding of how to interact with the liquidness of virtuality.
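To make that data flow concrete, here is a hypothetical sketch of the first link in the chain - movement intensity from one accelerometer seeding cells of its “culture” into the automaton grid. It is written in Java (close to Processing), and all names and scales (such as the 20-seeds-per-unit-of-movement factor) are my assumptions for illustration, not the project’s actual code:

```java
import java.util.Random;

public class SeedFromMovement {
    // movement is normalized to [0, 1]; stronger movement seeds
    // proportionally more random cells with this culture's id.
    static int[][] seed(int[][] grid, int culture, double movement, long rngSeed) {
        Random rng = new Random(rngSeed);
        int cells = (int) Math.round(movement * 20); // up to 20 seeds per step (assumption)
        for (int i = 0; i < cells; i++) {
            int x = rng.nextInt(grid.length);
            int y = rng.nextInt(grid[0].length);
            grid[x][y] = culture;                    // 0 = dead, 1..5 = one culture per accelerometer
        }
        return grid;
    }

    public static void main(String[] args) {
        int[][] grid = new int[32][32];
        seed(grid, 1, 0.5, 42L);
        int alive = 0;
        for (int[] row : grid) for (int v : row) if (v != 0) alive++;
        System.out.println("alive cells: " + alive);
    }
}
```

From there, the automaton evolves on its own, so the interactors’ gestures leave traces rather than direct one-to-one responses.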

So sound has a very important role in this project. It has to build an actually immersive experience of another kind of time and space - a virtual, reflexive one, with the characteristics I described earlier - and it also has to be responsive to whatever changes the interactors incite through the interface. Right now I’m working on the visual part, which will bring to the senses the reflections of the folding (and unfolding) of this reality, and point even more toward the concepts behind the project.

I really intend this whole experience to compose with interactors through positive feelings and emotions - mainly because I’ve found these to be easier to accept and understand. To rephrase: it should COMPOSE with interactors. It should awaken in them a non-verbal understanding of what is going on inside their own reflections, so I’m really trying to keep the whole project in an abstract language rather than a concrete (or collage-like) one. They should give it meaning - the project’s meaning is the interaction and this reflection itself.

Small talk aside :D, below is a simple test I made with photographs and papers - just testing whether the generative visual feedback would match what I imagined.

This is still very, very far from ideal - and far from what I’ll probably be able to build in Processing at runtime. But I think it’s a start on some of the concepts I’ve thought about. It also gave me a glimpse of how I can start building the generative visual process! I actually want the final output to be modified through my process of making it. I hope it ends up very different from this. Let’s see!

They are the same as the last ones, but this time with a “pre-compiled” fake dataset, so anyone can visualize how it works without having the “hardware”!

nama009b - Processing .pde and SuperCollider .sc files.
Start SuperCollider’s internal server first, and load the synths. Then open “nama009b”.pde and run it to see - and make - some noise!

 See you soon!

Soundscape Test #1

This is my first version of the full soundscape, responding both to the interaction with the malleable interface and to the whole cellular automata system. It only uses simple sine synths - the next step will be the “clothing” of all those synths.

Simple synths are both triggered and generated in real time from the system’s data. Each “culture color” corresponds to movement data from one accelerometer. The larger its population, the more sound it generates. Each accelerometer has a different pitch.
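A minimal sketch of that mapping (in Java rather than SuperCollider; the pitch values and the linear population-to-amplitude rule are illustrative assumptions, not the project’s actual tuning):

```java
public class CultureToSound {
    // One base frequency per accelerometer - placeholder values.
    static final double[] PITCH_HZ = {220.0, 277.2, 329.6, 440.0, 554.4};

    // Larger population -> louder, clamped to [0, 1].
    static double amplitude(int population, int gridCells) {
        return Math.min(1.0, (double) population / gridCells);
    }

    static double pitch(int accelerometerIndex) {
        return PITCH_HZ[accelerometerIndex];
    }

    public static void main(String[] args) {
        System.out.println(amplitude(256, 1024)); // 0.25
        System.out.println(pitch(3));             // 440.0
    }
}
```

In the real setup these two numbers would be sent over OSC to drive the SuperCollider synths.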

Below are three different “stage” examples of the cells (not the same as in the sound file, sorry!):

After 10 seconds of “movement”:

After 20 seconds of “movement”, with the textile placed in a “negative position”:

After 30 seconds of “movement”, we’ve got a pattern showing which accelerometers have been moved the most:

These images are just visualizations of the automata system’s data.

DOWNLOADABLE CODES: - (Supercollider Synths) - (Processing interface, cell auto and sound composition)

I consider the sound development to be 50% complete. The composition sounds good to me, as does the responsiveness. However, I’m really having a hard time composing synths in SuperCollider that match the whole aesthetic of the project.

I’m really looking forward to getting some help at this! Thanks! (:

Prototype #2

I’ve finally got some time to post the development of the second prototype.

This time, I started by sewing the whole planned interface - 5 accelerometers, the LilyPad Arduino, the Bluetooth Mate Silver and the lithium battery (with the power breakout for LilyPad). Second, I’ve already included and modeled the generative cellular automata system (visualized in the video below), plus a basic sound output created in SuperCollider, generated by real-time synthesis via OSC data sent from Processing.

The movement of the interface alters the relations between cells in the cellular automaton system. The sound output takes raw data from this system. The system (here only visualized) has 2 main properties: it dynamically stores the data from movement, and it enables the emergence of new patterns from this data. The upcoming visual and sound feedback is built from these patterns.
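As an illustration of the “emergence” property, here is a minimal Conway-style update step in Java - a stand-in local rule, since the project’s actual automaton rules differ; it just shows how patterns can emerge from stored state and neighborhood rules alone:

```java
public class AutomatonStep {
    // One generation: a cell is born with exactly 3 live neighbors,
    // survives with 2 or 3, and dies otherwise (classic rule, for illustration).
    static int[][] step(int[][] g) {
        int n = g.length, m = g[0].length;
        int[][] next = new int[n][m];
        for (int x = 0; x < n; x++) {
            for (int y = 0; y < m; y++) {
                int live = 0;
                for (int dx = -1; dx <= 1; dx++)
                    for (int dy = -1; dy <= 1; dy++) {
                        if (dx == 0 && dy == 0) continue;
                        int nx = (x + dx + n) % n, ny = (y + dy + m) % m; // wrap edges
                        live += g[nx][ny];
                    }
                next[x][y] = (live == 3 || (g[x][y] == 1 && live == 2)) ? 1 : 0;
            }
        }
        return next;
    }

    public static void main(String[] args) {
        int[][] g = new int[5][5];
        g[1][2] = g[2][2] = g[3][2] = 1;  // a vertical "blinker"
        int[][] g2 = step(g);
        // The blinker flips to horizontal:
        System.out.println(g2[2][1] + " " + g2[2][2] + " " + g2[2][3]); // 1 1 1
    }
}
```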

I had a lot of hard times with the sewing, and also with the communication between Processing and SuperCollider - but it was good training before the construction of the next (and final) prototype. As I’m new to all of this - the coding, the languages, and especially sound synthesis through code - I’ve been learning a lot in this process.

Sewing components onto the fabric:

Some “bridges” had to be built to keep the + and - threads from crossing. I turned some digital pins into power sources for the accelerometers (through code).

To protect the components - and also to prevent data or power lines from crossing - I sewed a “cape” onto both sides of the fabric.

The corner accelerometers transmit Z-axis rotation data. The central-top one transmits X and Y rotation data.


Main problems so far:

- Every time Bluetooth “fails”, Mac OS X disconnects its serial port, bringing the whole system down. This is a major problem; I hope to find an answer to it soon.

- Sewing all the components correctly - there is too much noise being transmitted because of the poor insulation of the threads.


The codes below evolved from the previous ones. Keep in mind they may still have problems, and I’m sure all of them could be much more optimized.

NamaPrototype02.ino (Arduino programming).

NamaPrototype02System.pde (Processing - communication, data system and cellular automata). (SuperCollider - generative sound from Processing input).


The next step is to build the final prototype (this time with a sewing machine, a good fabric and perhaps professional help) and the final audiovisual output (sound in SuperCollider and visuals in Processing) at the same time.

I’m very thankful for all the help I’ve had so far. If you have any criticism or contribution, please feel free to write me! Thanks! :)

Prototype #1

It has been a while, but I’ve finally been able to start focusing on the interface development. Below is the first prototype, with which I was able to test the components and begin programming the serial communication in the Arduino IDE. As a second step, I built a simple Processing sketch to start interpreting the data and roughly visualizing it through images and sounds.

Both steps were a little more difficult than expected (and sewing isn’t exactly my best skill) - but both were important for knowing what I have to improve for the next prototype. The main difficulties were on the data side (converting a -1g to 1g input into angles), and also in filtering the data (accelerometers can be very noisy and unstable!).
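The -1g to 1g conversion is commonly done by treating the static axis reading as the sine of the tilt angle. A small Java sketch of that idea (an assumption about the approach, not the post’s actual code; clamping is needed because noise pushes readings past ±1g):

```java
public class GToAngle {
    // Static tilt: axis reading in g is sin(angle), so angle = asin(g).
    static double tiltDegrees(double g) {
        double clamped = Math.max(-1.0, Math.min(1.0, g)); // guard against noise beyond ±1g
        return Math.toDegrees(Math.asin(clamped));
    }

    public static void main(String[] args) {
        System.out.println(tiltDegrees(0.0)); // 0.0
        System.out.println(tiltDegrees(1.0)); // ~90
        System.out.println(tiltDegrees(0.5)); // ~30
    }
}
```

This only holds while the sensor is near-static; during fast movement, linear acceleration mixes into the reading, which is part of why filtering is needed.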

The video shows three tests I ran with different people, in order to test the responsiveness of the interface and its components. Be aware that the OUTPUT isn’t anything like what I expect for the project - it was made just to accomplish this testing task.

The gray lines represent the incoming data from X, Y and Z. The colored lines above them are the smoothed data, built with a customized Kalman filter. The first 3D object changes with the raw data, and the second with the smoothed data.
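For readers curious about the smoothing step, a minimal one-dimensional Kalman filter looks roughly like this (a generic Java sketch; the post’s customized filter and its noise constants q and r are not shown anywhere, so the values here are placeholders):

```java
public class Kalman1D {
    double estimate = 0.0;   // current state estimate
    double errCov = 1.0;     // estimate error covariance
    final double q;          // process noise: how fast the true value can move
    final double r;          // measurement noise: how noisy the sensor is

    Kalman1D(double q, double r) { this.q = q; this.r = r; }

    double update(double measurement) {
        errCov += q;                          // predict: uncertainty grows
        double gain = errCov / (errCov + r);  // Kalman gain
        estimate += gain * (measurement - estimate);
        errCov *= (1.0 - gain);               // correct: uncertainty shrinks
        return estimate;
    }

    public static void main(String[] args) {
        Kalman1D k = new Kalman1D(0.01, 0.5);
        double[] noisy = {1.2, 0.8, 1.1, 0.9, 1.05, 0.95};
        double out = 0;
        for (double m : noisy) out = k.update(m);
        System.out.println(out); // smoothed estimate, between the noisy extremes
    }
}
```

Tuning is the whole game: a larger r trusts the sensor less (smoother but laggier), while a larger q tracks fast changes at the cost of passing more noise.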

General setup:

- LilyPad Arduino main board.

- 1 triple-axis accelerometer (ADXL335).

- Bluetooth Mate Silver.

- Polymer lithium-ion battery, 850 mAh.

- Sewn with conductive thread (234/34 4-ply).

- Cheap/common fabric.

For a first prototype, I have to say I’m really excited! This is still very far from what I expected, but with some changes it can be improved a lot:

- The Kalman filter must be refined further, to transfer more subtle data and to exclude the noise spikes.

- The sewing must be very well made so that the contact between the thread and the components doesn’t change - poor contact causes data loss and noisier input.

- The Bluetooth serial port (running on Mac OS X) keeps going down.

- Focus on Z-axis rotation. To explain:

I also ran tests with the sound output responding to rotation around each of the 3 axes, to see which of them (given the complexity of this specific interface) is the most responsive. It turns out that Z rotation brings the best results (I’m not completely sure why yet). The X and Y angles change very little, and I’m thinking about not using those inputs in the following prototypes (since the exact location of each accelerometer isn’t needed).


NamaPrototype01.ino (Arduino programming)

NamaPrototype01Visualization.pde (Processing visualization code)


Prototype #2 will have the final configuration of the interface (as seen in a previous post), and will be used to start coding the definitive cellular automata system and the audiovisual output.

Now let’s keep it up, there is little time and much to be done! :)

First basic setup

Here is the first basic setup of the whole project:

Currently figuring out which programming language (Processing, or C++ [openFrameworks or Cinder]) is best suited for my project (and my current skills) while I wait for my Sparkfun package to arrive [=.

Ricardo O'Nascimento

Great inspiration and electronic art projects with textiles.

General doubts

Just posted some doubts on the Arduino and Instructables forums:,82583.0.html,82585.0.html

I’m also contacting some experienced e-textile designers who might contribute to and critique the schematic I’ve created. At this stage of acquiring the materials, it’s important for me to be sure about the components and the schematic’s functionality. I realize it’s best to experiment when building this kind of project - but my resources are a little limited, since obtaining this stuff in Brazil isn’t fast or cheap.

About Arduino and the schematic:

- Will the raw data from the whole schematic (5 accelerometers to LilyPad to Bluetooth) be generated fast enough? A frequency of 10 “readings” per second should be enough for what I intend to do with them, but the ideal would be 30.

- How much power will the schematic consume? Are three 20 mm coin cell batteries going to do the job? How long would they take to drain?
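Both of these questions can at least be bounded with back-of-envelope arithmetic. In the sketch below (Java), every figure is an assumption taken from typical datasheet values - ~5 ASCII bytes per transmitted value, 9600 baud, a CR2032’s ~220 mAh, and rough current draws for the boards - not a measurement of this circuit:

```java
public class Estimates {
    // Max full readings/s that fit through the serial link, assuming each
    // value is sent as ~bytesPerValue ASCII bytes including a separator.
    static double maxReadingsPerSecond(int baud, int accels, int axes, int bytesPerValue) {
        double bytesPerSecond = baud / 10.0;  // ~10 bits per byte incl. start/stop bits
        return bytesPerSecond / (accels * axes * bytesPerValue);
    }

    // Coin cells in series raise voltage, not capacity, so use one cell's mAh.
    static double hoursOnCell(double cellmAh, double drawmA) {
        return cellmAh / drawmA;
    }

    public static void main(String[] args) {
        // 9600 baud, 5 accelerometers x 3 axes, ~5 bytes per value:
        System.out.println(maxReadingsPerSecond(9600, 5, 3, 5)); // 12.8
        // Assumed draws: board ~8 mA + Bluetooth ~25 mA + 5 x 0.35 mA:
        System.out.println(hoursOnCell(220.0, 8 + 25 + 5 * 0.35)); // ~6.3 hours
    }
}
```

So at the default baud rate, 10 readings/s fits but 30 does not without a higher rate or a tighter (e.g. binary) format; and beyond the hour estimate, small coin cells generally struggle to deliver a Bluetooth module’s current at all, which argues for the LiPo battery used later.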

About soft circuits and e-textiles:

- What is best for transporting data and power through this interface: conductive fabric, conductive thread, or even a really thin wire? It’s important that it can be folded and twisted, and that it stays resistant over time.

- Could you recommend any kind of textile more suitable for this application? It should be just a little stretchy, and very resistant.

- Any ideas on how I should protect the components from damage from rough hand manipulation?

- Any advice about sewing the whole application?