Background

During the last three years, from 2002 onwards, we have seen and lived as users the rise of wireless networks and the commercial arrival of location-aware devices (GPS units, mobile phones, PDAs, all capable of locating and connecting the user to both planetary coordinates and the internet meshwork). Since then, we have gathered all kinds of data logs: from bike rides, mountains and international borders to South American trips into massive cities like Medellin or Bogota in Colombia.

From all that gathered data we thought of making use of it in the form of sound and image; this led to an interest in augmented reality and the idea of adding realtime computer processes to the very human act of mapping.

Our first project was a collaboration whose main interest was bringing together wireless sniffer applications with 3D modelling software (Kismet + fluxus), then music and OSC (Open Sound Control) so as to achieve audible packets. More recently we have added GPSD, so as to integrate a realtime trajectory into a line of events that relate to how we move in the cities, allowing the visitor/navigator to move forward and backward in the timeline of the dérive; s/he will be able to interact in time with sound and objects that represent the place s/he also dwells in.
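As a minimal sketch of such a GPSD-to-sound bridge (a sketch only, not the project's actual code: it assumes gpsd's JSON watch protocol is available, and the OSC address /gps/fix, the Pd port 9000 and the python-osc package are our own choices here), a small Python script can watch gpsd and forward each position fix as an OSC message for a Pd patch to pick up:

    # gps2osc.py -- forward gpsd position fixes to Pd as OSC messages.
    # Assumptions: gpsd running on localhost:2947 speaking its JSON
    # protocol, a Pd patch listening for OSC on UDP port 9000, and the
    # python-osc package installed.
    import json
    import socket

    from pythonosc.udp_client import SimpleUDPClient

    GPSD_HOST, GPSD_PORT = "localhost", 2947
    OSC_HOST, OSC_PORT = "127.0.0.1", 9000   # hypothetical Pd listener

    osc = SimpleUDPClient(OSC_HOST, OSC_PORT)

    with socket.create_connection((GPSD_HOST, GPSD_PORT)) as sock:
        # Ask gpsd to stream its reports as JSON objects, one per line.
        sock.sendall(b'?WATCH={"enable":true,"json":true}\n')
        for line in sock.makefile("r"):
            try:
                report = json.loads(line)
            except ValueError:
                continue
            # TPV ("time-position-velocity") reports carry the actual fix.
            if report.get("class") == "TPV" and "lat" in report and "lon" in report:
                osc.send_message("/gps/fix", [report["lat"], report["lon"],
                                              report.get("speed", 0.0)])

On the Pd side, externals such as mrpeach's [udpreceive] and [unpackOSC] could then turn each fix into control data for sound and objects.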

Following the question "how do we map ourselves in the mixed reality?", we came to develop these projects out of the need and interest in using the long logs of wardriving data gathered here and elsewhere.

It is clear that there is no need to always begin from scratch and create "new" tools (the GNU/Linux world is full of, and growing with, great tools), but rather to find out how to make those already there interact according to one's needs and expectations. To accomplish this artistic feat we are building bridges to get PureData (Pd, www.puredata.org) and OpenGL applications able to receive GPS and wireless data (Kismet, www.kismetwireless.net) in real time, as sketched below.
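In the same spirit, a hedged sketch of the Kismet side of the bridge (the log path, the semicolon-separated column positions of ESSID and BSSID, and the OSC address /kismet/net are all assumptions to be adapted to the actual Kismet setup): tail Kismet's CSV log and emit one OSC event per newly heard network, the raw material of the audible packets mentioned above:

    # kismet2osc.py -- tail a Kismet CSV log and announce newly seen
    # networks to Pd as OSC messages. A sketch under assumptions: the
    # log path, the column layout (ESSID/BSSID positions) and the OSC
    # address/port all need checking against the real Kismet output.
    import time

    from pythonosc.udp_client import SimpleUDPClient

    LOGFILE = "/var/log/kismet/Kismet.csv"   # hypothetical path
    osc = SimpleUDPClient("127.0.0.1", 9000)
    seen = set()

    with open(LOGFILE, "r", errors="replace") as log:
        while True:
            line = log.readline()
            if not line:
                time.sleep(1.0)          # wait for Kismet to append more
                continue
            fields = line.strip().split(";")
            if len(fields) < 4 or fields[0] == "Network":
                continue                 # skip the header and short lines
            essid, bssid = fields[2], fields[3]  # adjust to the real header
            if bssid not in seen:
                seen.add(bssid)
                # one OSC event per newly heard network
                osc.send_message("/kismet/net", [essid, bssid])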

Some reference URLs from side projects that led to S.O.U.P are: