Friday, May 6, 2011

Friday the 13th of May... it's time for BANGFACE

Wednesday, May 20, 2009

Extra Senses, Extra Interference. Presentation day. Photos




Extra Senses, Extra Interference: Juan & Pablo final report

Extra Senses Extra Interference [AS Research Group]
May 2009 – Final Report
listening without ears [project codename]
by Juan Cantizzani and Pablo Sanz Almoguera


- a description of the extra sense you have made
It is a system based on a bone conduction interface for sound and on the sonification of electromagnetic emissions.

- what does it sense ?
Electromagnetic emissions (close-range VLF, or high frequency, approx. 100 MHz to 2.5 GHz).

- how is this information translated to something we can perceive ?
The electromagnetic emissions are sonified within the audible range and transmitted to the cochlea via bone conduction through the skull.

- what can be communicated through this sense ?
The presence of electromagnetic fields, devices, transmissions, etc. (Wi-Fi networks, electronic appliances, phone calls, antennas, and so on).

- how does it work technically ?
To pick up the EM emissions there are two different EM sniffers (one for VLF and one for high frequency), attached to the body with a bracelet. The close-range VLF sniffer has a coil that works as a detector. This detector is attached to a stick and operated by the user to scan electronic devices and perceive their EM emissions.

The audio output of these devices is fed through a wearable bone-conduction/tactile sound system. The interface is based on three small solenoids sewn to an elastic headband, driven directly by an amplifier (not portable in this version).
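
To make the signal chain concrete: our actual setup feeds the sniffers' audio output straight into the amplifier, with no microcontroller involved, but a minimal Arduino sketch of the same idea (read a probe, turn field strength into an audible buzz on a solenoid driver) could look like this. The pin numbers and the field-to-pitch mapping are assumptions for illustration, not values from our build.

const int PROBE_PIN    = A0;  // analog output of an EM sniffer (0-5 V)
const int SOLENOID_PIN = 9;   // drives one solenoid via a transistor + flyback diode

void setup() {
  pinMode(SOLENOID_PIN, OUTPUT);
}

void loop() {
  int level = analogRead(PROBE_PIN);          // 0..1023
  // Map field strength to an audible pitch: stronger field, higher buzz.
  int freq = map(level, 0, 1023, 100, 2000);  // Hz, within bone-conduction range
  tone(SOLENOID_PIN, freq);
  delay(20);                                  // re-read the probe ~50 times/s
}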

- how did you expect this sense to change our everyday perception and behaviour ?
Hypothetically, it could increase our awareness of the pervasive electromagnetic activity in our everyday environments.

- how does the extra sense change the perception of the world around you ? - (how) does the extra sense change the way you behave in the world around you ?
It adds an extra layer of sound to our everyday perception that, besides the aesthetic pleasure (if you like noise), can provide information about the EM emissions of specific electronic devices and the EM fields present in some areas. The final outcome depends on each person's reaction: it could be addictive for noise enthusiasts, or alarm other people about electromagnetic emissions.

- how did your project develop and change over time ?
We spent most of the time researching and building the bone conduction interface, which was our main curiosity in this project. We found different solutions and tried a couple of them. After settling on solenoids as a nice, doable solution within our time constraints, we focused on getting several of them and building the interface. Regarding the input, it worked quite straightforwardly, much as planned (also because we got a ready-made circuit), even though we think some improvements and changes could be tried, for example changing the probe coil, limiting the frequency ranges, or doing a more complex sonification of the input (a filter sketch follows below).
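
As a sketch of the 'limiting the frequency ranges' idea: a one-pole low-pass filter (a simple exponential moving average) could tame the probe signal before sonification. We have not tried this; the ALPHA value and the pin below are assumptions.

const int PROBE_PIN = A0;  // analog output of the EM sniffer
float filtered = 0.0;      // filter state
const float ALPHA = 0.1;   // 0..1; smaller = heavier smoothing, lower cutoff

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(PROBE_PIN);
  filtered = ALPHA * raw + (1.0 - ALPHA) * filtered;  // one low-pass step
  Serial.println(filtered);  // inspect the band-limited signal
  delay(5);
}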

- what happened as you expected, and what unexpected things took place?
We learned a lot researching bone conduction/tactile sound for this project and building the interface, a process in which some tests were more successful than others. Even though they were not suitable for this device, we found our first tests with piezo drivers quite interesting and valuable for other projects. Finding the quite simple DIY solution of the solenoids for our interface was a surprise. Getting the EM sniffers is also nice, since they will be useful for other projects too.

extra-senses-report-juan-pablo.pdf

Monday, May 11, 2009

ghosts in the machine

Hi,

here is a link to an interesting project, with some connections to the course:


http://www.we-make-money-not-art.com/archives/2008/12/ghosts-in-the-machine-uses.php


this project is part of a larger undertaking:

http://www.ucalgary.ca/~einbrain/new/main.html

Thursday, May 7, 2009

arduino information

The best source of information is the arduino site:

http://www.arduino.cc/

The first thing you need is 'Getting Started'; this page explains how to install the software, how to connect the board, and how to get the first example working:

http://arduino.cc/en/Guide/HomePage

Good for starters are the different examples on the Arduino site:

http://arduino.cc/en/Tutorial/HomePage

Another tutorial for starters can be found here:

http://www.ladyada.net/learn/arduino/index.html

other references are:

here is a PDF guide covering the Arduino programming language:

http://www.arduino.cc/playground/uploads/Main/arduino_notebook_v1-1.pdf

and here is the book by Massimo Banzi, the brain behind the Arduino (I have it, if anyone is interested in taking a look):

http://oreilly.com/catalog/9780596155513/

a (much) earlier version of this; not all of it is still correct:
(this link will disappear after the course, since Massimo Banzi has stopped distributing this version)

Arduino_booklet02.pdf

pager motors


These are the pager motors I bought; the catalog page has some technical details as well. They are €1.50 each.

http://www.voti.nl/winkel/catalog.html?MOT-15
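
If you want to try one of these motors from an Arduino, a sketch along these lines should work. Drive the motor through an NPN transistor with a flyback diode across the motor; an Arduino pin cannot supply the motor current directly. The pin number and ramp timing are just example values.

const int MOTOR_PIN = 6;  // PWM-capable pin, switching the transistor base

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Ramp the vibration intensity up and back down again.
  for (int duty = 0; duty <= 255; duty += 5) {
    analogWrite(MOTOR_PIN, duty);  // 0 = off, 255 = full speed
    delay(30);
  }
  for (int duty = 255; duty >= 0; duty -= 5) {
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
}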
How The Brain Controls What The Eyes See

ArtScience students are probably familiar with the well-known optical illusion depicted below. We see either a vase or the faces of two people. What we observe depends on the patterns of neural activity going on in our brains, and it depends entirely on changes that occur in the brain, since the image itself always stays exactly the same. When viewing ambiguous images such as optical illusions, patterns of neural activity within specific brain regions change systematically as perception changes. More importantly, patterns of neural activity in some brain regions are very similar when observers are presented with comparable ambiguous and unambiguous images. The fact that some brain areas show the same pattern of activity when we view a real image and when we interpret an ambiguous image in the same way implicates these regions in creating the conscious experience of the object being viewed.

Findings from these studies may further contribute to scientists' understanding of disorders such as dyslexia, in which individuals are thought to suffer from deficiencies in processing motion, by providing information about the functional role that specific brain regions play in motion perception.

Wednesday, May 6, 2009

eyeborg




http://eyeborgproject.com/
Take a one-eyed filmmaker, an unemployed engineer, and a vision for something that's never been done before, and you have yourself the EyeBorg Project. Rob Spence and Kosta Grammatis are trying to make history by embedding a video camera and a transmitter in a prosthetic eye. That eye is going in Rob's eye socket, and will record the world from a perspective that's never been seen before.


natural interactive walking



http://www.niwproject.eu/
http://www.cim.mcgill.ca/~alvinlaw/research/

Project Description


NIW will investigate possibilities for the integrated and interchangeable use of the haptic and auditory modalities in floor interfaces, and for the synergy of perception and action in capturing and guiding human walking. Its objective is to provide closed-loop interaction paradigms, negotiated with users and validated through experiments, enabling the transfer of skills previously learned in everyday tasks associated with walking, where multi-sensory feedback and sensory substitution can be exploited to create unitary multimodal percepts.

NIW will expose walkers to virtual scenes presenting grounds of different natures, populated with natural obstacles and human artefacts, in which to situate the sensing and display of haptic and acoustic information for interactive simulation, and where vision will play an integrative role. Experiments will measure the ecological validity of such scenarios, also investigating the cognitive aspects of the underlying perceptual processes. Floor-based interfaces will be designed and prototyped by making use of existing haptic and acoustic sensing and actuation devices, comprising interactive floor tiles and soles, with special attention to simplicity of technology. Their applicability to navigation aids, such as land-marking, guidance to locations of interest, signalling, and warning about obstacles and restricted areas, will be assessed. NIW will nurture floor and shoe designs which may affect the way we get information from the environment.

FET-Open will further benefit from the discovery of cross-modal psychophysical phenomena, the design of ecologically valid walking interaction paradigms, the modelling of motion analysis and multimodal display synthesis algorithms, the study of non visual floor-based navigation aids, and the development of guidelines for the use of existing sensing and actuation technologies to create virtual walking interaction scenarios.



Tuesday, May 5, 2009

SWAN : System for Wearable Audio Navigation

SWAN is a project of the Psychology Department's Sonification Lab at Georgia Institute of Technology.

Background

The Georgia Tech SWAN system fits in a small backpack. There is a continuing need for a portable, practical, and highly functional navigation aid for people with vision loss. This includes temporary loss, such as firefighters in a smoke-filled building, and long-term or permanent blindness. In either case, the user needs to move from place to place, avoid obstacles, and learn the details of the environment.

SWAN Architecture
The core system is a small computer--either a lightweight laptop or an even smaller handheld device--with a variety of location and orientation tracking technologies, including GPS, inertial sensors, a pedometer, RFID tags, RF sensors, and a compass, among others. Sophisticated sensor fusion is used to determine the best estimate of the user's location and which way she is facing. See the SWAN architecture figure for more details of the components. You can also find out more about the bone conduction headphones, or "bonephones", we use to present the audio interface/sounds to the user, on our Bonephones Research page.
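
To give a flavour of what such sensor fusion means, here is a toy C++ sketch that blends a smooth but drifting gyro heading with a noisy but drift-free compass heading using a complementary filter. SWAN's actual fusion is far more sophisticated; the 0.98 weight and all names here are assumptions.

#include <cstdio>

// Complementary filter: trust the gyro short-term, the compass long-term.
// (Angle wraparound at 360 degrees is ignored for brevity.)
double fuseHeading(double gyroHeading, double compassHeading) {
  const double W = 0.98;  // blend weight, an assumed tuning value
  return W * gyroHeading + (1.0 - W) * compassHeading;
}

int main() {
  double heading  = 90.0;   // current estimate, degrees
  double gyroRate = 1.5;    // degrees turned per step (example value)
  double compass  = 100.0;  // noisy absolute reading (example value)
  for (int step = 0; step < 5; ++step) {
    // Integrate the gyro, then nudge the estimate toward the compass.
    heading = fuseHeading(heading + gyroRate, compass);
    std::printf("step %d: heading %.1f\n", step, heading);
  }
  return 0;
}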

Once the user's location and heading are determined, SWAN uses an audio-only interface (basically, a series of non-speech sounds called "beacons") to guide the listener along a path, while at the same time indicating the location of other important features in the environment (see below). SWAN includes sounds for the following purposes:

  • Navigation Beacon sounds guide the listener along a predetermined path, from a start point, through several waypoints, and arriving at the listener's destination.
  • Object Sounds indicate the location and type of objects around the listener, such as furniture, fountains, doorways, etc.
  • Surface Transition sounds signify a change in the walking surface, such as sidewalk to grass, carpet to tile, level corridor to descending stairway, curb cuts, etc.
  • Locations, such as offices, classrooms, shops, buildings, and bus stops, are also indicated with sounds.
  • Annotations are brief speech messages recorded by users that provide additional details about the environment. For example, "Deep puddle here when it rains."
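
To make the beacon idea concrete, the toy C++ sketch below shows one way a beacon sound could be panned between the ears according to the bearing of the next waypoint relative to the listener's heading. This is not SWAN's code; the constant-power pan law and the frontal clamping are assumptions.

#include <cmath>
#include <cstdio>

const double PI = 3.14159265358979;

// Stereo gains for a beacon at `bearing` degrees relative to the
// listener's heading (-180..180, negative = to the left).
void beaconGains(double bearing, double& left, double& right) {
  // Clamp to the frontal hemisphere and map to a pan position 0..1.
  if (bearing < -90.0) bearing = -90.0;
  if (bearing >  90.0) bearing =  90.0;
  double pan = (bearing + 90.0) / 180.0;  // 0 = hard left, 1 = hard right
  left  = std::cos(pan * PI / 2.0);       // constant-power pan law
  right = std::sin(pan * PI / 2.0);
}

int main() {
  double l, r;
  beaconGains(-45.0, l, r);  // next waypoint 45 degrees to the listener's left
  std::printf("left %.2f  right %.2f\n", l, r);  // louder in the left ear
  return 0;
}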

Sunday, May 3, 2009

project report -- listening without ears

hello there, this is Juan & Pablo, introducing our first ideas about the project we will undertake in the forthcoming weeks.

CONCEPT

The initial idea is to attempt the development of one or more devices based on the principles of bone conduction and tactile sound, in order to use them as interfaces for the perceptualization of data/sensory information.

CALL FOR COLLABORATION

We are completely open to anybody who would like to join us, even just to collaborate on some aspect of the project or to try combining it with any of the other projects developed within the group. In addition, any help, suggestions, and advice of any kind will be very welcome and appreciated.

RESOURCES

Bone conduction is the conduction of sound to the inner ear directly through the bones of the skull, bypassing the eardrum. Tactile sound is the sensation of sound transmitted directly to the body by contact. We are interested in exploring the frontier between acoustic and tactile perception, and in examining whether these principles could be used to receive sensory information that is normally imperceivable. Being quite unobtrusive yet complementary to the rest of the existing sensory modalities, it strikes us as an interesting way of extending our perception, making use of this little-known capacity for 'listening without ears' that we all possess.

So far we have been researching those concepts a little and compiling info about sound art projects, commercial products, and existing technologies that make use of them. Here are some of the links with further info that we have found:

http://en.wikipedia.org/wiki/Bone_conduction
http://en.wikipedia.org/wiki/Bonephones
http://en.wikipedia.org/wiki/Tactile_sound
Wired's article 'High-tech hearing bypasses ears'
forum thread on bone-earphones
bone-conduction speaker device paper
Goldendance MGD-01/MGD-02
SwiMP3
bone-conduction pillow
bone-phone (Sanyo) [2]
bone-phone (Finger-whisper) [2]
previous post in our 'extra-senses' blog with related art projects

WHAT TO PERCEPTUALISE / LIMITATIONS

In addition, we started to think about what would be interesting to feed through these devices, considering both the limitations and the possibilities we may eventually find. In any case, the device will receive sound, so some kind of sonification or direct audification process will necessarily have to be applied to whichever input we intend to use.
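
As a toy illustration of the difference: in direct audification the raw sensor samples are played back as the audio signal itself, instead of being mapped onto sound parameters. A crude Arduino version might look like the sketch below; the pins and timing are assumptions, and a real version would have to raise the PWM carrier frequency via the timer registers to carry audio properly.

const int SENSOR_PIN = A0;  // whatever input we end up using
const int AUDIO_PIN  = 9;   // PWM out, low-pass filtered into the amplifier

void setup() {
  pinMode(AUDIO_PIN, OUTPUT);
}

void loop() {
  // Write each raw sample straight to the PWM duty cycle, so the
  // fluctuations of the input are heard directly as sound.
  int sample = analogRead(SENSOR_PIN) >> 2;  // 10-bit reading -> 8-bit duty
  analogWrite(AUDIO_PIN, sample);
  delayMicroseconds(125);                    // roughly an 8 kHz sample rate
}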

We also found differences between direct bone conduction via transducers attached to the skull and transducers attached to other areas. These differences include the perceivable frequency range: on areas of the body other than the skull the result is a mostly pure tactile feeling, similar to what we would get using small vibration motors.

Another issue is the possibility of receiving spatial information. With bone conduction through the skull, the source of the stimulus seems to be inside our head, with no perceived spatial cues; this would therefore not be suitable for input whose crucial information requires precise localization (orientation, navigation, etc.), but it could work for sensing something like the overall state of an environment. Regarding this, we think a hybrid device could perhaps be built, combining bone conduction through the skull with tactile stimuli on other areas of the body.

So, possible things to be made perceivable through this device could be: any range of the non-perceivable electromagnetic spectrum, infrasound/ultrasound, magnetic/electric fields, movement, augmented reality markers...? We are thinking about these possibilities and checking how to implement some of them, but so far the priority is to start working on the device itself, since different inputs can later be connected to it to try them out, and more accurate ideas about what could work will probably arise once we start to experiment.

NEXT STEPS

Therefore, first plans include starting immediately this coming week to find out how to build small wearable bone-conduction/tactile transducers (which electronic components are most suitable, etc.), in order to have something to experiment with in practice as soon as possible.

Any specific info you might have about this would be very much appreciated. We found some possible solutions, but none of them really small, so this technical issue is still not completely clear to a pair of electro-dummies like us.



Frontispiece of John Bulwer's Philocophus: or, The Deafe and Dumbe Mans Friend. Printed for Humphrey Moseley, London, 1648. Note the kneeling man who is “hearing” music through his teeth via bone conduction.