EFTA00675668.pdf
From: Joscha Bach
To: Kevin Slavin <slavin@media.mit.edu>
Cc: Jeffrey Epstein, Ari Gesher, Martin Nowak, Greg Borenstein
Subject: Re: MDF
Date: Wed, 23 Oct 2013 21:53:58 +0000
On 23 Oct 2013, at 17:40, Kevin Slavin <slavin@media.mit.edu> wrote:
An experiment that I would like to see one day (and I am not aware of anyone having tried it already):
equip a subject with an augmented reality display, for instance Google Glass, and continuously feed a visual
depiction of auditory input into a corner of the display. The input should transform the result of a filtered
Fourier analysis of the sounds around the subject into regular colors and patterns that can easily be discerned
visually. At the same time, plug the ears of the subject (for instance, with noise canceling earplugs and white
noise). With a little training, subjects should be able to read typical patterns (for instance, many phonemes)
consciously from their sound overlay. But after a few weeks: Could a portion of the visual cortex adapt to the
statistical properties of the sound overlay so completely that the subject could literally perceive sounds via
their eyes? Could we see music? Could we make use of induced synesthesia to partially replace a lost
modality?
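The mapping described above can be sketched in code. This is a minimal illustration, not anything from the email itself: it assumes a fixed-size audio frame, a Hann window as the "filter", and an arbitrary split of the magnitude spectrum into three bands whose relative energies drive an RGB color. The function name and band scheme are hypothetical choices for the sketch.

```python
import numpy as np

def spectrum_to_rgb(samples, n_bands=3):
    """Map one audio frame to an RGB color via a filtered Fourier analysis.

    Hypothetical illustration of the proposed overlay: low/mid/high
    spectral energy drives the red/green/blue channels. The window
    choice and equal-width band split are assumptions for this sketch.
    """
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    # Split the magnitude spectrum into n_bands equal-width bands
    # and sum the energy in each.
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([band.sum() for band in bands])
    total = energy.sum()
    if total == 0:
        return (0, 0, 0)  # silence maps to black
    # Normalize so the color encodes the relative band energies.
    return tuple(int(255 * e / total) for e in energy)

# Example: a pure 440 Hz tone at a 16 kHz sample rate falls into the
# lowest band, so the resulting color is dominated by the red channel.
sr = 16000
t = np.arange(1024) / sr
color = spectrum_to_rgb(np.sin(2 * np.pi * 440 * t))
```

A real overlay would run this per frame on a microphone stream and render the colors (or richer spectrogram-like patterns) into a corner of the display; the single-color reduction here is only meant to show the spectrum-to-visual mapping step.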
It's not exactly what you're proposing, but are you familiar with Neil Harbisson's work/life:
http://en.wikipedia.org/wiki/Neil_Harbisson
He's an artist, primarily -- and has been living this way for quite a while now, but I'm not aware of anyone
(including Neil himself) who has looked into how it has actually altered his perception...
Just remembered the study that demonstrated how blind people may recruit parts of the visual cortex to learn
echolocation: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0020162
It seems that many blind people can learn to derive spatial imagery by using the sound of tongue clicks.
Also I am just warming up to run a class next semester called -- for the moment -- Animal Superpowers (new
name TK). Fifteen students, each one picks an animal, goes deep into how they perceive the world, and then
builds the sensory apparatus to allow a human user to understand the world as that animal does.
This is brilliant! Take that, Thomas Nagel ("What is it like to be a bat?")! On the other hand: what a bold
proposal, especially if we consider how (perhaps subtly?) differently the perception of the world, or even of facial
features, works out for different humans, and how acquaintance with skills and experiences shapes the way we
perceive our environment... Studying the perceptual and cognitive affordances of animals in this way is going to
be very inspiring.
Now imagine studying the motivation, affects and decision making of animals, too, and comparing them to
human counterparts...
If there are ways in which these kinds of experiments can fold into / draw from what we're discussing here,
I'd welcome it.
It might not be directly related to the "Center for Competition and Cooperation", but to the "Center for Creative
Cognition" ;-)
... and certainly to Takashi's cluster of ideas?
— Joscha