imomus: (Default)
[personal profile] imomus
One of the more interesting topics that came up in yesterday's three-hour livecast from the Loophole was the question of whether humans, in future, will have computers actually implanted into their brains. I thought they would, and imagined a semi-comical scenario in which a man defaulting on his broadband bills gets notice from his provider that his entire brain will be temporarily short-circuited until he pays up. Despite this sort of nightmare, I said that I'd probably be someone who'd sign up for the "internalised computer", because I've largely externalised my brain already. Click Opera, for instance, is a sort of externalised memory for me now, and I find it quite difficult to give a lecture without having it beamed, searchable, onto a screen behind me.



Today I read an interesting and related article in the London Review of Books on the same subject. Jerry Fodor's Where is my mind? is an explanation -- and attempted refutation -- of the position I adopted yesterday, which is (I learn) called the Extended Mind Thesis, or EMT. Fodor's article is a review of Andy Clark’s new book, Supersizing the Mind: Embodiment, Action and Cognitive Extension.

In a preface (co-written by David Chalmers), Clark lays out a simple example of EMT: David's new iPhone. "I bought an iPhone. The iPhone has already taken over some of the central functions of my brain... My iPhone is not my tool, or at least it is not wholly my tool. Parts of it have become parts of me... When parts of the environment are coupled to the brain in the right way, they become parts of the mind."



Fodor doesn't buy this, and doesn't accept Clark's argument that, where person X remembers an address and person Y consults an iPhone to find it, the iPhone is an extension of person Y's mind. Clark says that an opponent of mind-supersizing (or EMT) would have to show there was a significant difference between person X and person Y in this example. The argument then takes in something called "the parity principle" and the slippery slope between the terms of a dualism, at which point I lose patience with the philosophical hair-splitting (everything in philosophy hinges on arbitrary definitions, and I tend to take the Pirandellian view: "right you are -- if you think so").

Fodor's big argument against EMT is that minds have contents, and can make decisions. He thinks that an iPhone can't have contents or make decisions, but what would he say about Click Opera? It surely does have contents -- the past activities of my brain and the brains of my interlocutors. And sure, it can't make decisions, but decision-making isn't the only brain function; there's also the storage of information in memory. Click Opera stores information that I no longer have in my brain; when I connect to this information, I have a way to remember. Click Opera can't substitute for my brain, but it can extend it.



Fodor then modifies this by saying that the mind has "underived content" whereas an iPhone (and presumably Click Opera) has "derived" content -- content derived from the mind. I think this underestimates how much of the content of our minds is derived from other minds. In other words, if the mind extends into the world, by the same token the world impinges into the mind. Mind and world are really not distinguishable.

Fodor's arguments get too technical for me towards the end of his piece -- and when I arrive at the bottom I discover that his main intellectual enterprise seems to be pointing out Darwin's mistakes, which worries me somewhat. The general thrust of his argument seems to be premised on a belief in the absolute independence of the individual brain, a Cartesian insistence on its a priori separation from culture, body, world. I would be inclined much more to cultural determinism -- I'm entirely happy with the idea that our national cultures, our software, our media, and even our philosophical traditions (Mr Fodor!) think us. I don't think this determines us absolutely, but I do think it takes conscious effort to escape the patterning of, say, the way a music sequencing software engineer makes us think about music, or the way a British newspaper editor makes us think about current events in the world. I also think that one can't ever entirely escape structurations like these, only swap them for other structurations (improv jazz, a Japanese poet's take on current world events).



Most of all, though, I feel that Clark and Chalmers' supersizing idea -- the Extended Mind Thesis -- fits my life intuitively. I feel that both technology and media extend my mind, and mingle it with other minds. This is why I do what I do; I like that promiscuity, that cultural reproduction. Rinus said an interesting thing when we were talking about this yesterday. "If you wrote two entries a day, and scheduled them, you could keep blogging daily thirty years after you're dead," he said. It was a joke, but not really so far-fetched, and the perfect refutation of Fodor's case here. Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer.

(no subject)

Date: 2009-02-08 11:56 am (UTC)
From: (Anonymous)
you might want to (re?)read mcluhan's "understanding media"

(no subject)

Date: 2009-02-08 12:31 pm (UTC)
From: [identity profile] imomus.livejournal.com
McLuhan seems to be on the extensionalist side:

"we have extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned. Rapidly, we approach the final phase of the extensions of man - the technological simulation of consciousness, when the creative process of knowing will be collectively and corporately extended to the whole of human society, much as we have already extended our senses and our nerves by the various media. Whether the extension of consciousness, so long sought by advertisers for specific products, will be "a good thing" is a question that admits of a wide solution. There is little possibility of answering such questions about the extensions of man without considering all of them together. Any extension, whether of skin, hand, or foot, affects the whole psychic and social complex."

He may be ambivalent about the Extended Mind Thesis being a good thing, but he doesn't doubt that it's happening.

(no subject)

Date: 2009-02-08 12:31 pm (UTC)
From: [identity profile] 33mhz.livejournal.com
But iPhones and anything else with a CPU in them actually CAN make decisions. Conditional branching is universal to anything that could truthfully be called a computer.
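[A few lines of Python can illustrate the point about conditional branching made above: the machine chooses one path of execution or another depending on a condition, which maps neatly onto the address-lookup scenario in the post. The function name, contact names and addresses here are invented for illustration.]

```python
# A computer's "decision" is just conditional branching: pick one
# execution path or another depending on a condition.
def recall_address(name, contacts):
    """Return a stored address, deciding what to do when it is missing."""
    if name in contacts:        # one branch...
        return contacts[name]
    return "no entry found"     # ...or the other

contacts = {"Hiroshi": "12 Kollwitzstrasse"}
print(recall_address("Hiroshi", contacts))  # prints "12 Kollwitzstrasse"
print(recall_address("Jerry", contacts))    # prints "no entry found"
```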

(no subject)

Date: 2009-02-08 12:49 pm (UTC)
From: [identity profile] imomus.livejournal.com
Dennett on Fodor's anti-Darwinism (http://ase.tufts.edu/cogstud/papers/fodorondarwinism3b.pdf).

A frog may have such intentions,

Date: 2009-02-08 09:44 pm (UTC)
From: [identity profile] pay-option07.livejournal.com
Would I hate to think of Menno as a frog?

http://mennokrant.net/

(no subject)

Date: 2009-02-08 01:11 pm (UTC)
From: [identity profile] jermynsavile.livejournal.com
Like an address book? A diary? A library?

And a phone is just a way of sending letters.

It's all just information, just the means of transmission is different.

I'm not quite sure why calling it "technology and media" makes it any different in principle?

(no subject)

Date: 2009-02-08 01:30 pm (UTC)
From: [identity profile] imomus.livejournal.com
You don't see media as a way for one mind to seep into another? Even after the first mind is dead?

Fodor seems not to want the binary MIND / WORLD to be broken down, but I see that membrane as permeable, and becoming more so with every technological step.

(no subject)

Date: 2009-02-08 01:42 pm (UTC)
From: [identity profile] jermynsavile.livejournal.com
I do see see media as a way for one mind to seep into another, even after the first mind is dead. But I think that all forms of human communication do that, I just don't see why modern technology makes it fundamentally different.

And as for chips in brains, surely all our evidence of the way that commerce and politics works suggests that interventions would not be benign, ushering in a fabulous era of super-brains: what we'd more likely get is a mechanism of corporate and state control, a narrowing of possibilities not an expansion of them.

You say utopia, I say dystopia. Let's call the whole thing off.

(no subject)

Date: 2009-02-08 03:20 pm (UTC)
From: (Anonymous)
Maybe I'm missing something here, too...

"Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer."

But hasn't art--in the form of paintings, books, sculpture, dance, architecture, music, you name it--always allowed for that sort of immortality? In fact, isn't that sort of one of the whole reasons for making art (if a secondary or tertiary one)?

What if that software in my brain needs upgrading but I'm not going for that barely beta-tested version the Micromonsters of the future are trying to force upon me? What if all the cool people have their Mac chip and I and my family are ostracized because we went for the cheaper, slower PC variety?

(no subject)

Date: 2009-02-08 03:28 pm (UTC)
From: (Anonymous)
PS And when is something more of a crutch than a tool? A crutch that cripples; a real tool being something that enables you to avoid the crutch. (The more I use my iPod touch and the internet, I'm afraid the lazier my mind has become; maybe it's simply age, but I know more now about everything and remember less and less of anything every day.)

(no subject)

Date: 2009-02-09 09:31 am (UTC)
From: [identity profile] qscrisp.livejournal.com
And as for chips in brains, surely all our evidence of the way that commerce and politics works suggests that interventions would not be benign, ushering in a fabulous era of super-brains:

I don't understand why not everyone sees this.

(no subject)

Date: 2009-02-09 09:42 am (UTC)
From: [identity profile] qscrisp.livejournal.com
what we'd more likely get is a mechanism of corporate and state control, a narrowing of possibilities not an expansion of them.

I agree with this so strongly that I just wanted to add the following: People wouldn't look forward to being lobotomised, but that's basically what they're looking forward to when they want to be turned into machines. The only difference is that technology, through mobile phones and so on, has become cool, like Nike trainers or something. So, technological lobotomy is now like having the latest Nike trainers. "Hey, you've turned yourself into a techno-zombie! Cool!"

(no subject)

Date: 2009-02-08 02:13 pm (UTC)
From: [identity profile] krskrft.livejournal.com
Well, I think that saying Click Opera is truly an extension of your memory, in a computational sense, is kind of disingenuous, because I'm sure you're aware of the fact that you don't completely forget everything you deposit here for later use. The blog allows you to go back and find the exact quote, the exact idea, etc. but more often than not, you're going to remember the basic idea without assistance. This is what makes the blog useful in the first place, because if you hadn't even the slightest remembrance of what you deposited here, you'd never recall that there was anything here worth searching for.

It's like a journal, or a book, or a recording, or whatever else. It's a record. There's nothing special about it just because it's more modern and technological. Search functions make it convenient, and it's online, which allows others to comment and contribute ideas. But these are all things that people have done for ages (card catalogs = search; correspondence, letters to the editor, etc = comments).

But it seems like these considerations are totally separate from the workability of a "brain chip," which was more my sticking point on the issue. This idea that, in the future, we're going to have elective cybernetic implants and things of that nature ... it's up there with the flying cars, as far as futurism goes.

My generation is probably the last one that started learning to type on typewriters. In the industrialized West, anyway. This also makes me part of the generation that was truly captivated by computing when it became accessible. My family wasn't very well off, so we didn't own a computer until about 1993 or 1994 (an IBM 286 running DOS). Anyway, back then it was just fun to fuck around, learning how to use computers. I remember the simple pleasure of typing short notes in "San Francisco" font (the one that looked like a magazine cut-out ransom note) on the Macintosh in my 4th grade classroom. In other words, my generation seemed to have great enthusiasm for even the simplest aspects of computing.

But I was kind of alarmed when I started teaching undergraduates as a GTA in 2005, and most of them didn't even seem to know how to operate Microsoft Word properly. The generations succeeding mine take modern technology for granted, and they don't really have as much enthusiasm for it as you might imagine. They buy new gadgets (iPods, cell phones, laptops, etc), but they most certainly aren't experienced users. They know how to browse the internet, write email, set up social networking pages, use mp3 player apps, etc. but they don't have much interest in actually knowing how a computer works, what actions are represented by the buttons they click on the interface. Just look at Linux for example. Linux was written in 1991 (the GNU project has been around since 1984 or thereabouts, if we want to go back even further), and virtually no mainstream users even knew what it was, or would have considered using it, until the Ubuntu distro--purportedly the most "user friendly" of the bunch--gained popularity over the last few years. To use Linux, ordinarily, was to be forced to learn more about the internal workings, to learn the command line, to learn about dependencies, etc.

It seems to me that willingly having a chip inserted in one's brain is something that only an enthusiast (at the very least) would do, because it deals quite directly with the internal workings, the wiring, the actual processes. And by the looks of things, people are only getting less interested in that stuff. They don't want to be bound up in the technology, they don't want to learn its language.

no longer in the brain

Date: 2009-02-08 04:04 pm (UTC)
From: [identity profile] karl.dubost.myopenid.com (from livejournal.com)
>Click Opera stores information that I no longer have in my brain; when I connect to this information, I have a way to remember.

This, you can't know and that is the irony. There are things which are still in our brain but we don't remember how to access them. Then suddenly one day, they come back to the surface.

What is happening with external storage is not the storage itself, but the search functionality on top of it. I can have thousands of photos, but if they are not at least slightly organized (no dates in them, for example), they are just a useless pile of bits. The fact that they contain structured information helps when we need to access them with tools.

A simple creation date on the file can achieve that. We relate the date to memories of places, either in our brains or somewhere on the external storage, if we write down a list of the things we have done.
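[The idea above, that a simple date makes an otherwise unsearchable pile of files accessible, can be sketched in a few lines of Python. The function and folder are hypothetical, and the file's modification timestamp stands in for a photo's true creation date.]

```python
import os
import time

def photos_taken_on(folder, date_str):
    """List files in `folder` whose modification date (YYYY-MM-DD,
    local time) matches date_str: metadata turned into a search key."""
    matches = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        # Format the file's timestamp the same way as the query string.
        stamp = time.strftime("%Y-%m-%d", time.localtime(os.path.getmtime(path)))
        if stamp == date_str:
            matches.append(name)
    return matches
```

[Real photo tools would read EXIF data instead, but the principle is the same: it is the structured timestamp, not the pixels, that lets us find the day again.]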

(no subject)

Date: 2009-02-08 04:27 pm (UTC)
From: (Anonymous)
These hypothetical brain implants seem inevitable. And their potential seems to trivialize all of our individual pursuits. If you and I can immediately download, or be connected continuously to a network which has, the sum of human knowledge, then our importance as individual entities shrinks. Man will then become a single collective network, acting together for who knows what silly purpose.

(no subject)

Date: 2009-02-08 06:02 pm (UTC)
From: [identity profile] kumakouji.livejournal.com
well thanks for making the future sound absolutely terrifying.

(no subject)

Date: 2009-02-08 06:57 pm (UTC)
From: (Anonymous)
The question of "do we want humanity to go in this direction" will be asked, but it will be impossible to resist our newfound power. We have never been strong enough to resist technological progress at any time. When it comes down to it, we'll always move on into the future because we've seen the life offered up by the past and the present and we're dissatisfied with it.

It's kind of funny to consider how such an entity would behave when it had the benefit of complete human knowledge. Would this collective force thirst for more knowledge, or would it come to realize its own futility in a lonely purposeless existence? Would it get bored of its unity and create a world of individuals who were ignorant of their true origin (kind of like how we are now)?

absolutely beginners

Date: 2009-02-08 09:30 pm (UTC)
From: [identity profile] pay-option07.livejournal.com
A socialist complains to a democrat!

Re: absolutely beginners

Date: 2009-02-08 09:52 pm (UTC)
From: [identity profile] kumakouji.livejournal.com
The Borg are the commies of the Sci-fi world.

Re: absolutely beginners

Date: 2009-02-08 10:01 pm (UTC)
From: [identity profile] pay-option07.livejournal.com
I'm interpreting this as the assimilation of diverse technology/species for survival. It seems to be in conflict with concepts of regional communities.

Re: absolutely beginners

Date: 2009-02-08 10:03 pm (UTC)
From: [identity profile] kumakouji.livejournal.com
The world is one big regional community. Resistance is futile.

Re: absolutely beginners

Date: 2009-02-08 10:12 pm (UTC)
From: [identity profile] pay-option07.livejournal.com
The main reason for viticulture!

(no subject)

Date: 2009-02-08 05:20 pm (UTC)
From: [identity profile] mylifeismundane.livejournal.com
i fully concur. i actually believe that the internet, when used to its full capacity, will develop into a human hive-mind that will enable us to make ourselves smarter through externalising and individually processing all discovered knowledge.

cognitive science

Date: 2009-02-08 05:31 pm (UTC)
From: [identity profile] i-am-a-hot-sale.livejournal.com
This is my research area! very exciting that even the more obscure philosophical issues are getting popularized. though obscure philosophically they are straightforward intuitively and phenomenologically in our lived experience. andy clark is very well respected and a great writer about the role of our bodies and environment in cognition. fodor makes an excellent punching bag, as dennett is aware. he did do some decent work on modularity in language systems, but not much.

"Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer."

this living on is an issue for me because I want to understand what creates experience. I don't think clark's perspective helps much here.. I'm interested in the nexus of consciousness, and while it is interesting to consider the distribution and extended capabilities of consciousness it seems that there is a kernel apart from that which is readily distributable such as memory, emotional states, varying degrees of task performance etc. then again maybe consciousness isn't all that some of us think it is.

(no subject)

Date: 2009-02-08 09:28 pm (UTC)
From: [identity profile] i-am-a-hot-sale.livejournal.com
but I have to say, reading fodor's LRB articles, he makes an excellent columnist. I always enjoyed his ecstatic and boisterous style.

(no subject)

Date: 2009-02-08 07:45 pm (UTC)
From: (Anonymous)
Until we reach a point where body and computer technology actually become one, the reliability of the technology would have to be improved significantly. I'm glad I didn't have to rely on my external drive to remember important events from my life, since it just died on me.

-r

(no subject)

Date: 2009-02-08 08:44 pm (UTC)
From: (Anonymous)
Wonderful entry, Momus. Instant fave :)

--Ian

trouble with the memes

Date: 2009-02-09 12:04 am (UTC)
From: (Anonymous)
Click opera may not be as good as your original brain.

If you go for meme theory, and the evolution of your ideas (not that I entirely do..)

evolution depends on an imperfect replicator, your idea/meme could be mutated. This could be advantageous or disadvantageous. This would happen with your conventional memory.

If you have click opera as a constant reference for your exact thoughts as written and fixed, not mutating, perhaps your ideas/memes will evolve at a slower rate?

matt

Re: trouble with the memes

Date: 2009-02-09 12:13 am (UTC)
From: [identity profile] imomus.livejournal.com
Yes, but you forget the comments section. A lot of gene splicing goes on down here. This is where mutation happens: via dialectics. Without bouncing my ideas off other people, sure, they would get stale and start inbreeding. Down here there's exogamy.

Re: trouble with the memes

Date: 2009-02-09 01:19 am (UTC)
From: (Anonymous)
yeah i was going to mention it, but didn't want to go on at length. i agree about the comments section. trouble is, after say, 1 week the comments generally stop, so after your initial period of rapid mutation, it becomes fixed.

with your conventional memory, i'd suggest there is the possibility of mutation every time you revisit the idea with imperfect recall

matt

loophole

Date: 2009-02-09 12:52 am (UTC)
From: (Anonymous)
what really struck me about the podcast was how relaxed and comfortable you sounded, made for a mellow three hours.
the field recordings initially were a wee bit intrusive although i liked the concept.
ps indeed it would have sounded good on radio 4

Re: loophole

Date: 2009-02-09 01:37 am (UTC)
From: (Anonymous)
true, but the dutch guy was slightly annoying, especially changing topics whilst you were mid-flow. would make a great condensed 1hr radio 4 special.
