Supersize mind
Feb. 8th, 2009 12:42 pm

One of the more interesting topics that came up in yesterday's three-hour livecast from the Loophole was the question of whether humans, in future, will have computers actually implanted into their brains. I thought they would, and imagined a semi-comical scenario in which a man defaulting on his broadband bills gets notice from his provider that his entire brain will be temporarily short-circuited until he pays up. Despite this sort of nightmare scenario, I said that I'd probably be someone who'd sign up for the "internalised computer" scenario, because I've largely externalised my brain. Click Opera, for instance, is a sort of externalised memory for me now, and I find it quite difficult to give a lecture without having it beamed and searchable on a screen behind me.

Today I read an interesting and related article in the London Review of Books on the same subject. Jerry Fodor's Where is my mind? is an explanation -- and attempted refutation -- of the position I adopted yesterday, which is (I learn) called the Extended Mind Thesis, or EMT. Fodor's article is a review of Andy Clark's new book, Supersizing the Mind: Embodiment, Action and Cognitive Extension.
In a preface (co-written by David Chalmers), Clark lays out a simple example of EMT: David's new iPhone. "I bought an iPhone. The iPhone has already taken over some of the central functions of my brain... My iPhone is not my tool, or at least it is not wholly my tool. Parts of it have become parts of me... When parts of the environment are coupled to the brain in the right way, they become parts of the mind."

Fodor doesn't buy this, and doesn't accept Clark's argument that, where person X remembers an address and person Y consults an iPhone to find it, the iPhone is an extension of person Y's mind. Clark says that an opponent of mind-supersizing (or EMT) would have to show there was a significant difference between person X and person Y in this example. The argument then takes in something called "the parity principle" and the slippery slope between the terms of a dualism -- at which point I lose patience with the philosophical hair-splitting (everything in philosophy hinges on arbitrary definitions, and I tend to take the Pirandellian view: "right you are -- if you think so").
Fodor's big argument against EMT is that minds have contents, and can make decisions. He thinks that an iPhone can't have contents or make decisions, but what would he say about Click Opera? It surely does have contents -- the past activities of my brain and the brains of my interlocutors. And sure, it can't make decisions, but decision-making isn't the only brain function; there's also the storage of information in memory. Click Opera stores information that I no longer have in my brain; when I connect to this information, I have a way to remember. Click Opera can't substitute for my brain, but it can extend it.

Fodor then modifies this by saying that the mind has "underived content" whereas an iPhone (and presumably Click Opera) has "derived" content -- content derived from the mind. I think this underestimates how much of the content of our minds is derived from other minds. In other words, if the mind extends into the world, by the same token the world impinges on the mind. Mind and world are really not distinguishable.
Fodor's arguments get too technical for me towards the end of his piece -- and when I arrive at the bottom I discover that his main intellectual enterprise seems to be pointing out Darwin's mistakes, which worries me somewhat. The general thrust of his argument seems to be premised on a belief in the absolute independence of the individual brain, a Cartesian insistence on its a priori separation from culture, body, world. I would be inclined much more to cultural determinism -- I'm entirely happy with the idea that our national cultures, our software, our media, and even our philosophical traditions (Mr Fodor!) think us. I don't think this determines us absolutely, but I do think it takes conscious effort to escape the patterning of, say, the way a music sequencing software engineer makes us think about music, or the way a British newspaper editor makes us think about current events in the world. I also think that one can't ever entirely escape structurations like these, only swap them for other structurations (improv jazz, a Japanese poet's take on current world events).

Most of all, though, I feel that Clark and Chalmers' supersizing idea -- the Extended Mind Thesis -- fits my life intuitively. I feel that both technology and media extend my mind, and mingle it with other minds. This is why I do what I do; I like that promiscuity, that cultural reproduction. Rinus said an interesting thing when we were talking about this yesterday. "If you wrote two entries a day, and scheduled them, you could keep blogging daily thirty years after you're dead," he said. It was a joke, but not really so far-fetched, and the perfect refutation of Fodor's case here. Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer.
(no subject)
Date: 2009-02-08 12:31 pm (UTC)

"we have extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned. Rapidly, we approach the final phase of the extensions of man - the technological simulation of consciousness, when the creative process of knowing will be collectively and corporately extended to the whole of human society, much as we have already extended our senses and our nerves by the various media. Whether the extension of consciousness, so long sought by advertisers for specific products, will be "a good thing" is a question that admits of a wide solution. There is little possibility of answering such questions about the extensions of man without considering all of them together. Any extension, whether of skin, hand, or foot, affects the whole psychic and social complex."
He may be ambivalent about the Extended Mind Thesis being a good thing, but he doesn't doubt that it's happening.
(no subject)
Date: 2009-02-08 12:49 pm (UTC)

A frog may have such intentions,

Date: 2009-02-08 09:44 pm (UTC)

http://mennokrant.net/
(no subject)
Date: 2009-02-08 01:11 pm (UTC)

And a phone is just a way of sending letters.
It's all just information, just the means of transmission is different.
I'm not quite sure why calling it "technology and media" makes it any different in principle?
(no subject)
Date: 2009-02-08 01:30 pm (UTC)

Fodor seems not to want the binary MIND / WORLD to be broken down, but I see that membrane as permeable, and becoming more so with every technological step.
(no subject)
Date: 2009-02-08 01:42 pm (UTC)

And as for chips in brains, surely all our evidence of the way that commerce and politics work suggests that interventions would not be benign ones ushering in a fabulous era of super-brains: what we'd more likely get is a mechanism of corporate and state control, a narrowing of possibilities, not an expansion of them.
You say utopia, I say dystopia. Let's call the whole thing off.
(no subject)
Date: 2009-02-08 03:20 pm (UTC)

"Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer."
But hasn't art--in the form of paintings, books, sculpture, dance, architecture, music, you name it--always allowed for that sort of immortality? In fact, isn't that sort of one of the whole reasons for making art (if a secondary or tertiary one)?
What if that software in my brain needs upgrading but I'm not going for that barely beta-tested version the Micromonsters of the future are trying to force upon me? What if all the cool people have their Mac chip and I and my family are ostracized because we went for the cheaper, slower PC variety?
(no subject)
Date: 2009-02-09 09:31 am (UTC)

I don't understand why everyone doesn't see this.
(no subject)
Date: 2009-02-09 09:42 am (UTC)

I agree with this so strongly that I just wanted to add the following: people wouldn't look forward to being lobotomised, but that's basically what they're looking forward to when they want to be turned into machines. The only difference is that technology, through mobile phones and so on, has become cool, like Nike trainers or something. So technological lobotomy is now like having the latest Nike trainers. "Hey, you've turned yourself into a techno-zombie! Cool!"
(no subject)
Date: 2009-02-08 02:13 pm (UTC)

It's like a journal, or a book, or a recording, or whatever else. It's a record. There's nothing special about it just because it's more modern and technological. Search functions make it convenient, and it's online, which allows others to comment and contribute ideas. But these are all things that people have done for ages (card catalogs = search; correspondence, letters to the editor, etc = comments).
But it seems like these considerations are totally separate from the workability of a "brain chip," which was more my sticking point on the issue. This idea that, in the future, we're going to have elective cybernetic implants and things of that nature ... it's up there with the flying cars, as far as futurism goes.
My generation is probably the last one that started learning to type on typewriters. In the industrialized West, anyway. This also makes me part of the generation that was truly captivated by computing when it became accessible. My family wasn't very well off, so we didn't own a computer until about 1993 or 1994 (an IBM 286 running DOS). Anyway, back then it was just fun to fuck around, learning how to use computers. I remember the simple pleasure of typing short notes in "San Francisco" font (the one that looked like a magazine cut-out ransom note) on the Macintosh in my 4th grade classroom. In other words, my generation seemed to have great enthusiasm for even the simplest aspects of computing.
But I was kind of alarmed when I started teaching undergraduates as a GTA in 2005, and most of them didn't even seem to know how to operate Microsoft Word properly. The generations succeeding mine take modern technology for granted, and they don't really have as much enthusiasm for it as you might imagine. They buy new gadgets (iPods, cell phones, laptops, etc), but they most certainly aren't experienced users. They know how to browse the internet, write email, set up social networking pages, use mp3 player apps, etc. but they don't have much interest in actually knowing how a computer works, what actions are represented by the buttons they click on the interface. Just look at Linux for example. Linux was written in 1991 (the GNU project has been around since 1984 or thereabouts, if we want to go back even further), and virtually no mainstream users even knew what it was, or would have considered using it, until the Ubuntu distro--purportedly the most "user friendly" of the bunch--gained popularity over the last few years. To use Linux, ordinarily, was to be forced to learn more about the internal workings, to learn the command line, to learn about dependencies, etc.
It seems to me that willingly having a chip inserted in one's brain is something that only an enthusiast (at the very least) would do, because it deals quite directly with the internal workings, the wiring, the actual processes. And by the looks of things, people are only getting less interested in that stuff. They don't want to be bound up in the technology, they don't want to learn its language.
no longer in the brain
Date: 2009-02-08 04:04 pm (UTC)

This, you can't know, and that is the irony. There are things which are still in our brains but we don't remember how to access them. Then suddenly one day, they come back to the surface.
What is happening with the external storage is not the storage itself, but the search functionality on top of it. I can have thousands of photos, but if they are not at least slightly organized (no dates in them, for example), they are just a useless pile of bits. The fact that they contain structured information helps when we need to access them with tools.
A simple creation date on the file can achieve that. We relate the date to memories of places -- either in our brain or somewhere on the external storage, if we write down the list of things we have done.
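The point about date metadata turning a pile of bits into something searchable can be sketched in a few lines of Python (a hypothetical illustration of my own, using file modification timestamps as the "creation date"; the `index_by_date` helper is not from the post):

```python
import os
from datetime import datetime
from pathlib import Path

def index_by_date(folder):
    """Group the files in a folder by the day they were last written,
    so they can be looked up by date rather than scanned one by one."""
    index = {}
    for path in Path(folder).iterdir():
        if path.is_file():
            day = datetime.fromtimestamp(path.stat().st_mtime).date().isoformat()
            index.setdefault(day, []).append(path.name)
    return index
```

Even a trivial index like this is the difference the commenter describes: the photos themselves are unchanged, but a tool can now answer "what did I do on that day?"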
(no subject)
Date: 2009-02-08 06:57 pm (UTC)

It's kind of funny to consider how such an entity would behave when it had the benefit of complete human knowledge. Would this collective force thirst for more knowledge, or would it come to realize its own futility in a lonely, purposeless existence? Would it get bored of its unity and create a world of individuals who were ignorant of their true origin (kind of like how we are now)?
cognitive science
Date: 2009-02-08 05:31 pm (UTC)

"Even after the brain itself dies, the mind can live on thanks to culture, that fantastic extender and supersizer."
This living on is an issue for me because I want to understand what creates experience. I don't think Clark's perspective helps much here. I'm interested in the nexus of consciousness, and while it is interesting to consider the distribution and extended capabilities of consciousness, it seems that there is a kernel apart from that which is readily distributable, such as memory, emotional states, varying degrees of task performance, etc. Then again, maybe consciousness isn't all that some of us think it is.
trouble with the memes
Date: 2009-02-09 12:04 am (UTC)

If you go for meme theory and the evolution of your ideas (not that I entirely do..): evolution depends on an imperfect replicator, so your idea/meme could be mutated. This could be advantageous or disadvantageous. This would happen with your conventional memory.
If you have Click Opera as a constant reference for your exact thoughts as written and fixed, not mutating, perhaps your ideas/memes will evolve at a slower rate?
matt
Re: trouble with the memes
Date: 2009-02-09 01:19 am (UTC)

With your conventional memory, I'd suggest there is the possibility of mutation every time you revisit the idea with imperfect recall.
matt
loophole
Date: 2009-02-09 12:52 am (UTC)

The field recordings initially were a wee bit intrusive, although I liked the concept.
PS: indeed it would have sounded good on Radio 4