green_amber ([personal profile] green_amber) wrote2005-10-24 11:49 am
Quote found on toilet floor

.. by [livejournal.com profile] bondage_and_tea - anyone recognise it?

"It's amusing that our model of ourselves is that of an impenetrable machine we somehow need to decode and predict - then and only then can we make the right decisions in order to be happy. We set up miniature experiments and carefully monitor our responses and how others react to us to see if we should repeat or continue the experience. Frantically moving from one friend, lover, job, university, project, political cause, to the next, each briefly improving the situation and giving us the status and self-importance we need to get out of bed in the morning. Worrying about the global issues, reading the news religiously every day so we're informed individuals and can ramble on for hours about the pains of people in the world we'll never meet. Ignoring people we could share happiness with or - worse - learning methods of manipulation so we can influence those closest (proximal) to us, the satisfaction of a person molded feeding back into our personal status machine. Eye contact, use first name, soft tone, develop a rapport but not for too long lest honesty and humility creeps in. Helping and diplomacy rather than sharing and empathy."

I've always been intrigued by the fact that we compare how our brains work to whatever the current hip technology is. We think of ourselves as computers - how often have you said "my brain's crashed"? - while the Victorians compared intelligence to the telegraph. What did Stone Age people think of themselves as? What will we think of intelligence when nanotechnology is really here?

[personal profile] zotz 2005-10-24 10:55 am (UTC)(link)
Freud's model of the self - Id, Ego, Superego, with libido flowing between them and it being important to have things in balance - was very influenced by hydraulic engineering.

[identity profile] surliminal.livejournal.com 2005-10-24 11:18 am (UTC)(link)
Mmm.. I didn't know that. There's a PhD in this - historical influences on metaphors of the mind..

[identity profile] thishardenedarm.livejournal.com 2005-10-24 12:26 pm (UTC)(link)
It's been done - I've got the book. It's called Psychology as Metaphor; in fact there are two - the other one is (I think) Metaphors of the Mind.

[identity profile] supergee.livejournal.com 2005-10-24 11:13 am (UTC)(link)
"Brain keep spinning around like wheel."

[identity profile] surliminal.livejournal.com 2005-10-24 11:18 am (UTC)(link)
snort!

[identity profile] blue-condition.livejournal.com 2005-10-24 11:23 am (UTC)(link)
I've always just believed it's a gooey mess of electrochemical gunk and any attempt to understand it on a large scale is doomed. ;)

[personal profile] andrewducker 2005-10-24 11:39 am (UTC)(link)
Indeedy. It's a complex system. You can't explain complex systems _purely_ as a collection of smaller systems. I mean, they _are_, but the interactions between them are too complex to actually understand all at once. And the 'meaning' of it changes depending on what level of abstraction you're viewing it at.

You can look at it on the small scale ("This neuron works in this way, with these inputs.") and on the large scale ("When a person is depressed, this area of the brain is less active than in a happy person."), but relating the two of them is always going to be a tortuous and complex process, which isn't simultaneously understandable on a precise level and a high level.

Well - not with _our_ brains. You'd need bigger brains to understand it with :->

[identity profile] blue-condition.livejournal.com 2005-10-24 11:42 am (UTC)(link)
Indeed, but I don't believe there's anything "magical" about consciousness. It's just an emergent property of a big load of gloop.

[personal profile] andrewducker 2005-10-24 11:53 am (UTC)(link)
I certainly didn't mean to imply anything magical. I don't believe in magic :->

[identity profile] surliminal.livejournal.com 2005-10-24 11:54 am (UTC)(link)
Oddly, consciousness is the other thing I was noodling about at the weekend. My cats are so obviously conscious, and in really quite subtle ways: they are disappointed, happy, enticing, vain, frustrated, envious and irritated. Yet they have brains like peas, no? How stupid do you have to get before consciousness vanishes? Do bees have consciousness? Do goldfish? Do rats?

[personal profile] zotz 2005-10-24 12:12 pm (UTC)(link)
You could read some Dan Dennett on this. Or various other people, of course, but I like Dennett. There's a book called Kinds of Minds where IIRC he discusses this sort of thing. There's a famous question posed by (I think) Thomas Nagel: "What is it like to be a bat?"

Very different from what we're used to, obviously, but clearly like something, in the sense that it's probably not like anything to be a brick. Dennett speculates that consciousness doesn't vanish entirely even when you're down to flies. They may just have a very rudimentary consciousness.

He also suggests that, to take it much further, even something like a thermostat could be said to be aware of its surroundings, although obviously at a level as far removed from a fly as a fly would be removed from us.

[identity profile] surliminal.livejournal.com 2005-10-24 12:16 pm (UTC)(link)
He also suggests that, to take it much further, even something like a thermostat could be said to be aware of its surroundings, although obviously at a level as far removed from a fly as a fly would be removed from us.

But this is exactly what I think we all know to be the fallacy of strong AI (forgive my no doubt old-hat terminology - I used to teach AI a bit in the early 90s). We all, I think, KNOW our sense of consciousness is a bit like our cat's, maybe even a teeny bit like a fly's, but NOT AT ALL like a thermostat. Thermostats don't think. They may feel but they don't intend.

[personal profile] zotz 2005-10-24 12:25 pm (UTC)(link)
NOT AT ALL like a thermostat

Well, for a start there's no requirement that any consciousness be like ours. Clearly if there could be said to be an infinitesimal crumb of consciousness in a thermostat, it would be entirely unlike ours, but then if you could build a supercomputer with consciousness there's no particular reason I can think of to believe its consciousness would be similar to ours.

Thermostats don't think. They may feel but they don't intend.

It doesn't seem to me that the one would be less odd than the other. After all, we can't do other than speculate that they feel (and even that is stretching definitions), but we can be certain that they act.

[identity profile] surliminal.livejournal.com 2005-10-24 01:48 pm (UTC)(link)
Well, I was (to be helpful) using "feel" in a very non-human, strong-AI kinda way. I don't think thermostats feel in the same way we do, but they have sensors, which means they detect heat, which means they "feel" for some functional definition of feel. It's hard to see even a functional simulacrum of consciousness in what a thermostat does. "Act" doesn't seem to get us any further either in making a conscious/non-conscious distinction; the Turing-test chatbot "acted", but it was neither intelligent nor did it think (or feel).
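(For the sake of argument, the "functional definition of feel" can be made painfully concrete - a thermostat is just a fixed rule from sensed input to action, with no state that could even stand in for intent. A toy sketch, all names illustrative:)

```python
class Thermostat:
    """A thermostat 'feels' only functionally: it maps a sensed
    temperature to an action via one fixed rule. There is no memory,
    no goal representation - nothing that could count as intending."""

    def __init__(self, setpoint):
        self.setpoint = setpoint  # the only 'preference' it has

    def sense_and_act(self, temperature):
        # Detect heat ('feel'), then act - and that is the whole story.
        return "heat on" if temperature < self.setpoint else "heat off"


stat = Thermostat(setpoint=20.0)
print(stat.sense_and_act(15.0))  # heat on
print(stat.sense_and_act(25.0))  # heat off
```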

has anyone written in sf about a supercomputer consciousness that is fundamentally different from ours? (I suppose this is just a subset of the writing a real alien not a human with wrinkly prostheses problem..)

[personal profile] zotz 2005-10-24 01:59 pm (UTC)(link)
It's hard to see even a functional simulacrum of consciousness in what a thermostat does.

The same is true of, for example, amoebae. The point of the example was, I think, to push a point rather than to be obvious. In fact, being non-obvious was probably its virtue.

His point, I think, is that asking whether something is conscious isn't just asking if it has the richness of perception and inner life that we do.

has anyone written in sf about a supercomputer consciousness that is fundamentally different from ours?

Hmm. Dunno. Wintermute and Neuromancer seem very strange, but they're left very vaguely sketched. Banks's Minds seem fairly straightforward by comparison. I can't offhand think of anyone who's tried to depict the mindset and experience of a radically-different conscious AI, no.

[identity profile] surliminal.livejournal.com 2005-10-24 02:05 pm (UTC)(link)
The same is true of, for example, amoebae. The point of the example was, I think, to push a point rather than to be obvious. In fact, being non-obvious was probably its virtue.

His point, I think, is that asking whether something is conscious isn't just asking if it has the richness of perception and inner life that we do.


Yes, I get that - it's a good quote and I thank you for it. But what it pushes me into thinking/feeling is that there IS a crucial, patent difference between the capacity to gather inputs and "consciousness", even of a diminished-from-human kind; just as there is between the capacity to perform useful functions on inputs and "intelligence". Andy's example of the fly to me goes into the "thermostat" class, as does your amoeba - feeling/acting but no consciousness, mere instinct - so I'm not just making a plain organic/inorganic distinction. (I recognise this gets us no further.)

[personal profile] zotz 2005-10-24 02:09 pm (UTC)(link)
OK, I see, yes. I agree, I think, but I couldn't say what it is, precisely. I'm not sure enough to write off the thermostat example entirely.

[identity profile] pigeonhed.livejournal.com 2005-10-24 02:26 pm (UTC)(link)
Dan Simmons has consistently related high-level AIs to the Gods of Greek Myth, with their interpersonal feuds etc. It isn't fundamentally different to ours but he seems to suggest that the advanced AI would view our levels of consciousness and intelligence in the same way that we look upon that of cats or apes, or even fleas.

[identity profile] blue-condition.livejournal.com 2005-10-24 12:19 pm (UTC)(link)
I reject the whole question, because "consciousness" doesn't have a satisfactory definition. If it's indeterminate whether something has it or not purely by observation of its behaviour, it's impossible to make any meaningful decision as to whether it has it or not.

[personal profile] zotz 2005-10-24 11:50 am (UTC)(link)
the interactions between them are too complex to actually understand all at once.

That doesn't really show that it can't be done, but that it's extremely difficult. Indeed, in such a system different facets may only be understood in-depth by separate people, but that doesn't mean that it isn't understood.

isn't simultaneously understandable on a precise level and a high level.

It isn't really required that the same individuals do both.

[personal profile] andrewducker 2005-10-24 11:53 am (UTC)(link)
But people seem to think that at some point we'll understand brains in the same way that we understand cars. As if we'd be able to say "That neuron there is the one that causes your depression."

Depression isn't an attribute of neurons - it's an attribute of whole brains. Understanding the low level will definitely help our understanding of it, but it's not the level at which it means anything.

[identity profile] surliminal.livejournal.com 2005-10-24 11:55 am (UTC)(link)
I suppose if we can understand chaotic systems like weather - which we do a lot better than we used to - is there any reason we can't eventually understand most of how the brain works? I'm sure there's some Gödel thing that says we can never understand all of it, but?

[personal profile] andrewducker 2005-10-24 11:59 am (UTC)(link)
But when we understand the weather we understand it on the level of fronts, huge flows of air, and abstract concepts like "clouds" - not the level of rain drops. We know that clouds are made up of rain drops, we know how they form, etc. But while that knowledge informs our understanding of _why_ cold fronts cause rain when they hit warm fronts, we don't explain the weather in terms of how individual rain drops are behaving.

[personal profile] zotz 2005-10-24 12:04 pm (UTC)(link)
at some point we'll understand brains in the same way that we understand cars.

That would clearly be going too far, yes. Cars are very simple and predictable by comparison with even simple biological systems.

Depression isn't an attribute of neurons - it's an attribute of whole brains.

Indeed.

Understanding the low level will definitely help our understanding of it, but it's not the level at which it means anything.

That depends. Ultimately, thoughts are electrochemical events - or, more properly, patterns or changes in patterns (in both space and time) of electrochemical events - across the brain, although obviously we experience them differently. Given that it's possible to study these (in a basic way at the moment) while asking people how they feel and what they think, it's not inconceivable (although admittedly a tall order) that we might at some point be able to relate the activity of the brain as a whole to its subjective experience at that point.

[personal profile] andrewducker 2005-10-24 12:38 pm (UTC)(link)
we might at some point be able to relate the activity of the brain as a whole to its subjective experience at that point.

Exactly - the activity of the brain _as a whole_.

The "brain in a jar" thought experiment frequently amuses me because it leaves out all sorts of hormonal signals that have very definite effects on our brains.

[personal profile] zotz 2005-10-24 12:42 pm (UTC)(link)
Yes, but these experiments are more aimed at discussion of consciousness. I don't think there's any suggestion that removing hormonal effects would remove consciousness. The knowledge of being a brain in a jar would have fairly strong effects on a person's frame of mind too, but it's not really what people are trying to discuss in these cases.

[identity profile] pigeonhed.livejournal.com 2005-10-24 02:20 pm (UTC)(link)
I don't understand how my car works...

[identity profile] alexmc.livejournal.com 2005-10-24 11:36 am (UTC)(link)
Yesterday I saw some amusing graffiti on the door of the gents' cubicle in the pub near to the Tate Modern.



"Bad Wolf"

[identity profile] surliminal.livejournal.com 2005-10-24 11:51 am (UTC)(link)
nice :-)

[identity profile] kissmeforlonger.livejournal.com 2005-10-24 12:12 pm (UTC)(link)
Someone's just quoted this poem by Ogden Nash on their journal :-)

Someone I was chatting to recently said her 'self' was 'behind her face' ie in her brain. But people used to think the centre of ourselves was the stomach. I guess that's supposed to indicate radically different values, or something.

Personally I think it's a product of too much value placed on intellectualising and not enough on the physical.

[identity profile] bondage-and-tea.livejournal.com 2005-10-24 02:04 pm (UTC)(link)
I love this poem.

[identity profile] surliminal.livejournal.com 2005-10-24 02:07 pm (UTC)(link)
Oh it is, is it, all right then, you sleep under a six-inch blanket of snow and
I'll sleep under a half-inch blanket of unpoetical blanket material and we'll see which one keeps warm,



That's great :-)

[identity profile] blue-condition.livejournal.com 2005-10-24 03:17 pm (UTC)(link)
> Personally I think it's a product of too much value placed on intellectualising and not enough on the physical

I think rather the opposite. Too much of the intellectualising is over things that are either too ill-defined to be interesting or have no definitive answer. Concentrate on well-defined problems (or on problems that better define what is/isn't well defined) rather than fannying around with philosophy, and the world would be a better place.

[identity profile] drdoug.livejournal.com 2005-10-24 04:14 pm (UTC)(link)
her 'self' was 'behind her face' ie in her brain. But people used to think the centre of ourselves was the stomach. I guess that's supposed to indicate radically different values, or something.

Nah - I think it's far more prosaic than that.

If you ask people where in the body they think they are, they'll usually say somewhere in the middle of their head. My guess is this is because vision is the primary sense modality (for most people) and this point is slap behind the eyes. If you ask people where in their body their deep and powerful feelings come from, they'll usually point to their chest or their abdomen. My guess is this is because deep and powerful feelings lead to tightness in the chest or discomfort in the abdomen.

I reckon that physicality underlies all the cultural stuff about 'gut feelings', 'from the heart' and 'you are your brain'.