Alex R ([personal profile] alexr_rwx) wrote, 2005-03-21 12:59 am

expect this to be a paper soon, or a part thereof?

SEARLE: 1. Programs are entirely syntactical. 2. Minds have semantics. 3. Syntax is not the same as, nor by itself sufficient for, semantics. Therefore, programs are not minds. QED. [from The Mystery of Consciousness, pg 11, paraphrase of his well-known "Chinese Room Argument"]

RUDNICK: In principle, given a powerful enough computer and a sufficient model of physics, I could simulate with arbitrary precision all of the happenings in a section of space -- assuming that this space does not contain the computer running the simulation. Now I choose to simulate a room containing a human, sitting in a chair. The simulated human in the simulated chair will, for all useful definitions of "conscious", be conscious (although he will not necessarily realize that he's part of a simulation). The brain-in-a-vat problem needs neither a brain nor a vat, just a mathematical model of a brain. If you want to say that the program is not a mind but contains one, that's analogous to saying that the physical world is not a mind but contains one. The main point is that if we believe the physical world to be mathematically modelable, we can bring about awake minds within a computer.
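
(A toy sketch of what I mean, in code -- the state format and the "physics" here are made up for illustration; the argument only needs some sufficiently detailed version of this loop to exist:)

    # A "section of space" as a list of particles, each a (position, velocity)
    # pair, stepped forward by Euler integration. Shrink dt and add degrees
    # of freedom to get whatever precision you like.

    def step(state, dt):
        """Advance every particle by one tick of a stand-in physics."""
        g = -9.8  # placeholder for "a sufficient model of physics"
        return [(x + v * dt, v + g * dt) for (x, v) in state]

    room = [(2.0, 0.0), (1.5, 0.1)]  # stand-ins for the chair, the human...
    for _ in range(1000):
        room = step(room, dt=0.001)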

Furthermore, "semantics" is an inherently shady word, loaded with connotations of some "objectively extant physical world" that one can reference. You, Professor Searle, know just as well as I that you can't get outside of your own head -- eyes don't guarantee that you're seeing anything more real than what a robot with a video camera or an agent with a get_world_input() function gets.
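
(The robot comparison, spelled out as a sketch -- get_world_input() is the function named above; everything else here is invented:)

    import random

    def get_world_input():
        """Whatever the world hands the agent: pixels, pressure, photons."""
        return random.random()  # stand-in for sensory data

    def perceive():
        percept = get_world_input()
        # The agent can't step outside this function to check whether the
        # input came from a "real" world -- and neither, Professor, can you.
        return percept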

[identity profile] smileydee.livejournal.com 2005-03-21 08:41 pm (UTC)
It's a very visceral argument. I feel I would have more understanding than someone sitting in the Chinese Room, because I can react creatively to input: I can change my rulebook and instantly evaluate how well my new rules work. So I feel that there must be something else besides the on/off electrical pulses my brain gets. Some sort of interpreter. Some sort of self.

I admit I haven't thought about this a whole lot. It's difficult to wrap the brain around. The Chinese Room Argument bothered me quite a lot when I first heard it, and it still bothers me. Because yes, there is the scientific fact that the input our brains get isn't all that different from what computers get. What's to say that we just receive input and produce output so quickly and so fluidly that it produces this "self" I talked about above?

But I don't know. I still feel like I am reacting in a more complex way than simply responding to input according to my programming.

[identity profile] zip4096.livejournal.com 2005-03-21 09:51 pm (UTC)
Mm, well, if the self exists, it has to exist in the brain, and the brain is just a bunch of neurons, which are really simple on their own...
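
Really simple as in: the textbook toy neuron is just a weighted sum and a threshold. A sketch, with the numbers arbitrary:

    def neuron(inputs, weights, threshold=1.0):
        """Fire (1) if the weighted inputs clear the threshold, else don't (0)."""
        return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

    neuron([1, 0, 1], [0.6, 0.2, 0.7])  # -> 1: it fires. That's its whole repertoire.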

I know where you're coming from, though -- I feel the same way viscerally, but the brain as a machine makes perfect sense to me... It's easy to think of the heart as a machine, isn't it?

When you talk about changing the rulebook and being able to evaluate yourself -- I'm certain I've heard of such techniques used in AI...
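
Something like this toy loop, maybe -- the rulebook, the mutation, and the scoring task here are all made up for illustration, but hill climbing and genetic algorithms are fancier versions of the same move:

    import random

    def score(rulebook, trials=100):
        """How often the rulebook gets a made-up task (parity) right."""
        hits = 0
        for _ in range(trials):
            x = random.randint(0, 7)
            hits += rulebook.get(x, 0) == x % 2
        return hits

    rulebook = {}  # start with no rules at all
    for _ in range(200):
        candidate = dict(rulebook)
        candidate[random.randint(0, 7)] = random.randint(0, 1)  # rewrite one rule
        if score(candidate) >= score(rulebook):  # instantly evaluate the new rules
            rulebook = candidate                 # keep them if they do no worse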

This sense of self is a really complicated thing! I do know that personality, and maybe this sense of self we have, is thought to be located in the frontal lobes, and I think I heard that it's pretty resilient -- some people with severe brain damage still have a sense of who they are...

Anyway, if one were to argue that what makes us different from machines is feelings -- well, feelings are just chemical signals (I'm sure you knew that though)...

I don't know where exactly I'm going with this :) I enjoyed talking about the brain, though -- it fascinates me. I wish I knew more and could answer your questions better.

Oh yeah, and, I agree with this :) "What's to say that we just receive input and produce output so quickly and so fluidly that it produces this 'self' I talked about above?"

[identity profile] smileydee.livejournal.com 2005-03-21 11:00 pm (UTC)
But does the self have to exist in the brain? Does it have to be a physical phenomenon? Maybe there is a soul that ties all of these disparate electrical and chemical impulses into a coherent, thinking, self-aware being.

[identity profile] oniugnip.livejournal.com 2005-03-21 11:25 pm (UTC)
... or it's entirely possible that "soul" or "self" is a name we use to label a bundle of effects -- "All of that happening, over there? That's a self!"

I'm not going to step to the whole dualist soul-argument thing, but I will say that it's really difficult to test -- and not particularly parsimonious as a description, which is what we go for as scientists. If we can explain everything as physical without introducing non-physical elements into the world (almost wrote "ontology")... then why not?

If it's all in the brain, that doesn't mean that it's not there -- if it's an emergent phenomenon, it's still real. The canonical analogy about this is that a single neuron isn't awake in the same way that a single molecule of water isn't wet. It takes a lot of them doing a very specific thing together...

[identity profile] smileydee.livejournal.com 2005-03-22 12:07 am (UTC)
I don't mean to be an idiot about all this. I just really don't understand at which point intelligence and consciousness come in. At what level of complexity do we somehow change from a collection of parts reacting to stimuli according to a set of rules, to an intelligent, self-aware being? What causes that change?

Or have I just stuck myself in a nasty little slippery slope?

If I throw a ball at a tree, it must travel half the distance from me to the tree, and then it must travel half the distance from that point to the tree, and therefore it never ever hits the tree...

[identity profile] oniugnip.livejournal.com 2005-03-22 07:17 am (UTC)
That's a really good point -- no idiocy here. Maybe there are differing degrees of conscious experience? It works out intuitively...
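
(And about the ball and the tree: the standard way out is that the infinitely many half-trips form a geometric series, and the series sums to a finite distance covered in a finite time --

    d \sum_{n=1}^{\infty} \frac{1}{2^n} = d\left(\tfrac{1}{2} + \tfrac{1}{4} + \tfrac{1}{8} + \cdots\right) = d, \qquad t = \sum_{n=1}^{\infty} \frac{d}{2^n v} = \frac{d}{v}

-- so the ball hits the tree at time d/v, infinitely many halfway points and all. Maybe degrees of consciousness stack up the same way: lots of tiny differences in degree summing to something definite.)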

I had an interesting experience with this, recently. I found myself absently staring at the reflection of a can of shaving cream, in the mirror. No idea how long I was doing that -- not a very awake or aware mental state. I expect being dead is something like that, but more so.

[identity profile] reality-calls.livejournal.com 2005-03-23 05:41 am (UTC)
   I found myself absently staring at the reflection of a can of shaving cream, in the mirror. No idea how long I was doing that -- not a very awake or aware mental state. I expect being dead is something like that, but more so.

That's a beautiful simile!  Gotta make a quote of that...  Do you mind if I rearrange it a bit and condense it so that it says something like, "I expect being dead is like absently staring at the reflection of a can of shaving cream in the mirror, but more so."?

Come to think of it, I should run that through a few different translations in Babelfish and see what it comes up with...

On a more serious note, I know exactly what you mean.  I have long periods of sitting-and-not-being-very-conscious myself, usually right after my last class gets out.

Seems to me like death would be more like being eternally anesthetized, though.  I got put under to have my wisdom teeth removed and next thing I knew I was being helped out of the dentist's office a few hours later.  I think death would be a lot like the time in between being anesthetized and waking up, only you wouldn't wake up.  This all, of course, depends on your religious beliefs, and who can say what death is like, anyway?  I don't seem to recall Jesus giving any vivid descriptions...

As far as the whole "differing degrees of conscious experience" theory goes, I can see that being the case.  I'm pretty sure that ants don't really have any idea that they exist, since they always respond in a predictable manner, and I've heard that there's evidence dolphins and apes can recognize themselves to some degree, though they don't think about it nearly as much as humans do.  Of course, I'm not entirely sure of any of this.  Also, I would think that a lot of one's consciousness is developed through experience, much the same way we would expect the consciousness of an AI program to develop itself as it interacted with whatever environment it could sense.  The degree to which it developed would have to do with the physical ability of the computer, but also with the experiences that it had in its environment.  If, say, you took a well-written, advanced AI program (or a newly-grown brain in a vat, for that matter) and connected it to a video camera that was fixed on a reflection of a shaving-cream can in a mirror, I don't think it would develop much in the way of self-consciousness.
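
(To make the camera thought slightly concrete -- a toy sketch, every name in it invented, where "development" is crudely just how much variety of input the thing has absorbed:

    def develop(observations):
        """A crude stand-in for learning: remember each distinct percept."""
        model = set()
        for obs in observations:
            model.add(obs)
        return model

    rich_world = ["face", "tree", "rain", "face", "music"]
    shaving_cream_cam = ["can in mirror"] * 5

    len(develop(rich_world))         # 4 -- something to build a self against
    len(develop(shaving_cream_cam))  # 1 -- not much to work with

A fixed camera gives it exactly one percept to chew on, forever.)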

(Well, that turned out to be longer than today's post)

      "Live from the People's Republic"