alexr_rwx: (communist underneath)
[personal profile] alexr_rwx
SEARLE: 1. Programs are entirely syntactical. 2. Minds have semantics. 3. Syntax is not the same as, nor by itself sufficient for, semantics. Therefore, programs are not minds. QED. [from The Mystery of Consciousness, pg 11, paraphrase of his well-known "Chinese Room Argument"]

RUDNICK: In principle, given a powerful enough computer and a sufficient model of physics, I could simulate with arbitrary precision all of the happenings in a section of space -- assuming that this space does not contain the computer running the simulation. Now I choose to simulate a room containing a human, sitting in a chair. The simulated human in the simulated chair will, for all useful definitions of "conscious", be conscious (although he will not necessarily realize that he's part of a simulation). The brain-in-a-vat problem needs neither a brain nor a vat, just a mathematical model of a brain. If you want to say that the program is not a mind but contains one, that's analogous to saying that the physical world is not a mind but contains one. The main point is that if we believe the physical world to be mathematically modelable, we can bring about awake minds within a computer.

Furthermore, "semantics" is an inherently shady word, loaded with connotations of some "objectively extant physical world" that one can reference. You, Professor Searle, know just as well as I that you can't get outside of your own head -- eyes don't guarantee that you're seeing anything more real than what a robot with a video camera or an agent with a get_world_input() function gets.
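In code, the "simulate a section of space" claim amounts to something like the following -- a minimal, hypothetical Python sketch (a one-particle toy world stepped forward with Euler integration; obviously nothing like a brain model, and the state and step rule are made up for illustration):

```python
# Toy "section of space": one particle falling under gravity, advanced
# by repeated small time steps. The point is only structural: a closed
# system plus a physics rule plus a time step.

def step(state, dt):
    """Advance one particle under constant gravity by one Euler step."""
    pos, vel = state
    return (pos + vel * dt, vel - 9.81 * dt)

def simulate(state, dt, steps):
    """Run the world forward for a fixed number of steps."""
    for _ in range(steps):
        state = step(state, dt)
    return state

# Drop a particle from rest and simulate one second of its world.
pos, vel = simulate((0.0, 0.0), dt=0.001, steps=1000)
```

The "arbitrary precision" assumption is doing the real work in the argument: as dt shrinks, the simulated trajectory converges on the analytic one, and the claim is that the same holds for a sufficiently fine model of a brain.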

Date: 2005-03-21 06:17 am (UTC)
From: [identity profile] zip4096.livejournal.com
Ahh, I completely agree with your argument!! :)

Stuff I've learned in my favorite class, Sensation and Perception, supports it, too, I would say. Have a look at http://www.psy.fsu.edu/~johnson/snp :) The brain actually "makes stuff up"... what we perceive is most certainly not reality.

If you don't already know about it (you know what, I probably told you about it in meatspace), you should check out the McGurk effect, it's absolutely fascinating (our brain trusts our visual system more than the auditory system):

http://www.media.uio.no/personer/arntm/McGurk_english.html

Date: 2005-03-21 06:29 am (UTC)
ext_110843: (cartoon me)
From: [identity profile] oniugnip.livejournal.com
Oh my goodness. That's amazing. I totally heard "da da da..." until I closed my eyes. This is my new favorite thing in the whole world -- it used to be the Doppler Effect (which leads us, very blatantly, to "things don't really have colors")...

(all of this is making me want to go back and study psychology too...)

Would you like to get together in meatspace this week? :)

Date: 2005-03-21 07:54 am (UTC)
From: [identity profile] zip4096.livejournal.com
Awesome, I'm so glad you liked it :) :) I thought it was incredible too :) I was definitely the most vocal about it in class when Dr. Johnson showed us a similar video on the projector- "Whoa!!!" I think was my reaction. I thought maybe it might've been particularly pronounced for me at the time because I was working so hard on the dialogue in the cartoon, so I was having to stare at mouths a lot...

Ahh the Doppler Effect- you know I hadn't thought of that as implying things don't really have colors before, interesting. Also I think I'd heard in a past class that dogs (and cats?) have two types of color receptors whereas we have three :) And then, can't bees perceive some ultraviolet? Ah, just more examples of perception not being reality, none as striking as the McGurk effect huh :)

You could get an excellent background in neuroscience just from reading some of my textbooks which I'd be happy to let you borrow :) They're many, many times easier to read than, say, the White Book (shudder!!), and I don't know how much more you'd get from lectures (Dr. Johnson's lectures might be an exception... actually, hell Alex, you could sit in on one if you wanted!! Thursday would be best :)

I would like to get together. You, G, and I, how about that?

I'm going to try to go to ATL Tuesday and tomorrow I'm going to be way busy, Weds I have an appointment with Bobby Link to fix his computer (again), so later in the week would be better.

If Whitney's with you Thursday might be a good day for her to see the station, 'cause I'll most likely be doing Club Convergence...

Date: 2005-03-21 07:31 pm (UTC)
ext_110843: (mighty penguin)
From: [identity profile] oniugnip.livejournal.com
I might have to take you up on the offer :) Everybody likes books... perhaps this summer?

Thursday would be great! We have no Whitney, but I'd love to hang out :) I'll probably go to visit Natalie in Jacksonville sometime this week, and possibly Lloyd on Friday or Saturday (if he's not swamped), but other than that, I'm totally free...

What's happening in the ATL tomorrow? (... is it google-related? ...)

Date: 2005-03-21 08:17 pm (UTC)
From: [identity profile] zip4096.livejournal.com
In the ATL, The Animation Show (http://theanimationshow.com) is happening! :D I'm so excited. Shorts picked by Mike Judge & Don Hertzfeldt.

Yeah you should totally come and sit in on my class, that'd be awesome :)

Date: 2008-06-08 11:04 am (UTC)
lindseykuper: Photo of me outside. (Default)
From: [personal profile] lindseykuper
Damn. Y'know, at work, a bunch of our science books have an integrated SciAm news feed thing, and I often read it in passing. A podcast about the McGurk effect came over the wire recently, and I thought it was pretty cool and intended to tell you about it. But you've known all about it for years!...

...I never wanted to bother with psychology when I was in school, but working on psychology textbooks has made me think, "Hey, this stuff is important and interesting." I was right there -- why didn't I at least audit the courses? Man.

Date: 2005-03-21 03:53 pm (UTC)
From: [identity profile] smileydee.livejournal.com
That last one was pretty cool. I didn't hear "DA" though, I heard "BA" with my eyes closed and "GA" with my eyes open. I must be weird. I wish they had put the explanation on a separate page from the video, so people wouldn't be affected by an explanation of what they are supposed to be hearing.

With regards to computers possibly containing minds, the only argument I've heard about it was this analogy from a philosophy class. Basically, imagine you are sitting in a room, and somebody unbeknownst to you hands you some symbols on paper, and you have a bunch of symbols on paper and a list of rules about what to send out of the room given what is sent in to you. No matter how extensive your rules or how quickly you can follow the rules, you'll still never have a way to understand what you are being given or what you are sending out.

Since we clearly have some sort of understanding of our input and output, therefore we have something as humans that computers can never have.
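Computationally, the rulebook in that thought experiment is just a lookup table. Here is a minimal sketch in Python -- the symbol pairs and the fallback reply are invented for illustration, and that is exactly the point: the program maps input strings to output strings with no representation anywhere of what either side means.

```python
# The Chinese Room rulebook as pure syntax: incoming symbol strings are
# matched against rules and an outgoing string is produced. Nothing in
# the table encodes meaning -- just shapes in, shapes out.
RULEBOOK = {
    "你好": "你好吗",
    "你好吗": "我很好",
}

def room(symbols_in):
    """Follow the rules mechanically; 'understand' nothing."""
    # Unrecognized input gets a stock reply, like a bigger rulebook would.
    return RULEBOOK.get(symbols_in, "请再说一遍")
```

Searle's point is that no matter how large RULEBOOK grows, the person (or CPU) executing `room` never acquires semantics; the systems reply, below, is that the room-plus-rulebook as a whole might.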

Date: 2005-03-21 07:00 pm (UTC)
ext_110843: (communist underneath)
From: [identity profile] oniugnip.livejournal.com
That's the Chinese Room Argument (http://en.wikipedia.org/wiki/Chinese_room), which is from this very dude, John Searle. I would say that the room-you-rulebook system as a whole understands Chinese -- and with the right rulebook, the system would be conscious.

This comes down to the Symbol Grounding Problem... how are you going to say that you have any more "understanding" of your inputs and outputs than a computer does when all your brain gets as input are sequences of electrical pulses?

Date: 2005-03-21 07:08 pm (UTC)
From: [identity profile] elysianboarder.livejournal.com
I have a question about the Chinese Room Argument. It says that the person in the room, given rules like "if you see x, write y," would come out of the room not knowing any more Chinese than when they entered. But is that a valid statement? Seeing that this is how many people learn another language: if I say "Hola," you say "Como esta"... Can a computer with AI learn the same way? Like the program you were writing that saw a word or phrase in Spanish and could translate it -- was what you were doing basically the same thing as placing a person in a room?

Date: 2005-03-21 07:23 pm (UTC)
ext_110843: (juggling)
From: [identity profile] oniugnip.livejournal.com
Hmmm... well, imagine yourself in that situation -- could you learn Chinese just by looking at a lot of Chinese text? I don't think I could; if I was around Chinese people, and they were doing things and showing me items while saying the words, then I could probably learn it. Sitting in the room, you'd probably get better at recognizing the characters, but you wouldn't have any good way to learn what they mean without a dictionary or some more context...

Maybe if you had children's books, with pictures...

Date: 2005-03-21 08:23 pm (UTC)
From: [identity profile] zip4096.livejournal.com
What is "understanding" though?

What if the experience of "understanding" is that some part of our brain produces the correct output to the given problem, and then another region is sent a signal that gives the *feeling* of "Oh yes, I understand that perfectly." Personally I've had the experience of thinking I understand some material (eg physics), but when it comes test-time, I can't produce the right output :) Or also, an answer to something will come to me in a flash...

I dunno. I think you covered it when you referenced the Symbol Grounding Problem, Alex :) I hadn't heard of any of this stuff...

Alex's journal is educational!

Date: 2005-03-21 08:45 pm (UTC)
From: [identity profile] smileydee.livejournal.com
Well, if we're assuming that our brains are just receiving and responding to input, then how are we to know that the other region is sending us a signal that feels like understanding? Wouldn't we just receive it as some other nameless input? How would we be able to differentiate that from some nameless input telling us we don't know what the hell we're talking about?

Date: 2005-03-21 09:41 pm (UTC)
From: [identity profile] zip4096.livejournal.com
Hmm, this is tough! :)

Well... I'm a little uncomfortable defending my "understanding" signaling idea, because it's just that, an idea, and I don't know of a physical basis for it that's been discovered (I'm all about empirical evidence and science; philosophy often seems like a bunch of bullshit to me).

But anyway, let me give you an example... Regions in the brain communicate with one another all the time. Say you see something very scary- well the optic nerve carries the raw image to a large region of cortex (visual processing takes a lot of power!) in the occipital lobe which interprets the image as being scary, and then a signal is sent to your amygdala, which makes you feel afraid, and that has wiring to the hippocampus, which will make you remember it (and even better than normal because there's a strong emotion associated with it).

What I'm talking about is communication within the brain.. It's not input from the outside... does that answer your question at all... ?

I'm sorry, I actually had to reread your post a few times and I'm still not entirely sure if I understand what you were getting at. If it gets too abstract I have difficulty, but I love the physical, real science stuff :)

Date: 2005-03-21 10:58 pm (UTC)
From: [identity profile] smileydee.livejournal.com
Well, my point is that if we parse all these signals from the brain down, they're still, at the core, electrical impulses, right? Like 1s and 0s. So we're still back to the main problem of how do we translate 1s and 0s into something that makes sense? How does that become consciousness? No matter how complex our brains are, we're still back to the Chinese Room. When does self-awareness and intelligence come in?

Date: 2005-03-21 11:19 pm (UTC)
ext_110843: (coffee)
From: [identity profile] oniugnip.livejournal.com
That, m'dear, is the big problem. Where's the subjective, conscious experience, and how does it come about out of a physical system?

Hofstadter says (and I'm inclined to believe, and he puts it much more eloquently) that it comes about from a sufficiently self-referential computational system, which is to say a thing that has a model of itself and does processing about its own state...
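A cartoon of that "system with a model of itself" idea can be sketched in a few lines of Python -- this is a hypothetical toy, in no way a claim about how consciousness works, just the bare structure of a process that observes and then acts on its own state:

```python
# Toy strange loop: a system that maintains a model of its own state
# and makes its next decision by consulting that model rather than
# consulting the world directly.
class SelfModeler:
    def __init__(self):
        self.state = {"steps": 0}
        self.self_model = {}  # the system's picture of itself

    def step(self):
        self.state["steps"] += 1
        # Observe itself: refresh the self-model from the actual state...
        self.self_model = dict(self.state)
        # ...then act on the model of itself, not on the state directly.
        return self.self_model["steps"]
```

Hofstadter's claim is about vastly richer loops than this, of course; the sketch only shows what "has a model of itself and does processing about its own state" means mechanically.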

Date: 2005-03-21 11:15 pm (UTC)
ext_110843: (juggling)
From: [identity profile] oniugnip.livejournal.com
Oh my goodness. That's a really interesting thing to consider -- there's a feeling that we call "understanding". Hrm.

I think... that I couldn't competently answer that -- seems like something that needs to be empirically checked in on?

Date: 2005-03-22 03:47 pm (UTC)
From: [identity profile] elysianboarder.livejournal.com
Remember, Brett, there are long-term and short-term memory stores. The amygdala (*buzzword, Alex!*) converts between the two. So this further suggests that if your brain is flawed as a system, then it cannot cleanly handle the input and output of that system. Question posed: since "understanding" is simply a function of the brain, how do you have a true sense of self? Are people who are missing or have damaged sections of the whole not understanding, and not selves?

Date: 2005-03-22 04:23 pm (UTC)
ext_110843: (coffee)
From: [identity profile] oniugnip.livejournal.com
Mmm... yes. I don't think anybody's going to argue that some people with damaged brains have a messed up understanding and maybe less sense of self.

... but it's a remarkably resilient thing :)

But surely it would be possible to get an even stronger consciousness out of a few million more years of evolution or R&D? Given enough time, could we not breed people who are born with very well-controlled conscious minds?

Date: 2005-03-22 04:38 pm (UTC)
From: [identity profile] elysianboarder.livejournal.com
That's really very trivial. With enough monkeys in a room you can hack out Hamlet. So being able to breed people with a very well-controlled conscious mind or *cough cough* the mind itself is only a matter of time. I mean, I think they might even come up with a name for it. I would call it something like artificial intelligence. Since it would be trying to take inputs and from those determine, to an extent, an output. :D

Date: 2005-03-21 08:41 pm (UTC)
From: [identity profile] smileydee.livejournal.com
It's a very visceral argument. I feel I would have more understanding than someone sitting in the Chinese Room. Because I can react creatively to input. I can change my rulebook and be able to instantly evaluate how well my new rules work. So I feel that there must be something else besides the on/off electrical pulses my brain gets. Some sort of interpreter. Some sort of self.

I admit I haven't thought about this a whole lot. It's difficult to wrap the brain around. The Chinese Room Argument bothered me quite a lot when I first heard it, and it still bothers me. Because yes, there is the scientific fact that our brains don't get input that different than computers do. What's to say that we just receive input and produce output so quickly and so fluidly that it produces this "self" I talked about above?

But I don't know. I still feel like I am reacting in a more complex way than simply responding to input according to my programming.

Date: 2005-03-21 09:51 pm (UTC)
From: [identity profile] zip4096.livejournal.com
Mm, well, if the self exists, it has to exist in the brain, and the brain is just a bunch of neurons, which are really simple on their own...

I know where you're coming from, though- I feel the same way viscerally, but the brain as a machine makes perfect sense to me.... It's easy to think of the heart as a machine, isn't it?

When you talk about changing the rulebook and being able to evaluate yourself, I'm certain I've heard of such techniques used in AI...

This sense of self is a really complicated thing! I do know that personality, and maybe this sense of self we have, is thought to be located in the frontal lobes, and I think I heard that it's pretty resilient- some people with severe brain damage still have a sense of who they are...

Anyway if one were to argue that what makes us different from machines is feelings, well, feelings are just chemical signals (I'm sure you knew that though)...

I don't know where exactly I'm going with this :) I enjoyed talking about the brain though, it fascinates me. I wish I knew more and could answer your questions better.

Oh yeah, and, I agree with this :) "What's to say that we just receive input and produce output so quickly and so fluidly that it produces this "self" I talked about above?"

Date: 2005-03-21 11:00 pm (UTC)
From: [identity profile] smileydee.livejournal.com
But does the self have to exist in the brain? Does it have to be a physical phenomenon? Maybe there is a soul that ties all of these disparate electrical and chemical impulses into a coherent, thinking, self-aware being.

Date: 2005-03-21 11:25 pm (UTC)
ext_110843: (cartoon me)
From: [identity profile] oniugnip.livejournal.com
... or it's entirely possible that "soul" or "self" is a name we use to label a bundle of effects -- "All of that happening, over there? That's a self!"

I'm not going to step to the whole dualist soul-argument thing, but I will say that it's really difficult to test -- and not particularly parsimonious as a description, which is what we go for as scientists. If we can explain everything as physical without introducing non-physical elements into the world (almost wrote "ontology")... then why not?

If it's all in the brain, that doesn't mean that it's not there -- if it's an emergent phenomenon, it's still real. The canonical analogy about this is that a single neuron isn't awake in the same way that a single molecule of water isn't wet. It takes a lot of them doing a very specific thing together...

Date: 2005-03-22 12:07 am (UTC)
From: [identity profile] smileydee.livejournal.com
I don't mean to be an idiot about all this. I just really don't understand at which point intelligence and consciousness come in. At what level of complexity do we somehow change from a collection of parts reacting to stimuli according to a set of rules, to an intelligent, self-aware being? What causes that change?

Or have I just stuck myself in a nasty little slippery slope?

If I throw a ball at a tree, it must travel half the distance from me to the tree, and then it must travel half the distance from that point to the tree, and therefore it never ever hits the tree...

Date: 2005-03-22 07:17 am (UTC)
ext_110843: (juggling)
From: [identity profile] oniugnip.livejournal.com
That's a really good point -- no idiocy here. Maybe there are differing degrees of conscious experience? It works out intuitively...

I had an interesting experience with this, recently. I found myself absently staring at the reflection of a can of shaving cream, in the mirror. No idea how long I was doing that -- not a very awake or aware mental state. I expect being dead is something like that, but more so.

Date: 2005-03-23 05:41 am (UTC)
From: [identity profile] reality-calls.livejournal.com
   I found myself absently staring at the reflection of a can of shaving cream, in the mirror. No idea how long I was doing that -- not a very awake or aware mental state. I expect being dead is something like that, but more so.

That's a beautiful simile!  Gotta make a quote of that...  Do you mind if I rearrange it a bit and condense it so that it says something like, "I expect being dead is like absently staring at the reflection of a can of shaving cream in the mirror, but more so."?

Come to think of it, I should run that through a few different translations in Babelfish and see what it comes up with...

On a more serious note, I know exactly what you mean.  I have long periods of sitting-and-not-being-very-conscious myself, usually right after my last class gets out.

Seems to me like death would be more like being eternally anesthetized, though.  I got put under to have my wisdom teeth removed and next thing I knew I was being helped out of the dentist's office a few hours later.  I think death would be a lot like the time in between being anesthetized and waking up, only you wouldn't wake up.  This all, of course, depends on your religious beliefs, and who can say what death is like, anyway?  I don't seem to recall Jesus giving any vivid descriptions...

As far as the whole "differing degrees of conscious experience" theory goes, I can see that being the case.  I'm pretty sure that ants don't really have any idea that they exist, since they always respond in a predictable manner, and I've heard that there's evidence dolphins and apes can recognize themselves to some degree, though they don't think about it nearly as much as humans do.  Of course, I'm not entirely sure of any of this.  Also, I would think that a lot of one's consciousness is developed through experience, much the same way we would expect the consciousness of an AI program to develop itself as it interacted with whatever environment it could sense.  The degree to which it developed would have to do with the physical ability of the computer, but also with the experiences that it had in its environment.  If, say, you took a well-written, advanced AI program (or a newly-grown brain in a vat, for that matter) and connected it to a video camera that was fixed on a reflection of a shaving-cream can in a mirror, I don't think it would develop much in the way of self-consciousness.

(Well, that turned out to be longer than today's post)

      "Live from the People's Republic"

Date: 2005-03-21 08:18 pm (UTC)
From: [identity profile] zip4096.livejournal.com
I'm glad you tried it Denise :) There was another page that was a lot more careful about revealing what you were supposed to hear and whatnot- it was like, "do this, and then scroll down past this point" and whatnot.

Date: 2005-03-23 05:46 am (UTC)
From: [identity profile] reality-calls.livejournal.com
That's frightening-- I hate it when my mind plays tricks on me!

The really scary thing is that even after I knew what the actual sound was, I still heard "DA" or "GA" when I had my eyes open!

      "Live from the People's Republic"

Date: 2005-03-21 03:33 pm (UTC)
From: [identity profile] falun.livejournal.com
i think i need to borrow that for my paper... don't let me forget?
