Alex R ([personal profile] alexr_rwx) wrote 2005-03-21 12:59 am

expect this to be a paper soon, or a part thereof?

SEARLE: 1. Programs are entirely syntactical. 2. Minds have semantics. 3. Syntax is not the same as, nor by itself sufficient for, semantics. Therefore, programs are not minds. QED. [from The Mystery of Consciousness, pg 11, paraphrase of his well-known "Chinese Room Argument"]

RUDNICK: In principle, given a powerful enough computer and a sufficient model of physics, I could simulate with arbitrary precision all of the happenings in a section of space -- assuming that this space does not contain the computer running the simulation. Now I choose to simulate a room containing a human, sitting in a chair. The simulated human in the simulated chair will, for all useful definitions of "conscious", be conscious (although he will not necessarily realize that he's part of a simulation). The brain-in-a-vat problem needs neither a brain nor a vat, just a mathematical model of a brain. If you want to say that the program is not a mind but contains one, that's analogous to saying that the physical world is not a mind but contains one. The main point is that if we believe the physical world to be mathematically modelable, we can bring about awake minds within a computer.
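The premise above can be sketched in a few lines, assuming only that a region of the world can be written down as a state plus deterministic update rules. The "physics" below is an arbitrary toy (a rule-90 one-dimensional cellular automaton, wrapping at the edges), chosen purely for illustration -- not a claim about real physics:

```python
def step(state):
    """One tick of a toy 'physics': each cell becomes the XOR of its
    two neighbors, with wraparound at the edges."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def simulate(state, ticks):
    """Step the whole region forward by some number of ticks."""
    for _ in range(ticks):
        state = step(state)
    return state

# A tiny "section of space" with one disturbance in the middle:
region = [0, 0, 0, 1, 0, 0, 0]
print(simulate(region, 2))  # → [0, 1, 0, 0, 0, 1, 0]
```

The argument is that nothing changes in kind as the state and the update rules get richer; only the computational cost grows.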

Furthermore, "semantics" is an inherently shady word, loaded with connotations of some "objectively extant physical world" that one can reference. You, Professor Searle, know just as well as I do that you can't get outside of your own head -- eyes don't guarantee that you're seeing anything more real than what a robot with a video camera or an agent with a get_world_input() function gets.
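A minimal sketch of that last point: from the agent's side, the interface to "the world" is identical whether the input function is wired to a physical sensor or to a simulation. (The function names here, including get_world_input, are hypothetical stand-ins, not any real API.)

```python
def camera_input():
    """Stand-in for a reading from a physical sensor."""
    return [0.2, 0.7, 0.1]  # e.g. pixel intensities

def simulated_input():
    """Stand-in for the same reading produced by a physics model."""
    return [0.2, 0.7, 0.1]

def make_agent(get_world_input):
    """An agent only ever sees what its input function returns."""
    def perceive():
        return get_world_input()
    return perceive

embodied = make_agent(camera_input)
envatted = make_agent(simulated_input)

# Nothing observable from inside the agent distinguishes the two cases.
assert embodied() == envatted()
```

The design choice mirrors the argument: the agent is parameterized over its input source, so "real" versus "simulated" is a fact about the caller, not something the agent can inspect.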

[identity profile] elysianboarder.livejournal.com 2005-03-22 03:47 pm (UTC)
Remember, Brett, there are long-term and short-term memory stores. The amygdala (*buzz word, Alex!*) converts between the two. So this further proves that if your brain is flawed as a system, then you cannot cleanly handle the input and output of that system. Question posed: since "understanding" is simply a function of the brain, then how do you have a true sense of self? Are people who are missing or have damaged sections of the whole not understanding and not selves?

[identity profile] oniugnip.livejournal.com 2005-03-22 04:23 pm (UTC)
Mmm... yes. I don't think anybody's going to argue that some people with damaged brains have a messed up understanding and maybe less sense of self.

... but it's a remarkably resilient thing :)

But surely it would be possible to get an even stronger consciousness out of a few million more years of evolution or R&D? Given enough time, could we not breed people who are born with very well-controlled conscious minds?

[identity profile] elysianboarder.livejournal.com 2005-03-22 04:38 pm (UTC)
That's really very trivial. With enough monkeys in a room you can hack out Hamlet. So breeding people with very well-controlled conscious minds or *cough cough* building the mind itself is only a matter of time. I mean, I think they might even come up with a name for it. I would call it something like artificial intelligence. Since it would take inputs and, from those, determine an output to some extent. :D