You may not think of our flesh-eating diseased brethren as being the thoughtful types. Maybe they are, maybe they aren't.
As Sci mentioned, I'm gonna be holed up in the Costco for a while, so I got time to think about it. They're the slow-moving-undead zombies, not those ultra-quick "infected" (I hate those creepy bastards). I rolled down those big steel doors, barricaded them with anything heavy I could find here, gathered up all the lighting supplies for when the power goes out, bandaged up that bite on my arm, and I've taken to making jerky out of all this meat I've got laying around the store. I even got chainsaws in case they break in. Not my weapon of choice by a long shot, since all they do is attract more zombies, but they'll do. I should be able to last a while. Other survivors are knocking on the door. I can't let them in; I might put myself at risk. Maybe I'll toss them some supplies from the roof later. If they make it.
So while I slice up all this beef and prep it for low heat cookin', I'm thinking about these buggers. Are they just driven by a bunch of chemical reactions in their diseased brains? They got a rudimentary consciousness in there someplace, right? Or do they? People normally think of consciousness as the ability to self-evaluate, to reflect on one's own mental state. Consciousness is frequently referred to as an emergent phenomenon, that is, one that can't readily be predicted by knowing all the properties of its constituents. For example, we can't really predict "wetness" from knowing the properties of a water molecule.
Some philosophers suggest that mental processes supervene on physical processes. Roughly, the mental depends on the physical (no change in the mental without some change in the physical), but you can't just reductively break down all the physical processes of the brain and get a true understanding of mentation. That being said, though, you can't have the mental process without the physical. At least, none that we know about or can explain.
Hmm I'm getting hungry and these granola bars aren't cutting it. Guess I don't have to wait for the meat to cook, I can just start eating it as I slice it. Hey, it ain't that gross! Whole cultures make a dining habit of eating raw beef. Ever heard of carpaccio, huh? Steak tartare? Kitfo? Mmm. Funny thing is, the more raw meat I eat the less that bite on my arm itches.
Some philosophers would say that it is possible to imagine a universe just like ours in every way, with all the same physical laws, except that this property of supervenience doesn't apply to mental phenomena. That is to say, there's another universe out there with an identical "you", down to every last molecule, doing exactly what you are doing right now. Like making jerky. Mmmm jerky.
God I wish those survivors would stop banging on the door. I can't think with all that racket!
The difference between You and 2nd you (called Ewe) is that Ewe doesn't have the ability to self-evaluate. Ewe is not aware, Ewe is just playing out the fucktillions of molecular interactions going on in Ewe's body. And since all the same physical laws apply to You and Ewe, You and Ewe will continue to live out the exact same life. The only difference is, Ewe isn't aware of any of it. Kind of like those ravenous, flesh-eating fuckers out there right now, hunting down the last of humanity and tearing us to pieces with their jagged little teeth.
I don't really buy into this as an explanation for how consciousness works, because in this case consciousness really can't impact thought processes at all. It is a passive, useless thing. Personally I prefer to think that evolution shaped our mental processes by shaping our physical processes, meaning that self-evaluation serves a useful role in our survival. If I get out of this mess, I'm gonna head to the library and read up some more......
Dammit, stop banging on the damn door!!!!!!! RRRRRRRRRRR.
So anyway... self-evaluation. Yeah. What was I saying? Shit. Can't think. Hungry. Arm itches.
Tried raw beef, pork, chicken, turkey. I'm craving some long pig, and Costco doesn't carry that. Really craving it.
Time to go let those other survivors in.
I always thought the idea of something that acts exactly as if it were conscious, but wasn't, was kind of incoherent. For some problems with this sort of idea, see here.
On the topic of acting as if conscious without actually being conscious, I recommend the SF novel "Blindsight".
The roots of our self-awareness lie in an organism's need to monitor its internal physical state, viscera and so on. Every complex organism also needs to be aware of the difference between self and nonself. As life became more sophisticated, so did the need for better "introspection".
With the advent of social organisms, things got more complex and reports of passing the "mirror test" now even include non-primate organisms.
It is as if evolution, when given time enough, will spit out self-aware (but not necessarily technology-wielding) creatures.
By contrast, I am highly skeptical about the prospect of ever seeing truly self-aware "strong" AI. Without an evolutionary heritage to shape the brain, you will only get something very good at simulating awareness.
A question has been circulating in my mind for the past few years in relation to the following:
The difference between You and 2nd you (called Ewe) is that Ewe doesn't have the ability to self-evaluate. Ewe is not aware, Ewe is just playing out the fucktillions of molecular interactions going on in Ewe's body. And since all the same physical laws apply to You and Ewe, You and Ewe will continue to live out the exact same life. The only difference is, Ewe isn't aware of any of it.
I don't really buy into this as an explanation for how consciousness works, because in this case consciousness really can't impact thought processes at all. It is a passive, useless thing.
Is the Turing Test really a test of intelligence or is it more a test of consciousness?
Birger, I enjoyed Blindsight, but even in that book there was a difference between simulated consciousnesses and real ones. It's just that, in that book, the simulated ones were more effective survivors. :)
Can you clarify your point about 'strong AI'? A silicon architecture may not be as efficient for 'awareness processing' as the brain, but unless the brain is doing something non-Turing-computable, an AI could be 'truly' self-aware...
Hi, Evil Monkey
Could you please define the difference between You and the Evil Monkey (Ewe)? Could you please clarify your point that "Ewe is not aware, Ewe is just playing out the fucktillions of molecular interactions going on in Ewe's body"?
This theory would suggest physicalism isn't quite correct, and that if, in some far future, we could replicate every synapse of every neuron and every neurotransmitter and all the other molecules/atoms in someone's brain at a given point in time, we wouldn't create a new consciousness. The proposed theory suggests that the physical state of the system cannot give rise to consciousness, is that correct?
Such an interesting discussion, thank you for bringing it up and thank you for incorporating zombies. Brains!