Hatrack River Forum

  This topic comprises 2 pages: 1  2   
Author Topic: Philosophical Games!
Jhai (Member # 5633) posted:
Has anyone ever tried out some of the games and interactive activities on The Philosophers' Magazine website? I have no idea what the magazine itself is like, but the games are quite fun, and pretty good from a philosophical standpoint. They typically don't judge whether a particular position is good or bad; instead they check the rational consistency of your answers.

My favorite is probably Battleground God, where you answer questions about your beliefs on religion & God. The goal is to keep your answers consistent and to minimize injuries from "biting the bullet" (which occurs when your answers imply something that is traditionally seen as philosophically troubling). Of course, a simple online game isn't nearly as deep as properly studying theology, but it's a good jumping-off point. Another great one is Staying Alive, which is about philosophy of mind - an area of philosophy most people don't get much exposure to. There are also some on philosophy of art, logic (both deduction & induction), and one on philosophical trivia (where I totally sucked). Recommended for all philosophy geeks. [Smile]
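The scoring idea behind these games can be sketched in a few lines: rather than grading your positions, they encode implications between answers and count how often your commitments collide. Here is a minimal, hypothetical illustration in Python - the rules, statement names, and scoring formula are all invented for the example, not taken from TPM's site:

```python
# Minimal sketch of a consistency-scoring quiz in the spirit of the TPM
# games: it never judges whether an answer is right, only whether your
# answers contradict one another. All rules and statement names below
# are invented for illustration.

# Each rule reads: if you answered every premise True but the
# conclusion False, that pair of commitments is "in tension".
RULES = [
    ({"an_act_is_wrong_only_if_someone_is_harmed"},
     "harmless_taboo_acts_are_not_wrong"),
    ({"god_can_do_absolutely_anything", "square_circles_are_impossible"},
     "omnipotence_needs_qualification"),
]

def tension_score(answers):
    """Percent of rules whose premises you accept (True) while
    explicitly rejecting the conclusion (False)."""
    hits = sum(
        1
        for premises, conclusion in RULES
        if all(answers.get(p) for p in premises)
        and answers.get(conclusion) is False
    )
    return round(100 * hits / len(RULES))

answers = {
    "an_act_is_wrong_only_if_someone_is_harmed": True,
    "harmless_taboo_acts_are_not_wrong": False,  # refusing to bite the bullet
    "god_can_do_absolutely_anything": False,
}
print(tension_score(answers))  # -> 50: one of the two rules fires
```

Note that unanswered questions fire no rules, which matches how the real quizzes only flag tensions between positions you actually committed to.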

Posts: 2409 | Registered: Sep 2003

Threads (Member # 10863) posted:
quote:
Your choices are consistent with the theory known as psychological reductionism. On this view, all that is required for the continued existence of the self is psychological continuity. Your three choices show that this is what you see as central to your sense of self, not any attachment to a particular substance, be it your body, brain or soul.

But there is a tension. In allowing your brain and body to be replaced by synthetic parts, you seemed to be accepting that psychological continuity is what matters, not bodily continuity. But if this is the case, why did you risk the space ship instead of taking the teletransporter? You ended up allowing your body to be replaced anyway, so why did you decide to risk everything on the spaceship instead of just giving up your original body there and then?

I didn't take the teletransporter because I didn't think it would work. The idea itself doesn't make much sense. Assume the teletransporter works and the "me" (consciousness) present on one end gets transferred to the reconstructed body. What happens if the teletransporter does not destroy the original "me"? For the teletransporter to work, we would have to assume that "I" (as in my consciousness) could occupy two bodies at once. That is not consistent with our current understanding of consciousness. Therefore I assume that the body that appears on the other end of the teletransporter is not myself but rather a clone of myself.

The artificial brain scenario is different. Let's say that my brain gets replaced piece by piece. What happens if half of the part of my brain responsible for consciousness gets replaced but the other half stays intact? Does the original "me" get destroyed and replaced by a cloned mind? I assume that this is not true because the atoms in our brain are constantly being replaced (meaning parts of our brain can be replaced while maintaining our original conscious identity). This means that the new half-new half-old brain should still be the original me. But now we are back to square one. We can perform the same procedure on the other half of my brain and the original logic still holds.

Posts: 1327 | Registered: Aug 2007

MightyCow (Member # 9253) posted:
Woohoo, only 13% tension in my "Philosophical Health", although I disagree with the assessment of the tension between my supposed inconsistent beliefs.

Let's see if it will actually let me run Battleground God - it's been out of order so far.

Posts: 3950 | Registered: Mar 2006

TomDavidson (Member # 124) posted:
quote:
For the teletransporter to work it would require assuming that "I" (as in my consciousness) could occupy two bodies at once. That is not consistent with our current understanding of consciousness.
I'm curious: in what way?
Posts: 37449 | Registered: May 1999

MightyCow (Member # 9253) posted:
Threads: If you can replace the brain piece by piece and still stay you, why can't you replace all those pieces at once (by teleporter)?

Does the "you" in the piece-by-piece replacement theory have to remain in the original brain while the replacement is under way, and can it then flow from the original brain into the new brain?

Posts: 3950 | Registered: Mar 2006

Itsame (Member # 9712) posted:
I get nearly perfect (if not perfect) internal consistency on all of them... probably because I am a philosophy major.
Posts: 2705 | Registered: Sep 2006

AvidReader (Member # 6007) posted:
I took the Taboo test and had no inconsistencies. Compared with others who took it, I was more likely to condemn the actions and to try to interfere with the behavior, and less likely to ignore social norms when deciding morality.

So I'm intolerant, but I'm consistently intolerant. [Smile]

Posts: 2283 | Registered: Dec 2003

Threads (Member # 10863) posted:
quote:
Originally posted by MightyCow:
Threads: If you can replace the brain piece by piece and still stay you, why can't you replace all those pieces at once (by teleporter)?

In one case you are making a complete clone that has no dependency on the original person. If you made an exact copy of a chair, you wouldn't say that the copy is the original chair. I think the default assumption is that the same holds for humans.

quote:
Originally posted by MightyCow:
Does the "you" in the replacing brain piece by piece theory have to shift to the original brain while the new brain is being replaced, and can then flow from the original brain into the new brain?

What do you mean by "flow"? If we replaced the neurons in my brain one by one with artificial ones that exactly mimicked the neurons that they replaced, at which point would I stop being my old self? Neurons die all the time and we don't have our original consciousness replaced (this is technically an assumption because there is no way for my present self to know).
Posts: 1327 | Registered: Aug 2007

Threads (Member # 10863) posted:
quote:
Originally posted by TomDavidson:
quote:
For the teletransporter to work it would require assuming that "I" (as in my consciousness) could occupy two bodies at once. That is not consistent with our current understanding of consciousness.
I'm curious: in what way?
I phrased that poorly. What I meant was that, from what we know, our consciousness is intimately tied to our brain, and we have to be careful when we try to treat them as separate entities. If we created a copy of our brain, why would our consciousness jump to the new copy? We don't know that it's not possible, but there is nothing to suggest that it is.
Posts: 1327 | Registered: Aug 2007

TomDavidson (Member # 124) posted:
quote:
If we created a copy of our brain then why would our consciousness jump to the new copy?
As someone who doesn't believe in qualia, and therefore doesn't believe in consciousness except as a convenient fiction, I don't see the problem; our "consciousness" is an imaginary concept that describes the process by which a self-image is produced. If you produce two brains running the same process, and drawing on the same storage, you're going to produce two identical consciousnesses that will begin to diverge from each other immediately.
Posts: 37449 | Registered: May 1999

Threads (Member # 10863) posted:
quote:
Originally posted by TomDavidson:
quote:
If we created a copy of our brain then why would our consciousness jump to the new copy?
As someone who doesn't believe in qualia, and therefore doesn't believe in consciousness except as a convenient fiction, I don't see the problem; our "consciousness" is an imaginary concept that describes the process by which a self-image is produced.
The "problem" is that the teletransporter copies humans rather than transporting them. In the interest of survival (the point of the game), going through the teletransporter would be the same as me committing suicide. It would produce a copy of me while killing the original.

quote:
Originally posted by TomDavidson:
If you produce two brains running the same process, and drawing on the same storage, you're going to produce two identical consciousnesses that will begin to diverge from each other immediately.

That's what I believe as well.
Posts: 1327 | Registered: Aug 2007

MEC (Member # 2968) posted:
I poked around a little bit, so far I have to say I don't like how they control all of the definitions.

(for example, whether by omni- you mean infinite- or all-)

Posts: 2489 | Registered: Jan 2002

Tresopax (Member # 1063) posted:
quote:
If you produce two brains running the same process, and drawing on the same storage, you're going to produce two identical consciousnesses that will begin to diverge from each other immediately.
Which raises the question: Which one is you?

The problem is that you are saying that when we copy your body and destroy the original, a la Star Trek, the copied body is still you. But if that logic holds, then if we do the exact same thing yet don't destroy the original body, BOTH the copy and the original would be you simultaneously. That is a problem - who gets to be married to your wife? Who gets your bank account? Or, if both of them are you, that would mean the original transporter is murdering a person every time it makes a copy somewhere else, which raises some serious moral questions about transporting... After all, would you agree to commit suicide if someone told you a copy of you would be built somewhere else that had all your thoughts and memories in it and would live your life as you would?

Posts: 8120 | Registered: Jul 2000

Jhai (Member # 5633) posted:
quote:
Originally posted by MEC:
I poked around a little bit, so far I have to say I don't like how they control all of the definitions.

(for example, whether by omni- you mean infinite- or all-)

All of the definitions they use are pretty standard in philosophical circles. I'm not sure where your example of "omni-" is coming from, but defining, say, omnipotent as all-powerful seems pretty natural to me, given the dictionary definition of the prefix "omni-". As the OED says, omni- "form[s] compounds in which the first element has the sense ‘in all ways or places’, or ‘of all things’."

What would you rather define it as?

Posts: 2409 | Registered: Sep 2003

Mucus (Member # 9735) posted:
quote:
Originally posted by Tresopax:
The problem is that you are saying that when we copy your body and destroy the original, a la Star Trek, the copy body is still you. But it seems like if that logic is true, then if we do the exact same thing yet don't destroy the body, BOTH the copy and the original would be you simultaneously. ...

I believe that standard operating procedure in this case would be to determine which version is your evil counterpart yet required for the daring of command and which version is your good counterpart yet too weak to make decisions.

Then both halves have to be merged again in the transporter before roughly 45 minutes are over.

Posts: 7593 | Registered: Sep 2006

Philosofickle (Member # 10993) posted:
You spelled my name wrong.
Posts: 208 | Registered: Sep 2007

Xaposert (Member # 1612) posted:
I believe you have that exactly right, Mucus...
Posts: 2432 | Registered: Feb 2001

MightyCow (Member # 9253) posted:
Threads: Let's look at it slightly differently then. You seem to support the idea that by replacing the brain piece by piece, "you" still maintain your selfhood, because it is analogous to how your brain cells naturally die and are replaced.

What if the doctors were wrong about the disease? Suppose that as they took each piece of brain out, they preserved it in a jar for further study, and when they removed the last piece, "you" were still you, only now with a fully computerized brain - but the brain bits in the jar also regained consciousness, and from the standpoint of that brain, it felt as though it were "you" and had simply been asleep for the procedure.

Is the body with the computer brain "you", since it has felt like you all along, or is the brain in the jar "you", because it is actually your brain and was only damaged for a period of time before recovering?

Posts: 3950 | Registered: Mar 2006

Juxtapose (Member # 8837) posted:
Only 7% tension on my health check! Not too shabby.
Posts: 2907 | Registered: Nov 2005

Achilles (Member # 7741) posted:
7% here too.
Posts: 496 | Registered: Apr 2005

Threads (Member # 10863) posted:
quote:
Originally posted by MightyCow:
Threads: Let's look at it slightly differently then. You seem to support the idea that by replacing the brain piece by piece, "you" still maintain your selfhood, because it is analogous to how your brain cells naturally die and are replaced.

I support the idea that what matters is the structure of the brain. We know that parts of our brain get replaced all of the time at the atomic level and we know that individual neurons in our brain can die off or get replaced. I would be surprised if more macroscopic replacements are not possible. Clearly for these replacements to work they would have to not kill the brain so the vague methods I outlined earlier may not work.

quote:
Originally posted by MightyCow:
What if the doctors were wrong about the disease? Suppose that as they took each piece of brain out, they preserved it in a jar for further study, and when they removed the last piece, "you" were still you, only now with a fully computerized brain - but the brain bits in the jar also regained consciousness, and from the standpoint of that brain, it felt as though it were "you" and had simply been asleep for the procedure.

Is the body with the computer brain "you", since it has felt like you all along, or is the brain in the jar "you", because it is actually your brain and was only damaged for a period of time before recovering?

I would say that the computer is still "me" (though I won't commit to this being my final answer). A relevant question is what if we chopped someone's brain up into 100 pieces, preserved those pieces for 100 years, and then somehow put them back together. If the brain still functioned would it be the same person? That is a very tough question to answer.

The major problem with answering these questions is that our definitions are poor. Here's an example to illustrate: let's say a wooden chair is broken and then repaired. Is the repaired chair the same chair as before it was broken? While arguments can be made either way, the scenario itself is flawed. A chair is a human concept. The laws of physics operate on particles*, not on chairs. We refer to chairs as if they are individual entities because it is highly convenient to do so. In reality they are not, so the original question does not even make sense. To answer it we would have to lay out arbitrary rules that define exactly what a chair is and how a chair can lose its "chairiness".

A similar problem exists in our original discussion because we have a lousy definition of "consciousness" (and a poor understanding of that lousy definition at that). We also don't have a good definition of what it means to survive. I take "survive" to mean that my brain doesn't die, and since our definitions of life vs. non-life are also very poor/arbitrary, we encounter corner cases where it is not obvious whether the brain does or doesn't die. So we arrive at our current situation: the original assignment demanded that I try to maximize my survival probability, but I have to answer questions that call the very definition of "survival" into question.

* or so we think

Posts: 1327 | Registered: Aug 2007

Itsame (Member # 9712) posted:
"The major problem with answering these questions is that our definitions are poor."

O Wittgenstein, O Wittgenstein! How wise thy words are!

Posts: 2705 | Registered: Sep 2006

scholar (Member # 9232) posted:
Well, in the taboo problem, they forgot about ghosts. The first question has a son promise his dying mother that he will visit her grave; he breaks the promise but notices no harm and feels no guilt. But it says nothing about how her ghost feels. For there truly to be no harm, we need to know whether the ghost is okay with her son's broken promise or not.
Posts: 1001 | Registered: Mar 2006

Juxtapose (Member # 8837) posted:
I like you, Threads. You say a lot of things I'd like to, only better.

Carry on.

Posts: 2907 | Registered: Nov 2005

orlox (Member # 2392) posted:
http://bluebrain.epfl.ch/
Posts: 675 | Registered: Aug 2001

twinky (Member # 693) posted:
0% tension. My beliefs are perfectly internally consistent. [Big Grin]
Posts: 10886 | Registered: Feb 2000

Shawshank (Member # 8453) posted:
The problem I have with these games is their dependency on empiricism. Not being an empiricist, I come across as less logical simply because the game and I have differences in epistemology.
Posts: 980 | Registered: Aug 2005

Itsame (Member # 9712) posted:
My problem is that I don't believe that the laws of excluded middle or non-contradiction are absolute.

Yeah, I would get yelled at by a) 100% of mathematicians and b) 98% of philosophers, but whatever. Two percent agree with me.

Posts: 2705 | Registered: Sep 2006

Tara (Member # 10030) posted:
I really want to play Battleground God but the server keeps being too busy. [Frown]
Posts: 930 | Registered: Dec 2006

Itsame (Member # 9712) posted:
Took another one. Your Moral Parsimony Score is 100%
Posts: 2705 | Registered: Sep 2006

Tresopax (Member # 1063) posted:
quote:
I would say that the computer is still "me" (though I won't commit to this being my final answer). A relevant question is what if we chopped someone's brain up into 100 pieces, preserved those pieces for 100 years, and then somehow put them back together. If the brain still functioned would it be the same person? That is a very tough question to answer.

The major problem with answering these questions is that our definitions are poor.

I disagree. We could define everything perfectly, but I still don't think we'd know the answer to the above question unless we empirically test it out.

The major problem with answering these questions is that self-hood is untestable except on yourself. If you walk through a door and then someone who looks, acts, and functions like you comes out of that door a minute later, I can ASSUME it is you, but that is just an assumption. It could be a clever robot or clone designed to fool me. In order to know for sure, I'd need access to your subjective, internal consciousness. I'd need to be able to see into your mind to see that it is the same mind as before. But only you can see into your mind.

Philosophers try to get around this by postulating that self-hood is defined by things we CAN test and observe - such as our personality or our physical body. But in examples like these, those criteria always seem to end up failing. That, I think, is the reason to believe that self-hood is not defined by any characteristic at all, except the continuity of a personal stream of consciousness and/or a soul which may or may not continue to exist if we replaced all the parts of our brains with silicon.

Posts: 8120 | Registered: Jul 2000

C3PO the Dragon Slayer (Member # 10416) posted:
Why do the results of the do-it-yourself-deity game refer to God as "she"?
Posts: 1029 | Registered: Apr 2007

Jhai (Member # 5633) posted:
A lot of academic types, in an (IMHO misguided) effort to promote a gender-neutral and/or pro-feminist outlook, switch genders when there's no clear reason why a person ought to be one gender or another. If they're writing an example, sometimes they'll use "he" and sometimes they'll use "she". I suppose whoever wrote the page didn't think that God, by definition, had to be a he or a she.

I'd rather they just stick with "he", since switching genders around confuses me. Plus it can look really weird. The worst case of gender-neutral language I've seen is on the Chronicle of Higher Education forums, where a significant number of the posters regularly use "hu" as a third-person singular pronoun.

Posts: 2409 | Registered: Sep 2003

Destineer (Member # 821) posted:
It's funny how far from average my results on "Taboo" are as an academic philosopher. I've long since decided that the incest and chicken examples are perfectly fine morally. In the deathbed promise case, I think you do harm the late mother by frustrating her desires.
Posts: 4600 | Registered: Mar 2000

Jhai (Member # 5633) posted:
Clarification question: when are you harming her - before she's dead, or afterwards?
Posts: 2409 | Registered: Sep 2003

Threads (Member # 10863) posted:
quote:
Originally posted by Tresopax:
I disagree. We could define everything perfectly, but I still don't think we'd know the answer to the above question unless we empirically test it out.

Define "life" and "consciousness" then.
Posts: 1327 | Registered: Aug 2007

The Pixiest (Member # 1863) posted:
quote:
That, I think, is the reason to believe that self-hood is not defined by any characteristic at all, except the continuity of a personal stream of consciousness and/or a soul which may or may not continue to exist if we replaced all the parts of our brains with silicon.
OR we could die moment by moment whenever our brain changes state. We only think we're the same person because we have all of the old person's memories.

I found that particular quiz annoying. They change the rules on the last question by saying "Hey, btw, souls exist" and then chastise me for being inconsistent.

Posts: 7085 | Registered: Apr 2001

orlox (Member # 2392) posted:
Self-hood is not necessarily beyond external testing. I refer you to the Blue Brain project linked above. If we can successfully model consciousness, the jig is up.

They plan on sticking an artificial rat brain in a rat robot in the next two years. If all goes well, it is straight on to the full human model. But even at the rat stage, if it works, we will have artificially generated first-person experience.

http://www.seedmagazine.com/news/2008/03/out_of_the_blue.php

Posts: 675 | Registered: Aug 2001

Tara (Member # 10030) posted:
"Strange New World" is pretty much a waste of time, IMO.
Posts: 930 | Registered: Dec 2006

scholarette (Member # 11540) posted:
The taboo issue did make me redefine morality somewhat. I realized that, for me, it isn't the harm; it's the probability of doing harm (weighed against the potential gain). The fact that in retrospect that probability was avoided does not make the act suddenly moral for me. For example, if I went into a public place and started shooting, even if miraculously no one was harmed, my action would still be immoral.
Posts: 2223 | Registered: Mar 2008

Jhai (Member # 5633) posted:
I liked the taboo one a lot too. And I agree that randomly shooting a gun where people are present is a BAD thing. However, in the taboo game, at least as I understand it, absolutely no one thought there was any chance of themselves or others being harmed by their actions. Do you just think they're wrong about the potential harm, or were you okay with all the taboo-like scenarios (barring scenarios with religious implications)?
Posts: 2409 | Registered: Sep 2003

scholarette (Member # 11540) posted:
I think there is potential harm that they aren't factoring in. The son, for instance, is assuming that there is no life after death and that his mom isn't being harmed. In the chicken one, is he sure that his cooking eliminates any potential disease transmission? As for sex between siblings, assuming no emotional complications is a huge assumption to make - even if it ended up being correct, it was a careless choice.
Posts: 2223 | Registered: Mar 2008

Destineer (Member # 821) posted:
quote:
Clarification question: when are you harming her - before she's dead, or afterwards?
After. When you think about it we often want to say that you can 'harm' someone after they die -- for instance, if I violate the terms of your will.
Posts: 4600 | Registered: Mar 2000

Jhai (Member # 5633) posted:
Personally, I never want to say that you can harm someone after he's dead - since that person no longer exists. I don't think you can harm a thing which doesn't exist. It's just a slight twist on the non-identity problem. While it's true that we refer to a dead body as "a dead person," unless you believe in a soul or some other continuation of the mind/personality/life-force, you've really just got a lifeless chunk of organic matter. No being left to harm.

Taking your example of violating the terms of a will: I don't think that's a harm to the person who's dead, but rather a harm to society, in that contracts are being broken, and a specific harm to those close to the deceased, who might feel better if the terms of the contract were followed. You could also suppose that the deceased had a better idea than others of how his estate should be dealt with, in which case you'd have an efficiency loss should the terms of the contract be broken.

Posts: 2409 | Registered: Sep 2003

Tresopax (Member # 1063) posted:
quote:
Define "life" and "consciousness" then.
When I say we could define those terms perfectly, I don't mean I could define them perfectly right now in one post on this forum. I mean we "could" define them in the same way that we "could" replace every part of the brain with silicon - in other words, in a thought experiment. If we did, in such a thought experiment, I don't think that definition by itself would solve the dilemma of whether or not consciousness would survive in a silicon brain. I think that question is empirical.
Posts: 8120 | Registered: Jul 2000

Mike (Member # 55) posted:
quote:
Originally posted by Tresopax:
The major problem with answering these questions is that self-hood is untestable except on yourself. If you walk through a door and then someone who looks, acts, and functions like you comes out of that door a minute later, I can ASSUME it is you, but that is just an assumption. It could be a clever robot or clone designed to fool me. In order to know for sure, I'd need access to your subjective, internal consciousness. I'd need to be able to see into your mind to see that it is the same mind as before. But only you can see into your mind.

I'd amend the first sentence to read "The major problem with answering these questions is that self-hood is untestable, with no exceptions." Imagine, for example, that in your scenario the person walking back out of the door looks, acts, and functions like the person who walked in, but also believes himself to be that person. Could not this person still be a clever robot or clone?
Posts: 1810 | Registered: Jan 1999

rollainm (Member # 8318) posted:
Damn it, people. Now I'm really confused.

Who am I?! [Angst]

Posts: 1945 | Registered: Jul 2005

ketchupqueen (Member # 6877) posted:
quote:
Originally posted by scholarette:
I think there is potential harm that they aren't factoring in. The son, for instance, is assuming that there is no life after death and that his mom isn't being harmed. In the chicken one, is he sure that his cooking eliminates any potential disease transmission? As for sex between siblings, assuming no emotional complications is a huge assumption to make - even if it ended up being correct, it was a careless choice.

I agree, although I would posit different harms for the guy with the chicken.
Posts: 21182 | Registered: Sep 2004

ketchupqueen (Member # 6877) posted:
Oh, and I have tried to get to Battleground God since this thread was posted, at all different times of day and night, and the server is ALWAYS down. Those who've played it: when did you get in?
Posts: 21182 | Registered: Sep 2004

dean (Member # 167) posted:
About two years ago. I don't remember what time of day. [Taunt]
Posts: 1751 | Registered: Jun 1999

Copyright © 2008 Hatrack River Enterprises Inc. All rights reserved.
Reproduction in whole or in part without permission is prohibited.

