Here's an awesome little video I found on io9 today, from the website of Tom Scott. He calls it "A science fiction story about what you see when you die. Or: the Singularity, ruined by lawyers." It perfectly sums up why I don't want to be uploaded. I'm always baffled when Transhumanists talk about mind uploading as a way to become immortal and invulnerable. If anything, existing purely as digitized information would make you more vulnerable, as the above video demonstrates.
The scenario in the video clearly uses whole brain emulation of a specific individual. I won't bother discussing the technical objections to this method of uploading, only the ideological ones. For those who consider 'the self' to be purely personality and memories (which are not consistent across a person's lifespan, so this seems an odd way to define selfhood to me), brain emulation isn't a problem. I, however, consider life to be conscious experience. What separates me from other individuals is that I do not share in their conscious experiences. If you programmed a computer to think like me, I wouldn't be able to experience consciousness through him, nor he through me, which makes us separate individuals. He would be a kind of artificial twin. To those who consider a mentally identical twin to be you, please note that the subject's brain was emulated with only 98.356% accuracy. That's about the same as the difference between chimpanzee and human DNA. Surely such a margin of error in a brain emulation would result in non-trivial differences between the emulation and the original. Even if you don't require continuity of consciousness to maintain selfhood, would an emulation with such noticeable differences from the original still be the same person?
Regardless of who this upload is, he doesn't appear to be in for a very good time. The first thing I find objectionable is that the system forcibly adjusts his mental state to calm him down, even though he's in a situation where he has every right to be upset. As the terms and conditions make clear, this virtual world has no qualms about restricting people's mental activity. They have no problem altering your personal brand preferences to align with those of their sponsors. Unless you're willing (and able) to pay for the top tier, your cognitive functions will be limited or even disabled during periods of high server load. Your mind is purged of any mental patterns that might incline you toward crime or terrorism, and your memories of all copyrighted works are deleted. After such an extensive mental overhaul, how would you even be the same person?
Clearly, existing as a purely digital mind makes it ridiculously easy for someone to rewrite your personality and memories against your will. And it says right on the screen: rejection of these terms will result in the termination of your simulated personality. Far from being invincible, you could be deleted with two clicks of a mouse. "Do you want to commit murder?" *click yes* "Are you sure?" *click yes*. I find neurohacking a terrifying prospect. Far from protecting me from mind rape, uploading actually makes it easier for anyone to read, alter, and erase my thoughts. If you were only a computer program, you'd be highly vulnerable to hackers, viruses and other malware, system crashes, and the physical destruction of the computer running your program. I don't think that would be a good trade-off.
No matter what security measures were taken against neurohacking, they wouldn't be enough. People would hack mind uploads just to prove how unsafe they were. Anonymous does this today. In 1903, Nevil Maskelyne hacked one of Marconi's wireless telegraph transmissions, purely because Marconi had been bragging about the impenetrable security of his system. You can bet that advocates of mind uploading will brag about the security of theirs, which will only entice some asshole in a Guy Fawkes mask to hack it.
Personally, I doubt that whole brain emulation will be achieved in forty years, if at all (I expect Moore's law to be broken by 2030). I don't even plan to get a webchip in my head, for fear of neurohackers or of losing my individuality to a collective consciousness. I'll stay in my meat body, my mind confined to my brain. I think I have a good chance of reaching my biblical three score and ten. Moderate medical advances in the next few decades may allow me to live to a hundred, and there's a slim chance I may even live for several centuries in a modestly enhanced human body. But I won't change too much, not if I want to preserve my precious continuity of consciousness, and eventually something will kill me. If I could choose how I died, it would be after I had lived a very long time and truly grown weary of life. Some completely treatable malady would threaten my life, and instead of getting fixed up, I would let it take me. But of course, we don't get to choose when or how we die.