
An Article About \(\forall\) You

Whattup humans,

I wrote an article - it came out on the front page. It would be really cool if you checked it out; I'd gladly hear what you think about the topic.

Click for the article

(Alternatively, try here)


(P.S. - If you'd like to see an in-depth discussion about this topic, you may enjoy this thread.)

Note by John Muradeli
2 years ago




Great responses, everyone. Not sure what's up with the link; try THIS

I'd like to reply to all, but I just want to throw something out there: when considering this question, it is not wise to take on an entirely materialistic perspective. In fact, as little as I know about spiritualism, I believe it is necessary. A person I know went through an "astral projection" and pinpointed, with precision, the location of a man who was being searched for - on the other side of the planet, in prison. Now THAT's something you can't say is simply a manifestation of the brain, or a "placebo experience."

And yes, Brian, a millimeter can make a colossal difference to a macroscopic system in the long run; so can a nanometer. But my point was that pretty much all the knowledge and memories could be recreated with almost exact precision; the only thing that would change is the initial behavior from the entity's starting point. And yes - the teleportation problem. That's basically it. If you're "recreated," but your original is not destroyed, then ... heck knows. I guess disrupting the continuity of the brain entirely distorts consciousness, and perhaps the "spirit" or the "soul" as well.

Though theoretically it is very intriguing to think of the possibility of quantum entangling two brains and bodies - and sustaining them. NOW we've got a debate. Anyway, I feel like now I'm about to go again into the stream of consciousness - so let me stop here.

I hope you enjoyed the article, and have a great day!

(P.S. - "RedWings" is the name/symbol of my High School, used officially in sports teams. "Go RedWings!") John Muradeli · 2 years ago


@John Muradeli Oh, haha, "RedWings" as in Detroit RedWings, (for example) as opposed to a telepathic raptor. Sorry, I have Ultron on my brain. :P Brian Charlesworth · 2 years ago


Do you have an alternate link? My internet connection refuses to download the pdf. Siddhartha Srivastava · 2 years ago


@Siddhartha Srivastava I cannot get the pdf Agnishom Chattopadhyay · 2 years ago


@Agnishom Chattopadhyay Seems like a problem for all Indian users * gasp *. Raghav Vaidyanathan · 2 years ago


Great article, John; you really know how to engage the reader. Avatar A seems plausible and useful, but the notion of head/brain transplants seems barbaric, so Avatar B crosses a line in the sand from my perspective. As for Avatars C and D, I'm not buying the pool-ball analogy; an initial discrepancy of a few millimeters can make a huge difference to the trajectory of the balls over time, and the same would go for the initial neuron-mapping of the artificial brains.

But whether we are able to transplant the personality, memories, skills, etc., of a particular "real" human into an artificial one is secondary to the possibility of creating any truly independent AI entity that can mimic and even improve upon our species. The timeline for such a super-sentient being would probably be measured in centuries rather than mere decades, but I see no reason why we should consider it the folly of mad scientists. However, we could easily be opening Pandora's box with this kind of technology, so a discussion needs to begin sooner rather than later about how to ensure that AI serves our species and does not endanger it in any irrevocable way.
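The sensitivity being described here can be sketched numerically. Below is a minimal Python illustration; the logistic map is my own stand-in for a generic chaotic system (it is not from the article), showing how a "millimeter-sized" discrepancy in initial conditions grows until the two trajectories are unrelated:

```python
# Two trajectories of a chaotic system (logistic map, r = 4) that
# start a hair apart: the tiny initial discrepancy compounds each
# step until the trajectories are completely uncorrelated.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001  # initial states differ by only 1e-6
for step in range(1, 31):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: |a - b| = {abs(a - b):.6f}")
```

Run it and the gap climbs from one part in a million toward order one within a few dozen iterations, which is the worry about the initial neuron-mapping of an artificial brain: "almost exact" is not exact.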

P.S. - Hope you're coping with your crazy workload. :)

P.P.S. - Does "Redwing" refer to Falcon's bird in Marvel Comics? Brian Charlesworth · 2 years ago


I like the story, but it feels like exactly how the Cybermen from Doctor Who came into being. Sharky Kesa · 2 years ago


Let me add a little bit to that scenario. It's the year 2046. I'm told that my mind, through being recreated in an Avatar, will be "transferred" to a new kind of body, and thus I'll be able to achieve a kind of immortality. I give the medical scientists permission to proceed. Vast computerized machinery works, and, after a while, one of the scientists comes into my room and informs me that it has been a success! My mind has been recreated and now resides in a futuristic Avatar with decidedly superior physical properties to those of my body! "Well," he then says, "but now I must do away with you; this is the next step," and then pulls out a futuristic ray gun and vaporizes me. Am I going to wake up as my new Avatar? Michael Mendrin · 2 years ago

Log in to reply


@Michael Mendrin There's only one you, Michael. :) And after you're vaporized, what if the Avatar were ever to find out about his/its origins and the fate of his/its prototype (if that's the right word)? Would he/it then want to seek revenge on whoever killed you, or would he/it be enlightened enough to know that he/it .... oh heck.... he is not in fact you, but that your vaporization was necessary for his future? This is all getting rather macabre. :P Brian Charlesworth · 2 years ago


@Brian Charlesworth In fact, this is the familiar "teleportation problem". If a person is being recreated elsewhere, doesn't that mean the destruction of the original? In the Star Trek scenario, a person's body is read, information is sent elsewhere, and as this is happening, the body is being disassembled. What if the body is momentarily not disassembled, so that the "new" person is now elsewhere, but the "old" person is still standing around, wondering what's going on? In fact, this problem is actually the basis of one Star Trek episode where a particularly bright man refuses to allow himself to be teleported for this reason. This is all part of the still hotly debated "mind-body problem"--what is the relationship between consciousness and the body which carries it? Michael Mendrin · 2 years ago


@Michael Mendrin Yes, I think that was the TNG episode where the neurotic engineer Reg saw eel-like creatures while in the transporter beam, which turned out to be crew members stuck in limbo. Then there was another TNG episode where Scotty from TOS had deliberately suspended himself in the transporter's memory for what turned out to be 75 years, in the (realized) hope that some Federation ship would come by and save him from his crashed vessel.

Anyway, the mind-body problem.... My preferred definition of "life" is "that which maintains form through change of substance." Teleportation, ideally, does maintain form through change of substance, so the "new" person would be \(identical\) to the "old" person in every way: memories, skills, the whole shebang. If the old person somehow remains as well, then although the two entities have been cast from the same mold, so to speak, their experiences will differ from the point of duplication onward, and hence they will truly be two individuals who happen to have everything in common except their future. So there are now two bodies and two distinct consciousnesses that go with those bodies..... Perhaps. As you say, the mind-body problem is hotly debated, in large part because we don't have a complete (or even partial) handle on what consciousness truly is. So while the transporter twin may now be a distinct individual, s/he may still have some psychic connection to his or her "origin being," since neither individual has any more right than the other to being the original anymore.

Also, as @Raj Magesh has pointed out, how do we really know that we are "continuous beings" in the first place? In his scenario, (even in waking life if the vaporization/reconstruction process is virtually instantaneous), we would never be the wiser to what is truly going on. So while we, as living beings, are constantly changing substance while (essentially) maintaining form, (although in the long term that "form" can change greatly), there may be processes happening in some other dimension that we are completely oblivious to in our realm of existence. Brian Charlesworth · 2 years ago


@Michael Mendrin I believe that the notion of continuity in this sort of problem is what causes the confusion. Am I now the same person that I was last night, or has Raj-28/04/15 been replaced with Raj-29/04/15?

Essentially, when I fall asleep, I lose all consciousness. If my sleeping body were to be blown up, "I" wouldn't feel a thing: death is equivalent to sleep is equivalent to loss of consciousness. "I" wouldn't know anything anyway, since "I" wouldn't exist anymore.

Coming to the teleportation problem: the original guy wouldn't feel a thing; he'd simply cease to exist. The newly replicated guy (assuming he was fully conscious when he was teleported) would feel a jarring disorientation (he teleported!?) but his stream of consciousness would not seem to be broken to him, since he will remember everything before and after the teleportation without any discontinuity.

Essentially, you vaporized one guy at point A (stopping his stream of consciousness forever), and created an identical guy at point B who believes he was teleported across a huge distance. The net result is moving one guy from point A to point B.

The moral? Don't fret too much about the morality of it all. After all, you're risking the same when you sleep.

Thoughts? Opinions? Raj Magesh · 2 years ago


@Raj Magesh While many probably would try to read morality into the mind-body problem, it's not about morality. For purposes of analysis, it would be more useful if we first consider an artificial intelligence implemented by a computer, and then look at the problem of "teleporting" that AI elsewhere. What if computer A contains the AI, and then I simply mirrored it in another computer B? Would the AI think it's now in both places, or not? It's a very intriguing subject. It's difficult enough to analyze without getting into morals or souls or spirituality, which only get in the way of a practical understanding of the problem grounded in science. Michael Mendrin · 2 years ago


@Michael Mendrin Hmm. Interesting question. Assuming (quite reasonably, I believe) that consciousness is an emergent property arising from mere physical arrangements and interactions of atoms, you would essentially create two identical AIs independent of each other. I am pretty certain that AI-A and AI-B wouldn't be aware of the other's existence or be affected by each other in any way. Nonetheless, if exposed to the same stimuli, both AIs would respond identically, since they have the exact same programming.
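The determinism claim can be illustrated with a toy Python sketch (the "agent" below is a made-up stand-in for an AI's programming, not anything from this thread): two mirrored copies fed identical stimuli stay in lock-step, and diverge only once their inputs differ.

```python
# Two mirrored copies of the same deterministic "agent."
# Identical programming + identical stimuli -> identical responses;
# the copies only diverge once their inputs differ.
def respond(state, stimulus):
    # Toy deterministic update rule standing in for "the AI's programming."
    return hash((state, stimulus)) % 10**9

a = b = 0  # AI-A and AI-B start as exact mirrors
for stimulus in ["light", "sound", "touch"]:
    a, b = respond(a, stimulus), respond(b, stimulus)
    assert a == b  # same stimuli: the twins remain indistinguishable

a, b = respond(a, "heat"), respond(b, "cold")  # different stimuli
print("still identical?", a == b)  # almost surely False from here on
```

Nothing "splits" when the mirror is made; there are simply two independent computations whose states happen to coincide until their histories differ.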

But if you tell the AI that you're mirroring it, it'll probably understand that it now has an identical twin, who is now a distinct entity. I don't think it's like splitting one AI into two parts; rather, it's creating another identical AI somewhere else. Raj Magesh · 2 years ago


@Raj Magesh The key is how a consciousness is influenced by, or aware of, its environment. Experiments with humans under sensory deprivation can produce some interesting results, suggesting that our sense of consciousness and self depends on our interaction with the real world. Hence, for the AI in computer A to seem "entangled" with its mirror in computer B, both computers need to 1) share information in real time, and 2) interact with their local environments. Then the AI could maybe think it's in two places at once, like looking at a double image. I imagine that when one is "transporting" from one place to another, or "into a new Avatar," something similar should happen. It's just a conjecture, but I think there is a deep connection between consciousness and its awareness of its physical environment. Michael Mendrin · 2 years ago


@Michael Mendrin Are you postulating a single AI connected to two (sensory?) input devices that give it information about the outside world, with each device placed at different locations? That's a very interesting idea.

A more intuitive analogy is having an extra pair of eyes placed several miles away but connected to your brain. To a human, the sensory input would be jarring, since our brains have not evolved to process images like that simultaneously. I don't think a mature human brain can rewire itself to process both images at once (how cool would that be?).

I believe that an AI, however, would "feel" that it is in both locations, since it is fully conscious of events in both localities, if you define consciousness as being aware of yourself and your surroundings. Essentially, if you could duplicate the same AI at every point in space and allow for a mode of instantaneous information transfer, I think you would create an omnipresent consciousness... Cool!

But in humans, even in the transportation/new Avatar scenario, unless a suitable first-person instantaneous/semi-instantaneous information transfer mechanism is provided (basically seeing through someone else's eyes), I believe a double-vision scenario would not occur. I doubt that such a mechanism would be inherently formed during the process of transport, speaking purely physically, unless of course, a metaphysical consciousness of some sort exists.

Your view on consciousness being inextricably linked to the environment is very interesting, though I wonder if AIs would behave identically to humans in the absence of sensory information. What would an AI do if it couldn't interact with the world in any manner? For that matter, what would a human do? I'd love to read about the experiments on sensory deprivation. Do you have any links, by any chance? Raj Magesh · 2 years ago


@Raj Magesh Oh, gawd, it's been a while since I read papers about sensory deprivation and what it does to human consciousness. I'll look around. But the point is, consciousness doesn't just exist by itself; it needs interaction with its environment, whether physical or abstract, local or remote. Furthermore, consciousness isn't something that you either have or do not have. It comes in a spectrum, and I would argue that the more intelligent and aware you are of your environment, the "more consciousness" you have. Hence, when considering any kind of "transfer of consciousness," we seriously need to also consider how that consciousness is interacting with its environment(s). Notice that I put the plural there deliberately. Of course it's quite possible for humans to be simultaneously aware of more than one environment. All you have to imagine is looking out a plate-glass window, where you not only see what's outside but also see the reflection of what's inside at the same time. We can deal with that.

Here's an article published in WIRED magazine:

Sensory Deprivation Michael Mendrin · 2 years ago


