Comments


alcatrazcgp t1_j1yyrmn wrote

I don't find it realistic that you can "copy" your consciousness into a robot body and now have a new body; instead, that would just create an exact copy of you in that robot body while the original host basically dies.

so the robot isn't you, it's just a perfect copy of you, which isn't the same.

79

----Zenith---- t1_j1z05p3 wrote

This. I’ve tried to explain it so many times but some people just don’t get it. It wouldn’t be you in any way shape or form. Just an exact replica. You’d be dead. Something else that’s not you would be alive.

18

alcatrazcgp t1_j1z0scs wrote

the best way to achieve "immortality" is keeping yourself alive, or at the very least transplanting your entire brain, which in this case is "you".

no brain = no you

you = specific signals in the brain

you can't just move that in code form onto a machine. thankfully I'm not the only one who realized this

11

FrmrPresJamesTaylor t1_j20lcd9 wrote

Honestly I would love to see a bunch of billionaires and influential technophiles essentially sign up for their own deaths in this manner. If someone thinks this technology is desirable or even possible, they can go first.

3

alcatrazcgp t1_j20lwm8 wrote

that actually seems like the worst "shortcut" to immortality. once these billionaires get it, who is to say you can't just turn off their copies? it's just lines of code, it's not a living human, it's a copy of them.

now if you took the route of biology and bioengineering and prolonged your existing life in many different ways, that's a whole different story, that's true immortality

3

thisimpetus t1_j24pn4w wrote

> you can't just move that into code form onto a machine

That's an absolutely enormous claim that I will be utterly shocked if you can truly defend, and that's not to insult you but to suggest that you might vastly underestimate the scale of that claim. It is absolutely not something that can be taken as obvious.

3

alcatrazcgp t1_j24qnyh wrote

No, I do not underestimate it, I truly think it's impossible, at least for a very very long time. while you can copy it, you can't MOVE it. moving it would mean you somehow, some way, transform my brain into code without killing me in the process, and then put that into a machine, again, without killing me.

you can easily scan the brain and its signals and just translate that into code and input that into a machine, yes, but you can't move the brain and "me" into that machine; you can only put a copy of me in it. hope that makes sense

1

thisimpetus t1_j24roa0 wrote

Well "moving it" isn't a meaningful thing to say, there is nothing to move, structure and data aren't material things. You're literally constantly changing, there is nothing static about you. You are the information in motion; where it is and by what means it moves doesn't mean anything. Copying a PDF doesn't physically relocate parts of your drive to another location, it represents that information identically somewhere else. So too your consciousness; just as reading the same song from different devices changes absolutely nothing about the song—and just as a song has to be happening in time to actually be the song—what makes you you is the data structure connected in real time to the environment, not the medium.

3

alcatrazcgp t1_j24uot7 wrote

no, your consciousness is not the same as digital data. you cannot have 2 copies at the same time, you can only control one. you cannot control 2 different "you"s in different places, that's not how it works

2

thisimpetus t1_j24vor3 wrote

Well, I'm no expert in this field, but I do have a little academic training in it, and I'll tell you that these claims you're making are very, very big claims that a great many PhDs have debated. I think if you're really interested in this subject you might consider getting into some of the reading.

Because the thing is, I don't think you'll find much agreement with your position at the top of the game, but that's because these are really, really hard questions and our intuitions about them tend to be really bad. That makes a lot of sense; we certainly can't expect ourselves to have an evolved understanding of these ideas. But all the same, if you're really interested, there are some fundamental ideas that you're challenging, and I'd wager you might reconsider some of them if you got some exposure to rigorous investigation of them. It's very interesting stuff; I know my thinking was forever changed by it. Daniel Dennett is a great place to start, because his writing is enjoyable in addition to being top-shelf cognitive philosophy.

Best.

3

Stainless_Heart t1_j23n9hs wrote

Tell me, exactly which signals are constant, uninterrupted, and represent you as a person?

2

alcatrazcgp t1_j23nqcv wrote

all of them

0

Stainless_Heart t1_j23nzah wrote

Are they all permanent signals? Or do they come and go, regenerated when needed?

If the latter, you’re constantly dying in little bits and being recreated in little bits.

If the former, if all your brain signals were always happening without cessation… you’d be insane or at least in full seizure.

2

thisimpetus t1_j24p36p wrote

No cell in the body you typed that with was with you when you were born, and no cell you were born with is with you still. You've replaced them all.

So, you've already moved your entire consciousness from one medium to another, you just did it piecemeal and without a disruption in function.

Now, if you fall in a frozen lake and, after being technically dead for a few minutes, are revived, I'll wager you still think that's you.

So if we can find practical examples of both disrupted function and transference to another medium, we'd have to suppose that doing both all at once is what makes the difference. I don't see that at all.

You are not your body, you are just that pattern of data dancing about. So long as it dances, it's you. If there were two of you for a moment, or a thousand, they'd all basically immediately start being someone else because the dances would begin to be different. But this idea that there is an authentic you of which copies are something else really doesn't hold up under scrutiny unless you believe in a soul.

2

CadmusMaximus t1_j20a7e4 wrote

Exactly. Though it's not QUITE as easy.

Theoretically the copy would think it's "you" also.

So you essentially have a 50/50 chance of waking up as the robot or as the poor sap who's still mortal or (worst case) now dead.

Of all things, the movie "Multiplicity" deals with this pretty well.

Same with that Black Mirror episode with Jon Hamm.

So the real question you have to ask yourself:

"Do you feel lucky?"

1

----Zenith---- t1_j20alc0 wrote

Well no, you'd have a 100% chance of not waking up at all. But yes, the copy would think it's you and would not be able to tell the difference, unless it already assumed what we are saying here before they copied themselves.

5

CadmusMaximus t1_j20b48e wrote

Not necessarily. What's to say that you're not experiencing the robot's "memories" right now?

Like your whole life is (for lack of a better way of describing it) building up to being the consciousness that "lives on" in the robot?

If that's the case, you'd think you "got lucky" and woke up as the robot.

There still would 100% be a poor sap that was left as a mere mortal /dead.

In that case, it absolutely is 50/50 you "end up" as the robot or mortal / dead.

−1

----Zenith---- t1_j20gj02 wrote

If I were the bot and thought I was real I’d still not be the original me.

The original me would be dead. Then I’m just a copy who doesn’t know it’s a copy, but is one.

There is no 50/50 chance of anything. 100% chance that the original dies and the copy is created.

2

Spursfan14 t1_j21xj1g wrote

What makes you you then? Why are you the same person you were 10 years ago and why does that exclude the copy?

1

----Zenith---- t1_j2210xr wrote

Well if you want to view it that way then none of us are really “alive” anyways just a code or algorithm

2

Spursfan14 t1_j21xefe wrote

If I took your biological brain and put in another person’s body, such that you had exactly the same memories, personality, likes and dislikes etc, would that still be you?

If I rearranged another person’s brain such that it had exactly your current state (ie same memories, personality as above) and in the process killed your original body, would that still be you?

At what point does it stop being you?

1

PeakFuckingValue t1_j208cc5 wrote

Nah. If you do it a certain way I believe you can remain you. It would not be about copying the signals though. It should be about merging with AI at first. Augmented you. Evolved you. Almost like bringing the internet into you vs the other way around.

−1

AbstractReason t1_j2056tl wrote

I think the solution to the copy vs original you problem is to replace certain parts of the brain over time so there is ‘continuity’ of individual consciousness through biological and artificial processes running in tandem as the process takes place. You’re just replacing ‘parts’ rather than doing a one shot transfer.

17

alcatrazcgp t1_j20dmq3 wrote

which is more realistic, considering every 16 months or so (I don't remember the exact number) every atom in your body will be different, meaning you are a whole new person compared to what you were 16 months ago.

the philosophy of the Ship of Theseus: is it still the same ship if you replace every part?

Seems to be the same, clearly we think so

8

ChiggenNuggy t1_j21i9je wrote

I believe your brain cells and some of your cells are with you forever.

9

Calfredie01 t1_j21kscf wrote

I'm gonna be real with you here. I think you might be mistaken on that one. Just thinking about it for a few seconds raises lots of questions: where do those old atoms go? why are atoms breaking their bonds where they don't need to? why wouldn't we have completely new memories and everything? what about cells that stay with you your whole life?

The list just goes on

3

alcatrazcgp t1_j21lx5b wrote

you shed skin, where does it go?

imagine your shed skin as your previous atoms. you are constantly changing, regenerating, healing, being damaged, and so on. that's how the body works. a human body sheds something like 1000 skin cells per minute, I don't remember the exact number

−2

zeldarus t1_j23bva9 wrote

Skin is designed to be replaced at a constant rate as the outermost defensive layer. Most of the tissues in your internal organs and especially the cerebrum most certainly are not designed to "shed".

1

Spursfan14 t1_j21x3ou wrote

>the philosophy of Ship of theseus, is it still the same ship if you replace every part?

>Seems to be the same, clearly we think so

And what if I secretly take every original part and reconstruct the ship? Which is the original then?

1

adarkuccio t1_j21ry0u wrote

This makes sense, probably... unless you just lose consciousness and die slowly. I don't think we know enough about the human brain and consciousness to even start imagining this kind of futuristic sci-fi tech.

4

ExoHop t1_j2d54pc wrote

continuity, as in sleeping?

1

ChlorineDaydream t1_j1zklck wrote

The game Soma (and its ending) explains this perfectly at the end of the game, albeit with a twist, but it's the same idea.

14

v3rtanis t1_j24hqrm wrote

God the existential dread I got from those parts...

1

Docpot13 t1_j1zxaci wrote

You would have to be far more specific about the term "you" to have a fair discussion about this topic. I am reminded of the comedian Steven Wright's joke that someone broke into his house and replaced everything with an exact duplicate. He couldn't believe it: 'everything was exactly the same.'

7

anengineerandacat t1_j20ck9g wrote

It really depends on what the desired "outcome" is.

Do you want to keep your lover alive? Then copying might just be the thing to do this for you.

Do you want to keep yourself alive? Copying will be where your life ends and your copy's life begins. Your own consciousness is now lost; you likely can't copy that, as there is no connection from the old to the new.

You can likely cyberize bits and pieces, but the cerebral cortex, which makes up the bulk of the brain, is thought to control your consciousness.

Frontal lobe:

  • Decision-making, problem-solving.
  • Conscious thought.
  • Attention.
  • Emotional and behavioral control.
  • Speech production.
  • Personality.
  • Intelligence.
  • Body movement.

Occipital lobe:

  • Visual processing and interpretation.
  • Visual data collection regarding color, motion and orientation.
  • Object and facial recognition.
  • Depth and distance perception.
  • Visual world mapping.

Parietal lobe:

  • Sensory information processing.
  • Spatial processing and spatial manipulation.

Temporal lobe:

  • Language comprehension, speech formation, learning.
  • Memory.
  • Hearing.
  • Nonverbal interpretation.
  • Sound-to-visual image conversion.

You could perhaps replace everything but the Frontal Lobe with digital versions but you would likely need "tuning" or some bridged interface to translate from "your" signals to the standardized inputs and then convert those standardized outputs to "your" inputs.

It's hard to really say if this still makes the individual the same... a simple stroke is enough to completely change a person... this is way more destructive than a stroke.

5

Techutante t1_j20jrse wrote

I guess it depends on if "acting the same" or "being the same person" are, well, the same thing.

We all act differently every day. If you wake up on a Tuesday and hear that a close family member has died, you're definitely not the same person you were on Monday.

4

Docpot13 t1_j20ejbp wrote

Not sure I am following you here. I feel as if you are implying some “ghost in the machine” as if being able to perfectly replicate what the brain would do in response to specific information isn’t sufficient to recreate the self.

2

anengineerandacat t1_j20ocdr wrote

Generally speaking, yes which is why I stated it really depends on what the outcome you seek really is.

If it's the preservation of the individual's ability to share ideas and knowledge, then we could likely clone that individual's state and continue to utilize them in society.

To put it very simply, you are asking to effectively create two batteries and have them store the exact same electrons; it's just not possible.

1

Docpot13 t1_j20quw7 wrote

You and I appear to have different understandings of the brain and the nervous system. You speak of consciousness and memories as things in the brain. I view them as products of neural activity. The self is the functioning of the brain. If you can recreate all of the circuits and how they interact, you have recreated the self.

1

anengineerandacat t1_j21137c wrote

I view them as neural activity too.

The newer you would have their own consciousness and be free to make their own decisions, and to be honest would likely already be pretty divergent because of the procedure alone.

Your consciousness is unique to your very specific brain; it's an activity arising from all the impulses that fire, not something you can capture just by mapping that region and copying it.

If you asked both beings a very complex psychological question, you would likely get slightly different answers; the words used, the makeup of the sentence, perhaps even the tone.

This is why I stated it's really dependent on what the desired outcome is... if you wanted to become immortal to continue living "your" life then brain copying isn't the way.

If the idea is to preserve yourself for others, then yes, it's likely a valid-ish strategy; your clone's self would have all the worldly experiences you did and even more (as you yourself won't know what it's like to be a clone).

This is a topic that's somewhat both scientific and spiritual, and it's not exactly easy for me to articulate what is being lost, but I hope this made it slightly clearer?

3

Docpot13 t1_j212zrn wrote

Not sure there is any evidence to support what you suggest to be true. Sounds more like a desire to believe there is more to “being” than just basic biology, which is natural, but not supported by evidence.

2

anengineerandacat t1_j21a64j wrote

Inner voice fMRI: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.02019/full

Consciousness can't be artificially stimulated: https://www.sciencedirect.com/science/article/abs/pii/S0361923009003657?casa_token=71CCXG8979oAAAAA:B9dF0u65Zs-S2PVeN2gg_Ik4thZ56PP6Qtuglt7L5fanVKRBcPw4CQmqXx7BBb-6iHZPJQO54w

The thing is that your "inner voice" is a brain activity, not something that is biologically wired but instead triggered from outside stimuli.

The consciousness can even be triggered in individuals who are in comas:

https://www.nature.com/articles/d41586-019-02207-1

Very little research exists in this space, sadly; it's all theory and conjecture, which, I mean, the entire conversation is, considering we have no means to verify any of what is presented.

Hell, some individuals can be missing 50% of their brain cells and still live very normal lives: https://www.bbc.com/future/article/20141216-can-you-live-with-half-a-brain

In short, you can copy all the neurons / receptors / chemical makeup all you want but the activity of consciousness and their inner voice is unique to the individual.

You as /u/Docpot13 would cease to exist, only your clone and whereas they might communicate in a very similar fashion for some time the "you" that went through the procedure is long gone.

Starting to think you might be one of the 7% that doesn't have a typical inner voice lol.

1

Docpot13 t1_j21i5qc wrote

I definitely don’t agree with the idea that an internal monologue is consciousness. This would make the existence of consciousness dependent on the ability to use language. And you are correct, I am one of those 7%.

2

anengineerandacat t1_j21ig1w wrote

Curious then, since I have never really met someone like that... how do you process situations? Like when you read a book, what is going on in your head? Do you even like reading books? Are they engaging to you?

1

Docpot13 t1_j21j93i wrote

I read all the time. It’s a form of communicating information which is as useful to me as anyone else. What is puzzling to me is why someone needs to talk to themselves. Who is talking to whom? If you are truly talking to yourself don’t you already know everything you are putting into words and now just making thought more complicated by trying to represent with words things which may not be well captured by language? What’s the point of telling yourself something? In order to communicate it you already had to understand it so why then mentally speak it? Bizarre.

1

anengineerandacat t1_j21u0r6 wrote

I am communicating with myself; when I read I basically verbalize in-my-head what I am reading (and what I am typing). It mostly sounds like my physical voice but sometimes it could be in another's voice depending on the context and situation.

As far as to "whom", it's like talking into a room with an empty audience. sometimes I can visualize an audience to talk to and make up things / situations for them to say, but most often it's just me.

I am genuinely curious how you actually plan-ahead without having an inner voice, do you just "talk" to people without verbalizing it internally?

As to "why" I can't explain, it's been there since as long as I can remember... perhaps the voice has gotten louder over the years as I have learned to do "more" with it; I work as a Software Engineer day-to-day so most of my day is spent building mental structures and models of applications in my head and walking through where I'll do certain things next or even talking with my inner voice about said things in a form of rubber duck debugging.

Even this post, and your post are basically read back aloud and if I knew what you sounded like I would likely read the post back in your actual voice.

Without my inner voice... I don't think I would feel like I exist as a person, the bones / muscle / flesh surrounding my body are just what give me mobility but that "voice" is "me".

Which is why perhaps when I say you could clone the brain, since you can't clone my inner voice "I" will cease to exist. To my friends and family I might still exist but it'll be a different "me".

1

Stainless_Heart t1_j21rbkb wrote

Heinlein explores this in one of the Lazarus Long novels (might be Time Enough for Love which deals with similar concepts) when the character is permanently leaving the planet and his AI assistant, ostensibly built into the computing power of his office, decides to leave with him. When doing so requires copying into a new mobile computer on the ship, Long points out that it will be a copy and the original identity will be left behind, or erased/die in the process. Asking if that philosophical point will worry the AI, it replies something like “I just did it back and forth six times while you were talking.”

The point being that the human concept of self/identity through a continuity of being may be flawed; that human consciousness is not continuous, it is always just momentary but in possession of memories. Much like Blade Runner's replicants, an identity feels real to itself only because memories provide the proof needed, regardless of their truth or artificiality.

Does our “self” die with every passing moment, replaced by another self-generating one that carries along that big box of memories? Do we cease to exist when losing consciousness and a proximate version is born again upon waking? Personally, I think so. I feel the value of “me” in the memories I’ve accumulated, the knowledge gained, the ways of thinking that have developed, all the skills that I can exercise whether it’s the ingrained way to hold a fork or the vision to build a complex CAD structure.

So would all of these things combined, the memories and the thought structures, if they were copied into a robot body be me? Yes. I believe that robot would be me because it would think it’s me, remember things I’ve done, and do new things using my old mental skills. It would continue on as my flesh body does, learning new skills and accumulating new memories. For any particular time that it exists, it is me then.

Let’s make it more interesting; if all of my brain stuff were copied into a robot body and my flesh body remained alive, there would then be two of me. At least for a moment, that is. As soon as RobotMe starts storing memories that I don’t have, even if it’s looking the other direction across the table to where FleshMe is looking back at it, that’s enough. Now it’s a new self, developing new thoughts. It started as me and would become an alternate version of me. FleshMe might technically be the original version (as much as we ignore cellular reproduction has replaced every bit of an older me from a younger age), but being original doesn’t lessen the individuality of the copy. Two of me, common basis, becoming unique selves with every passing moment.

To view it another way; identity is a data-based illusion and no more or less valid because this-you remembering isn’t the that-you who generated the experience.

7

w0mbatina t1_j1zui85 wrote

Depends on how you look at it. If you copy a file from one computer to another without altering it, is it still the same file? If you move your consciousness in the same way, why wouldn't it be the "same", just like the file is?

2

polar_pilot t1_j1zzy89 wrote

When you transfer a file from your hard drive to a USB, the computer copies it onto the USB and then either keeps or deletes the original on the hard drive…. Which is what we’re talking about here. So no, it wouldn’t be the same to you- just an outside observer.

3

alcatrazcgp t1_j1zwlq1 wrote

if you "copy" a file? the definition of copy is the answer there. it's not moving it, it's copying it

0

w0mbatina t1_j2b9ei5 wrote

Afaik the only difference between copying and moving a file is deleting the "original" afterwards.

1
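The copy-vs-move point being argued here is easy to demonstrate in code: at the filesystem level, a "move" between devices really is just a copy followed by deleting the original. A minimal Python sketch (the helper name `move_by_copying` is made up for illustration; Python's own `shutil.move` does essentially this when the destination is on a different filesystem):

```python
import os
import shutil
import tempfile

def move_by_copying(src: str, dst: str) -> None:
    """A cross-device 'move': duplicate the bytes, then delete the original."""
    shutil.copy2(src, dst)   # byte-for-byte duplicate (plus file metadata)
    os.remove(src)           # the "original" ceases to exist

with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "original.txt")
    dst = os.path.join(d, "copy.txt")
    with open(src, "w") as f:
        f.write("the same data")

    move_by_copying(src, dst)

    assert not os.path.exists(src)          # original is gone
    with open(dst) as f:
        assert f.read() == "the same data"  # contents are identical
```

The destination file is indistinguishable from the source by content, yet it is a new file object; nothing was physically "relocated". Whether that distinction matters for a consciousness is exactly what the thread is debating.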

alcatrazcgp t1_j2b9kct wrote

right, so would you accept dying so your clone can continue living as you?

I wouldn't, I'd rather continue living instead. isn't that the whole fucking point of this

1

MobiusFlip t1_j1zzrhm wrote

I think the best solution for this is something more like augmentation. If it's possible to run a mind on a computer, you can likely connect an organic brain to a computer and use it as additional memory storage and processing power. Do that, make sure all your memories are copied over, and then deactivate your organic brain. You would maintain consciousness through the process, so it would really be your consciousness in that computer, not just a copy.

2

moldymoosegoose t1_j1zwm5u wrote

You would hook up your brain to the fake brain and transfer very small parts over time, and suddenly you're entirely copied over; you couldn't even tell when the switch actually happened, and you wind up with a single copy.

1

Villad_rock t1_j20ejsj wrote

What if everyday you wake up it’s just a copy of you? Would you care?

1

alcatrazcgp t1_j20gafb wrote

specify: am I a copy of me waking up, or is a copy of me waking up? which one?

if it's the first, I don't care; if it's the second, that's not me

Me is what gives me the will to do whatever I want, see through my own eyes, and take actions with my own body. someone else in a different body is not me

2

sidv81 t1_j216rt6 wrote

Agreed. Heading even deeper into sci-fi speculation, it's unclear if the Star Trek characters stepping out after transporter use are indeed the originals in a way that matters, and not some biological constructs with the memories and personalities of Kirk, Picard, etc., while the original Kirk and Picard were killed the moment they used the transporter for the first time.

1

alcatrazcgp t1_j2198nb wrote

correct, you can't simply "teleport" without literally dying. you are deconstructed on an atomic level, then reconstructed again; who is to say the same exact atoms were used? even if they were, you are already gone. you are just reconstructed the way you were, but who is to say it's actually you? maybe you are dead, and that's just a perfect copy of you.

all in all, if your brain dies, you die with it

1

megalotusman t1_j21bhr4 wrote

I think for anything other than what you are saying to be true, there would have to be some measurable essence making a person a person, a soul, that could only exist in one body at a time and in and of itself be unable to be copied no matter the advancements in technology. Meaning that a clone could be made, but it would not have life unless the essence allowing it to run was taken from the original, which would cause the original to die.

Essentially, magic.

That is the only way, I think, you could say a clone is not a clone.

But what you're really doing is kicking the can down the road and saying the soul is the person, not the body they inhabit.

1

m0estash t1_j21n0ri wrote

80-100 days ago your body was made of entirely different cells to the ones that make up your body this instant. Were you a different copy of who you are now? Yes! Was it someone else, or you?

1

FinancialCurrent3371 t1_j21n3ae wrote

Think of it as crystals that take your consciousness and place it in a game, like Tron. Most people would think Sword Art Online or Code Lyoko, but it would technically be inserted into a game or a computer function.

1

alcatrazcgp t1_j21ni5x wrote

explain how you move and convert my brain signals from organic meat into crystals or into digital form.

do you copy the signals or do you literally somehow move them? moving them would mean they are no longer in the brain; copying would mean it's just a clone, not me.

simulating myself in a virtual world would just be stimulating the brain and simulating the effect within the game, similar to VR and how it tracks movement, motion tracking, but instead of motion tracking it's brain tracking

2

FinancialCurrent3371 t1_j21o5hh wrote

It aligns more with being brain dead: no signals in the brain for life, with the motor functions still movable. The body stays as the consciousness is taken from the brain in every cell. The meat you possess would basically be grey, not pink, and would not allow an electrical charge. The crystal is a mirror image of what the charge would be, more LIGHT than electricity. Imagine it as being stuck in a mirror.

1

nitrohigito t1_j22ro3w wrote

If you lose consciousness before the copy, perceptually you'll receive a new body. The original won't be aware of it dying.

1

Enjoyitbeforeitsover t1_j22xieu wrote

Exactly, unless perhaps there's some biological connection where you kind of start disabling the host while at the same time you start yapping away on the robot side, like a slow and steady upload but via an organic connection. Think Avatar at the end lol

1

thisimpetus t1_j24o1d6 wrote

Well, first of all

> it's a perfect copy of you...which isn't the same

I mean that's simply incoherent. That's what perfect means. The only way those two things aren't identical is if you subscribe to religion and the concept of a soul.

As for "realistic", consciousness just is the operation of the brain. If you are able to flawlessly replicate that function, then the subsequent consciousness is, again, identical.

There is lots of room there for deniability; a perfect copy might be impossible or else so tremendously difficult that we don't find it useful—it may also be relatively easy—but unless you can point to why such a copy of you isn't identical to you, I suggest that you consider the possibility that you simply have an emotional resistance to the idea that you aren't inherently unique and inviolate.

We're just information in motion, wherever you wish to house it and however you wish to move it.

1

TheUmgawa t1_j20d1gr wrote

But, from the robot's perspective, it's you, so I don't see the distinction. In The Prestige, does it matter that the Great Danton in the balcony isn't the one in the tank? Not at all.

0

alcatrazcgp t1_j20dd3j wrote

I see a massive distinction: you die, a clone of you lives. you will not experience anything that clone does; "you" are no longer alive. that is an imposter, a copy of you

2

TheUmgawa t1_j20dwo5 wrote

I am fine with that. You know who else is fine with that? The imposter who is, for all intents and purposes, me. How much guilt would you feel if one day you woke up, then watched someone who looks just like yourself die, and then you just went on living for another hundred years? To you, you're not an imposter. And the dead guy doesn't care, because he's dead.

0

alcatrazcgp t1_j20gioj wrote

your copy is indeed "you" and thinks it's "you". if you met your copy and told it you are the original, would it care? Probably not. now there are two of you, but the copy will always know it's not the original. it's different; even if it's a perfect copy, you two will always be different in many ways

1

TheUmgawa t1_j20hues wrote

Will it, though? Let's say that somewhere in the past, you had a medical emergency and they had to put you under. While you were under, they copied your memories and whatever passes for consciousness into a new body, and then they pulled the plug on the old one. And then, when you wake up, they say, "It's a miracle! The doctors managed to get all of your organs going again, and they say you've got another forty years."

In that scenario, where everyone is lying to you (or perhaps the doctors are lying to everyone), how would the replacement know it wasn't the original? As far as it's concerned, it went to sleep and then it woke up.

1

alcatrazcgp t1_j20i396 wrote

yeah, you still died; your imposter just replaced you. what's your point? you don't care that you'll be killed and replaced by your copy?

idk about you but that sounds like a massive crime if that were to ever happen

1

TheUmgawa t1_j20lq2r wrote

Is it, though? Because as far as you're concerned, you're still alive. You can even testify at the murder trial: "No, your honor, that couldn't be murder because I'm right here. Go ahead, ask me anything about my life."

0

alcatrazcgp t1_j20m3kr wrote

So what if you get cloned, and the clone insists on destroying the imposter, the imposter being you, even though you are the original? How can anyone tell? What if, in the moment, it remembers more about your own life than you do, and you are terminated?

2

TheUmgawa t1_j20mtq1 wrote

What happens in international waters stays in international waters. If I'm running a combination human cloning lab and monkey knife fighting arena, that's my business. Why should everyone live by your sense of morality? What makes your sense of morality any better than anyone else's?

And, honestly, why would either one of them say, "I have to destroy the other! I am the original"? That's like some garbage out of a bad sci-fi movie. Please don't consider being a writer.

2

sceadwian t1_j208pxg wrote

If it's a perfect copy, it is you. Perfect means the same, so you're being logically inconsistent there.

−1

alcatrazcgp t1_j20deuo wrote

i got a seizure reading this

1

sceadwian t1_j20shok wrote

You need to do something about that reading problem you have then.

1

Grinagh t1_j20oait wrote

Michio Kaku is hardly an authority on anything; he likes to talk about things far in the future as if we are only decades from their implementation.

15

Dependent-Interview2 t1_j20xri1 wrote

He's full of shit.

Not sure why anyone who's supposed to follow science would listen to him

9

CEOofTwitter2 t1_j21wy4n wrote

Because he looks like an eccentric scientist from central casting and talks a good game. Which is the primary reason most scientists get booked for tv shows.

He’s always been a buffoon.

3

CEOofTwitter2 t1_j21ws9i wrote

Kaku starts to come off as a bit of a crazy person the more you listen to him over the years.

7

grillcheesehamm t1_j23812o wrote

Not crazy. Just a typical ignorant human that doesn't understand biology.

2

thexavier666 t1_j1z78jz wrote

I think Ghost in the Shell spoke about cyber brains way earlier

5

Fay_LanX t1_j20c5sb wrote

People in the comments talking like they definitively know what consciousness is, let alone how to replicate/transfer it, is hilarious to me.

5

dunkinghola t1_j201nee wrote

Haven't watched the video yet, but where does consciousness sit in the brain that it can be uploaded? Nowhere and everywhere. So how does anyone expect to be able to "upload" into a device when it's non-local?

3

Desperate_Food7354 t1_j205mbr wrote

It’s the entire brain. Just make the external computer take over neurons at the exact same clock cycle those neurons run at: basically turn one off and one on at the same time, while allowing communication between the two systems to maintain functionality.

0

MyDogAteMyUsernames t1_j20bpus wrote

That's a matter of perspective. To the copy, it will feel like it's the original, even if in reality it is not.

3

Desperate_Food7354 t1_j1zzira wrote

I suppose what you could do is add external modules to the brain with the same functions as the brain itself, and slowly migrate the neural network from within into the computer modules by cut-and-paste in very quick succession. How? Idk, perhaps possible though.
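The gradual-migration idea can be sketched as a toy simulation (everything here is hypothetical; real neurons are analogue and vastly more complex): replace one node at a time with a functional duplicate and check that the network's observable behaviour never changes.

```python
import random

class Neuron:
    """Toy node: weighted sum of inputs, thresholded to 0/1."""
    def __init__(self, weights, bias, substrate):
        self.weights, self.bias, self.substrate = weights, bias, substrate

    def fire(self, inputs):
        s = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1 if s > 0 else 0

def network_output(layer, inputs):
    return [n.fire(inputs) for n in layer]

random.seed(0)
layer = [Neuron([random.uniform(-1, 1) for _ in range(4)],
                random.uniform(-1, 1), "biological") for _ in range(8)]
probe = [0.5, -0.2, 0.9, 0.1]
before = network_output(layer, probe)

# Migrate one node per step: copy its parameters into a "silicon"
# replacement, swap it in, and verify the function is preserved.
for i, old in enumerate(layer):
    layer[i] = Neuron(old.weights, old.bias, "silicon")
    assert network_output(layer, probe) == before

print(all(n.substrate == "silicon" for n in layer))  # True
```

Whether a perfectly function-preserving swap also preserves *you* is, of course, exactly the philosophical question the rest of the thread argues about.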

2

extopico t1_j211huz wrote

Science fiction writers were on this train a century ago.

2

megalotusman t1_j21aj49 wrote

This is a philosophical question more than a scientific one. A copy is not the original in a literal sense. Even if there is no measurable difference between an original and a copy after the event where they split off, the fact remains they do not share the same history. Philosophically, that is a distinction that could carry weight.

If humans agree that a perfect copy is functionally no different from the original, despite being a copy with a different history, then for human purposes they could be considered the same thing. And if we agree to draw a distinction that says they are different, then for our purposes they are different.

Persistence of a consciousness can be a huge factor. If the original ceases to exist at the same time the replica is made conscious, it is very easy for a person to rationalize that there was a transference of consciousness, not the end of one and the beginning of another.

The game Soma explores this a lot. (super recommend)

The end of The Prestige explores this a bit.

Star Trek TNG explores this in "Second Chances"

2

kiblerthebiologist t1_j21tksj wrote

I see people here saying it would be a replica, but I think even that gives it too much credit. So much of our personality is based on cascading biological mechanisms and neural connections. We may be low on sugar and grumpy; something irritating occurs, our cortisol levels increase and trigger a response. Each person has different expression levels of chemicals, different levels of receptors, and so on. All that to say, it gets very complex.

2

Konstant_kurage t1_j2233ad wrote

You’re still dead. There’s no continuity of consciousness; no one even knows how to approach the issue. When your body and brain die, you’re dead. The new electronic version knows the body is dead, and that instance will not miss anything, but the other instance is gone.

2

blazindekutree t1_j23f057 wrote

Haven't you seen the newer episodes of Sword Art Online?

Copies of people within machines do not work out; the mind almost instantly breaks.

2

sweglord42O t1_j2a8cvk wrote

I think whether the robot is a copy or the real "you" is only a philosophical difference. In reality there's no difference.

2

PMMCTMD t1_j1zvbta wrote

That is like making an exact replica of New York city and thinking Miles Davis is going to show up at the Blue Note.

1

Curiousgreed t1_j21phzd wrote

Are you saying that neurons themselves have a consciousness of their own?

1

PMMCTMD t1_j23z452 wrote

No. I am saying the infrastructure (the buildings) does not produce the events (Miles blowing his horn). Mapping the brain won't produce the stored memories.

1

ZenFreefall-064 t1_j21plj0 wrote

Hold up (screeching brakes). Until AI is self-aware, this suggestion is moot at best. That is not to say it will never be developed, but for now let's try to wrap our minds around quantum mechanics.

1

squidsauce99 t1_j221gsq wrote

You can’t upload your consciousness onto a computer, and neither can you upload a copy. This is all very silly stuff but obviously people should try to do it I guess…

1

MsPI1996 t1_j2cnb2x wrote

Think I zoned out when I caught his lecture. Talk about, omg omg it's really him! At least my friends noticed I put on a poker face when getting his autograph.

1

MrZwink t1_j1z3xhu wrote

Can't be done: Heisenberg's uncertainty principle. Mr. Kaku knows this; he just likes to fantasize in the media.

You cannot make a copy of a brain, because you would need to copy all the particles and their interactions at the same time to "image" the brain. But one cannot measure where a particle is and what it is doing at the same time, because the measurement disturbs the interactions.

This is inherent to quantum mechanics, and not a solvable issue to overcome.

−3

Desperate_Food7354 t1_j1zz98p wrote

Wouldn’t this also just apply to regular computers and we can already do that?

3

MrZwink t1_j20227p wrote

No, you make a copy of the hard drive, not the entire computer while it's running.

−1

Desperate_Food7354 t1_j202qhc wrote

I think it’s very possible given CPU architecture, if you take the entire snapshot within one clock cycle.

2

MrZwink t1_j208zby wrote

You would need to know where electrons are and what they are doing in the processor. And you run into the same problem.

1

Desperate_Food7354 t1_j20aais wrote

Everything happens in discrete clock cycles; at any moment you can know where every bit of data is inside a CPU. The transfer only happens during the next clock cycle, in which the bits move around. Of course, humans are analogue, so that is much more complex, but with computers I see it as already feasible.
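The claim that a clocked machine's state is fully knowable *between* cycles can be illustrated with a toy register machine (a hypothetical sketch, not a real CPU): snapshot between steps, and the copy continues identically to the original.

```python
import copy

class ToyCPU:
    """Minimal clocked register machine: each step() is one clock cycle,
    so the full state (pc, acc) is well-defined between cycles."""
    def __init__(self, program):
        self.pc, self.acc, self.program = 0, 0, program

    def step(self):
        op, arg = self.program[self.pc]
        if op == "ADD":
            self.acc += arg
        elif op == "MUL":
            self.acc *= arg
        self.pc += 1

prog = [("ADD", 3), ("MUL", 4), ("ADD", 5)]
a = ToyCPU(prog)
a.step()                      # run one cycle...
b = copy.deepcopy(a)          # ...then snapshot *between* cycles
while a.pc < len(prog): a.step()
while b.pc < len(prog): b.step()
print(a.acc, b.acc)           # identical results: 17 17
```

The determinism only holds at this digital level of description; the thread's disagreement is about whether that level is the one that matters.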

1

MrZwink t1_j20bkon wrote

Humans and computers are both complex chains of quantum interactions. It is impossible to separate the computer (or human) from its quantum states.

1

Desperate_Food7354 t1_j20ce8p wrote

The information within a computer is stored in gates; it isn't on the micro scale of quantum events. You do not need to know what is going on in the atomic realm of a computer to understand its transfer of data; it is completely deterministic. Accept this as fact in regard to computers, as this is a fully understood construct we created.

1

MrZwink t1_j20gart wrote

Then what is a random bit flip?

1

Desperate_Food7354 t1_j20hgxj wrote

Bit flips are random, but they are very unlikely to occur in the millisecond it would take to transfer all stored registers into another computer. And if you transferred the data underground and at low temperature, the chances are essentially zero. Computers are extremely simple machines when you look at the components.
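As a back-of-envelope check of the "very unlikely" claim, assume a hypothetical soft-error rate (real figures vary widely with altitude, shielding, and process node, so this number is an illustration, not a datasheet value):

```python
# Expected bit flips during a short transfer window, given an assumed rate.
flips_per_bit_per_hour = 1e-13     # hypothetical figure, not a measured one
bits = 8 * 2**30                   # 1 GiB of state
window_s = 1e-3                    # 1 ms transfer window

expected_flips = flips_per_bit_per_hour * bits * (window_s / 3600)
print(expected_flips)              # on the order of 1e-10: negligible
```

Under these assumptions the expected number of flips in the window is around 10^-10, which is why snapshotting digital state in a single short window is treated as reliable in practice.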

1

MrZwink t1_j20hmht wrote

But WHAT are they?

1

Desperate_Food7354 t1_j20j9qr wrote

Computers are just transistor logic arranged to perform arithmetic. It's binary because a transistor can be either off or on in the digital sense. If we were talking about analog you'd have somewhat of a point, but because computers work in either 0 or 1, this isn't the case. If you look at a CPU architecture video you'll see just how simple they are, and how easily this idea of transferring every state in a computer all at once could be done.
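The "computers are just transistor logic" point can be made concrete: every gate below reduces to NAND, which is itself just switched transistors. This is a minimal sketch of a one-bit full adder, not real hardware.

```python
# Build arithmetic from a single primitive gate (NAND).
def NAND(a, b): return 1 - (a & b)
def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

def full_adder(a, b, cin):
    """One-bit addition with carry, composed purely from NAND-derived gates."""
    s1 = XOR(a, b)
    return XOR(s1, cin), OR(AND(a, b), AND(s1, cin))  # (sum, carry_out)

print(full_adder(1, 1, 1))  # (1, 1), i.e. 1 + 1 + 1 = 0b11
```

Chaining full adders gives multi-bit addition, which is essentially how an ALU's adder is built up from the on/off transistor behaviour described above.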

1

MrZwink t1_j20m0jb wrote

They're usually quantum interactions interfering with the computer's operation. Usually neutrinos.

1

Desperate_Food7354 t1_j20pjme wrote

With the amount of quantum you are spilling out computers shouldn’t work at all.

2

MrZwink t1_j20rskb wrote

What are you on about? Computers work on quantum principles...

1

Desperate_Food7354 t1_j20shbq wrote

Quantum, quantum, quantum. A transistor is ON or OFF; the only quantum thing here is quantum tunneling, and that is only a problem in the development of smaller transistor sizes. My expertise is in this, and you keep spilling out "quantum" as if it changes the register values. Register values remain the same. You obviously have no idea how the inside of a computer works, nor digital logic, if you are going to say quantum mechanics makes duplicating all the states within a computer impossible. It isn't about how much liquid is in a tank; it's about whether the liquid reaches the threshold for a 1 or a 0. We build computers that perform 10^18 calculations per second; none of what you are saying applies in reality.

1

MrZwink t1_j20tvw0 wrote

A transistor is a semiconductor: it moves electrons through a semipermeable barrier, which is an interaction at the quantum level. The smaller you make them, the more prone to quantum tunneling they become. So no, a transistor is not simply on or off; a tiny fraction of the time it's both or neither.

There are safeguards in place in computers to check for random bit flips, because they are needed. It's called error correction (ECC).

I'm not saying computers don't work. I'm saying a computer is a machine that processes information on a quantum level, and it is impossible to separate the computer from its quantum interactions. Heisenberg's uncertainty principle applies whether you want it to or not.

You cannot build a house without bricks.

1

Desperate_Food7354 t1_j20xvsi wrote

I could transfer every bit in my 8-bit computer in a single clock cycle, no problem. Computers work in discrete time increments, with no uncertainty about when the clock signal is high or low.

1

Zaflis t1_j229y8i wrote

It doesn't have to be a 100% identical transfer to be fully recognizable by others and by the person him/herself. Also, I won't use the word "copy", because when you "move" a file the original is lost; so is the brain in this case. If we talk about it as a copy, then it's not an improved "you" but a new entity entirely. Ideally you could still think uninterrupted during the transfer process, at all points from 0% to 100%. Needless to say, all of that is still too far in the future for science.

1

MrZwink t1_j22lfx8 wrote

This is exactly what Heisenberg's uncertainty principle prevents.

1

Zaflis t1_j2323rb wrote

You are now assuming that every atom has to be copied with the exact same position and speed, or something? But that is not correct. When you copy a limb, say a human arm, to another person, it will still work even if it's different in a lot of ways. Accuracy can be improved, but the copy is not, and need not be, perfectly precise. And when you go to the computer representation, the format of the data is already drastically different.

0

MrZwink t1_j23ugwm wrote

An arm is not conscious. If you can't copy the processes going on in the brain, you can't copy the consciousness.

1

Zaflis t1_j24qfid wrote

You only need to copy the stored information, not anything that is being moved at the time. Like hard drives and RAM: anything the CPU is processing right now is irrelevant, if we compare RAM to short-term memory, which changes more aggressively than long-term memory.

1

MrZwink t1_j24s80m wrote

The brain doesn't work like that. It's constantly exchanging information and forming new connections. There is no "off" switch.

0

Jetison333 t1_j20jma3 wrote

You're actually pretty much correct. It's impossible to make a perfect clone of an atom, as you would have to know its momentum and position at the same time; this is the no-cloning theorem. However, what you can do is move that state from one atom to another, effectively just transferring it.

Incidentally, this solves the whole uploading problem, as a perfect copy of your brain would necessarily destroy the original.
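The no-cloning argument alluded to here fits in a few lines: assume a unitary $U$ that clones arbitrary states and derive a contradiction for non-orthogonal inputs.

```latex
Suppose a cloning unitary $U$ exists with
\[
  U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle
  \quad \text{for all } |\psi\rangle .
\]
Taking inner products for two states $|\psi\rangle, |\phi\rangle$ and using
$U^\dagger U = I$:
\[
  \langle\phi|\psi\rangle
  = \langle\phi|\langle 0|\,U^\dagger U\,|\psi\rangle|0\rangle
  = \big(\langle\phi|\psi\rangle\big)^2
  \;\Rightarrow\;
  \langle\phi|\psi\rangle \in \{0, 1\}.
\]
So only identical or orthogonal states could be cloned; a generic unknown
quantum state cannot, though its state can be \emph{moved} to another
system (teleportation), destroying the original.
```

This is why "move, don't copy" is the only option quantum mechanics leaves open, which is the point the comment makes about uploading.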

0

MrZwink t1_j20m791 wrote

I love how you say that like it's a surprise. I know I'm correct. I'm probably getting a lot of downvotes because I really don't like Kaku's unsubstantiated blabbering, and it shows.

While you're right about one atom, a brain is more than that. You could copy the entire brain, but anything in "active" memory would be destroyed. There are millions of quantum interactions ongoing at any one time, so this is essentially the same problem: you have to choose to measure either the state or the interactions.

2