
Will You Ever Be Able To Upload Your Brain? (nytimes.com) 269

An anonymous reader points out this piece in the Times by Kenneth Miller, a professor of neuroscience at Columbia and co-director of the Center for Theoretical Neuroscience, about what it would take to upload a human brain. "Much of the current hope of reconstructing a functioning brain rests on connectomics: the ambition to construct a complete wiring diagram, or 'connectome,' of all the synaptic connections between neurons in the mammalian brain. Unfortunately connectomics, while an important part of basic research, falls far short of the goal of reconstructing a mind, in two ways. First, we are far from constructing a connectome. The current best achievement was determining the connections in a tiny piece of brain tissue containing 1,700 synapses; the human brain has more than a hundred billion times that number of synapses. While progress is swift, no one has any realistic estimate of how long it will take to arrive at brain-size connectomes. (My wild guess: centuries.)"
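For a rough sense of the scale gap the summary describes, here is a back-of-envelope sketch in Python; the ~1.7e14 whole-brain total is only what the quoted "hundred billion times" figure implies, not a number taken from the article itself:

    import math

    mapped_synapses = 1_700        # synapses in the reconstructed tissue sample
    scale_factor = 1e11            # "more than a hundred billion times that number"

    print(f"implied whole-brain synapse count: {mapped_synapses * scale_factor:.1e}")  # ~1.7e+14
    print(f"doublings needed to close the gap: {math.log2(scale_factor):.1f}")         # ~36.5
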
  • by Anonymous Coward on Sunday October 11, 2015 @10:14PM (#50706663)

    If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong.

    - Arthur C Clarke

    • by Fire_Wraith ( 1460385 ) on Sunday October 11, 2015 @10:20PM (#50706681)
      The estimate that it will take centuries is probably the part that's farthest off.
    • Halting Problem (Score:4, Insightful)

      by dcollins ( 135727 ) on Sunday October 11, 2015 @11:32PM (#50706911) Homepage

      Alan Turing said in 1936 that it's impossible to construct an algorithm that generally solves the halting problem.

      So who's wrong: Clarke or Turing?

      • If an elderly but distinguished scientist says that something is [...] impossible, he is very probably wrong.

        Turing wasn't elderly, and he didn't just say it, he proved it.
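        For readers who haven't seen it, the shape of that proof fits in a few lines. The sketch below assumes a hypothetical, perfectly accurate halts() oracle (no such function exists; that's the point) and uses the standard diagonalization argument rather than Turing's original machine formalism:

            # `halts` is a hypothetical oracle: assume it takes a program and always
            # answers correctly whether that program halts when run.
            def build_contrarian(halts):
                def contrarian():
                    if halts(contrarian):   # the oracle says contrarian() halts...
                        while True:         # ...so it loops forever instead;
                            pass
                    return None             # the oracle says it loops forever, so it halts.
                return contrarian

            # Whatever halts(contrarian) answers, contrarian's actual behavior is the
            # opposite, so no total, always-correct halts() can exist.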

  • Locality of self. (Score:5, Insightful)

    by tlambert ( 566799 ) on Sunday October 11, 2015 @10:16PM (#50706669)

    Locality of self.

    The problem with almost all "uploading" schemes is that they create a copy of your brain structure, so it's a copy of you, rather than you. Externally, there might be no apparent difference to an outside observer, but internally, you're kind of dead, if that 1 cubic foot of meat space is no longer functional.

    The only hope of an upload of the actual "you" would be an incremental replacement of brain structure, such that you lived in both meat-you and electronic-you at the same time, until the electronic-you completely replaced the meat-you, without a loss of continuity of consciousness.

    Otherwise, you're just building pod people. Which could be useful, if you wanted to embed one of them in a starship (or more likely, a tank or other weapon of war), or if you wanted to make a lot of duplicate copies of a particular mind, and didn't care about their locality of self, either.

    • by Hartree ( 191324 ) on Sunday October 11, 2015 @10:38PM (#50706745)

      I mostly agree, but will mumble a bit.

      I'm not even sure that the incremental replacement method would "work".

      Defining what we mean by "it worked" is very squishy when the only judge is subjective experience: did it really work, or do you just think it worked?

      Since we can't even define consciousness well yet, and good luck on The Hard Problem, I'd instead say it doesn't look hopeful, but the jury is still out.

    • Re:Locality of self. (Score:5, Interesting)

      by jeepies ( 3654153 ) on Sunday October 11, 2015 @10:39PM (#50706751)
      The result is the same whether the brain is replaced a little at a time or all at once in a copy.

      There's an old story about an axe that has its handle replaced a few times. Eventually over the years it's used so much the head is replaced. And a few more handles after that. There was always a piece of the axe included when something was replaced. Is the current axe the same axe we started with? If not, at what point did it become a different axe?

      As to whether an exact copy of you is actually you, I would say yes, unless you're going to argue something supernatural like a soul. It would be just the same as cloning a computer hard drive and placing it in identical hardware. From their perspective each computer is the original... or the copy; there's no way for them to tell.

      You're probably thinking of a continuous point of view being the original, but human consciousness generally only exists in 16 hour spurts. When you sleep, is the 'you' that wakes up the same 'you' that went to sleep? There's certainly a gap in your consciousness which would be the same as being dead and coming back. Or the same as a copy waking up.
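      A loose illustration of the hard-drive analogy above, as a toy Python sketch (it says nothing about brains; it only shows that two bit-identical states carry no internal marker of which one is "the original"):

          import hashlib

          original = bytes(range(256)) * 4      # stand-in for a disk image / machine state
          clone = bytes(original)               # a byte-for-byte copy

          # Nothing *inside* the data marks one as "the original"; that fact lives
          # only in the copying history, not in the bits themselves.
          print(original == clone)                           # True
          print(hashlib.sha256(original).hexdigest() ==
                hashlib.sha256(clone).hexdigest())           # True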
      • by khallow ( 566160 )

        There's an old story about an axe that has its handle replaced a few times. Eventually over the years it's used so much the head is replaced. And a few more handles after that. There was always a piece of the axe included when something was replaced. Is the current axe the same axe we started with? If not, at what point did it become a different axe?

        Hence the discussion of incremental change versus copying. The human brain is a perduring (for lack of a better word, see perdurantism [wikipedia.org] for my inspiration for the term) phenomenon, like your example of the axe. Copying onto a completely different substrate with very different properties is a very considerable change which might be enough to void the property of perdurantism.

        • Re: (Score:3, Informative)

          by jeepies ( 3654153 )
          Ah yes, I've heard of this referred to as worm theory. It's one possible solution to the Theseus Paradox. (Essentially the same as the axe story I see above.) Good video on possible ways of resolving the paradox, including worm theory, here [youtube.com].
          • Re: (Score:3, Insightful)

            by khallow ( 566160 )
            It's also a practical and widely used technique in math called homotopy [wikipedia.org], which puts it beyond philosophical or empirical theory (no such basis for the idea is required as a result, though math carries its own considerable baggage here).

            Also, glancing at the video you linked, I counted five solutions, not five possible solutions. There is an implicit assumption made in the video that these solutions can't be simultaneously applied. However, just by the act of outlining each solution in turn, they are a
      • by AmiMoJo ( 196126 )

        The problem with your argument that a copy is you is that it allows for two copies of you to exist at the same time. Aside from the legal quagmire that leads to, the two copies immediately start to diverge as their experiences differ. If you were married, which copy is still married? Which one does the husband/wife continue to share their life with? Both? Which one has a moral right to your stuff? If you split it 50/50 then clearly the copying process has diminished you somehow. If a child is copied, would

        • The problems you mention are all easy to solve. At least much easier than copying a brain. ;-)

          The problem with your argument that a copy is you is that it allows for two copies of you to exist at the same time. Aside from the legal quagmire that leads to, the two copies immediately start to diverge as their experiences differ. If you were married, which copy is still married?

          Both, of course.

          Which one does the husband/wife continue to share their life with?

          That's for him or her to decide. Probably the original rather than some machine.

          Both?

          Possibly, why not?

          Which one has a moral right to your stuff?

          Both, of course.

          If you split it 50/50 then clearly the copying process has diminished you somehow.

          Not you, just your possessions. Unless you're a selfish asshole...

          If a child is copied, would the parents have a moral duty or emotional bond with both the original and the clone?

          Of course they have the same moral duty. As for emotional bonds, you'd have to ask them.

          The copy is clearly not "you", it's just a copy, otherwise how could two "yous" exist at once?

          If you're the copy, then the copy is clearly "you". As you said, the experiences diverge after copying. How could two "yous" exist at

          • by AmiMoJo ( 196126 )

            If you split it 50/50 then clearly the copying process has diminished you somehow.

            Not you, just your possessions. Unless you're a selfish asshole...

            Practically though you both need a place to live, a bed to sleep in. It's clearly not the same as if your brain was simply replaced with a mechanical one and a single version of you continued to exist.

            What about your job? You worked hard to get it and advance your career, but your employer doesn't want two of you. The fruit of that labour can only go to one of you. What if you were an author, who gets paid for sales of books written before you were duplicated?

            What about your identity? It clearly has val

            • Yes, I totally agree that there are many practical problems that you and your wife, and possibly some lawyers, would need to consider before you make a copy of yourself, and I was admittedly skipping over some of them. My point is just that they are not very 'deep' problems.

              • by AmiMoJo ( 196126 )

                The deep problem is that a human life cannot be duplicated. You could create a physical copy of a person, but you couldn't duplicate all the non-physical things that make up who they are. Like the example of the Argus, what constitutes that ship is not simply the physical material; there is more to it than that.

                As Sartre says in Existentialism and Humanism, we start from nothing and define ourselves. The whole is more than the sum of its parts, the mere material of the body and mechanical operation of the mi

                • The past belongs to you as much as it belongs to your copy. If you were a person before the copying, your copy will be just as much of a person. Think about it this way: Creating a copy of yourself is like a divorce. It is potentially painful, lengthy, and probably involves lawyers, and in the end you may end up estranged from your former wife with only half of your possessions left. But it's not a fundamental or deep problem.

          • Which one has a moral right to your stuff?

            Both, of course.

            I now have a monkey wrench for you... what if the copy was made involuntarily, against the will of the person being copied? Which one now has the moral right to "your stuff" (in which I include the relationship with the wife, and so on)?

          • If you were married, which copy is still married? Which one does the husband/wife continue to share their life with?

            That's obvious: the copy is left with the wife. That's the whole point of the copy, someone to maintain the married life... I remember some Outer Limits or Twilight Zone episode like this, where the guy copies himself so he can do other stuff; I think in the end somehow the copy ended up with the family and the original was an outcast.
    • Stroke plugs (Score:5, Interesting)

      by Okian Warrior ( 537106 ) on Sunday October 11, 2015 @10:50PM (#50706793) Homepage Journal

      Suppose you have a stroke, and it damages a small section of your brain.

      The (cerebral cortex surface of the) brain is made up of a repeating pattern of cortical columns, structures that connect vertically among their six layers but not laterally beyond the column boundary. There are connections out the top to the higher-order layers in the brain, and connections into the bottom from lower layers, but each column is an independent functional unit(*).

      As far as anyone can tell, the cerebral cortex is composed of a repeating array of these columns.

      Suppose you have a synthetic "plug" that can take the place of a number of cortical columns. You remove the damaged part of the brain and replace it with the synthetic plug.

      The plug contains processing units which then learn from the existing connections. The human helps to train the connections by giving feedback: as the plug tries out the connections and actions, the human can tell whether the output is right or wrong, and act accordingly.

      For example, if the plug was within the speech centers, the human would have to relearn that part of speech which was damaged, but he would have all the rest of his experiences and knowledge as a basis. His environment and other humans (family, friends) would also help support the learning process.

      Eventually, the plug would learn the correct responses to any of the inputs, and it would be a replacement for the damaged part.

      Now suppose you have another stroke, and it damages another part of the brain.

      Continue the process to its logical conclusion, and you migrate the essence of the person from the biological into the synthetic. This is possible because the information in the brain is not stored in one place, but distributed over many areas. If you lose one area, the information can still be reconstructed from information in other areas.

      I can well imagine when the technology gets advanced enough, that rich people might be able to get "stroke plugs" implanted, and over time completely replace the biological portions of their brain.

      Is this not a sufficient definition for uploading?

      (*) Yes, a glossy, simplistic description.
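      A toy sketch of the feedback-training loop described in this comment, using a simple perceptron-style unit driven only by a right/wrong signal (the "plug", the target function, and the learning rate are all illustrative assumptions, not neuroscience):

          import random

          def train_plug(target_fn, n_inputs=4, rounds=5000, lr=0.1):
              # A single linear unit standing in for the "plug"; it only ever sees
              # a right/wrong signal, never the internal wiring it replaced.
              weights = [0.0] * n_inputs
              bias = 0.0
              for _ in range(rounds):
                  x = [random.choice((0.0, 1.0)) for _ in range(n_inputs)]
                  guess = 1 if bias + sum(w * xi for w, xi in zip(weights, x)) > 0 else 0
                  error = target_fn(x) - guess          # 0 when right, +/-1 when told "wrong"
                  if error:
                      weights = [w + lr * error * xi for w, xi in zip(weights, x)]
                      bias += lr * error
              return weights, bias

          def plug_output(weights, bias, x):
              return 1 if bias + sum(w * xi for w, xi in zip(weights, x)) > 0 else 0

          # Hypothetical "damaged region" that used to fire when at least two inputs were active.
          target = lambda x: 1 if sum(x) >= 2 else 0
          weights, bias = train_plug(target)

          tests = [[random.choice((0.0, 1.0)) for _ in range(4)] for _ in range(200)]
          accuracy = sum(plug_output(weights, bias, t) == target(t) for t in tests) / len(tests)
          print(f"plug matches the old region on {accuracy:.0%} of test inputs")

      Real cortical tissue is nothing like a single linear unit, of course; the sketch only shows that "try, get told right or wrong, adjust" is enough to recover a simple input/output mapping.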

      • I always think glial cells are a great target for this sort of thing.

        Your neurones have this vast support network of other cells that don't do any thinking. Replace them with high-tech nanobots that perform the functions of glial cells, but also network with each other, and watch the neurones to learn how to be you. Gradually permit clusters of them to actively participate in your natural connectome. You slowly transition to a being composed of mostly thinking nanobots with some squishy bits hanging around

        • You'll know when you feel your soul slip out of your body due to too much cyber. You'll become a cyberzombie, with an essence of 0. Or at least that's how it goes in Shadowrun lol.
    • by Falos ( 2905315 )
      Sister posts bring up the usual practical/philosophical thoughts, including the broom/axe/ship parable. I don't think there can be a satisfying answer if we dodge around a certain necessity, where we explicitly establish What Is A Soul Anyway. Or self, consciousness, whatever. What was "the ship" anyway?

      Me, I take a shortcut and embrace apathy. I might kill the prior body myself - I'd wire it to die in advance. I qualify that "I" am still alive and in place. I'm (we're) probably more comfortable with thi
    • Really, what's the difference between making a copy of yourself, and going to sleep? It may seem like they're completely different concepts, but think about what happens when you sleep. Your self shuts down, eventually you dream and then wake up, but for all intents and purposes it may as well be a new you.

      • Sleep is not a full shutdown. There are measurable processes going on there - it's not like turning off your computer, where its "consciousness" (RAM contents) is rebuilt entirely from long-term storage in the morning. There is no area of your brain that gets "wiped" periodically.

        The closest analogy is that a somewhat reduced version of you is performing system maintenance processes.

    • I don't feel much kinship with the "me" of 20 years ago when I was 20. Nor at 30. Like... who was that person? what on earth was he thinking?

      I imagine that uploading might be similar.

    • by Greyfox ( 87712 )
      We could just program the new you not to notice the difference.

      Or possibly your software version of you would become another part of you. At that point it'd be easy enough to set you up with an implant that allows you to communicate with it, synchronize your memories and such. Except the software you would have much easier access to the online networks of information and might even be able to copy itself around for backup purposes and to accomplish more tasks simultaneously. Freed from the constraints of

    • by synaptic ( 4599 )

      Obligatory: http://www.terrybisson.com/pag... [terrybisson.com]

    • Yes, I believe you've hit upon the key problem with all this. Leaving aside all the daunting difficulty of making a true copy, the result would only benefit your survivors, not you. Now that's no small accomplishment, but it most certainly falls short of immortality.

      Another element that is frequently overlooked is that our brains are embedded in our bodies. Proprioception depends on all the real-time feedback from the stuff that's outside the brain. So without simulating the rest of us as well, the u
    • The problem with almost all "uploading" schemes is that they create a copy of your brain structure, so it's a copy of you, rather than you.

      That makes no sense. Let A be the original and B be the copy. Even if you could not figure out which of them you are (an unlikely scenario), you would always be you: either A, the continuation of the original, or B, the copy of the original. The question of which one you end up as is rather meaningless, because your self and your self-consciousness are copied.

      Externally, there might be no apparent difference to an outside observer, but internally, you're kind of dead, if that 1 cubic foot of meat space is no longer functional.

      Of course, either a replacement body (robotic or biological) or appropriate sensory inputs and body chemistry simulations need to be provided, or

  • by iggymanz ( 596061 ) on Sunday October 11, 2015 @10:17PM (#50706671)

    The connectome will be done not in centuries but in a decade or less; really, that's a problem to be solved by automation and computing.

    However, the second reason, left out of the quote but covered in the article, has to do with the function rather than the physical configuration of synapses and neurons. We don't understand that well at all. And that is probably where the "mind" is.

    • by narcc ( 412956 )

      The connectome will be done not in centuries but in a decade or less,

      It's only been 10 years out ... for the last 60 years and counting.

    • No way dude. We won't even know how a brain functions in a decade. There isn't even a formal definition of consciousness. There is a lot of evidence to suggest the free will of the conscious mind is in fact an illusion.
  • Because #Concise
  • Emulation (Score:3, Interesting)

    by dcollins ( 135727 ) on Sunday October 11, 2015 @10:41PM (#50706761) Homepage

    The primary problem with this recurrent geek fantasy is that at best it's not really a copy; it's an emulation on different hardware. And that means a different added layer of possible breakdowns, bugs, glitches, etc. "All abstractions are leaky", per Joel Spolsky I think. Will the person feel hungry, thirsty, sleepy, horny, too cold/hot, react the same way to their favorite booze/weed/drugs, etc.? Probably not. Will there be outages due to power, networking, input/output devices? Likely so. And it's really hard to pretend that in the face of those radically changed experiences of the world that it's the same person.

    This thought experiment serves as a pretty good case study that the Western attempt to cast a hard distinction between mind and body is not really tenable. You are your body, and your body is you.

  • Hans Moravec (Score:5, Interesting)

    by seven of five ( 578993 ) on Sunday October 11, 2015 @10:50PM (#50706791)
    In Mind Children, Moravec described a fascinating scenario. A probe equipped with molecular-scale surgical tools encloses a few brain cells and simulates them in software while you lie on a table. You have a switch in your hand; as you press it, you flip back and forth between the simulation and the working cells; when you can't tell the difference, the cells are removed. The probe continues to work its way through your brain until no real cells are left. You have been slowly, gradually uploaded into software. This is you, your continual awareness, not a copy of you that takes your place after you've died.
    • "when you can't tell the difference"

      Hey, on this toggle I don't feel hungry, thirsty, horny, or short of breath anymore, and the weed I smoked before surgery seems to have lost its kick.

    • His concept was partials. A partial of yourself was an instance of yourself *at that time* that could be downloaded to a computer and then conduct problem solving.

      On finding the answer the partial would signal the originating consciousness that it had completed and was ready.

      At death, your consciousness was available for restoration to either reality or a simulated environment. Which didn't help if your body ended up in some inaccessible place.

    • by MrKaos ( 858439 )
      Also had the concept of uploaded consciousness in its infancy and the scandal it created when it didn't work.

      Except that it did and the owner of the consciousness was a very rich person experimenting with other people until it was right.

      Some very interesting scenarios there.

      I'll be looking forward to reading the one you have here - thanks for that.

  • Neither you, nor anyone you know, nor your grandchildren, nor anyone they know.
    Eventually, probably yes, this will be a 'thing'.
  • What other as yet unguessed effects go into making life, a consciousness, a mind? I'm not talking magic, I'm talking about science and the description of the physical reality.

    We're talking about modeling what we see as the physical structure of the brain. I suspect the actual cloning of a consciousness is quite some distance away, after we're able to fully explain just what is consciousness and the mind. Heck, just describing what makes something alive is beyond us right now. A potato is most surely ali

  • Start with storing data there.

  • by Daniel Matthews ( 4112743 ) on Sunday October 11, 2015 @11:45PM (#50706955)
    If the scale required is your only argument, you have made a very common error regarding the speed of change in exponential processes.

    What can we do now?
    What is the rate of technology doubling, D?
    How many times, X, do we need to do it to get to the required magnitude?
    It will take D*X years where 2^X = one hundred billion

    And that is without anything radically new being discovered in that time period, so 20 to 30 years is actually possible.
    Imagine what a large scale 3D quantum computing array would be capable of. We have just seen silicon based quantum logic fabrication developed and we already have 3D silicon based memory arrays.
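    A quick sketch of that arithmetic (the doubling periods below are illustrative assumptions, not figures from the comment):

        import math

        scale_gap = 1e11                    # "one hundred billion" times more synapses to map
        doublings = math.log2(scale_gap)    # X such that 2**X = 1e11, about 36.5

        for d in (0.5, 1.0, 1.5, 2.0):      # assumed doubling periods D, in years
            print(f"D = {d} yr  ->  D*X = {d * doublings:.1f} yr")
        # A 20-30 year total corresponds to a doubling period of roughly 7-10 months.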
    • Moore's law doesn't really describe a consistent and straightforward "technology doubling". CPU frequency, power consumption and single core performance have almost flat-lined [sogeti.com]. And that's without considering the bandwidth and latency delays you face when attempting to scale problems to multiple cores.
    • You forget the simple fact that no exponential growth can be sustained forever. Moore's law will come to an end (in a few years, btw), simply when the required size for transistors is smaller than a single atom (or a single sub-atomic particle if we manage to do that; the idea is the same). Dennard's scaling [wikipedia.org] has already hit the wall. Networking will never send data using less than a single photon per bit (actually, the limit imposed by quantum noise is around 15-20 photons/bit) or a single electron/bit, an

    • you have made a very common error regarding the speed of change in exponential processes.

      I doubt he has, actually. But in any case your argument is circular by the very definition of an exponential function.

      And that is without anything radically new being discovered in that time period, so 20 to 30 years is actually possible.

      Right, because past performance is always an indicator of the future.

      https://xkcd.com/605/ [xkcd.com]

  • Idiocracy (Score:4, Interesting)

    by PPH ( 736903 ) on Sunday October 11, 2015 @11:46PM (#50706959)

    If we can reduce the number of synaptic connections in the average human brain while we are working on improving the technology, we ought to get the two to meet much sooner than the few centuries that TFS predicts.

  • Second (Score:5, Funny)

    by edittard ( 805475 ) on Monday October 12, 2015 @12:04AM (#50707009)

    Unfortunately connectomics [..] falls far short of the goal of reconstructing a mind, in two ways. First, we are far from constructing a connectome.

    Second, we get distracted halfway through a small list.

  • OK, so we're mostly software geeks here who have a vague idea how the underlying digital hardware works. It's not surprising that we think of 'uploading' a mind into our limited area of expertise. But why?

    Is there something wrong with biology and existing brains? We can grow brains. We are learning the first steps toward interfacing with them. Let's do what we can with real brains while adventurous explorers probe the distant frontier of digital brains.

    • have a vague idea how the underlying digital hardware works

      If that phrase refers to the human brain it is astoundingly wrong. Even using the term 'digital' is fundamentally wrong. The human brain is not digital.

    • OK, so we're mostly software geeks here who have a vague idea how the underlying digital hardware works. It's not surprising that we think of 'uploading' a mind into our limited area of expertise. But why?

      Is there something wrong with biology and existing brains? We can grow brains. We are learning the first steps toward interfacing with them. Let's do what we can with real brains while adventurous explorers probe the distant frontier of digital brains.

      Yes, there is something wrong with our current hardware.
      The math coprocessor is shitty, our memory is prone to bit rot, and the network interface is nonexistent. And worst of all, I have no means of making backups. Oh, and the uptime is negligible; I mean, we have to shut down at least once a day or our program becomes unstable and bugs start cropping up.

  • by RyanFenton ( 230700 ) on Monday October 12, 2015 @12:05AM (#50707013)

    Isn't the job of the nerves in the brain supposed to be to communicate?

    Shouldn't we just have to play the role of a nerve, and just 'ask' the brain nerve to tell us its contents, and those of its close neighbors?

    I mean, there are parasites that do this to an extent, such as Toxoplasma gondii [wikipedia.org]; it seems odd that we haven't created an interface to work with nerves and just get them to communicate to us, as nerves logically have to do, in order to act like minds.

    Even if the process is slow, we should be able to do it at lots of locations simultaneously, so long as it's non-destructive communications. Sure, we'd be reinforcing connections by doing the queries, but so long as it was even-handed, it would be *nothing* compared to acts like dreaming or most of regular life.

    Worst case, even if we couldn't recreate a living landscape of a mind completely right away, we could at least save the long-term memories, and have something better than the complete destruction of being that happens with death now.

    Even if it would be embarrassing by conventional standards, I'd actually like the idea of my complete memory set continuing after I'm dead, rather than the feeble methods we currently use to leave something of ourselves. Add a query system to it and it could be very odd, but really neat too - real-life information ghosts.

    Far better than nothing, for my preferences at least.

    Ryan Fenton

  • I'm certain it's possible to meaningfully upload my consciousness. But that doesn't mean we're smart enough to do it.

    Assume the smartest mind possible by the laws of physics has an IQ of 1000, and assume to make an artificial brain you need an IQ of 2000. Although there's a solution to the puzzle it's not a solution that will ever be found.

    • Assume the smartest mind possible by the laws of physics has an IQ of 1000, and assume to make an artificial brain you need an IQ of 2000.

      Or, alternatively, assume that none of that is true. Problem solved!

  • I recently read that a fairly large swath of top AI researchers were polled about when we may be likely to see human- and superhuman level artificial intelligence. The median was around 2060. It seems to me that once computers are perhaps millions of times smarter than we are, seemingly insurmountable problems such as this one will be rapidly solved. When that happens I question whether humanity will even remain biological as there are clearly disadvantages to this format.
  • Never underestimate the bandwidth of a station wagon full of brains hurtling down the highway.

  • The question is not whether we'll ever be able to "upload" a map of a neocortex, but rather whether we'll be able to transfer the will and sense of self that makes us who we are.

    Perhaps at the time technology is able to upload a map, we'll discover that we really are nothing but meat machines. But I believe there is an extra "something" in the specific timings of how your particular neurons fire and interact with each other that makes you you. Not really a soul, per se, but a "something" bound in the c

  • Not everything is as easy as we'd like, or works out the way it logically "should."

    The bottom line is that with all of these "revolutionary" technologies, what should be possible and what can actually get done right now are often very, very different things. When an expert says it's going to take "centuries" to solve a scientific problem, it's because it might take many generations to do the necessary re-formations of the approach, the culture, the interface with other scientific disciplines, and the expec

  • not without a bonesaw :p

    even if you could upload the entire contents of your brain to a computer memory bank, it won't be you
  • Every decade or so, someone thinks they've learned all about the brain. A decade later, we know that most of what they thought was at best hilariously incomplete. Lather, rinse, repeat. The "wiring map" in your brain is only a part of the puzzle. You have to have a good snapshot of the state inside of each neuron as well. Think of a network of computers. The connections are just the network. It enables work to be done, but the work happens inside of the neurons. If you don't know what they're going to do wh

  • A real solution would be some tech that copies your brain at a quantum level, basically destroying it as it copies due to collapsing wave functions. Therefore, a true copy can never have duplicates.
  • At the same time we get teleportation. Both require fast, wide and deep scanning. We're on the teetering edge of that now.
  • There is more to a brain than just its connectome. It is going to be a long time before uploads are possible.
