Social Networks

Startup Out of MIT Promises Digital Afterlife — Just Hand Over Your Data 241

Posted by timothy
from the ashes-to-ashes-dust-to-nsa dept.
v3rgEz writes "A new startup out of MIT offers early adopters a chance at the afterlife, of sorts: It promises to build an AI representation of the dearly departed based on chat logs, email, Facebook, and other digital exhaust generated over the years. "Eterni.me generates a virtual YOU, an avatar that emulates your personality and can interact with, and offer information and advice to your family and friends after you pass away," the team promises. But can a chat bot plus big data really produce anything beyond a creepy, awkward facsimile?"
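The submitter's skepticism is easy to make concrete. Stripped of marketing, the naive "chat bot plus big data" approach is retrieval: answer a new prompt by echoing whatever the person once said in the most similar past context. A minimal sketch of that idea follows; the class name, the toy corpus, and the bag-of-words matching are all invented for illustration and are not Eterni.me's actual design.

```python
# Hypothetical sketch of the naive "chat bot plus big data" approach:
# reply to a prompt by retrieving the most similar past exchange.
import math
from collections import Counter

def bag_of_words(text):
    """Lowercased word counts; the crudest possible text representation."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class PersonaBot:
    def __init__(self, corpus):
        # corpus: (prompt, reply) pairs mined from chat logs and email
        self.corpus = [(bag_of_words(p), r) for p, r in corpus]

    def respond(self, prompt):
        query = bag_of_words(prompt)
        # Echo the reply whose original prompt best matches the query
        best = max(self.corpus, key=lambda pair: cosine(query, pair[0]))
        return best[1]

logs = [
    ("how are you", "Same old, same old. You know me."),
    ("what should I do about the car", "Take it to Lou's, tell him I sent you."),
]
bot = PersonaBot(logs)
print(bot.respond("hey, how are you doing?"))  # → Same old, same old. You know me.
```

The facsimile is creepy precisely because this is all such a system can do: rearrange the past. It can never answer a question the corpus never saw.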
This discussion has been archived. No new comments can be posted.


  • No. (Score:5, Interesting)

    by damn_registrars (1103043) <damn.registrars@gmail.com> on Thursday January 30, 2014 @04:51PM (#46113883) Homepage Journal

    But can a chat bot plus big data really produce anything beyond a creepy, awkward facsimile?

    No, it cannot. Once you're dead, you're dead. Game over.

    • But can a chat bot plus big data really produce anything beyond a creepy, awkward facsimile?

      No, it cannot. Once you're dead, you're dead. Game over.

      True that. I doubt any software can truly emulate the nuance of human personality based solely on pictures and tweets.

      • Re:No. (Score:5, Funny)

        by SJHillman (1966756) on Thursday January 30, 2014 @05:06PM (#46114093)

        I don't know... most of the people I know that use Twitter and Facebook heavily have slightly less personality than most software.

      • Re:No. (Score:5, Insightful)

        by dmbasso (1052166) on Thursday January 30, 2014 @05:07PM (#46114125)

        No, it cannot. Once you're dead, you're dead. Game over.

        True that. I doubt any software can truly emulate the nuance of human personality based solely on pictures and tweets.

        Actually it is worse than that. People should learn to grieve and then go on with their lives. A bot would only hinder this necessary mental healing process.

        • Well, if you do it right and use brain state rather than just tweets and Facebook posts, then you don't need to grieve at all, because you'd still be alive, and your consciousness would be housed in a silicon brain rather than a meat brain.
          • Re: (Score:2, Interesting)

            by Anonymous Coward

            Well, if you do it right and use brain state rather than just tweets and Facebook posts, then you don't need to grieve at all, because you'd still be alive, and your consciousness would be housed in a silicon brain rather than a meat brain.

            And how exactly would you be transferring said "consciousness" into that silicon brain? A copy of a brain and its mind is not the original consciousness. And even if, in some fantasy, a consciousness generator (BEC or somesuch) were inlined into the system, it still would not be the original person's consciousness, merely another one with the same memories etc. The original person (the original consciousness) would still be dead or whatever, and you'd be living with a lie, which, granted, is enough for most people.

            • by dmbasso (1052166)

              To be fair, it is hard to tell, as you can't experience other people's consciousness. But you yourself aren't exactly the same you of a millisecond ago. So if you could make a copy of your neural pathways, even if only roughly accurate, other people wouldn't be able to easily distinguish copy from original... I think that's the point TsuruchiBrian was making: for all practical purposes, you would be alive (from someone else's point of view). If your copies would have the subjective experience of being the actual

            • by fractoid (1076465)
              What does "the original consciousness" even mean?

              If you're completely unconscious, then when you wake up, you aren't "the same" consciousness because continuity has been broken. You're merely another 'you' running on the same hardware with the same memories.
          • What does this even mean? If you can transfer "brain state" to silicon, why can't you just make a copy of a living person instead of a dead or dying one? And if you had a silicon copy of yourself, would you be willing to kill the meat-you? No? Then I'd say a brain-state copy isn't you, it's a copy.

            In short, either all this business about a continuous, individual consciousness is largely illusory or we just don't understand the phenomenon very well at all yet.
            • And if you had a silicon copy of yourself, would you be willing to kill the meat-you? No? Then I'd say a brain-state copy isn't you, it's a copy.

              I would not be willing to kill the meat me. This is because I have evolved some pretty sophisticated self-preservation instincts. That doesn't mean the copy can't *also* be me. There is no rule of the universe that says there can be only one me. If I copied myself, I'm sure the meat me would be sad that he is still going to die of old age. The silicon me will be ecstatic that the copying process worked.

              It's easy to say that the meat me is the *real* me. But I think a more appropriate way to look at the

      • by Anonymous Coward
        I don't know, "creepy" and "awkward" gets you 97% there for most of this audience.
    • by Hatta (162192) on Thursday January 30, 2014 @05:12PM (#46114179) Journal

      Well, at least they can recreate the readership of /.

      • Well, at least they can recreate the readership of /.

        An argument has been made (by both myself and others) that at least one slashdot user is a script already. Not necessarily an intelligent one, but a script nonetheless.

        • by Dogtanian (588974) on Thursday January 30, 2014 @05:59PM (#46114711) Homepage

          An argument has been made (by both myself and others) that at least one slashdot user is a script already. Not necessarily an intelligent one, but a script nonetheless.

          Does it bother you that an argument has been made (by both yourself and others) that at least one slashdot user is a script already?

          • An argument has been made (by both myself and others) that at least one slashdot user is a script already. Not necessarily an intelligent one, but a script nonetheless.

            Does it bother you that an argument has been made (by both yourself and others) that at least one slashdot user is a script already?

            It bothers me only that I have no mod points to award to that comment.

    • by ackthpt (218170)

      But can a chat bot plus big data really produce anything beyond a creepy, awkward facsimile?

      No, it cannot. Once you're dead, you're dead. Game over.

      Well, assume they could make a perfect clone: the original you would still be dead, and that's what we fear the most in this. Some construct which also believes it's me doesn't have the continuity.

      Besides, I don't think I'd be a happy clone/chatbot going around with a patent number on my arse.

      • Besides, I don't think I'd be a happy clone/chatbot going around with a patent number on my arse.

        Yes, you would. You'd be programmed to be happy. (and to not notice the patent number).

      • The "original you" dies every time something in your brain changes. What's the difference?

        What if you replaced every neuron in your brain with an artificial one, one at a time? Would this make it easier to pretend there is continuity? What if we hide the blob of meat that used to be your brain after the process is over?

    • Of course it can! Why the resistance? Human-level AI will exist by the time young people reading this are dead. Max Headroom: 20 Minutes Into The Future was, more or less, right.
      • Re: (Score:2, Informative)

        by Anonymous Coward

        Of course it can! Why the resistance? Human-level AI will exist by the time young people reading this are dead. Max Headroom: 20 Minutes Into The Future was, more or less, right.

        How ironic that you should mention Max Headroom. Perhaps you forgot, though, that in the episode where a company was doing exactly this, it was just a scam? They used the deceased's image and had it parroting some phrases, essentially a really bad chat-bot, while advertising that they had made a perfect copy of them and were keeping them "alive" for a price.

        btw: The Max Headroom AI was created by accident. The scientists at that time did not know how to make that level of AI on demand.

      • I think it will be closer to 20 or 30 years.
    • People's obsession with the world around them after death is odd. None of the major religions talk about just hanging around in the normal world after death to see what is going on. And secular followings certainly don't. So most of what we do before death, regarding death, is for the sake of satisfaction while we're alive. And that should really cover, at most, your required responsibilities. I.e. if you are the sole income earner in a family with kids, then get a life insurance policy to be responsible fo

  • by ravenscar (1662985) on Thursday January 30, 2014 @04:56PM (#46113939)

    Reporting for duty.

    "Smoke me a kipper. I'll be back for breakfast."

  • Yikes (Score:5, Insightful)

    by Bovius (1243040) on Thursday January 30, 2014 @04:56PM (#46113947)

    Holy balls, that is creepy. At best, this would really weird out people who knew the dearly departed. At worst, it would provide a hook for traumatized loved ones to avoid dealing with their grief and get increasingly bottled up in a fantasy world.

    It is difficult for me to imagine ways in which this would be a good thing.

  • by WPIDalamar (122110) on Thursday January 30, 2014 @04:56PM (#46113951) Homepage

    Can I get this before I die? I hate talking with people sometimes.

    • by Tom (822)

      Yes, please.

      I was about to post something scathing to the effect of "that is the LAST thing I want happening to me afterwards", but yeah, if this can handle all that Facebook crap for me...

    • by Ichijo (607641) on Thursday January 30, 2014 @05:18PM (#46114243) Homepage Journal

      A chatty avatar version of me that keeps people on the phone as long as possible without committing to anything would be a great way to get telemarketers to stop calling. Maybe even better than Lenny [itslenny.com]. As a bonus, it would be seamless: just push a button in the middle of a conversation and the avatar would take over without the caller knowing.

      MIT, please make it happen.
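For what it's worth, a Lenny-style staller needs no AI at all: Lenny just plays the next prerecorded clip in a fixed loop whenever the caller pauses long enough. The turn-taking logic fits in a few lines; the clip texts and the silence threshold below are made up for illustration.

```python
from itertools import cycle

# Hypothetical clip list; the real Lenny uses a small set of recorded audio files.
CLIPS = [
    "Hello, this is Lenny!",
    "Sorry, I can barely hear you there.",
    "Yes, yes, yes...",
    "Could you say that again, please?",
]

class StallBot:
    """Plays the next canned clip each time the caller falls silent."""

    def __init__(self, clips):
        self._clips = cycle(clips)  # loop forever through the same script

    def on_silence(self, seconds):
        # Only respond after a pause long enough to signal a turn change
        if seconds >= 1.5:
            return next(self._clips)
        return None

bot = StallBot(CLIPS)
print(bot.on_silence(2.0))  # → Hello, this is Lenny!
```

The seamless-handoff idea above would just wire this state machine to the call audio in place of the human speaker.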

    • Can I get this before I die

      And for people who aren't dead?

      I.e., can I get this to replace my long-distance ex-girlfriend? Or would it also decide I'm getting too creepy and we need to break up?

      I can't be the only slashdotter that wants it for this purpose.

  • They "promise" to do it... but they don't promise when.

    This may be possible someday, but not yet.

    Of course, if they do it too well, it may cause psychological trauma for some people who won't accept that the person they cared about is really dead.

    • by leftover (210560)

      1. Sell empty promises now.
      2. Wait for your "customers" to die.
      3. No 'Profit!' because you bolted with the money during Step 2.

      Look for them to avoid any preview of the avatar.

  • by Calavar (1587721) on Thursday January 30, 2014 @04:58PM (#46113981)
    Even if these guys could make an AI algorithm that is 100% accurate if given the correct input, internet posts are not the best seed data. People tend to be dicks on the internet. I'm pretty sure most people would not like to interact with the online versions of their departed loved ones.
    • by mark-t (151149) <markt@lynx . b c.ca> on Thursday January 30, 2014 @05:07PM (#46114115) Journal
      I'm inclined to think that, in general, people who act like dicks on the internet are actually dicks in real life who, at best (possibly for reasons of conformity), may just be curbing their tendency toward being a dick around people they meet in real life to avoid the potential social and cultural complications. That doesn't mean that's who they really are, however.
      • by Calavar (1587721)
        I think you're talking about closet racists and internet trolls and similar folks, and when it comes to those people, I completely agree. But even for your average Joe who loves his kids and is steadfastly loyal to his buddies, the internet is an entirely different setting. It's tone-deaf, for one, so a comment could be completely innocuous or scathingly sarcastic depending on how you read it. Since humans often have trouble telling the difference, an AI algorithm that perfectly mimics humans wi
        • by mark-t (151149)
          Actually, I was talking about anybody who acts like a dick online. Generally, a person who would call somebody they didn't know a moron online because they said something amazingly stupid or uninformed, but never to their face for the same reason, is usually just trying to avoid the social complications that would arise from such name-calling, and is not really any less of a dick just because they are wise enough to recognize that it's socially inappropriate in such contexts.
  • This is kind of terrible. Capitalizing on people's loss by selling them a pie in the sky dream. I admire the ambition, but I would think we would need to create an AI that can sufficiently pass the Turing test before we create one that represents a person's personality well enough to fool the person's closest family/friends.
  • by Anonymous Coward on Thursday January 30, 2014 @04:59PM (#46113999)

    From the Max Headroom episode Deities [youtube.com].

  • As long as Jimmy the Saint is doing the sales.
  • by Eric S. Smith (162) on Thursday January 30, 2014 @05:03PM (#46114053) Homepage
    Tell me more about can a chat bot plus big data really produce anything beyond a creepy, awkward facsimile?
  • so that's a deal killer right there.

    If I were to sign up for Facebook, and then do nothing more than post cat pictures, what kind of digital afterlife would I end up with, anyways?

  • Caprica (Score:5, Insightful)

    by stewsters (1406737) on Thursday January 30, 2014 @05:04PM (#46114061)
    Caprica. Watch it. Doesn't end well.
  • by holophrastic (221104) on Thursday January 30, 2014 @05:05PM (#46114081)

    So I'm dead. Why do I care about this? And why would I choose to spend money on it now?
    And what if I want to retain my own intellectual property when I'm dead? Can I install a web-server in my tomb-stone to host this thing?

    Oh wait, there is no tomb-stone -- again, because I'm dead so why would I want one?
    Hey look! It's another service to rape and impoverish people who have zero self-esteem in the first place!

    Don't worry. You can suck in this life. In your afterlife, you'll be wise and useful.
    Hey look! It's another religious promise!

    Last I checked, a facsimile after death is called a zombie.

  • by MDMurphy (208495) on Thursday January 30, 2014 @05:06PM (#46114097)

    This is the basis of S02E01 of "Black Mirror"
    http://en.wikipedia.org/wiki/L... [wikipedia.org]

    The episode did a pretty good representation of the idea, showing things that the dearly departed's avatar would know and not know based on their chat and email history.

  • by CatsupBoy (825578) on Thursday January 30, 2014 @05:06PM (#46114099)
    Really reminds me of that moment where Harry Potter talks to his loved ones before going to die in the woods (sorry for the crappy ref, I'm not a huge Potter buff). He isn't really experiencing something new with them; he's just talking with them and they are giving him reassurance.

    On the surface, of course, this sounds creepy, but it's amazing how easy it is to comfort that "human" side of your brain. In a similar manner, this would provide someone pretty much the same thing. You know, kinda like: if it sounds like Joe, acts like Joe, says something I think Joe might say, then you can probably be reconnected in that small way, relieving your pain a little.

    I think anything that has the potential to ease suffering probably has a future.
  • and postmodern spiritualism.

    "We'll be happy to conduct a social media seance and allow you to contact your dearly departed. But first we'll need all that personal information."

  • Dixie Flatline (Score:4, Interesting)

    by SpectreBlofeld (886224) on Thursday January 30, 2014 @05:09PM (#46114141)

    `How you doing, Dixie?'
        `I'm dead, Case. Got enough time in on this Hosaka to
    figure that one.'
        `How's it feel?'
        `It doesn't.'
        `Bother you?'
    `What bothers me is, nothin' does.'
        `How's that?'
        `Had me this buddy in the Russian camp, Siberia, his thumb
    was frostbit. Medics came by and they cut it off. Month later
    he's tossin' all night. Elroy, I said, what's eatin' you? Goddam
    thumb's itchin', he says. So I told him, scratch it. McCoy, he
    says, it's the _other_ goddam thumb.' When the construct laughed,
    it came through as something else, not laughter, but a stab of
    cold down Case's spine. `Do me a favor, boy.'
        `What's that, Dix?'
        `This scam of yours, when it's over, you erase this goddam
    thing.'

    -Neuromancer

  • by wed128 (722152) <woodrowdouglass&gmail,com> on Thursday January 30, 2014 @05:09PM (#46114143)

    I'm sorry. My responses are limited. You must ask the right questions.

  • by Jason Levine (196982) on Thursday January 30, 2014 @05:10PM (#46114157)

    Let's say that the best case scenario happens and they're actually able to do this. You've now got chat bots functioning as long-dead people chatting away with living people. So far so good. Of course, the technology to do this would be impressive and would attract the attention of "the big boys." How long before they get bought out by Facebook or Google (or some other company)? How much longer after that until the chat bots get monetized? Perhaps by increasing the likelihood that a chat bot would mention a specific brand name instead of a general product that the formerly living person was interested in or perhaps by just blurting out random product callouts. Even if the monetization doesn't happen, how long until the entire project is folded into some other group and the chat bots get shut down for good?

    Even if they manage to do this, I don't see this lasting for long enough for many of the participants to actually die and be "resurrected" as chat bots.

  • What began as a conflict over the transfer of consciousness from flesh to machines escalated into a botnet which has decimated a million websites. Facebook and Twitter have all but exhausted the resources of the Internet in their struggle for domination. Both sides now crippled beyond repair, the remnants of their users continue to post on ravaged smartphones, their dumbassery fueled by over four thousand posts of total crap. Now this will go past their death. For each user, the only acceptable outcome is t

  • Priority override: Tears of Ra

  • by sconeu (64226) on Thursday January 30, 2014 @05:14PM (#46114201) Homepage Journal

    The "avatars" in the Alex Benedict series.

  • Charlie Brooker's excellent series "Black Mirror" had exactly this idea in the episode "Be Right Back".

    A company would take all the tweets, Facebook posts, etc. as input and create a bot of the deceased's personality that you would be able to text with. The story had a pregnant recent widow start talking to her "deceased" husband. To extend it to the logical conclusion, the company had upgrades that went from texting, through phone conversation once audio input was put in, to finally an android based on the per

  • For some people, there's no need for any disclosure beyond what they've already done 100% publicly. I'm pretty sure I could whip up an RMSbot over a long weekend, for example.

  • by gweihir (88907) on Thursday January 30, 2014 @05:22PM (#46114281)

    This is, of course, utter nonsense. Not only is technology not advanced enough to do anything like this, the data required is unsuitable for the task for any but the most shallow individuals.

    That even a nearly perfect simulacrum would not be you is obvious.

  • All that data (Score:3, Interesting)

    by laie_techie (883464) on Thursday January 30, 2014 @05:28PM (#46114327)

    Do you trust any company with all the data it would take to train the AI? Do you trust the employees of that company not to read your emails and online posts and use it against you before you die? Do you trust their servers not to get hacked resulting in massive identity theft?

  • by EMG at MU (1194965) on Thursday January 30, 2014 @05:29PM (#46114343)
    I just think this is sad. When I become worm food, I hope people find solace in their memories of me, the good times we had together, the adventures we went on. My life is defined by what I do in meatspace, not what digital excrement is left over in cyberspace. So many people are living more and more of their lives online; if your legacy is chat logs and Facebook posts, god dammit, did you really live? Facebook isn't you, it is a digital representation of what you want other people to think you are.
  • Or was I the only one that watched Caprica...

  • If you haven't seen it, I suggest you watch Black Mirror. http://en.wikipedia.org/wiki/L... [wikipedia.org] Season 2 Episode 1 is about exactly this concept, just much more extreme. That episode is seriously freaky and intelligent sci fi. The others are all excellent too and each is different from the rest.
  • ... in 'Tales of Pirx the Pilot' about forty years ago.

    If I remember correctly, at some point the simulation of a famous departed scientist has to point out to the protagonist that he can't really come up with any new ideas, since he's only a collection of the data and knowledge of the person.

  • Because that's exactly what I want... to live on emailing people creepy messages from the Wired.

  • by Minwee (522556) <dcr@neverwhen.org> on Thursday January 30, 2014 @06:15PM (#46114873) Homepage
    "Honey? Your dad's on the phone again. He wants you to switch to a new insurance carrier, and hire someone to have the carpets cleaned."
  • Nope.

    You can't even pass the Turing test yet, let alone represent a brain state digitally, and you want to recreate a person based on text data? This is to mind uploading what ELIZA is to artificial intelligence.
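The ELIZA comparison is apt: Weizenbaum's program understood nothing and simply matched keywords against a script of decomposition and reassembly rules. A toy version of the mechanism is below; these particular rules are invented for illustration and are not Weizenbaum's actual DOCTOR script.

```python
import re

# A few DOCTOR-style rules: (pattern, response template). Illustrative only.
RULES = [
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {}?"),
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {}."),
]
DEFAULT = "Please go on."

def eliza(utterance):
    """Return the first matching rule's response, with the capture spliced in."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(m.group(1).rstrip(".!?"))
    return DEFAULT

print(eliza("I am sad about all this"))  # → How long have you been sad about all this?
```

The illusion holds only as long as the user supplies the meaning; a "you"-bot built on the same principle would be the same mirror with your vocabulary pasted on.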

  • Obviously a digital version is not as good as imprinting a clone with your life's history, but give cloning a few more decades ..

    http://en.wikipedia.org/wiki/C... [wikipedia.org]
  • What I would be way more interested in, is a service that upon my death gets handed over all of my digital accounts and proceeds to send them out in a blaze of glory. Epic attacks on trolls I dislike, statement after statement of the most raw and un-PC thoughts ever to leave a final mark upon the world. You could pay extra for more advanced writers to craft your final remarks.

    So much cooler than an Eliza that is Me flavored.

  • He is the best-documented human that ever lived, by his own decision. If they can get something out of his Chronofile, as a proof of concept, then it's interesting. http://en.wikipedia.org/wiki/D... [wikipedia.org]
