TraumaBunny

Member
Aug 17, 2019
28
I'm no tech. But I was wondering if there'd be a way to create a program that'd emulate my personality, so as to maintain the illusion that I'm still alive after I'm gone.

Most people don't talk anymore; they "correspond" through emails, texts, Snapchat, WhatsApp, etc. instead. So if I could somehow create a series of automated responses in advance (texts/voice clips/videos etc.) and have a piece of software deliver them to people whenever I'm contacted, I'd surely be able to trick everyone I know into believing that I'm still alive. Which, in turn, would protect them from experiencing any suicide bereavement.

As long as they were given the impression that I was just "touching base," they'd remain oblivious to my suicide.

I just need to figure out how to do it, as I have no programming knowledge.
 
Lookingforabus

Arcanist
Aug 6, 2019
421
If you could actually pull this off, you'd be a multi-billionaire, at a minimum. You might want to look up what happened to Microsoft when they tried to create a realistic chatbot and opened it up to public interactions (spoiler: people quickly taught it to be racist and sexist and to swear a lot).

There are basically two options at present, which get combined to create very limited, somewhat adaptive programs. One is a fixed list of canned, pre-programmed responses (which won't fool people for long at all). The other is neural-network-style algorithms capable of limited adaptation based on the inputs they receive (which is very primitive, and, going by Microsoft's experience, will probably result in some internet troll teaching your program to parrot back white-supremacist slogans).

Siri and other voice assistants are a good example of this. If you ask them to put something on your calendar, they're programmed to access the calendar app and make an entry; if you ask them a general question, they'll send it to a search engine; and if you, for example, ask Siri what she's wearing, she'll kick back a canned response like "I'm not that kind of assistant." (At least she used to; that's probably too politically incorrect these days to still be a response.) Outside of that sort of simple task, though, they have neither the adaptive ability nor the right set of programmed responses to show any emotional depth or understand the contours of a person's personality.

Bottom line: if multinational tech companies, with thousands of programmers and billions of dollars invested in the problem, can't do it, you won't be able to either. The technology just doesn't exist. Yet. Unless you count the wetware inside your skull.
 
Reactions: pole and Thanatos