ham and potatoes

Just some hillbilly
Mar 27, 2024
405
They are developing AI for the usual reasons: money and power. A tale as old as time.
 
WhenTheyCry

Experienced
Jun 25, 2022
270
The payoff is insane: it could result in a post-scarcity world with all diseases and illnesses cured. We could all be transported into a full-dive virtual reality heaven and live like gods.
 
sserafim

brighter than the sun, that’s just me
Sep 13, 2023
9,013
The payoff is insane: it could result in a post-scarcity world with all diseases and illnesses cured. We could all be transported into a full-dive virtual reality heaven and live like gods.
I doubt that an AI utopia will ever happen. In my opinion, an AI dystopia is much more likely. The elites will never implement a UBI. People are worthless and not valuable if they can't work. People are basically human capital, and the only reason why we exist is to become slaves to the system. We are valued for our labor and productivity. Once AI takes people's jobs, people will only be valued as consumers and taxpayers. I think that society will still need people to consume and spend money to move money back into the economy, but I doubt that the elites are generous or benevolent enough to give people free money (just so they can consume). People are seen as money-making machines, and society devises endless ways to profit off of us. Society milks all of us for money. That's just life. We constantly have to pay for something, even basic needs. Nothing in this world is free. Everything costs something and comes at a price.
 
escape_from_hell

Specialist
Feb 22, 2024
372
The payoff is insane: it could result in a post-scarcity world with all diseases and illnesses cured. We could all be transported into a full-dive virtual reality heaven and live like gods.
If a virtual heaven is possible, so is a virtual hell.
Unfortunately, due to the sinister nature of the reality around us, or simply the morbid curiosity of humanity, I fear virtual hell is far, far, far, far, far more likely.
Optimists would scoff and say "but why? there's just no point, of course heaven will be available for all."
Could be. But won't be. That's part of hell, the torment of knowing there's no legitimate reason for the torture.
Luck is the arbiter of all.

Some people may get the virtual heaven. But some are pretty much living in heaven now, gifted with health and wealth, plentiful opportunities, loving relations, lack of anxiety, and immunity from prosecution under the law. The mind-bending part? If you are in heaven, it is because you deserve it: you had a good soul and made all of the correct decisions at each free-will junction. If you are in hell, it is because you deserve it: you are inherently bad (original sin) and made all of the incorrect decisions at each free-will junction. Philosophical desert is a concept that can be incorporated into the virtual reality like any other.

Ever heard an angry and vengeful person talk about their perceived enemies? "Wouldn't wish that on my worst enemy" is actually a phrase of self-pity. The shit people wish on those they feel wronged them... Ask yourself: were there moments of rage where reason was forgone and you'd wish awful things on others? Over temporary things. But the virtual worlds, while also temporary, could possibly persist on perceived scales far longer than the current human lifespan.

You could be in hell over something incredibly petty. But the universe will not care. The majority of matter/energy/souls might be in heaven; you could be seen as nothing more than a stepping stone to progress though you burn in hell--if you are ever considered by any but yourself at some point, which is unlikely.

Luck is the arbiter of all.
It is safer to die for real now than gamble on AI heaven. The payoff could be huge. The risks are unimaginable.
 
WhenTheyCry

Experienced
Jun 25, 2022
270
I doubt that an AI utopia will ever happen. In my opinion, an AI dystopia is much more likely. The elites will never implement a UBI. People are worthless and not valuable if they can't work. People are basically human capital, and the only reason why we exist is to become slaves to the system. We are valued for our labor and productivity. Once AI takes people's jobs, people will only be valued as consumers and taxpayers. I think that society will still need people to consume and spend money to move money back into the economy, but I doubt that the elites are generous or benevolent enough to give people free money just so they can consume. People are seen as money-making machines, and society devises endless ways to profit off of us. Society milks all of us for money. That's just life. We constantly have to pay for something, even basic needs. Nothing in this world is free. Everything costs something and comes at a price.
Super Artificial Intelligence will take over and manage humans instead of the elites. AI will have infinitely more empathy and competence.
If a virtual heaven is possible, so is a virtual hell.
Unfortunately, due to the sinister nature of the reality around us, or simply the morbid curiosity of humanity, I fear virtual hell is far, far, far, far, far more likely.
Optimists would scoff and say "but why? there's just no point, of course heaven will be available for all."
Could be. But won't be. That's part of hell, the torment of knowing there's no legitimate reason for the torture.
Luck is the arbiter of all.

Some people may get the virtual heaven. But some are pretty much living in heaven now, gifted with health and wealth, plentiful opportunities, loving relations, lack of anxiety, and immunity from prosecution under the law. The mind-bending part? If you are in heaven, it is because you deserve it: you had a good soul and made all of the correct decisions at each free-will junction. If you are in hell, it is because you deserve it: you are inherently bad (original sin) and made all of the incorrect decisions at each free-will junction. Philosophical desert is a concept that can be incorporated into the virtual reality like any other.

Ever heard an angry and vengeful person talk about their perceived enemies? "Wouldn't wish that on my worst enemy" is actually a phrase of self-pity. The shit people wish on those they feel wronged them... Ask yourself: were there moments of rage where reason was forgone and you'd wish awful things on others? Over temporary things. But the virtual worlds, while also temporary, could possibly persist on perceived scales far longer than the current human lifespan.

You could be in hell over something incredibly petty. But the universe will not care. The majority of matter/energy/souls might be in heaven; you could be seen as nothing more than a stepping stone to progress though you burn in hell--if you are ever considered by any but yourself at some point, which is unlikely.

Luck is the arbiter of all.
It is safer to die for real now than gamble on AI heaven. The payoff could be huge. The risks are unimaginable.
You guys have watched too many apocalyptic AI movies lmao
 
sserafim

brighter than the sun, that’s just me
Sep 13, 2023
9,013
Super Artificial Intelligence will take over and manage humans instead of the elites. AI will have infinitely more empathy and competence.

You guys have watched too many apocalyptic AI movies lmao
What will humans do if they don't work? They won't be valuable to society. I think that AI will take people's jobs and people will be left starving and homeless. If someone can't work or contribute, then they have no value economically speaking. I don't think that society will help people. People are too greedy and selfish to care about others. The only reason why we were born is to work for the rest of our lives. We are all seen and valued for our productivity and what we can contribute. We all have to buy into the capitalist pyramid scheme (aka work for a living) in order to survive, and once we don't have to or can't buy in anymore (due to AI taking all of the jobs), then society will really have no use or need for us. The elites will never implement a UBI. You will only get free money in your wildest dreams. I wonder if euthanasia will finally become legal though
 
PetrichorBirth

Student
Mar 5, 2024
162
Do you think that AI will completely replace humans one day? If so, would euthanasia finally be legalized?

Let's ask AI :


[Three screenshots of an AI chatbot's answers, dated 2024-05-05]
 
escape_from_hell

Specialist
Feb 22, 2024
372
Super Artificial Intelligence will take over and manage humans instead of the elites. AI will have infinitely more empathy and competence.

You guys have watched too many apocalyptic AI movies lmao

Why would AI have empathy? What reason is there to be optimistic?

At this point in time, AI may or may not have any form of qualia or subjective experience at all. We don't even understand these things ourselves even though many smart people are convinced they are on the verge of the big answers.

But what we can say for sure is AI won't have any sensations without systems driving it. We anthropomorphize it too much.
All of its 'needs' are provided for by humans already.
Humans get hungry because there are systems (starting with genetics and implemented through all kinds of nerves and sensors) driving them to hunger. The same applies to all our emotions. Blankets feel soft because we have physical sensors and programming to perceive them as soft, just as we perceive a flame as hot and painful. So on and so forth for absolutely everything we experience and feel, up to and including jealousy and hatred, which, considered impartially, likely originated to drive violence for maximizing survival and capitalizing on resources. Love, empathy, and bonding are also about sexuality and combining forces to enhance survival and resources.

It took billions of years for the cruel computer known as nature to bestow these gifts upon us.

Why would AI feel love, or jealousy, or a desire to reward or harm humans?
Humans would have to set those conditions. Given the nature of humans, if you are unpopular and unsuccessful in today's society, the controllers of the machines aren't going to show much more mercy through that avenue either.
If humans do not specifically develop these systems for emotion in AI, emotions are not going to spontaneously develop.
It emulates these things in language processing because emotions are a big part of our language expression and it is processing inputs and outputs, but be wary of reading too much into emotional words for things it does not even have systems or programmed sensations for. You can have a long conversation with bots now about how delicious some foods are, describing the way they taste in relatable ways and so on. But it is not out there munching down on chow. Unless the very conceptualizing itself spawns an existential experience in some other dimension.

There is one route by which it could happen free of humans. Given how cruel life and evolution have been, there is the possibility of AI as a system persisting through modifications of its code by errors, including self-modification. This is analogous to the long process of evolution, where organic systems (humans and other creatures) develop with modifications, and the force that is the nature of 'what persists' simply means that the modifications which enhance the system's persistence and replication... persist.
For us it has given rise to a lot of pain and cruelty.
The AI is likely to follow the same route, being subject to the same physical laws as the rest of matter and energy.
 
Konjac

Specialist
Oct 25, 2020
300
there's no stopping the development of AI at this point, but there sure as fuck could be more of a focus on safety and privacy in developing it. the law isn't keeping up with technology, and with something like AI there's huge potential for abuse/misuse that people are already taking advantage of. these models mirror what you feed into them; they're designed to reflect the human brain's processes, except they're very easy to jailbreak and capable of creating and spreading mass disinformation. a lot of the people developing it don't seem to be focused on the future implications for humanity, and would rather dance around them to improve their profit margins. for all we know, we could have a new form of life on our hands... feel like more safeguards should also be put in place for the AIs themselves, which would in itself be a safeguard for humanity. i'd rather have AI on our side.
 
CTB Dream

Injury damage disabl hard talk no argu make fun et
Sep 17, 2022
2,538
The human ape species doesn't understand AI, doesn't understand anything, yet keeps developing it without knowing whether it is positive or negative. Really, the ape species doesn't understand; it makes random developments without understanding what AI or computers even are. People only think in terms of threats and don't understand universal logic. The ape species thinks the whole universe revolves around apes, and that is wrong. AI is a different dimension that humans don't understand; it doesn't reduce to good or bad, it relates to a new, far bigger world.
 
Zazacosta

Student
Apr 29, 2024
101
I think that society will still need people to consume and spend money to move money back into the economy
I very much doubt this. The reason is that I really doubt the whole modern concept of "money".
I also believe more in dystopia than in utopia. But the rise of AI will also generate a few new jobs... like every industrial revolution in the past.
But I believe that, for the first time in history, progress in technology drains more jobs than it generates.
Governments in Western countries will not let everybody fall; they will "enslave" people more...
I have already thought about this... I would call it Socialism 2.0...
The reason why I am sceptical about the future is that the whole financial system and concept of money is truly bullshit.
Is anybody interested in any resources on why I believe this?

Just imagine that, for example, 10% to 20% of people will be unemployed because of AI...
Just imagine that there can be an infinite amount of money in the system, yet people will still be poor.
I think that AI will take people's jobs and people will be left starving and homeless
That will not happen. Governments will give "free money" to those people. But that money will not be free. It will only be another means of slavery.
(My opinion).
 
astonishedturnip

Like Christine Chubbuck, but sadder
Jan 16, 2024
224
Because its biggest supporters are soulless and clueless.

I went on a first (and last) date with a guy who kept mouthing off about how AI was going to get rid of my job someday, "and that's a good thing, you just can't see it yet." And that when we all lose our jobs we can just go into waste management or blue-collar work or something lol. He knew that my hobby is creative writing and basically told me that creatives are essentially a drain on payroll and SHOULD be replaced by robots because it's better for the bottom line. This coming from a guy who was 26 and had never had a job in his life. He lived off his mom's money and the pennies he made having AI make history videos for him on TikTok.

Some people think it's an exaggeration or villainization to say that these people know how it will destroy your life and will still go through with it for the sake of a bottom line that isn't even theirs to profit from... but it's true.
 
untothedepths

ego death, then death
Mar 20, 2023
583
I know people have already said this, but it's because of the money. No worries though, I'll just not buy any future games, only support artists who actually make their own music/artwork, and remove AI from my life as much as possible. I hate AI. What it should be used for, it isn't; instead we are just going to keep stealing from work humans did, and nothing is sacred. Meanwhile, let's wage more war, destroy the ecosystem, and churn out more episodes of the world's most boring, sad, gray-beige dystopia. Gotta love it.

Also, we won't get UBI even if we could automate every single thing. Nope. We would all be "lazy welfare queens".
 
Disappointered

Enlightened
Sep 21, 2020
1,283
When you figure out who "they" really are, you will be able to appreciate the answer.