Forever Sleep

Earned it we have...
May 4, 2022
14,460
DarkRange55 made a post about police brutality and ICE agents. In particular- the terrible events in Minneapolis. It got me thinking that maybe robots would make better police, if they could operate solely according to the letter of the law and use absolutely appropriate force in every situation. They simply couldn't be prejudiced, or be accused of prejudice. But then, I imagine criminals might go all out to destroy them, more so than they would a human officer.

Not that it's probably even possible, but I wonder how the world would change if certain jobs/ 'services' were standardized. Teaching might be another example. If they figured out the optimal way to teach maths, say, and everyone got the same education, I wonder how that would change things. Maybe certain teaching styles fit certain people better, but I suppose a robot could be programmed with a few.

But then, I suppose the richer/ more powerful in society might get annoyed if they couldn't give their kids an edge.

I suppose there are jobs where it's better to have real human connections. Teaching may actually be one. I suppose care jobs would feel more impersonal if they were done by robots.

Sometimes I think it would be nice to have a robot companion- like in the film 'Moon'. I suppose there's the appeal that they would presumably remain nice and helpful and not turn on us. Maybe we feel drawn to them because they seem safer than risking getting acquainted with human personalities, but then- would it just get boring? Maybe they would be programmed to be nice but then, they wouldn't appreciate whether we were nice or not. They presumably wouldn't/ couldn't care.

Would you mind being policed, taught or cared for by androids? I'd worry they could go wrong, I think. Plus, there'd ultimately be humans behind their programming. Not sure they could be trusted.
 
Agon321

I use google translate
Aug 21, 2023
1,648
These types of robots are created by corporations in collaboration with the government. Governments and corporations hate humans. So there's no way I'm going to trust robots that are supposed to "watch over me."
Corruption at the highest levels of government is rampant, and I see no reason to believe their intentions would be good for society as a whole.

I really like the idea of AI and robots, but they're tools. And in the case you mentioned, they're tools in the hands of those in power.
 
  • Like
Reactions: venerated-vader, Dejected 55, Forveleth and 1 other person
StoneCellaiver

Member
Mar 14, 2025
50
It's ironic how they'll probably manage to replace more white-collar workers than blue-collar ones if they ever become mainstream. I don't see them lasting long in the first place because of how many costs are involved; the return on investment doesn't look good either.
Being taught by a non-human going off a script would feel terrible though- the fact that I wouldn't be able to develop a relationship with them, or connect more with the people they know. The closest thing they'd probably be better at is customer care or customer support, but that feels like about it- maybe politics, or something involved in finance or trade. Anything related to bureaucracy too, maybe, since they'd have the benefit of not being able to accept bribes.
I believe a lot of financial trading right now is influenced by LLMs, and a lot of short selling seems to work with them. I think janitors and other staff would still remain, since it probably wouldn't be realistic or profitable to keep a robot just to clean up someone's mess.
I have heard that a lot of the upper class have been sending their children to institutions without any electronic devices and have been trying to distance them from the digital world, but I'm not sure if this is happening elsewhere. They seem to be distancing their children from the internet.
 
Last edited:
  • Informative
Reactions: Forever Sleep
noname223

Archangel
Aug 18, 2020
6,607
I think that was Dejected 55 😅
It is a psy-op. They are literally the same. Both are AI generated.

By the way, @Forever Sleep- yesterday I considered posting exactly the same question. But I don't have the time to comment on the thread right now.

But yes, I think AI is superior in a lot of things. But it has to be used by someone skilled in it. Prompting has become a skill.
 
  • Like
Reactions: Forever Sleep
NormallyNeurotic

Everything is going to be okay ⋅ he/him
Nov 21, 2024
719
DarkRange55 made a post about police brutality and ICE agents. In particular- the terrible events in Minneapolis. It got me thinking that maybe robots would make better police, if they could operate solely according to the letter of the law and use absolutely appropriate force in every situation.
The issue is that the law itself is flawed, and to fix said laws, we'd need to take into account a type of moral nuance that AI cannot handle.

If someone in a state of psychosis kills their caretaker, that is murder. Murder is illegal, therefore the person must be apprehended. Seeing as they murdered someone, deadly force can legally be used on them if they try to escape, because they may harm someone else. The AI shoots the psychotic person when they run.

A person is found to be in possession of drugs. They are arrested, tried, and jailed. They only got into drugs because their mother was on drugs while they were in utero, giving them a predisposition to addiction.

These are issues that human cops already have, so why do you think an unfeeling robot would be better? The whole issue with law enforcement today is that they are unfeeling and too 'to the letter' on the law (*when it serves them) in situations that need nuance and compassion.
 
  • Love
Reactions: Forever Sleep
Forever Sleep

Earned it we have...
May 4, 2022
14,460
The issue is that the law itself is flawed, and to fix said laws, we'd need to take into account a type of moral nuance that AI cannot handle.

If someone in a state of psychosis kills their caretaker, that is murder. Murder is illegal, therefore the person must be apprehended. Seeing as they murdered someone, deadly force can legally be used on them if they try to escape, because they may harm someone else. The AI shoots the psychotic person when they run.

A person is found to be in possession of drugs. They are arrested, tried, and jailed. They only got into drugs because their mother was on drugs while they were in utero, giving them a predisposition to addiction.

These are issues that human cops already have, so why do you think an unfeeling robot would be better? The whole issue with law enforcement today is that they are unfeeling and too 'to the letter' on the law (*when it serves them) in situations that need nuance and compassion.

I absolutely agree. The law would have to be fully explored to ensure every nuance was covered.

In your examples though- the psychotic patient who just killed their caretaker. Both the human and the AI would presumably first find out whether they still had a weapon on them. If they did, and they were running towards a group of other patients or nurses- presumably both would either taser them or, in a worst case scenario, shoot them.

Should the human officer give them more empathy if they are a genuine and immediate risk to others? Why is that the better solution if they are likely to kill someone else? Ok, it wouldn't be their fault if they were psychotic but- do the other people deserve to die too? Shouldn't they be protected? Or- is it more PC to let the psychotic person have their freedom? (And possible killing spree.)

It's not about blaming them. It's not about blaming the officers either- it's an almost impossible situation. The facility they were housed in would eventually bear the blame, I imagine.

Would a computer make a better assessment of the situation than a human though? I don't know really. I'm guessing they could have more tech on board. How fast is the person running? Can I stop them via non-lethal means? Not sure what the ideal outcome is really. It's a risk either way. Not sure a human would make the 'better' choice in a tense situation though. Humans panic. But a person sprinting with a knife towards others is a person sprinting with a knife towards others- in that moment- there really isn't time to judge what they 'deserve'. They're potentially about to kill again, and the recent corpse they created tends to suggest they may well attack others. The choice, I imagine, would be a no-brainer to both.

It works in reverse though. Police kill and maim people when they are irritated. There have been cases of elderly residents in care homes being pepper-sprayed and tasered because they wouldn't drop a dinner knife. I imagine there's every likelihood they had dementia and didn't fully understand what was going on. I suppose the benefit of sending a robot in is that it wouldn't get hurt if things did kick off. It couldn't get irritated or insulted if someone swore at it or spat at it. It could presumably judge the correct amount of non-lethal force to use to subdue someone.

The problem with people, I think though- is they can easily make the wrong call just as much as the right one- when they are under pressure, scared, angry etc.

What's the person who got addicted to drugs in the womb doing? Why are the police there in the first place? Not too sure what the argument is. The human officers would let them off the charge because it isn't their fault they're an addict? Will that help them? Surely either the robot or the human would arrest them for possession. Then the courts- be they AI or human- would take mitigating circumstances into account.

Again though- shouldn't the law be universal? Should a pretty woman be let off a parking ticket in one state while a guy gets charged somewhere else? Will all courts show leniency towards the addict from birth, or does it depend on the jury and the legal teams involved? Would any jury have found O.J. Simpson innocent? Or only that legal team and that jury? We can make mistakes just as easily as we can make good decisions because we are so swayed by emotions. At every stage- both in policing and through the courts.

I'm not really suggesting AI could do a better job. It probably isn't advanced enough. But if it were- I find it an interesting thought experiment. If everything were more standardized- wouldn't the law be fairer if it was applied the same to everyone?

Again, that's not to say mitigating circumstances wouldn't be taken into consideration but- that's the legal side of things. An arrest is an arrest- no? If the criminal is biting, punching, spitting, kicking, waving around a weapon- they are going to have a rougher time, whether with a robot or a human, than if they comply.
 
NormallyNeurotic

Everything is going to be okay ⋅ he/him
Nov 21, 2024
719
What's the person who got addicted to drugs in the womb doing? Why are the police there in the first place? Not too sure what the argument is. The human officers would let them off the charge because it isn't their fault they're an addict? Will that help them? Surely either the robot or the human would arrest them for possession. Then the courts- be they AI or human- would take mitigating circumstances into account.
I honestly didn't word it too well because my points are sort of related to anti-carceral mental health care and proper de-escalation methods. I think reading some articles about those by people better versed in this than me would be best.

But something to point out regarding the patient with psychosis: I have had a psychotic break before, and most people in that state aren't fully gone.

[Hidden content]
 
Last edited:
  • Like
Reactions: Forever Sleep
Forever Sleep

Earned it we have...
May 4, 2022
14,460
I honestly didn't word it too well because my points are sort of related to anti-carceral mental health care and proper de-escalation methods. I think reading some articles about those by people better versed in this than me would be best.

But something to point out regarding the patient with psychosis: I have had a psychotic break before, and most people in that state aren't fully gone.

[Hidden content]

That does make sense and to be fair, I was thinking that about the care home incident. I doubt the resident would have felt much calmer with a massive robot looming over them!

But then, if they know that a softly-softly approach does generally work better with someone psychotic- do you think there's nothing they could create to emulate that? If there were robots around in everyday life, including cute, non-threatening ones- would a person, even in a psychotic episode, feel threatened by one?

Ultimately though- if they weren't locked down somewhere they couldn't injure others, and they did still have a weapon on them- I imagine most protocols would prioritize public safety over theirs.

It would be an awful outcome to kill them but then- random strangers have been killed by people suffering psychosis. I can remember two terrifying cases where I used to live. Not attempting to stop them is Russian roulette to an extent, I imagine.
 
NormallyNeurotic

Everything is going to be okay ⋅ he/him
Nov 21, 2024
719
But then, if they know that a softly-softly approach does generally work better with someone psychotic- do you think there's nothing they could create to emulate that? If there were robots around in everyday life, including cute, non-threatening ones- would a person, even in a psychotic episode, feel threatened by one?
At least for those with more trauma-induced psychosis, it's not the outward gentleness that matters, but the internal safety, so to speak.

Like I said, the robot does not care. It does not feel—not pity, nor compassion. Even if I knew a robot would not hurt me, I still wouldn't feel "safe." It takes so much care to bring someone out of a state like that.

A robot could possibly go around a mental ward and bring pills to patients who sometimes attack human staff. But to de-escalate something like this? No. Absolutely not. It would do more harm than good in many cases.
 
  • Like
Reactions: Forever Sleep
Dejected 55

Visionary
May 7, 2025
2,513
AI and robots are all created by people... which means they have all our flaws, and whatever laws or rules they would govern by would be the same ones people govern by... so if people can abuse power, the AI and robots would be similarly designed to abuse that same power.

Also, fulfilling the letter of the law isn't always what you want. You don't want people making bad, selfish interpretations of the law to suit themselves... but you don't want strict letter-of-the-law interpretation in all things either.

You can believe that IF it were possible to have AI/robots as ICE agents carrying out these "missions", they would be programmed to do exactly what the human ones are doing already. They would just be able to kill more people more efficiently. I keep saying, these ICE agents aren't going rogue... they are doing what the people in charge ask them to do. The cruelty is the point.
 
