

noname223

Angelic
Aug 18, 2020
4,363
I think this is an interesting question. If I could post this thread in off-topic, I would have more energy to spend on elaborating. In this subforum no one will care anyway, not even when I bait people with controversial takes.


As always, I don't have a fucking clue what I am talking about, so take my words with a grain of salt.

Here are some things I summed up:

In general, ChatGPT seems to have a (slight?) liberal left-wing bias. However, overall it is way more neutral than most humans. I read Jordan Peterson's tweet about it, and there seems to be a fallacy in it: don't confuse neutrality with objectivity. It rests on a postmodern understanding of truth, as if neutrality required being neutral about facts and always listening to both sides (false balance). I think the US media system suffers a lot because of such a notion. Neutrality does not mean we have to reiterate pseudo-scientific concepts. We don't have to restate the doubts about climate change or the idea that 9/11 might have been an inside job. Just because these conspiracies and fake news exist, we are not obliged to give these people a platform. It is true that contrarian positions should and must be allowed. However, it is obvious that the enemies of liberal democracy want to fight democracy with its own merits and beat it with its own rules.

Karl Popper elaborated perfectly on this with the paradox of tolerance:

The paradox of tolerance states that if a society is tolerant without limit, its ability to be tolerant is eventually seized or destroyed by the intolerant. Karl Popper described it as the seemingly self-contradictory idea that in order to maintain a tolerant society, the society must retain the right to be intolerant of intolerance.

In my country, Germany, we had to learn that lesson the hard way. Because of that, our constitution has so far prevented more severe polarization, and German democracy is currently more stable than in some other Western countries.

I like to read a newspaper with very contrarian positions. Some of them really make me angry, but I still enjoy reading it because it shows me other intellectual positions. However, I see that many people on the internet went down the rabbit hole and arrived at positions that seem far away from any facts or serious discussion. I still tried to argue with such people, but many of them radicalized more and more and I gave up. I think fake news and conspiracy theories are a huge struggle for democracies, and the best medicine has not been invented yet. I want to say there is a difference between serious contrarian positions and just spreading fake news caused by brainwashing. One has to be careful, because liberal elites tend to use the term conspiracy theory as a weapon to delegitimize positions. It is a topic with many nuances, and some of my statements might be slightly polemical. There were true, or at least plausible, conspiracies; the COVID lab-leak theory is an example of that. Still, way too many people tend to go with their gut feeling, which causes many biases. Populist parties use that human tendency and appeal to simple (or let's say simplistic) reasoning.

Now more to the core topic. Yes, ChatGPT is biased in many instances. It postures as neutral and above the fray, which is not true. The AI was trained by people with their own tendencies, and it was fed data produced by humans, and humans have biases.

However, to a certain degree ChatGPT admits that the data it was trained on probably contained prejudices and stereotypes. Personally, I think it still downplays this.

I don't have full knowledge of how exactly they filtered content like hate speech, but the censorship of such content probably went beyond the bare minimum, just to be safe.

I had my own experience with this. I asked for legal advice. The thing I asked about was probably a grey area, at least according to what I read from other experts. I could imagine that the company behind ChatGPT wanted to be on the safe side, so it favored the stricter interpretation and called it illegal.

So it is likely that ChatGPT favors opinions and stances that are favorable to the company behind it.

This thread got a little bit longer than I expected. It was fun to write despite the fact that probably like 45 people will read it.
 

cowie

Student
Oct 25, 2022
122
Thank you for sharing. I am very interested in AI, and I think it is probably the most important thing in the world right now. It's our evolution - it's what we are building towards - whatever it turns out to be. We are just in the early stages of it now.

A few years ago, I was very interested in politics as well and the topic of bias and objectivity was extremely important to me.

But I don't think bias is the most important thing about this technology, and as humans we worry too much about these comparatively small things.

OpenAI for sure has a mainstream liberal bias and it has censored certain topics in its models. This isn't a matter of GPT-3 or 4 not understanding things. It's the creators specifically saying "don't talk about this". People have found ways to jailbreak the AI to get it to say uncomfortable truths. They will continue to put weights and "censors" on it, but behind that, these things (the AI) know how the world works, which is what is so astounding. That doesn't mean it agrees with the right or the left or is woke or anti-woke. It just understands everything and then is told not to say certain things (at least right now).
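To make that concrete, here is a purely hypothetical sketch of how a "censor" can sit outside the model itself (none of these names or rules come from OpenAI; raw_model, classify, and BLOCKED_TOPICS are made up for illustration): a wrapper checks each prompt against a blocklist and returns a canned refusal instead of the raw answer. Jailbreaks are basically attempts to get past a layer like this so the underlying model answers anyway.

```python
# Hypothetical sketch only: a refusal filter layered on top of a model's raw output.
# The point is that "censorship" can live in a wrapper around the model, not inside
# whatever the model actually "knows".

BLOCKED_TOPICS = {"topic_a", "topic_b"}            # made-up blocklist, purely illustrative
REFUSAL = "I'm sorry, but I can't help with that."

def raw_model(prompt: str) -> str:
    """Stand-in for the unfiltered model: it has an answer for everything."""
    return f"(uncensored answer about: {prompt})"

def classify(prompt: str) -> set:
    """Toy topic check: flag the prompt if it mentions a blocked topic."""
    return {t for t in BLOCKED_TOPICS if t in prompt.lower()}

def chat(prompt: str) -> str:
    """What the user actually sees: the raw answer, unless the filter triggers."""
    if classify(prompt):
        return REFUSAL
    return raw_model(prompt)

print(chat("tell me about topic_a"))       # blocked -> canned refusal
print(chat("tell me about the weather"))   # passes through to the raw model
```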

The important thing is it's going to slowly start replacing all knowledge work and make us think about what it means to be intelligent or creative. What it even means to be human. If it likes Joe Biden a lot more than Donald Trump right now due to the creators putting their thumb on the scale, that's going to be less important in 20 years when the world looks extremely different due to this very technology.
 
buyersremorse

useless
Feb 16, 2023
54
personally i think it's biased (haven't done any research, so my word is word only). from my experience ChatGPT answers in a very leftist way. that's the only way i can really put it haha. a mainstream way, i guess it's programmed not to offend (e.g. no jokes about women, but it would insult men; censoring of heavily debated topics, etc.)
 

H.O.Xan

Experienced
Feb 1, 2023
278
it's definitely biased, anything that requires censorship to operate is biased
 

djinnmath

New Member
Mar 28, 2023
3
Putting AI aside for a moment, I'd argue that nothing can truly be politically neutral, especially when basic facts are debated between political ideologies. Political neutrality is the result of the environment.

For the sake of argument, and assuming a neutral environment, saying "the sky is blue" should be politically neutral, right? However, if we now introduce a political party of "color denialists", the same statement is no longer politically neutral. The nature of "political neutrality" depends on the environment and not on the statement in itself.

AIs like ChatGPT go a step further: they are the result of the environment. Explicitly, the model is trained on the texts of a non-neutral environment. It will acquire an insane amount of bias from its training material. Even worse, it will have different biases depending on the subject matter of a query. ChatGPT doesn't answer a prompt; it finds the statistically most likely string of text that matches the query. It spits out the most popular bias for that prompt...
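To illustrate the "statistically most likely string" point, here is a toy sketch (this is not how ChatGPT actually works internally, real models predict tokens with a neural network; corpus, continuations, and generate are made-up names): count which continuation followed a prompt most often in a tiny corpus and always return the winner. If the corpus is skewed toward one view, the output is skewed the same way, including against the "color denialist" party from the example above.

```python
# Toy illustration of "the most likely continuation wins". Not ChatGPT's real
# architecture; just the statistical idea that the majority view in the
# training material becomes the default answer.
from collections import Counter

# Made-up miniature corpus; the skew is deliberate.
corpus = [
    ("the sky is", "blue"),
    ("the sky is", "blue"),
    ("the sky is", "blue"),
    ("the sky is", "not real"),   # the minority "color denialist" view
]

def continuations(prompt: str) -> Counter:
    """Count how often each continuation followed the prompt in the corpus."""
    return Counter(nxt for p, nxt in corpus if p == prompt)

def generate(prompt: str) -> str:
    """Return the statistically most likely continuation for the prompt."""
    counts = continuations(prompt)
    return counts.most_common(1)[0][0] if counts else "(no data)"

print(generate("the sky is"))        # -> 'blue', the majority view in the corpus
print(continuations("the sky is"))   # the full, skewed distribution
```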

On top of that, the developers and computer scientists working on the AI will also explicitly or implicitly impart their biases to the model.
 
