
noname223

Archangel
Aug 18, 2020
6,937
I think I sometimes do that. Sometimes I get faster answers. But I am a climate and environment monster for doing that. It is really energy inefficient.

They say ChatGPT should not be used that way. But I think many consumers do that.

Something I don't understand: if AI should not be used that way, why do so many search engines integrate AI? There are so many browsers that force consumers to use AI that way.

I wrote 9 pages about my mental health for my therapist. I used AI for help, and I wish she would put a lot of energy into fact-checking my theories. There might be new narratives I tell myself now. But she is so lazy she probably won't read it critically.
 
  • Like
Reactions: Forever Sleep and katagiri83
azo

Wizard
Jun 20, 2023
673
I use Perplexity, which is an LLM-powered search engine that always provides sources at the end of every line. It tends to be accurate in my experience (i.e. actually gets answers from the sources, which I usually check out anyway).

I believe the environmental impact comes from training the models—since this relies on large amounts of data, it uses a lot of compute power. When you use an LLM, you're using it in inference mode, which is to say that it's already been trained, and you're using the "finished" product. So your prompts do not have a very significant environmental impact.
 
  • Hugs
Reactions: GlassMoon
Forever Sleep

Earned it we have...
May 4, 2022
15,354
No. I'm trying to avoid AI mostly. I'm sure it is already being introduced whether we want it or not though. I noticed a few months back that Google was sometimes getting things completely wrong.

I'm pretty naive though. I don't really understand the difference between a traditional search engine and AI.

My phone keeps pushing it though. I keep having to opt out.
 
  • Hugs
Reactions: GlassMoon and noname223
Eriktf

Elementalist
Jun 1, 2023
825
I mostly use Brave Search, but sometimes Google.

When I use AI, I always chat with it and get it to explain follow-up questions.
On PC, mostly Leo in Brave; on phone, Gemini.
 
Grog

I am a defect.
Jun 3, 2025
499
I use ChatGPT every day. I usually use it instead of a search engine like Google. I ask it everything, tbh.
 
  • Like
  • Hugs
Reactions: Eriktf and GlassMoon
sheeplit

Member
Mar 8, 2023
47
I do use it to search. For important things, I ask for links and go to them. For low-risk stuff, I don't mind having it summarize or describe things for me.

My main use case is searching around an idea. For example, asking an LLM to list as many names as it can find of people who have done research on the subject of fear, including basic background on the people listed, their works, which of them might be associated with others on the list, collaborations, and other pertinent info I'm interested in. I also ask for links to sources so I can verify. Traditional search engines cannot accomplish this without LLMs. With LLMs: one prompt, one curated answer, no excess.
 
bleeding_heart_show

Student
Dec 23, 2023
195
I cannot stomach the idea of it.
 
Cauliflour

I'm the doodler, I make terrible doodles.
Mar 24, 2025
717
Something I don't understand: if AI should not be used that way, why do so many search engines integrate AI? There are so many browsers that force consumers to use AI that way.
AI is the new buzzword that makes idiot investors throw money at you. That's all.
I believe the environmental impact comes from training the models—since this relies on large amounts of data, it uses a lot of compute power. When you use an LLM, you're using it in inference mode, which is to say that it's already been trained, and you're using the "finished" product. So your prompts do not have a very significant environmental impact.
You still have to cool the servers though. Processors heat up under heavy load (i.e. the energy it takes to process a single prompt), and these big data centres need to be cooled efficiently, so they use loads and loads of water that can't be recycled (I assume because it has chemicals in it now).
 
BradGuy123

Specialist
Jul 6, 2025
330
I use AI a lot. I started with ChatGPT, but it began limiting me, saying that I had to wait until x time to ask anything else. So I started using Copilot chat and have not had any issues.
 
azo

Wizard
Jun 20, 2023
673
You still have to cool the servers though. Processors heat up under heavy load (i.e. the energy it takes to process a single prompt), and these big data centres need to be cooled efficiently, so they use loads and loads of water that can't be recycled (I assume because it has chemicals in it now).
Are you sure that processing a single prompt in inference has a large environmental impact? The compute involved is orders of magnitude smaller than the compute involved in a single training epoch. How would that impact compare to doing a couple of Google searches? Or switching on a light bulb? I've heard vegans point out that eating a burger has a far greater environmental impact than something like a thousand prompts.

I'm just skeptical of environmental arguments against AI, because plenty of other things have an environmental impact that people don't really note, and unless we're going to explicitly quantify all of those impacts and adjust our behaviour accordingly, it feels a little baseless to me.
 
  • Like
Reactions: avoid
avoid

Jul 31, 2023
443
  • Article:
    We estimate the median Gemini Apps text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO₂e), and consumes 0.26 milliliters (or about five drops) of water — figures that are substantially lower than many public estimates. The per-prompt energy impact is equivalent to watching TV for less than nine seconds.
  • Article:
    [Google] queries vary in degree of difficulty, but for the average query, the servers it touches each work on it for just a few thousandths of a second. Together with other work performed before your search even starts (such as building the search index) this amounts to 0.0003 kWh of energy per search, or 1 kJ.
  • Article:
    In an interview, Alphabet's Chairman John Hennessy told Reuters that having an exchange with AI known as a large language model likely cost 10 times more than a standard keyword search, though fine-tuning will help reduce the expense quickly.
  • Article:
    We experiment with several recent small and large language models:
    1. Phi-3-7B, a Small Language Model (SLM) with 7 billion parameters.
    2. Claude 3.5 Sonnet (2024-10-22), the latest model (≈175B parameters) from the Claude 3.5 family offering state-of-the-art performance across several coding, vision, and reasoning tasks.
    3. Gemini 2.0 Flash: the latest/most advanced Gemini model. Other Google models such as Med-PaLM models (540B), designed for medical purposes, were not publicly available.
    4. ChatGPT (≈175B) and GPT-4 (≈1.76T), a "high-intelligence" model.
    5. GPT-4o (≈200B) providing "GPT-4-level intelligence but faster" and the GPT-4o-mini (gpt-4o-2024-05-13) small model (≈8B parameters) for focused tasks.
    6. The latest o1-mini (o1-mini-2024-09-12) model (≈100B), and o1-preview (o1-preview-2024-09-12) model (≈300B) with "new AI capabilities" for complex reasoning tasks.
  • Article:
    Model            Parameters (B)   Energy per response (J)   Energy per response (Wh)
    Gemma 2 2B              2                  40.42                    0.011
    Mistral 7B              7                  43.75                    0.012
    Phi 3 Small             7                  44.89                    0.012
    Llama 3.1 8B            8                  51.12                    0.014
    Phi 3 Mini              4                  54.59                    0.015
    Mistral Nemo           12                  58.62                    0.016
    Gemma 2 9B              9                  61.35                    0.017
    Phi 3 Medium           14                  85.27                    0.024
    Mixtral 8x7B           47                 121.49                    0.034
    Gemma 2 27B            27                 126.26                    0.035
    Llama 3.1 70B          70                 306.96                    0.085
    Mistral 8x22B         123                 564.00                    0.157
    Mixtral Large         141                 752.29                    0.209
    Llama 3.1 405B        405                3352.92                    0.931
  • Article:
    But as data centers get hotter, water cooling alone doesn't cut it, says Tony Atti, CEO of Phononic, a startup that supplies specialist cooling chips. […] The chips inside servers suck up around 45% of the power in a data center. But cooling those chips now takes almost as much power, around 40%.
Yesterday, Google released some figures on the energy consumption of Gemini Apps text prompts.

The median Gemini Apps text prompt uses 0.24 Wh in 2025 (article 1), and a Google search used 0.3 Wh in 2009 (article 2). The gap of 16 years between these datapoints makes it hard to compare them. Though in 2023, Alphabet's Chairman said that an LLM prompt likely costs 10 times more than a standard Google search (article 3).

The energy costs of some publicly available LLMs range from 0.01 to 0.93 Wh (article 5). If I plot these datapoints on a graph and insert a linear trendline, I find that GPT-4, with 1.76 trillion parameters (article 4), has a per-response energy cost of 3.86 Wh. Though most people would probably use a smaller ChatGPT LLM such as o1-mini-2024-09-12, which has an estimated per-response energy cost of 0.126 Wh. And I believe these figures don't account for the energy needed for cooling, which is said to be almost equal to the compute energy itself (article 6). So, after doubling it to 0.252 Wh, it's very similar to Google's figure of 0.24 Wh for Gemini Apps.
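For what it's worth, the arithmetic above can be reproduced in a few lines. This is a sketch under two assumptions: the GPT-4 figure comes from an ordinary least-squares trendline over the table in article 5, and the o1-mini figure from linear interpolation between the neighbouring models in that table (my reading of the method, not necessarily the exact one used):

```python
# Per-response energy data from article 5: (parameters in billions, Wh)
data = [
    (2, 0.011), (7, 0.012), (7, 0.012), (8, 0.014), (4, 0.015),
    (12, 0.016), (9, 0.017), (14, 0.024), (47, 0.034), (27, 0.035),
    (70, 0.085), (123, 0.157), (141, 0.209), (405, 0.931),
]

def linear_fit(points):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def interpolate(points, x):
    """Linearly interpolate between the two neighbouring datapoints."""
    pts = sorted(set(points))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (x - x0) * (y1 - y0) / (x1 - x0)
    raise ValueError("x outside data range")

a, b = linear_fit(data)
gpt4_wh = a * 1760 + b               # GPT-4 at ~1.76T parameters
o1_mini_wh = interpolate(data, 100)  # o1-mini at ~100B parameters

print(f"GPT-4 trendline estimate: {gpt4_wh:.2f} Wh")
print(f"o1-mini estimate: {o1_mini_wh:.3f} Wh, "
      f"doubled for cooling: {2 * o1_mini_wh:.3f} Wh")
```

This reproduces the 3.86 Wh and 0.126 Wh figures quoted above to within rounding.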

But let's not forget the energy cost of training models. For as long as people keep using LLMs, new LLMs will be trained, especially in these early days of generative AI. You can't separate these two things. It's like the large battery in an electric vehicle: you use clean energy to drive, but let's not forget how the rare earth minerals were sourced to create the battery. Though unlike with EV batteries, I don't think there are studies on how training LLMs impacts the environment. I don't know the upfront energy cost of training LLMs, or how many prompts it takes for that upfront cost to become insignificant compared to the usage costs.

That said, I use ChatGPT when I know my question won't be easily answered by multiple Google searches. An LLM exchange requires more energy than a Google search, but what if I need to rephrase my Google search many times to get the results I want, where I would only need one ChatGPT prompt? Would ChatGPT not be more environmentally friendly in that case? Though in the end, I don't really care. Like Aergia said, there are better ways to reduce your carbon footprint.
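Putting rough numbers on that tradeoff, using the 0.3 Wh per Google search from article 2 and the chairman's 10x multiplier from article 3 (both loose estimates from different years, so this is only an illustration):

```python
# Rough per-query energy figures quoted in the thread (all estimates):
SEARCH_WH = 0.3          # one Google search, 2009 figure (article 2)
LLM_MULTIPLIER = 10      # LLM prompt ~= 10x a keyword search (article 3)
GEMINI_WH = 0.24         # median Gemini Apps text prompt, 2025 (article 1)

prompt_wh = LLM_MULTIPLIER * SEARCH_WH   # 3.0 Wh under the 10x claim
breakeven = prompt_wh / SEARCH_WH        # searches that equal one prompt

print(f"Break-even: {breakeven:.0f} rephrased searches per LLM prompt")
# By Google's 2025 Gemini figure, one prompt is already cheaper than
# one 2009-era search:
print(GEMINI_WH < SEARCH_WH)
```

So under the 10x claim, a single prompt wins once you'd otherwise rephrase a search more than ten times; under the newer Gemini figure it wins immediately.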
 
Last edited:
  • Informative
Reactions: azo
Niron1492

pew pew ;>
Aug 28, 2025
46
Yeah, it's helpful with books if you are confused about something.

For example, some characters are way too confusing to remember, or the writer gives them similar names, or there are locations you just forgot.

I mean, it's helpful in the sense that if you use the voice thing, it's easier to just talk into your phone rather than typing a search. Though the information is not always correct, and if you have an accent and can't speak English well, it sometimes doesn't understand you and you have to repeat yourself several times.
 
