When people talk to AI, many use polite words like “please” and “thank you.”
But OpenAI’s CEO Sam Altman recently revealed that this simple politeness comes with a surprising price tag.
According to him, all those extra words add up to tens of millions of dollars in electricity costs every year.
Here’s why. Everything you type into ChatGPT is split into tokens — chunks of text roughly the size of a short word.
The more tokens in a request, the more computing power it takes to process. That means more electricity, more GPU time, and higher costs.
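To see how a few extra tokens per prompt could add up, here is a back-of-the-envelope sketch. Every number in it — the token count for a polite phrase, the daily prompt volume, the compute cost per million tokens — is an illustrative assumption, not a figure from OpenAI.

```python
# Back-of-the-envelope sketch: the yearly cost of a few extra polite tokens.
# All constants below are hypothetical, chosen only to show the arithmetic.

EXTRA_TOKENS_PER_PROMPT = 3        # "please" + "thank you" is only a few tokens
PROMPTS_PER_DAY = 100_000_000      # hypothetical daily prompt volume
COST_PER_MILLION_TOKENS = 1.00     # hypothetical compute cost in USD

def annual_politeness_cost(extra_tokens: int,
                           prompts_per_day: int,
                           cost_per_million: float) -> float:
    """Yearly cost of processing the extra tokens across all prompts."""
    tokens_per_year = extra_tokens * prompts_per_day * 365
    return tokens_per_year / 1_000_000 * cost_per_million

cost = annual_politeness_cost(EXTRA_TOKENS_PER_PROMPT,
                              PROMPTS_PER_DAY,
                              COST_PER_MILLION_TOKENS)
print(f"${cost:,.0f} per year")  # prints "$109,500 per year" with these inputs
```

The point isn’t the specific dollar figure — it scales directly with whatever volume and cost you plug in — but that a per-prompt cost too small to notice becomes a real line item once it is multiplied by hundreds of millions of requests a day, every day of the year.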

A single polite word doesn’t matter much.
But when millions of people use ChatGPT daily, all those extra “pleases” and “thanks” create a massive energy bill.
Altman isn’t complaining, though. He called those costs “well spent,” because polite interactions make people feel more human while using AI.
It reflects our social habits, even when talking to machines. Many users say they can’t imagine being rude, even to software.
Some even joke about being careful “just in case” AI develops feelings one day.

This small detail highlights a much bigger challenge. AI models consume enormous amounts of electricity. Data centers run 24/7, powering GPUs that process billions of requests.
As AI becomes more widespread, energy consumption is only going to increase.
Adding polite words won’t single-handedly drive up global emissions, but it’s a reminder of how resource-hungry these systems are.
For OpenAI, the cost of courtesy is just part of the business. But for the rest of us, it raises an interesting question: should we be polite to AI at all?
Altman’s answer suggests yes. It may not matter to the machine, but it matters to us.
And if the price of kindness is a little extra electricity, then maybe it’s a cost worth paying.