Altman replied to a social media post, explaining that accommodating “please” and “thank you” requires extra computing power, and with it, more electricity. He joked that it is “money well spent,” but the exchange highlights how human courtesy measurably affects AI operations.
Why Politeness Costs So Much
The root of the issue lies in how AI models function. Every response ChatGPT generates demands significant computational power, and the longer or more complex the conversation, the more energy it consumes. Even simple words like “please” and “thank you” must be processed as extra tokens, creating additional work for OpenAI’s data centers. These facilities not only draw large amounts of electricity to run but also require extensive cooling to keep from overheating, adding to costs.
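To illustrate the principle, here is a minimal sketch of why courtesy words add processing work. It uses a naive whitespace split as a stand-in for real tokenization; actual models use subword tokenizers, so the exact counts would differ, but the point stands: every extra word becomes extra tokens the servers must process.

```python
def naive_token_count(prompt: str) -> int:
    """Count whitespace-separated words -- a rough stand-in for real tokenization."""
    return len(prompt.split())

# A hypothetical request, with and without courtesy words.
plain = "Summarize this article"
polite = "Please summarize this article, thank you"

extra = naive_token_count(polite) - naive_token_count(plain)
print(extra)  # the polite version carries 3 extra "tokens" under this naive count
```

Multiply those few extra tokens by hundreds of millions of requests, and the electricity bill Altman joked about starts to take shape.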
Many users are surprisingly polite when they talk to AI. Some do it out of habit or personal principle, while others are upfront about hedging their bets: staying on good terms just in case AI one day becomes truly intelligent and “remembers” how it was treated.
A study shows that many AI users in the US say “please” and “thank you” to bots. For some, respectful language is simply the right thing to do; others believe that being rude to AI could backfire sometime in the future.
Balancing Costs and Advancements
The cost of AI processing is falling as the technology improves, but OpenAI still carries enormous operational costs. The company remains optimistic about its finances: it expects revenue to more than triple this year and projects profitability by 2029, betting that its user base will keep growing alongside its technology.
The next time you type a “thank you” to ChatGPT, it may be warming up a server somewhere. If Altman’s teasing confession is any indication, OpenAI seems fine with the extra cost of polite exchanges. In the big picture, a little kindness may be worth the expense.
