As AI advances, balancing innovation with sustainability remains a critical challenge. OpenAI recently unveiled o3, its most powerful AI model to date. Besides the cost of running the model, its environmental impact is another aspect garnering attention.
A study reveals that each o3 task consumes approximately 1,785 kWh of energy, equivalent to the electricity used by an average US household over two months.
According to an analysis of benchmark results by Boris Gamazaychikov, AI sustainability lead at Salesforce, that energy use roughly translates to 684 kilograms of CO₂ equivalent (CO₂e) emissions, comparable to the carbon emissions from burning more than five full tanks of petrol.
The high-compute version of o3 was benchmarked on the ARC-AGI framework, with calculations based on standard GPU energy consumption and grid emissions factors. “We need to pay more attention to the tradeoffs as we start to scale and integrate this technology,” said Gamazaychikov.
He also notes that the calculation excludes embodied carbon and covers only GPU power draw, so the figures are likely an underestimate.
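As a rough illustration only (not Gamazaychikov's actual methodology, which has not been published in detail), the headline figures imply a grid emissions factor of about 0.38 kg CO₂e per kWh, and the petrol comparison checks out under common assumptions about tank size and per-litre emissions:

```python
# Back-of-envelope check of the reported o3 figures.
# Inputs taken from the article; the tank size and per-litre
# emissions factor below are illustrative assumptions.
energy_per_task_kwh = 1785   # reported energy per o3 task
co2e_per_task_kg = 684       # reported emissions per task

# Implied grid emissions factor (kg CO2e per kWh)
emissions_factor = co2e_per_task_kg / energy_per_task_kwh
print(f"Implied emissions factor: {emissions_factor:.3f} kg CO2e/kWh")

# Petrol comparison: burning 1 litre of petrol emits roughly 2.3 kg CO2e;
# assuming a ~55-litre tank, one full tank is about 126.5 kg CO2e.
tank_litres = 55
kg_co2e_per_litre = 2.3
tanks = co2e_per_task_kg / (tank_litres * kg_co2e_per_litre)
print(f"Equivalent full tanks of petrol: {tanks:.1f}")
```

The implied factor of roughly 0.38 kg CO₂e/kWh is in the range of typical grid averages, which makes the reported numbers internally consistent.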
In addition, Kasper Groes Albin Ludvigsen, a data scientist and green AI advocate, said, “An HGX server with 8 Nvidia H100s consumes around 11-12 kW, way more than the mere 0.7 kW per GPU.”
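Ludvigsen's point is that whole-server power draw far exceeds the sum of the GPUs' rated power. A quick sketch of the arithmetic (the 0.7 kW figure is the H100's rated power; the server figure is the midpoint of the 11-12 kW he cites):

```python
# Per-GPU rated power vs whole-server draw, per Ludvigsen's point.
gpu_tdp_kw = 0.7          # Nvidia H100 rated power (~700 W per GPU)
gpus_per_server = 8
server_draw_kw = 11.5     # midpoint of the 11-12 kW cited for an HGX H100 server

gpu_only_kw = gpu_tdp_kw * gpus_per_server   # 5.6 kW from the GPUs alone
overhead_ratio = server_draw_kw / gpu_only_kw
print(f"GPUs alone: {gpu_only_kw:.1f} kW; server draws {overhead_ratio:.2f}x that")
```

In other words, CPUs, memory, networking, and cooling roughly double the energy attributable to the GPUs alone, which is why per-GPU estimates understate real consumption.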
On the question of what counts as a task, Pierre-Carl Langlais, co-founder at Pleias, raised concerns about energy costs if model designs do not scale down quickly. “Here, solving complex math problems requires lots of drafts, intermediary tests, reasoning, etc.,” he said.
Earlier this year, it was reported that ChatGPT consumes 10% of an average person’s daily drinking water in a single chat. That is almost half a litre of water, which may seem small, but with millions of people using the chatbot daily, the combined water footprint adds up.
Experts like Kathy Baxter, principal architect for responsible AI & tech at Salesforce, have warned that Jevons paradox could play out with AI advancements like OpenAI’s o3 model.
“There can be efficiency tradeoffs where less energy is required, but more water is used,” she said.
This comprehensive perspective is critical to making informed decisions about the deployment of AI technologies, ensuring that unintended consequences are minimised and benefits maximised.
While powerful, AI data centres face critical challenges, such as high energy consumption, complex cooling requirements, and the need for vast physical infrastructure.
Companies like Synaptics and embedUR are tackling this through edge AI, which reduces reliance on data centres and minimises latency and energy use by allowing decisions to be made in real time at the device level.
The post OpenAI o3 Consumes Five Tanks of Gas Per Task appeared first on Analytics India Magazine.