Grok is the AI chatbot built by xAI, Elon Musk’s artificial intelligence company. It is integrated with X and competes with other major generative AI tools such as ChatGPT, Claude and Gemini.
The exact amount of energy used by Grok is not publicly disclosed. xAI has not published a verified figure for energy use per Grok query, daily Grok electricity consumption, or the total annual energy used to train and run its models. However, enough public information is available to make useful estimates, especially because xAI has disclosed that its Colossus supercomputer has been expanded to 200,000 GPUs and says Grok 4 used Colossus for large-scale reinforcement learning.
The short answer is that a single Grok query probably uses a very small amount of electricity — likely somewhere from fractions of a watt-hour to several watt-hours depending on the type of request — but the infrastructure behind Grok can consume energy on the scale of a large industrial facility.
Summary answer
| Question | Best available answer |
|---|---|
| Does xAI publish Grok’s exact energy use? | No, not in a complete public dataset |
| Likely energy for a simple Grok text query | Probably fractions of a watt-hour to around 1Wh |
| Likely energy for a complex reasoning or DeepSearch-style query | Potentially several Wh or more |
| Publicly stated size of xAI’s Colossus cluster | 200,000 GPUs |
| Reported full-capacity electricity use of Colossus 1 | Around 150MW |
| Annual electricity use at 150MW continuous load | Around 1.31TWh per year |
| Equivalent UK household electricity use | Around 487,000 typical homes |
| Main energy concern | Data-centre scale, not individual prompts |
Ofgem estimates that a typical household in England, Scotland and Wales uses 2,700kWh of electricity per year, so a 150MW facility running continuously would use as much electricity as roughly 487,000 typical UK homes. This comparison refers only to household electricity use, not gas or total household energy consumption.
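For readers who want to check that arithmetic, the conversion from a continuous power load to annual household equivalents is simple. The sketch below uses the reported 150MW figure and the Ofgem 2,700kWh household average; continuous full-capacity operation is an assumption rather than a measured fact.

```python
# Rough conversion from continuous power demand to annual electricity use,
# using the Ofgem typical-household figure of 2,700 kWh/year.
HOURS_PER_YEAR = 8_760          # 24 hours x 365 days
HOUSEHOLD_KWH_PER_YEAR = 2_700  # Ofgem typical GB household electricity use

def annual_use(power_mw: float) -> tuple[float, float]:
    """Return (TWh per year, equivalent typical GB homes) for a constant load."""
    mwh_per_year = power_mw * HOURS_PER_YEAR               # MW x hours = MWh
    twh_per_year = mwh_per_year / 1_000_000                # 1 TWh = 1,000,000 MWh
    homes = mwh_per_year * 1_000 / HOUSEHOLD_KWH_PER_YEAR  # MWh -> kWh -> homes
    return twh_per_year, homes

twh, homes = annual_use(150)  # reported Colossus 1 full-capacity load
print(f"{twh:.2f} TWh/year, ~{homes:,.0f} typical homes")
# -> 1.31 TWh/year, ~486,667 typical homes
```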
Why Grok’s energy use is difficult to calculate
There is no single public meter for Grok. Energy use depends on several factors:
| Factor | Why it matters |
|---|---|
| Model size | Larger models generally need more computing power |
| Query length | Longer prompts and longer answers require more processing |
| Reasoning depth | “Thinking”, agentic and search-based tasks can use more compute |
| Hardware | H100, H200 and GB200 systems have different power profiles |
| Server utilisation | GPUs use different amounts of power when idle, partially loaded or fully loaded |
| Cooling | Data centres need extra electricity for cooling and power distribution |
| Location | Carbon impact depends on whether the power comes from gas, renewables, nuclear or grid electricity |
| Training versus inference | Training a model and answering user prompts are very different energy tasks |
This is why any precise claim such as “one Grok question uses exactly X watt-hours” should be treated cautiously unless it comes from xAI or an independently audited measurement.
What is Grok running on?
xAI’s main AI training system is called Colossus. xAI describes Colossus as the world’s biggest supercomputer and says it was built in 122 days before being doubled to 200,000 GPUs.
Nvidia has said Colossus is used to train xAI’s Grok family of large language models, and xAI’s own Grok 4 announcement says the company used its 200,000 GPU cluster to run reinforcement learning training for Grok 4.
That matters because the GPUs used in frontier AI systems are power-intensive. Nvidia’s H100 specification lists a maximum thermal design power of up to 700W for the SXM version of the chip, while Nvidia’s GB200 rack-scale systems can draw around 120kW per rack.
A simple GPU-based estimate
A very rough lower-bound estimate can be made from GPU power alone.
| Cluster size | Assumed GPU power | GPU-only power draw |
|---|---|---|
| 100,000 H100 GPUs | 700W each | 70MW |
| 200,000 H100 GPUs | 700W each | 140MW |
| 200,000 GPUs at 500W average | 500W each | 100MW |
| 200,000 GPUs at 350W average | 350W each | 70MW |
This does not include CPUs, networking, storage, cooling, power conversion losses, lighting, back-up systems or other data-centre infrastructure. It also assumes the GPUs are H100-equivalent for illustration, whereas xAI’s infrastructure has reportedly included a mixture of Nvidia hardware over time.
This is why a full facility can require far more electricity than the headline GPU calculation suggests.
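To make that concrete, the sketch below repeats the GPU-only arithmetic and then applies a facility overhead factor. The PUE (power usage effectiveness) value of 1.3 is an illustrative assumption, not a disclosed xAI figure.

```python
# GPU-only power draw for a cluster, plus an illustrative facility overhead.
# PUE (power usage effectiveness) covers cooling, power conversion and other
# infrastructure. 1.3 is a plausible modern value, NOT a disclosed xAI figure.
ASSUMED_PUE = 1.3

def facility_power_mw(num_gpus: int, watts_per_gpu: float, pue: float = ASSUMED_PUE):
    """Return (GPU-only MW, estimated facility MW including overhead)."""
    gpu_only_mw = num_gpus * watts_per_gpu / 1_000_000  # watts -> megawatts
    return gpu_only_mw, gpu_only_mw * pue

for gpus, watts in [(100_000, 700), (200_000, 700), (200_000, 500), (200_000, 350)]:
    gpu_mw, total_mw = facility_power_mw(gpus, watts)
    print(f"{gpus:,} GPUs @ {watts}W: {gpu_mw:.0f}MW GPU-only, ~{total_mw:.0f}MW with overhead")
# 200,000 GPUs @ 700W: 140MW GPU-only, ~182MW with overhead
```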
How much energy does Colossus use?
The Guardian reported in January 2026 that xAI’s Colossus 1 data centre uses 150MW of electricity at full capacity and that xAI plans to expand.
If a 150MW data centre ran continuously for a full year, it would consume:
| Power demand | Annual electricity use | Equivalent typical UK household electricity use |
|---|---|---|
| 50MW | 0.44TWh | 162,000 homes |
| 100MW | 0.88TWh | 324,000 homes |
| 150MW | 1.31TWh | 487,000 homes |
| 250MW | 2.19TWh | 811,000 homes |
| 422MW | 3.70TWh | 1.37 million homes |
| 1GW | 8.76TWh | 3.24 million homes |
| 2GW | 17.52TWh | 6.49 million homes |
The 422MW figure is included because Reuters reported claims from community and environmental groups that xAI had operated 35 gas turbines with a combined capacity of 422MW at its Tennessee data-centre site. That is a capacity figure, not proof that the turbines ran at full output all year.
Reuters also reported in December 2025 that xAI had acquired another building near Memphis as part of a plan to scale its AI training capacity towards nearly 2GW, although that reflects expansion ambition rather than current verified continuous consumption.
How much energy does one Grok query use?
There is no official public figure for a single Grok query. The best approach is to use wider AI inference research as a guide.
Recent research and industry estimates suggest that a typical frontier text-model query may use around 0.3Wh of electricity, although results vary depending on the model, prompt length, output length and methodology. Epoch AI estimated typical GPT-4o queries at roughly 0.3Wh, while Google estimated the median Gemini Apps text prompt at 0.24Wh. Microsoft researchers have also estimated a median of around 0.34Wh for frontier-scale LLM inference under realistic workloads.
For Grok, a reasonable public estimate might look like this:
| Type of Grok use | Possible electricity use per request | Notes |
|---|---|---|
| Short factual query | 0.1–0.5Wh | Similar to other efficient text-model estimates |
| Normal chatbot answer | 0.3–1Wh | Depends heavily on response length |
| Longer answer or coding task | 1–5Wh | More tokens and more processing time |
| Reasoning, DeepSearch or agentic task | 5–20Wh+ | Multi-step reasoning and search can increase compute use sharply |
| Very long, complex or tool-heavy task | 20Wh+ | Some AI energy studies show much higher figures for long prompts and intensive models |
A 2026 Joule paper on AI inference found a median energy use of 0.31Wh per query for baseline models over 200 billion parameters, while other testing has found that the most energy-intensive long-prompt cases can exceed 29–33Wh per prompt.
The important point is that the individual prompt is not usually the biggest energy issue. Even if a single Grok answer uses less than 1Wh, huge numbers of users and repeated AI use can scale into meaningful electricity demand.
What happens at scale?
The table below shows how small per-query numbers become large when usage scales.
| Grok usage level | At 0.3Wh per query | At 1Wh per query | At 10Wh per query |
|---|---|---|---|
| 1 million queries per day | 300kWh/day | 1,000kWh/day | 10,000kWh/day |
| 10 million queries per day | 3MWh/day | 10MWh/day | 100MWh/day |
| 100 million queries per day | 30MWh/day | 100MWh/day | 1GWh/day |
| 1 billion queries per day | 300MWh/day | 1GWh/day | 10GWh/day |
At 1 billion queries per day, a 0.3Wh estimate would mean about 109.5GWh per year. At 10Wh per query, it would mean 3.65TWh per year.
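The sketch below shows how those annual figures are derived. Both the query volumes and the per-query energies are hypothetical scenarios, not measured Grok traffic.

```python
# How per-query watt-hours scale with daily query volume.
# Both inputs are hypothetical scenarios, not measured Grok traffic.
def annual_energy_gwh(queries_per_day: float, wh_per_query: float) -> float:
    """Return annual electricity use in GWh for a given usage level."""
    wh_per_day = queries_per_day * wh_per_query
    return wh_per_day * 365 / 1e9  # Wh -> GWh

print(annual_energy_gwh(1e9, 0.3))   # 1bn queries/day at 0.3Wh -> 109.5 GWh/year
print(annual_energy_gwh(1e9, 10.0))  # 1bn queries/day at 10Wh  -> 3,650 GWh (3.65TWh)/year
```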
That difference shows why transparency matters. The energy footprint of an AI service depends not only on how efficient each answer is, but also on how many answers are generated, how complex they are and how often users rely on reasoning modes, search modes, image generation, code execution or agents.
Training Grok versus using Grok
Grok’s energy use can be divided into two categories: training and inference.
| Category | What it means | Energy profile |
|---|---|---|
| Training | Building or improving the model using large datasets and reinforcement learning | Very high energy use over concentrated training periods |
| Inference | Running the model when users ask questions | Lower per request, but repeated continuously at huge scale |
Training is the spectacular part because it can involve enormous GPU clusters running for long periods. Inference is the part that may matter most over time because it happens every time a user asks Grok a question.
xAI said Grok 4 used Colossus for reinforcement learning training at large scale, and that infrastructure and algorithmic work improved training compute efficiency by 6x. That means newer training runs may be more efficient per unit of capability, but it does not mean total energy use falls, because the company may use the efficiency gain to train larger or more capable models.
Why Grok is especially controversial from an energy perspective
Grok is not just another chatbot. Its energy footprint has received attention because xAI’s data-centre build-out has been extremely fast and has involved large-scale power infrastructure.
Reuters reported in April 2025 that community and environmental groups said xAI had increased the number of gas turbines at its Tennessee facility to 35 turbines with 422MW of capacity, far above the 20 turbines and 100MW previously reported. The groups alleged that the expansion required a major air permit under the Clean Air Act.
The Guardian later reported that a regulator had ruled that xAI’s Memphis data centre was generating extra electricity illegally, while environmental and community groups argued that the turbines added pollution to already overburdened communities.
xAI's position, and the legal and regulatory situation around these facilities, may continue to evolve. For energy users, the wider lesson is that frontier AI is no longer just a software issue. It is also becoming a power-generation, grid-connection and local environmental issue.
How does Grok compare with ChatGPT, Claude and Gemini?
It is not possible to produce a fully reliable league table because the major AI companies do not all report energy in the same way.
| AI system | Public energy transparency | Broad energy picture |
|---|---|---|
| Grok | Limited public per-query data | Large visible infrastructure footprint through Colossus |
| ChatGPT | No complete public per-query audit, but many third-party estimates | Very large user base means scale is the key issue |
| Claude | Limited model-specific energy disclosure | Likely significant data-centre use through cloud infrastructure |
| Gemini | Google has published a median Gemini Apps text-prompt estimate | Median text prompt estimated at 0.24Wh |
| Open-source models | Easier to test in controlled settings | Energy varies enormously by model size and hardware |
Google’s Gemini figure of 0.24Wh per text prompt is useful because it shows that simple AI text queries can be much less energy-intensive than some older public estimates suggested. However, that does not automatically apply to Grok, and it does not cover every kind of AI task.
Why the data-centre footprint matters more than the prompt
For most users, asking Grok one question is not comparable to running a washing machine, boiling a kettle repeatedly or charging an electric car. The energy use of one simple text prompt is usually small.
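A rough everyday comparison makes the point. The kettle figure below (about 0.11kWh for a full boil) is a common illustrative value rather than a measurement, and the per-prompt figure uses the research estimates discussed earlier.

```python
# Comparing one everyday task with simple AI prompts.
# ~0.11 kWh per full kettle boil is a common illustrative figure, not a measurement;
# 0.3 Wh per prompt comes from the inference estimates discussed above.
KETTLE_WH = 110   # ~0.11 kWh for one full kettle boil (illustrative)
PROMPT_WH = 0.3   # typical frontier text-query estimate

print(f"One kettle boil ~= {KETTLE_WH / PROMPT_WH:.0f} simple prompts")
# -> One kettle boil ~= 367 simple prompts
```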
The bigger issue is cumulative demand:
- millions or billions of prompts per day
- longer responses
- AI search replacing conventional search
- reasoning modes becoming default
- businesses embedding AI into everyday workflows
- AI agents running tasks in the background
- model training becoming more frequent and more compute-intensive
- multiple companies racing to build larger data-centre campuses
The International Energy Agency says data centres consumed about 415TWh of electricity in 2024, or around 1.5% of global electricity use, and projects that global data-centre electricity consumption could more than double to around 945TWh by 2030. It identifies AI as the most important driver of this growth.
Could Grok’s energy use become more efficient?
Yes. AI systems can become more energy-efficient through:
| Efficiency route | How it helps |
|---|---|
| Better chips | New GPUs can deliver more AI computation per watt |
| Model optimisation | Smaller or sparse models can reduce unnecessary computation |
| Caching | Reusing common answers can avoid repeated full inference |
| Quantisation | Lower-precision maths can reduce compute requirements |
| Routing | Simple queries can be sent to smaller models |
| Better cooling | Liquid cooling and efficient data-centre design reduce overhead |
| Higher utilisation | Better scheduling reduces wasted idle power |
| Renewable PPAs | Clean electricity procurement can reduce carbon impact |
However, efficiency does not guarantee lower total energy use. If AI becomes cheaper and more capable, usage may rise faster than efficiency improves. This is a classic rebound problem: each query may become cheaper, but total queries and total model training may grow rapidly.
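A toy calculation shows how the rebound works. All of the numbers below are hypothetical: if each query becomes twice as efficient but usage triples, total energy still rises by half.

```python
# Toy rebound-effect arithmetic: efficiency gains can be outpaced by growth.
# All numbers are hypothetical.
baseline_queries = 100e6   # queries per day
baseline_wh = 1.0          # Wh per query

efficiency_gain = 2.0      # each query becomes 2x more efficient
usage_growth = 3.0         # but usage triples

before = baseline_queries * baseline_wh
after = (baseline_queries * usage_growth) * (baseline_wh / efficiency_gain)
print(f"Total energy change: {after / before:.1f}x")  # -> 1.5x (up 50%)
```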
What does Grok’s energy use mean for businesses?
For businesses, Grok’s electricity consumption matters in two ways.
First, businesses using AI tools may want to understand the hidden energy and carbon footprint of AI-assisted work. A single prompt is usually small, but large-scale use across marketing, customer service, coding, research, sales, analytics and automation could become a measurable part of a company’s digital carbon footprint.
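A back-of-envelope sketch illustrates how such a footprint might be estimated. Every input below, including staff numbers, prompts per day, per-prompt energy and grid carbon intensity, is an illustrative assumption rather than a measured figure.

```python
# Back-of-envelope company AI carbon footprint.
# Every input here is an illustrative assumption, not a measured figure.
STAFF = 500
PROMPTS_PER_STAFF_PER_DAY = 40
WH_PER_PROMPT = 1.0            # mid-range per-prompt estimate
WORKING_DAYS = 250
GRID_G_CO2_PER_KWH = 200       # roughly UK-grid-like carbon intensity

annual_kwh = STAFF * PROMPTS_PER_STAFF_PER_DAY * WH_PER_PROMPT * WORKING_DAYS / 1_000
annual_kg_co2 = annual_kwh * GRID_G_CO2_PER_KWH / 1_000
print(f"~{annual_kwh:,.0f} kWh/year, ~{annual_kg_co2:,.0f} kg CO2e/year")
# -> ~5,000 kWh/year, ~1,000 kg CO2e/year
```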
Second, AI data centres are becoming a major new source of electricity demand. Large AI campuses can affect power generation, grid connections, local network investment and energy infrastructure planning. This may eventually affect electricity prices and network charges, especially in regions where data centres cluster.
For UK businesses, the direct effect of Grok’s Memphis energy use on UK electricity bills is limited. However, the wider AI data-centre boom is relevant to UK business energy costs because the same pressures are emerging globally: more demand for grid capacity, more competition for low-carbon power, and more investment needed in electricity networks.
Final verdict
The precise energy consumption of Grok AI is not publicly known. xAI does not publish a full per-query or annual electricity-use breakdown for Grok.
A simple Grok text query may only use a fraction of a watt-hour to around 1Wh, based on wider estimates for frontier AI inference. More complex reasoning, search or agentic tasks could use several watt-hours or much more. The bigger energy story is not the individual query, but the scale of the infrastructure needed to train and run Grok.
xAI’s Colossus system has been expanded to 200,000 GPUs, and reports suggest its electricity use is already comparable to a large industrial site. At a reported 150MW full-capacity load, Colossus would consume around 1.31TWh per year if operated continuously — roughly equivalent to the annual electricity use of nearly 487,000 typical UK homes.
That makes Grok a useful example of the wider AI energy challenge: each prompt may be small, but the data-centre build-out behind frontier AI is becoming enormous.
FAQ
How much energy does a single Grok query use?
xAI has not published an official figure. Based on wider AI inference research, a simple Grok text query may use fractions of a watt-hour to around 1Wh. Complex reasoning, long answers, DeepSearch-style tasks or agentic workflows could use several watt-hours or more.
Does Grok use more energy than ChatGPT?
There is not enough public data to say confidently. Grok runs on xAI's Colossus infrastructure, while ChatGPT runs on OpenAI's infrastructure through Microsoft and other systems. The energy use depends on model size, prompt length, reasoning mode, hardware, data-centre efficiency and user volume.
How much electricity does xAI's Colossus use?
The Guardian reported that Colossus 1 uses around 150MW of electricity at full capacity. If that load ran continuously for a full year, it would equal about 1.31TWh of electricity.
Is Grok powered by gas turbines?
xAI's Memphis data-centre operations have reportedly used natural gas turbines for additional power. Reuters reported claims that the site had 35 turbines with 422MW of capacity, while environmental groups and regulators have scrutinised permitting and pollution issues.
Is using Grok bad for the environment?
One simple Grok prompt is unlikely to have a large environmental impact by itself. The larger concern is cumulative use: millions or billions of prompts, repeated model training, data-centre expansion, cooling demand and the source of electricity used to power AI infrastructure.
Why do estimates of AI energy use vary so much?
Estimates vary because different studies measure different things. Some include only GPU electricity, while others include servers, cooling and data-centre overhead. The answer also changes depending on model size, response length, hardware, utilisation, grid mix and whether the request uses reasoning or tools.
Could Grok become more energy-efficient?
Yes. Better chips, smaller models, routing simple queries to lighter systems, improved cooling and more efficient data centres could reduce energy per answer. However, total electricity use could still rise if Grok gains more users, handles longer tasks or trains larger models.
What is the biggest energy concern about Grok?
The main issue is not the electricity used by one question. It is the scale of xAI's data-centre infrastructure, the large GPU clusters used to train and run frontier models, and the power-generation and grid infrastructure needed to support them.