AI data centers now consume more power than Pakistan and raise US electric bills

The artificial intelligence revolution is reshaping America’s energy landscape in ways most people never see. Behind every ChatGPT query, Netflix stream, and cloud-saved photo lies a vast network of data centers—warehouse-sized facilities packed with humming servers that are consuming electricity at unprecedented rates.

As AI capabilities explode, these digital powerhouses are multiplying across the country, creating both economic opportunities and infrastructure challenges that could soon appear on every American’s electricity bill. The scale is staggering: U.S. data centers consumed more electricity in 2024 than the entire nation of Pakistan, and that figure is projected to more than double by 2030.

This rapid expansion has sparked a quiet but consequential competition among states vying to host these facilities, while raising urgent questions about grid capacity, environmental impact, and who ultimately pays the price for America’s AI ambitions.

What exactly is a data center?

Data centers are the hidden backbone of our digital economy—massive buildings filled with rows upon rows of computer servers, data storage systems, and networking equipment, plus the power and cooling infrastructure needed to keep everything running. Think of them as the engine rooms of the internet, processing every digital interaction from sending emails to streaming videos to running AI chatbots.

While these facilities have existed for decades, they’ve rapidly expanded to support increasingly sophisticated AI models that require enormous computational power. The newest generation of AI-optimized data centers represents a quantum leap in both capability and energy consumption.

The industry distinguishes between several types of facilities. Hyperscale data centers are the giants—warehouse-sized complexes housing at least 5,000 servers, with the largest containing hundreds of thousands of servers across footprints exceeding one million square feet. These facilities emerged with cloud computing and cryptocurrency mining but are now expanding rapidly to meet AI demand.

Enterprise data centers, by contrast, are smaller facilities owned and operated by individual companies for their private computing needs. Colocation facilities take a different approach, renting space to multiple businesses that need data center services but don’t want to build their own.

To put the scale in perspective, a typical hyperscale facility could house more servers than a small city has residents, all working around the clock to process the digital requests that power modern life.

The geography of America’s data center boom

Tracking the exact number of U.S. data centers proves surprisingly difficult since no federal registration requirement exists, and many operators keep locations confidential for security and competitive reasons. However, Data Center Map, one of the industry’s most comprehensive databases, estimates over 4,000 facilities nationwide, including operational sites and those under development.

The distribution is far from even. Just three states—Virginia, Texas, and California—host one-third of all American data centers, with Virginia leading at 643 facilities, followed by Texas with 395 and California with 319.

This concentration isn’t coincidental. Companies choose data center locations based on critical infrastructure requirements: access to reliable, high-capacity electrical grids, properly zoned industrial land, robust network connectivity, and favorable regulatory environments. Northern Virginia, Dallas, Chicago, and Phoenix have emerged as dominant hubs because they excel across these criteria.

The clustering effect is self-reinforcing. According to the International Energy Agency (IEA), a Paris-based organization that analyzes global energy trends, half of all new U.S. data centers currently under construction are being built within existing major clusters. This concentration makes economic sense for companies but creates intense pressure on regional power grids and infrastructure.

Why states are rolling out the red carpet

The competition to attract data centers has become fierce, with states offering substantial financial incentives and streamlined permitting processes. The appeal is straightforward: data centers bring construction jobs, ongoing employment opportunities, significant local tax revenue, and the promise of attracting related technology businesses.

These economic development arguments have proven compelling. States view data centers as clean, high-tech industries that can diversify local economies without the environmental concerns associated with traditional manufacturing. The federal government has also designated data center development as a national priority, committing land and funding to support growth in the name of maintaining America’s competitive edge in the global AI race.

Beyond immediate economic benefits, supporters argue that data centers enable broader technological advancement. AI applications could help solve environmental challenges, revolutionize healthcare, and drive innovation across industries. For many policymakers, the question isn’t whether to compete for these facilities, but how aggressively to pursue them.

The staggering scale of energy consumption

The numbers behind data center energy consumption are difficult to comprehend. In 2024, these facilities consumed 183 terawatt-hours (TWh) of electricity—equivalent to more than 4% of total U.S. electricity consumption. To put this in perspective, that’s roughly the same amount of electricity used by the entire nation of Pakistan, with its population of 240 million people.

More concerning is the trajectory. The IEA projects data center electricity consumption will grow by 133% to 426 TWh by 2030, assuming current industry forecasts and regulatory conditions persist. This explosive growth is driven primarily by AI applications, which require vastly more computational power than traditional data center workloads.
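The IEA projection is easy to sanity-check. A minimal back-of-envelope sketch, using only the 2024 and 2030 figures cited above:

```python
# IEA figures cited above: U.S. data center electricity consumption
consumption_2024 = 183   # TWh, actual 2024
consumption_2030 = 426   # TWh, projected 2030

# Percentage growth from 2024 to 2030
growth_pct = (consumption_2030 - consumption_2024) / consumption_2024 * 100
print(f"Projected growth: {growth_pct:.0f}%")  # prints "Projected growth: 133%"
```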

A typical AI-focused hyperscale data center annually consumes as much electricity as 100,000 American households. The largest facilities currently under construction are expected to use 20 times more power than that, creating electricity demand equivalent to small cities.
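To make the household comparison concrete, here is a rough sketch. The average-household figure of roughly 10,500 kWh per year is an assumption (it is in the neighborhood of EIA's published average), not a number from this article:

```python
# Assumed average U.S. household electricity use -- NOT from the article
household_kwh_per_year = 10_500

# A typical AI hyperscale facility ~ 100,000 households (per the article)
hyperscale_twh = 100_000 * household_kwh_per_year / 1e9   # kWh -> TWh
# The largest facilities under construction: ~20x that (per the article)
largest_twh = 20 * hyperscale_twh

print(f"Typical hyperscale site: ~{hyperscale_twh:.2f} TWh/year")
print(f"Largest under construction: ~{largest_twh:.0f} TWh/year")
```

Under that assumption, a typical hyperscale site lands near 1 TWh per year and the largest planned sites near 21 TWh, which is indeed the annual consumption of a small city.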

This growth is already straining regional power grids. In Virginia, data centers consumed approximately 26% of the state’s total electricity supply in 2023, according to the Electric Power Research Institute, a nonprofit energy research organization. Other states are experiencing similar pressures, with data centers consuming 15% of North Dakota’s electricity, 12% of Nebraska’s, 11% of Iowa’s, and 11% of Oregon’s total supply.

The concentration of facilities in specific regions amplifies these effects, creating localized energy shortages and infrastructure bottlenecks that utilities are scrambling to address.

Where all that energy goes

Understanding how data centers use electricity reveals why AI is driving such dramatic consumption increases. Approximately 60% of a data center’s electricity powers the servers themselves—the computer systems that process and store digital information. This percentage is even higher at AI-optimized hyperscale facilities, where advanced servers equipped with powerful graphics processing units (GPUs) can perform trillions of mathematical calculations per second.

These AI-specific chips are energy-hungry beasts, consuming two to four times more power than traditional server processors. The computational requirements for training large AI models or running complex AI applications translate directly into massive electricity demand.

The second-largest energy consumer in data centers is cooling systems, which prevent servers from overheating. This cooling load varies significantly depending on facility design and efficiency, ranging from about 7% of total energy use at well-designed hyperscale facilities to over 30% at less efficient enterprise data centers.

These cooling systems also create substantial water consumption requirements. In 2023, U.S. data centers directly consumed approximately 17 billion gallons of water, with hyperscale and colocation facilities accounting for 84% of that total, according to a Berkeley Lab report commissioned by the U.S. Department of Energy. Hyperscale facilities alone are projected to consume between 16 billion and 33 billion gallons annually by 2028.

These figures exclude indirect water consumption from electricity generation and semiconductor manufacturing, meaning the true water footprint is substantially larger.

The complex energy mix powering AI

Despite the technology industry’s public commitments to renewable energy, data centers still rely heavily on fossil fuels. As of 2024, natural gas supplied over 40% of electricity consumed by U.S. data centers, making it the dominant energy source. Renewable sources like wind and solar provided about 24% of data center electricity, while nuclear power contributed around 20% and coal approximately 15%.

Natural gas is projected to maintain its leading role through 2030, but the energy mix may shift as companies pursue carbon neutrality goals and states implement renewable energy requirements. Several states, including California, Illinois, Minnesota, New Jersey, and Virginia, have introduced or considered legislation requiring data centers to source electricity from renewable sources and report their energy and water consumption.

Nuclear power could play an increasingly important role in data center energy supply. Several major technology companies have announced purchasing agreements with nuclear power startups, recognizing that nuclear plants can provide the reliable, carbon-free baseload power that data centers require. Plans are underway to restart two retired nuclear facilities—Three Mile Island in Pennsylvania and Duane Arnold in Iowa—specifically to meet growing data center demand.

This nuclear renaissance reflects the industry’s recognition that intermittent renewable sources alone cannot reliably power facilities that must operate continuously at high capacity.

The coming impact on electricity bills

The rapid expansion of data centers is beginning to translate into higher electricity costs for American consumers and businesses. Utilities must invest billions in grid upgrades, new transmission lines, and additional power generation capacity to meet growing data center demand. Without specific ratepayer protections, these infrastructure costs typically get passed along to all electricity customers.

The scale of these impacts is becoming clear in specific regions. In the PJM electricity market, which serves 13 states from Illinois to North Carolina, data centers drove an estimated $9.3 billion increase in capacity market prices for 2025-26. Capacity markets are systems where utilities commit to providing specific amounts of electricity to ensure grid reliability, and higher prices in these markets directly translate to consumer bills.

As a result of these data center-driven price increases, average residential electricity bills are expected to rise by $18 per month in western Maryland and $16 per month in Ohio. These regional impacts preview what could become a national trend as data center development accelerates.

A Carnegie Mellon University study projects that data centers and cryptocurrency mining could increase average U.S. electricity bills by 8% by 2030. In the highest-demand markets of central and northern Virginia, the increases could exceed 25%.

These projections come against a backdrop of already rising electricity costs. The typical American household paid $142 per month for electricity in 2024, according to the U.S. Energy Information Administration—a 25% increase from $114 per month in 2014. Much of this increase reflects utility investments in grid modernization, extreme weather resilience, and cybersecurity improvements, but data center growth is becoming an increasingly significant factor.
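The percentages in this passage check out arithmetically. A quick sketch using only the figures cited above:

```python
# EIA figures cited above: average monthly residential electricity bill
bill_2014 = 114   # $/month
bill_2024 = 142   # $/month

# Decade-over-decade increase -- should land near the article's 25%
decade_increase_pct = (bill_2024 - bill_2014) / bill_2014 * 100
print(f"2014-2024 increase: {decade_increase_pct:.0f}%")  # prints "2014-2024 increase: 25%"

# Carnegie Mellon's projected 8% increase, applied to the 2024 bill
projected_extra = bill_2024 * 0.08
print(f"Extra cost at +8%: ${projected_extra:.2f}/month")  # about $11 more per month
```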

Public perception and environmental concerns

American public opinion on AI’s environmental impact remains divided and uncertain, reflecting the complexity of the trade-offs involved. A Pew Research Center survey conducted in August 2024 found that 25% of U.S. adults expect AI to have a negative environmental impact over the next 20 years, while another 25% predict the positive and negative effects will balance out.

A further 20% foresee net positive environmental benefits from AI, while the remaining 30% are unsure about the technology’s environmental implications. This uncertainty likely reflects the competing narratives around AI’s environmental impact—the technology’s enormous energy consumption versus its potential to optimize energy systems, accelerate clean energy development, and solve environmental challenges.

The divided public opinion suggests that how the data center boom unfolds—particularly regarding energy sources, efficiency improvements, and cost distribution—could significantly influence broader acceptance of AI development and deployment.

Looking ahead

The data center boom represents one of the most significant shifts in American energy consumption patterns in decades. As AI capabilities continue advancing, the tension between technological progress and energy sustainability will only intensify.

The choices made today about data center locations, energy sources, grid infrastructure investments, and cost allocation will shape both America’s competitive position in AI and the everyday energy costs facing businesses and consumers. For policymakers, utility companies, and technology firms, the challenge lies in managing this growth while minimizing negative impacts on communities and the environment.

The data center revolution is no longer a distant technological trend—it’s a present reality reshaping America’s energy landscape, with consequences that will soon reach every electricity bill in the country.

