I wish my career were half as interesting as some people make it out to be.

When I’m not writing opinion columns, I spend most of my employed time working as an information technology manager — a role I was promoted into after spending nearly two decades doing server and end-user support. Throughout my career, I’ve spent time in data centers, basements, closets, cubicles and various other environments where someone decided one or more computers might belong. If everything goes right and my health holds up, I will likely spend another two decades doing the work I’ve done throughout most of my adult life.

Some of that time will undoubtedly be spent in a data center.

According to breathless national coverage of the environmental harms of data centers, recent opinions in this publication and elsewhere, and some heated public comments in my municipal backyard, however, this apparently makes me a mustache-twirling supervillain single-handedly responsible for your rising power bill while my colleagues and I cook the atmosphere.

I kid. Mostly.

Hyperbole aside, I understand commentary about data centers isn’t actually commentary about data centers as a specific discrete point of discussion. It’s commentary about artificial intelligence (AI), which, in turn, is commentary about the tech industry, marketing hype, the outsized political and social influence of certain prominent tech entrepreneurs, and so on.

Can anyone stop Peter Thiel, who co-founded PayPal, from being a longtime supporter of the political career of the vice president-elect? No.

Can anyone stop Elon Musk from spending more than a quarter billion dollars on getting Donald Trump and other Republicans elected in exchange for functioning as a shadow president who gets to set immigration policy? Apparently not.

Can anyone stop Google, Microsoft and the rest of the tech industry from forcing AI into their core products, frequently making them worse in the process? Once again, no — not if they’re collectively investing $1 trillion into developing AI and need to demonstrate a return on that investment.

Can anyone in or near the information technology industry discuss AI with a modicum of humility and without claiming they’re either going to save the world or destroy humanity? Again, no — certainly not while credulous financiers continue to throw vast sums of venture capital at anyone who seems like they might have an idea that could deliver hockey stick returns.

But people can stop a data center from getting built.

To be fair, data centers have grown larger and more numerous in a hurry. Large language models, or LLMs — the technology underpinning the current generation of AI — are extremely computationally intensive to train and run. To perform the computations necessary to improve the accuracy and range of responses an AI might produce, companies are pouring billions of dollars into buying and operating the necessary hardware.

NVIDIA, the company that makes the most in-demand chips for AI work, saw its annual revenue increase from $26.9 billion to $60.9 billion in only two years. That spending, in turn, is being mirrored by similar investments in the data centers that house and power those chips.

Those investments grew the North American data center market by 24.4 percent in 2024, according to CBRE, a commercial real estate services and investment firm. That growth, in turn, added 807.5 megawatts of potential demand to the nation's power infrastructure — a bit more than half of the generating capacity NV Energy uses to keep Northern Nevada's lights on and businesses running.

That increase in demand for electricity is causing data centers to consume a larger proportion of all power generated in this country. According to a recent report published by the Lawrence Berkeley National Laboratory, data centers now use 4.4 percent of the electricity produced in the United States each year — more than double the share they consumed five years ago.

Based on the information above, it’s understandable why someone might assume data centers are going to swallow the world.

The missing half of the story, however, is that the United States produces roughly the same amount of electricity it generated in 2007.

According to the U.S. Energy Information Administration, the United States generated 4,157 billion kilowatt-hours in 2007. By 2023, that number had increased to 4,178 billion kilowatt-hours — a modest 0.5 percent increase over 16 years. Within that same period, the population of the United States grew from 310 million to nearly 335 million — an 8 percent increase. We also invented smartphones, built electric vehicles and started building electrified high-speed rail systems.

In other words, we added eight Nevadas' worth of people, electrified their infrastructure and amusements, and then refused to build enough additional infrastructure to support any of them. No wonder adding half a Northern Nevada's worth of demand to the national power grid is causing issues.
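For readers who want to check the math, the arithmetic behind those claims can be sketched in a few lines of Python. The Nevada population figure of roughly 3.1 million is an assumption added for illustration; the rest of the numbers come straight from the column.

```python
# Back-of-the-envelope check of the figures above.

gen_2007 = 4157  # billion kilowatt-hours generated in 2007 (EIA)
gen_2023 = 4178  # billion kilowatt-hours generated in 2023 (EIA)
generation_growth_pct = (gen_2023 - gen_2007) / gen_2007 * 100  # ~0.5 percent

pop_2007 = 310  # million people (as cited in the column)
pop_2023 = 335  # million people
population_growth_pct = (pop_2023 - pop_2007) / pop_2007 * 100  # ~8 percent

nevada_pop = 3.1  # million people — assumed figure for illustration
nevadas_added = (pop_2023 - pop_2007) / nevada_pop  # ~8 "Nevadas" of people

print(f"Generation grew {generation_growth_pct:.1f}% while "
      f"population grew {population_growth_pct:.0f}% "
      f"(about {nevadas_added:.0f} Nevadas of people).")
```

The mismatch between those two growth rates is the whole argument in miniature: demand grew an order of magnitude faster than supply.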

Even if we don’t build additional power generation capacity, there are still some causes for optimism.

Water usage, even in the comparatively intensive context of training and running AI models, is decreasing because the amount of water used to generate power is decreasing. According to the U.S. Geological Survey, water consumption from power plants decreased by 25 percent between 2008 and 2020.

That’s important because a recent study published by the University of California, Riverside, measured the water footprint of running AI models. It determined that Google, Microsoft and Meta collectively consumed 0.33 percent of the water withdrawn in the U.S. in 2022. Nearly 90 percent of that consumption, however, was “scope-2” water usage — usage caused by generating the power delivered to data centers. Continuing to reduce the amount of water consumed by power plants, then, would not only dramatically improve the environmental efficiency of data centers, it would also reduce the amount of water consumed to power smartphones, laptops, electric vehicles and high-speed trains.

Additionally, there are signs that the demand for new data centers may subside by the end of the decade. 

For the past few years, improvements in the performance and accuracy of LLM-driven AIs have been achieved by throwing additional data into their models. Doing so has required additional computing power to process the additional data. Many AI developers, however — including OpenAI co-founder Ilya Sutskever — are already talking about “peak data,” or the moment when LLMs will be able to read and incorporate every piece of public human-generated text data into their models. Once that moment is reached, the pressure to build additional computational power may relax since there will be less additional data to process.

A recent research paper suggests that moment might arrive between 2026 and 2032 — assuming the dozens of copyright lawsuits filed against AI companies during the past two years don’t shrink the pool of data available for processing before then.

Outside of AI, another driver in data center construction has been the increased adoption of public cloud software, such as Microsoft 365 and Google Workspace. These products, however, often replaced systems that were previously hosted on-site and maintained by office staff. Speaking from experience, many of those on-site systems were hosted in closets that didn’t even qualify as Tier I data centers — which meant the power consumed by overbuilt office servers to host a few dozen email accounts was never measured as data center power consumption.

In other words, email and file hosting for most businesses is no longer the unmeasured addition to the corporate office power bill it usually was when I first started my career in information technology. Now it’s tightly controlled and reported as a public cloud provider’s data center expense — one that is significantly more legible to policymakers and corporate accountants alike.

That migration, however, also faces finite limits. Many companies, burned by skyrocketing public cloud bills, are now migrating some of their workloads back on-premises. Though some businesses may be going back to the server-in-a-closet model I originally started my career on, many other businesses are deploying private clouds — systems, in other words, that are hosted in data centers but are supported and maintained remotely by the business.

Finally, even if Moore’s Law — the observation that the number of transistors on a chip doubles roughly every two years — is dead now that we’re manufacturing against the limits of physical reality, there’s still plenty of room for growth in computing power and efficiency. Not every chip currently manufactured uses the most up-to-date process. Additionally, a lot of software was optimized to reach the market in a hurry, not to perform efficiently.

Put it all together — peak data, saturated cloud adoption, continued improvements in computer technology — and there’s a strong chance companies will soon run out of reasons to build new data centers in such a hurry.

In short, we don’t need to panic. We just need to plan — and build in the meantime.

Between the recent increases in data center construction and the growing adoption of electric vehicles, it’s time to stop pretending we can power the future using the same amount of power we generated in 2007. Fortunately, new generation capacity is getting cheaper and more environmentally friendly — solar power was already the cheapest source of electricity available in 2019, and it’s only getting more affordable with time. Additionally, the largest data center companies — Microsoft, Amazon and Google — are all looking for ways to power their own data centers without overburdening public grids.

We also need to keep an eye on tax incentives, such as the partial abatements for data centers already written into state law, to ensure we’re not inadvertently subsidizing lines of business that are organically growing rapidly of their own accord.

Beyond that, we just need to treat data centers like any other industry. If there are skilled people in the area available to work in one, land available to build one and power available to keep one running — all of the same ingredients any other business needs — let them build.

David Colborne ran for public office twice. He is now an IT manager, the father of two sons, and an opinion columnist for The Nevada Independent. You can follow him on Mastodon @[email protected], on Bluesky @davidcolborne.bsky.social, on Threads @davidcolbornenv or email him at [email protected]


