Our love of the cloud is making a green energy future impossible

By Mark Mills, Contributor

Mark Mills is the author of the book "Digital Cathedrals: The Information Infrastructure Era," a senior fellow at the Manhattan Institute, a faculty fellow at Northwestern University's McCormick School of Engineering, and a partner in Cottonwood Venture Partners, an energy-tech venture fund.

An epic number of citizens are video-conferencing to work in these lockdown times. But as they trade in a gas-burning commute for digital connectivity, their personal energy use for each two hours of video is greater than the share of fuel they would have consumed on a four-mile train ride. Add to this, millions of students 'driving' to class on the internet instead of walking.
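As a rough sanity check on that comparison, here's a back-of-envelope sketch. Every constant below (video data volume, network-plus-data-center energy per gigabyte, device draw, rail energy per passenger-mile) is an illustrative assumption, not a figure from the article:

```python
# Back-of-envelope: two hours of video conferencing vs. a four-mile train ride.
# All constants are illustrative assumptions for the sketch.

VIDEO_GB_PER_HOUR = 1.5    # assumed data volume of an HD video call
KWH_PER_GB = 0.1           # assumed network + data-center energy per gigabyte
DEVICE_KW = 0.05           # assumed laptop draw during the call

video_kwh = 2 * (VIDEO_GB_PER_HOUR * KWH_PER_GB + DEVICE_KW)

RAIL_KWH_PER_PASSENGER_MILE = 0.1  # assumed per-passenger rail energy
train_kwh = 4 * RAIL_KWH_PER_PASSENGER_MILE

print(f"two hours of video: ~{video_kwh:.2f} kWh")     # ~0.40 kWh
print(f"four-mile train share: ~{train_kwh:.2f} kWh")  # ~0.40 kWh
```

Under these assumptions the two come out in the same ballpark, which is the article's point: the "free" video call carries an energy bill comparable to the commute it replaced.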

Meanwhile in other corners of the digital universe, scientists furiously deploy algorithms to accelerate research. Yet, the pattern-learning phase for a single artificial intelligence application can consume more compute energy than 10,000 cars do in a day.

This grand 'experiment' in shifting societal energy use is visible, at least indirectly, in one high-level fact set. By the first week of April, U.S. gasoline use had collapsed by 30 percent, but overall electric demand was down less than seven percent. That dynamic is in fact indicative of an underlying trend for the future. While transportation fuel use will eventually rebound, real economic growth is tied to our electrically fueled digital future.

The COVID-19 crisis highlights just how much more sophisticated and robust the 2020 internet is than what existed as recently as 2008, when the economy last collapsed, an internet 'century' ago. If a national lockdown had occurred back then, most of the tens of millions who now telecommute would have joined the nearly 20 million who got laid off. Nor would it have been nearly as practical for universities and schools to have tens of millions of students learning from home.

Analysts have widely documented massive increases in internet traffic from all manner of stay-at-home activities. Digital traffic measures have spiked for everything from online groceries to video games and movie streaming. So far, the system has ably handled it all, and the cloud has been continuously available, minus the occasional hiccup.

There's more to the cloud's role during the COVID-19 crisis than one-click teleconferencing and video chatting. Telemedicine has finally been unleashed. And we've seen, for example, apps quickly emerge to help self-evaluate symptoms and AI tools put to work to enhance X-ray diagnoses and to help with contact tracing. The cloud has also allowed researchers to rapidly create "data lakes" of clinical information to fuel the astronomical capacities of today's supercomputers deployed in pursuit of therapeutics and vaccines.

The future of AI and the cloud will bring us a lot more of the above, along with practical home diagnostics and useful VR-based telemedicine, not to mention hyper-accelerated clinical trials for new therapies. And this says nothing about what the cloud will yet enable in the 80 percent of the economy that's not part of healthcare.

For all of the excitement these new capabilities offer, though, the bedrock beneath all of that cloud computing will remain consistent - and consistently increasing - demand for energy. Far from saving energy, our AI-enabled workplace future uses more energy than ever before, a challenge the tech industry urgently needs to assess and plan for in the years ahead.

The new information infrastructure

The cloud is vital infrastructure. That will and should reshape many priorities. Only a couple of months ago, tech titans were elbowing each other aside to issue pledges about reducing energy usage and promoting 'green' energy for their operations. Doubtlessly, such issues will remain important. But reliability and resilience - in short, availability - will now move to the top priority.

As Fatih Birol, Executive Director of the International Energy Agency (IEA), reminded his constituency last month, in a diplomatic understatement about the future of wind and solar: "Today, we're witnessing a society that has an even greater reliance on digital technology," which "highlights the need for policy makers to carefully assess the potential availability of flexibility resources under extreme conditions." In the economically stressed times that will follow the COVID-19 crisis, the price society must pay to ensure "availability" will matter far more.

It is still prohibitively expensive to provide high-reliability electricity with solar and wind technologies. Those who claim solar/wind are at "grid parity" aren't looking at reality. The data show that the overall cost of a grid kilowatt-hour is roughly 200 to 300 percent higher in Europe, where the share of power from wind/solar is far greater, than in the U.S. It bears noting that big industrial electricity users, including tech companies, generally enjoy deep discounts from the grid average, which leaves consumers burdened with higher costs.

Put in somewhat simplistic terms: this means that consumers are paying more to power their homes so that big tech companies can pay less for power to keep smartphones lit with data. (We will see how tolerant citizens are of this asymmetry in the post-crisis climate.)

Many such realities are, in effect, hidden by the fact that the cloud's energy dynamic is the inverse of that for personal transportation. For the latter, consumers literally see where 90 percent of energy is spent when filling up their car's gas tank. When it comes to a "connected" smartphone though, 99 percent of energy dependencies are remote and hidden in the cloud's sprawling but largely invisible infrastructure.

For the uninitiated, the voracious digital engines that power the cloud live in thousands of out-of-sight, nondescript warehouse-scale data centers, where thousands of refrigerator-sized racks of silicon machines run our applications and store the exploding volumes of data. Even many of the digital cognoscenti are surprised to learn that each such rack burns more electricity annually than 50 Teslas. On top of that, these data centers are connected to markets by even more power-burning hardware that propels bytes along roughly one billion miles of information highways made of glass cables and through 4 million cell towers, forging an even vaster invisible virtual highway system.
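To see how a single rack can out-consume 50 Teslas, here's a sketch; the rack's draw and the car's mileage and efficiency are assumed round numbers, not figures from the article:

```python
# A data-center rack runs around the clock; a car charges for a few
# thousand kWh a year. All constants are illustrative assumptions.

RACK_KW = 25                             # assumed steady draw of a dense modern rack
rack_kwh_per_year = RACK_KW * 24 * 365   # ~219,000 kWh, running 24/7

EV_MILES_PER_YEAR = 12_000               # assumed annual mileage
EV_KWH_PER_MILE = 0.30                   # assumed EV efficiency
ev_kwh_per_year = EV_MILES_PER_YEAR * EV_KWH_PER_MILE  # ~3,600 kWh

print(rack_kwh_per_year / ev_kwh_per_year)  # ~61 cars' worth of charging
```

The asymmetry comes from duty cycle: the rack never idles, while a car charges for only a tiny fraction of the hours in a year.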

Thus the global information infrastructure - counting all its constituent features from networks and data centers to the astonishingly energy-intensive fabrication processes - has grown from a non-existent system several decades ago to one that now uses roughly 2,000 terawatt-hours of electricity a year. That's over 100 times more electricity than all the world's five million electric cars use each year.

Put in individual terms: this means the pro rata, average electricity used by each smartphone is greater than the annual energy used by a typical home refrigerator. And all such estimates are based on the state of affairs of a few years ago.
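Both of those comparisons reduce to simple division. In the sketch below, the 2,000 TWh figure and the five million electric cars come from the article; the per-car charging, global smartphone count, and refrigerator usage are assumed round numbers:

```python
# Pro-rata arithmetic behind the last two comparisons.
# 2,000 TWh/year and 5 million EVs are the article's figures; the rest
# are assumptions for illustration.

INFRA_KWH = 2_000 * 1e9              # 2,000 TWh expressed in kWh

EV_COUNT = 5_000_000
EV_KWH_PER_YEAR = 4_000              # assumed per-car annual charging
print(INFRA_KWH / (EV_COUNT * EV_KWH_PER_YEAR))  # ~100x the world's EVs

SMARTPHONES = 3.5e9                  # assumed global smartphone count
FRIDGE_KWH_PER_YEAR = 350            # assumed typical home refrigerator
print(INFRA_KWH / SMARTPHONES)       # ~570 kWh per phone, more than a fridge
```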

A more digital future will inevitably use more energy

Some analysts now claim that even as digital traffic has soared in recent years, efficiency gains have muted or even flattened growth in data-centric energy use. Such claims run up against recent countervailing trends. Since 2016, there's been a dramatic acceleration in data center spending on hardware and buildings, along with a huge jump in the power density of that hardware.

Regardless of whether digital energy demand growth has slowed in recent years, a far faster expansion of the cloud is coming. Whether cloud energy demand grows commensurately will depend in large measure on just how fast data use rises and, in particular, on what the cloud is used for. Any significant increase in energy demand will make far more difficult the engineering and economic challenges of meeting the cloud's central operational metric: always available.

More square feet of data centers have been built in the past five years than during the entire prior decade. There is even a new category of "hyperscale" data centers: silicon-filled buildings that each cover over one million square feet. Think of these, in real-estate terms, as the equivalent of the dawn of skyscrapers a century ago. But while there are fewer than 50 hyper-tall buildings the size of the Empire State Building in the world today, there are already some 500 hyperscale data centers across the planet. And the latter have a collective energy appetite greater than that of 6,000 skyscrapers.

We don't have to guess what's propelling growth in cloud traffic. The big drivers at the top of the list are AI, more video and especially data-intense virtual reality, as well as the expansion of micro data centers on the "edge" of networks.

Until recently, most news about AI has focused on its potential as a job-killer. The truth is that AI is the latest in a long line of productivity-driving tools that will replicate what productivity growth has always done over the course of history: create net growth in employment and more wealth for more people. We will need a lot more of both for the COVID-19 recovery. But that's a story for another time. For now, it's already clear that AI has a role to play in everything from personal health analysis and drug delivery to medical research and job hunting. The odds are that AI will ultimately be seen as a net "good."

In energy terms though, AI is the most data hungry and power intensive use of silicon yet created - and the world wants to use billions of such AI chips. In general, the compute power devoted to machine learning has been doubling every several months, a kind of hyper version of Moore's Law. Last year, Facebook, for example, pointed to AI as a key reason for its data center power use doubling annually.
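To put "doubling every several months" in perspective against classic Moore's Law (a doubling roughly every two years), a quick sketch; the 3.4-month figure is an assumption borrowed from OpenAI's widely cited estimate for large machine-learning training runs, not a number from the article:

```python
# Compound growth from different doubling periods.
# 3.4 months is an assumed ML-training doubling period (OpenAI's estimate);
# 24 months stands in for classic Moore's Law.

def growth_factor(months_elapsed: float, doubling_months: float) -> float:
    """Total multiple after months_elapsed, doubling every doubling_months."""
    return 2 ** (months_elapsed / doubling_months)

years = 5
print(f"{growth_factor(years * 12, 3.4):,.0f}x")  # ~200,000x for ML compute
print(f"{growth_factor(years * 12, 24):.1f}x")    # ~5.7x under Moore's Law
```

Five years at that pace compounds to a multiple five orders of magnitude larger than the chip-efficiency curve the industry has historically relied on to absorb demand growth.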

In our near future we should also expect that, after weeks of lockdowns experiencing the deficiencies of video conferencing on small planar screens, consumers will be ready for the age of VR-based video. VR entails as much as a 1,000x increase in image density and will drive data traffic up roughly 20-fold. Despite fits and starts, the technology is ready, and the coming wave of high-speed 5G networks has the capacity to handle all those extra pixels. It bears repeating, though: since all bits are electrons, more virtual reality means more power demand than today's forecasts contain.

Add to all this the recent trend of building micro-data centers closer to customers on "the edge." Light speed is too slow to deliver AI-driven intelligence from remote data centers to real-time applications such as VR for conferences and games, autonomous vehicles, automated manufacturing, or "smart" physical infrastructures, including smart hospitals and diagnostic systems. (The digital and energy intensity of healthcare is itself already high and rising: a square foot of a hospital already uses some five-fold more energy than a square foot in other commercial buildings.)

Edge data centers are now forecast to add 100,000 MW of power demand before a decade is out. For perspective, that's far more than the power capacity of the entire California electric grid. Again, none of this was on any energy forecaster's roadmap in recent years.

Will digital energy priorities shift?

Which brings us to a related question: Will cloud companies in the post-coronavirus era continue to focus spending on energy indulgences, or on availability? By indulgences, I mean corporate investments made in wind/solar generation somewhere else (including overseas), rather than investments that directly power one's own facility. Those remote investments are 'credited' to a local facility so that it can claim to be green-powered, even though they don't actually power it.

Nothing prevents a green-seeking firm from physically disconnecting from the conventional grid and building its own local wind/solar generation - except that doing so while ensuring 24/7 availability would raise that facility's electricity costs by roughly 400 percent.

As it stands today, regarding the prospects for purchased indulgences, it's useful to know that the global information infrastructure already consumes more electricity than all of the world's solar and wind farms combined produce. Thus there isn't enough wind/solar power on the planet for tech companies - much less anyone else - to buy as 'credits' to offset all digital energy use.

The handful of researchers who are studying digital energy trends expect that cloud fuel use could rise at least 300 percent in the coming decade, and that was before our global pandemic. Meanwhile, the International Energy Agency forecasts a 'mere' doubling in global renewable electricity over that timeframe. That forecast was also made in the pre-coronavirus economy. The IEA now worries that the recession will drain fiscal enthusiasm for expensive green plans.
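The mismatch between those two growth rates is easy to see in round numbers. In the sketch below, the 2,000 TWh starting point comes from earlier in the article; reading "at least 300 percent" as a 4x multiple, and putting today's global wind/solar output just below the cloud's consumption (as the previous paragraph implies), are my assumptions:

```python
# The gap between cloud demand and wind/solar supply widens even if
# renewables double. Starting figures are the article's; the growth
# readings are assumptions.

cloud_now_twh = 2_000
cloud_next_decade_twh = cloud_now_twh * 4    # "at least 300 percent" rise

wind_solar_now_twh = 1_800                   # assumed: just below cloud use
wind_solar_next_decade_twh = wind_solar_now_twh * 2  # IEA's 'mere' doubling

shortfall = cloud_next_decade_twh - wind_solar_next_decade_twh
print(f"shortfall: ~{shortfall:,} TWh/year")  # ~4,400 TWh/year
```

Even under these generous readings, the offset gap grows rather than closes over the decade.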

Regardless of the issues and debates around the technologies used to make electricity, the priority for operators of the information infrastructure will increasingly, and necessarily, shift to its availability. That's because the cloud is rapidly becoming even more inextricably linked to our economic health, as well as our mental and physical health.

All this should make us optimistic about what comes on the other side of the recovery from the pandemic and the unprecedented shutdown of our economy. Credit Microsoft, in its pre-COVID-19 energy manifesto, for observing that "advances in human prosperity … are inextricably tied to the use of energy." Our cloud-centric 21st-century infrastructure will be no different. And that will turn out to be a good thing.
