Tuesday, June 24, 2014

The Future Electric Grid

 

It was only a little more than ten years ago that a National Academy of Engineering report ranked the invention of the electric grid at the top of a list of the 20 greatest inventions of the 20th century. Not just one of the great engineering achievements, but first amongst them. The Academy ranked the Internet 13th.
Now we hear increasingly that technology is making today’s electric utility model “obsolete” and will put its companies into a “death spiral.” Is it possible that so much has changed so quickly?
Post-utility advocates point to three technologies as disrupters: photovoltaics (PV), batteries, and smart or micro grids. The U.S. Department of Energy (DOE), along with a conga line of venture firms in Silicon Valley, invested tens of billions of dollars in these three domains over the past half-dozen years. Volumes of analyses and claims can be summarized in three paragraphs:
Solar arrays on the roofs of homes and buildings, it is argued, will obviate central power generation, especially much-reviled coal plants, and will do so rapidly, as PV costs decline and approach “grid parity.” The Department of Energy released a report chronicling the progress, titled Solar Revolution, that inspired palpitations from New York Times columnist Paul Krugman, who wrote that “it’s no longer remotely true that we need to keep burning coal to satisfy electricity demand.”
Lithium battery technology, incredibly improved courtesy of the mobile Internet, will now, we are told, migrate into the basements of homes and buildings to store PV electricity for nights and cloudy days, obviating the grid as backup. The global proliferation of lithium-powered hybrid-electric cars is just a first step. And when Tesla recently announced plans to build a “gigafactory” that would alone produce more than all of the world’s existing lithium battery factories combined, the green-tech media erupted with excitement, claiming such economies of scale promise a revolution, not just for electric cars, but for the grid as well.
Finally, third in the triad, a smart grid, particularly in the form of “microgrids,” connects everything. With far more granular and real-time information about how much, when, and where electricity is used, advocates assert that social and economic behavior will change to radically reduce energy use and further undermine utility revenues.
These three technology forces in combination, the post-utility analysts claim, will “transform the way the utility industry meets energy demand.” It is, we frequently hear, analogous to and as inevitable as the destruction of the Ma Bell landline phone model when cell phones emerged. (Apparently none who offer this analogy notice that AT&T is doing just fine, and is still a huge if differently regulated business.)
The central problem with this post-utility construct is that the physics of information and electricity are profoundly different, and render the Bell analogy meaningless. More on that shortly. First though, it is true that the nation’s electric grid is morphing, but just not quite the way green energy proponents imagine.
The need for a harder grid
Modern society is in far more urgent need of a harder grid than a greener one. Demand for reliability is rising faster than demand for kilowatt-hours themselves. Two words epitomize this new reality – Metcalf and Sandy.
In the aftermath of Hurricane Sandy’s widespread and persistent outages, federal and state policymakers called for more spending on grid resilience and recovery. And more recently, policymakers and utilities have been reacting to the revelation of a terrorist-like gunfire attack on California’s Metcalf substation last year, an incident kept secret until this past spring. That attack prompted a flurry of “what if” scenarios about potential blackouts from future, similar attacks on any of the nation’s tens of thousands of substations.
Most blackouts, though, stem from far more mundane events: car accidents, squirrels chewing through cables, and old equipment failing. The average incidence of grid outages has been rising at about 8 to 10 percent annually since 1990, and the average duration of outages has been rising by about 14 percent per year. (Eaton Corporation provides revealing state-by-state data and trends in its Blackout Tracker.) And then there are the rising concerns over cyber attacks on the grid – arguably one of the most critical areas, demanding increased spending and attention.
All this comes at a time of greater demand for “always on” power to keep our digital and information-centric economy humming. Electricity powers everything people think is modern about our economy, from conventional but indispensable things like lights, motors, refrigerators, and air conditioners, to new technologies like the Internet, electric cars, 3D printing, and gene sequencing.
The share of U.S. GDP associated with information is three times bigger than the share associated with the transportation sector that moves people and stuff. The former is entirely dependent on electricity and is growing far faster than the latter, which runs on oil. (For more on the Cloud’s surprising electricity appetite, see my earlier report.)
It should thus be unsurprising to learn that studies find the cost of outages, measured per kilowatt-hour, is ten to ten thousand times more than the cost of the power itself.
Even as the importance of reliability grows, the consumption of kilowatt-hours keeps growing too, despite billions invested in trying to stifle that growth. U.S. electric demand today is 10 percent higher than it was in 2001 – a seemingly modest increase, but on a grid of America’s scale it equals Italy’s entire annual consumption. Looking ahead, the Energy Information Administration (EIA) forecasts a 12 percent rise over the next decade, which would require the United States to add capacity equal to Germany’s entire current grid.
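A rough back-of-the-envelope check of those scale comparisons, sketched in Python. The consumption figures are rounded assumptions of my own (roughly 3,650 TWh for the U.S. in 2001, 310 TWh for Italy, 540 TWh for Germany), not numbers from the article:

```python
# Back-of-the-envelope check of the scale comparisons above.
# All figures are rough, rounded assumptions (terawatt-hours per year), not exact data.
us_2001 = 3_650                   # approx. U.S. electricity use in 2001, TWh
us_today = us_2001 * 1.10         # "10 percent higher than in 2001"
italy = 310                       # approx. Italian annual consumption, TWh
germany = 540                     # approx. German annual consumption, TWh

growth_since_2001 = us_today - us_2001
growth_next_decade = us_today * 0.12      # EIA's forecast 12 percent rise

print(f"Growth since 2001:  ~{growth_since_2001:.0f} TWh (Italy uses ~{italy} TWh)")
print(f"Growth next decade: ~{growth_next_decade:.0f} TWh (Germany uses ~{germany} TWh)")
```

With those assumed round numbers, the decade of growth already absorbed comes out near Italy’s consumption, and the forecast decade of growth lands in Germany’s neighborhood, consistent with the comparisons above.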
Thus the future will not be dominated by bolting more renewables onto the grid for their own sake, but by using them to meet growing demand and to add resiliency and reliability.
Technological limits
Smart grids. The key to a more resilient, flexible, and useful grid is to operate it like the Internet, which is nodal, interactive, and highly controllable. This is where “smart” meters and microgrids come in, and where solar energy and batteries play a role.
An Internet-like grid will know how much power is needed, when and where, and even what “flavor” of electrons some customers prefer – say, greener or cheaper. It would help moderate swings in peak demand by using software to negotiate in real time with local and remote power sources, and by purchasing “avoided” power (temporarily cycling off air conditioners and refrigerators, but not computers and TVs). It would also reduce the frequency of outages through predictive analytics that flag maintenance needs before failures occur. And when failures do occur, it would reduce their duration by more rapidly locating and identifying faults and optimally dispatching repair crews.
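As a toy sketch of the “avoided power” idea, the following Python snippet cycles off deferrable loads such as air conditioners and refrigerators, but never computers or TVs, when demand would exceed available supply. All load names and numbers are hypothetical illustrations, not part of any actual smart-grid system:

```python
# Toy sketch of "avoided power" dispatch: when demand would exceed available
# supply, temporarily cycle off deferrable loads (never the non-deferrable ones).
loads = [
    {"name": "air_conditioner", "kw": 3.5, "deferrable": True},
    {"name": "refrigerator",    "kw": 0.2, "deferrable": True},
    {"name": "computers",       "kw": 0.4, "deferrable": False},
    {"name": "tv",              "kw": 0.1, "deferrable": False},
]

def shed_load(loads, available_kw):
    """Return the loads to cycle off so total draw fits the available supply."""
    total = sum(l["kw"] for l in loads)
    shortfall = total - available_kw
    cycled_off = []
    # Shed the largest deferrable loads first until the shortfall is covered.
    for l in sorted(loads, key=lambda l: l["kw"], reverse=True):
        if shortfall <= 0:
            break
        if l["deferrable"]:
            cycled_off.append(l["name"])
            shortfall -= l["kw"]
    return cycled_off

print(shed_load(loads, available_kw=1.0))   # -> ['air_conditioner']
```

A real grid negotiates with millions of such loads and with generators at once, which is what makes the control problem, rather than the metering, the hard part.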
But thus far, spending on the smart grid has been dominated by smart meters that allow more granular and frequent readings and the transmission of that data to the utility, eliminating the old-fashioned meter reader. But just adding a communications feature to the meters is not deeply game-changing; it is the equivalent of installing a speedometer and gas gauge without a steering wheel and brakes. The game-changer is in controlling power.
Internet-like real-time control of power is today mainly found at low power levels inside homes and buildings, not on the grid, under the unimaginative label “building automation.” Within the smart-grid architecture this is a small piece; to continue the information analogy, it is the equivalent of stand-alone mainframe computing before the Internet. But controlling megawatt-hours, not megabytes, on big grids is a daunting technology problem.
The difference between those two power levels – controlling traffic on the Internet versus controlling grid-power traffic – is what dictates the physical, material, and safety challenges. That difference is comparable to going from controlling a toy drone to controlling a Boeing 777. Technologies are emerging that make grid-level dynamic switching and control possible, but they will take some time yet to deploy. In the future you will hear a lot more about new classes of power transistors and semiconductors, such as gallium nitride and silicon carbide, that can manage weapons-grade flows of electrons.
It’s still early days for such technology, and deployment in smart microgrids has barely begun. The country’s most successful and arguably only operational microgrid to date is on the campus of the University of California at San Diego. That 40 MW microgrid seamlessly exits the local public grid when regional demand (or prices) peak, and keeps the campus and its supercomputer lit with on-site power that includes fuel cells, solar arrays, batteries, and natural gas turbines. Notably it’s natural gas that supplies 75 percent of the on-site power.
Microgrids are a start but not the end game. To continue the information analogies, microgrids no more replace central power plants than WiFi networks replace Google’s central computing.
Photovoltaics. It is with the collapsing cost of PV cells that post-utility advocates assert we are close to the tipping point for grid and central power plant disruption.
The capital cost of PV has improved by a remarkable 200 percent in the past decade. But that rate of decline is slowing as the underlying technologies mature and physics limits are approached. (This happens to everything: aircraft engines improved by more than 200 percent in their early years too, and now get better by single-digit percentages at best.) Going forward, Germany’s Fraunhofer Institute recently estimated that PV costs will decline by another 30 percent by 2030 – important, but hardly revolutionary. And today, an unsubsidized PV array on homes and buildings, Fraunhofer notes, produces far more expensive electricity than a central power plant does.
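To put those two rates side by side, here is a small annualized-rate calculation. Interpreting the “200 percent” improvement as roughly a threefold gain in output per dollar (my reading, not the article’s wording), and spreading the Fraunhofer 30 percent decline over the roughly sixteen years to 2030, the slowdown is plain:

```python
# Annualized PV cost-decline rates implied by the figures above.
# Assumption: a "200 percent improvement" means output per dollar roughly tripled,
# i.e., cost per watt fell to about one-third of its level ten years earlier.
past_decline_per_year = 1 - (1 / 3) ** (1 / 10)     # past decade: ~10% per year
future_decline_per_year = 1 - 0.70 ** (1 / 16)      # 30% total over ~16 years: ~2% per year

print(f"Past decade:  ~{past_decline_per_year:.1%} per year")
print(f"Through 2030: ~{future_decline_per_year:.1%} per year")
```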
And, it is argued, the central plant depends on a costly grid to get power to consumers. But solar needs the grid too. In order to ensure the 24-7 electric supply society demands, a PV array today uses the grid as “back-up.” But that raises questions about how to share the cost of the grid’s power plants and infrastructure, an issue regulators are struggling with in many states.
The alternative is to convert episodic on-site solar generation into “always on” power using batteries, or on-site back-up generators. The latter solution, distributing millions of small car-sized engine-driven power plants to every home or office to back up solar arrays, is not economically viable, much less sensible. It is battery technology that post-utility solar advocates hold out as the Holy Grail. Just store the electricity for when the sun’s not shining.
Batteries. Assume for the sake of argument that big batteries are cheap. Even then, a solar-only or solar-dominated system remains economically untenable. Supplying electricity all day, every day with a battery-solar combination requires, on average, buying two to three solar panels for every one otherwise needed, in order to generate and store extra power while the sun is shining – thereby doubling or tripling system costs.
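A rough illustration of that oversizing arithmetic. The one-third daytime share and the 85 percent battery round-trip efficiency below are illustrative assumptions of my own, not figures from the article:

```python
# Why around-the-clock solar needs a much larger array (illustrative round numbers).
daytime_share = 1 / 3       # assumed fraction of daily use that occurs while the sun shines
battery_roundtrip = 0.85    # assumed round-trip efficiency of the battery

# Array sized only to serve the daytime load directly (relative units):
array_daytime_only = daytime_share
# Array sized to also generate, and store with losses, the night-time remainder:
array_around_the_clock = daytime_share + (1 - daytime_share) / battery_roundtrip

oversize_ratio = array_around_the_clock / array_daytime_only
print(f"~{oversize_ratio:.1f}x the panels, i.e. ~{oversize_ratio - 1:.1f} extra for every one")
```

Under those assumptions the array must be roughly three and a half times larger, in line with the two-to-three-extra-panels figure above, and that is before counting the batteries themselves.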
And even assuming again that batteries are cheap, the world would have difficulty producing enough of them to matter at grid scale. California policymakers apparently think otherwise, having implemented the nation’s only mandate for that state’s utilities to install grid-scale storage. Consider the reality of simple arithmetic:
All the world’s lithium battery factories collectively produce about 30 GWh (30 billion watt-hours) of storage capacity annually. The United States alone consumes about 4,000,000 GWh of electricity a year. Thus the entire planet’s annual production of lithium batteries, for all purposes, can store about five minutes’ worth of U.S. electric demand.
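The arithmetic behind that figure, using only the numbers in the paragraph above (the calculation lands near four minutes, in line with the rounded “about five”):

```python
# Minutes of U.S. electric demand storable with one year of global lithium-battery output.
world_battery_output_gwh = 30          # annual global lithium-battery production, GWh
us_annual_consumption_gwh = 4_000_000  # annual U.S. electricity consumption, GWh

us_demand_per_minute_gwh = us_annual_consumption_gwh / (365 * 24 * 60)
minutes_of_storage = world_battery_output_gwh / us_demand_per_minute_gwh
print(f"~{minutes_of_storage:.1f} minutes of U.S. electric demand")   # ~3.9 minutes
```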
As for Tesla’s putative gigafactory, if it gets built, its entire annual output would add another roughly five minutes of U.S. grid-scale storage. And at that, Tesla batteries cost at least 500 percent more than today’s solution for providing electricity when outages or peaks happen: reliability of supply comes from building extra power plants to keep on standby, and from storing gas in caverns or coal in piles adjacent to those power plants.
This reality is precisely why supplying reliable, affordable electricity at the scale of a whole society is such a great engineering challenge, and why the National Academy honored that achievement. For other energy commodities (and, in general, most commodities), it is technically easy and inexpensive to store several months’ – not minutes’ – worth of demand at any given time in order to ensure price stability and physical reliability.
Still, better battery technology will emerge in due course. But it will come from some university research lab using big data to unravel chemical mysteries, not from building bigger buildings using yesterday’s chemistry. And you can bet that the company that invents the new way to store electricity will focus on selling into the huge high-value market for powering mobile devices. That’s where battery fortunes will be made, because consumers pay $20 per kilowatt-hour to keep iPads and iPhones lit, compared to $0.20 a kWh to keep the grid lit.
And as better, cheaper battery technology does emerge, it will be just as valuable – arguably more valuable – for conventional power plants, for reasons of simple economics. One would store the cheapest electrons when they are in surplus – say, coal-fired electricity at night delivered on uncongested lines – and resell them later when prices and grid congestion are highest, around midday. Cheap grid-scale batteries would thus arbitrage and reinforce a complementary role for coal and solar. Somewhat ironic, perhaps.
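A minimal sketch of that arbitrage logic. The prices, efficiency, and battery cost below are hypothetical illustrations, not market data or figures from the article:

```python
# Sketch of the storage-arbitrage economics described above.
night_price = 0.03           # $/kWh, off-peak surplus power bought at night (assumed)
midday_price = 0.12          # $/kWh, on-peak power sold around midday (assumed)
roundtrip_efficiency = 0.85  # fraction of stored energy recovered (assumed)

# Gross margin earned on each kWh bought at night and resold at midday:
margin_per_kwh = midday_price * roundtrip_efficiency - night_price   # ~$0.07

# The battery pays for itself only when that margin, accumulated over many cycles,
# exceeds its installed cost per kWh of capacity.
battery_cost_per_kwh = 400   # assumed installed cost, $ per kWh of capacity
cycles_to_break_even = battery_cost_per_kwh / margin_per_kwh

print(f"Margin: ${margin_per_kwh:.3f}/kWh; ~{cycles_to_break_even:.0f} daily cycles to break even")
```

At one cycle per day, those assumptions imply roughly fifteen years before breakeven, which is why cheap grid storage favors whichever electrons are cheapest to begin with, whatever their source.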
Speaking of coal, it’s impossible to talk about the grid and not make note of the carbon and global warming issues. To tilt the field away from hydrocarbon fuels, policymakers and regulators have taken actions that increase their costs, and also subsidize non-hydrocarbon energy. But the laws of physics of energy, and laws of economic reality, cannot be ignored. Politicians face increasing peril if their policies cause something as important as electricity to become increasingly expensive and less reliable.
Today’s technological breakthroughs: Smart drilling and big data
The two technologies that are reshaping the electric grid, allowing both more resilience and reliability, are not the ones pundits and policymakers expected: smart drilling, and big data.
Smart drilling has unleashed an entirely unexpected bounty of shale gas, not only making it easier to meet rising demand, but also to ensure reliability with plenty of spare capacity, all at low cost.
Just as technology led to a 200 percent drop in the capital needed to produce a unit of PV electricity over the past decade, so too has technology driven a similar 200 percent (and even greater) decline in the capital needed to produce a unit of shale gas – but the latter has happened in the past four years, and continues.
EIA sees natural gas and coal, in almost equal shares (coal still dominates in EIA’s forecast), providing about 70 percent of electric supply a decade out. These two fuels are now in a race to the bottom in terms of price, to the benefit of consumers and ensuring a permanent, low-cost electricity future (absent meddling from policymakers) that will confer an enormous economic advantage on U.S. industries.
The other disruptor, big data analytics, is made possible by the combination of proliferating low-cost sensors, ubiquitous wireless connectivity, and the Promethean power of computing. While big data will eventually impact every sector of the economy, one of the immediate benefits that real-time analytic intelligence brings is to wring greater value out of existing supply chains and infrastructures.
For some indication of the power of big data, consider the unheralded success of PJM – the system operator for the long-distance transmission network spanning the region between Chicago, New York City, and Washington, D.C. – and its new sensor, control, and big-data analytics system. Since coming online a little more than a year ago, it has not only delivered greater reliability and hundreds of millions of dollars in operational savings, but has also doubled the system’s power-carrying capacity without adding new power lines.
While it is technically more difficult to implement that architecture on local grids – the urban roads of the grid, versus the interstates of long-distance transmission – that is what will come next. Big data will also create greater markets for solar arrays and batteries as features of a central-power-plant grid. And while the future may see different kinds of companies owning some or all of the pieces of the grid, it will still be a grid, and to consumers it will still feel very much like a “utility.”
Still, some utility CEOs, notably NRG’s outspoken chief David Crane, believe in a post-grid world. Earlier this year, Crane said: “Think how shockingly stupid it is to build a 21st-century electric system based on 120 million wooden poles... the system from the 1930s isn’t going to work in the long term.” This is not much of an improvement over the flawed Ma Bell analogy. Today we still build furniture and houses out of wood, a system predating the Romans; we just use the same materials much more efficiently.
When the National Academy of Engineering gets around to a 21st-century retrospective, odds are that the top-ranked achievements will include whatever we end up calling big data and whatever we end up calling the technologies that unlocked the shale. And the grid, pioneered in the 20th century, will still be around – just a lot better.

Mark P. Mills is a senior fellow at the Manhattan Institute and CE
