This story first appeared on the Atlantic website and is reproduced here as part of the Climate Desk collaboration.
Being a utility executive used to be a sweet gig.
State regulators told you how much you could charge your customers for electricity and dictated your profit margin. Your job was to build big power plants, or buy energy from those that did, and distribute it to your customers. And those customers weren't exactly going anywhere. After all, you owned the transmission lines that delivered your electrons to their homes. In other words, it was a bit like sitting in the corner suite of AT&T, circa 1981, when Ma Bell was the only game in telephone town.
Those days are over. Regulators now want you to obtain a growing percentage of the electricity you sell from wind, solar, and other renewable sources that are carbon-free but intermittent, which plays havoc with the power grid. And your customers? They’re increasingly generating their own electricity from rooftop solar arrays, fuel cells, wind farms, and self-contained power systems called microgrids. The rapid expansion of this so-called distributed generation deprives utilities of revenues while leaving them liable for maintaining the grid. And increasingly severe weather spawned by climate change is raising doubts about the wisdom of relying on a centralized power system.