Renewable Energy Technologies Just as Reliable as Fossil Fuel Plants
24 Jan, 2008 01:15 pm
The "intermittency" of wind and solar is not a reason to reject renewables
While it is certainly true that the output from conventional power plants can be measured quite accurately, virtually every other aspect of planning for and implementing that resource is riddled with uncertainty. Three types of uncertainty are most common: variance in construction costs, variance in short-term demand forecasts, and variance in long-term demand forecasts.
First, a large variance exists between the projected and actual costs of conventional power plant construction. Experience has shown that project delays and other unforeseen problems can lead to considerable cost overruns and even project cancellations. Generally, the larger the project (in terms of installed capacity and thus cost), the longer it takes to complete and the more exposed it is to unforeseen changes (such as interest rates, labor costs, and environmental regulations). The very fact that large power plants take many years to construct, and that completion dates are imprecise, adds uncertainty to the electric system.
Second, once large projects get built, their output is often subject to rapidly changing patterns in consumer demand (and thus required load). Weather events such as sudden thunderstorms can persuade customers to switch on lights, just as unexpected hours of sunshine can convince them to turn them off. Millions of people are constantly switching on and off equipment—televisions, lights, computers—that demand instant power.
In the modern, restructured electricity market, system operators typically employ complicated forecasting techniques to minimize such uncertainty. The New York, New England, and PJM independent system operators determine load imbalance at five-minute intervals and use supply curves to dispatch the load-following units participating in the real-time market. System operators employ an automatic generation control (AGC) system to manage minute-to-minute load imbalances (a service known as “regulation”). Units participating in AGC are equipped with governors that sense a change in frequency and automatically adjust output. Intra-hour dispatch every few minutes allows the units providing regulation to return to their nominal set points. To enhance system reliability, AGC units operate at lower power output than optimal economic dispatch would dictate absent the requirement to follow changing loads. Thus the entire electric utility system is already built to address variability, just of a different type.
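To make the regulation mechanism concrete, here is a minimal sketch (in Python, purely illustrative) of the kind of proportional response an AGC-style controller applies: when frequency sags below nominal, a regulating unit raises its output toward its upper limit, and vice versa. The frequency-bias constant, deadband, and unit limits are assumptions for the example and are not taken from any actual system operator.

```python
# Minimal sketch of how an AGC regulating unit might respond to load imbalance.
# The gain, deadband, and setpoints below are illustrative assumptions, not
# parameters of any actual ISO's regulation scheme.

NOMINAL_FREQ_HZ = 60.0
FREQ_BIAS_MW_PER_HZ = 500.0   # assumed system frequency-bias constant
DEADBAND_HZ = 0.005           # ignore tiny deviations

def regulation_adjustment(measured_freq_hz, unit_setpoint_mw,
                          unit_min_mw, unit_max_mw):
    """Return a new output level for a unit providing regulation.

    A drop in frequency signals that load exceeds generation, so the unit
    raises output; a rise in frequency signals surplus, so it backs down.
    """
    error_hz = NOMINAL_FREQ_HZ - measured_freq_hz
    if abs(error_hz) < DEADBAND_HZ:
        return unit_setpoint_mw                      # within deadband: hold
    adjustment_mw = FREQ_BIAS_MW_PER_HZ * error_hz   # proportional response
    new_output = unit_setpoint_mw + adjustment_mw
    return min(max(new_output, unit_min_mw), unit_max_mw)  # respect unit limits

# Example: frequency sags to 59.98 Hz, so a unit scheduled at 80 MW
# (with 50-120 MW limits) is nudged upward by about 10 MW.
print(regulation_adjustment(59.98, 80.0, 50.0, 120.0))   # -> 90.0
```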
Third, utility resource acquisition decisions are based on forecasts of future customer demand, which can be highly uncertain. We have a hard enough time predicting the weather or political elections; imagine the difficulty in projecting how an entire industry will look 10 to 20 years down the road. Changes in industry structure and long-term climatic conditions such as drought or unpredicted heat can particularly impact large hydroelectric and nuclear facilities. Uncertainty in long-term forecasting was widely encountered in the utility industry in the 1970s and 1980s, when excessively high forecasts of growth in demand for electricity led to overbuilding of electric generating plants and massive electric system cost overruns in many states. A somewhat infamous example of this was in Washington State, where the Washington Public Power Supply System (WPPSS) began a construction program for as many as seven new nuclear power plants in the early 1970s.
After large cost overruns and collapsing electricity demand growth in the late 1970s and early 1980s, the power system faced financial disaster and all but one of those plants was cancelled, leading to what was, at the time, the country’s largest municipal bond default. The entire experience came to be called the “WHOOPS” fiasco (a play on the WPPSS acronym), and experts have called it “an enduring illustration of the risk associated with large electric system supply-side investments.”
Renewable energy technologies such as wind and solar minimize each of these sources of variability by deploying technologies that are smaller, more modular, and less capital-intensive. Classic grid systems are typically “lumpy systems” in the sense that additions to capacity are made primarily in large lumps (gargantuan power plants, new transmission lines). These plants have long lead times and uncertainties, making planning and construction difficult, especially when the balance of supply and demand can change rapidly within a short period of time.
In contrast, renewable energy technologies tend to have shorter lead times than conventional coal and nuclear plants, which can take 5 to 15 years to plan, permit, and construct. Shorter lead times enable a more accurate response to load growth and minimize the financial risk associated with borrowing hundreds of millions of dollars while plants are built. Florida Power and Light says it can take as little as 3 to 6 months from groundbreaking to commercial operation of a new wind farm.
Because renewable energy technologies can be produced at smaller scale, they can be located (or situated) almost anywhere, enhancing their ability to match smaller increments of demand. In the case of unexpected changes, renewable energy technologies limit financial risk and capital exposure. Modular plants can be cancelled more easily, so that stopping a project is not a complete loss (and the portability of most renewable energy systems means value can still be recovered if the technologies need to be resold as commodities in a secondary market). Smaller units with shorter lead times reduce the risk of purchasing a technology that becomes obsolete before it is installed, and quick installations can better exploit rapid learning, as many generations of product development can be compressed into the time it would take to build one giant plant.
Conventional power plants operating on coal, natural gas, and uranium are subject to an immense amount of variability. The issue, therefore, is not one of variability or intermittency per se, but how such variability and intermittency can best be managed, predicted, and mitigated. And the advantages of renewables—in addition to being theorized for the past three decades—have been empirically proven in large parts of the world in the past few years.
References
Christopher Cooper and Benjamin Sovacool, Renewing America: The Case for a National Renewable Portfolio Standard (New York: Network for New Energy Choices, 2007), ix + 158pp, available at http://www.newenergychoices.org/dev/uploads/RPS%20Report_Cooper_Sovacool_FINAL_HILL.pdf.
Ed Vine, Marty Kushler, and Dan York, “Energy Myth Ten—Energy Efficiency Measures are Unreliable, Unpredictable, and Unenforceable,” In B.K. Sovacool and M.A. Brown (Eds.) Energy and American Society—Thirteen Myths (New York: Springer, 2007), pp. 265-288.
One of the most widely abundant renewable energy sources (that you've never heard of) is the Convective Available Potential Energy in the troposphere. It has the advantage of being at its maximum precisely when the thermal stress on the population reaches its plateau--between noon and 9 pm.
The amount of it in the atmosphere is huge. A great example of this abundance is the near-permanent Catatumbo convective storm at the southeastern end of Lake Maracaibo in Venezuela (the warmest large body of water on the planet). It is estimated to produce at least 10% of all the lightning-produced protective ozone on the planet. If only a small fraction of this energy could be harnessed, it could provide electricity for the entire western half of Venezuela and the eastern part of Colombia.
As I discussed several weeks ago, the technology to harvest it is already available in the form of the Atmospheric Vortex Engine. It would take only a modest amount of investment to develop it, maybe as little as $50 MM total, and could be accomplished in only a couple of years. Compared to the amount invested in "clean coal" and sequestration, it is only a drop in the bucket.
Then, China could be provided with an alternative to building a "coal plant a week", and we here in NA could wean ourselves off this grossly polluting technology as well.
See www.vortexengine.ca and educate yourselves on the enormous potential of the AVE technology.
Solar power is reliable enough to serve as a peak daytime power source in the desert Southwest. However, cloud cover decreases the reliability of solar power in much of the United States. Solar power systems require extremely expensive carbon-fueled backup systems. Thus, outside the Southwest, solar power is not a good candidate to provide daytime peak power, and nowhere is solar a candidate for nighttime peak power. Thus the case for renewables as a reliable power source is seriously flawed.
[Response] Charles, I urge you to read up on some of the recent studies on renewables that challenge almost all of your claims. See: National Hydropower Association. (2002). "Averting Disaster: Keeping the Lights on with Hydropower," Issue Brief, Tables 2 and 3, which shows significant hydropower resource availability in the US; Wendell A. Duffield and John H. Sass, Geothermal Energy: Clean Power from the Earth's Heat (Washington, DC: U.S. Department of Interior/U.S. Geological Survey, 2003), which argues that the US has immense remaining geothermal potential; And, most importantly: A November 2007 article from two Stanford researchers showing wind can provide reliable baseload power, available at http://www.stanford.edu/group/efmh/winds/aj07_jamc.pdf; Paul Denholm's technical paper on hybrid biomass-wind baseload plants, available at http://www.nrel.gov/docs/fy06osti/38270.pdf; Denholm, Paul, G.L. Kulcinski, and T. Holloway. (2005) "Emissions and Energy Efficiency Assessment of Baseload Wind Energy Systems," Environmental Science and Technology 39, 1903-1911.
http://www.decaturdaily.com/stories/3020.htm
TVA would love to find more hydro resources, but there are none in the Tennessee Valley, and the same holds true for the rest of the Southeast. Because of the limited amount of water in Tennessee rivers, TVA considers its hydro generators peak rather than base power.
Dry geothermal technology is untested, and its costs are probably not competitive. Recently a geothermal project triggered an earthquake in Switzerland. The government shut the project down and threatened to take legal action against the project developers. This was not good advertising for geothermal. If a good geothermal resource is no more than a mile down and has water under pressure at over 100 C, then I seriously doubt that there are good geothermal resources in the Southeast. To boot, the summer wind resources of the Southeast are poor, and the Southeast has cloudy days all year round. Thus by far the most reliable carbon-free electrical source is nuclear power.
As you must be aware, capacity utilization for both solar and wind electrical generation runs about 20%. An argument for the reliability of wind rests on data showing that, for 10 selected sites, at any given time at least 3 can be counted on to produce. Since any one wind generator at a site can be counted on to produce at no more than 20%, the reliable base power would be 20% of the total rated power of the 3 sites. That would be 6% of the rated power of the 10 sites. In contrast, nuclear plants have capacity utilization rates of up to 90%, and their shutdowns can be managed to give peak power generation capacity at peak demand times.
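A quick back-of-the-envelope reproduction of that arithmetic, using only the figures cited in the comment (the 100 MW per-site rating is an illustrative assumption), shows where the 6% figure comes from:

```python
# Reproducing the back-of-the-envelope arithmetic above, using only the
# figures cited in the comment (20% capacity factor, 3 of 10 sites reliably
# producing). All numbers are illustrative.

rated_mw_per_site = 100.0          # assume each site is rated at 100 MW
num_sites = 10
reliable_sites = 3                 # sites assumed producing at any given time
capacity_factor = 0.20             # wind capacity factor cited above

reliable_output_mw = reliable_sites * rated_mw_per_site * capacity_factor
total_rated_mw = num_sites * rated_mw_per_site

print(reliable_output_mw / total_rated_mw)   # -> 0.06, i.e. 6% of total rated power
```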
[Response] Charles,
I find it very ironic that a nuclear advocate is arguing against renewables on the grounds that some use water. Yes, hydroelectric plants suffer from drought (and also lose water due to evaporation). But nuclear plants and other thermoelectric facilities are much, much worse. Three plants in the South have already had to temporarily close because of lack of water, and the French heat wave a few years back forced France to curtail a significant portion of its nuclear capacity for days. "Real world" facts mean wind, solar, and other less water-intense generators have huge advantages.
In a similar vein, the argument that geothermal is "untested" and should be discounted would also apply to Generation IV nuclear reactors and a host of other technologies. Why is it we should trust that nuclear can overcome its technical difficulties, but geothermal cannot?
For the record, nuclear is in no way carbon free. An assessment I am doing of 103 different lifecycle studies of nuclear energy shows an average of 66 grams of CO2 emitted for every kWh of nuclear power. I'd be happy to forward you a copy of that study once it's done.
And if you read the Denholm article I suggested, when coupled with economically feasible compressed air storage, wind power can operate as a baseload plant with capacity factors above 70 percent.
[Response] I'm not sure I ever said we should dam every river, lake, and stream, nor did any "green dogma" ever say so. If you're talking about true cost-benefit analysis, though, why don't we account for all positive and negative attributes of fossil, nuclear, and renewable energy systems, and see which ones are truly the best? I concede that nuclear will probably fare much better than coal and natural gas, but I think renewables will be far ahead of nuclear.
What about intermittency? Where will the power come from when the wind isn't blowing and the sun isn't shining?
Since the subject of water came up, let me point out that there are work-arounds. Thermal plants can use dry cooling towers where necessary. Better, the waste heat can be used productively as industrial heat or for heating buildings.
The puzzle of how to get dry geothermal energy out of the ground has been studied for over 30 years and is no closer to solution than before. In contrast, Gen IV power plants are an extension of present technology.
Micro-hydro was official dogma for a long time, but maybe before you got involved in this. The last time somebody from UCS brought it up as the "real solution" was probably ten years ago.
[Response] Rob,
We can disagree about whether Generation IV plants or geothermal plants are technically feasible, or whether micro-hydro has potential, but on many of these issues you are flat-out wrong.
The argument about intermittency is not that small power increments go in faster; the more intermittent sources you connect, the smoother they operate, and the more reliable they become. Remember, the DOE is now admitting wind plants can run at 70 percent capacity factor when coupled with compressed air storage, approaching the low end of coal and oil plants.
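The smoothing claim can be illustrated with a toy simulation (not drawn from any of the cited studies): if site outputs are treated as imperfectly correlated random variables, the relative variability of the aggregate falls as more sites are interconnected. The output distribution below is an assumption for illustration only.

```python
# Toy illustration of why interconnecting many wind sites smooths aggregate
# output: the relative variability of the sum falls as sites are added.
import random

def simulated_site_output():
    """Hourly capacity factor for one hypothetical wind site (0 to 1)."""
    return min(max(random.gauss(0.30, 0.20), 0.0), 1.0)

def coefficient_of_variation(num_sites, hours=10_000):
    """Relative variability of the combined output of num_sites sites."""
    totals = [sum(simulated_site_output() for _ in range(num_sites))
              for _ in range(hours)]
    mean = sum(totals) / len(totals)
    variance = sum((t - mean) ** 2 for t in totals) / len(totals)
    return variance ** 0.5 / mean

for n in (1, 4, 16):
    print(n, round(coefficient_of_variation(n), 2))
# Relative variability drops roughly with the square root of the number of
# independent sites; real sites are partially correlated, so the smoothing
# is weaker but still present.
```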
Dry cooling? Give me a break. First, the DOE argues that just 5 percent of thermal plants actually have the environmental conditions needed for dry cooling. Second, the output of a plant with dry cooling will be about 2 percent less than that of a similar plant with evaporative closed-loop cooling, but plant efficiency may decrease by up to 25 percent in extremely hot weather.
See U.S. Department of Energy, Energy Demands on Water Resources: Report to Congress on the Interdependence of Energy and Water (Washington, DC: U.S. Department of Energy, December, 2006);
Clean Air Task Force and Land and Water Fund of the Rockies, The Last Straw: Water Use by Power Plants in the Arid West (Washington, DC: The Hewlett Foundation, April, 2003), available at http://www.catf.us/publications/reports/The_Last_Straw.pdf.
Consider the amount of compressed air needed. There is a facility in Huntorf, Germany that we can use for an example. [REF] It compresses air to 1000 pounds per square inch pressure.
The data show that it stores 3 x 290 = 870 MWH of energy and the cave volume is 310,000 cubic meters.
For one day of electricity storage for the US, the volume needed would be 11,000,000/870 X 310,000 = 3.92 billion cubic meters = 138.4 billion cubic feet.
Suppose a cave had an average cross-section of 50 ft X 50 ft = 2500 sq ft.
For one day's electricity storage, the cave's length would have to be 138.4 billion / 2500 = 55.34 million feet = 10,490 miles. Granted that most big caves have never been surveyed, it's still safe to say that there aren't ten-thousand miles of caves in the US. So there is no possible way enough energy could be stored to see the country through 100 days of low winds.
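For readers who want to check the arithmetic, here is a short sketch reproducing the calculation above with the same inputs; the one-day US demand figure and the 50 ft x 50 ft cross-section are the commenter's assumptions, not independent data.

```python
# Reproducing the storage-volume arithmetic above, using the Huntorf figures
# cited in the comment. The one-day US demand figure (11,000,000 MWH) and the
# 50 ft x 50 ft cave cross-section are the commenter's assumptions.

huntorf_storage_mwh = 3 * 290            # ~870 MWH of stored generation
huntorf_volume_m3 = 310_000              # cavern volume in cubic meters

us_daily_demand_mwh = 11_000_000
CUBIC_FEET_PER_M3 = 35.3147
FEET_PER_MILE = 5280

volume_m3 = us_daily_demand_mwh / huntorf_storage_mwh * huntorf_volume_m3
volume_ft3 = volume_m3 * CUBIC_FEET_PER_M3

cross_section_ft2 = 50 * 50
cave_length_miles = volume_ft3 / cross_section_ft2 / FEET_PER_MILE

print(round(volume_m3 / 1e9, 2))         # ~3.92 billion cubic meters
print(round(cave_length_miles))          # roughly 10,500 miles of cavern
```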
You might find these maps interesting: US-Winter, US-Spring, US-Summer, US-Autumn.
They show that even averaging the winds over quarters, the variation is more than two-to-one everywhere except in the Great Plains, in many places much more. These maps don't show diurnal or day-to-day variations, which surely would be much larger.
I'm not arguing that dry cooling is as good as wet cooling, just that it works. Since you don't give a reference, I can't respond to the numbers you give. Here's a DOE report that shows the maximum penalty on a 1% hot day would be 13.1%, compared to 4% for wet cooling. That's for retrofitting 100% of existing power plants. The authors argue against 100% retrofit for reasons of practicality, but they also point out that plants designed and built for dry cooling will fare better. Perhaps the data you looked at started with different assumptions, for example undersized towers.
I don't have any way to access the first source you gave, but the second one explicitly recommends using dry cooling.
In contrast, I haven't come across any place that shows how wind power can generate electricity when there's no wind.
[Response] Rob,
I'm not trying to play fast and loose with you. I provided the sources on the compressed air/intermittency stuff in my response to Charles, but forgot that I hadn't for you.
The case that wind can make a reliable baseload plant comes from, among others, the following:
A November 2007 article from two Stanford researchers showing wind can provide reliable baseload power, available at http://www.stanford.edu/group/efmh/winds/aj07_jamc.pdf;
Paul Denholm (a DOE/NREL researcher) technical paper on hybrid biomass wind baseload plants, available at http://www.nrel.gov/docs/fy06osti/38270.pdf;
Denholm, Paul, G.L. Kulcinski, and T. Holloway. (2005) "Emissions and Energy Efficiency Assessment of Baseload Wind Energy Systems," Environmental Science and Technology. 39, 1903-1911.
You may have to write the journal or get online access through a library to access these reports; I haven't seen them online anywhere.
I don't think any of these studies say that wind can provide all baseload power for the US, just that they can be configured to be very reliable.
The dry cooling numbers I referenced came from a DOE report that, for some reason, is no longer online. I'm in the middle of doing a report right now on water and energy that talks extensively about dry cooling. You're right that it can be a better option, but it can't be widely used. For example, it only works in arid environments (ruling out most of the East Coast) and has some pretty severe energy penalties, meaning it reduces plant efficiency. I'll make sure to post a link to my report on Scitizen once it's published.
[REF] http://www1.ceit.es/asignaturas/tecener1/Lesson6.pdf
maps: http://rredc.nrel.gov/wind/pubs/atlas/maps/chap2/2-12m.html
http://rredc.nrel.gov/wind/pubs/atlas/maps/chap2/2-13m.html
http://rredc.nrel.gov/wind/pubs/atlas/maps/chap2/2-14m.html
http://rredc.nrel.gov/wind/pubs/atlas/maps/chap2/2-15m.html
DOE report: http://www.ipd.anl.gov/anlpubs/2006/11/57837.pdf
According to Archer & Jacobson, the part of the average output that can be considered 87.5% reliable is between 33% and 47%, depending on how many wind turbines are interconnected. However, the area they studied has the most reliable winds in the US, and their results don't translate to the country as a whole. Even so, they show that wind farms would have to be oversized by a factor of at least 2. They elect to call it base load, but that's not appropriate. It can only be base load if there is also some form of load-following power.
That's a problem. Without fossil fuel and nuclear energy, load following is limited to whatever hydro and pumped storage can be made available, and at most that can only be a few percent.
Denholm recognizes that and suggests using biofuels for load following. But there are a couple of problems here. One is that nowhere does he consider the fuel required to grow the biomass and convert it into biofuel. Currently, it takes a gallon of fuel to produce a gallon of fuel. The one data point we have shows it won't work. Perhaps this ratio can be improved, but it seems unlikely that it will ever take no fuel to produce a gallon of fuel. In the absence of better information, his study has to be considered extremely optimistic.
His optimistic estimate is that it would take 6.9 hectares, or .0266 sq mi, to produce biofuels that would generate 1000 MWH per year. The US uses 4 billion MWH/year, so the area required would be 106,400 square miles, out of 650,000 square miles of arable land. Since we're using almost all the arable land for food and fiber, it's not clear where the 106,400 square miles will come from. Also, farming land of this magnitude means using less-productive land. He assumes yields of 11.3 tonnes/hectare, which would require prime Iowa land; on less-productive land the areas would be much greater and very likely would require irrigation, for which water will not be available. That's enough trouble already, but consider that the needs for motor fuels will vastly outweigh the need for bio-electricity, because there is another, better, way to generate electricity but no alternative way to produce non-fossil motor fuels.
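The land-area arithmetic can likewise be reproduced directly from the figures cited in the comment (Denholm's yield assumption and the commenter's consumption and arable-land figures):

```python
# Reproducing the land-area arithmetic above, using the figures cited in the
# comment (Denholm's yield assumption and the commenter's US consumption and
# arable-land figures).

sq_mi_per_1000_mwh_yr = 0.0266       # ~6.9 hectares per 1000 MWH/year
us_consumption_mwh_yr = 4_000_000_000
us_arable_land_sq_mi = 650_000

land_needed_sq_mi = us_consumption_mwh_yr / 1000 * sq_mi_per_1000_mwh_yr
print(round(land_needed_sq_mi))                            # ~106,400 square miles
print(round(land_needed_sq_mi / us_arable_land_sq_mi, 2))  # ~16% of arable land
```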
I enjoyed this exercise but it doesn't change anything. Wind energy doesn't work without a backup, and biofuels won't provide the backup.
Consider the problem of motor fuels, which is the tougher of the two problems. At this point, there are only two possibilities in view, besides electrified vehicles, bicycles, foot travel, horseback, rickshaws, and some other specialized transportation modes. The two possibilities are hydrogen and hydrogen-enriched biofuels. The world has to have the capability of producing large amounts of hydrogen, and nuclear plants allow for thermochemical production of hydrogen, by far the most efficient technique available.
Once all the fossil-fired power plants are replaced, nuclear and renewables can complement each other. The nuclear plants can provide whatever electricity is needed during times of dim sunlight and low winds, or no sunlight and no wind. When the sun is shining and the wind is blowing, and when demand for electricity is low, nuclear plants can divert some of their capacity to generating hydrogen.
This allows solar and wind to play their maximum part in providing electricity. Further, it allows them to contribute efficiently to the production of hydrogen, by taking some load off the nuclear plants. This is the kind of solution that will minimize global warming. Trying to paper over the limitations of renewable sources with scientific-looking obfuscations, if it's successful, can only keep the world on its present reckless path to self-destruction.
[Response] Rob,
I do enjoy these debates with you. First, I think you're reading Archer & Jacobson selectively. They also note that as you interconnect wind farms, the more they do tend to follow the load, in the same way that solar does (i.e., the connected wind farms have a better ELCC that corresponds to daily and sesonal peaks in demand). Also, I know it's hard to get, but check out the Denholm et al. article from ES&T. That's the one that argues that wind turbines + compressed air = baseload plants with 70% capacity factor and above.
As for your arguments about biofuels, I urge you to consider three things. First, the only studies I've seen about a 1-to-1 trade in terms of energy balance are for ethanol; no one has ever talked about using ethanol for electricity. Second, Denholm is not talking about ethanol-based bioelectric plants but those based on more reliable and available feedstocks. Third, from what limited stuff I do know about ethanol, it's older studies from Pimentel et al. and others that argue there is a 1-to-1 tradeoff, but newer studies show ethanol needs around 35,000 BTU to make around 70,000 BTU of fuel. This seems like a much better tradeoff...
My library doesn't have access to ES&T; it carries thousands of other journals, but not that one. I don't doubt that if enough compressed-air storage could be found it could back up the wind turbines. But enough storage would be more than exists, by a huge multiple. Instead of sending me on these wild-goose chases looking for obscure references, would you kindly quote to me the assumptions they made about storage availability. For example, how much volume is needed per MW of turbine rating and at what pressure, and the efficiency of storage and retrieval. Arithmetic doesn't change, and no matter how many mathematical manipulations they make, the end result will be that wind power isn't going to provide a major part of the country's electricity unless nuclear power is available to back it up.
I pointed out the fallacy of ignoring fuel inputs for producing biofuels, but then went on to point out that even with Denholm's optimistic assumption that no fuel is needed to produce biofuel, there still isn't enough land to grow enough biomass to make a difference. This conclusion is based on his data, not somebody else's data on ethanol.
Thanks for this. It's material for my next blog, about how some people think you can change reality by manipulating data. If you can just beat it with statistical analysis and smother it in paper you can make it whatever you want. You want windmills to turn when there's no wind? No problem. Just crank out fifteen pages of equations, tables, diagrams, and charts and they'll turn themselves!