Comparison of Arizona Nuclear and Solar Energy

December 9, 2015

Let’s compare and contrast solar energy and nuclear energy in Arizona. There is only one nuclear power plant in the state, the Palo Verde Nuclear Generating Station in Tonopah. There are several solar energy sites, so we will pick the Agua Caliente Solar Project because it won the Solar Project of the Year category in Renewable Energy World’s 2012 Excellence in Renewable Energy Awards.

Palo Verde Nuclear Generating Station

This nuclear plant consists of three reactors with a total nameplate capacity of 3,937 MW. If these reactors ran 24 hours a day for 365 days a year, they would yield 34,500 GWh (gigawatt hours) per year. The actual output is about 31,300 GWh per year (2010), which means a capacity factor of about 90%. Averaged over time, Palo Verde yields 3,543 MW.

Palo Verde became operational in 1988 and is currently approved to operate until 2047, giving a lifetime of nearly 60 years.

Palo Verde’s construction cost was $5.9 billion in 1988 ($11.86 billion in 2015 dollars). Its operating costs for fuel and maintenance were about 1.33 cents per kWh in 2004 (1.67 cents in 2015 dollars.)

Based on an average power yield of 3,543 MW and a construction cost of $11.86 billion (in 2015 dollars), the construction cost for Palo Verde was $3.34 per watt (in 2015 dollars).
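These figures are easy to check. Here is a quick Python sketch of the arithmetic, using only the capacity and output numbers quoted above. Note that the text rounds the capacity factor to 90% before averaging, which is how it gets 3,543 MW and $3.34 per watt; computing directly from annual output gives slightly different values.

```python
# Palo Verde Nuclear Generating Station, using the numbers quoted in the text.
nameplate_mw = 3937               # total nameplate capacity, MW
actual_gwh_per_year = 31_300      # actual output, GWh/year (2010)
construction_cost_2015 = 11.86e9  # construction cost in 2015 dollars

hours_per_year = 24 * 365                                  # 8,760
ideal_gwh_per_year = nameplate_mw * hours_per_year / 1000  # MWh -> GWh

capacity_factor = actual_gwh_per_year / ideal_gwh_per_year
average_mw = actual_gwh_per_year * 1000 / hours_per_year   # GWh/year -> average MW
cost_per_watt = construction_cost_2015 / (average_mw * 1e6)

print(f"ideal output:    {ideal_gwh_per_year:,.0f} GWh/year")  # ~34,488
print(f"capacity factor: {capacity_factor:.1%}")               # ~90.8%
print(f"average power:   {average_mw:,.0f} MW")                # ~3,573
print(f"construction:    ${cost_per_watt:.2f} per watt")       # ~$3.32
```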

Agua Caliente Solar Project

This 9.7 square kilometer solar energy farm has a nameplate capacity of 290 MW peak. Its first year of full operation was 2014. If it were able to produce its nameplate capacity of 290 MW continuously for a year, the energy output would be 2,540 GWh. The actual energy output was 741 GWh in 2014, which means a capacity factor of 29%, an excellent result for solar energy. Averaged over time, this solar farm yields 84.6 MW.

Construction cost for Agua Caliente was $1.8 billion.

Based on an average yield of 84.6 MW and a construction cost of $1.8 billion, the construction cost for Agua Caliente was about $21 per watt.


The cost per kilowatt hour of energy for either of these sources is a combination of the construction cost and the operation, fuel, and maintenance costs. The longer a facility is in operation, the lower the fraction of construction cost per kilowatt hour.

The operation, fuel, and maintenance costs for the Palo Verde nuclear plant were about 1.33 cents per kWh in 2004 (1.67 cents in 2015 dollars). The great advantage of the Agua Caliente solar farm is that its fuel cost is zero, and we will assume for the sake of argument that its other operation and maintenance costs are also zero.

The following chart shows various costs per kilowatt hour for each of the facilities for various lifetimes.

[Chart: cost per kilowatt hour vs. lifetime for Palo Verde and Agua Caliente]


1. $0.0133 per kilowatt hour in 2004, converted to 2015 dollars.
2. 2013 energy output.
3. $5.9 billion construction cost in 1988 dollars, converted to 2015 dollars.
4. 2014 energy output.
5. $1.8 billion construction cost in 2014.
6. (GWh/year) × (number of years) × (1,000,000)
7. (Construction cost) / (kilowatt hours produced over lifetime)
8. (Construction cost per kWh) + (operating cost per kWh)
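The chart’s arithmetic (footnotes 6 through 8) can be sketched in a few lines of Python. The construction and operating costs are the ones given above; Agua Caliente’s operating cost is taken as zero, as the text assumes, and the lifetimes shown are just illustrative.

```python
# Lifetime cost per kWh, following footnotes 6-8 of the chart.
def cost_per_kwh(construction_cost, gwh_per_year, operating_cost_per_kwh, years):
    lifetime_kwh = gwh_per_year * years * 1_000_000          # footnote 6
    construction_per_kwh = construction_cost / lifetime_kwh  # footnote 7
    return construction_per_kwh + operating_cost_per_kwh     # footnote 8

for years in (20, 40, 60):
    nuclear = cost_per_kwh(11.86e9, 31_300, 0.0167, years)  # Palo Verde
    solar = cost_per_kwh(1.8e9, 741, 0.0, years)            # Agua Caliente
    print(f"{years} yr: nuclear {100*nuclear:.2f} c/kWh, solar {100*solar:.2f} c/kWh")
```

At the highlighted lifetimes (60 years for Palo Verde, 40 for Agua Caliente) this gives about 2.3 and 6.1 cents per kWh, a ratio of roughly 2.6.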

Two blocks of data are highlighted in yellow. These are the most likely lifetime scenarios for each of the power generating plants. The Palo Verde nuclear plant has had its license extended to 60 years. The Agua Caliente solar farm is made from First Solar CdTe modules that carry a 10-year material and workmanship warranty and a warranty of 80% of the nominal output power rating over twenty-five (25) years. It is reasonable to hope that it will last 40 years.

There is one more thing to be considered. We have assumed so far that the yearly output of each of these power generating stations is the same year after year. That is not entirely correct. Historically, the Palo Verde nuclear plant has increased its capacity factor through time as operations have become more efficient. Whether that trend will continue is unknown.

Solar modules tend to slowly degrade with time.  The First Solar CdTe modules that are used at Aqua Caliente will likely decay at about 0.5% per year. The chart above gives a best case estimate for Agua Caliente and does not compensate for this degradation.

Based on the highlighted sections of the chart above, the Agua Caliente Solar Farm will likely cost about 2.5 times more per kilowatt hour than the Palo Verde Nuclear Plant over the course of their lifetimes.

One more point. Agua Caliente requires 9.7 square kilometers to generate an average of 84.6 MW. The Palo Verde Nuclear Plant generates an average of 3,543 MW. So it would take about 42 Agua Calientes to equal the power of Palo Verde. That would require about 400 square kilometers.
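That comparison is simple scaling. Here it is in Python, using the average yields from above (the exact quotient is just under 42 farms):

```python
# Land needed to match Palo Verde's average output with Agua Caliente farms.
palo_verde_avg_mw = 3543
agua_caliente_avg_mw = 84.6
agua_caliente_km2 = 9.7

farms_needed = palo_verde_avg_mw / agua_caliente_avg_mw  # ~41.9 farms
land_km2 = farms_needed * agua_caliente_km2              # ~406 km^2
print(f"{farms_needed:.1f} farms, about {land_km2:.0f} km^2")
```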

Energy is the lifeblood of civilization. The pursuit of energy abundance is the pursuit of a healthier and more fulfilling lifestyle for greater numbers of people. I present this data to help inform the choices that need to be made in that pursuit.


How much photovoltaics to provide 100 kilowatt hours per person per day?

November 8, 2015

Suppose you wanted to power the world at the level that each human being can enjoy the same level of energy abundance as the average American. And suppose we wanted to do it all with photovoltaic solar energy. What would it take?

An average of 250 kilowatt hours is consumed per person per day in the United States. Maybe that seems like a lot to you because you occasionally look at your home electric bill and see less than 1,000 kilowatt hours used in an entire month for a home that houses four people. That 1,000 kilowatt hours for four people in a month works out to only about eight kilowatt hours per person per day. But that electric bill is a very poor indicator of how much energy is actually expended for your benefit, which is why claims that some energy source will power X number of homes are incredibly misleading.

Here is the reality.  According to Lawrence Livermore National Laboratory the United States consumes 98.3 quads of energy every year.


[Figure: Lawrence Livermore US energy flow chart]

That works out to about 250 kilowatt hours per person per day:

(98.3 quads/year) × (2.933 × 10^11 kWh/quad) ÷ (365 days/year) ÷ (3.2 × 10^8 people) ≈ 247 kilowatt hours/person/day
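For anyone who wants to check the conversion, here is the same arithmetic in Python. The quad-to-kWh factor is standard; the 2015 US population of roughly 320 million is the assumption being used here.

```python
# US energy consumption per person per day, from the Livermore figure.
quads_per_year = 98.3
kwh_per_quad = 2.933e11  # 1 quad is about 2.933e11 kWh
us_population = 3.2e8    # roughly 320 million people in 2015

kwh_per_person_per_day = quads_per_year * kwh_per_quad / 365 / us_population
print(f"{kwh_per_person_per_day:.0f} kWh per person per day")  # ~247
```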

Fortunately, this daunting amount of energy is also somewhat misleading. Look at the right side of the graph from Lawrence Livermore. Notice that the two final energy outputs on the right side of the graph are “Energy Services” and “Rejected Energy.” “Energy Services” is energy that actually does some useful work. “Rejected Energy” is energy that is lost, mostly in the form of waste heat. For example, if you burn a lump of coal in a steam generator and get one kilowatt hour out in the form of electricity but lose two kilowatt hours in the form of heat to the atmosphere, then you got one kilowatt hour of Energy Services and two kilowatt hours of Rejected Energy. As you can see from the graph, only about 40% of the energy that is input comes out of the system as Energy Services (38.9 quads / 98.3 quads).

One of the big advantages of solar photovoltaics is that you don’t lose 60% of your energy to heat. Electric cars put far more of their stored electric energy into useful work (Energy Services), and far less into Rejected Energy, than do blazing hot internal combustion engines.

Let’s make the assumption for now that every possible efficiency is applied, so that we only need to produce 40% of the 250 kilowatt hours per day per person, or 100 kilowatt hours per day per person.  Still a lot of energy, but more manageable than 250 kilowatt hours.

So, for 7 billion people we need 700 billion kilowatt hours per day (100 kilowatt hours per person × 7 billion people). If we got all that energy from solar photovoltaics, how much land would be required for solar arrays, and how much would it cost?

Topaz Solar Farm

To get estimates of these values, we can look at some of the world’s biggest solar arrays. Consider the Topaz Solar Farm in California. It is one of the biggest and newest in the world and sits in an area of very high solar insolation. It is expected to generate 1,100 GWh of energy per year while occupying 25 km² at a cost of $2.5 billion. Therefore it would generate the energy consumed by about 30,000 people at 100 kWh per person per day.

(1,100 GWh/year) × (1 × 10^6 kWh/GWh) × (year/365 days) ÷ (100 kWh/person/day) ≈ 30,136 people

From this it is clear that it would take about 6 million km² of solar photovoltaics at the Topaz Solar Farm density to generate all the energy consumed by 7 billion adequately powered people.

(7 × 10^9 people) ÷ (30,136 people / 25 km²) ≈ 5.8 × 10^6 km²

Keeping in mind that the Topaz Solar Farm cost $2.5 billion and yields enough energy for 30,136 people, the cost for 7 billion people would be about $580 trillion.

(7 × 10^9 people) ÷ (30,136 people / $2.5 × 10^9) ≈ $5.8 × 10^14
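The people, land, and cost calculations above can be bundled into one short Python sketch that scales the Topaz numbers up to 7 billion people:

```python
# Scaling the Topaz Solar Farm to supply 100 kWh/person/day to 7 billion people.
topaz_gwh_per_year = 1100
topaz_km2 = 25
topaz_cost = 2.5e9
kwh_per_person_per_day = 100
world_population = 7e9

people_per_farm = topaz_gwh_per_year * 1e6 / 365 / kwh_per_person_per_day
farms_needed = world_population / people_per_farm

print(f"people per farm: {people_per_farm:,.0f}")               # ~30,137
print(f"total land:      {farms_needed * topaz_km2:.2e} km^2")  # ~5.8e6 km^2
print(f"total cost:      ${farms_needed * topaz_cost:.2e}")     # ~$5.8e14
```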

For the sake of comparison, the gross domestic product of the United States is about $17 trillion, or less than 3% of that $580 trillion. The gross product of the entire world is about $78 trillion, or about 13% of that $580 trillion. So, if every penny or mark or yen, etc., of world product for about 7.5 years were dedicated to this project, it could be accomplished.

Some points to consider

What would be the consequences of covering 6 million square kilometers of land with PV?  This would be like completely covering an area the combined size of Arizona, Nevada, Colorado, Wyoming, Oregon, Idaho, Utah, Kansas, Minnesota, Nebraska, South Dakota, North Dakota, Missouri, Oklahoma, Washington, Georgia, Michigan, Iowa, Illinois, Wisconsin, Florida, Arkansas, Alabama, North Carolina, New York, Mississippi, Pennsylvania, Louisiana, Tennessee, Ohio, Virginia, Kentucky, Indiana, Maine, South Carolina, West Virginia, Maryland, Vermont, New Hampshire, Massachusetts, New Jersey, Hawaii, Connecticut, Puerto Rico, Delaware, and Rhode Island with solar panels.  Of course, this would be spread out over the roughly 100 million square kilometers of land at latitudes lower than about 50 degrees.

This plan would also require a distribution system that could move energy from daytime areas to nighttime areas, or at least a few days of storage for every person on the planet. Such a distribution system is not feasible at this time, and the massive amount of storage required is prohibitively expensive.

Two days of storage would be 200 kilowatt hours of stored energy per person. Probably the best mass storage option today (2015) is Tesla’s Powerwall, which stores 7 kilowatt hours, costs $3,000, and weighs 220 pounds. So we would need about $90,000 and about 6,600 pounds of storage for each of the 7 billion people. That adds another $630 trillion to the cost.
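Here is the Powerwall arithmetic as a Python sketch. Taking 29 whole units per person gives slightly lower figures than the rounded $90,000 and 6,600 pounds used above; the order of magnitude is the same either way.

```python
import math

# Two days of Powerwall storage per person, for 7 billion people.
storage_needed_kwh = 200  # two days at 100 kWh/day
powerwall_kwh = 7
powerwall_cost = 3000     # dollars per unit
powerwall_lbs = 220
world_population = 7e9

units = math.ceil(storage_needed_kwh / powerwall_kwh)  # 29 whole units
cost_per_person = units * powerwall_cost               # $87,000
weight_per_person = units * powerwall_lbs              # 6,380 lbs
world_cost = cost_per_person * world_population        # ~6.1e14 dollars

print(f"{units} Powerwalls per person: ${cost_per_person:,}, {weight_per_person:,} lbs")
print(f"world total: ${world_cost:.2e}")
```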

These calculations serve simply to give a feel for what could be done with solar photovoltaics and what the limitations might be. I am not suggesting that the world should be powered solely with PV. With other energy sources in the mix, less money and land would need to be devoted to PV (but more to those other sources). For example, if you did the same calculations for wind, you would find that about twice as much area (about 12 million square kilometers) would have to be covered by wind farms to get the same amount of energy. But at least you can grow corn or graze cattle below the turbines in a wind farm.

I have led you to water. It is up to you to drink, and to draw your own conclusions about the viability of using solar energy to bring the world up to a reasonable level of energy consumption.


Kirk Sorensen – The Promise of Thorium in Meeting Future World Energy Demand

September 28, 2015

If you really care about future energy abundance, then you should watch this video from Kirk Sorensen. I believe that thorium offers the world truly fantastic possibilities…


Uh, oh! Karl et al. is bad news for Stefan Rahmstorf’s sea level rise rate.

September 25, 2015

Conclusion first

When the 20th century GISS temperature record is modified according to Tom Karl et al., it causes the 21st century sea level predictions of Vermeer’s and Rahmstorf’s semi-empirical model to go down!


I have written extensively about “Global sea level linked to global temperature,” by Vermeer and Rahmstorf (which I will refer to as VR2009).

VR2009 was a widely cited paper that used historical 20th century sea level and temperature data to calculate parameters for a model to predict 21st century sea level rise under various 21st century temperature scenarios. I reproduced the VR2009 model based on their description. My code was verified by reproducing the VR2009 results using the same inputs that they used.

I spent a lot of time pointing out some of the bizarre results of their model that surely disqualified it from being taken seriously, some of which can be seen here, here, and here.

I also spent a lot of time pointing out that the VR2009 choices of 20th century sea level data sources left much to be desired.  For example, they used the 2006 Church and White sea level data that was already outdated.  If they had used the revised Church and White data, then their resulting sea level rise predictions for the 21st century would have been much lower.

They happily modified Church and White’s outdated sea level data by subtracting a reservoir correction (Chao et al.), which made their 21st century predictions for sea level rise go up. But they made no attempt to estimate a groundwater depletion correction. It turns out, unsurprisingly, that groundwater depletion is of the same magnitude as the reservoir correction (Wada et al.), and including it would have made their 21st century predictions go down.

Nevertheless, Rahmstorf would later claim that his modeling approach was “robust!”  That is, it would give essentially the same result for the 21st century given different sources of 20th century sea level data.

So, I also implemented the VR2009 technique using several different sources of sea level data, which should have given similar results, according to Rahmstorf’s claim of robustness.  In fact, they gave widely varying results, and every combination of sea level data, reservoir data, and groundwater depletion data that I tried gave lower results than VR2009’s chosen combination.

New Temperature Data!

The widely reported nearly two decade long pause in global warming was causing suicidal ideation among hard-core global warming alarmists.  Something had to be done to stop them from slitting their wrists with shards of glass from their shattered thermometers.

Just in the nick of time – revised temperature data!   Like all proper revisions of temperature data, this revision caused the reported temperature change of the 20th century to go up.

This was the result of a paper by Tom Karl et al. (Nature), based on very thin reasoning (see for example), that argued for such a revision. The folks at GISS (who provided VR2009’s temperature data) glommed onto Karl’s logic and subsequently revised their temperature data accordingly. Other temperature data sources, like UAH and RSS, did not.

Which means we must ask ourselves, what happens to 21st century sea level rise predictions based on the VR2009 model using the now modified GISS data?

VR2009 applied their model to six families of temperature scenarios for the 21st century from the IPCC’s 4th Assessment Report. Let’s see what happens to each of those scenarios when we update the 20th century GISS temperature data.

[Figure: The IPCC temperature scenarios that VR2009 used for prediction of 21st century sea level rise]

Case 1.

Sea level inputs are identical to what VR2009 used: Church and White’s sea level data with the Chao reservoir correction. The old GISS temperature data is replaced with the new GISS temperature data. The table below shows that the new GISS data yields 21st century sea level rises that are about 17% lower than with the old GISS data.

[Table: Old GISS vs. new GISS]

It is a shame that after Tom Karl went to all the trouble to increase the temperature rise of the 20th century it just makes VR2009’s model predict LOWER sea levels for the 21st century.  This must be a great disappointment to Vermeer and Rahmstorf, so you can be pretty sure they will never tell you this result. But I just did.

Case 2.

As I pointed out previously, VR2009 chose to use the outdated 2006 Church and White sea level data instead of Church’s 2009 data. They also neglected a groundwater depletion correction. When these improvements are included, the VR2009 model yields 21st century sea level rises that are only about 55% of VR2009’s. When the new GISS temperature data is added to the mix, this drops to about 45%.

[Table: New GISS, CW2009, Chao, Wada]

Case 3.

Lest Vermeer or Rahmstorf argue that their large sea level rise rates are saved by another update of the Church and White data in 2011, I have included those results also. The difference between the 2009 and 2011 Church and White sea level data is small. Here is how the 2011 Church and White sea level data plays out in the VR2009 model: the resulting 21st century sea level rise predictions are only about 43% of the VR2009 predictions.

[Table: New GISS, CW2011, Chao, Wada]

The trend continues.

It seems that no matter what combination of inputs is used in the VR2009 model, the predicted sea level rise for the 21st century is always smaller than with VR2009’s choice of inputs. I wonder what that implies?


Has anybody ever heard of Angela Landolt?

September 21, 2015

I received this comment from Angela Landolt on a recent post.  It looks fishy to me.  I blanked out the link to her survey.

I am a student at the Institute of Mass Communication and Media Research (IPMZ) at the University of Zurich. As part of my Master’s thesis, I am conducting a survey on how climate change bloggers perceive themselves and their role in the climate change debate.

If you blog about climate change, I would like to ask you to participate in my survey. Your contribution will help us to gain valuable insights into the field of climate change blogging.

Link to the survey: http://ww2.unipark.de/uc/__________________

The questionnaire will take about 7 minutes to fill out.
There are no right or wrong answers. I am interested in your personal opinion.
The study does not serve any commercial purpose. The data provided is solely for the purpose of scientific analysis and is evaluated anonymously.
The questionnaire can be filled out in English and German.
Please feel free to contact me if there are further questions or comments.

Angela Landolt B.A.


Angela, if you are interested in my thoughts on global warming, then feel free to read my blog. I have taken a look at some examples of the work of the “Institute of Mass Communication and Media Research,” and it looks like silly psycho-babble to me.


Chinese Nuclear News

September 10, 2015

Chinese reactor to be built in UK

Selina Sykes, UK Daily Express (9/6/15)

David Cameron is adamant to get the project off the ground – which is at the core of the Government’s drive to replace Britain’s ageing fossil fuel plants with low-carbon alternatives.

The Chinese – who currently have 26 nuclear power reactors in operation – are vital to Britain’s low-carbon initiative.

The Chinese design is expected to be capable of producing one gigawatt of electricity – enough to power 1m homes.

China to increase nuclear capacity to 58 GW by 2020

The Economic Times (9/9/15)

China aims to lift its operational nuclear power installed capacity to 58 million kilowatts by 2020, and those under construction will reach 30 million kilowatts.

The rapid economic growth of inland provinces means the area will need more power, and China should develop inland nuclear power projects to meet rising total and per capita energy consumption, according to a research report from Chinese Academy of Engineering.

Construction of the Xipu fast neutron reactor nuclear power demonstrative project in Fujian Province, east China, could start at the end of 2017.


Barack Obama: Glaciologist

September 6, 2015

The avid outdoorsman and eminent scientist Barack Obama has been trekking through Alaska lately. He is lamenting the demise of the great glaciers of the North. He is surely grieving over the harm that man is inflicting on the planet by spewing his toxic CO2. The Washington Post reports:

Standing near the foot of the Exit Glacier, which has receded 1.25 miles since 1815 and 187 feet last year alone, Obama said “this is as good of a signpost of what we’re dealing with it comes to climate change as just about anything.”

The man certainly has a way with words – a true poet.

I guess we are supposed to be alarmed because 187 feet per year is a lot faster than 1.25 miles per 200 years.  After all, 1.25 miles in 200 years averages out to only 33 feet per year.  The message we are supposed to get is that the Exit Glacier is receding about 6 times faster now than its average over the last 200 years.  This, of course, is due to the CO2 that vile humans use to poison the atmosphere and it means endless and escalating disaster unless we socialize the economy of the world.
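The arithmetic behind that “about 6 times” figure is simple enough to sketch in Python:

```python
# Exit Glacier: average retreat over 200 years vs. the 187 feet of last year.
miles_receded = 1.25
years = 200
feet_last_year = 187

avg_feet_per_year = miles_receded * 5280 / years  # 33 ft/yr average
ratio = feet_last_year / avg_feet_per_year        # ~5.7x faster
print(f"average {avg_feet_per_year:.0f} ft/yr; last year {ratio:.1f}x faster")
```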

But what does the National Park service say about the retreat rate of Exit Glacier? The following table of retreat distances and rates comes from the National Park Service’s “The Retreat of Exit Glacier.” Annotation in red was added by me.

[Table: Exit Glacier retreat distances and rates, annotated]

So, this data confirms Obama’s assertion that the Exit Glacier has retreated 1.25 miles in the last 200 years. But it also makes it quite clear that it was retreating as fast, or faster, 100 years ago.

If CO2 is the culprit today, what was the culprit 100 years ago?  The following graph shows the amount of anthropogenic CO2 in the atmosphere as a function of time going back to 1750.  The data comes from Oak Ridge National Laboratory.  I made the plot and added the annotation. It’s kind of hard to explain why the retreat rate was so much greater in the past when there was less than 10% of the anthropogenic atmospheric CO2 than there is today.  Perhaps Professor Obama will elucidate.

[Figure: Anthropogenic atmospheric CO2 since 1750]

My wife and I were up in Alaska a few years ago, and we also visited some of those receding glaciers. At Glacier Bay National Park, which is several hundred miles southeast of Exit Glacier, I happened to pick up a park pamphlet that had the following series of illustrations showing the glacier extents in the park going back to 1680.

[Figure: Glacier Bay glacier extents since 1680]

The first thing that jumps out at you is the rapid ice advance between 1680 and 1750 and the subsequent retreat between 1750 and 1880. The pamphlet said:

“The Little Ice Age came and went quickly by geologic measures.  By 1750 the glacier reached its maximum, jutting into Icy Strait.  But when Capt. George Vancouver sailed here 45 years later, the glacier had melted back five miles into Glacier Bay – which it had gouged out.”

As an aside, a co-worker once told me that the Little Ice Age was not a global phenomenon, but rather, local to Europe.  He cited the Union of Concerned Scientists as the source of this insight.  But there it is, in Alaska!

It is hard to argue with the Union of Concerned Scientists because they’re, well, scientists.  Not just anybody can be a Concerned Scientist.  You have to send a check first.  My wife used to send a check years ago, but it was from our joint account so I figure I was only half a Concerned Scientist then.  Now I guess I am just a wholly unconcerned scientist.

[Image: Glacier Bay map marked with Grand Pacific Glacier positions]

Anyway, Obama was getting excited about 1.25 miles of glacier recession since 1815, and a whopping 187 feet in the last year. That pamphlet I mentioned also had a large map of the Glacier Bay area marking the locations of the various glaciers back to 1760. It’s easy to string the locations together and calculate the recession rate of these glaciers. The image shows the map as I marked it out for Grand Pacific Glacier.

I have plotted the distance as a function of time for three glacier routes using this crude method. As you can see below, these glaciers have receded at a much faster rate than Exit Glacier. But Exit Glacier and the Glacier Bay National Park glaciers have one thing in common: they all retreated at their maximum rate back when anthropogenic atmospheric CO2 levels were very low compared to today.

[Figure: Glacier retreat distance vs. time for three Glacier Bay routes]

Let’s take a closer look at the Grand Pacific Glacier. John J. Clague and S. G. Evans (J. of Glaciology) used various data sources to plot the retreat of the Grand Pacific Glacier. I have converted their data to miles and overlaid it with my coarser data from the map. The Clague data and the map data agree nicely, but the Clague data fills in some of the gaps. The most interesting point is that, like Exit Glacier, the Grand Pacific Glacier retreated fastest around the last part of the 19th century. In fact, the Clague data may indicate that the Grand Pacific Glacier was slightly advancing, not retreating, during most of the 20th century.

[Figure: Grand Pacific Glacier retreat]

It is pretty clear that the Grand Pacific Glacier was retreating fastest around 1860. Where is that on the anthropogenic atmospheric CO2 timeline? The graph below shows that the anthropogenic atmospheric CO2 level was only about 2% of today’s level when the Grand Pacific Glacier was retreating at its fastest, by far!

[Figure: Anthropogenic CO2 vs. Grand Pacific Glacier retreat]

How is that possible??? I thought it was high CO2 levels that caused the glaciers to recede.

