Monday, December 28, 2009


The disappointing end of the talks in Copenhagen has predictably resulted in a slew of finger-pointing. Although Bill McKibben and others laid the blame on President Obama, other observers argue that China intentionally crippled the talks. For instance:

While Brown refrained from naming countries, his climate change minister Ed Miliband said China had led a group of countries that “hijacked” the negotiations which had at times presented “a farcical picture to the public.”

In a very interesting Guardian piece Mark Lynas gives an account of how this transpired:
Even George Monbiot, writing in yesterday's Guardian, made the mistake of singly blaming Obama. But I saw Obama fighting desperately to salvage a deal, and the Chinese delegate saying "no", over and over again.

To those who would blame Obama and rich countries in general, know this: it was China's representative who insisted that industrialised country targets, previously agreed as an 80% cut by 2050, be taken out of the deal. "Why can't we even mention our own targets?" demanded a furious Angela Merkel. Australia's prime minister, Kevin Rudd, was annoyed enough to bang his microphone. Brazil's representative too pointed out the illogicality of China's position. Why should rich countries not announce even this unilateral cut? The Chinese delegate said no, and I watched, aghast, as Merkel threw up her hands in despair and conceded the point. Now we know why – because China bet, correctly, that Obama would get the blame for the Copenhagen accord's lack of ambition.

China, backed at times by India, then proceeded to take out all the numbers that mattered. A 2020 peaking year in global emissions, essential to restrain temperatures to 2C, was removed and replaced by woolly language suggesting that emissions should peak "as soon as possible". The long-term target, of global 50% cuts by 2050, was also excised. No one else, perhaps with the exceptions of India and Saudi Arabia, wanted this to happen. I am certain that had the Chinese not been in the room, we would have left Copenhagen with a deal that had environmentalists popping champagne corks in every corner of the world.

Lynas accounts for this diplomatic strategy as follows:

Why did China, in the words of a UK-based analyst who also spent hours in heads of state meetings, "not only reject targets for itself, but also refuse to allow any other country to take on binding targets?" The analyst, who has attended climate conferences for more than 15 years, concludes that China wants to weaken the climate regulation regime now "in order to avoid the risk that it might be called on to be more ambitious in a few years' time".

This does not mean China is not serious about global warming. It is strong in both the wind and solar industries. But China's growth, and growing global political and economic dominance, is based largely on cheap coal. China knows it is becoming an uncontested superpower; indeed its newfound muscular confidence was on striking display in Copenhagen. Its coal-based economy doubles every decade, and its power increases commensurately. Its leadership will not alter this magic formula unless they absolutely have to.

I think that Lynas is probably too generous here. The existence of sizable renewable energy industries means little with regard to climate change in the absence of meaningful emissions reductions, which were explicitly rejected by the Chinese.

Beijing's logic in rejecting binding emissions targets is not hard to understand. Following the Chinese Communist Party's abandonment of socialist ideals, its only claim to legitimacy is China's continued economic growth. The cessation or even deceleration of economic expansion would place the stability, and potentially even the survival, of the regime at risk. And as Lynas points out, China's rise to economic prominence has been fueled by cheap coal. Thus, for the PRC, emissions caps are more threatening than climate change. That they managed to deflect the blame for crippling the talks onto Obama is just icing on the cake.

The lesson of Copenhagen, then, is that the only way to forge an international agreement to limit greenhouse emissions will be to offer a real alternative to coal: a carbon-free source of energy that will allow the developing world to develop without wrecking the Holocene. Given geostrategic and engineering realities, this basically means Gen IV nuclear reactors--ideally LFTRs and IFRs, but also less sustainable designs such as the Russian SVBR that could conceivably scale quickly in the near term.

Furthermore, the promise of these technologies could prove an indispensable "carrot" in future climate change talks. Taking steps to commercialize advanced reactors, and offering assurances to share them with countries making efforts to decarbonize their economies, could provide a major incentive for recalcitrant nations to accede to emissions reductions--or at least a disincentive to sabotage negotiations as the PRC did. Nor does this promise have to wait until advanced reactors reach commercial status: even a major development effort could be a significant incentive for developing nations hesitant to commit to emissions controls. We don't have to wait decades for Gen IV nuclear to begin saving the climate--its enormous potential can start playing a vital role in climate negotiations right now.

But until we have a viable alternative to coal, we can expect future negotiations to turn out like Copenhagen did. Gen IV nuclear reactors like the LFTR can be the carrot that saves the climate; it's high time we put some real political muscle into developing them, so that they can start putting some muscle behind our diplomacy.

Sunday, December 20, 2009

Books on VVER and RBMK

Since I arrived in Russia three months ago I have been accumulating books not merely about civil defense, but also about Soviet civilian nuclear power. The one on the left is titled "NPP with VVER: Procedures, Characteristics, Effectiveness," and the one on the right is "Channelized Water-Graphite Reactors: Textbook."

The book on water-graphite reactors is particularly interesting, as it was sent to press a mere six months before the accident at Chernobyl Unit 4. In the fourth chapter, "Safety of Nuclear Reactors," the authors note that:

Нарушение соответствия между выделяемым и отводимым теплом и, как следствие этого, разрушение твэлов и других элементов конструкции могут возникнуть или при повышении энерговыделения выше допустимого уровня, или при ухудшении теплоотвода. В результате бесконтрольного увеличения реактивности первый процесс возможен как в активной зоне в целом, так и в отдельных ее частях. Причинами такого явления могут быть, например, заклинивание регулирующих стержней на вводе в активную зону, резкие изменения температуры теплоносителя и т. п.

A lack of agreement between produced and removed heat, and consequently the destruction of fuel elements and other elements of construction, can result from either an increase in heat production beyond acceptable levels, or from a reduction in heat removal. The first process is possible as a result of an uncontrolled increase in reactivity in the active zone as a whole, or in a part of it. The causes of such an event can be, for instance, the jamming of control rods on entry to the active zone, abrupt changes in coolant temperature, and so forth. (p. 104)

Although such a reactivity spike caused the explosion at Chernobyl, it's clear from the textbook that the accident the authors envisioned was much more limited than that which actually transpired. This was probably because if the RBMK-1000 was operated according to regulations, the resulting power excursion would have been much smaller. (RBMKs have "safety rods" as well as control rods, and the operators removed most of these in order to start the reactor while it was subject to xenon poisoning, leaving it in an extremely unstable and dangerous state.) In any case, the book is merely an introduction (albeit a highly technical one) to Soviet water-graphite reactors. I'll keep my eye out for a more complete account of RBMKs, like the book about VVERs on the left.

Thursday, October 29, 2009

As Though the Future and the Climate Didn't Matter

I've had some fun on this blog in the past critiquing Joe Romm's various "analyses" of nuclear power, but I basically gave up once Romm came out as an apologist for fossil fuels. That took the joy out of it for me; it hardly seems worth the bother to pursue "debate" with someone so clearly out of touch with the real issues of the climate crisis as to not realize that all fossil fuels are too carbon intensive, PERIOD.

Predictably, however, Romm has continued his bizarre crusade against new nuclear builds in the US. Seizing upon setbacks ranging from the NRC's letter challenging Westinghouse to demonstrate the efficacy of the AP-1000's shield building design to the rejection of AECL's ACR-1000 proposal in Ontario, his disdain for the nuclear option is readily apparent. And in keeping with past examples, the announcement that Toshiba's asking price for the South Texas nuclear project had increased by $4 billion came in for similar treatment.

However unpleasant the cost run-up is, it appears to be mainly a hardball negotiation tactic on Toshiba's part. As usual, veteran nuclear industry-watcher Dan Yurman is on top of things:
CPS interim general manager Steve Bartley told the San Antonio newspaper the $4 billion price increase could be a "negotiating tactic." He agreed with Mayor Castro that the decision to postpone the bond vote "sends a signal to Toshiba" that the delivered price of the twin reactors must come down. Bartley added that CPS Energy will send a delegation to Japan to sit down with Toshiba to discuss costs.
Note that a similar process happened with Rosatom's nuclear tender in Turkey. The Russians originally offered an extremely high quote of $0.21/kWh, which of course Romm seized upon as "proof" that nuclear power is ruinously expensive. But as more sober observers always knew, it was really the Russians' desire to gouge the Turks, rather than anything intrinsic about nuclear power, which resulted in the high bid. As of a few weeks ago the Russians and the Turks were down to $0.15/kWh and were still negotiating. Look for a similar process in coming months with the project in Texas.

Romm's constant companion in his recent anti-nuclear tirades has been Craig Severance, a Republican (!) accountant and disco-era coal apologist. I have to hand it to Mr. Severance--he's an alchemist who would make Hermes Trismegistus jealous. He transmuted Joe Romm, crusader against climate change, into Joe Romm, shill for natural gas. I never thought I would see the day.

It seems that Joe Romm is merely among the more prominent individuals suckered by the natural gas industry's present marketing strategy, which is really quite brilliant. I might even admire it, if it didn't have the unpleasant side effect of wrecking the planet upon which I happen to live. Fortunately, Rod Adams has been paying close attention to this trend (see particularly here and here). Essentially, the natural gas industry is selling itself to the public as a "cheap bridge to a renewable future" while assuring its investors that the lean times won't last forever, and that soon their industry will be buoyed by strong demand resulting from economic recovery and the need for carbon reductions. A fine visual example of this cynical gambit pulled from the web:

Note that nearly all of the "clean energy alternatives" that Romm's recent post puts forward are really ways to burn natural gas. Compressed-air energy storage to "firm" wind capacity? Gas. Combination solar-thermal/gas plants? Gas. (And a particularly wasteful use of it, given the lower thermodynamic efficiency of such an arrangement compared to a CCGT.) Severance even goes so far as to cut to the chase and suggest
Another type of power plant San Antonio could build might be a natural gas power plant (of course, it can wait until at least 2015 to decide to do so, as noted above under “Rushed Decision”).
The reason this isn't OK is not just that natural gas prices will recover from their currently depressed levels by then--keep in mind that the REAL reason for recent low prices is the economic downturn, NOT the unconventional gas discoveries, which merely pushed back the date at which US natural gas production will begin to decline from the immediate to the intermediate future--driven both by probable economic recovery and by new legislation that will encourage gas in preference to coal for electrical generation. Nor is it just that the Russians are clearly hoping to manipulate the world energy market to maximize their gas export revenues in coming years. It's that the climate advantages of natural gas have been greatly exaggerated. It's true that gas is better than coal--but nowhere near as much as many believe, Joe Romm included.

The issue is methane. Burning methane for fuel may produce less CO2 than burning coal in the same applications, but methane is itself a very potent greenhouse gas--pound for pound, roughly twenty-five times as powerful as CO2 over a century, and considerably more over shorter horizons. While comparing the smokestack CO2 emissions from coal and natural gas plants may suggest that gas plants are vastly superior, this gives only an incomplete picture of the actual situation. A full-lifecycle analysis that includes the methane inevitably lost during extraction and transport leads to much more sobering conclusions.
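To make the arithmetic concrete, here's a rough back-of-envelope sketch--my own illustrative figures (a 50%-efficient gas plant, a 1.0 kg CO2/kWh coal plant, assumed leak rates), not numbers drawn from any cited study:

```python
CH4_LHV = 50.0             # MJ per kg methane (lower heating value), approximate
CO2_PER_CH4 = 44.0 / 16.0  # kg CO2 produced per kg CH4 burned (molar mass ratio)

def gas_plant_co2eq_per_kwh(efficiency=0.5, leak_rate=0.03, gwp_ch4=25):
    """CO2-equivalent emissions (kg) per kWh(e) from a gas plant,
    counting both combustion CO2 and upstream methane leakage."""
    fuel_mj = 3.6 / efficiency      # MJ of fuel energy needed per kWh(e)
    ch4_burned = fuel_mj / CH4_LHV  # kg CH4 burned per kWh(e)
    # If leak_rate is the fraction of *produced* methane that escapes,
    # the leaked mass per kg delivered is leak_rate / (1 - leak_rate).
    ch4_leaked = ch4_burned * leak_rate / (1 - leak_rate)
    return ch4_burned * CO2_PER_CH4 + ch4_leaked * gwp_ch4

coal = 1.0  # kg CO2 per kWh(e): a typical round figure for a coal plant
for leak, gwp in [(0.01, 25), (0.03, 25), (0.05, 72)]:
    g = gas_plant_co2eq_per_kwh(leak_rate=leak, gwp_ch4=gwp)
    print(f"leak={leak:.0%}, GWP={gwp}: gas = {g:.2f} kg CO2e/kWh "
          f"({g/coal:.0%} of coal)")
```

Even a few percent leakage erodes much of gas's smokestack advantage, and under methane's much higher 20-year GWP (roughly 72) a leaky gas supply chain nearly closes the gap with coal.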

This shouldn't really be news. See, for instance, this 2007 paper from Environmental Science & Technology, "Comparative life-cycle air emissions of coal, domestic natural gas, LNG, and SNG for electricity generation." The authors found that when the entirety of the fuel cycle is accounted for, conventional gas is nearly as bad for the climate as coal--and LNG is as bad as coal! Natural gas hardly seems to be the stuff from which a bridge to a climate-friendly energy future will be built.

The importance of accounting for non-CO2 greenhouse gases has been recognized by professional climatologists for years. Indeed, a 2006 NYRB piece by James Hansen (who has been the target of repeated and increasingly unreasonable criticism from Joe Romm) noted that:
Further global warming can be kept within limits (under two degrees Fahrenheit) only by means of simultaneous slowdown of CO2 emissions and absolute reduction of the principal non CO2agents of global warming, particularly emissions of methane gas. Such methane emissions are not only the second-largest human contribution to climate change but also the main cause of an increase in ozone—the third-largest human-produced greenhouse gas—in the troposphere, the lowest part of the Earth's atmosphere. Practical methods can be used to reduce human sources of methane emission, for example, at coal mines, landfills, and waste management facilities.
If we want to get serious about fighting climate change, we need real clean energy solutions. Funding underhanded schemes that will drain our pockets via fuel surcharges while depositing methane and carbon dioxide in the atmosphere does not fall into this category, however successful the gas industry's spin doctors may be at convincing credulous pundits like Joe Romm otherwise. If we're going to act like the future and the climate actually matter, we must be willing to take the steps needed to build a genuinely climate-friendly energy infrastructure--and this is definitely going to include considerable investment in new nuclear facilities.

Friday, October 23, 2009

Atomic Technology in the Sixth Five-Year-Plan

From Voennye znaniia, June 1956:

"We Communists must place the greatest discovery of the 20th century--atomic energy--in full service of that task, the fulfillment of which is the programmatic goal our Party--the task of building Communism."
--N.A. Bulganin, at the 20th Party Congress

One of the most poorly-remembered aspects of "Atoms for Peace" in the mid-50s is the way the Soviets responded to it. President Eisenhower and his advisors hoped that their proposal for an international nuclear fuel bank would deprive the Soviet weapons complex of fissile material it couldn't spare, thus restricting the growth of the USSR's nuclear arsenal. Instead, the Soviet Union produced a flood of propaganda demonstrating that the "peaceful atom" really only existed on their side of the iron curtain. Touting the 1954 startup of the first nuclear power plant, at Obninsk--two years before Calder Hall in the United Kingdom and three before Shippingport in the United States--as proof of a Soviet lead in the civilian applications of atomic energy, Soviet propagandists of the mid-1950s portrayed a world in which socialism and nuclear power combined to alleviate technical and social problems.

Emboldened by this early success, the Communist Party adopted outrageously overambitious goals for its civilian nuclear program during the Sixth Five-Year-Plan (1956-60). As described in a 1956 article in Voennye znaniia (Military Knowledge):

In the Sixth Five-Year-Plan it is planned to build five large atomic power stations. A large atomic power station will be put into service near Moscow. It will have a 400 thousand kilowatt capacity. Two atomic power stations with a total capacity of a million kilowatts will be constructed in the Urals.

Altogether in the current Five-Year-Plan atomic power stations will be constructed with a total capacity of 2-2.5 million kilowatts. . . . No capitalist country, including the USA and England, is planning to place atomic power stations of such great capacity into service.

The USSR saw the construction of these plants as fulfilling two goals: first, constructing energy centers in regions lacking local fuel supplies, and second, serving as a "great experiment" to determine which reactor technologies would be most economical and advantageous for widespread deployment in the subsequent Five-Year-Plan. According to the article, "up to ten types of reactors" ranging in electrical output from 50 to 200 MWe would be developed before 1960. One suggestion was a homogeneous reactor that would use the radiolysis of the fuel solution to power a galvanic cell in addition to a turbine. Another reactor would use thorium fuel--potentially an aqueous homogeneous reactor of the type suggested by Academician A.I. Alikhanov at Geneva in 1955, breeding U-233 from thorium.

Besides power stations, atomic energy would also be put to use by the Socialist motherland for transportation purposes. One goal for the Five-Year-Plan was the construction of an atomic icebreaker; this would ultimately see the light of day as the Lenin, entering service in 1959. But this was not to be all: nuclear airplanes and locomotives would also be the objects of intense research and development. The Myasishchev design bureau did make several prospective designs for nuclear-powered strategic bombers in the late 1950s--the M-30 and M-60--but like their American counterparts, these aircraft never flew. And while the article stated that "it can be expected that atomic locomotives will travel our railways in the near future," and was illustrated with an elaborate picture of a two-story atomic locomotive quite reminiscent of the one in the 1979 NBC flop Supertrain, no such conveyance ever left the drawing board.
Atomic Locomotive As Imagined By Soviet Artist (1956)

The article concluded that:
Our country stands ahead of other countries in the use of atomic energy for peaceful purposes. In the years of the Sixth Five-Year-Plan the Soviet Union will make a great new leap ahead in the development of atomic technology and in the use of the immeasurable energy of the atomic nucleus.
Unfortunately, this was not quite the way things turned out. The only item mentioned in the article that was actually completed during the Five-Year-Plan was the icebreaker Lenin. The next nuclear plant to go into service after Obninsk, at Tomsk in 1958, consisted essentially of plutonium-production reactors for the weapons program that also generated electricity. The true civilian reactors (water-graphite at Beloyarsk and light water at Novovoronezh) began construction in 1957 and 1958 respectively, but only entered service in 1964. But little notice was paid to these failures at the time, as another technological marvel had grabbed the world's attention:

Friday, October 16, 2009

Amory Lovins Admits He Doesn't Know the Carbon Intensity of "Micropower"

From Amory Lovins' reply to Rod Adams' critique of this post on Grist:

Mr. Adams’s claim about “an awful lot of diesel, coal, and natural gas” being consumed by micropower is addressed in Part One of my response to David Bradish’s post at Mr. Bradish was referring to the fuel mix of the non-biomass cogeneration that our “micropower” database combines with renewables other than big hydropower. As I stated, cogeneration does burn some coal, but not much. The mainly gas-fired cogeneration fuel mix is unknown in detail but does include some coal, chiefly in China and India (where gas is often available), and to some extent in Germany, all aided by coal subsidies. USEIA also reported that 18.7% of the U.S. cogeneration in its partial database for 2006 that burned fossil fuels was coal-fired, including culm or waste coal. However, even coal-fired cogen greatly reduces the carbon otherwise emitted by separate production of power and heat, because it displaces the separate fueled boiler(s) otherwise needed to produce the heat that cogen recovers. The resulting carbon saving is smaller than for the predominant gas-fired cogen, let alone for renewables, but is still substantial. I hope soon to receive updated cogen data casting more light on the fuel mix, and if I do, will post it to our micropower database at

So Amory Lovins admits that, so far as the supposed carbon savings from "micropower" are concerned, he really doesn't know what he's talking about... but he might find out soon?

As I pointed out in the past, Lovins' definition of "micropower" makes no sense and only serves to obscure the fact that Lovins is in practice essentially shilling for fossil fuels, his (perhaps earnest) claims to the contrary aside. All that's new is that Lovins is admitting he doesn't actually have the data to support his claims.

Sunday, October 11, 2009

No, the Soviets Did NOT Build a "Doomsday Machine"

From Wired:

"The Perimeter system is very, very nice," he says. "We remove unique responsibility from high politicians and the military." He looks around again. Yarynich is talking about Russia's doomsday machine. That's right, an actual doomsday device—a real, functioning version of the ultimate weapon, always presumed to exist only as a fantasy of apocalypse-obsessed science fiction writers and paranoid über-hawks.

No, no, NO. Perimeter is NOT a doomsday machine and does not meet the definition of one. As Russian nuclear arms expert Pavel Podvig explained in a 2006 post on his excellent blog:

The Soviet Union never built this automatic Doomsday Machine (also known as Dead Hand) -- the Perimeter communication system that is often mistaken for it is something quite different.

As Podvig explains, the "Dead Hand" was a proposal made in the mid-1980s that was ultimately rejected. Distinct from Perimeter, it was to be fully automated--if it detected nuclear strikes on the Soviet Union and lost contact with all human agencies with the authority to launch a retaliatory strike, it would retaliate on its own. Perimeter, by contrast, merely delegates launch authority from the highest civilian and military authorities to a hardened command center in case of a decapitation strike. The difference is that Perimeter is not fully automatic--launch authority ALWAYS remains under human control. Moreover, the classic doomsday machine was supposed to be far more destructive than Perimeter, which in all likelihood would only ever be activated once a very substantial fraction of the Soviet nuclear arsenal had already been destroyed in an American preemptive strike. The doomsday device proposed by Leo Szilard and explored by Herman Kahn in his 1960 On Thermonuclear War was to be a world-destroying weapon that would render the planet uninhabitable. By this definition, even the fully automated and unrealized "Dead Hand" would not be a doomsday machine.
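The distinction Podvig draws can be boiled down to a toy decision function. This is a deliberate oversimplification of my own devising, not a description of the actual Soviet hardware:

```python
# Conceptual sketch only: contrasting the rejected fully-automatic
# "Dead Hand" proposal with Perimeter as actually built.

def dead_hand(strikes_detected: bool, leadership_contact: bool) -> str:
    """The rejected 1980s proposal: no human anywhere in the loop."""
    if strikes_detected and not leadership_contact:
        return "launch"  # the machine retaliates entirely on its own
    return "stand by"

def perimeter(strikes_detected: bool, leadership_contact: bool,
              duty_officers_approve: bool) -> str:
    """Perimeter as built: it only *delegates* launch authority."""
    if strikes_detected and not leadership_contact:
        # Authority passes down to a hardened command post, but a
        # human decision is still required at the end of the chain.
        return "launch" if duty_officers_approve else "stand by"
    return "stand by"
```

Given the same inputs--strikes detected, leadership contact lost--the hypothetical Dead Hand launches unconditionally, while Perimeter merely hands the decision to the duty officers in the bunker, who may still decline.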

I can see the temptation to confuse Perimeter with a doomsday machine--after all, everyone loves Dr. Strangelove and you wouldn't exactly sell as many books if they were titled "Perimeter: Soviet Automated System for Delegation of Launch Authority in Case of Decapitation Strike." But Perimeter just isn't the stuff pulp thriller novels are made of. Indeed, Thompson even admits this, despite the hype contained in his own article:

According to both Yarynich and Zheleznyakov, Perimeter was never meant as a traditional doomsday machine.

Unlike some other spurious Soviet doomsday machine myths, this one contains a germ of truth. But Perimeter is not in fact particularly dangerous: multiple layers of human authority are still required to launch a nuclear attack, in addition to the detection of nuclear strikes on Russia. What was really dangerous was the practice in the 1950s and 1960s of fielding nuclear weapons that either lacked permissive action links or had them effectively disabled (it is well known that SAC bypassed the PALs in the 1960s by setting all of them to strings of zeros). This raised the possibility of lone "General Jack D. Rippers" launching nuclear wars all on their own. During the early Cold War, Soviet nuclear posture was vastly less aggressive and accident-prone than that of the United States (the actual weapons were only to be released to the military in a crisis), greatly reducing the likelihood of such a scenario.

Perimeter is neither a doomsday machine nor a serious threat to U.S. security, past or present. It's time to stop pretending otherwise.

Thursday, October 01, 2009

The Energy Crisis in the Capitalist World, 1975

I apologize for my recent lack of posts. I received a Fulbright-Hays grant and am currently in Moscow conducting dissertation research about the history of the Soviet civil defense system. Hopefully my efforts will lead to some kind of closure to the longstanding debate in the US during the Cold War about the extent, nature, and intent of Soviet civil defense. I'm making solid headway in the archives so far--today I found some figures for the implementation of the 1955-56 campaign for the education of the adult population in "anti-atomic defense" for various Soviet republics. (It turns out Estonia did very, very poorly).

I'm currently living in a building that was constructed in the 1950s to house members of the Soviet Academy of Sciences, and (presumably) because of this, it's right across the street from the Academy of Sciences bookstore. On my previous trips to Moscow I had always made a point to frequent this establishment, as it offers a very good selection of used items at reasonable prices. Last week I found an interesting book there--a 1975 analysis by the Soviet Academy of Sciences of the capitalist world's energy woes.

Titled Энергетический кризис в капиталистическом мире (The Energy Crisis in the Capitalist World), this 478-page volume offers fascinating insights into Soviet thinking on energy in the mid-1970s. Published in an edition of 5000 copies, this highly technical book was clearly intended mainly for specialists, rather than everyday readers.

On the whole, the authors of the book attribute the energy crisis to the "fundamental contradictions of capitalism," and in particular to a lack of planning. In their view the attendant inflation resulting from the energy crisis would deepen international capitalism's problems, encourage increasing divisions between the first world and the third world, and ultimately further the cause of socialism. Comecon, meanwhile, was utterly devoid of energy problems and indeed a model for the world to follow: "These countries supply the current and perspective fuel cycle from their own resources . . . act as exporters of energy resources on the world market, and work to provide assistance to many developing countries in accessing their energy resources, without demanding political, military, or other concessions. In effect the socialist energy sector is the most stable and balanced component part of the world's energy industry." (p. 474) While certainly rather hyperbolic, keep in mind the role that the energy export sector played in enabling the USSR to avoid confronting its many internal problems in the 1970s--at the very least, it was one of the best-run aspects of the socialist economies.

One of the more interesting sections of the book is its analysis of the Ford Administration's then-current effort to free America from oil imports by 1985--"Project Independence." Presciently, the authors predicted that the United States would fail to achieve these goals. Of the range of measures intended to solve America's energy problems technologically, from synthfuels to renewable energy, the Soviets expected only one--nuclear energy--to meet expectations. At the same time, they expected all of these efforts to be realized on a larger scale than they actually were. For instance, following contemporary western estimates, they forecast 280 GW of nuclear generating capacity in service in the United States in 1985, when the actual figure was under 100.

The Soviet authors made the following comment about renewable energy that still rings true 34 years later:
Preliminary evaluations of many new trends in energy take on an extremely polemic character in the USA. On the one side, strong monopolies established in the fuel-energy complex attempt to minimize the potential technological and economic value of these resources. On the other side, small firms (most of the efforts for the development of new forms of energy come from this sector) loudly advertise their products, making maximum use of the current market situation. (p. 329)

One of the more surprising sections is that on energy efficiency efforts in the US, particularly with regards to combined heat and power (CHP) plants. In a statement that would make free-market "natural capitalist" and CHP guru Amory Lovins' head explode, they declare that:
American economists are attracted by Soviet experience in the development of centralized heating of cities with the assistance of CHP plants. But under the conditions of capitalism speculative land prices and laws regarding the laying of thermal mains discourage the development of this trend in energy. (p. 315)
On the whole, an interesting historical artifact. The more things change...

Wednesday, September 02, 2009

The RRW: Not So Dead After All

From Newsweek:
As a candidate, Barack Obama declared war on nukes, but now he's calling a tactical truce. To encourage tougher international action against proliferation, he hopes to ratify the 1996 Comprehensive Nuclear-Test-Ban Treaty. The idea of outlawing weapons tests was so divisive that the Senate said no in 1999, and Republicans are ready to fight if Obama tries again. To buy them off, Obama will propose updating America's aging nuclear-weapons manufacturing complex and funding design work that would tiptoe to the very edge of crafting a new warhead, according to a senior official's recent briefing to a small group of outside experts. (Candidate Obama pledged "not to authorize the development of new nuclear weapons and related facilities.") Meanwhile, the Pentagon, working on a new "nuclear posture review," is contemplating a force of 1,000 weapons deployed and 2,000 in reserve. That's well below the 1,675 agreed to in Moscow this May, with 2,500 currently in reserve, but it dismays some of those who have been briefed. "It's Bush Lite," says one, speaking anonymously to preserve his access. "That's not what Obama promised."
I had stated in the past that the RRW would likely prove to be a program that just won't die. This is because entrenched interests--and not just the weapons labs--are deeply invested in the project. But I'm surprised by the apparent ease with which Obama seems to be retreating on this issue. I had expected the RRW to face much greater obstacles from this administration.

Wednesday, August 26, 2009

Scientists Release Radioactive Cockroaches Into Phoenix City Sewer

From the "research committee wouldn't approve that nowadays" file:
Approximately 6,500 American roaches, P. americana, were trapped in sewer manholes, tagged with P32, and then released at selected sites. To capture the specimens, a quart jar fitted with a plastic screen cone and baited with over ripe bananas was placed on its side in each of 9 manholes. A total of 18 traps was operated for 12 days to collect the desired number of specimens. Prior to marking, the roaches were maintained in 18-inch square screen cages on a diet of banana and powdered milk.

To tag the roaches, a radioactive casein solution containing 10 microcuries of P32 per milliliter was sprayed upon the specimens under confinement. The spray mixture contained equal parts of a 10 per cent casein solution and a P32 solution, the former being included to assure adhesion of the spray to the integument of the roach. The initial step in the treatment of the roaches was to place 1,000 to 1,500 specimens in a 10 or 20 gallon garbage container which was covered by a transparent plastic lid. The latter was equipped with an exhaust filter and a center hole for nozzle insertion. The spray was then introduced by means of a nasal syringe attached to a small air compressor. A total of 40 to 50 milliliters of spray solution sufficed for each can application, the operation requiring approximately five minutes.

Following the application of the spray, the container was allowed to remain undisturbed for fifteen minutes. At the end of that time, the plastic lid was replaced by the standard garbage can cover. When treatment of all roaches was completed, the contained specimens were transported to the liberation sites which consisted of four manholes one block apart and serving the same trunk line. Release of the specimens occurred at dusk, the container being lowered into the manhole and the lid removed. The opened container remained in the manhole for a 24-hour period.

For recovery of the tagged specimens, 34 traps were located in sewer manholes within a one-mile radius of the four release sites, the majority of the stations being within the 0.5 mile radius (Fig. 1). On the basis of the direction of sewage flow, stations were selected at manholes below and above the release points and at manholes located on secondary lines. In addition to the manhole sites, 10 traps were placed on premises in the blocks immediately adjacent to four liberation sites. Collection of specimens was effected at each station for 8 1/2 weeks following the release of the tagged roaches, a total of 12 samples being procured from each manhole. Radioactive roaches were detected by examining all samples with a laboratory or field count rate meter equipped with a thin-walled Geiger tube.
This is from "The Occurrence and Movement of Periplaneta Americana (L.) Within an Urban Sewerage System," by H.F. Schoof and R.E. Siverly, published in the March, 1954 issue of The American Journal of Tropical Medicine and Hygiene.

What was the logic behind tagging cockroaches with radiophosphorus and releasing them into a municipal sewer? As the authors explained,
These instances coupled with the prevalence and movement of cockroaches in and around food-handling establishments, residences and waste disposal sites have focused further attention upon the importance of these insects as possible vectors of enteric infections. Concurrent with this interest is the renewed effort by communities to control roaches within city sewerage systems (Gary, 1950). The heavy roach infestations within such systems combined with the availability of human wastes are factors which conceivably could constitute a potential hazard to the health of a community.

Since it has been demonstrated that roaches resident within the sewerage systems can become contaminated with pathogenic organisms, the next step in the mode of spread of the pathogens would involve the degree of dispersion of the infected roaches and the contact between the insects and the human population. To obtain information on the dispersion of roaches within and from a sewerage system, a study was conducted at Phoenix, Arizona, in October 1952. Previous surveys of 22 selected manholes in that city for a seven-week period had shown a weekly average of 92 to 143 specimens per manhole with all roaches being P. americana.
Mercifully, it turns out that the radioactive cockroaches didn't go much of anywhere:
The collection data are summarized in Table 1. As is apparent, only one tagged specimen was recovered from sites other than the release point. Despite the absence of marked specimens, all manhole stations yielded P. americana, the average number per collection being 39 specimens. Only one specimen was trapped in the 10 yard stations but this roach was radioactive. Three of the four release sites were trapped to provide a total of 929 roaches in 17 collections or an average of 54 specimens per sample. Of this number, 97.5 per cent were radioactive, thus demonstrating that the method of tagging had been effective. Further substantiation of this aspect was shown by the recovery of tagged roaches throughout the 8 week period. Specimens captured 39 days after release displayed counts of 1,000 to 6,000 per minute.
The authors concluded from these findings that:
The conclusion derived from the experimental evidence is that P. americana does not disperse throughout the urban sewerage system of Phoenix, Arizona. . . . The results reported tend to raise a question as to the relative importance of roaches as a means of disseminating disease pathogens within the sewerage system and from such locations to human habitations. Further evidence discrediting the concept is the finding that the roach populations in sewer manholes are composed of one species, P. americana, whereas the predominant species taken in homes have been Supella supellectilium and Blattella germanica.
Isn't that reassuring?

Friday, August 21, 2009

That Doesn't Even Make Any Sense

I've been eternally mystified by the insistence some arms control types have that restricting the domestic deployment of civilian nuclear technology will somehow forestall proliferation abroad. The classic example of this is the Ford/Carter reprocessing ban. Given that reprocessing continued in the UK, France, Russia, and Japan, it seems that this policy failed to make much of an impression, and given the ability of North Korea to build a basic plutonium extraction plant, it hasn't done anything to halt determined would-be nuclear states. But that doesn't stop certain observers from claiming that this policy was actually a success and should be used as a model for future US nuclear energy policy. The most recent example is James M. Acton's article in the August/September issue of Survival, "Nuclear Power, Disarmament and Technological Restraint." As the author puts it:
The appropriate way to evaluate a strategy of desist and discourage is to ask whether it not only discourages states from taking small-scale research programmes to an industrial level, but leads states to avoid launching new reprocessing programmes in the first place. (Small-scale reprocessing programmes are perhaps even more worrying from a proliferation perspective than their industrial-scale counterparts.) For this reason, the claim from a recent Department of Energy report that 'U.S. opposition [to reprocessing] has not slowed large-scale reprocessing programs in Europe, Japan, and Russia', while true, is also somewhat beside the point. What the Department of Energy's statement really underlines is that, because of the web of political, legal and financial commitments needed to create such multibillion-dollar programmes, it is extremely difficult to stop them once they have been set in motion. This phenomenon, termed 'entrapment' by William Walker, highlights the importance of a policy aimed at stopping such programmes before they have even started. Here, there is evidence that the US moratorium had a positive, albeit modest, effect.
And where were these effects felt?
Most Western nuclear-power programmes prior to the mid 1970s were built around the expectation that power-reactor fuel would be reprocessed. The seminal 1976 study Moving Towards Life in a Nuclear Armed Crowd? observed that, given contemporary plans, 17 states would have reprocessing facilities and enough separated plutonium for between three and six nuclear weapons by 1985; today, just eight or nine states (including North Korea) are reprocessing. Not all of these stoppages were due to the US moratorium. Some programmes, such as South Korea's and Taiwan's (both of which had a clear military dimension), were avoided because of intense US pressure on both the supplier and recipient of reprocessing-technology transfers. Others, however, were influenced by the moratorium.
Acton uses Italy as an example of a state influenced by the moratorium, but ultimately concludes that "the evolution of policy in Italy was driven by a domestic debate about the economics of reprocessing and safety concerns about plutonium," leaving the reader to wonder exactly where it was that the policy had the desired effect. On the whole, it's not at all convincing as a defense of this kind of policy.

The technology which Acton particularly seems to envision for this kind of treatment is Silex laser enrichment:

Realistically, the gas centrifuge is too economically advantageous, and its use too entrenched, to be phased out. The opportunity does exist, however, to forsake enrichment and other nuclear technologies that have not yet been commercialised.

Today, for instance, Global Laser Enrichment (GLE, owned by General Electric Hitachi) is attempting to commercialise a new enrichment process (known as the SILEX process) based on lasers. GLE expects that the SILEX process will be more profitable to enrichment firms than other technologies. However, the economic benefits of cheaper enrichment to electricity consumers are slight because enrichment typically accounts for less than 5% of the total cost of nuclear electricity. Meanwhile, laser enrichment is probably even more worrying from a proliferation perspective than the gas centrifuge because detecting a small, clandestine laser-enrichment plant is likely to be even harder than detecting a secret gas-centrifuge enrichment plant of a similar capacity. Regulators should factor such concerns into licensing decisions for all nuclear technologies and be willing to deny applications if they determine that the costs outweigh the benefits, as is almost certainly the case with GLE, for instance. Forsaking sensitive nuclear technologies on non-proliferation grounds would be controversial, but justifiable.
I'm not sure why this follows. Silex is an extremely challenging technology which is already subject to what are arguably some of the tightest information controls ever applied to a civilian endeavor. Centrifuge technology, however, is much more achievable for would-be nuclear states, and information about it has already been widely disseminated thanks to the efforts of A.Q. Khan and others. Nuclear proliferators would be fools to pursue Silex, so why should we deny it to ourselves?

In general, Acton claims that "policymakers, industry insiders and regulators have usually failed to factor proliferation concerns into decisions about nuclear energy." This is a highly questionable assessment, and not only because of the history of the US reprocessing ban recounted by the author himself. To read the article, one would not know that international controls on nuclear exports have been in place for decades. Incredibly, Acton totally ignores the existence of the Nuclear Suppliers Group, which has been working to discourage nuclear proliferation since 1978. Perhaps one could argue that the efforts of the NSG have been utterly inadequate, but failing even to consider them in an article making this argument is, to say the very least, an extraordinary oversight.

For all its shortcomings, however, the article does make an important and often-overlooked point:
The proliferation costs of not selling less-sensitive technologies are frequently underplayed. A dramatic example is the US decision to cut the United Kingdom and Canada out of the development of civil nuclear power after the Second World War. Reluctant to rely on the United States as a supplier of enrichment, Britain and Canada decided to focus on reactors that did not use enriched uranium (GCRs and HWRs, respectively). These reactors are, however, more suitable for proliferation than LWRs (which is not to say that LWRs are proliferation proof). Indeed, the Indian nuclear-weapons programme was based on a Canadian-supplied HWR. South Korea tried to acquire an almost identical reactor in the early 1970s, when it was pursuing a nuclear-weapons option. And, as noted above, North Korea produced plutonium for its weapons programme using a GCR based on a British design.
Historically, Acton's example is rather imperfect, but the overall point is sound. The UK developed the MAGNOX GCR in order to produce plutonium for its weapons program; it seems unlikely that the US would have provided HEU for weapons use. The Canadians chose HWRs in the 1950s because they determined that developing domestic enrichment capabilities for solely civilian purposes would be prohibitively expensive, and there was no international civilian source of enriched uranium at the time. The US denial wasn't necessarily the issue per se. But it is easy to imagine how an unwillingness to share nuclear technology today could have undesirable consequences. For instance, states denied US reactor technology might turn to India and Russia, which could sell them reactors like the BN-800 with less proliferation resistance than conventional LWRs. This is one reason why the more sensible arms control wonks have embraced the UAE deal as a model for how nuclear exports should be conducted.

The biggest flaw in Acton's argument, however, is that it is an anachronism in an era when the US dominates the world nuclear field far less than it did thirty years ago. In 1978, the US had much more influence over the civilian nuclear energy field; today, the major American players have been bought out by the Japanese, we have to import critical forgings, and Areva and Rosatom are furiously competing for the export market with offers of financing, fuel cycle services, and other enticements. Acton speaks mysteriously of "a small number of advanced nuclear states," but does this have any bearing on the world situation today? As nations like China and India build up their domestic nuclear industries, US influence continues to shrink, for better or for worse. Any strategy for non-proliferation that is predicated on continued American dominance in the nuclear energy field is doomed to failure.

Friday, July 17, 2009

Thermodynamics Fail

From Grist:

There's an old book called How to Lie With Statistics of which I'm very fond. Here we have a good example of an anti-nuclear argument that hinges on flimsy statistical assumptions--in this case, a silly definition of "delivered energy" which actually makes it appear that only the electrical generation field has to abide by the laws of thermodynamics.

In fairness, the EIA perpetuates these silly myths by accounting for the use of fossil fuels for transportation and heating as "delivered energy," even though the use of primary energy in these applications is far from 100% efficient. This leads to numbers that give the impression that the electrical sector is massively wasteful, which is not at all the case. Look at Table A2 in the linked EIA document: the only "losses" accounted for are in the electrical sector: 27.88 out of 101.9 quads of primary energy used in the US.
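A back-of-envelope calculation makes the asymmetry concrete. The two quad figures below are the ones from EIA Table A2 cited above; the ~25% tank-to-wheels efficiency for a gasoline car is my own illustrative assumption, not an EIA number:

```python
# Back-of-envelope check of the accounting asymmetry described above.
total_primary_quads = 101.9    # total US primary energy use (EIA Table A2)
electric_losses_quads = 27.88  # the only "losses" the table books explicitly

# Charged only to electricity, heat-engine losses make that sector look
# uniquely wasteful:
share = electric_losses_quads / total_primary_quads
print(f"Electric-sector 'losses': {share:.1%} of all primary energy")

# But a car burning a quad of gasoline at an assumed 25% efficiency wastes
# 0.75 quads as heat; the EIA simply counts all of it as "delivered energy".
car_efficiency = 0.25          # assumed tank-to-wheels efficiency
fuel_quads = 1.0
waste_quads = fuel_quads * (1 - car_efficiency)
print(f"Waste heat hidden inside 1 quad of 'delivered' motor fuel: {waste_quads:.2f} quads")
```

In other words, every end-use sector discards most of its primary energy as heat; the EIA's convention just makes the electrical sector's share of that waste visible while hiding everyone else's.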

Grist promises that "Architecture 2030 will post a better answer on Grist next week." I certainly hope so, since so far all they've demonstrated is an ignorance of thermodynamics.

Monday, June 01, 2009

The SVBR: Russia Makes it Modular

Recently there has been a flurry of attention in the English-language blogosphere about the Russians' prospective SVBR reactor series, inspired by this Oil Drum post. While the SVBR development program is hardly new, it has received important endorsements from the Russian government in recent months that suggest that the SVBR reactors will actually see the light of day.

The SVBR-100 reactor.

SVBR stands for Svintsovo-vismutovyi bystryi reaktor, or in English "Lead-Bismuth Fast Reactor." And this is exactly what the SVBR is--an evolved version of the lead-bismuth fast reactor which powered the Alpha submarine. The SVBR is one of three liquid-metal-cooled reactors currently under development in Russia, along with the sodium-cooled BN-series and the lead-cooled BREST. Unlike the lead- and sodium-cooled reactors, which Rosatom ultimately envisions as very large (1600+ MWe in the case of the BN-series), the SVBR is designed as a small, modular, and passively safe reactor.

The reason I find this interesting is that I am convinced that such relatively small modular reactors are the future of nuclear power, and the SVBR is the Russian entry into this field, along with the Pebble-Bed Modular Reactor, the NuScale, and others. Furthermore, with Moscow's support the SVBR has better odds than most of the other prospective modular designs.

The SVBR is currently envisioned in three sizes: 10 MWe, 75 MWe, and 100 MWe. The first is intended for various portable and remote uses, including electricity and heat. (Ironically, I believe that the SVBR-10 meets Amory Lovins' definition of "micropower.") The 75 MWe version was conceived as a way of reusing the equipment at aging Russian VVER-440 plants. As described here, four SVBR-75 modules would replace the characteristic VVER steam generators within existing plant buildings and take advantage of existing turbomachinery. It's a very elegant idea, and it's easy to see why such a proposal would appeal to Rosatom. Finally, the SVBR-100 would be used in various stand-alone applications, including combined heat and power, and clustered in groups to create large power plants.

According to its designers, the SVBR will possess passive safety thanks to its "monobloc" design, which incorporates the entire primary coolant loop within the reactor vessel; the physical and nuclear characteristics of the lead-bismuth coolant; and the placement of the reactor vessel in a pool of water that serves as a passive emergency cooling system. The arguments for the SVBR's safety are, in my view, quite convincing, and it is clearly much safer than the current VVERs and the BN-series. I have my doubts, however, about whether the lead-bismuth coolant is friendly enough in operational practice to allow the general civilian application of these reactors. The Alpha submarine program, after all, suffered myriad difficulties with its lead-bismuth reactors, and the expectation of widespread use of these reactors in the Soviet navy went unrealized. Gidropress believes that it has solved the attendant materials and design issues associated with lead-bismuth coolant; only the construction of a prototype SVBR will ascertain whether that confidence is justified. Bismuth is also rare, but geological exploration in Russia has already located enough of the element for hundreds of SVBR reactors.

Besides repowering VVER-440s and building new electrical and heating plants within Russia, Rosatom also envisions the SVBR as a profitable export item. Anna Kudriavtseva, director of the department of research and scientific-technical policy at Atomenergoprom, reported at a conference last November that the SVBR could conceivably take 10-15% of the world market for small and medium-sized reactors, which the IAEA projects at 500-1000 units by 2040, with a potential value of $600 billion. To this end the government is investing 16 billion rubles in the project (about $500 million at current exchange rates), with a view to completing development work by 2017 and having a pilot plant in operation by 2020. Although this timeline is less immediate than one might like, it appears that the SVBR will enter the commercial market at around the same time as the PBMR and other competing modular concepts.
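Kudriavtseva's figures imply some per-unit numbers worth spelling out. A quick sketch, using only the projections quoted above (the interpolation between the low and high ends is my own):

```python
# Rough arithmetic on the market projection cited above. All inputs are
# the figures quoted in the post; the derived ranges are my own
# back-of-envelope extrapolation.
market_units = (500, 1000)    # IAEA projection, small/medium reactors by 2040
market_value_usd = 600e9      # projected total market value
svbr_share = (0.10, 0.15)     # share Rosatom hopes to capture

# Implied average value per reactor: roughly $0.6-1.2 billion
per_unit = (market_value_usd / market_units[1],
            market_value_usd / market_units[0])

# Implied SVBR sales: roughly 50-150 units
svbr_units = (market_units[0] * svbr_share[0],
              market_units[1] * svbr_share[1])

print(f"implied value per unit: ${per_unit[0] / 1e9:.1f}B to ${per_unit[1] / 1e9:.1f}B")
print(f"implied SVBR sales: {svbr_units[0]:.0f} to {svbr_units[1]:.0f} units")
```

Even at the low end, fifty billion-dollar export units would comfortably repay a 16 billion ruble development investment, which helps explain Moscow's enthusiasm.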

Although the cost estimates I can find for the SVBR are not entirely reliable (they are, after all, made for an as-yet undemonstrated reactor), they indicate that the SVBR is likely to be more economical than either the VVER or the ruinously expensive BN-series. This makes me wonder why Rosatom's expectations seem to be limited to sales of a few hundred units; barring unforeseen events, the SVBR would be its most attractive product both within Russia and on the export market by the early 2020s. By then there will probably be enormous demand for an economical, passively safe nuclear reactor like the SVBR--a demand which Rosatom may be in a highly advantageous position to fill. I am heartened, however, by the fact that the NRC seems to be pondering the adoption of regulations encouraging the development of small, modular reactors. Hopefully the US won't abandon this critical arena to the Russians.

Tuesday, May 05, 2009

Is Joe Romm a "Delayer-1000"?

I'm rapidly losing patience with Joe Romm, whose self-righteous arrogance is, if anything, actually harming the cause of addressing global climate change. Today, Romm published two posts on his blog illustrating the shortcomings of his methods and outlook.

The first, Memo to James Hansen: Your opposition to Waxman-Markey is ill-conceived and unhelpful, is the latest in Romm's series of critiques of James Hansen, the world's most prominent authority on climate change. Romm opines that:
"Why oh why do even smart people like NASA’s James Hansen think that there is such a thing as “simple carbon tax”? Have you folks ever looked at the friggin’ tax code?"
Besides denigrating Hansen, Romm apparently considers it obvious that there is only one credible solution to climate change--socialism: "There is only one way. That is a WWII-style and WWII-scale government-led mobilization." Despite this rather ambitious goal, Romm dismisses a carbon tax as "a political dead end." Are you kidding me? A carbon tax is infinitely more politically acceptable to the American electorate than state control of most of the economy (and that is what state-led WWII economic mobilization consisted of). Furthermore, Hansen is right that the courts are really the best (and probably only) hope for substantial action on carbon reduction. Perhaps Romm is blind, but the Senate is not going to pass a bill that substantially reduces emissions anytime soon, much less one that socializes the economy.

On the bright side, however, Romm's inane critiques of Hansen only serve to undermine his credibility. Anyone with sense can tell the difference between a brilliant, world-renowned scientist and an insufferable, politically intolerant and self-righteous blogger.

In another post Romm used Progress Energy's recent announcement that it will delay its new reactor builds by two years to crow about the supposed uselessness of nuclear power to combat climate change:
I’ll cast an even more positive light on the project delay. Maybe it will give Florida regulators or utility executives time to figure out that other options would be superior.
Well, I've got news for you, Joe. Turns out that those executives and regulators are way ahead of you. And you're dead wrong. Your three proposed alternatives: efficiency, biomass, and CSP, are no alternative to the new nuclear units.

Despite the fact that Amory Lovins' sophistry has deluded many into believing that "efficiency" is baseload power, this is nonsense. Efficiency does not generate electricity; it merely uses it more efficiently. When faced with the challenge of decarbonizing their generation over the next few decades, Florida utilities must address how they will provide reliable, around-the-clock power, even under adverse conditions, to a growing population in a carbon-constrained world. Efficiency is important and worthwhile, but because it conserves energy rather than produces it, it does not allow FPL and Progress to avoid the sticky question of how they will generate electricity 25 years from now.

Last year the Florida Public Service Commission issued a study examining the feasibility of implementing a Renewable Portfolio Standard in the state. This study is the most complete comparison of the costs and possibilities of renewable and conventional energy I have ever seen. It compares the costs of different forms of generation across a spectrum of scenarios, using the Florida utilities' actual capital cost projections for these technologies. The results are sobering, as they show that renewables, particularly solar and wind, are not going to be cheaper than new nuclear plants in Florida except with large government subsidies. Biomass is another story, as it is projected to be cheaper than nuclear. However, biomass potential is limited by the availability of sustainable feedstock. Furthermore, several forms of "biomass" included in the analysis, such as the burning of municipal solid waste, are not carbon-neutral. As nearly half of the 13,750 MW of solid biomass feedstock available per year comes from devoting existing farmland to energy crops, it should be readily apparent that biomass is not an easy, "renewable" alternative to new nuclear plants for Florida. There simply isn't enough feedstock for biomass to provide the entirety of Florida's baseload generation needs in the coming decades.

Finally, the Florida PSC's study clearly shows that concentrating solar power (CSP, or "solar baseload" in Romm's insufferable newspeak) is an absolute loser in Florida. It is projected to cost more than nuclear power, and indeed more than anything, including all other forms of renewable energy. With generous subsidies the study found that CSP might come in at a little over 16 cents/kW-hr, but under pessimistic assumptions it would cost well over 30 cents/kW-hr.

Significantly, the Navigant study was actually made to support the Florida PSC's Renewable Portfolio Standard proposal, which was approved a few months ago. The PSC has definitely done its homework, as have Progress and FPL. Anti-nuclear types like Romm like to pretend that huge utilities are so stupid that they would make multi-billion-dollar investments in nuclear plants--investments that will restrict their near-term profitability and severely impact their cash flow--without actually investigating the alternatives. The fact of the matter is that these new nuclear plants are being built because the utilities see no good intermediate-term alternative. If coal or gas were acceptable, or if renewables were cost-effective, available, and non-intermittent, FPL and Progress would not be pursuing new LWRs. The very same Florida PSC that sent an RPS proposal to the state legislature also approved rate recovery to build the nuclear plants. Somehow, I doubt that the PSC is confused about the necessity of nuclear power for Florida.

Furthermore, Romm and his certified accountant (and 70s-era coal apologist) associate Craig Severance are playing fast and loose with the nuclear cost estimates. Severance's so-called "analysis" that Romm "published" on his blog is both mendacious and irrelevant to estimating the costs of the new plants in Florida. Firstly, Severance simply chose to ignore the real reason that FPL's and Progress' cost estimates rose so rapidly in 2007-8--that the utilities were prudently and conservatively extending the ongoing increases in construction costs into the 2010s. Severance assumed that costs would continue to escalate beyond these estimates at a similar rate, but due to the global economic downturn and increasing competition in the production of nuclear components such as pressure vessels, construction cost escalation should soon drop precipitously. But the fundamental problem with the Severance "study" is that it uses capital cost assumptions that are in spectacular excess of the way FPL and Progress are actually financing the new builds. Severance assumed a 15% cost of equity, which was in keeping with the 2003 MIT study but is greatly in excess of the 11.85% rate set by the Florida PSC. Over a multi-year period, the difference between 15% and 11.85% interest is enormous. Furthermore, Severance added hugely inflated figures for decommissioning and waste-disposal costs which he essentially fabricated out of whole cloth. Romm's enthusiastic endorsement of this travesty is a testament to both his own pitifully low intellectual standards as well as his desperate desire to avoid owning up to being spectacularly wrong on the nuclear issue.
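Just how enormous that difference is can be shown by compounding the two costs of equity at issue. The 15% and 11.85% rates are the ones cited above; the ten-year horizon is my own illustrative assumption, not a figure from either study:

```python
# Compound the two costs of equity: Severance's assumed 15% (from the
# 2003 MIT study) vs. the 11.85% the Florida PSC actually set. The
# 10-year horizon is an illustrative assumption, not from either source.
years = 10
severance_rate = 0.15
psc_rate = 0.1185

severance_factor = (1 + severance_rate) ** years
psc_factor = (1 + psc_rate) ** years

print(f"15.00% compounds to x{severance_factor:.2f} over {years} years")
print(f"11.85% compounds to x{psc_factor:.2f} over {years} years")
print(f"Severance's rate inflates the equity cost by {severance_factor / psc_factor - 1:.0%}")
```

Over a decade the 15% assumption compounds to roughly a third more than the PSC-approved rate; on a multi-billion-dollar equity base, that spread alone is enough to swing a cost estimate dramatically.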

Romm claims that "The only thing that would decrease risk is not pursuing this nuclear power plant until all of the lower-cost efficiency and renewable options were exhausted." But if Florida pursued this course (which on close inspection isn't actually available), it would be too late. Such a short-sighted approach would leave Florida in the 2020s with a dwindling base of firm generation. Only decisive action now can forestall the likely challenges of the 2020s, and FPL and Progress are taking the most prudent course open to them. Joe Romm is too proud to admit that addressing climate change will probably require some measures that he doesn't care for. That's his right, but in his own way he's as much of a "delayer-1000" as the figures he decries on his blog.

Wednesday, April 22, 2009

Jon Wellinghoff is Blind

Via Joe Romm, some comments from the Obama Administration's pick for head of the Federal Energy Regulatory Commission, Jon Wellinghoff:

“I think [new nuclear expansion] is kind of a theoretical question, because I don’t see anybody building these things, I don’t see anybody having one under construction,” Wellinghoff said.

Building nuclear plants is cost-prohibitive, he said, adding that the last price he saw was more than $7,000 a kilowatt--more expensive than solar energy. “Until costs get to some reasonable cost, I don’t think anybody’s going to [talk] that seriously,” he said. “Coal plants are sort of in the same boat, they’re not quite as expensive.”

I guess that Mr. Wellinghoff is blind, because anyone who has actually been paying attention knows that there's gonna be construction starting soon right here in the good ol' USA:
Southern Nuclear has given notice to its main contractors to proceed towards two new reactors at Vogtle. Permissions already in place allow some construction work to begin.
And as Dan Yurman reported here, Progress Energy in Florida and NRG in Texas are moving in the same direction. But this is symptomatic of much more serious problems with Mr. Wellinghoff: from all appearances, he has absolutely no idea what he's talking about.

This quotation gets at the heart of what I mean:

“I think baseload capacity is going to become an anachronism,” he said. “Baseload capacity really used to only mean in an economic dispatch, which you dispatch first, what would be the cheapest thing to do. Well, ultimately wind’s going to be the cheapest thing to do, so you’ll dispatch that first.” He added, “People talk about, ‘Oh, we need baseload.’ It’s like people saying we need more computing power, we need mainframes. We don’t need mainframes, we have distributed computing.”

The technology for renewable energies has come far enough to allow his vision to move forward, he said. For instance, there are systems now available for concentrated solar plants that can provide 15 hours of storage.

“What you have to do, is you have to be able to shape it,” he added. “And if you can shape wind, you can effectively get capacity available for you for all your loads.

“So if you can shape your renewables, you don’t need fossil fuel or nuclear plants to run all the time. And, in fact, most plants running all the time in your system are an impediment because they’re very inflexible. You can’t ramp up and ramp down a nuclear plant. And if you have instead the ability to ramp up and ramp down loads in ways that can shape the entire system, then the old concept of baseload becomes an anachronism.”

Wellinghoff is seriously confused, not only about the current status of renewable technology but also because he's proposing technological frameworks that are currently wishful thinking. I'm not sure what he means by "shaping" wind, but he seems to be proposing to make wind's serious inadequacies irrelevant by allowing energy providers to control end use, an idea currently in vogue but whose popularity I am certain will disintegrate once it starts being implemented widely. The thing is, most (all?) qualified experts dispute the idea that "baseload is an anachronism." Intermittent generators just aren't dispatchable, by definition, and once you get past the wishful thinking and do serious analysis, the barriers to making something like wind dispatchable prove far more formidable than those to building new nuclear plants.

Here's a pertinent example of some people who don't share Wellinghoff's vision: the authors of the DOE study that estimated that the US could generate 30% of its electricity from wind by 2030. As they concluded:
Wind power cannot replace the need for many “capacity resources,” which are generators and dispatchable load that are available to be used when needed to meet peak load. If wind has some capacity value for reliability planning purposes, that should be viewed as a bonus, but not a necessity. Wind is used when it is available, and system reliability planning is then conducted with knowledge of the ELCC of the wind plant. Nevertheless, in some areas of the nation where access to generation and markets that spans wide regions has not developed, the wind integration process could be more challenging.
However, Mr. Wellinghoff's position makes more sense when we consider his background:
Chairman Wellinghoff is an energy law specialist with more than 30 years experience in the field. Before joining FERC, he was in private practice and focused exclusively on client matters related to renewable energy, energy efficiency and distributed generation. While in the private sector, Chairman Wellinghoff represented an array of clients from federal agencies, renewable developers, and large consumers of power to energy efficient product manufacturers and clean energy advocacy organizations.

While in private practice, Chairman Wellinghoff was the primary author of the Nevada Renewable Portfolio Standard (RPS) Act. Nevada's RPS is one of only two state standards to receive an “A” rating from the Union of Concerned Scientists. In addition, he worked with clients to develop renewable portfolio standards in six other states. The Chairman is considered an expert on the state renewable portfolio process and has lectured extensively on the subject in numerous forums, including the Vermont Law School.

Chairman Wellinghoff’s priorities at FERC include opening wholesale electric markets to renewable resources, providing a platform for participation of demand response and other distributed resources in wholesale electric markets including energy efficiency and plug-in hybrid electric vehicles (PHEVs), and promoting greater efficiency in our nation’s energy infrastructure through the institution of advanced technologies and system integration.
So the head of FERC is not merely ignorant (not knowing that the new nuclear projects are breaking ground is totally inexcusable for someone in his position); he's a technological fantasist determined to pick energy winners before they've been tested in the real world. It's rather like trying to pick the winning racehorse before it has been born. It makes me glad that FERC isn't powerful enough to actually implement most of this vision; let's hope it stays that way.

Thursday, April 09, 2009

"Solar-Powered City of Tomorrow" Will Probably Be More Nuclear Than Solar

From Time:
An NFL lineman turned visionary developer today is unveiling startlingly ambitious plans for a solar-powered city of tomorrow in southwest Florida's outback, featuring the world's largest photovoltaic solar plant, a truly smart power grid, recharging stations for electric vehicles and a variety of other green innovations. The community of Babcock Ranch is designed to break new frontiers in sustainable development, quite a shift for a state that has never been sustainable and lately hasn't had much development.
Where will the power for this "green" city come from?
Kitson has been promising unprecedented sustainability all along, but today's shocker was the announcement of Florida Power & Light's plan to provide electricity for Babcock Ranch with a 75-megawatt photovoltaic plant nearly twice as big as the current record holder in Germany. Solar power has been slow to catch on in the gas-powered Sunshine State, but FPL hopes to start construction on the 400-acre, $300 million plant by year's end. The utility expects it will provide enough power for Babcock Ranch and beyond.
Furthermore, Time writer Michael Grunwald claims that:
At $4 million per megawatt — FPL estimates the cost to its customers at about 31¢ per month over the life of the project — it should be more than four times as cost-effective as the nuclear reactors FPL is trying to build near the Florida Keys.
This statement raises the obvious question of why FPL is trying to expand its nuclear capacity if solar photovoltaics are already four times as cost-effective. The answer is elegantly simple: Grunwald's math is way off. Not only will the solar plant produce power at least twice as expensive as that from FPL's new reactors, but thanks to its likely sub-25% capacity factor and lack of energy storage, this "solar-powered city of tomorrow" will likely end up consuming more nuclear-generated electricity than solar-generated electricity.

A quick Google search reveals that Grunwald's numbers are simply inaccurate to begin with:
The Babcock Ranch project will cost between $350 million and $400 million, FP&L officials said. Three other solar projects now being built by FP&L will add 31 cents to the average monthly bill of the utility's 4.5 million customers.
Instead of the $4,000/kW given by Grunwald, that's $4,667-$5,333/kW. This is still considerably lower than the estimates I had seen in studies commissioned by the Florida PSC, for instance this one. That study expected ground-mounted single-axis tracking solar PV installations to reach this price point in approximately 2014. Even then, under favorable assumptions, electricity from these facilities was projected to cost over $0.20/kW-hr, with government subsidies reducing that price to $0.136-$0.15/kW-hr. Meanwhile, the same study concluded, using equivalent methodology, that the new nuclear plants currently being planned in Florida will produce power at a cost of approximately $0.12-$0.13/kW-hr beginning in 2016, and that assumed a relatively high capital cost of $7,700/kW for the nuclear plants. The Navigant study does make clear why the Florida PSC has endorsed aggressive pursuit of both nuclear power and some renewables. However, in ALL the scenarios it studies, nuclear power is cheaper than every form of solar electricity after 2016, even with government subsidies. Far from being "more than four times as cost-effective as the nuclear reactors FPL is trying to build near the Florida Keys," this plant will ultimately be considerably more expensive.
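The per-kilowatt arithmetic is easy to check. Here's a minimal sketch, using only the figures quoted above (the $350-400 million reported cost range and the 75 MW nameplate capacity):

```python
# Back-of-the-envelope check of the Babcock Ranch solar plant's capital cost,
# using the $350-400M cost range and 75 MW nameplate capacity reported above.

NAMEPLATE_KW = 75_000          # 75 MW expressed in kW
COST_LOW = 350_000_000         # low end of reported project cost, USD
COST_HIGH = 400_000_000        # high end, USD

low = COST_LOW / NAMEPLATE_KW    # $/kW at the low end
high = COST_HIGH / NAMEPLATE_KW  # $/kW at the high end
print(f"${low:,.0f}/kW to ${high:,.0f}/kW")

# Grunwald's "$4 million per megawatt" figure, for comparison:
grunwald = 4_000_000 / 1_000   # $/kW
print(f"Grunwald's figure: ${grunwald:,.0f}/kW")
```

Either way you slice it, the reported project cost works out well above Grunwald's $4,000/kW.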

Furthermore, it will not actually power Babcock Ranch most of the time. As reported by the Miami Herald:
Though researchers are working to create storage capability for sunlight-generated power, solar electricity at present only is available during daytime hours. The concept is FPL's 75-megawatt solar generator will produce more power for the state's electric grid while the sun shines than the city will use in 24 hours.

''We're going to generate more renewable energy than the city consumes,'' said Kitson spokeswoman Lisa Hall. ``It will be a leader in solar. It's a great opportunity to overcome that storage thing. The carbon footprint is going to be net zero.''

Basically, the solar plant will produce four times as much electricity as is used by the development one-quarter of the time. The majority of the electricity generated by the plant will go out into the grid and be used throughout Florida. At night and when the sun isn't shining brightly enough to produce much power, the city will receive its electricity from the rest of FPL's generation base... including the reactors at Turkey Point. Currently about 19% of FPL's electricity in Florida is generated at Turkey Point and St. Lucie, but with the ongoing uprates and the two new units at Turkey Point this should increase to 30%+. Furthermore, as solar insolation varies considerably between summer and winter, most of the electricity from the solar plant will be produced in the summer. This means that the PV arrays will produce far more electricity than the city will use in the daytime in June, and far less than it needs in the winter and, of course, none whatsoever at night. So it is unavoidable that most of the energy consumed by Babcock Ranch will come from elsewhere on FPL's grid, and given the probable makeup of their generation mix one decade from now, it will use more electricity generated by nuclear reactors than by its own solar plant.
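A deliberately simple toy model makes the point. All the numbers here are my own illustrative assumptions, not FPL figures: the plant's annual output equals the city's annual consumption (the "net zero" claim), the plant runs at nameplate for a quarter of all hours and is idle otherwise, the city's load is flat around the clock, and 30% of FPL's future grid mix is nuclear:

```python
# Toy model of Babcock Ranch's actual energy mix. Illustrative assumptions:
# - the plant's annual output equals the city's annual consumption ("net zero")
# - the plant runs at nameplate for 25% of hours and is idle otherwise
# - the city's load is flat around the clock
# - ~30% of grid electricity comes from nuclear (assumed future FPL mix)

SOLAR_HOURS_FRACTION = 0.25   # fraction of hours the plant produces
GRID_NUCLEAR_SHARE = 0.30     # assumed nuclear share of FPL's future mix

# During solar hours the plant produces 4x the city's average load, so the
# city's load in those hours is fully covered; the surplus is exported.
direct_solar_share = SOLAR_HOURS_FRACTION           # share met directly by solar
grid_share = 1 - direct_solar_share                 # share supplied by the grid
nuclear_share = grid_share * GRID_NUCLEAR_SHARE     # nuclear portion of that

print(f"direct solar: {direct_solar_share:.1%}")
print(f"grid-supplied: {grid_share:.1%}")
print(f"of which nuclear: {nuclear_share:.1%}")
```

Even this generous model, which ignores the winter shortfall entirely, puts direct solar at only about a quarter of the city's consumption, with nuclear close behind at 22.5%. Fold in the seasonal mismatch described above and a nuclear share above 30% after the uprates and new units, and nuclear overtakes solar.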

Isn't that just deliciously ironic?

Thursday, March 26, 2009

Nuclear Power: An Indispensable Climate Change Solution

Joe Romm clarifies his position on nuclear power:

Why not more than 1 total wedge of nuclear? Based on a post last year on the Keystone report, to do this by 2050 would require adding globally, an average of 17 plants each year, while building an average of 9 plants a year to replace those that will be retired, for a total of one nuclear plant every two weeks for four decades — plus 10 Yucca Mountains to store the waste. I also doubt it will be among the cheaper options. And the uranium supply and non-proliferation issues for even that scale of deployment are quite serious. See “An introduction to nuclear power.”

Note to all: Do I want to build all those nuclear plants. No. Do I think we could do it without all those nuclear plants. Definitely. Therefore, should I be quoted as saying we “must” build all those nuclear plants, as the Drudge Report has, or even that I propose building all those plants? No. Do I think we will have to swallow a bunch of nuclear plants as part of the grand bargain to make this all possible and that other countries will build most of these? I have no doubt. So it stays in “the solution” for now.
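Romm's build-rate claim does check out arithmetically; a quick sketch using only the numbers from his quote above:

```python
# Check the "one nuclear plant every two weeks for four decades" claim,
# using the figures from Romm's quote: 17 new plants per year plus 9
# replacement plants per year, sustained over 40 years.

NEW_PER_YEAR = 17
REPLACEMENT_PER_YEAR = 9
YEARS = 40

plants_per_year = NEW_PER_YEAR + REPLACEMENT_PER_YEAR   # total builds per year
weeks_between_plants = 52 / plants_per_year             # build cadence in weeks
total_plants = plants_per_year * YEARS                  # cumulative builds

print(f"{plants_per_year} plants/year, one every {weeks_between_plants:.0f} weeks")
print(f"{total_plants} plants over {YEARS} years")
```

Twenty-six plants a year is indeed one every two weeks; over four decades, that's 1,040 plants. The arithmetic isn't the issue with Romm's position, as I argue below.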

Romm's take on nuclear power is not particularly well-informed, as I've discussed in the past. But examining its limited role in his proposed solution reveals that Romm has not seriously considered the physical limitations associated with his preferred energy options. For political, geographical, and practical reasons, nuclear power must ultimately play a vastly larger role in our energy future than predicted by Romm.

Romm describes his preferred future energy mix as follows:

What's wrong with this picture?

4000 GW wind, 5000 GW solar thermal, 2000 GW solar photovoltaic. This is an increase of two orders of magnitude for wind and three for both types of solar. I notice that the capacity factor assumptions implied by Romm are quite high. Wind turbines are now a fairly mature technology, so their economics are increasingly apparent, but the costs of solar thermal and solar photovoltaic are still unclear. But for the sake of argument I'm willing to grant Romm that maybe in 2050 these technologies will be cost-competitive. The important thing is that the qualitative limitations of these sources of energy go far beyond cost. With the possible exception of a handful of exceptionally well-endowed nations, investment in solar and wind can NEVER assure energy security.
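To put those nameplate figures in perspective, here is a rough conversion to average delivered power. The capacity factors below are my own illustrative assumptions (roughly typical values for each technology, not Romm's figures):

```python
# Convert Romm's nameplate capacities to rough average delivered power.
# The capacity factors are illustrative assumptions of mine, not figures
# from Romm's proposal.

HOURS_PER_YEAR = 8760

mix = {
    # source: (nameplate GW, assumed capacity factor)
    "wind":          (4000, 0.30),
    "solar thermal": (5000, 0.35),
    "solar PV":      (2000, 0.20),
}

total_avg_gw = 0.0
for source, (gw, cf) in mix.items():
    avg_gw = gw * cf
    total_avg_gw += avg_gw
    print(f"{source}: {gw} GW nameplate -> {avg_gw:.0f} GW average")

twh_per_year = total_avg_gw * HOURS_PER_YEAR / 1000
print(f"total: {total_avg_gw:.0f} GW average, ~{twh_per_year:,.0f} TWh/yr")
```

Note how the 11,000 GW of nameplate capacity shrinks to roughly 3,350 GW of average output once intermittency is accounted for, and that average says nothing about whether the power shows up when it's needed.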

Solar and wind generators depend on the ambient energy resources available where they are installed. There is, of course, no place on earth where the sun shines all the time, and not many where the wind always blows, so these are intermittent resources by nature. But some countries are better endowed than others. Imagine, if you will, the world of 2050 with the energy supply specified by Romm. Some nations, such as Russia, would be unable to meet their own generation needs through wind and solar power. They could import electricity from abroad, but they would have to compete with other markets such as India and China for it. Not only would this make energy expensive, but it would also place Russia at the mercy of its energy suppliers. Hostile states could cripple Russia's economy by interrupting its energy supplies. States exporting renewable energy would also have a substantial incentive to underproduce, both to foster uncertainty and to raise energy prices. There would be little incentive to produce enough energy for the have-nots, especially since electricity transmission would make them largely captive markets, unlike present-day oil importers. Countries without abundant renewable energy resources would therefore have a desperate need for more secure energy supplies.

Hence the reason nuclear energy is likely to dominate our energy future. Because relatively few nations have the renewable resources needed to support their economies themselves (just how many depends on how these technologies develop), the most logical step for them to take to secure their post-carbon energy supply is to invest in nuclear energy infrastructure. They would have every reason to doubt that other countries would build the infrastructure needed to provide them with affordable and reliable energy, as it would be in those states' interest to undersupply them. Even in a world where renewable energy technology could fulfill all of the world's energy needs affordably, geographic realities would make nuclear power more attractive.

I do not actually believe that wind and solar power are cheaper than nuclear, but my point is that the barriers to a world powered by solar and wind are not merely technological, but geographical, political and economic. I do not expect that solar thermal electricity will cost less than nuclear electricity in 2050, but even if it did this would not translate into energy security for most of the world. Only the provision of non-intermittent energy sources with the ability to store months' or years' worth of energy will secure the interests of these nations. And nuclear power fits these requirements.

In 2050, I expect there to be far more than 350-700 GW of new nuclear plants in operation. In fact, I would not be surprised by 5000-6000 GW of new nuclear by this point. Most of this will probably consist of mass-produced Generation IV reactors, including ALMRs, PBMRs, and various kinds of MSRs. Not only can these technologies replace fossil-fuel electrical generation anywhere on earth at reasonable cost, but they also allow nations to stockpile decades' or even centuries' worth of fuel--meaning that even a war or natural catastrophe could have minimal effect on energy production.

The real-world alternative to this is NOT an idealistic future of cooperation, windmills, and solar panels. It is a dystopian nightmare where most of the world continues to burn coal because they lack the ability to domestically produce or import environmentally benign energy. It is a world wracked by war, catastrophe, and want. Even if the myriad technological problems of renewable energy were solved, the simple geographic fact remains that some nations lack sufficient energy resources, be they oil, gas, sunshine, or wind. For this reason, nuclear power is indispensable for averting climate catastrophe. Those who pretend otherwise, such as Romm, are fooling themselves.

Tuesday, March 24, 2009

Noisome Falsehoods About Three Mile Island

As I expected, the upcoming anniversary of the Three Mile Island Accident has inspired Harvey Wasserman to trot out one of his more dubious perennial deceptions:
People died---and are still dying---at Three Mile Island.

As the thirtieth anniversary of America's most infamous industrial accident approaches, we mourn the deaths that accompanied the biggest string of lies ever told in US industrial history.
Mr. Wasserman does not apparently live with us in the reality-based community, where it is widely understood that the radiation release at TMI was minimal and that the public health impact from radiation was nonexistent. Repeated epidemiological studies have confirmed this (see Hatch et al., Am. J. Pub. Health 81:719-24 (1991); Talbott et al., Environmental Health Perspectives 108:545-62 (2000); and so on).

Sadly, Wasserman has some allies in perpetuating his falsehoods:
A study by Columbia University claimed there were no significant health impacts, but its data by some interpretations points in the opposite direction. Investigations by epidemiologist Dr. Stephen Wing of the University of North Carolina, and others, led Wing to warn that the official studies on the health impacts of the accident suffered from “logical and methodological problems.” Studies by Wing and by Arnie Gundersen, a former nuclear industry official, being announced this week at Harrisburg, significantly challenge official pronouncements on both radiation releases and health impacts.

Gundersen, a leading technical expert on nuclear engineering, says: “When I correctly interpreted the containment pressure spike and the doses measured in the environment after the TMI accident, I proved that TMI's releases were about one hundred times higher than the industry and the NRC claim, in part because the containment leaked. This new data supports the epidemiology of Dr. Steve Wing and proves that there really were injuries from the accident. New reactor designs are also effected, as the NRC is using its low assumed release rates to justify decreases in emergency planning and containment design."
The notion that TMI radiation releases were two orders of magnitude higher than official estimates is preposterous. How do we know this?

As a public service, in 1979 the Eastman Kodak Company collected all the unexposed film that it could locate in the area around Three Mile Island and examined it for evidence of radiation-induced fogging. This would provide excellent evidence of even relatively small radiation exposures, because the film would begin fogging at a mere 5 millirem.

Kodak found nothing. As the reputable scientists who have examined the accident since have emphasized, this totally rules out the theory that public exposure was substantially above background.

Wasserman and his ilk, however, have something better than science. They have... ANECDOTES!
Anecdotal evidence among the local human population has been devastating. Large numbers of central Pennsylvanians suffered skin sores and lesions that erupted while they were out of doors as the fallout rained down on them. Many quickly developed large, visible tumors, breathing problems, and a metallic taste in their mouths that matched that experienced by some of the men who dropped the bomb on Hiroshima, and who were exposed to nuclear tests in the south Pacific and Nevada.
In March of 1980, I went into the region and compiled a range of interviews clearly indicating widespread health damage done by radiation from the accident. The survey led to the book KILLING OUR OWN, co-authored with Norman Solomon, Robert Alvarez and Eleanor Walters which correlated the damage done at TMI with that suffered during nuclear bomb tests, atomic weapons production, mis-use of medical x-rays, the painting of radium watch dials, uranium mining and milling, radioactive fuel production, failed attempts at waste disposal, and more.

My research at TMI also uncovered a plague of death and disease among the area's wild animals and farm livestock. Entire bee hives expired immediately after the accident, along with a disappearance of birds, many of whom were found scattered dead on the ground. A rash of malformed pets were born and stillborn, including kittens that could not walk and a dog with no eyes. Reproductive rates among the region's cows and horses plummeted.

Much of this was documented by a three-person investigative team from the Baltimore News-American, which made it clear that the problems could only have been caused by radiation.
The plural of anecdote is not "data." The best longitudinal mortality study of TMI is the Talbott et al. study published in Environmental Health Perspectives in 2003 (EHP 111:341-348). The authors tracked 32,135 individuals who were within five miles of TMI during the accident, following their mortality from 1979 through 1998. Their findings?
In conclusion, the mortality surveillance of this cohort, with a total of almost 20 years of follow-up, provides no consistent evidence that radioactivity released during the TMI accident (estimated maximum and likely gamma exposure) has had a significant impact on the mortality experience of this cohort through 1998. Slight increases in overall mortality and overall cancer mortality persist. The findings of increased risk of LHT for males for maximum gamma exposure and in females for background gamma are of interest and merit continued surveillance to determine if the trend continues. With the exception of breast cancer risk and all lymphatic and hematopoietic tissue (LHT) and maximum gamma exposure, no apparent trends were seen with any of the radiation exposure variables. The slight trend for female breast cancer and likely gamma exposure seen in the earlier update is no longer evident.
Basically, Wasserman's claim that people died, and are still dying, at Three Mile Island is simply a noisome falsehood. Longitudinal studies have thoroughly discredited his preconceived understanding of the accident. But Wasserman's ravings are not what bothers me. I'm more concerned by the fact that there are still people like Steve Wing trying to push the same discredited scaremongering as genuine science. This is an embarrassment to the University of North Carolina and to the field of epidemiology. Presenting "new studies" at a news conference rather than in a peer-reviewed journal is the kind of tactic employed by cold fusion charlatans and other pseudoscientists. We've humored these people far too long; the public needs to learn that the debate is over, and that they lost.