The Merit of Baseload


South Australia 2015 load duration curve, annotated from Appendix A.

Over the course of a year, Australian demand for electricity tends to peak during air-conditioned summer afternoons and reach its minimum in the hours after midnight. The lumpy day-to-day profile of this demand can be rearranged from highest to lowest into what is termed a load duration curve.

This chart is the duration profile for South Australia in 2015, traced by the black curve. Viewing the state’s demand and the proportion of wind generation this way makes certain features clear.

  • Baseload of roughly 700 megawatts is marked by the grey line. The addition of wind generation clearly pushes the residual load below this level for roughly the last 20% of the year. This is the basis for the diminished economics of “baseload” power stations, and for claims that no further such capacity is needed.
  • Wind’s contribution is the wavy wedge between the black line and the blue line (the residual load curve). That contribution is smallest at the times of highest demand, near “0%”, which are instead met by expensive stand-by “peaking” capacity. The consequence is that the “baseload” and other capacity most displaced by wind, down near 100%, are the cheaper sources of power which tend to maintain downward pressure on wholesale prices.
  • Clearly then, even with South Australia’s noteworthy wind capacity, which has cut state electricity sector emissions by roughly one quarter, peak demand of around 3,000 megawatts requires the operation of as much firm capacity as if the blue curve weren’t there.
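
For those who want to see the mechanics, a minimal sketch of how such curves are built from hourly data follows. The file name, column names and the 700 megawatt threshold are illustrative assumptions only, not the actual dataset behind the chart.

```python
import numpy as np
import pandas as pd

# Illustrative only: assumes a CSV of hourly South Australian data for 2015
# with 'demand_mw' and 'wind_mw' columns (not the actual source data).
df = pd.read_csv("sa_2015_hourly.csv")

demand = df["demand_mw"].to_numpy()
residual = (df["demand_mw"] - df["wind_mw"]).to_numpy()

# A load duration curve is simply the hourly series sorted from highest to lowest.
load_duration = np.sort(demand)[::-1]
residual_duration = np.sort(residual)[::-1]

# Share of the year for which residual load stays above an assumed 700 MW baseload.
baseload_mw = 700
share_above = (residual_duration > baseload_mw).mean()
print(f"Residual load exceeds {baseload_mw} MW for {share_above:.0%} of the year")
```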

Further analysis was offered recently by the Grattan Institute (page 8):

Increasing supply in any market when demand is falling or flat will push down prices. But a further characteristic of wind power suppresses wholesale electricity prices in the short run.

The marginal cost of wind generation – the amount it costs to generate an additional unit of power – is near zero. In fact, if a wind generator chooses not to generate, it will effectively lose money since it will not generate a subsidy under the Federal Government’s Renewable Energy Target (RET) scheme… This is why, at times, a wind generator may bid into the market at a negative price – it is prepared to pay the market to take its electricity because it knows it will get revenue from the subsidy.

Intermittent generators must also either dispatch or dump the electricity they create. When the wind blows, power is generated. If wind generators are to dispatch, they need to make sure that their electricity gets bought or that they pay someone to take it.

Increasing the supply of low marginal cost generation leads to changes in the ‘merit order’, reducing the price that all generators are paid in the NEM. This is known as the merit-order effect.

But these lower wholesale spot prices will not cover wind farms’ long-term costs. The long-term cost of wind generation is around $100 per megawatt hour, although this can vary with individual projects. This cost is very much higher than today’s average NEM prices of around $50 per megawatt hour. Consumers must eventually pay to cover the long-term costs of all generation.

This echoes the illustration of the merit order effect provided by Deloitte in 2015:


The merit order effect can be induced by any form of generation or demand side resource that has a lower short-run cost (i.e. ignoring fixed costs and the capital costs of building the plant) and is a feature of an efficient market. The limiting factor is usually that the new plant needs to be able to recover its fixed and capital costs over time as well, so the price cannot be pulled too far down. However, by incentivising renewable plant outside the market, policies such as the Renewable Energy Target (RET) move this limit.
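
To make the mechanics concrete, here is a stylised sketch of merit-order dispatch. Every cost and capacity below is invented for illustration and does not represent any actual NEM bid stack; the point is simply that adding zero-marginal-cost wind changes which unit sets the clearing price.

```python
def clearing_price(bid_stack, demand_mw):
    """Stack offers from cheapest to dearest marginal cost and return the
    price of the last unit needed to meet demand (the marginal unit)."""
    met = 0
    for cost, capacity in sorted(bid_stack):
        met += capacity
        if met >= demand_mw:
            return cost
    raise ValueError("Insufficient capacity to meet demand")

# Invented ($/MWh, MW) pairs purely for illustration.
stack = [
    (12, 800),   # brown coal
    (35, 600),   # black coal
    (60, 500),   # combined cycle gas
    (250, 400),  # open cycle gas "peakers"
]

demand = 1600  # MW
print("Price without wind:", clearing_price(stack, demand))                       # gas sets the price: 60
print("Price with 700 MW of wind:", clearing_price([(0, 700)] + stack, demand))   # coal sets it: 35
```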


Imagine that a new type of combined cycle gas turbine power station could operate without emitting carbon dioxide. The ultimate purpose of the RET – climate action – should mean it would be compensated by something like these Large-scale Generation Certificates. Suddenly, the short-run market advantage claimed by wind generation is gone.

Armed with this perspective, we can begin to see where some enthusiasts for exclusive energy supply scenarios begin to go wrong. Take GetUp/Solar Citizens, for example:

Renewable energy generators have low marginal costs (the cost of producing one extra unit of electricity), which means they can bid into the wholesale market low. This pushes more expensive generators out of the stack of successful bids and lowers the overall wholesale price of electricity for all of us. This is called the ‘merit-order effect’, and why it’s not called the ‘renewables winning effect’ is beyond us.

Hopefully at this stage it isn’t beyond you, dear reader.

So, what happens if storage is added to this equation? Household-scale batteries for time-shifting rooftop solar generation obviously enable the use of renewable energy after dark. In contrast, storage scaled up to the wider electrical grid – which currently exists only in the form of pumped hydroelectricity – is dominated by the economics of covering the costs of operation, maintenance and input electricity with the revenue from selling output electricity. This means arbitraging supply from low-demand periods to high.

The result can be seen in this version of the load duration curve: if storage were paired with wind in South Australia, operators would effectively move supply from the thick end of the wedge back as close to the point as practicable, to maximise financial return. This could actually begin replacing high-cost peaking capacity. Ironically, enough of this would start to bring baseload back, at the same time as eroding the arbitrage economics for further additions of storage. It should be obvious, though, that the absolute last position on the demand profile which would be economically served by stored capacity would be baseload itself.
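
The arbitrage logic is easy to caricature in code: a storage operator charges in the hours of lowest residual load and discharges in the highest, so the top of the duration curve is flattened first and the baseload end last. Everything below – the storage size, efficiency and synthetic residual load – is an arbitrary assumption for illustration, not a model of South Australia.

```python
import numpy as np

def arbitrage_dispatch(residual_mw, power_mw, energy_mwh, efficiency=0.75):
    """Crude, price-agnostic arbitrage: discharge into the highest residual-load
    hours, recharge during the lowest, ignoring state-of-charge sequencing."""
    residual = residual_mw.astype(float).copy()
    discharge_hours = int(energy_mwh / power_mw)
    charge_hours = int(round(discharge_hours / efficiency))
    order = np.argsort(residual)
    residual[order[-discharge_hours:]] -= power_mw   # shave the highest hours first
    residual[order[:charge_hours]] += power_mw       # recharging lifts the overnight trough
    return residual

# Arbitrary example: a stylised 24-hour residual load (MW) and a 100 MW / 400 MWh battery.
hours = np.arange(24)
residual = 1200 + 900 * np.exp(-((hours - 18) ** 2) / 8) - 400 * np.exp(-((hours - 3) ** 2) / 8)
flattened = arbitrage_dispatch(residual, power_mw=100, energy_mwh=400)
print(f"Peak residual load: {residual.max():.0f} MW -> {flattened.max():.0f} MW")
print(f"Minimum residual load: {residual.min():.0f} MW -> {flattened.min():.0f} MW")
```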


Storage in the Cold Light of Day

People want energy in modern society when they want it, and so you’ve got to have supply and demand matching. And, again, there’s a new delusion that’s spreading through the world at the moment which is, “oh yes, now solar is coming down in price, wind is coming down in price, and batteries are coming down in price as well.” People seem satisfied with these simple statements: the prices are coming down so it’s all going to be fine, but they haven’t done the numbers to think through actually how big the batteries would need to be if you wanted to do a solar-and-batteries-only solution.

A solar-plus-battery solution in a place like Las Vegas, I can definitely see playing a large role… Society still needs reliability, though… Society stops functioning if we don’t have a reliable electricity system going all the time, and so for a place like Las Vegas you’re still going to want other technologies in that mix as well. So, I’d advise Las Vegas to get a nuke, for example…

I’m delighted how the book has been helpful… but I’m also still irritated that these delusions about the easiness of getting by with a bit of renewables and a bit of batteries… I think there’s still a lot more to do.

~ Sir David MacKay

It was a relief to hear that sensible projections regarding the role of batteries in Australia’s near-term electricity supply challenge were authoritatively expressed at the meeting of energy ministers last month:

The AEMO told the recent COAG Energy Ministers meeting it may be 10-20 years until battery storage would be able to exert an influence on grid stability and support.

There’s understandable disappointment from some commentators. However, AEMO’s sober assessment merely echoes that of the CSIRO.


AEMO itself expects approximately 6.6 gigawatt hours of battery storage distributed among rooftop solar capacity by 2035-2036, which sounds like a whole lot more than exists now.

Yet, for a sense of perspective, 6.6 gigawatt hours would supply no more than two-thirds of one percent of the demand across the 62-hour becalmed period described by WattClarity (a scenario with a hypothetical ten-fold wind capacity connected to the present national market).
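
A back-of-envelope check bears this out. The average demand figure below is an assumption for illustration only; it is not a number from AEMO or WattClarity.

```python
# Back-of-envelope: how far does 6.6 GWh of distributed batteries stretch
# across a 62-hour wind lull? The ~16 GW average NEM demand is an assumed,
# illustrative figure, not a sourced value.
battery_gwh = 6.6
lull_hours = 62
assumed_avg_demand_gw = 16

energy_needed_gwh = assumed_avg_demand_gw * lull_hours   # ~990 GWh over the lull
print(f"Battery share of the lull's energy: {battery_gwh / energy_needed_gwh:.2%}")
# -> roughly two-thirds of one percent
```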

The COAG Energy Council and its independent review process must maintain this realism as it strives to “maintain the security, reliability, affordability and sustainability of the national electricity market” and to integrate climate and energy policy. This should encompass a technology-neutral approach, and recognise that refusing to consider the future benefits of modern nuclear capacity – potentially available on a comparable timescale to batteries, yet historically proven – serves only a diminishing, outdated activism, when we really have a whole lot more to do.


Criticisms and Casks

The report from The Australia Institute, commissioned by the Conservation Council of South Australia and entitled Digging for Answers, is arguably the premier critique of the conclusions of the Nuclear Fuel Cycle Royal Commission, which identified lucrative potential opportunities in permanent international stewardship of used reactor fuel. These and other organisations hold an ideological opposition to any involvement in the fuel cycle, and would ideally prefer that everyone just stop finding out about it.

As such, I have prepared a page hosting a detailed rebuttal of The Australia Institute’s strikingly superficial analysis, which was submitted for consideration by the Joint Committee on the findings of the NFCRC, and I encourage all to have a read. In addition, an account of the transfer of used nuclear fuel to the kind of dry cask storage considered in detail by the Royal Commission, along with rare pictures, has been provided to help build wider familiarity with this routine materials management process. Please share it widely.

The Dry Cask Storage Process


When Your Local Nuclear Plant is the Safest Place in the World

We can only have a rational debate about the risks and benefits of nuclear power if we can put the risks into a balanced perspective. ~ Professor Geraldine Thomas

In March 2011 the advanced industrial nation of Japan was brought to its knees by a record 9.0 magnitude earthquake. A tsunami of devastating height inundated the east coast of Honshū, overwhelming many sea walls, washing away towns and wreaking unimaginable destruction. Over eighteen thousand people were lost and thousands of others injured.

As The Telegraph’s Michael Hanlon later observed:

When it became clear the waves had struck a nuclear power plant, Fukushima Daiichi, 100 or so miles north of Tokyo, it was almost as if the great disaster we had witnessed had been erased from view. Suddenly, all the reports concentrated on the possibility of a reactor meltdown, the overheating fuel rods, and the design flaws in this ancient plant.

It is understandable, then, that the survival and safe shutdown of the nuclear plant closest to the undersea epicentre went unnoticed.


Onagawa Nuclear Power Plant

Back it up eighteen years, to 1993. The second boiling water reactor at the Tōhoku Electric Co’s Onagawa nuclear station is completed after a three-and-a-half-year build, costing $2.64 billion in today’s US dollars. The site is already elevated and fortified beyond historical tsunami indications, the legacy of a corporate safety culture instilled by vice president Yanosuke Hirai. This diligence pervaded and persisted through the company, driving safety focus and disaster preparedness. A further unit is later constructed beside Onagawa-2. The plant operates well above the average Japanese availability factor.

The response of Onagawa to the natural disasters of 2011 has been detailed in the literature by senior personnel, as well as by an independent journalist. In response to the quake, all three reactors shut down automatically, as designed. Workers quickly organised to secure the plant’s safety. Backup power systems, including diesel generators and offsite power lines, were safe from the waves and continued to remove the decay heat from the reactor cores. Tsunami damage was limited to a non-safety switchgear fire and auxiliary building flooding.


Residents of Onagawa: warm, fed and safe.

The safety and reliable electricity at the plant, amid unprecedented devastation, drew local survivors. Hundreds of people were housed in Onagawa’s gymnasium for three months and provided with warmth and supplies.

Onagawa-2, and its sister units, rapidly reached secure cold shutdown, as designed. In 2013, Tōhoku Electric Co began the process of obtaining approval for restarting the reactor. Approval and operating requirements are much tighter now, but in the words of Onagawa’s personnel:

We were able to properly manage the earthquake and tsunami on March 11. However, there are still many lessons that we have learned from the experience.

To pursue and maintain higher safety, we will continue to implement various enhancements. Furthermore, we will continue to grow our skills at executing emergency procedures properly and correctly.

Having fended off the worst nature could dish out, they are now focused on getting even better.

Meanwhile, Japan increasingly relies on imported natural gas. When it’s not gas being burned, it’s coal – in record amounts. The direct health and far-reaching climate impacts of this fossil fuel combustion are unequivocal. This is the true ongoing disaster that began in March 2011.


The psychological toll from Chernobyl was far worse than the damage done by radiation. The political, media and activist frenzy following the Fukushima accident overshadowed not just the natural disaster but also all we had learned regarding legitimate appreciation of nuclear hazards… although, five years on, the extent of the overreaction is crystallising. It would be good to see the lessons studiously relearned, to protect us next time.

But perhaps there is a bonus lesson provided by Onagawa: a properly designed power plant, with reactors built rapidly and affordably, shrugged off nature’s mightiest blows thanks to the exemplary safety culture of its human component. Onagawa could teach us that there doesn’t have to be a next time.

This article originally appeared at Energy For Humanity.

Storaging Your Way Out

This week, the long-awaited UK government approval for the construction of Hinkley Point C was granted. This will be a pair of modern light water reactors of the French EPR type, generating 3,260 megawatts at full power, with a design life of 60 years. An exhaustive description of the costs, subsidies and national context for this huge piece of energy supply infrastructure is available from the World Nuclear Association.


Taishan unit 1 in China will be the first EPR commissioned, early in 2017. Other projects in France and Finland have faced substantial delays and cost problems.

Inevitably, many will wonder if the 25.5 billion kilowatt hours and 14 million avoided tons of greenhouse gas can be achieved some other way – maybe even cheaper. Indeed, solar has already been advanced as a preferred option… by the UK’s Solar Trade Association. Just as inevitably, 1) this is solar plus storage, and 2) the amount of necessary storage isn’t specified.

This has already been dissected over at Energy Matters, where the requirement was estimated at 7 billion kilowatt hours of storage capacity when relying on solar alone – “roughly the equivalent of eight hundred more Dinorwigs”. Dinorwig is the largest pumped hydro storage facility in the UK; ironically, it was originally intended to store nuclear power overnight. Alternatively, it would take over 87 thousand of the largest battery storage installations ever proposed.
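
The scale is easy to sanity-check with rough figures. Dinorwig’s roughly 9 gigawatt hours of storage is widely quoted; the 80 megawatt hours assumed for the largest proposed battery installation, and the 90% capacity factor used to reproduce Hinkley’s annual output, are my own illustrative assumptions.

```python
# Rough sanity checks on the numbers above. Dinorwig's ~9.1 GWh is a widely
# quoted figure; the 80 MWh "largest proposed battery" and the 90% capacity
# factor are illustrative assumptions, not sourced values.
hinkley_gw = 3.26
assumed_capacity_factor = 0.9
annual_output_twh = hinkley_gw * 8760 * assumed_capacity_factor / 1000
print(f"Hinkley Point C annual output: ~{annual_output_twh:.1f} TWh")   # ~25.7 TWh, close to 25.5 billion kWh

storage_needed_gwh = 7_000   # 7 billion kilowatt hours
dinorwig_gwh = 9.1
largest_battery_mwh = 80
print(f"Dinorwig equivalents: ~{storage_needed_gwh / dinorwig_gwh:,.0f}")                  # roughly 770
print(f"Battery installations: ~{storage_needed_gwh * 1000 / largest_battery_mwh:,.0f}")   # roughly 87,500
```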


The devil’s always in the details. Especially when the details aren’t provided.

To be clear, solar plus storage still won’t provide what huge reactors do. And EPRs can’t provide anything like the flexibility of distributed solar/storage combinations. They have fundamentally different profiles, different scales. Since they can’t substitute for each other, it’s perverse to feed the persistent nuclear vs renewables struggle with them.