Sunday, August 14, 2022

NuScale's SMR reactors get much more expensive

SMR stands for Small Modular Reactor, and the argument was that because all the parts can be built on an assembly line in a factory, SMRs should produce much cheaper electricity than the giants we have tended to build so far.  Unfortunately, estimated costs have more than doubled in 5 years.


From a tweet by @NuclearEngnrng



Estimated cost of the NuScale 12-pack SMR: $6.1B in 2020, up from $3.6B in 2017.

Estimated cost of the NuScale 6-pack: $5.3B in 2022.

HT for chart: @ecopolitain


The chart doesn't estimate the LCOE, but in 2017 NuScale estimated it at below $60/MWh. Estimated costs have roughly doubled, so that suggests around $120/MWh now.  That may still be acceptable if nuclear turns out to be necessary for the last 10% of de-carbonisation of the grid.
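As a back-of-envelope check on that doubling, LCOE can be scaled by the change in overnight cost per watt. This is a sketch only: the 50 MWe module rating for 2017 and the uprated 77 MWe rating for 2022 are my assumptions, and LCOE is assumed to move in simple proportion to capital cost per watt, ignoring financing and O&M.

```python
# Back-of-envelope LCOE scaling: assumes LCOE moves in proportion
# to overnight cost per watt (ignores financing and O&M changes).

def cost_per_watt(total_cost_busd, modules, mwe_per_module):
    """Overnight cost in $/W from a headline price and plant size."""
    return total_cost_busd * 1e9 / (modules * mwe_per_module * 1e6)

cpw_2017 = cost_per_watt(3.6, 12, 50)   # 12-pack, 50 MWe modules (assumed rating)
cpw_2022 = cost_per_watt(5.3, 6, 77)    # 6-pack, uprated 77 MWe modules (assumed rating)

# Scale the 2017 estimate of ~$60/MWh by the change in $/W
scaled_lcoe = 60 * cpw_2022 / cpw_2017
print(f"{cpw_2017:.1f} $/W -> {cpw_2022:.2f} $/W; LCOE ~ ${scaled_lcoe:.0f}/MWh")
```

Under those assumptions the cost per watt nearly doubles, which is why a ~$120/MWh figure is in the right ballpark.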

Two years before the great crash

Extreme inequality tends to be followed by economic catastrophe, social chaos and war.  The huge inequality of the 1920s worsened the Great Depression and, as the depression spread round the world, led to the rise of Hitler and right-wing parties in other countries.  



Australia's inflation also at 30-year highs

Inflation is a global phenomenon now, driven by supply chain issues, oil and gas prices, war, Covid stimulus in 2020, and climate change.  After allowing for the spike caused by the introduction of the GST (Goods and Services Tax) in 2000, inflation was last at these highs in 1991.

As an aside, it's depressing that the Aussie government (both main parties) refuses to fund the Bureau of Statistics adequately, so we are obliged to rely on a private-sector body to calculate monthly CPI indices.





Thursday, August 11, 2022

The River Loire almost dries out

This river, normally wide and full, is virtually empty because of the extreme drought in France.



Hydrogen efficiency will beat expectations

From a Twitter thread by Gniewomir Flis



After spending the last six months looking at cutting-edge hydrogen tech, I believe that the prevailing view that hydrogen is inefficient needs an update. There’s much innovation to be excited about on the horizon.

Take this T&E [Transport & Environment] chart for instance, which assumes that electrolysis is 76% efficient, and fuel cells are only 54%, which together account for the bulk of the losses. Almost every paper out there uses similar, often worse, numbers.



Current tech is already better than that. For instance, Nel’s stack can be up to 93% efficient (3.8 kWh/Nm3 of H2). Those numbers are rarely achieved in practice, though, as the stacks are run at higher current densities (= lower efficiency, but less capex).
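That 93% figure follows from the quoted specific energy consumption: dividing hydrogen's higher heating value (about 3.54 kWh per normal cubic metre, a standard constant) by the electricity consumed gives the HHV efficiency. A minimal sketch:

```python
# Electrolyser efficiency from specific energy consumption.
# HHV of hydrogen is ~3.54 kWh per normal cubic metre (standard value).

HHV_KWH_PER_NM3 = 3.54

def electrolyser_efficiency(kwh_per_nm3):
    """HHV efficiency: energy content of the H2 divided by electricity consumed."""
    return HHV_KWH_PER_NM3 / kwh_per_nm3

print(f"{electrolyser_efficiency(3.8):.0%}")  # Nel stack figure from the thread
```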

But efficiencies are getting better. See the performance of the Oort Energy stack. 90% efficiency at 2A/cm2. This is not some experimental stuff, it’s a full sized stack ready to be manufactured.



In addition to high efficiency, Oort can electrochemically pressurise H2 internally to 200 bar at less than half the energy cost of mechanical compression.

You may also have heard of Hysata, which set a record-breaking efficiency (98%), though the tech is still several years away from commercialisation as it has serious degradation problems.


But more innovation is coming. From the less conventional architectures, we’ve got H2Pro which claims 95% efficiency. So, upcoming electrolysis solutions are looking pretty efficient to me, a step up from the 70-75% efficiencies assumed by many today.

On the fuel cell side the innovation is just as good, if starting from a lower baseline (50%). The technology to watch is high-temperature PEM. By high temperature I’m referring to 200°C, which aids in the rejection of water, a rate-limiting step.

Practically, this means that high-temperature PEM fuel cells would enjoy a 30% increase in efficiency, i.e. from 54% to 70%. Several companies are working on this, like Hy-point, Advent, or Mebius. [See this article in Electrive]
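Note that the 30% is a relative gain (the jump from 54% to 70% is 16 percentage points); a one-liner confirms the thread's arithmetic:

```python
# 54% -> 70% is ~30% relative improvement (16 percentage points),
# which matches the thread's framing of "a 30% increase".
relative_gain = 0.70 / 0.54 - 1
print(f"{relative_gain:.0%} relative increase")
```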

Applying these innovations to the T&E chart: Hydrogen production with storage is now 82% efficient.

Round trip efficiency is now 54%, so only 1/3rd less efficient than using electricity directly. Compare with 42% T&E had assumed for… 2050!
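The round-trip number falls out of multiplying the stage efficiencies together. In this sketch the 82% production-with-storage and 70% fuel-cell figures come from the thread, while the 94% power-conditioning factor is my own assumption, added so the chain lands near the quoted 54%:

```python
# Round-trip efficiency as a product of stage efficiencies.
# 0.82 (production with storage) and 0.70 (fuel cell) are from the thread;
# 0.94 (power conditioning) is an assumed extra loss that reconciles the
# chain with the thread's quoted 54% round trip.

def round_trip(*stages):
    """Multiply per-stage efficiencies into an overall round-trip figure."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

rt = round_trip(0.82, 0.70, 0.94)
print(f"round trip ~ {rt:.0%}")
```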





And that’s not the end of it. In the storage part, companies like @RuxEnergy (metal organic frameworks) or @VerneH2 (cryocompression) are almost able to match the density of liquefied hydrogen without the 40% energy loss.

Does this change the calculus for hydrogen? To a certain degree. On the road, batteries have a significant first-mover advantage which I think they’re likely to retain, especially since this up-and-coming hydrogen innovation won’t get commercialised overnight.


But hydrogen might feel a boost in heavier and infrastructure-sensitive applications. I think regional aviation will be a good case in point. Heavy trucking and construction too. Trains could be a good one.


High temperature process heat is another one which would also get a boost. Even if heating with hydrogen is 12% less efficient than with electricity, it may be simpler to repurpose gas infrastructure than it is to lay hundreds of MW of new lines. EU modelling sees a role here.


To conclude: electrolysis is becoming much more efficient than the 70% it gets quoted on. Fuel cells are also getting better. Does this mean we’re all getting FCEVs and hydrogen boilers in 2030? Probably not, but other applications may get a hydrogen boost.



Perhaps the real prize here is not that FCEVs are more efficient, but that we will need much less renewable energy to introduce green hydrogen in feedstock applications.



See also his corrections in this thread

The key point is this: producing hydrogen from electrolysis is going to become much cheaper, and hydrogen fuel cells are going to get more efficient. That will mean hard-to-de-carbonise sectors like heavy-duty trucking, air transport and sea transport will be able to switch from fossil fuels to hydrogen. More efficient electrolysis will also make power-to-gas (producing green hydrogen to be used later to make electricity for the grid) much cheaper, allowing us to reach the holy grail of long-term storage via methane.

Audience

By Pants (Mr Joshua)



De-carbonisation via carbon capture is a mirage



From The Conversation.




Collectively we three authors of this article must have spent more than 80 years thinking about climate change. Why has it taken us so long to speak out about the obvious dangers of the concept of net zero? In our defence, the premise of net zero is deceptively simple – and we admit that it deceived us.

The threats of climate change are the direct result of there being too much carbon dioxide in the atmosphere. So it follows that we must stop emitting more and even remove some of it. This idea is central to the world’s current plan to avoid catastrophe. In fact, there are many suggestions as to how to actually do this, from mass tree planting, to high tech direct air capture devices that suck out carbon dioxide from the air.

The current consensus is that if we deploy these and other so-called “carbon dioxide removal” techniques at the same time as reducing our burning of fossil fuels, we can more rapidly halt global warming. Hopefully around the middle of this century we will achieve “net zero”. This is the point at which any residual emissions of greenhouse gases are balanced by technologies removing them from the atmosphere.

This is a great idea, in principle. Unfortunately, in practice it helps perpetuate a belief in technological salvation and diminishes the sense of urgency surrounding the need to curb emissions now.

We have arrived at the painful realisation that the idea of net zero has licensed a recklessly cavalier “burn now, pay later” approach which has seen carbon emissions continue to soar. It has also hastened the destruction of the natural world by increasing deforestation today, and greatly increases the risk of further devastation in the future.

 


To understand how this has happened, how humanity has gambled its civilisation on no more than promises of future solutions, we must return to the late 1980s, when climate change broke out onto the international stage.

On June 22 1988, James Hansen was the administrator of Nasa’s Goddard Institute for Space Studies, a prestigious appointment but someone largely unknown outside of academia.

By the afternoon of the 23rd he was well on the way to becoming the world’s most famous climate scientist. This was as a direct result of his testimony to the US congress, when he forensically presented the evidence that the Earth’s climate was warming and that humans were the primary cause: “The greenhouse effect has been detected, and it is changing our climate now.”

If we had acted on Hansen’s testimony at the time, we would have been able to decarbonise our societies at a rate of around 2% a year in order to give us about a two-in-three chance of limiting warming to no more than 1.5°C. It would have been a huge challenge, but the main task at that time would have been to simply stop the accelerating use of fossil fuels while fairly sharing out future emissions.

Four years later, there were glimmers of hope that this would be possible. During the 1992 Earth Summit in Rio, all nations agreed to stabilise concentrations of greenhouse gases to ensure that they did not produce dangerous interference with the climate. The 1997 Kyoto Summit attempted to start to put that goal into practice. But as the years passed, the initial task of keeping us safe became increasingly harder given the continual increase in fossil fuel use.

It was around that time that the first computer models linking greenhouse gas emissions to impacts on different sectors of the economy were developed. These hybrid climate-economic models are known as Integrated Assessment Models. They allowed modellers to link economic activity to the climate by, for example, exploring how changes in investments and technology could lead to changes in greenhouse gas emissions.

They seemed like a miracle: you could try out policies on a computer screen before implementing them, saving humanity costly experimentation. They rapidly emerged to become key guidance for climate policy. A primacy they maintain to this day.

Unfortunately, they also removed the need for deep critical thinking. Such models represent society as a web of idealised, emotionless buyers and sellers and thus ignore complex social and political realities, or even the impacts of climate change itself. Their implicit promise is that market-based approaches will always work. This meant that discussions about policies were limited to those most convenient to politicians: incremental changes to legislation and taxes.

Around the time they were first developed, efforts were being made to secure US action on the climate by allowing it to count carbon sinks of the country’s forests. The US argued that if it managed its forests well, it would be able to store a large amount of carbon in trees and soil which should be subtracted from its obligations to limit the burning of coal, oil and gas. In the end, the US largely got its way. Ironically, the concessions were all in vain, since the US senate never ratified the agreement.


Postulating a future with more trees could in effect offset the burning of coal, oil and gas now. As models could easily churn out numbers that saw atmospheric carbon dioxide go as low as one wanted, ever more sophisticated scenarios could be explored which reduced the perceived urgency to reduce fossil fuel use. By including carbon sinks in climate-economic models, a Pandora’s box had been opened.

It’s here we find the genesis of today’s net zero policies.

That said, most attention in the mid-1990s was focused on increasing energy efficiency and energy switching (such as the UK’s move from coal to gas) and the potential of nuclear energy to deliver large amounts of carbon-free electricity. The hope was that such innovations would quickly reverse increases in fossil fuel emissions.

But by around the turn of the new millennium it was clear that such hopes were unfounded. Given their core assumption of incremental change, it was becoming more and more difficult for economic-climate models to find viable pathways to avoid dangerous climate change. In response, the models began to include more and more examples of carbon capture and storage, a technology that could remove the carbon dioxide from coal-fired power stations and then store the captured carbon deep underground indefinitely.

This had been shown to be possible in principle: compressed carbon dioxide had been separated from fossil gas and then injected underground in a number of projects since the 1970s. These Enhanced Oil Recovery schemes were designed to force gases into oil wells in order to push oil towards drilling rigs and so allow more to be recovered – oil that would later be burnt, releasing even more carbon dioxide into the atmosphere.

Carbon capture and storage offered the twist that instead of using the carbon dioxide to extract more oil, the gas would instead be left underground and removed from the atmosphere. This promised breakthrough technology would allow climate friendly coal and so the continued use of this fossil fuel. But long before the world would witness any such schemes, the hypothetical process had been included in climate-economic models. In the end, the mere prospect of carbon capture and storage gave policy makers a way out of making the much needed cuts to greenhouse gas emissions.

When the international climate change community convened in Copenhagen in 2009 it was clear that carbon capture and storage was not going to be sufficient for two reasons.

First, it still did not exist. There were no carbon capture and storage facilities in operation on any coal fired power station and no prospect the technology was going to have any impact on rising emissions from increased coal use in the foreseeable future.

The biggest barrier to implementation was essentially cost. The motivation to burn vast amounts of coal is to generate relatively cheap electricity. Retrofitting carbon scrubbers on existing power stations, building the infrastructure to pipe captured carbon, and developing suitable geological storage sites required huge sums of money. Consequently the only application of carbon capture in actual operation then – and now – is to use the trapped gas in enhanced oil recovery schemes. Beyond a single demonstrator, there has never been any capture of carbon dioxide from a coal fired power station chimney with that captured carbon then being stored underground.

Just as important, by 2009 it was becoming increasingly clear that it would not be possible to make even the gradual reductions that policy makers demanded. That was the case even if carbon capture and storage was up and running. The amount of carbon dioxide that was being pumped into the air each year meant humanity was rapidly running out of time.


 

With hopes for a solution to the climate crisis fading again, another magic bullet was required. A technology was needed not only to slow down the increasing concentrations of carbon dioxide in the atmosphere, but actually reverse it. In response, the climate-economic modelling community – already able to include plant-based carbon sinks and geological carbon storage in their models – increasingly adopted the “solution” of combining the two.

So it was that Bioenergy Carbon Capture and Storage, or BECCS, rapidly emerged as the new saviour technology. By burning “replaceable” biomass such as wood, crops, and agricultural waste instead of coal in power stations, and then capturing the carbon dioxide from the power station chimney and storing it underground, BECCS could produce electricity at the same time as removing carbon dioxide from the atmosphere. That’s because as biomass such as trees grow, they suck in carbon dioxide from the atmosphere. By planting trees and other bioenergy crops and storing carbon dioxide released when they are burnt, more carbon could be removed from the atmosphere.

With this new solution in hand the international community regrouped from repeated failures to mount another attempt at reining in our dangerous interference with the climate. The scene was set for the crucial 2015 climate conference in Paris.

As its general secretary brought the 21st United Nations conference on climate change to an end, a great roar issued from the crowd. People leaped to their feet, strangers embraced, tears welled up in eyes bloodshot from lack of sleep.

The emotions on display on December 13, 2015 were not just for the cameras. After weeks of gruelling high-level negotiations in Paris a breakthrough had finally been achieved. Against all expectations, after decades of false starts and failures, the international community had finally agreed to do what it took to limit global warming to well below 2°C, preferably to 1.5°C, compared to pre-industrial levels.

The Paris Agreement was a stunning victory for those most at risk from climate change. Rich industrialised nations will be increasingly impacted as global temperatures rise. But it’s the low lying island states such as the Maldives and the Marshall Islands that are at imminent existential risk. As a later UN special report made clear, if the Paris Agreement was unable to limit global warming to 1.5°C, the number of lives lost to more intense storms, fires, heatwaves, famines and floods would significantly increase.

But dig a little deeper and you could find another emotion lurking within delegates on December 13. Doubt. We struggle to name any climate scientist who at that time thought the Paris Agreement was feasible. We have since been told by some scientists that the Paris Agreement was “of course important for climate justice but unworkable” and “a complete shock, no one thought limiting to 1.5°C was possible”. Rather than being able to limit warming to 1.5°C, a senior academic involved in the IPCC concluded we were heading beyond 3°C by the end of this century.

Instead of confronting our doubts, we scientists decided to construct ever more elaborate fantasy worlds in which we would be safe. The price to pay for our cowardice: having to keep our mouths shut about the ever-growing absurdity of the required planetary-scale carbon dioxide removal.

Taking centre stage was BECCS because at the time this was the only way climate-economic models could find scenarios that would be consistent with the Paris Agreement. Rather than stabilise, global emissions of carbon dioxide had increased some 60% since 1992.

Alas, BECCS, just like all the previous solutions, was too good to be true.

Across the scenarios produced by the Intergovernmental Panel on Climate Change (IPCC) with a 66% or better chance of limiting temperature increase to 1.5°C, BECCS would need to remove 12 billion tonnes of carbon dioxide each year. BECCS at this scale would require massive planting schemes for trees and bioenergy crops.

The Earth certainly needs more trees. Humanity has cut down some three trillion since we first started farming some 13,000 years ago. But rather than allow ecosystems to recover from human impacts and forests to regrow, BECCS generally refers to dedicated industrial-scale plantations regularly harvested for bioenergy rather than carbon stored away in forest trunks, roots and soils.

Currently, the two most efficient biofuels are sugarcane for bioethanol and palm oil for biodiesel – both grown in the tropics. Endless rows of such fast growing monoculture trees or other bioenergy crops harvested at frequent intervals devastate biodiversity.

It has been estimated that BECCS would demand between 0.4 and 1.2 billion hectares of land. That’s 25% to 80% of all the land currently under cultivation. How will that be achieved at the same time as feeding 8-10 billion people around the middle of the century or without destroying native vegetation and biodiversity?
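Those two figures imply a removal rate per hectare that puts the scale in perspective. A quick sketch using only the numbers quoted above:

```python
# Implied CO2 removal per hectare if BECCS must capture 12 billion tonnes
# per year on 0.4-1.2 billion hectares (the ranges quoted in the article).

REMOVAL_T_PER_YEAR = 12e9        # tonnes of CO2 per year
LAND_HA = (0.4e9, 1.2e9)         # hectares, low and high land estimates

rates = [REMOVAL_T_PER_YEAR / ha for ha in LAND_HA]
print(f"{rates[1]:.0f}-{rates[0]:.0f} t CO2 per hectare per year")
```

Every hectare of plantation would have to sequester on the order of 10 to 30 tonnes of CO2 per year, indefinitely, which gives a sense of why the land and water demands are so severe.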


Growing billions of trees would consume vast amounts of water – in some places where people are already thirsty. Increasing forest cover in higher latitudes can have an overall warming effect because replacing grassland or fields with forests means the land surface becomes darker. This darker land absorbs more energy from the Sun and so temperatures rise. Focusing on developing vast plantations in poorer tropical nations comes with real risks of people being driven off their lands.

And it is often forgotten that trees and the land in general already soak up and store away vast amounts of carbon through what is called the natural terrestrial carbon sink. Interfering with it could both disrupt the sink and lead to double accounting.

As these impacts are becoming better understood, the sense of optimism around BECCS has diminished.

Given the dawning realisation of how difficult Paris would be in the light of ever rising emissions and limited potential of BECCS, a new buzzword emerged in policy circles: the “overshoot scenario”. Temperatures would be allowed to go beyond 1.5°C in the near term, but then be brought down with a range of carbon dioxide removal by the end of the century. This means that net zero actually means carbon negative. Within a few decades, we will need to transform our civilisation from one that currently pumps out 40 billion tons of carbon dioxide into the atmosphere each year, to one that produces a net removal of tens of billions.


[The article continues, here]

There is only one plausible way to cut CO2 and methane emissions.  And that's to actually cut them.  Offsets won't work.  Negative emissions won't work.  BECCS won't work.  It's no wonder emissions continue to rise.  We are heading towards a 3°C rise, not 1.5°C.  And that will be catastrophic for our civilisation, the world's people and the environment.