

The energy revolution is in reverse

April 18, 2014 in Climate, Emissions reductions, Energy, Fossil fuels, Greenhouse Gases, IPCC, Mitigation, Nuclear power, Policy, Shale Gas, Subsidies, Warming



Not-so-calm waters ahead: The IPCC urges a move away from business as usual
Image: Walter Siegmund via Wikimedia Commons

By Henner Weithöner

The UN climate panel’s prescription for tackling climate change is admirably clear. The problem is that the world is heading in precisely the opposite direction.

BERLIN, 18 April – Keeping the rise in global average temperatures to no more than 2°C above pre-industrial levels will not be prohibitively expensive, the Intergovernmental Panel on Climate Change (IPCC) says, though it won’t be easy.

There’s just one problem: the atmospheric facts show that the world is not simply ignoring the IPCC. It’s moving smartly away from the clean energy future that the Panel says is attainable towards an inexorably hotter and more risky future.

Reaching the target will mean cutting greenhouse gas emissions by 40-70% over 2010 levels by mid-century, the IPCC report says. Yet what is happening at the moment is the exact opposite: average global emissions rose by a billion tonnes a year between 2000 and 2010, faster than ever before.

To avoid the worst impacts of climate change as cheaply as possible, the report urges an energy revolution to end the dominance of fossil fuels. The IPCC says investments in renewable energy need to triple, with subsidies to fossil fuels declining and a switch to natural gas helping countries to move away from coal.

The path to lower emissions may cost the energy giants dear, the IPCC acknowledges. “Mitigation policy could devalue fossil fuel assets and reduce revenues for fossil fuel exporters,” Professor Ottmar Edenhofer, co-chair of the IPCC’s Working Group III, which produced the report, told a public meeting here. “To avoid dangerous interference with the climate system, we need to move away from business as usual.”

‘Negligible’ cost

Another controversial point is the report’s inclusion of nuclear power as a low-carbon option (it acknowledges that it has declined globally since 1993 and faces safety, financial and waste-management concerns). The report also advocates carbon capture and storage (CCS), noting that it remains untested on a large scale.

But the IPCC insists that diverting hundreds of billions of dollars from fossil fuels into renewable energy and cutting energy waste would shave just 0.06% off expected annual economic growth rates of 1.3%-3%. “Statistically you won’t notice,” said Dr Reyer Gerlagh, a co-ordinating lead author on the economics chapter of the report.
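That 0.06% figure is easier to weigh with a back-of-the-envelope compounding sketch. The 2% baseline growth rate and the 2014-2100 horizon below are illustrative assumptions, not figures from the report:

```python
# What shaving 0.06 percentage points off annual growth means over
# the century. The 2% baseline rate and the 2014-2100 horizon are
# illustrative assumptions, not IPCC figures.
years = 86                 # 2014 to 2100
baseline = 1.02            # assumed 2% annual growth
mitigated = 1.02 - 0.0006  # 0.06 percentage points lower

gdp_baseline = baseline ** years
gdp_mitigated = mitigated ** years

print(f"baseline GDP multiple by 2100:  {gdp_baseline:.2f}x")
print(f"mitigated GDP multiple by 2100: {gdp_mitigated:.2f}x")
print(f"relative shortfall: {1 - gdp_mitigated / gdp_baseline:.1%}")
```

Even compounded over 86 years, the shortfall amounts to only a few per cent of a global economy several times its present size.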

Li Shuo of Greenpeace China said: “Science has spoken: climate action is no burden, it’s an opportunity. As renewable energies are growing bigger, better and cheaper every day, the age of dangerous and polluting coal, oil and gas is over. The only rational response to this report is to start the phase-out of fossil fuels immediately.”

Wrong direction

Global temperatures have risen about 0.8°C since record-keeping started in 1850. Current pledges by governments to reduce emissions by 2020 have set the world on a path to between 3 and 5°C of warming by 2100, the IPCC says.

The Working Group III contribution to the IPCC’s Fifth Assessment Report (AR5) is intended to provide a comprehensive assessment of the options for mitigating climate change through limiting or preventing greenhouse gas emissions. It may have shown that those options exist and are affordable. But that is very far from showing that governments can be persuaded to use them. – Climate News Network

Henner Weithöner is a freelance journalist in Berlin specialising in renewable energy and climate change.

Climate costs ‘may prove much higher’

April 16, 2014 in Atlantic Multidecadal Oscillation, Economy, Greenhouse Gases, Methane, Permafrost, Warming slowdown



A meat store built in the melting permafrost of Herschel Island in the Arctic Ocean
Image: Ansgar Walk via Wikimedia Commons

By Tim Radford

There may be a higher price for our descendants to pay for the greenhouse gas build-up, researchers say, as the real costs are updated.

LONDON, 16 April – Economists and scientists may have seriously underestimated the “social cost” of carbon emissions to future generations, according to a warning in Nature.

Social cost is a calculation in US dollars of the future damage that might be done by the emission of one tonne of carbon dioxide as greenhouse gas levels soar and climates change, sea levels rise and temperature records are broken in future decades.

How much would society save if it didn’t emit that tonne of CO2? One recent US estimate is $37. Such a measure helps civil servants, businessmen and ministers to calculate the impact of steps that might be taken.

On the other hand, say Richard Revesz of New York University School of Law and US and Swedish colleagues, assumptions of cost per tonne – and these range from $12 to $64 according to various calculations – are based on models that need to be improved and extended. The cost of climate change could be higher, for four reasons.

Flawed assumptions

First, the impact of historic temperature variation suggests that societies and economies may be more vulnerable than the models predict – and here weather variability matters more than average weather, because crop yields are vulnerable to extremes of temperature.

Second, the models omit the damage to productivity, and to the value of capital stock, caused by lower growth rates: as these losses compound over time, human welfare will begin to decline. And that is without factoring in climate-induced wars, coups or societal collapse.

Third, the models assume that the value people attach to ecosystems (and water is an ecosystem service) remains constant. But, they point out, as commodities become scarce, value increases, so the costs of ecosystem damage will rise faster than models predict.

Finally, the models assume that a constant discount rate can translate future harms into today’s dollars. But discount rates of the future may not be constant.
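The discount-rate point is the easiest to see numerically. In the sketch below, the $1 trillion damage figure and the three rates are hypothetical, chosen only for illustration, not taken from the Nature comment:

```python
# Why the discount-rate assumption matters so much: the same future
# damage shrinks dramatically in present-value terms as the rate rises.
# The damage figure is hypothetical.
damage_future = 1_000_000_000_000  # $1 trillion of damage, 100 years out

# Present value of that damage under three constant discount rates
pv = {rate: damage_future / (1 + rate) ** 100 for rate in (0.01, 0.03, 0.05)}

for rate, value in pv.items():
    print(f"discount rate {rate:.0%}: present value ≈ ${value:,.0f}")
```

Moving from a 1% to a 5% rate cuts the present value of the same harm by a factor of nearly fifty – which is why a rate assumed constant for a century carries so much weight.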

More warming

“What now?” they ask. “Modellers, scientists and environmental economists must continue to step outside their silos and work together to identify research gaps and modelling limitations.”

They hint at an even deeper problem: the basis of the social harm costs dates from calculations more than 20 years old, and is predicated on an average global warming of less than 3°C. Yet without mitigation, the Intergovernmental Panel on Climate Change projects a warming of 4°C by the end of the century.

“If warming continues unchecked into the twenty-second century, it could render parts of the planet effectively uninhabitable during the hottest days of summer, with consequences that would be challenging to monetize,” they write.

Economic harm may not be the only thing underestimated. Michael Mann, a meteorologist at Penn State University in the US, reports in Geophysical Research Letters that the so-called “slowdown” in global warming during this decade could be due to an underestimate of the impact of a meteorological monster called the Atlantic Multidecadal Oscillation (AMO), an oceanographic cycle of warming and cooling that drives natural change in northern hemisphere weather patterns.

More methane

A misreading of this cycle – probably because scientists have not known about it for long – could account for this apparent slowdown. “Some researchers in the past attributed a portion of Northern Hemispheric warming to a warm phase of the AMO,” said Professor Mann.

“The true AMO signal, instead, appears likely to have been in a cooling phase in recent decades, offsetting some of the anthropogenic warming temporarily.” And when the rate of warming rises again, there’s yet more alarming evidence of possible acceleration, according to new research.

The thawing of the Arctic sea ice is also accompanied by a softening and warming of the Arctic permafrost, and changes in the chemistry of the preserved peat, that could release ever larger amounts of methane. Methane is a greenhouse gas, present in smaller quantities than carbon dioxide, but 34 times more potent as a warming agent over 100 years.

If the permafrost melts entirely, that would put five times the present levels of carbon into the atmosphere, US researchers report in the journal Proceedings of the National Academy of Sciences.

“The world is getting warmer, and the additional release of gas would only add to our problems,” said Jeff Chanton of Florida State University, a co-author. – Climate News Network

IPCC tries a gamble with shale gas

April 14, 2014 in Adaptation, Coal, Energy, Fracking, Greenhouse Gases, IPCC, Methane, Nuclear power, Renewables, Shale Gas, Solar energy, Wind power



Non merci: A French protest against drilling for shale gas
Image: Camster via Wikimedia Commons

By Alex Kirby

The latest IPCC report urges a dash for gas to allow us to reduce the burning of coal. And it accepts the use of shale gas, which threatens to be far more polluting than originally thought.

LONDON, 14 April – If you support fracking, you should be pleased with the latest report from the Intergovernmental Panel on Climate Change (the IPCC). It’s given the green light to the use of shale gas as a short-term way to slow climate change.

The report is the third and final part of the latest IPCC assessment on climate change (known as AR5). While it puts considerable emphasis on the need for more renewable energy – including solar, wind and hydropower – it says emissions of greenhouse gases can be cut in the medium term by replacing coal with less-polluting gas, though the gas will itself ultimately have to be phased out.

On shale gas, obtained by the controversial fracking process, Ottmar Edenhofer – co-chair of the working group that produced the report – said it was quite clear that the fuel “can be very consistent with low carbon development and decarbonisation”.

Among the objections to fracking is the fact that the process releases quantities of methane, a greenhouse gas often reckoned to be at least 20 times more powerful than carbon dioxide at warming the atmosphere. That is the comparison we have often used in the Network’s reporting. It’s right, so far as it goes. But by some calculations it doesn’t go nearly far enough.

Own goal

Recently an observant reader pointed out that methane is 20 times more potent than CO2 when its impact is measured over a century. But in the short term it is a far greater problem. Over the space of two decades it is estimated to be at least 84 times more damaging than carbon dioxide.
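The difference the time horizon makes can be seen by converting a leak into CO2-equivalent terms using the two multipliers quoted above; the 1,000-tonne leak itself is invented for illustration:

```python
# CO2-equivalence of the same methane leak under the two time horizons
# quoted above; the leak size is a made-up illustration.
gwp = {"100-year": 20, "20-year": 84}  # warming potential relative to CO2

methane_tonnes = 1_000  # hypothetical annual leakage from one gas field
co2e = {horizon: methane_tonnes * factor for horizon, factor in gwp.items()}

for horizon, tonnes in co2e.items():
    print(f"{horizon} horizon: {tonnes:,} tonnes CO2-equivalent")
```

The same leak counts as 20,000 tonnes of CO2-equivalent on the century view, but 84,000 tonnes over the two decades that matter most for near-term warming.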

Robert Howarth is professor of ecology and environmental biology at Cornell University. He and his colleague Drew Shindell of the NASA Goddard Institute for Space Studies have predicted that unless emissions of methane (and black carbon) are reduced immediately, the Earth will warm by 1.5°C by 2030 and by 2.0°C by between 2045 and 2050, whether or not carbon dioxide emissions are reduced.

Professor Howarth puts the global warming potential of methane higher still. He has written: “At the time scale of 20 years following emission, methane’s global warming potential is more than 100-fold greater than for carbon dioxide (Shindell et al. 2009).”

Some critics will conclude that the IPCC’s search for a bridging strategy to move us rapidly to a world of clean energy has scored an own goal by failing to rule out a fuel which entails a large and avoidable increase in greenhouse emissions. The cost of the infrastructure needed to exploit shale gas on a large scale may also work to prolong its use.

Affordable transformation

Ironically, the clean energy world the IPCC seeks need be no more than 15 years away, according to one US expert. Mark Z Jacobson is professor of civil and environmental engineering at Stanford University, California, and director of its atmosphere and energy program. He believes that wind, water and solar power can be scaled up cost-effectively to meet the world’s energy demands, ending dependence on both fossil fuels and nuclear power.

Professor Jacobson described in Energy Policy in 2010 how he and a colleague had analysed “the feasibility of providing worldwide energy for all purposes (electric power, transportation, heating/cooling, etc.) from wind, water, and sunlight (WWS)”.

He continued: “We suggest producing all new energy with WWS by 2030 and replacing the pre-existing energy by 2050. Barriers to the plan are primarily social and political, not technological or economic. The energy cost in a WWS world should be similar to that today.”

It sounds like a less risky path to a world of clean energy than the IPCC is urging. Fifteen years to build a different way of fuelling society, or 20 years of watching spiralling methane emissions, seems a no-brainer. – Climate News Network

Enough uranium, but nuclear power is still shrinking

April 11, 2014 in Energy, Nuclear power, Resource shortages



Highly enriched uranium: The growing difficulty of extracting high-quality ore is increasing greenhouse gas emissions
Image: Via Wikimedia Commons

By Paul Brown

Many people believe nuclear power could save the planet from climate change. But several factors mean the industry is dying, a new analysis suggests.

LONDON, 11 April – There is enough uranium available on the planet to keep the world’s nuclear industry going for as long as it is needed. But it will grow steadily more expensive to extract, because the quality of the ore is getting poorer, according to new research.

Years of work compiling information from around the world have led Gavin M. Mudd, of Monash University in Clayton, Australia, to believe that economic and political constraints will kill off nuclear power, not any shortage of uranium, as some have claimed.

Writing in the journal Environmental Science & Technology, he argues that renewables do not have the disadvantages of nuclear power, which needs large uranium mines that are hard to rehabilitate and which generates waste that remains dangerous for more than 100,000 years.

In addition, research shows that renewable technologies are expanding very fast and could produce all the energy needs of advanced economies, phasing out both fossil fuels and nuclear.

Mudd, who is a lecturer in the department of civil engineering at Monash, has compiled decades of data on the availability and quality of uranium ore. He concludes that, while uranium is plentiful, mining the ore is very damaging to the environment and the landscape.

It is expensive to rehabilitate former mines, not least because of the dangerous levels of radiation left behind. As a result many of the potential sources of uranium will not be exploited because of opposition from people who live in the area.

‘Too cheap to meter’

His paper examines the history of uranium mining and its wild fluctuations in price. These have little to do with supply, but rather with demand that is badly affected by nuclear accidents like Chernobyl and Fukushima, and by the political decisions by governments to embark on new nuclear building programmes, or to abandon them.

“Despite the utopian promise of electricity ‘too cheap to meter’, nuclear power remains a minor source of electricity worldwide”, Mudd writes. In 2010 it accounted for 5.65% of total primary energy supply and was responsible for 12.87% of global electricity supply. Both contributions have effectively been declining through the 2000s.

“Concerns about hazards and unfavourable economics have effectively slowed or stopped the growth of nuclear energy in many Western countries since the 1980s.”

The Fukushima accident in Japan has accelerated the trend away from nuclear power. The growth in projects in some countries, notably China, Russia and India, does not offset the fact that many more nuclear power stations will reach retirement age over the next 15-20 years than will be constructed.

Among the factors Mudd considered in the fluctuation of supply was the conversion of Russian and American nuclear weapons into power station fuel, which has supplied 50% of American needs since the mid-1990s and 20% of global uranium supply. This has not materially affected the long-term supply of uranium.

Mining blighted

Another issue that is more politically contentious is the high cost of rehabilitating mines, notably in Germany and the US. In many of the countries where uranium has been mined and no rehabilitation attempted, the prospect of further mining is blighted. Mudd gives the examples of Niger, Gabon, Argentina and Brazil, where there has been considerable public opposition to opening up fresh deposits as a result.

If these resources and other uranium deposits elsewhere in the world are to be exploited, Mudd argues, the issue of rehabilitating existing and future mines needs to be addressed.

“There is a critical need for a thorough and comprehensive review of the success (or otherwise) of global U mine rehabilitation efforts and programmes; such a review could help synthesise best practices and highlight common problems and possible solutions,” he says.

The paper also examines in detail the quality of the ore and the difficulty of extracting uranium from various rocks. Mudd concludes that as time passes the richer ores in the rocks that are easiest to extract are becoming scarce.

As a result, for each pound of uranium extracted more greenhouse gases are generated, adding to the CO2 emissions of nuclear power. However, he believes, in the overall comparisons of various energy systems the increase is only marginal.

“The future of nuclear power clearly remains contested and contentious — and therefore difficult to forecast accurately. While some optimists remain eternally hopeful, reality appears to be relegating nuclear power to the uneconomic category of history.

“Overall, there is a strong case for the abundance of already known U resources, whether currently reported as formal mineral resources or even more speculative U sources, to meet the foreseeable future of nuclear power. The actual U supply into the market is, effectively, more an economic and political issue than a resource constraint issue,” Mudd says. – Climate News Network

Camels are frugal methane emitters

April 10, 2014 in Greenhouse Gases, Livestock, Methane



Camels do not eat enough to emit methane that could challenge the levels reached by cattle and sheep
Image: By Keven Law via Wikimedia Commons

By Alex Kirby

Some domesticated animals add appreciably to the methane which is warming the atmosphere. But despite earlier assumptions, camels are not among them.

LONDON, 10 April – Never accuse science of neglecting the smallest and apparently least significant detail in its efforts to understand fully how the Earth and all that’s in it keeps going.

One of the latest arcane revelations comes from scientists in Switzerland, who describe in the Public Library of Science journal PLOS One why we should not heap blame on camels for adding to the methane already in the atmosphere.

Camels – both dromedaries and Bactrians – and their camelid relatives, the llamas, guanacos, alpacas and vicuñas, do produce methane, which is more than 20 times as potent a greenhouse gas as carbon dioxide. But they produce significantly less of it than ruminants like cattle, sheep and goats.

When they are digesting their food, ruminants exhale large quantities of methane, around 20% of global methane emissions. So far the assumption has been that camels, with their similar digestive systems, produce the same amount of the gas.

Smaller appetites

But now researchers at the University of Zurich and ETH Zurich have shown that camels release less methane than ruminants.

Ruminants and camelids are similar but not identical: both groups have stomachs with several chambers, enabling them to regurgitate food from one chamber and break it down further by renewed chewing. That is why people had assumed until now that camelids and ruminants produce similar amounts of methane. But the researchers have concluded that, in absolute terms, camels release less methane than cows and sheep of comparable body size.

Admittedly, it is slightly more complicated than that: if you compare methane production with the amount of what the team calls “converted feed”, then methane releases are the same in both groups. But the amount of converted feed is what matters.

The research may be less esoteric than it at first appears. Working with Zurich zoo and private camel keepers, the researchers measured methane production in three types of camelids. They found that all three had a lower metabolism than ruminants – because they eat less.

Not enough meat

One of the report’s authors, Dr Marcus Clauss, a veterinary surgeon from the Vetsuisse Faculty of the University of Zurich, told the Network: “For each unit of digested food, ruminants and camelids produce the same amount of methane.

“But camels generally have a lower metabolism and hence eat less than domestic ruminants. So the total amount of digested fibre per day is lower in camelids, hence the total amount of methane produced is also lower.”

The authors say the camelids’ lower metabolism may be important for countries with lots of camels, like the dromedaries of the Middle East and Australia, or the alpacas and llamas of South America. But they do not advocate a switch from beef and lamb to camel meat.

Dr Clauss says: “Personally, I do not think this has relevance to agricultural systems, because there are many other things to consider. For example, I am sure you could not produce the same amount of meat in the same time from a camel as you can from a steer.” – Climate News Network

Wooden skyscrapers help cool climate

April 4, 2014 in Adaptation, Built Environment, Energy, Forests, Greenhouse Gases, Technology



Builders have used wood for millennia: Now the technology is reaching for the skies
Image: Chris Reynolds via Wikimedia Commons

By Tim Radford

Wooden skyscrapers could tick a number of important boxes, including making a serious contribution to cutting climate impacts. The good news is they’re already helping to do that.

LONDON, 4 April – US scientists have a new green solution to urban construction: chop down trees and use the wood for buildings. Good strong timber buildings – and there are plans for 30-storey skyscrapers built of wood – would save on concrete and steel, save on carbon dioxide emissions and cut the use of fossil fuel.

The argument may seem counter-intuitive: that is because a substantial component of climate change stems from changes in land use and the loss of forests. And some researchers have demonstrated that even the most mature trees, the forest giants, can go on absorbing carbon dioxide from the atmosphere.

But Chadwick Oliver, a forester at Yale University, and his colleagues make the case in the Journal of Sustainable Forestry. They argue that if the world stepped up the harvest of the forests and used the wood efficiently then economies could save on fossil fuel, reduce carbon dioxide emissions and give people a reason to value the forests.

It works like this. Overall, trees add 17 billion cubic metres of new wood to the planet’s biomass each year. At present, humans take about 20% of this new growth – 3.4 billion cubic metres – and much of that is burned inefficiently as cooking fuel, or simply burned off.

Savings outweigh emissions

If humans stepped up the wood harvest to 34% and used it for construction, they could reduce the use of steel and concrete, and cut between 14% and 31% of global carbon dioxide emissions (the authors count methane and nitrous oxide emissions as carbon equivalents in this calculation).

And of course, carbon would stay locked up in the wood of permanent structures. This would also save between 12% and 19% of annual global fossil fuel consumption, since the wood left over from construction could be turned into energy.
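The harvest arithmetic in the preceding paragraphs can be checked directly; the 17 billion cubic metre annual growth figure is the study's, and the rest follows from it:

```python
# Checking the harvest arithmetic above. The 17 billion cubic metre
# annual growth figure is the study's; the rest follows from it.
new_growth = 17.0      # billion cubic metres of new wood per year
current_share = 0.20   # share of new growth harvested today
proposed_share = 0.34  # share the authors propose harvesting

current_harvest = new_growth * current_share    # the 3.4 bn m3 cited above
proposed_harvest = new_growth * proposed_share
extra_for_building = proposed_harvest - current_harvest

print(f"current harvest:  {current_harvest:.2f} bn m3/yr")
print(f"proposed harvest: {proposed_harvest:.2f} bn m3/yr")
print(f"extra available:  {extra_for_building:.2f} bn m3/yr")
```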

The savings on concrete and steel happen because about 16% of global fossil fuel consumption is accounted for by the manufacture of steel, concrete and brick. Factor in the need to transport building materials and that brings the fossil fuel share to between 20% and 30%. So wood-based construction consumes less energy.

The loss of forests represents the release of carbon dioxide, but as long as the harvesting is efficient, more carbon emissions are saved overall.

Better than agriculture

But, of course, this makes forests valuable. “The study shows still another reason to appreciate forests,” says Professor Oliver, “and another reason not to let them be cleared for agriculture.

“Forest harvest creates a temporary opening that is needed for forest species such as butterflies and some birds and deer before it regrows to large trees. But conversion to agriculture is a permanent loss of all forest biodiversity.”

So suddenly, in every sense, wood is cool. Wooden skyscrapers and apartment buildings are already being designed and tested in Sweden and in Canada. Selective harvesting of forests could help protect stands of timber against the spread of wildfire, benefit wildlife and maintain wealth.

“Forests historically have had a diversity of habitats that different species need,” says Professor Oliver. “This diversity can be maintained by harvesting some of the forest growth. And the harvested wood will save fossil fuel and CO2 and provide jobs — giving local people more reason to keep the forests.” – Climate News Network

Climate change ‘makes violence likelier’

March 31, 2014 in Adaptation, Climate risk, Conflict, El Niño, IPCC, Warming



A UN peacekeeper chats to local youths in Darfur: Many governments now regard climate change as a security issue
Image: © UN Photo/Albert Gonzalez Farran

By Alex Kirby

Scientists say there is a direct link between changing climate and an increase in violence, reinforcing a key finding of the latest IPCC report.

LONDON, 31 March – US scientists say there is evidence that a warming climate is closely related to political and social instability and a higher risk of conflict.

Professor Solomon Hsiang and colleagues  described in the journal Nature in 2011 how they had investigated whether anything linked “planetary-scale climate changes with global patterns of civil conflict”.

They examined evidence of a possible link between outbreaks of unrest and El Niño – the periodic weather disruption off the Pacific coast of South America which affects the weather and causes higher temperatures across much of the world – and its partner, the cooler La Niña phenomenon.

After analysing data from 1950 to 2004, they found that “the probability of new civil conflicts arising throughout the tropics doubles during El Niño years relative to La Niña years.”

They wrote: “This result, which indicates that ENSO may have had a role in 21% of all civil conflicts since 1950, is the first demonstration that the stability of modern societies relates strongly to the global climate” (ENSO, the El Niño/Southern Oscillation, is the scientific term for the cycle of alternating warmer and cooler years).

“Climate change can indirectly increase risks of violent conflicts in the form of civil war and inter-group violence”

The work of Professor Hsiang and his colleagues predates one of the key conclusions of the latest report from the Intergovernmental Panel on Climate Change, entitled Climate Change 2014: Impacts, Adaptation, and Vulnerability, from the IPCC’s Working Group II.

This details the impacts of climate change so far, the future risks from a changing climate, and the opportunities for effective action to reduce the risks.

The report says: “Climate change can indirectly increase risks of violent conflicts in the form of civil war and inter-group violence.” It does not however argue that there is a direct link between climate change and conflict.

Professor Hsiang’s study is cited in a report by a London-based group, the Environmental Justice Foundation, which works to protect the environment and to defend human rights. Its report, The Gathering Storm: Climate Change, Security and Conflict, says the world’s major military powers increasingly regard climate change as a significant threat.

The EJF says: “In 2012, one person every second was displaced by a climate or weather-related natural disaster.

“With millions of people forced to move each year by rapid-onset climate-related hazards and slow-onset environmental degradation, social wellbeing, human rights, economies and even state stability are at risk…at the highest level, climate change is being assessed as a risk to national security and potentially to global stability.”

It identifies several points of concern, including the shrinking of Arctic ice; competition over water resources in Central Asia; sea-level rises and small island developing states; and climate change-induced migration in the Sahel region of Africa.

“We find strong causal evidence linking climatic events to human conflict… across all major regions of the world”

The EJF report says that while climate change may not be the sole cause of conflict in future, it will play an increasingly significant role as “a threat multiplier”.

It cites a 2013 study by Professor Hsiang and others published in Science, an analysis of data drawn from archaeology, criminology, economics, geography, history, political science, and psychology.

The authors write: “We find strong causal evidence linking climatic events to human conflict across a range of spatial and temporal scales and across all major regions of the world.”

They say every 1°C rise in temperature has been estimated to cause a 14% increase in intergroup conflict and a 4% increase in interpersonal violence.

With the possibility of global average temperatures rising by 2-4°C this century, they conclude: “Amplified rates of human conflict could represent a large and critical impact of anthropogenic climate change.”
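Taken at face value, those per-degree estimates scale up starkly across the projected temperature range. The linear extrapolation below is our illustration, not the authors' own projection:

```python
# A crude linear extrapolation of the per-degree estimates above to the
# 2-4 degC range mentioned; this scaling is an illustration, not the
# authors' own projection.
per_degree_intergroup = 0.14     # +14% intergroup conflict per 1 degC
per_degree_interpersonal = 0.04  # +4% interpersonal violence per 1 degC

for warming in (2.0, 4.0):
    intergroup = per_degree_intergroup * warming
    interpersonal = per_degree_interpersonal * warming
    print(f"+{warming:.0f} degC: intergroup conflict +{intergroup:.0%}, "
          f"interpersonal violence +{interpersonal:.0%}")
```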

EJF is campaigning for the recognition of climate change as not simply an environmental problem, but as a human rights issue as well. It wants the United Nations Human Rights Council (UNHRC) to establish a special rapporteur on human rights and climate change. – Climate News Network

Human activities ‘caused record Oz heat’

March 24, 2014 in Australia, El Niño, Extreme weather, Forecasting, Heatwave, Ocean Warming, World Meteorological Organization



Sanctuary for some: the summer of 2013 was a grim time for both humans and wildlife
Image: By Австралиец via Wikimedia Commons

By Alex Kirby

Australia’s 2013 summer was the hottest on record only because of human influences on the climate, meteorologists say. They report that people’s activities made a record about five times more likely.

LONDON, 24 March – Australian researchers are in no doubt about what happened there last year. The country’s Bureau of Meteorology is a model of clarity: “2013 was Australia’s warmest year on record. Persistent and widespread warmth throughout the year led to record-breaking temperatures and several severe bushfires. Nationally-averaged rainfall was slightly below average.”

Now two Australian scientists say it is virtually certain that no records would have been broken had it not been for the influence on the climate of humans. They even put a figure on it: people, they say, raised the stakes about five times.

The World Meteorological Organization devotes a section in its report, WMO statement on the status of the global climate in 2013, to the scientists’ peer-reviewed case study, undertaken by a team at the ARC Centre of Excellence for Climate System Science at the University of Melbourne. It was adapted from an article originally published in the journal Geophysical Research Letters.

The study used nine global climate models to investigate whether changes in the probability of extreme Australian summer temperatures were due to human influences.

More frequent extremes ahead

It concluded: “Comparing climate model simulations with and without human factors shows that the record hot Australian summer of 2012/13 was about five times as likely as a result of human-induced influence on climate, and that the record hot calendar year of 2013 would have been virtually impossible without human contributions of heat-trapping gases, illustrating that some extreme events are becoming much more likely due to climate change.”

The report also strikes a warning note: “These types of extreme Australian summers become even more frequent in simulations of the future under further global warming.”

It says last year was notable as well because it was marked by what scientists call “neutral to weak La Niña ENSO conditions”, which would normally be expected to produce cooler temperatures across Australia, not hotter. El Niño is characterized by unusually warm temperatures and La Niña by unusually cool ones in the equatorial Pacific.

Before 2013, six of the eight hottest Australian summers occurred during El Niño years. The WMO says natural ENSO variations are very unlikely to explain the record 2013 Australian heat.

“There is no standstill in global warming…The laws of physics are non-negotiable”

Introducing the report, the WMO secretary-general, Michel Jarraud, said many of the extreme events of 2013 were consistent with what we would expect as a result of human-induced climate change. And he repeated his insistence that claims of a pause in climate change were mistaken.

“There is no standstill in global warming. The warming of our oceans has accelerated, and at lower depths. More than 90% of the excess energy trapped by greenhouse gases is stored in the oceans.

“Levels of these greenhouse gases are at record levels, meaning that our atmosphere and oceans will continue to warm for centuries to come. The laws of physics are non-negotiable.”

The report says 13 of the 14 warmest years on record have all occurred during this century, and each of the last three decades has been warmer than the previous one, culminating with 2001-2010 as the warmest decade on record. It confirms that 2013 tied with 2007 as the sixth warmest year on record, continuing the long-term global warming trend.

Temperatures in many parts of the southern hemisphere were especially warm, and Australia was not the only country to feel the impact: Argentina had its second hottest year on record. – Climate News Network

Warmer winters will limit Olympic snow

March 22, 2014 in Greenhouse Gases, Temperature Increase, Warming


Is Sochi, home of the 2014 Winter Olympics, now on the skids? Few host cities look likely to survive Image: By Lite via Wikimedia Commons


By Tim Radford

Only one in three of the cities which have staged the Winter Olympics in the last 90 years is likely to be cold enough to do so as this century draws to its end.

LONDON, 22 March – The future looks cloudy for the Winter Olympics. If the world keeps on burning fossil fuels in the usual way, then of the 19 cities that have staged the event since 1924, only six are likely to have enough natural snow and ice by the 2080s.

Daniel Scott, a geographer at the University of Waterloo in Canada, and colleagues from Ontario and Innsbruck, Austria, report in the journal Current Issues in Tourism that, on average, daytime temperatures in February at the Games sites between 1924 and 1950 were 0.4°C. By the 21st century this had changed dramatically: average daytime maximum temperatures had reached 7.8°C.

“There are limits to what current weather risk management strategies can cope with. These limits will be increasingly tested in a warmer world,” the authors write.

The Olympic Winter Games are big business. In 1924 there were 250 amateurs from 16 countries competing in 16 medal events. In 2010 in Vancouver, Canada, there were 2,500 athletes – amateur and professional – from 82 countries competing in 86 events.

High stakes

Around 1.5 million spectator tickets were sold in Vancouver, and worldwide media broadcast revenues were more than $1.2 billion. Broadcasts reached 200 nations and a potential audience of 3.8 billion. So the Games are now a big tourist attraction, and at the same time a spectacular showcase for the host country as a future destination for pleasure-seekers.

But, say the researchers, it now looks as though artificial snow technology could become more important than ever, and some places that invested heavily in Olympic facilities will become increasingly marginal, high-risk or downright unreliable.

The authors did three things. They considered the pattern of change over the last 90 years. They analysed the minimum requirements in temperature, weather and snowfall for a games venue. And then they considered the likely warming of the next 80 years in two scenarios – one with low greenhouse gas emissions, and one under the notorious “business-as-usual” scenario.

Games planners ideally need the following: daytime maximum temperatures no higher than 10°C in February and minimums of 0°C or below (so that the slush can freeze again); as little rain as possible (because that’s really bad for snow and discouraging for spectators); natural snow deep enough for alpine and cross country skiing; and overall temperature conditions that would make artificial snowmaking possible if the natural snowfall is not enough.

Few candidates left

They distilled all these to two basic requirements: a reliable daily minimum of 0°C and 30 cm of natural snow on the hillsides. And then they checked all these against the predictions for 19 cities and resorts around the world that have already staged the games and watched the candidates eliminate themselves.

Chamonix in France, home of the first-ever games, and a famous winter playground for more than a century, becomes “high-risk/marginal” by mid-century and, under the high emissions scenario, simply “unreliable” by 2080.

The same is true for Grenoble in France, Oslo in Norway and Sarajevo in Bosnia-Herzegovina. Vancouver becomes unreliable under the high emissions scenario by 2050, and under all scenarios thereafter. Garmisch-Partenkirchen in Germany and Sochi in Russia, home of the 2014 games, become unreliable under all scenarios. By 2050, only 10 places will still be suitable, with reliable snow and ice. By 2080, this total will be down to six.

That raises some big questions: what future is there for winter sports? Will new winter sports powers and regions emerge? Could anyone ever devise truly artificial snow – stuff that would not depend on the temperature? In a substantially warmer world, celebrating the second centennial of the Olympic Winter Games in 2124 would become “increasingly challenging,” the researchers say. – Climate News Network

Call for more efficient livestock management

March 11, 2014 in Agriculture, Emissions reductions, Food security, Greenhouse Gases, Livestock, Methane


Feed me grass, not wheat Image: Man vyi via Wikimedia Commons


By Tim Radford

Many methods of raising livestock are chronically inefficient, a group of scientists argues. Changing the way animals are fed would make more food available for human consumption, and would also reduce methane emissions.

LONDON, 11 March – British and international scientists have proposed eight strategies to make cattle and sheep-farming more sustainable, to make both the animals and people who depend on them healthier, and to reduce the strain on the planet.

Mark Eisler, a veterinary scientist at the University of Bristol, and colleagues argue in the journal Nature that cattle, sheep and some other livestock eat the grass and crop residues that people cannot digest.

In so doing farm animals can offer both efficient use of resources and high quality protein for those who might need it most.

Dependence on livestock

Almost a billion of the world’s poorest people depend on livestock for their livelihood, but the scientists are not arguing that the world should eat more meat: rather that there should be more careful use of livestock as a resource.

They make their argument at a time when obesity has emerged as a global problem, and at a time when reports once again link a diet high in meat and animal products with cancer and diabetes.

“Annual meat consumption in India is just 3.2 kilograms per head, compared with 125kg per person in the United States in 2007, much of it from heavily processed foods, such as burgers, sausages and ready meals,” they write. “The focus should be on eating less, better quality meat.”

For the world’s poorest, however, there are clear nutritional advantages to high quality animal foods rich in protein. The challenge is to manage livestock in the most effective way.

Changing diet

They argue that farm animals now consume a third or more of the world’s cereal grain, which would be better used to feed people directly. Other scientists have lately argued that a higher quality diet for animals would lead to lower emissions of methane.

Instead, the Bristol-led team wants to see ruminants or cud-chewing animals fed food that humans cannot eat. Rather than try to improve yield in the developing world by exporting cattle that can deliver 30 litres of milk a day – yields that don’t survive a change in climate and environment – farmers and scientists should try to improve local livestock already adapted to local conditions.

They also argue that animal health is important. Sick animals can make people sick too. A total of 13 diseases transferred between animals and humans are linked to 2.2 million deaths a year and 2.4 billion cases of illness.

Professor Eisler and his fellow authors also want to see animal productivity boosted by the right food supplements to encourage better growth, more efficient digestion and lower levels of greenhouse gas emissions.

Less red meat

They believe in a balanced diet with an average of no more than 300 grams of red meat per person per week; they want to see better management of animals and grazing land to conserve biodiversity and increase carbon capture by plants and soil, and they want to see a global network of research farms.

“The quest for intensification in livestock farming has thundered ahead with little regard for sustainability and overall efficiency, the net amount of food produced in relation to inputs such as land and water,” said Prof Eisler.

“With animal protein set to remain part of the food supply, we must pursue sustainable intensifications and figure out how to keep livestock in ways that work best for individuals, communities and the planet.” – Climate News Network