Q & As Archive - The Years Project

What is carbon capture and storage, and what role can it play?


Carbon sequestration, also known as carbon capture and storage (CCS), is a technology being pursued because it might allow the continued use of fossil fuels, especially coal. Unfortunately, CCS has developed more slowly than expected, and the technology is unlikely to make a major contribution to reducing carbon pollution until after the 2020s.


Burning fossil fuels releases carbon dioxide into the atmosphere, and that carbon dioxide is the primary cause of recent global warming. The vast majority of strategies to reduce such carbon dioxide emissions involve reducing fossil fuel combustion, either by replacing fossil fuels with carbon-free sources (such as solar energy or nuclear power) or by using a technology that does the same job with less energy (such as an energy-efficient light bulb or motor).

To ensure fossil fuel combustion does not release carbon pollution into the atmosphere, the carbon dioxide from a coal-fired power plant (or potentially a gas-fired one) must be captured and stored somewhere forever. That carbon dioxide could be removed either before or after combustion. Removing it before combustion is much simpler and cheaper because after combustion, the carbon dioxide becomes diluted, first in the exhaust (flue) gas and then in the atmosphere. The more dilute the carbon dioxide, the more difficult and costly it is to extract.

On the other hand, coal can be gasified, and the resulting “synthesis gas” (syngas) can be chemically processed to produce a hydrogen-rich gas and a concentrated stream of carbon dioxide. The latter can be piped directly to a carbon storage site. The former can be burned in a highly efficient “combined cycle” power plant. The whole process—integrated gasification combined cycle (IGCC) plus permanent storage in underground sites—is considerably more expensive than conventional coal plants. In 2009, Harvard’s Belfer Center for Science and International Affairs published a major study, “Realistic Costs of Carbon Capture.” The Harvard analysis concluded that first-of-a-kind CCS plants will have a cost of carbon abatement of some $150 per ton of carbon dioxide avoided, not counting transport and storage costs. This yields a “cost of electricity on a 2008 basis [that] is approximately 10 cents/kWh higher with capture than for conventional plants.” That price would effectively double the cost of power from a new coal plant. In 2003, the National Coal Council explained a key problem that has slowed development of IGCC: “Vendors currently do not have adequate economic incentive” to pursue the technology because IGCC “may only become broadly competitive” under a “CO2-restricted scenario.”

It would certainly be more useful to have a CCS technology that could capture and store the carbon dioxide from the post-combustion exhaust (flue) gas produced by thousands of existing coal plants than to have CCS technology that works only on newly designed plants. However, that technology has historically been even further from commercialization at scale and necessarily involves capturing carbon dioxide that is far more dilute. As a 2008 U.S. DOE report pointed out:

“Existing CO2 capture technologies are not cost-effective when considered in the context of large power plants. Economic studies indicate that carbon capture will add over 30% to the cost of electricity for new integrated gasification combined cycle (IGCC) units and over 80% to the cost of electricity if retrofitted to existing pulverised coal (PC) units. In addition, the net electricity produced from existing plants would be significantly reduced—often referred to as parasitic loss—since 20-30% of the power generated by the plant would have to be used to capture and compress the CO2.”
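
The parasitic-loss figure in the quote above can be made concrete with a rough back-of-the-envelope calculation. In this sketch, the plant size and capacity factor are hypothetical numbers chosen for illustration; only the 20–30% parasitic fraction comes from the DOE report:

```python
# Rough illustration of the CCS "parasitic loss" described above.
# The 20-30% parasitic fraction comes from the DOE quote; the plant
# size and capacity factor are hypothetical, for illustration only.

gross_capacity_mw = 500      # assumed coal plant nameplate capacity
capacity_factor = 0.75       # assumed fraction of the year at full output

for parasitic_fraction in (0.20, 0.30):
    net_capacity_mw = gross_capacity_mw * (1 - parasitic_fraction)
    annual_net_mwh = net_capacity_mw * capacity_factor * 8760  # hours/year
    print(f"{parasitic_fraction:.0%} parasitic load -> "
          f"{net_capacity_mw:.0f} MW net, "
          f"{annual_net_mwh:,.0f} MWh/yr delivered")
```

Because the plant burns the same coal either way, losing a fifth to nearly a third of its salable output is a large part of why retrofits raise the cost of electricity so sharply.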

Given how very expensive early-stage carbon capture and storage is, jump-starting accelerated development and deployment of CCS requires:

  1. A rising price on carbon dioxide to make CCS profitable or
  2. Large subsidies by some government entity or
  3. Significant investment and financing by the private sector or
  4. Some combination of those three things

Perhaps the major reasons for the very slow development of CCS for both new and existing power plants have been (1) lack of a price on carbon dioxide or other government policy that could provide large ongoing subsidies coupled with (2) lackluster interest and investment by the private sector.

How slow has development been? In October 2013, the New York Times summarized the state of CCS with the headline, “Study Finds Setbacks in Carbon Capture Projects.” The story noted that “the technology for capturing carbon has not been proved to work on a commercial scale, either in the United States or abroad.” One major CCS demonstration at a West Virginia coal plant was shut down in 2011 because “it could not sell the carbon dioxide or recover the extra cost from its electricity customers, and the equipment consumed so much energy that, at full scale, the project would have sharply cut electricity production.”

The Global CCS Institute’s 2013 survey, “The Global Status of CCS,” found that “while C.C.S. projects are progressing, the pace is well below the level required for C.C.S. to make a substantial contribution to climate change mitigation.”


The Norwegian oil and gas company Statoil is one of the few in the world that has actually captured carbon dioxide from natural gas processing facilities and stored it geologically (in former gas and oil fields). Statoil’s vice president for CCS said in late 2014, “Today the cost per ton is economically prohibitive,” and so “We need public-private partnerships where the government takes commercial exposure and some of the risks.”

A key issue for CCS is that although the development of large-scale commercial projects has been very slow, the requirements for CCS to make a major dent in the global warming problem are huge. Vaclav Smil, Distinguished Professor Emeritus of the Environment at the University of Manitoba in Canada, described “the daunting scale of the challenge” in his analysis “Energy at the Crossroads”:

“Sequestering a mere 1/10 of today’s global CO2 emissions (less than 3 Gt CO2) would thus call for putting in place an industry that would have to force underground every year the volume of compressed gas larger than or (with higher compression) equal to the volume of crude oil extracted globally by [the] petroleum industry whose infrastructures and capacities have been put in place over a century of development. Needless to say, such a technical feat could not be accomplished within a single generation.”

There are many other issues with CCS. For instance, there is the leakage issue. Even a very small leakage rate from an underground carbon storage site (less than 1% a year) would render it all but useless as a “permanent repository.” In addition, a Duke University study found the following: “Leaks from carbon dioxide injected deep underground to help fight climate change could bubble up into drinking water aquifers near the surface, driving up levels of contaminants in the water tenfold or more in some places.” What kind of contaminants could bubble up into drinking water aquifers? The study noted: “Potentially dangerous uranium and barium increased throughout the entire experiment in some samples.” This problem may not turn out to be fatal to CCS, but it might well limit the places where sequestration is practical, either because the geology of the storage site is problematic or because the site is simply too close to the water supply of a large population.

Public acceptance has already been a major problem for CCS. Public concern about CO2 leaks (small and large) has impeded a number of CCS projects around the world. Modest leaks risk water contamination, but large leaks can actually prove fatal because in high concentrations, CO2 can suffocate people. As BusinessWeek reported in 2008:

“One large, coal-fired plant generates the equivalent of 3 billion barrels of CO2 over a 60-year lifetime. That would require a space the size of a major oil field to contain. The pressure could cause leaks or earthquakes, says Curt M. White, who ran the US Energy Department’s carbon sequestration group until 2005 and served as an adviser until earlier this year. ‘Red flags should be going up everywhere when you talk about this amount of liquid being put underground’.”

With the use of hydraulic fracturing to produce natural gas in the United States, we have seen considerable concern about leakage of methane and other potentially harmful substances. There is a growing body of research linking hydraulic fracturing to earthquakes. That has been especially true for so-called reinjection wells, where millions of gallons of wastewater from the fracturing process are injected deep underground, much as the carbon dioxide would be in CCS. Stanford University researchers concluded in 2012:

“We argue here that there is a high probability that earthquakes will be triggered by injection of large volumes of CO2 into the brittle rocks commonly found in continental interiors. Because even small- to moderate-sized earthquakes threaten the seal integrity of CO2 repositories, in this context, large-scale CCS is a risky, and likely unsuccessful, strategy for significantly reducing greenhouse gas emissions.”

Concern about leaks helped kill one of the world’s first full CCS demonstrations of capturing, transporting, and storing carbon dioxide, by the Swedish company Vattenfall in northern Germany. The project started operation in 2008. In 2009, Germany tried to introduce legislation that would have had the government assume liability for companies injecting carbon dioxide underground. The legislation failed to pass, and Vattenfall did not get a permit to bury the carbon dioxide. As a result, in July 2009, the plan “ended with CO2 being pumped directly into the atmosphere, following local opposition at it being stored underground.” In May 2014, the company announced that it was ending all CCS research.

Carbon capture and storage has a long way to go to become a major contributor to addressing the threat of climate change starting in the 2030s. We will need vastly more effort by the public and private sectors if CCS is going to provide as much as 10% of the carbon dioxide emissions reductions needed by 2050.

What is my best source for climate change fact checking that I can use to help convince my climate denier friends?

Anyone who plans to talk about climate change with their friends or family should read “Climate Change: What Everyone Needs to Know” and spend some time at the website SkepticalScience.com.

Question submitted by Daniel Rose

Because there is a growing national and global conversation on climate change, with major world figures like the Pope joining in, you are likely to encounter people who do not know basic climate science or actually “know” things that are not true. In particular, certain flawed arguments against the science of human-caused climate change have become very commonplace.

These myths have become popular for two key reasons. First, most of them are repeated again and again by the disinformation campaign funded by the fossil fuel industry. Second, they sound plausible on the surface.

Anyone who plans to talk about climate change with their friends or family or colleagues should read my Oxford University Press primer, “Climate Change: What Everyone Needs to Know,” and spend some time at the website SkepticalScience.com. Skeptical Science tracks and debunks the most popular climate science myths. It provides both simple and more detailed responses to all of the myths, complete with detailed citations of and links to the recent scientific literature. It even has an app for that purpose. It also includes the best strategies for effective communication based on the social science literature. By permission, I will make use of their material below—with tweaks and additions—to provide short answers to the myths and questions you are most likely to hear (which are in quotation marks).

  1. “The climate has changed before” or “The climate is always changing.” This assertion is actually true, but it is meant to imply that because the climate changed before humans were around, humans cannot cause climate change. That is a logical fallacy, like saying smoking cannot cause lung cancer because people who do not smoke also get lung cancer. In fact, climate scientists now have the same degree of certainty that human-caused emissions are changing the climate as they do that cigarette smoking is harmful. The key point is that the climate changes when it is forced to change. Scientific analysis of past climates shows that greenhouse gases, principally CO2, have controlled most ancient climate changes. The evidence for that is spread throughout the geological record. Now humans are forcing the climate to change far more rapidly than it did in the past mainly by our CO2 emissions—50 times faster than it changed during the relatively stable climate of the past several thousand years that made modern civilization (and particularly modern agriculture) possible.
  2. “Warming has stopped, paused, or slowed down.” In fact, 2014 was the hottest year on record, until 2015 crushed it, and then 2016 easily topped 2015. The warming trend in the past two decades now exceeds the warming trend in the two decades before that. Also, empirical measurements of the Earth’s heat content show the planet is still accumulating heat. Global warming is still happening everywhere we look, especially the oceans, where more than 90% of the extra heat trapped by human carbon pollution goes.
  3. “There is no scientific consensus on human-caused warming”: In fact, our understanding that humans are causing global warming is the position of the Academies of Science from 80 countries plus many scientific organizations that study climate science. More specifically, surveys of the peer-reviewed scientific literature and the opinions of experts consistently show a 97%–98% consensus that humans are causing global warming.
  4. “Recent warming is due to the sun.” In fact, in the last 35 years of global warming, the sun and the climate have been going in opposite directions—with the sun actually showing a slight cooling trend. The Sun can explain some of the increase in global temperatures in the past century, but a relatively small amount. The best estimate from the world’s top scientists is that humans are responsible for all of the warming we have experienced since 1950.
  5. “Are surface temperature records reliable?” Independent studies using different software, different methods, and different data sets yield similar results. The increase in temperatures since 1975 is a consistent feature of all reconstructions. This increase cannot be explained as an artifact of the adjustment process, the use of fewer temperature stations, or other nonclimatological factors. Natural temperature measurements also confirm the general accuracy of the instrumental temperature record.
  6. “Isn’t Antarctica gaining ice?” Satellite measurements reveal that Antarctica is losing land ice at an accelerating rate, leading many scientists to increase their projections of sea-level rise this century. Why, then, is Antarctic sea ice growing despite a strongly warming Southern Ocean? The U.S. National Snow and Ice Data Center explained in 2014 that the best explanation from their scientists is that it “might be caused by changing wind patterns or recent ice sheet melt from warmer, deep ocean water reaching the coastline. . . . The melt water freshens and cools the deep ocean layer, and it contributes to a cold surface layer surrounding Antarctica, creating conditions that favor ice growth.”
  7. “Didn’t scientists predict an ice age in the 70s?” The 1970s ice age predictions you hear about today were predominantly from a very small number of articles in the popular media. The majority of peer-reviewed research at the time predicted warming due to increasing CO2.
  8. “Climate change won’t be bad.” As the scientific literature detailed in this book makes clear, the negative impacts of global warming on agriculture, the environment, and public health far outweigh any positives. The consequences of climate change become increasingly bad after each additional degree of warming, with the consequences of 2°C being quite damaging and the consequences of 4°C being catastrophic. The consequences of 6°C would be almost unimaginable.
  9. “Can climate models be trusted?” A related question is, “Since we can’t predict the weather a few weeks from now, how can we predict the climate a few decades from now?” Although there are uncertainties with climate models, they successfully reproduce the past and have made predictions that have been subsequently confirmed by observations. Long-term weather prediction is hard because on any given day a few months from now or a few years from now, the temperature could vary by tens of degrees Fahrenheit or even Celsius. Similarly, there could be a deluge or no rain at all on any given day. The weather is the atmospheric conditions you experience at a specific time and place. Is it hot or cold? Is it raining or dry? Is it sunny or cloudy? The climate is the statistical average of these weather conditions over a long period of time, typically decades. Is it a tropical climate or a polar climate? Is it a rainforest or a desert? The climate is considerably easier to predict precisely because it is a long-term average. Greenland is going to be much colder than Kenya during the course of a year and during almost every individual month. The Amazon is going to be much wetter than the Sahara desert virtually year-round.
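
The weather-versus-climate distinction in the last answer can be illustrated with a toy simulation. This is a sketch under made-up assumptions (a 15°C climate mean and large Gaussian day-to-day noise), not a climate model: any single day is essentially unpredictable, but the decade-long average is stable.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def daily_temp(climate_mean_c):
    """One day's temperature: the climate mean plus large weather noise."""
    return climate_mean_c + random.gauss(0, 8)

one_day = daily_temp(15)  # any single day could be far from the mean
decade_avg = sum(daily_temp(15) for _ in range(3650)) / 3650

print(f"One day: {one_day:.1f} C")
print(f"10-year average: {decade_avg:.1f} C")
```

The single draw can land many degrees from 15°C, while the ten-year average sits within a fraction of a degree of it, which is exactly why a long-term average is the easier quantity to predict.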

Is climate change making hurricanes more destructive?


The most damaging aspect of a hurricane is the storm surge, as in the case of Hurricane Katrina and Hurricane Sandy. We have already seen that sea-level rise is increasing the chances of a Sandy-level storm surge. In addition, we know that global warming increases the intensity of rainfall from the biggest storms, which further adds to flooding. However, there is also evidence to suggest that the warming itself provides fuel for the biggest storms.

Long-term tropical storm records around the world tend to be problematic because “we do a poor job estimating the intensity of storms that are not surveyed by aircraft,” as Massachusetts Institute of Technology hurricane expert Dr. Kerry Emanuel explained in 2015. He notes that “Currently, only North Atlantic tropical cyclones are routinely reconnoitered by aircraft, and only if they threaten populated regions within a few days.” So the best recent analyses attempt to create a consistent or homogeneous way of comparing hurricanes.

A 2012 study, led by Dr. Aslak Grinsted, created a consistent record of large storm surge events in the Atlantic over the previous nine decades. It found the worst storm surges “can be attributed to landfalling tropical cyclones”—hurricanes cause the biggest storm surges. It also found the worst storm surges “also correspond with the most economically damaging Atlantic cyclones”—hurricanes with the biggest storm surges caused the most destruction. A major finding was that Katrina-sized surges “have been twice as frequent in warm years compared with cold years.”

Why does this happen? There are more active cyclones in warm years than cold years, and “The largest cyclones are most affected by warmer conditions.” This is not a huge surprise given that hurricanes get their energy from warm surface waters. In fact, tropical cyclones and hurricanes are threshold events: if sea surface temperatures are below 80°F (26.5°C), they do not form. One of the ways that hurricanes are weakened is the upwelling of colder, deeper water due to the hurricane’s own violent churning action. However, if the deeper water is also warm—as one would expect in warmer years—it does not weaken the hurricane. In fact, it may continue to intensify.

Typically, for a hurricane to become a Category Four or Five superstorm, it must pass over a pool of relatively deep, warm water. For instance, the National Climatic Data Center 2006 report on Katrina begins its explanation by noting that the surface temperatures in the Gulf of Mexico during the last week in August 2005 “were one to two degrees Celsius above normal, and the warm temperatures extended to a considerable depth through the upper ocean layer.” The report continues, “Also, Katrina crossed the ‘loop current’ (belt of even warmer water), during which time explosive intensification occurred. The temperature of the ocean surface is a critical element in the formation and strength of hurricanes.”

In a 2013 paper, “Projected Atlantic Hurricane Surge Threat from Rising Temperatures,” Grinsted and his colleagues determined that the most extreme storm surge events “are especially sensitive to temperature changes, and we estimate a doubling of Katrina magnitude events associated with the warming over the 20th century.” The study concludes, “we have probably crossed the threshold where Katrina magnitude hurricane surges are more likely caused by global warming than not.”

Another 2013 paper, “Recent Intense Hurricane Response to Global Climate Change,” in Climate Dynamics, looked at hurricane frequency and intensity in recent decades as they relate to human emissions of greenhouse gases and aerosols. Researchers at the U.S. National Center for Atmospheric Research found no human signal in the total number of tropical cyclones or hurricanes that occur each year; however, they found that “since 1975 there has been a substantial and observable regional and global increase in the proportion of Cat 4–5 hurricanes of 25–30% per °C of anthropogenic global warming.”

A third 2013 paper is “Trend Analysis with a New Global Record of Tropical Cyclone Intensity,” in the Journal of Climate. The study was led by Dr. James Kossin of NOAA’s National Climatic Data Center. Hurricane expert Emanuel calls this “the best existing analysis of South Pacific tropical cyclones” in his article on Haiyan and Pam, “two exceptionally intense tropical cyclones” that caused devastation in the western Pacific. The 2013 Kossin et al. paper concluded: “Dramatic changes in the frequency distribution of lifetime maximum intensity (LMI) have occurred in the North Atlantic, while smaller changes are evident in the South Pacific and South Indian Oceans, and the stronger hurricanes in all of these regions have become more intense.”

Thus, the best evidence and analysis finds that although we are not seeing more hurricanes, we are seeing more of the Category 4 or 5 super-hurricanes, the ones that historically have done the most damage and that have destroyed entire coastal cities. At the same time, we are seeing a significant rise in the most damaging storm surges, whereby even a Category 1 hurricane (such as Sandy) that hits in precisely the worst possible place can cause unprecedented damage to coastal communities and major cities.

How much have seas risen?

Human-caused warming has raised ocean levels on average several inches since 1900.


One of the most visible and dangerous impacts from global warming is sea level rise. Human-caused warming has raised ocean levels on average several inches since 1900. In addition, the rate of sea level rise since the early 1990s has been almost 0.3 centimeters (0.12 inches) a year, which is double what the average speed was during the prior eight decades. Some of the most important contributors to sea level rise are accelerating.
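
As a quick sanity check on the rates above, here is a back-of-the-envelope calculation. The period endpoints and the assumption of constant rates within each period are simplifications for illustration; only the "almost 0.3 centimeters a year, double the prior average" figures come from the text.

```python
# Back-of-the-envelope check of the sea level rise rates quoted above.
# Assumes constant rates within each period and round period lengths
# (90 and 25 years) -- simplifications for illustration.

CM_PER_INCH = 2.54

early_rate_cm_yr = 0.15   # roughly half the recent rate, per the text
recent_rate_cm_yr = 0.30  # rate since the early 1990s

rise_1900_1990_cm = early_rate_cm_yr * 90
rise_1990_2015_cm = recent_rate_cm_yr * 25
total_cm = rise_1900_1990_cm + rise_1990_2015_cm

print(f"1900-1990: {rise_1900_1990_cm:.1f} cm")
print(f"Since 1990: {rise_1990_2015_cm:.1f} cm")
print(f"Total: {total_cm:.1f} cm ({total_cm / CM_PER_INCH:.1f} inches)")
```

Even with these crude assumptions, the total lands in the "several inches" range the text describes, with a third of it accumulating in just the most recent quarter of the period.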

As one 2014 study explained, there are five main contributors to warming-driven sea level rise:

  1. Thermal expansion
  2. Changes in groundwater storage
  3. Glacier ice loss
  4. Greenland ice loss
  5. Antarctic ice loss

Thermal expansion raises sea levels because the ocean, like all water, expands as it warms up and thus takes up more space. Warming-driven expansion is responsible for approximately half of the sea level rise in the past hundred years. In addition, around the globe, large amounts of land-based water, especially groundwater (such as is found in underground aquifers), are pumped out for farming and drinking. Because more groundwater is extracted than returns to the ground, that water also ends up in the world’s oceans, which contributes to sea level rise.

Melting mountain glaciers also contribute to sea level rise, because frozen water that was trapped on land flows to the sea. Globally, some 90% of glaciers are shrinking in size. The previously landlocked ice ends up in the oceans, which boosts sea level rise. The cumulative volume of global glaciers began to decrease sharply in the mid-1990s. This coincides with a more than doubling of the rate of sea level rise.

Greenland and Antarctica are each covered by an enormous ice sheet. The Greenland ice sheet is nearly 2 miles (3 kilometers) thick at its thickest point and extends over an area almost as large as Mexico. If it completely melted, the Greenland ice sheet by itself would raise sea levels more than 20 feet. In 2012, a team of international experts backed by NASA and the European Space Agency put together data from satellites and aircraft to produce “the most comprehensive and accurate assessment to date of ice sheet losses in Greenland and Antarctica.” They found that the Greenland ice sheet saw “nearly a five-fold increase” in its melt rate between the mid-1990s and 2011. The year 2012 in particular saw unusually high spring and summer temperatures in Greenland. NASA reported that year, “According to satellite data, an estimated 97% of the ice sheet surface thawed at some point in mid-July.” Scientists told ABC News they had never seen anything like this before. In the summer of 2012, the Jakobshavn Glacier, Greenland’s largest, moved ice from land into the ocean at “more than 10.5 miles (17 kilometers) per year, or more than 150 feet (46 meters) per day,” another study found. The researchers pointed out, “These appear to be the fastest flow rates recorded for any glacier or ice stream in Greenland or Antarctica.” By 2014, researchers were able to map Greenland’s ice sheets using the European Space Agency satellite CryoSat-2, which can measure the changing height of an ice sheet over time. They found that since 2009, Greenland had doubled its annual rate of ice loss, to some 375 cubic kilometers per year.

The Antarctic ice sheet is vastly larger than Greenland’s—bigger than either the United States or Europe—and its average thickness is 1.2 miles (2 kilometers). The Antarctic ice sheet contains some 90% of all the Earth’s ice and would raise sea levels 200 feet if it completely melted. The West Antarctic ice sheet (WAIS) in particular has long been considered unstable because most of the ice sheet is grounded far below sea level—on bedrock as deep as 1.2 miles (2 kilometers) underwater. The WAIS is melting from underneath. As it warms, the WAIS outlet glaciers become more unstable. In the future, rising sea levels themselves may lift the ice, thereby letting more warm water underneath it, which would lead to further bottom melting, more ice shelf disintegration, accelerated glacial flow, and further sea level rise, in an ongoing vicious cycle. A 2012 study found that Antarctica’s rate of ice loss rose 50% in the decade of the 2000s. In 2014, researchers looked at measurements by the European Space Agency’s CryoSat-2 satellite “to develop the first comprehensive assessment of Antarctic ice sheet elevation change.” They concluded: “Three years of observations show that the Antarctic ice sheet is now losing 159 billion tonnes of ice each year—twice as much as when it was last surveyed.” Two major studies from 2014 found that some WAIS glaciers have begun the process of irreversible collapse. One of the authors explains, “The fact that the retreat is happening simultaneously over a large sector suggests it was triggered by a common cause, such as an increase in the amount of ocean heat beneath the floating sections of the glaciers.”

In late 2014, researchers reported the results of a comprehensive, 21-year analysis of the fastest-melting region of Antarctica, the Amundsen Sea Embayment. This region is approximately the size of Texas, and its glaciers are “the most significant Antarctic contributors to sea level rise.” During those two decades, the total amount of ice loss “averaged 83 gigatons per year (91.5 billion U.S. tons).” This is equivalent to losing a Mount Everest’s worth of ice (by weight) every 2 years. Coauthor Isabella Velicogna said, “The mass loss of these glaciers is increasing at an amazing rate.”

What is the biggest source of confusion about what humanity needs to do to avoid the worst climate impacts?

Perhaps the biggest source of confusion in the public climate discussion is that avoiding catastrophic warming requires stabilizing carbon dioxide concentrations, not emissions. Studies find that many, if not most, people are confused about this, including highly informed people, and they mistakenly believe that if we stop increasing emissions, then global warming will stop. In fact, very deep reductions in greenhouse gas (GHG) emissions are needed to stop global warming.

One study published in Climatic Change on the beliefs of Massachusetts Institute of Technology (MIT) graduate students, found that “most subjects believe atmospheric GHG concentrations can be stabilized while emissions into the atmosphere continuously exceed the removal of GHGs from it.” The author, Dr. John Sterman from MIT’s Sloan School of Management, notes that these beliefs are “analogous to arguing a bathtub filled faster than it drains will never overflow” and “support wait-and-see policies but violate conservation of matter.”

Let me expand on the bathtub analogy. Although atmospheric concentrations (the total stock of CO2 already in the air) might be thought of as the water level in the bathtub, emissions (the yearly new flow into the air) are represented by the rate of water flowing into a bathtub from the faucet. There is also a bathtub drain, which is analogous to the so-called carbon “sinks” such as the oceans and the soils. The water level will not drop until the flow through the faucet is less than the flow through the drain.

Similarly, carbon dioxide levels will not stabilize until human-caused emissions are so low that the carbon sinks can essentially absorb them all. Under many scenarios, that requires more than an 80% drop in CO2 emissions. If the goal is stabilization of temperature near or below the 2°C (3.6°F) threshold for dangerous climate change that scientists and governments have identified, then carbon dioxide emissions need to approach zero by 2100. A key related point of confusion is that temperatures do not stop rising once atmospheric carbon dioxide levels have stabilized. It takes a while for the Earth’s climate system to actually reach its equilibrium temperature for a given level of CO2. If CO2  levels stopped rising now, temperatures would keep rising for another few decades, albeit slowly. Put another way, the warming that we have had to date is due to CO2 levels from last century. As long as we keep putting enough carbon dioxide into the air to increase CO2 levels, then this lag will persist and the ultimate warming we face will continue to rise. In addition, certain key impacts, such as the disintegration of the great ice sheets, will also not stop for decades. Moreover, if we wait too long and pass the point of no return, then ice sheet collapse and sea-level rise will continue for centuries, even if temperatures stop rising.
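
The bathtub logic above can be sketched as a toy stock-and-flow model. All numbers here are illustrative assumptions, not a real carbon-cycle model; in particular, treating the sinks as draining a fixed fraction of the excess stock each year is a gross simplification.

```python
# Toy "bathtub" model of atmospheric CO2. All numbers are illustrative
# assumptions, not a real carbon-cycle model: the sinks are modeled as
# draining a fixed 1% of the excess stock (above preindustrial) per year.

PREINDUSTRIAL_PPM = 280
SINK_FRACTION = 0.01       # assumed drain rate on the excess stock

def simulate(years, emissions_ppm_yr, start_ppm=400):
    """Step the CO2 'water level' forward one year at a time."""
    level = start_ppm
    for _ in range(years):
        inflow = emissions_ppm_yr                              # faucet
        outflow = SINK_FRACTION * (level - PREINDUSTRIAL_PPM)  # drain
        level += inflow - outflow
    return level

# Holding emissions constant: the level keeps rising.
print(f"Constant emissions, 50 years: {simulate(50, 2.5):.0f} ppm")
# Cutting emissions ~80%: the drain now outpaces the faucet.
print(f"80% emissions cut, 50 years:  {simulate(50, 0.5):.0f} ppm")
```

Holding emissions constant keeps concentrations climbing toward a much higher equilibrium; only when the inflow falls below what the sinks can drain does the level stop rising, which is the point of the analogy.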

The MIT study, “Understanding Public Complacency About Climate Change: Adults’ Mental Models of Climate Change Violate Conservation of Matter,” notes that there is an apparent “contradiction” in “public attitudes about climate change”:

Surveys show most Americans believe climate change poses serious risks but also that reductions in greenhouse gas (GHG) emissions sufficient to stabilize atmospheric GHG concentrations or net radiative forcing can be deferred until there is greater evidence that climate change is harmful. US policymakers likewise argue it is prudent to wait and see whether climate change will cause substantial economic harm before undertaking policies to reduce emissions. Such wait-and-see policies erroneously presume climate change can be reversed quickly should harm become evident, underestimating substantial delays in the climate’s response to anthropogenic forcing.

Such a misconception of climate dynamics may lead some people to mistakenly believe that action to reduce carbon dioxide emissions does not need to start imminently.

How will climate change affect agriculture and our ability to feed the world’s growing population?

Dust-bowl conditions are projected to become the norm for large areas in both food-importing and food-exporting countries. Feeding the global population in the face of a worsening climate is likely the greatest challenge humans have ever faced.

Every part of the world will be routinely hit by extreme deluges, floods, droughts, and heat waves that damage crops. At the same time, salt water intrusion from sea level rise threatens some of the richest agricultural deltas in the world, such as those of the Nile and the Ganges. Meanwhile, ocean acidification combined with ocean warming and overfishing may severely deplete the food available from the sea.

On the demand side, the United Nations Food and Agriculture Organization estimates that some 800 million people are chronically undernourished. In the coming decades, we will be adding another billion mouths to feed, then another billion, and, by most projections, another billion, taking us to 10 billion. At the same time, many hundreds of millions of people around the world will be entering the middle class, and, if they are anything like their predecessors around the globe, they will be switching from a mostly grain-based diet to a more meat-based one, which can require 10 times as much acreage and water per calorie delivered.

The World Bank issued an unprecedented warning about the threat to global food supplies in a 2012 report, “Turn Down the Heat: Why a 4°C Warmer World Must be Avoided.” The Bank noted that the latest science was “much less optimistic” than what had been reported in the Intergovernmental Panel on Climate Change’s 2007 Fourth Assessment report:

These results suggest instead a rapidly rising risk of crop yield reductions as the world warms. Large negative effects have been observed at high and extreme temperatures in several regions including India, Africa, the United States, and Australia. For example, significant nonlinear effects have been observed in the United States for local daily temperatures increasing to 29°C for corn and 30°C for soybeans. These new results and observations indicate a significant risk of high-temperature thresholds being crossed that could substantially undermine food security globally in a 4°C world.

And that’s just temperature rise: “Compounding these risks is the adverse effect of projected sea-level rise on agriculture in important low-lying delta areas.” Moreover, we have the threat to seafood of ocean acidification. Finally, we have Dust-Bowlification:

The report also says drought-affected areas would increase from 15.4% of global cropland today, to around 44% by 2100. The most severely affected regions in the next 30 to 90 years will likely be in southern Africa, the United States, southern Europe and Southeast Asia, says the report. In Africa, the report predicts 35% of cropland will become unsuitable for cultivation in a 5°C world.

What is some of the underlying science behind these conclusions? Using a “middle of the road” greenhouse gas emissions scenario, a study in Science found that for the more than five billion people who will be living in the tropics and subtropics by 2100, growing-season temperatures “will exceed the most extreme seasonal temperatures recorded from 1900 to 2006.” The authors of “Historical Warnings of Future Food Insecurity with Unprecedented Seasonal Heat” conclude that “Half of world’s population could face climate-driven food crisis by 2100.”

A study led by MIT economists found that “the median poor country’s income will be about 50% lower than it would be had there been no climate change.” That finding was based on a 3°C warming by 2100, which is much less than the warming we are currently on track to reach. A further study led by NOAA scientists found that several regions would see rainfall reductions “comparable to those of the Dust Bowl era.” Worse, unlike the Dust Bowl, which lasted about a decade at its worst, this climate change would be “largely irreversible for 1,000 years after emissions stop.” In other words, some of the most arable land in the world would simply turn to desert.

In my Nature article, “The Next Dust Bowl,” I wrote, “Human adaptation to prolonged, extreme drought is difficult or impossible. Historically, the primary adaptation to Dust-Bowlification has been abandonment; the very word ‘desert’ comes from the Latin desertum for ‘an abandoned place’.” During the relatively short-lived U.S. Dust Bowl era, some 2.5 million people moved out of the Great Plains.

However, now we are looking at multiple, long-lived droughts and steadily growing areas of essentially nonarable land in the heart of densely populated countries and global breadbaskets. In a 2014 study, “Global warming and 21st century drying,” the authors concluded, “An increase in evaporative drying means that . . . important wheat, corn and rice belts in the western United States and southeastern China, will be at risk of drought.”

The study’s lead author, Dr. Benjamin Cook, a top drought expert with joint appointments at NASA and Columbia, explained to me that we are headed into a “fundamental shift in Western hydro-climate.” This drying includes the agriculturally rich Central Plains. The study warns that droughts in the region post-2050 “could be drier and longer than drought conditions seen in those regions in the last 1,000 years.” Given how rapidly the population of the West is growing, I asked him whether there would be enough water for everyone there. He said “we can do it,” but only “if you take agriculture out of the equation.” However, that, of course, is not an option. Columbia University’s Lamont-Doherty Earth Observatory further notes that “while bad weather periodically lowers crop yields in some places, other regions are typically able to compensate to avert food shortages. In the warmer weather of the future, however, crops in multiple regions could wither simultaneously.” That would make food-price shocks “far more common,” according to climatologist and study coauthor Richard Seager.

The international aid and development organization Oxfam has projected that global warming and extreme weather will combine to create devastating food price shocks in the coming decades. They concluded that wheat prices could increase by 200% by 2030 and corn prices could increase a remarkable 500% by 2030.

In 2014, the IPCC warned that humanity is risking a “breakdown of food systems linked to warming, drought, flooding, and precipitation variability and extremes.” This was a key conclusion from its summary of what the scientific literature says about “Impacts, Adaptation, and Vulnerability,” which every member government approved line by line. The IPCC pointed out that in recent years, “several periods of rapid food and cereal price increases following climate extremes in key producing regions indicate a sensitivity of current markets to climate extremes among other factors.” So warming-driven drought and extreme weather have already begun to reduce food security.

If we jump to a more heavily populated and climate-ravaged future, the IPCC warns that climate change will “prolong existing, and create new, poverty traps, the latter particularly in urban areas and emerging hotspots of hunger.” You might think the question of the future of agriculture under high levels of warming would be something that has been well studied because of the importance of feeding so many people in a globally warmed world. However, the IPCC notes that “Relatively few studies have considered impacts on cropping systems for scenarios where global mean temperatures increase by 4°C [7°F] or more.”

Even though humanity is currently headed towards 4°C [7°F] and beyond, we do not have a very good scientific picture of the full impact such climate change will have on agriculture and food supplies. The IPCC does mention briefly that our current path of unrestricted carbon emissions (the RCP8.5 scenario) holds unique risks for food supplies: “By 2100 for the high-emission scenario RCP8.5, the combination of high temperature and humidity in some areas for parts of the year is projected to compromise normal human activities, including growing food or working outdoors.” If we warm anywhere near that much—some 4°C [7°F] or more—the challenge of feeding 9 billion people or more will become exponentially harder.

What are the expected health impacts of climate change?

Not only does extreme weather create a direct risk of harm and death, but rising global temperatures facilitate the spread of tropical diseases and increase the risk of death due to heat waves.

Human health will be negatively affected by climate change in a number of ways. Not only does extreme weather create a direct risk of harm and death, but rising global temperatures facilitate the spread of tropical diseases and increase the risk of death due to heat waves. Floods and droughts will also impact health by threatening sources of clean water and food.

Climate change is expected to have a broad range of direct and indirect impacts on health this century. These impacts range from increased mortality due to longer and stronger heat waves to health problems created by warming-driven urban smog to risks posed by malnutrition and lack of access to water. Warming will have some beneficial impacts, most notably a modest drop in cold-related illness and death. “But globally over the 21st century, the magnitude and severity of negative impacts are projected to increasingly outweigh positive impacts,” as the Intergovernmental Panel on Climate Change (IPCC) concluded in its comprehensive 2014 literature review on “Impacts, adaptation, and vulnerability.”

Most worrisome, if humanity stays near its current path of greenhouse gas emissions, the IPCC warns with “high confidence” that “the combination of high temperature and humidity in some areas for parts of the year is projected to compromise normal human activities, including growing food or working outdoors.” In that case, simply being outdoors in summer months will be unhealthy, and those areas of the world would increasingly be seen as uninhabitable.

Although one might think that the human health impacts of global warming would be among the most well studied areas of climate change, it is only in the last decade that the medical community and other health professionals have focused on this issue in depth. As recently as 2009, a landmark Health Commission created by The Lancet medical journal and the University College London (UCL) Institute for Global Health could warn that the “full impact” of climate change on human health “is not being grasped by the healthcare community or policymakers.” Lead author Anthony Costello, a pediatrician and director of the UCL Institute for Global Health, said that he himself “had not realised the full ramifications of climate change on health until 18 months ago.”

The report, “Managing the Health Effects of Climate Change,” concluded, “Climate change is the biggest global health threat of the 21st century.” It warned, “Climate change will have devastating consequences for human health from”:

– Changing patterns of diseases, and increased deaths due to heat waves

– An increase in the frequency and magnitude of extreme climate events (hurricanes, cyclones, storm surges) causing flooding and direct injury

– Increasing vulnerability for those living in urban slums and where shelter and human settlements are poor

– Large-scale population migration and the likelihood of civil unrest

A 2011 editorial in The British Medical Journal, led by the surgeon rear admiral of the UK’s Ministry of Defence, reviewed and synthesized recent reports on “Climate change, ill health, and conflict.” The editorial warned that “Climate change poses an immediate and grave threat, driving ill health and increasing the risk of conflict, such that each feeds on the other.” The threat posed by climate change to regional security “will limit access to food, safe water, power, sanitation, and health services and drive mass migration and competition for remaining resources.” There will be a rise in starvation, diarrhea, and infectious diseases as well as in the death rate of children and adults. The authors note that “in 2004, seven of the 10 countries with the highest mortality rates in children under 5 were conflict or immediate post-conflict societies.”

Are recent climatic changes unprecedented?

Many of the recently observed climate changes are unprecedented over decades to millennia.

A stable climate enabled the development of modern civilization, global agriculture, and a world that could sustain a vast population, now exceeding 7 billion people. We already have unprecedented levels of CO2 in the atmosphere, so it would not be surprising to learn that some of the CO2-driven climate changes are unprecedented.

Until the last century, global temperatures over the past 11,000 years varied quite slowly, generally not more than a degree Fahrenheit (under a degree Celsius) over a period of several thousand years. In its final 2014 synthesis of more than 30,000 scientific studies, the Intergovernmental Panel on Climate Change concluded, “Warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia.” How unprecedented those changes were became clear in an earlier 2012 study, the most comprehensive scientific reconstruction of global temperatures over the past 11,000 years ever made. The study’s funder, the National Science Foundation, explained in a news release: “During the last 5,000 years, the Earth on average cooled about 1.3 degrees Fahrenheit–until the last 100 years, when it warmed about 1.3 degrees F.” In short, primarily because of human-caused greenhouse gases, the global temperature is changing 50 times faster than it did during the time when modern civilization and agriculture developed, a time when humans figured out where the climate conditions—and rivers and sea levels—were most suited for living and farming.
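The “50 times faster” figure follows from simple arithmetic on the two rates quoted in the NSF release: roughly 1.3°F of change spread over 5,000 years versus roughly 1.3°F packed into the last 100 years:

```python
# Rate comparison using the NSF figures: the same size of temperature
# change, spread over 5,000 years versus 100 years.
past_rate = 1.3 / 5000   # degrees F per year over the last 5,000 years
recent_rate = 1.3 / 100  # degrees F per year over the last century

print(round(recent_rate / past_rate))  # 50
```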

In 2013, scientists from the International Programme on the State of the Ocean reported that the rate we are acidifying the oceans is also “unprecedented.” Approximately one quarter of the CO2 humans emit into the air gets absorbed in the oceans. The CO2 that dissolves in seawater forms carbonic acid, which in turn acidifies the ocean. As a result, the oceans are more acidic today than they have been over the last 300 million years. A 2010 study concluded that the oceans are acidifying 10 times faster today than 55 million years ago when a mass extinction of marine species occurred.

Which extreme weather events are being made worse by climate change?

Warming directly makes heat waves longer, stronger, and more frequent. For instance, a major 2012 study found that extreme heat waves in Texas, such as the one that occurred in 2011, are much more likely—20 times more likely in years like 2011—to occur than they were 40–50 years ago.

Although human-caused global warming makes extremely warm days more likely, it makes extremely cold days less likely. So while we will continue to have record-setting cold temperatures in places, the ratio of record-setting hot days to record-setting cold days will grow over time, a shift that has already been measured. The U.S. National Center for Atmospheric Research (NCAR) reported in late 2009 that “Spurred by a warming climate, daily record high temperatures occurred twice as often as record lows over the last decade across the continental United States, new research shows.” Likewise, the UK Met Office reported in 2014 that, globally, the ratio of days that are extremely warm versus the days that are extremely cold has risen sharply since 1950. They point out that “Globally, 2013 was also in the top 10 years for the number of warm days and in the bottom 10 years for the number of cool nights since records began in 1950.”

Global warming directly makes droughts more intense by drying out and heating up land that is suffering from reduced precipitation. The warming also worsens droughts by causing earlier snowmelt, thus reducing a crucial reservoir used in the West during the dry summer season. Finally, climate change shifts precipitation patterns, causing semi-arid regions to become parched. For instance, the 2012 Texas study found “indications of an increase in frequency of low seasonal precipitation totals.”

The heat and the drying and the early snow melt also drive worsening wildfires, particularly in the West. The wildfire season is already more than 2 months longer than it was just a few decades ago, and wildfires are much larger and more destructive.

Warming also puts more water vapor in the atmosphere, so that wet areas of the world become wetter and deluges become more intense and more frequent. This effect has already been documented and linked to human activity in the northern hemisphere. As New York Governor Andrew Cuomo said after Superstorm Sandy slammed his state just 2 years after it was deluged by hurricane Irene, “We have a one-hundred year flood every two years now.” Note that this means that when it is cold enough to snow, snowstorms will be fueled by more water vapor and thus be more intense themselves. We thus expect fewer snowstorms in regions close to the rain-snow line, such as the central United States, although the snowstorms that do occur in those areas are likely to be more intense. It also means we expect more intense snowstorms in generally cold regions. This may appear to be counterintuitive, but the warming to date is not close to that needed to end below-freezing temperatures over large parts of the globe, although it is large enough to put measurably more water vapor into the air.

In addition, warming raises sea levels by heating up and expanding water and by melting landlocked ice in places such as Greenland and Antarctica. Those rising sea levels in turn make devastating storm surges more likely. For instance, warming-driven sea level rise nearly doubled the probability of a Sandy-level flood today compared with 1950. Studies also find that global warming makes the strongest hurricanes more intense, because hurricanes draw their energy from ocean warmth, so that once a hurricane forms, global warming provides it more fuel.

Is there a difference between global warming and climate change?

Global warming generally refers to the observed warming of the planet due to human-caused greenhouse gas emissions. Climate change generally refers to all of the various long-term changes in our climate, including sea level rise, extreme weather, and ocean acidification.

In 1896, a Swedish scientist, Svante Arrhenius, concluded that if we double atmospheric CO2 levels to 560 parts per million (from preindustrial levels of 280), then surface temperature levels would rise several degrees. The first published use of the term “global warming” appears to have been in 1975 by the climatologist Wallace Broecker in an article in the journal Science titled, “Climatic Change: Are We on the Brink of a Pronounced Global Warming?” In June 1988, global warming became the more popular term after NASA scientist James Hansen told Congress in a widely publicized hearing that “Global warming has reached a level such that we can ascribe with a high degree of confidence a cause and effect relationship between the greenhouse effect and the observed warming.”
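Arrhenius’s insight survives in the modern rule of thumb that warming scales with the logarithm of the CO2 concentration ratio. A minimal sketch of that relationship follows, with the sensitivity per doubling set to an assumed illustrative value of 3°C (a commonly cited central estimate, not a figure from the text):

```python
import math

def warming(c_new, c_old=280.0, sensitivity=3.0):
    """Equilibrium warming in degrees C for a rise in atmospheric CO2
    from c_old to c_new ppm, assuming `sensitivity` degrees of warming
    per doubling (an assumed illustrative value)."""
    return sensitivity * math.log2(c_new / c_old)

# One doubling, 280 -> 560 ppm, yields the "several degrees" Arrhenius
# anticipated (exactly the assumed sensitivity, by construction).
print(warming(560.0))  # 3.0
```

The logarithmic form is why each successive doubling of CO2, not each successive increment in ppm, produces roughly the same additional warming.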

The term “climate change” dates at least as far back as 1939. A closely related term, “climatic change,” was also common, as in the 1955 scientific article, “The Carbon Dioxide Theory of Climatic Change” by Gilbert Plass. By 1970, the journal Proceedings of the National Academy of Sciences published a paper titled “Carbon Dioxide and its Role in Climate Change.” When the world’s major governments set up an advisory body in 1988 of top scientists and other climate experts to review the scientific literature every few years, they named it the “Intergovernmental Panel on Climate Change.”

Climate change or global climate change is generally considered a “more scientifically accurate term” than global warming, as NASA explained in 2008, in part because “Changes to precipitation patterns and sea level are likely to have much greater human impact than the higher temperatures alone.” When you consider all of the impacts scientists have observed in recent decades—including the acidifying ocean, worsening wildfires, and more intense deluges—climate scientists are likely to continue favoring the term climate change. In general or popular usage, global warming and climate change have become interchangeable over the past several decades, and that trend is likely to continue this century, especially as the warming itself becomes more and more prominent.