Thursday, 28 June 2012

Induced Earthquakes: A Potential Hazard for Geological Storage of Carbon Dioxide

Summary.  The world is burning fossil fuels at an ever-increasing rate, releasing growing amounts of the greenhouse gas carbon dioxide into the atmosphere and driving up the long-term globally averaged temperature.  Consequently there is great interest in developing carbon capture and storage in geological repositories to help abate the increase in atmospheric carbon dioxide.


Zoback and Gorelick have just published a paper that a) emphasizes the vast amounts of carbon dioxide that need to be captured and stored, and b) analyzes in detail the likelihood that small-scale earthquakes may be induced at the injection sites because of the increased fluid pressures introduced into the storage sites.  Their concern is that even small- to medium-scale earthquakes may destroy the integrity of the sites, leading to significant leakage of carbon dioxide back into the atmosphere.  They conclude that extensive deployment of carbon storage involves considerable risk.



Background.  Zoback and Gorelick have analyzed the long-term geological storage of carbon dioxide (CO2) by carbon capture and storage (CCS) as a means of permanently removing this greenhouse gas from the atmosphere.  First we present some introductory information on CCS. (Background on CCS may be found in this earlier post.)

The European Union (EU) has embarked on the only multinational program in the world, based on binding enacted policies, to reduce emissions 20% below 1990 levels by 2020, and 80-95% below by 2050 (the EU Roadmap; see this post).  Achieving such goals requires decarbonization of most energy sources.  The EU recognizes that a major portion of this reduction should come from use of CCS for large-scale fixed sources involved in generating electricity.

To begin research and development of CCS technology, the EU has selected six demonstration projects in six member countries, using differing capture and sequestration technologies, and has committed EUR1 billion (US$1.25 billion) to them.  They range in size from one at 30 MW (to be scaled up to over 300 MW) to 900 MW, with most projects expected to capture about 90% of the emitted CO2.  Storage will be in land-based or offshore saline aquifers, and depleted land-based or offshore gas fields.

In the U. S. the state of California is implementing a plan very similar to the EU’s Roadmap.  In a non-official report detailing how California might attain these goals, the California Science and Technology Council (CSTC) relies heavily on decarbonizing energy sources to the greatest extent possible (see this earlier post).  Electricity generation is to be decarbonized, to the extent that use of fossil fuels is maintained, by use of industrial-scale CCS, even though the report recognizes that this technology remains unproven.  Decarbonization of electricity generation is especially important because CSTC envisions use of electric vehicles to decarbonize transportation.

The U. S. Department of Energy (DOE) is sponsoring research on CCS, as reported in the Carbon Sequestration Program: Technology Program Plan of the National Energy Technology Laboratory.  Its budget request for Fiscal Year 2011 was about US$140 million, with an equal amount anticipated from cost-sharing by Regional Carbon Sequestration Partnerships with universities and corporations.  This budget has grown from about US$10 million in 2000.  Recent support from the American Recovery and Reinvestment Act of 2009 (the fiscal “stimulus”), included in the recent growth of this funding, is essentially exhausted at this time.  All aspects of the various stages in capture, release and concentration, transportation and geological storage, as well as monitoring, verification and accounting, are being investigated at laboratory and small pilot scale.

Similar programs are also supported in the DOE Fossil Energy program.  Its requested budget for Fiscal Year 2013 is about US$276 million for CCS and Power Systems, which supports projects as large as industrial-scale pilot projects.

Cautionary Analysis of CCS.  Zoback and Gorelick analyzed the dangers to maintenance of reservoir integrity in geological sequestration of CO2, in a paper published in the Proceedings of the National Academy of Sciences, June 26, 2012, vol. 109, pp. 10164-10168 .  As background, the authors note:

·        CCS will be very costly;

·        in the U. S. use of coal for generating electricity produces about 2.1 billion metric tons of CO2 a year, or about 36% of all U. S. emissions;

·        China’s emissions from coal-fired generation are about 3 times this amount, corresponding to about 80% of its emission rate;

·        annually, on a worldwide basis, CCS has to contend with 3.5 billion tons of CO2, which requires injecting an amount of CO2 underground roughly equal to the volume of all the oil extracted from oil wells worldwide;

·        this amount of injected CO2 requires that worldwide about 3,500 functional industrial-scale injection facilities be operational by mid-century, an average of about 85 facilities added per year; and

·        geological storage must remain faultlessly leak-tight in order to match the freedom from emissions enjoyed by renewable energy sources.
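The arithmetic behind these bullet points can be checked with a short back-of-the-envelope sketch.  The supercritical CO2 density, per-site capacity, and world oil output used below are our own round-number assumptions, not figures taken from the paper:

```python
# Back-of-the-envelope check of the scale of worldwide CCS deployment.
# Assumed round numbers (not from Zoback and Gorelick): supercritical CO2
# density ~0.7 t/m^3 at reservoir conditions, Sleipner-scale site capacity
# ~1 Mt/yr, world oil output ~85 million barrels/day.

CO2_PER_YEAR_T = 3.5e9       # tonnes of CO2 to be stored annually
SITE_CAPACITY_T = 1.0e6      # tonnes/yr handled by one Sleipner-scale site
CO2_DENSITY = 0.7            # t/m^3, assumed supercritical CO2 density
BARREL_M3 = 0.158987         # cubic metres per barrel

sites_needed = CO2_PER_YEAR_T / SITE_CAPACITY_T   # the ~3,500 facilities
sites_per_year = sites_needed / 40                # if built over ~4 decades

co2_volume_m3 = CO2_PER_YEAR_T / CO2_DENSITY
co2_barrels = co2_volume_m3 / BARREL_M3           # injected volume, in barrels
oil_barrels = 85e6 * 365                          # annual world oil output

print(f"{sites_needed:.0f} sites, ~{sites_per_year:.0f} added per year")
print(f"CO2 volume ~{co2_barrels/1e9:.0f} billion barrels/yr "
      f"vs oil ~{oil_barrels/1e9:.0f} billion barrels/yr")
```

With these assumptions the injected CO2 volume indeed comes out comparable to annual worldwide oil extraction, and the build-out rate lands near the ~85 facilities per year cited above.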

The authors include the following analyses:

o       The paper itemizes several instances of earthquakes apparently triggered by underground injection of liquids.  This can arise because many geological formations are already in states of unresolved stress, so that the relatively minor perturbation arising from fluid injection releases the stress in an earthquake.  The fluid in essence makes it easier for the stressed surfaces to slide over one another, which is the hallmark of an earthquake.  Zoback and Gorelick emphasize that it is not earthquake damage to human wellbeing that concerns them, but rather that even small earthquakes, unlikely to damage structures at the surface, are likely to damage the geological structures holding the pressurized CO2.  CO2 could then readily permeate to or near the surface, permitting release into the atmosphere and defeating the intent of the storage in the first place.  They present calculations showing that even a small earthquake of Magnitude 4 could induce slippage of several cm along a fault of about 1-4 km (0.6-2.4 mi).

o       In stressed geological formations, it is not only the pressure of injected CO2 that is potentially hazardous, but also the rate of injection.  More rapid pressure buildup is more likely to trigger an earthquake event; the need to dispose of large volumes of CO2 would be an incentive for high injection rates.

o       A widely known injection site is the Utsira formation of the Sleipner gas field in the North Sea.  About 1 million tons of CO2 has been separated from natural gas and reinjected below ground every year for the past 15 years.  There has been no earthquake activity to date.  The authors calculate that about 3,500 such sites would have to be identified and put into service to accommodate storage needs projected for 2050 (most of which would in fact be needed right now).  The authors conclude “Clearly this is an extraordinarily difficult, if not impossible task” if only geologically suitable sites are to be used.

o       Depleted oil and gas wells, while seemingly attractive as potential injection sites, are not numerous enough to satisfy the need, and are not necessarily located conveniently for the need.
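The magnitude-to-slip estimate in the first point above can be reproduced approximately from the standard Hanks-Kanamori moment-magnitude relation, Mw = (2/3)(log10 M0 - 9.1) with M0 in newton-metres, together with average slip D = M0 / (mu A).  The shear modulus and the square rupture patch below are conventional textbook assumptions, not values taken from Zoback and Gorelick:

```python
# Rough estimate of average fault slip for a given moment magnitude.
# Seismic moment from Hanks-Kanamori: log10(M0) = 1.5*Mw + 9.1 (M0 in N*m).
# Average slip over the rupture: D = M0 / (mu * A).
# Assumptions: shear modulus mu ~30 GPa, square rupture patch of side L.

def slip_for_magnitude(mw, fault_length_km, mu=30e9):
    m0 = 10 ** (1.5 * mw + 9.1)             # seismic moment, N*m
    area = (fault_length_km * 1e3) ** 2     # square rupture patch, m^2
    return m0 / (mu * area)                 # average slip, m

for length_km in (1.0, 2.0):
    d = slip_for_magnitude(4.0, length_km)
    print(f"Mw 4.0, {length_km:.0f} km fault: average slip ~{d*100:.1f} cm")
```

A 1-2 km rupture patch gives average slip on the order of one to a few centimetres, consistent with the authors' figure of several cm along a 1-4 km fault.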

The authors conclude “multiple lines of evidence indicate that preexisting faults found in brittle rocks almost everywhere in the earth’s crust are subject to failure, often in response to very small increases in pore pressure. In light of the risk posed to a CO2 repository by even small- to moderate-sized earthquakes, formations suitable for large-scale injection of CO2 must be carefully chosen.”  Because of the extremely large volumes of CO2 needing to be disposed of, the industrial-scale CCS needed will be “extremely expensive and risky for achieving significant reductions in greenhouse gas emissions”.

Certain CCS projects have been abandoned due to risk and lack of financing.  The very factors identified by Zoback and Gorelick are echoed in these two recent news reports. 

The Guardian on June 17, 2012 reported that Ian Marchant, chief executive of Scottish and Southern Energy, while still favoring CCS development, warned the British Parliament that a CCS project his company is undertaking is “the most risky project I’ll ever invest in….CCS is…at the demonstration stage….We do not know that this technology will work”.  He called for UK government support at this demonstration phase of the project.

The same article noted that another company, Scottish Power, abandoned CCS technology last year.  Together with Shell, the company evaluated it would need at least £1.5 billion (US$2.3 billion), and the UK government could not support such a funding level.

Similarly, the Guardian reported on June 26, 2012, that Ayrshire Power (Scotland) abandoned its planned new CCS-fitted 1852 MW power plant because it feared it could not obtain funding from the UK and the European Commission.  Nevertheless, the Scottish energy minister still strongly supports CCS development, since Scotland borders North Sea offshore CO2 storage sites.

Rebuttals of Zoback and Gorelick’s warnings.  There has been response from the CCS community rebutting the serious concerns expressed by Zoback and Gorelick.  For example, two scientists were featured in the internet-based Carbon Capture Journal (accessed June 27, 2012).

Dr. Malcolm Wilson, Chief Executive Officer, The Petroleum Technology Research Centre (PTRC), provided a detailed accounting of the experience gained at the Weyburn-Midale Project, an oil field storage development project in Saskatchewan, Canada, which also operates as an enhanced oil recovery project.  Storage has been under way there for 11 years, with a total of 21 million tonnes (metric tons) of CO2 stored in that time.  Detailed research and characterization of the site has been undertaken throughout this time; indeed, seismic events with Magnitudes of -1 (extremely small) have been recorded.  Dr. Wilson considers this site now to be industrial scale, as 2.8 million tonnes of new CO2 (more than 5 million tonnes when recycled CO2 is included) have been injected; no earthquake activity or leakage has been identified.

PTRC is also conducting research on their Aquistore Project, for storage in saline aquifers.  Noting with approval that Zoback and Gorelick cite aquifers favorably because of their very large storage capacities, Dr. Wilson notes that the Aquistore Project will be the first industrial scale storage project, since it will receive CO2 from a coal-fired power plant.

Dr. Bruce Hill, senior staff geologist at the Clean Air Task Force (CATF), rebuts the concern over lack of integrity of storage sites due to earthquake activity by emphasizing the rate of CO2 migration toward the surface, rather than the total amounts potentially released.  Dr. Hill emphasizes that there are many layers of rock structures, extending thousands of feet, overlying injection sites, thereby downplaying the concerns of Zoback and Gorelick.  Dr. Hill feels that the examples cited by the authors are not representative.  He points out that “approximately 1 billion tons of CO2 have been safely injected (and stored) in the process of enhanced oil recovery in the U.S. since the late 1970s, with no reported seismic incidents. In fact, there have been no earthquakes reported anywhere from saline CO2 injections either”.

Dr. Hill concludes that CCS technology is “viable” and should play a significant role in storing the very large amounts of CO2 that need to be recovered to reduce atmospheric CO2 accumulation.

George Peridas responded to the paper on the Natural Resources Defense Council Blog on June 22, 2012.  Mr. Peridas believes that Zoback and Gorelick raise valid issues, including whether CCS can cause earthquakes and whether such earthquakes could lead to leakage of the injected CO2.  But in his opinion, the conclusions reached by the authors are more extensive than warranted by the evidence, for example with respect to the second issue, leakage.  He does not agree that an earthquake event would lead to migration of CO2 all the way to the surface.  He believes that an experiment cited by the authors, performed on granite, a brittle rock, is not representative of the caprock layers anticipated in CCS, which would be more compliant, yet impermeable, shales.  In the case of existing fossil fuel geological reservoirs, large earthquakes have been known to occur without loss of the materials.  Mr. Peridas additionally cites Sally Benson (Stanford University and Lead Coordinating Author of the Underground Geological Storage Chapter in the Intergovernmental Panel on Climate Change Special Report on CCS) as stating that naturally care must be taken in choosing CCS injection sites, but that finding such sites should be feasible.



Discussion

Our earlier post, “Carbon Capture and Storage: A Needed yet Unproven Technology”, presented background information on the various technologies that may be employed in each phase of capturing CO2, from the burning of fossil fuels for energy, to transporting the CO2 to a storage site, and finally the actual storage process.   Many problems remain to make CCS industrially viable for utility-scale facilities.  Resolving these problems requires investment of large sums of money, worldwide, to arrive at practical CCS by about 2020.  Currently a relatively small number of demonstration and pilot projects are under way around the world.

The use of fossil fuels is projected to grow considerably in the coming decades around the globe, primarily in developing countries which will power their rapidly expanding economies with energy derived from burning fossil fuels.  This means that the annual rate of CO2 emissions will continue expanding, and that the total accumulated concentration of atmospheric CO2 likewise will continue increasing.  Even in developed countries having programs to abate CO2 emissions at various stages of maturity, a major aspect of such abatement involves shifting transportation to electric power.  Thus the total demand for electricity is projected to grow in developed countries as well; to the extent that this demand is not met by renewable sources, the need for contending with abatement of CO2 emissions likewise will grow.  For this reason emission abatement programs will rely ever more heavily on technologies such as CCS.

The paper by Zoback and Gorelick serves at least three useful functions.  First, by arithmetic analysis, it underscores the vast, unprecedented need for functional and effective injection sites projected by 2050.  Some of this information has been summarized above.

Second, its geophysical modeling emphasizes the many unknown factors remaining in choosing and developing new CO2 injection sites.  The seals installed around well bores, and the many geological factors involved in keeping the injected CO2 out of contact with the atmosphere for hundreds or thousands of years, must be essentially fail-safe.  Yet this work emphasizes that the very act of injecting pressurized fluid facilitates potential small-scale earthquakes that, according to the modeling, have the potential of opening fissures in these seals that could lead CO2 back to the surface.

Third, it has engendered fruitful debate in the CCS community about the integrity of proposed injection sites.  Although these issues were already known among workers in the community, they have now been aired among a wider public.  This has the effect of ensuring that research and data gathering, involved in characterizing new injection sites, will be carried out diligently and effectively so that wise siting choices may be made.

The critics of Zoback and Gorelick, such as those cited above, include in their rebuttals examples of injection sites taking advantage of pre-existing wells used in the extraction of oil and gas from their geological repositories.  These formations have kept the fuels underground for millions of years, and so are cited as justifying CO2 injection for the same reasons.  They are likely not representative, however, of the thousands of new storage injection projects that will be needed to accommodate the demand.  Overall the number of pilot injection sites worldwide is small, and many are new experimental projects.  The concerns raised by Zoback and Gorelick merit careful attention going forward as CCS technology is developed further and deployed at scale.


© 2012 Henry Auer

Monday, 18 June 2012

Our Invisible Energy-Video

I've created a video entitled Our Invisible Energy, in the format of a pictorial tutorial, which expands on the ideas expressed in this post.

The video makes the point that most of our uses of energy in our daily lives are second nature, so that we don't think much about them.  Nevertheless, the energy in question is mostly obtained from burning fossil fuels, thereby releasing the greenhouse gas carbon dioxide into the atmosphere.

Please have a look!

This video is the first in a series that also includes

          Light and Heat - The Greenhouse Effect and

          Fossil Fuels and Global Warming

© 2012 Henry Auer

Wednesday, 13 June 2012

Storm sentinels


Beginning this summer and over the next several years, NASA will be sending unmanned aircraft dubbed "severe storm sentinels" above stormy skies to help researchers and forecasters uncover information about hurricane formation and intensity changes.

Several NASA centers are joining federal and university partners in the Hurricane and Severe Storm Sentinel (HS3) airborne mission targeted to investigate the processes that underlie hurricane formation and intensity change in the Atlantic Ocean basin.

NASA's unmanned sentinels are autonomously flown. The NASA Global Hawk is well-suited for hurricane investigations because it can over-fly hurricanes at altitudes greater than 60,000 feet with flight durations of up to 28 hours - something piloted aircraft would find nearly impossible to do. Global Hawks were used in the agency's 2010 Genesis and Rapid Intensification Processes (GRIP) hurricane mission and the Global Hawk Pacific (GloPac) environmental science mission.

"Hurricane intensity can be very hard to predict because of an insufficient understanding of how clouds and wind patterns within a storm interact with the storm’s environment. HS3 seeks to improve our understanding of these processes by taking advantage of the surveillance capabilities of the Global Hawk along with measurements from a suite of advanced instruments," said Scott Braun, HS3 mission principal investigator and research meteorologist at NASA's Goddard Space Flight Center in Greenbelt, Md.

HS3 will use two Global Hawk aircraft and six different instruments this summer, flying from a base of operations at Wallops Flight Facility in Virginia.

"One aircraft will sample the environment of storms while the other will measure eyewall and rainband winds and precipitation," Braun said. HS3 will examine the large-scale environment that tropical storms form in and move through and how that environment affects the inner workings of the storms.

HS3 will address the controversial role of the hot, dry, and dusty Saharan Air Layer in tropical storm formation and intensification. Past studies have suggested that the Saharan Air Layer can both favor or suppress intensification. In addition, HS3 will examine the extent to which deep convection in the inner-core region of storms is a key driver of intensity change or just a response to storms finding favorable sources of energy.

The HS3 mission will operate during portions of the Atlantic hurricane seasons, which run from June 1 to November 30. The 2012 mission will run from late August through early October.

The instruments to be mounted in the Global Hawk aircraft that will examine the environment of the storms include the scanning High-resolution Interferometer Sounder (S-HIS), the Advanced Vertical Atmospheric Profiling System (AVAPS) also known as dropsondes, and the Cloud Physics Lidar (CPL). The Tropospheric Wind Lidar Technology Experiment (TWiLiTE) Doppler wind lidar will likely fly in the 2013 mission.

Another set of instruments will fly on the Global Hawk focusing on the inner region of the storms. Those instruments include the High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP) conically scanning Doppler radar, the Hurricane Imaging Radiometer (HIRAD) multi-frequency interferometric radiometer, and the High-Altitude Monolithic Microwave Integrated Circuit Sounding Radiometer (HAMSR) microwave sounder. Most of these instruments represent advanced technology developed by NASA; in some cases they are precursors to future satellite sensors.

NASA's Science Mission Directorate Global Hawk aircraft will deploy to Wallops Flight Facility from their home base at NASA's Dryden Flight Research Center on Edwards Air Force Base, Calif.


NASA's Global Hawk soars aloft from Edwards Air Force Base, Calif. The NASA Global Hawk is well-suited for hurricane investigations because it can over-fly hurricanes at altitudes greater than 60,000 feet with flight durations of up to 28 hours — something piloted aircraft would find nearly impossible to do. Credit: NASA/Tony Landis
"HS3 marks the first time that NASA's Global Hawks will deploy away from Dryden for a mission, potentially marking the beginning of an era in which they are operated regularly from Wallops," said Paul Newman, atmospheric scientist at NASA Goddard and deputy principal investigator on the HS3 mission.

NASA's Science Mission Directorate in Washington is establishing a Global Hawk operations center for science operations from Wallops. "With the Global Hawks at NASA Dryden in California, NASA Wallops will become the 'Global Hawk - Eastern' science center," Newman said.

From rockets studying the upper atmosphere to unmanned aircraft flying over hurricanes, NASA's Wallops Flight Facility is fast becoming a busy place for science. Wallops is one of several NASA centers involved with the HS3 mission. Others include Goddard, Dryden, Ames Research Center, Marshall Space Flight Center, and the Jet Propulsion Laboratory.

The HS3 mission is funded by NASA Headquarters and managed by NASA's Earth System Science Pathfinder Program at NASA's Langley Research Center, Hampton, Va. The HS3 mission also involves collaborations with various partners including the National Centers for Environmental Prediction, Naval Postgraduate School, Naval Research Laboratory, NOAA's Hurricane Research Division and Earth System Research Laboratory, Northrop Grumman Space Technology, National Center for Atmospheric Research, State University of New York at Albany, University of Maryland - Baltimore County, University of Wisconsin, and University of Utah.

Mild fire forecast


Forests in the Amazon Basin are expected to be less vulnerable to wildfires this year, according to the first forecast from a new fire severity model developed by university and NASA researchers.

Fire season across most of the Amazon rain forest typically begins in May, peaks in September and ends in January. The new model, which forecasts the fire season’s severity from three to nine months in advance, calls for an average or below-average fire season this year within 10 regions spanning three countries: Bolivia, Brazil and Peru.

“Tests of the model suggested that predictions should be possible before fire activity begins in earnest,” said Doug Morton, a co-investigator on the project at NASA’s Goddard Space Flight Center in Greenbelt, Md. “This is the first year to stand behind the model and make an experimental forecast, taking a step from the scientific arena to share this information with forest managers, policy makers, and the public alike.”


Gauges convey the fire severity forecast for 10 regions in the Amazon Basin where fire activity varies greatly from year to year, and where climate conditions have a significant impact on fire activity. Credit: Yang Chen/UC Irvine
The model was first described last year in the journal Science. By comparing nine years of fire data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra satellite with a record of sea surface temperatures from NOAA, scientists established a connection between sea surface temperatures in the Pacific and Atlantic oceans and fire activity in South America.

“There will be fires in the Amazon Basin, but our model predictions suggest that they won’t be as likely in 2012 as in some previous years,” said Jim Randerson of the University of California, Irvine, and principal investigator on the research project.

Specifically, sea surface temperatures in the Central Pacific and North Atlantic are currently cooler than normal. Cool sea surface temperatures change patterns of atmospheric circulation and increase rainfall across the southern Amazon in the months leading up to the fire season.

“We believe the precipitation pattern during the end of the wet season is very important because this is when soils are replenished with water,” said Yang Chen of UC Irvine. “If sea surface temperatures are higher, there is reduced precipitation across most of the region, leaving soils with less water to start the dry season.”

Without sufficient water to be transported from the soil to the atmosphere by trees, humidity decreases and vegetation is more likely to burn. Such was the case in 2010, when above-average sea surface temperatures and drought led to a severe fire season. In 2011, conditions shifted and cooler sea surface temperatures and sufficient rainfall resulted in fewer fires, similar to the forecast for 2012.


Improvements to the model are possible by incorporating data from the MODIS instrument on NASA's Aqua satellite, accounting for fires that occur in the afternoon when conditions are hotter and drier. Credit: Doug Morton.
Building on previous research, the researchers said there is potential to adapt and apply the model to other locations where large-scale climate conditions are a good indicator of the impending fire season, such as Indonesia and the United States.

Amazon forests, however, are particularly relevant because of their high biodiversity and vulnerability to fires. Amazon forests also store large amounts of carbon, and deforestation and wildfires release that carbon back to the atmosphere. Predictions of fire season severity may aid initiatives – such as the United Nations’ Reducing Emissions from Deforestation and forest Degradation program – to reduce the emissions of greenhouse gases from fires in tropical forests.

“The hope is that our experimental fire forecasting information will be useful to a broad range of communities to better understand the science, how these forests burn, and what predisposes forests to burning in some years and not others,” Morton said. “We now have the capability to make predictions, and the interest to share this information with groups who can factor it into their preparation for high fire seasons and management of the associated risks to forests and human health.”

NASA's new carbon-counting instrument leaves the nest


Its construction now complete, the science instrument that is the heart of NASA's Orbiting Carbon Observatory-2 (OCO-2) spacecraft — NASA's first mission dedicated to studying atmospheric carbon dioxide — has left its nest at NASA's Jet Propulsion Laboratory in Pasadena, Calif., and has arrived at its integration and test site in Gilbert, Ariz.

A truck carrying the OCO-2 instrument left JPL before dawn on Tuesday, May 9, to begin the trek to Orbital Science Corporation's Satellite Manufacturing Facility in Gilbert, southeast of Phoenix, where it arrived that afternoon. The instrument will be unpacked, inspected and tested. Later this month, it will be integrated with the Orbital-built OCO-2 spacecraft bus, which arrived in Gilbert on April 30.

Once technicians ensure the spacecraft is clean of any contaminants, the observatory's integration and test campaign will kick off. That campaign will be conducted in two parts, with the first part scheduled for completion in October. The observatory will then be stored in Gilbert for about nine months while the launch vehicle is prepared. The integration and test campaign will then resume, with completion scheduled for spring 2014. OCO-2 will then be shipped to Vandenberg Air Force Base, Calif., in preparation for a launch as early as the summer of 2014.

Technicians load the OCO-2 instrument and its ground support equipment aboard a moving van at JPL in preparation for its trek to Orbital Science Corporation's Satellite Manufacturing Facility in Gilbert, Ariz. Credit: NASA/JPL-Caltech.
"The OCO-2 instrument looks great, and its delivery to Orbital's Gilbert, Ariz., facility is a big step forward in successfully launching and operating the mission in space," said Ralph Basilio, OCO-2 project manager at JPL.

OCO-2 is the latest mission in NASA's study of the global carbon cycle. Carbon dioxide is the most significant human-produced greenhouse gas and the principal human-produced driver of climate change. The original OCO mission was lost shortly after launch on Feb. 24, 2009, when the Taurus XL launch vehicle carrying it malfunctioned and failed to reach orbit.

The experimental OCO-2 mission, which is part of NASA's Earth System Science Pathfinder Program, will uniformly sample the atmosphere above Earth's land and ocean, collecting more than half a million measurements of carbon dioxide concentration over Earth's sunlit hemisphere every day for at least two years. It will do so with the accuracy, resolution and coverage needed to provide the first complete picture of the regional-scale geographic distribution and seasonal variations of both human and natural sources of carbon dioxide emissions and their sinks, the places where carbon dioxide is removed from the atmosphere and stored.

Scientists will use OCO-2 mission data to improve global carbon cycle models, better characterize the processes responsible for adding and removing carbon dioxide from the atmosphere, and make more accurate predictions of global climate change.

The mission provides a key new measurement that can be combined with other ground and aircraft measurements and satellite data to answer important questions about the processes that regulate atmospheric carbon dioxide and its role in the carbon cycle and climate. This information could help policymakers and business leaders make better decisions to ensure climate stability and retain our quality of life. The mission will also serve as a pathfinder for future long-term satellite missions to monitor carbon dioxide.

Each of the OCO-2 instrument's three high-resolution spectrometers spreads reflected sunlight into its various colors like a prism, focusing on a different, narrow color range to detect light with the specific colors absorbed by carbon dioxide and molecular oxygen. The amount of light absorbed at these specific colors is proportional to the concentration of carbon dioxide in the atmosphere. Scientists will use these data in computer models to quantify global carbon dioxide sources and sinks.
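The underlying physics is the Beer-Lambert absorption law: the transmitted fraction of light at an absorption line falls off exponentially with the amount of absorbing gas along the path (so it is the absorbance, the negative logarithm of transmission, that scales with concentration).  The cross-section and per-ppm column density below are purely illustrative placeholders, not calibrated OCO-2 instrument values:

```python
import math

# Beer-Lambert absorption: I/I0 = exp(-sigma * N), where sigma is the
# absorption cross-section at a CO2 line and N is the column density of
# CO2 molecules along the light path.  Both constants below are assumed,
# illustrative numbers chosen only to make the trend visible.

SIGMA = 1.0e-22          # cm^2 per molecule (assumed)
COLUMN_PER_PPM = 1.8e19  # molecules/cm^2 of CO2 per ppm of mixing ratio (assumed)

def transmitted_fraction(ppm):
    """Fraction of light surviving the path for a given CO2 mixing ratio."""
    return math.exp(-SIGMA * COLUMN_PER_PPM * ppm)

# Deeper absorption (less transmitted light) indicates more CO2 in the column:
for ppm in (380, 395, 410):
    print(f"{ppm} ppm CO2 -> {transmitted_fraction(ppm):.3f} of light transmitted")
```

Inverting this relation, wavelength by wavelength, is how a spectrometer turns measured line depths into a column CO2 concentration.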

For more information on the mission, visit: the JPL and NASA OCO-2 websites.

Muddled outlook


The 2012 hurricane season in North and Central America arrives with a muddled outlook. Sea surface temperatures are not particularly warm or cool, and the El Niño–Southern Oscillation (ENSO) is drifting in a neutral state that NASA climate scientist Bill Patzert playfully calls “La Nada.”

The map above shows sea surface temperatures (SSTs) in the tropical Atlantic Ocean and tropical eastern Pacific on May 30, 2012. The map was built with data from the Microwave Optimally Interpolated SST product, a NASA-supported effort at Remote Sensing Systems. Researchers combine observations and analyses from NASA’s Tropical Rainfall Measurement Mission and Aqua and Terra satellites, as well as the U.S. Navy’s WindSAT instrument on the Coriolis satellite (operated jointly with the Air Force).

Shades of blue depict water temperatures below 27.8 degrees Celsius (about 82 degrees Fahrenheit), while yellows, oranges, and reds depict waters above that threshold. Scientists generally agree that waters above that temperature are needed to build and sustain hurricanes, though there are exceptions. Of course, measurements of sea surface temperature account for only the top few millimeters of the ocean, and the amount of heat stored at greater depths (which is harder to measure) can also be a factor in hurricane development. So SSTs do not tell the whole story, but they are a fair predictor of the readiness of the ocean to sustain tropical storms.

“The waters look on the slightly cool side across some of the ‘main development region (MDR)’—the tropical band extending over the east and central Atlantic off Africa,” noted Jeff Halverson, a hurricane researcher at the University of Maryland–Baltimore County. “Whether this will persist for several months as we get into the high season, I don't know.”

The official start of hurricane season is June 1, though four named tropical storms in May—Alberto and Beryl in the Atlantic, Aletta and Bud in the Pacific—didn't wait for the calendar. The Hurricane Research Division of the National Oceanic and Atmospheric Administration (NOAA) announced on May 24, 2012, that it is expecting a near-normal season, with nine to fifteen named storms and four to eight hurricanes. According to NOAA, an average season from 1980 to 2010 produced 12 named storms with six hurricanes, including three major hurricanes.

“We shouldn't be fooled by the storms that have already developed off the southeast U.S. in May,” Halverson said. “Development can and does happen this early—albeit infrequently—and these developments are almost always not far off the U.S. mainland. They have little to do with what is coming off Africa and streaming across the MDR. So these early home-grown storms are not necessarily a predictor of the August to October season, which is dominated by Cape Verde storms.”

Meteorologists often look to ENSO for a sense of whether atmospheric weather patterns will promote or tamp down hurricane formation. In general, researchers believe that El Niño reduces hurricane activity and La Niña promotes it. But the science on the matter is not really settled, and it may be that ENSO affects the number but not necessarily the intensity of storms.

La Niña just ended earlier this spring, and the next El Niño may be some months off. “The equatorial Pacific is neutral, with no El Niño developing...not even a hint,” said Patzert, who is based at the Jet Propulsion Laboratory. “If El Niño builds, I think it will be late and wimpy.”

In the eastern Pacific, NOAA is calling for a near-normal or below-normal season. “Forecasters estimate a 70 percent chance of 12 to 18 named storms, which includes 5 to 9 hurricanes.”

Regardless of the predictions, the key to hurricane season is vigilance. “The important issue is hurricane preparedness along the coasts,” said Patzert. “All it takes is one in your neighborhood to wreak havoc. Listen to the National Hurricane Center, know your evacuation routes, and be super prepared.”