
Session 4.3 abstracts



4.3 Implementing a long-term international programme for ocean analysis and forecasting

Session conveners: Pierre-Yves Le Traon, Andreas Schiller and Clemente Tanajura

The table below lists all abstracts for Session 4.3 by author; the full abstracts follow below.

The unique reference number (ref. no.) relates to the abstract submission process and must be used in any communications with the organisers.

All abstracts from Session 4.3 are available for download as a PDF.

Ref. No | Primary Author | Affiliation | Country | Abstract title | Poster
S4.3-01 | Barron, Charlie | NRL | United States | Developing Future Data Assimilation Capabilities for Operational Navy Oceanography | Cancelled
S4.3-02 | Boyer, Tim | US National Oceanographic Data Center | United States | A historical quality-controlled temperature and salinity dataset for assimilation and climate studies – an international approach | Cancelled


ID 4.3-01

Developing Future Data Assimilation Capabilities for Operational Navy Oceanography

Charlie N. Barron1, Matthew Carrier1, James A. Cummings1, Emanuel M. Coelho2, Jan M. Dastugue1, Robert W. Helber1, Hans E. Ngodock1, Clark D. Rowley1, Lucy F. Smedstad1, Scott R. Smith1, Tamara L. Townsend1, Mozheng Wei1, and Max Yaremchuk1

1 Naval Research Laboratory, Stennis Space Center, MS, USA

2 University of New Orleans, Stennis Space Center, MS, USA


The Naval Research Laboratory (NRL) is developing and transitioning a range of data assimilation capabilities in support of operational oceanography at the Naval Oceanographic Office (NAVOCEANO). These new developments are designed to advance present systems by taking advantage of new opportunities: introducing new satellite data streams, optimally guiding swarms of undersea gliders, and enabling informed decisions by matching probabilistic forecasts to critical thresholds of derived downstream products. New capabilities are incorporated into operational procedures only after they demonstrate a net positive impact on NAVOCEANO products or resource requirements. These transitions span many aspects of operational oceanography: incorporation of new satellite data streams and their covariances with the ocean interior; optimization of in situ sampling from airborne, ship, and autonomous underwater platforms; representation of observation errors and covariances; development of 4D-Variational assimilation capabilities; estimation and correction of flux imbalances at interfaces between coupled models; and estimation and application of forecast uncertainties. Introduction of these capabilities allows more effective use of operational resources, reduces forecast error, and provides better information for operational decisions based on probabilistic forecasts of the ocean environment.
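The idea of matching a probabilistic forecast to a critical threshold of a downstream product can be illustrated with a minimal sketch. All values here are hypothetical (the ensemble, the threshold, and the decision rule are illustrative assumptions, not NRL's operational system):

```python
import numpy as np

# Hypothetical ensemble forecast of significant wave height (m) at one
# location: each value is one ensemble member's prediction.
rng = np.random.default_rng(42)
ensemble = rng.normal(loc=2.8, scale=0.5, size=32)

# Critical threshold for a derived downstream product, e.g. a small-boat
# operations limit (illustrative value only).
THRESHOLD_M = 3.0

# Exceedance probability = fraction of ensemble members above threshold.
p_exceed = float(np.mean(ensemble > THRESHOLD_M))

# A simple go/no-go decision rule driven by that probability.
decision = "no-go" if p_exceed > 0.25 else "go"
print(f"P(exceed {THRESHOLD_M} m) = {p_exceed:.2f} -> {decision}")
```

The point of the sketch is that the ensemble is reduced to an exceedance probability against the product's critical threshold, so the operational decision uses the forecast uncertainty rather than a single deterministic value.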

ID 4.3-02

A historical quality-controlled temperature and salinity dataset for assimilation and climate studies – an international approach

Tim Boyer (GDAC leader), on behalf of the CLIVAR/GSOP workshop team

US National Oceanographic Data Center


High-quality, long-term observations of subsurface ocean temperature and salinity are fundamental to understanding variability and change in the Earth's energy and hydrological cycles, and to discriminating between natural and anthropogenic drivers, particularly now in the context of climate change.

Studies of the ocean's role in the Earth climate system, whether through direct use of historical observations or their assimilation into an increasing number of reanalysis/synthesis efforts, have nevertheless been hampered by a lack of data availability and by the uncertain quality of global datasets. For example, a significant source of uncertainty in ocean heat content calculations (and their regional patterns) arises from the choice of subsurface data used and how those data are screened and quality-controlled.

From the early 1900s until the global deployment of the Argo float array (~2000s), subsurface measurements were collected using a mix of instruments and methods (with varying degrees of calibration and operational expertise); were not necessarily archived electronically at full resolution along with their metadata; were not generally available or closely scrutinized for bias until recently; and suffer from highly variable temporal and spatial coverage, as they were usually funded for purposes other than monitoring global change. All of these factors have contributed to global datasets of mixed or unknown accuracy, sometimes with significant duplicates and biases. In addition, many groups around the world apply complex and differing screening procedures to global datasets, duplicating effort.

Although much progress has been made over the years on the great challenge of rescuing and assembling original temperature and salinity profiles, together with their metadata, into global datasets, as well as on improving quality-control procedures and reducing instrumental biases, additional globally coordinated efforts are required to maximize the quality and consistency of the historical ocean observations and to properly characterise their uncertainties, if we are to make the best and widest use of these expensive and valuable ocean data.

With that in mind, attendees of a recent CLIVAR/GSOP workshop and interested parties in various countries (http://www.clivar.org/organization/gsop/activities/clivar-gsop-coordinated-quality-control-global-subsurface-ocean-climate) are formulating a coordinated approach to quality control of ocean temperature profiles (in the near term) and salinity profiles (in the future). Participating groups will set up a mutually agreed, standardized set of automated quality-control procedures to apply to the existing in situ ocean profile database, while at the same time attempting to augment the archive with ocean profile data that are not yet generally available and to source crucial metadata for existing profiles. Profiles flagged by the agreed quality-control checks will be passed to experts for manual quality control and for final quality assessment in the form of quality flags. The fully quality-controlled dataset will then be made publicly available. A key activity will be to assign errors to each observation based on its source and platform, for ready use in global syntheses and data-assimilating reanalyses. We will describe the approach and work plan that is being developed.
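The automated-then-manual workflow described above can be sketched in miniature. The specific tests, tolerances, and flag values below are illustrative assumptions for this sketch only; the actual agreed-upon checks and flag conventions are what the participating groups are defining:

```python
# Illustrative per-level QC flag values (not an agreed standard).
GOOD, SUSPECT = 1, 3

def qc_temperature_profile(depths_m, temps_c):
    """Run simple automated checks on one temperature profile.

    Returns one flag per (depth, temperature) level. Levels flagged
    SUSPECT would be routed to an expert for manual quality control,
    as described in the abstract above.
    """
    flags = []
    for i, (z, t) in enumerate(zip(depths_m, temps_c)):
        flag = GOOD
        # Gross range check: ocean temperatures outside this interval
        # are physically implausible (illustrative bounds).
        if not (-2.5 <= t <= 40.0):
            flag = SUSPECT
        # Spike check: a level far from the mean of both neighbours
        # is suspect (illustrative 5 degC tolerance).
        elif 0 < i < len(temps_c) - 1:
            neighbour_mean = 0.5 * (temps_c[i - 1] + temps_c[i + 1])
            if abs(t - neighbour_mean) > 5.0:
                flag = SUSPECT
        flags.append(flag)
    return flags

# A profile with a spike at 20 m: the spike and its neighbour are flagged.
profile_flags = qc_temperature_profile([0, 10, 20, 30], [22.1, 21.8, 35.0, 12.4])
print(profile_flags)  # -> [1, 3, 3, 1]
```

In a real system each flagged observation would also carry an error estimate based on its source and platform, so that downstream syntheses and reanalyses can weight it appropriately; the sketch only shows the flagging step.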