
Session 4.1 abstracts



4.1 The Future of Ocean Observations

Session conveners: Pierre-Yves Le Traon, Andreas Schiller and Clemente Tanajura

The table below lists all abstracts for Session 4.1 by author. To read the full abstract click on the title-link.

The unique reference number (ref. no.) relates to the abstract submission process and must be used in any communications with the organisers.

All abstracts from Session 4.1 are available for download as a single PDF.

Ref. No. | Primary Author | Affiliation | Country | Abstract Title | Presentation
S4.1-01 | DiGiacomo, Paul | NOAA/NESDIS | United States | Re-defining Operational: An Evolving Paradigm for Satellite Oceanography | Oral
S4.1-02 | Jacobs, Gregg | NRL | United States | Ocean Applications of the Surface Water and Ocean Topography Mission | Cancelled
S4.1-03 | Miller, Laury | NOAA | United States | Jason-Continuity of Services: Creating a Reference for Low and High Resolution Ocean Surface Topography | TBC
S4.1-04 | Petzrick, Ernest | Teledyne Webb Research | United States | Profiling from 6,000 meters with the APEX Deep float | Poster
S4.1-05 | Roemmich, Dean | Scripps Institution of Oceanography/UCSD | United States | The Future of Global Ocean Observations in the Argo Program | Oral
S4.1-06 | Wang, Xiaochun | JIFRESSE, UCLA | United States | Observing System Simulation Experiments with GNSS-r Sea Surface Height Observation in the Gulf of Mexico | Poster (pdf)


ID 4.1-01

Re-defining Operational: An Evolving Paradigm for Satellite Oceanography

P.M. DiGiacomo1, H. Bonekamp2, C. Brown1, E. Kwiatkowska2, A. O'Carroll2, C. Wilson3

1 NOAA/NESDIS Center for Satellite Applications and Research, College Park, MD, USA

2 EUMETSAT, Darmstadt, Germany

3 NOAA/NMFS Southwest Fisheries Science Center, Pacific Grove, CA, USA


There is a prevailing perception, particularly in the oceans domain, that operational satellite missions need only support near-real-time (NRT) applications and that quality is not a primary driver for operational data. This existing paradigm has significant limitations and does not apply to all operational activities, e.g., fisheries, ecosystem and climate monitoring, and the analyses, applications and services that require an accurate understanding of change occurring over time relative to climatological conditions.

As such, the existing operational paradigm is outdated and must be updated to reflect the reality that operational satellite data need to support a broad spectrum of ocean users and their needs. This includes users traditionally considered research-oriented, as well as applications on time scales ranging from NRT to climate. In particular, space-based ocean measurements are becoming increasingly mature and transitioning into operations, with many operational missions now (or soon) providing altimetry, ocean color, ocean surface vector wind, sea-surface temperature and synthetic aperture radar measurements, among others. As a consequence, operational missions need to implement and maintain robust supporting infrastructure and scientific/technical activities (e.g., reprocessing, cal/val, orbital maneuvers) to ensure the resulting data are fit for the broad spectrum of users and their needs.

In this research and operations mode, operational missions must provide routine and sustained (i.e., operational) data of the highest possible quality, supporting research and applications on time scales spanning NRT to climate. Strong, fundamentally sound science underpins both research and applications. In addition to providing NRT and climate monitoring, assessments, analyses, and services, innovations will result from these operational missions as well as from the crucial complementary research and development (R&D) missions. These R&D missions will in turn provide new and improved measurement capabilities, novel approaches, and key discoveries that will enable synergistic development and ultimately infuse into, improve, and transition into operations.

In this broader context, this paper will focus on operational satellite oceanographic data, products and services with near-real-time, off-line and reprocessed timeliness. It discusses how these data, products and services are strongly interrelated and crucial for diverse users and applications, including weather, climate, and ecosystem monitoring and assessments, marine environmental modeling, and ocean analysis and forecasting. It is concluded that the full benefit of operational satellite ocean services can only be realized if the strong and integral relationship between services with different timeliness and user bases is recognized in all phases of mission development and operations, and if operational satellite oceanography missions are treated as intrinsically serving as climate as well as NRT monitoring assets. This will further the sustained development and implementation of operational oceanography as a whole.

ID 4.1-02

Ocean Applications of the Surface Water and Ocean Topography Mission

G. Jacobs1, J. Richman1, M. Srinivasan2, C. Peterson3

1 Naval Research Laboratory, Stennis Space Center, Mississippi, USA

2 California Institute of Technology Jet Propulsion Laboratory, Pasadena, California, USA

3 Stennis Space Center, Mississippi, USA


Physical processes transport heat and carbon throughout the oceans on scales ranging from ocean basins at 5,000 km to the submesoscale at less than 10 km. The stirring and mixing properties below the mesoscale, O(100 km), remain largely unknown. Circulation at these scales is responsible for transporting half of the heat and carbon from the upper ocean to the deep ocean. SWOT will provide high-spatial-resolution, global measurements of ocean surface topography that will for the first time allow observation of ocean processes down to 4 km, leading to improved ocean circulation models and better prediction of weather and climate, as well as of variations in ocean currents important for navigation, fisheries, and offshore commercial operations.

The NASA Applied Sciences Program actively supports a mission-level data applications approach. A significant program goal is to engage applications-oriented users and organizations to enable them to envision possible applications and end-user needs as a way to increase the benefits of these missions to the nation.

One current experiment highlights the difficulties present satellite observations face in trying to observe and predict the submesoscale circulation across the Gulf Stream just after separation from the continental shelf at Cape Hatteras. Even with data from the Jason-2, CryoSat-2 and AltiKa satellites being assimilated into the models, there is insufficient information in space and time to define the mesoscale, let alone submesoscale, features, such as individual eddies. To mimic the type of data available from SWOT, high-resolution Airborne eXpendable BathyThermograph (AXBT) observations from four flights were added to the data assimilation stream. The first two flights are required to adequately define the mesoscale circulation, with the two additional flights demonstrating skill at predicting submesoscale frontogenesis, which results in a thinned mixed layer. Even with this intensive observation campaign, accurate placement of submesoscale features is difficult.

High-resolution observations from SWOT will provide dramatic improvements to these models. The submesoscale frontogenesis processes targeted here are powered by energy from the mesoscale field. The frontogenesis produces vertical circulation that can greatly increase heat fluxes by bringing cold water to the surface and carrying warmer water away from it; fluxes of nutrients and biota are associated with these features as well.

Another opportunity for supporting applications of SWOT may be in the data products from the AirSWOT mission, an airborne instrument collecting limited inland and coastal data sets as a precursor to SWOT.

Successful strategies to enhance science and practical applications of SWOT data streams will require engaging with, and facilitating exchange between, representatives of the science, societal applications, and mission planning communities.

ID 4.1-03

Jason-Continuity of Services: Creating a Reference for Low and High Resolution Ocean Surface Topography.

L. Miller1, H. Bonekamp2, C. Donlon3, J. Lambin4

1 NOAA/NESDIS, Washington, USA

2 EUMETSAT, Darmstadt, Germany

3 ESA/ESTEC, Noordwijk, The Netherlands

4 CNES, Toulouse, France


Jason Continuity of Services (Jason-CS) is a multi-partner program proposal to extend the ocean surface topography data record provided by TOPEX/Poseidon, Jason-1, Jason-2 and Jason-3 (the latter planned for launch in 2015). An overview of the Jason-CS program is given by Francis and Parisot, 2013 (this symposium). The first and most important objective of the Jason-CS missions is to maintain continuity of the more than 20-year global sea level climate record beyond Jason-3. In addition, the Jason-CS missions are expected to function as contributing and reference missions in a virtual constellation of altimeter missions for the study of mesoscale features in the global ocean. These other altimeter missions will fly SAR or interferometric altimeter techniques to satisfy near-future operational and research requirements to measure sea surface height at much higher resolution. To match its contemporary counterparts and, importantly, to be fully backward compatible with the Jason heritage data records, the Jason-CS altimeter is designed to operate in both a SAR altimeter mode and the traditional pulse-width-limited altimeter mode simultaneously. This dual mode is the so-called interleaved mode, explained in detail in Cullen and Francis, 2013 (this symposium). In contrast to the Sentinel-3 altimeter missions and the CryoSat-2 mission, the Jason-CS interleaved SAR altimeter mode will operate continuously rather than in closed-burst mode, to maximise precision for open-ocean applications. The interleaved mode will provide unique opportunities to understand the spectral characteristics of the different modes of operation, linking the future with the past. This contribution will elaborate the known and anticipated advantages of sea surface height measurement with the Jason-CS interleaved mode for the sea surface height climate data record and for operational oceanography.

ID 4.1-04

Profiling from 6,000 meters with the APEX Deep float

Ernest Petzrick, James Truman & Hugh Fargher

Teledyne Webb Research, Falmouth, Massachusetts, USA


This paper describes the Teledyne Webb Research autonomous APEX Deep profiling float, designed to repeatedly profile to depths of 6,000 meters. Such a profiler was recently deployed in the Puerto Rico Trench and has successfully reported scientific data from depths of over 6,000 meters.

The main elements of the APEX Deep float are a controller, a high-pressure hydraulic pump for the buoyancy engine, and a pressure housing. The housing, which incorporates entry ports for sensors, is protected inside a custom “hard hat”. Profiling from 6,000 meters presented a challenge for the design and construction of each of these elements.

The pressure housing is a glass sphere, the most cost-effective solution for deep profiling hulls, avoiding the use of more expensive materials such as titanium. Although glass spheres have routinely been deployed to 6,000 meters, they are usually not “cycled” multiple times. To understand the viability of glass in multi-use vehicle hulls, cycle tests were conducted on representative glass spheres through 2011. The resulting sphere (constructed from two hemispheres) was small, light and inexpensive enough to satisfy the requirements of the original APEX Deep float. For example, when populated with the controller and hydraulic pump and placed within a custom "hard hat", the resulting float could easily be carried by two people. Stress on the glass sphere was minimized by using multiple smaller ports (as opposed to a single large port) to connect to the inside of the sphere. Finally, a stand was used to minimize the chance of breakage while the float was being carried and deployed.

It was soon realized that the hydraulic pump and bladder system used on existing APEX floats (rated for 2,000 meters) would not function on APEX Deep floats. A larger oil displacement was required for the float to descend to 6,000 meters while still being able to return to the surface. After extensive design studies and testing of prototype pump and bladder combinations, a fixed-displacement pump was chosen, with oil pumped into rubber tubes wrapped longitudinally around the glass sphere to change the overall float volume. In June 2012, the pump and bladder passed bench testing through 150 simulated profiles (at pressure) without failure.

As on existing APEX profiling floats, a pneumatic system was used to optimize satellite communications when the profiler surfaced, by ensuring that the float antenna was positioned well above the surface. Air bladders wrapped around the glass sphere (similar to the rubber tubes used to contain pumped oil) were inflated with air from the partially evacuated glass sphere. This provided a more energy-efficient mechanism than pumping extra oil once the float was at the surface.

Energy consumption was another consideration, given the relatively large energy required to increase buoyancy (by displacing oil) at 6,000 meters. First, the oil displacement required to reach the target depth of 6,000 meters was carefully calculated to avoid any overshoot. To ascend from this target depth, oil was then pumped in small increments to minimize pumping against pressure as much as possible, while still maintaining an ascent time of around 20 hours. Using this approach, a total energy consumption of around 90 kJ was achieved for a typical 6,000 meter profile, including CTD capture and telemetry. Plans are to reduce this energy consumption further by calculating optimal oil displacement increments, given data from the 6,000 meter descents currently underway.
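The quoted energy figure can be sanity-checked from first principles: the work to push oil out against ambient pressure at depth is roughly P * dV. A minimal sketch, assuming a displaced volume of about 1.5 liters (an illustrative figure, not a published specification of the float):

```python
# Back-of-envelope estimate of buoyancy-pumping energy at depth.
# The displaced oil volume (1.5 L) is an illustrative assumption;
# seawater density and depth are representative values.
RHO_SEAWATER = 1025.0   # kg/m^3
G = 9.81                # m/s^2

def pump_energy_joules(depth_m, displaced_litres):
    """Work to displace oil against ambient hydrostatic pressure (P * dV)."""
    pressure_pa = RHO_SEAWATER * G * depth_m   # ~60 MPa at 6,000 m
    return pressure_pa * (displaced_litres / 1000.0)

energy = pump_energy_joules(6000, 1.5)   # on the order of 90 kJ
```

This idealized number matches the order of magnitude of the reported ~90 kJ per profile; the real budget also includes CTD capture, telemetry, and the incremental-pumping ascent strategy described above.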

For efficiency of data transfer, science and engineering data were transmitted from the float to the Iridium satellite system in a compressed binary format. To avoid having to retransmit interrupted data packets, the ZMODEM protocol was used. This resulted in lower Iridium communication costs and less time spent on the surface (where the float is vulnerable to damage). All science and engineering data were stored on a removable SD card within the float. SD cards come in a variety of memory sizes; a 4-gigabyte card was used for the current deployment. By using a larger card, APEX Deep profilers can readily include enough flash memory to store more data than is collected over the entire lifetime of a standard APEX float.
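To illustrate the kind of compact binary encoding described here, the sketch below packs CTD samples as scaled fixed-point integers and compresses them before (hypothetical) telemetry. The field layout and scale factors are assumptions for illustration, not the actual APEX Deep wire format:

```python
import struct
import zlib

# Hypothetical compact encoding for CTD samples: pressure (dbar),
# temperature (deg C) and salinity (PSU) stored as scaled 32-bit
# integers, then deflate-compressed before transmission.
SAMPLE_FMT = "<iii"   # assumed layout, not the actual APEX Deep format

def encode(samples):
    packed = b"".join(
        struct.pack(SAMPLE_FMT, round(p * 10), round(t * 1000), round(s * 1000))
        for p, t, s in samples)
    return zlib.compress(packed)

def decode(payload):
    raw = zlib.decompress(payload)
    size = struct.calcsize(SAMPLE_FMT)
    return [(p / 10, t / 1000, s / 1000)
            for p, t, s in (struct.unpack(SAMPLE_FMT, raw[i:i + size])
                            for i in range(0, len(raw), size))]

samples = [(6001.2, 1.843, 34.712), (5950.0, 1.861, 34.705)]
payload = encode(samples)
```

Fixed-point scaling caps each sample at 12 bytes before compression, which keeps surface time and per-byte Iridium costs down relative to sending ASCII records.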

Once assembled, the profiler hydraulic system was bench-tested at full pressure using a 'copper ocean' assembly, in which the CTD pressure port was attached to a tube containing water at high pressure. This was particularly useful for ensuring that the APEX Deep float completed the required park/profile/ascend/telemetry phases while each step was closely monitored through a connected laptop. An external power supply was attached to the float during these tests to avoid draining the batteries.


Figure 1: Measured pressure versus temperature and salinity from 6,000 meters.

A complete prototype system was successfully pool-tested in July and August 2012. Field tests followed with a deployment in the Puerto Rico Trench area, which provides ocean depths of over 7,000 meters. Initial science data collected during this deployment included pressure, temperature and salinity, while engineering data included power consumption. Figure 1 shows typical science data collected by the APEX Deep float during tests in March 2013, with pressure measurements starting at around 6,000 dbar.

Since regions of the ocean with depths over 6,000 meters are limited, planned enhancements include a downward-looking altimeter to prevent the profiler from hitting the bottom when drifting into shallower regions. This would also allow incrementally adjusted buoyancy to descend to any required altitude above the ocean floor.

ID 4.1-05

The Future of Global Ocean Observations in the Argo Program

D. Roemmich1, S. Wijffels2 and the Argo Steering Team3

1 Scripps Institution of Oceanography UCSD, La Jolla CA, USA

2 CSIRO/Centre for Australian Weather and Climate Research, Hobart TAS, Australia

3 http://www.argo.ucsd.edu/members.html


The Argo Program has revolutionized the systematic collection of global ocean observations for a broad range of research (http://www.argo.ucsd.edu/Bibliography.html) and operational applications. Since 2007 Argo has maintained an array of more than 3000 profiling floats, providing near-real-time temperature/salinity profiles (over a million so far) and velocity measurements, covering the global oceans from 60°S to 60°N and 0 to 2000 m depth.

New capabilities in profiling floats include increased buoyancy control, more robust designs, rapid bidirectional satellite communications, and ice-avoidance software. These improvements are enabling longer float lifetimes, reduction in grounding, bio-fouling and array divergence problems, higher vertical resolution in profile data, profiling close to the sea surface for temperature and salinity structure, and underway changes in float mission parameters. Considering the new capabilities, enhancements to the Argo array were recommended by the OceanObs’09 conference, and were recently endorsed by the Argo Steering Team. These enhancements include poleward extension of Argo through the seasonal ice zones, sampling of all marginal seas to be implemented through regional partnerships, and increased density of spatial coverage in energetic western boundary regions and along the equator. These enhancements have begun but will take some years to complete, and will require additional floats. Argo’s top priority is to sustain its original mission and to keep improving its high data quality, but increases in the number of active floats are allowing implementation of enhancements.

A major future evolution of Argo will be its extension into the deep ocean, profiling beyond 2000 m to the ocean bottom. Deep Argo floats are being developed, and successful deployments have been carried out using 4000 and 6000 m designs. A CTD with the improved sensor stability needed for abyssal measurements is under parallel development. Objectives of Deep Argo, in combination with satellite missions including altimetry and gravity, will include closure of the sea level, ocean mass, and energy budgets on regional and global scales. Deep Argo will also provide new information on ocean circulation and water mass formation and properties, as well as many other new applications. For ocean data assimilation modeling, Deep Argo will mitigate the lack of observations below 2000 m.

A second major evolution of Argo is the addition of biological and biogeochemical sensors on Argo floats. About 200 Argo floats already carry dissolved oxygen, and initiatives in several nations will add nitrate, pH, and bio-optical sensors to a subset of Argo floats. Challenges include ongoing improvement in sensor stability and development of data management protocols, especially for delayed-mode quality control.

The largest challenges in the future of Argo are (i) to sustain these essential and valuable observations across scientific generations and despite shrinking national resources, and (ii) to achieve international consensus on global deployment of Argo floats, including in EEZs.

ID 4.1-06

Observing System Simulation Experiments

with GNSS-r Sea Surface Height Observation in the Gulf of Mexico

Xiaochun Wang¹, Tony Lee² and Zhijin Li²

¹Joint Institute For Regional Earth System Science and Engineering,

University of California at Los Angeles, Los Angeles, USA

² Jet Propulsion Lab, California Institute of Technology, Pasadena, USA


Radio signals from the Global Navigation Satellite System (GNSS) reflected off the sea surface (GNSS-r) may provide sea surface height (SSH) measurements that can complement traditional satellite altimetry in characterizing mesoscale features of the ocean. The uncertainty of GNSS-r measurements is much larger than that of existing altimeters; however, there are large numbers of GNSS satellites, and the large volume of GNSS-r data received by space-borne receivers can compensate for the large uncertainty (e.g., through spatial and temporal averaging). This study examines the impact of synthetic SSH observations derived from GNSS-r signals in constraining Loop Current eddy shedding in the Gulf of Mexico (GoM), which has important implications for the physical, biological, and environmental conditions in the GoM. The study is done through Observing System Simulation Experiments (OSSEs) using a high-resolution (6-km) data-assimilative ocean circulation model of the GoM. Synthetic SSH observations were derived from a model control simulation by sampling the model along the tracks of assumed space-borne receivers. In one case we assume one receiver on the International Space Station (ISS), which has a low-inclination orbit. In the second case we assume six receivers on the planned COSMIC-II polar mission, with high-inclination orbits (more receivers yield more GNSS-r data). A 3-D variational method is used to assimilate the synthetic SSH data in perturbed model runs. The ability of the model to reproduce the Loop Current eddy shedding process is examined through the OSSEs. The results suggest that GNSS-r data can potentially be useful in constraining Loop Current eddy shedding when there are multiple space-borne receivers.
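The averaging argument above can be illustrated with a toy calculation: many noisy GNSS-r samples of the same SSH value, once averaged, approach the truth, with the error of the mean shrinking roughly as noise_std / sqrt(n). The noise level and sample counts below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Toy illustration of noise reduction by averaging GNSS-r samples.
# All values are illustrative assumptions.
rng = np.random.default_rng(0)

true_ssh_m = 0.42     # assumed "true" SSH anomaly at a point
noise_std_m = 0.5     # per-sample noise, far larger than altimeter noise

for n in (1, 100, 10_000):
    obs = true_ssh_m + noise_std_m * rng.standard_normal(n)
    err = abs(obs.mean() - true_ssh_m)
    print(f"n={n:>6}: |mean - truth| = {err:.4f} m")
```

With 10,000 samples the expected error of the mean is about 0.005 m, illustrating how a dense constellation of receivers can trade per-sample precision for volume.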