- JoNova - http://joannenova.com.au -

Study shows ARGO ocean robots uncertainty was up to 100 times larger than advertised

Posted By Joanne Nova On June 3, 2015 @ 4:26 pm In Global Warming | Comments Disabled

The oceans contain 90% of the heat energy on the surface of the Earth, which makes them “kinda important”. There are claims that the missing heat went into the oceans, which are allegedly warming by five thousandths of a degree per year (still a lot less than the models predicted). The ARGO array of 3,000 ocean buoys, deployed from mid-2003, is a vast improvement on the occasional sampling from ships that preceded it, but each single thermometer has to represent a vast 200,000 cubic kilometers of ocean.
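A rough back-of-envelope check of that sampling density (the ocean surface area and the ~2,000 m Argo profiling depth below are assumed round figures, not from the article):

```python
# Rough sampling-density check (illustrative round figures, not from the article).
OCEAN_AREA_KM2 = 3.6e8    # global ocean surface area, roughly 3.6e8 km^2
PROFILE_DEPTH_KM = 2.0    # Argo floats profile roughly the upper 2000 m
N_FLOATS = 3000

sampled_volume_km3 = OCEAN_AREA_KM2 * PROFILE_DEPTH_KM   # ~7.2e8 km^3 sampled
volume_per_float = sampled_volume_km3 / N_FLOATS         # ~240,000 km^3 each
print(f"Each float represents roughly {volume_per_float:,.0f} km^3 of ocean")
```

Which lands in the same ballpark as the 200,000 km³ figure above.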

The original Argo Science Report had an expected temperature sensor uncertainty of 0.005°C. But it’s just not possible to measure the ocean temperature that accurately. Each thermometer may be accurate in a laboratory to 0.005°C, but thermal noise in the ocean is a different beast altogether. The four-kilometer-deep swirling mass of eddies ranges from 0°C to 30°C. It is not a well-mixed swimming pool at one temperature being measured 3,000 times simultaneously; the statistics are entirely dissimilar.
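The statistical point can be sketched numerically. If each profile scatters about the true regional mean with a representativeness “noise” of order 0.5°C (an assumption here, taken from the error Hadfield found), then even with purely random errors the uncertainty of the network average shrinks only as 1/√N:

```python
import math

instrument_precision = 0.005    # deg C: lab accuracy of a single sensor
representativeness_noise = 0.5  # deg C: assumed scatter of one profile about the regional mean
n_profiles = 3000               # floats reporting "simultaneously"

# Standard error of the mean for independent random errors: sigma / sqrt(N)
standard_error = representativeness_noise / math.sqrt(n_profiles)
print(f"Standard error of the network mean: {standard_error:.4f} deg C")
# ~0.009 deg C: larger than the claimed 0.005 deg C/year signal, and that is
# the best case, before any systematic (non-random) errors are considered.
```

The laboratory precision of the sensor is not the limiting factor; the scatter of what each float happens to sample is.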

I went looking for papers on error estimates and found Hadfield 2007.

The Hadfield study compared the new ARGO robotic buoys to other ways of measuring ocean temperatures along a slice across the North Atlantic. The results are fairly devastating for claims that the oceans are heating by 0.005°C per year. Hadfield et al. found that the Argo network made errors of around 0.5°C, and up to 2°C in one area.

Essentially, a ship cruised across the Atlantic in mid-2005, stopping to take precise measurements along the way by lowering an accurate sensor (a “CTD”). The Hadfield study compared the ARGO water-temperature estimates against that hydrographic section. One limitation was the newness of the ARGO array at the time: it had around 2,000 buoys in 2005 and didn’t reach the full complement of 3,000 until 2007, so the sampling error will be somewhat smaller now than it was then. But errors of this order of magnitude can’t be fixed with 50% more data points.
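The comparison metric Hadfield et al. report is a root-mean-square (RMS) difference between colocated Argo-based and CTD temperatures. A minimal sketch of how such a figure is computed, using made-up example temperatures:

```python
import math

# Hypothetical colocated temperatures (deg C) at stations along a section.
ctd  = [18.2, 17.9, 16.5, 12.1, 8.4, 5.2]  # ship-lowered CTD measurements
argo = [18.7, 17.3, 17.2, 11.5, 9.1, 4.6]  # Argo-based estimates at same spots

diffs = [a - c for a, c in zip(argo, ctd)]
rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
print(f"RMS difference: {rms:.2f} deg C")  # ~0.62 deg C for these numbers
```

Note that positive and negative differences do not cancel in an RMS: every mismatch, in either direction, counts toward the error.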

Figure 6. Position of the cruise track (thick line) and the temperature profiles used to estimate the temperature field along the hydrographic section; circles indicate positions of profiles sampled within 30 days of the cruise CTD, pluses indicate positions of profiles sampled more than 30 days before or after the cruise CTD. The temporal spread of Argo data used in the OI spans 62 days before the cruise CTD to 72 days after the cruise CTD.

Figure 7. (a) Signal, (b) noise, and (c) the signal-to-noise ratio (SNR) across 36°N (Cape Hatteras–Mediterranean). Figures 7a and 7b are in degrees Celsius and Figure 7c is unitless. Contours of 1 are shown in Figure 7c.

As far as heat content goes, I notice that models predict about 0.7 W/m² for the radiative effect of CO2 on the oceans. But in Hadfield, the sampling error was, quote, “10–20 W/m²”. And it was even worse in the Gulf Stream and north of 40°N (a large part of the world), where it was 50 W/m².
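Put as a signal-to-noise calculation (the 0.7 W/m² model figure is the one quoted above; the rest follows from Hadfield’s sampling errors):

```python
co2_signal = 0.7                         # W/m^2: modelled radiative effect on the oceans
sampling_error_typical = (10 + 20) / 2   # W/m^2: midpoint of Hadfield's 10-20 range
sampling_error_gulf = 50                 # W/m^2: Gulf Stream / north of 40N

# Ratio of sampling error to the signal being sought (~21x and ~71x)
print(f"Typical error is {sampling_error_typical / co2_signal:.0f}x the signal")
print(f"Gulf Stream error is {sampling_error_gulf / co2_signal:.0f}x the signal")
```

The error bars are tens of times larger than the effect the dataset is being used to detect.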

A minuscule change in degrees,
In all the world’s oceans and seas,
Should not cause dismay,
As on any one day,
Sea-water could boil or could freeze.

— Ruairi

Abstract

The accuracy with which the Argo profiling float dataset can estimate the upper ocean temperature and heat storage in the North Atlantic is investigated. A hydrographic section across 36°N is used to assess uncertainty in Argo-based estimates of the temperature field. The root-mean-square (RMS) difference in the Argo-based temperature field relative to the section measurements is about 0.6°C. The RMS difference is smaller, less than 0.4°C, in the eastern basin and larger, up to 2.0°C, toward the western boundary. In comparison, the difference of the section with respect to the World Ocean Atlas (WOA) is 0.8°C. For the upper 100 m, the improvement with Argo is more dramatic, the RMS difference being 0.56°C, compared to 1.13°C with WOA. The Ocean Circulation and Climate Advanced Model (OCCAM) is used to determine the Argo sampling error in mixed layer heat storage estimates. Using OCCAM subsampled to typical Argo sampling density, it is found that outside of the western boundary, the mixed layer monthly heat storage in the subtropical North Atlantic has a sampling error of 10–20 W/m² when averaged over a 10° × 10° area. This error reduces to less than 10 W/m² when seasonal heat storage is considered. Errors of this magnitude suggest that the Argo dataset is of use for investigating variability in mixed layer heat storage on interannual timescales. However, the expected sampling error increases to more than 50 W/m² in the Gulf Stream region and north of 40°N, limiting the use of Argo in these areas.

The paper came out back in January 2007. Strangely, there are only nine citations.

Results:

Over much of the section, the Argo-based estimates of temperature agree with the cruise measurements to within 0.5°C. However, there are several regions in the 500–1000 m layer west of about 40°W where the differences exceed this value (Figure 9a). Furthermore, at the western boundary, west of 74°W, the temperature is more than 2°C warmer in the Argo section than in the cruise section. As expected, the climatological values from the WOA typically show larger differences from the cruise section than the Argo-based sections, particularly in the surface waters across the section (Figure 9b, upper panel) and the upper 1200 m at 65–73°W.

REFERENCE

Hadfield, R. E., Wells, N. C., Josey, S. A., and Hirschi, J. J-M. (2007) On the accuracy of North Atlantic temperature and heat storage fields from Argo, Journal of Geophysical Research, Vol. 112, C01009, doi:10.1029/2006JC003825


URL to article: http://joannenova.com.au/2015/06/study-shows-argo-ocean-robots-uncertainty-was-up-to-100-times-larger-than-advertised/

Copyright © 2008 JoNova. All rights reserved.