The Search for Acceleration, part 1
June 17, 2013
A while back I wrote a tongue-in-cheek post alleging that NASA satellites are being used to raise the global sea levels. The point was that the most commonly quoted 20th century sea level data had the average rise rate at about 1.8 mm per year, but the satellite data since 1993 had the rise rate at about 3 mm per year.
I have always wondered if those two facts could be reconciled. If they are both true, does that mean there has been an extreme acceleration in sea level rise rates at the tail end of the 20th century? Shouldn’t such an acceleration be apparent in the tide gauge data? So I have spent much time recently searching the data for such an acceleration.
For all my searching, the outcome is still somewhat ambiguous. This is the beginning of a series of posts on that search. In this post I will cover the basic methods I have used, and present results in subsequent posts.
The map in figure 1 (click to enlarge) shows all the locations of tide gauge data from the RLR data set, which is maintained by the Permanent Service for Mean Sea Level (PSMSL). The PSMSL overview notes…
Established in 1933, the Permanent Service for Mean Sea Level (PSMSL) has been responsible for the collection, publication, analysis and interpretation of sea level data from the global network of tide gauges. It is based in Liverpool at the National Oceanography Centre (NOC), which is a component of the UK Natural Environment Research Council (NERC).
The PSMSL data is a comprehensive collection of sea level data. Figure 1 shows 1384 RLR data sites; the circles are 100 km in radius and centered on each site. This seems like a gold mine of data, but most of it is not useful for my stated purpose, because it either covers an insufficient length of time or has too many “holes” in the period that it does cover.
We need sites with good records of sea level for at least the last half century. This drastically limits the number of adequate sites and their global distribution. Figure 2 (click to enlarge) shows all the sites with data starting in 1955 at the latest, ending in 2005 at the earliest, and having data for at least 90% of the months between 1955 and 2005. Of the original 1384 sites, only 112 meet these criteria and their global distribution is greatly reduced.
The methods I will employ to mine the data start by selecting sites that meet various completeness criteria. Typically, the criterion will be all sites in some region that have at least 90% complete data over some specified time period.
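As a sketch of this selection step (the site names, list layout, and completeness threshold here are illustrative assumptions, not the PSMSL file format), a completeness filter over a fixed span of months might look like:

```python
# Sketch: selecting tide-gauge sites that meet a completeness criterion.
# 'records' maps a hypothetical site name to a list of monthly sea level
# values (None = missing month) covering Jan 1955 .. Dec 2005 (612 months).

def is_complete(monthly, min_fraction=0.90):
    """True if at least min_fraction of the months have data."""
    present = sum(v is not None for v in monthly)
    return present / len(monthly) >= min_fraction

records = {
    "site_a": [1.0] * 612,                 # fully complete record
    "site_b": [1.0] * 500 + [None] * 112,  # ~82% complete: rejected
}

selected = [name for name, m in records.items() if is_complete(m)]
print(selected)  # ['site_a']
```

The same filter, applied with the 1955–2005 window and 90% threshold, is what reduces the 1384 RLR sites to the 112 shown in figure 2.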
Needle in a haystack
Looking for acceleration in sea level data can be like looking for a needle in a haystack. The average global sea level rise rate for the 20th century was about 1.8 mm per year. But the fluctuations from year to year or even month to month at any particular site or region can be hundreds of times greater than the yearly average.
Consider the sea level data from the North Sea port city of Den Helder in the Netherlands. Figure 3, below (click to enlarge), shows the type of processing I use to analyze sea level data in order to find the needle of acceleration in the haystack of data. Read the figure caption for details. Note that 3A through 3F all show a red bar in the upper left corner representing the amount of globally averaged sea level rise for the entire 20th century; compare it to the short-term fluctuations at Den Helder. The wide fluctuations of the Den Helder data are typical of tide gauge sea level data from around the world.
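To illustrate why the trend is recoverable despite the noise, here is a sketch with purely synthetic data (the noise level and random seed are arbitrary assumptions, not Den Helder's actual statistics): a least-squares linear fit pulls a 1.8 mm/year trend out of monthly fluctuations far larger than the yearly rise.

```python
import numpy as np

# Synthetic illustration: a 1.8 mm/yr trend buried under month-to-month
# noise tens of times larger than the annual rise.
rng = np.random.default_rng(0)
t = np.arange(612) / 12.0                        # years; monthly samples
sea_level = 1.8 * t + rng.normal(0, 60, t.size)  # mm, with 60 mm noise

# A least-squares linear fit still recovers the underlying trend,
# because the noise averages out over many hundreds of samples.
rate, intercept = np.polyfit(t, sea_level, 1)
print(round(rate, 2))  # close to 1.8 mm/yr
```

The long record length is doing the work here: with 612 monthly samples, the uncertainty on the fitted slope shrinks to a small fraction of a mm/year even with very noisy data.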
Not rise rates, but change in rise rates
I want to stress that in most cases I will not be concerned with the actual sea level rise rate; rather, I am looking for a change in the sea level rise rate. This is an important distinction, because tide gauge data from two or more sites in the same region often have very different rise rates but very similar changes in sea level rise rate over a long period of time. For example, consider Wernemunde, Germany and Stockholm, Sweden, both in the Baltic Sea region and about 600 km apart. Figure 4a shows their sea level data from 1940 to 2010, with a 2 year Fourier long pass filter. A linear fit to the Wernemunde data over that period gives an overall sea level rise rate of 1.68 mm/year, while Stockholm had an overall rise rate of -3.35 mm/year. The Wernemunde and Stockholm sea levels are obviously dominated by different local effects.
But if the Wernemunde and Stockholm data are detrended (the best linear fit of the data subtracted from the data itself) then we can see that there are remarkable similarities (see figure 4b). Those two data sets are clearly measuring a combination of signals: local, regional, and global. While their sea level rise rates may be very different, their changes in sea level rise rates are very similar.
Mathematically speaking, let f1(t) and f2(t) be sea level data for two sites in the same region, and let g1(t) and g2(t) be the corresponding detrended series, gi(t) = fi(t) - (ai + bi·t), where ai + bi·t is the best linear fit to fi(t). The rise rates b1 and b2 may differ greatly, but if the two sites share the same regional and global signals, then g1(t) ≈ g2(t).
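The detrending step described above can be sketched as follows. The data here are synthetic (a shared regional oscillation plus two very different linear trends, loosely echoing the Wernemunde/Stockholm contrast), not the actual PSMSL records:

```python
import numpy as np

def detrend(t, y):
    """Subtract the best linear fit (the site's own trend) from the data."""
    b, a = np.polyfit(t, y, 1)  # slope, intercept of the linear fit
    return y - (a + b * t)

# Hypothetical illustration: two sites share the same regional signal
# but have very different linear trends.
t = np.arange(0, 70, 1 / 12.0)                # 70 years of monthly samples
regional = 20 * np.sin(2 * np.pi * t / 18.6)  # shared signal, mm
site1 = 1.68 * t + regional                    # rising sea level
site2 = -3.35 * t + regional                   # falling (e.g. land uplift)

d1, d2 = detrend(t, site1), detrend(t, site2)
print(np.allclose(d1, d2))  # True: identical after detrending
```

In this idealized case the detrended series agree exactly; in real tide gauge data the agreement is only approximate, since each site also carries its own local noise.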
The example illustrated in figure 4 shows only two sets of tide gauge data from the Baltic Sea region. When I finally present data for the Baltic Sea region in a later post I will use about 25 tide gauge sites. The similarity between these 25 data sets after detrending is quite amazing (at least to me). It leaves me with great confidence that those sets of detrended data are measuring nearly the same regional and global signals while having very different local signals. But the question still remains: how to combine the data for those sites?
I will typically combine such data sets in two different ways: a simple unweighted year by year average of the detrended data, and a weighted average of the detrended data. The weighting will be based on the distance between sites. So, for example, if I choose a weighting threshold of 200 km, then a site that is more than 200 km away from the next closest site will be weighted as “1.” A site that is within 200 km of n other sites will be weighted as 1/(n+1). Figure 3 shows an example of the site weighting.
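The weighting rule can be sketched like this. The coordinates are made up for illustration, and I have assumed a great-circle (haversine) distance, which the description above does not specify:

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(a))

def site_weights(sites, threshold_km=200):
    """Weight = 1/(n+1), where n = number of other sites within threshold."""
    w = {}
    for name, pos in sites.items():
        n = sum(1 for other, q in sites.items()
                if other != name and haversine_km(pos, q) <= threshold_km)
        w[name] = 1.0 / (n + 1)
    return w

# Hypothetical coordinates: two close sites and one isolated site.
sites = {"a": (59.3, 18.1), "b": (59.4, 18.3), "c": (54.2, 12.1)}
w = site_weights(sites)
print(w)  # a and b each get weight 0.5; the isolated c gets 1.0
```

This down-weights clusters of nearby gauges so that a densely instrumented stretch of coast does not dominate the regional average.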
I will use a Fourier technique to remove yearly signals. That is, the Fourier transform of a sea level time series will have the components corresponding to periods of 12, 6, 4, 3, and 2 months removed. Additional long pass smoothing may be applied by removing all Fourier components with periods shorter than some threshold, but usually I will apply a Gaussian filter after the yearly signal has been removed. It will always be noted when I apply these techniques.
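A minimal sketch of the yearly-signal removal, assuming a whole number of years of monthly data so that the 12, 6, 4, 3, and 2 month periods land exactly on FFT bins (the test signal is synthetic):

```python
import numpy as np

def remove_yearly(monthly):
    """Zero the Fourier components at periods of 12, 6, 4, 3, 2 months.

    Assumes len(monthly) is a whole number of years, so each period
    falls exactly on an FFT bin (bin k has period n/k months).
    """
    n = len(monthly)
    spec = np.fft.rfft(monthly)
    for period in (12, 6, 4, 3, 2):
        spec[n // period] = 0.0
    return np.fft.irfft(spec, n)

# Synthetic check: a pure annual cycle riding on a constant level is
# reduced to (nearly) just the constant.
t = np.arange(240)                            # 20 years of months
sig = 5.0 + 30 * np.sin(2 * np.pi * t / 12)   # 30 mm annual cycle
clean = remove_yearly(sig)
print(np.max(np.abs(clean - 5.0)) < 1e-8)  # True
```

Real records have gaps and non-integer year lengths, so in practice the series would need infilling or windowing before this step; the sketch only shows the core idea of zeroing the annual harmonics.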
In later posts I will analyze data and present results from various regions around the world.