The Weekly Carboholic: independent statisticians reject recent global cooling claims in blind analysis

Climate disruption deniers have been claiming for years now that the global temperature has been cooling down, even though the temperature data clearly shows that it isn’t. Scientists and statisticians have pointed out that, mathematically speaking, the recent reduced warming trend is well within the noise, or put another way, it’s weather, not climate.

A new report by the Associated Press confirms what many of us knew already – the deniers’ claims don’t hold water, statistically speaking. The report is intriguing because the AP provided the data to four independent statisticians without telling them what it was, and all four found that the slower warming of the past decade is statistically insignificant.

In a blind test, the AP gave temperature data to four independent statisticians and asked them to look for trends, without telling them what the numbers represented. The experts found no true temperature declines over time.

“If you look at the data and sort of cherry-pick a micro-trend within a bigger trend, that technique is particularly suspect,” said John Grego, a professor of statistics at the University of South Carolina.

Furthermore, the data that the AP sent to the statisticians came from two different sources – the National Climatic Data Center (NCDC), run by NOAA, and the satellite dataset, preferred by climate disruption deniers, that is generated by scientists John Christy and Roy Spencer of the University of Alabama in Huntsville. In both cases, the statisticians found no statistically significant trends over the last ten years.

Statisticians who analyzed the data found a distinct decades-long upward trend in the numbers, but could not find a significant drop in the past 10 years in either data set. The ups and downs during the last decade repeat random variability in data as far back as 1880.

Saying there’s a downward trend since 1998 is not scientifically legitimate, said David Peterson, a retired Duke University statistics professor and one of those analyzing the numbers.

Identifying a downward trend is a case of “people coming at the data with preconceived notions,” said Peterson, author of the book “Why Did They Do That? An Introduction to Forensic Decision Analysis.” (emphasis mine)

The AP interviewed Don Easterbrook, who claimed that “We started the cooling trend after 1998. You’re going to get a different line depending on which year you choose.” According to one of the statisticians, the fact that you have to choose 1998 as your starting point in order to observe a (statistically insignificant) cooling trend is part of the problem.

Grego produced three charts to show how choosing a starting date can alter perceptions. Using the skeptics’ satellite data beginning in 1998, there is a “mild downward trend,” he said. But doing that is “deceptive.”

The trend disappears if the analysis starts in 1997. And it trends upward if you begin in 1999, he said.

This is what’s referred to in statistics as “endpoint sensitivity,” and it’s the main reason that climate disruption deniers like Easterbrook can appear and sound so reasonable when they’re actually misusing or misunderstanding the data.
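
For readers who want to see endpoint sensitivity for themselves, here is a minimal sketch in Python. It uses synthetic data – a small warming trend plus weather-scale noise and an artificial 1998-style El Nino spike – rather than the actual NCDC or UAH series, which are publicly available.

```python
# Endpoint sensitivity: the sign of a short-term trend can flip depending
# on the chosen start year. Synthetic stand-in data, not the real record.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1995, 2010)
# ~0.015 C/yr underlying warming plus ~0.1 C of year-to-year "weather"
temps = 0.015 * (years - 1995) + rng.normal(0.0, 0.1, years.size)
temps[years == 1998] += 0.3  # artificial El Nino-style spike

for start in (1997, 1998, 1999):
    sel = years >= start
    slope = np.polyfit(years[sel], temps[sel], 1)[0]
    print(f"trend starting {start}: {slope:+.4f} C/yr")
```

Starting the fit at the artificial 1998 spike tends to flatten or reverse the slope, while starting a year earlier or later does not – exactly the behavior Grego describes.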

———-

Melting glaciers releasing pollutants from decades ago

A study published in the journal Environmental Science & Technology has revealed a new and troubling aspect to climate disruption – as glaciers melt, they are releasing persistent organic pollutants like DDT, PCBs, other pesticides, and synthetic musks (chemicals that mask body odor).

The scientists studied the annual sediment layers in a high alpine lake in Switzerland and found that the annual flux of pollutants varied consistently across all the pollutants studied – the fluxes started low in the 1950s, peaked in the 1960s and 70s, dropped off in the 1980s, and then rose to a new peak in the late 1990s. Yet production of all the pollutants except the musks had ceased by 1986 at the latest, and the musks have been in roughly constant production globally since the late 1980s. The image at right illustrates these peaks for the various pollutants the scientists studied.

According to the study, the first peak corresponds closely to when the production of the various pollutants peaked, either in Switzerland or in continental Europe. That peak likely is a result of airborne delivery of the pollutant, either by way of dust or precipitation depositing the pollution in the lake and surrounding land directly. But since there has been no production (or constant production) of the pollutants in decades, it’s extremely unlikely that dust or rain/snow is responsible for the second peak.

In addition, the authors compare the results from the high alpine, glacial melt-fed lake to several other lower altitude lakes. The comparison shows that the low altitude lakes do not show the same spike in pollutants in the late 1990s that the alpine lake does, but they do show similar dust/precipitation driven spikes in the 1950s, 60s, and 70s.

As a result, the authors hypothesized that glacial ice had been accumulating pollutants since the 1960s and 70s and then started releasing them into the lake as the pollution-laden ice melted. And given the strength of their data, they’re almost certainly correct.

The ramifications of this are significant. Other studies have found recent increases in pollutants around the world even though production of those pollutants stopped decades ago: pesticides have been discovered in alpine lakes in the Italian Alps and the Canadian Rockies, and Antarctic penguins have been found with old DDT in their bodies. If this result holds for other glacially-fed lakes around the world (and there’s no reason to believe it won’t), then dangerous pollutants that environmentalists thought had been largely phased out will return, and could cause ecological damage similar to what they caused decades ago (DDT-thinned eggshells, fishing limitations due to PCBs, etc.) – all as a result of glacier melt caused or enhanced by climate disruption-driven warming. And the study points out that the pollutants found in the lake so far are not likely to be everything the glacier holds:

The burden of pollutants in Lake Oberaar sediment due to glacier melting is already in the same range as the earlier accumulation from direct atmospheric input. The undiminished increase of the fluxes of many organohalogens into the sediment of Lake Oberaar does not yet prefigure an exhaust of the glacial inventory of these contaminants.

In other words, the environmental toll of these pollutants isn’t over yet by a long shot.

———-

IEA: climate treaty necessary to keep energy prices low

There are many reasons to address climate disruption, ranging from saving species to reducing U.S. dependence on foreign oil to reducing the chance of catastrophic drought. The economy is usually not considered one of them, especially by those who have a vested interest in maintaining their own profits at the expense of the environment and global climate. However, there are those who say that addressing climate change is critical to maintaining a healthy future economy. According to a new Reuters article, we can now add the International Energy Agency (IEA) to that small but growing list.

Reuters interviewed Fatih Birol, author of the IEA’s World Energy Outlook, and he said that the world needs to work toward a carbon dioxide (CO2) concentration of no more than 450 ppm in order to keep energy costs from skyrocketing by 2030. According to Birol’s estimates, Europe alone would see annual energy costs roughly triple compared to the average of what Europe has paid over the last 30 years, rising from $160 billion per year to $500 billion.

Birol also estimates that oil prices will reach $100 per barrel by 2015 and $190 per barrel by 2030. Given the evidence that the high oil prices of 2008 helped trigger the global recession, this should make the U.S. and other oil-dependent countries nervous. And the global oversupply of natural gas that is keeping prices low in the U.S. this year won’t last – Birol estimates that demand for natural gas will far outstrip supply by 2030.

The Guardian is reporting that an anonymous IEA whistleblower is claiming that US pressure has been applied to redefine the point at which peak oil occurs. If this is true and can be verified, then peak oil is probably much closer than previously expected and Birol’s estimates are very likely optimistic. Similarly, Reuters doesn’t discuss whether Birol has any coal estimates or not, but the USGS has pointed out that the U.S. could be approaching “peak coal” as well, after which the price of energy would skyrocket.

Diversifying energy out of carbon-based fossil fuels makes sense from an environmental perspective, from a climate disruption perspective, from a green jobs perspective, and from an economic perspective. All that remains is for the world’s governments to accept that it makes sense from a political perspective as well.

———-

Floating cities as a response to sea level rise

Some ideas are just too cool and deserve mention simply because they’re cool. According to the NYTimes blog Green Inc., the Dutch are designing floating cities to replace or augment land-based cities as the global sea level rises over the next few centuries. The floating cities would be connected to each other and to the mainland via floating highways and rail lines. According to the article, the designers plan to use the ocean to help moderate the cities’ temperatures in much the same way a ground-source heat pump does – pumping cold water up from the depths beneath the city to cool it efficiently.

In case you’re not convinced that concrete can be made to float, there are floating bridges across Lake Washington in Seattle – the glacially-carved lake is far too deep to drive pilings into the lake bed to support the bridge, so it floats instead.

The first floating proof-of-concept residences in a Rotterdam residential neighborhood are expected to be available in 2010.

———-

American Physical Society rejects changes to climate change statement

Earlier this year, a small group of American Physical Society (APS) members requested that the APS change its official statement on climate change. That statement reads:

Emissions of greenhouse gases from human activities are changing the atmosphere in ways that affect the Earth’s climate. Greenhouse gases include carbon dioxide as well as methane, nitrous oxide and other gases. They are emitted from fossil fuel combustion and a range of industrial and agricultural processes.

The evidence is incontrovertible: Global warming is occurring. If no mitigating actions are taken, significant disruptions in the Earth’s physical and ecological systems, social systems, security and human health are likely to occur. We must reduce emissions of greenhouse gases beginning now.

Because the complexity of the climate makes accurate prediction difficult, the APS urges an enhanced effort to understand the effects of human activity on the Earth’s climate, and to provide the technological options for meeting the climate challenge in the near and longer terms. The APS also urges governments, universities, national laboratories and its membership to support policies and actions that will reduce the emission of greenhouse gases.

A committee was appointed by the Council earlier this year to determine if the latest science justified any changes to the statement. According to the official APS press release, the committee recommended that no changes be made, and on November 8, the Council of the American Physical Society “overwhelmingly” rejected the proposed changes to the 2007 statement on climate change.

Appointed by APS President Cherry Murray and chaired by MIT Physicist Daniel Kleppner, the committee examined the statement during the past four months. Dr. Kleppner’s committee reached its conclusion based upon a serious review of existing compilations of scientific research. APS members were also given an opportunity to advise the Council on the matter. On Nov. 8, the Council voted, accepting the committee’s recommendation to reject the proposed statement and refer the original statement to POPA for review.

The APS has over 47,000 members, of which only 206 appear to have signed the petition to the APS Council. That’s about 0.4% of the APS membership. According to the 2009 “Six Americas” study by the Yale Project on Climate Change and the George Mason University Center for Climate Change Communications, fully 18% of Americans are either doubtful or dismissive of climate disruption. If those numbers applied to the 47,000 members of the APS, we could expect almost 8500 signatories to the APS petition.

There are three possible interpretations of this difference:

  1. Physicists may be less willing to sign online petitions for whatever reason(s).
  2. Physicists may actually be more knowledgeable of the science and mathematics than the average American (or less easily swayed by denial industry-manufactured FUD) and thus they accept the overwhelming scientific data to date.
  3. Both 1 & 2

My best guess is that it’s probably option #3. But even so, I doubt that reluctance to sign petitions accounts for a 45x difference between physicists and the general population.

Image credits:
Geophysical Research Letters
Environmental Science & Technology
Delft University, via Green Inc

13 replies

  1. How did I not see this coming? We’ve been doing our best to finish off the glaciers, and now they’re striking back. Well played glaciers, well played.

    Would it be wrong to say revenge is a dish best served cold?

  2. Hey Brian. Just for fun, I’m going to post a couple of comments from a friend of mine in the lab. Let’s call him RT. I’m sure you’ve already addressed all of this in previous posts, but he’s pretty rabid about it. So, I thought I’d give you another shot. Unfortunately, he’s too much of a wuss to do it himself. 🙂

    “This guy and this article is CRAP! A “slower warming” say it like it is, cooling! These hypocrites cherry pick when it suites them but don’t allow others to do so. Don’t cherry pick look at all the data! The 30’s very hot and still hold many record highs, yet CO2 is low. The 70’s very cold … and still hold many record lows, yet CO2 was high and emission standards low. Now the 2000’s the temperature is once again going down in spite of CO2 levels. These are the facts! This guy likes to give all these mathematical analysis of a chaotic system based on trends…CRAP results! Tantamount to reading the tea leaves! It’s easy to confuse people with mathematical analysis and to fool them into a state of fear. I want to see the predictions and see them come to fruition. Keep in mind that prior to 2000, we were told that as a result of the same logic we would be experiencing even higher temperatures than we did prior to 2000. Noting a cooling trend refutes this math that predicts catastrophe. How many times do you have to predict things wrong do you decide that the logic itself is flawed? Also keep in mind that the argument for man-made global warming is a direct relationship to CO2. Even this guy agrees that it’s cooling now, do we have less CO2? If not there are other factors other than CO2 and these factors are the true reason temperature has and always will change. It’s because of the sun activity…wow, what a novel notion!”

    Enjoy. 🙂

  3. RT, since you work in Uber’s lab, I’m going to assume that you have a higher level of knowledge regarding math and science than most, so I’m going to talk to that level rather than trying to simplify things. Your rant has six different points that can be addressed, and I’ll go through them one by one.

    1. “A ‘slower warming’ say it like it is, cooling!” and “Now the 2000’s the temperature is once again going down in spite of CO2 levels.”

    There has been no cooling since 1998 (or the turn of the century, if you prefer) that can’t be explained by endpoint sensitivity (i.e. cherry-picked endpoints). Similarly, as I mentioned in the piece about the AP story, the trends over the last decade are not statistically significant by any measure I know of. Here are two graphs I generated last week that show the slope is small, but still positive.

    Uber asked what the R2 was for each of the trends shown above since I hadn’t put it on the graphs when I generated them, so here’s that information as well – GISS: 0.135; CRUT: 0.015; NCDC: 0.064; RSS TLT: 0.005; UAH T2LT: 0.011.

    Note also that I went out of my way to choose one of the hottest months of 1998, July, in order to make the trend as flat as possible. If I’d selected September, when the global average temperature was about half a degree C lower, the slopes and R2 would all have been greater (although still not statistically significant) – GISS: 0.172; CRUT: 0.037; NCDC: 0.090; RSS TLT: 0.019; UAH T2LT: 0.026. Or I could have waited until the La Nina of 1999 had fully kicked in, in March 1999, to boost the slope and R2 even more (again, though, still statistically insignificant) – GISS: 0.191; CRUT: 0.048; NCDC: 0.092; RSS TLT: 0.039; UAH T2LT: 0.052.

    In other words, I “cherry-picked” my endpoint to make your case stronger, and still failed to find cooling. (My data and calculations are available if you want them, although you can create your own from publicly available data.)

    So since 1998 (or 2005, depending on whether you think the Hadley data or the GISS data are more accurate), the three major surface station datasets all show a statistically flat trend or slower warming (ie the linear trend has a positive slope, but the trend is slow enough that it’ll take decades for it to become statistically valid to a 63% confidence level, never mind 90% or 95%), but no cooling.

    If you have data that opposes this conclusion, I’d love to see it. None of the “cooling trends” I’ve seen have been statistically significant and all have suffered from endpoint problems (the trends I’ve seen have all started on an El Nino and ended on a La Nina).
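
    For anyone who wants to reproduce this kind of calculation, here’s a minimal sketch of the slope-and-R2 computation, with a synthetic monthly anomaly series standing in for the GISS/CRUT/NCDC data (all of which are publicly downloadable):

    ```python
    # Least-squares trend and R^2 for a monthly anomaly series.
    # Synthetic stand-in data; substitute a real downloaded series.
    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(136)  # roughly July 1998 - October 2009
    anomaly = 0.001 * months + rng.normal(0.0, 0.12, months.size)

    slope, intercept = np.polyfit(months, anomaly, 1)
    fit = slope * months + intercept
    r2 = 1 - np.sum((anomaly - fit) ** 2) / np.sum((anomaly - anomaly.mean()) ** 2)
    print(f"trend: {slope * 120:+.3f} C/decade, R^2 = {r2:.3f}")
    ```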

    2. “The 30’s very hot and still hold many record highs, yet CO2 is low. The 70’s very cold … and still hold many record lows, yet CO2 was high and emission standards low.”

    What you neglect to point out is that from 1976 or so until about 1998, CO2 was high and global temperatures rose quickly. Here’s a graph of the complete monthly surface temperature record from all three sources from 1880 until October 2009:

    Notice that the R2 is very likely statistically significant for all three trends (GISS, NCDC, and CRUT).

    But there’s more to this than just whether a trend is statistically significant or not. For example, there’s a new study out from researchers at NOAA, NCAR, The Weather Channel, and Climate Central that speaks to your point regarding record lows and highs in the U.S. – except that it shows that the 2000s have had more record highs than any decade since daily records started being kept regularly in 1950 (press release here; the paper is in press and there’s no link to it yet, although you can probably ask the authors for a preview copy), and that there’s a significant deviation from the predicted 1/n trend for both record highs and record lows in the U.S.

    Of course, that’s just the U.S. – the 1930s were certainly warmer than present day in the U.S., but that’s not the case when you look at the global average. Globally, the warmest year (annual average) since 1880 is either 1998 (according to CRUT) or 2005 (according to GISS and NCDC).

    Now, as for the comparison to CO2, here’s how it looks:

    When you run the cross correlations over the complete period, you find that the R2 is 0.796 for the CRUT data, 0.761 for GISS, and 0.800 for NCDC. That’s pretty good, but doesn’t prove that the CO2 is the cause of the temperature rise. There’s always a chance that some unknown cause is shifting both. But that’s pretty unlikely, given the large number of studies that have attributed the increase in temperature to the increase in CO2 using multiple different methods.
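
    If you want to check those cross-correlation numbers against your own copies of the data, the computation itself is short – this sketch uses toy stand-in series in place of the real CO2 and temperature records:

    ```python
    # Squared Pearson correlation between a CO2 series and a temperature
    # series. Both arrays are toy stand-ins for the real annual records.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1959, 2010)
    co2 = 315 + 0.8 * (years - 1959) + 0.013 * (years - 1959) ** 2  # Keeling-like rise
    temp = 0.009 * (co2 - co2[0]) + rng.normal(0.0, 0.1, years.size)

    r = np.corrcoef(co2, temp)[0, 1]
    print(f"R^2 = {r ** 2:.3f}")
    ```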

    Finally for this point, there’s a paper by Swanson and Tsonis about “synchronized chaos” causing the climatic shifts that have been observed in the instrumental record (going from cooling to heating around 1910, back to cooling around 1940, and then back to heating again around 1975). However, this high degree of internal variability (suggesting a positive and large feedback ratio, or “climate sensitivity” value) doesn’t negate external forcing by any stretch, as Swanson and Tsonis pointed out in their conclusion:

    It is straightforward to argue that a climate with significant internal variability is a climate that is very sensitive to applied anthropogenic radiative anomalies (c.f. Roe [2009]). If the role of internal variability in the climate system is as large as this analysis would seem to suggest, warming over the 21st century may well be larger than that predicted by the current generation of models, given the propensity of those models to underestimate climate internal variability.

    This point essentially comes from the fact that climate is a feedback system with a high Q and both positive and negative feedback paths. And because this is how climate looks, mathematically speaking, it’s very sensitive to both internal variability and external forcing.

    3. “It’s easy to confuse people with mathematical analysis and to fool them into a state of fear.”

    Unfortunately, it’s more often been climate disruption deniers/skeptics who have been using the math to produce fear, uncertainty, and doubt (FUD). For example, it’s not statistically valid for me to claim that there’s significant warming in the 1998-2009 monthly data I show above, because the R2 (representing how noisy the data is compared to the linear trend in time) is too low to draw any valid conclusion beyond “it’s effectively flat.” Yet people who should know better, such as Joe D’Aleo, Tim Ball, Roy Spencer, and S. Fred Singer, continue to use an image showing cooling, very similar to the one below:

    The R2 values are 0.001 for the GISS data, 0.132 for CRUT, and 0.041 for NCDC, showing that the trends are also statistically invalid. Furthermore, because of the small number of samples and the high “noise” of the samples around the trends, the effective AR-1 corrected 1 sigma error is 0.94 C for GISS, 0.42 C for CRUT, and 0.61 C for NCDC. With trends per decade that are at least an order of magnitude smaller than the error, it’ll take decades for these trends to reach the 95% confidence level.
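
    For the curious, here’s roughly how that AR-1 correction works: residuals that are correlated month-to-month carry fewer than n independent pieces of information, so the effective sample size shrinks and the trend’s standard error gets inflated. Again, synthetic red-noise data stands in for the real series.

    ```python
    # AR(1)-corrected trend uncertainty: estimate the lag-1 autocorrelation
    # of the residuals, shrink the effective sample size, inflate the error.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 140
    t = np.arange(n) / 12.0  # time in years
    noise = np.zeros(n)
    for i in range(1, n):  # red (AR-1) noise rather than white noise
        noise[i] = 0.7 * noise[i - 1] + rng.normal(0.0, 0.1)
    y = 0.015 * t + noise  # small trend buried in red noise

    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    se = np.sqrt(np.sum(resid ** 2) / (n - 2) / np.sum((t - t.mean()) ** 2))

    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # lag-1 autocorrelation
    n_eff = n * (1 - r1) / (1 + r1)                # effective sample size
    se_corrected = se * np.sqrt(n / n_eff)

    print(f"trend {slope:+.4f} C/yr; naive 1-sigma {se:.4f}; AR(1)-corrected {se_corrected:.4f}")
    ```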

    This is one example that I had immediately available. If you’d like more examples of math being used to manipulate the public into believing that climate disruption isn’t as bad as we think it is, I’ll spend some time and come up with more. There are many to choose from.

    4. “I want to see the predictions and see them come to fruition. Keep in mind that prior to 2000, we were told that as a result of the same logic we would be experiencing even higher temperatures than we did prior to 2000. Noting a cooling trend refutes this math that predicts catastrophe.”

    First off, climate models don’t make predictions as such – predictions are of the form “this WILL happen,” while projections are of the form “this is the most likely to happen.” It’s a subtle, but important, difference. Climate projections are phrased in terms of statistical probability and confidence intervals rather than given as certainties.

    Climate models are essentially massive systems of partial differential equations with more variables than equations. As such, the models can’t and don’t provide singular, unique solutions. Instead, climate models are run Monte Carlo-style to determine the sensitivity of the system to initial conditions and to assumptions about the ranges of variables. The result is that the models produce hundreds or thousands of unique solutions, which are then combined statistically to produce confidence intervals and a mean model response. The IPCC takes it one step further, combining multiple models, each of which works a little differently, to generate a multi-model mean and multi-model confidence intervals.

    What this means is that every individual solution is averaged out with every other solution, and in the process, some information is lost while other information is enhanced. As an example, when averaging, signal-to-noise ratio increases as the square root of the number of samples that are being averaged together because white noise has an average of zero. Similarly, short-term effects (aka “weather”) will be reduced while underlying signals (aka “climate”) are enhanced.

    Put this all together, and you find that while individual solutions of the climate equations may have predicted the slowed warming of the last few years, that single solution wasn’t statistically valid as a projection and so was given only a small weight in the overall model or multi-model means.
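
    If it helps make the averaging effect concrete, here’s a toy version – each “run” shares the same underlying forced trend but has its own weather noise, and averaging suppresses the noise roughly as the square root of the number of runs:

    ```python
    # Ensemble averaging: weather noise shrinks ~sqrt(N) while the shared
    # forced trend survives. Toy runs, not actual climate model output.
    import numpy as np

    rng = np.random.default_rng(3)
    n_runs, n_years = 500, 30
    t = np.arange(n_years)
    forced = 0.02 * t  # common underlying "climate" signal, C
    runs = forced + rng.normal(0.0, 0.15, (n_runs, n_years))  # add "weather"

    print("noise around trend, one run:      %.3f C" % (runs[0] - forced).std())
    print("noise around trend, 500-run mean: %.3f C" % (runs.mean(axis=0) - forced).std())
    # the second number is ~sqrt(500), about 22x, smaller than the first
    ```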

    There have been at least two papers that discussed this in detail. One, titled “Is the climate warming or cooling”, is where the first image in the post is from. As I wrote in a previous post here at S&R about this paper:

    The problem, says the paper, is that natural variability is added atop the carbon dioxide (CO2) driven warming signal. And when that natural variability is greater on short time scales than the CO2 signal, then it’s all but inevitable that there will be decades where warming appears to stop only to start up again later. This is illustrated in the figure above for two periods in a single model run. (emphasis added)

    In addition, an excerpt of a paper submitted to the Bulletin of the American Meteorological Society says:

    The 10 model simulations (a total of 700 years of simulation) possess 17 nonoverlapping decades with trends in ENSO-adjusted global mean temperature within the uncertainty range of the observed 1999–2008 trend (−0.05° to 0.05°C decade–1). Over most of the globe, local surface temperature trends for 1999–2008 are statistically consistent with those in the 17 simulated decades. (emphasis added)

    To reiterate, this illustrates that individual model simulations do predict decades where there are flat trends even in periods where the overall, multi-decadal trend is positive. So your comment that a cooling trend negates the projections is incorrect.

    5. “[D]o we have less CO2? If not there are other factors other than CO2 and these factors are the true reason temperature has and always will change.”

    This is a non sequitur, I’m afraid. There are other factors affecting climate besides CO2, but that does nothing to negate the fact that CO2 affects climate too. For example, El Nino increases global temperatures greatly, but only for a few months before the effect fades. Large volcanic eruptions cool global temperatures, but only for a couple of years. Climate is determined by the factors that change global temperature on the scale of decades, and there’s no evidence that El Nino or volcanism is affecting, or is capable of affecting, global climate on decadal and longer scales. Given how long CO2 and many other greenhouse gases (but not water vapor, which has a very short residence time in the atmosphere) remain in the atmosphere, there is a large body of evidence that these gases do have the ability to affect climate.

    Similarly, just because your car stops when you put your foot on the brake doesn’t mean that it will always stop – you might have a leak in your brake line that causes you to have no brakes. Or someone might have cross-connected your brake with your gas pedal. Or you might have stepped on the gas accidentally instead. Just because other factors have changed the climate in the past doesn’t mean that those factors are responsible for changing the climate this time around.

    6. “It’s because of the sun activity”

    Solar activity certainly has influence, but it’s not as strong an influence as most people think it is.

    Total solar irradiance (TSI), or the amount of energy the sun delivers, in W/m2, at the Earth’s distance from the sun, is about 1366 W/m2. Over a solar cycle, that output changes by about +/- 0.5 W/m2, or less than 0.1%. We know this isn’t the primary driver of climate from two facts – there’s no strong 11-year cycle present in the climate data, and TSI hasn’t varied enough to cause the observed global temperature changes without an amplifier. There are amplifiers in the climate system (clouds, aerosols, soot, and non-cloud water vapor, to name a few), but they’d amplify non-solar factors just as much as solar ones, so they don’t help the solar case.
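
    Here’s the back-of-the-envelope version of that argument. The albedo and sensitivity values are assumed round numbers (standard textbook geometry and a commonly used rough sensitivity), not measurements:

    ```python
    # Direct temperature response to the solar-cycle TSI swing.
    delta_tsi = 0.5    # W/m^2, the solar-cycle swing quoted above
    albedo = 0.3       # Earth's mean reflectivity (assumed round number)
    sensitivity = 0.8  # K per W/m^2, a commonly used rough value (assumed)

    # Spread the intercepted sunlight over the whole sphere, minus reflection
    forcing = delta_tsi * (1 - albedo) / 4.0
    print(f"forcing ~ {forcing:.2f} W/m^2, direct dT ~ {forcing * sensitivity:.2f} K")
    # ~0.09 W/m^2 and well under 0.1 K of direct response -- tiny next to
    # the observed warming over the instrumental record
    ```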

    The other possibility is that longer-term variations in solar wind could be responsible for changes in cloud formation triggered by cosmic rays. There’s evidence for this (correlation) prior to the late 1980s/early 1990s, but no sign of correlation since. At this point, the best available science says that GCRs probably have an impact, but that the impact is insufficient to account for the bulk of the observed global warming over the last 130 years or so. However, there is serious research ongoing at CERN into this very question, and hopefully we’ll know more soon (with “soon” defined as “sometime in the next couple of years”).

  4. I’m still on #3, but I wanted to make a slight clarification before I forgot. I think you might be using terms incorrectly. The R^2 value just tells you how well things are correlated. For example, if you go up 1 degree for every 1e6 ppm CO2, I think you’d have an R^2 value of 1. The significance comes in to describe how well the curve (in this case, the line) fits the data. Generally, P<0.05 is considered "significant," at least in biology. In physics, I think they like to be a bit more conservative. You could very easily have a significant curve fit with an R^2 = 0. You could also have an R^2 = 1 with a lousy significance, too. That's probably not an issue with the number of data points you are working with, but I thought I'd clarify anyway.

    Personally, I think that some sort of cyclical curve fit would work better (maybe a sine wave + a linear component?) but that's a hell of a lot harder to explain to someone who doesn't work with these sorts of stats regularly.

  5. That’s a fair point, Uber. A pure sinusoid would have a very low R2, but the signal itself would be quite significant. The trend, however, would likely be insignificant.

    Linear trends are the simplest approach to the statistics, but as you point out, they don’t explain everything. And they’re usually not predictive, especially in a very noisy system like global climate. That’s part of why I included the 1 sigma value above as well – it’s clear that the 1 sigma (68.3% confidence level – I erroneously indicated that it was 63% above) uncertainty is much greater than the trend, so we can’t really make firm conclusions, at least not based off of relatively simple statistics.
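
    To illustrate the distinction Uber is drawing, here’s a quick sketch (using scipy) showing that R2 and significance can point in opposite directions – a handful of points can give a high R2 with a weak p-value, while thousands of points can give a tiny R2 with a very strong one:

    ```python
    # R^2 measures how much variance the fit explains; the p-value measures
    # whether the slope is distinguishable from zero. They can disagree.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    x1 = np.arange(4.0)  # few points: high R^2, weak significance
    y1 = x1 + rng.normal(0.0, 0.8, 4)
    fit1 = stats.linregress(x1, y1)
    print(f"n=4:     R^2={fit1.rvalue ** 2:.2f}, p={fit1.pvalue:.3f}")

    x2 = np.arange(10_000.0)  # many points: tiny R^2, strong significance
    y2 = 0.0005 * x2 + rng.normal(0.0, 10.0, 10_000)
    fit2 = stats.linregress(x2, y2)
    print(f"n=10000: R^2={fit2.rvalue ** 2:.4f}, p={fit2.pvalue:.1e}")
    ```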

    Similarly, most climatologists use AR-1 statistics as well, but there’s some question as to whether this is a reasonable approach. But it’s simple and easy to do, while taking persistence into account is much more difficult. It’s something I’m looking up papers on and researching now, actually, so I’m not prepared to go into it in any detail.

    One thing you can do to make this easier, however, is remove some of the noise from the data, and there are a couple of ways to do that. The simplest is to scale a known noise source, time-shift it by N months (or M years for annual data), and minimize the resulting residuals. This can be done with the ENSO signal (there are at least six different ENSO indices, though, so you have to choose the one you think is “best” – there’s a paper I’m going to track down that supposedly showed the simple indices are just as good as the complex, multivariate ones, but I haven’t done so yet) and with the volcanic aerosol signal. I’ve done it with the ENSO signal once, but I only recently figured out how to do it with the volcanic aerosol signal, so I’m still working on that one.
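
    In case it’s useful, here’s a sketch of that scale-and-shift procedure, with synthetic series standing in for the temperature record and an ENSO index (the real indices are published by NOAA). For each candidate lag it finds the least-squares scale factor and keeps the lag that minimizes the residual variance:

    ```python
    # Scale-and-lag noise removal: shift a known "noise" index, fit a
    # least-squares scale, and pick the lag with the smallest residuals.
    # Synthetic stand-ins: "temp" embeds the index at a 4-month lag so
    # the search has something to find.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 360  # 30 years of monthly data
    t = np.arange(n)
    index = np.sin(2 * np.pi * t / 45) + 0.3 * rng.normal(0.0, 1.0, n)
    temp = 0.0015 * t + 0.12 * np.roll(index, 4) + rng.normal(0.0, 0.08, n)

    best = (None, None, np.inf)
    for lag in range(13):  # try lags of 0-12 months
        x, y = index[: n - lag], temp[lag:]
        scale, offset = np.polyfit(x, y, 1)  # least-squares scale (and offset)
        resid_var = np.var(y - (scale * x + offset))
        if resid_var < best[2]:
            best = (lag, scale, resid_var)

    lag, scale, _ = best
    print(f"best lag: {lag} months, scale: {scale:.3f}")
    # temp[lag:] - scale * index[:n - lag] is then the "noise-removed" series
    ```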

    I’m planning on posting that when I’m done with it, because just removing the ENSO signal made the CO2 signal pop right out – removing the volcano signal should make it even clearer. And then I’ll dive into clearing out the PDO and other oceanic signals too and see what’s left.

  6. I’m sure there’s some sort of time phase shift in modelling some of this data, too, to take any sort of lag into account. Way too much work for me to look into it, tho. Haha. I have enough trouble keeping up with my immune data. 🙂