by Aaron Schatz
It's no secret that 2009 was not the best year for projections around here. All around the Internet, I'm guessing that fans on various message boards will ask if they should buy Football Outsiders Almanac 2010 based on last year's projections, which is a problem for us since a) we would really like people to buy the book because it is interesting, informative, and funny, not because they think they will find perfect projections, and b) we would much prefer to be judged on our projections from 2007 and 2008, thank you very much.
However, let's be honest, saying "well, gee, everyone has a bad year, cut us some slack" sounds pretty damn whiny. It's probably a lot better to go look at how teams changed (and did not change) between 2008 and 2009 to see if that can teach us any lessons as to what went wrong with last year's projections.
With this in mind, I did go and look at 2009 compared to other years. What I found was somewhat odd, and I write about it in the introduction to this year's book:
Certainly, very few people went into last season expecting New Orleans to emerge holding the Lombardi Trophy. However, everybody knew that the Saints had a powerful offense, and that ties into the strange trend that defined the 2009 season. We're all so used to the NFL standings changing from year to year that it was hard to notice that the average team's change in performance was only about half the size it usually is.
The most dramatic issue was offense. From 2002 through 2008, an average of only 15 teams per season finished within 50 points of their scoring total from the previous year. Last year, 23 teams did.
Want a more extreme example? From 2002 through 2008, an average of 6.2 teams per year saw their points-scored total rise or fall by more than 100 points. Last year, only two teams crossed that threshold: Cincinnati, which scored 101 more points, and Tampa Bay, which scored 117 fewer.
We use a method called "Pythagorean wins" to estimate how many games teams should win based solely on points scored and allowed. The year-to-year correlation of Pythagorean wins from 2008 to 2009 was nearly twice as strong as any other two-year span in recent NFL history. And yet, the year-to-year change in each team's points allowed was actually no more consistent than in any other recent offseason.
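For readers curious about the mechanics, the standard Pythagorean expectation works like this (the 2.37 exponent is a commonly cited value for the NFL; the exact exponent Football Outsiders uses may differ, and the hypothetical point totals below are for illustration only):

```python
def pythagorean_wins(points_for: float, points_against: float,
                     games: int = 16, exponent: float = 2.37) -> float:
    """Estimate expected wins from points scored and allowed.

    Note: 2.37 is a commonly cited NFL exponent, used here as an
    assumption; it is not necessarily the exact FO implementation.
    """
    ratio = points_for ** exponent
    return games * ratio / (ratio + points_against ** exponent)

# Hypothetical team that scores 400 points and allows 300 over 16 games
print(round(pythagorean_wins(400, 300), 2))  # roughly 10.6 expected wins
```

A team that outscores its Pythagorean estimate is usually considered somewhat lucky, which is why the estimate tends to predict the following season better than actual wins do.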
There isn't time for a lot of graphs in a short introduction piece, but I wanted to look at the year-to-year correlations that I wrote about in the book, and show folks exactly what I mean. First, let's take a look at the year-to-year correlation coefficients for offense, for each two-year span going back to 2001-2002. This chart shows you the year-to-year consistency of two stats, DVOA and good ol' points scored.
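For anyone who wants to reproduce this kind of check, a "year-to-year correlation" is just the Pearson correlation between each team's stat in one season and the same team's stat in the next. A minimal sketch, with made-up point totals for five hypothetical teams:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical points scored by five teams in consecutive seasons
points_2008 = [463, 416, 391, 347, 294]
points_2009 = [510, 402, 416, 330, 244]
print(round(pearson_r(points_2008, points_2009), 3))
```

A coefficient near 1.0 means the league barely reshuffled from one year to the next; a coefficient near zero means last year's totals told you almost nothing about this year's.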
As noted above in my quote from FOA 2010, offensive numbers last year were absurdly similar to the year before, especially compared to the way things usually go in the NFL. Yes, offensive DVOA was generally more consistent earlier in the decade, but the correlation from 2008 to 2009 was still higher than in any of those two-year spans, and the year-to-year correlation for points scored was off the charts.

Now, as you probably know, there wasn't a corresponding rise in consistency on defense. In fact, early in the season we were writing a lot about how there had rarely been a season with more change on defense, with a lot of teams bringing in new coordinators who dramatically upgraded performance. By the end of the season, that had settled down a bit -- Denver's run defense went in the tank in the second half, Tennessee ended up following a good 2008 with a merely bad 2009 instead of a ridiculously horrifying one, and so on -- so the year-to-year correlation of defense is not historically low. However, it was still lower than the average over the past decade, as you can see from this chart:
The year-to-year correlation stands out more with DVOA than it does with points allowed. As for the third element of football, well, we don't have a "points scored/points allowed" equivalent for special teams, but I need a place to toss in a similar graph showing year-to-year correlation of special teams DVOA. It turns out the year-to-year correlation of special teams DVOA from 2008 to 2009 was the lowest ever: nearly zero, and definitely lower than in the rest of the decade. This will look especially odd when we get to the chart that shows you how well last year's projections did in various categories.
Now that we've looked at offense, defense, and special teams, we can look at total performance, and it looks like the extreme offensive consistency from 2008 to 2009 overpowered the usual level of year-to-year change in the NFL. Both total DVOA and Pythagorean wins were more consistent from 2008 to 2009 than in any other two-year span since 2001. As I note in the book, the correlation coefficient for Pythagorean wins is twice as high as the average two-year span this decade.
Now let's bring in our projections from previous seasons. 2004 was the first year we attempted to project DVOA before the season, although that was only on the website, not in a book. In the middle of my self-flagellation over the inferior quality of our 2009 projections, I went back and looked at the correlation of our projections to actual DVOA in each season, using the projections as we published them at the time -- not the projections that would result from retroactively applying our current projection system. Here's a look at the result.
In retrospect, our 2009 projections were nice and strong on offense and special teams. The only year where we were more accurate when it came to offensive DVOA was 2007, and the 2009 special teams projections were the most accurate we've ever done. That seems a little screwy since actual special teams DVOA differed so heavily from what it had been in 2008, but apparently, our system spotted a lot of the trends that were going to fuel that change.
On defense, however, this did not happen.
I wrote about it a couple of times earlier in 2010, but this chart fully shows how much our 2009 defensive projections sucked. The correlation between our projections and teams' actual defensive DVOA was pretty much zero -- in fact, it was on the negative side of zero. A dartboard would have been just as accurate. When we added up offense, defense, and special teams to get our total DVOA projections, we ended up with the opposite of what actually happened with NFL teams: our poor defensive projections overwhelmed our quality offensive and special teams projections, leading to the worst projections we've done in six seasons.
There are two explanations for what happened last year (and I refer more to the overall NFL offensive numbers than I do to our projections). The first explanation is that something in 2009 dramatically changed in the way NFL teams build their rosters and turn over their talent from season to season. Because of this change, most NFL offenses in the near future will barely decline or improve from year to year. In addition, old trends that indicated when teams might improve or decline no longer apply, which means that projections based on previous data (such as ours) are now useless.
Or, 2009 was a bit of a fluke year.
Occam's razor points to the second explanation, and I'm inclined to agree. Of course, that doesn't mean I wasn't crazy busy this offseason, trying to rework the projection system to account for whatever changed from 2008 to 2009 without throwing out the trends that indicated improvement and decline in previous seasons. However, I think it is safe to say that 2010 NFL offenses will not be as similar to 2009 NFL offenses as the 2009 NFL offenses were to 2008 NFL offenses -- and the projections in Football Outsiders Almanac 2010 will come closer to matching the accuracy we saw in 2007 and 2008.
Postscript: I had said earlier that we would announce in this space when we had a new version of FOA up with more typos fixed. There's now a PDF version online that fixes all typos found through Monday afternoon.