06 Aug 2012
by Danny Tuccitto
The most controversial element of Football Outsiders Almanac 2012 is clearly the team forecast that predicts the biggest nosedive of any team this year compared to last: 7.2 mean projected wins for the San Francisco 49ers. The Bay Area media picked up on it, and we went so far as to dedicate our first Sabermetrics Video Network contribution to the topic.
The main explanation, which Aaron Schatz talks about at length in that video, is the Plexiglass Principle. What we say in FOA 2012 is that, if you look at the 41 teams in DVOA history that -- like the 2011 49ers -- improved between 25 and 35 percentage points from one season to the next, such teams averaged a 12.1-percentage-point DVOA decline the following season, as well as a decrease of 2.3 wins. We have the 49ers declining by about six wins this season, so the Plexiglass Principle isn't the only thing at play in terms of the overall projection, but that's what I'm going to focus on for the purposes of this little addendum.
If you want to see the Plexiglass Principle in one neat package, here's a table:
| DVOA CHG YR 2 GROUP | # OF TEAMS | WIN CHG YR 3 | DVOA CHG YR 3 |
|:---|:---:|:---:|:---:|
| -30% or worse | 30 | +1.6 | +10.2% |
| -29% to -20% | 49 | +2.4 | +10.5% |
| -19% to -10% | 85 | +0.3 | +4.2% |
| -9% to 0% | 134 | +0.5 | +3.0% |
| 0% to +9% | 103 | +0.1 | +0.3% |
| +10% to +19% | 110 | -1.1 | -6.3% |
| +20% to +29% | 58 | -2.0 | -10.4% |
| +30% or better | 20 | -1.1 | -9.1% |
I've grouped all teams into 10-percentage-point increments (without rounding) according to their DVOA change from Year 1 to Year 2. For instance, the 2011 49ers would fall into that second-to-last group because of their 29.8-percentage-point improvement from 2010. The last two columns show what happened, on average, to each group in Year 3. Win changes aren't as tidy because of the randomness of wins, but a DVOA change of 20 percentage points or more in Year 2 predicts about twice as big a DVOA change in Year 3 as a Year 2 change between -19% and +19%. Ladies and gentlemen, the Plexiglass Principle.
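For readers who want to see the grouping logic concretely, here's a minimal Python sketch. The bin boundaries are my assumption (the article only says "10-percent increments (without rounding)", so where a value like -29.5 falls is a guess), and the inputs would come from the Football Outsiders DVOA database, which I'm standing in for with plain tuples.

```python
import math
from collections import defaultdict

def bin_label(yr2_change):
    """Bucket a Year-2 DVOA change (in percentage points) into a 10-point bin.
    Exact boundary handling is an assumption, not taken from the article."""
    if yr2_change <= -30:
        return "-30% or worse"
    if yr2_change >= 30:
        return "+30% or better"
    lo = math.floor(yr2_change / 10) * 10
    return f"{lo:+d}% to {lo + 10:+d}%"

def plexiglass_table(rows):
    """rows: iterable of (yr2_dvoa_chg, yr3_win_chg, yr3_dvoa_chg) tuples.
    Returns {bin: (n_teams, avg_yr3_win_chg, avg_yr3_dvoa_chg)}."""
    groups = defaultdict(list)
    for yr2, win3, dvoa3 in rows:
        groups[bin_label(yr2)].append((win3, dvoa3))
    return {label: (len(v),
                    sum(w for w, _ in v) / len(v),
                    sum(d for _, d in v) / len(v))
            for label, v in groups.items()}

# The 2011 49ers improved 29.8 points, so they land in the next-to-top bin:
print(bin_label(29.8))  # prints: +20% to +30%
```

Feed it every (Year 2 change, Year 3 win change, Year 3 DVOA change) triple in DVOA history and you get the table above.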
A popular argument for why the 49ers' bounce down from the plexiglass might not be that big this season is that their bounce up coincided with a coaching change. Obviously, from the theme of the San Francisco chapter in FOA12, I'm on board the Harbaugh train, so it seems useful to test this hypothesis more generally. Have teams with a new head coach in their bounce-up year tended to bounce down less than those that improved with an incumbent head coach?
In the book, we cited only 41 49ers-like teams. For this piece, I'm going to use the info we just learned from the table, and expand the plexiglass definition to the 78 teams that improved by 20 percentage points or more. For each team, I marked down who the head coach was in each season of the three-year period.
We're trying to isolate the effect of a head coaching change between Year 1 and Year 2 on performance in Year 3, so I threw out four teams that had the same coach in Year 1 and Year 2, but changed coaches between Year 2 and Year 3. Dick Vermeil retired after the Rams' championship-winning improvement in 1999. Bill Parcells ~~retired for the second time~~ went shopping for groceries after leading New England to a 31.4% DVOA improvement in 1996. Marty Schottenheimer took over the Chargers in 2002 after they improved by 30.6% DVOA under Mike Riley in 2001. Tom Cable got fired after a 2010 season in which the Raiders improved by 29.9% DVOA.
Another pesky issue I had to resolve was that 29 of the remaining 74 teams had overlapping three-year periods within the same franchise. For instance, the 1992-1994 Broncos overlap with the 1994-1996 Broncos. Year-to-year dependency in NFL team data is pretty much unavoidable, but I'd have been remiss to just ignore the problem here. What I ended up doing was randomly choosing one window from each overlapping set for inclusion.
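That random-selection step can be sketched as follows. The shuffle-then-greedy approach is my assumption about how ties among three or more overlapping windows get broken, and the franchises and years beyond the Broncos example are hypothetical stand-ins for the real list.

```python
import random

def dedupe_overlaps(windows, rng=random):
    """windows: (franchise, start_year) pairs, each covering the three
    seasons start_year..start_year + 2. Shuffle into a random order, then
    greedily keep any window that doesn't overlap an already-kept window
    from the same franchise -- so each overlapping set yields one survivor."""
    windows = list(windows)
    rng.shuffle(windows)
    kept = []
    for team, start in windows:
        if all(t != team or abs(s - start) > 2 for t, s in kept):
            kept.append((team, start))
    return kept

# The 1992-1994 and 1994-1996 Broncos windows overlap, so only one survives:
random.seed(1)
print(dedupe_overlaps([("DEN", 1992), ("DEN", 1994), ("SF", 2010)]))
```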
That left 59 teams in the study: a "new head coach in Year 2" group with 17 plexiglass teams, and an "incumbent head coach in Year 2" group with 42 plexiglass teams. The only thing left was to get each group's average change in Year 3 and see if the difference between the groups was statistically significant. Drumroll, please...
| Group | # of Teams | WIN CHG YR 3 | DVOA CHG YR 3 |
|:---|:---:|:---:|:---:|
| New Head Coach | 17 | -0.9 | -5.1% |
| Incumbent Head Coach | 42 | -2.1 | -12.9% |
On average, plexiglass teams see a smaller dropoff in wins and Total DVOA if their massive improvement in Year 2 coincided with a head coaching change. However, neither difference from incumbent-coached plexiglass teams is statistically significant, even at the 90-percent confidence level (i.e., both p-values are higher than .10).
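As a sketch of that significance check: Welch's t-statistic below is my assumption about the test used (the article doesn't name one), and the group values are made-up stand-ins, NOT the real 17 and 42 teams' Year-3 DVOA changes.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with
    possibly unequal variances."""
    se2 = variance(a) / len(a) + variance(b) / len(b)
    return (mean(a) - mean(b)) / se2 ** 0.5

# Hypothetical Year-3 DVOA changes in percentage points (NOT the real data):
new_coach = [-2.0, -8.5, 1.0, -6.0, -10.0]
incumbent = [-15.0, -9.5, -14.0, -12.0, -13.5, -11.0]
t = welch_t(new_coach, incumbent)
# Compare |t| against the t-distribution critical value at the
# Welch-Satterthwaite degrees of freedom; a p-value above .10 means the
# difference isn't significant at the 90-percent confidence level.
```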
Now, a statistical analysis like this is certainly open to interpretation. For instance, some might say, "The DVOA difference may not be significant at the 90-percent confidence level, but I'm totally comfortable being 88.2-percent confident in saying a new head coach matters!" No rebuttal from me there. However, the win change difference is nowhere near significant, and that's been the main source of pushback against our projection for San Francisco. It's reasonable to argue that changing coaches insulates a plexiglass team from the average DVOA decline (-10.7%), but arguing for a smaller win decline than average (-1.8) is just conjecture at this point.
The other obvious issue with the analysis is having only 17 members in the new head coach group. Again, no rebuttal from me there. It's simply the case that coaching changes coinciding with a huge year-to-year improvement have been exceedingly rare over the past 20 years. Maybe when we have twice as many data points 20 years from now, we'll see a clear picture of new head coaches insulating teams from the Plexiglass Principle. Right now, though, the stats just don't support it.
Bringing things full circle, I'll end with something specific to the 2012 49ers that I haven't even mentioned yet in terms of the Plexiglass Principle. If we look at the Year 1 baseline, new head coaches improved teams that averaged 4.1 wins and -23.5% DVOA the previous season, whereas incumbents improved teams that averaged 6.7 wins and a -11.4% DVOA. San Francisco's Harbaugh hiring came on the heels of a 6-10 record and -11.2% DVOA, which is almost identical to the typical plexiglass team that didn't change coaches.
So what we saw in San Francisco last year was Harbaugh guiding a mediocre team to legit Super Bowl contention, which is extremely atypical in a plexiglass context. This isn't Tony Sparano improving a 1-15 Dolphins team by 10 wins and 27.6 percentage points in 2008, or even Marvin Lewis improving a 2-14 Bengals team by six wins and 32.2 percentage points in 2003. Of the 17 plexiglass teams in the "new head coach" group, only four had six or more wins in Year 1, which makes it hard to apply my general finding to the 49ers' specific case. This is something FO alum Bill Barnwell touched on in his Grantland column a couple of weeks ago.
In sum, we're encased in a plexiglass No Man's Land. Statistically speaking, there's not enough historical evidence to suggest that a head coaching change between Year 1 and Year 2 portends a smaller DVOA decline in Year 3. At the same time, issues related to sample size and the team Harbaugh inherited make it reasonable for people to downplay historical trends in the case of this year's Niners. Perhaps that's what's subconsciously influencing both Aaron and yours truly when we express subjective skepticism about San Francisco's model-based win projection.
Nevertheless, setting the plexiglass principle aside, our San Francisco chapter of Football Outsiders Almanac 2012 shows pretty conclusively that they'll win fewer than 13 games for other reasons. The beauty of publishing this stuff in a book and on a website is that we can reconvene early next year to see just how right or wrong we were.