19 Dec 2013
by Bill Connelly
"We'll see if they can keep it up in the BCS title game. When you are in this sort of rhythm, the last thing you want to do is wait four or weeks for the next game; just ask 2008 Oklahoma."
Last week, in an SB Nation piece about the Auburn-Missouri SEC Championship game, I casually dropped in a reference to the dreaded Bowl Break. I referred to it as an accepted fact that offenses can lose their rhythm while on hiatus. There are plenty of examples of just that, but is this a real thing or just a collection of anecdotes?
It's hard to get too clear an answer without delving into offensive styles, and even then we'd have to generalize a lot, but I thought I would take a 5,000-foot view of bowl games as compared to regular season games. Is there a difference in general levels of offensive or defensive success?
First, let's look at the distribution of plays and yards. Here's a frequency chart showing how often plays gained a given amount of yardage. It comes from the 2012 season, with FBS vs. FCS games removed to somewhat standardize levels of competition. Obviously bowl season filters out the worst teams, but removing the most extreme mismatches helps, I think.
In the 2012 regular season, 22.8 percent of plays went for no gain. (Obviously a lot of these were incomplete passes.) In the bowl season, that jumps slightly to 23.5 percent. But depending on how we frame things, there might be something noticeable here. In the regular season, 29.9 percent of plays either went for no gain or lost one to five yards; in the bowl season, that number jumps to 31.4 percent. Meanwhile, 28.0 percent of regular season plays gained between three and seven yards; in the bowl season, that drops to 26.9 percent.
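For what it's worth, the bucketing behind those comparisons is simple to reproduce. Here's a minimal Python sketch using made-up gains (the function name and sample data are mine; the real figures come from full 2012 play-by-play data):

```python
def gain_shares(gains):
    """Return the share of plays falling in a few illustrative yardage buckets."""
    n = len(gains)
    return {
        "no gain": sum(g == 0 for g in gains) / n,
        "loss of 1-5": sum(-5 <= g <= -1 for g in gains) / n,
        "gain of 3-7": sum(3 <= g <= 7 for g in gains) / n,
    }

# Ten hypothetical plays: yards gained on each
shares = gain_shares([0, 0, -2, 4, 6, 12, -1, 5, 0, 8])
# shares["no gain"] is 0.3, shares["loss of 1-5"] is 0.2, shares["gain of 3-7"] is 0.3
```

Run the same bucketing over every regular-season play and every bowl play, and you get the percentages above.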
There are some other defense-friendly stats here. In the regular season, 2.8 percent of passes were intercepted; in bowl season, it was 3.4 percent. In the regular season, plays had a 42.7 percent success rate (43.6 percent rushing, 41.7 percent passing); in bowl season: 41.1 percent (42.1 percent rushing, 40.0 percent passing). In the regular season, rushes averaged 3.01 line yards per carry; in bowl season: 2.94.
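For readers unfamiliar with success rate: a play counts as successful when it gains 50 percent of needed yards on first down, 70 percent on second, or 100 percent on third or fourth. A quick Python sketch of that calculation (function names are mine, purely illustrative):

```python
def is_successful(down, distance, yards_gained):
    """Success rate thresholds: 50% of needed yards on 1st down,
    70% on 2nd, 100% on 3rd and 4th."""
    thresholds = {1: 0.5, 2: 0.7, 3: 1.0, 4: 1.0}
    return yards_gained >= thresholds[down] * distance

def success_rate(plays):
    """plays: iterable of (down, distance, yards_gained) tuples."""
    plays = list(plays)
    return sum(is_successful(d, dist, yds) for d, dist, yds in plays) / len(plays)

# Four hypothetical plays: 1st-and-10 for 6 (success), 2nd-and-4 for 2 (short
# of the 2.8 needed), 3rd-and-2 converted, 1st-and-10 for 3 (short of 5)
sample = [(1, 10, 6), (2, 4, 2), (3, 2, 2), (1, 10, 3)]
# success_rate(sample) is 0.5
```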
My initial impression was that, instead of attributing this to a bowl break, some of these numbers could simply be tied to the fact that there are better defenses, on average, in bowl games than in the regular season. But there are better offenses, too, right? And among other things, PPP (EqPts Per Play) on successful rushing plays is actually higher in bowl season (0.92) than in the regular season (0.89). (PPP on successful passing plays: 1.41 in the regular season, 1.36 in bowl season.) That suggests defensive breakdowns are of the same magnitude (or greater, at least on the ground) after a break, which implies the bowl break affects defenses in some way, too.
These aren't significant differences in the end; a 1.6 percentage point drop in success rate simply means that one extra play out of every 62.5 is unsuccessful. So depending on your pace, one to 1.5 extra plays per game might go for a loss instead of a short gain. And there's an extra interception for every 167 passes. Still, it does appear that there might be a slight drop-off in offense thanks to the bowl break.
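Those back-of-envelope conversions check out; here's the arithmetic in Python, using the rates quoted above:

```python
# Success rate: 42.7% in the regular season vs. 41.1% in bowls
reg_success, bowl_success = 0.427, 0.411
drop = reg_success - bowl_success            # 0.016, i.e. 1.6 percentage points
plays_per_extra_failure = 1 / drop           # roughly 62.5 plays

# Interception rate: 2.8% vs. 3.4%
reg_int, bowl_int = 0.028, 0.034
passes_per_extra_int = 1 / (bowl_int - reg_int)  # roughly 167 passes
```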
Might that mean that strong defensive teams have advantages over strong offensive teams in bowls? Not really. The teams in last year's Def. F/+ top 10 went 8-2 in bowls (average score: 28.9 to 17.8), but teams in the Off. F/+ top 10 went 7-3 (average score: 32.8 to 27.6), with all three of the losses coming to other top offensive teams. In games pitting a top 10 offense against a top 10 defense, the offense-happy teams went 2-0, with Clemson beating LSU and Louisville beating Florida.
So yeah. There's probably an effect here, but it's one too small to worry about in your bowl picks. Then again, if somebody wants to bet the under in every game in their bowl pool and report back to me, that would be fine. You know, for science.