30 Dec 2010
by Bill Connelly
I have always loved the relatability of a lot of baseball stats. WAR (Wins Above Replacement) boils a complicated measure down to, basically, how many wins a player is worth. Measures like EqA (Equivalent Average) give you something more telling and accurate than, say, batting average, but since people know what a good or bad batting average is, they scale it to where it resembles batting average. Something like DIPS (Defense Independent Pitching Stats) takes figures more reflective of pitching quality and equates them to an ERA-type measure.
Clearly, FO readers have begun to figure out what good or bad DVOA, F/+, S&P+, etc., ratings look like, but the casual reader still might be a little thrown by it. As I was looking into ways to improve our F/+ performances against the spread, I began to wonder what S&P+ might look like in a different format. What would it tell us if we looked at a single-game S&P+ performance in terms of a point figure? This would give us an opponent-adjusted, tempo-adjusted (since S&P+ is a per-play measure) way to judge offenses, in a more recognizable form.
I did not want to overthink this process -- I simply used regression equations to convert the "+" scores into something resembling real scores.
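That conversion can be sketched as a simple linear fit. The coefficients and sample data below are hypothetical, since the actual regression equations aren't published here; the point is just the shape of the mapping from a "+" score (100.0 = average) to an adjusted point total:

```python
# Hypothetical sketch: fit a linear map from single-game Off. S&P+
# ("+" score, 100.0 = average) to actual points scored, then use the
# fit to convert any "+" score into an adjusted point total.
# The sample data below is made up for illustration only.
import numpy as np

plus_scores = np.array([80.0, 100.0, 130.0, 160.0, 210.0])
actual_points = np.array([13, 24, 34, 45, 63])

# polyfit(..., 1) returns (slope, intercept) for a degree-1 fit
slope, intercept = np.polyfit(plus_scores, actual_points, 1)

def adjusted_points(plus_score):
    """Convert a single-game Off. S&P+ score to an adjusted point total."""
    return slope * plus_score + intercept
```

With real data, the fit would be trained on every single-game "+" score and actual point total in the sample, but the idea is the same: one pass of regression, then a straight plug-and-chug conversion.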
Let's see what we can learn with these new adjusted point totals.
The first thing we can do is look at single-game standouts. Here are the 10 best offensive performances of the season against BCS conference teams. Why only BCS teams? Because the biggest outliers still take place against non-BCS teams. Ideally, the opponent adjustments would be perfect, and the best and worst performances of the season could come against teams of all shapes and sizes -- but we're not quite there yet.
| Date | Offense | Opponent | Points | Adj. Points |
|---|---|---|---|---|
| Oct. 30 | Oregon State | California | 35 | 66.2 |
| Sept. 11 | Oklahoma | Florida State | 47 | 60.8 |
| Oct. 30 | Iowa | Michigan State | 37 | 58.3 |
| Oct. 7 | Nebraska | Kansas State | 48 | 58.0 |
| Nov. 13 | Alabama | Mississippi State | 30 | 55.3 |
| Sept. 18 | Stanford | Wake Forest | 68 | 55.0 |
Strangely enough, the two best overall performances of the season each came against Miami (Ohio). Cincinnati registered an adjusted point total of 79.9 against them, Missouri 79.8. South Carolina "scored" 73.5 against Troy, Auburn 72.2 against UL-Monroe, and TCU 68.4 against Wyoming.
At the very least, this should help you understand the level of dominance associated with certain "+" scores. Saying Oregon State registered a 210.7 single-game Off. S&P+ against California is impressive if you know that 100.0 is average, but saying they scored the equivalent of 66.2 points is a lot more relatable.
Setting this up for defense is a little trickier. Why? Because of good ole decimal points.
Team A's season-long S&P average is 0.650. Against Team B, they collapse and, while the game is close, record an S&P of 0.050. With no caps of any sort, Team B's single-game defensive S&P+ is (0.650 / 0.050)*100, or 1,300.0. If we were to adjust that into something resembling a point total, it would end up saying something like Team B "allowed" -160 points ... which ruins the point of adjusting to realistic point totals. Because of this, I put in a cap of 250 for any single-game defensive S&P+ figure. It is rare -- only 10 times all season did a team record a perfect 250 against BCS competition -- and necessary. A Def. S&P+ score of 250 still equates to a negative number, -6.9, which I actually somewhat enjoy. Let's face it: Sometimes it seems like a team deserved negative points. Technically we could scale this to simply zero if we wanted to.
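Here is a minimal sketch of that cap logic (function and variable names are my own), using the Team A / Team B example from above:

```python
# Sketch of the defensive S&P+ cap described above (names hypothetical).
# A defense's single-game score compares the opponent's season-long S&P
# to what that opponent managed in this game; a tiny denominator makes
# the ratio explode, so the result is capped at 250.0.
DEF_SPLUS_CAP = 250.0

def single_game_def_splus(opp_season_sp, opp_game_sp):
    """Opponent-relative single-game defensive S&P+, capped at 250.0."""
    raw = (opp_season_sp / opp_game_sp) * 100.0
    return min(raw, DEF_SPLUS_CAP)

# Team A's season-long S&P is 0.650; against Team B they manage 0.050.
# Raw score: (0.650 / 0.050) * 100 = 1300.0, which the cap trims to 250.0.
print(single_game_def_splus(0.650, 0.050))  # 250.0
```

Scaling the resulting -6.9 adjusted points up to zero would just be a second clamp applied after the point conversion.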
Here are the 10 games in which teams "allowed" an adjusted point total of minus-6.9.
| Date | Defense | Opponent | Points Allowed | Adj. Points Allowed |
|---|---|---|---|---|
| Oct. 30 | Oregon State | California | 7 | -6.9 |
| Oct. 9 | Ohio State | Indiana | 10 | -6.9 |
| Sept. 11 | Iowa | Iowa State | 7 | -6.9 |
| Nov. 13 | South Carolina | Florida | 14 | -6.9 |
| Nov. 20 | West Virginia | Louisville | 10 | -6.9 |
We've already talked about Georgia-Kentucky, which involved an almost perfectly coincidental division between Kentucky's success when the game was and wasn't "close." Games like that will happen anytime you're dividing between garbage time and non-garbage time. The other nine games on this list indeed involved near-complete defensive domination while the game was competitive.
So now let's move on to some per-game figures. What happens when we average out teams' Adjusted Scores for the season? We don't end up with exactly the same rankings as full-season Offensive S&P+ -- we're looking at per-game totals instead of per-play totals now. However, comparing Adjusted Points Per Game (PPG) to actual PPG shines a light on who faced a strong slate of defenses and who did not.
| Rank | Team | Adj. PPG | Rank | Actual PPG |
|---|---|---|---|---|
| 22 | San Diego State | 35.0 | 19 | 32.9 |
In all, this averages out pretty closely to the full-season Offensive S&P+ rankings. TCU and Wisconsin, No. 20 and No. 12, respectively, both benefit from the per-game division. This is presumably because they had some pretty significant single-game outliers. TCU "scored" 68.4 adjusted points against Wyoming, 57.1 against Utah and 55.9 against Baylor. Meanwhile, Wisconsin put up totals of 56.7 against Austin Peay and 51.4 against Northwestern.
Comparing per-game adjusted scores to actual points scored, we see some predictable teams getting extra credit. Auburn and Alabama both rank among the top three, and South Carolina gets a boost as well. Missouri, which faced solid defenses like Nebraska, Oklahoma, and Illinois, gets a steady bump too. And of course, since this is S&P+ we're talking about, Oregon is demoted. But we will come back to the Ducks.
Here are the Top 25 defenses according to the same criteria above.
Two teams stand out in the above list: Georgia and Southern Miss. In Todd Grantham's first season as Georgia's defensive coordinator, the Bulldogs improved considerably. Their rankings are inflated by three perfect, 250.0 performances. They "allowed" -6.9 points to Kentucky, UL-Lafayette, and Idaho State and 11.1 to Vanderbilt. Obviously that helps the averages. If we were to scale anything below zero to simply zero, that would have a negative impact on Georgia's per-game averages.
Southern Miss is in the same boat. They "allowed" -6.9 adjusted points to Prairie View A&M and Marshall, and their rankings rose because of it. Even though negative numbers are not realistic, this does illustrate the impact single games can have when we look at per-game figures instead of per-possession or per-play. It boils a ton of possessions and plays into 12 or so single data points.
So which offenses and defenses benefited considerably from the opponent adjustment involved in these figures?
Five teams played a slate of defenses challenging enough that they would have averaged at least six more points per game had they played nothing but average opponents.
What about defenses?
This list consists mostly of terrible defenses, suggesting that there is a larger standard deviation in defensive performance than in offensive performance. The worst of the worst come back toward the middle when we adjust for opponent and use regression equations to divide everybody up.
Imagine if every team in the country played a perfectly average team every week -- that's basically what we are doing when we come up with single-game S&P+ scores. We're comparing the team's overall performance to the baseline average (200.0). If we use adjusted scores, that means every team has an adjusted scoring margin for each game, right? To illustrate what this figure might tell us, let's look at the two national title game participants.
| Date | Team | Opponent | Score | Outcome | Adj. Pts For | Adj. Pts Allowed | Adj. Outcome |
|---|---|---|---|---|---|---|---|
| Sept. 4 | Auburn | Arkansas State | 52-26 | W | 45.9 | 23.7 | W |
| Sept. 9 | Auburn | Mississippi State | 17-14 | W | 31.2 | 21.2 | W |
| Sept. 25 | Auburn | South Carolina | 35-27 | W | 41.5 | 32.0 | W |
| Oct. 30 | Auburn | Ole Miss | 51-31 | W | 38.7 | 33.1 | W |
| Dec. 4 | Auburn | South Carolina | 56-17 | W | 46.4 | 26.4 | W |
Thinking in terms of, "What if they played a perfectly average team each week?", it should be no surprise that Auburn's undefeated record does not change. The Tigers handled their business against the good teams on the schedule, and they cleaned bad teams' clocks.
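The adjusted win-loss column is simply a comparison of the two adjusted point totals. As a quick sketch, using Auburn's rows from the table above:

```python
# Sketch: an "adjusted record" just compares adjusted points scored to
# adjusted points allowed in each game, as if every opponent were average.
# Data: Auburn's (adj. pts for, adj. pts allowed) from the table above.
auburn_games = [
    (45.9, 23.7),   # Arkansas State
    (31.2, 21.2),   # Mississippi State
    (41.5, 32.0),   # South Carolina
    (38.7, 33.1),   # Ole Miss
    (46.4, 26.4),   # South Carolina (SEC title game)
]

wins = sum(1 for pts_for, pts_allowed in auburn_games if pts_for > pts_allowed)
losses = len(auburn_games) - wins
print(f"Adjusted record: {wins}-{losses}")  # 5-0 for the games shown
```
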
On the other hand, looking at Oregon's schedule, you can start to see why S&P+ is not as much of a fan.
| Date | Team | Opponent | Score | Outcome | Adj. Pts For | Adj. Pts Allowed | Adj. Outcome |
|---|---|---|---|---|---|---|---|
| Sept. 4 | Oregon | New Mexico | 72-0 | W | 36.7 | 7.5 | W |
| Sept. 18 | Oregon | Portland State | 69-0 | W | 41.2 | -6.9 | W |
| Sept. 25 | Oregon | Arizona State | 42-31 | W | 25.4 | 28.7 | L |
| Oct. 9 | Oregon | Washington State | 43-23 | W | 31.1 | 33.4 | L |
| Dec. 4 | Oregon | Oregon State | 37-20 | W | 30.9 | 20.3 | W |
Oregon was basically the anti-Georgia, doing the lion's share of its damage right after it eased the game out of "close" range. On average, the current definitions of "close" result in the best correlation between S&P+ and win percentage, so the definitions stay. Regardless of the exact "close" definitions or opponent adjustments, we see here that Oregon simply didn't bring it every week. They beat Arizona State because of turnovers, and they were only decent in wins over Washington State and California. Other teams in the country were better than decent every week. (Of course, Oregon also won all its games, and I'm pretty sure they're not willing to give up their national title bid because some nerd says they didn't bring it every week. Wins are wins in the end.)
So which teams had the best "record" using these adjusted scores?
| Adj. Rec. | Team (Actual Record) |
|---|---|
| | Boise State (11-1) |
| | Ohio State (11-1) |
| | Oklahoma State (10-2) |
| 12-1 | Florida State (9-4) |
| 11-1 | Michigan State (11-1) |
| 11-2 | Northern Illinois (10-3) |
| | South Carolina (9-4) |
| | Texas A&M (9-3) |
| | West Virginia (9-3) |
| | Notre Dame (7-5) |
This basically tells us which teams were consistently good. It doesn't give us a precise definition of which teams were the best for the whole season, but ... neither does real win-loss record, does it? The full-season S&P+ is still preferable in terms of overall evaluation, but that doesn't stop this from being interesting.
It's always interesting to look at things from slightly different angles. If you found this kind of perspective more interesting or telling than the typical S&P+ figures (scaled to 100.0 or 200.0), feel free to share that in the comments. The goal is always to provide information in the most relatable way possible, and I felt it would be interesting to take a look at things in this manner.
Since we're talking about new ways of looking at things...
"I Still Haven't Found What I'm Looking For," by U2
"I'm Looking Through You," by The Beatles
"Keep On Looking," by Sharon Jones & The Dap-Kings
"Looking At You," by MC5
"Looking Down the Barrel Of A Gun," by The Beastie Boys
"Looking for a Way Out," by Uncle Tupelo
"Looking For Another Pure Love," by Stevie Wonder
"Looking for Leonard Cohen, Part 1," by Lizzie West
"Looking for Love," by Johnny Lee
"Looking for the Perfect Beat," by Afrika Bambaataa
From Lizzie West to Bambaataa in two moves. This might be my favorite playlist yet.