Football statistics are weird. This is one of those weird ones. I haven't yet decided whether I should call this one signal or noise, but I think it's interesting enough to do a quick write-up and get some others' input on it.
There's little question that TV has increased the time that it takes to play a college football game. There hasn't been, as far as I can find, any work done on determining whether this increase in game length has any impact on the outcomes on the field.
Looking at games between FBS teams from 2007-2012 (ignoring the one percent of games on the extremes), there appears to be a noticeably negative correlation between game length and home winning percentage.
As a quick orientation to the chart, the black line is the trend line of home winning percentage. The red line is the 58% average home winning percentage for FBS games. The box encloses 88% of all FBS games, and the red star marks the scatter point of both averages: a 58% home winning percentage and a 197-minute game length.
Although the scatter plot shows a lot of variation in winning percentage as the game length increases, there is an obvious negative correlation even in the boxed area, which encloses 88% of all games.
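For anyone who wants to poke at this themselves, here's a minimal sketch of the kind of check described above. It assumes a hypothetical CSV of FBS games with `game_minutes` and `home_win` columns (those names and the file are my own placeholders, not the actual dataset used for the chart).

```python
# Sketch: trim the extreme game lengths, then look at how home winning
# percentage trends with game length. Assumes a hypothetical data file.
import numpy as np
import pandas as pd

games = pd.read_csv("fbs_games_2007_2012.csv")  # hypothetical file name

# Ignore the one percent of games on the extremes of game length
# (here interpreted as 0.5% off each end).
lo, hi = games["game_minutes"].quantile([0.005, 0.995])
trimmed = games[games["game_minutes"].between(lo, hi)]

# Home winning percentage at each game length (in minutes)
by_length = trimmed.groupby("game_minutes")["home_win"].mean()

# Simple linear trend of home winning percentage vs. game length
slope, intercept = np.polyfit(by_length.index, by_length.values, 1)

print(f"Overall home win %: {trimmed['home_win'].mean():.1%}")
print(f"Average game length: {trimmed['game_minutes'].mean():.0f} minutes")
print(f"Trend: {slope:+.4f} change in win probability per extra minute")
```

A negative slope here would match what the chart shows, though a proper look would also want to control for things like overtime games and TV time slots.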
I'm pretty confident that the correlation is there, but I'm stumped as to why it's there. Intuitively, game length shouldn't favor one team or the other. I can't think of any plausible explanation for this, so I'd like to know if you have any ideas.
Let's hear them.