Over the years the BCS (Bowl Championship Series) has become synonymous with the conferences that play the highest level of college football. The schools that belong to the automatic-qualifying (AQ) BCS conferences are granted a higher status on the basis of inherited prestige: they are judged worthy of that status simply because of the conference they belong to.
This collegiate caste system has been reinforced by fans and media alike whenever a school is labeled a “BCS school” or a “non-BCS school”; technically they are all BCS schools, some AQ and others non-AQ. For some mystical reason, an AQ school is held in higher football esteem than a non-AQ school simply by its conference association. If a non-AQ school proves its strength on the field of battle and makes it to a BCS bowl, it is called a “Cinderella” team, yet if a normally mediocre AQ team makes it to a BCS bowl, it is spoken of as if it belonged there. If schools were graded by their individual football results rather than by the conference they happen to be affiliated with, one could determine which schools actually deserve the status of a football power and which are simply riding coat-tails.
On the other side of the spectrum, schools in non-AQ conferences are regularly rated by their success against AQ schools, and that logic is deeply flawed. In an average year, wouldn’t a win against Ohio State be more impressive than a win against Indiana? Both are from the same AQ conference and both carry the title of “BCS school,” but any college football fan knows there are light-years between the quality of the two football programs.
To quantify which schools actually deserve to be called upper-echelon football programs, I needed a standard of football against which all teams could be compared.
I am a believer that computer rankings can help draw some general distinctions between teams. Computer rankings (if done correctly) evaluate the whole body of work across all games played, instead of making the emotional choice, based on the last few games, that most human polls tend to make. The more data a computer algorithm is given, the sharper it becomes, so computer rankings are not very reliable until three or four weeks into the season. A good algorithm accounts for important factors like strength of schedule, margin of victory, and where the game was played, while many human polls weigh heavily on wins and losses alone. For example, a computer may rank a 9-2 SEC team over an 11-0 WAC team: the SEC team has a far superior strength of schedule and it comes down to who they lost to, while a human poll would simply rank the WAC team higher because it is undefeated.
The computerized ranking that I follow the closest and trust the most (besides my own) is the Sagarin ranking, so I will use this ranking data as part of my analysis. Specifically, I will use Sagarin’s ELO CHESS rating system, which is utilized as part of the BCS formula.
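Sagarin’s exact ELO CHESS formula is not published, but the general flavor of an Elo-style rating can be sketched in a few lines. The K factor and 400-point scale below are the standard chess-Elo conventions, used here purely as illustrative assumptions, not Sagarin’s actual parameters:

```python
# Minimal sketch of an Elo-style rating update. The K factor and scale
# are standard chess-Elo conventions, NOT Sagarin's real parameters.

def expected_score(rating_a, rating_b, scale=400.0):
    """Probability that team A beats team B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / scale))

def update_ratings(rating_a, rating_b, a_won, k=32.0):
    """Return new (rating_a, rating_b) after one game; zero-sum update."""
    exp_a = expected_score(rating_a, rating_b)
    actual_a = 1.0 if a_won else 0.0
    rating_a += k * (actual_a - exp_a)
    rating_b += k * (exp_a - actual_a)
    return rating_a, rating_b

# An upset moves ratings more than an expected result does, which is
# how a system like this weighs the whole body of work over a season.
r_fav, r_dog = 1700.0, 1500.0
new_fav, new_dog = update_ratings(r_fav, r_dog, a_won=False)
```

The key property for our purposes is that every game adjusts the ratings in proportion to how surprising the result was, so a team’s rating reflects all of its results, not just the most recent headlines.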
In addition I will use the top non-AQ conference, the Mountain West Conference (MWC), as a measuring stick to help determine a BCS hierarchy.
I have compiled the Sagarin season-ending rankings since 2005; I used 2005 as the cutoff because that was the season TCU joined the MWC. A five-year span is a good indicator of a school’s current football strength; if I ran this analysis again in five years, the schools at the top and bottom of the list would stay basically the same. I averaged the season-ending ranking of each BCS AQ school and each school in the MWC. A criticism of the MWC has been that it is a very top-heavy conference, with three very good teams and six weak ones, and the five-year average of Sagarin rankings does show this to be true. But it also leads to some interesting results in my attempt to quantify the strength of BCS AQ schools.
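The averaging step itself is simple arithmetic. As a sketch, here it is for the top three MWC schools; the per-year ranks below are placeholder values chosen only so the averages match the figures in this article, not the actual 2005–2009 season finishes:

```python
# Five-year averaging of season-ending Sagarin ranks. The yearly ranks
# are illustrative placeholders; only the resulting averages (22, 27, 29)
# come from the article's data.

finishes = {
    "TCU":  [30, 20, 25, 15, 20],
    "BYU":  [35, 25, 25, 20, 30],
    "Utah": [40, 30, 30, 10, 35],
}

averages = {school: sum(ranks) / len(ranks) for school, ranks in finishes.items()}

# Sort best first (a LOWER average rank is better).
for school, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{school}: {avg:.0f}")
```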
I have divided the MWC into three sections based on the five-year average rankings; the number to the right of each school denotes that school’s average season finish in the Sagarin rankings:
Top third –
TCU (22), BYU (27), Utah (29)
Middle third –
Air Force (62), Wyoming (76), New Mexico (77)
Bottom third –
Colorado State (78), San Diego State (89), UNLV (93)
Of the 65 schools in BCS AQ conferences, only 12 averaged out better over the past five years than the top MWC schools. These schools are the Who’s Who of college football: Florida (7), USC (8), Ohio State (9), Texas (9), LSU (9), Virginia Tech (13), Oklahoma (14), West Virginia (14), Georgia (14), Penn State (14), Oregon (17), and Alabama (21).
No real surprises here; these are the BCS schools that can be called “the best of the best.” Based on their averages, any of these schools would win the MWC just about every year.
The next group of BCS schools have averages from 22 to 29, the same range as the MWC’s top third, and would be considered contenders for the MWC title each season. There are seven schools in this group: Auburn (25), Oregon State (25), Wisconsin (25), Boston College (25), Texas Tech (26), Cal (28), and Florida State (28).
The next group of BCS schools have averages from 30 to 61. These are schools that could compete in the MWC and might win a title once in a while, but most of the time they would finish outside the MWC’s top three. There are too many schools in this group to name individually, but it is interesting to note that it contains 33 of the 65 BCS AQ schools, or 51% of the total.
The middle of the MWC consists of schools with averages from 62 to 77, and 10 BCS AQ schools fall into this range. Theoretically these schools would rarely contend for a MWC title. They are Kansas State (62), Mississippi State (62), Purdue (63), Minnesota (64), Colorado (64), Baylor (69), Washington State (71), Iowa State (73), Washington (73), and Illinois (76).
That leaves the bottom of the MWC, and believe it or not, three BCS AQ teams fall into this group: Indiana (79), Syracuse (84), and Duke (91).
The final breakdown is as follows:
Of the 65 BCS automatic-qualifying schools:
12 schools (18%) would dominate the MWC
7 schools (11%) would compete for a MWC title
33 schools (51%) are not quite good enough to be in the top third of the MWC
10 schools (15%) could compete with the middle of MWC
3 schools (5%) would be MWC bottom dwellers
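The breakdown above is straightforward arithmetic on the tier counts; as a sanity check, here it is worked out (counts taken directly from the text, with percentages rounded to whole numbers):

```python
# The final breakdown as arithmetic: tier counts out of 65 AQ schools.

tiers = {
    "would dominate the MWC": 12,
    "would compete for a MWC title": 7,
    "not quite good enough for the MWC's top third": 33,
    "could compete with the middle of the MWC": 10,
    "would be MWC bottom dwellers": 3,
}

total = sum(tiers.values())  # 65 AQ schools in all
for label, count in tiers.items():
    print(f"{count} schools ({count / total:.0%}) {label}")

# The bottom three tiers combined: 33 + 10 + 3 = 46 schools, i.e. 71%.
struggling = (33 + 10 + 3) / total
```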
In theory, 71% of BCS AQ schools, over two-thirds of them, would have a difficult time competing in the MWC, yet these same teams can get a free pass to a BCS bowl game just because of their conference affiliation. The MWC is criticized for being top-heavy, but by proportion the BCS AQ conferences as a whole have the same problem: roughly one third of the teams are good enough to consistently compete with the best schools in the country, while the other two-thirds ride the coat-tails of the schools doing all the work.
FYI for you Golden Domers out there: Notre Dame averaged out at 43, so theoretically it would have fallen in the “not quite good enough” group.