With college football's National Signing Day approaching, a dozen or more recruiting services will, as they do every year, promptly unveil the second-most prominent set of final rankings in the sport, trailing only the actual polls released the day after the national title game.
Do team recruiting rankings mean anything in college football?
Do these rankings correlate with the final poll results?
If certain teams consistently rank high (or low) in their incoming recruiting classes, is it safe to assume that most of those same teams also finish high (or low) at the end of the season based on their on-field performance?
Many critics of recruiting rankings, whether of teams or of individual prospects, consider them nonsense and/or marketing ploys by the aforementioned services.
One article, for example, titled "Recruiting Rankings Don't Translate Into Wins," begins: "Putting stock in recruiting rankings is like investing with Bernard Madoff. It might be fun in the short-term, but you're going to get burned in the end."
After noticing deficiencies in a number of these anti-recruiting-rankings articles, I decided to find out for myself whether there was any connection between the two types of rankings.
I compared team recruiting rankings from 2002-2007 with actual, final rankings from 2004-2009. I decided a six-year span was an adequate amount of time to evaluate, neither too short nor too long. I staggered the rankings two years apart because I determined it takes more than one season, but fewer than three, for the majority of recruiting classes to make a considerable impact.
I calculated each team's actual, final ranking (AR) by averaging its yearly finishes in the AP Poll, including "others receiving votes" beyond the Top 25.
If a team did not receive any AP votes, which occurred more often than not, I used its ranking from Jeff Sagarin’s computer ratings.
I prefer Sagarin's ratings because they are used in the BCS rankings, are viewed by many as highly credible, and rank every Division I team (all 245 of them this past season).
At times, a team did not appear in the AP Poll (including "others receiving votes"), yet its Sagarin ranking was a smaller number than the count of teams receiving AP votes, which would have unfairly slotted it ahead of schools that did receive votes. In those cases, the team was assigned the next number in the pecking order after the total number of AP vote-getters.
For example, Tennessee did not receive any votes in the AP Poll last season but finished 36th in Sagarin’s rankings. Since 40 teams received AP votes, Tennessee was ranked 41st for 2009.
I calculated each team’s recruiting ranking (RR) by averaging its yearly ranking in the Top 75 “Frosh Recruiting Ratings” (freshmen and JUCOs) located in Phil Steele’s college football preview magazine. If a team did not appear in Steele’s top 75, I used its recruiting ranking from Rivals.com.
I favor Steele’s rankings because he compiles information “based on the many different recruiting services across the country,” including Rivals.com, Parade, Tom Lemming, PrepStar, and ESPN.
Similar to the Tennessee example, if a team did not appear in Steele’s top 75 but was in Rivals.com's top 75, the team was given a ranking of 76th.
For example, Utah did not appear in Steele’s top 75 in 2007 but was 71st in Rivals.com’s recruiting rankings. To be fair to those schools ranked by Steele, Utah was ranked 76th in recruiting for that season.
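Both fallback rules (AP Poll to Sagarin, and Steele to Rivals.com) follow the same pattern, which can be sketched in a few lines of Python. This is a minimal illustration of the rule described above; the function name and structure are mine, not part of the original methodology:

```python
def assign_rank(primary_rank, fallback_rank, primary_count):
    """Season ranking under the fallback rule described above.

    primary_rank   -- position in the primary list (AP Poll incl. "others
                      receiving votes", or Steele's top 75); None if absent.
    fallback_rank  -- position in the fallback list (Sagarin, or Rivals.com).
    primary_count  -- how many teams the primary list ranked that season.
    """
    if primary_rank is not None:
        return primary_rank
    # A team absent from the primary list can never leapfrog a team the
    # primary list did rank, so its fallback rank is floored at the next
    # number after the primary list's last spot.
    return max(fallback_rank, primary_count + 1)

# 2009 Tennessee: no AP votes, 36th in Sagarin, 40 AP vote-getters -> 41st
print(assign_rank(None, 36, 40))
# 2007 Utah: outside Steele's top 75, 71st on Rivals, 75 in Steele -> 76th
print(assign_rank(None, 71, 75))
```

A team whose fallback rank already falls below the primary list's last spot simply keeps that fallback rank, which matches how the Sagarin ratings were used when no AP votes were received.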
Below is the 2004-2009 AR top 25 to the left and the 2002-2007 RR top 25 on the right. Each team is listed with its average ranking over the six-year period, followed by its alternative ranking in parentheses (RR on left, AR on right).
*For teams with the same AR average, like USC and Texas, the total number of AP Poll votes from 2004-2009 determined the ranking order.
1. USC, 5.8* (1) 1. USC, 2.0 (1)
2. Texas, 5.8* (2) 2. Texas, 5.0 (2)
3. Ohio State, 7.5 (5) 3. Michigan, 6.3 (33)
4. Florida, 9.3 (4) 4. Florida, 7.0 (4)
5. Virginia Tech, 11.7 (23) 5. Ohio State, 7.3 (3)
6. LSU, 11.8 (9) 6. Oklahoma, 8.5 (7)
7. Oklahoma, 12.5 (6) 7. Tennessee, 8.8 (26)
8. Georgia, 14.7 (12) 8. Miami (Fla), 9.0 (28)
9. Boise State, 16.0 (73) 9. LSU, 9.5 (6)
10. West Virginia, 16.7 (47) 10. Notre Dame, 10.0 (39)
11. Texas Tech, 21.5 (42) 11. Florida State, 10.8 (20)
12. Auburn, 22.0 (15) 12. Georgia, 11.3 (8)
13. Penn State, 22.3 (13) 13. Penn State, 15.3 (13)
14. Utah, 23.3 (71) 14. Texas A&M, 16.8 (47)
15. Wisconsin, 23.3 (24) 15. Auburn, 20.0 (12)
16. Oregon, 23.7 (31) 16. Alabama, 22.0 (21)
17. Boston College, 24.3 (27) 17. Nebraska, 22.3 (30)
18. California, 26.7 (24) 18. Virginia, 22.8 (45)
19. BYU, 27.2 (56) 19. UCLA, 23.0 (41)
20. Florida State, 27.8 (11) 20. South Carolina, 23.8 (35)
21. Alabama, 28.2 (16) 21. Iowa, 24.0 (25)
22. Clemson, 30.5 (28) 22. Maryland, 26.5 (49)
23. TCU, 31.2 (61) 23. Virginia Tech, 26.7 (5)
24. Oregon State, 31.2 (51) 24. California, 28.5 (18)
25. Iowa, 32.7 (21) 25. Wisconsin, 28.5 (15)
[Photo: USC's Mark Sanchez, a top high school prospect in 2005, led the Trojans to a No. 3 ranking in the 2008 final AP Poll.]
What stands out first about these rankings is that USC and Texas are Nos. 1 and 2, respectively, in both the AR and RR, and most of the top teams appear in both sets: six of the top seven teams in the AR are in the RR's top nine.
Within each top 10, however, several noteworthy teams show little correlation between their recruiting and final rankings.
Virginia Tech, West Virginia, and especially Boise State appear to have exceeded expectations, while Michigan, Tennessee, Miami (Fla), and Notre Dame have not performed up to their high level of recruiting.
Of all the FBS and FCS teams examined, below are the top 10 "overachievers" (left) and "underachievers" (right), according to the difference in AR and RR. Each team is listed with its difference in the two rankings followed by its AR and RR in parentheses.
1. Boise State, +64 (9/73) 1. Illinois, -39 (75/36)
2. Navy, +57 (37/94) 2. Miss. State, -36 (73/37)
Utah, +57 (14/71) 3. Washington, -35 (74/39)
4. Cincinnati, +49 (34/83) 4. Texas A&M, -33 (47/14)
5. TCU, +38 (23/61) 5. Ole Miss, -32 (61/29)
6. BYU, +37 (19/56) 6. Michigan, -30 (33/3)
West Virginia, +37 (10/47) 7. Notre Dame, -29 (39/10)
8. Air Force, +35 (57/92) 8. Duke, -27 (87/60)
9. Connecticut, +31 (46/77) Maryland, -27 (49/22)
Texas Tech, +31 (11/42) Virginia, -27 (45/18)
Most of the overachievers are non-BCS teams recognized as having excellent head coaches; for a variety of reasons, these schools have become better equipped over time to compete with the BCS squads. Also, a number of the overachievers run a distinctive offensive system: they sign players who may not be highly recruited but who fit, and later excel within, the team's system.
All of the top 10 underachievers are from BCS conferences. Nine of the 10 have fired at least one head coach since the end of the 2004 season, and rightfully so, given the significant gap between their recruiting and on-field performance.
Nevertheless, there does seem to be some correlation between recruiting and actual rankings for most of the teams evaluated.
Nearly two-thirds (16 of 25) of the teams in the AR top 25, and 56 percent (14 of 25) of the RR top 25, have a difference in rank of only 10 spots or fewer. Of all the teams evaluated, more than 57 percent differ by 15 spots or fewer, while just 23 percent differ by 25 or more.
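The over/underachiever differences and these threshold percentages come from simple rank arithmetic, which can be sketched as follows. The handful of (team, AR, RR) pairs below are pulled from the tables above as a sample; the actual study, of course, covered every team evaluated:

```python
# A few (team, AR, RR) pairs taken from the tables above; the full
# analysis used every FBS and FCS team evaluated.
sample = [
    ("USC", 1, 1),
    ("Boise State", 9, 73),
    ("Michigan", 33, 3),
    ("Penn State", 13, 13),
    ("Iowa", 25, 21),
]

def achievement(ar, rr):
    """Positive = overachiever (finished better than recruiting predicted)."""
    return rr - ar

def share_within(teams, spots):
    """Fraction of teams whose AR/RR gap is at most `spots` positions."""
    return sum(abs(ar - rr) <= spots for _, ar, rr in teams) / len(teams)

for name, ar, rr in sample:
    print(f"{name}: {achievement(ar, rr):+d}")
print(share_within(sample, 10))
```

Running this on the sample reproduces Boise State's +64 and Michigan's -30 from the overachiever/underachiever lists; `share_within` is the same calculation behind the 10-, 15-, and 25-spot percentages, just applied here to five teams instead of the whole field.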
Scott Kennedy, Director of Scouting for Scout.com, said, “Team recruiting rankings are a compilation of individuals. Games are won by TEAMS. It's not always the best collection of individuals that makes up the best teams.”
I agree with Kennedy: the best recruiting classes do not always translate into the best teams. More often than not, however, there appears to be at least some parallel between a college football program's recruiting rankings and its final rankings, and this holds especially true for the traditional powers at the top of either list.