The Cost of Winning: Revised return on investment metric for NCAA basketball

I. Introduction

A few days ago, I wrote up a report in which I investigated the idea of return on investment for Division I basketball programs. To do so, I looked at ten years’ worth of financial reports from the Department of Education’s Equity in Athletics Data Analysis. I measured each program’s total wins between 2008 and 2017 as a function of its annual programmatic expenses over that span. The resulting metric is called “adjusted return on expenses”, or ROE+.

There were certainly some interesting findings, but they were preliminary and gave an incomplete view of what could be considered “returns” on the investment. The most obvious weakness in the original metric is that it treated every win equally; that is, strength of schedule was not included.

This omission was intentional, as I was more focused on just measuring the raw wins in that introductory piece.

However, the results ended up skewed towards successful teams in small conferences that don’t require as much investment. And while I still consider that to be an interesting dataset, in order to get a better picture of the quality of these wins, strength of schedule needs to be included in the metric. I also wanted to include NCAA Tournament performance into the equation.

II. Revised ROE+ Metric

In order to account for strength of schedule, I used the SOS metric from Sports-Reference. For each season, the SOS results for each team were adjusted to fall between 0 and 1, with a result of 0.500 meaning that a team played a perfectly average schedule. This means that every season, the team with the highest SOS would score 1.00, while the team with the lowest would score 0.001. I adjusted the lower limit so that the team with the lowest SOS would not automatically return an ROE+ score of 0.
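
To make that adjustment concrete, here’s a minimal sketch of the rescaling in Python. This is my own reconstruction, not the exact procedure used: I’m assuming a simple min-max scale with the floor raised to 0.001, which matches the stated endpoints but doesn’t guarantee that a perfectly average schedule lands exactly on 0.500.

```python
def rescale_sos(raw_sos_by_team):
    """Rescale one season's raw SOS figures so the toughest schedule
    scores 1.000 and the weakest scores 0.001 (not 0, so that no team
    automatically zeroes out its ROE+)."""
    values = raw_sos_by_team.values()
    lo, hi = min(values), max(values)
    return {
        team: 0.001 + 0.999 * (raw - lo) / (hi - lo)
        for team, raw in raw_sos_by_team.items()
    }
```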

Aside from just SOS, it also seemed prudent to include NCAA Tournament performance into the ROE+ measurement. Since tournament success really is the goal of any season, I decided to make this a pretty important feature of the revised metric. To measure the success, I implemented a points system for results in the NCAA Tournament.

  • 0 points – Missed the NCAA Tournament
  • 1 point – Eliminated in the Round of 64/65/68
  • 2 points – Eliminated in the Round of 32
  • 4 points – Eliminated in the Sweet 16
  • 8 points – Eliminated in the Elite 8
  • 16 points – Eliminated in the Final Four
  • 32 points – Eliminated in the Championship
  • 64 points – Won the National Championship
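
That points schedule maps directly to code. Here’s a tiny sketch (the result labels are my own hypothetical names, not anything from the dataset):

```python
# Points awarded for a season's NCAA Tournament result, per the list above
NCAAT_POINTS = {
    "missed": 0,
    "round_of_64": 1,   # covers the Round of 64/65/68
    "round_of_32": 2,
    "sweet_16": 4,
    "elite_8": 8,
    "final_four": 16,
    "championship_game": 32,
    "champion": 64,
}

def tournament_points(season_results):
    """Total NCAA Tournament points over a span of seasons."""
    return sum(NCAAT_POINTS[result] for result in season_results)
```

A title run plus a Round of 32 exit in another year, for instance, would bank 66 points across those two seasons.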

So now, rather than just looking at raw wins, I have something called “adjusted returns”. This measures a team’s average wins plus its average NCAA Tournament points per season, multiplied by its average SOS and its winning percentage over the ten-year span of the data. I believe that this measurement more accurately reflects success at the Division I level.

Another change to the metric is on the financial side. Before, I was treating every dollar spent as equal, regardless of the season. However, Division I basketball has become increasingly expensive every year. The national spending average in 2008 was $2.4 million, but that number grew to over $4 million by 2017. So, rather than look at the raw dollars, I’m now using the relative spending average. For example, if a team spent $12 million in 2008, its relative spending would be 500%, or five times the national average. That same $12 million in 2017 would register a relative spending result of 300%, or three times the national average.
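
In code, that conversion is just a ratio against the given season’s national average. A trivial sketch, using the numbers from the example above:

```python
def relative_spending(expenses, national_average):
    """A program's spending as a multiple of that season's national average."""
    return expenses / national_average

relative_spending(12_000_000, 2_400_000)  # 2008: 5.0, i.e. 500%
relative_spending(12_000_000, 4_000_000)  # 2017: 3.0, i.e. 300%
```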

Defining terms and metrics in academic prose can be a bit confusing, so here’s a quick summary of the components of the revised ROE+ metric:

  • Adjusted Returns = (Avg Wins + Avg NCAAT Points per season) * Avg SOS * Win %
  • Relative Spending = Expenses / Given Year’s National Average
  • (New) ROE+ = Adjusted Returns / Relative Spending
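
Assembled into one function, the revised metric looks something like the sketch below. This is my own reconstruction; in particular, averaging the NCAA Tournament points per season is an inference on my part, made because it reproduces the published figures.

```python
def roe_plus(avg_wins, ncaat_points, seasons, avg_sos, win_pct, rel_spending):
    """Revised ROE+ = Adjusted Returns / Relative Spending.

    Adjusted Returns = (avg wins + avg NCAAT points per season)
                       * avg SOS * winning percentage.
    rel_spending is a fraction of the national average (2.526 = 252.6%).
    """
    adjusted_returns = (avg_wins + ncaat_points / seasons) * avg_sos * win_pct
    return adjusted_returns / rel_spending

# North Carolina, 2008-2017: 29.2 avg wins, 200 NCAAT points,
# 0.925 SOS, 0.772 win%, 252.6% relative spending
roe_plus(29.2, 200, 10, 0.925, 0.772, 2.526)  # ~13.91
```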

Here is the revised ROE+ spreadsheet with all of the new information. I hope that it is cleaner and easier to parse for people who want to comb through the data or even use it for their own purposes. Before we get into the rankings, I want to once again credit Sports-Reference and the Equity in Athletics Data Analysis for their data. If you use this information elsewhere, all I ask is that you credit me, Sports-Reference, and the EADA.

III. Conference-Independent Rankings

With all of this in mind, there are, of course, new winners and losers in this measurement. In this installment, I will first take a look at the top and bottom programs in the revised ROE+ rankings. This will be conference-independent; that is, programs that have spent time in multiple conferences will be treated as a single entity. I have also broken the numbers down in a conference-dependent way, but I will discuss that a little later.

Below are the top 10 teams in the revised conference-independent ROE+ rankings.

1. Saint Mary’s – 14.81 ROE+

  • Average Wins: 26.3 (0.777 win%, 0.611 SOS)
  • NCAA Tournament Points: 9
  • Average Adjusted Returns: 12.9
  • Relative Spending Average: 87.2%

2. Harvard – 14.39 ROE+

  • Average Wins: 19.3 (0.645 win%, 0.428 SOS)
  • NCAA Tournament Points: 6
  • Average Adjusted Returns: 5.5
  • Relative Spending Average: 38.2%

3. Butler – 13.95 ROE+

  • Average Wins: 25.0 (0.717 win%, 0.755 SOS)
  • NCAA Tournament Points: 77
  • Average Adjusted Returns: 17.7
  • Relative Spending Average: 127.0%

4. North Carolina – 13.91 ROE+

  • Average Wins: 29.2 (0.772 win%, 0.925 SOS)
  • NCAA Tournament Points: 200
  • Average Adjusted Returns: 35.1
  • Relative Spending Average: 252.6%

5. South Dakota State – 13.84 ROE+

  • Average Wins: 20.6 (0.613 win%, 0.403 SOS)
  • NCAA Tournament Points: 4
  • Average Adjusted Returns: 5.2
  • Relative Spending Average: 37.5%

6. UNC Asheville – 13.30 ROE+

  • Average Wins: 19.0 (0.582 win%, 0.324 SOS)
  • NCAA Tournament Points: 3
  • Average Adjusted Returns: 3.6
  • Relative Spending Average: 27.4%

7. Sam Houston State – 12.89 ROE+

  • Average Wins: 20.3 (0.616 win%, 0.301 SOS)
  • NCAA Tournament Points: 1
  • Average Adjusted Returns: 3.8
  • Relative Spending Average: 29.3%

8. Princeton – 12.64 ROE+

  • Average Wins: 18.5 (0.615 win%, 0.385 SOS)
  • NCAA Tournament Points: 2
  • Average Adjusted Returns: 4.4
  • Relative Spending Average: 35.1%

9. Yale – 12.32 ROE+

  • Average Wins: 16.8 (0.559 win%, 0.403 SOS)
  • NCAA Tournament Points: 2
  • Average Adjusted Returns: 3.8
  • Relative Spending Average: 31.1%

10. Green Bay – 12.32 ROE+

  • Average Wins: 19.5 (0.597 win%, 0.528 SOS)
  • NCAA Tournament Points: 1
  • Average Adjusted Returns: 6.2
  • Relative Spending Average: 50.1%

Like in the previous ROE+ rankings, the top teams in the Ivy League do very well here. As I discussed in the last piece, the Ivy gets a boost from not offering athletic scholarships, which keeps expenses low. Some other teams who dominated low-spending leagues appear once again, such as UNC Asheville, South Dakota State, and Sam Houston State.

A key difference here, though, is the inclusion of some more prominent mid-major and high-major teams. NCAA Tournament performance really helped boost the case for schools like North Carolina and Butler. Even Kansas, with its monstrous 343.7% relative spending average, ranks 24th in the new ROE+ metric, due to its NCAA Tournament successes.

The new formula also allows a team like Saint Mary’s, with its modest spending, high win totals, and above-average SOS, to rocket to the top of the rankings. Even without much NCAA Tournament success to speak of, the Gaels still had a great decade. Green Bay is another program who, somewhat surprisingly, fits this mold. These results, in my mind, capture the essence of the idea of return on expenses: consistently doing more with less.

Conversely, there are a lot of teams who are doing less with more. There are also some teams who are simply doing less than anyone else, regardless of how little they spend. Let’s take a look at the bottom 10 teams in the revised rankings.

339. Rice – 1.74 ROE+

  • Average Wins: 11.3 (0.345 win%, 0.532 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 2.1
  • Relative Spending Average: 119.2%

340. Central Arkansas – 1.63 ROE+

  • Average Wins: 7.3 (0.247 win%, 0.261 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.5
  • Relative Spending Average: 28.8%

341. Florida A&M – 1.61 ROE+

  • Average Wins: 9.5 (0.303 win%, 0.175 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.5
  • Relative Spending Average: 31.4%

342. Fairleigh Dickinson – 1.49 ROE+

  • Average Wins: 8.8 (0.288 win%, 0.283 SOS)
  • NCAA Tournament Points: 1
  • Average Adjusted Returns: 0.7
  • Relative Spending Average: 48.7%

343. Alcorn State – 1.42 ROE+

  • Average Wins: 9.0 (0.287 win%, 0.097 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.3
  • Relative Spending Average: 17.7%

344. Fordham – 1.40 ROE+

  • Average Wins: 9.1 (0.301 win%, 0.615 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 1.7
  • Relative Spending Average: 120.4%

345. DePaul – 1.30 ROE+

  • Average Wins: 10.0 (0.316 win%, 0.830 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 2.6
  • Relative Spending Average: 202.0%

346. Alabama State – 1.21 ROE+

  • Average Wins: 15.7 (0.499 win%, 0.042 SOS)
  • NCAA Tournament Points: 2
  • Average Adjusted Returns: 0.3
  • Relative Spending Average: 27.6%

347. Grambling – 0.59 ROE+

  • Average Wins: 6.6 (0.218 win%, 0.075 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.1
  • Relative Spending Average: 18.3%

348. Alabama A&M – 0.40 ROE+

  • Average Wins: 10.0 (0.348 win%, 0.027 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.1
  • Relative Spending Average: 23.0%

Here, we see a lot of low-performing, low-spending teams from conferences such as the SWAC, MEAC, NEC, and Southland. The key measure weighing these teams down is their meager SOS numbers. There are a lot of contributing factors behind such low SOS numbers, including the fact that many of these cash-strapped schools have to travel around to be the sparring partner for big-name programs in the early season, all just to keep the lights on. I wrote about the issues of HBCU scheduling last November, if you want to read more about this.

However, we still get some of the more well-known programs from high-major conferences and some of the more prominent mid-majors. DePaul, who was dead-last in the first rankings, still finds itself deep in the hole. Fordham and Rice are other programs who fit the bill here. As discussed in the previous piece, these schools are all in major U.S. cities that also have NBA teams, which may contribute to a lower level of public interest. Consequently, these programs may have a harder time drawing top-level talent to play for them, resulting in disappointing win totals.

IV. Conference-Dependent Rankings

Besides those overall rankings, though, I felt that it was worth looking at ROE+ from a conference-dependent standpoint. NCAA realignment was rampant in Division I during the decade in question, so a lot of teams’ financial outlooks changed significantly in this time. Some teams made the jump from mid-major conferences to high-major leagues, which resulted in large spending increases. Some mid-majors made the leap without increasing their bottom line by too much. Others stepped down to lower leagues and began tightening the purse strings.

In order to really grasp which teams delivered the most returns on investment, I also looked at the ROE+ numbers when treating teams who realigned as separate entities. For example, between 2008 and 2017, Colorado played four seasons in the Big 12 and six seasons in the Pacific 12. So for the conference-dependent rankings, Colorado (Big 12) and Colorado (Pacific 12) have separate entries. There are 450 teams in the rankings under these conditions.

I also made the decision to divide the Big East into two separate conferences, because of the massive restructuring of the conference in the wake of the American Athletic Conference’s creation. I wrestled with whether or not to do this for other leagues that experienced major changes, but ultimately felt that the Big East was a special case.

Here are the top ten teams in the conference-dependent revised ROE+ measure.

1. Butler (Horizon League) – 18.31 ROE+

  • Seasons: 5
  • Average Wins: 27.8 (0.779 win%, 0.677 SOS)
  • NCAA Tournament Points: 67
  • Average Adjusted Returns: 21.7
  • Relative Spending Average: 118.7%

2. Saint Mary’s (West Coast Conference) – 14.81 ROE+

  • Seasons: 10
  • Average Wins: 26.3 (0.777 win%, 0.611 SOS)
  • NCAA Tournament Points: 9
  • Average Adjusted Returns: 12.9
  • Relative Spending Average: 87.2%

3. Harvard (Ivy League) – 14.39 ROE+

  • Seasons: 10
  • Average Wins: 19.3 (0.645 win%, 0.428 SOS)
  • NCAA Tournament Points: 6
  • Average Adjusted Returns: 5.5
  • Relative Spending Average: 38.2%

4. North Carolina (Atlantic Coast Conference) – 13.91 ROE+

  • Seasons: 10
  • Average Wins: 29.2 (0.772 win%, 0.925 SOS)
  • NCAA Tournament Points: 200
  • Average Adjusted Returns: 35.1
  • Relative Spending Average: 252.6%

5. South Dakota State (Summit League) – 13.84 ROE+

  • Seasons: 9
  • Average Wins: 20.6 (0.613 win%, 0.403 SOS)
  • NCAA Tournament Points: 4
  • Average Adjusted Returns: 5.2
  • Relative Spending Average: 37.5%

6. Butler (Atlantic 10 Conference) – 13.50 ROE+

  • Seasons: 1
  • Average Wins: 27.0 (0.750 win%, 0.786 SOS)
  • NCAA Tournament Points: 2
  • Average Adjusted Returns: 17.1
  • Relative Spending Average: 126.6%

7. UNC Asheville (Big South Conference) – 13.30 ROE+

  • Seasons: 10
  • Average Wins: 19.0 (0.582 win%, 0.324 SOS)
  • NCAA Tournament Points: 3
  • Average Adjusted Returns: 3.6
  • Relative Spending Average: 27.4%

8. Chicago State (Division I Independents) – 13.01 ROE+

  • Seasons: 2
  • Average Wins: 15.0 (0.493 win%, 0.474 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 3.5
  • Relative Spending Average: 27.0%

9. Sam Houston State (Southland) – 12.89 ROE+

  • Seasons: 10
  • Average Wins: 20.3 (0.616 win%, 0.301 SOS)
  • NCAA Tournament Points: 1
  • Average Adjusted Returns: 3.8
  • Relative Spending Average: 29.3%

10. Villanova (new Big East Conference) – 12.88 ROE+

  • Seasons: 4
  • Average Wins: 32.3 (0.883 win%, 0.854 SOS)
  • NCAA Tournament Points: 70
  • Average Adjusted Returns: 37.5
  • Relative Spending Average: 291.5%

A lot of this list is the same as the conference-independent rankings, but there are some pretty interesting differences. Obviously, Butler’s time as a mid-major was hugely successful, especially when they made two consecutive national championship games while representing the Horizon League. Interestingly, Butler’s spending hasn’t increased very much since that time, even though they are now in the Big East. Their win totals haven’t quite kept up, though, probably because their strength of schedule has jumped from 0.677 in the Horizon to 0.845 in the new Big East.

One of the biggest surprises here is the presence of Chicago State from their two-season run as an independent Division I team. Over the past few seasons, the Cougars have become one of the least successful teams in the country, but times weren’t always so bad. With decent returns against the backdrop of a very low spending average, they shot up in the rankings. However, their four seasons in the WAC have them ranked 310th out of 450 teams, while their stint in the now-defunct Great West was even worse, registering as 410th. Clearly, being in a conference hasn’t suited Chicago State very well.

Villanova is the last new face in these rankings, using their 2016 championship to vault into such a lofty position. Their adjusted returns of 37.5 during their time in the new Big East is tops in the conference-dependent rankings. Yes, their spending is pretty huge (17th in the nation), but the returns are high enough to overcome that fact. It will be interesting to see if, with the next EADA release, the Wildcats jump into the top ten for conference-independent rankings as well.

441. Alcorn State (Southwestern Athletic Conference) – 1.42 ROE+

  • Seasons: 10
  • Average Wins: 9.0 (0.287 win%, 0.097 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.3
  • Relative Spending Average: 17.7%

442. Fordham (Atlantic 10 Conference) – 1.40 ROE+

  • Seasons: 10
  • Average Wins: 9.1 (0.301 win%, 0.615 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 1.7
  • Relative Spending Average: 120.4%

443. Rutgers (Big Ten Conference) – 1.38 ROE+

  • Seasons: 3
  • Average Wins: 10.7 (0.329 win%, 0.781 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 2.7
  • Relative Spending Average: 198.1%

444. Longwood (Big South Conference) – 1.34 ROE+

  • Seasons: 5
  • Average Wins: 8.6 (0.264 win%, 0.259 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.6
  • Relative Spending Average: 43.6%

445. South Florida (American Athletic Conference) – 1.26 ROE+

  • Seasons: 4
  • Average Wins: 9.0 (0.283 win%, 0.654 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 1.7
  • Relative Spending Average: 132.0%

446. Alabama State (Southwestern Athletic Conference) – 1.21 ROE+

  • Seasons: 10
  • Average Wins: 15.7 (0.499 win%, 0.042 SOS)
  • NCAA Tournament Points: 2
  • Average Adjusted Returns: 0.3
  • Relative Spending Average: 27.6%

447. DePaul (Big East Conference) – 1.18 ROE+

  • Seasons: 6
  • Average Wins: 9.7 (0.309 win%, 0.834 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 2.5
  • Relative Spending Average: 212.0%

448. Houston Baptist (Great West Conference) – 1.14 ROE+

  • Seasons: 3
  • Average Wins: 9.7 (0.315 win%, 0.162 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.5
  • Relative Spending Average: 43.3%

449. Grambling (Southwestern Athletic Conference) – 0.59 ROE+

  • Seasons: 10
  • Average Wins: 6.6 (0.218 win%, 0.075 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.1
  • Relative Spending Average: 18.3%

450. Alabama A&M (Southwestern Athletic Conference) – 0.40 ROE+

  • Seasons: 10
  • Average Wins: 10.0 (0.348 win%, 0.027 SOS)
  • NCAA Tournament Points: 0
  • Average Adjusted Returns: 0.1
  • Relative Spending Average: 23.0%

Not shockingly, the four SWAC teams from the conference-independent rankings are still here. So, too, are the high-profile teams like DePaul and Fordham. There are a few interesting additions down here, though.

Houston Baptist had a pretty rough go of things during their time in the Great West Conference. The Huskies are anchored down by their 2011 season, in which they won just five games against a strength of schedule of just 0.129. The entire, brief history of that conference is pretty terrible, but Houston Baptist got the worst of it.

Similarly, Longwood had some difficulty adjusting to a conference. After spending some time as an independent, the Lancers joined the Big South Conference and things did not go so well. In five seasons in the Big South, Longwood’s highest win total was 11, while their relative spending average hovered in the low 40s – which is about ten points higher than teams in the lowest-spending leagues in the country.

Perhaps the most notable new entries are Rutgers and South Florida. The Scarlet Knights were never high in these rankings. Their time in the old Big East ranked 366th and their one year in the American Athletic Conference ranked 410th. But things got really bad when they jumped up to the Big Ten, as their spending average jumped up by nearly 50 points, while their win totals dropped by about three wins. South Florida’s spending really didn’t change much between their Big East days and their AAC days, but their average wins plummeted from 14.2 to 9.0, with their SOS falling from 0.873 to 0.654. Performing worse against worse competition is a quick way to the cellar.

V. Further Study

These results are for entertainment purposes, but I do think that this kind of analysis opens up some interesting questions about how to measure return on investment in college basketball. For example, should other postseason tournaments be included in success metrics? Should conference records be treated separately from non-conference performance? Should revenue also be considered into the equation? What about just looking at coach salaries instead of overall expenses in the program?

All of these questions are interesting topics for further research. I might undertake some of them. The next piece that I’ll be working on regarding NCAA basketball finances is to look at yearly spending patterns by conference to make some determinations about different tiers beyond just the false dichotomy of “high-major” and “mid-major”.

If you have any requests for research projects in this vein, feel free to send me a message. I will consider these requests and address the ones that I feel are both interesting and possible with the available data. You can stay up to date with this series of pieces by following me on Twitter at @andrewdieckhoff.

I also strongly support anyone else using this data to do their own research. As I said before, please just give credit to me, Sports-Reference, and the EADA reports if you use any of this information.

The Cost of Winning: Measuring return on investment in D1 basketball

“You get what you pay for.”

Of course, this is only half true. Often, there is plenty that you get for free and plenty that doesn’t live up to its price tag. This reality can be seen in all walks of life. And in college basketball’s increasingly dystopian world of haves and have-nots, it is very apparent.

Recently, I have become fascinated with the financial reports published by the Department of Education in their annual Equity in Athletics Data Analysis. These reports come in a spreadsheet that looks like a wall of numbers, measuring things such as revenue, total expenses, and operating costs. I wanted a way to distill all of this information into something meaningful.

The question I kept coming back to: Which schools are getting the best return on their investment?

To figure this out, I took the EADA data from the ten most recent years available, spanning from 2008 to 2017. The reports are about two years behind, but I think there’s still much to be gleaned from them. Then, I used datasets made available by Sports-Reference to track total wins and winning percentage by year.

In order to determine the relative return on investment, I looked at each school’s average wins over the ten-year span as a function of how much was spent on the program, measured in millions of dollars. To keep the numbers from being skewed by schools with low win totals and even lower investments, I adjusted the numbers to reflect winning percentage over the decade.

The resultant number is what I call “adjusted return on expenses”, or ROE+. (I am using “expenses” instead of “investment” to differentiate between money spent by the universities and revenue earned.)

For example, let’s look at the midpoint of the data. Dayton spent an average of $4.65 million on its men’s basketball program from 2008 to 2017 and in that time, averaged 23.6 wins per season. By dividing those wins by the average expenses, we get 5.08. Then, that number is adjusted by the overall winning percentage in those ten years – in this case, 0.685. This drops the number down to 3.48, which is the Flyers’ final ROE+ number.
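
As a quick sketch, the original calculation reduces to a couple of lines. This is my own reconstruction from the description above:

```python
def original_roe_plus(avg_wins, avg_expenses_millions, win_pct):
    """Original ROE+: average wins per million dollars of average annual
    expenses, scaled by the decade's overall winning percentage."""
    return (avg_wins / avg_expenses_millions) * win_pct

# Dayton, 2008-2017: 23.6 avg wins, $4.65M avg expenses, 0.685 win%
original_roe_plus(23.6, 4.65, 0.685)  # ~3.48, the Flyers' ROE+
```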

The range of ROE+ measures across 347 Division I programs included in the dataset goes from 13.51 ROE+ at its highest, all the way down to 0.50 ROE+ at its lowest. The mean ROE+ is 4.06, indicating that Dayton gets relatively less bang for its buck than the average Division I program.

Keep in mind that ROE+ measures relative success. There are a finite number of wins available each season, but there is theoretically no limit to the amount of money a school can spend on its program. Therefore, blue bloods that back up the truck to fund their programs are going to score low in this metric. For example, Duke’s 30-win season in 2013 registered a lower ROE+ than Weber State’s 30-win season that same year, because the Blue Devils outspent the Wildcats by nearly $12.5 million for the same number of wins.

A couple additional notes on my analysis:

  • I only looked at EADA figures from a team’s time in Division I
  • The expenses include both operating costs (gameday expenses) and other costs, such as coaches’ salaries, recruiting expenses, and student aid for scholarship athletes.
  • Strength of schedule is not factored into the analysis
  • For some reason, Kennesaw State did not report to the EADA between 2008 and 2014.
  • The military schools – Air Force, Army, and Navy – are exempt from EADA reporting because they do not receive Title IV funding for student loans. You can read more about this situation in this USA TODAY piece by Brian Schotenboer and Steve Berkowitz.
  • These numbers are two years old, so there’s every chance that your team’s ROE+ number has changed significantly since the last EADA report.

In case you want to play around with the data yourself, or just want to see where your team ranks, here is a link to the spreadsheet. All that I ask is that if you publish any work based on these data, that you reference me, as well as the EADA and Sports-Reference.

If you’re still with me, congratulations! Now we get to the fun part, which is actually identifying the teams that have overachieved and underachieved the most, relative to their school’s investments.

We’ll start first with the ten best ROE+ figures, listed below.

1. Sam Houston State – 13.51 ROE+

  • Average wins: 20.3
  • Winning percentage: 0.616
  • Average expenses: $926,136

2. Stephen F. Austin – 13.05 ROE+

  • Average wins: 24.5
  • Winning percentage: 0.751
  • Average expenses: $1,408,842

3. UNC Asheville – 12.63 ROE+

  • Average wins: 19.0
  • Winning percentage: 0.582
  • Average expenses: $874,903

4. Jackson State – 10.63 ROE+

  • Average wins: 14.2
  • Winning percentage: 0.438
  • Average expenses: $584,864

5. Harvard – 10.35 ROE+

  • Average wins: 19.3
  • Winning percentage: 0.645
  • Average expenses: $1,202,613

6. Princeton – 10.25 ROE+

  • Average wins: 18.5
  • Winning percentage: 0.615
  • Average expenses: $1,110,089

7. South Dakota State – 10.17 ROE+

  • Average wins: 20.6
  • Winning percentage: 0.613
  • Average expenses: $1,238,901

8. Vermont – 9.98 ROE+

  • Average wins: 22.7
  • Winning percentage: 0.668
  • Average expenses: $1,520,776

9. Savannah State – 9.89 ROE+

  • Average wins: 14.2
  • Winning percentage: 0.460
  • Average expenses: $660,492

10. North Dakota State – 9.69 ROE+

  • Average wins: 20.0
  • Winning percentage: 0.625
  • Average expenses: $1,290,271

You may be surprised to see some unfamiliar names here, but keep in mind that schools in the “lowest” Division I conferences will have considerably less overhead. Because there is less revenue coming in for these programs, the expenses have to be kept relatively low. Venues are smaller, coaching contracts are smaller, and so on.

In this list, we can see a few patterns. The clearest indicator of ROE+ success, by definition, is winning a lot of games while spending a small amount of money on the program. So perennial powers in conferences such as the Southland, Big South, America East, and Summit League are well-represented by Sam Houston State, Stephen F. Austin, UNC Asheville, Vermont, South Dakota State, and North Dakota State. The cost of playing in these conferences is lower than other mid-major conferences, so the wins are more valuable.

Similarly, running a program in the Ivy League doesn’t cost quite as much as other conferences around the country. The Ivy merits special mention, though, because of the peculiarity that keeps costs so low league-wide. Schools in this conference don’t offer athletic scholarships, which is one of the key expenses included in the report. So, it shouldn’t be surprising to see the Ivy schools performing well in the ROE+ metric. Six of the eight Ivy League teams are in the top 66 schools. Only Penn and Dartmouth fall lower, due to their low win totals throughout the decade.

The other peculiarity here is that teams such as Jackson State and Savannah State can make this list, despite both finishing under .500 during the decade in question. The simple answer is that these two both fall in the bottom ten with regard to annual expenses. The average yearly win total for the 32 schools that spent less than $1 million annually is 12.4 wins. That both of these teams exceeded this total explains their presence here.

Of the ten programs with the lowest expenses, seven come from either the SWAC or the MEAC. This, of course, includes Jackson State and Savannah State. The financial and scheduling issues facing these HBCU conferences are well-documented. Their case is an extreme example of how the ROE+ metric rewards relative success. That is, their 14 wins per year were much more valuable in these conferences than Oregon State’s 14 wins in the Pac-12, which registers a lowly 1.26 ROE+, the 14th-lowest score in the database.

With that difference in mind, let’s take a look at the other end of the spectrum. Below are the ten programs with the lowest ROE+ scores.

338. Rice – 1.08 ROE+

  • Average wins: 11.3
  • Winning percentage: 0.345
  • Average expenses: $3,601,850

339. South Florida – 1.07 ROE+

  • Average wins: 12.1
  • Winning percentage: 0.372
  • Average expenses: $4,202,693

340. Indiana – 1.04 ROE+

  • Average wins: 19.1
  • Winning percentage: 0.562
  • Average expenses: $10,291,656

341. St. John’s – 0.99 ROE+

  • Average wins: 15.8
  • Winning percentage: 0.483
  • Average expenses: $7,741,544

342. Rutgers – 0.92 ROE+

  • Average wins: 12.5
  • Winning percentage: 0.391
  • Average expenses: $5,333,095

343. Boston College – 0.91 ROE+

  • Average wins: 13.4
  • Winning percentage: 0.413
  • Average expenses: $6,108,657

344. Auburn – 0.81 ROE+

  • Average wins: 14.6
  • Winning percentage: 0.453
  • Average expenses: $8,193,977

345. TCU – 0.76 ROE+

  • Average wins: 14.4
  • Winning percentage: 0.436
  • Average expenses: $8,311,529

346. Fordham – 0.72 ROE+

  • Average wins: 9.1
  • Winning percentage: 0.301
  • Average expenses: $3,803,264

347. DePaul – 0.50 ROE+

  • Average wins: 10.0
  • Winning percentage: 0.316
  • Average expenses: $6,291,660

Looking at these numbers, two patterns emerge: low-performing Power 5 teams and teams in huge municipalities both seem to struggle. Of course, this makes a lot of sense. Let’s look at each pattern a little further.

Playing in the big leagues means that, regardless of your yearly record, you have to constantly keep up with the Joneses. The venues are (generally) bigger, as are the contracts that you have to pay to coaches. When it comes to expenses, you have to travel down the list to #45 before you find a program that isn’t in the “high-major” category (Power 5 plus Big East and AAC). That team, unsurprisingly, is Gonzaga. The five biggest spenders are Duke, Kentucky, Louisville, Syracuse, and Kansas – all either bona fide blue-bloods or schools that have poured a lot of money into developing their basketball identities.

When you compound that high-level basketball with being in a big city, the expenses seem to be higher. The other issue, though, is that many schools that fit this bill have struggled to establish or maintain a strong following in those cities – especially if there is a professional team there.

With the exception of Auburn, each of the teams in the bottom ten of the ROE+ rankings has to compete with an NBA team that is within reasonable driving distance:

  • DePaul – Chicago Bulls
  • Fordham & St. John’s – New York Knicks/Brooklyn Nets
  • TCU – Dallas Mavericks
  • Boston College – Boston Celtics
  • Indiana – Indiana Pacers
  • Rutgers – New York Knicks/Brooklyn Nets/Philadelphia 76ers
  • South Florida – Orlando Magic
  • Rice – Houston Rockets

It seems that Indiana may be the one team that really doesn’t quite fit here – the Hoosiers’ basketball following is unquestionably one of the most devoted. Theirs is simply an issue of underperforming during the ten years covered here. For the other schools, though, it may be difficult to attract a solid fanbase when the professional game is so accessible in these regions. By extension, it stands to reason that top recruits may be less willing to come to these relatively less-established schools.

Obviously, schools can overcome the combination of major-conference affiliation and NBA proximity. Butler, for example, is the best-performing high-major program in the ROE+ ranks and shares a city with the NBA’s Pacers. But it is still interesting to see so many teams near the bottom that fall within these parameters.

To summarize, the figures in the EADA highlight the financial discrepancies between the top and bottom conferences in Division I basketball. However, just because a team spends more (or less) money on its basketball program, that doesn’t guarantee anything with regard to winning games.

If you support a team that excels in one of the lower conferences, you’re one of the lucky ones. Those teams really deliver the best return on investment for their schools. Conversely, if you’re a fan of a high-major team and you get the feeling that you aren’t getting enough bang for your buck, you might be right.