I don't need to tell you that ever since the Yankees won the World Series earlier this month, there's been a lot of renewed discussion about whether New York's payroll advantage finally "bought" them a championship, and whether a salary cap is needed to restore competitive balance to baseball. That topic has been talked to death over the past few weeks, but a salary cap -- ostensibly designed to prevent the Yankees from spending $50 million more than the team with the next-highest payroll -- is only half of the discussion. If all you do is cap team payrolls, you're still going to have cheapskate owners who take their revenue-sharing money and fail to invest it in their teams -- owners who have learned to game the system and are content to put a poor product on the field if it means keeping more cash for themselves.
So, obviously, you need to talk about a salary floor every bit as much as a salary cap. Discussing salary floors, though, leads to the question of whether you should force owners to spend a certain amount of money on their players, whether they have it or not. Opinions on the profitability of MLB teams vary wildly depending on who you talk to -- Bud Selig routinely claims teams are operating at a loss, while Forbes magazine routinely disagrees -- so it's an open question how high a floor you could impose on some of the more (allegedly) cash-strapped teams. But here's a fun exercise we can engage in, as long as we're pretending that a salary cap/floor is even a remote possibility...
From 1985-87, MLB owners colluded against the players in free agency, artificially keeping salaries down; they were subsequently found out, and for all intents and purposes the modern era of free agency began in 1988. In 1988, the average team spent $11,555,862 on salaries, with a standard deviation of $3,386,331, while in 2009 those figures were $88,824,233 and $33,857,093, respectively. What I'm going to do is convert team payrolls since 1988 into 2009 "equivalents" using these numbers -- for instance, the 1988 Yankees spent $19,441,152 on players, so we convert that to a z-score by taking ($19,441,152 - $11,555,862) / $3,386,331 = 2.33, and then apply that to 2009 like so: (2.33 * $33,857,093) + $88,824,233 = $167,662,661. In other words, the 1988 Yankees' payroll is $167,662,661 in equivalent "2009 dollars".
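The conversion above is just a z-score re-scaling between two seasons' payroll distributions. Here's a minimal sketch of it in Python, using the means and standard deviations quoted in the post (the function name is mine, not from any existing library):

```python
# League-wide payroll mean and standard deviation for 1988 and 2009,
# as quoted in the post.
MEAN_1988, SD_1988 = 11_555_862, 3_386_331
MEAN_2009, SD_2009 = 88_824_233, 33_857_093

def payroll_to_2009_equivalent(payroll_1988: float) -> float:
    """Re-express a 1988 team payroll as a 2009 'equivalent'."""
    # How many standard deviations above/below the 1988 league mean?
    z = (payroll_1988 - MEAN_1988) / SD_1988
    # Place that same z-score on the 2009 distribution.
    return z * SD_2009 + MEAN_2009

# The 1988 Yankees' $19,441,152 payroll comes out around $167.7 million.
print(f"${payroll_to_2009_equivalent(19_441_152):,.0f}")
```

Note that a team at exactly the 1988 league average maps to exactly the 2009 league average, since its z-score is zero.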
Having said that, here are the top spenders in equivalent 2009 payroll since 1988:
Okay, but here's a more interesting table... For each franchise, what was their highest single-season 2009 equivalent payroll since 1988?
If you want to be a stickler and demand Montreal be excluded from Washington's history, since they are radically different markets, then the Nats' high mark is $73,882,155 in 2006, which would be (by far) the lowest on the list.
Now, am I saying all of these teams can afford to spend exactly this much today (since economic conditions have obviously changed for many of these cities since the late '80s)? No. But is it really a stretch to think they could all at least approach these numbers if they wanted to? I mean, think about what this is saying: in the last 20 years, every MLB franchise has at one point or another spent the equivalent of nearly $90 million 2009 dollars on payroll... All but three spent $100 million... All but seven spent $110 million... And more than half spent $130 million. Small-market teams, big-market teams, it doesn't matter -- they all ponied up for talent at some point since '88. And the 1990 Kansas City Royals nearly spent the equivalent of the '09 Mets!
So theoretically, would it really be that crazy to institute a cap-and-floor system which ensures that the players continue to be paid what they're worth, but also spreads talent more equitably across all of MLB?
This entry was posted on Sunday, November 29th, 2009 at 11:18 am and is filed under Insane ideas.