It was really only a matter of time before my nerdiness made its first appearance on this new stage. Here's hoping you don't mind, but I just couldn't resist.
Steven Levitt, who most of you probably know as the author of Freakonomics, and Kenneth Kovash, who most of you probably don't know at all, released a new paper in September entitled "Professionals Do Not Play Minimax: Evidence from Major League Baseball and the National Football League." Whew! That's a mouthful.

"But 310ToJoba," you say, "what does that even mean? 'Minimax' sounds like an adult film channel!" I know, I know. The quick and dirty definition: the term comes from game theory and decision theory, and it asserts that a player in a zero-sum game will select options that minimize his maximum losses. To be overly thorough and clarify even further, a zero-sum game is a situation where the "players" can only benefit at the expense of each other, because one player's gain is exactly the other player's loss. Checkers is an example of a zero-sum game, since one player wins and the other player loses. Same with gambling at a casino, because the amount that the gambler wins is precisely equal to the amount that the house loses in paying him. Makes sense, right? Not surprisingly, the interactions between batters and pitchers are zero-sum games. Either the pitcher "wins" and the batter makes an out, or the batter "wins" and gets on base.
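For the fellow nerds, here's what minimax actually looks like in a tiny pitcher-vs-batter game. The OPS numbers below are completely made up for illustration (they're not from the paper); the point is just the mechanics: the pitcher picks a fastball rate that leaves the batter no better off guessing fastball than guessing offspeed.

```python
# Minimax mixed strategy for a toy 2x2 pitcher-vs-batter zero-sum game.
# Payoffs are the batter's expected OPS; the numbers are hypothetical.
#
# Each row is a pitch; columns are (batter sits fastball, batter sits offspeed).
FASTBALL = (1.050, 0.600)
OFFSPEED = (0.500, 0.800)

def optimal_fastball_rate(fb, off):
    """Fastball probability that makes the batter indifferent between
    sitting fastball and sitting offspeed -- the pitcher's minimax mix
    for a 2x2 game with no saddle point."""
    a, b = fb   # batter's OPS vs fastball when he guesses right / wrong
    c, d = off  # batter's OPS vs offspeed when he guesses wrong / right
    # Solve p*a + (1-p)*c == p*b + (1-p)*d for p:
    return (d - c) / ((a - c) + (d - b))

p = optimal_fastball_rate(FASTBALL, OFFSPEED)
# By construction the batter's expected OPS is the same whichever pitch
# he sits on, so either column gives the value of the game.
value = p * FASTBALL[0] + (1 - p) * OFFSPEED[0]
print(f"throw fastball {p:.0%} of the time, batter OPS ~ {value:.3f}")
```

Notice that even though the fastball is this pitcher's "best" pitch in some sense, the equilibrium has him throwing it well under half the time, because leaning on it any harder lets the batter sit on it.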
Clearly this is overly simplistic because some hits/walks are more "valuable" to the outcome of a game than others, so simply saying that a hitter "wins" doesn't necessarily mean that the pitcher is doing a bad job. That's where Levitt and Kovash come in. But I'm a skeptical person who has to be an ass and question everything. Wouldn't a pitcher with a good fastball be better served to throw it more often? So follow me after the jump to see what's what.
Like all good economists, these guys won't let you read the full article without paying for it, but there is a handy abstract that works well for our purposes:
Authors Kenneth Kovash and Steven Levitt find that: "Pitchers appear to throw too many fastballs; football teams pass less than they should." They also find that the selection of pitches or plays is too predictable. The researchers conclude that "correcting these decisionmaking errors could be worth as many as two additional victories a year to a Major League Baseball franchise and more than a half win per season for a professional football team."
Kovash and Levitt examine all Major League pitches (more than 3 million of them) during the regular seasons from 2002 to 2006 (excluding extra innings). They categorize them as fastballs, curveballs, sliders, or changeups. They measure the outcome of each pitch using the sum of the batter's on-base percentage and slugging percentage (a measure they label OPS) and they determine that fastballs lead to a slightly higher OPS than other types of pitches.
If batters are more likely to score runs on fastballs, then minimax theory says that pitchers should adjust. To find out why they haven't, the authors look more deeply into the data, controlling for everything from the inning and number of strikes to the number of runners on base. A key factor, they find, is pitch count. As long as there are fewer than two strikes during an at-bat, the difference in outcome between throwing fastballs and non-fastballs tends to be small. But when there are two strikes, the outcomes diverge dramatically. Fastballs generate an OPS that is more than 100 points higher than non-fastballs. The authors calculate that if a team's pitchers reduced their share of fastballs by 10 percentage points, they would allow roughly 15 fewer runs in a season, about 2 percent of their total runs allowed.
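Those numbers in the abstract hang together, by the way. Here's a quick back-of-the-envelope check, using the common sabermetric rule of thumb that roughly 10 runs equals one win (the paper's own runs-to-wins conversion may differ):

```python
# Sanity-checking the abstract's arithmetic. The 10-runs-per-win figure
# is the usual sabermetric rule of thumb, not a number from the paper.
runs_saved = 15          # from cutting fastball share by 10 percentage points
share_of_total = 0.02    # "about 2 percent of their total runs allowed"

# Implied total runs allowed per season (~750, a typical team figure).
total_runs_allowed = runs_saved / share_of_total

runs_per_win = 10
wins_gained = runs_saved / runs_per_win  # ~1.5 wins

print(f"implied runs allowed: {total_runs_allowed:.0f}")
print(f"implied wins gained:  {wins_gained:.1f}")
```

An implied 750 runs allowed is right in line with a normal team season, and 1.5 wins squares with the abstract's "as many as two additional victories a year."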
We could probably have a whole argument over their use of OPS as a metric, but that's a discussion for smarter folks and another day. At a basic level, it does address the issue of some batter outcomes being more valuable than others, but it's obviously not perfect. Instead, let's just take their general conclusion, "too many fastballs," and run with it.
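For anyone fuzzy on the metric itself, OPS is just on-base percentage plus slugging percentage. A quick sketch of the standard formulas, applied to one entirely hypothetical batter's season line:

```python
# OPS = OBP + SLG, using the standard definitions of each component.
def obp(hits, walks, hbp, at_bats, sac_flies):
    """On-base percentage: times on base over plate appearances counted."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slg(singles, doubles, triples, homers, at_bats):
    """Slugging percentage: total bases per at-bat."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / at_bats

# Hypothetical season line: 150 H (100 1B, 30 2B, 5 3B, 15 HR),
# 60 BB, 5 HBP, 500 AB, 5 SF.
on_base = obp(150, 60, 5, 500, 5)
slugging = slg(100, 30, 5, 15, 500)
print(f"OBP {on_base:.3f}  SLG {slugging:.3f}  OPS {on_base + slugging:.3f}")
```

One reason people grumble about OPS: it weights OBP and SLG equally even though reaching base is generally worth more than an extra total base, which is part of why it's "obviously not perfect" for valuing individual pitches.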
Is there even such a thing as throwing too many fastballs? Intuition and the abstract would seem to say yes. After all, how many pitchers have gotten through a whole career using just one pitch? In the modern era, only Mariano Rivera (homerism!) comes to mind, and you could easily argue that he's the exception rather than the rule. But if we were to look at 2009, do Levitt and Kovash have a point?
If you scroll to the bottom of each of these Baseball-Reference pages, you'll see the AL and NL leaders in ERA+ for 2009. This is a quick and dirty measure of which guys were the "best." Not surprisingly the Greinkes, Lincecums, Verlanders and Carpenters of the baseball world dominate these lists.
Now let's consider the Pitch FX data found on Fangraphs. You can readily sort this table to reveal which pitchers threw the highest percentage of fastballs in 2009. The top 5? Mike Pelfrey, Rick Porcello, Jeff Niemann, Clayton Kershaw and Matt Garza. All of these fellows went to the heat more than 70% of the time, for an average of 74.02%. Oh, and would you look at that: only one of them (Kershaw, ERA+ of 149) made the top 10 in ERA+ in 2009. Then again, Clayton Kershaw is really, really good at pitching, so we probably shouldn't be surprised about him. The rest of the top 5? A decidedly mediocre crop, or worse. So far, so good for the economists.

What also isn't surprising is that four of the top 5 in fastball frequency are absent from the top 10 in runs saved per 100 fastballs thrown. Again, Kershaw is the exception. This would intuitively seem to corroborate the economists' findings, since a good batter will know that a particular pitcher tends to favor the fastball and will probably wait for that pitch.
But then again, it's not enough to simply point out a problem; you have to try suggesting a way to fix it as well. Levitt and Kovash claim that pitchers have become too predictable. On the whole, they don't like throwing consecutive pitches of the same type but will favor the fastball when in doubt. The answer, as Levitt and Kovash suggest, is to lower the fastball percentage (by 10 percentage points or more) and mix in more variety. Back to the 2009 data!
If you move down the Fangraphs fastball frequency chart 8-12 percentage points, you'll notice something strange. All of a sudden, names like Felix Hernandez, Matt Cain, Jair Jurrjens and CC Sabathia start popping up. As the fastball percentage drops, there's a much higher overlap between the ERA+ list and the fastball percentage list. Zack Greinke, who had arguably one of the best pitching seasons in recent memory by a guy not named Pedro, used his heater just under 60% of the time!
Does this mean we should all refuse to draft Clayton Kershaw on our fantasy teams next year? I certainly hope not. But Levitt and Kovash have given people a substantial mountain of evidence to question pitchers who rely too much on a particular pitch. While their overall emphasis on the necessity of varying pitches is fairly obvious to anyone who actually watches baseball, the fact that some pitchers still don't adjust is really quite interesting. If nothing else, the article shows the continuing trend of baseball analysis becoming more advanced, something my nerdy self finds quite appealing. And sorry for being long-winded.