By the Numbers: Bunting in baseball can be downright medieval

By Greg Jayne, Columbian opinion editor


I ran across a conversation on the Internet the other day and thought I had been transported back to the 13th century.

Well, except for the fact that it was on the Internet. And except for the fact that it was about baseball. Everything else about it, however, was positively medieval.

You see, some people were extolling the virtues of the sacrifice bunt as a baseball strategy. And considering how soundly the bunt has been discredited in recent years, they might as well have been arguing that the earth is flat or that gruel is best served with mead.

Let's start with the thesis statement: The bunt rarely is a wise strategy at the major-league level. I know, I know, a century-and-a-half of baseball history tells us differently. But those were the Dark Ages.

So let's go to the proof, using information from Retrosheet, which maintains a database of play-by-play records for every major-league game.

From 1993-2010, if a team had a runner on first base with no outs, on average it would score .941 runs from that point until the end of the inning.

If a team had a runner on second base with one out, the average was .721 runs from that point forward.

So, let's say a batter walks to lead off an inning. If the team hits away, on average it will score almost one run in the inning. If it successfully bunts and now has a runner on second with one out, it has just decreased the run expectancy by 23 percent.
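That 23 percent figure follows directly from the two run-expectancy numbers above. A few lines of Python make the arithmetic explicit; the figures are the ones quoted from Retrosheet, and the variable names are my own:

```python
# Run-expectancy figures quoted above (Retrosheet, 1993-2010).
re_first_no_outs = 0.941   # runner on first, nobody out
re_second_one_out = 0.721  # runner on second, one out (after a successful bunt)

def pct_change(before, after):
    """Percentage change in expected runs for the rest of the inning."""
    return (after - before) / before * 100

drop = pct_change(re_first_no_outs, re_second_one_out)
print(f"A successful sacrifice changes run expectancy by {drop:.0f}%")
# prints: A successful sacrifice changes run expectancy by -23%
```

The same two-state comparison works for any base/out situation: look up the expectancy before and after the bunt and take the percentage change.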

The decline is similar for a runner on first base with one out. It's similar for a runner on second base, and it's similar for runners on first and second. For every conceivable base/out situation, a sacrifice bunt reduces a team's expected number of runs in that inning.

The reason is that outs are a scarce commodity. They are the currency by which the game is governed, and willfully giving up an out is almost never worth the single extra base that a bunt can provide. It's trading a piece of gold for a piece of silver.

This likely has always been true, or at least since the Deadball Era ended in 1920.

From 1950-68, much of which was a pitching-dominated period, bunting a runner from first to second with nobody out reduced a team's run expectancy by 19 percent.

Is there ever a time when a bunt is a wise play? Of course, particularly when a pitcher is at the plate. The percentages say that certain poor hitters, Chone Figgins and Brendan Ryan among them, can improve their team's chances more by bunting than by swinging away.

But even when you're playing for a single run, say down by one or tied in the ninth inning, the numbers say that a bunt is a poor play with a runner on first base. With a runner on first and nobody out, there is a 44 percent chance a team will score at least one run in that inning; with a runner on second and one out, there is a 42 percent chance a run will score.
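Even in the play-for-one-run case, then, the bunt is a small net loss. A quick Python check of the two probabilities quoted above (variable names are my own):

```python
# One-run probabilities quoted above (Retrosheet, 1993-2010):
# the chance of scoring at least once in the rest of the inning.
p_first_no_outs = 0.44   # runner on first, nobody out
p_second_one_out = 0.42  # runner on second, one out (after a successful bunt)

# Even when one run is all you need, the bunt costs scoring probability.
drop = (p_first_no_outs - p_second_one_out) * 100
print(f"Bunting lowers the chance of scoring by {drop:.0f} percentage points")
# prints: Bunting lowers the chance of scoring by 2 percentage points
```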

All things being equal, discounting the skill of the hitter at the plate, the odds say the only time a bunt makes sense is when you need one run and have a runner on second — or runners on first and second — with nobody out. In those situations, a successful bunt slightly increases a team's odds of scoring in the inning.

Baseball people finally are understanding this dynamic. Last season, there was an average of 0.34 sacrifice hits per team per game in the major leagues; in 1991, the average was 0.40; in 1971, it was 0.46; and in 1951, it was 0.50.

Some 40 years ago, nobody bothered to research these things. Conventional wisdom was taken as gospel and was unquestioned by managers.

Now, a new way of thinking is having an impact on the way the game is played. Consider it a Renaissance.

Questions or comments for By the Numbers? Contact Greg Jayne, Sports editor of The Columbian, at 360-735-4531, or by e-mail at Greg.Jayne@columbian.com. To read his blog, go to columbian.com/weblogs/GregJayne.