Wednesday, March 25, 2020
Statcast Lab: Is there a different run value needed based on the infield slice?
One of the things that we’ve done in the long past is to give a different run value for 1B/3B compared to 2B/SS. The idea was simple enough to understand: if a 2B or SS allowed a hit, it was likely a single. And if it was a 1B/3B, there’s a chance it could be an extra-base hit down the line.
Seems reasonable enough. So, what we ended up doing, in the long past, was to give .75 runs per play for 2B/SS and .80 runs for 1B/3B. Again, seems reasonable enough.
I looked at the Outs Above Average (for infielders only; I’ll do outfielders later today or tomorrow). And while the direction of that theory holds, the magnitude does not hold quite as much. For the 2B/SS roles, the impact of their play is -.005 runs, compared to the average infield play. While for the 1B/3B roles, the impact of their play is +.010 runs, compared to the average infield play. (The overall WEIGHTED average is 0, and you get there because there’s about 2X the plays at 2B/SS compared to 1B/3B).
So, the end result is that the gap in runs between the middle infielders and the corner infielders is about .015 runs, not the presumed long past value of .050 runs.
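The weighting works out like this. A quick sketch, using the per-play impacts and the approximate 2:1 play split given above:

```python
# Per-play run impact relative to the average infield play, from the OAA data.
middle_infield = -0.005   # 2B/SS
corner_infield = +0.010   # 1B/3B

# About 2X as many plays happen at 2B/SS as at 1B/3B, so the weighted
# average across all infield plays comes out to zero.
weighted_avg = (2 * middle_infield + 1 * corner_infield) / 3
print(weighted_avg)      # → 0.0

# The corner-vs-middle gap is about .015 runs, not the old .050.
gap = corner_infield - middle_infield
print(round(gap, 3))     # → 0.015
```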
Why would that be? It’s probably easiest to say that 5% of the “assigned hits” are extra-base hits. But as we know, a lot more than 5% of hits are extra-base hits, even if we limit it to the infield. For example, almost 10% of groundballs are extra-base hits. So why the discrepancy? Well, half of those groundball extra-base hits are “automatic hits”. In other words, they are hits not because the fielder wasn’t good enough to get there, but because his POSITIONING didn’t give him a chance to get there. And since Outs Above Average takes as a given that the positioning of the player is not a skill of the player (easier to believe these days with shifting), those auto-hits are not opportunities for the player. They end up being noise.
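To make that accounting concrete, here’s a rough one-liner using the approximate rates from the paragraph above:

```python
# Roughly 10% of groundball hits go for extra bases, but about half of those
# are "auto-hits" driven by positioning, which OAA drops from the fielder's
# opportunity pool. That leaves roughly 5% extra-base hits among assigned hits.
gb_xbh_rate = 0.10
auto_hit_share = 0.50

assigned_xbh_rate = gb_xbh_rate * (1 - auto_hit_share)
print(assigned_xbh_rate)   # → 0.05
```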
When we get to Layered Hit Probability (and by extension Layered wOBA), we will recover those “lost” hits, and be able to properly assign them to “team fielding alignment”. But, for the Outs Above Average metric, those aren’t in play (no pun intended).
Ok, so you may be thinking: we lost half, so maybe instead of the long past value of .050 runs, it should be .025 runs? That is a good thought. Except a lot of those remaining extra-base hits that are assigned to the fielder are “really difficult”. In other words, they remain in the pool for the player, but the hit probability is so low that they do limited damage to the fielder.
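Here’s a minimal sketch of why low-probability plays do little damage under an outs-based credit scheme. The per-play credit formula (actual outcome minus estimated out probability) is my assumption of how such a system broadly works, not the official OAA implementation:

```python
def fielding_credit(made_out: bool, out_probability: float) -> float:
    """Outs above average for a single play: actual outcome minus expectation."""
    return (1.0 if made_out else 0.0) - out_probability

# A "really difficult" play: the fielder had only a 5% chance of an out.
# Allowing the hit costs him just 0.05 outs.
print(fielding_credit(False, 0.05))   # → -0.05

# By contrast, failing to convert a routine 95% play is very costly.
print(fielding_credit(False, 0.95))   # → -0.95
```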
So, if you want a quick summary: the kind of hit that an infielder is responsible for is almost always a single. And because of that, when you look at outs saved, the translation to runs saved will be almost identical for middle infielders as for corner infielders.
Next time, I’ll compare IF to OF.
With the outfield, it’s a big deal. Whereas about 5% of base hits for infielders are extra-base hits, in the outfield, it’s over 50%.
That by itself establishes the run value for an OF play at 0.90 runs (and in the infield, it’s 0.75 runs).
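One way to see how those per-play rates follow from the extra-base-hit share is as a blend of two component values. The component run values below (runs saved by preventing a single vs. preventing an extra-base hit) are illustrative assumptions of mine, not Statcast’s published figures:

```python
# Hypothetical component run values (illustrative assumptions):
RUNS_SAVED_SINGLE = 0.75   # converting a would-be single into an out
RUNS_SAVED_XBH = 1.05      # converting a would-be extra-base hit into an out

def blended_run_value(xbh_share: float) -> float:
    """Average runs saved per out, given the share of prevented hits
    that would have gone for extra bases."""
    return (1 - xbh_share) * RUNS_SAVED_SINGLE + xbh_share * RUNS_SAVED_XBH

# Infield: ~5% of assigned hits are extra-base hits.
print(round(blended_run_value(0.05), 3))
# Outfield: over half of the hits are extra-base hits.
print(round(blended_run_value(0.50), 3))
```

With a 50/50 split in the outfield, the blend lands right around 0.90 runs per play; with only 5% extra-base hits in the infield, it stays near the singles value.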
***
The question is whether different OF plays should get a different run value. So, I watched dozens of plays by Buxton (who had among the fewest extra-base hits allowed) and Heredia (who had among the most).
Here’s two sample Buxton plays for your consideration:
https://baseballsavant.mlb.com/sporty-videos?playId=9fea8dbe-570d-450c-8dd2-3a4f2b36e74b
https://baseballsavant.mlb.com/sporty-videos?playId=609070f5-e65d-4ea7-b5fa-4bbb789c10ba
And in looking at these dozens, what it came down to, for me, was: the skill of the player, and the leverage of the opportunity. There’s an analogy with relievers and Leverage Index: how do you give out wins, in terms of leverage, for opportunities that the reliever didn’t create, but did… leverage?
If a pitcher keeps getting balls hit to the warning track, and Buxton keeps pulling them down, he’ll get tons of runs saved.
And if a pitcher keeps getting balls hit shallow, and the outfielder pulls them down, he won’t get as many runs saved.
If we give play-specific runs to the outfielders, you may end up with an outfielder making shallow outs at 60% catch probability getting fewer runs than deep outs at 70% catch probability.
But in terms of the SKILL of the outfielder, what matters is the difficulty of the play, not the leverage of the play.
FOR THAT PLAY, the outfielder may have gotten value. But that’s a lot like crediting a grand slam HR as being worth more than a solo shot. In terms of the SKILL of the hitter, 40 solo shots tell us more than 20 grand slams.
And so for a hitter, it makes more sense to give out the skill value for the HR, and then create a “timing bucket” for the leverage of the HR.
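The skill-plus-timing split described above can be sketched as simple bookkeeping: credit the skill value for every play at its difficulty-based rate, and park the difference between that and the situation-specific value in a separate “timing” bucket. The function name and numbers here are illustrative assumptions:

```python
def split_credit(skill_value: float, situational_value: float) -> dict:
    """Split a play's run impact into a skill component and a timing/leverage
    residual, so the skill ledger isn't distorted by opportunity leverage."""
    return {"skill": skill_value, "timing": round(situational_value - skill_value, 3)}

# A deep warning-track catch: worth 1.2 runs in context, 0.9 runs by skill rate.
print(split_credit(0.9, 1.2))   # the timing bucket absorbs the extra 0.3
# A shallow catch: worth only 0.6 runs in context.
print(split_credit(0.9, 0.6))   # the timing bucket goes to -0.3
```

Summed over a season, the skill column reflects play difficulty, while the timing column captures the leverage the fielder happened to face.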
***
Even if we were to roll it all up together, when I looked at each outfielder, the “leverage” component came in at +/- 2 runs for every outfielder. So we’ll capture it, though not as a skill value but as something else.
***
Going back to IF v OF, if both are +20 OAA, the infielder will count as +15 runs and the outfielder as +18 runs. This is not terribly different from counting all OAA as 0.8 runs (or +16). But in this case, the quality of infield and outfield plays are different enough as to warrant the differing run values.
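As arithmetic, that’s just the two per-play rates from above applied to the same +20 OAA:

```python
OAA = 20
IF_RUNS_PER_OUT = 0.75   # infield plays prevented are almost all singles
OF_RUNS_PER_OUT = 0.90   # over half of outfield hits prevented are extra-base

print(OAA * IF_RUNS_PER_OUT)   # → 15.0
print(OAA * OF_RUNS_PER_OUT)   # → 18.0

# A flat 0.8 runs per out would give both fielders the same total.
print(OAA * 0.8)               # → 16.0
```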