On throwing to first, Part III
April 12, 2007
So far, we’ve taken a look at what the pickoff throw to first says about baserunners and run scoring, but we haven’t yet looked at it from the defense’s point of view. Does throwing to first mean that the pitcher won’t be as effective at working the batter? Does holding the runner really open up a hole in the right side of the infield that hitters exploit? All told, is throwing to first a good strategy? Let’s jump in and find out.
The first question from the pitcher’s perspective is what factors influence whether a pitcher will throw over. In Part I, we found that there’s a pretty clear relationship between whether or not the runner is a known base-stealer and whether he will draw a throw. Outs also seem to make a difference. Runners draw throws more often in no-out situations (27.6% of the time) than in one-out situations (26.6%), and in turn more often with one out than with two (23.2%). When a pitcher does throw over, he makes slightly more throws, on average, with no one out (1.54 throws) than with one out (1.47) or two outs (1.45). I also checked whether the score differential would predict whether a throw would be made. Absolute score differential did predict whether a throw would be made, explaining about 9% of the variance in a binary logit regression. As might be expected, pitchers were more likely to throw over if the score was close than if a blowout was in progress. The marginal value (in terms of win probability) of the runner stealing second shrinks as the score differential grows, and a steal is more costly in terms of run expectancy with fewer outs. With more at stake, pitchers are more likely to throw over.
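For readers curious what a binary logit regression like the one above actually does, here is a minimal sketch in pure Python. Everything in it is hypothetical: the data are simulated, and the assumed slope of -0.4 is made up for illustration. It is not the original analysis, just the general technique of fitting throw probability against absolute score differential.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical data: (absolute score differential, 1 if the pitcher threw over).
# We simulate a world where closer games draw more throws (assumed slope -0.4).
random.seed(7)
data = []
for _ in range(2000):
    diff = random.randint(0, 8)
    p_throw = sigmoid(1.0 - 0.4 * diff)
    data.append((diff, 1 if random.random() < p_throw else 0))

# Fit intercept b0 and slope b1 by gradient ascent on the log-likelihood.
b0 = b1 = 0.0
lr = 0.1
for _ in range(500):
    g0 = g1 = 0.0
    for x, y in data:
        err = y - sigmoid(b0 + b1 * x)
        g0 += err
        g1 += err * x
    b0 += lr * g0 / len(data)
    b1 += lr * g1 / len(data)
# b1 comes out negative: a throw becomes less likely as the score gets lopsided.
```

The fitted slope recovers (roughly) the assumed -0.4, which is the sense in which the model "predicts whether a throw would be made" from the score.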
Whether the pitcher played for the home or visiting team made no difference in whether he would throw over (in fact, the percentages were identical!), but the inning did matter. In the first inning, 34.4% of runners who reached first base with an open second base in front of them drew a check-up throw. The figure dwindled fairly consistently until the ninth inning, when only half as many runners in that situation drew throws (16.1%). This was offset by a general upward trend, as the game went on, in the number of throws the pitcher made once he did decide to throw over. As the game goes on, pitchers become more selective about whom they keep an eye on, but when they do try to hold the runner, they become more invested in it.
I was curious to see whether teams might take advantage of this pattern. I isolated the instances in which the runner made an attempt for second before the pitcher made a throw (excluding 3-2 counts with two outs) and checked for an association by inning. The two most popular innings for this strategy were the first and ninth, with the runner going before a throw was made 8.7% and 9.9% of the time, respectively (and it happened most often on the first pitch of the at-bat). Those ninth-inning totals include a fair number of defensive indifferences, so they should be interpreted with caution. In innings 2-8, the rate of trying for second before a throw was made was pretty uniformly a touch below 6%. Additionally, a running attempt of any kind (again excluding 3-2 counts with two outs) was made most often in the first inning (16.3%), but the rate then dropped to around 12% and dwindled to 9.2% by the eighth inning. In the ninth, it jumped back to 13.5%.
So, the running game is at its most active in the first inning, dies down a bit after that, and picks back up in the ninth. In the middle innings, pitchers may not pay as much attention to runners whom they do not consider a threat to steal, and it doesn’t cost them much, because teams are less likely to run in those innings or attempt a “surprise” steal. It’s very possible that an equilibrium has simply been reached here. A small wrench in that equilibrium: stolen base success rates are highest in the first inning (76.1%), when pitchers are paying the most attention to the runners. They drop to 65.2% in the second inning, but slowly rise through the rest of the game (peaking at 75% in the 7th inning, and dropping slightly in the 8th and 9th). As pitchers check on runners less, runners are more likely to succeed when they try.
I might suggest to a manager that if he’d like to steal a base, the pitcher is less likely to be looking in the middle and late innings, so even a less speedy player might be able to make it. I might also suggest to the pitching coach that he remind the pitcher to check on the runner in the 6th inning now and then. However, if that strategy became known, would it alter the pitchers’ and runners’ behaviors and shift the equilibrium? Probably. But, if any managers are out there reading this, it can be our little secret.
Another question occurred to me: might pitchers throw over more or less often based on two factors, their pitching coach and their catcher? Some pitching coaches may preach holding the runners on, while others may think it a waste of time. Those effects would be seen at a team level. What about catchers? Suppose that I have Yadier Molina behind the plate (who threw out more than half of the would-be base stealers against him last year). Perhaps I should save my arm a little wear and tear and not bother throwing over, since Yadier will probably nail the runner anyway. Suppose instead I had Mike Piazza behind the plate. Perhaps I should make sure that runner stays close, because Mike’s not gonna throw anyone out.
The team-level results were intriguing: it seems that the Brewers are the most conscientious about the runners on first, checking up, as a team, on 36% of them. The Dodgers apparently brought up the rear, with a mere 15% of runners reaching first receiving a throw. It looks like some pitching coaches and/or managers have different opinions on how important it is to hold runners on. When a pitcher did decide to throw over, the number of throws made was roughly the same across teams, with the Padres making the fewest (1.33) and the Astros the most (1.69), and the list re-shaped itself quite a bit. In fact, the correlation between the percentage of runners thrown at and the number of throws made when checking on runners was a mere .170, which was not significant. It appears that pitching coaches give two sets of instructions to their pitchers: one on how to pick which runners to keep an eye on, the other on how close an eye to keep.
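That correlation is a plain Pearson r, which is easy to compute by hand. A minimal sketch follows; the team numbers in it are placeholders I made up for illustration, not the actual 2006 figures.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical team-level numbers: share of runners drawing a throw,
# and average throws per runner checked on.
pct_checked = [0.36, 0.31, 0.28, 0.24, 0.22, 0.19, 0.15]
avg_throws  = [1.41, 1.69, 1.33, 1.55, 1.47, 1.62, 1.38]
r = pearson_r(pct_checked, avg_throws)
```

For context on the significance claim: with 30 teams, |r| needs to be roughly .36 to reach conventional significance at the .05 level, so an observed .170 indeed falls well short.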
Because catchers are generally nested within teams (that is, a catcher generally plays for one team during a year), any data on whether catchers have any effect on whether a pitcher will throw over must first take into account the team for which he plays. I addressed this problem by using multi-level modeling, selecting for catchers with more than 250 appearances in our runner-on-1st scenario. (For those with a stats background: a GLM with a first-level effect for catcher nested within a second-level effect for team.) If you have no idea what I just said, what MLM allows you to do is sift out which effects belong to the team and which are individual to the catcher. First, it looked like team-level variables accounted for a little more than 1% of whether or not a throw would be made, while catcher variables made up about half a percent. Catchers’ stolen base success rates against (SB/(SB + CS)) weren’t correlated with how often their pitchers threw over to first or how many times they threw over on average. It doesn’t look like catcher characteristics have much to do with pitchers throwing to first.
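A full nested GLM needs a stats package, but the core idea, splitting the variance in throw/no-throw outcomes into a team-level share and a residual, can be sketched with a one-way variance decomposition. Everything below is hypothetical illustration (made-up team rates), not the model that was actually fit.

```python
def between_group_share(groups):
    """Fraction of total variance attributable to group (team) membership.

    groups: list of lists of 0/1 outcomes, one inner list per team,
    where 1 means the runner drew a throw.
    """
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    total_ss = sum((v - grand) ** 2 for v in all_vals)
    between_ss = sum(
        len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups
    )
    return between_ss / total_ss

# Hypothetical teams: one checks on 36% of runners, one on 25%, one on 15%.
teams = [
    [1] * 36 + [0] * 64,
    [1] * 25 + [0] * 75,
    [1] * 15 + [0] * 85,
]
share = between_group_share(teams)
```

Even with team rates spread from 15% to 36%, the team-level share of the total variance works out to only a few percent, which is the same flavor of result as the "little more than 1%" reported above: most of the variation lives at the level of individual runners and situations.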
One of the favorite critiques of holding a runner is that it “creates a hole on the right side,” presumably one that a hitter may punch a ground ball through for a hit. Surely, the first baseman must play closer to the bag, and so there is a physical hole where he usually plays. Do hitters really take advantage of it? Well, the “hole” really is there. I had to backtrack to Retrosheet’s PBP data from 1997 to check (the mid-90s data has much better coverage of hit location), but more ground balls got through the right side of the infield for hits when the pitcher threw over (and, as such, the first baseman was playing “out of position”). Take a look at the Retrosheet/Project Scoresheet hit location diagram. A grounder toward the “3” part of the infield (where the 1B usually plays) went through for a hit 22% of the time when a throw had been made (I realize it’s not entirely accurate to equate a throw having been made with the first baseman holding the runner, but it’s the best we’ve got right now), but only 8.3% of the time in general. A grounder toward the “34” hole (between first and second) showed the same pattern, with 63% going for hits when the runner was being held and 47.5% when not. These numbers were offset a bit by the fact that grounders down the line were slightly more likely to be stopped when the first baseman was playing close to the bag (34% were hits with the first baseman holding, vs. 37.6% when not), and that these types of grounders most often became doubles. But how often does a batter actually hit a ground ball to the right side, and does he do so more often in runner-on-first situations? Overall, batters hit ground balls to the right side of the infield in 5.3% of their total plate appearances (7.4% of all balls in play). With a runner being held at first, the percentages were actually lower: a ground ball to the right side happened in 4.5% of plate appearances and 6.2% of balls in play.
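The zone rates above are just conditional frequencies: hits divided by grounders, split by zone and by whether a throw was made. A sketch of the tally, using the Project Scoresheet-style zone labels from the text and a handful of made-up events:

```python
from collections import defaultdict

# Hypothetical events: (zone, throw_was_made, went_for_hit).
events = [
    ("3",  True,  True), ("3",  True,  False), ("3",  False, False),
    ("34", True,  True), ("34", False, True),  ("34", False, False),
]

# (zone, throw_made) -> [hits, grounders]
tallies = defaultdict(lambda: [0, 0])
for zone, throw_made, hit in events:
    t = tallies[(zone, throw_made)]
    t[0] += 1 if hit else 0
    t[1] += 1

def hit_rate(zone, throw_made):
    hits, total = tallies[(zone, throw_made)]
    return hits / total
```

On real Retrosheet event files the same tally would run over every ground ball with hit-location coded, with the caveat the text notes: a throw to first is only a proxy for the first baseman actually holding the runner.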
The hole is there, but hitters don’t take advantage!
The final question is whether the batter draws an advantage from the pitcher being distracted by the runner at first. Coming back to 2006, I checked to see what the batting line of all league batters would be, normalized to 700 PA, in situations where a throw was made and in situations where one was not. (This controls for whatever pressure the mere presence of a runner on first puts on a pitcher.) The results:
Without throw: .281/.345/.446, 35 2B, 3 3B, 21 HR, 100 K, 46 BB
With throw: .270/.343/.428, 34 2B, 4 3B, 20 HR, 129 K, 57 BB
All 2006 MLB plate appearances: .268/.337/.432, 34 2B, 4 3B, 20 HR, 117 K, 58 BB
Those stats are right. Hitters perform slightly worse when the pitcher has made a throw to first. When holding a runner at first, pitchers are more likely to walk a batter, but also more likely to strike him out. Throwing over to first seems to be associated with hitters coming back down to their baseline level. If I had more time, I would check whether a difference in hitter quality determines whether the pitcher throws over. Perhaps it’s not throwing over that makes the pitcher better, but that the pitcher feels he has the luxury of throwing over only when a weaker hitter is at the plate.
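Normalizing counting stats to 700 PA, as in the lines above, is simple proportional scaling. A sketch with hypothetical raw totals (the 184-K, 1000-PA figures are invented for illustration, not the actual 2006 splits):

```python
def per_700(count, pa):
    """Scale a raw counting stat to a 700-PA baseline for comparison."""
    return round(700 * count / pa)

# Hypothetical: 184 strikeouts over 1000 PA of throw-made situations
# scales to a 129-K pace per 700 PA.
k_pace = per_700(184, 1000)
```

The rate stats (AVG/OBP/SLG) need no such scaling, since they are already ratios; only the counting stats (2B, 3B, HR, K, BB) have to be put on a common-PA footing.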
If you’ve read all three parts of this series, you must realize that we have together meditated on a relatively minor and insignificant part of the game. Or have we? In general, throwing over to first saves a team roughly 8 runs per year by controlling the opposition’s running game. It also appears to have relatively few ill effects in the aggregate and, in fact, could probably be put to better use in the middle of the game. No doubt you have been to or watched a game in which a friend or fellow fan cursed the pitcher for throwing over to first so much because it was slowing the game down. Next time this happens, you will be ready to explain that throwing to first is part of an intricate and largely unseen tactical plan with real consequences. He won’t get it, but at least you can feel smart.