So, I was at the AAA All-Star Game this past week, and I noticed that a player in the game had an OBP lower than his batting average at the time (I wish I remembered who, but the night was a bit hazy and his stats have since changed).
Now, looking at the formula for batting average:

AVG = H / AB

As well as the formula for OBP:

OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
We can see that it is possible for a player to have an OBP lower than their batting average: it takes few to no walks and hit-by-pitches combined with a relatively high number of sacrifice flies, since sac flies are added to the OBP denominator but don't count as at-bats, dragging OBP down without touching AVG.
For example, a player with 1 hit in 1 at-bat, 0 walks, and 1 sacrifice fly would have a batting average of 1.000 but an OBP of only 0.500.
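A quick bit of algebra makes the condition exact. Cross-multiplying the two formulas (safe here, since both denominators are positive) gives:

(H + BB + HBP) / (AB + BB + HBP + SF) < H / AB
⟺ AB · (H + BB + HBP) < H · (AB + BB + HBP + SF)
⟺ (BB + HBP) · (AB − H) < H · SF

So with zero walks and hit-by-pitches, any player with at least one hit and at least one sac fly satisfies the inequality automatically, which is exactly what the toy example above shows.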
With that information in mind, let's take a look at every time this has happened over the course of an entire season.
SELECT concat(m.nameFirst, " ", m.nameLast) AS Name,
       b.yearID AS Year,
       b.G,
       AB,
       (AB + BB + HBP + SF + SH) AS PA,
       H, `2B`, `3B`, HR, BB, HBP, SF, SH,
       H / AB AS AVG,
       (H + BB + HBP) / (AB + BB + HBP + SF) AS OBP,
       -- total bases = H + 2B + 2*3B + 3*HR, since H already counts each hit once
       (H + `2B` + 2 * `3B` + 3 * HR) / AB AS SLG,
       ((H + BB + HBP) / (AB + BB + HBP + SF)) + ((H + `2B` + 2 * `3B` + 3 * HR) / AB) AS OPS
FROM Batting b
INNER JOIN Master m ON m.playerID = b.playerID
WHERE b.yearID >= 1955
-- MySQL lets HAVING reference the PA/OBP/AVG aliases even without a GROUP BY
HAVING PA >= 100 AND OBP < AVG
ORDER BY yearID DESC;
So, of the 11 times this has occurred since 1955 (with a minimum of 100 plate appearances), Mariano Duncan in ’95 did it with the best OPS (0.6881) as well as the most plate appearances (201).
Additionally, the largest gap between the two stats came in Steve Carlton's 1974 season, when his OBP sat 0.0070 below his batting average.
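If you'd rather pull that gap directly instead of eyeballing the results, a small tweak to the query above does the trick. This is just a sketch built on the same tables and the same MySQL alias-in-HAVING behavior:

SELECT concat(m.nameFirst, " ", m.nameLast) AS Name,
       b.yearID AS Year,
       (AB + BB + HBP + SF + SH) AS PA,
       H / AB AS AVG,
       (H + BB + HBP) / (AB + BB + HBP + SF) AS OBP,
       -- how far OBP falls below AVG
       (H / AB) - ((H + BB + HBP) / (AB + BB + HBP + SF)) AS Gap
FROM Batting b
INNER JOIN Master m ON m.playerID = b.playerID
WHERE b.yearID >= 1955
HAVING PA >= 100 AND OBP < AVG
ORDER BY Gap DESC;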
Other than that, 8 of these 11 seasons belonged to pitchers. This is probably due to the combination of pitchers often being asked to move runners along and not drawing nearly as many walks as everyday hitters.
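For what it's worth, here's one way you might label those seasons as pitcher or position player. This is a sketch assuming the Lahman Pitching table sits alongside Batting and Master, and it simply treats anyone with a pitching record that season as a pitcher:

SELECT concat(m.nameFirst, " ", m.nameLast) AS Name,
       b.yearID AS Year,
       (AB + BB + HBP + SF + SH) AS PA,
       H / AB AS AVG,
       (H + BB + HBP) / (AB + BB + HBP + SF) AS OBP,
       -- anyone with a Pitching row that season gets labeled a pitcher
       CASE WHEN p.playerID IS NULL THEN 'Position player' ELSE 'Pitcher' END AS Role
FROM Batting b
INNER JOIN Master m ON m.playerID = b.playerID
LEFT JOIN (SELECT DISTINCT playerID, yearID FROM Pitching) p
       ON p.playerID = b.playerID AND p.yearID = b.yearID
WHERE b.yearID >= 1955
HAVING PA >= 100 AND OBP < AVG
ORDER BY Year DESC;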