Stolen base attempts are, like save opportunities and intentional walks, one of the few baseball stats that are entirely elective. You can't decide to hit a home run, strike out a batter or even successfully steal a base, but you can decide that you have the green light, read the pitcher and take off for the next bag. Because of this, analyzing stolen base attempts is a fairly pure activity: They tell you pretty much what is going on in the game, as a whole, at any moment in time.
Now, one of the popular misunderstandings of sabermetrics was fundamentally a timing issue. Because analytics made the leap to the mainstream during a time of high offense -- with statheads correctly warning that the risks involved in attempting to steal a base weren't worth the payoff -- the idea took hold that sabermetrics says that stealing bases is bad. The fact is, in a vacuum, nothing is good or bad. Baseball is a closed system for the most part, so changes in one part trickle through and affect the whole. In the 1990s and 2000s, it was relatively easy to advance baserunners with extra-base hits, and a home run was a likely event, historically speaking. Because of this, the value of moving a runner up 90 feet was lessened, and the cost -- the out created by having that runner caught stealing -- was incredibly high. Stolen bases weren't bad because statheads don't like seeing close plays at second base; stolen bases were bad because the juice wasn't worth the squeeze.
It stands to reason, then, that if the value of stolen bases goes up in low-offense eras, steals should be coming back into vogue. And they are. In 1992, there were 4,865 stolen-base attempts, about 1.15 per team game; or looked at another way, a steal was attempted about 12 percent of the time that a runner reached first base, as approximated by adding singles, walks and HBPs (this is a blunt instrument designed to better estimate stolen-base opportunities). In 2000, at the peak of the offensive era, teams were looking to swipe a bag 0.87 times a game, a drop of nearly 25 percent in just eight years, and they ran about 8.6 percent of the time a runner was on first.
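The two rates above can be sketched in a few lines of Python. The 1992 totals (4,865 attempts, 26 teams playing 162 games) come from the text; the singles, walks and HBP counts in the rate example are hypothetical placeholders chosen only to illustrate the formula, not actual 1992 league totals.

```python
# Back-of-the-envelope stolen-base rates, per the approximation in the text:
# opportunities ~= singles + walks + hit-by-pitches, a blunt proxy for the
# number of times a runner stood on first base.

def attempts_per_team_game(attempts: int, teams: int, games: int) -> float:
    """Stolen-base attempts per team game."""
    return attempts / (teams * games)

def attempt_rate(attempts: int, singles: int, walks: int, hbp: int) -> float:
    """Share of estimated first-base opportunities on which a steal was tried."""
    return attempts / (singles + walks + hbp)

# 1992 totals from the article: 4,865 attempts, 26 teams, 162 games each.
per_game_1992 = attempts_per_team_game(4865, teams=26, games=162)
print(f"{per_game_1992:.3f}")  # roughly 1.15 per team game, as in the text

# Hypothetical opportunity components (NOT real 1992 totals), picked only to
# land near the ~12 percent rate the article describes.
rate = attempt_rate(4865, singles=28000, walks=12000, hbp=1000)
print(f"{rate:.1%}")
```

Swapping in a season's actual singles, walk and HBP totals gives the attempt rate the article tracks across eras.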
By comparison, the 2011 season is a track meet. On a per-game basis, steals have bounced back a bit, to 0.94 per team game, but that doesn't tell the whole story. Look at the numbers above again: Batting average is even lower, but slugging is higher. Extra-base hits are still high relative to singles, so the better denominator is opportunities: those runners on first base. Teams are now attempting to steal bases at a rate of around 10 percent of the time that a runner reaches first. It's not quite 1992 again, but the running game is bouncing back toward prominence.
The reason it may not get all the way back is that while runs per game are trending down, the shape of offense is still more like the 2000s than the early 1990s. We're playing in a high-strikeout era, one in which power is still a big part of the game. Singles are as rare as they've ever been -- the dip in offense has been about trading singles for outs and homers for doubles and triples. The percentage of plate appearances ending in a single, 15.2 percent, is lower than it was when offense reached its peak in 2000 (15.6 percent), and much lower than it was the last time offense was at its current level, in 1992 (16.3 percent). Even in 1968, the nadir of the second dead-ball era, singles accounted for 15.8 percent of all PAs. For all the focus on how home runs have become less frequent -- and they have -- it's the slow decline of the base hit that is at the core of 2011's low run-scoring.
So while run levels would dictate trying to steal more, the biggest benefit of a steal -- putting a runner in a place where he can score on a single -- is trending toward all-time lows. It is, quite frankly, a vexing problem: creating runs in a peak-strikeout, low-BABIP, low-HR/FB environment. We simply haven't ever seen these conditions in MLB before, and it may be a while before we know what works best in this environment.