A few months ago, we looked at the best and worst average seasons for Pittsburgh Pirates position players. But who has the best and worst average seasons among franchise pitchers?
A few months ago, we looked at the best and worst average seasons in Pittsburgh Pirates history based on statistics like OPS+ and wRC+. OPS+ measures a batter's OPS relative to the rest of the league, adjusted for their ballpark. wRC+ does the same thing, but is built on weighted on-base average (wOBA).
However, what we didn't cover were pitchers. Just as OPS+ and wRC+ are era-, league-, and ballpark-adjusted statistics, there are pitching statistics that do the same, such as ERA+, ERA-, WHIP-, and FIP-. In each of these statistics, 100 is average. If the stat ends in a plus sign, 101 is 1% better than average, 99 is 1% worse than average, and so on. If it ends in a minus sign, it's the opposite: 101 is 1% worse than average, 99 is 1% better.
These stats help show how good a player was relative to the rest of the league. A 4.00 ERA can mean one thing in the early 2010s, as the league average ERA surpassed 4.05 only once from 2010 to 2014. From 1995 to 2000, however, the average ERA dipped below 4.40 just once. Posting a 4.00 ERA in 2011 might be about average, but posting a 4.00 ERA in 2000 (4.71 league average ERA) would have made you a well-above-average arm.
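To make the plus/minus arithmetic concrete, here is a minimal sketch of how ERA+ and ERA- relate a pitcher's ERA to the league average. Note this is a simplification: the real versions of these stats also apply a ballpark adjustment, which is omitted here.

```python
def era_plus(era, league_era):
    # Simplified ERA+ (higher is better): league ERA divided by the
    # pitcher's ERA, scaled so 100 is exactly league average.
    # Real ERA+ also adjusts for ballpark, which is skipped here.
    return round(100 * league_era / era)

def era_minus(era, league_era):
    # Simplified ERA- (lower is better): the same ratio, inverted.
    return round(100 * era / league_era)

# The article's example: a 4.00 ERA against the 4.71 league average of 2000.
print(era_plus(4.00, 4.71))   # about 118, i.e., roughly 18% better than average
print(era_minus(4.00, 4.71))  # about 85, i.e., roughly 15% better than average

# The same 4.00 ERA in a league whose average ERA is also 4.00 scores a 100.
print(era_plus(4.00, 4.00))
```

This shows why the same raw ERA can grade out very differently depending on the run environment it was posted in.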
Another way we will look at this is through fWAR. For a starting pitcher, 2.0 fWAR is considered average. However, WAR won't be much of a factor for relievers here, as it's far from a perfect stat, especially for pitchers who throw relatively few innings.
So with how I identified the best and worst seasons out of the way, let's take a look at some of these 'average' players. (Note that this only considers post-integration seasons, i.e., 1947 onward.)