Range factor is a statistic developed by Bill James in the early 1980s to measure a player's defensive prowess; it is computed by dividing the number of chances a player accepts (i.e., total chances successfully fielded) by the number of innings played in the field.
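The calculation can be sketched as follows. Chances accepted are conventionally putouts plus assists (errors are excluded, since those chances were not successfully fielded); the scaling to nine innings and the example numbers below are illustrative assumptions, not figures from the text.

```python
def range_factor_per_nine(putouts: int, assists: int, innings: float) -> float:
    """Chances accepted (putouts + assists) per nine innings in the field.

    Errors are excluded: only chances successfully converted count.
    The factor of 9 expresses the rate per full game's worth of innings.
    """
    if innings <= 0:
        raise ValueError("innings must be positive")
    return 9 * (putouts + assists) / innings

# Hypothetical shortstop: 250 putouts and 420 assists
# over 1,300 defensive innings.
rf = range_factor_per_nine(250, 420, 1300.0)
print(round(rf, 2))
```

A per-game variant simply divides total chances accepted by games played instead of scaling innings; the per-nine-innings form avoids penalizing players who frequently enter games as defensive replacements.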
James developed the statistic after realizing that fielding percentage did not give a true measure of a player's defensive worth: it flattered players who were not actually good defenders, since they reached relatively few balls and thus made few errors, while allowing many balls in play that could otherwise have been turned into outs to fall for hits.
Range factor is not perfect either; it is a crude measure affected by team and ballpark effects. For example, a player whose home ballpark has a small outfield area (e.g., a left fielder in Fenway Park) will necessarily have a worse range factor than one playing in a park with more acreage, while a third baseman on a team with many left-handed starters will see more chances, since opponents will stack their lineups with right-handed hitters. A staff of power pitchers who record a lot of strikeouts will likewise depress the range factor of its fielders.

The statistic is also far less useful for first basemen and catchers, most of whose chances result from the work of teammates (infielders fielding ground balls in the case of first basemen, and pitchers recording strikeouts in the case of catchers). More advanced fielding metrics, such as zone rating, have since been developed to account for these biases and give a truer picture of who the better fielders are. But those came later; with the data available at the time, range factor was a vast improvement over what existed.