One day in 1938, the American physicist Frank Benford was leafing through a book of logarithmic tables.
Frank Albert Benford
Before calculators existed, such tables were how mathematicians multiplied and divided numbers.
But Benford noticed something strange: the first pages were more worn than the last.
This indicated that numbers with low first digits were looked up more often than those with a high first digit.
Benford could see no logical reason for this, so he set about investigating other numbers, to see whether the pattern would be repeated.
Benford gathered over 20,000 unrelated numbers, from the lengths of rivers to baseball statistics and the numbers appearing in magazine articles.
Rather than being random, as would have been expected, the first digits of the data followed a pattern.
Most people, including Benford, assumed a number was just as likely to start with the digit one as with nine.
Instead, Benford found that first-digit use falls in sequence.
Numbers with low first digits appear most often.
Benford's Law gives the probability that the first digit is d as log10(1 + 1/d): roughly a 30% chance that one will be the first digit, compared with under a 5% chance for nine.
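The figures above can be checked directly from the formula; a minimal sketch in Python:

```python
import math

# Benford's first-digit law: P(d) = log10(1 + 1/d) for d = 1..9
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford.items():
    print(f"digit {d}: {p:.1%}")

# digit 1 comes out near 30.1%, digit 9 near 4.6%,
# and the nine probabilities sum to 1.
```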
First digits are not equally likely: low first digits appear most often.
Occurrence of Benford's Law
It's perhaps unsurprising that the number one appears most frequently at the start of man-made numbering systems, such as street numbers or the numbering of books in libraries.
This is because a sequentially ordered system counts through 1, 10 and 100 before it ever reaches 9, 90 or 900, so any sequence cut off partway through favours low first digits.
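A quick way to see this is to count the leading digits in a sequential range; a sketch, assuming house numbers run from 1 to N:

```python
from collections import Counter

def first_digit_counts(n):
    """Count the leading digits of the numbers 1..n."""
    return Counter(int(str(k)[0]) for k in range(1, n + 1))

# In a street numbered 1..199, more than half the houses
# start with 1: the number 1 itself, 10-19, and 100-199.
counts = first_digit_counts(199)
print(counts[1], "of", 199)  # 111 of 199
```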
But it is surprising that Benford's Law also appears in data from the natural world, from the seismic activity of earthquakes to the distances of stars from Earth.
The reasons Benford's Law works are complex.
It took mathematicians over 60 years to prove it holds true.
Now that it is known to hold, Benford's Law can be used to test whether data are genuine: people fabricating numbers tend to make the first digits look uniformly random, whereas mathematicians know genuine data should follow Benford's pattern.
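One simple way such a check could work is a chi-squared comparison of observed leading digits against Benford's expected frequencies; a minimal sketch (not any auditor's actual method), where a lower statistic means a closer fit:

```python
import math
from collections import Counter

# Benford's expected first-digit probabilities
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    """First significant digit of a nonzero number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def benford_chi2(values):
    """Chi-squared statistic of the first-digit frequencies of
    `values` against Benford's Law (lower = better fit)."""
    digits = Counter(leading_digit(v) for v in values if v != 0)
    n = sum(digits.values())
    return sum((digits[d] - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())

# Data spanning several orders of magnitude (e.g. compound growth)
# fits Benford far better than uniformly invented 3-digit numbers.
grown = [1.02 ** k for k in range(500)]
invented = list(range(100, 1000))
print(benford_chi2(grown), benford_chi2(invented))
```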