May 26, 2011

Bayes: The Theory That Would Not Die [book]

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy by Sharon Bertsch McGrayne. From the SciAm review, Why Bayes Rules: The History of a Formula That Drives Modern Life:

Discovered by English clergyman Thomas Bayes, the formula is a simple one-liner: Initial Beliefs + Recent Objective Data = A New and Improved Belief. A modern form comes from French mathematician Pierre-Simon Laplace, who, by recalculating the equation each time he got new data, could distinguish highly probable hypotheses from less valid ones. One of his applications involved explaining why slightly more boys than girls were born in Paris in the late 1700s. After collecting demographic data from around the world for 30 years, he concluded that the boy-girl ratio is universal to humankind and determined by biology.

Many theoretical statisticians over the years have assailed Bayesian methods as subjective. Yet decision makers insist that they bring clarity when information is scarce and outcomes uncertain. During the 1970s John Nicholson, the U.S. submarine fleet commander in the Mediterranean, used Bayesian computer analysis to figure out the most probable paths of Soviet nuclear subs. Today Bayesian math helps sort spam from e-mail, assess medical and homeland security risks and decode DNA, among other things.

Now Bayes is revolutionizing robotics, says Sebastian Thrun, director of Stanford University's Artificial Intelligence Laboratory and Google's driverless car project. By expressing all information in terms of probability distributions, Bayes can produce reliable estimates from scant and uncertain evidence.
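The one-line recipe in the excerpt (initial beliefs plus new data yields an updated belief) can be sketched as a conjugate Beta-Binomial model, in the spirit of Laplace recalculating as new birth records arrived. This is a minimal illustration, not the book's or Laplace's actual method, and the yearly counts below are made up for the example:

```python
# A minimal sketch of Bayesian updating: prior beliefs + new data -> posterior.
# Beta-Binomial conjugate model for p, the probability that a birth is a boy.
# The yearly counts are illustrative, not Laplace's actual Paris records.

def update(alpha, beta, boys, girls):
    """Fold observed boy/girl counts into a Beta(alpha, beta) prior."""
    return alpha + boys, beta + girls

alpha, beta = 1.0, 1.0  # uniform prior: no initial belief either way
for boys, girls in [(1020, 980), (5150, 4850), (25510, 24490)]:
    alpha, beta = update(alpha, beta, boys, girls)

posterior_mean = alpha / (alpha + beta)  # current best estimate of P(boy)
print(posterior_mean)
```

Each batch of data just shifts the two Beta parameters, so the posterior after one year becomes the prior for the next. That is the "recalculating each time he got new data" loop in miniature: as the counts accumulate, the estimate tightens around a value slightly above one half.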