Leonard Jimmie Savage

Born: 20 November 1917
Died: 1 November 1971 (aged 53)
Alma mater: University of Michigan (B.A., Ph.D.)
Known for: Savage loss; Savage's representation theorem; Savage's subjective expected utility representation; Friedman–Savage utility function; Halmos–Savage factorization theorem; Hewitt–Savage zero–one law; likelihood principle; minimax regret criterion; subjective expected utility; sure-thing principle
Fields: Mathematics, statistics
Institutions: University of Chicago; Princeton University; Yale University; Columbia University; University of Michigan
Doctoral advisor: Sumner Myers
Doctoral students: Don Berry; Morris H. DeGroot; Roy Radner; William S. Cleveland

Leonard Jimmie Savage (born Leonard Ogashevitz; 20 November 1917 – 1 November 1971) was an American mathematician and statistician. Economist Milton Friedman said Savage was "one of the few people I have met whom I would unhesitatingly call a genius."[1]

Education and career


Savage was born and grew up in Detroit. He studied at Wayne State University in Detroit before transferring to the University of Michigan, where he first majored in chemical engineering and then switched to mathematics, graduating with a bachelor's degree in 1938. He stayed at Michigan for a PhD in differential geometry, completed in 1941 under the supervision of Sumner Byron Myers.[2] Savage subsequently worked at the Institute for Advanced Study in Princeton, New Jersey, the University of Chicago, the University of Michigan, Yale University, and the Statistical Research Group at Columbia University. Though Myers supervised his thesis, Savage credited Milton Friedman and W. Allen Wallis as his statistical mentors.

During World War II, Savage served as chief "statistical" assistant to John von Neumann, the mathematician credited with describing the principles upon which electronic computers should be based.[3] Later he was one of the participants in the Macy conferences on cybernetics.[4]

Research and contributions


His most noted work was the 1954 book The Foundations of Statistics, in which he put forward a theory of subjective ("personal") probability and statistics that forms one of the strands underlying Bayesian statistics and has applications to game theory.
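In outline (the notation below is an illustrative sketch rather than Savage's original symbols), the book's central representation theorem states that a preference ordering over acts satisfying his axioms behaves as if it maximizes expected utility under a subjective probability: for acts f and g mapping a set of states S to outcomes, there exist a probability measure P on S and a utility function u such that

    f \succeq g \iff \int_S u(f(s))\,dP(s) \ge \int_S u(g(s))\,dP(s).

Both P and u are derived from the preferences themselves rather than assumed in advance.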

One of Savage's indirect contributions was his discovery of the work of Louis Bachelier on stochastic models for asset prices and the mathematical theory of option pricing. Savage brought Bachelier's work to the attention of Paul Samuelson, and through Samuelson's subsequent writing the random walk (and later Brownian motion) became fundamental to mathematical finance.

In 1951 he introduced the minimax regret criterion used in decision theory. The Hewitt–Savage zero–one law and Friedman–Savage utility function are (in part) named after him, as is the Savage Award given annually by the International Society for Bayesian Analysis for the best dissertations in Bayesian analysis.
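In outline (the notation is an illustrative sketch, not drawn from the 1951 paper), the criterion compares each action a to the best action available in each state s: given payoffs u(a, s), the regret of a in s is the shortfall from that best action, and one chooses the action whose worst-case regret is smallest:

    r(a, s) = \max_{a'} u(a', s) - u(a, s), \qquad a^{*} = \arg\min_{a}\,\max_{s} r(a, s).

Unlike plain minimax on payoffs, minimax regret judges an action by how much is forgone relative to the best choice in hindsight, which can lead to a different selection.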


Notes

  1. ^ Friedman, Milton; Friedman, Rose (1998). Two Lucky People: Memoirs. Chicago: The University of Chicago Press. p. 146. ISBN 0-226-26414-9.
  2. ^ O'Connor, J J; Robertson, E F (2010). "Leonard Jimmie Savage (1917–1971)". MacTutor History of Mathematics archive.
  3. ^ Hacking, Ian (2001). An Introduction to Probability and Inductive Logic. Cambridge: Cambridge University Press. p. 184. ISBN 0-521-77287-7.
  4. ^ Heims, Steve (1991). The Cybernetics Group. Cambridge, MA: The MIT Press. p. 348. ISBN 978-0262082006.

External links

  • Leonard Jimmie Savage papers (MS 695). Manuscripts and Archives, Yale University Library.