October 15, 1999
Colloquium Speaker: Roy Frieden
Roy Frieden is a Professor of Optical Sciences at the University of Arizona, which he joined in 1966 after obtaining a Ph.D. in Optics from the University of Rochester. He has worked extensively in the fields of image and signal processing, and was the first person to process images using E.T. Jaynes' principle of maximum entropy. This led to experimentation with minimum Fisher information, first as a tool of image processing and ultimately as an approach to deriving physics. He won an award from the Gravity Research Foundation (jointly with H. Rosu, 1997) for some of this work. His recent book "Physics from Fisher Information" (Cambridge University Press, 1998) derives a great deal of physics from the information viewpoint taken in this colloquium.
Physics from Fisher Information

There is a decisive difference between the experiences of (a) passively observing a lamp voltage of 120.0 V on a meter and (b) becoming an active part of the electrical phenomenon by sticking your finger in the lamp socket. Curiously, physics may be derived out of this distinction. It amounts to a distinction between what is observed and what simply is (electrical current flow, etc.). One way of quantifying this is through the information I that data carry about a physical parameter (the domain of Fisher information). That information had to come from somewhere. The 'somewhere' is the phenomenon itself, where the information had value J. The distinction between I and J is the central issue. It implies a difference between how much one knows about a phenomenon and how much it is POSSIBLE to know about it. This is the key consideration in a new approach to physics that derives nearly all of it out of (i) extremizing and (ii) zeroing the information difference I - J.

Aside from providing a unifying framework for physics, the approach predicts some new effects: a 'Boltzmann H-theorem' for the information I, namely dI/dt ≤ 0; a Fisher-based thermodynamics; a 'Fisher temperature' T, defined by 1/T = -dI/dE with E the energy, that is nonlinear in the ordinary temperature; that time, on the quantum level, is as random as position; a higher-order positronium molecule; that the universal physical constants follow a 1/x probability law; that the rate of increase of entropy is bounded above by a square-root expression; and an uncertainty principle relating time and fitness as complementary variables in population genetics.
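The quantities I and J can be made concrete. As a sketch: the standard shift-invariant form of Fisher information, together with the two conditions (extremize, then zero, the difference I - J) that the talk describes. The efficiency constant kappa below is how Frieden's book parameterizes the zero condition; its value depends on the phenomenon, and the precise form of J is likewise phenomenon-dependent.

```latex
% Fisher information carried by data about a shift parameter \theta,
% when the likelihood has the form p(x \mid \theta) = p(x - \theta):
I = \int \frac{1}{p(x)} \left( \frac{dp}{dx} \right)^{2} \, dx .

% The two conditions on the information difference:
% (i) extremize it, (ii) zero it (up to an efficiency 0 < \kappa \le 1):
\delta (I - J) = 0 , \qquad I - \kappa J = 0 .
```

The Euler-Lagrange equations of the extremum condition are what produce the differential equations (wave equations, etc.) of the phenomenon in this approach.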
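As an illustrative numerical check (a sketch in Python, not from the talk): for a Gaussian density of standard deviation sigma, the shift-invariant Fisher information evaluates analytically to 1/sigma^2, and a direct quadrature of the defining integral reproduces that value.

```python
import numpy as np

def fisher_information(p, dx):
    """Estimate the shift-invariant Fisher information
    I = integral of (dp/dx)^2 / p dx for a density p sampled
    on a uniform grid with spacing dx."""
    dpdx = np.gradient(p, dx)          # numerical derivative of p
    return np.sum(dpdx**2 / p) * dx    # trapezoid-like quadrature

# Gaussian density with standard deviation sigma
sigma = 2.0
x = np.linspace(-10 * sigma, 10 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

I = fisher_information(p, dx)
print(I, 1 / sigma**2)  # the two values agree closely
```

The grid is taken wide (plus or minus ten sigma) so that the truncated tails contribute negligibly to the integral.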