Copyright (c) 2012 John L. Jerz

Information: The New Language of Science (von Baeyer, 2003, 2005)

Confronting us at every turn, flowing from every imaginable source, information defines our era--and yet what we don't know about it could--and does--fill a book. In this indispensable volume, a primer for the information age, Hans Christian von Baeyer presents a clear description of what information is, how concepts of its measurement, meaning, and transmission evolved, and what its ever-expanding presence portends for the future.
 
Information is poised to replace matter as the primary stuff of the universe, von Baeyer suggests; it will provide a new basic framework for describing and predicting reality in the twenty-first century. Despite its revolutionary premise, von Baeyer's book is written in a simple, straightforward fashion, offering a wonderfully accessible introduction to classical and quantum information.
 
Enlivened with anecdotes from the lives of philosophers, mathematicians, and scientists who have contributed significantly to the field, Information conducts readers from questions of subjectivity inherent in classical information to the blurring of distinctions between computers and what they measure or store in our quantum age.
 
A great advance in our efforts to define and describe the nature of information, the book also marks an important step forward in our ability to exploit information--and, ultimately, to transform the nature of our relationship with the physical universe.

p.65 In a letter to a colleague he explained what he believed to be the mandate of science: 'Our task is not to penetrate into the essence of things, the meaning of which we don't know anyway, but rather to develop concepts which allow us to talk in a productive way about phenomena in nature.'
 
p.96 When Max Planck, the father of quantum mechanics, later wrote this equation in mathematical notation, he chose S for entropy in order to distinguish it from energy, k for the constant... and W for the number of ways. S = k log W
 
p.98 Boltzmann's measure of information was simply W, the number of ways of rearranging a system. When that number is large, our ignorance is large; when it is small, our ignorance is correspondingly small. In this roundabout way - by identifying entropy with missing information - Boltzmann hurled the concept of information into the realm of physics.
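The relationship in the p.96 and p.98 passages can be sketched in a few lines of Python (the constant value and the helper names below are my own for illustration, not from the book): entropy S = k log W grows with the number of ways W of rearranging a system, and log2 of W counts the same missing information in bits.

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA exact value)
k_B = 1.380649e-23

def boltzmann_entropy(W):
    """Entropy S = k log W for a system with W equally likely arrangements."""
    return k_B * math.log(W)

def missing_information_bits(W):
    """Our ignorance about which arrangement the system is in, counted in bits."""
    return math.log2(W)

# A toy system with 8 equally likely arrangements: a single arrangement (W = 1)
# means no ignorance at all, and ignorance grows only logarithmically with W.
print(boltzmann_entropy(1))          # 0.0 - one arrangement, zero entropy
print(missing_information_bits(8))   # 3.0 - three yes/no questions pin down the state
```

The logarithm is what makes the identification work: doubling the number of arrangements adds a fixed amount of entropy, just as one more yes/no question adds one bit of missing information.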
