Helgoland by Carlo Rovelli
The wings of the butterfly will always be the same color for all of us.
INFORMATION
I close this second part of the book with a comment on the role of information in quantum theory. Words are never precise: the variegated cloud of meanings that they carry about with them is their expressive power. But it also generates confusion, “’cause you know sometimes words have two meanings.” The word “information,” which I used a few lines ago, is packed with ambiguity. It is used to mean quite different things in different contexts.
It is often used to refer to something that has meaning. A letter from our father is “rich in information.” In order to decipher this type of information, you need a mind that understands the meaning of the sentences in the letter. This is a “semantic” notion of information, that is to say, one that is linked to meaning.
But there is also a usage of the word “information” that is much simpler and has nothing to do with semantics or the mind: it comes directly from physics, where we do not speak of either minds or meanings. This is the sense in which I used the word above, when I wrote that the thermometer “has information” on the temperature of the cake: I meant only that when the cake is cold the thermometer indicates cold, and when the cake is hot the thermometer registers heat.
This is the simple and general sense of the word “information” used in physics. If I drop two coins, there are four possible results (heads-heads, heads-tails, tails-heads and tails-tails). But if I glue the two coins to a piece of transparent plastic, both face up, and let them drop, there are no longer four possible outcomes but just two: heads-heads and tails-tails. Heads for one coin implies heads for the other. In the language of physics, we say that the sides of the two coins are “correlated.” Or that the sides of the two coins “have information about each other.” If I see one, this “informs” me about the other.*
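The counting in the coin example can be sketched in a few lines of Python (an illustration added here, not part of the book): enumerating the outcomes shows how the glue turns four possibilities into two correlated ones.

```python
from itertools import product

# Two independent coins: all four outcomes are possible.
free_coins = set(product(["heads", "tails"], repeat=2))
print(sorted(free_coins))  # four outcomes

# Gluing the coins to the sheet, both face up, correlates them:
# only the outcomes where the two sides agree remain possible.
glued_coins = {(a, b) for (a, b) in free_coins if a == b}
print(sorted(glued_coins))  # two outcomes: heads-heads and tails-tails
```

Seeing one glued coin now "informs" us about the other: the outcome of the first determines the outcome of the second.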
To say that a physical variable “has information” on another physical variable in this sense is simply to say that there is a tie of some kind (a common history, a physical link, the glue on the plastic sheet) due to which the value of one variable implies something about the value of the other.68 This is the meaning of the word “information” that I am using here.
I hesitated to speak of information in this book, because the word is so ambiguous: everyone tends instinctively to read into it what they will, and this becomes an obstruction to understanding. But I have taken the risk of including it because the concept of information is important for quanta. Please remember that “information” is used here in a physical sense that has nothing to do with anything mental, or with semantics.
This clarified, here is the point: it is possible to think of quantum physics as a theory of information (in the sense outlined) that systems have about one another. The properties of an object can be thought of, as we have seen, as the establishment of a correlation between two objects, or rather as information that one object has on another.
The same is true in classical physics. But this language allows us to pinpoint the difference between classical and quantum physics. This can be summarized in two general facts that radically differentiate quantum physics from classical physics, encapsulating the novelty of the quanta:69
The maximal amount of relevant information about an object70 is finite.
It is always possible to acquire new relevant information about any object.
These two facts are so basic that they are called “postulates.” At first glance the two postulates appear to contradict each other. If the information is finite, how is it always possible to obtain more of it? The contradiction is only apparent, however, because the postulates refer to “relevant” information. Relevant information is that which counts for predicting the future behavior of the object. When we acquire new information, part of the old information becomes “irrelevant”: that is to say, it does not change what can be said about the future.71
Quantum theory is summed up in these two postulates.72 Let’s see how.
1. Information Is Finite: Heisenberg’s Principle
If we could know with infinite precision all the physical variables that describe a thing, we would have infinite information. But this is impossible. The limit is determined by the Planck constant ħ.73 This is the meaning of Planck’s constant. It is the limit up to which we can determine physical variables.
Heisenberg brought this crucial fact to light in 1927, shortly after having conceived the theory.74 He showed that if the precision with which we have information on the position of something is ΔX, and the precision with which we have information on its speed (multiplied by its mass) is ΔP, the two precisions cannot both be arbitrarily good. They cannot both be too close to zero. Their product cannot be smaller than a minimum quantity: half the Planck constant. As a formula, it is this:
ΔX ΔP ≥ ħ/2
This reads: “Delta X times Delta P is always greater than or equal to h-bar divided by two.” This general property of reality is called “Heisenberg’s uncertainty principle.” It applies to everything.
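To get a feel for the numbers, here is an illustrative calculation (added here, not from the book): if an electron's position is pinned down to about the size of an atom, the uncertainty principle fixes a floor on how uncertain its momentum, and hence its speed, must be.

```python
# Numerical illustration of Heisenberg's relation ΔX·ΔP ≥ ħ/2.
hbar = 1.054571817e-34        # reduced Planck constant, in J·s
m_electron = 9.1093837015e-31  # electron mass, in kg

delta_x = 1e-10                       # position precision: ~ size of an atom, in m
delta_p_min = hbar / (2 * delta_x)    # smallest allowed ΔP, in kg·m/s
delta_v_min = delta_p_min / m_electron  # corresponding speed uncertainty, in m/s

print(delta_p_min)  # ≈ 5.3e-25 kg·m/s
print(delta_v_min)  # ≈ 5.8e5 m/s — over half a million meters per second
```

The tighter we pin down where the electron is, the larger this unavoidable spread in its speed becomes.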
An immediate consequence is granularity. Light, for instance, is made of photons or grains of light, because portions of energy that were even more minute than this would violate this principle: the electric field and the magnetic field (that are like X and P, for light) would both be too determined and would violate the first postulate.
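How big is one such grain of light? A quick calculation (an illustration added here, not from the book) gives the energy of a single photon of green light from the relation E = h·c/λ.

```python
# Energy of one photon of green light, E = h·c / λ.
h = 6.62607015e-34   # Planck constant, in J·s
c = 2.99792458e8     # speed of light, in m/s
wavelength = 500e-9  # wavelength of green light, in m

energy = h * c / wavelength  # energy of a single photon, in J
print(energy)  # ≈ 4.0e-19 J — light of this color cannot carry less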
2. Information Is Inexhaustible: Noncommutativity
The uncertainty principle does