Heisenberg quantum uncertainty. Heisenberg uncertainty relation. Generalized uncertainty principle

Although this principle seems rather strange, in essence it is extremely simple. In quantum theory, where the position of an object is characterized by the square of the amplitude of its wave function, and the magnitude of its momentum by the wavelength of that wave function, this principle is nothing more than a simple fact characteristic of waves: a wave localized in space cannot have a single definite wavelength. The puzzling thing is that when we talk about a particle, we mentally picture its classical image, and then we are surprised to discover that the quantum particle behaves differently from its classical predecessor.

If we insist on a classical description of the behavior of a quantum particle (in particular, if we try to attribute to it both a position in space and a momentum), then the maximum possible accuracy of simultaneous determination of its position and momentum will be related by a surprisingly simple relation, first proposed by Heisenberg and called the uncertainty principle:

Δp·Δx ≳ h,

where Δp and Δx are the inaccuracies, or uncertainties, in the values of the momentum and position of the particle. The product of the momentum and position inaccuracies turns out to be of the order of magnitude of Planck's constant. In quantum theory, unlike classical theory, it is impossible to simultaneously localize a quantum particle and assign it a definite momentum. Therefore, such a particle cannot have a trajectory in the same sense as a classical particle. We do not mean psychological uncertainty at all. This uncertainty characterizes the nature of an object that cannot simultaneously possess two properties, position and momentum; an object that vaguely resembles a storm in the atmosphere: if it extends over long distances, weak winds blow; if it is concentrated in a small area, a hurricane or typhoon occurs.

The uncertainty principle contains in a surprisingly simple form what was so difficult to formulate using the Schrödinger wave. If a wave function has a given wavelength, or a given momentum, then its position is completely uncertain, since the probabilities of finding the particle at different points in space are equal. On the other hand, if a particle is completely localized, its wave function must consist of the sum of all possible periodic waves, so that its wavelength, or momentum, is completely indeterminate. The exact relationship between the uncertainties of position and momentum (which comes directly from wave theory and is not especially tied to quantum mechanics, since it characterizes the nature of any waves: sound waves, waves on the surface of water, or waves traveling along a stretched spring) is given in simple form by Heisenberg's uncertainty principle.

Let us recall the previously considered particle whose one-dimensional motion occurred between two walls separated by a distance l. The uncertainty in the position of such a particle does not exceed the distance between the walls, since we know that the particle is enclosed between them. Therefore the value Δx is equal to or less than l:

Δx ≤ l.

The position of the particle can, of course, be localized within narrower limits. But if it is given only that the particle is enclosed between the walls, its coordinate x cannot be pinned down more closely than the distance between them. Therefore the uncertainty, or lack of knowledge, of its coordinate x cannot exceed the value l. Then the uncertainty of the particle's momentum is greater than or equal to

Δp ≥ h/l.

Momentum is related to speed by the formula

p = mv,

hence the speed uncertainty is

Δv = Δp/m ≥ h/(m·l).

If the particle is an electron and the distance between the walls is of atomic dimensions, l ≈ 10⁻⁸ cm, then

Δv ≥ h/(m·l) ≈ 6.6×10⁻²⁷ erg·s / (9.1×10⁻²⁸ g × 10⁻⁸ cm) ≈ 7×10⁸ cm/s.

Thus, if a particle with the mass of an electron is localized in a region whose dimensions are of the order of 10⁻⁸ cm, then we can speak about the particle's velocity only with an accuracy of the order of 10⁸ cm/s.
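The estimate above can be checked with a few lines of Python (a rough sketch in CGS units; the box width l = 10⁻⁸ cm is an assumed atomic-scale value):

```python
# Order-of-magnitude estimate of the velocity uncertainty of an electron
# confined to a box of atomic dimensions (CGS units; the box width l is
# an assumed illustrative value, not a measured quantity).
h = 6.626e-27      # Planck's constant, erg*s
m_e = 9.109e-28    # electron mass, g
l = 1e-8           # box width, cm (roughly one atomic diameter)

dp = h / l         # momentum uncertainty, g*cm/s
dv = dp / m_e      # velocity uncertainty, cm/s

print(f"dp ~ {dp:.1e} g*cm/s")
print(f"dv ~ {dv:.1e} cm/s")   # about 7e8 cm/s
```

The result, roughly a hundredth of the speed of light, shows how violently quantum uncertainty asserts itself at atomic scales.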

Using the results obtained earlier, one can find the uncertainty relation for the Schrödinger wave in the case of a particle confined between two walls. The ground state of such a system corresponds to a mixture, in equal shares, of solutions with momenta

p = +h/2l and p = −h/2l.

(In the classical case, an electron rushes from wall to wall, and its momentum, remaining constant in magnitude, changes its direction with each collision with a wall.) Since the momentum changes from +h/2l to −h/2l, its uncertainty is equal to

Δp = h/2l − (−h/2l) = h/l.

From de Broglie's relation,

p = h/λ,

and for the ground state λ = 2l, so that the magnitude of the momentum is h/2l. At the same time

Δx = l.

Hence,

Δx·Δp = l·(h/l) = h.

This result can be used to estimate the lowest energy value that a quantum system can have. Because the momentum of the system is an uncertain quantity, this energy is generally not equal to zero, and this radically distinguishes a quantum system from a classical one. In the classical case, the energy of the particle under consideration coincides with its kinetic energy, and when the particle is at rest this energy vanishes. For a quantum system, as shown above, the uncertainty of the momentum of a particle confined in the system is

Δp = h/l.

The momentum of such a particle cannot be determined accurately, since its possible values lie in an interval of width h/l. Obviously, if zero lies in the middle of this interval (Fig. 127), the momentum varies in magnitude from zero to h/2l. Therefore the minimum possible momentum that can be attributed to the particle, due to the uncertainty principle, is

p_min ≈ h/2l.

At lower values of momentum the uncertainty principle would be violated. The energy corresponding to this momentum,

E = p_min²/2m = (h/2l)²/2m = h²/8ml²,

can be compared with the lowest energy, whose value we calculated using the Schrödinger equation by selecting a suitable standing wave between the walls of the vessel:

E₁ = h²/8ml².

The value of the result obtained lies not in the numerical agreement, but in the fact that we were able to make a rough estimate of the minimum energy using only the uncertainty principle. In addition, we were able to understand why the minimum kinetic energy of a quantum-mechanical system (unlike a classical one) is never equal to zero. The corresponding classical particle confined between the walls has zero kinetic energy when it is at rest. A quantum particle cannot be at rest if it is captured between the walls. Its momentum, or speed, is essentially uncertain, which manifests itself in an increase in energy, and this increase coincides exactly with the value obtained from a rigorous solution of the Schrödinger equation.
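The comparison can be sketched numerically (CGS units; the box width is an assumed illustrative value): the uncertainty-principle estimate and the standing-wave ground-state energy are computed side by side.

```python
# Minimum-energy estimate for a particle in a box from the uncertainty
# principle, compared with the standing-wave (Schrodinger) ground state.
# CGS units; the box width l is an assumed illustrative value.
h = 6.626e-27      # Planck's constant, erg*s
m_e = 9.109e-28    # electron mass, g
l = 1e-8           # box width, cm

p_min = h / (2 * l)                       # minimum momentum from dp = h/l
E_uncertainty = p_min**2 / (2 * m_e)      # (h/2l)^2 / 2m
E_schrodinger = h**2 / (8 * m_e * l**2)   # ground state h^2/8ml^2

print(f"E (uncertainty estimate) = {E_uncertainty:.2e} erg")
print(f"E (Schrodinger)          = {E_schrodinger:.2e} erg")
# Here the two expressions coincide algebraically: (h/2l)^2/2m = h^2/8ml^2.
```

That the two agree exactly is a coincidence of this particular system; the point is that the uncertainty principle alone delivers the correct order of magnitude.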

This very general result has especially important consequences in that section of quantum theory which corresponds to classical kinetic theory, that is, in quantum statistics. It is well known that the temperature of a system, as kinetic theory states, is determined by the internal motion of the atoms that make up the system. If the temperature of a quantum system is high, then something very similar indeed occurs. At low temperatures, however, quantum systems cannot come to absolute rest. The minimum temperature corresponds to the lowest possible state of the system. In the classical case all particles would then be at rest, but in the quantum case the energy of the particles is determined by expression (41.17), which does not correspond to particles at rest.

From all this it may seem that we are paying too much attention to the electrons confined between two walls. Our attention to electrons is completely justified. What about the walls? If we analyze all the previously considered cases, we can be convinced that the type of force system, be it a vessel or something else, holding an electron in a limited region of space is not so significant.

Two walls, a central force, or various obstacles (Fig. 128) lead to approximately the same results. The specific type of system that holds the electron is not so important. It is much more important that the electron is captured at all, that is, that its wave function is localized. As a result, this function is represented as a sum of periodic waves and the momentum of the particle becomes uncertain, with

Δp ≳ h/Δx.

Let us now analyze, using the uncertainty principle, a typical wave phenomenon, namely the spreading of a wave after it passes through a small hole (Fig. 129). We have already analyzed this phenomenon geometrically, calculating the distances at which humps intersect with depressions. It is not surprising that the results will now be similar; the same theoretical model is simply described in different words. Let us assume that an electron enters a hole in the screen, moving from left to right. We are interested in the uncertainty of the position and speed of the electron in the x direction (perpendicular to the direction of motion). (The uncertainty relation is satisfied for each of the three directions separately: Δx·Δp_x ≳ h, Δy·Δp_y ≳ h, Δz·Δp_z ≳ h.)

Let us denote the width of the slit by d; this value is the maximum error in determining the position of the electron in the x direction at the moment it passes through the hole in the screen. From here we can find the uncertainty of the momentum, or of the speed, of the particle in the x direction:

Δv_x ≥ h/(m·d).

Therefore, if we assume that an electron passes through a hole of width d in the screen, we must admit that its speed thereafter becomes indefinite up to the value h/(m·d).

Unlike a classical particle, a quantum particle cannot, after passing through a hole, produce a clear image on the screen.

If the electron moves with speed v toward the screen, and the distance between the screen and the hole is L, then it covers this distance in a time

t = L/v.

During this time, the particle moves in the x direction by an amount

Δ = Δv_x·t = (h/m·d)·(L/v).

The angular spread is defined as the ratio of this displacement to the length L:

θ = Δ/L = h/(m·v·d) = λ/d,

where d is the width of the slit and λ = h/(m·v) is the de Broglie wavelength. Thus, the angular spread (interpreted as half the angular distance to the first diffraction minimum) is equal to the wavelength divided by the slit width, which is the same as the result obtained previously for light.
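As a sanity check, the same spread can be computed numerically (a sketch in CGS units; the electron speed and slit width are assumed illustrative values, not taken from the text):

```python
# Angular spread of an electron after passing through a slit, estimated
# from the uncertainty principle (CGS units; the speed v and slit width d
# are assumed illustrative values).
h = 6.626e-27     # Planck's constant, erg*s
m_e = 9.109e-28   # electron mass, g
v = 1e9           # electron speed, cm/s (assumed)
d = 1e-4          # slit width, cm (assumed)

lam = h / (m_e * v)   # de Broglie wavelength, cm
theta = lam / d       # angular spread in radians: h/(m*v*d)

print(f"lambda = {lam:.2e} cm")
print(f"theta  = {theta:.2e} rad")
```

Doubling the slit width halves the spread, exactly as in optical diffraction.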

What about ordinary massive particles? Are they quantum particles or Newtonian-type particles? Should Newtonian mechanics be used for objects of normal sizes and quantum mechanics for objects whose sizes are small? We can consider all particles, all bodies (even the Earth) to be quantum. However, if the particle's size and mass are commensurate with those typically observed in macroscopic phenomena, then quantum effects—wave properties, position and velocity uncertainties—become too small to be detectable under normal conditions.

Consider, for example, the particle we talked about above. Let us assume that this particle is a small metal ball from a bearing with a mass of one thousandth of a gram. If we localize its position with an accuracy accessible to our vision in the field of a microscope, say one thousandth of a centimeter, then for a ball localized over a length of 10⁻³ cm the uncertainty in speed,

Δv ≥ h/(m·Δx) = 6.6×10⁻²⁷ erg·s / (10⁻³ g × 10⁻³ cm) ≈ 7×10⁻²¹ cm/s,

turns out to be far too small a value to be detected by ordinary observations.
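The contrast between the two regimes can be sketched with the same formula applied to both cases (CGS units; the masses and localization lengths are the illustrative values used in the text):

```python
# Why quantum uncertainty is invisible for macroscopic bodies: the same
# relation dv >= h/(m*dx) applied to an electron and to a tiny bearing
# ball (CGS units; values are the illustrative ones from the text).
h = 6.626e-27   # Planck's constant, erg*s

def dv_min(mass_g, dx_cm):
    """Minimum velocity uncertainty from dx*dp >= h."""
    return h / (mass_g * dx_cm)

electron = dv_min(9.109e-28, 1e-8)   # electron in an atomic-size region
ball     = dv_min(1e-3, 1e-3)        # 1 mg ball localized to 10^-3 cm

print(f"electron: dv ~ {electron:.1e} cm/s")
print(f"ball:     dv ~ {ball:.1e} cm/s")
```

Nearly thirty orders of magnitude separate the two results; the formula is the same, only the mass and scale change.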

The Heisenberg uncertainty relations relate not only the position and momentum of a system, but also its other parameters, which in classical theory were considered independent. One of the most interesting and useful relations for our purposes is the connection between the uncertainties of energy and time. It is usually written in the form

ΔE·Δt ≳ h.

If a system is in a certain state for a long period of time, then the energy of this system is known with great accuracy; if it remains in this state only for a very short interval of time, its energy becomes uncertain; this fact is accurately described by the relation given above.

This relation is usually applied when considering the transition of a quantum system from one state to another. Let us assume, for example, that the lifetime of some particle is of the order of 10⁻⁸ s, i.e., between the moment of birth of this particle and the moment of its decay a time of the order of 10⁻⁸ s passes. Then the maximum accuracy with which the energy of this particle can be known is

ΔE ≈ h/Δt ≈ 6.6×10⁻²⁷ erg·s / 10⁻⁸ s ≈ 4×10⁻⁷ eV,

which is a very small amount. As we will see later, there are so-called elementary particles whose lifetime is of the order of 10⁻²¹ s (the time between the moment of birth of the particle and the moment of its annihilation). Thus, the period of time during which such a particle is in a certain state is very small, and its energy uncertainty is estimated as

ΔE ≈ h/Δt ≈ 4×10⁶ eV.

This value, 4×10⁶ eV (a million electron volts is abbreviated MeV), is enormous; this is why, as we will see later, such elementary particles, sometimes called resonances, are assigned not an exact energy value but a whole range of values over a fairly wide interval.
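Both estimates follow from one line of arithmetic, sketched here in Python (CGS units converted to electron volts; the lifetimes are the illustrative values discussed above):

```python
# Energy uncertainty dE ~ h/dt for two lifetimes (CGS units converted
# to eV; the lifetimes are the illustrative values from the text).
h = 6.626e-27           # Planck's constant, erg*s
ERG_PER_EV = 1.602e-12  # erg per electron volt

def dE_eV(lifetime_s):
    """Energy uncertainty in eV from dE*dt ~ h."""
    return h / lifetime_s / ERG_PER_EV

print(f"tau = 1e-8  s -> dE ~ {dE_eV(1e-8):.1e} eV")
print(f"tau = 1e-21 s -> dE ~ {dE_eV(1e-21):.1e} eV")
```

The thirteen orders of magnitude between the two lifetimes reappear directly as thirteen orders of magnitude in the energy spread.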

From relation (41.28) one can also obtain the so-called natural width of the levels of a quantum system. If, for example, an atom passes from level 1 to level 0 (Fig. 130), then the energy of level 1 is known only to within the uncertainty ΔE determined by the lifetime Δt of that level. For a typical atomic lifetime of the order of 10⁻⁸ s, the spread of energy values of the level is determined from the expression

ΔE ≈ h/Δt ≈ 4×10⁻⁷ eV.

This is the typical natural width of the energy levels of an atomic system.

The uncertainty principle is a fundamental law of the microworld. It can be considered a particular expression of the principle of complementarity.

In classical mechanics, a particle moves along a definite trajectory, and at any moment in time its coordinates and momentum can be determined exactly. For microparticles this idea is incorrect. A microparticle has no clearly defined trajectory; it possesses both the properties of a particle and the properties of a wave (wave-particle duality). In this case, the concept of "wavelength at a given point" has no physical meaning, and since the momentum of a microparticle is expressed through the wavelength, p = h/λ, it follows that a microparticle with a definite momentum has a completely indeterminate coordinate, and vice versa.

W. Heisenberg (1927), taking into account the dual nature of microparticles, came to the conclusion that it is impossible to characterize a microparticle simultaneously by both its coordinate and its momentum with arbitrary predetermined accuracy.

The following inequalities are called Heisenberg uncertainty relations:

Δx·Δp_x ≥ h,  Δy·Δp_y ≥ h,  Δz·Δp_z ≥ h.

Here Δx, Δy, Δz denote the coordinate intervals within which a microparticle can be localized (these intervals are the coordinate uncertainties), Δp_x, Δp_y, Δp_z denote the intervals of the momentum projections onto the coordinate axes x, y, z, and h is Planck's constant. According to the uncertainty principle, the more accurately the momentum is fixed, the greater the uncertainty in the coordinate, and vice versa.
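A minimal numeric illustration of these relations (a sketch in SI units; the localization lengths are assumed purely for illustration) shows how tightening Δx drives up the minimal Δp_x:

```python
# Minimal momentum uncertainty compatible with a given localization,
# from dx*dp >= h (SI units; the dx values are assumed illustrative).
h = 6.626e-34   # Planck's constant, J*s

for dx in (1e-10, 1e-6, 1e-2):   # atomic, microbial, everyday scales (m)
    dp = h / dx                   # minimal momentum uncertainty, kg*m/s
    print(f"dx = {dx:.0e} m -> dp >= {dp:.1e} kg*m/s")
```

Each direction x, y, z obeys its own copy of the inequality independently.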

Principle of correspondence

As science develops and accumulated knowledge deepens, new theories become more accurate. New theories cover ever wider horizons of the material world and penetrate into previously unexplored depths. Dynamical theories come to be replaced by statistical ones.

Each fundamental theory has certain limits of applicability. Therefore, the emergence of a new theory does not mean a complete negation of the old one. Thus, the movement of bodies in the macrocosm with speeds significantly lower than the speed of light will always be described by classical Newtonian mechanics. However, at speeds comparable to the speed of light (relativistic speeds), Newtonian mechanics is not applicable.

Objectively, there is continuity among fundamental physical theories. This is the principle of correspondence, which can be formulated as follows: no new theory can be valid unless it contains the old theory relating to the same range of phenomena as a limiting case, since the old theory has already proven itself in its own domain.

3.4. The concept of the state of the system. Laplace determinism

In classical physics, a system is understood as a collection of some parts connected to each other in a certain way. These parts (elements) of the system can influence each other, and it is assumed that their interaction can always be assessed from the standpoint of cause-and-effect relationships between the interacting elements of the system.

The philosophical doctrine of the objective, law-governed interconnection and interdependence of the phenomena of the material and spiritual world is called determinism. The central concept of determinism is causality: causality obtains when one phenomenon (the cause) gives rise to another phenomenon (the effect).

Classical physics stands on the position of rigid determinism, usually called Laplacean, for it was Pierre Simon Laplace who proclaimed the principle of causality as a fundamental law of nature. Laplace believed that if the positions of the elements (bodies) of a system and the forces acting in it are known, then one can predict with complete certainty how each body of the system will move now and in the future. He wrote: "We must consider the present state of the Universe as the consequence of its previous state and as the cause of the one that follows. A mind which at a given moment knew all the forces operating in nature and the relative positions of all its constituent entities, if it were vast enough to take all these data into account, would embrace in a single formula the movements of the largest bodies of the Universe and those of the lightest atoms. Nothing would be uncertain for it, and the future, like the past, would stand before its eyes." Traditionally, this hypothetical being, which could (according to Laplace) predict the development of the Universe, is called in science "Laplace's demon."

In the classical period of the development of natural science, the idea was affirmed that only dynamic laws fully characterize causality in nature.

Laplace tried to explain the whole world, including physiological, psychological, and social phenomena, from the standpoint of mechanistic determinism, which he regarded as a methodological principle for constructing any science. In celestial mechanics Laplace saw the model that all scientific knowledge should follow. Thus, Laplacean determinism denies the objective nature of chance and the concept of the probability of an event.

Further development of natural science led to new ideas of cause and effect. For some natural processes it is difficult to identify a cause; for example, radioactive decay occurs randomly. It is impossible to relate unambiguously the time of "escape" of an α- or β-particle from the nucleus to the value of its energy. Such processes are objectively random. There are especially many such examples in biology. Modern natural science recognizes various, objectively existing forms of interconnection between processes and phenomena, many of which are expressed as relationships without a pronounced causal character, that is, without one thing generating another. These are space-time connections, relations of symmetry, certain functional dependencies, probabilistic relationships, and so on. However, all forms of real interaction between phenomena rest on universal active causality, outside of which not a single phenomenon of reality exists, including the so-called random phenomena, in the aggregate of which statistical laws are manifested.

Science continues to develop and is enriched with new concepts, laws, and principles, which points to the limitations of Laplacean determinism. Nevertheless, classical physics, in particular classical mechanics, still has its niche of application today. Its laws are quite applicable to relatively slow motions, whose speeds are much less than the speed of light. The importance of classical physics in the modern period was well expressed by one of the creators of quantum mechanics, Niels Bohr: "However far the phenomena transcend the scope of classical physical explanation, all experimental data must be described in classical terms. The rationale is simply to state the precise meaning of the word 'experiment.' By the word 'experiment' we refer to a situation where we can tell others exactly what we have done and what we have learned. Therefore, the experimental arrangement and the results of observation must be described unambiguously in the language of classical physics."

It is impossible to simultaneously accurately determine the coordinates and speed of a quantum particle.

In everyday life we are surrounded by material objects whose sizes are comparable to our own: cars, houses, grains of sand, and so on. Our intuitive ideas about the structure of the world are formed by everyday observation of the behavior of such objects. The experience accumulated over a lifetime of such observation tells us that since everything we observe behaves in a certain way over and over again, material objects throughout the Universe, on all scales, should behave in a similar way. And when it turns out that somewhere something does not obey the usual rules and contradicts our intuitive notions of the world, it not only surprises us, it shocks us.

In the first quarter of the twentieth century, this was precisely the reaction of physicists when they began to study the behavior of matter at the atomic and subatomic levels. The emergence and rapid development of quantum mechanics opened up to us a whole world whose structure simply does not fit into the framework of common sense and completely contradicts our intuitive ideas. But we must remember that our intuition is based on experience with ordinary objects of a scale commensurate with ourselves, while quantum mechanics describes things that happen at a microscopic level invisible to us; no one has ever directly encountered them. If we forget this, we will inevitably end up in a state of complete confusion and bewilderment. For myself, I have formulated the following approach to quantum mechanical effects: as soon as the "inner voice" begins to repeat "this cannot be!", you need to ask yourself: "Why not? How do I know how everything really works inside an atom? Have I looked there myself?" By setting yourself up in this way, you will find it easier to take in the articles in this book devoted to quantum mechanics.

The Heisenberg principle plays a key role in quantum mechanics overall, if only because it explains quite clearly how and why the microworld differs from the material world we are familiar with. To understand this principle, first think about what it means to "measure" any quantity. To find this book, for example, you enter a room and look around until your gaze stops on it. In the language of physics, this means that you made a visual measurement (you found the book by looking) and obtained the result: you recorded its spatial coordinates (you determined the location of the book in the room). In fact, the measurement process is much more complicated: a light source (the Sun or a lamp, say) emits rays which, having traveled some path in space, interact with the book and are reflected from its surface, after which some of them reach your eyes, pass through the lens, are focused, and strike the retina; you see the image of the book and determine its position in space. The key to the measurement here is the interaction between the light and the book. So it is with any measurement: the measuring tool (in this case, light) interacts with the object of measurement (in this case, the book).

In classical physics, built on Newtonian principles and applied to objects of our ordinary world, we are accustomed to ignoring the fact that a measuring instrument, in interacting with the object of measurement, affects it and changes its properties, including the very quantities being measured. When you turn on the light in a room to find a book, you do not even think that under the pressure of the light rays the book might shift from its place and that you would then learn its spatial coordinates distorted by the light you switched on. Intuition tells us (and in this case quite correctly) that the act of measurement does not affect the measured properties of the object. Now think about processes at the subatomic level. Suppose I need to fix the spatial location of an electron. I still need a measuring instrument that will interact with the electron and return a signal to my detectors with information about its whereabouts. And here a difficulty arises: I have no tools for interacting with an electron to determine its position other than other elementary particles. And while the assumption that light, in interacting with a book, does not affect the book's spatial coordinates is harmless, the same cannot be said of the interaction of the measured electron with another electron or with photons.

In the early 1920s, during the explosion of creative thought that led to the creation of quantum mechanics, the young German theoretical physicist Werner Heisenberg was the first to recognize this problem. Starting with complex mathematical formulas describing the world at the subatomic level, he gradually arrived at a formula of amazing simplicity giving a general description of the effect of measuring tools on the measured objects of the microworld, which we have just discussed. As a result, he formulated the uncertainty principle, now named after him:

uncertainty of coordinate × uncertainty of speed > h/m,

whose mathematical expression is called the Heisenberg uncertainty relation:

Δx × Δv > h/m,

where Δx is the uncertainty (measurement error) of the particle's spatial coordinate, Δv is the uncertainty of the particle's speed, m is the particle's mass, and h is Planck's constant, named after the German physicist Max Planck, another of the founders of quantum mechanics. Planck's constant is approximately 6.626×10⁻³⁴ J·s; that is, it contains 33 zeros after the decimal point before the first significant digit.

The term “spatial coordinate uncertainty” means precisely that we do not know the exact location of the particle. For example, if you use the global positioning system GPS to determine the location of this book, the system will calculate it to within 2-3 meters. (GPS, the Global Positioning System, is a navigation system that uses 24 artificial Earth satellites. If, for example, a GPS receiver is installed on your car, then by receiving signals from these satellites and comparing their delay times, the system determines your geographic coordinates on Earth to within an arcsecond.) However, from the point of view of a measurement made by the GPS instrument, the book could, with some probability, be located anywhere within the few square meters the system indicates. In that case we speak of the uncertainty of the spatial coordinates of an object (in this example, the book). The situation can be improved by taking a tape measure instead of GPS: then we can say that the book is, for example, 4 m 11 cm from one wall and 1 m 44 cm from the other. But even here we are limited in accuracy by the minimum division of the tape-measure scale (even if it is a millimeter) and by the measurement errors of the device itself; at best, we can determine the spatial position of the object to within the minimum scale division. The more accurate the instrument, the more accurate the results, the smaller the measurement error, and the less the uncertainty. In principle, in our everyday world it is possible to reduce the uncertainty to zero and determine the exact coordinates of the book.

And here we come to the most fundamental difference between the microworld and our everyday physical world. In the ordinary world, when measuring the position and speed of a body in space, we have practically no influence on it. So ideally we can simultaneously measure both the speed and coordinates of an object absolutely accurately (in other words, with zero uncertainty).

In the world of quantum phenomena, however, any measurement affects the system. The very fact that we measure, for example, the location of a particle, leads to a change in its speed, which is unpredictable (and vice versa). That is why the right-hand side of the Heisenberg relation is not zero, but positive. The less uncertainty about one variable (for example, Δ x), the more uncertain the other variable becomes (Δ v), since the product of two errors on the left side of the relation cannot be less than the constant on the right side. In fact, if we manage to determine one of the measured quantities with zero error (absolutely accurately), the uncertainty of the other quantity will be equal to infinity, and we will not know anything about it at all. In other words, if we were able to absolutely accurately establish the coordinates of a quantum particle, we would not have the slightest idea about its speed; If we could accurately record the speed of a particle, we would have no idea where it is. In practice, of course, experimental physicists always have to look for some kind of compromise between these two extremes and select measurement methods that allow them to judge both the speed and spatial position of particles with a reasonable error.
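The compromise can be made concrete with a short sketch (SI units; the electron mass is standard, while the Δx values are assumed purely for illustration): as the position uncertainty shrinks, the minimal velocity uncertainty allowed by Δx·Δv > h/m grows.

```python
# Tradeoff between position and velocity uncertainty for an electron,
# from dx * dv > h/m (SI units; the dx values are assumed illustrative).
h = 6.626e-34     # Planck's constant, J*s
m = 9.109e-31     # electron mass, kg

for dx in (1e-10, 1e-12, 1e-14):   # progressively tighter localization, m
    dv_min = h / (m * dx)          # smallest dv compatible with the relation
    print(f"dx = {dx:.0e} m -> dv > {dv_min:.1e} m/s")
```

Squeezing Δx by a factor of 100 inflates the minimal Δv by the same factor, which is the quantitative content of the compromise experimenters must make.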

In fact, the uncertainty principle connects not only spatial coordinates and speed; in that example it simply manifests itself most clearly. Uncertainty equally binds other pairs of mutually related characteristics of microparticles. By similar reasoning we come to the conclusion that it is impossible to measure the energy of a quantum system exactly and at the same time determine the moment in time at which it possesses this energy. That is, if we measure the state of a quantum system in order to determine its energy, the measurement takes a certain interval of time; call it Δt. During this interval the energy of the system changes randomly (a fluctuation occurs), and we cannot pin it down. Let us denote the energy measurement error by ΔE. By reasoning similar to the above, we arrive at an analogous relation for ΔE and the uncertainty of the time during which the quantum particle possessed this energy:

ΔE·Δt > h

There are two more important points to make regarding the uncertainty principle:

it does not forbid measuring either one of the two characteristics of a particle (its spatial position or its velocity) by itself with any desired accuracy;

the uncertainty principle operates objectively and does not depend on the presence of an intelligent subject performing the measurements.

Sometimes you may come across claims that the uncertainty principle implies that quantum particles have no definite spatial coordinates and velocities at all, or that these quantities are completely unknowable. Don't be fooled: as we have just seen, the uncertainty principle does not prevent us from measuring each of these quantities with any desired accuracy. It only states that we cannot reliably know both at the same time. And, as with many things, we are forced to compromise. Again, anthroposophical writers among the supporters of the "New Age" concept sometimes argue that, since measurements imply the presence of an intelligent observer, human consciousness must at some fundamental level be connected to a Universal Mind, and that it is this connection that underlies the uncertainty principle. Let us repeat once more: the key to the Heisenberg relation is the interaction between the particle being measured and the instrument of measurement, which influences the results. The presence of a rational observer in the person of a scientist is beside the point; the measuring instrument influences the results in any case, whether an intelligent being is present or not.

See also:

Werner Karl Heisenberg, 1901-76

German theoretical physicist. Born in Würzburg; his father was a professor of Byzantine studies at the University of Munich. In addition to brilliant mathematical abilities, he showed a penchant for music from childhood and became quite accomplished as a pianist. While still a schoolboy, he was a member of the people's militia that maintained order in Munich during the troubled times that followed Germany's defeat in World War I. In 1920 he became a student in the Department of Mathematics at the University of Munich; however, after being refused admission to a seminar that interested him on then-current questions of higher mathematics, he obtained a transfer to the Department of Theoretical Physics. In those years the entire world of physicists lived under the impression of a new view of the structure of the atom (see Bohr's Atom), and all the theorists among them understood that something strange was happening inside the atom.

After defending his diploma thesis in 1923, Heisenberg began work in Göttingen on problems of atomic structure. In May 1925 he suffered an acute attack of hay fever, which forced the young scientist to spend several months in complete solitude on the small island of Heligoland, cut off from the outside world, and he used this forced isolation as productively as Isaac Newton had used his months of plague quarantine back in 1665. During these months the scientist developed matrix mechanics, a new mathematical apparatus for the emerging quantum mechanics. As time has shown, matrix mechanics is mathematically equivalent, in its description of the processes of the quantum world, to the quantum wave mechanics embodied in the Schrödinger equation that appeared a year later. In practice, however, the apparatus of matrix mechanics proved harder to use, and today theoretical physicists work mainly with the concepts of wave mechanics.

In 1926, Heisenberg became Niels Bohr's assistant in Copenhagen. It was there, in 1927, that he formulated his uncertainty principle, arguably his greatest contribution to science. That same year Heisenberg became a professor at the University of Leipzig, the youngest professor in German history. From then on he worked intensively on a unified field theory (see Universal theories), by and large unsuccessfully. For his leading role in the creation of quantum mechanics, Heisenberg was awarded the 1932 Nobel Prize in Physics.

From a historical point of view, the name of Werner Heisenberg will probably always remain associated with uncertainty of a somewhat different kind. With the National Socialist Party's rise to power, the hardest-to-interpret page of his biography opened. First, as a theoretical physicist, he was drawn into an ideological struggle in which theoretical physics as such was branded "Jewish physics" and Heisenberg himself was publicly called a "white Jew" by the new authorities. Only after a series of personal appeals to the highest-ranking officials of the Nazi leadership did the scientist manage to halt the campaign of public harassment against him. Far more problematic is Heisenberg's role in the German nuclear weapons program during the Second World War. At a time when most of his colleagues had emigrated or been forced to flee Germany under pressure from Hitler's regime, Heisenberg headed the German national nuclear program.

Under his leadership the program concentrated entirely on building a nuclear reactor, but Niels Bohr, after his famous meeting with Heisenberg in 1941, was left with the impression that this was merely a cover and that the program was in fact developing nuclear weapons. So what really happened? Did Heisenberg, deliberately and at the prompting of his conscience, lead the German atomic bomb program into a dead end and redirect it onto a peaceful path, as he later claimed? Or did he simply make mistakes in his understanding of nuclear fission processes? Be that as it may, Germany did not manage to create atomic weapons in time. As Michael Frayn's brilliant play Copenhagen shows, this historical riddle is likely to supply material for generations of writers to come.

After the war, Heisenberg became an active advocate of the further development of West German science and of its reunification with the international scientific community. His influence was instrumental in securing the non-nuclear status of the West German armed forces in the postwar period.

Under the influence of the success of scientific theories, especially Newton's theory of gravitation, the French scientist Pierre Laplace at the beginning of the 19th century developed a view of the Universe as a completely determined object. Laplace believed that there must exist a set of scientific laws that would make it possible to predict everything that can happen in the Universe, provided only that a complete description of its state at some moment in time is known. For example, if we knew the positions of the Sun and the planets at a certain moment, then using Newton's laws we could calculate the state of the Solar System at any other moment. In this case determinism is fairly obvious, but Laplace went further, arguing that similar laws exist for everything, including human behavior.

The doctrine of scientific determinism met strong resistance from many who felt that it restricted God's freedom to intervene in our world; nevertheless, the idea remained a standard scientific assumption into the early years of our century. One of the first indications that determinism would have to be abandoned came from the calculations of two English physicists, Lord Rayleigh and James Jeans, which implied that a hot object such as a star must radiate an infinite amount of energy. According to the laws then accepted, a hot body should emit electromagnetic waves of all frequencies equally (for example, radio waves, visible light, X-rays). This means that the same amount of energy should be emitted in waves with frequencies between one and two million million waves per second as in waves with frequencies between two and three million million waves per second. And since there are infinitely many different frequencies, the total radiated energy would have to be infinite.

To avoid this patently absurd conclusion, the German scientist Max Planck proposed in 1900 the hypothesis that light, X-rays and other waves cannot be emitted with arbitrary intensity, but only in certain portions, which Planck called quanta. Moreover, Planck suggested that each quantum of radiation carries a definite amount of energy, which is greater the higher the frequency of the waves. Thus, at a sufficiently high frequency, the energy of a single quantum may exceed the energy available; consequently, high-frequency radiation is suppressed, and the rate at which the body loses energy turns out to be finite.
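Planck's cutoff can be seen numerically. The sketch below (Python; the SI constants and the temperature of 5000 K are illustrative assumptions, not taken from the text) integrates the classical Rayleigh-Jeans spectrum and Planck's formula over ever larger frequency ranges: the classical total keeps growing without bound, while the quantum total converges.

```python
import math

# Physical constants (SI) and an assumed temperature; illustrative values.
h = 6.626e-34   # Planck constant, J*s
k = 1.381e-23   # Boltzmann constant, J/K
c = 3.0e8       # speed of light, m/s
T = 5000.0      # temperature of the hot body, K

def rayleigh_jeans(nu):
    """Classical spectral energy density: grows as nu**2 without bound."""
    return 8 * math.pi * nu**2 * k * T / c**3

def planck(nu):
    """Planck's quantum formula: suppressed once h*nu >> k*T."""
    return (8 * math.pi * h * nu**3 / c**3) / math.expm1(h * nu / (k * T))

def integrate(f, nu_max, n=50000):
    """Crude midpoint-rule integral of f from 0 to nu_max."""
    d = nu_max / n
    return sum(f((i + 0.5) * d) for i in range(n)) * d

for nu_max in (1e14, 1e15, 1e16):
    print(f"up to {nu_max:.0e} Hz:  Rayleigh-Jeans = "
          f"{integrate(rayleigh_jeans, nu_max):.3e} J/m^3,  "
          f"Planck = {integrate(planck, nu_max):.3e} J/m^3")
# The classical total grows as nu_max**3; the quantum total converges.
```

Extending the frequency range tenfold multiplies the classical result by a thousand, while the Planck result barely changes: exactly the resolution of the "infinite energy" paradox described above.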

The quantum hypothesis agreed excellently with the observed radiation intensities of hot bodies, but what it meant for determinism did not become clear until 1927, when another German scientist, Werner Heisenberg, formulated the famous uncertainty principle. To predict the future position and speed of a particle, one must be able to measure its position and speed accurately at the present moment. The obvious way to do this is to shine light on the particle. Some of the light waves will be scattered by it, and this will indicate the particle's position in space. However, the accuracy of this measurement can be no better than the distance between the crests of two adjacent waves, so short-wavelength light is needed to measure the particle's position precisely. Now, according to Planck's hypothesis, light cannot be used in arbitrarily small portions: there is no portion smaller than one quantum. This quantum of light will disturb the motion of the particle and change its speed unpredictably. Moreover, the more accurately the position is to be measured, the shorter the wavelength of the light must be, and hence the greater the energy of a single quantum; the disturbance of the particle's velocity will therefore be greater. In other words, the more accurately you try to measure the position of a particle, the less accurately you can measure its speed, and vice versa. Heisenberg showed that the uncertainty in a particle's position, multiplied by the uncertainty in its speed and by its mass, can never be smaller than a certain quantity, now known as Planck's constant. This quantity depends neither on the way the position or speed of the particle is measured nor on the kind of particle; that is, the Heisenberg uncertainty principle is a fundamental, inescapable property of our world.



The uncertainty principle has far-reaching consequences for our perception of the surrounding world. Even after more than fifty years, many philosophers have not fully come to terms with them, and these consequences are still a subject of debate. The uncertainty principle meant the end of Laplace's dream of a scientific theory that would provide a completely deterministic model of the Universe: indeed, how can one accurately predict the future without even being able to measure accurately the present state of the Universe! Of course, one can imagine that there exists some set of laws that completely determines events for some supernatural being capable of observing the current state of the Universe without disturbing it in any way. But such models of the Universe are of no interest to us mere mortals. It is better, perhaps, to apply the principle of economy known as "Occam's razor" (William of Ockham, c. 1285-1349, English philosopher. The essence of the principle: concepts that cannot be verified in experience should be removed from science. - Editor's note) and cut out of the theory everything that cannot be observed. Adopting this approach, Werner Heisenberg, Erwin Schrödinger and Paul Dirac in the 1920s revised mechanics and arrived at a new theory, quantum mechanics, founded on the uncertainty principle. In quantum mechanics particles no longer possess such definite and mutually independent characteristics as position in space and speed, which cannot be observed. Instead, they are characterized by a quantum state, which is some combination of position and velocity.

Quantum mechanics, generally speaking, does not predict a single definite result for an observation. Instead, it predicts a number of different possible outcomes and gives the probability of each. This means that if we made the same measurement on many identical systems whose initial states are the same, we would find that in one number of cases the result of the measurement is A, in another B, and so on. We can predict approximately in how many cases the result will be A and in how many B, but we cannot determine the result of each individual measurement. Thus, quantum mechanics introduces an inevitable element of unpredictability, or randomness, into science. Einstein objected very sharply to this concept, despite the enormous role he himself had played in its development. For his enormous contribution to quantum theory, Einstein was awarded the Nobel Prize. Yet he could never accept that the Universe is governed by chance. All of Einstein's feelings were expressed in his famous remark: "God does not play dice." Most other scientists, however, were inclined to accept quantum mechanics because it agreed perfectly with experiment. Quantum mechanics is indeed a remarkable theory, and it underlies almost all modern science and technology. Its principles form the basis for the operation of semiconductor and integrated circuits, the most important components of electronic devices such as televisions and computers. Modern chemistry and biology also rest on quantum mechanics. The only areas of physics that do not yet make good use of quantum mechanics are the theory of gravity and the theory of the large-scale structure of the Universe.

Although light consists of waves, Planck's hypothesis means that in some sense it behaves as if it were formed of particles: it can be emitted and absorbed only in portions, or quanta. In turn, the Heisenberg uncertainty principle says that particles in a sense behave like waves: they do not have a definite position in space but are "smeared" over it with a certain probability distribution. Quantum mechanical theory uses a completely new mathematical apparatus that no longer describes the real world itself in terms of particles and waves; those concepts can now be applied only to the results of observations of this world. Thus, a particle-wave duality arises in quantum mechanics: in some cases it is convenient to treat particles as waves, while in others it is better to treat waves as particles. One important consequence follows from this: we can observe so-called interference between two waves of particles. The crests of the waves of one may, for example, coincide with the troughs of the other. The two waves then cancel each other out rather than adding up, as one might expect, into higher waves (Figure 4.1). A familiar example of light interference is a soap bubble shimmering with all the colors of the rainbow. This phenomenon results from the reflection of light from the two surfaces of the thin film of water forming the bubble. White light contains all wavelengths, corresponding to different colors. The crests of some waves reflected from one surface of the soap film coincide with the troughs of waves of the same length reflected from the other surface. The reflected light then lacks the colors corresponding to these wavelengths, and so it appears multicolored.

So, thanks to the duality that arises in quantum mechanics, particles too can exhibit interference. A well-known example of such particle interference is the experiment with two slits in a screen (Fig. 4.2). Consider a screen in which two narrow parallel slits are cut. On one side of it there is a source of light of a particular color (that is, of a particular wavelength). Most of the light strikes the surface of the screen, but a small fraction passes through the slits. Now imagine an observation screen placed on the far side of the slitted screen from the light source. Light waves from both slits will reach any point of the observation screen, but the distance traveled by the light from the source through each slit will, generally speaking, be different. This means that the waves from the two slits arrive at the screen in different phases: in some places they weaken each other, and in others they reinforce each other. As a result, a characteristic pattern of dark and light bands appears on the screen.

Surprisingly, exactly the same bands appear when the light source is replaced by a source of particles, say electrons, emitted at a definite speed (which means they correspond to waves of a definite length). The phenomenon is all the stranger because with only one slit open no bands appear: just a uniform distribution of electrons on the screen. One might suppose that a second slit would simply increase the number of electrons reaching each point of the screen, but because of interference the number of electrons at some places actually decreases. If electrons were sent through the slits one at a time, one would expect each of them to pass through one slit or the other, behaving as if the slit it passed through were the only one, and a uniform distribution should then appear on the screen. In reality, however, the bands appear even when the electrons are released one at a time. Therefore, each electron must pass through both slits at once!
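The band pattern described above can be reproduced with a toy calculation. In this sketch the slits are idealized as point sources, and all the numbers (500 nm light, slits 20 micrometers apart, screen 1 m away) are assumptions of the example, not values from the text. The wave amplitudes from the open slits are added as complex numbers and only then squared to get an intensity.

```python
import cmath
import math

# Illustrative parameters (assumed, not from the text):
wavelength = 500e-9   # green light, m
d = 2e-5              # slit separation, m
L = 1.0               # distance from slits to observation screen, m

def intensity(y, slits):
    """Add the complex wave amplitudes from each open slit, then square."""
    amp = 0
    for ys in slits:
        r = math.hypot(L, y - ys)          # path length from slit to point y
        amp += cmath.exp(2j * math.pi * r / wavelength)
    return abs(amp) ** 2

ys = [i * 1e-3 for i in range(-50, 51)]    # points on the screen, +/- 5 cm
both = [intensity(y, [-d / 2, d / 2]) for y in ys]
one = [intensity(y, [-d / 2]) for y in ys]

# Two open slits: intensity swings between ~0 and ~4x a single slit's value.
# One open slit (an ideal point source): a featureless, uniform distribution.
print(min(both), max(both))
print(min(one), max(one))
```

The dark places are exactly those where the two path lengths differ by half a wavelength, so the amplitudes cancel; with one slit open there is nothing to cancel, matching the "uniform distribution" the text describes (a real single slit would additionally show its own diffraction envelope, which this point-source idealization omits).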

The phenomenon of particle interference became decisive for our understanding of the structure of atoms, those smallest "building blocks" considered in chemistry and biology, of which we ourselves and everything around us are built. At the beginning of the century it was believed that atoms were like the solar system: electrons (particles carrying a negative electric charge) orbit a centrally located, positively charged nucleus, like planets around the Sun. The attractive forces between positive and negative charges were assumed to keep the electrons in their orbits, just as the gravitational attraction between the Sun and the planets keeps the planets in theirs. This explanation ran into the following difficulty: before the advent of quantum mechanics, the laws of mechanics and electricity predicted that the electrons would lose energy and therefore spiral toward the center of the atom and fall onto the nucleus. This would mean that atoms, and with them, of course, all matter, should quickly collapse to a state of very high density. A partial solution to this problem was found in 1913 by the Danish scientist Niels Bohr. Bohr postulated that the electrons cannot move in just any orbits, but only in those lying at certain specific distances from the central nucleus. If one also assumed that each such orbit can hold only one or two electrons, the problem of atomic collapse was solved, since the electrons, spiraling toward the center, could only fill up the orbits with the smallest radii and energies.

This model explained perfectly the structure of the simplest atom, the hydrogen atom, in which a single electron orbits the nucleus. It was not clear, however, how to extend the same approach to more complex atoms. Moreover, the assumption of a limited set of allowed orbits seemed quite arbitrary. The new theory, quantum mechanics, resolved this difficulty. It turned out that an electron orbiting the nucleus can be pictured as a wave whose length depends on the electron's speed. Along some orbits a whole (rather than fractional) number of electron wavelengths fits. On these orbits the wave crests end up at the same place on each revolution, so the waves add up; such orbits are the Bohr allowed orbits. For orbits along which a whole number of electron wavelengths does not fit, each crest is sooner or later canceled by a trough as the electron revolves; such orbits are not allowed.
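The standing-wave condition can be turned into numbers. A minimal sketch (Python, SI constants): the closed-form radius below follows from combining the condition n·λ = 2πr with the Coulomb force balance m·v²/r = k·e²/r², a standard step not spelled out in the text.

```python
# Constants (SI); this sketch assumes the simple Bohr picture described above.
hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
e = 1.602e-19      # elementary charge, C
k = 8.988e9        # Coulomb constant, N*m^2/C^2

def allowed_radius(n):
    """Radius of the orbit on which exactly n electron wavelengths fit.

    Combining n * lambda = 2*pi*r (the standing-wave condition) with the
    force balance m*v**2/r = k*e**2/r**2 gives r_n = n**2 * hbar**2 / (m*k*e**2).
    """
    return n**2 * hbar**2 / (m_e * k * e**2)

for n in (1, 2, 3):
    print(f"n={n}: r = {allowed_radius(n):.3e} m")
# n = 1 reproduces the Bohr radius, about 5.3e-11 m; radii grow as n**2.
```

That the n = 1 orbit comes out at the observed size of the hydrogen atom was one of the early triumphs of the quantized-orbit picture.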

The American scientist Richard Feynman found a beautiful way to visualize wave-particle duality: the so-called sum over trajectories. In this approach, unlike in the classical, non-quantum theory, it is not assumed that the particle has a single trajectory in space-time; on the contrary, the particle is considered able to travel from A to B along every possible path. With each trajectory are associated two numbers: one describes the size of the wave, the other its position in the cycle (crest or trough). To determine the probability of the transition from A to B, one adds up the waves for all these trajectories. If one compares several neighboring trajectories, their phases, or positions in the cycle, will in general differ greatly, which means that the waves corresponding to such trajectories almost completely cancel one another. For some families of neighboring trajectories, however, the phases change little from trajectory to trajectory, and the corresponding waves do not cancel. Such trajectories correspond to Bohr's allowed orbits.
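The cancellation that Feynman's picture relies on can be demonstrated with a toy model. In the sketch below (Python; the units with m = 1 and the artificially small "Planck constant" are assumptions of the example), free-particle paths from A to B are labelled by their midpoint position, each contributes a unit phasor exp(iS/ħ) built from its classical action S, and the phasors are summed over two windows of neighboring paths.

```python
import cmath

# Toy "sum over trajectories" for a free particle, in units where m = 1.
# A path from A = (x=0, t=0) to B = (x=2, t=2) is labelled by its midpoint
# position x at t = 1; the action of its two straight legs (each of duration
# 1) is m*(distance)**2 / (2*time) per leg:
def action(x):
    return x**2 / 2 + (2 - x)**2 / 2

hbar = 0.01   # artificially small "Planck constant": phases turn over quickly

def phasor_sum(x_lo, x_hi, n=2000):
    """Add the unit phasors exp(i*S/hbar) for paths in a window of midpoints."""
    dx = (x_hi - x_lo) / n
    return sum(cmath.exp(1j * action(x_lo + (i + 0.5) * dx) / hbar)
               for i in range(n)) * dx

near = abs(phasor_sum(0.9, 1.1))   # window around the stationary path x = 1
far = abs(phasor_sum(1.9, 2.1))    # window where the phase varies wildly
print(near, far)
# Phases near the classical (straight-line) path reinforce one another;
# far from it they almost completely cancel.
```

The window around the classical path, where the phase is nearly stationary from trajectory to trajectory, contributes an order of magnitude more than an equally wide window elsewhere: exactly the mechanism by which "allowed" motions survive the sum.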

Based on such ideas, written in a specific mathematical form, it was possible, using a relatively simple scheme, to calculate the allowed orbits for more complex atoms and even for molecules consisting of several atoms that are held together by electrons whose orbits cover more than one nucleus. Since the structure of molecules and the reactions that occur between them are the basis of all chemistry and all biology, quantum mechanics in principle allows us to predict everything we see around us with the accuracy allowed by the uncertainty principle. (However, in practice, calculations for systems containing many electrons turn out to be so complex that they are simply impossible to carry out).

The large-scale structure of the Universe appears to obey Einstein's general theory of relativity. This theory is called classical because it does not take into account the quantum mechanical uncertainty principle, which must be incorporated for consistency with the other theories. We do not thereby contradict observations, because all the gravitational fields we normally deal with are very weak. However, according to the singularity theorems discussed above, the gravitational field must become very strong in at least two situations: black holes and the big bang. In such strong fields quantum effects must be significant. Thus classical general relativity, by predicting points of infinite density, in a sense predicts its own failure, exactly as classical (that is, non-quantum) mechanics doomed itself to failure by implying that atoms must collapse to infinite density. We do not yet have a complete theory in which general relativity is consistently combined with quantum mechanics, but we do know some properties of the future theory. What follows from these properties for black holes and the big bang will be discussed in later chapters. For now, let us turn to recent attempts to unify our understanding of all the other forces of nature into a single, unified quantum theory.

In classical mechanics, the state of a material point (a classical particle) is specified by the values of its coordinates, momentum, energy, etc. These quantities are called dynamic variables. Strictly speaking, dynamic variables cannot be ascribed to a microobject. However, we obtain information about microparticles by observing their interaction with instruments, which are macroscopic bodies. The results of measurements are therefore inevitably expressed in terms developed for characterizing macrobodies, that is, through values of dynamic variables, and accordingly the measured values of dynamic variables are attributed to microparticles. For example, one speaks of the state of an electron in which it has such-and-such a value of the energy, and so on.

The peculiarity of the properties of microparticles is manifested in the fact that not all variables acquire definite values upon measurement. For example, an electron (or any other microparticle) cannot simultaneously have exact values of the coordinate x and the momentum component p_x. The uncertainties of these quantities satisfy the relation

Δx · Δp_x ≥ ħ (20.1)

(ħ is Planck's constant). From (20.1) it follows that the smaller the uncertainty of one of the variables, the greater the uncertainty of the other. A state is possible in which one of the variables has an exact value while the other is completely uncertain (its uncertainty is infinite).
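Relation (20.1) is an order-of-magnitude statement; with the uncertainties defined as standard deviations, the exact bound is Δx · Δp_x ≥ ħ/2, and a Gaussian wave packet saturates it. The sketch below (Python; units with ħ = 1 and the particular width σ are assumptions of the example) checks this by direct numerical integration.

```python
import math

# Numerical check, in assumed units with hbar = 1, that a Gaussian packet
# psi(x) ~ exp(-x**2 / (4*sigma**2)) gives delta_x * delta_p = 1/2, the
# smallest product the (standard-deviation form of the) relation allows.
sigma = 0.7
hbar = 1.0

def psi(x):
    return math.exp(-x**2 / (4 * sigma**2))

def dpsi(x):
    """Derivative of psi, needed for the momentum spread."""
    return -x / (2 * sigma**2) * psi(x)

def integrate(f, a=-20.0, b=20.0, n=40000):
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

norm = integrate(lambda x: psi(x)**2)
x2 = integrate(lambda x: x**2 * psi(x)**2) / norm        # <x^2>  (<x> = 0)
# For a real wavefunction, <p^2> = hbar**2 * integral of psi'(x)**2 dx:
p2 = hbar**2 * integrate(lambda x: dpsi(x)**2) / norm

dx_dp = math.sqrt(x2) * math.sqrt(p2)
print(f"delta_x * delta_p = {dx_dp:.4f}")
```

Whatever value of σ is chosen, the product stays at ħ/2: squeezing the packet in position (smaller σ) broadens it in momentum by exactly the compensating factor.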

A relation similar to (20.1) holds for y and p_y, for z and p_z, as well as for a number of other pairs of quantities (in classical mechanics such pairs are called canonically conjugate). Denoting canonically conjugate quantities by the letters A and B, we can write

ΔA · ΔB ≥ ħ (20.2)

Relation (20.2) is called the uncertainty relation for the quantities A and B. This relation was discovered by W. Heisenberg in 1927.

The statement that the product of the uncertainties of the values ​​of two conjugate variables cannot be of an order of magnitude less than Planck's constant is called the Heisenberg uncertainty principle.

Energy and time are canonically conjugate quantities. Therefore, the uncertainty relation is also valid for them:

ΔE · Δt ≥ ħ

This relation means that determining the energy with accuracy ΔE must take a time interval of at least Δt ~ ħ/ΔE.
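As a numerical illustration (the lifetime of 1e-8 s is an assumed, typical value for an atomic excited state, not a number from the text), the energy-time relation gives the natural width of an energy level:

```python
# Reduced Planck constant and electron-volt conversion (SI values).
hbar = 1.055e-34   # J*s
eV = 1.602e-19     # J

# Assumed, typical lifetime of an atomic excited state:
tau = 1e-8         # s

# A level that lives only for a time tau cannot have an energy defined
# more sharply than about hbar / tau:
delta_E = hbar / tau
print(f"natural linewidth ~ {delta_E / eV:.2e} eV")
# About 1e-7 eV: tiny compared with typical level spacings of ~1 eV,
# which is why spectral lines look sharp yet have a finite width.
```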

The uncertainty relation can be clarified by considering, in particular, the following example. Let us try to determine the value of the coordinate x of a freely flying microparticle by placing in its path a slit of width Δx oriented perpendicular to the direction of the particle's motion (Fig. 20.1). Before the particle passes through the slit, its momentum component p_x has the exact value zero (by convention the slit is perpendicular to the momentum), so that Δp_x = 0, while, on the other hand, the coordinate x of the particle is completely uncertain. At the moment the particle passes through the slit the situation changes. Instead of complete uncertainty of the coordinate x, the uncertainty Δx appears, but this is achieved at the cost of losing the certainty of the value of p_x. Indeed, owing to diffraction there is some probability that the particle will move within the angle 2φ, where φ is the angle corresponding to the first diffraction minimum (maxima of higher orders can be neglected, since their intensity is small compared with that of the central maximum). Thus, an uncertainty arises:

Δp_x = p sin φ
The edge of the central diffraction maximum (the first minimum) produced by a slit of width Δx corresponds to the angle φ for which

sin φ = λ / Δx

(see formula (129.5) of volume 2). Hence,

Δp_x = p sin φ = p λ / Δx

Hence, taking into account (18.1), according to which λ = 2πħ/p, we obtain the relation

Δx · Δp_x = p λ = 2πħ ≥ ħ,

consistent with (20.1).
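The bookkeeping of this estimate is easy to verify numerically. The sketch below (Python; it assumes, as in the derivation above, sin φ = λ/Δx at the first minimum and the de Broglie relation λ = 2πħ/p; the particular momenta and slit widths are illustrative) shows that the product Δx · Δp_x comes out equal to 2πħ regardless of the momentum or slit width chosen.

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s

def slit_product(p, delta_x):
    """Delta_x * Delta_p_x for diffraction at a slit of width delta_x.

    Uses sin(phi) = lambda / delta_x at the first minimum and the de Broglie
    relation lambda = 2*pi*hbar / p, as in the derivation in the text.
    """
    lam = 2 * math.pi * hbar / p       # de Broglie wavelength
    dp = p * (lam / delta_x)           # Delta_p_x = p * sin(phi)
    return delta_x * dp

# The product is 2*pi*hbar whatever momentum and slit width are assumed
# (as long as lambda < delta_x, so that sin(phi) <= 1 makes sense):
print(slit_product(1e-24, 1e-9) / hbar)   # ~6.283, i.e. 2*pi
print(slit_product(5e-27, 1e-6) / hbar)   # ~6.283 again
```

The narrower the slit, the larger λ/Δx and hence the larger the momentum kick: the two uncertainties trade off so that their product never drops below the quantum bound.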

Sometimes the uncertainty relation receives the following interpretation: in reality, a microparticle has exact values ​​of coordinates and momenta, but the impact of a measuring device that is noticeable for such a particle does not allow these values ​​to be accurately determined. This interpretation is completely wrong. It contradicts the experimentally observed phenomena of diffraction of microparticles.

The uncertainty relation indicates to what extent the concepts of classical mechanics can be applied to microparticles, in particular with what degree of accuracy one can speak of the trajectories of microparticles. Motion along a trajectory is characterized by well-defined values of the coordinates and velocity at each moment of time. Substituting in (20.1) the product m Δv_x in place of Δp_x, we obtain the relation

Δx · Δv_x ≥ ħ / m
We see that the greater the mass of the particle, the smaller the uncertainties in its coordinate and speed and, consequently, the more accurately the concept of a trajectory is applicable. Already for a macroparticle only 1 micron in size, the uncertainties in the values of x and v_x lie beyond the accuracy with which these quantities can be measured, so that in practice its motion is indistinguishable from motion along a trajectory.
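A quick numerical check of this mass dependence (Python; the droplet mass of about 4e-15 kg, roughly a 1-micron water droplet, is an assumed, illustrative value):

```python
hbar = 1.055e-34   # reduced Planck constant, J*s

def coord_times_speed_bound(mass_kg):
    """Lower bound on delta_x * delta_v_x that follows from (20.1)
    with delta_p_x = m * delta_v_x:  delta_x * delta_v_x >= hbar / m."""
    return hbar / mass_kg

# Electron vs. an assumed ~1-micron water droplet (mass ~ 4e-15 kg):
print(f"electron: {coord_times_speed_bound(9.11e-31):.1e} m^2/s")
print(f"droplet:  {coord_times_speed_bound(4e-15):.1e} m^2/s")
# For the droplet, even pinning the position to delta_x = 1e-8 m forces a
# speed uncertainty of only ~3e-12 m/s, far below any measurable accuracy,
# so the classical trajectory picture holds. For the electron the same
# bound is sixteen orders of magnitude larger.
```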

Under certain conditions, even the motion of a microparticle may be regarded, to a good approximation, as taking place along a trajectory. As an example, consider the motion of an electron in a cathode-ray tube, and let us estimate the uncertainties of the electron's coordinate and momentum in this case. Let the trace of the electron beam on the screen have a radius of the order of Δx, and let the length of the tube be of the order of 10 cm (Fig. 20.2). Then the momentum of the electron is related to the accelerating voltage U by the relation

p² / (2m) = eU

whence p = √(2meU). According to relation (20.1), the uncertainty of the momentum component transverse to the tube axis is Δp_x ≈ ħ/Δx; for any laboratory values of the beam radius and the accelerating voltage this uncertainty is many orders of magnitude smaller than the momentum p itself.

The result obtained shows that the motion of an electron in a cathode-ray tube is practically indistinguishable from motion along a trajectory.

The uncertainty relation is one of the fundamental principles of quantum mechanics. This relationship alone is enough to obtain a number of important results. In particular, it allows one to explain the fact that an electron does not fall on the nucleus of an atom, as well as to estimate the size of the simplest atom and the minimum possible energy of an electron in such an atom.

If an electron fell onto a point nucleus, its coordinate and momentum would take on definite (zero) values, which is incompatible with the uncertainty principle. This principle requires that the uncertainty Δr of the electron's coordinate and the uncertainty Δp of its momentum be related by condition (20.1). Formally, the energy

E = p² / (2m) − e² / r

would be minimal at r = 0 and p = 0. Therefore, when estimating the lowest possible energy, one must put Δr ≈ r and Δp ≈ p. Substituting these values into (20.1), we obtain the relation

r p ≈ ħ
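The estimate sketched here can be finished numerically. The sketch below (Python, SI units, so the Coulomb term is written k·e²/r rather than the Gaussian-units e²/r) substitutes p ≈ ħ/r into the energy and scans for the minimum; it reproduces both the size of the simplest atom and the minimum electron energy mentioned above.

```python
# Order-of-magnitude estimate: substitute p ~ hbar/r (from r*p ~ hbar)
# into E = p**2/(2m) - k*e**2/r and minimize over r. SI constants:
hbar = 1.055e-34   # J*s
m_e = 9.109e-31    # electron mass, kg
e = 1.602e-19      # elementary charge, C
k = 8.988e9        # Coulomb constant, N*m^2/C^2
eV = 1.602e-19     # J per electron-volt

def energy(r):
    """Estimated electron energy at confinement radius r."""
    return hbar**2 / (2 * m_e * r**2) - k * e**2 / r

# Scan a grid of radii from 1e-11 m to about 2e-10 m for the minimum:
radii = [1e-11 * (1 + 0.001 * i) for i in range(20000)]
r_min = min(radii, key=energy)
print(f"r_min ~ {r_min:.2e} m,  E_min ~ {energy(r_min) / eV:.1f} eV")
# Reproduces the Bohr radius (~5.3e-11 m) and the hydrogen ground-state
# energy (~ -13.6 eV). At smaller r the kinetic term hbar**2/(2m r**2)
# grows faster than the Coulomb term, which is why the electron does
# not fall onto the nucleus.
```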