One of the unsolved problems of nineteenth-century science was the question of the nature of light. Newton, who experimented with light and published his results in Opticks (1704), believed light to consist of minute particles. This corpuscular theory was consistent with the rectilinear propagation of light, but it failed to explain light's refraction (bending on passing from one medium to another). By analogy to sound waves, a wave theory of light was suggested by other physicists, but the authority of Newton tended to divert attention from it (Hull 237). Then, in 1803, Thomas Young (1773–1829) conducted a simple, ingenious test (the "double-slit experiment") which proved that light did indeed consist of waves. To travel as waves, light required a medium to conduct the movement (the air is such a medium for sound waves). To account for the wave nature of light, then, the theory of ether was created. The theory stated that the universe is permeated by ether: an invisible, tasteless, odorless and motionless substance which exists solely for the purpose of propagating light waves (Hull 236). The ether theory, however, created a new problem, for the elastic, solid-like properties of ether, required for the propagation of light, were hard to reconcile with the unimpeded motion of the planets.
The difficulty created by the ether theory disappeared only with the introduction of Maxwell's electromagnetic theory in 1864. Maxwell's theory provided a precise mathematical description of the phenomena of electromagnetism discovered experimentally by Faraday in 1831. Maxwell showed that if electromagnetic action were assumed to travel as a disturbance in ether, as Faraday imagined, then it would be propagated through ether in the form of transverse waves and its speed would be equal to the speed of light. Maxwell's equations were taken for a time to prove that the propagation of light through ether involved a passage of electric and magnetic forces rather than material vibrations (Jeans, Growth 287). The theory of ether was eventually proven incorrect, but the attempt to explain light in terms of electromagnetic phenomena was a most important development: it was the first sign of a move away from the purely mechanical standpoint of nineteenth-century science (Hull 240).
Thomson 
Another such step towards a new physics followed in 1897 with J. J. Thomson's discovery of the electron. His revelation of the new particle, apparently much smaller than the atom, directly challenged the billiard-ball model of the atomic world. It became obvious that the atoms of chemistry could no longer be regarded as the final constituents of matter. The atom was now imagined to consist of a number of negatively charged electrons accompanied by a positive charge sufficient to neutralize their total negative charge. An extension of Maxwell's electrical theory showed that light might be produced by the motion of the electrons inside the atom (Jeans, Growth 305–6). This movement of electrons was believed to be the source of the radiant energy that made hot objects glow.
The new atomic theory, however, presented science with a new problem. According to the theorem of equipartition, the radiation from a red-hot body should consist mostly of waves of the shortest possible wavelength (Jeans, Growth 328). Since short wavelength corresponds to high frequency, even a moderately hot object should emit more high-frequency light (violet, blue) than low-frequency light (red). Accordingly, upon heating, a piece of iron should turn white-hot, first faintly and then more strongly. The new theory could not account for the red glow of moderately hot objects. This problematic emission of radiation from a hot body (physicists call it "black body radiation") remained unresolved until the turn of the century. The failure of theory to fit the facts created serious concern among scientists. They referred to it (prophetically, we may now add) as the "ultraviolet catastrophe" (Kuhn, Black Body Theory 152).
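The classical prediction can be put in quantitative form. Applying the theorem of equipartition to the radiation field yields what is now called the Rayleigh–Jeans law for the energy density of radiation at frequency ν and temperature T (standard modern notation, not part of the original argument):

```latex
u(\nu, T)\,d\nu = \frac{8\pi \nu^{2}}{c^{3}}\,kT\,d\nu
```

Because of the factor ν², the predicted energy grows without bound at high frequencies, so the total radiated energy diverges toward the ultraviolet end of the spectrum: hence the "ultraviolet catastrophe."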
Planck 
One of those concerned with black body radiation was the German physicist Max Planck (1858–1947). In 1900 his experiments led him to the conclusion that the energy of vibrating electrons is emitted not smoothly but discontinuously, in discrete packets. He called those energy packets "quanta." In his new quantum theory, Planck postulated that energy can only be absorbed and emitted in multiples of an elementary quantity which is constant for the light of each given frequency. A quantum of energy is obtained by multiplying the frequency of the given light by a constant h, now called Planck's constant. Consequently, the quantum of low-frequency light, such as red, is smaller than the quantum of high-frequency violet light. The theory thus easily accounted for the red glow of a moderately hot object: the energy quanta of red light were the smallest in the visible light spectrum and so they required the least energy to radiate. Planck's quantum theory, however, did much more than solve the puzzle of black body radiation. It was to become a turning point in the history of science, and by the end of the nineteen-twenties it had forced man to re-evaluate not only his knowledge, but also his own status in the universe.
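Planck's postulate can be written compactly (standard modern notation, not quoted from the source):

```latex
E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}
```

An oscillator of frequency ν can thus exchange energy only in whole multiples nhν; red light, with its lower frequency, comes in smaller quanta than violet light, which is why a moderately hot body can radiate red but not violet.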
Planck took little part in the later development of the quantum theory. A revolutionary in spite of himself, he was a conservative scientist who only reluctantly accepted the disquieting results of his own findings. But only five years after his formulation of the quantum theory, another revolutionary German physicist, Albert Einstein, offered new insights on the question of radiation. In 1905 Einstein, a twenty-six-year-old employee of the Swiss Patent Office at Berne, published a series of scientific papers, two of which were to play a crucial role in the development of new physics. One of them dealt with the quantum nature of light; the other set forth the special theory of relativity.
The paper on the nature of light won Einstein the Nobel Prize in 1922. Defying the generally accepted wave theory of light, he suggested a beam of light to be a stream of tiny particles, later called "photons." This was a return to the Newtonian corpuscular theory of light, but unlike Newton, who could only speculate and draw analogies, Einstein based his theory on recently published experimental work with the so-called "photoelectric effect": a phenomenon in which the impact of light falling on the surface of a metal releases electrons from its atoms, causing an electric current to flow. Experiments showed that in the photoelectric effect reducing the intensity of the beam of light did not affect the velocity of the released electrons, but it did reduce their number. The velocity, however, could be altered by changing the color of the impinging light. The prevailing wave theory of light could not account for these phenomena, but Einstein's new theory offered an elegant and simple explanation. Einstein proposed that each photon of a given color has a certain amount of energy which never changes for that light frequency. A photon of green light, for example, will always strike an electron with the same amount of energy and give it a specific velocity. Reducing the intensity of light can only reduce the number of photons, never their individual energy level. This energy level, however, increases with the frequency of light. A photon of high-frequency light, such as violet, has more energy than a photon of mid-frequency green light. Consequently, a change in light color alters the velocity of the rebounding electrons (Einstein and Infeld 258–62).
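Einstein's account of the photoelectric effect reduces to a single energy balance (the standard form of his equation, not quoted from the source):

```latex
\tfrac{1}{2}mv^{2} = h\nu - W
```

The kinetic energy of a released electron depends only on the frequency ν of the light (its color) and on W, the energy needed to free an electron from the particular metal. Changing the intensity changes only the number of photons, hence the number of electrons released, never their velocity.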
Einstein 
Einstein's new theory of light not only confirmed the basic assumptions of the quantum theory, but took it a step further. Planck postulated that energy can only be emitted and absorbed in quanta. According to Einstein, it is not just the process of emission and absorption, but the energy itself that is quantized: "we must assume that homogeneous light is composed of energy grains and replace the old light corpuscules by light quanta, which we shall call photons, small portions of energy, traveling through empty space with the velocity of light" (Einstein and Infeld 260).
Einstein's explanation of the photoelectric effect was a considerable achievement, but it did not resolve the ambiguities surrounding the nature of light. For one thing, Young's double-slit experiment of 1803 was still accepted as proof of the undulatory nature of light. The new corpuscular theory offered different conclusions on the nature of light, but it did not directly question the validity of Young's experiment. Furthermore, in his explanation of the different levels of energy in the rebounding electrons, Einstein had to rely on the notion of frequency, which is a property of waves. Describing a bullet-like stream of particles in terms of wave frequency defied common sense, and yet that seemed to be the only available explanation of the photoelectric effect.
This wave-particle duality of light became one of the most fundamental concepts of quantum physics. It created an unprecedented situation in which a physicist could prove that light consists of waves or of particles, depending on his choice of experiment to support his claim. The role of the scientist was thus changed from that of an impartial observer to that of an active participant. Science could hardly claim absolute objectivity for its findings, since each experiment or theory was influenced by the choice of one method over another. Nor was the old apparatus of classical science adequate to describe the new subatomic reality. Aristotelian logic, with its rule of the excluded middle, claimed that a statement could only be true or false: there was no third possibility. The wave-particle duality suggested a different rule: either of two mutually exclusive statements ("light consists of waves" and "light consists of particles") could be true or false, depending on the methodology of the scientist determining their validity.
The weakened foundations of classical science received an even greater shock with the appearance of Einstein's next paper. His special theory of relativity proposed new relations between some of the most fundamental notions of physics: time, space, mass and energy. Einstein's starting point was the constancy of light's velocity, which had puzzled scientists for several years. It had been proven experimentally, but it conflicted with both classical mechanics and common sense. According to the transformation laws of classical mechanics, the velocity of light should be higher if the observer moved toward the source of light and lower if he moved away from it. Experimental evidence, however, suggested that light's velocity always remained the same (Einstein and Infeld 169). Einstein confronted this apparent paradox with an ingenious reversal of ideas: instead of attempting to solve the puzzle, he turned it into his postulate and made the principle of the constancy of light's velocity a foundation of his new theory. He adapted another basic postulate from Galileo's "principle of relativity." Galileo asserted that the laws of mechanics which are valid in a given frame of reference are also valid in all other frames of reference that move uniformly in relation to it. Einstein expanded the scope of the principle and postulated that all laws of nature are the same in all frames of reference which are moving uniformly in relation to each other (Einstein and Infeld 177).
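The constancy of light's velocity is captured by the relativistic rule for combining velocities, which replaces the simple addition of classical mechanics. A minimal numeric sketch (the formula is the standard one; the code itself is illustrative and not part of the source):

```python
# Relativistic velocity addition: composing any velocity with the
# speed of light returns the speed of light. (Illustrative sketch,
# standard formula; not part of the original text.)
C = 299_792_458.0  # speed of light in m/s

def add_velocities(u: float, v: float) -> float:
    """Combine two collinear velocities relativistically."""
    return (u + v) / (1 + u * v / C**2)

# A classical sum would give 0.5c + 0.6c = 1.1c; relativistically the
# result stays below c:
print(add_velocities(0.5 * C, 0.6 * C) / C)  # ≈ 0.846
# Light measured from a frame itself moving at half the speed of light:
print(add_velocities(0.5 * C, C) / C)  # 1.0
```

At everyday speeds the correction term u·v/c² is immeasurably small, which is why classical addition works perfectly well in daily life.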
With these two assumptions as a basis for his considerations, Einstein proceeded to analyze the paradox of the velocity of light. He resolved to disregard common sense (which contradicted the experimental findings) and concentrated on such data as were available. The measurement of speed involves the use of two instruments: a clock and a ruler (physicists refer to it as a rod). Einstein reasoned that if the velocity of light measured by two different observers in two different frames of reference (i.e. moving at different speeds in relation to the source of light) appears to be always the same, then the measuring instruments themselves must change depending on the frame of reference in which they are used (Einstein and Infeld 186). More specifically, with the increase of velocity, the measuring rod contracts in the direction of its motion until at the speed of light it disappears altogether, while the clock gradually slows down until at the speed of light it stops.^{[1]} These phenomena, however, are apparent only from outside the frame of reference; the scientist performing the measurement cannot observe any changes in his instruments. The fact that these variations in length and duration cannot be perceived in daily life must be attributed to the extremely slow speeds (relative to the speed of light) which we experience in everyday reality (Einstein and Infeld 191).
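The changes in rod and clock described here are the Lorentz contraction and time dilation; for a frame moving at velocity v they take the form (standard notation, not quoted from the source):

```latex
L = L_{0}\sqrt{1 - \frac{v^{2}}{c^{2}}}, \qquad
t = \frac{t_{0}}{\sqrt{1 - \frac{v^{2}}{c^{2}}}}
```

As v approaches c, the rod's length L shrinks toward zero and the interval t between clock ticks grows without bound, which is precisely the limiting behavior the paragraph describes.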
Length and duration, however, are not the only attributes of a moving object to undergo transformation at high velocity. Einstein suggested further that the object's mass increases significantly at speeds approaching that of light. According to the new theory, mass and energy are to be equated: in Einstein's words, "energy has mass and mass represents energy" (Einstein and Infeld 197). Their relationship is expressed by the famous E=mc²: energy equals mass multiplied by the square of the speed of light. The formula implies that even the smallest particle of matter has enormous amounts of energy stored in it, an assertion which was gruesomely demonstrated to the world in the atomic bomb explosion forty years later. The new equivalence of mass and energy challenged the very essence of the old paradigm by questioning the notions of the solidity and permanence of matter.
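The scale of the energy locked in ordinary matter is easy to see numerically. A small illustrative calculation (not part of the original text):

```python
# E = mc^2 in numbers (illustrative sketch; not part of the source).
C = 299_792_458.0  # speed of light in m/s

def rest_energy(mass_kg: float) -> float:
    """Energy equivalent of a mass at rest, in joules (E = mc^2)."""
    return mass_kg * C**2

# One gram of matter:
print(rest_energy(0.001))  # ≈ 8.99e13 joules
```

Roughly ninety trillion joules in a single gram: the enormity of the factor c² is what the paragraph's "enormous amounts of energy" refers to.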
Einstein's considerations of the impact of high velocity on a moving object implied that man's ideas about space and time had to be thoroughly re-examined. The directional contraction of the measuring rod suggested that our space measurement is relative to one specific frame of reference and cannot claim any universal objectivity. The numbers we obtain do not express the properties of space but rather the values of our reckoning of space. Absolute space measurement is not only out of our grasp; it is nonexistent within our experiential frame of reference. Distances, lengths, and volumes are all relative, and without a frame of reference they are meaningless to the physicist.
Similar objections had to be raised against the Newtonian idea of universal time. Absolute time, like absolute space, is beyond our experience. All that can be measured is its flow relative to one specific frame of reference. Nor is the ordering of events in time absolute, as we had believed. Depending on their frames of reference, two observers can perceive two events as occurring in different sequence, while yet another observer may see them as simultaneous (Jeans, Growth 296).
Time and space, in the new theory, can no longer be considered separate entities: they form a four-dimensional continuum consisting of three dimensions of space and one dimension of time. A given event does not occur at a point in space and at an instant of time but rather exists as a point of the time-space continuum. It is separated from other events by a time-space "interval" which, unlike either space or time considered separately, is an absolute form of measurement, independent of any specific frame of reference. This new conception of the time-space continuum was developed in detail by Einstein's mathematics teacher Hermann Minkowski in 1908. Minkowski suggested that the dynamic picture of Newtonian physics, in which events located in three-dimensional space develop with the passage of time, has to be replaced by a static one, in which the time-space continuum, with all events, past, present and future, exists in its own right. Each specific observer, as his time passes, comes in contact with some portions of the continuum; he experiences successive stages of what is called his world line: a series of events which appear to develop continuously, coming into existence even as he perceives them. In reality, however, the events constituting the fabric of time-space exist independently of our cognition, although we have no practical way of removing ourselves from the time-space continuum in order to study its nature.
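Minkowski's absolute "interval" between two events separated by Δt in time and by Δx, Δy, Δz in space can be written (standard notation, not quoted from the source):

```latex
s^{2} = c^{2}\,\Delta t^{2} - \Delta x^{2} - \Delta y^{2} - \Delta z^{2}
```

Observers in different frames of reference disagree about the temporal and spatial separations taken individually, but every observer computes the same value of s²; this is the sense in which the interval, unlike space or time alone, is absolute.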
The special theory of relativity had a profound impact on the old paradigm. Like the earlier paper on the quantum nature of light, but perhaps even more distinctly, it suggested that the ideal of scientific objectivity is unattainable. Since the time of Galileo physicists had sought numerical expression of their experimental findings. For two centuries they had improved the accuracy of their measuring equipment, only to learn that the distance between two points cannot be measured in any absolute way; for distance is not a relation between two points only, but it also involves the observer whose relation to the points directly affects the value obtained in the measurement. Similar considerations apply to our reckoning of time: temporal intervals have no absolute value since the flow of time directly depends on the relation between the clock and the observer. It now appeared that what physicists had been studying and describing was the universe seen from only one specific frame of reference. What we had taken to be the universe was in fact only a subjective perception of the universe.
With this realization came a major change in the investigative methods of physics. Since measurement, an essential aspect of the scientific experiment, could no longer be relied on as a source of objective data, the stress shifted to theoretical mathematics. The study of the inner workings of nature, till then the domain of the engineer-scientist, thus passed to the mathematician (Jeans, Mysterious Universe 106). It was no longer possible to claim that a model could be made to represent the phenomena of nature. The physicist now had to work with concepts defying precise verbal expression. These concepts were arrived at mathematically; they could be expressed exactly and concisely by appropriate formulae, but they lacked a point of reference in everyday reality, and so they could never be fully understood by non-mathematicians.
The new dependence on mathematics became apparent with the publication of the special theory of relativity, but it was not a sudden transformation. Between the time of Newton and the publication of Einstein's theory, a number of important developments in mathematics had already taken place. The most notable of those was the creation of non-Euclidean geometry, which foreshadowed the twentieth-century breach with the purely mechanical attitudes in physics.
Bolyai 
Of the ten postulates of Euclidean geometry, the second and the fifth had met with some objections over the years. They stated, respectively: "A finite straight line can be extended indefinitely to make an infinitely long straight line" and "Given a straight line and any point off to the side of it, there is, through that point, one and only one line that is parallel to the given line" (Guillen 107). Challenged was not the verity of the postulates but their claimed self-evidence. It was with the transformation of these postulates that the development of the new geometry started. In 1824 a young Hungarian, Janos Bolyai, replaced Euclid's fifth postulate with another one which appeared contrary to common sense: "Given a straight line and any point off to the side of it, there is, through that point, an infinite number of lines that are parallel to the given line" (Guillen 108). He then combined it with the remaining nine postulates to derive a set of theorems which differed from those of Euclidean geometry but formed just as logical a system. Bolyai's discovery was independently repeated by the Russian Nikolaus Lobachevski in 1832, though supposedly preceded by that of the German Carl Friedrich Gauss (who, however, had not published his findings). This new geometry was followed by that of Bernhard Riemann in 1854. Riemann replaced Euclid's second postulate with its opposite: "A finite straight line cannot be extended indefinitely to make an infinitely long line," and modified the fifth to read: "Given a straight line and any point off to the side of it, there are, through that point, not any lines that are parallel to the given line" (Guillen 110). As a result, he created yet another system of geometry with its own set of theorems, different from Euclid's but as logical and as consistent.
Riemann 
The appearance of non-Euclidean geometry was important for the development of physics, for it ended the fallacy that Euclidean geometry is unique and must therefore describe the properties of the universe. Prior to its development, scientists believed that any set of axioms other than Euclid's would eventually lead to contradictions. The new systems, however, turned out to be as self-contained and free from contradictions as traditional geometry. They were constructed out of mathematical abstractions, but the worlds they described could be visualized in terms of three-dimensional concepts: the geometry of Gauss-Bolyai-Lobachevski described a reality which could be pictured as the surface of a pseudosphere (a figure resembling two infinitely long straight trumpets with their bells joined together), while Riemannian geometry described a world resembling the surface of a sphere. Euclidean geometry, in comparison, represented the reality of a flat surface (Guillen 110–11).
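A simple touchstone distinguishes the three geometries: the sum of the angles of a triangle drawn on each surface (a standard fact, stated here in its usual form rather than in Guillen's wording):

```latex
\alpha + \beta + \gamma
\begin{cases}
< 180^{\circ} & \text{Gauss--Bolyai--Lobachevski (pseudosphere)} \\
= 180^{\circ} & \text{Euclid (flat surface)} \\
> 180^{\circ} & \text{Riemann (sphere)}
\end{cases}
```

Only on the flat Euclidean surface do the angles sum to exactly two right angles; on the curved surfaces the familiar theorem fails, yet each system remains internally consistent.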
The creation of new kinds of geometry freed mathematics from dependence on common sense and opened it to a whole new range of possibilities. The non-rational assumptions of the new geometries indicated that mathematics is not a body of universal truths about reality, but rather an arbitrary product of human imagination (Guillen 109). Consequently, the application of mathematics can be extended beyond the limits imposed on man by his senses. Unlike the axioms of Euclid, which were derived from observation and deemed self-evident truths, the postulates of non-Euclidean geometry were admittedly abstract assumptions. Their successful application indicated that mathematicians need no longer rely on the common sense developed from experience with the physical world. Instead, they can define their own world by creating new axioms and exploring their implications.
Einstein's special theory of relativity was an obvious example of such a new application of mathematics: it described with utmost precision phenomena out of reach of the experimental physics of 1905. Consequently, it had to rely on non-verbal expression and involved notions defying conceptualization, such as, for example, the four-dimensional time-space continuum. The same dependence on abstract mathematics also characterized the theory's further extension, the general theory of relativity. Completed in 1916, the new theory clarified the status of the new geometry by concluding that the universe is not Euclidean (Einstein and Infeld 237). Since the publication of the special theory of relativity in 1905 Einstein had been entertaining the idea of enlarging its scope. The word "special" in the theory's name was used in the sense of "restricted," since the theory could be applied only in situations involving uniform motion (Einstein and Infeld 213). If one of two observers, for example, remained motionless while the other was moving at constant speed and in a fixed direction, the theory enabled the physicist to calculate the relation of space and time measurements between the two observers. Such uniform motion, however, is only an abstract concept. In everyday reality we experience non-uniform motion, in which the velocity or direction of the moving object changes. According to the Newtonian paradigm, non-uniform motion was the result of external forces. The earth, for example, would move in a straight line if not for the pulling force of the sun (the force of gravity), which constantly curved the earth's path, keeping the planet in its orbit around the sun. In his general theory of relativity Einstein showed that the force of gravity as conceived by Newton is not a feature of reality but a mere product of the human mind, devised to compensate for our mistaken notions about the geometry of the universe.
The general theory of relativity took almost ten years to complete. It originated in 1907 with the formulation of the "principle of equivalence." Einstein proposed in it that a non-uniform motion such as acceleration (increase of speed) cannot be physically distinguished from gravity (Jeans, Growth 297). We habitually ascribe the behavior of objects in our environment to the force of gravity which constantly pulls them down, but it is not impossible to imagine that the objects are motionless while the earth is rushing in their direction: as soon as we drop a stone, the earth rushes upwards to meet it, as it does everything else. The principle of equivalence showed that gravity was a superfluous concept and had to be abolished. Instead, Einstein suggested in his new theory, we must attribute the non-uniform character of observable motion to the peculiar characteristics of the time-space continuum, which curves in the presence of mass. According to Einstein the earth revolves about the sun not because it is pulled in its direction by the sun's gravitational field, but because, in the presence of the sun's mass, the time-space continuum curves back upon itself. Thus the non-uniform motion of the planet does not involve any outside forces but rather is the result of the peculiar geometry of the time-space continuum. The rules of Euclidean geometry, which we had taken to be universal, can be applied only to small parts of the universe (such as our engineering and topographical measurements) or to the empty regions of interstellar space, where the absence of mass results in a flat time-space continuum. As soon as we encounter mass, however, the continuum changes from flat to curved and non-Euclidean geometry is necessary to describe the motion of objects.
Rutherford 
These fundamental changes in our ideas of space and time, developed by Einstein between 1905 and 1916, coincided with the gradual introduction of a new concept of matter, which formed the basis for quantum mechanics. In 1905 Einstein's assertion that matter represents stored energy challenged the permanence and solidity of matter. Six years later Ernest Rutherford introduced the idea of the porosity of matter, showing that most of what we had taken for solid matter is actually empty space. Rutherford carried out a series of experiments to study the behavior of the alpha particles which are emitted by some radioactive substances. In one of the experiments he bombarded a layer of gas with the particles and found that while the majority of alpha particles penetrated the obstacle, some were deflected by it and scattered in various directions. By studying their deflection Rutherford was able to ascertain for the first time some features of the subatomic world. In 1911 he announced his findings in the form of a new planetary model of the atom. Rutherford postulated that most of the mass of the atom is concentrated in the centrally located and positively charged nucleus, while the negatively charged and considerably smaller electrons rotate around it, much as the planets orbit the sun (Jeans, Growth 315).
Bohr 
Two years later Niels Bohr, a Dane who had studied with both J. J. Thomson and Rutherford, applied the assumptions of the quantum theory to the planetary model of the atom and created a new, detailed theory explaining the behavior of electrons. He suggested that electrons revolve around the nucleus not at arbitrary distances but in orbits of specific sizes. Each of these orbits can accommodate only a specific number of electrons. In the lowest energy state of the atom (the "ground state"), the electrons are grouped as close to the nucleus as possible, but as they absorb energy, they jump to more distant orbits. The distance of the jump depends on the amount of absorbed energy. As soon as the source of the additional energy is removed, the electrons return to their inner orbits. Each time an electron jumps back to a smaller orbit, the atom emits the excess energy in the form of light. The definite sizes of the orbits can thus account for the emitted energy being quantized (Pagels 53).
Bohr's specific-orbits model of the atom gained immediate recognition and won him the Nobel Prize, for it solved a mystery which had puzzled scientists for over a decade. Scientists had observed that if a substance is heated, or if an electric current is passed through a gas, the substance or gas emits light which, when split by a prism, produces not a full spectrum of colors, as sunlight does, but a unique pattern made up of only some parts of the spectrum. It turned out that each chemical element has a unique set of spectral lines which can be used as its "fingerprint." This phenomenon defied explanation in terms of classical physics and remained a mystery until the publication of Bohr's theory. Bohr applied his theoretical assumptions to the simplest of all atoms, the atom of hydrogen, which consists of the nucleus and one electron. By analyzing all the possible jumps of the electron to and from different orbits, he discovered that the number of combinations available to the electron on its way back towards the innermost orbit (i.e. when the emission of light takes place) is the same as the number of discrete lines in the hydrogen light spectrum (Pagels 53–54). It appeared that the discontinuity of each element's spectrum was a visual representation of its quantum nature.
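For hydrogen, Bohr's orbits correspond to a ladder of discrete energy levels, and each spectral line to one allowed jump between rungs (standard modern form, not quoted from the source):

```latex
E_{n} = -\frac{13.6\ \text{eV}}{n^{2}}, \qquad
h\nu = E_{m} - E_{n} \quad (m > n)
```

Because the electron can only occupy the levels n = 1, 2, 3, …, each jump from a higher level m to a lower level n releases a photon of one definite frequency ν; the spectrum therefore consists of sharp lines rather than a continuous band of color.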
In the years following its publication, Bohr's new theory was applied extensively to other, more complicated elements, providing new knowledge about the unexplored subatomic world. New knowledge, as always, meant new questions. What governed the mechanism of the electron jumps? How were specific electrons selected for a given jump? What did this selection depend on? Further experiments with various subatomic particles revealed a startling new world in which traditional Newtonian physics did not work. The realm of subatomic particles was not "determined" in the way Newton claimed the world to be. An experiment conducted twice could lead to different results, and there was no way to predict, prior to the experiment, which of the results would occur. According to classical physics the world has objective existence, that is, it is believed to exist even when not perceived by anyone. The newly ascertained features of the subatomic world contradicted that view by stressing the subjective character of reality. As in relativity theory, the observer was shown to have an active role in determining reality: his decisions on what to observe could directly influence the results of an experiment. It became clear that the subatomic realm could not be successfully visualized by referring its elements to some counterparts in everyday reality. Bohr's concept of subatomic particles as minute objects existing in space and time and exerting forces on one another could not convey the essence of the quantum world (Jeans, Growth 332–33). The need for a new theory became obvious, and in the mid-twenties two new explanations of atomic phenomena were introduced: Heisenberg's matrix mechanics and Schroedinger's wave mechanics.
Heisenberg 
Like Einstein, Heisenberg succeeded in creating a new theory partly because he renounced both the traditional concepts of physics and current speculations. Unlike Bohr, he refused to concern himself with a hypothetical picture of unobservable electrons and concentrated instead on the available physical data: the energy in the form of emitted light (Jeans, Growth 331). When he described mathematically the energy transitions of an atom, it turned out that his calculations corresponded to the so-called algebra of matrices, an abstract type of mathematics invented in 1860 (Guillen 70). Heisenberg's concept was developed further by Max Born and Pascual Jordan and, independently, by Paul Dirac. Matrix mechanics became a complete, dynamic theory replacing Newtonian mechanics in the subatomic world. Its chief characteristic was its purely mathematical character. The theory was not a basis but a substitute for a physical picture of the subatomic world. It reflected the attitude of those physicists who believed that the true picture of the universe can be expressed only mathematically (Pagels 60–61).
de Broglie 
Schroedinger 
An alternative interpretation of subatomic phenomena was meanwhile in the making, based on the idea of the wave-particle duality of light. In 1923 Louis de Broglie suggested that if light, proven experimentally to be a wave, could sometimes behave like a stream of particles, then electrons, traditionally considered particles, could sometimes behave like waves. Using Planck's and Einstein's equations he devised a formula determining the wavelength of these "matter waves." De Broglie then inspired Erwin Schroedinger, who, in 1926, hypothesized that electrons are not solid particles but patterns of standing waves. Schroedinger formulated an equation which, when applied to the hydrogen atom, confirmed Bohr's findings about its structure. Later the same year Max Born interpreted Schroedinger's electron wave function as an expression of the probability of finding an electron at some point in time-space. He suggested that while individual events in the subatomic world cannot be predicted, it is possible to predict the probability of each specific event (Pagels 61-63).
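Both de Broglie's formula and Born's probability rule can be stated compactly. The following is a conventional modern formulation (the symbols are the textbook ones, not taken from the sources cited above):

```latex
% De Broglie: a particle of momentum $p$ has an associated
% "matter wave" of wavelength $\lambda$, where $h$ is
% Planck's constant:
\lambda = \frac{h}{p}

% Born: the probability of finding the electron near position $x$
% at time $t$ is given by the squared magnitude of Schroedinger's
% wave function $\psi$:
P(x,t)\,dx = |\psi(x,t)|^{2}\,dx
```

The first relation turns "particle" data (momentum) into "wave" data (wavelength); the second makes the wave itself a bookkeeping device for probabilities rather than a material vibration.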
The following year, 1927, brought to a close the first, stormy phase of the development of quantum mechanics. Early that year both Heisenberg and Bohr suggested new ways of understanding the subatomic world. Heisenberg announced his "uncertainty principle," which revealed the limits of human knowledge. According to the principle, the nature of the universe is such that it is impossible to measure exactly and simultaneously both the position and the momentum of a moving subatomic particle. A particle such as an electron, for example, cannot be located unless it scatters light. In doing so, however, it receives a shock of unpredictable magnitude, so that it becomes impossible to determine its momentum (Jeans, Growth 334-35).^{[2]} Heisenberg's principle stressed again the recurrent motif of the scientist's active role in creating the observed reality. It implied that we cannot observe the world without influencing its course.
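In its modern textbook form (not Heisenberg's original 1927 notation), the principle bounds the product of the two uncertainties:

```latex
% Heisenberg's uncertainty relation: $\Delta x$ and $\Delta p$ are
% the statistical spreads in position and momentum, and
% $\hbar = h/2\pi$ is the reduced Planck constant:
\Delta x \,\Delta p \ge \frac{\hbar}{2}
```

Because the bound is on the product, sharpening the measurement of position necessarily widens the spread in momentum, and vice versa; as footnote 2 notes, the relation is a statistical statement about repeated measurements, not a claim about a single particle.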
Bohr's "complementarity principle," announced at about the same time, suggested similar implications despite a different approach. While Heisenberg developed his principle mathematically, Bohr's interpretation was philosophical. His principle postulated that an object of knowledge can have complementary properties such that the knowledge of one precludes the knowledge of the other. For example, undulatory and corpuscular characteristics are complementary aspects of light. We can talk about the complementarity of two mutually exclusive aspects without logical contradiction because they are not properties of light but rather of our interaction with light (Pagels 75). Light cannot demonstrate either of its complementary aspects until the scientist chooses the appropriate experiment. Thus, as in Heisenberg's principle, the scientist cannot avoid influencing the world he studies.
In the fall of 1927 physicists working on the new physics met in Brussels at the fifth Solvay Conference. There they formulated the first consistent interpretation of quantum phenomena, which became known as the "Copenhagen Interpretation of Quantum Mechanics" (reflecting the decisive influence of Niels Bohr of Copenhagen). Joining recent findings and theories into one consistent body of knowledge, the Copenhagen Interpretation stated that quantum reality is statistical, not certain: it is impossible in principle to predict individual events with accuracy; we can only predict probabilities of events. A quantum particle is essentially different from other particles of physics in that it cannot be conceived as an object. According to quantum mechanics a subatomic particle is no more than a "tendency to exist" expressed in terms of probability; it is a certain quantity of something that seems to lie beyond our conceptual faculties. Furthermore, quantum reality is partly created by the observer. It is meaningless to talk about properties of quantum objects without specifying the experimental arrangement by means of which they were obtained, for quantum mechanics does not describe reality; it correlates our experience of reality (Pagels 76).
Dirac 
The new paradigm which thus emerged found its best expression in Paul Dirac's The Principles of Quantum Mechanics, published in 1930, exactly three decades after Max Planck's revolutionary quantum postulate. Dirac's book played a role similar to Newton's Principia: it unified the various current theories and gave them a consistent mathematical form. The result was an abstract mathematical theory which included both matrix mechanics and wave mechanics as special cases. It suggested that the fundamental processes of nature do not occur in the time-space continuum, but rather belong to a substratum of events which is out of our reach. Experiments in the subatomic world can bring some of these events to the surface of the substratum, where they can be registered by our senses or instruments, but the essential nature of that substratum does not permit of a spatio-temporal representation (Jeans, Growth 336).
^{1} The idea of directional contraction of matter was first suggested by the Irish physicist George F. FitzGerald in 1892 and became known as the "FitzGerald contraction." A year later Hendrik Lorentz, a Dutch physicist, gave the idea a precise mathematical expression. His formula, the "Lorentz transformation," was adapted by Einstein and became an essential part of the new theory. [Back]
^{2} Heisenberg's uncertainty relation does not actually apply to a single particle; it was arrived at mathematically and it expresses a statistical average resulting from multiple measurements (Pagels 71). [Back]