Monday, September 6, 2010

Tao level 0: the meta-mystery which cannot go away

In 2002 Physics World published a list of the experiments considered the finest in the history of physics; the winner was Young's experiment, the double-slit experiment applied to light or electrons, one of the foundational experiments in the development of quantum mechanics.
The original experiment, performed by Thomas Young on November 24, 1803, aimed to determine whether light is made of waves or of particles (as suggested by Newton).
Young's experiment was simple.


A ray of sunlight was passed through a hole in a card and then reached a second screen with two holes. The light that passed through the two holes of the second screen finally fell on a screen, where it created a pattern of light and shadow that Young explained as a consequence of the fact that light propagates through the two holes in the form of waves. These waves give rise to bright bands where they add up (constructive interference) and to dark bands where they cancel out (destructive interference).
Young's experiment was accepted as evidence that light propagates as waves. In fact, if light were composed of particles, this alternation of light and shadow would not be observed; only two bright bands would appear, one for each hole.
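As a reference (standard textbook notation, not part of Young's original account), for slit separation d, wavelength λ and observation angle θ, the bands appear where

$$ d\,\sin\theta = m\lambda \;\;\text{(bright bands)}, \qquad d\,\sin\theta = \Big(m+\tfrac{1}{2}\Big)\lambda \;\;\text{(dark bands)}, \qquad m = 0, \pm 1, \pm 2, \dots $$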
The experiment confirmed the predictions of Maxwell's classical electromagnetism, whose equations admit solutions in the form of propagating electromagnetic waves.
In 1900 Max Planck, Nobel Prize in Physics 1918, suggested, on the basis of other experimental data, that light is composed of particles, or quanta of energy; one such phenomenon, the photoelectric effect, was explained along these lines by Einstein in a seminal 1905 article, for which he received the Nobel Prize in Physics in 1921.
The experiment was repeated with both light and electrons, using techniques that can launch a single photon or electron at a time:


The result is that, increasing the intensity of photons or electrons (from left to right), one passes from wave-like to particle-like behavior. Furthermore, at high intensity, if both slits are open one obtains the two-slit wave-like behavior; if only one is open, one obtains the particle-like behavior. Electrons fired into a double-slit apparatus produce an interference pattern on the detector screen (in this case a screen similar to a television set) and therefore must travel as waves. However, on arrival each electron generates a single spot of light, thus behaving as a particle. This led to the conclusion that electrons travel as waves but arrive as particles. If the electron were a particle, we would deduce that each one passes through one or the other of the two holes, but the interference pattern generated on the screen shows that they behave as waves that pass through the two holes at the same time.
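The dot-by-dot build-up of the pattern can be mimicked with a small Monte Carlo sketch (a toy illustration of mine, not from the original post; all numerical parameters are arbitrary): detection positions are sampled one at a time from the quantum density |psi1 + psi2|^2, while the which-slit-known density |psi1|^2 + |psi2|^2 is kept for comparison.

    import numpy as np

    # Toy far-field model of the double slit (all parameters are arbitrary).
    # x is the position on the detector screen; each slit contributes an amplitude
    # whose phase is set by the path difference ~ d * x / L.
    wavelength, d, L = 1.0, 5.0, 100.0          # wavelength, slit separation, screen distance
    x = np.linspace(-40.0, 40.0, 2000)
    phase = np.pi * d * x / (wavelength * L)

    psi1 = np.exp(+1j * phase)                  # amplitude through slit A
    psi2 = np.exp(-1j * phase)                  # amplitude through slit B

    p_wave = np.abs(psi1 + psi2) ** 2           # slits indistinguishable: interference fringes
    p_particle = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # slit known: flat, no fringes

    # Accumulate single "electron" arrivals one at a time from the quantum density.
    p_wave /= p_wave.sum()
    rng = np.random.default_rng(0)
    hits = rng.choice(x, size=5000, p=p_wave)   # each draw is one spot on the screen

    # The histogram of individual spots reproduces the fringes of |psi1 + psi2|^2.
    counts, _ = np.histogram(hits, bins=80)
    print(counts)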
Another experimental configuration uses not a single detector for the two slits but two individual detectors, one on each slit. With a single detector one obtains the wave result; with two detectors one obtains the particle behaviour, and from which detector is activated one can deduce through which slit the particle passed. The quantum entities therefore prove capable of passing through two slits at the same time, and seem to have a sort of awareness of past and future, so that each can choose to make its contribution to the interference pattern at the correct point, contributing to the creation of the pattern rather than to its destruction.
The result of these experiments was decisive for the formulation of wave-particle duality, or the Complementarity Principle, due to Niels Bohr, which reveals that the observed behavior depends on the experimental configuration used to measure it and thus, ultimately, on the observer.
In the words of Richard Feynman, the double-slit experiment and the wave/particle duality enclose "the quantum mystery which cannot go away" (Feynman, 1977), the "central mystery" of quantum mechanics. This is a phenomenon for which it is impossible to find a classical analogue or explanation, and it well represents the core of quantum mechanics.


From this basic experiment came two dual formulations of quantum mechanics: the wave mechanics developed by Erwin Schrödinger, Nobel Prize in Physics 1933, and the particle (matrix) mechanics of Werner Heisenberg, Nobel Prize in Physics 1932. Both theories lead to eigenvalue equations: one in the form of a wave equation, the other in the form of an operator equation for the probability amplitude, which describes the statistical/probabilistic properties of quantum entities.
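As a reminder of the standard form of these eigenvalue problems (textbook notation, not taken from the original references), Schrödinger's time-independent equation reads

$$ \hat H\,\psi(\mathbf{r}) = E\,\psi(\mathbf{r}), \qquad \hat H = -\frac{\hbar^2}{2m}\nabla^2 + V(\mathbf{r}), $$

where ψ is the probability amplitude (|ψ|² gives the probability density of detecting the particle) and E is the energy eigenvalue; in Heisenberg's formulation the same spectrum is obtained as the set of eigenvalues of the Hamiltonian represented as a matrix of operators.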

The dual wave/particle behavior has always been the subject of intense discussion, the most historically significant of which took place between 1930 and 1980 around the Copenhagen interpretation, according to which consciousness, through the exercise of observation, at least in part determines reality.
The Copenhagen interpretation, formulated by
Niels Bohr and Werner Heisenberg during their collaboration in Copenhagen in 1927, explains the double-slit experiment as follows:
  • the electron leaves the electron gun as a particle;
  • it immediately dissolves into a series of superimposed probability waves, that is, a superposition of states;
  • the waves pass through both slits and interfere with each other, creating a new superposition of states;
  • the detector screen, by performing a measurement on the quantum system, collapses the wave function into a particle at a well-defined point on the screen;
  • immediately after the measurement, the electron begins to dissolve again into a new superposition of waves.
According to the Copenhagen interpretation, the objective existence of an electron at a certain point in space, for example at one of the two slits, independently of any actual observation, makes no sense. The electron seems to show a real existence only when it is observed. Reality is created, at least in part, by the observer.

In 1978 John A. Wheeler proposed an ingenious conceptual version of the experiment, called the delayed-choice double-slit experiment, starting from the experiments showing that when a detector is placed at the slits and the slit through which the photon passes is determined, the interference pattern disappears.
In the delayed-choice experiment a detector is placed at a point between the two slits and the final detector, to see which path is taken by each photon after it has passed the two slits but before it reaches the final detector. Quantum theory says that if the intermediate detector analyzing the photon trajectories is switched off, the photons will form an interference pattern. If, however, the photons are observed to determine which slit they passed through, even if the observation is made after they have crossed the slits, there will be no interference pattern. The "delayed choice" comes into play precisely because one can choose to analyze the photon (a decision made randomly by a computer) after the photon has passed through the slit(s). The decision, according to quantum theory, appears to affect the way in which the photon behaved when passing through the slit(s), actually an infinitesimal fraction of time before the observation.
In the words of Wheeler: "Thus one decides the photon shall have come by one route or by both routes after it has already done its travel".
The enormous importance of Wheeler's conceptual experiment stimulated a series of experiments to test it, the most conclusive of which was produced by the Quantum Optics Group of the CNRS led by Alain Aspect, using single-photon sources and interferometry.





The result defies any classical common sense, even within the wave/particle duality: the detected behavior depends not only on the configuration of the screen but also on choices made after the particle has passed the screen, with an effect of backward-causal connection, working backwards in time. In Wheeler's words: "we have a strange inversion of the normal order of time. We, now, by moving the mirror in or out have an unavoidable effect on what we have a right to say about the already past history of that photon".


Among the various theories proposed to explain such behavior, the most fascinating is the MWI (Many Worlds Interpretation), proposed by Everett in the 1950s and supported by Wheeler.
This theory holds that every time the world faces a choice at the quantum level (for example, which slit an electron passes through in the double-slit experiment), the universe splits in two (or into as many parts as there are possible choices), so that all options are realized (in the experiment described above, in one world the electron passes through slit A, in the other through slit B).




Thursday, September 2, 2010

the no-time of Tao


metaTao: the Tao of Tao

In common speech a usual situation is the one in which a descriptive/applicative term (called the subject term) is applied to another term (the object) within its scope of definition/application to form a predicate.
For example, the subject term "production" may be applied to the object "apple", obtaining the term "apple production", a well-formed predicate with meaning.
A special case is when the subject term and the object term coincide, that is, when the subject term is applied to itself. In the example we get "production of productions", a term that still has meaning and could refer, in general, to the study of production techniques.
When a term is applied to itself, the usage is described by the prefix meta (from the Greek μετά = "after", "beyond", "with", "self"): a metaterm.
In the example shown, a production of productions is called a metaproduction.
Not all words with the prefix meta naturally have this meaning: for example, a metaphor is not exactly a "phor of phors", while metaphysics is not properly physics applied to physics.
In metadescriptions it is crucial to distinguish between the two levels of discourse: the level of the object elements and that of the metaelements which describe them. Here a further distinction between levels arises: the logical one, in addition to the hierarchical one.
Logical levels can be organized hierarchically, and hierarchical levels can also be logical levels.
The fundamental distinction between hierarchical and logical levels is that the classes representing the former contain one another but always remain on the same logical plane: each class is an extension of the previous one but still at the same logical level, as biology contains chemistry, chemistry contains physics, and so on. The classes corresponding to logical levels, instead, are not on the same plane; between them there is a logical gap and not just an extension by new elements: a metaclass is not simply an extension of the classes that compose it but a new class with logical/functional characteristics and properties entirely new and different from those of the classes that compose it.



In the figure we have a class C2 with elements C1, mapped to a metaclass C4 that contains as elements some classes C3, of which C2 is one; this is further mapped to a meta-metaclass above it, and so on. To each class corresponds a logical level 1, 2, etc.
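The gap between logical levels can be made concrete with a small Python sketch (an illustrative example of mine, not from the original text): in Python every class is itself an object whose class is a metaclass, by default `type`; the names Meta, C2 and c1 below are chosen only to mirror the figure.

    # Level 1: ordinary objects (instances); Level 2: classes, whose elements are
    # instances; Level 3: metaclasses, whose elements are classes.
    class Meta(type):
        """A metaclass: its instances are classes, not ordinary objects."""
        def __new__(mcls, name, bases, namespace):
            cls = super().__new__(mcls, name, bases, namespace)
            cls.logical_level = 2          # every class built by Meta sits at level 2
            return cls

    class C2(metaclass=Meta):
        """An ordinary class: its instances are level-1 elements."""
        pass

    c1 = C2()                              # a level-1 element of the class C2
    print(type(c1))                        # <class '__main__.C2'>   -> level 2
    print(type(C2))                        # <class '__main__.Meta'> -> level 3
    print(C2.logical_level)                # 2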

Some examples of metaterms are metadata, data that organize data; metatheories, theories about other theories, such as metamathematics, where one can define metatheorems, for example about the proof theory of mathematical theorems; and metalogic, the metatheoretical study of logic, especially mathematical logic.

One area of description where the distinction between logical levels is essential is linguistics, the study of natural languages (such as English, Italian, etc.) and of artificial languages, such as programming languages (C, LISP, HTML, etc.). In fact, if a book on C is written in English there is no possibility of confusion between the natural language of the narrating subject (English) and the described object language (C); but when an English book speaks about English, such as a book of English linguistics, several kinds of confusion may arise between an English word used as a metaterm of the subject metalanguage and the same word used as a term of the described object language. In this case the use of metaterms to separate language and metalanguage and the distinction between logical levels of discourse are essential.
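A minimal code illustration of the same distinction (my own example, not in the original post): the Python program below plays the role of the metalanguage, while the fragment of C held in a string is the object language it talks about; the fragment is only mentioned as data, never executed.

    # Metalanguage: this Python program, which describes the text below.
    # Object language: the C fragment, treated purely as data (mention, not use).
    c_fragment = 'printf("hello, world\\n");'

    print("The object-language statement is", len(c_fragment), "characters long")
    print("It mentions the C identifier 'printf':", "printf" in c_fragment)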

In systems theory a system is commonly composed of elements, but more generally it may include other systems, in which case it is a metasystem, just as a system process may be composed of other processes, in which case it is a metaprocess. The distinction between the logical levels at which the description of the system/process operates is essential in order not to create confusion or paradoxes.
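A small sketch of a metasystem in this sense (an illustrative composite structure; the class names Element and System are hypothetical):

    from dataclasses import dataclass, field
    from typing import List, Union

    @dataclass
    class Element:
        """A level-0 element of a system."""
        name: str

    @dataclass
    class System:
        """A system whose parts may be elements or other systems (a metasystem)."""
        name: str
        parts: List[Union[Element, "System"]] = field(default_factory=list)

        def size(self) -> int:
            """Count the elementary components across all nested logical levels."""
            return sum(p.size() if isinstance(p, System) else 1 for p in self.parts)

    cell = System("cell", [Element("membrane"), Element("nucleus")])
    organ = System("organ", [cell, cell])   # a metasystem: its parts are systems
    print(organ.size())                     # 4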

Tuesday, August 31, 2010

false Tao



René Magritte - "The False Mirror" - 1928

Monday, August 30, 2010

Tao level 0: relativity of Tao and Tao relative

In 1905, in what can be described with the same words Einstein himself used for Newton's Principia, the "largest single contribution produced by a single individual in all the history of physics", Einstein introduced a revolutionary new paradigm, laying the foundations of Special Relativity.




ON THE ELECTRODYNAMICS OF MOVING BODIES

By A. EINSTEIN

June 30, 1905

It is known that Maxwell’s electrodynamics—as usually understood at the
present time—when applied to moving bodies, leads to asymmetries which do
not appear to be inherent in the phenomena. Take, for example, the reciprocal
electrodynamic action of a magnet and a conductor. The observable phenomenon
here depends only on the relative motion of the conductor and the
magnet, whereas the customary view draws a sharp distinction between the two
cases in which either the one or the other of these bodies is in motion. For if the
magnet is in motion and the conductor at rest, there arises in the neighbourhood
of the magnet an electric field with a certain definite energy, producing
a current at the places where parts of the conductor are situated. But if the
magnet is stationary and the conductor in motion, no electric field arises in the
neighbourhood of the magnet. In the conductor, however, we find an electromotive
force, to which in itself there is no corresponding energy, but which gives
rise—assuming equality of relative motion in the two cases discussed—to electric
currents of the same path and intensity as those produced by the electric
forces in the former case.


Einstein started from the consideration that in classical Newtonian mechanics the transformations that relate space and time between inertial reference frames, frames moving relative to each other at constant velocity, without acceleration, were the Galilean transformations, in which time and space are absolute. In other words, the laws of Newtonian mechanics are invariant under Galilean transformations, that is, they co-vary with them, while this is not the case for the laws of classical electromagnetism.
Einstein showed that the correct transformations for electromagnetism, those that leave Maxwell's field equations invariant, are the Lorentz transformations, in which, unlike in Galileo's, time and space are not absolute but relative: both depend on the ratio between the relative velocity of the two reference frames and the velocity of light.
When the Lorentz transformations are applied to Newtonian mechanics to create relativistic mechanics, a new series of phenomena appears that goes against common sense, which is based on the experience of objects that are large (compared to the atomic nucleus) and slow (compared to the speed of light), typical of classical physics; examples are the length contraction and the time dilation measured between two reference frames that move relative to each other at velocities close to that of light.
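For reference, the standard form of these results (textbook notation, not quoted from the post), for a frame moving with velocity v along x:

$$ x' = \gamma\,(x - v t), \qquad t' = \gamma\Big(t - \frac{v x}{c^{2}}\Big), \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, $$

$$ \Delta t = \gamma\,\Delta\tau \;\;\text{(time dilation, } \Delta\tau \text{ being the proper time of the moving clock)}, \qquad L = \frac{L_{0}}{\gamma} \;\;\text{(length contraction, } L_{0} \text{ being the rest length)}. $$

For v much smaller than c, γ tends to 1 and the Galilean relations x' = x − vt, t' = t are recovered, as recalled at the end of this section.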
A typical apparent paradox due to the relativity of time is the twin paradox, a thought experiment (Gedankenexperiment), an experiment considered impossible to perform in practice, either because of its intrinsic structure or because of inadequate technology, but whose result, even if only conceptual, is significant: of two twins on Earth, one leaves on a spaceship and reaches speeds close to that of light. When he returns to Earth he finds that the other twin has aged more, that is, he himself is younger. This is a direct consequence of special relativity, amply confirmed experimentally. The apparent paradox is that the twin on the ship could argue that it is the other twin, together with the Earth, that has moved away at a speed close to that of light, so both should be younger on return. The paradox is only apparent because it assumes that the situation of the twins is symmetrical, which it is not: only the twin on Earth has remained in an inertial frame, while the one on the ship, at least twice, during acceleration and deceleration, has been in a non-inertial reference frame.
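As a purely illustrative number (my own example, not in the original post): for a cruise speed of v = 0.8c, ignoring the acceleration phases,

$$ \gamma = \frac{1}{\sqrt{1 - 0.8^{2}}} = \frac{5}{3}, \qquad \Delta t_{\text{Earth}} = \gamma\,\Delta\tau_{\text{ship}} = \frac{5}{3}\times 30\ \text{years} = 50\ \text{years}, $$

so a round trip lasting 30 years for the travelling twin corresponds to 50 years for the twin who stayed on Earth.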
The extension of relativity to non-inertial frames and the integration of the last part of classical physics, the theory of gravity, was accomplished by Einstein in 1916 with the theory of General Relativity, central to any cosmological model of the Universe.
When speeds become much smaller than that of light, the Lorentz transformations reduce to those of Galileo and relativistic mechanics reduces to Newtonian mechanics; in this sense the Theory of Relativity can be considered an extension of classical physics.

purely acoustic Tao

Tao level 0: Observing the Tao and the Tao which observes

In defining a system, the role performed by the Observer is central.
It is in fact the observer who determines which and how many the system elements are, which relations/processes are to be observed, and who defines the system boundary. Furthermore, to discover and define the system, to measure it at level 0, the observer must interact with it, and therefore modify it, so the original system is never known; only the detected one can be described. Moreover, the system under observation may in turn interact with the observer. It is therefore necessary to consider a broader vision, a higher-level system consisting of the observer plus the observed system.

This apparently self-evident view is not so obvious, since even today there persists the myth that the reality of scientific description, particularly for the "hard sciences" such as physics, described in the formal language of mathematics, is "objective" and therefore does not depend on the observing subject. In this myth the observer never modifies the observed system by observing it, and can always know the "original" physical reality.
This gross confusion with one of the foundations of the Galilean scientific method, namely that the result of an experiment, to be considered valid, must be independent of and agreed upon by all parties who carry it out, is a result of the seventeenth-century Cartesian dualism between res extensa and res cogitans, mind and body, subject and object at physical level 0.



"No amount of testing can prove I'm right, a single experiment can prove that I was wrong."

Albert Einstein, letter to Max Born, December 4, 1926

The myth of objectivity, at least at level 0, is also a consequence of the enormous success, up to about 1900, of Newtonian classical physics (classical mechanics and gravitation) and of Maxwellian physics (classical electromagnetism) in explaining virtually all the physical phenomena then observed, from the motion of the planets to the propagation of light. Even today almost all the technologies developed in the 1800s and 1900s, from mechanics to electronics, have their foundations in these two classical theories, which are the basis of our "subjective" experience of reality at physical level 0.



In classical physics the observer does not exist, or rather has no influence, except for the fact that all laws/equations must be defined in a given coordinate system, specifying the location and the time reference from which one observes. All the laws of classical physics are invariant under any change of reference system in space and time, that is, they are valid for any observer at any place and any time. Space and time are therefore considered absolute.
The conceptual paradigm of classical physics was radically revolutionized at the beginning of the 1900s, when "places" of physics not previously studied were examined theoretically and experimentally, particularly those of high speed/energy (comparable to that of light) and of pico-dimensions, such as the inside of the atomic nucleus and of its constituents.
In the two theories developed during the 1900s for these domains of phenomena, the theory of Special Relativity and Quantum Mechanics, the observer's role is central, and it denies any possibility of belief in an "objective reality" and in common sense already at level 0.

The figure shows the domains of validity of the various theories. The size of objects d is shown horizontally, their velocity v vertically. The upper limit of speed is the speed of light c, and the upper limit of size below which quantum phenomena come into play is given as the length (diameter) of an atomic nucleus Lp. For dimensions larger than Lp and speeds lower than about half that of light, c/2, that is, in the world of our experience, classical physics (CF) is valid; for dimensions smaller than Lp and speeds lower than c/2 quantum physics (QF) applies; for dimensions larger than Lp and speeds higher than c/2 relativistic physics (RF) applies; finally, for dimensions smaller than Lp and speeds faster than c/2 both quantum theory and relativity are needed, that is, quantum-relativistic physics (Q-R F) must be used.
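A trivial sketch of the classification described by the figure (the thresholds Lp and c/2 are just the ones quoted above; the numerical value of Lp is an order-of-magnitude assumption and the function name is hypothetical):

    C = 299_792_458.0        # speed of light, m/s
    LP = 1e-14               # rough diameter of an atomic nucleus, m (order of magnitude)

    def regime(d: float, v: float) -> str:
        """Classify a phenomenon by object size d (m) and speed v (m/s),
        following the rough map of the figure."""
        small = d < LP
        fast = v > C / 2
        if small and fast:
            return "Q-R F (quantum-relativistic physics)"
        if small:
            return "QF (quantum physics)"
        if fast:
            return "RF (relativistic physics)"
        return "CF (classical physics)"

    print(regime(1.0, 30.0))        # an everyday object -> CF
    print(regime(1e-15, 0.9 * C))   # inside a nucleus, near light speed -> Q-R F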
The integration of Quantum Mechanics and the Theory of Relativity was initiated by the work of Paul Dirac in 1928 and continued with the development of Quantum Field Theories.

Some relevant experiments that demonstrate the central role of the Observer and rule out the common sense of everyday experience described by classical physics are:


Syntropy