8. Subtlety between the Cracks of Approximation

Approximations can hide possibility.


At the edges, there are openings.

Physics based upon countless approximations.

There is one feature of Physics that is not generally acknowledged. Physics, like its cousin Engineering, is based upon countless approximations. Admittedly, these approximations provide an incredible fit. Indeed, for those so inclined, the precision of the match between experimental findings and theoretical physics borders upon the miraculous. Yet even though theory and reality coincide almost exactly, there is always some imprecision involved, especially at the borders.

Discarding terms that ‘won’t make a difference’

This occurs in a variety of ways for a variety of reasons. When scientists make their derivations to come up with a mathematical theory, they routinely discard many terms that are presumably too small to make a difference. Indeed, some claim that genius is the ability to know when to discard the inessential in pursuit of the essential. Both Einstein and Feynman had this talent. This approximation process is a crucial feature of the scientific method.
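The pendulum's small-angle approximation is a classic instance of discarding a term that 'won't make a difference.' The following Python sketch (our own illustration; the function name is ours) measures exactly how little difference is discarded when sin θ is replaced by θ:

```python
import math

def relative_error(theta):
    """Relative error from replacing sin(theta) with theta."""
    return abs(theta - math.sin(theta)) / math.sin(theta)

# The discarded terms barely matter for small swings, but grow quickly.
for degrees in (1, 5, 10, 30):
    theta = math.radians(degrees)
    print(f"{degrees:>2} deg: error = {relative_error(theta):.4%}")
```

At a 5-degree swing the approximation is off by well under a fifth of a percent, which is why discarding the higher-order terms works so well in practice.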

As William James said, “The art of being wise is the art of knowing what to overlook.”

Sometimes approximations obscure understanding.

Sometimes, these approximations have obscured a deeper understanding. Frequently, better data exposes these misunderstandings. Two examples come to mind.

Example: Copernicus’ Circle to Kepler’s Ellipse

Copernicus theorized that the Earth circled around the Sun. This theoretical approximation worked well for over half a century. At the time, the best observations could only pin planetary position to within 10 minutes of arc. Thanks to better observational methods, Tycho Brahe was able to generate celestial data with a precision error of only 4 minutes of arc. Kepler employed this data to describe planetary motion as elliptical rather than circular. Kepler's mathematical description of planetary motion led in turn to Newton's revolutionary laws of motion.
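Kepler's ellipse can itself only be applied through successive approximation: locating a planet on its orbit means solving Kepler's equation, M = E − e·sin E, which has no closed-form solution. A brief Python sketch (our own illustration, using Mars's modern eccentricity of roughly 0.0934) shows how far the ellipse departs from the circular prediction:

```python
import math

def eccentric_anomaly(mean_anomaly, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) by Newton's method."""
    E = mean_anomaly  # initial guess: the circular-orbit position
    while True:
        delta = (E - e * math.sin(E) - mean_anomaly) / (1 - e * math.cos(E))
        E -= delta
        if abs(delta) < tol:
            return E

# Mars, eccentricity ~0.0934, a quarter period after perihelion:
M = math.pi / 2
E = eccentric_anomaly(M, 0.0934)
print(math.degrees(E - M))  # degrees ahead of the circular prediction
```

Even a modest eccentricity produces a departure of several degrees, far larger than Tycho Brahe's 4-minutes-of-arc precision, which is why his data could expose the circular approximation.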

Another example: Light behavior from wave to particle

The second example comes from modern times. We've all seen the partial reflection of light from a lake or a pane of glass. For centuries the theory that light is a wave provided an explanation for this common experience. In the 20th century, more precise experimental methods were devised that could measure the behavior of individual photons. When the light was dimmed, the photons didn't gradually fade away, as wave theory predicts. Instead, there were simply fewer of them, each carrying its full energy. This new data forced physicists to revise their theory: instead of behaving like a wave, light exhibits particle-like behavior at the subatomic level. This deeper understanding of the nature of subatomic entities opened the door to further technological advances. The point of these examples is that approximations sometimes hide possibility.
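A rough simulation conveys the distinction. Treating each photon as reflecting whole or not at all (about 4% per glass surface, the figure Feynman uses in QED), dimming the source reduces the count of reflected photons rather than their individual energy. The numbers below are illustrative only:

```python
import random

random.seed(42)  # reproducible run

REFLECTANCE = 0.04  # roughly 4% of photons reflect from one glass surface

def reflected_count(n_photons):
    """Each photon reflects whole or not at all - never as a faded fraction."""
    return sum(1 for _ in range(n_photons) if random.random() < REFLECTANCE)

# Dimming the beam cuts the *number* of reflected photons;
# each detected photon still arrives with its full energy.
for n in (100_000, 10_000, 1_000):
    print(f"{n:>7} photons sent -> ~{reflected_count(n)} reflected")
```

The per-photon probability stays fixed no matter how dim the beam, which is exactly the counting behavior the new instruments revealed.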

Amazing Happenings beneath the limits of scientific understanding.

Quantum Approximations

Another type of approximation is due to the quantum nature of the subatomic world. Planck's constant places theoretical constraints upon what can be known. Heisenberg's Uncertainty Principle elucidated these limits.
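The limit is quantitative: Δx·Δp ≥ ħ/2. A short calculation (our own illustration) shows what this means for an electron confined to an atom-sized region:

```python
HBAR = 1.054571817e-34         # reduced Planck constant, J*s
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def min_velocity_spread(delta_x):
    """Smallest velocity uncertainty permitted by dx * dp >= hbar / 2."""
    return HBAR / (2 * M_ELECTRON * delta_x)

# Confining an electron to ~1 angstrom (an atom's width) forces a
# velocity spread on the order of half a million meters per second.
print(min_velocity_spread(1e-10))
```

The tighter the confinement, the larger the mandatory spread; beneath this floor, the theory simply does not specify what occurs.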

Hawking: Singularity variations beneath the Uncertainty Principle account for subtle differences.

Stephen Hawking theorized that all the variation in the known universe arises from beneath these limits. If the singularity before the Big Bang were perfectly uniform, the universe would also be uniform. Without even the slightest variation in the singularity, the subtleties of our world would never have emerged. Only by introducing subtle changes beneath the level defined by subatomic uncertainty could scientists account for the refinement of our world. In other words, amazing things can occur below the possible limits of scientific understanding.

Approximation based upon Simplification required for comprehension.

Descartes considers the interaction between only 2 objects.

There is yet a third type of approximation that is significant. Descartes introduced this approximation, and the technique remains a cornerstone of the scientific method. Instead of contemplating the bewildering complexity of the multiplicity, Descartes considered only the interactions between two objects. Employing this simplification, he came up with the theory of conservation of momentum. This idea affirmed the notion that inertia is a property of mass. Both of these ideas are taught in beginning Physics classes.

Simplification eventually leads to modern technology.

The ability of mathematicians to solve increasingly complex two-body problems led to miraculous extensions of Descartes' original theory. Modern technology provides an affirmation of these extensions. Yet despite the excellent fit between theory and reality, as determined by experimental evidence, mathematicians have not been able to devise a general method for solving the dynamic interaction between three bodies. Approximations provide all the pragmatic accuracy that is necessary.
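In practice, the three-body problem is handled exactly this way: by stepwise numerical approximation rather than a closed-form solution. The sketch below is a deliberately naive illustration (unit masses, G = 1, crude Euler steps), not a production integrator:

```python
# Naive sketch of three-body gravity: unit masses, G = 1, Euler steps.

def accelerations(pos):
    """Pairwise inverse-square attractions (G = 1, all masses = 1)."""
    acc = [[0.0, 0.0] for _ in pos]
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += dx / r3
            acc[i][1] += dy / r3
    return acc

def step(pos, vel, dt=0.001):
    """Advance every body by one small time step (semi-implicit Euler)."""
    acc = accelerations(pos)
    for i in range(len(pos)):
        vel[i][0] += acc[i][0] * dt
        vel[i][1] += acc[i][1] * dt
        pos[i][0] += vel[i][0] * dt
        pos[i][1] += vel[i][1] * dt
    return pos, vel

# Three bodies released from rest at the corners of a triangle.
pos = [[0.0, 0.0], [1.0, 0.0], [0.5, 0.9]]
vel = [[0.0, 0.0] for _ in range(3)]
for _ in range(300):
    pos, vel = step(pos, vel)
print(pos)  # positions after t = 0.3: the bodies have begun to fall inward
```

No formula predicts where the bodies end up; only the accumulation of many small approximate steps does, and every step introduces a little imprecision of its own.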

Feynman et al: subatomic characterization of simple hydrogen atom.

In similar fashion, Feynman and his subatomic colleagues have isolated the complex interactions of the subatomic world. To begin to understand the complexity, their studies have only focused upon the simplest phenomena. Instead of attempting to understand the oxygen atom from the subatomic perspective, they have only concentrated upon the simplest atom – the hydrogen atom.

Empirical evidence validates bizarre theories.

The precision of the data has led to some unusual, even paradoxical, mathematical theories. These theories include a variety of bizarre scenarios: the possibility of electrons and photons moving forward and backward in time, virtual photons, and traveling faster than the speed of light. Despite the counterintuitive nature of the conclusions, the fit between theory and empirical data is so tight that the theory is accepted as scientific fact.

Experience-based Phenomena Slips between the Computational Cracks

Complexity of computations block application to other atoms

The basic idea behind this technique is that it is possible to generalize from the individual to the multiple, even though it is impossible to do the more complex computations. Descartes inferred that all objects would interact in a fashion similar to two objects. In similar fashion, subatomic theorists have inferred that all atoms behave analogously to the hydrogen atom. While mathematicians have been able to accurately characterize the interactions of two bodies, they haven't been able to solve the 'three-body' problem due to the complexity of the computations. Similarly, although theoretical physicists have nailed down the subatomic interactions of the hydrogen atom, they have not been able to apply the same mathematical techniques to any other atom due to the complexity of the computations.

Generalizations introduced to include other phenomena.

In both cases, generalizations are introduced to make sense of more complex phenomena. Although not contradictory, these simplifications frequently have nothing to do with the theoretical underpinnings. For example, to make sense of most phenomena associated with light in our ordinary world, we assume light to be an electromagnetic wave that moves at a constant speed, not a subatomic particle that moves at different speeds backwards and forwards in time. In other words, the general level has many properties that the individual does not have. Sometimes it is possible to derive the general properties from the individual, and sometimes not. While subatomic interactions reveal much about the nature of light and electrons, they tell us little if anything about planetary interactions, organic chemistry, psychology, or even computers. Within the imprecision lie incredible potentials. There is a lot happening between the computational cracks.

Feynman: 3 fundamental actions just a small beginning

With characteristic humility, Feynman recognizes this phenomenon. “You might wonder how such simple actions could produce such a complex world. It’s because phenomena we see in the world are the result of an enormous intertwining of tremendous numbers of photon exchanges and interferences. Knowing the three fundamental actions is only a very small beginning toward analyzing any real situation, where there is such a multitude of photon exchanges going on that it is impossible to calculate – experience has to be gained as to which possibilities are more important. Thus we invent such ideas as ‘index of refraction’ or ‘compressibility’ or ‘valence’ to help us calculate in an approximate way when there’s an enormous amount of detail going on underneath. It’s analogous to knowing the rules of chess – which are fundamental and simple – compared to being able to play chess well, which involves understanding the character of each position and the nature of various situations – which is much more advanced and difficult.” (QED, p. 114)

Experience applied to rules leads to pragmatic results.

In other words, knowledge of the behavior of subatomic entities doesn’t lead directly to more general topics. Scientists must make approximations to deal with large-scale phenomena, even though “there’s an enormous amount of detail going on underneath”. Feynman’s chess example is instructive. The rules of chess are simple, but to play chess well, more holistic factors must be taken into account. While founded in the rules, these factors are based upon experience. Experience leads to pragmatic application.

Although based in subatomic world, solid-state physics has its own questions.

As an example, Feynman goes on to talk about other areas of Physics. “The branches of physics that deal with questions such as why iron (with 26 protons) is magnetic, while copper (with 29) is not, or why one gas is transparent and another one is not, are called ‘solid-state physics,’ or ‘liquid-state physics,’ or ‘honest physics.’ The branch of physics that found these three simple little actions (the easiest part) is called ‘fundamental physics’ – we stole that name in order to make the other physics feel uncomfortable! The interesting problems today – and certainly the most practical problems – are obviously solid-state physics. But someone said there is nothing so practical as a good theory. And the theory of quantum electrodynamics is definitely a good theory.” (QED, p. 114)

Rules provide boundaries: Experience determines pragmatic action.

The rules of the game, whether of chess or of subatomic entities, are important because they provide a framework for exploration. However, experience determines how we apply these rules. The rules determine the boundaries, while experience determines the pragmatic action. In this sense, the subatomic rules provide boundaries for living systems, but experience determines practical behavior. Although physicists have a complete understanding of the interaction of photons, electrons, and protons in a hydrogen atom, this provides no insight as to how to raise a child. While there may be linkages, each matrix, living and material, has its own unique code. This code is frequently revealed through experience, not analysis.

The inherent imprecision of Physics opens the door to other perspectives.

The overall point of this article is to demonstrate that even within the tightly defined field of Physics there is a fudge factor. Although physicists have successfully ignored this imprecision in accounting for the complex behavior of an unbelievable number of primarily material phenomena, the imprecision also opens the door for an equal number of theoretical possibilities. For instance, we speculate that the quanta of info energy interact with the quanta of photons and electrons beneath the uncertainty level, where the computations are too complex for even the greatest computer. This opening lets the Living Algorithm’s information system in.

Link

For more, check out the next article in the stream – Subatomic Loose Ends.

 
