Continued from Part II.
In the last part of this post, I gave an introductory idea of what doubt or uncertainty means in the sciences. I then proceeded with a selective survey of fields of knowledge, organizing them in levels from the areas with the most doubt in their conclusions upward. Briefly, I covered the parasciences, the social sciences, the behavioral sciences, and the biological sciences. In this post I intend to end my survey with the physical sciences and mathematics.
Given the depth and breadth of this topic, this overview is very general and needs clarification. There is also a need to be more articulate about how this survey relates to the initial idea (that certain knowledge comes only from the Divine). Hence I have decided to end this post with a final section, Part IV, to be presented as a concluding synopsis.
Now let us continue with the survey.
Physical sciences
I have selected two fields for my survey: physics and meteorology. Being concerned with inanimate things, one might assume that such sciences should be free of doubt. However, uncertainty pervades even the very basic task of measuring physical quantities such as length and air pressure, an issue which is even stronger in the case of the nonphysical sciences, particularly the sociopsychological domains.
Am I measuring it accurately?
A lot depends on the answer to this question for a scientist. Measurements are the building blocks on which the high-rise of scientific progress is constructed: observations taken, experiments planned, hypotheses tested, ultimately paving the way for theories. For practical purposes, an inaccuracy of measurement of 0.1% or less will not make a difference in everyday life, but it will in research — which, in the physical sciences today, ranges from the nanoscale of subatomic particles such as electrons and protons, through global variables such as hurricane velocity and the density of the ozone layer, all the way to observations of astronomical proportions such as the displacement of stars and the velocity of a meteor approaching earth.
The following examples give an idea of how relative our standards of measurement are (even for such common everyday quantities as length and weight), and why constant research is required to keep them as accurate and uniform over time as possible:

“The international prototype of the kilogram, an artefact made of platinum-iridium, is kept at the BIPM↓1 under the conditions specified by the 1st CGPM in 1889 … It follows that the mass of the international prototype of the kilogram is always 1 kilogram exactly, m(K) = 1 kg. However, due to the inevitable accumulation of contaminants on surfaces, the international prototype is subject to reversible surface contamination that approaches 1 µg per year in mass. For this reason, the CIPM declared that, pending further research, the reference mass of the international prototype is that immediately after cleaning and washing by a specified method.”

“From 1889 to 1960, the metre was defined to be the distance between two scratches in a platinum-iridium bar kept in the vault beside the Standard Kilogram at the International Bureau of Weights and Measures near Paris.
This replaced an earlier definition as one ten-millionth (10⁻⁷) of the distance between the North Pole and the Equator along a meridian through Paris; unfortunately, this had been based on an inexact value of the circumference of the Earth.
From 1960 to 1984 it was defined to be 1,650,763.73 wavelengths of the orange-red line of krypton-86 propagating in a vacuum.
It is now defined as the length of the path traveled by light in a vacuum in the time interval of 1/299,792,458 of a second.” (From the online Hyper Dictionary)
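Incidentally, the last definition above can be sanity-checked as simple, exact arithmetic: the speed of light is now fixed by definition, so the metre follows from it rather than the other way around. A minimal sketch (using exact fractions to avoid any rounding):

```python
from fractions import Fraction

C = 299_792_458                 # speed of light in m/s (exact by definition)
t = Fraction(1, C)              # the defining time interval, in seconds
distance = C * t                # metres travelled by light in that interval
print(distance)                 # 1 (exactly one metre)
```

The point of the exercise is that the uncertainty has not vanished; it has merely been moved into the realization of the second and of the measurement apparatus.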

“A measurement process such as this one (i.e. measuring the metre through the platinum-iridium prototype), which runs for decades, should have dimensionally stable control standards, but completely stable materials are rare and perhaps nonexistent.”↓2 [Emphasis mine.]
This relativity of measurement exists because, in order to establish a standard unit for a quantity (for instance, 1 kg, 1 L, or 1 sec), we must rely either on certain other substances (such as the platinum-iridium rod) which are subject to change over time (through erosion, or loss of mass through radiation), or on other quantities (such as the speed of light) which are in turn subject to uncertainty of measurement.
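How such uncertainties in basic standards flow into derived quantities can be sketched in a few lines of code. This is only an illustrative back-of-envelope calculation, not official metrology (the drift figure and the 0.1% figure are assumptions for the example): independent relative uncertainties combine in quadrature, and the largest one dominates the result.

```python
import math

def combined_relative_uncertainty(*rel_uncertainties):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(u ** 2 for u in rel_uncertainties))

# Hypothetical example: density = mass / volume.
# Suppose the mass standard drifts by ~1 µg per kg (1e-9 relative) while
# the volume is measured to 0.1% (1e-3 relative). The volume term
# completely dominates the uncertainty of the derived density.
u_mass = 1e-9
u_volume = 1e-3
u_density = combined_relative_uncertainty(u_mass, u_volume)
print(f"relative uncertainty in density: {u_density:.6e}")
```

Notice that even a perfectly stable kilogram would barely help here: the weakest measurement in the chain sets the quality of the final number.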
Of course, a special twist is added to the scenario by the famous special theory of relativity, which showed (and later experiments confirmed) that the measurement of quantities such as time, length and mass depends upon the frame of reference of the observer.
The vagaries of predicting weather
A recent series of catastrophic weather events in US history has prompted me to include this area of science. Just this past weekend, those of us living in the western side of the world were harassed by the certain-looking possibility of horrific and massive damage from Hurricane Irene. Its size alone was one-third the length of the entire east coast of the United States, and it was expected to sweep that entire coastline and even continue up into Canada. It did, but it turned out to be much slower and certainly less horrifying than predicted; even so, it caused huge damage in certain areas, often through features that the forecasters had highlighted to a lesser degree.
Weather forecasting has been traditionally performed using numerical weather prediction. As wikipedia explains, even in the modern day:
Manipulating the vast datasets and performing the complex calculations necessary to modern numerical weather prediction requires some of the most powerful supercomputers in the world. Even with the increasing power of supercomputers, the forecast skill of numerical weather models extends to about only six days. Factors affecting the accuracy of numerical predictions include the density and quality of observations used as input to the forecasts, along with deficiencies in the numerical models themselves.
The problems are aggravated by the fact that even small differences in observing or predicting conditions on a given day can lead to widely different forecasts for the days that follow. This is because weather systems are so massive and time-dependent in nature that they exhibit a strongly sensitive dependence on initial conditions↓3. That is, even short-term future weather scenarios can vary hugely depending upon the set of currently existing conditions. As a result, meteorologists have switched to a different form of forecasting called ensemble forecasting, which generates multiple predictions from slightly perturbed versions of the same initial data. Each prediction is made along with an estimate of its accuracy (or rather, of the uncertainty in the accuracy of the prediction), but even that does not ensure freedom from error: “Sometimes the atmosphere behaves more chaotically, and small errors amplify rapidly. At other times the various forecasts stay within a narrow range, therefore they can be treated with more confidence”↓4.
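This sensitive dependence is easy to demonstrate with the Lorenz system, the simplified convection model in which this "butterfly effect" was first studied. The sketch below is only an illustration (a crude fixed-step Euler integration, not a forecasting method): two runs start from initial conditions differing by one part in a hundred million, and after a while they have drifted far apart.

```python
# Two nearly identical starting states of the Lorenz system,
# integrated forward with a simple Euler scheme.
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def trajectory(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = trajectory((1.0, 1.0, 1.0), 6000)
b = trajectory((1.0 + 1e-8, 1.0, 1.0), 6000)   # perturbed by 1e-8
separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(separation)  # vastly larger than the initial 1e-8 difference
```

Ensemble forecasting turns this vice into a diagnostic: run many such perturbed trajectories, and the spread among them tells you how much the forecast can be trusted on that particular day.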
Uncertainty at the very core of physics
I have mentioned this type of uncertainty in a biographical post before. It is Heisenberg's uncertainty principle I am talking about. I quote directly from the webpage I have linked the principle to:
“… the position and the velocity of an object cannot both be measured exactly, at the same time, even in theory. The very concepts of exact position and exact velocity together, in fact, have no meaning in nature… Ordinary experience provides no clue of this principle. It is easy to measure both the position and the velocity of, say, an automobile, because the uncertainties implied by this principle for ordinary objects are too small to be observed…”
And:
Any attempt to measure precisely the velocity of a subatomic particle, such as an electron, will knock it about in an unpredictable way, so that a simultaneous measurement of its position has no validity. This result has nothing to do with inadequacies in the measuring instruments, the technique, or the observer; it arises out of the intimate connection in nature between particles and waves in the realm of subatomic dimensions.
Every particle has a wave associated with it; each particle actually exhibits wavelike behavior. The particle is most likely to be found in those places where the undulations of the wave are greatest, or most intense. The more intense the undulations of the associated wave become, however, the more ill-defined becomes the wavelength, which in turn determines the momentum of the particle. So a strictly localized wave has an indeterminate wavelength; its associated particle, while having a definite position, has no certain velocity. A particle wave having a well-defined wavelength, on the other hand, is spread out; the associated particle, while having a rather precise velocity, may be almost anywhere. A quite accurate measurement of one observable involves a relatively large uncertainty in the measurement of the other.
In other words (from my own post linked above):
What this law in physics boils down to is that the position of one of the smallest particles we are made of cannot be firmly located if we are trying to measure its velocity at the same time. In layman's terms, an electron assumes its position when it is being observed. And when it's not being observed, no one can predict its location. Rather, it's in a state of ‘suspension’; it's literally not there. However, as soon as it is observed, it (magically!) assumes a specific position in time and place. Wow!
In a way, this uncertainty is of the same kind that taints all of science; Heisenberg simply states it more outright, with a sense of finality about the impossibility of measuring the two quantities at once that is rare in other sciences: as soon as we perfectly control one variable for an experiment or measurement, we totally lose our grip on the other.
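The scale difference the quote describes can be put into rough numbers. Taking the textbook form of the principle, Δx·Δp ≥ ħ/2, and assuming for illustration that we pin down each object's velocity to within 1 m/s, the minimum position uncertainty is huge on atomic scales for an electron and utterly negligible for a car:

```python
# Back-of-envelope illustration of why the uncertainty principle is
# invisible for everyday objects: delta_x >= hbar / (2 * m * delta_v).
HBAR = 1.054_571_817e-34  # reduced Planck constant, in J*s

def min_position_uncertainty(mass_kg, velocity_uncertainty_ms):
    return HBAR / (2 * mass_kg * velocity_uncertainty_ms)

electron = min_position_uncertainty(9.109e-31, 1.0)   # electron, dv = 1 m/s
car = min_position_uncertainty(1500.0, 1.0)           # 1500 kg car, dv = 1 m/s
print(f"electron: {electron:.3e} m")  # roughly 6e-5 m: tens of micrometres
print(f"car:      {car:.3e} m")       # roughly 4e-38 m: utterly negligible
```

Tens of micrometres is enormous compared to the size of an atom, which is why the principle dominates the subatomic world while leaving automobiles comfortably measurable.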
Heisenberg's uncertainty principle lies at the core of quantum physics (the theory that assumes an atom behaves like a wave and a particle at the same time). Interestingly, this theoretical arena challenges Einstein's theory of general relativity, which helps explain all the observable phenomena at the astronomical scale. Conversely, quantum physics today explains everything known about the subatomic level of things (the particles and subparticles in an atom). And yet, when considered together, both theories cannot be right at the same time!↓5
This puzzle, although interesting and worth considering, is beyond the scope of the current post, however.
Mathematics
The queen of sciences, mathematics is certainly not free of uncertainty or doubt, despite being considered the most ‘tight’ and ‘rigorous’ of all fields of knowledge.
Given the current specialization and sophistication of its subfields, the trend these days is that published mathematical proofs of theorems are so long and complicated that it may take a great amount of reviewer time to confirm or refute them. The result is that publishers and critics have begun talking in terms of the percent certainty of a proof being correct↓6.
The more interesting seed of doubt in the domain of maths, however, is the idea that in any consistent formal system set out to represent all mathematical statements (such as a computer program), there will be at least one true statement that is unprovable within the system. This is Gödel's incompleteness theorem, which I attempted to present in my review of the book Gödel, Escher, Bach: An Eternal Golden Braid. According to Wikipedia, the theorem unsettles the foundations of mathematics, those subfields of the queen of sciences that search for the ultimate basis on which mathematical statements can be called true. Gödel essentially proved that there are limits to what we can expect from a tightly logical system.
Thus not even the apparently most rigorous of sciences is free of uncertainty and doubt.
____________________________________________________________________________
A synopsis tying the various threads of this prolonged post as well as presenting certain important clarifications will be presented in the fourth part of this post.
Notes
1. The BIPM is the International Bureau of Weights and Measures, responsible for maintaining the current universally adopted system of measurement units (such as the kilogram and the metre). The above passage has been taken from their website at the following page: http://www.bipm.org/en/si/si_brochure/chapter2/21/kilogram.html
2. In a research paper found at:
http://ts.nist.gov/MeasurementServices/Calibrations/upload/4998.pdf
3. In http://en.wikipedia.org/wiki/Ensemble_forecasting
4. In http://www-das.uwyo.edu/~geerts/cwx/notes/chap13/ensemble.html
5. See: http://www.nytimes.com/books/first/g/greeneuniverse.html
6. See: http://seedmagazine.com/content/article/mathematical_uncertainty/