What is empiricism?
In philosophy, empiricism is a view in epistemology that emphasizes the role of empirical evidence in the formation of ideas. Embraced by both social and natural scientists, empiricism holds that knowledge is based on experience, and empirical research, which draws on evidence discovered in experiments and on validated measurement tools, guides the scientific method.
Common sense is based on everyday experience, which gives empiricism wide credibility in society. We all grew up hearing our parents say some version of “we have been through this before, so you should listen to us.” People might find those appeals to “experience” annoying, but experience does shape decision-making: people trust older teachers who have taught thousands of students over young teachers fresh out of college, and they prefer an experienced doctor over a young one to diagnose their illnesses.
To some degree, the A/B testing widely used in the tech industry is also an application of empiricism. A/B testing is a user-experience research methodology consisting of a randomized experiment with two variants, A and B. Companies like Facebook run a large number of A/B tests to understand user engagement with and satisfaction about their products, such as new features. By analyzing the metrics from these tests, companies make shipping decisions. The evidence for delivering a product is the experimental result, which is the fundamental method of empiricism.
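To make that concrete, here is a minimal sketch, in Python, of one common way an A/B test is evaluated: a two-proportion z-test on the conversion rates of the two variants. The numbers are made up for illustration, and this is a generic statistical recipe, not any particular company’s tooling.

```python
from math import sqrt, erf

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: z-statistic and two-sided p-value
    for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - normal_cdf(abs(z)))
    return z, p_value

# Hypothetical data: variant B converts at 5.4% vs. A's 5.0%.
z, p = ab_test(conv_a=500, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # ship B only if p is convincingly small
```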
Rich experience indeed makes judgments more precise. Just as our ancestors had early success predicting the motion of celestial bodies from years of observation, empiricism helps us protect ourselves, build value systems, and handle crises. However, all experience is bounded by environmental and temporal limitations. Can a person be experienced enough to give good advice? Can past experience cover every case? Even in online A/B testing, results only compare variants A and B; what if C, D, or E are better solutions? You can run 10, 100, or 1,000 experiments and compare them, but can you run millions? What if the best solution is the one you never experimented with?
Rationalism vs. empiricism
Rationalism, opposed to empiricism, holds that reality has an intrinsically logical structure, and that empirical proof and physical evidence are unnecessary for ascertaining certain truths. The rationalists argue that such truths exist and that the intellect can grasp them directly. Descartes, the first of the modern rationalists, held that truths attained by reason can be broken down into elements that intuition grasps and that, through a purely deductive process, yield clear truths about reality. In fact, modern science was born with rationalism: scientists devoted themselves to fields like logic, mathematics, physics, and metaphysics to find the laws, and the rules behind the laws.
Newton was the person who won widespread acceptance for the modern concept of a scientific law with his three laws of motion and his law of gravity, which accounted for the orbits of the earth, moon, and planets and explained phenomena such as the tides. The handful of equations he created, and the elaborate mathematical framework we have since derived from them, are still taught today and employed whenever an architect designs a building, an engineer builds a car, or a physicist calculates how to aim a rocket meant to land on the Moon.
If there exist intrinsically logical structures, if there exist laws of nature, how do we find them? Galileo, who uncovered a great many laws, advocated the principle that observation is the basis of science, and he held that the purpose of science is to study the quantitative relationships that exist between physical phenomena. Today, most scientists would say a law of nature is a rule based upon an observed regularity that provides predictions going beyond the immediate situations upon which it is based.
Additionally, most scientists would say that the laws of nature are the mathematical reflection of an external reality that exists independently of the observer who sees it. Nevertheless, here comes the famous goldfish-view problem raised by Stephen Hawking: do we really have reason to believe that an objective reality exists? Goldfish gazing out from a curved water bowl would have a distorted view of reality. A freely moving object that we observe to move in a straight line would be observed by the goldfish to move along a curved path. Still, the goldfish could formulate scientific laws governing the motion of the objects they observe outside the bowl, and those laws would let them make predictions about the objects’ future motion. Do we admit that the goldfish’s view is a valid picture of reality? How do we know that we have the true, undistorted picture of reality and are not living in a bigger water bowl ourselves?
Reality and the uncertainty principle
What is reality? Philosophers from Plato onward have argued over the centuries about the nature of reality. Classical science is based on the belief that there exists a real external world, an objective reality whose properties are definite and independent of the observer who perceives them. Objectivism adds that reality exists independently of consciousness, and that human beings have direct contact with reality through sense perception.
According to classical science, objects exist and have physical properties, such as speed and mass, with well-defined values. However, Newtonian theory was found to be inadequate for describing nature at the atomic and subatomic level. Modern physics gives another perspective, under which a theory-independent reality is difficult to defend.
Quantum physics is a framework for understanding how nature operates on atomic and subatomic scales. It provides a new model of reality, a picture of the universe in which many concepts fundamental to our intuitive understanding of reality no longer have meaning.
For example, in quantum physics a particle does not have both a definite position and a definite velocity until those quantities are measured by an observer; it can have one of them definite, but never both at once.
Another of the main tenets of quantum physics is the uncertainty principle, formulated by Werner Heisenberg in 1926. The uncertainty principle tells us that there are limits to our ability to simultaneously measure certain pairs of quantities, such as the position and the velocity of a particle: the more precisely you measure the speed, the less precisely you can measure the position, and vice versa.
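In standard notation, the principle puts a hard lower bound on the product of the uncertainties in position and momentum (momentum being mass times velocity):

```latex
% Heisenberg's uncertainty principle: the product of the standard
% deviations of position (x) and momentum (p = mv) has a lower bound.
\[
  \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
  \qquad
  \hbar = \frac{h}{2\pi} \approx 1.055 \times 10^{-34}\ \mathrm{J\,s}
\]
```

Because ħ is so tiny, the bound is invisible at everyday scales, which is why Newtonian intuition works for buildings and rockets but fails for atoms.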
On the quantum level, nature does not dictate the outcome of any process or experiment, even in the simplest of situations. Rather, it allows a number of different eventualities, each with a certain likelihood of being realized. To paraphrase Einstein, it is as if God throws the dice before deciding the result of every physical process. Quantum physics might seem to undermine the idea that nature is governed by laws, but that is not the case. Instead, it leads us to accept a new form of determinism: given the state of a system at some time, the laws of nature determine the probabilities of various futures and pasts rather than determining the future and past with certainty. Meanwhile, quantum physics agrees with observation. It has never failed a test, and it has been tested more than any other theory in science.
If futures and pasts cannot be determined with certainty, can observation at least give us the likelihoods of the various futures and pasts?
Is AI a new form of empiricism?
In information theory, there is also a term related to uncertainty. It is called entropy, introduced by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”. The entropy of a random variable is the average level of “information”, “surprise”, or “uncertainty” inherent in the variable’s possible outcomes.
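As a quick illustration (my own sketch, not from Shannon’s paper), here is the entropy of a discrete distribution, H(X) = −Σ p(x) log₂ p(x), computed in Python:

```python
# Shannon entropy of a discrete probability distribution, in bits.
from math import log2

def entropy(probs):
    """Average 'surprise' of a distribution; skips zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit, maximal uncertainty
print(entropy([0.99, 0.01]))  # biased coin: ~0.08 bits, nearly certain
print(entropy([0.25] * 4))    # fair 4-sided die: 2.0 bits
```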
When we think of artificial intelligence, especially machine learning, we notice that machine learning looks like predictive analytics for reducing uncertainty, for reducing entropy. A frequently used loss function, the cross-entropy loss, measures how accurate a model is, that is, how much uncertainty it reduces. In a sentence, what machine learning does is this: given a set of data points x associated with a set of outcomes y, build a model that learns to predict y from x. In fact, a machine-learning algorithm doesn’t give us a determined result; it gives a probability distribution over the data, or a continuous decision boundary. That fits the new form of determinism we learned from quantum physics: predicting the likelihood of all the various futures.
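As a minimal sketch of that idea (the function name and the probabilities are invented for illustration, not any particular framework’s API), the cross-entropy loss for a single example is simply the negative log of the probability the model assigns to the true class:

```python
# Cross-entropy loss for one classified example: the negative
# log-probability the model assigned to the correct class.
from math import log

def cross_entropy(predicted_probs, true_class):
    """Lower is better: 0 when the model puts probability 1 on the
    true class, large when it is confidently wrong."""
    return -log(predicted_probs[true_class])

# Hypothetical 3-class predictions where class 0 is correct:
print(cross_entropy([0.7, 0.2, 0.1], true_class=0))  # ~0.36, fairly sure
print(cross_entropy([0.1, 0.2, 0.7], true_class=0))  # ~2.30, confidently wrong
```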
We also need to recognize one key difference between artificial intelligence systems and other forms of consumer technology: they rely on the ingestion, analysis, and optimization of vast amounts of data. The algorithm improves from samples, through experience. Training datasets are needed to maximize likelihood and find optimal parameters. For example, in natural language processing, computers succeed at text-prediction tasks by calculating the probability of the next word from a large corpus, not from linguistic syntax. Computers, rhetorically speaking, are learning from humans’ writing experience. Isn’t this a new empiricism? A machine-intelligent empiricism?
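As a toy illustration of that point (the tiny corpus and helper names are invented for this sketch, nothing like a production NLP system), a bigram model predicts the next word purely from counted experience, with no syntactic rules at all:

```python
# A bigram next-word model: probabilities come entirely from
# counting which word followed which in the training text.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1          # count each observed continuation

def next_word_probs(word):
    """Empirical probability of each word following `word`."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```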
Uncertainties in life
Everyone lives with a lot of uncertainty. People who survived 2020 couldn’t agree more. In 2020, the global pandemic left the world with newly emerging political, economic, and social problems that no one could foresee. Well, one man came pretty close. Bill Gates, in a 2015 TED Talk titled “The next outbreak? We’re not ready”, said that “if anything kills over 10 million people in the next few decades, it’s most likely to be a highly infectious virus rather than a war. Not missiles, but microbes.” That wasn’t a precise prediction, but it described a possible future, a future that was more likely to happen.
So do we ordinary people have the power to predict our own futures, to deal better with life’s uncertainties?
My answer is yes. Quantum physics tells us that no matter how thorough our observation of the present, the (unobserved) past, like the future, is indefinite and exists only as a spectrum of possibilities. The universe, like a particle, doesn’t have just a single history but every possible history, each with its own probability; our observations of its current state affect its past and select among the different histories of the universe, just as observing the particles in the double-slit experiment affects the particles’ past. Like its pasts, the universe has many possible futures. We live only in the current moment, but it is a moment that leads to various possible future moments, various parallel universes. There is a vast landscape of possible universes, and we get to choose our own.
Resources:
1. Baird, Forrest E.; Kaufmann, Walter (2008). From Plato to Derrida. Upper Saddle River, New Jersey: Pearson Prentice Hall.
2. Young, Scott W. H. (2014). “Improving Library User Experience with A/B Testing: Principles and Process”. Weave: Journal of Library User Experience.
3. “Rationalism vs. Empiricism”. Stanford Encyclopedia of Philosophy. First published August 19, 2004; substantive revision March 31, 2013.
4. Hawking, Stephen; Mlodinow, Leonard (2010). The Grand Design. Bantam Books.
5. Shannon, Claude E. (1948). “A Mathematical Theory of Communication”. Bell System Technical Journal.