
Since the visual spectrum, which we can perceive with our eyes, is one of the crucial prerequisites for photography, I would like to take this opportunity to explain one of the longest-researched phenomena since the dawn of humanity: LIGHT
I have tried to explain it so clearly that I understand it well myself. And if I understand it, then anyone can comprehend it. Don’t give up if you don’t get something after reading it the first time; just read it again. The way in which mankind has tried to explain the phenomenon of light over the millennia is an interesting topic and certainly worth reading about.
Wishing you an interesting journey of discovery.
This essay focuses on the search for the nature of light: is light an emission of particles (photons), or is it a propagating wave motion in a medium (electromagnetic radiation)?


The fact that the behavior of light can be described as electromagnetic radiation at one moment and as photons at another raises questions; light by itself is ordinary light with constant behavioral properties. After so many millennia of speculation and research, do we actually know exactly what light is?
I don’t believe we can know anything with complete certainty, but we can know roughly everything and with very different degrees of probability.
Pierre Perrault, 1673.
1 Early Ideas
Since eyesight is the primary human sense, we can assume that people have wondered and speculated about light for thousands of years, long before there was a method to record those thoughts in writing.
And because of its physical properties, it wasn’t until the 19th century CE that a widely accepted idea about the behavior of light emerged among scientists.
CE = Common Era.
Prehistory
At a certain moment, prehistoric humans understood that in addition to preparing food, fire could be used to dispel darkness. An important development was the manufacture of portable forms of lighting: torches and stone lamps.
Prehistoric humans probably discovered how to make fire themselves during the Middle and Late Pleistocene (240,000-116,000 BCE). Initially, people only lit a large fire in the open air, in a fire pit or, if possible, in a cave. For a long time, this was only done during the day to warm oneself, to cook food or to scare away wild animals. When it was dark, they went to sleep, and the fire was extinguished. Only later was it realized that the day could be extended if the fire was left burning after dusk. From that moment on, open fire was also used as lighting.
Until about 75,000 BCE, a large open fire would remain the only form of lighting. From that moment on, burning branches of wood were carried as lighting in places where there was no fire. Those were the first primitive torches.
Somewhere between about 13,000 and 10,000 BCE, the possibility of making light using oil and grease lamps was discovered. That was a big step forward. The stones that were used as lamps usually had (natural) hollows, in which the combustible material was placed.
From that moment on, people were able to provide themselves with light without splattering sparks, so with considerably less risk of burns and less risk of uncontrolled fire.

Without portable light, a reliable form of illumination, prehistoric humans would not have been able to create cave paintings.

Prehistoric men had different techniques for making images of hands. Sometimes they pressed their pigmented hands against the cave wall. They also made negatives by placing their hand against the wall and blowing the pigments in all kinds of colors through a straw.
One explanation for the origin of religion is related to the expansion of our brains and thus, for example, the development of language, empathy and the ability to produce abstract thoughts. Not only did humans acquire the capacity to invent and manufacture complex tools, they were also endowed with a higher degree of consciousness and understood that most effects have a cause. For inexplicable natural phenomena, gods were invented, and religion gave meaning to life. Thus, gods and religion are a secondary effect of the increase in cognitive capacities.
Most early religions already worshiped a sun god in one form or another, but around 1370 BCE it was the Egyptian pharaoh Akhenaten who introduced the worship of a sun divinity that included the sun’s rays as well. He saw the light of the sun as life-giving and had it clearly depicted as such in the Amarna style of Egyptian art.
Gold was considered the flesh of Aton, and the surprising thing is that in the end they were not far off. Gold has been present on Earth since the planet’s inception, but it was not formed there, because its production requires extremely energetic nuclear reactions. Stars from about 8 times the mass of our sun end their lives in a supernova explosion, and the associated nuclear reactions provide enough energy to produce elements as heavy as gold.
Akhenaten (He who is useful to Aten) was born as Amenhotep IV (Amun is pleased), the youngest son of Amenhotep III. Akhenaten was originally not designated as the successor to the throne. Because of his deviant appearance, Akhenaten was hidden away by his father during his childhood, did not participate in public manifestations and was not assigned titles or other expressions of his origin. It was his mother who, after the death of his father, got him on the throne.
To keep the royal blood pure and family property from being lost, royals in principle married within the family: brothers married their (half) sisters, cousins married (grand)nieces, etc., with physical abnormalities as a result of the inbreeding.

Aton is the sun god depicted as a disk with an uraeus cobra as a sign of his royal status and rays ending in little hands. Some hands hold an ankh sign (☥), but only in relation to the royal family as a sign of their immortality, the key to eternal happiness. It was supposed to protect them from all kinds of dangers. The presentation of an ankh by a divinity to a pharaoh was the symbol for the donation of life energy. The ankh was often held just under the nose, with which the ankh energy was passed on to the recipient through the nasal breath.
While it is going too far to describe the complexities about the gods of Ancient Egypt in this essay, it is important to point out the difference between the gods Aton and Ra and to clear up any possible confusion.
Ra is best known for his role as a sun god who also has a nocturnal appearance in the form of a scarab beetle.
Mythological stories give an impression of an Egyptian supreme god, the main divinity in traditional religion who plays several roles. Ra is often fused with other gods; among others with Atum, Horus and Amun (e.g. Amun-Ra).


Ra is the denotation for the visible sun disk that can only be seen during the day and is depicted as a falcon with a sun disk. In his nocturnal form, Ra is represented as the black scarab beetle: Chepri.
From that moment on Aton is depicted as a sun disk with sun rays in the form of arms that end in small hands. The hands often hold the symbol for life itself: the ankh (☥), which is offered to members of the royal family only.

Representation of the god Aton
This hymn expresses an ecstatic joy over creation. The universe bathes in the divine light of Aton. Man and beast are in worship, and the spectacle of the world is one of richness and joy. Underlying the hymn is clearly a deeply felt sense of religious admiration, with Akhenaten being referred to as the son of Aton.
And more: King Tut
Akhenaten was married to his cousin Nefertiti (The beautiful one has come). Together they were the only chosen ones who could worship Aton directly, thus providing the only medium through which the Egyptian people could worship Aton. After Nefertiti suddenly disappeared from the scene, Akhenaten remarried twice and had a son named Tutankhaten with his second wife, although this is not 100% proven because there are 4 other potential mothers in the picture. DNA testing has, however, confirmed that Akhenaten was his father.
After Akhenaten’s death (perhaps murdered by dissenters), the medium of communication with Aton was no longer available, the former priests regained power and all the gods were restored to their former glory. For political-religious reasons, Tutankhaten changed his name to Tutankhamun. Thus, “Living Image of Aton” was changed to “Living Image of Amun” (Amun is the king of the gods).
A recent inspection of Tutankhamun’s skeleton has revealed that he was 19 years old when he died and was about 1.68 m tall. There was 1 cm difference between the lengths of his legs, and he had a slightly crooked spine. In addition, he had a club foot and was missing a toe on the other foot. The ailments he suffered from were probably the result of a bone disease and his weak immune system, both of which were likely the result of the many generations of inbreeding that had preceded him. He walked with a stick, as he is often portrayed, and 170 sticks were found in his grave. This investigation has shown that Tutankhamun most likely died from the effects of an inflammation in his knee caused by a fracture, combined with the effects of a malaria infection. A less pleasant death, so to speak.

Tutankhamun portrayed with a stick, but without a club foot or crooked spine.
Until the reign of Akhenaten, the pharaohs were represented in sculpture as young male figures who had to show their manhood on the battlefield and with (numerous) concubines. Akhenaten, however, was depicted with a strange and grotesque appearance: a large gourd-shaped head, a very long thin neck, narrow eyes, and bulging lips. His belly is like that of a pregnant woman, with thick thighs and thin lower legs. He is depicted with strange, feminine features. He most likely suffered from Marfan syndrome and did not want to hide it. As the most powerful person in Egypt, of course, he didn’t have to either.
In addition, he allowed himself to be portrayed in domestic circumstances, together with his wife and children, something that had never happened before. The light-hearted Amarna art style was an innovation in painting and sculpture that was later banned as “heretical” art and often deliberately destroyed.


On the left, an image of Akhenaten in the Amarna art style. On the right, a statue of Amenhotep III, the father of Akhenaten, in the traditional style depicting pharaohs as eternally young, resilient and decisive rulers.
Obelisks, in the form of tapering stone columns, were associated with the sun god and most likely represent rays of light.
Of the 29 ancient obelisks that are still standing, Egypt itself can claim 8 of them. Rome owns 12, all stolen from the land of the pharaohs. During the Renaissance, the obelisks were used by the popes as a symbol of power and usually a cross was placed on the obelisk to indicate the superiority of the Christian faith.
Obelisks first appeared in historical records around 2,575 BCE. These monuments are therefore more than 2,000 years older than Christianity, and strangely enough, tourists hardly realize this, given the limited attention obelisks receive during a visit to Rome, when almost everyone is staring only at Christian monuments.

In the middle of St. Peter’s Square - Vatican City stands a 25.5-meter-high Ancient Egyptian obelisk. This obelisk was brought to Rome from Alexandria by Emperor Caligula in 37 CE. The obelisk was removed from the “Circus of Nero” in 1586 CE and placed in the center of the square at the order of Pope Sixtus V. Re-erecting the obelisk required some 900 men and nearly 100 horses. The work took more than a year.
On the first day of creation right after God created the heavens and the earth, God said: “Let there be light,” and there was light. And God saw that the light was good. And God separated the light from the darkness. God called the light, day, and the darkness he called night.
On the fourth day God created the lights in the sky; the sun, moon, and stars to signify seasons, days, and years. The great light (the sun) to rule the day, the small light (the moon) to rule the night.
Thus, according to Judeo-Christian tradition, light is first present and three days later the sun.
This can be read in the book of Genesis, where according to tradition Moses (1,391-1,271 BCE) is mentioned as the author. However, modern insights regard this book as a product of the period 600-500 BCE.

God said: “Let there be light,” and there was light.
The very first Greek documents available on light show the kind of confusion of ideas and blending of supernatural themes one might expect from the dawn of science, the first attempts at rational non-religious explanations of nature.
The first Greek philosophers (circa 530 BCE) regarded daylight (coming from the heavens) and sunlight as two completely different and independent phenomena. Moreover, darkness had its own physical and even material existence, completely separate from the light that could be seen during the day. Initially, darkness was not recognized as the absence of light; its true nature was only established around 415 BCE.
During this period, it was believed that the phases of the moon could be explained by the fact that that celestial body reflects the light of the sun instead of emitting its own light. At the same time, the sunlight was believed to be simply the light coming from the earthly firmament during the day, reflected and focused by the earth. Here at least the sunlight and daylight were connected, but in reverse.

The visible appearances of the moon were explained by the ancient Greeks as light coming from the earthly firmament, reflected by the sun, and then projected onto the moon.
At one point, two opposing theories were developed.
Intromission theory:
Explaining eyesight by something entering the eye.
Around 400 BCE it was claimed that all matter consists of invisibly small, indivisible particles called atoms (atomos, Greek for indivisible). Due to the constant motion of atoms, visible objects continuously emit thin films of atoms from their outer surfaces, and it was these films (which retained the object’s shape and color) that penetrated the eyes to produce the visual perception of that object.
Extramission (or simply: emission) theory:
Explaining eyesight by something leaving the eye, promoted by the Athenian philosopher Plato (427-347 BCE). Optical rays, coming from the eye, are reflected from objects, providing a visual perception of those objects.
Aristotle (384-322 BCE) refined the intromission theory (something that enters the eye). His theory lasted the longest in Western culture because Aristotle painted a complete and integrated picture of the world that included light and vision. Pretty much along the lines of: if one theory of that man is correct, another theory of his will also be correct.
He did not regard light as a substance or even a movement, but as the “presence of fire in what is a transparent object”. What was sent to the eye was not light but color. Color was transmitted through the intermediate transparent medium. The opaque body that was finally seen somehow imprinted its color on the layer of the medium (usually air) touching its surface, and that layer passed it on to the next layer, and so on towards the eye. In this whole process, the function of a luminous body, such as the sun or an oil lamp, was to make the medium transparent. Only the presence of such a luminous body made possible the perception that Aristotle assumed was light and manifested itself as the transparency of the medium: a kind of resonance of this “fire” in the medium with a similarity in the color-emitting object. Aristotle, like us, would say that one cannot see in the dark because there is no light, but in his theory the inability to see stems from the inability of the medium to transmit the colors of objects. From this it can be deduced that Aristotle did not know that sunlight is a composition of all the colors in the visual spectrum.

According to Aristotle’s theory, the shape of an object was transported to the eye in layers by means of an (in this example) orange fire contained in a transparent medium, and then observed. The sunlight, or some other light source, caused the intermediate medium to become transparent so that the eye could perceive the object.
From this time on, intromission theories (something that enters the eye) would split further into particle and wave theories, a differentiation that continues into modern theories of light.
Also around 400 BCE, the behavior of optics and vision was explained on the basis of axioms (unproven but accepted claims from which various theorems are derived by following strict rules of mathematical logic), starting from the extramission theory (something that leaves the eye). People were so convinced of the correctness of the optical axioms that experiments were considered unnecessary. Even with the extramission theory as its basis, this approach was able to correctly predict many laws of perspective and geometric optics (a model of optics that describes light propagation in terms of rays), including the law of reflection.
Around 150 BCE, again based on the extramission theory (something that leaves the eye), more physical theories were elaborated, substantiated with experiments on reflection and refraction (the bending of rays when entering another medium). It was concluded that the visual rays from the eye were of the same nature as the light rays from luminous objects.
Finally, at the very end of the Greek science period, there was a convergence of theories about light and vision.
European Middle Ages
Or in this case the more appropriate name: The Dark Ages.
After the fall of the Roman Empire, the study of science (including that of light) lay dormant in Europe for a long time. This dormancy can be explained as follows: on the one hand, there was an advancing force, science, which helped to improve the understanding of physics. On the other hand, there was an unchanging traditional power, religion, which slowed this progress for fear that religious beliefs and statements would be undermined.
To science, the embodiment of reason, the (Judeo-Christian) religion was reduced to a dubious, naive belief in which light is the nature and character of the supreme being; not light or some kind of light, but the light itself, based in part on the limited (physical) knowledge and beliefs of around 500 BCE, when the book of Genesis was written.
The Inquisition was established in the late Middle Ages. This ecclesiastical court was charged with tracking down “heretics”: people who held views that differed from official Church doctrine and were fiercely contested. The Church felt that it alone was allowed to practice science, and in the long run no one could be sure of his or her life.
In the book Het Vijfde Zegel (The Fifth Seal), one of the most important Dutch authors, Simon Vestdijk, describes right at the beginning of the story the last 24 hours of heretics condemned to the stake by the Inquisition. Repenting at the ultimate moment was no longer seen as credible with such an execution in prospect. The most one could achieve by repenting was to be strangled before being consumed by the flames. Unlike death by flame, a strangled person could still be admitted to heaven. There was also the option of paying for every minute by which the suffering would be shortened, an important contribution to the executioner’s income. In addition, there was an angry mob that wanted to see blood before the execution took place and did everything it could to get it, especially since it often involved high-ranking persons. All in all, enough reason not to end up in such a situation.
This influence, as we shall see, would be felt well into the 17th century, the era when scientists traveled regularly through Europe and the Inquisition was particularly active in France, Spain, Portugal, Italy and England.

A 19th century portrayal of Galileo Galilei who is held accountable for his ideas by the Inquisition.
Arabic Golden Ages
At the time of the Dark Ages in Europe, the Arab world was experiencing the Islamic Golden Ages (786-1258 CE).
Medieval Islamic science served both practical purposes and the development of knowledge. For example, astronomy was useful in determining the direction of prayer, botany had practical applications in agriculture, and geography in making accurate maps. Islamic mathematicians made progress in algebra, geometry and Arabic numerals. Islamic doctors described diseases such as smallpox and measles and the preparation of hundreds of medicines made from medicinal plants and chemical compounds. Islamic physicists studied optics and mechanics as well as astronomy and criticized Aristotle’s views on motion.
Optics developed rapidly during this period. By 800 CE, works had been published on geometric and physical optics, also called wave optics: the branch of optics that studies interference, diffraction, polarization, and other phenomena for which the ray approximation of geometric optics does not hold.
Topics covered included specular reflection and the development of lenses for magnification and improvement of vision.
The Arabic scholars came close to discovering the law of refraction, although this step was not taken.
The first aspherical lenses were produced, which concentrated light without geometric aberrations. An aspherical lens is a lens whose surface profile is not part of a perfect sphere, like the surface of a round ball.
Geometric aberration is the imaging error of a lens, caused by the fact that, with a pure spherical lens shape, parallel rays of light entering at different distances from the optical axis do not converge at the same focal point.
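The aberration described above can be illustrated numerically. The sketch below is a minimal illustration of my own (the function name and the chosen radius and refractive indices are assumptions, not from the original text): it traces rays parallel to the optical axis through a single spherical refracting surface and shows that a ray far from the axis crosses the axis closer to the lens than a near-axis (paraxial) ray, which is exactly the spherical aberration that aspherical lenses avoid.

```python
import math

def axis_crossing(h, R=1.0, n1=1.0, n2=1.5):
    """Where a ray, parallel to the optical axis at height h, crosses the
    axis after refracting at one spherical surface (radius R, vertex at 0)."""
    theta1 = math.asin(h / R)                       # angle of incidence
    theta2 = math.asin(n1 * math.sin(theta1) / n2)  # Snell's law
    deviation = theta1 - theta2                     # ray bends toward the axis
    x_hit = R - math.sqrt(R * R - h * h)            # where the ray meets the surface
    return x_hit + h / math.tan(deviation)

paraxial = axis_crossing(0.001)   # near-axis ray: crosses at ~3.0 (the focal point)
marginal = axis_crossing(0.5)     # outer ray crosses sooner: ~2.8, hence the blur
```

With a perfectly spherical surface the two values differ; only a carefully shaped aspherical profile brings all parallel rays to the same focal point.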

The complex shape of an aspherical lens developed by Arab scientists.

Earliest known correct schematic of the human visual system, Book of Optics, 1011-1021.
Do you recognize the nose?
These theories were highly influential on European scholars of the later Middle Ages, who adopted the Arab theory of vision almost one-to-one.


Reflection and refraction of a light beam
The man who successfully used this tool was Alhazen. He was the first to combine the geometrical optics of the Greeks with the intromission theory (something that enters the eye). Alhazen ignored all but the central ray from each point entering the eye. This theory was not filled in correctly, because in the Arab world no one really knew how the eye bends the incoming rays. Yet Alhazen’s theory represented a major advance over the idea of rays shooting out of the eye (extramission). Probably even more progress could have been made if more had been known about the internal structure of the eye. However, Arab scientists had to rely on misleading ancient Roman diagrams, because the Islamic religion forbade dissecting corpses.
After 1000 CE, all Greek ideas about eyesight were rejected.

The eye according to Arab scholars, 1200 CE.
The major advances in understanding the nature of light would come after the great flourishing of research in Europe known as the Renaissance (1300-1600), in which the Italian physicist, astronomer, mathematician and philosopher Galileo Galilei (1564-1642) completely dismantled Aristotle’s (384-322 BCE) conception of astronomy and mechanics and laid the foundation for modern science. Galileo was one of the first to design and then carry out an experiment. His experiment to measure the speed of light involved two observers, each equipped with a lantern with shutters, at night on mountaintops miles apart. One observer opened his lantern and started timing. When the second observer saw the flash of the first lantern, he opened his. The first observer stopped timing after seeing the light from the second lantern. As one can imagine, the speed of light is far too great to be measured by this rough method.

Inaccurate way of measuring the speed of light by Galileo Galilei.
Willebrord Snel van Royen (a.k.a. Snellius, 1580-1626, of Snellius’ Law) was a Dutch mathematician, physicist, humanist, linguist and astronomer, and may be wrongly credited with discovering the refraction of light at media transitions: 600 years earlier, Arab scientists had already described how curved mirrors and lenses could focus light.
René Descartes (a.k.a. Renatus Cartesius, 1596-1650) was a French philosopher and mathematician who lived much of his life in the Republic of the Seven United Netherlands. As Aristotle had done two thousand years earlier, he tried to construct an all-embracing system that would explain all physical events, except that his system claimed to use only matter and motion theories as an explanation. According to him, the entire space was filled with balls of a material called the “ether” that could transfer forces through direct contact. A luminous body like the sun was caused by a spinning motion in this ether. The outward pressure of this vortex, transmitted by the ether balls pressing against each other, was the phenomenon: light. In this theory, light had an infinite speed; it was transferred from one point to another without delay.

Because of his philosophical beliefs, René Descartes proposed in 1644 that no empty space can exist, and that space must consequently be filled with matter. The parts of this matter tend to move in straight paths, but because they lie close together, they cannot move freely, which according to Descartes implies that every motion is circular, so the ether is filled with vortices.
1. Light as a disturbance, transported through the ether.
2. The association of different colors in the visual spectrum with different periodic, constantly repeating movements of a particular kind.

René Descartes was responsible in Europe for the first detailed analysis of the rainbow. A ray of light (A and F, coming from the top left) penetrates a raindrop (the circle), is reflected internally one or more times and finally exits. When a ray of light moves from air to water, it changes direction according to Snell’s law.
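Descartes’ rainbow geometry can be replayed with a short numerical sketch (an illustration of my own under assumed names, using the modern refractive index of water, n ≈ 1.333; it is not from the original text). A ray entering a drop is refracted, reflected once inside, and refracted again on the way out; scanning all angles of incidence shows that the total deviation has a minimum near 138°, which is why the primary rainbow appears at about 42°.

```python
import math

def deviation(theta_deg, n=1.333):
    """Total deviation (degrees) of a ray through a spherical raindrop:
    refraction in, one internal reflection, refraction out."""
    theta = math.radians(theta_deg)
    r = math.asin(math.sin(theta) / n)   # Snell's law: sin(theta) = n * sin(r)
    return math.degrees(math.pi + 2 * theta - 4 * r)

# Rays bunch together where the deviation is minimal: that is the rainbow.
best = min(range(1, 90), key=deviation)
rainbow_angle = 180 - deviation(best)    # roughly 42 degrees
```

Because blue light has a slightly higher refractive index in water than red light, the minimum shifts a little per color, separating the rainbow into its bands.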
Corpuscular Theory
The classical period of physics began in the late 17th century with the work of Isaac Newton (1642-1727). He described an overarching and all-embracing scientific theory of motion and gravitational force that is still considered valid, apart from very high speeds or very small dimensions. The calculations that send space explorers to the moon or the planets are based on Newtonian mechanics.
Newton accepted Descartes’ idea of an ether for the transmission of forces such as electricity or magnetism, but he could not accept the image of light as a disturbance in the form of a vortex propagating through the ether.
One reason he couldn’t was because the light travels in straight lines. He knew that disturbances in a medium, such as waves in water or sound in air, tend to curve around obstacles, while obstacles to light produce sharp shadows. Newton thought that light was the movement of a substance through the ether, perhaps small particles or bodies, emanating from the luminous object. Therefore, Newton’s name became associated with the corpuscular theory of light. Streams of particles moving quickly and in straight lines out of a luminous object can easily explain the sharp shadows of solid bodies.
Wave Theory
Yet Descartes’ idea that light was a pressure or disturbance (vortex) in the ether continued to fascinate many scientists. The greatest proponent of this view in Newton’s time was the 13 years older Christiaan Huygens (1629-1695). Huygens was a leading Dutch mathematician, physicist and astronomer, inventor and author of early science fiction about intelligent life forms on other planets. Because Huygens wanted to engage in science but did not want a collision with the Church, his science-fiction ideas were only published after his death.
Although the threat had largely disappeared by the end of the 17th century, around 1600 one could still be burned at the stake by the Inquisition for claiming that the Earth was not the center of the universe. Galileo Galilei had to answer twice for heresy before the Inquisition and could count himself lucky that the first time ended with a warning and the second with lifelong house arrest. A potential conflict with the Church was also the reason for Descartes, in 1633, to refrain from publishing earlier work. Fortunately, in 1992, the Pope apologized for the treatment of Galileo, recognizing him as a man of faith who did not have to burn eternally in hell for his pioneering ideas. Lucky Galileo, shall we say.
Johannes Kepler did not escape either, or at least his elderly mother did not. Kepler had discovered that planets did not move in circular orbits, as the Church held, but in elliptical orbits. His widowed mother Katharina was first accused of witchcraft in 1615, and the accusation dragged on for years. Despite her famous son, she was held in various dungeons for 14 months, where two strong men guarded the seventy-three-year-old woman day and night and she was pressed, under threat of severe torture, to declare that she was a witch, which she did not do. She was released in 1621 and died 6 months later. The defendant and her relatives had to pay the costs. It was unusual for a defense attorney to be admitted to a witch trial; according to the protocol, Katharina appeared “unfortunately with the help of her son Johann Kepler, Mathematicus”.
Because Johannes had influential patrons in his network, the Inquisition could not exert any direct influence over him, otherwise he too would have been indicted and the outcome would undoubtedly have been very different.

“Various modes of torment, common in the Inquisition” - copper engraving from the 18th century

Christiaan Huygens’ explanation for the changing shape of Saturn, Systema Saturnium, 1659.
The book Een Eeuw van Licht (An Era of Light, which is unfortunately not yet translated into English), about the life of Christiaan Huygens and written by Hugh Aldersey-Williams, is definitely recommended. Except for the short chapter about music, which didn’t interest me as much.
The proof that convinced Huygens that light was a pressure wave moving through the ether was the fact that two beams of light whose paths intersect are not affected by the intersection but continue on their way. The same result is easily seen with water waves, which can cross and continue in their original directions, while any two particle streams (imagine two crossing water jets) deflect each other when they intersect. It has been suggested that it was quite natural for a Dutch physicist to rely on waves as an explanation, as the Netherlands was a seafaring nation and a country crisscrossed by canals, allowing natives to observe wave phenomena daily from childhood. However, this does injustice to Huygens’ talents. He was “universal”: observer, thinker and maker combined in one. He is one of the Netherlands’ most important but most underestimated scientists, surpassing Isaac Newton in some important respects.



Intersecting rays of light, intersecting water waves and intersecting water jets.

The foundation of Huygens’ wave theory was the principle of wave propagation: every point on a wavefront acts as the source of new elementary wavelets. With it he explained the visible properties of light, such as the rectilinearity of light rays and the laws of reflection and refraction. It was a revolutionary new look at light (1677).
This acceptance was not universal, however: some scientists and philosophers still supported the wave theory. Among these advocates of the wave approach was the American politician, scientist and moralist Benjamin Franklin (1706-1790). Experiments with electricity (the kite experiment during a thunderstorm is legendary), a political revolution (he co-authored the United States Declaration of Independence, 1776) and the United States Constitution (he was one of the Founding Fathers of the United States) were his main claims to authority.

Benjamin Franklin used a kite to catch some electrical charge from a thundercloud, demonstrating that lightning is an electrical phenomenon.
Young also became the first person to convincingly demonstrate interference effects in light experiments, producing darkness with light, effects that could only be explained on the basis of the wave theory. His double-slit experiment used a point source of monochromatic light (light with a single wavelength) and split the light into two parts by passing it through two slits in an opaque screen. The light from the two slits was then projected onto a screen opposite the point source.

We can show that light has wave properties as follows. If light shines on a plate with two thin slits, circular waves form behind both slits and start to interfere with each other. This is called the double-slit experiment. If a screen is placed behind the slits, we see maxima (light areas) and minima (dark areas).

If light had only particle properties, we would have expected the above pattern: light would then be visible in only two places on the screen.
The color perceptions merge gradually with changing wavelength, fading into darkness at both ends due to the limitations of our visual sensor, the eye. In general, the visual spectrum is defined as wavelengths from about 400 nm to 700 nm. These wavelengths are all less than a thousandth of a millimeter, and this shortness can be seen as a major reason why the wave character of light was not discovered earlier.
- 380 to 450 nm violet
- 450 to 490 nm blue
- 490 to 560 nm green
- 560 to 590 nm yellow
- 590 to 630 nm orange
- 630 to 760 nm red
Speed of Light Measurements
In 1675, the Danish astronomer Olaf Roemer (1644-1710) made an astronomical measurement of the speed of light using variations in the eclipse timings of one of Jupiter’s moons. This was the first time a realistic measurement of the speed of light was obtained. In 1727, another astronomical speed determination was made by the English astronomer James Bradley (1692-1762), who used an effect called stellar aberration (deviation). Bradley’s result at least roughly confirmed the previous measurements based on the eclipses of Jupiter’s moon.
In the case of “stellar” or “annual” aberration, the apparent position of a star to an observer on Earth varies periodically over the course of a year as the Earth’s velocity relative to the star changes as it orbits the sun.

Roemer measured the speed of light by timing eclipses of Jupiter’s moon Io. In this figure, S is the sun, E1 is the position of the Earth when it is closest to Jupiter (J1), and E2 is Earth about six months later, on the opposite side of the sun from Jupiter (J2). When Earth is at E2, light from the Jupiter system must travel an additional distance equal to the diameter of Earth’s orbit around the sun. This causes a delay in the moment of the lunar eclipse. Roemer measured that delay and, knowing approximately the diameter of Earth’s orbit, made a first realistic estimate of the speed of light.
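Roemer’s reasoning can be sketched with modern values (the orbit diameter and speed of light below are today’s figures, not his 17th-century data):

```python
# Sketch of Roemer's reasoning with modern values.
# When Earth is on the far side of the sun, eclipse timings of Io arrive late
# by the time light needs to cross the diameter of Earth's orbit.

ORBIT_DIAMETER_M = 2 * 1.496e11   # two astronomical units, in meters
SPEED_OF_LIGHT = 3.0e8            # m/s, the value used in this essay

delay_s = ORBIT_DIAMETER_M / SPEED_OF_LIGHT
print(f"Extra travel time across Earth's orbit: {delay_s:.0f} s "
      f"(about {delay_s / 60:.1f} minutes)")
```

Run in reverse, this is exactly Roemer’s estimate: a measured delay of roughly a quarter of an hour, divided into the orbit diameter, gives the speed of light.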


The first sublunar measurement of the speed of light was recorded in 1849 by Armand Fizeau.
Knowledge of the exact wheel speed (which could not have been achieved without the aid of an electric motor, first developed in 1834) and of the moment the returning beam is blocked by the wheel’s teeth allows one to calculate the time it takes the light to travel 16 km, and thereby the speed of light. Similar experiments in the 19th century established that the speed of light in vacuum, c = 3 * 10⁸ m/s, has almost the same value as in air.
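As a rough sketch of this calculation, run backwards: taking c as known, the 16 km round trip quoted above, and a 720-tooth wheel (Fizeau’s wheel did have 720 teeth), we can work out the wheel speed at which the returning beam is first blocked:

```python
# Fizeau's toothed-wheel idea, sketched with the essay's 16 km round trip.
# The returning beam is blocked when the wheel has advanced by half a tooth
# period during the light's round trip.

DISTANCE_M = 16_000        # round trip quoted in the text
C = 3.0e8                  # m/s
N_TEETH = 720              # Fizeau's wheel had 720 teeth

travel_time = DISTANCE_M / C                     # ~53 microseconds
rotation_rate = 1 / (2 * N_TEETH * travel_time)  # revolutions per second
print(f"round trip: {travel_time * 1e6:.1f} µs, "
      f"wheel speed needed: {rotation_rate:.1f} rev/s")
```

Fizeau did the reverse: he measured the rotation rate at which the spot vanished and solved for c.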
This means a speed of 300,000,000 meters per second, or 300,000 km/s, or 300 million m/s. Given the distance between the Earth and the Moon is 384,401 km, it would take a light wave 1.28 seconds to travel this distance.
Light can travel 1 meter in about 3.3 ns (ns stands for nanosecond, one billionth of a second), so 3.3 * 10⁻⁹ s.
Let these values sink in for a while. As a comparison: sound in air has a speed of 330 m/s and is therefore roughly a million times slower than light. First you see the lightning flash, only later do you hear the thunderbolt.
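These figures are easy to verify for yourself:

```python
C = 3.0e8    # speed of light, m/s
SOUND = 330  # speed of sound in air, m/s

moon_km = 384_401  # Earth-Moon distance
print(f"Earth to Moon: {moon_km * 1000 / C:.2f} s")   # ~1.28 s
print(f"1 meter:       {1 / C * 1e9:.1f} ns")          # ~3.3 ns
print(f"light is ~{C / SOUND:,.0f} times faster than sound")
```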
A crucial measurement in relation to the wave theory of light was that of the speed of light in a relatively dense medium, such as water in a glass. To explain the observed phenomenon of refraction, i.e. the bending of light as it passes from one medium to another, the corpuscular theory had to assume that light accelerated as it entered water from the air, whereas the wave theory had to assume the opposite. In 1850, the French scientist Jean Bernard Léon Foucault (1819-1868, of pendulum fame) succeeded in measuring the speed of light in water and found that it was lower than the value in air. This discovery is considered the final nail in the coffin of the classical particle theory of light of Isaac Newton (1642-1727).
Knowledge of both the speed and the wavelength of light allows a calculation of the light frequency.
The frequency of green light:
𝜆 = wavelength in meters (500 nm = 5 * 10⁻⁷ m for green light)
c = speed in meters/second (3 * 10⁸ m/s for light)
f = frequency in Hertz, cycles/second. Cycles is the number of waves that pass through a fixed point in the wave’s continuing sinusoidal path (usually the zero point).
Here c = 𝜆 * f, thus f = c / 𝜆
f = 3 * 10⁸ / (5 * 10⁻⁷) = 6 * 10¹⁴ Hz = 600,000 GHz = 600 THz. (1 tera = 1,000 giga = 1,000,000 mega = 1,000,000,000 kilo = 1,000,000,000,000.)
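The same conversion works for any wavelength; a small sketch covering the edges of the visual spectrum:

```python
C = 3.0e8  # speed of light, m/s

def frequency_thz(wavelength_nm):
    """f = c / lambda, returned in terahertz."""
    return C / (wavelength_nm * 1e-9) / 1e12

print(f"green  (500 nm): {frequency_thz(500):.0f} THz")  # 600 THz, as above
print(f"violet (400 nm): {frequency_thz(400):.0f} THz")
print(f"red    (700 nm): {frequency_thz(700):.0f} THz")
```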
This example shows what extremely high frequencies are typical of light.
Every second, six hundred trillion waves pass every point in the path of a green light wave, again and again.
Look at the colors around you and realize how many different frequencies are continuously present at such speeds in a lit environment. Our eyes can perceive more than 200 different colors, distinguish between the most detailed nuances, and recognize more than 20 levels of saturation and 500 levels of brightness per color. Almost incomprehensible, right?
Saturation is a measure of the purity of a color. High-saturation colors are called vivid, bright, clear, or deep; low-saturation colors are called muted, faded, or grey. Monochromatic light, light consisting of a single frequency, has a saturation of 100%. Black and white images (grayscale) have a saturation of 0%.
Brightness is a property that is assigned to a color, the brightness increases as the percentage of black in the color decreases.
In an illuminated environment, we process countless color stimuli at any moment, a fairytale world of colors that we often don’t realize because we don’t pay attention to them, since we’ve taught our brains to bring order in this disordered abundance of information and only to draw attention to the most important information.
Electromagnetic Waves
Aside from establishing the wave nature of light, classical physics contributed to the understanding of light with another major breakthrough. This breakthrough was contained in the work of the Scottish physicist James Clerk Maxwell (1831-1879). Maxwell was one of a long line of British scientists who wanted to explain electrical and magnetic effects in terms of the mechanics of that mysterious substance called the ether. In theoretical work on the nature of electric and magnetic forces, Maxwell concluded in 1861 that an electric charge changing speed (read: accelerating) should cause a disturbance in the ether that radiates away from the source at a speed of 3 * 10⁸ m/s. Since the speed Maxwell found was calculated solely from electric and magnetic constants, yet equaled the speed of light already measured, it suggested that light was just such a perturbation. To produce a regular, uniform wave, the electric charge would have to vibrate back and forth.


The first true color photo. Already in his student days, Maxwell showed that all possible colors can be created from the primary colors red, green and blue. In 1861 Maxwell had three black-and-white photos made of a rosette created from a cloth of tartan ribbon. Each photo was taken with a different color filter in front of the lens. The images were then overlaid to create a single composite. The result was a color reproduction of the ribbon that contained all of the original colors of the tartan.

Wavelength Spectrum of electromagnetic radiation of which visible light is just a small part.
Optional to read for those who want to understand the whole picture.
At the beginning of the last century, most scientists were convinced that the fundamental nature of light had been understood. In fact, many scientists thought that the entire physical world could be explained by the theories of that time. The sciences of heat and thermodynamics had been statistically reduced to the mechanics of very small particles (atoms and molecules). Electricity, magnetism and optics were all theoretically combined, and a theoretical reduction of all of these to the mechanics of the ether was confidently expected by many.
One of the discrepancies facing physics around 1900 was the emission of light and other electromagnetic radiation originating from hot solids.
Think of an iron rod that turns orange-red when heated to a high temperature. And if the rod is heated even further, the color changes to blue and white.

Red hot to white hot iron.

Continuous spectrum (solid).

Line spectrum (depending on the gas being heated).
The continuous spectrum emitted by a hot solid is called blackbody radiation, because a perfectly black object (an object that absorbs all the radiation from the visual spectrum that strikes it) is theoretically the easiest to analyze, and most solids emit radiation in a way similar to the way black objects absorb it.
Keep the following in mind:


In the emission pattern of black objects, there is some power at all wavelengths at a given temperature, but not the same power at all wavelengths. We characterize such a spectrum by plotting the radiated power at each wavelength (intensity) at a given temperature versus the wavelength.

When a solid body is heated to 4000 K, that is (4000 − 273.15) °C, mainly a red color is emitted;
at 5000 K, (5000 − 273.15) °C, yellow-green. Here 0 K (kelvin) is equal to −273.15 °C (Celsius), the so-called absolute zero.
Any object with a temperature above absolute zero (0 K, -273.15 °C) exhibits such radiation. But for temperatures commonly found on Earth’s surface, the radiation curve is so far into the infrared (with the peak on the right of the graph) that there is no detectable radiation in the visual region.
At body temperature, everyone radiates energy with wavelengths around 9000 nm. You therefore continuously radiate electromagnetic waves in the infrared range, which is why you are visible in the dark with the help of infrared binoculars.
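This peak can be estimated with Wien’s displacement law, λ_max = b / T, a standard blackbody result not spelled out in the text:

```python
# Wien's displacement law: the peak wavelength of blackbody radiation
# is inversely proportional to the absolute temperature.

WIEN_B = 2.898e-3  # Wien's constant, m·K

def peak_wavelength_nm(temp_k):
    return WIEN_B / temp_k * 1e9

print(f"body temperature (310 K): {peak_wavelength_nm(310):.0f} nm (infrared)")
print(f"red-hot iron (4000 K):    {peak_wavelength_nm(4000):.0f} nm (red)")
print(f"the sun (5800 K):         {peak_wavelength_nm(5800):.0f} nm (visible)")
```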
First Quantum Theory
In 1900, a scientist went off the beaten track to arrive at results that not only correctly predicted the radiant energy of black objects, but also represented the beginnings of a new, non-classical physics known as quantum mechanics. The German physicist Max Planck (1858-1947) examined the radiation curve of black objects to investigate what assumptions would be necessary to derive it correctly. He found that if he abandoned some classical ideas about the emission of electromagnetic energy from the atoms of a solid and instead assumed that they radiate energy in small chunks called quanta, whose size he could derive, his predictions very closely matched the experimental curve at all wavelengths. Each emitted quantum had to have an energy proportional to the frequency of the electromagnetic radiation.
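That proportionality, E = h * f, can be illustrated for the green light used earlier (the electronvolt conversion at the end is an addition for scale, not from the text):

```python
H = 6.62607015e-34  # Planck's constant, Js
C = 3.0e8           # speed of light, m/s

def photon_energy_j(wavelength_nm):
    """E = h * f = h * c / lambda: the energy of one quantum of light."""
    return H * C / (wavelength_nm * 1e-9)

e_green = photon_energy_j(500)
print(f"green photon: {e_green:.2e} J")                   # ~4.0e-19 J
print(f"as electronvolts: {e_green / 1.602e-19:.2f} eV")  # ~2.5 eV
```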
At the time, Planck himself was not happy at all with the consequences of his quantum hypothesis. Until then, classical physics had always sufficed, and no one wanted to deviate from it in favor of uncertain new ideas. Planck and others thought that a method still could be found to explain the radiation of black objects in the classical way and that the quantum hypothesis would then be seen only as a temporary, ad hoc explanation obtained by luck. But quantum theory didn’t fade that way. Instead, it proved increasingly useful for explaining phenomena that had turned classical physics on its head. One such phenomenon was the photoelectric effect.
Photoelectric Effect
An explanation for the photoelectric effect had been a mystery to scientists since it was first discovered in 1887 by the German physicist Heinrich Hertz (1857-1894), the same scientist who first confirmed Maxwell’s prediction of electromagnetic waves and after whom the unit of frequency is named. The effect is basically the release of negative electrical charge from some metals when their surfaces are exposed to high-frequency electromagnetic waves. Some metals show the effect when exposed to visible light, but generally ultraviolet waves are required.
It was later discovered that the charge was released in the form of elemental negative charges called electrons (electrons themselves were not discovered until 1897). One aspect of the photoelectric effect that was confusing was the fact that increasing the energy of the incoming electromagnetic beam, by making it more intense, did not increase the energy of the individual electrons released; instead, it released more electrons. Also puzzling was the fact that increasing the frequency of the incoming radiation did increase the energy of the individual electrons released. And perhaps most confusing was that according to the classical calculation it would take minutes to days for an electron in the metal to store enough energy from the incoming electromagnetic waves to be released from the surface at all, while experiments showed that photoelectric emission took place practically immediately.

Electromagnetic waves strike a metal object and transfer energy to the electrons in the metal. This allows the electrons to escape from the metal.

Not E = mc², but an explanation for the photoelectric effect brought Einstein a Nobel Prize.
Nuclear Atom
Advances in knowledge about the nature of atoms soon presented another dilemma for physicists. In 1911 the English physicist Ernest Rutherford (1871-1937) developed a new model of the atom based on the results obtained from the scattering of 𝛼-particles (subatomic particles produced by radioactive substances) caused by a thin foil of (heavy) gold atoms. The scattering experiments indicated that the positive electric charge and almost the entire mass of an atom is concentrated in a very small central nucleus. The atom’s electrons orbit the nucleus, just as the planets of our solar system revolve around the sun.


The radioactive 𝛼-radiation coming from radium consists of positively charged 𝛼-particles that have a high velocity and collide with a thin foil of gold. The vast majority of particles pass through it with only a slight change in direction. However, a small number of them undergo a major change of direction. Compare the interaction between the positively charged 𝛼-particles and the positively charged nucleus with the repulsion of two like magnetic poles.

Based on the above observation, it was assumed that the positive charge and thus most of the mass of an atom is concentrated in a nucleus and that the electrons move in a thin shell around that nucleus.
However, Rutherford’s model of the atom helped little in understanding the line spectra of gases. According to the classical theory of electromagnetic waves, any electron moving in a circular orbit must continuously emit radiation at the same frequency with which it orbits the nucleus. Such an electron, because it radiates energy, would have to lose energy and spiral into the nucleus, increasing its rotational frequency as it does, resulting in an ever-increasing radiation frequency. Rutherford’s atomic model was therefore not only theoretically unstable, it should also emit radiation of increasing frequency during its very short existence, neither of which is observed.
Bohr’s Atomic Model
Again, it was quantum theory that came to the rescue. In 1913, the young Danish physicist Niels Bohr (1885-1962) extended Rutherford’s atomic model with ideas from quantum theory so that it correctly predicted the spectral lines of hydrogen. Hydrogen is the lightest element and has a relatively simple line spectrum, so it has always been thought to have the most uncomplicated atoms and has therefore always been the first subject of theoretical analysis. Bohr’s theory was essentially classical but integrated with quantum ideas at certain key points.
The principles were:
1. An electron can only move in certain fixed spherical shells (discrete orbits) around the nucleus, each with its own definite energy.
2. An electron does not emit or absorb electromagnetic radiation while moving within these spherical shells, but only when it jumps between the shells.
3. An electron emits or absorbs a quantum of electromagnetic radiation as it jumps from one shell to another.

If an electron jumps to a shell closer to the atomic nucleus, it emits energy equal to the energy of a photon.

According to Bohr's atomic model, the electrons of an atom reside in several shells around the nucleus, which have different energy levels. Each shell can hold a limited number of electrons. The electrons of a stable atom are in the lowest energy shells closest to the nucleus.
Additional evidence came in 1922 from the American physicist Arthur Holly Compton (1892-1962). He had an X-ray beam strike a block of graphite. Part of the beam went straight through; part was scattered at various angles. This scattered radiation had acquired a frequency that was lower the greater the scattering angle. This phenomenon is the Compton effect, named after him.

An incoming high-energy photon, generated in an X-ray tube, collides with an electron and knocks it out of orbit around its nucleus. For this, the incoming photon must have an energy that is much greater than the binding energy of the electron. The photon with the remaining energy after the collision is deflected in a different direction than the direction of incidence and can eventually knock another electron out of its orbit. Since the energy of the photon decreases, there is a corresponding increase in the wavelength towards the visual spectrum. In general, there is a small “redshift”, a kind of Doppler effect but applied to light waves, and scattering of the photons as they pass through the atomic configuration of the material.
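The size of the shift follows from Compton’s standard formula, Δλ = (h / mₑc)(1 − cos θ), which the text describes only qualitatively; a small sketch:

```python
import math

# Compton wavelength shift: larger scattering angle -> larger wavelength
# increase (i.e. lower frequency), as described in the text.

H = 6.62607015e-34    # Planck's constant, Js
M_E = 9.10938356e-31  # electron rest mass, kg (value quoted later in the essay)
C = 3.0e8             # speed of light, m/s

def compton_shift_pm(theta_deg):
    """Wavelength increase in picometers for scattering angle theta."""
    return H / (M_E * C) * (1 - math.cos(math.radians(theta_deg))) * 1e12

for angle in (0, 90, 180):
    print(f"theta = {angle:3d} deg: shift = {compton_shift_pm(angle):.2f} pm")
```

The shift is a few picometers at most, which is why it only shows up clearly for X-rays, whose wavelengths are themselves that small.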
Matter Waves
It was clear that there was a need for a unified theory that could be applied to any quantum problem, as well as a theory that could explain the wave-particle dilemma. Such a unified theory was developed between 1925 and 1930 and was based on a brilliant insight from the French scientist Louis de Broglie (1892-1987). He suggested that since light was considered by classical physics to be waves that sometimes behaved like particles, the reverse might also be true: maybe material particles sometimes behaved like waves? Reasoning by analogy with light, he derived a wavelength for the wave associated with a particle of mass m traveling at speed v.
This wavelength of De Broglie is:
𝜆 = h / (m * v)
Planck’s constant, denoted by h, has been introduced for the relationship between frequency ν and energy E of a light quantum (photon) according to: E = h * ν. The letter ν is also used instead of the frequency quantity f, in optics and quantum mechanics, for example.
In other words, Planck’s constant divided by the momentum of a particle gives the wavelength of the corresponding wave. Planck’s constant is a physical constant that occurs in all equations of quantum mechanics. In 2019, the constant was adjusted and fixed at an exact value, from:
h = (6.626 070 040 ± 0.000 000 081) * 10⁻³⁴ Js to 6.626 070 15 * 10⁻³⁴ Js.
An adjustment of a constant on this scale (smaller than 10⁻⁴⁰ Js) says something about the accuracy with which the results of experiments can be analyzed nowadays.
The joule-second (Js) is the unit of action (or angular momentum) in which Planck’s constant is expressed.
Because Planck’s constant is so small, relatively large, low-velocity objects, such as a thrown rock, have immeasurably small wavelengths that cannot be detected. But electrons, traveling at great speeds, have a small enough momentum to produce a detectable wavelength.
The speed of an electron in an atom is on the order of two thousand kilometers per second. That’s less than 1% of the speed of light. And because its shell radius is so small, in one second the electron orbits a whopping seven quadrillion (7 * 10¹⁵ or 7,000,000,000,000,000) times around the nucleus of the atom.
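Plugging numbers into λ = h / (m * v) shows why the wavelength of everyday objects escapes detection while that of an electron does not (the rock’s mass and speed below are arbitrary illustrative values):

```python
H = 6.62607015e-34  # Planck's constant, Js

def de_broglie_m(mass_kg, speed_ms):
    """De Broglie wavelength: lambda = h / (m * v)."""
    return H / (mass_kg * speed_ms)

# A thrown rock: 0.5 kg at 10 m/s -> a hopelessly undetectable wavelength.
print(f"rock:     {de_broglie_m(0.5, 10):.2e} m")

# An electron at the two thousand km/s mentioned above -> atomic-scale
# wavelength, comparable to the spacing of atoms in a crystal.
M_E = 9.10938356e-31  # electron rest mass, kg
print(f"electron: {de_broglie_m(M_E, 2.0e6):.2e} m")
```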
By 1930, the unified quantum theory, called wave mechanics, based on De Broglie’s “matter waves” had been completed.
Louis de Broglie’s contribution did not end the wave-particle duality but expanded it further to include material particles. Yet there is a new similarity to be discovered in this expansion: light is not that different from other matter. In modern theories, the main difference between electrons and photons is that the former have a rest mass that is not zero (that is, a mass when they are not moving). The rest mass of the electron is 9.10938356 * 10⁻³¹ kg, while photons have a rest mass of zero, which also means that photons at rest do not exist. If present at all, they always move at the speed of light (300,000 km/s).
4 Latest Developments
Duality of Light Visualized
In 2015 the Italian researcher Fabrizio Carbone developed an experiment showing that the paradoxical nature of quantum mechanics can be recorded: the duality of light can be observed simultaneously. Being able to record and control quantum mechanical phenomena at the nanometer scale opens a new route to quantum computing and other fundamental sciences. Quantum communication has since shown that one hundred percent secure communication connections can be achieved.
The experiment: a laser pulse is fired at a piece of metal nanowire, which adds energy to the charged particles in the nanowire, causing them to vibrate. Light travels along this wire in two directions. When the opposite-travelling waves, reflected at the ends of the wire, meet, they form a new wave that looks like a standing wave. This standing wave functions as the starting point for the experiment. At the same time, a stream of electrons is fired close to the nanowire. The standing light wave is imaged with those electrons: when the electrons come close to the light that is “trapped” around the nanowire, they move slower or faster. Using an ultra-fast electron microscope, the position where this speed change takes place was recorded, representing the standing wave. In addition to imaging the standing wave, the particle character of light could also be demonstrated in this way. The electrons that flew close to the beam of light “hit” the photons, the light particles. Because they touch each other, the speed also changes. This change in speed consists of the exchange of “energy packets”, the quanta, between the electrons and photons. The existence of these energy packets shows that the light on the nanowire behaves like particles.

The result of an experiment showing the duality of light. Energy-space photography of light, trapped in a nanowire, shows both spatial interference and energy quantization at the same time.
Quantum Computers
If you can reproduce the following, you will have enough knowledge about this subject for the coming years:
Quantum computers solve many problems exponentially faster and with less energy consumption than classical, or binary, computers. To understand why, imagine a two-dimensional maze.
A classical computer needs to run one path after the other until it finds the way out of the maze. If the maze comprises 256 possible paths, the classical computer must run through the maze about 128 consecutive times (on average, half of a maze’s paths must be tried to find the right one).
A quantum computer, however, can work with all 256 paths at once and find the answer in one computing run.
To put it a bit differently, an 8-bit classical computer can represent only a single number from 0 to 255, but an 8-qubit quantum computer can represent every number from 0 to 255 simultaneously.
How is that possible? The answer is based on fundamental laws of quantum mechanics: while a classical-computing binary unit, or bit, can hold a value of either 0 or 1, a qubit (short for quantum bit) can represent 0 or 1, or it can hold both values at the same time.
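The 8-qubit claim can be made concrete with a loose sketch: the state of 8 qubits is a vector of 2⁸ = 256 complex amplitudes, and a uniform superposition gives every number from 0 to 255 the same probability, all “held” at once.

```python
# A sketch of an 8-qubit state: 2**8 = 256 amplitudes. In a uniform
# superposition, each basis state |k> (k = 0..255) has amplitude 1/16,
# so each of the 256 numbers has probability (1/16)**2 = 1/256.

n_qubits = 8
n_states = 2 ** n_qubits           # 256
amplitude = (1 / n_states) ** 0.5  # 1/16 for each basis state

state = [amplitude] * n_states     # the full state vector
probabilities = [a * a for a in state]

print(f"{n_states} basis states, each with probability {probabilities[0]:.4f}")
print(f"probabilities sum to {sum(probabilities):.4f}")
```

A measurement collapses this vector to a single number; the art of quantum algorithms is to make the amplitudes interfere so that the right answer ends up with high probability.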
The seemingly inevitable fact is that light behaves like waves in its propagation through space and as particles in its interaction with matter. Light has both wave and particle properties, and the more an experiment reveals one aspect, the less it reveals the other.
Because when we say that something is made up of waves, we are saying that it behaves like a familiar motion that we have observed before, such as that of moving water waves or standing waves in a tense guitar string.
Likewise, when we say that something is made up of particles, we are actually saying that it behaves like a kicked soccer ball, falling raindrops, or fired bullets. Waves and particles are concepts derived from the world we can touch and perceive and there is no compelling logical reason why light should behave differently from all other phenomena of the macroscopic world. Or does light behave according to a concept that we have yet to discover?
Light is what it is, and if we can model it with just two concepts from the world known to us, perhaps we should count ourselves lucky. But questions will certainly continue to arise, because for an average intellect it is very difficult to understand something that seems so trivial.
Final Conclusion
In fact, we are not able to describe the appearances of the optical phenomenon we have called phō̂s (Greek for light) as an observable event with a single concept: one specific item from the collection of ideas, explanations, or theories used in analyzing the world that we can touch and perceive, by comparing it with an already known phenomenon. In science, a theory is usually a tested model for explaining observations of reality.
Keep in mind that light is simply light and behaves harmoniously as such.
Once again:
It’s good that this shortcoming doesn’t stop us from taking “beautiful” photos of objects that reflect the light.
BRgds,
Michel