Speed of Light
Excerpt from Optical Engineering Fundamentals, Second Edition (SPIE Press)
Over the years the history of optics has been tied inextricably to the quest to determine the velocity of light within various media. Initially, light was thought to travel with infinite speed. As early as the eleventh century it was suggested that light travels at a finite speed, but one much too fast to be measured by ordinary methods. In 1675 the Danish astronomer Olaf Roemer (1644-1710) made the first scientific determination of the speed of light, based on observations of the eclipses of the innermost moon of Jupiter. Roemer noted a significant difference in the timing of these eclipses, depending on the relative positions of the Sun, Earth, and Jupiter when the observations were made. When the earth was nearest to Jupiter, the eclipses occurred several minutes ahead of the predicted time; when the earth was farthest from Jupiter, they occurred several minutes later than predicted. While there is no record that Roemer actually performed the final calculation, his data lead to the conclusion that light travels at a speed of about 200,000 km/s. Contemporaries of Roemer refined his findings, incorporating more accurate data on the earth's orbital radius, and arrived at a value close to 300,000 km/s.
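A Roemer-style estimate reduces to dividing the diameter of the earth's orbit by the accumulated timing delay. The figures below are illustrative assumptions for this sketch (a 22-minute delay is often quoted for observations of that era), not values taken from this text:

```python
# Roemer-style estimate: the eclipse-timing shift corresponds to the extra
# time light needs to cross the diameter of the earth's orbit.
# Both numbers below are illustrative assumptions, not Roemer's own figures.
orbit_diameter_km = 3.0e8   # rough diameter of the earth's orbit
delay_s = 22 * 60           # ~22-minute accumulated delay

c_estimate = orbit_diameter_km / delay_s
print(f"{c_estimate:,.0f} km/s")  # roughly 227,000 km/s
```

With the less accurate orbital data available in Roemer's time, the same arithmetic yields values nearer the 200,000 km/s mentioned above.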
Approximately 50 years later, in 1728, the noted British astronomer James Bradley (1693-1762) made an entirely different type of astronomical observation from which he was able to calculate the speed of light. He observed a star through a telescope whose axis was set perpendicular to the plane of the earth's orbit. He found that, to compensate for the earth's motion relative to the incoming light, the telescope's axis had to be tilted through a small angle in the direction in which the earth was traveling. The amount of tilt required allowed Bradley to calculate the speed of light, which he found to be 301,000 km/s.
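The tilt Bradley measured is the aberration angle, which satisfies tan θ = v/c, where v is the earth's orbital speed. The values below are modern figures used as assumptions for this sketch, not Bradley's own data:

```python
import math

# Stellar aberration: the telescope tilt angle satisfies tan(theta) = v / c,
# so c = v / tan(theta).  Modern values are assumed below.
v_earth_km_s = 29.8                      # earth's orbital speed, km/s
theta_rad = math.radians(20.5 / 3600.0)  # aberration angle, ~20.5 arcseconds

c_estimate = v_earth_km_s / math.tan(theta_rad)
print(f"{c_estimate:,.0f} km/s")  # close to 300,000 km/s
```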
The first terrestrial measurement of the speed of light was made by the French scientist Armand Fizeau (1819-1896) in 1849. Fizeau's experiment is illustrated in Fig. 2.3. A light source was focused through a beamsplitter onto an image plane where a spinning toothed wheel was located. The light passing between the teeth of the wheel was projected to a mirror at a distance of about 8 km, where it was collected and reflected back to the point of origin. The rotational speed of the wheel was then increased until the returning light was blocked by the tooth just adjacent to the gap through which it had passed. From these data Fizeau calculated the speed of light; limited by the precision of his measurements, he obtained a value of 315,000 km/s. Fizeau's experiment was later modified by the French physicist Jean Léon Foucault (1819-1868), who replaced the toothed wheel with a rotating mirror. With this new arrangement Foucault determined the speed of light to be 298,000 km/s, much closer to today's accepted value. Foucault was also able to insert a tube filled with water between the rotating mirror and the distant mirror, and to determine conclusively that light travels more slowly through water than through air. This conclusion went a long way toward disproving the corpuscular theory, which held that the speed of light in water would be greater than in air. The Foucault method was further improved by many, the most precise measurements being made by Albert A. Michelson (1852-1931). The average of a large number of measurements made by Michelson was 299,774 km/s. Many aspects of modern technology have since been applied to the determination of the speed of light, yielding a currently accepted value of 299,793 km/s.
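The blocking condition in Fizeau's experiment can be turned into a formula: the return beam vanishes when the round-trip time 2d/c equals the time for the wheel to advance by half a tooth period, 1/(2Nf), giving c = 4dNf. The parameters below are commonly quoted figures for Fizeau's apparatus, included here as assumptions rather than values from this text:

```python
# Fizeau's toothed-wheel geometry: the return beam is blocked when the
# round-trip time 2*d/c equals the time for the wheel to advance by half
# a tooth period, 1/(2*N*f).  Solving for c gives c = 4*d*N*f.
# The parameters below are commonly quoted figures, assumed for this sketch.
d_m = 8633.0      # one-way distance to the far mirror, metres
n_teeth = 720     # number of teeth on the wheel
f_rev_s = 12.6    # rotation rate at which the return beam first vanished

c_m_s = 4 * d_m * n_teeth * f_rev_s
print(f"{c_m_s / 1000:,.0f} km/s")  # about 313,000 km/s
```

The small spread between this figure and Fizeau's published 315,000 km/s reflects the measurement precision noted above.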
Finally, it is interesting to note that electromagnetic theory allows the velocity of electromagnetic waves in free space to be predicted, with a resulting value of 299,979 km/s, which is within 0.1% of the most precise measured values.
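The electromagnetic prediction follows from Maxwell's theory, which gives the free-space wave speed as c = 1/√(μ₀ε₀). Evaluating this with modern values of the constants (an assumption of this sketch; the constants available historically were less precise):

```python
import math

# Maxwell's prediction: electromagnetic waves in free space travel at
# c = 1 / sqrt(mu_0 * epsilon_0).  Modern constant values are used here.
mu_0 = 4e-7 * math.pi          # vacuum permeability, H/m
epsilon_0 = 8.8541878128e-12   # vacuum permittivity, F/m

c_m_s = 1.0 / math.sqrt(mu_0 * epsilon_0)
print(f"{c_m_s / 1000:,.0f} km/s")  # about 299,792 km/s
```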