Because, according to the general theory, the speed of a light wave depends on the strength of the gravitational potential along its path, these time delays should thereby be increased by almost 2×10⁻⁴ sec when the radar pulses pass near the Sun. Such a change, equivalent to 60 km in distance, could now be measured over the required path length to within about 5 to 10% with presently obtainable equipment.
Throughout this article discussing the time delay, Shapiro uses c as the speed of light and calculates the time delay for the passage of light waves or rays over a finite coordinate distance according to a Schwarzschild solution of the Einstein field equations.
The time delay effect was first predicted in 1964 by Irwin Shapiro. Shapiro proposed an observational test of his prediction: bounce radar beams off the surfaces of Venus and Mercury and measure the round-trip travel time. When the Earth, Sun, and Venus are most favorably aligned, Shapiro showed that the expected time delay of a radar signal traveling from the Earth to Venus and back, due to the presence of the Sun, would be about 200 microseconds, well within the capabilities of 1960s-era technology.
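The quoted 200-microsecond figure can be reproduced with the standard logarithmic approximation for the round-trip delay at superior conjunction. The sketch below is illustrative only, not Shapiro's original Schwarzschild-coordinate calculation; the constants and the grazing impact parameter are assumed round values.

```python
import math

# Assumed illustrative constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
C = 2.998e8          # speed of light, m/s
R_EARTH = 1.496e11   # Earth-Sun distance, m
R_VENUS = 1.082e11   # Venus-Sun distance, m
B_GRAZE = 6.957e8    # impact parameter ~ one solar radius, m

def shapiro_round_trip(r_e, r_p, b):
    """Round-trip excess delay: (4GM/c^3) * ln(4 r_e r_p / b^2)."""
    return 4 * G * M_SUN / C**3 * math.log(4 * r_e * r_p / b**2)

delay = shapiro_round_trip(R_EARTH, R_VENUS, B_GRAZE)
print(f"{delay * 1e6:.0f} microseconds")  # on the order of 200 microseconds
```

For a ray grazing the solar limb, this evaluates to roughly 2×10⁻⁴ s, consistent with the figure Shapiro quoted.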
The first tests, performed in 1966 and 1967 using the MIT Haystack radar antenna, were successful, matching the predicted time delay. The experiments have been repeated many times since then, with increasing accuracy.
In a nearly static gravitational field of moderate strength (such as that of a star or planet, but not that of a black hole or a close binary of neutron stars), the effect may be considered a special case of gravitational time dilation. The measured elapsed time of a light signal in a gravitational field is longer than it would be without the field, and for moderate-strength, nearly static fields the difference is directly proportional to the classical gravitational potential, precisely as given by standard gravitational time dilation formulas.
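This "time dilation" view can be checked numerically: integrating the extra coordinate time 2|Φ|/c³ per path element, with Φ = −GM/r the classical potential, along a straight ray past the Sun recovers the one-way logarithmic closed form. The sketch below is a hedged illustration under assumed geometry and constants, not a reproduction of the actual experimental analysis.

```python
import math

# Assumed illustrative constants and geometry (SI units)
G, M, C = 6.674e-11, 1.989e30, 2.998e8
B = 6.957e8                      # impact parameter ~ one solar radius, m
R_E, R_V = 1.496e11, 1.082e11    # endpoint distances from the Sun, m

def delay_by_integration(n=200_000):
    """One-way delay via midpoint integration of 2GM/(c^3 r) along the ray."""
    l0 = -math.sqrt(R_E**2 - B**2)   # Earth end of the straight-line path
    l1 = math.sqrt(R_V**2 - B**2)    # Venus end
    dl = (l1 - l0) / n
    total = 0.0
    for i in range(n):
        l = l0 + (i + 0.5) * dl
        r = math.hypot(l, B)                    # distance from the Sun
        total += 2 * G * M / (C**3 * r) * dl    # extra time for this element
    return total

# Closed-form one-way delay for comparison
analytic = 2 * G * M / C**3 * math.log(4 * R_E * R_V / B**2)
numeric = delay_by_integration()
print(f"numeric {numeric:.3e} s vs analytic {analytic:.3e} s")
```

The two values agree to well under a percent, illustrating that the excess delay is just the path integral of the potential-dependent slowing of coordinate light speed.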