Everyone has to measure lengths, reckon time and weigh various bodies. Therefore, everyone knows just what a centimetre, a second and a gram are. But these measures are especially important for a physicist—they are necessary for making judgements about most physical phenomena. People try to measure distances, intervals of time and mass, which are called the basic concepts of physics, as accurately as possible.
Modern physical instruments permit us to detect a difference in length between two metre-long rods even if it is less than one-billionth of a metre (a nanometre). It is possible to distinguish intervals of time differing by one-billionth of a second. Good scales can determine the mass of a poppy seed with a very high degree of accuracy.
Measurement techniques started developing only a few hundred years ago, and agreement on what segment of length and what mass of a body to take as units has been reached relatively recently.
But why were the centimetre and the second chosen to be precisely what they are? Clearly, there is no special significance in how long a centimetre or a second is.
A unit of measurement should be convenient—we require nothing further of it. It is very good for a unit of measurement to be at hand, and simplest of all to take the hand itself for such a unit. This is precisely what was done in ancient times; the very names of the units testify to this: for example, an “ell” or “cubit” is the distance between the elbow and the fingertips of a stretched-out hand, an “inch” is the width of a thumb at its base. The foot was also used for measurement—hence the name of the length “foot”.
Although these units of measurement are very convenient in that they are always part of oneself, their disadvantages are obvious: there are just too many differences between individuals for a hand or a foot to serve as a unit of measurement which does not give rise to controversy.
With the development of trade, the need for agreeing on units of measurement arose. Standards of length and mass were at first established within a separate market, then for a city, later for an entire country and, finally, for the whole world. A standard is a model measure: a ruler, a weight. Governments carefully preserve these standards, and other rulers and weights must be made to correspond exactly to them.
The basic measures of weight and length in tsarist Russia—they were called the pound and the arshin—were first made in 1747. Demands on the accuracy of measurements increased in the 19th century, and these standards turned out to be imperfect. The complicated and responsible task of creating exact standards was carried out from 1893 to 1898 under the guidance of Dmitri Ivanovich Mendeleev. The great chemist considered the establishment of exact standards to be very important. The Central Bureau of Weights and Measures, where the standards are kept and their copies made, was founded at the end of the 19th century on his initiative.
Some distances are expressed in large units, others in smaller ones. As a matter of fact, we wouldn’t think of expressing the distance from Moscow to Leningrad in centimetres, or the mass of a railroad train in grams.
People therefore agreed on definite relationships between large and small units. As everyone knows, in the system of units which we use, large units differ from smaller ones by a factor of 10, 100, 1000 or, in general, some power of ten. Such an arrangement is very convenient and simplifies all computations. However, it has not been adopted in all countries. While the metric system is used in scientific and medical fields in the USA, the country still relies primarily on customary units such as inches, feet and pounds in everyday use.¹
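Because the prefixes are all powers of ten, converting between metric units reduces to a single multiplication. A minimal sketch (the 650 km figure for the Moscow–Leningrad distance is an assumed round value for illustration):

```python
# metric prefixes are powers of ten, so any conversion is one multiplication
FACTOR_TO_METRES = {"km": 1e3, "m": 1.0, "cm": 1e-2, "mm": 1e-3, "nm": 1e-9}

def convert(value, src, dst):
    """Convert a length between metric units by going through metres."""
    return value * FACTOR_TO_METRES[src] / FACTOR_TO_METRES[dst]

print(convert(650, "km", "cm"))  # the Moscow-Leningrad distance in centimetres
```

With customary units no such table of powers of ten exists, which is exactly why computations in them are more cumbersome.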
In the 17th century the idea arose of choosing a standard which exists in nature and does not change in the course of years and even centuries. In 1664, Christiaan Huygens proposed that the length of a pendulum making one oscillation a second be taken as the unit of length. About a hundred years later, in 1771, it was suggested that the length of the path of a freely falling body during the first second be regarded as the standard. However, both variants proved to be inconvenient and were not accepted. A revolution was necessary for the emergence of the modern units of measurement—the Great French Revolution gave birth to the kilogram and the metre.
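Both of these early proposals can be checked against present-day values. A minimal sketch, assuming g = 9.81 m/s² and taking "one oscillation a second" to mean a seconds pendulum with a full period of 2 s:

```python
import math

g = 9.81  # free-fall acceleration in m/s^2 (assumed modern value)

# Huygens's pendulum: period T = 2*pi*sqrt(L/g), so L = g*(T/(2*pi))**2
T = 2.0                       # a seconds pendulum swings once each second
L = g * (T / (2 * math.pi)) ** 2
print(round(L, 3))            # just under one metre

# the 1771 proposal: distance a body falls from rest during the first second
d = 0.5 * g * 1.0 ** 2
print(d)                      # 4.905 metres
```

Neither length comes out as a round, easily reproduced number, and both depend on g, which varies slightly from place to place on the Earth; this hints at why the two proposals proved inconvenient.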
In 1790 the French Assembly created a special commission containing the best physicists and mathematicians for the establishment of a unified system of measurements. From all the suggested variants of a unit of length, the commission chose one-ten-millionth of the Earth’s meridian quadrant, calling this unit the metre. Its standard was made in 1799 and given to the Archives of the Republic for safe keeping.
Soon, however, it became clear that the theoretically sound idea of borrowing the models for our measures from nature cannot be fully carried out in practice. More exact measurements performed in the 19th century showed that the standard made for the metre is approximately 0.08 of a millimetre shorter than one-forty-millionth of the Earth’s meridian.
It became obvious that new corrections would be introduced as measurement techniques developed. If the definition of the metre as a fraction of the Earth’s meridian were retained, a new standard would have to be made and all lengths recalculated after each new measurement of the meridian. It was therefore decided, after discussions at the International Congresses of 1870, 1872 and 1875, to take as the unit of length not one-forty-millionth of a meridian, but the standard metre made in 1799 and now kept at the Bureau of Weights and Measures at Sèvres, near Paris.
Together with the metre, there arose its fractions: one-thousandth, called the millimetre; one-millionth, called the micron; and the one used most frequently, one-hundredth, the centimetre.
Let us now say a few words about the second. It is much older than the centimetre. There were no disagreements in establishing a unit for measuring time. This is understandable: the alternation of day and night and the eternal circuit of the Sun suggest a natural way of choosing a unit of time. The expression “determining time by the Sun” is well known to everyone. By measuring the length of the shadow cast by a pole, it is not difficult to find the moment when the Sun is at its highest point: this is noon. The same instant can be marked off on the next day, and the interval of time which elapses between them constitutes a day. All that remains is to divide the day into hours, minutes and seconds.
The large units of measurement—the year and the day—were given to us by nature itself. But the hour, the minute and the second were devised by humans.
The modern division of the day dates far back to antiquity. In Babylon the sexagesimal, rather than the decimal, number system prevailed. Since 60 is divisible by 12 without remainder, the Babylonians divided the day into 12 equal parts.
The division of the day into 24 hours was introduced in Ancient Egypt. Minutes and seconds appeared later. The fact that 60 minutes make an hour and 60 seconds make a minute is also a legacy of Babylon’s sexagesimal system.
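This Babylonian inheritance is still what every clock computation uses: breaking a count of seconds into hours, minutes and seconds is repeated division by 60. A minimal sketch:

```python
def to_hms(total_seconds):
    # sexagesimal legacy: 60 seconds make a minute, 60 minutes make an hour
    hours, remainder = divmod(total_seconds, 3600)
    minutes, seconds = divmod(remainder, 60)
    return hours, minutes, seconds

print(to_hms(86400))  # a full day of 86400 seconds: (24, 0, 0)
print(to_hms(5025))   # (1, 23, 45)
```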
In ancient times and the Middle Ages, time was measured with the aid of sundials, water clocks (by the amount of time required for water to drip out of large vessels) and a number of other ingenious but rather imprecise devices.
With the aid of modern clocks it is easy to convince oneself that the duration of a day is not exactly the same at all times of the year. It was therefore stipulated that the average solar day for an entire year would be taken as the unit of measurement. One-twenty-fourth of this yearly average interval of time is what we call an hour.
But in establishing units of time (the hour, the minute, the second) by dividing the day into equal parts, we assume that the Earth rotates uniformly. However, the ocean tides raised by the Moon and the Sun slow the Earth’s rotation, though only to an insignificant degree. Thus our unit of time, the day, is incessantly becoming longer.
This slowing down of the Earth’s rotation is so insignificant that only recently, with the invention of atomic clocks measuring intervals of time with great accuracy—to one billionth of a second—has it become possible to measure it directly. The change in the length of a day amounts to 1–2 milliseconds in 100 years.
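Though 1–2 milliseconds per century sounds negligible, the lag accumulates quadratically, since every day in later centuries is already longer. A rough sketch, assuming a lengthening of 1.7 ms per century (a value within the range quoted above):

```python
RATE = 1.7e-3          # assumed lengthening of the day per century, in seconds
DAYS_PER_CENTURY = 36525

def accumulated_lag(centuries):
    # after t centuries each day is longer by RATE * t seconds;
    # summing this steadily growing drift over all the days
    # gives a lag that grows as the square of the elapsed time
    return 0.5 * RATE * centuries ** 2 * DAYS_PER_CENTURY

print(accumulated_lag(20) / 3600)  # hours accumulated over two millennia
```

Over twenty centuries the sketch gives a lag of a few hours, which is why such a tiny change per day cannot simply be ignored when comparing clocks across long spans of time.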
But a standard should exclude, when possible, even such an insignificant error. Later, we shall show how this is done.
¹ The following measures of length were officially adopted in England: the nautical mile (1852 m); the ordinary mile (1609 m); the foot (30.48 cm), equal to 12 inches; the inch (2.54 cm); and the yard (0.9144 m), the “tailors’ measure” used to mark off the amount of material needed for a suit.
In the USA and some other countries using customary units, mass is measured in pounds (454 g). Small fractions of a pound are the ounce (1/16 pound) and the grain (1/7000 pound).
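The factors in this footnote hang together, as a quick check shows (the tables below are simply an illustrative arrangement of the footnote’s own figures):

```python
# conversion factors to metres and grams, taken from the footnote
METRES = {"nautical mile": 1852.0, "mile": 1609.0, "yard": 0.9144,
          "foot": 0.3048, "inch": 0.0254}
GRAMS = {"pound": 454.0, "ounce": 454.0 / 16, "grain": 454.0 / 7000}

# internal consistency: twelve inches make a foot, three feet make a yard
print(abs(12 * METRES["inch"] - METRES["foot"]))   # vanishingly small
print(abs(3 * METRES["foot"] - METRES["yard"]))    # vanishingly small
print(GRAMS["ounce"])                              # grams in one ounce
```

Note that the relations among the customary units themselves (12, 3, 16, 7000) follow no single base, in contrast to the uniform powers of ten of the metric system.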