Time Conversions


Week - A week is a grouping of days or a division of a larger grouping such as a lunar month, year, etc. Most parts of the world currently use a seven-day week. Weeks of other lengths have been used historically in various places.

1 week = 7 days = 168 hours = 10,080 minutes = 604,800 seconds (except at daylight saving time transitions or leap seconds). 1 Gregorian calendar year = 52 weeks + 1 day (2 days in a leap year). 1 week is about 23% of an average month. In a Gregorian mean year there are exactly 365.2425 days, and thus exactly 52.1775 weeks (unlike the Julian year of 365.25 days, or 52 5⁄28 weeks, which cannot be represented by a finite decimal expansion). There are exactly 20,871 weeks in 400 Gregorian years, so 10 April 1605 was a Sunday just like 10 April 2005. A system of Dominical letters has been used to determine the day of week in the Gregorian or the Julian calendar.
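The arithmetic above can be checked directly. The following Python sketch (the constant and variable names are only illustrative) reproduces the week conversions and the 400-year Gregorian cycle:

```python
# Week conversions from the figures above (ignoring leap seconds and DST shifts).
SECONDS_PER_MINUTE = 60
MINUTES_PER_HOUR = 60
HOURS_PER_DAY = 24
DAYS_PER_WEEK = 7

minutes_per_week = DAYS_PER_WEEK * HOURS_PER_DAY * MINUTES_PER_HOUR   # 10,080
seconds_per_week = minutes_per_week * SECONDS_PER_MINUTE              # 604,800

# Gregorian mean year and the 400-year cycle.
gregorian_mean_year_days = 365.2425
weeks_per_mean_year = gregorian_mean_year_days / DAYS_PER_WEEK        # 52.1775
days_in_400_years = 400 * 365 + 97     # 97 leap days per 400-year cycle = 146,097
weeks_in_400_years = days_in_400_years / DAYS_PER_WEEK                # 20,871

print(minutes_per_week, seconds_per_week, weeks_per_mean_year, weeks_in_400_years)
```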

Decade - A decade is a period of ten years. The word is derived from the late Latin decas, from Greek decas, from deca. The other words for spans of years also come from Latin: lustrum (5 years), century (100 years), millennium (1000 years). The term usually refers to a period of ten years starting at a multiple of ten. For example, "the 1950s" refers to 1950 through to 1959 (inclusive). In English, "decade" can also be used to specify any period of ten years. For example, "During his last decade, Mozart explored chromatic harmony to a degree rare at the time".

Day - A day is the time it takes the Earth to spin around once. It is daytime on the side of the Earth facing the Sun and nighttime on the side facing away from it. The Earth takes about 24 hours to spin once relative to the Sun, so one day, daytime and nighttime included, lasts 24 hours.

Millennium - A millennium (pl. millennia or millenniums) is a period of time equal to one thousand years (from Latin mille, thousand, and annus, year). The term may implicitly refer to calendar millennia: periods tied numerically to a particular dating system, specifically ones that begin at the starting (initial reference) point of the calendar in question (typically the year 1) or at later whole-number multiples of a thousand years after it. The term can also refer to an interval of time beginning on any date. Frequently in the latter case (and sometimes also in the former) it may have religious or theological implications (see Millenarianism). Especially in religious usage, such an interval may be interpreted less precisely, not being exactly 1,000 years long.

Year - A year (symbol y or sometimes a) is the amount of time it takes the Earth to make one revolution around the Sun. By extension, this can be applied to any planet: for example, a "Martian year" is the time in which Mars completes its own orbit. Although there is no universally accepted symbol for the year, NIST SP811 and ISO 80000-3:2006 suggest the symbol a for use with the International System of Units (although a is also the symbol for the are, a unit of area, context is usually enough to disambiguate). In English, the abbreviations kyr, Myr and Gyr (for kiloyear, megayear and gigayear), though deprecated, are still used informally when denoting intervals of time remote from the present.

A calendar year is the time between two dates with the same name in a calendar. The Gregorian calendar attempts to keep the vernal equinox on or soon before March 21; hence it follows the vernal equinox year. The average length of this calendar's year is 365.2425 mean solar days (equivalent to 97 out of every 400 years being leap years), whereas the vernal equinox year is 365.2424 days. Among solar calendars in wide use today, the Persian calendar is one of the most precise. Rather than being based on numerical rules, the Persian year begins on the day (for the time zone of Tehran) on which the vernal equinox actually falls, as determined by precise astronomical computations. No astronomical year has an integer number of days or lunar months, so any calendar that follows an astronomical year must have a system of intercalation such as leap years. In the Julian calendar, the average length of a year was 365.25 days (still used as a convenient time unit in astronomy). A non-leap year has 365 days and a leap year 366. In the Julian calendar a leap year occurs every 4 years; the Gregorian calendar omits the leap day in century years not divisible by 400. A half year (one half of a year) may run from January to June or from July to December.
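The Gregorian leap-year rule just described can be stated compactly as code. The sketch below (a minimal illustration, not tied to any particular library) derives the 365.2425-day mean year from the 97-in-400 rule:

```python
def is_gregorian_leap_year(year: int) -> bool:
    """Gregorian rule: divisible by 4, except century years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 97 leap years per 400-year cycle gives the 365.2425-day mean year quoted above.
leap_days = sum(is_gregorian_leap_year(y) for y in range(1, 401))   # 97
mean_year = 365 + leap_days / 400                                   # 365.2425

print(leap_days, mean_year)
print(is_gregorian_leap_year(1900), is_gregorian_leap_year(2000))   # False, True
```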

The exact length of an astronomical year changes over time. The main sources of this change are the following. The precession of the equinoxes changes the position of astronomical events with respect to the apsides of Earth's orbit: an event moving toward perihelion recurs with a decreasing period from year to year, while an event moving toward aphelion recurs with an increasing period; this effect does not change the average length of the year. The gravitational influence of the Moon and planets perturbs the Earth's motion from a steady orbit around the Sun; the Earth's orbit varies chaotically, but within a much narrower range than the orbits of the nearer planets. Tidal drag between the Earth and the Moon and Sun increases the length of the day and of the month (by transferring angular momentum from the rotation of the Earth to the revolution of the Moon); since the apparent mean solar day is the unit with which we measure the length of the year in civil life, the length of the year appears to change. Tidal drag in turn depends on factors such as post-glacial rebound and sea-level rise. Changes in the effective mass of the Sun, caused by the solar wind and by the radiation of energy generated by nuclear fusion, lengthen the Earth's orbital period over time (by approximately an extra 1.25 microseconds per year). Other effects tend to shorten the Earth's orbital period: the Poynting-Robertson effect (by about 30 nanoseconds per year) and gravitational radiation (by about 165 attoseconds per year).

Minute - A minute is a unit of measurement of time or of angle. The minute is a unit of time equal to 1/60th of an hour or 60 seconds. In the UTC time scale, a minute occasionally has 59 or 61 seconds; see leap second. The minute is not an SI unit; however, it is accepted for use with SI units. The symbol for minute or minutes is min. The fact that an hour contains 60 minutes is probably due to influences from the Babylonians, who used a base-60 or sexagesimal counting system.

The first division was originally known as a "prime minute", from Latin "(pars) minuta prima", meaning "first minute (i.e. small) part (or division)" of the hour. Likewise, the second was known as a "second minute", meaning "the second small division" of the hour.

Nanosecond - A nanosecond (ns) is one billionth of a second.

Second - The second (SI symbol: s), sometimes abbreviated sec., is the name of a unit of time, and is the International System of Units (SI) base unit of time. It may be measured using a clock. SI prefixes are frequently combined with the word second to denote subdivisions of the second, e.g., the millisecond (one thousandth of a second), the microsecond (one millionth of a second), and the nanosecond (one billionth of a second). Though SI prefixes may also be used to form multiples of the second (such as "kilosecond," or one thousand seconds), such units are rarely used in practice. More commonly encountered, non-SI units of time such as the minute and hour increase by multiples of 60 and 24 (rather than by powers of ten as in SI). The second was also the base unit of time in the centimetre-gram-second, metre-kilogram-second, metre-tonne-second, and foot-pound-second systems of units.

Before mechanical clocks - The Egyptians subdivided daytime and nighttime into twelve hours each since at least 2000 BC, hence their hours varied seasonally. The Hellenistic astronomers Hipparchus (c. 150 BC) and Ptolemy (c. AD 150) subdivided the day sexagesimally and also used a mean hour (1/24 day), but did not use distinctly named smaller units of time. Instead they used simple fractions of an hour.

The day was subdivided sexagesimally, that is by 1/60, by 1/60 of that, by 1/60 of that, etc., to at least six places after the sexagesimal point (a precision of less than 2 microseconds) by the Babylonians after 300 BC, but they did not sexagesimally subdivide smaller units of time. For example, six fractional sexagesimal places of a day were used in their specification of the length of the year, although they were unable to measure such a small fraction of a day in real time. As another example, they specified that the mean synodic month was 29;31,50,8,20 days (four fractional sexagesimal positions), which was repeated by Hipparchus and Ptolemy sexagesimally, and is currently the mean synodic month of the Hebrew calendar, though restated as 29 days 12 hours 793 halakim (where 1 hour = 1080 halakim). The Babylonians did not use the hour, but did use a double-hour lasting 120 modern minutes, a time-degree lasting four modern minutes, and a barleycorn lasting 3⅓ modern seconds (the helek of the modern Hebrew calendar).
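The sexagesimal month length quoted here can be converted to the Hebrew calendar's days-hours-halakim form with a few lines of arithmetic. The following Python sketch is only illustrative:

```python
# Convert the Babylonian sexagesimal month length 29;31,50,8,20 days to decimal days,
# then re-express it as days, hours, and halakim (1 hour = 1080 halakim).
whole, fractions = 29, [31, 50, 8, 20]
month_days = whole + sum(digit / 60 ** (i + 1) for i, digit in enumerate(fractions))

days = int(month_days)
hours_frac = (month_days - days) * 24
hours = int(hours_frac)
halakim = round((hours_frac - hours) * 1080)

print(f"{month_days:.6f} days = {days} d {hours} h {halakim} halakim")
# -> 29.530594 days = 29 d 12 h 793 halakim
```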

In 1000, the Persian scholar al-Biruni gave the times of the new moons of specific weeks as a number of days, hours, minutes, seconds, thirds, and fourths after noon Sunday. In 1267, the medieval scientist Roger Bacon stated the times of full moons as a number of hours, minutes, seconds, thirds, and fourths (horae, minuta, secunda, tertia, and quarta) after noon on specified calendar dates. Although the "third" (1/60 of a second) survives in some languages, for example Polish (tercja) and Turkish (salise), the modern second is subdivided decimally.

Seconds measured by mechanical clocks - The first clock that could show time in seconds was created by Taqi al-Din at his Istanbul observatory between 1577 and 1580. He called it the "observational clock" in his In the Nabik Tree of the Extremity of Thoughts, where he described it as "a mechanical clock with three dials which show the hours, the minutes, and the seconds." He used it as an astronomical clock, particularly for measuring the right ascension of the stars. The first mechanical clock displaying seconds in Europe was constructed in Switzerland at the beginning of the 17th century.

The second first became accurately measurable with the development of pendulum clocks keeping mean time (as opposed to the apparent time displayed by sundials), specifically in 1670 when William Clement added a seconds pendulum to the original pendulum clock of Christiaan Huygens. The seconds pendulum has a period of two seconds, one second for a swing forward and one second for a swing back, enabling the longcase clock incorporating it to tick seconds. From this time, a second hand that rotated once per minute in a small subdial began to be added to the clock faces of precision clocks.

Modern measurements - In 1956 the second was defined in terms of the period of revolution of the Earth around the Sun for a particular epoch, because by then it had become recognized that the Earth's rotation on its own axis was not sufficiently uniform as a standard of time. The Earth's motion was described in Newcomb's Tables of the Sun (1895), which provide a formula estimating the motion of the Sun relative to the epoch 1900 based on astronomical observations made between 1750 and 1892. The second thus defined is the fraction 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time.
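The 1956 definition also pins down the length of the 1900 tropical year in 86,400-second days, which a quick calculation recovers (a minimal sketch; the variable names are illustrative):

```python
# The 1956 ephemeris-second definition implies the length of the tropical year
# for 1900 January 0, 12h ET: 31,556,925.9747 seconds expressed in 86,400-second days.
ephemeris_seconds_per_tropical_year_1900 = 31_556_925.9747
seconds_per_day = 86_400

tropical_year_1900_days = ephemeris_seconds_per_tropical_year_1900 / seconds_per_day
print(f"{tropical_year_1900_days:.8f} days")   # about 365.24219878 days
```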

This definition was ratified by the Eleventh General Conference on Weights and Measures in 1960. The tropical year in the definition was not measured, but calculated from a formula describing a mean tropical year which decreased linearly over time, hence the curious reference to a specific instantaneous tropical year. This definition of the second was in conformity with the ephemeris time scale adopted by the IAU in 1952, defined as the measure of time that brings the observed positions of the celestial bodies into accord with the Newtonian dynamical theories of their motion (those accepted for use during most of the twentieth century being Newcomb's Tables of the Sun, used from 1900 through 1983, and Brown's Tables of the Moon, used from 1923 through 1983).

With the development of the atomic clock, it was decided to use atomic clocks as the basis of the definition of the second, rather than the revolution of the Earth around the Sun. Following several years of work, Louis Essen from the National Physical Laboratory (Teddington, England) and William Markowitz from the United States Naval Observatory (USNO) determined the relationship between the hyperfine transition frequency of the cesium atom and the ephemeris second. Using a common-view measurement method based on the received signals from radio station WWV, they determined the orbital motion of the Moon about the Earth, from which the apparent motion of the Sun could be inferred, in terms of time as measured by an atomic clock. They found that the second of ephemeris time (ET) had the duration of 9,192,631,770 ± 20 cycles of the chosen cesium frequency. As a result, in 1967 the Thirteenth General Conference on Weights and Measures defined the second of atomic time in the International System of Units as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.
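As a rough illustration of the figures above, the sketch below converts a count of cesium periods into seconds and computes the fractional uncertainty implied by the ±20-cycle spread of the Essen-Markowitz measurement (the names are illustrative, not from any standard library):

```python
# The SI second counts 9,192,631,770 periods of the cesium-133 hyperfine transition.
CS_PERIODS_PER_SECOND = 9_192_631_770

def periods_to_seconds(periods: int) -> float:
    """Elapsed time implied by a count of cesium hyperfine periods."""
    return periods / CS_PERIODS_PER_SECOND

# Fractional uncertainty of the measurement quoted above (±20 cycles).
relative_uncertainty = 20 / CS_PERIODS_PER_SECOND
print(periods_to_seconds(CS_PERIODS_PER_SECOND * 60))   # 60.0 seconds
print(f"{relative_uncertainty:.1e}")                    # about 2.2e-09
```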

This SI second, referred to atomic time, was later verified to be in agreement, within 1 part in 10¹⁰, with the second of ephemeris time as determined from lunar observations. During the 1970s it was realized that gravitational time dilation caused the second produced by each atomic clock to differ depending on its altitude. A uniform second was produced by correcting the output of each atomic clock to mean sea level (the rotating geoid), lengthening the second by about 1 × 10⁻¹⁰. This correction was applied at the beginning of 1977 and formalized in 1980. In relativistic terms, the SI second is defined as the proper time on the rotating geoid.

The definition of the second was later refined at the 1997 meeting of the BIPM to include the statement "This definition refers to a cesium atom at rest at a temperature of 0 K." The revised definition would seem to imply that the ideal atomic clock would contain a single cesium atom at rest emitting a single frequency. In practice, however, the definition means that high-precision realizations of the second should compensate for the effects of the ambient temperature (black-body radiation) within which atomic clocks operate, and extrapolate accordingly to the value of the second at a temperature of absolute zero.

Today, the atomic clock operating in the microwave region is challenged by atomic clocks operating in the optical region. To quote Ludlow et al. "In recent years, optical atomic clocks have become increasingly competitive in performance with their microwave counterparts. The overall accuracy of single trapped ion based optical standards closely approaches that of the state-of-the-art cesium fountain standards. Large ensembles of ultracold alkaline earth atoms have provided impressive clock stability for short averaging times, surpassing that of single-ion based systems. So far, interrogation of neutral atom based optical standards has been carried out primarily in free space, unavoidably including atomic motional effects that typically limit the overall system accuracy. An alternative approach is to explore the ultranarrow optical transitions of atoms held in an optical lattice. The atoms are tightly localized so that Doppler and photon-recoil related effects on the transition frequency are eliminated."

The NRC attaches a "relative uncertainty" of 2.5 × 10⁻¹¹ (limited by day-to-day and device-to-device reproducibility) to their atomic clock based upon the ¹²⁷I₂ molecule, and is advocating use of an ⁸⁸Sr ion trap instead (relative uncertainty due to linewidth of 2.2 × 10⁻¹⁵); see magneto-optical trap and "Trapped ion optical frequency standards" (National Physical Laboratory). Such uncertainties rival that of the NIST-F1 cesium atomic clock in the microwave region, estimated as a few parts in 10¹⁶ averaged over a day.

Millisecond - A millisecond (from milli- and second; abbreviation: ms) is one thousandth (1/1000) of a second.

Century - A century is a way to describe a length of time. One century is one hundred years. The ancient Romans used the word century to describe a group of about one hundred soldiers, organized into a single unit. The Roman numeral for 100 is "C", and the word for 100 in Latin is "centum". Centuries in the Gregorian calendar (the most commonly used calendar) are numbered starting from one rather than zero, so the year 2004 fell in the 21st century, not the 20th.
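Because century numbering starts at year 1, the century containing a given year can be computed as in the following sketch (the function name century_of is just illustrative):

```python
def century_of(year: int) -> int:
    """Century number for a Gregorian year, given that centuries start at year 1."""
    if year < 1:
        raise ValueError("expects a year in the Common Era (1 or later)")
    return (year - 1) // 100 + 1

print(century_of(2004))   # 21
print(century_of(2000))   # 20  (the 21st century began in 2001)
print(century_of(1))      # 1
```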

Hour - The hour was originally defined in ancient civilizations (including those of Egypt, Sumer, India, and China) as either one twelfth of the time between sunrise and sunset or one twenty-fourth of a full day. In either case the division reflected the widespread use of a duodecimal numbering system. The importance of 12 has been attributed to the number of lunar cycles in a year, and also to the fact that humans have 12 finger bones (phalanges) on one hand (3 on each of 4 fingers). (It is possible to count to 12 with your thumb touching each finger bone in turn.) There is also a widespread tendency to make analogies among sets of data (12 months, 12 zodiacal signs, 12 hours, a dozen).

The Ancient Egyptian civilization is usually credited with establishing the division of the night into 12 parts, although there were many variations over the centuries. Astronomers in the Middle Kingdom (9th and 10th Dynasties) observed a set of 36 decan stars throughout the year. These star tables have been found on the lids of coffins of the period. The heliacal rising of the next decan star marked the start of a new civil week, which was then 10 days. The period from sunset to sunrise was marked by 18 decan stars. Three of these were assigned to each of the two twilight periods, so the period of total darkness was marked by the remaining 12 decan stars, resulting in the 12 divisions of the night. The time between the appearance of each of these decan stars over the horizon during the night would have been about 40 modern minutes. During the New Kingdom, the system was simplified, using a set of 24 stars, 12 of which marked the passage of the night.

Earlier definitions of the hour varied within these parameters:

One twelfth of the time from sunrise to sunset. As a consequence, hours on summer days were longer than on winter days, their length varying with latitude and even, to a small extent, with the local weather (since it affects the atmosphere's index of refraction). For this reason these hours are sometimes called temporal, seasonal, or unequal hours. Romans, Greeks and Jews of the ancient world used this definition, as did the ancient Chinese and Japanese. The Romans and Greeks also divided the night into three or four night watches, but later the night (the time between sunset and sunrise) was also divided into twelve hours. When, in post-classical times, a clock showed these hours, its period had to be changed every morning and evening (for example by changing the length of its pendulum), or it had to keep to the position of the Sun on the ecliptic (see Prague Astronomical Clock). A sketch of the seasonal-hour calculation follows below.

One twenty-fourth of the apparent solar day (between one noon and the next, or between one sunset and the next). As a consequence, hours varied a little, because the length of an apparent solar day varies throughout the year. When a clock showed these hours it had to be adjusted a few times a month. These hours were sometimes referred to as equal or equinoctial hours.

One twenty-fourth of the mean solar day. See mean sun for more information on the difference from the apparent solar day. When an accurate clock showed these hours it virtually never had to be adjusted. However, as the Earth's rotation slows down, this definition has been abandoned.
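As referenced in the first definition above, a seasonal (unequal) daytime hour is simply one twelfth of the interval from sunrise to sunset. A minimal Python sketch, assuming the sunrise and sunset times for the location and date are already known:

```python
from datetime import datetime, timedelta

def seasonal_hour_length(sunrise: datetime, sunset: datetime) -> timedelta:
    """Length of one seasonal (unequal) daytime hour: 1/12 of sunrise-to-sunset."""
    return (sunset - sunrise) / 12

# Illustrative times only: a long summer day vs. a short winter day.
summer = seasonal_hour_length(datetime(2024, 6, 21, 5, 0), datetime(2024, 6, 21, 21, 0))
winter = seasonal_hour_length(datetime(2024, 12, 21, 8, 0), datetime(2024, 12, 21, 16, 0))
print(summer, winter)   # 1:20:00 and 0:40:00
```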

40 Hour Work Week - The legal workweek (a US term, in the UK called the working week) varies from nation to nation. What constitutes the workweek is mandated by law in some jurisdictions, while in others it is governed by custom. The weekend is a part of the week, usually lasting one or two days, in which most paid workers do not work (except for those in retail and entertainment, where business often needs to be covered seven days a week).

The standard business office workweek in the United States is from Monday through Friday, 40 hours per week. However, many service providers are open for business on Saturday and Sunday as well.