There has been a lot of interest lately in the correct method of calculating leap years, especially since it seems many programmers have got it wrong. Before Y2K the approximation "a leap year occurs every four years" was good enough for most situations. If you needed to deal with historical dates (post-1600 AD), then you had to add the clause "unless the year was divisible by 100".
This is where many programmers left it - but the full rule for the Gregorian calendar is:
- Most years divisible by 4 are leap years (e.g. 1996 was)
- However, most years divisible by 100 are not (1900 was not)
- Unless they are also divisible by 400 (2000 will be)
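The three rules above can be sketched as a single test - a minimal illustration in Python (the function name is my own, not from any standard library):

```python
def is_leap_gregorian(year):
    """Gregorian rule: divisible by 4, except century years,
    unless the century year is also divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

So `is_leap_gregorian(1996)` and `is_leap_gregorian(2000)` are true, while `is_leap_gregorian(1900)` is false.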
It turns out, though, that even more subclauses can be appended to these rules to account for the fact that the Earth takes 365.2421898 days to orbit the sun (well, it did in 1996 - the figure is now a shade shorter). Because of this, the Gregorian calendar will be off by about a day in 3000 years or so. There are various proposals for dealing with this, but it seems it won't really become an issue until 2800 AD.
One proposal (by the astronomer John Herschel (1792-1871), among others) suggests:
- Every year which is divisible by 4000 is not a leap year.
This is more accurate than the Gregorian calendar, but has not been officially adopted.
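Herschel's refinement layers one extra exception on top of the Gregorian rule. A minimal sketch in Python (again, the function name is my own):

```python
def is_leap_herschel(year):
    """Gregorian rule plus Herschel's proposed refinement:
    years divisible by 4000 are never leap years."""
    if year % 4000 == 0:
        return False
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

Under this rule the year 4000, a leap year in the Gregorian calendar, would be an ordinary year; all nearer years are unchanged.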
Another proposal (by the Orthodox Church in Greece) is to replace the "divisible by 400" rule with the following:
- Every year which when divided by 900 leaves a remainder of 200 or 600 is a leap year.
This makes 1900, 2100, 2200, 2300, 2500, 2600, 2700, 2800 non-leap years, whereas 2000, 2400, and 2900 are leap years. This will not create a conflict with the rest of the world until the year 2800. However, this rule has also not been officially adopted.
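This variant keeps the "divisible by 4" and "divisible by 100" rules and swaps only the final exception. A minimal sketch in Python (function name assumed for illustration):

```python
def is_leap_orthodox(year):
    """Gregorian rule with the 'divisible by 400' exception replaced:
    a century year is a leap year only if dividing it by 900
    leaves a remainder of 200 or 600."""
    return year % 4 == 0 and (year % 100 != 0 or year % 900 in (200, 600))
```

Checking the years listed above: 2000 % 900 = 200 and 2400 % 900 = 600, so both remain leap years, while 2800 % 900 = 100, making 2800 the first year on which this rule and the Gregorian calendar disagree.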