The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Seriously, it's a number related to some super large astronomical cycle. Astronomers decided on it a few hundred years ago and, as the origin is arbitrary, it stayed like that. For details see: Julian day | Calendar Wiki | Fandom[^]
I apologize if I'm too dense or too nerdy: I don't see anything intrinsically wrong with storing a date as a float.
A 4-byte float value has 24 bits of mantissa (23 stored, plus an implicit leading bit), giving 2^24 values that can be represented exactly. That makes almost 17 million days, or almost 46,000 years. I'd argue that this is a decent range for most general-purpose applications.
It might be unusual, and not in keeping with established rules, but it is not breaking any (computer) law.
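For what it's worth, a quick C sketch (assuming IEEE 754 single precision) bears that range out for plain day counts, while also showing the catch: a yyyyMMdd-style value is already too big for a 4-byte float to hold exactly:

```c
#include <stdio.h>

int main(void)
{
    /* 2^24 = 16777216 is the last value at which consecutive
       integers are still exactly representable in a 32-bit float. */
    float ok   = 16777216.0f;              /* 2^24     : exact       */
    float lost = 16777217.0f;              /* 2^24 + 1 : rounds back */
    printf("%.1f %.1f\n", ok, lost);       /* 16777216.0 16777216.0  */

    /* A day count fits comfortably (about 46,000 years of days),
       but a yyyyMMdd value is already beyond 2^24: */
    float ymd = 20201005.0f;
    printf("%.1f\n", ymd);                 /* 20201004.0 */
    return 0;
}
```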
It's simply not a date, and every time you have to use it you either need to cast it to a date (a performance bottleneck) or you need to know exactly how it's formatted, and so on.
It's also only a matter of time before you get 20201005.00000000003, because it's still floating-point arithmetic.
Or perhaps someone will enter 1 or 2020931; who's to say, as there is no validation.
Perhaps people will use a local date format, such as 05102020 or 10052020; good luck working with that.
Now let's say the float contains ISO dates in the form yyyyMMdd, and the date is 20201029. Now add 7 days: good luck!
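To make that concrete, here's a minimal C sketch: naive arithmetic on the yyyyMMdd value overflows into nonsense, while a proper date type (struct tm plus mktime, which normalizes out-of-range fields) rolls over correctly:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Naive arithmetic on a yyyyMMdd value produces a non-date. */
    printf("%d\n", 20201029 + 7);          /* 20201036: nonsense */

    /* A real date type rolls over correctly: mktime() normalizes
       out-of-range fields such as a day-of-month of 36. */
    struct tm d = {0};
    d.tm_year  = 2020 - 1900;              /* years since 1900   */
    d.tm_mon   = 10 - 1;                   /* months are 0-based */
    d.tm_mday  = 29 + 7;                   /* let mktime fix it  */
    d.tm_isdst = -1;                       /* unknown DST status */
    mktime(&d);
    printf("%04d%02d%02d\n",               /* 20201105           */
           d.tm_year + 1900, d.tm_mon + 1, d.tm_mday);
    return 0;
}
```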
All in all, it's a lot of trouble and ambiguity, and simply a disaster waiting to happen.
If you really need dates that do not fit into any SQL date type, use separate integers for year, month and day, I guess.
I see, you are talking about those dates, with a random number of days in the month. Sorry, but in that case the cartoon bubble should have said "Officer, he is using vaguely defined dates!", and yes, that should be a punishable offense.
The only well-defined timestamps are those based on a definition like "the number of <somethings> from <origin>". In the case of dates, that can be "number of days from noon, Jan 1st of 4713BC" (Julian date) or "number of days from midnight, Jan 1st, 1970" (the Unix epoch; Unix time proper counts seconds from that origin). Notice that in this case fractional parts have a natural meaning (fractions of a day). For good measure, don't forget to add a meridian so we know exactly what "midnight" means.
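As a concrete instance of the "number of <somethings> from <origin>" definition, a short C sketch, assuming UTC and using the standard offset of 2440587.5 days between the Unix epoch and the Julian date origin:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Unix time counts seconds from midnight UTC, Jan 1st 1970,
       which falls at Julian date 2440587.5 -- so converting one
       day count to the other is a single addition. */
    time_t now = time(NULL);                  /* seconds since epoch */
    double days_since_epoch = now / 86400.0;  /* fractional days     */
    double julian_date = days_since_epoch + 2440587.5;

    printf("days since Unix epoch: %.5f\n", days_since_epoch);
    printf("Julian date:           %.5f\n", julian_date);
    return 0;
}
```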
To me, that is a matter of software design philosophy.
A date is not a float, it is a date. So it should not be stored as a float but as a date.
I also shiver at old C, where you coded an infinite loop as "while (1) ...". A logical value is not an integer! And weekdays are Monday, Tuesday, ... Sunday, not integers 0..6 (or 1..7).
Some tools force you to handle data values as being of a different type/class than they are from a semantic point of view. We'll just have to accept that - and wrap it up so that you see as little as possible of that misleading, "wrong" type. (Even in K&R C, you could #define true and false.) If e.g. a database system does not provide a proper Date value, you may be forced to convert it to some acceptable representation when written, and back when read. Your application code should never see that, but treat it as a date type/class.
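A minimal C sketch of that wrapping idea, assuming (hypothetically) a store that can only hold a number; the names and the day-count representation are mine, not anything prescribed:

```c
#include <stdio.h>
#include <time.h>

/* Application code sees only Date; the storage representation is
   confined to the two conversion functions below. */
typedef struct { int year, month, day; } Date;

/* Storage side: Date -> day count since the Unix epoch.
   Noon is used to stay clear of DST boundaries. */
static double date_to_storage(Date d)
{
    struct tm t = {0};
    t.tm_year  = d.year - 1900;
    t.tm_mon   = d.month - 1;
    t.tm_mday  = d.day;
    t.tm_hour  = 12;
    t.tm_isdst = -1;
    return (double)(mktime(&t) / 86400);
}

/* Storage side: day count -> Date. */
static Date date_from_storage(double days)
{
    time_t noon = (time_t)days * 86400 + 43200;
    struct tm *t = localtime(&noon);
    Date d = { t->tm_year + 1900, t->tm_mon + 1, t->tm_mday };
    return d;
}

int main(void)
{
    Date d = { 2020, 10, 5 };
    double stored = date_to_storage(d);       /* what the store sees */
    Date back = date_from_storage(stored);    /* what the app sees   */
    printf("%04d-%02d-%02d\n", back.year, back.month, back.day);
    return 0;
}
```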
Couldn't your database write/read wrappers convert dates to/from floats?

There is a principal difference between measuring and counting. Would you store NumberOfApples as a float, too? Days are discrete, countable units. They should be treated as countable values, not as measurement values. They should be integers (whether a day count from a given zero, or a multi-field year-month-day value) or, if this is not feasible, a closed set of string values (e.g. as an ISO 8601 string). But not as a measurement value!
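And a small C sketch of the "closed set of string values" option, holding the date as an ISO 8601 yyyy-mm-dd string; the helper names are made up for illustration:

```c
#include <stdio.h>

typedef struct { int year, month, day; } Date;

/* Accepts a yyyy-mm-dd string; returns 1 if it parses, 0 otherwise.
   (Range checks -- month 1..12 and so on -- omitted for brevity.) */
static int parse_iso8601(const char *s, Date *d)
{
    return sscanf(s, "%4d-%2d-%2d", &d->year, &d->month, &d->day) == 3;
}

/* Formats back to the same closed set of string values. */
static void format_iso8601(Date d, char out[11])
{
    snprintf(out, 11, "%04d-%02d-%02d", d.year, d.month, d.day);
}

int main(void)
{
    Date d;
    char buf[11];
    if (parse_iso8601("2020-10-05", &d)) {
        format_iso8601(d, buf);
        printf("%s\n", buf);                  /* 2020-10-05 */
    }
    return 0;
}
```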
Except when they aren't: right now in Sydney it's October 6, while for me October 5th has just started; the October Revolution started on November 7th; and I could keep giving examples of time weirdness. In the words of Dr Who (and he is a Time Lord, so he should know something about that), time is a "big ball of wibbly wobbly... time-y wimey... stuff"[^].
As you said, apples are countable objects and behave like that. Time, for historical and astronomical reasons, has been counted in different ways. Sometimes after 0228 you get 0229, and sometimes you get 0301. The fact that after 59 you get 00 (and sometimes after 23 you get 00) goes all the way back to ancient Babylon.
The only reasonable way to keep track of time is as a measure on a long, continuous axis that has some fixed origin. The actual origin and the unit increment on the axis may depend on your application, but that's all there is to it.
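For illustration, a tiny C sketch of that view, using seconds since the Unix epoch as the axis (UTC assumed; leap seconds ignored, as Unix time itself does):

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* One monotonic count -- seconds since the Unix epoch, in UTC --
       is the axis; arithmetic on it is trivial. */
    time_t now   = time(NULL);
    time_t later = now + 7 * 86400;        /* one week along the axis */

    /* The messy calendar (leap days, 59 -> 00, 23 -> 0) appears only
       when a point on the axis is rendered for humans. */
    struct tm utc = *gmtime(&later);
    printf("one week from now: %04d-%02d-%02d\n",
           utc.tm_year + 1900, utc.tm_mon + 1, utc.tm_mday);
    return 0;
}
```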
Excel and dates are a terrible combination.
Haven't used Access, but I can only assume it's the same.
To say SQL Server saves dates as floating-point numbers is a bit simplistic.
It may do so, but then there's probably a ton of code on top to make it behave like a datetime.
As I do every once in a while, I fired up the weather app built into Windows 10 this morning to have a quick look at today's forecast. I was greeted by a small dialog box saying Microsoft had updated its licensing terms, and presenting two buttons, Learn More and Continue (or Ignore... I forget already). I hit the latter and carried on. Typically the Learn More button would load some page showing the entire EULA. No way I'm gonna bother reading that, even though a part of me was curious as to what exactly it is they've changed.
Then I got thinking: what apps with changing Terms of Service need is a Differences screen, like the one Visual Studio shows when you're about to check code into a source control system. That is, present the previous and current versions side by side, but highlight the parts that are different. I'll bet a lot more people would actually read them if they were presented that way. I know I would.
But no doubt nobody's ever going to do that, since the current method allows them to slip in more and more nefarious items over time that would otherwise be thrust into the spotlight.