|
pwasser wrote: verb "not fly" for example No such thing. You are standing or sitting or walking or lying or some other ground-based thing, not "not flying".
#SupportHeForShe
Government can give you nothing but what it takes from somebody else. A government big enough to give you everything you want is big enough to take everything you've got, including your freedom.-Ezra Taft Benson
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering!-Wernher von Braun
|
|
|
|
|
TheGreatAndPowerfulOz wrote: Where did these people learn grammar? They didn't. Schools stopped teaching grammar, along with spelling, years ago when the education establishment decided it was good to not force children to learn rules.
|
|
|
|
|
I understand what you're saying...
But, the two sentences, to me, have very different meanings.
a) I'm going to not hit him - implication: there will be no hitting, but kicking is a potential; regardless, he needs a smackdown. If my mother told me to not hit my brother, I could use this sentence, kick my brother, and then my mother would say, "You knew what I meant!" and I would get hit.
b) I'm not going to hit him - implication: no physical violence will occur.
|
|
|
|
|
Tim Carmichael wrote: b) I'm not going to hit him - implication, no physical violence will occur
same applies
b) I'm not going to hit him ... but that also doesn't mean I won't bite him
a) is, as the OP said, considered 'bad grammar,' but only in English. As others also pointed out, different languages (German, Dutch) do change the order. Given that all of these (and others) are Germanic-rooted languages, it's odd that what's OK in one is considered bad in another. But bad does not mean wrong; perhaps 'ambiguous,' 'incomplete,' 'unnatural/uncommon,' or even 'outdated.'
I'm going to hit him not. .... I'm going to hit not him.
!hit(him)
hit(!him)
^hit(him)
// and perhaps a few others
Sin tack
the any key okay
|
|
|
|
|
Exactly my reaction, too - but case A could also be a complete rejection of some proposal from others: I am doing the opposite; I am going to defend him! The main thing is that "to not do it" is a very explicit rejection, not just refraining from doing something.
Another example: religious people sometimes claim that rejecting the idea of a divine creator is just as much a religion as believing in a deity. A proper response to that is "Yes, and my favorite hobby is to not collect stamps".
|
|
|
|
|
I'm going through expenses, and for anyone living in Canada who doesn't have that weird Canada / US hardware translation unit built into their brain, it's painful.
It's the dates. The US, alone, uses mm/dd/yy. The rest of the world, except for Belize, uses something vaguely sensible. Even Canada. Except Canada has a ton of systems imported directly from the US (or shares systems with US parent companies), so lots of dates on things like receipts are in the form mm/dd/yy. Or they're dd/mm/yy. You can't tell. 06/07/17. Guess the date.
Canadians can tell, just by looking at the date, whether it's June or July. To me that's impossible, yet they seem to do it.
Somewhere a programmer decided to output the date this way. Either they just used the default date formatter or they deliberately chose a dd/mm/yy or mm/dd/yy format. Five seconds of work would have let them output dd-MMM-yyyy, dd-MMM-yy, or even yyyy-mm-dd or yy-mm-dd - any of which would allow a high level of accuracy in guessing the date. I'm sure they also thought, at the time, that their decision was a valid one.
It wasn't, and it made me wonder whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously. Is this something you do? Is it something your lead actually stops you doing? Or is it something you've not really thought of?
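For illustration only - the thread doesn't name a language, so here's a minimal Python sketch of how small the difference between the ambiguous and unambiguous formats really is:

```python
from datetime import date

d = date(2017, 6, 7)

# Ambiguous: is "06/07/17" June 7 or July 6?
print(d.strftime("%m/%d/%y"))   # 06/07/17
print(d.strftime("%d/%m/%y"))   # 07/06/17

# Unambiguous alternatives, each a few seconds more work.
# (%b is the abbreviated month name; its exact text is locale-dependent.)
print(d.strftime("%d-%b-%Y"))   # e.g. 07-Jun-2017
print(d.isoformat())            # 2017-06-07 (ISO 8601, yyyy-mm-dd)
```

The point stands in any language: the default formatter is a choice, and a named-month or ISO format removes the guessing game.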
cheers
Chris Maunder
|
|
|
|
|
At school they called it "information science". We work with information and are responsible for its correctness and validity when storing it, as well as for giving the correct output.
We are, essentially, replacing paper-and-people-equivalent processes, in any form, that result in the same output. No one outside the developer has any influence on how that is done, so ultimately you're the one responsible - mayhaps not in a legal way, but that is an entirely different thing.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Chris Maunder wrote: we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously Yes.
Chris Maunder wrote: Is this something you do? Yes.
Chris Maunder wrote: Is it something your lead actually stops you doing? No, and if he did I'd bite his head clean off.
#SupportHeForShe
Government can give you nothing but what it takes from somebody else. A government big enough to give you everything you want is big enough to take everything you've got, including your freedom.-Ezra Taft Benson
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering!-Wernher von Braun
|
|
|
|
|
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Some developers are rebellious, and the guy (or girl, or gender-neutral being) who thought up the US date format was probably just doing it to irritate us Europeans!
|
|
|
|
|
RickZeeland wrote: probably just doing it to irritate us Europeans! In that case, I shall restore my support for the mm/dd/yyyy format!
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol, Mar 2010
|
|
|
|
|
Why Do Americans Write Dates: Month-Day-Year? - YouTube[^]
TL;DW: No one knows, but we've been doing it since colonizing North America. Personally, I usually only care about the month and the current day of the week; the specific day and year are largely irrelevant day-to-day. Maybe that's why? Just a guess.
|
|
|
|
|
If that's the case when/why did the UK switch to day/month/year?
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
It's almost certainly a matter of month/day being the part most often used, with the year added on optionally.
Consider the MM/DD portion of the date to be day-of-year. It's big-endian, which isn't bad in itself. Appending the year afterwards rather than before is normal in speech, but when you write it down it appears you're now using little-endian: day-of-year/year. It's awkward, but not crazy.
If you think that's crazy because mixing little-endian with big-endian is crazy, consider that DD/MM/YYYY already mixes little- with big-, since the digits themselves are written big-endian.
(I prefer YYYY-MM-DD myself, which is all big-endian and fully consistent, but the American way isn't illogical, per se.)
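One practical point in favor of the all-big-endian form: plain string sorting orders yyyy-mm-dd dates chronologically, with no date parsing at all. A quick sketch (Python, purely illustrative):

```python
# ISO 8601 strings: lexicographic order == chronological order.
iso_dates = ["2017-06-07", "2016-12-31", "2017-01-15"]
print(sorted(iso_dates))  # ['2016-12-31', '2017-01-15', '2017-06-07']

# The same trick fails for mm/dd/yy: the year, the most significant
# part, is compared last, so the sort is chronologically wrong.
us_dates = ["06/07/17", "12/31/16", "01/15/17"]
print(sorted(us_dates))   # ['01/15/17', '06/07/17', '12/31/16']
```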
Jesse
|
|
|
|
|
Since colonial times... Interesting. The Declaration of Independence is month day year, but I think Lincoln used day month year. But he also used "four score and seven years ago", which is confusing to everyone today.
My theory has always been that it was the IRS in the early 1900s, using month day year, that forced the U.S. to "standardize" on the wacky date format.
|
|
|
|
|
Yeah, he even points out an instance where they use MM/DD/YYYY and then DD/MM/YYYY in the same sentence. It seems that as soon as people came over to the Americas, you see the MM/DD/YYYY format start being used alongside DD/MM/YYYY, and at some point people just decided to use MM/DD/YYYY exclusively. A historical mystery. Maybe it was just to spite the British?
|
|
|
|
|
Chris Maunder wrote: whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously Absolutely.
Chris Maunder wrote: Is this something you do? Always.
Chris Maunder wrote: Is it something your lead actually stops you doing? I am the lead. But if someone suggested that ambiguous display of dates was acceptable, I don't think they would last very long in our organization.
/ravi
|
|
|
|
|
I manipulate and persist dates/times in an unambiguous fashion. I present them according to the user's preferences as indicated by the Windows locale or other mechanism.
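In Python terms - a sketch of the same store-unambiguously, display-per-locale idea; the poster's actual stack isn't stated - it looks something like:

```python
import locale
from datetime import datetime, timezone

# Persist unambiguously: ISO 8601 with an explicit UTC offset.
stored = datetime(2017, 6, 7, 12, 0, tzinfo=timezone.utc).isoformat()
print(stored)  # 2017-06-07T12:00:00+00:00

# Present according to the user's preferences: adopt the
# environment's locale and use its preferred date representation.
locale.setlocale(locale.LC_TIME, "")
shown = datetime(2017, 6, 7).strftime("%x")  # output varies by locale
print(shown)
```

The storage format never changes; only the presentation layer consults the locale (or whatever "other mechanism" the application uses).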
Software Zen: delete this;
|
|
|
|
|
Like a good developer should, thank you very much.
But in the current "developer" world (i.e. Silicon Valley style), minding that not all of your millions of users are from the US is a mind-boggling concept. You think dates are bad? How about keyboard shortcuts that absolutely cannot work (looking at you, Android Studio)? Every time I see a "Ctrl+/" or something like that, it's apparent that nobody tried the software with a non-US/UK keyboard. (Hint: most localized keyboards use key combinations for (), [], \, etc., so shortcuts that require those keys will not work.)
|
|
|
|
|
André Pereira wrote: How about keyboard short-cuts that absolutely cannot work I have been through that scenario. I create user interfaces for our line of commercial ink-jet printing systems. Our UI is largely touch-screen driven. For one product line I implemented several locale-specific on-screen keyboard layouts for entry of file names and such. Most of them were fairly easy, except for the Japanese, Korean, and Simplified Chinese. For those I sent photos of the physical keyboard to colleagues in-country, and had them send me text files with the keytop characters encoded in UTF-16.
Software Zen: delete this;
|
|
|
|
|
Quote: photos of the physical keyboard
Quote: had them send me text files with the keytop characters encoded in UTF-16
That's a very hands-on solution; I like it. It's something a script kiddie could do, instead of researching the whole thing. But no, they just look at their Mac and that's it.
|
|
|
|
|
Not forgetting degrees (temperature) and degrees of angle. The latter is odd because almost all humans use 0..360, whereas almost every math library uses radians - it's easy to visualise a 35-degree slope, but 0.4 radians is how steep? With pi being irrational and computers not yet capable of infinite digits (not that long ago computers couldn't handle more than 6 decimal places very well), what a stupid choice that was.
Another that's slipping is currency: I'm starting to see single decimals popping up, e.g. $5.5 ... sure, cents (pennies if you must) are annoying, but skipping that last digit is just lazy.
(Currently the temperature here is 298 degrees and my chair is tilted at about 0.1 degrees, just the way this grumpy irrational old man likes it.)
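The degrees-vs-radians mismatch is easy to see in practice; most libraries (Python's math module shown here, as one example) work in radians and leave the conversion at the boundary to you:

```python
import math

angle_deg = 35.0                      # easy for a human to visualise
angle_rad = math.radians(angle_deg)   # ~0.611, what the library wants
slope = math.tan(angle_rad)           # rise over run for a 35-degree slope

# The conversion round-trips cleanly in both directions.
assert abs(math.degrees(angle_rad) - angle_deg) < 1e-12
print(round(slope, 4))  # ~0.7002
```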
Sin tack
the any key okay
|
|
|
|
|
Actually I'm not sure why we don't stick to more fundamental units like that. 2π rad = a full circle - what could be easier? And frankly I'd be happy to switch to Kelvin if it meant never having to look at another negative temperature.
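Kelvin is just Celsius shifted by a constant offset, and it can never be negative above absolute zero - a one-line conversion, sketched here for concreteness:

```python
def celsius_to_kelvin(c: float) -> float:
    # Kelvin = Celsius + 273.15; 0 K is absolute zero, so no
    # physically meaningful temperature is ever negative.
    return c + 273.15

print(celsius_to_kelvin(-40.0))  # ~233.15, still positive
```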
cheers
Chris Maunder
|
|
|
|
|
I'd be happy just to never feel another negative temperature.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
It seems we've come 2π rad.
/ravi
|
|
|
|