The Lounge is rated PG. If you're about to post something you wouldn't want your
kid sister to read then don't post it. No flame wars, no abusive conduct, no programming
questions and please don't post ads.
I saw a video on this subject recently, and this is why US dates are written as MM/dd/yyyy rather than the more common dd/MM/yyyy. Apparently it became common practice in the US to say dates as "August 3rd" or "December 15th" rather than the other way around, and that carried over into the short numeric form used to express a date.
You get used to it. You never like it, but you learn to live with it.
0) How the data is handled in the back-end.
1) How the data is entered on the front-end.
2) If and how "1" is labelled on the UI.
Given your subject line it seems you're more interested at the moment in 2. If the whole world had one method then problem solved - no label required. Unfortunately, all 7.5 billion* of us can't agree on much of anything so the data needs to be labelled. Just treat it like anything that has a unit of measure. Mass, volume, speed, temperature, etc...
*Or should I write 7,500,000,000? Or is that 7500000,000? Or is that 75,00,00,00,00? Or is that 7.5 x 10^9? Or should I write 7.5 milliard?
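The grouping really does vary by culture, as the footnote jokes. As a quick illustration (a minimal Python sketch, values and spellings chosen for the example), the format spec at least makes two of those spellings reproducible:

```python
# Digit grouping differs by culture: 7,500,000,000 (US/UK), 7.500.000.000
# (much of Europe), 7 500 000 000 (SI), 75,00,00,000-style grouping (India).
# Python's format spec pins down two of the spellings deterministically:
n = 7_500_000_000
print(f"{n:,}")    # 7,500,000,000
print(f"{n:.1e}")  # 7.5e+09, i.e. 7.5 x 10^9
```

Locale-aware grouping (dots, spaces, lakh/crore) needs the `locale` module or a formatting library, since it depends on the user's settings.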
Whenever you find yourself on the side of the majority, it is time to pause and reflect. ~ Mark Twain
In short, yes. But we also have a responsibility to the user in particular, rather than just anyone willy-nilly. The fact is, people will always find a way to disagree on stuff globally. It helps us feel unique. And in fact it's quite healthy; otherwise we'd all be mindless zombies, never challenging or changing the status quo. Of course, it's all about balance; otherwise we as people would never agree on anything and thus never get anywhere.
So, in short, stuff like date format is context sensitive. It's why we have locales, etc. But you're right in the fact we as UI designers need to make certain things obvious. There are ways to do it, just a lot of people are lazy and only do the bare minimum.
I recently had a JSON feed with UTC dates in it... UTC dates, mind you!! The feed tried to get all fancy with dashes to make it look like an ISO format too. But no, the person who made the feed put it in as mm-dd-YYYY - dashes, not slashes. Let's overlook the fact that a UTC date should never be formatted this way IMO, but this dude is apparently anti-slash and anti-YYYY-mm-dd.
Point being, the dashes are a hint to most normal people that it should be an ISO date. And in a way that's what the UI should try to accomplish: find ways to give us hints and visual cues about something. When done properly they help tremendously. Just don't clutter up the screen with information overload; find ways to keep the cues subtle.
And another way of helping people figure it out is to give the user what they expect to see in the first place, depending on the context.
Anyway, hopefully in the future paper receipts will be digital and they'll account for this kinda stuff.
Particularly annoying with receipt printers is that the processor (the company that handles the CC transaction) often dictates the format of the receipt in case of a dispute, including the date format, specifically so that it is unambiguous to them. At least, such was the case when I had to generate the receipts for CC purchases and check cashing.
I don't display dates in mm/dd/yy unless the stake holder demands it. I always prefer to use dd-MMM-yyyy for the very reason you cite.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 - You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 - When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
The real goal is to switch everybody to yyyy-mm-dd (e.g. 2017-08-07).
1. It contains all the needed information.
2. It's good for sorting because it goes bigger --> smaller.
3. No one would get confused over which number means what (bigger --> smaller principle).
4. It's already used by 1.5 billion Chinese and possibly in other places in Asia too.
While we're at it, we should also switch to a 24H clock (basically for the same reasons).
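The sorting claim is easy to demonstrate - a quick sketch (Python, dates invented for illustration): ISO-style strings sort chronologically as plain text, while mm/dd/yyyy strings do not.

```python
# yyyy-mm-dd runs from most significant field to least significant, so a plain
# lexicographic sort is also a chronological sort. mm/dd/yyyy has no such luck.
iso = ["2017-08-07", "2016-12-25", "2017-01-09"]
us  = ["08/07/2017", "12/25/2016", "01/09/2017"]

print(sorted(iso))  # ['2016-12-25', '2017-01-09', '2017-08-07'] - chronological
print(sorted(us))   # the 2016 date lands last - not chronological
```

This is exactly why the format is popular for file names and log lines: no date-parsing code is needed to keep listings in order.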
If you like it: use it in your daily life!
You might still want to use mm/dd/yy on your IRS reports, but other than that - use it whenever you can and by god we will make the world unite around this one proper date/time format.
Unlike programmers, who always simplify THEIR OWN lives, there are normal people who don't care about your way of sorting.
We should think about the usage of dates in every case, not only the dates of files! And if you're human, you hardly like months in numeric form. What's that month "10"??? November? August? I don't give a damn - it should be LETTERS! And of course, if you're not a time traveler, you know what year it is today! So first place should be occupied by the most important parts: day and month. My preference is "dd MMM yyyy" - it's HUMANE and it's convenient. And it never leaves you guessing what the order is. "17 Feb 2017" - simple, obvious.
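For what it's worth, "dd MMM yyyy" is a one-liner to produce - a minimal Python sketch (the function name and month table are mine; the table is spelled out so the result doesn't depend on the system locale, whereas `strftime("%d %b %Y")` would vary with it):

```python
from datetime import date

# Abbreviated English month names, spelled out so the output is
# locale-independent (strftime's %b would follow the system locale).
MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def dd_mmm_yyyy(d: date) -> str:
    """Render a date as e.g. '17 Feb 2017' - no guessing the field order."""
    return f"{d.day:02d} {MONTHS[d.month - 1]} {d.year}"

print(dd_mmm_yyyy(date(2017, 2, 17)))  # 17 Feb 2017
```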
During the US military attack on Iraq in 2003, Americans renamed (or tried to rename) "french fries" to "freedom fries" to indicate their dissatisfaction with French opposition to the bombings.
I think that a nice similar political action, in these days of terrorism from the Arabic world, would be to reverse the digit order in our numbers. Our digits are, as most people know, Arabic numerals - exactly the same as the digits used in Arabic writing. Except that in Arabic they are little endian, while in Latin script they are big endian: we write them the same, but the Arabs read them from right to left, and we read them from left to right. To clearly dissociate ourselves from Arab culture, we should reverse the digit order and reject doing it "the Arab way".
This of course pinpoints another problem: By trying to be different from the Arabs, by writing numbers the other way around, in another sense it makes us more like them! They do it little endian, we change to little endian, too... Hmmmm....
First off, I'd answer your actual question (do devs have a UI responsibility?) with a resounding "yes".
As to why US companies insist on forcing the rest of the world to use an illogical date format that practically no other country uses? I think it's the same underlying reason why we get poor UI's in general: wrong mindset.
As a dev, your job is to write software for your users, not for yourself/your ego. This is something a lot of developers seem to forget/aren't aware of. That means putting yourself in your users' shoes and designing software that meets your users' functional requirements while also being a joy to use.
Poor UI design generally is the result of only the first part of this equation being deemed important: meeting the functional requirements. The ease of use is an afterthought, and you can guarantee the dev never actually used the UI in a real-life situation (they might, if you're lucky, test a few boundary cases). A prime example of this was the drop-down list in Outlook years ago for choosing the year of birth for a contact. Not only was this a drop-down list with 100 entries or so (one per year), the default value was the current year at the top of the list.
Now if they'd tried entering the data of a real person or two, they would immediately have realised that none of their users will have an address book full of babies born this year, and that selecting something like "1967" from a massively long list is a UI fail.
But if all you test is "date in the past, today, date in the future", you won't ever see the massive design flaw. MS fixed this in a subsequent update.
And this mindset - whereby people find it difficult to empathise and put themselves in other people's shoes - is also the reason for these weird date formats nobody (apart from Americans) wants.
As a programmer, you must know that there are different date formats (you should at least know ISO and your local format if it's not ISO). So it can't be ignorance, it can only be an unwillingness to put yourself in your users' shoes. Because if you did, you'd immediately understand that no one wants to use an unfamiliar and confusing date format. If your mindset is "we don't care about anyone else", your software suffers as a result. And it won't only be the date format that suffers.
Somewhere a programmer decided to output the date this way.
I have another thought about this, on this surprisingly long thread.
A US date is written the way we say it: August 7, 2017. I don't know all of the Europolyglot, but I believe a German date would be 7 August 2017. These are how they are spoken. Now, just transform them to a number format whilst maintaining this order natural to the reader.
At least to begin with.
Conventions and adaptations have followed suit. Consider: even the metric system could be recast with a different-sized 'meter', and everything else calculated to remain the same relative to the alternate size (rename them if you wish). But it isn't that way. Only one base-ten system was adopted by (so far as I know) all parties using it.
This desire for convention is relatively new. Think of what is now the ultimate example of adopting convention: the Euro. With its particular value. Not a value with any real symbolic significance (or perhaps, trying to be similar to the US dollar in value!)
So - unless it's from a purely logical standpoint (YYYYMMDD), which is useful for sorting, or the internal representation we so love (which, you recall, is actually a floating-point value - stardates, like in Star Trek), the date is for a human-readable form (UI, remember!). Honestly, doesn't it make sense to write it the way the reader will naturally say it?
One school of thought says that I should use the default format because the platform should take it from the OS preferences. In addition, if those are wrong, it is not my fault... right?
Well, no. I should take the default date format as another external input, and as any external input, I should not trust it.
Thus, I decided a while ago to use the one and only true date format standard: ISO 8601. For dates, that would be yyyy-MM-dd...
... Unless the client asks for something else, that is. They rarely do. Why may they ask for something different? Here are a few possible reasons:
- The software needs to interact with software made by a third party that expects the default date format.
- The above can happen for legal reasons. For instance, in my country there is a file format designed by the government that uses dd/MM/yyyy. At least it doesn't use two digits for the year.
- The client wants a date format that spells out the month name. Translating those is another issue, so at that point I will probably be using a library that supports this, instead of the default format functions.
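For the curious, ISO 8601 needs no format string at all in most languages - a minimal Python sketch (dates invented for illustration):

```python
from datetime import date, datetime, timezone

# date.isoformat() is yyyy-MM-dd out of the box - no locale involved.
print(date(2017, 8, 7).isoformat())  # 2017-08-07

# For timestamps, store UTC and say so explicitly in the output:
ts = datetime(2017, 8, 7, 13, 5, 0, tzinfo=timezone.utc)
print(ts.isoformat())  # 2017-08-07T13:05:00+00:00
```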
I disagree with the foundation of your postulate. I think it's our job, as developers, to remove as many barriers to understanding as possible, but ambiguity exists in the context of the interpretation, not in the processing and presentation of the data.
The example that you cite is almost the perfect example: there is a cultural difference between you and the developer of a piece of software. If providing services to your cultural norm is not part of the system design, that UI designer would be doing a BAD job by formatting for Canadian norms. A team member that likes to jump on the "added features" train is a liability; I know this too well as I'm often that guy and need to get slapped with YAGNI periodically.
If you are the target market, ergo formatting for your culture (or just general internationalization) is part of the system spec, then what you have is a badly designed product and it might be time to look to alternatives.
Honestly, though, kvetching about cultural differences is counterproductive and useless. Especially when you're all wrong and yyyyMMMdd does the best job of removing ambiguity.
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli
I've always thought that the entire world was wrong. In my mind the only reasonable expression would be YYYYMMDD. That should suffice for the next roughly 8k years, and by that time I really don't give a rip!
These days, with more self-taught programmers in the market, it is no surprise to find anything UI or GUI ignored. My background is as a Mac programmer, and they had an entire manual on user interface design. Buttons were to be x pixels wide and y pixels tall. Today, finding even evenly spaced buttons is a joy.
I not only feel responsible for making the GUI make sense, but to make the user experience as easy and intuitive as possible. Most projects do not allow me time to go back and "clean up", so extra effort has to go in at the beginning to click all the proper properties and use the correct widgets.
Back to dates: since I deal with a lot of vendors and different business sites, my default is YYYY-MM-DD, with YYYY-MMM-DD for monthly reports to differentiate them from daily or one-time ones. A side advantage of this is that daily or one-time reports sort properly in the directory.
I have found that projects dealing with multiple vendors and multiple departments quickly adapt to this and I do not have to enforce it. That makes the project run smoother and interactions more friendly. It is the unknown and feeling like you aren't getting requirements that usually cause friction, so it is a proactive effort to get everyone speaking the same language and feeling comfortable.
Yeah, I liked that Oracle defaulted to dd-MMM-yy, and we used that a lot.
Over the years, I ultimately came to prefer some variation of YYYYMMDD and use that for timestamps on files, etc. I find it sorts nicely, is totally unambiguous, and when combined into YYYYMMDD_HHNNSS it still sorts, and moves the ball forward.
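That YYYYMMDD_HHNNSS scheme is a one-liner in most languages - a small Python sketch (the function name is mine, just for illustration):

```python
from datetime import datetime

def file_stamp(dt: datetime) -> str:
    """Compact yyyymmdd_HHMMSS stamp: file listings sorted by name
    come out in chronological order."""
    return dt.strftime("%Y%m%d_%H%M%S")

print(file_stamp(datetime(2017, 8, 7, 9, 30, 5)))  # 20170807_093005
```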
But don't get me started on AM/PM... Who thought of that? And what were they thinking? And time zones, and daylight saving time. Obviously not a lot of computer planning went into any of this, when the first PC could not represent a date before 1/1/1900, lol.
Finally: where is the metric system when you need it? There should be 10 seconds per minute, 10 minutes per hour, 10 hours per day, and 10 days per month, etc... How much easier would life be then?
First, you have to know your audience. The software I develop is only used in the US, so 06/07/2017 is unambiguous.
That being said, it could be improved. Why shouldn't we say June 7, 2017, just in case we ever get a ROW contract?
To answer your question, UI is our responsibility in a similar way that good design principles, good data structures, etc are our responsibility. Part of our role is to be consultants to the business people, and point out these kinds of problems when we think of them.
In the same way we might raise issues with how data actually is related when gathering requirements (for example), by asking deeper questions during the requirements phase based on our experience, we should raise these kinds of issues at that time.
As for this specific issue, though, just having a software standard seems like a good idea: dates should be displayed as <good worldwide format> or <user's specified culture> or whatever works for you.
I prefer some variation on yyyy-MMM-dd, although for some cultures it may be yyyyy-MMM-dd or even yyyyyy-MMM-dd. But that's because I like easy sorting, something that is easy to do with a computer but not so easy in a paper ledger.
If you know that the application is going to be used on a system that supplies a default date formatter, always use that. If the user doesn't have it set to what they like, at least it will be consistent with most of the other software the user uses.
Or you can work in an industry that specifies the format that everybody has to follow. In my case that is ddMMMyy or ddMMMyyyy, both of which are a pain to sort if all you have is text.
But to answer the question...
For a legacy UI that you don't have time/budget/permission to recode and regression test, stay with the same format of data display. Changing it for your piece will generate user irritation because it is different, or will make them irritated with the older portion because it's not as nice as the new part.
For new code, use system defaults. Maybe add a section to the documentation about setting the system date format. Of course, that has its pitfalls as well.
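The "use system defaults" advice above can be sketched as follows (Python; the actual output depends entirely on the user's locale settings, which is the point):

```python
import locale
import time

# Ask the OS for the user's own conventions (LANG / LC_TIME environment)
# instead of hard-coding a format. Fall back to "C" if the locale is absent.
try:
    locale.setlocale(locale.LC_TIME, "")
except locale.Error:
    pass

# "%x" renders the date however the user's system is configured to:
# e.g. 08/07/2017 on en_US, 07.08.2017 on de_DE.
print(time.strftime("%x"))
```

The pitfall mentioned above is visible here too: if the user never configured their locale, you inherit whatever the OS default happens to be.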
Yes, we have a responsibility when creating a UI. You seem to be forgetting what the U stands for. When people talk about dates, people will say "June Seventh" way more often than they say "Seventh of June". Also, when presented with a list of dates, having things in mm/dd format makes them easier to compare at a glance. If you are designing middleware you can be as logical and unambiguous as you please. But if you design a UI and prioritize your personal sense of logic and order above what the users feel is comfortable and familiar, your design will be a failure.
The way the American Papuan speaks has nothing in common with how it should be written. If you see the time "10 hours 12 minutes", IT IS NOT "a thousand and twelve", whatever military people say. Feel the difference? Same with dates, especially when only the crazy USA uses the clumsy order "month day year".
The normal order is "day month year". The day number is more important than the year. And if you exchange dates with other people, think twice before you force them to scratch their heads over your "17/7/17".
A universal date is quite simple: "dd MMM yyyy". No mistakes, no fighting, and it suits far more people than just those living in America.
Your time example actually proves my point more than yours. The format you like, dd/mm/yyyy, runs in order of increasing significance: most specific field first, least specific last. So if you were to extend that with time, it would be ss:mm:hh dd/mm/yyyy. But I am sure that seems ridiculous to you. We all use cultural rules when choosing how to display data. Do you also insist that languages that are read right-to-left should switch to left-to-right because you feel it suits more people? If you were designing a UI with a status indicator and you chose a fairly standard red/yellow/green scheme, would you refuse to change it after finding out your users were colorblind, because the scheme suits more people? The point is that it is how your users feel about the way data is presented that matters. That is a higher responsibility than adherence to rules you feel are "universal".
Yes, that works just fine unless they speak only Russian, or Chinese (which accounts for quite a large number of the humans in the world). The fact that you wrote TL;DR for a single paragraph means you have no interest in learning anything and are willing to stay ignorant. Good luck with that.
Just try to implement dd-MMM-yyyy in many places only to have the [non-technical] "stakeholder" go 'Why is the date f!@#$% up? Go fix that. Our stupid users won't unnerstand'. Yes Biff - going Biff....
Well, it always helps to have empathy for the end-user in order to build programs that are as ergonomic and intuitive as possible. But back to your case: the programmer in charge of writing the database transaction layer is supposed to make sure that the date is stored as a timestamp, whereas the UI programmer should take care of parsing this timestamp according to the end-user's localization, so that the end-user can, for example, always remotely generate a consistent receipt whatever his location.
But yes, both programmers can definitely be the same person. Presenting information to the world in various notations or languages requires more UI effort than using more standardized schemes, but it has its charms.
In all cases, if the data is presented ambiguously, then the project is an epic failure (!?), with data dumped to the end-user becoming inconsistent and therefore unusable. But in theory, and back to your initial question, inside a project with respective road-maps for several programmers, it would be the UI programmer's job to properly format data that has been consistently stored.
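The split described in that post - one layer stores a single unambiguous UTC timestamp, the UI layer renders it for the viewer - might look like this (a Python sketch; the zone names and display format are chosen for illustration, and `zoneinfo` requires Python 3.9+):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Storage layer: keep exactly one unambiguous instant, in UTC.
stored = datetime(2017, 8, 7, 18, 30, tzinfo=timezone.utc)

# UI layer: convert to the viewer's zone (and locale) at display time.
for zone in ("America/Toronto", "Europe/Paris"):
    local = stored.astimezone(ZoneInfo(zone))
    print(zone, local.strftime("%Y-%m-%d %H:%M"))
# America/Toronto 2017-08-07 14:30  (UTC-4 in August)
# Europe/Paris    2017-08-07 20:30  (UTC+2 in August)
```

The same stored instant produces a different, locally correct rendering for each viewer, which is exactly the "consistent receipt whatever his location" idea.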
Developers have a responsibility to understand the system that they are building, and preferably before they start building the system IMHO. How the system is intended to be used (via a User Interface), and by whom, is part of that understanding. Therefore the answer has to be 'Yes'.
Wow, I was surprised that this is even a question but pleased so many believe they are responsible for a user-friendly UI.
Absolutely, even if the programmer is given specific specifications, they need to validate what the user experience will be. We do not program stuff just to move data around. Ultimately a user must interface with the information and when they do, it should be as intuitive as possible. Nothing is more crazy making for the user than a UI that is confusing or ambiguous when the designer and programmer (and the whole team) have the power to make clean sensible user experiences.