The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Oddly, I believe it is the artist within each person. I was a scientist, a mathematician. Artists were that odd lot that lived in the eclectic dorm. I have never had any musical or artistic talent, and no desire to perform in front of an audience. Nonetheless, the "fun", I think, is the creative element of any endeavor.
Architecting a building is fun; engineering the stress requirements is tedious. Developing an operational budget is engaging; accounting for individual line-items sucks. Developing a novel is captivating; cranking out formula romance novels is dreary.
"Art" is a part of software development. Every developer I know spends 90% of his time "painting the screen": what color to use, where should that button be located, etc. Sadly, that means he then has to find and cobble together the "boring" pieces, in his remaining 10% of time, that actually do the work.
Forty years ago, we created "masterpieces" of coding that today would repulse us. Today, we have "experts" who have analyzed, homogenized and pasteurized every aspect of software development. We have developed repeatable processes and reportable performance metrics. We read this expertise and adopt it, or have it foisted upon us, so that we, too, can be "respectable". Throughout history, art has NEVER been respectable. So, we are creating conflict within our very selves.
In short, we turned the process into a business rather than an art, and that has drained the fun from it.
It was the same grind that it is today. Just different and more primitive.
I worked at a bank and the computer was a Burroughs 400. It took up a 600+ square foot room, had 4K of RAM and multiple tape drives, and the program had to be loaded from punch cards. Once, one of my co-workers dropped a box of several hundred cards while carrying it to the card reader to load a program. It took an extra 40 minutes or so to re-sort them into the proper order so we could run the program.
Nostalgia isn't that great if you had to live through those "good old days". One good thing about those days, we didn't have to put up with social media, "robo-calls", Uber, etc.
The more things change, the more they stay the same....
See it this way: Due to the not-so-integrated circuits back then, tinkering was pretty much the mainstream norm. Heck, I remember CRT TVs coming with a maintenance manual telling you which capacitor to change in what case.
Nowadays, it's all integrated, and mainstream tech isn't tinkerable (mostly; that dude who got an integrated headphone jack into his iPhone is a hero). Still, if you want to tinker, there's loads of fun to be had; it's just not mainstream anymore. You can order an 8-bit computer self-assembly kit with programming instructions, for example. Or order the schematics and buy the parts yourself. Or understand the schematics and overclock it. I recently backed a Kickstarter that recreates the Altair on a ruler; if it's fun to program in REAL machine language (assembly in text files that get translated into machine language is one step further away from the tech running the code), then I'll do it.
I have made a habit of naming the parameter when supplying constants whose semantics the context doesn't clearly indicate, even when the parameter is NOT defaultable.
Sometimes, even literal values are self-explanatory from the function name, like SetLightSwitchOn(false), but my threshold for saying 'No, it isn't quite clear enough' has been adjusted down in several steps.
If my memory is correct, the option to name non-defaultable parameters is quite new in C#. In some of my older code, from when only defaultable parameters could be named, I specified a default that was never invoked, simply to allow me a compact way to document the semantics in every call. Yes, a comment would have served the same purpose, but naming the parameter is more pointed and briefer. Besides, it stimulates you to give even parameters good, descriptive names when you know that the name will be spread all over the application; you can't just say 'Just read the function and see how it is used; its name is inessential'. Exposing your naming makes you a more responsible programmer.
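A minimal C# sketch of the habit described above. Only SetLightSwitchOn comes from the post; the House class and the SetThermostat method are invented here purely for illustration:

```csharp
// Sketch of call-site named arguments in C# (available since C# 4.0).
// SetLightSwitchOn is the post's example; everything else is made up.
public static class House
{
    // 'isOn' has no default value, yet it can still be named at the call site.
    public static void SetLightSwitchOn(bool isOn)
    {
        System.Console.WriteLine(isOn ? "Light on" : "Light off");
    }

    // A less self-explanatory signature, where naming the constant helps:
    public static void SetThermostat(double degreesCelsius)
    {
        System.Console.WriteLine($"Thermostat set to {degreesCelsius} C");
    }
}

public static class Demo
{
    public static void Main()
    {
        House.SetLightSwitchOn(false);              // clear enough from the name alone
        House.SetThermostat(degreesCelsius: 21.5);  // the argument name documents the unit
    }
}
```

The second call shows the payoff: a bare `21.5` says nothing, while `degreesCelsius: 21.5` documents the semantics at every call site without a comment.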
The solution is wrong, but the answer is right.
One of the chemicals used in old-style photograph production is Developer - a fluid you immerse the exposed negative in to reduce the silver halides into metallic silver and convert the latent image to a visible (but colour reversed) image ready for Fixing and Printing.
Sent from my Amstrad PC 1640
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
Default methods, great idea. If they switch .net to use binary interfaces this could be implemented by placing the default method at a pre-defined location of the interface, maybe the fourth entry in the interface so it's all standardised.
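For reference, default interface methods did land in C# 8.0 (on .NET Core 3.0 and later). A minimal sketch, with names invented for illustration:

```csharp
// Minimal sketch of a default interface method (C# 8.0+ / .NET Core 3.0+).
// The ILogger and ConsoleLogger names here are invented for the example.
public interface ILogger
{
    void Write(string message);

    // A default implementation: implementing classes that don't provide
    // their own body inherit this one.
    void WriteError(string message) => Write("ERROR: " + message);
}

public class ConsoleLogger : ILogger
{
    public void Write(string message) => System.Console.WriteLine(message);

    // WriteError is not implemented here; the interface's default body is
    // used, but only when the instance is accessed through ILogger.
}
```

One wrinkle worth knowing: the default body is reachable only through the interface type (`ILogger logger = new ConsoleLogger();`), not through the concrete class directly.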