The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
You make me want to go down in my basement to dig up all my old BYTE issues from the late 1970s - I never got my collection complete from Issue #1, but it is close.
One of the 70's DIY projects that I remember well was a computer-controlled wood stove. This guy had built a container for finely cut wood (it wasn't pellets, but roughly that size, I believe) with a funnel into his stove, so the wood could fall down by gravity. This must have been a few years before the IBM PC; the computer may have been an Altair or an Imsai, controlling the motor opening the hatch allowing more wood to fall down, and the motor opening/closing the air vent. What I don't remember is how he read the inputs - you couldn't simply buy a USB thermometer in those days...
When the magazines are ten years old, everybody asks, "Why don't you throw that old sh*t out?" If you stubbornly cling to the magazines until they are forty, everybody gasps: "What a treasure!"
I had a huge number of magazines for quite a while including Byte, PC, and a bunch of IEEE and ACM journals. Sadly, I had to get rid of them a few moves ago. I managed to keep about two boxes of what I thought were outstanding issues though.
Because this was the heyday of the hobbyist. Then we shot ourselves in the foot by getting jobs, thinking we could get paid to do this fun stuff. It was still fun for a while. Then eventually it became a job.
In my youth, I seriously considered pursuing a music education, becoming a professional in the field of classical music. Except for the small problem that I was far from good enough: When I now look at those musicians who have managed to get safe jobs in a symphony orchestra - what a boring life! Every week yet another symphony. Maybe accompanying some soloist in a small concerto. Next week, another symphony, another concerto. Every Christmas, audiences demanding that we play all the same boring music that we have played dozens of times before...
Today I am so happy that I never got close to my old dream of making classical music for a living: I still love music! And I have (and use) the freedom to go on to other musical styles, decided by my personal preferences this year, without worrying about my income.
Rather, programming went from being fun to being a chore, just for earning money. Yet I am happy that I ruined the joy of programming and kept the joy of music, rather than the other way around.
You're looking for fun in the wrong places today. Sites like Hackaday cover low-level, in-the-weeds electronics projects now.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
Oddly, I believe it is the artist within each person. I was a scientist, a mathematician. Artists were that odd lot that lived in the eclectic dorm. I have never had any talent: musical or artistic, and no desire to perform in front of an audience. Nonetheless, the "fun", I think, is the creative element of any endeavor.
Architecting a building is fun; engineering the stress requirements is tedious. Developing an operational budget is engaging; accounting for individual line-items sucks. Developing a novel is captivating; cranking out formula romance novels is dreary.
"Art" is a part of software development. Every developer I know spends 90% of his time "painting the screen": what color to use, where should that button be located, etc. Sadly, that means he then has to find and cobble together the "boring" pieces, in his remaining 10% of time, that actually do the work.
Forty years ago, we created "masterpieces" of coding that today would repulse us. Today, we have "experts" who have analyzed, homogenized and pasteurized every aspect of software development. We have developed repeatable processes and reportable performance metrics. We read this expertise and adopt it, or have it foisted upon us, so that we, too, can be "respectable". Throughout history, art has NEVER been respectable. So, we are creating conflict within our very selves.
In short, we turned the process into a business rather than an art, and that has drained the fun from it.
It was the same grind that it is today. Just different and more primitive.
I worked at a bank and the computer was a Burroughs 400. It took up a 600+ square foot room, it had 4K of RAM, multiple tape drives, and programs had to be loaded from punch cards. Once, one of my co-workers dropped a box of several hundred cards while carrying it to the card reader to load a program. It took an extra 40 minutes or so to re-sort them into proper order so we could run the program.
Nostalgia isn't that great if you had to live through those "good old days". One good thing about those days, we didn't have to put up with social media, "robo-calls", Uber, etc.
The more things change, the more they stay the same....
See it this way: Due to the not-so-integrated circuits back then, tinkering was pretty much the mainstream norm. Heck, I remember CRT TVs coming with a maintenance manual telling you which capacitor to change in which case.
Nowadays, it's all integrated and mainstream tech isn't tinkerable (mostly - that dude who got an integrated headphone jack into his iPhone is a hero). Still, if you want to tinker, there's loads of fun to be had. It's just not mainstream anymore. But if you want, you can order an 8-bit computer self-assembly kit with programming instructions, for example. Or order the schematics and buy the parts yourself. Or understand the schematics and overclock it. I recently backed a Kickstarter which recreates the Altair on a ruler; if it's fun to program in REAL machine language (assembly in text files that gets translated into machine language is one step further away from the tech running the code), then I'll do it.
I have made a habit of naming the parameter when supplying constants whose semantics aren't clearly indicated by the context, even when the parameter is NOT defaultable.
Sometimes, even literal values are self-explanatory from the function name alone, like SetLightSwitchOn(false), but my threshold for saying 'No, it isn't quite clear enough' has been adjusted down in several steps.
If my memory is correct, the option to name non-defaultable parameters is quite new in C#. In some of my older code, from when only defaultable parameters could be named, I specified a default that was never invoked, simply to allow me a compact way to document the semantics in every call. Yes, a comment would have served the same purpose, but naming the parameter is more pointed and briefer. Besides, it stimulates you to give even parameters good, descriptive names when you know that the name will be spread all over the application; you can't just say 'Just read the function and see how it is used; its name is inessential'. Exposing your naming makes you a more responsible programmer.
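A minimal sketch of the habit described above (the method and parameter names here are made up for illustration, not from any real API - and note that C# named arguments can be used on any parameter, whether or not it has a default):

```csharp
using System;

class NamedArgumentDemo
{
    // A boolean whose meaning is already obvious from the method name alone.
    static void SetLightSwitchOn(bool on) =>
        Console.WriteLine(on ? "Light on" : "Light off");

    // Two numeric constants that are easy to transpose without names;
    // neither parameter is defaultable.
    static decimal NetAfterFee(decimal amount, decimal feePercent) =>
        amount * (1 - feePercent / 100m);

    static void Main()
    {
        // Clear enough without naming - the method name carries the semantics.
        SetLightSwitchOn(false);

        // Named arguments document each constant at the call site,
        // even though neither parameter has a default value.
        decimal net = NetAfterFee(amount: 100m, feePercent: 2.5m);
        Console.WriteLine(net);
    }
}
```

The call site reads as documentation: a reader sees immediately which constant is the amount and which is the fee, without opening the method's definition.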