|
Amen to that brother. It's a good trade off too. For that trade we keep our sanity.
Jeremy Falcon
|
Wise choice bro. I might have gone insane if I had forced myself to learn all of these: AngularJs, BackboneJs, EmberJs, Web Toolkit, jQuery, MooTools, React, OpenUI5, Smart Client, UnifiedJs, VueJs, and Webix.
|
Just stay with:
_start:
    mov edx,len    ; arg3: message length
    mov ecx,msg    ; arg2: pointer to the message buffer
    mov ebx,1      ; arg1: file descriptor 1 (stdout)
    mov eax,4      ; syscall number 4 = sys_write (32-bit Linux)
    int 0x80       ; trap into the kernel
Works for me!
User: Technical term used by developers. See Idiot.
|
Wow, you're using 32-bit registers now... high tech bro.
Jeremy Falcon
|
That won't work on Linux
We can’t stop here, this is bat country - Hunter S Thompson RIP
|
Some of my coworkers are in their 60s and can debug any problem like it's nobody's business, because they learned low-level skills that have followed them throughout their entire careers. They have an understanding of the inner workings that the n00bs can only dream of.
These days there are too many people in this field who'd have to call their IT support department because you disconnected their keyboard while they were away at lunch. The framework, library, or language of the day they were experts in 3 years ago is useless today, and their skill set simply can't be adapted to new environments/situations. Those who are worth keeping around in the long term are few and far between--that's why there are so many job-hoppers.
|
Totally agree man. Gotta know the basics and have a strong foundation with just about anything in life you want to be good at.
Jeremy Falcon
|
Unfortunately, a lot of the useful knowledge that's indicative of a serious programmer gets drowned out in today's application process. I applied for what was described as a senior position a couple of months back with a local government bureau. An actual part of the interview I remember:
Them: "So what's an interface?"
Me: "A contract. It specifies a minimum requirement without specifying a concrete implementation. Kinda like 'I don't care what object you are, as long as you can do X, Y, and Z we're good.'"
Them: "What's a WHERE clause?"
Me: "A predicate to filter SELECT results."
Them: "Ok, any questions for us?"
Me: "No questions about design patterns, architecture, query optimization, PK/FK decisions, index clustering, version control, deployment, etc?"
Them: *Look at each other* "No."
I never heard back. I think I'm just terrible at interviews.
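The "interface as a contract" answer above can be sketched in a few lines of Python. The names here (Drawable, Square, render) are hypothetical, just to illustrate the point: the caller doesn't care what concrete object it gets, only that it honors the contract.

```python
from abc import ABC, abstractmethod

class Drawable(ABC):
    """The 'contract': any Drawable must provide draw() and area()."""
    @abstractmethod
    def draw(self) -> str: ...
    @abstractmethod
    def area(self) -> float: ...

class Square(Drawable):
    """One concrete implementation that satisfies the contract."""
    def __init__(self, side: float):
        self.side = side
    def draw(self) -> str:
        return f"square({self.side})"
    def area(self) -> float:
        return self.side * self.side

def render(shape: Drawable) -> str:
    # Works with *any* object that fulfills the Drawable contract.
    return f"{shape.draw()} covering {shape.area()} units"

print(render(Square(3.0)))  # → square(3.0) covering 9.0 units
```

Swap in any other class implementing draw() and area() and render() keeps working unchanged.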
|
Were you being interviewed by HR people, or actual developers? If the latter, I suspect they immediately understood you were going to make them look bad.
It's probably just as well you didn't hear back from them. In hindsight, perhaps the question you should've asked them is how they managed to get their jobs...
|
I got the impression one person was definitely HR and one probably a developer. The third person I honestly couldn't place as he didn't say much beyond the greeting. I was just kinda dumbfounded. If I was hiring a carpenter to build a house I wouldn't ask him "Do you know what a hammer is? What about wood? Alright, that's all I need."
|
Debugging and testing are the most valuable skills, and they're seldom taught.
|
dandy72 wrote: These days there's too many people in this field who'd have to resort to calling their IT support department because you disconnected their keyboard while they were away at lunch time.
With a vast array of desirable business technology needs, people specialize. Long ago, the person who built a log cabin could also dig the outhouse latrine, but today I don't expect the cable guy to fix my toilet.
|
While I agree with your assertion in the general sense, are you saying it's ok for people to never try to do anything, ever, that deviates from the only script they've learned to follow? If that's the case, then the automation revolution can't get here fast enough, because clearly nothing of value will be lost.
|
dandy72 wrote: are you saying it's ok for people to never try to do anything, ever
No.
|
Jeremy Falcon wrote: you learn X, Y, and Z.
I wasn't there either, but if I understand correctly, you didn't learn all three.
You picked your career path and then learned COBOL or FORTRAN or ASSEMBLY.
Or, you learned Pascal and BASIC and hoped to get a job teaching.
|
Or Algol. There's a good chance if you were a programmer in the 60s you'd be exposed to Algol.
|
You say that like it is some form of harmful radiation. I like.
|
In the end, it's all about breaking down and solving problems. I'll gladly learn a new stack/framework if it solves a problem at hand. (makes or saves $) That said, I usually don't (aside from maybe reading articles) invest in learning something new just to add a feather to my cap.
On another topic, with the answers to the universe at our fingertips these days, getting by on your wits is much easier than it used to be. Either I've done it (or something like it) and can re-use the code/logic, or I can usually find something useful in less than 10 seconds using Google. This is why I haven't bought a real programming book/manual in more than 5 years. These days the only mastery required is in phrasing search terms.
"Go forth into the source" - Neal Morse
|
One cannot simply learn everything the industry uses; the same could apply to other fields too. Considering the remarkable impact that computing and engineering have made on other disciplines, and considering the modern trends in the industry, older systems and technologies get replaced by newer systems and programming languages... On a long enough timeline, the survival rate for everyone drops to zero...
Caveat Emptor.
"Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
|
To digress a little.... Time was, an intelligent and educated person could know just about everything there was to know. Literally. And from that grew the stereotype of the lone scientist in his lab coming up with some new invention to change the world... For a while, such people could exist, but not any longer. No one can know everything, not even within one subject area - the most anyone can be is a master of one or two (or more, maybe) disciplines within a subject; there is that much knowledge out there.
So science now, and in the future, is and will be a collaborative affair. The big advances now - take nuclear fusion (if it ever happens), quantum computing, or a myriad of medical advances - these aren't and won't be made by our stereotypical white-coated lone scientist in a lab, but by the collaborative efforts of different research groups around the world.
We all have to stand now on the giant collective shoulders of those around us in order to see anything.
|
Makes me think of James Burke's Connections series. The path to any discovery is usually weird, and builds on what came before.
|
The code I am trying to maintain began with a well-defined, message-based API. It was a little slow on the under-powered embedded system we started with, so "optimizations" were made. This is my current contract interface: API[^]
I now have logic in some key processing that depends on the string content of a global value. Rather than just look at the interface, I am reduced to searching through all project files for all references of said global string variable. I have code from multiple targets in the wrong files, etc. I've never seen entropy attack so fast.
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
Don't try to learn anything new until you need to. For instance, if knowing A, B, and C gets the job done where you're at now, then become proficient in A, B, and C.
When you're considering trying to find a new job (or have to because you got fired/laid off), look at what's currently in use, and learn enough to at least be familiar with the general territory, and when you do get a job do the new tech, become proficient in it.
Little secret - all programming languages are essentially the same. The primary stuff you have to learn is the topology of the framework(s)-du-jour. THAT is where the steep learning curves exist.
I've been doing this crap for 40 years, and that's the way I've been doing it the entire time.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
I'd agree with that - about the time you figure out the framework, it's just the same thing with different sauce. I pondered Xaml for a while, then I learned it was repackaged UIL from days past...
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
I was a little too young to learn programming in the 60s, but had my university education around 1980. That was an age where a lot of exciting things happened in the language world - every man had his own language. I can't count the languages I have been programming in.
So when people asked me how many languages I was familiar with, I said: maybe three or four.
First comes the algorithmic group. That covers Fortran, Basic, Algol and Pascal, including variants like Concurrent Pascal. Then come plain C and C++ and Objective-C, and lesser-known languages like CHILL and the vendor-specific Planc, and later I have been using Java and JavaScript ... and let's throw in COBOL, for good measure. Python, of course, and PHP. Those are all samples of Language One - the way of thinking is very much the same in all of these, and syntax differences are not that significant.
Second comes the "workspace" model, which is like a sandpit where you throw in and take out functions and objects over time. Smalltalk comes in this class; my main experience is with APL, which is also quite different with its extremely array-based data structures. Yet the most essential difference from the algorithmic group is the workspace model.
Third come the predicate languages, of which I have only used three variants: SQL, Prolog and SNOBOL. You could say that SNOBOL is a semi-algorithmic language; it has flow control and at the top level is rather sequential, but the more familiar you are with the language, the more you leave to predicate logic. I never became friendly with XSLT, and am happy that I managed to sneak out of it, apart from the regex I have to do every now and then.
Fourth language: Essentially Lisp, which I have only used for programming Emacs. I can see that some people like it, but it doesn't give you much help in offloading the conceptual model into the saved file; essentially you must free up space inside your brain to hold the model.
For a couple of projects, we peeked at functional programming, but I wouldn't count that as number five. We just studied the syntax of Erlang (and peeked at others from a distance), but I never really tried it in practice.
I would rather like to add, as number five: data modelling - not a complete language by itself, as it usually won't provide definitions of operations, only the interface to them. I think this language (group) is the most underrated one! A good data model, whether you use ER or ASN.1 or an XML schema, is essential to understanding the problem domain. You may say that if you use the whole OSI communication protocol stack toolbox, you do operations modelling very much in the same spirit as you do data modelling.
We may add yet another language group: state/event programming. Look at how the OSI session layer is programmed: tables with current state along one axis, event along the other, each square stating predicates and actions. No other language could possibly come close to that state/event description of the logic when it comes to clarity, conciseness and freedom from ambiguity.
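The table-driven state/event style described above can be sketched roughly like this. This is a made-up door example, not the OSI session layer; the states, events and predicates are all hypothetical, but the shape (a table mapping (state, event) to predicate, action and next state) is the point.

```python
# Table-driven state machine: (state, event) -> (predicate, action, next_state).
# Unknown events, or events whose predicate fails, cause no transition.
def always(ctx):
    return True

def has_key(ctx):
    return ctx.get("key", False)

def log(msg):
    return lambda ctx: ctx.setdefault("log", []).append(msg)

TABLE = {
    ("locked", "unlock"): (has_key, log("unlocked"), "closed"),
    ("closed", "open"):   (always,  log("opened"),   "open"),
    ("open",   "close"):  (always,  log("closed"),   "closed"),
}

def step(state, event, ctx):
    predicate, action, next_state = TABLE.get((state, event), (None, None, None))
    if predicate is None or not predicate(ctx):
        return state  # no matching square, or predicate failed
    action(ctx)
    return next_state

ctx = {"key": True}
state = "locked"
for event in ["unlock", "open", "close"]:
    state = step(state, event, ctx)
print(state, ctx["log"])  # → closed ['unlocked', 'opened', 'closed']
```

The whole logic lives in the table; the interpreter (`step`) never changes, which is what gives this style its clarity and freedom from ambiguity.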
- - -
Gee, did we have a plethora of languages in those days! Now, we have chopped off eighty percent of the algorithmic group and 100% of the rest. State/event is squeezed into a switch. Data modelling is squeezed into a C struct. The workspace concept is squeezed into C malloc, or by its modern name: new. List processing is squeezed into C linked structures ... And so on. One bad thing is that today's young programmers never ever question the () around if/while conditions (whether they program in a C variant, Python, Java, ...). When they see Pascal code, they wonder: didn't you forget the parentheses? Talk to them about workspace models, and they worry about how efficient the malloc will be in such an environment, and when a COBOL decimal field is 6 digits wide, they figure that a long long is needed to hold the value range.
Really, today's programming world is a lot more primitive than in the 80s.
And you could say the same about OSes. About file systems. About communication protocols. About user interfaces. Maybe the 80s were a chaos, but it certainly wasn't that primitive monoculture of today, where 95+% of all programming is done in C-like languages, 95+% of all data traffic is limited by the Internet Protocol family at least at some level or stage, 95% of all data is stored in a file system that provides no high level support, and for 95% of process/thread modelling the entire world uses one of two models - and those two certainly do not differ very much.
True: We have got dozens of variations on C. Dozens of variations of U*ix. Dozens of variations of sh. Dozens of variations on fork. But those are essentially small ripples on the surface. Fundamentally different ways of thinking and of doing things have weathered away. I think that is just as big a loss as the ripples are a problem.
|