The Lounge is rated PG. If you're about to post something you wouldn't want your
kid sister to read, then don't post it. No flame wars, no abusive conduct, no programming
questions, and please don't post ads.
I learned very early in my career that I was forever meeting and working with people who were far better developers than me. Which was a good thing because it gave me levels of ability to strive towards. But I never considered myself a good developer, just that occasionally I managed to produce some good code. Unfortunately I continued to make many mistakes.
If you were a mechanic, would you question your ability because you aren't skilled in all areas: general repair, electronics, small engines, diesel, foreign imports, jets, planes?
Probably not... you'd have your areas that you were competent in and work from there; you'd probably work on expanding your skill set, but I don't think you'd expect that you MUST be an expert in all areas.
So... why do we, as developers, think that we need to keep current with an ever changing playing field?
My career started with Fortran, VAX/VMS and FMS Forms; it grew to include OSISoft PI systems and then C and Ingres. Next up was Windows development, SQL Server and Oracle.
Now.. I work primarily with OSISoft based technologies - I touch Windows development and SQL Server, but only in that they are required for the core technologies I use.
In today’s market there is an ever-growing expectation for developers to be proficient in whatever technology is hot at the time, and there are quite a few at any given point. Not only do you need to know that technology inside out, you are also expected to know the tools, libraries, and frameworks built around it. It makes you question whether you are really a good developer if you don’t know the latest technologies or tools.
This is mixing apples and oranges.
For the TLDR folks:
Good developer: someone with strong overall skills, technical knowledge in specific areas, and the ability to learn new things.
Good fit: a person with the specific skills the business needs now.
Those are two different things.
First off, I'm a good developer because I have a lot of (what I think is) very good overall knowledge, practices, work ethic and discipline, including skills such as communication, documentation, thinking outside of the box, and so forth. I'm also a good developer because I can learn new things -- business processes, technologies, etc. I'm also a good developer because when I start on a project, I ask a lot of questions, I assume I know nothing, I ask why decisions were made that are contrary to how I would have done things, so I can learn what the trade-offs were, and I constantly look at what I'm doing and work on self improvement.
On the other hand, if a client / employer needs someone with a specific expertise to immediately jump in and work on something in which I don't have those specific technical skills, then I am not a good fit for their requirements. That does not mean I'm not a good developer. It only means that for their business requirements, they can't afford the (sometimes considerable) tooling up / training that I would need.
Obviously, there is the third camp, let's call those people junior developers, that are neither good developers (and hopefully will be one day with proper mentorship) nor have the technical skills. Those are the people that companies are the most reluctant to hire nowadays. The idea of grooming someone just out of school, building a long term relationship with them, having incentives for their staying with the company, etc., those days are for the most part long gone, partly because of market forces and partly because companies more and more treat people like replaceable cogs in the wheel, when actually they are not, especially once they've learned a lot of the intangible knowledge of the specific business domain. I'm not saying anything original here, this issue has been written about for years.
The other thing about modern development, and something that seems to be not fully recognized, or worse, ignored, is that the days of one man development are pretty much over. It really does require a team nowadays - front end, back end, middleware, source control, documentation, testing, QA, deployment -- and in that world, while it's still important to know your particular area of expertise well, communication, organization, and "higher level" skills are becoming (IMHO) even more important, and also what I find generally lacking are the people that can glue it all together well. You might call these people "architects", but they are certainly not the typical product/project manager.
Eloquent, as expected, Marc: I like the focus on "fit;" I wonder if that might be extended to thinking of hiring (at its most "rational") as being a cost/benefit analysis which weighs immediate fitness, potential fitness, and long-term adaptability in an industry where hardware, and software tools, will change radically every few years.
I spoke today with a former colleague (in the U.S.) from my years at Adobe; he's 59 years old, has a stunning resume: he says he's unemployable right now in Silicon Valley, and he believes that his age is a factor.
Did you ever read Coupland's book, "Microserfs" ?
«There is a spectrum, from "clearly desirable behaviour," to "possibly dodgy behavior that still makes some sense," to "clearly undesirable behavior." We try to make the latter into warnings or, better, errors. But stuff that is in the middle category you don’t want to restrict unless there is a clear way to work around it.» Eric Lippert, May 14, 2008
In today’s market there is an ever-growing expectation for developers to be proficient in whatever technology is hot at the time, and there are quite a few at any given point. Not only do you need to know that technology inside out, you are also expected to know the tools, libraries, and frameworks built around it.
I'd rather not work for someone who has those kinds of unrealistic expectations. I'd rather work for someone who values one's ability to learn and adapt to any new technology being thrown at them.
In today’s market there is an ever-growing expectation for developers to be proficient in whatever technology is hot at the time, and there are quite a few at any given point. Not only do you need to know that technology inside out, you are also expected to know the tools, libraries, and frameworks built around it. It makes you question whether you are really a good developer if you don’t know the latest technologies or tools. You don’t know those technologies because up until now you did not have a need to use them in whatever application you are building. Just because AngularJS sounds cool and everyone is using it, must I use it? One thing I have felt is that if you don’t use today’s latest fad, you are perceived to be old school and at times will not even be considered for a new role.
The problem, I think, is that people tend to equate libraries and frameworks with technology. They're not. They're approaches to problem solving at best, and like all approaches they involve trade-offs.
The bigger question is: can you write good, stable, and functional code without the new toys, understand what frameworks are doing, and play nice with them by doing a careful code-walk (assuming they're not a steaming pile of garbage, which many are).
So there's an easy answer: the only objective measure of whether you're a good software developer is whether the software that you develop is good. That's the only thing that matters; libraries and frameworks are just tools to achieve that end, and that's all they are.
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli
This may be a little late but now that it has been out in the wild, rampaging around and causing untold damage and misery, why would anyone downgrade to Windows 10 from Windows 7? (I can see why for Win8 to WinX, but not from Win7).
I am looking for good, solid reasons; not just "it's better" or "it's better on tablets" or "it's got a cooler UI" - none of which are, in my opinion, even a little bit true.
These need to be reasons that couldn't have been implemented with a quick update to Windows 7.
...and don't give any of that bull about it being more stable, because it clearly is actually less stable!
...and I don't care that it starts up faster. I rarely have to restart Windows 7; it runs for months on my desktops and laptops (with no hibernation/wake-up problems, ever).
- I would love to change the world, but they won’t give me the source code.
Because if you don't embrace change you'll end up like the COBOL programmers of today. I used to work with a COBOL programmer who couldn't get a job at times, so he delivered pizza. All because he failed to embrace change. You wanna be like the guys still stuck on XP? Sure, don't change just for the sake of it... I get that. But don't be so afraid of change that you never do.
And I say this as a person who can't stand the way Win 10 handles updates; it seems every new thing MS implements, it does so poorly until a few versions go by. So I'm not biased toward Win 10, but living on Win 7 for the rest of your life isn't a solution either.