I just read an article which argues that coding in only one programming language limits your ability as a programmer.
I'm a little confused by this argument, because I've always thought that a developer who has truly mastered one language outperforms one who uses many languages but actually knows relatively little about each of them.
So I'm not sure whether the article's claim is really correct.
I'd just like to hear more opinions on this issue, so please feel free to share your thoughts.
I suspect they're concerned about paradigms more than actual languages. If you know only imperative languages, or only functional languages, you may have trouble solving problems that are better suited to the other paradigm. You may know several imperative languages, and yet still not easily solve problems suited to functional languages.
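To make that concrete, here's a toy sketch in C++ (a hypothetical example, just to contrast the two styles): the same sum of squares computed once as an imperative loop that mutates an accumulator, and once as a functional-style fold with no explicit loop state.

<pre>
#include <iostream>
#include <numeric>
#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3, 4, 5};

    // Imperative style: step-by-step mutation of an accumulator.
    int sum1 = 0;
    for (int x : v)
        sum1 += x * x;

    // Functional style: describe the result as a fold over the data,
    // with no explicit loop variable or mutable state of our own.
    int sum2 = std::accumulate(v.begin(), v.end(), 0,
                               [](int acc, int x) { return acc + x * x; });

    std::cout << sum1 << " " << sum2 << "\n";  // both print 55
}
</pre>

Neither version is wrong, but if you've only ever seen one of them, problems that map naturally onto the other style will feel harder than they need to be.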
If your only programming language is C++, then you have no limits.
Idealistically one can argue that, but realistically it isn't true. A programmer who has spent 15 years writing embedded C++ controllers for hard drives probably isn't going to be as effective at creating a new web server on Windows as someone who has been doing exactly that for 8 years using C#.
And the reverse is true as well.
Businesses care about how much it will cost and when it will be delivered.
What pisses me off is job specs that list the source control software they use! As if that were important!
What makes a good product is good architecture and good design. Everything else after that is just implementation. Use the tools you need, be it VB, C#, or C, and if you don't know the tools then read up and start coding. Soon enough you will know them well enough to write a great project.
Oh, and most of the fancy features of a language are useless. They are just mental bling, and do not add one jot of usability to the project.
Look, as a mechanical engineer, are you expected to revel in the interface of a particular milling machine?
No, you design the product and use whatever milling machine is to hand. Period.
Besides what's already been said, to flip this around, I've grown quite skeptical of developers who know lots of languages and platforms and claim mastery in many of them. My own observation is that it takes 5 to 7 years to become an expert at anything. There are exceptions, like Marc Clifton, but they are exceptions. (To be fair, too many companies don't care--regardless of what they say, "good enough" is their actual standard of quality.)
I always found it very refreshing to learn and know multiple programming languages. I've found that broadening my horizons that way can give me unique insights or alternative looks on coding, which can be helpful and enjoyable.
I'd say it also makes communication easier with other programmers, as they might not be proficient in the same languages as you; a broader view of the coding landscape can help.
Also, knowing multiple languages doesn't prohibit you from mastering one. It can even help in this regard, as more knowledge is always welcome.
That said, I don't believe that someone who knows only one language is necessarily more limited than those who know several. As others have noted in this topic, it's also very important to be able to tackle problems from multiple angles, regardless of the language.
All in all, I'd argue that a change of scenery is a good thing, and that it can never hurt to broaden your view. It can also be a lot of fun to learn a new language, and you don't have to go as far as mastering it.
My best programming experience, the most fun I ever had, was learning Common Lisp (I started in VB6 and C++). Has this made me a better programmer? I'd like to think so, if only because I better recognise the strengths and limits of the various languages and platforms out there.
I have Microsoft Visual C++ 2008 and Windows 8. How can I make an existing project compile in static mode (not dynamic mode), so that variables and malloc()'ed storage stay in the same place through a run of the program?
A program which compiled and ran correctly under Windows Vista now acts oddly, as if declared variables and/or malloc()'ed storage sometimes move about.
That should make no difference, since all you are doing is opening the file with sharing enabled. It has nothing to do with your variables getting corrupted in memory. The fact that it works now is more luck than judgement. And given all the index values into your arrays, it's highly likely that you are overrunning your buffer, or otherwise writing to the wrong place in it.
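Here's what that kind of bug typically looks like (a contrived C++ sketch, not your actual code): an out-of-bounds write is undefined behaviour, so instead of crashing immediately, it often just corrupts whatever happens to live next to the buffer.

<pre>
#include <cstdio>
#include <cstring>

int main()
{
    char name[8];
    int  checksum = 42;  // in this contrived layout it may sit next to 'name'

    // 11 characters plus the terminating NUL is 12 bytes written into an
    // 8-byte buffer. This is undefined behaviour: it may appear to work in
    // one build or OS version and silently trash 'checksum' in another.
    std::strcpy(name, "hello world");

    std::printf("checksum = %d\n", checksum);  // may no longer print 42
    return 0;
}
</pre>

That is exactly why code like this can seem fine under one version of Windows and misbehave under another: the overrun was always there, only the memory layout around it changed.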
BTW, it makes it much easier for us to read your code if you format it with proper indents and place it between <pre> tags, like this:
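(a trivial snippet, just to show the layout)

<pre>
#include <cstdio>

int main()
{
    // indentation makes the structure obvious at a glance
    for (int i = 0; i < 3; ++i)
        std::printf("line %d\n", i);
    return 0;
}
</pre>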
"…and malloc()'ed storage stay in the same place through a run of a program?"
I doubt that concept even exists in Vista, although if you can educate me otherwise I would like to see something about it: specifically, what options you set in the compiler to get it to do that.
Excluding perhaps some of the most basic parts of the OS, and perhaps (only perhaps) some very low-level drivers, everything else is relocatable, because that is how virtual memory works. I suppose there might be a way to change the relocation address, or whatever it is called, and then, if one uses no threads and fixed input data, one run would map the same as the next. But it can still move in physical memory when it runs.
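If you want to see this for yourself, here's a minimal sketch; my assumption is that address space layout randomization is what the original poster is observing, which is a guess, not a confirmed diagnosis. Build it with Visual C++ 2008 and run it twice: with the /DYNAMICBASE linker option in effect, the module and heap addresses will generally differ between runs, and linking with /DYNAMICBASE:NO is the usual way to opt out. Note that pinning the addresses only hides a memory bug, it doesn't cure it.

<pre>
#include <cstdio>
#include <cstdlib>

int g_global;  // lives in the module's data section

int main()
{
    int   local;                    // lives on the stack
    void* heap = std::malloc(16);   // lives on the CRT heap

    // Run the program twice and compare. Under ASLR (/DYNAMICBASE) these
    // addresses usually change from run to run; with /DYNAMICBASE:NO the
    // module and heap placements should repeat far more consistently.
    std::printf("global: %p\n", static_cast<void*>(&g_global));
    std::printf("stack : %p\n", static_cast<void*>(&local));
    std::printf("heap  : %p\n", heap);

    std::free(heap);
    return 0;
}
</pre>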