|
Call it availability. C is the most available language.
There is an important point you make:
I can always (within resource limits) take a compiler/interpreter that originated on one platform and implement it for another.
"within resource limits" is the key here. C is simple. Java JVM is complicated (and owned).
It's important that you know that I'm not defending C.
It's really ugly compared to other languages and prone to error.
But it's fast (predictable fast, no GC collection pause), very "available", flexible and stable.
Most of the programs I use everyday are built in C/C++.
None in Perl. None in Java.
it's the journey, not the destination, that matters
|
|
|
|
|
ErnestoNet wrote: "within resource limits" is the key here. C is simple. Java JVM is complicated
(and owned).
Not sure what you think "owned" means, but as far as I know the term "Java" is owned while the language is licensed in such a way that I can in fact take it and create a compiler and complete JDK for a new platform free of charge.
See "Licensing" in the following.
http://openjdk.java.net/faq/
If you had to start from scratch then C, as in C11, is not trivial. And I am talking about the language and not the libraries. Implementing everything from scratch would be a large undertaking.
Of course one would start with the existing source and modify it for a new platform.
Which is the same thing one would do for Java.
ErnestoNet wrote: But it's fast (predictable fast, no GC collection pause),
Again - research Real Time Java.
Other than that modern business usage is not impacted by the modern GC.
ErnestoNet wrote: Most of the programs I use everyday are built in C/C++. None in Perl. None in
Java.
Which is why *I* said that language choice is by user preference and not availability.
Perl is available on all of the C platforms so they don't choose C because it is available.
|
|
|
|
|
pasztorpisti wrote: I'm open to any reasoning against the above list
In terms of the errors - those are not errors that show up for me. The errors that I must find involve logic errors, whether those errors are made by me or by others.
And if you spend significant amounts of time day to day on errors like those then, as best I can judge, you have a problem that can only be solved with a change in process.
pasztorpisti wrote: In my opinion tools (including dev envs and languages) are very important.
I can, and have, achieved orders of magnitude increases in performance by changing requirements.
The only way I have ever achieved anything close to that at the implementation level is when the design that led to the implementation was itself wrong. Other than that, implementation improvements can only lead to small increases when everything else is held steady.
The following is what impacts performance and even project success.
1. Requirements - most impact
2. Architecture
3. Design
4. Implementation - the least impact.
1/2 are not impacted by language, although 2 might be impacted by technology.
3 can be impacted by technology but only minimally by language.
4 is impacted by language.
So language has less impact because it sits at the bottom of that list.
pasztorpisti wrote: I think every company that continues other than garage development has some
kind of dev process,
Every company has a process. Some companies have a formalized process. The only measured and significant improvements in development have come from formalizing the process and improving that. As far as I can remember, the improvement measures consisted of fewer bugs (detected at various points in the life cycle), shorter delivery time, reduced cost and reduced maintenance cost.
pasztorpisti wrote: but that is another dimension of the problem of cutting development time and
providing quality on a completely different level: management
No it isn't. Again the only impact from the studies that I read was based on process improvements. Tools of any sort had nothing to do with it.
|
|
|
|
|
Please understand that I never questioned the importance of a development process, and the same is true for architectural decisions and design. Indeed, the biggest mistakes I have ever seen were architectural/design mistakes. Without good design, or anything above that, we have nothing to speak about. My answers assume good architectural conditions and this time focus on the beautification of the least important 4th step.
Perhaps some day I will work just with the first 3 steps and I will probably be less concerned about the language and toolset the coders have a hard time with... Seriously!
|
|
|
|
|
Richard MacCutchan wrote: When it was chosen, it was the only choice.
Really?
The Multics operating system was being programmed in PL/1.
The Burroughs B-5000 and subsequent machines had their operating systems coded in Algol with extensions. And a damn fine operating system it was.
My alma mater, Case Western Reserve University, implemented an operating system with an Algol-like language called Chili.
All of this while C was just beginning to happen and Unix was not yet born.
Richard MacCutchan wrote: I meant it was unfair in that you were judging a language developed in the 80s
by the standards of today's knowledge and technology.
Oh! Does that mean you will be saying nice things about Fortran and COBOL? Or, at least, refrain from saying nasty things about them?
|
|
|
|
|
Vivic wrote: All of this while C was just beginning to happen and Unix was not yet born.
And when UNIX was born it was developed with C (and its predecessors) in mind. Of the others I only ever worked on Burroughs' Algol based OS and it was one of the most difficult I ever tried to understand.
Vivic wrote: Does that mean you will be saying nice things about Fortran and COBOL
I often have. I worked with both languages in the 70s and 80s and found them perfectly adequate for solving specific problems. That is not the case today but it does not detract from their usefulness at the time.
One of these days I'm going to think of a really clever signature.
|
|
|
|
|
pasztorpisti wrote: The problem is that the accident has already happened and windows and linux are
already in C.
Which would be relevant except for the fact that windows has been re-written several times.
|
|
|
|
|
The sources of win2k have been leaked. Download them and look at the code. It contains tons of legacy code even from win31. Not to mention the backward compatibility between windows versions. Windows has never been rewritten.
|
|
|
|
|
I'm surprised no hackers from around the world have hacked into Microsoft(R) servers and stolen every O.S. source code or project (or at least some of them), and then published them on torrents. But Google, Bing, Yahoo and all those other people filter out those results and/or delete them. There have probably been private court orders of Microsoft vs. some hacker stealing source. But I'm pretty sure Microsoft knows more about security than any other security company out there combined, considering you have to send your driver to them to get it signed now. Basically Microsoft probably has their servers extremely hard to break through or breach; they probably have it so that if a breach or unauthorized transfer of specific files is detected, it disconnects the servers and computers and locks them down, terminating the hacking or suspicious signal. Who knows, they are probably smart and keep the internet cord unplugged, considering the fact that the government leaked their "state of the art" F-35 blueprints on the internet, where everyone can see them (technically, because if there's internet, there's hackers). Or maybe that's bogus that the government sent out to intentionally trick hackers into thinking they got the "real stuff", but who knows what crazy stuff tech companies do these days.
Simple Thanks and Regards,
Brandon T. H.
Programming in C and C++ now, developing applications, services and drivers (and maybe some kernel modules... psst, kernel-mode drivers... psst).
Many of life's failures are people who did not realize how close they were to success when they gave up. - Thomas Edison
|
|
|
|
|
Every OS contains tons of bugs; old bugs go, new ones come with the new features. Still, the vast majority of security holes are caused by buggy networking applications and not by the OS itself. The more complex a networking application, the more chance you have of a security hole in your system. (For example, a browser is quite a complex piece of software!!!) Anyway, several windows and internet explorer patches followed the windows source leak, not without reason but because they had to fix a lot of discovered and (I suspect) known but shelved (!!!) bugs that became obvious to the hackers from the sources!
|
|
|
|
|
pasztorpisti wrote: It contains tons of legacy code even from win31.
Which would be relevant if I had claimed that they had written a newer version of windows from scratch using a black box development effort.
But I am pretty sure that the vast majority of the code for some command like 'type' has essentially remained the same since pre-win95.
On the other hand I know for a fact what is required to move a code base that was originally written for 32 bits, back when that was new, to a 64 bit system.
How exactly do you think that windows handles VMing? Did you ever attempt the same thing in Windows 95? I can tell you that I did in fact run two OSes on a Win95 box and it was not easy.
|
|
|
|
|
Win95 is "just a nice DOS program". Even if they have replaced and extended parts of the OS, they always stayed backward compatible at the API and source level, and that alone is a good reason for not switching the language, even if all they do is wrap a new kernel. You can take the sources of a pretty old windows program and, with minimal or no modifications, you have good chances of compiling it even for a 64 bit system. I think this backward compatibility drains a lot of their energy and it implies some practical restrictions (like language).
|
|
|
|
|
pasztorpisti wrote: Win95 is "just a nice DOS program"
The point however is that windows has been re-written.
pasztorpisti wrote: You can take the sources of a pretty old windows program and with minimal or no modifications you have good chances to compile it even for a 64 bit system
I suspect I can take 'cat' from a pre windows unix variant and with "minimal" modification get it to work on a windows 7.
That however doesn't mean that windows wasn't re-written.
|
|
|
|
|
jschell wrote: The point however is that windows has been re-written.
Well, from our point of view it's not important how much they extended or replaced their codebase. It's a fact that they kept backward compatibility with their old API, and that doesn't leave much space for practically changing the underlying language. Lots of parts have been tweaked and replaced in windows in the past decade, but I was shocked by how many hacks have been kept in the code, and the win9x compatibility layers too. I'm pretty sure the level of backward compatibility they maintain often ties their hands pretty tightly. By hack I really mean a (sometimes really dirty) hack, for example to avoid crashes of specific popular old programs that have bugs which don't crash the program on older systems but, without special handling, would simply die on NT. And the hacks are explicitly commented with bug IDs and reasonings. This is another hidden face of windows' backward compatibility. I think what makes windows successful also holds it back in development, but I really respect the MS coder guys for what they achieved. Keeping backward compatibility on such a large scale is tremendous work.
They don't rewrite everything. What I was curious about in the sources was module loading, which consisted of pretty old sources with a lot of win3.1 and win9x code. They use a lot of typedefs that make porting relatively easy even to 64 bit, and that's quite OK as long as they keep up backward compatibility.
jschell wrote: I suspect I can take 'cat' from a pre windows unix variant and with "minimal" modification get it to work on a windows 7.
That however doesn't mean that windows wasn't re-written.
I wouldn't compare the complexity of a cat program with even the simplest windows GUI program. The same is true for the winapi versus posix. The posix API doesn't contain calls that internally enforce serious windows-specific compatibility restrictions. On the other hand, if a program for example subclasses a windows common controls dialog and expects the border size to be X pixels, and hacks around with GUI hooking and expects you to send unrelated window messages to avoid a crash, then you have a very complex (and sometimes not too well designed) API system to simulate natively, and that's nowhere near as easy as implementing a pure, simple posix API.
Well, we are talking about why they haven't changed the language, and the answer is clear: source level backward compatibility. It's pointless spinning around how much of windows they rewrote - it's simply meaningless; once they decided to keep backward compatibility, it's impractical to start thinking about changing the language.
|
|
|
|
|
Windows isn't written in Basic.
So based on your reasoning why does Microsoft keep producing that?
|
|
|
|
|
Ah, sorry, I misunderstood your post, so I've corrected my answer. I don't know the answer to that and I'm not really interested in researching one because I'm not a big Basic fan. I could just guess and I don't like doing that. You can however open a new topic for that and then everyone can tell their opinions or whatever they know about it. But how is this related to bad/good things in C/C++?
|
|
|
|
|
pasztorpisti wrote: like header files that terribly slow down the compile time
Have you any proof supporting this sentence, regarding the C language?
pasztorpisti wrote: Again, the only reason for the existence of C/C++ is massive amount of legacy code
This is an opinion (mine, for instance, is completely different).
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: Have you any proof supporting this sentence, regarding the C language?
Yes. In recent years we had to rearrange the header includes and optimize for compile times for our CI system many times. We compiled the codebase (~2 million LOC) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid, and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. That's what I'm talking about, not some few-file hobby project where it makes no sense to measure such things.
CPallini wrote: This is an opinion (mine, for instance, is completely different).
And could you make a list of language features and compare it to some other languages that have better support for them? I see significant deficiencies in C++ compared to some other languages, and its syntax becomes more and more complex with every new draft. A language that has redundant features and backward compatibility with a thousand-year-old other language simply can't be "optimal".
|
|
|
|
|
pasztorpisti wrote: Yes, in the last years we had many times when we had to rearrange the header includes and optimize for compile times for our CI system. We compiled the codebase (~2millions loc) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. Thats what I'm talking about not some few file hobby projects that make no sense to measure such things.
Still, it is not proof. You should compare it to the compilation time of a similar project written in your favourite language and achieving the same performance (if your favourite language could assist you with that).
pasztorpisti wrote: And could you make a list of language features and compare that to some other languages that have better support for that?
C and C++ are performant. No other language (other than assembly) compares with them. You should know that.
pasztorpisti wrote: I see significant deficiencies in C++ comparred to some other languages, and its syntax because more-and-more complex with every new draft.
Whereas, for instance, C# syntax is becoming simpler?
pasztorpisti wrote: A language that has redundant features and backward compatiblity with a thousand years old other language simply can't be "optimal"
Still, it is compatible.
I wouldn't call it 'optimal'. However I like it (this doesn't mean I show apparent disgust for other languages - with the very exception of COBOL).
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: Still it is not a proof. You should compare it to the compilation time of a similar project written with your favourite language and achieving the same performance (if your favourite language could assist you on that).
It already proves that header files suck. On the other hand, I have worked with similar project sizes in Delphi and Java too. Both outperform C++ in compilation many times over, on much weaker hardware configurations.
CPallini wrote: C and C++ are performant. No other language (other than assembly) compares with them. You should know that.
It's not C++ that provides the performance, it's the underlying compiler architecture. Lots of other languages can also produce code with good performance (like Pascal/Delphi). It's often better to leave assembly generation to the compiler because of optimization. Very few cases justify the use of 'manual assembly'.
CPallini wrote: While, for instance, C# syntax becoming simpler?
No, but I never said that C# is a good language, especially in this respect. Take a look at Java: it's older than C# and has still preserved its superb simplicity; even the formatting of sources and the arrangement of projects is unified. This is something respectful and exemplary.
CPallini wrote: Still is compatible.
I wouldn't call it 'optimal'. However I like it (this doesn't means I show apparent disgust for other languages - with the very exception of COBOL).
No, it isn't compatible at all. Some libraries require very heavy modifications to get them to compile with your compiler. Even some "late" language features like templates work totally differently in major compilers (like VC++ and gcc), requiring you to pay attention to avoid compiler specific bugs(!!!) in your project. That's terrible.
|
|
|
|
|
pasztorpisti wrote: On the other hand I worked with similar project sizes in Delphi and java too. Both outperforms C++ in compilation many times on much weaker hardware configuration
I can't believe that (comparing with C language).
Anyway your quickly compiled project would suck in performance, compared to a similar C/C++ one.
pasztorpisti wrote: Its not C++ that provides the performance, its the underlying compiler architecture. Lot of other languages can also produce code with good performance (like pascal/Delphi).
This is nonsense. The 'underlying compiler architecture' depends on the language. Almost all other programming languages are outperformed by C++. That's a fact.
pasztorpisti wrote: No it isn't compatible at all
We were not talking about that. We were talking instead about backward compatibility with C.
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: I can't believe that (comparing with C language).
Anyway your quickly compiled project would suck in performance, compared to a similar C/C++ one.
The time you win in another language comes from the fact that there is no header hell, and the parsing of the language is much simpler. For example, Delphi uses unit files that contain ready-made data for the compiler (the same is true for a lot of other languages); in C/C++ you have to read in, parse and compile the same header files a dozen times. This becomes even worse if the headers contain a lot of inlining and/or templates. The parsing and compiling of C++ is also much more complex for the compiler frontend than the same work for some other languages like Pascal/Delphi.
This has nothing to do with optimization. Anyway, any other language can use the exact same optimizations as C++ (see llvm).
CPallini wrote: This is a nonsense. The 'underlying compiler architecture' depends on the language. Almost all other programming languages are outperformed by C++ . That's a fact.
See LLVM.
CPallini wrote: We were not talking about that. We were talking instead about backward compatibility with C.
I was talking about C++'s compatibility with C++. But the same is almost true for C's compatibility with C, though this isn't such a big problem because C is much simpler. With some modifications (like eliminating header files and adding some more type safety) C could be a nice, simple language.
|
|
|
|
|
Again you didn't understand. C language compilation is relatively fast.
C++ language compilation, on the other hand, is slow; I am aware of it. Anyway, you cannot achieve the same performance as a (well) written C++ application with a (well) written application in your favourite programming language (try it).
LLVM? Do you mean: "An aggressive open-source compiler for C and C++ and Stacker, a forth-like language"?
Veni, vidi, vici.
|
|
|
|
|
llvm was started to kill off gcc, so its first frontends are C and C++. It's a nice, clean compiler infrastructure that already has other frontends, and you can integrate other languages relatively easily compared to the same work with, for example, gcc. It's a beam of hope for getting rid of the "C++ is good because it's optimal" reasoning, because that is simply ridiculous and still doesn't change the fact that both C and C++ are defective. And yes, compiling C is much faster, but on the other hand a C project is also much harder to maintain, and the language features don't give much support for arranging huge codebases. The biggest C codebase I worked with was around 100,000 LOC and it was already hard to navigate. The lack of type safety, typeless linking and lack of namespaces (and the list goes on) leave way too much space for human error. Yes, C code compiles faster, but it misses lots of nice features that wouldn't require you to sacrifice compile speed.
|
|
|
|
|
pasztorpisti wrote: "C++ is good because its the optimal" reasoning because that is simply ridiculous
Your point is ridiculous. You have nothing to bear it out.
Veni, vidi, vici.
|
|
|
|