|
I'm surprised no hackers from around the world have hacked into Microsoft(R) servers, stolen every O.S. source code or project (or at least some of them), and then published them on torrents. But Google, Bing, Yahoo and all those other people filter out those results and/or delete them. There have probably been private court orders of Microsoft vs. some hacker stealing source. But I'm pretty sure Microsoft knows more about security than any other security company out there combined, considering you have to send your driver to them to get it signed now. Basically Microsoft probably has their servers extremely hard to break through or breach; they probably have it so that if a breach or unauthorized transfer of specific files is detected, it disconnects the servers and computers and locks them down, terminating the hacking or suspicious signal. Who knows, they are probably smart and keep the internet cord unplugged, considering the fact that the government leaked their "state of the art" F-35 blueprints on the internet, where everyone can see them (technically, because if there's internet, there's hackers). Or maybe that's bogus material the government sent out to intentionally trick hackers into thinking they got the "real stuff", but who knows what crazy stuff tech companies do these days.
Simple Thanks and Regards,
Brandon T. H.
Programming in C and C++ now; developing applications, services and drivers (and maybe some kernel modules... psst, kernel-mode drivers... psst).
Many of life's failures are people who did not realize how close they were to success when they gave up. - Thomas Edison
|
|
|
|
|
Every OS contains tons of bugs; old bugs go, new ones come with the new features. Still, the vast majority of security holes are caused by buggy networking applications and not by the OS itself. The more complex a networking application, the more chance you have of a security hole in your system. (For example, a browser is quite a complex piece of software!!!) Anyway, several Windows and Internet Explorer patches followed the Windows source leaking, not without reason: they had to fix a lot of discovered and (I suspect) known but shelved (!!!) bugs that became obvious to the hackers from the sources!
|
|
|
|
|
pasztorpisti wrote: It contains tons of legacy code even from win31.
Which would be relevant if I had claimed that they had written a newer version of Windows from scratch in a black-box development effort.
But I am pretty sure that the code for a command like 'type' has remained essentially the same since pre-Win95.
On the other hand, I know for a fact what is required to move a code base that was originally written for 32 bits, back when that was new, to what it takes to use a 64-bit system.
How exactly do you think that Windows handles VMing? Did you ever attempt the same thing in Windows 95? I can tell you that I did in fact run two OSes on a Win95 box, and it was not easy.
|
|
|
|
|
Win95 is "just a nice DOS program". Even if they have replaced and extended parts of the OS, they always stayed backward compatible at the API and source level, and that alone is a good reason for not switching the language, even if all they do is wrap a new kernel. You can take the sources of a pretty old Windows program and, with minimal or no modifications, you have a good chance of compiling it even for a 64-bit system. I think this backward compatibility drains a lot of their energy, and it implies some practical restrictions (like language).
|
|
|
|
|
pasztorpisti wrote: Win95 is "just a nice DOS program"
The point however is that windows has been re-written.
pasztorpisti wrote: You can take the sources of a pretty old windows program and with minimal or no modifications you have good chances to compile it even for a 64 bit system
I suspect I could take 'cat' from a pre-Windows unix variant and, with "minimal" modification, get it to work on Windows 7.
That however doesn't mean that windows wasn't re-written.
|
|
|
|
|
jschell wrote: The point however is that windows has been re-written.
Well, from our point of view it's not important how much they extended or replaced their codebase. It's a fact that they kept backward compatibility with their old API, and that doesn't leave much room for practically changing the underlying language. Lots of parts have been tweaked and replaced in Windows in the past decade, but I was shocked by how many hacks have been kept in the code, along with the win9x compatibility layers. I'm pretty sure the level of backward compatibility they maintain often ties their hands pretty tightly. By hack I really mean a (sometimes really dirty) hack, for example to avoid crashes of specific popular old programs that have bugs which don't crash the program on older systems but, without special handling, would simply die on NT. And the hacks are explicitly commented with bug IDs and reasoning. This is another hidden face of Windows' backward compatibility. I think what makes Windows successful also holds it back in development, but I really respect the MS coder guys for what they achieved. Keeping backward compatibility on such a large scale is tremendous work.
They don't rewrite everything. What I was curious about in the sources was module loading, and that consisted of pretty old code with lots of win3.1 and win9x sources. They use a lot of typedefs, which makes porting relatively easy even to 64 bit, and that's quite OK as long as they go on with backward compatibility.
jschell wrote: I suspect I can take 'cat' from a pre windows unix variant and with "minimal" modification get it to work on a windows 7.
That however doesn't mean that windows wasn't re-written.
I wouldn't compare the complexity of a cat program with even the simplest Windows GUI program. The same is true for the winapi versus POSIX. The POSIX API doesn't contain calls that seriously enforce Windows-specific compatibility restrictions internally. On the other hand, if a program for example subclasses a Windows common controls dialog and expects the border size to be X pixels, and hacks around with GUI hooking and expects you to send unrelated window messages to avoid a crash, then you have a very complex (and sometimes not too well designed) API system to simulate natively, and that's nowhere near as easy as implementing a pure, simple POSIX API.
Well, we are talking about why they haven't changed the language, and the answer is clear: source-level backward compatibility. It's pointless spinning around how much of Windows they rewrote; it is simply meaningless, because once they decided to keep backward compatibility, it's impractical to start thinking about changing the language.
|
|
|
|
|
Windows isn't written in Basic.
So, based on your reasoning, why does Microsoft keep producing that?
|
|
|
|
|
Ah, sorry, I misunderstood your post, so let me correct my answer. I don't know the answer to that, and I'm not really interested in researching one because I'm not a big Basic fan. I could just guess, and I don't like doing that. You can however open a new topic for that, and then everyone can share their opinions or whatever they know about it. But how is this related to bad/good things in C/C++?
modified 22-Sep-12 23:55pm.
|
|
|
|
|
pasztorpisti wrote: like header files that terribly slow down the compile time
Have you any proof supporting this sentence, regarding the C language?
pasztorpisti wrote: Again, the only reason for the existence of C/C++ is massive amount of legacy code
This is an opinion (mine, for instance, is completely different).
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: Have you any proof supporting this sentence, regarding the C language?
Yes, in recent years we many times had to rearrange the header includes and optimize compile times for our CI system. We compiled the codebase (~2 million loc) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid, and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. That's what I'm talking about, not some few-file hobby project where it makes no sense to measure such things.
CPallini wrote: This is an opinion (mine, for instance, is completely different).
And could you make a list of language features and compare it to some other languages that have better support for them? I see significant deficiencies in C++ compared to some other languages, and its syntax becomes more and more complex with every new draft. A language that has redundant features and backward compatibility with a thousand-year-old other language simply can't be "optimal".
|
|
|
|
|
pasztorpisti wrote: Yes, in recent years we many times had to rearrange the header includes and optimize compile times for our CI system. We compiled the codebase (~2 million loc) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid, and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. That's what I'm talking about, not some few-file hobby project where it makes no sense to measure such things.
Still, it is not proof. You should compare it to the compilation time of a similar project written in your favourite language and achieving the same performance (if your favourite language could assist you with that).
pasztorpisti wrote: And could you make a list of language features and compare that to some other languages that have better support for that?
C and C++ are performant. No other language (other than assembly) compares with them. You should know that.
pasztorpisti wrote: I see significant deficiencies in C++ comparred to some other languages, and its syntax because more-and-more complex with every new draft.
While, for instance, C# syntax is becoming simpler?
pasztorpisti wrote: A language that has redundant features and backward compatiblity with a thousand years old other language simply can't be "optimal"
It still is compatible.
I wouldn't call it 'optimal'. However, I like it (this doesn't mean I show apparent disgust for other languages, with the very exception of COBOL).
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: Still it is not a proof. You should compare it to the compilation time of a similar project written with your favourite language and achieving the same performance (if your favourite language could assist you on that).
It's already proof that header files suck. On the other hand, I have worked with similar project sizes in Delphi and Java too. Both outperform C++ in compilation many times over, on much weaker hardware configurations.
CPallini wrote: C and C++ are performant. No other language (other than assembly) compares with them. You should know that.
It's not C++ that provides the performance, it's the underlying compiler architecture. Lots of other languages can also produce code with good performance (like Pascal/Delphi). It's often better to leave assembly generation to the compiler because of optimization. Very few cases justify the use of 'manual assembly'.
CPallini wrote: While, for instance, C# syntax becoming simpler?
No, but I never said that C# is a good language, especially in this respect. Take a look at Java: it's older than C# yet it has preserved its superb simplicity, and even the formatting of sources and the arrangement of projects is unified. This is something respectable and exemplary.
CPallini wrote: Still is compatible.
I wouldn't call it 'optimal'. However I like it (this doesn't means I show apparent disgust for other languages - with the very exception of COBOL).
No, it isn't compatible at all. Some libraries require very heavy modifications to get them to compile with your compiler. Even some new "late" language features like templates work totally differently in major compilers (like VC++ and gcc), requiring you to pay attention to avoid compiler-specific bugs(!!!) in your project. That's terrible.
|
|
|
|
|
pasztorpisti wrote: On the other hand, I have worked with similar project sizes in Delphi and Java too. Both outperform C++ in compilation many times over, on much weaker hardware configurations
I can't believe that (comparing with the C language).
Anyway, your quickly compiled project would suck in performance compared to a similar C/C++ one.
pasztorpisti wrote: Its not C++ that provides the performance, its the underlying compiler architecture. Lot of other languages can also produce code with good performance (like pascal/Delphi).
This is nonsense. The 'underlying compiler architecture' depends on the language. Almost all other programming languages are outperformed by C++. That's a fact.
pasztorpisti wrote: No it isn't compatible at all
We were not talking about that. We were talking instead about backward compatibility with C.
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: I can't believe that (comparing with C language).
Anyway your quickly compiled project would suck in performance, compared to a similar C/C++ one.
The time you win in another language comes from the fact that there is no header hell, and the parsing of the language is much simpler. For example, Delphi uses unit files that contain ready-made data for the compiler (the same is true for a lot of other languages); in C/C++ you have to read in, parse and compile the same header files a dozen times. This becomes even worse if the headers contain a lot of inlining and/or templates. The parsing and compiling of C++ is also much more complex for the compiler frontend than the same work for some other languages like Pascal/Delphi.
This has nothing to do with optimization. Anyway, any other language can use the exact same optimizations as C++ (see LLVM).
CPallini wrote: This is a nonsense. The 'underlying compiler architecture' depends on the language. Almost all other programming languages are outperformed by C++ . That's a fact.
See LLVM.
CPallini wrote: We were not talking about that. We were talking instead about backward compatibility with C.
I was talking about C++'s compatibility with C++. But the same is almost true of C's compatibility with C, though this isn't such a big problem because C is much simpler. With some modifications (like eliminating header files and adding some more type safety) C could be a nice, simple language.
|
|
|
|
|
Again, you didn't understand. C language compilation is relatively fast.
C++ language compilation, on the other hand, is slow; I am aware of it. Anyway, you cannot achieve the same performance as a (well) written C++ application with a (well) written application in your favourite programming language (try it).
LLVM? Do you mean: "An aggressive open-source compiler for C and C++ and Stacker, a forth-like language"?
Veni, vidi, vici.
|
|
|
|
|
LLVM was started to kill off gcc, so its first frontends are C and C++. It's a nice, clean compiler infrastructure that already has other frontends, and you can integrate any other language relatively easily compared to the same work with, for example, gcc. It's a beam of hope for getting rid of the "C++ is good because it's optimal" reasoning, because that is simply ridiculous and still doesn't change the fact that both C and C++ are defective. And yes, compiling C is much faster, but on the other hand a C project is also much harder to maintain, and the language features don't give much support for arranging huge codebases. The biggest C codebase I worked on was around 100,000 loc, and it was already hard to navigate. The lack of type safety, the typeless linking, the lack of namespaces (and the list goes on) give way too much space for human errors. Yes, C code compiles faster, but it misses lots of nice features that wouldn't require you to sacrifice compile speed.
|
|
|
|
|
pasztorpisti wrote: "C++ is good because its the optimal" reasoning because that is simply ridiculous
Your point is ridiculous. You have nothing to bear it out.
Veni, vidi, vici.
|
|
|
|
|
CPallini wrote: Your point is ridiculous. You have nothing to bear out it.
Which point from the many listed in my posts?
|
|
|
|
|
(In my previous reply) I reported your sentence.
Veni, vidi, vici.
|
|
|
|
|
I will tell it to my mom!
|
|
|
|
|
Veni, vidi, vici.
|
|
|
|
|
pasztorpisti wrote: Yes, in recent years we many times had to rearrange the header includes and optimize compile times for our CI system. We compiled the codebase (~2 million loc) with a grid system (IncrediBuild) plus SSD drives in all machines in the grid, and the compile time was still 20 minutes. By rearranging some header files we could decrease the build time to around 5 minutes. That's what I'm talking about ...
And what I am talking about is that that is poor design.
Let us just suppose that there is absolutely no way you can divide that into independent deliverables (each with a defined public interface).
It still wouldn't stop a well-designed code base, in which one could work effectively on a piece without using the entire code base. If it was designed correctly.
pasztorpisti wrote: And could you make a list of language features
I use and have used all of the popular languages to make large systems. The features of the language have nothing to do with the success of the projects.
|
|
|
|
|
jschell wrote: And what I am talking about is that is a poor design.
Thanks for your advice; it had always been cut into about 10 DLLs. Still, the CI system sometimes intentionally performed a full clean build, and that iteration time was important for us before release days. Relatively rarely, when some central system headers needed modification, a grid came in very handy. Of course, I was talking about the full clean build time. I have to admit that the legacy codebase we are speaking about was full of lava code/flow, and refactoring and/or partial rewrites in such a huge codebase are not always an option; they are cheap in neither time nor money. Still, the owner wants to see it running and being extended with new features. :P
jschell wrote: I use and have used all of the popular languages to make large systems. The features of the language have nothing to do with the success of the projects.
A well-chosen language and its development environment can cut the development/debugging (and maintenance!!!) time and the money spent considerably, so it does have to do with the success of a project. Of course, this isn't true if your company has an infinite amount of time and money (I've already worked for one that seemingly had these traits... ).
|
|
|
|
|
pasztorpisti wrote: A well chosen language and its development environment can cut the development/debugging (and maintenance!!!) time and the money spent considerably
Wrong. If you think otherwise, then provide a reference.
The only proven thing that reduces maintenance is a well-regulated process. And that has nothing to do with language choice. A good process provides other proven benefits as well. Scan articles from IEEE and ACM to find the studies that demonstrate it.
|
|
|
|
|
Then why have people invented languages and dev environments at all, when assembly and edlin were already there? The first time I realized the importance of dev environment support was when I started to use Delphi. That indeed cut development and maintenance time to a fraction in some cases! It provides 10 to 100 times faster iterations for certain types of projects, and the same is true for some other languages/IDEs when it comes to specific problems. This has to do with the development process and strategy, and also with the budget.
|
|
|
|