The removal of UseWebpackDevMiddleware made me weep.
cheers
Chris Maunder
|
For someone who is not a programmer, the man has contributed more to computer science than almost anyone I can think of, short of maybe Ullman and Aho.**
My hubby calls him "that <expletive> generativist" because apparently generative linguistics is not fashionable in academia.
Still, bless the man.
** And a non-exhaustive list of other foundational contributors, like Turing.
Real programmers use butterflies
modified 14-Feb-20 8:25am.
|
And hats off to (Roy) Harper, of course.
|
If it's not "fashionable" in academia, it must make sense.
Chomsky's political views are close to yours, I believe, so there's also that.
I still have the Aho/Ullman text from my compiler course. A quick check on Amazon seems to show that it's still going strong and even continues to feature a dragon on the cover!
For computer science and software generally, there's also Knuth, Dijkstra, Hoare...
|
And Wirth. =)
Chomsky is pretty far left, and I do have some of his books, including Failed States, but I'm actually more concerned with, and impressed by, his contributions to CS than I am with his politics.
Real programmers use butterflies
|
It does not solve my problem, but it answers my question.
modified 19-Jan-21 21:04pm.
|
I remember having to study Noam Chomsky on my degree course all those years ago, when we studied compilers and, in particular, grammars relating to language design. It was a really useful and enjoyable part of my course. I'm not sure if they still teach this stuff. One of the junior developers where I work hadn't done anything at all relating to compilers on his course, and he graduated two years ago.
"There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult." - C.A.R. Hoare
Home | LinkedIn | Google+ | Twitter
|
I only ever see papers online (be they simple assignments or entire theses) produced by what look like master's and grad students, so maybe they don't teach it during 4-year courses at some unis.
Real programmers use butterflies
|
I had compilers in 4th year, and that was in the late '70s. But with everything being dumbed down...
|
I went straight into the field at 18 rather than going to school for it, so I have no idea. I just have to go by what I find.
Real programmers use butterflies
|
You're proof that a degree isn't necessary, even for STEM disciplines.
|
I don't think it's being dumbed down so much as there's just much more to learn now in the same amount of time. Project management (get that fail rate down!), CI pipelines, machine learning, big data, operating systems are way more complex, web and mobile development, etc. If someone was really into compilers, the colleges I've been to/researched have a generic "blank" course for undergrads where the student simply has to find a teacher willing to teach a specific topic.
|
My first encounter with Chomsky's classification of languages was in the compilers part of my Comp Sci degree. I then came across his work again when learning Sign Language (apparently he was the first to recognise Sign Language as being a real language), and I've had good discussions about him with a Speech Therapist, as she had had to learn about him as well.
|
honey the codewitch wrote: apparently generative linguistics is not fashionable in academia Oh, I think we've managed to kill off most (if not all) of the prescriptivists in academia, by now. The only ones left are laymen who've only read a style guide or two, so they don't have a clue what they're talking about.
I wanna be a eunuchs developer! Pass me a bread knife!
|
I particularly thank Noam for the intellectual spawn of "colorless green ideas sleep furiously."
Also really appreciated that Noam begat Pinker who could actually describe some of Noam's linguistic theories in a form mere humans could understand, and who has the courage to argue against some of them.
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
|
So I wanted to use a piece of software written in C(++?).
As a user, I shouldn't really care what it's written in, except that when I want to use it on anything else but a Mac I have to build it myself...
I've seen that more often, although there's usually a Windows installer as well, but not this time.
Screw that, I'm not using a tool that's supposed to help me but first makes me work for it.
I believe this is standard practice for everything ever written for Linux, but what's the idea behind having to build it yourself?
Why do the developers not simply build it for me (and thousands of others) and save me some trouble?
Why would a user ever want to build it himself?
I'm sure "because we're 1337 and you're a n00b" isn't the only reason, although it's the first that comes to mind.
|
Quote: Why would a user ever want to build it himself? Because he wants to modify the sources according to his needs.
Or, more reasonably:
because his platform (e.g. an embedded system) is not directly supported.
|
In this case I'm looking at a static code analyzer, not something you'd use on an embedded system.
I'm guessing a default Windows installer would suffice for 99.9% of the user base.
CPallini wrote: Because he wants to modify the sources according to his needs. That's a good reason, although I wonder how often that really happens.
It seems silly to have build-it-yourself as the only option though.
|
Or equally reasonably,
the developer does not have access to the OS that you are running.
Do you have access to:
Windows 10
Windows IoT
MacOS
Linux x86-64,i386,arm32,arm64 rpm format
Linux x86-64,i386,arm32,arm64 deb format
Then there are all the minor players like pacman on Arch Linux, AmigaDOS, (Net,Free,Open)BSD, Haiku, etc.
Chances are that if you provide a useful tool in source code, somebody will figure out how to compile it on their system and, hopefully, feed the changes back to you, or at least make them available to others.
|
Sounds dangerously like common sense!
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
|
So it should be an option, which a lot of windows programmers give you, with a separate "download source" link.
I wanna be a eunuchs developer! Pass me a bread knife!
|
It was common on Linux because of the different platforms: C/C++ compiles down to native machine code, and Linux (even more so Unix) runs on many different kinds of hardware, from PDPs to IBM mainframes. Even on the same architecture, a lot had to be handled before compiling, such as byte order and sizeofs (32/64-bit ints).
The Unix way was also often "here's my answer, feel free to take it as is, or improve it for your own purpose."
A lesser (later) reason was the "what's really inside?" worry: compiled to native, even decompiling/disassembling didn't tell you much (or, being assembly, told you way too much in painfully simple detail... MOV 1 A, CMP A B... 325,000 lines, is there a trojan in there somewhere?).
When Windows came out, compilers were both expensive and uncommon, but it was only one architecture (MS introduced its own hacks for 32-bit code on 64-bit machines), so shipping compiled code was fine as long as you trusted the author/download site (and enough others had tried it and didn't get ransomwared).
A few Unix folks jumped into some Windows dev, and their habits often didn't change (i.e. provide the source: use as is or mod it yourself).
after many otherwise intelligent sounding suggestions that achieved nothing the nice folks at Technet said the only solution was to low level format my hard disk then reinstall my signature. Sadly, this still didn't fix the issue!
|
A remnant from ye olden days!
Building software back then sounds like a real PITA...
I really wish those Unix devs would get with the times and give me an installer, though.
I'm really not going to download the C/C++ tooling just so I can try out this one tool.
To be fair, I'm not a C/C++ developer, which is who this tool is for.
I just need it to test some functionality of another tool, but I'm just going to take their word for it.
|
Nowadays, there are lots of naive people cheering "Containers! Docker! Hallelujah!", believing that once you have put the stuff into a container, it is "build once, run anywhere!" Sure...! The interface between the host and the running container is, at the functional level, reasonably simple; it is realistic to implement it on "any" architecture. But in between the host/container interactions, the container is on its own - together with the CPU, of course.
When Apple jumped from 68K to PowerPC, they developed an emulator for running 68K code on the new CPU that was surprisingly efficient. There is nothing like that in a Docker container. If the machine code inside the container is x86, the CPU had better be x86, too! "Run anywhere, provided that you are on an x86" is sort of "You can have the Model T in any color you want, as long as it's black". Lots of Docker aficionados haven't realized that yet.
Obviously, the host may - outside the container - provide an emulated CPU by having, say, a PPC CPU interpret in software every x86 instruction. You could hardly describe that as "lightweight" virtualization! And for it to be universally "run anywhere", every Docker host would have to be able to emulate any instruction set that might be found within a container.
I wonder if MS is working on containerization where the code inside consists of .NET assemblies, compiled on the fly to the native code of the host. That could be (part of) a solution for truly "build once, run everywhere". (The CLR sort of provides (parts of) this, but without the container protection.) I think that Docker containers will live for many years, but they are certainly not the last word in container technology. My guess is that the container/host interface in future technologies may include some JIT code generation, so that the container at startup (/JIT time) delivers something like an assembly to the host and gets the native binary code back for execution in the containerized environment. That will probably be in a container framework different from today's Docker, though.
|
The real reason is that everyone and his kid brother have taken Linux and bent/folded/stapled it so that it is incompatible with any other version (including versions compiled for the same hardware). In other words, there is no agreed ABI which guarantees that a shared library compiled for x64 on one system will run on another x64 system.
In this jungle, the only way to ensure that code will work is to compile it locally. If the code compiles (sometimes - this is a big if...), it will probably work on your machine. Otherwise, you're SOL.
Note that even compiling without error is no guarantee of proper working - see "Chemists bitten by Python scripts: How different OSes produced different results during test number-crunching" on The Register.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.