The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
We were running ProDOS on our machines. The boss came in one day with a 5 MB HDD and we were awed. He would take my coworkers' code and mine into his office, load it onto the HDD, and compile and link it. The HDD sounded like a jet plane taking off when it spun up.
No, never even heard of Beagle Bros. We were a tiny shop writing giveaway software that shipped with new modem boards.
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
Assembly languages vary. I worked with one where you loaded float register 4 from memory location xyzzy with "F4 := xyzzy" and then multiplied it by 4.5 with an "F4 * 4.5" instruction. A conditional jump was written like "IF = GO Label" (if the flag bits were not already set otherwise, you would have to precede the IF with a "COMP x, y"). Call this syntactic sugar - it definitely is! - but it makes the code a lot easier to read than the traditional assembler acronym letter soup.
This was definitely a CISC machine: e.g. it had a loop instruction "W LOOPI i, imax, TopLoop" for incrementing i, comparing it to imax, and jumping to TopLoop until imax was reached - a for-loop control in a single instruction. There were call instructions transferring a list of arguments onto the stack, moving the stack pointer and checking for stack overflow. There were heap allocate/free instructions (and the call instruction might allocate stack frames from the heap, for coroutine use), and a comprehensive set of string instructions, e.g. for translating a string to another (single-byte) encoding / case / ... - strings were addressed through descriptors giving the location and length. Math functions such as square root, X**Y, log and trig functions were single instructions.
The distance in abstraction level between K&R C and this assembler/instruction set was so moderate that at the time, I didn't really see any advantage of C other than that it could also be compiled for more primitive instruction sets. In fact, we used to refer to C as a "machine-independent assembler". I preferred the machine-dependent one... (But for the most part, we were programming in higher-level languages than C.)
I started assembly language programming in the late '70s, but in the last 20 years of my career as an Embedded Systems Engineer (now retired), I saw no need to use assembly language in any of the products I worked on. C/C++ compiler tech for anything from an 8051 to a TI DSP generates excellent code that's relatively hard to improve upon.
Years ago I wrote a front-end program in dBASE III that generated AutoLisp scripts to manipulate AutoCAD drawings. The engineers would provide a few dozen input parameters, the AutoLisp script was built, then AutoCAD was fired up and produced the drawing for them.
I remember spending lots of time counting parentheses and certainly agree with another comment - LISP = Lost In Stupid Parentheses.
The end result worked really well, and I learned how to count really well.
No reflection on you, Mike, but the article's a load of crap. I'm of the opinion that if you "dread" working in a programming language, then that's a sign that you don't understand it. That's a problem you can, and should, correct before you try working in it.
Yeah, there are languages and environments you like working in more than others, but sometimes you don't have a choice. In that case, stop whining, suck it up, and dig in.
I'm of the opinion that if you "dread" working in a programming language, then that's a sign that you don't understand it.
I'm very well versed in VB.NET, yet when I got an offer to take over a VB.NET WinForms program I sighed, thought of the customer it would gain me, bit my lip and said I'd do it.
As I suspected, there are plenty of forms that I dread working on.
Not because I don't know the language, but because the original programmer didn't, and now I have to rack my brain to get the gist of what he was thinking while he wrote that crap.
So basically your average VB(.NET) project
And if the original programmer was a jackass, a function may even return another type depending on the input.
I guess that's what you get for not being type-safe.
Maybe I don't dread the language, but I dread how easy it is to write crap code in some languages that I have to deal with because others are bad at their job.
I don't get why people hate VB so much, though. Granted, I only use it sporadically, but it's not like it's really hard to get the kind of work people want done in it.
Interesting reading, but I would like to propose that Forth or its derivative "White Lightning" should be included in the list. In the meantime it has become so obscure that most people don't remember it ever existed.
If I ever see a piece of code I wrote in the derivative I am 100% sure I would not even remember how it was supposed to work.
TECO - built for string manipulation and used for the first version of Emacs. Definitely a write-only language, as there was an annual competition to see who could figure out what a specific TECO line would do to an arbitrary string.
I still use TECO occasionally. Learned it in 1972. I sometimes wrote TECO macros to do a task, and while waiting for one to finish, I could write a Cobol or Assembler program to do the same thing, compile it, and run it while still waiting for TECO to finish. It was really easy to use for scripting when the amount of data was small.
DXL is the worst one I've used... It's a pretty basic C-like language - the horrors come from the interface to the DOORS requirements management system, and how easy it is to leak memory without even trying. Here's an example: DXL can allocate strings but not deallocate them, so every string you use takes up memory until the process (which might be on the server side of DOORS) is terminated.
I encountered DOORS and DXL when developing an API for accessing DOORS from Java, using (in effect) hand-rolled RPC to call from Java into DXL (with a DLL written in C++ as an intermediary) and then returning results from DXL as JSON, which was deserialised into Java objects using Jackson. Of this, two parts didn't suck - C++ and Jackson. But... it worked. And worked reliably.
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p