The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Agreed. I started using OO over 20 years ago and from the start, I was wary of some of the uses I saw, and that's just increased as time's passed.
There are some cases where it can be useful, but the vast majority of the time, you are more likely to get value from constraining behaviour (I like the type class feature in Haskell, and also traits in Rust). And if you really have a bunch of related types, then gathering them together as a 'sum type' often has better ergonomics.
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
I did OO before there was OO... (Well, Simula had been around for some years, but the term OO was not yet in use.)
The professor teaching my first university-level programming course insisted on strict discipline: a Pascal RECORD definition was immediately followed by the functions that operated on the record fields, and those functions were the only code allowed to touch the fields - just like OO methods. A record "class" could have variants, "subclasses", etc.
I must admit that I liked the orderly, disciplined way. We didn't know that we did OO; it was just a "best practices" thing to more easily manage the operations on the records.
A couple of years later, I got access to the OS source code for a 16-bit mini, written in a language midway between assembler and C. For the drivers, I saw a class hierarchy: several subclasses of class objects, with method tables and pointers to superclasses, and then instance objects of these for each device - all hand-written, but laid out the same way a C++ compiler would lay out its classes and instances. I must admit that I was impressed. By this time, OO was on its way in, so I recognized the OO concepts, but when I mentioned this to the OS developers they didn't get it. That's simply how you write an orderly structure of generic drivers with specializations. OO, what's that?
So I've come to use OO in a fairly lightweight style: to organize similar data and the operations that go with it. Language developers seem to think that every new release of a language simply must include some new "language concepts", some fancy mechanism(s) that can do things in even more abstract, hard-to-grasp ways (maybe "elegant", once you have comprehended it, but often quite far from "obvious" in the way it works). I tend to ignore super-conceptual, super-fancy mechanisms; even though they may help you conceptualize the problem solution in your head, they often tend to make the program code more obscure. "That's not a feature, that is a (design) bug!"
Use those OO facilities that make the program code more lucid, better organized. But don't let it obscure your program solution, just because you insist on showing that you master this new concept that was introduced in the last language revision.
I don't think that 'OOP as currently conceived by many languages didn't fulfil all the promises made about it' is particularly controversial.
Especially the idea that we can (or even should) model reality as classes representing things seems a bit odd in retrospect. We found ways to work around these limitations (like design patterns, or avoiding inheritance), but the very fact that we need these workarounds shows the limitations of the paradigm.
I came at this from the other way: I learned functional programming and am now very happily merging the two paradigms. (We all do - LINQ is FP.)
OOP does some things better, FP does others better (like concurrency), the most important thing is to keep it simple. People get hung up on reuse and abstractions, while they should be focussed on cognitive load and simplicity.
Codewitch, from the replies, your opinion isn't all that unpopular.
In specific use cases, OO is a great technique. Thinking about encapsulation is helpful in identifying all the pieces-n-parts, even in situations where the solution is procedural.
But as others have stated, it's not a universal tool and it's amazing how badly people use and mis-use it. I've seen co-workers spend months shoe-horning a solution into an inheritance model, when a procedural solution was completed in 2 weeks (after the OO solution failed).
It's like everything else -- when the only tool you have is a hammer, everything looks like a nail ...
I don't think that OO in general is "ill conceived", but I do agree that it has been by far overused and misused since it was "invented".
There are certainly benefits to encapsulating both code and data, and to some degree, abstraction can help to make code more readable and thus maintainable. But in far too many cases, this nowadays just leads to a royal mess where people do this ad nauseam, creating code that is totally unreadable and incomprehensible (except to the original coder, at the time they created that piece of code) - and they are even proud of it.
It has simply turned into "too much of a good thing"...
A lot of coders spend a lot of lines of code dividing things into tiny steps which they then make whole classes for and abstract everything to the Nth degree, often even when the abstraction is not helpful.
They're doing it wrong. Deciding what things or processes to abstract and how to divvy things up isn't difficult to do once you know how, but learning it can be painful. It took me over ten years to acquire the background to be able to handle some things. There are a lot of developers out there who simply never get it. They end up blindly applying one or more approaches from Design Patterns or other cookbooks. They finally reach steady state as one-trick ponies (apologies to Paul Simon) where every problem is a nail, and they've got the hammer with which to beat it to death.
Coincidentally, I was tripping through a large C++ program (about 100,000 LOC) to make some relatively minor enhancements. The original architect(s) of this program did a fantastic job of using all the features of C++: lambdas, autos, executable code in header files, and virtual functions, just to name a few.
The frigging code was terribly difficult to follow!
But the major hindrance was the exorbitant use of getters/setters between classes; much of the code was long chains of accessor calls reaching from one class into another.
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
Just heard back from the people I was interviewing with: they had positive feedback on my work and they're ready to make me a final offer. Though I might start a little late because of the ongoing quarantine, just the news that I got the job is enough to make my day.
P.S. On that note, does that make me the first person to be hired during the bad bad times? That's what they said.