I think the real world is similar. They always want me to, say, distinguish between different people when I see them as a homogeneous grey map. They even try to tell me that this "object" belongs to "that" object, while I think I should be free to use anything the way I want to. They even say that there are things I am not allowed to look at - it's their "private life". This idea of the world being split into distinct "objects" really bothers me.
We try to make the simplest pieces of code so abstract that at some point they don't make sense anymore and get hard to understand. You end up with classes like OrderManagerProviderOrchestrator or OrderFactoryStrategy. And all of this because, you know, SOLID, KISS, abstraction, dependency injection, blah blah blah...
We spend so much time making code that way - making it independent, scalable, etc. But in the end, whenever some change is necessary: oh no, this means we have to refactor everything!
After doing C++ for a couple of years, I've mostly restricted object design to db models and very rudimentary abstractions over external services and data sources.
I feel like in OO design, too often we build abstractions on top of abstractions in a weird attempt to clean up ugly datasets, that somehow feel wrong to us on a basic level. 95% plain data and 5% esoteric? You can bet someone over-engineered a solution so the 5% can now be deduced from model-state alone.
Every year I spend more and more time reversing situations like that, just to keep projects manageable. Cutting factories, flattening inheritance trees, and sometimes even re-introducing the dreaded 2% data redundancy that took 15 objects to solve.
IMO, people are inherently bad at abstraction, so it's in our best interest to KISS.
I found much the same - although it was C++ that finally convinced me that OO wasn't 'all that' 8)
After 'objectifying' some intricate real-world code (for an airline/shipping booking system) I realised that, in order to get the behaviour of the objects right, I was creating what were in effect object 'global' variables that could (due to inheritance) easily get accidentally 'hidden' by a variable with the same name in a derived class. In other words - because it was a legal thing to do in C++ - variable scope could be overridden without warning (fortunately compilers these days do warn about such things).
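For anyone who hasn't been bitten by it, here's a minimal sketch of that kind of silent hiding (the names are invented for illustration, not from the actual booking system):

```cpp
#include <iostream>

struct Booking {
    double fee = 10.0;                      // base-class state the behaviour relies on
    double total() const { return fee; }    // always reads Booking::fee
};

struct PriorityBooking : Booking {
    double fee = 25.0;                      // perfectly legal, and silently hides Booking::fee
};

int main() {
    PriorityBooking b;
    std::cout << b.total() << '\n';         // 10 -- the base-class code still uses the hidden member
    std::cout << b.fee << '\n';             // 25 -- the derived member
    std::cout << b.Booking::fee << '\n';    // 10 -- the hidden one is still there
}
```

The derived class quietly introduces a second fee, and the inherited behaviour carries on using its own copy - exactly the sort of surprise described above.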
When the fiasco that was manipulators (which I used extensively!) hit when going from Ver1 to Ver2, and then got corrected again in Ver3 (FFS!), I decided that C++ and its vision of OO was getting so complex and difficult to manage correctly (for me anyway - my failing perhaps, not the language's?) that I switched to Java for my next OO projects. Very verbose compared with C++, but much more stable at the level I was using.
However, other programmers of long standing whom I respect have gradually formed similar views to mine about OO, and C++ in particular, so I don't feel too downhearted. 8)
Quite by chance all my work now is in either Visual Basic for Applications (Access), SQL Server or Python, so go figure! If I had to go back to something close to the machine I would almost certainly go for C rather than C++ (or maybe Go etc).
Agreed. I started using OO over 20 years ago and from the start, I was wary of some of the uses I saw, and that's just increased as time's passed.
There are some cases where it can be useful, but the vast majority of the time, you are more likely to get value from constraining behaviour (I like the type class feature in Haskell, and also traits in Rust). And if you really have a bunch of related types, then gathering them together as a 'sum type' often has better ergonomics.
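Since most of this thread is C++, here's a rough C++ analogue of that last idea, with std::variant standing in for a sum type (the Payment example is made up purely for illustration):

```cpp
#include <iostream>
#include <string>
#include <type_traits>
#include <variant>

// The closed set of alternatives lives in one place, instead of being
// scattered across an inheritance hierarchy.
struct Cash    { double amount; };
struct Card    { std::string number; double amount; };
struct Voucher { std::string code; };

using Payment = std::variant<Cash, Card, Voucher>;

double charged(const Payment& p) {
    return std::visit([](const auto& v) -> double {
        if constexpr (std::is_same_v<std::decay_t<decltype(v)>, Voucher>)
            return 0.0;                     // vouchers cost nothing at point of sale
        else
            return v.amount;                // Cash and Card both carry an amount
    }, p);
}

int main() {
    std::cout << charged(Card{"4111 ...", 42.0}) << '\n';   // 42
    std::cout << charged(Voucher{"SUMMER"}) << '\n';         // 0
}
```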
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
I did OO before there was OO... (Well, Simula had been around for some years, but the term OO was not yet in use.)
The professor teaching my first university-level programming course insisted on strict discipline: a Pascal RECORD definition was followed by the functions that operated on the record's fields, and those were the only code allowed to touch the fields - just like OO methods. A record "class" could have variants, "subclasses", etc.
I must admit that I liked the orderly, disciplined way. We didn't know that we were doing OO; it was just a "best practices" thing to more easily manage the operations on the records.
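Rendered in modern terms (C++ here, with an invented Account example), the discipline looked roughly like this:

```cpp
#include <string>

// The "record": plain data, no member functions.
struct Account {
    std::string owner;
    double      balance = 0.0;
};

// The functions that operate on the record's fields sit right after the
// definition, and nothing else touches those fields -- methods in all but name.
void deposit(Account& a, double amount) { a.balance += amount; }

bool withdraw(Account& a, double amount) {
    if (amount > a.balance) return false;
    a.balance -= amount;
    return true;
}
```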
A couple of years later, I got access to the OS source code for a 16-bit mini, written in a language midway between assembler and C. For the drivers, I saw a class hierarchy: several subclasses of class objects, with method tables and pointers to superclasses, and then instance objects of these for each device - all hand-written, but laid out the same way a C++ compiler would lay out its classes and instances. I must admit that I was impressed. By this time, OO was on its way in, so I recognized the OO concepts, but when I mentioned this to the OS developers they didn't get it. That's simply how you write an orderly structure of generic drivers with specializations. OO, what's that?
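A hand-rolled layout along those lines looks roughly like this (invented names, C-style C++, just a sketch of the idea):

```cpp
#include <cstdio>

struct Device;   // forward declaration

// A hand-built "class": a table of function pointers plus a pointer to the
// parent table -- the method table and "superclass" link described above.
struct DriverOps {
    const DriverOps* parent;                              // the "superclass"
    void (*open)(Device* dev);
    void (*write)(Device* dev, const char* buf, int len);
};

struct Device {
    const DriverOps* ops;                                 // this instance's "class"
    int              unit;
};

// Generic driver: default behaviour shared by all devices.
static void generic_open(Device* dev)                    { std::printf("open unit %d\n", dev->unit); }
static void generic_write(Device*, const char*, int len) { std::printf("write %d bytes\n", len); }
static const DriverOps generic_ops = { nullptr, generic_open, generic_write };

// Console driver: "overrides" write, "inherits" open.
static void console_write(Device*, const char* buf, int len) { std::fwrite(buf, 1, len, stdout); }
static const DriverOps console_ops = { &generic_ops, generic_open, console_write };

int main() {
    Device tty = { &console_ops, 0 };
    tty.ops->open(&tty);                  // dispatch through the hand-made method table
    tty.ops->write(&tty, "hello\n", 6);
}
```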
So I've come to use OO in a fairly lightweight style: to organize similar data and the operations that go with it. Language developers seem to think that every new release of a language simply must include some new "language concepts", some fancy mechanism(s) that can do things in even more abstract, hard-to-grasp ways (maybe "elegant", once you have comprehended it, but often quite far from "obvious" in the way it works). I tend to ignore super-conceptual, super-fancy mechanisms; even though they may help you conceptualize the problem solution in your head, they often tend to make the program code more obscure. "That's not a feature, that is a (design) bug!"
Use those OO facilities that make the program code more lucid, better organized. But don't let it obscure your program solution, just because you insist on showing that you master this new concept that was introduced in the last language revision.
I don't think that 'OOP as currently conceived by many languages didn't fulfil all the promises made about it' is particularly controversial.
Especially the idea that we can (or even should) model reality into classes representing things seems a bit odd in retrospect. We found ways to work around these limitations (like design patterns, or avoiding inheritance), but the very fact that we need these things shows the limitations of the paradigm.
I came at this from the other direction: I learned functional programming and am now very happily merging the two paradigms. (We all do - LINQ is FP.)
OOP does some things better, FP does others better (like concurrency), the most important thing is to keep it simple. People get hung up on reuse and abstractions, while they should be focussed on cognitive load and simplicity.
Codewitch, from the replies, your opinion isn't all that unpopular.
In specific use cases, OO is a great technique. Thinking about encapsulation is helpful in identifying all the pieces-n-parts, even in situations where the solution is procedural.
But as others have stated, it's not a universal tool and it's amazing how badly people use and mis-use it. I've seen co-workers spend months shoe-horning a solution into an inheritance model, when a procedural solution was completed in 2 weeks (after the OO solution failed).
It's like everything else -- when the only tool you have is a hammer, everything looks like a nail ...
I don't think that OO in general is "ill conceived", but I do agree that it has been vastly overused and misused since it was "invented".
There are certainly benefits to encapsulating both code and data, and to some degree, abstraction can help to make code more readable and thus maintainable. But in far too many cases, nowadays this just leads to a royal mess where people do this ad nauseam, creating totally unreadable and incomprehensible code (except to the original coder, at the time they created it) - and they are even proud of it.
It has simply turned into "too much of a good thing"...