The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Ironically, I got this email from Postman last week, basically tooting their own horn about how great they are, how much their customer base has grown, etc. Sad, because their UI sucks, and their app sucks too, IMHO.
Your point is valid, and it sure is a problem that today's kids never learned the RAM equivalent of the big-O of algorithms.
Yet it is easy to be misled by overly quick observations. If you map a 2 GB file into RAM, you have essentially set up 500K page table entries without bringing any of it into physical RAM. In many cases, just a tiny fraction of it will ever get into RAM before the application terminates.

Checking my RAM use right now, there is slightly above 6 GB "In use" (the green color in Resource Monitor), which means that if any process needs RAM, nothing needs to be paged out - the other process can take over those pages without any fuss. A mere 35 MB is currently "Modified" and needs to be saved to backing storage before another process takes over. 1.9 MB is in "Standby" - the last user of those pages is no longer using them, but they are frequently used segments, so chances are that another process may soon ask for (parts of) them to be "loaded", which is a null operation if the pages are still in memory.
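To make that concrete, here is a minimal C sketch using POSIX mmap (on Windows the equivalent mechanism is CreateFileMapping/MapViewOfFile); the file name "big.dat" and the 64 MB sampling stride are made up just for illustration:

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    /* "big.dat" is a placeholder for some large (say 2 GB) data file. */
    int fd = open("big.dat", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

    /* Mapping the whole file only sets up address-space bookkeeping
       (the page table entries); nothing is read from disk yet. */
    const unsigned char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }

    /* Only the pages we actually touch are faulted into physical RAM.
       Sampling one byte every 64 MB of a 2 GB file pulls in about 32
       pages - roughly 128 KB - even though the virtual size of the
       process now includes the full 2 GB. */
    unsigned long long sum = 0;
    for (off_t off = 0; off < st.st_size; off += 64 * 1024 * 1024)
        sum += p[off];
    printf("sum of sampled bytes: %llu\n", sum);

    munmap((void *)p, (size_t)st.st_size);
    close(fd);
    return 0;
}

Run it against a big file while watching Task Manager or Resource Monitor: the virtual size jumps by the file size, while the working set barely moves.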
Of course there are cases of software that really needs huge amounts of data space - FEM and huge matrix models (read: weather forecasting) are the classical ones. I am currently testing out Coverity (a code analysis tool), which builds a complete flow graph of a million lines of source code; that fills some space.
And then there are those that really shouldn't need more than a handful of MB, but require a few hundred. Or a couple of GB. Yet, if such a program requires a GB during startup, and that memory is then paged out (which is a null operation for code segments), leaving a working set of a dozen MB for continued running, it won't slow down your other programs very much.
Only if it actually addresses RAM "all over the place", continuously maintaining a huge working set, is there something to worry about. Some programs are that way. But lots of users are screaming out because they see huge numbers, without understanding what the numbers represent.
17 tabs and almost 5 GB. Close Chrome, restart with exactly the same 17 tabs, and now it uses a meager 2 GB.
Don't necessarily blame Chrome for that one. I worked on a project a few years ago where a "web designer" had built the front end and I was supposed to make the back end work. The page loaded fine (apart from the 20 or so errors thrown up in the console, which the "designer" had never looked at); but it then chewed up another 1 MB or so every 5 seconds until Chrome crashed or Windows ground to a total halt. Never did find exactly what the issue was, but thinning out the 200+ stylesheets and script references made a big difference. Recoded the script to just use jQuery plus about 100 lines of custom code and all was well.
A lot of websites seem to be built by designers just throwing in every possible framework and widget because they can't be bothered to style something by hand and don't know how to do even basic scripting.
Speaking of memory usage and Chrome - and maybe it is just me. Open several of your favorite websites in separate tabs in Firefox. Then open Chrome (you shouldn't have to do much, maybe go to Google with one tab). Sit back and watch the fun in Task Manager.
MOST times on my PC, BOTH Firefox and Chrome go into "Not Responding" and start eating up CPU cycles. Looking at the Processes tab, Chrome has a bunch of processes running even though it only has one tab open. Kind of like Chrome has knocked Firefox over in some alley and is rifling through its pockets.