|
Of course. And is one really all that much better than the other?
|
|
|
|
|
Not "better"... just greedier at collecting data and calling home. I said Chrome was always "busy", including tasks like "GoogleUpdate" or whatever.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
|
obermd wrote: SQL Server - limit memory usage via Server memory configuration options - SQL Server | Microsoft Learn
The problem wasn't the limit, the problem was that there simply wasn't any memory left.
We fixed the issue by telling SQL Server to grab a minimum (and maximum) amount of memory, 1024 MB.
Works like a charm now.
All the other apps can do with a little less memory.
|
|
|
|
|
Unless you explicitly tell SQL Server, any version, not to consume all memory, it will consume everything it can, up to the limits of that version.
|
|
|
|
|
How/where do you do that?
|
|
|
|
|
When you right-click the server node in Object Explorer, you can go to Properties and then the Memory tab.
There you can set a minimum and a maximum amount of server memory.
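For anyone who prefers scripting it, the same limits can be set with sp_configure. A minimal sketch via sqlcmd, assuming a local default instance and Windows authentication; the 1024/4096 MB values are purely illustrative, not recommendations:

```shell
# Hedged sketch: -S localhost and -E (Windows auth) are assumptions
# about your environment; adjust server name and credentials as needed.
sqlcmd -S localhost -E -Q "
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 1024;
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;
"
```

Both options are dynamic, so they take effect after RECONFIGURE without restarting the service.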
|
|
|
|
|
Sander Rossel wrote: We're getting timeouts like you wouldn't believe,
Just realized I had not looked at memory utilization in years. Right now with 16 GB I have nothing but three browser windows open and the memory utilization is at 5 GB.
Performance problems for me always come down to CPU. Usually when VS is starting up or when the virus checker is going crazy.
But you didn't mention the CPU at all?
When this happens the CPU is low?
Also which version of windows?
Sander Rossel wrote: 20GB and that fixes the problem
And what was it set to before?
Mine uses auto manage.
The Windows suggestion appears to put that value on the low side. The guideline seems to be "1.5 to 3 times" physical memory, so 24 to 48 GB would be an appropriate range.
|
|
|
|
|
jschell wrote: But you didn't mention the CPU at all?
When this happens the CPU is low?
Also which version of windows? CPU isn't the issue, the guy has a good one.
Windows 11 Pro.
jschell wrote: And what was it set to before?
Mine uses auto manage. Same.
|
|
|
|
|
Sander Rossel wrote: CPU isn't the issue, the guy has a good one.
CPU load, not the hardware.
|
|
|
|
|
Here's a snapshot of memory usage[^] on my 16GB desktop running Ubuntu 22.04.2 LTS.
I'm currently running:
* GIMP,
* Firefox (with 30 tabs open) - one streaming music,
* two remote desktop connections via Remmina,
* 3 terminal sessions,
* MS Teams,
* VSCode,
* various other programs
Because I saw someone mention Docker, I went ahead and started my Docker container running MS SQL Server.
Here's the memory snapshot[^] with Docker running too.
I could never run all this on my 16GB Windows 10 laptop.
What do those Linux devs know? 😂😆🤓
There really are advantages to running Linux as a dev.
I understand that I can't run full Visual Studio, but learning the command line and migrating to .NET Core means I can do everything I need (yes, not WinForms) on Linux with VS Code, and I feel much less pain.
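Since the SQL Server container came up: when it runs under Docker you can cap memory on both sides, at the container level and inside SQL Server itself. A hedged sketch, assuming the official mcr.microsoft.com/mssql/server Linux image; the 4 GB / 3072 MB figures and the password are placeholders:

```shell
# Sketch only: --memory caps the container at the Docker level;
# MSSQL_MEMORY_LIMIT_MB (the env-var form of mssql-conf's
# memory.memorylimitmb setting) caps SQL Server inside it.
docker run -d --name mssql \
  --memory=4g \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=ChangeMe_Str0ng!" \
  -e "MSSQL_MEMORY_LIMIT_MB=3072" \
  -p 1433:1433 \
  mcr.microsoft.com/mssql/server:2022-latest
```

Setting the inner limit a bit below the container cap leaves headroom for the non-engine processes in the container.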
|
|
|
|
|
raddevus wrote: What do those Linux devs know? CLI.
|
|
|
|
|
raddevus wrote: yes, not WinForms Unfortunately, a necessity, as is .NET Framework
|
|
|
|
|
Yeah, I understand the constraints of legacy software etc.
It makes sense. It's just amazing to me that Linux really does seem to keep memory cleaner.
I think it kind of indicates that Linux devs are a bit more disciplined in creating things, really.
Whereas in the Windows world it feels more like a free-for-all: "memory is there, go ahead and eat it".
I am also always astonished at the number of things running "to support" the Windows OS, or whatever all that stuff is. Windows itself just eats so much memory; it feels ridiculous. I mean, as you seem to be learning, 32GB probably is the minimum now for Win10. Oy!
|
|
|
|
|
raddevus wrote: Yeah, I understand the constraints of legacy software etc. Virtual machines?
raddevus wrote: I think it kind of indicates that Linux devs are a bit more disciplined in creating things really. I saw it already back in the late 2000s... burning a CD in Windows took almost 40 minutes. In Linux, 3.
raddevus wrote: Whereas in the Windows world it feels more like a free-for-all "memory is there, go ahead and eat it". Sadly... yes.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Have you tried limiting how much memory SQL Server is allowed to use? It will take every bit of free memory on the machine as it needs it and tends to hold on to it long after that need is gone.
There are no solutions, only trade-offs. - Thomas Sowell
A day can really slip by when you're deliberately avoiding what you're supposed to do. - Calvin (Bill Watterson, Calvin & Hobbes)
|
|
|
|
|
Yeah, but that wasn't the problem.
The problem was SQL Server could not get enough memory to run properly, so we gave it a minimum amount of memory (and also a maximum).
Works like a charm now.
|
|
|
|
|
I've had my current laptop a couple of years and usually just use it to RDP into my work desktop and run Teams locally. The laptop came with 8 GB while the work machine has 64 GB.
Recently I found that even with only 1 instance of RDP, MS Teams and a couple of Chrome windows under Win 11, the laptop was using 7.8 out of 8 GB, paging like mad and, according to Resource Monitor, "Memory Compression" was causing over 1200 hard faults per second! I've added another 8 GB and it seems OK... For now!
I think it's got to the stage where a 16 GB machine is entry level for Windows and a dev machine should probably have 64 GB, possibly more.
(One day I'll think of a signature...)
|
|
|
|
|
Sorry, I run VMs all the time on my main development laptop... that might end shortly due to retirement and other personal issues, but the last 3 laptops I've had - unit -1 (sitting on the shelf), unit 0 (I'm typing on it) and unit 1 (the new one that I am migrating to) - all have 64GB. It's a defense against the Microsoft virus.
Oh, the XP VM runs all day long with 2GB, Win 10 loves 16GB, Win 11, meh, not there yet.
Tried to get a 16GB machine up with a VM and it choked.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
|
|
|
|
Interesting you should compare Win 11 with the other OSs, we only started having issues with our massive 64GB memory all being gobbled up after we installed Win 11.
There does seem to be something with Win 11 that is not quite right with VMs or WSL, as though there is a memory leak somewhere.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
Visual Studio 2022, I presume.
Make it run at 32 bits; that will save your memory (or use 2019).
Unless of course your programs are humongous.
|
|
|
|
|
SQL Server loves memory. If you're running an SQL Server instance on the same machine, throttle the amount of memory that SQL uses from the default value of "as much as I can get" down to something reasonable.
|
|
|
|
|
I guess that if it by default limited itself to, say, 2 GB, there would be a number of people standing up and shouting: "Silly database system, not making use of the resources available to it!"
Especially when we are talking about a server: server software is primarily architected to run on a more or less dedicated server machine. Then it makes perfect sense to default to using all memory (and other) resources available. As long as there is a way in a non-dedicated-machine environment to reduce the resource consumption, that is fine with me.
Side note:
I am not familiar with the MS SQL Server. But, I have several times had to explain to (non-professional) hacker friends that when the Resource Monitor shows less than 10% "Free" memory, it does not imply that they should double their RAM, when 75% of the memory is marked as "Standby". "Standby" memory is like "Take it if you need it, but as long as you don't ask for it, I keep track of which data was left behind when it was abandoned, just in case someone comes along wanting exactly that piece of data."
If some database system (SQL Server or any other) handles memory in a similar way, saying "As long as no one else needs this memory, I will use it as a disk cache area, but I am willing to give it up on short notice when someone asks", then that should be perfectly OK, even if it looks as if it is hogging all RAM.
Please note: I do not know if this is the case for SQL Server or any other given DBMS. It could be implemented that way. A DBMS never has any significant backlog of modified pages; after a commit, all changed data has been written to stable storage. So giving up a gigabyte of RAM does not require writing back a gigabyte to disk.
|
|
|
|
|
Make sure you limit the amount of memory that the SQL Server service uses. By default, it will grow to use all memory available!
|
|
|
|
|
As always, the answer is: it depends.
You don’t say what version of SQL Server is running, or what the load is. But you will probably want to set the service’s MIN and MAX memory allocation. Unfettered, the service will take as much memory as it can and not release it unless caches are cleared.
Hopefully it’s only being used as a development server and has only your coworker’s load on it.
We always put our SQL servers (dev, staging, and production) on separate network machines.
Good luck.
Time is the differentiation of eternity devised by man to measure the passage of human events.
- Manly P. Hall
Mark
Just another cog in the wheel
|
|
|
|