I usually need to have multiple VS2010 instances open at the same time, each never consuming less than 200 MB. Actually that number is from a fresh start, because after a few days an instance gladly owns 1 GB or more.
Plus a SQL Management Studio instance, IIS, several browsers (IE, FF, Chrome), each with several tabs.
For me I also need Word and Excel docs.
That said, once you add the operating system and all its stuff, my minimum is always over 4 GB, with the average around 8 GB.
With the /MP option, Visual Studio will spawn multiple instances of the 'cl.exe' process. Yes, each instance of cl.exe is limited to 32-bit addressing.
I've done extensive testing on how Visual Studio scales. If, for example, you were compiling a really large C++ project with /MP on an octa-core workstation, you would be able to use 8 instances of 'cl.exe'. If you have enabled '/GL' (Whole Program Optimization) and '/LTCG', then the extra RAM could well be fully utilized.
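For reference, the options mentioned above map to standard VC++ project settings. This is a hedged sketch of the relevant .vcxproj fragment, not a complete project file; the property names are the usual MSBuild ones for /MP, /GL and /LTCG:

```xml
<!-- Sketch only: the usual MSBuild properties behind the flags discussed above -->
<ItemDefinitionGroup>
  <ClCompile>
    <MultiProcessorCompilation>true</MultiProcessorCompilation> <!-- /MP: one cl.exe per core -->
    <WholeProgramOptimization>true</WholeProgramOptimization>   <!-- /GL -->
  </ClCompile>
  <Link>
    <LinkTimeCodeGeneration>UseLinkTimeCodeGeneration</LinkTimeCodeGeneration> <!-- /LTCG -->
  </Link>
</ItemDefinitionGroup>
```

With /MP the parallelism happens at compile time across translation units; with /GL + /LTCG the heavy lifting (and memory use) shifts to the link step, which is where the extra RAM gets eaten.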
When developing for Windows, there is a distinct reason to use a 3 GB machine instead of a 4 GB machine: 3 GB machines typically run 32-bit Windows, while 4 GB machines run 64-bit Windows. Unless you have a large amount of memory and virtual machines, 16-bit programs and certain 32-bit programs (e.g. coLinux) don't run on 64-bit machines.
We can infer from this survey that at least 42% of the developers on CodeProject are on a 64-bit operating system. I would hazard a guess that some of the remaining 58% are as well.
You can only really infer that 3.19% are on a 64-bit system. Everyone who said "16 GB or fewer" might only have 640K. Since that option obviously includes all the others, I was tempted to click it rather than expend the effort to move my browser window three inches to the left so I could see my conky display showing how much RAM I have.
I still said 8 GB or less, although the Windows XP VM that I use for Visual Studio only gets 2 GB of that.
I got an upgrade to 6 GB just because my VM was complaining about virtual memory when I started working on a project with a really huge HMI and SQL running in the background.
But with my (until last week) 4 GB of RAM I was more than happy; I was compiling faster than colleagues with 8 GB of RAM just because I have an SSD. Two laptops compiling the very same project: mine needed 3 minutes and about 20 seconds, the other guy needed almost 7 minutes.
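For what it's worth, those two timings work out to roughly a 2x speedup. A quick back-of-the-envelope check (the times are the ones reported above, rounded to whole seconds):

```python
# Build times reported above: SSD laptop vs. 8 GB HDD laptop
ssd_seconds = 3 * 60 + 20   # ~3 min 20 s on the SSD machine
hdd_seconds = 7 * 60        # almost 7 min on the other machine

speedup = hdd_seconds / ssd_seconds                     # how many times faster
reduction = (hdd_seconds - ssd_seconds) / hdd_seconds   # fraction of time saved

print(f"speedup: {speedup:.1f}x, time saved: {reduction:.0%}")
# → speedup: 2.1x, time saved: 52%
```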
So... forget about RAM and get an SSD; you'll speed up 90% of your processes by more than 50%.
If something has a solution... why worry about it? If it has no solution... why worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
My work rig has 8 GB, which is more than enough to run multiple web browsers (for add-on testing), an email client, 1-3 instances of VS, and more without breaking a sweat. The issue is that my hard drive is only 75 GB.
At home I've got 16 GB of RAM and another 2 GB in the graphics card, which is far more than I need. But then again, my home computer isn't about need, it's about want (which is why I opted for the "gaming" RAM that includes fans to cool it).
I usually have about three VS 2010 instances open, four browser instances (one with several tabs) for preview, and the usual suspects: mail client, Notepad++, VLC for music... I'm also running IIS with SQL 2008, of course.
If I'm going to test in a VM, another 2 GB is needed (which is what the client has to have).
I can have a lot of apps open and running and not even crack 5 GB, including VS 2010.
It's nice, real nice, but we all know that we have to develop for the masses and not for ourselves.
"the meat from that butcher is just the dog's danglies, absolutely amazing cuts of beef." - DaveAuld (2011) "No, that is just the earthly manifestation of the Great God Retardon." - Nagy Vilmos (2011)
"It is the celestial scrotum of good luck!" - Nagy Vilmos (2011)
"But you probably have the smoothest scrotum of any grown man" - Pete O'Hanlon (2012)
In previous jobs I've had to get by with a 2-4 GB Core Duo for tasks ranging from deleting mails to running a VM with SharePoint or BizTalk Server. The latter scenario is so soul-destroyingly frustrating that I'm sure the CIA use it as an enhanced interrogation technique. It amazed me that my then employers expected me to work productively with so few resources. What amazed me even more is that when I showed them the 30-minute boot time and the 5-minute wait for a context menu, their answer was simply "OK, we see the issue... you'll just have to get on with it". Me waiting for my hardware to catch up all day (and therefore being barely billable) is fine, but as soon as there's a 5-minute discrepancy between the burndown chart in TFS and our billing system, you call the whole team (5 developers) in for an hour-long meeting about "doing things correctly". WTF.
My current employer is considerably more enlightened when it comes to stuff like this. I've now got an i7 with an SSD, 1 GB of dedicated graphics and 8 GB of RAM. Anything remotely server-y gets put on an ESX VM, including database servers, ArcGIS Server and the like. These days I don't have to wait for much of anything to catch up.
The funny thing is that I moved from a huge multinational with over 22,000 employees to a much smaller company with, say, 400 employees, albeit an ESRI partner. This taught me quite an important lesson: big companies sound great and all, but in reality you're just another number to them. They don't really give a toss what happens as long as the billable hours keep getting logged.