We used to do that at work, frequently restoring a clean machine. This was mainly for our software test setups: we wanted tests to run in a controlled environment.
It worked OK for the first few weeks, maybe a month or two. But then the Windows updates started piling up (the security guys in the IT group demanded that we install them all). If you install a six-month-old image, you have to rerun all the updates you ran when installing it the week before, and two weeks before that, accumulated over the six months, plus last week's batch. Running the updates could take an hour or two.
I think this must have bothered a lot of users; MS did tune up its update procedures to speed things up a few years ago.
We couldn't wait for the update processing (maybe several times a week), so we made a cleanup script to do all the things mentioned here: clean out all temp directories (including the personal ones under \users\<username>\AppData\Local for every user), the \users\<username>\AppData\pip directory (if you are using Python software), any .log, .temp or .tmp file, uninstall leftovers, browser caches, and a number of locations particular to the tools we use. Nowadays we are introducing Docker, and if your users are not well behaved and forget the --rm option when they run a container, you'll soon have a big pile of stale containers that must be cleaned out by a 'docker system prune'.
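For illustration, here is a minimal Python sketch of that kind of cleanup script. The paths and suffixes are examples taken from the description above, not our actual script; adjust them to your own environment and run it as Administrator.

# Rough sketch of a temp/cache cleanup pass. All paths are illustrative.
import shutil
from pathlib import Path

USERS = Path("C:/Users")
JUNK_SUFFIXES = {".log", ".tmp", ".temp"}

for profile in USERS.iterdir():
    if not profile.is_dir():
        continue
    # Per-user temp and pip cache directories (locations vary per setup).
    # A real script would recreate the directories after removing them.
    for junk_dir in (profile / "AppData/Local/Temp",
                     profile / "AppData/Local/pip/cache"):
        shutil.rmtree(junk_dir, ignore_errors=True)

# Stray log/temp files in the system temp directory.
for junk in Path("C:/Windows/Temp").rglob("*"):
    if junk.is_file() and junk.suffix.lower() in JUNK_SUFFIXES:
        junk.unlink(missing_ok=True)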
To catch the big disk space thieves, I would recommend TreeSize Free[^] (there is a fancy 'pro' pay version as well, but the free version will do the job). It can run as Administrator, to map space usage in directories you cannot access as a non-privileged user. It certainly is not without flaws, but it is fast, catches the most obvious space hogs, and it is free.
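If you would rather script it than install a tool, the same idea can be sketched in a few lines of Python: walk each top-level directory, sum the file sizes, and print the biggest offenders. This is only a crude stand-in for TreeSize and will be slow on a large drive.

# Crude TreeSize-style scan: total bytes per top-level directory.
import os
from pathlib import Path

def dir_size(root: Path) -> int:
    total = 0
    for dirpath, _dirs, files in os.walk(root, onerror=lambda e: None):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # locked or vanished file; skip it
    return total

drive = Path("C:/")
sizes = sorted(((dir_size(d), d) for d in drive.iterdir() if d.is_dir()),
               reverse=True)
for size, d in sizes[:10]:
    print(f"{size / 2**30:8.1f} GiB  {d}")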
It worked OK for the first few weeks, maybe a month or two. But then the Windows Updates started piling up (the security guys in the IT group demanded that we install them all).
That's why I do a restore, apply the updates, and save the result as the "ready to go" version, keeping a copy of every second updated VM.
Yes... I have 8 or 9 copies of Win7 in different states
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
That is what I do at my home computer, where I can discard the old image with no worries.
At work, there is a requirement that we - for years into the future - must be able to reconstruct the exact same development environment that was used to build a given release, capable of producing a bit-identical copy of the released software. One consequence of that was that we would have to archive every single complete disk image. In principle, we could discard those that were never used for any release, but with many projects and many releases, the management problem would be large, and the risk of missing information about a release (and deleting the image used to build it) was too high. So we rejected that option.
I am far from being a great lover of Docker (especially on Windows, but even the Linux variant is ... well...), but it does solve a number of such problems. Once you have generated a Docker image, it is not affected by any environment changes (unless, of course, the build script fetches stuff from outside through the network interface). The way we have organized our images, in a multi-layered structure, is very space efficient as well: if, say, one project requires a couple of new/updated Python packages, we make a new image with the old one as base and run a couple of "pip install" commands to create a "varnish layer" that requires space measured in kilobytes. All the unchanged elements are physically shared with the old image (even at runtime).
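As a minimal sketch of how such a varnish layer could be built, here is a build step using the docker Python SDK (pip install docker). The base image name and the pinned package versions are made up for the example; the point is that the new image adds only one thin pip layer on top of the shared base.

# Build a thin "varnish layer" image on top of a stable base image.
# The base image name and pinned versions below are hypothetical.
import io
import docker

client = docker.from_env()

dockerfile = b"""\
FROM internal/python-stable:3.11
RUN pip install requests==2.31.0 urllib3==2.0.7
"""

# Everything below the RUN layer is physically shared with the base image.
image, _build_log = client.images.build(
    fileobj=io.BytesIO(dockerfile),
    tag="internal/python-projectx:1.0",
)
print(image.id)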
So we have few if any disk space issues. Still, we have other issues: some developers insist on using the latest and greatest version of everything at all times (this goes particularly for Python packages). We could end up with thousands of Docker images, which would cause a significant management problem. So we let the developers use a "development" Python Docker image in the development phase, allowing downloading of any new version. In the release phase, they are required to switch to a stable Python image that has all network downloads disabled: the project presents a list of package versions they "need", we make a new varnish layer with these packages (sketched below), and that is what is used for the release build. This seems to be working fine - at least for now.
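The install step for that locked-down layer might look something like this; pip's --no-index flag refuses all network downloads, so everything must come from a pre-approved local directory. The package list and the wheel path are illustrative.

# Install only pre-approved, pinned packages, with downloads disabled.
# The package list and wheel directory are illustrative.
import subprocess
import sys

PINNED = ["numpy==1.26.4", "requests==2.31.0"]  # the project's approved list

subprocess.run(
    [sys.executable, "-m", "pip", "install",
     "--no-index",                       # never touch PyPI
     "--find-links", "/opt/approved-wheels",
     *PINNED],
    check=True,
)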
(But that is for the Linux projects. Windows and Docker... arrrgh. We have given up for now. Maybe it will come later.)
Just today, I was inspecting a Windows desktop PC that came back from a test setup at our manufacturing partner. It had a 1TB hard drive with less than 1MB of space available.
I looked at %TEMP%, \Windows\SoftwareDistribution, \Windows\Prefetch, \Windows\WinSxS, \ProgramData ... no luck.
Then I found 850GB in \Windows\Temp
Apparently, "Disk Cleanup" does not clean up \Windows\Temp
Lots of you will now say that you already knew that, since you have it as a VM in Azure.
There was an error about not being able to install "mssql-tools" due to a missing dependency. Apt-get that dependency, get a similar message for another package. Try that one, yet another dependency. After four of those, answer the Y/n install prompt and work your way back up through the dependency chain. That's just the tools (like SqlCmd); Sql Server itself was already running.
Connection error; so this idiot tried to connect using Windows Authentication
Try Sql Server authentication and log in with sa; new connection error. A quick Google teaches me that SqlWb needs to have the (default) port number after the instance name. So, add ",1433" after the instance name, and it connects, acting like a normal Sql Server instance would.
Created a database, created a table, from MS Sql Wb running on a Win10 machine. No additional errors, which must mean I did something wrong. Reboot, and it still works! It's running on not-so-great hardware (hence Linux), but it is still bloody fast (not much other stuff installed). If it works on that machine, then it may work on my Raspberry. Imagine a backup database server the size of a credit card. Now all I need to build my own cloud[^] is to get RAID5 working on cheap USB pendrives.
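For anyone trying the same thing from Python: a minimal sketch of such a connection, assuming pyodbc and the Microsoft ODBC driver are installed. The server name and password are placeholders; the comma before 1433 is the detail mentioned above.

# Sketch: SQL authentication against SQL Server on Linux, with the
# default port spelled out. Server name and password are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mylinuxbox,1433;"   # note the comma, not a colon, before the port
    "UID=sa;"
    "PWD=YourStrong!Passw0rd;"
    "DATABASE=master;"
)
cursor = conn.cursor()
cursor.execute("SELECT @@VERSION")
print(cursor.fetchone()[0])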
Thank you Microsoft (and Scott)
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
Of course, there are only so many ports on a system, and running them all behind a single hub defeats the purpose of the exercise a bit. Still, I'm not the only one with that idea, and it seems it works for some people;