Unix/Linux does treat an unlimited temp directory as a security flaw (since temp is accessible to every application) and limits its size... Most distros also clear it on every boot (configurable)...
This approach has (or had) some security flaws of its own, but it's an interesting contrast in how the temp folder is handled...
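For illustration, on a typical systemd-based distro the size cap and the cleanup behaviour are configured roughly like this (a sketch using common defaults; exact paths and values vary by distro):

```shell
# /etc/fstab: mount /tmp as tmpfs, capped at half of RAM
tmpfs   /tmp   tmpfs   defaults,nosuid,nodev,size=50%   0 0

# systemd's tmpfiles mechanism can also age out old files instead of
# (or in addition to) a wipe on reboot; a typical rule in
# /usr/lib/tmpfiles.d/tmp.conf looks like:
#   D /tmp 1777 root root 10d
```

With `/tmp` on tmpfs, the boot-time clear comes for free: the filesystem lives in memory, so nothing survives a reboot.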
Skipper: We'll fix it. Alex: Fix it? How you gonna fix this? Skipper: Grit, spit and a whole lotta duct tape.
Superfetch is one of the first items that should be disabled when you go SSD, and even on HDDs these days its value is little to none:
1. What MS prefetches by default includes rarely used items such as WordPad.
2. The check for whether the cached copy is current - random seeks to compare the file/directory info versus sequentially reading the source item - almost completely wipes out any time saving on today's optimised HDDs.
3. Loading rarely used items and then later swapping them out when you load some monster like VS (which for some reason is often not cached by Superfetch, or only in small parts) makes it an even more useless artefact of days past.
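On recent Windows builds Superfetch runs as the SysMain service, so turning it off amounts to stopping and disabling that service. A hedged sketch from an elevated prompt (the `SysMain` service name is the Windows 10 era naming; on older versions it may appear as "Superfetch"):

```shell
# Stop the service for the current session (requires an elevated prompt)
sc stop SysMain

# Prevent it from starting again on boot
# (note: sc requires the space after "start=")
sc config SysMain start= disabled

# PowerShell equivalent:
#   Stop-Service -Name SysMain
#   Set-Service  -Name SysMain -StartupType Disabled
```

Re-enabling it later is the mirror image: `sc config SysMain start= auto` followed by `sc start SysMain`.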
Signature ready for installation. Please Reboot now.
Yes it does, but it did not clear the temporary Internet files for some reason. Either that, or it is also possible that the Windows Disk Cleanup utility simply reported an incorrect number of files remaining. As you can see from the thread I quoted in my initial post, my problem was in no way exceptional. Many other users had the same complaint.
The pagefile has been useless on the desktop for over a decade; maybe it still helps some niche workloads.
By the way, I cleared my "old Windows installations" after the Fall Update; that gave me back 35GB. On a 250GB disk that's a lot; so much that IMO it was really unacceptable to steal all of that space in the first place.
Privazer is very impressive: it gave me back 9 gigs; in return they got a small donation. I think it's more functional at cleaning than CCleaner Pro (which I own); however, CCleaner does have other useful features Privazer does not.
The one slight glitch I had with Privazer was that after using it I had to log back in to GMail and CP ... for other sites, LastPass is still working. I thought I had very carefully configured Privazer's options to leave all login stuff untouched.
«While I complain of being able to see only a shadow of the past, I may be insensitive to reality as it is now, since I'm not at a stage of development where I'm capable of seeing it.» Claude Levi-Strauss (Tristes Tropiques, 1955)
The one slight glitch I had with Privazer was that after using it I had to re-login to GMail and CP
I think I figured it out: Privazer modifies the "Clear browsing data" setting in Edge to delete the cache and other items every time you close the browser. Restore these settings in Edge and you should not have to log into CP every time.
IT guy I met specializes in supporting law firms. One got hit and paid. Said they had backups but it was cheaper to pay.
I have recovered twice, both hit file servers. Both had current (offline) backups.
How is your DR plan today?
User: Technical term used by developers. See Idiot.
These days companies have contracts to deliver, and even contracts to receive. If their data gets locked up they will be in deep sh*t from both sides: not just termination of contracts but probably fines on top of that. It's not like a small IT project that gets delayed a week; it's contracted fulfilment to customers and suppliers. They can't say "we'll get it to you next week," because it'll cost them their entire business.
The fault lies in putting corporate networks on the internet. Once upon a time, networking classes talked about subnets, private subnets, with defined interfaces that protected their core data (sometimes connected only by a physical data transport). Along came "access anywhere/anytime" and "the cloud", and in their infinite stupidity the network admins threw private subnetting out the window (along with any smidgen of physical separation) in favour of software access control and encryption... soft separation, and being soft, too damn easy to punch holes through.
Anyway, back to the real world, the business of doing business: two choices. Pay the ransom, take the hit, but keep your customers/suppliers; or follow your words of wisdom, "Stop. Paying. Ransoms", go out of business, tell the owners they are now bankrupt and owe millions in fines, and tell the employees they don't have a job any more.
And who's to blame? IT, 100% IT. Crap network admins together with poorly designed access (including the applications that seem to need access made simple and easy).
As the article says, the attackers are getting smarter, more surgical in who, what and when they attack; they will know when it matters most (i.e. just about to fulfil a large contracted order...).
"Stop. Paying. Ransoms" - will become even more "not an option."
Sorry, but you're wrong: "Stop paying ransoms" is the only long term solution. As long as there is a profit in it, they will keep doing it - and some of them aren't too scrupulous about "encryption" rather than "randomization".
If your company doesn't have a good disaster recovery plan which includes a good, solid backup regime, then you are elephanted anyway - it's not just ransomware that can ruin your day. Even in the days of paper, companies went bust because of fires which meant they had no idea who owed them money and who didn't. If you don't prepare for a problem in your core systems - be they paper, people, or computers - then one day you are going to get bitten, and bitten hard.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!