The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Just like we can create checkpoints and restore Windows to an older state, is this possible on Linux (Ubuntu Server)?
For example, say I'm installing a bunch of packages, and all of a sudden I suspect I installed the wrong versions somewhere along the way. And there's a pile of them. Instead of checking one by one, would it be possible to just wipe all the installations clean and take the system back to its original state? Like creating a checkpoint and then getting back there if something went wrong.
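For the narrower case of just packages (rather than whole-filesystem state), one approach on Debian/Ubuntu is to checkpoint the dpkg selection list and diff against it later. A minimal sketch, assuming a Debian-style system with dpkg, apt, and the usual text tools (the file names under /tmp are made up for illustration):

```shell
#!/bin/sh
set -eu

# Checkpoint: record exactly which packages are installed right now.
dpkg --get-selections > /tmp/pkg-before.txt

# ... install and experiment with whatever packages you like ...

# Later: take a second snapshot so we can see what changed.
dpkg --get-selections > /tmp/pkg-after.txt

# comm -13 prints lines only in the second (sorted) file, i.e. packages
# added since the checkpoint; reformat them as "name<TAB>deinstall"
# selections that dpkg understands.
comm -13 /tmp/pkg-before.txt /tmp/pkg-after.txt \
    | awk '{print $1 "\tdeinstall"}'

# To actually roll back (needs root), feed that list to dpkg and let apt
# reconcile the install/remove set:
#   comm -13 /tmp/pkg-before.txt /tmp/pkg-after.txt \
#       | awk '{print $1 "\tdeinstall"}' | sudo dpkg --set-selections
#   sudo apt-get -y dselect-upgrade
```

Note this only tracks package state, not config files or user data; for a true filesystem checkpoint you'd want something like Timeshift or LVM/Btrfs snapshots instead.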
Sounds easy, until you start running into symbolic links, devices, named pipes, temp filesystems, and then you've got D-Bus and other newfangled thingymawhatsits. It ends up being a mess of command-line options for this or that.
Not that it can't be done: get it right once and turn it into a script.
But yes, it can be done. In fact, I did it to clone an entire setup onto backup [different config] hardware; something Windows definitely can't do. Clone the SSD, unplug it from the source, plug it into the destination, boot... done. (Just change the system name if you want to talk to it over a network.)
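The block copy at the heart of that clone can be sketched with dd. Below it's demonstrated safely on image files; on real hardware you'd point it at the block devices instead (/dev/sdX, /dev/sdY, and the hostname "backup-box" are placeholders, not real names — check lsblk first, since dd will happily overwrite the wrong disk):

```shell
#!/bin/sh
set -eu

# Stand-ins for the source and destination disks, so this is safe to run.
dd if=/dev/urandom of=/tmp/source.img bs=1M count=4 status=none

# The clone itself: a raw, block-for-block copy, same idea as cloning the SSD.
# On real disks: sudo dd if=/dev/sdX of=/dev/sdY bs=4M status=progress conv=fsync
dd if=/tmp/source.img of=/tmp/clone.img bs=4M conv=fsync status=none

# Verify the clone is bit-identical to the source.
cmp /tmp/source.img /tmp/clone.img && echo "clone verified"

# After booting the cloned machine, give it its own network identity
# ("backup-box" is a made-up example name):
#   sudo hostnamectl set-hostname backup-box
```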
I've cloned disks in Windows to different hardware.
It's a pain, but the process was to load the required drivers for the new hardware onto the drive.
Installable, but NOT installed.
Clone the drive, put the clone in the different-hardware machine, and boot into safe mode without networking.
Get the monitor/keyboard/mouse drivers working first; generics usually work at first.
Then apply all of the drivers specific to this machine.
After about 10 reboots, you're pretty much good as gold.
Now, deal with all of the software copy protection (QuickBooks, etc.)
that recognizes the drive changed, or the CPU changed, or the core hardware changed.
Back in the WinNT days, we did this to upgrade developers' computers without reinstalling everything. We probably did it about 30 times.
Anything that causes a crash just requires a safe-mode boot: install the new hardware driver, and delete the old one.
USB and networking are the two biggest nightmares. And nowadays, the entire mainboard and its subsystems.
See, every now and then I've uploaded an animated screenshot of my progress on the vector-graphic eraser!
Now I have a problem with the vector-graphic unionizer, as seen here!
(It can't close the loop, and the code runs significantly slower when it should close.)
Well, I used that app, ScreenToGif, to take those shots! Really nice app!
There seems to be a general view that if you want to do some serious compute-bound work, C++ will always win over C# because of its unmanagedness. This may be the case, but there's an alternative argument: if you sidestep garbage collection and use unsafe constructs, you get close; and because your code gets JITted, there may be a chance to hone it for the actual CPU it will run on, which could actually make .NET faster than a generalised native binary. There seems to be even better scope for this if you use the SIMD-enabled System.Numerics types.
I've been mucking around with sound synthesis in .NET recently, and by using unmanaged memory (Marshal.AllocHGlobal) and unsafe pointers, the performance is good enough. In fact, I've got it so there are virtually no garbage collections at all (beware of LINQ: it creates enumerators all over the place). GCs are terrible news for audio, because a delay can mean a buffer isn't ready in time and you get nasty clicks. It means a different approach to coding, but it's still miles preferable to header files, linking libs, and all that 32/64-bit nastiness you get with C++.
This is a stunning masterclass in what C# can do. I recommend having a look at the video, and reading the bits about GPU vs. CPU and the 'a different kind of C#' entries. I'm fairly hard to impress these days, but I've downloaded the code, built it, played with it, and just... wow.
Moreover, I'd argue it's the last nail in the coffin of the argument that C# is not a viable choice for high-performance, compute-bound work.
(I haven't followed the link yet - it's Saturday, and I'm feeling lazy.)
It's probably rather like the old C vs. assembler debate. Yes, a skilled assembler programmer can produce faster, more compact code than a skilled C programmer, purely because he can tell the machine exactly what he wants it to do rather than adding a layer of "interpretation" via a compiler. But... it'll take a lot longer to code, and an unskilled assembler programmer can still make a serious dog's dinner of the same job!
I suspect that an average C# coder will produce less efficient code than a skilled C++ coder doing the same job; but I also suspect he'll produce it in less time, and that it'll be more easily maintainable by an average developer. With the performance of modern machines, that's the critical factor in most cases. Additionally, I suspect that a skilled C# developer will produce better, faster code than an average C++ dev, and get it out the door quicker as well.
Don't get me wrong: C++ is a good language, and I used it for many years. But C# produces good code as well, and it's often a lot more readable and less prone to silly, avoidable bugs.