I know... if you look at my first message again, there were smileys there because I was just joking / messing around
honey the codewitch wrote:
As a counter-example I just sped up Glory by at least 100% in all cases through some well placed optimizations.
I re-invented the wheel in an old project, reducing cycle time from 84 msec to 17 msec while handling 35% to 40% more data than the previous version.
honey the codewitch wrote:
Or as I like to say "first make it work, then make it fast"
But don't forget that many times "best is the enemy of very good"
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
I agree with you here, and sorry if I took you too seriously. There was one case in the past year where I bit-twiddled to get the absolute best performance I could, and in that case I'd be able to justify it in code review.
Performance matters when it matters, even if for things like bizdev and webdev it often doesn't matter as much.
Like I said, I don't want to saddle developers with huge build times, so it's worth my effort to optimize, you know? But after a point it's diminishing returns. I spent under an hour doing it, so it was well worth it; even if I was doing it for work, I could justify that. It pays for itself within a dozen builds, easy.
I'm in the process of upgrading my Linux OS, so I've got a bunch of old files that I'd like to compress before I move everything over to the new drives I've purchased for the upgrade. It seemed like doing it the "obvious" way, e.g.
find <path> <find options> | xargs gzip
missed out on the fact that I have an 8-core processor and should be able to leverage that to speed things up. I was thinking about ways I might write something that I could feed a list of files and keep all 8 cores busy compressing away, thereby reducing the time it takes to compress all the stuff I've collected over the years. After spending some time mulling it over, I figured I'd google to see if anyone else had tackled a similar problem and come up with a solution.
Enter GNU Parallel. This wonderful piece of software does exactly what I want, plus tons more - for example it can make use of the computing resources of the other systems on your network that you have access to. For anyone interested in adding parallel execution to their shell scripts, or making use of those other systems on their network that are otherwise idle (or at least not being used by you), this seems like a great tool to help "git 'er done". For starters the author has a great set of videos on YouTube: GNU Parallel videos - YouTube
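For what it's worth, here's a minimal sketch of the kind of invocation I mean (the path and find options are placeholders, not my actual ones):

# Compress each regular file as a separate gzip job, one job slot per core.
# parallel defaults to one job per CPU core; -j8 just makes the 8 explicit.
# -print0 / -0 keep filenames with spaces or newlines from being mangled.
find /path/to/old-files -type f -print0 | parallel -0 -j8 gzip

Like xargs, parallel appends each input filename to the command you give it; the difference is that it runs the jobs concurrently instead of one after another.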
Just thought I'd give it a mention as I think it's quite cool.
Interesting. Most popular zippers have a command-line option for multithreading, but it's hard to find information about it. I found this list (probably for Windows): -m (Set compression Method) switch[^]
Note that 7-Zip supports multithread mode only for LZMA / LZMA2 compression and BZip2 compression.
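If I'm reading that right, something like the following should spread LZMA2 compression across 8 threads (this works with p7zip's 7z on Linux as well as 7z.exe on Windows; the archive name and path are just placeholders):

# Create a .7z archive using LZMA2 with 8 threads.
# -t7z picks the container, -m0=lzma2 selects the codec, -mmt=8 sets the thread count.
7z a -t7z -m0=lzma2 -mmt=8 archive.7z /path/to/old-files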
Should be possible to find a tablecloth (paper ones too) that'll fit your desk at most department stores.
After many otherwise intelligent-sounding suggestions that achieved nothing, the nice folks at Technet said the only solution was to low-level format my hard disk, then reinstall my signature. Sadly, this still didn't fix the issue!