Has S(tupid)D(rivel)Times ever published something that wasn't drivel?
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
---
I think these CTOs are comparing bad developers, the 1x ones (who write spaghetti and add loads of unnecessary complexity to a project), to good developers, whom they call 10x.
I'd go as far as to say the bad ones add negative value, so they're more like -2x.
So the message shouldn't be "find 10x developers", but "find competent developers".
What I wonder is: why am I not hearing this about managers?
Even the "10x" developer can be useless if the team or project is not managed correctly, and it rarely is.
Actually, why not any profession?
We all know good ones and bad ones.
You have your favorite cashier because they handle your groceries 10x faster (this one's even measurable)!
Heck, I know people who switch doctors after months or even years and almost immediately have their ailment cured (or at least diagnosed)!
In the case of doctors it could even save your life, so definitely find 10x doctors.
---
'There'll be no more work from external/unpaid contributors' – dev
Put a fork in it?
---
Kent Sharkey wrote: Put a fork in it?
Or just fvrk it?
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
---
The backslash was intended so that the symbols ∧ and ∨, used in the Algol programming language to represent "and" and "or" respectively, could be represented in ASCII as /\ and \/, saving room in the character set.
All your DOS paths are just figments of your imagination
---
Quote: Re: Backslashes aren’t real
My physics teacher told us inertial forces aren't real either... but try not holding on to something on the bus...
M.D.V.
If something has a solution... Why do we have to worry about it? If it has no solution... For what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
---
Torvalds, Linux's creator, finds AMD's processors deliver a much bigger bang for the buck than Intel's CPUs.
It's not a new year until there's a new Linus rant
---
Well... He's right. I've been saying this for decades.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
---
A broken Linus is right twice a day? (Or at least every once in a while)
TTFN - Kent
---
This ECC rant:
The first machine I got to know intimately had 6 ECC bits for each 16-bit word. This was in the days when everyone was convinced that 64 kbit memory chips were the end of the line - alpha radiation from space would easily knock out 64 kbit chips, and larger chips were unthinkable. I don't know what happened to memory technology, but a few years later we had far larger chips, with far weaker error correction. Maybe even none!
A parallel: in the 1990s, I took a group of students, well versed in X.25, to an external presentation of Frame Relay, with its end-to-end error checking. The students were sceptical, asking: "But what if a transmission error occurs halfway?" The presenter tried to convince them: we are talking about fibre-optic networks here, where you don't get bit errors! ... The students were not 100% convinced. A few years later, they were all using networks where you practically never had to request retransmission of a packet due to a bit error, on machines with essentially no memory error correction at all - at most, trivial error detection by a parity bit.
A bit error in your code segment is likely to make your code go crazy. A bit error in your data may lead to crazy pointers, sums in your double-entry bookkeeping that don't match up, or a UTF-8 string referring to an undefined code point. It rarely goes undetected - but in practice, few cases turn up.
There is a reason why today you rarely find memory with 6 ECC bits per 16-bit word: it is not needed. It is like the famous British law that called for a man to walk in front of every railroad train, waving a red flag as a warning. (Allegedly, that law was challenged in the House of Lords, but repeal was rejected because the law should be considered part of the "cultural heritage" that shouldn't be touched... Brits are Brits.)
ECC and bit errors are an issue at the physical level, and should be handled there - they are not an issue for the OS (with a possible exception for the hardware abstraction layer, and even there quite low down). If you relate any API-level functionality to ECC or anything at that level, the 2020s are not for you.
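The rant above leans on how single-error correction works. As a toy sketch of the idea, here is a textbook Hamming(7,4) code in Python - not the actual 6-bits-per-16-bit-word scheme from those old machines, just the single-error-correcting principle behind it, with function names made up for illustration:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.

    Bit positions are 1..7; parity bits sit at positions 1, 2 and 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # checks positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4              # checks positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4              # checks positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Return (corrected codeword, 1-based position of the flipped bit, or 0)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # recheck positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # recheck positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # recheck positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # reads out the error position in binary
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the damaged bit back
    return c, syndrome

word = hamming74_encode([1, 0, 1, 1])
damaged = list(word)
damaged[5] ^= 1                      # a stray alpha particle hits position 6
fixed, pos = hamming74_correct(damaged)
print(pos, fixed == word)            # 6 True
```

Three parity bits locate any single flipped bit among seven; the wide ECC words of old simply scaled the same trick up.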
---
It's all a matter of how much reliability you are willing to pay for.
If in a particular system ordinary memory has an error rate of 1 bit per month, and ECC memory has an error rate of 1 bit per year, but costs 1.5 times as much, it may be worth your while to accept the occasional glitch and simply reset the system once a month on average.
Most people turn off (or hibernate) their systems much more frequently than that, so memory errors go unnoticed.
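As a back-of-the-envelope check on those figures (the error rates and the 1.5x price are the hypothetical numbers above, not measurements):

```python
HOURS_PER_MONTH = 730             # average hours in a month
HOURS_PER_YEAR = 8766             # average hours in a year

plain_rate = 1 / HOURS_PER_MONTH  # bit errors per hour, ordinary RAM
ecc_rate = 1 / HOURS_PER_YEAR     # bit errors per hour, ECC RAM

uptime_h = 3 * HOURS_PER_YEAR     # three years of continuous uptime
print(round(plain_rate * uptime_h))  # expected glitches on plain RAM: 36
print(round(ecc_rate * uptime_h))    # expected glitches on ECC RAM: 3
```

Whether the 1.5x price is worth it then comes down to what a single glitch costs you.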
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
---
Linus Torvalds said: you get more bang for the buck from AMD processors.
I don't always agree with Linus, but when I do, it's AMD!
My current machine (desktop) runs an AMD® Ryzen 5 2600X six-core (12-thread) processor, and I just purchased a laptop (impatiently waiting for it to arrive) with an AMD Ryzen™ 7 4700U (2.0 GHz, up to 4.1 GHz, 8 MB L3 cache, 8 cores) +
AMD Radeon™ Graphics
256 GB PCIe® NVMe™ M.2 SSD
16 GB DDR4-3200 SDRAM (2 x 8 GB)
Only $622
This would've been over $1000 if it had a comparable i7.
---
Does that laptop come with its own nuclear plant to power it? Yow. Let me know what it's like when it's running - temp and/or fans. Other than that HDD, it sounds almost too good to be true.
TTFN - Kent
---
Kent Sharkey wrote: Other than that HDD, it sounds almost too good to be true.
Yeah, I was amazed too. I went through the HP "build your own" configurator and cheaped out on the M.2 SSD, so it's only 256 GB.
---
The year 2021 will see a number of important Microsoft applications reach a state of deprecation and retirement. We list a few that your business should keep track of.
Will there be cake at the retirement party?
---
Why you can’t find the groundbreaking search engine AltaVista on the web anymore. Friends don’t let friends visit Digital.com without knowing the truth.
Let me search for it using Lycos
Oh hey - both Lycos and Excite are still working.
And Ask Jeeves. Wow, memories.
---
AltaVista was the first place I saw machine translation in use (provided by Babel Fish / SYSTRAN). It really sucked back then.
---
I recently discovered a few Gopher sites still running, so who needs the WWW?
---
If you look really hard, you might even find some dial-up BBSes…
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
---
One thing I really miss from AltaVista is the ability to sort hits by date of publication or last change.
I guess one reason for abandoning it is that today the majority of pages are created dynamically, so it makes much less sense than in the days when you built the entire page by editing the HTML directly in the Brief editor. But I sure miss the functionality, especially when searching news media. (Lots of online newspapers have a search function for their own articles, with a 'sort by date' option, but that comes too late in the search process!)
---
From an observatory high above Chile's Atacama Desert, astronomers have taken a new look at the oldest light in the universe. Their observations, plus a bit of cosmic geometry, suggest that the universe is 13.77 billion years old - give or take 40 million years.
What do you get a universe that has everything?
---
Kent Sharkey wrote: What do you get a universe that has everything?
Multiverse
Zen and the art of software maintenance: rm -rf *
Maths is like love: a simple idea, but it can get complicated.
---
Kent Sharkey wrote: What do you get a universe that has everything?
A fiery ending, and a naked hologram that pops out of a cake.
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
---
Wow, it doesn't look a day over 12.5 billion years old.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
---
What's new? I have heard for the last twenty or thirty years that the universe is about that age.
(Well, maybe they have corrected 13.75 to 13.77 billion years, but I guess that also depends on which of the twins you are.)