|
Marc Clifton wrote: Assembly - Yes, assembly I have the least complaints with because it is the ultimate simple, you get what you code, language. Though, perhaps I speak naively of the days of 6502, Z80, and 8086/8088 programming. More recently, there has been some complaining about the annoying non-orthogonality of x86's SIMD, especially before SSE4.1. For example, there were some integer multiplications, but the one most people actually wanted was missing: pmulld (which exists as of SSE4.1), which multiplies 32-bit integers and gives the low 32 bits of the products. Or there was a way to poke a word into a vector, but no other width. There were min and max for signed words and unsigned bytes, but nothing else. Integers of almost all sizes could be compared for equality, except qwords. It was weird, and people complained.
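To make the pmulld gap concrete: before SSE4.1 you had to emulate a packed 32-bit low multiply out of pmuludq and shuffles. A sketch of the standard workaround using C intrinsics (the function name is mine, and this assumes SSE2, which is baseline on x86-64):

```c
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdint.h>

/* Emulate pmulld (SSE4.1's _mm_mullo_epi32) with SSE2 only.
   pmuludq multiplies just the even dword lanes (0 and 2) into
   64-bit products, so we do it twice - once with the odd lanes
   shifted into even position - and reassemble the low halves. */
static __m128i mullo_epi32_sse2(__m128i a, __m128i b)
{
    __m128i even = _mm_mul_epu32(a, b);                   /* a0*b0, a2*b2 */
    __m128i odd  = _mm_mul_epu32(_mm_srli_si128(a, 4),
                                 _mm_srli_si128(b, 4));   /* a1*b1, a3*b3 */
    /* Pick the low 32 bits of each 64-bit product, then interleave. */
    __m128i even_lo = _mm_shuffle_epi32(even, _MM_SHUFFLE(0, 0, 2, 0));
    __m128i odd_lo  = _mm_shuffle_epi32(odd,  _MM_SHUFFLE(0, 0, 2, 0));
    return _mm_unpacklo_epi32(even_lo, odd_lo);
}
```

The low 32 bits of a product are the same for signed and unsigned inputs, so despite the unsigned pmuludq this works as a signed multiply too.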
After SSE4.1 people mostly stopped complaining (other than the usual "why does my favourite pet instruction not exist"), but then AVX came and the complaining returned. Firstly, it was floating point only, except for vptest . Weird. Also, the damn slicing - AVX didn't extend vector width the way one might expect (which admittedly would have created complicated problems), but acted mostly as though you have two SSE-width vectors side by side, with almost never any communication across the "split" (except for some new, slow instructions specifically for inter-slice communication). With AVX2, the complaining about float-only stopped because that limitation went away, but since it was useful to more people, there were more people complaining about the slicing. AVX2 also treats bytes and words like an afterthought, upgrading their old instructions but adding new stuff almost only for dwords and qwords (hey Intel, copy vpperm please). And while it introduced a gather, it was so slow that it wasn't useful. It has improved since, though.
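An illustration of the slicing: AVX2's vpshufb applies the same 16-byte shuffle control to each 128-bit lane independently, so even reversing a 32-byte vector needs an extra cross-lane instruction. A sketch with intrinsics (function names are mine; the check skips itself on CPUs without AVX2):

```c
#include <immintrin.h>

/* vpshufb shuffles bytes only WITHIN each 128-bit lane, so a full
   byte reversal takes two steps: reverse each lane in place, then
   swap the lanes with one of the (slower) cross-lane instructions. */
__attribute__((target("avx2")))
static __m256i reverse_bytes_avx2(__m256i v)
{
    const __m256i rev_in_lane = _mm256_setr_epi8(
        15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0,
        15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0);
    __m256i lanes = _mm256_shuffle_epi8(v, rev_in_lane);   /* lane-local */
    return _mm256_permute2x128_si256(lanes, lanes, 0x01);  /* swap lanes */
}

__attribute__((target("avx2")))
static int reverse_check_avx2(void)
{
    unsigned char in[32], out[32];
    for (int i = 0; i < 32; i++) in[i] = (unsigned char)i;
    __m256i v = _mm256_loadu_si256((const __m256i *)in);
    _mm256_storeu_si256((__m256i *)out, reverse_bytes_avx2(v));
    for (int i = 0; i < 32; i++)
        if (out[i] != 31 - i) return 0;
    return 1;
}

/* Plain wrapper: only runs the AVX2 code if the CPU supports it. */
static int reverse_check(void)
{
    if (!__builtin_cpu_supports("avx2"))
        return 1;  /* nothing to test on this CPU */
    return reverse_check_avx2();
}
```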
Also there was complaining about incompatibilities between AMD and Intel, such as the FMA4 debacle and XOP being awesome yet not supported by Intel.
Then there were the µarch-weirdness complaints, such as Haswell's port 7 supporting only simple addressing (which for a while not many people knew about, so performance was less than expected) - and even if you use it, the store will sometimes "steal" port 2 or 3, thereby costing load throughput. Or the false dependency on the output register of popcnt and tzcnt.
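The popcnt false dependency is easy to trip over because compilers of the era didn't always break it for you. One common workaround was to xor-zero the destination register first, which the renamer handles for free and which cuts the chain. A sketch using GCC-style inline asm (my own helper, x86-64 only):

```c
#include <stdint.h>

/* On Sandy Bridge through Broadwell, popcnt has a false dependency
   on its *destination* register, which can serialize loop iterations
   that keep reusing the same register. Zeroing the destination with
   xor first breaks the chain at no cost. */
static inline uint64_t popcnt_nodep(uint64_t x)
{
    uint64_t r;
    __asm__("xorq %0, %0\n\t"     /* break the false dependency */
            "popcntq %1, %0"
            : "=&r"(r)            /* early-clobber: r != x's register */
            : "r"(x));
    return r;
}
```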
Of course not many people had these complaints, for obvious reasons.
|
|
|
|
|
harold aptroot wrote: More recently,
Dang. So much for the simple days of assembly programming.
Marc
|
|
|
|
|
Well this time there is only one language (Mongo) in the list that is not a language, so I guess that's some progress.
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Alan Kay.
|
|
|
|
|
I suspect this is nearly the order of how much each language is actually used. (Everyone avoids Objective-C like the plague, including, it seems, Apple's own devs, so no big surprise there.)
|
|
|
|
|
I'm starting my career as a software dev now, and one of the things I've been wondering about recently is how to write good code. I think I can generally recognize well-written code from poorly written code, but I want to get even better and I'm not sure how.
Engage brain before activating fingers
|
|
|
|
|
Learn from Accomplished Practitioners
BTW, why doesn't markdown work anymore? (and yes, the checkbox is checked.)
Marc
|
|
|
|
|
Code more, and read more efficient code. Try to get more done with less code.
|
|
|
|
|
Here are some steps to writing good code, or at least less code:
1) Don't pile up separate if statements; where the cases are mutually exclusive, use an if/else chain instead.
Modern if else
2) Don't put too many comments in the code; add only the comments that are needed.
3) The code should always be understandable to other developers as well.
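If "modern if else" means flattening nested conditionals, the usual technique is guard clauses: reject the bad cases up front so the main path reads straight down. A hypothetical sketch in C (both functions and their names are mine):

```c
#include <stddef.h>

/* Nested style: every validation adds a level of indentation. */
static int scale_nested(const int *values, size_t n, int factor, long *out)
{
    if (values != NULL) {
        if (n > 0) {
            if (factor != 0) {
                long sum = 0;
                for (size_t i = 0; i < n; i++)
                    sum += (long)values[i] * factor;
                *out = sum;
                return 0;
            }
        }
    }
    return -1;
}

/* Guard-clause style: same behavior, but the checks are handled
   up front and the real work is straight-line code. */
static int scale_guarded(const int *values, size_t n, int factor, long *out)
{
    if (values == NULL) return -1;
    if (n == 0)         return -1;
    if (factor == 0)    return -1;

    long sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += (long)values[i] * factor;
    *out = sum;
    return 0;
}
```

The guard-clause version also makes rule 2 easier to follow: the structure documents itself, so fewer comments are needed.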
|
|
|
|
|
I'd take the existing code base from a former company and say "don't do ANY of that."
|
|
|
|
|
Even before the release of Swift 3, Apple is talking up Swift 4 features like ABI stability, string processing, concurrency, and new scripting capabilities. They're so swift, they're a full version ahead
|
|
|
|
|
Toward the end of last year, the people behind the Large Hadron Collider announced that they might have found signs of a new particle. "What's he like? It's not important. Particle man"
Not the Higgs Boson, this is another one
|
|
|
|
|
Towards the end of next year, the people behind the LHC announce that they have discovered a new particle: the DHP (Disappearing Hypothetical Particle).
It is:
both a particle and a wave
has both infinite and zero mass
spins in all directions, and if it stops spinning it will make your head spin
has simultaneous positive, neutral, and negative charge until you observe it (then it flips to the mood of the physicist doing the observation)
is the constituent substance of the Q Continuum, yet again showing how Star Trek predicts the future
Physicists are now trying to discover its anti-particle version, the RHP.
Marc
|
|
|
|
|
I have a Higgs Futon in my guest bedroom, but the problem with it is that it isn't there when I'm looking at it.
|
|
|
|
|
Technically Incorrect: Brian Hall, general manager of the Surface brand at Microsoft, says Apple is doing its customers a great disservice by not innovating in computers. Pot, meet kettle
|
|
|
|
|
Coming from the company that sold products that either broke (e.g. Xbox DVD drives...) or became obsolete (e.g. a $300 USB steering wheel that no longer works on newer Windows versions)...
MS would be in a position to talk if its products lasted and weren't marred by shoddy workmanship or planned obsolescence.
|
|
|
|
|
I think the problem is that Apple makes so little profit from its Macs that there is little enthusiasm in executive management to spend any significant R&D on them.
Mac laptops do have two things I really like: excellent touchpad and magnetic power connector. Perhaps they should just license those to everyone!
|
|
|
|
|
A group of researchers has found a way to hack directly into the tiny computer that controls your monitor without getting into your actual computer, and both see the pixels displayed on the monitor—effectively spying on you—and also manipulate the pixels to display different images. "There is nothing wrong with your television set. Do not attempt to adjust the picture."
|
|
|
|
|
|
I wondered how that 'adult' screen-saver got there
|
|
|
|
|
I have no doubt they could do interesting things with the chip on the monitor, but I'm skeptical of how they get to it. I have a feeling there's a big "Magic Happens" in the steps.
|
|
|
|
|
Tim Berners-Lee announced the World Wide Web project on August 6th, 1991. The gift for the 25th is traditionally silver, but I couldn't think of a joke about that
|
|
|
|
|
Perhaps: How many times was its RAID re-silvered? Or at least, attempted to be?
|
|
|
|
|
Microsoft only started rolling out Windows 10's Anniversary Update this week, but it's already discussing future updates for the operating system. Two more chances to ignore Windows 10 next year!
|
|
|
|
|
My prediction is:
Update 1: they will now force you to log into the Microsoft account cloud. Personal user accounts will be forbidden.
Update 2: the remaining personalization settings will be removed and you will be forced to do everything through the Microsoft Store. Users who do not run Candy Crush will be scrutinized to find out why they are dissidents.
|
|
|
|
|
Basildane wrote: Update 1: they will now force you to log into the Microsoft account cloud. Personal user accounts will be forbidden.
Plausible, really possible...
But if so... then bye bye, Microsoft.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|