The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Every version of Visual Studio has had its issues so far.
My first was 2008 (I think, only for a few months), which was fine, but quickly felt limited when I started using VS 2010, especially with regard to data tools like Entity Framework.
I even did some work in 2005, but only on older projects, which felt outdated to say the least.
I think it was 2012 (or 2013?) and 2015 that felt very screamy WITH ALL CAPS, but they did their job.
I've been on 2019 since 2019 and it's fine, probably just a matter of getting used to it.
Going back isn't really an option if you're doing any modern web and cloud development because the tools simply aren't there in earlier versions.
ALL have given me headaches with projects that suddenly won't build, or projects that build, but are still giving me errors.
Weirdo seeks the similarly motivated to engage in building a better mousetrap - a very niche mousetrap running on hardware you probably don't even have, let alone want.
spelunking through the ESP-IDF
using Google Translate on Chinese technical documentation, and then coding against what i get back
writing random missives on the code project lounge
writing code that gives @SanderRossel fits
Turn-offs include
holy rolling (unless it's about a technology i also holy roll about)
caring too much about standards (we take a Buddhist approach to coding standards in *this* house - *blows nose on robes* - don't get too attached to anything)
avoiding the undocumented parts
But if anyone wants to help come up with a better way to do FAT32 on an ESP32, and wrap or improve the IDF in general, I'll be over here, trying to learn Mandarin.
A politician or political commentator in the UK (I forget who) recently said "why do we teach kids French in school; it would be far more useful to teach them Japanese". As far as I can see, learning Mandarin would be the better alternative.
1: He didn't pronounce the semi-colon. I added that for the sake of efficient paraphrasing.
When in Paris I found that if I asked anyone if they spoke English they would say, "non" and look disgustedly at me as if to say, "Why don't you speak French, you English Pig?"
So then I would start speaking French to them, thanks to my excellent English education!
Immediately, they remembered that they did, in fact, speak English (probably due to their French education) and refused to speak to me in anything but English - still looking at me disgustedly as if to say, "Stop butchering my language, you English Pig!"
- I would love to change the world, but they won’t give me the source code.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
But a thing to remember: SoCs like this are intended for embedded devices, which are more prone to static damage.
Stick your 7 nm AMD chip in the environment I work in and it will be dead within a matter of hours from static/seawater ingress; a 40 nm part will last longer, as there is more to be damaged before it pops. While not an issue for most things, if you are mounting it on the sea bed, where downtime costs a very large telephone number, one of the key design requirements is how hard it is to break, not how efficient it is. My company looked at getting Siemens to restart production of an old chip design to supply spares, because it was 32-bit and came in a big flat pack with hand-solderable leads.
That's true, but given how many people *aren't* using these things in hardcore industrial applications where that is an issue, one has to wonder about the dearth of ultra-low-power versions, even if they'd be a bit more fragile and expensive. For something like the device I'm currently building it would be a win. For a lot of the things I've seen other people build it would be a win as well (and though those are non-commercial, they still make $$ for Espressif). Just my opinion.
There is definitely a cost factor involved. Those little chips are intended for sale at very low prices. The latest, cutting-edge processor designs are very, very expensive, in part because of the cost to fabricate devices with the latest equipment. That does not make much sense if you want to sell a low-cost device. I remember when I worked on some systems used to make PICs: they used rather old technology, being fabbed on 4- and 5-inch wafers, when we had recently installed systems at Intel, TSMC, and Samsung that used 12-inch wafers. FWIW, most of our systems were for 6- and 8-inch wafers. Of the three hundred-plus companies we worked with, the three listed above were the only ones that used 12s.
I think the real answer to your question is they don't use the latest technology because they do not need to.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"