|
1. The lounge is for the CodeProject community to discuss things of interest to the community, and as a place for the whole community to participate. It is, first and foremost, a respectful meeting and discussion area for those wishing to discuss the life of a software developer.
The #1 rule is: Be respectful of others, of the site, and of the community as a whole.
2. Technical discussions are welcome, but if you need a specific programming question answered please use Quick Answers[^], or, to discuss your programming problem in depth, use the programming forums[^]. We encourage technical discussion, but this is a general discussion forum, not a programming Q&A forum. Posts will be moved or deleted if they fit better elsewhere.
3. No sys-admin, networking, or "how do I set up XYZ" questions. For those, use the SysAdmin[^] or Hardware and Devices[^] forums.
4. No politics (including enviro-politics[^]), no sex, no religion. This is a community for software development. There are plenty of other sites that are far more appropriate for these discussions.
5. Nothing Not Safe For Work, nothing you would not want your wife/husband, your girlfriend/boyfriend, your mother or your kid sister seeing on your screen.
6. Any personal attacks, any spam, any advertising, any trolling, or any abuse of the rules will result in your account being removed.
7. Not everyone's first language is English. Be understanding.
Please respect the community and respect each other. We are of many cultures, so remember that. Don't assume others understand you are joking, and don't belittle anyone for taking offense or being thin-skinned.
We are a community for software developers. Leave the egos at the door.
cheers,
Chris Maunder
The Code Project | Co-founder
Microsoft C++ MVP
modified 16-Sep-19 9:31am.
|
|
|
|
|
I've been looking for information on the chip lithography processes used in creating SoCs for IoT devices like ESP32-based boards.
The ESP32 is at 40nm, which seems huge today, given my AMD APU is at 7nm.
Even when it was released, 40nm was kind of big.
Now, these devices are known for low power, but even now, I struggle to write software that will use my ESP32's Tensilica CPU in such a way as to extend the operating life on a charge.
If it's a cost issue, I don't think it's a tenable one:
I would pay 4x the price for ultra low power versions of the ESP32 widgets I have.
I don't know as much about other offerings like the ARM-based parts, and so far Googling isn't turning up much real-world info on the lithography used across the range of IoT SoC offerings.
But my takeaway is these little chips have some catching up to do in an area where they could sorely use it.
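For reference, the one approach that has helped me so far is leaning on the timer-based deep sleep the chip offers: wake up, do a short burst of work, then power everything except the RTC back down. A rough ESP-IDF sketch, where the 60-second wake interval and the "work" are just placeholders:

// Minimal ESP-IDF deep-sleep loop: wake on an RTC timer, do a short
// burst of work, then go back to deep sleep. esp_deep_sleep_start()
// does not return; the chip reboots into app_main() on each wake.
#include <stdio.h>
#include "esp_sleep.h"

#define WAKE_INTERVAL_US (60ULL * 1000 * 1000)  /* 60 s, purely illustrative */

void app_main(void)
{
    /* placeholder for the real work: read a sensor, push a packet, ... */
    printf("Awake: doing a short burst of work\n");

    esp_sleep_enable_timer_wakeup(WAKE_INTERVAL_US);
    esp_deep_sleep_start();
}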
Real programmers use butterflies
|
|
|
|
|
But one thing to remember: these SoCs are intended for embedded devices, which are more prone to static damage.
Stick your 7nm AMD in the environment I work in and it will be dead within a matter of hours from static/seawater ingress; a 40nm part will last longer, as there is more to be damaged before it pops. While that's not an issue for most things, if you are mounting it on the sea bed, where downtime costs a very large telephone number, one of the key design requirements is how hard it is to break, not how efficient it is. My company looked at getting Siemens to restart making an old chip design because it was 32-bit, in a big flat pack with hand-solderable leads, so we could supply spares.
|
|
|
|
|
That's true, but given how many people *aren't* using these things in hardcore industrial applications where that is an issue, one has to wonder about the dearth of ultra-low-power versions, even if they are a bit more fragile and expensive. For something like the device I'm currently building it would be a win. For a lot of the things I've seen other people build (though non-commercial, it still makes $$ for Espressif) it would be a win as well. Just my opinion.
Real programmers use butterflies
|
|
|
|
|
There is definitely a cost factor involved. Those little chips are intended for sale at very low prices. The latest, cutting-edge processor designs are very, very expensive, in part because of the cost of fabricating devices with the latest equipment. That does not make much sense if you want to sell a low-cost device. I remember that the systems I worked on that were used to make PICs relied on rather old technology, being fabbed on 4- and 5-inch wafers at a time when we had recently installed systems at Intel, TSMC, and Samsung that used 12-inch wafers. FWIW, most of our systems were for 6- and 8-inch wafers. Of the three hundred-plus companies we worked with, the three listed above were the only ones using 12s.
I think the real answer to your question is they don't use the latest technology because they do not need to.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
There is a need, though. I mean, on the ESP32 forums (Reddit, Espressif, anywhere) you'll find people struggling with battery life for their devices.
Plus, at the current cost of $5, even charging $20 for a ULP version isn't unreasonable.
Real programmers use butterflies
|
|
|
|
|
Yes, but then that will cause other limitations, like complicating the interface with external devices. As the song goes, one thing leads to another.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
If you are contractually bound to the ESP32, then you are contractually bound to the ESP32. Case closed.
If you are free to switch to another chip, why don't you state your requirements for power consumption, on-chip functionality, and maximum cost (for a development kit, raw chip, or whatever), and any other requirements/restrictions?
I don't expect anyone around here to be able to reduce the power consumption of the ESP32, except by trivial measures such as reducing the clock frequency (if that is possible on the ESP32) or being more clever about turning off peripherals when not needed (if that is possible/relevant on the ESP32).
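For completeness, those "trivial measures" do appear to be exposed on the ESP32 through ESP-IDF's power-management API (dynamic frequency scaling plus automatic light sleep). A rough sketch, assuming an IDF v4.x toolchain where the config struct is named esp_pm_config_esp32_t and CONFIG_PM_ENABLE / CONFIG_FREERTOS_USE_TICKLESS_IDLE are enabled in menuconfig; the frequency values are only examples:

// Sketch: enable dynamic frequency scaling and automatic light sleep.
// Requires CONFIG_PM_ENABLE (and CONFIG_FREERTOS_USE_TICKLESS_IDLE for
// light sleep) in the project configuration.
#include "esp_err.h"
#include "esp_pm.h"

void configure_low_power(void)
{
    esp_pm_config_esp32_t pm_config = {
        .max_freq_mhz = 80,          /* cap the CPU below the default 240 MHz */
        .min_freq_mhz = 10,          /* scale down further when idle */
        .light_sleep_enable = true   /* let the idle task enter light sleep */
    };
    ESP_ERROR_CHECK(esp_pm_configure(&pm_config));
}

Whether that buys enough battery life is another matter; measuring actual current draw is the only real test.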
Knowing nothing about your functional requirements (so this may not be relevant for you): If what you need is Bluetooth, the nRF52 / nRF53 IoT SoC chips are recognized for their low power consumption. If going to another chip is an option, take a look at nRF[^]. These chips also support some older wireless technologies such as ANT and Zigbee. If what you need is 4G IoT, there is the nRF91 chip, but I believe that is a more expensive chip.
Offhand, I can't tell which process technology these chips use, but the proof of the pudding is the actual power consumption, not the nanometers. If you have the option to go to another chip, and your target is either Bluetooth or LTE, you could pick up the specifications of the nRF chips from the website for consideration.
|
|
|
|
|
I am a curious amateur, knowing "nothing" about advanced biology. But I am curious.
I still remember the days when there was a race to be the first research institute to do a complete sequencing of human DNA. It took weeks, or maybe it was months.
Nowadays, DNA sequencing seems to be as simple as taking a breath. "Everyone" seems to do it in a snap. Covid19 RNA (it is RNA, isn't it? or is it DNA?) is obviously orders of magnitude smaller than human DNA, but even human DNA sequencing seems to be a piece of cake today, to determine your ancestry, or your predisposition to various diseases, or in criminal cases, or for a whole lot of other purposes.
Most of this change has come in less than twenty years. What happened? Computers haven't become that much faster! (I got the impression that they are essential as a tool.) Is the speedup in other, non-computer analysis hardware? Or have the scientists developed a completely new methodology that is a magnitude or two faster? Or are those companies offering info about ancestry or disease risk doing only a quick, partial analysis rather than a full sequencing?
To phrase it differently: If, twenty years ago, we had had all the knowledge about methodologies that we have today, would the hardware of the day have been capable of sequencing the human genome in as many hours as it actually took months, or are the methods of today fully dependent on recent hardware development? (I assume that computer hardware development is only part of it!)
|
|
|
|
|
trønderen wrote: It took weeks, or maybe it was months
Years. FTFY.
trønderen wrote: it is RNA, isn't it? or is it DNA?
It can be either one; it depends on the virus group. RNA is only used for replication, which is what most viruses are targeting.
trønderen wrote: What happened?
It is like learning how to ride a bike: once you know how it goes, you can ride all bikes, not only yours.
trønderen wrote: would the hardware of the day have been capable of sequencing the human genome in as many hours as it actually took months
No.
|
|
|
|
|
trønderen wrote: it is RNA, isn't it? or is it DNA?
It's a single-stranded RNA virus.
|
|
|
|
|
Covid is an RNA virus
Back in the day (starting in 1990) the Human Genome Project was labouring to sequence the human genome. However, the technologies they (and others) developed advanced, and in 1998 Craig Venter started a parallel private effort. Depending on what counts as "complete", he actually published the sequence first. I've seen estimates of the Human Genome Project costing USD 2.7B, but now you can get a full sequence done for less than USD 1,000.
So, yeah. If we'd had today's technology then, things would have completed faster. Something short (~30,000 bases) like Covid doesn't take them long anymore; they had the full sequence last January, and they know exactly where the variants differ as well.
Ancestry, 23andMe, and the others aren't doing full sequencing though. They're doing a partial method (as you guessed) that involves trying to find specific differences. (They chop up the DNA at specific sequence types, then separate out the remaining chains to find differences).
TTFN - Kent
|
|
|
|
|
trønderen wrote: Nowadays, DNA sequencing seems to be as simple as taking a breath. "Everyone" seems to do it in a snap. Covid19 RNA (it is RNA, isn't it? or is it DNA?) is obviously orders of magnitude smaller than human DNA, but even human DNA sequencing seems to be a piece of cake today, to determine your ancestry, or your predisposition to various diseases, or in criminal cases, or for a whole lot of other purposes.
Google "RNA vs DNA".
trønderen wrote: Most of this change has come in less than twenty years. What happened? Computers haven't become that much faster! (I got the impression that they are essential as a tool.) Is the speedup in other, non-computer analysis hardware? Or have the scientists developed a completely new methodology that is a magnitude or two faster? Or are those companies offering info about ancestry or disease risk doing only a quick, partial analysis rather than a full sequencing?
Computer power did increase, as did our knowledge of biology.
trønderen wrote: To phrase it differently: If, twenty years ago, we had had all the knowledge about methodologies that we have today, would the hardware of the day have been capable of sequencing the human genome in as many hours as it actually took months, or are the methods of today fully dependent on recent hardware development? (I assume that computer hardware development is only part of it!)
Twenty years ago is 2001. I'm from 1977. We'd have dealt with it another way: stricter lockdowns, quarantines for entire cities.
Let me paint you a (realistic) picture: I'll be vaccinated this month, but the effectiveness of the vaccine is 70%. Nothing like measles. I won't be allowed outside during lockdown, vaccinated or not.
For every human it infects, it makes millions of copies (and evolves). If it infects someone vaccinated who falls into that other 30%, it may become resistant to the vaccine. So after my shot, I still need to stay indoors.
I am old enough to remember when every new vaccine was celebrated. We saw the effects of polio. Suddenly, half of our country refuses vaccines without any good reason. This "minor disease", as it is called here, may turn hostile soon if we don't reduce the number of people it infects.
We need to stop the spread. We need to stop idiots.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: We need to stop idiots.
That's what people from both sides of the issue say.

|
|
|
|
|
Imagine the same with polio. Look up what the disease did, and how vaccines killed it.
There's no discussion here.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I say we wear masks forever, and stay in lockdown forever.
|
|
|
|
|
So, you have trouble with history? We could, but we won't, because in the time of polio, everyone wanted the vaccine.
Slacker007 wrote: I say we wear masks forever, and stay in lockdown forever.
That's what I expect; if people neither want the vaccine nor stick to the rules, the virus will spread, and with every spread it copies itself a million times in every new body. And thus, it will evolve faster.
And with only part of us vaccinated and a lot of people infected, it will mutate beyond the vaccine.
There once was a time when we celebrated new vaccines because they save lives. Congratulations and kudos to you.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
you bore me Eddy. You bore me to 
|
|
|
|
|
Then quit reading and responding to what he writes.
|
|
|
|
|
Slacker007 wrote: you bore me Eddy. You bore me to
Quit dreaming; I'll never bore into sh*t.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
There is no "both sides of the issue" with this. There are the people who accept the word of experts and doctors on virus behavior and its impact on society, and then the people who deny the reality of it, thinking they know more than the people who study this stuff for their entire lives.
You either know what Dunning-Kruger is, or you're on the low end of that spectrum.
|
|
|
|
|
You sure told him. 
|
|
|
|
|
Dave Kreskowiak wrote: There is no "both sides of the issue" with this
True, there's only science.
Dave Kreskowiak wrote: There are the people who accept the word of experts and doctors on virus behavior and its impact on society, and then the people who deny the reality of it
So: the people who accept science, and the religious ones.
This is not a discussion; it is about whether you accept science, with all its proof, or go for religion without any. There's no middle ground here.
And frankly, those who do believe should be left to their gods.
--edit
Dave Kreskowiak wrote: You either know what Dunning-Kruger is, or you're on the low end of that spectrum.
The Dunning-Kruger effect is a cognitive bias in which people who lack knowledge and have poor skills in social or intellectual areas incorrectly overestimate those characteristics and abilities in themselves.
Which you have proven. Quod Eddy Demonstrandum.
Did you have more to add?
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
modified 6hrs 5mins ago.
|
|
|
|
|
Lost frogs on a low-calorie diet should still know their weigh around the pond.
(Ref Post)[^]
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|
|
|
|
|
I'm sure I toad you that one last year ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|