It is my honest opinion that firmware should not use dynamic memory allocation at all. It has a well-defined job to do; the memory requirements for all tasks should be possible to calculate in advance. That is, the code and data should be structured so that all memory needs can be calculated in advance, and that memory set aside at compile time.
There are reasons why Fortran (with no dynamic memory allocation, and for many versions without a stack - no recursion possible) held up for so long. One of the reasons was that it was reliable with respect to memory allocation: it would never crash due to a failed allocation or a too-deep recursion.
Ada, made for military robustness, did allow recursion, but only if a static code analysis could calculate the deepest nesting level and stack requirement.
CHILL, created for embedded code in telephone switches, required that you mark functions that are to be called recursively. I don't think it made any difference in the generated machine code; it was just a way for the compiler to raise a red flag: here you are recursing into a function that is not marked as RECURSIVE. Look out - maybe you will run into stack issues.
Embedded code is executed in an environment where you don't have the same facilities as for desktop applications. Often, fewer resources are available; you can't just let the stack grow to fill those 16 GBytes of RAM. So the programmer cannot be allowed the same freedom as a standard application developer. Far more restrictive use of dynamic memory and of the stack are among the limitations an embedded developer should accept.
I know well that this is certainly not always the reality in embedded code.
Great post. Those are some great points.
and I guess the thing is that I have this piece of nifty hardware that feels like an FM radio receiver or something that just works all the time.
You know what I mean...you never had to reboot your FM radio receiver. There were no chips that got full. It "just worked"!
Anyways, I have this hardware that feels like hardware but it is actually software inside a brittle plastic box and it is, of course, the software that is broken.
It's always software down at the bottom ruining everything.
Also, I know the FM radio receiver was analog and wasn't a computer and that's why it "just worked". Which makes me wonder...Instead of Quantum Computing maybe we should switch back to Analog Computers so they'll be dependable.
We don't need no stinking software!
I'm glad I'm not the only analog electronics fan (in some areas)
For example I'm fed up with digital switches that keep draining batteries even when you switch off (Hey Sony A7 or Thinkpad8, can you hear me?) or refuse to power on when you switch on and you have to hold the switch down for 1 minute 😡
And then, do you want to compare the pleasure and comfort of a physical switch you can feel with your fingers and operate even in the dark vs a touch one?
How about the pleasure of fine-tuning FM radio with an analog tuning wheel vs a beeping digital RDS scanner deciding for you which tunes are OK to play and which ones to skip?
An analog volume amp vs a digital one?
The 1st analog Walkman and its headphones vs the new digital players with earbuds?
The 80s analog HiFi systems vs the current digital micro HiFi?
Am I the only analog nostalgic?
Please bring back some old school analog devices 🙏 even at premium price I'd buy some
I'm with you on just about all of that except volume controls. It would usually be nice if more bits were used in the digital signal that gets converted to a voltage before controlling the amplification, I will admit that. But potentiometer volume dials on portable audio devices, without exception for me, always developed regions of high resistance (low volume); [+]/[-] buttons never had this problem.
However - after getting a half-decent car stereo, I wouldn't take an analog unit if it were free. The DSP and the included impulse responses can make poorly positioned speakers magically sound like they've moved from calf height to shoulder height. I could never wrap my head around that before looking at convolution, the fast Fourier transform, and a bunch of other software tricks. Tricks unavailable without a CPU.
The DSP and the included impulse responses can make poorly positioned speakers magically sound like they've moved from calf-height to shoulder height.
Not only speaker placement and such...
In 2016, the last year before Norway closed down FM transmissions in favor of DAB, I spent weekends and summer vacation driving up and down the country to compare signal and sound quality. I upgraded my car radio to a quite expensive one, but also tested with a cheap, battery-powered portable radio with no fancy circuitry. In locations where the signal was so weak that you could hardly make anything out of the "slightly modulated noise" in the portable, the sound from the expensive, heavily DSP-equipped car radio was … well, certainly not without distortion, but well within the acceptable range for car listening.
The only "problem", if you are in a hurry, is that the DSP takes time! Don't synchronize chronometers with one set by a primitive, straight-through radio and the other by a radio with heavy DSP: the latter may be a second or two behind.
I partially agree, but maybe for more "abstract" / conceptual reasons. When a value is measured, as opposed to counted, the value is analogous to the real-world phenomenon: you have an analog value. I hate digital speedometers, where I must continuously interpret a numeric value and compare it mathematically to the speed limit … It does not require any major mental effort, but it is so indirect! My speed is a natural phenomenon, completely independent of Arabic numerals in a base-10 number system. The speed limit is a point on a continuous scale, and the needle should not move past that point. Simple and direct.
Watches are similar: twice as long a time is twice as large a movement. Direct and logical.
Then come those who make everything into curves and bars and whathaveyou. I've seen recipes showing that you need five apples as a bar of length 5, rather than as 5 countable apple icons. Please leave countable values in the countable domain! (Or, as we computer guys say: as integers.)
How about the pleasure to fine tune FM radio with an analog tuning wheel vs a beeping digital RDS scanner
This is where I am beginning to disagree somewhat ... Sound is, in its nature, analog. The selection of one radio channel among a countable set of channels is not an analog operation. Conceptually, it belongs in the digital domain.
Furthermore: we have had all-digital DAB radio for three years in Norway. Three years completely free of hiss due to poor signal. Free of electrically induced noise - car ignition and such. Free of interference from neighboring channels.
From a technical viewpoint it has some great advantages as well. It is extremely modest in frequency and bandwidth requirements: a 2 MHz frequency band covers the entire country; you never need to allocate another transmission channel (frequency) for that set of program channels. A transmitter draws a fraction of the electrical power of an FM transmitter covering the same area, while carrying 12-20 program channels rather than a single one. If the signal from the transmitter to the north is too weak, and so is the signal from the one to the south, maybe the sum of those two signals is good enough. If it is not, there is no need to set up a huge full-power transmitter between them: a small, cheap transmitter can add to the existing signals (one, two, or more) to raise the sum to the required level.
There are several countable, non-continuous aspects of the radio: the selection of a multiplex, of a program channel within that mux, the set of transmitters, the mux ID and program ID, the supplementary text (like RDS text), ... Really, the only continuous, analog component is the sound itself. And that has improved greatly in the move from FM to DAB; I'd say extremely so in your car radio when driving around in the Norwegian valleys, with mountainsides for the FM waves to bounce back and forth between, and hills to shade the signal down to a minimum (and below).
The radio system seen as a whole: I am certainly not nostalgic, longing back to the FM days. The switch to DAB has been a great step forward in all significant aspects. The only thing that went wrong is that it took 20+ years from when it was standardized until it took over (and in several European countries it is still secondary, while we wait for the FM transmitters to break down from old age). If DAB had been pushed through ten to fifteen years earlier, it would have hit the (end of the) great radio years. It is a pity that we introduce a truly great radio technology at a time when ever fewer people listen to the radio.
(This is in particular if you distinguish between "hearing" and "listening": listening is wanting to hear. Not just hearing some background music, which can come from any jukebox-type source, but listening to a great radio play, in-depth news reports, causeries, reports from festivals all over, debates, ... all the non-jukebox stuff. People of today seem to devalue that stuff; they don't need the radio for more than background music. Without listening.)
I wanted to reply yesterday, but my 40 EUR/month ADSL from Vodafone Italy is completely unusable in the evening; the provider can't handle the increased traffic due to the COVID emergency (young people stay at home watching Netflix instead of going out)
Regarding the 1st part of your comment, about converting infinite-scale signals into finite entities (digits): I guess it's all about resolution. If the converter has more resolution than our senses can perceive, then there is no difference between digital and analog; that can apply to color or sound fidelity, for example.
My original comment, however, was more about the "feeling" you get while using a device. With analog devices I could get more physical feelings. When I was a teen in the 80s I had a great experience listening to my pocketable analog stereo radio on my bed using over-the-ear headphones (same as the 1st Walkman model). The tuning wheel was so precise and sensitive I could almost feel like I was catching the waves with my fingers; if I heard a nice song that was a bit disturbed, I could easily improve the quality by moving the radio a bit in my hands. Overall, the sound was full of juice, with wide-open stereo - yes, maybe a bit of background white noise, but overall I got very good vibes and feelings. Current pocketable devices sound dull in my opinion, and the interaction gives no "physical" feeling. It's difficult to describe, but I hope you understand what I mean.
Another pleasant "analog" experience I forgot to mention is car A/C. Until the early 2000s you could find cars with manual A/C, where you tune the temperature with a mechanical wheel. I loved that because I could fine-tune the temperature based on how I felt, and I could do it while keeping my eyes and focus on the road. Today you have a touch-screen UI where the temperature is chosen by tapping a '<' or '>' icon that increments the desired temperature to 20, 21, etc., with a stupid sensor placed who knows where deciding instead of me how much cool air to pump in. When you tap the icon you risk an accident because your eyes and focus go to the touch screen 😱 The car's voice assistant can help, but I once rented a car that suddenly started speaking French because of a software bug 😱
Even my Philips razor and hair clipper are now digital, with touch-screen interfaces, LCD displays, and sealed Li-ion batteries, and often they don't do what I want. The razor's software switch now randomly doesn't turn on, so I bought a low-cost double-AA-battery razor with an analog switch. Such a relief! Damn, you just need to turn a rotating blade on and off; why throw in all that digital sh*t?
What I want to say is that digital has certainly brought us many great things, but there are some areas where analog was simpler, more effective, and more pleasant. I guess Jamiroquai anticipated this with the song "Virtual Insanity".
So I hope that when this digital fever is over, some good-sense analog electronics will come back
Touch panels are not calibrated for my fingers. Maybe the skin on my fingertips is too dry, or too oily, or too warm, or too cold, or too conductive, or too insulating, or ...
Whatever the reason: I can never make a touch control panel work on the first try. I have to tap at least two or three times to get any response at all. Some of them are extremely sensitive to your speed as well: our office coffee machine is one of the few that takes my tapping on the second try - it changes the icon when it reacts. But if I give it a brief tap, it says "Oh, so you just touched me by accident. Then I'll go back to square one." So I try again, pressing a little longer, and it says "I see that you keep pressing, so I assume that you changed your mind and want to cancel." The tap must be of exactly the right duration (or at least within very narrow margins) to work.
None of my coworkers has problems similar to mine. They stand around watching me try again and again. Maybe on the fourth try I succeed in getting some response, with a tap of the right duration.
Texting on my smartphone simply doesn't work, so I bought myself a Bluetooth keyboard - that works well. My baking oven is not that sensitive to the duration of the tap, but it isn't very sensitive at all - I may be tapping five or six times for it to respond.
I don't mind Up/Down pushbutton pairs replacing a dial. My portable radio has a single dial, used for scrolling up and down in various menus; that is fine with me. I am not bothered by digital controls. But I hate touch panels!
Allocating most memory during initialization, based on configuration parameters, is good practice in servers.
I worked on telephone switches and never wrote anything recursive. But I did write some interesting code in which function #1 invoked function #2, which in turn invoked function #1.
A former boss wrote the code for call waiting. When one of the calls needed something done on the other call, it sent a message. After several releases the code stumbled into an obscure path where the two calls just kept exchanging messages, creating an overload situation. Although there was code to guard against one call getting into an infinite loop, this hadn't been anticipated. When the code was fixed, defensive code was also added to guard against any one call using an unbecoming percentage of the CPU time.
But I did write some interesting code in which function #1 invoked function #2, which in turn invoked function #1.
That counts as (indirect) recursion!
I understand that it was unintentional. If that had happened in a no-stack implementation (such as classical Fortran), your program would be likely to crash when control returned to the first function #1, or at least upon return from it - possibly a long time after the second call of #1.
You did not say whether this indirect recursion caused problems! Did it happen in a stack-based environment, saving the situation, or do you recall it because it failed?
(recursion, n.: When you first curse because of the crash, then recurse when you understand why it crashed.)
It certainly is recursion. Mutual recursion, perhaps?
It was deliberate, and in a stack-based environment, but it would rarely occur. The code is still running after more than 30 years, and I doubt it has ever recursed beyond one level, although it would be possible to set up a test where it did.
Even garbage collection may leave the heap with external fragmentation. If you want GC to leave a heap with no external fragmentation, either the GC must trace every pointer update in all software, which is rather intrusive - and in languages that allow casting between pointer and non-pointer types, tracing e.g. the use of an integer that has received an (int)pointer value is non-trivial.
Or you must add another level of indirection for all heap accesses: the code's "pointer" is really an index (or "handle") into a pointer table, which is the only place where the actual heap pointer is found. This strategy is used in some systems, such as some Windows structures and, I believe, the JVM. It adds a (small) execution overhead, but the biggest problem is that it is a poor fit for systems that manipulate addresses directly, such as C/C++ pointer arithmetic.
In some old architectures, now more or less completely forgotten, such indirect memory addressing was directly supported by hardware. One of the few that enjoyed a (short, very short!) commercial life was the Intel iAPX 432 CPU. In the 1980s there was a whole crowd of experimental, one-of-a-kind "capability based" machines like it. Typically, the pointer table also held a lot of access control flags etc., and could support virtual memory. (Sometimes I wish Intel had brushed up that 432 architecture so that it could be used in modern systems - it did have a number of interesting features!)
The primary task of GC is to detect inaccessible memory blocks so they can be freed, and to combine neighboring free blocks into a single larger one. Compacting memory blocks comes as an extra.
Most routers are based on Linux. The Linux weenies claim Linux gets great uptime.
Very good point. I run Ubuntu on my home rig and haven't booted into Windows for a couple of weeks now either -- since WFH. But, you are right...it may be Linux's fault down at the bottom.
I'll have to see if I can tell what my generic cable modem (from Spectrum) runs.
The closest approximation to what you're claiming I said is that there are routers that have to be rebooted more frequently than some of my Windows systems (or words to that effect--they're still there for you to go back to and re-read).
I sometimes get nostalgic, remembering that "Made for Windows 95"(*) sticker: To be granted permission to put that on your product (ads, package etc.), there was a set of requirements you had to satisfy. One of them was that installation/uninstallation should not require any restart. This was considered a major step forward - from the DOS days onward, we were used to most software installation requiring a reboot.
I believe this was upheld for Win98, but for XP, the no-reboot requirement was gone, and it has been gone since (although the facilities for making no-reboot installers are a lot better today than they were for Win95). I can - sort of - excuse MS for requiring reboot after updates of OS kernel modules (but talk to those making e.g. telephone switch software: Every module, kernel or not, must be replaceable without rebooting the switch).
A few years ago, when Windows updates were downloaded and run one by one, most of them did not require a reboot. Nowadays, with everything wrapped up into a composite package, chances are high that at least one of the components sits so deep in the core of the OS that a reboot is required. A small update may happen to comprise only no-reboot components; there is no rule requiring a reboot after every Windows update.
(*) Maybe I do not remember the wording "Made for Windows 95" correctly; it may have been slightly different.
Are you sure that it is "real" Linux, or just "Linux-like"?
I have never worked with routers specifically, so you may be right. Lots of monitors / OSes / executives / kernels (whatever you call them) present themselves as "Linux-like", and it doesn't take very much to claim that. Often it is a tiny little fraction of the API - "what is needed" - with identical function declarations, but the implementation is completely independent and not based on Linux source code.
I started working with embedded systems using the 8-bit 8051 architecture. Even for the 8051, there were people claiming to have Linux-like kernels. Chips of today are far more powerful, and many of them could run "true" Linux, but you will usually try to keep RAM size down to reduce both cost and power consumption. There generally is no need for a significant part of the Linux functionality. If you look up "List of embedded operating systems" in Wikipedia, there is a long list not in the "Embedded Linux" category (but the majority of them would claim to be "Linux-like" - or at least many of their users would say so).
On the other hand: the task of IP routing requires so much processing power that I guess the extra burden of running a "full" Linux may not make that much difference.
(Nostalgia: 25 years ago, I was supervisor for a student project setting up an 8-to-8 switch: a single AT-bus board with eight 155 Mbps lines in and eight out. Ideally, if none of the outputs was fed more than 155 Mbps, this AT board could reach a throughput of 1.24 Gbps, which was quite a feat in 1995. But that was ATM routing, not IP routing.)
Are you sure that it is "real" Linux, or just "Linux-like"?
That's really the key, isn't it?
In Linux's defense, how hard do you have to work at it to take some Linux source, make a change, and as a result destabilize it so badly that you now have a version that has to be rebooted every few days?