True or not, this story made me laugh. Especially the harebrained attempt to cover up the 'problem' and the reaction when it led to some questions. Girl, I have a solution. It's a box of latex gloves. I use them when I spray around with paint or whenever I don't want to get something else on my hands. Leave my socks alone.
Let's say this is not safe for work, even if nothing offensive is shown: YouTube[^]
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
I believe I was in college when I started playing around with C (self-taught). I loved the expressiveness and was sold on it at that point. Then, as I played around more and more, I started bumping into situations where the logic was becoming nested so deep that it became painful to keep track of (i.e., 'if' statements nested seven levels deep).
When I finally got my head around C++, those logic nightmares vanished with the magic of virtual functions. It was a HUGE shift, and magnificently freeing.
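To illustrate the kind of shift I mean (a made-up example, not from my actual code - Shape, Circle, Square and Report are invented names): instead of a type code driving nested 'if's, each type supplies its own behavior through a virtual function.

#include <iostream>

// Instead of branching on a type code in nested 'if's, each concrete type
// carries its own behaviour behind a common interface.
struct Shape {
    virtual ~Shape() = default;
    virtual double Area() const = 0;   // the operation every shape must provide
};

struct Circle : Shape {
    double radius;
    explicit Circle(double r) : radius(r) {}
    double Area() const override { return 3.14159265 * radius * radius; }
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double Area() const override { return side * side; }
};

// The caller never tests what it was given.
void Report(const Shape& s) { std::cout << s.Area() << '\n'; }

int main() {
    Circle c(2.0);
    Square q(3.0);
    Report(c);   // dispatches to Circle::Area
    Report(q);   // dispatches to Square::Area
}

The nesting on the caller's side simply disappears, and adding a new shape never touches Report() at all.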
I've always wondered why so many people like yourself seem to love C so much, given how easily its syntax can become ugly, with all the casting involved, and how easily you can end up with the deeply nested logic I encountered. I'm not asking in order to attack C; I'm just curious whether I'm overlooking something big, since most C stuff can be done in C++ if you choose not to use objects. And if you objectify things even slightly, the casting is reduced.
Working in C, and doing it well, requires an unusual level of discipline. For many years, I worked in a similar proprietary language, but it had stricter typing and other concepts alien to C (more like Modula, say), which was one reason we could compete with larger firms that used C.
Much of our better software was rather object-oriented, but it was done manually. A struct of function pointers would be defined; this was effectively an abstract class. A concrete class would populate it with pointers to its own functions and register it against a type index, so that it could be invoked polymorphically through an array of such structs. There were even some ad hoc examples of inheritance.
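In outline it looked something like this (names invented for illustration - the real code was in our proprietary language - but the shape is the same, and this sketch is plain C):

#include <stdio.h>

/* The "abstract class": a struct of function pointers. */
typedef struct {
    const char *(*name)(void);
    double      (*area)(double size);
} ShapeOps;

/* Two "concrete classes" supply their own functions. */
static const char *circle_name(void)     { return "circle"; }
static double      circle_area(double r) { return 3.14159265 * r * r; }

static const char *square_name(void)     { return "square"; }
static double      square_area(double s) { return s * s; }

/* Each one is registered against a type index, so calls go through the array. */
enum { SHAPE_CIRCLE, SHAPE_SQUARE, SHAPE_COUNT };

static const ShapeOps shape_table[SHAPE_COUNT] = {
    { circle_name, circle_area },   /* SHAPE_CIRCLE */
    { square_name, square_area },   /* SHAPE_SQUARE */
};

int main(void)
{
    /* "Polymorphic" call: the caller only knows the type index. */
    for (int type = 0; type < SHAPE_COUNT; ++type)
        printf("%s: %f\n", shape_table[type].name(),
               shape_table[type].area(2.0));
    return 0;
}

Dispatching through shape_table[type] is roughly what a vtable does for you in C++; we just wrote it out by hand.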
Building these things by hand was tedious, but it worked well when people took the time. Naturally, there were also horrors like deeply nested functions, a surfeit of global variables, obscure side effects, and bugs fixed by
IF (the conditions under which bug #aannnn occurs are true)
    update a little of this and a little of that so the code can carry on successfully;
This isn't a problem restricted to procedural languages, but I find that C++ does a better job of encouraging one to find the root causes of problems instead of working around them, which helps both to keep the code clean and to evolve the system. Not to mention that having polymorphism, inheritance, and encapsulation built into the language saves a lot of time!
Those are some of my thoughts as well. I'm hoping Richard gives his input, based on his experience, because I'm very curious as to why some people stick with C. For instance, I hear Linus say that C is the only language for building operating systems in (or something like that), and I just shudder at the thought of such an endeavor without the help of thinking of things in objects. In my mind I see tons of errors that would be easily avoided using higher-level constructs. But I haven't attempted that type of project, so I can't say much about them. One argument in favor of C may be a smaller memory footprint (~5%???), which might be critical for embedded code, but even then??? If you have to build objects by hand like you mentioned, wouldn't those savings be moot???
Given that C++ is basically a superset of C, Linus is wrong that C is the only viable language for operating systems. "Too much" C++ would unduly degrade performance, but an operating system isn't the only place where this is a consideration.
For memory footprint reasons, Embedded C++ eliminated RTTI, exceptions, and templates. At the time, this made some sense. But with memory as cheap as it now is, I doubt there are many cases where it still does.
For embedded applications, the amount of memory is a significant factor in the power budget. For several classes of IoT devices (or similar devices on other standards), battery lifetime is essential. If a chip with less memory can give you 30% longer battery-change intervals, and that memory is still enough, you'd go for it.
This matters most for RAM; code often resides in flash/ROM, which needs no power to retain its data. But when the flash/ROM is accessed, some solutions have power requirements that depend on the memory size (although not by 30% of the total).
Another aspect is that even though the real cost of manufacturing an SoC is almost independent of the memory size, the vendor will often differentiate the customer pricing significantly: customers wanting the 256 MiB version subsidise those who go for the budget 1 MiB version.
(Historical note: Around 1980, in the age of VAX-class superminis, I was working with a company selling a series of machines in three significantly different price ranges. The CPU was identical in all models, but in the budget model the CPU cache was removed - caches were distinct chips in those days. The high-end model was delivered in a double cabinet, with lots of space for peripheral interfaces. I talked with a customer of the top model who was convinced that their machine was a lot faster than the midrange model. On learning that they were identical with respect to CPU power, she seriously considered suing the vendor for fraudulent business practices.)
I use embedded for any system dedicated to a specific purpose, usually with specialized hardware. This ranges from toasters to IoT to smartphones to servers. But I've never seen a formal definition, so YMMV. Even IoT must mean many things, because I've seen articles about Linux for IoT.
Sure. I consider "embedded" to be any CPU that doesn't present itself through an explicit user interface (to the computer as such), but receives its commands from some source other than a human user. Maybe the user pushes some button or rotates a knob, but that is all defined by the function of the device, whether a car, rice cooker, stereo system or whatever. The user is unaware of the CPU; in theory the function could be realized by other means. (E.g. up/down buttons could, in principle, be direct power switches to motors pulling a potentiometer one way or the other.) As long as the device has plenty of power available - including cellphones acting as a central for several sensors - there is no need to worry about the power requirements of a larger RAM. My concern was with the button-cell-powered sensors etc.
The cellphone itself has quite an extensive power management system: the circuits are organized into several power domains which are turned on and off individually. If some circuit is not required at the moment, it is turned off to save the power of keeping it available. The more (and smaller) power domains the chip defines, the more focused the power management can be, and the more power can be saved. E.g. some chips allow power to be cut to half or three quarters of the RAM if the current load does not require more. Yet cellphones have huge batteries compared to small button-cell-powered sensors.
(The Bluetooth Low Energy standard essentially reduces energy consumption because the slave/sensor and master/central make agreements: "We'll talk again in exactly 875 milliseconds!" In the meantime, the slave turns off all power except for a clock programmed to wake up the chip just in time for the agreed next communication.)
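In firmware terms, the slave's main loop boils down to something like this - just a sketch, with made-up stand-ins (read_sensor, radio_exchange, deep_sleep_ms) rather than any real BLE stack's API, which would schedule the connection events itself:

#include <stdio.h>
#include <stdint.h>
#include <unistd.h>   /* usleep() stands in for a real low-power wake-up timer */

#define CONNECTION_INTERVAL_MS 875u   /* the agreed "talk again in 875 ms" */

/* Made-up stand-ins: a real BLE stack and HAL provide their own calls. */
static uint16_t read_sensor(void)          { return 231; /* e.g. 23.1 degrees */ }
static void     radio_exchange(uint16_t v) { printf("sent %u to central\n", (unsigned)v); }
static void     deep_sleep_ms(uint32_t ms) { usleep(ms * 1000); /* really: power down all but a wake-up clock */ }

int main(void)
{
    for (;;) {
        radio_exchange(read_sensor());         /* a brief burst of radio activity */
        deep_sleep_ms(CONNECTION_INTERVAL_MS); /* silent until the next agreed slot */
    }
}

The point is only that between those two lines, essentially nothing but the wake-up clock draws current.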
For the time being, I use the "Internet of Things" (IoT) term only for devices communicating over an IP-based protocol. There are several other wireless alternatives, some of them proprietary. Many of them have been used for years, long before the IoT term was invented.
I guess that within a few years, IoT will blur into a general term for any small device communicating with some more central unit - IP protocol or not. E.g. I've got a couple of thermometers / "weather stations" receiving information from sensors for outdoor and indoor temperature, rainfall, wind... They are old, from long before the IoT protocols were specified. Nevertheless, they will soon be called IoT devices.
"True" IoT devices may use Bluetooth as a physical carrier. Alternately, they may use BT profiles directly (with far less overhead), maybe as a pure software choice. So is it an IoT device on one protocol stack, but not on another one? Borderlines will washed out in the future. But for now, I assume that IoT refers to devices running IP based protocols from the Internet IoT protocol family.