|
During the 70s and 80s Japan was far more dominant in electronics than China is today. It was on par with the US even in high-end chips, something China hasn't quite achieved yet. And I don't see many people learning Japanese today. China is not even at Taiwan's and South Korea's level. Heck, Intel produces i5 chips in Vietnam, but not in China, and is now shifting billions of dollars of investment to Western Europe. So maybe you can put learning Chinese on hold just a little while longer.
|
|
|
|
|
We must not confuse what we today think of Japan's industry and business in the 70s and 80s with how we viewed it when we were still in the 60s and 70s! (Be aware of that 10 year shift from my time reference to yours - a lot changed during those ten years!)
We might see a similar shift from how we view Chinese industry and business today to how we in fifty years look back on how China was in the 2020s.
Maybe closing your eyes is a good strategy. I am not quite sure that it is the best strategy.
|
|
|
|
|
Another cultural difference: Asian philosophers have been critical of Western dualism for quite some time, and that might give them an edge in situations that aren't polar yes/no, good/bad, etc.
Many of the early applications of fuzzy logic control systems were in Japan.
Fuzzy logic - Wikipedia[^]
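The non-binary idea is easy to see in code. Here's a minimal sketch of a fuzzy membership function (a triangular one; the temperature ranges are made up for illustration):

```cpp
#include <cassert>

// Triangular membership: instead of a hard yes/no, returns a degree in
// [0, 1] of how much x belongs to the set described by (lo, peak, hi).
double membership(double x, double lo, double peak, double hi) {
    if (x <= lo || x >= hi) return 0.0;
    if (x <= peak) return (x - lo) / (peak - lo);
    return (hi - x) / (hi - peak);
}
// e.g. with a "warm" set of (15, 25, 35) degrees C, 25 is fully warm
// and 20 is warm only to degree 0.5 -- no binary cutoff anywhere.
```

A fuzzy controller then combines several such degrees (warm AND rising, say) instead of branching on booleans.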
You even see it in entertainment media. For instance, Hayao Miyazaki flatly refuses to do simplistic "good vs evil" stories, and it's a tendency you see in other anime and manga as well.
|
|
|
|
|
trønderen wrote: If they develop new programming languages expressed in Chinese script, the source code will make no sense to 99,9% of all Western software developers
Several assumptions in that.
First, of course, is the assumption that what's stolen is the source code in the first place. It could just be the idea. Or the process. Or hardware. Or binaries.
Second, programming languages follow a set pattern. So if you had the source code and the compiler, it would not be that hard for a knowledgeable group to create a translator that takes the source code, turns the ideographs into English, and emits the source in an English form. No need at all to work in the original language.
Third, ideographs, even used as completely as possible, do not make for a good way to program. The formalism is nice, but the day-to-day activity of producing code is better served by a far simpler character set. This is comparable to how cell phones and texting work in Japan.
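The token-translation idea can be sketched in a few lines. The Chinese "keywords" below are invented for illustration; a real tool would use the actual keyword table of the language and a proper lexer rather than whitespace splitting:

```cpp
#include <cassert>
#include <map>
#include <sstream>
#include <string>

// Sketch of the "mechanical translator" idea: rewrite each keyword token
// of a hypothetical Chinese-scripted language into an English counterpart,
// leaving every other token untouched.
std::string translate(const std::string& source) {
    static const std::map<std::string, std::string> keywords = {
        {"如果", "if"}, {"否则", "else"}, {"返回", "return"},
    };
    std::istringstream in(source);
    std::ostringstream out;
    std::string token;
    bool first = true;
    while (in >> token) {
        auto it = keywords.find(token);
        out << (first ? "" : " ")
            << (it != keywords.end() ? it->second : token);
        first = false;
    }
    return out.str();
}
```

The point being: the mapping is purely mechanical, so the script itself is no barrier to a determined reader.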
trønderen wrote: Chinese consumer authorities demand that anything sold in China shall have both user and maintenance documentation in Chinese, and be maintainable in China, i.e. software produced in a language and with tools compatible with Chinese standards?
France, and the province of Quebec in Canada: a company cannot fire an employee for being unable to use a tool when that tool is not in French. And anything sold to the government must always be in French. I have no idea how they deal with programming languages themselves, though.
trønderen wrote: If China in five to ten years, say, builds the world's strongest software industry,
Heard of Alibaba? A very, very successful software platform. So successful that the owner started branching out and financing other ventures, even internationally.
Then, 20 years later, the Chinese government decided they didn't like it. So now the founder has disappeared.
Attempts to force success just do not work. Attempting it just breeds toadies who make promises, hide failures, and sabotage anything else that might succeed. There are plenty of other examples that demonstrate this.
|
|
|
|
|
I'm not asking how to codez teh homework, and I'm looking for more theory. But, if this seems like a programming question, then just downvote this sucker and give me angry emojis.
But, C++ modules... what's the big deal with them? In JavaScript/ECMAScript it's a huge deal because prior to modules the best we had was clumsy hacks to workaround lack of support for separating code. But, in C++... I don't get it. What's the benefit over using a static library with a pre-compiled header?
Is this one of them things where history just repeats under a new name so younger devs feel like there's change when there's really not? Or is this a marketing thing where C++ is trying to play like the new, cool kid on the block too?
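For anyone who hasn't seen one, a module boils down to something like this. It's a two-file sketch (file names follow MSVC conventions; GCC/Clang accept others) and needs a module-aware compiler and build flags, so treat it as illustration only:

```cpp
// math.ixx -- a module interface unit. Only what is export-ed is visible
// to importers; macros and private helpers do not leak out, and the unit
// is compiled once rather than textually re-included everywhere.
export module math;

export int add(int a, int b) { return a + b; }

// consumer.cpp -- no textual inclusion, no include-order pitfalls.
import math;
int main() { return add(2, 3) == 5 ? 0 : 1; }
```

The claimed wins over headers are isolation (no macro leakage) and build speed, not new expressive power.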
Edit:
Also, when did using the pre-processor become discouraged in C++? I never understood the disdain for that (assuming you don't go too crazy with macros like the Win32 API does). Years and years of C coding and I never once ran into an issue because of the pre-processor, so generally I just chalk that up to people wanting to sound fancy by insulting things they have little concept of.
Jeremy Falcon
modified 30-Jan-23 13:02pm.
|
|
|
|
|
Well, I probably should've Googled a bit more. Came across a better explanation that, once you strip away the useless hoopla, says this:
Quote: Before C++ modules, only one option was available: precompiled headers. They are not standard, therefore results will vary depending on platform and compiler.
IMO that's the only valid argument for modules; everything else is a crap argument. Guess this is one area where MS was way ahead of the curve.
Note: I have nothing against modules. I'm just no longer fooled by hoopla. Decades of coding will do that to you.
Jeremy Falcon
|
|
|
|
|
The idea is not bad. But as far as I know there is no standard defined yet, and Google and MS each do it in a different way.
|
|
|
|
|
0x01AA wrote: The idea is not bad Yeah, at its core I think it's nice to standardize this. Not trying to sound poopy, just trying to get to the truth without the hoopla is all.
0x01AA wrote: But as far as I know there is not yet a standard defined and google and MS do it in a different way This makes me laugh. That twisted sense of humor kicking in.
Jeremy Falcon
|
|
|
|
|
Note that VS has one problem with modules that is worth being aware of: forward declarations. I reported this a year ago, and it would be nice to see it fixed: Visual Studio Feedback
|
|
|
|
|
Jeremy Falcon wrote: Before C++ modules, only one option was available: precompiled headers. That was never true and is even less so now. Precompiled headers just allow compilations to run a bit faster; they were never mandatory, and are probably less relevant on modern high-powered systems.
|
|
|
|
|
Richard MacCutchan wrote: That was never true and is even less so now. How so? AFAIK there was no other mechanism for avoiding repeated header compilation prior to PCH.
Richard MacCutchan wrote: Precompiled headers just allow compilations to run a bit faster, but they were never mandatory, and are probably less valid with modern high powered systems. This is missing the point. Nobody said PCH was mandatory. We're talking abstract concepts here, a bit higher level than this.
Jeremy Falcon
|
|
|
|
|
Maybe I missed the point you are trying to make.
|
|
|
|
|
More like trying to understand what the big deal is, not that I have anything against them in theory. To me it seems they may add a slightly better way of doing things, but it's not earth-shattering like it was for JavaScript/ESM, since C/C++ had more than one way to split and reuse code before this.
I'm a cranky, old-ish fart. So, ya know, I want to get past the fluff I've found on Google so far and get straight to the real talk as I ask about them.
Jeremy Falcon
|
|
|
|
|
Jeremy Falcon wrote: I'm a cranky, old-ish fart. You don't know what old is sonny. My youngest son is eight years older than you, and my eldest grand-daughter only ten years younger.
But thank you for the clarification above; it's something I need to learn more about.
|
|
|
|
|
Richard MacCutchan wrote: You don't know what old is sonny. Touché.
Richard MacCutchan wrote: it's something I need to learn more about. Me too. I think C++ evolving is cool. I'm just at that point in life where I need more than fluff or hyperbole to buy into something. Which, after a quick Googling, was all I found. So, ya know... bug CP about it.
Jeremy Falcon
|
|
|
|
|
"Before C++ modules, only one option was available: precompiled headers. They are not standard therefore results will vary depending on platform and compiler."
Finally, someone explains it to me.
And it seems I don't need to use modules then (unless someone publishes a library only that way).
But I was never a big fan of PCH (I never put compilable code in them anyway). Growing up with C/C++ (and then assembler) on everything from DOS to embedded systems, I never saw the need to question how includes and linking worked. I was always more worried about getting the smallest and fastest executable possible (VS is notorious for injecting useless junk), and if that meant using obscure/esoteric techniques from ancient Unix, so be it.
I am aware that the linking system inherited from C has something to do with being able to link to Fortran object code... I think it's high time I abused that ancient caveat.
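That flat, unmangled C-style linkage is what `extern "C"` still exposes in C++. A minimal sketch (declaration and definition are kept in one file here just so it's self-contained; normally the declaration lives in a shared header):

```cpp
#include <cassert>

// extern "C" suppresses C++ name mangling, so the symbol gets the flat,
// C-style name that linkers have handled since the Fortran/C days. That
// is what lets C++, C, and Fortran-built objects resolve each other.
extern "C" int c_style_add(int a, int b);

extern "C" int c_style_add(int a, int b) {
    return a + b;
}
```

The cost is that you lose overloading on such functions, since the mangled name is exactly what encoded the parameter types.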
|
|
|
|
|
Yeah, agreed on all fronts. And I don't have anything inherently against modules either. But if, like us, you've been around the block... you need more substance than hoopla or hyperbole to accept things.
Juan Pablo Reyes Altamirano wrote: I am aware that the linking system inherited from C has something to do with being able to link to Fortran object code...I think it's high time I abused that ancient caveat That's cool to know. Thanks.
Jeremy Falcon
|
|
|
|
|
I don't know about modules; in my current job/career path, I will probably never use them.
But about the pre-processor thing ... I think it was just abused too much at some point and became a mess; there are just better ways to do the same things with modern language features.
It's the same as with goto: in itself it's not bad, but badly used it can be a big problem (resource leaks, safety issues, ...).
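For example, the usual modern replacements look like this (the names are just for illustration):

```cpp
#include <cassert>

// Old-school macro: no type, no scope, and the argument can be
// evaluated twice (SQUARE_MACRO(next()) would increment twice).
#define SQUARE_MACRO(x) ((x) * (x))

// Modern replacements: typed, scoped, visible to the debugger.
constexpr int kBufferSize = 4096;          // instead of #define BUFFER_SIZE 4096

template <typename T>
constexpr T square(T x) { return x * x; }  // instead of SQUARE_MACRO
```

Same generated code in practice, but the compiler (not the preprocessor) now understands what you meant.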
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
Maximilien wrote: I think it was just abused too much at some point and became a mess; there are just better way to do the same thing with modern language features. Fair enough, and I can see that too. The good must suffer with the bad.
As a joke, a buddy of mine once decided to use macros to give C a Pascal-like syntax. It was only a joke, but some people would be like..... noice.
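In the spirit of Steve Bourne's famous Algol-style macros in the original Bourne shell source, the joke presumably looked something like this (my own reconstruction, not anyone's actual macros):

```cpp
#include <cassert>

// Deliberately horrible: macros that dress C up in Pascal-ish clothing.
#define BEGIN {
#define END }
#define IF if (
#define THEN ) {
#define ELSE } else {
#define FI }

int sign(int x)
BEGIN
    IF x < 0 THEN
        return -1;
    ELSE
        return x > 0 ? 1 : 0;
    FI
END
```

It preprocesses back into perfectly ordinary C, which is exactly why it's funny and exactly why nobody should ship it.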
Jeremy Falcon
|
|
|
|
|
It would be even noicer if he created a parser/lexer from scratch using that Pascal-like syntax! Full circle of madness!
|
|
|
|
|
not all heroes wear capes.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
|
In the ‘70s, we wrote a PL/1 BNF parser to update Fortran to Fortran77. It soon became apparent that it would be easier to enhance if we wrote it in itself, and someone immediately wrote the inverses to check both. That did it. Soon there was a competition to create the longest loop through the most languages to produce functioning code, with extra points for generating similar code with the same variable names. The project was still going in late ‘82 when I left.
|
|
|
|
|
Noice. That's cool.
Jeremy Falcon
|
|
|
|
|
I haven't graduated to C++20 yet, so I won't comment on modules.
You haven't run into issues with the preprocessor?
I tried to compile my embedded streams library for an older platform.
It failed because my virtual classes had functions like getc and putc. Shouldn't be a problem, right?
Wrong.
Because somebody decided to take every C library function and wrap it with a macro, such that getc was #define getc __getc or some garbage like that.
Also, scope pollution. The struggle is real. There is no way to control scope with macros, which is why I avoid them like the plague except as configuration options (set from the build scripts) and in implementation C/CPP files where they won't pollute the global namespace.
To err is human. Fortune favors the monsters.
|
|
|
|