|
Not sure, but are you answering with irony? What I have in mind is the next step beyond the compiler-compiler.
|
|
|
|
|
0x01AA wrote: long dreamed of a universal 'meta language'
It cannot exist.
That is why so many languages exist. Reasons vary, but in general a need was not being met, or the newer one was supposed to be better.
If anything, you can look to the many languages that use the Java VM and yet are not Java. And that is not the only platform on which other languages have been built.
|
|
|
|
|
honey the codewitch wrote: I went about porting my DFA lexer engine from C# to TypeScript
Just noting that javascript has a very feature rich regular expression support.
honey the codewitch wrote: allow you to implement computer sciencey algorithms and constructs
Your description is incomplete. From what I read in your comment, it does allow you to do it.
It just is not as fast as you would like.
honey the codewitch wrote: is one example but I can basically crash it or stall it out for a long time at least with any non-trivial expression.
Perhaps this is your basis? Without having looked at that solution at all, I can only note that being Turing complete does not mean that the language has no bounds. That is not part of the definition, and all languages would fail at that.
honey the codewitch wrote: or is it just being adopted because we can?
I know a business that was sold for quite a bit of money which ran high-performance, high-volume, real-time data processing using JavaScript.
Not the platform I would choose, and I don't know what the hardware costs were, but they certainly did it somehow.
I doubt most others would choose that also. For complex systems one often uses a mix of technologies.
|
|
|
|
|
jschell wrote: Just noting that javascript has a very feature rich regular expression support.
Can't lex, and won't fulfill the project requirements, which are "learn typescript"
jschell wrote: Your description is incomplete. From what I read in your comment it does allow you to do it.
It just is not as fast as you like/want.
There is a point where something takes too much time and space to be practical. That is a real thing. That is the issue here.
So no, it's not "just not as fast as I'd like", it is not usable.
jschell wrote: Perhaps this is your basis? Without having looked at that solution at all, I can only note that being Turing complete does not mean that the language has no bounds. That is not part of the description. And all languages would fail at that
It's not. Really none of this is relevant.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: which are "learn typescript"
lol...well yes that doesn't work.
honey the codewitch wrote: There is a point where something takes too much time and space to be practical. That is a real thing.
At least with regular expressions in general (so perhaps not your solution, or perhaps it is), it is possible to create ones that will never end, or that will take days to complete.
So that by itself is not a deciding factor.
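As a minimal sketch of what I mean (hypothetical, not taken from anyone's actual code): JavaScript's default backtracking matcher can be driven into exponential time by a nested quantifier, even on very short inputs.

```typescript
// Classic catastrophic backtracking: the nested quantifier in /^(a+)+$/
// lets the matcher try roughly 2^n ways to split the run of 'a's
// before it can conclude that the match fails.
const evil = /^(a+)+$/;

function timeMatch(n: number): number {
  const input = "a".repeat(n) + "b"; // trailing 'b' forces a failed match
  const t0 = Date.now();
  evil.test(input);
  return Date.now() - t0;
}

// Each additional 'a' roughly doubles the time; around n = 30 a single
// .test() call can stall the engine for minutes on typical hardware.
console.log(timeMatch(20), "ms");
```

On an all-'a' string the same pattern matches instantly; it is the failing case that explodes, which is why such expressions can lurk unnoticed until the wrong input arrives.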
|
|
|
|
|
This isn't even to run the expression, just to turn it into a state machine.
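The blowup during construction is a known phenomenon. A small sketch (hypothetical, not the engine discussed here) that counts the DFA states produced by subset construction for the textbook pattern `(a|b)*a(a|b){n}`, whose minimal DFA provably needs 2^(n+1) states:

```typescript
// Subset construction over the NFA of (a|b)*a(a|b){n}.
// NFA states 0..n+1: state 0 loops on a/b and moves to 1 on 'a',
// states 1..n advance on either symbol, state n+1 is accepting.
// A DFA state is a bitmask over NFA states; we count reachable masks.
function countDfaStates(n: number): number {
  const step = (mask: number, isA: boolean): number => {
    let next = 1;                                // state 0's self-loop
    if (isA && (mask & 1)) next |= 1 << 1;       // 0 --a--> 1
    for (let i = 1; i <= n; i++) {
      if (mask & (1 << i)) next |= 1 << (i + 1); // i --a|b--> i+1
    }
    return next;
  };
  const seen = new Set<number>([1]); // start state: { NFA state 0 }
  const work = [1];
  while (work.length) {
    const m = work.pop()!;
    for (const isA of [true, false]) {
      const nx = step(m, isA);
      if (!seen.has(nx)) { seen.add(nx); work.push(nx); }
    }
  }
  return seen.size;
}

console.log(countDfaStates(3)); // 16 = 2^(3+1)
```

Even n = 20 here would yield over two million DFA states from a pattern a couple of dozen characters long, before a single input character is ever matched.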
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Assembler is the only "real" language, everything else is just syntactic sugar.
"A little song, a little dance, a little seltzer down your pants"
Chuckles the clown
|
|
|
|
|
An assembler is what assembles an assembly language. And, of course, even that is just syntactic sugar over machine language.
|
|
|
|
|
And machine language is only syntactic sugar over the rearrangement of charges in the CPU, memory, etc.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
To be fair, though, the charges are not a language.
Although, as far as I know, all modern processors use 'micro-code'. Googling does not really answer whether that is Turing complete, though.
|
|
|
|
|
If you can build a Turing machine out of arrangements of electron charges (which you obviously can), then I would claim that it meets my definition of computer language.
We don't usually think of that as a language, but it is no more arbitrary than using certain shapes to represent letters is.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Real programmers use butterflies.
*hides*
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Obligatory xkcd: Real Programmers
Real programmers use emacs!
(Dons asbestos suit, runs and hides)
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Daniel Pfeffer wrote: If you can build a Turing machine out of arrangements of electron charges (which you obviously can)
And computers are built using atoms but I am not going to claim that atoms are computers.
|
|
|
|
|
I could also argue with you that C is the only other real language when used as originally defined back in the (what was it) 70's!
On a serious note (at least for me), anything that is not a strongly typed language suffers from a serious flaw in its ability to do the job intended (as opposed to written!)
Actually, I have changed my mind, even assembly isn't a REAL programming language.
The only REAL programming language is the string of 0's and 1's (or bits) that the CPU consumes on its way to World Dominance!
All Hail the AI Overlords!
|
|
|
|
|
Among other things, such as Turing-completeness and datatypes... with a "real" programming language, you can implement its compiler/assembler. By this requirement, no interpreted language is a "real" programming language.
But there are definitely uses for other programming languages, domain-specific languages in particular. And Operating System scripting languages.
Is JavaScript a domain-specific language? I don't know, I never use it, but it seems like it might be.
In my opinion, the question comes down to do we really need programming languages which are neither "real" nor "domain-specific" nor "Operating System Specific"? E.g. "portable scripting (glue) languages" such as Perl and Python.
Gimli: Very handy in a tight spot, these lads
|
|
|
|
|
PIEBALDconsult wrote: with a "real" programming language, you can implement it's compiler/assembler
Interesting definition.
So for that it would exclude C#, Java, JavaScript.
But would include C/C++, Fortran and Pascal.
Focusing on C# and Java: they can create a binary file. It is, after all, just a matter of writing to a file.
So they can, for example, create their own interpreter. Perhaps as a hack, but they can do it.
They can definitely create their own compiled (byte code) files; there are libraries in both languages for that.
So creation, to a certain extent, is not it.
So is it the two-step process that makes them not fit the definition?
Isn't C/C++ 'built' using a compiler and then a linker? Although those can be one application, the process of each is distinct. And I have certainly used systems where they were distinct applications.
Additionally, I can find both a Microsoft and a gcc linker right now. So they still exist, however they might be used.
So there are still two steps.
I was also wondering where Lisp fits into the above. Definitely a compiled language. But no way would I want to create a compiler using that.
|
|
|
|
|
jschell wrote: Definitely a compiled language. But no way would I want to create a compiler using that.
Where's your sense of adventure? You mean you don't want to have to maintain 247 level nested parentheses 6 months after you put the code down?
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: 247 level nested parentheses 6 months
lol...I think you are underestimating the needs of writing a compiler by at least an order of magnitude.
|
|
|
|
|
jschell wrote: exclude C#, Java, JavaScript.
Essentially correct. Can the Java VM or C# runtime (and .NET framework) be implemented in those languages? I doubt it.
Yet, as to C# and Java, I'm not sure that they are excluded by definition. Certainly their reference implementations rely on virtual machines and runtimes, but I'm not ready to say that their core functionality requires those. I reserve the notion that maybe someone could use one or the other to implement a proper compiler which supports some core functionality of the language without needing the full VM/runtime -- such a compiler would probably not be able to interoperate with "normal" applications. I don't know Java, but C#'s core functionality/syntax shouldn't need it.
As to C++, I'm not even sure about that. Or, for the most part, any object-oriented language. I think D is implemented in D.
jschell wrote: it is a two step process
jschell wrote: C/C++ 'built' using a compiler and then a linker
jschell wrote: there are still two steps.
I see a statement on another site [ "C++ implementation" means the compiler plus linker plus standard libraries ] and I would respond, "No, forget about any reference implementation and 'standard libraries'; look only at the core of the syntax. What is the minimum you require to implement that? Without having to link to the library and such, you don't need a linker. Consider how the first version of the C compiler and library must have been compiled prior to the libraries having been compiled. I know, not very useful, but that's not the point."
Consider only the syntax of the language itself, and not any of the baggage you have come to expect to go with it. Surely someone can (has the ability to) take the language syntax and implement a whole new eco-system which does not require the VM or runtime or 'standard libraries' or whatever. Not that I could implement such a thing myself. Codewitch could probably knock one out in a week.
The linker and pre-compiled assemblies are just nice-to-haves.
jschell wrote: Definitely a compiled language
Just being a compiled language isn't enough.
I would further assert that basically no programming language is inherently "compiled" or "interpreted", though BASIC is the only one I can think of quickly which has had successful implementations of both types.
C# and Java "compile" to some intermediate form which runs on a VM/runtime, so are they truly compiled or just interpreted? Probably the latter.
|
|
|
|
|
PIEBALDconsult wrote: C# and Java "compile" to some intermediate form which runs on a VM/runtime, so are they truly compiled or just interpreted? Probably the latter.
There is no technical reason why one could not build hardware which has the Java bytecode as its machine language. Ditto for C#. Therefore, neither language is inherently interpreted.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I agree. The reference implementations seem to be interpreted from a certain point of view, but a truly compiled implementation could (conceivably) be created.
|
|
|
|
|
Daniel Pfeffer wrote: There is no technical reason why
There is a commercial reason, though: when they tried that long ago with Pascal, it failed.
|
|
|
|
|
I'm aware of that. We were having a technical discussion, not discussing the commercial viability of such an implementation.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Daniel Pfeffer wrote: There is no technical reason why one could not build hardware which has the Java bytecode as its machine language. Ditto for C#.
If by C# you mean the .NET Intermediate Language (IL), you are comparing two languages at approximately the same abstraction level, but very different in form.
Java bytecode, like lots of other P-code formats (strongly inspired by the P4 code of the original ETH Pascal compiler), is intended to be complete and ready for execution, with no loose ends (except those the language defines to be loose, e.g. late binding), similar to 'real' binary machine code, but for a virtual machine. The instructions (i.e. bytecodes) are executed one by one, independent of each other.
IL, on the other hand, has a lot of loose ends that must be tied up before execution. It contains lots of metadata that are not machine instructions but indicate how instructions should be generated. Although you could in principle try to 'interpret' the IL, you would have to construct fairly large runtime data structures to know how to generate the interpretation, similar to the structures the jitter builds when compiling the IL to machine code. So you are really doing the full compilation, except that you are sending the binary instructions to the execution unit rather than to an executable image.
The line between compilation (followed by execution) and interpretation is fuzzy.
If by C# you instead mean direct source-code interpretation, you have a huge task to solve. Here, you would have to build much more complex runtime data structures to support the interpreter. Building these would go a long way toward making a full parse tree of the source code, at which point you have done a significant part of the compilation job.
Compilers are so fast nowadays that I see no practical advantages in interpreting program code.
For building dedicated hardware:
UCSD Pascal, one of the better-known Pascal interpreters for PCs, used the P4 bytecode. It also ran on the PDP-11, including the single-chip LSI-11, for which microcode was written to run P4 directly (rather than the PDP-11 instruction set). It turned out to be significantly slower than running the PDP-11 software interpreter.
There are lots of similar stories of hardware implementations not living up to expectations. Intel's object-oriented CPU, the 432, was simulated on an 8086, and the first 432 implementation turned out to be slower than the simulator.
Yet another example: I worked on a 'supermini' (i.e. VAX-class) machine that provided instructions for trigonometric functions. The Fortran compiler/libraries didn't use them; they were too slow. Calculating the functions the traditional way turned out to be faster. I talked to the designers of the CPU, asking why the instructions couldn't do it the same way as the libraries. They just mumbled a lot about having to be prepared for interrupts in the middle of the instruction. But a library sequence of instructions can be interrupted midway, right? Well, that is a different situation... In other words, I never got a decent answer.
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|