|
Most spoken languages are written LtR (Left-to-Right), but Maths, the number language, is actually written RtL, because the decimal numbers are Arabic numerals. (I know, most will be surprised, but it's true about Maths' RtL direction.) But somehow the RtL and LtR languages got mixed up.
Instead of writing
x = a + 12
how about changing it to
a + 12 = x
So, what are your views on creating a new programming language which follows proper LtR execution? Is there already such a language? (Please, just don't remind me that there are already lots of programming languages (I know already) and that I must not (try to) create one more.)
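For illustration only, here is a toy Python sketch of how an interpreter for such a hypothetical `expression = target` syntax could work; `run_ltr` and the whole mini-language are my own invention, not an existing tool:

```python
# Toy interpreter for a hypothetical LtR assignment statement,
# where the expression comes first and the target name last: "a + 12 = x".
# This is only an illustrative sketch, not an existing language.

def run_ltr(line, env):
    expr, target = line.split("=")
    # Evaluate the left-hand expression first, then bind the result
    # to the name on the right. eval() is used purely for brevity here.
    env[target.strip()] = eval(expr, {"__builtins__": {}}, dict(env))
    return env

env = {"a": 5}
run_ltr("a + 12 = x", env)
print(env["x"])  # prints 17
```

Note that the statement still *executes* in the same order a compiler would emit it (evaluate, then store); only the surface syntax changes.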
|
|
|
|
|
throw new InvalidPremiseException ( "ADD a AND 12 YIELDING x -- COBOL." ) ;
|
|
|
|
|
I just threw up in my mouth, a little.
Sent from my Amstrad PC 1640
Never throw anything away, Griff
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Because of the syntax error?
"The only place where Success comes before Work is in the dictionary." Vidal Sassoon, 1928 - 2012
|
|
|
|
|
It depends on who you are trying to communicate with: a computer or a programmer.
From a computer's perspective, if you want to model the execution steps, there is already a better notation for this representation: Reverse Polish Notation (a 12 +).
From a programmer's perspective, you want the code to be easy to read, and I don't see how your proposal helps with that.
- When reading code, the variable being set is the most important thing.
- If I'm trying to understand code, I'll want to know where a variable is set; this is easier to do by scanning a block where the variables are aligned. It is easier for my eyes to find the line x = ..... than ..... = x.
- Following the logic of the algorithm involves understanding where variables are mutated as much as what they are set to.
- For complex expressions, I'll probably only read and understand them once, while I'll explore the looping logic and the structure of the flow of the code more.
60 years ago, there was an economic value in a programmer spending significant time making things easier for the computer.
Now, the value is in making things easier for the programmer, even if significantly more complex for the computer.
I've always wanted languages to adopt a true assignment operator: x <- a + 12. But it would need to be a single character and exist as an easily usable key on my keyboard. Interestingly, Visual Studio allows Unicode variable names, so I've written software using genuine alpha and beta glyphs.
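The stack-machine view above can be sketched in a few lines of Python; `eval_rpn` and its token handling are my own naming for illustration, not an existing tool:

```python
# Minimal postfix (RPN) evaluator, sketching how a stack machine
# executes "a 12 +": push a, push 12, then "+" pops both and pushes the sum.

def eval_rpn(tokens, env):
    stack = []
    ops = {"+": lambda x, y: x + y,
           "-": lambda x, y: x - y,
           "*": lambda x, y: x * y}
    for tok in tokens:
        if tok in ops:
            y = stack.pop()          # operands come off in reverse order
            x = stack.pop()
            stack.append(ops[tok](x, y))
        elif tok.lstrip("-").isdigit():
            stack.append(int(tok))   # numeric literal
        else:
            stack.append(env[tok])   # variable lookup
    return stack.pop()

print(eval_rpn("a 12 +".split(), {"a": 5}))    # prints 17
print(eval_rpn("3 4 + 2 *".split(), {}))       # prints 14
```

Notice that the postfix order is exactly the order in which a machine performs the work, which is why it suits the computer better than the programmer.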
|
|
|
|
|
If you really want to program in RPN, there is Forth. In my brief encounter with it I deemed it a write-only language.
At least, there was Forth; I have not heard much about it in many years.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Forth is still out there. It's still being used as an intermediate language at a place I worked for years ago. We (actually I) decided to write a multitasking, subroutine-threaded Forth for our industrial controllers. On the PC we developed an IDE in which the programmer would just draw flow charts; each flow chart would become a task on the industrial controller. The flow charts were compiled to Forth by the IDE and then downloaded to the controller, which would compile them to machine code.
|
|
|
|
|
That sounds interesting. Other than the fact that it uses Forth.
The last thing I read about Forth was many years ago. It was about the development of the SPARC processor and Sun workstations. They embedded Forth in the ROMs and wrote the boot loader in it. As I recall, it came up and ran on the first attempt.
|
|
|
|
|
I liked Forth a lot, but then I grew up on assembly language. The whole TIL (threaded interpreted language) scheme is extremely simple and is easily ported to different processors. One of the main problems with Forth is that the programmer is assumed to be an expert; there's pretty much no hand-holding.
Forth Inc. is still in business too (www.forth.com).
|
|
|
|
|
I didn't care for Forth. I didn't grasp it immediately and it was always a struggle for me to deal with.
The same applies to RPN for me. I think I was the only one in my engineering school who didn't have an HP calculator. Coincidentally, I went to school in the same town where HP designed and built them at the time.
|
|
|
|
|
I like it because of its interactive nature. Write a 'word' and you can test it immediately, which made for much quicker development at the time. Also, it was relatively easy to make a multitasking Forth (round-robin scheduling).
|
|
|
|
|
Take a look at the PostScript manual. As far as I can see, it's Forth with extra graphics bits.
|
|
|
|
|
I knew it was a stack-oriented language, but I had never looked at its keywords. I took a quick gander and it does look very Forth-like.
|
|
|
|
|
Algol 58/60/68 had an assignment operator which was (in the spec) a left-facing arrow, but was usually rendered as := in implementations. We referred to it as the 'becomes' operator; '=' was the equality operator. I do not know why K&R etc. decided to use '=' as the assignment operator in C (a source of errors ever since, even in Yoda mode) even though C is ultimately derived from Algol; unless they wanted compatibility with FORTRAN (which uses .EQ. as its equality operator, so there is no ambiguity).
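As an aside on why that choice of '=' still bites: Python is one language that deliberately avoided the trap by making plain assignment a statement, and only later (Python 3.8) reintroduced expression assignment under a distinct spelling. A small sketch:

```python
# In C, "if (x = 0)" compiles and silently assigns instead of comparing.
# In Python the same slip is rejected outright:
#     if x = 0:        # SyntaxError: invalid syntax
# Python 3.8's "walrus" operator := (visually echoing Algol's 'becomes')
# brings assignment back into expressions, but with a spelling that
# cannot be confused with the == equality test:
if (n := 7 + 5) > 10:
    print(n)  # prints 12
```

The distinct token means a typo can't silently turn a comparison into an assignment, which is exactly the class of bug the Algol-style operator avoided.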
|
|
|
|
|
For the new language, I am thinking from the programmer's perspective.
Reverse Polish Notation is good for the computer but confusing for humans.
Writing a + 12 = x and then finding where the variable is set does look confusing, but we may find some solution(s); I haven't thought much about it yet. We may also find some unique way of defining variables which is not used in any programming language yet.
The idea of a new, properly LtR programming language came to mind to reduce code and to allow faster parsing, compiling, and execution with less code traversal. We may get used to the syntax if we practice more.
|
|
|
|
|
NeverJustHere wrote: I've written software using genuine alpha and beta glyphs Heaven help you if you ever look at the source code on a machine where the system locale is not set to English. I had a case the other day where a Greek mu (µ) character in the UI was showing up as a kanji character.
English version of Windows, check.
User language set to English, check.
English keyboard selected, check.
...
Wait a second. Who set the bloody-be-damned system locale to Simplified Chinese? At some point the box had been restored to the original image, and this was during a brief period when our out-sourced assembly folks were setting all of our boxes to Simplified Chinese. Grrr...
Software Zen: delete this;
|
|
|
|
|
Quote: there is already a better language for this representation, reverse polish notation. (a 12 +)
You were so close! Use prefix notation instead: (+ a 12), and you already have a powerful language with modern language features.
Postfix notation (a 12 +) gives you Forth, with little expressive power.
Prefix notation (+ a 12) gives you Lisp, with all currently known language features.
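A tiny Lisp-style prefix evaluator shows why prefix nests so naturally: the operator is simply the head of a list, and arguments may themselves be lists. (`eval_prefix` is my own sketch, handling just + and * for illustration.)

```python
# Minimal prefix (Lisp-style) evaluator: ["+", "a", 12] models (+ a 12).
# Nesting falls out for free: ["+", "a", ["*", 2, 12]] models (+ a (* 2 12)).

def eval_prefix(expr, env):
    if isinstance(expr, list):
        op, *args = expr
        vals = [eval_prefix(a, env) for a in args]  # evaluate arguments first
        if op == "+":
            return sum(vals)
        if op == "*":
            out = 1
            for v in vals:
                out *= v
            return out
        raise ValueError(f"unknown operator {op!r}")
    return env.get(expr, expr)  # symbol lookup, or a literal number

print(eval_prefix(["+", "a", 12], {"a": 5}))            # prints 17
print(eval_prefix(["+", "a", ["*", 2, 12]], {"a": 5}))  # prints 29
```

Because the data format and the program format are the same nested lists, the evaluator is a single recursive function, which is much of where Lisp's expressive power comes from.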
|
|
|
|
|
In Arabic the numbers are written left to right.
|
|
|
|
|
I am not sure, but the direction could have changed in the last 100-400 years because of the influence of Western culture.
|
|
|
|
|
With the "programming" aspect of your post aside...
Nikunj_Bhatt wrote: decimal numbers are Arabic numeral
I'm confused... what does this mean? How is a number (decimal or otherwise) specifically Arabic (and therefore specifically RtL)?
Nikunj_Bhatt wrote: Maths, the number language, is actually written RtL
What do you mean by this? If I write a sum on a piece of paper, I would write:
1 + 2 = 3
|
|
|
|
|
The numbers that we are using are the decimal numbers, a.k.a. Arabic numerals. Here is the history of the numbers: Arabic numerals - Wikipedia[^]. The decimal numbers are called Arabic numerals because they were invented by Arabs (however, zero was invented in India; it is also part of the Arabic numerals).
So,
1 + 2 = 3 is the correct way in LtR, while
3 = 1 + 2 is the correct way according to RtL (Arabic, Hebrew, and other RtL languages; however, this is now not used because of the influence of international mathematics).
|
|
|
|
|
Just because they may have originated in another language does not mean they are still that language; it depends on the context of use. Otherwise you could easily argue that a lot of English is not actually English because it originated in another language... it doesn't matter where it came from, only how it's used.
If your argument is that Maths is its own language, then it can also define its own reading order (i.e. LtR); it doesn't matter where the numbers originally came from and how they were originally read.
My point is, there is no reason why they have to be read RtL just because they are decimal numbers.
|
|
|
|
|
musefan wrote: If your argument is that Maths is its own language, then it can also define its own reading order (i.e. LtR); it doesn't matter where the numbers originally came from and how they were originally read.
My concern is about programming, not about Maths. So Maths can have its own read-write direction, but in programming we can define what is logical, because programming is all about logic, isn't it?
|
|
|
|
|
musefan wrote: I'm confused... what does this mean? How is a number (decimal or otherwise) specifically Arabic? (and therefore specifically RtL)
Our numbering system (and the number 0) are Arabic inventions. The Romans used a different system and did not have a sign for nothing.
That doesn't make the numbers themselves an Arabic thing, only our current de facto way of encoding them.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
It is interesting to notice that numbers are written the same way in LtR and RtL languages, with the most significant digit to the left.
Also notice that in a number of languages, including old English, it was common to read (and write as words) e.g. four-and-twenty, as if reading RtL, rather than twenty-four. In German you still read numbers RtL, at least up to one hundred. Norwegian was similar until a reform in the 1950s.
(If you try to bake twenty-four blackbirds in a pie, the rhythm will be screwed up...)
|
|
|
|