|
With var, the code is easier to write, but in some cases harder to read. It should not be necessary to look at a function's declaration to work out what type ends up in a variable declared as 'var'.
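A small sketch of that complaint in Java, whose var behaves like C#'s in this respect (loadScores is a hypothetical helper invented for the illustration):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class VarReadability {
    // Hypothetical helper: its return type is only visible here, at the declaration.
    static Map<String, List<Integer>> loadScores() {
        return new HashMap<>();
    }

    public static void main(String[] args) {
        var scores = loadScores();                          // reader must find loadScores() to learn the type
        Map<String, List<Integer>> explicit = loadScores(); // type is visible at the call site
        System.out.println(scores.getClass().getSimpleName()); // prints "HashMap"
    }
}
```

Both variables have exactly the same static type; the difference is only what the reader can see without leaving the call site.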
|
|
|
|
|
I'd say C/C++ are weakly typed because you can always convert one type into almost any other type. Java/C# only allow limited conversions between related types.
|
|
|
|
|
Imo the difference in that respect between C++ and C# is part of a broader difference: C#'s runtime second-guesses your every instruction, while C++ takes your word for it. In both cases the cast makes it past the syntax check with the same meaning: "trust me, these are the same." The exception is that C#/Java will fail validation if the known static type cannot possibly also be an instance of the casted type; that check is not applicable to C++ because, with multiple inheritance, an object could always exist that inherits from both types.
C#'s second-guessing compares the actual runtime type, via reflection, to the casted type, whereas most if not all C/C++ programs carry no type information at runtime and no support for reflection. But that's a runtime distinction, not part of the language, for the same reason that compiled vs. interpreted is not part of the language. As far as whether the language itself is strongly typed, they are the same. A C++ compiler and runtime could be invented that does the same as C# without any changes to the language itself, or vice versa. I believe such a C++ compiler could even be compliant, with enough effort.
The same applies to accessing an array with an invalid index: C# throws an error right away, because it first checks that the index you gave it matches what it knows to be valid, whereas C++ trusts you and performs the operation, likely resulting in a subsequent error (because the CPU second-guesses you at a more security-oriented level, e.g. DEP).
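Both runtime checks described above can be seen in Java, which behaves like C# here (a minimal illustration, standing in for the C# behavior the post describes):

```java
public class RuntimeChecks {
    public static void main(String[] args) {
        // The runtime records each object's actual type and checks casts against it.
        Object o = "just a string";
        try {
            Integer n = (Integer) o; // compiles: an Object *could* be an Integer...
            System.out.println(n);
        } catch (ClassCastException e) {
            System.out.println("cast rejected at runtime"); // ...but the runtime says no
        }

        // Array indexing is bounds-checked the same way; C++ would just read past the end.
        int[] a = {1, 2, 3};
        try {
            System.out.println(a[5]);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("index rejected at runtime");
        }
    }
}
```

Note that `(Integer) "just a string"` with the literal's static type known would already fail compilation; the cast only compiles because the static type is Object, exactly the "cannot possibly be an instance" validation described above.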
|
|
|
|
|
Which eliminates most uses of the C# var keyword.
|
|
|
|
|
and then there's APL
let max = list[0];
for (let i = 1; i < list.length; i++) {
    max = Math.max(list[i], max);
}
or the APL version
⌈/
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
|
|
|
|
|
Mike Hankey wrote: The less you need, the more you have. That is most certainly a signature matching the text of your post!
|
|
|
|
|
I had never heard of APL, but after looking at this and a handful of examples online, I firmly place it in my type 2 category: it lacks any form of readability to anyone not very well versed in it, whereas most programmers who have never touched C++ can probably figure out what the C++ does. Imho, those few seconds you save typing will be wasted many times over debugging that swamp of symbols.
|
|
|
|
|
Well, APL is quite readable to mathematicians. (Or at least a certain share.)
APL was not developed as a programming language at all, but as a notation for teaching the math of matrices. Kenneth Iverson used it as a "blackboard language" for a number of years when lecturing at Harvard in the late 1950s. It wasn't until he quit teaching and moved to IBM that some of his workmates said something like, "Hey, we could make a computer do these operations!"
But I must admit that you are right: APL is definitely not a language that you will learn in two weeks. And you must keep on using it, otherwise you will forget so much that you will have to dig up your old APL textbook to find any sense in the APL programs you wrote yourself a few years ago.
My first programming language was Basic - a very basic one, with at most 286 variables named A-Z or A0-A9 to Z0-Z9, and 26 string variables A$-Z$. My next language was APL, and I was extremely impressed by its power, compared to Basic. (I was 17 at the time.) I still have my APL textbook, and I'd love to pick it up again - just for doing programming that is very different from at least 95% of all programming done today. (I just need a tuit.)
|
|
|
|
|
If I had the inclination to learn APL, I suspect I would end up using it like I use regex: not as a language exactly, since I don't think I'd ever write an entire program in it, but as a hard-coded string that the primary language feeds to an interpreter for the specific operations it's good at. I do not consider regex a language, because the definition of a loop would have to be stretched to call it Turing complete, and I would hate to think about calling functions or third-party libraries with it (no idea whether that's even possible in APL, but even if it is, I doubt I'd use it).
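That embedding pattern, sketched in Java (the pattern string is just an example): the host language holds a hard-coded snippet of another notation and hands the actual execution to an embedded engine, much as one might hand an APL expression to an APL interpreter.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EmbeddedRegex {
    public static void main(String[] args) {
        // The "program" is a hard-coded string in a different language...
        Pattern digits = Pattern.compile("\\d+");
        // ...executed by an interpreter that the host language merely invokes.
        Matcher m = digits.matcher("order 42, item 7");
        StringBuilder found = new StringBuilder();
        while (m.find()) {
            found.append(m.group()).append(" ");
        }
        System.out.println(found.toString().trim()); // prints "42 7"
    }
}
```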
|
|
|
|
|
(I just need a tuit.)
Will an oval one do? Or does it have to be round?
|
|
|
|
|
Thanks a lot for your kind offer to provide an oval one, but I guess I need the round type to get further with my APL activity.
|
|
|
|
|
As an applied math major at an engineering-focused school back in the 80s I actually had APL as a required 1-credit course. I kind of enjoyed it *because* it was a little arcane.
|
|
|
|
|
My categories are :
1. Languages I like to work with
2. Languages I would like to investigate
3. Languages I have zero interest in
Regarding compiled vs. interpreted languages - I wrote my own language once. It started out interpreted and then I changed it to compile to native machine code. That was really fun and resulted in no changes to the language itself. In my opinion, languages are not inherently compiled or interpreted. Their implementations are what make the distinction.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Memtha wrote: Type
Exactly.
Is it strongly typed? Can it reflect on itself? Does it support functional programming expressions? Does it support anonymous types and functions? Is it compiled (the classic definition of compiled, IL included)?
If yes, I like it.
If somewhat yes, I tolerate it.
If no to all, I won't use it.
That means:
C#: yes
TypeScript: tolerated yes
JavaScript, Ruby, Python, etc: I won't use it.
Now, granted, I do like Python for certain things. There's always exceptions to the rule.
|
|
|
|
|
Agreed on strongly typed and compiled. I personally cannot stand reflection, because it tends to lose the benefits of strong typing: string mangling at runtime can't really be compiler-checked. Example: I am presently building (not by choice) a scheduling system that pulls an assembly name, type name, and method name from SQL, loads the assembly, finds the type and method, and runs it. If some bonehead comes along and changes the method signature, no compiler errors will occur, but the task will fail at runtime.
Anonymous types and functions (and especially lambdas and "properties") are perfect examples of the kind of fluff I despise in C#. It does not take much longer to write proper named classes and functions, and the result is much more readable and reusable. The extra time pays for itself later, when I don't have to go to the docs to work out a function's implied return type from what it is being passed to.
As trønderen wrote above: "Typing should be explicitly visible in the program text, and clearly identified as a type."
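The runtime-binding failure mode described above can be sketched in Java reflection (a stand-in for the C# scenario; NightlyJob and the name strings are invented for the illustration, imagining they were read from SQL):

```java
import java.lang.reflect.Method;

public class ReflectiveRunner {
    // Stand-in task class; in the scenario above, its names live in a database row.
    public static class NightlyJob {
        public String run() { return "ran"; }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical values, as if pulled from SQL; the compiler never sees them as code.
        String typeName = "ReflectiveRunner$NightlyJob";
        String methodName = "run";

        Class<?> type = Class.forName(typeName);
        // If some bonehead renames run(), this line still compiles fine
        // and only blows up here, at runtime, with NoSuchMethodException.
        Method method = type.getMethod(methodName);
        Object result = method.invoke(type.getDeclaredConstructor().newInstance());
        System.out.println(result); // prints "ran" -- but none of it was compiler-checked
    }
}
```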
|
|
|
|
|
Functional vs. imperative and general-purpose vs. domain-specific are my main delineations.
Beyond that, "interpreted" vs. "statically compiled" vs. "JIT compiled" is technically an implementation detail. Consider JavaScript: initially interpreted, now JIT compiled. That's why I don't consider it when I deal in languages.
For me it's about what I can do with it, not even "how" I do it. Although I do consider the form of typing (duck typing, static typing, etc) when evaluating a language.
Real programmers use butterflies
|
|
|
|
|
Does it solve my problem? Yes/No
C# usually does, JavaScript occasionally does, SQL sometimes does, Regex rarely does...
As a PLC programmer you're probably using C or Rust, but as a web developer those aren't really viable choices, C# and JavaScript are.
Golf-oriented gibberish never solves my problem, that's more for fun and giggles.
That's not to say I like all those languages equally, but I'll learn/use them when I have to.
|
|
|
|
|
My categories are :
Do I know it?
Do I need to know it?
Do I want to know it?
Am I sufficiently not lazy to learn it?
|
|
|
|
|
|
My categories ain't terribly different from yours. I categorize by what they value:
-backwards-continuity
-a certain principle or set thereof
-pragmatism as in "getting things done without standing in the way", which would be the good ones
|
|
|
|
|
The good, the bad and the ugly.
Ugly) The languages that my colleagues prefer that I don't like
Bad) Languages that neither my colleagues nor I use
Good) The languages I use
Nothing succeeds like a budgie without teeth.
|
|
|
|
|
I categorize them thus:
1) Me no likey
2) Looks interesting and useful
3) Yes please!
Based solely on taste, as far as I can tell.
|
|
|
|
|
Honestly mostly I just think in terms of whether or not a given language is suitable for my current task.
|
|
|
|
|
|
I would divide general purpose languages into categories (not all fitting languages are listed):
1. Assemblers
2. C, C++
3. Java, C#
4. Python, JavaScript
5. Perl, TCL
Anything that is not like the 5 categories above is not worth categorizing.
|
|
|
|