I agree, though I'll say I've written enough parsers and grammars that I expect it to be a language-specific subset of whitespace (the JSON standard grammar has a similar issue), but that's me. It's rare that I've found a language that accepts literally any whitespace you can throw at it. I think C# is an exception in that it can handle it (but I'd have to look to be certain).
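That JSON point is easy to demonstrate from Python's standard library: the spec (RFC 8259) only treats four characters as insignificant whitespace between tokens, so characters Python itself considers whitespace can still be rejected. A small sketch:

```python
import json

# JSON's grammar allows exactly four whitespace characters between
# tokens: space, tab, line feed, carriage return.
print(json.loads(" [1,\t2,\n3]\r"))  # parses fine -> [1, 2, 3]

# A vertical tab (\x0b) satisfies Python's str.isspace(), but it is
# not in the JSON grammar's whitespace subset:
try:
    json.loads("[1,\x0b2]")
except json.JSONDecodeError as e:
    print("rejected:", e)
```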
I usually need a bigger hammer.
I think we all tend to look at something newish as a replacement for our current "hammer". I find Python very handy for working out what someone really wants. A friend has a bunch of (older) IoT controllers in an office building; the interface is a built-in web server in each controller. He wants to collect the data remotely. It turns out these things will contact a syslog server if you configure it. I wrote the initial server in Python to figure out what data he wanted to parse out and how to format it, then rewrote it in C++ (it runs on a W7 system). Why rewrite? Since the controllers are independent and clashes are probable, performance was a high priority... Windows is not an RTOS. Do they miss an occasional event? Probably. How would we know?
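A minimal sketch of the kind of "figure out what the data looks like" syslog listener described above. The message layout assumed here is the classic RFC 3164 `<PRI>` prefix; the actual controllers' format may well differ:

```python
import re
import socketserver

# RFC 3164-style messages start with "<PRI>" where PRI = facility*8 + severity.
PRI_RE = re.compile(r"^<(\d{1,3})>(.*)$", re.DOTALL)

def parse_syslog(raw: str):
    """Split a syslog datagram into (facility, severity, message)."""
    m = PRI_RE.match(raw)
    if not m:
        return None, None, raw  # no well-formed PRI header; pass through as-is
    pri = int(m.group(1))
    return pri // 8, pri % 8, m.group(2)

class SyslogHandler(socketserver.BaseRequestHandler):
    """Print each UDP datagram so you can see what the devices send."""
    def handle(self):
        data = self.request[0].decode("utf-8", errors="replace")
        facility, severity, msg = parse_syslog(data)
        print(f"fac={facility} sev={severity}: {msg}")

print(parse_syslog("<34>Oct 11 22:14:15 host su: auth failure"))
# To actually listen (syslog's default UDP port is 514, which usually
# needs elevated rights, so a high port is used here):
#   with socketserver.UDPServer(("0.0.0.0", 5140), SyslogHandler) as srv:
#       srv.serve_forever()
```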
Give a man a fish; you have fed him for today.
Teach a man to use the Internet and he won't bother you for weeks, or even months unless you give him your email address.
If you can keep your head while those about you are losing theirs, perhaps you don't understand the situation.
I once spent an hour finding a bug in a Haskell program.
It was a homework assignment so it needed to run, but I kept getting "variable not defined" or some such.
Turned out one line in a roughly 50-line application was indented with a tab instead of spaces.
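Python at least promotes that particular invisible bug to an explicit error: mixing tabs and spaces in one block raises a `TabError` at compile time. A quick sketch using `compile()` on a snippet with a tab-indented line followed by a space-indented one:

```python
# One line indented with a tab, the next with spaces, in the same block:
source = "def f():\n\tx = 1\n        y = 2\n"

try:
    compile(source, "<example>", "exec")
except TabError as e:
    # "inconsistent use of tabs and spaces in indentation"
    print("caught:", e)
```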
Significant whitespace has no place in language grammars. It's poor design. People will argue about that, maybe, but the proof is in the results. Like you said, it took you an hour to find that invisible bug.
It is hard enough to find the ones you can actually see.
"Program testing can be used to show the presence of bugs, but never to show their absence." - Edsger Dijkstra
"I have never been lost, but I will admit to being confused for several weeks. " - Daniel Boone
With the right editor you can make them visible.
...my code vs.
-> my code.
But that creates a lot of noise on your screen and doesn't contribute to readability.
Even better, an editor that always converts tabs to spaces, which I think VS does too (or at least it's a setting).
Everyone on the team should always use the same number of spaces for all projects though.
Not having that is annoying in C#, but probably disastrous in Python (or Haskell).
All in all, I don't think it has to be a problem.
I've been doing some YAML lately and I couldn't even get it to break because my (web) editor just wouldn't allow it.
If the code is properly formatted and the spaces are invisible I think it looks pretty and concise though.
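One hypothetical way to keep a whole team on the same indentation settings, as mentioned above, is an `.editorconfig` file checked into the repo; most editors (VS, VS Code, PyCharm) honor it automatically. The sizes here are just illustrative:

```ini
# .editorconfig -- pins indentation style per file type for everyone
root = true

[*.py]
indent_style = space
indent_size = 4

[*.{yml,yaml}]
indent_style = space
indent_size = 2
```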
If you use a Python aware editor you probably don't need to care. I use IDLE (the Python IDE), and Visual Studio Code. Both of them handle the indentation perfectly so it is not much of an issue. Until you want to reformat the file in a different editor.
Having to use a language aware editor so that it doesn't actually destroy your code is a huge red flag in terms of language design, IMO.
Significant whitespace has no place in language grammars. It's poor design. The proof is in the confusion and other problems it generates, like literally invisible syntax/"compilation" errors in source code.
The thing that keeps me away from it is the significant whitespace in the grammar and my general distaste for scripting. I know both of those things are just me being nitpicky, but we all have our preferences.
I agree, though I did get used to the significant whitespace in F#, and other FP languages use that as well, which of course leads to the problem Sander described.
As to scripting, I comfort myself in that any of the heavy lifting is done behind the scenes in C, so the way I think about it is, all Python really is is a more sophisticated language (as compared to say, a .bat file) for gluing together calls to compiled code.
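That "heavy lifting behind the scenes in C" is easy to see with nothing but the standard library: a built-in like `sum()` runs in compiled C, while an equivalent hand-written loop executes interpreter bytecode on every iteration. A small sketch:

```python
import timeit

def py_sum(n):
    """Pure-Python loop: every iteration runs interpreter bytecode."""
    total = 0
    for i in range(n):
        total += i
    return total

N = 100_000
assert py_sum(N) == sum(range(N))  # same answer either way

# sum() is implemented in C, so the per-iteration interpreter
# overhead disappears; on most machines it is several times faster:
loop_t = timeit.timeit(lambda: py_sum(N), number=20)
c_t = timeit.timeit(lambda: sum(range(N)), number=20)
print(f"pure Python: {loop_t:.3f}s  C-backed sum(): {c_t:.3f}s")
```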
Like any scripting language, I find that it requires its own unique discipline to write decent code - many disciplines are the same, like DRY, but each language has its own unique set of "best practice" disciplines, and Python is no exception.
All true. My main issues with scripting are these:
1. Because of duck typing, the lack of full compilation, and such, errors can be harder to track down.
2. In terms of performance it is so often hard to know exactly what is going to cause the performance issues, or worse, sometimes there isn't a reasonable way around it. That's fine, when scripting is used here and there in moderation, but it's a different story when you're powering an entire application with it.
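Point 1 in a nutshell: a typo in a dynamically-typed program can sit unnoticed until the exact code path executes. The class and attribute names below are made up purely for illustration:

```python
class Sensor:
    def __init__(self):
        self.reading = 42

def report(sensor):
    # Typo: the attribute is "reading", not "readng". A statically
    # compiled language would reject this at build time; Python only
    # notices when this line actually runs.
    return sensor.readng

try:
    report(Sensor())
except AttributeError as e:
    print("only caught at runtime:", e)
```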
I find Python's type hints to be very useful in my own code.
As you mentioned, they aren't checked when Python compiles your source into bytecode. But good tooling catches errors pretty quickly. Pycharm does a good job of screaming at me if I do anything that violates my type-annotated Python code.
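A quick illustration of that point: CPython stores the annotations but never enforces them when it compiles to bytecode, so only a checker (mypy, PyCharm, etc.) flags the mismatch:

```python
def add(a: int, b: int) -> int:
    return a + b

# The interpreter never checks the hints at runtime:
print(add(2, 3))      # 5, as intended
print(add("2", "3"))  # "23" -- runs fine; a type checker would flag it

# The hints are still available to tooling via __annotations__:
print(add.__annotations__)
```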
As for performance, it depends on what you're doing. Most of Instagram's UI is still served data by a Django app, so big Python apps running at massive scale can work if you're careful. It might not work as well for other types of applications, though.
Because of duck typing, the lack of full compilation, and such, errors can be harder to track down.
Oi, that is definitely the worst thing. I wrote a complicated application to run on an rPi that dealt with hardware inputs and outputs, various configurations, etc., and it was painful, and I mean PAINFUL, to run the program only to discover I had some stupid typo.
PyLint helped a lot! As did mocking the classes that did the hardware interface (with a UI written in GTK, yuck), and running the thing on my PC first. Which was awesome, being able to test and debug on Windows before shuffling the code over to the rPi!
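The mocking trick works with the standard library alone. The sketch below assumes a hypothetical GPIO wrapper class, not the actual rPi library's API:

```python
from unittest import mock

class Gpio:
    """Hypothetical hardware wrapper -- the real one would touch pins."""
    def read_pin(self, pin: int) -> bool:
        raise RuntimeError("no hardware on this machine")

def door_open(gpio: Gpio) -> bool:
    """Application logic we want to test without a Raspberry Pi."""
    return gpio.read_pin(17)

# On the desktop, swap the hardware class for a mock and exercise the logic:
fake = mock.Mock(spec=Gpio)
fake.read_pin.return_value = True

assert door_open(fake) is True
fake.read_pin.assert_called_once_with(17)
print("logic verified without a Raspberry Pi")
```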
I've browsed github python repositories and found much more than that therein. I think your workmate maybe didn't give you the best example. Unfortunately I don't have a better one handy, as my level of involvement in all of this is as a passing interest.
"All these cool things you could do with Python" is very rarely because of the language itself, but due to the ecosystem built around it. A similar ecosystem built around any other language or platform would be just as cool ... in principle. From a technical viewpoint.
But the psychology of identity is super-essential here. A Python package is for us, we the Python community. Let those C++ and C# and VB and Fortran guys do their own, if they like - that's none of our business.
We could try to build a similar ecosystem around dotNET, with the same spirit, surrounding an assembly repository rivaling PyPI in size and variety, available to all dotNET languages. But what sort of identity and unifying community is that? I think it is far from realistic to create anything close to the same enthusiasm around dotNET without that focal point that is The One Unifying Language we all speak in our code.
There is nothing in the Python language as such that makes it more suitable. What it does or does not offer in programming mechanisms isn't essential. But its function as a beacon to lead you into the safe waters of a cool ecosystem is.
imho, this is elegantly said, and points to the social context changing as more and more computational and graphics power become available and affordable.
Narratives, often mythic, form around computer languages qua social movements. With Python, I have wondered if the "sole author" origin has added an aura of "for the rest of us" ... of individuality vs conformity.
The mythos that Steve Jobs was such a conscious creator and performer of ... his "reality distortion field" ... a term first used by Bud Tribble at Apple Computer in 1981 to describe Jobs.
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali