Ok, maybe not.
- I would love to change the world, but they won’t give me the source code.
|
Quote: even, *gasp*, VB than Java
I thought the Lounge was supposed to be KSS safe! Go now, we don't want another one of those types here!!
|
When you say zero-terminated strings are "slow", how do you mean? They are one of the fastest implementations of string handling you can get, generally.
That's stunningly ignorant. strlen is O(n) and is needed for most operations on strings. To speed them up you need a type containing the length along with a char buffer or a pointer to one, and then add a refcount and copy-on-write to avoid unnecessary copying.
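The trade-off being argued here can be sketched in a few lines of C. Below is a hypothetical, minimal length-prefixed string (the `lstr` name and functions are invented for illustration): it pays strlen's O(n) scan once at creation and then answers length queries in O(1). The refcount and copy-on-write mentioned above, and error handling for a failed malloc, are omitted for brevity.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical minimal length-prefixed string: the length is stored up
   front, so lstr_len() is O(1) instead of strlen()'s O(n) scan. */
typedef struct {
    size_t len;
    char  *buf;   /* still NUL-terminated for easy C interop */
} lstr;

static lstr lstr_from(const char *s) {
    lstr r;
    r.len = strlen(s);              /* pay the O(n) cost once, at creation */
    r.buf = malloc(r.len + 1);      /* NOTE: no error check, sketch only */
    memcpy(r.buf, s, r.len + 1);
    return r;
}

static size_t lstr_len(const lstr *s) { return s->len; }  /* O(1) */

static void lstr_free(lstr *s) { free(s->buf); s->buf = NULL; s->len = 0; }
```

A production version would add the refcounting and copy-on-write described above so that copies share one buffer until someone writes to it.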
|
I think it "all depends" on how you define the words "better," and "learn."
And, it depends on your cognitive style: yes, people have different cognitive styles; some are more innately top-down thinkers; others are more innately bottom-up thinkers. For some people thinking recursively is very natural (they feel at home in LISP). People's ability to visualize/conceptualize complex processes may vary in terms of the relative salience of "visual-thinking" and "abstract thinking."
And, context: where you are; the circumstances you are in; the limits, or requirements, of the task at hand, and the time to achieve it in.
imho, what is a more interesting question to ask is: what type of education would best prepare people for careers as professional programmers? And, yes, that takes us right into the briar-patch (in England, would that be a "sticky wicket" ?), of what a "professional programmer" is, these days.
I'll never forget when I worked at Adobe, as a PostScript shaman, on Illustrator, PhotoShop, Acrobat, Multiple-Master Font Technology, etc.; one day, I talked to the young genius, Mark Hamburg (who was awarded the Gordon Moore Prize by his peers for his remarkable work on PhotoShop's evolution).
I asked Mark what he had majored in at college; he replied he'd majored in Mathematics; I asked him if he had considered Computer Science. Mark said, essentially, that he had already read, and understood, all of Donald Knuth's books, while he was in High School, and felt he had nothing more to learn in that area. Unfortunately, most of us are not Mark Hamburgs (and will never be) !
My own journey in the last thirty years has been from 6502, and 6809, assembly language, to Pascal, and Basic, to HyperCard, to LISP, to PostScript, to Visual Basic and VBA, finally to C#. For me C# is perfect: just terse enough, just high-level enough.
Rant: If only MS had put some energy into giving WinForms a high-level vector-based 2d retained-mode graphics/drawing engine, instead of going off the deep-end into the-next-greatest-thing-frenzy with WPF and SilverLight !
But, all the languages I have studied, and used, have helped me become the obscure non-entity I am today
“I'm an artist: it's self evident that word implies looking for something all the time without ever finding it in full. It is the opposite of saying : 'I know all about it. I've already found it.'
As far as I'm concerned, the word means: 'I am looking. I am hunting for it. I am deeply involved.'”
Vincent Van Gogh
|
BillWoodruff wrote: For some people thinking recursively is very natural (they feel at home in LISP). I learned LISP when I was taking graduate courses in artificial intelligence back in the late '80s. My best description of the experience was removing the top of your skull, rotating your brain counter-clockwise 90°, and reattaching your skull.
Software Zen: delete this;
|
Amen, Brother Wheeler !
I went through a phase of LISPmania. At one point I spent ten days figuring out how to write a three-line method that took two ints as parameters and created a 2d array in memory.
I forget, now, whether it was more than doubly-recursive.
After successfully deprogramming myself from the "cult of 'car and 'cdr," in sesshins at the Berkeley Zen Center, and by binging on cheap Chinese take-out with extra MSG, and Jolt Cola, until bulimic ...
I realized that, in the future, it would take me as much time to revivify my understanding of that three-line solution as it took me to develop it ... and I moved on to ... PostScript, which ... few people ever appreciate this ... is a very LISP-like language with a post-fix notation "front-end," and explicit stacks, wired up to a monster-great graphics model/rendering-engine.
I do believe that a period of total immersion in an "alternate programming universe," like LISP, Prolog, or, even, PostScript, can be a valuable part of a programmer's education ... if they have a strong base in a strongly-typed language to begin with.
But, I am very influenced by the work of the anthropologist of education, George Spindler, at Stanford, on the utility of "discontinuities" in education and socialization as catalysts for cognitive development, and acculturation.
I think frequently getting your own mental chassis torn down to the point where you become all too aware of which nuts hold which bolts, and then re-assembled, is downright salubrious.
Merry, Merry, Bill
“I'm an artist: it's self evident that word implies looking for something all the time without ever finding it in full. It is the opposite of saying : 'I know all about it. I've already found it.'
As far as I'm concerned, the word means: 'I am looking. I am hunting for it. I am deeply involved.'”
Vincent Van Gogh
|
That matches my experience. While I don't remember much of the LISP I learned at the time (it was 25 years ago), I do remember how the experience seemed to broaden my approach to things in more 'traditional' languages.
I actually used some of the AI techniques later on. My employer never knew, but I had a rule-based parser in 'C' that would find U.S., Canadian, and U.K. Royal Mail postal information in free-form text and create the appropriate bar code.
Software Zen: delete this;
|
Short form - your friend is a moron who will never learn to code well. There's still hope for you.
Long form - Delphi, being the modern version of the best general purpose programming language ever created (Pascal) is excellent for getting the job done, but not so good for making the hardware dance. C and C++ both suck donkey dangly bits, since they use an archaic and unreasonably complex syntax, but they both give you far greater control over the real machine, not just the virtual one that Windows lets you 'see.' If you truly want to master this field, learn Assembly for one or more mainstream processors, then C/C++, then you can decide which of the popular general purpose languages you want to play with. Oh, and find smarter friends...
Will Rogers never met me.
|
It all depends on what you are trying to learn and what level you're starting from.
C isn't a great language for absolute beginners because it has quite a weird and confusing syntax - there are more instructional languages to teach the basic concepts of programming and ease someone into it.
However, if you understand the basics and want to get to grip with lower level things, then C is at the simpler end of the C-style-language spectrum and it does allow access to very low level stuff that is gradually hidden away as you move to the C# and Java end of things.
But if you really want to go low level, then you need to write some assembler, so that when you write your high level code you understand what it is the computer is actually doing, and why.
If you want to learn to be a better programmer, then there is no "one" language. Learn at least a bit about as many as you can. Functional programming is a good example of languages that will make you think about the same problem in a very different way, and when you come back to a procedural language you'll write better code because you have a wider view (lateral thinking) of how the problem can be approached.
Lastly, I find the best way to learn is to "just do it". Don't read or re-use someone else's solution, but actually sit down and write the whole program yourself. Want to read an XML file? Then write a simple XML parser. The next time you use an XML parser from a library you'll understand what it has to do and why it's so slow. You'll know how you can re-phrase your XML data layout to make the files faster to load, smaller to transfer, and easier to manipulate. As well as this, if you do it, you'll remember it; if you read it a lot of it may just fade away, unused.
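As a toy illustration of the "just do it" advice, here is a deliberately naive C sketch that pulls element names out of an XML snippet. It is not a real parser (no attributes, comments, CDATA, or entities are handled), and `scan_tags` is an invented name; the point is that writing even this much teaches you what a library parser actually has to do.

```c
#include <string.h>

/* Toy scanner: collect the names of opening tags from an XML snippet.
   Deliberately ignores attributes, comments, CDATA, and entities --
   exactly the things a real parser must handle, which is why writing
   even this much is instructive. */
static size_t scan_tags(const char *xml, char names[][32], size_t max) {
    size_t n = 0;
    for (const char *p = xml; (p = strchr(p, '<')) != NULL && n < max; ) {
        ++p;
        if (*p == '/' || *p == '?' || *p == '!') continue;  /* skip closers, PIs, comments */
        size_t i = 0;
        while (*p && *p != '>' && *p != ' ' && *p != '/' && i < 31)
            names[n][i++] = *p++;                           /* copy the element name */
        names[n][i] = '\0';
        ++n;
    }
    return n;
}
```

Run it against a snippet like `<root><item id="1"/><item/></root>` and it returns the three opening-tag names; every shortcut it takes is a lesson about why real parsers are slower than you expect.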
|
I believe C/C++ (if not assembly) should be the first language to be learned. This will help the aspiring programmer grasp some concepts that he may miss in high-level languages like C# and Java.
The concepts of pointers, the stack, and the heap are too important to be overlooked; they may seem not to exist in high-level languages, but understanding them can make a real difference when building reliable, high-performance applications.
Abstracting these concepts is good as long as you understand them.
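A minimal C sketch of the stack/heap/pointer distinction described above (the function name is illustrative, and error handling for a failed malloc is omitted):

```c
#include <stdlib.h>

/* Copy a stack value to the heap through a pointer and back again:
   the three concepts above, in one place. */
static int roundtrip(int v) {
    int on_stack = v;                        /* automatic storage: lives in this
                                                function's stack frame, gone on return */
    int *on_heap = malloc(sizeof *on_heap);  /* dynamic storage: lives on the heap
                                                until free() is called */
    int *p = &on_stack;                      /* a pointer is just an address; the
                                                type only says how to interpret it */
    *on_heap = *p;                           /* read via the pointer, write to the heap */
    int out = *on_heap;
    free(on_heap);                           /* forget this and you leak */
    return out;
}
```

High-level languages do all of this behind the scenes; the point of the paragraph above is that knowing it happens is what matters.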
To alcohol! The cause of, and solution to, all of life's problems - Homer Simpson
----
Our heads are round so our thoughts can change direction - Francis Picabia
|
I used to think like that some years ago. But now I see that basic CS skills are more relevant than mastering one language or another.
Algorithm analysis is a good starting point. Learning how to build more efficient, faster, and more readable code is the foundation of good code.
Knowing the basic structures like linked lists, trees, and graphs, and solving basic problems like sorting and recursion, can sound a bit redundant considering the huge number of built-in libraries and frameworks available today, but I think such skills can help you solve the new computational problems you may find ahead in your career as a developer.
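For instance, a linked list with a recursive length function covers two of the basics mentioned above in a few lines of C (names are illustrative, and malloc is left unchecked for brevity):

```c
#include <stdlib.h>

typedef struct node { int value; struct node *next; } node;

/* Push a value onto the front of the list: O(1). */
static node *push(node *head, int value) {
    node *n = malloc(sizeof *n);
    n->value = value;
    n->next  = head;
    return n;
}

/* Recursive length: the classic base-case / recursive-case pattern. */
static size_t length(const node *head) {
    return head == NULL ? 0 : 1 + length(head->next);
}

static void free_list(node *head) {
    while (head) { node *next = head->next; free(head); head = next; }
}
```

Any standard library gives you this for free, but having built it once, you know why insertion at the front is O(1), why a deep recursion can blow the stack, and what the library is really doing for you.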
|
ASM is always the language to use. Roller Coaster Tycoon 1 proves it works in a production environment.
Anyway, I'm gonna get back to my C++ing now...
|
C is lower-level (closer to the hardware) than Java and C#. I don't know anything about Delphi so I can't comment on that.
For someone learning programming there are two schools of thought. The first is to start with the higher-level concepts using a higher-level language, and gradually delve deeper into understanding how they are implemented and what is going on at a machine level, if necessary. These days, the deeper delving is really not all that necessary, unless you are charged with writing very high-performance code.
The second school of thought is to start at the low level and learn upward, gradually abstracting away the lower-level concepts with higher-level ones. This more closely traces the evolution of computers and languages, and if you're really serious, probably provides the "best" understanding of the whole ecosystem, but it has a much steeper learning curve.
As for my opinion, if I were to recommend a path to someone I would probably choose the first method, of learning the high-level concepts first (probably with a dynamic language like JavaScript), and delving deeper where one is interested.
It really depends on how "serious" the subject is about learning computer languages. If they're darn serious, learning lower-level-up will provide the best understanding, but if they're not sure about it, starting at the top is the best way to discover if they have a passion for programming or not.
Sad but true: 4/3 of Americans have difficulty with simple fractions.
There are 10 types of people in this world: those who understand binary and those who don't.
{o,o}.oO( Check out my blog! )
|)””’) http://pihole.org/
-”-”-
|
You're wrong for thinking that there is a better language to learn than any other. But for a first-timer I would choose C++, C# or Java.
|
Cody227 wrote: I was always convinced that you need to know how a computer works at low-level to be able to write decent code...
Incorrect. Most developers write code for business applications. Most of the code they write is specific to solving business needs. So understanding the business and the application is how one writes "decent code".
It might, however, help to know some aspects of low-level hardware, but that is not only less likely to matter than it was in the past, it is even less likely to be possible, depending on what "low level" really means. For example, if you are running your newest server in the cloud you certainly need to know what '16 gig of memory' means, but it is absolutely useless to know the kind of memory. And if one ends up writing code for Windows, Macintosh, iPhone, Android, and Linux, with even some old mainframe adaptor code, then attempting to learn everything is impossible.
Cody227 wrote: But then there are things that really annoy me about C like null-terminated strings. They are so damn slow!
Business application performance problems almost never resolve to low level language problems. They are almost always architecture and design problems. Even more so when in less structured team environments (which is the norm and not the exception.)
Cody227 wrote: And at the same time he asked how to send an array of pixels with WinSock because the appropriate delphi-function only accepts strings.
And are you an expert in every possible technology API that you might reasonably encounter in the modern world? How are your iPhone skills? Done much real-time programming for embedded software on a RAID driver card? What about interfacing to the cash drawer on a PC? Or really creating an XML/XSD that actually does support international data, and not just claiming that it does? What about optimizing an Oracle database and an MS SQL Server database? And how does one set up a geographically redundant data center (and what are the trade-offs between hosting yourself and the various cloud possibilities)?
The vast, vast array of technologies means it is impossible to be an expert in all but a few. And one is likely to do more damage to one's career by even attempting to span several rather than sticking with a few (for example, embedded real-time drivers versus standard web business applications).
|
jschell wrote: And are you an expert in every possible technology API that you might reasonably encounter in the modern world?
I never claimed to be an expert or even intermediate. Besides, that was not a problem regarding the API; the problem here is that beginners who learn a very high-level language first do not know what data really is. (It wasn't even explained in various C/C++ books.) In fact, everything in memory is made of the same binary code, and you cannot tell what it is. It could be a picture, a string, a number, or even executable code. Type checking is only a way to help us remember what we want to do with this piece of memory. For an experienced coder like you that might be obvious, but for a beginner it's not.
IMHO knowing that is very important no matter what language you choose (not HTML though xD) because it allows you to bypass type checking if you need to. In my example you could use this knowledge to split the pixel array into pieces and cast them to a Pascal string so you can send them with WinSock, and after receiving, rejoin them. Of course you would also need to know that the first byte in a Pascal string determines its length, and that it wouldn't work with null-terminated strings (which is kind of important too, because many WinAPI functions use null-terminated wchar strings).
A basic knowledge of the stack, the heap, stack frames, pointers, and things like that might be very useful as well (for example when recursive functions cause a stack overflow, or when unsafe functions like gets() cause unexplainable behaviour). It also explains why you shouldn't put very big data on the stack, and why you should pass by reference when calling functions which need that data.
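The splitting idea described here can be sketched in C: pack raw bytes into Pascal-string-style chunks whose first byte holds the length, so a chunk carries at most 255 payload bytes and embedded zeros are harmless, unlike with null-terminated strings. The function names are invented for illustration; this is not the actual WinSock/Delphi interop code.

```c
#include <stdint.h>
#include <string.h>

enum { CHUNK_MAX = 255 };   /* a one-byte length field caps the payload at 255 */

/* Pack one Pascal-style chunk into out[256]; returns payload bytes consumed. */
static size_t pack_chunk(const uint8_t *data, size_t avail, uint8_t out[256]) {
    size_t n = avail < CHUNK_MAX ? avail : CHUNK_MAX;
    out[0] = (uint8_t)n;          /* length byte first, Pascal-string style */
    memcpy(out + 1, data, n);     /* raw bytes: zeros in the data are fine */
    return n;
}

/* Unpack a chunk: copies the payload into dst and returns its length. */
static size_t unpack_chunk(const uint8_t in[256], uint8_t *dst) {
    size_t n = in[0];
    memcpy(dst, in + 1, n);
    return n;
}
```

A 300-byte pixel buffer would go out as one full 255-byte chunk plus a 45-byte remainder; the receiver reads each length byte to know how much payload follows and rejoins the pieces.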
|
Cody227 wrote: I never claimed to be an expert or even intermediate.
However the point is that your example is one single API. And there are a vast number of them. Even as a general concept it will still fail to help in a vast number of APIs/technologies.
Cody227 wrote: In fact everything in memory is made of the same binary code and you can not tell what it is. It could be a picture, a string, a number or even executable code....
"In fact" ... I have been coding for 40 years; I started before Java/C# were even ideas, and first encountered C++ before there was even the idea of a standard for it. And I have written assembly, created some drivers, and written code that wrote/read directly to memory, disk drives, and other hardware. I only work on server-side applications, and even now a great deal of my time is spent interacting with external systems which have a very wide array of APIs.
So I don't need a tutorial on what the problem is that you are discussing.
But perhaps because of the breadth of my experience I understand how impossible it is to grasp every possible API variation. The vagaries of different types seldom matter. And keep in mind that my focus/expertise tends to require that I use far more of these than the average developer.
To be fair, what matters far more is that any single API/technology is very likely to be inadequately documented. Thus, even having an idea of how a specific method of an API should be used doesn't guarantee success. As a matter of fact, it is possible that two methods within one API might differ internally on something that should be the same.
|
BASIC and assembly were the first languages I used to program, and I thoroughly enjoyed both (though they could be frustrating at times). I reminded myself recently about assembly and binary arithmetic and related topics, simply because I find them enjoyable. I then went on to use C++, then C# (non-Web based), then Java, PHP and JavaScript et al. I enjoyed C++, and still use it on occasion, and got annoyed when I was 'coerced' to switch to C#, but after a while I preferred C#, though I never ever really got into ASP.NET, I just don't like it. Then I preferred Java, as I could think more about the purpose of the code I was writing and less about syntax or what pointers were doing. Then I went back to playing with assembly, because I enjoy it. However, having done all that I still feel like a Noob, because I have very little commercial experience (not my aim to get much either). So my point is - knowing about binary arithmetic and registers etc. is fun, but it isn't that much use today, except perhaps for the odd creative use of binary operators. That said, a keen programmer will surely learn about these things eventually, just out of curiosity - surely?
|
Couldn't this be done in Excel?
"Bastards encourage idiots to use Oracle Forms, Web Forms, Access and a number of other dinky web publishing tolls.", Mycroft Holmes[ ^]
|
Yes, Excel could do the same, but it would be far less flexible. Here are a few key reasons why using any kind of spreadsheet for this is worse:
- Reordering items requires the user to cut the row, insert a new row, and finally paste the row back in. There is no decent way of quickly reordering an item.
- Non-native Windows interface. In my opinion a spreadsheet equivalent will never look as clean or friendly.
- Not user friendly, especially for those like my parents who would find Excel confusing and are only accustomed to browsing the internet.
- Requires expressions (somewhat like coding) to make a similar sheet in Excel.
- An upcoming Rolling Total feature, disabling items (temporarily removing them from the count), is not possible in Excel.
- Not everyone can afford Excel, and Google Docs Spreadsheet has a good second or so of delay between changes (which is what initially infuriated me enough to create Rolling Total).
- Item quantity (a prefix of "2x" in the item name doubles the item total) is not possible in Excel without some fairly complex expression.
- The inline calculator (an upcoming feature) is also not possible in Excel.
|
First post is boosting your software.
Equals spam, equals spammer.
You will note that the original message is already gone.
See how we feel about this?
|
What!? This is bewilderingly the opposite of the reaction I was expecting; I would even have preferred to be completely ignored.
I was not boosting my software, I was simply asking for feedback to better understand what people think of it. Hence why I titled this "First Impressions For My Software".
I am gutted to see the original post deleted, as the moderators seem to have thought it was spam. I would expect people to see, for example, how I was asking for feedback, and how I personally replied to d@nish. A spammer would post with generally poor grammar and a vague message with a link, and then run.
|
While I agree it didn't look like spam, it did (to me, anyway) look like thinly-veiled advertisement.
That's the curse of a product that you ask money for. Any mention of it is interpreted as advertisement.
|
And when it's your first post as well...
|