|
You're welcome.
------------------------------------------------
If you say that getting the money
is the most important thing
You will spend your life
completely wasting your time
You will be doing things
you don't like doing
In order to go on living
That is, to go on doing things
you don't like doing
Which is stupid.
|
|
|
|
|
Be careful not to judge a language by the writings of an author. I must have a dozen Java books that are good for a bonfire and another dozen books on C that are wonderful. Java became a leading development language in application programming because of its usefulness, but I would never want to develop gaming programs with it. Think of languages as tools in a toolkit and you will go far.
Just as a hammer is terrible at turning screws, a scripting language will never be good at real-time applications. Other languages are much better at that.
Usually, experienced programmers can point you to good books, or better yet, good authors, to help get you where you need to go. CodeProject has a great breadth of experienced developers who are happy to help, and no, I am not getting paid to say that. Since Bjarne Stroustrup developed C++, he is an excellent source, but I found his book dry and slow going. Ask for good Java books and maybe you can get a leg up.
|
|
|
|
|
The first language is the hardest. Once you MASTER any single language, others will be easier to understand.
For Java, you might try the free ebook: Thinking in Java.
It is an okay learning book.
For Java, just remember that Object variables are only "leashes" or "reins" on an Object. They let you control something that wants to roam free (and be garbage collected when that happens). Primitives are values. All variables (including "leashes") are passed to and from functions by value/copying. So if you pass an object to a function, you have a second leash to the same object inside the function. Java object references act like C/C++ pointers, except that the Java language will not let you access the contents. In C/C++, you can dereference a pointer type via * and see the actual memory location.
The one piece of syntactic "sugar" that basic Java gives you is the double-quoted String constant: "Assign me to a leash or lose me forever".
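To picture that analogy in C/C++ terms, here is a tiny sketch (the names are made up for illustration): the pointer is the leash, the leash itself is copied into the function, and both copies control the same object.

#include <iostream>
#include <string>

struct Dog { std::string name; };

void rename(Dog* leash) {          // 'leash' is a copy of the caller's pointer
    leash->name = "Rex";           // changes the one shared Dog
    leash = nullptr;               // only drops the local copy of the leash
}

int main() {
    Dog d{"Fido"};
    Dog* myLeash = &d;
    rename(myLeash);               // the leash is passed by value (copied)
    std::cout << d.name << "\n";   // prints "Rex": the shared object changed
    std::cout << (myLeash != nullptr) << "\n"; // prints 1: the caller's leash is untouched
}

Java references behave the same way, except the language never lets you touch the underlying address.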
|
|
|
|
|
A few thoughts from me:
Why Java?
Java was the new kid of the '90s, when a new boom of development kicked off: companies were digitizing their processes and looking for ways to do it.
A: it was free; B: it had some features that eased development for new programmers.
New programmers were thrown at Java.
New programmers are cheap.
Move on 20 years: Java has its foot in lots of business applications because of that expansion phase, so there is plenty of work maintaining those systems, because the risk of rewriting them looks unproductive from a business perspective.
Tons of courses use Java to teach because they already have the course set up for it, and changing it would cost time and energy. Again, the language is free to use and teach from, so many courses adopted it.
After learning how to write code, it will be helpful to understand some business matters so you can manage clients. That is a whole different thing to understand, unless you are happy to be a code monkey (in the positive sense, not the negative one) and find a manager to deal with all the business. Even then you are still at the manager's whims: you may think the project should be done as X, but the manager says it must be done as Y.
Why learn concepts?
For 10 years after uni, I never once used an abstract class. I had no need.
I had no need because I was at a company whose product had no need for one.
Until I found a code base in which everything had interfaces. Now some of this might be C#-specific, but there is a lot of crossover. Interfaces were created for everything so that dependency injection could be used for testing (unit tests). That granular level of testing also took some learning, and once it clicked, the interfaces made sense.
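To sketch that idea in C++ terms (the rest of this thread's code is C++; all names here are purely illustrative): the code depends on a pure-virtual "interface", so a unit test can inject a fake instead of the real service.

#include <string>

struct IClock {
    virtual ~IClock() = default;
    virtual int HourOfDay() const = 0;
};

struct SystemClock : IClock {
    int HourOfDay() const override { return 14; }  // a real one would ask the OS
};

struct FakeClock : IClock {                        // what a unit test injects
    int hour = 0;
    int HourOfDay() const override { return hour; }
};

std::string Greeting(const IClock& clock) {        // depends only on the interface
    return clock.HourOfDay() < 12 ? "Good morning" : "Good afternoon";
}

int main() {
    FakeClock fake;
    fake.hour = 9;                                 // the test controls the "time"
    return Greeting(fake) == "Good morning" ? 0 : 1;
}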
For some people, understanding B before A is more helpful than A then B. I am one of those.
"Oh no, you never did tests before!?"
Yes and no. I had tried unit testing to little avail, so the previous projects I worked on relied more on integration tests, automated where possible, plus dedicated testers following manual test sheets. So there was little need to learn that style of programming.
So, back to abstract classes. I noticed that there could be a use for an abstract class. This was not really me thinking in OOP terms, but understanding some C# features and wanting to trim some code fat.
Consider this: there are lots of clients that need food.
We offer Breakfast, Lunch, and Dinner, but each needs to be implemented differently for each client.
So we make an interface for Breakfast, Lunch, and Dinner.
(An interface is just method names, properties, and return types; only the public methods.)
So each client has to implement each method, even if they don't want something.
So given some 20 clients, where many only want Lunch, that means methods which just return "does not want Breakfast".
This is where an abstract class made sense: I don't want to allow the abstract class to be instantiated, but I do want to make a default return for each method.
Then the client classes only need to override the methods they use, as sketched below.
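Here is a rough sketch of that meals example in C++ terms (class names and the default text are invented for illustration): the pure-virtual "interface" forces every client to implement every meal, while the abstract-style base provides defaults so a client overrides only what it uses.

#include <iostream>
#include <string>

// The "interface": every client must implement all three meals.
struct IMealProvider {
    virtual ~IMealProvider() = default;
    virtual std::string Breakfast() = 0;
    virtual std::string Lunch() = 0;
    virtual std::string Dinner() = 0;
};

// The abstract-style base: defaults for everything, direct instantiation
// blocked by the protected constructor.
class MealProviderBase : public IMealProvider {
protected:
    MealProviderBase() = default;
public:
    std::string Breakfast() override { return "does not want Breakfast"; }
    std::string Lunch() override     { return "does not want Lunch"; }
    std::string Dinner() override    { return "does not want Dinner"; }
};

// A client that only wants Lunch overrides a single method.
class LunchOnlyClient : public MealProviderBase {
public:
    std::string Lunch() override { return "soup and a sandwich"; }
};

int main() {
    LunchOnlyClient client;
    std::cout << client.Breakfast() << "\n";  // default from the base
    std::cout << client.Lunch() << "\n";      // client-specific override
}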
end
When learning, you need to over-learn.
Then when you go to implement, it is often a matter of cutting back. If you are only going to have 2 clients in the above example, then the interface/abstract approach might be overkill.
Some developers prefer to err on the side of writing it to be extendable the first time, so they don't have to rewrite. Others are not able to see that far ahead the first time and so need that extra rewrite stage.
Also, ask your teacher.
|
|
|
|
|
I went to University to study Computer Science. What I can tell you is that once you have learned four or five languages, you will find that learning new ones gets easier. So, you have two options that are fairly obvious to me. One would be to find more books on Java; you could probably do this for free in a public library since the language is so popular. The second option would be to look ahead at the other courses you will be taking and what languages they use, then start learning those languages now; that will help you gain greater insight into programming, which will help with Java, and you will have a head start on what is coming up in the future. This second option will work, but it will also take longer, so you do have a big choice here.
|
|
|
|
|
I spent over an hour tracking down a compile issue in my C++ code yesterday because I forgot to mark a derivation as virtual.
C++ can be finicky but between it and C they really are the only languages that can fulfill the promise of "code once, run anywhere" despite requiring a recompile to do so.
For a long time, I was hoping .NET would at least become as ubiquitous as Java, and it slowly has (almost).
But my devices have gotten smaller, my needs of my language more demanding, and garbage collection will always hamstring these higher level "device independent" languages in terms of where they can operate.
I do love C#, but it just won't run on what I code for these days. C++ is my first love anyway. template just owns all, and where it doesn't, the preprocessor steps in. It's gorgeous. I can convince the compiler to do just about anything. Being a fan of domain-specific languages and coding, I also find C++ is the only major language with a flexible enough compiler and preprocessor to make that a reality.
I started my JSON pull parser in C#
I ported it to compile with the Arduino SDK/toolchain
Then I ported it to compile with C++ in general (without being Arduino specific)
Now it runs in places my C# pull parser never will, and that's deeply satisfying.
Besides, money for me today is in C++ code. I just don't like business development, and C++ keeps bizdev contracts out of my hair. Nobody wants to pay for a backend or frontend implementation of a web based site or otherwise business application in C++. Good.
This is almost a dear john letter to C#
Also I'm kinda falling in love with VS Code
Real programmers use butterflies
|
|
|
|
|
La Strega wrote: I forgot to mark a derivation as virtual. It's now recommended that only the original function be virtual and that derivations only be override or final. So I guess you're using a version of C++ that doesn't have those tags yet.
C++ can be finicky but between it and C they really are the only languages that can fulfill the promise of "code once, run anywhere" despite requiring a recompile to do so. That's damning with faint praise given that you have to develop or find portable libraries that start with sockets and go all the way up the food chain. But the STL does have <thread>, which is good for building toy systems!
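For anyone following along, a minimal sketch of that convention (illustrative names only): virtual appears only on the original declaration, and everything below it says override or final.

struct Base {
    virtual ~Base() = default;
    virtual void Draw() {}       // original declaration: 'virtual' lives here
};

struct Derived : Base {
    void Draw() override {}      // overrides Base::Draw; 'virtual' not repeated
};

struct Leaf : Derived {
    void Draw() final {}         // last overrider: nothing below may override
};

int main() {
    Leaf leaf;
    leaf.Draw();
}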
C++ keeps bizdev contracts out of my hair. I can see the appeal, but if you have that library that goes all the way down to sockets, you might not escape so easily!
This is almost a dear john letter to C# I was going to suggest jill, but that would be hypercorrection.
|
|
|
|
|
Greg Utas wrote: It's now recommended that only the original function be virtual and that derivations only be override or final. So I guess you're using a version of C++ that doesn't have those tags yet.
I'm doing that. I'm talking about the inheritance list, like
class B : virtual public A {};
Like that
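For reference, a tiny sketch of where that virtual matters (made-up names): in the diamond below, the two inheritance paths share a single A subobject instead of each getting their own.

struct A { int value = 0; };
struct B : virtual public A {};
struct C : virtual public A {};
struct D : public B, public C {};

int main() {
    D d;
    d.value = 42;    // unambiguous: B and C share one A
    return d.value;  // drop either 'virtual' above and this access becomes ambiguous
}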
Greg Utas wrote: That's damning with faint praise given that you have to develop or find portable libraries that start with sockets and go all the way up the food chain.
That's true, but the same is the case with embedded gadgets generally, I've found. Ecosystems gotta ecosystem.
Greg Utas wrote: I can see the appeal, but if you have that library that goes all the way down to sockets, you might not escape so easily!
I'll try to avoid it.
ETA: Congratulations on what I am assuming was a 2nd prize win for last month.
Real programmers use butterflies
|
|
|
|
|
class B : virtual public A {}; I'd forgotten about this use of virtual because I don't use multiple inheritance. You're naughty!
Congratulations on what I am assuming was a 2nd prize win for last month. The rewrite article[^] was a co-winner. I've noticed that they give out 2 firsts and 2 seconds for these contests.
|
|
|
|
|
What's so bad about multiple inheritance? It's one of the things I love about C++.
Say I have a class that manages a cursor over a streaming input and a capture buffer for reading data into.
Say I want to separate capture buffer storage concerns from input device concerns.
So I make a LexContext derivative for keeping a fixed length buffer (StaticLexContext<S>) and one for reading from a FILE* - FileLexContext. Then I can just do
template<const size_t S> class StaticFileLexContext : public FileLexContext, public StaticLexContext<S> {
public:
    StaticFileLexContext() {
    }
};
and Bob's your uncle.
If you refuse to use multiple inheritance, you will either need to duplicate code, or create a third "helper/utility" class or module to keep the shared implementation. =(
Real programmers use butterflies
|
|
|
|
|
It was more of a troll because of the religious debates. I'd lean toward the recommendation to only multiply inherit from classes that themselves derive from an abstract class. Without the abstract classes, admixed classes tend to make assumptions about each other, so refactoring is needed to make other combinations work later. And sometimes there is no "later".
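A small sketch of that guideline (names are illustrative): each mixed-in class sits behind its own abstract class, and the combining class multiply inherits from those, so it composes contracts rather than classes that assume things about each other.

struct Readable {                       // abstract class / contract
    virtual ~Readable() = default;
    virtual int Read() = 0;
};
struct Writable {                       // abstract class / contract
    virtual ~Writable() = default;
    virtual void Write(int value) = 0;
};

struct MemoryReader : Readable {        // concrete mixin behind its abstract base
    int Read() override { return 42; }
};
struct MemoryWriter : Writable {        // concrete mixin behind its abstract base
    void Write(int) override {}
};

// Multiple inheritance only from classes that derive from an abstract class.
struct MemoryPipe : MemoryReader, MemoryWriter {};

int main() {
    MemoryPipe pipe;
    pipe.Write(pipe.Read());            // both roles available on one object
}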
|
|
|
|
|
Greg Utas wrote: Without the abstract classes, admixed classes tend to make assumptions about each other, so refactoring is needed to make other combinations work later.
That's a good point.
I tend to think of this scenario as intermixing class and interface, but I still conceptually treat my two groups of virtual functions in the base (to be filled by two separate derived classes typically) as two "interfaces" even though they don't have an abstract base to back them. The abstract base is mixed into the base class itself.
I can see avoiding that for complicated classes or classes that will be worked on by multiple people, and classes that will likely be added to a lot, because it quickly becomes a source of confusion as well as a maintenance hole.
However, for a lot of really simple composable legolike core classes I find this technique to be unobtrusive (not requiring a ton of type definitions) and relatively maintainable.
It's a "good enough" approach (IMO) when you don't need full on separate interfaces/contracts, and you want to keep implementation somewhat together. It quickly breaks down for complicated things though so it's basically a shortcut for simple cases.
Real programmers use butterflies
|
|
|
|
|
In the early days of Atmel devices, C++ was very clumsy and not many people used it; they mostly used C. I kind of led a crusade for C++, and thank heavens in the last couple of years it has become very mature.
Disclaimer: I was in no way responsible for its maturity, just thankful for it.
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
JaxCoder.com
|
|
|
|
|
Yeah, you kind of need C++11 at least or it's not that grown up. I really like a lot of the features they are adding to the language this century, and I'm excited about C++20. With C#, other than yield/async/await/var, all the other additions have been pretty useless and marginally annoying to me.
Real programmers use butterflies
|
|
|
|
|
Yeah, 11 was a great update, and like you I'm anxious to see 20 hit the streets.
I've been tied up remodeling our house for 4 months now, and it will probably be a couple more before I can get back to any kind of tinkering. I don't even have my computer here, and I have a brand new CNC machine assembled on my workbench at the old townhouse that I've not even had a chance to use. I've got a ton of projects lined up and I'm rarin' to get to it.
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
JaxCoder.com
|
|
|
|
|
I really like 17 - solely because inline static initialization is a very nice addition.
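A quick sketch of that C++17 feature (made-up names): the static data member is defined and initialized right inside the class, with no separate out-of-line definition in a .cpp file.

struct Config {
    inline static int maxRetries = 3;            // C++17: defined here, once
    inline static const char* name = "default";  // works for non-integral types too
};

int main() {
    return Config::maxRetries;  // 3
}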
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Good old binary compilation. You can get the latest features in your code by switching to a newer compiler without waiting for the latest "framework"/virtual machine to be deployed everywhere. I mean, well the stdlib and STL stuff notwithstanding.
Real programmers use butterflies
|
|
|
|
|
Part of the clumsiness might have been Embedded C++: no exceptions, no templates, no RTTI. The rationale was a concern over code bloat, but they should have taken the whole language and let individual development teams manage any bloat.
|
|
|
|
|
I agree with that regarding templates. But with structured exception handling you potentially have OS-level interop to deal with, depending on whether your "OS"/CPU supports it or is "exception aware", and that can be architecture dependent. Also, I can't imagine even trying to do SEH on my little 8-bit ATmega2560. I can see there being barriers to implementing an SEH mechanism depending on the platform and on how it is implemented, but maybe I'm wrong.
I've never actually used RTTI, so I wouldn't miss it.
Templates only bloat your code depending on how you use them. It's fine - even desirable - to use them on embedded, because things like fixed-length buffers usually require some sort of maximum size tied to a block of memory that was fixed at compile time. I do that very thing with templates on a lot of my classes for the Arduino.
template<const size_t S> class StringBuilder { ... }; like that, where S is the size of your fixed buffer.
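A slightly fuller sketch of that pattern (this is not the actual StringBuilder, just an illustration): the size is a compile-time parameter, so the storage is fixed up front and nothing touches the heap, which suits small embedded targets.

#include <cstddef>

template<const size_t S> class FixedBuffer {
    char m_data[S];       // storage reserved at compile time
    size_t m_length = 0;
public:
    bool append(char c) {
        if (m_length >= S) return false;  // buffer full: caller decides what to do
        m_data[m_length++] = c;
        return true;
    }
    size_t length() const { return m_length; }
};

int main() {
    FixedBuffer<64> line;   // 64 bytes, known at compile time
    line.append('h');
    line.append('i');
    return static_cast<int>(line.length());  // 2
}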
Real programmers use butterflies
|
|
|
|
|
I wouldn't miss RTTI either.
True C++ exceptions shouldn't be a problem. But a platform that didn't support POSIX signals would be degenerate for nasty exceptions, so you'd have to design accordingly. Even the full C++ standard says that those are basically undefined behavior.
|
|
|
|
|
Yeah. I'm not really holding the lack of SEH support on embedded against anyone, is my point. Maybe it could be done on some, but it's probably hit or miss whether it would even work "properly".
I love C++, and even if I can only use some of it over plain old C, I'll take it. C++ is a beautiful language, in whole or part.
It's truly my favorite. I missed it during my C# sabbatical but I forgot how much I loved it until I came back to it.
Also VS Code makes working with C++ in linux a joy. Kudos to Microsoft.
Real programmers use butterflies
|
|
|
|
|
One can hope that herbceptions will help there, but the proposed treatment for out-of-memory errors is a bit questionable.
|
|
|
|
|
Out of memory will always be a problem without a separate stack, as in sigaltstack.
These guys are still clueless. The standard should mandate the ability to turn a POSIX signal into an exception in a signal handler. Not all platforms could support it, but so what. This also says something about the inane fetish for noexcept.
I would write a proposal but have better ways to spend time than debating pedants in the C++ standards community.
|
|
|
|
|
I traded my Ferrari for a golf cart because the Ferrari couldn't carry my golf clubs.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
And OP's analogy is "I traded my golf cart for a Ferrari because I didn't need to carry golf clubs."
|
|
|
|