|
In 1978, "Adventure" was the first game I ran on my (then) newly built Heathkit H8 computer -- built while I was an undergrad at a regional campus of Purdue. It's been so long that I can't remember where I got the executable that ran it. I'm not sure if I bought it or if it was freeware from some "dial-up" BBS downloaded at 300 baud on my acoustic coupler modem. Ever since I first set up my home Wi-Fi ages ago, its SSID has been "XYZZY." I'm quite sure none of my neighbors get the reference.
I feel so old. Of course, that's because... I am old.
|
|
|
|
|
Not seen any comments from Nagy for a while.
Hope he's ok...
|
|
|
|
|
He's still active on xitter (:spit:), but he hasn't posted here in just over a year.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
"x" makes the "sh" sound?
|
|
|
|
|
Indeed.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Fergus Donaldson wrote: Not seen any comments from Nagy for a while. The Gyn shops miss him too... A LOT
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Nagy's doing okay. He's still a happy worker bee.
|
|
|
|
|
I'm fine, just busy on things.
*watches another deadline flying past*
veni bibi saltavi
|
|
|
|
|
Hello people, my name is Emmanuel Katto and I'm from Uganda. I want to learn Java. Can anyone help me get started with the Java language?
|
|
|
|
|
The official documentation is probably a good place to start:
The Java™ Tutorials[^]
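Beyond the tutorials, a minimal first program is a good way to confirm your JDK setup works. This is the standard hello-world sketch (the class and file names here are just the usual convention, not from any particular tutorial):

```java
// Save as HelloWorld.java, compile with `javac HelloWorld.java`,
// then run with `java HelloWorld`.
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, Java!");
    }
}
```

If you see "Hello, Java!" printed, your compiler and runtime are installed correctly and you're ready for the tutorials.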
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
|
Here is an online tutorial that allows you to compile examples in the browser:
Java Tutorial[^]
|
|
|
|
|
Thank you so much for providing the guide.
|
|
|
|
|
I learnt Java from here.
https://see.stanford.edu/Course/CS106A
Though this series is dated, it is indeed a great way to learn programming in general. There may be a few syntax differences between this and today's Java, but the concepts sink in for sure.
Many of my juniors have also thoroughly liked the way of teaching here.
|
|
|
|
|
Amarnath S wrote: Though this series is dated
So is Java
|
|
|
|
|
What I meant was that Java will have accumulated many deprecations since then. Though there may be many syntax changes, the concepts do remain the same.
|
|
|
|
|
|
Yes, Java is getting long in the tooth, but I nevertheless enjoyed learning it. AND: it caused me to gravitate toward programming Android apps (through the Kotlin language), and that was even more fun! Now Lenovo is going to bring out desktops that run on Android! Will the fun never stop?
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Has no idea about this reduced. (8)
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Clueless ?
This = clue
reduced = less
Has no idea = Clueless
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
modified 27-Oct-23 5:15am.
|
|
|
|
|
YAUM! Care to edit in the explanation?
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Should that be
this == clue
?
😊
Tricky!
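For anyone who missed the pun: in C-family languages, including Java, a single `=` assigns while `==` compares. A minimal sketch of the difference (class and variable names are just for illustration):

```java
public class ClueCheck {
    public static void main(String[] args) {
        boolean clue = false;

        // Single '=' is assignment: (clue = true) sets clue and the
        // expression evaluates to true, so this branch always runs.
        if (clue = true) {
            System.out.println("assigned, not compared");
        }

        // Double '==' is comparison: tests the value without changing it.
        if (clue == true) {
            System.out.println("compared");
        }
    }
}
```

Java only allows this particular slip with booleans; for other types, something like `if (x = 1)` won't even compile, since a condition must be a boolean, which catches most of these bugs at build time.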
|
|
|
|
|
I'm currently moving all of my professional and hobby project development over to the ARM Cortex family of platforms.
ARM Cortex M7 > ESP32
Microsoft is doing something similar with their operating system.
Apple already has, with the M1 and M2, AFAIK.
ARM > Intel
There's no getting around that x86 is showing its age architecturally. Even discounting all the ancient backward compatibility, like "real mode", it's getting awkward.
I read this thread with some interest. Aside from some disagreements in the comments, overall it was very interesting, if taken with a grain of salt.
the_end_for_isa_x86[^]
One nice advantage for me is the ARM Cortex architecture is largely continuous from their little M0 real time chips all the way up to their multicore A line.
That means I can create code that will perform well across little devices and PCs.
This also has to be a huge win for developers of phone and tablet applications, that their work is more transferable to future PCs now.
The fact that ARM doesn't manufacture is also a huge win. They leave fabrication to outfits like NXP. ARM just designs chips. I read somewhere that their time to market for a new offering is about half that of Intel's.
Start moving your stock.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
modified 27-Oct-23 10:01am.
|
|
|
|
|
My wife, who is greatly technical, bought an Acer laptop after her HP ran its fan constantly. The Acer did the same thing out of the box. I am typing this on an ASUS Q502 i5 with Windows 7, my main dev ride, and it only runs its fan when I make it do so by accident. Long story short, my wife ditched both aforementioned machines for an M1 Mac with NO FAN. Sorcery, I say. She recently traded that one in for an M2. After having been saddled with CPU coolers since the 386sx (with MMX), it's crazy to see a machine so quick and responsive with no rotational cooling going on.
|
|
|
|
|
I have been fighting x86 since 1992-93, but lost the first battle: The IT department of the Tech. College where I was teaching had two alternatives for a set of 30 new machines, to be used for Unix software and also one of my courses, Computer Architecture (with assembler coding). The choice was made in a democratic manner: The educational staff of the department, including me, came out in favor of an M68030 based system. The department head was in favor of the x86 based solution. When he saw that the majority went against his preference, he announced: I can't be the head of a department that works against me. I quit! Find another department head! So the next day, we repeated the democratic voting, and this time the majority was in favor of the department head's preference, and he didn't quit. I had to teach Introduction to Computer Architecture on the messiest architecture around.
(Btw, Denmark got into the EU by a similar democratic vote. They had a referendum, giving a 'no' to joining the EU. The Danish authorities told the people that the answer was wrong, and gave the people another chance to give the right answer. The second time, The People understood what was expected of them, and Denmark joined the EU. Hooray for democratic processes! At least as long as they give The Right Answer.)
M68K didn't survive in the big markets. If it had, the RISC wave would have been mostly superfluous. So let's cross our fingers that the ARM architecture will be strong enough to fight down the x86/x64.
Although ARM started as a 'clean' RISC, it certainly isn't one any more today! The very first 'Thumb' instruction set laid the ground for irregular instruction coding, the need for an intermediate decoding level, and reduced regularity of the instruction set. That has grown 'worser and worser' with every new architecture revision; it is today very far from the RISC ideal of instruction word bits directly activating the various logic circuits. They have had to introduce caching and pipelining and lookahead and speculative execution and out-of-order execution and what-have-you of hardware speedup techniques. The instruction set has grown and grown and grown and ... certainly not always in an orderly, well-designed manner. AArch64 hasn't had as many years as x86/x64 to grow cancer, but the old saying that 'any sufficiently high-versioned standard is indistinguishable from a can of worms' is beginning to bite ARM as well.
Note that the discussion you are referring to is more than three years old. The thread is almost void of references to the AArch64 alternatives that were available even then, but has plenty of references to the M1 of 2007. It is tempting to suspect that a fair share of the commenters were not fully aware of the more recent (even then) updates to the architecture.
If you go into the detail, 'the ARM Cortex architecture is largely continuous from their little M0 real time chips all the way up to their multicore A line' does hold true for a sizable core. It does not hold for the Thumb instruction sets, and a number of the 'ordinary' instructions didn't make it to the 64-bit architecture. Compatibility at the binary level is significantly less than at the assembler source code level; some of the top AArch64 models have completely dropped support for AArch32. Vector instructions are now in the second version of their second generation.
Yet: I do like the general ARM architecture. I have come to love the register-based philosophy, with less reliance on the stack. I have seen how the system architecture for 'peripherals' integral to the CPU is great for extending the CPU in a SoC. I am really hoping that traditional PC manufacturers will soon come up with a broader range of ARM-based machines, covering even the more 'classical' kind of desktop machines in large cabinets, allowing for extensions with peripherals, memory, etc. that you can't do with a portable or tablet.
'The fact that ARM doesn't manufacture is also a huge win' - it is, but don't overestimate it. ARM provides a CPU core for anyone else to extend with their own (on-chip) peripherals; several architectural features are optional, and every manufacturer will pack the chip to their preferences. So you will rarely, if ever, see a 'plug-in compatible' chip from an alternate vendor. If you have to switch to another chip manufacturer, be prepared for another pin layout; your old chip may have had some useful peripherals that are missing in the new one (and if the new one has a similar peripheral, it is almost certainly managed differently), and some instruction codes may be invalid because that option was left out of your new replacement chip.
A common core is of course a great win. But the salesman speak is often a lot more rosy than realities, especially if you are making use of optional functions and on-chip peripherals.
|
|
|
|