And I never knew...
Thou shall not jog: Proverbs 28:1
"The wicked run when no one is chasing them."
Seems pretty clear, that's good enough for me
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
I see you have joined the exercise abstention program. So have I.
It's God, not a Pixie, but yeah, I gotta get that on the back of a dog-walkin' shirt. I'm forever passed by joggers.
Is the urge to sing "The Lion Sleeps Tonight" just a whim away, a whim away, a whim away ... ?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
I get the feline that it's all very simba-lic and no claws for concern: just a token gesture.
Ravings en masse
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
Yer not lion, cuz you Musafa had some idea you Scar points with that King of a comment.
The best way to improve Windows is to run it on a Mac.
The best way to bring a Mac to its knees is to run Windows on it.
~ my brother Jeff
Never forget that OG is no mere cat.
Is there a tail to tell? No thoughts for almost a week!
... such stuff as dreams are made on
Check The Soapbox .... they were more risqué last week.
The best way to improve Windows is to run it on a Mac.
The best way to bring a Mac to its knees is to run Windows on it.
~ my brother Jeff
THX
... such stuff as dreams are made on
... written in response to today's Lounge posts on Python and "pattern code bloat," and definitely in response to this comment by Marc Clifton:
Quote: in fact it is actually easier to teach good programming in a highly constrained language.
Now, given "modern" languages with all their features (generics, templates, classes, interfaces, lambda expressions, typeless or typed, etc.), teaching good programming is harder, for the simple reason that nobody actually seems to teach programming principles first and then how to apply those principles to a particular language. Perhaps we need a "dumbed-down" version of C#; call it "CBlunt" ?
No generics, or lambdas, or all that other fancy stuff we love religiously ... after we have broken our minds on it. Simple iterators only: for, foreach, while ?
Strongly typed ? Braces required ? White-space ignored ? I'm a true believer on those.
Now, imagine we've "raised a worthy young programmer" on principles (algorithms ? design patterns ?) ... What do we have ?
Someone who has no idea of asynchronous user-action driven event-handling in visual user interfaces ?
Someone who is blissfully unaware of the grotty nature of hooking up to a database, and CRUDding around with it ? Or, inter-op with the net ?
Someone who is blissfully unaware of bit-maps, raster-ops and bit-twiddling ?
Every one of these (rhetorical) questions I ask myself, frequently.
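For what it's worth, here is a toy of the "CBlunt" constraint, sketched in Python since the Lounge posts were about Python (my own hypothetical example, not any real language proposal): the same little task written once with only plain loops and explicit state, and once in the "fancy" style a beginner meets in modern languages.

```python
# Hypothetical "CBlunt"-style code: simple iterators only, explicit
# accumulator, no lambdas or comprehensions.
def sum_even_squares_constrained(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total = total + n * n
    return total

# The "modern" one-liner the beginner meets instead.
def sum_even_squares_fancy(numbers):
    return sum(n * n for n in numbers if n % 2 == 0)

print(sum_even_squares_constrained([1, 2, 3, 4]))  # 20
print(sum_even_squares_fancy([1, 2, 3, 4]))        # 20
```

Same result either way; the question is which one you can teach principles with first.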
cheers, Bill
«... thank the gods that they have made you superior to those events which they have not placed within your own control, and rendered you accountable for that only which is within your own control. For what, then, have they made you responsible? For that which is alone in your own power—a right use of things as they appear.» Discourses of Epictetus, Book I:12
BillWoodruff wrote: Perhaps we need a "dumbed-down" version of C#, call it "CBlunt" ?
You do know what BASIC originally was an acronym for? Back to square one, and then some: people will try to use your dumbed-down language for serious business and then start to demand more advanced features. Many years of evolution later, someone comes to the conclusion that we need a dumbed-down language for beginners...
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
My first programming experience was when Beginner's All-purpose Symbolic Instruction Code was really basic: at most 286 numeric variables (A-Z and A0-A9 .. Z0-Z9) and 26 string variables (A$-Z$).
Which reminds me of C.A.R. Hoare: "I don't know what the programming language of the year 2000 will look like, but I know it will be called FORTRAN". I actually read it when it was first published (it must have been in CACM or SIGPLAN Notices), in the discussions of which extensions to include in Fortran 77. When I first encountered a Fortran 2003 program, I didn't recognize the language at all - but it was called FORTRAN ...
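Just to show how tight that old naming rule was, here is a quick Python sketch (mine, not from any real interpreter) that checks names against it and counts the legal numeric ones:

```python
import re
import string

# Early-BASIC naming rule: a numeric variable is a letter, optionally
# followed by one digit; a string variable is a letter followed by '$'.
def is_numeric_var(name):
    return re.fullmatch(r"[A-Z][0-9]?", name) is not None

def is_string_var(name):
    return re.fullmatch(r"[A-Z]\$", name) is not None

# 26 bare letters + 26 letters x 10 digits = 286 numeric names in total.
numeric_names = list(string.ascii_uppercase) + \
    [l + d for l in string.ascii_uppercase for d in string.digits]

print(len(numeric_names))    # 286
print(is_numeric_var("A7"))  # True
print(is_numeric_var("AB"))  # False - two letters was already too fancy
print(is_string_var("Z$"))   # True
```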
What we need is a programming safety video:
homeland security knocking down their door and arresting them for going to the wrong website,
all the diseases they can get from the keyboard,
their bank account being cleaned out for reading the wrong email
....and other such items
Scare 'em good so they won't want to learn; regardless of the language taught, they'll only end up on CP and SO asking people to do their homework anyway.
Leave programming to the nerds that have figured it out for themselves (and usually know more than the teacher anyway.)
Some people are good at sports, others are good at cooking, ... those that aren't should just stick to watching it on TV. Likewise programming. (Although it would make for damn boring TV - except perhaps the episode on cracking pron site passwords and clearing your cache/history afterwards.)
Signature ready for installation. Please Reboot now.
Lopatir wrote: Leave programming to the nerds that have figured it out for themselves (and usually know more than the teacher anyway.)
This, Sir, is absolutely politically uncorrect and stands for the social injustice that's being done to women, the mentally challenged and all the others who are !nerds!
You must be sexist, elitist and a thousand other sorts of *.ist, except perhaps terrorist, socialist or communist.
Thanks for saying it like this.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
Riddle me this
If politicians are liars and BS artists, does being "politically incorrect" mean telling the truth?
Signature ready for installation. Please Reboot now.
In my little world, yes.
In the real world it can also mean that someone is deliberately doing it to get ahead with his own agenda.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
CodeWraith wrote: This, Sir, is absolutely politically uncorrect and stands for the social injustice that's being done to women, the mentally challenged and all the others who are !nerds!
We have now reached the logical point of non-discrimination - even discrimination on the grounds of ability is forbidden.
EDIT: s/not/now/ (damned autocomplete )
Ad astra - both ways!
modified 2-May-18 11:15am.
Discrimination certainly is not acceptable, but creating an illusion for someone who beyond any doubt lacks the ability in question is almost as cruel, and it does not end well.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
What we need is a completely new paradigm for programming. We're still using constructs created in the 1950s, based on human-labor workflow processes created in the 1800s, based on language constructs created thousands of years ago - constructs that are themselves poor at communicating between ourselves and even worse at communicating with machines.
BillWoodruff wrote: Someone who has no idea of asynchronous user-action driven event-handling in visual user interfaces ?
You could start with real life. Whatever the language is, everything should be asynchronous, event driven, and visual.
And we really need to get away from the linear and monolithic workflow paradigm of do a, then b with result of a, then c with result of b, etc., and into a data-driven agent based paradigm: "oh, I have foo that I know how to convert to a bar."
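That "I have a foo that I know how to convert to a bar" idea can be sketched in a few lines (my own toy in Python; foo/bar/baz are placeholders): agents declare what kind of data they consume, and a tiny dispatcher fires whichever agent matches whatever is currently on the blackboard, with no fixed a-then-b-then-c script.

```python
# Placeholder agents: each knows how to turn one kind of data into another.
def foo_to_bar(data):
    kind, value = data
    return ("bar", value * 2)

def bar_to_baz(data):
    kind, value = data
    return ("baz", value + 1)

# Agents keyed by the kind of data they know how to consume.
agents = {"foo": foo_to_bar, "bar": bar_to_baz}

def run(blackboard):
    # No linear workflow: whichever agent matches the current data fires,
    # until nothing on the blackboard has a taker.
    kind, value = blackboard
    while kind in agents:
        kind, value = agents[kind]((kind, value))
    return (kind, value)

print(run(("foo", 5)))  # ('baz', 11): foo 5 -> bar 10 -> baz 11
```

The "workflow" here is emergent from what the agents can consume, not scripted anywhere.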
Marc Clifton wrote: And we really need to get away from the linear and monolithic workflow paradigm of do a, then b with result of a, then c with result of b, etc., and into a data-driven agent based paradigm: "oh, I have foo that I know how to convert to a bar."
Do we? That may solve some of your problems and open new cans of worms for many others.
No matter how many sorts of 'by-something-driven' paradigms we invent, we will always trade one complication for another. Also, complexity does not magically go away. You can only choose at which end you want to deal with it. If you choose well, the disadvantages will not matter to your application, but that does not make your choice a silver bullet.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
CodeWraith wrote: we will always trade one complication for another.
I totally agree. In a distributed-computing, real-time, sharded-data world, though, the old "a -> b -> c" workflow pattern seems not just archaic but untenable, IMO.
As if every algorithm lent itself to parallel processing the way, let's say, raytracing does. Ask Henry Ford what to do when you have a strictly linear task at hand.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
CodeWraith wrote: Ask Henry Ford what to do when you have a strictly linear task at hand.
You create a hundred worker threads; each one does one small operation and passes the work on to the next worker, maximizing each worker's effort and minimizing the time it waits for work. In other words, an assembly line.
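That assembly line, as a toy Python pipeline (three stages rather than a hundred, and a sketch only): each worker thread does one small step and hands the item down a queue to the next.

```python
import queue
import threading

def stage(inbox, outbox, op):
    # One worker on the line: take an item, do one small step, pass it on.
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut down and pass it along
            outbox.put(None)
            return
        outbox.put(op(item))

# Wire three stages together with queues as the conveyor belts.
q0, q1, q2, q3 = (queue.Queue() for _ in range(4))
steps = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]
for inbox, outbox, op in zip([q0, q1, q2], [q1, q2, q3], steps):
    threading.Thread(target=stage, args=(inbox, outbox, op), daemon=True).start()

for n in range(5):                # feed raw material onto the line
    q0.put(n)
q0.put(None)

results = []
while (item := q3.get()) is not None:
    results.append(item)
print(results)  # [-1, 1, 3, 5, 7]
```

Each stage only ever waits on its own inbox, so while item 3 is being doubled, item 4 can already be incremented: the Ford trick, even though each individual item still goes through a strictly linear sequence of steps.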