|
Yep! Quora is horrible - and the original post was exactly the sort of thing you get on there. Basically, clickbait. Apart from the link to the article, there was nothing. I'm guessing it was an attempt to get some hits on the article.
If it was a genuine attempt to discuss the merits of calculus, the original poster made zero effort. Opinion - none. Background to the article - none. Summary - none. Just stick in the link and send us all off to read it. And - maybe - if the article had had some value, I wouldn't be so critical. But it didn't.
|
|
|
|
|
If someone is getting a Computer Science degree to be a computer scientist, then yes, they should learn all the math; actual CS is heavily math-based, computer science being essentially a specialized area of mathematics.
However, if someone is getting a CS degree to become a professional software developer, then for most CS jobs you won't need the math, as many of the replies have pointed out.
The biggest issue I see is that somewhere along the way it was decided that to get a job as a developer, a degree in CS (or related field) became required, which is absurd.
What is actually needed is more of a "trade school" for software developers that is accepted by the business community. Coding boot camps don't cut it, as they are just too short; it should be a couple of years of study at least, but focused on software development, not computer science. Which won't happen - the corporate world has turned university studies into de facto trade schools and is happy with it being that way. So, people who have no need for three years of higher math will continue to have to suffer through it, and we will continue to unnecessarily "weed out" people who would otherwise be fine developers for 80% of the programming jobs out there.
Of course, there are many programming jobs that DO need that level of math, and those are the jobs that should be listing a CS degree as a requirement. Most jobs should not have that requirement though.
|
|
|
|
|
Gjeltema wrote: Computer Science degree ... What is actually needed is more of a "trade school" for software developers that is accepted by the business community.
At least in my experience, a CS degree by itself is useless as a measure of how well someone will work as a developer.
What often happens though is that while they are getting that degree they actually get a job through the school which allows them to get paid to program. That experience, not the degree, is what gives them the knowledge to get other jobs.
|
|
|
|
|
Sure they can. Even back in my undergraduate days (82 - 86), my school had two different paths to becoming a "programmer" (using your word). The Computer Engineering degree was in the Engineering department and was identical to the Electrical Engineering degree for the first two years, so calculus, physics, and chemistry were requirements. The Computer Science degree was in the Arts and Sciences department, and only had some logic-type math class requirements, but no physics or chemistry per se; they had a breadth requirement for some 100-level science class but they could pick which one. My roommate was a CS and I was a CpE. We both have had very successful careers in "programming."
I've found the most important thing to being a success in programming is the ability to know how to solve problems. Know your problem domain, know what tools and languages are available and what support they give you, know the "usual" approach to solving a related problem, know when to throw out the usual approach, and most importantly, know that you don't know everything.
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
|
|
|
|
Exactly
Plus the sheer enjoyment of the job.
As a beginning, entry-level report programmer at a small local Savings & Loan,
many days I left feeling as though I had not worked at all, because I was having so much FUN writing (easy) code - AND being paid for it!
|
|
|
|
|
I graduated in 2000. I started my freshman year going for a Computer Science degree. After barely passing my math class my first semester, I transferred to the Computer Information Systems degree. I was fortunate that my university offered both. My CIS degree had no additional math requirements. There is a need for both degrees, and 24 years later, I'm still slinging business software. The most difficult math I have to do is finding the weighted cost of an item in a shopping cart when applying a credit. I'll never take a job writing gaming engines or tracking real-time objects, but I'm okay with that.
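For readers curious what that kind of "hard" business math looks like, here is a hypothetical sketch (the function name and the proportional-split policy are my own assumptions, not the poster's actual code): spread a credit across cart items in proportion to price, keeping the cents exact.

```python
# Hypothetical sketch: allocate a credit across cart items in
# proportion to each item's price, working in integer cents so
# the allocated shares sum exactly to the credit.
def allocate_credit(prices_cents, credit_cents):
    total = sum(prices_cents)
    shares = [credit_cents * p // total for p in prices_cents]
    # Hand out any cents lost to integer division, one at a time,
    # so the shares add up to the full credit.
    remainder = credit_cents - sum(shares)
    for i in range(remainder):
        shares[i % len(shares)] += 1
    return shares

print(allocate_credit([500, 300, 200], 250))  # → [125, 75, 50]
```

The integer-cents approach sidesteps floating-point rounding, which is usually the real trap in this kind of cart arithmetic.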
Hogan
|
|
|
|
|
Tricky. If the question is should calculus be a prereq for CS, I'd say only if you also want to require some EE background, in which case you need calculus to solve many of those problems. I do think a minimal amount of EE background is worthwhile, even if not critical; at my school you could do CS either as a BS or BA; I chose BS, so I did get the EE prereqs and am a little biased. Otherwise, though, calculus has little direct relevance to CS. Much more relevant would be Boolean logic, or maybe even a philosophy of logic course (it never hurts to hone the verbal skills too, and it trains the mind just as assuredly as pure math study).
All that said, if you take it as a given that intro calc is required, and yet the student can't pass the class, if I were their advisor I'd be concerned. Basic calc may be tough to understand intuitively but is rather easy to apply superficially - which also happens to describe a lot of programming problems. They don't have to be able to prove the fundamental theorem of calculus from scratch, but if they can't at least follow the spoon-fed algorithms for obtaining simple derivatives and integrals, and/or don't have the ability to break down complex problems into smaller ones (and then apply said spoon-fed algorithms), then they might also have a hard time tackling difficult programming problems.
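To make the "spoon-fed algorithm" point concrete, one such rule is purely mechanical. A minimal sketch (my own illustration, not from the post): the power rule applied to a polynomial stored as a coefficient list.

```python
# The power rule, d/dx (a_n * x^n) = n * a_n * x^(n-1), applied
# mechanically to a polynomial given as coefficients
# [a0, a1, a2, ...] for a0 + a1*x + a2*x^2 + ...
def differentiate(coeffs):
    # Multiply each coefficient by its exponent, then drop the
    # constant term (whose derivative is zero).
    return [n * a for n, a in enumerate(coeffs)][1:]

print(differentiate([5, 3, 2]))  # d/dx (5 + 3x + 2x^2) → [3, 4], i.e. 3 + 4x
```

A student who can follow a recipe like this - and decompose a bigger expression into pieces the recipe applies to - is exercising exactly the skill most programming work demands.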
This is nothing unique to calculus vs. other advanced math, though. As with 99% of people who learn any advanced math, it's not about whether they will use it later in life (I've literally never applied calculus in real life, to my recollection). It's about training to, and proving they can, solve complex problems given the tools needed to do so. They don't have to study calculus, or any other specific form of advanced math, to be a good programmer, but if they can't handle it after trying, I'd say it's something that at least should be looked into and understood why not.
|
|
|
|
|
Most everyone else will probably say similar, but if the student goes the business route, the student likely won't need anything more than algebra for many tasks. If the student wants to work for Lockheed Martin, Boeing or write modern video games, then the student should probably master differential equations. Passing calculus is easy. Application of calculus is harder. Of course, I think many gaming engines handle physics for you.
|
|
|
|
|
My math background sucked, largely courtesy of a couple of bad teachers in high school. When I hit university, Computer Science was about computers, languages, and programming. We were taught how computers actually worked, and how to make them dance and sing. Every Comp Sci degree programme I've looked at in the past thirty years looked more like a Math degree than Comp Sci. Our final assignment in my second year Comp Sci course was to write, in assembler, a square root function that was shorter, faster and more robust than the one supplied with the operating system. A real programming problem.
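The original assignment was in assembler, so the following is only an illustration of the kind of iteration such a routine typically uses - a Newton's method sketch in Python (my own reconstruction, not the poster's solution):

```python
# Newton's method for sqrt(x): repeatedly replace a guess with the
# average of the guess and x/guess. Each step roughly doubles the
# number of correct digits.
def my_sqrt(x, tol=1e-12):
    if x < 0:
        raise ValueError("negative input")
    if x == 0:
        return 0.0
    guess = x if x >= 1 else 1.0
    while abs(guess * guess - x) > tol * x:
        guess = (guess + x / guess) / 2.0
    return guess

print(my_sqrt(2.0))  # ≈ 1.4142135623730951
```

In assembler the interesting parts are exactly what this sketch hides: picking a good first guess, deciding when to stop, and doing the division cheaply - which is what made it a real programming problem.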
I was already programming before I went to university, and got hired out of it before I finished my degree. I've been working for over 50 years, and I'm still developing systems. I'm in the middle of one right now. My lack of math skills has never stopped me, and my work has included defence systems, operating system maintenance and extension, DLL libraries for various functions, international programming language standards (CODASYL, ISO), and almost every other kind of application you could imagine for everything from mainframes to Windows.
Making a computer dance to your tune doesn't take math -- it takes knowledge of your work, knowledge of the goal, the ability to decompose a problem into its component parts, and the ability to apply software (or database, or other things) to its solution. There's only a very small piece of the programming world that really needs math, and quite often you'll likely find that the mathematician's job is to break the problem into tiny pieces so a real programmer can craft the solution without needing to know the math involved.
|
|
|
|
|
I learned calculus, both differential and integral, as well as linear algebra, Boolean algebra, and set theory. Of the four, Boolean algebra and set theory have been applicable in my nearly 40 years as a software engineer in a business environment.
Some math is needed to help you learn to think critically, but what's more important for software engineers is learning how to separate implementation from specifications, how and when to hide complexity, how computers work at the hardware level, garbage collection techniques along with their benefits and weaknesses, and how to evaluate algorithms for complexity, speed (time), and space (memory) tradeoffs.
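The time/space tradeoff mentioned above can be shown in a few lines. A minimal sketch (my own example, using the standard library's `timeit`): a set spends extra memory on a hash table to answer membership queries in roughly O(1), while a list scan is O(n).

```python
# Time/space tradeoff: set membership (~O(1), extra memory for the
# hash table) vs. scanning a list (~O(n), no extra structure).
import timeit

data = list(range(100_000))
as_set = set(data)  # extra memory buys fast lookups

# Query the worst-case element for the list: the last one.
list_time = timeit.timeit(lambda: 99_999 in data, number=100)
set_time = timeit.timeit(lambda: 99_999 in as_set, number=100)
print(f"list scan: {list_time:.4f}s, set lookup: {set_time:.6f}s")
```

Evaluating exactly this kind of tradeoff - and knowing when the input is small enough that the "slow" option is fine - is the skill the post is describing.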
|
|
|
|
|
This question does concern me. I became a developer/programmer at the age of around 48. I got a degree in Web Development, but also had a few Java classes. I began my job working with Apex in Salesforce, and with JavaScript, Angular, plus Java/Spring Boot. I am definitely not a great programmer, but I would say I am a decent programmer.
Before I got my Web Dev degree, I took Calculus twice (at around age 46) and withdrew both times rather than finishing with a failing grade. When I was in my 20s getting my first degree (non-computer related), I was Magna Cum Laude. I'm convinced that if I had taken Calculus back then I would have passed it. The Calculus is the reason I went for a Web Dev degree at one school vs a Computer Science degree at another. Now, my job gives us free tuition at the latter school, but I can't take any of the CS courses there until I pass Calculus.
I program without knowing a lot about algorithms, and I think I would probably know more about them if I had a stronger Math background. Whether I would be a better programmer in general is hard to say. I'm just grateful I was able to get into the industry at all at such a late age, and I do have concerns about the future. I think my job is secure, and I hope to retire here, but you never know what life will throw at you.
So I do wish I had succeeded with Calculus, and I hope to at some point. It may or may not help, but it will absolutely never hurt.
|
|
|
|
|
I failed calculus the first time I took it.
It was a combination of having had too easy a time in high school, which left me without good study habits, and a visiting professor who was so unintelligible that the college replaced him after a few weeks.
Higher math is needed by computer programmers when they are going to convert a client's higher-math idea into code. In my 30-plus years of coding I've never needed to code anything beyond algebra, and that was in cooperation with a math major / computer science minor who did the math while I did everything else.
The reality is, even though I took a number of higher-level math courses in order to get my degree in computer science, I only remember one bit - where I saw how to convert certain math functions into code - that would possibly come in handy.
I'd still recommend higher-level math for anyone working with graphics or engineering, but that's more to give you a sanity check when looking at the results of testing, to ensure the libraries you are referencing are working correctly and you are accessing them correctly.
For 99%+ of coders out there, you won't need much beyond the math used by regular finance people.
I honestly don't know what "maths" the quants use for the stock market, but it might be more exotic, in line with matrix theory or chi-squared tests or the like.
Richard
|
|
|
|
|
A lot depends on the particular programming field the student is considering. When I was actively programming, I relied heavily on advanced maths, including calculus and beyond. But for most applications, a student only needs to be able to clearly define problems and formulate solutions. In most cases clear, logical thinking is more valuable than advanced math.
Will Rogers never met me.
|
|
|
|
|
They have to provide a wide range of topics since they have no clue what kind of programming you're going to end up doing.
I started college as a math major but ended up switching to electrical engineering my junior year. My first job out of college got me into embedded programming (1978, assembly language days). I still did some hardware engineering so math was needed for things like circuit analysis but on the computer side I didn't really need high level math. In the second half of my career I was learning/developing DSP algorithms and math was fairly important there. I really enjoyed DSP programming.
|
|
|
|
|
Considering that the large majority of people programming today don't deal with infinitesimals, differential equations, video/audio compression (or encryption), floating point matrix operations...all of it the stuff of simulators and videogames...it doesn't surprise me.
Maybe you could get away with building a kernel and a compiler if you knew just algebra and had a hankering for Chomsky.
|
|
|
|
|
Even without the actual math, there are definitely common concepts that apply, so I would say if you have the mindset that can handle calculus, you might find yourself having an easier time than someone who's never done any of it.
YMMV and it depends on your career path.
|
|
|
|
|
Greetings,
When I started out, I was going to be an engineer (Civil or EE - wasn't sure). Took Fortran (yes, that long ago) as my first programming language. 4th-semester Calculus was my downfall, with 49% as my final grade. I then switched to MIS and finished my degree in that. Additional math courses were non-existent except for Statistics; I managed a 91% average in those 3 courses. The Calculus courses were interesting, and I used them at one client's location. As a rule, however, they weren't needed. Critical thinking? Yes! Logic? Yes! Calculus? Not really...
Conceptualize multi-dimensional array? Yes! Slide Rule? Fun but not really needed...
Math does help but Calculus doesn't seem to be necessary,
Cegarman
document code? If it's not intuitive, you're in the wrong field
Welcome to my Chaos and Confusion!
|
|
|
|
|
cegarman wrote: Math does help but Calculus doesn't seem to be necessary,
Right. My main point was that if you have the mindset that can cope with Calculus, you might have an easier time coming up with new ideas than someone who hasn't been exposed to Calculus at all.
That being said, I've been coding professionally for almost 30 years now, and in all that time I don't think I've ever written any math code that amounts to anything more complex than calculating an average.
|
|
|
|
|
|
Allow me to quote Fred Brooks, in The Mythical Man-Month (1975).
He wrote: ... when schedule slippage is recognized, the natural (and traditional) response is to add manpower. Like dousing a fire with gasoline, this makes matters worse, much worse. More fire requires more gasoline, and thus begins a regenerative cycle which ends in disaster.
If you haven't already, get yourself a copy. Even 50 years later, that book is full of relevant stuff.
Also, remember that impregnating nine women will not produce a baby in one month. (Software is not quite the same as babies, but sometimes I wonder...)
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
As I recall, the discussion to which you refer is regarding projects in full progress and significantly advanced. The articles to which I refer are projects which are fully designed though yet to be implemented. Assuming the design of a software project is complete, well documented, and accepted, it is not obvious to me why 100,000 programmers cannot be hired to implement one each of the 100,000 methods of the project, and bingo presto voila, the project is complete in no time at all and ready for integration test.
"We can't solve today's problems with the mentality that created them." Albert Einstein,
|
|
|
|
|
Let's say that is, in theory, possible.
The amount of time it would take to design and plan ahead would probably make it more efficient to hire fewer programmers and do less planning.
A railroad or building will have many knowns (because every railroad and building is essentially the same, if you're not doing crazy stuff) while software has a whole lot of unknowns.
So every method, method name, number and order of arguments, probably class names, etc. will have to be thought out in advance.
That's basically saying a few people will write the software up front and then we'll hire 100,000 programmers to fill in a few of the details.
So then, let's say you got to the point where 100,000 programmers can write 100,000 methods simultaneously (after probably years of planning and designing).
Those methods are probably going to call one another, but that's not possible because those methods are still being written!
But let's say you get around that by using a dynamic language like JavaScript or Python.
Now you can't test them, because each of the 100,000 programmers will have to get the spelling, casing, order of arguments, etc. correct right from the get-go, without being able to see, verify, or test whether the other programmers have, indeed, followed the design to the letter (here you can see why designing to that level would be ridiculous).
But let's say these programmers even got that right and they're going to commit their code...
Imagine the merge conflicts!
Unless, of course, each method has a separate file, but then that would have to be designed too, and those files would somehow need to come together.
All in all, it's not very practical and would take so much time and effort and be so error prone that it's, for all practical reasons, not possible.
|
|
|
|
|
Sander Rossel wrote: order of arguments
Like one of my pet peeves about .net:
public ArgumentException (string? message, string? paramName);
public ArgumentNullException (string? paramName, string? message);
public ArgumentOutOfRangeException (string? paramName, string? message);
In my opinion, ArgumentException has the parameters in the correct order.
|
|
|
|
|
It looked strange to me at first, but I think I get it.
With the ArgumentException it's not clear what is wrong with the argument, so you must specify a message for further details.
In that case the paramName is secondary, because it could also be hard-coded in the message, or the error could relate to multiple parameters.
With the Null and OutOfRange exceptions the error is clear even without a message and a default message would suffice, so you'd rarely specify a message.
What's more important is what parameter the error refers to.
Hence the "switch" in parameter order.
|
|
|
|
|
Okay, so let's say that there is an average of 100 lines of code per method. So 100,000 methods means the project has 10,000,000 lines of code in it. At the accepted rate of 1 defect per 1,000 lines of code, one would expect roughly 10,000 defects to be present in the code base. That is one hell of an integration test to complete.
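The arithmetic above checks out; as a back-of-the-envelope sketch (both rates are the post's stated assumptions, not measured data):

```python
# Back-of-the-envelope check of the defect estimate above.
methods = 100_000
lines_per_method = 100    # assumed average from the post
defects_per_kloc = 1      # the commonly cited ~1 defect / 1,000 LOC

total_loc = methods * lines_per_method
expected_defects = total_loc * defects_per_kloc // 1000
print(total_loc, expected_defects)  # 10000000 10000
```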
Physical projects - building things - have safety cushions built into the engineering plans. Software does not have the same cushion to fall back on. You can make a girder with 10% additional material to make up for defects in the material or stress estimates. You cannot add 10% additional code to a software module to make up for defects in the code or algorithm. You only add more defects.
It seems that software development leans towards smaller, highly integrated teams with smaller goals and continuous development, integration and testing. (IMHO)
|
|
|
|
|