I am doing some interim testing on a personal project to check its accuracy. It has dozens of slowly converging polynomials and lots of separate complex functions to be integrated into the final output. I am expecting errors, since most of the polynomials are approximations: they get closer and closer to the truth without ever getting there.
My accuracy requirement is basically an error of less than 1 part in 100,000, which doesn't sound like much, but for some of the polynomials that works out to more like 1 part in 4 billion.
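In spirit, the interim accuracy test looks something like the sketch below (a minimal illustration only: the series, the reference function and the sample range are stand-ins I picked for the example, not the real project code).

```python
import math

TOL = 1e-5  # the target: errors smaller than 1 part in 100,000

def exp_series(x, terms):
    """Stand-in for one slowly converging polynomial: a truncated
    Maclaurin series for e**x (sum of x**n / n! for n < terms)."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total

# Compare the approximation against a trusted reference across the working
# range and record the worst relative error seen.
worst = 0.0
for i in range(1001):
    x = -1.0 + 2.0 * i / 1000            # sample the interval [-1, 1]
    approx = exp_series(x, terms=8)
    exact = math.exp(x)
    rel_err = abs(approx - exact) / abs(exact)
    worst = max(worst, rel_err)

print(f"worst relative error: {worst:.3e} (target < {TOL:.0e})")
print("PASS" if worst < TOL else "FAIL")
```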
One of the critical components is producing figures well within my requirements - but the errors, while very small, are all positive. One quick trig function and I could improve the accuracy by an order of magnitude. That trig function, of course, would be completely made up, with no basis in reality and no possible justification.
It would just make it all work so prettily!
Should I? No - it would be completely out of order.
Yes, but it's just one very small function. Not a chance, you fraudster!
Hah! The traditional problem that I should be old enough and wise enough to be sceptical about!
Just because one of my reference works declared something to be a constant, that does not mean I should believe them. It may have been a constant when people were using an abacus to add with, but it ain't now. Looks like I have to do some research and then write yet another polynomial.
The first and second sets of errors, moved to an understandable range and rounded:-
I took the so-called 'constant' and adjusted it by bracketing down to new values, and have learned that although the 'constant' changes very slowly, it does change. I tweaked the digit in the eighth place.
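Roughly what I mean by bracketing, as a toy sketch (the model, the reference and every number here are made up for illustration; only the halving-the-interval idea is the point):

```python
TRUE_C = 1.2345678901                 # pretend value the search should recover

def model(x, c):
    return c * x                      # placeholder for the real calculation

def reference(x):
    return TRUE_C * x                 # placeholder for the trusted reference figures

def mean_signed_error(c, samples):
    """Average signed error produced by a trial value of the 'constant' c."""
    return sum(model(x, c) - reference(x) for x in samples) / len(samples)

def bracket_constant(c_lo, c_hi, samples, steps=40):
    """Bisection-style bracketing: repeatedly halve [c_lo, c_hi], keeping the
    half in which the signed error changes sign, until the interval is tiny."""
    e_lo = mean_signed_error(c_lo, samples)
    for _ in range(steps):
        c_mid = 0.5 * (c_lo + c_hi)
        e_mid = mean_signed_error(c_mid, samples)
        if (e_lo < 0) == (e_mid < 0): # same sign: the crossing lies above c_mid
            c_lo, e_lo = c_mid, e_mid
        else:
            c_hi = c_mid
    return 0.5 * (c_lo + c_hi)

samples = [0.1 * i for i in range(1, 101)]
print(f"bracketed value: {bracket_constant(1.2, 1.3, samples):.10f}")
```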
I was in your shoes actually, although I had played around with it some when I was young.
At Microsoft I did not work on compiler stuff. I wish I had; I was hoping for a spot on the compiler team - even testing - over at MSR (back when Herb Sutter was there), to the point where I punted a separate position at MSR to hold out for it (I didn't get it, but I'm glad I tried).
I taught myself. I can teach you.
It's a pain in the backside until you "get it" - then it's still a pain but you can at least navigate.
I can break down LL(1) parsing for you pretty easily though.
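Here's the general shape as a toy sketch - recursive descent with one token of lookahead, which is the usual way to hand-write an LL(1) grammar. The grammar, the tokeniser and the choice of Python are all just for illustration; a real front end would build a syntax tree rather than evaluating on the fly.

```python
import re

# Toy LL(1) grammar (illustrative only):
#   E  -> T E'        E' -> '+' T E' | ε
#   T  -> F T'        T' -> '*' F T' | ε
#   F  -> '(' E ')' | NUM

def tokenize(text):
    return re.findall(r"\d+|[+*()]", text) + ["$"]   # "$" marks end of input

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):                    # the single token of lookahead
        return self.tokens[self.pos]

    def eat(self, expected):
        if self.peek() != expected:
            raise SyntaxError(f"expected {expected!r}, got {self.peek()!r}")
        self.pos += 1

    def parse_E(self):                 # E -> T E'
        value = self.parse_T()
        while self.peek() == "+":      # E' -> '+' T E' | ε
            self.eat("+")
            value += self.parse_T()
        return value

    def parse_T(self):                 # T -> F T'
        value = self.parse_F()
        while self.peek() == "*":      # T' -> '*' F T' | ε
            self.eat("*")
            value *= self.parse_F()
        return value

    def parse_F(self):                 # F -> '(' E ')' | NUM
        if self.peek() == "(":
            self.eat("(")
            value = self.parse_E()
            self.eat(")")
            return value
        if self.peek().isdigit():
            value = int(self.peek())
            self.pos += 1
            return value
        raise SyntaxError(f"unexpected token {self.peek()!r}")

print(Parser(tokenize("2+3*(4+1)")).parse_E())        # prints 17
```

The single token of lookahead is the whole LL(1) trick: at every decision point, peek() alone tells you which production to take, so there is never any backtracking.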
When I was growin' up, I was the smartest kid I knew. Maybe that was just because I didn't know that many kids. All I know is now I feel the opposite.