|
Well if you add the half an hour for lunch and an extra half hour on Sunday, that calculation sort of works.
I may not last forever but the mess I leave behind certainly will.
|
|
|
|
|
sure, and if you just miss off a couple of decimal places, PI = 3.0!
PooperPig - Coming Soon
|
|
|
|
|
If you divide 80 hours over 7 days, it is roughly 11.4 hours a day. Factor in a half hour meal break and it works out to about a 12 hour shift.
|
|
|
|
|
I think you missed the word "basically" somewhere. 4 hours away from 80 (5%) is in my personal realm of "basically." What's yours?
|
|
|
|
|
basically != approximately
PooperPig - Coming Soon
|
|
|
|
|
I hope she means that the 80+ hour work week is a complete joke. Her execution was as poor as the title.
|
|
|
|
|
Always makes me laugh: a "businessman" who wonders if he has to pay overtime.
Must be an easy trade, this "business" thingy, judging by the amount of general knowledge of those in the field.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
I have put in insane levels of hours in my youth. I am talking 28hrs of straight coding, amped up on cola, chips and sweets. Followed by 6hrs of sleep and an 18hr day, 8hrs sleep, 12hr days for the rest of the week, and slept all weekend. Made the deadline! (we still did not get the contract, LMAO).
But I learned an important lesson: Can vs. Should.
As a manager, I do NOT want my team working overtime! Occasionally, it happens. I allow them to flex the time so that I ACCRUE NO net benefit.
In the end, we all need to be Happy AND Productive. Not just Happy we ARE productive!
|
|
|
|
|
About a year ago, the (now ex-) head of our IT department said, during a staff meeting, "everybody raise your hand who's working 40 hours per week." Then 45, 50, etc., until finally he was the only one with his hand raised.
I refused to play along, but the clear, unstated, implication was that "you guys aren't working as hard as me and you should be."
The next time I talked to him, I mentioned that if one has to work 80, 90 hour weeks on a consistent basis, they're doing it wrong. Period, full stop, end of statement.
Either they don't understand the problems they're being asked to solve, or they're not giving realistic expectations, or management has put someone in a position to fail (oh, hey Bob, we told them you'd deliver this thing with which you have no experience by next week).
It didn't really sink in with him; he's one of those people who just really enjoys the stress and being seen as "the man".
It does seem to be a little less stressful around here now that he's left.
(P.S. I don't want to paint him in a bad light; he is a pretty good guy, it's just that he didn't get that job < life. I sincerely hope he's doing o.k. now.)
|
|
|
|
|
I have done these kinds of hours for months on end, though not in the IT world. It was at a corrugated fiberboard container plant where I was a press operator. If we were down an operator, as we were for most of the 3-4 years I held that position, the two remaining had to pull 12 hour shifts. Most of the time, the plant ran 13 days at a time, only giving 2 days a month off due to union rules. After a while, you get used to it. The plus side is that you are so busy making money, you don't have time to spend it!
In my earlier years as a programmer, when everything was magical, and before I had a wife, it wasn't uncommon for me to lose track of time and code until 1 or 2 in the morning. These days, it's rare that I code anything past quitting time, though I do work every Saturday.
"Go forth into the source" - Neal Morse
|
|
|
|
|
Kent Sharkey wrote: Really?
I worked for a company where I typically put in slightly more than 60 hours a week and others put in many more than that. They paid hourly. And at least the people I knew were actually producing at that rate.
Conversely, I worked on salary at a company that for years claimed the 'next' release would be easy in terms of scheduling. Never happened. So I started boosting my estimates significantly and working far fewer hours. It was the only way I saw to demonstrate that I was meeting goals while still working at a reasonable pace.
|
|
|
|
|
I suppose, chaining them to their desks didn't work out as expected...
|
|
|
|
|
Reminded me of this scene from Pirates of Silicon Valley. Jobs basically describes how he created a cult in which working a mere 60-80 hours was seen as a lack of loyalty.
"90 hours a week and loving it... some of them work more than 90 hours a week..."
Of course, to heck with that: I hope dude's company gets some sense or goes bankrupt.
Nick P
|
|
|
|
|
40 hours - programming
45 hours - extend programming and extend coffee
50 hours - extend programming and extend coffee and EXTEND BUGS
60 hours - extend coffee and EXTEND BUGS
80 hours - Bugs in the program, bugs in the sex life, bugs instead of a family
|
|
|
|
|
But a math question.
I just read that a selection sort takes O(n*n) (or O(n^2)) steps (verified multiple sources).
So when I take an example of a list with 3 elements I'm going through all 3 elements, take one out, go through the 2 that are left, take one out and look at the last item. So that's 3 + 2 + 1 steps, which equals 6 (and I could actually skip that last one, because I know it's in place).
If I apply O(n^2) I get 3^2, or 3*3, which equals 9 steps!
Is there something about O that the article is not telling me?
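To make my counting concrete, here's a little throwaway sketch (my code, not the book's) that tallies how many elements a simple selection sort looks at:

```python
def selection_sort_visits(items):
    """Selection sort that counts how many elements it looks at."""
    a = list(items)
    visits = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i, len(a)):
            visits += 1  # each element of the unsorted part gets visited
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, visits

print(selection_sort_visits([2, 3, 1]))  # ([1, 2, 3], 6)
```

For 3 elements it reports 6 visits (3 + 2 + 1), which is where my count above comes from, and clearly not the 9 that 3^2 suggests.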
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
|
|
|
|
|
Well you don't get 3^2, you get that:
∃c, n0 such that f(n) <= c*n^2 for all n >= n0
What you get here, if you count only the comparisons, is the familiar (n - 1) + (n - 2) + ... + 2 + 1 = n(n - 1)/2, which is in O(n^2) because there are a c and n0 for which etc. etc.; for example c = 1, n0 = 1, with no annoying search for those parameters required, because "less than half the square" is of course less than the square.
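If the symbol-pushing feels less than obvious, a brute-force check of that witness (c = 1, n0 = 1) works as a sanity test; a throwaway sketch:

```python
# Brute-force sanity check of the big-O witness c = 1, n0 = 1:
# n(n - 1)/2 <= c * n^2 should hold for every n >= n0.
# (Checked up to 10000 here; the algebra covers the rest.)
for n in range(1, 10001):
    assert n * (n - 1) // 2 <= 1 * n * n
print("witness c = 1, n0 = 1 holds up to n = 10000")
```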
|
|
|
|
|
Isn't that the chemical structure of caffeine?
|
|
|
|
|
harold aptroot wrote: "you get that..." I got nothing.
harold aptroot wrote: "...is the familiar..." No familiar of mine!
harold aptroot wrote: "of course" Right...
Let's just keep it at "there is something about O that the article is not telling me"...
I probably should've mentioned I'm not very good at maths (which is an understatement) and this book I was reading was basically an algorithm book for n00bs
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
|
|
|
|
|
Ok then, no math this time. That's hard because the big O is fundamentally about sets of functions, but I'll try.
What it means when f is in the big O of g, is that f does not grow faster than g. There is a precise definition, but it's not used that often. The point is more that it's about a growth, not really about specific values, and that it's "not more than" something, it can be less, even by a non-constant factor.
|
|
|
|
|
Oh yeah, it's about growth... But now I'm lost with the n^2 again.
If I have 3 elements in my list O(3^2) = 9, and with 9 elements I have 81... So what do 9 and 81 tell me about growth? That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?
But that's not the case either, because sorting 3 elements takes 6 steps and sorting 9 elements takes 45 steps, and 45/6 = 7.5, not 9...
I guess I'll dive into some mathy stuff... I just can't stand math. It's the only thing that just doesn't get into my head. And just when I think it does the next math problem proves me wrong!
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
|
|
|
|
|
You're still looking at it as a function that means anything for finite values. And this:
Sander Rossel wrote: "O(3^2) = 9" does not work at all. Big O gives sets of functions, and here we have O(3^2) = O(9) (that part is true), but then following the definition of O, that will contain all constants. 1 is in O(9), and 1000000000 is also in O(9). O(9) is the same set as O(1). So filling in some constants tells you exactly nothing; all information has been deleted. Filling in some finite n doesn't mean anything. O(n^2) is a set of functions that includes n^2, 0.01n^2 - 1000, 1000n^2 + 1000n, and an infinite number of other functions that all grow quadratically in their limit. You simply cannot take O(n^2) and fill in an n; you don't know which function you have!
If you really must use exact values, there is something that does work: plot the points you get and look at the curve they form. It will look like the right half of a parabola.
Sander Rossel wrote: "That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?" Close enough. It tells you that, in the limit, sorting a list with 3n elements will take 9 times as long as sorting a list with n elements.
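You can watch that ratio settle toward 9 numerically. A quick sketch (my own toy code), reusing the n(n - 1)/2 comparison count from earlier in the thread:

```python
def comparisons(n):
    # comparisons a simple selection sort makes on n elements: n(n - 1)/2
    return n * (n - 1) // 2

# Compare sorting 3n elements against sorting n elements.
for n in (3, 30, 300, 3000):
    ratio = comparisons(3 * n) / comparisons(n)
    print(f"n = {n:5d}: ratio = {ratio:.3f}")
```

For n = 3 the ratio is still 12, but it closes in on 9 as n grows; that's the "in the limit" part.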
|
|
|
|
|
harold aptroot wrote: [incomprehensible gibberish...] I appreciate the time and effort you put into explaining this to me. I'm clearly missing some knowledge to make anything out of this.
I'm just going to sit in a corner and cry hoping that math will go away...
Or I'm going to take some math courses (actually I still have a few if I want to finish my Open Universiteit Informatica Bachelor/Master) and understand the hell out of this big-O thing.
So come find me in a corner if you need me
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
|
|
|
|
|
Ok, then how about this: understand it without understanding it. That's what most people do with it anyway. Forget the exact definition etc. Just do this:
We had n(n-1)/2 (conjured up magically, because summing series is for mathematicians). That looks vaguely like a square (there's a thing with n in it multiplied by another thing with n in it), so it's probably going to be in O(n^2).
Or: there were two nested loops, each of which runs about n times. Looks vaguely like n*n, probably going to be in O(n^2).
Want to know "which big O is better"? Just look it up in this small list:
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n)
That's how non-mathematicians usually use big O.
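The ordering is easy to see empirically too; plugging one largish n into each of those functions already lines them up (my own sketch, nothing special about the chosen n):

```python
import math

# The usual growth classes, in the order of the list above.
growth = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log(n)),
    ("sqrt n",  lambda n: math.sqrt(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log(n)),
    ("n^2",     lambda n: n ** 2),
    ("n^3",     lambda n: n ** 3),
    ("2^n",     lambda n: 2 ** n),
]

n = 64
values = [f(n) for _, f in growth]
# At n = 64 the values already come out strictly increasing,
# matching the "which big O is better" list.
assert values == sorted(values)
for name, f in growth:
    print(f"{name:8s} {f(n):,.1f}")
```

For small n the order can differ (that's the n0 business again), but from some point on it sticks.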
|
|
|
|
|
harold aptroot wrote: "understand it without understanding it" Sure, I was going to do that anyway, but then as a bonus I wanted to understand it too
harold aptroot wrote: Want to know "which big O is better"? Just look it up in this small list:
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
|
|
|
|
|
Sander Rossel wrote: "take one out"
Because a selection sort doesn't "take one out" -- it doesn't remove any elements from the original list.
So if you have n elements, you have to search n elements n times to make sure you've found the "next" element in the sort.
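A minimal in-place version (my sketch, not anyone's production code) makes that visible: the list is only rearranged, never shrunk, and the inner search runs up to n times over up to n elements, hence the n*n bound:

```python
def selection_sort(a):
    """In-place selection sort: nothing is taken out of the list,
    the sorted prefix just grows from the left."""
    n = len(a)
    for i in range(n):                 # n passes...
        smallest = i
        for j in range(i + 1, n):      # ...each searching the unsorted tail
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a

print(selection_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```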
Marc
|
|
|
|