

About a year ago, the (now ex) head of our IT department said, during a staff meeting, "Everybody raise your hand who's working 40 hours per week." Then 45, 50, and so on, until finally he was the only one with his hand raised.
I refused to play along, but the clear but unstated implication was "you guys aren't working as hard as me, and you should be."
The next time I talked to him, I mentioned that if one has to work 80- or 90-hour weeks on a consistent basis, they're doing it wrong. Period, full stop, end of statement.
Either they don't understand the problems they're being asked to solve, or they're not setting realistic expectations, or management has put someone in a position to fail ("oh, hey Bob, we told them you'd deliver this thing, with which you have no experience, by next week").
It didn't really sink in with him; he's one of those people who just really enjoys the stress and being seen as "the man".
It does seem to be a little less stressful around here now that he's left.
(P.S. I don't want to paint him in a bad light; he's a pretty good guy, it's just that he didn't get that job < life. I sincerely hope he's doing OK now.)





I have done those kinds of hours for months on end, though not in the IT world. It was at a corrugated fiberboard container plant where I was a press operator. If we were down an operator, as we were for most of the 34 years I held that position, the two remaining had to pull 12-hour shifts. Most of the time, the plant ran 13 days at a stretch, giving only 2 days a month off due to union rules. After a while, you get used to it. The plus side is that you are so busy making money, you don't have time to spend it!
In my earlier years as a programmer, when everything was magical, and before I had a wife, it wasn't uncommon for me to lose track of time and code until 1 or 2 in the morning. These days it's rare that I code anything past quitting time, though I do work every Saturday.
"Go forth into the source"  Neal Morse





Kent Sharkey wrote: Really?
I worked for a company where I typically put in slightly more than 60 hours a week and others put in many more than that. They paid hourly. And at least the people I knew were actually producing at that rate.
Conversely, I worked at a salaried company that for years claimed the "next" release would be easy in terms of scheduling. Never happened. So I started boosting my estimates significantly and working far fewer hours. It was the only way I saw to demonstrate that I was meeting goals while still working at a reasonable pace.





I suppose chaining them to their desks didn't work out as expected...





Reminded me of this scene from Pirates of Silicon Valley. Jobs basically describes how he created a cult in which working only 60-80 hours a week was seen as a lack of loyalty.
"90 hours a week and loving it... some of them work more than 90 hours a week..."
Of course, to heck with that: I hope the dude's company gets some sense or goes bankrupt.
Nick P





40 hours → programming
45 hours → extended programming and extended coffee
50 hours → extended programming, extended coffee, and EXTENDED BUGS
60 hours → extended coffee and EXTENDED BUGS
80 hours → bugs in the program, bugs in the sex, bugs instead of the family





Now for a math question.
I just read that a selection sort takes O(n*n) (or O(n^2)) steps (verified multiple sources).
So when I take an example of a list with 3 elements I'm going through all 3 elements, take one out, go through the 2 that are left, take one out and look at the last item. So that's 3 + 2 + 1 steps, which equals 6 (and I could actually skip that last one, because I know it's in place).
If I apply O(n^2) I get 3^2, or 3*3, which equals 9 steps!
Is there something about O that the article is not telling me?
My blog[ ^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
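The step counting described above can be sketched in code. This is a hypothetical Python helper, not something from the book being discussed; it counts every element visit a simple selection sort makes, exactly the 3 + 2 + 1 tally from the question.

```python
def selection_sort_steps(items):
    """Selection sort that also counts element visits,
    matching the 3 + 2 + 1 counting in the post above."""
    items = list(items)
    visits = 0
    for i in range(len(items)):
        low = i
        for j in range(i, len(items)):  # scan the unsorted tail, including position i
            visits += 1
            if items[j] < items[low]:
                low = j
        items[i], items[low] = items[low], items[i]
    return items, visits

print(selection_sort_steps([2, 3, 1]))  # ([1, 2, 3], 6): 6 visits, not n^2 = 9
```

So the concrete count really is 6, not 9; the answers below explain why big O is happy with that anyway.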





Well, you don't get 3^2; you get this:
there exist c and n_0 such that f(n) <= c*n^2 for all n >= n_0.
What you get here, if you count only the comparisons, is the familiar (n - 1) + (n - 2) + ... + 2 + 1 = n(n - 1)/2, which is in O(n^2) because there are a c and n_0 for which the above holds, for example c = 1 and n_0 = 1, not requiring any annoying search for those parameters because "less than half the square" is of course less than the square.
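A quick numerical check of that identity and that bound, in plain Python (just illustrating the formula above, with the c = 1, n_0 = 1 witness):

```python
def comparisons(n):
    # (n - 1) + (n - 2) + ... + 2 + 1, the comparison count from the post above
    return sum(range(1, n))

# It equals n(n - 1)/2 and is bounded by c*n^2 with c = 1 for every n >= 1:
for n in range(1, 50):
    assert comparisons(n) == n * (n - 1) // 2
    assert comparisons(n) <= 1 * n * n
```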





Isn't that the chemical structure of caffeine?





harold aptroot wrote: "you get that..." I got nothing.
harold aptroot wrote: "is the familiar..." No familiar of mine!
harold aptroot wrote: "of course" Right...
Let's just keep it at "there is something about O that the article is not telling me"...
I probably should've mentioned that I'm not very good at maths (which is an understatement) and that the book I was reading was basically an algorithm book for n00bs.





Ok then, no math this time. That's hard, because big O is fundamentally about sets of functions, but I'll try.
What it means when f is in the big O of g is that f does not grow faster than g. There is a precise definition, but it's not used that often. The point is that it's about growth, not really about specific values, and that it's "not more than" something; it can be less, even by a nonconstant factor.





Oh yeah, it's about growth... But now I'm lost with the n^2 again.
If I have 3 elements in my list, O(3^2) = 9, and with 9 elements I have 81... So what do 9 and 81 tell me about growth? That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?
But that's not the case either, because sorting 3 elements takes 6 steps and sorting 9 elements takes 45 steps, and 45/6 = 7.5, not 9...
I guess I'll dive into some mathy stuff... I just can't stand math. It's the only thing that just doesn't get into my head. And just when I think it does the next math problem proves me wrong!





You're still looking at it as a function that means anything for finite values. And this:
Sander Rossel wrote: "O(3^2) = 9" does not work at all. Big O gives sets of functions, and here we have O(3^2) = O(9) (that part is true), but then, following the definition of O, that set contains all constants. 1 is in O(9), and 1000000000 is also in O(9); O(9) is the same set as O(1). So filling in a constant tells you exactly nothing; all the information has been deleted. Filling in some finite n doesn't mean anything. O(n^2) is a set of functions that includes n^2, 0.01n^2 - 1000, 1000n^2 + 1000n, and an infinite number of other functions that all grow quadratically in the limit. You simply cannot take O(n^2) and fill in an n; you don't know which function you have!
If you really must use exact values, there is something that does work: plot the points you get and look at the curve they form. It will look like the right half of a parabola.
Sander Rossel wrote: "That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?" Close enough. It tells you that, in the limit, sorting a list with 3n elements will take 9 times as long as sorting a list with n elements.
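That "in the limit" claim can be eyeballed numerically. A sketch (not from the thread) assuming the comparison count n(n - 1)/2 derived earlier:

```python
def steps(n):
    return n * (n - 1) // 2  # selection sort comparison count from the thread

# The ratio steps(3n) / steps(n) approaches 9 as n grows:
for n in (3, 30, 300, 3000):
    print(n, steps(3 * n) / steps(n))  # 12.0, then about 9.21, 9.02, 9.002
```

For tiny lists the ratio is way off (12 at n = 3), which is exactly why filling in small concrete values is misleading; the 9 only emerges as n grows.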





harold aptroot wrote: "[incomprehensible gibberish...]" I appreciate the time and effort you put into explaining this to me. I'm clearly missing some knowledge to make anything of it.
I'm just going to sit in a corner and cry, hoping that math will go away...
Or I'm going to take some math courses (actually I still have a few left if I want to finish my Open Universiteit Informatica Bachelor/Master) and understand the hell out of this big-O thing.
So come find me in a corner if you need me





Ok, then how about this: understand it without understanding it. That's what most people do with it anyway. Forget the exact definition and just do this:
We had n(n - 1)/2 (conjured up magically, because summing series is for mathematicians). That looks vaguely like a square (there's a thing with n in it multiplied by another thing with n in it), so it's probably going to be in O(n^2).
Or: there were two nested loops, each of which runs about n times. That looks vaguely like n*n, so it's probably going to be in O(n^2).
Want to know "which big O is better"? Just look it up in this small list:
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n)
That's how non-mathematicians usually use big O.
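The ordering in that list can be sanity-checked by evaluating each function at one largish n; a toy check, not a proof, since the real statement is about limits:

```python
import math

# Evaluate each growth rate from the list above at n = 1000;
# the values already come out in the listed order.
n = 1000
growth = [1, math.log(n), math.sqrt(n), n, n * math.log(n), n ** 2, n ** 3, 2 ** n]
assert growth == sorted(growth)
```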





harold aptroot wrote: "understand it without understanding it" Sure, I was going to do that anyway, but then as a bonus I wanted to understand it too
harold aptroot wrote: "Want to know 'which big O is better'? Just look it up in this small list:"
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)





Sander Rossel wrote: take one out
Because a selection sort doesn't "take one out"; it doesn't remove any elements from the original list.
So if you have n elements, you have to search n elements n times to make sure you've found the "next" element in the sort.
Marc





That's the conclusion I came to as well, but what I understand from various sources is that we go through n iterations, where the lowest value found in that iteration is swapped with the element at the current position, and the next iteration starts at the next element. So when we have {2, 3, 1} we swap 1 and 2 in the first iteration, start at 3 in the second iteration (and swap it with 2), and start at 3 in the last iteration (do nothing) and we're done.
And that makes sense, because why would you want to check elements that you know are sorted?
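That walkthrough of {2, 3, 1} matches a textbook selection sort. A minimal Python sketch (nothing here is from the book being discussed):

```python
def selection_sort(items):
    """In-place selection sort: swap the smallest remaining value
    into the current position, then move one position right."""
    for i in range(len(items) - 1):  # the last element ends up in place for free
        low = i
        for j in range(i + 1, len(items)):
            if items[j] < items[low]:
                low = j
        if low != i:
            items[i], items[low] = items[low], items[i]
    return items

print(selection_sort([2, 3, 1]))  # [1, 2, 3]: swaps 1<->2, then 3<->2
```

Note the outer loop stops one short, which is the "skip the last one because it's already in place" observation from the original question.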





So the actual number of steps is n(n - 1)/2 (the sum of all integers from 1 to n - 1).
Since Big-O notation only cares about the term with the largest growth rate, that becomes O(n^2).
http://en.wikipedia.org/wiki/Selection_sort#Analysis[^]
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
 Homer





Oh yeah, it's about growth... So now maybe I get the big O, but I still don't know how to interpret the n^2.
I guess I'll dive into some mathy stuff.
Really, math is the only thing where I think I get it until I get to the next problem and find I didn't get any of it... And math just keeps coming back, too.





I read a story about O once  but I don't remember there being much maths in it





Fluent XML Parsing using C#'s Dynamic Object[^]
Now, granted, it's probably dog slow compared to other approaches, but for simple stuff where performance isn't critical and the XML structure is known, it's a good addition to the tool chest of useful stuff!
Marc





Yeah, poor performance. Also, it might work for reading, but probably not for adding.
As long as you already have to know the structure/schema of the XML you might as well write specific classes to do that faster and safer.





PIEBALDconsult wrote: Also, it might work for reading, but probably not for adding.
His part 2 covers writing.
PIEBALDconsult wrote: As long as you already have to know the structure/schema of the XML you might as well write specific classes to do that faster and safer.
I just hate backing classes that are nothing more than "mostly bags of water". But the dynamic code approach is too hardwired for my tastes. Ugh, I guess there's no good answer, especially since I'm translating XML into DB transactions, and the XML schema and the database schema have really no similarities.
Marc





Marc Clifton wrote: Ugh, I guess there's no good answer, especially since I'm translating XML into DB transactions, and the XML schema and the database schema have really no similarities.
Haha, leaving infamous morsels of talking for a while of yapping away.
No object is so beautiful that, under certain conditions, it will not look ugly.  Oscar Wilde




