But here's a math question.
I just read that a selection sort takes O(n*n) (or O(n^2)) steps (verified by multiple sources).
So when I take an example of a list with 3 elements, I go through all 3 elements and take one out, go through the 2 that are left and take one out, and then look at the last item. That's 3 + 2 + 1 steps, which equals 6 (and I could actually skip that last one, because I know it's already in place).
If I apply O(n^2) I get 3^2, or 3*3, which equals 9 steps!
Is there something about O that the article is not telling me?
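For what it's worth, the step-counting can be checked in code. This is a minimal sketch (not from the article) that counts only the comparisons a selection sort makes; for 3 elements it finds 2 + 1 = 3, because the current element is never compared with itself:

```csharp
using System;

class SelectionSortSteps
{
    // Counts the comparisons a selection sort performs on an array of length n:
    // (n - 1) + (n - 2) + ... + 1 of them, because the current element is never
    // compared with itself.
    public static int CountComparisons(int[] a)
    {
        int comparisons = 0;
        for (int i = 0; i < a.Length - 1; i++)
        {
            int min = i;
            for (int j = i + 1; j < a.Length; j++)
            {
                comparisons++;                 // one comparison per inner step
                if (a[j] < a[min]) min = j;
            }
            (a[i], a[min]) = (a[min], a[i]);   // swap the minimum into place
        }
        return comparisons;
    }

    static void Main()
    {
        Console.WriteLine(CountComparisons(new[] { 2, 3, 1 }));  // prints 3, i.e. 2 + 1
    }
}
```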
My blog[^]
public class SanderRossel : Lazy<Person>
{
public void DoWork()
{
throw new NotSupportedException();
}
}
---
Well, you don't get 3^2; what you get is this:
∃c, n0: f(n) <= c*n^2 for all n >= n0
What you get here, if you count only the comparisons, is the familiar (n - 1) + (n - 2) + ... + 2 + 1 = n(n - 1)/2, which is in O(n^2) because there are a c and an n0 for which the definition holds, for example c = 1 and n0 = 1. No annoying search for those parameters is required, because "less than half the square" is of course less than the square.
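That definition can also be checked mechanically. A small sketch, assuming f(n) = n(n - 1)/2 and the witnesses c = 1, n0 = 1 from above:

```csharp
using System;

class BigOCheck
{
    static void Main()
    {
        // f(n) = n(n - 1)/2 with witnesses c = 1, n0 = 1:
        // check that f(n) <= c * n^2 for every n from n0 up to some bound.
        const long c = 1, n0 = 1;
        for (long n = n0; n <= 1_000_000; n++)
        {
            long f = n * (n - 1) / 2;
            if (f > c * n * n)
                throw new Exception($"bound fails at n = {n}");
        }
        Console.WriteLine("n(n - 1)/2 <= 1 * n^2 for all n in [1, 1000000]");
    }
}
```

Of course a loop only checks finitely many n; the "less than half the square" argument is what covers all of them.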
---
Isn't that the chemical structure of caffeine?
---
harold aptroot wrote: "you get that"
I got nothing.
harold aptroot wrote: "is the familiar"
No familiar of mine!
harold aptroot wrote: "of course"
Right...
Let's just keep it at "there is something about O that the article is not telling me"...
I probably should've mentioned that I'm not very good at maths (which is an understatement) and that the book I was reading was basically an algorithm book for n00bs.
---
Ok then, no math this time. That's hard, because big O is fundamentally about sets of functions, but I'll try.
What it means when f is in the big O of g is that f does not grow faster than g. There is a precise definition, but it's not used that often. The point is more that it's about growth, not about specific values, and that it means "not more than" something; it can be less, even by a non-constant factor.
---
Oh yeah, it's about growth... But now I'm lost with the n^2 again.
If I have 3 elements in my list, O(3^2) = 9, and with 9 elements I have 81... So what do 9 and 81 tell me about growth? That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?
But that's not the case either, because sorting 3 elements takes 6 steps and sorting 9 elements takes 45 steps, and 45/6 = 7.5, not 9...
I guess I'll dive into some mathy stuff... I just can't stand math. It's the only thing that just doesn't get into my head. And just when I think it does, the next math problem proves me wrong!
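Counting steps the same way as before (n + (n - 1) + ... + 1), the arithmetic can be sketched like this:

```csharp
using System;

class StepRatio
{
    // Steps counted the same way as in the post: n + (n - 1) + ... + 1.
    public static int Steps(int n) => n * (n + 1) / 2;

    static void Main()
    {
        Console.WriteLine(Steps(3));  // 6
        Console.WriteLine(Steps(9));  // 45
        // 45 / 6 = 7.5: not 9, because the lower-order terms still matter at small n.
    }
}
```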
---
You're still looking at it as a function that means anything for finite values. And this:
Sander Rossel wrote: "O(3^2) = 9"
does not work at all. Big O gives sets of functions, and here we have O(3^2) = O(9) (that part is true), but then, following the definition of O, that set will contain all constants. 1 is in O(9), and 1000000000 is also in O(9). O(9) is the same set as O(1). So filling in some constants tells you exactly nothing; all information has been deleted. Filling in some finite n doesn't mean anything. O(n^2) is a set of functions that includes n^2, 0.01n^2 - 1000, 1000n^2 + 1000n, and an infinite number of other functions that all grow quadratically in the limit. You simply cannot take O(n^2) and fill in an n; you don't know which function you have!
If you really must use exact values, there is something that does work: plot the points you get and look at the curve they form. It will look like the right half of a parabola.
Sander Rossel wrote: "That sorting a list with 9 elements takes 9 times as many steps as a list with 3 elements?"
Close enough. It tells you that, in the limit, sorting a list with 3n elements will take 9 times as long as sorting a list with n elements.
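A small sketch of that limit, using the comparison count n(n - 1)/2 from earlier in the thread; the ratio of work between 3n and n elements creeps toward 9 as n grows:

```csharp
using System;

class GrowthRatio
{
    // Comparison count of selection sort, from earlier in the thread.
    public static double Comparisons(long n) => n * (n - 1) / 2.0;

    static void Main()
    {
        // The ratio of work for 3n elements vs. n elements tends to 9 as n grows.
        foreach (long n in new long[] { 3, 30, 300, 3000 })
            Console.WriteLine($"n = {n}: ratio = {Comparisons(3 * n) / Comparisons(n)}");
    }
}
```

For n = 3 the ratio is 12; by n = 3000 it is already within a fraction of a percent of 9.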
---
harold aptroot wrote: "[incomprehensible gibberish...]"
I appreciate the time and effort you put into explaining this to me. I'm clearly missing some knowledge to make anything out of this.
I'm just going to sit in a corner and cry hoping that math will go away...
Or I'm going to take some math courses (actually I still have a few if I want to finish my Open Universiteit Informatica Bachelor/Master) and understand the hell out of this big-O thing.
So come find me in a corner if you need me
---
Ok, then how about this: understand it without understanding it. That's what most people do with it anyway. Forget the exact definition etc. Just do this:
We had n(n - 1)/2 (conjured up magically, because summing series is for mathematicians). That looks vaguely like a square (there's a thing with n in it multiplied by another thing with n in it), so it's probably going to be in O(n^2).
Or: there were two nested loops, each of which runs about n times. Looks vaguely like n*n, so probably going to be in O(n^2).
Want to know "which big O is better"? Just look it up in this small list:
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n)
That's how non-mathematicians usually use big O.
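The nested-loops rule of thumb, as a minimal sketch:

```csharp
using System;

class NestedLoops
{
    static void Main()
    {
        int n = 10, steps = 0;
        // Two nested loops, each running (roughly) n times: vaguely n * n work.
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)  // the inner loop shrinks, but that only saves a constant factor
                steps++;
        Console.WriteLine(steps);  // 45, which is n(n - 1)/2 -- about half of n^2 = 100
    }
}
```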
---
harold aptroot wrote: "understand it without understanding it"
Sure, I was going to do that anyway, but then as a bonus I wanted to understand it too.
harold aptroot wrote: "Want to know "which big O is better"? Just look it up in this small list:"
O(1) < O(log n) < O(sqrt(n)) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
---
Sander Rossel wrote: take one out
Because a selection sort doesn't "take one out" -- it doesn't remove any elements from the original list.
So if you have n elements, you have to search n elements n times to make sure you've found the "next" element in the sort.
Marc
---
That's the conclusion I reached too, but what I understand from various sources is that we go through n iterations, where the lowest value found in each iteration is swapped with the element at the start of the unsorted part, and the next iteration starts one position further. So when we have {2, 3, 1}, we swap 1 and 2 in the first iteration, start at 3 in the second iteration (and swap it with 2), start at 3 in the last iteration (do nothing), and we're done.
And that makes sense, because why would you want to check elements that you know are sorted?
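That description, sketched in code with the {2, 3, 1} example (the inner loop starts one past the sorted part, exactly as described):

```csharp
using System;

class SelectionSortTrace
{
    // Selection sort as described: in each pass, find the lowest value
    // in the unsorted tail and swap it to the front of that tail.
    public static int[] Sort(int[] a)
    {
        for (int i = 0; i < a.Length - 1; i++)
        {
            int min = i;
            for (int j = i + 1; j < a.Length; j++)   // skip elements already sorted
                if (a[j] < a[min]) min = j;
            (a[i], a[min]) = (a[min], a[i]);
            Console.WriteLine(string.Join(", ", a)); // state after each pass
        }
        return a;
    }

    static void Main()
    {
        Sort(new[] { 2, 3, 1 });  // prints "1, 3, 2" then "1, 2, 3"
    }
}
```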
---
So the actual number of steps is n(n - 1)/2 (the sum of all integers from 1 to n - 1).
Since big-O notation only cares about the term with the largest growth rate, that becomes O(n^2).
http://en.wikipedia.org/wiki/Selection_sort#Analysis[^]
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
---
Oh yeah, it's about growth... So now maybe I get the big O, but I still don't know how to interpret n^2.
I guess I'll dive into some mathy stuff.
Really, math is the only thing where I think I get it until I get to the next problem and find I didn't get any of it... And math just keeps coming back, too.
---
I read a story about O once - but I don't remember there being much maths in it
---
Fluent XML Parsing using C#'s Dynamic Object[^]
Now, granted, it's probably dog slow compared to other forms, but for simple stuff where performance isn't critical and the XML structure is known, it's a good addition to the toolchest of useful stuff!
Marc
---
Yeah, poor performance. Also, it might work for reading, but probably not for adding.
As long as you already have to know the structure/schema of the XML you might as well write specific classes to do that faster and safer.
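A minimal sketch of that "specific classes" approach with System.Xml.Linq; the <person> schema and the Person/Name/Age names here are made up purely for illustration:

```csharp
using System;
using System.Xml.Linq;

// The <person> element names below are made up for illustration only.
class Person
{
    public string Name { get; set; }
    public int Age { get; set; }

    // One small, explicit mapping per class: fast, and typos fail loudly.
    public static Person FromXml(XElement e) => new Person
    {
        Name = (string)e.Element("name"),
        Age = (int)e.Element("age"),
    };

    public XElement ToXml() =>
        new XElement("person",
            new XElement("name", Name),
            new XElement("age", Age));
}

class Demo
{
    static void Main()
    {
        var p = Person.FromXml(XElement.Parse(
            "<person><name>Sander</name><age>30</age></person>"));
        Console.WriteLine($"{p.Name}, {p.Age}");  // prints "Sander, 30"
        Console.WriteLine(p.ToXml().ToString(SaveOptions.DisableFormatting));
    }
}
```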
---
PIEBALDconsult wrote: Also, it might work for reading, but probably not for adding.
His part 2 covers writing.
PIEBALDconsult wrote: As long as you already have to know the structure/schema of the XML you might as well write specific classes to do that faster and safer.
I just hate backing classes that are nothing more than "mostly bags of water". But the dynamic code approach is too hard-wired for my tastes. Ugh, I guess there's no good answer, especially since I'm translating XML into DB transactions, and the XML schema and the database schema have really no similarities.
Marc
---
Marc Clifton wrote: Ugh, I guess there's no good answer, especially since I'm translating XML into DB transactions, and the XML schema and the database schema have really no similarities.
Haha, leaving infamous morsels of talking for a while of yapping away.
No object is so beautiful that, under certain conditions, it will not look ugly. - Oscar Wilde
---
Whoa! Didn't hear about that. All the best to my man Bruce.
Up the Irons!
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
---
'Cause if you're gonna die, die with your boots on!
Geek code v 3.12
GCS d--- s-/++ a- C++++ U+++ P- L- E-- W++ N++ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t++ 5? X R++ tv-- b+ DI+++ D++ G e++>+++ h--- r++>+++ y+++*
Weapons extension: ma- k++ F+2 X
---
This smoking thing really sucks
---
Why does Spongebob Wearpants?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
---