|
There is no base case; what is T(1)?
|
|
|
|
|
take a constant as C>0
valhamdolelah.
|
|
|
|
|
Ok then I think:
T(n) = (n * C)/(e^2) + (n * ln(ln(n))) / ln(2)
edit: that would be O(n log(log n)) I think
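A quick numeric sketch (my own, not from the thread) is consistent with that bound. It assumes a base case T(x) = C for x ≤ 2, which is my choice of stop condition, not something given in the problem:

```python
import math

C = 1.0  # assumed base-case constant: T(x) = C for x <= 2

def T(n):
    # T(n) = sqrt(n) * T(sqrt(n)) + n, with the assumed stop condition above
    if n <= 2.0:
        return C
    r = math.sqrt(n)
    return r * T(r) + n

# If T(n) = O(n log log n), the ratio T(n) / (n * log2(log2 n))
# should stay roughly flat as n grows.
for n in (10**3, 10**6, 10**12):
    print(n, T(float(n)) / (n * math.log2(math.log2(n))))
```

The recursion depth is only about log2(log2 n), so even n = 10^12 needs just a handful of levels.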
|
|
|
|
|
Thanks.
Could you explain the method? How do we arrive at this result?
valhamdolelah.
|
|
|
|
|
|
What is the solution?
How do you solve it?
valhamdolelah.
|
|
|
|
|
Well, I cheated and put it into Wolfram Alpha; this wasn't an exam, so...
|
|
|
|
|
Thanks, but I want to know how to solve these problems.
valhamdolelah.
|
|
|
|
|
I don't know what you mean
|
|
|
|
|
In the end, he does want the answer, but he is more interested in how you come up with the answer - the way to solve it. In effect, he does not want you to do his homework for him, he wants to know HOW to do his homework.
That is unusual for this kind of question, and I think he deserves kudos!
Unfortunately, I don't know how to solve it.
Silver member by constant and unflinching longevity.
|
|
|
|
|
Thank you for wanting to know HOW, not just wanting the answer.
.
Are you solving for the computational order of an algorithm that takes this long to process (the Big O notation), or are you looking for a numerical order of magnitude for the result of applying this equation to an arbitrary value 'n' (e.g. T(2) = sqrt(2)*T(sqrt(2)) + 2)?
.
T(n) = sqrt(n)*T(sqrt(n)) + n
.
The Big O would be O(n). The two terms are added together, so each term can be taken by itself.
sqrt(n) * T(sqrt(n)) becomes 'n' in the limit, so you have two O(n) terms, which is the same as O(n).
.
If you are looking for numerics, I can't really help you.
Silver member by constant and unflinching longevity.
|
|
|
|
|
Thanks very much for your attention.
I want to know its computational order, as you said:
sqrt(n) * T(sqrt(n)) becomes 'n' in the limit, so you have two O(n) terms, which is the same as O(n).
But why does it become n in the limit? Please explain more.
Also, I think finding it numerically would be interesting, but that would be hard!
thanks.
|
|
|
|
|
response below
Silver member by constant and unflinching longevity.
|
|
|
|
|
I have to disagree. The two terms are not O(n) at all, one of them is O(n^0.5) which is asymptotically different from O(n), and the other is unknown until the recursion is solved.
|
|
|
|
|
response below your entry
Silver member by constant and unflinching longevity.
|
|
|
|
|
I'm answering both of you, so I am just tacking it on to the end.
The problem in what I wrote was with the T(sqrt(n)) part of the function.
The answer hinges on what F(n) = sqrt(n)*F(sqrt(n)) is. I was thinking it approached n, but I see I was wrong; it may approach n^1.5, which would mean it dominated.
When you add in the (+ n), with an infinite recursion, I see that this goes to infinity, even for n = 1.
As a matter of fact, I will kick my own butt on this and say that, while I don't know which order of infinity this ends up being, since there is no stop condition on the recursion, the answer, even for n = 1, is infinity, given this analysis:
T(1) = 1*T(1)+1
which shows that you end up adding 1 at each recursion level, with an infinite number of recursions (no stop condition). And it just gets larger from there.
khomeyni - sorry for my flawed analysis, thank you for asking why.
harold aptroot - is that a better analysis? I'm asking, not being snide.
Silver member by constant and unflinching longevity.
|
|
|
|
|
Yes, this is better. This is why I asked him what T(1) was, and he said "take a constant as C>0", which significantly changes the result.
|
|
|
|
|
Thanks for getting involved in it.
I say that we must take T(1) to be a constant c, so the stop condition is T(1). But I don't know how either you or harold aptroot solved this. You said it would be n^1.5; how did you find that?
Please explain the solution, not only the final answer. Finding just the order would be enough.
thanks.
valhamdolelah.
|
|
|
|
|
khomeyni wrote: you said that it would be n^1.5 how you find it?
I said that about T(n) = sqrt(n)*T(sqrt(n)), but was wrong in the overall analysis at that point.
Then I went on to the actual form T(n) = sqrt(n)*T(sqrt(n)) + n and said it was infinite, though I don't know what order.
Even if you remove the sqrt(n) factor, leaving T(n) = T(sqrt(n)) + n, and assume T(1) is a stop condition: if you try T(n > 1), sqrt(sqrt(sqrt(...(n)))) approaches, but never reaches, 1, so you have an infinite series, summing for x = 1 to infinity of n^(1/2^x).
That sums an infinite progression of numbers > 1.
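A quick sketch (my own illustration) of that point: repeated square roots of any n > 1 approach 1 but never reach it, so every term of the series stays strictly above 1 and the partial sums grow without bound.

```python
import math

# After k square roots, x == n ** (1 / 2**k): it approaches 1 from above
# but never reaches it, so without a stop condition the recursion keeps
# adding a term > 1 forever.
n = 1e6
terms = []
x = n
for k in range(30):
    x = math.sqrt(x)
    terms.append(x)

print(terms[0], terms[-1])  # first term is large, last is barely above 1
print(sum(terms))           # partial sums keep growing without bound
```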
Silver member by constant and unflinching longevity.
|
|
|
|
|
|
It's ugly, ugly, ugly....
You measure democracy by the freedom it gives its dissidents, not the freedom it gives its assimilated conformists.
|
|
|
|
|
Hi,
Is it possible to use the Bug-1 algorithm without knowing the distance to the target?
At every position (in a matrix) I only know the direction to it (N, NE, E, SE, ...), but not the distance.
modified on Tuesday, November 3, 2009 5:00 PM
|
|
|
|
|
If you know the direction from two points and the distance between them, then it's pretty simple trig to work out the distance to the third: Triangulation[^]
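As a sketch of that trig (my own helper, not from the link): given the baseline between the two observation points and the interior angle the target makes at each of them, the law of sines gives the range.

```python
import math

def distance_to_target(baseline, angle_a, angle_b):
    """Triangulate the range from point A, given the baseline A-B and the
    interior angles (radians) the target line makes at A and at B."""
    gamma = math.pi - angle_a - angle_b  # angle at the target
    if gamma <= 0:
        raise ValueError("bearings do not converge on a target")
    # Law of sines: range_from_a / sin(angle_b) = baseline / sin(gamma)
    return baseline * math.sin(angle_b) / math.sin(gamma)

# Example: a 10-unit baseline with 60-degree angles at both ends forms an
# equilateral triangle, so the range from A is also 10.
print(distance_to_target(10.0, math.radians(60), math.radians(60)))
```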
Panic, Chaos, Destruction.
My work here is done.
|
|
|
|
|
I need to write a divide-and-conquer algorithm for finding the dominant element in an array (a dominant element is one that occurs at least n/2 times in an array of size n). All you can do with the elements is compare them for equality (no < or > relation).
I also need an algorithm that does it in O(n) (it doesn't have to be divide and conquer).
Thanks ahead for any help...
|
|
|
|
|
Well, this may not win any awards for style, but have you considered just using a hash table to aggregate the counts?
It would be O(n) to hash everything, since hash table operations are expected constant time... Then a subsequent O(n) pass to find the largest count in the table.
There's probably a more clever way to do it, but that would get the job done.
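The hash-table idea sketched in Python (my own sketch; `dominant` is a hypothetical helper name). One pass counts occurrences, a second pass checks whether the most frequent element clears the n/2 threshold:

```python
from collections import Counter

def dominant(items):
    """Return an element occurring at least len(items)/2 times, or None.
    Counting with a hash table is O(n) on average."""
    if not items:
        return None
    counts = Counter(items)  # one O(n) pass to count occurrences
    value, freq = counts.most_common(1)[0]
    return value if freq * 2 >= len(items) else None

print(dominant([3, 1, 3, 2, 3, 3]))  # 3 occurs 4 times out of 6
print(dominant([1, 2, 3]))           # no element reaches n/2
```

Note this only ever compares elements for equality (via hashing), which fits the constraint that no ordering relation is available.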
|
|
|
|