Suppose that you have found a solution to problem A and are trying to get some idea of its complexity. You solve A by calling your subroutine B a total of n^2 times and also doing a constant amount of additional work.
If B is selection sort, what is the time complexity of this solution?
If B is merge sort, what is the time complexity of this solution?
My answer to the 1st question is n^2 and to the 2nd one is n log n. Any feedback on my answers would be appreciated.
I assume that by "solution" you mean "algorithm", and by "this solution", you mean the algorithm that solves problem A by calling B n^2 times. Furthermore, I assume that by n you mean the size of the input.
Then if B is selection sort, which is an O(n^2) algorithm, the algorithm for solving A would be O(n^2 * n^2) = O(n^4).
If B is merge sort, which is O(n log n), the algorithm for solving A would be O(n^2 * n log n) = O(n^3 log n).
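To make the counting concrete, here is a minimal Python sketch of that structure; solve_A and the way B is invoked are hypothetical names, just to show where the n^2 factor and the per-call cost multiply:

def solve_A(data, B):
    # data: a list of length n; B: whatever sorting subroutine you plug in
    n = len(data)
    for _ in range(n * n):        # n^2 calls to the subroutine B
        B(list(data))             # each call costs O(n^2) or O(n log n)
    # plus a constant amount of additional work
    # total: n^2 calls * cost(B) -> O(n^4) for selection sort, O(n^3 log n) for merge sort

solve_A(list(range(50)), sorted)  # Python's built-in sort is O(n log n), like merge sort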
Yeah, you are right:
O(B) = n^2 -> Selection Sort;
O(B) = n * log(n) -> Merge Sort
This is a question that my Data Structures course teacher asked in a class test. What would be the proper answer here? Since log n^2 = 2 log n, as far as I know it could be written as O(log n), since constant multipliers cancel out in time complexity. Then is one better than the other in any specific way?
Asymptotically they are the same.
Your reasoning is right: O(log n^2) can be simplified to O(log n), and they are obviously equal.
It's like having two algorithms that work on an array, where the first is O(n) and the second is O(2n).
If you look at the number of operations performed, the second performs double the operations of the first, but this is not important for asymptotic notation.
They are of the same order, that is, O(n).
In your specific example the order is O(log n) and they can be considered the same.
I would agree with you that any O(log(x^k)) is O(log(x)). The computational complexity scales the same.
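As a quick numeric sanity check (a throwaway Python snippet, not part of the original question), the ratio log(n^2) / log(n) is the constant 2 for every n, which is exactly the constant factor that big-O drops:

import math

for n in (10, 10**3, 10**6, 10**9):
    # log(n^2) = 2 * log(n), so the ratio is always 2
    print(n, math.log(n**2) / math.log(n))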
I am trying to answer the following question:
Given n = 2k, find the complexity of:
func(n)
    if (n == 2) return 1;
    else return 1 + func(sqrt(n))
end
I think that because of the if-else statement it will loop n times for sure, but I'm confused by the recursive call func(sqrt(n)). Since the argument is square-rooted, I think the time complexity would be
O(sqrt(n) * n) = O(n^(1/2) * n) = O(n^(3/2)) = O((2k)^(3/2)). However, the possible answer choices are only:
O(k)
O(2^n)
O(n*k)
Can O((2k)^(3/2)) be considered O(k)? I'm confused because, although time complexities are often simplified, O(n) and O(n^2) are different, so I thought O((2k)^(3/2)) could only be simplified to O(k^(3/2)).
I’d argue that none of the answers here are the best answer to this question.
If you have a recursive function that does O(1) work per iteration and then reduces the size of its argument from n to √n, as is the case here, the runtime works out to O(log log n). The intuition behind this is that taking the square root of a number throws away half the digits of the number, roughly, so the runtime will be O(log d), where d is the number of digits of the input number. The number of digits of the number n is O(log n), hence the overall runtime of O(log log n).
In your case, you have n = 2k, so O(log log n) = O(log log k). (Unless you meant n = 2^k, in which case O(log log n) = O(log log 2^k) = O(log k).)
Notice that we don’t get O(n × √n) as our result. That would be what happens if we do O(√n) work O(n) times. But here, that’s not what we’re doing. The size of the input shrinks by a square root on each iteration, but that’s not the same thing as saying that we’re doing a square root amount of work. And the number of times that this happens isn’t O(n), since the value of n shrinks too rapidly for that.
Reasoning by analogy, the runtime of this code would be O(log n), not O(n × n / 2):
func(n):
    if n <= 2: return 1
    return func(n / 2)
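To see the doubly-logarithmic behaviour concretely, here is a small Python sketch (mine, not from the original question) that counts how many square roots it takes to shrink n down to 2, and compares that count with log2(log2(n)):

import math

def iterations_to_two(n):
    # mirrors func(n): each step replaces n by sqrt(n) and does O(1) work
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

for n in (4, 16, 256, 65536, 2**64):
    print(n, iterations_to_two(n), math.log2(math.log2(n)))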
What I am trying to do is to sort the following functions:
n, n^3, n log n, n / log n, n / log^2 n, sqrt(n), sqrt(n^3)
in increasing order of asymptotic growth.
What I did is,
n / log n, n / log^2 n, sqrt(n), n, sqrt(n^3), n log n, n^3.
1) Is my answer correct?
2) I know the time complexity of basic functions such as n, n log n, and n^2, but I am really confused about functions like n / log n and sqrt(n^3).
How should I figure out which one is faster or slower? Is there any way to do this with mathematical calculations?
3) Are big-O time complexity and asymptotic growth different things?
I would really appreciate it if anyone could clear up my confusion... Thanks!
An important result we need here is:
log n grows more slowly than n^a for any strictly positive number a > 0.
For a proof of the above, see here.
If we re-write sqrt(n^3) as n^1.5, we can see that n log n grows more slowly (divide both by n and use the result above).
Similarly, n / log n grows more quickly than any n^b where b < 1; again, this follows directly from the result above. Note, however, that it is slower than n by a factor of log n; the same goes for n / log^2 n.
Combining the above, we find the increasing order to be:
sqrt(n)
n / log^2 n
n / log n
n
n log n
sqrt(n^3)
n^3
So I'm afraid to say you got only a few of the orderings right.
EDIT: to answer your other questions:
If you take the limit of f(n) / g(n) as n -> infinity, then it can be said that f(n) is asymptotically greater than g(n) if this limit is infinite, and lesser if the limit is zero. This comes directly from the definition of big-O.
big-O is a method of classifying asymptotic growth, typically as the parameter approaches infinity.
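For intuition (an informal numeric check in Python, not a proof), you can evaluate all seven functions at a single large n and see that the values come out in exactly the order listed above:

import math

n = 10**6
funcs = [
    ("sqrt(n)",     math.sqrt(n)),
    ("n / log^2 n", n / math.log(n) ** 2),
    ("n / log n",   n / math.log(n)),
    ("n",           float(n)),
    ("n log n",     n * math.log(n)),
    ("sqrt(n^3)",   n ** 1.5),
    ("n^3",         float(n) ** 3),
]
for name, value in funcs:
    print(f"{name:12s} {value:.3e}")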
I understand why, in the worst case, where T is the running time of the algorithm, using the median-of-medians algorithm with blocks of size three gives the recurrence relation
T(n) = T(2n / 3) + T(n / 3) + O(n)
The Wikipedia article for the median-of-medians algorithm says that with blocks of size three the runtime is not O(n) because it still needs to check all n elements. I don't quite understand this explanation, and in my homework it says I need to show it by induction.
How would I show that median-of-medians takes time Ω(n log n) in this case?
Since this is a homework problem I'm going to let you figure out a rigorous proof of this result on your own, but it might be helpful to think about this one by looking at the shape of the recursion tree, which will be something like this:
n                                Total work: n
2n/3          n/3                Total work: n
4n/9   2n/9   2n/9   n/9         Total work: n
Essentially, each node's children collectively will do the exact same amount of work as the node itself, so if you sum up the work done across the layers, you should see roughly linear work done per level. It won't be exactly linear work per level because eventually the smaller call starts to bottom out, but for the top layers you'll see this pattern hold.
You can formalize this by induction by guessing that the runtime is something of the form cn log n, possibly with some lower-order terms added in, but (IMHO) it's more important and instructive to see where the runtime comes from than it is to be able to prove it inductively.
If we add the fractional parts of T(2n/3) and T(n/3), we get T(n); that is, 2n/3 + n/3 = n, so the children of a node collectively do the same amount of work as the node itself. The standard Master theorem does not apply directly here because the two subproblems have different sizes, but the Akra-Bazzi generalization does: solving (2/3)^p + (1/3)^p = 1 gives p = 1, and since f(n) = Theta(n) = Theta(n^p), we get T(n) = Theta(n^p * log n) = Theta(n log n).
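As an illustration (my own sketch, using an integer model of the recurrence with floors and T(n) = 1 for n <= 1), you can evaluate T(n) numerically and watch T(n) / (n log n) settle near a constant:

from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n):
    # model of T(n) = T(2n/3) + T(n/3) + n with integer arguments
    if n <= 1:
        return 1
    return T(2 * n // 3) + T(n // 3) + n

for n in (10**2, 10**3, 10**4, 10**5):
    print(n, T(n) / (n * math.log(n)))   # ratio approaches a constant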
for (i = 0; i < n; i++)
{
    // enumerate all subsets of size i (there are 2^n subsets in total)
    // each subset of size i takes O(n log n) to search for a solution
    // from all these solutions I want to find the minimum subset of size S
}
I want to know the complexity of this algorithm: is it 2^n O(nlogn*n) = o(2^n n²)?
If I understand you right:
You iterate over all subsets of a sorted set of n numbers.
For each subset you test in O(n log n) whether it is a solution (however you do this).
After you have all these solutions, you look for the one with exactly S elements and the smallest sum.
The way you write it, the complexity would be O(2^n * n log n) * O(log(2^n)) = O(2^n * n^2 log n). The O(log(2^n)) = O(n) factor is for finding the minimum solution, and you do this in every round of the for loop; the worst case is i = n/2 with every subset being a solution.
Now I'm not sure if you are mixing up O() and o().
2^n O(nlogn*n) = o(2^n n²) is only right if you mean 2^n O(n log(n*n)).
f = O(g) means the complexity of f is not bigger than the complexity of g.
f = o(g) means the complexity of f is strictly smaller than the complexity of g.
So 2^n O(n log(n*n)) = O(2^n n log n^2) = O(2^n n * 2 log n) = O(2^n n log n) < O(2^n n^2).
Notice: O(g) = o(h) is never good notation. If g = o(h), you will (most likely every time) find a function f with f = o(h) but f != O(g).
Improvements:
If I understand your algorithm right, you can speed it up a little. You know the size of the subset you are looking for, so only look at the subsets that have size S. The worst case is S = n/2, and since C(n, n/2) < 2^(n-1), this will not reduce the asymptotic complexity, but it saves you at least a factor of 2.
You can also just keep the best solution found so far and check whether the next solution is smaller. This way you get the smallest solution without searching for it again. So the complexity would be O(2^n * n log n).
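A minimal Python sketch of both improvements, assuming "minimum" means smallest sum and using a placeholder is_solution predicate (all names here are hypothetical, since the original post does not say how a subset is tested):

from itertools import combinations

def smallest_solution(nums, S, is_solution):
    # only enumerate subsets of size S, and keep the best one seen so far
    best = None
    for subset in combinations(sorted(nums), S):
        if is_solution(subset) and (best is None or sum(subset) < sum(best)):
            best = subset
    return best

# example usage with a placeholder predicate:
print(smallest_solution([5, 1, 4, 2, 3], S=2, is_solution=lambda s: sum(s) % 2 == 0))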