Is O(nm + n^2 log n) time complexity polynomial time? - time-complexity

If an algorithm runs in O(nm + n^2 log n) time, can you then say it runs in polynomial time?
I know that O(n log n) is O(n^2), so that part is polynomial time. I'm just not sure how the n^2 log n part works.

Remember that O(n) means "upper-bounded by a linear function". If a function T(n) is O(n), then n*T(n) is O(n^2).
Of course, you can also multiply T(n) by some other function that is O(n) --- not necessarily f(n) = n. So T(n)*O(n) is also O(n^2).
If you know that O(n log n) is O(n^2), then you can multiply both by a function that is O(n) and arrive at the conclusion that O(n^2 log n) is O(n^3), which is polynomial.
Finally, if O(a), O(b) and O(c) are all polynomial, then O(a + b + c) is also polynomial, because the sum can be upper-bounded by the term that grows fastest.
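To spell out that last step for the expression in the question (a rough sketch, assuming n, m >= 1 are the two input parameters, and using log n <= n):

nm + n^2 log n <= (n + m)^2 + n^2 * n = (n + m)^2 + n^3 <= 2(n + m)^3

so the whole running time is bounded by a polynomial of degree 3 in the input parameters, hence it is polynomial time.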


O(n) + O(n^2) = O(n^2)?

We know that O(n) + O(n) = O(n), and even that O(n) + O(n) + ... + O(n) = O(n^2).
But what happens with O(n) + O(n^2)?
Is it O(n) or O(n^2)?
Big O notation (https://en.wikipedia.org/wiki/Big_O_notation) is used to describe the limiting behaviour of an algorithm's cost, i.e. how fast its complexity grows. Therefore, when you add a linear and a quadratic component of an algorithm, what remains in Big O notation is only the quadratic component.
If you plot the two curves, the quadratic one grows much faster (along the y-axis) than the linear one, so the general tendency of the algorithm's complexity is determined by the quadratic curve alone, hence O(n^2).
The case O(n) + O(n) = O(n) holds because any constant factor in Big O notation can be discarded: the curves y = n and y = 2n grow equally fast asymptotically (although with different slopes).
The case O(n) + ... + O(n) = O(n^2) is not true in general! With k terms the actual complexity is O(k*n), and only if the parameter k equals the input size n do you end up with the quadratic case.
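As a small illustration (my own sketch, not part of the original answer), consider a routine that does one linear pass followed by a full nested loop, and count the basic operations:

def count_ops(n):
    ops = 0
    for i in range(n):        # linear pass: n operations
        ops += 1
    for i in range(n):        # nested loops: n^2 operations
        for j in range(n):
            ops += 1
    return ops

# count_ops(1000) == 1000 + 1000**2 == 1_001_000; the linear part is only
# about 0.1% of the total, so the growth is governed by the n^2 term.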

Is O(n * 2^n) the same as O(2^n)?

Would O(n * 2^n) simplify to O(2^n) in Big-O notation?
My intuition is that it would not, even though O(2^n) is already significantly worse than O(n).
O(n * 2^n) is not equal to O(2^n) and is much worse than O(2^n).
Your intuition is correct: O(n * 2^n) is not equal to O(2^n), and you can see that from the definition of big-O. That is, if n * 2^n were O(2^n), there would have to be some constant k such that n * 2^n <= k * 2^n for all sufficiently large n. But whatever k you pick, taking n = k + 1 already shows that the inequality fails.
One easy way to elucidate this is to compute the quotient of both quantities and let n tend to infinity. In this case the quotient is
n*2^n/2^n = n
which tends to infinity as n goes to infinity. Since the limit is not bounded by any constant, the answer is that n*2^n grows much faster than 2^n.
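A quick way to see the failure of the big-O definition numerically (my own sketch, trying a few candidate constants):

# If n * 2^n were O(2^n), some constant k would satisfy n * 2**n <= k * 2**n
# for all large n; but every candidate k already fails at n = k + 1.
for k in [1, 10, 100, 1000]:
    n = k + 1
    print(k, n * 2**n <= k * 2**n)   # prints False for each k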

Time Complexity Question about Square Roots

I am trying to answer the following question:
Given n = 2k, find the complexity of:
func(n)
  if (n == 2) return 1
  else n = 1 + func(sqrt(n))
end
I think that because of the if-else statement it will loop n times for sure, but I'm confused by the recursive call func(sqrt(n)). Since the argument is square-rooted, I think the time complexity would be
O(sqrt(n) * n) = O(n^(1/2) * n) = O(n^(3/2)) = O((2k)^(3/2)). However, the possible answer choices are only:
O(k)
O(2^n)
O(n*k)
Can O((2k)^(3/2)) be considered O(k)? I'm confused because, although time complexities are often simplified, O(n) and O(n^2) are different, so I thought O((2k)^(3/2)) could only be simplified to O(k^(3/2)).
I’d argue that none of the answers here are the best answer to this question.
If you have a recursive function that does O(1) work per iteration and then reduces the size of its argument from n to √n, as is the case here, the runtime works out to O(log log n). The intuition behind this is that taking the square root of a number throws away half the digits of the number, roughly, so the runtime will be O(log d), where d is the number of digits of the input number. The number of digits of the number n is O(log n), hence the overall runtime of O(log log n).
In your case, you have n = 2k, so O(log log n) = O(log log k). (Unless you meant n = 2^k, in which case O(log log n) = O(log log 2^k) = O(log k).)
Notice that we don’t get O(n × √n) as our result. That would be what happens if we do O(√n) work O(n) times. But here, that’s not what we’re doing. The size of the input shrinks by a square root on each iteration, but that’s not the same thing as saying that we’re doing a square root amount of work. And the number of times that this happens isn’t O(n), since the value of n shrinks too rapidly for that.
Reasoning by analogy, the runtime of this code would be O(log n), not O(n × n / 2):
func(n):
  if n <= 2: return 1
  return func(n / 2)
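To see the O(log log n) behaviour of the original recursion concretely, here is a small sketch of my own (it just counts how many recursive calls happen before the argument drops to 2):

import math

def depth(n, calls=0):
    # mirrors the question's recursion: O(1) work, then recurse on sqrt(n)
    if n <= 2:
        return calls
    return depth(math.sqrt(n), calls + 1)

for bits in [8, 16, 32, 64]:
    n = 2 ** bits
    print(bits, depth(n), math.log2(math.log2(n)))
# The call count tracks log2(log2(n)): doubling the number of bits in n
# adds only one more recursive call.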

Asymptotic growth (Big O notation)

What I am trying to do is to sort the following functions:
n, n^3, nlogn, n/logn, n/log^2n, sqrt(n), sqrt(n^3)
in increasing order of asymptotic growth.
What I did is,
n/logn, n/log^2n, sqrt(n), n, sqrt(n^3), nlogn, n^3.
1) Is my answer correct?
2) I know about the time complexity of the basic functions such as n, n log n, n^2, but I am really confused about functions like n / log n and sqrt(n^3).
How should I figure out which one is faster or slower? Is there any way to do this with mathematical calculations?
3) Are the big O time complexity and asymptotic growth different thing?
I would really appreciate it if anyone could clear up my confusion... Thanks!
An important result we need here is:
log n grows more slowly than n^a for any strictly positive number a > 0.
For a proof of the above, see here.
If we re-write sqrt(n^3) as n^1.5, we can see that n log n grows more slowly than it (divide both by n and use the result above with a = 0.5).
Similarly, n / log n grows more quickly than any n^b where b < 1; again this is directly from the result above. Note that it is however slower than n by a factor of log n; same for n / log^2 n.
Combining the above, we find the increasing order to be:
sqrt(n)
n / log^2 n
n / log n
n
n log n
sqrt(n^3)
n^3
So I'm afraid to say you got only a few of the orderings right.
EDIT: to answer your other questions:
If you take the limit of f(n) / g(n) as n -> infinity, then it can be said that f(n) is asymptotically greater than g(n) if this limit is infinite, and lesser if the limit is zero. This comes directly from the definition of big-O.
big-O is a method of classifying asymptotic growth, typically as the parameter approaches infinity.
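As a rough numerical sanity check of the ordering (my own sketch, just illustrating the limit test above, not a proof), you can evaluate the ratio of each consecutive pair at a large n and confirm it is well below 1:

import math

funcs = [
    ("sqrt(n)",     lambda n: math.sqrt(n)),
    ("n / log^2 n", lambda n: n / math.log(n) ** 2),
    ("n / log n",   lambda n: n / math.log(n)),
    ("n",           lambda n: float(n)),
    ("n log n",     lambda n: n * math.log(n)),
    ("sqrt(n^3)",   lambda n: n ** 1.5),
    ("n^3",         lambda n: float(n) ** 3),
]

n = 10 ** 6
for (name_f, f), (name_g, g) in zip(funcs, funcs[1:]):
    print(f"{name_f} / {name_g} = {f(n) / g(n):.3g}")
# Every ratio is below 1 and keeps shrinking as n grows, matching the ordering above.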

mergesort complexity O(nlogn) + O(n)?

When analyzing the time complexity of merge sort, I know that since there are O(log(n)) levels and each level takes O(n) operations, the entire time complexity should be O(n log(n)).
However, doesn't the dividing take O(n) in total? Each division of the set of elements takes O(1), but you divide a total of O(n) times, so doesn't the dividing part of merge sort take O(n)? For example, if you have 8 elements you have to divide 7 times, and if you have 16 elements you have to divide 15 times.
So, shouldn't the entire merge sort time complexity technically be O(n log(n)) + O(n)? I know that O(n log(n) + n) is the same thing as O(n log(n)), but no one seems to mention this when explaining merge sort's time complexity.
O(n log n + n) is the same thing as O(n log n). n log n grows faster than n, so the n term is extraneous.
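To make the two contributions visible, here is a small instrumented merge sort of my own (a sketch, not from the answer) that counts the divide steps and the merge operations separately:

def merge_sort(a, counts):
    if len(a) <= 1:
        return a
    counts["divides"] += 1                    # one O(1) split per internal node: n - 1 splits in total
    mid = len(a) // 2
    left = merge_sort(a[:mid], counts)
    right = merge_sort(a[mid:], counts)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counts["merge_ops"] += 1              # O(n) merge work per level, O(log n) levels
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

counts = {"divides": 0, "merge_ops": 0}
merge_sort(list(range(1024, 0, -1)), counts)
print(counts)   # divides == 1023 (that's the O(n) part); merge_ops is on the order of n log n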