How can we simplify the time complexity of T(n) = O(n^3 + 2n^2 + 3n) + T(n-1)?

How can we simplify the time complexity of T(n) = O(n^3 + 2n^2 + 3n) + T(n-1)?
The result should be O(n^4), but how?
Can it be solved using the Master Theorem?
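The Master Theorem does not apply here, since T(n-1) makes this a subtract-and-conquer recurrence rather than a divide-and-conquer one. Unrolling works instead: O(n^3 + 2n^2 + 3n) = O(n^3), so T(n) = O(n^3) + T(n-1) telescopes to the sum of the first n cubes, and 1^3 + 2^3 + ... + n^3 = (n(n+1)/2)^2 = O(n^4). A quick numeric check of that sketch, assuming T(0) = 0 and taking the per-step cost as exactly n^3:

def T(n):
    # unrolled recurrence: T(n) = n^3 + T(n-1), with T(0) = 0 (assumed)
    return sum(k**3 for k in range(1, n + 1))

for n in [10, 100, 1000]:
    # the ratio tends to 1/4, consistent with T(n) = (n(n+1)/2)^2 ~ n^4/4
    print(n, T(n) / n**4)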


Asymptotic time complexity of c*n*(1 - n)

Suppose on solving a recurrence, I find that:
T(n) = c*n*(1-n) = c*n - c*n^2
where c is a positive constant and n the size of the input
Should I consider the asymptotic time complexity of this recurrence to be O(n), since the n^2 term is negative?
UPDATE:
For example, suppose we have the following recurrence:
T(n) = T(a*n) + O(n), where the factor a is less than 1:
=> T(n) = c*n*(1 + a + a^2 + a^3 + ... for log_a(n) terms)
=> T(n) = c*n*(1 - a^(log_a(n)))/(1 - a)
=> T(n) = c*n*(1 - n)/(1 - a) ~ c*n*(1 - n)
The error, as suggested by @meowgoesthedog in the comments, is due to an incorrect leap in reasoning (there are log_{1/a}(n) terms, not log_a(n)); the correct derivation is as follows:
T(n) = T(a*n) + O(n), where the factor a is less than 1:
=> T(n) = c*n*(1 + a + a^2 + a^3 + ... for log_{1/a}(n) terms)
=> T(n) = c*n*(1 - a^(log_{1/a}(n)))/(1 - a)
=> T(n) = c*n*(1 - 1/n)/(1 - a) ~ c*n/(1 - a) ~ O(n), since a^(log_{1/a}(n)) = 1/n
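To sanity-check the corrected derivation numerically, here is a minimal sketch, assuming T(1) = 1 and taking the O(n) term as exactly n:

def T(n, a=0.5):
    # iterate T(n) = T(a*n) + n down to the base case T(1) = 1 (assumed)
    total = 0
    while n > 1:
        total += n
        n = int(a * n)
    return total + 1

for n in [10, 100, 1000, 10000]:
    # the ratio settles near 1/(1 - a) = 2 for a = 0.5, i.e. T(n) = O(n)
    print(n, T(n) / n)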

What is the time complexity of this nested loop?

I stumbled upon a loop for which I am not sure what the time complexity is. It is the following loop:
for (i = 1; i <= n^2; i++) {
    for (j = 1; j <= i; j++) {
        // some elementary operation
    }
}
I would have argued that the outer for-loop runs in n^2 and the inner for-loop also runs in n^2, as for every iteration of the outer loop we do n^2 - (n^2 - 1), n^2 - (n^2 - 2), ..., n^2 operations. Am I totally going in the wrong direction here?
So would the time complexity be O(n^4)?
The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to (n² * (n² + 1)) / 2.
The Big O notation is O(n^4). You are correct.
It's a simple arithmetic progression happening here.
Every new iteration of the inner loop is longer by 1.
The outer loop runs n^2 times, which results in the following sequence:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 (n^2+1) / 2 = O(n^4)
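If you want to verify this empirically, here is a small brute-force sketch that counts the inner-loop iterations and compares them with the closed form n^2 * (n^2 + 1) / 2:

def count_ops(n):
    # direct simulation of the nested loop
    ops = 0
    for i in range(1, n**2 + 1):
        for j in range(1, i + 1):
            ops += 1
    return ops

for n in [2, 5, 10]:
    # both columns agree: 1 + 2 + ... + n^2 = n^2 * (n^2 + 1) / 2
    print(n, count_ops(n), n**2 * (n**2 + 1) // 2)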

Time complexity analysis - how to simplify the expression

My algorithm's complexity is given by the expression below, but I am not sure how to simplify it further to express it in Big-O notation.
T(n) = 3*T(n-1) + 3*T(n-2) + 3*T(n-3) + ... + 3*T(1)
T(1) takes constant time.
Appreciate any help.
Calculating T(n-1), we get:
T(n-1) = 3*T(n-2) + 3*T(n-3) + ... + 3*T(1)
So effectively,
T(n) = 3*T(n-1) + T(n-1) = 4*T(n-1) = 4*(4*T(n-2)) = ...
Thus T(n) = 4^(n-1) * T(1), which is O(4^n).
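As a sanity check: with T(1) = 1 the exact closed form is T(2) = 3 and T(n) = 3 * 4^(n-2) for n >= 2, which has the same O(4^n) growth. A minimal sketch, assuming that base case:

def T(n):
    # direct evaluation of T(n) = 3*T(n-1) + 3*T(n-2) + ... + 3*T(1)
    if n == 1:
        return 1  # assumed base case T(1) = 1
    return 3 * sum(T(k) for k in range(1, n))

for n in range(2, 9):
    # both columns agree, confirming T(n) grows as 4^n
    print(n, T(n), 3 * 4**(n - 2))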

complexity for recursive functions (Big O notation)

I have two functions for which I would like to determine the complexity.
For the first one I just need to know whether my solution is correct; for the second one, because of the two recursive calls, I am struggling to find the solution. If possible it would be good to have the working out, so that I can learn how it's done.
First:
def sum(list):
    assert len(list) > 0
    if len(list) == 1:
        return list[0]
    else:
        return sum(list[0:-1]) + list[-1]
Attempted solution:
T(0) = 4
T(n) = T(n-1) + 1 + c   -- true for all n > 0
T(n) = T(n-1) + 1 + c
     = T(n-2) + 2 + 2c
     = T(n-k) + k + kc   -- (n - k = 0 implies k = n)
T(n) = T(0) + n + nc
     = 4 + n(1 + c)   -- (T(0) is nothing but 4)
Complexity = O(n)
Second:
def binSum(list):
    if len(list) == 1:
        return list[0]
    else:
        return binSum(list[:len(list)//2]) + binSum(list[len(list)//2:])
Any help would be greatly appreciated.
Regards
For the first case, you can model the time complexity with the recursive function T(n) = T(n-1) + O(1), T(0) = O(1), which obviously solves to T(n) = O(n).
Here's a more direct and more formal proof by induction. Let D > 0 be the constant amount of work per call and pick an absolute constant C >= max(T(0), D). Base case: T(0) <= C. Inductive step: supposing T(n) <= C*(n+1), we get T(n+1) <= D + T(n) <= C + C*(n+1) = C*(n+2). Hence T(n) <= C*(n+1) for all n, i.e. T(n) = O(n).
For the second case, you can model the time complexity with T(n) = T(n/2) + T(n/2) + O(1) = 2T(n/2) + O(1) for n > 1, and T(1) = O(1). This solves to T(n) = O(n) by the master theorem (a = 2, b = 2, f(n) = O(1), so the n^{log_2(2)} = n term dominates).
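To see the linear growth directly, here is a sketch that counts the number of binSum calls (with the parameter renamed to lst to avoid shadowing the builtin); a balanced binary recursion over n elements makes exactly 2n - 1 calls:

def binSum(lst, counter):
    # same structure as the original binSum, plus a call counter
    counter[0] += 1
    if len(lst) == 1:
        return lst[0]
    return binSum(lst[:len(lst)//2], counter) + binSum(lst[len(lst)//2:], counter)

for n in [8, 64, 512]:
    counter = [0]
    binSum(list(range(1, n + 1)), counter)
    # prints 2n - 1, i.e. O(n) calls
    print(n, counter[0])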

Recurrence Relation without using Master Theorem

I can easily solve some recurrence relations using the master theorem, but I want to understand how to solve them without using the theorem.
EX:
T(n) = 5T(n/2) + O(n), T(1) = 1
Answer: O(n^{log_2(5)})
Expanding,
T(n) = 5T(n/2) + cn = 5(5T(n/4) + c(n/2)) + cn =
..... = 5^i * T(n/(2^i)) + cn*(1 + (5/2) + (5/2)^2 + ... + (5/2)^(i-1))
Now let i = log_2(n)
then
5^(log_2(n)) * T(1) + cn*(1 + (5/2) + (5/2)^2 + ... + (5/2)^(log_2(n) - 1))
After this I am lost. How do I get something similar to n^{log_2(5)}?
Update:
Using the formula for the sum of a geometric series (Sum = a*(1 - r^n)/(1 - r)),
I get Sum = (1 - (5/2)^{log_2(n)})/(1 - 5/2), so cn*Sum = (2/3)*c*(5^{log_2(n)} - n).
How are 5^{log_2(n)} and n^{log_2(5)} related?
Thanks :D
I did not check the rest of your calculation, but note that
a^b = exp(b * ln(a))
and
log_b(a) = ln(a) / ln(b)
And thus
5^{log_2(n)} = exp(log_2(n) * ln(5)) = exp(ln(n) / ln(2) * ln(5))
and also
n^{log_2(5)} = exp(log_2(5) * ln(n)) = exp(ln(5) / ln(2) * ln(n))
The two exponents are the same product, ln(5) * ln(n) / ln(2), so 5^{log_2(n)} = n^{log_2(5)} exactly.
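A quick numeric confirmation of that identity:

import math

for n in [2, 10, 1000]:
    # the two values agree up to floating-point error
    print(5 ** math.log2(n), n ** math.log2(5))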