What is the time complexity of a recursive algorithm given by the recurrence T(n) = 2T(n-1) + 1?

I am completely new to this, so a step-by-step explanation would help greatly. Thank you.

First, you can start with the iteration method to understand how this behaves.
T(n) = 2T(n-1) + 1
= 2*(2T(n-2) + 1) + 1 = 4T(n-2) + 3
= 4(2T(n-3) + 1) + 3 = 8T(n-3) + 7
= 8*(2T(n-4) + 1) + 7 = 16T(n-4) + 15
= 16*(2T(n-5) + 1) + 15 = 32T(n-5) + 31
Now that we understand the behavior, we can generalize:
T(n) = 2^i * T(n-i) + (2^i - 1)
Now we need the base case (which is not given in the question) and substitute i = n. For example, if T(0) = 0:
T(n) = 2^n * T(0) + (2^n - 1) = 2^n - 1
This is in O(2^n) (in fact Θ(2^n)) when calculating asymptotic complexity.
Note: Iteration method is good and easy to follow - but it is not a formal proof. To formally prove the complexity, you are going to need another tool, such as induction.
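The iteration result is easy to sanity-check by brute force. A minimal Python sketch, assuming the base case T(0) = 0 used above:

```python
def T(n):
    """T(n) = 2*T(n-1) + 1, with the assumed base case T(0) = 0."""
    return 0 if n == 0 else 2 * T(n - 1) + 1

# Compare against the closed form 2^n - 1 for small n
for n in range(15):
    assert T(n) == 2**n - 1
```

A different base case only shifts the 2^n * T(0) term, so the O(2^n) growth is unchanged.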


Is log(n!) equivalent to n log n? (Big O notation)

In my book there is a multiple choice question:
What is the big O notation of the following function:
n^log(2) + log(n^n) + n log(n!)
I know that log(n!) belongs to O(n log n), but I read online that they are equivalent. How is log(n!) the same thing as n log n?
How is
log(n!) = log(n) + log(n-1) + ... + log(2) + log(1)
equivalent to n log n?
Let n/2 be the quotient of the integer division of n by 2. We have:
log(n!) = log(n) + log(n-1) + ... + log(n/2) + log(n/2 - 1) + ... + log(2) + log(1)
>= log(n/2) + log(n/2) + ... + log(n/2) + log(n/2 - 1) + ... + log(2)
>= (n/2)log(n/2) + (n/2)log(2)
= (n/2)(log(n) - log(2) + log(2))
= (n/2)log(n)
then
n log(n) <= 2 log(n!)
so n log(n) = O(log(n!)). Conversely,
log(n!) <= log(n^n) = n log(n)
and log(n!) = O(n log(n)).
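The two bounds can be checked numerically. A small Python sketch, computing the exact sum of logs and comparing it with n log n (the choice n = 10**5 is arbitrary):

```python
import math

def log_factorial(n):
    """log(n!) computed directly as log(n) + log(n-1) + ... + log(1)."""
    return sum(math.log(k) for k in range(1, n + 1))

n = 10**5
ratio = log_factorial(n) / (n * math.log(n))
# The derivation above gives (n/2)*log(n) <= log(n!) <= n*log(n),
# so the ratio must lie between 0.5 and 1.
assert 0.5 < ratio <= 1.0
```

As n grows the ratio approaches 1, which is another way of seeing that the two functions differ only by a constant factor in the big-O sense.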

Solving recurrence relation T(n) = T(n-1) + n

I see from a previous answer to this question that the person gave:
T(n) = T(n-2) + n-1 + n
T(n) = T(n-3) + n-2 + n-1 + n
T(n) = T(n-k) +kn - k(k-1)/2
I don't completely understand the third line. I can see they may have derived it from the arithmetic series formula n(n+1)/2, but how did they get kn and the minus sign in front of k(k-1)/2?
starting from:
T(n) = T(n-2) + n-1 + n
we may rewrite it as follows:
T(n) = T(n-2) + 2n - 1
The second formula says:
T(n) = T(n-3)+n-2+n-1+n
let us convert it the same way we do with the first one:
T(n) = T(n-3)+n+n+n-2-1
T(n) = T(n-3)+3n-2-1
By expanding more terms, we notice that the number subtracted from n in the recursive term T(n-k) is always the same as the number multiplied by n, so we may rewrite it as follows:
T(n) = T(n-k)+kn+...
We also notice that -2 - 1 is the arithmetic series, negated, starting from k-1. The sum 1 + 2 + ... + (k-1) is k(k-1)/2, just like n(n+1)/2. So the relation would be
T(n) = T(n-k)+kn-(k-1)*k/2 or T(n) = T(n-k)+kn-k*(k-1)/2
Hope this helps ;)
The k(k-1)/2 term is just the sum of the numbers 0 to k-1. You can see why you need to subtract it from the following calculation:
T(n) =
T(n-k) + n + (n-1) + (n-2) + ... + (n-(k-1)) =
T(n-k) + (n-0) + (n-1) + (n-2) + ... + (n-(k-1)) =
T(n-k) + n + n + n + ... + n - 0 - 1 - 2 ... - (k-1) =
T(n-k) + kn - (0 + 1 + 2 + ... + (k-1)) =
T(n-k) + kn - k*(k-1)/2
If you look closly:
T(n) = T(n-2) + n-1 + n = T(n-2) + 2n -1
T(n)= T(n-3) + n-2 + n-1 + n = T(n-3)+ 3n -(2+1)
.
.
.
T(n) = T(n-k) + n-(k-1) + n-(k-2) + ... + n = T(n-k) + k*n + (-1 - 2 - ... - (k-2) - (k-1)) = T(n-k) + kn - k(k-1)/2
You can prove it formally by induction.
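The partial-expansion formula can also be verified directly. A quick Python sketch, assuming the base case T(0) = 0 (not stated in the question):

```python
def T(n):
    """T(n) = T(n-1) + n, with the assumed base case T(0) = 0."""
    return 0 if n == 0 else T(n - 1) + n

n = 20
# The expansion T(n) = T(n-k) + k*n - k*(k-1)/2 holds for every k
for k in range(1, n + 1):
    assert T(n) == T(n - k) + k * n - k * (k - 1) // 2

# Taking k = n collapses it to the closed form n*(n+1)/2
assert T(n) == n * (n + 1) // 2
```

With a constant base case, the closed form shows the recurrence is Θ(n^2).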

Time complexity of recursive function inside for loop

If we have a function:
int x = 0;
int fun(int n)
{
    if (n == 0)
        return 1;
    for (int i = 0; i < n; i++)
        x += fun(i);
    return x; // the original snippet was missing a return value on this path
}
According to me, the time complexity can be calculated as:
T(n) = T(n-1) + T(n-2) +...+ T(0).
T(n) = nT(n-1).
T(n) = O(n!).
Am I correct?
1. T(n) = T(n-1) + T(n-2) + T(n-3) + .... + T(0)
// Replace n with n-1
2. T(n-1) = T(n-2) + T(n-3) + .... + T(0)
Replace T(n-2) + T(n-3) + ... + T(0) with T(n-1) in the 1st equation:
T(n) = T(n-1) + T(n-1)
= 2 * T(n-1)
= 2 * 2 * T(n-2) // Using T(n-1) = 2 * T(n-2)
= 2^n * T(n-n)
= 2^n * T(0) // Consider T(0) = 1
= 2^n
If you're measuring the number of function calls (or additions -- it turns out the same), the correct recurrence relations are:
T(0) = 0
T(n) = T(0) + T(1) + T(2) + ... + T(n-1) + n
You can compute the first few values:
T(0) = 0
T(1) = 1
T(2) = 3
T(3) = 7
T(4) = 15
You can guess from this that T(n) = 2^n - 1, and that's easy to check by a proof by induction.
In some sense you are right that T(n) = O(n!) since n! > 2^n for n>3, but T(n) = O(2^n) is a tighter bound.
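Both answers can be cross-checked by evaluating the recurrence directly. A small Python sketch of T(n) = T(0) + T(1) + ... + T(n-1) + n with T(0) = 0:

```python
def T(n):
    """Number of recursive calls triggered by fun(n), per the recurrence above."""
    if n == 0:
        return 0
    # fun(n) makes n direct calls fun(0)..fun(n-1), each triggering its own calls
    return n + sum(T(i) for i in range(n))

# Matches the guessed closed form 2^n - 1
for n in range(12):
    assert T(n) == 2**n - 1
</```
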

The space complexity of Fibonacci Sequence

I saw a textbook discussion of the worst-case space complexity of computing the Fibonacci sequence recursively. However, I have the following question:
You can start with a concrete example and generalize. Start with n = 5.
S(5) = S(4) + c
= (S(3) + c) + c
= ((S(2) + c) + c) + c
= (((S(1) + c) + c) + c) + c
= S(1) + 4c
There are 4 c's when n = 5. In general, there are n-1 c's, so S(n) = S(1) + (n-1)c = O(n): the space used is linear in the recursion depth.
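The depth count can be observed by instrumenting the naive recursion. A Python sketch (fib_depth is a hypothetical helper that returns the deepest stack level reached, counting the initial call as level 1):

```python
def fib_depth(n, level=1):
    """Return (fib(n), deepest recursion level reached), starting at level 1."""
    if n <= 1:
        return n, level
    a, d1 = fib_depth(n - 1, level + 1)
    b, d2 = fib_depth(n - 2, level + 1)
    return a + b, max(d1, d2)

# The deepest chain is n -> n-1 -> ... -> 1: the first frame plus n-1
# nested ones, matching the n-1 c's in the expansion above.
value, depth = fib_depth(5)
assert (value, depth) == (5, 5)
```

Even though the naive algorithm makes exponentially many calls, only one chain of frames is live at a time, which is why the space is O(n) rather than O(2^n).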

time complexity of nested for loop with j<=i condition

I had this question on an assignment.
Determine the time-complexity of the nested loop
for (int i = 1; i <= n; i = 2*i) {
    for (int j = 1; j <= i; j = 2*j) {
        stuff
    }
}
I understand that with i and j each doubling, the complexity would be something along the lines of log2(n) * log2(n), but with the inner loop running to i rather than n, I'm completely lost.
I need to know the complexity of the nested loop and a step-by-step explanation of how it is solved.
The inner loop runs log(i) + 1 times (log base 2).
Adding the outer-loop, sum the above for i = 1, 2, 4, ... n.
So: (log(1) + 1) + (log(2) + 1) + (log(4) + 1) + ... + (log(n) + 1)
which is: 1 + 2 + 3 + ... + (log(n) + 1)
using the sum of an arithmetic series: (log(n) + 1) * (log(n) + 2) / 2 = (log(n)*log(n) + 3log(n) + 2) / 2 = O(log(n) * log(n))
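The closed form is easy to confirm by counting iterations directly. A Python sketch of the same doubling loops (with the inner update taken as j = 2*j):

```python
def count_iterations(n):
    """Count how many times the inner loop body ('stuff') runs."""
    count = 0
    i = 1
    while i <= n:
        j = 1
        while j <= i:
            count += 1
            j *= 2
        i *= 2
    return count

# For n = 2^k the formula gives (k+1)*(k+2)/2; e.g. n = 16, k = 4 -> 15
assert count_iterations(16) == 15
```

Doubling n only adds one more outer iteration, so the count grows quadratically in log(n), exactly as the arithmetic-series sum predicts.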
Let's assume n = 16 for example, so i takes the values i = 1, 2, 4, 8, 16.
So the outer loop runs log(n) + 1 times, i.e. log(16) + 1 = 5 iterations.
Now for the inner loop: it runs log(i) + 1 times for each outer iteration, so the total is (log(1)+1) + (log(2)+1) + (log(4)+1) + (log(8)+1) + (log(16)+1).
Combining the two statements above, the time complexity of the code is O(log(n) * log(n)).
This is my understanding of the code.