Solving recurrence relation T(n) = T(n-1) + n - iteration

I saw in a previous answer to this question that the person gave:
T(n) = T(n-2) + n-1 + n
T(n) = T(n-3) + n-2 + n-1 + n
T(n) = T(n-k) + kn - k(k-1)/2
I don't completely understand the third line. I can see they may have derived it from the arithmetic series formula n(n+1)/2, but how did they get kn and the minus sign in front of k(k-1)/2?

Starting from:
T(n) = T(n-2) + n-1 + n
we may rewrite it as follows:
T(n) = T(n-2) + 2n - 1
The second formula says:
T(n) = T(n-3) + n-2 + n-1 + n
Let us convert it the same way we did with the first one:
T(n) = T(n-3) + n + n + n - 2 - 1
T(n) = T(n-3) + 3n - 2 - 1
By expanding more terms, we notice that the number subtracted from n in the recursive term (here T(n-3)) is always the same as the number multiplied by n. We may rewrite it as follows:
T(n) = T(n-k) + kn + ...
We also notice that -2 - 1 is the arithmetic series, negated and starting from k-1. The sum of 1 through k-1 is (k-1)*k/2, just like n(n+1)/2. So the relation is:
T(n) = T(n-k) + kn - (k-1)*k/2, or T(n) = T(n-k) + kn - k*(k-1)/2
Hope this helps ;)

The k(k-1)/2 term is just the sum of the numbers 0 to k-1. You can see why you need to subtract it from the following calculation:
T(n) =
T(n-k) + n + (n-1) + (n-2) + ... + (n-(k-1)) =
T(n-k) + (n-0) + (n-1) + (n-2) + ... + (n-(k-1)) =
T(n-k) + n + n + n + ... + n - 0 - 1 - 2 ... - (k-1) =
T(n-k) + kn - (0 + 1 + 2 + ... + (k-1)) =
T(n-k) + kn - k*(k-1)/2
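
If you want to convince yourself numerically, here is a minimal Python sketch (assuming a base case T(0) = 0, which the question leaves unspecified; the identity holds for any base case):

def T(n):
    # T(n) = T(n-1) + n, with the assumed base case T(0) = 0
    return 0 if n == 0 else T(n - 1) + n

def unrolled(n, k):
    # the partially expanded form: T(n-k) + k*n - k*(k-1)/2
    return T(n - k) + k * n - k * (k - 1) // 2

for n in range(1, 10):
    for k in range(1, n + 1):
        assert T(n) == unrolled(n, k)  # both forms agree for every valid k
print("T(n) = T(n-k) + kn - k(k-1)/2 verified for n < 10")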

If you look closely:
T(n) = T(n-2) + n-1 + n = T(n-2) + 2n - 1
T(n) = T(n-3) + n-2 + n-1 + n = T(n-3) + 3n - (2+1)
.
.
.
T(n) = T(n-k) + n-(k-1) + n-(k-2) + ... + n = T(n-k) + k*n + (-1 - 2 - ... - (k-2) - (k-1)) = T(n-k) + kn - k(k-1)/2
You can prove it formally by induction.

Related

How could it be true that T(n) = Θ(1) + 2T(n/2) + Θ(n) + Θ(1) = 2T(n/2) + Θ(n)?

I have a question about the time complexity of this pseudocode:
[pseudocode image from the original post, not reproduced here]
The analysis is as follows:
T(n) = Θ(1) + 2T(n/2) + Θ(n) + Θ(1)
= 2T(n/2) + Θ(n)
Can you explain why the two Θ(1) terms are dropped in the second equation? Is it because they are negligible compared to Θ(n)?
They are constants: they do not grow with n, so Θ(1) + Θ(n) = Θ(n) and the constant terms are absorbed.
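
To see it concretely, here is a tiny numeric illustration (a sketch only; the constants 5 and 7 are arbitrary stand-ins for the two Θ(1) costs):

def cost(n, a=5, b=7):
    # a and b play the role of the two Θ(1) terms; n plays the role of the Θ(n) term
    return a + n + b

for n in [10, 1000, 1000000]:
    print(n, cost(n) / n)  # the ratio tends to 1, so the constants vanish relative to n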

What will be the complexity of the recurrence relation: T(n) = 0.5T(n/2) + 1/n

How do I calculate the complexity in this case, since the master method fails for a < 1?
Let's solve the recurrence relation below:
T(n) = (1/2) * T(n/2) + (1/n)
T(n/2) = (1/2) * T(n/4) + (1/(n/2)) = (1/2) * T(n/4) + (2/n)
Therefore:
T(n) = (1/4) * T(n/4) + (1/n) + (1/n)
Notice the pattern. In general:
T(n) = (1/2^k) * T(n/2^k) + (1/n)*k
2^k = n --> k = log(n)
T(n) = (1/n) * T(1) + (1/n)*log(n)
T(n) = (log(n) + c)/n, where c = T(1) is a constant
T(n) ~ log(n)/n
Notice that as n goes to infinity, the total amount of work goes to zero.
Recall the formal definition of Big-Oh notation:
f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k.
T(n) = O(1) since 0 <= T(n) <= c*1 for all n >= k for some k>0 and c>0.
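
As a sanity check on the algebra, here is a short numeric experiment (a minimal sketch, assuming the base case T(1) = 1, which the question does not specify):

import math

def T(n):
    # T(n) = 0.5 * T(n/2) + 1/n, with the assumed base case T(1) = 1
    return 1.0 if n <= 1 else 0.5 * T(n // 2) + 1.0 / n

for n in [2**5, 2**10, 2**20]:
    print(n, T(n), math.log2(n) / n)  # T(n) tracks log(n)/n and shrinks toward 0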

What is the time complexity of a recursive algorithm given by the recurrence T(n) = 2T(n-1) + 1

I am completely new to this, so a step-by-step explanation would help greatly. Thank you.
First, you can start with the iteration method to understand how the recurrence behaves.
T(n) = 2T(n-1) + 1 =
= 2*(2T(n-2) + 1) + 1 = 4T(n-2) + 3
= 4(2T(n-3) + 1) + 3 = 8T(n-3) + 7
= 8*(2T(n-4) + 1) + 7 = 16T(n-4) + 15
= 16*(2T(n-5) + 1) + 15 = 32T(n-5) + 31
Now that we understand the behavior, we can tell that
T(n) = 2^i * T(n-i) + (2^i - 1)
Now we need to use the base case (which is not given in the question) and substitute i = n. For example, if T(0) = 0:
T(n) = 2^n * T(0) + (2^n - 1) = 2^n - 1
This is O(2^n) when calculating the asymptotic complexity.
Note: the iteration method is good and easy to follow, but it is not a formal proof. To formally prove the complexity, you will need another tool, such as induction.
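
To double-check the closed form against the recurrence, a few lines of code suffice (a minimal sketch, again assuming T(0) = 0):

def T(n):
    # T(n) = 2 * T(n-1) + 1, with the assumed base case T(0) = 0
    return 0 if n == 0 else 2 * T(n - 1) + 1

for n in range(15):
    assert T(n) == 2**n - 1  # matches the closed form derived above
print("T(n) = 2^n - 1 holds for n < 15")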

Is log(n!) equivalent to n log n? (Big O notation)

In my book there is a multiple choice question:
What is the big O notation of the following function:
n^log(2) + log(n^n) + n*log(n!)
I know that log(n!) belongs to O(n log n), but I read online that they are equivalent. How is log(n!) the same thing as n log n?
How is:
log(n!) = log(n) + log(n-1) + ... + log(2) + log(1)
equivalent to n log n?
Let n/2 be the quotient of the integer division of n by 2. We have:
log(n!) = log(n) + log(n-1) + ... + log(n/2) + log(n/2 - 1) + ... + log(2) + log(1)
>= log(n/2) + log(n/2) + ... + log(n/2) + log(n/2 - 1) + ... + log(2)
>= (n/2)log(n/2) + (n/2)log(2)
>= (n/2)(log(n) - log(2) + log(2))
= (n/2)log(n)
then
n log(n) <= 2 log(n!)
so n log(n) = O(log(n!)). Conversely,
log(n!) <= log(n^n) = n log(n)
and log(n!) = O(n log(n)).
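
You can also watch the two quantities track each other numerically. This sketch uses math.lgamma, which returns the natural log of the gamma function, so lgamma(n + 1) equals log(n!):

import math

for n in [10, 100, 10**4, 10**6]:
    log_fact = math.lgamma(n + 1)  # ln(n!)
    n_log_n = n * math.log(n)      # natural log here too, so the bases match
    print(n, log_fact / n_log_n)   # the ratio climbs toward 1, consistent with Θ(n log n)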

Time complexity of recursive function inside for loop

If we have a function:
int x = 0;              /* global accumulator */

int fun(int n)
{
    if (n == 0)
        return 1;       /* base case */
    for (int i = 0; i < n; i++)
        x += fun(i);    /* n recursive calls: fun(0), fun(1), ..., fun(n-1) */
    return x;           /* the original fell off the end here, which is undefined behavior */
}
I think the time complexity can be calculated as:
T(n) = T(n-1) + T(n-2) +...+ T(0).
T(n) = nT(n-1).
T(n) = O(n!).
Am I correct?
1. T(n) = T(n-1) + T(n-2) + T(n-3) + .... + T(0)
// Replace n with n-1
2. T(n-1) = T(n-2) + T(n-3) + .... + T(0)
Replace T(n-2) + T(n-3) + ... + T(0) with T(n-1) in the 1st equation:
T(n) = T(n-1) + T(n-1)
= 2 * T(n-1)
= 2 * 2 * T(n-2) // Using T(n-1) = 2 * T(n-2)
= 2^k * T(n-k) // After k substitutions of the same step
= 2^n * T(0) // Taking k = n; consider T(0) = 1
= 2^n
If you're measuring the number of function calls (or additions -- it turns out the same), the correct recurrence relations are:
T(0) = 0
T(n) = T(0) + T(1) + T(2) + ... + T(n-1) + n
You can compute the first few values:
T(0) = 0
T(1) = 1
T(2) = 3
T(3) = 7
T(4) = 15
You can guess from this that T(n) = 2^n - 1, and that's easy to check by a proof by induction.
In some sense you are right that T(n) = O(n!) since n! > 2^n for n>3, but T(n) = O(2^n) is a tighter bound.
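
Here is a quick check of both the recurrence and the guessed closed form (a minimal Python sketch; memoization is omitted since n stays small):

def T(n):
    # T(0) = 0; T(n) = T(0) + T(1) + ... + T(n-1) + n
    return 0 if n == 0 else sum(T(i) for i in range(n)) + n

for n in range(12):
    assert T(n) == 2**n - 1  # the guess T(n) = 2^n - 1 checks out
print("closed form verified for n < 12")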