Recurrence of n-way merge sort - time-complexity

For a regular 2-way merge sort, I know that the recurrence relation is
T(n) = 2T(n/2) + O(n)
and for 3-way, it is
T(n) = 3T(n/3) + O(n)
So theoretically, if I decide to split the array into n parts (one element each), would the recurrence relation be
T(n) = nT(n/n) + O(n)
meaning that the time complexity would be O(n^2)?
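
For what it's worth, the O(n^2) in this conjecture lives in the merge step rather than in the recurrence as written (n·T(1) + O(n) alone is just O(n)): merging n one-element runs by repeatedly scanning every run for the smallest head costs O(n) per output element, i.e. O(n^2) total, essentially selection sort. Below is a minimal sketch of that naive n-way merge (the class and method names are mine, not from any of the posts):

import java.util.*;

public class NWayMerge {
    // Naive k-way merge: repeatedly scan every run for the smallest head.
    // With k = n runs of one element each this is Theta(n^2) overall,
    // which is where the conjectured O(n^2) actually comes from.
    static int[] mergeRuns(List<Deque<Integer>> runs, int total) {
        int[] out = new int[total];
        for (int i = 0; i < total; i++) {
            Deque<Integer> best = null;
            for (Deque<Integer> run : runs) {          // O(k) scan per output element
                if (!run.isEmpty() && (best == null || run.peekFirst() < best.peekFirst()))
                    best = run;
            }
            out[i] = best.pollFirst();
        }
        return out;
    }

    public static void main(String[] args) {
        List<Deque<Integer>> runs = new ArrayList<>();
        for (int v : new int[]{5, 2, 9, 1})            // n runs of one element each
            runs.add(new ArrayDeque<>(List.of(v)));
        System.out.println(Arrays.toString(mergeRuns(runs, 4))); // [1, 2, 5, 9]
    }
}

(With a min-heap over the run heads the merge drops to O(n log k), which is why k-way merge sort stays O(n log n).)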

Related

Quicksort time complexity analysis (Analysis of recurrence equation)

Quicksort's recurrence equation is
T(n) = T(n/2) + T(n/2) + Θ(n)
if the pivot always divides the original array into two same-sized subarrays,
so the overall time complexity would be O(n log n).
But what if the ratio of the two sublists is always 1:99?
The equation would then be T(n) = T(n/100) + T(99n/100) + Θ(n).
But how can I derive the time complexity from that equation?
I've read another answer which says we can ignore T(n/100), since T(99n/100) will dominate the overall time complexity,
but I can't quite fully understand why.
Any advice would be appreciated!
Plug in the guess T(n) = n log(n) + Θ(n) and you get
n log(n) + Θ(n) = (n/100) log(n/100) + (99n/100) log(99n/100) + Θ(n/100) + Θ(99n/100) + Θ(n)
= n log(n)/100 + 99n log(n)/100 - (n/100) log(100) + (99n/100) log(99/100) + Θ(n)
= n log(n) + Θ(n)
(both correction terms fold into the Θ(n), since log(100) and log(99/100) are constants)
In fact any constant fraction will work:
T(n) = T(pn) + T((1-p)n) + Θ(n)
is solved by T(n) = O(n log(n)) for any fixed 0 < p < 1.
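
If the algebra feels abstract, you can also check it numerically: iterate the recurrence directly and watch T(n) / (n log2 n) settle toward a constant. A small sketch (the base case T(n) = 0 for n < 1 and the class name are my own choices):

public class LopsidedSplit {
    // T(n) = T(n/100) + T(99n/100) + n, with T(n) = 0 below the cutoff.
    static double T(double n) {
        if (n < 1) return 0;
        return T(n / 100) + T(99 * n / 100) + n;
    }

    public static void main(String[] args) {
        // If T(n) = Theta(n log n), this ratio levels off at a constant;
        // a quadratic T(n) would make it grow without bound.
        for (double n = 1e3; n <= 1e7; n *= 10) {
            System.out.printf("n = %.0e   T(n) / (n log2 n) = %.2f%n",
                    n, T(n) / (n * (Math.log(n) / Math.log(2))));
        }
    }
}

The constant it converges to is large (roughly 12 for a 1:99 split), which is why a lopsided pivot makes quicksort slower by a constant factor but does not change the O(n log n) growth.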

Time Complexity - T(n) = T(9n/10) + O(n)

What will be the time complexity of this equation?
Using the Master Theorem, I am getting the answer O(n), via the a < b^k case.
But the correct answer is O(n log n).
How?
Using the Master Theorem for an equation of the form
T(n) = a T(n/b) + f(n)
you should first calculate the value c = log_b(a).
In this case we have:
a = 1, b = 10/9
so the value of c will be:
c = log_{10/9}(1) = 0
which means the comparison term is n^c = n^0 = 1.
Now it depends on f(n) to choose the right case of the Master theorem. It can be case 2 or 3 depending on f(n). If f(n) is a constant, then according to case 2, T(n) = O(log n); and if f(n) is a polynomial in n (such as the O(n) given here), then according to case 3, T(n) = Θ(f(n)) = O(n).
Using the recursive method, I also got O(n). How do you know it's O(nlogn)?
It is O(n).
Look at the recursion tree (ignoring the constant factor the O(n) term should have): each node of size m does m work and recurses on a single subproblem of size (9/10)m. The non-recursive version is therefore the sum of the per-node work down the tree, or
T(n) = n + (9/10) n + (9/10)^2 n + (9/10)^3 n + ...
which reduces to
T(n) = n * (1 + (9/10) + (9/10)^2 + (9/10)^3 + ...)
which means T(n) is some constant times n: the geometric series sums to 1/(1 - 9/10) = 10, so T(n) <= 10n, and in any case it is O(n) asymptotically.
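
That geometric-series argument is easy to verify numerically; a minimal sketch, assuming a base case of T(n) = 0 once n drops below 1:

public class DecayingRecurrence {
    // T(n) = T(9n/10) + n, with T(n) = 0 once n drops below 1.
    static double T(double n) {
        if (n < 1) return 0;
        return T(0.9 * n) + n;
    }

    public static void main(String[] args) {
        // The series 1 + 9/10 + (9/10)^2 + ... sums to 10,
        // so T(n)/n approaches 10: linear, not O(n log n).
        for (double n = 10; n <= 1e8; n *= 100) {
            System.out.printf("n = %.0e   T(n)/n = %.3f%n", n, T(n) / n);
        }
    }
}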

Number of Arithmetic Operations Performed By a Fibonacci Function

For this problem, we are given a function f which calculates the nth Fibonacci number in the standard way:
int f(int n)
{
    if (n == 0)
        return 0;
    if (n == 1)
        return 1;
    else
        return f(n - 1) + f(n - 2);
}
I am then tasked with calculating the number of arithmetic operations performed by the function on an input n. I know that each call to f performs 3 operations (2 subtractions, 1 addition), and I've attempted to find the count by expanding the function into some sort of recurrence relation and working out how many arithmetic operations get done. But then I get stuck, because this doesn't come out as the right answer.
The correct answer is that f performs 3F(n+1) - 3 arithmetic operations on an input n, where F(n+1) is the (n+1)st Fibonacci number. Please could someone explain this to me? It is important, as we are later asked to find the time complexity using this as a starting point.
Thanks very much,
Niamh
The time complexity of your algorithm is close to 2^n, hence it is O(2^n).
Your algorithm does T(n) = T(n-1) + T(n-2) + O(1), which expands as:
T(n-1) = T(n-2) + T(n-3) + O(1)
T(n-2) = T(n-3) + T(n-4) + O(1)
.
.
.
T(2) = T(1) + T(0) + O(1)
Have a look at Fibonacci time complexity
T(n) = T(n-1) + T(n-2) + O(1), which is bounded above by
T(n) <= 2 T(n-1) + O(1) = O(2^n)
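
You can confirm the closed form 3F(n+1) - 3 from the question (and see the exponential blow-up directly) by instrumenting the function with an operation counter; the counter and the helper fib below are my additions, not part of the original exercise:

public class FibOps {
    static long ops = 0;

    static long f(int n) {
        if (n == 0) return 0;
        if (n == 1) return 1;
        ops += 3;                 // two subtractions + one addition per non-base call
        return f(n - 1) + f(n - 2);
    }

    static long fib(int n) {      // iterative Fibonacci, used only to check the formula
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) { long t = a + b; a = b; b = t; }
        return a;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {
            ops = 0;
            f(n);
            // ops == 3 * F(n+1) - 3 for every n, matching the claimed count.
            System.out.printf("n=%2d  ops=%4d  3*F(n+1)-3=%4d%n",
                    n, ops, 3 * fib(n + 1) - 3);
        }
    }
}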
As an aside, the iterative version performs only a linear number of additions. For example, let n = 4, start with variables a, b = 0, c = 1, and run for (i = 3; i <= n; i++):
[a = b + c; b = c; c = a;] ......... (1) // now a = 1; b = 1; c = 1
[a = b + c; b = c; c = a;] ......... (2) // now a = 2; b = 1; c = 2
output: 0 1 1 2
Each pass of the loop performs one addition, so printing the first n Fibonacci numbers performs n - 2 additions in total, whether n is odd or even. Please correct me if I am wrong!
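
Here is that count made concrete with an explicit addition counter (a sketch; the variable names follow the answer above):

public class IterativeFib {
    public static void main(String[] args) {
        int n = 6;                        // print the first n Fibonacci numbers
        long b = 0, c = 1, additions = 0;
        System.out.print(b + " " + c);
        for (int i = 3; i <= n; i++) {    // same loop bounds as above
            long a = b + c;               // one addition per printed number
            b = c; c = a;
            additions++;
            System.out.print(" " + a);
        }
        // prints "0 1 1 2 3 5" and additions = 4, i.e. n - 2.
        System.out.println("\nadditions = " + additions);
    }
}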

How is this algorithm O(n)?

Working through the recurrence, you can derive that the running time of this function satisfies T(n) = 2T(n/2) + O(1),
and the height of the recursion tree would be log2(n), with about 2n total calls (i.e. nodes in the tree).
My instructor said that this function has a time complexity of O(n), but I simply cannot see why.
Further, when you substitute O(n) into the time complexity equation there are strange results. For example,
T(n) <= cn
T(n/2) <= (cn)/2
Back into the original equation:
T(n) <= 2(cn/2) + 1 = cn + 1
where this is obviously not true, because cn + 1 is not <= cn.
Your instructor is correct. This is an application of the Master theorem.
You can't substitute O(n) like you did into the time complexity equation; a correct substitution would be a polynomial form like an + b, since O(n) only constrains the highest-order term (there can be lower-order terms with their own constants).
To expand on the answer: you correctly recognize a time complexity equation of the form
T(n) = a T(n/b) + f(n), with a = 2, b = 2 and f(n) = O(1).
With this type of equation, you have three cases, depending on how n^{log_b(a)} (the cost of the recursion) compares with f(n) (the cost of the non-recursive work per call):
1° f(n) dominates the recursion (f(n) = Ω(n^{log_b(a)+ε}) for some ε > 0), for instance a = 2, b = 2 and f(n) = Θ(n^16). Then the recursion is of negligible complexity and the total time complexity is that of f(n):
T(n) = Θ(f(n))
2° The recursion dominates f(n) (f(n) = O(n^{log_b(a)-ε})), which is the case here. Then the complexity is Θ(n^{log_b(a)}), in your example Θ(n^{log_2(2)}) = Θ(n^1), i.e. O(n).
3° The critical case where f(n) matches the recursion, i.e. there exists k >= 0 such that f(n) = Θ(n^{log_b(a)} log^k(n)); then the complexity is:
T(n) = Θ(n^{log_b(a)} log^{k+1}(n))
This is the ugly case in my opinion.
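
Since the original function isn't shown in the question, here is a hypothetical stand-in with exactly the recurrence T(n) = 2T(n/2) + O(1); counting the calls shows the linear growth the Master theorem predicts:

public class CallCounter {
    static long calls = 0;

    // Constant work per call, two recursive calls on halves:
    // T(n) = 2T(n/2) + O(1).
    static void visit(int n) {
        calls++;
        if (n <= 1) return;
        visit(n / 2);
        visit(n - n / 2);
    }

    public static void main(String[] args) {
        for (int n = 1_000; n <= 1_000_000; n *= 10) {
            calls = 0;
            visit(n);
            // calls is about 2n - 1: ten times the input, ten times the work.
            System.out.printf("n=%,d  calls=%,d  calls/n=%.2f%n",
                    n, calls, (double) calls / n);
        }
    }
}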

Can anyone help understanding a recurrence relation?

I'm learning about time complexity and have this function:
public static double pow(double x, int n) {
    if (n == 0) return 1.0;
    return x * pow(x, n - 1);
}
I'm tasked with finding the recurrence relation for its time complexity, and I know the answer is T(n) = T(n-1) + O(1), which I don't understand, because I think it should be T(n) = T(n-1) + O(n). My reasoning is that I multiply by x in each recursive call, which happens n times, so that's O(n), right?
*edit: I think I understand my mistake. T(n) = T(n-1) + O(1) counts only the work of that specific call, which is clearly O(1); unrolling the recurrence gives T(n) = T(0) + n * O(1), so the order of growth is O(n).
You don't need to consider every multiplication. Recurrence relations are about how the call for one number relates to the same call for a different number. In this case, calculating pow(x,n) requires that you already know pow(x,n-1) - that's the T(n) = T(n-1) part. Now, what do you additionally need to do to calculate pow(x,n), given that you already have pow(x,n-1)? That's only a single multiplication - x*pow(x,n-1) - which is O(1), so the total recurrence relation is T(n) = T(n-1) + O(1). All the other multiplications are already accounted for in T(n-1)...
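
To see the "one extra multiplication per call" point concretely, you can instrument the function with a counter (the mults field is my addition):

public class PowCount {
    static int mults = 0;

    public static double pow(double x, int n) {
        if (n == 0) return 1.0;
        mults++;                  // exactly one multiplication per recursive call
        return x * pow(x, n - 1);
    }

    public static void main(String[] args) {
        System.out.println(pow(2.0, 10));  // 1024.0
        System.out.println(mults);         // 10: n calls, one multiplication each
    }
}

n calls, one multiplication each: T(n) = T(n-1) + O(1), which solves to O(n).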