Applying Master Theorem

I am trying to study for my exams by looking at my midterm. One thing I do not fully understand is the Master Theorem. I understand that there are three cases, and I can apply them when a recurrence is in this form:
T(n) = 25T(n/5) + n^2
but my professor likes to give some in this form:
T(n) = n + 2                                    if n = 0, 1, 2, 3
T(n) = 4T(n-1) - 6T(n-2) + 4T(n-3) - T(n-4)     otherwise
So I am confused: is there a different way to apply the Master Theorem, or am I meant to somehow change this into the format I understand?

For the first recurrence, n^(log_5 25) = n^2, which matches the non-recursive part f(n) = n^2. That is case 2 of the Master Theorem, so we multiply by log n: the solution is Θ(n^2 log n).
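To spell that out in the standard master-theorem notation (a sketch, with a = 25, b = 5, f(n) = n^2 read off the recurrence above):

\[
n^{\log_b a} = n^{\log_5 25} = n^2 = \Theta(f(n))
\quad\Longrightarrow\quad
T(n) = \Theta\!\left(n^{\log_b a}\,\log n\right) = \Theta\!\left(n^2 \log n\right).
\]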

Related

Solving recurrence T(n) = T(n - 1) + n^2 with the substitution method

I am trying to solve a recurrence using the substitution method (I am asking for help/confirmation about this method, so please don't answer with a solution developed by the iteration method or anything else).
I want to point out that I have just started studying recurrence relations, so I am asking for confirmation of my work.
I guess that T(n) = O(n^3), so:
Then I continued like this:
From here on is the part I am not sure I did right:
Now, I am not sure if I did that right.
I know that this recurrence is O(n^3), but I want to know whether my proof is correct.
I want to say that this isn't homework or anything like that; it's a simple exercise that came to mind.
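Since the worked steps themselves are not reproduced above, here is only a sketch of how the inductive (substitution) step for T(n) = T(n-1) + n^2 is usually argued, assuming a guess of the form T(n) <= c*n^3 for some constant c > 0:

\[
T(n) = T(n-1) + n^2 \le c(n-1)^3 + n^2 = cn^3 - (3c-1)n^2 + 3cn - c \le cn^3,
\]

where the last inequality needs (3c-1)n^2 >= 3cn - c; with c = 1 this is 2n^2 >= 3n - 1, which holds for all n >= 1. The base cases then still have to be checked against the recurrence's initial values (possibly by enlarging c).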

Time complexity of constructing a BST using pre-order

I tried to write a program to construct a binary search tree using the pre-order sequence. I know there are many solutions: the min/max algorithm, the classical (or "obvious") recursion, or even iteration rather than recursion.
I tried to implement the classical recursion: the first element of the pre-order traversal is the root. Then I search for all elements which are less than the root. All these elements will be part of the left subtree, and the other values will be part of the right subtree. I repeat that until I construct all subtrees. It's a very classical approach.
Here is my code:
// Note: despite the names, the array passed in is the pre-order sequence.
public static TreeNode constructInOrderTree(int[] inorder) {
    return constructInOrderTree(inorder, 0, inorder.length - 1);
}

private static TreeNode constructInOrderTree(int[] inorder, int start, int end) {
    if (start > end) {
        return null;
    }
    // The first element of the (sub)sequence is the root.
    int rootValue = inorder[start];
    TreeNode root = new TreeNode(rootValue);
    // Find k, the index of the last element of the left subtree:
    // the last position in [start+1, end] whose value is <= rootValue.
    int k = start;
    for (int i = start + 1; i <= end; i++) {
        if (inorder[i] <= rootValue) {
            k = i;
        }
    }
    root.left = constructInOrderTree(inorder, start + 1, k);
    root.right = constructInOrderTree(inorder, k + 1, end);
    return root;
}
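For reference, the TreeNode type isn't shown above; a minimal sketch of what it is assumed to look like, together with an example call:

// Assumed minimal node type: an int value plus left/right children.
class TreeNode {
    int val;
    TreeNode left, right;
    TreeNode(int val) { this.val = val; }
}

// Example usage with the pre-order sequence of a BST:
// int[] preorder = {8, 3, 1, 6, 10, 14};
// TreeNode root = constructInOrderTree(preorder);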
My question is: What is the time complexity of this algorithm? Is it O(n^2) or O(n * log(n))?
I searched here on Stack Overflow but found many contradictory answers. Sometimes someone says it is O(n^2), sometimes O(n*log(n)), and I got really confused.
Can we apply the master theorem here? If yes, "perhaps" we can consider that each time we divide the tree into two subtrees (of equal size), so we would have the relation (O(n) is the complexity of searching in the array):
T(n) = 2T(n/2) + O(n)
That would give a complexity of O(n*log(n)). But I don't think that's really true: we don't divide the tree into equal parts, because we search in the array until we find the appropriate elements, no?
Is it possible to apply the master theorem here?
Forethoughts:
No, it is neither O(n^2) nor O(n log n) in the worst case (WC), because of the nature of trees and the fact that you don't perform any complex action on each element. All you do is output it, in contrast to sorting it with some comparison algorithm.
The WC would then be O(n).
That is when the tree is skewed, i.e. one of the root's subtrees is empty. Then you essentially have a simple linked list, and to output it you must visit each element at least once, giving O(n).
Proof:
Let's assume the right subtree is empty and the per-call effort is constant (only printing out). Then
T(n) = T(n-1) + T(0) + c
T(n) = T(n-2) + 2T(0) + 2c
...
T(n) = nT(0) + nc = n(T(0) + c)
Since T(0) and c are constants, you end up with O(n).
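As an illustration of the assumption in that proof (constant per-call effort, i.e. only printing), a minimal traversal sketch using the TreeNode type sketched earlier: it visits every node exactly once, so it runs in O(n) even for a completely skewed tree.

// Pre-order output of a tree: constant work per node, each node visited once -> O(n).
static void printPreOrder(TreeNode node) {
    if (node == null) return;
    System.out.println(node.val);  // the constant per-call effort ("only print out")
    printPreOrder(node.left);
    printPreOrder(node.right);
}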

Analyzing time complexity (Poly log vs polynomial)

Say an algorithm runs in time
5n^3 + 8n^2 (lg n)^4
Which is the first-order (dominant) term? Would it be the one with the poly log or the polynomial?
For any two constants a > 0 and b > 0, log(n)^a is in o(n^b) (note the little-o notation here).
One way to prove this claim is to examine what happens when we apply a monotonically increasing function to both sides: the log function.
log(log(n)^a) = a * log(log(n))
log(n^b) = b * log(n)
Since we can ignore constant factors when it comes to asymptotic notation, the answer to "which is bigger, log(n)^a or n^b?" is the same as the answer to "which is bigger, log(log(n)) or log(n)?". The latter is much more intuitive to answer.
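Applied to the expression above, the polynomial term 5n^3 is therefore the dominant one, since (lg n)^4 = o(n) and hence 8n^2(lg n)^4 = o(n^3). One standard way to make the comparison precise (substituting m = lg n):

\[
\frac{(\lg n)^a}{n^b} = \frac{m^a}{2^{bm}} \xrightarrow{\;m\to\infty\;} 0
\qquad \text{for any fixed } a, b > 0,
\]

so log(n)^a = o(n^b), confirming the claim above.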

Renaming variables to solve a recurrence

I know the idea of renaming the variables, that is, transforming the recurrence into one that you have seen before.
I'm OK with the slide until line 4: they renamed T(2^m) as S(m), which to me means they made 2^m = m.
So S(m) should be:
S(m) = 2T(m^(0.5)) + m
Also, I think we shouldn't leave m as it is, because here it stands for 2^m, but in reality they are not equal.
Could anyone explain this to me?
And also, how can I know which substitution to use to make it easier for me?
Everything you're saying is correct up to the point where you claim that since S(m) = T(2^m), then m = 2^m.
The step of defining S(m) = T(2^m) is similar to defining some new function g in terms of an old function f. For example, if you define a new function g(x) = 2f(5x), you're not saying that x = 5x. You're just defining a new function that's evaluated in terms of f.
So let's see what happens from here. We've defined S(m) = T(2^m). That means that
S(m) = T(2^m)
     = 2T(√(2^m)) + lg(2^m)
We can do some algebraic simplification to see that
S(m) = 2T(2^(m/2)) + m
And, using the connection between T and S, we see that
S(m) = 2S(m/2) + m
Notice that we ended up with the recurrence S(m) = 2S(m/2) + m not by just replacing T with S in the original recurrence, but by doing algebraic substitutions and simplifications.
Once we're here, we can use the master theorem to solve S(m) and get that S(m) = O(m log m), so
T(n) = S(lg n) = O(lg n lg lg n).
As for how you'd come up with this in the first place - that just takes practice. The key insight is that to use the master theorem you need to shrink the size of the problem by a constant factor each time, so you need to find a transformation that converts square roots into division by a constant. Square roots are a kind of exponentiation, and logarithms are specifically designed to convert exponentiation into multiplication and division, so it's reasonable to try a log or exponential substitution. Now that you know the trick, I suspect that you'll see it in a lot more places.
You could, as an alternative, also just divide the first equation by log(n) to get
T(n)/log(n)=T(sqrt(n))/log(sqrt(n)) + 1
and then just use
S(n) = T(n)/log(n) with S(n) = S(sqrt(n)) + 1
or in a different way
S(k) = T(n^(2^(-k)))/log(n^(2^(-k)))
where then
S(k+1)=S(k)+1
is again a well-known recursive equation.
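As a rough numerical illustration (not a proof), assuming the original recurrence from the slide is T(n) = 2T(sqrt(n)) + lg n (the form used in the derivation above) and picking an arbitrary base case, the ratio T(n) / (lg n * lg lg n) roughly levels off, matching the O(lg n lg lg n) bound:

// Numerical sanity check of T(n) = 2*T(sqrt(n)) + lg(n); base case chosen arbitrarily.
public class RecurrenceCheck {
    static double lg(double x) { return Math.log(x) / Math.log(2); }

    static double t(double n) {
        if (n <= 2) return 1;                // arbitrary base case
        return 2 * t(Math.sqrt(n)) + lg(n);  // the assumed recurrence
    }

    public static void main(String[] args) {
        // n = 16, 256, 65536, ~4.3e9: the ratio stays roughly constant.
        for (double n = 16; n < 1e18; n *= n) {
            double ratio = t(n) / (lg(n) * lg(lg(n)));
            System.out.printf("n = %.0f   T(n)/(lg n * lg lg n) = %.3f%n", n, ratio);
        }
    }
}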

What's the time complexity of the following?

I've seen many questions of this type, but I still can't get the while loop's iteration count clear.
for i = 1...n
    for j = 1..i
        k = n
        while (k > 2)
            k = k^(1/3)
The two for loops are O(n^2) combined, and the inner while loop is O(log2(log2(n))) [*]. Thus the overall complexity is O(n^2 * log2(log2(n))).
To find the number of iterations m of the inner loop, we need to solve the following for m:
n = 2^(3^m)
This gives m = log3(log2(n)), which is the same as O(log2(log2(n))) (changing the base of the outer log only affects a constant factor).
[*] Assuming that, in your notation, k^(1/3) is the cube root of k.
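A slightly more explicit version of that derivation (a sketch, under the same assumption as the footnote): after m passes of the while loop the value of k is n^(3^(-m)), so

\[
n^{3^{-m}} \le 2 \;\iff\; 3^{-m}\lg n \le 1 \;\iff\; m \ge \log_3(\lg n),
\]

i.e. the while loop runs Θ(log log n) times, and with the Θ(n^2) iterations of the two for loops the total is Θ(n^2 log log n).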