What will be the time complexity of this recurrence?

T(n) = T(9n/10) + n

Using the Master theorem, I am getting the answer O(n) via the a < b^k case.
But the given answer is O(n log n).
How?
Using the Master theorem for an equation like this:

T(n) = a*T(n/b) + f(n)

you should first calculate this value:

c = log_b(a)

In this case we have:

a = 1

so the value of c will be:

c = log_b(1) = 0

which means:

n^c = n^0 = 1

Now it depends on f(n) which case of the Master theorem to choose. It can be case 2 or case 3 depending on f(n). If f(n) were a constant, i.e. Θ(1) = Θ(n^c), then according to case 2, T(n) = O(n^c * log n) = O(log n); and if f(n) is a polynomial in n, as it is here, then according to case 3, T(n) = O(f(n)) = O(n).
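Concretely, plugging in the recurrence from the question: with a = 1 and b = 10/9 we get c = log_{10/9}(1) = 0, and f(n) = n = Ω(n^(0+ε)) for any ε > 0. The regularity condition of case 3 also holds, since a*f(n/b) = (9/10)n <= k*n with k = 9/10 < 1, so case 3 applies and T(n) = Θ(n).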
Using the recursion-tree method, I also got O(n). How do you know it's O(n log n)?
It is O(n).
Look at the recursion tree (ignoring the constant factor that the O(n) term should have):

T(n)
├── T(9n/10)
│   ├── T((9/10)^2 n)
│   │   ├── ...
│   │   └── (9/10)^2 n
│   └── (9/10) n
└── n
The non-recursive version of the above is the sum of the right branches or
T(n) = n + (9/10) n + (9/10)^2 n + (9/10)^3 n + ...
which reduces to
T(n) = n * (1 + (9/10) + (9/10)^2 + (9/10)^3 + ...)

which means T(n) is some constant times n: the geometric series sums to 1 / (1 - 9/10) = 10, so T(n) = 10n. In any case it is O(n) asymptotically.
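If you want a quick numerical sanity check of that sum, here is a minimal sketch (T just evaluates the recurrence with T(0) = 0 and integer rounding):

// Evaluate T(n) = T(floor(9n/10)) + n directly.
const T = (n) => (n <= 0 ? 0 : n + T(Math.floor(9 * n / 10)));

for (const n of [1e3, 1e5, 1e7]) {
  console.log(n, (T(n) / n).toFixed(3)); // the ratio approaches 10
}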
For the following algorithm (which doesn't really do anything useful besides being an exercise in analyzing time complexity):
const dib = (n) => {
  if (n <= 1) return;
  dib(n - 1);
  dib(n - 1);
};
I'm watching a video where they say the time complexity is O(2^n). If I count the nodes I can see they're right (the tree has around 32 nodes); however, in my head I thought it would be O(n * 2^n), since n is the height of the tree and each level has 2^n nodes. Can anyone point out the flaw in my thinking?
Each level has 2^i nodes, not 2^n.
So the levels have 1 + 2 + 4 + 8 + ... + 2^(n-1) nodes in total.
The deepest level is the decider in the complexity.
The total number of nodes satisfies f(i) = 1 + 2*f(i-1) with f(1) = 1.
For a tree of height n, this is 2^n - 1.
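A quick way to confirm this is to instrument dib with a call counter and compare against 2^n - 1 (a minimal sketch; the calls variable is just for illustration):

let calls = 0;
const dib = (n) => {
  calls++; // count every node of the recursion tree
  if (n <= 1) return;
  dib(n - 1);
  dib(n - 1);
};

for (const n of [5, 10, 20]) {
  calls = 0;
  dib(n);
  console.log(n, calls, 2 ** n - 1); // the two counts are equal
}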
Derek's answer is great; it gives the intuition behind the estimate. If you want a formal proof, you can use the Master theorem for decreasing functions.
The master theorem is a formula for solving recurrences of the form
T(n) = aT(n - b) + f(n), where a ≥ 1 and b > 0 and f(n) is
asymptotically positive. (Asymptotically
positive means that the function is positive for all sufficiently large n.)
The recurrence for the above algorithm is T(n) = 2*T(n-1) + O(1). Do you see why? You can see the solution for the various cases (a = 1, a > 1, a < 1) here: http://cs.uok.edu.in/Files/79755f07-9550-4aeb-bd6f-5d802d56b46d/Custom/Ten%20Master%20Method.pdf
In our case a > 1, so T(n) = O(a^(n/b) * f(n)) = O(a^(n/b) * n^k), which with a = 2, b = 1 and k = 0 gives O(2^n).
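You can also get the same bound by unrolling the recurrence directly, writing the O(1) term as a constant c:

T(n) = 2*T(n-1) + c
     = 4*T(n-2) + 3c
     = 8*T(n-3) + 7c
     = ...
     = 2^(n-1)*T(1) + (2^(n-1) - 1)*c
     = O(2^n)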
I know the time complexity is n*log(n); however, I could only assess it with an integral for the inner loop to get an upper bound. How do I get a lower bound, to make it Theta and not just O?
S = 0;
for (i = 1; i < n; i++)
    for (j = 0; j < n; j += i)
        S++;
So line 1 is executed once; line 2 is executed n-1 times, plus one check without entering; and in each of these n-1 iterations, line 3 is executed n/i times. We get:

T = 1 + n + (n/1 + n/2 + ... + n/(n-1)) <= 1 + n + n * (integral of 1/x from 1 to n) = 1 + n + n*log(n).

And that's big O; how about Omega?
Let's decompose the function in the following way: T(n) = 1 + n + n + n*S(n), where S(n) = sum(1/x for x = 2 to n-1). Note that this is identical to what you wrote.
The function f(x) = 1/x is monotonically decreasing, therefore you can bound the sum S from above by int(1/x from x=1 to n-1) and from below by int(1/x from x=2 to n). In both cases you get log(n) up to constant terms: for the upper bound, log(n-1) - log(1) = log(n-1), and for the lower bound, log(n) - log(2).
If these bounds are not obvious, picture the left and right Riemann sums of the integrals for a decreasing function.
You did use the lower bound, not the upper one, in your question, by the way. (Because 1/x is decreasing, not increasing.)
Then adding that back into the expression for T(n) we have T(n) >= 1 + 2n + n log(n) - n log(2) and T(n) <= 1 + 2n + n log(n-1). Both are asymptotically proportional to n log(n), giving you the Theta class.
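As an empirical check, you can count the increments directly and compare against n*log(n) (a minimal sketch in JavaScript; count is just an illustrative name):

// Run the double loop and return the number of increments.
const count = (n) => {
  let S = 0;
  for (let i = 1; i < n; i++)
    for (let j = 0; j < n; j += i) S++;
  return S;
};

for (const n of [100, 1000, 10000]) {
  // The ratio settles toward 1 (up to lower-order terms), consistent with Theta(n log n).
  console.log(n, (count(n) / (n * Math.log(n))).toFixed(3));
}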
For this problem, we are given a function f which calculates the nth Fibonacci number in the standard way:
int f(int n)
{
    if (n == 0)
        return 0;
    if (n == 1)
        return 1;
    return f(n-1) + f(n-2);
}
I am then tasked with calculating the number of arithmetic operations performed by the function on an input n. I know that each call to f with n >= 2 performs 3 operations (2 subtractions, 1 addition), and I've attempted to find the total by expanding the function, looking for a recurrence relation, and working out how many arithmetic operations get done. But I get stuck because this doesn't come out as the right answer.
The correct answer is that f performs 3F(n+1) - 3 arithmetic operations on an input n, where F(k) denotes the kth Fibonacci number. Please could someone explain this to me? It is important, as we are later asked to find the time complexity using this as a starting point.
Thanks very much,
Niamh
The time complexity of your algorithm is close to 2^n, hence it is O(2^n).
Here is how the recurrence expands:
Your algorithm does T(n) = T(n-1) + T(n-2) + O(1).
T(n-1) = T(n-2) + T(n-3) + O(1)
T(n-2) = T(n-3) + T(n-4) + O(1)
.
.
.
T(2) = T(1) + T(0) + O(1)
Have a look at Fibonacci time complexity
T(n) = T(n-1) + T(n-2) + O(1), which is equal to
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)
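To connect this back to the 3F(n+1) - 3 count asked about in the question: the operation count A(n) satisfies A(n) = A(n-1) + A(n-2) + 3 with A(0) = A(1) = 0, so B(n) = A(n) + 3 satisfies the plain Fibonacci recurrence with B(0) = B(1) = 3, giving B(n) = 3F(n+1) and hence A(n) = 3F(n+1) - 3. A quick sketch to verify this numerically (ops and fib are illustrative helper names):

// Count arithmetic operations: each call with n >= 2 does 2 subtractions + 1 addition.
const ops = (n) => (n <= 1 ? 0 : 3 + ops(n - 1) + ops(n - 2));

// Reference Fibonacci numbers, computed iteratively.
const fib = (n) => {
  let a = 0, b = 1;
  for (let i = 0; i < n; i++) [a, b] = [b, a + b];
  return a;
};

for (const n of [0, 1, 5, 10, 20]) {
  console.log(n, ops(n), 3 * fib(n + 1) - 3); // the two counts match
}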
An iterative version, by contrast, performs roughly one addition per Fibonacci number produced. For example, with n = 4 and variables a, b = 0, c = 1:

for (i = 3; i <= n; i++)
    { a = b + c; b = c; c = a; }

// iteration (1): now a = 1; b = 1; c = 1
// iteration (2): now a = 2; b = 1; c = 2

Output: 0 1 1 2

Note: the loop runs from i = 3 to n, so the addition is performed n - 2 times in total.
Please correct me if I am wrong!
Working through the recurrences, you can derive that the time complexity of this function satisfies: T(n) = 2T(n/2) + O(1)
And the height of the recursion tree would be log2(n), so the total number of calls (i.e. nodes in the tree) is about 2^(log2(n)+1) - 1 = 2n - 1.
The instructor said that this function has a time complexity of O(n), but I simply cannot see why.
Further, when you substitute O(n) into the time complexity equation, there are strange results. For example,
T(n) <= cn
T(n/2) <= (cn)/2
Back into the original equation:
T(n) <= cn + 1
where this is obviously not true, because cn + 1 is not <= cn.
Your instructor is correct. This is an application of the Master theorem.
You can't substitute O(n) the way you did into the recurrence; a correct substitution would be a full linear form like a*n + b, since O(n) only records the highest-order term (there can be lower-order terms with their own constants).
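For instance, the standard trick is to strengthen the guess by subtracting a lower-order term. Writing the O(1) term as a constant d and guessing T(n) <= c*n - b:

T(n) = 2*T(n/2) + d
    <= 2*(c*(n/2) - b) + d
     = c*n - 2b + d
    <= c*n - b          (whenever b >= d)

so the induction goes through and T(n) = O(n).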
To expand on the answer: you correctly recognized a time complexity equation of the form
T(n) = a*T(n/b) + f(n), with a = 2, b = 2 and f(n) asymptotically O(1).
With this type of equation, there are three cases that depend on how n^(log_b(a)) (the cost of the recursion) compares to f(n) (the cost of the non-recursive work on a problem of length n):
1° f(n) is polynomially larger than the recursion itself (n^(log_b(a)) < f(n)), for instance a = 2, b = 2 and f(n) asymptotically O(n^16). Then the recursion is of negligible complexity and the total time complexity can be assimilated to the complexity of f(n):
T(n) = Θ(f(n))
2° The recursion is polynomially larger than f(n) (n^(log_b(a)) > f(n)), which is the case here. Then the complexity is O(n^(log_b(a))), in your example O(n^(log_2(2))) = O(n).
3° The critical case where f(n) is comparable to n^(log_b(a)), i.e. there exists k >= 0 such that f(n) = Θ(n^(log_b(a)) * log^k(n)); then the complexity is:
T(n) = Θ(n^(log_b(a)) * log^(k+1)(n))
This is the ugly case in my opinion.
I think it's interesting but I'm not sure about my solution. This algorithm calculates x^n.
If I use the master theorem, my reasoning goes like this:
T(n) = 2 T(n/2) + f(n)
But f(n) in this case is 1? Because the work outside the recursive calls (the n <= 4 base-case check) is constant. That gives me:
T(n) = Θ(n)
If I use substitution, I get this answer:
T(n) = Θ(n + log(n))
I think I'm doing lots of things wrong. Can someone point me in the right direction?
T(n) = Θ(n + log(n)) is just T(n) = Θ(n); the lower-order term (log(n)) can be omitted when using Theta.
Also, f(n) is O(1) because you are only doing one multiplication (rek(x, n/2) * rek(x, (n + 1)/2)) in each recursive call.
The complexity is O(n). Explanation: the algorithm ends up performing all of the multiplications anyway, just as a simple loop would, and the number of other operations per multiplication is bounded by a constant. So the total is about const * n, which makes O(n).
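A sketch of the algorithm as described, with a counter to confirm the linear number of multiplications. The base cases here (n = 0 and n = 1) are assumptions for illustration; the question mentions n <= 4, but any constant-size base case gives the same asymptotics:

let mults = 0;
const rek = (x, n) => {
  if (n === 0) return 1;
  if (n === 1) return x;
  mults++; // one multiplication per internal call
  return rek(x, Math.floor(n / 2)) * rek(x, Math.floor((n + 1) / 2));
};

for (const n of [4, 16, 100]) {
  mults = 0;
  rek(1, n); // use x = 1 so the result stays exact
  console.log(n, mults); // exactly n - 1 multiplications, i.e. Θ(n)
}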