Time complexity for all Fibonacci numbers from 0 to n

I was calculating the time complexity of this code, which prints all Fibonacci numbers from 0 to n. By my calculation, the fib() method takes O(2^n), and since it is called n times, the total came out to O(n * 2^n). However, the book says it is O(2^n). Can anyone explain why the time complexity here is O(2^n)?
Here is the code:
void allFib(int n) {
    for (int i = 0; i < n; i++) {
        System.out.println(i + ": " + fib(i));
    }
}

int fib(int n) {
    if (n <= 0) return 0;
    else if (n == 1) return 1;
    return fib(n - 1) + fib(n - 2);
}

I figured it out in my own way to understand the book's solution; I hope it helps those who are still struggling.
Imagine we now call allFib(n).
Since we have a for loop from 0 to n - 1, the following calls will be made:
i = 0, call fib(0)
i = 1, call fib(1)
i = 2, call fib(2)
...
i = n-1, call fib(n-1)
As discussed before, fib(n) takes O(2^n), i.e. roughly 2^n steps.
Therefore,
i = 0, call fib(0) takes 2^0 steps
i = 1, call fib(1) takes 2^1 steps
i = 2, call fib(2) takes 2^2 steps
...
i = n-1, call fib(n-1) takes 2^(n-1) steps
Thus, the runtime of allFib(n) will be
2^0 + 2^1 + 2^2 + ... + 2^(n-1). *
Following the sum-of-powers-of-2 formula, we have:
* = 2^(n-1+1) - 1 = 2^n - 1.
Thus it is O(2^n).
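The argument above can be checked empirically by counting how many recursive calls fib(i) actually makes for each i of the loop, then comparing the total against a single call to fib(n). (This class and its names are mine, not from the book; it is a sketch of the counting argument, not the book's code.)

```java
// Count recursive calls: the whole allFib(n) loop costs only a constant
// factor more than one fib(n) call, so it is O(2^n), not O(n * 2^n).
public class FibCallCount {
    static long calls; // incremented on every entry to fib

    static int fib(int n) {
        calls++;
        if (n <= 0) return 0;
        else if (n == 1) return 1;
        return fib(n - 1) + fib(n - 2);
    }

    // number of calls made by a single fib(n)
    static long callsFor(int n) {
        calls = 0;
        fib(n);
        return calls;
    }

    public static void main(String[] args) {
        int n = 20;
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += callsFor(i); // work done by allFib(n)
        }
        long single = callsFor(n); // work done by one fib(n) call
        System.out.println(total + " total vs " + single + " for one fib(n)");
    }
}
```

Running this shows the total for the loop staying within a small constant multiple of a single fib(n), which is exactly why the n factor disappears in the big-O.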

I finally got my answer from my professor and I'll post it here:
According to him, you should not simply look at the for loop iterating from 0 to n; you must find the actual amount of computation by counting the steps.
fib(1) takes 2^1 steps
fib(2) takes 2^2 steps
fib(3) takes 2^3 steps
..........
fib(n) takes 2^n steps
now adding these:
2^1 + 2^2 + 2^3 + ... + 2^n = 2^(n+1) - 2
and dropping the constant factor and lower-order term, this is 2^n, hence the time complexity is O(2^n).
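The geometric-sum identity the professor uses can be verified numerically (this helper class is my own illustration):

```java
// Check that 2^1 + 2^2 + ... + 2^n equals 2^(n+1) - 2.
public class PowerSum {
    static long sumOfPowers(int n) {
        long sum = 0;
        for (int k = 1; k <= n; k++) {
            sum += 1L << k; // 2^k
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 10; n++) {
            long closedForm = (1L << (n + 1)) - 2;
            System.out.println(n + ": " + sumOfPowers(n) + " == " + closedForm);
        }
    }
}
```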

Related

Time Complexity of a loop iteration by i^2

i = 2;
while (i < n) {
    i = i * i;
    // O(1) complexity here
}
I'm new to time complexity and trying to figure out what this would be.
I know that if the update had been i = 2*i it would be O(log(n)), but I don't really know how to count the iterations of i = i*i.
Intuitively it should also be at most O(log(n)) because it "iterates faster", but I don't know how to explain this formally.
Any help would be appreciated; thanks in advance.
You can neatly translate this into the i = 2 * i case you mentioned by considering the mathematical log of i. Pseudocode for how this value changes:
log_i = log(2);
while (log_i < log_n) {
    log_i = 2 * log_i;
    // O(1) stuff here
}
It should be clear from this that the time complexity is O(log log n), assuming constant multiplication cost of course.
I think it's easier to approach this problem just using mathematics.
Consider your variable i. What sequence does it take? It seems to be
2, 4, 16, 256, ...
If you look at it for a bit, you notice that this sequence is just
2, 2^2, (2^2)^2, ((2^2)^2)^2, ... or
2^1, 2^2, 2^4, 2^8, ... which is
2^(2^0), 2^(2^1), 2^(2^2), 2^(2^3), ...
so the general term of this sequence is 2^(2^k) where k = 0, 1, 2, ...
Now, how many iterations does your loop make? It will be on the order of the k for which 2^(2^k) = n. So let us solve this equation:
2^(2^k) = n (apply log2 to both sides)
2^k = log2(n) (apply log2 to both sides again)
k = log2(log2(n))
In big-O notation, the base of the logarithm doesn't matter, so we say your algorithm has a time complexity of:
O(log log n).
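Both derivations can be sanity-checked by counting iterations directly (class and method names here are my own). For n = 2^(2^k), the loop should take exactly k = log2(log2(n)) iterations:

```java
// Count the iterations of: i = 2; while (i < n) i = i * i;
public class SquaringLoop {
    static int countIterations(long n) {
        int count = 0;
        long i = 2;
        while (i < n) {
            i = i * i; // i goes 2, 4, 16, 256, ... = 2^(2^k)
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // 65536 = 2^16 = 2^(2^4), so exactly 4 iterations are needed
        System.out.println(countIterations(1L << 16)); // prints 4
        // 2^32 = 2^(2^5), so 5 iterations
        System.out.println(countIterations(1L << 32)); // prints 5
    }
}
```

Squaring n only adds one iteration, which matches the O(log log n) growth.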
There are different ways to compute a power like this. The iterative approach (the one you have) has a time complexity of O(N) because it iterates once for each value up to N.
Another method is recursive, something like:
static long pow(int x, int power) {
    // base conditions
    if (power == 0)
        return 1L;
    if (power == 1)
        return x;
    power--;
    return x * pow(x, power);
}
this will also have a time complexity of O(N) because pow(x, n) is called recursively once for each exponent from n down to 1.
The last (and more efficient) method is divide and conquer, which improves the time complexity by recursing only on pow(x, power / 2):
static long pow(int x, int power) {
    // base conditions
    if (power == 0)
        return 1L;
    if (power == 1)
        return x;
    long res = pow(x, power / 2);
    if (power % 2 == 0)      // power is even
        return res * res;
    else                     // power is odd
        return x * res * res;
}
The time complexity in this case is O(log N) because pow(x, n/2) is computed only once and its result is reused for both halves.
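The difference between the two approaches can be made concrete by counting recursive calls side by side (this comparison class is my own sketch; the methods mirror the two versions above):

```java
// Compare call counts: linear recursion vs divide and conquer.
public class PowCompare {
    static long linearCalls = 0, fastCalls = 0;

    // one multiply per step: O(power) calls
    static long powLinear(int x, int power) {
        linearCalls++;
        if (power == 0) return 1L;
        if (power == 1) return x;
        return x * powLinear(x, power - 1);
    }

    // halve the exponent each call: O(log power) calls
    static long powFast(int x, int power) {
        fastCalls++;
        if (power == 0) return 1L;
        if (power == 1) return x;
        long res = powFast(x, power / 2);
        if (power % 2 == 0)
            return res * res;
        else
            return x * res * res;
    }

    public static void main(String[] args) {
        long a = powLinear(2, 30);
        long b = powFast(2, 30);
        System.out.println(a + " == " + b);
        System.out.println(linearCalls + " calls vs " + fastCalls + " calls");
    }
}
```

For power = 30 the linear version makes 30 calls while the divide-and-conquer version makes only 5, roughly log2(30).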

What is the time complexity of this nested loop?

I stumbled upon a loop for which I am not sure what the time complexity is. It is the following loop:
for (i = 1; i <= n^2; i++) {
    for (j = 1; j <= i; j++) {
        // some elementary operation
    }
}
I would have argued that the outer for-loop runs n^2 times, and that the inner for-loop also runs up to n^2 times, since its bound grows as 1, 2, ..., n^2 over the iterations of the outer loop. So the time complexity would be n^4. Am I totally going in the wrong direction here?
The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to (n² * (n² + 1)) / 2.
The Big O notation is O(n^4). You are correct.
It's a simple arithmetic progression here: the inner bound grows by 1 on every iteration of the outer loop.
The outer loop runs n^2 times, which gives the following sum:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 (n^2 + 1) / 2 = O(n^4)
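Both answers can be verified by actually counting the elementary operations and comparing against the closed form (the class below is my own check, not part of either answer):

```java
// Count the inner-loop operations and compare with n^2(n^2+1)/2.
public class NestedCount {
    static long countOps(int n) {
        long count = 0;
        for (long i = 1; i <= (long) n * n; i++) {
            for (long j = 1; j <= i; j++) {
                count++; // stands in for the elementary operation
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 10;
        long expected = (long) n * n * ((long) n * n + 1) / 2;
        System.out.println(countOps(n) + " == " + expected);
    }
}
```

For n = 10 both sides come out to 5050, confirming the arithmetic-progression argument.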

Time Complexity: O(logN) or O(N)?

I thought the time complexity of the following code is O(log N), but the answer says it's O(N). I wonder why:
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
The inner for-loop runs this many times in total:
N + N/2 + N/4 + ...
which seems like log N to me. Please help me understand why it isn't. Thanks.
1, 1/2, 1/4, 1/8, ..., 1/2^n is a geometric sequence with a = 1, r = 1/2 (a is the first term, and r is the common ratio).
Its sum can be calculated using the geometric-series formula a / (1 - r).
In this case, the limit of the sum is 1 / (1 - 1/2) = 2, so:
n + n/2 + n/4 + ... = n(1 + 1/2 + 1/4 + ...) -> n * 2
Thus the complexity is O(N).
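The bound n * 2 can be observed directly by running the loop and checking that count never reaches 2N (this harness is my own addition):

```java
// Run the halving loop and verify count stays between N and 2N.
public class HalvingLoop {
    static int countOps(int n) {
        int count = 0;
        for (int i = n; i > 0; i /= 2) {
            for (int j = 0; j < i; j++) {
                count += 1;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // For a power of two the sum is exact: 1024 + 512 + ... + 1 = 2047
        System.out.println(countOps(1024)); // prints 2047
        System.out.println(countOps(1000)); // below 2000
    }
}
```

The total is always less than 2N, so the i = N term dominates and the complexity is Θ(N), not Θ(log N).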
Proceeding step by step, based on the code fragment, we obtain:

How to find the time complexity of the following code snippet

What is the time complexity of the following code snippet and how to calculate it.
function(int n) {
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j += i) {
            System.out.println("*");
        }
    }
}
Let's think about the total work that's done. As you noted in your comment, the inner loop runs n times when i = 1, then n / 2 times when i = 2, then n / 3 times when i = 3, etc. (ignoring rounding errors). This means that the total work done is
n + n/2 + n/3 + n/4 + ... + n/n
= n(1 + 1/2 + 1/3 + 1/4 + ... + 1/n)
The term in parentheses is the nth harmonic number, denoted H_n, so the work done overall is roughly n * H_n. It's known that H_n = Θ(log n), so the total work is Θ(n log n).
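The harmonic-sum estimate can be checked by counting the prints directly and comparing against n log n (the bounds in the test below are loose constants I chose, not exact):

```java
// Count the number of prints and compare with n * log(n).
public class HarmonicCount {
    static long countPrints(int n) {
        long count = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j += i) {
                count++; // stands in for the println
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.println(countPrints(n) + " ~ n*ln(n) = " + n * Math.log(n));
    }
}
```

For n = 100 the count lands a constant factor above n ln n, consistent with Θ(n log n).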

Calculate the time complexity of the following function

How do I calculate the time complexity of the following function?
int Compute(int n)
{
    int j = 0;
    int i = 0;
    while (i <= n)
    {
        i = 2*j + i + 1;
        j++;
    }
    return j - 1;
}
Now, I know that a simple loop has O(n) time complexity, but in this case i grows at a much faster rate. Working through it iteration by iteration, I found that after the m-th iteration i = m^2. But I'm still confused about how to calculate the Big-O.
If you look at the values of i and j for a few iterations:
i=1, j=1
i=4, j=2
i=9, j=3
i=16, j=4
and so on. By mathematical induction we can prove that i takes square values: if i = j^2 before an iteration, afterwards it is 2*j + j^2 + 1 = (j+1)^2.
Since we loop only while i <= n and i takes the values 1^2, 2^2, 3^2, ..., the loop stops at the first k with k^2 > n, i.e., when k exceeds sqrt(n). Hence the complexity is O(k), which means O(sqrt(n)).
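The square-values argument can be confirmed by running the function: it effectively computes floor(sqrt(n)), taking about sqrt(n) iterations to do so (the wrapper class is my own):

```java
// Compute(n) from the question: i steps through 1, 4, 9, 16, ... = j^2,
// so the loop runs floor(sqrt(n)) + 1 times and returns floor(sqrt(n)).
public class ComputeSqrt {
    static int compute(int n) {
        int j = 0;
        int i = 0;
        while (i <= n) {
            i = 2 * j + i + 1; // (j)^2 -> (j+1)^2
            j++;
        }
        return j - 1;
    }

    public static void main(String[] args) {
        System.out.println(compute(100)); // prints 10 = floor(sqrt(100))
        System.out.println(compute(99));  // prints 9  = floor(sqrt(99))
    }
}
```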