Time complexity of this while loop?

What's the time complexity of this loop?
j = 2
while (j <= n)
{
    // body of the loop is Θ(1)
    j = j * j;   // j is squared each iteration (the original j^2 means exponentiation, not C's XOR)
}

You have
j = 2
and on each iteration:
j = j * j
The pattern is:
2     = 2^(2^0)
2*2   = 2^(2^1) = 4
4*4   = 2^(2^2) = 16
16*16 = 2^(2^3) = 256
Which can be written as:
j = 2^(2^k), with k being the number of iterations.
Hence the loop stops when:
2^(2^k) >= n
log2(2^(2^k)) >= log2(n)
2^k >= log2(n)
log2(2^k) >= log2(log2(n))
k >= log2(log2(n))
so the complexity is Θ(log2(log2(n))), i.e. O(log log n).
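A quick empirical check of the log log n bound (my own sketch, not part of the original answer; the class and method names are made up):

```java
// Counts the iterations of the squaring loop for a given n.
class SquareLoopCount {
    static int count(long n) {
        int k = 0;
        long j = 2;
        while (j <= n) {
            j = j * j; // squaring, not XOR
            k++;
        }
        return k;
    }
}
```

For n = 2^16 = 65536 the loop runs 5 times while log2(log2(n)) = 4, matching k >= log2(log2(n)) up to a constant.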

Related

Time complexity from pseudocode

I am trying to find the Time complexity from the lines of code. However, I am not really getting it.
int codeMT(int n) {
    int a = 1;
    int i = 1;
    while (i < n * n) {
        int j = 1;              // j needs a declaration here
        while (j < n * n * n) {
            j = j * 3;
            a = a + j + i;
        }
        i = i + 2;
    }
    return a;
}
In this case, this is my thought process.
Firstly, we need to start from the inner loop.
while (j < n * n * n) {
    j = j * 3;
    a = a + j + i;
}
Here, j = j * 3 makes the loop logarithmic (base 3), but the loop guard compares against n * n * n, which makes the bound n^3. Therefore this part of the code has log_3(n^3) = 3*log_3(n) = O(log n) time complexity.
Secondly, the outer while loop has a loop guard of n * n, and i increases by 2 each time, so it runs about n^2 / 2 times, i.e. O(n^2).
Thus, O(n^2 * log(n^3)) = O(n^2 log n).
Is my approach okay? Or is there anything that I am missing / anything I could improve in my thought process?
Thank you
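The reasoning can be checked empirically with a small counting harness (my own sketch; the class and names are made up) that runs the same loops and counts how often the inner body executes:

```java
// Runs the same loop structure as codeMT and returns how many times
// the inner loop body executes.
class CodeMTCount {
    static long count(int n) {
        long c = 0;
        long nn = (long) n * n;
        long nnn = nn * n;
        for (long i = 1; i < nn; i += 2)        // outer: about n^2 / 2 passes
            for (long j = 1; j < nnn; j *= 3)   // inner: about log_3(n^3) passes
                c++;
        return c;
    }
}
```

For n = 10 this gives 50 outer passes times 7 inner passes = 350, consistent with (n^2 / 2) * 3*log_3(n).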

determine the time complexity in Java

public class complexity {
    public int calc(int n) {
        int a = 2 * Math.abs(n);
        int b = Math.abs(n) - a;
        int result = n;
        for (int i = 0; i < a; i++) {
            result += i;
            for (int j = 0; j > b; j--) {
                result -= j;
            }
            for (int j = a / 2; j > 1; j /= 2) {
                System.out.println(j);
            }
        }
        int tmp = result;
        while (tmp > 0) {
            result += tmp;
            tmp--;
        }
        return result;
    }
}
I have been given the following program in Java and need to determine the time complexity (Tw(n)).
I checked those site:
Big O, how do you calculate/approximate it?
How to find time complexity of an algorithm
But I still have problems understanding it.
Here is the solution given by the professor. From the for loop on, I didn't understand how he came up with the different time complexities. Can anybody explain it?
let's go over it step by step :
before the for loop every instruction is executed in a constant time
then:
for(int i = 0; i < a; i++) is executed 2n + 1 times, since a = 2n: the condition is checked for i = 0, 1, ..., 2n - 1 (2n times), plus one extra check at i = 2n, where the condition fails and the program breaks out of the loop.
Now each instruction in the loop has to be executed 2n times (as we are looping from 0 to 2n - 1) which explains why result += i is done 2n times.
The next instruction is another for loop so we apply the same logic about the line
for (int j = 0; j > b; j--): as b = -n, j goes from 0 down to -n + 1, which is n steps, plus the extra check mentioned for the first for loop, i.e. n + 1 condition checks. As this loop sits inside the outer loop, its condition is checked 2n * (n + 1) times in total.
Instructions inside this for loop are done n times per pass, so result -= j is done n times per pass; since the outer loop runs 2n times, result -= j is executed n * 2n times overall.
The same goes for the last for loop, except that here we start from a/2, which is n (as a = 2n), and divide by 2 each time until we reach 1. This is a bit more subtle, so let's do some steps: first iteration j = n, then j = n/2, then j = n/4, until j <= 1. How many steps do we need to reach 1?
to reach n/2 we need 1 = log2(2) step: n => n/2
to reach n/4 we need 2 = log2(4) steps: n => n/2 => n/4
We see that to reach n/x we need log2(x) steps, and since 1 = n/n, we need log2(n) steps to reach 1.
therefore each instruction in this loop is executed log(n) times and as it is in the parent loop it has to be done 2n times => 2n*log(n) times.
For the while loop we need the value of result when the loop starts:
result = n + (0 + 1 + 2 + ... + (2n - 1)) + 2n * (0 + 1 + ... + (n - 1))
where the first sum comes from result += i and the second from result -= j over all 2n outer passes. Evaluating the arithmetic series gives
result = n + n(2n - 1) + n^2(n - 1) = n^2(n + 1)
so the while loop body runs n^2(n + 1) = Θ(n^3) times, and here you go.
I hope this is clear; don't hesitate to ask otherwise!
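The tallies above can be checked by instrumenting the method with counters (a sketch of mine; the counter names are made up, and the println is dropped so only counts remain):

```java
// Instrumented copy of calc that counts how often each loop body runs.
class CalcCounts {
    static long outerBody, innerDown, innerHalve, whileBody;

    static int calc(int n) {
        outerBody = innerDown = innerHalve = whileBody = 0;
        int a = 2 * Math.abs(n);
        int b = Math.abs(n) - a;
        int result = n;
        for (int i = 0; i < a; i++) {
            outerBody++;                    // expected: 2n times
            result += i;
            for (int j = 0; j > b; j--) {
                innerDown++;                // expected: 2n * n times in total
                result -= j;
            }
            for (int j = a / 2; j > 1; j /= 2) {
                innerHalve++;               // expected: about 2n * log2(n) times
            }
        }
        int tmp = result;
        while (tmp > 0) {
            whileBody++;                    // expected: n^2(n + 1) times
            result += tmp;
            tmp--;
        }
        return result;
    }
}
```

For n = 4 the counters come out as 8, 32, 16 and 80 = 4^2 * (4 + 1), matching 2n, 2n*n, 2n*log2(n) and n^2(n + 1).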
I would like to add something about the last while loop. Unfortunately my answer does not match the professor's, and I am not certain that I haven't made a grave mistake.
result starts with n
for i from 0 upto 2|n|
    result += i
done 2|n| times with average i of about |n|, so result is increased by roughly 2n * n = 2n².
for j from 0 downto -|n|
    result -= j
done 2|n| times, result increased by 2n * n/2 = n².
So result is about n + 3n².
The outer for loop remains O(n²), as the inner println-for contributes only O(log n).
The last while loop would then be the same as:
for tmp from n + 3n² downto 0
    result += tmp
This is also O(n²), like the outer for loop, so the entire complexity is O(n²).
The final value of result is roughly 3n² * 3n²/2, i.e. about 4.5 * n⁴.

How did the inner loop execute "n/i" times?

I have a doubt regarding the time complexity of a code snippet and didn't quite understand the solution given.
function(n) {
    for (int i = 0; i < n; i++) {       // note: for i = 0, j += i never advances; the analysis assumes i >= 1
        for (int j = 0; j <= n; j += i) {
            printf("*");
        }
    }
}
In the above code, it's given that the inner loop executes n/i times for each value of i, and that the overall time complexity is O(n log n).
I can't figure out why the inner loop executes n/i times. Can someone please help me?
There are two loops, one incrementing i, and one incrementing j.
Given a fixed i, each pass of the j loop increments j by i, until j exceeds n.
So the values of j over the iterations are:
0
i
2i
...
floor(n/i) * i
As you can see, that is about n/i + 1 values, so the inner loop runs O(n/i) times. Summing over all i gives n/1 + n/2 + ... + n/n = n * (1 + 1/2 + ... + 1/n) = n * H_n = O(n log n), since the harmonic number H_n grows like ln n.
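The n/i counts and the O(n log n) total can be checked numerically (my own sketch; the class and method names are made up, and i starts at 1 to avoid the j += 0 non-termination):

```java
// Counts the total number of printf("*") calls, i.e. the sum over i of
// the number of inner-loop iterations.
class HarmonicCount {
    static long stars(int n) {
        long c = 0;
        for (int i = 1; i < n; i++)            // i >= 1 so that j += i advances
            for (int j = 0; j <= n; j += i)    // about n/i + 1 iterations
                c++;
        return c;
    }
}
```

The count matches the sum of floor(n/i) + 1 over i = 1 .. n-1, which grows like n * H_n.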

Time complexity for nested loops?

a)
sum = 0;
for (i = 1; i < 2*n; i++)
    for (j = 1; j < i*i; j++)
        for (k = 1; k < j; k++)
            if (j % i == 1)
                sum++;
b)
sum = 0;
for (i = 1; i < 2*n; i++)
    for (j = 1; j < i*i; j++)
        for (k = 1; k < j; k++)
            if (j % i)
                sum++;
I chanced upon the two pseudocode snippets above while looking for algorithm-analysis questions to practice on. The answers for the two snippets are O(n^4) and O(n^5) respectively.
Note that the running time corresponds here to the number of times the operation
sum++ is executed.
How is it that the time complexities of the two algorithms differ by a factor of n when the only difference is the if statement testing for equality to 1? How would I go about counting out the complexity for such a question?
Algorithm A
Let's call f(n) the number of operations aggregated at the level of the outer loop, g(n) that for the first inner loop, h(n) for the most inner loop.
We can see that
f(n) = sum(i=1; i < 2n; g(i))
g(i) = sum(j=1, j < i*i; h(j))
h(j) = sum(k=1; k < j; 1 if j%i == 1, else 0)
     = j - 1 if j%i == 1, else 0
How many times is j%i == 1 while j varies from 1 to i*i? Exactly i times, for the following values of j:
j = 0*i + 1
j = 1*i + 1
j = 2*i + 1
...
j = (i-1)*i + 1
So only these values of j contribute, each with h(j) = j - 1 = m*i for j = m*i + 1:
g(i) = sum(j=1; j < i*i; h(j))
     = 0*i + 1*i + 2*i + ... + (i-1)*i
     = i * i*(i-1)/2
     = Θ(i^3)
=> f(n) = sum(i=1; i < 2n; g(i))
        = sum(i=1; i < 2n; Θ(i^3))
        <= sum(i=1; i < 2n; 8*n^3)   // cap every i^3 with (2n)^3 = 8n^3
        <= 16*n^4
=> f(n) = O(n^4)
Algorithm B
(With the same naming conventions as algorithm A)
How many times does j%i evaluate to true? Every time j%i is different from 0, i.e. whenever j is not a multiple of i. The range 1 to i*i - 1 contains i^2 - 1 integers, of which i - 1 are multiples of i (namely i, 2i, ..., (i-1)*i), so j%i != 0 holds for i^2 - i values of j, which is Θ(i^2).
As a result,
h(j) = sum(k=1; k < j; 1 if j%i != 0, else 0)
     = j - 1 if j%i != 0, else 0
Since almost all j below i^2 qualify, g(i) is essentially the sum of all j - 1 below i^2:
g(i) = sum(j=1; j < i*i; h(j))
     ≈ sum(j=1; j < i*i; j - 1)
     ≈ (i^2)^2 / 2
     = Θ(i^4)
=> f(n) = sum(i=1; i < 2n; g(i))
        = sum(i=1; i < 2n; Θ(i^4))
        <= sum(i=1; i < 2n; 16*n^4)   // cap every i^4 with (2n)^4 = 16n^4
        <= 32*n^5
=> f(n) = O(n^5)
Bottom line
The difference of complexity between algorithms A and B comes from the fact that:
there are only i values of j for which j%i == 1, but
there are i^2 - i values of j for which j%i != 0.
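Both counts can be verified by running the snippets and counting sum++ directly (my own sketch; the class and method names are made up):

```java
// Counts how many times sum++ executes in snippets a) and b).
class NestedCounts {
    static long countA(int n) {                 // snippet a): j % i == 1
        long sum = 0;
        for (int i = 1; i < 2 * n; i++)
            for (int j = 1; j < i * i; j++)
                for (int k = 1; k < j; k++)
                    if (j % i == 1)
                        sum++;
        return sum;
    }

    static long countB(int n) {                 // snippet b): j % i != 0
        long sum = 0;
        for (int i = 1; i < 2 * n; i++)
            for (int j = 1; j < i * i; j++)
                for (int k = 1; k < j; k++)
                    if (j % i != 0)
                        sum++;
        return sum;
    }
}
```

Doubling n multiplies countA by roughly 2^4 = 16 and countB by roughly 2^5 = 32, consistent with Θ(n^4) and Θ(n^5).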

Complexity of the program [duplicate]

In the book Programming Interviews Exposed it says that the complexity of the program below is O(N), but I don't understand how this is possible. Can someone explain why this is?
int var = 2;
for (int i = 0; i < N; i++) {
    for (int j = i + 1; j < N; j *= 2) {
        var += var;
    }
}
You need a bit of math to see that. The inner loop iterates Θ(1 + log(N/(i+1))) times (the 1 + is necessary since for i >= N/2 the integer quotient N/(i+1) is 1, its logarithm is 0, and yet the loop still iterates once). j takes the values (i+1)*2^k until it is at least as large as N, and
(i+1)*2^k >= N <=> 2^k >= N/(i+1) <=> k >= log_2(N/(i+1))
using real division. So the update j *= 2 is executed ceiling(log_2(N/(i+1))) times, and the condition is checked 1 + ceiling(log_2(N/(i+1))) times. Thus we can write the total work as
sum(i=0; i <= N-1; 1 + log(N/(i+1))) = N + N*log N - sum(j=1; j <= N; log j)
                                     = N + N*log N - log N!
Now, Stirling's formula tells us
log N! = N*log N - N + O(log N)
so we find the total work done is indeed O(N).
Outer loop runs n times. Now it all depends on the inner loop.
The inner loop now is the tricky one.
Let's follow it:
i = 0         --> j = 1         ---> log(n) iterations
...
i = (n/2) - 1 --> j = n/2       ---> 1 iteration
i = (n/2)     --> j = (n/2) + 1 ---> 1 iteration
Grouping the values of i by iteration count:
i > (n/2)            ---> 1 iteration
(n/2)-1 >= i > (n/4) ---> 2 iterations
(n/4) >= i > (n/8)   ---> 3 iterations
(n/8) >= i > (n/16)  ---> 4 iterations
(n/16) >= i > (n/32) ---> 5 iterations
So the total count is:
(n/2)*1 + (n/4)*2 + (n/8)*3 + (n/16)*4 + ... + [n/(2^i)]*i
= n * sum(i=1; i/(2^i)) <= 2*n
--> O(n)
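Both answers can be sanity-checked by counting the inner-loop iterations directly (my own sketch; the class and method names are made up):

```java
// Counts the total number of times the inner loop body runs.
class DoublingCount {
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++)
            for (long j = i + 1; j < n; j *= 2)  // about log2(n/(i+1)) passes
                c++;
        return c;
    }
}
```

count(N) stays within a small constant multiple of N (matching N + N*log N - log N! from Stirling's formula), e.g. 26 for N = 16.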
@Daniel Fischer's answer is correct.