What is the complexity of this function in terms of n?

f(n):
    s = 1
    m = n
    while m >= 1:
        for j in 1, 2, 3, ..., m:
            s = s + 1
        m = m / 2
My attempt
How many times does the statement s = s + 1 run? Well, let's try a number or two for n and see what we get.
n = 32
m = n = 32
m = 32 => inner for loop, loops 32 times
m = 16 => -.- 16 times
m = 8 => -.- 8 times
m = 4 => -.- 4 times
m = 2 => -.- 2 times
m = 1 => -.- 1 time
In total 32 + 16 + 8 + 4 + 2 + 1 = 63 times for n = 32
Doing the same for n = 8 gives 15 times
Doing the same for n = 4 gives 7 times
Doing the same for n = 64 gives 127 times
Notice how it always seemingly is 2 * n - 1. I believe this is no coincidence, because what we're really observing is that the total number of times the statement gets executed is the geometric sum of 2^k from k = 0 to k = log(n), which has the closed-form value 2n - 1, given that we're using the base-2 logarithm.
As such, my guess is that the complexity is O(n).
Is this correct?
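For completeness, here is a quick brute-force counter (a rough sketch in Python, mirroring the pseudocode; m = m / 2 is taken as plain halving) that reproduces the totals above:

def count_steps(n):
    # Count how many times s = s + 1 executes in the pseudocode above.
    s = 0
    m = n
    while m >= 1:
        for j in range(1, int(m) + 1):   # inner loop runs m times
            s += 1
        m = m / 2                        # halve m, as in the pseudocode
    return s

for n in (4, 8, 32, 64):
    print(n, count_steps(n), 2 * n - 1)  # e.g. 32 -> 63, matching 2n - 1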

Well what you just observed is true indeed. We can prove it mathematically as well.
Inner loop execution for first iteration -> n
Inner loop execution for second iteration -> n/2
Inner loop execution for third iteration -> n/(2^2)
and so on....
Therefore the total time is (n + (n/2) + (n/(2^2)) + ... + (n/(2^k))), where n/(2^k) should be equal to 1, which implies k = log(n). Taking n common from all terms gives us a GP, and using the GP formula on the above series, the total time comes out to n * (1 - (1/2)^(k+1)) / (1 - 1/2) = 2n - n/(2^k). Putting in n/(2^k) = 1, the total time is 2n - 1.
Note:
This gives us a time complexity of O(n).
log here is base 2.
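For reference, the same sum written out as a single formula (just a restatement of the step above; log is base 2, so 2^k = n):

\sum_{j=0}^{k} \frac{n}{2^{j}}
  = n \cdot \frac{1 - (1/2)^{k+1}}{1 - 1/2}
  = 2n - \frac{n}{2^{k}}
  = 2n - 1 \qquad \text{(since } 2^{k} = n\text{)}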

Related

Running time of nested while loops

Function f(n)
    s = 0
    i = 1
    while i < 7n^(1/2) do
        j = i
        while j > 5 do
            s = s + i - j
            j = j - 2
        end
        i = 5i
    end
    return s
end f
I am trying to work out the running time (in big-theta terms) of the code above. I have been looking all over for a worked example, but everything I find covers for loops or a single while loop. How would you go about this problem with nested while loops?
Let's break this down into two key points:
i starts from 1 and is multiplied by 5 each time, until it is greater than or equal to 7*sqrt(n). This is an exponential increase with a logarithmic number of steps. Thus we can change the code to the following equivalent:
m = floor(log(5, 7n^(1/2)))
k = 0
while k < m do
    j = 5^k
    // ... inner loop ...
    k = k + 1
end
For each iteration of the outer loop, j starts from i and decreases in steps of 2 until it is less than or equal to 5. Note that in the first execution of the outer loop i = 1, and in the second i = 5, so the inner loop is not executed until the third iteration. Since i = 5^k is always odd, j only ever takes odd values; the last value handled inside the loop is 7, and the loop exits once j reaches 5 (you can check this with pen and paper). So for 5^k > 5 the inner loop performs (5^k - 5)/2 iterations.
Combining the above steps: the total work is the sum of (5^k - 5)/2 over the outer iterations. This is a geometric-type sum dominated by its largest term, which is proportional to sqrt(n), so the overall running time is Theta(sqrt(n)).
The outer loop's bound is 7 * sqrt(n) (an exponent of 1/2 is the same as sqrt() of a number), but because i is multiplied by 5 on each pass, the outer loop only performs about m = log base 5 of (7 * sqrt(n)) iterations.
The inner loop does real work only from the third outer iteration onward, i.e. in m - 2 of the iterations, since the first two values of i (1 and 5) fail the j > 5 comparison immediately.
i is multiplied by 5 on every outer iteration.
Take an example where n = 16:
n = 16;
// the bound is 7 * sqrt(16) = 28
for ( i = 1; i < 28; i *= 5 )
    // ... inner loop over j ...
First value: i = 1 (1st outer iteration). Inner loop runs 0 times.
Second value: i = 5 (2nd outer iteration). Inner loop runs 0 times.
Third value: i = 25 (3rd outer iteration). Inner loop runs 10 times.
Fourth value: i = 125, which fails the comparison, so the outer loop stops.
The outer loop therefore performs m iterations, and each inner loop performs at most about 7*sqrt(n)/2 iterations, giving the upper bound O( sqrt(n) * (m - 2) ) = O( sqrt(n) * log(n) ). Since the inner counts grow geometrically (roughly i/2 per pass), the last term dominates and the tight bound is Theta(sqrt(n)).
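As a rough empirical check (a sketch in Python, not from either answer; it just simulates the loops), the inner-loop count stays within a constant factor of sqrt(n):

import math

def inner_work(n):
    # Count iterations of the inner while loop in the pseudocode above.
    count = 0
    i = 1
    while i < 7 * math.sqrt(n):
        j = i
        while j > 5:
            count += 1
            j -= 2
        i *= 5
    return count

for n in (10**2, 10**4, 10**6):
    w = inner_work(n)
    # the ratio oscillates (i only takes powers of 5) but stays bounded
    print(n, w, round(w / math.sqrt(n), 2))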

Why is code like i=i*2 considered O(logN) when in a loop?

Why, because of the i = i * 2, is the runtime of the loop below considered O(log N)?
for (int i = 1; i <= N;) {
    code with O(1);
    i = i * 2;
}
Look at 1024 = 2^10. How many times do you have to double the number 1 to get 1024?

Times:  1  2  3  4   5   6   7    8    9    10
Result: 2  4  8  16  32  64  128  256  512  1024

So you would have to run your doubling loop ten times to get 2^10, and in general you have to run your doubling loop n times to get 2^n. But what is n? It's log2(2^n), so in general, if N is some power of 2, the loop has to run log2(N) times to reach it.
To have the algorithm in O(logN), for any N it would need (around) log N steps. In the example of N=32 (where log 32 = 5) this can be seen:
i = 1 (start)
i = 2
i = 4
i = 8
i = 16
i = 32 (5 iterations)
i = 64 (abort after 6 iterations)
In general, after x iterations, i=2^x holds. To reach i>N you need x = log N + 1.
PS: When talking about complexities, the log base (2, 10, e, ...) is not relevant. Furthermore, it is not relevant if you have i <= N or i < N as this only changes the number of iterations by one.
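A quick way to convince yourself (a sketch in Python, not part of the answer) is to count the iterations and compare them to floor(log2(N)) + 1:

import math

def doubling_iterations(N):
    # Count how many times the loop body runs for i = 1, 2, 4, ... while i <= N.
    count = 0
    i = 1
    while i <= N:
        count += 1
        i *= 2
    return count

for N in (10, 32, 1000, 10**6):
    print(N, doubling_iterations(N), math.floor(math.log2(N)) + 1)  # columns match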
You can prove it pretty simply.
Claim:
For the tth (0 base) iteration, i=2^t
Proof, by induction.
Base: 2^0 = 1, and indeed in the first iteration, i=1.
Step: For some t+1, the value of i is 2*i(t) (where i(t) is the value of i in the t iteration). From Induction Hypothesis we know that i(t)=2^t, and thus i(t+1) = 2*i(t) = 2*2^t = 2^(t+1), and the claim holds.
Now, let's examine the stop criterion. We iterate the loop while i <= N, and from the above claim, that means we iterate while 2^t <= N. Taking log_2 of both sides gives log_2(2^t) <= log_2(N), and since log_2(2^t) = t, we iterate while t <= log_2(N) - so we iterate Theta(log_2(N)) times. (And that concludes the proof.)
i starts at 1. In each iteration you multiply i by 2, so in the K-th iteration i will be 2^(K-1).
After some number K of iterations, 2^(K-1) will be greater than (or equal to) N.
This means N ≤ 2^(K-1),
which means log2(N) ≤ K - 1.
K - 1 is the number of iterations your loop will run, and since K - 1 is greater than or equal to log(N), your algorithm is logarithmic.

Time complexity for the loop

The outer loop executes n times, but how many times does the inner loop execute? So the total time is n * something.
Do I need to learn summations? If yes, is there any book to refer to?
for (int i = 1; i <= n; i++)
    for (int j = 1; j <= n; j += i)
        printf("*");
This question can be approached by inspection:
n = 16
i  | j values         | # terms
1  | 1, 2, ..., 16    | n
2  | 1, 3, 5, ..., 15 | n / 2
3  | 1, 4, 7, ..., 16 | n / 3
.. | ..               | ..
16 | 1                | n / n
In the above table, i is the outer loop value, and j values show the iterations of the inner loop. By inspection, we can see that the loops will take n * (1 + 1/2 + 1/3 + ... + 1/n) steps. This is a bounded harmonic series. As this Math Stack Exchange article shows, there is no closed form for the above expression in terms of n. However, as this SO article shows, there is an upper bound of O(n*ln(n)).
So, the running time for your two loops is O(n*ln(n)).
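You can check the n*ln(n) estimate empirically (a sketch in Python, counting the printf calls instead of printing; math.log is the natural log):

import math

def star_count(n):
    # Total inner-loop iterations over all i, i.e. how many '*' would be printed.
    total = 0
    for i in range(1, n + 1):
        j = 1
        while j <= n:
            total += 1
            j += i
    return total

for n in (100, 1000, 10000):
    print(n, star_count(n), round(n * math.log(n)))  # same order of magnitude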
I believe the time complexity of that is O(n*log(n)). Here is why:
Let us pick some arbitrary natural number i and see how many steps the inner loop takes for this given i. For this i, you are going from j = 1 up to j <= n in jumps of i, so the inner loop visits the values
j = 1, (1 + i), (1 + 2i), ..., (1 + ki)
where k is the largest integer such that 1 + ki <= n. That is, k + 1 is the number of steps, and this is what we want to solve for. Solving the inequality gives k <= (n-1)/i and thus k = ⌊(n-1)/i⌋, i.e. the floor function/integer division of (n-1)/i. Since we are dealing with time complexities, this floor function doesn't matter, so we will just say the inner loop takes n/i steps for a given i, for simplicity. So we basically need to add all these up for i = 1 to i <= n.
So numsteps will be this addition:
numsteps = n/1 + n/2 + n/3 + ... + n/n
         = n(1 + 1/2 + 1/3 + ... + 1/n)
So we need to find the sum of 1 + 1/2 + ... 1/n to finish this. There is actually no good closed form for this sum but it is on the order of ln(n). You can read more about this here. You can also guess this since the integral from 1 to n of 1/x is ln(n). Again, since we are dealing with time complexity, we can just use ln(n) to represent its complexity. Thus we have:
numsteps = n(ln(n))
And so the time complexity is O(n*log(n)).
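To double-check the k = ⌊(n-1)/i⌋ step count for a single i (a small sketch in Python; the helper name is made up):

def inner_steps(n, i):
    # Inner-loop iterations for a fixed i: j = 1, 1+i, 1+2i, ... while j <= n.
    count, j = 0, 1
    while j <= n:
        count += 1
        j += i
    return count

n = 100
for i in (1, 2, 3, 7, 50, 100):
    print(i, inner_steps(n, i), (n - 1) // i + 1)  # the two counts agree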

Time complexity of nested while loop with if-else

I am getting confused about how to analyze the time complexity of a nested while loop that splits into odd and even cases. Could anyone explain how to deal with this situation?
i = 1
while (i < n) {
    k = i
    while (k < n) {
        if (k % 2 == 1)
            k++
        else
            k = k + 0.01*n
    }
    i = i + 0.1*n
}
So in a problem like this, the factors 0.01 and 0.1 play a huge role.
First let's consider the inner while loop: if k is odd, we increment k by 1; if k is even, we increment k by one-hundredth of n. How many iterations can this inner while loop run?
Clearly, if all iterations were of type 1 (the odd case), the inner while loop would run n - k times, and similarly, if all iterations were of type 2 (the even case), the inner while loop would run at most 100 times (as we increment the value of k by one-hundredth of n each time).
Given a value of k, the number of iterations of the inner while loop is at most:
max(n - k, 100). From now on, we will assume the value of n - k is always greater than 100, without loss of generality.
Okay, how does the outer loop iterate? In each iteration of the outer loop, the value of i increases by one-tenth of n, so the outer while loop will run at most 10 times.
Making the running times explicit and calculating the overall running time:
Running time for the 1st outer iteration:  n - k
Running time for the 2nd outer iteration:  n - (k + 0.1*n)
Running time for the 3rd outer iteration:  n - (k + 0.2*n)
...
Running time for the 10th outer iteration: n - (k + 0.9*n)
-----------------------------------------------------------
Total: 10n - 10k - 4.5n
Plugging in k = 1 (as this is the start value of k):
10n - 10 - 4.5n = 5.5n - 10 = O(n)
Hence complexity is O(n) time.

What is the order of growth of this function?

I'm new to algorithms and big O. What is the order of growth of this function?
I do a println and f(10) runs 15 times. f(20) runs 31 times.
It looks to me like log(N) * N/2. So is it logarithmic or linearithmic?
static long f (long N) {
    long sum = 0;
    for (long i = 1; i < N; i *= 2)
        for (long j = 0; j < i; j++)
            sum++;
    return sum;
}
The runtime is O(n). To see this, note that the inner loop runs 1 time on the first outer iteration, 2 times on the next, 4 times on the next, and more generally 2^k times on the k-th outer iteration (counting from zero). The outer loop stops after lg n iterations because i keeps doubling, so the total work done is
1 + 2 + 4 + 8 + ... + 2^(lg n)
This is the sum of a geometric series and works out to 2^(lg n + 1) - 1 = 2 · 2^(lg n) - 1 = 2n - 1 = O(n).
Hope this helps!
The inner loop counts up to i, which is at most n.
The outer loop goes from 1 towards n, multiplying by 2 each time, so it runs about lg n times.
Multiplying the two gives the loose upper bound O(n lg n); the geometric-sum argument above shows the tight bound is O(n).
Proceeding formally, you obtain 1 + 2 + 4 + ... + 2^(lg n) = 2 · 2^(lg n) - 1 = O(2^(lg n)), and since 2^(lg n) = n, this is the same bound as O(n).
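A quick check of that bound (a sketch in Python mirroring the Java function; not part of the answers) shows the returned sum stays below 2N:

def f(N):
    # Same double loop as the Java version: the outer i doubles, the inner runs i times.
    total = 0
    i = 1
    while i < N:
        for j in range(i):
            total += 1
        i *= 2
    return total

for N in (10, 20, 1000, 10**6):
    print(N, f(N), 2 * N)  # f(10) -> 15, f(20) -> 31, always below 2N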