Runtime complexity of the function

I have to find the time complexity of the following program:
void function(int n)
{
    for (int i = 0; i < n; i++)              // O(n) times
        for (int j = i; j < i * i; j++)      // O(n^2) times
            if (j % i == 0)
            {                                // O(n) times
                for (int k = 0; k < j; k++)  // O(n^2) times
                    printf("8");
            }
}
I analysed this function as follows:
i        (O(n))   : 1    2      3       4          5
j values          : 1    2..3   3..8    4..15      5..24        (values taken by j)
j-loop   (O(n^2)) : 1    2      6       12         20           (number of times the j-loop executes)
j%i==0            : 1    2      3,6     4,8,12     5,10,15,20   (values of j for which the condition is true)
         (O(n))   : 1    1      2       3          4            (how many such j there are)
k                 : 1    2      3,6     4,8,12     5,10,15,20   (for each such j, printf runs j times)
Total             : 1    2      9       24         50           (total printf executions)
However, I am unable to draw any conclusion, since I don't see any correlation between $i$ (which is essentially O(n)) and the totals for k in the last row. In fact, I'm not sure we should measure the time complexity by the number of times printf is executed, since that would neglect the O(n^2) iterations of the j-loop themselves. The answer given was O(n^5), which I presume is wrong, but then what is correct? To be more specific about my confusion: I cannot figure out how the if(j%i==0) condition affects the overall runtime complexity of the function.

The answer is definitely not O(n^5), and this is easy to see: even if your second inner loop always ran n^2 times and your innermost loop always ran n times, the total time complexity would still only be n * n^2 * n = O(n^4).
Now let us see what is actual time complexity.
1. The outermost loop always runs O(n) times.
2. Now let us see how many times the second inner (j) loop runs for a single iteration of the outer loop. It runs
0 times for i = 0,
0 times for i = 1,
2 times for i = 2,
...
i*i - i times for a given i, and i*i - i is O(i^2).
3. Coming to the innermost loop: it runs only when j is divisible by i, and j varies from i to i*i - 1.
This means j goes through i*1, i*2, i*3, ..., up to the last multiple of i below i*i, which is clearly O(i) values of j. For j = m*i the innermost loop runs j = m*i times, so over one full pass of the second inner loop the innermost loop performs i + 2i + ... + (i-1)*i = O(i^3) iterations; adding the O(i^2) iterations of the second inner loop itself, the two inner loops together still do O(i^3) work.
Summing O(i^3) for i = 0 to n-1 (the sum of the first n cubes is roughly n^4/4) definitely gives a term that is O(n^4).
Therefore, the correct time complexity is O(n^4).
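If you want to sanity-check the O(n^4) bound empirically, here is a minimal counting sketch in C (my own addition, not part of the original question or answer: printf is replaced by a counter, and the test sizes are arbitrary):

#include <stdio.h>

/* Counts how many times the innermost statement (the printf) would execute. */
long long count_ops(int n)
{
    long long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i; j < i * i; j++)
            if (j % i == 0)             /* safe: for i == 0 the j loop never runs */
                for (int k = 0; k < j; k++)
                    count++;            /* stands in for printf("8") */
    return count;
}

int main(void)
{
    /* If the count grows like n^4, count / n^4 should level off near a constant
       (roughly 1/8 for this particular function). */
    for (int n = 20; n <= 160; n *= 2) {
        long long c = count_ops(n);
        printf("n = %3d  count = %12lld  count / n^4 = %.4f\n",
               n, c, (double)c / ((double)n * n * n * n));
    }
    return 0;
}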

Related

How to prove a lower bound of n * log(n) for this simple algorithm

The question is how many times this algorithm produces a meow:
KITTYCAT(n):
    for i from 0 to n − 1:
        for j from 2^i to n − 1:
            meow
So the inner loop has a worst case of n iterations, but it only does real work for log(n) values of i: even though the outer loop runs n times, whenever i > log(n) the inner loop never runs, so you get O(n * log(n)).
However, since I can't assume the inner loop takes n iterations every time, how do I prove that the algorithm still has a best case of n * log(n)?
When i > log2(n), the start value of the inner loop is higher than its end value. Depending on how you interpret it, this either means that the inner loop counts down, or that it does not run at all. If you interpret it as counting down, then it gets very big indeed and ends up dominating, and you have Ω(2^n), which is not what you seem to be looking for.
If instead you assume the inner loop goes away, then this code is really
for i from 0 to log2(n):
     for j from 2^i to n - 1:
          meow
giving you Ω(n log n).
If you're asking how to prove that last step, you can calculate the exact number of iterations -- the inner loop iterates n-1 times, then n-2 times, then n-4, etc., all the way down to 0. So the exact count (at least when n is a power of 2) is
    (n-1) + (n-2) + (n-4) + ... + (n-n/2) + (n-n)
or
    n*log2(n) + n - (1 + 2 + 4 + ... + n/2 + n)
which works out to
    n*log2(n) - n + 1
which is asymptotically equivalent to n*log(n) as n -> ∞
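For a quick empirical check of that count, here is a small C sketch (my own transcription, not from the question: the inner loop is collapsed into the n - 2^i meows it produces, and the test sizes are arbitrary powers of two so the exact formula applies; link with -lm if needed):

#include <stdio.h>
#include <math.h>

/* Total meows: for each i with 2^i < n the inner loop runs n - 2^i times. */
long long meow_count(long long n)
{
    long long count = 0;
    for (long long p = 1; p < n; p *= 2)   /* p plays the role of 2^i */
        count += n - p;
    return count;
}

int main(void)
{
    for (long long n = 1024; n <= 65536; n *= 4) {
        long long c = meow_count(n);
        printf("n = %6lld  meows = %10lld  n*log2(n) - n + 1 = %.0f\n",
               n, c, n * log2((double)n) - n + 1);
    }
    return 0;
}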

simple time complexity O(nlogn)

I am reviewing some Big O notation for an interview and I came across this problem.
for i = 1 to n do:
    j = i
    while j < n do:
        j = 2 * j
Simple, right? The outer loop provides n steps, and in each of those steps we do a single O(1) assignment j = i, and then, I thought, log(n - j) (or log(n - i), since j = i) steps for the while loop. So I thought the time complexity would be O(n log n), but the answer is O(n).
here is the answer:
The running time is approximately the following sum: Σ (1 + log(n/i)) for i from 1 to n, which is Θ(n).
Now, it has been a while, so I am a bit rusty. Where does log(n/i) come from? I know log(n) - log(i) = log(n/i), but I thought the inner loop took log(n - i) steps, not log(n) - log(i). And how is the time complexity not O(n log n)? I am sure I am missing something simple, but I have been staring at this for hours now and I am starting to lose my mind.
source: here is the source to this problem Berkeley CS 170, Fall 2009, HW 1
edit: After thinking about it a little more, it makes sense that the inner loop takes log(n/i) steps: the inner loop has to cover the range from i up to n, but j doubles on every iteration. If the inner loop always started at 1 we would have log(n) steps, but starting at i skips the first log(i) doublings, leaving log(n) - log(i), which is log(n/i).
I think the log(n/i) comes from the inner loop. Notice that j = i, which means that when i = 2 (let's say n = 10) the inner loop
while j < n do:
    j = 2 * j
runs only while j is between 2 and 10, with j multiplying itself by 2 each time (hence the log), so it quickly overruns the value n = 10.
So the inner loop runs about log2(n/i) times.
I ran n = 10 through the code and it looks like linear time, because most of the time the inner loop runs only once.
Example: once the value of i is such that multiplying it by 2 gives something greater than or equal to n, the inner loop does not run more than once.
So if n = 10, you get a single execution of the inner loop from i = n/2 onward (here i = 10/2 = 5): j starts at j = 5, enters the loop once, multiplies itself by 2, and the inner-loop condition while j < n fails.
EDIT: it would be O(n*log(n)) if j started at 1 every time, i.e. if the inner loop always had to double all the way from 1 up to n instead of from i up to n.
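As a quick empirical check of the Θ(n) claim, here is a small C sketch that counts the total number of inner-loop iterations (the counter and the test sizes are my own additions, not from the original problem):

#include <stdio.h>

/* Counts how many times j = 2 * j executes in total. */
long long inner_steps(long long n)
{
    long long count = 0;
    for (long long i = 1; i <= n; i++)        /* for i = 1 to n */
        for (long long j = i; j < n; j *= 2)  /* while j < n: j = 2 * j */
            count++;
    return count;
}

int main(void)
{
    /* If the total work is linear, steps / n should level off at a constant
       (it comes out around 2 for this loop). */
    for (long long n = 1000; n <= 1000000; n *= 10) {
        long long c = inner_steps(n);
        printf("n = %8lld  inner steps = %10lld  steps / n = %.3f\n",
               n, c, (double)c / (double)n);
    }
    return 0;
}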

time complexity exercise (pseudo code)

Just started Data Structures. Got stuck on this one:
I am having trouble with the inner while and for loops, because their behaviour changes depending on whether N is odd or even.
My best attempt so far: the inner for loop runs log n (base 2) times, and the while loop also runs log n (base 2) times.
Would love some help.
Concentrate on how many times do_something() is called.
The outer for loop clearly runs n times, and the while loop inside it is independent of the variable i. Thus do_something() is called n times the total number of times it is called in the while loop.
In the first pass through the while loop, do_something() is called once; in the second pass it is called twice, in the third pass 4 times, and so on.
The total number of times it is called is thus
1 + 2 + 4 + 8 + ... + 2^(k-1)
where k is maximal such that 2^(k-1) <= n.
There is a standard formula for the above sum. Use it, then solve for k in terms of n, multiply the result by the n from the outer loop, and you are done.
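If you want to check your result against that hint, here is the last step worked out using only the facts stated above (the original pseudocode is not shown here, so this relies on the answer's description of the loops):

    1 + 2 + 4 + ... + 2^(k-1) = 2^k - 1
    Since k is maximal with 2^(k-1) <= n, we have n < 2^k <= 2n, so
    n <= 2^k - 1 <= 2n - 1, i.e. the while loop contributes Θ(n) calls.
    Multiplying by the n iterations of the outer for loop gives Θ(n^2) calls to do_something().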

Compute Time Complexity

I got stuck computing the time complexity of the inner loop.
Let's consider the following case.
Case 1:
for(int i = 0; i <= n; i++)        // O(n)
{
    for(int j = 0; j <= i; j++)    // O(?)
    {
        // Something goes here
    }
}
Here the inner loop executes up to i times on each pass of the outer loop.
So can I say that the complexity of the inner loop is O(i), and that the overall complexity is O(n) * O(i), i.e. O(n*i)?
Could someone explain this briefly so I can understand the computation?
Thanks.
The overall time complexity is O(n²) (in fact, it’s even Θ(n²)).
The inner loop has complexity O(i). However, i is not an independent parameter (it ranges up to n), so simply saying that the whole thing has complexity O(ni) is wrong. The body of the inner loop will run roughly 0 + 1 + 2 + ⋯ + n = (n² + n) / 2 = Θ(n²) times.
Before I go over what you asked for, I will explain a simple example.
We calculate the time complexity based on how many times the innermost loop body is executed.
Consider this case:
for(i = 0; i < n; i++){
    for(j = 0; j < n; j++){
        ....
    }
}
Here the outer loop is executed n times and in every iteration the inner loop is executed n times.
1st iteration - n
2nd iteration - n
3rd iteration - n
.
.
.
nth iteration - n
So the inner loop body is executed n*n times, which is O(n^2).
Now we calculate for the case you asked about:
for(int i = 0; i <= n; i++){
    for(int j = 0; j <= i; j++){
        // Something goes here
    }
}
Here the outer loop is executed n times, and in the i'th iteration the inner loop is executed i times.
1st iteration - 1
2nd iteration - 2
3rd iteration - 3
.
.
.
nth iteration - n
So when we add these up we get
1 + 2 + 3 + ... + n = n(n+1)/2
which is basically O(n^2). :)
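If you want to verify this quickly, here is a small C sketch that counts the inner-loop executions for the loop bounds used in the question (the counter and the test sizes are my own additions):

#include <stdio.h>

/* Counts how many times the inner loop body runs with the inclusive (<=) bounds. */
long long body_count(int n)
{
    long long count = 0;
    for (int i = 0; i <= n; i++)
        for (int j = 0; j <= i; j++)
            count++;
    return count;
}

int main(void)
{
    /* With the inclusive bounds the exact count is (n+1)(n+2)/2, which is still Theta(n^2),
       so count / n^2 should approach 0.5. */
    for (int n = 10; n <= 10000; n *= 10) {
        long long c = body_count(n);
        printf("n = %6d  count = %12lld  count / n^2 = %.3f\n",
               n, c, (double)c / ((double)n * n));
    }
    return 0;
}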

Time complexity of nested while loop with if-else

I am getting confused about how to analyze the time complexity of a nested while loop that splits into odd and even cases. Could anyone explain how to deal with this situation?
i = 1
while (i < n) {
    k = i
    while (k < n) {
        if (k % 2 == 1)
            k++
        else
            k = k + 0.01*n
    }
    i = i + 0.1*n
}
So in a problem like this, the factors 0.01 and 0.1 play a huge role.
First let's consider the inner while loop: if k is odd, we increment k by 1; if k is even, we increment k by one-hundredth of n. How many iterations can this inner while loop run?
Clearly, if all iterations were of type 1 (the odd case), the inner while loop would run n-k times, and similarly, if all iterations were of type 2 (the even case), the inner while loop would run at most 100 times (as we increment the value of k by one-hundredth of n each time).
So for a given starting value of k, the number of iterations of the inner while loop is at most
max(n-k, 100). From now on, we will assume the value of n-k to be greater than 100 always, without loss of generality.
Okay, how does the outer loop iterate? In each iteration of the outer loop, the value of i increases by one-tenth of n, so the outer while loop will run at most 10 times.
Making the running times explicit and calculating the overall running time:
    1st  iteration of outer loop :   n - k
    2nd  iteration of outer loop : + n - (k + 0.1*n)
    3rd  iteration of outer loop : + n - (k + 0.2*n)
    ...
    10th iteration of outer loop : + n - (k + 0.9*n)
    -----------------------------------------------
                                 = 10n - 10k - 4.5n
Plugging in k = 1 (as this is the start value of k):
    10n - 10 - 4.5n = 5.5n - 10 = O(n)
Hence complexity is O(n) time.
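The same calculation written as one summation, under this answer's assumption that pass t of the outer loop (t = 0, 1, ..., 9) costs about n - (k + 0.1*t*n) inner iterations, with k = 1:

    sum for t = 0..9 of [ n - (1 + 0.1*t*n) ]
        = 10n - 10 - 0.1*n*(0 + 1 + ... + 9)
        = 10n - 10 - 4.5n
        = 5.5n - 10
        = O(n)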