Time complexity of code within nested loop

Given
for (int i = 1; i <= n - 1; i++)
    for (int j = i + 1; j <= n; j++)
        Console.WriteLine("{0} {1}", i, j);
I understand that the outer for loop runs 4n - 1 times and the inner runs 3n^2 - 3 times; however, I don't understand why the print statement runs n(n - 1)/2 times. I am only getting n(n - 1) as my time complexity, yet the slides say n(n - 1)/2. What am I missing?

For i = 1, j varies from 2 to n => n - 1 times
For i = 2, j varies from 3 to n => n - 2 times
...
For i = n - 1, j varies from n to n => 1 time
So the number of operations is (n - 1) + (n - 2) + (n - 3) + ... + 1,
which sums to n(n - 1)/2 (recall the formula for the sum of the first m natural numbers, 1 + 2 + ... + m = m(m + 1)/2, with m = n - 1 here; see https://cseweb.ucsd.edu/groups/tatami/handdemos/sum/).
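If you want to sanity-check that count, here is a small sketch of my own (in Java rather than C#, with a counter standing in for the print statement; the class and variable names are just mine):

public class PairCount {
    public static void main(String[] args) {
        for (int n = 1; n <= 10; n++) {
            long count = 0;
            for (int i = 1; i <= n - 1; i++) {
                for (int j = i + 1; j <= n; j++) {
                    count++;                      // stands in for the print statement
                }
            }
            // closed form from the summation above
            long formula = (long) n * (n - 1) / 2;
            System.out.println("n=" + n + "  count=" + count + "  n(n-1)/2=" + formula);
        }
    }
}

For every n the two numbers match, which is exactly the n(n - 1)/2 from the slides.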

You are not missing much because the big O bound of both n(n - 1) and n(n - 1)/2 is O(n^2). The double loop you showed will be upper bounded by O(n^2), and this is the main point here, I think.

Related

What is the time complexity of a nested for loop that iterates n - 1 - i?

So if I have a loop like this:
int x, y, z;
for (int i = 0; i < n - 1; i++) {
    for (int j = 0; j < n - 1 - i; j++) {
        x = 1;
        y = 2;
        z = 3;
    }
}
So we start with the x, y, z definitions, so we have 4 operations there.
int i = 0 occurs once; i < n - 1 and i++ iterate n - 1 times; int j = 0 iterates n - 1 times; j < n - 1 - i and j++ iterate (n - 1) * (n - 1 - i) times; and the x, y, z assignments would iterate (n - 1) * (n - 1 - i) times as well. So if I were to simplify this, would the above code run in O(n^2)?
"So we start with the x, y, z definitions, so we have 4 operations there."
This is not necessary; we only need to count the critical operations (i.e., in this case, how often the loop body executes).
"So if I were to simplify this, would the above code run in O(n²)?"
A function T(n) is in O(g(n)) if T(n) <= c*g(n) (under the assumption n >= n0) for some constants c > 0, n0 > 0.
So for your code, the loop body is executed n - 1 - i times for every i, and there are n - 1 values of i. So we have:
T(n) = (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2 ≤ (1/2) · n²
which is indeed true for c = 1/2, n0 = 1. Therefore T(n) ∈ O(n²).
You are correct that the complexity is O(n^2). There is more than one way to approach the question of why.
The formal way is to count the number of iterations of the inner loop, which will be n-1 the first time, then n-2, then n-3, ... all the way down to 1, giving a total of n*(n-1)/2 iterations, which is O(n^2).
An informal way is to say the outer loop runs O(n) times, and "on average", i is roughly n/2, so the inner loop runs on average about (n - n/2) = n/2 times, which is also O(n). So the total number of iterations is O(n) * O(n) = O(n^2).
With both of these techniques, it's not enough to just say that the loop body iterates O(n^2) times - we also need to check the complexity of the inner loop body. In this code, the body of the inner loop just does a few assignments, so it has a complexity of O(1). This means the overall complexity of the code is O(n^2) * O(1) = O(n^2). If instead the inner loop body did e.g. a binary search over an array of length n, then that would be O(log n) and the overall complexity of the code would be O(n^2 log n), for example.
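To make that last point concrete, here is a hedged sketch of my own (not from the question) where the inner body does a binary search over a sorted array of length n instead of the three assignments, so the overall cost becomes O(n^2 log n):

import java.util.Arrays;

public class NestedWithSearch {
    static int work(int[] sorted, int n) {
        int found = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - 1 - i; j++) {
                // O(log n) body instead of the O(1) assignments to x, y, z
                if (Arrays.binarySearch(sorted, i + j) >= 0) {
                    found++;
                }
            }
        }
        return found;   // O(n^2) iterations, O(log n) each
    }

    public static void main(String[] args) {
        int n = 100;
        int[] sorted = new int[n];
        for (int k = 0; k < n; k++) sorted[k] = k;   // already sorted: 0 .. n-1
        System.out.println(work(sorted, n));
    }
}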
Yes, you are right. The time complexity of this program will be O(n^2) in the worst case.

Time complexity of this loop with multiplication

I wonder what the complexity of this loop is in terms of n:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j * i <= n; j++) {
        minHeap.offer(arr1[i - 1] + arr2[j - 1]);
    }
}
What I did was to follow the concept of Big-O and gave it an upper bound -- O(n^2).
This will involve some math, so get ready :)
Let's first count how many times the line minHeap.offer(arr1[i - 1] + arr2[j - 1]); gets invoked. For each i from the outer loop, the number of iterations of the inner loop is n/i, because the condition j * i <= n is equivalent to j <= n/i. Therefore, the total number of iterations of the inner loop is n/1 + n/2 + n/3 + ... + 1, or, formally written, the sum of n/i over i = 1, ..., n.
There is a good approximation for this sum: it is n times the n-th harmonic number, which grows like ln(n). Since we are interested only in asymptotic complexity, we can take only the highest order term, which is n * logn. If there were some O(1) operation instead of minHeap.offer(arr1[i - 1] + arr2[j - 1]);, that would be the solution of your problem. However, the complexity of the offer method in Java is O(logk), where k denotes the current size of the priority queue. In our case, the priority queue gets larger and larger, so the total running time is log1 + log2 + ... + log(n * logn) = log(1 * 2 * ... * (n * logn)) = log((n * logn)!).
We can additionally simplify this by using Stirling's approximation, so the final complexity is O(n * logn * log(n * logn)).
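If you want a rough empirical check of the n/1 + n/2 + ... part, here is a sketch of my own (arr1 and arr2 are just dummy arrays here) that counts the offer calls and compares them with n * ln(n):

import java.util.PriorityQueue;

public class HarmonicCount {
    public static void main(String[] args) {
        int n = 1000;
        int[] arr1 = new int[n], arr2 = new int[n];        // dummy contents, all zeros
        PriorityQueue<Integer> minHeap = new PriorityQueue<>();
        long offers = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j * i <= n; j++) {
                minHeap.offer(arr1[i - 1] + arr2[j - 1]);
                offers++;
            }
        }
        // the harmonic sum n/1 + n/2 + ... + 1 grows like n * ln(n)
        System.out.println("offers = " + offers + ", n*ln(n) ~= " + Math.round(n * Math.log(n)));
    }
}

The two numbers are of the same order (the exact sum is n times the n-th harmonic number, so it is a bit larger than n * ln(n)), which is the n * logn term the rest of the answer builds on.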

Calculating the time complexity

Can somebody help with the time complexity of the following code:
for (i = 0; i <= n; i++)
{
    for (j = 0; j <= i; j++)
    {
        for (k = 2; k <= n; k = k^2)
            print("")
    }
}
According to me, the first loop will run n times, the second will run (1 + 2 + 3 + ... + n) times, and the third will run log(log n) times, but I am not sure about the answer.
We start from the inside and work out. Consider the innermost loop:
for (k = 2; k <= n; k = k^2)
    print("")
How many iterations of print("") are executed? First note that n is constant. What sequence of values does k assume?
iter | k
--------
1 | 2
2 | 4
3 | 16
4 | 256
We might find a formula for this in several ways. I used guess-and-prove to get iter = log(log(k)) + 1 (logarithms base 2). Since the loop won't execute the next iteration if the value is already bigger than n, the total number of iterations executed for a given n is floor(log(log(n)) + 1). We can check this with a couple of values to make sure we got it right. For n = 2, we get one iteration, which is correct. For n = 5, we get two. And so on.
The next level does i + 1 iterations, where i varies from 0 to n. We must therefore compute the sum 1 + 2 + ... + (n + 1), which gives the total number of iterations of the outermost and middle loops: this sum is (n + 1)(n + 2)/2. We multiply this by the cost of the inner loop to get (n + 1)(n + 2)(log(log(n)) + 1)/2, the total cost of the snippet. The fastest-growing term in the expansion is n^2 log(log(n)), and so that is what would typically be given as the asymptotic complexity.
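Here is a small sketch of my own that counts the print calls and compares them against (n + 1)(n + 2)(floor(log(log(n))) + 1)/2, using base-2 logarithms (floating-point log can be off by one right at exact powers, so treat the formula column as approximate):

public class TripleLoopCount {
    // floor(log2(log2(n)) + 1): iterations of the innermost loop
    static long innerIters(long n) {
        double log2n = Math.log(n) / Math.log(2);
        return (long) Math.floor(Math.log(log2n) / Math.log(2) + 1);
    }

    public static void main(String[] args) {
        for (long n = 2; n <= 2048; n *= 4) {
            long count = 0;
            for (long i = 0; i <= n; i++) {
                for (long j = 0; j <= i; j++) {
                    for (long k = 2; k <= n; k = k * k) {   // k = k^2 read as squaring
                        count++;                            // stands in for print("")
                    }
                }
            }
            long formula = (n + 1) * (n + 2) / 2 * innerIters(n);
            System.out.println("n=" + n + "  count=" + count + "  formula=" + formula);
        }
    }
}

Both columns agree for these n, and the dominant growth is the n^2 log(log(n)) term.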

How to calculate the complexity of the Algorithm?

I have to calculate the complexity of this algorithm. I tried to solve it and found the answer to be O(n log n). Is it correct? If not, please explain.
for (i = 5; i < n/2; i += 5)
{
    for (j = 1; j < n; j *= 4)
        op;
    x = 3*n;
    while (x > 6)
        { op; x--; }
}
Katrina, in this example we've got O(n*log(n)):

for (int i = 0; i < N; i++) {
    c = i;
    while (c > 0) {
        c = c / 2;
    }
}

However, you have another for loop that encloses both of these loops. I'm not quite sure I understand how the algorithm works, but in a standard layout with another enclosing for loop it should be O(n*n*log n):

for (int j = 0; j < N; j++) {        // O(n)
    for (int i = 0; i < N; i++) {    // O(n)
        c = i;
        while (c > 0) {              // O(logn)
            c = c / 2;               // O(1)
        }
    }
}

The result for this standard algorithm would be O(n) * O(n) * O(logn) * O(1). So, I think you forgot to include another O(n).
Hope it helps
Let's count the number of iterations in each loop.
The outermost loop for (i=5; i<n/2; i+=5) steps through all values between 5 and n / 2 in steps of 5. It will, thus, require approximately n / 10 - 1 iterations.
There are two inner loops. Let's consider the first one: for (j=1; j<n; j*=4). This steps through all values of the form 4^x between 1 and n for integer values of x. The lowest value of x for which this holds is 0, and the highest is the x that fulfills 4^x < n, i.e., the integer closest to log_4(n). Thus, labelling the iterations by x, we have iterations 0, 1, ..., log_4(n). In other words, we have approximately log_4(n) + 1 iterations for this loop.
Now consider the second inner loop. It steps through all values from 3 * n down to 7. Thus the number of iterations is approximately 3 n - 6.
All other operations have constant run time and can therefore be ignored.
How do we put this together? The two inner loops are run sequentially (i.e., they are not nested) so the run time for both of them together is simply the sum:
(log_4(n) + 1) + (3 n - 6) = 3 n + log_4(n) - 5.
The outer loop and the two inner loops are, however, nested. For every iteration of the outer loop, both the inner ones are run. Therefore we multiply the number of iterations of the outer with the total of the inner:
(n / 10 - 1) * (3 n + log_4(n) - 5) =
= 3 n^2 / 10 + n log_4(n) / 10 - 7 n / 2 - log_4(n) + 5.
Finally, complexity is often expressed in Big-O notation -- that is, we're only interested in the order of the run time. This means two things. First, we can ignore all constant factors in all terms. For example, O(3 n^2 / 10) becomes just O(n^2). Thereby we have:
O(3 n^2 / 10 + n log_4(n) / 10 - 7 n / 2 - log_4(n) + 5) =
= O(n^2 + n log_4(n) - n - log_4(n) + 1).
Second, we can ignore all terms that have a lower order than the term with the highest order. For example, n is of a higher order than 1, so we have O(n + 1) = O(n). Thereby we have:
O(n^2 + n log_4(n) - n - log_4(n) + 1) = O(n^2).
Finally, we have the answer. The complexity of the algorithm your code describes is O(n^2).
(In practice, one would never calculate the (approximate) number of iterations as we did here. The simplification we did in the last step can be done earlier which makes the calculations much easier.)
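If you want to see the numbers, here is a sketch of my own where op is replaced by a counter, compared against the dominant 3n^2/10 term from the derivation above:

public class OpCount {
    public static void main(String[] args) {
        for (int n = 1000; n <= 8000; n *= 2) {
            long ops = 0;
            for (int i = 5; i < n / 2; i += 5) {
                for (int j = 1; j < n; j *= 4) {
                    ops++;                         // "op" in the first inner loop
                }
                int x = 3 * n;
                while (x > 6) {
                    ops++;                         // "op" in the second inner loop
                    x--;
                }
            }
            System.out.println("n=" + n + "  ops=" + ops + "  3n^2/10=" + (3L * n * n / 10));
        }
    }
}

The counts track 3n^2/10 closely, and quadrupling as n doubles is exactly the O(n^2) behaviour.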
As Fredrik mentioned, the time complexity of the first and third loops is O(n), and the time for the second loop is O(log(n)).
So the complexity of the following algorithm is O(n^2).
for (i = 5; i < n/2; i += 5)
{
    for (j = 1; j < n; j *= 4)
        op;
    x = 3*n;
    while (x > 6)
        { op; x--; }
}
Note that the complexity of the following algorithm is O(n^2 * log(n)), which is not the same as the algorithm above.
for (i = 5; i < n/2; i += 5)
{
    for (j = 1; j < n; j *= 4)
    {
        op;
        x = 3*n;
        while (x > 6)
            { op; x--; }
    }
}
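To see the difference between the two versions empirically, here is a sketch of my own with op replaced by a counter in both variants:

public class SequentialVsNested {
    static long sequential(int n) {            // the two inner loops run one after the other: O(n^2)
        long ops = 0;
        for (int i = 5; i < n / 2; i += 5) {
            for (int j = 1; j < n; j *= 4) ops++;
            int x = 3 * n;
            while (x > 6) { ops++; x--; }
        }
        return ops;
    }

    static long nested(int n) {                // the while loop sits inside the for loop: O(n^2 * log(n))
        long ops = 0;
        for (int i = 5; i < n / 2; i += 5) {
            for (int j = 1; j < n; j *= 4) {
                ops++;
                int x = 3 * n;
                while (x > 6) { ops++; x--; }
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        for (int n = 1000; n <= 4000; n *= 2) {
            System.out.println("n=" + n + "  sequential=" + sequential(n) + "  nested=" + nested(n));
        }
    }
}

The nested version grows faster than the sequential one by roughly a log_4(n) factor, which is the extra log(n) in O(n^2 * log(n)).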

What is the order of growth of this function?

I'm new to algorithms and big O. What is the order of growth of this function?
I do a println and see that f(10) runs 15 times and f(20) runs 31 times.
It looks to me like log(N) * N / 2. So is it logarithmic or linearithmic?
static long f(long N) {
    long sum = 0;
    for (long i = 1; i < N; i *= 2)
        for (long j = 0; j < i; j++)
            sum++;
    return sum;
}
The runtime is O(n). To see this, note that the inner loop runs 1 time on the first iteration of the outer loop, 2 times on the second, 4 times on the third, and more generally 2^i times on iteration i (counting from zero). The outer loop stops after lg n iterations because it keeps doubling, so the total work done is
1 + 2 + 4 + 8 + ... + 2^(lg n)
This is the sum of a geometric series and works out to 2^(lg n + 1) - 1 = 2 · 2^(lg n) - 1 = 2n - 1 = O(n).
Hope this helps!
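If you want to check the linear growth numerically, here is a quick sketch of my own reusing the f from the question (the exact count is 2^(floor(lg(N - 1)) + 1) - 1, which sits between about N and 2N):

public class GrowthCheck {
    static long f(long N) {
        long sum = 0;
        for (long i = 1; i < N; i *= 2)
            for (long j = 0; j < i; j++)
                sum++;
        return sum;
    }

    public static void main(String[] args) {
        // f(N)/N stays bounded (roughly between 1 and 2), which is linear growth
        for (long N = 10; N <= 10_000_000; N *= 10) {
            long r = f(N);
            System.out.println("N=" + N + "  f(N)=" + r + "  f(N)/N=" + (double) r / N);
        }
    }
}

The ratio f(N)/N never exceeds 2, whereas for an O(N lg N) function it would keep growing.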
The inner loop on j counts i times; the max is n.
The outer loop counts from 1 to n, multiplying by 2 each time, so it runs lg n times.
So the total is O(n lg n).
Proceeding formally, you obtain:
sum = 1 + 2 + 4 + ... + 2^⌊lg(N - 1)⌋ = 2^(⌊lg(N - 1)⌋ + 1) - 1 = Θ(N)
O(2^lg n) should be the complexity. The growth of an exponential function is more than that of a linear function. Hence 2 · 2^(lg n) = O(2^(lg n)) instead of O(n).