Time complexity of this loop with multiplication

I'm wondering what the complexity of this loop is in terms of n:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j * i <= n; j++) {
        minHeap.offer(arr1[i - 1] + arr2[j - 1]);
    }
}
What I did was follow the concept of Big-O and give it an upper bound -- O(n^2).

This will involve some math, so get ready :)
Let's first count how many times the line minHeap.offer(arr1[i - 1] + arr2[j - 1]); gets invoked. For each i from the outer loop, the number of iterations of the inner loop is ⌊n/i⌋, because the condition j * i <= n is equivalent to j <= n/i. Therefore, the total number of iterations of the inner loop is n/1 + n/2 + n/3 + ... + 1, or, formally written, Σ_{i=1}^{n} ⌊n/i⌋.
There is a good approximation for this sum explained in detail e.g. here, so take a look. Since we are interested only in asymptotic complexity, we can take only the highest-order term, which is n * logn. If the loop body were some O(1) operation instead of minHeap.offer(arr1[i - 1] + arr2[j - 1]);, that would already be the answer to your problem: O(n * logn). However, the complexity of the offer method in Java is O(logk), where k denotes the current size of the priority queue. In our case, the priority queue gets larger and larger, so the total running time is log1 + log2 + ... + log(n * logn) = log(1 * 2 * ... * (n * logn)) = log((n * logn)!).
We can additionally simplify this by using Stirling's approximation, so the final complexity is O(n * logn * log(n * logn)).
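If you want to sanity-check the first part of that analysis, here is a minimal, self-contained sketch (the arrays are filled with dummy zeros, which is my assumption just to make it runnable) that counts how many times offer is called and compares that count to n * ln(n):

import java.util.PriorityQueue;

public class OfferCount {
    public static void main(String[] args) {
        int n = 100000;
        int[] arr1 = new int[n]; // dummy data, all zeros
        int[] arr2 = new int[n]; // dummy data, all zeros
        PriorityQueue<Integer> minHeap = new PriorityQueue<>();
        long offers = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j * i <= n; j++) {
                minHeap.offer(arr1[i - 1] + arr2[j - 1]);
                offers++;
            }
        }
        // the harmonic-style sum n/1 + n/2 + ... + 1 is approximately n * ln(n)
        System.out.println("offers = " + offers + ", n*ln(n) = " + (long) (n * Math.log(n)));
    }
}

For n = 100000 the two printed numbers agree to within a few percent, which matches the n * logn estimate for the number of offers.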

Related

I can find the time complexity of a 'for' loop, but when the condition is changed I get stuck. Is there a mathematical way to calculate time complexity? [duplicate]

Q1)
int a = 0, i = N;
while (i > 0) {
    a += i;
    i /= 2;
}
Q2)
int i, j, k = 0;
for (i = n / 2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n / 2;
    }
}
If you approach it purely mathematically, things can get very complex. Most of the time it's better to walk through the logic of the algorithm.
Q1) i goes from N down to 0. Every time the loop runs, i (which was initially N) is halved, which gives you logarithmic time complexity: O(log N).
Q2) The outer loop runs n/2 times, so O(N/2), and according to the rules of time complexity you drop the constant, so O(N).
Every time the inner loop runs, you are doubling the value of j, which means you are closing the distance between j and n in multiplications of 2, which is logarithmic, so O(log N).
And since the inner loop runs for each element in the outer loop, the total time complexity is O(N log N).
And notice that I am not taking k = k + n / 2 and a += i into consideration because these operations run in constant time O(1) which is the least dominant time complexity and so has no effect in the calculations.
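If you want to verify these counts numerically, here is a small self-contained sketch (the choice of N and the counter variables are mine) that counts the iterations of both snippets and compares them to log2(N) and (N/2)*log2(N):

public class IterationCount {
    public static void main(String[] args) {
        int n = 1 << 20; // N = 1,048,576

        // Q1: i is halved on every pass, so roughly log2(N) iterations
        long q1 = 0;
        for (int i = n; i > 0; i /= 2) {
            q1++;
        }

        // Q2: the outer loop makes about n/2 passes, the inner loop about log2(n) passes each
        long q2 = 0;
        for (int i = n / 2; i <= n; i++) {
            for (int j = 2; j <= n; j *= 2) {
                q2++;
            }
        }

        double log2n = Math.log(n) / Math.log(2);
        System.out.println("Q1 iterations: " + q1 + " vs log2(N) = " + log2n);
        System.out.println("Q2 iterations: " + q2 + " vs (N/2)*log2(N) = " + (long) (n / 2 * log2n));
    }
}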

What is the time complexity of the function below?

What is the running time complexity of fun()?
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i = i - 2)
        for (int j = 2; j < i; j = j * j)
            for (int k = j; k > 0; k = k / 2)
                count += 1;
    return count;
}
Is it O(n * lglgn * lglglgn)?
----
Edit:
The j loop runs about loglog(i) times, but the largest value j reaches can be almost n (for example, for n = 17, max(j) = 16).
The k loop runs about log(j) times, and since the max value of j is at most n, the max number of iterations can be log(n).
So we can say that the big O of this question is O(n * lglgn * lgn).
Since the values of j and k depend on the previous loop variables (i and j), maybe there is a tighter answer to this question.
We need to count this carefully, because the number of iterations of each inner loop depends in a non-trivial way on the outer loop's variable. Simply averaging for each loop and multiplying the results together will not give the right answer.
The outer loop runs O(n) times, because i counts from n down to 0 in constant steps.
The middle loop's values of j are 2, then 2*2 = 4, then 4*4 = 16, and on the m'th iteration, j = 2^2^m. The last iteration will be when 2^2^m >= i, in which case m >= log log i. So this runs O(log log i) times.
The innermost loop runs O(log j) times, because on the m'th iteration, k = j / 2^m. The last iteration will be when k <= 1, in which case m >= log j. So this runs O(log j) times.
However, it is not correct to multiply these together to get O(n * log log n * log log log n) - because i is not n on every iteration, and j is not log log n on every iteration. This gives an upper bound, but not a tight one. To calculate the true time complexity, you will need to write it as a double-summation, and simplify it algebraically.
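For reference, here is one way to set up that double summation (my own sketch, not part of the original answer). The outer loop visits i = n, n-2, n-4, ..., the middle loop visits j = 2^(2^m) for m = 0, 1, 2, ... as long as 2^(2^m) < i, and for each such j the innermost loop contributes about log2(j) + 1 = 2^m + 1 steps:

count(n) = Σ_{i = n, n-2, n-4, ...} Σ_{m ≥ 0 with 2^(2^m) < i} (2^m + 1)

The inner sum over m is essentially a geometric series in 2^m, which is what makes the algebraic simplification mentioned above tractable.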
As a simpler example to think about, consider the following code:
for (i = 1; i < n; i *= 2) {
    for (j = 0; j < i; j += 1) {
        // do something
    }
}
The outer loop runs O(log n) times, and the inner loop runs O(i) times, but the overall complexity is actually O(n). To see this, count how many times // do something is reached; the first time the outer loop iterates it'll be 1, then it'll be 2, then 4, then 8, and so on up to n. This is a geometric progression with a sum <= 2n, giving a total number of steps which is O(n).
Note that if we naively multiply the two loops' complexities we get O(n log n) instead, which is an upper bound, but not a tight one.
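Written out (my own arithmetic, consistent with the paragraph above), the per-iteration counts form the geometric progression

1 + 2 + 4 + ... + 2^m = 2^(m+1) - 1 < 2n,

where 2^m is the largest power of two below n, so the total number of steps is O(n).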
Using Big O notation:
With the first (outer) loop we get O(N/2). You have a loop over N items and you are reducing the counter by 2 every time, thus you get a total of N/2 iterations.
With the middle loop we get O(Log(I)).
With the innermost loop we have O(Log(J)), because you are dividing your iterator by 2 on every pass.
If we multiply the three complexities because they are nested:
O(N/2) * O(Log(I)) * O(Log(J)) ~ O(N/2*Log(I)*Log(J)) ~ O(N/2*Log^2(N)) ~ O(N*Log^2(N)).
So we get a complexity of O(N*Log^2(N)).

What is the time complexity of a nested for loop that iterates n - 1 - i times?

So if I have a loop like this?
int x, y, z;
for (int i = 0; i < n - 1; i++) {
    for (int j = 0; j < n - 1 - i; j++) {
        x = 1;
        y = 2;
        z = 3;
    }
}
So we start with the x, y, z definition, so we have 4 operations there. int i = 0 occurs once; i < n - 1 and i++ run n - 1 times; int j = 0 runs n - 1 times; j < n - 1 - i and j++ run (n - 1) * (n - 1 - i) times; and the x, y, z assignments would run (n - 1) * (n - 1 - i) times as well. So if I were to simplify this, would the above code run at O(n^2)?
so we start with the x, y, z definition so we have 4 operations there
This is not necessary, we need only count critical operations (i.e. in this case how often the loop body executes).
So if I were to simplify this, would the above code run at O(n²)?
A function T(n) is in O(g(n)) if T(n) <= c*g(n) (under the assumption n >= n0) for some constants c > 0, n0 > 0.
So for your code, the loop body is executed n - 1 - i times for every i, and i takes the values 0, 1, ..., n - 2. So we have:
T(n) = (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2 <= (1/2)n²
Which is indeed true for c = 1/2, n0 = 1. Therefore T(n) ∈ O(n²).
You are correct that the complexity is O(n^2). There is more than one way to approach the question of why.
The formal way is to count the number of iterations of the inner loop, which will be n-1 the first time, then n-2, then n-3, ... all the way down to 1, giving a total of n*(n-1)/2 iterations, which is O(n^2).
An informal way is to say the outer loop runs O(n) times, and "on average", i is roughly n/2, so the inner loop runs on average about (n - n/2) = n/2 times, which is also O(n). So the total number of iterations is O(n) * O(n) = O(n^2).
With both of these techniques, it's not enough to just say that the loop body iterates O(n^2) times - we also need to check the complexity of the inner loop body. In this code, the body of the inner loop just does a few assignments, so it has a complexity of O(1). This means the overall complexity of the code is O(n^2) * O(1) = O(n^2). If instead the inner loop body did e.g. a binary search over an array of length n, then that would be O(log n) and the overall complexity of the code would be O(n^2 log n), for example.
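As a quick sanity check (my own snippet, not from the question), you can count how many times the inner loop body runs and compare it to n(n-1)/2:

public class TriangleCount {
    public static void main(String[] args) {
        int n = 10000;
        long count = 0;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - 1 - i; j++) {
                count++; // stands in for the x, y, z assignments
            }
        }
        System.out.println("count = " + count + ", n*(n-1)/2 = " + ((long) n * (n - 1) / 2));
    }
}

The two printed numbers are equal, matching the triangular sum from the formal argument above.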
Yes, you are right. The time complexity of this program will be O(n^2) in the worst case.

Time complexity of code within nested loop

Given
for (int i = 1; i <= n - 1; i++)
    for (int j = i + 1; j <= n; j++)
        Console.WriteLine("{0}, {1}", i, j);
I understand that the outer for loop runs 4n - 1 times and the inner runs 3n^2 - 3 times, however I don't understand why the print statement runs n(n - 1)/2 times. I am only getting n(n - 1) as my time complexity yet the slides say n(n - 1)/2. What am I missing?
for i = 1, j varies from 2 to n => n-1 times
for i = 2, j varies from 3 to n => n-2 times
...
...
for i=n-1 j varies from n to n => 1 time
so the number of operations is (n-1) + (n-2) + (n-3) + ... + 1,
which sums to n(n-1)/2 (remember the formula for the summation of the first n natural numbers - https://cseweb.ucsd.edu/groups/tatami/handdemos/sum/).
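Written out with the arithmetic-series formula (my own intermediate step, consistent with the answer above): the sum has n-1 terms, the first is n-1 and the last is 1, so

(n-1) + (n-2) + ... + 1 = ((n-1) + 1) * (n-1) / 2 = n(n-1)/2.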
You are not missing much because the big O bound of both n(n - 1) and n(n - 1)/2 is O(n^2). The double loop you showed will be upper bounded by O(n^2), and this is the main point here, I think.

How to calculate the complexity of this algorithm?

I have to calculate the complexity of this algorithm. I tried to solve it and found the answer to be O(nlogn). Is that correct? If not, please explain.
for (i=5; i<n/2; i+=5)
{
    for (j=1; j<n; j*=4)
        op;
    x = 3*n;
    while (x > 6)
        { op; x--; }
}
Katrina, in this example we've got O(n*log(n)):
for (int i = 0; i < N; i++) {
    c = i;
    while (c > 0) {
        c = c / 2;
    }
}
However, you have another for loop that encloses both of these.
I'm not quite sure I understand how the algorithm works, but the standard way, considering an additional enclosing for loop, would be O(n*n*log n):
for (int j = 0; j < N; j++) {      --> O(n)
    for (int i = 0; i < N; i++) {      O(n)
        c = i;
        while (c > 0) {                O(logn)
            c = c / 2;                 O(1)
        }
    }
}
So this standard algorithm would be O(n) * O(n) * O(logn) * O(1).
So, I think you forgot to include another O(n).
Hope it helps.
Let's count the number of iterations in each loop.
The outermost loop for (i=5; i<n/2; i+=5) steps through all values between 5 and n / 2 in steps of 5. It will, thus, require approximately n / 10 - 1 iterations.
There are two inner loops. Let's consider the first one: for (j=1; j<n; j*=4). This steps through all values of the form 4^x between 1 and n for integer x. The lowest value of x for which this holds is 0, and the highest value is the largest x that fulfills 4^x < n -- i.e., approximately log_4(n). Thus, labelling the iterations by x, we have iterations 0, 1, ..., log_4(n). In other words, we have approximately log_4(n) + 1 iterations for this loop.
Now consider the second inner loop. It steps through all values from 3 * n down to 7. Thus the number of iterations is approximately 3 n - 6.
All other operations have constant run time and can therefore be ignored.
How do we put this together? The two inner loops are run sequentially (i.e., they are not nested) so the run time for both of them together is simply the sum:
(log_4(n) + 1) + (3 n - 6) = 3 n + log_4(n) - 5.
The outer loop and the two inner loops are, however, nested. For every iteration of the outer loop, both the inner ones are run. Therefore we multiply the number of iterations of the outer with the total of the inner:
(n / 10 - 1) * (3 n + log_4(n) - 5) =
= 3 n^2 / 10 + n log_4(n) / 10 - 7 n / 2 - log_4(n) + 5.
Finally, complexity is often expressed in Big-O notation -- that is, we're only interested in the order of the run time. This means two things. First, we can ignore all constant factors in all terms. For example, O(3 n^2 / 10) becomes just O(n^2). Thereby we have:
O(3 n^2 / 10 + n log_4(n) / 10 - 7 n / 2 - log_4(n) + 5) =
= O(n^2 + n log_4(n) - n - log_4(n) + 1).
Second, we can ignore all terms that have a lower order than the term with the highest order. For example, n is of a higher order than 1, so we have O(n + 1) = O(n). Thereby we have:
O(n^2 + n log_4(n) - n - log_4(n) + 1) = O(n^2).
Finally, we have the answer. The complexity of the algorithm your code describes is O(n^2).
(In practice, one would never calculate the (approximate) number of iterations as we did here. The simplification we did in the last step can be done earlier which makes the calculations much easier.)
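If you want to check the leading term empirically, here is a small counter sketch (my own code; the placeholder op from the question is replaced by a counter increment) that compares the measured count of op against 3n^2/10:

public class OpCount {
    public static void main(String[] args) {
        int n = 20000;
        long ops = 0;
        for (int i = 5; i < n / 2; i += 5) {
            for (int j = 1; j < n; j *= 4) {
                ops++; // op
            }
            int x = 3 * n;
            while (x > 6) {
                ops++; // op
                x--;
            }
        }
        System.out.println("ops = " + ops + ", 3*n*n/10 = " + (3L * n * n / 10));
    }
}

For n around 20000 the measured count is already dominated by the 3n^2/10 term, in line with the O(n^2) conclusion.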
As Fredrik mentioned, the time complexities of the first and third loops are O(n), and the time for the second loop is O(log(n)).
So the complexity of the following algorithm is O(n^2).
for (i=5; i<n/2; i+=5)
{
    for (j=1; j<n; j*=4)
        op;
    x = 3*n;
    while (x > 6)
        { op; x--; }
}
Note that the complexity of the following algorithm is O(n^2*log(n)), which is not the same as the algorithm above.
for (i=5; i<n/2; i+=5)
{
    for (j=1; j<n; j*=4)
    {
        op;
        x = 3*n;
        while (x > 6)
            { op; x--; }
    }
}
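Roughly speaking (my own back-of-the-envelope line, consistent with this answer): in the nested version the while loop runs for every value of j, so the count of op is about

(n/10) * log_4(n) * (3n - 6), which is Θ(n^2 * log(n)),

whereas in the original code the two inner loops run one after the other, so the count is about

(n/10) * (log_4(n) + 3n - 6), which is Θ(n^2).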