Run time of nested loops - time-complexity

Sorry if this question has already been asked, I wasn't sure how to search for it.
Say you have the following loop:
for (i = 0; i < n; i++)
    for (j = i; j < n; j++)
Would this be O(n^2) or O(n log(n)), and why?

The outer loop's runtime (by itself) is O(n), and for a given i the inner loop runs n - i times. You can't treat i as a constant and throw it away, though; instead, sum the inner loop's work over all values of i: n + (n - 1) + ... + 1 = n(n + 1)/2, which is O(n^2).
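If it helps to see the count directly, here is a small throwaway C program (my own sketch, not part of the original question) that counts the inner-loop iterations and compares the total with n(n+1)/2:
#include <stdio.h>

int main(void) {
    for (int n = 100; n <= 400; n *= 2) {
        long long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = i; j < n; j++)
                count++;                /* one inner-loop iteration */
        /* the inner loop runs n, n-1, ..., 1 times: n(n+1)/2 in total */
        printf("n=%d  count=%lld  n(n+1)/2=%lld\n",
               n, count, (long long)n * (n + 1) / 2);
    }
    return 0;
}
The two printed numbers match exactly, and the count roughly quadruples each time n doubles, which is the O(n^2) signature.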

What would the big O notation be for this particular code? It seems that a for loop with a worse time complexity is nested in one that's O(1)

As stated in the title, in the code below the for loop with the bigger time complexity is nested in one that is O(1). So what is the overall time complexity, and why?
void function(int n) {
    int i, j;
    int x = 0;
    for (i = 0; i < 10; i++)
        for (j = 0; j < n / 2; j++)
            x--;
}
It’s O(n).
The inner loop is all that matters there, since the outer loop runs a constant number of times (10). The outer loop's contribution is just a constant factor, which you ignore when figuring out Big-O.
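If you want to convince yourself, here is a minimal counting version of the same loop structure (the counter and the name function_count are my additions):
#include <stdio.h>

/* same loop structure as above, but returning how often the body runs */
static long long function_count(int n) {
    long long count = 0;
    for (int i = 0; i < 10; i++)         /* constant: always 10 passes */
        for (int j = 0; j < n / 2; j++)  /* n/2 passes each time */
            count++;
    return count;                        /* 10 * (n/2) = 5n */
}

int main(void) {
    for (int n = 1000; n <= 4000; n *= 2)
        printf("n=%d  count=%lld  count/n=%lld\n",
               n, function_count(n), function_count(n) / n);
    return 0;
}
count/n stays at 5 no matter how large n gets, i.e. the growth is linear.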

Time complexity on nested for loop

function(n)
{
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= n / 2; j++)
            output("")
    }
}
I have calculated the time complexity of the first for loop, which is O(n). The second for loop has j <= n / 2, so for any given n I put in, for example a range of [1, 2, ..., 10], I think it will give me O(log(n)), since it keeps giving me a series n, n/2, n/4, n/8, ..., k.
So if we wanted to compare the relationship, it must look something like 2^n = k.
My question is will it give me O(log(n))?
The correct summation according to the code is:
sum_{i=1}^{n} sum_{j=1}^{n/2} 1 = n * (n/2) = n^2 / 2
So, it's not O(log n). It's O(n^2).
No, it does not give you O(log n).
The first for loop is O(n). The second loop is O(n) as well, as its number of iterations grows linearly with n.
It would be the same even if you changed the second loop to something like
for (j = 1; j <= n/2000; j++)
or, in general, if you replaced the denominator with any constant k.
To conclude, the time complexity is quadratic, i.e., O(n^2).
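To see the effect of the constant denominator concretely, here is a small counting sketch (my own, not from the question) that runs the loop with k = 2 and k = 2000 and prints count / n^2:
#include <stdio.h>

int main(void) {
    int denoms[] = {2, 2000};
    for (int d = 0; d < 2; d++) {
        int k = denoms[d];
        for (long long n = 10000; n <= 20000; n *= 2) {
            long long count = 0;
            for (long long i = 1; i <= n; i++)
                for (long long j = 1; j <= n / k; j++)
                    count++;
            /* count = n * (n/k) = n^2 / k: k only scales the constant factor */
            printf("k=%d n=%lld  count/n^2=%f\n",
                   k, n, (double)count / ((double)n * n));
        }
    }
    return 0;
}
The ratio is flat in n (0.5 for k = 2, 0.0005 for k = 2000), which is exactly quadratic growth with a different constant in front.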

Complexity of the following code fragment

I have to find the big O for the following piece of code:
for(int i=0; i<n; i++)          // O(n)
    if(n*f(i)+1>0)              // O(n*log(n))
        for(int j=f(i); j<n; j++)   // O(n^2*log(n))
            g(j);                   // O(n^3*log(n))
The big O for the function f is O(log(n)) and for the function g it is O(n).
In the comments I've written my calculations, but I'm not sure if they are correct.
Let F = f(i) = lambda * log_a(i) for some constant lambda (this is just the O(log n) assumption on f).
Then the inner loop is (ignoring off-by-one):
sum_{j=F}^{n-1} g(j) = (n - F) * O(n)
Which is O(n^2), given that i < n and hence F = O(log n) is far smaller than n.
Outer loop starting condition is n * F + 1 > 0, i.e. i > a^(-1/(lambda*n)), where a is some constant = the base of the log. The exponent should be very small (though it depends on the value of lambda), so the condition's threshold should be << n and the if passes for essentially every i.
Therefore the complexity is O(n^3).
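Since f and g are not given, here is a rough simulation with stand-in implementations of the stated costs (f does O(log n) work and returns an O(log n)-sized value, g(j) does O(j) work, which is O(n) since j < n); these stand-ins are my own assumptions, not the asker's code:
#include <stdio.h>

static long long ops;   /* total work done by the stand-ins */

/* stand-in for f: O(log n) work, O(log n)-sized return value */
static int f(int i) {
    int v = 0;
    for (int t = i + 1; t > 0; t /= 2) { ops++; v++; }
    return v;
}

/* stand-in for g: O(j) work, bounded by O(n) because j < n */
static void g(int j) {
    for (int t = 0; t < j; t++) ops++;
}

int main(void) {
    for (int n = 200; n <= 800; n *= 2) {
        ops = 0;
        for (int i = 0; i < n; i++)
            if ((long long)n * f(i) + 1 > 0)
                for (int j = f(i); j < n; j++)
                    g(j);
        printf("n=%d  ops/n^3=%.3f\n",
               n, (double)ops / ((double)n * n * n));
    }
    return 0;
}
The printed ratio settles around 0.5, consistent with roughly n * n^2 / 2 total work, i.e. O(n^3).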

Order of growth of triple for loop

I'm practising algorithm complexity and I came across this code online but I cannot figure out the order of growth for it. Any ideas?
int counter = 0;
for (int i = 0; i*i < N; i++)
    for (int j = 0; j*j < 4*N; j++)
        for (int k = 0; k < N*N; k++)
            counter++;
Take it one step (or loop in this case) at a time:
The first loop increments i as long as its square is lower than N, so this must be O(sqrt N), because int(sqrt(N)) or int(sqrt(N)) - 1 is the largest integer value whose square is lower than N;
The same holds for the second loop. We can ignore the 4 because it is a constant, and we do not care about those when dealing with big-oh notation. So the first two loops together are O(sqrt N)*O(sqrt N) = O(sqrt(N)^2) = O(N). You can multiply the complexities because the loops are nested, so the second loop will fully execute for each iteration of the first;
The third loop is obviously O(N^2), because k goes up to the square of N.
So the whole thing has to be O(N) * O(N^2) = O(N^3). You can usually solve problems like this by figuring out the complexity of the first loop, then the second, then the first two and so on.
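You can also sanity-check this empirically; a quick counting sketch (my own) prints counter / N^3, which settles at about 2 (the constant coming from the 2*sqrt(N) bound on j):
#include <stdio.h>

int main(void) {
    for (long long N = 64; N <= 256; N *= 2) {
        long long counter = 0;
        for (long long i = 0; i * i < N; i++)
            for (long long j = 0; j * j < 4 * N; j++)
                for (long long k = 0; k < N * N; k++)
                    counter++;
        /* expect about sqrt(N) * 2*sqrt(N) * N^2 = 2 * N^3 */
        printf("N=%lld  counter/N^3=%.2f\n",
               N, (double)counter / ((double)N * N * N));
    }
    return 0;
}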
sqrt(n) x 2 sqrt(n) x n^2
Which gives:
O(n^3)
Explanation:
For the first loop, take the square root of both sides of the equation i^2 = n, giving about sqrt(n) iterations.
For the second loop, take the square root of both sides of the equation j^2 = 4n, giving about 2 sqrt(n) iterations.
The third loop is straightforward: k runs up to n^2.

Big-Oh notation for code fragment

I am lost on these code fragments and having a hard time finding any other similar examples.
//Code fragment 1
sum = 0;
for (i = 0; i < n; i++)
    for (j = 1; j < i*i; j++)
        for (k = 0; k < j; k++)
            sum++;
I'm guessing it is O(n^4) for fragment 1.
//Code fragment 2
sum = 0;
for (i = 1; i < n; i++)
    for (j = 1; j < i*i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                sum++;
I am very lost on this one. I'm not sure how the if statement affects the loop.
Thank you for the help ahead of time!
The first one is in fact O(n^5). The sum++ line is executed roughly 1^4 times, then 2^4 times, then 3^4, and so on (about i^4/2 for each i). The sum of k-th powers of the first n integers has a leading term in n^(k+1) (see e.g. Faulhaber's formula), so in this case n^5.
For the second one, the way to think about it is that the inner loop only executes when j is a multiple of i. So the second loop may as well be written for (j = i; j < i * i; j += i), which runs about i times, on j = i, 2i, 3i, and so on. The inner loop then does j work for each of those, i.e. about i * (1 + 2 + ... + (i - 1)) ≈ i^3/2 work per i. So we now have a sum of cubes rather than fourth powers, and the highest term is therefore n^4.
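Spelled out as summations (my own rough rewrite of the argument above, dropping off-by-ones and lower-order terms), the two counts are:
\sum_{i=1}^{n-1} \sum_{j=1}^{i^2-1} j \approx \sum_{i=1}^{n-1} \frac{i^4}{2} \approx \frac{n^5}{10}
\sum_{i=1}^{n-1} \sum_{m=1}^{i-1} i m \approx \sum_{i=1}^{n-1} \frac{i^3}{2} \approx \frac{n^4}{8}
where, in the second fragment, j = i m ranges over the multiples of i below i^2. The leading terms confirm O(n^5) and O(n^4) respectively.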
I'm fairly sure the 1st fragment is actually O(n^5).
Because:
The outer loop runs n times.
The middle loop runs i^2 times, where i averages about n/2 (for each value x there is a corresponding n - x, and each such pair averages n/2), so call that roughly n^2 / 4 times (a times).
Then the inner loop runs about a times again,
and when you multiply: n * a * a = n * (n^2/4) * (n^2/4) = n^5 / 16, or O(n^5).
I believe the second is O(n^4), because:
It's iterated n times.
Then it's iterated n*n times (literally about n*n/4, but that doesn't matter in O notation).
Then only about 1/n of those get through the if (j % i == 0 holds for roughly 1 in i values of j, and i is on the order of n).
Then the innermost loop repeats about n*n times.
So, n * n*n * (1/n) * n*n = n^5 / n = n^4.
With a sum so handy to compute, you could run these for n=10, n=50, and so on, and just look which of O(N^2), O(N^3), O(N^4), O(N^6) is a better match. (Note that the index for the inner-most loop also runs to n*n...)
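Here is one way to do exactly that (my own instrumentation, with small n because the counts blow up quickly); it counts both fragments and prints sum1 / n^5 and sum2 / n^4:
#include <stdio.h>

int main(void) {
    for (long long n = 20; n <= 80; n *= 2) {
        long long sum1 = 0, sum2 = 0;
        /* fragment 1 */
        for (long long i = 0; i < n; i++)
            for (long long j = 1; j < i * i; j++)
                for (long long k = 0; k < j; k++)
                    sum1++;
        /* fragment 2 */
        for (long long i = 1; i < n; i++)
            for (long long j = 1; j < i * i; j++)
                if (j % i == 0)
                    for (long long k = 0; k < j; k++)
                        sum2++;
        printf("n=%lld  sum1/n^5=%.3f  sum2/n^4=%.4f\n", n,
               (double)sum1 / ((double)n * n * n * n * n),
               (double)sum2 / ((double)n * n * n * n));
    }
    return 0;
}
As n grows the ratios head toward roughly 1/10 and 1/8, matching O(n^5) for the first fragment and O(n^4) for the second.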
First off, note that the first scenario actually comes out to O(n^5), not O(n^4). Here is my breakdown for the second.
The if statement lets the last loop run only when j is a multiple of i, i.e. for roughly 1 out of every i values of j, so it removes a factor on the order of n rather than a constant. Bottom line: I think you're looking at O(n^4).