What is the big-theta of the following code? [i*i <= n]

for (k = 1; k <= n; k++)
    for (i = 1; i*i <= n; i++)
        // some O(1) operations
I am asked to find the big-theta of this code. I figured that from i*i <= n I can rewrite the condition as i <= n/i. And so, tracing it, I got the following:
 i  | # of iterations
----|----------------
 1  | n
 2  | n/2
 3  | n/3
 .  | .
 .  | .
n/L | 1
Though I am not sure how to go on from here. Should I calculate the summation of n/i from i = 1 to n? In that case, how do I calculate the sum when a variable (n) is present?
I know that if I find L, I find the number of iterations needed. But as it terminates once L = n/L, I can't calculate L in terms of n.
I am very confused by this. Any insight would be appreciated.

The outer loop has N iterations. The inner loop's bound does not depend on k, so it has floor(sqrt(N)) iterations every time: i*i <= N is the same as i <= sqrt(N) (which also resolves your L = N/L: its solution is L = sqrt(N)). Multiply those two to find the answer: Theta(N * sqrt(N)).
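To sanity-check this (a sketch, not part of the original answer; the function name count is mine), you can count the iterations directly in JavaScript:

```javascript
// Count how many times the O(1) body of the nested loops executes.
function count(n) {
    var ops = 0;
    for (var k = 1; k <= n; k++) {
        // The inner bound i*i <= n does not depend on k,
        // so this loop always runs floor(sqrt(n)) times.
        for (var i = 1; i * i <= n; i++) {
            ops++;
        }
    }
    return ops;
}

console.log(count(100));   // 100 * 10 = 1000
console.log(count(10000)); // 10000 * 100 = 1000000
```

count(n) equals n * floor(sqrt(n)) exactly, which is Theta(n * sqrt(n)).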


Time complexity of nested while loop inside nested for loop

I am having a little bit of difficulty analyzing the time complexity of this algorithm:
for i = 1 to n do
    for j = 1 to n do
        k = j
        while k <= n do
            k = k*3
        end while
    end for
end for
I know that the outer for loop will run n times, and the inner for loop will run n^2 times. However, for the while loop, I've narrowed the cost down to n^2 + (some factor)*n, but I don't know where to go from here. I'd appreciate the help.
We can immediately factor out the outer for-loop, since none of the inner logic depends on its iteration variable i.
Now let's look at the while loop. You have correctly deduced that it is logarithmic - but what variables does it depend on? After the m-th execution of this loop, the value of k is:

k = j * 3^m

The loop breaks when this is greater than n, so the number of times max(m) it executes is:

max(m) = ceil(log3(n / j))

The total complexity, equal to n times the combined complexity of the inner for-loop and this while-loop, is therefore the following summation:

T(n) = n * sum from j = 1 to n of ceil(log3(n / j))

What do we do about the rounding brackets? If you round a number up to the nearest integer, the result always differs from the original number by less than 1. Therefore the difference can just be written as a constant O(1):

T(n) = n * sum of (log3(n / j) + O(1))
     = n * sum of log3(n / j) + O(n^2)
     = n * sum of (log3(n) - log3(j)) + O(n^2)              (1)
     = n * (n * log3(n) - log3(1 * 2 * ... * n)) + O(n^2)   (2)
     = n * (n * log3(n) - log3(n!)) + O(n^2)                (3)
     = n * O(n) + O(n^2)                                    (4)

Where in:
(1), (2), (4) we used three of the logarithm rules,
(3) we used the definition of the factorial n! = n(n-1)(n-2)...1,
(4) we used Stirling's approximation, log3(n!) = n * log3(n) - n * log3(e) + O(log n), so that n * log3(n) - log3(n!) = n * log3(e) + O(log n) = O(n).
Thus the total time complexity is:

T(n) = O(n^2)

And as I mentioned in the comments, there is no logarithmic term or factor.
EDIT: numerical tests to confirm this result. JavaScript code:
function T(n) {
    var m = 0;
    for (var i = 1; i <= n; i++) {
        for (var j = 1; j <= n; j++) {
            var k = j;
            while (k <= n) {
                k *= 3;
                m++;
            }
        }
    }
    return m;
}
Test results:
n T(n)
-----------------
1000 1498000
1500 3370500
2000 5992000
2500 9365000
3000 13494000
3500 18357500
4000 23984000
4500 30361500
5000 37475000
5500 45347500
6000 53976000
6500 63336000
7000 73472000
7500 84345000
8000 95952000
8500 108324000
9000 121482000
9500 135337000
10000 149960000
Plot of T(n) against n^2 (image omitted): a very tidy straight line, which shows that T(n) is directly proportional to n^2 for large inputs.

Time complexity of my algorithm

I'm having trouble finding the O-time of some algorithms. I've looked up quite a few O notations, but whenever my exercise gets harder, I can't find the solution. Now I came across an algorithm and I can't really find a solution.
I've searched through Stack Overflow but found nothing to really help me. The best post I found was this.
It only said what I knew and not really how to calculate it or see it. The second post did give some algorithms with solutions, but not how to find them.
The Code
for i = 1; i <= n; i++
    for j = 1; j <= i; j++
        for k = 1; k <= i; k++
            x = x + 1
Question
What is the time complexity of this algorithm?
Also, are there some good tutorials to help me understand this matter better?
Also, sorry if there's already a Stack Overflow post about this, but I couldn't find a good one to help me.
The loop defining i runs n times.
The loop defining j runs 1 + 2 + ... + n times in total, which is about n * n/2.
The loop defining k runs i times for each of those iterations, for a total of about n * n/2 * n/2 (taking n/2 as the average value of i)
= n * 1/2 * n * 1/2 * n
= n * n * n * 1/2 * 1/2
= O(n^3)
You can also try to infer that from the final value of the variable x, which should be roughly proportional to n^3
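A quick numerical check of the above (a sketch; the function name finalX is mine): the final value of x is the sum of i^2 for i = 1..n, which is n(n + 1)(2n + 1)/6, roughly n^3 / 3.

```javascript
// Run the three nested loops and return the final value of x.
function finalX(n) {
    var x = 0;
    for (var i = 1; i <= n; i++)
        for (var j = 1; j <= i; j++)
            for (var k = 1; k <= i; k++)
                x = x + 1;
    return x;
}

console.log(finalX(10));  // 385 = 10*11*21/6
console.log(finalX(100)); // 338350 = 100*101*201/6
```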

Calculating the time complexity

Can somebody help with the time complexity of the following code:
for(i = 0; i <= n; i++)
{
    for(j = 0; j <= i; j++)
    {
        for(k = 2; k <= n; k = k^2)
            print("")
    }
}
According to me, the first loop will run n times, the 2nd will run (1 + 2 + 3 + ... + n) times, and the third log(log(n)) times...
but I'm not sure about the answer.
We start from the inside and work out. Consider the innermost loop:
for(k = 2; k <= n; k = k^2)
    print("")
How many iterations of print("") are executed? First note that n is constant. What sequence of values does k assume?
iter | k
-----|-----
  1  | 2
  2  | 4
  3  | 16
  4  | 256
We might find a formula for this in several ways. I used guess-and-prove to get iter = log2(log2(k)) + 1 (the values of k are 2^(2^(iter - 1))). Since the loop won't execute the next iteration if the value is already bigger than n, the total number of iterations executed for a given n is floor(log2(log2(n)) + 1). We can check this with a couple of values to make sure we got this right. For n = 2, we get one iteration, which is correct. For n = 5, we get two. And so on.
The next level does i + 1 iterations, where i varies from 0 to n. We must therefore compute the sum 1 + 2 + ... + (n + 1), and that will give us the total number of iterations of the outermost and middle loops: this sum is (n + 1)(n + 2) / 2. We must multiply this by the cost of the inner loop to get (n + 1)(n + 2)(log2(log2(n)) + 1) / 2, the total cost of the snippet. The fastest-growing term in the expansion is n^2 log(log(n)), and so that is what would typically be given as the asymptotic complexity.
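Numerically (a sketch, assuming base-2 logarithms; the names countPrints and closedForm are mine), the closed form matches an exact count of the print("") calls:

```javascript
// Count the print("") calls of the original snippet.
function countPrints(n) {
    var calls = 0;
    for (var i = 0; i <= n; i++)
        for (var j = 0; j <= i; j++)
            for (var k = 2; k <= n; k = k * k) // k = k^2 means squaring here
                calls++;
    return calls;
}

// (n + 1)(n + 2) / 2 outer/middle iterations, each paying the inner cost.
function closedForm(n) {
    var inner = Math.floor(Math.log2(Math.log2(n))) + 1;
    return (n + 1) * (n + 2) / 2 * inner;
}

console.log(countPrints(100)); // 15453
console.log(closedForm(100));  // 15453
```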

Order of growth of triple for loop

I'm practising algorithm complexity and I came across this code online but I cannot figure out the order of growth for it. Any ideas?
int counter = 0;
for (int i = 0; i*i < N; i++)
    for (int j = 0; j*j < 4*N; j++)
        for (int k = 0; k < N*N; k++)
            counter++;
Take it one step (or loop in this case) at a time:
The first loop increments i as long as its square is lower than N, so this must be O(sqrt N), because int(sqrt(N)) or int(sqrt(N)) - 1 is the largest integer value whose square is lower than N;
The same holds for the second loop. We can ignore the 4 because it is a constant, and we do not care about those when dealing with big-oh notation. So the first two loops together are O(sqrt N)*O(sqrt N) = O(sqrt(N)^2) = O(N). You can multiply the complexities because the loops are nested, so the second loop will fully execute for each iteration of the first;
The third loop is obviously O(N^2), because k goes up to the square of N.
So the whole thing has to be O(N) * O(N^2) = O(N^3). You can usually solve problems like this by figuring out the complexity of the first loop, then the second, then the first two and so on.
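As a numerical sanity check (a sketch; the function name countOps is mine): for perfect-square N the count is exactly sqrt(N) * 2*sqrt(N) * N^2 = 2*N^3.

```javascript
// Count the iterations of the triple loop.
function countOps(N) {
    var counter = 0;
    for (var i = 0; i * i < N; i++)
        for (var j = 0; j * j < 4 * N; j++)
            for (var k = 0; k < N * N; k++)
                counter++;
    return counter;
}

console.log(countOps(16));  // 8192 = 2 * 16^3
console.log(countOps(100)); // 2000000 = 2 * 100^3
```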
sqrt(n) x 2 sqrt(n) x n^2

Which gives:

O(n^3)

Explanation:

For the first loop, take the square root of both sides of i^2 = n, giving sqrt(n) iterations.
For the second loop, take the square root of both sides of j^2 = 4n, giving 2 sqrt(n) iterations.
The third loop is straightforward: n^2 iterations.

Big-Oh notation for code fragment

I am lost on these code fragments and having a hard time finding any other similar examples.
//Code fragment 1
sum = 0;
for (i = 0; i < n; i++)
    for (j = 1; j < i*i; j++)
        for (k = 0; k < j; k++)
            sum++;
I'm guessing it is O(n^4) for fragment 1.
//Code fragment 2
sum = 0;
for (i = 1; i < n; i++)
    for (j = 1; j < i * i; j++)
        if (j % i == 0)
            for (k = 0; k < j; k++)
                sum++;
I am very lost on this one. I'm not sure how the if statement affects the loop.
Thank you for the help ahead of time!
The first one is in fact O(n^5). For each i, the sum++ line is executed 1 + 2 + ... + (i^2 - 1) times, roughly i^4 / 2. The sum of powers-of-k has a term in n^(k+1) (see e.g. Faulhaber's formula), so in this case n^5.
For the second one, the way to think about it is that the inner loop only executes when j is a multiple of i. So the second loop may as well be written for (j = i; j < i * i; j += i), which does about i iterations, with j = i*m on the m-th of them. Each i therefore contributes i * (1 + 2 + ... + (i - 1)), roughly i^3 / 2: we now have a sequence of cubes, rather than powers-of-4. The highest term is therefore n^4.
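To check both fragments numerically (a sketch; the names fragment1 and fragment2 are mine), doubling n should multiply the first count by about 2^5 = 32 and the second by about 2^4 = 16:

```javascript
// Fragment 1: innermost loop runs j times, for every j < i*i, for every i < n.
function fragment1(n) {
    var sum = 0;
    for (var i = 0; i < n; i++)
        for (var j = 1; j < i * i; j++)
            for (var k = 0; k < j; k++)
                sum++;
    return sum;
}

// Fragment 2: same, but the innermost loop runs only when i divides j.
function fragment2(n) {
    var sum = 0;
    for (var i = 1; i < n; i++)
        for (var j = 1; j < i * i; j++)
            if (j % i === 0)
                for (var k = 0; k < j; k++)
                    sum++;
    return sum;
}

console.log(fragment1(40) / fragment1(20)); // ~34, close to 2^5 = 32
console.log(fragment2(40) / fragment2(20)); // ~17, close to 2^4 = 16
```

The ratios are not exactly 32 and 16 at small n because of lower-order terms, but they approach those values as n grows.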
I'm fairly sure the 1st fragment is actually O(n^5).
Because:
n times,
i^2 times, where i is on average about half of n (for each value x there is a corresponding n - x, so pairs average to n/2), which is therefore about n^2 / 4 times (call it a times).
Then, a times again,
and when you do n * a * a, or n * (n^2 / 4) * (n^2 / 4) = n^5 / 16, that is O(n^5).
I believe the second is O(n^4), because:
It's iterated n times.
Then it's iterated n * n times (literally about n * n / 4, but constants don't matter in O notation).
Then only about 1/n of the j values are let through by the if (those divisible by i).
Then n * n are repeated.
So, n * n * n * n * n / n = n^4.
With a sum so handy to compute, you could run these for n=10, n=50, and so on, and just look which of O(N^2), O(N^3), O(N^4), O(N^6) is a better match. (Note that the index for the inner-most loop also runs to n*n...)
First off, note that the first fragment is actually O(n^5), as the other answers explain. Here is my breakdown for the second: the if statement causes the innermost loop to run only when j is a multiple of i, i.e. for about i of the i^2 values of j. Summing the surviving work gives a cube per value of i. Bottom line: in big-O we ignore constants, so I think you're looking at O(n^4).