What is the time complexity of this nested loop?

I stumbled upon a loop for which I am not sure what the time complexity is. It is the following loop:
for (i = 1; i <= n * n; i++) {
    for (j = 1; j <= i; j++) {
        // some elementary operation
    }
}
I would have argued that the outer for-loop runs n^2 times and the inner for-loop also runs on the order of n^2 times, since for the i-th iteration of the outer loop the inner loop performs i iterations, i.e. 1, 2, ..., n^2 iterations over the course of the outer loop. Am I totally going in the wrong direction here?
So the time complexity would be n^4.

The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to (n² * (n² + 1)) / 2.
The Big O notation is O(n^4). You are correct.

It's a simple arithmetic progression. Each outer iteration makes the inner loop one iteration longer than the last. The outer loop runs n^2 times, which gives the following sum:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 (n^2 + 1) / 2 = O(n^4)
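If you want to sanity-check the closed form, here is a minimal sketch (assuming the elementary operation is just a counter increment; the class and variable names are mine) that counts the operations and compares the count with n^2 (n^2 + 1) / 2:

public class NestedLoopCount {
    public static void main(String[] args) {
        int n = 50;
        long count = 0;
        // same structure as the loop in the question, with n^2 written as n * n
        for (long i = 1; i <= (long) n * n; i++) {
            for (long j = 1; j <= i; j++) {
                count++; // the "elementary operation"
            }
        }
        long expected = (long) n * n * ((long) n * n + 1) / 2; // n^2 (n^2 + 1) / 2
        System.out.println(count + " == " + expected); // prints 3126250 == 3126250
    }
}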


Time complexity for a specific loop

What is the time complexity and tilde for the loop below?
for (int i = N/2; i < N; i++) {
    for (int j = i; j < N; j++) {
        doSomething(i, j);
    }
}
I think that it runs N/2 + (N/2 - 1) + (N/2 - 2) + ... + 1 times (the inner loop runs N - i times for each i), but how do I get its time complexity and tilde?
For example, if N = 100, the loop body will run 50 + 49 + 48 + ... + 1 times.
I am assuming doSomething(i, j) does not itself iterate over all the elements between i and j; if that is the case, the complexity of this algorithm is O(N^2).
The outer loop for (int i = N/2; i < N; i++) executes N/2 times, which is O(N) once the constant factor 1/2 is dropped.
The inner loop executes N - i times for a given i, which is at most N/2, so it is also O(N).
Therefore the overall time complexity is O(N^2) in the worst case.
The inner loop is executed:
N/2 times for i = N/2,
N/2 - 1 times for i = N/2 + 1,
...
1 time for i = N - 1,
therefore the total number of inner-loop iterations is:
N/2 + (N/2 - 1) + ... + 2 + 1
= (N/2)(N/2 + 1) / 2
= N^2/8 + N/4
Hence the order of growth of the code is N^2.
If you consider tilde notation, which is defined as
"~f(n) represents any quantity that, when divided by f(n), approaches 1 as n grows" (from here), you can see that the tilde approximation is ~N^2/8, because as N grows, (N^2/8 + N/4) / (N^2/8) approaches 1.

Understanding the theoretical run time of a function with nested loops

I've been trying to understand Big-O notation. Earlier today, I was given a function to practice with and told that it has a complexity of O(n^5). I've tried calculating it on my own, but I don't know if I've calculated T(n) correctly.
Here are my two questions:
1) Did I calculate T(n) correctly and if not then what did I do wrong?
2) Why do we only concern ourselves with the variable to the highest power?
1 sum = 0;                        //1 = 1
2 for( i=0; i < n; i++)           //1 + n + 2(n-1) = 1+n+2n-2 = 3n-1
3     for (j=0; j < i*i; j++)     //n + n*n + 2n(n-1) = n + n^2 + 2n^2-2n = 3n^2 - n
4         for (k=0; k < j; k++)   //n + n*n + 4n(n-1) = n + n*n + 4n*n - 4n = 5n^2 - 3n
5             sum++;
6 k++;
7 j++;
8 i++;
// so now that I have simplified everything I multiplied the equations on lines 2-4 and added line 1
// T(n) = 1 + (3n-1)(3n^2-n)(5n^2 -3n) = 45n^5 -57n^4 +23n^3 -3n^2 + 1
The innermost loop runs j times, so sum++ executes j times for a given j.
The second loop runs j from 0 to i^2 - 1, so for a fixed i the innermost work is 0 + 1 + ... + (i^2 - 1) = i^2 (i^2 - 1) / 2, a sum of integers.
The outer loop runs i from 0 to n - 1, so the total is a sum of squares and fourth powers of integers, which grows on the order of n^5 (the sum of the fourth powers alone is about n^5 / 5).
We only take the highest power because, as n approaches infinity, the highest power of n (the order) will always dominate, irrespective of its coefficient.
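To see the n^5 growth concretely, here is a minimal sketch (my own, assuming the work of interest is just the sum++ increments) that counts how many times sum++ runs and divides by n^5:

public class TripleLoopCount {
    public static void main(String[] args) {
        int n = 60;
        long sum = 0;
        for (long i = 0; i < n; i++)
            for (long j = 0; j < i * i; j++)
                for (long k = 0; k < j; k++)
                    sum++; // the operation being counted
        // the exact count is the sum of i^2 (i^2 - 1) / 2 over i, roughly n^5 / 10
        System.out.println(sum + ", sum / n^5 = " + sum / Math.pow(n, 5)); // ratio close to 0.1
    }
}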

Solve: T(n) = T(n/2) + n/2 + 1

I am struggling to express the running time of the following algorithm in O notation. My first guess was O(n), but the number of loop iterations and the value passed to the recursive call change from call to call, so I am not sure. Where have I gone wrong?
public int function(int n) {
    if (n == 0) {
        return 0;
    }
    int i = 1;
    int j = n;
    while (i < j) {
        i = i + 1;
        j = j - 1;
    }
    return function(i - 1) + 1;
}
The while loop executes about n/2 times.
The recursive call receives a value that is about half of the original n, so the work per call is:
n/2 (first call)
n/4 (second call, equal to (n/2)/2)
n/8
n/16
n/32
...
This is a geometric series.
In fact, the total work can be written as
n * (1/2 + 1/4 + 1/8 + 1/16 + ...)
The series in parentheses converges to 1, so the total is about n * 1 = n.
So the O notation is O(n).
Another approach is to write it down as the recurrence T(n) = T(n/2) + n/2 + 1:
the while loop does about n/2 + 1 work, and the value passed to the next call is about n/2.
Solving this using the master theorem with:
a = 1
b = 2
f(n) = n/2 + 1
Here n^(log_b a) = n^0 = 1 and f(n) = Θ(n), so case 3 applies provided the regularity condition a * f(n/b) ≤ c * f(n) holds for some constant c < 1. Let c = 0.9:
1 * f(n/2) ≤? c * f(n)
n/4 + 1 ≤? 0.9 * (n/2 + 1)
0.25n + 1 ≤? 0.45n + 0.9
0 ≤ 0.2n - 0.1
which holds for all n ≥ 1. Which gives:
T(n) = Θ(f(n)) = Θ(n)
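As a sanity check, the function can be instrumented to count every iteration of the while loop across all recursive calls (the counter and the test values in main are my own additions); the total stays proportional to n:

public class RecurrenceWork {
    static long work = 0; // total while-loop iterations over all recursive calls

    static int function(int n) {
        if (n == 0) {
            return 0;
        }
        int i = 1;
        int j = n;
        while (i < j) {
            work++;
            i = i + 1;
            j = j - 1;
        }
        return function(i - 1) + 1;
    }

    public static void main(String[] args) {
        for (int n = 1_000; n <= 1_000_000; n *= 10) {
            work = 0;
            function(n);
            // work / n stays close to 1, i.e. the total work is Theta(n)
            System.out.println("n = " + n + ", work = " + work + ", work / n = " + (double) work / n);
        }
    }
}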

Time Complexity: O(logN) or O(N)?

I thought the time complexity of the following code is O(log N), but the answer says it's O(N). I wonder why:
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
The inner for-loop runs this many times in total:
N + N/2 + N/4 + ...
It seems to be log N to me. Please help me understand why it is O(N). Thanks.
1, 1/2, 1/4, 1/8, ..., (1/2)^n is a geometric sequence with a = 1 and r = 1/2 (a is the first term, r is the common ratio).
Its sum can be calculated using the formula for an infinite geometric series:
a / (1 - r), for |r| < 1
In this case the limit of the sum is 1 / (1 - 1/2) = 2, so:
N + N/2 + N/4 + ... = N(1 + 1/2 + 1/4 + ...) -> N * 2
Thus the complexity is O(N).
Proceeding step by step, based on the code fragment: the outer loop variable i takes the values N, N/2, N/4, ..., 1, and for each of them the inner loop runs i times, so the total count is N + N/2 + N/4 + ... + 1 ≤ 2N, which is O(N).
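Here is a small sketch (the class name and test values are mine) that counts the inner-loop iterations and confirms the count stays below 2N:

public class HalvingLoopCount {
    public static void main(String[] args) {
        for (int N = 1_000; N <= 1_000_000; N *= 10) {
            long count = 0;
            for (int i = N; i > 0; i /= 2) {
                for (int j = 0; j < i; j++) {
                    count += 1;
                }
            }
            // count is N + N/2 + N/4 + ... + 1 (with integer division), always less than 2N
            System.out.println("N = " + N + ", count = " + count + ", count / N = " + (double) count / N);
        }
    }
}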

How to find the time complexity of the following code snippet

What is the time complexity of the following code snippet and how to calculate it.
function(int n) {
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j += i) {
            System.out.println("*");
        }
    }
}
Let's think about the total work that's done. As you noted in your comment, the inner loop runs n times when i = 1, then n / 2 times when i = 2, then n / 3 times when i = 3, etc. (ignoring rounding errors). This means that the total work done is
n + n/2 + n/3 + n/4 + ... + n/n
= n(1 + 1/2 + 1/3 + 1/4 + ... + 1/n)
The term in parentheses is the nth harmonic number, denoted Hn, so the work done overall is roughly nHn. It's known that Hn = Θ(log n), so the total work is Θ(n log n).
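As an illustration, here is a minimal sketch (it counts instead of printing, which is my own adaptation of the snippet) that compares the total count with n * H_n:

public class HarmonicLoopCount {
    public static void main(String[] args) {
        for (int n = 1_000; n <= 1_000_000; n *= 10) {
            long stars = 0;
            for (int i = 1; i <= n; i++) {
                for (int j = 1; j <= n; j += i) {
                    stars++; // stands in for System.out.println("*")
                }
            }
            double harmonic = 0; // H_n = 1 + 1/2 + ... + 1/n
            for (int i = 1; i <= n; i++) {
                harmonic += 1.0 / i;
            }
            // the total work is roughly n * H_n = Theta(n log n); the ratio stays close to 1
            System.out.println("n = " + n + ", stars = " + stars
                    + ", stars / (n * H_n) = " + stars / (n * harmonic));
        }
    }
}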