Running Time Calculation/Complexity of an Algorithm - time-complexity

I have to calculate the time complexity or theoretical running time of an algorithm (given the pseudocode), line by line as T(n). I've given it a try, but there are a couple of things confusing me. For example, what is the time complexity of an "if" statement? And how do I deal with nested loops? The code is below along with my attempt, which is commented.
length[A] = n
for i = 0 to length[A] - 1 // n - 1
    k = i + 1 // n - 2
    for j = 1 + 2 to length[A] // (n - 1)(n - 3)
        if A[k] > A[j] // 1(n - 1)(n - 3)
            k = j // 1(n - 1)(n - 3)
    if k != i + 1 // 1(n - 1)
        temp = A[i + 1] // 1(n - 1)
        A[i + 1] = A[k] // 1(n - 1)
        A[k] = temp // 1(n - 1)

Blender is right, the result is O(n^2): two nested loops that each have an iteration count dependent on n.
A longer explanation:
The if, in this case, does not really matter: since O-notation only looks at the worst-case execution time of an algorithm, you simply choose the execution path that is worse for the overall execution time. In your example, both execution paths (k != i + 1 true or false) have no further implication for the runtime, so you can disregard the if. If there were a third nested loop, also running to n, inside the if, you'd end up with O(n^3).
A line-by-line overview:
for i = 0 to length[A] - 1 // n + 1 [1]
    k = i + 1 // n
    for j = 1 + 2 to length[A] // (n)(n - 3 + 1) [1]
        if A[k] > A[j] // (n)(n - 3)
            k = j // (n)(n - 3)*x [2]
    if k != i + 1 // n
        temp = A[i + 1] // n*y [2]
        A[i + 1] = A[k] // n*y
        A[k] = temp // n*y
[1] The for loop statement will be executed n+1 times with the following values for i: 0 (true, continue loop), 1 (true, continue loop), ..., length[A] - 1 (true, continue loop), length[A] (false, break loop)
[2] Without knowing the data, you have to guess how often the if's condition is true. This guess can be made mathematically by introducing a variable 0 <= x <= 1. This is in line with what I said before: x is independent of n and therefore influences the overall runtime complexity only as a constant factor.
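To see the quadratic behaviour concretely, here is a rough Python transcription of the pseudocode (a sketch of my own, not the asker's exact code: it assumes 0-based indexing and that the inner loop scans the rest of the array), instrumented to count comparisons:

```python
def selection_sort(A):
    """Selection-sort-like pass, counting how often the comparison runs."""
    comparisons = 0
    n = len(A)
    for i in range(n - 1):
        k = i  # index of the smallest element seen so far in A[i:]
        for j in range(i + 1, n):
            comparisons += 1
            if A[k] > A[j]:
                k = j
        if k != i:  # this if only decides whether to swap; it adds no loop
            A[i], A[k] = A[k], A[i]
    return comparisons
```

Whether or not either if fires, the comparison count is exactly (n-1) + (n-2) + ... + 1 = n(n-1)/2, which is why the branches contribute only constant factors to the Θ(n^2) total.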

Related

Time complexity for a specific loop

What is the time complexity and tilde for the loop below?
for (int i = N/2; i < N; i++) {
    for (int j = i; j < N; j++) {
        doSomething(i, j);
    }
}
I think that it runs N/2 + (N/2 - 1) + (N/2 - 2) + ... + 1 times, but how do I get its time complexity and tilde?
For example, if N = 100, the loop will run 50 + 49 + 48 + ... + 1 times.
I am assuming doSomething(i, j) is not iterating over all the elements between i and j; if it runs in constant time, the complexity of this algorithm is O(N^2).
The outer loop for (int i = N/2; i < N; i++) { executes N/2 times, which is O(N), because N/2 is a constant fraction of N.
The inner loop executes N - i times for each i, which is at most N/2, so it is also O(N) in the worst case.
Multiplying the two, the overall time complexity is O(N^2) in the worst-case scenario.
The inner loop is executed:
N/2 times for i = N/2,
N/2 - 1 times for i = N/2 + 1,
....
1 time for i = N - 1,
therefore the total count for the inner loop is:
(N/2) + (N/2 - 1) + .... + 2 + 1
= (N/2)(N/2 + 1)/2
= N^2/8 + N/4
Hence the order of growth of the code is N^2.
If you consider tilde notation, which is defined as "∼g(n) to represent any quantity that, when divided by f(n), approaches 1 as n grows", you can see that the count is ~N^2/8, because (N^2/8 + N/4)/(N^2/8) approaches 1 as N grows.
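A quick empirical check of this count (a sketch; doSomething is replaced by a counter, and count_calls is a name of my own):

```python
def count_calls(N):
    """Count how many times doSomething(i, j) would run for a given N."""
    count = 0
    for i in range(N // 2, N):
        for j in range(i, N):
            count += 1
    return count

# For even N the exact count is (N/2)(N/2 + 1)/2 = N^2/8 + N/4,
# which is ~N^2/8 and Θ(N^2).
```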

What is the time complexity of this nested loop?

I stumbled upon a loop for which I am not sure what the time complexity is. It is the following loop:
for(i = 1; i <= n^2; i++){
    for(j = 1; j <= i; j++) {
        //some elementary operation
    }
}
I would have argued that the outer for-loop runs in n^2 and the inner for loop would also run in n^2 as for every iteration of the outer-loop we do n^2 - (n^2 - 1), n^2 - (n^2 - 2),..., n^2. Am I totally going in the wrong direction here?
So the time complexity would be O(n^4)?
The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to (n² * (n² + 1)) / 2.
The Big O notation is O(n^4). You are correct.
It's a simple arithmetic progression: every iteration of the outer loop adds one more inner iteration than the last.
The outer loop performs n^2 iterations, which gives the following sum:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 (n^2+1) / 2 = O(n^4)
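The sum can be verified empirically (a sketch; the elementary operation is replaced by a counter, and count_ops is a name of my own):

```python
def count_ops(n):
    """Count inner-loop iterations when the outer loop runs to n^2."""
    count = 0
    for i in range(1, n * n + 1):
        for j in range(1, i + 1):
            count += 1
    return count

# Exact count: 1 + 2 + ... + n^2 = n^2(n^2 + 1)/2, which is Θ(n^4).
```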

Time complexity of inner for loop

I have been trying to figure out how to get to the solution of this for loop, but I can't seem to understand why the answer is what it is. I get stuck on the inner for loop; can someone help me break it down step by step to get to the answer for the second for loop, line 2? THIS IS FOR REVIEW, NOT HOMEWORK. I am just trying to understand the concept better.
1: for (int i = 0; i < n; i++) { // Runs n+1 times
2:     for (int j = n; j >= i; j--) { // Runs (n+2)+(n+1)+ n(n+1)/2 - 3 times
3:         cout << i << "," << j << endl; // Runs (n+1) + n(n+1)/2 - 1 times
4:     }
5: }
I know the second line simplifies to n(n+1)/2 + 2n, but I don't understand how to even get the (n+2) + (n+1) + n(n+1)/2 - 3 in the first place.
1: for (int i = 0; i < n; i++) { // Runs n+1 times
This line's body does not run n+1 times. Since the condition is i < n, the body runs n times.
2: for (int j = n; j >= i; j--) {
This line runs n - i + 1 times for each i, because it uses >= for the comparison: j goes from n down to i inclusive.
So if we write down the number of cout executions for each iteration of the outer loop, we get something like this:
i = 0: n + 1
i = 1: n
i = 2: n - 1
...
...
i = n - 1: 2
So what we need to do is simply add the numbers from 2 up to n + 1,
which is (n + 1)(n + 2)/2 - 1
hope this helps
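You can confirm the total with a direct count (a sketch of the same loop in Python, with the cout replaced by a counter; count_cout is a name of my own):

```python
def count_cout(n):
    """Count how many times line 3 (the cout) executes."""
    count = 0
    for i in range(n):      # i = 0 .. n-1
        j = n
        while j >= i:       # j = n down to i inclusive: n - i + 1 times
            count += 1
            j -= 1
    return count

# Total: (n+1) + n + ... + 2 = (n+1)(n+2)/2 - 1, which is Θ(n^2).
```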
for (int i = 0; i < n; i++) {
    for (int j = n; j >= i; j--) {
        cout << "something" << endl;
    }
}
Let's see, for each value of i, how many times the inner loop runs (j goes from n down to i, i.e. n - i + 1 iterations):
i=0 --> n + 1
i=1 --> n
i=2 --> n - 1
i=3 --> n - 2
...
...
i=n-1 --> 2
For an upper bound, increase each term to n + 1:
2 + 3 + ... + n + (n+1) < [(n+1) + (n+1) + ... + (n+1)] = n(n+1) = O(n^2)
For a lower bound, throw away the smaller half of the terms and decrease each remaining term to n/2:
2 + 3 + ... + n + (n+1) > [n/2 + n/2 + ... + n/2] = (n/2)*(n/2) = (n^2)/4 = Ω(n^2)
The lower and upper bounds are Ω(n^2) and O(n^2), so we can deduce the total is actually ϴ(n^2).
We could have seen this immediately by noticing it is an arithmetic sequence:
2 + 3 + ... + (n+1) = (n+1)(n+2)/2 - 1 = ϴ(n^2)
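To sanity-check the quadratic conclusion, a small sketch (total_iterations is a name of my own) that counts the iterations and verifies they sit between two quadratic bounds:

```python
def total_iterations(n):
    """Count inner-loop iterations for j running from n down to i."""
    return sum(n - i + 1 for i in range(n))

# The count grows quadratically: it is squeezed between n^2/4 and 2n^2.
for n in (4, 8, 16, 32):
    t = total_iterations(n)
    assert n * n / 4 <= t <= 2 * n * n
```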

Understanding the theoretical run time of a function with nested loops

I've been trying to understand Big-O notation. Earlier today, I was given a function to practice with and told that it is O(n^5). I've tried calculating it on my own but don't know if I've calculated T(n) correctly.
Here are my two questions:
1) Did I calculate T(n) correctly and if not then what did I do wrong?
2) Why do we only concern ourselves with the variable to the highest power?
1 sum = 0; //1 = 1
2 for( i=0; i < n; i++) //1 + n + 2(n-1) = 1+n+2n-2 = 3n-1
3 for (j=0; j < i*i; j++) //n + n*n + 2n(n-1))= n+ n^2 + 2n^2-2n = 3n^2 -n
4 for (k=0; k < j; k++) //n + n*n + 4n(n-1))= n + n*n +4n*n-4n = 5n^2 -3n
5 sum++;
6 k++;
7 j++;
8 i++;
// so now that I have simplified everything I multiplied the equations on lines 2-4 and added line 1
// T(n) = 1 + (3n-1)(3n^2-n)(5n^2 -3n) = 45n^5 -57n^4 +23n^3 -3n^2 + 1
The innermost loop runs j times, so each value of j contributes j increments of sum.
For a fixed i, the second loop runs j from 0 to i^2 - 1, giving 0 + 1 + ... + (i^2 - 1) = i^2(i^2 - 1)/2 operations: a sum of integers, on the order of i^4.
The outer loop sums this over i = 0 to n - 1, a sum of squares and 4th powers of integers, which is on the order of n^5. (Note that you sum the inner work over the outer iterations; you cannot simply multiply the per-line totals together, because the inner bounds depend on i and j.)
We only take the highest power because, as n approaches infinity, the highest power of n (the order) will always dominate, irrespective of its coefficient.
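A quick empirical check (a sketch; sum++ is tallied directly, and count_sum is a name of my own):

```python
def count_sum(n):
    """Count sum++ executions for the triple nested loop."""
    total = 0
    for i in range(n):
        for j in range(i * i):
            for k in range(j):
                total += 1
    return total

# Per outer iteration the inner work is 0 + 1 + ... + (i^2 - 1) = i^2(i^2 - 1)/2,
# so the total is a sum of (roughly) 4th powers, which is Θ(n^5) overall.
```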

Solve: T(n) = T(n/2) + n/2 + 1

I struggle to define the running time of the following algorithm in O notation. My first guess was O(n), but the amount of work per call and the value passed to the recursion both change from call to call, so I'm not sure. What am I getting wrong?
public int function(int n)
{
    if (n == 0) {
        return 0;
    }
    int i = 1;
    int j = n;
    while (i < j)
    {
        i = i + 1;
        j = j - 1;
    }
    return function(i - 1) + 1;
}
The while loop executes about n/2 times.
The recursive call receives a value of n that is about half the original, so the successive while-loop costs are:
n/2 (first iteration)
n/4 (second iteration, equal to (n/2)/2)
n/8
n/16
n/32
...
This is similar to a geometric series.
In fact, the total can be represented as
n * (1/2 + 1/4 + 1/8 + 1/16 + ...)
which converges to n * 1 = n.
So the O notation is O(n)
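The halving argument can be verified by instrumenting the function (a sketch of my own: counter[0] tallies the while-loop iterations across all recursive calls):

```python
def function(n, counter):
    """The algorithm above, with counter[0] tracking while-loop iterations."""
    if n == 0:
        return 0
    i, j = 1, n
    while i < j:
        counter[0] += 1
        i += 1
        j -= 1
    return function(i - 1, counter) + 1

counter = [0]
function(1000, counter)
# Total work is about n/2 + n/4 + n/8 + ..., which stays below n, hence O(n).
```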
Another approach is to write it down as T(n) = T(n/2) + n/2 + 1.
The while loop does n/2 work. Argument passed to next call is n/2.
Solving this using the master theorem where:
a = 1
b = 2
f(n) = n/2 + 1
Here n^(log_b a) = n^0 = 1, so f(n) is polynomially larger and case 3 applies, provided the regularity condition a*f(n/b) <= c*f(n) holds for some constant c < 1. Let c = 0.9:
1*(n/4 + 1) <= 0.9*(n/2 + 1)?
0.25n + 1 <= 0.45n + 0.9
0.1 <= 0.2n, which holds for all n >= 1.
Which gives:
T(n) = Θ(f(n)) = Θ(n)
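The recurrence itself can also be checked numerically (a sketch; T is evaluated directly from its definition with integer halving and T(0) = 0 assumed):

```python
def T(n):
    """Evaluate T(n) = T(n/2) + n/2 + 1 with integer halving, T(0) = 0."""
    if n == 0:
        return 0
    return T(n // 2) + n // 2 + 1

# T(n) is sandwiched between n/2 and 2n, consistent with T(n) = Θ(n).
for n in (10, 100, 1000, 10 ** 6):
    assert n / 2 <= T(n) <= 2 * n
```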