Time complexity of inner for loop - time-complexity

I have been trying to figure out how to get to the solution of this loop nest, but I can't seem to understand why the answer is what it is. I get stuck on the inner for loop; can someone help me break it down step by step to get to the answer for the second for loop, line 2? THIS IS FOR REVIEW, NOT HOMEWORK. I am just trying to understand the concept better.
1: for (int i = 0; i < n; i++) {          // Runs n+1 times
2:     for (int j = n; j >= i; j--) {     // Runs (n+2)+(n+1)+ n(n+1)/2 - 3 times
3:         cout << i << "," << j << endl; // Runs (n+1) + n(n+1)/2 - 1 times
4:     }
5: }
I know the count on the second line simplifies to n(n+1)/2 + 2n, but I don't understand how to arrive at (n+2)+(n+1)+ n(n+1)/2 - 3 in the first place.

1: for (int i = 0; i < n; i++) { // Runs n+1 times
This line does not run n+1 times. Since the condition is i < n, it runs n times.
2: for (int j = n; j >= i; j--) {
This line runs n - i + 1 times for each value of i, because it uses >= for the comparison: j goes from n down to i inclusive.
So if we write down the executions of the cout for each value of i, we get something like this:
i=0:   n+1
i=1:   n
i=2:   n-1
...
...
i=n-1: 2
So what we need to do is simply add the numbers from 2 up to n+1,
which is (n+1)(n+2)/2 - 1.
Hope this helps.

for (int i = 0; i < n; i++) {
    for (int j = n; j >= i; j--) {
        cout << "something" << endl;
    }
}
Let's see, for each value of i, how many times the inner loop body runs:
i=0 --> n+1
i=1 --> n
i=2 --> n-1
i=3 --> n-2
...
...
i=n-1 --> 2
For an upper bound, increase each term to n+1:
2 + 3 + ... + (n+1) < [(n+1) + (n+1) + ... + (n+1)] = n(n+1) = O(n^2)
For a lower bound, throw away the first half of the terms and decrease each of the remaining terms to n/2:
2 + 3 + ... + (n+1) > [n/2 + n/2 + ... + n/2] = (n/2)*(n/2) = n^2/4 = Ω(n^2)
Since the lower and upper bounds are Ω(n^2) and O(n^2), we can deduce the running time is actually Θ(n^2).
We could have seen this immediately by noticing it is an arithmetic series:
2 + 3 + ... + (n+1) = (n+1)(n+2)/2 - 1 = Θ(n^2)


Time complexity for an intersection (worst case)

I'm having trouble finding the worst-case time complexity for the intersection of two sorted arrays of the same size (n).
I'm not sure how to count the while loop with two conditions, or how to count the if and else if statements.
I know the big O would be O(N + N), but I have no idea how to show the worst case.
void printIntersection(int arr1[], int arr2[], int n) {
    int i = 0, j = 0;
    while (i < n && j < n) {
        if (arr1[i] < arr2[j])
            i++;
        else if (arr2[j] < arr1[i])
            j++;
        else { /* arr1[i] == arr2[j] */
            cout << arr2[j] << " ";
            i++;
            j++;
        }
    }
}
To prove that in the worst case the loop makes 2N iterations, you can use the following argument.
Given the two indices i and j, at each step:
if arr1[i] < arr2[j], then i is incremented by 1;
if arr2[j] < arr1[i], then j is incremented by 1;
if arr1[i] == arr2[j], then both i and j are incremented by 1.
At each iteration at least one of i and j is incremented by one, and since each goes from 0 to at most n, the total number of iterations is bounded by 2N.
This gives the resulting worst-case time complexity of O(N).

Time complexity for a specific loop

What is the time complexity and tilde for the loop below?
for (int i = N/2; i < N; i++) {
    for (int j = i; j < N; j++) {
        doSomething(i, j);
    }
}
I think that it runs N/2 + (N/2 - 1) + (N/2 - 2) + ... + 1 times, but how do I get its time complexity and tilde?
For example, if N = 100, the inner body will run 50 + 49 + 48 + ... + 1 times.
I am assuming doSomething(i, j) does not itself iterate over all the elements between i and j; if that's the case, the complexity of this algorithm is O(N^2).
The outer loop for (int i = N/2; i < N; i++) executes N/2 times, which is O(N), because the constant factor 1/2 does not affect the order of growth.
The inner loop in the worst case executes at most N times (more precisely, N - i times), which is also O(N).
Therefore, the overall time complexity is O(N^2) in the worst-case scenario.
The inner loop is executed:
N/2 times for i = N/2,
N/2 - 1 times for i = N/2 + 1,
...
1 time for i = N - 1.
Therefore, the total number of inner-loop iterations is:
1 + 2 + ... + N/2
= (N/2)(N/2 + 1)/2
= N^2/8 + N/4
Hence the order of growth of the code is N^2.
If you consider tilde notation, defined as "~g(n) represents any quantity that, when divided by g(n), approaches 1 as n grows", you can see that the answer is ~N^2/8, because (N^2/8 + N/4)/(N^2/8) approaches 1 as N grows.

What is the complexity of this for loop, for (int j = i; j < n; j++)?

What is the complexity of the second for loop? Would it be n - i? From my understanding, the first for loop will run n times, but the index in the second for loop is set to i instead of 0.
// where n is the number of elements in an array
for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        // some constant-time task
    }
}
In all, the inner loop iterates 1 + 2 + ... + n times, which is n(n + 1)/2, which is O(n^2).
If you visualise this as a matrix where each row represents i and each column represents j, you'll see that the iterations form a triangle with side n.
Example with n = 4:
0 1 2 3
1 2 3
2 3
3
The inner loop runs n/2 times on average, which is O(n).
The total is n(n+1)/2 iterations, which is O(n^2).
The number of steps this takes is a triangle number. Here's a bit of code I put together in LINQPad (yeah, sorry about answering in C#, but hopefully this is still readable):
void Main()
{
    long k = 0;
    // Whatever you want
    const int n = 13;
    for (int i = 0; i < n; i++)
    {
        for (int j = i; j < n; j++)
        {
            k++;
        }
    }
    k.Dump();
    triangleNumber(n).Dump();
    (((n * n) + n) / 2).Dump();
}

int triangleNumber(int number)
{
    if (number == 0) return 0;
    else return number + triangleNumber(number - 1);
}
All 3 print statements (.Dump() in LINQPad) produce the same answer: 91 for the value of n I selected, but you can choose whatever you want.
As others indicated, this is O(n^2). (You can also see this Q&A for more details on that).
We can see from the explanations above that the total number of iterations of the loop is n(n+1)/2.
Now let's find the asymptotic time complexity in an easy, logical way.
Big O comes into play when n is a large number; in such cases we need not consider the division by 2, because a constant factor does not change the order of growth (a large number divided by 2 is still a large number).
This leaves us with n(n+1).
Since n is a large number, (n+1) can be approximated by n,
leaving us with n*n.
Hence the time complexity is O(n^2).

Understanding the theoretical run time of a function with nested loops

I've been trying to understand Big-O notation. Earlier today, I was given a function to practice with and told that it is O(n^5). I've tried calculating it on my own but don't know if I've calculated T(n) correctly.
Here are my two questions:
1) Did I calculate T(n) correctly, and if not, what did I do wrong?
2) Why do we only concern ourselves with the variable raised to the highest power?
1 sum = 0;                        // 1 = 1
2 for (i = 0; i < n; i++)         // 1 + n + 2(n-1) = 1 + n + 2n - 2 = 3n - 1
3     for (j = 0; j < i*i; j++)   // n + n*n + 2n(n-1) = n + n^2 + 2n^2 - 2n = 3n^2 - n
4         for (k = 0; k < j; k++) // n + n*n + 4n(n-1) = n + n^2 + 4n^2 - 4n = 5n^2 - 3n
5             sum++;
6         k++;
7     j++;
8 i++;
// So now that I have simplified everything, I multiplied the equations on lines 2-4 and added line 1:
// T(n) = 1 + (3n-1)(3n^2-n)(5n^2-3n) = 45n^5 - 57n^4 + 23n^3 - 3n^2 + 1
The innermost loop runs j times.
The second loop runs j from 0 to i^2, so for each i the two inner loops together perform 0 + 1 + ... + (i^2 - 1) = i^2(i^2 - 1)/2 steps, a sum of integers.
The outer loop then sums this over i up to n, which involves sums of squares and fourth powers of integers; the i^4 term dominates, giving Θ(n^5).
We only take the highest power because, as n approaches infinity, the highest power of n (the order) will always dominate, irrespective of its coefficient.

Finding the big theta bound

Give big theta bound for:
for (int i = 0; i < n; i++) {
    if (i * i < n) {
        for (int j = 0; j < n; j++) {
            count++;
        }
    } else {
        int k = i;
        while (k > 0) {
            count++;
            k = k / 2;
        }
    }
}
So here's what I think (not sure if it's right, though):
The first for loop runs for n iterations. The for loop inside the if also runs for n iterations, giving O(n^2).
For the else branch, the while loop divides k by 2 each time, so it runs in log n time, giving O(n log n) over all the iterations. So the entire thing looks like n^2 + n log n, and taking the bigger term, the answer would be Θ(n^2)?
I would be careful here: i * i < n only holds while i < √n, so the inner for loop (n iterations each time) runs for about √n values of i, contributing √n · n = n^(3/2) operations. The else branch runs for the remaining roughly n values of i, and each while loop performs O(log i) = O(log n) halvings, contributing O(n log n) in total. Since n^(3/2) grows faster than n log n, the if part dominates and the overall bound is Θ(n^(3/2)), not Θ(n^2).
Example:
n = 10000
The inner for loop is only taken for i = 0..99, but those passes alone cost 100 · 10000 = 10^6 operations, while the else part for i = 100..9999 costs only about 9900 · 14 ≈ 1.4 · 10^5.