I assume this code is an ideal representation of O(n^2) complexity, because it is a for loop nested inside another for loop.
for (int i = 0; i < array.length; i++)
    for (int j = 0; j < array.length; j++)
        System.out.println(array[i] + "," + array[j]);
I also read that the code below represents O(ab) time complexity. But why is that? I don't understand, because the check if (arrayA[i] < arrayB[j]) is constant-time, so we can ignore it.
for (int i = 0; i < arrayA.length; i++)
    for (int j = 0; j < arrayB.length; j++)
        if (arrayA[i] < arrayB[j])
            System.out.println(arrayA[i] + "," + arrayB[j]);
This is also given as O(ab), even though for (int k = 0; k < 160800; k++) also runs only a constant number of times.
for (int i = 0; i < arrayA.length; i++)
    for (int j = 0; j < arrayB.length; j++)
        for (int k = 0; k < 160800; k++)
            System.out.println(arrayA[i] + "," + arrayB[j]);
Different sites write different information about it.
In the first case, each array is the same length (n), and n*n prints are done.
In the second, the sizes of the arrays are a & b, and a*b ifs are done, and (potentially) that many prints are done (maybe everything in A is less than everything in B).
In the third, the sizes of the arrays are a & b, and (a*b)*160800 prints are done, but the constant can be ignored.
I also think that the OP is missing an important point. It's not just that the first algorithm is O(n^2); it's that it is O(n^2) where n is the length of the array. Though it is usually left implicit, the variables we use in big-O notation must have some relationship to the inputs of the algorithm.
Likewise, the second algorithm is O(ab), where a=length(arrayA) and b=length(arrayB).
Regarding the if statement: if the condition is false, those two lines run in some small constant time. If it is true, they run in some slightly larger, but still constant, time. The goal of big-O notation is to ignore constants and just see how the running time of the algorithm relates to its inputs. So a constant is a constant is a constant.
Likewise for the third program. The loop is run a constant number of times. Hence it takes a constant amount of time. A constant is a constant, even if it's large.
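To see concretely that a constant inner loop only scales the operation count by a fixed factor, here is a minimal Java sketch (the class and helper names are mine, and it uses K = 1000 instead of 160800 purely so it runs quickly; the argument is identical for any constant):

```java
public class ConstantLoopDemo {
    static final int K = 1000; // stand-in for the constant 160800

    // Count how many times the innermost statement would execute
    // for arrays of length a and b.
    static long count(int a, int b) {
        long ops = 0;
        for (int i = 0; i < a; i++)
            for (int j = 0; j < b; j++)
                for (int k = 0; k < K; k++)
                    ops++;
        return ops;
    }

    public static void main(String[] args) {
        // The count is always a * b * K: the ratio count/(a*b) is the
        // constant K no matter how a and b grow, so the growth is O(ab).
        System.out.println(count(3, 5));   // 3 * 5 * K
        System.out.println(count(10, 20)); // 10 * 20 * K
    }
}
```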
Related
I am having a problem identifying the time complexity of this nested-loop code. Some of my friends say O(n^3) and some say O(n^5).
sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < i*i; j++)
        for (int k = 0; k < j; k++)
            sum++;
WolframAlpha gives the total count of increments to sum as

sum_{i=0}^{n-1} sum_{j=0}^{i^2 - 1} sum_{k=0}^{j-1} 1
= (1/20) (n - 2)(n - 1) n (n + 1)(2n - 1)
= n^5/10 - n^4/4 + n^2/4 - n/10,

which is in Θ(n^5).
I would say time complexity is about N * (N*N)/2 * N/2. Combined it would be O(N^4).
Edit: it's O(N^5), because the inner loop's bound j itself grows as large as i^2, so the middle loop's square carries into the inner loop.
But don't take my word for it. For these kinds of questions, why don't you run your code with a few different values of N and compare the sums? You will figure out the time complexity soon enough.
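Following that suggestion, here is a small Java sketch (the class and helper names are mine, purely for illustration) that counts the increments directly and compares them with the closed form quoted above:

```java
public class TripleLoopCount {
    // Count how many times sum++ executes for a given n.
    static long count(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i * i; j++)
                for (int k = 0; k < j; k++)
                    sum++;
        return sum;
    }

    // Closed form from the WolframAlpha result: (n-2)(n-1)n(n+1)(2n-1)/20.
    static long formula(int n) {
        return (long) (n - 2) * (n - 1) * n * (n + 1) * (2L * n - 1) / 20;
    }

    public static void main(String[] args) {
        for (int n : new int[]{4, 10, 20}) {
            System.out.println("n=" + n + ": count=" + count(n)
                    + ", formula=" + formula(n));
        }
    }
}
```

For each n the two numbers agree, and growing n by a factor of 2 grows the count by roughly 2^5 = 32, which is the O(n^5) signature.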
What is the complexity of the second for loop? Would it be n-i? From my understanding, the first for loop runs n times, but the index in the second for loop starts at i instead of 0.
// where n is the number of elements in the array
for (int i = 0; i < n; i++) {
    for (int j = i; j < n; j++) {
        // some constant-time task
    }
}
In total, the inner loop iterates sum(1..n) times, which is n*(n+1)/2, which is O(n^2).
If you try to visualise this as a matrix where rows represent i and columns represent j, you'll see that it forms a triangle with sides of length n.
Example with n being 4
0 1 2 3
1 2 3
2 3
3
The inner loop has (on average) complexity n/2 which is O(n).
The total complexity is n*(n+1)/2, which is O(n^2).
The number of steps this takes is a triangle number. Here's a bit of code I put together in LINQPad (yeah, sorry about answering in C#, but hopefully this is still readable):
void Main()
{
    long k = 0;
    // Whatever you want
    const int n = 13;
    for (int i = 0; i < n; i++)
    {
        for (int j = i; j < n; j++)
        {
            k++;
        }
    }
    k.Dump();
    triangleNumber(n).Dump();
    (((n * n) + n) / 2).Dump();
}

int triangleNumber(int number)
{
    if (number == 0) return 0;
    else return number + triangleNumber(number - 1);
}
All 3 print statements (.Dump() in LINQPad) produce the same answer (91 for the value of n I selected, but again you can choose whatever you want).
As others indicated, this is O(n^2). (You can also see this Q&A for more details on that).
We can see that the total iteration of the loop is n*(n+1)/2. I am assuming that you are clear with that from the above explanations.
Now let's find the asymptotic time complexity in an easy logical way.
Big O comes into play when n is a large number. In such cases we need not consider the division by 2 (2 is a constant), because a large number divided by 2 is still a large number.
This leaves us with n*(n+1).
As explained above, since n is large, (n+1) can be approximated by n,
thus leaving us with n*n,
hence the time complexity O(n^2).
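As a quick sanity check, this minimal Java sketch (the class and helper names are mine) counts the inner-loop iterations directly and compares them with n*(n+1)/2:

```java
public class TriangleCount {
    // Count how many times the inner body runs for a given n.
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++)
            for (int j = i; j < n; j++)
                c++;
        return c;
    }

    public static void main(String[] args) {
        for (int n : new int[]{4, 13, 100}) {
            // n*(n+1)/2 is the n-th triangle number.
            System.out.println("n=" + n + ": count=" + count(n)
                    + ", n*(n+1)/2=" + (long) n * (n + 1) / 2);
        }
    }
}
```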
I'm just trying to calculate the complexity of some program fragments, but I'm worried I'm making things too simple. If I put my fragments and answers down, can you tell me if I'm doing anything wrong?
(a)
sum = 0;
for (i = 0;i < n;i++)
sum++;
ANSWER: n, only one for loop
(b)
sum = 0;
for (i = 0;i < n;i++)
for (k = 0;k < n*n;k++)
sum++;
ANSWER: n^2 because of the nested loop, although I wonder if the n*n in the nested loop makes it n^3
(c)
sum = 0;
for (i = 0;i < n;i++)
for (k = 0;k < i;k++)
sum++;
ANSWER: n^2
(d)
sum = 0;
for (i = 0;i < n;i++)
for (k = 0;k < i*i;k++)
sum++;
ANSWER: n^2, but I have the same concern as b
(e)
sum= 0;
for (i = 0;i < n;i++)
for (k = i;k < n;k++)
sum++;
ANSWER: n^2
Since in all your examples the main operation is sum++, we are bound to count the number of times this basic operation is performed.
Also, in all cases there is the i++ operation, which also counts, as well as the k++. Finally, these counters have to be compared with their limits at every step, and we should take these comparisons into account too. Now, these additional operations don't change the number of iterations; they simply make each iteration more expensive. For instance,
(a)
sum = 0;
for (i = 0;i < n;i++)
sum++;
repeats n times: i++, sum++ and i<n, all of which gives 3n operations of similar complexity. This is why the total complexity is O(n).
Once this has been understood, it is no longer necessary to analyze the complexity in as much detail, because the big-O notation will take care of these additional calculations.
The second example
sum = 0;
for (i = 0;i < n;i++)
for (k = 0;k < n*n;k++)
sum++;
repeats n times the operation
for (k = 0;k < n*n;k++)
sum++;
Because of the previous case, this operation has complexity O(n*n) as here the limit is n*n rather than n. Thus the total complexity is O(n*n*n).
The third example is similar, except that this time the operation being executed n times is
for (k = 0;k < i;k++)
sum++;
which has a complexity that changes with i. Therefore, instead of multiplying by n we have to sum n different things:
O(1) + O(2) + ... + O(n)
and since the constant factor implicit in the O is always the same (= number of variables being increased or compared at every elementary step), we can rewrite it as
O(1 + 2 + ... + n) = O(n(n+1)/2) = O(n*n)
The other examples are similar in that they can be analyzed following these very same ideas.
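To tie the five fragments together, here is a short Java sketch (the class and one-letter helper names are mine, matching the fragment labels) that counts sum for each fragment; the comments give the exact counts that follow from the loop bounds, which resolve the asker's doubts about (b) and (d):

```java
public class FragmentCounts {
    static long a(int n) { long s = 0; for (int i = 0; i < n; i++) s++; return s; }
    static long b(int n) { long s = 0; for (int i = 0; i < n; i++) for (long k = 0; k < (long) n * n; k++) s++; return s; }
    static long c(int n) { long s = 0; for (int i = 0; i < n; i++) for (int k = 0; k < i; k++) s++; return s; }
    static long d(int n) { long s = 0; for (int i = 0; i < n; i++) for (long k = 0; k < (long) i * i; k++) s++; return s; }
    static long e(int n) { long s = 0; for (int i = 0; i < n; i++) for (int k = i; k < n; k++) s++; return s; }

    public static void main(String[] args) {
        int n = 10;
        System.out.println(a(n)); // n              -> O(n)
        System.out.println(b(n)); // n^3            -> O(n^3), the n*n bound does matter
        System.out.println(c(n)); // n(n-1)/2       -> O(n^2)
        System.out.println(d(n)); // (n-1)n(2n-1)/6 -> O(n^3), same reason as (b)
        System.out.println(e(n)); // n(n+1)/2       -> O(n^2)
    }
}
```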
I'm having trouble understanding the time complexity of this pseudocode.
p = 10;
num = 0;
plimit = 100000;
for (i = p; i <= plimit; i++)
    for (j = 1; j <= i; j++)
        num = num + 1;
I think it will be a linear search, but just wanted to confirm.
It's not linear time. The inner loop's cost grows as i increments on each outer iteration, so the total work is 1 + 2 + 3 + ... + n, which is n(n+1)/2, giving O(n^2).
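A quick way to see this is to count the increments directly. This Java sketch (the class name and helper are mine, and it uses a smaller plimit than the question's 100000 so it runs instantly) compares the count with the closed-form sum:

```java
public class NestedSumCount {
    // Run the pseudocode's loops and return the final value of num.
    static long count(int p, int plimit) {
        long num = 0;
        for (int i = p; i <= plimit; i++)
            for (int j = 1; j <= i; j++)
                num = num + 1;
        return num;
    }

    public static void main(String[] args) {
        int p = 10;
        int plimit = 1000; // smaller than the question's 100000, for speed

        // num is the sum of i for i = p..plimit:
        // plimit*(plimit+1)/2 - (p-1)*p/2, a quadratic in plimit.
        long closedForm = (long) plimit * (plimit + 1) / 2
                        - (long) (p - 1) * p / 2;
        System.out.println(count(p, plimit) + " " + closedForm);
    }
}
```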
Here is a nested for loop. I worked out its time complexity to be n*lg(n).
int sum = 0;
for (int k = 1; k < n; k *= 2) {
    for (int i = 1; i < n; i++) {
        sum++;
    }
}
My thoughts are as follows.
The outer for loop: k takes the values 1, 2, 4, 8, ..., so it runs about lg(n) iterations.
The inner for loop: i runs about n iterations.
Hence, the overall number of operations is about n*lg(n).
Am I right?
Yes, the order of growth you suggested is correct. You can show it like the following:
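For example, you can count the increments directly and factor the count into (inner iterations) * (number of doublings); a minimal Java sketch (the class and helper names are mine):

```java
public class LogLinearCount {
    // Count how many times sum++ runs for a given n.
    static long count(int n) {
        long sum = 0;
        for (int k = 1; k < n; k *= 2)
            for (int i = 1; i < n; i++)
                sum++;
        return sum;
    }

    // Number of outer iterations: how many times k doubles before reaching n.
    static int doublings(int n) {
        int d = 0;
        for (int k = 1; k < n; k *= 2) d++;
        return d;
    }

    public static void main(String[] args) {
        for (int n : new int[]{8, 16, 1000}) {
            // count(n) = (n - 1) * doublings(n), and doublings(n) is about
            // lg(n), so the growth is Theta(n lg n).
            System.out.println(count(n) + " = " + (n - 1) + " * " + doublings(n));
        }
    }
}
```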