How to calculate the time complexity of such a loop

What is the time complexity of this loop?
for (int i = 0; i < array.length; i++) {
    for (int j = i + 1; j < array.length; j++) {
        // Some O(1) operation
    }
}

Two nested loops: O(n^2).
The time complexity of a nested loop equals the total number of times the innermost statement is executed. Here, with n = array.length, the inner loop runs n-1 times when i = 0, n-2 times when i = 1, and so on down to 0, for a total of (n-1) + (n-2) + ... + 1 + 0 = n(n-1)/2 = O(n^2).
For more on the time complexity of loops (and a fun read), refer to:
https://www.enjoyalgorithms.com/blog/time-complexity-analysis-of-loop-in-programming
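If you want to convince yourself empirically, here is a quick counter (my own snippet, in the same spirit as the addendum further down) whose output matches the closed form n(n-1)/2; the test sizes are arbitrary:

public class NestedLoopCount {
    static void test(int n) {
        long counter = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                counter++; // stands in for the O(1) operation
            }
        }
        // n(n-1)/2 is the closed form derived above
        System.out.println("n:" + n + ", counter:" + counter
                + ", n(n-1)/2:" + (long) n * (n - 1) / 2);
    }

    public static void main(String[] args) {
        test(10);   // 45
        test(100);  // 4950
        test(1000); // 499500
    }
}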

Related

Time complexity of 3 nested for loops where the third loop depends on the first

I'm trying to find the time complexity of 3 nested for loops. I'm a little lost on how to do this because the first and third loops are dependent. From what I did, I found that the pattern is n(1 + 2 + 3 + ...), so O(n^2), but I'm unsure if that's right. I'm also unsure whether this accounts for the j loop, or whether I have to multiply my current answer by n. Any help is much appreciated.
for (int i = 0; i < n*n; i++) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < i; k++) {
            // print some statement here
        }
    }
}
Short Answer:
Assuming the innermost loop operation is O(1), the time complexity of your code is O(n^5).
Longer Answer:
Let's start with a simpler example of 2 dependent loops:
for (int i = 0; i < n; ++i) {
    for (int j = 0; j < i; ++j) {
        // Some O(1) operation
    }
}
The outer loop will run n times, and the inner loop will run 0, 1, ..., n-1 times as i increases, i.e. on average:
(0 + 1 + ... + (n-1))/n = (n(n-1)/2)/n = (n-1)/2 = O(n)
So the overall complexity of this simpler example is O(n) * O(n) = O(n^2).
Now to your case (again assuming the operation in the innermost loop is done in O(1)):
for (int i = 0; i < n*n; i++) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < i; k++) {
            // Some O(1) operation
        }
    }
}
The 1st (outermost) loop will run n^2 times.
The 2nd (middle) loop will run n times.
So the two outer loops together run in O(n^3).
The inner loop now runs 0, 1, ..., n^2 - 1 times as i increases (instead of 0...n-1), which is on average O(n^2):
(0 + 1 + ... + (n^2 - 1))/n^2 = (n^2(n^2 - 1)/2)/n^2 = (n^2 - 1)/2 = O(n^2)
Therefore the overall time complexity is O(n^3) * O(n^2) = O(n^5).
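In fact, you can count the iterations exactly: for each pair (i, j) the innermost statement executes i times, so the total is
n * (0 + 1 + ... + (n^2 - 1)) = n * (n^2 - 1)(n^2)/2 = (n^5 - n^3)/2
which is Θ(n^5), and which exactly matches the counter values printed by the addendum below (e.g. (10^5 - 10^3)/2 = 49500 for n = 10).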
Addendum:
The code below is by no means a proof of this complexity, since measuring specific values of n proves nothing about the asymptotic behavior of the time function, but it can give you a "feel" for the number of operations that are done.
#include <cstdint>
#include <iostream>

void test(int n)
{
    int64_t counter = 0;
    for (int i = 0; i < n * n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < i; k++) {
                counter++; // stands in for the O(1) operation
            }
        }
    }
    std::cout << "n:" << n << ", counter:" << counter << std::endl;
}

int main()
{
    test(10);
    test(100);
    test(1000);
}
Output:
n:10, counter:49500
n:100, counter:4999500000
n:1000, counter:499999500000000
I believe it is quite clear that the number of operations is close to n^5/2, and since constant factors like 1/2 are dropped in big-O notation, this is O(n^5).

Best time complexity of a single loop?

I have a really simple question. I have this loop:
for (int i = 0; i < n; i++) {
    // some O(n) stuff here
}
What will be the BEST-case time complexity of this algorithm?
O(n)? (for loop O(1) * O(n) stuff)
or
O(n^2)? (for loop O(n) * O(n) stuff inside the loop)
Will the for loop itself be considered O(n) as usual, or will it be considered O(1), since it makes only 1 iteration in the BEST-case scenario?
You are right: the best-case time complexity is O(N) (and even Θ(N)), provided the best-case running time of the "stuff" is constant (even zero).
More generally, if the "stuff" is known to take Ω(f(N)) time even in the best case, then the best-case total time is Ω(N f(N)).
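For a concrete illustration (an example of my own, not from the question): suppose the "stuff" is a scan that can exit early. Then the loop structure is identical, but the best case of the whole algorithm is Θ(n) while the worst case is Θ(n^2):

// Hypothetical example: the inner "stuff" is O(n) in the worst case
// but O(1) in the best case, because it can exit on the first mismatch.
static int countMatchingPrefixes(int[] a, int value) {
    int total = 0;
    for (int i = 0; i < a.length; i++) {     // always n iterations
        for (int j = i; j < a.length; j++) { // the O(n) "stuff"
            if (a[j] != value) break;        // best case: exits immediately
            total++;
        }
    }
    return total;
}
// If no element equals value, every inner scan stops at once: Θ(n) total.
// If every element equals value, every inner scan runs to the end: Θ(n^2) total.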
If your loop does O(n) stuff n times, then the time complexity will be O(n^2); you may call that the worst case. The best and average cases depend on the "some O(n) stuff" that is executed in every iteration of your loop.
Let's take the simple example of the bubble sort algorithm:
for (int i = 0; i < n - 1; ++i) {
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
        }
    }
}
This will always take O(n^2) time, whether or not the array is already sorted (in either order, ascending or descending).
But it can be optimized. The nth pass finds the nth largest element and puts it into its final place, which is why the inner loop already stops at n - i - 1. On top of that, if an entire pass performs no swap at all, the array must already be sorted, so we can stop early by tracking this with a flag:
for (int i = 0; i < n - 1; ++i) {
    bool swapped = false;
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
            swapped = true;
        }
    }
    if (!swapped) {
        break; // no swap in this pass: the array is already sorted
    }
}
Now the best-case time complexity is O(n), namely when the array is already sorted in ascending order (in the context of the above implementation). The average and worst cases are still O(n^2).
So, to identify the best-case time complexity of your algorithm, you have to show us the implementation of the "some O(n) stuff", or at least the algorithm you are trying to implement.
As you stated it, it's O(n^2), because you are doing an O(n) operation n times.

Nested loops with time complexity log(log n)

Can there be an algorithm with two (nested) loops such that the overall time complexity is O(log(log n))?
This question arose after solving the following:
for (i = N; i > 0; i = i/2) {
    for (j = 0; j < i; j++) {
        print "hello world"
    }
}
The above code has time complexity O(N), using the concept of a geometric progression: the inner loop runs N + N/2 + N/4 + ... <= 2N times in total. Can there exist a similar loop with time complexity O(log(log n))?
For a loop to iterate O(log log n) times, where the loop index counts up to n, the index variable has to grow like the inverse of the function k -> log log k, i.e. like 2^(2^k) (with base 2 or some other base).
One way to achieve this is to start at 2 and repeatedly square until you reach n: if the index variable is (((2^2)^2)...^2) with k squarings, then it equals 2^(2^k), as required:
for (int i = 2; i < n; i = i*i) {
    //...
}
This loop iterates O(log log n) times, as required. If you absolutely must use nested loops, you can trivially add an extra loop that iterates O(1) times, leaving the total number of iterations asymptotically the same:
for (int i = 2; i < n; i = i*i) {
    for (int j = 0; j < 10; j++) {
        // ...
    }
}
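As a sanity check (my own snippet, in the same spirit as the addendum above), counting the iterations of the squaring loop shows the doubly-logarithmic growth; i is held in a long so that the squaring does not overflow for these test sizes:

public class LogLogDemo {
    static void test(long n) {
        int iterations = 0;
        // i takes the values 2, 4, 16, 256, 65536, ... i.e. 2^(2^k)
        for (long i = 2; i < n; i = i * i) {
            iterations++;
        }
        System.out.println("n:" + n + ", iterations:" + iterations);
    }

    public static void main(String[] args) {
        test(100);    // 3 iterations (log2(log2(100))  ~ 2.7)
        test(1000);   // 4 iterations (log2(log2(1000)) ~ 3.3)
        test(100000); // 5 iterations (log2(log2(1e5))  ~ 4.1)
    }
}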

Time complexity of dependent nested logarithmic loop

What is the Big-O complexity of the following loop?
for (int i = 2; i <= n; i = i*2) {
    for (int k = i; k <= n; k = k*2) {
        times++;
    }
}
The outer loop is of course logarithmic, but I am not sure how to deal with the increasing starting value of the inner loop. It looks similar to a logarithmic complexity, but how do I account for the k = i starting condition?
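In the spirit of the addendum above, a quick counter (my own snippet) gives a feel for the growth: for n = 2^L the inner statement runs exactly L + (L-1) + ... + 1 = L(L+1)/2 times, i.e. Θ(log^2 n):

public class DependentLogLoops {
    static long count(int n) {
        long times = 0;
        for (int i = 2; i <= n; i = i * 2) {
            for (int k = i; k <= n; k = k * 2) {
                times++;
            }
        }
        return times;
    }

    public static void main(String[] args) {
        // For n = 2^L the count is exactly L(L+1)/2
        System.out.println(count(16));      // L = 4  -> 10
        System.out.println(count(1024));    // L = 10 -> 55
        System.out.println(count(1 << 20)); // L = 20 -> 210
    }
}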

Time complexity (studying for an exam)

I am currently studying for an exam on algorithms and I am trying to solve a question about time complexity in Java, but I can't really figure out how to do it. I am supposed to calculate the expected time complexity. N is a positive integer.
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++) {
        int x = j+1; int h = N-1; int k;
        while (x < h) {
            k = (x+h)/2;
            if (a[i]+a[j]+a[k] == 0) { cnt++; break; }
            if (a[i]+a[j]+a[k] < 0) x = k+1;
            else h = k-1;
        }
    }
The first for loop should run N times and the second should run N-1 times. Since x is j+1, I guessed that x = N-2. I don't know how to reason about the while loop after that, or whether what I have done so far is right. Would really appreciate help!
Build up your time complexity function in parts.
for (int i = 0; i < N; i++)              // runs N times: O(N)
    for (int j = i+1; j < N; j++) {      // runs at most N-1 times: O(N); the -1 is irrelevant in big-O notation
        int x = j+1; int h = N-1; int k; // 3 x O(1)
        // The interval [x, h] is halved on every iteration: k is its midpoint,
        // and either x moves up to k+1 or h moves down to k-1. This is a binary
        // search, so the while loop runs at most O(log N) times.
        while (x < h) {
            k = (x+h)/2;                  // O(1)
            if (a[i]+a[j]+a[k] == 0) {    // O(1); if true, we break out of the while loop
                cnt++;                    // O(1)
                break;                    // O(1)
            }
            if (a[i]+a[j]+a[k] < 0) {     // O(1)
                x = k+1;                  // O(1)
            } else {
                h = k-1;                  // O(1)
            }
        }
    }
So in summary, T(N) is O(N^2 log N) in the worst case and Ω(N^2) in the best case.
More specifically: the two outer loops produce about N(N-1)/2 pairs (i, j), and for each pair the while loop is a binary search over an interval of at most N-2 elements, which takes O(log N) time, not O(N), because the interval is halved at every step.
We are usually only interested in the worst and best cases.
To spell out the notation, T(N) = O(g(N)) means this: there exist constants c > 0 and N0 such that T(N) <= c * g(N) for all N >= N0.
I hope this answer helps, even a little bit...
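Addendum: as an empirical sanity check (my own snippet; the array contents here are arbitrary positive values, chosen so the break never fires and the binary search always runs its full course), the number of while-loop iterations grows like N^2 log N rather than N^3:

public class ThreeSumSearchCount {
    static long countWhileIterations(int N) {
        int[] a = new int[N];
        for (int i = 0; i < N; i++) a[i] = i + 1; // all positive: the sum is never 0
        long ops = 0;
        for (int i = 0; i < N; i++)
            for (int j = i + 1; j < N; j++) {
                int x = j + 1, h = N - 1, k;
                while (x < h) {
                    ops++; // one iteration of the while loop
                    k = (x + h) / 2;
                    if (a[i] + a[j] + a[k] == 0) break;
                    if (a[i] + a[j] + a[k] < 0) x = k + 1;
                    else h = k - 1;
                }
            }
        return ops;
    }

    public static void main(String[] args) {
        for (int N : new int[]{100, 1000, 4000}) {
            long ops = countWhileIterations(N);
            double ratio = ops / (N * (double) N * (Math.log(N) / Math.log(2)));
            // The ratio stays roughly constant as N grows, while ops/N^3 keeps shrinking.
            System.out.println("N:" + N + ", while iterations:" + ops
                    + ", ops / (N^2 log2 N): " + ratio);
        }
    }
}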