Time complexity of nested loop with multiplication?

I'm confused about how to find the time complexity of this nested loop:
for (int i = 0; i < n; i++) {
    i *= 2;
    for (int j = 0; j < i; j++) {
    }
}
And what is the time complexity of the outer loop alone (just ignoring the inner loop)?
for (int i = 0; i < n; i++) {
    i *= 2;
}

I ran the following code, which is a modified version of the second snippet from the question.
let it = 0
let pow = 1000
let n = Math.pow(2, pow)
for (let i = 0; i < n; i++) {
    it++
    i *= 2
}
console.log(it)
It prints either pow + 1 or pow, which implies that the loop runs about log2(n) times.
Although it contains a single for loop, it modifies the iteration variable inside the loop body. In every iteration, the value of i is roughly doubled, which halves the remaining number of iterations. So instead of O(n), it is O(log n). It is similar to binary search, if that helps you understand better: in every iteration, binary search halves the remaining search space.
The first code is less obvious. The outer loop runs about log2(n) times, and the inner loop runs i times, where i takes the doubled values 0, 2, 6, 14, …, roughly doubling each iteration. Summing this geometric progression gives about 2n inner iterations in total, so the first code is O(n), not O(n log n) as you might first guess.
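As a quick sanity check (a sketch of my own, not from the original thread), the following C program counts the total number of inner-loop iterations of the first snippet for growing n. The total staying within a small constant factor of n supports the O(n) bound:

#include <stdio.h>

int main(void) {
    /* Count the total inner-loop iterations of the first snippet for
       growing n; if the total stays within a constant factor of n,
       the nested loop is O(n), not O(n log n). */
    for (long long n = 1000; n <= 100000000; n *= 10) {
        long long inner = 0;
        for (long long i = 0; i < n; i++) {
            i *= 2;
            for (long long j = 0; j < i; j++) {
                inner++;
            }
        }
        printf("n = %lld: inner iterations = %lld (2n = %lld)\n",
               n, inner, 2 * n);
    }
    return 0;
}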

Best time complexity of a single loop?

I have a really simple question. I have this loop:
for (int i = 0; i < n; i++) {
    // some O(n) stuff here
}
What will be the BEST time complexity of this algorithm?
O(n)? (the for loop counted as O(1), times the O(n) stuff)
or
O(n^2)? (the for loop counted as O(n), times the O(n) stuff inside it)
Will the for loop itself be considered O(n) as usual, or will it be considered O(1), since it would make only one iteration in the BEST-case scenario?
You are right: the best-case time complexity is O(N) (and even Θ(N)), provided the best running time of the "stuff" is constant (even zero).
In general, if the "stuff" is known to be best-case Ω(f(N)), then the best-case total time is Ω(N · f(N)).
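As a concrete (hypothetical) illustration of that point, suppose the "stuff" is a linear scan that exits on its first hit; the names contains and algorithm below are my own:

#include <stdbool.h>
#include <stddef.h>

/* Hypothetical "stuff": a linear scan that is O(n) in the worst case
   but O(1) in the best case (target found at index 0). */
static bool contains(const int *a, size_t n, int target) {
    for (size_t i = 0; i < n; i++) {
        if (a[i] == target) {
            return true; /* best case: hit on the very first element */
        }
    }
    return false;
}

/* The whole loop is then Theta(n^2) in the worst case but only
   Theta(n) in the best case, matching the answer above. */
static void algorithm(const int *a, size_t n, int target) {
    for (size_t i = 0; i < n; i++) {
        contains(a, n, target);
    }
}

int main(void) {
    int a[] = {42, 1, 2, 3};
    algorithm(a, 4, 42); /* target at index 0: the best case */
    return 0;
}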
If your loop is doing O(n) stuff n times, then the time complexity will be O(n^2). You may call that the worst case. The best and average cases depend on the "some O(n) stuff" that is executed in every iteration of your loop.
Let's take a simple example, the bubble sort algorithm:
for (int i = 0; i < n - 1; ++i) {
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
        }
    }
}
The time complexity of this will always be O(n^2), whether the array is already sorted or not (irrespective of order, ascending or descending).
The inner loop already avoids looking at the last i items, since the ith pass puts the ith largest element into its final place. But the code can be optimised further by observing that if a whole pass performs no swap, the array is already sorted and we can stop early:
for (int i = 0; i < n - 1; ++i) {
    bool swapped = false;
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
            swapped = true;
        }
    }
    if (swapped == false) {
        break;
    }
}
Now the best-case time complexity is O(n), i.e. when the array is already sorted in ascending order (in the context of the above implementation). The average and worst cases are still O(n^2).
So, to identify the best-case time complexity of your algorithm, you have to show us the implementation of the "some O(n) stuff", or at least the algorithm you are trying to implement. The demo below illustrates the best case.
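Here is a small self-contained demo (my addition) that runs the flagged version on an already sorted array and counts comparisons; a single pass of n - 1 comparisons is the O(n) best case:

#include <stdio.h>
#include <stdbool.h>

static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

int main(void) {
    int a[] = {1, 2, 3, 4, 5, 6, 7, 8};
    int n = 8;
    long comparisons = 0;
    for (int i = 0; i < n - 1; ++i) {
        bool swapped = false;
        for (int j = 0; j < n - i - 1; ++j) {
            comparisons++;
            if (a[j] > a[j + 1]) {
                swap(&a[j], &a[j + 1]);
                swapped = true;
            }
        }
        if (!swapped) {
            break; /* no swaps in a full pass: the array is sorted */
        }
    }
    printf("comparisons = %ld (n - 1 = %d)\n", comparisons, n - 1);
    return 0;
}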
As you stated it, it's O(n^2), because you are doing an O(n) operation n times.

Big-O of fixed loops

I was discussing some code during an interview and don't think I did a good job articulating one of my blocks of code.
I know (at a high level) we are taught that two nested for loops == O(n^2), but what happens when you make some assertions as part of the work that limit the work done to a constant amount?
The code I came up with was something like
String[] someVal = new String[]{"a", "b", "c", "d"}; // this was really some other computation
if (someVal.length != 4) {
    return false;
}
for (int i = 0; i < someVal.length; i++) {
    String subString = someVal[i];
    if (subString.length() != 8) {
        return false;
    }
    for (int j = 0; j < subString.length(); j++) {
        // do some other stuff
    }
}
So there are two for loops, but the number of iterations becomes fixed because of the length checks before proceeding.
for (int i = 0; i < 4; i++) {
    String subString = someVal[i];
    if (subString.length() != 8) { return false; }
    for (int j = 0; j < 8; j++) {
        // do some other stuff
    }
}
I tried to argue this made it constant, but didn't do a great job.
Was I completely wrong or off-base?
Your early-exit condition inside the outer for loop is if (subString.length() != 8), so your second for loop is executed only when the length is exactly 8. This in fact makes the complexity of the second for loop constant, as it does not depend on the input size. And before your first for loop you have another early-exit condition, if (someVal.length != 4), making the first for loop constant as well.
So yes, I would follow your argumentation that the complete function has constant big-O time complexity. It may help to repeat in the explanation that big-O describes an upper bound on the complexity, which will never be exceeded, and that constant factors can be reduced to 1.
But keep in mind that a constant-complexity function could, on real-world input, still take longer to execute than an O(n) one, depending on the size of n: here the fixed loops always perform 4 × 8 = 32 inner iterations, which is more work than a single O(n) loop does for any n < 32. If it were a known precondition that n never grows beyond a (low) bound, I would argue not about the big-O complexity but about the overall expected runtime, where the second, constant for loop could have a larger impact than the big-O analysis suggests.
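A minimal sketch of why the fixed bounds make the block O(1) (my addition, in C for brevity): the iteration count is the same no matter what the input was:

#include <stdio.h>

int main(void) {
    /* With the bounds fixed at 4 and 8, the nested loops always perform
       4 * 8 = 32 inner iterations, independent of any input size. */
    int iterations = 0;
    for (int i = 0; i < 4; i++) {
        for (int j = 0; j < 8; j++) {
            iterations++;
        }
    }
    printf("iterations = %d\n", iterations); /* always prints 32 */
    return 0;
}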

Does a for loop have to have code inside the loop body to be considered O(n)?

function doNothing(n) {
    for (let i = 0; i < n; i++);
}
Does a for loop that does nothing count as O(n)? Or would this be constant time?
The for loop doesn't "do nothing": on each iteration it compares i against n and increments i by one. This is, of course, ignoring the fact that a compiler may choose to remove this code completely.
So the for loop for (let i = 0; i < n; i++); is O(n). If, however, the bound is a constant, as in for (let i = 0; i < 5; i++);, you do only five things, which is a constant amount of work, so it is O(1) (i.e. 5 ∈ O(1)). But remember: this is because five is a constant, not because the loop does nothing.
If the bound is the constant 5, this is constant time, as i will always run from 0 to 5, and the time complexity of the piece of code will be O(1); with the bound n, it is O(n).
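To see that the empty loop's cost really grows with n (and to sidestep the compiler-elimination caveat mentioned above), here is a small timing sketch of my own in C; volatile prevents the compiler from deleting the loop:

#include <stdio.h>
#include <time.h>

int main(void) {
    /* Time the empty loop for doubling n; roughly doubling runtimes
       indicate linear, i.e. O(n), cost despite the empty body. */
    for (long n = 100000000L; n <= 400000000L; n *= 2) {
        clock_t start = clock();
        for (volatile long i = 0; i < n; i++)
            ; /* empty body: only the compare and increment remain */
        double ms = 1000.0 * (clock() - start) / CLOCKS_PER_SEC;
        printf("n = %ld: %.1f ms\n", n, ms);
    }
    return 0;
}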

Nested loops with time complexity log(log n)

Can there be an algorithm with two (nested) loops such that the overall time complexity is O(log(log n))?
This question arose after solving the following:
for (i = N; i > 0; i = i/2) {
    for (j = 0; j < i; j++) {
        print "hello world"
    }
}
The above code has time complexity O(N), using the sum of the geometric progression N + N/2 + N/4 + … ≈ 2N. Can there exist a similar loop with time complexity O(log(log n))?
For a loop to iterate O(log log n) times, where the loop index variable counts up to n, the index variable has to grow like the inverse function of log log k, where k is the number of iterations; i.e. it has to grow like 2^(2^k) (or the same with some base other than 2).
One way to achieve this is to start at 2 and repeatedly square until you reach n: if the index variable is (((2^2)^2)…^2) with k squarings, then it equals 2^(2^k), as required:
for (int i = 2; i < n; i = i*i) {
    // ...
}
This loop iterates O(log log n) times, as required. If you absolutely must do it with nested loops, we can trivially add an inner loop which iterates O(1) times, leaving the total number of iterations asymptotically the same:
for (int i = 2; i < n; i = i*i) {
    for (int j = 0; j < 10; j++) {
        // ...
    }
}
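As a quick empirical check (a sketch of my own, not from the original answer), counting the iterations for n = 2^4, 2^8, 2^16, … shows the count growing by exactly one each time the exponent doubles, i.e. like log2(log2 n). Doubles are used for i and n so that large powers of two stay exact:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Try n = 2^4, 2^8, 2^16, 2^32, 2^64: doubling the exponent should
       add exactly one iteration if the loop is Theta(log log n). */
    for (int p = 4; p <= 64; p *= 2) {
        double n = pow(2.0, p);
        int count = 0;
        for (double i = 2.0; i < n; i = i * i) {
            count++;
        }
        printf("n = 2^%d: %d iterations (log2(log2 n) = %.0f)\n",
               p, count, log2((double)p));
    }
    return 0;
}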

Time complexity of dependent nested logarithmic loop

What is the Big-O complexity of the following loop?
for (int i = 2; i <= n; i = i*2) {
    for (int k = i; k <= n; k = k*2) {
        times++;
    }
}
The outer loop is of course logarithmic, but I am not sure how to deal with the increasing starting value of the inner loop. It looks similar to logarithmic complexity, but how do I account for the k = i starting point?
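One way to see it (my own sketch, not from the thread): for each outer value i = 2^a, the inner loop runs log2(n) − a + 1 times, so the total is 1 + 2 + … + log2(n) = O(log² n). The following C program counts the inner executions and compares them with L(L+1)/2, where L = log2(n):

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Count the inner-loop executions for growing n and compare with
       L(L+1)/2 where L = log2(n); matching values for powers of two
       suggest the loop is Theta((log n)^2). */
    for (long long n = 16; n <= (1LL << 30); n <<= 4) {
        long long times = 0;
        for (long long i = 2; i <= n; i = i * 2) {
            for (long long k = i; k <= n; k = k * 2) {
                times++;
            }
        }
        long long L = (long long)log2((double)n);
        printf("n = %lld: times = %lld, L(L+1)/2 = %lld\n",
               n, times, L * (L + 1) / 2);
    }
    return 0;
}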