Time complexity of dependent nested logarithmic loop

What is the Big-O complexity of the following loop?
for (int i=2; i <= n; i=i*2) {
    for (int k=i; k <= n; k=k*2) {
        times++;
    }
}
The outer loop is of course logarithmic, but I am not sure how to deal with the increasing start value of the second loop. It looks similar to logarithmic complexity, but how do I account for the k=i in there?

Related

Time complexity of nested loop with multiplication?

I am confused about how to find the time complexity of this nested loop:
for (int i=0; i<n; i++){
    i*=2;
    for (int j=0; j<i; j++){
    }
}
And also, what is the time complexity of the outer loop alone (just ignoring the inner loop)?
for (int i=0; i<n; i++){
    i*=2;
}
I ran the following code, which is a modified version of the second snippet from the question.
let it = 0
let pow = 1000
let n = Math.pow(2, pow)
for (let i = 0; i < n; i++) {
    it++
    i *= 2
}
console.log(it)
It prints either pow+1 or pow, which implies that the loop runs about log2(n) times.
Although it contains a single for loop, it modifies the loop variable inside the body: in every iteration the value of i is doubled, which halves the number of remaining iterations. So instead of O(n), it is O(log n). It is similar to binary search, if that helps: every iteration halves the remaining search space.
The first code is more subtle. The outer loop still runs O(log n) times, and the inner loop runs i times, where i takes the values 0, 2, 6, 14, 30, ... (roughly doubling on each outer iteration). Those inner counts form a geometric series, and the sum of a geometric series is proportional to its largest term, which is O(n). So the first code is O(n) overall, not O(n log n) as one might first guess.

Best time complexity of a single loop?

I have a really simple question, I have this loop:
for (int i=0; i<n; i++) {
    // some O(n) stuff here
}
what will be the BEST time complexity of this algorithm?
O(n)? (for loop O(1) * O(n) stuff)
or
O(n^2)? (for loop O(n) * O(n) stuff inside the loop)
Will the for loop itself be considered O(n) as usual, or will it be considered O(1), since it makes only one pass in the BEST case scenario?
You are right, the best time complexity is O(N) (and even Θ(N)), if the best running time of "stuff" is constant (even zero).
Anyway, if "stuff" is known to be best case Ω(f(N)), then the best total time is Ω(N f(N)).
If your loop does O(n) stuff n times, then the time complexity is O(n^2); you may call that the worst case. The best and average cases depend on the "some O(n) stuff" that is executed in every iteration of your loop.
Let's take a simple example, the bubble sort algorithm:
for (int i = 0; i < n - 1; ++i) {
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
        }
    }
}
The time complexity of this is always O(n^2), whether the array is sorted (in ascending or descending order) or not.
But it can be optimised by observing that the ith pass places the ith largest element in its final position (which the j < n - i - 1 bound already exploits), and, further, that if a whole pass performs no swaps the array is already sorted, so the remaining passes can be skipped:
for (int i = 0; i < n - 1; ++i) {
    swapped = false;
    for (int j = 0; j < n - i - 1; ++j) {
        if (a[j] > a[j + 1]) {
            swap(&a[j], &a[j + 1]);
            swapped = true;
        }
    }
    if (swapped == false) {
        break;
    }
}
Now the best-case time complexity is O(n), i.e. when the array is already sorted in ascending order (in the context of the above implementation). The average and worst cases are still O(n^2).
So, to identify the best-case time complexity of your algorithm, you have to show us the implementation of "some O(n) stuff", or at least the algorithm that you are trying to implement.
As you stated it, it's O(n^2), because you are doing an O(n) operation n times.

Time complexity of for loop with if/else

Would the following code be considered O(1) or O(n)? The for loop runs a constant 10 times, but I'm not sure whether the if condition should be considered O(n) or O(1).
for (i = 0; i < 10; i++)
{
    if (arr[i] == 1001)
    {
        return i;
    }
}
The complexity would be O(1), because regardless of how much you increase the input, the running time stays bounded by a constant: the loop performs at most 10 iterations, and the operations inside it take constant time.
for (i = 0; i < 10; i++)
{
    if (arr[i] == 1001)
    {
        return i;
    }
}
On the other hand if your loop was:
for (i = 0; i < 10; i++)
{
    f(n);
}
and the function f(n) had a complexity of O(n), then the complexity of the entire code snippet would be O(n), since 10 * O(n) can still be labeled as O(n).
For a more in depth explanation have a look at What is a plain English explanation of “Big O” notation?

Calculating the time and space complexity of function f2

I have this function and I am trying to calculate its time and space complexity. I got an answer, but I am not sure whether it's correct:
#include <stdlib.h>

void f1(int n)
{
    int k = 0;
    for (int i = n/2; i <= n; i++)
    {
        for (int j = 2; j <= i; j *= 2)
        {
            k += n;
        }
    }
    free(malloc(k));
}
My results:
For the outer for loop it's pretty straightforward: it runs from n/2 to n, which is O(n) iterations.
The inner loop runs about log2(i) times, and since i is between n/2 and n, that is log(n) per outer iteration.
And so the total time complexity is O(n*log(n)).
For space complexity, it's O(k) once k reaches its final value: k is incremented by n on each of the roughly (n/2)*log(n) inner iterations, so after all iterations k is about n^2*log(n), and the space complexity is O((n^2)*log(n)).
Is this correct?

Time complexity of 2 nested loops

I want to detect the time complexity for this block of code that consists of 2 nested loops:
function(x) {
    for (var i = 4; i <= x; i++) {
        for (var k = 0; k <= ((x * x) / 2); k++) {
            // data processing
        }
    }
}
I guess the time complexity here is O(n^3); could you please correct me if I am wrong, with a brief explanation?
Your guess of O(n^3) is correct (with n = x). The outer loop runs x - 3 times, which is O(x). The inner loop runs (x*x)/2 + 1 times on every pass, which is O(x^2); note that its bound depends on x, so it does not stay constant as x grows. Multiplying the two gives O(x) * O(x^2) = O(x^3). The constant factor 1/2 and the offset of starting i at 4 disappear in big-O notation.