Determine time complexity when inner loop has logarithmic frequency of outer loop - time-complexity

I am a novice at analysing time complexity. Can someone help me with the time complexity of the algorithm below?
public void test(int n)
{
    int i = 1;
    while (i < n)
    {
        int j = 1;
        while (j < i)
        {
            j = j * 2;
        }
        i = i * 2;
    }
}
The outer loop will run log(n) times. How many times will the inner loop run? How can we calculate the frequency of the inner loop in terms of n, given that it depends on the variable i and runs log(i) times?
Can someone help find the time complexity of the above code?

The time complexity of the given function is O(log n log n) = O(log^2 n).
The outer loop has time complexity O(log n).
Similarly, the inner loop also has time complexity O(log n) because the value of i is bounded above by n. Hence log i = O(log n).
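If it helps to see the bound concretely, here is a small simulation that counts the inner-loop iterations of test(n); countSteps is just an illustrative name, not part of the original code:

```javascript
// Count executions of the innermost statement (j = j * 2) in test(n).
// countSteps is an illustrative name, not from the question.
function countSteps(n) {
    let steps = 0;
    let i = 1;
    while (i < n) {
        let j = 1;
        while (j < i) {
            j = j * 2;
            steps++;
        }
        i = i * 2;
    }
    return steps;
}

// For n = 2^10 the count is 0 + 1 + ... + 9 = 45, and for
// n = 2^20 it is 0 + 1 + ... + 19 = 190: both match
// (log2 n)(log2 n - 1)/2, consistent with O(log^2 n).
console.log(countSteps(1 << 10), countSteps(1 << 20));
```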

Outer loop will run for (log n) times, for i = 1, 2, 4, ..., i.e. i = 2^k for k = 0, 1, ..., log n - 1.
Since the inner loop is bounded by i, for i = 2^k it will run for log(2^k) = k times. Summing over the iterations of the outer loop:
log 1 + log 2 + log 4 + ... + log 2^(log n - 1)
= 0 + 1 + 2 + ... + (log n - 1).    (Because log(2^k) = k)
= (log n)(log n - 1)/2
So the inner loop will have complexity O(log^2 n) in total.
So the time complexity of the above algo is
log n + log^2 n
which is O(log^2 n).
(Note that i only takes the powers of 2, not every value from 1 to n - 1, so summing log 1 + log 2 + ... + log(n-1) = log((n-1)!) = O(n log n) would overcount.)

Related

How the time complexity of this code is coming O(logn)?

int a = 0, i = N;
while (i > 0)
{
    a += i;
    i /= 2;
}
How will I calculate the time complexity of this code? Can anyone explain?
Time complexity is basically the number of times a loop will run. Big O is the worst-case complexity that a particular loop can have. For example, if linear search were being used to find K, which is, say, the (n-1)th element of a 0-indexed array, the program would have to loop through the entire array to find the element. This means the loop runs n times in the worst case, giving linear search a time complexity of O(n).
In the case of your problem, i is initially equal to N and is halved on each iteration. After m iterations, i equals N / pow(2, m), and the loop terminates once this integer division reaches 0, which happens after roughly log2(N) halvings. So the loop runs at most m times, where
N = pow(2, m) ==> m = log2(N)
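As a quick empirical check, the loop can be instrumented to count its iterations (halvings is an illustrative name, not from the question):

```javascript
// Count iterations of the halving loop; Math.floor mimics
// the integer division i /= 2 in the original C-style code.
function halvings(N) {
    let count = 0;
    let i = N;
    while (i > 0) {
        i = Math.floor(i / 2);
        count++;
    }
    return count;
}

console.log(halvings(8));    // 4: i goes 8 -> 4 -> 2 -> 1 -> 0
console.log(halvings(1000)); // 10, which is floor(log2(1000)) + 1
```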

Time complexity of these two while loops

x = 1;
while (x < n)
{
    x = x + n / 10;
    m = n * n;
    y = n;
    while (m > y)
    {
        m = m - 100;
        y = y + 20;
    }
}
The way I solved it: we are adding 1/10 of n to x every time, so no matter how big n is, the number of repetitions is always about 10.
The inner loop is from n to n^2, and each variable in it is increasing linearly, so the inner loop should be O(n)
and because the outer loop is O(1), we get O(n) for all of the function.
but the optional answers for the question are: O(n^2), O(n^3), O(n^4), O(nlogn)
What am I missing? thanks.
You are correct that the outer loop runs a constant number of times (10), but your reasoning about the inner loop isn't all the way there. You say that there is a "linear increase" with each iteration of the inner loop, and if that were the case you would be correct that the whole function runs in O(n), but it's not. Try to figure it out from here, but if you're still stuck:
The inner loop has to make up the difference between n^2 and n in terms of the difference between m and y. That difference between m and y is reduced by a constant amount with each iteration: 120, not a linear amount. Therefore the inner loop runs (n^2 - n)/120 times. Multiplying the number of times the outer loop runs by the times the inner loop runs, we get:
O(10) * O((n^2 - n)/120)
= O(1) * O(n^2)
= O(n^2)
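To see the (n^2 - n)/120 count in practice, here is a sketch that counts the inner-loop iterations; countInner is an illustrative name, and n is assumed divisible by 10 so the outer loop runs exactly 10 times:

```javascript
// Count inner-loop iterations of the nested while loops.
// countInner is an illustrative name; Math.floor mimics
// integer division for n / 10.
function countInner(n) {
    let total = 0;
    let x = 1;
    while (x < n) {
        x = x + Math.floor(n / 10);
        let m = n * n;
        let y = n;
        while (m > y) {
            m = m - 100;
            y = y + 20;
            total++;
        }
    }
    return total;
}

// Each outer iteration closes the gap n*n - n in steps of 120,
// so the total is about 10 * (n*n - n) / 120, i.e. quadratic:
console.log(countInner(100));  // 830   = 10 * ceil(9900 / 120)
console.log(countInner(1000)); // 83250 = 10 * (999000 / 120)
```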

How to calculate the time complexity of nested loops?

How to calculate the time complexity of the following algorithm?
for (i = 1; i <= n; i++)
    for (k = i; k*k <= n; k++)
    {
        Statements;
    }
From what I know, time complexity for nested for loops is equal to the number of times the innermost loop is executed. So here innermost loop is executed n*n times, hence it's O(n^2).
Could it be O(n) depending upon the condition k*k<=n given in the second loop?
Thank you!
Time complexity of an algorithm is always measured in terms of a certain type of operation. For example, if your Statements; have an unknown time complexity which depends on n, then it would be misleading to describe the overall time complexity in the first place.
But what you are probably after is to know the time complexity in terms of Statements; operations. If Statements; is a constant-time operation, this becomes especially meaningful. And in this case, what we are looking for is simply to count how many times Statements; are executed. If this number is, say, 3*n, then the time complexity would be O(n).
To answer this question, let us break your nested loop apart. The outer loop iterates from (and including) 1 to n, so it will run exactly n times, regardless of anything.
For each iteration of the outer loop, the inner loop executes once. It starts from k = i and iterates until k*k > n, i.e. k > sqrt(n). Notice that whenever i > sqrt(n), it does not run at all. Summing over all values of i, the total number of iterations of the inner loop is
sqrt(n) + (sqrt(n) - 1) + (sqrt(n) - 2) + ... + 1 + 0 + ... + 0
By the summation formula you can find here, this equals
sqrt(n) * (sqrt(n) + 1) / 2 = (n + sqrt(n)) / 2 = O(n).
So yes, the time complexity in this case is O(n) as you suggested.
You can see this in action by writing a simple script which simulates your algorithm and counts the number of Statements;. Below in JavaScript, so it can be run as a snippet:
// Simulation
function f(n) {
    let res = 0;
    for (let i = 1; i <= n; i++)
        for (let k = i; k * k <= n; k++)
            ++res;
    return res;
}

// Estimation
function g(n) {
    return ~~((n + Math.sqrt(n)) / 2);
}

console.log(
    f(10),
    f(100),
    f(1000),
    f(10000),
);
console.log(
    g(10),
    g(100),
    g(1000),
    g(10000),
);
I hope you found this useful.

What is worst case time complexity for given function?

void func(int n) {
    int k = n;
    int i = 0;
    for (; i < n; i++) {
        while (k > 1) {
            k >>= 1;
        }
    }
}
What is the worst-case time complexity of the function? Please explain the solution as well.
In the first iteration of the outer loop, the inner loop runs log2(n) times. This is because k is repeatedly divided by 2 (right shift by 1), so its value decreases exponentially.
However, for all subsequent iterations of the outer loop, k is already 1 (it is never reset), so the inner loop does not run at all.
Therefore the time complexity is given by T(n) = Ө(n) + Ө(log2(n)) = Ө(n): the outer loop contributes n iterations, and the inner loop contributes log2(n) steps in total.
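A small counter makes this concrete: since k is not reset inside the outer loop, the inner loop's total work stays at log2(n) (countFunc is an illustrative name, not from the question):

```javascript
// Count iterations of the inner while loop of func(n).
// countFunc is an illustrative name, not from the question.
function countFunc(n) {
    let steps = 0;
    let k = n;
    for (let i = 0; i < n; i++) {
        while (k > 1) {
            k >>= 1;
            steps++;
        }
    }
    return steps;
}

// The inner loop only does work on the first outer iteration;
// the outer loop's n iterations still dominate the running time.
console.log(countFunc(1024));    // 10
console.log(countFunc(1 << 20)); // 20
```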

Analyzing the time complexity of this function

Assuming do_something is O(1), how do I calculate the time complexity of this function?
function run(n) {
    for (var i = 1; i <= n; i++) {
        var counter = 1;
        while (counter <= n) {
            for (var j = counter; j >= 1; j--) {
                do_something();
            }
            counter = counter * 2;
        }
    }
}
I'm assuming the initial for loop means that the complexity will be n, and the inner while loop means log(n). Is this true?
How do I calculate the complexity of everything?
Thanks.
It's not trivial to calculate the complexity of everything, because there are functions (e.g. those related to the Collatz conjecture) for which no exact analysis is known. In such a case you can do an empirical analysis, where you guess which runtime class matches best.
Related to your sample:
The outer for loop is called n times.
The while loop is called log(n) times.
The inner for loop is called counter times, where counter takes the values 2^m for m = 0, 1, ..., log(n).
Summing these, the total work of the while loop together with the inner for loop is
2^0 + 2^1 + ... + 2^log(n) = 2n - 1
which is O(n).
Then you multiply the linear time of the outer for loop by the linear time of the while loop (incl. the inner for loop). So you have O(n * n) = O(n²) time.
The exact number of steps can be calculated.
The outer loop runs n times.
The while loop and innermost loop run counter times for each of the values of counter; these are 1, 2, 4, 8, ..., 2^(bitLength(n)-1),
which sum to 2^bitLength(n) - 1,
where bitLength(n) = trunc(ln(n)/ln(2)) + 1.
Multiplying these two yields the exact number of steps:
n * (2^bitLength(n) - 1)
The approximate complexity is O(n²), because O(2^bitLength(n) - 1) = O(n).
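The exact formula above can be checked by counting the calls to do_something() directly (countCalls is an illustrative name, not from the question):

```javascript
// Count how many times do_something() would be called in run(n).
// countCalls is an illustrative name, not from the question.
function countCalls(n) {
    let calls = 0;
    for (let i = 1; i <= n; i++) {
        let counter = 1;
        while (counter <= n) {
            for (let j = counter; j >= 1; j--) {
                calls++;
            }
            counter = counter * 2;
        }
    }
    return calls;
}

// For n = 1024, counter takes the values 1, 2, ..., 1024,
// which sum to 2^11 - 1 = 2047; the outer loop multiplies by n:
console.log(countCalls(1024)); // 2096128 = 1024 * 2047
```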