I am thinking about how to correctly calculate the time complexity of this function:
def foo(lst):
    jump = 1
    total = 0
    while jump < len(lst):
        for i in range(0, len(lst), jump):
            total += i
        jump = jump * 2
    return total
I assume that it's O(n), where n is the length of the list.
The while loop is O(n) and the for loop is also O(n), which means we get 2*O(n), which equals O(n).
Am I right?
Your calculation has mistakes: for a nested loop you can't just add the two complexities; you have to add up the work the inner loop does across every iteration of the outer loop.
If n is the length of the list, then the while loop runs about log(n) times, since jump doubles on every pass (jump = jump * 2).
The inner loop runs n/1 times when jump = 1 and n/2 times when jump = 2, so the total work of the inner loop is (n/1) + (n/2) + (n/4) + (n/8) + ... for about log(n) terms.
Hence the total time complexity is n(1 + 1/2 + 1/4 + 1/8 + ...). The series in parentheses is a geometric series bounded by 2, so the whole thing is O(n).
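If you want to see that bound concretely, here is a minimal sketch (the helper name count_foo_iterations is just for illustration) that counts how many times the inner loop body of foo executes; the total stays close to 2n, matching the geometric-series bound:

# Minimal sketch: count the inner-loop iterations of foo for a few sizes.
# The total is roughly 2n, i.e. linear in n.
def count_foo_iterations(n):
    jump = 1
    iterations = 0
    while jump < n:
        # The inner for-loop of foo runs ceil(n / jump) times.
        iterations += len(range(0, n, jump))
        jump *= 2
    return iterations

for n in (1_000, 10_000, 100_000):
    c = count_foo_iterations(n)
    print(n, c, c / n)   # the ratio hovers around 2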
The question is: how many times does this algorithm produce a meow?
KITTYCAT(n):
    for i from 0 to n - 1:
        for j from 2^i to n - 1:
            meow
So the inner loop has a worst case of n iterations, but it effectively only contributes for about log(n) values of i: even though the outer loop runs n times, whenever i > log(n) the inner loop never runs, so you get O(n * log(n)).
However, since I can't assume a best case of n for the inner loop, how do I prove that the algorithm still has a best case of n * log(n)?
When i > log2(n), the start value of the inner loop is higher than its end value. Depending on how you interpret it, this either means that the inner loop counts down, or that it does not run at all. If you interpret it as counting down, then it gets very big indeed and ends up dominating, and you have Ω(2^n), which is not what you seem to be looking for.
If instead you assume the inner loop goes away, then this code is really
for i from 0 to log2(n):
    for j from 2^i to n - 1:
        meow
giving you Ω(n log n).
If you're asking how to prove that last step, you can calculate the exact number of iterations: the inner loop iterates n-1 times, then n-2 times, then n-4 times, and so on, all the way down to 0. So the exact count (at least when n is a power of 2) is
(n-1) + (n-2) + (n-4) + ... + (n - n/4) + (n - n/2) + (n - n)
or
n*log2(n) - 1 - 2 - 4 - ... - n/4 - n/2
which works out to
n*log2(n) - n + 1
and that is asymptotically equivalent to n*log(n) as n -> ∞.
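If you want to check that closed form numerically, here is a minimal sketch under the "empty range" interpretation (the helper name count_meows is just for illustration); for n a power of two it matches n*log2(n) - n + 1 exactly:

# Minimal sketch: count the meows, treating an empty range as "does not run",
# and compare with n*log2(n) - n + 1 (exact when n is a power of two).
from math import log2

def count_meows(n):
    meows = 0
    for i in range(0, n):               # i from 0 to n - 1
        meows += len(range(2 ** i, n))  # empty once 2^i > n - 1
        if 2 ** i >= n:                 # no further i can add anything
            break
    return meows

for n in (16, 64, 256, 1024):
    print(n, count_meows(n), int(n * log2(n)) - n + 1)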
x = 1;
while (x < n)
{
    x = x + n / 10;
    m = n * n;
    y = n;
    while (m > y)
    {
        m = m - 100;
        y = y + 20;
    }
}
The way I solved it: we are adding 1/10 of n to x every time, so no matter how big n is, the number of repetitions is always 10.
The inner loop goes from n to n^2, and each variable in it changes linearly, so the inner loop should be O(n),
and because the outer loop is O(1), we get O(n) for the whole function.
But the answer options for the question are: O(n^2), O(n^3), O(n^4), O(nlogn).
What am I missing? Thanks.
You are correct that the outer loop runs a constant number of times (10), but your reasoning about the inner loop isn't all the way there. You say that there is a "linear increase" with each iteration of the inner loop, and if that were the case you would be correct that the whole function runs in O(n), but it's not. Try to figure it out from here, but if you're still stuck:
The inner loop has to close the gap between n^2 and n through the changes to m and y. That gap between m and y shrinks by a constant amount with each iteration (100 + 20 = 120), not by an amount that scales with n. Therefore the inner loop runs (n^2 - n)/120 times. Multiplying the number of times the outer loop runs by the number of times the inner loop runs, we get:
O(10) * O((n^2 - n)/120)
= O(1) * O(n^2)
= O(n^2)
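As a sanity check, here is a minimal sketch that translates the pseudocode into Python and counts how often the inner body runs (the helper name count_inner_iterations is just for illustration); the count grows roughly like 10 * (n^2 - n) / 120, i.e. quadratically:

# Minimal sketch: simulate the loops and count executions of the inner body.
def count_inner_iterations(n):
    count = 0
    x = 1
    while x < n:
        x = x + n / 10
        m = n * n
        y = n
        while m > y:
            m = m - 100
            y = y + 20
            count += 1
    return count

for n in (100, 200, 400):
    c = count_inner_iterations(n)
    print(n, c, 10 * (n * n - n) / 120)   # measured count vs rough formula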
What would the time complexity be of these lines of code?
Begin
    sum = 0
    For i = 1 to n do
        For j = i to n do
            sum = sum + 1
End
I want to say it's 2n^2 + 1, which would be O(n^2), but I'm unsure because of the j = i part.
Your answer is correct!
A nested loop instinctively leads you to the right answer, which is O(n^2). In Big-O analysis it usually doesn't matter how specific you are, i.e. saying the time complexity is O(2n^2) or O(3n^2 + 1) -- saying O(n^2) is enough, since that is the dominating term of the function.
The j = i start simply makes it so that there are...
i=1: n operations
i=2: (n-1) operations
...
i=n: 1 operation
So, the sum of all operations you do is 1 + 2 + ... + n = n(n+1)/2, which is O(n^2).
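A quick way to see this concretely is to run the pseudocode and compare the resulting count of operations with n(n+1)/2 (a minimal sketch; the helper name count_operations is just for illustration):

# Minimal sketch: run the nested loops and compare the number of
# "sum = sum + 1" operations with n*(n+1)/2.
def count_operations(n):
    total = 0
    for i in range(1, n + 1):        # For i = 1 to n
        for j in range(i, n + 1):    # For j = i to n
            total += 1               # sum = sum + 1
    return total

for n in (10, 100, 1000):
    print(n, count_operations(n), n * (n + 1) // 2)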
start = 0
while (start != len(array) - 1):
    for i in range(start + 1, len(array)):
        if (array[i] < array[start]):
            array[i], array[start] = array[start], array[i]
    print(array)
    start += 1
In this case shouldn't the complexity be
O(n) = n * [(n-1) + (n-2) + ... + (n-(n-1))]
since for each of the n iterations of the outer loop the inner loop runs a different number of steps, gradually reducing by one? That way the complexity comes out to (n^3 - n^2)/2. What is wrong with my approach?
Look at it this way: the first time (start = 0) the inner loop performs n-1 steps,
the second time (start = 1) the inner loop performs n-2 steps, and so on. The outer loop's iterations are already accounted for by this sum, so you don't multiply by n again. Thus you have:
(n-1) + (n-2) + ... + 1 steps, which is equal to (n^2 - n)/2 steps.
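You can check this by instrumenting the code with a comparison counter (a minimal sketch; the helper name count_comparisons and the random test array are just for illustration, and the print inside the loop is dropped):

# Minimal sketch: count how many times the inner comparison runs and
# compare with (n^2 - n) / 2.
import random

def count_comparisons(n):
    array = random.sample(range(10 * n), n)
    count = 0
    start = 0
    while start != len(array) - 1:
        for i in range(start + 1, len(array)):
            count += 1
            if array[i] < array[start]:
                array[i], array[start] = array[start], array[i]
        start += 1
    return count

for n in (10, 100, 1000):
    print(n, count_comparisons(n), (n * n - n) // 2)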
I am getting confused about how to analyze the time complexity of a nested while loop that splits into odd and even cases. Could anyone explain how to deal with this situation?
i = 1
while (i < n) {
    k = i
    while (k < n) {
        if (k % 2 == 1)
            k++
        else
            k = k + 0.01*n
    }
    i = i + 0.1*n
}
So in a problem like this, the factors 0.01 and 0.1 play a huge role.
First let's consider the inner while loop: if k is odd, we increment k by 1; if k is even, we increment k by one-hundredth of n. How many iterations can this inner while loop run?
Clearly, if all iterations were of type 1 (the odd case), the inner while loop would run n-k times, and similarly, if all iterations were of type 2 (the even case), the inner while loop would run at most about 100 times (since we increment k by one-hundredth of n each time).
Given a value of k, the number of iterations of the inner while loop is therefore at most
max(n-k, 100). From now on we will assume that n-k is always greater than 100, without loss of generality.
Okay, how does the outer loop iterate? In each iteration of the outer loop, the value of i increases by one-tenth of n, so the outer while loop runs at most 10 times.
Making the running times explicit and calculating the overall running time:
Running time for the 1st iteration of the outer loop:  n - k
Running time for the 2nd iteration of the outer loop:  n - (k + 0.1*n)
Running time for the 3rd iteration of the outer loop:  n - (k + 0.2*n)
...
Running time for the 10th iteration of the outer loop: n - (k + 0.9*n)
-------------------------------------------------------------------
Total:                                                 10n - 10k - 4.5n
Plugging in k = 1 (the starting value of k):
10n - 10 - 4.5n = 5.5n - 10 = O(n)
Hence the complexity is O(n) time.
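To see how that bound behaves in practice, here is a minimal sketch that simulates the loops and counts the inner-loop iterations (the helper name count_iterations is just for illustration); for these sizes the measured count stays at or below the 5.5n - 10 bound derived above, and is usually much smaller because the even branch advances k by 0.01*n at a time:

# Minimal sketch: count the inner-loop iterations of the odd/even nested loop
# and compare with the 5.5n - 10 upper bound from the analysis (k = 1).
def count_iterations(n):
    count = 0
    i = 1
    while i < n:
        k = i
        while k < n:
            count += 1
            if k % 2 == 1:
                k = k + 1
            else:
                k = k + 0.01 * n
        i = i + 0.1 * n
    return count

for n in (100, 1_000, 10_000):
    print(n, count_iterations(n), 5.5 * n - 10)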