So these are the for loops whose time complexity I have to find, but I don't really understand how to calculate it.
for (int i = n; i > 1; i /= 3) {
for (int j = 0; j < n; j += 2) {
... ...
}
for (int k = 2; k < n; k = k * k) {
...
}
}
For the first line, (int i = n; i > 1; i /= 3), it keeps dividing i by 3, and once i is no longer greater than 1 the loop stops there, right?
But what is the time complexity of that? I think it is n, but I am not really sure. The reason I think it is n: if I assume n is 30, then i will be 30, 10, 3, 1 and then the loop stops. It runs n times, doesn't it?
And for the last for loop, I think its time complexity is also n, because
k starts at 2 and keeps multiplying itself by itself until k is no longer less than n.
So if n is 20, k will be 2, 4, 16 and then stop. It runs n times too.
I don't think I really understand this kind of question, because time complexity can be log(n) or n^2 or something else, but all I see is n.
I don't really know when log or squares come into it. Or anything else.
Every for loop runs n times, I think. How can log or square be involved?
Can anyone help me understand this? Please.
If you want to calculate the time complexity of an algorithm, go through this post here: How to find time complexity of an algorithm
That said, the way you're thinking about algorithm complexity is too small-scale and linear. It helps to think about it in orders of magnitude, then plot it that way. If you take:
x, z = 0
for (int i = n; i > 1; i /= 3) {
for (int j = 0; j < n; j += 2) {
x = x + 1
}
for (int k = 2; k < n; k = k * k) {
z = z + 1
}
}
and plot x and z on a graph where n goes from 1 -> 10 -> 100 -> 1000 -> 10^15 or so, you'll see that x grows roughly like n log n (the j loop does about n/2 steps on each of the ~log₃ n passes of the outer loop), while z stays tiny. When analyzing algorithmic complexity you're primarily interested in the maximum number of times your inputs are looped through, in either the worst or the most common case, omitting constants. So in this case I would expect your algorithm to be O(n log n).
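If you'd rather see the numbers than plot them, here's a minimal sketch (my own throwaway driver in C, not part of the original code) that counts how often each inner loop body runs; the ratio in the last column should hover around 0.5, matching the n log n estimate:

#include <stdio.h>
#include <math.h>

/* Hypothetical driver (not from the original post): count how many times
 * each inner loop body runs for growing n, so the growth can be seen
 * directly instead of guessed at. */
int main(void) {
    for (long long n = 10; n <= 10000000; n *= 10) {
        long long x = 0, z = 0;
        for (long long i = n; i > 1; i /= 3) {
            for (long long j = 0; j < n; j += 2)
                x++;                       /* ~n/2 steps per outer pass */
            for (long long k = 2; k < n; k = k * k)
                z++;                       /* k squares each time: very few steps */
        }
        /* x should track (n/2) * log3(n); z stays tiny. */
        printf("n=%-9lld x=%-11lld z=%-3lld x/(n*log3(n))=%.3f\n",
               n, x, z, (double)x / (n * (log((double)n) / log(3.0))));
    }
    return 0;
}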
For further reading, I suggest https://en.wikipedia.org/wiki/Introduction_to_Algorithms ; it's not exactly easy but covers this in depth.
Related
What is the complexity of the second for loop? Would it be n - i? From my understanding, the first for loop will run n times, but the index in the second for loop starts at i instead.
// where n is the number of elements in an array
for (int i = 0; i < n; i++) {
for (int j = i; j < n; j++) {
// Some Constant time task
}
}
In all, the inner loop iterates sum(1..n) times, which is n * (n + 1) / 2, which is O(n^2)
If you try to visualise this as a matrix, where rows represent i and columns represent j, you'll see that it forms a triangle with sides of length n.
Example with n = 4:
0 1 2 3
1 2 3
2 3
3
The inner loop has (on average) complexity n/2, which is O(n).
The total complexity is n*(n+1)/2, or O(n^2).
The number of steps this takes is a Triangle Number. Here's a bit of code I put together in LINQpad (yeah, sorry about answering in C#, but hopefully this is still readable):
void Main()
{
long k = 0;
// Whatever you want
const int n = 13;
for (int i = 0; i < n; i++)
{
for (int j = i; j < n; j++)
{
k++;
}
}
k.Dump();
triangleNumber(n).Dump();
(((n * n) + n) / 2).Dump();
}
int triangleNumber(int number)
{
if (number == 0) return 0;
else return number + triangleNumber(number - 1);
}
All 3 print statements (.Dump() in LINQpad) produce the same answer (91 for the value of n I selected, but again you can choose whatever you want).
As others indicated, this is O(n^2). (You can also see this Q&A for more details on that).
We can see that the total number of iterations of the loop is n*(n+1)/2. I am assuming that is clear from the explanations above.
Now let's find the asymptotic time complexity in an easy, logical way.
Big O comes into play when the value of n is large. In such cases we need not consider the division by 2 (2 is a constant), because a large number divided by 2 is still a large number.
This leaves us with n*(n+1).
As explained above, since n is a large number, (n+1) can be approximated by n,
leaving us with n*n.
Hence the time complexity is O(n^2).
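If that feels hand-wavy, here's a tiny numeric check (my own, not part of the original answer) that prints the exact count n(n+1)/2 next to n^2; the ratio settles at the constant 1/2 that Big O throws away:

#include <stdio.h>

/* Quick numeric check (not part of the original answer): n(n+1)/2 is just
 * n^2 times a constant factor that settles at 1/2, and Big O discards
 * that constant. */
int main(void) {
    for (long long n = 10; n <= 100000000; n *= 10) {
        long long total = n * (n + 1) / 2;              /* exact iteration count */
        double ratio = (double)total / ((double)n * (double)n);
        printf("n=%-10lld n(n+1)/2=%-18lld ratio to n^2=%.6f\n", n, total, ratio);
    }
    return 0;
}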
Question 1
for (i = 0; i < n; i++) {
for (j = 0; j < i * i ; j++){
}
}
Answer: O(n^3)
At first glance, O(n^3) made sense to me, but I remember a previous problem I did:
Question 2
for (int i = n; i > 0; i /= 2) {
for (int j = 0; j < i; j++) {
//statement
}
}
Answer: O(n)
For Question 2, the outer loop is O(log n) and the inner loop is O(2n / log n), resulting in O(n). The inner loop is O(2n / log n) for the reason explained here: Big O of Nested Loop (int j = 0; j < i; j++)
Why don't we treat Question 1 like Question 2? In Question 1, j also depends on i, which means we should really be taking the average number of iterations of the inner loop (as we do in Question 2).
My answer would be: O(n) for the outer loop and O(n^2 / n) for the inner loop, which results in O(n^2) for Question 1.
Your answer is wrong. The code is Θ(n³).
To see that, note that the inner loop takes i² steps, which is at most n², but for half of the outer loop iterations it is at least (n/2)² = n²/4.
Therefore the total number of inner iterations is at most n · n² = n³, but at least (n/2) · (n²/4) = n³/8.
Your reasoning goes wrong in assuming the inner loop averages n² / n iterations; on average it takes a number of iterations proportional to n².
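If you want to convince yourself empirically, here's a small throwaway C check (mine, not the original poster's) that counts the inner iterations and shows they land between the n³/8 and n³ bounds above:

#include <stdio.h>

/* Small throwaway check (not from the original answer): count the inner
 * iterations of Question 1 and confirm they sit between n^3/8 and n^3. */
int main(void) {
    for (long long n = 10; n <= 1000; n *= 10) {
        long long count = 0;
        for (long long i = 0; i < n; i++)
            for (long long j = 0; j < i * i; j++)
                count++;
        printf("n=%-5lld n^3/8=%-11lld count=%-11lld n^3=%lld\n",
               n, n * n * n / 8, count, n * n * n);
    }
    return 0;
}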
What your inner for loop is doing, in combination with the outer for loop, is calculating the sum of i^2. If you write it out you are adding the following terms:
1 + 4 + 9 + 16 + ...
The result of that is (2n^3+3n^2+n)/6. If you want to calculate the average number of iterations of the inner for loop, you divide that by n, as n is the number of iterations of the outer for loop. So you get (2n^2+3n+1)/6, which in Big O notation is O(n^2). And having that gives you... nothing. You have not gained any new information, as you already knew the complexity of the inner for loop is O(n^2). Having O(n^2) run n times gives you O(n^3) total complexity, which you already knew...
So, you can calculate the average number of iterations of the inner for loop, but you will not gain any new information. There are no cuts in the number of iteration steps here as there were in your previous question (the i /= 2 part).
void fun(int n, int k)
{
for (int i=1; i<=n; i++)
{
int p = pow(i, k);
for (int j=1; j<=p; j++)
{
// Some O(1) work
}
}
}
The time complexity of the above function can be written as 1^k + 2^k + 3^k + … + n^k.
In your case k = 2:
Sum = 1^2 + 2^2 + 3^2 + ... + n^2
= n(n+1)(2n+1)/6
= n^3/3 + n^2/2 + n/6, which is O(n^3).
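For a quick sanity check of that closed form, here's a hypothetical harness (my own, not part of the original answer) that counts the O(1) work done for k = 2 and compares it against n(n+1)(2n+1)/6:

#include <stdio.h>
#include <math.h>

/* Hypothetical check (not part of the original answer): count the O(1)
 * work done by fun(n, k) for k = 2 and compare with n(n+1)(2n+1)/6.
 * n is kept small so the count finishes quickly. */
int main(void) {
    const int k = 2;
    for (long long n = 10; n <= 1000; n *= 10) {
        long long count = 0;
        for (long long i = 1; i <= n; i++) {
            long long p = (long long)pow((double)i, k);
            for (long long j = 1; j <= p; j++)
                count++;
        }
        printf("n=%-5lld count=%-11lld n(n+1)(2n+1)/6=%lld\n",
               n, count, n * (n + 1) * (2 * n + 1) / 6);
    }
    return 0;
}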
In the following piece of code, f() is any function taking Θ(1) time. The time complexity should be Θ(n^(4/3)); can someone explain why?
for (int i = 1; i <= n; i = 2 * i) {
for (int j = 1; j * j * j <= n; j = j + 1) {
for (int k = 1; k <= i * i; k = k + i) {
f();
}
}
}
By my analysis, the first for loop takes Θ(log₂ n) time, the second for loop is Θ(n^(1/3)), and the third for loop is Θ(i). So in total we have Θ((log₂ n) × n^(1/3) × i).
Since i can be n, we have Θ((log₂ n) × n^(1/3) × n) = Θ(n^(4/3) log₂ n). Where is my mistake?
Your bound is not tight, because you counted i as Θ(n), but i is not Θ(n) on average. Consider the sequence of values that i takes, and add these up to count the total number of iterations for the inner loop. We can ignore the middle loop over j for now, since it is independent of i and k.
The sequence of values for i is 1, 2, 4, 8, ... up to n. If we say n = 2^r for some r, this is a geometric progression with sum 2^(r+1) - 1, which is about twice as big as n, so it's Θ(n). This counts both the outer and inner loops; the middle loop gives another factor of Θ(n^(1/3)), and hence the overall complexity is Θ(n^(4/3)) as required.
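To see this empirically, here's a small counter I'd write (not from the original answer) that tallies the calls to f() and divides by n^(4/3); the ratio stays bounded by small constants, which is exactly what a Θ(n^(4/3)) bound predicts:

#include <stdio.h>
#include <math.h>

/* Hypothetical counter (not from the original answer): total calls to f()
 * for the triple loop above, compared with n^(4/3). The ratio stays
 * bounded between small constants. */
int main(void) {
    for (long long n = 1000; n <= 1000000; n *= 10) {
        long long calls = 0;
        for (long long i = 1; i <= n; i = 2 * i)
            for (long long j = 1; j * j * j <= n; j = j + 1)
                for (long long k = 1; k <= i * i; k = k + i)
                    calls++;                  /* stand-in for f() */
        printf("n=%-8lld calls=%-11lld calls/n^(4/3)=%.3f\n",
               n, calls, (double)calls / pow((double)n, 4.0 / 3.0));
    }
    return 0;
}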
I thought the time complexity of the following code is O(log N), but the answer says it's O(N). I wonder why:
int count = 0;
for (int i = N; i > 0; i /= 2) {
for (int j = 0; j < i; j++) {
count += 1;
}
}
The inner for loop runs this many times in total:
N + N/2 + N/4 ...
It seems to be log N to me. Please help me understand why it's O(N). Thanks.
1, 1/2, 1/4, 1/8, ..., (1/2)^n is a geometric sequence with a = 1 and r = 1/2 (a is the first term, and r is the common ratio).
Its sum can be calculated using the geometric series formula S = a(1 - r^n) / (1 - r), which approaches a / (1 - r) as n grows.
In this case, the limit of the sum is 1 / (1 - 1/2) = 2, so:
n + n/2 + n/4 ... = n(1 + 1/2 + 1/4...) -> n * 2
Thus the complexity is O(N)
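As a quick empirical check (my own throwaway driver, not part of the original answer), counting the inner iterations shows the total stays below 2N:

#include <stdio.h>

/* Throwaway check (not part of the original answer): the halving outer
 * loop yields N + N/2 + N/4 + ... inner iterations in total, which stays
 * below 2N. */
int main(void) {
    for (int N = 10; N <= 10000000; N *= 10) {
        long long count = 0;
        for (int i = N; i > 0; i /= 2)
            for (int j = 0; j < i; j++)
                count++;
        printf("N=%-9d count=%-9lld 2N=%d\n", N, count, 2 * N);
    }
    return 0;
}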
Proceeding step by step, based on the code fragment, we obtain N + N/2 + N/4 + ... + 1 < 2N total inner iterations, which is O(N).
Give a big-Θ bound for:
for (int i = 0; i < n; i++) {
if (i * i < n) {
for (int j = 0; j < n; j++) {
count++;
}
}
else {
int k = i;
while (k > 0) {
count++;
k = k / 2;
}
}
}
So here's what I think... not sure if it's right, though:
The first for loop will run for n iterations. The for loop inside it will also run for n iterations, giving O(n^2).
For the else statement, the while loop with k = k / 2 runs about log n times, and it can be reached on up to n outer iterations, giving O(n log n). So the entire thing looks like n^2 + n log n, and taking the bigger run time, the answer would be Θ(n^2)?
The result is actually Θ(n^(3/2)). The condition i * i < n only holds while i < √n, so the full inner for loop (n iterations) runs for only about √n values of i, contributing √n · n = n^(3/2) work. The else branch handles the remaining ~n values of i at about log i ≈ log n steps each, contributing about n log n, which n^(3/2) dominates.
Example:
n = 10000
The inner for loop runs only for i < 100; from i = 100 onward the else part is executed instead. But those 100 full passes already account for 100 · 10000 = 10^6 steps, versus roughly 10000 · log₂(10000) ≈ 1.3 · 10^5 steps for the entire else part.
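To back that up empirically, here's a hypothetical counter (not from the original answer); count/n^1.5 approaches a constant, which supports the Θ(n^(3/2)) bound:

#include <stdio.h>
#include <math.h>

/* Hypothetical counter (not from the original answer): total work of the
 * if/else loop above. count/n^1.5 approaches a constant near 1. */
int main(void) {
    for (long long n = 100; n <= 100000; n *= 10) {
        long long count = 0;
        for (long long i = 0; i < n; i++) {
            if (i * i < n) {
                for (long long j = 0; j < n; j++)
                    count++;                 /* runs only while i < sqrt(n) */
            } else {
                long long k = i;
                while (k > 0) {              /* ~log2(i) halvings */
                    count++;
                    k = k / 2;
                }
            }
        }
        printf("n=%-7lld count=%-11lld count/n^1.5=%.3f\n",
               n, count, (double)count / pow((double)n, 1.5));
    }
    return 0;
}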