Complexity of the following code fragment

I have to find the big O for the following piece of code:
for (int i = 0; i < n; i++)            // O(n)
    if (n * f(i) + 1 > 0)              // O(n*log(n))
        for (int j = f(i); j < n; j++) // O(n^2*log(n))
            g(j);                      // O(n^3*log(n))
The complexity of the function f is O(log(n)) and that of the function g is O(n).
In the comments I've written my calculations, but I'm not sure whether they are correct.

Let f(i) = λ log_a(i) for some constants λ > 0 and a > 1 (consistent with f being O(log n)).
Then the inner loop is (ignoring off-by-one terms):

    \sum_{j = \lambda \log_a i}^{n-1} O(n) = (n - \lambda \log_a i) \cdot O(n)

which is O(n^2), since λ log_a i grows far more slowly than n for i < n.
The outer loop's if-condition starts holding at

    n \lambda \log_a(i) + 1 > 0 \iff i > a^{-1/(\lambda n)}

where a is the base of the log. The exponent -1/(λn) is very small (though it depends on the value of λ), so this threshold is close to 1 and certainly << n; the if-body therefore runs on essentially every iteration of the outer loop.
Therefore the complexity is O(n^3).
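As a rough sanity check (not part of the original answer), here is a small C counting sketch with stand-in definitions for f and g, since the real ones aren't given: f returns roughly log2(i) and g(j) does about j units of work, which respects the stated O(log n) and O(n) bounds. The ratio ops/n^3 settles near a constant, consistent with the O(n^3) estimate. (Compile with -lm.)

#include <stdio.h>
#include <math.h>

static long ops;                                /* counts basic operations */

/* Stand-ins (assumptions, not the asker's real functions):
   f returns a value of order log i, g(j) performs about j units of work. */
static int f(int i) { return (int)log2((double)(i + 2)); }
static void g(int j) { for (int k = 0; k < j; k++) ops++; }

static long count(int n) {
    ops = 0;
    for (int i = 0; i < n; i++)                 /* outer loop: n iterations */
        if (n * f(i) + 1 > 0)                   /* true for every i here */
            for (int j = f(i); j < n; j++)      /* inner loop: about n iterations */
                g(j);                           /* each call does <= n work */
    return ops;
}

int main(void) {
    for (int n = 100; n <= 800; n *= 2) {
        long c = count(n);
        printf("n=%4d  ops=%12ld  ops/n^3=%.3f\n", n, c, c / ((double)n * n * n));
    }
    return 0;
}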

Related

What would the big O notation be for this particular code? A for loop with a worse time complexity is nested in one that is O(1)

As stated in the title, in the code below the for loop with the bigger time complexity is nested inside one that is O(1), so what is the overall time complexity, and why?
void function(int n) {
    int i, j;
    int x = 0;
    for (i = 0; i < 10; i++)
        for (j = 0; j < n / 2; j++)
            x--;
}
It’s O(n).
The inner loop is all that matters there since the outer loop runs a constant number of times. The outer loop’s contribution is just a constant factor which you ignore when figuring out Big-O.
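Concretely (a quick count, not in the original answer), the inner body executes

    10 \cdot \left\lfloor \tfrac{n}{2} \right\rfloor \approx 5n = O(n)

times, so the constant outer loop only scales the linear term.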

Time complexity on nested for loop

function(n)
{
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= n / 2; j++)
            output("")
    }
}
Now I have calculated the time complexity of the first for loop, which is O(n). The second for loop has j <= n / 2, so for any given n, for example in the range [1, 2, ..., 10], I figured it would give me O(log(n)), since it keeps producing the series n, n/2, n/4, n/8, ..., k.
So if we wanted to express that relationship it would look something like 2^k = n.
My question is: will it give me O(log(n))?
The correct summation according to the code is:

    \sum_{i=1}^{n} \sum_{j=1}^{n/2} 1 = \sum_{i=1}^{n} \frac{n}{2} = \frac{n^2}{2}
So, it's not O(log n). It's O(n^2).
No, it does not give you O(log n).
The first for loop is O(n). The second loop is O(n) as well, as the number of iterations grows as a function of n (the growth rate is linear).
It would be the same even if you changed the second loop to something like
for (j=1; j<=n/2000; j++)
or, in general, if you replaced the denominator with any constant k.
To conclude, the time complexity is quadratic, i.e., O(n^2).
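To make that last point explicit (my own addition): for any fixed constant k, the total number of inner-body executions is

    \sum_{i=1}^{n} \frac{n}{k} = \frac{n^2}{k} = \Theta(n^2)

so dividing the inner bound by 2, 2000, or any other constant only changes the hidden constant factor, not the quadratic growth.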

Why is the time complexity of the below snippet O(n) while the space complexity is O(1)?

The code below has a space complexity of O(1). I know it has something to do with the call stack, but I am unable to visualize it correctly. If somebody could help me understand this a little more clearly, that would be great.
int pairSumSequence(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += pairSum(i, i + 1);
    }
    return sum;
}
int pairSum(int a, int b) {
    return a + b;
}
How much space does it need in relation to the value of n?
The only storage used is the variable sum (plus the loop counter i).
That amount doesn't change with regard to n, therefore it's constant.
If it's constant, then it's O(1).
How many instructions will it execute in relation to the value of n?
Let's first simplify the code, then analyze it row by row.
int pairSumSequence(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += 2 * i + 1;
    }
    return sum;
}
The declaration and initialization of a variable take constant time and don't change with the value of n. Therefore this line is O(1):
int sum = 0;
Similarly, returning a value takes constant time, so it's also O(1):
return sum;
Finally, let's analyze the body of the for loop:
sum += 2 * i + 1;
This is also constant time, since it's basically one multiplication and two additions. Again O(1).
But this O(1) work is executed inside a for loop:
for (int i = 0; i < n; i++) {
    sum += 2 * i + 1;
}
This for loop will execute exactly n times.
Therefore the total complexity of this function is:
C = O(1) + n * O(1) + O(1) = O(n)
Meaning that this function will take time proportional to the value of n.
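As an aside (not in the original answer), the value the loop computes has a simple closed form, which makes the n-step structure easy to see:

    \sum_{i=0}^{n-1} (2i + 1) = n^2

so the function returns n^2 after exactly n constant-time iterations.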
Time/space complexity of O(1) means constant complexity. The constant is not necessarily 1; it can be an arbitrary number, but it has to be constant and must not depend on n. For example, if you always had 1000 variables (independent of n), it would still give you O(1). Sometimes it may even happen that the constant is so big compared to your n that O(n) would be much better than O(1) with that constant.
Now in your case, the time complexity is O(n) because you enter the loop n times and each iteration has constant time complexity, so the total depends linearly on n. Your space complexity, however, is independent of n (you always keep the same number of variables) and is constant, hence it is O(1).
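To help visualize the call-stack point raised in the question, here is a small instrumented sketch (my own, not from either answer). It tracks how many pairSum frames are live at once; because each call returns before the next one starts, the depth never exceeds 1 no matter how large n gets, which is why the auxiliary space stays O(1).

#include <stdio.h>

static int depth = 0, maxDepth = 0;    /* instrumentation added for illustration */

static int pairSum(int a, int b) {
    if (++depth > maxDepth) maxDepth = depth;   /* record live pairSum frames */
    int result = a + b;
    depth--;                                    /* frame is gone before the next call */
    return result;
}

static int pairSumSequence(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += pairSum(i, i + 1);
    return sum;
}

int main(void) {
    int sum = pairSumSequence(1000);
    printf("sum = %d, max simultaneous pairSum frames = %d\n", sum, maxDepth);
    /* prints "max simultaneous pairSum frames = 1": O(1) auxiliary space */
    return 0;
}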

Time complexity of the for-loop

I need to calculate the time complexity of the following loop:
for (i = 1; i < n; i++)
{
statements;
}
Assuming n = 10:
Is the i < n control statement going to run n times and the i++ statement n - 1 times? And I know that the i = 1 statement runs for one unit of time.
Calculating the total cost of the three statements in the for loop yields 1 + n + (n - 1) = 2n, and the loop together with its body yields 2n + (n - 1) = 3n - 1 = O(n).
Are my calculations correct up to this point?
Yes, your calculations are correct; a for loop like that has O(n) complexity.
Similarly, you could make a calculation like this:
for (int i = 0; i < n * 2; i++) {
    // calculations
}
In this case the loop runs 2n times, and since constant factors are dropped, its big O notation is still O(n). If the bound were n * n instead, the loop would run n^2 times and be O(n^2). This way you can estimate how many iterations a loop needs for n = 10, 100, or 1000, and plot the growth of different loops.
As DAle mentioned in the comments, the big O notation is not affected by the calculations within the loop, as long as they take constant time; only the loop structure itself matters.
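As a quick illustration of that caveat (my own example, not from the thread): the body only drops out of the analysis when it is constant time; a loop inside the body changes the overall bound.

#include <stdio.h>

int main(void) {
    int n = 1000;
    long a = 0, b = 0;

    for (int i = 0; i < n; i++)        /* constant-time body: O(n) overall */
        a += 2 * i + 1;

    for (int i = 0; i < n; i++)        /* body is itself a loop over n: O(n^2) overall */
        for (int j = 0; j < n; j++)
            b++;

    printf("a = %ld after n steps, b = %ld after n*n steps\n", a, b);
    return 0;
}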

What is the time complexity of the code below?

sum = 0;
for (int i = 1; i < n; i++) {
    for (int j = 1; j < n / i; j++) {
        sum = sum + j;
    }
}
In the outer loop above, the variable i runs from 1 to n, making the complexity of the outer loop O(n).
This explains the n part of the O(n log n) complexity.
But for the inner loop, j runs from 1 to n/i, so whenever i is 1 the inner loop does about n iterations; that's why I guess the inner loop should also be O(n),
making the total time complexity O(n * n) = O(n^2).
This is what you can do using Sigma notation:

    \sum_{i=1}^{n-1} \sum_{j=1}^{n/i - 1} 1 = \sum_{i=1}^{n-1} \left( \frac{n}{i} - 1 \right) = n H_{n-1} - (n - 1)

where H_{n-1} is the harmonic number:

    H_{n-1} = \sum_{i=1}^{n-1} \frac{1}{i} = \Theta(\log n)

so the total number of inner-loop iterations is Θ(n log n), not Θ(n^2).
See also: Finding Big O of the Harmonic Series
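A quick empirical check (my own addition, not part of the answer): counting how many times the inner body of the posted loops runs and comparing with n*ln(n) shows the ratio settling near 1, matching the O(n log n) bound. (Compile with -lm.)

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Count the inner-body executions of the posted loops and compare
       them with n*ln(n); the ratio approaches 1 as n grows.           */
    for (int n = 1000; n <= 64000; n *= 4) {
        long count = 0;
        for (int i = 1; i < n; i++)
            for (int j = 1; j < n / i; j++)
                count++;
        printf("n = %6d  count = %10ld  count/(n ln n) = %.3f\n",
               n, count, count / (n * log((double)n)));
    }
    return 0;
}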