Prove that the time complexity of a function is O(n^3)

public void function2(long input) {
    long s = 0;
    for (long i = 1; i < input * input; i++) {
        for (long j = 1; j < i * i; j++) {
            s++;
        }
    }
}
I'm pretty certain that the time complexity of this function is O(n^3); however, if someone could provide a line-by-line explanation of it, that would be great.

First of all, you need to define what n is if you write something like O(n^3), otherwise it doesn't make any sense. Let's say n is the value (as opposed to e.g. the bit-length) of input, so n = input.
The outer loop has k iterations, where k = n^2. The inner loop has 1^2, 2^2, 3^2, ... up to k^2 iterations, so summing up everything you get O(k^3) iterations (since the sum of the p-th powers of the first m integers is always O(m^(p+1))).
Hence the overall time complexity is O(n^6).
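If you want to convince yourself empirically, here is a minimal counting sketch (my own harness, not part of the question or answer; the class and method names are made up). It tallies the inner-loop iterations and divides by n^6; by the sum-of-squares estimate above, the ratio should settle near 1/3:

public class ComplexityCheck {
    // Same loops as function2, but returning how many times s++ would run.
    static long countIterations(long input) {
        long count = 0;
        for (long i = 1; i < input * input; i++) {
            for (long j = 1; j < i * i; j++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        for (long n = 2; n <= 12; n++) {
            long c = countIterations(n);
            // The ratio c / n^6 should approach 1/3 as n grows.
            System.out.printf("n=%d iterations=%d ratio=%.4f%n",
                    n, c, c / Math.pow(n, 6));
        }
    }
}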


Determining the time complexity of this program

void f2(int n)
{
    if (n <= 1)
        return;
    g2(n, n/3);
}

void g2(int n, int m)
{
    int i = 1;
    while (m < n) {
        m += i;
        i++;
    }
    f2(n/2);
}
I tried a lot to calculate the time complexity and got it wrong; I would really appreciate it if someone could help me with how to approach these programs. (The answer is O(sqrt(n)).)
The following explanation can be simplified, but I tried to be as scrupulous as possible.
Sum of arithmetic progression
First of all, let's talk about the complexity of the following loop (note the m = 0):
int m = 0;
int i = 1;
while (m < n) {
    m += i;
    i++;
}
The invariant of the loop is: after the ith iteration, m == 1+2+...+i == (1+i)*i/2. So the loop stops as soon as the following condition is met:

    (1+i)*i/2 >= n

which is equivalent to

    i^2 + i >= 2*n

The left-hand side grows like i^2, so the smallest i that satisfies it is Theta(sqrt(n)); hence O(sqrt(n)) is the complexity of the loop.
Complexity of the loop inside g2
The loop inside g2 is equivalent to the following loop:
int n_modified = n - m;
m = 0;
int i = 1;
while (m < n_modified) {
    m += i;
    i++;
}
whose complexity is O(sqrt(n - m)), as we've shown in the previous section. For the call g2(n, n/3) this is O(sqrt(n - n/3)) = O(sqrt(2n/3)).
Complexity of f2
Now let's derive an overall formula for the complexity of the f2 function. Its complexity is essentially the same as the complexity of the g2(n, n/3) call, which consists of two parts: the complexity of g2's loop and the complexity of the recursion. That is, the formula is

    T(n) = O(sqrt(2n/3)) + T(n/2)

Unrolling the recursion, factoring out sqrt(2n/3), and summing the geometric progression gives the estimate

    T(n) = sqrt(2n/3) * (1 + 1/sqrt(2) + 1/sqrt(2)^2 + ...) = O(sqrt(n))

which gives us the final answer: the complexity of f2 is O(sqrt(n)).
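As a sanity check, here is an instrumented sketch (my own, not part of the original answer; the steps counter is an addition for measurement). It counts every iteration of g2's loop across the whole recursion; steps / sqrt(n) should stay bounded by a constant:

public class F2Check {
    static long steps = 0;

    static void f2(long n) {
        if (n <= 1) return;
        g2(n, n / 3);
    }

    static void g2(long n, long m) {
        long i = 1;
        while (m < n) {
            m += i;
            i++;
            steps++;   // one unit of work per loop iteration
        }
        f2(n / 2);
    }

    public static void main(String[] args) {
        for (long n = 1_000; n <= 100_000_000L; n *= 10) {
            steps = 0;
            f2(n);
            // steps / sqrt(n) should stay below a constant (roughly 4 here)
            System.out.printf("n=%d steps=%d steps/sqrt(n)=%.3f%n",
                    n, steps, steps / Math.sqrt(n));
        }
    }
}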

Time complexity of 2 nested loops

I want to detect the time complexity for this block of code that consists of 2 nested loops:
function(x) {
    for (var i = 4; i <= x; i++) {
        for (var k = 0; k <= ((x * x) / 2); k++) {
            // data processing
        }
    }
}
I guess the time complexity here is O(n^3); could you please correct me, with a brief explanation, if I am wrong?
The time complexity here is O(x^3), so with n = x your guess of O(n^3) is right. The outer loop runs x - 3 = O(x) times. The inner loop's bound is x*x/2, which does depend on x (not on i), so it runs O(x^2) times on every single outer iteration. Multiplying the two gives O(x) * O(x^2) = O(x^3).
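To check this empirically, here is a small Java port of the loops (my own sketch) that counts the iterations; count / x^3 should approach a constant, about 1/2:

public class NestedLoopCount {
    static long count(int x) {
        long c = 0;
        for (int i = 4; i <= x; i++) {
            for (int k = 0; k <= (x * x) / 2; k++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        for (int x = 10; x <= 160; x *= 2) {
            long c = count(x);
            // c = (x - 3) * (x*x/2 + 1), so c / x^3 tends to 1/2
            System.out.printf("x=%d count=%d count/x^3=%.4f%n",
                    x, c, (double) c / ((double) x * x * x));
        }
    }
}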

Nested loops with time complexity log(log n)

Can there be an algorithm with two nested loops such that the overall time complexity is O(log(log n))?
This arose after solving the following:
for (i = N; i > 0; i = i/2) {
    for (j = 0; j < i; j++) {
        print "hello world"
    }
}
The above code has a time complexity of O(N): the outer loop runs with i = N, N/2, N/4, ..., and the inner loop does i iterations each time, so the total is N + N/2 + N/4 + ... <= 2N (the sum of a geometric progression). Can there exist a similar loop with time complexity O(log(log n))?
For a loop to iterate O(log log n) times, where the loop index variable counts up to n, the index variable has to grow like the inverse function of log log k, where k is the number of iterations; i.e. it has to grow like 2^(2^k) (or with some base other than 2).
One way to achieve this is to start at 2 and repeatedly square until you reach n; if the index variable is (((2^2)^2)...^2) with k squarings, then it equals 2^(2^k), as required:
for (int i = 2; i < n; i = i*i) {
    //...
}
This loop iterates O(log log n) times, as required. If you absolutely must do it with nested loops, we can trivially add an extra loop which iterates O(1) times, leaving the total number of iterations asymptotically the same:
for (int i = 2; i < n; i = i*i) {
    for (int j = 0; j < 10; j++) {
        // ...
    }
}
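For what it's worth, here is a tiny harness (my own sketch) that counts the squaring loop's iterations and compares them with log2(log2(n)); at the chosen powers of two they match exactly:

public class LogLogCheck {
    public static void main(String[] args) {
        // n values of the form 2^(2^m), so that log2(log2(n)) is an integer
        long[] ns = {16L, 256L, 65536L, 4294967296L};
        for (long n : ns) {
            long iterations = 0;
            for (long i = 2; i < n; i = i * i) {
                iterations++;
            }
            double loglog = Math.log(Math.log(n) / Math.log(2)) / Math.log(2);
            System.out.printf("n=%d iterations=%d log2(log2(n))=%.1f%n",
                    n, iterations, loglog);
        }
    }
}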

Determine the time complexity of the algorithms

I have just started to learn time complexity, but I don't really get the idea. Could you help with these questions and explain the way of thinking?
int Fun1(int n)
{
    for (i = 0; i < n; i += 1) {
        for (j = 0; j < i; j += 1) {
            for (k = j; k < i; k += 1) {   // the original post had "i += 1" here, evidently a typo for "k += 1"
                // do something
            }
        }
    }
}
void Fun2(int n){
    i = 0
    while (i < n) {
        for (j = 0; j < i; j += 1) {
            k = n
            while (k > j) {
                k = k/2
            }
            k = j
            while (k > 1) {
                k = k/2
            }
        }
    }
}
int Fun3(int n){
    for (i = 0; i < n; i += 1) {
        print("*")
    }
    if (n <= 1) {
        print("*")
        return
    }
    if (n % 2 != 0) {
        Fun3(n-1)
    }
    else {
        Fun3(n/2)
    }
}
For function 1, I think it's Theta(n^3) because it runs at most n*n*n times, but I am not sure how to prove this.
For the second, I think it's Theta(n^2 log(n)), but I am not sure.
Could you help, please?
First, a quick note: in Fun2(n) there should be an i++ before the end of the outer while loop, otherwise the loop never terminates. Anyway, time complexity matters because it tells you how efficient your algorithms are. In this case you have these 3 functions:
Fun1(n)
In this function you have three nested for loops, each of which iterates up to n times over the given input, and we know that such an iteration costs O(n). Since the loops are nested, the second for loop runs in full on each iteration of the outer loop, and the innermost loop does the same. The resulting complexity, as you correctly said, is O(n) * O(n) * O(n) = O(n^3).
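If you want to see the constant too, here is a counting version (my own sketch, with the inner loop's increment fixed to k += 1). The exact count is the sum over i of i*(i+1)/2, i.e. (n-1)n(n+1)/6, so count / n^3 tends to 1/6:

public class Fun1Check {
    static long count(int n) {
        long c = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < i; j++) {
                for (int k = j; k < i; k++) {
                    c++;
                }
            }
        }
        return c;
    }

    public static void main(String[] args) {
        for (int n = 50; n <= 800; n *= 2) {
            long c = count(n);
            // c / n^3 should approach 1/6 (about 0.1667)
            System.out.printf("n=%d count=%d count/n^3=%.4f%n",
                    n, c, (double) c / ((double) n * n * n));
        }
    }
}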
Fun2(n)
This function has a while loop that iterates n times over the given input, so the outer loop's complexity is O(n). As before, there is an inner for loop that iterates up to n times on each cycle of the outer loop, so far giving O(n) * O(n) = O(n^2). Inside the for loop we have two while loops that differ from the other loops: instead of stepping through every element of a range, they divide the range by 2 at each iteration. For example, from 0 to 31 we have 0-31 -> 0-15 -> 0-7 -> 0-3 -> 0-1.
As you know, the number of iterations of such a halving loop is logarithmic, log(n), so we end up with O(n^2) * O(log(n)) = O(n^2 log(n)) as the time complexity.
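Here is an instrumented version (my own sketch; note it adds the i++ the original question was missing, plus a step counter as an assumption for measurement) to check the Theta(n^2 log(n)) estimate empirically:

public class Fun2Check {
    static long count(int n) {
        long steps = 0;
        int i = 0;
        while (i < n) {
            for (int j = 0; j < i; j++) {
                int k = n;
                while (k > j) { k = k / 2; steps++; }   // first halving loop
                k = j;
                while (k > 1) { k = k / 2; steps++; }   // second halving loop
            }
            i++;   // the increment the original code was missing
        }
        return steps;
    }

    public static void main(String[] args) {
        for (int n = 100; n <= 1600; n *= 2) {
            long steps = count(n);
            double bound = (double) n * n * (Math.log(n) / Math.log(2));
            // steps / (n^2 * log2(n)) should stay bounded by a constant
            System.out.printf("n=%d steps=%d ratio=%.4f%n",
                    n, steps, steps / bound);
        }
    }
}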
Fun3(n)
In this function we have a for loop with no more inner loops, but then we have a recursive call. The complexity of the for loop as we know is O(n), but how many times will this function be called?
If we take a small number (like 6) as an example: Fun3(6) does a first loop of 6 iterations and then calls Fun3(3), since 6 mod 2 = 0 (the else branch halves n).
Fun3(3) does 3 iterations and calls Fun3(2), since 3 mod 2 != 0 (the odd branch subtracts 1); Fun3(2) then calls Fun3(1), which prints and stops.
What do we have here? An odd n becomes even after one call, and an even n gets halved, so every two calls at least halve n: the function calls itself only O(log n) times in total.
The loop lengths along the recursion (n, then roughly n/2, n/4, ...) decrease geometrically, so the total work is O(n + n/2 + n/4 + ...) = O(n), which is the complexity of Fun3.
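To see that the growth really is linear, here is an instrumented sketch (my own; the counters are additions for measurement). iterations / n stays below a small constant, while calls grows like log n:

public class Fun3Check {
    static long loopIterations = 0;
    static long calls = 0;

    static void fun3(int n) {
        calls++;
        for (int i = 0; i < n; i++) {
            loopIterations++;              // stands in for print("*")
        }
        if (n <= 1) return;
        if (n % 2 != 0) fun3(n - 1);       // odd: subtract one
        else fun3(n / 2);                  // even: halve
    }

    public static void main(String[] args) {
        for (int n = 1_000; n <= 10_000_000; n *= 10) {
            loopIterations = 0;
            calls = 0;
            fun3(n);
            // iterations / n stays bounded; calls grows like log n
            System.out.printf("n=%d calls=%d iterations=%d iterations/n=%.3f%n",
                    n, calls, loopIterations, (double) loopIterations / n);
        }
    }
}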
Note that when we calculate time complexity we ignore the coefficients, since they are not relevant. The functions we usually consider, especially in CS, are:
O(1), O(log(n)), O(n), O(n^a) with a > 1, O(n!)
and we combine and simplify them in order to know which algorithm has the best (lowest) time complexity and thus get an idea of which algorithm should be used.

Time complexity (studying for an exam)

I am currently studying for an exam on algorithms and I am trying to solve a question about time complexity in Java, but I can't really figure out how to do it. I am supposed to calculate the expected time complexity. N is a positive integer.
for (int i = 0; i < N; i++)
    for (int j = i+1; j < N; j++) {
        int x = j+1; int h = N-1; int k;
        while (x < h) {
            k = (x+h)/2;
            if (a[i]+a[j]+a[k] == 0) { cnt++; break; }
            if (a[i]+a[j]+a[k] < 0) x = k+1;
            else h = k-1;
        }
    }
The first for loop should run N times and the second N-1 times. Since x is j+1, I guessed that x = N-2. I don't know how to reason about the while loop after that, or whether what I have done so far is right. Would really appreciate help!
Create your time complexity function in parts.
for (int i = 0; i < N; i++)                 // Takes linear O(N) time
    for (int j = i+1; j < N; j++) {         // Takes linear O(N) time; in big-O notation the -1 in N-1 is irrelevant
        int x = j+1; int h = N-1; int k;    // 3 x O(1)
        while (x < h) {                     // Binary search: the interval [x, h] is at least halved on every iteration, so the worst case is O(log N) time
            k = (x+h)/2;                    // Takes O(1) time
            if (a[i]+a[j]+a[k] == 0) {      // Takes O(1) time, and if this is true we break out of the while loop
                cnt++;                      // Takes O(1) time
                break;                      // Takes O(1) time
            }
            if (a[i]+a[j]+a[k] < 0) {       // Takes O(1) time
                x = k+1;                    // Takes O(1) time
            } else {
                h = k-1;                    // Takes O(1) time
            }
        }
    }
So in summary T(N) is O(N^2 log N) and Ω(N^2): the two outer loops contribute Θ(N^2) pairs (i, j), and for each pair the while loop is a binary search over at most N positions, which takes O(log N) steps in the worst case and O(1) in the best case (when it breaks immediately).
We are mostly interested in the worst and best cases.
For reference, the formal definition behind the notation is: T(N) = O(g(N)) means that there exist constants c > 0 and N0 such that T(N) <= c * g(N) for all N >= N0.
I hope this answer helps even a little bit...
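If you want to verify the O(N^2 log N) bound empirically, here is a self-contained sketch (my own; it uses the fixed j++ loop and an all-ones array as input so the sum never hits zero and the binary search always runs to completion):

import java.util.Arrays;

public class ExamCheck {
    public static void main(String[] args) {
        for (int N = 100; N <= 1600; N *= 2) {
            int[] a = new int[N];
            Arrays.fill(a, 1);             // every a[i]+a[j]+a[k] is 3, so no early break
            long steps = 0;
            for (int i = 0; i < N; i++)
                for (int j = i + 1; j < N; j++) {
                    int x = j + 1, h = N - 1, k;
                    while (x < h) {
                        steps++;
                        k = (x + h) / 2;
                        if (a[i] + a[j] + a[k] == 0) break;
                        if (a[i] + a[j] + a[k] < 0) x = k + 1;
                        else h = k - 1;
                    }
                }
            double bound = (double) N * N * (Math.log(N) / Math.log(2));
            // steps / (N^2 * log2(N)) should stay bounded by a constant
            System.out.printf("N=%d steps=%d ratio=%.4f%n", N, steps, steps / bound);
        }
    }
}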