Complexity for recursive functions (Big O notation)

I have two functions whose complexity I would like to determine.
For the first one I just need to know whether my working is correct; for the second one, because of the two recursive calls, I am struggling to find the solution. If possible, it would be good to have the working shown so that I can learn how it's done.
First:
def sum(list):
    assert len(list) > 0
    if len(list) == 1:
        return list[0]
    else:
        return sum(list[0:-1]) + list[-1]
Attempted solution:
T(0) = 4
T(n) = T(n-1) + 1 + c   -- true for all n > 0
T(n) = T(n-1) + 1 + c
     = T(n-2) + 2 + 2c
     = T(n-k) + k + kc   -- (n-k = 0 implies that k = n)
T(n) = T(0) + n + nc
     = T(0) + 2nc        -- (T(0) is nothing but 4)
     = 6nc
Complexity = O(n)
Second:
def binSum(list):
    if len(list) == 1:
        return list[0]
    else:
        return binSum(list[:len(list)//2]) + binSum(list[len(list)//2:])
Any help would be greatly appreciated.
Regards

For the first case, you can model the time complexity with the recurrence T(n) = T(n-1) + O(1) and T(0) = O(1), which solves to T(n) = O(n).
Here's a more direct and more formal proof by induction. The base case is easy: T(0) <= C for some absolute constant C > 0, by the definition of O(1). For the inductive step, let D > 0 be the constant hidden in the O(1) cost of one step, and set E = max(C, D). Suppose T(k) <= E*(k+1) for all k <= n. Then T(n+1) <= D + T(n) <= D + E*(n+1) <= E*(n+2), so T(n) <= E*(n+1) for all n, i.e. T(n) = O(n).
For the second case, you can model the time complexity with T(n) = T(n/2) + T(n/2) + O(1) = 2T(n/2) + O(1) for n > 1 and T(1) = O(1). This solves to T(n) = O(n) by the master theorem: a = 2, b = 2, f(n) = O(1), and n^(log_2 2) = n dominates f(n).
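If you want to convince yourself empirically, here is a minimal sketch (the functions are renamed sum_rec and bin_sum here to avoid shadowing the built-in sum) that counts the recursive calls of both functions. It counts calls only, ignoring the cost of the list slices, which the T(n) models above also treat as O(1):

# Count the recursive calls of each function and confirm linear growth.
calls = 0

def sum_rec(lst):
    global calls
    calls += 1
    if len(lst) == 1:
        return lst[0]
    return sum_rec(lst[:-1]) + lst[-1]

def bin_sum(lst):
    global calls
    calls += 1
    if len(lst) == 1:
        return lst[0]
    mid = len(lst) // 2
    return bin_sum(lst[:mid]) + bin_sum(lst[mid:])

for n in (8, 16, 32):
    data = list(range(1, n + 1))
    calls = 0
    sum_rec(data)
    linear_calls = calls
    calls = 0
    bin_sum(data)
    binary_calls = calls
    print(n, linear_calls, binary_calls)  # prints n and 2n - 1: both O(n)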

Related

How to find time and space complexity for this algorithm?

I need to find the time and space complexity of this Java code, which recursively calculates the determinant of a matrix:
public int determinant(int[][] m) {
    int n = m.length;
    if (n == 1) {
        return m[0][0];
    } else {
        int det = 0;
        for (int j = 0; j < n; j++) {
            det += Math.pow(-1, j) * m[0][j] * determinant(minor(m, 0, j));
        }
        return det;
    }
}
public int[][] minor(final int[][] m, final int i, final int j) {
    int n = m.length;
    int[][] minor = new int[n - 1][n - 1];
    int r = 0, s = 0;
    for (int k = 0; k < n; k++) {
        int[] row = m[k];
        if (k != i) {
            for (int l = 0; l < row.length; l++) {
                if (l != j) {
                    minor[r][s++] = row[l];
                }
            }
            r++;
            s = 0;
        }
    }
    return minor;
}
Help: Determine the number of operations and memory consumption of the algorithm with respect to n, and after dividing by n^2 you will get the desired result.
I'm confused. I calculated the time complexity as the sum of the input size (n^2) and the number of steps, and the space complexity as the input size, so O(n^2), but I don't think I'm doing it right. And why should I divide by n^2? (The hint is from the teacher.)
Can someone explain to me how to calculate this in this case?
Let us see. For an input matrix of size n, denote the time complexity as T(n).
So, for a matrix of size n:
if n = 1, we have the answer right away: T(1) = O(1);
otherwise, we loop over j for a total of n times, and for each j, we:
construct a minor in O(n^2) (the minor function), and then
run the function recursively for that minor: this one takes T(n-1) time.
Putting it all together, we have T(1) = O(1) and T(n) = n * (n^2 + T(n-1)).
To understand what is going on, write down what is T(n-1) there:
T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2)))
And then, do the same for T(n-2):
T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2))) =
= n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) = ...
Then we write what is T(n-3), and so on.
Now, open the brackets:
T(n) = n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) =
     = n^3 + n * (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3))) =
     = n^3 + n * (n-1)^3 + n * (n-1) * (n-2) * ((n-2)^2 + T(n-3)) =
     = n^3 + n * (n-1)^3 + n * (n-1) * (n-2)^3 + n * (n-1) * (n-2) * T(n-3) = ...
As we can see going further, the term of the highest order will be the last one,
n * (n-1) * (n-2) * ... * 3 * 2 * 1^3, which is just n! (n-factorial). More precisely, the k-th term is n * (n-1) * ... * (k+1) * k^3 = n! * k^3 / k!, and since the series of k^3 / k! converges, the whole sum stays within a constant factor of n!.
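To see the factorial growth concretely, here is a quick numerical check, assuming T(1) = 1 for the O(1) base case; the ratio T(n)/n! quickly settles toward a constant:

# Evaluate T(1) = 1, T(n) = n * (n^2 + T(n-1)) and compare with n!.
# The ratio T(n)/n! converges to a constant, confirming T(n) = O(n!).
import math

T = 1  # T(1), taking the O(1) base case as 1
for n in range(2, 13):
    T = n * (n ** 2 + T)
    print(n, T, round(T / math.factorial(n), 3))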
The above is about time complexity.
To address space complexity, consider how recursion uses memory.
Note that it behaves quite differently from time.
The reason is that, at each point in time, execution is inside one particular branch of the recursion tree, and the other branches are not holding their memory at that moment.
Therefore, unlike with time, we can't just add up the memory used by all recursive calls.
Instead, we have to consider all branches of the recursion tree, and find the one that uses maximum memory.
In this particular case, it is just any deepest branch of the recursion.
Denote the memory consumption by M(n) where n is the matrix size.
if n = 1, we have the answer right away: M(1) = O(1);
otherwise, we loop over j for a total of n times, and for each j, we:
construct a minor that takes O(n^2) memory (the minor function), and then
run the function recursively for that minor: this one takes M(n-1) memory.
Note that the loop does not accumulate memory across minors:
once we construct the minor for the next j, the minor for the previous j is no longer referenced and its memory can be reclaimed.
Thus we have M(n) = n^2 + M(n-1).
Again, expanding M(n-1), then M(n-2), and so on gets us to
M(n) = n^2 + (n-1)^2 + (n-2)^2 + ... + 3^2 + 2^2 + 1^2 = n(n+1)(2n+1)/6, which is O(n^3); the constant factor does not matter here.
So, by the above reasoning, the answer is: T(n) = O(n!) and M(n) = O(n^3).
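A similar quick check, assuming M(1) = 1, confirms the memory bound: M(n)/n^3 approaches 1/3, as the sum-of-squares formula n(n+1)(2n+1)/6 predicts:

# Evaluate M(1) = 1, M(n) = n^2 + M(n-1); since M(n) is exactly the
# sum of squares 1^2 + ... + n^2, the ratio M(n)/n^3 approaches 1/3.
M = 1  # M(1)
for n in range(2, 101):
    M = n * n + M
    if n % 25 == 0:
        print(n, M, round(M / n ** 3, 4))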
As for the hint about dividing by n^2, I don't have a clue, sorry!

Asymptotic time complexity of c*n*(1 - n)

Suppose on solving a recurrence, I find that:
T(n) = c*n*(1-n) = c*n - c*n^2
where c is a positive constant and n the size of the input
Should I consider the asymptotic time complexity of this recurrence to be O(n), since the n^2 term is negative?
UPDATE:
For example, suppose we have the following recurrence:
T(n) = T(a*n) + O(n), where the factor a is less than 1:
=> T(n) = c*n*(1 + a + a^2 + a^3 + ... for log_a(n) terms)
=> T(n) = c*n*(1 - a^(log_a(n)))/(1 - a)
=> T(n) = c*n*(1 - n)/(1 - a) ~ c*n*(1 - n)
The error, as suggested by @meowgoesthedog in the comments, is due to an incorrect leap in reasoning (there are log_{1/a}(n) terms, not log_a(n)); the correct derivation is as follows:
T(n) = T(a*n) + O(n), where the factor a is less than 1:
=> T(n) = c*n*(1 + a + a^2 + a^3 + ... for log_{1/a}(n) terms)
=> T(n) = c*n*(1 - a^(log_{1/a}(n)))/(1 - a)
=> T(n) = c*n*(1 - 1/n)/(1 - a) = c*(n - 1)/(1 - a) ~ O(n)
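As a quick numeric illustration of the corrected derivation (choosing a = 1/2 and a cost of exactly n per level purely for the example), T(n)/n approaches 1/(1 - a) = 2:

# Iteratively evaluate T(n) = T(floor(a*n)) + n with T(1) = 0, a = 1/2.
# The geometric-series bound predicts T(n) ~ n / (1 - a) = 2n.
def T(n, a=0.5):
    total = 0
    while n > 1:
        total += n          # the O(n) work at this level
        n = int(a * n)      # shrink the input by the factor a
    return total

for n in (10 ** 3, 10 ** 5, 10 ** 7):
    t = T(n)
    print(n, t, round(t / n, 3))  # ratio tends to 2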

What is the time complexity of this nested loop?

I stumbled upon a loop whose time complexity I am not sure of. It is the following loop:
for (i = 1; i <= n^2; i++) {
    for (j = 1; j <= i; j++) {
        // some elementary operation
    }
}
I would have argued that the outer for-loop runs n^2 times, and that the inner loop grows with it: across the outer iterations it runs 1, 2, 3, ..., n^2 times. So the time complexity would be O(n^4). Am I totally going in the wrong direction here?
The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to n² * (n² + 1) / 2.
The Big O notation is O(n^4). You are correct.
It's a simple arithmetic progression: every new iteration of the inner loop is longer by 1.
The outer loop performs n^2 iterations, resulting in the sum:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 * (n^2 + 1) / 2 = O(n^4)
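For small n you can confirm this by brute force; the count matches n^2 * (n^2 + 1) / 2, and the ratio to the leading term n^4 / 2 tends to 1:

# Brute-force count of the inner-loop iterations, compared against
# the closed form n^2 * (n^2 + 1) / 2 and the leading term n^4 / 2.
def count_ops(n):
    ops = 0
    for i in range(1, n * n + 1):
        for j in range(1, i + 1):
            ops += 1
    return ops

for n in (5, 10, 20):
    ops = count_ops(n)
    print(n, ops, n * n * (n * n + 1) // 2, round(ops / (n ** 4 / 2), 4))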

Solve: T(n) = T(n/2) + n/2 + 1

I'm struggling to define the running time of the following algorithm in O notation. My first guess was O(n), but the gap between the iterations and the argument I pass on isn't steady. Where have I gone wrong?
public int function(int n) {
    if (n == 0) {
        return 0;
    }
    int i = 1;
    int j = n;
    while (i < j) {
        i = i + 1;
        j = j - 1;
    }
    return function(i - 1) + 1;
}
The while loop executes about n/2 times.
The recursion then passes on a value that is about half of the original n, so the successive amounts of loop work are:
n/2 (first iteration)
n/4 (second iteration, equal to (n/2)/2)
n/8
n/16
n/32
...
This is a geometric series. In fact, the total can be represented as
n * (1/2 + 1/4 + 1/8 + 1/16 + ...)
which converges to n * 1 = n.
So the O notation is O(n).
Another approach is to write the recurrence down as T(n) = T(n/2) + n/2 + 1:
the while loop does n/2 work, and the argument passed to the next call is about n/2.
Solving this with the master theorem, where:
a = 1
b = 2
f(n) = n/2 + 1
we have n^(log_b a) = n^0 = 1, and f(n) = Θ(n) is polynomially larger, so we are in case 3. That case also requires the regularity condition a*f(n/b) <= c*f(n) for some constant c < 1. Let c = 0.9:
1 * f(n/2) <= c * f(n)?
n/4 + 1 <= 0.9 * (n/2 + 1)?
0.25n + 1 <= 0.45n + 0.9?
0 <= 0.2n - 0.1, which holds for all n >= 1.
Therefore:
T(n) = Θ(n)
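As a numeric double check, you can model the recurrence with integer division (taking T(0) = 1 for the constant base case) and watch T(n)/n settle near 1:

# Model T(0) = 1, T(n) = T(n // 2) + n // 2 + 1 and check T(n)/n -> 1.
def T(n):
    if n == 0:
        return 1
    return T(n // 2) + n // 2 + 1

for n in (10 ** 2, 10 ** 4, 10 ** 6):
    t = T(n)
    print(n, t, round(t / n, 4))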

Time complexity analysis - how to simplify the expression

My algorithm's complexity is given by the expression below, but I am not sure how to simplify it further into Big-O notation.
T(n) = 3 * T(n-1) + 3 * T(n-2) + 3 * T(n-3) + ... + 3 * T(1)
T(1) takes constant time.
Appreciate any help.
Calculating T(n-1), we get:
T(n-1) = 3*T(n-2) + 3*T(n-3) + ... + 3*T(1)
So effectively,
T(n) = 3*T(n-1) + T(n-1) = 4*T(n-1) = 4*(4*T(n-2)) = ...   (valid for n >= 3)
Unrolling this gives T(n) = 4^(n-2) * T(2) = 3 * 4^(n-2) * T(1), so T(n) = O(4^n).
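You can verify this by evaluating the original sum recurrence directly (taking T(1) = 1) and comparing it with the closed form; the two match exactly from n = 2 onward:

# Evaluate T(1) = 1, T(n) = 3 * (T(n-1) + ... + T(1)) and compare with
# the closed form 3 * 4^(n-2) * T(1).
vals = [1]                       # vals[k-1] holds T(k); T(1) = 1
for n in range(2, 11):
    vals.append(3 * sum(vals))   # T(n) from the original recurrence
    print(n, vals[-1], 3 * 4 ** (n - 2))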