Time complexity of a recursive program

I can't figure out the time complexity of the following snippet of code:
void f( int N ){
    sum ++;
    if ( N > 1){
        f( N /2);
        f( N /2);
    }
}
It's the double recursive call that gives me problems.
I know (or think) that the time complexity of
void f( int N ){
    sum ++;
    if ( N > 1){
        f( N /2);
    }
}
is ~log2(N), but I don't know what to do with the other code.
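The ~log2(N) guess can be sanity-checked by counting calls (a minimal sketch; here the function returns the number of invocations instead of updating sum):

```java
public class CallCount {
    // Number of invocations made by the single-recursion variant of f.
    static int calls(int n) {
        return n > 1 ? 1 + calls(n / 2) : 1;
    }

    public static void main(String[] args) {
        // For N a power of two, the call count is log2(N) + 1.
        System.out.println(calls(8));    // prints 4
        System.out.println(calls(1024)); // prints 11
    }
}
```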

You are calling the recursion twice on N/2. Let's write the recurrence:
T(N) = 2*T(N/2) + 1
Using the master theorem, we fall into the first case, where:
a = 2
b = 2
f(N) = O(1) = O(N^0), and log_b(a) = log_2(2) = 1 > 0
so:
T(N) = Θ(N^(log_b a)) = Θ(N)

A good way to solve this problem would be:
1. Finding the recurrence relation
For each call to f we have time complexity T(N). Each call contains:
A constant amount of work C: sum++, the comparison N > 1, recursive-call overhead, etc.
2 recursive calls to f, each with time complexity T(N / 2).
Thus the recurrence relation is given by T(N) = 2 T(N/2) + C.
2. Finding a solution by inspection
If we repeatedly substitute T(N) into itself, we can see an emerging pattern:
T(N) = 2 T(N/2) + C
     = 2 (2 T(N/4) + C) + C = 4 T(N/4) + (2 + 1) C
     = 8 T(N/8) + (4 + 2 + 1) C
     = ...
     = 2^m T(N / 2^m) + (2^(m-1) + ... + 2 + 1) C
What is the upper limit of the substitution, m? Since the stopping condition is N > 1, the substitution ends when N / 2^m = 1, i.e. when m = log2(N).
Thus, taking T(1) = C and dropping the round-down, the geometric summation equals 2^(m-1) + ... + 2 + 1 = 2^m - 1 = N - 1, so:
T(N) = N C + (N - 1) C = (2N - 1) C = Θ(N)
3. Proof by induction
Base step: test that the closed form T(N) = (2N - 1) C is correct for the lowest possible value of N, i.e. 2:
T(2) = 2 T(1) + C = 2C + C = 3C = (2 * 2 - 1) C
The result is consistent.
Recurrence step: confirm that if the closed form is correct for N / 2, it is also correct for N:
2 T(N/2) + C = 2 (2 (N/2) - 1) C + C = (2N - 2) C + C = (2N - 1) C = T(N)
which is exactly our original recurrence relation, now satisfied by the closed form.
Hence by induction, the closed form is correct, and T(N) is indeed Θ(N).
4. Numerical testing
We can write code to confirm our result if needed (taking T(1) = C = 1):
function T(n) {
    return n > 1 ? 2 * T(Math.floor(n / 2)) + 1 : 1;
}
Results:
N T(N)
-------------------------
2 3
4 7
8 15
16 31
32 63
64 127
128 255
256 511
512 1023
1024 2047
2048 4095
4096 8191
8192 16383
16384 32767
32768 65535
65536 131071
131072 262143
262144 524287
524288 1048575
1048576 2097151
2097152 4194303
4194304 8388607
8388608 16777215
16777216 33554431
33554432 67108863
67108864 134217727
134217728 268435455
268435456 536870911
536870912 1073741823
1073741824 2147483647
A plot of T(N) against N (omitted here) is a straight line through these points, consistent with T(N) = 2N - 1.

Let's try tracing through it, writing f(x) for the amount of work done by the call f(x):
Let N be 8.
1 + f(4) + f(4)
=> 1 + 2 + f(2) + f(2) + f(2) + f(2)
=> 1 + 6 + f(1) + f(1) + f(1) + f(1) + f(1) + f(1) + f(1) + f(1)
When n = 8; work = 15
Let N be 4.
1 + f(2) + f(2)
=> 1 + 2 + f(1) + f(1) + f(1) + f(1)
When n = 4; work = 7
Let N be 2
1 + f(1) + f(1)
When n = 2; work = 3;
Let N be 1
1
When n = 1; work = 1
So at a glance the work pattern seems to be 2n - 1
We still need to prove it!
From the algorithm the recursive relation is:
W(1) = 1
W(N) = 1 + 2 * W(N / 2)
Proof by induction
Base case:
W(1) = 2(1) - 1 = 1, as required.
Recursive case:
Assume W(N / 2) = 2(N/2) - 1 = N - 1
W(N) = 1 + 2 * W(N / 2)
Applying the induction hypothesis:
W(N) = 1 + 2(N - 1) = 1 + 2N - 2 = 2N - 1, as required
Therefore the complexity is O(2n - 1) because O is reflexive
=> O( max { 2n, -1 } ) because of the rule of sums
=> O(2n)
=> O(n) because of the rule of scaling
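The closed form W(N) = 2N - 1 can also be checked numerically (a small sketch that evaluates the recurrence directly):

```java
public class WorkCount {
    // Evaluates W(1) = 1, W(N) = 1 + 2 * W(N / 2) directly.
    static long work(long n) {
        return n > 1 ? 1 + 2 * work(n / 2) : 1;
    }

    public static void main(String[] args) {
        // For N a power of two, work(N) equals 2 * N - 1.
        System.out.println(work(8));    // prints 15
        System.out.println(work(1024)); // prints 2047
    }
}
```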

This code is very similar to binary tree traversal:
void f( int N ){
    sum ++;
    if ( N > 1){
        f( N /2); // like traversing the left subtree
        f( N /2); // like traversing the right subtree
    }
}
which basically visits each node of a conceptual complete binary tree once, with O(N) time complexity.
n/8
n/4 ---------
n/8
n/2 ------------------
n/8
n/4 ---------
n/8
n -------------------------------
n/8
n/4 ---------
n/8
n/2 ----------------
n/8
n/4 ---------
n/8
this goes on until the passed value reaches 1, where the N > 1 test stops the recursion.
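The same total falls out of summing this tree level by level: level k holds 2^k calls, each doing constant work, so for N a power of two there are 1 + 2 + 4 + ... + N = 2N - 1 nodes in total. A small sketch:

```java
public class LevelSum {
    // Sums the number of calls on each level of the recursion tree.
    static long nodes(long n) {
        long total = 0;
        for (long width = 1; width <= n; width *= 2) {
            total += width; // `width` calls on this level, O(1) work each
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(nodes(8));    // prints 15
        System.out.println(nodes(1024)); // prints 2047
    }
}
```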

Related

How to find time and space complexity for this algorithm?

I need to find time and space complexity for this java code for recursive calculation of the determinant of a matrix:
public int determinant(int[][] m) {
    int n = m.length;
    if (n == 1) {
        return m[0][0];
    } else {
        int det = 0;
        for (int j = 0; j < n; j++) {
            det += Math.pow(-1, j) * m[0][j] * determinant(minor(m, 0, j));
        }
        return det;
    }
}

public int[][] minor(final int[][] m, final int i, final int j) {
    int n = m.length;
    int[][] minor = new int[n - 1][n - 1];
    int r = 0, s = 0;
    for (int k = 0; k < n; k++) {
        int[] row = m[k];
        if (k != i) {
            for (int l = 0; l < row.length; l++) {
                if (l != j) {
                    minor[r][s++] = row[l];
                }
            }
            r++;
            s = 0;
        }
    }
    return minor;
}
Help: Determine the number of operations and memory consumption of the algorithm with respect to n, and after dividing by n^2 you will get the desired result.
I'm confused. I calculated the time complexity as the sum of the input size (n^2) and the number of steps, and the space complexity as the input size, so I get O(n^2), but I don't think I'm doing it right. And why should I divide by n^2? (The hint is from the teacher.)
Can someone explain to me how to calculate this in this case?
Let us see. For an input matrix of size n, denote the time complexity as T(n).
So, for a matrix of size n:
if n = 1, we have the answer right away: T(1) = O(1);
otherwise, we loop over j for a total of n times, and for each j, we:
construct a minor in O(n^2) (the minor function), and then
run the function recursively for that minor: this one takes T(n-1) time.
Putting it all together, we have T(1) = O(1) and T(n) = n * (n^2 + T(n-1)).
To understand what is going on, write down what is T(n-1) there:
T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2)))
And then, do the same for T(n-2):
T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2))) =
= n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) = ...
Then we write what is T(n-3), and so on.
Now, open the brackets:
T(n) = n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) =
= n^3 + n * (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3))) =
= n^3 + n * (n-1)^3 + n * (n-1) * (n-2) * ((n-2)^2 + T(n-3)) =
= n^3 + n * (n-1)^3 + n * (n-1) * (n-2)^3 + n * (n-1) * (n-2) * T(n-3) = ...
As we can see going further, the highest-order term among these, in terms of n, will be
n * (n-1) * (n-2) * ... * 3 * 2 * 1^3, which is just n! (n-factorial).
The above is about time complexity.
To address space complexity, consider how recursion uses memory.
Note that it is much different from what happens to the time.
The reason is that, at each point of time, we are in some single particular branch of the recursion tree, and at this time other branches don't use the memory they need.
Therefore, unlike with time, we can't just add up the total memory used by all recursive calls.
Instead, we have to consider all branches of the recursion tree, and find the one that uses maximum memory.
In this particular case, it is just any deepest branch of the recursion.
Denote the memory consumption by M(n) where n is the matrix size.
if n = 1, we have the answer right away: M(1) = O(1);
otherwise, we loop over j for a total of n times, and for each j, we:
construct a minor that takes O(n^2) memory (the minor function), and then
run the function recursively for that minor: this one takes M(n-1) memory.
Note that the loop does not accumulate memory used by minors.
Instead, when we construct the minor for next j, the minor for previous j is not needed anymore.
Thus we have M(n) = n^2 + M(n-1).
Again, writing down what is M(n-1), then M(n-2), and so on gets us to
M(n) = n^2 + (n-1)^2 + (n-2)^2 + ... + 3^2 + 2^2 + 1^2 = n(n+1)(2n+1)/6, which is O(n^3) with some constant factor we need not care about.
So, by the above reasoning, the answer is: T(n) = O(n!) and M(n) = O(n^3).
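As a numerical check of the time bound (a sketch of the recurrence above, not of the Java code itself), the ratio T(n)/n! settles toward a constant, confirming T(n) = Θ(n!):

```java
import java.math.BigInteger;

public class DeterminantCost {
    // T(1) = 1, T(n) = n * (n^2 + T(n-1)), as derived above.
    static BigInteger t(int n) {
        if (n == 1) return BigInteger.ONE;
        BigInteger bn = BigInteger.valueOf(n);
        return bn.multiply(bn.multiply(bn).add(t(n - 1)));
    }

    static BigInteger factorial(int n) {
        BigInteger f = BigInteger.ONE;
        for (int k = 2; k <= n; k++) f = f.multiply(BigInteger.valueOf(k));
        return f;
    }

    public static void main(String[] args) {
        // T(n) / n! quickly approaches a constant (about 13.59),
        // so the integer quotient is 13 for all n >= 5.
        for (int n = 5; n <= 25; n += 5) {
            System.out.println(n + ": " + t(n).divide(factorial(n)));
        }
    }
}
```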
What's the hint about, with dividing by n^2, I don't have a clue, sorry!

Time complexity for a specific loop

What is the time complexity and tilde for the loop below?
for (int i = N/2; i < N; i++) {
for (int j = i; j < N; j++) {
doSomething(i, j);
}
}
I think that it runs N/2 + (N/2 - 1) + (N/2 - 2) + ... + 1 times, but how do I get its time complexity and tilde?
For example, if N = 100, the loop will run 50 + 49 + 48 + ... + 1 times.
I am assuming doSomething(i, j) is not iterating over all the elements between i and j; if that is the case, the complexity of this algorithm is O(N^2).
The outer loop for (int i = N/2; i < N; i++) { executes N/2 times, which is O(N), since the constant factor 1/2 does not matter.
The inner loop executes N - i times for each i, which is at most N, so it is also O(N).
Therefore, the overall time complexity is O(N^2) in the worst-case scenario.
The inner loop is executed:
N/2 times for i = N/2,
N/2 - 1 times for i = N/2 + 1,
...
1 time for i = N - 1,
therefore the total count for the inner loop is:
N/2 + (N/2 - 1) + ... + 2 + 1
= (N/2)(N/2 + 1)/2
= N^2/8 + N/4
Hence the order of growth of the code is N^2.
If you consider tilde notation, which is defined as "∼g(n) to represent any quantity that, when divided by f(n), approaches 1 as n grows", the result is ~N^2/8, because (N^2/8 + N/4)/(N^2/8) approaches 1 as N grows.
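An empirical count (a minimal sketch, with doSomething replaced by a counter) confirms the ~N^2/8 growth:

```java
public class LoopCount {
    // Counts how many times doSomething(i, j) would be invoked.
    static long count(int n) {
        long c = 0;
        for (int i = n / 2; i < n; i++) {
            for (int j = i; j < n; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // The ratio count(N) / (N^2 / 8) approaches 1 as N grows.
        System.out.println(count(100));                          // prints 1275
        System.out.println(count(1000) / (1000.0 * 1000.0 / 8)); // about 1.002
    }
}
```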

What is the time complexity of this nested loop?

I stumbled upon a loop for which I am not sure what the time complexity is. It is the following loop:
for(i = 1; i <= n^2; i++){
for(j = 1; j <= i; j++) {
//some elementary operation
}
}
I would have argued that the outer for-loop runs in n^2 and the inner for loop would also run in n^2 as for every iteration of the outer-loop we do n^2 - (n^2 - 1), n^2 - (n^2 - 2),..., n^2. Am I totally going in the wrong direction here?
So the time complexity would be in n^4
The number of operations will be:
1 + 2 + 3 + 4 + 5 + ... + n²
which is equal to (n² * (n² + 1)) / 2.
The Big O notation is O(n^4). You are correct.
It's a simple arithmetic progression happening here.
Every new iteration of the inner loop is longer by 1.
The outer loop performs n^2 iterations, which results in the following sum:
1 + 2 + 3 + ... + n + ... + n^2 = n^2 (n^2 + 1) / 2 = O(n^4)
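Counting the operations directly (a minimal sketch) matches the closed form n^2 (n^2 + 1) / 2:

```java
public class NestedCount {
    // Counts the elementary operations of the nested loop.
    static long count(long n) {
        long c = 0;
        for (long i = 1; i <= n * n; i++) {
            for (long j = 1; j <= i; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        long n = 50, m = n * n;
        System.out.println(count(n));        // prints 3126250
        System.out.println(m * (m + 1) / 2); // prints 3126250
    }
}
```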

Solve: T(n) = T(n/2) + n/2 + 1

I struggle to define the running time for the following algorithm in O notation. My first guess was O(n), but the amount of work per call and the argument passed to the next call both shrink, and I can't see a steady pattern. Where did I go wrong?
public int function (int n )
{
if ( n == 0) {
return 0;
}
int i = 1;
int j = n ;
while ( i < j )
{
i = i + 1;
j = j - 1;
}
return function ( i - 1) + 1;
}
The while loop executes about n/2 times.
The recursion is then called with an argument that is about half of the original n, so the while-loop costs per call are:
n/2 (first call)
n/4 (second call, equal to (n/2)/2)
n/8
n/16
n/32
...
This is a geometric series. In fact the total can be represented as
n * (1/2 + 1/4 + 1/8 + 1/16 + ...)
which converges to n * 1 = n.
So the O notation is O(n).
Another approach is to write it down as T(n) = T(n/2) + n/2 + 1.
The while loop does n/2 work, and the argument passed to the next call is about n/2.
Solving this using the master theorem, where:
a = 1
b = 2
f(n) = n/2 + 1
we have f(n) = Ω(n^(log_b a + ε)) = Ω(n^ε) for ε = 1, which is the third case, so we check the regularity condition a*f(n/b) ≤ c*f(n) for some c < 1. Let c = 0.9:
1 * (n/4 + 1) ≤? 0.9 * (n/2 + 1)
0.25n + 1 ≤? 0.45n + 0.9
0 < 0.2n - 0.1
which holds for all n ≥ 1. Therefore:
T(n) = Θ(f(n)) = Θ(n)
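Instrumenting the function to count while-loop iterations (a minimal sketch with the same control flow as the original) confirms the Θ(n) total:

```java
public class HalvingWork {
    static long loopSteps = 0;

    // Same control flow as the original; counts while-loop iterations.
    static int function(int n) {
        if (n == 0) return 0;
        int i = 1, j = n;
        while (i < j) {
            loopSteps++;
            i = i + 1;
            j = j - 1;
        }
        return function(i - 1) + 1;
    }

    public static void main(String[] args) {
        function(1 << 20);
        // For n a power of two, the total is n/2 + n/4 + ... + 1 = n - 1.
        System.out.println(loopSteps); // prints 1048575
    }
}
```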

How to find the time complexity of the following code snippet

What is the time complexity of the following code snippet and how to calculate it.
function(int n){
    for(int i = 1; i <= n; i++){
        for(int j = 1; j <= n; j += i){
            System.out.println("*");
        }
    }
}
Let's think about the total work that's done. The inner loop runs n times when i = 1, then n / 2 times when i = 2, then n / 3 times when i = 3, etc. (ignoring rounding errors). This means that the total work done is
n + n/2 + n/3 + n/4 + ... + n/n
= n(1 + 1/2 + 1/3 + 1/4 + ... + 1/n)
The term in parentheses is the nth harmonic number, denoted H_n, so the work done overall is roughly n·H_n. It's known that H_n = Θ(log n), so the total work is Θ(n log n).
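A direct count (a minimal sketch, with the println replaced by a counter) matches the Θ(n log n) estimate:

```java
public class StarCount {
    // Counts how many stars the snippet would print.
    static long stars(int n) {
        long c = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j += i) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        int n = 100000;
        // stars(n) is roughly n * H_n, so the ratio to n * ln(n)
        // approaches a constant close to 1 as n grows.
        System.out.println(stars(10)); // prints 33
        System.out.println((double) stars(n) / (n * Math.log(n)));
    }
}
```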