Can anyone help me understand a recurrence relation? - time-complexity

I'm learning about time complexity and have this function:
public static double pow( double x, int n ) {
    if( n == 0 ) return 1.0;
    return x * pow( x, n-1 );
}
I'm tasked with finding the recurrence relation for its time complexity. I know the answer is T(n) = T(n-1) + O(1), but I don't understand why; I think it should be T(n) = T(n-1) + O(n). My reasoning is that each recursive call performs a multiplication by x, and that happens n times, so that's O(n), right?
*edit: I think I understand my mistake. T(n) = T(n-1) + O(1) describes only a single call, and the extra work in that one call is clearly O(1). Solving the recurrence by expanding it gives T(n) = T(0) + n*O(1), so the order of growth is O(n).

You don't need to consider every multiplication. Recurrence relations are about how the call for one number relates to the same call for a different number. In this case, calculating pow(x,n) requires that you already know pow(x,n-1) - that's the T(n) = T(n-1) part. Now, what do you additionally need to do to calculate pow(x,n), given that you already have pow(x,n-1)? That's only a single multiplication - x*pow(x,n-1) - which is O(1), so the total recurrence relation is T(n) = T(n-1) + O(1). All the other multiplications are already accounted for in T(n-1)...
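To see where the overall O(n) comes from, unroll the recurrence:

T(n) = T(n-1) + O(1)
     = T(n-2) + 2*O(1)
     = ...
     = T(0) + n*O(1)
     = O(n)

There are n levels and each contributes constant work, which is why the per-call cost is O(1) but the closed-form solution is linear.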

Related

Time Complexity - T(n) = T(9n/10) + O(n)

What will be the time complexity of this equation?
Using the Master Theorem, I am getting the answer O(n) via the a < b^k case.
But the correct answer is O(nlogn).
How?
Using the Master Theorem for an equation of the form
T(n) = a*T(n/b) + f(n)
you should first calculate the value c = log_b(a).
In this case we have:
a = 1, b = 10/9
so the value of c will be:
c = log_{10/9}(1) = 0
which means:
n^c = n^0 = 1
Now it depends on f(n) which case of the Master method to choose; it can be case 2 or 3 depending on f(n). If f(n) were a constant, i.e. Θ(n^0), then according to case 2, T(n) = Θ(n^0 * log n) = Θ(log n); and since here f(n) = O(n) is a polynomial that grows faster than n^0, case 3 applies and T(n) = Θ(n).
Using the recursive method, I also got O(n). How do you know it's O(nlogn)?
It is O(n).
Look at the recursion tree (ignoring the constant factor that the O(n) term should have): at each level the problem shrinks by a factor of 9/10, and a node of size m costs m.
The non-recursive version of the above is the sum of the per-level costs, or
T(n) = n + (9/10)n + (9/10)^2 n + (9/10)^3 n + ...
which reduces to
T(n) = n * (1 + (9/10) + (9/10)^2 + (9/10)^3 + ...)
which means T(n) is some constant times n - the geometric series sums to 1/(1 - 9/10) = 10 - so in any case it is O(n) asymptotically.
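As a quick numerical sanity check (a minimal sketch of mine, not part of the original answer), you can evaluate the recurrence directly and watch T(n)/n approach 10 as n grows:

public class GeometricRecurrence {
    // T(n) = T(9n/10) + n, with an assumed constant base case below n = 1.
    static double T(double n) {
        if (n < 1) return 0;
        return T(0.9 * n) + n;
    }

    public static void main(String[] args) {
        for (double n : new double[]{1e2, 1e4, 1e6}) {
            System.out.printf("n = %.0f, T(n)/n = %.4f%n", n, T(n) / n);
        }
    }
}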

Time Complexity Analysis of the following relation

Can the recurrence
T(N) = SUM T(N-i) for i = 1 to N
be bounded as
T(N) <= N*T(N-1),
which finally gives O(N^(N-1))?
Solving it iteratively instead gives
T(N) <= N*(N-1)*T(N-2) <= ... <= N*(N-1)*...*(N-k+1)*T(N-k); with k = N-1 this is N*(N-1)*...*2*T(1),
so finally O(N!).
Note that O gives you an upper bound on the execution time: if a certain algorithm is linear, for example, then it is O(n), but it is also O(n^2) and O(n!) - it is O of any superlinear function.
Your inference is correct, but on both steps you overestimated the function's growth. The recurrence T(N) = sum(T(N-i)) is Θ(2^N), so O(2^N) is the tight bound. It is easy to show: T(N) = T(N-1) + (T(N-2) + ... + T(0)) = T(N-1) + T(N-1) = 2*T(N-1), which is the same identity as 2^n = sum(2^i) + 1 for 0 <= i <= n-1.
On your first step you used a looser bound, which is perfectly fine for O. However, even with your bound of T(N) <= N*T(N-1), the complexity you ended up with is too high: O(N!), which is less than what you estimated, also satisfies T(N) <= N*T(N-1).
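A quick empirical check (a minimal sketch of mine; it assumes T(0) = 1 and no extra per-call work) shows the doubling directly:

public class SumRecurrence {
    public static void main(String[] args) {
        long[] t = new long[20];
        t[0] = 1;                                   // assumed base case
        for (int n = 1; n < t.length; n++) {
            for (int i = 1; i <= n; i++) {
                t[n] += t[n - i];                   // T(n) = T(n-1) + ... + T(0)
            }
        }
        for (int n = 1; n < t.length; n++) {
            System.out.printf("T(%d) = %d, T(n)/T(n-1) = %.1f%n",
                              n, t[n], (double) t[n] / t[n - 1]);
        }
    }
}

The ratio settles at exactly 2, matching T(N) = 2*T(N-1).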

Number of Arithmetic Operations Performed By a Fibonacci Function

For this problem, we are given a function f which calculates the nth Fibonacci number in the standard way:
f(n)
{
    if (n == 0)
        return 0;
    if (n == 1)
        return 1;
    else
        return f(n-1) + f(n-2);
}
I am then tasked with calculating the number of arithmetic operations performed by the function on an input n. I know that each call to f performs 3 operations (two subtractions, one addition), and I've attempted to expand the function to find some sort of recurrence relation and work out how many arithmetic operations get done. But I get stuck, because this doesn't come out as the right answer.
The correct answer is that f performs 3f(n+1) - 3 arithmetic operations on an input n. Please could someone explain this to me? It is important, as we are later asked to find the time complexity using this as a starting point.
Thanks very much,
Niamh
The time complexity of your algorithm is close to 2^n, hence it is O(2^n). (Drawing out the recursion tree for, say, n = 6 makes the branching easy to see.)
Your algorithm does T(n) = T(n-1) + T(n-2) + O(1):
T(n-1) = T(n-2) + T(n-3) + O(1)
T(n-2) = T(n-3) + T(n-4) + O(1)
.
.
.
T(2) = T(1) + T(0) + O(1)
Have a look at Fibonacci time complexity
T(n) = T(n-1) + T(n-2) + O(1), which is bounded by
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)
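As for the exact operation count asked in the question: the count satisfies ops(n) = ops(n-1) + ops(n-2) + 3 with ops(0) = ops(1) = 0. Setting S(n) = ops(n) + 3 gives S(n) = S(n-1) + S(n-2) with S(0) = S(1) = 3, which is just a scaled Fibonacci sequence: S(n) = 3*F(n+1), hence ops(n) = 3*F(n+1) - 3. A small instrumented sketch of mine confirms it:

public class FibOps {
    static long ops = 0;

    static long f(int n) {
        if (n == 0) return 0;
        if (n == 1) return 1;
        ops += 3;                        // n-1, n-2, and the final addition
        return f(n - 1) + f(n - 2);
    }

    // Iterative Fibonacci, used only to compute the expected count.
    static long fib(int n) {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) { long t = a + b; a = b; b = t; }
        return a;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {
            ops = 0;
            f(n);
            System.out.printf("n=%d ops=%d expected=%d%n",
                              n, ops, 3 * fib(n + 1) - 3);
        }
    }
}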
Far fewer additions are needed if you compute the sequence iteratively. For example, to print the first n Fibonacci numbers:

let n = 4;
variable a, b = 0, c = 1;
for (i = 3; i <= n; i++) {
    a = b + c; b = c; c = a;
}
// iteration (1): a=1; b=1; c=1
// iteration (2): a=2; b=1; c=2
output: 0 1 1 2

Note: the loop runs from i = 3 to n, so the addition is performed n - 2 times.
Please correct me if I am wrong!
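Written as a self-contained method (my sketch), the iterative version makes the count explicit: the loop body performs exactly one addition, so computing the nth Fibonacci number F(n) takes n - 1 additions (printing the first n numbers, as above, takes n - 2) - O(n) additions instead of the O(2^n) of the naive recursion.

static long fib(int n) {
    if (n == 0) return 0;
    long a = 0, b = 1;                   // F(0) and F(1)
    for (int i = 2; i <= n; i++) {       // runs n - 1 times, one addition each
        long t = a + b;
        a = b;
        b = t;
    }
    return b;
}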

How is this algorithm O(n)?

Working through the recurrences, you can derive that the time complexity of this function satisfies: T(n) = 2T(n/2) + O(1)
And the height of the recursion tree would be log2(n), where the nodes of the tree correspond to the total number of calls.
It was said by the instructor that this function has a time complexity of O(n), but I simply cannot see why.
Further, when you substitute O(n) into the time-complexity equation, strange results appear. For example:
T(n) <= cn
T(n/2) <= (cn)/2
Substituting back into the original equation:
T(n) <= 2*(cn/2) + 1 = cn + 1
which is obviously not true, because cn + 1 is not <= cn.
Your instructor is correct; this is an application of the Master theorem.
You can't substitute O(n) the way you did into the time-complexity equation. A correct substitution uses an explicit polynomial form like an + b, since O(n) only captures the highest-order term (there can be lower-order terms with their own constants).
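Worked out with the standard trick of subtracting a lower-order term (a sketch; the constants are arbitrary): guess T(n) <= a*n - b for some a, b > 0. Then

T(n) <= 2*(a*(n/2) - b) + 1 = a*n - 2b + 1 <= a*n - b   whenever b >= 1,

so the induction goes through and T(n) = O(n). The earlier guess T(n) <= cn failed only because it left no slack to absorb the +1.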
To expand on the answer: you correctly recognize a time-complexity equation of the form
T(n) = aT(n/b) + f(n), with a = 2, b = 2 and f(n) asymptotically O(1).
With this type of equation there are three cases, depending on how n^(log_b(a)) (the cost of the recursion) compares with f(n) (the cost of the non-recursive work on a problem of length n):
1° f(n) grows polynomially faster than the recursion itself (f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0), for instance a = 2, b = 2 and f(n) asymptotically Θ(n^16). Then the recursion is of negligible complexity and the total time complexity is that of f(n):
T(n) = Θ(f(n))
2° The recursion grows polynomially faster than f(n) (f(n) = O(n^(log_b(a) - ε))), which is the case here. Then the complexity is Θ(n^(log_b(a))), in your example Θ(n^(log_2(2))) = Θ(n).
3° The critical case where f(n) and the recursion balance, i.e. there exists k >= 0 such that f(n) = Θ(n^(log_b(a)) * log^k(n)). Then the complexity is:
T(n) = Θ(n^(log_b(a)) * log^(k+1)(n))
This is the ugly case, in my opinion.
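Since the question's code isn't shown above, here is a hypothetical function with exactly this recurrence (the task and names are my own; any function doing two half-size calls plus O(1) work behaves the same):

// Recursive maximum over a range [lo, hi) of an array.
// T(n) = 2T(n/2) + O(1): two half-size calls plus one comparison.
// The recursion tree has about 2n - 1 nodes, each doing constant
// work, so the total is O(n) - exactly case 2 above.
static int max(int[] a, int lo, int hi) {
    if (hi - lo == 1) return a[lo];      // base case: single element
    int mid = lo + (hi - lo) / 2;
    return Math.max(max(a, lo, mid),     // left half:  T(n/2)
                    max(a, mid, hi));    // right half: T(n/2)
}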

Difficulty figuring out the time complexity of this recursive function

I think it's interesting, but I'm not sure about my solution. This algorithm calculates x^n.
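(The code isn't reproduced in this thread; judging from the clues in the answers below - a constant base case at n <= 4 and the combining step rek(x, n/2) * rek(x, (n + 1)/2) - it was presumably something along these lines:)

// Hedged reconstruction, not necessarily the asker's exact code.
static double rek(double x, int n) {
    if (n == 0) return 1.0;
    if (n <= 4) {                        // base case: at most 4 multiplications
        double r = 1.0;
        for (int i = 0; i < n; i++) r *= x;
        return r;
    }
    // With integer division, n/2 + (n+1)/2 == n, so the exponents add up.
    return rek(x, n / 2) * rek(x, (n + 1) / 2);
}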
If I use the master theorem, my reasoning goes like this:
T(n) = 2T(n/2) + f(n)
But f(n) in this case is 1, because the n <= 4 base case takes constant time? That gives me:
T(n) = Θ(n)
If I use substitution I get this answer
T(n) = Θ(n + log(n))
I think I'm doing lots of things wrong. Can someone point me in the right direction?
T(n) = Θ(n + log(n)) is just T(n) = Θ(n); the lower-order term (log(n)) can be omitted when using theta.
Also, f(n) is O(1) because you do only one multiplication - rek(x, n/2) * rek(x, (n + 1)/2) - per recursive call.
The complexity is O(n). Explanation: you end up performing all n of the multiplications, the same ones a simple loop would do, and the number of other operations per multiplication is bounded by a constant. So the total is a constant times n, which makes it O(n).
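To check the linear multiplication count empirically, here is the reconstructed sketch from the question instrumented with a counter (the counter is my addition):

public class RekCount {
    static long mults = 0;

    static double rek(double x, int n) {
        if (n == 0) return 1.0;
        if (n <= 4) {
            double r = 1.0;
            for (int i = 0; i < n; i++) { r *= x; mults++; }
            return r;
        }
        mults++;                         // the combining multiplication
        return rek(x, n / 2) * rek(x, (n + 1) / 2);
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 10000, 1000000}) {
            mults = 0;
            rek(1.0, n);                 // base 1.0 keeps the value finite
            System.out.printf("n=%d multiplications=%d ratio=%.3f%n",
                              n, mults, (double) mults / n);
        }
    }
}

The ratio stays a little above 1 for all n, consistent with O(n) and with the answer above: the recursion performs essentially the same n multiplications a simple loop would.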