Quicksort time complexity analysis (analysis of the recurrence equation) - time-complexity

Quicksort's recurrence equation is
T(n) = T(n/2) + T(n/2) + Θ(n)
if the pivot always divides the original array into two equally sized subarrays,
so the overall time complexity is O(n log n).
But what if the ratio of the two sub-lists is always 1:99? The equation would then be
T(n) = T(n/100) + T(99n/100) + Θ(n)
But how can I derive the time complexity from this equation?
I've read another answer which says that we can ignore T(n/100), since T(99n/100) will dominate the overall time complexity, but I can't quite fully understand why.
Any advice would be appreciated!

Plug in T(n) = n log(n) + Θ(n) and you get
n log(n) + Θ(n) = (n/100) log(n/100) + (99n/100) log(99n/100) + Θ(n/100) + Θ(99n/100) + Θ(n)
= (n/100) log(n) + (99n/100) log(n) - (n/100) log(100) + (99n/100) log(99/100) + Θ(n)
= n log(n) + Θ(n)
since the leftover log(100) and log(99/100) terms are linear in n and get absorbed into Θ(n).
In fact any fixed split works:
T(n) = T(pn) + T((1-p)n) + Θ(n), for any constant 0 < p < 1,
is solved by O(n log(n)).
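A small numerical sketch of this in Python (my own, not part of the answer; the cutoff T(n) = 0 for n <= 1 and the function names are assumptions):

import sys
from math import log2

sys.setrecursionlimit(5000)  # the 1:99 split makes the recursion chain roughly 900 calls deep

def T(n, p):
    # Work done by the recurrence T(n) = T(p*n) + T((1-p)*n) + n, with T(n) = 0 for n <= 1.
    if n <= 1:
        return 0.0
    return T(p * n, p) + T((1 - p) * n, p) + n

# For each split ratio, T(n) / (n * log2(n)) stays roughly constant as n grows
# (for a quadratic recurrence this ratio would keep climbing). The constant is larger
# for more lopsided splits, but the growth is still O(n log n).
for p in (0.5, 0.9, 0.99):
    for n in (10**2, 10**3, 10**4):
        print(p, n, round(T(n, p) / (n * log2(n)), 2))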

Related

Complexity of the sum of the squares of a geometric progression

I have a question in my data structures course homework, and I thought of 2 algorithms to solve it. One of them is O(n^2) time and the other one is:
T(n) = 3*n + 1*1 + 2*2 + 4*4 + 8*8 + 16*16 + ... + logn*logn
and I'm not sure which one is better.
I know that the sum of the geometric progression from 1 to log n is O(log n), because I can use the geometric series formula for that. But here I have the squares of the geometric progression, and I have no idea how to calculate this.
You can rewrite it as:
log n * log n + ((log n) / 2) * ((log n) / 2) + ((log n) / 4) * ((log n) / 4) + ... + 1
If you substitute log^2 n with x (for easier understanding), you get:
x + x/4 + x/16 + x/64 + ... + 1
You can use the geometric series formula to sum this, but if you don't have to be formal, basic logic is enough. Imagine you have 1/4 of a pie, then add 1/16 of a pie, then 1/64, and so on: it will clearly never add up to a whole pie. Therefore:
x + x/4 + x/16 + x/64 + ... + 1 < 2x
which means the sum is O(x). Substituting log^2 n back for x:
T(n) = O(3*n + log^2 n) = O(n)
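A quick check of that bound (my own sketch; logs are base 2, and log n is taken to be a power of two as in the series above):

from math import log2

def squares_of_powers(n):
    # Sum 1*1 + 2*2 + 4*4 + ... + m*m, where m runs over the powers of 2 up to log2(n).
    total, term = 0, 1
    while term <= log2(n):
        total += term * term
        term *= 2
    return total

# The ratio stays below 2, i.e. the sum is O(log^2 n), so T(n) = 3n + (this sum) is O(n).
for n in (2**16, 2**32, 2**64):
    s = squares_of_powers(n)
    print(n, s, round(s / log2(n) ** 2, 3))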

When is it okay to replace f(n) terms with g(n)?

So during my lecture my professor demonstrated how to solve this problem...
Prove n^2 + 2n + lgn = O(n^2)
So for this problem, if I'm correct, I can replace lg n and 2n with n^2,
because n^2 grows faster than those terms. After doing that, we'd end up with n^2 + 2n^2 + n^2 as our g(n):
0 <= n^2 + 2n + lg n <= n^2 + 2n^2 + n^2 for all n >= 1
However, is this only allowed when proving Big O questions, or can the same methodology be used when trying to prove Big Omega?
So with that being said, this leads us to the following problem...
Prove 2n^3 - 3n^2 + 2n is Big Omega (n^3)
For this problem he got rid of 2n completely, and only replaced -3n^2 with -n^3.
So after doing all of this, the bound would be:
2n^3 - 3n^2 + 2n >= 2n^3 - n^3 for all n >= 3
Why wasn't -3n^2 replaced with n^3?
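For what it's worth, the two inequalities quoted above can be checked numerically with a tiny sketch (my own, interpreting lg as log base 2):

from math import log2

# n^2 + 2n + lg n  <=  n^2 + 2n^2 + n^2 = 4n^2   for all n >= 1   (Big O)
# 2n^3 - 3n^2 + 2n >=  2n^3 - n^3      = n^3     for all n >= 3   (Big Omega)
assert all(n*n + 2*n + log2(n) <= 4*n*n for n in range(1, 10000))
assert all(2*n**3 - 3*n*n + 2*n >= n**3 for n in range(3, 10000))
print("both bounds hold on the tested range")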

Number of Arithmetic Operations Performed By a Fibonacci Function

For this problem, we are given a function f which calculates the nth fibonacci number in the standard way:
f(n)
{
    if (n == 0)
        return 0;
    if (n == 1)
        return 1;
    else
        return f(n-1) + f(n-2);
}
I am then tasked with calculating the number of arithmetic operations performed by the function on an input n. I know that each call to f performs 3 operations (2 subtractions, 1 addition), and I've attempted to find the total by expanding the function, looking for some sort of recurrence relation, and then working out how many arithmetic operations get done. But I get stuck because this doesn't come out as the right answer.
The correct answer is that f does 3f(n+1) - 3 arithmetic operations on an input n. Could someone please explain this to me? It is important, as we are later asked to find the time complexity using this as a starting point.
Thanks very much,
Niamh
The time complexity of your algorithm is close to 2^n, hence it is O(2^n).
Your algorithm does T(n) = T(n-1) + T(n-2) + O(1). Expanding the recurrence:
T(n-1) = T(n-2) + T(n-3) + O(1)
T(n-2) = T(n-3) + T(n-4) + O(1)
.
.
.
T(2) = T(1) + T(0) + O(1)
Have a look at Fibonacci time complexity
T(n) = T(n-1) + T(n-2) + O(1), which is bounded by
T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)
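To see where the question's 3f(n+1) - 3 figure comes from, here is a small counting sketch (my own; it assumes each recursing call costs exactly 2 subtractions plus 1 addition, as stated in the question):

def fib_ops(n):
    # Returns (fib(n), number of arithmetic operations done by the naive recursion).
    # Each call that recurses does 2 subtractions and 1 addition, i.e. 3 operations.
    if n == 0:
        return 0, 0
    if n == 1:
        return 1, 0
    a, ops_a = fib_ops(n - 1)
    b, ops_b = fib_ops(n - 2)
    return a + b, ops_a + ops_b + 3

for n in range(2, 15):
    _, ops = fib_ops(n)
    next_fib, _ = fib_ops(n + 1)
    print(n, ops, 3 * next_fib - 3, ops == 3 * next_fib - 3)   # the last column is always True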
Perhaps the addition operation is performed about n/2 times. Consider an iterative version:
for (i = 3; i <= n; i++)
Let n = 4, with variables a, b = 0, c = 1:
[a = b + c; b = c; c = a;] ......... (1)   // now a = 1; b = 1; c = 1;
[a = b + c; b = c; c = a;] ......... (2)   // now a = 2; b = 1; c = 2;
Output: 0 1 1 2.
Note: if n is an odd number, the addition operation is performed (n/2) + 1 times;
Note: if n is an even number, the addition operation is performed (n/2) times.
Please correct me if I am wrong!

Finding Big O of the Harmonic Series

Prove that
1 + 1/2 + 1/3 + ... + 1/n is O(log n).
Assume n = 2^k
I put the series into summation notation, but I have no idea how to tackle this problem. Any help is appreciated.
This follows easily from a simple fact from Calculus, namely that the integral of 1/x from 1 to n is ln(n), and we have the following inequality:
ln(n + 1) <= 1 + 1/2 + 1/3 + ... + 1/n <= 1 + ln(n)
Here we can conclude that S = 1 + 1/2 + ... + 1/n is both Ω(log(n)) and O(log(n)), thus it is Ɵ(log(n)); the bound is actually tight.
Here's a formulation using Discrete Mathematics: assume n = 2^k and group the terms as
1 + (1/2) + (1/3 + 1/4) + (1/5 + ... + 1/8) + ... + (1/(2^(k-1) + 1) + ... + 1/2^k)
The j-th parenthesized group has 2^(j-1) terms, each less than 1/2^(j-1), so each group sums to less than 1. Counting the leading 1, there are k + 1 = log(n) + 1 groups, hence H(n) <= log(n) + 1.
So, H(n) = O(log n)
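A small numerical illustration of that sandwich (my own sketch, using natural logs):

from math import log

def harmonic(n):
    # H(n) = 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))

# The integral bounds quoted above: ln(n + 1) <= H(n) <= 1 + ln(n),
# which squeeze H(n) between two Theta(log n) functions.
for n in (10, 1000, 100000):
    h = harmonic(n)
    assert log(n + 1) <= h <= 1 + log(n)
    print(n, round(h, 4), round(log(n + 1), 4), round(1 + log(n), 4))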
If the problem were changed to:
1 + 1/2 + 1/4 + ... + 1/n
the series can now be written as:
1/2^0 + 1/2^1 + 1/2^2 + ... + 1/2^k
How many times will the loop run? From 0 to k, that is k + 1 times. In both forms we can see that 2^k = n, hence k = log(n). So the number of iterations is log(n) + 1 = O(log n).
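And the same kind of quick check for the halving series (my own sketch, assuming n is a power of two as above):

from math import log2

def halving_terms(n):
    # Count the terms 1/2^0, 1/2^1, ..., 1/2^k with 2^k = n, i.e. how many times the loop runs.
    count, denom = 0, 1
    while denom <= n:
        count += 1
        denom *= 2
    return count

for n in (8, 1024, 2**20):
    print(n, halving_terms(n), int(log2(n)) + 1)   # the two counts agree: log2(n) + 1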

Understanding recurrence relation

I have this recurrence relation
T(n) = T(n-1) + n, for n ≥ 2
T(1) = 1
Practice exercise: Solve the recurrence relation using the iteration method and give an asymptotic running time.
So I solved it like this:
T(n) = T(n-1) + n
= T(n-2) + (n - 1) + n
= T(n-3) + (n - 2) + (n - 1) + n
= …
= T(1) + 2 + … + (n - 2) + (n - 1) + n **
= 1 + 2 + … + (n - 2) + (n - 1) + n
= O(n^2)
I have some questions:
1) How can I find the asymptotic running time?
** 2) At this step of the problem, T(1) means that n was reduced by repeated subtraction until it reached 1, right?
3) What if T(0) = 1, and what if T(2) = 1?
Edit: 4) Why is n ≥ 2 useful?
I really need to understand this for my mid-term test.
T(n) = T(n-1) + n, for n ≥ 2
T(1) = 1
If T(n) represents the running time, then you have already found the asymptotic running time: it is O(n^2) (quadratic).
If the relation is changed to T(0) = 1 or T(2) = 1, then the running time is still quadratic. The asymptotic behavior does not change if you add a constant or multiply by a constant, and changing the initial condition only adds a constant to the following terms.
n ≥ 2 is present in the relation so that T(n) is defined exactly once for every positive n. Otherwise, both lines would apply to T(1). You cannot compute T(1) from T(0) using T(n) = T(n-1) + n. Even if you could, T(1) would be defined in two different (and potentially inconsistent) ways.
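A tiny sketch of the closed form behind that answer (my own; it just unrolls the iteration from the question):

def T(n):
    # T(1) = 1; T(n) = T(n-1) + n for n >= 2, unrolled into a loop.
    total = 1
    for k in range(2, n + 1):
        total += k
    return total

for n in (1, 2, 10, 100, 1000):
    print(n, T(n), n * (n + 1) // 2)   # T(n) == n(n+1)/2, which is Theta(n^2)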