1) n/n + ... + n/16 + n/8 + n/4 + n/2 + n/1 = ?
2) n/n + ... + n/5 + n/4 + n/3 + n/2 + n/1 = ?
I am working on finding the time complexity of a few algorithms, and I came across these two series.
I believe the 1st (geometric) series is O(log n). What is the time complexity of the 2nd series?
Assuming that (1) is n * (… + 1/2^k + … + 1/16 + 1/8 + 1/4 + 1/2 + 1/1), the answer is 2n because the sum 1 + 1/2 + 1/4 + … + 1/2^k + … converges to the value 2. To see this:
1/1 + 1/2 + ... + 1/2^i + ... = k
(1/1 + 1/2 + ... + 1/2^i + ...)/2 = k/2
1/2 + 1/4 + ... + 1/2^(i+1) + ... = k/2
k - 1 = k/2
k/2 = 1
k = 2
The key step above was recognizing the LHS of the third line is one less than the LHS of the first line.
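A quick numeric sanity check (my own sketch, not part of the original answer) that the partial sums of the geometric series really do approach 2 from below:

```python
# Sketch: partial sums of 1 + 1/2 + 1/4 + ... approach (but never reach) 2,
# so n times the series stays below 2n.

def geometric_partial_sum(terms):
    """Sum of 1/2^i for i = 0 .. terms-1."""
    return sum(1 / 2**i for i in range(terms))

print(geometric_partial_sum(10))   # already close to 2
print(geometric_partial_sum(50))   # closer still, but always < 2
```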
For (2), n * (… + 1/k + … + 1/5 + 1/4 + 1/3 + 1/2 + 1/1) is n times the harmonic series. The harmonic series diverges, so taken to infinity this sum is unbounded, tending toward infinity. To see this, compare the two series:
1/1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + …
1/1 + 1/2 + 1/4 + 1/4 + 1/8 + 1/8 + 1/8 + 1/8 + …
The second is the same as the first but all terms have had denominators increased to the next higher power of two. Thus the second series cannot sum to a larger value than the first. But the second series clearly diverges since we can group two 1/4s, four 1/8s, etc., to get the sum 1 + 1/2 + 1/2 + … + 1/2 + …
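If you want to see the divergence numerically, here is a small sketch of my own: the grouping argument above gives the lower bound H(2^k) >= 1 + k/2, so the partial sums grow without bound.

```python
# Sketch: verify the grouped lower bound H(2^k) >= 1 + k/2
# (one initial 1, then k groups each summing to at least 1/2).

def harmonic(n):
    """H(n) = 1/1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

for k in range(1, 15):
    assert harmonic(2**k) >= 1 + k / 2   # grows without bound as k grows
```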
1) n{(1/1)+(1/2)+(1/4)+(1/8)+...} ==> O(n), because the series (1/1)+(1/2)+(1/4)+... converges to the number 2.
2) n{(1/1)+(1/2)+(1/3)+(1/4)+...} ==> O(n ln n), because the series (1/1)+(1/2)+(1/3)+...+(1/n) sums to roughly ln n (this is the harmonic series).
1)
... + n/16 + n/8 + n/4 + n/2 + n/1 = ? is a geometric series and its sum will always be less than or equal to 2n. So it is O(n).
2) ... + n/5 + n/4 + n/3 + n/2 + n/1 = ? is n times a harmonic series, whose sum is about ln n; there are standard mathematical derivations of this. So the total is O(n log n).
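As a rough numeric check of the O(n log n) claim (my own sketch, not from the answers above), the total n/1 + n/2 + ... + n/n is n * H(n), which tracks n * ln(n) closely for large n:

```python
import math

# Sketch: compare n * H(n) against n * ln(n).
def total_work(n):
    return sum(n / k for k in range(1, n + 1))

n = 100_000
ratio = total_work(n) / (n * math.log(n))   # slightly above 1 for large n
```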
I have algorithm one, which has a complexity of O(mnr + mr^2 + nr^2) + K x (mr^2 + nr^2),
and a second algorithm, Estep = Q.X + (1-Q).(W*H), which I computed as O(mnr),
where (.) is element-wise multiplication.
Now I want to add O(mnr) to O(mnr + mr^2 + nr^2) + K x (mr^2 + nr^2).
Question:
Do you agree with my complexity for the second statement?
What will be the final complexity?
Thanks for your time.
Assuming K is a constant, you can simplify O(mnr) + O(mnr + mr^2 + nr^2) + K x (mr^2 + nr^2) to
O(mnr + mr^2 + nr^2). You cannot simplify further without some assumption about the relationship between m, n and r.
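To see why the mr^2 + nr^2 terms cannot simply be dropped, here is a tiny numeric illustration with hypothetical sizes (my own choice, not from the question):

```python
# Hypothetical sizes: when r is much larger than m and n,
# the mr^2 term dominates mnr, so it must stay in the bound.
m, n, r = 10, 10, 1000
mnr = m * n * r      # 100_000
mr2 = m * r * r      # 10_000_000 -- dominates mnr here
nr2 = n * r * r
```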
Suppose I have two factors, N and M, and a constraint that M <= N, and I have an operation with O(log(N)) time complexity, that needs to be run M times, however, N reduces by 1 on each iteration, so it looks roughly like this:
O(log(N) + log(N - 1) + ... + log(N - (M - 2)) + log(N - (M - 1)))
How do I reduce this to a simple expression?
As a bonus: I simplified things a bit above. N doesn't necessarily decrease by 1 on each iteration; that only occurs in the worst case (where M = N). It actually decreases by the result of the prior log(N) operation, which gives some series of M numbers, let's call it series R, and series R sums to N, so it's really:
O(log(N) + log(N - R(0)) + log(N - R(0) - R(1)) + ... + log(N - R(0) - R(1) - ... - R(M - 2)) + log(N - R(0) - R(1) - ... - R(M - 2) - R(M - 1)))
where it's a summation with sub summations... is this able to be simplified?
Since log(a) + log(b) = log(a*b), it follows that your sum equals:
O( log( N*(N-1)*(N-2)* ... * (N-(M-1)) ) )
So the worst-case scenario M = N gives the upper bound O(log(N!)).
In the general case the complexity is O(log(N!/(N-M)!)), which increases with M as expected. Since each of the M factors is at most N, this is also bounded by O(M log N).
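A small sketch of my own verifying the identity numerically, along with the M log N bound:

```python
import math

# Sketch: log N + log(N-1) + ... + log(N-M+1) equals log(N! / (N-M)!)
# and is bounded above by M * log N.
N, M = 50, 20
log_sum = sum(math.log(N - i) for i in range(M))
closed = math.log(math.factorial(N) / math.factorial(N - M))
assert math.isclose(log_sum, closed)
assert log_sum <= M * math.log(N)
```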
I have a question in my data structures course homework, and I thought of 2 algorithms to solve it; one of them is O(n^2) time and the other one is:
T(n) = 3 * n + 1*1 + 2*2 + 4*4 + 8*8 + 16*16 + ... + logn*logn
And I'm not sure which one is better.
I know that the sum of the geometric progression from 1 to log n is O(log n) because I can use the geometric series formula for that. But here I have the squares of the geometric progression, and I have no idea how to calculate this.
You can rewrite it as:
log n * log n + ((log n) / 2) * ((log n) / 2) + ((log n) / 4) * ((log n) / 4) ... + 1
if you substitute (for easier understanding) log^2 n with x, you get:
x + x/4 + x/16 + x/64 + ... + 1
You can use the geometric series formula to sum this, but if you don't have to be formal, basic logic is enough. Just imagine you have 1/4 of a pie and then add 1/16 of a pie, then 1/64, etc.; you can clearly see it will never add up to a whole pie, therefore:
x + x/4 + x/16 + x/64 + ... + 1 < 2x
Which means it's O(x).
Changing back the x for log^2 n:
T(n) = O(3*n + log^2 n) = O(n)
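Here is a quick numeric check (my own sketch) that the squared-geometric tail stays below 2 log^2 n and is dwarfed by the 3n term:

```python
import math

def squared_geometric_tail(n):
    """1*1 + 2*2 + 4*4 + ... over powers of two up to log2(n)."""
    total, term = 0, 1
    while term <= math.log2(n):
        total += term * term
        term *= 2
    return total

n = 2**16                              # log2(n) = 16
tail = squared_geometric_tail(n)       # 1 + 4 + 16 + 64 + 256 = 341
assert tail < 2 * math.log2(n) ** 2    # bounded by 2 * log^2 n
assert tail < 3 * n                    # dwarfed by the linear term
```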
Prove that
1 + 1/2 + 1/3 + ... + 1/n is O(log n).
Assume n = 2^k
I put the series into summation notation, but I have no idea how to tackle this problem. Any help is appreciated.
This follows easily from a simple fact in Calculus: for a positive decreasing function f, the sum f(1) + f(2) + ... + f(n) lies between the integral of f from 1 to n+1 and f(1) plus the integral of f from 1 to n. Taking f(x) = 1/x, we have the following inequality:
ln(n + 1) <= 1 + 1/2 + 1/3 + ... + 1/n <= 1 + ln(n)
Here we can conclude that S = 1 + 1/2 + ... + 1/n is both Ω(log(n)) and O(log(n)), thus it is Ɵ(log(n)); the bound is actually tight.
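A numeric check of the integral sandwich (my own sketch, not part of the proof):

```python
import math

def H(n):
    """H(n) = 1/1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

# ln(n+1) <= H(n) <= 1 + ln(n) for every n checked
for n in (10, 100, 1000, 10_000):
    assert math.log(n + 1) <= H(n) <= 1 + math.log(n)
```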
Here's a formulation using Discrete Mathematics. With n = 2^k, group the terms by powers of two:
H(n) = 1/1 + (1/2 + 1/3) + (1/4 + ... + 1/7) + ... + (1/2^(k-1) + ... + 1/(2^k - 1)) + 1/2^k
Each parenthesized group contains 2^j terms, each at most 1/2^j, so every group sums to at most 1, and there are k + 1 = log2(n) + 1 groups in total. Hence H(n) <= log2(n) + 1.
So, H(n) = O(log n)
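And a quick check of the grouping upper bound (again a sketch of my own):

```python
# Sketch: verify H(2^k) <= k + 1, the discrete grouping bound.

def H(n):
    """H(n) = 1/1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

for k in range(1, 15):
    assert H(2**k) <= k + 1
```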
If the problem was changed to :
1 + 1/2 + 1/4 + ... + 1/n
series can now be written as:
1/2^0 + 1/2^1 + 1/2^2 + ... + 1/2^(k)
How many times will the loop run? From 0 to k, i.e., k + 1 times. From both forms of the series we can see 2^k = n, hence k = log(n). So the number of terms is log(n) + 1 = O(log n).
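A tiny sketch of my own counting the terms 1, 1/2, 1/4, ..., 1/n, i.e. the iterations of a doubling loop:

```python
def halving_iterations(n):
    """Count the terms 1/2^0, 1/2^1, ..., 1/2^k where 2^k = n."""
    count, term = 0, 1
    while term <= n:
        count += 1
        term *= 2
    return count

assert halving_iterations(2**10) == 11   # log2(n) + 1 terms
```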
I have this recurrence relation
T(n) = T(n-1) + n, for n ≥ 2
T(1) = 1
Practice exercise: Solve recurrence relation using the iteration method and give an asymptotic running time.
So I solved it like this:
T(n) = T(n-1) + n
= T(n-2) + (n - 1) + n
= T(n-3) + (n - 2) + (n - 1) + n
= …
= T(1) + 2 + … + (n - 2) + (n - 1) + n **
= 1 + 2 + … + (n - 2) + (n - 1) + n
= O(n^2)
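A quick check (my own sketch) that iterating the recurrence matches the closed form n(n+1)/2, which is Θ(n^2):

```python
def T(n):
    """Evaluate T(1) = 1, T(n) = T(n-1) + n directly."""
    t = 1
    for i in range(2, n + 1):
        t += i
    return t

for n in (1, 5, 100):
    assert T(n) == n * (n + 1) // 2   # closed form, hence Theta(n^2)
```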
I have some questions:
1) How can I find the asymptotic running time?
**2) At this stage of the problem, T(1) means that there was an n which, after repeated subtraction of 1, reached 1, right?
3) What if T(0) = 1, and what if T(2) = 1?
Edit: 4) Why is n ≥ 2 useful?
I really need to understand this for my mid-term test.
T(n) = T(n-1) + n, for n ≥ 2
T(1) = 1
If T(x) represents the running time:
You have already found the asymptotic running time, O(n^2) (quadratic).
If the relation is changed to T(0) = 1 or T(2) = 1, then the running time is still quadratic. The asymptotic behavior does not change if you add a constant or multiply by a constant, and changing the initial condition only adds a constant to the following terms.
n ≥ 2 is present in the relation so that T(n) is defined exactly once for every positive n; otherwise, both lines would apply to T(1). You cannot compute T(1) from T(0) using T(n) = T(n-1) + n, since T(0) is not defined. Even if you could, T(1) would be defined in two different (and potentially inconsistent) ways.
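To illustrate the point about initial conditions, here is a small sketch of my own showing that changing the base value only shifts every later term by the same constant:

```python
def T(n, base_value):
    """T at the base is base_value; T(k) = T(k-1) + k above it."""
    t = base_value
    for k in range(2, n + 1):
        t += k
    return t

# changing the initial condition shifts every term by the same constant,
# so the quadratic asymptotic behavior is unchanged
diff = [T(n, 5) - T(n, 1) for n in (2, 10, 100)]   # all equal to 4
```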