GROWTH OF FUNCTIONS

By studying the growth of functions, we can determine the complexity of an algorithm in terms of time and space.

ASYMPTOTIC NOTATIONS:

Suppose we are considering two algorithms, say A and B, for solving a problem. Let us say that we have done an analysis of the running times of each algorithm and determined them to be Ta(n) and Tb(n), where:

Ta(n) = time taken by algorithm A

Tb(n) = time taken by algorithm B

n = a measure of the problem size.

Then it should be a fairly simple matter to compare the functions Ta(n) and Tb(n) to determine which algorithm is better.

But is it really that simple? What does it mean for one function, say Ta(n), to be better than another function, Tb(n)?

One possibility arises if we know the problem size a priori (without examination). E.g., suppose the problem size is n0 and Ta(n0) < Tb(n0). Then clearly algorithm A is better than algorithm B for problem size n0.

In the general case, we have no a priori knowledge of the problem size. However, if it can be shown that for all n >= 0, Ta(n) <= Tb(n), then algorithm A is better than algorithm B regardless of the problem size.

Unfortunately, we usually don't know the problem size beforehand, nor is it true that one of the functions is less than or equal to the other over the entire range of problem sizes.

In this case, we consider the asymptotic behavior of the two functions for very large problem sizes.

The most commonly used notations for describing the time complexity of an algorithm are as follows:

1] Big Oh (O) notation:

Consider a function f(n) which is non-negative for all integers n >= 0. We say that "f(n) is big oh of g(n)", which we write f(n) = O(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n >= n0,

f(n) <= c.g(n)


example:

f(n) = 3n + 5, g(n) = n

solution: we need a constant c > 0 and an integer n0 such that 3n + 5 <= c.n for all n >= n0.

At n = 1: 3(1) + 5 = 8 <= c.1, so c = 8 works with n0 = 1.

Check: for all n >= 1, 3n + 5 <= 3n + 5n = 8n.

So, f(n) = O(g(n)), i.e., f(n) = O(n).
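As a quick sanity check of these constants, the sketch below (in Python; the finite test range is an arbitrary choice made only for illustration) verifies the inequality numerically:

    # Check the big-oh witness: 3n + 5 <= 8n for all n >= 1.
    def f(n):
        return 3 * n + 5

    c, n0 = 8, 1
    assert all(f(n) <= c * n for n in range(n0, 10000))
    print("3n + 5 <= 8n holds for 1 <= n < 10000, consistent with f(n) = O(n)")

A finite check like this is only a sanity test; the algebraic argument above is what proves the bound for all n >= 1.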

2] Big Omega(Ω) notation:

For non-negative functions f(n) and g(n), if there exists an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) >= c.g(n), then f(n) is omega of g(n). This is denoted as

"f(n) = Ω(g(n))"

example:

f(n) = 3n + 5, g(n) = n

solution: we need a constant c > 0 such that 3n + 5 >= c.n for all sufficiently large n.

Taking c = 3: 3n + 5 >= 3n for all n >= 1.

So, f(n) = Ω(g(n)), i.e., f(n) = Ω(n).
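The same style of numerical sanity check applies to the omega witness (again a Python sketch over an arbitrary finite range):

    # Check the omega witness: 3n + 5 >= 3n for all n >= 1.
    def f(n):
        return 3 * n + 5

    c, n0 = 3, 1
    assert all(f(n) >= c * n for n in range(n0, 10000))
    print("3n + 5 >= 3n holds for 1 <= n < 10000, consistent with f(n) = Omega(n)")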

3] Big Theta(Θ) notation:

For non-negative functions f(n) and g(n), f(n) is theta of g(n) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)). This is denoted as

"f(n) = Θ(g(n))"

This basically says that the function f(n) is bounded both from above and below by the same function g(n):


c1.g(n) <= f(n) <= c2.g(n)

example:

f(n) = 3n + 5, g(n) = n

solution: we need constants c1, c2 > 0 such that c1.g(n) <= f(n) <= c2.g(n) for all n >= n0.

Taking the first part of the inequality:

c1.g(n) <= f(n)

3n <= 3n + 5 for all n >= 1, so c1 = 3.

Taking the second part of the inequality:

f(n) <= c2.g(n)

3n + 5 <= 8n for all n >= 1 (as shown in the big-oh example), so c2 = 8.

So, for all n >= 1, c1 = 3 and c2 = 8, and therefore

f(n) = Θ(n).
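Putting the two witnesses together gives a combined sanity check for the theta bound (a Python sketch; the finite range is illustrative only):

    # Check both sides: 3n <= 3n + 5 <= 8n for all n >= 1.
    def f(n):
        return 3 * n + 5

    c1, c2, n0 = 3, 8, 1
    assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10000))
    print("3n <= 3n + 5 <= 8n holds for 1 <= n < 10000, consistent with f(n) = Theta(n)")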