
    Contents

1 Combinatorial analysis
  1.1 Fundamental Principles Of Counting: Tree Diagram
  1.2 Factorial Function
  1.3 Permutations
  1.4 Combinations
  1.5 Exercises

2 Elements of probability theory
  2.1 Set Theory
    2.1.1 Venn Diagram
    2.1.2 Operators
    2.1.3 Set Theorems
    2.1.4 Principle Of Duality
  2.2 Sample Space And Events
  2.3 The Concept Of Probability
  2.4 Addition Rule
  2.5 Conditional Probability
  2.6 The Multiplication Theorem
  2.7 Bayes Rule
  2.8 Exercises

3 Elements of mathematical statistics
  3.1 Stochastic Variables
  3.2 Discrete Probability Distributions
  3.3 Continuous Probability Distributions
  3.4 Joint Distributions
  3.5 Expected Values
  3.6 Some Discrete Distributions
    3.6.1 Binomial Distribution
    3.6.2 Poisson Distribution
    3.6.3 Other Discrete Distributions
  3.7 Some Continuous Distributions
    3.7.1 Normal Distribution
    3.7.2 Exponential Distribution
    3.7.3 Erlang-k Distribution
  3.8 Exercises

4 Theory of sampling
  4.1 Sampling
  4.2 Sampling Statistics
  4.3 The Central Limit Theorem
  4.4 Sampling Distribution
    4.4.1 Population mean μ and variance σ² are known
    4.4.2 Population mean μ and variance σ² are both unknown
  4.5 Exercises

Author index

Index


    Chapter 1

    Combinatorial analysis

In some cases the number of possible outcomes for a particular event is not very large, and so direct counting of possible outcomes is not difficult. However, problems often arise where direct counting becomes a practical impossibility. In these cases use is made of combinatorial analysis, which could be called a sophisticated way of counting.

1.1 Fundamental Principles Of Counting: Tree Diagram

We first introduce two rules which are employed in many proofs throughout combinatorial analysis:

Rule of Sum: If object A may be chosen in m ways, and object B in n other ways, either A or B may be chosen in m + n ways.

Rule of Product: If object A may be chosen in m ways, and thereafter object B in n ways, both A and B may be chosen, in this order, in m · n ways (multiplication principle).

It should be noticed that in the Rule of Sum, the choices of A and B are mutually exclusive, that is, one cannot choose both A and B but either A or B.


[Figure 1.1: Tree diagram for Example 1.1, with k = 2, l = 3 and m = 2.]

The Rule of Product is often used in cases where the order of choosing is immaterial, that is, where the choices are independent. But in many practical situations the possibility of dependence should not be ignored.

If one thing can be accomplished in n_1 different ways, and if after this a second thing can be accomplished in n_2 different ways, …, and finally a kth thing can be accomplished in n_k different ways, then all k things (which are assumed to be independent of each other) can be accomplished in the specified order in n = n_1 n_2 \cdots n_k different ways.

A diagram, called a tree diagram because of its appearance (Fig. 1.1), is often used in connection with these rules.

    Example 1.1 (see Fig. 1.1)

    Let the setting-up procedure of a call involve the following devices:

k local circuits: L_1, L_2, …, L_k
l registers: R_1, R_2, …, R_l
m trunk circuits: T_1, T_2, …, T_m

Under the assumption of independence a call can then be set up in

n = k · l · m

    different ways.

    For a group of 1000 subscribers typical figures are

k = 80, l = 15, m = 50, i.e. n = 60,000.

If a malfunction only occurs for a specific combination of devices, it can be very difficult to trace the fault, as it only appears in one out of 60,000 calls (assuming random hunting).

□
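The multiplication principle of Example 1.1 can be checked by brute-force enumeration. A minimal Python sketch (the device names are illustrative only, not taken from the text):

    from itertools import product

    k, l, m = 80, 15, 50
    local_circuits = [f"L{i}" for i in range(1, k + 1)]
    registers = [f"R{i}" for i in range(1, l + 1)]
    trunk_circuits = [f"T{i}" for i in range(1, m + 1)]

    # Each (local circuit, register, trunk circuit) triple is one way
    # to set up the call, by the Rule of Product.
    paths = list(product(local_circuits, registers, trunk_circuits))
    assert len(paths) == k * l * m == 60000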

    1.2 Factorial Function

    Factorial n (denoted by n!, n integer) is defined as

n! = n(n−1)(n−2) \cdots 2 · 1    (1.1)

It is convenient to define

0! = 1    (1.2)

    Example 2.1

For many calculators the upper range of numbers is

9.999 × 10^{99}

Thus the factorial function can only be evaluated for n < 70:

69! ≈ 1.7112 × 10^{98}

70! ≈ 1.1979 × 10^{100}


If n is large a direct evaluation of n! is impractical. In such cases Stirling's approximation is often applied:

n! ∼ \sqrt{2πn}\, n^n e^{−n}    (1.3)

where e is the base of natural logarithms (e ≈ 2.718281828…).

The symbol ∼ means that the ratio of the left side to the right side approaches 1 as n → ∞. For this reason we often call the right side an asymptotic expansion of the left side. The symbol ≈ means "approximately equal to".

□
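Stirling's approximation (1.3) is easy to try out numerically; the following Python sketch compares it with the exact value of 69! from Example 2.1:

    import math

    n = 69
    exact = math.factorial(n)                                   # exact integer value
    stirling = math.sqrt(2 * math.pi * n) * n**n * math.exp(-n)
    print(f"exact    = {float(exact):.4e}")                     # 1.7112e+98
    print(f"stirling = {stirling:.4e}")                         # about 1.7092e+98
    print(f"ratio    = {stirling / exact:.5f}")                 # about 0.99879, tending to 1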

The gamma function, denoted by Γ(n), is defined for any real value of n > 0:

Γ(n) = \int_0^∞ t^{n−1} e^{−t} dt,  n > 0    (1.4)

A recurrence formula is

Γ(n + 1) = n Γ(n)    (1.5)

We can easily find Γ(1) = 1. If n is a positive integer, then we get, in agreement with (1.1):

Γ(n + 1) = n!

    1.3 Permutations

    The basic definition of a permutation is given below:

An r-permutation of n different objects is an ordered selection or arrangement of r (r ≤ n) of the objects.

Actually it is sampling without replacement. Suppose that we are given n distinguishable objects and wish to arrange r (≤ n) of these objects on a line. Since there are n ways of choosing the 1st object and, after this is done, n − 1 ways of choosing the 2nd object, …, and finally (n − r + 1) ways of choosing the rth object, by repeated application of the Rule of Product it follows by the fundamental principle of counting that the number of different arrangements, or permutations as they are called, is given by

P^n_r = n(n−1) \cdots (n−r+1)    (1.6)

We call P^n_r the number of permutations of n objects taken r at a time.


For the particular case where r = n, (1.6) becomes

P^n_n = n!    (1.7)

We can write (1.6) in terms of factorials as

P^n_r = \frac{n!}{(n−r)!}    (1.8)

If r = n we see that (1.6) and (1.8) agree only if 0! = 1 (1.2).
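Formulas (1.6) and (1.8) can be verified numerically; a small Python sketch (math.perm requires Python 3.8 or later, and n, r are chosen arbitrarily):

    import math

    n, r = 10, 4
    direct = 1
    for factor in range(n, n - r, -1):       # n(n-1)...(n-r+1), formula (1.6)
        direct *= factor

    assert direct == math.perm(n, r)                             # library equivalent
    assert direct == math.factorial(n) // math.factorial(n - r)  # formula (1.8)
    print(direct)                                                # 5040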

    Example 3.1

We consider the case where some objects are identical. The number of permutations of n objects consisting of groups of which n_1 are identical, n_2 are identical, …, and n_k are identical, where

n = n_1 + n_2 + \cdots + n_k,

is

\frac{n!}{n_1!\, n_2! \cdots n_k!} = \binom{n}{n_1\ n_2\ \cdots\ n_k}

The term on the right-hand side is called the multinomial (polynomial) coefficient.

    Example 3.2

Let us consider a group of n circuits. We can look (hunt) for idle circuits in

P^n_n = n!

different ways.

□

    1.4 Combinations

In a permutation we are interested in the order of arrangement of the objects. Thus (a b c) is a different permutation from (b c a). In many problems, however, we are only interested in selecting objects without regard to the order. Such selections are called combinations. For example (a b c) and (b c a) are the same combination.


The total number of combinations of r objects selected from n (also called the combinations of n items taken r at a time) is denoted by C^n_r or \binom{n}{r}. We have

C^n_r = \binom{n}{r} = \frac{n!}{r!(n−r)!}    (1.9)

It can also be written

\binom{n}{r} = \frac{P^n_r}{r!}    (1.10)

We can easily derive the expression by noticing that each combination of r different objects may be ordered in r! ways, and so ordered it is an r-permutation.

Thus we have

r!\, C^n_r = P^n_r = n(n−1) \cdots (n−r+1),  n ≥ r

By moving r! to the other side we get the expression (1.10).

It is easy to see that

\binom{n}{r} = \binom{n}{n−r}    (1.11)

The numbers \binom{n}{r} are often called binomial coefficients because they arise in the binomial expansion:

(x + y)^n = \sum_{r=0}^{n} \binom{n}{r} x^r y^{n−r}    (1.12)

They can be generalized in several ways. Thus we define:

\binom{−n}{r} = (−1)^r \binom{n+r−1}{r},  n > 0    (1.13)

This appears in the Negative Binomial Distribution.
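The symmetry (1.11) and the binomial expansion (1.12) can be checked numerically with a few lines of Python (the values of n, x, y below are chosen arbitrarily):

    import math

    n = 6
    # Symmetry (1.11): C(n, r) == C(n, n - r)
    assert all(math.comb(n, r) == math.comb(n, n - r) for r in range(n + 1))

    # Binomial expansion (1.12), checked numerically at x = 2, y = 3
    x, y = 2, 3
    lhs = (x + y) ** n
    rhs = sum(math.comb(n, r) * x**r * y**(n - r) for r in range(n + 1))
    assert lhs == rhs == 15625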

    Example 4.1


Let us consider a set of n objects consisting of n_1 different objects of type 1, n_2 different objects of type 2, etc., where

n = n_1 + n_2 + \cdots + n_k

We consider combinations of r objects containing r_1 objects of type 1, r_2 objects of type 2, …, r_k objects of type k, where

r = r_1 + r_2 + \cdots + r_k,  r_i ≤ n_i

The number of these combinations is, by the fundamental principle of counting:

\binom{n_1}{r_1} \binom{n_2}{r_2} \cdots \binom{n_k}{r_k}

The total number of combinations with r elements is \binom{n}{r}.

□

Many combinatorial problems can be reduced to the following form. For a group of n circuits, p of them are busy and (n − p) of them are idle. A group of k circuits is chosen at random. We seek the number of combinations which contain exactly x busy circuits. Here x can be any integer between zero and p or k, whichever is the smaller.

The chosen group contains x busy and k − x idle circuits. Since any choice of busy circuits may be combined with any choice of idle ones, the busy ones can be chosen in \binom{p}{x} different ways and the idle ones in \binom{n−p}{k−x} different ways. Thus the total number of combinations containing x busy circuits is

\binom{p}{x} \binom{n−p}{k−x}    (1.15)

The total number of combinations containing k circuits (idle or busy) is \binom{n}{k}. So the relative number of favourable combinations is

q_k = \frac{\binom{p}{x} \binom{n−p}{k−x}}{\binom{n}{k}}    (1.16)

This can be rewritten in the form

q_k = \frac{\binom{k}{x} \binom{n−k}{p−x}}{\binom{n}{p}}    (1.17)

In terms of probabilities this is called the hypergeometric distribution.

    Example 4.4

For the special case x = k we get from (1.16)

q_k = \frac{\binom{p}{k} \binom{n−p}{0}}{\binom{n}{k}} = \frac{\binom{p}{k}}{\binom{n}{k}}    (1.18)

or from (1.17)

q_k = \frac{\binom{k}{k} \binom{n−k}{p−k}}{\binom{n}{p}} = \frac{\binom{n−k}{p−k}}{\binom{n}{p}}    (1.19)

These expressions are useful when deriving the Palm-Jacobæus formula and Erlang's interconnection formula (Erlang's ideal grading) in teletraffic theory.

□
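The equality of (1.16) and (1.17), which exercise 5 below asks for, can also be confirmed numerically. A Python sketch, using the trunk-group numbers of exercise 7 (n = 10, p = 7, k = 4):

    import math

    def q(n, p, k, x):
        # eq. (1.16): choose x of the p busy and k - x of the n - p idle circuits
        return math.comb(p, x) * math.comb(n - p, k - x) / math.comb(n, k)

    def q_alt(n, p, k, x):
        # eq. (1.17): the equivalent symmetric form
        return math.comb(k, x) * math.comb(n - k, p - x) / math.comb(n, p)

    n, p, k = 10, 7, 4
    for x in range(min(p, k) + 1):
        assert math.isclose(q(n, p, k, x), q_alt(n, p, k, x))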

Useful Relations And Results

Equalities:

\binom{n}{r} = \binom{n}{n−r}    (1.20)

\binom{n}{r} = 0,  for r > n and for r < 0    (1.21)

\binom{n}{0} = 1    (1.22)


Recurrence formula (Pascal's triangle):

\binom{n}{r−1} + \binom{n}{r} = \binom{n+1}{r}    (1.23)

\sum_{r=0}^{n} \binom{n}{r} = 2^n    (1.24)

\sum_{i=r}^{n} \binom{i}{r} = \binom{n+1}{r+1}    (1.25)

        1
       1 1
      1 2 1
     1 3 3 1
    1 4 6 4 1

Fig. 1.2 Pascal's triangle

Summary:

Permutation = \frac{n!}{(n−r)!},  repetitions are not allowed
            = n^r,  repetitions are allowed

Combination = \frac{n!}{r!(n−r)!},  repetitions are not allowed
            = \frac{(n+r−1)!}{r!(n−1)!},  repetitions are allowed
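The four counting formulas in the summary can be cross-checked against direct enumeration; a small Python sketch with arbitrarily chosen n and r:

    import math
    from itertools import (combinations, combinations_with_replacement,
                           permutations, product)

    n, r = 5, 3
    items = range(n)

    # permutations, no repetition: n!/(n-r)!
    assert len(list(permutations(items, r))) == math.factorial(n) // math.factorial(n - r)
    # permutations, repetition allowed: n^r
    assert len(list(product(items, repeat=r))) == n**r
    # combinations, no repetition: n!/(r!(n-r)!)
    assert len(list(combinations(items, r))) == math.comb(n, r)
    # combinations, repetition allowed: (n+r-1)!/(r!(n-1)!)
    assert len(list(combinations_with_replacement(items, r))) == math.comb(n + r - 1, r)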

    1.5 Exercises

1. Evaluate 69! by using Stirling's approximation to n! (use logarithms to the base 10). Compare the result with the value given in Example 2.1.

2. In how many ways can 4 calls occupy 10 different circuits?

3. How many 6-digit telephone numbers can be formed with the digits 0, 1, 2, …, 9 (0 is not allowed as the first digit) if

(a) repetitions are allowed?
(b) repetitions are not allowed?
(c) the last digit must be 0 and repetitions are not allowed?


4. We consider a 4-digit binary number. Every digit may be 0 or 1. How many different numbers have two 0's (and two 1's)?

    5. Prove that (1.16) and (1.17) are equal.

    6. Prove formula (1.23).

7. A trunk group contains 10 circuits. 7 circuits are busy. What is the number of combinations which contain X (X = 0, 1, 2, 3, 4) busy circuits?

8. The number of combinations of n digits taken x (x = 0, 1, 2, …, n) at a time, where every digit has one of k different values 0, 1, 2, …, k − 1, can be shown to be

\binom{n+k}{n}

Verify this for n = 2 and k = 10.

    Updated: 2001.01.10


    Chapter 2

    Elements of probability theory

Probability theory deals with the study of events whose occurrence cannot be predicted in advance. These kinds of events are termed random events. For example, when throwing a single die, the result may be one of the six numbers 1, 2, 3, 4, 5, 6. We cannot predict the result. So the outcome of throwing a die is a random event. When observing the number of telephone calls arriving at a telephone exchange during a certain time interval, we are of course unable to predict the actual number of arriving calls. This is also a random event.

Probability theory is usually discussed in terms of experiments and the possible outcomes of the experiments. Set theory plays an important role in the study of probability theory.

    2.1 Set Theory

A set is a collection of objects called elements of the set. In general we shall denote a set by a capital letter such as A, B, C, etc., and an element by a lower-case letter such as a, b, c, etc.

If an element c belongs to a set A, we write c ∈ A. If c does not belong to A, we write c ∉ A. If both a and c belong to A, we write a, c ∈ A. A set is well-defined if we are able to determine whether a particular element does or does not belong to the set.

A set can be defined by listing its elements. If the set A consists of the elements a, b, c, then we write

A = {a, b, c}

A set can also be defined by describing some properties held by all elements and by non-elements. We call a set finite (or infinite) if it contains a finite (or infinite) number of elements.


    Example 1.1

A = {abc, acd, abd, bcd}
  = {x | x is a combination of the letters a, b, c and d taken three at a time}

□

    Example 1.2

B = {x | x is the number of telephone call attempts between 9 a.m. and 10 a.m.}

□

If every element of a set A also belongs to a set B, then we call A a subset of B, written:

A ⊆ B (A is contained in B) or

B ⊇ A (B contains A)

For all sets we have A ⊆ A. If both A ⊆ B and B ⊆ A, then A and B are said to be equal, and we write A = B. In this case A and B have exactly the same elements.

If A and B do not have the same elements, we write A ≠ B.

If A ⊆ B, but A ≠ B, then we call A a proper subset of B, denoted by A ⊂ B.

    Example 1.3 (cf. Example 1.1)

The set consisting of the combinations of the letters a, b, c and d taken three at a time is a proper subset of the set consisting of the permutations of the same four letters taken three at a time.

□

    Example 1.4 (cf. Example 1.2)

The set consisting of successful call attempts between 9 a.m. and 10 a.m. is a subset of all call attempts during the same period.


□

The following theorem is true for any sets A, B, C:

if A ⊆ B and B ⊆ C, then A ⊆ C

All sets considered are in general assumed to be subsets of some fixed set called the universe or the universal set, denoted by U. It is also useful to define a set having no elements at all. This is called the null set and is denoted by ∅.

    2.1.1 Venn Diagram

A universe U can be shown graphically by the set of points inside a rectangle (Fig. 2.1). Subsets of U (such as A and B shown in Fig. 2.1) can be represented by sets of points inside circles. Such a diagram is called a Venn Diagram. It often serves to provide geometric intuition regarding possible relationships between sets.

[Fig. 2.1 A Venn Diagram. U is the universe. A and B are subsets.]

    2.1.2 Operators

In set theory we define a number of operators. We assign symbols to them, just as the arithmetic operators for addition, subtraction, multiplication and division have the symbols +, −, ×, ÷. The set operators are:


    2.1.3 Set Theorems

    Set operations are similar to those of Boolean algebra as seen from the following theorems.

1. Idempotent laws:

A ∪ A = A,  A ∩ A = A

2. Commutative laws:

A ∪ B = B ∪ A,  A ∩ B = B ∩ A

3. Associative laws:

A ∪ (B ∪ C) = (A ∪ B) ∪ C = A ∪ B ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C = A ∩ B ∩ C

4. Distributive laws:

A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

5. Identity laws:

A ∪ ∅ = A,  A ∩ ∅ = ∅,  A ∪ U = U,  A ∩ U = A

6. De Morgan's laws:

C(A ∪ B) = CA ∩ CB
C(A ∩ B) = CA ∪ CB

7. Complement laws:

A ∪ CA = U,  A ∩ CA = ∅,  C(CA) = A,  CU = ∅,  C∅ = U

8. For any sets A and B:

A = (A ∩ B) ∪ (A ∩ CB)

Here CA denotes the complement of A relative to U.

    2.1.4 Principle Of Duality:

Any true result involving sets is also true if we replace unions by intersections, intersections by unions, sets by their complements, and if we reverse the inclusion symbols ⊆ and ⊇.
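Laws 1-8 above can be checked mechanically on a small universe with Python sets (the particular sets A, B and the universe U are chosen here for illustration; C(X) plays the role of the complement CX):

    U = frozenset(range(10))
    A = frozenset({1, 2, 3, 4})
    B = frozenset({3, 4, 5, 6})
    C = lambda X: U - X              # complement relative to the universe U

    # De Morgan's laws
    assert C(A | B) == C(A) & C(B)
    assert C(A & B) == C(A) | C(B)
    # Complement laws
    assert A | C(A) == U and A & C(A) == frozenset()
    # Theorem 8
    assert A == (A & B) | (A & C(B))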


    2.2 Sample Space And Events

Experiments are of great importance in science and engineering. Experiments in which the outcome will not be the same, even though the conditions are nearly identical, are called random experiments and are subject to study by probability theory.

The set of all possible outcomes of an experiment is called a sample space, denoted by S. An individual outcome is called a sample point, which is an element of the set S.

    Example 2.1

Experiment: Throw a die.
Sample space: S = {1, 2, 3, 4, 5, 6}

    Example 2.2

Experiment: Observation of the number of busy lines in a group of n lines.
Sample space: S = {0, 1, 2, …, n}

□

    Example 2.3

Experiment: Observation of the number of calls arriving at a telephone station during a certain time interval.
Sample space: S = {0, 1, 2, …}

□

    Example 2.4

Experiment: Make a telephone call between 9 a.m. and 10 a.m. and observe the time of the call.
Sample space: S = {t | 9 ≤ t ≤ 10}

□


If the sample space has a finite number of points, it is called a finite sample space (Examples 2.1 and 2.2). If it has as many points as there are natural numbers, it is called a countably infinite sample space (Example 2.3). In both cases it is called a discrete sample space. If it has as many points as there are points in some interval, it is called a non-countable infinite sample space or a continuous sample space (Example 2.4).

An event is a subset A of the sample space, i.e. it is a set of possible outcomes. If the outcome of an experiment belongs to A, we say that the event A has occurred or that A is a realization of the experiment.

An event which consists of one sample point of S is called a simple event. It cannot be broken down into other events. A compound event is the aggregate of a number of simple events. A sure event is an event that will definitely occur. Naturally, an impossible event never occurs.

    Example 2.5

For a group of 12 lines the compound event that exactly 2 circuits are busy consists of \binom{12}{2} = 66 simple events.

□

Since events are sets, statements concerning events can be translated into the language of set theory, and conversely. We can represent events graphically on a Venn Diagram, and we also have an algebra of events corresponding to the algebra of sets given in section 2.1. By using the operators of section 2.1 on events in S we can obtain other events in S. If A and B are events, we have

A ∪ B : the event "either A or B or both"
A ∩ B : the event "both A and B"
CA : the event "not A"
A \ B : the event "A but not B"
∅ : the event that never occurs
S : the event that surely occurs

If A ∩ B = ∅, that is, the sets corresponding to events A and B are disjoint, then both events cannot occur simultaneously. They are mutually exclusive.

    A set of events is termed exhaustive if their union is the entire sample space S.


    Example 2.6

Consider a trunk group having four trunk circuits. The experiment is to observe the state of the trunks: busy or idle. A sample space S for this experiment of observing trunk circuits 1 through 4 may be the set of four-tuples (a_1, a_2, a_3, a_4), where a_i is either 1 or 0, indicating that the ith trunk circuit is busy or idle. Thus the sample point (1, 0, 1, 0) corresponds to the outcome that the first and the third circuits are busy while the second and the fourth circuits are idle. The sample space consists of 2^4 = 16 sample points.

Let A be the event that at least two circuits are idle, and let B be the event that no more than two circuits are idle. Then A ∪ B is the whole space S, and A ∩ B is the collection of elements of S in which exactly two trunks are idle. If C is the event that exactly one circuit is idle, then A ∩ C = ∅, that is to say, A and C have no sample points in common.

□

    2.3 The Concept Of Probability

Probability is a positive measure between 0 and 1 associated with each simple event, the total of all simple-event probabilities being 1. From a strict mathematical point of view it is difficult to define the concept of probability. We shall use a relative frequency approach (the a posteriori approach).

In early or classical probability theory, all sample spaces were assumed to be finite, and each sample point was considered to occur with equal frequency. The definition of the probability P of an event A was described by the relative frequency by which A occurs:

P(A) = \frac{h}{n}

where h is the number of sample points in A and n is the total number of sample points. This definition is applicable in some cases, as in the following example.

    Example 3.1 (ref. Example 2.6)

According to Example 2.6, there are 2^4 = 16 sample points. A is the event that at least two trunk circuits are idle, B is the event that at most two circuits are idle, and C is the event that exactly one circuit is idle. We use combinatorial analysis to get the probabilities:

h_A = \binom{4}{2} + \binom{4}{3} + \binom{4}{4} = 11


and P(A) = h_A/n = 11/16 = 0.6875

h_B = \binom{4}{0} + \binom{4}{1} + \binom{4}{2} = 11

and P(B) = h_B/n = 11/16 = 0.6875

h_{A∪B} = \sum_{i=0}^{4} \binom{4}{i} = 16

and P(A ∪ B) = 16/16 = 1.

We find that a sure event has the probability 1.

h_{A∩B} = \binom{4}{2} = 6

and P(A ∩ B) = 6/16 = 0.375

h_C = \binom{4}{1} = 4

and P(C) = 4/16 = 0.25

h_{A∩C} = 0

and P(A ∩ C) = 0/16 = 0.

We find that an impossible event has the probability 0.

□
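The probabilities of Example 3.1 can be reproduced by enumerating all 2^4 = 16 states in Python (a short illustrative sketch):

    from itertools import product

    states = list(product((0, 1), repeat=4))       # 1 = busy, 0 = idle; 16 states
    idle = lambda s: s.count(0)

    A = {s for s in states if idle(s) >= 2}        # at least two circuits idle
    B = {s for s in states if idle(s) <= 2}        # at most two circuits idle
    C = {s for s in states if idle(s) == 1}        # exactly one circuit idle

    n = len(states)
    print(len(A) / n, len(B) / n)                  # 0.6875 0.6875
    print(len(A | B) / n, len(A & B) / n)          # 1.0 0.375
    print(len(C) / n, len(A & C) / n)              # 0.25 0.0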

Let us consider an experiment with sample space S. Let h be the number of times that the event A occurs in n repetitions of the experiment. Then we define the probability of A by

P(A) = \lim_{n→∞} \frac{h}{n}    (2.1)

Thus the probability of an event is the proportion of all experiments in which this event occurs when we make a very large number of experiments. From the definition we obtain a number of basic properties:

1. For every event A we have:

0 ≤ P(A) ≤ 1    (2.2)


2. An impossible event has zero probability:

P(∅) = 0    (2.3)

3. A sure event has probability of unity:

P(S) = 1    (2.4)

4. For any number of disjoint events A_1, A_2, …, A_k we have:

P(A_1 ∪ A_2 ∪ \cdots ∪ A_k) = \sum_{i=1}^{k} P(A_i)    (2.5)

In particular for two disjoint events:

P(A_1 ∪ A_2) = P(A_1) + P(A_2)    (2.6)

If S is a continuous sample space, then the probability of any particular sample point is zero. We therefore need the concept of a probability density:

P(A) = \int_A p(s) ds    (2.7)

This is similar to the discrete case (2.5), and all laws of probability still apply if we replace summation by integration.

    Example 3.2

In Example 2.4 we have a continuous sample space {9 ≤ t ≤ 10}. The probability of a call at 9:30 sharp is zero. If we assume the call is equally likely to occur anywhere between 9 and 10, then the density function becomes (1 hour)^{−1} = (60 minutes)^{−1}. The probability of a call between 9:29 and 9:31 then becomes 2/60 = 1/30.

□

    Example 3.3

A single die is thrown. The sample space is S = {1, 2, 3, 4, 5, 6}. If we assume the die is fair, then we assign equal probabilities to the sample points:

P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6


□

In some cases enough is known about the experiment to enumerate all possible outcomes and to state that these are equally likely. The probability of A is then equal to the ratio of the number of outcomes in which A is realized to the total number of outcomes. Combinatorial analysis is useful to find the relevant number of outcomes.

Example 3.4 (cf. Chapter 1, Example 4.4)

If all possible combinations are equally likely, then the probability of finding k busy circuits is given by (1.18) or (1.19).

□

    We now consider an important theorem in probability theory.

    2.4 Addition Rule

If A and B are two events, then

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)    (2.8)

This formula might be used to simplify the evaluations of Example 3.1.

If the events are mutually exclusive (disjoint), then

P(A ∪ B) = P(A) + P(B)    (2.9)

    Generalizations to 3 or more events can be made.

    2.5 Conditional Probability

    We now investigate a more complicated version of the case in Example 2.6.

Given that the first trunk circuit is known to be idle, we are interested in the probability P that at least two other circuits are idle. It is obvious that "the first circuit is idle" is an event, and that "at least two other circuits are idle" is also an event. So P is the probability that one event occurs under the condition that another event has occurred. This kind of probability is naturally called conditional probability.


We consider an experiment. If it is known that an event B has already occurred, then the probability that the event A has also occurred is known as the conditional probability. It is denoted by P(A | B), the conditional probability of A given B, and it is defined by

P(A | B) = \frac{P(A ∩ B)}{P(B)}    (2.10)

i.e. among the experiments in which B is realized, the proportion in which A is also realized.

    Example 3.5

Let A denote the event "a group of 5 circuits (a, b, c, d, e) contains 2 calls only, which occupy adjacent circuits". Let B denote the event "a group of 5 circuits contains 2 calls only, one of which occupies the circuit a".

From combinatorial analysis we know that 2 lines can be busy in \binom{5}{2} = 10 different ways. Therefore we get P(B) = 4/10 (ab, ac, ad, or ae occupied), and P(A ∩ B) = 1/10 (ab occupied). Hence

P(A | B) = \frac{1/10}{4/10} = \frac{1}{4}

□

    2.6 The Multiplication Theorem

From (2.10) we get the following formula:

P(A ∩ B) = P(A) · P(B | A) = P(B) · P(A | B)    (2.11)

If

P(A ∩ B) = P(A) · P(B)    (2.12)

then A and B are said to be (statistically) independent events. The probability of event A does not depend on whether B has occurred or not:

P(B | A) = P(B)  or  P(A | B) = P(A)
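As an illustration of (2.10) and (2.11), the quantities of Example 3.5 can be recomputed by enumerating the 10 equally likely pairs of busy circuits; a minimal Python sketch:

    from itertools import combinations

    pairs = list(combinations("abcde", 2))     # the 10 equally likely busy pairs
    A = {p for p in pairs if "".join(p) in ("ab", "bc", "cd", "de")}  # adjacent
    B = {p for p in pairs if "a" in p}         # one call occupies circuit a

    P = lambda E: len(E) / len(pairs)
    assert P(B) == 4 / 10 and P(A & B) == 1 / 10
    assert P(A & B) / P(B) == 1 / 4            # P(A | B) by definition (2.10)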


    2.8 Exercises

1. Telephones returned to a workshop for repair are subject to three kinds of defects, namely A, B and C. A sample of 1000 pieces was inspected with the following results:

400 had type A defect (and possibly other defects).
500 had type B defect (and possibly other defects).
300 had type C defect (and possibly other defects).
60 had both type A and B defects (and possibly C).
100 had both type A and C defects (and possibly B).
80 had both type B and C defects (and possibly A).
20 had all of type A, B and C defects.

Let A, B, C be subsets of the universe consisting of all 1000 telephones.

(a) Make a Venn Diagram representing the above subsets. Find from this diagram:
(b) The number of telephones which had none of these defects.
(c) The number of telephones which had at least one of these defects.
(d) The number of telephones which were free of type A and B defects.
(e) The number of telephones which had no more than one of these defects.

2. A sample space consists of the following sample points: U = {E_1, E_2, E_3, E_4, E_5, E_6}. We define the subsets:

A = {E_2, E_3, E_5}
B = {E_3, E_5, E_6}

    (a) Find the intersection of A and B.


    (b) Find the union of A and B.

(c) Find the difference of A and B.

(d) Find the complement of A.

    (e) Find the complement of A relative to B.

3. A ball is drawn at random from a box containing 6 red balls, 4 white balls and 5 blue balls. Determine the probability that it is

    (a) red

    (b) white

    (c) blue

    (d) not red

    (e) red or white

    4. A die is thrown twice.

(a) Define the sample space for each throw and for the total experiment.

(b) Find the probability of the events A = "two times six" and B = "at least one six".

(c) Find the probability of not getting a total of 7 or 11 in the two throws.

5. Let the sample space in exercise 2 correspond to the throw of a die. We assign equal probability to the sample points. Find the probability P(B | A).

6. Determine the probability of three sixes in five throws of a fair die.

7. A box contains 8 red, 3 white and 9 blue balls. If 3 balls are drawn at random without replacement, determine the probability that:

(a) all 3 are red

(b) all 3 are white

    (c) 2 are red and 1 is blue

    (d) at least 1 is white

    (e) 1 of each color is drawn

    (f) balls are drawn in the order of red, white, blue.

    8. Suppose six dice are thrown simultaneously. What is the probability of getting

    (a) all faces alike

    (b) no two faces alike

    (c) only five different faces


9. Try to generalize formula (2.8) to 3 events and prove it.

    Updated: 2001.01.10


Chapter 3

Elements of mathematical statistics

A stochastic variable is characterized by its (cumulative) distribution function F(x):

F(x) = P{X ≤ x},  −∞ < x < ∞    (3.1)

Here X ≤ x is a shorthand notation for the event corresponding to the set of all points s in the sample space S for which X(s) ≤ x. F(x) is a non-decreasing function of x, and F(−∞) = 0, F(∞) = 1.

    3.2 Discrete Probability Distributions

Let X be a discrete stochastic variable which can take the values x_1, x_2, …, x_n (a finite number or countably many values). If these values are assumed with probabilities given by

P{X = x_k} = f(x_k)    (3.2)

then we introduce the (probability) density function (frequency function) denoted by

P{X = x} = f(x)    (3.3)

For x = x_k this reduces to (3.2), while for other values of x we have f(x) = 0.

In general a function f(x) is a density function if

f(x) ≥ 0    (3.4)

and

\sum_x f(x) = 1    (3.5)

where the sum is to be taken over all possible values of x.

The distribution function is obtained from the density function by noting that

F(x) = P{X ≤ x} = \sum_{u≤x} f(u)    (3.6)

Example 1.2 (ref. Example 2.6 of Chapter 2)

Let us define a discrete random variable X that counts the number of busy circuits in a trunk group of 4 trunks. X takes only the values 1, 2, 3 and 4. Let the probabilities be given by

p(1) = 0.40, p(2) = 0.35, p(3) = 0.15, p(4) = 0.10

Then we can get the distribution function F(x) of the discrete random variable X as follows:


F(0) = 0
F(1) = p(1) = 0.40
F(2) = p(1) + p(2) = 0.75
F(3) = p(1) + p(2) + p(3) = 0.90
F(4) = p(1) + p(2) + p(3) + p(4) = 1.00

Thus for example F(3.4) = 0.90 (ref. Fig. 3.1).

[Fig. 3.1 Distribution function of Example 1.2]
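The step-shaped distribution function of Example 1.2 can be written out directly; a small Python sketch (function and variable names are ours, not the text's):

    import bisect
    import math

    xs = [1, 2, 3, 4]
    p = [0.40, 0.35, 0.15, 0.10]
    cum = [sum(p[:i + 1]) for i in range(len(p))]      # 0.40, 0.75, 0.90, 1.00

    def F(x):
        """Step distribution function F(x) = P{X <= x} of Example 1.2."""
        i = bisect.bisect_right(xs, x)
        return cum[i - 1] if i else 0.0

    assert F(0) == 0.0
    assert math.isclose(F(3.4), 0.90)                  # as in the text
    assert math.isclose(F(4), 1.00)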

    3.3 Continuous Probability Distributions

When X is a continuous stochastic variable, the probability that X takes any one particular value is in general zero. We noticed, however, in Chapter 2 (2.7) that the probability that X lies between two different values is meaningful. In fact a < X ≤ b is the event corresponding to the set ]a, b].

The concept of probability density leads us to the introduction of a (probability) density function f(x), where

f(x) ≥ 0    (3.7)

\int_{−∞}^{+∞} f(u) du = 1    (3.8)

We denote the probability that X lies between a and b by

P{a < X ≤ b} = \int_a^b f(u) du    (3.9)


    3.4 Joint Distributions

We are often interested in two stochastic variables X and Y at the same time. These may be the outcomes of 2 experiments, or they may be a pair of figures emerging from a single experiment.

(X, Y) can be regarded as taking values in the product space (S × T) consisting of all pairs (s, t) with s ∈ S and t ∈ T.

We first consider the case of two discrete stochastic variables X, Y. The joint (probability) density function is defined by

f(x, y) = P{X = x, Y = y}    (3.13)

where

f(x, y) ≥ 0    (3.14)

\sum_{u,v} f(u, v) = 1    (3.15)

The probability that X = x_j and Y = y_k is given by

f(x_j, y_k) = P{X = x_j, Y = y_k}    (3.16)

The total probability P{X = x_j} is obtained by adding over all possible values of y_k:

P{X = x_j} = f_1(x_j) = \sum_k f(x_j, y_k)    (3.17)

This is called the marginal density function of X.

The joint distribution function of X and Y is defined by

F(x, y) = P{X ≤ x, Y ≤ y} = \sum_{u≤x} \sum_{v≤y} f(u, v)    (3.18)

The continuous case is easily obtained by analogy by replacing sums by integrals. It is also obvious how the mixed (discrete and continuous) case should be dealt with.

If the events X = x and Y = y are independent for all x and y, then we say that X and Y are independent stochastic variables. In this case

P{X = x, Y = y} = P{X = x} · P{Y = y}    (3.19)

or equivalently

f(x, y) = f(x) · f(y)    (3.20)


    Generalizations to more than two variables can also be made.

    Example 4.1

Consider again two consecutive throws of a die. Let X and Y correspond to the results of the first and second throw. We can easily see that X and Y are independent, and x, y ∈ {1, 2, 3, 4, 5, 6}, each with probability 1/6. Hence the two-dimensional variable (X, Y) takes on the pairs of values (i, j) where i, j ∈ {1, 2, 3, 4, 5, 6}.

Let Z = X + Y, the sum of the random variables X and Y. It is a one-dimensional random variable which takes on the values of the sum i + j: 2, 3, 4, …, 11, 12. Because of the independence of X and Y, we have

P(Z = 2) = P(X = 1, Y = 1) = P(X = 1) · P(Y = 1) = 1/36

P(Z = 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1) = 1/18

…

P(Z = 12) = P(X = 6, Y = 6) = P(X = 6) · P(Y = 6) = 1/36

It is easy to verify that \sum_{i=2}^{12} P(Z = i) = 1.

□
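The distribution of Z = X + Y in Example 4.1 can be tabulated by enumerating the 36 pairs; a short Python sketch using exact fractions:

    from collections import Counter
    from fractions import Fraction
    from itertools import product

    # Z = X + Y for two independent fair dice; 36 equally likely pairs (i, j).
    counts = Counter(i + j for i, j in product(range(1, 7), repeat=2))
    P = {z: Fraction(c, 36) for z, c in counts.items()}

    assert P[2] == Fraction(1, 36)
    assert P[3] == Fraction(1, 18)
    assert P[12] == Fraction(1, 36)
    assert sum(P.values()) == 1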

    3.5 Expected Values

For a discrete stochastic variable X taking the possible values x_1, x_2, …, x_n we define the expectation of X, or the mean of X, as follows:

E(X) = \sum_{j=1}^{n} x_j P{X = x_j} = \sum_{j=1}^{n} x_j f(x_j)    (3.21)

For the continuous case the expectation of X with density function f(x) is defined in a similar way:

E(X) = \int_{−∞}^{+∞} x f(x) dx    (3.22)

Let X be a stochastic variable. Consider a single-valued function g(t); then Y = g(X) is also a stochastic variable, and in analogy with (3.21) and (3.22) we define the expectation of g(X) by:


We derive the mean value of X:

E(X) = \frac{1}{\sqrt{2π}} \int_{−∞}^{+∞} x e^{−x^2/2} dx = \frac{1}{\sqrt{2π}} \left[ −e^{−x^2/2} \right]_{−∞}^{+∞} = 0

□

    Some properties of the expectation are shown below:

1. E(c) = c
2. E(c · X) = c · E(X)
3. E(X_1 + X_2) = E(X_1) + E(X_2)
4. E(X_1 · X_2) = E(X_1) · E(X_2), if X_1 and X_2 are independent

Here c is a constant and X, X_1, X_2 are stochastic variables whose expectations exist.

Of particular interest is the expectation of g(X) when g(X) = X^r, where r is a positive integer. Then m_r = E(X^r) is called the rth moment of X:

Discrete case:

m_r = \sum_j x_j^r f(x_j)    (3.25)

Continuous case:

m_r = \int_{−∞}^{+∞} x^r f(x) dx    (3.26)

We notice that:

m_1 = E(X),  m_2 = E(X^2)

The rth moment of a stochastic variable X about a is defined by E((X − a)^r).

Moments about the mean of X are denoted by μ_r:

Discrete case:

μ_r = \sum_j (x_j − E(X))^r f(x_j)    (3.27)


Continuous case:

μ_r = \int_{−∞}^{+∞} (x − E(X))^r f(x) dx    (3.28)

Of particular interest is the 2nd moment about the mean. This is called the variance:

Var(X) = E((X − E(X))^2)    (3.29)

This is a non-negative number. The square root of the variance is called the standard deviation.

We can easily get:

Var(X) = μ_2 = m_2 − m_1^2    (3.30)

Some properties of the variance are:

1. Var(c) = 0
2. Var(cX) = c^2 · Var(X)
3. Var(X_1 + X_2) = Var(X_1) + Var(X_2), if X_1 and X_2 are independent
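Definition (3.29) and identity (3.30) can be checked on a concrete density, e.g. that of Example 1.2; a small Python sketch with exact arithmetic:

    from fractions import Fraction

    # Density of Example 1.2 (number of busy circuits in a group of 4 trunks)
    f = {1: Fraction(40, 100), 2: Fraction(35, 100),
         3: Fraction(15, 100), 4: Fraction(10, 100)}

    m1 = sum(x * p for x, p in f.items())                 # E(X), eq. (3.21)
    m2 = sum(x**2 * p for x, p in f.items())              # E(X^2)
    var = sum((x - m1)**2 * p for x, p in f.items())      # definition (3.29)
    assert var == m2 - m1**2                              # identity (3.30)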

    Example 5.4

We calculate the variances

1. in Example 5.1:

m_1 = E(X) = 0.6
m_2 = E(X^2) = (−1)^2 · 0.2 + 1^2 · 0.8 = 1
μ_2 = Var(X) = m_2 − m_1^2 = 0.64

2. in Example 5.2:

m_1 = λ
m_2 = E(X^2) = \sum_{x=0}^{∞} x^2 \frac{λ^x}{x!} e^{−λ} = e^{−λ}(λ^2 e^{λ} + λ e^{λ}) = λ^2 + λ
μ_2 = Var(X) = m_2 − m_1^2 = λ


3. in Example 5.3:

m_1 = E(X) = 0
m_2 = E(X^2) = \frac{1}{\sqrt{2π}} \int_{−∞}^{+∞} x^2 e^{−x^2/2} dx = 1
μ_2 = Var(X) = m_2 − m_1^2 = 1

□

For a discrete stochastic variable assuming non-negative integers as values, the Binomial moments are very useful in teletraffic theory. The rth Binomial moment is defined by

β_r = \sum_{i=r}^{∞} \binom{i}{r} p(i)    (3.31)

The results given above can be extended to two or more variables having joint density functions, e.g. f(x, y):

E(X) = \sum_u \sum_v u f(u, v)    (3.32)

An interesting quantity arising in the case of two variables is the covariance, defined by

Cov(X, Y) = E((X − E(X))(Y − E(Y)))    (3.33)

If X and Y are independent, then Cov(X, Y) = 0.

On the other hand, if X and Y are identical (X = Y), then

Cov(X, Y) = (Var(X) · Var(Y))^{1/2} = Var(X)    (3.34)

Thus we are led to a measure of the dependence of the variables X and Y given by

ρ = \frac{Cov(X, Y)}{(Var(X) · Var(Y))^{1/2}}    (3.35)

This is a dimensionless quantity called the correlation coefficient or coefficient of correlation.

    3.6 Some Discrete Distributions

    We shall now describe some important distributions of the discrete type.


The Poisson distribution, however, is not only an approximation of the Binomial distribution; it is a distribution in its own right, as we shall see in teletraffic theory.

    Example 6.2

If we examine a large number of subscribers, each of which has a small probability of being busy, then the number found busy will follow a Poisson distribution.

□

Example 6.3

The number of calls incoming to an exchange during one hour will also follow a Poisson distribution.

□

    3.6.3 Other Discrete Distributions

The random variable which in a Bernoulli sequence counts the number of trials to get the first success is called a geometric random variable. It is described by the Geometric distribution. In some cases we don't include the trial for success, so that the values assumed are k = 0, 1, …. The geometric distribution shown in Table 3.1 includes the success (k = 1, 2, …). By adding k geometric distributions we get the Negative Binomial distribution (Pascal distribution), which is also shown in Table 3.1. In Chapter 1 we indicated the Hypergeometric distribution (formula (1.17)). From Table 3.1 we notice the close relationship between the Binomial, the Geometric and the Negative Binomial distributions.

    3.7 Some Continuous Distributions

    3.7.1 Normal Distribution

    This is a continuous distribution with the density function

f(t) = \frac{1}{σ\sqrt{2π}} \exp\left(−\frac{1}{2}\left(\frac{t − μ}{σ}\right)^2\right),  −∞ < t < +∞    (3.42)


with (Examples 5.3 & 5.4):

E(T) = μ    (3.43)

Var(T) = σ^2    (3.44)

One usually writes T = N(μ, σ), which means that T is normally distributed with mean value μ and standard deviation σ.

The standard Normal distribution has a mean of 0 and a variance of 1, and forms the basis for tables of the Normal distribution. The properties of other Normal distributions are obtained from these tables by working in terms of the quantity (t − μ)/σ.

    3.7.2 Exponential Distribution

This distribution is called the negative exponential distribution in teletraffic theory. The density and distribution functions are

f(t) = λ e^{−λt},  t ≥ 0, λ > 0    (3.45)

respectively

F(t) = 1 − e^{−λt},  t ≥ 0, λ > 0    (3.46)

We have:

E(T) = \frac{1}{λ}    (3.47)

Var(T) = \frac{1}{λ^2}    (3.48)

    This is one of the most important distributions in teletraffic theory.

The well-known Markov or memoryless property is inherent in this distribution, as we have:

P{X > t + h | X > t} = P{X > h}

The stochastic variable "forgets" its age t.
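The memoryless property is a one-line computation with the survival function P{X > t} = e^{−λt}; a minimal Python sketch (the rate λ and the times t, h are arbitrary):

    import math

    lam = 0.5                                  # an assumed rate λ
    S = lambda t: math.exp(-lam * t)           # survival function P{X > t}

    t, h = 3.0, 2.0
    # Memoryless property: P{X > t + h | X > t} = P{X > h}
    assert math.isclose(S(t + h) / S(t), S(h))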

    3.7.3 Erlang-k Distribution

By adding k exponentially distributed stochastic variables we get a new stochastic variable which is Erlang-k distributed (Table 3.1). By allowing k to be non-integral this can be generalized to the Gamma distribution:

f(t) = \frac{λ (λt)^{k−1}}{Γ(k)} e^{−λt}    (3.49)
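The construction of the Erlang-k distribution as a sum of k exponential variables can be illustrated by simulation; a Python sketch (parameter values are arbitrary) whose sample mean and variance should come close to k/λ and k/λ²:

    import random
    import statistics

    random.seed(1)
    lam, k, trials = 2.0, 3, 100_000

    # Sum of k independent exponential(λ) variables is Erlang-k distributed.
    samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(trials)]
    print(statistics.mean(samples))        # close to E(T) = k/λ = 1.5
    print(statistics.variance(samples))    # close to Var(T) = k/λ² = 0.75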


3.8 Exercises

1. The following 40 observations of the number of arriving calls per time interval were obtained during a period from 8 a.m. to 10 a.m.:

06 08 08 07 07 06 09 07 06 03
09 07 11 07 12 09 13 08 09 05
08 15 09 09 19 16 10 11 11 15
17 12 16 14 15 14 09 10 14 14

(a) Make a diagram of the frequency of different numbers of arriving calls.

(b) Make a table of the (empirical) density function, i.e. of the relative frequency of different numbers of arriving calls.

(c) Make a table of the (empirical) distribution function.

(d) Calculate the mean value x̄ of the number of arriving calls for this measurement (1) from (b), (2) from the formula x̄ = \frac{1}{40} \sum_{i=1}^{40} x_i

(e) Calculate the variance of the number of arriving calls for this measurement (1) from (b), (2) from the formula Var = \frac{1}{n} \sum_{i=1}^{n} (x_i − x̄)^2

(In practice we divide the sum by n − 1 instead of n when we calculate the variance of observations, because this gives a better result. We use n only in theoretical analysis.)

    2. Prove (3.30).

    3. Show that if X and Y are independent, then Cov(X, Y) = 0.

    Updated: 2001.01.10


    Chapter 4

    Theory of sampling

    4.1 Sampling

We are often interested in drawing conclusions about a large set of objects, which we shall call a population. The population size N can be finite or infinite. Instead of examining the entire population (doing this is often impossible in practice) we observe only a sample of size n, which is a subset of the population. The process of obtaining samples is called sampling. The purpose is to obtain some knowledge about the population from results found in the sample.

Sampling where each element of a population may be chosen more than once is called sampling with replacement. In sampling without replacement each element cannot be chosen more than once.

A population is characterized by a stochastic variable X, which is defined by a distribution function F(t) having population parameters such as the mean value μ, the variance σ², etc. If we know F(t), then we have full information about the population. However, in real-world problems one often has little or no knowledge about the distribution underlying samples. So finding knowledge, when little or nothing is known of the underlying distribution, is the main topic of this chapter. In general we shall only try to get some knowledge (estimates) of some population parameters by sampling.

    4.2 Sampling Statistics

By taking random samples from the population these may be used to obtain estimates of the population parameters. An important problem in sampling theory is to decide how to


    form the sample statistics which will best estimate a given population parameter.

Let us pick n members from the population at random:

observations: x_1, x_2, …, x_n
sample size: n

Then we calculate the following sample statistics:

sample mean: x̄ = \frac{1}{n} \sum_{i=1}^{n} x_i    (4.1)

sample variance: s^2 = \frac{1}{n} \sum_{i=1}^{n} x_i^2 − x̄^2    (4.2)

These statistics are functions of stochastic variables and are therefore stochastic variables themselves.

The unknown population mean and variance are estimated by the following unbiased estimators:

μ = E{x̄}    (4.3)

σ^2 = E\left\{\frac{n}{n−1} s^2\right\}    (4.4)

(These important results are proven in mathematical statistics.) We now want to know how accurate these results are.
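The need for the n/(n − 1) correction in (4.4) can be seen by simulation: averaging the uncorrected s² of (4.2) over many samples underestimates σ². A Python sketch (sampling from a standard Normal population, so σ² = 1):

    import random
    import statistics

    random.seed(2)
    n, trials = 10, 20_000

    biased = []
    for _ in range(trials):
        xs = [random.gauss(0, 1) for _ in range(n)]
        xbar = sum(xs) / n
        biased.append(sum(x * x for x in xs) / n - xbar**2)   # s² as in (4.2)

    avg = statistics.mean(biased)
    print(avg)                   # about (n-1)/n * σ² = 0.9
    print(avg * n / (n - 1))     # about σ² = 1.0 after the n/(n-1) correction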

    4.3 The Central Limit Theorem

    This is a fundamental theorem from mathematical statistics:

If a sample of size n is taken from a population with finite mean μ and finite variance σ² (and otherwise any statistical distribution), then as n increases the distribution of the sample mean x̄ is asymptotically normally distributed (cf. section 3.7) with mean value μ and variance σ²/n. Or equivalently, the distribution of

Z = \frac{x̄ − μ}{σ/\sqrt{n}}    (4.5)

    tends towards the standard Normal distribution N(0, 1) as n increases.


    Example 4.1

Consider a sequence of Bernoulli random variables X_1, X_2, …, X_n that are independent, each with success probability p. We then have E(X_i) = p and Var(X_i) = p(1 − p). By the central limit theorem, we get:

\frac{x̄ − p}{\sqrt{p(1−p)/n}} → N(0, 1)  (n → ∞)

or

\frac{\sum_{i=1}^{n} X_i − np}{\sqrt{np(1−p)}} → N(0, 1)  (n → ∞)

Since we know from section 6 of Chapter 3 that S_n = \sum_{i=1}^{n} X_i is binomially distributed, the above expression shows that for large n an approximation to binomial probabilities can be obtained by using the Normal probabilities of N(np, \sqrt{np(1−p)}).

□
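The Normal approximation to the binomial distribution from Example 4.1 can be illustrated by simulation; a Python sketch (parameters chosen arbitrarily):

    import math
    import random

    random.seed(3)
    n, p, trials = 400, 0.3, 20_000
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))

    # Monte Carlo estimate of P{S_n <= np + sigma} for S_n binomial(n, p)
    hits = sum(
        sum(random.random() < p for _ in range(n)) <= mu + sigma
        for _ in range(trials)
    )
    Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF
    print(hits / trials, Phi(1.0))    # both approximately 0.84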

    4.4 Sampling Distribution

A sample statistic, which is calculated from a sample, is a function of random variables and is therefore itself a random variable. The probability distribution of a sample statistic is called the sampling distribution of the statistic. We shall only consider two sampling distributions for the sample mean.

4.4.1 Population mean μ and variance σ² are known

Suppose that the population from which samples are taken has a probability distribution with mean value μ and variance σ² (not necessarily a Normal distribution). Then it can be shown that the sampling distribution of x̄ is asymptotically normal with mean μ and variance σ²/n, i.e.:

Z = \frac{x̄ − μ}{σ/\sqrt{n}} → N(0, 1)  for n → ∞    (4.6)

    This is a consequence of the Central Limit Theorem in section 4.3.


The t-value yields a larger confidence interval than the z-value (we have less information because we don't know the population mean and variance), but for large values of n and for most practical purposes we often use the z-value.

    Example 4.3

For α = 5% we had z_{97.5%} = 1.96. From the t-distribution we get:

n      t_{97.5%}
1      12.71
2      4.30
5      2.57
10     2.23
20     2.09
50     2.01

We notice that for increasing n the t-value tends to 1.96.

□

For a given confidence level we have a relation between the confidence limits (confidence interval) and the sample size. If we want to reduce the confidence interval by a factor c, then we must increase the sample size by a factor c².

    Example 4.4

The average holding time of calls during a certain period in a telephone system is to be estimated. Based on a random sample of 100 holding times of calls, the sample mean and sample variance are calculated as x̄ = 5.74 time units and s² = 2.65 squared time units.

Find a 95% confidence interval for the true average holding time of calls in that period.

Let μ denote the true average holding time of calls. The confidence interval for μ, based on formula (4.9), is

\left( x̄ − \frac{s}{\sqrt{n}}\, t_{1−α/2,\, n−1},\; x̄ + \frac{s}{\sqrt{n}}\, t_{1−α/2,\, n−1} \right)

where n = 100, 1 − α = 0.95, x̄ = 5.74, s² = 2.65. We have t_{1−α/2, n−1} = 1.984. Therefore the


confidence interval for μ is (5.4170, 6.0630).

□
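The interval of Example 4.4 can be recomputed in a few lines of Python (the t-value 1.984 is taken from the example):

    import math

    n, xbar, s2 = 100, 5.74, 2.65
    t = 1.984                        # t_{97.5%} with n - 1 = 99 degrees of freedom
    half = t * math.sqrt(s2 / n)     # t * s / sqrt(n)
    print(xbar - half, xbar + half)  # approximately (5.417, 6.063)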

    4.5 Exercises

1. A bottle is supposed to contain 250 ml of wine, with a standard deviation of 3 ml. If we sample 200 such bottles at random, find the probability that the average amount of wine contained in a bottle will be

(a) At most 248 ml.
(b) At least 252 ml.
(c) Between 249 and 251 ml.

2. If X is a Poisson random variable with mean 81, find the approximate probability P(X ≤ 75).

3. Let X_1, X_2, …, X_n (n large enough to justify applying the Central Limit Theorem) be independent random variables, each Poisson with mean λ. Find an (approximate) 1 − α confidence interval for λ (α = 5%).

4. Suppose that it is observed that the average service life of one kind of machine part is 5 years, with a standard deviation of 1.2 years. By sampling 100 parts of this kind, we obtain x̄ = 4.75. Construct a confidence interval for μ with confidence level

(a) 99% (b) 95% (c) 90% (d) 80%

Does the length of the intervals increase or decrease as the confidence level decreases?

5. A certain kind of instrument is labeled as weighing 1.5 kg. A random sample of 50 instruments is measured to check the standard weight. We calculate x̄ = 1.47 kg, s² = 0.09 kg². Construct a confidence interval for μ with confidence level

(a) 80% (b) 95% (c) 99%.

Updated: 2001.01.10
