Signal Detection Part 1 (Prob & Random)



    Signal Detection & Estimation (Part 1): Review of Probability & Random Processes

    This part is a review of probability concepts, random variables, and random

    processes. It is intended to bring all students to the same level for the further

    discussion of signal detection and estimation theory.

    Probability

    Probability theory is an application of measure theory, though the relationship

    between the two is more symbiotic. One by-product of the separate legacies of the

    two theories is that they have different names for the same concepts. In the

    language of probability, the scenario in which we plan to fit a probabilistic model is

    called the experiment. The set of all possible outcomes of this experiment is called
    the sample space (Ω), and subsets of the sample space are called events (A).

    Axiom 1. Given an experiment, there exists a sample space, Ω, representing the

    totality of possible outcomes of the experiment, and a collection, 𝒜, of subsets A of Ω, called events.

    Axiom 2. To each event A in 𝒜 there can be assigned a nonnegative number P(A) such that

    P(A) ≥ 0 ,  P(Ω) = 1

    and, for disjoint events (A ∩ B = ∅), P(A ∪ B) = P(A) + P(B).

    If A and B are two arbitrary events in the sample space Ω, then

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
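
    As an added illustration (not part of the original slides), the following sketch checks the
    addition rule numerically on a fair six-sided die; the events A and B are arbitrary choices
    made only for this example.

    # Sketch: verify P(A u B) = P(A) + P(B) - P(A n B) on a fair die (assumed example).
    omega = {1, 2, 3, 4, 5, 6}                    # sample space of a fair die
    P = lambda event: len(event) / len(omega)     # uniform probability measure
    A = {2, 4, 6}                                 # event "face is even"
    B = {1, 2, 3}                                 # event "face is at most 3"
    assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12
    print(P(A | B))                               # 0.8333...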

    Joint and Marginal Probability

    Consider the collection of events A1, ..., An. The probability that all of these occur is

    called the joint probability of the events Aj, written P(A1, A2, ..., An) =

    P(A1 ∩ A2 ∩ ... ∩ An), and can be manipulated using the above rules for arbitrary events.

    Marginal Probability. Suppose that the sample space can be partitioned into two

    different families of disjoint sets, {Ai} and {Bj}, i.e.,

    Ω = ∪ Ai (i = 1, ..., m) = ∪ Bj (j = 1, ..., n).

    P(Ai, Bj) is then the joint probability of events Ai and Bj.


    Hence we can write the marginal probability of Bj, formed by taking the

    sum of the joint probabilities of Bj with all the Ai:

    P(Bj) = Σ P(Ai, Bj), where the summation runs from i = 1 to m.
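
    A brief numerical sketch (the 2 x 3 joint table below is an assumed example, not from the
    slides): each marginal is obtained by summing the joint probabilities over the other index.

    import numpy as np

    # Hypothetical joint probability table P(A_i, B_j): rows i = 1..2, columns j = 1..3.
    joint = np.array([[0.10, 0.20, 0.10],
                      [0.15, 0.25, 0.20]])
    assert abs(joint.sum() - 1.0) < 1e-12

    P_B = joint.sum(axis=0)   # marginal P(B_j) = sum_i P(A_i, B_j)
    P_A = joint.sum(axis=1)   # marginal P(A_i) = sum_j P(A_i, B_j)
    print(P_B)                # [0.25 0.45 0.3 ]
    print(P_A)                # [0.4 0.6]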

    Conditional Probability

    A concept that will be of great use in estimation theory is conditional probability. A

    conditional probability is the probability of the occurrence of an event subject to the

    hypothesis that another event has occurred:

    P( B|A ) = P( A ∩ B ) / P( A ),  provided P(A) > 0.

    Also we can express P( A ∩ B ) = P( B|A ) P( A ) = P( A|B ) P( B ).

    The probability of A given B is related to the probability of B given A through Bayes' rule:

    P( A|B ) = P( B|A ) P( A ) / P( B )
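
    A quick worked sketch (the numbers below are illustrative assumptions, not from the slides),
    applying Bayes' rule with the total-probability expansion of P(B) over A and its complement.

    # Hypothetical prior and likelihoods, chosen only to illustrate Bayes' rule.
    P_A      = 0.3                               # prior P(A)
    P_B_A    = 0.8                               # likelihood P(B|A)
    P_B_notA = 0.1                               # likelihood P(B|not A)

    P_B   = P_B_A * P_A + P_B_notA * (1 - P_A)   # total probability P(B)
    P_A_B = P_B_A * P_A / P_B                    # Bayes' rule: P(A|B)
    print(P_A_B)                                 # ~0.774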

    Another important concept when dealing with collections of events is the idea of

    statistical independence. Two events A and B are independent if the occurrence of

    one event gives us no information about the occurrence of the other. This manifests

    itself in the conditional probability formula as

    P(A|B) = P(A),  or equivalently P(A ∩ B) = P(A) P(B).

    Two events, A and B, are orthogonal if one event implies that the other event will

    not occur, i.e., P(B|A) = 0.
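
    Continuing the illustrative die example from above (again an added sketch, not part of the
    original notes), independence and orthogonality can be checked directly from these definitions.

    # Events on a fair die (illustrative): C = "even face", D = "face <= 2", E = "odd face".
    omega = {1, 2, 3, 4, 5, 6}
    P = lambda ev: len(ev) / len(omega)
    C, D, E = {2, 4, 6}, {1, 2}, {1, 3, 5}

    print(abs(P(C & D) - P(C) * P(D)) < 1e-12)   # True -> C and D are independent
    print(P(C & E))                              # 0.0  -> C and E are orthogonal, P(E|C) = 0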


    Random Variable

    Given a probability space, (Ω, 𝒜, P), a random variable X(ω) : Ω → Rⁿ is a real (vector-)

    valued point function which carries a sample point, ω, into a point y ∈ Rⁿ in such a way that every set, A, of the form

    A = { ω : X(ω) ≤ x },  x ∈ Rⁿ

    is an event, i.e., A ∈ 𝒜.

    A real random variable X(ω) is said to be discrete if there exists a finite or countable

    set S = { xj } such that

    Σj P( ω : X(ω) = xj ) = 1
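
    As a minimal sketch (my own example, not from the slides): a discrete random variable X,
    the number of heads in two fair coin tosses, with S = {0, 1, 2}, whose probabilities sum to
    one and whose distribution function follows the event definition above.

    # Discrete random variable X = number of heads in two fair coin tosses (illustrative).
    pmf = {0: 0.25, 1: 0.50, 2: 0.25}                # S = {0, 1, 2}
    assert abs(sum(pmf.values()) - 1.0) < 1e-12      # sum_j P(X = x_j) = 1

    # Distribution function F(x) = P(X <= x), matching A = { w : X(w) <= x }.
    F = lambda x: sum(p for k, p in pmf.items() if k <= x)
    print(F(1))                                      # 0.75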

    Two random variables X and Y are called independent if any event of the form

    X(ω) ∈ A is independent of any event of the form Y(ω) ∈ B, where A and B are sets in

    Rⁿ. The joint probability distribution of two independent random variables is the

    product of their marginal distribution functions:

    F(x, y) = P( X ≤ x, Y ≤ y ) = P( X ≤ x ) P( Y ≤ y )

    = FX(x) FY(y),

    and the same is true for the joint density function of two absolutely continuous

    random variables:

    fXY(x, y) = ∂²F(x, y) / ∂x ∂y

    = ∂/∂x ∂/∂y [ FX(x) FY(y) ]

    = { ∂FX(x)/∂x } { ∂FY(y)/∂y }

    = fX(x) fY(y)
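
    A numerical sketch (added here, assuming numpy and an illustrative normal/uniform pair):
    drawing two independent random variables and checking that the empirical joint distribution
    function approximately factors into the product of the empirical marginals.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100_000)                     # X ~ N(0, 1)
    Y = rng.uniform(size=100_000)                    # Y ~ U(0, 1), independent of X

    x0, y0 = 0.5, 0.3
    F_joint = np.mean((X <= x0) & (Y <= y0))         # empirical F(x0, y0)
    F_prod  = np.mean(X <= x0) * np.mean(Y <= y0)    # empirical FX(x0) * FY(y0)
    print(F_joint, F_prod)                           # nearly equal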

    Important

    Suppose that we have a random variable, Y, defined as the sum of two

    independent random variables, X1 and X2: Y = X1 + X2. If X1 and X2 have density

    functions, fX1 and fX2, respectively, then what is the density function of Y?

    Define Z1 = Y = X1 + X2 and Z2 = X2 (a linear relation between the vectors),

    Z = A X, with A = [ 1 1 ; 0 1 ].

    The derived probability density can now be written from Z = g(X) = A X  =>  X = g⁻¹(Z) = A⁻¹ Z,

    X = g⁻¹(Z) = { Z1 − Z2 , Z2 },  with |det A| = 1,

    so that fZ(z1, z2) = fX1(z1 − z2) fX2(z2). Integrating out z2 then gives

    fY(y) = ∫ fX1(y − x2) fX2(x2) dx2 ,

    which finally turns out to be the convolution integral, fY = fX1 * fX2.
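
    The sketch below illustrates this result numerically (the choice of densities is my own,
    not from the slides): convolving two uniform densities on a grid yields the triangular
    density of their sum.

    import numpy as np

    dx = 0.001
    x  = np.arange(0.0, 1.0, dx)
    f1 = np.ones_like(x)                  # f_X1: uniform density on [0, 1)
    f2 = np.ones_like(x)                  # f_X2: uniform density on [0, 1)

    # f_Y(y) = integral f_X1(y - t) f_X2(t) dt, approximated by a discrete convolution.
    fY = np.convolve(f1, f2) * dx         # triangular density on [0, 2)
    print(fY.sum() * dx)                  # ~1.0; a density must integrate to one
    print(fY[len(fY) // 2])               # ~1.0; the peak of the triangle near y = 1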
