
Binary decision

A binary decision is a choice between two alternatives, for example true or false in a conditional statement.
1. Maximum Likelihood Detection

In a binary decision, let H0 and H1 be two hypotheses.

Each of the two messages generates a point z in the observation space Z. We want to divide Z into two decision regions, Z0 and Z1. We then make decision d0 (deciding that hypothesis H0 is true) if z ∈ Z0, and similarly make decision d1 if z ∈ Z1.

MLD criterion: We make decision d0 if p(z | H0), the conditional probability (or likelihood) of z given H0, is larger than p(z | H1), and make decision d1 if the opposite is true.

Likelihood ratio:

Λ(z) = p(z | H1) / p(z | H0)

Equivalently, we decide d1 if Λ(z) > 1, and d0 otherwise.
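As a sketch, the MLD rule can be implemented for an assumed pair of Gaussian observation models (the specific densities below, N(0, 1) under H0 and N(1, 1) under H1, are illustrative assumptions, not taken from the notes):

```python
import math

# Assumed observation models (not specified in the notes):
# H0: z ~ N(0, 1),  H1: z ~ N(1, 1).

def gaussian_pdf(z, mean, std):
    """Density of a normal distribution evaluated at z."""
    return math.exp(-0.5 * ((z - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def mld_decide(z):
    """Maximum Likelihood Detection: pick the hypothesis whose likelihood
    p(z | H) is larger; equivalently, decide d1 when the likelihood ratio
    L(z) = p(z | H1) / p(z | H0) exceeds 1."""
    ratio = gaussian_pdf(z, 1.0, 1.0) / gaussian_pdf(z, 0.0, 1.0)
    return "d1" if ratio > 1.0 else "d0"

print(mld_decide(0.2))  # closer to the H0 mean -> "d0"
print(mld_decide(0.9))  # closer to the H1 mean -> "d1"
```

With equal variances this rule reduces to deciding d1 whenever z is closer to the mean under H1 (here, whenever z > 0.5).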

2. Neyman-Pearson Criterion

In some cases we may not be able to assign an a priori probability to a hypothesis, so we need a decision rule that does not depend on assumptions about the a priori probability of each hypothesis.

An alternative classical solution for simple hypotheses was developed by Neyman and Pearson.

We denote:

P(d1 | H0) = PF (making decision d1 when H0 is true): the false-alarm probability.

P(d1 | H1) = PD (making decision d1 when H1 is true): the detection probability.

The perfect case, in which our rule is always right and never wrong (PD = 1 and PF = 0), cannot happen, because PD and PF either decrease or increase simultaneously.

Neyman-Pearson criterion: A better approach is to fix the false-alarm probability at or below a specified level while maximizing the detection probability. That means PF is fixed at a preselected value (the threshold), and PD is then maximized.
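Under the same kind of assumed Gaussian model (H0: z ~ N(0, 1), H1: z ~ N(1, 1); neither density is specified in the notes), the Neyman-Pearson design amounts to solving for the decision threshold that achieves the required PF, then evaluating the resulting PD:

```python
from statistics import NormalDist

# Assumed Gaussian model (not from the notes):
# H0: z ~ N(0, 1),  H1: z ~ N(1, 1);  we decide d1 when z > gamma.

def np_threshold_and_pd(alpha):
    """Neyman-Pearson design: pick the threshold gamma so that the
    false-alarm probability PF = P(z > gamma | H0) equals alpha,
    then report the resulting detection probability PD = P(z > gamma | H1)."""
    h0 = NormalDist(0.0, 1.0)
    h1 = NormalDist(1.0, 1.0)
    gamma = h0.inv_cdf(1.0 - alpha)   # PF = 1 - Phi(gamma) = alpha
    pd = 1.0 - h1.cdf(gamma)          # detection probability under H1
    return gamma, pd

gamma, pd = np_threshold_and_pd(0.05)
print(f"threshold = {gamma:.3f}, PD = {pd:.3f}")
```

Raising the allowed PF lowers the threshold and therefore raises PD, which is the trade-off the criterion manages.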


3. Maximum A Posteriori (MAP) Decision

If we know the a priori probabilities P(H0) and P(H1) (the probabilities of hypotheses H0 and H1), we can make a better decision: based on Bayes' rule, we find the a posteriori probabilities.

MAP criterion: We make decision H0 if its a posteriori probability given z is the maximum (p(H0 | z) > p(H1 | z)), and make decision H1 if the opposite is true.

Another good property of the MAP decision is that it minimizes the probability of error (making an incorrect decision, i.e., false alarm and missed detection).

Likelihood ratio:

Λ(z) = p(z | H1) / p(z | H0)

We decide H1 if Λ(z) > P(H0) / P(H1), and H0 otherwise.
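A minimal sketch of the MAP rule, assuming unequal priors and the same illustrative Gaussian likelihoods (none of these numbers come from the notes):

```python
import math

# Assumed priors and observation models (illustrative only):
# P(H0) = 0.8, P(H1) = 0.2;  H0: z ~ N(0, 1),  H1: z ~ N(1, 1).
PRIOR_H0, PRIOR_H1 = 0.8, 0.2

def gaussian_pdf(z, mean):
    """Unit-variance normal density at z."""
    return math.exp(-0.5 * (z - mean) ** 2) / math.sqrt(2 * math.pi)

def map_decide(z):
    """MAP criterion: decide H1 when the likelihood ratio
    L(z) = p(z | H1) / p(z | H0) exceeds P(H0) / P(H1);
    equivalently, pick the hypothesis with the larger posterior p(H | z)."""
    ratio = gaussian_pdf(z, 1.0) / gaussian_pdf(z, 0.0)
    return "H1" if ratio > PRIOR_H0 / PRIOR_H1 else "H0"

print(map_decide(1.5))  # prior on H0 outweighs the likelihood -> "H0"
print(map_decide(2.5))  # evidence strong enough to overcome the prior -> "H1"
```

Note how the prior shifts the decision: at z = 1.5 the likelihood ratio exceeds 1 (ML alone would pick H1), but it does not exceed P(H0)/P(H1) = 4, so MAP still decides H0.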

4. Single observation, multiple decision


We want to estimate the number N of animals in a closed region.

Step 1: Catch r animals, mark them, and release them.
Step 2: After the animals disperse, catch n animals at random and count the number i of marked animals.

P(i | N) = C(r, i) · C(N − r, n − i) / C(N, n)

The MLD chooses the value of N that maximizes P(i | N), the probability of the observed event i when there are actually N animals.
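The estimate can be computed directly from the formula above by scanning candidate values of N; the counts r = 100, n = 50, i = 12 below are made-up numbers for illustration:

```python
from math import comb

def p_obs(N, r, n, i):
    """Hypergeometric probability P(i | N) = C(r, i) * C(N - r, n - i) / C(N, n)."""
    if N < r or N - r < n - i:
        return 0.0
    return comb(r, i) * comb(N - r, n - i) / comb(N, n)

def mld_estimate(r, n, i, n_max=5000):
    """MLD for N: scan candidate population sizes and keep the one that
    maximizes the probability of the observed number i of marked animals."""
    candidates = range(max(r, n, r + n - i), n_max + 1)
    return max(candidates, key=lambda N: p_obs(N, r, n, i))

# Made-up counts: mark r = 100, recapture n = 50, observe i = 12 marked.
print(mld_estimate(100, 50, 12))  # -> 416
```

The same answer follows from the closed-form maximizer of the hypergeometric likelihood, floor(r·n / i) = floor(5000 / 12) = 416.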

5. Decision vs. Estimation


In the decision problem, the number of hypotheses is finite or countably infinite.
In the estimation problem, the number of hypotheses is uncountably infinite.

Você também pode gostar