A binary decision is a choice between two alternatives: for example, true or false, the branch taken by a conditional statement, or deciding which of two messages was sent.
1. Maximum Likelihood Detection
Let H0 and H1 be two hypotheses. Each of the two messages generates a point z in the observation space Z. It is desired to divide Z into two decision regions Z0 and Z1: the decision is d0 (hypothesis H0 is true) if z ∈ Z0, and d1 (hypothesis H1 is true) if z ∈ Z1.

The maximum likelihood detector (MLD) compares the conditional probability densities: decide d0 if p(z | H0) ≥ p(z | H1), and d1 otherwise.

Likelihood ratio: Λ(z) = p(z | H1) / p(z | H0). Equivalently, decide d1 if Λ(z) ≥ 1 and d0 if Λ(z) < 1.
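As a minimal sketch of the rule above, assume a scalar Gaussian observation with unit variance and means 0 under H0 and 1 under H1 (values chosen only for illustration, not taken from the text):

```python
import math

def gaussian_pdf(z, mean, sigma=1.0):
    """Conditional density p(z | H) for a Gaussian observation model."""
    return math.exp(-((z - mean) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def ml_decide(z, mean0=0.0, mean1=1.0):
    """ML rule: choose the hypothesis with the larger conditional density."""
    lam = gaussian_pdf(z, mean1) / gaussian_pdf(z, mean0)  # likelihood ratio Λ(z)
    return 1 if lam >= 1 else 0
```

With equal variances, this rule reduces to comparing z with the midpoint between the two means.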
2. Neyman-Pearson Criterion
We denote the false alarm probability PF = P(d1 | H0) and the detection probability PD = P(d1 | H1).
The perfect case, in which our rule is always right and never wrong (PD = 1 and PF = 0), cannot happen, because PD and PF increase or decrease together.
Neyman-Pearson criterion: a better approach is to fix the false alarm probability to be less than or equal to a specified level α and maximize PD subject to that constraint. The resulting test is again a likelihood ratio test, with a threshold η chosen so that PF = α.

If the prior probabilities P(H0) and P(H1) are known, the maximum a posteriori (MAP) rule can be used instead: decide H0 if p(H0 | z) ≥ p(H1 | z), and H1 if the opposite is true.
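A minimal sketch of choosing the Neyman-Pearson threshold, again assuming unit-variance Gaussians with means 0 (H0) and 1 (H1) purely for illustration: the decision region for d1 is {z ≥ z*}, and z* is set so that PF = P(z ≥ z* | H0) equals the target level α:

```python
from statistics import NormalDist

def np_threshold(alpha, mean0=0.0, sigma=1.0):
    """Threshold z* such that P(z >= z* | H0) = alpha for a Gaussian H0."""
    return NormalDist(mean0, sigma).inv_cdf(1 - alpha)

def detection_prob(alpha, mean0=0.0, mean1=1.0, sigma=1.0):
    """Resulting PD = P(z >= z* | H1) at the NP threshold."""
    zstar = np_threshold(alpha, mean0, sigma)
    return 1 - NormalDist(mean1, sigma).cdf(zstar)
```

Raising the allowed α increases PD as well, which illustrates the trade-off noted above: PF and PD move together.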
Another advantage of the MAP decision is that it minimizes the probability of error (of making an incorrect decision), which combines false alarm and missed detection.
In likelihood ratio form: Λ(z) = p(z | H1) / p(z | H0); decide H1 if Λ(z) ≥ P{H0} / P{H1}, and H0 otherwise.
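The MAP threshold test above can be sketched as follows (same illustrative Gaussian model as before; the prior P{H0} is a parameter):

```python
import math

def gaussian_pdf(z, mean, sigma=1.0):
    """Conditional density p(z | H) for a Gaussian observation model."""
    return math.exp(-((z - mean) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def map_decide(z, prior0, mean0=0.0, mean1=1.0):
    """MAP rule: decide H1 iff Λ(z) = p(z|H1)/p(z|H0) >= P{H0}/P{H1}."""
    lam = gaussian_pdf(z, mean1) / gaussian_pdf(z, mean0)
    return 1 if lam >= prior0 / (1 - prior0) else 0
```

With equal priors (prior0 = 0.5) the threshold is 1 and the MAP rule reduces to the ML rule.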
Example (estimating an animal population): suppose r animals are tagged, and a sample of n animals contains i tagged ones. Then

P(i | N) = C(r, i) · C(N − r, n − i) / C(N, n)

is the probability of observing i tagged animals in the sample when there are actually N animals. The MLD chooses the value N that maximizes P(i | N).
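A sketch of the search over N, assuming the hypergeometric model above (the search bound n_max is an arbitrary illustration parameter):

```python
from math import comb

def likelihood(N, r, n, i):
    """P(i | N): probability of i tagged animals in a sample of n, out of N total."""
    if N < r or N - r < n - i:
        return 0.0
    return comb(r, i) * comb(N - r, n - i) / comb(N, n)

def ml_population(r, n, i, n_max=1000):
    """Value of N that maximizes P(i | N) over the search range."""
    return max(range(max(r, n), n_max + 1), key=lambda N: likelihood(N, r, n, i))
```

The maximizer turns out to be close to the classical Lincoln-Petersen estimate N ≈ r·n / i.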