
REVIEW FOR FINAL

Ex1
Boolean Bayesian network above:
a) Which of the following are asserted by the BN structure?
   (i) P(B, I, M) = P(B) P(I) P(M)
   (ii) P(J | G) = P(J | G, I)
   (iii) P(M | G, B, I) = P(M | G, B, I, J)
b) Compute the probability P(b, ¬m, i, g, j)

Ex1 ctd
Boolean Bayesian network above:
a) If B = BrokeLaw, I = Indicted, M = PoliticalProsecutor, G = FoundGuilty, J = Jailed: what is the probability of going to jail if you broke the law, have been indicted, and the prosecutor is politically motivated?
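The joint-probability computation in Ex1 b) is just the BN chain rule: multiply each variable's CPT entry given its parents. A minimal sketch, assuming the structure B, M → I → G → J with J depending only on G (the actual network figure was not preserved); all CPT numbers below are invented placeholders, not the exercise's values:

```python
# P(b, ¬m, i, g, j) = P(b) * P(¬m) * P(i | b, ¬m) * P(g | b, i, ¬m) * P(j | g)
# All numbers are hypothetical stand-ins for the (missing) CPTs.
p_b = 0.9               # hypothetical P(B = true)
p_not_m = 1 - 0.1       # hypothetical P(M = true) = 0.1
p_i_given_b_notm = 0.5  # hypothetical CPT entry for I
p_g_given_b_i_notm = 0.8
p_j_given_g = 0.9

joint = p_b * p_not_m * p_i_given_b_notm * p_g_given_b_i_notm * p_j_given_g
print(round(joint, 4))  # 0.2916 with these placeholder numbers
```

With the real CPTs from the figure, the same five-factor product answers part b).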

Ex2

Let Hx be a random variable denoting the handedness of an individual x , with possible values l or r . A common hypothesis is that left- or right-handedness is inherited by a simple mechanism; i.e., perhaps there is a gene Gx , also with values l or r , and perhaps actual handedness turns out the same (with some probability s ) as the gene an individual possesses. Furthermore, perhaps the gene itself is equally likely to be inherited from either of an individual's parents, with a small non-zero probability m of a random mutation flipping the handedness.
a) Which of the above BNs claim that P(Gfather, Gmother, Gchild) = P(Gfather) P(Gmother) P(Gchild)?
b) Which make independence claims that are consistent with the hypothesis?
c) Which BN best describes the hypothesis?
d) Describe the CPT of Gchild in (i) and (ii)
e) Assume P(Gmother = l) = P(Gfather = l) = x. In BN (i), derive an expression for P(Gchild = l) in terms of m and x
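For part e), the hypothesis gives P(Gchild = l | parents) = 1 − m when both parents carry l, 1/2 when exactly one does (since (1/2)(1−m) + (1/2)m = 1/2), and m when neither does. A quick sketch that checks the resulting closed form against direct enumeration over the parents:

```python
# Closed form for BN (i): P(Gchild=l) = x^2 (1-m) + 2 x (1-x) (1/2) + (1-x)^2 m
def p_child_l(x, m):
    return x * x * (1 - m) + 2 * x * (1 - x) * 0.5 + (1 - x) ** 2 * m

def p_child_l_enumerated(x, m):
    # Sum over the four parent-genotype combinations.
    total = 0.0
    for gf in (True, False):          # father's gene is l?
        for gm in (True, False):      # mother's gene is l?
            p_parents = (x if gf else 1 - x) * (x if gm else 1 - x)
            # Pick a parent uniformly, copy its gene, flip it with prob m.
            p_l = 0.5 * ((1 - m) if gf else m) + 0.5 * ((1 - m) if gm else m)
            total += p_parents * p_l
    return total

for x, m in [(0.5, 0.0), (0.1, 0.01), (0.3, 0.2)]:
    assert abs(p_child_l(x, m) - p_child_l_enumerated(x, m)) < 1e-12
```

Note that at x = 1/2 the expression collapses to 1/2 for every m, which is a useful sanity check.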

Ex3
Consider decision trees with continuous input attributes A1, …, An and a Boolean output attribute Y. In such trees, the test at each internal node is an inequality of the form Ai > c, where c, the split point, may be any real number (to be chosen by the learning algorithm). The value at each leaf is true or false. In a test-once tree, each attribute may be tested at most once on any path in the tree; in a test-many tree, each attribute may be tested more than once on a path. Suppose we are given the following four examples:

a) Draw a test-once decision tree that classifies the examples correctly
b) Give the information gain at the root (no need to compute logs)
c) Is it true that any non-noisy training set can be classified by a test-once decision tree?
d) Same question as above, but with a test-many decision tree
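The information-gain calculation in b) is entropy at the root minus the weighted entropy after the split. A minimal sketch; the exercise's four-example table was lost in extraction, so the `examples` list below is a hypothetical stand-in with one attribute:

```python
from math import log2

def entropy(pos, neg):
    # Binary entropy of a pos/neg count, in bits.
    total = pos + neg
    if pos == 0 or neg == 0:
        return 0.0
    p = pos / total
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Hypothetical (A1, Y) examples; the split A1 > 2.5 separates them perfectly.
examples = [(1.0, False), (2.0, False), (3.0, True), (4.0, True)]

def info_gain(examples, c):
    pos = sum(y for _, y in examples)
    neg = len(examples) - pos
    left = [(a, y) for a, y in examples if a <= c]
    right = [(a, y) for a, y in examples if a > c]
    remainder = 0.0
    for side in (left, right):
        p = sum(y for _, y in side)
        remainder += len(side) / len(examples) * entropy(p, len(side) - p)
    return entropy(pos, neg) - remainder

print(info_gain(examples, 2.5))  # perfect split: gain = 1.0 bit
```

With the real examples, the same function evaluated at each candidate split point gives the root's information gain.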

Ex4
A 1-decision list (1-DL) is a decision tree with Boolean attributes in which at least one branch from every attribute test (except the last one) leads immediately to a leaf.
a) Draw a 1-DL that is equivalent to the disjunction of 3 attributes (a1, a2, a3)
b) Now consider the representation of Boolean functions as perceptrons over 0/1 inputs. Assume a fixed weight a0 = -1, 3 additional weights, and a step activation function with threshold at 0 and outputs +1/-1. Draw the perceptron representing the disjunction of the 3 attributes
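One perceptron that answers b) puts equal weight on each input; a sketch using the exercise's conventions (fixed bias a0 = −1, threshold at 0, outputs +1/−1). The weight value 2 per input is one choice that works, not the unique answer; any weight greater than 1 does:

```python
from itertools import product

def perceptron(inputs, weights=(-1, 2, 2, 2)):
    # weights[0] is the fixed bias a0 = -1 on a constant input of 1;
    # step activation with threshold 0, outputs +1 / -1.
    s = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return +1 if s >= 0 else -1

# Check it computes a1 OR a2 OR a3 on all 8 inputs.
for a in product((0, 1), repeat=3):
    expected = +1 if any(a) else -1
    assert perceptron(a) == expected
```

The all-zero input sums to −1 (output −1), and any single true input already lifts the sum to −1 + 2 = 1 ≥ 0 (output +1).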

Ex4 ctd

[Figures: three 1-decision lists over a1, a2, a3 and three perceptrons with weights (a0, a1, a2, a3) = (8, 2, -4, 8), (8, -8, +4, -2), and (12, -8, -4, -2)]
a) Which 1-DL is equivalent to which perceptron?
b) Give a simple argument to show that a perceptron cannot represent some data sets generated by decision trees
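For b), the classic witness is XOR(a1, a2): a two-node decision tree represents it, but no perceptron can, because the four constraints w0 < 0 (input 00 → false), w0 + w1 ≥ 0 and w0 + w2 ≥ 0 (inputs 10, 01 → true), and w0 + w1 + w2 < 0 (input 11 → false) are contradictory: adding the middle two gives 2w0 + w1 + w2 ≥ 0, while the first and last give 2w0 + w1 + w2 < w0 + w1 + w2 < 0. A small sketch that illustrates (not proves) this by grid search:

```python
from itertools import product

def step(s):
    return 1 if s >= 0 else 0

def matches_xor(w0, w1, w2):
    # Does this perceptron reproduce XOR on all four 0/1 inputs?
    return all(step(w0 + w1 * a + w2 * b) == (a ^ b)
               for a, b in product((0, 1), repeat=2))

# Exhaustive search over a coarse weight grid finds no solution; the
# inequality argument above shows none exists for any real weights.
grid = [x / 2 for x in range(-10, 11)]
solutions = [(w0, w1, w2) for w0 in grid for w1 in grid for w2 in grid
             if matches_xor(w0, w1, w2)]
print(solutions)  # []
```

The grid search is only illustrative; the linear-separability argument is what the exercise is after.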

Ex5
[Figure: grid MDP with states s1, s2 (top row) and s3, s4 (bottom row); the transition rewards shown include 0, 10, 20, and 100]

Compute the Q-values along the path (s3, up), (s1, right), (s2, down) and back, with gamma = 0.9 and all Q-values initialized to 0
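A sketch of the tabular Q-learning updates this exercise asks for, assuming (as is common when all Q-values start at 0) a learning rate of 1, so the update is Q(s,a) ← r + γ · max_a′ Q(s′,a′). The rewards and the "back" portion of the path below are hypothetical, since the figure's values were lost:

```python
gamma = 0.9
Q = {}  # (state, action) -> value, default 0.0

def q_update(s, a, r, s_next, actions):
    # Q-learning with learning rate alpha = 1.
    best_next = max(Q.get((s_next, a2), 0.0) for a2 in actions)
    Q[(s, a)] = r + gamma * best_next

actions = ("up", "down", "left", "right")
# One loop around the grid: s3 -up-> s1 -right-> s2 -down-> s4 -left-> s3.
# Rewards are placeholders for the figure's missing values.
trajectory = [("s3", "up", 0, "s1"), ("s1", "right", 0, "s2"),
              ("s2", "down", 100, "s4"), ("s4", "left", 0, "s3")]
for episode in range(2):  # repeat the loop so value propagates one step back
    for s, a, r, s2 in trajectory:
        q_update(s, a, r, s2, actions)

print(Q[("s2", "down")])  # 100.0: immediate reward, no future value at s4 yet
print(Q[("s1", "right")])  # ~90.0 on the second pass: 0 + 0.9 * 100
```

The second pass shows the key Q-learning behavior: reward discovered at (s2, down) propagates one step back per traversal, discounted by γ.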

Ex6


Given the following grammar

a) Which of the following sentences are generated by the grammar?
   (i) First Spike smelled fragrant then he smelled then he watered the violet violet
   (ii) the red red rose rose rose
   (iii) she was a violet violet violet
b) Give a parse tree for: First she watered the rose then it smelled
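Membership questions like a) can be checked mechanically with CYK once the grammar is in Chomsky normal form. The exercise's grammar was not preserved, so the tiny fragment below is hypothetical (it generates noun phrases like "the red red rose" via a recursive Adj rule), but the algorithm is the standard one:

```python
# Hypothetical CNF grammar fragment (NOT the exercise's grammar):
#   NP -> Det Nom,  Nom -> Adj Nom,  Det -> the,  Adj -> red,
#   Nom -> rose | violet
unary = {"the": {"Det"}, "red": {"Adj"}, "rose": {"Nom"}, "violet": {"Nom"}}
binary = {("Det", "Nom"): {"NP"}, ("Adj", "Nom"): {"Nom"}}

def cyk(words, start="NP"):
    # table[i][j] = set of nonterminals deriving words[i:j]
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i + 1] = set(unary.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b in table[i][k]:
                    for c in table[k][j]:
                        table[i][j] |= binary.get((b, c), set())
    return start in table[0][n]

print(cyk("the red red rose".split()))  # True
print(cyk("red the rose".split()))      # False
```

Repeated-word strings like (ii) are exactly what recursive rules such as Nom → Adj Nom admit, which is why they appear as test cases.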

Ex7

Consider this PCFG:
a) Which of the following have a nonzero probability of being generated as complete sentences?
   (i) shoots the duck well well well
   (ii) seems the well well
   (iii) shoots the unwell well badly
b) What is the probability of generating "is well well"?
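For b), the probability a PCFG assigns to a derivation is the product of the probabilities of the rules it uses (summed over all derivations of the string, if there is more than one). The exercise's grammar was not preserved, so the rules and probabilities below are a hypothetical fragment chosen only to show the computation:

```python
from functools import reduce

# Hypothetical fragment: S -> V Adv [1.0]; V -> is [0.5] | shoots [0.5];
# Adv -> Adv Adv [0.3] | well [0.4] | badly [0.3].
# One leftmost derivation of "is well well":
#   S -> V Adv (1.0); V -> is (0.5); Adv -> Adv Adv (0.3);
#   Adv -> well (0.4); Adv -> well (0.4)
rules_used = [1.0, 0.5, 0.3, 0.4, 0.4]
p = reduce(lambda a, b: a * b, rules_used, 1.0)
print(p)  # ~0.024 for this single derivation
```

With the real grammar, list every derivation of "is well well", multiply each one's rule probabilities, and add the results.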

Ex7 ctd
True/False: Given any PCFG, it is possible to calculate the probability that the PCFG generates a string of exactly 10 words (assuming a finite lexicon)
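The answer is True: for a PCFG in Chomsky normal form, the probability that nonterminal A yields a string of exactly n words satisfies p(A, 1) = Σ P(A → w) over the lexicon and p(A, n) = Σ over rules A → B C and split points k of P(A → B C) · p(B, k) · p(C, n − k), which is computable by dynamic programming. A sketch on a hypothetical one-nonterminal grammar:

```python
from functools import lru_cache

# Hypothetical grammar: A -> A A [0.4], A -> (some terminal) [0.6 total].
lex = {"A": 0.6}                        # summed lexical probability per symbol
bin_rules = {"A": [(("A", "A"), 0.4)]}  # binary rules per symbol

@lru_cache(maxsize=None)
def p_len(sym, n):
    # Probability that `sym` derives a string of exactly n words.
    if n == 1:
        return lex.get(sym, 0.0)
    total = 0.0
    for (b, c), pr in bin_rules.get(sym, ()):
        for k in range(1, n):
            total += pr * p_len(b, k) * p_len(c, n - k)
    return total

print(p_len("A", 10))  # probability of a string of exactly 10 words
```

The recurrence is O(n²) per nonterminal per rule, so "exactly 10 words" is cheap even for realistic grammars.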
