
1 Introduction
In this assignment, you will implement an approximate inference engine for the
Hidden Markov Model shown in Figure 1 (which is similar to the one introduced
in Lecture 21).
The HMM depicts the scenario shown in Lecture 21. In this model, there
is a sequence of latent (unobserved) random variables {R0, R1, ..., RT} which
represent whether it is raining outside, and a sequence of observed random
variables (evidence) {U0, U1, ..., UT} indicating whether the director arrives
with an umbrella.

Figure 1: The hidden Markov model.

Here, Rt = True denotes rain and Ut = True denotes that an umbrella is
observed. We define the following probabilities:
P(R0 = True) = 0.2
P(Rt = True | Rt-1 = True) = 0.7
P(Rt = True | Rt-1 = False) = 0.3
P(Ut = True | Rt = True) = 0.9
P(Ut = True | Rt = False) = 0.2
We are interested in estimating the distribution P(RT | U0, U1, ..., UT). To
calculate this distribution, you are required to use both the likelihood weighted
sampling method and the Gibbs sampling method to perform the approximate
inference.
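To make the first of the two required methods concrete, here is a minimal sketch of likelihood weighted sampling for this chain, written in Python (one of the allowed languages). The function name and structure are my own choices, not part of the assignment: non-evidence variables (the Rt) are sampled forward from the model, and each sample is weighted by the likelihood of the observed umbrella sequence.

```python
import random

# Model parameters taken from the assignment.
P_R0 = 0.2                           # P(R0 = True)
P_TRANS = {True: 0.7, False: 0.3}    # P(Rt = True | Rt-1)
P_UMB = {True: 0.9, False: 0.2}      # P(Ut = True | Rt)


def likelihood_weighting(evidence, n_samples=10000, seed=0):
    """Estimate (P(RT=True | U0..UT), P(RT=False | U0..UT)).

    evidence: list of booleans, one per observed variable U0..UT.
    """
    rng = random.Random(seed)
    total_weight = 0.0
    true_weight = 0.0
    for _ in range(n_samples):
        # Sample R0 from its prior, then weight by the evidence U0.
        rain = rng.random() < P_R0
        weight = P_UMB[rain] if evidence[0] else 1.0 - P_UMB[rain]
        # Sample each Rt from the transition model, weighting by Ut.
        for u in evidence[1:]:
            rain = rng.random() < P_TRANS[rain]
            weight *= P_UMB[rain] if u else 1.0 - P_UMB[rain]
        total_weight += weight
        if rain:                     # 'rain' now holds the sampled RT
            true_weight += weight
    p_true = true_weight / total_weight
    return p_true, 1.0 - p_true
```

With a single observation U0 = True, the exact posterior is P(R0 = True | U0 = True) = (0.2 x 0.9) / (0.2 x 0.9 + 0.8 x 0.2) ≈ 0.529, which the estimate should approach as the sample count grows.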
2 Deliverables
You can write your program in C/C++ or Python. Your program takes a
text file (observation.txt) as input. The text file contains a sequence of binary
numbers (separated by white space), e.g. 0 1 1 0. This sequence corresponds
to the observed variable sequence {U0, U1, ..., UT}, with 0 indicating False and
1 indicating True. You can infer the sequence length T from the sequence of
binary numbers.
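Parsing this input format is straightforward; a possible Python sketch (the function name is my own) is:

```python
def read_observations(path):
    """Parse whitespace-separated 0/1 tokens into booleans for U0..UT."""
    with open(path) as f:
        return [tok == "1" for tok in f.read().split()]
```

For the example file contents `0 1 1 0` this yields `[False, True, True, False]`, so T can be recovered as `len(observations) - 1`.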
If you write your code in C/C++, you must supply a Makefile with a rule
called inference to compile your program into a Linux executable binary named
inference.bin. Your program must be able to be compiled and run as follows:
$ make inference
$ ./inference.bin observation.txt
In the case of Python, your program must be run as follows:
$ python inference.py observation.txt
The result of an inference method returned by your program must be written
as two decimal values (corresponding to the values of P(RT = True | ...) and
P(RT = False | ...)) to the standard output stream, separated by white space
and followed by the name of the inference method. For example:
0.873 0.127 Likelihood
0.872 0.128 Gibbs
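The second required method can produce output in exactly this format. Below is a hedged Python sketch of Gibbs sampling for this chain; the function names and the burn-in parameter are my own choices. Each Rt is repeatedly resampled from its full conditional, which is proportional to the transition into Rt, the transition out of Rt (when t < T), and the evidence term for Ut.

```python
import random

# Model parameters taken from the assignment.
P_R0 = 0.2                           # P(R0 = True)
P_TRANS = {True: 0.7, False: 0.3}    # P(Rt = True | Rt-1)
P_UMB = {True: 0.9, False: 0.2}      # P(Ut = True | Rt)


def _trans(prev, cur):
    """P(Rt = cur | Rt-1 = prev)."""
    return P_TRANS[prev] if cur else 1.0 - P_TRANS[prev]


def _emit(rain, u):
    """P(Ut = u | Rt = rain)."""
    return P_UMB[rain] if u else 1.0 - P_UMB[rain]


def gibbs_sampling(evidence, n_iter=20000, burn_in=2000, seed=0):
    """Estimate (P(RT=True | U0..UT), P(RT=False | U0..UT)) by Gibbs sampling."""
    rng = random.Random(seed)
    n = len(evidence)
    state = [rng.random() < 0.5 for _ in range(n)]   # random initial R0..RT
    kept = 0
    count_true = 0
    for it in range(n_iter):
        for t in range(n):
            scores = {}
            for val in (True, False):
                # Factor for the transition (or prior) into Rt.
                p = (P_R0 if val else 1.0 - P_R0) if t == 0 \
                    else _trans(state[t - 1], val)
                # Factor for the transition out of Rt, if Rt+1 exists.
                if t + 1 < n:
                    p *= _trans(val, state[t + 1])
                # Factor for the observed umbrella variable Ut.
                p *= _emit(val, evidence[t])
                scores[val] = p
            p_true = scores[True] / (scores[True] + scores[False])
            state[t] = rng.random() < p_true
        if it >= burn_in:                            # discard burn-in samples
            kept += 1
            if state[-1]:
                count_true += 1
    p = count_true / kept
    return p, 1.0 - p


p_t, p_f = gibbs_sampling([False, True, True, False], n_iter=5000, burn_in=500)
print(f"{p_t:.3f} {p_f:.3f} Gibbs")                  # required output format
```

The exact probabilities printed will differ from the example above, since they depend on the observation sequence and the random seed; only the format is fixed by the assignment.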
3 Submission and Assessment
Two example input files, observation1.txt and observation2.txt, will be supplied
for testing your code.
You must submit, by the due date, two files:
- A file containing your code with necessary dependencies.
- A file with a short written report detailing your implementation in no more
than 1 page. A figure showing the relationship between the estimated
posterior probability and the number of iterations is needed for the two
methods. For example, you can vary the iteration number from 100 to
20000 and plot the estimated posterior probabilities (for the HMM in
observation2.txt) for each iteration number.
The implementation of the likelihood weighted sampling method takes up 40
marks and the implementation of Gibbs sampling takes up 60 marks. For each
part, your work will be assessed on the basis of quality of code (40%), quality of
write-up (30%), and accuracy of results (30%).
