

2019 Midterm due at Noon on Feb 23
Please submit the solutions by email to me or place them in my mailbox on the second floor of Old Chemistry.
Pr. 1.1 Consider the data set in Lab 1. This is the Advertisement.csv data set.
1. Compute the leave-one-out cross validation error for a Gaussian process regression
model as well as a Bayesian linear regression model.
2. Plot the predictive posterior distribution when observations 1, 50, 100, 150 are respectively
left out of the training set and you are asked to predict their response.
3. Use a bootstrap procedure to output confidence intervals when observations 1, 50, 100,
150 are respectively left out of the training set and you are asked to use ordinary least
squares as your regression method.
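A minimal sketch of part 1, comparing leave-one-out errors for the two models with scikit-learn. Since Advertisement.csv is not reproduced here, synthetic data stands in for it; the column roles in the comments are assumptions.

```python
# Sketch for Pr. 1.1 part 1: leave-one-out CV error for GP regression
# vs. Bayesian linear regression. Synthetic data stands in for the
# Advertisement.csv columns (an assumption, not the actual data set).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))        # stand-in predictor (e.g. TV spend)
y = 2.0 * X.ravel() + rng.normal(0, 1, 40)  # stand-in response (e.g. sales)

loo = LeaveOneOut()
for name, model in [("GP regression", GaussianProcessRegressor(alpha=1.0)),
                    ("Bayesian linear", BayesianRidge())]:
    scores = cross_val_score(model, X, y, cv=loo,
                             scoring="neg_mean_squared_error")
    print(f"{name}: LOO MSE = {-scores.mean():.3f}")
```

The same `LeaveOneOut` splitter can be reused for part 2 by refitting on each reduced training set and plotting the predictive posterior at the held-out point.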
Pr. 1.2 Write out the EM update steps for a mixture of multinomials model. Specifically,
consider the following likelihood
f(x_1, \dots, x_n; \pi, \{\theta_1, \dots, \theta_7\}) = \prod_{i=1}^{n} \Big[ \sum_{k=1}^{7} \pi_k f(x_i; \theta_k) \Big]
where x takes values 1, \dots, 4 (there are four categories) and
f(x = c; \theta) = \theta_c, \quad \theta_c \ge 0, \quad \sum_c \theta_c = 1.
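One way to sketch the requested updates, writing \gamma_{ik} for the E-step responsibilities (notation introduced here, not from the source):

```latex
% E-step: responsibilities under current parameters (\pi^{(t)}, \theta^{(t)})
\gamma_{ik} = \frac{\pi_k^{(t)}\,\theta^{(t)}_{k,x_i}}
                   {\sum_{j=1}^{7} \pi_j^{(t)}\,\theta^{(t)}_{j,x_i}}

% M-step: re-estimate mixture weights and category probabilities
\pi_k^{(t+1)} = \frac{1}{n}\sum_{i=1}^{n} \gamma_{ik},
\qquad
\theta^{(t+1)}_{k,c} = \frac{\sum_{i=1}^{n} \gamma_{ik}\,\mathbf{1}[x_i = c]}
                            {\sum_{i=1}^{n} \gamma_{ik}}
```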
Pr. 1.3 Given the classification data set in Lab 2 run regularized logistic regression versus SVM
and compare classification accuracy on a test-train split.
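A minimal sketch of the comparison with scikit-learn. Synthetic data stands in for the Lab 2 classification data set, which is not reproduced here.

```python
# Sketch for Pr. 1.3: L2-regularized logistic regression vs. an SVM,
# compared by test accuracy on a train/test split. Synthetic data is a
# stand-in for the Lab 2 data set.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
for name, clf in [("logistic (L2, C=1)", LogisticRegression(C=1.0)),
                  ("SVM (RBF, C=1)", SVC(C=1.0))]:
    acc = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: test accuracy = {acc:.3f}")
```

The regularization strength `C` matters for a fair comparison; in practice both values would be tuned, e.g. with `GridSearchCV` on the training split only.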
Pr. 1.4 Show that the EM algorithm does not decrease the likelihood value at each step.
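One standard route (a sketch, via Jensen's/Gibbs' inequality): decompose the log-likelihood using the posterior over the latent assignments z at the current parameters.

```latex
% For any \theta, with q(z) = p(z \mid x; \theta^{(t)}):
\log p(x;\theta)
  = \underbrace{\mathbb{E}_q[\log p(x,z;\theta)]}_{Q(\theta \mid \theta^{(t)})}
    \;-\; \mathbb{E}_q[\log p(z \mid x;\theta)]

% Gibbs' inequality: \mathbb{E}_q[\log q(z)] \ge \mathbb{E}_q[\log p(z \mid x;\theta)],
% with equality at \theta = \theta^{(t)}. Maximizing Q over \theta then gives
\log p(x;\theta^{(t+1)})
  \ge Q(\theta^{(t+1)} \mid \theta^{(t)}) - \mathbb{E}_q[\log q(z)]
  \ge Q(\theta^{(t)} \mid \theta^{(t)}) - \mathbb{E}_q[\log q(z)]
  = \log p(x;\theta^{(t)})
```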
Pr. 1.5 Sketch how the Least Angle Regression problem implements a form of sparse regression.
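A sketch of the sparsity behaviour with scikit-learn's `lars_path` on synthetic data (the data set here is an assumption for illustration): variables enter the active set one at a time, so intermediate models keep most coefficients exactly zero.

```python
# Sketch for Pr. 1.5: the LARS path (Lasso-modified) adds one variable
# to the active set per step, so early path points are sparse fits.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

X, y = make_regression(n_samples=50, n_features=8, n_informative=3,
                       noise=1.0, random_state=0)
alphas, active, coefs = lars_path(X, y, method="lasso")

# coefs has one column per path point; the path starts at all zeros
# and activates coordinates in the order recorded in `active`.
print("order variables become active:", active)
print("nonzero coefficients at each step:",
      [int(np.count_nonzero(coefs[:, j])) for j in range(coefs.shape[1])])
```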
Pr. 1.7 Run the sklearn.mixture program with 8 components. Output the probability assignment
for each observation. Explain the difference between a hard assignment versus a soft
assignment for each observation.
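A minimal sketch with `GaussianMixture` (one member of `sklearn.mixture`; the exact program and data set intended are assumptions here). `predict_proba` gives the soft assignment, a probability over the 8 components per observation, while `predict` gives the hard assignment, the single most probable component.

```python
# Sketch for Pr. 1.7: soft vs. hard assignments from an 8-component
# mixture model. Synthetic blobs stand in for the course data.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture

X, _ = make_blobs(n_samples=400, centers=8, random_state=0)
gm = GaussianMixture(n_components=8, random_state=0).fit(X)

soft = gm.predict_proba(X)   # shape (400, 8); each row sums to 1
hard = gm.predict(X)         # shape (400,); one component label each
print("soft assignment of obs 0:", soft[0].round(3))
print("hard assignment of obs 0:", hard[0])

# The hard assignment is the argmax of the soft probabilities.
assert (soft.argmax(axis=1) == hard).all()
```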

