
You agree that you will not discuss any question in this take-home final, directly or
indirectly, with anyone, in the class or outside. Any infringement of this agreement
will be reported to the Academic Integrity Office.
Send your code to with the exact subject line "MATH 282B (FINAL)".
Your file has to be in .R format. Please name it as follows: 282b-final-lastname-firstname.R.
Problem 1. Write a function, localLogistic(x, y, h, xnew = x), that takes in binary classification
data and fits a model by local logistic regression with the specified bandwidth (use the Gaussian
kernel), returning the probability that the class label is 1 at the specified new values of the predictor
variable. (In the notation of the slides, the fitted model, denoted π̂, is meant to estimate π(x) =
P(y = 1 | x). Then if the new predictor variables are xnew,1, ..., xnew,m, the function should return
π̂(xnew,1), ..., π̂(xnew,m).)
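A minimal sketch of such a function, assuming a local-linear logistic fit: at each query point a logistic regression is fit with Gaussian kernel weights centered there, and the fitted probability is read off at that point. (The local-linear choice and the use of `glm.fit` are implementation assumptions, not part of the problem statement.)

```r
localLogistic <- function(x, y, h, xnew = x) {
  x <- as.matrix(x)
  xnew <- as.matrix(xnew)
  apply(xnew, 1, function(x0) {
    d2 <- rowSums(sweep(x, 2, x0)^2)        # squared distances to the query point
    w <- exp(-d2 / (2 * h^2))               # Gaussian kernel weights
    X <- cbind(1, x)                        # intercept + linear terms
    fit <- suppressWarnings(                # non-integer weights trigger warnings
      glm.fit(X, y, weights = w, family = binomial()))
    eta <- sum(c(1, x0) * coef(fit))        # linear predictor at the query point
    1 / (1 + exp(-eta))                     # logistic link -> probability
  })
}
```

The bandwidth h enters only through the kernel weights, so small h gives a very local (wiggly) fit and large h approaches an ordinary logistic regression.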
Apply your function to data generated as follows. Let x1, ..., xn be iid standard normal in
dimension 2, and then generate y1, ..., yn ∈ {0, 1} according to
P(y = 1 | x = (x1, x2)) = 1 / (1 + (x1 - x2)^2).
Choose the sample size to be n = 1000. Produce a plot of π̂.
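One hypothetical way to generate this data set (the seed is an assumption); the commented lines indicate how π̂ could then be evaluated on a grid and displayed with image(), given a localLogistic implementation as specified above:

```r
set.seed(282)                                 # assumed seed
n <- 1000
x <- matrix(rnorm(2 * n), n, 2)               # iid standard normal in dimension 2
p <- 1 / (1 + (x[, 1] - x[, 2])^2)            # P(y = 1 | x)
y <- rbinom(n, 1, p)

# g <- seq(-3, 3, length.out = 50)
# grid <- as.matrix(expand.grid(g, g))
# phat <- localLogistic(x, y, h = 0.5, xnew = grid)
# image(g, g, matrix(phat, 50, 50), xlab = "x1", ylab = "x2")
```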
Problem 2. Write a function, stepVisual(x, y, type = c('forward', 'backward')), that takes in
regression data and returns the forward or backward model path (as specified), and also produces
a related plot showing how the coefficients change with each step,¹ as well as a plot tracing the
AIC as a function of the number of steps. There is no stopping rule here, meaning that forward
selection starts at the intercept-only model and ends at the full model, while backward selection
starts at the full model and ends at the intercept-only model.
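A minimal sketch under a greedy AIC search: at each step the single variable whose addition (forward) or deletion (backward) gives the lowest AIC is chosen, and the coefficient and AIC paths are recorded and plotted. (The greedy-AIC selection rule and the returned list structure are assumptions; the slides may prescribe a different criterion.)

```r
stepVisual <- function(x, y, type = c("forward", "backward")) {
  type <- match.arg(type)
  x <- as.matrix(x)
  p <- ncol(x)
  if (is.null(colnames(x))) colnames(x) <- paste0("x", seq_len(p))
  dat <- data.frame(y = y, x)
  fit1 <- function(vars)                      # fit on a given active set
    lm(if (length(vars)) reformulate(vars, "y") else y ~ 1, data = dat)
  active <- if (type == "forward") character(0) else colnames(x)
  coefPath <- matrix(0, p + 1, p, dimnames = list(NULL, colnames(x)))
  aicPath <- numeric(p + 1)
  fit <- fit1(active)
  coefPath[1, names(coef(fit))[-1]] <- coef(fit)[-1]
  aicPath[1] <- AIC(fit)
  for (k in seq_len(p)) {
    cand <- if (type == "forward") setdiff(colnames(x), active) else active
    scores <- sapply(cand, function(v) {      # AIC of each one-variable move
      vars <- if (type == "forward") c(active, v) else setdiff(active, v)
      AIC(fit1(vars))
    })
    best <- cand[which.min(scores)]
    active <- if (type == "forward") c(active, best) else setdiff(active, best)
    fit <- fit1(active)
    coefPath[k + 1, names(coef(fit))[-1]] <- coef(fit)[-1]
    aicPath[k + 1] <- AIC(fit)
  }
  matplot(0:p, coefPath, type = "l", lty = 1,
          xlab = "step", ylab = "coefficient")
  plot(0:p, aicPath, type = "b", xlab = "step", ylab = "AIC")
  invisible(list(coef = coefPath, aic = aicPath, active = active))
}
```

Coefficients of inactive variables are recorded as 0, so each row of coefPath is a point on the path and matplot() draws one trace per variable.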
Apply your function to data generated as follows. Let x1, ..., xn be iid standard normal in
dimension p, and then generate y1, ..., yn according to yi ~ N(β0 + β⊤xi, 1), where β0 = 0 and
β = (β1, ..., βp) with βj = 5/j². Choose p = 20 and n = 1000. Apply your function, with option
'forward', to some data generated as described. Also apply ridge regression and the lasso, producing
the corresponding plots.
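A hypothetical run of this data generation (the seed is an assumption), with the ridge and lasso coefficient-path plots produced via the glmnet package if it is installed:

```r
set.seed(282)                          # assumed seed
n <- 1000
p <- 20
x <- matrix(rnorm(n * p), n, p)        # iid standard normal in dimension p
beta <- 5 / (1:p)^2                    # beta_j = 5 / j^2
y <- drop(x %*% beta) + rnorm(n)       # beta_0 = 0, noise sd 1

# res <- stepVisual(x, y, type = "forward")   # stepwise path, as specified

if (requireNamespace("glmnet", quietly = TRUE)) {
  plot(glmnet::glmnet(x, y, alpha = 0), xvar = "lambda")  # ridge path
  plot(glmnet::glmnet(x, y, alpha = 1), xvar = "lambda")  # lasso path
}
```

With βj = 5/j² the signal decays quickly, so only the first few coefficients should enter early in the forward path and dominate the ridge/lasso traces.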
¹ We saw this kind of plot for ridge regression and the lasso, where the coefficients were shown to change with the
tuning parameter. Here the tuning parameter is the number of steps.