STAT 3701 Homework 6
Show all work. Submit your solutions as a PDF document on Canvas. Include your R code (which must be commented and properly indented) in the PDF file. Also submit one text file with all your R code (comments and all), clearly labeled with the problem it goes with and properly indented. Before every solution that involves random sampling, use set.seed(3701).
Problem 1 (16 points)
For this problem let the objective function g be

g(x) = sin(x)/x, x ∈ (0, 6π].
We are interested in using the bisection search method to find the global minimizer x̂, i.e.,

x̂ = argmin_{x ∈ (0, 6π]} g(x)
(a) (4 points) Derive ∇g(x).
(b) (4 points) Using the bisection search algorithm bsearch from the notes with initial interval [a0, b0] =
[0.1, 6π]. Let x¯ be the critical point return by the algorithm. What is the value of g(¯x)?
(c) (4 points) Make a plot of g(x) over the domain (0, 6π]. Was the value x̄ you found in part (b) the global minimizer? Use seq(from = 0.1, to = 6*pi, by = 0.001) to generate x.list, the vector of x values over the domain.
(d) (4 points) Find the global minimizer of g(x) by using the bisection search algorithm and carefully choosing the initial interval. Note: your initial interval must have a width larger than 2π.
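Parts (b) and (d) rely on bsearch from the notes, which is not reproduced in this handout. As a reminder of how bisection search works, here is a minimal generic sketch on a toy function; the name bisect and the toy target x² − 2 are illustrative, not the notes' code. For the homework, the same idea is applied with f replaced by the derivative ∇g from part (a).

```r
# Minimal bisection sketch (illustrative; not the notes' bsearch).
# Finds a root of f on [a, b], assuming f(a) and f(b) have opposite signs.
bisect <- function(f, a, b, tol = 1e-8) {
  while (b - a > tol) {
    m <- (a + b) / 2                       # midpoint of the current interval
    if (sign(f(a)) * sign(f(m)) <= 0) {
      b <- m                               # sign change in [a, m]: keep left half
    } else {
      a <- m                               # otherwise the root lies in [m, b]
    }
  }
  (a + b) / 2                              # midpoint of the final tiny interval
}

root <- bisect(function(x) x^2 - 2, 1, 2)  # converges to sqrt(2)
```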
Problem 2 (14 points)
Suppose that x1, ..., xn are independent realizations from N(µ, σ²). We know that X̄ ∼ N(µ, σ²/n). Suppose we are interested in minimizing the objective function h with respect to a, where h is defined as

h(a) = (E[aX̄ − µ])² + Var(aX̄)
Specifically we are interested in finding the global minimizer â, i.e.,

â = argmin_{a ∈ R+} h(a),

where R+ denotes the set of positive real numbers.
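As a starting hint (this follows directly from the stated fact X̄ ∼ N(µ, σ²/n) and is not the full required derivation), the two pieces of h simplify as:

```latex
\[
\mathbb{E}\!\left[a\bar{X} - \mu\right] = (a-1)\mu,
\qquad
\mathrm{Var}\!\left(a\bar{X}\right) = a^{2}\,\mathrm{Var}\!\left(\bar{X}\right) = \frac{a^{2}\sigma^{2}}{n}.
\]
```

Substituting these into the definition of h(a) gives a function of a alone.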
(a) (7 points) Find ∇h(a) and ∇2h(a).
(b) (7 points) Based on ∇h(a), find the value ā such that ∇h(ā) = 0. Prove that ā is the global minimizer by checking the value of ∇²h(a).
Problem 3 (20 points)
Let X1, ..., Xn be iid observations from N(µ, σ²), where µ and σ² are usually unknown in real data analysis. Previously, we've been using the estimators µ̂ = (1/n) ∑_{i=1}^n Xi and σ̂² = (1/(n−1)) ∑_{i=1}^n (Xi − X̄)². Here we are interested in deriving the maximum likelihood estimators for µ and σ².
(a) (5 points) Write down the loglikelihood function l(µ, σ²; x1, ..., xn).
(b) (7 points) Now treat σ² as fixed. Minimize the negative loglikelihood function with respect to µ to get µ̂_mle. You will see that µ̂_mle does not depend on σ². (Hint: you may want to refer to Example 1.3 in the notes.)
(c) (8 points) To get the MLE for σ², plug µ̂_mle in for µ in the loglikelihood function, differentiate the loglikelihood function with respect to σ², and set it to 0. That is, the MLE σ̂²_mle solves the equation dl(µ̂_mle, σ²; x1, ..., xn)/dσ² = 0.
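Once the closed-form MLEs are derived, they can be sanity-checked numerically. The sketch below is optional and not part of the required derivation: the function name negll, the simulated data, and the use of optim are all illustrative choices, but the minimizer found numerically should agree with the analytic answers from parts (b) and (c).

```r
# Optional numerical sanity check (not part of the required derivation):
# minimize the negative loglikelihood over (mu, sigma2) on simulated data
# and compare against the closed-form answers from parts (b) and (c).
set.seed(3701)                              # per the submission instructions
x <- rnorm(100, mean = 2, sd = 3)           # toy data with known truth

negll <- function(par) {
  mu <- par[1]
  sigma2 <- par[2]
  if (sigma2 <= 0) return(Inf)              # keep the search in the valid region
  -sum(dnorm(x, mean = mu, sd = sqrt(sigma2), log = TRUE))
}

fit <- optim(par = c(0, 1), fn = negll)     # default Nelder-Mead search
fit$par                                     # numerical (mu_mle, sigma2_mle)
```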