
Optimization in Machine Learning (2020 Winter)

Assignment 2

Instructions: For the code parts, submit the completed files on Canvas (both .py and .ipynb files are accepted). For the free-response parts, type your solutions into a separate electronic document (.pdf file). Physical submissions will NOT be accepted.

To submit, compress all your files into a single compressed file (.zip file). E-mail a soft copy of your code and answers to rkwon@mie.utoronto.ca.

If you have any questions about this assignment, please e-mail yhe@mie.utoronto.ca.

1. Linear Support Vector Machine

Download the dataset ‘prob1data.csv’. The dataset consists of two features in the first two columns and the classification in the third column (0 and 1 denote the two classes). A plot of the dataset is given below:

[Figure: scatter plot of the dataset, showing the two classes]

In this problem, you will solve both the primal and dual optimization problems of the linear soft-margin support vector machine using CVXPY and analyze the results.

(1a) (Code + Free Response) Complete the function ‘LinearSVM_Primal’ that solves the primal optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the optimization problem and report:

(1) The optimal decision boundary.

(2) The optimal support vectors.

(3) The solution time.

(1b) (Code + Free Response) Complete the function ‘LinearSVM_Dual’ that solves the dual optimization problem of the linear SVM. Using the entire dataset and C = 1, solve the optimization problem and report:

(1) The optimal dual solution.

(2) The optimal decision boundary.

(3) The optimal support vectors.

(4) The solution time.

(1c) (Free Response) Discuss whether the decision boundary of the linear SVM will change as the value of C is increased or decreased. If the decision boundary changes, briefly discuss how it changes with C and why. If it does not change with C, discuss the reason.

(1d) (Code + Free Response) Complete the function ‘Linearly_separable’ that outputs 1 if the dataset is linearly separable and 0 otherwise. Determine whether the given dataset is linearly separable. For any given dataset with multiple features, how can one conclude whether the dataset is linearly separable based on the optimal solution (optimal decision boundary) and the optimal objective function value? (Hint: consider varying C values.)

In the following problems, we will consider an alternative soft-margin method, known as the l2-norm soft-margin SVM. This new algorithm is given by the following primal optimization problem (notice that the slack penalties are now squared; n is the total number of data points):

    min_{w, b, ξ}  (1/2)‖w‖² + C ∑_{i=1}^{n} ξᵢ²
    s.t.  yᵢ(wᵀxᵢ + b) ≥ 1 − ξᵢ,  i = 1, …, n.

(The constraints ξᵢ ≥ 0 can be dropped here, since a negative slack is never optimal once the penalty is squared.)

(2a) (Code) Complete the function ‘gaussian_kernel_sigma’ that returns a function ‘gaussian_kernel’ with the specified σ value.
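A sketch of ‘gaussian_kernel_sigma’, assuming the common convention k(x, z) = exp(−‖x − z‖² / (2σ²)) (check the course notes, since some definitions put σ² or a γ directly in the denominator). The returned function computes a full Gram matrix, which is the signature sklearn's SVC accepts for a callable kernel:

```python
import numpy as np

def gaussian_kernel_sigma(sigma):
    """Return a Gaussian (RBF) kernel function for the given sigma."""
    def gaussian_kernel(A, B):
        # Pairwise squared Euclidean distances via the expansion
        # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a^T b.
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
    return gaussian_kernel
```

The np.maximum clamp guards against tiny negative distances from floating-point cancellation.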

(2b) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ and the ‘gaussian_kernel_sigma’ coded in (2a) to build a kernel SVM that classifies the training data X_train. Use C = 1 and σ = 0.1. Report:

(1) Number of support vectors.

(2) Prediction error (ratio) on the test set X_test.

(3) An approximate plot of the decision boundary.
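For (2b), a callable kernel can be passed straight to SVC. A minimal sketch on synthetic stand-in data (the real assignment uses the notebook's X_train/X_test; the kernel below assumes the 1/(2σ²) convention):

```python
import numpy as np
from sklearn.svm import SVC

def make_gaussian_kernel(sigma):
    # Gram-matrix kernel with the SVC callable signature K(A, B).
    def kernel(A, B):
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T)
        return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
    return kernel

# Stand-in data: a circle-shaped class, which a linear SVM cannot separate.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(80, 2))
y_train = (np.linalg.norm(X_train, axis=1) < 1.0).astype(int)

clf = SVC(C=1.0, kernel=make_gaussian_kernel(0.1))
clf.fit(X_train, y_train)
n_sv = len(clf.support_)        # (1) number of support vectors
```

Decision-boundary plots are typically drawn by evaluating clf.predict on a dense grid (e.g. from np.meshgrid) and contouring the result.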

Download the file ‘votes.csv’. This dataset consists of over 3000 counties in the United States, along with their socioeconomic and demographic information and voting records in the 2016 US election; each row corresponds to a single county.

(2c) (Code) The response variable will be prefer_trump, which is 0 or 1, indicating whether the percentage of people who voted for Trump in that county is greater than the percentage who voted for Clinton. Compute the response variable y.
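A sketch of the computation with pandas; the column names 'trump' and 'clinton' are hypothetical and should be replaced by the actual headers in votes.csv:

```python
import pandas as pd

def compute_prefer_trump(votes, trump_col='trump', clinton_col='clinton'):
    """Return a 0/1 Series: 1 where Trump's vote share exceeds Clinton's."""
    return (votes[trump_col] > votes[clinton_col]).astype(int)
```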

(2d) (Code + Free Response) Use ‘SVC’ from ‘sklearn.svm’ to implement a polynomial-kernel SVM with C = 10.0 and max_iter = 1e6. Implement the SVM with the kernel degree set to 1, then 2, 3, 4, and 5. For each model, report:

(1) Number of support vectors.

(2) Prediction error (ratio) on the training set X_train.

(3) Prediction error (ratio) on the test set X_test.
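The loop over degrees might be sketched as follows (the helper name and the dictionary layout are my own; note that sklearn's max_iter must be an int):

```python
import numpy as np
from sklearn.svm import SVC

def poly_svm_report(X_train, y_train, X_test, y_test, degrees=(1, 2, 3, 4, 5)):
    """Fit one polynomial-kernel SVC per degree and collect the requested stats."""
    results = {}
    for deg in degrees:
        clf = SVC(kernel='poly', degree=deg, C=10.0, max_iter=int(1e6))
        clf.fit(X_train, y_train)
        results[deg] = {
            'n_support': int(clf.n_support_.sum()),        # (1)
            'train_error': float(np.mean(clf.predict(X_train) != y_train)),  # (2)
            'test_error': float(np.mean(clf.predict(X_test) != y_test)),     # (3)
        }
    return results
```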

(2e) (Free Response) Based on the 5 models trained in (2d), how does the prediction error change with the degree of the polynomial kernel? Explain why. How does the number of support vectors change with the degree of the polynomial kernel? Explain why.
