
Data Mining Assignment | 4120-COMP Assignment 4: Ensemble methods and other topics of Machine Learning

This assignment concerns data mining and machine learning algorithms.

Question 1

The following code shows an incorrect implementation of the AdaBoost algorithm. Please follow its
overall structure but modify it to make a correct implementation. You do not need to write code for
weak_classifier_train and weak_classifier_prediction, but you need to add more input and output
arguments to weak_classifier_train to create a correct implementation. Please clearly define the
variables you added.

def Adaboost_train(train_data, train_label, T):
    # train_data: N x d matrix
    # train_label: N x 1 vector
    # T: the number of weak classifiers in the ensemble
    ensemble_models = []
    for t in range(0, T):
        # model_param_t returns the model parameters of the learned weak classifier
        model_param_t = weak_classifier_train(train_data, train_label)
        # definition of model
        ensemble_models.append(model_param_t)
    return ensemble_models

def Adaboost_test(test_data, ensemble_models):
    # test_data: 1 x d
    decision_ensemble = 0
    for k in range(1, len(ensemble_models)):
        # prediction returns 1 or -1 prediction from the weak classifier
        prediction = weak_classifier_prediction(test_data)
        decision_ensemble = decision_ensemble + prediction
    if decision_ensemble > 0:
        prediction = 1
    else:
        prediction = -1
    return prediction
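For reference, a corrected version might look like the sketch below (an illustration, not an official solution). The key fixes are: `weak_classifier_train` must also receive a sample-weight vector and return the weighted error; each round computes a vote weight `alpha_t = 0.5 * ln((1 - err) / err)` and reweights the samples; and at test time `weak_classifier_prediction` must receive each model's parameters, with votes weighted by `alpha_t`. The decision-stump helpers are assumptions filled in here for completeness, since the assignment leaves them unspecified.

```python
import numpy as np

def weak_classifier_train(train_data, train_label, weights):
    # Illustrative weak learner: a decision stump (feature, threshold, polarity)
    # chosen to minimize the weighted classification error.
    N, d = train_data.shape
    best_param, best_err = None, np.inf
    for j in range(d):
        for thresh in np.unique(train_data[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (train_data[:, j] - thresh) >= 0, 1, -1)
                err = np.sum(weights[pred != train_label])
                if err < best_err:
                    best_param, best_err = (j, thresh, polarity), err
    return best_param, best_err

def weak_classifier_prediction(data, model_param):
    # Apply a trained stump to an array of samples; returns +1/-1 per sample.
    j, thresh, polarity = model_param
    return np.where(polarity * (data[:, j] - thresh) >= 0, 1, -1)

def Adaboost_train(train_data, train_label, T):
    N = train_data.shape[0]
    weights = np.full(N, 1.0 / N)          # D_1(i) = 1/N
    ensemble_models = []
    for t in range(T):
        model_param_t, err_t = weak_classifier_train(train_data, train_label, weights)
        err_t = max(err_t, 1e-12)          # guard against division by zero
        alpha_t = 0.5 * np.log((1 - err_t) / err_t)
        pred_t = weak_classifier_prediction(train_data, model_param_t)
        # Increase weight on misclassified samples, decrease on correct ones.
        weights = weights * np.exp(-alpha_t * train_label * pred_t)
        weights = weights / weights.sum()  # renormalize so D_{t+1} sums to 1
        ensemble_models.append((model_param_t, alpha_t))
    return ensemble_models

def Adaboost_test(test_data, ensemble_models):
    # Weighted majority vote over all T weak classifiers.
    decision_ensemble = sum(alpha_t * weak_classifier_prediction(test_data, mp)
                            for mp, alpha_t in ensemble_models)
    return np.where(decision_ensemble > 0, 1, -1)
```

Note that the vote loop runs over all models (the original skipped index 0) and each stump's vote is scaled by its `alpha_t` rather than counted equally.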

Question 3

Assume that the weak learners are a finite set of decision stumps. AdaBoost cannot achieve zero
training error if the training data is not linearly separable.

True
False

