1 Polynomial regression with MLE and MAP [40%]
We begin by exploring a maximum likelihood estimator (MLE) and maximum a posteriori (MAP) estimator on the
familiar polynomial regression problem.
1.1 Polynomial regression
The overdetermined degree-𝑚 polynomial regression problem, with an explicit corruption/noise model on the
data, seeks an interpolant across 𝑛 data samples (𝑥𝑖, 𝑦𝑖) that satisfies:

$$ y_i = \sum_{j=0}^{m} \theta_j x_i^j + \epsilon_i, \qquad i = 0, \dots, n-1, $$

where:
𝑥𝑖 is the independent coordinate of sample 𝑖, with 𝐱 = {𝑥0 … 𝑥𝑛−1},
𝑦𝑖 is the dependent coordinate of sample 𝑖, with 𝐲 = {𝑦0 … 𝑦𝑛−1},
𝜖𝑖 ∼ 𝒩(𝜇, 𝜎²) is Gaussian noise with mean 𝜇 and variance 𝜎² corrupting the outputs 𝑦𝑖, and
𝜽 = {𝜃0, 𝜃1, … , 𝜃𝑚} are the 𝑝 = 𝑚 + 1 polynomial coefficients we are solving for.
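The data model above can be sketched directly in NumPy. This is a minimal, illustrative example: the true coefficients `theta_true`, the noise parameters `mu` and `sigma`, and the sampling range for 𝑥 are all assumptions chosen for demonstration, not part of the assignment's data.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 30, 2                                 # n samples, degree-m polynomial
theta_true = np.array([1.0, -2.0, 0.5])      # theta_0 ... theta_m (illustrative)
mu, sigma = 0.0, 0.3                         # noise mean and std dev (illustrative)

x = rng.uniform(-1.0, 1.0, size=n)           # independent coordinates x_i
eps = rng.normal(mu, sigma, size=n)          # eps_i ~ N(mu, sigma^2)

# y_i = sum_j theta_j * x_i^j + eps_i
y = sum(theta_true[j] * x**j for j in range(m + 1)) + eps
```

Running this yields one synthetic draw of the corrupted observations 𝐲 from which 𝜽 must later be recovered.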
Note that one common way to rewrite this model is by “folding” the deterministic component into the mean of
the Gaussian, as:

$$ y_i \sim \mathcal{N}\!\left(\sum_{j=0}^{m} \theta_j x_i^j + \mu,\; \sigma^2\right). $$
1.2 Maximum likelihood estimation (MLE) [20%]
You will implement an MLE for the polynomial parameters 𝜽 that maximizes the data’s likelihood function:

$$ \boldsymbol{\theta}_{\mathrm{MLE}} = \arg\max_{\boldsymbol{\theta}} \; p(\mathbf{y} \mid \mathbf{x}, \boldsymbol{\theta}), $$
where, assuming the samples (𝑥𝑖, 𝑦𝑖) are drawn i.i.d., the likelihood function can be expressed using the
normal distribution’s density, as

$$ p(\mathbf{y} \mid \mathbf{x}, \boldsymbol{\theta}) = \prod_{i=0}^{n-1} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left( -\frac{\left( y_i - \sum_{j=0}^{m} \theta_j x_i^j - \mu \right)^2}{2\sigma^2} \right). $$
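The i.i.d. likelihood is straightforward to evaluate numerically as a product of normal densities. A minimal sketch, where the default values of `mu` and `sigma` are illustrative assumptions:

```python
import numpy as np

def likelihood(theta, x, y, mu=0.0, sigma=0.3):
    """Product of Gaussian densities N(y_i; sum_j theta_j x_i^j + mu, sigma^2).

    theta is ordered (theta_0, ..., theta_m); np.polyval expects the
    highest-degree coefficient first, hence the reversal.
    """
    pred = np.polyval(theta[::-1], x)
    dens = (np.exp(-((y - pred - mu) ** 2) / (2.0 * sigma**2))
            / np.sqrt(2.0 * np.pi * sigma**2))
    return dens.prod()
```

In practice this product underflows quickly as 𝑛 grows, which is one practical motivation for working with the log-likelihood below.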
Taking the log of the likelihood before taking the argmax — which is a valid transformation under argmax, given
log’s monotonicity — yields:

$$ \log p(\mathbf{y} \mid \mathbf{x}, \boldsymbol{\theta}) = -\frac{n}{2}\log\!\left(2\pi\sigma^2\right) - \frac{1}{2\sigma^2} \sum_{i=0}^{n-1} \left( y_i - \sum_{j=0}^{m} \theta_j x_i^j - \mu \right)^2. $$