
This assignment asks you to build deep neural networks for two datasets in Python.

COMP222 – 2019 – Second CA Assignment

Individual coursework Train Deep Learning Agents

1 Objective

This assignment requires you to implement deep neural networks for the two datasets, i.e.,

  • Optical recognition of handwritten digits dataset
  • RCV1 dataset

from https://scikit-learn.org/stable/datasets/index.html, and apply model evaluation methods to compare them with the two models in Assignment 1. If you completed Assignment 1, please make sure that you select the same dataset as you did there.

2 DNN-based Classification

2.1 Requirement and Description

Language and Platform Python (version 3.5 or above) and Tensorflow or Keras (latest version). You may use libraries available on the Python platform, including numpy, scipy, scikit-learn, and matplotlib. If you intend to use libraries other than these, please consult the demonstrator or the lecturer.

Learning Task You can choose either classification (preferred) or regression, but it needs to be the same choice as in your Assignment 1 submission.

Assignment Tasks You need to implement the following functionalities:

f1 design and build two different deep neural networks, one with a convolutional layer and the other without a convolutional layer;

f2 apply model evaluation to the learned models. For material on model evaluation, you may take a look at the metrics explained in the lecture “model evaluation”. You are required to implement the following yourself (i.e., do not call built-in libraries):

  (a) cross-validation with 5 subsamples,
  (b) the confusion matrix, and
  (c) the ROC curve for one class vs. all other classes

for

  (a) the two neural networks you trained in f1, and
  (b) the two traditional machine learning algorithms from the first assignment.

Please also summarise your observation on the results.
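As a starting point for f1, the two architectures could be sketched in Keras on the digits dataset roughly as follows. The layer sizes, optimizer, epoch count, and file names below are illustrative assumptions, not required values:

```python
# Sketch: two Keras models for the 8x8 digits dataset -- one with a
# convolutional layer, one without. Hyperparameters are illustrative only.
from sklearn.datasets import load_digits
from tensorflow import keras

X, y = load_digits(return_X_y=True)
X = (X / 16.0).astype("float32")          # pixel values range over 0..16

# Model A: with a convolutional layer (input reshaped to 8x8x1 images)
cnn = keras.Sequential([
    keras.layers.Input(shape=(8, 8, 1)),
    keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(10, activation="softmax"),
])

# Model B: without a convolutional layer (flat 64-dimensional input)
mlp = keras.Sequential([
    keras.layers.Input(shape=(64,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])

for model in (cnn, mlp):
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

cnn.fit(X.reshape(-1, 8, 8, 1), y, epochs=2, verbose=0)
mlp.fit(X, y, epochs=2, verbose=0)

# Save the trained models so the marker can load them without retraining.
cnn.save("cnn_model.keras")
mlp.save("mlp_model.keras")
```

Both models output a probability distribution over the 10 digit classes, which is what the ROC-curve part of f2 needs.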


Additional Requirements We have the following additional requirements:

  1. The marker must be able to run your code directly, i.e., see the results of functionality f1 by loading the saved models, without training.
  2. You need to provide clear instructions on how to train the two models. The instructions may be, e.g., a different command or an easy way of adapting the source code.
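The save/load round trip behind the first requirement can be sketched as follows; the tiny untrained model and the file name are stand-ins for your real trained networks:

```python
# Sketch of the save/load round trip that lets the marker see f1 results
# without training. The model and file name here are illustrative only.
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Input(shape=(64,)),
    keras.layers.Dense(10, activation="softmax"),
])
model.save("saved_model.keras")          # done once, after training

# What the marker would run: load and predict, no fit() call anywhere.
restored = keras.models.load_model("saved_model.keras")
x = np.random.rand(3, 64).astype("float32")
assert np.allclose(model.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
```

The `.keras` format assumes a recent Keras/Tensorflow version; on older installations the legacy HDF5 format (`.h5`) works the same way.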

Documentation You need to write a proper document

  1. detailing how to run your program, including the software dependencies,
  2. explaining how the functionalities and additional requirements are implemented, and
  3. providing the details of your implementation, including, e.g., the meaning of parameters and variables, the description of your model evaluation, etc.

Submission files Your submission should include the following files:

  • a file for source code,
  • two files for saved models, and
  • a document.

Please see Section 3 for instructions on how to package your submission files, and read the Q&A on whether to upload the two trained models from the first assignment.

2.2 Marking Criteria

The assignment is split into a number of steps. Every step gives you some marks.

Note 1 At the beginning of the document, please include a checklist indicating whether the marking points below have been implemented successfully. Except in exceptional cases, the length of the submitted document needs to be within 4 pages (A4 paper, 11pt font size).

Note 2 The marking of a functionality will also consider the quality of the coding and the documentation. A runnable implementation alone will earn up to 50% of the marks.

functionality f1: 50%

For each model (with and without a convolutional layer), 20% of the marks will be for the model construction and 5% for saving the model and including the model file in the submission.


functionality f2: 50%

The model evaluation will include

  • cross validation (10%)
  • confusion matrix (10%)
  • ROC curve (20%)
  • discussion on the discovery (10%)

For each of the four parts, 80% of the marks are for the deep learning models, while 20% are for the traditional models from the first assignment. For example, for the cross-validation part, if you only evaluate the deep learning models, your marks are capped at 8% instead of 10%.
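Since f2 must be implemented without calling built-in evaluation libraries, the 5-subsample split and the confusion matrix can be written directly in NumPy. One possible sketch (the function names and toy data are our own, not part of the brief):

```python
# Hand-rolled 5-fold splitting and confusion matrix, with no calls to
# built-in evaluation functions. Function names are illustrative.
import numpy as np

def five_fold_indices(n_samples, n_folds=5, seed=0):
    """Shuffle the indices once, then split them into n_folds subsamples."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    return np.array_split(idx, n_folds)

def confusion_matrix(y_true, y_pred, n_classes):
    """cm[i, j] = number of samples with true class i predicted as class j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

folds = five_fold_indices(10, n_folds=5)
# Each fold is held out as the test set once; the rest form the training set.
test_idx = folds[0]
train_idx = np.concatenate(folds[1:])

cm = confusion_matrix(np.array([0, 1, 1]), np.array([0, 1, 0]), n_classes=2)
```

In the cross-validation loop, the model would be retrained on `train_idx` and evaluated on `test_idx` for each of the five folds, and the per-fold scores averaged.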

The marker will mark according to the quality of both your evaluation and the documentation.

4 Q&A

Q: The ROC curve we taught in the lecture is for binary classification, but the models we trained are for multiple classes. What can we do?

A: As indicated, you can have one class vs. all other classes, where all other classes are deemed as a single class.
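A hand-rolled one-vs-rest ROC curve could look roughly like this, sweeping a decision threshold over the scores for the chosen class while every other class is merged into a single negative class (the function name and toy data are illustrative):

```python
# One-vs-rest ROC curve without built-in metrics: sort by descending score
# and accumulate true/false positive counts as the threshold is lowered.
import numpy as np

def roc_curve_one_vs_rest(scores, y_true, positive_class):
    """Return (FPR, TPR) points for positive_class vs. all other classes."""
    is_pos = (y_true == positive_class)
    order = np.argsort(-scores)              # descending model score
    tps = np.cumsum(is_pos[order])           # true positives at each cut
    fps = np.cumsum(~is_pos[order])          # false positives at each cut
    tpr = tps / max(is_pos.sum(), 1)
    fpr = fps / max((~is_pos).sum(), 1)
    # Prepend the origin so the curve starts at (0, 0).
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

scores = np.array([0.9, 0.8, 0.3, 0.1])      # model scores for class 2
labels = np.array([2, 0, 2, 1])
fpr, tpr = roc_curve_one_vs_rest(scores, labels, positive_class=2)
```

Repeating this for each class (or for one chosen class, as the brief allows) gives the per-class curves, which can then be plotted with matplotlib.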

Q: My models in the first assignment can output a classification but not a confidence probability. What can we do for the ROC curve?

A: If you think some functionality is hard to implement, please explain this in the document. The marker will then evaluate your explanation to give you a reasonable mark.


Q: Since we are requested to evaluate the two models from our first assignment, shall we upload them again?

A: You can upload them again if needed. Note that the marker won’t be able to access the first assignment when they are marking the second assignment.

Q: My runtime for functionality f2 is longer than 5 minutes. Will this affect my marks?

A: Marking is based on the quality of your implementation and your documentation, and will not take the runtime into consideration. On the other hand, you are recommended to explain the details of your program (including the runtime) in your document.
