COMP2712_8715
Assignment02: Using Evolutionary Algorithms to Optimise MLPs (25%)

The objective of this assignment is to optimise the structure of an MLP using an Evolutionary Algorithm (EA) to classify the images in the CIFAR-10 dataset. Specifically, you are to use the SciKit-Learn sklearn.neural_network.MLPClassifier. The EA should not be used to optimise the weights of the network; rather, the EA should be used to discover the optimal network structure and training parameters. For example, this set of options may consist of:
• the number of nodes in each hidden layer (hidden_layer_sizes)
• the number of hidden layers (hidden_layer_sizes)
  e.g. hidden_layer_sizes = (10, 5) specifies 2 hidden layers, with 10 neurons in layer 1 and 5 neurons in layer 2
• the activation function on the hidden layers (activation)
• the training algorithm (solver)
Essentially, any option you can set when creating an MLP is a candidate; see the Parameters section of the MLPClassifier documentation:
https://scikit-learn.org/stable/modules/generated/sklearn.neural_network.MLPClassifier.html

At this stage, your familiarity with EAs should be sufficient to implement a simple EA, but you are welcome to look at the source code of Python EA libraries and modules if this helps. However, do not rely extensively on external libraries that already implement EA functionality, unless you are especially creative with them. The choice of EA is up to you, whether it be a Genetic Algorithm (GA), an Evolution Strategy (ES), or otherwise; you can even roll your own, but for full marks it should either be at a similar functional level of complexity to a simple GA, or at least perform similarly well.

Finally, compare your EA against the standard exploration of MLP parameters (i.e. GridSearchCV) and provide a brief report on this comparison.

1. Load the CIFAR-10 dataset
2. Classification Exploration
   a. MLP Standard Training and Evaluation (incl. GridSearchCV) 15%
   b. MLP Using EA (sklearn-deap) for Hyper Parameter Optimisation 35%
3. Implement your own Evolutionary Algorithm for Hyper Parameter Optimisation 10%
4. Evaluation
   a. basic: train-test split, intermediate: 10-fold CV, expert: 10 x 10 CV 15%
5. Discussion 25%

Illustrative code sketches for several of these parts are included at the end of this document.

NOTE: Part 3, implementing your own EA, is only worth 10%, and this does not equate to its level of difficulty. It is potentially very difficult and only intended for those looking for an HD. It will most likely take more time/effort than the rest of the assignment combined!! You've been warned 🙂

Here is a list of useful websites that implement or use EAs in Python:
• DEAP is a very extensive evolutionary computation package for Python, and it has also been incorporated into a package that performs a grid search using an EA (sklearn-deap)
  o https://github.com/deap/deap
  o https://github.com/rsteca/sklearn-deap
  o https://towardsdatascience.com/genetic-algorithms-in-python-using-the-deap-library-e67f7ce4024c
• The links below are examples of creating your own simple EA
  o https://lethain.com/genetic-algorithms-cool-name-damn-simple/
  o https://blog.coast.ai/lets-evolve-a-neural-network-with-a-genetic-algorithm-code-included-8809bece164

Submission
Your final submission should be a .zip file containing:
• a PDF of your evaluation and discussion, submitted via FLO
• a Jupyter notebook (.ipynb) from your Google Colab (File -> Download .ipynb)
Your PDF should also include a shareable link to your Google Colab. If you have multiple notebooks, include multiple .ipynb files and links. Include a citation for any code you use from other sources.
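To make Part 1 and the basic half of Part 2a concrete, a minimal sketch of loading CIFAR-10 and fitting a single MLPClassifier is given below. The brief does not prescribe a loading method; fetch_openml is used here on the assumption that the OpenML copy of the dataset is published under the name "CIFAR_10", and loading it via keras.datasets.cifar10 or the original pickled batches is equally acceptable. The subsample size and every parameter value are illustrative choices, not requirements.

```python
# Minimal sketch: load CIFAR-10 and train one MLPClassifier (Parts 1 and 2a).
# Assumes the OpenML copy of the dataset is published under the name "CIFAR_10";
# loading via keras.datasets.cifar10 or the original pickled batches is equally valid.
import numpy as np
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = fetch_openml("CIFAR_10", as_frame=False, return_X_y=True)
X = X.astype(np.float32) / 255.0          # scale pixel values to [0, 1]

# Optional subsample to keep training times manageable on Colab (illustrative only).
rng = np.random.default_rng(42)
idx = rng.choice(len(X), size=10_000, replace=False)
X, y = X[idx], y[idx]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# The options below are exactly the kind of thing the EA is meant to search over.
mlp = MLPClassifier(
    hidden_layer_sizes=(10, 5),   # 2 hidden layers: 10 neurons, then 5
    activation="relu",            # activation on the hidden layers
    solver="adam",                # training algorithm
    max_iter=50,
    random_state=42,
)
mlp.fit(X_train, y_train)
print("test accuracy:", mlp.score(X_test, y_test))
```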
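For the GridSearchCV baseline in Part 2a, the sketch below (continuing from the X_train and y_train prepared in the loading sketch above) shows one possible setup. The grid itself is an assumption; for the comparison asked for in the Discussion to be fair, it should cover the same search space you give the EA.

```python
# Sketch of the GridSearchCV baseline (Part 2a). The grid below is illustrative;
# for a fair comparison it should cover the same search space the EA explores.
# X_train, y_train are assumed to come from the loading sketch above.
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

param_grid = {
    "hidden_layer_sizes": [(10,), (50,), (10, 5), (50, 20)],
    "activation": ["relu", "tanh", "logistic"],
    "solver": ["adam", "sgd"],
}

grid = GridSearchCV(
    MLPClassifier(max_iter=50, random_state=42),
    param_grid=param_grid,
    cv=3,                 # small cv here to keep runtime down; see Part 4 for the full evaluation
    n_jobs=-1,
    verbose=1,
)
grid.fit(X_train, y_train)
print("best params:", grid.best_params_)
print("best CV score:", grid.best_score_)
```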
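For Part 2b, the sklearn-deap package linked above wraps DEAP behind a GridSearchCV-like interface. The sketch below follows the usage pattern shown in the rsteca/sklearn-deap README; the class name EvolutionaryAlgorithmSearchCV and its argument names are taken from that README and should be verified against the installed version, since the package is not actively maintained and may not import cleanly with recent scikit-learn releases.

```python
# Sketch of EA-based hyperparameter search with sklearn-deap (Part 2b).
# The class and argument names follow the rsteca/sklearn-deap README; verify them
# against the installed version (pip install sklearn-deap).
from evolutionary_search import EvolutionaryAlgorithmSearchCV
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier

param_grid = {
    "hidden_layer_sizes": [(10,), (50,), (10, 5), (50, 20)],
    "activation": ["relu", "tanh", "logistic"],
    "solver": ["adam", "sgd"],
}

ea_search = EvolutionaryAlgorithmSearchCV(
    estimator=MLPClassifier(max_iter=50, random_state=42),
    params=param_grid,
    scoring="accuracy",
    cv=StratifiedKFold(n_splits=3),
    population_size=10,
    gene_mutation_prob=0.10,
    gene_crossover_prob=0.5,
    tournament_size=3,
    generations_number=5,
    verbose=1,
    n_jobs=1,
)
ea_search.fit(X_train, y_train)
print("best params:", ea_search.best_params_)
print("best score:", ea_search.best_score_)
```

Because this searches the same kind of parameter grid as GridSearchCV, the number of model fits each method performs (grid size versus population_size x generations_number) is a natural point of comparison for the Discussion.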
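Part 3 asks for an EA of roughly the functional complexity of a simple GA, written yourself. One possible shape for such a GA is sketched below, purely as an illustration: an individual is a dict of MLPClassifier options, fitness is mean cross-validated accuracy, and offspring are produced by tournament selection, uniform crossover, and per-gene mutation. The search space, probabilities, population size, and helper names are all assumptions, not part of the brief.

```python
# Minimal GA skeleton for Part 3 (illustrative only, one possible design).
# Individuals are dicts of MLPClassifier options; fitness is mean CV accuracy.
import random
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

SEARCH_SPACE = {                       # assumed search space, adjust to taste
    "hidden_layer_sizes": [(10,), (50,), (10, 5), (50, 20)],
    "activation": ["relu", "tanh", "logistic"],
    "solver": ["adam", "sgd"],
}

def random_individual():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind, X, y, cv=10):
    clf = MLPClassifier(max_iter=50, random_state=42, **ind)
    return cross_val_score(clf, X, y, cv=cv, n_jobs=-1).mean()

def tournament(pop, scores, k=3):
    contenders = random.sample(range(len(pop)), k)
    return pop[max(contenders, key=lambda i: scores[i])]

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, p=0.2):
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < p else v)
            for k, v in ind.items()}

def evolve(X, y, pop_size=10, generations=5):
    pop = [random_individual() for _ in range(pop_size)]
    best, best_score = None, -np.inf
    for g in range(generations):
        scores = [fitness(ind, X, y) for ind in pop]
        if max(scores) > best_score:
            best_score = max(scores)
            best = pop[int(np.argmax(scores))]
        print(f"generation {g}: best so far {best_score:.3f}")
        pop = [mutate(crossover(tournament(pop, scores), tournament(pop, scores)))
               for _ in range(pop_size)]
    return best, best_score

# best_params, best_cv_acc = evolve(X_train, y_train)
```

The cv=10 inside fitness() corresponds to the intermediate evaluation level in Part 4; the expert 10 x 10 level can be obtained by passing cv=RepeatedStratifiedKFold(n_splits=10, n_repeats=10) from sklearn.model_selection, although that is usually too slow to run inside an EA loop and is better reserved for evaluating the final chosen configurations.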
