UNIVERSITY OF EDINBURGH

COLLEGE OF SCIENCE AND ENGINEERING

SCHOOL OF INFORMATICS

INFR11161 NATURAL COMPUTING

Monday 17th December 2018

14:30 to 16:30

INSTRUCTIONS TO CANDIDATES

Answer QUESTION 1 and ONE other question.

Question 1 is COMPULSORY. If both QUESTION 2 and

QUESTION 3 are answered, only QUESTION 2 will be marked.

All questions carry equal weight.

CALCULATORS MAY NOT BE USED IN THIS EXAMINATION

MSc Courses

Convener: M.Mistry

External Examiners: W. Knottenbelt, M. Dunlop, M. Niranjan, E. Vasilaki

THIS EXAMINATION WILL BE MARKED ANONYMOUSLY

1. THIS QUESTION IS COMPULSORY

(a) What is meant by intensification in relation to metaheuristic algorithms? [2 marks]

(b) In the search for the global optimum, Simulated Annealing uses cooling schemes for the statistical temperature. Explain briefly the rationale behind this procedure. [2 marks]

(c) Explain the difference between roulette wheel selection and tournament selection in genetic algorithms. Under what conditions would you prefer each of these selection mechanisms? [3 marks]

(d) If the global optimum has already been discovered by an ant colony algorithm and only the best ant (in the sense of best-ever) contributes to the pheromone trail in every time step, what level of pheromone will accumulate on the optimal path in the long run? [3 marks]

(e) Give three points that help to set up a differential evolution algorithm for good performance on a typical benchmark problem. [3 marks]

(f) What is the difference between the concept of mutation in evolutionary algorithms and the velocity update in particle swarm optimisation? [2 marks]

(g) How can crowding be prevented in multi-objective optimisation? [2 marks]

(h) How might one evolve a control algorithm, e.g. for a mobile robot, using a metaheuristic algorithm? [3 marks]

(i) How does the Baldwin effect enable the transmission of acquired skills from one generation to the next, and how would such a mechanism support the function of a genetic algorithm? [2 marks]

(j) Assume you are participating in a competition where your algorithm will be tested on random instances of known benchmark functions. How can you improve your algorithm so that it has a good chance of winning the competition? [3 marks]

Page 1 of 3

2. ANSWER EITHER THIS QUESTION OR QUESTION 3

(a) The schema theorem gives a lower bound for the expected frequency of a certain schema in the next generation. What implications can be drawn for the performance of a genetic algorithm over many generations? [3 marks]

(b) In the design of a metaheuristic algorithm, we choose to maintain an elite set of solutions that are found during the search. This set is meant to contain a fixed number K of solutions. At each iteration, the metaheuristic generates a new (or an updated) population of N solutions (N > K). Which replacement strategies can be applied to update the elite set with solutions from the union of the elite set and the new population? Discuss advantages and disadvantages of two different strategies. [4 marks]

(c) Particle swarms can be used to solve continuous optimisation problems such as data fusion over a sensor network, e.g. in a smart home.

i. How can you assess the performance of the algorithm? [2 marks]

ii. Describe how the algorithm differs from a particle swarm algorithm that solves a static problem. [2 marks]

iii. Write down the main steps of the algorithm in pseudocode and explain the main parts of the code. [4 marks]

iv. How are diversity and specificity of the particles achieved in the algorithm? [3 marks]

v. In some versions of particle swarm optimisation the particles interact only with particles in their neighbourhood rather than with all particles in the swarm. Would such a neighbourhood structure be beneficial in the present problem? Explain your answer. [4 marks]

vi. Describe three ways to hybridise the pure swarm solution considered so far. [3 marks]

Page 2 of 3

3. ANSWER EITHER THIS QUESTION OR QUESTION 2

(a) The No-Free-Lunch (NFL) theorem.

i. The NFL theorem could be perceived as stating that all optimisation algorithms have the same performance. In what respects would you disagree with this statement? [4 marks]

ii. For some algorithms, strategies for parameter selection are known which do not refer to any specific fitness functions. Choose an algorithm and explain whether such a parameter choice is meaningful in the light of the NFL theorem. [5 marks]

(b) Decision trees represent a standard approach to classification tasks in data mining. A decision tree is a tree with terminal nodes that correspond to class labels c_i, i = 1, …, n, and nonterminal nodes that represent the available features a_j, j = 1, …, m. Each node of the tree contains a test on the feature values a_j, which we assume to be numbers in the unit interval. Assume that you are given a data set containing information about a set of items. For each item, a list of features and the class label for the respective item are given.

i. Explain in detail how Genetic Programming (GP) evolves solutions to a problem of this type. [6 marks]

ii. Do you expect that problems such as local minima, inflated complexity ("bloat"), generation of erroneous code, or missing or contradictory data labels would interfere with the performance of GP in this example? In addition, also discuss how to adapt your solution to the case of multi-objective optimisation. [4 marks]

iii. For a problem of size n = 8, m = 8, and 100 items to be classified, how many evaluations would a random search algorithm need to find the optimal solution, assuming there is only a single optimum? Explain whether you would expect GP to be able to find the optimum faster. [3 marks]

iv. What advantage would a Cartesian GP provide compared to the tree-like representation that was assumed above? [3 marks]

Page 3 of 3
