Question 1
In Exercise 1, we performed experiments that showed how mutation intensity ("neighborhood size") affects optimization results. In this exercise, we will examine in more detail how these results are affected by different types of mutation ("neighborhood shape").
In earlier exercises, we were only concerned with the combinatorial aspect of optimization – without numerical (continuous) elements. In addition, we optimized only static, passive structures that do not react to the environment or to their own state. Now we will attach active elements to the structure itself. We need sensors (receptors) to acquire information from the environment (or from the structure itself) and actuators (effectors) to influence the environment (or the structure itself). Some pieces of logic to process the signal (rules, neurons, etc.) will also be useful.
In the Framsticks environment, we are provided with a number of simulated sensors and actuators; one can also develop custom ones. For our simple experiments, we will only need two existing types of receptors (denoted by the symbols G and T) and two types of effectors (symbols @ and |). You can find a description and interactive demonstrations of how they work here. In addition, two types of generators will be useful (Sin – a sinusoidal waveform generator, and * – a generator of a constant signal of value 1). Neurons, receptors and effectors – apart from their type – also have individual parameters, so not every neuron of a given type has to be identical.
To make sure that the structures and neural networks optimized in further experiments are not mysterious and incomprehensible ("evolution has optimized something"), we will learn how neurons are encoded, using the f1 representation as an example, and we will test interactively how they work in simulation. To this end, we will first work through sections I.4, I.5 and I.6 (up to and including Exercise 1) of the tutorial.
Before we perform unattended evolutionary experiments to optimize a neural network, let's first observe interactively what evolution does to a neural network by completing section II.1 of the tutorial while successively enlarging the neighborhood.
In your response, specify the velocities obtained:
- in I.6, Exercise 1 (manual adjustment of two parameters)
- in II.1, mutating only neural weights and neural properties
- in II.1, + mutating stick length modifier
- in II.1, + mutations: adding and removing neurons, sensors and effectors
Achieved maximum stable velocity values:
I.6, Exercise 1 | 0.000000000 (TODO)
II.1, mutating only neural weights and neural properties | 0.000000000 (TODO)
II.1, + mutating stick length modifier | 0.000000000 (TODO)
II.1, + mutations: adding and removing neurons, sensors and effectors | 0.000000000 (TODO)
Question 2
In the next question, we will optimize the velocity of a moving structure, but that velocity has to be determined somehow. We know that velocity = distance/time, and we measure time in simulation steps. However, "time" has two interpretations. The first is the simulation time of a given structure (this is called "lifespan" in Framsticks terminology; let's assume it is fixed and we do not change it). The second is the time interval between the two moments between which we determine the distance (in Framsticks terminology this is "perfperiod" – short for "performance sampling period").
Assuming that each structure is simulated for 10 000 steps (this is "lifespan"), we can set "perfperiod" to exactly 10 000 steps, but we can also set it to 2 000 and, during the entire "lifespan", calculate five velocities, which we then average. Or we can set "perfperiod" to 20 and calculate 500 velocities, which we then average. In all cases, we eventually get one value for velocity, but defined differently – see this diagram. ❶ What is this velocity called in physics when "perfperiod" tends to zero?
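The effect of "perfperiod" on the measured velocity can be illustrated with a short, self-contained sketch. The trajectory below is a toy example (slow drift plus a large oscillation), not Framsticks output:

```python
import math

def mean_speed(positions, perfperiod):
    """Average of per-window speeds: the distance covered in each window of
    `perfperiod` simulation steps, divided by the window length, then averaged."""
    speeds = []
    for start in range(0, len(positions) - perfperiod, perfperiod):
        x0, y0 = positions[start]
        x1, y1 = positions[start + perfperiod]
        speeds.append(math.hypot(x1 - x0, y1 - y0) / perfperiod)
    return sum(speeds) / len(speeds)

# Toy trajectory: slow drift along x plus a large back-and-forth oscillation.
traj = [(0.01 * t + 0.5 * math.sin(0.1 * t), 0.0) for t in range(10_000)]

print(mean_speed(traj, 9_999))  # one long window: essentially the net drift
print(mean_speed(traj, 20))     # many short windows: oscillation counts too
```

For a structure that wiggles in place, short sampling periods reward the wiggling itself, while one long window measures only the net displacement.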
Imagine now the space of all possible genotypes, and above this space – a fitness landscape composed of velocity values. Now imagine that there are many such landscapes, each for a different "perfperiod" value. How are these landscapes different?
❷ How does the landscape change when "perfperiod" increases smoothly, starting from the minimum value? ❸ Do all the landscapes have any common points, and if so, precisely what kind of movement establishes such points? ❹ Do the landscapes differ in how easily optimization can discover very good solutions (in a given landscape), and why? ("Very good" abstracts completely from how much we subjectively like a certain type of movement – it concerns only the landscape itself and very good values of the objective function.)
Note – you can optionally verify whether your idea of the landscapes and their difficulty was correct: prepare a few dozen structures that move in different ways and evaluate them using different "perfperiod"s (do not change "lifespan"), then draw a plot: horizontal axis = individual genotypes, vertical axis = "velocity" for different values of frams.Populations[0].perfperiod (a few values from 1 to "lifespan").
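The verification above boils down to building a table of velocities indexed by genotype and "perfperiod". A minimal sketch of that data layout follows; `evaluate_velocity` is a placeholder stub (a real implementation would set frams.Populations[0].perfperiod and run the simulation), and the genotypes and values are purely illustrative:

```python
def evaluate_velocity(genotype, perfperiod):
    # Placeholder: a real implementation would set
    # frams.Populations[0].perfperiod = perfperiod and simulate the genotype.
    return len(genotype) / perfperiod  # dummy value for illustration only

genotypes = ["X", "XX", "X(X,X)"]      # a few example f1 genotypes
perfperiods = [1, 20, 2_000, 10_000]   # a few values from 1 to "lifespan"

# One row per genotype, one column per perfperiod - ready to plot.
table = {g: [evaluate_velocity(g, p) for p in perfperiods] for g in genotypes}
for g, row in table.items():
    print(g, row)
```

Each row of `table` is one line of the suggested plot (genotypes on the horizontal axis, one curve per "perfperiod").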
Question 3
It's time for an evolutionary experiment. Choose one of the representations (f0 or f1) and perform the optimization of movement velocity following this procedure:
for %%P in (0) do (
    for /L %%N in (1,1,10) do (
        python FramsticksEvolution.py -path %DIR_WITH_FRAMS_LIBRARY% -sim "eval-allcriteria.sim;deterministic.sim;sample-period-longest.sim;my-own-probab-%%P.sim" -opt velocity -max_numparts 15 -max_numjoints 30 -max_numneurons 20 -max_numconnections 30 -genformat CHOSEN-0-OR-1 -pxov 0 -popsize sizeatleast50 -generations adequatelength -hof_size 1 -hof_savefile HoF-vel-probab-%%P-%%N.gen
    )
)
The results of the above experiment will serve as the baseline. The outer loop with %%P is redundant for now. To avoid an exception when trying to load the missing file my-own-probab-0.sim, create an empty file with that name in the subdirectory data/.
As we know, due to the variety of elements of the structure that must be evolved, we need different mutation operators. This is quite different from, for example, the TSP-type permutation problems, where we had a well-defined, homogeneous, finite neighborhood. Here we have many ways to generate a neighbor (a mutant) and each of them has its own probability of occurring.
Run the Framsticks GUI, choose from the menu: File→Open (Ctrl-O) and load the eval-allcriteria.sim file, then choose from the menu: Simulation→Parameters (Ctrl-P); on the left, in the tree, expand "Genetics" and then the branches "f0" and "f1". In the sub-branches you see values (they don't have to add up to 1) which are the relative probabilities of each operation performed on a genotype. They affect how the search topology is traversed. Exactly one selected operation is performed per single mutation (this operation is selected using the roulette technique based on exactly these relative probabilities). Now have a look at the eval-allcriteria.sim file – there you will find a group of parameters starting with f0_p_new, and slightly below, another group starting with f1_smX. Depending on whether you chose f0 or f1, copy the numerical parameters to a new file, for example my-own-probab-1.sim – not forgetting that the first line preceding the parameter section must be sim_params: (without any empty lines in between).
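The "roulette" selection of a mutation operator from unnormalized relative probabilities can be sketched as follows. The operator names and weights below are made up for illustration, not the actual Framsticks f0/f1 values:

```python
import random

def roulette(weights):
    """Pick a key with probability proportional to its (unnormalized) weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    acc = 0.0
    for op, w in weights.items():
        acc += w
        if r <= acc:
            return op
    return op  # guard against floating-point rounding at the upper end

# Hypothetical relative probabilities - note they do not have to add up to 1.
mutation_weights = {"modify_weight": 0.5, "add_neuron": 0.1, "remove_neuron": 0.1}

random.seed(0)
picks = [roulette(mutation_weights) for _ in range(10_000)]
print(picks.count("modify_weight") / len(picks))  # close to 0.5/0.7 ≈ 0.71
```

Scaling all weights by the same factor changes nothing, which is why only their relative values matter.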
Now we will change the way the topology of the solution set is searched. Perform another experiment, setting all probabilities in your file my-own-probab-1.sim to the same value, for example 1. Imagine the consequences of this change for evolution. Generate the same plots as usual. Did you achieve better final results, did they remain unchanged, or did they get worse? What does this indicate? Now adjust the probability values based on your intuition and knowledge of what each operator does, save the values in a new file my-own-probab-2.sim, repeat the experiment and generate the plots. Did these values yield better results? You can test more sets of probabilities if you want to verify your other ideas and hypotheses.
Finally, based on your assumptions and existing experience, try varying the probabilities of the operators during evolution. You probably predict that different operators are more or less useful at the beginning or at the end of the optimization process. As in the first exercise, after adding import frams to FramsticksEvolution.py, you can access the parameter objects frams.GenMan.f0_p_new, frams.GenMan.f1_smX, etc. You can safely change their values while evolution is running, for example making the adjustments dependent on the counter of calls to the frams_evaluate() function, on the number of generations, on the highest or the average fitness value in the population, and the like.
Generate plots and include them all in your response. Did your predictions and your ideas for changing the way evolution searches the topology of solutions improve the final results?