Project author: godatadriven

Project description: a python grammar for evolutionary algorithms and heuristics
Language: Python
Repository: git://github.com/godatadriven/evol.git
Created: 2017-07-15T07:59:20Z
Project community: https://github.com/godatadriven/evol

License: MIT License


Evol is a clear DSL for composable evolutionary algorithms, optimised for joy.

Installation

We currently support Python 3.6 and 3.7, and you can install the library via pip.

  pip install evol
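
As a quick sanity check that the installation worked, you can try importing
the two classes that the examples in this README rely on.

  # If this import succeeds, evol is installed and ready to use.
  from evol import Population, Evolution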

Documentation

For more details you can read the docs, but we advise everyone to get started by first checking out the examples in the /examples directory. These standalone examples should convey the spirit of the library better than the docs.

The Gist

The main idea is that you should be able to define a complex algorithm
in a composable way. To explain what we mean by this: let’s consider
two evolutionary algorithms for travelling salesman problems.

The first approach takes a collection of solutions and applies the following steps (sketched in code after the list):

  1. a survival step where only the top 50% of solutions survive
  2. the population reproduces using a crossover of genes
  3. certain members mutate
  4. repeat this, maybe 1000 times or more!
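
As a rough illustration of how these steps map onto the library, here is a
minimal sketch of the first approach as an Evolution chain. The helpers
pick_two, average and jitter are placeholders invented here for (x, y) tuple
chromosomes; the full, runnable example later in this README defines more
complete equivalents.

  import random

  from evol import Evolution

  # Placeholder helpers for chromosomes that are (x, y) tuples.
  def pick_two(pop):
      # pick two parents at random from the surviving candidates
      return random.choice(pop), random.choice(pop)

  def average(mom, dad):
      # crossover: average the parents' coordinates
      return (mom[0] + dad[0]) / 2, (mom[1] + dad[1]) / 2

  def jitter(xy, sigma):
      # mutation: add a little random noise to a coordinate
      return (xy[0] + (random.random() - 0.5) * sigma,
              xy[1] + (random.random() - 0.5) * sigma)

  evo1 = (Evolution()
          .survive(fraction=0.5)                            # 1. top 50% survives
          .breed(parent_picker=pick_two, combiner=average)  # 2. crossover of genes
          .mutate(func=jitter, sigma=1))                    # 3. certain members mutate
  # 4. the repetition happens when the evolution is applied to a population,
  #    e.g. population.evolve(evo1, n=1000)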


We can also think of another approach (again sketched after the list):

  1. pick the best solution in the population
  2. make random changes to this parent and generate new solutions
  3. repeat this, maybe 1000 times or more!
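
The second approach fits the same kind of chain; only the survival and
mutation settings change. This again is only a sketch, reusing the
placeholder helpers from the previous snippet.

  evo2 = (Evolution()
          .survive(n=1)                                     # 1. keep only the best solution
          .breed(parent_picker=pick_two, combiner=average)  # 2. generate new solutions from it
          .mutate(func=jitter, sigma=0.2))                  # with smaller random changes
  # 3. as before, the repetition comes from applying the evolution to a population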


One could even combine the two algorithms into a new one (see the sketch after the list):

  1. run algorithm 1 for 50 iterations
  2. run algorithm 2 for 10 iterations
  3. repeat this, maybe 1000 times or more!
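
Composing the two is just another chain. The sketch below reuses evo1 and
evo2 from the snippets above; the full example further down builds the same
structure end to end.

  evo3 = (Evolution()
          .repeat(evo1, n=50)   # 1. run algorithm 1 fifty times
          .repeat(evo2, n=10)   # 2. run algorithm 2 ten times
          .evaluate())
  # 3. the outer repetition is supplied when evolving a population,
  #    e.g. population.evolve(evo3, n=1000)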


You might notice that many parts of these algorithms are similar, and it is
the goal of this library to automate those parts. We hope to provide an API
that is fun to use and makes it easy to tweak your heuristics.

A working example of something similar to what is described above is shown below. You can also find this code in /examples/simple_nonlinear.py.

  import random
  from evol import Population, Evolution

  random.seed(42)

  def random_start():
      """
      This function generates a random (x,y) coordinate
      """
      return (random.random() - 0.5) * 20, (random.random() - 0.5) * 20

  def func_to_optimise(xy):
      """
      This is the function we want to optimise (maximize)
      """
      x, y = xy
      return -(1-x)**2 - 2*(2-x**2)**2

  def pick_random_parents(pop):
      """
      This is how we are going to select parents from the population
      """
      mom = random.choice(pop)
      dad = random.choice(pop)
      return mom, dad

  def make_child(mom, dad):
      """
      This function describes how two candidates combine into a new candidate
      Note that the output is a tuple, just like the output of `random_start`
      We leave it to the developer to ensure that chromosomes are of the same type
      """
      child_x = (mom[0] + dad[0])/2
      child_y = (mom[1] + dad[1])/2
      return child_x, child_y

  def add_noise(chromosome, sigma):
      """
      This is a function that will add some noise to the chromosome.
      """
      new_x = chromosome[0] + (random.random()-0.5) * sigma
      new_y = chromosome[1] + (random.random()-0.5) * sigma
      return new_x, new_y

  # We start by defining a population with candidates.
  pop = Population(chromosomes=[random_start() for _ in range(200)],
                   eval_function=func_to_optimise, maximize=True)

  # We define a sequence of steps to change these candidates
  evo1 = (Evolution()
          .survive(fraction=0.5)
          .breed(parent_picker=pick_random_parents, combiner=make_child)
          .mutate(func=add_noise, sigma=1))

  # We define another sequence of steps to change these candidates
  evo2 = (Evolution()
          .survive(n=1)
          .breed(parent_picker=pick_random_parents, combiner=make_child)
          .mutate(func=add_noise, sigma=0.2))

  # We are combining two evolutions into a third one. You don't have to
  # but this approach demonstrates the flexibility of the library.
  evo3 = (Evolution()
          .repeat(evo1, n=50)
          .repeat(evo2, n=10)
          .evaluate())

  # In this step we are telling evol to apply the evolutions
  # to the population of candidates.
  pop = pop.evolve(evo3, n=5)
  print(f"the best score found: {max([i.fitness for i in pop])}")
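
If you also want the best coordinates rather than just the best score, you
can pull the fittest individual out of the evolved population. The snippet
below assumes that the individuals you iterate over expose a .chromosome
attribute next to the .fitness attribute used above.

  # Inspect the best candidate itself (assumes each individual has .chromosome).
  best = max(pop, key=lambda individual: individual.fitness)
  print(f"best candidate: {best.chromosome}, score: {best.fitness}")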

Getting Started

The best place to get started is the /examples folder on GitHub.
This folder contains self-contained examples that work out of the
box.

How does it compare to …

  • … deap? We think our library is more composable and pythonic, while not removing any functionality. Our library may be a bit slower, though.
  • … hyperopt? Since we ask the user to define the actual algorithm, we are less of a black box. Hyperopt is meant for hyperparameter tuning in machine learning and has better support for search in scikit-learn.
  • … inspyred? That library offers a simple way to get started, but the project seems less actively maintained than ours.