Getting Started!

Important: this is a work in progress. Any feedback is very welcome!

Flatland uses genetic programming (GP) to evolve the behavior of the agents (the Flatlanders). If you are new to GP or agents, Google is your friend. A good site about GP is http://www.genetic-programming.com/


Define a problem

The GP implementation in Flatland is used by defining a given problem as a C# class. Several examples are included in the solution. Any arbitrary problem can be defined, but in particular we can easily define a problem - or behavior - that is to be solved and evolved by a Flatlander in Flatland.

Note that the words behavior and problem are used interchangeably.

A trivial example could look like this:
    using System;
    using System.Runtime.Serialization;
    // GenoType, Primitive and Agent come from the Flatland GP namespaces.

    [Serializable]
    public class CommandExample : GenoType {

        // Methods marked [Primitive] become the building blocks of the evolved program.
        [Primitive]
        public void MoveLeft() {
            Agent.TurnLeft();
        }

        [Primitive]
        public void MoveRight() {
            Agent.TurnRight();
        }

        public override double Fitness() {
            Execute();
            return 42; // return a meaningful fitness value
        }

        public override object Clone() {
            return new CommandExample();
        }

        // Called when the agent is asked to perceive and act.
        protected override void Act(ref string msg) {
            Execute();
        }

        // Constructors required by the template (see Template.cs).
        public CommandExample(SerializationInfo info, StreamingContext context) {}
        public CommandExample() {}
    }

As seen, problem classes derive from GenoType (GenoType.cs), which defines the basics for using a problem with the GP. Methods attributed with [Primitive] are the methods that will be used as primitives in the program evolved by the system. In the above example, we are evolving command-like behavior for an agent. Primitives may have void and boolean return types; that is, closure is relaxed a bit to allow for more expressive primitives that capture the desires of agents. The method Act is called whenever an Agent is asked to perceive and act, so it usually just calls Execute, which runs the syntax tree generated from our problem class. A few class members that must always be present are included above (see also Flatland > GeneticProgramming > Examples > Template.cs).
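For instance, a boolean primitive can serve as a condition in the evolved program. The sketch below is only an illustration: it assumes a hypothetical perception query on Agent (here called FoodAhead(), which is not part of the template above), so substitute whatever query methods your Agent actually exposes.

        [Primitive]
        public bool FoodAhead() {
            // Hypothetical perception query; replace with a real Agent method.
            return Agent.FoodAhead();
        }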

Agent behavior

Behaviors are added to agents as layers in the Layer enum (Layer.cs):
    public enum Layer {
        [Subsume(typeof(CommandExample))]
        MY_MOVE_BEHAVIOR
    }


Now, when running the Flatland simulation, our behavior will be used as the (currently only) behavior of all Flatlanders. When Flatland runs, every individual is created from the defined Layer entries: for each entry, a syntax tree is initialized according to the GenoType passed to the Subsume attribute and the initialization technique configured for the GP. When Flatlanders breed, these syntax trees are crossed and mutated.
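Additional behaviors can be added by declaring more enum entries, each subsuming its own GenoType; every individual then gets one syntax tree per layer. A minimal sketch, assuming a hypothetical AvoidWalls problem class that is not included in the solution:

    public enum Layer {
        [Subsume(typeof(AvoidWalls))]      // hypothetical problem class
        AVOID_WALLS,
        [Subsume(typeof(CommandExample))]
        MY_MOVE_BEHAVIOR
    }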

Evolve behavior

To evolve a behavior towards some target fitness value, we use the FlatlandLab class (FlatlandLab.cs). It spawns an isolated, controlled Flatland environment in which a population of agents can be trained, and from which the best individuals can eventually be picked.

            FindFood findFood = new FindFood();
            FlatlandLab lab = new FoodLab(new GP(findFood));
            FlatlandWorld world = new FlatlandWorld(lab);

The above will train a food-finder agent. FindFood.cs can be found in Flatland > GeneBank.

GP

The GP can also be used on its own to automatically produce any kind of program.

The StringExample class (StringExample.cs) demonstrates how we can define a problem that generates a program which outputs a certain string.

    [Serializable]
    public class StringExample : GenoType, IExpression<String>
    {

        public static string Answer = "flatland";
        
        [Primitive]
        public String[] letters = new[] {"a", "b", "c", "d", "e", "f", "g",
                                         "h", "i", "j", "k", "l", "m", "n",
                                         "o", "p", "q", "r", "s", "t", "u",
                                         "v", "w", "x", "y", "z", ""};

     //   [Primitive]
     //   public String[] nice_letters = new[] { "a", "d", "f", "l", "n", "t" };

        [Primitive]
        public String Concat(String a, String b){
            return String.Concat(a, b);
        }

        public override double Fitness()
        {
            int error = 0;
            string guess = (String)Execute();
            error += Math.Abs(Answer.Length - guess.Length) * 100;
            for (int i = 0; i < Answer.Length; i++){
                if(i < guess.Length)
                {
                    error += Math.Abs(guess[i] - Answer[i]);
                }
            }
            return GP.DEFAULT_FITNESS_GOAL - error;
        }
        ...

All we declare is the alphabet plus the empty string, and a concatenation function. The fitness function calls Execute() to get the string produced by a given solution instance during a GP run. Fitness is based on how close the produced string is to the correct length and how close each letter is to the corresponding letter in the answer string.
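For example, if a solution produces the string "flatlend", the length penalty is |8 - 8| * 100 = 0 and only position 5 differs, contributing |'e' - 'a'| = 4, so its fitness is GP.DEFAULT_FITNESS_GOAL - 4. A perfect solution scores exactly GP.DEFAULT_FITNESS_GOAL.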

To produce a solution program with the GP, we could run something like:
             GP gp = new GP(new StringExample(),
                            populationSize: 50,
                            maxTreeDepth: 7,
                            maxGenerations: 10000,
                            mutationRate: 0.20);
             StringExample solution = (StringExample)gp.Run();
             Console.WriteLine("Result " + solution.Execute());
             Console.ReadKey();
             return;


The above GP run will often output "flatland", but obviously less accurate results can occur. The average and max fitness values of the population are printed to the console during a run.


