
GENETIC ALGORITHM - An Approach to Solving Optimization Problems

The genetic algorithm is a search heuristic that is routinely used to generate useful solutions
to optimization and search problems. It generates solutions to optimization problems using
techniques inspired by natural evolution, such as inheritance, mutation, selection and
crossover. Genetic algorithms are one of the best ways to solve a problem for which little is
known and where traditional approaches fail. They are very general algorithms and so work
well in many search spaces. All you need to know is what you need the solution to be able to
do well, and the genetic algorithm will be able to create a high-quality solution. Particle
Swarm Optimization (PSO) is a biologically inspired computational search and
optimization method developed in 1995 by Eberhart and Kennedy, based on the
social behavior of birds flocking or fish schooling. Using this optimization
technique we can solve both constrained and unconstrained
optimization problems where traditional approaches fail. In this project work
we have tried to solve some optimization problems using simple PSO. After that we
discuss some applications of PSO for solving problems.


Darwin's theory of evolution states that all life is related and has descended from a
common ancestor. The theory presumes that complex creatures have evolved from more
simplistic ancestors naturally over time. In a nutshell, as random genetic mutations occur
within an organism's genetic code, the beneficial mutations are preserved because they aid
survival -- a process known as "natural selection." These beneficial mutations are passed
on to the next generation. Over time, beneficial mutations accumulate and the result is an
entirely different organism. Genetic algorithms are adaptive heuristic search algorithms
premised on Darwin's evolutionary ideas of natural selection and genetics. The basic
concept of genetic algorithms is to simulate the processes in natural systems
necessary for evolution. As such they represent an intelligent exploitation of a random
search within a defined search space to solve a problem. First pioneered by John Holland
in the 1960s, GAs have been widely studied, experimented with, and applied in many fields of
engineering. Not only does a genetic algorithm provide an alternative method of
solving a problem, it consistently outperforms other traditional methods on many
problems. Many real-world problems that involve finding optimal parameters
might prove difficult for traditional methods but are ideal for genetic algorithms.


A genetic algorithm (GA) is a search heuristic that mimics the process of natural
evolution. This heuristic is routinely used to generate useful solutions
to optimization and search problems. Genetic algorithms belong to the larger class
of evolutionary algorithms (EAs), which generate solutions to optimization problems using
techniques inspired by natural evolution, such as inheritance, mutation, selection,
and crossover.
A population of individuals is maintained within the search space of a GA, each representing
a possible solution to a given problem. Each individual is coded as a finite-length vector of
components, or variables, in terms of some alphabet, usually the binary alphabet {0, 1}.
To continue the genetic analogy, these individuals are likened to chromosomes and the
variables are analogous to genes. Thus a chromosome (solution) is composed of several
genes (variables). In a genetic algorithm, a population of strings (called chromosomes, or
the genotype of the genome), which encode candidate solutions (called individuals,
creatures, or phenotypes) to an optimization problem, evolves toward better solutions.
Traditionally, solutions are represented in binary as strings of 0s and 1s, but other
encodings are also possible. The evolution usually starts from a population of randomly
generated individuals and proceeds in generations. In each generation, the fitness of every
individual in the population is evaluated; multiple individuals are stochastically selected
from the current population (based on their fitness) and modified (recombined and
possibly randomly mutated) to form a new population. The new population is then used in
the next iteration of the algorithm. Commonly, the algorithm terminates when either a
maximum number of generations has been produced, or a satisfactory fitness level has
been reached for the population. If the algorithm has terminated due to a maximum
number of generations, a satisfactory solution may or may not have been reached.

The chromosomes then look like this:
Chromosome 1: 1101100100110110

Chromosome 2: 1101111000011110

Each chromosome is one binary string. Each bit in the string can represent some
characteristic of the solution. Of course, there are many other ways of encoding; the
choice depends mainly on the problem being solved. For example, one can directly encode
integer or real numbers; sometimes it is useful to encode permutations, and so on.
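The binary encoding described above can be sketched in a few lines of Python (the function names and the integer decoding are illustrative choices, not prescribed by the text):

```python
import random

def random_chromosome(length=16):
    # A candidate solution ("chromosome") as a fixed-length bit string.
    return [random.randint(0, 1) for _ in range(length)]

def decode(bits):
    # One possible mapping from a bit string to a problem variable:
    # interpret the bits as an unsigned integer.
    return int("".join(map(str, bits)), 2)

c = random_chromosome()
print(c, decode(c))
```

Other encodings (real-valued vectors, permutations) replace `decode` with a mapping appropriate to the problem.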


Initially, many individual solutions are randomly generated to form an initial population.
The population size depends on the nature of the problem, but typically contains several
hundred or thousands of possible solutions. Traditionally, the population is generated
randomly, covering the entire range of possible solutions (the search space). Occasionally,
the solutions may be "seeded" in areas where optimal solutions are likely to be found.
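A minimal sketch of random initialization, assuming the binary encoding used throughout this text (the function name is hypothetical):

```python
import random

def init_population(pop_size, chrom_length):
    # Each individual is an independent random bit string, so the
    # initial population is spread uniformly over the search space.
    return [[random.randint(0, 1) for _ in range(chrom_length)]
            for _ in range(pop_size)]

pop = init_population(100, 16)
```

Seeding would simply replace some of these random individuals with hand-picked starting points.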


During each successive generation, a proportion of the existing population is selected to
breed a new generation. Individual solutions are selected through a fitness-based process,
where fitter solutions (as measured by a fitness function) are typically more likely to be
selected. Certain selection methods rate the fitness of each solution and preferentially
select the best solutions. Other methods rate only a random sample of the population, as
rating every solution may be very time-consuming. Most selection functions are stochastic
and designed so that a small proportion of less fit solutions are also selected. This helps
keep the diversity of the population large, preventing premature convergence on poor
solutions. Popular and well-studied selection methods include roulette wheel selection and
tournament selection.

Some selection procedures used in genetic algorithms are given below.

Roulette wheel selection:
The fitness level is used to associate a probability of selection with each individual.
We first calculate the fitness of each individual and then represent it on the wheel in terms
of percentages.
In a search space of N chromosomes, we spin the roulette wheel.
Chromosomes with higher fitness will be selected more often.
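The "spin" above can be sketched as follows, assuming non-negative fitness values (the function name is illustrative):

```python
import random

def roulette_select(population, fitnesses):
    # Pick an individual with probability proportional to its fitness:
    # draw a point on the wheel, then walk the slices until we pass it.
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point rounding
```

Each call selects one parent; two spins give one pair for crossover.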


Tournament selection:
Two solutions are picked out of the pool of possible solutions, their fitness is
compared, and the better one is permitted to reproduce.
Deterministic tournament selection selects the best individual in each tournament.
Tournament selection can take advantage of parallel architectures.
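A sketch of deterministic tournament selection, generalized to a tournament of size k (k=2 matches the two-solution description above; the function name is illustrative):

```python
import random

def tournament_select(population, fitnesses, k=2):
    # Sample k distinct contenders at random; the fittest one wins.
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]
```

Because each tournament is independent, many tournaments can run in parallel.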


The next step is to perform crossover. This operator selects genes from the parent
chromosomes and creates new offspring. The simplest way to do this is to choose a random
crossover point, copy everything before this point from the first parent, and then copy
everything after the crossover point from the second parent.
Crossover can then look like this (| is the crossover point):

Chromosome 1: 11011 | 00100110110
Chromosome 2: 11011 | 11000011110
Offspring 1:  11011 | 11000011110
Offspring 2:  11011 | 00100110110

Table 1: Single Point Crossover

There are other ways to perform crossover; for example, we can choose more than one
crossover point. Crossover can be rather complicated and depends heavily on the encoding
of the chromosome. A specific crossover designed for a specific problem can improve the
performance of the genetic algorithm.
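Single-point crossover as in Table 1 can be sketched like this (the function name is illustrative):

```python
import random

def single_point_crossover(parent1, parent2):
    # Choose a random cut point (never at the very ends, so both
    # parents contribute), then swap the tails after that point.
    point = random.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2
```

A multi-point variant would simply alternate segments between several cut points.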


After crossover is performed, mutation takes place. This is to prevent all solutions in the
population from falling into a local optimum of the solved problem. Mutation randomly
changes the new offspring. For binary encoding we can switch a few randomly chosen bits
from 1 to 0 or from 0 to 1. Mutation can then look like the following:

Original offspring 1: 1101111000011110
Original offspring 2: 1101100100110110
Mutated offspring 1:  1100111000011110
Mutated offspring 2:  1101101100110110

Table 2: Mutation
Mutation depends on the encoding, just as crossover does. For example, when we are
encoding permutations, mutation could consist of exchanging two genes.
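Bit-flip mutation for the binary encoding can be sketched as follows (the function name and the default rate are illustrative):

```python
import random

def mutate(chromosome, rate=0.01):
    # Flip each bit independently with a small probability `rate`.
    return [1 - bit if random.random() < rate else bit
            for bit in chromosome]
```

For a permutation encoding one would instead swap two randomly chosen positions, so the result remains a valid permutation.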


*The genotype is the coding scheme used to represent the chromosomes; the phenotype is the candidate solution that a chromosome decodes to.


1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem).
2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
3. [New population] Create a new population by repeating the following steps until the new
population is complete:

[Selection] Select two parent chromosomes from the population according to their fitness
(the better the fitness, the bigger the chance of being selected).
[Crossover] With a crossover probability, cross over the parents to form new offspring
(children). If no crossover is performed, the offspring are exact copies of the parents.
[Mutation] With a mutation probability, mutate the new offspring at each locus (position in
the chromosome).
[Accepting] Place the new offspring in the new population.
4. [Replace] Use the newly generated population for a further run of the algorithm.
5. [Test] If the end condition (for example, a number of generations or improvement of the
best solution) is satisfied, stop, and return the best solution in the current population.
6. [Loop] Go to step 2.
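The outline above can be put together into one short program. This is a minimal sketch, assuming tournament selection, single-point crossover, and bit-flip mutation from the earlier sections; the "OneMax" fitness (count of 1-bits) is a toy problem chosen for illustration, and all names and default parameters are hypothetical:

```python
import random

def run_ga(fitness, chrom_len=20, pop_size=30, generations=50,
           crossover_rate=0.7, mutation_rate=0.01):
    # [Start] random initial population of bit strings
    pop = [[random.randint(0, 1) for _ in range(chrom_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(c) for c in pop]                 # [Fitness]
        new_pop = []
        while len(new_pop) < pop_size:                   # [New population]
            # [Selection]: size-2 tournaments pick two parent indices
            p1 = max(random.sample(range(pop_size), 2), key=lambda i: fits[i])
            p2 = max(random.sample(range(pop_size), 2), key=lambda i: fits[i])
            c1, c2 = list(pop[p1]), list(pop[p2])
            if random.random() < crossover_rate:         # [Crossover]
                pt = random.randint(1, chrom_len - 1)
                c1, c2 = c1[:pt] + c2[pt:], c2[:pt] + c1[pt:]
            for c in (c1, c2):                           # [Mutation]
                for i in range(chrom_len):
                    if random.random() < mutation_rate:
                        c[i] = 1 - c[i]
            new_pop += [c1, c2]                          # [Accepting]
        pop = new_pop[:pop_size]                         # [Replace]
    return max(pop, key=fitness)                         # best of final population

# OneMax toy problem: fitness is simply the number of 1-bits,
# so `sum` itself serves as the fitness function.
best = run_ga(fitness=sum)
```

Here the [Test] condition is simply a fixed number of generations; a real application would substitute its own fitness function and stopping rule.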

What makes GA a Good Global Optimization Tool?
1. The first and most important point is that genetic algorithms are intrinsically
parallel. Most other algorithms are serial and can only explore the solution space to
a problem in one direction at a time, and if the solution they discover turns out to
be suboptimal, there is nothing to do but abandon all work previously completed
and start over. However, since a GA has multiple offspring, it can explore the
solution space in multiple directions at once. If one path turns out to be a dead end,
it can easily be eliminated and work can continue on more promising avenues, giving
a greater chance each run of finding the optimal solution.
2. Due to the parallelism that allows them to implicitly evaluate many schemas at
once, genetic algorithms are particularly well suited to solving problems where the
space of all potential solutions is truly huge, too vast to search exhaustively in any
reasonable amount of time. A GA directly samples only small regions of the vast
fitness landscape and successfully finds optimal or very good results in a short
period of time.
3. Another notable strength of genetic algorithms is that they perform well on
problems for which the fitness landscape is complex: ones where the fitness
function is discontinuous, noisy, changes over time, or has many local optima. GAs
have proven to be effective at escaping local optima and discovering the global
optimum in even a very rugged and complex fitness landscape. Even if a
GA does not always deliver a provably perfect solution to a problem, it can almost
always deliver at least a very good solution.
4. Another area in which genetic algorithms excel is their ability to manipulate many
parameters simultaneously. GAs are very good at solving such problems; in
particular, their use of parallelism enables them to produce multiple equally good
solutions to the same problem, possibly with one candidate solution optimizing one
parameter and another candidate optimizing a different one, and a human overseer
can then select one of these candidates to use.
5. Finally, one of the qualities of genetic algorithms which might at first appear to be
a liability turns out to be one of their strengths: GAs know nothing about
the problems they are deployed to solve. Instead of using previously known
domain-specific information to guide each step, they make random changes to their
candidate solutions and then use the fitness function to determine whether those
changes produce an improvement.