[Communications in Computer and Information Science] Computer Science for Environmental Engineering...


Y. Yu, Z. Yu, and J. Zhao (Eds.): CSEEE 2011, Part I, CCIS 158, pp. 384–389, 2011. © Springer-Verlag Berlin Heidelberg 2011

A Particle Swarm Optimization with Differential Evolution

Ying Chen1, Yong Feng1,2, Zhi Ying Tan2, and Xiao Yu Shi1

1 School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

2 Chengdu Inst. of Computer Application, Chinese Academy of Sciences, Chengdu 610041, China

[email protected], [email protected], [email protected], [email protected]

Abstract. Particle swarm optimization (PSO) is a simple population-based algorithm which has many advantages, such as simple operation and quick convergence. However, PSO is easily trapped in local optima. Differential evolution (DE) is a simple evolutionary algorithm, like PSO. This paper proposes an improved PSO algorithm based on a DE operator (termed IPSODE). Several benchmark functions are used to evaluate the performance of the proposed IPSODE algorithm. The simulation results show the stability and effectiveness of the IPSODE algorithm in the optimum search, and also demonstrate that IPSODE performs better than the standard algorithm in solving the benchmark functions.

Keywords: particle swarm optimization, differential evolution, optimization, benchmark function.

1 Introduction

Particle swarm optimization (PSO), which stems from the simulation of birds flocking, has attracted wide attention and research. PSO is a stochastic population-based evolutionary optimization method first introduced by Kennedy and Eberhart in 1995 [1,2]. However, the standard PSO algorithm is easily trapped in local optima and may converge prematurely.

Since then, there has been a large amount of research on improving the standard PSO algorithm. In order to enhance the capability of space exploration, Shi and Eberhart introduced a parameter called the inertia weight to the original PSO [3]. On this basis, Shi and Eberhart proposed a linearly varying inertia weight over the generations [4], which improved the performance of PSO significantly. Clerc and Kennedy [5] introduced a constriction factor into PSO. Eberhart and Shi proposed a random inertia weight factor for tracking dynamic systems [6].

DE is an evolutionary algorithm first proposed by Storn and Price [7]. It is very useful for global optimization problems. Like the GA, it includes mutation, crossover, and selection operators. DE has many advantages, such as simple operation and powerful


search. However, DE may be trapped in local optima because it does not use global information about the search space.

In order to construct a more appropriate strategy to balance local search ability and global search ability, researchers have proposed several improvement strategies [8]. Although these improve performance, premature convergence is still not solved fundamentally: in the late stage, particle diversity decreases, leading to a slow convergence rate and a tendency to fall into local optima. This paper proposes an improved PSO algorithm based on DE, called IPSODE. In IPSODE, when the best fitness is not improved, a DE operator is introduced into the iteration process. In this way, IPSODE can maintain the diversity of the swarm and enhance both local and global search ability. Finally, IPSODE is compared with the standard PSO on several of the most commonly used benchmark functions.

2 Particle Swarm Optimization and Differential Evolution

2.1 Particle Swarm Optimization

In the particle swarm optimization algorithm, the trajectory of each individual is adjusted by updating the velocity of each particle, according to its own experience and the experience of its colleagues in the search space. The position of particle $i$ in the $j$-dimensional space can be represented as $X_i = (x_{i1}, x_{i2}, x_{i3}, \ldots, x_{ij})$, and the velocity of the particle is represented as $V_i = (v_{i1}, v_{i2}, v_{i3}, \ldots, v_{ij})$. The best position of each particle, calculated at step $k$, is represented as $pbest_{ij}^{k}$, and the best position of the population found so far at step $k$ is represented as $gbest_{j}^{k}$. Then, the new velocities and positions of the particles for the next step are calculated according to the following equations:

$v_{ij}^{k+1} = \omega v_{ij}^{k} + c_1 r_1 \left( pbest_{ij}^{k} - x_{ij}^{k} \right) + c_2 r_2 \left( gbest_{j}^{k} - x_{ij}^{k} \right)$   (1)

$x_{ij}^{k+1} = x_{ij}^{k} + v_{ij}^{k+1}$   (2)

where $\omega$ is the inertia weight, $c_1$ and $c_2$ are constants known as acceleration coefficients in the range [0, 4], and $r_1$ and $r_2$ are two separately generated uniformly distributed random numbers in the range [0, 1].
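The update in Eqs. (1) and (2) can be sketched in NumPy as follows. The parameter values in the signature (inertia weight 0.729, acceleration coefficients 1.49445) are common illustrative defaults, not values specified in the paper:

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445, rng=None):
    """One PSO update: Eq. (1) for velocities, Eq. (2) for positions.

    x, v, pbest : arrays of shape (N, D); gbest : array of shape (D,).
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)  # fresh uniform numbers in [0, 1] per dimension
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # Eq. (1)
    return x + v_new, v_new                                        # Eq. (2)
```

Note that `r1` and `r2` are drawn independently per particle and per dimension, so each component of the velocity is perturbed separately.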

2.2 Differential Evolution

Differential Evolution is an evolutionary algorithm first introduced by Storn and Price. A key feature of DE is that it uses differential information to generate a new individual: DE adds the weighted difference between two individuals to a third individual. The resulting combination is then evaluated with the objective function for the new individual.


An individual is represented as $X_i = (x_{i1}, x_{i2}, \ldots, x_{iD})$ ($i = 1, 2, \ldots, N$) in a $D$-dimensional space. The variation individual $Q_i = (q_{i1}, q_{i2}, \ldots, q_{iD})$ is then generated by the following equation:

$Q_i^{t+1} = X_{s_1}^{t} + F_c \left( X_{s_2}^{t} - X_{s_3}^{t} \right)$   (3)

where $s_1$, $s_2$, and $s_3$ are distinct integers between 1 and $N$, all different from $i$, $F_c$ is a constriction factor used to control the size of the difference between two individuals, and $t$ is the current iteration.

The new individual $U_i = (u_{i1}, u_{i2}, \ldots, u_{iD})$ is generated by crossover between the parent and the offspring with a given probability:

$u_{ij}^{t+1} = \begin{cases} q_{ij}^{t+1}, & \text{if } rand(j) < CR \\ x_{ij}^{t}, & \text{otherwise} \end{cases}$   (4)

where $rand(j)$ is a random value in the range [0, 1] and $CR$ is the crossover probability in the range [0, 1].

The individual $U_i^{t+1}$ or the parent $X_i^{t}$ is selected as the offspring $X_i^{t+1}$ according to the following equation:

$X_i^{t+1} = \begin{cases} U_i^{t+1}, & \text{if } F\left( U_i^{t+1} \right) < F\left( X_i^{t} \right) \\ X_i^{t}, & \text{otherwise} \end{cases}$   (5)

where $F$ is the objective function value.
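The three DE operators in Eqs. (3)–(5) can be combined into a single generation step. The sketch below uses illustrative values for $F_c$ and $CR$ (the paper does not specify them) and a minimization objective:

```python
import numpy as np

def de_step(pop, fitness, f_obj, Fc=0.5, CR=0.9, rng=None):
    """One DE generation: mutation (3), crossover (4), greedy selection (5)."""
    if rng is None:
        rng = np.random.default_rng()
    N, D = pop.shape
    new_pop = pop.copy()
    new_fit = fitness.copy()
    for i in range(N):
        # Pick three mutually distinct indices s1, s2, s3, all different from i.
        s1, s2, s3 = rng.choice([j for j in range(N) if j != i],
                                size=3, replace=False)
        q = pop[s1] + Fc * (pop[s2] - pop[s3])   # Eq. (3): mutation
        cross = rng.random(D) < CR               # Eq. (4): per-dimension crossover
        u = np.where(cross, q, pop[i])
        fu = f_obj(u)
        if fu < fitness[i]:                      # Eq. (5): keep the better of the two
            new_pop[i], new_fit[i] = u, fu
    return new_pop, new_fit
```

Because the selection in Eq. (5) is greedy, the best fitness in the population can never get worse from one generation to the next.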

3 PSO Based on Differential Evolution Operator

From the above discussion, it is clear that the DE algorithm has some advantages, such as its ability to maintain the diversity of the population and to perform local exploration. By contrast, the PSO algorithm loses population diversity in the later stage of the evolutionary process. This paper introduces an improved method, which achieves a balance between population diversity and convergence rate by combining the DE operator.

The process of the improved PSO algorithm based on the DE operator can be de-scribed as follows:

Step1. Initialize the population, including each particle's position and velocity vector.

Step2. Evaluate the fitness value of each particle.

Step3. Update the Pbest of each particle and the Gbest of the population, respectively.

Step4. Update each particle's velocity and position using equation (1) and equation (2), respectively.

Step5. Compare the best fitness value of the population with that of the previous step. If it is not improved, introduce the mutation of the DE algorithm into the population evolutionary process: randomly select a particle in the population and obtain a new individual by equation (3). If the fitness value of the parent is better than that of the new individual, the particle remains the same; otherwise, the new individual is used as the offspring.

Step6. Judge the stop criterion: whether the iteration has reached the given number of steps or the best fitness value has reached the given value. If the criterion is satisfied, stop the program; otherwise, go to Step2.
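The steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the parameter values (ω, c1, c2, Fc), the velocity initialization, and the simple "no improvement this iteration" stagnation test are all assumptions made for the sketch:

```python
import numpy as np

def ipsode(f, dim, n=30, xmax=100.0, iters=200,
           w=0.729, c1=1.49445, c2=1.49445, Fc=0.5, rng=None):
    """Sketch of the IPSODE loop (Steps 1-6). The DE mutation of Step 5
    fires only when the global best fitness stagnates."""
    if rng is None:
        rng = np.random.default_rng()
    x = rng.uniform(-xmax, xmax, (n, dim))        # Step 1: positions
    v = rng.uniform(-xmax, xmax, (n, dim)) * 0.1  #         and velocities
    fit = np.apply_along_axis(f, 1, x)            # Step 2: evaluate
    pbest, pfit = x.copy(), fit.copy()
    g = pfit.argmin()
    gbest, gfit = pbest[g].copy(), pfit[g]
    for _ in range(iters):                        # Step 6: fixed-iteration stop
        prev_gfit = gfit
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)   # Step 4: Eq. (1)
        x = np.clip(x + v, -xmax, xmax)                   #         Eq. (2)
        fit = np.apply_along_axis(f, 1, x)
        better = fit < pfit                               # Step 3: pbest / gbest
        pbest[better], pfit[better] = x[better], fit[better]
        g = pfit.argmin()
        if pfit[g] < gfit:
            gbest, gfit = pbest[g].copy(), pfit[g]
        if gfit >= prev_gfit:                     # Step 5: DE mutation on stagnation
            i = rng.integers(n)
            s1, s2, s3 = rng.choice([j for j in range(n) if j != i],
                                    size=3, replace=False)
            q = np.clip(x[s1] + Fc * (x[s2] - x[s3]), -xmax, xmax)
            if f(q) < fit[i]:                     # greedy acceptance, as in Eq. (5)
                x[i], fit[i] = q, f(q)
    return gbest, gfit
```

Since both pbest/gbest updates and the DE acceptance are greedy, the returned best fitness is non-increasing over iterations.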

4 Simulation Results

To evaluate the capability of the proposed IPSODE algorithm in the optimum search, we use a set of classical benchmark functions that are widely used to evaluate the performance of PSO algorithms. The IPSODE method is compared with the SPSO method on the solution search. Simulations were carried out to find the global minimum of each function. The benchmark functions are as follows:

Spherical:

$f_1(x) = \sum_{i=1}^{n} x_i^2$   (6)

Rosenbrock:

$f_2(x) = \sum_{i=1}^{n-1} \left[ 100 \left( x_{i+1} - x_i^2 \right)^2 + \left( x_i - 1 \right)^2 \right]$   (7)

Step:

$f_3(x) = \sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2$   (8)

Rastrigin:

$f_4(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos\left( 2 \pi x_i \right) + 10 \right]$   (9)

Ackley:

$f_5(x) = -20 \exp\left( -0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \tfrac{1}{n} \sum_{i=1}^{n} \cos\left( 2 \pi x_i \right) \right) + 20 + e$   (10)
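The five benchmark functions transcribe directly from Eqs. (6)–(10); the sketch below assumes the Step function uses the usual floor operator:

```python
import numpy as np

def sphere(x):        # Eq. (6)
    return np.sum(x**2)

def rosenbrock(x):    # Eq. (7)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1.0)**2)

def step(x):          # Eq. (8)
    return np.sum(np.floor(x + 0.5)**2)

def rastrigin(x):     # Eq. (9)
    return np.sum(x**2 - 10.0 * np.cos(2 * np.pi * x) + 10.0)

def ackley(x):        # Eq. (10)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20.0 + np.e)
```

Each function attains its global minimum of 0: at the origin for f1, f3, f4, and f5, and at x = (1, ..., 1) for f2.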

The dimension of each particle in the population, the search range and the global optimum of each function are listed in Table 1.

To compare the optimum-search capability of the proposed IPSODE algorithm with that of the SPSO algorithm, the average and the standard deviation of the fitness values over 50 trials are calculated for each function and listed in Table 2.


Table 1. Configuration for simulation parameters

Function   Dimension   [-Xmax, Xmax]    Optimum
f1         30          [-100, 100]      0
f2         30          [-30, 30]        0
f3         30          [-100, 100]      0
f4         30          [-5.12, 5.12]    0
f5         30          [-32, 32]        0

As shown in Table 2, the simulation results show that the IPSODE algorithm is superior to the SPSO algorithm on all the benchmark functions. The average and standard deviation of IPSODE are both smaller than those of SPSO, which demonstrates the effectiveness and stability of IPSODE. Overall, the PSO combined with the DE operator overcomes the disadvantages of PSO efficiently. The simulation results indicate that this method is effective.

Table 2. The comparison results of two algorithms after 2000 runs

Function   Algorithm   Average       Standard Deviation
f1         IPSODE      2.2419e-038   1.3992e-037
f1         SPSO        2.1297e-005   3.5633e-005
f2         IPSODE      3.1063e+001   2.4206e+001
f2         SPSO        6.8518e+001   5.8013e+001
f3         IPSODE      0             0
f3         SPSO        0             0
f4         IPSODE      2.230e+001    5.7973e+000
f4         SPSO        2.3832e+001   7.5055e+000
f5         IPSODE      7.6710e-010   5.4241e-009
f5         SPSO        1.9864e-003   6.0721e-003

5 Conclusion

In this paper, the IPSODE algorithm based on the DE operator is proposed. The mutation operator of the DE algorithm is introduced into the evolutionary process to maintain the diversity of the population and avoid local optima. Simulation results show that the proposed algorithm outperforms the standard PSO on all the benchmark functions. The proposed IPSODE avoids premature convergence of the particles and improves the ability of global convergence in the optimum search.


References

1. Kennedy, J., Eberhart, R.C.: Particle Swarm Optimization. In: IEEE Int'l Conf. on Neural Networks, pp. 1942–1948. IEEE Press, Piscataway (1995)

2. Eberhart, R.C., Kennedy, J.: A New Optimizer Using Particle Swarm Theory. In: Sixth Int'l Symposium on Micro Machine and Human Science, pp. 39–43. IEEE Press, Piscataway (1995)

3. Shi, Y., Eberhart, R.C.: A Modified Particle Swarm Optimizer. In: IEEE Int'l Conf. on Evolutionary Computation, pp. 69–73. IEEE Press, Piscataway (1998)

4. Shi, Y., Eberhart, R.C.: Empirical Study of Particle Swarm Optimization. In: IEEE Congr. on Evolutionary Computation, pp. 101–106. IEEE Press, Piscataway (1999)

5. Clerc, M., Kennedy, J.: The Particle Swarm - Explosion, Stability, and Convergence in a Multidimensional Complex Space. IEEE Transactions on Evolutionary Computation 6(1), 58–73 (2002)

6. Eberhart, R.C., Shi, Y.: Tracking and Optimizing Dynamic Systems with Particle Swarms. In: IEEE Congr. on Evolutionary Computation 2001, pp. 94–97. IEEE Press, Piscataway (2001)

7. Storn, R., Price, K.: Differential Evolution: A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces. Technical Report, International Computer Science Institute (1995)

8. Hao, Z.F., Guo, G.H., Huang, H.: A Particle Swarm Optimization Algorithm with Differential Evolution. In: Proceedings of the Sixth International Conference on Machine Learning and Cybernetics, pp. 19–22. IEEE Press, Piscataway (2007)