
Hindawi Publishing Corporation
The Scientific World Journal
Volume 2013, Article ID 125625, 9 pages
http://dx.doi.org/10.1155/2013/125625

Research Article

An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

Lihong Guo,1 Gai-Ge Wang,2 Heqi Wang,1 and Dinan Wang1

1 Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2 School of Computer Science and Technology, Jiangsu Normal University, Xuzhou, Jiangsu 221116, China

Correspondence should be addressed to Gai-Ge Wang; gaigewang@163.com

Received 10 August 2013; Accepted 29 September 2013

Academic Editors: Z. Cui and X. Yang

Copyright © 2013 Lihong Guo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A hybrid metaheuristic approach combining harmony search (HS) and the firefly algorithm (FA), namely HS/FA, is proposed for function optimization. In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA has a faster convergence speed than either HS or FA. Also, a top fireflies scheme is introduced to reduce running time, and HS is utilized to mutate between fireflies when updating fireflies. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.

1. Introduction

In engineering problems, optimization means looking for a vector that maximizes or minimizes a function. Nowadays, stochastic methods are generally utilized to cope with optimization problems [1]. Though there are many ways to classify them, a simple one divides them into two groups according to their nature: deterministic and stochastic. Deterministic algorithms get the same solutions if the initial conditions are unchanged, because they always follow the same rigorous moves. Stochastic ones, by contrast, are based on certain stochastic distributions, so regardless of the initial values they generally generate different solutions. In fact, both kinds can find satisfactory solutions after some generations. Recently, nature-inspired algorithms have proven well capable of solving numerical optimization problems efficiently.

These metaheuristic approaches are developed to solve complicated problems, like permutation flow shop scheduling [2], reliability [3, 4], high-dimensional function optimization [5], and other engineering problems [6, 7]. In the 1950s, natural evolution was idealized as an optimization technique, and this gave rise to a new type of approach, namely, genetic algorithms (GAs) [8]. After that, many other metaheuristic methods have appeared, like evolutionary strategy (ES) [9, 10], ant colony optimization (ACO) [11], probability-based incremental learning (PBIL) [12], the big bang-big crunch algorithm [13–16], harmony search (HS) [17–19], charged system search (CSS) [20], artificial physics optimization [21], the bat algorithm (BA) [22, 23], animal migration optimization (AMO) [24], krill herd (KH) [25–27], differential evolution (DE) [28–31], particle swarm optimization (PSO) [32–35], stud GA (SGA) [36], cuckoo search (CS) [37, 38], the artificial plant optimization algorithm (APOA) [39], biogeography-based optimization (BBO) [40], and the FA method [41, 42].

As a global optimization method, FA [42] was first proposed by Yang in 2008, inspired by the swarming behavior of fireflies. Recent research demonstrates that the FA is quite powerful and relatively efficient [43]. Furthermore, the performance of FA can be improved with feasible promising results [44]. In addition, nonconvex problems can be solved by FA [45]. A summary of swarm intelligence covering FA is given by Parpinelli and Lopes [46].

On the other hand, HS [17, 47] is a novel heuristic technique for optimization problems. In engineering optimization, engineers make an effort to find the optimum decided by an objective function, while in the music improvisation process, musicians search for the most satisfactory harmony as judged aesthetically. The HS method originates in the similarity between the two [1].


In most cases, FA can find the optimal solution with its exploitation. However, the search used in FA is based on randomness, so it cannot always reach the global best values. On the one hand, in order to improve the diversity of fireflies, an improvement of adding HS is made to the FA, which can be treated as a mutation operator. By combining the principles of HS and FA, an enhanced FA is proposed to look for the best objective function value. On the other hand, FA needs much more time to search for the best solution, and its performance significantly deteriorates as the population size increases. In HS/FA, a top fireflies scheme is introduced to reduce running time. This scheme is carried out by reducing the outer loop in FA. Through the top fireflies scheme, the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. The proposed approach is evaluated on various benchmarks. The results demonstrate that HS/FA performs more effectively and accurately than FA and other intelligent algorithms.

The rest of this paper is structured as follows. To begin with, a brief background on HS and FA is provided in Sections 2 and 3, respectively. Our proposed HS/FA is presented in Section 4. HS/FA is verified through various functions in Section 5, and Section 6 presents the general conclusions.

2. HS Method

As a relatively new optimization technique, HS has four optimization operators [17, 48, 49]:

HM: the harmony memory, as shown in (1);

HMS: the harmony memory size;

HMCR: the harmony memory consideration rate;

PAR: the pitch adjustment rate; and

bw: the pitch adjustment bandwidth [1].

Consider

$$\mathrm{HM} = \left[\;\begin{matrix} x_1^1 & x_2^1 & \cdots & x_D^1 \\ x_1^2 & x_2^2 & \cdots & x_D^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} \end{matrix} \;\middle|\; \begin{matrix} \mathrm{fitness}(x^1) \\ \mathrm{fitness}(x^2) \\ \vdots \\ \mathrm{fitness}(x^{\mathrm{HMS}}) \end{matrix}\;\right] \tag{1}$$

The HS method can be explained by the player improvisation process. There are three feasible options for a player in the music improvisation process: (1) play pitches from memory, which happens with the rate HMCR; (2) play some pitches similar to a known piece; or (3) improvise new pitches [1]. These three options can be idealized into three components: use of HM, pitch adjusting, and randomization [1].

Similar to selecting the optimal individuals in GA, the first component is important [1]. It guarantees that the optimal harmonies will not be destroyed in the HM. To make HS more powerful, the parameter HMCR should be properly set [1]. Through several experiments, in most cases HMCR = 0.7–0.95.

Table 1: Benchmark functions.

No.  Name                    No.  Name
F01  Beale                   F19  Holzman 2 function
F02  Bohachevsky 1           F20  Levy
F03  Bohachevsky 2           F21  Pathological function
F04  Bohachevsky 3           F22  Penalty 1
F05  Booth                   F23  Penalty 2
F06  Branin                  F24  Powell
F07  Easom                   F25  Quartic with noise
F08  Foxholes                F26  Rastrigin
F09  Freudenstein-Roth       F27  Rosenbrock
F10  Goldstein-Price         F28  Schwefel 2.26
F11  Hump                    F29  Schwefel 1.2
F12  Matyas                  F30  Schwefel 2.22
F13  Ackley                  F31  Schwefel 2.21
F14  Alpine                  F32  Sphere
F15  Brown                   F33  Step
F16  Dixon and Price         F34  Sum function
F17  Fletcher-Powell         F35  Zakharov
F18  Griewank                F36  Wavy1

The pitch in the second component needs to be adjusted slightly, and hence a proper method is used to adjust the frequency [1]. The new pitch $x_{\mathrm{new}}$ is updated by

$$x_{\mathrm{new}} = x_{\mathrm{old}} + bw\,(2\varepsilon - 1), \tag{2}$$

where ε is a random number in [0, 1], $x_{\mathrm{old}}$ is the current pitch, and bw is the bandwidth.

The parameter PAR should also be appropriately set. If PAR is very close to 1, then the solution is always being updated and HS is hard to converge. If it is close to 0, then little change is made and HS may be premature. So here we set PAR = 0.1–0.5 [1].

To improve the diversity, randomization is necessary, as shown in the third component. The use of randomization allows the method to go a step further into promising areas so as to find the optimal solution [1].

The HS method is presented in Algorithm 1, where D is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution.

3. FA Method

FA [42] is a metaheuristic approach for optimization problems. The search strategy in FA comes from the swarm behavior of fireflies [50]. There are two significant issues in FA: the formulation of attractiveness and the variation of light intensity [42].

For simplicity, several characteristics of fireflies are idealized into the three rules described in [51]. Based on these three rules, the FA can be described as in Algorithm 2.

For two fireflies $x_i$ and $x_j$, firefly i is updated as follows:

$$x_i^{t+1} = x_i^t + \beta_0 e^{-\gamma r_{ij}^2}\left(x_j^t - x_i^t\right) + \alpha \varepsilon_i^t, \tag{3}$$

where α is the step size, β0 is the attractiveness at r = 0, the second term is the attraction, and the third is randomization [50].


Begin
  Step 1: Initialize the HM.
  Step 2: Evaluate the fitness.
  Step 3: while the halting criterion is not satisfied do
    for d = 1 to D do
      if rand < HMCR then  // memory consideration
        x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
        if rand < PAR then  // pitch adjustment
          x_new(d) = x_old(d) + bw × (2 × rand − 1)
        endif
      else  // random selection
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      endif
    endfor d
    Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective)
    Update the best harmony vector
  Step 4: end while
  Step 5: Output results.
End

Algorithm 1: HS method.
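As a concrete illustration, the improvisation loop of Algorithm 1 can be sketched in Python. This is a minimal sketch under our own assumptions: the sphere objective, a single scalar bound pair per dimension, and the helper name `harmony_search` are ours, not from the paper.

```python
import random

def harmony_search(f, lb, ub, D, HMS=10, HMCR=0.9, PAR=0.1, bw=0.01, max_iter=1000):
    """Minimal harmony search sketch following Algorithm 1 (assumed details)."""
    # Step 1: initialize the harmony memory (HM) with random vectors
    hm = [[random.uniform(lb, ub) for _ in range(D)] for _ in range(HMS)]
    # Step 2: evaluate the fitness of each harmony
    fit = [f(x) for x in hm]
    for _ in range(max_iter):  # Step 3: improvise until the halting criterion
        x_new = []
        for d in range(D):
            if random.random() < HMCR:            # memory consideration
                val = hm[random.randrange(HMS)][d]
                if random.random() < PAR:          # pitch adjustment, eq. (2)
                    val += bw * (2 * random.random() - 1)
            else:                                  # random selection
                val = lb + random.random() * (ub - lb)
            x_new.append(min(max(val, lb), ub))
        # replace the worst harmony x_w if the new one is better (minimization)
        w = max(range(HMS), key=lambda i: fit[i])
        f_new = f(x_new)
        if f_new < fit[w]:
            hm[w], fit[w] = x_new, f_new
    best = min(range(HMS), key=lambda i: fit[i])
    return hm[best], fit[best]

# usage: minimize the sphere function in 5 dimensions
best_x, best_f = harmony_search(lambda x: sum(v * v for v in x), -5.0, 5.0, 5)
```

Since the best harmony in memory only ever improves, the returned fitness is monotonically nonincreasing over iterations.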

Begin
  Step 1: Initialization. Set G = 1; define γ; set step size α and β0 at r = 0.
  Step 2: Evaluate the light intensity I determined by f(x).
  Step 3: While G < MaxGeneration do
    for i = 1 to NP (all NP fireflies) do
      for j = 1 to NP (NP fireflies) do
        if (I_j < I_i) then
          move firefly i towards j
        end if
        Update attractiveness
        Update light intensity
      end for j
    end for i
    G = G + 1
  Step 4: end while
  Step 5: Output the results.
End

Algorithm 2: Firefly algorithm (FA) method.

In our present work, we take β0 = 1, α ∈ [0, 1], and γ = 1 [50].
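The position update in (3) can be sketched as follows; this is a minimal sketch in which the function name and the uniform random step standing in for ε are our own assumptions:

```python
import math
import random

def move_firefly(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move firefly i towards the brighter firefly j following eq. (3)."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))  # squared distance r_ij^2
    beta = beta0 * math.exp(-gamma * r2)            # attractiveness decays with distance
    # attraction towards x_j plus a random step of scale alpha
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]

# with alpha = 0 the move is pure attraction, landing strictly between x_i and x_j
x_next = move_firefly([0.0, 0.0], [1.0, 1.0], alpha=0.0)
```

The exponential factor means distant fireflies attract each other only weakly, which is what limits FA's exploration and motivates the hybrid below.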

4. HS/FA

Based on the introduction of HS and FA in the previous sections, the combination of the two approaches is now described, and HS/FA is proposed, which updates poor solutions to accelerate convergence.

HS and FA are adept at exploring the search space and exploiting solutions, respectively. Therefore, in the present work, a hybrid formed by inducing HS into the FA method, named HS/FA, is utilized to deal with optimization problems; the HS part can be considered a mutation operator. By this strategy, the HS mutation and the FA can explore new search space and exploit the population, respectively. Therefore, the hybrid can overcome the FA's lack of exploration.

To combat the random walks used in FA, in the present work a mutation operator is introduced into the FA, comprising two detailed improvements.

The first one is the introduction of a top fireflies scheme into FA to reduce running time; it is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, due to its dual loop, the time complexity is O(NP²), so performance significantly deteriorates as the population size increases. This improvement is carried out by reducing the outer loop in FA. In HS/FA, we select the special fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) to form the top fireflies, and all the fireflies only move towards the top fireflies. Through the top fireflies scheme,


Begin
  Step 1: Initialization. Set t = 1; define γ; set α and β0 at r = 0; set HMCR and PAR; set the number of top fireflies KEEP.
  Step 2: Evaluate the light intensity I.
  Step 3: While t < MaxGeneration do
    Sort the fireflies by light intensity I
    for i = 1 to KEEP (all top fireflies) do
      for j = 1 to NP (all fireflies) do
        if (I_j < I_i) then
          Move firefly i towards j
        else  // Mutate
          for k = 1 to D (all elements) do
            if (rand < HMCR) then
              r1 = ⌈NP × rand⌉
              x_v(k) = x_r1(k)
              if (rand < PAR) then
                x_v(k) = x_v(k) + bw × (2 × rand − 1)
              end if
            else
              x_v(k) = x_min,k + rand × (x_max,k − x_min,k)
            end if
          end for k
        end if
        Update attractiveness
        Update light intensity
      end for j
    end for i
    Evaluate the light intensity I
    Sort the population by light intensity I
    t = t + 1
  Step 4: end while
End

Algorithm 3: HS/FA method.

the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so the time used by HS/FA is much less than that of FA. Apparently, if KEEP = NP, HS/FA degenerates to the standard FA. If KEEP is too small, only a few of the best fireflies are selected to form the top fireflies, and the algorithm converges too fast and may be premature for lack of diversity. If KEEP is extremely big (near NP), almost all the fireflies are used to form the top fireflies, so all fireflies are explored well, leading to potentially optimal solutions, but the algorithm performs badly and converges too slowly. Therefore, we use KEEP = 2 in our study.
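The loop-count saving can be sanity-checked with a toy computation; the NP value below is an arbitrary example, while KEEP = 2 follows the study:

```python
# Inner-loop visits per generation: standard FA vs. the top fireflies scheme.
NP, KEEP = 50, 2                    # example population size; KEEP = 2 as in the paper
full_fa_visits = NP * NP            # dual loop over all fireflies: O(NP^2)
hsfa_visits = KEEP * NP             # outer loop restricted to the KEEP top fireflies
print(full_fa_visits, hsfa_visits)  # prints 2500 100
```

For NP = 50 this is a 25× reduction in pairwise comparisons per generation, which is where the claimed running-time saving comes from.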

The second is the addition of HS serving as a mutation operator, striving to improve population diversity and avoid premature convergence. In the standard FA, if firefly i is brighter than firefly j, firefly j will move towards firefly i, and then the newly-generated fireflies are evaluated and the light intensity is updated; otherwise, firefly j does nothing. However, in HS/FA, if firefly i is not brighter than firefly j, firefly j is updated by the mutation operation to improve its light intensity. More concretely, for the global search part of HS/FA, we tune every element x_kj (k = 1, 2, ..., D) in x_j (the position of firefly j) using HS. When ξ1 is not less than HMCR, that is, ξ1 ≥ HMCR, the element x_kj is updated randomly, whereas when ξ1 < HMCR, we update the element x_kj in accordance with x_r1. In the latter case, the pitch adjustment operation in HS is applied to update x_kj if ξ2 < PAR, to increase population diversity, as shown in (2). Here ξ1 and ξ2 are two uniformly distributed random numbers in [0, 1], r1 is an integer in [1, NP], and NP is the population size.

In sum, the detailed presentation of HS/FA is given in Algorithm 3.
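The mutation branch described above (executed when firefly j is not moved) can be sketched in Python; this is our own reading of Algorithm 3, with hypothetical helper names and bounds:

```python
import random

def hs_mutate(x_j, population, lb, ub, HMCR=0.9, PAR=0.1, bw=0.01):
    """HS-style mutation of firefly j's position vector, per Algorithm 3."""
    NP = len(population)
    x_v = list(x_j)
    for k in range(len(x_v)):
        if random.random() < HMCR:       # xi_1 < HMCR: copy element k from x_r1
            r1 = random.randrange(NP)    # 0-based stand-in for r1 = ceil(NP * rand)
            x_v[k] = population[r1][k]
            if random.random() < PAR:    # xi_2 < PAR: pitch adjustment, eq. (2)
                x_v[k] += bw * (2 * random.random() - 1)
        else:                            # xi_1 >= HMCR: random re-initialization
            x_v[k] = lb + random.random() * (ub - lb)
        x_v[k] = min(max(x_v[k], lb), ub)  # keep within bounds
    return x_v

# usage: mutate the first firefly of a small random population
pop = [[random.uniform(-1.0, 1.0) for _ in range(4)] for _ in range(6)]
mutant = hs_mutate(pop[0], pop, -1.0, 1.0)
```

Because most elements are copied from random population members, the mutant recombines existing information, while the occasional random re-initialization injects fresh diversity.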

5. The Results

The HS/FA method is tested on optimization problems through several simulations conducted on test problems. To make a fair comparison between the different methods, all the experiments were conducted under the same conditions, described in [1].

In this section, HS/FA is compared on optimization problems with nine other methods, which are ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. Here, for HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ =


Figure 1: Performance comparison for the F26 Rastrigin function.

Figure 2: Performance comparison for the F28 Schwefel 2.26 function.

1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods are as in [48, 53]. Thirty-six functions are utilized to verify our HS/FA method, as shown in Table 1. More details of all the benchmarks can be found in [54].

Because all the intelligent algorithms have some randomness, in order to get representative statistical features,

Figure 3: Performance comparison for the F30 Schwefel 2.22 function.

Figure 4: Performance comparison for the F33 step function.

we ran 500 implementations of each method on each problem. Tables 2 and 3 illustrate the average and best results found by each algorithm, respectively. Note that we have used two different scales to normalize the values in the tables; the detailed process can be found in [54]. The dimension of each function is set to 30.


Table 2: Mean normalized optimization results.

No.   ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.01    1.01    1.00    1.02    1.08    1.04    1.07    1.00    1.01    1.25
F02   1.43    2.71    1.00    2.55    1.00    1.39    16.66   1.00    3.13    1.22
F03   1.25    1.84    1.00    2.28    1.00    1.17    11.77   1.01    3.50    1.26
F04   3.9E4   1.7E5   49.66   2.8E5   1.00    4.0E4   2.0E6   1.5E3   3.7E5   2.5E5
F05   1.01    1.02    1.00    1.11    1.00    1.01    1.15    1.00    1.05    1.19
F06   1.03    1.02    1.00    1.09    1.00    1.02    1.03    1.00    1.03    3.01
F07   2.40    2.48    2.27    2.35    2.23    1.83    1.88    1.00    1.71    2.99
F08   1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.00
F09   1.03    1.01    1.00    2.05    17.29   1.00    6.55    1.00    1.04    1.24
F10   2.40    2.40    2.40    3.06    2.40    2.40    3.09    2.40    2.70    1.00
F11   1.00    1.00    1.00    1.03    1.00    1.00    1.03    1.00    1.02    1.25
F12   1.00    1.00    1.00    1.01    1.00    1.00    1.02    1.00    1.00    1.14
F13   4.32    2.56    3.68    5.56    1.39    4.98    5.70    1.00    4.83    2.63
F14   36.17   7.98    43.16   73.74   11.68   33.13   70.32   1.00    53.62   8.48
F15   570.98  14.02   27.90   1.1E3   141.86  99.02   652.65  1.00    485.92  12.10
F16   1.6E3   75.20   317.08  1.2E4   7.35    942.08  1.1E4   1.00    1.6E3   26.84
F17   21.98   2.35    7.67    21.63   5.30    7.33    19.01   1.00    15.46   2.26
F18   8.49    5.40    14.18   66.70   2.31    28.49   139.02  1.00    52.77   5.69
F19   2.8E3   167.15  544.18  1.9E4   25.71   1.3E3   1.9E4   1.00    2.7E3   39.88
F20   93.79   13.59   68.16   276.60  20.05   92.42   282.65  1.00    173.17  9.33
F21   3.20    2.49    1.74    1.00    3.69    2.65    3.88    1.78    2.55    2.42
F22   1.2E8   9.7E3   2.8E5   5.1E7   6.64    5.8E5   7.8E7   1.00    7.9E6   9.81
F23   2.2E7   2.9E4   3.1E5   1.4E7   7.36    6.7E5   2.2E7   1.00    3.5E6   5.0E3
F24   112.59  8.00    48.08   188.98  1.04    25.76   133.99  1.00    52.91   2.92
F25   1.2E3   103.34  637.38  1.8E4   17.91   1.4E3   1.8E4   1.00    4.1E3   62.77
F26   24.37   4.58    21.06   32.51   7.75    20.84   29.89   1.00    23.03   7.29
F27   37.23   2.38    5.34    49.70   1.00    10.38   34.12   1.04    12.06   2.00
F28   37.76   18.45   73.51   92.92   93.82   31.93   109.20  1.00    112.28  21.21
F29   4.79    2.52    6.72    7.41    1.00    5.40    7.13    1.93    4.81    4.31
F30   42.08   6.33    16.76   63.65   9.36    30.16   53.57   1.00    35.27   8.32
F31   2.98    3.13    3.87    4.56    1.00    3.93    4.78    1.15    3.97    2.79
F32   205.80  13.56   37.41   382.80  1.87    131.27  361.49  1.00    151.62  14.65
F33   40.44   20.26   53.05   312.01  5.14    111.20  471.25  1.00    194.10  15.72
F34   274.21  27.02   46.57   546.26  6.17    138.85  550.25  1.00    188.96  25.75
F35   1.2E5   1.46    3.32    3.57    1.18    3.12    3.34    1.00    3.12    2.64
F36   9.82    5.26    14.12   29.95   10.67   16.90   35.57   1.00    23.95   5.37

The bold data are the best function values among the different methods for the specified function.

From Table 2, on average, HS/FA is well capable of finding the function minimum on twenty-eight of the thirty-six functions, while FA performs the second best, on ten of the thirty-six functions. Table 3 shows that HS/FA and FA perform the best on twenty-two and seventeen of the thirty-six functions, respectively. ACO, DE, and GA perform the best on eight benchmarks. From the above tables, we can see that, for low-dimensional functions, both FA and HS/FA perform well, and their performance differs little.

Further, convergence graphs of the ten methods for the most representative functions are illustrated in Figures 1, 2, 3, and 4, which show the optimization process. The values here are the real mean function values from the above experiments.

F26 is a complicated multimodal function with a single global minimum of 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 with the fastest speed. Here, FA converges a little faster initially, but it is likely to be trapped in subminima, as the function value afterwards decreases only slightly.

F28 is also a multimodal problem, and it has only a global value 0. For this problem, HS/FA is superior to the other nine methods and finds the optimal value earliest.

For this function, the figure illustrates that HS/FA significantly outperforms all the others in the optimization process. At last, HS/FA converges to the best solution, superior to the others. BBO is only inferior to HS/FA and performs the second best for this case.


Table 3: Best normalized optimization results.

No.   ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02   1.00    1.71    1.00    1.53    1.00    1.00    1.49    1.00    1.72    1.00
F03   1.00    1.28    1.00    1.12    1.00    1.00    1.28    1.00    1.34    1.13
F04   2.0E14  2.0E14  3.2E10  2.3E15  2.5E8   1.00    1.1E15  3.8E11  1.0E15  4.5E15
F05   1.00    1.00    1.00    1.02    1.00    1.00    1.00    1.00    1.00    1.00
F06   1.01    1.01    1.00    1.01    1.00    1.01    1.00    1.00    1.00    2.51
F07   2.5E6   3.3E6   7.2E5   1.6E6   1.00    1.7E4   2.1E5   2.88    3.7E5   3.3E6
F08   1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.00
F09   1.00    1.00    1.00    1.01    1.00    1.00    1.00    1.00    1.00    1.00
F10   2.65    2.65    2.65    2.74    2.65    2.65    2.65    2.65    2.65    1.00
F11   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.03
F12   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F13   8.34    3.93    7.05    11.02   1.00    8.71    11.56   1.54    9.51    3.98
F14   63.46   11.42   88.82   159.53  12.22   40.65   144.04  1.00    102.48  10.66
F15   218.30  9.54    47.47   591.91  33.50   128.64  792.73  1.00    358.64  9.44
F16   4.7E3   109.24  1.3E3   4.0E4   1.00    315.53  6.1E4   2.31    5.9E3   43.61
F17   42.58   3.24    11.83   41.06   1.20    9.50    43.49   1.00    29.07   3.02
F18   7.99    3.62    13.85   63.26   1.00    14.42   160.01  1.08    37.87   2.32
F19   3.1E3   135.16  893.82  3.9E4   1.56    566.65  5.4E4   1.00    6.1E3   3.67
F20   251.29  32.94   142.30  720.64  7.96    131.48  863.43  1.00    483.70  23.66
F21   3.96    2.89    1.65    1.00    4.60    2.91    4.87    1.90    2.72    2.37
F22   31.83   55.55   1.5E5   1.3E8   8.26    89.30   3.0E8   1.00    5.8E6   15.15
F23   1.00    4.4E3   1.5E6   8.8E7   27.10   3.3E4   1.6E8   4.89    2.8E7   29.22
F24   2.2E3   88.70   1.1E3   4.7E3   1.60    215.01  3.4E3   1.00    940.82  34.50
F25   3.0E3   380.11  2.9E3   1.0E5   1.00    3.1E3   1.3E5   2.01    2.4E4   54.67
F26   39.66   5.65    30.04   58.39   6.03    27.36   42.32   1.00    38.57   8.91
F27   54.77   2.13    12.53   87.36   1.27    10.85   58.14   1.00    21.11   2.77
F28   164.45  67.82   335.75  447.01  430.29  85.82   596.00  1.00    551.44  68.85
F29   8.78    4.00    16.50   15.31   1.00    10.85   18.42   3.27    6.75    7.29
F30   63.53   10.38   27.71   105.79  7.15    47.04   88.91   1.00    58.02   12.83
F31   3.80    5.63    7.23    9.39    1.00    7.31    10.20   1.93    7.56    4.53
F32   740.24  30.59   184.72  1.8E3   1.00    322.80  1.8E3   2.87    725.62  31.00
F33   149.29  66.43   224.57  1.4E3   3.86    255.71  2.4E3   1.00    1.0E3   42.71
F34   491.51  35.62   100.26  1.1E3   1.64    113.66  1.1E3   1.00    400.61  27.55
F35   3.44    2.50    5.89    6.67    1.00    4.28    5.48    1.46    3.83    3.74
F36   11.05   6.01    18.56   40.10   9.07    18.47   43.18   1.00    31.18   4.64

The bold data are the best function values among the different methods for the specified function.

HS/FA significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, while HS/FA is well capable of improving its solution steadily in the long run. Here, FA converges faster initially (within 20 iterations); however, it seems to be trapped in subminima, as the function value decreases only slightly afterwards (after 20 iterations), and it is outperformed by HS/FA after 30 iterations.

From Figures 1–4, our HS/FA's performance is far better than the others'. In general, BBO and FA, especially FA, are only inferior to HS/FA. Note that, in [40], BBO was compared with seven EAs and on an engineering problem, and those experiments proved the excellent performance of BBO. This also indirectly shows that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combination with the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating fireflies. The new harmony vector takes the place of the new firefly only if it is better than before, which generally outperforms HS


and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems is used to compare HS/FA with the nine other approaches. The results demonstrate that HS/FA is able to use the available knowledge more efficiently to find much better values than the other optimization algorithms.

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.

[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.

[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.

[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.

[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.

[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.

[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, UK, 2004.

[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.

[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.

[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.

[15] A. Kaveh and S. Talatahari, "Optimal design of schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.

[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.

[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.

[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.

[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.

[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.

[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.

[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.

[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.

[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.

[28] R. Storn and K. Price, "Differential evolution—a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.

[29] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.

[31] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and G.-S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.

[32] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[33] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012

[34] S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, "A multi-stage particle swarm for optimum design of truss structures," Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.

[35] K. Y. Huang, "A hybrid particle swarm optimization approach for clustering and classification of datasets," Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.

[36] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.

The Scientific World Journal 9

[37] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.

[38] A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.

[39] X. Cai, S. Fan, and Y. Tan, "Light responsive curve selection for photosynthesis operator of APOA," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.

[40] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.

[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.

[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.

[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.

[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing, vol. 12, no. 3, pp. 1180–1186, 2012.

[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.

[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.

[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.

[53] G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.

[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.


In most cases, FA can find the optimal solution with its exploitation. However, the search used in FA is based on randomness, so it cannot always get the global best values. On the one hand, in order to improve the diversity of fireflies, an improvement of adding HS is made to the FA, which can be treated as a mutation operator. By combining the principles of HS and FA, an enhanced FA is proposed to look for the best objective function value. On the other hand, FA needs much more time to search for the best solution, and its performance significantly deteriorates with increases in population size. In HS/FA, a top fireflies scheme is introduced to reduce running time. This scheme is carried out by reduction of the outer loop in FA. Through the top fireflies scheme, the time complexity of HS/FA decreases from O(NP^2) to O(KEEP × NP), where KEEP is the number of top fireflies. The proposed approach is evaluated on various benchmarks. The results demonstrate that HS/FA performs more effectively and accurately than FA and other intelligent algorithms.

The rest of this paper is structured as follows. To begin with, a brief background on HS and FA is provided in Sections 2 and 3, respectively. Our proposed HS/FA is presented in Section 4. HS/FA is verified through various functions in Section 5, and Section 6 presents the general conclusions.

2. HS Method

As a relatively recent optimization technique, HS has the following optimization operators [17, 48, 49]:

HM: the harmony memory, as shown in (1);

HMS: the harmony memory size;

HMCR: the harmony memory consideration rate;

PAR: the pitch adjustment rate; and

bw: the pitch adjustment bandwidth [1].

Consider

\[
\mathrm{HM} = \left[
\begin{array}{cccc|c}
x_1^{1} & x_2^{1} & \cdots & x_D^{1} & \text{fitness}(x^{1}) \\
x_1^{2} & x_2^{2} & \cdots & x_D^{2} & \text{fitness}(x^{2}) \\
\vdots & \vdots & & \vdots & \vdots \\
x_1^{\mathrm{HMS}} & x_2^{\mathrm{HMS}} & \cdots & x_D^{\mathrm{HMS}} & \text{fitness}(x^{\mathrm{HMS}})
\end{array}
\right]. \tag{1}
\]

The HS method can be explained through the discussion of the player improvisation process. There are three feasible options for a player in the music improvisation process: (1) play pitches from memory (with the rate HMCR), (2) play some pitches like a known piece, or (3) improvise new pitches [1]. These three options can be idealized into three components: use of HM, pitch adjusting, and randomization [1].

Similar to selecting the optimal ones in GA, the first component is important [1]. It guarantees that the optimal harmonies will not be destroyed in the HM. To make HS more powerful, the parameter HMCR should be properly set [1]. Through several experiments, in most cases HMCR = 0.7–0.95.

Table 1: Benchmark functions.

No.   Name                   No.   Name
F01   Beale                  F19   Holzman 2 function
F02   Bohachevsky 1          F20   Levy
F03   Bohachevsky 2          F21   Pathological function
F04   Bohachevsky 3          F22   Penalty 1
F05   Booth                  F23   Penalty 2
F06   Branin                 F24   Powell
F07   Easom                  F25   Quartic with noise
F08   Foxholes               F26   Rastrigin
F09   Freudenstein-Roth      F27   Rosenbrock
F10   Goldstein-Price        F28   Schwefel 2.26
F11   Hump                   F29   Schwefel 1.2
F12   Matyas                 F30   Schwefel 2.22
F13   Ackley                 F31   Schwefel 2.21
F14   Alpine                 F32   Sphere
F15   Brown                  F33   Step
F16   Dixon and Price        F34   Sum function
F17   Fletcher-Powell        F35   Zakharov
F18   Griewank               F36   Wavy1

The pitch in the second component needs to be adjusted slightly, and hence a proper method is used to adjust the frequency [1]. The new pitch x_new is updated by

\[
x_{\text{new}} = x_{\text{old}} + bw \left( 2\varepsilon - 1 \right), \tag{2}
\]

where ε is a random number in [0, 1], x_old is the current pitch, and bw is the bandwidth.

Parameter PAR should also be appropriately set. If PAR is very close to 1, then the solution is always being updated and HS is hard to converge. If it is close to 0, then little change is made and HS may converge prematurely. So here we set PAR = 0.1–0.5 [1].

To improve the diversity, the randomization in the third component is necessary. The usage of randomization allows the method to go a step further into promising areas so as to find the optimal solution [1].

The HS method can be presented as in Algorithm 1, where D is the number of decision variables and rand is a random real number in the interval (0, 1) drawn from a uniform distribution.

3. FA Method

FA [42] is a metaheuristic approach for optimization problems. The search strategy in FA comes from the fireflies' swarm behavior [50]. There are two significant issues in FA: the formulation of attractiveness and the variation of light intensity [42].

For simplicity, several characteristics of fireflies are idealized into the three rules described in [51]. Based on these three rules, the FA can be described as in Algorithm 2.

For two fireflies x_i and x_j, firefly i is updated as follows:

\[
x_i^{t+1} = x_i^{t} + \beta_0 e^{-\gamma r_{ij}^{2}} \left( x_j^{t} - x_i^{t} \right) + \alpha \varepsilon_i^{t}, \tag{3}
\]

where α is the step size and β0 is the attractiveness at r = 0; the second term is the attraction, while the third is randomization


Begin
    Step 1: Initialize the HM.
    Step 2: Evaluate the fitness.
    Step 3: while the halting criterion is not satisfied do
        for d = 1 to D do
            if rand < HMCR then  // memory consideration
                x_new(d) = x_a(d), where a ∈ {1, 2, ..., HMS}
                if rand < PAR then  // pitch adjustment
                    x_new(d) = x_old(d) + bw × (2 × rand − 1)
                end if
            else  // random selection
                x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
            end if
        end for d
        Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective).
        Update the best harmony vector.
    Step 4: end while
    Step 5: Output results.
End

Algorithm 1: HS method.
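As a concrete illustration of the improvisation loop in Algorithm 1, a minimal Python sketch follows. This is not the authors' code: the greedy replacement of the worst harmony matches the pseudocode above, but the parameter values, the bound clamping, and the `sphere` objective (F32 in Table 1) are illustrative assumptions.

```python
import random

def harmony_search(f, lb, ub, D, HMS=10, HMCR=0.9, PAR=0.1, bw=0.01, max_iter=5000):
    """Minimize f over [lb, ub]^D with a basic harmony search (Algorithm 1)."""
    # Harmony memory (HM): HMS randomly generated harmonies and their fitness.
    hm = [[random.uniform(lb, ub) for _ in range(D)] for _ in range(HMS)]
    fit = [f(x) for x in hm]
    for _ in range(max_iter):
        x_new = []
        for d in range(D):
            if random.random() < HMCR:                    # memory consideration
                value = hm[random.randrange(HMS)][d]
                if random.random() < PAR:                 # pitch adjustment, Eq. (2)
                    value += bw * (2 * random.random() - 1)
            else:                                         # random selection
                value = lb + random.random() * (ub - lb)
            x_new.append(min(max(value, lb), ub))         # clamp to the bounds
        # The new harmony replaces the worst one only if it is better.
        worst = max(range(HMS), key=fit.__getitem__)
        f_new = f(x_new)
        if f_new < fit[worst]:
            hm[worst], fit[worst] = x_new, f_new
    best = min(range(HMS), key=fit.__getitem__)
    return hm[best], fit[best]

def sphere(x):  # illustrative objective (F32 in Table 1)
    return sum(v * v for v in x)

best_x, best_f = harmony_search(sphere, -5.0, 5.0, D=5)
```

With HMCR = 0.9, most components are recombined from memory, while the 10% random selection keeps injecting fresh values, which is the exploration property the hybrid in Section 4 relies on.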

Begin
    Step 1: Initialization. Set G = 1; define γ; set the step size α and β0 at r = 0.
    Step 2: Evaluate the light intensity I determined by f(x).
    Step 3: while G < MaxGeneration do
        for i = 1 to NP (all NP fireflies) do
            for j = 1 to NP (all NP fireflies) do
                if (I_j < I_i) then
                    Move firefly i towards j.
                end if
                Update attractiveness.
                Update light intensity.
            end for j
        end for i
        G = G + 1
    Step 4: end while
    Step 5: Output the results.
End

Algorithm 2: Firefly algorithm (FA) method.

[50]. In the present work, we take β0 = 1, α ∈ [0, 1], and γ = 1 [50].
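As an illustration, (3) and Algorithm 2 can be sketched in Python for minimization, where a lower objective value corresponds to a brighter firefly. This is a minimal sketch, not the authors' implementation: β0 = 1 and γ = 1 follow the settings just quoted, while α = 0.2, the population size, the bound clamping, and the `sphere` objective are illustrative assumptions.

```python
import math
import random

def move_firefly(x_i, x_j, alpha=0.2, beta0=1.0, gamma=1.0):
    """Eq. (3): firefly i is attracted towards the brighter firefly j."""
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))      # squared distance r_ij^2
    beta = beta0 * math.exp(-gamma * r2)                  # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(x_i, x_j)]

def firefly_algorithm(f, lb, ub, D, NP=20, max_gen=100):
    """Basic FA (Algorithm 2): every firefly is compared against every other,
    giving the O(NP^2) inner work per generation discussed in Section 4."""
    pop = [[random.uniform(lb, ub) for _ in range(D)] for _ in range(NP)]
    intensity = [f(x) for x in pop]                       # lower f(x) = brighter
    for _ in range(max_gen):
        for i in range(NP):
            for j in range(NP):
                if intensity[j] < intensity[i]:           # j is brighter than i
                    pop[i] = [min(max(v, lb), ub)
                              for v in move_firefly(pop[i], pop[j])]
                    intensity[i] = f(pop[i])              # update light intensity
    best = min(range(NP), key=intensity.__getitem__)
    return pop[best], intensity[best]

def sphere(x):  # illustrative objective
    return sum(v * v for v in x)

best_x, best_f = firefly_algorithm(sphere, -5.0, 5.0, D=5)
```

The double loop over all NP fireflies is exactly the cost that the top fireflies scheme of the next section reduces.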

4. HS/FA

Based on the introduction of HS and FA in the previous sections, the combination of the two approaches is described and HS/FA is proposed, which updates poor solutions to accelerate convergence.

HS and FA are adept at exploring the search space and exploiting solutions, respectively. Therefore, in the present work, a hybrid named HS/FA, formed by inducing HS into the FA method, is utilized to deal with optimization problems; the HS part can be considered a mutation operator. By this strategy, the HS mutation explores new regions of the search space while FA exploits the population, which overcomes the FA's lack of exploration.

To combat the pure random walks used in FA, in the present work, the addition of a mutation operator is introduced into the FA, comprising two detailed improvements.

The first one is the introduction of a top fireflies scheme into FA to reduce running time, which is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, due to the dual loop, the time complexity is O(NP^2), so its performance significantly deteriorates with increases in population size. This improvement is carried out by reduction of the outer loop in FA. In HS/FA, we select the special fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) to form the top fireflies, and all the fireflies only move towards the top fireflies. Through the top fireflies scheme,


Begin
    Step 1: Initialization. Set t = 1; define γ; set α and β0 at r = 0; set HMCR and PAR; set the number of top fireflies KEEP.
    Step 2: Evaluate the light intensity I.
    Step 3: while t < MaxGeneration do
        Sort the fireflies by light intensity I.
        for i = 1 to KEEP (all top fireflies) do
            for j = 1 to NP (all fireflies) do
                if (I_j < I_i) then
                    Move firefly i towards j.
                else  // Mutate
                    for k = 1 to D (all elements) do
                        if (rand < HMCR) then
                            r1 = ⌈NP × rand⌉
                            x_v(k) = x_r1(k)
                            if (rand < PAR) then
                                x_v(k) = x_v(k) + bw × (2 × rand − 1)
                            end if
                        else
                            x_v(k) = x_min,k + rand × (x_max,k − x_min,k)
                        end if
                    end for k
                end if
                Update attractiveness.
                Update light intensity.
            end for j
        end for i
        Evaluate the light intensity I.
        Sort the population by light intensity I.
        t = t + 1
    Step 4: end while
End

Algorithm 3: HS/FA method.

the time complexity of HS/FA decreases from O(NP^2) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so the time used by HS/FA is much less than that used by FA. Apparently, if KEEP = NP, HS/FA declines to the standard FA. If KEEP is too small, only a few of the best fireflies are selected to form the top fireflies, and the algorithm converges too fast; moreover, it may be premature for lack of diversity. If KEEP is extremely big (near NP), almost all the fireflies are used to form the top fireflies, so all fireflies are explored well, potentially leading to optimal solutions, but the algorithm performs badly and converges too slowly. Therefore, we use KEEP = 2 in our study.

The second is the addition of HS serving as a mutation operator, striving to improve the population diversity to avoid premature convergence. In the standard FA, if firefly i is brighter than firefly j, firefly j will move towards firefly i, and then the newly-generated fireflies are evaluated and the light intensity is updated. If not, firefly j does nothing. However, in HS/FA, if firefly i is not brighter than firefly j, firefly j is updated by the mutation operation to improve its light intensity. More concretely, for the global search part of HS/FA, we tune every element x_kj (k = 1, 2, ..., D) in x_j (the position of firefly j) using HS. When ξ1 is not less than HMCR, that is, ξ1 ≥ HMCR, the element x_kj is updated randomly, whereas when ξ1 < HMCR, we update the element x_kj in accordance with x_r1. Under this circumstance, the pitch adjustment operation in HS is applied to update the element x_kj if ξ2 < PAR, to increase population diversity, as shown in (2), where ξ1 and ξ2 are two uniformly distributed random numbers in [0, 1], r1 is an integer in [1, NP], and NP is the population size.

In sum, the detailed presentation of HS/FA is given in Algorithm 3.
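Putting the two improvements together, the core loop of Algorithm 3 can be sketched in Python. This is a minimal sketch, not the authors' implementation: KEEP = 2, HMCR = 0.9, and PAR = 0.1 match the settings used in Section 5, while the bw and α values, the bound clamping, the greedy acceptance of the FA attraction move, and the `sphere` objective are illustrative assumptions (the paper only states greedy acceptance for the new harmony vector).

```python
import math
import random

def hsfa(f, lb, ub, D, NP=20, KEEP=2, HMCR=0.9, PAR=0.1, bw=0.01,
         alpha=0.2, beta0=1.0, gamma=1.0, max_gen=100):
    """HS/FA sketch (Algorithm 3): the outer loop runs over the KEEP
    brightest fireflies only, and a failed brightness test triggers an
    HS mutation instead of leaving the firefly unchanged."""
    pop = [[random.uniform(lb, ub) for _ in range(D)] for _ in range(NP)]
    intensity = [f(x) for x in pop]
    for _ in range(max_gen):
        top = sorted(range(NP), key=intensity.__getitem__)[:KEEP]  # brightest KEEP
        for i in top:                           # O(KEEP * NP) instead of O(NP^2)
            for j in range(NP):
                if intensity[j] < intensity[i]:
                    # FA attraction move, Eq. (3)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    cand = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                            for a, b in zip(pop[i], pop[j])]
                    idx = i
                else:
                    # HS mutation of firefly j: memory consideration,
                    # pitch adjustment, and random selection per element.
                    cand = list(pop[j])
                    for k in range(D):
                        if random.random() < HMCR:
                            cand[k] = pop[random.randrange(NP)][k]
                            if random.random() < PAR:
                                cand[k] += bw * (2 * random.random() - 1)
                        else:
                            cand[k] = lb + random.random() * (ub - lb)
                    idx = j
                cand = [min(max(v, lb), ub) for v in cand]
                f_cand = f(cand)
                if f_cand < intensity[idx]:     # accept only if it improves
                    pop[idx], intensity[idx] = cand, f_cand
    best = min(range(NP), key=intensity.__getitem__)
    return pop[best], intensity[best]

def sphere(x):  # illustrative objective
    return sum(v * v for v in x)

best_x, best_f = hsfa(sphere, -5.0, 5.0, D=5)
```

Note how the outer loop visits only the KEEP top fireflies, which realizes the O(KEEP × NP) complexity argued above.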

5. The Results

The HS/FA method is tested on optimization problems through several simulations conducted on test problems. To make a fair comparison between the different methods, all the experiments were conducted under the same conditions as described in [1].

In this section, HS/FA is compared on optimization problems with nine other methods, which are ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. Here, for HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ =


[Figure 1: Performance comparison for the F26 Rastrigin function. Benchmark function value versus number of generations for ACO, BBO, DE, ES, FA, GA, HS, HS/FA, PSO, and SGA.]

[Figure 2: Performance comparison for the F28 Schwefel 2.26 function.]

1.0, the HMCR = 0.9, and the PAR = 0.1. The parameters used in the other methods can be referred to in [48, 53]. Thirty-six functions are utilized to verify our HS/FA method, as shown in Table 1. More knowledge of all the benchmarks can be found in [54].

Because all the intelligent algorithms always have some randomness, in order to get representative statistical features,

[Figure 3: Performance comparison for the F30 Schwefel 2.22 function.]

[Figure 4: Performance comparison for the F33 Step function.]

we did 500 implementations of each method on each problem. Tables 2 and 3 illustrate the average and best results found by each algorithm, respectively. Note that we have used two different scales to normalize the values in the tables; the detailed process can be found in [54]. The dimension of each function is set to 30.


Table 2: Mean normalized optimization results.

      ACO      BBO      DE       ES       FA       GA       HS       HS/FA    PSO      SGA
F01   1.01     1.01     1.00     1.02     1.08     1.04     1.07     1.00     1.01     1.25
F02   1.43     2.71     1.00     2.55     1.00     1.39     16.66    1.00     3.13     1.22
F03   1.25     1.84     1.00     2.28     1.00     1.17     11.77    1.01     3.50     1.26
F04   3.9E4    1.7E5    49.66    2.8E5    1.00     4.0E4    2.0E6    1.5E3    3.7E5    2.5E5
F05   1.01     1.02     1.00     1.11     1.00     1.01     1.15     1.00     1.05     1.19
F06   1.03     1.02     1.00     1.09     1.00     1.02     1.03     1.00     1.03     3.01
F07   2.40     2.48     2.27     2.35     2.23     1.83     1.88     1.00     1.71     2.99
F08   1.72     1.72     1.72     1.72     1.72     1.72     1.72     1.72     1.72     1.00
F09   1.03     1.01     1.00     2.05     17.29    1.00     6.55     1.00     1.04     1.24
F10   2.40     2.40     2.40     3.06     2.40     2.40     3.09     2.40     2.70     1.00
F11   1.00     1.00     1.00     1.03     1.00     1.00     1.03     1.00     1.02     1.25
F12   1.00     1.00     1.00     1.01     1.00     1.00     1.02     1.00     1.00     1.14
F13   4.32     2.56     3.68     5.56     1.39     4.98     5.70     1.00     4.83     2.63
F14   36.17    7.98     43.16    73.74    11.68    33.13    70.32    1.00     53.62    8.48
F15   570.98   14.02    27.90    1.1E3    141.86   99.02    652.65   1.00     485.92   12.10
F16   1.6E3    75.20    317.08   1.2E4    7.35     942.08   1.1E4    1.00     1.6E3    26.84
F17   21.98    2.35     7.67     21.63    5.30     7.33     19.01    1.00     15.46    2.26
F18   8.49     5.40     14.18    66.70    2.31     28.49    139.02   1.00     52.77    5.69
F19   2.8E3    167.15   544.18   1.9E4    25.71    1.3E3    1.9E4    1.00     2.7E3    39.88
F20   93.79    13.59    68.16    276.60   20.05    92.42    282.65   1.00     173.17   9.33
F21   3.20     2.49     1.74     1.00     3.69     2.65     3.88     1.78     2.55     2.42
F22   1.2E8    9.7E3    2.8E5    5.1E7    6.64     5.8E5    7.8E7    1.00     7.9E6    9.81
F23   2.2E7    2.9E4    3.1E5    1.4E7    7.36     6.7E5    2.2E7    1.00     3.5E6    5.0E3
F24   112.59   8.00     48.08    188.98   1.04     25.76    133.99   1.00     52.91    2.92
F25   1.2E3    103.34   637.38   1.8E4    17.91    1.4E3    1.8E4    1.00     4.1E3    62.77
F26   24.37    4.58     21.06    32.51    7.75     20.84    29.89    1.00     23.03    7.29
F27   37.23    2.38     5.34     49.70    1.00     10.38    34.12    1.04     12.06    2.00
F28   37.76    18.45    73.51    92.92    93.82    31.93    109.20   1.00     112.28   21.21
F29   4.79     2.52     6.72     7.41     1.00     5.40     7.13     1.93     4.81     4.31
F30   42.08    6.33     16.76    63.65    9.36     30.16    53.57    1.00     35.27    8.32
F31   2.98     3.13     3.87     4.56     1.00     3.93     4.78     1.15     3.97     2.79
F32   205.80   13.56    37.41    382.80   1.87     131.27   361.49   1.00     151.62   14.65
F33   40.44    20.26    53.05    312.01   5.14     111.20   471.25   1.00     194.10   15.72
F34   274.21   27.02    46.57    546.26   6.17     138.85   550.25   1.00     188.96   25.75
F35   1.2E5    1.46     3.32     3.57     1.18     3.12     3.34     1.00     3.12     2.64
F36   9.82     5.26     14.12    29.95    10.67    16.90    35.57    1.00     23.95    5.37

The bold data are the best function value among the different methods for the specified function.

From Table 2, on average, HS/FA is well capable of finding the function minimum on twenty-eight of the thirty-six functions. FA performs the second best, on ten of the thirty-six functions. Table 3 shows that HS/FA and FA perform the best on twenty-two and seventeen of the thirty-six functions, respectively. ACO, DE, and GA perform the best on eight benchmarks. From the above tables, we can see that, for low-dimensional functions, both FA and HS/FA perform well, and there is little difference in performance between them.

Further, convergence graphs of the ten methods for the most representative functions are illustrated in Figures 1, 2, 3, and 4, which indicate the optimization process. The values here are the real mean function values from the above experiments.

F26 is a complicated multimodal function with a single global value 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 with the fastest speed. Here FA converges a little faster initially, but it is likely to be trapped in subminima as the function value decreases only slightly.

F28 is also a multimodal problem, and it has only one global value 0. For this problem, HS/FA is superior to the other nine methods and finds the optimal value earliest.

For F30 (Figure 3), the figure illustrates that HS/FA significantly outperforms all the others in the optimization process. At last, HS/FA converges to the best solution, superior to the others. BBO is only inferior to HS/FA and performs the second best for this case.


Table 3: Best normalized optimization results.

      ACO      BBO      DE       ES       FA       GA       HS       HS/FA    PSO      SGA
F01   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F02   1.00     1.71     1.00     1.53     1.00     1.00     1.49     1.00     1.72     1.00
F03   1.00     1.28     1.00     1.12     1.00     1.00     1.28     1.00     1.34     1.13
F04   2.0E14   2.0E14   3.2E10   2.3E15   2.5E8    1.00     1.1E15   3.8E11   1.0E15   4.5E15
F05   1.00     1.00     1.00     1.02     1.00     1.00     1.00     1.00     1.00     1.00
F06   1.01     1.01     1.00     1.01     1.00     1.01     1.00     1.00     1.00     2.51
F07   2.5E6    3.3E6    7.2E5    1.6E6    1.00     1.7E4    2.1E5    2.88     3.7E5    3.3E6
F08   1.99     1.99     1.99     1.99     1.99     1.99     1.99     1.99     1.99     1.00
F09   1.00     1.00     1.00     1.01     1.00     1.00     1.00     1.00     1.00     1.00
F10   2.65     2.65     2.65     2.74     2.65     2.65     2.65     2.65     2.65     1.00
F11   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.03
F12   1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00     1.00
F13   8.34     3.93     7.05     11.02    1.00     8.71     11.56    1.54     9.51     3.98
F14   63.46    11.42    88.82    159.53   12.22    40.65    144.04   1.00     102.48   10.66
F15   218.30   9.54     47.47    591.91   33.50    128.64   792.73   1.00     358.64   9.44
F16   4.7E3    109.24   1.3E3    4.0E4    1.00     315.53   6.1E4    2.31     5.9E3    43.61
F17   42.58    3.24     11.83    41.06    1.20     9.50     43.49    1.00     29.07    3.02
F18   7.99     3.62     13.85    63.26    1.00     14.42    160.01   1.08     37.87    2.32
F19   3.1E3    135.16   893.82   3.9E4    1.56     566.65   5.4E4    1.00     6.1E3    3.67
F20   251.29   32.94    142.30   720.64   7.96     131.48   863.43   1.00     483.70   23.66
F21   3.96     2.89     1.65     1.00     4.60     2.91     4.87     1.90     2.72     2.37
F22   31.83    55.55    1.5E5    1.3E8    8.26     89.30    3.0E8    1.00     5.8E6    15.15
F23   1.00     4.4E3    1.5E6    8.8E7    27.10    3.3E4    1.6E8    4.89     2.8E7    29.22
F24   2.2E3    88.70    1.1E3    4.7E3    1.60     215.01   3.4E3    1.00     940.82   34.50
F25   3.0E3    380.11   2.9E3    1.0E5    1.00     3.1E3    1.3E5    2.01     2.4E4    54.67
F26   39.66    5.65     30.04    58.39    6.03     27.36    42.32    1.00     38.57    8.91
F27   54.77    2.13     12.53    87.36    1.27     10.85    58.14    1.00     21.11    2.77
F28   164.45   67.82    335.75   447.01   430.29   85.82    596.00   1.00     551.44   68.85
F29   8.78     4.00     16.50    15.31    1.00     10.85    18.42    3.27     6.75     7.29
F30   63.53    10.38    27.71    105.79   7.15     47.04    88.91    1.00     58.02    12.83
F31   3.80     5.63     7.23     9.39     1.00     7.31     10.20    1.93     7.56     4.53
F32   740.24   30.59    184.72   1.8E3    1.00     322.80   1.8E3    2.87     725.62   31.00
F33   149.29   66.43    224.57   1.4E3    3.86     255.71   2.4E3    1.00     1.0E3    42.71
F34   491.51   35.62    100.26   1.1E3    1.64     113.66   1.1E3    1.00     400.61   27.55
F35   3.44     2.50     5.89     6.67     1.00     4.28     5.48     1.46     3.83     3.74
F36   11.05    6.01     18.56    40.10    9.07     18.47    43.18    1.00     31.18    4.64

The bold data are the best function value among the different methods for the specified function.

For F33, HS/FA significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, while HS/FA is well capable of improving its solution steadily in the long run. Here FA converges faster initially (within 20 iterations); however, it seems to be trapped in subminima as the function value decreases only slightly (after 20 iterations), and it is outperformed by HS/FA after 30 iterations.

From Figures 1–4, our HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are only inferior to HS/FA. Note that, in [40], BBO was compared with seven EAs and on an engineering problem; those experiments proved the excellent performance of BBO. This also indirectly proves that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combination with the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating fireflies. The new harmony vector takes the place of the new firefly only if it is better than before, which generally outperforms HS


and FA. HS/FA strives to exploit the merits of FA and HS so as to keep all fireflies from being trapped in local optima. Benchmark evaluation on the test problems is used to investigate HS/FA and the other nine approaches. The results demonstrate that HS/FA is able to make use of the useful knowledge more efficiently to find much better values compared with the other optimization algorithms.

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.

[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.

[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.

[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.

[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.

[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.

[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.

[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.

[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.

[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.

[15] A. Kaveh and S. Talatahari, "Optimal design of Schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.

[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.

[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.

[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.

[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.

[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.

[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.

[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.

[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.

[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.

[28] R. Storn and K. Price, "Differential evolution: a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.

[29] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.

[31] G. G. Wang, A. H. Gandomi, A. H. Alavi, and G. S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.

[32] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[33] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.

[34] S Talatahari M Kheirollahi C Farahmandpour and A HGandomi ldquoA multi-stage particle swarm for optimum designof truss structuresrdquo Neural Computing amp Applications vol 23no 5 pp 1297ndash1309 2013

[35] K Y Huang ldquoA hybrid particle swarm optimization approachfor clustering and classification of datasetsrdquo Knowledge-BasedSystems vol 24 no 3 pp 420ndash426 2011

[36] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquoParallel Problem Solving from Nature pp 683ndash691 1998

The Scientific World Journal 9

[37] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo inProceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 CoimbatoreIndia December 2009

[38] A H Gandomi S Talatahari X S Yang and S Deb ldquoDesignoptimization of truss structures using cuckoo search algorithmrdquoThe Structural Design of Tall and Special Buildings vol 22 no17 pp 1330ndash1349 2013

[39] X Cai S Fan and Y Tan ldquoLight responsive curve selection forphotosynthesis operator of APOArdquo International Journal of Bio-Inspired Computation vol 4 no 6 pp 373ndash379 2012

[40] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[41] A H Gandomi X-S Yang and A H Alavi ldquoMixed variablestructural optimization using firefly algorithmrdquo Computers ampStructures vol 89 no 23-24 pp 2325ndash2336 2011

[42] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverFrome UK 2008

[43] X S Yang ldquoFirefly algorithms for multimodal optimizationrdquoin Proceedings of the 5th International Conference on Stochas-tic Algorithms Foundations and Applications pp 169ndash178Springer Sapporo Japan 2009

[44] X S Yang ldquoFirefly algorithm stochastic test functions anddesign optimisationrdquo International Journal of Bio-Inspired Com-putation vol 2 no 2 pp 78ndash84 2010

[45] X-S Yang S S S Hosseini and A H Gandomi ldquoFireflyalgorithm for solving non-convex economic dispatch problemswith valve loading effectrdquo Applied Soft Computing Journal vol12 no 3 pp 1180ndash1186 2012

[46] R Parpinelli and H Lopes ldquoNew inspirations in swarmintelligence a surveyrdquo International Journal of Bio-InspiredComputation vol 3 no 1 pp 1ndash16 2011

[47] D Zou L Gao J Wu and S Li ldquoNovel global harmony searchalgorithm for unconstrained problemsrdquo Neurocomputing vol73 no 16ndash18 pp 3308ndash3318 2010

[48] G Wang L Guo H Wang H Duan L Liu and J Li ldquoIncor-porating mutation scheme into krill herd algorithm for globalnumerical optimizationrdquo Neural Computing and Applications2012

[49] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[50] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

[51] A H Gandomi X S Yang S Talatahari and A H AlavildquoFirefly algorithm with chaosrdquo Communications in NonlinearScience and Numerical Simulation vol 18 no 1 pp 89ndash98 2013

[52] Y Zhang D Huang M Ji and F Xie ldquoImage segmentationusing PSO and PCM with Mahalanobis distancerdquo Expert Sys-tems with Applications vol 38 no 7 pp 9036ndash9040 2011

[53] G G Wang L Guo A H Gandomi A H Alavi and H DuanldquoSimulated annealing-based krill herd algorithm for globaloptimizationrdquo Abstract and Applied Analysis vol 2013 ArticleID 213853 11 pages 2013

[54] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999


The Scientific World Journal 3

Begin
  Step 1: Initialize the HM.
  Step 2: Evaluate the fitness.
  Step 3: while the halting criteria are not satisfied do
    for d = 1 to D do
      if rand < HMCR then  (memory consideration)
        x_new(d) = x_a(d), where a ∈ (1, 2, ..., HMS)
        if rand < PAR then  (pitch adjustment)
          x_new(d) = x_old(d) + bw × (2 × rand − 1)
        end if
      else  (random selection)
        x_new(d) = x_min,d + rand × (x_max,d − x_min,d)
      end if
    end for d
    Update the HM as x_w = x_new if f(x_new) < f(x_w) (minimization objective).
    Update the best harmony vector.
  Step 4: end while
  Step 5: Output results.
End

Algorithm 1: HS method.
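As a concrete illustration, the improvisation step in the listing above can be sketched in Python. This is a minimal sketch, not the paper's code: the harmony memory HM, the bounds x_min/x_max, and all names are illustrative assumptions.

```python
import random

def improvise(HM, x_min, x_max, HMCR=0.9, PAR=0.1, bw=0.01):
    """Build one new harmony from the harmony memory HM (a list of D-dim lists)."""
    D = len(x_min)
    x_new = [0.0] * D
    for d in range(D):
        if random.random() < HMCR:                 # memory consideration
            x_new[d] = random.choice(HM)[d]
            if random.random() < PAR:              # pitch adjustment
                x_new[d] += bw * (2 * random.random() - 1)
        else:                                      # random selection
            x_new[d] = x_min[d] + random.random() * (x_max[d] - x_min[d])
        # keep the element inside the search bounds
        x_new[d] = min(max(x_new[d], x_min[d]), x_max[d])
    return x_new
```

The worst harmony x_w would then be replaced by x_new whenever f(x_new) < f(x_w), exactly as the last update step of the listing states.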

Begin
  Step 1: Initialization. Set G = 1; define γ; set step size α and β0 (the attractiveness at r = 0).
  Step 2: Evaluate the light intensity I determined by f(x).
  Step 3: While G < MaxGeneration do
    for i = 1 to NP (all NP fireflies) do
      for j = 1 to NP (all NP fireflies) do
        if (I_j < I_i) then
          move firefly i towards j
        end if
        Update attractiveness.
        Update light intensity.
      end for j
    end for i
    G = G + 1
  Step 4: end while
  Step 5: Output the results.
End

Algorithm 2: Firefly algorithm (FA) method.

In our present work, we take β0 = 1, α ∈ [0, 1], and γ = 1 [50].
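With these parameter values, the movement step of Algorithm 2 can be sketched as follows. This is a minimal sketch under the standard FA update rule β = β0·exp(−γ·r²); the function and variable names are illustrative assumptions, not the paper's code.

```python
import math
import random

def move_firefly(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move the firefly at xi towards the brighter firefly at xj."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))   # squared distance r^2
    beta = beta0 * math.exp(-gamma * r2)             # attractiveness decays with distance
    # attraction towards xj plus a small random walk scaled by alpha
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]
```

With α = 0 the update is purely deterministic attraction; the random-walk term scaled by α is what the hybrid described next tries to compensate for.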

4. HS/FA

Based on the introduction of HS and FA in the previous section, the combination of the two approaches is now described and HS/FA is proposed, which updates the poor solutions to accelerate its convergence speed.

HS and FA are adept at exploring the search space and exploiting the solutions, respectively. Therefore, in the present work, a hybrid obtained by inducing HS into the FA method, named HS/FA, is utilized to deal with optimization problems; the HS component can be considered a mutation operator. By this strategy, the mutation of HS and the FA can explore the new search space and exploit the population, respectively, so the hybrid can overcome the lack of exploration of the FA.

To combat the random walks used in FA, in the present work a mutation operator is introduced into the FA, comprising two detailed improvements.

The first one is the introduction of a top fireflies scheme into FA to reduce running time, which is analogous to the elitism scheme frequently used in other population-based optimization algorithms. In FA, due to its dual loop, the time complexity is O(NP²), so its performance deteriorates significantly as the population size increases. This improvement is carried out by reducing the outer loop in FA. In HS/FA, we select the special fireflies with optimal or near-optimal fitness (i.e., the brightest fireflies) to form the top fireflies, and all the fireflies only move towards the top fireflies. Through the top fireflies scheme,


Begin
  Step 1: Initialization. Set t = 1; define γ; set α and β0 at r = 0; set HMCR and PAR; set the number of top fireflies KEEP.
  Step 2: Evaluate the light intensity I.
  Step 3: While t < MaxGeneration do
    Sort the fireflies by light intensity I.
    for i = 1 to KEEP (all top fireflies) do
      for j = 1 to NP (all fireflies) do
        if (I_j < I_i) then
          Move firefly i towards j.
        else  (mutate)
          for k = 1 to D (all elements) do
            if (rand < HMCR) then
              r1 = ⌈NP × rand⌉
              x_v(k) = x_r1(k)
              if (rand < PAR) then
                x_v(k) = x_v(k) + bw × (2 × rand − 1)
              end if
            else
              x_v(k) = x_min,k + rand × (x_max,k − x_min,k)
            end if
          end for k
        end if
        Update attractiveness.
        Update light intensity.
      end for j
    end for i
    Evaluate the light intensity I.
    Sort the population by light intensity I.
    t = t + 1
  Step 4: end while
End

Algorithm 3: HS/FA method.

the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so the time used by HS/FA is much less than that of FA. Apparently, if KEEP = NP, HS/FA declines to the standard FA. If KEEP is too small, only a few of the best fireflies are selected to form the top fireflies, and the algorithm converges too fast; moreover, it may become premature for lack of diversity. If KEEP is extremely big (near NP), almost all the fireflies are used to form the top fireflies, so all fireflies are explored well, leading to potentially optimal solutions, but the algorithm performs badly and converges too slowly. Therefore, we use KEEP = 2 in our study.
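The saving from the top fireflies scheme can be checked with a quick count of inner-loop iterations; the values of NP and KEEP below are illustrative, not prescribed by the paper.

```python
NP, KEEP = 50, 2

# standard FA: every firefly is compared with every other firefly, O(NP^2)
fa_iterations = sum(1 for i in range(NP) for j in range(NP))

# HS/FA: only the KEEP top fireflies drive the outer loop, O(KEEP * NP)
hsfa_iterations = sum(1 for i in range(KEEP) for j in range(NP))

print(fa_iterations, hsfa_iterations)  # 2500 versus 100
```

With KEEP = 2 and NP = 50 the inner loop runs 25 times less often per generation, which is where the reported running-time reduction comes from.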

The second is the addition of HS serving as a mutation operator, striving to improve the population diversity so as to avoid premature convergence. In the standard FA, if firefly i is brighter than firefly j, firefly j will move towards firefly i, and then the newly-generated fireflies are evaluated and the light intensity is updated; if not, firefly j does nothing. However, in HS/FA, if firefly i is not brighter than firefly j, firefly j is updated by the mutation operation to improve its light intensity. More concretely, for the global search part of HS/FA, we tune every element x_kj (k = 1, 2, ..., D) in x_j (the position of firefly j) using HS. When ξ1 is not less than HMCR, that is, ξ1 ≥ HMCR, the element x_kj is updated randomly, whereas when ξ1 < HMCR, we update the element x_kj in accordance with x_r1. Under this circumstance, the pitch adjustment operation in HS is applied to update the element x_kj if ξ2 < PAR, so as to increase population diversity, as shown in (2), where ξ1 and ξ2 are two uniformly distributed random numbers in [0, 1], r1 is an integer number in [1, NP], and NP is the population size.

In sum, the detailed presentation of HS/FA is given in Algorithm 3.
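The per-element mutation described above can be sketched as follows. This is a minimal sketch: ξ1 and ξ2 are drawn fresh for every element, the ⌈NP × rand⌉ index is mapped to 0-based Python indexing, and all names are illustrative assumptions rather than the paper's code.

```python
import math
import random

def hs_mutate(x_j, population, x_min, x_max, HMCR=0.9, PAR=0.1, bw=0.01):
    """Mutate firefly position x_j element-by-element using the HS operators."""
    NP, D = len(population), len(x_j)
    x_v = list(x_j)
    for k in range(D):
        xi1, xi2 = random.random(), random.random()
        if xi1 < HMCR:
            # take element k from a randomly chosen firefly r1 in [1, NP]
            r1 = math.ceil(NP * random.random()) - 1   # 0-based index
            x_v[k] = population[r1][k]
            if xi2 < PAR:                              # pitch adjustment
                x_v[k] += bw * (2 * random.random() - 1)
        else:
            # random re-initialisation inside the bounds
            x_v[k] = x_min[k] + random.random() * (x_max[k] - x_min[k])
    return x_v
```

This is exactly the "else (mutate)" branch of Algorithm 3: it fires only when firefly i is not brighter than firefly j, injecting HS-style diversity instead of leaving firefly j unchanged.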

5. The Results

The HS/FA method is tested through several simulations conducted on benchmark test problems. To make a fair comparison between different methods, all the experiments were conducted under the same conditions as described in [1].

In this section, HS/FA is compared on optimization problems with nine other methods, which are ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. Here, for HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ =


Figure 1: Performance comparison for the F26 Rastrigin function. (Plot of benchmark function value versus number of generations for ACO, BBO, DE, ES, FA, GA, HS, HS/FA, PSO, and SGA.)

Figure 2: Performance comparison for the F28 Schwefel 2.26 function. (Plot of benchmark function value versus number of generations for ACO, BBO, DE, ES, FA, GA, HS, HS/FA, PSO, and SGA.)

1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods are as in [48, 53]. Thirty-six functions are utilized to verify our HS/FA method, as shown in Table 1; more details of all the benchmarks can be found in [54].

Because all these intelligent algorithms have some randomness, in order to obtain representative statistical features,

Figure 3: Performance comparison for the F30 Schwefel 2.22 function. (Plot of benchmark function value versus number of generations for ACO, BBO, DE, ES, FA, GA, HS, HS/FA, PSO, and SGA.)

Figure 4: Performance comparison for the F33 step function. (Plot of benchmark function value, on a logarithmic scale, versus number of generations for ACO, BBO, DE, ES, FA, GA, HS, HS/FA, PSO, and SGA.)

we carried out 500 runs of each method on each problem. Tables 2 and 3 report the average and best results found by each algorithm, respectively. Note that two different scales were used to normalize the values in the tables; the detailed process can be found in [54]. The dimension of each function is set to 30.


Table 2: Mean normalized optimization results.

      ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.01    1.01    1.00    1.02    1.08    1.04    1.07    1.00    1.01    1.25
F02   1.43    2.71    1.00    2.55    1.00    1.39    16.66   1.00    3.13    1.22
F03   1.25    1.84    1.00    2.28    1.00    1.17    11.77   1.01    3.50    1.26
F04   3.9E4   1.7E5   49.66   2.8E5   1.00    4.0E4   2.0E6   1.5E3   3.7E5   2.5E5
F05   1.01    1.02    1.00    1.11    1.00    1.01    1.15    1.00    1.05    1.19
F06   1.03    1.02    1.00    1.09    1.00    1.02    1.03    1.00    1.03    3.01
F07   2.40    2.48    2.27    2.35    2.23    1.83    1.88    1.00    1.71    2.99
F08   1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.00
F09   1.03    1.01    1.00    2.05    17.29   1.00    6.55    1.00    1.04    1.24
F10   2.40    2.40    2.40    3.06    2.40    2.40    3.09    2.40    2.70    1.00
F11   1.00    1.00    1.00    1.03    1.00    1.00    1.03    1.00    1.02    1.25
F12   1.00    1.00    1.00    1.01    1.00    1.00    1.02    1.00    1.00    1.14
F13   4.32    2.56    3.68    5.56    1.39    4.98    5.70    1.00    4.83    2.63
F14   36.17   7.98    43.16   73.74   11.68   33.13   70.32   1.00    53.62   8.48
F15   570.98  14.02   27.90   1.1E3   141.86  99.02   652.65  1.00    485.92  12.10
F16   1.6E3   75.20   317.08  1.2E4   7.35    942.08  1.1E4   1.00    1.6E3   26.84
F17   21.98   2.35    7.67    21.63   5.30    7.33    19.01   1.00    15.46   2.26
F18   8.49    5.40    14.18   66.70   2.31    28.49   139.02  1.00    52.77   5.69
F19   2.8E3   167.15  544.18  1.9E4   25.71   1.3E3   1.9E4   1.00    2.7E3   39.88
F20   93.79   13.59   68.16   276.60  20.05   92.42   282.65  1.00    173.17  9.33
F21   3.20    2.49    1.74    1.00    3.69    2.65    3.88    1.78    2.55    2.42
F22   1.2E8   9.7E3   2.8E5   5.1E7   6.64    5.8E5   7.8E7   1.00    7.9E6   9.81
F23   2.2E7   2.9E4   3.1E5   1.4E7   7.36    6.7E5   2.2E7   1.00    3.5E6   5.0E3
F24   112.59  8.00    48.08   188.98  1.04    25.76   133.99  1.00    52.91   2.92
F25   1.2E3   103.34  637.38  1.8E4   17.91   1.4E3   1.8E4   1.00    4.1E3   62.77
F26   24.37   4.58    21.06   32.51   7.75    20.84   29.89   1.00    23.03   7.29
F27   37.23   2.38    5.34    49.70   1.00    10.38   34.12   1.04    12.06   2.00
F28   37.76   18.45   73.51   92.92   93.82   31.93   109.20  1.00    112.28  21.21
F29   4.79    2.52    6.72    7.41    1.00    5.40    7.13    1.93    4.81    4.31
F30   42.08   6.33    16.76   63.65   9.36    30.16   53.57   1.00    35.27   8.32
F31   2.98    3.13    3.87    4.56    1.00    3.93    4.78    1.15    3.97    2.79
F32   205.80  13.56   37.41   382.80  1.87    131.27  361.49  1.00    151.62  14.65
F33   40.44   20.26   53.05   312.01  5.14    111.20  471.25  1.00    194.10  15.72
F34   274.21  27.02   46.57   546.26  6.17    138.85  550.25  1.00    188.96  25.75
F35   1.2E5   1.46    3.32    3.57    1.18    3.12    3.34    1.00    3.12    2.64
F36   9.82    5.26    14.12   29.95   10.67   16.90   35.57   1.00    23.95   5.37
The bold data are the best function value among different methods for the specified function.

From Table 2, on average, HS/FA is well capable of finding the function minimum on twenty-eight of the thirty-six functions. FA performs the second best, on ten of the thirty-six functions. Table 3 shows that HS/FA and FA perform the best on twenty-two and seventeen of the thirty-six functions, respectively. ACO, DE, and GA perform the best on eight benchmarks. From the above tables, we can see that, for low-dimensional functions, both FA and HS/FA perform well, and there is little difference in performance between them.

Furthermore, convergence graphs of the ten methods for the most representative functions are illustrated in Figures 1, 2, 3, and 4, which indicate the optimization process. The values shown are the true mean function values from the above experiments.

F26 is a complicated multimodal function, and it has a single global value 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 with the fastest speed. Here, FA converges a little faster initially, but it is likely to be trapped in subminima as the function value decreases only slightly.

F28 is also a multimodal problem, and it has only a global value 0. For this problem, HS/FA is superior to the other nine methods and finds the optimal value earliest.

For this function, the figure illustrates that HS/FA significantly outperforms all the others in the optimization process. At last, HS/FA converges to the best solution, superior to the others. BBO is only inferior to HS/FA and performs the second best for this case.


Table 3: Best normalized optimization results.

      ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02   1.00    1.71    1.00    1.53    1.00    1.00    1.49    1.00    1.72    1.00
F03   1.00    1.28    1.00    1.12    1.00    1.00    1.28    1.00    1.34    1.13
F04   2.0E14  2.0E14  3.2E10  2.3E15  2.5E8   1.00    1.1E15  3.8E11  1.0E15  4.5E15
F05   1.00    1.00    1.00    1.02    1.00    1.00    1.00    1.00    1.00    1.00
F06   1.01    1.01    1.00    1.01    1.00    1.01    1.00    1.00    1.00    2.51
F07   2.5E6   3.3E6   7.2E5   1.6E6   1.00    1.7E4   2.1E5   2.88    3.7E5   3.3E6
F08   1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.00
F09   1.00    1.00    1.00    1.01    1.00    1.00    1.00    1.00    1.00    1.00
F10   2.65    2.65    2.65    2.74    2.65    2.65    2.65    2.65    2.65    1.00
F11   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.03
F12   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F13   8.34    3.93    7.05    11.02   1.00    8.71    11.56   1.54    9.51    3.98
F14   63.46   11.42   88.82   159.53  12.22   40.65   144.04  1.00    102.48  10.66
F15   218.30  9.54    47.47   591.91  33.50   128.64  792.73  1.00    358.64  9.44
F16   4.7E3   109.24  1.3E3   4.0E4   1.00    315.53  6.1E4   2.31    5.9E3   43.61
F17   42.58   3.24    11.83   41.06   1.20    9.50    43.49   1.00    29.07   3.02
F18   7.99    3.62    13.85   63.26   1.00    14.42   160.01  1.08    37.87   2.32
F19   3.1E3   135.16  893.82  3.9E4   1.56    566.65  5.4E4   1.00    6.1E3   3.67
F20   251.29  32.94   142.30  720.64  7.96    131.48  863.43  1.00    483.70  23.66
F21   3.96    2.89    1.65    1.00    4.60    2.91    4.87    1.90    2.72    2.37
F22   31.83   55.55   1.5E5   1.3E8   8.26    89.30   3.0E8   1.00    5.8E6   15.15
F23   1.00    4.4E3   1.5E6   8.8E7   27.10   3.3E4   1.6E8   4.89    2.8E7   29.22
F24   2.2E3   88.70   1.1E3   4.7E3   1.60    215.01  3.4E3   1.00    940.82  34.50
F25   3.0E3   380.11  2.9E3   1.0E5   1.00    3.1E3   1.3E5   2.01    2.4E4   54.67
F26   39.66   5.65    30.04   58.39   6.03    27.36   42.32   1.00    38.57   8.91
F27   54.77   2.13    12.53   87.36   1.27    10.85   58.14   1.00    21.11   2.77
F28   164.45  67.82   335.75  447.01  430.29  85.82   596.00  1.00    551.44  68.85
F29   8.78    4.00    16.50   15.31   1.00    10.85   18.42   3.27    6.75    7.29
F30   63.53   10.38   27.71   105.79  7.15    47.04   88.91   1.00    58.02   12.83
F31   3.80    5.63    7.23    9.39    1.00    7.31    10.20   1.93    7.56    4.53
F32   740.24  30.59   184.72  1.8E3   1.00    322.80  1.8E3   2.87    725.62  31.00
F33   149.29  66.43   224.57  1.4E3   3.86    255.71  2.4E3   1.00    1.0E3   42.71
F34   491.51  35.62   100.26  1.1E3   1.64    113.66  1.1E3   1.00    400.61  27.55
F35   3.44    2.50    5.89    6.67    1.00    4.28    5.48    1.46    3.83    3.74
F36   11.05   6.01    18.56   40.10   9.07    18.47   43.18   1.00    31.18   4.64
The bold data are the best function value among different methods for the specified function.

HS/FA significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, while HS/FA is well capable of improving its solution steadily in the long run. Here, FA converges faster initially (within 20 iterations); however, it seems to be trapped in subminima as the function value decreases only slightly (after 20 iterations), and it is outperformed by HS/FA after 30 iterations.

From Figures 1–4, the performance of our HS/FA is far better than that of the others. In general, BBO and FA, especially FA, are only inferior to HS/FA. Note that in [40] BBO was compared with seven EAs and applied to an engineering problem, and those experiments proved the excellent performance of BBO. This also indirectly proves that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combination with the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating them. The new harmony vector takes the place of the new firefly only if it is better than before, so the hybrid generally outperforms HS and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems was used to compare HS/FA with the other nine approaches. The results demonstrated that HS/FA is able to make use of useful knowledge more efficiently and to find much better values than the other optimization algorithms.

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.

[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.

[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.

[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.

[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.

[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.

[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, UK, 2004.

[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.

[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.

[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.

[15] A. Kaveh and S. Talatahari, "Optimal design of schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.

[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.

[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.

[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.

[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.

[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.

[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.

[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.

[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.

[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.

[28] R. Storn and K. Price, "Differential evolution – a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.

[29] R. Storn and K. Price, "Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.

[31] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and G.-S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.

[32] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[33] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.

[34] S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, "A multi-stage particle swarm for optimum design of truss structures," Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.

[35] K. Y. Huang, "A hybrid particle swarm optimization approach for clustering and classification of datasets," Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.

[36] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.


[37] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.

[38] A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.

[39] X. Cai, S. Fan, and Y. Tan, "Light responsive curve selection for photosynthesis operator of APOA," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.

[40] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.

[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.

[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.

[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.

[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.

[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.

[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.

[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.

[53] G.-G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.

[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.


4 The Scientific World Journal

Begin
  Step 1: Initialization. Set t = 1; define γ; set α and the attractiveness β0 at r = 0; set HMCR and PAR; set the number of top fireflies, KEEP.
  Step 2: Evaluate the light intensity I.
  Step 3: While t < MaxGeneration do
    Sort the fireflies by light intensity I.
    for i = 1 to KEEP (all top fireflies) do
      for j = 1 to NP (all fireflies) do
        if (I_j < I_i) then
          Move firefly j towards i.
        else
          for k = 1 to D (all elements) do  // Mutate
            if (rand < HMCR) then
              r1 = ceil(NP * rand)
              x_v(k) = x_r1(k)
              if (rand < PAR) then
                x_v(k) = x_v(k) + bw * (2 * rand - 1)
              end if
            else
              x_v(k) = x_min,k + rand * (x_max,k - x_min,k)
            end if
          end for k
        end if
        Update attractiveness.
        Update light intensity.
      end for j
    end for i
    Evaluate the light intensity I.
    Sort the population by light intensity I.
    t = t + 1
  Step 4: end while
End

Algorithm 3: HS/FA method.

the time complexity of HS/FA decreases from O(NP²) to O(KEEP × NP), where KEEP is the number of top fireflies. In general, KEEP is far smaller than NP, so the time used by HS/FA is much less than that used by FA. Clearly, if KEEP = NP, HS/FA reduces to the standard FA. If KEEP is too small, only a few of the best fireflies are selected to form the top fireflies; the algorithm then converges too fast and may become premature for lack of diversity. If KEEP is very large (near NP), almost all the fireflies are used to form the top fireflies, so all fireflies are explored well, potentially leading to optimal solutions, but the algorithm performs badly and converges too slowly. Therefore, we use KEEP = 2 in our study.
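The effect of KEEP on cost can be made concrete with a small counting sketch (illustrative only; `keep` and `pop` stand for KEEP and NP, and the function merely counts the inner-loop iterations of Algorithm 3):

```python
def comparisons_per_generation(keep: int, pop: int) -> int:
    """Count the firefly comparisons one generation of Algorithm 3 performs:
    the outer loop visits only the KEEP brightest fireflies, and each of
    them is compared against the whole population of NP fireflies."""
    count = 0
    for _ in range(keep):      # top fireflies only
        for _ in range(pop):   # all fireflies
            count += 1
    return count

# With KEEP = 2 and NP = 50, one generation needs 100 comparisons,
# versus the 2500 a full NP-by-NP sweep (KEEP = NP) would require.
print(comparisons_per_generation(2, 50), comparisons_per_generation(50, 50))
```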

The second is the addition of HS serving as a mutation operator, striving to improve the population diversity and avoid premature convergence. In the standard FA, if firefly i is brighter than firefly j, firefly j moves towards firefly i, the newly generated fireflies are evaluated, and the light intensities are updated; otherwise, firefly j does nothing. In HS/FA, however, if firefly i is not brighter than firefly j, firefly j is updated by the mutation operation to improve its light intensity. More concretely, for the global search part of HS/FA, we tune every element x_j(k) (k = 1, 2, ..., D) of x_j (the position of firefly j) using HS. When ξ1 is not less than HMCR, that is, ξ1 ≥ HMCR, the element x_j(k) is updated randomly, whereas when ξ1 < HMCR, we update the element x_j(k) in accordance with x_r1. In this case, the pitch adjustment operation in HS is applied to update the element x_j(k) if ξ2 < PAR, to increase population diversity, as shown in (2), where ξ1 and ξ2 are two uniformly distributed random numbers in [0, 1], r1 is an integer in [1, NP], and NP is the population size.

In sum, the detailed presentation of HS/FA is given in Algorithm 3.
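This HS-based mutation step can be sketched as follows (an illustration under stated assumptions, not the authors' exact code: `bw`, HMCR, and PAR follow the pseudocode, while the search bounds, the toy population, and the seeded `rng` are assumptions made for the example):

```python
import random

def hs_mutate(population, hmcr=0.9, par=0.1, bw=0.01,
              x_min=-5.12, x_max=5.12, rng=None):
    """Generate a new harmony vector x_v element by element:
    with probability HMCR, copy element k from a randomly chosen firefly
    x_r1 and pitch-adjust it with probability PAR; otherwise, redraw the
    element uniformly from [x_min, x_max]."""
    rng = rng or random.Random(0)
    d = len(population[0])
    x_v = [0.0] * d
    for k in range(d):
        if rng.random() < hmcr:                  # harmony memory consideration
            r1 = rng.randrange(len(population))
            x_v[k] = population[r1][k]
            if rng.random() < par:               # pitch adjustment
                x_v[k] += bw * (2.0 * rng.random() - 1.0)
        else:                                    # random re-initialization
            x_v[k] = x_min + rng.random() * (x_max - x_min)
    return x_v

pop = [[0.5 * s] * 5 for s in range(4)]          # toy population of 4 fireflies
child = hs_mutate(pop)
```

In HS/FA, the resulting vector would then replace firefly j only if it improves the light intensity.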

5. Results

The HS/FA method is tested through several simulations conducted on a set of test problems. To make a fair comparison between the different methods, all the experiments were conducted under the same conditions as described in [1].

In this section, HS/FA is compared on the optimization problems with nine other methods: ACO [11], BBO [40], DE [28–30], ES [9, 10], FA [41, 42], GA [8], HS [17–19], PSO [32, 52], and SGA [36]. For HS, FA, and HS/FA, the parameters are set as follows: absorption coefficient γ = 1.0, HMCR = 0.9, and PAR = 0.1. The parameters used in the other methods are the same as in [48, 53]. Thirty-six benchmark functions, listed in Table 1, are used to verify our HS/FA method; more details of all the benchmarks can be found in [54].

Figure 1: Performance comparison for the F26 Rastrigin function.

Figure 2: Performance comparison for the F28 Schwefel 2.26 function.

Because all these intelligent algorithms involve some randomness, we ran each method 500 times on each problem in order to obtain representative statistics. Tables 2 and 3 list the average and best results found by each algorithm, respectively. Note that two different scales were used to normalize the values in the tables; the detailed procedure can be found in [54]. The dimension of each function is set to 30.

Figure 3: Performance comparison for the F30 Schwefel 2.22 function.

Figure 4: Performance comparison for the F33 step function.
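The exact normalization follows [54]; the basic idea, shown here as an assumed illustration (not the paper's precise two-scale procedure), is to divide each benchmark's raw results by the smallest entry so that the best-performing method reads 1.00:

```python
def normalize_row(raw):
    """Scale one benchmark's raw results (assumed positive) so the
    best-performing method gets the value 1.00."""
    best = min(raw)
    return [round(v / best, 2) for v in raw]

# Toy raw mean errors for four methods on one benchmark:
print(normalize_row([2.0, 4.0, 10.0, 2.0]))  # two methods tie at 1.0
```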

Table 2: Mean normalized optimization results.

      ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.01    1.01    1.00    1.02    1.08    1.04    1.07    1.00    1.01    1.25
F02   1.43    2.71    1.00    2.55    1.00    1.39    16.66   1.00    3.13    1.22
F03   1.25    1.84    1.00    2.28    1.00    1.17    11.77   1.01    3.50    1.26
F04   3.9E4   1.7E5   49.66   2.8E5   1.00    4.0E4   2.0E6   1.5E3   3.7E5   2.5E5
F05   1.01    1.02    1.00    1.11    1.00    1.01    1.15    1.00    1.05    1.19
F06   1.03    1.02    1.00    1.09    1.00    1.02    1.03    1.00    1.03    3.01
F07   2.40    2.48    2.27    2.35    2.23    1.83    1.88    1.00    1.71    2.99
F08   1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.72    1.00
F09   1.03    1.01    1.00    2.05    17.29   1.00    6.55    1.00    1.04    1.24
F10   2.40    2.40    2.40    3.06    2.40    2.40    3.09    2.40    2.70    1.00
F11   1.00    1.00    1.00    1.03    1.00    1.00    1.03    1.00    1.02    1.25
F12   1.00    1.00    1.00    1.01    1.00    1.00    1.02    1.00    1.00    1.14
F13   4.32    2.56    3.68    5.56    1.39    4.98    5.70    1.00    4.83    2.63
F14   36.17   7.98    43.16   73.74   11.68   33.13   70.32   1.00    53.62   8.48
F15   570.98  14.02   27.90   1.1E3   141.86  99.02   652.65  1.00    485.92  12.10
F16   1.6E3   75.20   317.08  1.2E4   7.35    942.08  1.1E4   1.00    1.6E3   26.84
F17   21.98   2.35    7.67    21.63   5.30    7.33    19.01   1.00    15.46   2.26
F18   8.49    5.40    14.18   66.70   2.31    28.49   139.02  1.00    52.77   5.69
F19   2.8E3   167.15  544.18  1.9E4   25.71   1.3E3   1.9E4   1.00    2.7E3   39.88
F20   93.79   13.59   68.16   276.60  20.05   92.42   282.65  1.00    173.17  9.33
F21   3.20    2.49    1.74    1.00    3.69    2.65    3.88    1.78    2.55    2.42
F22   1.2E8   9.7E3   2.8E5   5.1E7   6.64    5.8E5   7.8E7   1.00    7.9E6   9.81
F23   2.2E7   2.9E4   3.1E5   1.4E7   7.36    6.7E5   2.2E7   1.00    3.5E6   5.0E3
F24   112.59  8.00    48.08   188.98  1.04    25.76   133.99  1.00    52.91   2.92
F25   1.2E3   103.34  637.38  1.8E4   17.91   1.4E3   1.8E4   1.00    4.1E3   62.77
F26   24.37   4.58    21.06   32.51   7.75    20.84   29.89   1.00    23.03   7.29
F27   37.23   2.38    5.34    49.70   1.00    10.38   34.12   1.04    12.06   2.00
F28   37.76   18.45   73.51   92.92   93.82   31.93   109.20  1.00    112.28  21.21
F29   4.79    2.52    6.72    7.41    1.00    5.40    7.13    1.93    4.81    4.31
F30   42.08   6.33    16.76   63.65   9.36    30.16   53.57   1.00    35.27   8.32
F31   2.98    3.13    3.87    4.56    1.00    3.93    4.78    1.15    3.97    2.79
F32   205.80  13.56   37.41   382.80  1.87    131.27  361.49  1.00    151.62  14.65
F33   40.44   20.26   53.05   312.01  5.14    111.20  471.25  1.00    194.10  15.72
F34   274.21  27.02   46.57   546.26  6.17    138.85  550.25  1.00    188.96  25.75
F35   1.2E5   1.46    3.32    3.57    1.18    3.12    3.34    1.00    3.12    2.64
F36   9.82    5.26    14.12   29.95   10.67   16.90   35.57   1.00    23.95   5.37

The bold data in the original table are the best function values among the different methods for the specified function.

From Table 2, on average HS/FA finds the function minimum on twenty-eight of the thirty-six functions, while FA performs second best, doing so on ten of the thirty-six functions. Table 3 shows that HS/FA and FA achieve the best results on twenty-two and seventeen of the thirty-six functions, respectively; ACO, DE, and GA perform the best on eight benchmarks. From these tables we can see that, for low-dimensional functions, both FA and HS/FA perform well, with little difference between their performance.

Furthermore, convergence graphs of the ten methods for the most representative functions, which illustrate the optimization process, are shown in Figures 1, 2, 3, and 4. The values plotted are the actual mean function values from the above experiments.

F26 is a complicated multimodal function with a single global minimum of 0 and several local optima. Figure 1 shows that HS/FA converges to the global value 0 with the fastest speed. Here, FA converges slightly faster initially, but it is then likely to be trapped in local minima, as its function value decreases only slightly.
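For reference, the (unshifted) Rastrigin function can be written as a short sketch; its single global minimum f(0) = 0 sits among a regular grid of local optima:

```python
import math

def rastrigin(x):
    """Rastrigin function: f(x) = 10*D + sum_i (x_i^2 - 10*cos(2*pi*x_i))."""
    return 10.0 * len(x) + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi)
                               for xi in x)

print(rastrigin([0.0] * 30))   # global minimum at the origin: 0.0
print(rastrigin([1.0] * 30))   # a nearby local basin, well above 0
```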

F28 is also a multimodal problem, with only one global minimum of 0. For this problem, HS/FA is superior to the other nine methods and finds the optimal value earliest.

For F30 (Figure 3), HS/FA significantly outperforms all the others during the optimization process and finally converges to the best solution. BBO is inferior only to HS/FA and performs second best for this case.


Table 3: Best normalized optimization results.

      ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02   1.00    1.71    1.00    1.53    1.00    1.00    1.49    1.00    1.72    1.00
F03   1.00    1.28    1.00    1.12    1.00    1.00    1.28    1.00    1.34    1.13
F04   2.0E14  2.0E14  3.2E10  2.3E15  2.5E8   1.00    1.1E15  3.8E11  1.0E15  4.5E15
F05   1.00    1.00    1.00    1.02    1.00    1.00    1.00    1.00    1.00    1.00
F06   1.01    1.01    1.00    1.01    1.00    1.01    1.00    1.00    1.00    2.51
F07   2.5E6   3.3E6   7.2E5   1.6E6   1.00    1.7E4   2.1E5   2.88    3.7E5   3.3E6
F08   1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.00
F09   1.00    1.00    1.00    1.01    1.00    1.00    1.00    1.00    1.00    1.00
F10   2.65    2.65    2.65    2.74    2.65    2.65    2.65    2.65    2.65    1.00
F11   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.03
F12   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F13   8.34    3.93    7.05    11.02   1.00    8.71    11.56   1.54    9.51    3.98
F14   63.46   11.42   88.82   159.53  12.22   40.65   144.04  1.00    102.48  10.66
F15   218.30  9.54    47.47   591.91  33.50   128.64  792.73  1.00    358.64  9.44
F16   4.7E3   109.24  1.3E3   4.0E4   1.00    315.53  6.1E4   2.31    5.9E3   43.61
F17   42.58   3.24    11.83   41.06   1.20    9.50    43.49   1.00    29.07   3.02
F18   7.99    3.62    13.85   63.26   1.00    14.42   160.01  1.08    37.87   2.32
F19   3.1E3   135.16  893.82  3.9E4   1.56    566.65  5.4E4   1.00    6.1E3   3.67
F20   251.29  32.94   142.30  720.64  7.96    131.48  863.43  1.00    483.70  23.66
F21   3.96    2.89    1.65    1.00    4.60    2.91    4.87    1.90    2.72    2.37
F22   31.83   55.55   1.5E5   1.3E8   8.26    89.30   3.0E8   1.00    5.8E6   15.15
F23   1.00    4.4E3   1.5E6   8.8E7   27.10   3.3E4   1.6E8   4.89    2.8E7   29.22
F24   2.2E3   88.70   1.1E3   4.7E3   1.60    215.01  3.4E3   1.00    940.82  34.50
F25   3.0E3   380.11  2.9E3   1.0E5   1.00    3.1E3   1.3E5   2.01    2.4E4   54.67
F26   39.66   5.65    30.04   58.39   6.03    27.36   42.32   1.00    38.57   8.91
F27   54.77   2.13    12.53   87.36   1.27    10.85   58.14   1.00    21.11   2.77
F28   164.45  67.82   335.75  447.01  430.29  85.82   596.00  1.00    551.44  68.85
F29   8.78    4.00    16.50   15.31   1.00    10.85   18.42   3.27    6.75    7.29
F30   63.53   10.38   27.71   105.79  7.15    47.04   88.91   1.00    58.02   12.83
F31   3.80    5.63    7.23    9.39    1.00    7.31    10.20   1.93    7.56    4.53
F32   740.24  30.59   184.72  1.8E3   1.00    322.80  1.8E3   2.87    725.62  31.00
F33   149.29  66.43   224.57  1.4E3   3.86    255.71  2.4E3   1.00    1.0E3   42.71
F34   491.51  35.62   100.26  1.1E3   1.64    113.66  1.1E3   1.00    400.61  27.55
F35   3.44    2.50    5.89    6.67    1.00    4.28    5.48    1.46    3.83    3.74
F36   11.05   6.01    18.56   40.10   9.07    18.47   43.18   1.00    31.18   4.64

The bold data in the original table are the best function values among the different methods for the specified function.

For F33, HS/FA again significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization, FA converges faster than HS/FA, while HS/FA steadily improves its solution in the long run. FA converges faster initially (within 20 iterations); however, it then appears to be trapped in local minima, as its function value decreases only slightly after 20 iterations, and it is overtaken by HS/FA after 30 iterations.

From Figures 1–4, our HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that in [40] BBO was compared with seven EAs and tested on an engineering problem, and those experiments proved the excellent performance of BBO; this also indirectly shows that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems: FA is enhanced by combining it with the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating them. The new harmony vector takes the place of the new firefly only if it is better than before, which generally outperforms HS and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems is used to compare HS/FA with the other nine approaches. The results demonstrate that HS/FA is able to use the available knowledge more efficiently and find much better values than the other optimization algorithms.

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.

[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.

[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.

[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.

[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.

[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.

[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.

[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.

[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.

[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.

[15] A. Kaveh and S. Talatahari, "Optimal design of schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.

[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.

[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.

[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.

[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.

[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.

[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.

[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.

[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.

[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.

[28] R. Storn and K. Price, "Differential evolution—a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.

[29] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.

[31] G. G. Wang, A. H. Gandomi, A. H. Alavi, and G. S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.

[32] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[33] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.

[34] S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, "A multi-stage particle swarm for optimum design of truss structures," Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.

[35] K. Y. Huang, "A hybrid particle swarm optimization approach for clustering and classification of datasets," Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.

[36] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[37] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.

[38] A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.

[39] X. Cai, S. Fan, and Y. Tan, "Light responsive curve selection for photosynthesis operator of APOA," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.

[40] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.

[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.

[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.

[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.

[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.

[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.

[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.

[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.

[53] G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.

[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.


Page 5: Research Article An Effective Hybrid Firefly Algorithm with …downloads.hindawi.com/journals/tswj/2013/125625.pdf · 2019. 7. 31. · An Effective Hybrid Firefly Algorithm with Harmony

The Scientific World Journal 5

0 5 10 15 20 25 30 35 40 45 500

50

100

150

200

250

300

Number of generations

Benc

hmar

k fu

nctio

n va

lue

ACOBBODEESFA

GAHSHSFAPSOSGA

Figure 1 Performance comparison for the F26 Rastrigin function

Benc

hmar

k fu

nctio

n va

lue

0 5 10 15 20 25 30 35 40 45 500

1000

2000

3000

4000

5000

6000

7000

Number of generations

ACOBBODEESFA

GAHSHSFAPSOSGA

Figure 2 Performance comparison for the F28 Schwefel 226function

10 the HMCR = 09 and the PAR = 01 For parametersused in other methods they can be referred to as in [48 53]Thirty-six functions are utilized to verify our HSFAmethodwhich can be shown in Table 1 More knowledge of all thebenchmarks can be found in [54]

Because all the intelligent algorithms always have somerandomness in order to get representative statistical features

0 5 10 15 20 25 30 35 40 45 500

10

20

30

40

50

60

70

80

90

100

110

Number of generations

ACOBBODEESFA

GAHSHSFAPSOSGA

Benc

hmar

k fu

nctio

n va

lue

Figure 3 Performance comparison for the F30 Schwefel 222function

0 5 10 15 20 25 30 35 40 45 50Number of generations

ACOBBODEESFA

GAHSHSFAPSOSGA

103

104

Benc

hmar

k fu

nctio

n va

lue

Figure 4 Performance comparison for the F33 step function

we did 500 implementations of each method on each prob-lem Tables 2 and 3 illustrate the average and best resultsfound by each algorithm respectively Note that we have usedtwo different scales to normalize the values in the tables andits detailed process can be found in [54] The dimension ofeach function is set to 30

6 The Scientific World Journal

Table 2 Mean normalized optimization results

ACO BBO DE ES FA GA HS HSFA PSO SGAF01 101 101 100 102 108 104 107 100 101 125F02 143 271 100 255 100 139 1666 100 313 122F03 125 184 100 228 100 117 1177 101 350 126F04 391198644 171198645 4966 281198645 100 401198644 201198646 151198643 371198645 251198645

F05 101 102 100 111 100 101 115 100 105 119F06 103 102 100 109 100 102 103 100 103 301F07 240 248 227 235 223 183 188 100 171 299F08 172 172 172 172 172 172 172 172 172 100F09 103 101 100 205 1729 100 655 100 104 124F10 240 240 240 306 240 240 309 240 270 100F11 100 100 100 103 100 100 103 100 102 125F12 100 100 100 101 100 100 102 100 100 114F13 432 256 368 556 139 498 570 100 483 263F14 3617 798 4316 7374 1168 3313 7032 100 5362 848F15 57098 1402 2790 111198643 14186 9902 65265 100 48592 1210F16 161198643 7520 31708 121198644 735 94208 111198644 100 161198643 2684F17 2198 235 767 2163 530 733 1901 100 1546 226F18 849 540 1418 6670 231 2849 13902 100 5277 569F19 281198643 16715 54418 191198644 2571 131198643 191198644 100 271198643 3988F20 9379 1359 6816 27660 2005 9242 28265 100 17317 933F21 320 249 174 100 369 265 388 178 255 242F22 121198648 971198643 281198645 511198647 664 581198645 781198647 100 791198646 981F23 221198647 291198644 311198645 141198647 736 671198645 221198647 100 351198646 501198643

F24 11259 800 4808 18898 104 2576 13399 100 5291 292F25 121198643 10334 63738 181198644 1791 141198643 181198644 100 411198643 6277F26 2437 458 2106 3251 775 2084 2989 100 2303 729F27 3723 238 534 4970 100 1038 3412 104 1206 200F28 3776 1845 7351 9292 9382 3193 10920 100 11228 2121F29 479 252 672 741 100 540 713 193 481 431F30 4208 633 1676 6365 936 3016 5357 100 3527 832F31 298 313 387 456 100 393 478 115 397 279F32 20580 1356 3741 38280 187 13127 36149 100 15162 1465F33 4044 2026 5305 31201 514 11120 47125 100 19410 1572F34 27421 2702 4657 54626 617 13885 55025 100 18896 2575F35 121198645 146 332 357 118 312 334 100 312 264F36 982 526 1412 2995 1067 1690 3557 100 2395 537The bold data are the best function value among different methods for the specified function

From Table 2 on average HSFA is well capable offinding function minimum on twenty-eight of the thirty-sixfunctions FA performs the second best on ten of the thirty-six functions Table 3 shows that HSFA and FA perform thesame and best on twenty-two of the thirty-six and seventeenfunctions respectively ACO DE and GA perform the beston eight benchmarks From the above tables we can see thatfor low-dimensional functions both FA and HSFA performwell and their performance has little difference between eachother

Further convergence graphs of ten methods for mostrepresentative functions are illustrated in Figures 1 2 3 and 4which indicate the optimization process The values here arethe real mean function values from above experiments

F26 is a complicated multimodal function and it has asingle global value 0 and several local optima Figure 1 showsthat HSFA converges to global value 0 with the fastest speedHere FA converges a little faster initially but it is likely tobe trapped into subminima as the function value decreasesslightly

F28 is also a multimodal problem and it has only a globalvalue 0 For this problem HSFA is superior to the other ninemethods and finds the optimal value earliest

For this function the figure illustrates that HSFA signif-icantly outperforms all others in the optimization processAt last HSFA converges to the best solution superiorly toothers BBO is only inferior to HSFA and performs thesecond best for this case

The Scientific World Journal 7

Table 3: Best normalized optimization results.

      ACO     BBO     DE      ES      FA      GA      HS      HS/FA   PSO     SGA
F01   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F02   1.00    1.71    1.00    1.53    1.00    1.00    1.49    1.00    1.72    1.00
F03   1.00    1.28    1.00    1.12    1.00    1.00    1.28    1.00    1.34    1.13
F04   2.0E14  2.0E14  3.2E10  2.3E15  2.5E8   1.00    1.1E15  3.8E11  1.0E15  4.5E15
F05   1.00    1.00    1.00    1.02    1.00    1.00    1.00    1.00    1.00    1.00
F06   1.01    1.01    1.00    1.01    1.00    1.01    1.00    1.00    1.00    2.51
F07   2.5E6   3.3E6   7.2E5   1.6E6   1.00    1.7E4   2.1E5   2.88    3.7E5   3.3E6
F08   1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.99    1.00
F09   1.00    1.00    1.00    1.01    1.00    1.00    1.00    1.00    1.00    1.00
F10   2.65    2.65    2.65    2.74    2.65    2.65    2.65    2.65    2.65    1.00
F11   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.03
F12   1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00    1.00
F13   8.34    3.93    7.05    11.02   1.00    8.71    11.56   1.54    9.51    3.98
F14   63.46   11.42   88.82   159.53  12.22   40.65   144.04  1.00    102.48  10.66
F15   218.30  9.54    47.47   591.91  33.50   128.64  792.73  1.00    358.64  9.44
F16   4.7E3   109.24  1.3E3   4.0E4   1.00    315.53  6.1E4   2.31    5.9E3   43.61
F17   42.58   3.24    11.83   41.06   1.20    9.50    43.49   1.00    29.07   3.02
F18   7.99    3.62    13.85   63.26   1.00    14.42   160.01  1.08    37.87   2.32
F19   3.1E3   135.16  893.82  3.9E4   1.56    566.65  5.4E4   1.00    6.1E3   3.67
F20   251.29  32.94   142.30  720.64  7.96    131.48  863.43  1.00    483.70  23.66
F21   3.96    2.89    1.65    1.00    4.60    2.91    4.87    1.90    2.72    2.37
F22   31.83   55.55   1.5E5   1.3E8   8.26    89.30   3.0E8   1.00    5.8E6   15.15
F23   1.00    4.4E3   1.5E6   8.8E7   27.10   3.3E4   1.6E8   4.89    2.8E7   29.22
F24   2.2E3   88.70   1.1E3   4.7E3   1.60    215.01  3.4E3   1.00    940.82  34.50
F25   3.0E3   380.11  2.9E3   1.0E5   1.00    3.1E3   1.3E5   2.01    2.4E4   54.67
F26   39.66   5.65    30.04   58.39   6.03    27.36   42.32   1.00    38.57   8.91
F27   54.77   2.13    12.53   87.36   1.27    10.85   58.14   1.00    21.11   2.77
F28   164.45  67.82   335.75  447.01  430.29  85.82   596.00  1.00    551.44  68.85
F29   8.78    4.00    16.50   15.31   1.00    10.85   18.42   3.27    6.75    7.29
F30   63.53   10.38   27.71   105.79  7.15    47.04   88.91   1.00    58.02   12.83
F31   3.80    5.63    7.23    9.39    1.00    7.31    10.20   1.93    7.56    4.53
F32   740.24  30.59   184.72  1.8E3   1.00    322.80  1.8E3   2.87    725.62  31.00
F33   149.29  66.43   224.57  1.4E3   3.86    255.71  2.4E3   1.00    1.0E3   42.71
F34   491.51  35.62   100.26  1.1E3   1.64    113.66  1.1E3   1.00    400.61  27.55
F35   3.44    2.50    5.89    6.67    1.00    4.28    5.48    1.46    3.83    3.74
F36   11.05   6.01    18.56   40.10   9.07    18.47   43.18   1.00    31.18   4.64
The bold data are the best function values among the different methods for the specified function.

HS/FA significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, while HS/FA steadily improves its solution in the long run. Here, FA converges faster initially (within 20 iterations); however, it then seems to be trapped in local minima, as its function value decreases only slightly (after 20 iterations), and it is outperformed by HS/FA after 30 iterations.

From Figures 1–4, our HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that, in [40], BBO was compared with seven EAs on an engineering problem, and those experiments proved the excellent performance of BBO. This also indirectly shows that our HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA method was proposed for optimization problems. FA is enhanced by incorporating the basic HS method. In HS/FA, a top fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating them. The new harmony vector replaces the new firefly only if it is better than the latter, so HS/FA generally outperforms HS and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems was used to compare HS/FA with the other nine approaches. The results demonstrate that HS/FA is able to use the available knowledge more efficiently to find much better values than the other optimization algorithms.
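The mechanism summarized above can be sketched in code. This is a schematic, simplified sketch, not the authors' implementation: the function `hsfa_update`, its parameter names (`top_frac`, `hmcr`, `par`, `bw`), and their default values are hypothetical stand-ins for the FA attractiveness move restricted to the top fireflies, followed by an HS-style pitch adjustment with greedy replacement, as described in the paper.

```python
import math
import random

def hsfa_update(fireflies, fitness, objective, lb, ub,
                top_frac=0.25, beta0=1.0, gamma=1.0, alpha=0.2,
                hmcr=0.9, par=0.3, bw=0.1):
    """One schematic HS/FA generation (minimization).

    Fireflies move only toward the brightest top_frac of the swarm (the
    'top fireflies' scheme), and a harmony-search-style candidate vector
    replaces a firefly only when it is better than the current one.
    """
    n, d = len(fireflies), len(fireflies[0])
    order = sorted(range(n), key=lambda i: fitness[i])
    top = order[:max(1, int(top_frac * n))]
    for i in range(n):
        # FA move: attract firefly i toward each brighter top firefly j.
        for j in top:
            if fitness[j] < fitness[i]:
                r2 = sum((fireflies[i][k] - fireflies[j][k]) ** 2 for k in range(d))
                beta = beta0 * math.exp(-gamma * r2)
                for k in range(d):
                    step = alpha * (random.random() - 0.5)
                    fireflies[i][k] += beta * (fireflies[j][k] - fireflies[i][k]) + step
                    fireflies[i][k] = min(max(fireflies[i][k], lb), ub)
        # HS-style mutation: build a candidate from the swarm's "memory".
        cand = []
        for k in range(d):
            if random.random() < hmcr:
                x = fireflies[random.randrange(n)][k]
                if random.random() < par:          # pitch adjustment
                    x += bw * (2 * random.random() - 1)
            else:                                  # random re-initialization
                x = lb + random.random() * (ub - lb)
            cand.append(min(max(x, lb), ub))
        # Greedy acceptance: the harmony vector takes the place of the
        # firefly only if it is better than the current one.
        if objective(cand) < objective(fireflies[i]):
            fireflies[i] = cand
        fitness[i] = objective(fireflies[i])
    return fireflies, fitness
```

Restricting attraction to the top fireflies cuts the pairwise comparisons from O(n^2) to O(n * top_frac * n), which is the running-time saving the paper attributes to the top fireflies scheme.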

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.
[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.
[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.
[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.
[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.
[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.
[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.
[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.
[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.
[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.
[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.
[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.
[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.
[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.
[15] A. Kaveh and S. Talatahari, "Optimal design of schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.
[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.
[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.
[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.
[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.
[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.
[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.
[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.
[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.
[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.
[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.
[28] R. Storn and K. Price, "Differential evolution—a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.
[29] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.
[31] G. G. Wang, A. H. Gandomi, A. H. Alavi, and G. S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.
[32] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[33] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.
[34] S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, "A multi-stage particle swarm for optimum design of truss structures," Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.
[35] K. Y. Huang, "A hybrid particle swarm optimization approach for clustering and classification of datasets," Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.
[36] W. Khatib and P. Fleming, "The stud GA: a mini revolution," in Parallel Problem Solving from Nature, pp. 683–691, 1998.

[37] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.
[38] A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.
[39] X. Cai, S. Fan, and Y. Tan, "Light responsive curve selection for photosynthesis operator of APOA," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.
[40] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.
[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.
[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.
[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.
[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.
[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.
[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.
[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.
[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.
[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.
[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.
[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.
[53] G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.
[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.


Page 6: Research Article An Effective Hybrid Firefly Algorithm with …downloads.hindawi.com/journals/tswj/2013/125625.pdf · 2019. 7. 31. · An Effective Hybrid Firefly Algorithm with Harmony

6 The Scientific World Journal

Table 2 Mean normalized optimization results

ACO BBO DE ES FA GA HS HSFA PSO SGAF01 101 101 100 102 108 104 107 100 101 125F02 143 271 100 255 100 139 1666 100 313 122F03 125 184 100 228 100 117 1177 101 350 126F04 391198644 171198645 4966 281198645 100 401198644 201198646 151198643 371198645 251198645

F05 101 102 100 111 100 101 115 100 105 119F06 103 102 100 109 100 102 103 100 103 301F07 240 248 227 235 223 183 188 100 171 299F08 172 172 172 172 172 172 172 172 172 100F09 103 101 100 205 1729 100 655 100 104 124F10 240 240 240 306 240 240 309 240 270 100F11 100 100 100 103 100 100 103 100 102 125F12 100 100 100 101 100 100 102 100 100 114F13 432 256 368 556 139 498 570 100 483 263F14 3617 798 4316 7374 1168 3313 7032 100 5362 848F15 57098 1402 2790 111198643 14186 9902 65265 100 48592 1210F16 161198643 7520 31708 121198644 735 94208 111198644 100 161198643 2684F17 2198 235 767 2163 530 733 1901 100 1546 226F18 849 540 1418 6670 231 2849 13902 100 5277 569F19 281198643 16715 54418 191198644 2571 131198643 191198644 100 271198643 3988F20 9379 1359 6816 27660 2005 9242 28265 100 17317 933F21 320 249 174 100 369 265 388 178 255 242F22 121198648 971198643 281198645 511198647 664 581198645 781198647 100 791198646 981F23 221198647 291198644 311198645 141198647 736 671198645 221198647 100 351198646 501198643

F24 11259 800 4808 18898 104 2576 13399 100 5291 292F25 121198643 10334 63738 181198644 1791 141198643 181198644 100 411198643 6277F26 2437 458 2106 3251 775 2084 2989 100 2303 729F27 3723 238 534 4970 100 1038 3412 104 1206 200F28 3776 1845 7351 9292 9382 3193 10920 100 11228 2121F29 479 252 672 741 100 540 713 193 481 431F30 4208 633 1676 6365 936 3016 5357 100 3527 832F31 298 313 387 456 100 393 478 115 397 279F32 20580 1356 3741 38280 187 13127 36149 100 15162 1465F33 4044 2026 5305 31201 514 11120 47125 100 19410 1572F34 27421 2702 4657 54626 617 13885 55025 100 18896 2575F35 121198645 146 332 357 118 312 334 100 312 264F36 982 526 1412 2995 1067 1690 3557 100 2395 537The bold data are the best function value among different methods for the specified function

From Table 2 on average HSFA is well capable offinding function minimum on twenty-eight of the thirty-sixfunctions FA performs the second best on ten of the thirty-six functions Table 3 shows that HSFA and FA perform thesame and best on twenty-two of the thirty-six and seventeenfunctions respectively ACO DE and GA perform the beston eight benchmarks From the above tables we can see thatfor low-dimensional functions both FA and HSFA performwell and their performance has little difference between eachother

Further convergence graphs of ten methods for mostrepresentative functions are illustrated in Figures 1 2 3 and 4which indicate the optimization process The values here arethe real mean function values from above experiments

F26 is a complicated multimodal function and it has asingle global value 0 and several local optima Figure 1 showsthat HSFA converges to global value 0 with the fastest speedHere FA converges a little faster initially but it is likely tobe trapped into subminima as the function value decreasesslightly

F28 is also a multimodal problem and it has only a globalvalue 0 For this problem HSFA is superior to the other ninemethods and finds the optimal value earliest

For this function the figure illustrates that HSFA signif-icantly outperforms all others in the optimization processAt last HSFA converges to the best solution superiorly toothers BBO is only inferior to HSFA and performs thesecond best for this case

The Scientific World Journal 7

Table 3 Best normalized optimization results

ACO BBO DE ES FA GA HS HSFA PSO SGAF01 100 100 100 100 100 100 100 100 100 100F02 100 171 100 153 100 100 149 100 172 100F03 100 128 100 112 100 100 128 100 134 113F04 2011986414 2011986414 3211986410 2311986415 251198648 100 1111986415 3811986411 1011986415 4511986415

F05 100 100 100 102 100 100 100 100 100 100F06 101 101 100 101 100 101 100 100 100 251F07 251198646 331198646 721198645 161198646 100 171198644 211198645 288 371198645 331198646

F08 199 199 199 199 199 199 199 199 199 100F09 100 100 100 101 100 100 100 100 100 100F10 265 265 265 274 265 265 265 265 265 100F11 100 100 100 100 100 100 100 100 100 103F12 100 100 100 100 100 100 100 100 100 100F13 834 393 705 1102 100 871 1156 154 951 398F14 6346 1142 8882 15953 1222 4065 14404 100 10248 1066F15 21830 954 4747 59191 3350 12864 79273 100 35864 944F16 471198643 10924 131198643 401198644 100 31553 611198644 231 591198643 4361F17 4258 324 1183 4106 120 950 4349 100 2907 302F18 799 362 1385 6326 100 1442 16001 108 3787 232F19 311198643 13516 89382 391198644 156 56665 541198644 100 611198643 367F20 25129 3294 14230 72064 796 13148 86343 100 48370 2366F21 396 289 165 100 460 291 487 190 272 237F22 3183 5555 151198645 131198648 826 8930 301198648 100 581198646 1515F23 100 441198643 151198646 881198647 2710 331198644 161198648 489 281198647 2922F24 221198643 8870 111198643 471198643 160 21501 341198643 100 94082 3450F25 301198643 38011 291198643 101198645 100 311198643 131198645 201 241198644 5467F26 3966 565 3004 5839 603 2736 4232 100 3857 891F27 5477 213 1253 8736 127 1085 5814 100 2111 277F28 16445 6782 33575 44701 43029 8582 59600 100 55144 6885F29 878 400 1650 1531 100 1085 1842 327 675 729F30 6353 1038 2771 10579 715 4704 8891 100 5802 1283F31 380 563 723 939 100 731 1020 193 756 453F32 74024 3059 18472 181198643 100 32280 181198643 287 72562 3100F33 14929 6643 22457 141198643 386 25571 241198643 100 101198643 4271F34 49151 3562 10026 111198643 164 11366 111198643 100 40061 2755F35 344 250 589 667 100 428 548 146 383 374F36 1105 601 1856 4010 907 1847 4318 100 3118 464The bold data are the best function value among different methods for the specified function

HSFA significantly outperforms all others in the opti-mization process Furthermore Figure 4 indicates that atthe early stage of the optimization process FA convergesfaster than HSFA while HSFA is well capable of improvingits solution steadily in the long run Here FA shows fasterconverges initially (within 20 iterations) however it seems tobe trapped into subminima as the function value decreasesslightly (after 20 iterations) and it is outperformed by HSFAafter 30 iterations

From Figures 1ndash4 our HSFArsquos performance is far betterthan the others In general BBO and FA especially FAare only inferior to the HSFA Note that in [40] BBO iscompared with seven EAs and an engineering problem The

experiments proved the excellent performance of BBO It isalso indirectly proven that our HSFA is a more effectiveoptimization method than others

6 Conclusions

In the present work a hybrid HSFA was proposed foroptimization problems FA is enhanced by the combinationof the basic HS method In HSFA top fireflies scheme isintroduced to reduce running time the other is used tomutate between fireflies when updating fireflies The newharmony vector takes the place of the new firefly only ifit is better than before which generally outperforms HS

8 The Scientific World Journal

and FA The HSFA strive to exploit merits of the FA andHS so as to escape all fireflies being trapped into localoptima Benchmark evaluation on the test problems is usedto investigate the HSFA and the other nine approaches Theresults demonstrated that HSFA is able to make use of theuseful knowledge more efficiently to find much better valuescompared with the other optimization algorithms

References

[1] G Wang and L Guo ldquoA novel hybrid bat algorithm withharmony search for global numerical optimizationrdquo Journal ofApplied Mathematics vol 2013 Article ID 696491 21 pages2013

[2] X Li and M Yin ldquoAn opposition-based differential evolutionalgorithm for permutation flow shop scheduling based ondiversity measurerdquo Advances in Engineering Software vol 55pp 10ndash31 2013

[3] D Zou L Gao S Li and J Wu ldquoAn effective global harmonysearch algorithm for reliability problemsrdquo Expert Systems withApplications vol 38 no 4 pp 4642ndash4648 2011

[4] D Zou L Gao J Wu S Li and Y Li ldquoA novel globalharmony search algorithm for reliability problemsrdquo Computersand Industrial Engineering vol 58 no 2 pp 307ndash316 2010

[5] X-S Yang Z Cui R Xiao A H Gandomi and M Kara-manoglu Swarm Intelligence and Bio-Inspired ComputationElsevier Waltham Mass USA 2013

[6] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[7] X S Yang A H Gandomi S Talatahari and A H AlaviMetaheuristics in Water Geotechnical and Transport Engineer-ing Elsevier Waltham Mass USA 2013

[8] D E Goldberg Genetic Algorithms in Search Optimization andMachine Learning Addison-Wesley Boston Mass USA 1989

[9] T Back Evolutionary Algorithms inTheory and Practice OxfordUniversity Press Oxford UK 1996

[10] H BeyerTheTheory of Evolution Strategies Springer NewYorkNY USA 2001

[11] M Dorigo and T Stutzle Ant Colony Optimization MIT PressCambridge UK 2004

[12] B Shumeet ldquoPopulation-based incremental learning a methodfor integrating genetic search based function optimization andcompetitive learningrdquo Carnegie Mellon University CMU-CS-94-163 Carnegie Mellon University Pittsburgh Pa USA 1994

[13] O K Erol and I Eksin ldquoA new optimization method big bang-big crunchrdquo Advances in Engineering Software vol 37 no 2 pp106ndash111 2006

[14] A Kaveh and S Talatahari ldquoSize optimization of space trussesusing big bang-big crunch algorithmrdquo Computers and Struc-tures vol 87 no 17-18 pp 1129ndash1140 2009

[15] A Kaveh and S Talatahari ldquoOptimal design of schwedlerand ribbed domes via hybrid big bang-big crunch algorithmrdquoJournal of Constructional Steel Research vol 66 no 3 pp 412ndash419 2010

[16] A Kaveh and S Talatahari ldquoA discrete big bang-big crunchalgorithm for optimal design of skeletal structuresrdquo AsianJournal of Civil Engineering vol 11 no 1 pp 103ndash122 2010

[17] Z W Geem J H Kim and G V Loganathan ldquoA new heuristicoptimization algorithm harmony searchrdquo Simulation vol 76no 2 pp 60ndash68 2001

[18] P Yadav R Kumar S K Panda and C S Chang ldquoAn intelligenttuned harmony search algorithm for optimisationrdquo InformationSciences vol 196 pp 47ndash72 2012

[19] S Gholizadeh and A Barzegar ldquoShape optimization of struc-tures for frequency constraints by sequential harmony searchalgorithmrdquo Engineering Optimization vol 45 no 6 pp 627ndash646 2013

[20] A Kaveh and S Talatahari ldquoA novel heuristic optimizationmethod charged system searchrdquo Acta Mechanica vol 213 no3-4 pp 267ndash289 2010

[21] L Xie J Zeng and R A Formato ldquoSelection strategies forgravitational constant G in artificial physics optimisation basedon analysis of convergence propertiesrdquo International Journal ofBio-Inspired Computation vol 4 no 6 pp 380ndash391 2012

[22] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing amp Applications vol 22 no 6 pp 1239ndash1255 2013

[23] X S Yang and A H Gandomi ldquoBat algorithm a novelapproach for global engineering optimizationrdquo EngineeringComputations vol 29 no 5 pp 464ndash483 2012

[24] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications 2013

[25] A H Gandomi and A H Alavi ldquoKrill herd a new bio-inspiredoptimization algorithmrdquo Communications in Nonlinear Scienceand Numerical Simulation vol 17 no 12 pp 4831ndash4845 2012

[26] G-G Wang A H Gandomi and A H Alavi ldquoStud krill herdalgorithmrdquo Neurocomputing 2013

[27] G-GWang A H Gandomi and A H Alavi ldquoAn effective krillherd algorithmwith migration operator in biogeography-basedoptimizationrdquo Applied Mathematical Modelling 2013

[28] R Storn and K Price ldquoDifferential evolution-a simple and effi-cient adaptive scheme for global optimization over continuousspacesrdquo Tech Rep 1075-4946 International Computer ScienceInstitute Berkley Calif USA 1995

[29] R Storn and K Price ldquoDifferential evolution-a simple andefficient heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997

[30] X Li and M Yin ldquoApplication of differential evolution algo-rithm on self-potential datardquo PLoS One vol 7 no 12 ArticleID e51199 2012

[31] GGWang AHGandomi AHAlavi andG SHao ldquoHybridkrill herd algorithm with differential evolution for globalnumerical optimizationrdquo Neural Computing amp Applications2013

[32] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[33] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012

[34] S Talatahari M Kheirollahi C Farahmandpour and A HGandomi ldquoA multi-stage particle swarm for optimum designof truss structuresrdquo Neural Computing amp Applications vol 23no 5 pp 1297ndash1309 2013

[35] K Y Huang ldquoA hybrid particle swarm optimization approachfor clustering and classification of datasetsrdquo Knowledge-BasedSystems vol 24 no 3 pp 420ndash426 2011

[36] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquoParallel Problem Solving from Nature pp 683ndash691 1998

The Scientific World Journal 9

[37] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo inProceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 CoimbatoreIndia December 2009

[38] A H Gandomi S Talatahari X S Yang and S Deb ldquoDesignoptimization of truss structures using cuckoo search algorithmrdquoThe Structural Design of Tall and Special Buildings vol 22 no17 pp 1330ndash1349 2013

[39] X Cai S Fan and Y Tan ldquoLight responsive curve selection forphotosynthesis operator of APOArdquo International Journal of Bio-Inspired Computation vol 4 no 6 pp 373ndash379 2012

[40] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[41] A H Gandomi X-S Yang and A H Alavi ldquoMixed variablestructural optimization using firefly algorithmrdquo Computers ampStructures vol 89 no 23-24 pp 2325ndash2336 2011

[42] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverFrome UK 2008

[43] X S Yang ldquoFirefly algorithms for multimodal optimizationrdquoin Proceedings of the 5th International Conference on Stochas-tic Algorithms Foundations and Applications pp 169ndash178Springer Sapporo Japan 2009

[44] X S Yang ldquoFirefly algorithm stochastic test functions anddesign optimisationrdquo International Journal of Bio-Inspired Com-putation vol 2 no 2 pp 78ndash84 2010


The Scientific World Journal 7

Table 3: Best normalized optimization results.

| Fn  | ACO    | BBO    | DE     | ES     | FA     | GA     | HS     | HS/FA  | PSO    | SGA   |
|-----|--------|--------|--------|--------|--------|--------|--------|--------|--------|-------|
| F01 | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00  |
| F02 | 1.00   | 1.71   | 1.00   | 1.53   | 1.00   | 1.00   | 1.49   | 1.00   | 1.72   | 1.00  |
| F03 | 1.00   | 1.28   | 1.00   | 1.12   | 1.00   | 1.00   | 1.28   | 1.00   | 1.34   | 1.13  |
| F04 | 2.0E14 | 2.0E14 | 3.2E10 | 2.3E15 | 2.5E8  | 1.00   | 1.1E15 | 3.8E11 | 1.0E15 | 4.5E15 |
| F05 | 1.00   | 1.00   | 1.00   | 1.02   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00  |
| F06 | 1.01   | 1.01   | 1.00   | 1.01   | 1.00   | 1.01   | 1.00   | 1.00   | 1.00   | 2.51  |
| F07 | 2.5E6  | 3.3E6  | 7.2E5  | 1.6E6  | 1.00   | 1.7E4  | 2.1E5  | 2.88   | 3.7E5  | 3.3E6 |
| F08 | 1.99   | 1.99   | 1.99   | 1.99   | 1.99   | 1.99   | 1.99   | 1.99   | 1.99   | 1.00  |
| F09 | 1.00   | 1.00   | 1.00   | 1.01   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00  |
| F10 | 2.65   | 2.65   | 2.65   | 2.74   | 2.65   | 2.65   | 2.65   | 2.65   | 2.65   | 1.00  |
| F11 | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.03  |
| F12 | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00   | 1.00  |
| F13 | 8.34   | 3.93   | 7.05   | 11.02  | 1.00   | 8.71   | 11.56  | 1.54   | 9.51   | 3.98  |
| F14 | 63.46  | 11.42  | 88.82  | 159.53 | 12.22  | 40.65  | 144.04 | 1.00   | 102.48 | 10.66 |
| F15 | 218.30 | 9.54   | 47.47  | 591.91 | 33.50  | 128.64 | 792.73 | 1.00   | 358.64 | 9.44  |
| F16 | 4.7E3  | 109.24 | 1.3E3  | 4.0E4  | 1.00   | 315.53 | 6.1E4  | 2.31   | 5.9E3  | 43.61 |
| F17 | 42.58  | 3.24   | 11.83  | 41.06  | 1.20   | 9.50   | 43.49  | 1.00   | 29.07  | 3.02  |
| F18 | 7.99   | 3.62   | 13.85  | 63.26  | 1.00   | 14.42  | 160.01 | 1.08   | 37.87  | 2.32  |
| F19 | 3.1E3  | 135.16 | 893.82 | 3.9E4  | 1.56   | 566.65 | 5.4E4  | 1.00   | 6.1E3  | 3.67  |
| F20 | 251.29 | 32.94  | 142.30 | 720.64 | 7.96   | 131.48 | 863.43 | 1.00   | 483.70 | 23.66 |
| F21 | 3.96   | 2.89   | 1.65   | 1.00   | 4.60   | 2.91   | 4.87   | 1.90   | 2.72   | 2.37  |
| F22 | 31.83  | 55.55  | 1.5E5  | 1.3E8  | 8.26   | 89.30  | 3.0E8  | 1.00   | 5.8E6  | 15.15 |
| F23 | 1.00   | 4.4E3  | 1.5E6  | 8.8E7  | 27.10  | 3.3E4  | 1.6E8  | 4.89   | 2.8E7  | 29.22 |
| F24 | 2.2E3  | 88.70  | 1.1E3  | 4.7E3  | 1.60   | 215.01 | 3.4E3  | 1.00   | 940.82 | 34.50 |
| F25 | 3.0E3  | 380.11 | 2.9E3  | 1.0E5  | 1.00   | 3.1E3  | 1.3E5  | 2.01   | 2.4E4  | 54.67 |
| F26 | 39.66  | 5.65   | 30.04  | 58.39  | 6.03   | 27.36  | 42.32  | 1.00   | 38.57  | 8.91  |
| F27 | 54.77  | 2.13   | 12.53  | 87.36  | 1.27   | 10.85  | 58.14  | 1.00   | 21.11  | 2.77  |
| F28 | 164.45 | 67.82  | 335.75 | 447.01 | 430.29 | 85.82  | 596.00 | 1.00   | 551.44 | 68.85 |
| F29 | 8.78   | 4.00   | 16.50  | 15.31  | 1.00   | 10.85  | 18.42  | 3.27   | 6.75   | 7.29  |
| F30 | 63.53  | 10.38  | 27.71  | 105.79 | 7.15   | 47.04  | 88.91  | 1.00   | 58.02  | 12.83 |
| F31 | 3.80   | 5.63   | 7.23   | 9.39   | 1.00   | 7.31   | 10.20  | 1.93   | 7.56   | 4.53  |
| F32 | 740.24 | 30.59  | 184.72 | 1.8E3  | 1.00   | 322.80 | 1.8E3  | 2.87   | 725.62 | 31.00 |
| F33 | 149.29 | 66.43  | 224.57 | 1.4E3  | 3.86   | 255.71 | 2.4E3  | 1.00   | 1.0E3  | 42.71 |
| F34 | 491.51 | 35.62  | 100.26 | 1.1E3  | 1.64   | 113.66 | 1.1E3  | 1.00   | 400.61 | 27.55 |
| F35 | 3.44   | 2.50   | 5.89   | 6.67   | 1.00   | 4.28   | 5.48   | 1.46   | 3.83   | 3.74  |
| F36 | 11.05  | 6.01   | 18.56  | 40.10  | 9.07   | 18.47  | 43.18  | 1.00   | 31.18  | 4.64  |

The bold data in the original table are the best function values among the different methods for each function.
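The exact normalization formula is not stated in this excerpt, but "best normalized" results in which the winning method scores 1.00 on each function are most plausibly obtained by dividing each method's mean result by the smallest mean on that row. A minimal sketch under that assumption:

```python
def normalize_row(results):
    """Scale one table row (mean best function values for each method,
    minimization) so that the best-performing method scores 1.00.

    Assumption: normalization is division by the row minimum, which is
    one plausible reading of "best normalized" -- not stated explicitly
    in the paper excerpt.
    """
    best = min(results)
    if best == 0:
        # All-zero best value: ratios are undefined, return raw values.
        return list(results)
    return [round(v / best, 2) for v in results]
```

For example, `normalize_row([2.0, 4.0, 3.0])` gives `[1.0, 2.0, 1.5]`: the first method is the best and is scaled to 1.00, and the others express how many times worse they are.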

HS/FA significantly outperforms all the others in the optimization process. Furthermore, Figure 4 indicates that, at the early stage of the optimization process, FA converges faster than HS/FA, while HS/FA is well capable of improving its solution steadily in the long run. Here FA converges faster initially (within 20 iterations); however, it appears to be trapped in subminima, as the function value decreases only slightly after 20 iterations, and it is overtaken by HS/FA after 30 iterations.

From Figures 1–4, HS/FA's performance is far better than that of the others. In general, BBO and FA, especially FA, are inferior only to HS/FA. Note that in [40] BBO was compared with seven EAs and an engineering problem, and those experiments demonstrated the excellent performance of BBO. This indirectly confirms that HS/FA is a more effective optimization method than the others.

6. Conclusions

In the present work, a hybrid HS/FA was proposed for optimization problems. FA is enhanced by combination with the basic HS method. In HS/FA, a top-fireflies scheme is introduced to reduce running time, and HS is used to mutate between fireflies when updating them. The new harmony vector takes the place of the new firefly only if it is better than before, which generally outperforms HS


and FA. HS/FA strives to exploit the merits of FA and HS so as to prevent all fireflies from being trapped in local optima. Benchmark evaluation on the test problems is used to compare HS/FA with the other nine approaches. The results demonstrate that HS/FA is able to make use of useful knowledge more efficiently and to find much better values than the other optimization algorithms.
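The two ingredients named above (a top-fireflies attraction scheme and an HS-style mutation with greedy acceptance) can be sketched in one generation of an update loop. This is a minimal illustrative sketch, not the authors' exact implementation: the parameter names (`hmcr`, `par`, `alpha`) and the attraction rule follow standard HS and FA conventions, and `n_top` is an assumed name for the number of top fireflies.

```python
import math
import random

def hsfa_step(fireflies, fitness, n_top=5, hmcr=0.9, par=0.3,
              alpha=0.1, beta0=1.0, gamma=1.0, bounds=(-5.0, 5.0)):
    """One illustrative HS/FA generation for minimization.

    fireflies: list of position vectors (lists of floats).
    fitness:   objective function to minimize.
    Only the n_top brightest fireflies attract the rest (the
    "top fireflies" scheme); each moved firefly is then challenged
    by an HS-style harmony vector, which replaces it only if better.
    """
    lo, hi = bounds
    dim = len(fireflies[0])
    # Brighter = smaller objective value for minimization.
    order = sorted(range(len(fireflies)), key=lambda i: fitness(fireflies[i]))
    top = [fireflies[i] for i in order[:n_top]]

    new_pop = []
    for x in fireflies:
        y = list(x)
        for t in top:  # attraction toward the top fireflies only
            r2 = sum((a - b) ** 2 for a, b in zip(y, t))
            beta = beta0 * math.exp(-gamma * r2)
            y = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                 for a, b in zip(y, t)]
        # HS-style mutation: compose a harmony from the population memory.
        h = []
        for d in range(dim):
            if random.random() < hmcr:
                v = random.choice(fireflies)[d]
                if random.random() < par:       # pitch adjustment
                    v += alpha * (2 * random.random() - 1)
            else:                               # random re-initialization
                v = lo + random.random() * (hi - lo)
            h.append(min(hi, max(lo, v)))
        # Greedy replacement: the harmony survives only if it is better.
        cand = h if fitness(h) < fitness(y) else y
        new_pop.append([min(hi, max(lo, v)) for v in cand])
    return new_pop
```

Repeatedly applying `hsfa_step` to a random population drives it toward the optimum; the greedy acceptance of the harmony vector is what keeps the HS mutation from degrading good fireflies.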

References

[1] G. Wang and L. Guo, "A novel hybrid bat algorithm with harmony search for global numerical optimization," Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.

[2] X. Li and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10–31, 2013.

[3] D. Zou, L. Gao, S. Li, and J. Wu, "An effective global harmony search algorithm for reliability problems," Expert Systems with Applications, vol. 38, no. 4, pp. 4642–4648, 2011.

[4] D. Zou, L. Gao, J. Wu, S. Li, and Y. Li, "A novel global harmony search algorithm for reliability problems," Computers and Industrial Engineering, vol. 58, no. 2, pp. 307–316, 2010.

[5] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, Mass, USA, 2013.

[6] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, Mass, USA, 2013.

[7] X. S. Yang, A. H. Gandomi, S. Talatahari, and A. H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, Mass, USA, 2013.

[8] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Boston, Mass, USA, 1989.

[9] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, Oxford, UK, 1996.

[10] H. Beyer, The Theory of Evolution Strategies, Springer, New York, NY, USA, 2001.

[11] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, Mass, USA, 2004.

[12] B. Shumeet, "Population-based incremental learning: a method for integrating genetic search based function optimization and competitive learning," Tech. Rep. CMU-CS-94-163, Carnegie Mellon University, Pittsburgh, Pa, USA, 1994.

[13] O. K. Erol and I. Eksin, "A new optimization method: big bang-big crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106–111, 2006.

[14] A. Kaveh and S. Talatahari, "Size optimization of space trusses using big bang-big crunch algorithm," Computers and Structures, vol. 87, no. 17-18, pp. 1129–1140, 2009.

[15] A. Kaveh and S. Talatahari, "Optimal design of Schwedler and ribbed domes via hybrid big bang-big crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412–419, 2010.

[16] A. Kaveh and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103–122, 2010.

[17] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.

[18] P. Yadav, R. Kumar, S. K. Panda, and C. S. Chang, "An intelligent tuned harmony search algorithm for optimisation," Information Sciences, vol. 196, pp. 47–72, 2012.

[19] S. Gholizadeh and A. Barzegar, "Shape optimization of structures for frequency constraints by sequential harmony search algorithm," Engineering Optimization, vol. 45, no. 6, pp. 627–646, 2013.

[20] A. Kaveh and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267–289, 2010.

[21] L. Xie, J. Zeng, and R. A. Formato, "Selection strategies for gravitational constant G in artificial physics optimisation based on analysis of convergence properties," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 380–391, 2012.

[22] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, "Bat algorithm for constrained optimization tasks," Neural Computing & Applications, vol. 22, no. 6, pp. 1239–1255, 2013.

[23] X. S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.

[24] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, 2013.

[25] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.

[26] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, 2013.

[27] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, 2013.

[28] R. Storn and K. Price, "Differential evolution: a simple and efficient adaptive scheme for global optimization over continuous spaces," Tech. Rep. 1075-4946, International Computer Science Institute, Berkeley, Calif, USA, 1995.

[29] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[30] X. Li and M. Yin, "Application of differential evolution algorithm on self-potential data," PLoS One, vol. 7, no. 12, Article ID e51199, 2012.

[31] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and G.-S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing & Applications, 2013.

[32] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[33] R. J. Kuo, Y. J. Syu, Z.-Y. Chen, and F. C. Tien, "Integration of particle swarm optimization and genetic algorithm for dynamic clustering," Information Sciences, vol. 195, pp. 124–140, 2012.

[34] S. Talatahari, M. Kheirollahi, C. Farahmandpour, and A. H. Gandomi, "A multi-stage particle swarm for optimum design of truss structures," Neural Computing & Applications, vol. 23, no. 5, pp. 1297–1309, 2013.

[35] K. Y. Huang, "A hybrid particle swarm optimization approach for clustering and classification of datasets," Knowledge-Based Systems, vol. 24, no. 3, pp. 420–426, 2011.

[36] W. Khatib and P. Fleming, "The stud GA: a mini revolution?" in Parallel Problem Solving from Nature, pp. 683–691, 1998.


[37] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.

[38] A. H. Gandomi, S. Talatahari, X. S. Yang, and S. Deb, "Design optimization of truss structures using cuckoo search algorithm," The Structural Design of Tall and Special Buildings, vol. 22, no. 17, pp. 1330–1349, 2013.

[39] X. Cai, S. Fan, and Y. Tan, "Light responsive curve selection for photosynthesis operator of APOA," International Journal of Bio-Inspired Computation, vol. 4, no. 6, pp. 373–379, 2012.

[40] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.

[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.

[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.

[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.

[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.

[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.

[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.

[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.

[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.

[53] G.-G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.

[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 8: Research Article An Effective Hybrid Firefly Algorithm with …downloads.hindawi.com/journals/tswj/2013/125625.pdf · 2019. 7. 31. · An Effective Hybrid Firefly Algorithm with Harmony

8 The Scientific World Journal

and FA The HSFA strive to exploit merits of the FA andHS so as to escape all fireflies being trapped into localoptima Benchmark evaluation on the test problems is usedto investigate the HSFA and the other nine approaches Theresults demonstrated that HSFA is able to make use of theuseful knowledge more efficiently to find much better valuescompared with the other optimization algorithms

References

[1] G Wang and L Guo ldquoA novel hybrid bat algorithm withharmony search for global numerical optimizationrdquo Journal ofApplied Mathematics vol 2013 Article ID 696491 21 pages2013

[2] X Li and M Yin ldquoAn opposition-based differential evolutionalgorithm for permutation flow shop scheduling based ondiversity measurerdquo Advances in Engineering Software vol 55pp 10ndash31 2013

[3] D Zou L Gao S Li and J Wu ldquoAn effective global harmonysearch algorithm for reliability problemsrdquo Expert Systems withApplications vol 38 no 4 pp 4642ndash4648 2011

[4] D Zou L Gao J Wu S Li and Y Li ldquoA novel globalharmony search algorithm for reliability problemsrdquo Computersand Industrial Engineering vol 58 no 2 pp 307ndash316 2010

[5] X-S Yang Z Cui R Xiao A H Gandomi and M Kara-manoglu Swarm Intelligence and Bio-Inspired ComputationElsevier Waltham Mass USA 2013

[6] A H Gandomi X S Yang S Talatahari and A H AlaviMetaheuristic Applications in Structures and InfrastructuresElsevier Waltham Mass USA 2013

[7] X S Yang A H Gandomi S Talatahari and A H AlaviMetaheuristics in Water Geotechnical and Transport Engineer-ing Elsevier Waltham Mass USA 2013

[8] D E Goldberg Genetic Algorithms in Search Optimization andMachine Learning Addison-Wesley Boston Mass USA 1989

[9] T Back Evolutionary Algorithms inTheory and Practice OxfordUniversity Press Oxford UK 1996

[10] H BeyerTheTheory of Evolution Strategies Springer NewYorkNY USA 2001

[11] M Dorigo and T Stutzle Ant Colony Optimization MIT PressCambridge UK 2004

[12] B Shumeet ldquoPopulation-based incremental learning a methodfor integrating genetic search based function optimization andcompetitive learningrdquo Carnegie Mellon University CMU-CS-94-163 Carnegie Mellon University Pittsburgh Pa USA 1994

[13] O K Erol and I Eksin ldquoA new optimization method big bang-big crunchrdquo Advances in Engineering Software vol 37 no 2 pp106ndash111 2006

[14] A Kaveh and S Talatahari ldquoSize optimization of space trussesusing big bang-big crunch algorithmrdquo Computers and Struc-tures vol 87 no 17-18 pp 1129ndash1140 2009

[15] A Kaveh and S Talatahari ldquoOptimal design of schwedlerand ribbed domes via hybrid big bang-big crunch algorithmrdquoJournal of Constructional Steel Research vol 66 no 3 pp 412ndash419 2010

[16] A Kaveh and S Talatahari ldquoA discrete big bang-big crunchalgorithm for optimal design of skeletal structuresrdquo AsianJournal of Civil Engineering vol 11 no 1 pp 103ndash122 2010

[17] Z W Geem J H Kim and G V Loganathan ldquoA new heuristicoptimization algorithm harmony searchrdquo Simulation vol 76no 2 pp 60ndash68 2001

[18] P Yadav R Kumar S K Panda and C S Chang ldquoAn intelligenttuned harmony search algorithm for optimisationrdquo InformationSciences vol 196 pp 47ndash72 2012

[19] S Gholizadeh and A Barzegar ldquoShape optimization of struc-tures for frequency constraints by sequential harmony searchalgorithmrdquo Engineering Optimization vol 45 no 6 pp 627ndash646 2013

[20] A Kaveh and S Talatahari ldquoA novel heuristic optimizationmethod charged system searchrdquo Acta Mechanica vol 213 no3-4 pp 267ndash289 2010

[21] L Xie J Zeng and R A Formato ldquoSelection strategies forgravitational constant G in artificial physics optimisation basedon analysis of convergence propertiesrdquo International Journal ofBio-Inspired Computation vol 4 no 6 pp 380ndash391 2012

[22] A H Gandomi X-S Yang A H Alavi and S TalataharildquoBat algorithm for constrained optimization tasksrdquo NeuralComputing amp Applications vol 22 no 6 pp 1239ndash1255 2013

[23] X S Yang and A H Gandomi ldquoBat algorithm a novelapproach for global engineering optimizationrdquo EngineeringComputations vol 29 no 5 pp 464ndash483 2012

[24] X Li J Zhang andM Yin ldquoAnimal migration optimization anoptimization algorithm inspired by animalmigration behaviorrdquoNeural Computing and Applications 2013

[25] A H Gandomi and A H Alavi ldquoKrill herd a new bio-inspiredoptimization algorithmrdquo Communications in Nonlinear Scienceand Numerical Simulation vol 17 no 12 pp 4831ndash4845 2012

[26] G-G Wang A H Gandomi and A H Alavi ldquoStud krill herdalgorithmrdquo Neurocomputing 2013

[27] G-GWang A H Gandomi and A H Alavi ldquoAn effective krillherd algorithmwith migration operator in biogeography-basedoptimizationrdquo Applied Mathematical Modelling 2013

[28] R Storn and K Price ldquoDifferential evolution-a simple and effi-cient adaptive scheme for global optimization over continuousspacesrdquo Tech Rep 1075-4946 International Computer ScienceInstitute Berkley Calif USA 1995

[29] R Storn and K Price ldquoDifferential evolution-a simple andefficient heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997

[30] X Li and M Yin ldquoApplication of differential evolution algo-rithm on self-potential datardquo PLoS One vol 7 no 12 ArticleID e51199 2012

[31] GGWang AHGandomi AHAlavi andG SHao ldquoHybridkrill herd algorithm with differential evolution for globalnumerical optimizationrdquo Neural Computing amp Applications2013

[32] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[33] R J Kuo Y J Syu Z-Y Chen and F C Tien ldquoIntegration ofparticle swarm optimization and genetic algorithm for dynamicclusteringrdquo Information Sciences vol 195 pp 124ndash140 2012

[34] S Talatahari M Kheirollahi C Farahmandpour and A HGandomi ldquoA multi-stage particle swarm for optimum designof truss structuresrdquo Neural Computing amp Applications vol 23no 5 pp 1297ndash1309 2013

[35] K Y Huang ldquoA hybrid particle swarm optimization approachfor clustering and classification of datasetsrdquo Knowledge-BasedSystems vol 24 no 3 pp 420ndash426 2011

[36] W Khatib and P Fleming ldquoThe stud GA a mini revolutionrdquoParallel Problem Solving from Nature pp 683ndash691 1998

The Scientific World Journal 9

[37] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo inProceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 CoimbatoreIndia December 2009

[38] A H Gandomi S Talatahari X S Yang and S Deb ldquoDesignoptimization of truss structures using cuckoo search algorithmrdquoThe Structural Design of Tall and Special Buildings vol 22 no17 pp 1330ndash1349 2013

[39] X Cai S Fan and Y Tan ldquoLight responsive curve selection forphotosynthesis operator of APOArdquo International Journal of Bio-Inspired Computation vol 4 no 6 pp 373ndash379 2012

[40] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[41] A H Gandomi X-S Yang and A H Alavi ldquoMixed variablestructural optimization using firefly algorithmrdquo Computers ampStructures vol 89 no 23-24 pp 2325ndash2336 2011

[42] X S Yang Nature-Inspired Metaheuristic Algorithms LuniverFrome UK 2008

[43] X S Yang ldquoFirefly algorithms for multimodal optimizationrdquoin Proceedings of the 5th International Conference on Stochas-tic Algorithms Foundations and Applications pp 169ndash178Springer Sapporo Japan 2009

[44] X S Yang ldquoFirefly algorithm stochastic test functions anddesign optimisationrdquo International Journal of Bio-Inspired Com-putation vol 2 no 2 pp 78ndash84 2010

[45] X-S Yang S S S Hosseini and A H Gandomi ldquoFireflyalgorithm for solving non-convex economic dispatch problemswith valve loading effectrdquo Applied Soft Computing Journal vol12 no 3 pp 1180ndash1186 2012

[46] R Parpinelli and H Lopes ldquoNew inspirations in swarmintelligence a surveyrdquo International Journal of Bio-InspiredComputation vol 3 no 1 pp 1ndash16 2011

[47] D Zou L Gao J Wu and S Li ldquoNovel global harmony searchalgorithm for unconstrained problemsrdquo Neurocomputing vol73 no 16ndash18 pp 3308ndash3318 2010

[48] G Wang L Guo H Wang H Duan L Liu and J Li ldquoIncor-porating mutation scheme into krill herd algorithm for globalnumerical optimizationrdquo Neural Computing and Applications2012

[49] S Z Zhao P N Suganthan Q-K Pan and M Fatih Tasge-tiren ldquoDynamic multi-swarm particle swarm optimizer withharmony searchrdquo Expert Systems with Applications vol 38 no4 pp 3735ndash3742 2011

[50] G Wang L Guo H Duan L Liu and H Wang ldquoA modifiedfirefly algorithm for UCAV path planningrdquo International Jour-nal of Hybrid Information Technology vol 5 no 3 pp 123ndash1442012

[51] A H Gandomi X S Yang S Talatahari and A H AlavildquoFirefly algorithm with chaosrdquo Communications in NonlinearScience and Numerical Simulation vol 18 no 1 pp 89ndash98 2013

[52] Y Zhang D Huang M Ji and F Xie ldquoImage segmentationusing PSO and PCM with Mahalanobis distancerdquo Expert Sys-tems with Applications vol 38 no 7 pp 9036ndash9040 2011

[53] G G Wang L Guo A H Gandomi A H Alavi and H DuanldquoSimulated annealing-based krill herd algorithm for globaloptimizationrdquo Abstract and Applied Analysis vol 2013 ArticleID 213853 11 pages 2013

[54] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 9: Research Article An Effective Hybrid Firefly Algorithm with …downloads.hindawi.com/journals/tswj/2013/125625.pdf · 2019. 7. 31. · An Effective Hybrid Firefly Algorithm with Harmony

The Scientific World Journal 9

[37] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo inProceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 CoimbatoreIndia December 2009

[38] A H Gandomi S Talatahari X S Yang and S Deb ldquoDesignoptimization of truss structures using cuckoo search algorithmrdquoThe Structural Design of Tall and Special Buildings vol 22 no17 pp 1330ndash1349 2013

[39] X Cai S Fan and Y Tan ldquoLight responsive curve selection forphotosynthesis operator of APOArdquo International Journal of Bio-Inspired Computation vol 4 no 6 pp 373ndash379 2012

[40] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008

[41] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325–2336, 2011.

[42] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2008.

[43] X. S. Yang, "Firefly algorithms for multimodal optimization," in Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, pp. 169–178, Springer, Sapporo, Japan, 2009.

[44] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.

[45] X.-S. Yang, S. S. S. Hosseini, and A. H. Gandomi, "Firefly algorithm for solving non-convex economic dispatch problems with valve loading effect," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1180–1186, 2012.

[46] R. Parpinelli and H. Lopes, "New inspirations in swarm intelligence: a survey," International Journal of Bio-Inspired Computation, vol. 3, no. 1, pp. 1–16, 2011.

[47] D. Zou, L. Gao, J. Wu, and S. Li, "Novel global harmony search algorithm for unconstrained problems," Neurocomputing, vol. 73, no. 16–18, pp. 3308–3318, 2010.

[48] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, 2012.

[49] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasgetiren, "Dynamic multi-swarm particle swarm optimizer with harmony search," Expert Systems with Applications, vol. 38, no. 4, pp. 3735–3742, 2011.

[50] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, "A modified firefly algorithm for UCAV path planning," International Journal of Hybrid Information Technology, vol. 5, no. 3, pp. 123–144, 2012.

[51] A. H. Gandomi, X. S. Yang, S. Talatahari, and A. H. Alavi, "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.

[52] Y. Zhang, D. Huang, M. Ji, and F. Xie, "Image segmentation using PSO and PCM with Mahalanobis distance," Expert Systems with Applications, vol. 38, no. 7, pp. 9036–9040, 2011.

[53] G. G. Wang, L. Guo, A. H. Gandomi, A. H. Alavi, and H. Duan, "Simulated annealing-based krill herd algorithm for global optimization," Abstract and Applied Analysis, vol. 2013, Article ID 213853, 11 pages, 2013.

[54] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
