Research Article

Bare-Bones Teaching-Learning-Based Optimization
Feng Zou,1,2 Lei Wang,1 Xinhong Hei,1 Debao Chen,2 Qiaoyong Jiang,1 and Hongye Li1

1 School of Computer Science and Engineering, Xi'an University of Technology, Xi'an 710048, China
2 School of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000, China

Correspondence should be addressed to Lei Wang; wangleeei@163.com

Received 20 February 2014; Accepted 7 April 2014; Published 10 June 2014

Academic Editors: S. Balochian and Y. Zhang

Copyright © 2014 Feng Zou et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Teaching-learning-based optimization (TLBO), which simulates the teaching-learning process of a classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, namely a hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learning strategy of the learner phase in the standard TLBO or a new neighborhood search strategy. To verify the performance of our approach, 20 benchmark functions and two real-world problems are utilized. From the conducted experiments, it can be observed that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms.
1. Introduction
Many real-life optimization problems are becoming more and more complex and difficult with the development of science and technology, so how to solve these complex problems in an exact manner within a reasonable time is very important. Traditional optimization algorithms have difficulty solving such complex nonlinear problems. In recent years, nature-inspired optimization algorithms, which simulate natural phenomena and have different design philosophies and characteristics, such as evolutionary algorithms [1–3] and swarm intelligence algorithms [4–7], have formed a research field that draws on different natural phenomena to solve a wide range of problems. In these algorithms, the convergence rate is given prime importance for solving real-world optimization problems: the ability of an algorithm to obtain the global optimum is one aspect, and faster convergence is the other.
As a stochastic search scheme, TLBO [8, 9] is a newly developed population-based algorithm based on swarm intelligence that has the characteristics of simple computation and rapid convergence; it has been extended to function optimization, engineering optimization, multiobjective optimization, clustering, and so forth [9–17]. TLBO is a parameter-free evolutionary technique and is also gaining popularity due to its ability to achieve better results with comparatively faster convergence than genetic algorithms (GA) [1], particle swarm optimization (PSO) [5], and the artificial bee colony algorithm (ABC) [6]. However, in evolutionary computation research there have always been attempts to improve any given findings further and further. This work is an attempt to improve the convergence characteristics of TLBO without sacrificing the accuracies obtained by TLBO, and on some occasions even to better those accuracies. The aims of this paper are threefold. First, the authors propose an improved version of TLBO, namely BBTLBO. Next, the proposed technique is validated on unimodal and multimodal functions based on different performance indicators, and the results of BBTLBO are compared with those of other algorithms; the results are also compared using a statistical paired t-test. Thirdly, it is applied to solve real-world optimization problems.
The remainder of this paper is organized as follows. The TLBO algorithm is introduced in Section 2. Section 3 presents a brief overview of some recently proposed bare-bones algorithms. Section 4 describes the improved teaching-learning-based optimization algorithm using neighborhood search (BBTLBO). Section 5 presents the tests on several benchmark functions, and the experiments are conducted along with statistical tests. The applications to training artificial neural networks are shown in Section 6. Conclusions are given in Section 7.

Hindawi Publishing Corporation, The Scientific World Journal, Volume 2014, Article ID 136920, 17 pages, http://dx.doi.org/10.1155/2014/136920

(1) Begin
(2)   Initialize N (number of learners) and D (number of dimensions)
(3)   Initialize learners X and evaluate all learners X
(4)   Denote the best learner as Teacher and the mean of all learners X as Mean
(5)   while (stopping condition not met)
(6)     for each learner X_i of the class  % Teaching phase
(7)       TF = round(1 + rand(0, 1))
(8)       for j = 1 : D
(9)         newX_ij = X_ij + rand(0, 1) * (Teacher(j) − TF * Mean(j))
(10)      endfor
(11)      Accept newX_i if f(newX_i) is better than f(X_i)
(12)    endfor
(13)    for each learner X_i of the class  % Learning phase
(14)      Randomly select one learner X_k, such that i ≠ k
(15)      if f(X_i) is better than f(X_k)
(16)        for j = 1 : D
(17)          newX_ij = X_ij + rand(0, 1) * (X_ij − X_kj)
(18)        endfor
(19)      else
(20)        for j = 1 : D
(21)          newX_ij = X_ij + rand(0, 1) * (X_kj − X_ij)
(22)        endfor
(23)      endif
(24)      Accept newX_i if f(newX_i) is better than f(X_i)
(25)    endfor
(26)    Update the Teacher and the Mean
(27)  endwhile
(28) end

Algorithm 1: TLBO().
2. Teaching-Learning-Based Optimization
Rao et al. [8, 9] first proposed teaching-learning-based optimization (TLBO), inspired by the philosophy of teaching and learning. The TLBO algorithm is based on the effect of the influence of a teacher on the output of learners in a class, which is considered in terms of results or grades. The working process of TLBO is divided into two parts: the first part is the "teacher phase" and the second part is the "learner phase." The "teacher phase" means learning from the teacher, and the "learner phase" means learning through interaction between learners.
A good teacher is one who brings his or her learners up to his or her own level of knowledge. In practice this is not possible, and a teacher can only move the mean of a class up to some extent, depending on the capability of the class. This follows a random process depending on many factors. Let M be the mean and T be the teacher at any iteration. T will try to move the mean M toward its own level, so the new mean will be T, designated as M_new. The solution is updated according to the difference between the existing and the new mean, according to the following expression:
newX = X + r × (M_new − TF × M),  (1)
where TF is a teaching factor that decides the value of the mean to be changed and r is a random vector in which each element is a random number in the range [0, 1]. The value of TF can be either 1 or 2, which is again a heuristic step, decided randomly with equal probability as
TF = round[1 + rand(0, 1)].  (2)
Learners increase their knowledge by two different means: one through input from the teacher and the other through interaction among themselves. A learner interacts randomly with other learners with the help of group discussions, presentations, formal communications, and so forth. A learner learns something new if the other learner has more knowledge than him or her. Learner modification is expressed as
newX_i = { X_i + r ∗ (X_i − X_j), if f(X_i) < f(X_j),
         { X_i + r ∗ (X_j − X_i), otherwise.  (3)
As explained above, the pseudocode for the implementation of TLBO is summarized in Algorithm 1.
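The two phases of Algorithm 1 can be sketched as a runnable Python implementation. This is a minimal illustration under our own naming (`tlbo_step`, `f`), not the authors' code; it performs one generation of TLBO on a class of learners, minimizing an objective `f`:

```python
import random

def tlbo_step(X, f):
    """One generation of standard TLBO over a class X (list of learners,
    each a list of floats), minimizing the objective f."""
    D = len(X[0])
    teacher = min(X, key=f)  # best learner acts as the teacher
    mean = [sum(x[j] for x in X) / len(X) for j in range(D)]

    # Teacher phase: move each learner toward the teacher.
    for i, xi in enumerate(X):
        TF = random.randint(1, 2)  # teaching factor TF = round(1 + rand(0,1))
        new = [xi[j] + random.random() * (teacher[j] - TF * mean[j])
               for j in range(D)]
        if f(new) < f(xi):  # greedy selection
            X[i] = new

    # Learner phase: learn through interaction with a random classmate.
    for i, xi in enumerate(X):
        k = random.choice([m for m in range(len(X)) if m != i])
        xk = X[k]
        if f(xi) < f(xk):
            new = [xi[j] + random.random() * (xi[j] - xk[j]) for j in range(D)]
        else:
            new = [xi[j] + random.random() * (xk[j] - xi[j]) for j in range(D)]
        if f(new) < f(xi):
            X[i] = new
    return X
```

Iterating `tlbo_step` until a fixed budget of function evaluations is exhausted reproduces the behavior of Algorithm 1; the Teacher and Mean are recomputed at the start of every generation.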
3. Bare-Bones Algorithms
In this section, we present only a brief overview of some recently proposed bare-bones algorithms.
3.1. BBPSO and BBExp. PSO is a swarm-intelligence-based algorithm inspired by the behavior of bird flocking [5]. In PSO, each particle is attracted by its personal best position (pbest) and the global best position (gbest) found so far. Theoretical studies [18, 19] proved that each particle converges to the weighted average of pbest and gbest:

lim_{t→∞} X_i(t) = (c1 · gbest + c2 · pbest) / (c1 + c2),  (4)

where c1 and c2 are the two learning factors in PSO.
Based on the convergence characteristic of PSO, Kennedy [20] proposed a new PSO variant called bare-bones PSO (BBPSO). Bare-bones PSO retains the standard PSO social communication but replaces the dynamical particle update with sampling from a probability distribution based on gbest and pbest_i, as follows:

x_ij(t + 1) = N((gbest_j + pbest_ij(t)) / 2, |gbest_j − pbest_ij(t)|),  (5)

where x_ij(t + 1) is the jth dimension of the ith particle in the population and N represents a Gaussian distribution with mean (gbest_j + pbest_ij(t)) / 2 and standard deviation |gbest_j − pbest_ij(t)|.
Kennedy [20] also proposed an alternative version of BBPSO, denoted BBExp, where (5) is replaced by

x_ij(t + 1) = { N((gbest_j + pbest_ij(t)) / 2, |gbest_j − pbest_ij(t)|), if rand(0, 1) > 0.5,
             { pbest_ij(t), otherwise,  (6)

where rand(0, 1) is a random value within [0, 1] for the jth dimension. With this alternative mechanism, there is a 50% chance that the search process focuses on the previous best positions.
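Both sampling rules can be sketched as a single per-particle update. The following is a minimal illustration with names of our own choosing (`bbpso_update`, `exploit`), where `exploit=True` selects the BBExp rule of (6):

```python
import random

def bbpso_update(x, pbest, gbest, exploit=False):
    """Bare-bones PSO update: sample each dimension from a Gaussian whose
    mean is the midpoint of gbest and pbest and whose standard deviation
    is their distance (eq. (5)). With exploit=True, each dimension keeps
    pbest with probability 0.5 instead (the BBExp rule, eq. (6))."""
    new = []
    for j in range(len(x)):
        if exploit and random.random() <= 0.5:
            new.append(pbest[j])  # focus on the previous best position
        else:
            mu = (gbest[j] + pbest[j]) / 2.0
            sd = abs(gbest[j] - pbest[j])
            new.append(random.gauss(mu, sd))
    return new
```

Note that when pbest and gbest coincide the standard deviation is zero and the particle stops moving, which is exactly the convergence point described by eq. (4).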
3.2. BBDE, GBDE, and MGBDE. Inspired by BBPSO and DE, Omran et al. [21] proposed a new and efficient DE variant called bare-bones differential evolution (BBDE). The BBDE is an almost parameter-free optimization algorithm that hybridizes the bare-bones particle swarm optimizer and differential evolution. Differential evolution is used to mutate, for each particle, the attractor associated with that particle, defined as a weighted average of its personal and neighborhood best positions. For the BBDE, the individual is updated as follows:

x_ij(t + 1) = { p_{i3,j}(t) + r2 · (x_{i1,j}(t) − x_{i2,j}(t)), if rand(0, 1) > CR,
             { pbest_{i3,j}(t), otherwise,  (7)

where i1, i2, and i3 are three indices chosen from the set {1, 2, ..., NP} with i1 ≠ i2 ≠ i, rand(0, 1) is a random value within [0, 1] for the jth dimension, and p_{ij}(t) is defined by

p_{ij}(t + 1) = r_{1j} · pbest_{ij}(t) + (1 − r_{1j}) · gbest_j(t),  (8)

where pbest and gbest are the personal best position and the global best position, and r_{1j} is a random value within [0, 1] for the jth dimension.

Based on the idea that Gaussian sampling is a fine-tuning procedure which starts during exploration and is continued into exploitation, Wang et al. [22] proposed a new parameter-free DE algorithm called GBDE. In GBDE, the mutation strategy uses a Gaussian sampling method, which is defined by

v_ij(t + 1) = { N((X_best,j(t) + x_ij(t)) / 2, |X_best,j(t) − x_ij(t)|), if rand(0, 1) ≤ CR or j = j_rand,
             { x_ij(t), otherwise,  (9)

where N represents a Gaussian distribution with mean (X_best,j(t) + x_ij(t)) / 2 and standard deviation |X_best,j(t) − x_ij(t)|, and CR is the probability of crossover.
To balance the global search ability and convergence rate, Wang et al. [22] proposed a modified GBDE (called MGBDE). The mutation strategy uses a hybridization of GBDE and DE/best/1 as follows:

v_ij(t + 1) = { X_best,j(t) + F · (x_{i1,j}(t) − x_{i2,j}(t)), if rand(0, 1) ≤ 0.5,
             { N((X_best,j(t) + x_ij(t)) / 2, |X_best,j(t) − x_ij(t)|), otherwise.  (10)
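The MGBDE mutation of eq. (10) can be sketched as follows. This is a minimal illustration under our own names; the crossover test of eq. (9) is omitted for brevity, so the Gaussian branch corresponds to GBDE's sampling rule:

```python
import random

def mgbde_mutation(i, X, xbest, F=0.5):
    """MGBDE mutation: with probability 0.5 use DE/best/1, otherwise
    sample from a Gaussian around the midpoint of xbest and X[i]
    (the GBDE rule)."""
    NP, D = len(X), len(X[0])
    i1, i2 = random.sample([m for m in range(NP) if m != i], 2)
    v = []
    for j in range(D):
        if random.random() <= 0.5:
            v.append(xbest[j] + F * (X[i1][j] - X[i2][j]))  # DE/best/1
        else:
            mu = (xbest[j] + X[i][j]) / 2.0
            sd = abs(xbest[j] - X[i][j])
            v.append(random.gauss(mu, sd))  # GBDE Gaussian sampling
    return v
```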
4. Proposed Algorithm: BBTLBO
The bare-bones PSO utilizes this information by sampling candidate solutions normally distributed around the formally derived attractor point. That is, the new position is generated by a Gaussian distribution sampling the search space based on gbest and pbest at the current iteration. As a result, the new position will be centered around the weighted average of pbest and gbest. Generally speaking, at the initial evolutionary stages the search process focuses on exploration due to the large deviation. With an increasing number of generations, the deviation becomes smaller and the search process focuses on exploitation. From the search behavior of BBPSO, Gaussian sampling is a fine-tuning procedure which starts during exploration and is continued into exploitation. This can be beneficial for the search of many evolutionary optimization algorithms. Additionally, the bare-bones PSO has no parameters to be tuned.
Based on the previous explanation, a new bare-bones TLBO (BBTLBO) with neighborhood search is proposed in this paper. In fact, for TLBO, if the new learner has a better function value than that of the old learner, it replaces the old one in the memory; otherwise, the old one is retained in the memory. In other words, a greedy selection mechanism is employed as the selection operation between the old solution and the candidate one. Hence, the new teacher and the new learner are the global best (gbest) and the learner's personal best (pbest) found so far, respectively. The complete flowchart of the BBTLBO algorithm is shown in Figure 1.

Figure 1: Flowchart showing the working of the BBTLBO algorithm.
4.1. Neighborhood Search. It is known that birds of a feather flock together and people of like mind fall into the same group. Just like evolutionary algorithms themselves, the notion of neighborhood is inspired by nature. The neighborhood technique is an efficient method to maintain diversity of the solutions. It plays an important role in evolutionary algorithms and is often introduced by researchers in order to maintain a population of diverse individuals and improve the exploration capability of population-based heuristic algorithms [23–26]. In fact, learners with similar interests form different learning groups, and because of his or her own characteristics, a learner may learn from the excellent individuals in the learning group.
For the implementation of grouping, various types of connection distances may be used. Here we have used a ring topology [27] based on the indexes of learners, for the sake of simplicity. In a ring topology, the first individual is the neighbor of the last individual, and vice versa. Based on the ring topology, a k-neighborhood radius is defined, where k is a predefined integer number. For each individual, its k-neighborhood consists of 2k + 1 individuals (including itself), namely X_{i−k}, ..., X_i, ..., X_{i+k}. That is, the neighborhood size is 2k + 1 for a k-neighborhood. For simplicity, k is set to 1 (Figure 2) in our algorithm. This means that there are 3 individuals in each learning group. Once groups are constructed, we can utilize them for updating the learners of the corresponding group.

Figure 2: Ring neighborhood topology with three members.
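The ring neighborhood described above can be sketched as a small index helper (an illustration; the function name is ours):

```python
def ring_neighbors(i, NP, k=1):
    """Indices of learner i's k-neighborhood on a ring of NP learners:
    2k + 1 members including i itself, wrapping around at both ends."""
    return [(i + d) % NP for d in range(-k, k + 1)]
```

With the default k = 1, each learning group has the three members shown in Figure 2; NTeacher and NMean below are then the best and the mean of these 2k + 1 learners.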
4.2. Teacher Phase. To balance the global and local search ability, a modified interactive learning strategy is proposed in the teacher phase. In this learning phase, each learner employs an interactive learning strategy (a hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning) based on neighborhood search.
In BBTLBO, the updating formula of the learning for a learner X_i in the teacher phase is proposed as the hybridization of the learning strategy of the teacher phase and Gaussian sampling learning, as follows:

V1_j(t + 1) = X_ij(t) + rand(0, 1) · (NTeacher_ij(t) − TF · NMean_ij(t)),
V2_j(t + 1) = N((NTeacher_ij(t) + NMean_ij(t)) / 2, |NTeacher_ij(t) − NMean_ij(t)|),
newX_ij(t + 1) = u · V1_j(t + 1) + (1 − u) · V2_j(t + 1),  (11)

where u, called the hybridization factor, is a random number in the range [0, 1] for the jth dimension, NTeacher and NMean are the existing neighborhood best solution and the neighborhood mean solution of each learner, and TF is a teaching factor, which can be either 1 or 2 randomly.
In BBTLBO, there is a (u × 100)% chance that the jth dimension of the ith learner in the population follows the behavior of the learning strategy of the teacher phase, while the remaining (100 − u × 100)% follow the search behavior of Gaussian sampling in the teacher phase. This helps balance the advantages of a fast convergence rate (the attraction of the teacher-phase learning strategy) and exploration (Gaussian sampling) in BBTLBO.
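The teacher-phase update of eq. (11) can be sketched per learner as follows. This is a minimal illustration under our own names; here u is passed as a fixed value, whereas the paper draws it per dimension:

```python
import random

def bbtlbo_teacher_update(xi, nteacher, nmean, u=0.9):
    """Blend the TLBO teacher step V1 with a Gaussian sample V2 around
    the neighborhood teacher/mean midpoint, per dimension (eq. (11))."""
    TF = random.randint(1, 2)  # teaching factor
    new = []
    for j in range(len(xi)):
        v1 = xi[j] + random.random() * (nteacher[j] - TF * nmean[j])
        mu = (nteacher[j] + nmean[j]) / 2.0
        sd = abs(nteacher[j] - nmean[j])
        v2 = random.gauss(mu, sd)
        new.append(u * v1 + (1 - u) * v2)  # hybridization
    return new
```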
4.3. Learner Phase. At the same time, in the learner phase, a learner interacts randomly with other learners to enhance his or her knowledge in the class. This learning method can be treated as the global search strategy (shown in (3)).
In this paper, we introduce a new learning strategy in which each learner learns from the neighborhood teacher and from another learner selected randomly from his or her corresponding neighborhood in the learner phase. This learning method can be treated as the neighborhood search strategy. Let newX_i represent the interactive learning result of the learner X_i. This neighborhood search strategy can be expressed as follows:

newX_ij = X_ij + r1 ∗ (NTeacher_ij − X_ij) + r2 ∗ (X_ij − X_kj),  (12)

where r1 and r2 are random vectors in which each element is a random number in the range [0, 1], NTeacher is the teacher of the learner X_i's corresponding neighborhood, and the learner X_k is selected randomly from that neighborhood.

In BBTLBO, each learner probabilistically learns by means of the global search strategy or the neighborhood search strategy in the learner phase. That is, about 50% of the learners in the population execute the learning strategy of the learner phase in the standard TLBO (shown in (3)), while the remaining 50% execute the neighborhood search strategy (shown in (12)). This helps balance global search and local search in the learner phase.
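The neighborhood search strategy of eq. (12) can be sketched as follows (an illustrative function of our own naming):

```python
import random

def bbtlbo_learner_update(xi, xk, nteacher):
    """Learner-phase neighborhood search (eq. (12)): move toward the
    neighborhood teacher plus a differential term against a randomly
    selected neighbor xk."""
    new = []
    for j in range(len(xi)):
        r1, r2 = random.random(), random.random()
        new.append(xi[j] + r1 * (nteacher[j] - xi[j]) + r2 * (xi[j] - xk[j]))
    return new
```

In the full algorithm this update is applied with probability 0.5; otherwise the global search of eq. (3) is used.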
Moreover, compared to the original TLBO, BBTLBO only modifies the learning strategies. Therefore, both the original TLBO and BBTLBO have the same time complexity O(NP · D · Gen_max), where NP is the size of the population, D is the number of dimensions, and Gen_max is the maximum number of generations.

As explained above, the pseudocode for the implementation of BBTLBO is summarized in Algorithm 2.
5. Function Optimization
In this section, to illustrate the effectiveness of the proposed method, 20 benchmark functions are used to test the efficiency of BBTLBO. To compare the search performance of BBTLBO with that of other methods, several other algorithms are also simulated in this paper.
5.1. Benchmark Functions. The details of the 20 benchmark functions are shown in Table 1. Among the 20 benchmark functions, F1 to F9 are unimodal functions and F10 to F20 are multimodal functions. The search ranges and theoretical optima for all functions are also shown in Table 1.
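For concreteness, a few of the benchmarks of Table 1 written out in Python (our own transcription of the standard definitions):

```python
import math

def sphere(x):
    """F1: unimodal Sphere function, optimum 0 at the origin."""
    return sum(v * v for v in x)

def ackley(x):
    """F10: multimodal Ackley function, optimum 0 at the origin."""
    D = len(x)
    s1 = sum(v * v for v in x) / D
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / D
    return 20 - 20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + math.e

def rastrigin(x):
    """F11: multimodal Rastrigin function, optimum 0 at the origin."""
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)
```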
5.2. Parameter Settings. All the experiments are carried out on the same machine with a Celeron 2.26 GHz CPU, 2 GB memory, and Windows XP operating system with Matlab 7.9.
(1) Begin
(2)   Initialize N (number of learners), D (number of dimensions), and hybridization factor u
(3)   Initialize learners X and evaluate all learners X
(4)   while (stopping condition not met)
(5)     for each learner X_i of the class  % Teaching phase
(6)       TF = round(1 + rand(0, 1))
(7)       Denote the NTeacher and the NMean in its neighborhood for each learner
(8)       Update each learner according to (11)
(9)       Accept newX_i if f(newX_i) is better than f(X_i)
(10)    endfor
(11)    for each learner X_i of the class  % Learning phase
(12)      Randomly select one learner X_k, such that i ≠ k
(13)      if rand(0, 1) < 0.5
(14)        Update each learner according to (3)
(15)      else
(16)        Denote the NTeacher in its neighborhood for each learner
(17)        Update each learner according to (12)
(18)      endif
(19)      Accept newX_i if f(newX_i) is better than f(X_i)
(20)    endfor
(21)  endwhile
(22) end

Algorithm 2: BBTLBO().
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms, the population size was set to 20. The population-based stochastic algorithms use the same stopping criterion, that is, reaching a given number of function evaluations (FEs).
5.3. Effect of Variation in Parameter u. The hybridization factor u is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed using these different values of u. In our experiment, the maximal number of FEs is used as the termination condition of the algorithm, namely 40000 for all test functions. Table 2 shows the mean optimum solutions and the standard deviation of the solutions obtained using the different hybridization factors u over the 50 independent runs. The best results among the settings are shown in bold. Figure 3 presents representative convergence graphs of different benchmark functions in terms of the mean fitness values achieved using the different hybridization factors u. Due to tight space limitations, only some sample graphs are illustrated.

The comparisons in Table 2 and Figure 3 show that when the hybridization factor u is set to 0.9, BBTLBO offers the best performance on the 20 test functions. Hence, the hybridization factor u is set to 0.9 in the following experiments.
5.4. Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section, we compare BBTLBO with five other recently proposed algorithms: three bare-bones DE variants and two bare-bones PSO algorithms, namely BBPSO [20], BBExp [20], BBDE [21], GBDE [22], and MGBDE [22]. Our experiment includes two series of comparisons, in terms of the solution accuracy and the solution convergence (convergence speed and success rate).
5.4.1. Comparisons of the Solution Accuracy. In our experiment, the maximal number of FEs is used as the termination condition of the algorithm, namely 40000 for all test functions. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained over the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the 7 algorithms over 50 independent runs. Due to tight space limitations, only some sample graphs are illustrated.
From Table 3, it can be observed that in terms of the mean optimum solution and the standard deviation, all algorithms perform well on functions F15 and F17. Although BBExp performs better than BBTLBO on function F9 and MGBDE performs better than BBTLBO on function F20, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO has good solution accuracy on the test functions in this paper.
5.4.2. Comparison of the Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of the different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In our experiment, the stopping criterion is that each algorithm is terminated when the best fitness value so far is below the predefined threshold value (TValue) or the number of FEs reaches
Table 1: Details of numerical benchmarks used.

Function | Formula | D | Range | Optima
Sphere | F1(x) = Σ_{i=1}^{D} x_i^2 | 30 | [−100, 100] | 0
Sum square | F2(x) = Σ_{i=1}^{D} i·x_i^2 | 30 | [−100, 100] | 0
Quadric | F3(x) = Σ_{i=1}^{D} i·x_i^4 + random(0, 1) | 30 | [−1.28, 1.28] | 0
Step | F4(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2 | 30 | [−100, 100] | 0
Schwefel 1.2 | F5(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2 | 30 | [−100, 100] | 0
Schwefel 2.21 | F6(x) = max{|x_i|, 1 ≤ i ≤ D} | 30 | [−100, 100] | 0
Schwefel 2.22 | F7(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i| | 30 | [−10, 10] | 0
Zakharov | F8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 | 30 | [−100, 100] | 0
Rosenbrock | F9(x) = Σ_{i=1}^{D−1} [100(x_i^2 − x_{i+1})^2 + (x_i − 1)^2] | 30 | [−2.048, 2.048] | 0
Ackley | F10(x) = 20 − 20 exp(−(1/5)·sqrt((1/D) Σ_{i=1}^{D} x_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2πx_i)) + e | 30 | [−32, 32] | 0
Rastrigin | F11(x) = Σ_{i=1}^{D} (x_i^2 − 10 cos(2πx_i) + 10) | 30 | [−5.12, 5.12] | 0
Weierstrass | F12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{kmax} [a^k cos(2πb^k (x_i + 0.5))]) − D Σ_{k=0}^{kmax} [a^k cos(2πb^k · 0.5)], a = 0.5, b = 3, kmax = 20 | 30 | [−0.5, 0.5] | 0
Griewank | F13(x) = Σ_{i=1}^{D} (x_i^2 / 4000) − Π_{i=1}^{D} cos(x_i / sqrt(i)) + 1 | 30 | [−600, 600] | 0
Schwefel | F14(x) = 418.9829·D + Σ_{i=1}^{D} (−x_i sin(sqrt(|x_i|))) | 30 | [−500, 500] | 0
Bohachevsky1 | F15(x) = x_1^2 + 2x_2^2 − 0.3 cos(3πx_1) − 0.4 cos(4πx_2) + 0.7 | 2 | [−100, 100] | 0
Bohachevsky2 | F16(x) = x_1^2 + 2x_2^2 − 0.3 cos(3πx_1)·cos(4πx_2) + 0.3 | 2 | [−100, 100] | 0
Bohachevsky3 | F17(x) = x_1^2 + 2x_2^2 − 0.3 cos((3πx_1) + (4πx_2)) + 0.3 | 2 | [−100, 100] | 0
Shekel5 | F18(x) = −Σ_{i=1}^{5} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.1532
Shekel7 | F19(x) = −Σ_{i=1}^{7} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.4029
Shekel10 | F20(x) = −Σ_{i=1}^{10} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.5364
the maximal FEs, 40000. The results are shown in Table 5 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the success rate (SR) over the 50 independent runs. "NaN" represents that no run of the corresponding algorithm converged below the predefined threshold before meeting the maximum number of FEs. The best results among the six algorithms are shown in boldface.

From Table 5, it can be observed that all algorithms hardly converge to the threshold for unimodal functions F3, F5, F6, and F8 and multimodal functions F11, F12, and F14. BBTLBO converges to the threshold except for functions F3, F9, and F14. From the results of the total average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and the majority of multimodal functions, except for functions F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly 10 for functions F1, F2, F4, F7, F9, F10, and F13. From the results of the total average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO has a good performance in convergence speed and success rate (SR) on the test functions in this paper.
Table 2: Comparisons (mean ± std) of the solutions using different u.

Fun | u = 0.0 | u = 0.1 | u = 0.3 | u = 0.5 | u = 0.7 | u = 0.9 | u = 1.0
--- | --- | --- | --- | --- | --- | --- | ---
F1 | 1.75e-001 ± 1.21e+000 | 6.89e-071 ± 1.01e-070 | 1.23e-163 ± 0.0 | 1.21e-256 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F2 | 8.98e-005 ± 5.73e-004 | 5.62e-069 ± 2.72e-068 | 2.20e-161 ± 1.12e-160 | 2.43e-254 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F3 | 1.20e-001 ± 6.34e-002 | 5.91e-003 ± 1.44e-003 | 1.01e-003 ± 3.48e-004 | 4.35e-004 ± 1.97e-004 | 2.35e-004 ± 1.30e-004 | 2.27e-004 ± 1.26e-004 | 1.99e-004 ± 1.13e-004
F4 | 7.65e+002 ± 5.83e+002 | 4.80e-001 ± 8.86e-001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 5.58e+002 ± 6.53e+002 | 1.87e-028 ± 5.73e-028 | 3.53e-054 ± 1.86e-053 | 3.69e-073 ± 2.27e-072 | 9.53e-096 ± 6.74e-095 | 2.16e-115 ± 1.10e-114 | 2.56e-100 ± 1.30e-099
F6 | 2.51e+001 ± 5.34e+000 | 6.67e-021 ± 8.81e-021 | 2.81e-061 ± 6.36e-061 | 8.22e-100 ± 1.80e-099 | 8.18e-137 ± 1.41e-136 | 3.63e-154 ± 1.34e-153 | 8.86e-147 ± 3.22e-146
F7 | 1.37e-003 ± 9.54e-003 | 8.72e-043 ± 1.52e-042 | 5.68e-088 ± 8.76e-088 | 1.01e-133 ± 2.38e-133 | 2.60e-175 ± 0.0 | 1.16e-188 ± 0.0 | 8.33e-180 ± 0.0
F8 | 2.41e+000 ± 3.07e+000 | 1.32e-019 ± 2.98e-019 | 2.13e-028 ± 7.69e-028 | 3.44e-037 ± 1.24e-036 | 2.20e-050 ± 9.12e-050 | 1.07e-056 ± 4.39e-056 | 2.03e-049 ± 8.94e-049
F9 | 2.66e+001 ± 1.79e+000 | 2.72e+001 ± 3.17e-001 | 2.77e+001 ± 3.18e-001 | 2.83e+001 ± 2.78e-001 | 2.84e+001 ± 2.67e-001 | 2.83e+001 ± 3.41e-001 | 2.80e+001 ± 3.87e-001
F10 | 8.30e+000 ± 1.76e+000 | 1.77e-001 ± 6.10e-001 | 5.90e-015 ± 1.70e-015 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0
F11 | 3.74e+001 ± 9.05e+000 | 3.33e+001 ± 1.18e+001 | 2.71e+001 ± 8.00e+000 | 1.89e+001 ± 1.14e+001 | 5.73e+000 ± 1.06e+001 | 0.0 ± 0.0 | 0.0 ± 0.0
F12 | 8.15e+000 ± 1.93e+000 | 3.38e-001 ± 1.16e+000 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 5.06e-001 ± 8.08e-001 | 6.52e-003 ± 8.86e-003 | 1.78e-003 ± 3.68e-003 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 4.33e+003 ± 6.79e+002 | 4.67e+003 ± 6.10e+002 | 5.17e+003 ± 6.68e+002 | 5.59e+003 ± 6.85e+002 | 5.53e+003 ± 7.10e+002 | 5.58e+003 ± 7.80e+002 | 5.40e+003 ± 6.53e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -7.71e+000 ± 3.47e+000 | -8.06e+000 ± 3.39e+000 | -9.64e+000 ± 1.81e+000 | -9.65e+000 ± 1.76e+000 | -1.02e+001 ± 6.77e-003 | -9.85e+000 ± 1.22e+000 | -9.93e+000 ± 1.12e+000
F19 | -7.69e+000 ± 3.52e+000 | -8.13e+000 ± 3.36e+000 | -9.87e+000 ± 1.83e+000 | -1.03e+001 ± 9.45e-001 | -9.76e+000 ± 1.95e+000 | -9.82e+000 ± 1.78e+000 | -9.61e+000 ± 1.99e+000
F20 | -8.12e+000 ± 3.53e+000 | -9.38e+000 ± 2.69e+000 | -1.01e+001 ± 1.65e+000 | -1.01e+001 ± 1.61e+000 | -9.70e+000 ± 2.28e+000 | -9.41e+000 ± 2.43e+000 | -1.00e+001 ± 1.69e+000
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
--- | --- | --- | --- | --- | --- | ---
F1 | 5.44e-027 ± 1.87e-026 | 2.62e-024 ± 5.00e-024 | 3.90e-035 ± 2.00e-034 | 4.35e-022 ± 1.13e-021 | 3.35e-035 ± 2.11e-034 | 0.0 ± 0.0
F2 | 1.38e+004 ± 2.11e+004 | 1.00e+003 ± 4.63e+003 | 6.20e-021 ± 4.38e-020 | 1.40e+003 ± 4.52e+003 | 1.28e-032 ± 8.37e-032 | 0.0 ± 0.0
F3 | 1.32e+000 ± 3.18e+000 | 2.22e-002 ± 7.55e-003 | 1.64e-002 ± 9.57e-003 | 2.49e-002 ± 9.88e-003 | 1.16e-002 ± 5.26e-003 | 2.27e-004 ± 1.26e-004
F4 | 5.60e+000 ± 9.28e+000 | 9.60e-001 ± 4.27e+000 | 7.89e+001 ± 3.05e+002 | 8.40e-001 ± 9.12e-001 | 1.08e+000 ± 1.28e+000 | 0.0 ± 0.0
F5 | 1.24e+004 ± 6.66e+003 | 4.41e+003 ± 3.37e+003 | 2.09e+000 ± 4.00e+000 | 5.36e+003 ± 3.26e+003 | 7.57e+002 ± 1.16e+003 | 2.16e-115 ± 1.10e-114
F6 | 1.67e+001 ± 9.19e+000 | 1.20e+000 ± 5.22e-001 | 1.39e+001 ± 4.47e+000 | 3.60e-001 ± 1.95e-001 | 1.10e+000 ± 2.94e+000 | 3.63e-154 ± 1.34e-153
F7 | 2.34e+001 ± 1.32e+001 | 1.00e+000 ± 3.03e+000 | 4.06e-019 ± 2.15e-018 | 6.00e-001 ± 2.40e+000 | 2.00e-001 ± 1.41e+000 | 1.16e-188 ± 0.0
F8 | 1.87e+002 ± 1.34e+002 | 1.58e+002 ± 7.00e+001 | 1.16e-001 ± 2.35e-001 | 1.72e+002 ± 6.67e+001 | 2.49e+001 ± 1.99e+001 | 1.07e-056 ± 4.39e-056
F9 | 7.07e+001 ± 1.48e+002 | 3.57e+001 ± 2.50e+001 | 2.76e+001 ± 1.06e+001 | 3.17e+001 ± 2.07e+001 | 2.76e+001 ± 1.46e+001 | 2.83e+001 ± 3.41e-001
F10 | 1.06e+001 ± 9.29e+000 | 1.52e+000 ± 5.11e+000 | 1.34e+000 ± 1.15e+000 | 2.59e+000 ± 6.45e+000 | 5.54e-001 ± 2.79e+000 | 3.55e-015 ± 0.0
F11 | 1.16e+002 ± 3.53e+001 | 1.81e+001 ± 7.28e+000 | 6.76e+001 ± 3.89e+001 | 1.55e+001 ± 5.96e+000 | 2.03e+001 ± 9.23e+000 | 0.0 ± 0.0
F12 | 2.73e+000 ± 2.11e+000 | 1.20e-001 ± 4.42e-001 | 1.73e+000 ± 1.32e+000 | 1.21e-001 ± 3.37e-001 | 5.17e-001 ± 8.67e-001 | 0.0 ± 0.0
F13 | 2.14e-002 ± 4.11e-002 | 2.30e-003 ± 4.29e-003 | 4.07e-002 ± 4.89e-002 | 3.08e-003 ± 7.42e-003 | 4.63e-003 ± 7.16e-003 | 0.0 ± 0.0
F14 | 3.64e+003 ± 6.28e+002 | 2.58e+003 ± 5.51e+002 | 2.30e+003 ± 4.09e+002 | 2.49e+003 ± 5.41e+002 | 2.60e+003 ± 5.05e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 4.37e-003 ± 3.09e-002 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -5.60e+000 ± 3.41e+000 | -7.90e+000 ± 2.74e+000 | -7.09e+000 ± 3.33e+000 | -7.63e+000 ± 2.86e+000 | -8.01e+000 ± 3.00e+000 | -9.85e+000 ± 1.22e+000
F19 | -5.97e+000 ± 3.31e+000 | -7.87e+000 ± 3.03e+000 | -6.21e+000 ± 3.66e+000 | -8.60e+000 ± 2.68e+000 | -8.37e+000 ± 2.90e+000 | -9.82e+000 ± 1.78e+000
F20 | -5.81e+000 ± 3.65e+000 | -9.40e+000 ± 2.42e+000 | -6.02e+000 ± 3.77e+000 | -9.46e+000 ± 2.24e+000 | -9.38e+000 ± 2.51e+000 | -9.41e+000 ± 2.43e+000
Figure 3: Comparison of the performance curves (mean fitness, or its log10, versus FEs up to 40000) using different u: (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiment, the maximal FEs are used as the stopping criterion of all algorithms, namely, 40000 for all test functions. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.
The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" results show that BBTLBO offers better accuracy on the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves (mean fitness, or its log10, versus FEs up to 40000) using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel.
Table 5 shows that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most of the complex multimodal optimization problems.
6 Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the proposed BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer
Figure 5: BBTLBO-based ANN. The BBTLBO algorithm adjusts the ANN so that its output y for input x matches the desired output d.
feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which in turn are connected to all
Table 4: The mean number of FEs (MFEs) and SR with acceptable solutions using different algorithms; each cell gives MFEs (SR %).

Fun | t value | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
--- | --- | --- | --- | --- | --- | --- | ---
F1 | 1E-8 | 15922 (100) | 17727 (100) | 11042 (100) | 19214 (100) | 11440 (100) | 1390 (100)
F2 | 1E-8 | 17515 (54) | 19179 (94) | 12243 (100) | 20592 (90) | 12634 (100) | 1500 (100)
F3 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0)
F4 | 1E-8 | 11710 (24) | 8120 (84) | 3634 (6) | 7343 (40) | 4704 (34) | 525 (100)
F5 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | 4100 (100)
F6 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | 2603 (100)
F7 | 1E-8 | 17540 (6) | 21191 (90) | 17314 (100) | 22684 (94) | 15322 (98) | 2144 (100)
F8 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | 9286 (100)
F9 | 1E-2 | 17073 (62) | 18404 (42) | 14029 (24) | 18182 (52) | 17200 (80) | NaN (0)
F10 | 1E-8 | 24647 (26) | 27598 (90) | 18273 (26) | 29172 (82) | 18320 (84) | 2110 (100)
F11 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | 2073 (100)
F12 | 1E-8 | NaN (0) | 25465 (50) | NaN (0) | 27317 (64) | 19704 (24) | 2471 (100)
F13 | 1E-8 | 16318 (32) | 21523 (58) | 11048 (16) | 22951 (64) | 14786 (58) | 1470 (100)
F14 | 1E-8 | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0) | NaN (0)
F15 | 1E-8 | 658 (100) | 1176 (100) | 1274 (100) | 1251 (100) | 1206 (100) | 799 (100)
F16 | 1E-8 | 657 (98) | 1251 (100) | 1294 (100) | 1343 (100) | 1308 (100) | 813 (100)
F17 | 1E-8 | 995 (100) | 2626 (100) | 1487 (100) | 2759 (100) | 1921 (100) | 973 (100)
F18 | -10.15 | 1752 (34) | 6720 (44) | 2007 (52) | 4377 (32) | 8113 (64) | 1684 (94)
F19 | -10.40 | 2839 (34) | 8585 (48) | 1333 (42) | 6724 (50) | 3056 (66) | 2215 (90)
F20 | -10.53 | 1190 (36) | 8928 (74) | 1115 (40) | 6548 (76) | 5441 (80) | 2822 (82)
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is given as follows:
L = (M + 1) * N + (N + 1) * K. (13)
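For instance, (13) can be checked with a one-line helper (the function name is illustrative, not from the paper):

```python
def num_ann_variables(M, N, K):
    """Number of weights and biases of a three-layer feed-forward network,
    per (13): input-to-hidden weights plus hidden biases, then
    hidden-to-output weights plus output biases."""
    return (M + 1) * N + (N + 1) * K

# The 1-5-1 network used later in this section has 16 variables.
print(num_ann_variables(1, 5, 1))  # -> 16
```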
For neural network training, the aim is to find a set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is given as follows:
MSE = (1 / (Q * K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij - y_ij)^2, (14)
where Q is the number of training samples, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.
In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:
y = sin(2x) exp(-2x). (15)
In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. The number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN,
200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the two methods. Figure 6 shows the predicted time series for training and test using the different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
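As a sketch of how a candidate solution is evaluated in this setting, the following assumes K = 1, an illustrative layout of the 16 weights and biases, and the target curve of (15) sampled on [0, 3] (the sampling interval is an assumption; the paper only states that 200 pairs are used):

```python
import numpy as np

def ann_forward(w, x):
    """1-5-1 feed-forward network: sigmoid hidden layer, linear output.
    w holds 16 values: 5 input-to-hidden weights, 5 hidden biases,
    5 hidden-to-output weights, 1 output bias (illustrative layout)."""
    w_ih, b_h = w[0:5], w[5:10]
    w_ho, b_o = w[10:15], w[15]
    h = 1.0 / (1.0 + np.exp(-(w_ih * x + b_h)))   # sigmoid hidden layer
    return float(np.dot(w_ho, h) + b_o)           # linear output unit

def mse(w, xs, ds):
    """Objective of (14) with K = 1: mean over Q patterns of (1/2)(d - y)^2."""
    return sum(0.5 * (d - ann_forward(w, x)) ** 2
               for x, d in zip(xs, ds)) / len(xs)

# 200 training pairs sampled from the target curve y = sin(2x) exp(-2x)
xs = np.linspace(0.0, 3.0, 200)
ds = np.sin(2.0 * xs) * np.exp(-2.0 * xs)
print(mse(np.zeros(16), xs, ds))   # error of the all-zero candidate
```

An optimizer such as BBTLBO would then minimize `mse` over the 16-dimensional weight vector.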
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:
u[k] = K_P * e[k] + K_I * Σ_{i=1}^{k} e[i] * Δt + K_D * (e[k] - e[k-1]) / Δt, (16)
where u[k] is the controller output, e[k] = r[k] - y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.
For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE) criterion, which is given as follows [32, 33]:
f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t)) dt + ω_3 t_r, (17)
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
--- | --- | --- | --- | --- | --- | ---
F1 | 3.63e-025 ± 1.85e-024 | 7.65e-025 ± 3.34e-024 | 9.23e-018 ± 3.03e-017 | 1.01e-002 ± 5.48e-003 | 3.05e-189 ± 0.0 | 0.0 ± 0.0
F2 | 1.49e-023 ± 6.69e-023 | 2.75e-025 ± 1.08e-024 | 3.68e-017 ± 5.37e-017 | 1.08e-001 ± 5.05e-002 | 1.29e-185 ± 0.0 | 0.0 ± 0.0
F3 | 3.22e-002 ± 2.83e-002 | 2.08e-002 ± 1.18e-002 | 1.28e-002 ± 5.50e-003 | 1.86e-002 ± 4.39e-003 | 5.70e-004 ± 2.37e-004 | 2.27e-004 ± 1.26e-004
F4 | 2.11e+001 ± 6.74e+001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 1.22e+002 ± 1.37e+002 | 4.28e+001 ± 2.59e+001 | 1.17e+001 ± 9.30e+000 | 2.60e+003 ± 6.79e+002 | 9.45e-043 ± 6.47e-042 | 2.16e-115 ± 1.10e-114
F6 | 3.06e+001 ± 8.50e+000 | 2.45e+000 ± 2.60e+000 | 4.67e-001 ± 2.82e-001 | 2.66e+000 ± 5.58e-001 | 2.08e-078 ± 4.30e-078 | 3.63e-154 ± 1.34e-153
F7 | 8.28e-019 ± 3.49e-018 | 5.40e-016 ± 3.81e-015 | 1.34e-011 ± 1.27e-011 | 1.70e-002 ± 2.85e-003 | 3.84e-096 ± 5.53e-096 | 1.16e-188 ± 0.0
F8 | 2.16e+000 ± 4.16e+000 | 4.88e-001 ± 5.82e-001 | 9.60e-002 ± 6.99e-002 | 5.86e+001 ± 1.70e+001 | 7.09e-022 ± 4.99e-021 | 1.07e-056 ± 4.39e-056
F9 | 2.49e+001 ± 1.05e+001 | 2.61e+001 ± 1.07e+000 | 2.40e+001 ± 1.52e+000 | 2.65e+001 ± 3.54e-001 | 2.55e+001 ± 5.01e-001 | 2.83e+001 ± 3.41e-001
F10 | 5.05e-001 ± 7.06e-001 | 2.07e-001 ± 4.58e-001 | 1.94e-001 ± 4.56e-001 | 2.16e-002 ± 4.37e-003 | 3.62e-015 ± 5.02e-016 | 3.55e-015 ± 0.0
F11 | 2.03e+000 ± 1.94e+000 | 3.86e+000 ± 1.97e+000 | 4.26e+001 ± 1.06e+001 | 1.15e+002 ± 1.54e+001 | 1.55e+001 ± 8.09e+000 | 0.0 ± 0.0
F12 | 2.88e-002 ± 1.45e-001 | 6.50e-002 ± 1.87e-001 | 7.89e-001 ± 1.03e+000 | 1.36e+000 ± 7.41e-001 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e-002 ± 3.58e-002 | 1.18e-002 ± 1.75e-002 | 1.16e-002 ± 1.58e-002 | 1.06e-001 ± 9.93e-002 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+002 ± 1.42e+002 | 1.35e+002 ± 1.26e+002 | 4.49e+003 ± 8.25e+002 | 3.96e+003 ± 8.40e+002 | 4.82e+003 ± 6.86e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -9.40e+000 ± 2.10e+000 | -9.25e+000 ± 2.30e+000 | -7.76e+000 ± 3.42e+000 | -9.79e+000 ± 1.44e+000 | -9.72e+000 ± 1.42e+000 | -9.85e+000 ± 1.22e+000
F19 | -9.85e+000 ± 1.90e+000 | -9.87e+000 ± 1.83e+000 | -9.24e+000 ± 2.70e+000 | -1.04e+001 ± 4.23e-009 | -9.22e+000 ± 2.41e+000 | -9.82e+000 ± 1.78e+000
F20 | -9.65e+000 ± 2.23e+000 | -1.01e+001 ± 1.59e+000 | -9.63e+000 ± 2.50e+000 | -1.05e+001 ± 1.01e-004 | -9.65e+000 ± 2.23e+000 | -9.41e+000 ± 2.43e+000
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | -
Figure 6: Comparison of the performance curves using different algorithms: (a) convergence curves (best value versus generation), (b) approximation curves (actual, TLBO, and BBTLBO outputs over the sample numbers), and (c) error curves for TLBO and BBTLBO.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean ± Std) | Testing error (Mean ± Std)
--- | --- | ---
TLBO | 9.85e-004 ± 9.26e-004 | 9.43e-004 ± 9.18e-004
BBTLBO | 3.45e-004 ± 2.02e-004 | 2.76e-004 ± 1.82e-004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.
To avoid overshoot, a penalty value is adopted in the cost function; that is, once overshoot occurs, the value of the overshoot is added to the cost function, which is then given as follows [32, 33]:
if dy(t) < 0
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t) + ω_4 |dy(t)|) dt + ω_3 t_r
else
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t)) dt + ω_3 t_r
end (18)
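A minimal sketch of the controller law (16) and the penalized cost (18); the first-order plant, the gains, and the rise-time definition here are illustrative assumptions for demonstration, not the actual plant of (19) or the tuned gains of the paper:

```python
def simulate_pid(Kp, Ki, Kd, dt=0.05, steps=2000,
                 w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Unit-step simulation of the discrete PID law (16) with the
    overshoot-penalized cost of (18). The plant dy/dt = -y + u is an
    illustrative first-order stand-in, not the plant of (19)."""
    y, integral, e_prev = 0.0, 0.0, 1.0
    cost, t_rise = 0.0, steps * dt               # fallback rise time
    for k in range(steps):
        e = 1.0 - y                              # e[k] = r[k] - y[k], r = 1
        integral += e * dt
        u = Kp * e + Ki * integral + Kd * (e - e_prev) / dt
        u = max(-10.0, min(10.0, u))             # control limited to [-10, 10]
        y_prev = y
        y += dt * (-y + u)                       # Euler step of the test plant
        if y >= 1.0 and t_rise == steps * dt:
            t_rise = k * dt                      # first crossing of the reference
        dy = y - y_prev
        penalty = w4 * abs(dy) if dy < 0 else 0.0  # overshoot penalty of (18)
        cost += (w1 * abs(e) + w2 * u * u + penalty) * dt
        e_prev = e
    return cost + w3 * t_rise, y

cost, y_final = simulate_pid(Kp=2.0, Ki=1.0, Kd=0.1)
print(round(y_final, 3))  # settles near the reference r = 1
```

An optimizer such as BBTLBO would search (K_P, K_I, K_D) in [0, 1]^3 to minimize the returned cost.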
Table 7: Comparisons of the parameters of the PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
--- | --- | --- | --- | --- | --- | --- | --- | ---
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65000 | 1.05000 | 16.34555 | 70.5900
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65000 | 0.65000 | 11.60773 | 69.1000
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80000 | 0.70000 | 11.34300 | 70.4500
Figure 7: Performance curves (best cost value versus generation) using different methods (GA, PSO, and BBTLBO).
where ω_4 is a coefficient with ω_4 ≫ ω_1, dy(t) = y(t) - y(t - 1), and y(t) is the output of the controlled objective.
In our simulation, the transfer function of the plant examined is given as follows [34]:
G(s) = 1958 / (s^3 + 1789 s^2 + 1033 s + 1908). (19)
The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [-10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2, and ω_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with that tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum numbers of iterations are all 50. In addition, the crossover rate is set as 0.90 and the mutation rate as 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step-response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function value.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controllers tuned by GA and PSO have a smaller peak time or rise time, their maximum overshoots are much larger than the overshoot obtained by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO delivers the best control performance in these simulations.
7 Conclusion
In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and which uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is used to optimize 20 benchmark functions and two real-world optimization problems. From the analysis and the experiments, the BBTLBO algorithm significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO algorithm in each generation. From the results compared with the other algorithms on the 20 chosen test problems, it can be observed that the BBTLBO algorithm achieves good performance by using neighborhood search to generate better-quality solutions, although it does not have the best performance in every experimental case of this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work was also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552-561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687-697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303-315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1-15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447-1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225-232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535-560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177-188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710-720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965-983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341-352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147-1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58-73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937-971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80-87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128-139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634-647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677-1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976-979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150-169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595-3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671-1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646-657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398-417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204-210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43-62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007-1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290-2297, 2007.
(1) Begin
(2)   Initialize N (number of learners) and D (number of dimensions)
(3)   Initialize learners X and evaluate all learners X
(4)   Denote the best learner as Teacher and the mean of all learners X as Mean
(5)   while (stopping condition not met)
(6)     for each learner X_i of the class          % Teaching phase
(7)       TF = round(1 + rand(0, 1))
(8)       for j = 1 to D
(9)         newX_ij = X_ij + rand(0, 1) * (Teacher(j) − TF * Mean(j))
(10)      endfor
(11)      Accept newX_i if f(newX_i) is better than f(X_i)
(12)    endfor
(13)    for each learner X_i of the class          % Learning phase
(14)      Randomly select one learner X_k, such that i ≠ k
(15)      if f(X_i) is better than f(X_k)
(16)        for j = 1 to D
(17)          newX_ij = X_ij + rand(0, 1) * (X_ij − X_kj)
(18)        endfor
(19)      else
(20)        for j = 1 to D
(21)          newX_ij = X_ij + rand(0, 1) * (X_kj − X_ij)
(22)        endfor
(23)      endif
(24)      Accept newX_i if f(newX_i) is better than f(X_i)
(25)    endfor
(26)    Update the Teacher and the Mean
(27)  endwhile
(28) end

Algorithm 1: TLBO().
bare-bones algorithms. Section 4 describes the improved teaching-learning-based optimization algorithm using neighborhood search (BBTLBO). Section 5 presents the tests on several benchmark functions, and the experiments are conducted along with statistical tests. The applications for training artificial neural networks are shown in Section 6. Conclusions are given in Section 7.
2 Teaching-Learning-Based Optimization
Rao et al. [8, 9] first proposed a novel teaching-learning-based optimization (TLBO) algorithm inspired by the philosophy of teaching and learning. The TLBO algorithm is based on the effect of the influence of a teacher on the output of learners in a class, which is considered in terms of results or grades. The working process of TLBO is divided into two parts: the first part is the "teacher phase" and the second part is the "learner phase". The "teacher phase" means learning from the teacher, and the "learner phase" means learning through interaction between learners.
A good teacher is one who brings his or her learners up to his or her level in terms of knowledge. But in practice this is not possible, and a teacher can only move the mean of a class up to some extent, depending on the capability of the class. This follows a random process depending on many factors. Let M be the mean and T be the teacher at any iteration. T will try to move the mean M toward its own level, so the new mean will be T, designated as M_new. The solution is updated according to the difference between the existing and the new mean, according to the following expression:

newX = X + r × (M_new − TF × M),  (1)

where TF is a teaching factor that decides the value of the mean to be changed and r is a random vector in which each element is a random number in the range [0, 1]. The value of TF can be either 1 or 2, which is again a heuristic step, decided randomly with equal probability as

TF = round[1 + rand(0, 1)].  (2)
Learners increase their knowledge by two different means: one through input from the teacher and the other through interaction among themselves. A learner interacts randomly with other learners with the help of group discussions, presentations, formal communications, and so forth. A learner learns something new if the other learner has more knowledge than him or her. Learner modification is expressed as

newX_i = X_i + r × (X_i − X_j),  if f(X_i) < f(X_j),
newX_i = X_i + r × (X_j − X_i),  otherwise.  (3)
As explained above, the pseudocode for the implementation of TLBO is summarized in Algorithm 1.
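For illustration, the teacher and learner phases of Algorithm 1 can be sketched in a few lines of Python. This is a minimal sketch for a minimization problem; the function name `tlbo`, the sphere objective, and the parameter defaults are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def tlbo(f, bounds, n_learners=20, n_dims=5, max_gen=200, seed=1):
    """Minimal TLBO sketch: teacher phase (eqs. (1)-(2)) and learner phase (eq. (3))."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n_learners, n_dims))
    fit = np.array([f(x) for x in X])
    for _ in range(max_gen):
        # Teacher phase: move each learner toward the teacher, away from TF * mean.
        teacher = X[fit.argmin()]
        mean = X.mean(axis=0)
        for i in range(n_learners):
            TF = rng.integers(1, 3)                       # teaching factor, 1 or 2
            new = np.clip(X[i] + rng.random(n_dims) * (teacher - TF * mean), lo, hi)
            fn = f(new)
            if fn < fit[i]:                               # greedy selection
                X[i], fit[i] = new, fn
        # Learner phase: learn from a randomly chosen peer.
        for i in range(n_learners):
            k = rng.choice([j for j in range(n_learners) if j != i])
            if fit[i] < fit[k]:
                new = X[i] + rng.random(n_dims) * (X[i] - X[k])
            else:
                new = X[i] + rng.random(n_dims) * (X[k] - X[i])
            new = np.clip(new, lo, hi)
            fn = f(new)
            if fn < fit[i]:
                X[i], fit[i] = new, fn
    return X[fit.argmin()], fit.min()

best_x, best_f = tlbo(lambda x: np.sum(x ** 2), (-100.0, 100.0))
```

On the 5-dimensional sphere function, this sketch converges rapidly toward the optimum at the origin, reflecting the greedy selection used in both phases.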
3 Bare-Bones Algorithms

In this section, we present a brief overview of some recently proposed bare-bones algorithms.
3.1. BBPSO and BBExp. PSO is a swarm intelligence-based algorithm inspired by the behavior of flocking birds [5]. In PSO, each particle is attracted by its personal best position (pbest) and the global best position (gbest) found so far. Theoretical studies [18, 19] proved that each particle converges to the weighted average of pbest and gbest:

lim_{t→∞} X_i(t) = (c_1 · gbest + c_2 · pbest) / (c_1 + c_2),  (4)

where c_1 and c_2 are the two learning factors in PSO.
Based on the convergence characteristic of PSO, Kennedy [20] proposed a new PSO variant called bare-bones PSO (BBPSO). Bare-bones PSO retains the standard PSO social communication but replaces the dynamical particle update with sampling from a probability distribution based on gbest and pbest_i, as follows:

x_ij(t + 1) = N((gbest_j + pbest_ij(t))/2, |gbest_j − pbest_ij(t)|),  (5)

where x_ij(t + 1) is the jth dimension of the ith particle in the population and N represents a Gaussian distribution with mean (gbest_j + pbest_ij(t))/2 and standard deviation |gbest_j − pbest_ij(t)|.
Kennedy [20] also proposed an alternative version of BBPSO, denoted BBExp, where (5) is replaced by

x_ij(t + 1) = N((gbest_j + pbest_ij(t))/2, |gbest_j − pbest_ij(t)|),  if rand(0, 1) > 0.5,
x_ij(t + 1) = pbest_ij(t),  otherwise,  (6)

where rand(0, 1) is a random value within [0, 1] for the jth dimension. With this alternative mechanism, there is a 50% chance that the search process focuses on the previous best positions.
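The per-dimension updates (5) and (6) can be sketched in Python as follows. This is an illustrative sketch only; the function names and the module-level random generator are our own choices, not from [20].

```python
import numpy as np

rng = np.random.default_rng(0)

def bbpso_update(gbest_j, pbest_ij):
    """Eq. (5): Gaussian sample centered on the midpoint of gbest and pbest."""
    mu = 0.5 * (gbest_j + pbest_ij)
    sigma = abs(gbest_j - pbest_ij)
    return rng.normal(mu, sigma)

def bbexp_update(gbest_j, pbest_ij):
    """Eq. (6): 50% chance of Gaussian sampling, else keep the previous pbest."""
    if rng.random() > 0.5:
        return bbpso_update(gbest_j, pbest_ij)
    return pbest_ij

x_new = bbpso_update(1.0, 3.0)  # one draw from N(2.0, 2.0)
```

Averaged over many draws, `bbpso_update(1.0, 3.0)` centers on 2.0 with standard deviation 2.0, matching the mean and deviation in (5).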
3.2. BBDE, GBDE, and MGBDE. Inspired by BBPSO and DE, Omran et al. [21] proposed a new and efficient DE variant called bare-bones differential evolution (BBDE). BBDE is an almost parameter-free optimization algorithm that is a hybrid of the bare-bones particle swarm optimizer and differential evolution. Differential evolution is used to mutate, for each particle, the attractor associated with that particle, defined as a weighted average of its personal and neighborhood best positions. For BBDE, an individual is updated as follows:

x_ij(t + 1) = p_{i3,j}(t) + r_2 · (x_{i1,j}(t) − x_{i2,j}(t)),  if rand(0, 1) > CR,
x_ij(t + 1) = pbest_{i3,j}(t),  otherwise,  (7)

where i1, i2, and i3 are three indices chosen from the set {1, 2, ..., NP} with i1 ≠ i2 ≠ i, rand(0, 1) is a random value within [0, 1] for the jth dimension, and p_ij(t) is defined by

p_ij(t + 1) = r_{1j} · pbest_ij(t) + (1 − r_{1j}) · gbest_j(t),  (8)

where pbest and gbest are the personal best position and the global best position and r_{1j} is a random value within [0, 1] for the jth dimension.

Based on the idea that Gaussian sampling is a fine-tuning procedure which starts during exploration and continues into exploitation, Wang et al. [22] proposed a new parameter-free DE algorithm called GBDE. In GBDE, the mutation strategy uses a Gaussian sampling method, which is defined by

v_ij(t + 1) = N((X_best,j(t) + x_ij(t))/2, |X_best,j(t) − x_ij(t)|),  if rand(0, 1) ≤ CR or j = j_rand,
v_ij(t + 1) = x_ij(t),  otherwise,  (9)

where N represents a Gaussian distribution with mean (X_best,j(t) + x_ij(t))/2 and standard deviation |X_best,j(t) − x_ij(t)| and CR is the probability of crossover.
To balance global search ability and convergence rate, Wang et al. [22] proposed a modified GBDE (called MGBDE). Its mutation strategy uses a hybridization of GBDE and DE/best/1, as follows:

v_ij(t + 1) = X_best,j(t) + F · (x_{i1,j}(t) − x_{i2,j}(t)),  if rand(0, 1) ≤ 0.5,
v_ij(t + 1) = N((X_best,j(t) + x_ij(t))/2, |X_best,j(t) − x_ij(t)|),  otherwise.  (10)
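The mutation rules (9) and (10) can be sketched per dimension in Python. This is an illustrative sketch, not the authors' implementation; the `forced` flag stands in for the j = j_rand condition in (9), and all names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def gbde_mutation(xbest_j, x_ij, cr=0.9, forced=False):
    """Eq. (9): Gaussian sampling around (xbest + x)/2 with probability CR, else keep x.
    `forced` plays the role of the j == j_rand condition."""
    if forced or rng.random() <= cr:
        mu = 0.5 * (xbest_j + x_ij)
        sigma = abs(xbest_j - x_ij)
        return rng.normal(mu, sigma)
    return x_ij

def mgbde_mutation(xbest_j, x_r1j, x_r2j, x_ij, F=0.5):
    """Eq. (10): DE/best/1 with probability 0.5, Gaussian sampling otherwise."""
    if rng.random() <= 0.5:
        return xbest_j + F * (x_r1j - x_r2j)
    return gbde_mutation(xbest_j, x_ij, forced=True)
```

The DE/best/1 branch drives fast convergence toward the best individual, while the Gaussian branch keeps sampling spread proportional to the distance from the best, which is the balance MGBDE aims for.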
4 Proposed Algorithm BBTLBO
The bare-bones PSO utilizes this information by sampling candidate solutions normally distributed around the formally derived attractor point. That is, the new position is generated by a Gaussian distribution, sampling the search space based on gbest and pbest at the current iteration. As a result, the new position is centered around the weighted average of pbest and gbest. Generally speaking, at the initial evolutionary stages the search process focuses on exploration due to the large deviation; as the number of generations increases, the deviation becomes smaller and the search process focuses on exploitation. From the search behavior of BBPSO, Gaussian sampling is a fine-tuning procedure which starts during exploration and continues into exploitation. This can be beneficial for the search of many evolutionary optimization algorithms. Additionally, the bare-bones PSO has no parameters to be tuned.
Based on the previous explanation, a new bare-bones TLBO (BBTLBO) with neighborhood search is proposed in this paper. In fact, for TLBO, if the new learner has a better function value than that of the old learner, it replaces the old one in the memory; otherwise, the old one is retained in the memory. In other words, a greedy selection mechanism is employed as the selection operation between the old solution and the candidate one. Hence, the new teacher and the new learner are the global best (gbest) and the learner's personal best (pbest) found so far, respectively. The complete flowchart of the BBTLBO algorithm is shown in Figure 1.

Figure 1: Flow chart showing the working of the BBTLBO algorithm.
4.1. Neighborhood Search. It is known that birds of a feather flock together and that people of like mind fall into the same group. Just like evolutionary algorithms themselves, the notion of neighborhood is inspired by nature. The neighborhood technique is an efficient method to maintain diversity of the solutions. It plays an important role in evolutionary algorithms and is often introduced by researchers to maintain a population of diverse individuals and to improve the exploration capability of population-based heuristic algorithms [23–26]. In fact, learners with similar interests form different learning groups, and, owing to this characteristic, a learner may learn from the excellent individuals in his or her learning group.

For the implementation of grouping, various types of connected distances may be used. Here we use a ring topology [27] based on the indexes of learners, for the sake of simplicity. In a ring topology, the first individual is the neighbor of the last individual and vice versa. Based on the ring topology, a k-neighborhood radius is defined, where k is a predefined integer. For each individual, its k-neighborhood radius consists of the 2k + 1 individuals (including itself) X_{i−k}, ..., X_i, ..., X_{i+k}; that is, the neighborhood size is 2k + 1 for a k-neighborhood. For simplicity, k is set to 1 (Figure 2) in our algorithm. This means that there are 3 individuals in each learning group. Once groups are constructed, we can utilize them for updating the learners of the corresponding group.

Figure 2: Ring neighborhood topology with three members.
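The ring grouping above amounts to index arithmetic modulo the population size. A minimal sketch (the function name is our own choice):

```python
def ring_neighborhood(i, pop_size, k=1):
    """Indices of the 2k+1 members (including i itself) of i's ring neighborhood."""
    return [(i + off) % pop_size for off in range(-k, k + 1)]

# With k = 1 each learning group has 3 members, and the ring wraps around:
print(ring_neighborhood(0, 20))   # -> [19, 0, 1]
print(ring_neighborhood(19, 20))  # -> [18, 19, 0]
```

The modulo operation realizes the wrap-around property: the first individual is a neighbor of the last one and vice versa.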
4.2. Teacher Phase. To balance global and local search ability, a modified interactive learning strategy is proposed in the teacher phase. In this learning phase, each learner employs an interactive learning strategy (the hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning) based on neighborhood search.

In BBTLBO, the updating formula of the learning for a learner X_i in the teacher phase is proposed as the hybridization of the learning strategy of the teacher phase and Gaussian sampling learning, as follows:

V1_j(t + 1) = X_ij(t) + rand(0, 1) · (NTeacher_ij(t) − TF · NMean_ij(t)),
V2_j(t + 1) = N((NTeacher_ij(t) + NMean_ij(t))/2, |NTeacher_ij(t) − NMean_ij(t)|),
newX_ij(t + 1) = u · V1_j(t + 1) + (1 − u) · V2_j(t + 1),  (11)

where u, called the hybridization factor, is a random number in the range [0, 1] for the jth dimension, NTeacher and NMean are the existing neighborhood best solution and the neighborhood mean solution of each learner, and TF is a teaching factor, which can be either 1 or 2 randomly.
In BBTLBO, there is a (u × 100)% chance that the jth dimension of the ith learner in the population follows the behavior of the learning strategy of the teacher phase, while the remaining (100 − u × 100)% follow the search behavior of the Gaussian sampling in the teacher phase. This helps balance the advantages of a fast convergence rate (the attraction of the teacher-phase learning strategy) and exploration (the Gaussian sampling) in BBTLBO.
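The teacher-phase update (11) can be sketched per learner in Python. This is an illustrative sketch under our own naming; the paper's `NTeacher` and `NMean` are passed in as precomputed vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def bbtlbo_teacher_update(x_i, n_teacher, n_mean, u=0.9):
    """Eq. (11): hybrid of the TLBO teacher step (V1) and Gaussian sampling (V2)."""
    TF = rng.integers(1, 3)                                   # teaching factor, 1 or 2
    v1 = x_i + rng.random(x_i.size) * (n_teacher - TF * n_mean)
    v2 = rng.normal(0.5 * (n_teacher + n_mean), np.abs(n_teacher - n_mean))
    return u * v1 + (1.0 - u) * v2

x_i = np.array([1.0, 2.0])
new_x = bbtlbo_teacher_update(x_i, np.array([0.5, 0.5]), np.array([1.0, 1.5]))
```

With u close to 1 the update is dominated by the attractive TLBO step, while smaller u puts more weight on the Gaussian sample, mirroring the exploitation/exploration trade-off described above.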
4.3. Learner Phase. At the same time, in the learner phase, a learner interacts randomly with other learners to enhance his or her knowledge in the class. This learning method can be treated as the global search strategy (shown in (3)).
In this paper, we introduce a new learning strategy in which, in the learner phase, each learner learns from the neighborhood teacher and from another learner selected randomly from his or her corresponding neighborhood. This learning method can be treated as the neighborhood search strategy. Let newX_i represent the interactive learning result of the learner X_i. This neighborhood search strategy can be expressed as follows:

newX_ij = X_ij + r_1 · (NTeacher_ij − X_ij) + r_2 · (X_ij − X_kj),  (12)
where r_1 and r_2 are random vectors in which each element is a random number in the range [0, 1], NTeacher is the teacher of the learner X_i's corresponding neighborhood, and the learner X_k is selected randomly from that neighborhood.

In BBTLBO, each learner learns probabilistically by means of the global search strategy or the neighborhood search strategy in the learner phase. That is, about 50% of the learners in the population execute the learning strategy of the learner phase in the standard TLBO (shown in (3)), while the remaining 50% execute the neighborhood search strategy (shown in (12)). This helps balance global search and local search in the learner phase.
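The two learner-phase alternatives can be sketched together in Python. This is a minimal sketch under our own naming; fitness values `f_i` and `f_k` are passed in so that the standard TLBO branch (eq. (3)) can pick its direction.

```python
import numpy as np

rng = np.random.default_rng(0)

def bbtlbo_learner_update(x_i, x_k, n_teacher, f_i, f_k):
    """Learner phase: standard TLBO peer learning (eq. (3)) or neighborhood
    search (eq. (12)), each chosen with probability 0.5."""
    d = x_i.size
    if rng.random() < 0.5:
        # Global search strategy: standard TLBO learner phase, eq. (3).
        if f_i < f_k:
            return x_i + rng.random(d) * (x_i - x_k)
        return x_i + rng.random(d) * (x_k - x_i)
    # Neighborhood search strategy, eq. (12).
    r1, r2 = rng.random(d), rng.random(d)
    return x_i + r1 * (n_teacher - x_i) + r2 * (x_i - x_k)
```

A greedy selection between the old learner and the returned candidate, as in Algorithm 2, would follow this update.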
Moreover, compared to the original TLBO, BBTLBO only modifies the learning strategies. Therefore, both the original TLBO and BBTLBO have the same time complexity O(NP · D · Gen_max), where NP is the population size, D is the number of dimensions, and Gen_max is the maximum number of generations.
As explained above, the pseudocode for the implementation of BBTLBO is summarized in Algorithm 2.
5 Functions Optimization
In this section, to illustrate the effectiveness of the proposed method, 20 benchmark functions are used to test the efficiency of BBTLBO. To compare the search performance of BBTLBO with that of some other methods, several other algorithms are also simulated in this paper.
5.1. Benchmark Functions. The details of the 20 benchmark functions are shown in Table 1. Among the 20 benchmark functions, F1 to F9 are unimodal functions and F10 to F20 are multimodal functions. The search ranges and theoretical optima for all functions are also shown in Table 1.
5.2. Parameter Settings. All experiments are carried out on the same machine with a Celeron 2.26 GHz CPU, 2 GB of memory, and the Windows XP operating system, using MATLAB 7.9.
(1) Begin
(2)   Initialize N (number of learners), D (number of dimensions), and hybridization factor u
(3)   Initialize learners X and evaluate all learners X
(4)   while (stopping condition not met)
(5)     for each learner X_i of the class          % Teaching phase
(6)       TF = round(1 + rand(0, 1))
(7)       Denote the NTeacher and the NMean in its neighborhood for each learner
(8)       Update each learner according to (11)
(9)       Accept newX_i if f(newX_i) is better than f(X_i)
(10)    endfor
(11)    for each learner X_i of the class          % Learning phase
(12)      Randomly select one learner X_k, such that i ≠ k
(13)      if rand(0, 1) < 0.5
(14)        Update each learner according to (3)
(15)      else
(16)        Denote the NTeacher in its neighborhood for each learner
(17)        Update each learner according to (12)
(18)      endif
(19)      Accept newX_i if f(newX_i) is better than f(X_i)
(20)    endfor
(21)  endwhile
(22) end

Algorithm 2: BBTLBO().
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms, the population size is set to 20. All population-based stochastic algorithms use the same stopping criterion, that is, reaching a certain number of function evaluations (FEs).
5.3. Effect of Variation in Parameter u. The hybridization factor u is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed using these different values of u. In our experiments, the maximal number of FEs, namely 40000 for all test functions, is used as the termination condition. Table 2 shows the mean optimum solutions and the standard deviations of the solutions obtained using the different hybridization factors u in the 50 independent runs. The best results among the algorithms are shown in bold. Figure 3 presents representative convergence graphs of different benchmark functions in terms of the mean fitness values achieved using different hybridization factors u on all test functions. Due to the tight space limitation, only some sample graphs are illustrated.

The comparisons in Table 2 and Figure 3 show that when the hybridization factor u is set to 0.9, BBTLBO offers the best performance on the 20 test functions. Hence, the hybridization factor u is set to 0.9 in the following experiments.
5.4. Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section, we compare BBTLBO with five other recently proposed bare-bones algorithms: three bare-bones DE variants and two bare-bones PSO variants. Our experiments include two series of comparisons, in terms of solution accuracy and solution convergence (convergence speed and success rate). We compared the performance of BBTLBO with similar bare-bones algorithms, including BBPSO [20], BBExp [20], BBDE [21], GBDE [22], and MGBDE [22].
5.4.1. Comparisons of Solution Accuracy. In our experiments, the maximal number of FEs, namely 40000 for all test functions, is used as the termination condition. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the 7 algorithms over 50 independent runs. Due to the tight space limitation, only some sample graphs are illustrated.

From Table 3, it can be observed that all algorithms perform well in terms of the mean optimum solution and the standard deviation on functions F15 and F17. Although BBExp performs better than BBTLBO on function F9 and MGBDE performs better than BBTLBO on function F20, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO achieves good solution accuracy on the test functions considered in this paper.
5.4.2. Comparison of the Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of the different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In our experiments, the stopping criterion is that each algorithm is terminated when the best fitness value found so far falls below the predefined threshold value (T_Value) or when the number of FEs reaches
Table 1: Details of numerical benchmarks used.

Function | Formula | D | Range | Optima
Sphere | F1(x) = Σ_{i=1}^{D} x_i^2 | 30 | [−100, 100] | 0
Sum square | F2(x) = Σ_{i=1}^{D} i·x_i^2 | 30 | [−100, 100] | 0
Quadric | F3(x) = Σ_{i=1}^{D} i·x_i^4 + random(0, 1) | 30 | [−1.28, 1.28] | 0
Step | F4(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2 | 30 | [−100, 100] | 0
Schwefel 1.2 | F5(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2 | 30 | [−100, 100] | 0
Schwefel 2.21 | F6(x) = max{|x_i|, 1 ≤ i ≤ D} | 30 | [−100, 100] | 0
Schwefel 2.22 | F7(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i| | 30 | [−10, 10] | 0
Zakharov | F8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 | 30 | [−100, 100] | 0
Rosenbrock | F9(x) = Σ_{i=1}^{D−1} [100(x_i^2 − x_{i+1})^2 + (x_i − 1)^2] | 30 | [−2.048, 2.048] | 0
Ackley | F10(x) = 20 − 20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i^2)) − exp((1/D)·Σ_{i=1}^{D} cos(2π·x_i)) + e | 30 | [−32, 32] | 0
Rastrigin | F11(x) = Σ_{i=1}^{D} (x_i^2 − 10·cos(2π·x_i) + 10) | 30 | [−5.12, 5.12] | 0
Weierstrass | F12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{kmax} [a^k·cos(2π·b^k·(x_i + 0.5))]) − D·Σ_{k=0}^{kmax} [a^k·cos(2π·b^k·0.5)], a = 0.5, b = 3, kmax = 20 | 30 | [−0.5, 0.5] | 0
Griewank | F13(x) = Σ_{i=1}^{D} (x_i^2/4000) − Π_{i=1}^{D} cos(x_i/√i) + 1 | 30 | [−600, 600] | 0
Schwefel | F14(x) = 418.9829·D + Σ_{i=1}^{D} (−x_i·sin(√|x_i|)) | 30 | [−500, 500] | 0
Bohachevsky1 | F15(x) = x_1^2 + 2x_2^2 − 0.3·cos(3π·x_1) − 0.4·cos(4π·x_2) + 0.7 | 2 | [−100, 100] | 0
Bohachevsky2 | F16(x) = x_1^2 + 2x_2^2 − 0.3·cos(3π·x_1)·cos(4π·x_2) + 0.3 | 2 | [−100, 100] | 0
Bohachevsky3 | F17(x) = x_1^2 + 2x_2^2 − 0.3·cos(3π·x_1 + 4π·x_2) + 0.3 | 2 | [−100, 100] | 0
Shekel5 | F18(x) = −Σ_{i=1}^{5} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.1532
Shekel7 | F19(x) = −Σ_{i=1}^{7} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.4029
Shekel10 | F20(x) = −Σ_{i=1}^{10} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.5364
the maximal number of FEs, 40000. The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the success rate (SR) in the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before reaching the maximum number of FEs. The best results among the six algorithms are shown in boldface.

From Table 5, it can be observed that all algorithms hardly converge to the threshold for the unimodal functions F3, F5, F6, and F8 and the multimodal functions F11, F12, and F14. BBTLBO converges to the threshold except for functions F3, F9, and F14. From the results of total average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and the majority of multimodal functions, except for functions F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly 10 for functions F1, F2, F4, F7, F9, F10, and F13. From the results of total average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO has good convergence speed and success rate (SR) on the test functions in this paper.
Table 2: Comparisons (mean ± std) of the solutions using different u.

Fun | BBTLBO (u = 0.0) | BBTLBO (u = 0.1) | BBTLBO (u = 0.3) | BBTLBO (u = 0.5) | BBTLBO (u = 0.7) | BBTLBO (u = 0.9) | BBTLBO (u = 1.0)
F1 | 1.75e−001 ± 1.21e+000 | 6.89e−071 ± 1.01e−070 | 1.23e−163 ± 0.0 | 1.21e−256 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F2 | 8.98e−005 ± 5.73e−004 | 5.62e−069 ± 2.72e−068 | 2.20e−161 ± 1.12e−160 | 2.43e−254 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F3 | 1.20e−001 ± 6.34e−002 | 5.91e−003 ± 1.44e−003 | 1.01e−003 ± 3.48e−004 | 4.35e−004 ± 1.97e−004 | 2.35e−004 ± 1.30e−004 | 2.27e−004 ± 1.26e−004 | 1.99e−004 ± 1.13e−004
F4 | 7.65e+002 ± 5.83e+002 | 4.80e−001 ± 8.86e−001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 5.58e+002 ± 6.53e+002 | 1.87e−028 ± 5.73e−028 | 3.53e−054 ± 1.86e−053 | 3.69e−073 ± 2.27e−072 | 9.53e−096 ± 6.74e−095 | 2.16e−115 ± 1.10e−114 | 2.56e−100 ± 1.30e−099
F6 | 2.51e+001 ± 5.34e+000 | 6.67e−021 ± 8.81e−021 | 2.81e−061 ± 6.36e−061 | 8.22e−100 ± 1.80e−099 | 8.18e−137 ± 1.41e−136 | 3.63e−154 ± 1.34e−153 | 8.86e−147 ± 3.22e−146
F7 | 1.37e−003 ± 9.54e−003 | 8.72e−043 ± 1.52e−042 | 5.68e−088 ± 8.76e−088 | 1.01e−133 ± 2.38e−133 | 2.60e−175 ± 0.0 | 1.16e−188 ± 0.0 | 8.33e−180 ± 0.0
F8 | 2.41e+000 ± 3.07e+000 | 1.32e−019 ± 2.98e−019 | 2.13e−028 ± 7.69e−028 | 3.44e−037 ± 1.24e−036 | 2.20e−050 ± 9.12e−050 | 1.07e−056 ± 4.39e−056 | 2.03e−049 ± 8.94e−049
F9 | 2.66e+001 ± 1.79e+000 | 2.72e+001 ± 3.17e−001 | 2.77e+001 ± 3.18e−001 | 2.83e+001 ± 2.78e−001 | 2.84e+001 ± 2.67e−001 | 2.83e+001 ± 3.41e−001 | 2.80e+001 ± 3.87e−001
F10 | 8.30e+000 ± 1.76e+000 | 1.77e−001 ± 6.10e−001 | 5.90e−015 ± 1.70e−015 | 3.55e−015 ± 0.0 | 3.55e−015 ± 0.0 | 3.55e−015 ± 0.0 | 3.55e−015 ± 0.0
F11 | 3.74e+001 ± 9.05e+000 | 3.33e+001 ± 1.18e+001 | 2.71e+001 ± 8.00e+000 | 1.89e+001 ± 1.14e+001 | 5.73e+000 ± 1.06e+001 | 0.0 ± 0.0 | 0.0 ± 0.0
F12 | 8.15e+000 ± 1.93e+000 | 3.38e−001 ± 1.16e+000 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 5.06e−001 ± 8.08e−001 | 6.52e−003 ± 8.86e−003 | 1.78e−003 ± 3.68e−003 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 4.33e+003 ± 6.79e+002 | 4.67e+003 ± 6.10e+002 | 5.17e+003 ± 6.68e+002 | 5.59e+003 ± 6.85e+002 | 5.53e+003 ± 7.10e+002 | 5.58e+003 ± 7.80e+002 | 5.40e+003 ± 6.53e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −7.71e+000 ± 3.47e+000 | −8.06e+000 ± 3.39e+000 | −9.64e+000 ± 1.81e+000 | −9.65e+000 ± 1.76e+000 | −1.02e+001 ± 6.77e−003 | −9.85e+000 ± 1.22e+000 | −9.93e+000 ± 1.12e+000
F19 | −7.69e+000 ± 3.52e+000 | −8.13e+000 ± 3.36e+000 | −9.87e+000 ± 1.83e+000 | −1.03e+001 ± 9.45e−001 | −9.76e+000 ± 1.95e+000 | −9.82e+000 ± 1.78e+000 | −9.61e+000 ± 1.99e+000
F20 | −8.12e+000 ± 3.53e+000 | −9.38e+000 ± 2.69e+000 | −1.01e+001 ± 1.65e+000 | −1.01e+001 ± 1.61e+000 | −9.70e+000 ± 2.28e+000 | −9.41e+000 ± 2.43e+000 | −1.00e+001 ± 1.69e+000
The Scientific World Journal 9
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e−027 ± 1.87e−026 | 2.62e−024 ± 5.00e−024 | 3.90e−035 ± 2.00e−034 | 4.35e−022 ± 1.13e−021 | 3.35e−035 ± 2.11e−034 | 0.0 ± 0.0
F2 | 1380.0 ± 2.11e+004 | 100.0 ± 4.63e+003 | 6.20e−021 ± 4.38e−020 | 140.0 ± 4.52e+003 | 1.28e−032 ± 8.37e−032 | 0.0 ± 0.0
F3 | 1.32e+000 ± 3.18e+000 | 2.22e−002 ± 7.55e−003 | 1.64e−002 ± 9.57e−003 | 2.49e−002 ± 9.88e−003 | 1.16e−002 ± 5.26e−003 | 2.27e−004 ± 1.26e−004
F4 | 5.60e+000 ± 9.28e+000 | 9.60e−001 ± 4.27e+000 | 7.89e+001 ± 3.05e+002 | 8.40e−001 ± 9.12e−001 | 1.08e+000 ± 1.28e+000 | 0.0 ± 0.0
F5 | 1.24e+004 ± 6.66e+003 | 4.41e+003 ± 3.37e+003 | 2.09e+000 ± 4.00e+000 | 5.36e+003 ± 3.26e+003 | 7.57e+002 ± 1.16e+003 | 2.16e−115 ± 1.10e−114
F6 | 1.67e+001 ± 9.19e+000 | 1.20e+000 ± 5.22e−001 | 1.39e+001 ± 4.47e+000 | 3.60e−001 ± 1.95e−001 | 1.10e+000 ± 2.94e+000 | 3.63e−154 ± 1.34e−153
F7 | 2.34e+001 ± 1.32e+001 | 1.00e+000 ± 3.03e+000 | 4.06e−019 ± 2.15e−018 | 6.00e−001 ± 2.40e+000 | 2.00e−001 ± 1.41e+000 | 1.16e−188 ± 0.0
F8 | 1.87e+002 ± 1.34e+002 | 1.58e+002 ± 7.00e+001 | 1.16e−001 ± 2.35e−001 | 1.72e+002 ± 6.67e+001 | 2.49e+001 ± 1.99e+001 | 1.07e−056 ± 4.39e−056
F9 | 7.07e+001 ± 1.48e+002 | 3.57e+001 ± 2.50e+001 | 2.76e+001 ± 1.06e+001 | 3.17e+001 ± 2.07e+001 | 2.76e+001 ± 1.46e+001 | 2.83e+001 ± 3.41e−001
F10 | 1.06e+001 ± 9.29e+000 | 1.52e+000 ± 5.11e+000 | 1.34e+000 ± 1.15e+000 | 2.59e+000 ± 6.45e+000 | 5.54e−001 ± 2.79e+000 | 3.55e−015 ± 0.0
F11 | 1.16e+002 ± 3.53e+001 | 1.81e+001 ± 7.28e+000 | 6.76e+001 ± 3.89e+001 | 1.55e+001 ± 5.96e+000 | 2.03e+001 ± 9.23e+000 | 0.0 ± 0.0
F12 | 2.73e+000 ± 2.11e+000 | 1.20e−001 ± 4.42e−001 | 1.73e+000 ± 1.32e+000 | 1.21e−001 ± 3.37e−001 | 5.17e−001 ± 8.67e−001 | 0.0 ± 0.0
F13 | 2.14e−002 ± 4.11e−002 | 2.30e−003 ± 4.29e−003 | 4.07e−002 ± 4.89e−002 | 3.08e−003 ± 7.42e−003 | 4.63e−003 ± 7.16e−003 | 0.0 ± 0.0
F14 | 3.64e+003 ± 6.28e+002 | 2.58e+003 ± 5.51e+002 | 2.30e+003 ± 4.09e+002 | 2.49e+003 ± 5.41e+002 | 2.60e+003 ± 5.05e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 4.37e−003 ± 3.09e−002 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −5.60e+000 ± 3.41e+000 | −7.90e+000 ± 2.74e+000 | −7.09e+000 ± 3.33e+000 | −7.63e+000 ± 2.86e+000 | −8.01e+000 ± 3.00e+000 | −9.85e+000 ± 1.22e+000
F19 | −5.97e+000 ± 3.31e+000 | −7.87e+000 ± 3.03e+000 | −6.21e+000 ± 3.66e+000 | −8.60e+000 ± 2.68e+000 | −8.37e+000 ± 2.90e+000 | −9.82e+000 ± 1.78e+000
F20 | −5.81e+000 ± 3.65e+000 | −9.40e+000 ± 2.42e+000 | −6.02e+000 ± 3.77e+000 | −9.46e+000 ± 2.24e+000 | −9.38e+000 ± 2.51e+000 | −9.41e+000 ± 2.43e+000
Figure 3: Comparison of the performance curves using different u: (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin. Each panel plots the mean fitness (log10 scale in (a)-(c)) against FEs (×10^4) for u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiments, the maximal number of FEs, namely, 40,000 for all test functions, is used as the stopping criterion of all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.
The comparisons in Table 5 show that all algorithms perform well for F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" row shows that BBTLBO achieves good accuracy on the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel. Each panel plots the mean fitness (log10 scale in (a)-(c)) against FEs (×10^4).
Table 5 indicates that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most complex multimodal optimization problems.
6 Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network; the basic structure of the proposed scheme is depicted in Figure 5.

Figure 5: BBTLBO-based ANN (the input x feeds the ANN, whose output y is compared with the desired output d; the resulting error drives the BBTLBO algorithm).

The inputs are connected to all the hidden units, which in turn are all connected to all
Table 4: The mean number of FEs (MFEs) and success rate (SR, %) with acceptable solutions using different algorithms.

Fun | t value | BBPSO (MFEs, SR) | BBExp (MFEs, SR) | BBDE (MFEs, SR) | GBDE (MFEs, SR) | MGBDE (MFEs, SR) | BBTLBO (MFEs, SR)
F1 | 1E−8 | 15922, 100 | 17727, 100 | 11042, 100 | 19214, 100 | 11440, 100 | 1390, 100
F2 | 1E−8 | 17515, 54 | 19179, 94 | 12243, 100 | 20592, 90 | 12634, 100 | 1500, 100
F3 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0
F4 | 1E−8 | 11710, 24 | 8120, 84 | 3634, 6 | 7343, 40 | 4704, 34 | 525, 100
F5 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | 4100, 100
F6 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | 2603, 100
F7 | 1E−8 | 17540, 6 | 21191, 90 | 17314, 100 | 22684, 94 | 15322, 98 | 2144, 100
F8 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | 9286, 100
F9 | 1E−2 | 17073, 62 | 18404, 42 | 14029, 24 | 18182, 52 | 17200, 80 | NaN, 0
F10 | 1E−8 | 24647, 26 | 27598, 90 | 18273, 26 | 29172, 82 | 18320, 84 | 2110, 100
F11 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | 2073, 100
F12 | 1E−8 | NaN, 0 | 25465, 50 | NaN, 0 | 27317, 64 | 19704, 24 | 2471, 100
F13 | 1E−8 | 16318, 32 | 21523, 58 | 11048, 16 | 22951, 64 | 14786, 58 | 1470, 100
F14 | 1E−8 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0 | NaN, 0
F15 | 1E−8 | 658, 100 | 1176, 100 | 1274, 100 | 1251, 100 | 1206, 100 | 799, 100
F16 | 1E−8 | 657, 98 | 1251, 100 | 1294, 100 | 1343, 100 | 1308, 100 | 813, 100
F17 | 1E−8 | 995, 100 | 2626, 100 | 1487, 100 | 2759, 100 | 1921, 100 | 973, 100
F18 | −10.15 | 1752, 34 | 6720, 44 | 2007, 52 | 4377, 32 | 8113, 64 | 1684, 94
F19 | −10.40 | 2839, 34 | 8585, 48 | 1333, 42 | 6724, 50 | 3056, 66 | 2215, 90
F20 | −10.53 | 1190, 36 | 8928, 74 | 1115, 40 | 6548, 76 | 5441, 80 | 2822, 82
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is

L = (M + 1) * N + (N + 1) * K.  (13)
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns:

MSE = (1 / (Q * K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij − y_ij)²,  (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with
one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(−2x).  (15)
In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. By (13), the number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed; the other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the methods. Figure 6 shows the predicted time series for training and test using the different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
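To make (13)-(15) concrete, the following sketch sizes and evaluates such a 1-5-1 network. The sampling range of x and the random weight vector are illustrative assumptions, not settings taken from the paper; an optimizer such as BBTLBO would treat `theta` as one learner and minimize `mse`.

```python
import math
import random

def num_variables(M, N, K):
    # Equation (13): (M+1)*N hidden-layer weights and biases,
    # plus (N+1)*K output-layer weights and biases.
    return (M + 1) * N + (N + 1) * K

def forward(theta, x, N=5):
    # Decode the flat vector theta for a 1-N-1 network:
    # sigmoid hidden units, linear output unit.
    hidden = [1.0 / (1.0 + math.exp(-(theta[2 * n] * x + theta[2 * n + 1])))
              for n in range(N)]
    w_out = theta[2 * N:3 * N]   # N output-layer weights
    b_out = theta[3 * N]         # output-layer bias
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

def mse(theta, samples):
    # Equation (14) with K = 1 output unit.
    Q = len(samples)
    return sum(0.5 * (d - forward(theta, x)) ** 2 for x, d in samples) / Q

# Target curve (15): y = sin(2x) * exp(-2x); x in [0, 4) is an assumption.
random.seed(1)
samples = [(k / 50.0, math.sin(2 * k / 50.0) * math.exp(-2 * k / 50.0))
           for k in range(200)]
theta = [random.uniform(-1, 1) for _ in range(num_variables(1, 5, 1))]
print(num_variables(1, 5, 1))  # prints 16
err = mse(theta, samples)
```

Shrinking `err` over iterations is exactly the training objective described above.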
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P * e[k] + K_I * Σ_{i=1}^{k} e[i] * Δt + K_D * (e[k] − e[k − 1]) / Δt,  (16)
where u[k] is the controlled output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:

f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t)) dt + ω3 t_r,  (17)
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1 | 3.63e−025 ± 1.85e−024 | 7.65e−025 ± 3.34e−024 | 9.23e−018 ± 3.03e−017 | 1.01e−002 ± 5.48e−003 | 3.05e−189 ± 0.0 | 0.0 ± 0.0
F2 | 1.49e−023 ± 6.69e−023 | 2.75e−025 ± 1.08e−024 | 3.68e−017 ± 5.37e−017 | 1.08e−001 ± 5.05e−002 | 1.29e−185 ± 0.0 | 0.0 ± 0.0
F3 | 3.22e−002 ± 2.83e−002 | 2.08e−002 ± 1.18e−002 | 1.28e−002 ± 5.50e−003 | 1.86e−002 ± 4.39e−003 | 5.70e−004 ± 2.37e−004 | 2.27e−004 ± 1.26e−004
F4 | 2.11e+001 ± 6.74e+001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 1.22e+002 ± 1.37e+002 | 4.28e+001 ± 2.59e+001 | 1.17e+001 ± 9.30e+000 | 2.60e+003 ± 6.79e+002 | 9.45e−043 ± 6.47e−042 | 2.16e−115 ± 1.10e−114
F6 | 3.06e+001 ± 8.50e+000 | 2.45e+000 ± 2.60e+000 | 4.67e−001 ± 2.82e−001 | 2.66e+000 ± 5.58e−001 | 2.08e−078 ± 4.30e−078 | 3.63e−154 ± 1.34e−153
F7 | 8.28e−019 ± 3.49e−018 | 5.40e−016 ± 3.81e−015 | 1.34e−011 ± 1.27e−011 | 1.70e−002 ± 2.85e−003 | 3.84e−096 ± 5.53e−096 | 1.16e−188 ± 0.0
F8 | 2.16e+000 ± 4.16e+000 | 4.88e−001 ± 5.82e−001 | 9.60e−002 ± 6.99e−002 | 5.86e+001 ± 1.70e+001 | 7.09e−022 ± 4.99e−021 | 1.07e−056 ± 4.39e−056
F9 | 2.49e+001 ± 1.05e+001 | 2.61e+001 ± 1.07e+000 | 2.40e+001 ± 1.52e+000 | 2.65e+001 ± 3.54e−001 | 2.55e+001 ± 5.01e−001 | 2.83e+001 ± 3.41e−001
F10 | 5.05e−001 ± 7.06e−001 | 2.07e−001 ± 4.58e−001 | 1.94e−001 ± 4.56e−001 | 2.16e−002 ± 4.37e−003 | 3.62e−015 ± 5.02e−016 | 3.55e−015 ± 0.0
F11 | 2.03e+000 ± 1.94e+000 | 3.86e+000 ± 1.97e+000 | 4.26e+001 ± 1.06e+001 | 1.15e+002 ± 1.54e+001 | 1.55e+001 ± 8.09e+000 | 0.0 ± 0.0
F12 | 2.88e−002 ± 1.45e−001 | 6.50e−002 ± 1.87e−001 | 7.89e−001 ± 1.03e+000 | 1.36e+000 ± 7.41e−001 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e−002 ± 3.58e−002 | 1.18e−002 ± 1.75e−002 | 1.16e−002 ± 1.58e−002 | 1.06e−001 ± 9.93e−002 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+002 ± 1.42e+002 | 1.35e+002 ± 1.26e+002 | 4.49e+003 ± 8.25e+002 | 3.96e+003 ± 8.40e+002 | 4.82e+003 ± 6.86e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −9.40e+000 ± 2.10e+000 | −9.25e+000 ± 2.30e+000 | −7.76e+000 ± 3.42e+000 | −9.79e+000 ± 1.44e+000 | −9.72e+000 ± 1.42e+000 | −9.85e+000 ± 1.22e+000
F19 | −9.85e+000 ± 1.90e+000 | −9.87e+000 ± 1.83e+000 | −9.24e+000 ± 2.70e+000 | −1.04e+001 ± 4.23e−009 | −9.22e+000 ± 2.41e+000 | −9.82e+000 ± 1.78e+000
F20 | −9.65e+000 ± 2.23e+000 | −1.01e+001 ± 1.59e+000 | −9.63e+000 ± 2.50e+000 | −1.05e+001 ± 1.01e−004 | −9.65e+000 ± 2.23e+000 | −9.41e+000 ± 2.43e+000
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | n/a
Figure 6: Comparison of the performance curves using different algorithms (TLBO and BBTLBO): (a) convergence curves (best value against generation); (b) approximation curves, y(t) against sample number, together with the actual curve; (c) error curves.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean ± Std) | Testing error (Mean ± Std)
TLBO | 9.85e−004 ± 9.26e−004 | 9.43e−004 ± 9.18e−004
BBTLBO | 3.45e−004 ± 2.02e−004 | 2.76e−004 ± 1.82e−004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function; that is, once overshoot occurs, its value is added to the cost function, which is then given as follows [32, 33]:
if dy(t) < 0:
    f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t) + ω4 |dy(t)|) dt + ω3 t_r
else:
    f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t)) dt + ω3 t_r
end  (18)
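A discrete-time approximation of this penalized cost might look as follows. The weight defaults mirror the settings reported later in this section; the short trajectories used below are invented purely to show that a post-peak dip triggers the ω4 penalty.

```python
def penalized_cost(e_seq, u_seq, y_seq, t_rise, dt=0.05,
                   w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    # Discretized form of (18): integrate w1*|e| + w2*u^2 over time,
    # add the overshoot penalty w4*|dy| at steps where
    # dy(t) = y(t) - y(t-1) < 0, and add w3 times the rise time.
    J = 0.0
    for k in range(1, len(y_seq)):
        dy = y_seq[k] - y_seq[k - 1]
        term = w1 * abs(e_seq[k]) + w2 * u_seq[k] ** 2
        if dy < 0:
            term += w4 * abs(dy)
        J += term * dt
    return J + w3 * t_rise

# A monotone response versus one that overshoots and dips back down.
smooth = penalized_cost([1.0, 0.5, 0.2], [0.0, 1.0, 1.0],
                        [0.0, 0.5, 0.8], t_rise=1.0)
dipping = penalized_cost([1.0, 0.5, 0.2], [0.0, 1.0, 1.0],
                         [0.0, 1.2, 0.9], t_rise=1.0)
# the dip after the peak makes `dipping` much larger than `smooth`
```

Because ω4 dominates ω1, even a small dip outweighs the tracking terms, which is the intended deterrent against overshoot.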
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65000 | 1.05000 | 16.34555 | 70.5900
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65000 | 0.65000 | 11.60773 | 69.1000
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80000 | 0.70000 | 11.34300 | 70.4500
Figure 7: Performance curves (best value against generation) using different methods (GA, PSO, and BBTLBO).
where ω4 is a coefficient with ω4 ≫ ω1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.

In our simulation, the transfer function of the plant examined is given as follows [34]:
G(s) = 1958 / (s³ + 1789 s² + 1033 s + 1908).  (19)
The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω1 = 0.999, ω2 = 0.001, ω3 = 2.0, and ω4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are 50, and the corresponding maximum numbers of iterations are 50, 50, and 50, respectively. In addition, the crossover rate is set as 0.90 and the mutation rate as 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function and CPU time.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controllers tuned by PSO have a smaller peak time and rise time, their maximum overshoots are much larger than the overshoot tuned by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO achieves the best control performance in the simulations.
7 Conclusion
In this paper, TLBO has been extended to BBTLBO, which uses a hybridization of the learning strategy in the standard TLBO and Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is utilized to optimize 20 benchmark functions and two real-world optimization problems. From the analysis and experiments, the BBTLBO algorithm significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO in each generation. From the results compared with other algorithms on the 20 chosen test problems, it can be observed that BBTLBO achieves good performance by using neighborhood search effectively to generate better-quality solutions, although it does not always have the best performance in all experimental cases of this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
3 Bare-Bones Algorithm
In this section, we present a brief overview of some recently proposed bare-bones algorithms.
3.1. BBPSO and BBExp. PSO is a swarm intelligence-based algorithm inspired by the flocking behavior of birds [5]. In PSO, each particle is attracted by its personal best position (pbest) and the global best position (gbest) found so far. Theoretical studies [18, 19] proved that each particle converges to a weighted average of pbest and gbest:

lim_{t→∞} X_i(t) = (c1 * gbest + c2 * pbest_i) / (c1 + c2),  (4)

where c1 and c2 are the two learning factors of PSO.
Based on the convergence characteristic of PSO, Kennedy [20] proposed a new PSO variant called bare-bones PSO (BBPSO). BBPSO retains the standard PSO social communication but replaces the dynamical particle update with sampling from a probability distribution based on gbest and pbest_i as follows:

x_{i,j}(t + 1) = N((gbest_j + pbest_{i,j}(t)) / 2, |gbest_j − pbest_{i,j}(t)|),  (5)

where x_{i,j}(t + 1) is the jth dimension of the ith particle in the population and N represents a Gaussian distribution with mean (gbest_j + pbest_{i,j}(t)) / 2 and standard deviation |gbest_j − pbest_{i,j}(t)|.
Kennedy [20] also proposed an alternative version of the BBPSO, denoted by BBExp, where (5) is replaced by
\[
x_{ij}(t+1) =
\begin{cases}
N\!\left(\dfrac{gbest + pbest_{ij}(t)}{2},\; \bigl|gbest - pbest_{ij}(t)\bigr|\right), & \mathrm{rand}(0,1) > 0.5,\\[2mm]
pbest_{ij}(t), & \text{otherwise},
\end{cases} \qquad (6)
\]
where $\mathrm{rand}(0,1)$ is a random value within $[0,1]$ for the $j$th dimension. With this alternative mechanism, there is a 50% chance that the search process focuses on the previous best positions.
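Read operationally, (5) and (6) amount to a per-dimension Gaussian sample. Below is a minimal Python sketch; the function name and the `exploit_prob` switch that selects BBExp behavior are our own labels, not from the paper:

```python
import random

def bbpso_update(pbest_i, gbest, exploit_prob=0.0):
    """One bare-bones position update for a single particle.

    pbest_i : personal best of particle i (list of floats)
    gbest   : global best position (list of floats)
    exploit_prob : 0.0 gives BBPSO, eq. (5); 0.5 gives BBExp, eq. (6),
                   where a dimension keeps pbest with 50% probability.
    """
    new_x = []
    for j in range(len(pbest_i)):
        if random.random() < exploit_prob:
            new_x.append(pbest_i[j])           # BBExp branch: reuse previous best
        else:
            mu = (gbest[j] + pbest_i[j]) / 2.0     # Gaussian mean
            sigma = abs(gbest[j] - pbest_i[j])     # Gaussian std deviation
            new_x.append(random.gauss(mu, sigma))
    return new_x
```

Note that when $pbest_{ij} = gbest_j$ the standard deviation is zero and the particle stops moving in that dimension, which is the convergence behavior described by (4).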
3.2. BBDE, GBDE, and MGBDE. Inspired by BBPSO and DE, Omran et al. [21] proposed a new and efficient DE variant called bare-bones differential evolution (BBDE). The BBDE is an almost parameter-free optimization algorithm that hybridizes the bare-bones particle swarm optimizer and differential evolution. Differential evolution is used to mutate, for each particle, the attractor associated with that particle, defined as a weighted average of its personal and neighborhood best positions. In the BBDE, the individual is updated as follows:
\[
x_{ij}(t+1) =
\begin{cases}
p_{i_3 j}(t) + r_2 \cdot \bigl(x_{i_1 j}(t) - x_{i_2 j}(t)\bigr), & \mathrm{rand}(0,1) > \mathrm{CR},\\
pbest_{i_3 j}(t), & \text{otherwise},
\end{cases} \qquad (7)
\]
where $i_1$, $i_2$, and $i_3$ are three indices chosen from the set $\{1, 2, \ldots, \mathrm{NP}\}$ with $i_1 \neq i_2 \neq i$, $\mathrm{rand}(0,1)$ is a random value within $[0,1]$ for the $j$th dimension, and $p_{ij}(t)$ is defined by
\[
p_{ij}(t+1) = r_{1j} \cdot pbest_{ij}(t) + \bigl(1 - r_{1j}\bigr) \cdot gbest_j(t), \qquad (8)
\]
where $pbest$ and $gbest$ are the personal best position and the global best position, and $r_{1j}$ is a random value within $[0,1]$ for the $j$th dimension.

Based on the idea that Gaussian sampling is a fine-tuning procedure which starts during exploration and continues into exploitation, Wang et al. [22] proposed a new parameter-free DE algorithm called GBDE. In the GBDE, the mutation strategy uses a Gaussian sampling method, which is defined by
\[
V_{ij}(t+1) =
\begin{cases}
N\!\left(\dfrac{X_{best,j}(t) + x_{ij}(t)}{2},\; \bigl|X_{best,j}(t) - x_{ij}(t)\bigr|\right), & \mathrm{rand}(0,1) \le \mathrm{CR} \text{ or } j = j_{\mathrm{rand}},\\[2mm]
x_{ij}(t), & \text{otherwise},
\end{cases} \qquad (9)
\]
where $N$ represents a Gaussian distribution with mean $(X_{best,j}(t) + x_{ij}(t))/2$ and standard deviation $|X_{best,j}(t) - x_{ij}(t)|$, and CR is the probability of crossover.
To balance the global search ability and convergence rate, Wang et al. [22] proposed a modified GBDE (called MGBDE) whose mutation strategy hybridizes GBDE and DE/best/1 as follows:
\[
V_{ij}(t+1) =
\begin{cases}
X_{best,j}(t) + F \cdot \bigl(x_{i_1 j}(t) - x_{i_2 j}(t)\bigr), & \mathrm{rand}(0,1) \le 0.5,\\[2mm]
N\!\left(\dfrac{X_{best,j}(t) + x_{ij}(t)}{2},\; \bigl|X_{best,j}(t) - x_{ij}(t)\bigr|\right), & \text{otherwise}.
\end{cases} \qquad (10)
\]
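The branching in (10) can be sketched as follows. This is a Python illustration; the helper name is ours, and the default value of the scale factor `F` is an assumption, since the excerpt above does not fix it:

```python
import random

def mgbde_mutation(x, best, pop, i, F=0.5):
    """MGBDE mutant vector, eq. (10): per dimension, with probability 0.5 use
    DE/best/1, otherwise Gaussian sampling around (best + x_i)/2 as in GBDE.

    x    : current individual x_i
    best : best individual found so far
    pop  : whole population (list of vectors); i is the index of x in pop
    F    : DE scale factor (value assumed here, not given in the excerpt)
    """
    i1, i2 = random.sample([k for k in range(len(pop)) if k != i], 2)
    v = []
    for j in range(len(x)):
        if random.random() <= 0.5:
            v.append(best[j] + F * (pop[i1][j] - pop[i2][j]))  # DE/best/1 branch
        else:
            mu = (best[j] + x[j]) / 2.0
            sigma = abs(best[j] - x[j])
            v.append(random.gauss(mu, sigma))                  # GBDE branch
    return v
```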
4. Proposed Algorithm BBTLBO

The bare-bones PSO utilizes this information by sampling candidate solutions normally distributed around the formally derived attractor point. That is, the new position is generated by a Gaussian distribution sampling the search space based on the $gbest$ and the $pbest$ at the current iteration. As a result, the new position is centered around the weighted average of $pbest$ and $gbest$. Generally speaking, at the initial evolutionary stages the search process focuses on exploration due to the large deviation; with an increasing number of generations, the deviation becomes smaller and the search process focuses on exploitation. From the search behavior of BBPSO, the Gaussian sampling is a fine-tuning procedure which starts during exploration and continues into exploitation. This can be beneficial for the search of many evolutionary optimization algorithms. Additionally, the bare-bones PSO has no parameters to be tuned.

Based on the previous explanation, a new bare-bones TLBO (BBTLBO) with neighborhood search is proposed in this paper. In fact, for TLBO, if the new learner has a better function value than that of the old learner, the old learner is replaced in the memory; otherwise, the old one is retained. In other words, a greedy selection mechanism is employed as the selection operation between the old solution and the candidate one. Hence, the new teacher and the new learner are the global best ($gbest$) and the learner's personal best ($pbest$) found so far, respectively. The complete flowchart of the BBTLBO algorithm is shown in Figure 1.

Figure 1: Flow chart showing the working of the BBTLBO algorithm (teacher phase with the hybridized update $newX_i = u \cdot V_1 + (1-u) \cdot V_2$, followed by a learner phase that chooses between the original TLBO learning and the neighborhood search strategy with probability 0.5, with greedy selection after each phase).
4.1. Neighborhood Search. It is known that birds of a feather flock together and people of a mind fall into the same group. Just like evolutionary algorithms themselves, the notion of neighborhood is inspired by nature. The neighborhood technique is an efficient method to maintain diversity of the solutions. It plays an important role in evolutionary algorithms and is often introduced by researchers in order to maintain a population of diverse individuals and improve the exploration capability of population-based heuristic algorithms [23-26]. In fact, learners with similar interests form different learning groups, and because of his or her own characteristics, a learner may learn from the excellent individuals in the learning group.

For the implementation of grouping, various types of connected distances may be used. Here we have used a ring topology [27] based on the indexes of learners, for the sake of simplicity. In a ring topology, the first individual is the neighbor of the last individual, and vice versa. Based on the ring topology, a $k$-neighborhood radius is defined, where $k$ is a predefined integer number. For each individual, its $k$-neighborhood consists of the $2k+1$ individuals (including the individual itself) $X_{i-k}, \ldots, X_i, \ldots, X_{i+k}$; that is, the neighborhood size is $2k+1$ for a $k$-neighborhood. For simplicity, $k$ is set to 1 in our algorithm (Figure 2), which means that there are 3 individuals in each learning group. Once groups are constructed, we can utilize them for updating the learners of the corresponding group.

Figure 2: Ring neighborhood topology with three members ($X_{i-1}$, $X_i$, and $X_{i+1}$ form the neighborhood of $X_i$).
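For $k = 1$ on a ring of NP learners, the neighbor indices simply wrap around at both ends. A small sketch (the function name is ours):

```python
def ring_neighborhood(i, NP, k=1):
    """Indices of the 2k+1 members (including learner i itself) of learner i's
    neighborhood on a ring of NP learners; k = 1 in the paper, giving groups
    of 3. Modular arithmetic makes the first and last learners neighbors."""
    return [(i + d) % NP for d in range(-k, k + 1)]
```

For example, with NP = 5 the neighborhood of learner 0 is {4, 0, 1}, so the ends of the index list are indeed connected.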
4.2. Teacher Phase. To balance the global and local search ability, a modified interactive learning strategy is proposed in the teacher phase. In this phase, each learner employs an interactive learning strategy (the hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning) based on neighborhood search.

In BBTLBO, the updating formula for a learner $X_i$ in the teacher phase is the hybridization of the learning strategy of the teacher phase and the Gaussian sampling learning, as follows:
\[
\begin{aligned}
V_{1j}(t+1) &= X_{ij}(t) + \mathrm{rand}(0,1) \cdot \bigl(NTeacher_{ij}(t) - \mathrm{TF} \cdot NMean_{ij}(t)\bigr),\\
V_{2j}(t+1) &= N\!\left(\frac{NTeacher_{ij}(t) + NMean_{ij}(t)}{2},\; \bigl|NTeacher_{ij}(t) - NMean_{ij}(t)\bigr|\right),\\
newX_{ij}(t+1) &= u \cdot V_{1j}(t+1) + (1-u) \cdot V_{2j}(t+1),
\end{aligned} \qquad (11)
\]
where $u$, called the hybridization factor, is a random number in the range $[0,1]$ for the $j$th dimension, $NTeacher$ and $NMean$ are the neighborhood best solution and the neighborhood mean solution of each learner, and TF is a teaching factor which can be either 1 or 2, chosen randomly.

In the BBTLBO, there is a $(u \times 100)\%$ chance that the $j$th dimension of the $i$th learner in the population follows the behavior of the learning strategy of the teacher phase, while the remaining $(100 - u \times 100)\%$ follows the search behavior of the Gaussian sampling in the teacher phase. This helps balance the advantages of a fast convergence rate (the attraction of the learning strategy of the teacher phase) and exploration (the Gaussian sampling) in BBTLBO.
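The teacher-phase update of (11) can be sketched per learner as follows. This is a Python sketch with our own names; $u$ is taken here as a fixed blend weight, with $u = 0.9$ (the value found best in Section 5.3) as the default:

```python
import random

def teacher_phase_update(x_i, n_teacher, n_mean, u=0.9):
    """BBTLBO teacher-phase update, eq. (11), for one learner.

    x_i       : current learner position
    n_teacher : best solution in the learner's neighborhood
    n_mean    : mean solution of the learner's neighborhood
    u         : hybridization factor blending the TLBO step and the Gaussian step
    """
    new_x = []
    for j in range(len(x_i)):
        TF = random.randint(1, 2)              # teaching factor, 1 or 2 at random
        v1 = x_i[j] + random.random() * (n_teacher[j] - TF * n_mean[j])
        mu = (n_teacher[j] + n_mean[j]) / 2.0
        sigma = abs(n_teacher[j] - n_mean[j])
        v2 = random.gauss(mu, sigma)           # bare-bones Gaussian sample
        new_x.append(u * v1 + (1.0 - u) * v2)
    return new_x
```

In the full algorithm, the resulting `new_x` replaces `x_i` only if it has a better objective value (greedy selection).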
4.3. Learner Phase. In the learner phase, a learner interacts randomly with other learners to enhance his or her knowledge in the class. This learning method can be treated as the global search strategy (shown in (3)).

In this paper we introduce a new learning strategy for the learner phase, in which each learner learns from the neighborhood teacher and from another learner selected randomly from his or her corresponding neighborhood. This learning method can be treated as the neighborhood search strategy. Let $newX_i$ represent the interactive learning result of the learner $X_i$. This neighborhood search strategy can be expressed as follows:
\[
newX_{ij} = X_{ij} + r_1 \cdot \bigl(NTeacher_{ij} - X_{ij}\bigr) + r_2 \cdot \bigl(X_{ij} - X_{kj}\bigr), \qquad (12)
\]
where $r_1$ and $r_2$ are random vectors in which each element is a random number in the range $[0,1]$, $NTeacher$ is the teacher of the learner $X_i$'s corresponding neighborhood, and the learner $X_k$ is selected randomly from that neighborhood.

In BBTLBO, each learner probabilistically learns by means of the global search strategy or the neighborhood search strategy in the learner phase. That is, about 50% of the learners in the population execute the learning strategy of the learner phase in the standard TLBO (shown in (3)), while the remaining 50% execute the neighborhood search strategy (shown in (12)). This helps balance global search and local search in the learner phase.
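The learner phase then flips a fair coin between the standard TLBO learner step of (3) and the neighborhood search of (12). A Python sketch; (3) is not reproduced in this excerpt, so the usual TLBO move (toward a random peer if it is better, away otherwise) is assumed here, and all names are ours:

```python
import random

def learner_phase_update(x_i, x_j, f, n_teacher, x_k):
    """BBTLBO learner phase for one learner (minimization).

    x_i       : current learner
    x_j       : a randomly chosen peer for the standard TLBO step
    f         : objective function to minimize
    n_teacher : teacher of x_i's ring neighborhood
    x_k       : a random learner from that neighborhood
    """
    D = len(x_i)
    if random.random() < 0.5:
        # standard TLBO learner step (assumed form of eq. (3)):
        # move toward x_j if it is better, away from it otherwise
        if f(x_j) < f(x_i):
            return [x_i[d] + random.random() * (x_j[d] - x_i[d]) for d in range(D)]
        return [x_i[d] + random.random() * (x_i[d] - x_j[d]) for d in range(D)]
    # neighborhood search strategy, eq. (12)
    return [x_i[d] + random.random() * (n_teacher[d] - x_i[d])
            + random.random() * (x_i[d] - x_k[d]) for d in range(D)]
```

As in the teacher phase, the new position is accepted only if it improves the objective value.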
Moreover, compared to the original TLBO, BBTLBO only modifies the learning strategies. Therefore, both the original TLBO and BBTLBO have the same time complexity $O(\mathrm{NP} \cdot D \cdot Gen_{\max})$, where NP is the population size, $D$ is the number of dimensions, and $Gen_{\max}$ is the maximum number of generations.
As explained above, the pseudocode for the implementation of BBTLBO is summarized in Algorithm 2.
5. Function Optimization

In this section, to illustrate the effectiveness of the proposed method, 20 benchmark functions are used to test the efficiency of BBTLBO. To compare the search performance of BBTLBO with that of some other methods, several other algorithms are also simulated in this paper.

5.1. Benchmark Functions. The details of the 20 benchmark functions are shown in Table 1. Among them, $F_1$ to $F_9$ are unimodal functions and $F_{10}$ to $F_{20}$ are multimodal functions. The search range and theoretical optimum of each function are also shown in Table 1.
5.2. Parameter Settings. All the experiments are carried out on the same machine, with a Celeron 2.26 GHz CPU, 2 GB memory, and the Windows XP operating system, using Matlab 7.9.
(1) Begin
(2)   Initialize NP (number of learners), D (number of dimensions), and hybridization factor u
(3)   Initialize learners X and evaluate all learners X
(4)   while (stopping condition not met)
(5)     for each learner X_i of the class          // Teaching phase
(6)       TF = round(1 + rand(0, 1))
(7)       Determine the NTeacher and the NMean in its neighborhood for each learner
(8)       Update each learner according to (11)
(9)       Accept newX_i if f(newX_i) is better than f(X_i)
(10)    endfor
(11)    for each learner X_i of the class          // Learning phase
(12)      Randomly select one learner X_k, such that i != k
(13)      if rand(0, 1) < 0.5
(14)        Update the learner according to (3)
(15)      else
(16)        Determine the NTeacher in its neighborhood for the learner
(17)        Update the learner according to (12)
(18)      endif
(19)      Accept newX_i if f(newX_i) is better than f(X_i)
(20)    endfor
(21)  endwhile
(22) end

Algorithm 2: BBTLBO().
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms the population size is set to 20. All population-based stochastic algorithms use the same stopping criterion, that is, reaching a certain number of function evaluations (FEs).

5.3. Effect of Variation in Parameter u. The hybridization factor $u$ is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed using these different values of $u$. In this experiment, a maximum of 40000 FEs is used as the termination condition for all test functions. Table 2 shows the mean optimum solutions and the standard deviation of the solutions obtained using the different hybridization factors $u$ over the 50 independent runs. The best results among the settings are shown in bold. Figure 3 presents representative convergence graphs, in terms of the mean fitness values achieved using the different hybridization factors $u$ on the test functions. Due to the tight space limitation, only some sample graphs are illustrated.

The comparisons in Table 2 and Figure 3 show that BBTLBO offers the best performance on the 20 test functions when the hybridization factor $u$ is set to 0.9. Hence, the hybridization factor $u$ is set to 0.9 in the following experiments.
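The reporting protocol just described (50 independent runs, mean ± std of the final best fitness) can be sketched generically. A Python sketch; `run_once` stands in for any of the compared optimizers and is an assumption of this sketch:

```python
import math
import random

def mean_std(values):
    """Sample mean and (population) standard deviation, as typically reported."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, math.sqrt(var)

def report(run_once, runs=50, seed0=0):
    """Run an optimizer `runs` times with different seeds and report the
    mean and std of the best fitness values it returns, i.e. the
    'mean +/- std over 50 independent runs' protocol used in Tables 2-5."""
    finals = []
    for r in range(runs):
        random.seed(seed0 + r)   # a fresh seed per independent run
        finals.append(run_once())
    return mean_std(finals)
```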
5.4. Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section we compare BBTLBO with five other recently proposed bare-bones algorithms, three bare-bones DE variants and two bare-bones PSO algorithms: BBPSO [20], BBExp [20], BBDE [21], GBDE [22], and MGBDE [22]. Our experiment includes two series of comparisons, in terms of the solution accuracy and the solution convergence (convergence speed and success rate).

5.4.1. Comparisons on the Solution Accuracy. In this experiment, a maximum of 40000 FEs is used as the termination condition for all test functions. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained over the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the six algorithms over 50 independent runs. Due to the tight space limitation, only some sample graphs are illustrated.
From Table 3, it can be observed that all algorithms perform well, in terms of the mean optimum solution and the standard deviation, on the functions $F_{15}$ and $F_{17}$. Although BBExp performs better than BBTLBO on function $F_9$ and MGBDE performs better than BBTLBO on function $F_{20}$, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO achieves good solution accuracy on the test functions in this paper.
5.4.2. Comparison of the Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of the different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In this experiment, each algorithm is terminated when its best fitness value so far falls below the predefined threshold value (T value) or when the number of FEs reaches
Table 1: Details of numerical benchmarks used.

Function | Formula | D | Range | Optima
Sphere | F1(x) = Σ_{i=1}^{D} x_i^2 | 30 | [−100, 100] | 0
Sum square | F2(x) = Σ_{i=1}^{D} i·x_i^2 | 30 | [−100, 100] | 0
Quadric | F3(x) = Σ_{i=1}^{D} i·x_i^4 + random(0, 1) | 30 | [−1.28, 1.28] | 0
Step | F4(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2 | 30 | [−100, 100] | 0
Schwefel 1.2 | F5(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2 | 30 | [−100, 100] | 0
Schwefel 2.21 | F6(x) = max{|x_i|, 1 ≤ i ≤ D} | 30 | [−100, 100] | 0
Schwefel 2.22 | F7(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i| | 30 | [−10, 10] | 0
Zakharov | F8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 | 30 | [−100, 100] | 0
Rosenbrock | F9(x) = Σ_{i=1}^{D−1} [100(x_i^2 − x_{i+1})^2 + (x_i − 1)^2] | 30 | [−2.048, 2.048] | 0
Ackley | F10(x) = 20 − 20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i^2)) − exp((1/D)·Σ_{i=1}^{D} cos(2πx_i)) + e | 30 | [−32, 32] | 0
Rastrigin | F11(x) = Σ_{i=1}^{D} (x_i^2 − 10·cos(2πx_i) + 10) | 30 | [−5.12, 5.12] | 0
Weierstrass | F12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{kmax} [a^k·cos(2π·b^k·(x_i + 0.5))]) − D·Σ_{k=0}^{kmax} [a^k·cos(2π·b^k·0.5)], with a = 0.5, b = 3, kmax = 20 | 30 | [−0.5, 0.5] | 0
Griewank | F13(x) = Σ_{i=1}^{D} (x_i^2 / 4000) − Π_{i=1}^{D} cos(x_i / √i) + 1 | 30 | [−600, 600] | 0
Schwefel | F14(x) = 418.9829·D + Σ_{i=1}^{D} (−x_i·sin(√|x_i|)) | 30 | [−500, 500] | 0
Bohachevsky1 | F15(x) = x_1^2 + 2x_2^2 − 0.3·cos(3πx_1) − 0.4·cos(4πx_2) + 0.7 | 2 | [−100, 100] | 0
Bohachevsky2 | F16(x) = x_1^2 + 2x_2^2 − 0.3·cos(3πx_1)·cos(4πx_2) + 0.3 | 2 | [−100, 100] | 0
Bohachevsky3 | F17(x) = x_1^2 + 2x_2^2 − 0.3·cos(3πx_1 + 4πx_2) + 0.3 | 2 | [−100, 100] | 0
Shekel5 | F18(x) = −Σ_{i=1}^{5} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.1532
Shekel7 | F19(x) = −Σ_{i=1}^{7} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.4029
Shekel10 | F20(x) = −Σ_{i=1}^{10} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.5364
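To make the table's definitions concrete, three of the benchmarks can be written out directly (a Python sketch; the function names are ours):

```python
import math

def sphere(x):
    """F1: simple unimodal sum of squares, optimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """F11: highly multimodal; the cosine term creates a lattice of local minima."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    """F10: multimodal with a nearly flat outer region and a deep central hole."""
    D = len(x)
    s1 = sum(v * v for v in x) / D
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / D
    return 20.0 - 20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + math.e
```

All three attain their theoretical optimum 0 at the all-zeros vector, matching the "Optima" column of Table 1.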
the maximal FEs, 40000. The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the success rate (SR) over the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before reaching the maximum number of FEs. The best results among the six algorithms are shown in boldface.

From Table 4, it can be observed that all algorithms hardly converge to the threshold for the unimodal functions $F_3$, $F_5$, $F_6$, and $F_8$ and the multimodal functions $F_{11}$, $F_{12}$, and $F_{14}$. BBTLBO converges to the threshold for all functions except $F_3$, $F_9$, and $F_{14}$. From the results of the total average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and the majority of multimodal functions, except for functions $F_{15}$, $F_{16}$, $F_{19}$, and $F_{20}$. The acceleration rates between BBTLBO and the other algorithms are mostly around 10 for functions $F_1$, $F_2$, $F_4$, $F_7$, $F_9$, $F_{10}$, and $F_{13}$. From the results of the total average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO has a good convergence speed and success rate (SR) on the test functions in this paper.
Table 2: Comparisons (mean ± std) of the solutions of BBTLBO using different u (best detected results in bold).

Fun | u = 0.0 | u = 0.1 | u = 0.3 | u = 0.5 | u = 0.7 | u = 0.9 | u = 1.0
F1 | 1.75e−001 ± 1.21e+000 | 6.89e−071 ± 1.01e−070 | 1.23e−163 ± 0.0 | 1.21e−256 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F2 | 8.98e−005 ± 5.73e−004 | 5.62e−069 ± 2.72e−068 | 2.20e−161 ± 1.12e−160 | 2.43e−254 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F3 | 1.20e−001 ± 6.34e−002 | 5.91e−003 ± 1.44e−003 | 1.01e−003 ± 3.48e−004 | 4.35e−004 ± 1.97e−004 | 2.35e−004 ± 1.30e−004 | 2.27e−004 ± 1.26e−004 | **1.99e−004 ± 1.13e−004**
F4 | 7.65e+002 ± 5.83e+002 | 4.80e−001 ± 8.86e−001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 5.58e+002 ± 6.53e+002 | 1.87e−028 ± 5.73e−028 | 3.53e−054 ± 1.86e−053 | 3.69e−073 ± 2.27e−072 | 9.53e−096 ± 6.74e−095 | **2.16e−115 ± 1.10e−114** | 2.56e−100 ± 1.30e−099
F6 | 2.51e+001 ± 5.34e+000 | 6.67e−021 ± 8.81e−021 | 2.81e−061 ± 6.36e−061 | 8.22e−100 ± 1.80e−099 | 8.18e−137 ± 1.41e−136 | **3.63e−154 ± 1.34e−153** | 8.86e−147 ± 3.22e−146
F7 | 1.37e−003 ± 9.54e−003 | 8.72e−043 ± 1.52e−042 | 5.68e−088 ± 8.76e−088 | 1.01e−133 ± 2.38e−133 | 2.60e−175 ± 0.0 | **1.16e−188 ± 0.0** | 8.33e−180 ± 0.0
F8 | 2.41e+000 ± 3.07e+000 | 1.32e−019 ± 2.98e−019 | 2.13e−028 ± 7.69e−028 | 3.44e−037 ± 1.24e−036 | 2.20e−050 ± 9.12e−050 | **1.07e−056 ± 4.39e−056** | 2.03e−049 ± 8.94e−049
F9 | **2.66e+001 ± 1.79e+000** | 2.72e+001 ± 3.17e−001 | 2.77e+001 ± 3.18e−001 | 2.83e+001 ± 2.78e−001 | 2.84e+001 ± 2.67e−001 | 2.83e+001 ± 3.41e−001 | 2.80e+001 ± 3.87e−001
F10 | 8.30e+000 ± 1.76e+000 | 1.77e−001 ± 6.10e−001 | 5.90e−015 ± 1.70e−015 | **3.55e−015 ± 0.0** | **3.55e−015 ± 0.0** | **3.55e−015 ± 0.0** | **3.55e−015 ± 0.0**
F11 | 3.74e+001 ± 9.05e+000 | 3.33e+001 ± 1.18e+001 | 2.71e+001 ± 8.00e+000 | 1.89e+001 ± 1.14e+001 | 5.73e+000 ± 1.06e+001 | 0.0 ± 0.0 | 0.0 ± 0.0
F12 | 8.15e+000 ± 1.93e+000 | 3.38e−001 ± 1.16e+000 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 5.06e−001 ± 8.08e−001 | 6.52e−003 ± 8.86e−003 | 1.78e−003 ± 3.68e−003 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | **4.33e+003 ± 6.79e+002** | 4.67e+003 ± 6.10e+002 | 5.17e+003 ± 6.68e+002 | 5.59e+003 ± 6.85e+002 | 5.53e+003 ± 7.10e+002 | 5.58e+003 ± 7.80e+002 | 5.40e+003 ± 6.53e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −7.71e+000 ± 3.47e+000 | −8.06e+000 ± 3.39e+000 | −9.64e+000 ± 1.81e+000 | −9.65e+000 ± 1.76e+000 | **−1.02e+001 ± 6.77e−003** | −9.85e+000 ± 1.22e+000 | −9.93e+000 ± 1.12e+000
F19 | −7.69e+000 ± 3.52e+000 | −8.13e+000 ± 3.36e+000 | −9.87e+000 ± 1.83e+000 | **−1.03e+001 ± 9.45e−001** | −9.76e+000 ± 1.95e+000 | −9.82e+000 ± 1.78e+000 | −9.61e+000 ± 1.99e+000
F20 | −8.12e+000 ± 3.53e+000 | −9.38e+000 ± 2.69e+000 | −1.01e+001 ± 1.65e+000 | **−1.01e+001 ± 1.61e+000** | −9.70e+000 ± 2.28e+000 | −9.41e+000 ± 2.43e+000 | −1.00e+001 ± 1.69e+000
Table 3: Comparisons (mean ± std) of the solutions using different algorithms (best detected results in bold).

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e−027 ± 1.87e−026 | 2.62e−024 ± 5.00e−024 | 3.90e−035 ± 2.00e−034 | 4.35e−022 ± 1.13e−021 | 3.35e−035 ± 2.11e−034 | **0.0 ± 0.0**
F2 | 1380.0 ± 2.11e+004 | 100.0 ± 4.63e+003 | 6.20e−021 ± 4.38e−020 | 140.0 ± 4.52e+003 | 1.28e−032 ± 8.37e−032 | **0.0 ± 0.0**
F3 | 1.32e+000 ± 3.18e+000 | 2.22e−002 ± 7.55e−003 | 1.64e−002 ± 9.57e−003 | 2.49e−002 ± 9.88e−003 | 1.16e−002 ± 5.26e−003 | **2.27e−004 ± 1.26e−004**
F4 | 5.60e+000 ± 9.28e+000 | 9.60e−001 ± 4.27e+000 | 7.89e+001 ± 3.05e+002 | 8.40e−001 ± 9.12e−001 | 1.08e+000 ± 1.28e+000 | **0.0 ± 0.0**
F5 | 1.24e+004 ± 6.66e+003 | 4.41e+003 ± 3.37e+003 | 2.09e+000 ± 4.00e+000 | 5.36e+003 ± 3.26e+003 | 7.57e+002 ± 1.16e+003 | **2.16e−115 ± 1.10e−114**
F6 | 1.67e+001 ± 9.19e+000 | 1.20e+000 ± 5.22e−001 | 1.39e+001 ± 4.47e+000 | 3.60e−001 ± 1.95e−001 | 1.10e+000 ± 2.94e+000 | **3.63e−154 ± 1.34e−153**
F7 | 2.34e+001 ± 1.32e+001 | 1.00e+000 ± 3.03e+000 | 4.06e−019 ± 2.15e−018 | 6.00e−001 ± 2.40e+000 | 2.00e−001 ± 1.41e+000 | **1.16e−188 ± 0.0**
F8 | 1.87e+002 ± 1.34e+002 | 1.58e+002 ± 7.00e+001 | 1.16e−001 ± 2.35e−001 | 1.72e+002 ± 6.67e+001 | 2.49e+001 ± 1.99e+001 | **1.07e−056 ± 4.39e−056**
F9 | 7.07e+001 ± 1.48e+002 | **3.57e+001 ± 2.50e+001** | 2.76e+001 ± 1.06e+001 | 3.17e+001 ± 2.07e+001 | 2.76e+001 ± 1.46e+001 | 2.83e+001 ± 3.41e−001
F10 | 1.06e+001 ± 9.29e+000 | 1.52e+000 ± 5.11e+000 | 1.34e+000 ± 1.15e+000 | 2.59e+000 ± 6.45e+000 | 5.54e−001 ± 2.79e+000 | **3.55e−015 ± 0.0**
F11 | 1.16e+002 ± 3.53e+001 | 1.81e+001 ± 7.28e+000 | 6.76e+001 ± 3.89e+001 | 1.55e+001 ± 5.96e+000 | 2.03e+001 ± 9.23e+000 | **0.0 ± 0.0**
F12 | 2.73e+000 ± 2.11e+000 | 1.20e−001 ± 4.42e−001 | 1.73e+000 ± 1.32e+000 | 1.21e−001 ± 3.37e−001 | 5.17e−001 ± 8.67e−001 | **0.0 ± 0.0**
F13 | 2.14e−002 ± 4.11e−002 | 2.30e−003 ± 4.29e−003 | 4.07e−002 ± 4.89e−002 | 3.08e−003 ± 7.42e−003 | 4.63e−003 ± 7.16e−003 | **0.0 ± 0.0**
F14 | 3.64e+003 ± 6.28e+002 | 2.58e+003 ± 5.51e+002 | 2.30e+003 ± 4.09e+002 | 2.49e+003 ± 5.41e+002 | 2.60e+003 ± 5.05e+002 | **5.58e+003 ± 7.80e+002**
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 4.37e−003 ± 3.09e−002 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −5.60e+000 ± 3.41e+000 | −7.90e+000 ± 2.74e+000 | −7.09e+000 ± 3.33e+000 | −7.63e+000 ± 2.86e+000 | −8.01e+000 ± 3.00e+000 | **−9.85e+000 ± 1.22e+000**
F19 | −5.97e+000 ± 3.31e+000 | −7.87e+000 ± 3.03e+000 | −6.21e+000 ± 3.66e+000 | −8.60e+000 ± 2.68e+000 | −8.37e+000 ± 2.90e+000 | **−9.82e+000 ± 1.78e+000**
F20 | −5.81e+000 ± 3.65e+000 | −9.40e+000 ± 2.42e+000 | −6.02e+000 ± 3.77e+000 | **−9.46e+000 ± 2.24e+000** | −9.38e+000 ± 2.51e+000 | −9.41e+000 ± 2.43e+000
Figure 3: Comparison of the performance curves (mean fitness versus FEs, up to 4 × 10^4 FEs) using different u: (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In this experiment, a maximum of 40000 FEs is used as the stopping criterion of all algorithms on all test functions. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained over the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.
The comparisons in Table 5 show that all algorithms perform well on $F_{15}$, $F_{16}$, and $F_{17}$. Although SaDE outperforms BBTLBO on $F_{14}$, PSOcfLocal outperforms BBTLBO on $F_9$, and PSOwFIPS outperforms BBTLBO on $F_{19}$ and $F_{20}$, BBTLBO offers the highest accuracy on functions $F_3$, $F_4$, $F_5$, $F_7$, $F_8$, $F_{10}$, $F_{11}$, and $F_{18}$. The "w/t/l" results show that BBTLBO offers good accuracy on the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves (mean fitness versus FEs, up to 4 × 10^4 FEs) using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel.
Table 5 shows that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most complex multimodal optimization problems.
6. Two Real-World Optimization Problems

In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer
Figure 5: BBTLBO-based ANN. The BBTLBO algorithm tunes the ANN mapping input x to output y so as to minimize the difference between y and the desired output d.
feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which are in turn connected to all
Table 4: The mean number of FEs (MFEs) and success rate (SR, %) with acceptable solutions using different algorithms (each cell is MFEs/SR).

Fun | Accept. value | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 1e-8 | 15922/100 | 17727/100 | 11042/100 | 19214/100 | 11440/100 | 1390/100
F2 | 1e-8 | 17515/54 | 19179/94 | 12243/100 | 20592/90 | 12634/100 | 1500/100
F3 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F4 | 1e-8 | 11710/24 | 8120/84 | 3634/6 | 7343/40 | 4704/34 | 525/100
F5 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 4100/100
F6 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2603/100
F7 | 1e-8 | 17540/6 | 21191/90 | 17314/100 | 22684/94 | 15322/98 | 2144/100
F8 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 9286/100
F9 | 1e-2 | 17073/62 | 18404/42 | 14029/24 | 18182/52 | 17200/80 | NaN/0
F10 | 1e-8 | 24647/26 | 27598/90 | 18273/26 | 29172/82 | 18320/84 | 2110/100
F11 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2073/100
F12 | 1e-8 | NaN/0 | 25465/50 | NaN/0 | 27317/64 | 19704/24 | 2471/100
F13 | 1e-8 | 16318/32 | 21523/58 | 11048/16 | 22951/64 | 14786/58 | 1470/100
F14 | 1e-8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F15 | 1e-8 | 658/100 | 1176/100 | 1274/100 | 1251/100 | 1206/100 | 799/100
F16 | 1e-8 | 657/98 | 1251/100 | 1294/100 | 1343/100 | 1308/100 | 813/100
F17 | 1e-8 | 995/100 | 2626/100 | 1487/100 | 2759/100 | 1921/100 | 973/100
F18 | -10.15 | 1752/34 | 6720/44 | 2007/52 | 4377/32 | 8113/64 | 1684/94
F19 | -10.40 | 2839/34 | 8585/48 | 1333/42 | 6724/50 | 3056/66 | 2215/90
F20 | -10.53 | 1190/36 | 8928/74 | 1115/40 | 6548/76 | 5441/80 | 2822/82
the outputs. The variables consist of the neural network weights and biases. Suppose a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units; then the number of variables is given as follows:

L = (M + 1) · N + (N + 1) · K   (13)
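As a quick check, the variable count in (13) can be computed directly; the sketch below is illustrative (the function name is not from the paper):

```python
def num_ann_variables(M, N, K):
    """Number of weights and biases in an M-N-K feed-forward network.

    Each hidden unit has M input weights plus one bias; each output
    unit has N hidden weights plus one bias, matching (13).
    """
    return (M + 1) * N + (N + 1) * K

# The 1-5-1 network used in Section 6.1 has 16 trainable variables.
print(num_ann_variables(1, 5, 1))  # -> 16
```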
For neural network training, the aim is to find a set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is shown as follows:

MSE = (1 / (Q · K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij - y_ij)²   (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with
one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(-2x)   (15)

In this case, the activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the compared methods. Figure 6 shows the predicted time series for training and test using different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
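The MSE objective in (14) can be sketched as follows (a minimal illustration; the function and variable names are assumptions, not from the paper):

```python
def mse(d, y):
    """Mean sum of squared errors over all training patterns, as in (14).

    d, y: Q-by-K nested lists of desired and actual outputs
    (Q training patterns, K output units).
    """
    Q, K = len(d), len(d[0])
    total = sum(0.5 * (d[i][j] - y[i][j]) ** 2
                for i in range(Q) for j in range(K))
    return total / (Q * K)

# Two patterns, one output unit each.
print(mse([[1.0], [0.0]], [[0.5], [0.5]]))  # -> 0.125
```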
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] - e[k-1]) / Δt   (16)
where u[k] is the controller output, e[k] = r[k] - y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:

f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r   (17)
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1 | 3.63e-25 ± 1.85e-24 | 7.65e-25 ± 3.34e-24 | 9.23e-18 ± 3.03e-17 | 1.01e-02 ± 5.48e-03 | 3.05e-189 ± 0.0 | 0.0 ± 0.0
F2 | 1.49e-23 ± 6.69e-23 | 2.75e-25 ± 1.08e-24 | 3.68e-17 ± 5.37e-17 | 1.08e-01 ± 5.05e-02 | 1.29e-185 ± 0.0 | 0.0 ± 0.0
F3 | 3.22e-02 ± 2.83e-02 | 2.08e-02 ± 1.18e-02 | 1.28e-02 ± 5.50e-03 | 1.86e-02 ± 4.39e-03 | 5.70e-04 ± 2.37e-04 | **2.27e-04 ± 1.26e-04**
F4 | 2.11e+01 ± 6.74e+01 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 1.22e+02 ± 1.37e+02 | 4.28e+01 ± 2.59e+01 | 1.17e+01 ± 9.30e+00 | 2.60e+03 ± 6.79e+02 | 9.45e-43 ± 6.47e-42 | **2.16e-115 ± 1.10e-114**
F6 | 3.06e+01 ± 8.50e+00 | 2.45e+00 ± 2.60e+00 | 4.67e-01 ± 2.82e-01 | 2.66e+00 ± 5.58e-01 | 2.08e-78 ± 4.30e-78 | **3.63e-154 ± 1.34e-153**
F7 | 8.28e-19 ± 3.49e-18 | 5.40e-16 ± 3.81e-15 | 1.34e-11 ± 1.27e-11 | 1.70e-02 ± 2.85e-03 | 3.84e-96 ± 5.53e-96 | **1.16e-188 ± 0.0**
F8 | 2.16e+00 ± 4.16e+00 | 4.88e-01 ± 5.82e-01 | 9.60e-02 ± 6.99e-02 | 5.86e+01 ± 1.70e+01 | 7.09e-22 ± 4.99e-21 | **1.07e-56 ± 4.39e-56**
F9 | 2.49e+01 ± 1.05e+01 | 2.61e+01 ± 1.07e+00 | **2.40e+01 ± 1.52e+00** | 2.65e+01 ± 3.54e-01 | 2.55e+01 ± 5.01e-01 | 2.83e+01 ± 3.41e-01
F10 | 5.05e-01 ± 7.06e-01 | 2.07e-01 ± 4.58e-01 | 1.94e-01 ± 4.56e-01 | 2.16e-02 ± 4.37e-03 | 3.62e-15 ± 5.02e-16 | **3.55e-15 ± 0.0**
F11 | 2.03e+00 ± 1.94e+00 | 3.86e+00 ± 1.97e+00 | 4.26e+01 ± 1.06e+01 | 1.15e+02 ± 1.54e+01 | 1.55e+01 ± 8.09e+00 | **0.0 ± 0.0**
F12 | 2.88e-02 ± 1.45e-01 | 6.50e-02 ± 1.87e-01 | 7.89e-01 ± 1.03e+00 | 1.36e+00 ± 7.41e-01 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e-02 ± 3.58e-02 | 1.18e-02 ± 1.75e-02 | 1.16e-02 ± 1.58e-02 | 1.06e-01 ± 9.93e-02 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+02 ± 1.42e+02 | **1.35e+02 ± 1.26e+02** | 4.49e+03 ± 8.25e+02 | 3.96e+03 ± 8.40e+02 | 4.82e+03 ± 6.86e+02 | 5.58e+03 ± 7.80e+02
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -9.40e+00 ± 2.10e+00 | -9.25e+00 ± 2.30e+00 | -7.76e+00 ± 3.42e+00 | -9.79e+00 ± 1.44e+00 | -9.72e+00 ± 1.42e+00 | **-9.85e+00 ± 1.22e+00**
F19 | -9.85e+00 ± 1.90e+00 | -9.87e+00 ± 1.83e+00 | -9.24e+00 ± 2.70e+00 | **-1.04e+01 ± 4.23e-09** | -9.22e+00 ± 2.41e+00 | -9.82e+00 ± 1.78e+00
F20 | -9.65e+00 ± 2.23e+00 | -1.01e+01 ± 1.59e+00 | -9.63e+00 ± 2.50e+00 | **-1.05e+01 ± 1.01e-04** | -9.65e+00 ± 2.23e+00 | -9.41e+00 ± 2.43e+00
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | -
Figure 6: Comparison of the performance curves of TLBO and BBTLBO on the function approximation problem: (a) convergence curves, (b) approximation curves (against the actual function), (c) error curves.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean / Std) | Testing error (Mean / Std)
TLBO | 9.85e-04 / 9.26e-04 | 9.43e-04 / 9.18e-04
BBTLBO | 3.45e-04 / 2.02e-04 | 2.76e-04 / 1.82e-04
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function. That is, once overshoot occurs, the value of the overshoot is added to the cost function, and the cost function is given as follows [32, 33]:
if dy(t) < 0
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t) + ω_4 |dy(t)|) dt + ω_3 t_r
else
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r
end   (18)
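A discretized sketch of the penalized cost (18) (the function name and sampling conventions are assumptions; the default weights follow the values ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2, ω_4 = 100 used later in this section, and the rise time t_r is assumed precomputed):

```python
def penalized_iae_cost(e, u, y, dt, t_rise, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discretized form of cost (18): weighted IAE plus control effort and
    rise time, with an extra penalty on samples where dy(t) < 0.

    e, u, y: equally spaced samples of error, control output, and plant output.
    """
    cost = 0.0
    for k in range(1, len(y)):
        dy = y[k] - y[k - 1]
        cost += (w1 * abs(e[k]) + w2 * u[k] ** 2) * dt
        if dy < 0:                       # overshoot penalty branch of (18)
            cost += w4 * abs(dy) * dt
    return cost + w3 * t_rise
```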
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65 | 1.05 | 16.34555 | 70.59
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65 | 0.65 | 11.60773 | 69.10
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80 | 0.70 | 11.34300 | 70.45
Figure 7: Performance curves (best cost value versus generation) using different methods (GA, PSO, BBTLBO).
where ω_4 is a coefficient with ω_4 ≫ ω_1, dy(t) = y(t) - y(t - 1), and y(t) is the output of the controlled objective.

In our simulation, the plant examined is given as follows [34]:

G(s) = 1958 / (s³ + 1789 s² + 1033 s + 1908)   (19)
The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [-10, 10]. Other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2, and ω_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum numbers of iterations are all 50. In addition, for GA the crossover rate is set to 0.90 and the mutation rate to 0.10.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function value.
Figure 8: Step response curves using different methods (GA, PSO, BBTLBO).
Although the PID controllers tuned by PSO have a smaller peak time and rise time, their maximum overshoots are much larger than the overshoot obtained with BBTLBO. It can be concluded that the PID controller tuned by BBTLBO achieves the best control performance in these simulations.
7. Conclusion
In this paper, TLBO has been extended to BBTLBO, which uses the hybridization of the learning strategy in the standard TLBO and Gaussian sampling learning to balance exploration and exploitation in teacher phase, and uses a modified mutation operation to eliminate duplicate learners in learner phase. The proposed BBTLBO algorithm is utilized to optimize 20 benchmark functions and two real-world optimization problems. From the analysis and experiments, the BBTLBO algorithm significantly improves the performance of the original TLBO, although it needs to spend more CPU time than the standard TLBO algorithm in each generation. From the results compared with other algorithms on the 20 chosen test problems, it can be observed that the BBTLBO algorithm achieves good performance by using neighborhood search to generate better-quality solutions, although it does not always have the best performance in all the experimental cases of this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D E Goldberg Genetic Algorithms in Search Optimization andMachine Learning Addison-Wesley Reading Mass USA 1989
[2] L C Jiao and L Wang ldquoA novel genetic algorithm based onimmunityrdquo IEEE Transactions on SystemsMan and CyberneticsA Systems and Humans vol 30 no 5 pp 552ndash561 2000
[3] R Storn and K Price ldquoDifferential evolution a simple andefficient Heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997
[4] M Dorigo and T Stutzle Ant Colony Optimization MIT Press2004
[5] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995
[6] D Karaboga and B Basturk ldquoOn the performance of artificialbee colony (ABC) algorithmrdquo Applied Soft Computing Journalvol 8 no 1 pp 687ndash697 2008
[7] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008
[8] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization a novelmethod for constrainedmechanicaldesign optimization problemsrdquo CAD Computer Aided Designvol 43 no 3 pp 303ndash315 2011
[9] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization an optimization method for continuousnon-linear large scale problemsrdquo Information Sciences vol 183no 1 pp 1ndash15 2012
[10] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization algorithm for unconstrained and con-strained real-parameter optimization problemsrdquo EngineeringOptimization vol 44 no 12 pp 1447ndash1462 2011
[11] V Togan ldquoDesign of planar steel frames using teaching-learningbased optimizationrdquo Engineering Structures vol 34 pp 225ndash232 2012
[12] R V Rao and V Patel ldquoAn elitist teaching-learning-based opti-mization algorithm for solving complex constrained optimiza-tion problemsrdquo International Journal of Industrial EngineeringComputations vol 3 pp 535ndash560 2012
[13] S O Degertekin and M S Hayalioglu ldquoSizing truss structuresusing teaching-learning-based optimizationrdquo Computers andStructures vol 119 pp 177ndash188 2013
[14] R V Rao and V Patel ldquoAn improved teaching-learning-basedoptimization algorithm for solving unconstrained optimizationproblemsrdquo Scientia Iranica vol 20 no 3 pp 710ndash720 2013
[15] R V Rao and V Patel ldquoMulti-objective optimization of com-bined Brayton and inverse Brayton cycles using advancedoptimization algorithmsrdquo Engineering Optimization vol 44 no8 pp 965ndash983 2011
[16] T Niknam F Golestaneh and M S Sadeghi ldquoTheta-multi-objective teaching-learning-based optimization for dynamiceconomic emission dispatchrdquo IEEE Systems Journal vol 6 no2 pp 341ndash352 2012
[17] R V Rao and V Patel ldquoMulti-objective optimization of heatexchangers using a modified teaching-learning-based opti-mization algorithmrdquo Applied Mathematical Modelling vol 37no 3 pp 1147ndash1162 2013
[18] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[19] F van den Bergh and A P Engelbrecht ldquoA study of particleswarm optimization particle trajectoriesrdquo Information Sciencesvol 176 no 8 pp 937ndash971 2006
[20] J Kennedy ldquoBare bones particle swarmsrdquo in Proceedings of theSwarm Intelligence Symposium (SIS rsquo03) pp 80ndash87 2003
[21] M G H Omran A P Engelbrecht and A Salman ldquoBarebones differential evolutionrdquo European Journal of OperationalResearch vol 196 no 1 pp 128ndash139 2009
[22] H Wang S Rahnamayan H Sun and M G H OmranldquoGaussian bare-bones differential evolutionrdquo IEEE Transactionson Cybernetics vol 43 no 2 pp 634ndash647 2013
[23] X H Hu and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation pp 677ndash1681 2002
[24] M G Omran A P Engelbrecht and A Salman ldquoUsingthe ring neighborhood topology with self-adaptive differentialevolutionrdquo in Advances in Natural Computation pp 976ndash979Springer Berlin Germany 2006
[25] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[26] I Maruta T H Kim D Song and T Sugie ldquoSynthesis of fixed-structure robust controllers using a constrained particle swarmoptimizer with cyclic neighborhood topologyrdquo Expert Systemswith Applications vol 40 no 9 pp 3595ndash3605 2013
[27] J Kennedy and R Mendes ldquoPopulation structure and particleswarm performancerdquo in Proceedings of the International Con-ference on Evolutionary Computation pp 1671ndash1676 HonoluluHawaii USA 2002
[28] J Brest S Greiner B Boskovic M Mernik and V ZumerldquoSelf-adapting control parameters in differential evolution acomparative study on numerical benchmark problemsrdquo IEEETransactions on Evolutionary Computation vol 10 no 6 pp646ndash657 2006
The Scientific World Journal 17
[29] A K Qin V L Huang and P N Suganthan ldquoDifferential evo-lution algorithm with strategy adaptation for global numericaloptimizationrdquo IEEE Transactions on Evolutionary Computationvol 13 no 2 pp 398ndash417 2009
[30] R Mendes J Kennedy and J Neves ldquoThe fully informedparticle swarm simpler maybe betterrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 204ndash210 2004
[31] F Herrera and M Lozano ldquoGradual distributed real-codedgenetic algorithmsrdquo IEEE Transactions on Evolutionary Compu-tation vol 4 no 1 pp 43ndash62 2000
[32] J Liu Advanced PID Control and MATLAB Simulation Elec-tronic Industry Press 2003
[33] J Zhang J Zhuang H Du and S Wang ldquoSelf-organizinggenetic algorithm based tuning of PID controllersrdquo InformationSciences vol 179 no 7 pp 1007ndash1017 2009
[34] R Haber-Haber R Haber M Schmittdiel and R M delToro ldquoA classic solution for the control of a high-performancedrilling processrdquo International Journal of Machine Tools andManufacture vol 47 no 15 pp 2290ndash2297 2007
Figure 1: Flow chart showing the working of the BBTLBO algorithm. After initializing the learners (size NP, dimension D, and hybridization factor u), each generation runs a teacher phase, in which V1 = X_i + r · (NTeacher - TF · NMean) and V2 = N((NTeacher + X_i)/2, |NTeacher - X_i|) are combined as newX_i = u · V1 + (1 - u) · V2, and a learner phase, in which each learner applies the original TLBO learning if rand(0, 1) < 0.5 and the neighborhood search strategy otherwise; in both phases newX_i replaces X_i only if it is better, and the loop repeats until the termination criteria are satisfied.
paper. In fact, for TLBO, if the new learner has a better function value than that of the old learner, it replaces the old one in the memory; otherwise, the old one is retained. In other words, a greedy selection mechanism is employed as the selection operation between the old solution and the candidate one. Hence, the new teacher and the new learner are the global best (gbest) and the learner's personal best (pbest) found so far, respectively. The complete flowchart of the BBTLBO algorithm is shown in Figure 1.
4.1. Neighborhood Search. It is known that birds of a feather flock together and people of a mind fall into the same group. Just like evolutionary algorithms themselves, the notion of neighborhood is inspired by nature. The neighborhood technique is an efficient method to maintain diversity of the solutions. It plays an important role in evolutionary algorithms and is often introduced by researchers in order to maintain a population of diverse individuals and improve the exploration capability of population-based heuristic algorithms [23-26]. In fact, learners with similar interests form different learning groups, and, because of his or her own characteristics, a learner may learn from the excellent individual in the learning group.
For the implementation of grouping, various types of connected distances may be used. Here we have used a ring topology [27] based on the indexes of learners for the sake of simplicity. In a ring topology, the first individual is the neighbor of the last individual and vice versa. Based on the ring topology, a k-neighborhood radius is defined, where k is a predefined integer number. For each individual,
Figure 2: Ring neighborhood topology with three members (X_{i-1}, X_i, X_{i+1}).
its k-neighborhood radius consists of 2k + 1 individuals (including itself), which are X_{i-k}, ..., X_i, ..., X_{i+k}. That is, the neighborhood size is 2k + 1 for a k-neighborhood. For simplicity, k is set to 1 (Figure 2) in our algorithm. This means that there are 3 individuals in each learning group. Once groups are constructed, we can utilize them for updating the learners of the corresponding group.
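The ring grouping described above can be sketched as follows (the function name is illustrative, not from the paper):

```python
def ring_neighborhood(i, NP, k=1):
    """Indices of the k-neighborhood of learner i on a ring of NP learners.

    With k = 1 this returns the three members X_{i-1}, X_i, X_{i+1},
    wrapping around so the first and last learners are neighbors.
    """
    return [(i + offset) % NP for offset in range(-k, k + 1)]

print(ring_neighborhood(0, 20))   # -> [19, 0, 1]
print(ring_neighborhood(19, 20))  # -> [18, 19, 0]
```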
4.2. Teacher Phase. To balance the global and local search ability, a modified interactive learning strategy is proposed in teacher phase. In this learning phase, each learner employs an interactive learning strategy (the hybridization of the learning strategy of teacher phase in the standard TLBO and Gaussian sampling learning) based on neighborhood search.
In BBTLBO, the updating formula of the learning for a learner X_i in teacher phase is proposed as the hybridization of the teacher-phase learning strategy and the Gaussian sampling learning as follows:
V1_j(t + 1) = X_ij(t) + rand(0, 1) · (NTeacher_ij(t) - TF · NMean_ij(t))

V2_j(t + 1) = N((NTeacher_ij(t) + NMean_ij(t)) / 2, |NTeacher_ij(t) - NMean_ij(t)|)

newX_ij(t + 1) = u · V1_j(t + 1) + (1 - u) · V2_j(t + 1)   (11)
where u, called the hybridization factor, is a random number in the range [0, 1] for the jth dimension; NTeacher and NMean are the existing neighborhood best solution and the neighborhood mean solution of each learner; and TF is a teaching factor, which can be either 1 or 2, chosen randomly.

In BBTLBO, there is a (u × 100)% chance that the jth dimension of the ith learner in the population follows the behavior of the teacher-phase learning strategy, while the remaining ((1 - u) × 100)% follow the search behavior of the Gaussian sampling in teacher phase. This is helpful to balance the advantages of a fast convergence rate (the attraction of the teacher-phase learning strategy) and exploration (the Gaussian sampling) in BBTLBO.
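The teacher-phase update (11) can be sketched as follows (a minimal illustration with assumed names; this reads (11) literally as a per-dimension blend of V1 and V2 with weight u, while the text above also admits a per-dimension probabilistic reading):

```python
import random

def teacher_phase_update(x, n_teacher, n_mean, u=0.9):
    """One teacher-phase update of BBTLBO following (11), a sketch.

    x: the learner X_i; n_teacher / n_mean: neighborhood teacher and
    neighborhood mean (lists of equal length).  V1 is the standard TLBO
    teacher-phase move, V2 a Gaussian bare-bones sample, and the
    hybridization factor u blends them dimension by dimension.
    """
    tf = random.randint(1, 2)  # teaching factor TF, randomly 1 or 2
    new_x = []
    for j in range(len(x)):
        v1 = x[j] + random.random() * (n_teacher[j] - tf * n_mean[j])
        v2 = random.gauss((n_teacher[j] + n_mean[j]) / 2.0,
                          abs(n_teacher[j] - n_mean[j]))
        new_x.append(u * v1 + (1.0 - u) * v2)
    return new_x
```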
4.3. Learner Phase. At the same time, in the learner phase, a learner interacts randomly with other learners to enhance his or her knowledge in the class. This learning method can be treated as the global search strategy (shown in (3)).

In this paper, we introduce a new learning strategy in which, in learner phase, each learner learns from the neighborhood teacher and another learner selected randomly from his or her corresponding neighborhood. This learning method can be treated as the neighborhood search strategy. Let newX_i represent the interactive learning result of the learner X_i. This neighborhood search strategy can be expressed as follows:
newX_ij = X_ij + r1 · (NTeacher_ij - X_ij) + r2 · (X_ij - X_kj)   (12)
where r1 and r2 are random vectors in which each element is a random number in the range [0, 1], NTeacher is the teacher of the learner X_i's corresponding neighborhood, and the learner X_k is selected randomly from that neighborhood.

In BBTLBO, each learner probabilistically learns by means of the global search strategy or the neighborhood search strategy in learner phase. That is, about 50% of the learners in the population execute the learning strategy of learner phase in the standard TLBO (shown in (3)), while the remaining 50% execute the neighborhood search strategy (shown in (12)). This is helpful to balance the global search and local search in learner phase.
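The neighborhood search strategy (12) can be sketched as follows (names are illustrative; r1 and r2 are drawn independently per dimension):

```python
import random

def learner_phase_update(x, x_k, n_teacher):
    """Neighborhood search strategy of (12), a sketch.

    x: the learner X_i; x_k: a randomly selected neighbor X_k;
    n_teacher: the teacher of X_i's neighborhood.
    """
    return [x[j]
            + random.random() * (n_teacher[j] - x[j])   # pull toward NTeacher
            + random.random() * (x[j] - x_k[j])          # repel/attract w.r.t. X_k
            for j in range(len(x))]
```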
Moreover, compared to the original TLBO, BBTLBO only modifies the learning strategies. Therefore, both the original TLBO and BBTLBO have the same time complexity O(NP · D · Genmax), where NP is the size of the population, D is the number of dimensions, and Genmax is the maximum number of generations.
As explained above, the pseudocode for the implementation of BBTLBO is summarized in Algorithm 2.
5. Functions Optimization

In this section, to illustrate the effectiveness of the proposed method, 20 benchmark functions are used to test the efficiency of BBTLBO. To compare the search performance of BBTLBO with that of some other methods, other algorithms are also simulated in this paper.

5.1. Benchmark Functions. The details of the 20 benchmark functions are shown in Table 1. Among them, F1 to F9 are unimodal functions and F10 to F20 are multimodal functions. The search ranges and theoretical optima of all functions are also shown in Table 1.
5.2. Parameter Settings. All the experiments are carried out on the same machine with a Celeron 2.26 GHz CPU, 2 GB memory, and Windows XP operating system with MATLAB 7.9.
(1) Begin
(2)   Initialize N (number of learners), D (number of dimensions), and hybridization factor u
(3)   Initialize learners X and evaluate all learners X
(4)   while (stopping condition not met)
(5)     for each learner X_i of the class            % Teaching phase
(6)       TF = round(1 + rand(0, 1))
(7)       Determine the N_Teacher and the N_Mean in its neighborhood for each learner
(8)       Update each learner according to (11)
(9)       Accept newX_i if f(newX_i) is better than f(X_i)
(10)    endfor
(11)    for each learner X_i of the class            % Learning phase
(12)      Randomly select one learner X_k, such that i ≠ k
(13)      if rand(0, 1) < 0.5
(14)        Update each learner according to (3)
(15)      else
(16)        Determine the N_Teacher in its neighborhood for each learner
(17)        Update each learner according to (12)
(18)      endif
(19)      Accept newX_i if f(newX_i) is better than f(X_i)
(20)    endfor
(21)  endwhile
(22) end

Algorithm 2: BBTLBO( ).
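As a concrete illustration, the hybrid teacher phase of Algorithm 2 can be sketched in a few lines of Python. The exact update rules of Eqs. (11) and (12) appear earlier in the paper and are not reproduced in this chunk, so the Gaussian move below (sampled around the midpoint of the teacher and the mean, with their absolute difference as standard deviation) and the use of the whole population as the neighborhood are illustrative assumptions rather than the authors' exact formulas:

```python
import numpy as np

def bbtlbo_teacher_phase(X, fitness, f, u=0.9):
    """One teacher-phase sweep over the class (illustrative sketch).

    With probability u a learner takes a bare-bones Gaussian step centred
    between the teacher and the population mean; otherwise it applies the
    standard TLBO teacher-phase rule.  Greedy acceptance keeps the better
    of old and new positions, as in steps (9) of Algorithm 2.
    """
    N, D = X.shape
    for i in range(N):
        teacher = X[np.argmin(fitness)]      # best learner (minimization assumed)
        mean = X.mean(axis=0)
        if np.random.rand() < u:             # bare-bones Gaussian sampling
            mu = (teacher + mean) / 2.0
            sigma = np.abs(teacher - mean)
            new_x = np.random.normal(mu, sigma)
        else:                                # standard TLBO teacher phase
            TF = round(1 + np.random.rand()) # teaching factor, 1 or 2
            new_x = X[i] + np.random.rand(D) * (teacher - TF * mean)
        new_f = f(new_x)
        if new_f < fitness[i]:               # greedy acceptance
            X[i], fitness[i] = new_x, new_f
    return X, fitness
```

Because of the greedy acceptance step, the best fitness in the class can never get worse from one sweep to the next.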
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms, the population size is set to 20. All the population-based stochastic algorithms use the same stopping criterion, that is, reaching a fixed number of function evaluations (FEs).
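The protocol just described (independent restarts, mean and standard deviation over 50 runs, a fixed FE budget) amounts to a small harness like the following sketch; `optimizer` here is a hypothetical stand-in for any of the compared algorithms, and the 40000-FE stopping rule is assumed to live inside it:

```python
import numpy as np

def run_trials(optimizer, n_runs=50, seed0=0):
    """Repeat an optimizer and report mean/std of the best fitness found.

    `optimizer(rng)` is any callable that runs one full optimization
    within the fixed FE budget and returns the best objective value.
    Each run gets its own seeded generator so the runs are independent
    but reproducible.
    """
    results = []
    for r in range(n_runs):
        rng = np.random.default_rng(seed0 + r)
        results.append(optimizer(rng))
    results = np.asarray(results, dtype=float)
    return results.mean(), results.std()
```

The tables in this section report exactly these two numbers (mean ± std) per algorithm and function.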
5.3. Effect of Variation in Parameter u. The hybridization factor u is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed using these different values of u. In our experiment, the maximal number of FEs, namely 40000 for all test functions, is used as the termination condition. Table 2 shows the mean optimum solutions and the standard deviation of the solutions obtained using the different hybridization factors u in the 50 independent runs. The best results among the settings are shown in bold. Figure 3 presents representative convergence graphs of different benchmark functions in terms of the mean fitness values achieved using the different hybridization factors u. Due to tight space limitations, only some sample graphs are illustrated.
The comparisons in Table 2 and Figure 3 show that, when the hybridization factor u is set to 0.9, BBTLBO offers the best overall performance on the 20 test functions. Hence, the hybridization factor u is set to 0.9 in the following experiments.
5.4. Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section, we compare BBTLBO with five other recently proposed bare-bones algorithms: three bare-bones DE variants (BBDE [21], GBDE [22], and MGBDE [22]) and two bare-bones PSO variants (BBPSO [20] and BBExp [20]). Our experiment includes two series of comparisons, in terms of the solution accuracy and the solution convergence (convergence speed and success rate).
5.4.1. Comparisons of the Solution Accuracy. In our experiment, the maximal number of FEs, namely 40000 for all test functions, is used as the termination condition. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the six algorithms over 50 independent runs. Due to tight space limitations, only some sample graphs are illustrated.
From Table 3, it can be observed that all algorithms perform well for functions F15 and F17. Although BBExp performs better than BBTLBO on function F9 and MGBDE performs better than BBTLBO on function F20, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO delivers good solution accuracy on the test functions in this paper.
5.4.2. Comparison of the Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In our experiment, each algorithm is terminated when its best fitness value so far falls below the predefined threshold value (T value) or when the number of FEs reaches
Table 1: Details of numerical benchmarks used.

| Function | Formula | D | Range | Optima |
| --- | --- | --- | --- | --- |
| Sphere | F1(x) = Σ_{i=1}^{D} x_i^2 | 30 | [−100, 100] | 0 |
| Sum square | F2(x) = Σ_{i=1}^{D} i·x_i^2 | 30 | [−100, 100] | 0 |
| Quadric | F3(x) = Σ_{i=1}^{D} i·x_i^4 + random(0, 1) | 30 | [−1.28, 1.28] | 0 |
| Step | F4(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2 | 30 | [−100, 100] | 0 |
| Schwefel 1.2 | F5(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2 | 30 | [−100, 100] | 0 |
| Schwefel 2.21 | F6(x) = max{|x_i|, 1 ≤ i ≤ D} | 30 | [−100, 100] | 0 |
| Schwefel 2.22 | F7(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i| | 30 | [−10, 10] | 0 |
| Zakharov | F8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4 | 30 | [−100, 100] | 0 |
| Rosenbrock | F9(x) = Σ_{i=1}^{D−1} [100(x_i^2 − x_{i+1})^2 + (x_i − 1)^2] | 30 | [−2.048, 2.048] | 0 |
| Ackley | F10(x) = 20 − 20 exp(−(1/5)·sqrt((1/D) Σ_{i=1}^{D} x_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2πx_i)) + e | 30 | [−32, 32] | 0 |
| Rastrigin | F11(x) = Σ_{i=1}^{D} (x_i^2 − 10 cos(2πx_i) + 10) | 30 | [−5.12, 5.12] | 0 |
| Weierstrass | F12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{k_max} [a^k cos(2πb^k (x_i + 0.5))]) − D·Σ_{k=0}^{k_max} [a^k cos(2πb^k · 0.5)], with a = 0.5, b = 3, k_max = 20 | 30 | [−0.5, 0.5] | 0 |
| Griewank | F13(x) = Σ_{i=1}^{D} (x_i^2 / 4000) − Π_{i=1}^{D} cos(x_i / sqrt(i)) + 1 | 30 | [−600, 600] | 0 |
| Schwefel | F14(x) = 418.9829·D + Σ_{i=1}^{D} (−x_i sin(sqrt(|x_i|))) | 30 | [−500, 500] | 0 |
| Bohachevsky1 | F15(x) = x_1^2 + 2x_2^2 − 0.3 cos(3πx_1) − 0.4 cos(4πx_2) + 0.7 | 2 | [−100, 100] | 0 |
| Bohachevsky2 | F16(x) = x_1^2 + 2x_2^2 − 0.3 cos(3πx_1)·cos(4πx_2) + 0.3 | 2 | [−100, 100] | 0 |
| Bohachevsky3 | F17(x) = x_1^2 + 2x_2^2 − 0.3 cos(3πx_1 + 4πx_2) + 0.3 | 2 | [−100, 100] | 0 |
| Shekel5 | F18(x) = −Σ_{i=1}^{5} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.1532 |
| Shekel7 | F19(x) = −Σ_{i=1}^{7} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.4029 |
| Shekel10 | F20(x) = −Σ_{i=1}^{10} [(x − a_i)(x − a_i)^T + c_i]^{−1} | 4 | [0, 10] | −10.5364 |
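To give a feel for how such a test suite is implemented, three of the Table 1 benchmarks are transcribed directly below; any optimizer in this comparison can be run on top of functions like these (the vectorized NumPy form is our choice, not prescribed by the paper):

```python
import numpy as np

def sphere(x):
    """F1: Sphere, optimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2))

def rastrigin(x):
    """F11: Rastrigin, optimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    """F10: Ackley as written in Table 1 (note -1/5 = -0.2), optimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    D = x.size
    return float(20.0 - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / D))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / D) + np.e)
```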
the maximal FEs, 40000. The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the success rate (SR) over the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before reaching the maximum number of FEs. The best results among the six algorithms are shown in boldface.
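The MFEs/SR bookkeeping used in Table 4 can be reproduced as below. Averaging MFEs over the successful runs only is our reading of the table (the paper does not state how failed runs enter the average), so treat that detail as an assumption:

```python
import numpy as np

def mfes_and_sr(runs, threshold, max_fes=40000):
    """Summarise convergence runs the way Table 4 does.

    `runs` is a list of (fes_used, best_value) pairs, one per independent
    run.  A run is 'successful' if its best value reached `threshold`
    within the FE cap; MFEs averages the FEs of successful runs only and
    is NaN when no run succeeded.  SR is reported in percent.
    """
    ok = [fes for fes, best in runs if best <= threshold and fes <= max_fes]
    sr = 100.0 * len(ok) / len(runs)
    mfes = float(np.mean(ok)) if ok else float('nan')
    return mfes, sr
```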
From Table 4, it can be observed that most algorithms hardly converge to the threshold for the unimodal functions F3, F5, F6, and F8 and the multimodal functions F11, F12, and F14. BBTLBO converges to the threshold for all functions except F3, F9, and F14. In terms of average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and on the majority of multimodal functions, with the exception of F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly above 10 for functions F1, F2, F4, F7, F10, and F13. In terms of SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO shows good convergence speed and success rate (SR) on the test functions in this paper.
Table 2: Comparisons (mean ± std) of the solutions using different u.

| Fun | BBTLBO (u=0.0) | BBTLBO (u=0.1) | BBTLBO (u=0.3) | BBTLBO (u=0.5) | BBTLBO (u=0.7) | BBTLBO (u=0.9) | BBTLBO (u=1.0) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| F1 | 1.75e−001±1.21e+000 | 6.89e−071±1.01e−070 | 1.23e−163±0.0 | 1.21e−256±0.0 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F2 | 8.98e−005±5.73e−004 | 5.62e−069±2.72e−068 | 2.20e−161±1.12e−160 | 2.43e−254±0.0 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F3 | 1.20e−001±6.34e−002 | 5.91e−003±1.44e−003 | 1.01e−003±3.48e−004 | 4.35e−004±1.97e−004 | 2.35e−004±1.30e−004 | 2.27e−004±1.26e−004 | **1.99e−004±1.13e−004** |
| F4 | 7.65e+002±5.83e+002 | 4.80e−001±8.86e−001 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F5 | 5.58e+002±6.53e+002 | 1.87e−028±5.73e−028 | 3.53e−054±1.86e−053 | 3.69e−073±2.27e−072 | 9.53e−096±6.74e−095 | **2.16e−115±1.10e−114** | 2.56e−100±1.30e−099 |
| F6 | 2.51e+001±5.34e+000 | 6.67e−021±8.81e−021 | 2.81e−061±6.36e−061 | 8.22e−100±1.80e−099 | 8.18e−137±1.41e−136 | **3.63e−154±1.34e−153** | 8.86e−147±3.22e−146 |
| F7 | 1.37e−003±9.54e−003 | 8.72e−043±1.52e−042 | 5.68e−088±8.76e−088 | 1.01e−133±2.38e−133 | 2.60e−175±0.0 | **1.16e−188±0.0** | 8.33e−180±0.0 |
| F8 | 2.41e+000±3.07e+000 | 1.32e−019±2.98e−019 | 2.13e−028±7.69e−028 | 3.44e−037±1.24e−036 | 2.20e−050±9.12e−050 | **1.07e−056±4.39e−056** | 2.03e−049±8.94e−049 |
| F9 | **2.66e+001±1.79e+000** | 2.72e+001±3.17e−001 | 2.77e+001±3.18e−001 | 2.83e+001±2.78e−001 | 2.84e+001±2.67e−001 | 2.83e+001±3.41e−001 | 2.80e+001±3.87e−001 |
| F10 | 8.30e+000±1.76e+000 | 1.77e−001±6.10e−001 | 5.90e−015±1.70e−015 | **3.55e−015±0.0** | **3.55e−015±0.0** | **3.55e−015±0.0** | **3.55e−015±0.0** |
| F11 | 3.74e+001±9.05e+000 | 3.33e+001±1.18e+001 | 2.71e+001±8.00e+000 | 1.89e+001±1.14e+001 | 5.73e+000±1.06e+001 | **0.0±0.0** | **0.0±0.0** |
| F12 | 8.15e+000±1.93e+000 | 3.38e−001±1.16e+000 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F13 | 5.06e−001±8.08e−001 | 6.52e−003±8.86e−003 | 1.78e−003±3.68e−003 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F14 | **4.33e+003±6.79e+002** | 4.67e+003±6.10e+002 | 5.17e+003±6.68e+002 | 5.59e+003±6.85e+002 | 5.53e+003±7.10e+002 | 5.58e+003±7.80e+002 | 5.40e+003±6.53e+002 |
| F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F16 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F18 | −7.71e+000±3.47e+000 | −8.06e+000±3.39e+000 | −9.64e+000±1.81e+000 | −9.65e+000±1.76e+000 | **−1.02e+001±6.77e−003** | −9.85e+000±1.22e+000 | −9.93e+000±1.12e+000 |
| F19 | −7.69e+000±3.52e+000 | −8.13e+000±3.36e+000 | −9.87e+000±1.83e+000 | **−1.03e+001±9.45e−001** | −9.76e+000±1.95e+000 | −9.82e+000±1.78e+000 | −9.61e+000±1.99e+000 |
| F20 | −8.12e+000±3.53e+000 | −9.38e+000±2.69e+000 | −1.01e+001±1.65e+000 | **−1.01e+001±1.61e+000** | −9.70e+000±2.28e+000 | −9.41e+000±2.43e+000 | −1.00e+001±1.69e+000 |
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

| Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO |
| --- | --- | --- | --- | --- | --- | --- |
| F1 | 5.44e−027±1.87e−026 | 2.62e−024±5.00e−024 | 3.90e−035±2.00e−034 | 4.35e−022±1.13e−021 | 3.35e−035±2.11e−034 | **0.0±0.0** |
| F2 | 1.38e+004±2.11e+004 | 1.00e+003±4.63e+003 | 6.20e−021±4.38e−020 | 1.40e+003±4.52e+003 | 1.28e−032±8.37e−032 | **0.0±0.0** |
| F3 | 1.32e+000±3.18e+000 | 2.22e−002±7.55e−003 | 1.64e−002±9.57e−003 | 2.49e−002±9.88e−003 | 1.16e−002±5.26e−003 | **2.27e−004±1.26e−004** |
| F4 | 5.60e+000±9.28e+000 | 9.60e−001±4.27e+000 | 7.89e+001±3.05e+002 | 8.40e−001±9.12e−001 | 1.08e+000±1.28e+000 | **0.0±0.0** |
| F5 | 1.24e+004±6.66e+003 | 4.41e+003±3.37e+003 | 2.09e+000±4.00e+000 | 5.36e+003±3.26e+003 | 7.57e+002±1.16e+003 | **2.16e−115±1.10e−114** |
| F6 | 1.67e+001±9.19e+000 | 1.20e+000±5.22e−001 | 1.39e+001±4.47e+000 | 3.60e−001±1.95e−001 | 1.10e+000±2.94e+000 | **3.63e−154±1.34e−153** |
| F7 | 2.34e+001±1.32e+001 | 1.00e+000±3.03e+000 | 4.06e−019±2.15e−018 | 6.00e−001±2.40e+000 | 2.00e−001±1.41e+000 | **1.16e−188±0.0** |
| F8 | 1.87e+002±1.34e+002 | 1.58e+002±7.00e+001 | 1.16e−001±2.35e−001 | 1.72e+002±6.67e+001 | 2.49e+001±1.99e+001 | **1.07e−056±4.39e−056** |
| F9 | 7.07e+001±1.48e+002 | **3.57e+001±2.50e+001** | 2.76e+001±1.06e+001 | 3.17e+001±2.07e+001 | 2.76e+001±1.46e+001 | 2.83e+001±3.41e−001 |
| F10 | 1.06e+001±9.29e+000 | 1.52e+000±5.11e+000 | 1.34e+000±1.15e+000 | 2.59e+000±6.45e+000 | 5.54e−001±2.79e+000 | **3.55e−015±0.0** |
| F11 | 1.16e+002±3.53e+001 | 1.81e+001±7.28e+000 | 6.76e+001±3.89e+001 | 1.55e+001±5.96e+000 | 2.03e+001±9.23e+000 | **0.0±0.0** |
| F12 | 2.73e+000±2.11e+000 | 1.20e−001±4.42e−001 | 1.73e+000±1.32e+000 | 1.21e−001±3.37e−001 | 5.17e−001±8.67e−001 | **0.0±0.0** |
| F13 | 2.14e−002±4.11e−002 | 2.30e−003±4.29e−003 | 4.07e−002±4.89e−002 | 3.08e−003±7.42e−003 | 4.63e−003±7.16e−003 | **0.0±0.0** |
| F14 | 3.64e+003±6.28e+002 | 2.58e+003±5.51e+002 | 2.30e+003±4.09e+002 | 2.49e+003±5.41e+002 | 2.60e+003±5.05e+002 | **5.58e+003±7.80e+002** |
| F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F16 | 4.37e−003±3.09e−002 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F18 | −5.60e+000±3.41e+000 | −7.90e+000±2.74e+000 | −7.09e+000±3.33e+000 | −7.63e+000±2.86e+000 | −8.01e+000±3.00e+000 | **−9.85e+000±1.22e+000** |
| F19 | −5.97e+000±3.31e+000 | −7.87e+000±3.03e+000 | −6.21e+000±3.66e+000 | −8.60e+000±2.68e+000 | −8.37e+000±2.90e+000 | **−9.82e+000±1.78e+000** |
| F20 | −5.81e+000±3.65e+000 | −9.40e+000±2.42e+000 | −6.02e+000±3.77e+000 | **−9.46e+000±2.24e+000** | −9.38e+000±2.51e+000 | −9.41e+000±2.43e+000 |
[Figure 3: Comparison of the performance curves using different u. Panels: (a) F7 Schwefel 2.22, (b) F8 Zakharov, (c) F18 Shekel5, and (d) F11 Rastrigin; panels (a)-(c) plot log10(mean fitness) and panel (d) mean fitness versus FEs (×10^4), for u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0.]
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiment, the maximal number of FEs, namely 40000 for all test functions, is used as the stopping criterion of all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.
The comparisons in Table 5 show that all algorithms perform well for F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" results show that BBTLBO offers good accuracy on the majority of the test functions in this paper.
[Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO). Panels: (a) F3 Quadric, (b) F9 Rosenbrock, (c) F18 Shekel5, and (d) F14 Schwefel; panels (a)-(c) plot log10(mean fitness) and panel (d) mean fitness versus FEs (×10^4).]
Table 5 indicates that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most of the complex multimodal optimization problems.
6. Two Real-World Optimization Problems

In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer
[Figure 5: BBTLBO-based ANN. The input x is fed to the ANN, whose output y is compared with the desired output d; the BBTLBO algorithm adjusts the network based on the resulting error.]
feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which in turn are connected to all
Table 4: The mean number of FEs (MFEs) and success rate (SR, %) with acceptable solutions using different algorithms; each cell gives MFEs / SR.

| Fun | T value | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO |
| --- | --- | --- | --- | --- | --- | --- | --- |
| F1 | 1E−8 | 15922 / 100 | 17727 / 100 | 11042 / 100 | 19214 / 100 | 11440 / 100 | 1390 / 100 |
| F2 | 1E−8 | 17515 / 54 | 19179 / 94 | 12243 / 100 | 20592 / 90 | 12634 / 100 | 1500 / 100 |
| F3 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 |
| F4 | 1E−8 | 11710 / 24 | 8120 / 84 | 3634 / 6 | 7343 / 40 | 4704 / 34 | 525 / 100 |
| F5 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 4100 / 100 |
| F6 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2603 / 100 |
| F7 | 1E−8 | 17540 / 6 | 21191 / 90 | 17314 / 100 | 22684 / 94 | 15322 / 98 | 2144 / 100 |
| F8 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 9286 / 100 |
| F9 | 1E−2 | 17073 / 62 | 18404 / 42 | 14029 / 24 | 18182 / 52 | 17200 / 80 | NaN / 0 |
| F10 | 1E−8 | 24647 / 26 | 27598 / 90 | 18273 / 26 | 29172 / 82 | 18320 / 84 | 2110 / 100 |
| F11 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2073 / 100 |
| F12 | 1E−8 | NaN / 0 | 25465 / 50 | NaN / 0 | 27317 / 64 | 19704 / 24 | 2471 / 100 |
| F13 | 1E−8 | 16318 / 32 | 21523 / 58 | 11048 / 16 | 22951 / 64 | 14786 / 58 | 1470 / 100 |
| F14 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 |
| F15 | 1E−8 | 658 / 100 | 1176 / 100 | 1274 / 100 | 1251 / 100 | 1206 / 100 | 799 / 100 |
| F16 | 1E−8 | 657 / 98 | 1251 / 100 | 1294 / 100 | 1343 / 100 | 1308 / 100 | 813 / 100 |
| F17 | 1E−8 | 995 / 100 | 2626 / 100 | 1487 / 100 | 2759 / 100 | 1921 / 100 | 973 / 100 |
| F18 | −10.15 | 1752 / 34 | 6720 / 44 | 2007 / 52 | 4377 / 32 | 8113 / 64 | 1684 / 94 |
| F19 | −10.40 | 2839 / 34 | 8585 / 48 | 1333 / 42 | 6724 / 50 | 3056 / 66 | 2215 / 90 |
| F20 | −10.53 | 1190 / 36 | 8928 / 74 | 1115 / 40 | 6548 / 76 | 5441 / 80 | 2822 / 82 |
the outputs. The variables consist of the neural network weights and biases. For a three-layer forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is given as follows:

L = (M + 1) × N + (N + 1) × K.   (13)
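Eq. (13) simply counts weights plus one bias per hidden and output unit; as a one-line check (the function name is ours):

```python
def num_ann_variables(M, N, K):
    """Eq. (13): number of weights and biases of a three-layer
    feed-forward network with M inputs, N hidden units, K outputs."""
    return (M + 1) * N + (N + 1) * K
```

For the 1-5-1 network used later in this section, this gives the 16 variables quoted in the text.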
For neural network training, the aim is to find the set of weights with the smallest error measure. Here, the objective function is the mean sum of squared errors (MSE) over all training patterns, which is given as follows:

MSE = (1 / (Q × K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij − y_ij)^2,   (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(−2x).   (15)

In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. The number (dimension) of variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the two methods. Figure 6 shows the predicted time series for training and testing using the different algorithms. It can be concluded that the approximation achieved by BBTLBO performs well.
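The target curve of Eq. (15) and the MSE of Eq. (14) for this single-output case (K = 1) can be sketched as below. The paper does not state the sampling interval of the 200 training pairs, so the range [0, 4] used here is an assumption for illustration only:

```python
import numpy as np

def target(x):
    """Eq. (15): the nonlinear curve the ANN has to approximate."""
    return np.sin(2.0 * x) * np.exp(-2.0 * x)

def mse(d, y):
    """Eq. (14) specialised to a single output unit (K = 1)."""
    d, y = np.asarray(d, dtype=float), np.asarray(y, dtype=float)
    Q = d.size
    return float(np.sum(0.5 * (d - y) ** 2) / Q)

# 200 training pairs sampled from the real model; the interval is assumed.
x_train = np.linspace(0.0, 4.0, 200)
d_train = target(x_train)
```

A BBTLBO-trained network would be evaluated by decoding its 16 variables into weights, running the forward pass on `x_train`, and scoring the result with `mse` against `d_train`.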
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] − e[k−1]) / Δt,   (16)
where u[k] is the controller output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE) cost, which is given as follows [32, 33]:

f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t)) dt + ω_3 t_r.   (17)
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

| Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO |
| --- | --- | --- | --- | --- | --- | --- |
| F1 | 3.63e−025±1.85e−024 | 7.65e−025±3.34e−024 | 9.23e−018±3.03e−017 | 1.01e−002±5.48e−003 | 3.05e−189±0.0 | **0.0±0.0** |
| F2 | 1.49e−023±6.69e−023 | 2.75e−025±1.08e−024 | 3.68e−017±5.37e−017 | 1.08e−001±5.05e−002 | 1.29e−185±0.0 | **0.0±0.0** |
| F3 | 3.22e−002±2.83e−002 | 2.08e−002±1.18e−002 | 1.28e−002±5.50e−003 | 1.86e−002±4.39e−003 | 5.70e−004±2.37e−004 | **2.27e−004±1.26e−004** |
| F4 | 2.11e+001±6.74e+001 | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** | **0.0±0.0** |
| F5 | 1.22e+002±1.37e+002 | 4.28e+001±2.59e+001 | 1.17e+001±9.30e+000 | 2.60e+003±6.79e+002 | 9.45e−043±6.47e−042 | **2.16e−115±1.10e−114** |
| F6 | 3.06e+001±8.50e+000 | 2.45e+000±2.60e+000 | 4.67e−001±2.82e−001 | 2.66e+000±5.58e−001 | 2.08e−078±4.30e−078 | **3.63e−154±1.34e−153** |
| F7 | 8.28e−019±3.49e−018 | 5.40e−016±3.81e−015 | 1.34e−011±1.27e−011 | 1.70e−002±2.85e−003 | 3.84e−096±5.53e−096 | **1.16e−188±0.0** |
| F8 | 2.16e+000±4.16e+000 | 4.88e−001±5.82e−001 | 9.60e−002±6.99e−002 | 5.86e+001±1.70e+001 | 7.09e−022±4.99e−021 | **1.07e−056±4.39e−056** |
| F9 | 2.49e+001±1.05e+001 | 2.61e+001±1.07e+000 | **2.40e+001±1.52e+000** | 2.65e+001±3.54e−001 | 2.55e+001±5.01e−001 | 2.83e+001±3.41e−001 |
| F10 | 5.05e−001±7.06e−001 | 2.07e−001±4.58e−001 | 1.94e−001±4.56e−001 | 2.16e−002±4.37e−003 | 3.62e−015±5.02e−016 | **3.55e−015±0.0** |
| F11 | 2.03e+000±1.94e+000 | 3.86e+000±1.97e+000 | 4.26e+001±1.06e+001 | 1.15e+002±1.54e+001 | 1.55e+001±8.09e+000 | **0.0±0.0** |
| F12 | 2.88e−002±1.45e−001 | 6.50e−002±1.87e−001 | 7.89e−001±1.03e+000 | 1.36e+000±7.41e−001 | **0.0±0.0** | **0.0±0.0** |
| F13 | 1.87e−002±3.58e−002 | 1.18e−002±1.75e−002 | 1.16e−002±1.58e−002 | 1.06e−001±9.93e−002 | **0.0±0.0** | **0.0±0.0** |
| F14 | 1.93e+002±1.42e+002 | **1.35e+002±1.26e+002** | 4.49e+003±8.25e+002 | 3.96e+003±8.40e+002 | 4.82e+003±6.86e+002 | 5.58e+003±7.80e+002 |
| F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F16 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 |
| F18 | −9.40e+000±2.10e+000 | −9.25e+000±2.30e+000 | −7.76e+000±3.42e+000 | −9.79e+000±1.44e+000 | −9.72e+000±1.42e+000 | **−9.85e+000±1.22e+000** |
| F19 | −9.85e+000±1.90e+000 | −9.87e+000±1.83e+000 | −9.24e+000±2.70e+000 | **−1.04e+001±4.23e−009** | −9.22e+000±2.41e+000 | −9.82e+000±1.78e+000 |
| F20 | −9.65e+000±2.23e+000 | −1.01e+001±1.59e+000 | −9.63e+000±2.50e+000 | **−1.05e+001±1.01e−004** | −9.65e+000±2.23e+000 | −9.41e+000±2.43e+000 |
| w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | — |
[Figure 6: Comparison of the performance curves using different algorithms (TLBO and BBTLBO): (a) convergence curves (best value versus generation), (b) approximation curves (actual versus predicted y(t) over the sample number), and (c) error curves.]
Table 6: Comparisons between BBTLBO and the other algorithm on MSE.

| Algorithm | Training error (Mean) | Training error (Std) | Testing error (Mean) | Testing error (Std) |
| --- | --- | --- | --- | --- |
| TLBO | 9.85e−004 | 9.26e−004 | 9.43e−004 | 9.18e−004 |
| BBTLBO | 3.45e−004 | 2.02e−004 | 2.76e−004 | 1.82e−004 |
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function; that is, once overshoot occurs, the value of the overshoot is added to the cost function, which is then given as follows [32, 33]:

if dy(t) < 0:
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t) + ω_4 |dy(t)|) dt + ω_3 t_r
else:
    f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u^2(t)) dt + ω_3 t_r,
(18)
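In a sampled-data simulation, the penalised cost of Eq. (18) becomes a finite sum over the recorded error, control, and output sequences; the sketch below is a direct discrete-time transcription (the function name and argument layout are ours):

```python
import numpy as np

def pid_cost(e, u, y, dt, t_r, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discrete-time version of the penalised cost, Eq. (18).

    Whenever the output decreases (dy(t) = y(t) - y(t-1) < 0, the
    paper's overshoot flag), w4 * |dy| is added to the integrand.
    The integral is approximated by a rectangle-rule sum with step dt.
    """
    e, u, y = np.asarray(e, float), np.asarray(u, float), np.asarray(y, float)
    dy = np.diff(y, prepend=y[0])            # y(t) - y(t-1), first step 0
    integrand = w1 * np.abs(e) + w2 * u ** 2
    integrand = integrand + np.where(dy < 0, w4 * np.abs(dy), 0.0)
    return float(np.sum(integrand) * dt + w3 * t_r)
```

With a perfectly tracked, monotone response the cost reduces to the rise-time term w3 * t_r alone.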
Table 7: Comparisons of parameters of PID controllers using different algorithms.

| Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65000 | 1.05000 | 16.34555 | 7.05900 |
| PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65000 | 0.65000 | 11.60773 | 6.91000 |
| BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80000 | 0.70000 | 11.34300 | 7.04500 |
[Figure 7: Performance curves (best cost value versus generation) using different methods (GA, PSO, and BBTLBO).]
where ω_4 is a penalty coefficient with ω_4 ≫ ω_1, dy(t) = y(t) − y(t−1), and y(t) is the output of the controlled objective.

In our simulation, the transfer function of the plant examined is given as follows [34]:

G(s) = 1958 / (s^3 + 1789 s^2 + 1033 s + 1908).   (19)

The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. Other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2.0, and ω_4 = 100 in this example.
In the simulations the step response of PID controlsystem tuned by the proposed BBTLBO is compared withthat tuned by the standard genetic algorithm (GA) and thestandard PSO (PSO) The population sizes of GA PSO andBBTLBO are 50 and the corresponding maximum numbersof iterations are 50 50 and 50 respectively In addition thecrossover rate is set as 090 and the mutation rate is 010 forGA
The optimal parameters and the corresponding perfor-mance values of the PID controllers are listed in Table 7 andthe corresponding performance curves and step responsescurves are given in Figures 7 and 8 It can be seen fromFigure 7 and Table 7 that the PID controller tuned byBBTLBO has the minimum cost function and CPU time
[Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).]
Although the PID controllers tuned by PSO have a smaller peak time and rise time, their maximum overshoot is much larger than that of the controller tuned by BBTLBO. This indicates that the PID controller tuned by BBTLBO achieves the best control performance in the simulations.
7 Conclusion
In this paper, TLBO has been extended to BBTLBO, which uses a hybridization of the learning strategy of the standard TLBO and Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is used to optimize 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time per generation than the standard TLBO. Compared with the other algorithms on the 20 chosen test problems, BBTLBO shows good performance, using neighborhood search effectively to generate better-quality solutions, although it does not always perform best in all the experimental cases of this paper. It can also be observed that BBTLBO
16 The Scientific World Journal
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Θ-multiobjective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
Figure 2: Ring neighborhood topology with three members (X_{i−1}, X_i, and X_{i+1}).

its k-neighborhood consists of the 2k + 1 individuals X_{i−k}, ..., X_i, ..., X_{i+k} (including X_i itself). That is, the neighborhood size is 2k + 1 for a k-neighborhood. For simplicity, k is set to 1 (Figure 2) in our algorithm. This means that there are 3 individuals in each learning group. Once the groups are constructed, we can use them to update the learners of the corresponding group.
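The wrap-around indexing of the ring topology can be sketched as follows (a minimal illustration; the function name `ring_neighborhood` is ours, not from the paper):

```python
def ring_neighborhood(i, pop_size, k=1):
    """Indices of the 2k + 1 members (including learner i itself) of
    learner i's ring neighborhood, wrapping around at both ends."""
    return [(i + offset) % pop_size for offset in range(-k, k + 1)]
```

For example, with a population of 20 and k = 1, learner 0's group is [19, 0, 1], so the first and last learners are neighbors and the topology closes into a ring.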
4.2 Teacher Phase. To balance global and local search ability, a modified interactive learning strategy is proposed for the teacher phase. In this phase, each learner employs an interactive learning strategy (the hybridization of the teacher-phase learning strategy of the standard TLBO and Gaussian sampling learning) based on neighborhood search.
In BBTLBO, the updating formula for a learner X_i in the teacher phase is the hybridization of the teacher-phase learning strategy and the Gaussian sampling learning, as follows:

V1_j(t + 1) = X_ij(t) + rand(0, 1) · (NTeacher_ij(t) − TF · NMean_ij(t)),

V2_j(t + 1) = N((NTeacher_ij(t) + NMean_ij(t)) / 2, |NTeacher_ij(t) − NMean_ij(t)|),

newX_ij(t + 1) = u · V1_j(t + 1) + (1 − u) · V2_j(t + 1),   (11)
where u, called the hybridization factor, is a number in the range [0, 1] applied to the jth dimension; NTeacher and NMean are the neighborhood best solution and the neighborhood mean solution of each learner; and TF is a teaching factor, which is randomly either 1 or 2.
In BBTLBO, there is a (u × 100)% chance that the jth dimension of the ith learner in the population follows the behavior of the teacher-phase learning strategy, while the remaining ((1 − u) × 100)% follows the search behavior of the Gaussian sampling in the teacher phase. This helps balance the fast convergence of the teacher-phase learning strategy (the attraction toward the teacher) against the exploration of the Gaussian sampling in BBTLBO.
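A per-learner sketch of the hybrid update in (11), written in Python with NumPy (an illustration under our own naming; the paper's experiments were run in MATLAB):

```python
import numpy as np

def teacher_phase_update(x, n_teacher, n_mean, u=0.9, rng=None):
    """Hybrid teacher-phase update of (11): a TLBO-style move toward the
    neighborhood teacher (V1) blended with Gaussian bare-bones sampling (V2).
    x, n_teacher, and n_mean are D-dimensional NumPy vectors."""
    rng = np.random.default_rng() if rng is None else rng
    tf = rng.integers(1, 3)                       # teaching factor TF in {1, 2}
    v1 = x + rng.random(x.size) * (n_teacher - tf * n_mean)
    v2 = rng.normal((n_teacher + n_mean) / 2.0,   # Gaussian sample centered between
                    np.abs(n_teacher - n_mean))   # teacher and mean, spread = their gap
    return u * v1 + (1.0 - u) * v2
```

With u = 0 the move reduces to pure Gaussian sampling, and with u = 1 it reduces to the teacher-phase step restricted to the neighborhood, which is how the hybridization factor trades exploration against fast convergence.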
4.3 Learner Phase. In the learner phase, a learner interacts randomly with other learners to enhance his or her knowledge in the class. This learning method can be treated as the global search strategy (shown in (3)).
In this paper, we introduce a new learning strategy for the learner phase, in which each learner learns from the neighborhood teacher and from another learner selected randomly from his or her corresponding neighborhood. This learning method can be treated as the neighborhood search strategy. Let newX_i represent the interactive learning result of the learner X_i. This neighborhood search strategy can be expressed as follows:
newX_ij = X_ij + r1 · (NTeacher_ij − X_ij) + r2 · (X_ij − X_kj),   (12)
where r1 and r2 are random vectors in which each element is a random number in the range [0, 1], NTeacher is the teacher of the learner X_i's corresponding neighborhood, and the learner X_k is selected randomly from that neighborhood.

In BBTLBO, each learner probabilistically learns by means of either the global search strategy or the neighborhood search strategy in the learner phase. That is, about 50% of the learners in the population execute the learner-phase learning strategy of the standard TLBO (shown in (3)), while the remaining 50% execute the neighborhood search strategy (shown in (12)). This helps balance global search and local search in the learner phase.
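The neighborhood search step of (12) can be sketched likewise (our naming; elementwise random coefficients play the role of the random vectors r1 and r2):

```python
import numpy as np

def learner_phase_update(x, x_other, n_teacher, rng=None):
    """Neighborhood search step of (12): move toward the neighborhood
    teacher while also exploiting the difference from a randomly
    chosen neighbor x_other."""
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x.size)
    r2 = rng.random(x.size)
    return x + r1 * (n_teacher - x) + r2 * (x - x_other)
```

In BBTLBO this step is applied with probability 0.5; otherwise the learner uses the standard TLBO learner-phase step of (3), defined earlier in the paper.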
Moreover, compared to the original TLBO, BBTLBO only modifies the learning strategies. Therefore, the original TLBO and BBTLBO have the same time complexity O(NP · D · Gen_max), where NP is the population size, D is the number of dimensions, and Gen_max is the maximum number of generations.
As explained above, the pseudocode for the implementation of BBTLBO is summarized in Algorithm 2.
5 Functions Optimization
In this section, 20 benchmark functions are used to test the efficiency of the proposed BBTLBO. To compare the search performance of BBTLBO with that of other methods, several other algorithms are also simulated in this paper.
5.1 Benchmark Functions. The details of the 20 benchmark functions are shown in Table 1. Among them, F1 to F9 are unimodal functions and F10 to F20 are multimodal functions. The search ranges and theoretical optima of all functions are also shown in Table 1.
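Three of the Table 1 benchmarks, written out in Python as a cross-check of the formulas (the function names are ours):

```python
import numpy as np

def sphere(x):        # F1: sum of squares, optimum 0 at x = 0
    return float(np.sum(x ** 2))

def rastrigin(x):     # F11: highly multimodal, optimum 0 at x = 0
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):        # F10: note the -(1/5) factor inside the first exp
    d = x.size
    return float(20.0
                 - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d)
                 + np.e)
```

All three evaluate to 0 at the origin, matching the theoretical optima listed in Table 1.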
5.2 Parameter Settings. All experiments are carried out on the same machine with a Celeron 2.26 GHz CPU, 2 GB of memory, and the Windows XP operating system with MATLAB 7.9.
(1) Begin
(2) Initialize N (number of learners), D (number of dimensions), and the hybridization factor u
(3) Initialize the learners X and evaluate all learners X
(4) while (stopping condition not met)
(5)  for each learner X_i of the class % Teaching phase
(6)   TF = round(1 + rand(0, 1))
(7)   Determine the NTeacher and the NMean in its neighborhood for each learner
(8)   Update each learner according to (11)
(9)   Accept newX_i if f(newX_i) is better than f(X_i)
(10)  endfor
(11)  for each learner X_i of the class % Learning phase
(12)   Randomly select another learner X_k, such that i ≠ k
(13)   if rand(0, 1) < 0.5
(14)    Update the learner according to (3)
(15)   else
(16)    Determine the NTeacher in its neighborhood for each learner
(17)    Update the learner according to (12)
(18)   endif
(19)   Accept newX_i if f(newX_i) is better than f(X_i)
(20)  endfor
(21) endwhile
(22) end

Algorithm 2: BBTLBO().
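Algorithm 2 can be rendered as a compact Python sketch (a minimal, unoptimized illustration under our own naming; clipping to the search bounds is our assumption, and the `if` branch uses the standard TLBO learner step of (3), defined earlier in the paper):

```python
import numpy as np

def bbtlbo(f, dim, bounds, pop_size=20, max_fes=40000, u=0.9, seed=None):
    """Sketch of the BBTLBO loop: ring neighborhoods (k = 1), the hybrid
    teacher phase of (11), and a 50/50 mix of the standard TLBO learner
    step (3) and the neighborhood search step (12), with greedy acceptance."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in X])
    fes = pop_size
    while fes < max_fes:
        for i in range(pop_size):                        # teaching phase
            nb = [(i - 1) % pop_size, i, (i + 1) % pop_size]
            teacher = X[nb[np.argmin(fit[nb])]]
            mean = X[nb].mean(axis=0)
            tf = rng.integers(1, 3)                      # TF in {1, 2}
            v1 = X[i] + rng.random(dim) * (teacher - tf * mean)
            v2 = rng.normal((teacher + mean) / 2, np.abs(teacher - mean))
            new = np.clip(u * v1 + (1 - u) * v2, lo, hi)
            fn = f(new); fes += 1
            if fn < fit[i]:
                X[i], fit[i] = new, fn
        for i in range(pop_size):                        # learning phase
            k = rng.choice([j for j in range(pop_size) if j != i])
            if rng.random() < 0.5:                       # standard step (3)
                sign = 1.0 if fit[i] < fit[k] else -1.0
                new = X[i] + sign * rng.random(dim) * (X[i] - X[k])
            else:                                        # neighborhood step (12)
                nb = [(i - 1) % pop_size, i, (i + 1) % pop_size]
                teacher = X[nb[np.argmin(fit[nb])]]
                new = (X[i] + rng.random(dim) * (teacher - X[i])
                       + rng.random(dim) * (X[i] - X[k]))
            new = np.clip(new, lo, hi)
            fn = f(new); fes += 1
            if fn < fit[i]:
                X[i], fit[i] = new, fn
    best = int(np.argmin(fit))
    return X[best], float(fit[best])
```

The greedy acceptance in both phases mirrors lines (9) and (19) of Algorithm 2, so the best-so-far fitness is monotonically nonincreasing.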
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms, the population size was set to 20. All population-based stochastic algorithms use the same stopping criterion, that is, reaching a certain number of function evaluations (FEs).
5.3 Effect of Variation in Parameter u. The hybridization factor u is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed using these different values of u. In our experiment, a maximum of 40,000 FEs is used as the termination condition for all test functions. Table 2 shows the mean optimum solutions and the standard deviations of the solutions obtained using the different hybridization factors u in the 50 independent runs. The best results are shown in bold. Figure 3 presents representative convergence graphs, in terms of the mean fitness values achieved using the different hybridization factors u on the test functions. Due to tight space limitations, only some sample graphs are illustrated.
The comparisons in Table 2 and Figure 3 show that BBTLBO offers the best overall performance on the 20 test functions when the hybridization factor u is set to 0.9. Hence, u is set to 0.9 in the following experiments.
5.4 Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section, we compare BBTLBO with five other recently proposed bare-bones algorithms: three bare-bones DE variants and two bare-bones PSO variants. Our experiment includes two series of comparisons, in terms of solution accuracy and solution convergence (convergence speed and success rate). We compared the performance of BBTLBO with that of similar bare-bones algorithms, including BBPSO [20], BBExp [20], BBDE [21], GBDE [22], and MGBDE [22].
5.4.1 Comparisons of Solution Accuracy. In our experiment, a maximum of 40,000 FEs is used as the termination condition for all test functions. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the compared algorithms over the 50 independent runs. Due to tight space limitations, only some sample graphs are illustrated.
From Table 3, it can be observed that all algorithms perform well, in both mean optimum solution and standard deviation, on functions F15 and F17. Although BBExp performs better than BBTLBO on function F9 and MGBDE performs better than BBTLBO on function F20, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO attains good solution accuracy on the test functions in this paper.
5.4.2 Comparison of Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of the different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In our experiment, the stopping criterion is that each algorithm is terminated when its best fitness value so far falls below the predefined threshold value (T Value) or when the number of FEs reaches
Table 1: Details of the numerical benchmarks used (D: dimension; Range: search range; Optimum: theoretical optimum).

Sphere: F1(x) = Σ_{i=1}^{D} x_i^2; D = 30; Range [-100, 100]; Optimum 0
Sum square: F2(x) = Σ_{i=1}^{D} i x_i^2; D = 30; Range [-100, 100]; Optimum 0
Quadric: F3(x) = Σ_{i=1}^{D} i x_i^4 + random(0, 1); D = 30; Range [-1.28, 1.28]; Optimum 0
Step: F4(x) = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2; D = 30; Range [-100, 100]; Optimum 0
Schwefel 1.2: F5(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2; D = 30; Range [-100, 100]; Optimum 0
Schwefel 2.21: F6(x) = max{|x_i|, 1 ≤ i ≤ D}; D = 30; Range [-100, 100]; Optimum 0
Schwefel 2.22: F7(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i|; D = 30; Range [-10, 10]; Optimum 0
Zakharov: F8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5 i x_i)^2 + (Σ_{i=1}^{D} 0.5 i x_i)^4; D = 30; Range [-100, 100]; Optimum 0
Rosenbrock: F9(x) = Σ_{i=1}^{D-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2]; D = 30; Range [-2.048, 2.048]; Optimum 0
Ackley: F10(x) = 20 - 20 exp(-(1/5) √((1/D) Σ_{i=1}^{D} x_i^2)) - exp((1/D) Σ_{i=1}^{D} cos(2π x_i)) + e; D = 30; Range [-32, 32]; Optimum 0
Rastrigin: F11(x) = Σ_{i=1}^{D} (x_i^2 - 10 cos(2π x_i) + 10); D = 30; Range [-5.12, 5.12]; Optimum 0
Weierstrass: F12(x) = Σ_{i=1}^{D} (Σ_{k=0}^{kmax} [a^k cos(2π b^k (x_i + 0.5))]) - D Σ_{k=0}^{kmax} [a^k cos(2π b^k · 0.5)], with a = 0.5, b = 3, kmax = 20; D = 30; Range [-0.5, 0.5]; Optimum 0
Griewank: F13(x) = Σ_{i=1}^{D} (x_i^2 / 4000) - Π_{i=1}^{D} cos(x_i / √i) + 1; D = 30; Range [-600, 600]; Optimum 0
Schwefel: F14(x) = 418.9829 D + Σ_{i=1}^{D} (-x_i sin(√|x_i|)); D = 30; Range [-500, 500]; Optimum 0
Bohachevsky1: F15(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3π x_1) - 0.4 cos(4π x_2) + 0.7; D = 2; Range [-100, 100]; Optimum 0
Bohachevsky2: F16(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3π x_1) · cos(4π x_2) + 0.3; D = 2; Range [-100, 100]; Optimum 0
Bohachevsky3: F17(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3π x_1 + 4π x_2) + 0.3; D = 2; Range [-100, 100]; Optimum 0
Shekel5: F18(x) = -Σ_{i=1}^{5} [(x - a_i)(x - a_i)^T + c_i]^{-1}; D = 4; Range [0, 10]; Optimum -10.1532
Shekel7: F19(x) = -Σ_{i=1}^{7} [(x - a_i)(x - a_i)^T + c_i]^{-1}; D = 4; Range [0, 10]; Optimum -10.4029
Shekel10: F20(x) = -Σ_{i=1}^{10} [(x - a_i)(x - a_i)^T + c_i]^{-1}; D = 4; Range [0, 10]; Optimum -10.5364
the maximum of 40,000 FEs. The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the successful rate (SR) over the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before reaching the maximum number of FEs. The best results among the six algorithms are shown in boldface.
From Table 4, it can be observed that all algorithms hardly converge to the threshold for the unimodal functions F3, F5, F6, and F8 and the multimodal functions F11, F12, and F14. BBTLBO converges to the threshold except for functions F3, F9, and F14. From the results on total average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and on the majority of the multimodal functions, except for functions F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly above 10 for functions F1, F2, F4, F7, F9, F10, and F13. From the results on total average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO has good convergence speed and a good success rate (SR) on the test functions in this paper.
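The MFEs/SR bookkeeping described above can be sketched as follows (our naming; each run's history is assumed to record the best-so-far fitness after every function evaluation):

```python
import numpy as np

def success_stats(run_histories, threshold):
    """Mean FEs to reach `threshold` (MFEs) and success rate (SR) over runs.
    A run succeeds if its best-so-far fitness ever falls to the threshold
    or below within the recorded budget."""
    fes, successes = [], 0
    for hist in run_histories:
        hits = np.nonzero(np.asarray(hist) <= threshold)[0]
        if hits.size:
            successes += 1
            fes.append(hits[0] + 1)              # 1-based FE count at first hit
    sr = successes / len(run_histories)
    mfes = float(np.mean(fes)) if fes else float("nan")  # "NaN" when no run converges
    return mfes, sr
```

Averaging only over the successful runs, and reporting NaN when there are none, matches how the MFEs column and the "NaN" entries of Table 4 are described.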
Table 2: Comparisons (mean ± std of the solutions) using different u (best results shown in bold in the original).

Fun | u = 0.0 | u = 0.1 | u = 0.3 | u = 0.5 | u = 0.7 | u = 0.9 | u = 1.0
F1 | 1.75e-001±1.21e+000 | 6.89e-071±1.01e-070 | 1.23e-163±0.0 | 1.21e-256±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F2 | 8.98e-005±5.73e-004 | 5.62e-069±2.72e-068 | 2.20e-161±1.12e-160 | 2.43e-254±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F3 | 1.20e-001±6.34e-002 | 5.91e-003±1.44e-003 | 1.01e-003±3.48e-004 | 4.35e-004±1.97e-004 | 2.35e-004±1.30e-004 | 2.27e-004±1.26e-004 | 1.99e-004±1.13e-004
F4 | 7.65e+002±5.83e+002 | 4.80e-001±8.86e-001 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F5 | 5.58e+002±6.53e+002 | 1.87e-028±5.73e-028 | 3.53e-054±1.86e-053 | 3.69e-073±2.27e-072 | 9.53e-096±6.74e-095 | 2.16e-115±1.10e-114 | 2.56e-100±1.30e-099
F6 | 2.51e+001±5.34e+000 | 6.67e-021±8.81e-021 | 2.81e-061±6.36e-061 | 8.22e-100±1.80e-099 | 8.18e-137±1.41e-136 | 3.63e-154±1.34e-153 | 8.86e-147±3.22e-146
F7 | 1.37e-003±9.54e-003 | 8.72e-043±1.52e-042 | 5.68e-088±8.76e-088 | 1.01e-133±2.38e-133 | 2.60e-175±0.0 | 1.16e-188±0.0 | 8.33e-180±0.0
F8 | 2.41e+000±3.07e+000 | 1.32e-019±2.98e-019 | 2.13e-028±7.69e-028 | 3.44e-037±1.24e-036 | 2.20e-050±9.12e-050 | 1.07e-056±4.39e-056 | 2.03e-049±8.94e-049
F9 | 2.66e+001±1.79e+000 | 2.72e+001±3.17e-001 | 2.77e+001±3.18e-001 | 2.83e+001±2.78e-001 | 2.84e+001±2.67e-001 | 2.83e+001±3.41e-001 | 2.80e+001±3.87e-001
F10 | 8.30e+000±1.76e+000 | 1.77e-001±6.10e-001 | 5.90e-015±1.70e-015 | 3.55e-015±0.0 | 3.55e-015±0.0 | 3.55e-015±0.0 | 3.55e-015±0.0
F11 | 3.74e+001±9.05e+000 | 3.33e+001±1.18e+001 | 2.71e+001±8.00e+000 | 1.89e+001±1.14e+001 | 5.73e+000±1.06e+001 | 0.0±0.0 | 0.0±0.0
F12 | 8.15e+000±1.93e+000 | 3.38e-001±1.16e+000 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F13 | 5.06e-001±8.08e-001 | 6.52e-003±8.86e-003 | 1.78e-003±3.68e-003 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F14 | 4.33e+003±6.79e+002 | 4.67e+003±6.10e+002 | 5.17e+003±6.68e+002 | 5.59e+003±6.85e+002 | 5.53e+003±7.10e+002 | 5.58e+003±7.80e+002 | 5.40e+003±6.53e+002
F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18 | -7.71e+000±3.47e+000 | -8.06e+000±3.39e+000 | -9.64e+000±1.81e+000 | -9.65e+000±1.76e+000 | -1.02e+001±6.77e-003 | -9.85e+000±1.22e+000 | -9.93e+000±1.12e+000
F19 | -7.69e+000±3.52e+000 | -8.13e+000±3.36e+000 | -9.87e+000±1.83e+000 | -1.03e+001±9.45e-001 | -9.76e+000±1.95e+000 | -9.82e+000±1.78e+000 | -9.61e+000±1.99e+000
F20 | -8.12e+000±3.53e+000 | -9.38e+000±2.69e+000 | -1.01e+001±1.65e+000 | -1.01e+001±1.61e+000 | -9.70e+000±2.28e+000 | -9.41e+000±2.43e+000 | -1.00e+001±1.69e+000
Table 3: Comparisons (mean ± std of the solutions) using different algorithms (best results shown in bold in the original).

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e-027±1.87e-026 | 2.62e-024±5.00e-024 | 3.90e-035±2.00e-034 | 4.35e-022±1.13e-021 | 3.35e-035±2.11e-034 | 0.0±0.0
F2 | 1.38e+004±2.11e+004 | 1.00e+003±4.63e+003 | 6.20e-021±4.38e-020 | 1.40e+003±4.52e+003 | 1.28e-032±8.37e-032 | 0.0±0.0
F3 | 1.32e+000±3.18e+000 | 2.22e-002±7.55e-003 | 1.64e-002±9.57e-003 | 2.49e-002±9.88e-003 | 1.16e-002±5.26e-003 | 2.27e-004±1.26e-004
F4 | 5.60e+000±9.28e+000 | 9.60e-001±4.27e+000 | 7.89e+001±3.05e+002 | 8.40e-001±9.12e-001 | 1.08e+000±1.28e+000 | 0.0±0.0
F5 | 1.24e+004±6.66e+003 | 4.41e+003±3.37e+003 | 2.09e+000±4.00e+000 | 5.36e+003±3.26e+003 | 7.57e+002±1.16e+003 | 2.16e-115±1.10e-114
F6 | 1.67e+001±9.19e+000 | 1.20e+000±5.22e-001 | 1.39e+001±4.47e+000 | 3.60e-001±1.95e-001 | 1.10e+000±2.94e+000 | 3.63e-154±1.34e-153
F7 | 2.34e+001±1.32e+001 | 1.00e+000±3.03e+000 | 4.06e-019±2.15e-018 | 6.00e-001±2.40e+000 | 2.00e-001±1.41e+000 | 1.16e-188±0.0
F8 | 1.87e+002±1.34e+002 | 1.58e+002±7.00e+001 | 1.16e-001±2.35e-001 | 1.72e+002±6.67e+001 | 2.49e+001±1.99e+001 | 1.07e-056±4.39e-056
F9 | 7.07e+001±1.48e+002 | 3.57e+001±2.50e+001 | 2.76e+001±1.06e+001 | 3.17e+001±2.07e+001 | 2.76e+001±1.46e+001 | 2.83e+001±3.41e-001
F10 | 1.06e+001±9.29e+000 | 1.52e+000±5.11e+000 | 1.34e+000±1.15e+000 | 2.59e+000±6.45e+000 | 5.54e-001±2.79e+000 | 3.55e-015±0.0
F11 | 1.16e+002±3.53e+001 | 1.81e+001±7.28e+000 | 6.76e+001±3.89e+001 | 1.55e+001±5.96e+000 | 2.03e+001±9.23e+000 | 0.0±0.0
F12 | 2.73e+000±2.11e+000 | 1.20e-001±4.42e-001 | 1.73e+000±1.32e+000 | 1.21e-001±3.37e-001 | 5.17e-001±8.67e-001 | 0.0±0.0
F13 | 2.14e-002±4.11e-002 | 2.30e-003±4.29e-003 | 4.07e-002±4.89e-002 | 3.08e-003±7.42e-003 | 4.63e-003±7.16e-003 | 0.0±0.0
F14 | 3.64e+003±6.28e+002 | 2.58e+003±5.51e+002 | 2.30e+003±4.09e+002 | 2.49e+003±5.41e+002 | 2.60e+003±5.05e+002 | 5.58e+003±7.80e+002
F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16 | 4.37e-003±3.09e-002 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18 | -5.60e+000±3.41e+000 | -7.90e+000±2.74e+000 | -7.09e+000±3.33e+000 | -7.63e+000±2.86e+000 | -8.01e+000±3.00e+000 | -9.85e+000±1.22e+000
F19 | -5.97e+000±3.31e+000 | -7.87e+000±3.03e+000 | -6.21e+000±3.66e+000 | -8.60e+000±2.68e+000 | -8.37e+000±2.90e+000 | -9.82e+000±1.78e+000
F20 | -5.81e+000±3.65e+000 | -9.40e+000±2.42e+000 | -6.02e+000±3.77e+000 | -9.46e+000±2.24e+000 | -9.38e+000±2.51e+000 | -9.41e+000±2.43e+000
The Scientific World Journal

Figure 3: Comparison of the performance curves using different u: (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin. Each panel plots the mean fitness (log scale) against FEs (×10^4) for u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiment, the maximal number of FEs is used as the stopping criterion of all algorithms, namely, 40000 for all test functions. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss counts of the competition between BBTLBO and each other algorithm. The best results among the algorithms are shown in boldface.

The comparisons in Table 5 show that all algorithms perform well for F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" counts show that BBTLBO offers good accuracy on the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel. Each panel plots the mean fitness (log scale) against FEs (×10^4).
Overall, Table 5 indicates that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most of the complex multimodal optimization problems.
6. Two Real-World Optimization Problems

In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to parameter estimation in two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5.

Figure 5: BBTLBO-based ANN (input x, ANN output y, desired output d; the error d − y drives the BBTLBO algorithm).

The inputs are connected to all the hidden units, which in turn are all connected to
Table 4: The mean number of FEs (MFEs) and SR (%) with acceptable solutions using different algorithms. Each algorithm cell lists MFEs / SR.

Fun | T value | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 1E−8 | 15922 / 100 | 17727 / 100 | 11042 / 100 | 19214 / 100 | 11440 / 100 | 1390 / 100
F2 | 1E−8 | 17515 / 54 | 19179 / 94 | 12243 / 100 | 20592 / 90 | 12634 / 100 | 1500 / 100
F3 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0
F4 | 1E−8 | 11710 / 24 | 8120 / 84 | 3634 / 6 | 7343 / 40 | 4704 / 34 | 525 / 100
F5 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 4100 / 100
F6 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2603 / 100
F7 | 1E−8 | 17540 / 6 | 21191 / 90 | 17314 / 100 | 22684 / 94 | 15322 / 98 | 2144 / 100
F8 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 9286 / 100
F9 | 1E−2 | 17073 / 62 | 18404 / 42 | 14029 / 24 | 18182 / 52 | 17200 / 80 | NaN / 0
F10 | 1E−8 | 24647 / 26 | 27598 / 90 | 18273 / 26 | 29172 / 82 | 18320 / 84 | 2110 / 100
F11 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2073 / 100
F12 | 1E−8 | NaN / 0 | 25465 / 50 | NaN / 0 | 27317 / 64 | 19704 / 24 | 2471 / 100
F13 | 1E−8 | 16318 / 32 | 21523 / 58 | 11048 / 16 | 22951 / 64 | 14786 / 58 | 1470 / 100
F14 | 1E−8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0
F15 | 1E−8 | 658 / 100 | 1176 / 100 | 1274 / 100 | 1251 / 100 | 1206 / 100 | 799 / 100
F16 | 1E−8 | 657 / 98 | 1251 / 100 | 1294 / 100 | 1343 / 100 | 1308 / 100 | 813 / 100
F17 | 1E−8 | 995 / 100 | 2626 / 100 | 1487 / 100 | 2759 / 100 | 1921 / 100 | 973 / 100
F18 | −10.15 | 1752 / 34 | 6720 / 44 | 2007 / 52 | 4377 / 32 | 8113 / 64 | 1684 / 94
F19 | −10.40 | 2839 / 34 | 8585 / 48 | 1333 / 42 | 6724 / 50 | 3056 / 66 | 2215 / 90
F20 | −10.53 | 1190 / 36 | 8928 / 74 | 1115 / 40 | 6548 / 76 | 5441 / 80 | 2822 / 82
the outputs. The decision variables consist of the neural network weights and biases. For a three-layer feed-forward network with M input units, N hidden units, and K output units, the number of variables is

L = (M + 1) × N + (N + 1) × K.  (13)
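As a quick sanity check of (13), the following sketch (with a hypothetical helper name) counts the weights and biases of a three-layer feed-forward network; for the 1-5-1 network used later in this section it yields the 16 variables quoted in the text.

```python
def num_ann_variables(m_inputs: int, n_hidden: int, k_outputs: int) -> int:
    """Number of weights and biases of a three-layer feed-forward ANN, per (13).

    Each hidden unit has m_inputs weights plus one bias: (M + 1) * N.
    Each output unit has n_hidden weights plus one bias: (N + 1) * K.
    """
    return (m_inputs + 1) * n_hidden + (n_hidden + 1) * k_outputs

# The 1-5-1 network of Section 6.1 has 16 variables.
print(num_ann_variables(1, 5, 1))  # -> 16
```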
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns:

MSE = (1 / (Q × K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2)(d_ij − y_ij)²,  (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function described by the following equation [31]:

y = sin(2x) exp(−2x).  (15)

In this case, the activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of variables is 16 for the BBTLBO-based ANN. In order to train the ANN,
200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed; the other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the methods. Figure 6 shows the predicted time series for training and test using the different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
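To make the objective concrete, here is a minimal sketch of the MSE objective (14) for the 1-5-1 network of this section (sigmoid hidden layer, linear output). The function names and the parameter layout are hypothetical, and sampling x from [0, 1] is an assumption; the paper does not state the sampling interval.

```python
import math

def ann_forward(params, x):
    """1-5-1 feed-forward net: sigmoid hidden layer, linear output.

    params holds 16 numbers laid out as 5 hidden (weight, bias) pairs
    followed by 5 output weights and 1 output bias -- a hypothetical
    layout chosen for this sketch.
    """
    hidden = []
    for j in range(5):
        w, b = params[2 * j], params[2 * j + 1]
        hidden.append(1.0 / (1.0 + math.exp(-(w * x + b))))  # sigmoid unit
    out_w, out_b = params[10:15], params[15]
    return sum(w * h for w, h in zip(out_w, hidden)) + out_b  # linear output

def mse(params, data):
    """Objective (14): mean of (1/2)(d - y)^2 over all Q patterns (K = 1)."""
    return sum(0.5 * (d - ann_forward(params, x)) ** 2 for x, d in data) / len(data)

# Training data sampled from the target curve y = sin(2x) * exp(-2x) of (15).
data = [(i / 199.0, math.sin(2 * i / 199.0) * math.exp(-2 * i / 199.0))
        for i in range(200)]
print(mse([0.0] * 16, data))  # MSE of the all-zero network
```

An optimizer such as BBTLBO would then search the 16-dimensional params vector to minimize mse.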
6.2. Tuning of PID Controller. A discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] − e[k − 1]) / Δt,  (16)

where u[k] is the controller output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:

f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t)) dt + ω3 t_r.  (17)
Table 5: Comparisons (mean±std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1 | 3.63e−25±1.85e−24 | 7.65e−25±3.34e−24 | 9.23e−18±3.03e−17 | 1.01e−02±5.48e−03 | 3.05e−189±0.0 | 0.0±0.0
F2 | 1.49e−23±6.69e−23 | 2.75e−25±1.08e−24 | 3.68e−17±5.37e−17 | 1.08e−01±5.05e−02 | 1.29e−185±0.0 | 0.0±0.0
F3 | 3.22e−02±2.83e−02 | 2.08e−02±1.18e−02 | 1.28e−02±5.50e−03 | 1.86e−02±4.39e−03 | 5.70e−04±2.37e−04 | 2.27e−04±1.26e−04
F4 | 2.11e+01±6.74e+01 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F5 | 1.22e+02±1.37e+02 | 4.28e+01±2.59e+01 | 1.17e+01±9.30e+00 | 2.60e+03±6.79e+02 | 9.45e−43±6.47e−42 | 2.16e−115±1.10e−114
F6 | 3.06e+01±8.50e+00 | 2.45e+00±2.60e+00 | 4.67e−01±2.82e−01 | 2.66e+00±5.58e−01 | 2.08e−78±4.30e−78 | 3.63e−154±1.34e−153
F7 | 8.28e−19±3.49e−18 | 5.40e−16±3.81e−15 | 1.34e−11±1.27e−11 | 1.70e−02±2.85e−03 | 3.84e−96±5.53e−96 | 1.16e−188±0.0
F8 | 2.16e+00±4.16e+00 | 4.88e−01±5.82e−01 | 9.60e−02±6.99e−02 | 5.86e+01±1.70e+01 | 7.09e−22±4.99e−21 | 1.07e−56±4.39e−56
F9 | 2.49e+01±1.05e+01 | 2.61e+01±1.07e+00 | 2.40e+01±1.52e+00 | 2.65e+01±3.54e−01 | 2.55e+01±5.01e−01 | 2.83e+01±3.41e−01
F10 | 5.05e−01±7.06e−01 | 2.07e−01±4.58e−01 | 1.94e−01±4.56e−01 | 2.16e−02±4.37e−03 | 3.62e−15±5.02e−16 | 3.55e−15±0.0
F11 | 2.03e+00±1.94e+00 | 3.86e+00±1.97e+00 | 4.26e+01±1.06e+01 | 1.15e+02±1.54e+01 | 1.55e+01±8.09e+00 | 0.0±0.0
F12 | 2.88e−02±1.45e−01 | 6.50e−02±1.87e−01 | 7.89e−01±1.03e+00 | 1.36e+00±7.41e−01 | 0.0±0.0 | 0.0±0.0
F13 | 1.87e−02±3.58e−02 | 1.18e−02±1.75e−02 | 1.16e−02±1.58e−02 | 1.06e−01±9.93e−02 | 0.0±0.0 | 0.0±0.0
F14 | 1.93e+02±1.42e+02 | 1.35e+02±1.26e+02 | 4.49e+03±8.25e+02 | 3.96e+03±8.40e+02 | 4.82e+03±6.86e+02 | 5.58e+03±7.80e+02
F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18 | −9.40e+00±2.10e+00 | −9.25e+00±2.30e+00 | −7.76e+00±3.42e+00 | −9.79e+00±1.44e+00 | −9.72e+00±1.42e+00 | −9.85e+00±1.22e+00
F19 | −9.85e+00±1.90e+00 | −9.87e+00±1.83e+00 | −9.24e+00±2.70e+00 | −1.04e+01±4.23e−09 | −9.22e+00±2.41e+00 | −9.82e+00±1.78e+00
F20 | −9.65e+00±2.23e+00 | −1.01e+01±1.59e+00 | −9.63e+00±2.50e+00 | −1.05e+01±1.01e−04 | −9.65e+00±2.23e+00 | −9.41e+00±2.43e+00
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | —
Figure 6: Comparison of the performance curves using different algorithms (TLBO and BBTLBO): (a) convergence curves (log of the best value versus generation); (b) approximation curves (actual curve, TLBO, and BBTLBO versus sample number); (c) error curves (error versus sample number).
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean / Std) | Testing error (Mean / Std)
TLBO | 9.85e−04 / 9.26e−04 | 9.43e−04 / 9.18e−04
BBTLBO | 3.45e−04 / 2.02e−04 | 2.76e−04 / 1.82e−04
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshooting, a penalty value is adopted in the cost function; that is, once overshooting occurs, the value of the overshoot is added to the cost function, which is then given as follows [32, 33]:

if dy(t) < 0:
  f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t) + ω4 |dy(t)|) dt + ω3 t_r
else:
  f(t) = ∫_0^∞ (ω1 |e(t)| + ω2 u²(t)) dt + ω3 t_r  (18)
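The penalized cost (18) can be approximated from sampled signals as in the following sketch; the function name, argument layout, and the rectangular-rule discretization of the integral are assumptions of this illustration (the default weights follow the values given later in this section).

```python
def pid_cost(e, u, y, dt, t_rise, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discrete approximation of the penalized cost (18).

    e, u, y: equal-length sample lists of error, control signal, and
    plant output; dy[k] = y[k] - y[k-1], and the w4*|dy| overshoot
    penalty is charged only at steps where dy[k] < 0, as in (18).
    """
    cost = 0.0
    for k in range(1, len(y)):
        dy = y[k] - y[k - 1]
        cost += (w1 * abs(e[k]) + w2 * u[k] ** 2) * dt
        if dy < 0:  # overshoot-penalty branch of (18)
            cost += w4 * abs(dy) * dt
    return cost + w3 * t_rise

# A monotonically rising output incurs no overshoot penalty:
print(pid_cost([0.0, 0.0], [0.0, 0.0], [0.0, 1.0], dt=0.05, t_rise=1.0))  # -> 2.0
```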
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65 | 1.05 | 16.34555 | 70.59
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65 | 0.65 | 11.60773 | 69.10
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80 | 0.70 | 11.34300 | 70.45
Figure 7: Performance curves using different methods (GA, PSO, and BBTLBO): best cost value versus generation.
where ω4 is a coefficient with ω4 ≫ ω1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.

In our simulation, the transfer function of the plant examined is given as follows [34]:

G(s) = 1958 / (s³ + 17.89s² + 103.3s + 190.8).  (19)

The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω1 = 0.999, ω2 = 0.001, ω3 = 2.0, and ω4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set to 0.90 and the mutation rate to 0.10 for GA.

The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function and CPU time.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO): system output versus time (s).
Although the PID controllers tuned by PSO have smaller peak and rise times, their maximum overshoot is much larger than that of the controller tuned by BBTLBO. This indicates that the PID controller tuned by BBTLBO delivers the best overall control performance in the simulations.
7. Conclusion

In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is applied to 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO in each generation. Compared with the other algorithms on the 20 chosen test problems, BBTLBO shows good performance by using neighborhood search to generate better-quality solutions, although it does not have the best performance in every experimental case in this paper. BBTLBO also gives the best performance on the two real-world optimization problems compared with the other algorithms in the paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man, and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
(1) Begin
(2)   Initialize N (number of learners), D (number of dimensions), and hybridization factor u
(3)   Initialize learners X and evaluate all learners X
(4)   while (stopping condition not met)
(5)     for each learner X_i of the class   // Teaching phase
(6)       TF = round(1 + rand(0, 1))
(7)       Donate the N_Teacher and the N_Mean in its neighborhood for each learner
(8)       Update each learner according to (11)
(9)       Accept newX_i if f(newX_i) is better than f(X_i)
(10)    endfor
(11)    for each learner X_i of the class   // Learning phase
(12)      Randomly select one learner X_k, such that i ≠ k
(13)      if rand(0, 1) < 0.5
(14)        Update each learner according to (3)
(15)      else
(16)        Donate the N_Teacher in its neighborhood for each learner
(17)        Update each learner according to (12)
(18)      endif
(19)      Accept newX_i if f(newX_i) is better than f(X_i)
(20)    endfor
(21)  endwhile
(22) end

Algorithm 2: BBTLBO().
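Equations (3), (11), and (12) referenced in Algorithm 2 are defined earlier in the paper and are not reproduced in this excerpt, so the sketch below implements the teacher and learner phases of the standard TLBO [8, 9] that Algorithm 2 modifies. It is an illustrative baseline minimizing the sphere function, not the authors' BBTLBO, and all names are hypothetical.

```python
import random

def tlbo(f, dim, pop_size=20, max_fes=4000, lo=-100.0, hi=100.0):
    """Standard TLBO (teacher phase + learner phase), minimizing f."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in X]
    fes = pop_size
    while fes < max_fes:
        teacher = X[min(range(pop_size), key=lambda i: fit[i])]
        mean = [sum(x[d] for x in X) / pop_size for d in range(dim)]
        for i in range(pop_size):
            # Teacher phase: move toward the teacher, away from TF times the mean.
            tf = random.randint(1, 2)  # TF = round(1 + rand(0, 1))
            new = [X[i][d] + random.random() * (teacher[d] - tf * mean[d])
                   for d in range(dim)]
            fnew = f(new); fes += 1
            if fnew < fit[i]:          # greedy acceptance, as in line (9)
                X[i], fit[i] = new, fnew
        for i in range(pop_size):
            # Learner phase: learn from a random classmate k != i.
            k = random.choice([j for j in range(pop_size) if j != i])
            sign = 1.0 if fit[i] < fit[k] else -1.0
            new = [X[i][d] + sign * random.random() * (X[i][d] - X[k][d])
                   for d in range(dim)]
            fnew = f(new); fes += 1
            if fnew < fit[i]:
                X[i], fit[i] = new, fnew
    return min(fit)

random.seed(1)
best = tlbo(lambda x: sum(v * v for v in x), dim=5)
print(best)  # best sphere value found; far below the random initial values
```

BBTLBO replaces the teacher-phase update with the interactive strategy of (11) (hybridized with Gaussian sampling via the factor u) and, with probability 0.5, the learner-phase update with the neighborhood search of (12).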
For the purpose of reducing statistical errors, each algorithm is independently run 50 times. For all algorithms, the population size is set to 20. All population-based stochastic algorithms use the same stopping criterion, that is, reaching a given maximum number of function evaluations (FEs).
5.3. Effect of Variation in Parameter u. The hybridization factor u is set to 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, and comparative tests have been performed for each value. In our experiment, the maximal number of FEs is used as the stopping criterion, namely, 40000 for all test functions. Table 2 shows the mean optimum solutions and the standard deviation of the solutions obtained with each hybridization factor u over the 50 independent runs. The best results among the settings are shown in bold. Figure 3 presents representative convergence graphs, in terms of the mean fitness values achieved with different hybridization factors u, on the test functions; due to the tight space limitation, only some sample graphs are illustrated.

The comparisons in Table 2 and Figure 3 show that BBTLBO offers the best performance on the 20 test functions when the hybridization factor u is set to 0.9. Hence, u is set to 0.9 in the following experiments.
5.4. Comparison of BBTLBO with Some Similar Bare-Bones Algorithms. In this section, we compare BBTLBO with five other recently proposed bare-bones algorithms: three bare-bones DE variants and two bare-bones PSO variants, namely, BBPSO [20], BBExp [20], BBDE [21], GBDE [22], and MGBDE [22]. Our experiment includes two series of comparisons, in terms of the solution accuracy and the solution convergence (convergence speed and success rate).
5.4.1. Comparisons on the Solution Accuracy. In our experiment, the maximal number of FEs is used as the stopping criterion, namely, 40000 for all test functions. The results are shown in Table 3 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions. The best results among the algorithms are shown in bold. Figure 4 presents the convergence graphs of different benchmark functions in terms of the mean fitness values achieved by the algorithms over the 50 independent runs; due to the tight space limitation, only some sample graphs are illustrated.

From Table 3 it can be observed that all algorithms perform well on functions F15 and F17. Although BBExp performs better than BBTLBO on function F9 and MGBDE performs better than BBTLBO on function F20, our approach BBTLBO achieves better results than the other algorithms on the rest of the test functions. Table 3 and Figure 4 show that BBTLBO attains good solution accuracy on the test functions in this paper.
5.4.2. Comparison of the Convergence Speed and SR. In order to compare the convergence speed and success rate (SR) of the different algorithms, we select a threshold value of the objective function for each test function; the threshold values are listed in Table 4. In our experiment, the stopping criterion is that each algorithm terminates when the best fitness value so far falls below the predefined threshold value (T value) or the number of FEs reaches
Table 1: Details of numerical benchmarks used.

Function | Formula | D | Range | Optimum
Sphere | F1(x) = sum_{i=1}^{D} x_i^2 | 30 | [-100, 100] | 0
Sum square | F2(x) = sum_{i=1}^{D} i x_i^2 | 30 | [-100, 100] | 0
Quadric | F3(x) = sum_{i=1}^{D} i x_i^4 + random[0, 1) | 30 | [-1.28, 1.28] | 0
Step | F4(x) = sum_{i=1}^{D} (floor(x_i + 0.5))^2 | 30 | [-100, 100] | 0
Schwefel 1.2 | F5(x) = sum_{i=1}^{D} (sum_{j=1}^{i} x_j)^2 | 30 | [-100, 100] | 0
Schwefel 2.21 | F6(x) = max{ |x_i|, 1 <= i <= D } | 30 | [-100, 100] | 0
Schwefel 2.22 | F7(x) = sum_{i=1}^{D} |x_i| + prod_{i=1}^{D} |x_i| | 30 | [-10, 10] | 0
Zakharov | F8(x) = sum_{i=1}^{D} x_i^2 + (sum_{i=1}^{D} 0.5 i x_i)^2 + (sum_{i=1}^{D} 0.5 i x_i)^4 | 30 | [-100, 100] | 0
Rosenbrock | F9(x) = sum_{i=1}^{D-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2] | 30 | [-2.048, 2.048] | 0
Ackley | F10(x) = 20 - 20 exp(-(1/5) sqrt((1/D) sum_{i=1}^{D} x_i^2)) - exp((1/D) sum_{i=1}^{D} cos(2 pi x_i)) + e | 30 | [-32, 32] | 0
Rastrigin | F11(x) = sum_{i=1}^{D} (x_i^2 - 10 cos(2 pi x_i) + 10) | 30 | [-5.12, 5.12] | 0
Weierstrass | F12(x) = sum_{i=1}^{D} (sum_{k=0}^{kmax} [a^k cos(2 pi b^k (x_i + 0.5))]) - D sum_{k=0}^{kmax} [a^k cos(2 pi b^k * 0.5)], a = 0.5, b = 3, kmax = 20 | 30 | [-0.5, 0.5] | 0
Griewank | F13(x) = sum_{i=1}^{D} (x_i^2 / 4000) - prod_{i=1}^{D} cos(x_i / sqrt(i)) + 1 | 30 | [-600, 600] | 0
Schwefel | F14(x) = 418.9829 D + sum_{i=1}^{D} (-x_i sin(sqrt(|x_i|))) | 30 | [-500, 500] | 0
Bohachevsky1 | F15(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3 pi x_1) - 0.4 cos(4 pi x_2) + 0.7 | 2 | [-100, 100] | 0
Bohachevsky2 | F16(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3 pi x_1) cos(4 pi x_2) + 0.3 | 2 | [-100, 100] | 0
Bohachevsky3 | F17(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3 pi x_1 + 4 pi x_2) + 0.3 | 2 | [-100, 100] | 0
Shekel5 | F18(x) = -sum_{i=1}^{5} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.1532
Shekel7 | F19(x) = -sum_{i=1}^{7} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.4029
Shekel10 | F20(x) = -sum_{i=1}^{10} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.5364
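Two of the benchmark definitions in Table 1 translate directly into code; a minimal Python sketch (the function names are ours, not the paper's):

```python
import numpy as np

def sphere(x):
    # F1: sum of squares; global optimum 0 at x = 0
    return np.sum(x ** 2)

def rastrigin(x):
    # F11: sum(x_i^2 - 10*cos(2*pi*x_i) + 10); optimum 0 at x = 0
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

# Both functions attain their optimum value 0 at the origin of R^30.
x0 = np.zeros(30)
```

The remaining benchmarks follow the same pattern and differ only in their separability and modality.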
the maximal FEs (40000). The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge below the threshold and the successful rate (SR) over the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before reaching the maximum number of FEs. The best results among the six algorithms are shown in boldface.

From Table 4 it can be observed that all algorithms hardly converge to the threshold for the unimodal functions F3, F5, F6, and F8 and the multimodal functions F11, F12, and F14, whereas BBTLBO converges to the threshold for all functions except F3, F9, and F14. In terms of average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and the majority of multimodal functions, except for functions F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly above 10 for functions F1, F2, F4, F7, F9, F10, and F13. In terms of average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO shows good convergence speed and successful rate (SR) on the test functions in this paper.
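The MFEs and SR statistics reported in Table 4 can be computed from per-run records; a sketch, where the run-record layout (one entry per run, `None` for a failed run) is our assumption:

```python
import math

def sr_and_mfes(runs):
    """runs: list with, for each independent run, the FEs needed to reach
    the threshold, or None if the run never reached it.

    Returns (mean FEs over successful runs, or NaN if none; SR in %)."""
    hits = [f for f in runs if f is not None]
    sr = 100.0 * len(hits) / len(runs)
    mfes = sum(hits) / len(hits) if hits else float("nan")
    return mfes, sr

# Example: 50 runs, 40 of which reached the threshold after 2000 FEs
mfes, sr = sr_and_mfes([2000] * 40 + [None] * 10)  # -> (2000.0, 80.0)
```

A failed column in Table 4 then naturally shows "NaN" for MFEs and 0 for SR.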
Table 2: Comparisons (mean ± std) of the solutions using different u.

Fun | u = 0.0 | u = 0.1 | u = 0.3 | u = 0.5 | u = 0.7 | u = 0.9 | u = 1.0
F1 | 1.75e-001 ± 1.21e+000 | 6.89e-071 ± 1.01e-070 | 1.23e-163 ± 0.0 | 1.21e-256 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F2 | 8.98e-005 ± 5.73e-004 | 5.62e-069 ± 2.72e-068 | 2.20e-161 ± 1.12e-160 | 2.43e-254 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F3 | 1.20e-001 ± 6.34e-002 | 5.91e-003 ± 1.44e-003 | 1.01e-003 ± 3.48e-004 | 4.35e-004 ± 1.97e-004 | 2.35e-004 ± 1.30e-004 | 2.27e-004 ± 1.26e-004 | 1.99e-004 ± 1.13e-004
F4 | 7.65e+002 ± 5.83e+002 | 4.80e-001 ± 8.86e-001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 5.58e+002 ± 6.53e+002 | 1.87e-028 ± 5.73e-028 | 3.53e-054 ± 1.86e-053 | 3.69e-073 ± 2.27e-072 | 9.53e-096 ± 6.74e-095 | 2.16e-115 ± 1.10e-114 | 2.56e-100 ± 1.30e-099
F6 | 2.51e+001 ± 5.34e+000 | 6.67e-021 ± 8.81e-021 | 2.81e-061 ± 6.36e-061 | 8.22e-100 ± 1.80e-099 | 8.18e-137 ± 1.41e-136 | 3.63e-154 ± 1.34e-153 | 8.86e-147 ± 3.22e-146
F7 | 1.37e-003 ± 9.54e-003 | 8.72e-043 ± 1.52e-042 | 5.68e-088 ± 8.76e-088 | 1.01e-133 ± 2.38e-133 | 2.60e-175 ± 0.0 | 1.16e-188 ± 0.0 | 8.33e-180 ± 0.0
F8 | 2.41e+000 ± 3.07e+000 | 1.32e-019 ± 2.98e-019 | 2.13e-028 ± 7.69e-028 | 3.44e-037 ± 1.24e-036 | 2.20e-050 ± 9.12e-050 | 1.07e-056 ± 4.39e-056 | 2.03e-049 ± 8.94e-049
F9 | 2.66e+001 ± 1.79e+000 | 2.72e+001 ± 3.17e-001 | 2.77e+001 ± 3.18e-001 | 2.83e+001 ± 2.78e-001 | 2.84e+001 ± 2.67e-001 | 2.83e+001 ± 3.41e-001 | 2.80e+001 ± 3.87e-001
F10 | 8.30e+000 ± 1.76e+000 | 1.77e-001 ± 6.10e-001 | 5.90e-015 ± 1.70e-015 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0 | 3.55e-015 ± 0.0
F11 | 3.74e+001 ± 9.05e+000 | 3.33e+001 ± 1.18e+001 | 2.71e+001 ± 8.00e+000 | 1.89e+001 ± 1.14e+001 | 5.73e+000 ± 1.06e+001 | 0.0 ± 0.0 | 0.0 ± 0.0
F12 | 8.15e+000 ± 1.93e+000 | 3.38e-001 ± 1.16e+000 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 5.06e-001 ± 8.08e-001 | 6.52e-003 ± 8.86e-003 | 1.78e-003 ± 3.68e-003 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 4.33e+003 ± 6.79e+002 | 4.67e+003 ± 6.10e+002 | 5.17e+003 ± 6.68e+002 | 5.59e+003 ± 6.85e+002 | 5.53e+003 ± 7.10e+002 | 5.58e+003 ± 7.80e+002 | 5.40e+003 ± 6.53e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -7.71e+000 ± 3.47e+000 | -8.06e+000 ± 3.39e+000 | -9.64e+000 ± 1.81e+000 | -9.65e+000 ± 1.76e+000 | -1.02e+001 ± 6.77e-003 | -9.85e+000 ± 1.22e+000 | -9.93e+000 ± 1.12e+000
F19 | -7.69e+000 ± 3.52e+000 | -8.13e+000 ± 3.36e+000 | -9.87e+000 ± 1.83e+000 | -1.03e+001 ± 9.45e-001 | -9.76e+000 ± 1.95e+000 | -9.82e+000 ± 1.78e+000 | -9.61e+000 ± 1.99e+000
F20 | -8.12e+000 ± 3.53e+000 | -9.38e+000 ± 2.69e+000 | -1.01e+001 ± 1.65e+000 | -1.01e+001 ± 1.61e+000 | -9.70e+000 ± 2.28e+000 | -9.41e+000 ± 2.43e+000 | -1.00e+001 ± 1.69e+000
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e-027 ± 1.87e-026 | 2.62e-024 ± 5.00e-024 | 3.90e-035 ± 2.00e-034 | 4.35e-022 ± 1.13e-021 | 3.35e-035 ± 2.11e-034 | 0.0 ± 0.0
F2 | 1.38e+004 ± 2.11e+004 | 1.00e+003 ± 4.63e+003 | 6.20e-021 ± 4.38e-020 | 1.40e+003 ± 4.52e+003 | 1.28e-032 ± 8.37e-032 | 0.0 ± 0.0
F3 | 1.32e+000 ± 3.18e+000 | 2.22e-002 ± 7.55e-003 | 1.64e-002 ± 9.57e-003 | 2.49e-002 ± 9.88e-003 | 1.16e-002 ± 5.26e-003 | 2.27e-004 ± 1.26e-004
F4 | 5.60e+000 ± 9.28e+000 | 9.60e-001 ± 4.27e+000 | 7.89e+001 ± 3.05e+002 | 8.40e-001 ± 9.12e-001 | 1.08e+000 ± 1.28e+000 | 0.0 ± 0.0
F5 | 1.24e+004 ± 6.66e+003 | 4.41e+003 ± 3.37e+003 | 2.09e+000 ± 4.00e+000 | 5.36e+003 ± 3.26e+003 | 7.57e+002 ± 1.16e+003 | 2.16e-115 ± 1.10e-114
F6 | 1.67e+001 ± 9.19e+000 | 1.20e+000 ± 5.22e-001 | 1.39e+001 ± 4.47e+000 | 3.60e-001 ± 1.95e-001 | 1.10e+000 ± 2.94e+000 | 3.63e-154 ± 1.34e-153
F7 | 2.34e+001 ± 1.32e+001 | 1.00e+000 ± 3.03e+000 | 4.06e-019 ± 2.15e-018 | 6.00e-001 ± 2.40e+000 | 2.00e-001 ± 1.41e+000 | 1.16e-188 ± 0.0
F8 | 1.87e+002 ± 1.34e+002 | 1.58e+002 ± 7.00e+001 | 1.16e-001 ± 2.35e-001 | 1.72e+002 ± 6.67e+001 | 2.49e+001 ± 1.99e+001 | 1.07e-056 ± 4.39e-056
F9 | 7.07e+001 ± 1.48e+002 | 3.57e+001 ± 2.50e+001 | 2.76e+001 ± 1.06e+001 | 3.17e+001 ± 2.07e+001 | 2.76e+001 ± 1.46e+001 | 2.83e+001 ± 3.41e-001
F10 | 1.06e+001 ± 9.29e+000 | 1.52e+000 ± 5.11e+000 | 1.34e+000 ± 1.15e+000 | 2.59e+000 ± 6.45e+000 | 5.54e-001 ± 2.79e+000 | 3.55e-015 ± 0.0
F11 | 1.16e+002 ± 3.53e+001 | 1.81e+001 ± 7.28e+000 | 6.76e+001 ± 3.89e+001 | 1.55e+001 ± 5.96e+000 | 2.03e+001 ± 9.23e+000 | 0.0 ± 0.0
F12 | 2.73e+000 ± 2.11e+000 | 1.20e-001 ± 4.42e-001 | 1.73e+000 ± 1.32e+000 | 1.21e-001 ± 3.37e-001 | 5.17e-001 ± 8.67e-001 | 0.0 ± 0.0
F13 | 2.14e-002 ± 4.11e-002 | 2.30e-003 ± 4.29e-003 | 4.07e-002 ± 4.89e-002 | 3.08e-003 ± 7.42e-003 | 4.63e-003 ± 7.16e-003 | 0.0 ± 0.0
F14 | 3.64e+003 ± 6.28e+002 | 2.58e+003 ± 5.51e+002 | 2.30e+003 ± 4.09e+002 | 2.49e+003 ± 5.41e+002 | 2.60e+003 ± 5.05e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 4.37e-003 ± 3.09e-002 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -5.60e+000 ± 3.41e+000 | -7.90e+000 ± 2.74e+000 | -7.09e+000 ± 3.33e+000 | -7.63e+000 ± 2.86e+000 | -8.01e+000 ± 3.00e+000 | -9.85e+000 ± 1.22e+000
F19 | -5.97e+000 ± 3.31e+000 | -7.87e+000 ± 3.03e+000 | -6.21e+000 ± 3.66e+000 | -8.60e+000 ± 2.68e+000 | -8.37e+000 ± 2.90e+000 | -9.82e+000 ± 1.78e+000
F20 | -5.81e+000 ± 3.65e+000 | -9.40e+000 ± 2.42e+000 | -6.02e+000 ± 3.77e+000 | -9.46e+000 ± 2.24e+000 | -9.38e+000 ± 2.51e+000 | -9.41e+000 ± 2.43e+000
Figure 3: Comparison of the performance curves (FEs versus mean fitness, logarithmic scale except in (d)) using different u, with one curve per setting u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0: (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In this experiment the maximal number of FEs, namely 40000 for all test functions, is used as the stopping criterion of all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss counts of BBTLBO against each of the other algorithms. The best results among the algorithms are shown in boldface.

The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" counts show that BBTLBO offers good accuracy on the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves (FEs versus mean fitness) using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, BBTLBO): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel.
Table 5 indicates that BBTLBO achieves good solution accuracy on all the unimodal optimization problems and on most of the complex multimodal optimization problems.
6. Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer

Figure 5: BBTLBO-based ANN. The ANN maps input x to output y; the error between y and the desired output d drives the BBTLBO algorithm, which adjusts the network parameters.

feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which in turn are all connected to all
Table 4: The mean number of FEs (MFEs) and SR (%) with acceptable solutions using different algorithms.

Fun | T value | BBPSO (MFEs / SR) | BBExp (MFEs / SR) | BBDE (MFEs / SR) | GBDE (MFEs / SR) | MGBDE (MFEs / SR) | BBTLBO (MFEs / SR)
F1 | 1E-8 | 15922 / 100 | 17727 / 100 | 11042 / 100 | 19214 / 100 | 11440 / 100 | 1390 / 100
F2 | 1E-8 | 17515 / 54 | 19179 / 94 | 12243 / 100 | 20592 / 90 | 12634 / 100 | 1500 / 100
F3 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0
F4 | 1E-8 | 11710 / 24 | 8120 / 84 | 3634 / 6 | 7343 / 40 | 4704 / 34 | 525 / 100
F5 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 4100 / 100
F6 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2603 / 100
F7 | 1E-8 | 17540 / 6 | 21191 / 90 | 17314 / 100 | 22684 / 94 | 15322 / 98 | 2144 / 100
F8 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 9286 / 100
F9 | 1E-2 | 17073 / 62 | 18404 / 42 | 14029 / 24 | 18182 / 52 | 17200 / 80 | NaN / 0
F10 | 1E-8 | 24647 / 26 | 27598 / 90 | 18273 / 26 | 29172 / 82 | 18320 / 84 | 2110 / 100
F11 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | 2073 / 100
F12 | 1E-8 | NaN / 0 | 25465 / 50 | NaN / 0 | 27317 / 64 | 19704 / 24 | 2471 / 100
F13 | 1E-8 | 16318 / 32 | 21523 / 58 | 11048 / 16 | 22951 / 64 | 14786 / 58 | 1470 / 100
F14 | 1E-8 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0 | NaN / 0
F15 | 1E-8 | 658 / 100 | 1176 / 100 | 1274 / 100 | 1251 / 100 | 1206 / 100 | 799 / 100
F16 | 1E-8 | 657 / 98 | 1251 / 100 | 1294 / 100 | 1343 / 100 | 1308 / 100 | 813 / 100
F17 | 1E-8 | 995 / 100 | 2626 / 100 | 1487 / 100 | 2759 / 100 | 1921 / 100 | 973 / 100
F18 | -10.15 | 1752 / 34 | 6720 / 44 | 2007 / 52 | 4377 / 32 | 8113 / 64 | 1684 / 94
F19 | -10.40 | 2839 / 34 | 8585 / 48 | 1333 / 42 | 6724 / 50 | 3056 / 66 | 2215 / 90
F20 | -10.53 | 1190 / 36 | 8928 / 74 | 1115 / 40 | 6548 / 76 | 5441 / 80 | 2822 / 82
the outputs. The variables to be optimized consist of the neural network weights and biases. For a three-layer feed-forward neural network with M input units, N hidden units, and K output units, the number of variables is

L = (M + 1) N + (N + 1) K. (13)
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns:

MSE = \frac{1}{QK} \sum_{i=1}^{Q} \sum_{j=1}^{K} \frac{1}{2} (d_{ij} - y_{ij})^2, (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function described by the following equation [31]:

y = \sin(2x) \exp(-2x). (15)

The activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of the variables is therefore 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm 50 runs are performed; the other parameters are the same as in the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained over the 50 independent runs for the compared methods. Figure 6 shows the predicted time series for training and testing using the different algorithms. It can be concluded that the approximation achieved by BBTLBO performs well.
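Equations (13) and (14) are easy to check in code; a small sketch for the 1-5-1 network used above (function names are ours):

```python
import numpy as np

def num_variables(M, N, K):
    # Eq. (13): weights plus biases of a three-layer feed-forward net
    return (M + 1) * N + (N + 1) * K

def mse(d, y):
    # Eq. (14): mean of 0.5*(d_ij - y_ij)^2 over Q patterns and K outputs
    d, y = np.asarray(d, dtype=float), np.asarray(y, dtype=float)
    Q, K = d.shape
    return np.sum(0.5 * (d - y) ** 2) / (Q * K)

dim = num_variables(1, 5, 1)  # 16, matching the dimension stated in the text
```

Each BBTLBO learner is then a vector of `dim` real numbers, and `mse` over the 200 training pairs is the fitness to be minimized.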
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P e[k] + K_I \sum_{i=1}^{k} e[i] \Delta t + K_D \frac{e[k] - e[k-1]}{\Delta t}, (16)
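Equation (16) maps directly to a few lines of code; a minimal sketch (the class name, state handling, and the illustrative gain values are ours):

```python
class DiscretePID:
    """Discrete PID controller implementing Eq. (16)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0  # running sum of e[i] * dt
        self.prev_e = 0.0    # e[k-1]

    def step(self, r, y):
        e = r - y                            # e[k] = r[k] - y[k]
        self.integral += e * self.dt         # integral term
        deriv = (e - self.prev_e) / self.dt  # (e[k] - e[k-1]) / dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integral + self.kd * deriv

pid = DiscretePID(kp=0.116, ki=0.017, kd=0.258, dt=0.05)
u = pid.step(r=1.0, y=0.0)  # first control move for a unit step input
```

In the tuning problem below, BBTLBO searches over (K_P, K_I, K_D) while a controller of this form drives the simulated plant.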
where u[k] is the controller output, e[k] = r[k] - y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE) based cost, which is given as follows [32, 33]:

f(t) = \int_{0}^{\infty} (\omega_1 |e(t)| + \omega_2 u^2(t)) \, dt + \omega_3 t_r, (17)
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1 | 3.63e-025 ± 1.85e-024 | 7.65e-025 ± 3.34e-024 | 9.23e-018 ± 3.03e-017 | 1.01e-002 ± 5.48e-003 | 3.05e-189 ± 0.0 | 0.0 ± 0.0
F2 | 1.49e-023 ± 6.69e-023 | 2.75e-025 ± 1.08e-024 | 3.68e-017 ± 5.37e-017 | 1.08e-001 ± 5.05e-002 | 1.29e-185 ± 0.0 | 0.0 ± 0.0
F3 | 3.22e-002 ± 2.83e-002 | 2.08e-002 ± 1.18e-002 | 1.28e-002 ± 5.50e-003 | 1.86e-002 ± 4.39e-003 | 5.70e-004 ± 2.37e-004 | 2.27e-004 ± 1.26e-004
F4 | 2.11e+001 ± 6.74e+001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 1.22e+002 ± 1.37e+002 | 4.28e+001 ± 2.59e+001 | 1.17e+001 ± 9.30e+000 | 2.60e+003 ± 6.79e+002 | 9.45e-043 ± 6.47e-042 | 2.16e-115 ± 1.10e-114
F6 | 3.06e+001 ± 8.50e+000 | 2.45e+000 ± 2.60e+000 | 4.67e-001 ± 2.82e-001 | 2.66e+000 ± 5.58e-001 | 2.08e-078 ± 4.30e-078 | 3.63e-154 ± 1.34e-153
F7 | 8.28e-019 ± 3.49e-018 | 5.40e-016 ± 3.81e-015 | 1.34e-011 ± 1.27e-011 | 1.70e-002 ± 2.85e-003 | 3.84e-096 ± 5.53e-096 | 1.16e-188 ± 0.0
F8 | 2.16e+000 ± 4.16e+000 | 4.88e-001 ± 5.82e-001 | 9.60e-002 ± 6.99e-002 | 5.86e+001 ± 1.70e+001 | 7.09e-022 ± 4.99e-021 | 1.07e-056 ± 4.39e-056
F9 | 2.49e+001 ± 1.05e+001 | 2.61e+001 ± 1.07e+000 | 2.40e+001 ± 1.52e+000 | 2.65e+001 ± 3.54e-001 | 2.55e+001 ± 5.01e-001 | 2.83e+001 ± 3.41e-001
F10 | 5.05e-001 ± 7.06e-001 | 2.07e-001 ± 4.58e-001 | 1.94e-001 ± 4.56e-001 | 2.16e-002 ± 4.37e-003 | 3.62e-015 ± 5.02e-016 | 3.55e-015 ± 0.0
F11 | 2.03e+000 ± 1.94e+000 | 3.86e+000 ± 1.97e+000 | 4.26e+001 ± 1.06e+001 | 1.15e+002 ± 1.54e+001 | 1.55e+001 ± 8.09e+000 | 0.0 ± 0.0
F12 | 2.88e-002 ± 1.45e-001 | 6.50e-002 ± 1.87e-001 | 7.89e-001 ± 1.03e+000 | 1.36e+000 ± 7.41e-001 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e-002 ± 3.58e-002 | 1.18e-002 ± 1.75e-002 | 1.16e-002 ± 1.58e-002 | 1.06e-001 ± 9.93e-002 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+002 ± 1.42e+002 | 1.35e+002 ± 1.26e+002 | 4.49e+003 ± 8.25e+002 | 3.96e+003 ± 8.40e+002 | 4.82e+003 ± 6.86e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -9.40e+000 ± 2.10e+000 | -9.25e+000 ± 2.30e+000 | -7.76e+000 ± 3.42e+000 | -9.79e+000 ± 1.44e+000 | -9.72e+000 ± 1.42e+000 | -9.85e+000 ± 1.22e+000
F19 | -9.85e+000 ± 1.90e+000 | -9.87e+000 ± 1.83e+000 | -9.24e+000 ± 2.70e+000 | -1.04e+001 ± 4.23e-009 | -9.22e+000 ± 2.41e+000 | -9.82e+000 ± 1.78e+000
F20 | -9.65e+000 ± 2.23e+000 | -1.01e+001 ± 1.59e+000 | -9.63e+000 ± 2.50e+000 | -1.05e+001 ± 1.01e-004 | -9.65e+000 ± 2.23e+000 | -9.41e+000 ± 2.43e+000
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | -
Figure 6: Comparison of the performance curves using different methods (TLBO, BBTLBO): (a) convergence curves (generation versus best value); (b) approximation curves (sample number versus y(t), together with the actual curve); (c) error curves (sample number versus error).
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (mean ± std) | Testing error (mean ± std)
TLBO | 9.85e-004 ± 9.26e-004 | 9.43e-004 ± 9.18e-004
BBTLBO | 3.45e-004 ± 2.02e-004 | 2.76e-004 ± 1.82e-004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function: once overshoot occurs, the overshoot value is added to the cost function, which is then given as follows [32, 33]:
f(t) = \int_{0}^{\infty} (\omega_1 |e(t)| + \omega_2 u^2(t) + \omega_4 |dy(t)|) \, dt + \omega_3 t_r, if dy(t) < 0,

f(t) = \int_{0}^{\infty} (\omega_1 |e(t)| + \omega_2 u^2(t)) \, dt + \omega_3 t_r, otherwise, (18)
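A discretized version of the penalized cost (18) can be sketched as follows (the sampled-signal layout and the function name are our assumptions; the default weights are those used later in the text):

```python
def pid_cost(e, u, dy, dt, t_r, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discretization of Eq. (18): IAE plus control-effort term, a rise-time
    term, and an overshoot penalty added on samples where dy(t) < 0."""
    J = 0.0
    for ek, uk, dyk in zip(e, u, dy):
        J += (w1 * abs(ek) + w2 * uk * uk) * dt  # Eq. (17) integrand
        if dyk < 0:
            J += w4 * abs(dyk) * dt              # overshoot penalty branch
    return J + w3 * t_r
```

Because ω4 ≫ ω1, even a small overshoot dominates the cost, which is what steers the search toward non-overshooting gain settings.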
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65000 | 1.05000 | 16.34555 | 7.05900
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65000 | 0.65000 | 11.60773 | 6.91000
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80000 | 0.70000 | 11.34300 | 7.04500
Figure 7: Performance curves using different methods (GA, PSO, BBTLBO): generation versus best value.
where ω4 is a coefficient with ω4 ≫ ω1, dy(t) = y(t) - y(t - 1), and y(t) is the output of the controlled objective.

In our simulation the transfer function of the plant examined is given as follows [34]:
G(s) = \frac{1.958}{s^3 + 17.89 s^2 + 103.3 s + 190.8}. (19)
The system sampling time is Δt = 0.05 s, and the control value u is limited to the range [-10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω1 = 0.999, ω2 = 0.001, ω3 = 2.0, and ω4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are 50, and the corresponding maximum numbers of iterations are all 50. In addition, the crossover rate is set to 0.90 and the mutation rate to 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO attains the minimum cost function value.
Figure 8: Step response curves using different methods (GA, PSO, BBTLBO): time (s) versus reference and system output.
Although the PID controllers tuned by PSO have smaller peak and rise times, their maximum overshoots are much larger than the overshoot obtained with BBTLBO. This indicates that the PID controller tuned by BBTLBO delivers the best overall control performance in the simulations.
7. Conclusion
In this paper TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and which uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is applied to 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO per generation. Compared with the other algorithms on the 20 chosen test problems, BBTLBO performs well by using neighborhood search to generate better-quality solutions, although it does not have the best performance in every experimental case in this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms considered in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man, and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552-561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687-697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303-315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1-15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447-1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225-232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535-560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177-188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710-720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965-983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341-352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147-1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58-73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937-971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80-87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128-139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634-647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677-1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976-979, Springer, Berlin, Germany, 2006.
[25] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[26] I Maruta T H Kim D Song and T Sugie ldquoSynthesis of fixed-structure robust controllers using a constrained particle swarmoptimizer with cyclic neighborhood topologyrdquo Expert Systemswith Applications vol 40 no 9 pp 3595ndash3605 2013
[27] J Kennedy and R Mendes ldquoPopulation structure and particleswarm performancerdquo in Proceedings of the International Con-ference on Evolutionary Computation pp 1671ndash1676 HonoluluHawaii USA 2002
[28] J Brest S Greiner B Boskovic M Mernik and V ZumerldquoSelf-adapting control parameters in differential evolution acomparative study on numerical benchmark problemsrdquo IEEETransactions on Evolutionary Computation vol 10 no 6 pp646ndash657 2006
The Scientific World Journal 17
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
Table 1: Details of numerical benchmarks used.

Function | Formula | D | Range | Optimum
Sphere | F_1(x) = \sum_{i=1}^{D} x_i^2 | 30 | [-100, 100] | 0
Sum square | F_2(x) = \sum_{i=1}^{D} i x_i^2 | 30 | [-100, 100] | 0
Quadric | F_3(x) = \sum_{i=1}^{D} i x_i^4 + random(0, 1) | 30 | [-1.28, 1.28] | 0
Step | F_4(x) = \sum_{i=1}^{D} (\lfloor x_i + 0.5 \rfloor)^2 | 30 | [-100, 100] | 0
Schwefel 1.2 | F_5(x) = \sum_{i=1}^{D} (\sum_{j=1}^{i} x_j)^2 | 30 | [-100, 100] | 0
Schwefel 2.21 | F_6(x) = \max_i { |x_i|, 1 \le i \le D } | 30 | [-100, 100] | 0
Schwefel 2.22 | F_7(x) = \sum_{i=1}^{D} |x_i| + \prod_{i=1}^{D} |x_i| | 30 | [-10, 10] | 0
Zakharov | F_8(x) = \sum_{i=1}^{D} x_i^2 + (\sum_{i=1}^{D} 0.5 i x_i)^2 + (\sum_{i=1}^{D} 0.5 i x_i)^4 | 30 | [-100, 100] | 0
Rosenbrock | F_9(x) = \sum_{i=1}^{D-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2] | 30 | [-2.048, 2.048] | 0
Ackley | F_10(x) = 20 - 20 exp(-(1/5) \sqrt{(1/D) \sum_{i=1}^{D} x_i^2}) - exp((1/D) \sum_{i=1}^{D} cos(2\pi x_i)) + e | 30 | [-32, 32] | 0
Rastrigin | F_11(x) = \sum_{i=1}^{D} (x_i^2 - 10 cos(2\pi x_i) + 10) | 30 | [-5.12, 5.12] | 0
Weierstrass | F_12(x) = \sum_{i=1}^{D} (\sum_{k=0}^{k_max} [a^k cos(2\pi b^k (x_i + 0.5))]) - D \sum_{k=0}^{k_max} [a^k cos(2\pi b^k \cdot 0.5)], with a = 0.5, b = 3, k_max = 20 | 30 | [-0.5, 0.5] | 0
Griewank | F_13(x) = \sum_{i=1}^{D} (x_i^2 / 4000) - \prod_{i=1}^{D} cos(x_i / \sqrt{i}) + 1 | 30 | [-600, 600] | 0
Schwefel | F_14(x) = 418.9829 D + \sum_{i=1}^{D} (-x_i sin(\sqrt{|x_i|})) | 30 | [-500, 500] | 0
Bohachevsky1 | F_15(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3\pi x_1) - 0.4 cos(4\pi x_2) + 0.7 | 2 | [-100, 100] | 0
Bohachevsky2 | F_16(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3\pi x_1) cos(4\pi x_2) + 0.3 | 2 | [-100, 100] | 0
Bohachevsky3 | F_17(x) = x_1^2 + 2 x_2^2 - 0.3 cos(3\pi x_1 + 4\pi x_2) + 0.3 | 2 | [-100, 100] | 0
Shekel5 | F_18(x) = -\sum_{i=1}^{5} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.1532
Shekel7 | F_19(x) = -\sum_{i=1}^{7} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.4029
Shekel10 | F_20(x) = -\sum_{i=1}^{10} [(x - a_i)(x - a_i)^T + c_i]^{-1} | 4 | [0, 10] | -10.5364
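For concreteness, a minimal Python sketch of three of the benchmarks in Table 1 (Sphere F_1, Ackley F_10, and Rastrigin F_11) is given below; the function names are illustrative, and each function attains its optimum value 0 at the origin.

```python
import math

def sphere(x):
    # F1: sum of squares
    return sum(xi ** 2 for xi in x)

def rastrigin(x):
    # F11: highly multimodal, separable
    return sum(xi ** 2 - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def ackley(x):
    # F10: 20 - 20*exp(-0.2*sqrt(mean(x^2))) - exp(mean(cos(2*pi*x))) + e
    d = len(x)
    s1 = sum(xi ** 2 for xi in x) / d
    s2 = sum(math.cos(2.0 * math.pi * xi) for xi in x) / d
    return 20.0 - 20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + math.e
```

Evaluating any of these at the 30-dimensional zero vector returns (numerically) the listed optimum 0.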
the maximal FEs, namely, 40000. The results are shown in Table 4 in terms of the mean number of FEs (MFEs) required to converge to the threshold and the successful rate (SR) over the 50 independent runs. "NaN" indicates that no run of the corresponding algorithm converged below the predefined threshold before meeting the maximum number of FEs. The best results among the six algorithms are shown in boldface.

From Table 4, it can be observed that all algorithms hardly converge to the threshold for the unimodal functions F3, F5, F6, and F8 and the multimodal functions F11, F12, and F14. BBTLBO converges to the threshold on all functions except F3, F9, and F14. Judging by total average FEs, BBTLBO converges faster than the other algorithms on all unimodal functions and on the majority of multimodal functions, the exceptions being F15, F16, F19, and F20. The acceleration rates between BBTLBO and the other algorithms are mostly above 10 for functions F1, F2, F4, F7, F9, F10, and F13. Judging by total average SR, BBTLBO achieves the highest SR on those test functions for which it successfully converges to the threshold value. It can be concluded that BBTLBO has good performance in terms of convergence speed and successful rate (SR) on the test functions in this paper.
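The MFEs/SR bookkeeping behind Table 4 can be sketched as follows. Here `optimize` is a placeholder for any of the compared algorithms and is assumed, for illustration only, to return the number of FEs consumed when the threshold was reached, or `None` on failure; this interface is not taken from the paper.

```python
import math

def mfes_and_sr(optimize, threshold, runs=50, max_fes=40000):
    """Mean FEs to reach the threshold and successful rate over independent runs."""
    converged = []
    for _ in range(runs):
        fes = optimize(max_fes, threshold)   # FEs used, or None if not converged
        if fes is not None:
            converged.append(fes)
    sr = 100.0 * len(converged) / runs
    # the "NaN" entries in Table 4 correspond to no successful run at all
    mfes = sum(converged) / len(converged) if converged else float("nan")
    return mfes, sr
```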
Table 2: Comparisons of mean ± std of the solutions using different u. For each function, results are listed for BBTLBO with u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0, in that order.

F1: 1.75e-001±1.21e+000 | 6.89e-071±1.01e-070 | 1.23e-163±0.0 | 1.21e-256±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F2: 8.98e-005±5.73e-004 | 5.62e-069±2.72e-068 | 2.20e-161±1.12e-160 | 2.43e-254±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F3: 1.20e-001±6.34e-002 | 5.91e-003±1.44e-003 | 1.01e-003±3.48e-004 | 4.35e-004±1.97e-004 | 2.35e-004±1.30e-004 | 2.27e-004±1.26e-004 | 1.99e-004±1.13e-004
F4: 7.65e+002±5.83e+002 | 4.80e-001±8.86e-001 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F5: 5.58e+002±6.53e+002 | 1.87e-028±5.73e-028 | 3.53e-054±1.86e-053 | 3.69e-073±2.27e-072 | 9.53e-096±6.74e-095 | 2.16e-115±1.10e-114 | 2.56e-100±1.30e-099
F6: 2.51e+001±5.34e+000 | 6.67e-021±8.81e-021 | 2.81e-061±6.36e-061 | 8.22e-100±1.80e-099 | 8.18e-137±1.41e-136 | 3.63e-154±1.34e-153 | 8.86e-147±3.22e-146
F7: 1.37e-003±9.54e-003 | 8.72e-043±1.52e-042 | 5.68e-088±8.76e-088 | 1.01e-133±2.38e-133 | 2.60e-175±0.0 | 1.16e-188±0.0 | 8.33e-180±0.0
F8: 2.41e+000±3.07e+000 | 1.32e-019±2.98e-019 | 2.13e-028±7.69e-028 | 3.44e-037±1.24e-036 | 2.20e-050±9.12e-050 | 1.07e-056±4.39e-056 | 2.03e-049±8.94e-049
F9: 2.66e+001±1.79e+000 | 2.72e+001±3.17e-001 | 2.77e+001±3.18e-001 | 2.83e+001±2.78e-001 | 2.84e+001±2.67e-001 | 2.83e+001±3.41e-001 | 2.80e+001±3.87e-001
F10: 8.30e+000±1.76e+000 | 1.77e-001±6.10e-001 | 5.90e-015±1.70e-015 | 3.55e-015±0.0 | 3.55e-015±0.0 | 3.55e-015±0.0 | 3.55e-015±0.0
F11: 3.74e+001±9.05e+000 | 3.33e+001±1.18e+001 | 2.71e+001±8.00e+000 | 1.89e+001±1.14e+001 | 5.73e+000±1.06e+001 | 0.0±0.0 | 0.0±0.0
F12: 8.15e+000±1.93e+000 | 3.38e-001±1.16e+000 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F13: 5.06e-001±8.08e-001 | 6.52e-003±8.86e-003 | 1.78e-003±3.68e-003 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F14: 4.33e+003±6.79e+002 | 4.67e+003±6.10e+002 | 5.17e+003±6.68e+002 | 5.59e+003±6.85e+002 | 5.53e+003±7.10e+002 | 5.58e+003±7.80e+002 | 5.40e+003±6.53e+002
F15: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18: -7.71e+000±3.47e+000 | -8.06e+000±3.39e+000 | -9.64e+000±1.81e+000 | -9.65e+000±1.76e+000 | -1.02e+001±6.77e-003 | -9.85e+000±1.22e+000 | -9.93e+000±1.12e+000
F19: -7.69e+000±3.52e+000 | -8.13e+000±3.36e+000 | -9.87e+000±1.83e+000 | -1.03e+001±9.45e-001 | -9.76e+000±1.95e+000 | -9.82e+000±1.78e+000 | -9.61e+000±1.99e+000
F20: -8.12e+000±3.53e+000 | -9.38e+000±2.69e+000 | -1.01e+001±1.65e+000 | -1.01e+001±1.61e+000 | -9.70e+000±2.28e+000 | -9.41e+000±2.43e+000 | -1.00e+001±1.69e+000
Table 3: Comparisons of mean ± std of the solutions using different algorithms. For each function, results are listed for BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO, in that order.

F1: 5.44e-027±1.87e-026 | 2.62e-024±5.00e-024 | 3.90e-035±2.00e-034 | 4.35e-022±1.13e-021 | 3.35e-035±2.11e-034 | 0.0±0.0
F2: 1.38e+004±2.11e+004 | 1.00e+003±4.63e+003 | 6.20e-021±4.38e-020 | 1.40e+003±4.52e+003 | 1.28e-032±8.37e-032 | 0.0±0.0
F3: 1.32e+000±3.18e+000 | 2.22e-002±7.55e-003 | 1.64e-002±9.57e-003 | 2.49e-002±9.88e-003 | 1.16e-002±5.26e-003 | 2.27e-004±1.26e-004
F4: 5.60e+000±9.28e+000 | 9.60e-001±4.27e+000 | 7.89e+001±3.05e+002 | 8.40e-001±9.12e-001 | 1.08e+000±1.28e+000 | 0.0±0.0
F5: 1.24e+004±6.66e+003 | 4.41e+003±3.37e+003 | 2.09e+000±4.00e+000 | 5.36e+003±3.26e+003 | 7.57e+002±1.16e+003 | 2.16e-115±1.10e-114
F6: 1.67e+001±9.19e+000 | 1.20e+000±5.22e-001 | 1.39e+001±4.47e+000 | 3.60e-001±1.95e-001 | 1.10e+000±2.94e+000 | 3.63e-154±1.34e-153
F7: 2.34e+001±1.32e+001 | 1.00e+000±3.03e+000 | 4.06e-019±2.15e-018 | 6.00e-001±2.40e+000 | 2.00e-001±1.41e+000 | 1.16e-188±0.0
F8: 1.87e+002±1.34e+002 | 1.58e+002±7.00e+001 | 1.16e-001±2.35e-001 | 1.72e+002±6.67e+001 | 2.49e+001±1.99e+001 | 1.07e-056±4.39e-056
F9: 7.07e+001±1.48e+002 | 3.57e+001±2.50e+001 | 2.76e+001±1.06e+001 | 3.17e+001±2.07e+001 | 2.76e+001±1.46e+001 | 2.83e+001±3.41e-001
F10: 1.06e+001±9.29e+000 | 1.52e+000±5.11e+000 | 1.34e+000±1.15e+000 | 2.59e+000±6.45e+000 | 5.54e-001±2.79e+000 | 3.55e-015±0.0
F11: 1.16e+002±3.53e+001 | 1.81e+001±7.28e+000 | 6.76e+001±3.89e+001 | 1.55e+001±5.96e+000 | 2.03e+001±9.23e+000 | 0.0±0.0
F12: 2.73e+000±2.11e+000 | 1.20e-001±4.42e-001 | 1.73e+000±1.32e+000 | 1.21e-001±3.37e-001 | 5.17e-001±8.67e-001 | 0.0±0.0
F13: 2.14e-002±4.11e-002 | 2.30e-003±4.29e-003 | 4.07e-002±4.89e-002 | 3.08e-003±7.42e-003 | 4.63e-003±7.16e-003 | 0.0±0.0
F14: 3.64e+003±6.28e+002 | 2.58e+003±5.51e+002 | 2.30e+003±4.09e+002 | 2.49e+003±5.41e+002 | 2.60e+003±5.05e+002 | 5.58e+003±7.80e+002
F15: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16: 4.37e-003±3.09e-002 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18: -5.60e+000±3.41e+000 | -7.90e+000±2.74e+000 | -7.09e+000±3.33e+000 | -7.63e+000±2.86e+000 | -8.01e+000±3.00e+000 | -9.85e+000±1.22e+000
F19: -5.97e+000±3.31e+000 | -7.87e+000±3.03e+000 | -6.21e+000±3.66e+000 | -8.60e+000±2.68e+000 | -8.37e+000±2.90e+000 | -9.82e+000±1.78e+000
F20: -5.81e+000±3.65e+000 | -9.40e+000±2.42e+000 | -6.02e+000±3.77e+000 | -9.46e+000±2.24e+000 | -9.38e+000±2.51e+000 | -9.41e+000±2.43e+000
Figure 3: Comparison of the performance curves (log10 of mean fitness versus FEs) using different u: (a) F7 Schwefel 2.22, (b) F8 Zakharov, (c) F18 Shekel5, (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiments, the maximal number of FEs, namely, 40000 for all test functions, is used as the stopping criterion of all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in the 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss counts of BBTLBO against each of the other algorithms. The best results among the algorithms are shown in boldface.

The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" row shows that BBTLBO offers good accuracy on the majority of the test functions in this paper.
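The "w/t/l" row of Table 5 is a simple pairwise tally over the 20 functions; a minimal sketch is given below (the tolerance argument used to decide ties is an assumption for illustration).

```python
def wtl(bbtlbo_means, other_means, tol=1e-12):
    """Count wins/ties/losses of BBTLBO against one competitor on a
    minimisation suite, mirroring the "w/t/l" row of Table 5."""
    w = t = l = 0
    for b, o in zip(bbtlbo_means, other_means):
        if abs(b - o) <= tol:
            t += 1          # tie: both reach (numerically) the same mean
        elif b < o:
            w += 1          # win: BBTLBO's mean is smaller (better)
        else:
            l += 1          # loss
    return f"{w}/{t}/{l}"
```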
Figure 4: Comparison of the performance curves (log10 of mean fitness versus FEs) using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO): (a) F3 Quadric, (b) F9 Rosenbrock, (c) F18 Shekel5, (d) F14 Schwefel.
Table 5 thus shows that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most complex multimodal optimization problems.
6 Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to parameter estimation in two real-world problems.
6.1. Nonlinear Function Approximation.

Figure 5: BBTLBO-based ANN (for input x, the ANN output y is compared with the desired output d, and the BBTLBO algorithm adjusts the network so as to minimize the error).

The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which in turn are all connected to all
Table 4: The mean number of FEs (MFEs) and the successful rate SR (%) with acceptable solutions using different algorithms. Each entry is MFEs/SR; t is the acceptance threshold, and results are listed for BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO, in that order.

F1 (t = 1E-8): 15922/100 | 17727/100 | 11042/100 | 19214/100 | 11440/100 | 1390/100
F2 (t = 1E-8): 17515/54 | 19179/94 | 12243/100 | 20592/90 | 12634/100 | 1500/100
F3 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F4 (t = 1E-8): 11710/24 | 8120/84 | 3634/6 | 7343/40 | 4704/34 | 525/100
F5 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 4100/100
F6 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2603/100
F7 (t = 1E-8): 17540/6 | 21191/90 | 17314/100 | 22684/94 | 15322/98 | 2144/100
F8 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 9286/100
F9 (t = 1E-2): 17073/62 | 18404/42 | 14029/24 | 18182/52 | 17200/80 | NaN/0
F10 (t = 1E-8): 24647/26 | 27598/90 | 18273/26 | 29172/82 | 18320/84 | 2110/100
F11 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2073/100
F12 (t = 1E-8): NaN/0 | 25465/50 | NaN/0 | 27317/64 | 19704/24 | 2471/100
F13 (t = 1E-8): 16318/32 | 21523/58 | 11048/16 | 22951/64 | 14786/58 | 1470/100
F14 (t = 1E-8): NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F15 (t = 1E-8): 658/100 | 1176/100 | 1274/100 | 1251/100 | 1206/100 | 799/100
F16 (t = 1E-8): 657/98 | 1251/100 | 1294/100 | 1343/100 | 1308/100 | 813/100
F17 (t = 1E-8): 995/100 | 2626/100 | 1487/100 | 2759/100 | 1921/100 | 973/100
F18 (t = -10.15): 1752/34 | 6720/44 | 2007/52 | 4377/32 | 8113/64 | 1684/94
F19 (t = -10.40): 2839/34 | 8585/48 | 1333/42 | 6724/50 | 3056/66 | 2215/90
F20 (t = -10.53): 1190/36 | 8928/74 | 1115/40 | 6548/76 | 5441/80 | 2822/82
the outputs. The variables consist of the neural network weights and biases. Suppose a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units; then the number of variables is given as follows:

L = (M + 1) * N + (N + 1) * K. (13)
For neural network training, the aim is to find a set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is shown as follows:

MSE = (1 / (Q * K)) \sum_{i=1}^{Q} \sum_{j=1}^{K} (1/2) (d_ij - y_ij)^2, (14)
where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(-2x). (15)

In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. The number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for both methods. Figure 6 shows the predicted time series for training and test using the different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
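The 1-5-1 network and objective described above can be sketched in Python as follows. This is a minimal illustration, not the paper's code: the packing order of weights and biases in `w` and the sampling interval of the 200 training pairs are assumptions.

```python
import math

def num_variables(m, n, k):
    # equation (13): weights and biases of an M-N-K feed-forward network
    return (m + 1) * n + (n + 1) * k

def forward(x, w, m=1, n=5, k=1):
    """One pass through an M-N-K network with sigmoid hidden units and a
    linear output unit; w packs all weights and biases unit by unit."""
    idx = 0
    hidden = []
    for _ in range(n):
        s = sum(w[idx + j] * x[j] for j in range(m)) + w[idx + m]  # weights + bias
        idx += m + 1
        hidden.append(1.0 / (1.0 + math.exp(-s)))                  # sigmoid activation
    out = []
    for _ in range(k):
        s = sum(w[idx + j] * hidden[j] for j in range(n)) + w[idx + n]
        idx += n + 1
        out.append(s)                                              # linear activation
    return out

def mse(desired, predicted):
    # equation (14): mean sum of squared errors over Q patterns of K outputs
    q, k = len(desired), len(desired[0])
    return sum(0.5 * (d - y) ** 2
               for dp, yp in zip(desired, predicted)
               for d, y in zip(dp, yp)) / (q * k)

# 200 training pairs sampled from the true curve y = sin(2x)exp(-2x) of (15);
# the sampling grid is an assumption for illustration
data = [(i / 50.0, math.sin(2 * i / 50.0) * math.exp(-2 * i / 50.0))
        for i in range(200)]
```

For the 1-5-1 architecture, `num_variables(1, 5, 1)` gives the 16 decision variables that BBTLBO would optimize to minimize `mse` over `data`.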
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P * e[k] + K_I * \sum_{i=1}^{k} e[i] * Δt + K_D * (e[k] - e[k-1]) / Δt, (16)
where 119906[119896] is the controlled output respectively 119890[119896] = 119903[119896]minus119910[119896] is the error signal 119903[119896] and 119910[119896] are the reference signaland the system output and 119870
119875 119870119868 and 119870
119863represent the
proportional integral and derivate gains respectivelyFor an unknown plant the goal of this problem is to
minimize the integral absolute error (IAE) which is given asfollow [32 33]
119891 (119905) = int
infin
0
(1205961 |119890 (119905)| + 1205962119906
2(119905)) 119889119905 + 1205963119905119903 (17)
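The discrete control law (16) can be sketched directly in Python; the class layout below is illustrative rather than the paper's code, and the control value is clamped to [-10, 10] as in the simulation settings of this section.

```python
class PID:
    """Discrete PID of (16): u[k] = Kp*e[k] + Ki*sum(e)*dt + Kd*(e[k]-e[k-1])/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.acc = 0.0      # running sum of errors e[1..k]
        self.prev = 0.0     # e[k-1]

    def step(self, r, y):
        e = r - y                                   # e[k] = r[k] - y[k]
        self.acc += e
        u = (self.kp * e
             + self.ki * self.acc * self.dt
             + self.kd * (e - self.prev) / self.dt)
        self.prev = e
        return max(-10.0, min(10.0, u))             # actuator limit [-10, 10]
```

An optimizer such as BBTLBO would search over (kp, ki, kd) to minimize the cost function above.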
Table 5: Comparisons of mean ± std of the solutions using different algorithms. For each function, results are listed for jDE, SaDE, PSOcfLocal, PSOwFIPS, TLBO, and BBTLBO, in that order.

F1: 3.63e-025±1.85e-024 | 7.65e-025±3.34e-024 | 9.23e-018±3.03e-017 | 1.01e-002±5.48e-003 | 3.05e-189±0.0 | 0.0±0.0
F2: 1.49e-023±6.69e-023 | 2.75e-025±1.08e-024 | 3.68e-017±5.37e-017 | 1.08e-001±5.05e-002 | 1.29e-185±0.0 | 0.0±0.0
F3: 3.22e-002±2.83e-002 | 2.08e-002±1.18e-002 | 1.28e-002±5.50e-003 | 1.86e-002±4.39e-003 | 5.70e-004±2.37e-004 | 2.27e-004±1.26e-004
F4: 2.11e+001±6.74e+001 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F5: 1.22e+002±1.37e+002 | 4.28e+001±2.59e+001 | 1.17e+001±9.30e+000 | 2.60e+003±6.79e+002 | 9.45e-043±6.47e-042 | 2.16e-115±1.10e-114
F6: 3.06e+001±8.50e+000 | 2.45e+000±2.60e+000 | 4.67e-001±2.82e-001 | 2.66e+000±5.58e-001 | 2.08e-078±4.30e-078 | 3.63e-154±1.34e-153
F7: 8.28e-019±3.49e-018 | 5.40e-016±3.81e-015 | 1.34e-011±1.27e-011 | 1.70e-002±2.85e-003 | 3.84e-096±5.53e-096 | 1.16e-188±0.0
F8: 2.16e+000±4.16e+000 | 4.88e-001±5.82e-001 | 9.60e-002±6.99e-002 | 5.86e+001±1.70e+001 | 7.09e-022±4.99e-021 | 1.07e-056±4.39e-056
F9: 2.49e+001±1.05e+001 | 2.61e+001±1.07e+000 | 2.40e+001±1.52e+000 | 2.65e+001±3.54e-001 | 2.55e+001±5.01e-001 | 2.83e+001±3.41e-001
F10: 5.05e-001±7.06e-001 | 2.07e-001±4.58e-001 | 1.94e-001±4.56e-001 | 2.16e-002±4.37e-003 | 3.62e-015±5.02e-016 | 3.55e-015±0.0
F11: 2.03e+000±1.94e+000 | 3.86e+000±1.97e+000 | 4.26e+001±1.06e+001 | 1.15e+002±1.54e+001 | 1.55e+001±8.09e+000 | 0.0±0.0
F12: 2.88e-002±1.45e-001 | 6.50e-002±1.87e-001 | 7.89e-001±1.03e+000 | 1.36e+000±7.41e-001 | 0.0±0.0 | 0.0±0.0
F13: 1.87e-002±3.58e-002 | 1.18e-002±1.75e-002 | 1.16e-002±1.58e-002 | 1.06e-001±9.93e-002 | 0.0±0.0 | 0.0±0.0
F14: 1.93e+002±1.42e+002 | 1.35e+002±1.26e+002 | 4.49e+003±8.25e+002 | 3.96e+003±8.40e+002 | 4.82e+003±6.86e+002 | 5.58e+003±7.80e+002
F15: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17: 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18: -9.40e+000±2.10e+000 | -9.25e+000±2.30e+000 | -7.76e+000±3.42e+000 | -9.79e+000±1.44e+000 | -9.72e+000±1.42e+000 | -9.85e+000±1.22e+000
F19: -9.85e+000±1.90e+000 | -9.87e+000±1.83e+000 | -9.24e+000±2.70e+000 | -1.04e+001±4.23e-009 | -9.22e+000±2.41e+000 | -9.82e+000±1.78e+000
F20: -9.65e+000±2.23e+000 | -1.01e+001±1.59e+000 | -9.63e+000±2.50e+000 | -1.05e+001±1.01e-004 | -9.65e+000±2.23e+000 | -9.41e+000±2.43e+000
w/t/l: 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | —
Figure 6: Comparison of the performance curves using different methods (TLBO and BBTLBO): (a) convergence curves (best value versus generation), (b) approximation curves (y(t) versus sample number, against the actual curve), (c) error curves.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (mean ± std) | Testing error (mean ± std)
TLBO | 9.85e-004 ± 9.26e-004 | 9.43e-004 ± 9.18e-004
BBTLBO | 3.45e-004 ± 2.02e-004 | 2.76e-004 ± 1.82e-004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rising time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshooting, a penalty value is adopted in the cost function; that is, once overshooting occurs, the value of the overshoot is added to the cost function, and the cost function is given as follows [32, 33]:
if 119889119910 (119905) lt 0
119891 (119905) = int
infin
0
(1205961 |119890 (119905)| + 1205962119906
2(119905)
+1205964
1003816100381610038161003816119889119910 (119905)
1003816100381610038161003816) 119889119905 + 120596
3119905119903
else
119891 (119905) = int
infin
0
(1205961 |119890 (119905)| + 1205962119906
2(119905)) 119889119905 + 1205963119905119903
end
(18)
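A discretized reading of the penalized cost (18) can be sketched as follows. This is an illustrative Python fragment, not the authors' code: the rectangular integration, finite horizon, and the signal names `e`, `u`, `y` are assumptions.

```python
def pid_cost(e, u, y, dt, t_r, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discretized version of cost (18): integrate w1*|e| + w2*u^2 over
    the sampled horizon, add the rise-time term w3*t_r, and add the
    overshoot penalty w4*|dy| at steps where dy(t) = y(t) - y(t-1) < 0."""
    cost = w3 * t_r
    prev_y = y[0]
    for ek, uk, yk in zip(e, u, y):
        dy = yk - prev_y
        cost += (w1 * abs(ek) + w2 * uk * uk) * dt
        if dy < 0:  # overshoot-penalty branch of (18)
            cost += w4 * abs(dy) * dt
        prev_y = yk
    return cost
```

Because ω_4 ≫ ω_1, even a small decrease of y after the peak dominates the cost, which is what discourages overshooting responses during tuning.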
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65 | 1.05 | 16.34555 | 70.59
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65 | 0.65 | 11.60773 | 69.10
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80 | 0.70 | 11.34300 | 70.45
[Figure 7: Performance curves using different methods (GA, PSO, BBTLBO): best cost value versus generation.]
where ω_4 is a coefficient with ω_4 ≫ ω_1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.

In our simulation, the plant examined is given as follows [34]:

  G(s) = 1958 / (s³ + 1789s² + 1033s + 1908).   (19)

The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. Other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2.0, and ω_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set to 0.90 and the mutation rate to 0.10 for GA.

The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function and CPU time.
[Figure 8: Step response curves using different methods (GA, PSO, BBTLBO): system output versus time (s), 0–5 s.]
Although the PID controller tuned by PSO has a smaller peak time and rise time, its maximum overshoot is much larger than that of the controller tuned by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO delivers the best control performance in the simulations.
7. Conclusion

In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and which uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is applied to 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time per generation than the standard TLBO. Compared with the other algorithms on the 20 chosen test problems, BBTLBO achieves good performance by using neighborhood search to generate better-quality solutions, although it does not always perform best in every experimental case of this paper. It can also be observed that BBTLBO
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man, and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multiobjective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
Table 2: Comparisons (mean ± std) of the solutions using different u.

Fun | BBTLBO (u=0.0) | BBTLBO (u=0.1) | BBTLBO (u=0.3) | BBTLBO (u=0.5) | BBTLBO (u=0.7) | BBTLBO (u=0.9) | BBTLBO (u=1.0)
F1 | 1.75e−001±1.21e+000 | 6.89e−071±1.01e−070 | 1.23e−163±0.0 | 1.21e−256±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F2 | 8.98e−005±5.73e−004 | 5.62e−069±2.72e−068 | 2.20e−161±1.12e−160 | 2.43e−254±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F3 | 1.20e−001±6.34e−002 | 5.91e−003±1.44e−003 | 1.01e−003±3.48e−004 | 4.35e−004±1.97e−004 | 2.35e−004±1.30e−004 | 2.27e−004±1.26e−004 | 1.99e−004±1.13e−004
F4 | 7.65e+002±5.83e+002 | 4.80e−001±8.86e−001 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F5 | 5.58e+002±6.53e+002 | 1.87e−028±5.73e−028 | 3.53e−054±1.86e−053 | 3.69e−073±2.27e−072 | 9.53e−096±6.74e−095 | 2.16e−115±1.10e−114 | 2.56e−100±1.30e−099
F6 | 2.51e+001±5.34e+000 | 6.67e−021±8.81e−021 | 2.81e−061±6.36e−061 | 8.22e−100±1.80e−099 | 8.18e−137±1.41e−136 | 3.63e−154±1.34e−153 | 8.86e−147±3.22e−146
F7 | 1.37e−003±9.54e−003 | 8.72e−043±1.52e−042 | 5.68e−088±8.76e−088 | 1.01e−133±2.38e−133 | 2.60e−175±0.0 | 1.16e−188±0.0 | 8.33e−180±0.0
F8 | 2.41e+000±3.07e+000 | 1.32e−019±2.98e−019 | 2.13e−028±7.69e−028 | 3.44e−037±1.24e−036 | 2.20e−050±9.12e−050 | 1.07e−056±4.39e−056 | 2.03e−049±8.94e−049
F9 | 2.66e+001±1.79e+000 | 2.72e+001±3.17e−001 | 2.77e+001±3.18e−001 | 2.83e+001±2.78e−001 | 2.84e+001±2.67e−001 | 2.83e+001±3.41e−001 | 2.80e+001±3.87e−001
F10 | 8.30e+000±1.76e+000 | 1.77e−001±6.10e−001 | 5.90e−015±1.70e−015 | 3.55e−015±0.0 | 3.55e−015±0.0 | 3.55e−015±0.0 | 3.55e−015±0.0
F11 | 3.74e+001±9.05e+000 | 3.33e+001±1.18e+001 | 2.71e+001±8.00e+000 | 1.89e+001±1.14e+001 | 5.73e+000±1.06e+001 | 0.0±0.0 | 0.0±0.0
F12 | 8.15e+000±1.93e+000 | 3.38e−001±1.16e+000 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F13 | 5.06e−001±8.08e−001 | 6.52e−003±8.86e−003 | 1.78e−003±3.68e−003 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F14 | 4.33e+003±6.79e+002 | 4.67e+003±6.10e+002 | 5.17e+003±6.68e+002 | 5.59e+003±6.85e+002 | 5.53e+003±7.10e+002 | 5.58e+003±7.80e+002 | 5.40e+003±6.53e+002
F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18 | −7.71e+000±3.47e+000 | −8.06e+000±3.39e+000 | −9.64e+000±1.81e+000 | −9.65e+000±1.76e+000 | −1.02e+001±6.77e−003 | −9.85e+000±1.22e+000 | −9.93e+000±1.12e+000
F19 | −7.69e+000±3.52e+000 | −8.13e+000±3.36e+000 | −9.87e+000±1.83e+000 | −1.03e+001±9.45e−001 | −9.76e+000±1.95e+000 | −9.82e+000±1.78e+000 | −9.61e+000±1.99e+000
F20 | −8.12e+000±3.53e+000 | −9.38e+000±2.69e+000 | −1.01e+001±1.65e+000 | −1.01e+001±1.61e+000 | −9.70e+000±2.28e+000 | −9.41e+000±2.43e+000 | −1.00e+001±1.69e+000
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e−027±1.87e−026 | 2.62e−024±5.00e−024 | 3.90e−035±2.00e−034 | 4.35e−022±1.13e−021 | 3.35e−035±2.11e−034 | 0.0±0.0
F2 | 13800±2.11e+004 | 1000±4.63e+003 | 6.20e−021±4.38e−020 | 1400±4.52e+003 | 1.28e−032±8.37e−032 | 0.0±0.0
F3 | 1.32e+000±3.18e+000 | 2.22e−002±7.55e−003 | 1.64e−002±9.57e−003 | 2.49e−002±9.88e−003 | 1.16e−002±5.26e−003 | 2.27e−004±1.26e−004
F4 | 5.60e+000±9.28e+000 | 9.60e−001±4.27e+000 | 7.89e+001±3.05e+002 | 8.40e−001±9.12e−001 | 1.08e+000±1.28e+000 | 0.0±0.0
F5 | 1.24e+004±6.66e+003 | 4.41e+003±3.37e+003 | 2.09e+000±4.00e+000 | 5.36e+003±3.26e+003 | 7.57e+002±1.16e+003 | 2.16e−115±1.10e−114
F6 | 1.67e+001±9.19e+000 | 1.20e+000±5.22e−001 | 1.39e+001±4.47e+000 | 3.60e−001±1.95e−001 | 1.10e+000±2.94e+000 | 3.63e−154±1.34e−153
F7 | 2.34e+001±1.32e+001 | 1.00e+000±3.03e+000 | 4.06e−019±2.15e−018 | 6.00e−001±2.40e+000 | 2.00e−001±1.41e+000 | 1.16e−188±0.0
F8 | 1.87e+002±1.34e+002 | 1.58e+002±7.00e+001 | 1.16e−001±2.35e−001 | 1.72e+002±6.67e+001 | 2.49e+001±1.99e+001 | 1.07e−056±4.39e−056
F9 | 7.07e+001±1.48e+002 | 3.57e+001±2.50e+001 | 2.76e+001±1.06e+001 | 3.17e+001±2.07e+001 | 2.76e+001±1.46e+001 | 2.83e+001±3.41e−001
F10 | 1.06e+001±9.29e+000 | 1.52e+000±5.11e+000 | 1.34e+000±1.15e+000 | 2.59e+000±6.45e+000 | 5.54e−001±2.79e+000 | 3.55e−015±0.0
F11 | 1.16e+002±3.53e+001 | 1.81e+001±7.28e+000 | 6.76e+001±3.89e+001 | 1.55e+001±5.96e+000 | 2.03e+001±9.23e+000 | 0.0±0.0
F12 | 2.73e+000±2.11e+000 | 1.20e−001±4.42e−001 | 1.73e+000±1.32e+000 | 1.21e−001±3.37e−001 | 5.17e−001±8.67e−001 | 0.0±0.0
F13 | 2.14e−002±4.11e−002 | 2.30e−003±4.29e−003 | 4.07e−002±4.89e−002 | 3.08e−003±7.42e−003 | 4.63e−003±7.16e−003 | 0.0±0.0
F14 | 3.64e+003±6.28e+002 | 2.58e+003±5.51e+002 | 2.30e+003±4.09e+002 | 2.49e+003±5.41e+002 | 2.60e+003±5.05e+002 | 5.58e+003±7.80e+002
F15 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F16 | 4.37e−003±3.09e−002 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F17 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0 | 0.0±0.0
F18 | −5.60e+000±3.41e+000 | −7.90e+000±2.74e+000 | −7.09e+000±3.33e+000 | −7.63e+000±2.86e+000 | −8.01e+000±3.00e+000 | −9.85e+000±1.22e+000
F19 | −5.97e+000±3.31e+000 | −7.87e+000±3.03e+000 | −6.21e+000±3.66e+000 | −8.60e+000±2.68e+000 | −8.37e+000±2.90e+000 | −9.82e+000±1.78e+000
F20 | −5.81e+000±3.65e+000 | −9.40e+000±2.42e+000 | −6.02e+000±3.77e+000 | −9.46e+000±2.24e+000 | −9.38e+000±2.51e+000 | −9.41e+000±2.43e+000
[Figure 3: Comparison of the performance curves using different u (u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0): (a) F7 Schwefel 2.22, (b) F8 Zakharov, (c) F18 Shekel5, (d) F11 Rastrigin; log10(mean fitness) or mean fitness versus FEs.]
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compared the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiments, the maximal number of FEs, namely 40000 for all test functions, is used as the stopping criterion of all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the competition results (wins/ties/losses) between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.

The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" row shows that BBTLBO offers better accuracy on the majority of the test functions in this paper.
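The w/t/l tally can be reproduced mechanically from the per-function mean errors; the following small Python sketch uses hypothetical values, not the paper's data:

```python
def wtl(competitor_means, bbtlbo_means, tol=1e-12):
    """Count functions where BBTLBO wins (smaller mean error),
    ties (within tol), or loses against one competitor -- the
    w/t/l convention used in the comparison tables (minimization)."""
    w = t = l = 0
    for c, b in zip(competitor_means, bbtlbo_means):
        if abs(c - b) <= tol:
            t += 1
        elif b < c:   # smaller mean error is better
            w += 1
        else:
            l += 1
    return w, t, l
```

Applied to 20 functions, the three counts always sum to 20, which is a quick consistency check on entries such as 13/3/4.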
[Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, BBTLBO): (a) F3 Quadric, (b) F9 Rosenbrock, (c) F18 Shekel5, (d) F14 Schwefel; log10(mean fitness) or mean fitness versus FEs.]
Table 5 shows that BBTLBO achieves good solution accuracy on all unimodal optimization problems and on most of the complex multimodal optimization problems.
6. Two Real-World Optimization Problems

In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5: BBTLBO adjusts the ANN using the difference between the desired output d and the network output y for input x.

[Figure 5: BBTLBO-based ANN.]

The inputs are connected to all the hidden units, which in turn are connected to all
Table 4: The mean number of FEs (MFEs) and the success rate (SR) with acceptable solutions using different algorithms.

Fun (t value) | BBPSO MFEs/SR | BBExp MFEs/SR | BBDE MFEs/SR | GBDE MFEs/SR | MGBDE MFEs/SR | BBTLBO MFEs/SR
F1 (1E−8) | 15922/100 | 17727/100 | 11042/100 | 19214/100 | 11440/100 | 1390/100
F2 (1E−8) | 17515/54 | 19179/94 | 12243/100 | 20592/90 | 12634/100 | 1500/100
F3 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F4 (1E−8) | 11710/24 | 8120/84 | 3634/6 | 7343/40 | 4704/34 | 525/100
F5 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 4100/100
F6 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2603/100
F7 (1E−8) | 17540/6 | 21191/90 | 17314/100 | 22684/94 | 15322/98 | 2144/100
F8 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 9286/100
F9 (1E−2) | 17073/62 | 18404/42 | 14029/24 | 18182/52 | 17200/80 | NaN/0
F10 (1E−8) | 24647/26 | 27598/90 | 18273/26 | 29172/82 | 18320/84 | 2110/100
F11 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2073/100
F12 (1E−8) | NaN/0 | 25465/50 | NaN/0 | 27317/64 | 19704/24 | 2471/100
F13 (1E−8) | 16318/32 | 21523/58 | 11048/16 | 22951/64 | 14786/58 | 1470/100
F14 (1E−8) | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F15 (1E−8) | 658/100 | 1176/100 | 1274/100 | 1251/100 | 1206/100 | 799/100
F16 (1E−8) | 657/98 | 1251/100 | 1294/100 | 1343/100 | 1308/100 | 813/100
F17 (1E−8) | 995/100 | 2626/100 | 1487/100 | 2759/100 | 1921/100 | 973/100
F18 (−10.15) | 1752/34 | 6720/44 | 2007/52 | 4377/32 | 8113/64 | 1684/94
F19 (−10.40) | 2839/34 | 8585/48 | 1333/42 | 6724/50 | 3056/66 | 2215/90
F20 (−10.53) | 1190/36 | 8928/74 | 1115/40 | 6548/76 | 5441/80 | 2822/82
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is

  L = (M + 1) × N + (N + 1) × K.   (13)

For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns:

  MSE = (1 / (Q × K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2)(d_ij − y_ij)²,   (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function described by the following equation [31]:

  y = sin(2x) exp(−2x).   (15)

In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. The number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed; the other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for each method. Figure 6 shows the predicted time series for training and testing using the different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
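As a concrete illustration of (13)–(15), here is a minimal Python sketch of the 1-5-1 network and its MSE objective. The weight packing order is an assumption (the paper does not specify it); any optimizer, BBTLBO included, would simply minimize `mse` over the L = 16 variables:

```python
import math

M, N, K = 1, 5, 1                      # input, hidden, output units
L = (M + 1) * N + (N + 1) * K          # eq. (13): number of variables

def forward(w, x):
    """1-5-1 feed-forward net: sigmoid hidden layer, linear output.
    `w` packs the hidden weights+biases first, then the output layer."""
    hidden = []
    for j in range(N):
        s = w[j * (M + 1)] * x + w[j * (M + 1) + M]   # weight*x + bias
        hidden.append(1.0 / (1.0 + math.exp(-s)))
    off = (M + 1) * N
    return sum(w[off + j] * hidden[j] for j in range(N)) + w[off + N]

def mse(w, samples):
    """Eq. (14) for K = 1: mean of (1/2)(d - y)^2 over the training set."""
    return sum(0.5 * (d - forward(w, x)) ** 2 for x, d in samples) / len(samples)

# training data drawn from the target curve of eq. (15): y = sin(2x)exp(-2x)
samples = [(x / 100.0, math.sin(2 * x / 100.0) * math.exp(-2 * x / 100.0))
           for x in range(200)]
```

The sampling grid above is illustrative; the paper only states that 200 data pairs are taken from the real model.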
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

  u[k] = K_P·e[k] + K_I·Σ_{i=1}^{k} e[i]·Δt + K_D·(e[k] − e[k−1]) / Δt,   (16)

where u[k] is the controller output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the cost function based on the integral absolute error (IAE), which is given as follows [32, 33]:

  f(t) = ∫_0^∞ (ω_1|e(t)| + ω_2 u²(t)) dt + ω_3 t_r.   (17)
The Scientific World Journal 13
Table5Com
paris
onsm
eanplusmnstd
ofthes
olutions
usingdifferent
algorithm
s
Fun
jDE
SaDE
PSOcfLo
cal
PSOwFIPS
TLBO
BBTL
BO1198651
363119890minus025plusmn185119890minus024
765119890minus025plusmn334119890minus024
923119890minus018plusmn303119890minus017
101119890minus002plusmn548119890minus003
305119890minus189plusmn00
00plusmn00
1198652
149119890minus023plusmn669119890minus023
275119890minus025plusmn108119890minus024
368119890minus017plusmn537119890minus017
108119890minus001plusmn505119890minus002
129119890minus185plusmn00
00plusmn00
1198653
322119890minus002plusmn283119890minus002
208119890minus002plusmn118119890minus002
128119890minus002plusmn550119890minus003
186119890minus002plusmn439119890minus003
570119890minus004plusmn237119890minus004
227eminus
004plusmn126
eminus004
1198654
211119890+001plusmn674119890+001
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
1198655
122119890+002plusmn137119890+002
428119890+001plusmn259119890+001
117119890+001plusmn930119890+000
260119890+003plusmn679119890+002
945119890minus043plusmn647119890minus042
216eminus
115plusmn110
eminus114
1198656
306119890+001plusmn850119890+000
245119890+000plusmn260119890+000
14 The Scientific World Journal
Figure 6: Comparison of the performance curves using different algorithms: (a) convergence curves (best value versus generation for TLBO and BBTLBO); (b) approximation curves (y(t) versus sample number for the actual function, TLBO, and BBTLBO); (c) error curves (error versus sample number for TLBO and BBTLBO).
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean ± Std) | Testing error (Mean ± Std)
TLBO | 9.85e−004 ± 9.26e−004 | 9.43e−004 ± 9.18e−004
BBTLBO | 3.45e−004 ± 2.02e−004 | 2.76e−004 ± 1.82e−004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rising time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshooting, a penalty value is adopted in the cost function; that is, once overshooting occurs, the value of the overshoot is added to the cost function, and the cost function is given as follows [32, 33]:
f(t) = \begin{cases} \int_0^\infty \left( \omega_1 |e(t)| + \omega_2 u^2(t) + \omega_4 |dy(t)| \right) dt + \omega_3 t_r, & \text{if } dy(t) < 0, \\ \int_0^\infty \left( \omega_1 |e(t)| + \omega_2 u^2(t) \right) dt + \omega_3 t_r, & \text{otherwise} \end{cases}    (18)
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P | K_I | K_D | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA | 0.11257 | 0.02710 | 0.28792 | 2.90585 | 1.65 | 1.05 | 16.34555 | 7.0590
PSO | 0.11772 | 0.01756 | 0.27737 | 1.04808 | 1.65 | 0.65 | 11.60773 | 6.9100
BBTLBO | 0.11605 | 0.01661 | 0.25803 | 0.34261 | 1.80 | 0.70 | 11.34300 | 7.0450
Figure 7: Performance curves (best value versus generation) using different methods (GA, PSO, and BBTLBO).
where ω4 is a coefficient with ω4 ≫ ω1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.

In our simulation, the transfer function of the plant examined is given as follows [34]:

G(s) = \frac{1958}{s^3 + 1789 s^2 + 1033 s + 1908}    (19)

The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω1 = 0.999, ω2 = 0.001, ω3 = 2, and ω4 = 100 in this example.

In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set as 0.90 and the mutation rate as 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO achieves the minimum cost function value.
Figure 8: Step response curves (system output versus time) using different methods (GA, PSO, and BBTLBO).
Although the PID controllers tuned by PSO have a smaller peak time and rise time, their maximum overshoots are much larger than the overshoot of the controller tuned by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO delivers the best control performance in these simulations.
7 Conclusion

In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and which uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is utilized to optimize 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it needs to spend more CPU time than the standard TLBO in each generation. Compared with other algorithms on the 20 chosen test problems, the BBTLBO algorithm achieves good performance by using neighborhood search to generate better-quality solutions, although it does not always have the best performance in all the experimental cases of this paper. It can also be observed that the BBTLBO algorithm gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments

This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082), the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stützle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
Table 3: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | BBPSO | BBExp | BBDE | GBDE | MGBDE | BBTLBO
F1 | 5.44e−027 ± 1.87e−026 | 2.62e−024 ± 5.00e−024 | 3.90e−035 ± 2.00e−034 | 4.35e−022 ± 1.13e−021 | 3.35e−035 ± 2.11e−034 | 0.0 ± 0.0
F2 | 1.38e+004 ± 2.11e+004 | 1.00e+003 ± 4.63e+003 | 6.20e−021 ± 4.38e−020 | 1.40e+003 ± 4.52e+003 | 1.28e−032 ± 8.37e−032 | 0.0 ± 0.0
F3 | 1.32e+000 ± 3.18e+000 | 2.22e−002 ± 7.55e−003 | 1.64e−002 ± 9.57e−003 | 2.49e−002 ± 9.88e−003 | 1.16e−002 ± 5.26e−003 | 2.27e−004 ± 1.26e−004
F4 | 5.60e+000 ± 9.28e+000 | 9.60e−001 ± 4.27e+000 | 7.89e+001 ± 3.05e+002 | 8.40e−001 ± 9.12e−001 | 1.08e+000 ± 1.28e+000 | 0.0 ± 0.0
F5 | 1.24e+004 ± 6.66e+003 | 4.41e+003 ± 3.37e+003 | 2.09e+000 ± 4.00e+000 | 5.36e+003 ± 3.26e+003 | 7.57e+002 ± 1.16e+003 | 2.16e−115 ± 1.10e−114
F6 | 1.67e+001 ± 9.19e+000 | 1.20e+000 ± 5.22e−001 | 1.39e+001 ± 4.47e+000 | 3.60e−001 ± 1.95e−001 | 1.10e+000 ± 2.94e+000 | 3.63e−154 ± 1.34e−153
F7 | 2.34e+001 ± 1.32e+001 | 1.00e+000 ± 3.03e+000 | 4.06e−019 ± 2.15e−018 | 6.00e−001 ± 2.40e+000 | 2.00e−001 ± 1.41e+000 | 1.16e−188 ± 0.0
F8 | 1.87e+002 ± 1.34e+002 | 1.58e+002 ± 7.00e+001 | 1.16e−001 ± 2.35e−001 | 1.72e+002 ± 6.67e+001 | 2.49e+001 ± 1.99e+001 | 1.07e−056 ± 4.39e−056
F9 | 7.07e+001 ± 1.48e+002 | 3.57e+001 ± 2.50e+001 | 2.76e+001 ± 1.06e+001 | 3.17e+001 ± 2.07e+001 | 2.76e+001 ± 1.46e+001 | 2.83e+001 ± 3.41e−001
F10 | 1.06e+001 ± 9.29e+000 | 1.52e+000 ± 5.11e+000 | 1.34e+000 ± 1.15e+000 | 2.59e+000 ± 6.45e+000 | 5.54e−001 ± 2.79e+000 | 3.55e−015 ± 0.0
F11 | 1.16e+002 ± 3.53e+001 | 1.81e+001 ± 7.28e+000 | 6.76e+001 ± 3.89e+001 | 1.55e+001 ± 5.96e+000 | 2.03e+001 ± 9.23e+000 | 0.0 ± 0.0
F12 | 2.73e+000 ± 2.11e+000 | 1.20e−001 ± 4.42e−001 | 1.73e+000 ± 1.32e+000 | 1.21e−001 ± 3.37e−001 | 5.17e−001 ± 8.67e−001 | 0.0 ± 0.0
F13 | 2.14e−002 ± 4.11e−002 | 2.30e−003 ± 4.29e−003 | 4.07e−002 ± 4.89e−002 | 3.08e−003 ± 7.42e−003 | 4.63e−003 ± 7.16e−003 | 0.0 ± 0.0
F14 | 3.64e+003 ± 6.28e+002 | 2.58e+003 ± 5.51e+002 | 2.30e+003 ± 4.09e+002 | 2.49e+003 ± 5.41e+002 | 2.60e+003 ± 5.05e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 4.37e−003 ± 3.09e−002 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −5.60e+000 ± 3.41e+000 | −7.90e+000 ± 2.74e+000 | −7.09e+000 ± 3.33e+000 | −7.63e+000 ± 2.86e+000 | −8.01e+000 ± 3.00e+000 | −9.85e+000 ± 1.22e+000
F19 | −5.97e+000 ± 3.31e+000 | −7.87e+000 ± 3.03e+000 | −6.21e+000 ± 3.66e+000 | −8.60e+000 ± 2.68e+000 | −8.37e+000 ± 2.90e+000 | −9.82e+000 ± 1.78e+000
F20 | −5.81e+000 ± 3.65e+000 | −9.40e+000 ± 2.42e+000 | −6.02e+000 ± 3.77e+000 | −9.46e+000 ± 2.24e+000 | −9.38e+000 ± 2.51e+000 | −9.41e+000 ± 2.43e+000
Figure 3: Comparison of the performance curves using different u (u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0; mean fitness or log10(mean fitness) versus FEs): (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel5; (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiments, the maximal number of FEs, namely 40000 for all test functions, is used as the stopping criterion for all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained in 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.

The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" row shows that BBTLBO achieves good accuracy for the majority of the test functions in this paper.
Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO; mean fitness or log10(mean fitness) versus FEs): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel.
Table 5 indicates that BBTLBO achieves good solution accuracy on all the unimodal optimization problems and on most of the complex multimodal optimization problems.
6 Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5.

Figure 5: BBTLBO-based ANN (input x, network output y, desired output d; the output error drives the BBTLBO algorithm).

The inputs are connected to all the hidden units, which in turn are all connected to all
Table 4: The mean number of FEs (MFEs) and success rate (SR, %) with acceptable solutions using different algorithms.

Fun | Accept. value | BBPSO (MFEs/SR) | BBExp (MFEs/SR) | BBDE (MFEs/SR) | GBDE (MFEs/SR) | MGBDE (MFEs/SR) | BBTLBO (MFEs/SR)
F1 | 1E−8 | 15922/100 | 17727/100 | 11042/100 | 19214/100 | 11440/100 | 1390/100
F2 | 1E−8 | 17515/54 | 19179/94 | 12243/100 | 20592/90 | 12634/100 | 1500/100
F3 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F4 | 1E−8 | 11710/24 | 8120/84 | 3634/6 | 7343/40 | 4704/34 | 525/100
F5 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 4100/100
F6 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2603/100
F7 | 1E−8 | 17540/6 | 21191/90 | 17314/100 | 22684/94 | 15322/98 | 2144/100
F8 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 9286/100
F9 | 1E−2 | 17073/62 | 18404/42 | 14029/24 | 18182/52 | 17200/80 | NaN/0
F10 | 1E−8 | 24647/26 | 27598/90 | 18273/26 | 29172/82 | 18320/84 | 2110/100
F11 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | 2073/100
F12 | 1E−8 | NaN/0 | 25465/50 | NaN/0 | 27317/64 | 19704/24 | 2471/100
F13 | 1E−8 | 16318/32 | 21523/58 | 11048/16 | 22951/64 | 14786/58 | 1470/100
F14 | 1E−8 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0 | NaN/0
F15 | 1E−8 | 658/100 | 1176/100 | 1274/100 | 1251/100 | 1206/100 | 799/100
F16 | 1E−8 | 657/98 | 1251/100 | 1294/100 | 1343/100 | 1308/100 | 813/100
F17 | 1E−8 | 995/100 | 2626/100 | 1487/100 | 2759/100 | 1921/100 | 973/100
F18 | −10.15 | 1752/34 | 6720/44 | 2007/52 | 4377/32 | 8113/64 | 1684/94
F19 | −10.40 | 2839/34 | 8585/48 | 1333/42 | 6724/50 | 3056/66 | 2215/90
F20 | −10.53 | 1190/36 | 8928/74 | 1115/40 | 6548/76 | 5441/80 | 2822/82
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is given as follows:

L = (M + 1) \times N + (N + 1) \times K    (13)
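As a quick check of (13), a one-line helper reproduces the dimensionality used later for the 1-5-1 network in this section (the function name is illustrative, not from the paper):

```python
def num_ann_variables(M, N, K):
    # (M+1)*N hidden weights plus biases, (N+1)*K output weights plus bias,
    # exactly as in equation (13)
    return (M + 1) * N + (N + 1) * K

print(num_ann_variables(1, 5, 1))  # 16, the dimension used in Section 6.1
```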
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is shown as follows:

MSE = \frac{1}{Q \times K} \sum_{i=1}^{Q} \sum_{j=1}^{K} \frac{1}{2} \left( d_{ij} - y_{ij} \right)^2    (14)

where Q is the size of the training data set, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with
one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = \sin(2x) \exp(-2x)    (15)

In this case, the activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained in the 50 independent runs for the compared methods. Figure 6 shows the predicted time series for training and test using different algorithms. It can be concluded that the approximation achieved by BBTLBO performs well.
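To make the objective concrete, the following sketch evaluates the MSE of (14) for a 1-5-1 network on samples of y = sin(2x)exp(−2x), decoding a 16-dimensional weight vector as in (13). This is an illustrative reconstruction under the stated assumptions (sigmoid hidden units, linear output), not the authors' code; the weight layout is a hypothetical choice.

```python
import math

def mse_1_5_1(w, xs):
    # w: 16 parameters -> hidden weights/biases (10) and output weights/bias (6)
    wh, bh = w[0:5], w[5:10]          # hidden layer: 5 weights, 5 biases
    wo, bo = w[10:15], w[15]          # output layer: 5 weights, 1 bias
    total = 0.0
    for x in xs:
        # sigmoid hidden units, linear output unit
        h = [1.0 / (1.0 + math.exp(-(wi * x + bi))) for wi, bi in zip(wh, bh)]
        y = sum(wi * hi for wi, hi in zip(wo, h)) + bo
        d = math.sin(2 * x) * math.exp(-2 * x)   # target function (15)
        total += 0.5 * (d - y) ** 2
    return total / len(xs)            # Q*K in (14), with K = 1 here
```

An optimizer such as BBTLBO would then search the 16-dimensional space of `w` to minimize this function.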
6.2. Tuning of PID Controller. A discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P \, e[k] + K_I \sum_{i=1}^{k} e[i] \, \Delta t + K_D \, \frac{e[k] - e[k-1]}{\Delta t}    (16)

where u[k] is the controlled output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:

f(t) = \int_0^\infty \left( \omega_1 |e(t)| + \omega_2 u^2(t) \right) dt + \omega_3 t_r    (17)
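Equation (16) is the standard position-form discrete PID law; a direct transcription might look as follows (an illustrative sketch with hypothetical names, not the paper's implementation):

```python
class DiscretePID:
    # Position-form PID of (16):
    # u[k] = Kp*e[k] + Ki*sum(e[1..k])*dt + Kd*(e[k]-e[k-1])/dt
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.e_sum = 0.0   # running sum of errors for the integral term
        self.e_prev = 0.0  # previous error for the derivative term

    def step(self, r, y):
        e = r - y
        self.e_sum += e
        u = (self.kp * e
             + self.ki * self.e_sum * self.dt
             + self.kd * (e - self.e_prev) / self.dt)
        self.e_prev = e
        return u
```

The tuning problem below then reduces to choosing (kp, ki, kd) so that the simulated closed-loop response minimizes the cost function.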
Table 5: Comparisons (mean ± std) of the solutions using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1 | 3.63e−025 ± 1.85e−024 | 7.65e−025 ± 3.34e−024 | 9.23e−018 ± 3.03e−017 | 1.01e−002 ± 5.48e−003 | 3.05e−189 ± 0.0 | 0.0 ± 0.0
F2 | 1.49e−023 ± 6.69e−023 | 2.75e−025 ± 1.08e−024 | 3.68e−017 ± 5.37e−017 | 1.08e−001 ± 5.05e−002 | 1.29e−185 ± 0.0 | 0.0 ± 0.0
F3 | 3.22e−002 ± 2.83e−002 | 2.08e−002 ± 1.18e−002 | 1.28e−002 ± 5.50e−003 | 1.86e−002 ± 4.39e−003 | 5.70e−004 ± 2.37e−004 | 2.27e−004 ± 1.26e−004
F4 | 2.11e+001 ± 6.74e+001 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5 | 1.22e+002 ± 1.37e+002 | 4.28e+001 ± 2.59e+001 | 1.17e+001 ± 9.30e+000 | 2.60e+003 ± 6.79e+002 | 9.45e−043 ± 6.47e−042 | 2.16e−115 ± 1.10e−114
F6 | 3.06e+001 ± 8.50e+000 | 2.45e+000 ± 2.60e+000 | 4.67e−001 ± 2.82e−001 | 2.66e+000 ± 5.58e−001 | 2.08e−078 ± 4.30e−078 | 3.63e−154 ± 1.34e−153
F7 | 8.28e−019 ± 3.49e−018 | 5.40e−016 ± 3.81e−015 | 1.34e−011 ± 1.27e−011 | 1.70e−002 ± 2.85e−003 | 3.84e−096 ± 5.53e−096 | 1.16e−188 ± 0.0
F8 | 2.16e+000 ± 4.16e+000 | 4.88e−001 ± 5.82e−001 | 9.60e−002 ± 6.99e−002 | 5.86e+001 ± 1.70e+001 | 7.09e−022 ± 4.99e−021 | 1.07e−056 ± 4.39e−056
F9 | 2.49e+001 ± 1.05e+001 | 2.61e+001 ± 1.07e+000 | 2.40e+001 ± 1.52e+000 | 2.65e+001 ± 3.54e−001 | 2.55e+001 ± 5.01e−001 | 2.83e+001 ± 3.41e−001
F10 | 5.05e−001 ± 7.06e−001 | 2.07e−001 ± 4.58e−001 | 1.94e−001 ± 4.56e−001 | 2.16e−002 ± 4.37e−003 | 3.62e−015 ± 5.02e−016 | 3.55e−015 ± 0.0
F11 | 2.03e+000 ± 1.94e+000 | 3.86e+000 ± 1.97e+000 | 4.26e+001 ± 1.06e+001 | 1.15e+002 ± 1.54e+001 | 1.55e+001 ± 8.09e+000 | 0.0 ± 0.0
F12 | 2.88e−002 ± 1.45e−001 | 6.50e−002 ± 1.87e−001 | 7.89e−001 ± 1.03e+000 | 1.36e+000 ± 7.41e−001 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e−002 ± 3.58e−002 | 1.18e−002 ± 1.75e−002 | 1.16e−002 ± 1.58e−002 | 1.06e−001 ± 9.93e−002 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+002 ± 1.42e+002 | 1.35e+002 ± 1.26e+002 | 4.49e+003 ± 8.25e+002 | 3.96e+003 ± 8.40e+002 | 4.82e+003 ± 6.86e+002 | 5.58e+003 ± 7.80e+002
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | −9.40e+000 ± 2.10e+000 | −9.25e+000 ± 2.30e+000 | −7.76e+000 ± 3.42e+000 | −9.79e+000 ± 1.44e+000 | −9.72e+000 ± 1.42e+000 | −9.85e+000 ± 1.22e+000
F19 | −9.85e+000 ± 1.90e+000 | −9.87e+000 ± 1.83e+000 | −9.24e+000 ± 2.70e+000 | −1.04e+001 ± 4.23e−009 | −9.22e+000 ± 2.41e+000 | −9.82e+000 ± 1.78e+000
F20 | −9.65e+000 ± 2.23e+000 | −1.01e+001 ± 1.59e+000 | −9.63e+000 ± 2.50e+000 | −1.05e+001 ± 1.01e−004 | −9.65e+000 ± 2.23e+000 | −9.41e+000 ± 2.43e+000
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | —
The optimal parameters and the corresponding perfor-mance values of the PID controllers are listed in Table 7 andthe corresponding performance curves and step responsescurves are given in Figures 7 and 8 It can be seen fromFigure 7 and Table 7 that the PID controller tuned byBBTLBO has the minimum cost function and CPU time
0 05 1 15 2 25 3 35 4 45 50
02
04
06
08
1
Time (s)
Step response curves using different methods
GAPSOBBTLBO
r in
you
t
Figure 8 Step response curves using different methods
Although PID controllers tuned by PSO have a smaller peaktime and rise time their maximum overshoots are muchlarger than the overshoot tuned by BBTLBO It concludes thatthe PID controller tuned by the BBTLBO could perform thebest control performance in the simulations
7 Conclusion
In this paper TLBO has been extended to BBTLBO whichuses the hybridization of the learning strategy in the stan-dard TLBO and Gaussian sampling learning to balance theexploration and the exploitation in teacher phase and uses amodified mutation operation so as to eliminate the duplicatelearners in learner phase The proposed BBTLBO algorithmis utilized to optimize 20 benchmark functions and tworeal-world optimization problems From the analysis andexperiments the BBTLBO algorithm significantly improvesthe performance of the original TLBO although it needs tospend more CPU time than the standard TLBO algorithmin each generation From the results compared with otheralgorithms on the 20 chosen test problems it can be observedthat the BBTLBO algorithm has good performance by usingneighborhood search more effectively to generate betterquality solutions although the BBTLBO algorithm does notalways have the best performance in all experiments cases ofthis paper It can be also observed that the BBTLBOalgorithm
16 The Scientific World Journal
gives the best performance on two real-world optimizationproblems compared with other algorithms in the paper
Further work includes research into neighborhood searchbased on different topological structures Moreover thealgorithm may be further applied to constrained dynamicand noisy single-objective and multiobjective optimizationdomain It is expected that BBTLBOwill be used tomore real-world optimization problems
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
Acknowledgments
This research was partially supported by the National NaturalScience Foundation of China (61100173 61100009 61272283and 61304082) This work is partially supported by theNatural Science Foundation of Anhui Province China (Grantno 1308085MF82) and the Doctoral Innovation Foundationof Xirsquoan University of Technology (207-002J1305)
References
[1] D E Goldberg Genetic Algorithms in Search Optimization andMachine Learning Addison-Wesley Reading Mass USA 1989
[2] L C Jiao and L Wang ldquoA novel genetic algorithm based onimmunityrdquo IEEE Transactions on SystemsMan and CyberneticsA Systems and Humans vol 30 no 5 pp 552ndash561 2000
[3] R Storn and K Price ldquoDifferential evolution a simple andefficient Heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997
[4] M Dorigo and T Stutzle Ant Colony Optimization MIT Press2004
[5] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995
[6] D Karaboga and B Basturk ldquoOn the performance of artificialbee colony (ABC) algorithmrdquo Applied Soft Computing Journalvol 8 no 1 pp 687ndash697 2008
[7] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008
[8] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization a novelmethod for constrainedmechanicaldesign optimization problemsrdquo CAD Computer Aided Designvol 43 no 3 pp 303ndash315 2011
[9] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization an optimization method for continuousnon-linear large scale problemsrdquo Information Sciences vol 183no 1 pp 1ndash15 2012
[10] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization algorithm for unconstrained and con-strained real-parameter optimization problemsrdquo EngineeringOptimization vol 44 no 12 pp 1447ndash1462 2011
[11] V Togan ldquoDesign of planar steel frames using teaching-learningbased optimizationrdquo Engineering Structures vol 34 pp 225ndash232 2012
[12] R V Rao and V Patel ldquoAn elitist teaching-learning-based opti-mization algorithm for solving complex constrained optimiza-tion problemsrdquo International Journal of Industrial EngineeringComputations vol 3 pp 535ndash560 2012
[13] S O Degertekin and M S Hayalioglu ldquoSizing truss structuresusing teaching-learning-based optimizationrdquo Computers andStructures vol 119 pp 177ndash188 2013
[14] R V Rao and V Patel ldquoAn improved teaching-learning-basedoptimization algorithm for solving unconstrained optimizationproblemsrdquo Scientia Iranica vol 20 no 3 pp 710ndash720 2013
[15] R V Rao and V Patel ldquoMulti-objective optimization of com-bined Brayton and inverse Brayton cycles using advancedoptimization algorithmsrdquo Engineering Optimization vol 44 no8 pp 965ndash983 2011
[16] T Niknam F Golestaneh and M S Sadeghi ldquoTheta-multi-objective teaching-learning-based optimization for dynamiceconomic emission dispatchrdquo IEEE Systems Journal vol 6 no2 pp 341ndash352 2012
[17] R V Rao and V Patel ldquoMulti-objective optimization of heatexchangers using a modified teaching-learning-based opti-mization algorithmrdquo Applied Mathematical Modelling vol 37no 3 pp 1147ndash1162 2013
[18] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[19] F van den Bergh and A P Engelbrecht ldquoA study of particleswarm optimization particle trajectoriesrdquo Information Sciencesvol 176 no 8 pp 937ndash971 2006
[20] J Kennedy ldquoBare bones particle swarmsrdquo in Proceedings of theSwarm Intelligence Symposium (SIS rsquo03) pp 80ndash87 2003
[21] M G H Omran A P Engelbrecht and A Salman ldquoBarebones differential evolutionrdquo European Journal of OperationalResearch vol 196 no 1 pp 128ndash139 2009
[22] H Wang S Rahnamayan H Sun and M G H OmranldquoGaussian bare-bones differential evolutionrdquo IEEE Transactionson Cybernetics vol 43 no 2 pp 634ndash647 2013
[23] X H Hu and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation pp 677ndash1681 2002
[24] M G Omran A P Engelbrecht and A Salman ldquoUsingthe ring neighborhood topology with self-adaptive differentialevolutionrdquo in Advances in Natural Computation pp 976ndash979Springer Berlin Germany 2006
[25] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[26] I Maruta T H Kim D Song and T Sugie ldquoSynthesis of fixed-structure robust controllers using a constrained particle swarmoptimizer with cyclic neighborhood topologyrdquo Expert Systemswith Applications vol 40 no 9 pp 3595ndash3605 2013
[27] J Kennedy and R Mendes ldquoPopulation structure and particleswarm performancerdquo in Proceedings of the International Con-ference on Evolutionary Computation pp 1671ndash1676 HonoluluHawaii USA 2002
[28] J Brest S Greiner B Boskovic M Mernik and V ZumerldquoSelf-adapting control parameters in differential evolution acomparative study on numerical benchmark problemsrdquo IEEETransactions on Evolutionary Computation vol 10 no 6 pp646ndash657 2006
The Scientific World Journal 17
[29] A K Qin V L Huang and P N Suganthan ldquoDifferential evo-lution algorithm with strategy adaptation for global numericaloptimizationrdquo IEEE Transactions on Evolutionary Computationvol 13 no 2 pp 398ndash417 2009
[30] R Mendes J Kennedy and J Neves ldquoThe fully informedparticle swarm simpler maybe betterrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 204ndash210 2004
[31] F Herrera and M Lozano ldquoGradual distributed real-codedgenetic algorithmsrdquo IEEE Transactions on Evolutionary Compu-tation vol 4 no 1 pp 43ndash62 2000
[32] J Liu Advanced PID Control and MATLAB Simulation Elec-tronic Industry Press 2003
[33] J Zhang J Zhuang H Du and S Wang ldquoSelf-organizinggenetic algorithm based tuning of PID controllersrdquo InformationSciences vol 179 no 7 pp 1007ndash1017 2009
[34] R Haber-Haber R Haber M Schmittdiel and R M delToro ldquoA classic solution for the control of a high-performancedrilling processrdquo International Journal of Machine Tools andManufacture vol 47 no 15 pp 2290ndash2297 2007
10 The Scientific World Journal
Figure 3: Comparison of the performance curves using different u (log10(mean fitness) versus FEs ×10^4, for u = 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, and 1.0): (a) F7 Schwefel 2.22; (b) F8 Zakharov; (c) F18 Shekel 5; (d) F11 Rastrigin.
5.5. Comparison of BBTLBO with DE Variants, PSO Variants, and Some TLBO Variants. In this section, we compare the performance of BBTLBO with other optimization algorithms, including jDE [28], SaDE [29], PSOcfLocal [27], PSOwFIPS [30], and TLBO [8, 9]. In our experiments, the maximal number of FEs (40,000 for all test functions) is used as the stopping criterion for all algorithms. The results are shown in Table 5 in terms of the mean optimum solution and the standard deviation of the solutions obtained over 50 independent runs by each algorithm on the 20 test functions, where "w/t/l" summarizes the win/tie/loss competition results between BBTLBO and the other algorithms. The best results among the algorithms are shown in boldface.
The comparisons in Table 5 show that all algorithms perform well on F15, F16, and F17. Although SaDE outperforms BBTLBO on F14, PSOcfLocal outperforms BBTLBO on F9, and PSOwFIPS outperforms BBTLBO on F19 and F20, BBTLBO offers the highest accuracy on functions F3, F4, F5, F7, F8, F10, F11, and F18. The "w/t/l" row shows that BBTLBO offers good accuracy for the majority of the test functions in this paper.
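A "w/t/l" summary of this kind is mechanical to compute. As an illustration only (the function name and the toy data are ours, not the paper's), counting BBTLBO's wins, ties, and losses against one competitor from per-function mean results on a minimization task:

```python
def wtl(bbtlbo_means, other_means, tol=0.0):
    """Count BBTLBO's wins/ties/losses against one competitor,
    one comparison per test function (minimization: smaller mean wins)."""
    w = t = l = 0
    for b, o in zip(bbtlbo_means, other_means):
        if abs(b - o) <= tol:
            t += 1
        elif b < o:
            w += 1
        else:
            l += 1
    return w, t, l

# toy example: BBTLBO wins on two functions, ties one, loses one
print(wtl([0.0, 0.0, 1.0, 5.0], [0.1, 0.0, 2.0, 4.0]))  # -> (2, 1, 1)
```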
Figure 4: Comparison of the performance curves using different algorithms (BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO; log10(mean fitness) versus FEs ×10^4): (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel 5; (d) F14 Schwefel.
Overall, Table 5 indicates that BBTLBO achieves good solution accuracy on all of the unimodal optimization problems and on most of the complex multimodal optimization problems.
6. Two Real-World Optimization Problems

In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation. The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5: the ANN maps the input x to the output y, which is compared with the desired output d, and the BBTLBO algorithm adjusts the network to reduce the difference.

Figure 5: BBTLBO-based ANN.

The inputs are connected to all the hidden units, which in turn are all connected to
Table 4: The mean number of FEs (MFEs) and success rate (SR, %) with acceptable solutions using different algorithms.

Fun | Accept. value | BBPSO MFEs/SR | BBExp MFEs/SR | BBDE MFEs/SR | GBDE MFEs/SR | MGBDE MFEs/SR | BBTLBO MFEs/SR
F1  | 1e-8   | 15922/100 | 17727/100 | 11042/100 | 19214/100 | 11440/100 | 1390/100
F2  | 1e-8   | 17515/54  | 19179/94  | 12243/100 | 20592/90  | 12634/100 | 1500/100
F3  | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0
F4  | 1e-8   | 11710/24  | 8120/84   | 3634/6    | 7343/40   | 4704/34   | 525/100
F5  | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | 4100/100
F6  | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | 2603/100
F7  | 1e-8   | 17540/6   | 21191/90  | 17314/100 | 22684/94  | 15322/98  | 2144/100
F8  | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | 9286/100
F9  | 1e-2   | 17073/62  | 18404/42  | 14029/24  | 18182/52  | 17200/80  | NaN/0
F10 | 1e-8   | 24647/26  | 27598/90  | 18273/26  | 29172/82  | 18320/84  | 2110/100
F11 | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | 2073/100
F12 | 1e-8   | NaN/0     | 25465/50  | NaN/0     | 27317/64  | 19704/24  | 2471/100
F13 | 1e-8   | 16318/32  | 21523/58  | 11048/16  | 22951/64  | 14786/58  | 1470/100
F14 | 1e-8   | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0     | NaN/0
F15 | 1e-8   | 658/100   | 1176/100  | 1274/100  | 1251/100  | 1206/100  | 799/100
F16 | 1e-8   | 657/98    | 1251/100  | 1294/100  | 1343/100  | 1308/100  | 813/100
F17 | 1e-8   | 995/100   | 2626/100  | 1487/100  | 2759/100  | 1921/100  | 973/100
F18 | -10.15 | 1752/34   | 6720/44   | 2007/52   | 4377/32   | 8113/64   | 1684/94
F19 | -10.40 | 2839/34   | 8585/48   | 1333/42   | 6724/50   | 3056/66   | 2215/90
F20 | -10.53 | 1190/36   | 8928/74   | 1115/40   | 6548/76   | 5441/80   | 2822/82
the outputs. The variables to be optimized consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is

L = (M + 1) × N + (N + 1) × K.   (13)
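As a quick check of (13), a small helper (the function name is ours, not the paper's) that counts the trainable variables:

```python
def num_variables(m_inputs: int, n_hidden: int, k_outputs: int) -> int:
    """L = (M+1)*N + (N+1)*K: each hidden unit has M weights and a bias,
    each output unit has N weights and a bias."""
    return (m_inputs + 1) * n_hidden + (n_hidden + 1) * k_outputs

# The 1-5-1 network used later in this section has 16 variables.
print(num_variables(1, 5, 1))  # -> 16
```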
For neural network training, the aim is to find the set of weights with the smallest error measure. Here, the objective function is the mean sum of squared errors (MSE) over all training patterns:

MSE = (1 / (Q × K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij − y_ij)²,   (14)

where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred
from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(−2x).   (15)

In this case, the activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed; the other parameters are the same as in the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained over the 50 independent runs. Figure 6 shows the predicted time series for training and testing using the different algorithms. It can be concluded that the approximation achieved by BBTLBO performs well.
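The training objective can be sketched as follows. This is our own illustrative Python, not the authors' code: a 1-5-1 network with a sigmoid hidden layer and linear output, and the MSE of (14) evaluated on 200 samples of y = sin(2x)exp(−2x); the input range, parameter layout, and zero initialization are our assumptions. A population-based optimizer such as BBTLBO would minimize `mse` over the 16-dimensional parameter vector:

```python
import numpy as np

def ann_forward(params, x, n_hidden=5):
    # unpack a 1-5-1 network: 5 input weights, 5 hidden biases,
    # 5 output weights, 1 output bias -> 16 parameters, as given by (13)
    w1 = params[:n_hidden]                    # input-to-hidden weights
    b1 = params[n_hidden:2 * n_hidden]        # hidden biases
    w2 = params[2 * n_hidden:3 * n_hidden]    # hidden-to-output weights
    b2 = params[3 * n_hidden]                 # output bias
    h = 1.0 / (1.0 + np.exp(-(np.outer(x, w1) + b1)))  # sigmoid hidden layer
    return h @ w2 + b2                        # linear output layer

def mse(params, x, d):
    # (14) with K = 1: mean over patterns of (1/2)(d - y)^2
    y = ann_forward(params, x)
    return np.mean(0.5 * (d - y) ** 2)

x = np.linspace(0.0, np.pi, 200)   # 200 training pairs (range assumed)
d = np.sin(2 * x) * np.exp(-2 * x)
params = np.zeros(16)              # candidate solution, 16 variables
print(mse(params, x, d))           # error of this untrained network
```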
6.2. Tuning of PID Controller. The discrete-time form of a PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] − e[k−1]) / Δt,   (16)
where u[k] is the control output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal is to minimize a cost function based on the integral absolute error (IAE), which is given as follows [32, 33]:
f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r.   (17)
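The control law (16) translates directly into code. The sketch below is a generic textbook discretization, our own illustration rather than the paper's implementation; the gain values are taken from the BBTLBO row of Table 7 purely as an example:

```python
def make_pid(kp, ki, kd, dt):
    """Discrete PID of (16): proportional term, rectangular-rule
    integral sum of e[i]*dt, and backward-difference derivative."""
    state = {"integral": 0.0, "e_prev": 0.0}

    def step(r, y):
        e = r - y                       # e[k] = r[k] - y[k]
        state["integral"] += e * dt     # running sum of e[i]*dt
        deriv = (e - state["e_prev"]) / dt
        state["e_prev"] = e
        return kp * e + ki * state["integral"] + kd * deriv

    return step

pid = make_pid(kp=0.11605, ki=0.01661, kd=0.25803, dt=0.05)
u = pid(r=1.0, y=0.0)  # first control action for a unit step reference
```

With Δt = 0.05 s as in the paper's simulation, the returned control value would additionally be clamped to [−10, 10] before being applied to the plant.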
Table 5: Comparisons (mean ± std of the solutions) using different algorithms.

Fun | jDE | SaDE | PSOcfLocal | PSOwFIPS | TLBO | BBTLBO
F1  | 3.63e-25 ± 1.85e-24 | 7.65e-25 ± 3.34e-24 | 9.23e-18 ± 3.03e-17 | 1.01e-02 ± 5.48e-03 | 3.05e-189 ± 0.0 | 0.0 ± 0.0
F2  | 1.49e-23 ± 6.69e-23 | 2.75e-25 ± 1.08e-24 | 3.68e-17 ± 5.37e-17 | 1.08e-01 ± 5.05e-02 | 1.29e-185 ± 0.0 | 0.0 ± 0.0
F3  | 3.22e-02 ± 2.83e-02 | 2.08e-02 ± 1.18e-02 | 1.28e-02 ± 5.50e-03 | 1.86e-02 ± 4.39e-03 | 5.70e-04 ± 2.37e-04 | 2.27e-04 ± 1.26e-04
F4  | 2.11e+01 ± 6.74e+01 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F5  | 1.22e+02 ± 1.37e+02 | 4.28e+01 ± 2.59e+01 | 1.17e+01 ± 9.30e+00 | 2.60e+03 ± 6.79e+02 | 9.45e-43 ± 6.47e-42 | 2.16e-115 ± 1.10e-114
F6  | 3.06e+01 ± 8.50e+00 | 2.45e+00 ± 2.60e+00 | 4.67e-01 ± 2.82e-01 | 2.66e+00 ± 5.58e-01 | 2.08e-78 ± 4.30e-78 | 3.63e-154 ± 1.34e-153
F7  | 8.28e-19 ± 3.49e-18 | 5.40e-16 ± 3.81e-15 | 1.34e-11 ± 1.27e-11 | 1.70e-02 ± 2.85e-03 | 3.84e-96 ± 5.53e-96 | 1.16e-188 ± 0.0
F8  | 2.16e+00 ± 4.16e+00 | 4.88e-01 ± 5.82e-01 | 9.60e-02 ± 6.99e-02 | 5.86e+01 ± 1.70e+01 | 7.09e-22 ± 4.99e-21 | 1.07e-56 ± 4.39e-56
F9  | 2.49e+01 ± 1.05e+01 | 2.61e+01 ± 1.07e+00 | 2.40e+01 ± 1.52e+00 | 2.65e+01 ± 3.54e-01 | 2.55e+01 ± 5.01e-01 | 2.83e+01 ± 3.41e-01
F10 | 5.05e-01 ± 7.06e-01 | 2.07e-01 ± 4.58e-01 | 1.94e-01 ± 4.56e-01 | 2.16e-02 ± 4.37e-03 | 3.62e-15 ± 5.02e-16 | 3.55e-15 ± 0.0
F11 | 2.03e+00 ± 1.94e+00 | 3.86e+00 ± 1.97e+00 | 4.26e+01 ± 1.06e+01 | 1.15e+02 ± 1.54e+01 | 1.55e+01 ± 8.09e+00 | 0.0 ± 0.0
F12 | 2.88e-02 ± 1.45e-01 | 6.50e-02 ± 1.87e-01 | 7.89e-01 ± 1.03e+00 | 1.36e+00 ± 7.41e-01 | 0.0 ± 0.0 | 0.0 ± 0.0
F13 | 1.87e-02 ± 3.58e-02 | 1.18e-02 ± 1.75e-02 | 1.16e-02 ± 1.58e-02 | 1.06e-01 ± 9.93e-02 | 0.0 ± 0.0 | 0.0 ± 0.0
F14 | 1.93e+02 ± 1.42e+02 | 1.35e+02 ± 1.26e+02 | 4.49e+03 ± 8.25e+02 | 3.96e+03 ± 8.40e+02 | 4.82e+03 ± 6.86e+02 | 5.58e+03 ± 7.80e+02
F15 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F16 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F17 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0 | 0.0 ± 0.0
F18 | -9.40e+00 ± 2.10e+00 | -9.25e+00 ± 2.30e+00 | -7.76e+00 ± 3.42e+00 | -9.79e+00 ± 1.44e+00 | -9.72e+00 ± 1.42e+00 | -9.85e+00 ± 1.22e+00
F19 | -9.85e+00 ± 1.90e+00 | -9.87e+00 ± 1.83e+00 | -9.24e+00 ± 2.70e+00 | -1.04e+01 ± 4.23e-09 | -9.22e+00 ± 2.41e+00 | -9.82e+00 ± 1.78e+00
F20 | -9.65e+00 ± 2.23e+00 | -1.01e+01 ± 1.59e+00 | -9.63e+00 ± 2.50e+00 | -1.05e+01 ± 1.01e-04 | -9.65e+00 ± 2.23e+00 | -9.41e+00 ± 2.43e+00
w/t/l | 13/3/4 | 12/4/4 | 13/4/3 | 12/4/4 | 11/6/3 | —
Figure 6: Comparison of the performance curves using different algorithms: (a) convergence curves (best value versus generation, TLBO and BBTLBO); (b) approximation curves (actual output and the TLBO/BBTLBO outputs y(t) versus sample number); (c) error curves (TLBO and BBTLBO).
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (mean ± std) | Testing error (mean ± std)
TLBO      | 9.85e-04 ± 9.26e-04         | 9.43e-04 ± 9.18e-04
BBTLBO    | 3.45e-04 ± 2.02e-04         | 2.76e-04 ± 1.82e-04
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty is adopted in the cost function; that is, once overshoot occurs, its value is added to the cost, and the cost function becomes [32, 33]
f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t) + ω_4 |dy(t)|) dt + ω_3 t_r,  if dy(t) < 0,
f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r,  otherwise.   (18)
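Numerically, (17)/(18) are evaluated over a discretized horizon. A minimal sketch (our own code, not the authors'; the weight values below are the paper's settings used as defaults) of the overshoot-penalized cost:

```python
def pid_cost(e, u, y, dt, t_rise, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    """Discretized cost of (17)/(18): weighted IAE plus control-effort and
    rise-time terms, with an extra w4*|dy| penalty at samples where
    dy(t) = y(t) - y(t-1) < 0 (the paper's overshoot condition)."""
    cost = 0.0
    for k in range(1, len(y)):
        dy = y[k] - y[k - 1]
        cost += (w1 * abs(e[k]) + w2 * u[k] ** 2) * dt
        if dy < 0:                      # penalty branch of (18)
            cost += w4 * abs(dy) * dt
    return cost + w3 * t_rise
```

In a tuning loop, each candidate (K_P, K_I, K_D) would be simulated against the plant and scored with this cost, and the optimizer would keep the lowest-cost candidate.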
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P     | K_I     | K_D     | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA        | 0.11257 | 0.02710 | 0.28792 | 2.90585       | 1.65          | 1.05          | 16.34555      | 70.59
PSO       | 0.11772 | 0.01756 | 0.27737 | 1.04808       | 1.65          | 0.65          | 11.60773      | 69.10
BBTLBO    | 0.11605 | 0.01661 | 0.25803 | 0.34261       | 1.80          | 0.70          | 11.34300      | 70.45
Figure 7: Performance curves (best value versus generation) using different methods (GA, PSO, and BBTLBO).
where ω_4 is a coefficient with ω_4 ≫ ω_1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled object.

In our simulation, the transfer function of the plant examined is given as follows [34]:

G(s) = 1958 / (s³ + 1789 s² + 1033 s + 1908).   (19)
The system sampling time is Δt = 0.05 s, and the control value u is limited to the range [−10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2, and ω_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set to 0.90 and the mutation rate to 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO attains the minimum cost function value, at a CPU time comparable to that of GA and PSO.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controllers tuned by GA and PSO have smaller peak and rise times, their maximum overshoots are much larger than the overshoot obtained with BBTLBO. This indicates that the PID controller tuned by BBTLBO delivers the best overall control performance in the simulations.
7. Conclusion

In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and which uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is applied to 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves on the performance of the original TLBO, although it spends more CPU time per generation than the standard TLBO. Compared with the other algorithms on the 20 chosen test problems, BBTLBO achieves good performance by using neighborhood search to generate better-quality solutions, although it does not have the best performance in every experimental case of this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms in the paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082), the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multiobjective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M G H Omran A P Engelbrecht and A Salman ldquoBarebones differential evolutionrdquo European Journal of OperationalResearch vol 196 no 1 pp 128ndash139 2009
[22] H Wang S Rahnamayan H Sun and M G H OmranldquoGaussian bare-bones differential evolutionrdquo IEEE Transactionson Cybernetics vol 43 no 2 pp 634ndash647 2013
[23] X H Hu and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation pp 677ndash1681 2002
[24] M G Omran A P Engelbrecht and A Salman ldquoUsingthe ring neighborhood topology with self-adaptive differentialevolutionrdquo in Advances in Natural Computation pp 976ndash979Springer Berlin Germany 2006
[25] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[26] I Maruta T H Kim D Song and T Sugie ldquoSynthesis of fixed-structure robust controllers using a constrained particle swarmoptimizer with cyclic neighborhood topologyrdquo Expert Systemswith Applications vol 40 no 9 pp 3595ndash3605 2013
[27] J Kennedy and R Mendes ldquoPopulation structure and particleswarm performancerdquo in Proceedings of the International Con-ference on Evolutionary Computation pp 1671ndash1676 HonoluluHawaii USA 2002
[28] J Brest S Greiner B Boskovic M Mernik and V ZumerldquoSelf-adapting control parameters in differential evolution acomparative study on numerical benchmark problemsrdquo IEEETransactions on Evolutionary Computation vol 10 no 6 pp646ndash657 2006
The Scientific World Journal 17
[29] A K Qin V L Huang and P N Suganthan ldquoDifferential evo-lution algorithm with strategy adaptation for global numericaloptimizationrdquo IEEE Transactions on Evolutionary Computationvol 13 no 2 pp 398ndash417 2009
[30] R Mendes J Kennedy and J Neves ldquoThe fully informedparticle swarm simpler maybe betterrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 204ndash210 2004
[31] F Herrera and M Lozano ldquoGradual distributed real-codedgenetic algorithmsrdquo IEEE Transactions on Evolutionary Compu-tation vol 4 no 1 pp 43ndash62 2000
[32] J Liu Advanced PID Control and MATLAB Simulation Elec-tronic Industry Press 2003
[33] J Zhang J Zhuang H Du and S Wang ldquoSelf-organizinggenetic algorithm based tuning of PID controllersrdquo InformationSciences vol 179 no 7 pp 1007ndash1017 2009
[34] R Haber-Haber R Haber M Schmittdiel and R M delToro ldquoA classic solution for the control of a high-performancedrilling processrdquo International Journal of Machine Tools andManufacture vol 47 no 15 pp 2290ndash2297 2007
The Scientific World Journal 11
Figure 4: Comparison of the performance curves (log10 of mean fitness versus FEs) of BBPSO, BBExp, BBDE, GBDE, MGBDE, and BBTLBO: (a) F3 Quadric; (b) F9 Rosenbrock; (c) F18 Shekel5; (d) F14 Schwefel (mean fitness).
Table 5 shows that BBTLBO achieves good solution accuracy on all of the unimodal optimization problems and on most of the complex multimodal optimization problems.
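The bare-bones family compared here replaces hand-tuned update rules with Gaussian sampling whose mean and spread come from the population itself. As an illustrative sketch only (this is the classic BBPSO-style sample of [20], not the exact BBTLBO update; the attractor names are placeholders):

```python
import random

def bare_bones_sample(attractor_a, attractor_b, rng=random):
    # Each dimension d is drawn from N(mu_d, sigma_d), where mu_d is the
    # midpoint of the two attractors (e.g., personal and global best in
    # BBPSO) and sigma_d is their absolute distance: the exploration range
    # shrinks automatically as the two attractors converge.
    return [rng.gauss((a + b) / 2.0, abs(a - b))
            for a, b in zip(attractor_a, attractor_b)]

# When both attractors coincide, the spread is zero and the sample
# collapses exactly onto them.
x_new = bare_bones_sample([2.0, -1.0], [2.0, -1.0])
```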
6 Two Real-World Optimization Problems
In this section, to show the effectiveness of the proposed method, the BBTLBO algorithm is applied to estimate the parameters of two real-world problems.
6.1. Nonlinear Function Approximation.

Figure 5: BBTLBO-based ANN (the input x feeds the ANN, whose output y is compared with the desired output d; the error drives the BBTLBO algorithm).

The artificial neural network trained by our BBTLBO algorithm is a three-layer feed-forward network, and the basic structure of the proposed scheme is depicted in Figure 5. The inputs are connected to all the hidden units, which in turn are all connected to all
Table 4: The mean number of FEs (MFEs) and the success rate (SR, %) with acceptable solutions using different algorithms (entries are MFEs/SR).

Fun   Accept. value   BBPSO        BBExp        BBDE         GBDE         MGBDE        BBTLBO
F1    1E-8            15922/100    17727/100    11042/100    19214/100    11440/100    1390/100
F2    1E-8            17515/54     19179/94     12243/100    20592/90     12634/100    1500/100
F3    1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        NaN/0
F4    1E-8            11710/24     8120/84      3634/6       7343/40      4704/34      525/100
F5    1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        4100/100
F6    1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        2603/100
F7    1E-8            17540/6      21191/90     17314/100    22684/94     15322/98     2144/100
F8    1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        9286/100
F9    1E-2            17073/62     18404/42     14029/24     18182/52     17200/80     NaN/0
F10   1E-8            24647/26     27598/90     18273/26     29172/82     18320/84     2110/100
F11   1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        2073/100
F12   1E-8            NaN/0        25465/50     NaN/0        27317/64     19704/24     2471/100
F13   1E-8            16318/32     21523/58     11048/16     22951/64     14786/58     1470/100
F14   1E-8            NaN/0        NaN/0        NaN/0        NaN/0        NaN/0        NaN/0
F15   1E-8            658/100      1176/100     1274/100     1251/100     1206/100     799/100
F16   1E-8            657/98       1251/100     1294/100     1343/100     1308/100     813/100
F17   1E-8            995/100      2626/100     1487/100     2759/100     1921/100     973/100
F18   -10.15          1752/34      6720/44      2007/52      4377/32      8113/64      1684/94
F19   -10.40          2839/34      8585/48      1333/42      6724/50      3056/66      2215/90
F20   -10.53          1190/36      8928/74      1115/40      6548/76      5441/80      2822/82
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables is given as follows:

L = (M + 1) × N + (N + 1) × K. (13)
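Equation (13) can be checked with a one-line helper (the function name is ours):

```python
def num_variables(m_inputs, n_hidden, k_outputs):
    # Equation (13): each of the N hidden units has M weights plus one bias,
    # and each of the K output units has N weights plus one bias.
    return (m_inputs + 1) * n_hidden + (n_hidden + 1) * k_outputs

# The 1-5-1 network of Section 6.1 has (1+1)*5 + (5+1)*1 = 16 variables.
dim = num_variables(1, 5, 1)
```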
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is shown as follows:

MSE = (1 / (Q × K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2)(d_ij − y_ij)², (14)
where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.

In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:

y = sin(2x) exp(−2x). (15)

In this case, the activation function used in the hidden layer is the sigmoid function, and the activation function used in the output layer is linear. The number (dimension) of the variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results are shown in Table 6 in terms of the mean MSE and the standard deviation obtained over the 50 independent runs for the three methods. Figure 6 shows the predicted time series for training and testing using different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
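To make the training objective concrete, the sketch below evaluates the MSE of equation (14) for the 1-5-1 network; the flattening order of the weight vector w is our own assumption (the paper does not specify one), and an optimizer such as BBTLBO would then minimize mse over the 16-dimensional w:

```python
import math

def forward(x, w, n_hidden=5):
    # Hidden layer: sigmoid units; output layer: linear. Weights are read
    # from the flat vector w as [weight_j, bias_j] per hidden unit, followed
    # by the output weights and the output bias (an assumed layout).
    hidden = [1.0 / (1.0 + math.exp(-(w[2 * j] * x + w[2 * j + 1])))
              for j in range(n_hidden)]
    off = 2 * n_hidden
    return sum(w[off + j] * h for j, h in enumerate(hidden)) + w[off + n_hidden]

def mse(w, samples, k_outputs=1):
    # Equation (14): (1 / (Q * K)) * sum_i sum_j (1/2) * (d_ij - y_ij)^2.
    return sum(0.5 * (d - forward(x, w)) ** 2
               for x, d in samples) / (len(samples) * k_outputs)

# 200 training pairs sampled from the target curve y = sin(2x) exp(-2x) of
# equation (15); the sampling interval is our own choice for illustration.
samples = [(x, math.sin(2 * x) * math.exp(-2 * x))
           for x in (i / 50.0 for i in range(200))]
```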
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:

u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] − e[k−1]) / Δt, (16)

where u[k] is the controlled output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.

For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:

f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r, (17)
Table 5: Comparisons (mean ± std of the solutions) using different algorithms.

Fun    jDE                    SaDE                   PSOcfLocal             PSOwFIPS               TLBO                   BBTLBO
F1     3.63e-25 ± 1.85e-24    7.65e-25 ± 3.34e-24    9.23e-18 ± 3.03e-17    1.01e-02 ± 5.48e-03    3.05e-189 ± 0.0        0.0 ± 0.0
F2     1.49e-23 ± 6.69e-23    2.75e-25 ± 1.08e-24    3.68e-17 ± 5.37e-17    1.08e-01 ± 5.05e-02    1.29e-185 ± 0.0        0.0 ± 0.0
F3     3.22e-02 ± 2.83e-02    2.08e-02 ± 1.18e-02    1.28e-02 ± 5.50e-03    1.86e-02 ± 4.39e-03    5.70e-04 ± 2.37e-04    2.27e-04 ± 1.26e-04
F4     2.11e+01 ± 6.74e+01    0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0
F5     1.22e+02 ± 1.37e+02    4.28e+01 ± 2.59e+01    1.17e+01 ± 9.30e+00    2.60e+03 ± 6.79e+02    9.45e-43 ± 6.47e-42    2.16e-115 ± 1.10e-114
F6     3.06e+01 ± 8.50e+00    2.45e+00 ± 2.60e+00    4.67e-01 ± 2.82e-01    2.66e+00 ± 5.58e-01    2.08e-78 ± 4.30e-78    3.63e-154 ± 1.34e-153
F7     8.28e-19 ± 3.49e-18    5.40e-16 ± 3.81e-15    1.34e-11 ± 1.27e-11    1.70e-02 ± 2.85e-03    3.84e-96 ± 5.53e-96    1.16e-188 ± 0.0
F8     2.16e+00 ± 4.16e+00    4.88e-01 ± 5.82e-01    9.60e-02 ± 6.99e-02    5.86e+01 ± 1.70e+01    7.09e-22 ± 4.99e-21    1.07e-56 ± 4.39e-56
F9     2.49e+01 ± 1.05e+01    2.61e+01 ± 1.07e+00    2.40e+01 ± 1.52e+00    2.65e+01 ± 3.54e-01    2.55e+01 ± 5.01e-01    2.83e+01 ± 3.41e-01
F10    5.05e-01 ± 7.06e-01    2.07e-01 ± 4.58e-01    1.94e-01 ± 4.56e-01    2.16e-02 ± 4.37e-03    3.62e-15 ± 5.02e-16    3.55e-15 ± 0.0
F11    2.03e+00 ± 1.94e+00    3.86e+00 ± 1.97e+00    4.26e+01 ± 1.06e+01    1.15e+02 ± 1.54e+01    1.55e+01 ± 8.09e+00    0.0 ± 0.0
F12    2.88e-02 ± 1.45e-01    6.50e-02 ± 1.87e-01    7.89e-01 ± 1.03e+00    1.36e+00 ± 7.41e-01    0.0 ± 0.0              0.0 ± 0.0
F13    1.87e-02 ± 3.58e-02    1.18e-02 ± 1.75e-02    1.16e-02 ± 1.58e-02    1.06e-01 ± 9.93e-02    0.0 ± 0.0              0.0 ± 0.0
F14    1.93e+02 ± 1.42e+02    1.35e+02 ± 1.26e+02    4.49e+03 ± 8.25e+02    3.96e+03 ± 8.40e+02    4.82e+03 ± 6.86e+02    5.58e+03 ± 7.80e+02
F15    0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0
F16    0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0
F17    0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0              0.0 ± 0.0
F18    -9.40e+00 ± 2.10e+00   -9.25e+00 ± 2.30e+00   -7.76e+00 ± 3.42e+00   -9.79e+00 ± 1.44e+00   -9.72e+00 ± 1.42e+00   -9.85e+00 ± 1.22e+00
F19    -9.85e+00 ± 1.90e+00   -9.87e+00 ± 1.83e+00   -9.24e+00 ± 2.70e+00   -1.04e+01 ± 4.23e-09   -9.22e+00 ± 2.41e+00   -9.82e+00 ± 1.78e+00
F20    -9.65e+00 ± 2.23e+00   -1.01e+01 ± 1.59e+00   -9.63e+00 ± 2.50e+00   -1.05e+01 ± 1.01e-04   -9.65e+00 ± 2.23e+00   -9.41e+00 ± 2.43e+00
w/t/l  13/3/4                 12/4/4                 13/4/3                 12/4/4                 11/6/3                 -
Figure 6: Comparison of the performance curves using different methods (TLBO and BBTLBO): (a) convergence curves; (b) approximation curves against the actual function; (c) error curves.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm   Training error (mean ± std)   Testing error (mean ± std)
TLBO        9.85e-04 ± 9.26e-04           9.43e-04 ± 9.18e-04
BBTLBO      3.45e-04 ± 2.02e-04           2.76e-04 ± 1.82e-04
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function; that is, once overshoot occurs, the value of the overshoot is added to the cost function, and the cost function is given as follows [32, 33]:
if dy(t) < 0:
  f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t) + ω_4 |dy(t)|) dt + ω_3 t_r,
else:
  f(t) = ∫_0^∞ (ω_1 |e(t)| + ω_2 u²(t)) dt + ω_3 t_r, (18)
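Discretizing the penalized cost of equations (17)-(18) over sampled signals gives a sketch like the following; the rectangular-rule integration and the array-based interface are our own simplifications, and the default weights follow the coefficient settings quoted in this section:

```python
def pid_cost(e, u, y, dt, t_rise, w1=0.999, w2=0.001, w3=2.0, w4=100.0):
    # Equations (17)-(18): IAE plus a control-effort term and a rise-time
    # term; whenever the output decreases (dy(t) < 0, i.e., after an
    # overshoot), an extra penalty weighted by w4 >> w1 is added.
    cost = 0.0
    for k in range(len(e)):
        cost += (w1 * abs(e[k]) + w2 * u[k] ** 2) * dt
        if k > 0:
            dy = y[k] - y[k - 1]
            if dy < 0:
                cost += w4 * abs(dy) * dt
    return cost + w3 * t_rise
```

An optimizer evaluates this cost for each candidate gain triple (K_P, K_I, K_D) by simulating the closed loop and passing the recorded e, u, and y arrays in.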
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm   K_P       K_I       K_D       Overshoot (%)   Peak time (s)   Rise time (s)   Cost function   CPU time (s)
GA          0.11257   0.02710   0.28792   2.90585         1.65000         1.05000         16.34555        7.05900
PSO         0.11772   0.01756   0.27737   1.04808         1.65000         0.65000         11.60773        6.91000
BBTLBO      0.11605   0.01661   0.25803   0.34261         1.80000         0.70000         11.34300        7.04500
Figure 7: Performance curves (best value versus generation) using different methods (GA, PSO, and BBTLBO).
where ω_4 is a coefficient with ω_4 ≫ ω_1, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.

In our simulation, the formula for the plant examined is given as follows [34]:
G(s) = 1958 / (s³ + 1789 s² + 1033 s + 1908). (19)
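To simulate equation (19), the transfer function can be realized in controllable canonical (companion) form; the sketch below is a generic realization for a monic third-order denominator, and the realization choice is ours:

```python
def controllable_canonical(b0, a):
    # Realization of G(s) = b0 / (s^3 + a[0] s^2 + a[1] s + a[2]):
    # x' = A x + b u,  y = c x, with A the companion matrix of the
    # denominator polynomial.
    A = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [-a[2], -a[1], -a[0]]]
    b = [0.0, 0.0, b0]
    c = [1.0, 0.0, 0.0]
    return A, b, c

# Coefficients as listed in equation (19).
A, b, c = controllable_canonical(1958.0, [1789.0, 1033.0, 1908.0])
```

Integrating this state-space model at the sampling period Δt then yields the y(t) and e(t) signals needed by the cost function of equation (18).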
The system sampling time is Δt = 0.05 seconds, and the control value u is limited to the range [−10, 10]. The other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω_1 = 0.999, ω_2 = 0.001, ω_3 = 2, and ω_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum number of iterations is 50 for each. In addition, the crossover rate is set to 0.90 and the mutation rate to 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function, with a CPU time comparable to those of GA and PSO.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controllers tuned by PSO have a smaller peak time and rise time, their maximum overshoots are much larger than the overshoot of the controller tuned by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO achieves the best control performance in these simulations.
7 Conclusion
In this paper, TLBO has been extended to BBTLBO, which uses a hybridization of the learning strategy in the standard TLBO and Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is utilized to optimize 20 benchmark functions and two real-world optimization problems. From the analysis and experiments, BBTLBO significantly improves the performance of the original TLBO, although it needs to spend more CPU time than the standard TLBO in each generation. From the results compared with other algorithms on the 20 chosen test problems, it can be observed that BBTLBO achieves good performance by using neighborhood search more effectively to generate better-quality solutions, although it does not always have the best performance in all experimental cases of this paper. It can also be observed that BBTLBO
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.

Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082), the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552-561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687-697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303-315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1-15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447-1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225-232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535-560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177-188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710-720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965-983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341-352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147-1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm - explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58-73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937-971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80-87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128-139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634-647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677-1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976-979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150-169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595-3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671-1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646-657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398-417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204-210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43-62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007-1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290-2297, 2007.
12 The Scientific World Journal
Table 4: The mean number of function evaluations (MFEs) and the success rate (SR, %) with acceptable solutions using different algorithms (cells give MFEs/SR; NaN means no run reached the acceptance threshold t).

Fun | t value | BBPSO      | BBExp      | BBDE       | GBDE       | MGBDE      | BBTLBO
F1  | 1e−8    | 15922/100  | 17727/100  | 11042/100  | 19214/100  | 11440/100  | 1390/100
F2  | 1e−8    | 17515/54   | 19179/94   | 12243/100  | 20592/90   | 12634/100  | 1500/100
F3  | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0
F4  | 1e−8    | 11710/24   | 8120/84    | 3634/6     | 7343/40    | 4704/34    | 525/100
F5  | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | 4100/100
F6  | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | 2603/100
F7  | 1e−8    | 17540/6    | 21191/90   | 17314/100  | 22684/94   | 15322/98   | 2144/100
F8  | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | 9286/100
F9  | 1e−2    | 17073/62   | 18404/42   | 14029/24   | 18182/52   | 17200/80   | NaN/0
F10 | 1e−8    | 24647/26   | 27598/90   | 18273/26   | 29172/82   | 18320/84   | 2110/100
F11 | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | 2073/100
F12 | 1e−8    | NaN/0      | 25465/50   | NaN/0      | 27317/64   | 19704/24   | 2471/100
F13 | 1e−8    | 16318/32   | 21523/58   | 11048/16   | 22951/64   | 14786/58   | 1470/100
F14 | 1e−8    | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0      | NaN/0
F15 | 1e−8    | 658/100    | 1176/100   | 1274/100   | 1251/100   | 1206/100   | 799/100
F16 | 1e−8    | 657/98     | 1251/100   | 1294/100   | 1343/100   | 1308/100   | 813/100
F17 | 1e−8    | 995/100    | 2626/100   | 1487/100   | 2759/100   | 1921/100   | 973/100
F18 | −10.15  | 1752/34    | 6720/44    | 2007/52    | 4377/32    | 8113/64    | 1684/94
F19 | −10.40  | 2839/34    | 8585/48    | 1333/42    | 6724/50    | 3056/66    | 2215/90
F20 | −10.53  | 1190/36    | 8928/74    | 1115/40    | 6548/76    | 5441/80    | 2822/82
the outputs. The variables consist of the neural network weights and biases. For a three-layer feed-forward neural network architecture with M input units, N hidden units, and K output units, the number of variables L is given as follows:
L = (M + 1) × N + (N + 1) × K   (13)
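Eq. (13) can be checked directly in code; a minimal sketch (the function name is illustrative):

```python
def num_variables(M, N, K):
    """Number of weights and biases of a three-layer M-N-K network, Eq. (13):
    (M+1)*N input-to-hidden parameters plus (N+1)*K hidden-to-output parameters
    (the +1 terms account for the bias of each unit)."""
    return (M + 1) * N + (N + 1) * K

# The 1-5-1 architecture used later in this section:
print(num_variables(1, 5, 1))  # 16
```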
For neural network training, the aim is to find the set of weights with the smallest error measure. Here the objective function is the mean sum of squared errors (MSE) over all training patterns, which is shown as follows:
MSE = (1 / (Q · K)) Σ_{i=1}^{Q} Σ_{j=1}^{K} (1/2) (d_ij − y_ij)²   (14)
where Q is the number of training patterns, K is the number of output units, d_ij is the desired output, and y_ij is the output inferred from the neural network.
In this example, a three-layer feed-forward ANN with one input unit, five hidden units, and one output unit is constructed to model the curve of a nonlinear function, which is described by the following equation [31]:
y = sin(2x) exp(−2x)   (15)
In this case, the activation function used in the hidden layer is the sigmoid function and the activation function used in the output layer is linear. The number (dimension) of variables is 16 for the BBTLBO-based ANN. In order to train the ANN, 200 pairs of data are chosen from the real model. For each algorithm, 50 runs are performed. The other parameters are the same as those of the previous investigations. The results, in terms of the mean MSE and the standard deviation obtained over the 50 independent runs for each method, are shown in Table 6. Figure 6 shows the predicted time series for training and test using different algorithms. It can be concluded that the approximation achieved by BBTLBO has good performance.
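The fitness evaluation for this example can be sketched as follows. The layout of the 16-dimensional parameter vector and the function names are assumptions for illustration; the sigmoid hidden layer, linear output unit, and the MSE of Eq. (14) against the target of Eq. (15) follow the description above:

```python
import numpy as np

def forward_151(theta, x):
    """Evaluate a 1-5-1 network: sigmoid hidden units, linear output unit.
    theta holds (1+1)*5 + (5+1)*1 = 16 weights and biases (layout assumed)."""
    w1, b1 = theta[:5], theta[5:10]     # input-to-hidden weights and biases
    w2, b2 = theta[10:15], theta[15]    # hidden-to-output weights and bias
    h = 1.0 / (1.0 + np.exp(-(w1 * x + b1)))   # sigmoid hidden activations
    return float(np.dot(w2, h) + b2)           # linear output

def fitness(theta, xs):
    """MSE of Eq. (14) with Q = len(xs), K = 1, target y = sin(2x)exp(-2x)."""
    targets = np.sin(2 * xs) * np.exp(-2 * xs)
    preds = np.array([forward_151(theta, x) for x in xs])
    return np.mean(0.5 * (targets - preds) ** 2)
```

A learner in BBTLBO would then simply be a 16-dimensional real vector scored by `fitness`.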
6.2. Tuning of PID Controller. The continuous form of a discrete-type PID controller with a small sampling period Δt is described as follows [32]:
u[k] = K_P · e[k] + K_I · Σ_{i=1}^{k} e[i] · Δt + K_D · (e[k] − e[k−1]) / Δt   (16)
where u[k] is the controlled output, e[k] = r[k] − y[k] is the error signal, r[k] and y[k] are the reference signal and the system output, and K_P, K_I, and K_D represent the proportional, integral, and derivative gains, respectively.
For an unknown plant, the goal of this problem is to minimize the integral absolute error (IAE), which is given as follows [32, 33]:
f(t) = ∫₀^∞ (ω₁ |e(t)| + ω₂ u²(t)) dt + ω₃ t_r   (17)
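A minimal stateful implementation of the discrete control law of Eq. (16) (class and attribute names are illustrative):

```python
class DiscretePID:
    """Discrete PID of Eq. (16):
    u[k] = Kp*e[k] + Ki*sum_i(e[i]*dt) + Kd*(e[k] - e[k-1])/dt."""

    def __init__(self, Kp, Ki, Kd, dt):
        self.Kp, self.Ki, self.Kd, self.dt = Kp, Ki, Kd, dt
        self.integral = 0.0   # running sum of e[i]*dt
        self.prev_e = 0.0     # e[k-1] for the difference-quotient derivative

    def step(self, r, y):
        e = r - y                               # error signal e[k] = r[k] - y[k]
        self.integral += e * self.dt
        deriv = (e - self.prev_e) / self.dt
        self.prev_e = e
        return self.Kp * e + self.Ki * self.integral + self.Kd * deriv
```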
Table 5: Comparisons (mean ± std of the solutions) using different algorithms; w/t/l counts wins/ties/losses against BBTLBO.

Fun   | jDE                   | SaDE                  | PSOcfLocal            | PSOwFIPS              | TLBO                  | BBTLBO
F1    | 3.63e−25 ± 1.85e−24   | 7.65e−25 ± 3.34e−24   | 9.23e−18 ± 3.03e−17   | 1.01e−02 ± 5.48e−03   | 3.05e−189 ± 0.0       | 0.0 ± 0.0
F2    | 1.49e−23 ± 6.69e−23   | 2.75e−25 ± 1.08e−24   | 3.68e−17 ± 5.37e−17   | 1.08e−01 ± 5.05e−02   | 1.29e−185 ± 0.0       | 0.0 ± 0.0
F3    | 3.22e−02 ± 2.83e−02   | 2.08e−02 ± 1.18e−02   | 1.28e−02 ± 5.50e−03   | 1.86e−02 ± 4.39e−03   | 5.70e−04 ± 2.37e−04   | 2.27e−04 ± 1.26e−04
F4    | 2.11e+01 ± 6.74e+01   | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0
F5    | 1.22e+02 ± 1.37e+02   | 4.28e+01 ± 2.59e+01   | 1.17e+01 ± 9.30e+00   | 2.60e+03 ± 6.79e+02   | 9.45e−43 ± 6.47e−42   | 2.16e−115 ± 1.10e−114
F6    | 3.06e+01 ± 8.50e+00   | 2.45e+00 ± 2.60e+00   | 4.67e−01 ± 2.82e−01   | 2.66e+00 ± 5.58e−01   | 2.08e−78 ± 4.30e−78   | 3.63e−154 ± 1.34e−153
F7    | 8.28e−19 ± 3.49e−18   | 5.40e−16 ± 3.81e−15   | 1.34e−11 ± 1.27e−11   | 1.70e−02 ± 2.85e−03   | 3.84e−96 ± 5.53e−96   | 1.16e−188 ± 0.0
F8    | 2.16e+00 ± 4.16e+00   | 4.88e−01 ± 5.82e−01   | 9.60e−02 ± 6.99e−02   | 5.86e+01 ± 1.70e+01   | 7.09e−22 ± 4.99e−21   | 1.07e−56 ± 4.39e−56
F9    | 2.49e+01 ± 1.05e+01   | 2.61e+01 ± 1.07e+00   | 2.40e+01 ± 1.52e+00   | 2.65e+01 ± 3.54e−01   | 2.55e+01 ± 5.01e−01   | 2.83e+01 ± 3.41e−01
F10   | 5.05e−01 ± 7.06e−01   | 2.07e−01 ± 4.58e−01   | 1.94e−01 ± 4.56e−01   | 2.16e−02 ± 4.37e−03   | 3.62e−15 ± 5.02e−16   | 3.55e−15 ± 0.0
F11   | 2.03e+00 ± 1.94e+00   | 3.86e+00 ± 1.97e+00   | 4.26e+01 ± 1.06e+01   | 1.15e+02 ± 1.54e+01   | 1.55e+01 ± 8.09e+00   | 0.0 ± 0.0
F12   | 2.88e−02 ± 1.45e−01   | 6.50e−02 ± 1.87e−01   | 7.89e−01 ± 1.03e+00   | 1.36e+00 ± 7.41e−01   | 0.0 ± 0.0             | 0.0 ± 0.0
F13   | 1.87e−02 ± 3.58e−02   | 1.18e−02 ± 1.75e−02   | 1.16e−02 ± 1.58e−02   | 1.06e−01 ± 9.93e−02   | 0.0 ± 0.0             | 0.0 ± 0.0
F14   | 1.93e+02 ± 1.42e+02   | 1.35e+02 ± 1.26e+02   | 4.49e+03 ± 8.25e+02   | 3.96e+03 ± 8.40e+02   | 4.82e+03 ± 6.86e+02   | 5.58e+03 ± 7.80e+02
F15   | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0
F16   | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0
F17   | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0             | 0.0 ± 0.0
F18   | −9.40e+00 ± 2.10e+00  | −9.25e+00 ± 2.30e+00  | −7.76e+00 ± 3.42e+00  | −9.79e+00 ± 1.44e+00  | −9.72e+00 ± 1.42e+00  | −9.85e+00 ± 1.22e+00
F19   | −9.85e+00 ± 1.90e+00  | −9.87e+00 ± 1.83e+00  | −9.24e+00 ± 2.70e+00  | −1.04e+01 ± 4.23e−09  | −9.22e+00 ± 2.41e+00  | −9.82e+00 ± 1.78e+00
F20   | −9.65e+00 ± 2.23e+00  | −1.01e+01 ± 1.59e+00  | −9.63e+00 ± 2.50e+00  | −1.05e+01 ± 1.01e−04  | −9.65e+00 ± 2.23e+00  | −9.41e+00 ± 2.43e+00
w/t/l | 13/3/4                | 12/4/4                | 13/4/3                | 12/4/4                | 11/6/3                | —
Figure 6: Comparison of the performance curves using different algorithms: (a) convergence curves of TLBO and BBTLBO, (b) approximation curves (actual curve versus TLBO and BBTLBO), and (c) error curves of TLBO and BBTLBO.
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm | Training error (Mean ± Std)  | Testing error (Mean ± Std)
TLBO      | 9.85e−04 ± 9.26e−04          | 9.43e−04 ± 9.18e−04
BBTLBO    | 3.45e−04 ± 2.02e−04          | 2.76e−04 ± 1.82e−04
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and ω_i (i = 1, 2, 3) are weight coefficients.
To avoid overshooting, a penalty value is adopted in the cost function. That is, once overshooting occurs, the value of the overshoot is added to the cost function, and the cost function is given as follows [32, 33]:
if dy(t) < 0:
    f(t) = ∫₀^∞ (ω₁ |e(t)| + ω₂ u²(t) + ω₄ |dy(t)|) dt + ω₃ t_r
else:
    f(t) = ∫₀^∞ (ω₁ |e(t)| + ω₂ u²(t)) dt + ω₃ t_r
end   (18)
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm | K_P     | K_I     | K_D     | Overshoot (%) | Peak time (s) | Rise time (s) | Cost function | CPU time (s)
GA        | 0.11257 | 0.02710 | 0.28792 | 2.90585       | 1.65          | 1.05          | 1.634555      | 7.059
PSO       | 0.11772 | 0.01756 | 0.27737 | 1.04808       | 1.65          | 0.65          | 1.160773      | 6.910
BBTLBO    | 0.11605 | 0.01661 | 0.25803 | 0.34261       | 1.80          | 0.70          | 1.134300      | 7.045
Figure 7: Performance curves (best cost value versus generation) using different methods (GA, PSO, and BBTLBO).
where ω₄ is a penalty coefficient with ω₄ ≫ ω₁, dy(t) = y(t) − y(t − 1), and y(t) is the output of the controlled objective.
In our simulation, the transfer function of the plant examined is given as follows [34]:
G(s) = 1958 / (s³ + 17.89 s² + 103.3 s + 190.8)   (19)
The system sampling time is Δt = 0.05 second, and the control value u is limited to the range [−10, 10]. Other relevant system variables are K_P ∈ [0, 1], K_I ∈ [0, 1], and K_D ∈ [0, 1]. The weight coefficients of the cost function are set as ω₁ = 0.999, ω₂ = 0.001, ω₃ = 2, and ω₄ = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are 50, and the corresponding maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set as 0.90 and the mutation rate as 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function, with a CPU time comparable to those of the other methods.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controller tuned by PSO has a smaller peak time and rise time, its maximum overshoot is much larger than that of the controller tuned by BBTLBO. It can be concluded that the PID controller tuned by BBTLBO achieves the best overall control performance in the simulations.
7. Conclusion

In this paper, TLBO has been extended to BBTLBO, which uses a hybridization of the learning strategy of the standard TLBO and Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is applied to 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO in each generation. Compared with other algorithms on the 20 chosen test problems, BBTLBO uses neighborhood search to generate better-quality solutions more effectively, although it does not have the best performance in every experimental case of this paper. It can also be observed that the BBTLBO algorithm
gives the best performance on the two real-world optimization problems compared with the other algorithms in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work is also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
The Scientific World Journal 13
Table5Com
paris
onsm
eanplusmnstd
ofthes
olutions
usingdifferent
algorithm
s
Fun
jDE
SaDE
PSOcfLo
cal
PSOwFIPS
TLBO
BBTL
BO1198651
363119890minus025plusmn185119890minus024
765119890minus025plusmn334119890minus024
923119890minus018plusmn303119890minus017
101119890minus002plusmn548119890minus003
305119890minus189plusmn00
00plusmn00
1198652
149119890minus023plusmn669119890minus023
275119890minus025plusmn108119890minus024
368119890minus017plusmn537119890minus017
108119890minus001plusmn505119890minus002
129119890minus185plusmn00
00plusmn00
1198653
322119890minus002plusmn283119890minus002
208119890minus002plusmn118119890minus002
128119890minus002plusmn550119890minus003
186119890minus002plusmn439119890minus003
570119890minus004plusmn237119890minus004
227eminus
004plusmn126
eminus004
1198654
211119890+001plusmn674119890+001
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
1198655
122119890+002plusmn137119890+002
428119890+001plusmn259119890+001
117119890+001plusmn930119890+000
260119890+003plusmn679119890+002
945119890minus043plusmn647119890minus042
216eminus
115plusmn110
eminus114
1198656
306119890+001plusmn850119890+000
245119890+000plusmn260119890+000
467119890minus001plusmn282119890minus001
266119890+000plusmn558119890minus001
208119890minus078plusmn430119890minus078
363eminus
154plusmn134
eminus153
1198657
828119890minus019plusmn349119890minus018
540119890minus016plusmn381119890minus015
134119890minus011plusmn127119890minus011
170119890minus002plusmn285119890minus003
384119890minus096plusmn553119890minus096
116eminus
188plusmn00
1198658
216119890+000plusmn416119890+000
488119890minus001plusmn582119890minus001
960119890minus002plusmn699119890minus002
586119890+001plusmn170119890+001
709119890minus022plusmn499119890minus021
107eminus
056plusmn439
eminus056
1198659
249119890+001plusmn105119890+001
261119890+001plusmn107119890+000
240e+
001plusmn152
e+000
265119890+001plusmn354119890minus001
255119890+001plusmn501119890minus001
283119890+001plusmn341119890minus001
11986510
505119890minus001plusmn706119890minus001
207119890minus001plusmn458119890minus001
194119890minus001plusmn456119890minus001
216119890minus002plusmn437119890minus003
362119890minus015plusmn502119890minus016
355eminus
015plusmn00
11986511
203119890+000plusmn194119890+000
386119890+000plusmn197119890+000
426119890+001plusmn106119890+001
115119890+002plusmn154119890+001
155119890+001plusmn809119890+000
00plusmn00
11986512
288119890minus002plusmn145119890minus001
650119890minus002plusmn187119890minus001
789119890minus001plusmn103119890+000
136119890+000plusmn741119890minus001
00plusmn00
00plusmn00
11986513
187119890minus002plusmn358119890minus002
118119890minus002plusmn175119890minus002
116119890minus002plusmn158119890minus002
106119890minus001plusmn993119890minus002
00plusmn00
00plusmn00
11986514
193119890+002plusmn142119890+002
135
e+002plusmn126e+
002
449119890+003plusmn825119890+002
396119890+003plusmn840119890+002
482119890+003plusmn686119890+002
558119890+003plusmn780119890+002
11986515
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
11986516
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
11986517
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
00plusmn00
11986518
minus940119890+000plusmn210119890+000
minus925119890+000plusmn230119890+000
minus776119890+000plusmn342119890+000
minus979119890+000plusmn144119890+000
minus972119890+000plusmn142119890+000
minus985e+
000plusmn122e+
000
11986519
minus985119890+000plusmn190119890+000
minus987119890+000plusmn183119890+000
minus924119890+000plusmn270119890+000
minus104e+
001plusmn423eminus
009
minus922119890+000plusmn241119890+000
minus982119890+000plusmn178119890+000
11986520
minus965119890+000plusmn223119890+000
minus101119890+001plusmn159119890+000
minus963119890+000plusmn250119890+000
minus105e+
001plusmn101eminus
004
minus965119890+000plusmn223119890+000
minus941119890+000plusmn243119890+000
119908119905119897
1334
1244
1343
1244
1163
mdash
14 The Scientific World Journal
0 50 100 150 200 250 300 350 400 450 500
0
1
2
3 Convergence curves using different methods
Generation
Best
valu
e
minus5
minus4
minus3
minus2
minus1
TLBOBBTLBO
(a) Convergence curves
0 10 20 30 40 50 60 70
0
005
01
015
02
025
03
035 Approximation curves using different methods
Sample number
y(t)
minus005
ActualTLBO
BBTLBO
(b) Approximation curves
0 10 20 30 40 50 60 700
0002
0004
0006
0008
001
0012
0014
0016Error curves using different methods
Sample number
y(t)
TLBOBBTLBO
(c) Error curves
Figure 6 Comparison of the performance curves using different algorithms
Table 6 Comparisons between BBTLBO and other algorithms onMSE
Algorithm Training error Testing errorMean Std Mean Std
TLBO 985119890 minus 004 926119890 minus 004 943119890 minus 004 918119890 minus 004
BBTLBO 345119890 minus 004 202119890 minus 004 276119890 minus 004 182119890 minus 004
where 119890(119905) and 119906(119905) are used to represent the system error andthe control output at time 119905 119905
119903is the rising time and 120596
119894(119894 = 1
2 3) are weight coefficientsTo avoid overshooting a penalty value is adopted in the
cost function That is once overshooting occurs the value
of overshooting is added to the cost function and the costfunction is given as follows [32 33]
if 119889119910 (119905) lt 0
119891 (119905) = int
infin
0
(1205961 |119890 (119905)| + 1205962119906
2(119905)
+1205964
1003816100381610038161003816119889119910 (119905)
1003816100381610038161003816) 119889119905 + 120596
3119905119903
else
119891 (119905) = int
infin
0
(1205961 |119890 (119905)| + 1205962119906
2(119905)) 119889119905 + 1205963119905119903
end
(18)
The Scientific World Journal 15
Table 7 Comparisons of parameters of PID controllers using different algorithms
Algorithm 119870119875
119870119868
119870119863
Overshoot () Peak time (s) Rise time (s) Cost function CPU time (s)GA 011257 002710 028792 290585 165000 105000 1634555 705900PSO 011772 001756 027737 104808 165000 065000 1160773 691000BBTLBO 011605 001661 025803 034261 180000 070000 1134300 704500
0 5 10 15 20 25 30 35 40 45 501
15
2
25
3
35
Generation
Best
valu
e
Performance curves using different methods
GAPSOBBTLBO
Figure 7 Performance curves using different methods
where 1205964is a coefficient and 120596
4≫ 1205961 119889119910(119905) = 119910(119905) minus119910(119905minus 1)
and 119910(119905) is the output of the controlled objectiveIn our simulation the formulas for the plant examined
are given as follows [34]
119866 (119904) =1958
1199043+ 1789119904
2+ 1033119904 + 1908
(19)
The system sampling time is Δ119905 = 005 second and thecontrol value 119906 is limited in the range of [minus10 10] Other rel-evant system variables are 119870
119875isin [0 1] 119870
119868isin [0 1] and 119870
119863isin
[0 1] The weight coefficients of the cost function are set as1205961= 0999 120596
2= 0001 120596
3= 2 and 120596 = 100 in this example
In the simulations the step response of PID controlsystem tuned by the proposed BBTLBO is compared withthat tuned by the standard genetic algorithm (GA) and thestandard PSO (PSO) The population sizes of GA PSO andBBTLBO are 50 and the corresponding maximum numbersof iterations are 50 50 and 50 respectively In addition thecrossover rate is set as 090 and the mutation rate is 010 forGA
The optimal parameters and the corresponding perfor-mance values of the PID controllers are listed in Table 7 andthe corresponding performance curves and step responsescurves are given in Figures 7 and 8 It can be seen fromFigure 7 and Table 7 that the PID controller tuned byBBTLBO has the minimum cost function and CPU time
0 05 1 15 2 25 3 35 4 45 50
02
04
06
08
1
Time (s)
Step response curves using different methods
GAPSOBBTLBO
r in
you
t
Figure 8 Step response curves using different methods
Although PID controllers tuned by PSO have a smaller peaktime and rise time their maximum overshoots are muchlarger than the overshoot tuned by BBTLBO It concludes thatthe PID controller tuned by the BBTLBO could perform thebest control performance in the simulations
7 Conclusion
In this paper TLBO has been extended to BBTLBO whichuses the hybridization of the learning strategy in the stan-dard TLBO and Gaussian sampling learning to balance theexploration and the exploitation in teacher phase and uses amodified mutation operation so as to eliminate the duplicatelearners in learner phase The proposed BBTLBO algorithmis utilized to optimize 20 benchmark functions and tworeal-world optimization problems From the analysis andexperiments the BBTLBO algorithm significantly improvesthe performance of the original TLBO although it needs tospend more CPU time than the standard TLBO algorithmin each generation From the results compared with otheralgorithms on the 20 chosen test problems it can be observedthat the BBTLBO algorithm has good performance by usingneighborhood search more effectively to generate betterquality solutions although the BBTLBO algorithm does notalways have the best performance in all experiments cases ofthis paper It can be also observed that the BBTLBOalgorithm
16 The Scientific World Journal
gives the best performance on two real-world optimizationproblems compared with other algorithms in the paper
Further work includes research into neighborhood searchbased on different topological structures Moreover thealgorithm may be further applied to constrained dynamicand noisy single-objective and multiobjective optimizationdomain It is expected that BBTLBOwill be used tomore real-world optimization problems
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
Acknowledgments
This research was partially supported by the National NaturalScience Foundation of China (61100173 61100009 61272283and 61304082) This work is partially supported by theNatural Science Foundation of Anhui Province China (Grantno 1308085MF82) and the Doctoral Innovation Foundationof Xirsquoan University of Technology (207-002J1305)
The Scientific World Journal
Figure 6: Comparison of the performance curves using different algorithms: (a) convergence curves of TLBO and BBTLBO (best value versus generation), (b) approximation curves (actual output, TLBO, and BBTLBO versus sample number), and (c) error curves (error versus sample number).
Table 6: Comparisons between BBTLBO and other algorithms on MSE.

Algorithm   Training error            Testing error
            Mean        Std           Mean        Std
TLBO        9.85e-004   9.26e-004     9.43e-004   9.18e-004
BBTLBO      3.45e-004   2.02e-004     2.76e-004   1.82e-004
where e(t) and u(t) represent the system error and the control output at time t, t_r is the rise time, and \omega_i (i = 1, 2, 3) are weight coefficients.

To avoid overshoot, a penalty value is adopted in the cost function. That is, once overshoot occurs, the amount of overshoot is added to the cost, and the cost function is given as follows [32, 33]:

f(t) = \int_0^\infty (\omega_1 |e(t)| + \omega_2 u^2(t) + \omega_4 |dy(t)|) \, dt + \omega_3 t_r,  if dy(t) < 0

f(t) = \int_0^\infty (\omega_1 |e(t)| + \omega_2 u^2(t)) \, dt + \omega_3 t_r,  otherwise
(18)
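As a rough numerical illustration, the piecewise cost in (18) can be approximated from a sampled closed-loop response. The function below is a hypothetical sketch, not the authors' code: the function name, argument layout, and the rectangle-rule approximation of the integral are assumptions; the weight defaults follow the values used later in this section.

```python
def pid_cost(e, u, y, t_r, w1=0.999, w2=0.001, w3=2.0, w4=100.0, dt=0.05):
    """Discrete approximation of cost function (18).

    e, u, y: sampled error, control output, and plant output sequences.
    t_r: rise time; w1..w4: weight coefficients (w4 >> w1 penalizes overshoot).
    """
    cost = 0.0
    for k in range(1, len(y)):
        dy = y[k] - y[k - 1]
        term = w1 * abs(e[k]) + w2 * u[k] ** 2
        if dy < 0:               # overshoot condition of (18): add penalty term
            term += w4 * abs(dy)
        cost += term * dt        # rectangle rule for the integral
    return cost + w3 * t_r
```

Note how the branch on dy(t) < 0 reproduces the penalty structure of (18): the \omega_4 |dy(t)| term only enters the running integral when the output moves back down after overshooting.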
Table 7: Comparisons of parameters of PID controllers using different algorithms.

Algorithm   K_P       K_I       K_D       Overshoot (%)   Peak time (s)   Rise time (s)   Cost function   CPU time (s)
GA          0.11257   0.02710   0.28792   2.90585         1.65000         1.05000         16.34555        7.05900
PSO         0.11772   0.01756   0.27737   1.04808         1.65000         0.65000         11.60773        6.91000
BBTLBO      0.11605   0.01661   0.25803   0.34261         1.80000         0.70000         11.34300        7.04500
Figure 7: Performance curves (best value versus generation) using different methods (GA, PSO, and BBTLBO).
where \omega_4 is a penalty coefficient with \omega_4 \gg \omega_1, dy(t) = y(t) - y(t - 1), and y(t) is the output of the controlled objective.

In our simulation, the transfer function of the plant examined is given as follows [34]:

G(s) = 1958 / (s^3 + 1789 s^2 + 1033 s + 1908)
(19)
The system sampling time is \Delta t = 0.05 second, and the control value u is limited to the range [-10, 10]. Other relevant system variables are K_P \in [0, 1], K_I \in [0, 1], and K_D \in [0, 1]. The weight coefficients of the cost function are set as \omega_1 = 0.999, \omega_2 = 0.001, \omega_3 = 2, and \omega_4 = 100 in this example.
In the simulations, the step response of the PID control system tuned by the proposed BBTLBO is compared with those tuned by the standard genetic algorithm (GA) and the standard PSO. The population sizes of GA, PSO, and BBTLBO are all 50, and the corresponding maximum number of iterations is 50 for each algorithm. In addition, the crossover rate is set as 0.90 and the mutation rate as 0.10 for GA.
The optimal parameters and the corresponding performance values of the PID controllers are listed in Table 7, and the corresponding performance curves and step response curves are given in Figures 7 and 8. It can be seen from Figure 7 and Table 7 that the PID controller tuned by BBTLBO has the minimum cost function among the three algorithms.
Figure 8: Step response curves using different methods (GA, PSO, and BBTLBO).
Although the PID controller tuned by PSO has a smaller peak time and rise time, its maximum overshoot is much larger than that of the controller tuned by BBTLBO. This indicates that the PID controller tuned by BBTLBO delivers the best overall control performance in these simulations.
7 Conclusion
In this paper, TLBO has been extended to BBTLBO, which hybridizes the learning strategy of the standard TLBO with Gaussian sampling learning to balance exploration and exploitation in the teacher phase, and uses a modified mutation operation to eliminate duplicate learners in the learner phase. The proposed BBTLBO algorithm is used to optimize 20 benchmark functions and two real-world optimization problems. The analysis and experiments show that BBTLBO significantly improves the performance of the original TLBO, although it spends more CPU time than the standard TLBO in each generation. Compared with other algorithms on the 20 chosen test problems, BBTLBO achieves good performance by using neighborhood search more effectively to generate better-quality solutions, although it does not always perform best in every experimental case of this paper. It can also be observed that BBTLBO gives the best performance on the two real-world optimization problems compared with the other algorithms in the paper.
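As a rough sketch of the teacher-phase idea summarized above, the hybrid update can be pictured as choosing, per dimension, between the standard TLBO teacher-phase move and Gaussian bare-bones sampling. This is not the paper's exact update rule: the function name, the 50/50 mixing probability, and the choice of Gaussian mean and deviation are assumptions made for illustration.

```python
import random

def bb_teacher_phase(learner, teacher, class_mean, tf=None):
    """Hybrid teacher-phase update (illustrative sketch only): mix the
    standard TLBO move toward the teacher with Gaussian bare-bones
    sampling centred between the learner and the teacher."""
    if tf is None:
        tf = random.randint(1, 2)   # teaching factor of the standard TLBO
    new = []
    for x, t, m in zip(learner, teacher, class_mean):
        if random.random() < 0.5:
            # standard TLBO teacher-phase step toward the teacher
            new.append(x + random.random() * (t - tf * m))
        else:
            # Gaussian bare-bones sampling around the learner-teacher midpoint
            mu, sigma = (x + t) / 2.0, abs(x - t)
            new.append(random.gauss(mu, sigma))
    return new
```

In the full algorithm, the new position would be kept only if it improves the learner's fitness, as in standard TLBO.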
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work was also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82) and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," CAD Computer Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm: explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.
The Scientific World Journal 15
Table 7 Comparisons of parameters of PID controllers using different algorithms
Algorithm 119870119875
119870119868
119870119863
Overshoot () Peak time (s) Rise time (s) Cost function CPU time (s)GA 011257 002710 028792 290585 165000 105000 1634555 705900PSO 011772 001756 027737 104808 165000 065000 1160773 691000BBTLBO 011605 001661 025803 034261 180000 070000 1134300 704500
0 5 10 15 20 25 30 35 40 45 501
15
2
25
3
35
Generation
Best
valu
e
Performance curves using different methods
GAPSOBBTLBO
Figure 7 Performance curves using different methods
where 1205964is a coefficient and 120596
4≫ 1205961 119889119910(119905) = 119910(119905) minus119910(119905minus 1)
and 119910(119905) is the output of the controlled objectiveIn our simulation the formulas for the plant examined
are given as follows [34]
119866 (119904) =1958
1199043+ 1789119904
2+ 1033119904 + 1908
(19)
The system sampling time is Δ119905 = 005 second and thecontrol value 119906 is limited in the range of [minus10 10] Other rel-evant system variables are 119870
119875isin [0 1] 119870
119868isin [0 1] and 119870
119863isin
[0 1] The weight coefficients of the cost function are set as1205961= 0999 120596
2= 0001 120596
3= 2 and 120596 = 100 in this example
In the simulations the step response of PID controlsystem tuned by the proposed BBTLBO is compared withthat tuned by the standard genetic algorithm (GA) and thestandard PSO (PSO) The population sizes of GA PSO andBBTLBO are 50 and the corresponding maximum numbersof iterations are 50 50 and 50 respectively In addition thecrossover rate is set as 090 and the mutation rate is 010 forGA
The optimal parameters and the corresponding perfor-mance values of the PID controllers are listed in Table 7 andthe corresponding performance curves and step responsescurves are given in Figures 7 and 8 It can be seen fromFigure 7 and Table 7 that the PID controller tuned byBBTLBO has the minimum cost function and CPU time
0 05 1 15 2 25 3 35 4 45 50
02
04
06
08
1
Time (s)
Step response curves using different methods
GAPSOBBTLBO
r in
you
t
Figure 8 Step response curves using different methods
Although PID controllers tuned by PSO have a smaller peaktime and rise time their maximum overshoots are muchlarger than the overshoot tuned by BBTLBO It concludes thatthe PID controller tuned by the BBTLBO could perform thebest control performance in the simulations
7 Conclusion
In this paper TLBO has been extended to BBTLBO whichuses the hybridization of the learning strategy in the stan-dard TLBO and Gaussian sampling learning to balance theexploration and the exploitation in teacher phase and uses amodified mutation operation so as to eliminate the duplicatelearners in learner phase The proposed BBTLBO algorithmis utilized to optimize 20 benchmark functions and tworeal-world optimization problems From the analysis andexperiments the BBTLBO algorithm significantly improvesthe performance of the original TLBO although it needs tospend more CPU time than the standard TLBO algorithmin each generation From the results compared with otheralgorithms on the 20 chosen test problems it can be observedthat the BBTLBO algorithm has good performance by usingneighborhood search more effectively to generate betterquality solutions although the BBTLBO algorithm does notalways have the best performance in all experiments cases ofthis paper It can be also observed that the BBTLBOalgorithm
16 The Scientific World Journal
gives the best performance on two real-world optimizationproblems compared with other algorithms in the paper
Further work includes research into neighborhood searchbased on different topological structures Moreover thealgorithm may be further applied to constrained dynamicand noisy single-objective and multiobjective optimizationdomain It is expected that BBTLBOwill be used tomore real-world optimization problems
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
Acknowledgments
This research was partially supported by the National NaturalScience Foundation of China (61100173 61100009 61272283and 61304082) This work is partially supported by theNatural Science Foundation of Anhui Province China (Grantno 1308085MF82) and the Doctoral Innovation Foundationof Xirsquoan University of Technology (207-002J1305)
References
[1] D E Goldberg Genetic Algorithms in Search Optimization andMachine Learning Addison-Wesley Reading Mass USA 1989
[2] L C Jiao and L Wang ldquoA novel genetic algorithm based onimmunityrdquo IEEE Transactions on SystemsMan and CyberneticsA Systems and Humans vol 30 no 5 pp 552ndash561 2000
[3] R Storn and K Price ldquoDifferential evolution a simple andefficient Heuristic for global optimization over continuousspacesrdquo Journal of Global Optimization vol 11 no 4 pp 341ndash359 1997
[4] M Dorigo and T Stutzle Ant Colony Optimization MIT Press2004
[5] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995
[6] D Karaboga and B Basturk ldquoOn the performance of artificialbee colony (ABC) algorithmrdquo Applied Soft Computing Journalvol 8 no 1 pp 687ndash697 2008
[7] D Simon ldquoBiogeography-based optimizationrdquo IEEE Transac-tions on Evolutionary Computation vol 12 no 6 pp 702ndash7132008
[8] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization a novelmethod for constrainedmechanicaldesign optimization problemsrdquo CAD Computer Aided Designvol 43 no 3 pp 303ndash315 2011
[9] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization an optimization method for continuousnon-linear large scale problemsrdquo Information Sciences vol 183no 1 pp 1ndash15 2012
[10] R V Rao V J Savsani and D P Vakharia ldquoTeaching-learning-based optimization algorithm for unconstrained and con-strained real-parameter optimization problemsrdquo EngineeringOptimization vol 44 no 12 pp 1447ndash1462 2011
[11] V Togan ldquoDesign of planar steel frames using teaching-learningbased optimizationrdquo Engineering Structures vol 34 pp 225ndash232 2012
[12] R V Rao and V Patel ldquoAn elitist teaching-learning-based opti-mization algorithm for solving complex constrained optimiza-tion problemsrdquo International Journal of Industrial EngineeringComputations vol 3 pp 535ndash560 2012
[13] S O Degertekin and M S Hayalioglu ldquoSizing truss structuresusing teaching-learning-based optimizationrdquo Computers andStructures vol 119 pp 177ndash188 2013
[14] R V Rao and V Patel ldquoAn improved teaching-learning-basedoptimization algorithm for solving unconstrained optimizationproblemsrdquo Scientia Iranica vol 20 no 3 pp 710ndash720 2013
[15] R V Rao and V Patel ldquoMulti-objective optimization of com-bined Brayton and inverse Brayton cycles using advancedoptimization algorithmsrdquo Engineering Optimization vol 44 no8 pp 965ndash983 2011
[16] T Niknam F Golestaneh and M S Sadeghi ldquoTheta-multi-objective teaching-learning-based optimization for dynamiceconomic emission dispatchrdquo IEEE Systems Journal vol 6 no2 pp 341ndash352 2012
[17] R V Rao and V Patel ldquoMulti-objective optimization of heatexchangers using a modified teaching-learning-based opti-mization algorithmrdquo Applied Mathematical Modelling vol 37no 3 pp 1147ndash1162 2013
[18] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002
[19] F van den Bergh and A P Engelbrecht ldquoA study of particleswarm optimization particle trajectoriesrdquo Information Sciencesvol 176 no 8 pp 937ndash971 2006
[20] J Kennedy ldquoBare bones particle swarmsrdquo in Proceedings of theSwarm Intelligence Symposium (SIS rsquo03) pp 80ndash87 2003
[21] M G H Omran A P Engelbrecht and A Salman ldquoBarebones differential evolutionrdquo European Journal of OperationalResearch vol 196 no 1 pp 128ndash139 2009
[22] H Wang S Rahnamayan H Sun and M G H OmranldquoGaussian bare-bones differential evolutionrdquo IEEE Transactionson Cybernetics vol 43 no 2 pp 634ndash647 2013
[23] X H Hu and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation pp 677ndash1681 2002
[24] M G Omran A P Engelbrecht and A Salman ldquoUsingthe ring neighborhood topology with self-adaptive differentialevolutionrdquo in Advances in Natural Computation pp 976ndash979Springer Berlin Germany 2006
[25] X Li ldquoNiching without niching parameters particle swarmoptimization using a ring topologyrdquo IEEE Transactions onEvolutionary Computation vol 14 no 1 pp 150ndash169 2010
[26] I Maruta T H Kim D Song and T Sugie ldquoSynthesis of fixed-structure robust controllers using a constrained particle swarmoptimizer with cyclic neighborhood topologyrdquo Expert Systemswith Applications vol 40 no 9 pp 3595ndash3605 2013
[27] J Kennedy and R Mendes ldquoPopulation structure and particleswarm performancerdquo in Proceedings of the International Con-ference on Evolutionary Computation pp 1671ndash1676 HonoluluHawaii USA 2002
[28] J Brest S Greiner B Boskovic M Mernik and V ZumerldquoSelf-adapting control parameters in differential evolution acomparative study on numerical benchmark problemsrdquo IEEETransactions on Evolutionary Computation vol 10 no 6 pp646ndash657 2006
The Scientific World Journal 17
[29] A K Qin V L Huang and P N Suganthan ldquoDifferential evo-lution algorithm with strategy adaptation for global numericaloptimizationrdquo IEEE Transactions on Evolutionary Computationvol 13 no 2 pp 398ndash417 2009
[30] R Mendes J Kennedy and J Neves ldquoThe fully informedparticle swarm simpler maybe betterrdquo IEEE Transactions onEvolutionary Computation vol 8 no 3 pp 204ndash210 2004
[31] F Herrera and M Lozano ldquoGradual distributed real-codedgenetic algorithmsrdquo IEEE Transactions on Evolutionary Compu-tation vol 4 no 1 pp 43ndash62 2000
[32] J Liu Advanced PID Control and MATLAB Simulation Elec-tronic Industry Press 2003
[33] J Zhang J Zhuang H Du and S Wang ldquoSelf-organizinggenetic algorithm based tuning of PID controllersrdquo InformationSciences vol 179 no 7 pp 1007ndash1017 2009
[34] R Haber-Haber R Haber M Schmittdiel and R M delToro ldquoA classic solution for the control of a high-performancedrilling processrdquo International Journal of Machine Tools andManufacture vol 47 no 15 pp 2290ndash2297 2007
gives the best performance on two real-world optimization problems compared with the other algorithms in this paper.
Further work includes research into neighborhood search based on different topological structures. Moreover, the algorithm may be further applied to constrained, dynamic, and noisy single-objective and multiobjective optimization domains. It is expected that BBTLBO will be applied to more real-world optimization problems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This research was partially supported by the National Natural Science Foundation of China (61100173, 61100009, 61272283, and 61304082). This work was also partially supported by the Natural Science Foundation of Anhui Province, China (Grant no. 1308085MF82), and the Doctoral Innovation Foundation of Xi'an University of Technology (207-002J1305).
References
[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, Mass, USA, 1989.
[2] L. C. Jiao and L. Wang, "A novel genetic algorithm based on immunity," IEEE Transactions on Systems, Man, and Cybernetics A: Systems and Humans, vol. 30, no. 5, pp. 552–561, 2000.
[3] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[4] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, 2004.
[5] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
[6] D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
[7] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[8] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: a novel method for constrained mechanical design optimization problems," Computer-Aided Design, vol. 43, no. 3, pp. 303–315, 2011.
[9] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[10] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems," Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2011.
[11] V. Togan, "Design of planar steel frames using teaching-learning based optimization," Engineering Structures, vol. 34, pp. 225–232, 2012.
[12] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, pp. 535–560, 2012.
[13] S. O. Degertekin and M. S. Hayalioglu, "Sizing truss structures using teaching-learning-based optimization," Computers and Structures, vol. 119, pp. 177–188, 2013.
[14] R. V. Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica, vol. 20, no. 3, pp. 710–720, 2013.
[15] R. V. Rao and V. Patel, "Multi-objective optimization of combined Brayton and inverse Brayton cycles using advanced optimization algorithms," Engineering Optimization, vol. 44, no. 8, pp. 965–983, 2011.
[16] T. Niknam, F. Golestaneh, and M. S. Sadeghi, "Theta-multi-objective teaching-learning-based optimization for dynamic economic emission dispatch," IEEE Systems Journal, vol. 6, no. 2, pp. 341–352, 2012.
[17] R. V. Rao and V. Patel, "Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm," Applied Mathematical Modelling, vol. 37, no. 3, pp. 1147–1162, 2013.
[18] M. Clerc and J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58–73, 2002.
[19] F. van den Bergh and A. P. Engelbrecht, "A study of particle swarm optimization particle trajectories," Information Sciences, vol. 176, no. 8, pp. 937–971, 2006.
[20] J. Kennedy, "Bare bones particle swarms," in Proceedings of the Swarm Intelligence Symposium (SIS '03), pp. 80–87, 2003.
[21] M. G. H. Omran, A. P. Engelbrecht, and A. Salman, "Bare bones differential evolution," European Journal of Operational Research, vol. 196, no. 1, pp. 128–139, 2009.
[22] H. Wang, S. Rahnamayan, H. Sun, and M. G. H. Omran, "Gaussian bare-bones differential evolution," IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634–647, 2013.
[23] X. H. Hu and R. Eberhart, "Multiobjective optimization using dynamic neighborhood particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation, pp. 1677–1681, 2002.
[24] M. G. Omran, A. P. Engelbrecht, and A. Salman, "Using the ring neighborhood topology with self-adaptive differential evolution," in Advances in Natural Computation, pp. 976–979, Springer, Berlin, Germany, 2006.
[25] X. Li, "Niching without niching parameters: particle swarm optimization using a ring topology," IEEE Transactions on Evolutionary Computation, vol. 14, no. 1, pp. 150–169, 2010.
[26] I. Maruta, T. H. Kim, D. Song, and T. Sugie, "Synthesis of fixed-structure robust controllers using a constrained particle swarm optimizer with cyclic neighborhood topology," Expert Systems with Applications, vol. 40, no. 9, pp. 3595–3605, 2013.
[27] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the International Conference on Evolutionary Computation, pp. 1671–1676, Honolulu, Hawaii, USA, 2002.
[28] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[29] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398–417, 2009.
[30] R. Mendes, J. Kennedy, and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[31] F. Herrera and M. Lozano, "Gradual distributed real-coded genetic algorithms," IEEE Transactions on Evolutionary Computation, vol. 4, no. 1, pp. 43–62, 2000.
[32] J. Liu, Advanced PID Control and MATLAB Simulation, Electronic Industry Press, 2003.
[33] J. Zhang, J. Zhuang, H. Du, and S. Wang, "Self-organizing genetic algorithm based tuning of PID controllers," Information Sciences, vol. 179, no. 7, pp. 1007–1017, 2009.
[34] R. Haber-Haber, R. Haber, M. Schmittdiel, and R. M. del Toro, "A classic solution for the control of a high-performance drilling process," International Journal of Machine Tools and Manufacture, vol. 47, no. 15, pp. 2290–2297, 2007.