
Research Article
Improved Clonal Selection Algorithm Based on Biological Forgetting Mechanism

Chao Yang,1,2,3 Bing-qiu Chen,1 Lin Jia,1 and Hai-yang Wen1

1School of Computer Science and Information Engineering, Hubei University, Hubei Province, Wuhan 430062, China
2Hubei Education Informatization Engineering Technology Research Center, Hubei Province, Wuhan 430062, China
3Hubei Key Laboratory of Applied Mathematics, Faculty of Mathematics and Statistics, Hubei University, Wuhan 430062, China

Correspondence should be addressed to Chao Yang: stevenyc@hubu.edu.cn

Received 3 December 2019; Revised 24 February 2020; Accepted 10 March 2020; Published 7 April 2020

Academic Editor: Lingzhong Guo

Copyright © 2020 Chao Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The antibody candidate set generated by the clonal selection algorithm contains only a small number of antibodies with high antigen affinity that undergo high-frequency mutation. Among the remaining antibodies, those with low affinity are replaced by new antibodies to participate in the next round of clonal selection, but a large number whose affinity is too low to be selected for cloning yet too high to be replaced persist in the antibody set for a long time. This body of inactive antibodies forms a "black hole" in the antibody set that is difficult to remove and update in a timely manner, slowing the rate at which the algorithm approaches the optimal solution. Inspired by the mechanism of biological forgetting, an improved clonal selection algorithm is proposed to solve this problem. It uses an abstraction of the biological forgetting mechanism to eliminate antibodies that cannot actively participate in high-frequency mutation from the antibody candidate set, alleviating the insufficient antibody diversity that makes the clonal selection algorithm prone to falling into local optima. Compared with existing clonal selection and genetic algorithms, experiments and a time complexity analysis show that the algorithm has good optimization efficiency and stability.

1. Introduction

As a heuristic algorithm that solves complex problems through a high mutation rate, the clonal selection algorithm has two important characteristics: efficient optimization performance [1] and resistance to getting stuck in local optima [2]. As a result, it has attracted the attention of scholars in related fields. Kim and Bentley [3] used a dynamic clonal selection algorithm to solve the problem of anomaly detection in a changing environment. In recent years, the clonal selection algorithm, inspired by biological immunity, has also been widely applied to industrial problems such as power plant siting [4], electricity price prediction [5], hybrid shop scheduling [6], and car flow organization [7], and it has played an active role in improving clustering [8] as well as machine learning algorithms [9–11].

The classical clonal selection algorithm has problems with algorithmic efficiency and convergence rate and lacks sufficient theoretical support [12]. Therefore, researchers who apply the clonal selection algorithm to practical problems have also put forward many useful improvements to the algorithm itself. For example, Gálvez et al. [13] proposed an elitist clonal selection algorithm for complex multimodal and multivariable continuous nonlinear optimization problems. The algorithm replaces the selection and cloning mechanism of the classical clonal selection algorithm with one that retains the antibody with the best antigen affinity and selects the optimal set of the first n antibodies; this algorithm outperforms the comparable genetic algorithm approach in automatic node adjustment for B-spline curves. Gong et al. [14] proposed an improved clonal selection algorithm based on the Baldwin effect, which makes the algorithm more effective and robust by promoting the evolutionary exploration of good genotypes. Rao and Vaisakh [15] proposed a multiobjective adaptive clonal selection algorithm to solve the optimal power flow problem: Pareto-optimal solutions were found using the crowding distance, and the best strategy was selected based on a fuzzy mechanism. In terms of theoretical analysis, Hong et al. [16] analyzed the convergence of the elitist clonal selection algorithm from the perspective of the state transition probability matrix, Markov chains, and other stochastic-process theories and proposed a method to evaluate the convergence property of the algorithm, which has a certain reference value; that algorithm is also better than the genetic algorithm in automatic node adjustment of B-spline curves.

It can be seen from the above literature that the current clonal selection algorithm flow mainly includes the steps of selection, cloning, mutation, reselection, and replacement. At present, a large number of improved algorithms target the selection, cloning, and mutation steps, without considering improvement and optimization of the step that replaces and updates antibodies in the antibody set. In order to improve the update efficiency of the antibody set, this paper seeks a new scheme to replace the replacement step used in current clonal selection algorithms, so as to improve the overall search accuracy and convergence stability of the algorithm.

The antibody candidate set is the set of antibodies produced by the clonal selection algorithm during initialization. The clonal selection algorithm sorts the antibodies in the set by affinity and selects the first n antibodies with high affinity to clone and mutate. Since n depends on manual experience, it is generally set to 10% to 50% of the population size [12]; this selection mechanism inevitably leaves a large number of antibodies with middling affinity "stranded" in the antibody candidate set. Some antibodies are replaced by new ones due to low affinity, while the rest do not satisfy the conditions for replacement and remain in the antibody candidate set for a long time. This is not conducive to the rapid update of the antibody set and hurts the efficiency of the algorithm in finding the optimal solution.

This kind of antibody, which stays in the antibody candidate set for a long time, neither helps the algorithm converge nor updates quickly enough to jump out of a local optimum. The result is a phenomenon in which antibodies can be neither selected nor updated over many iterations, which this paper vividly calls a "black hole".

In this paper, an improved clonal selection algorithm based on the mechanism of biological forgetting (FCSA) is proposed. Aimed at the black hole formed by antibodies whose affinity meets neither the mutation nor the update conditions, the forgetting mechanism is applied to the algorithm's update process for the antibody candidate set. The aim is to replace these inactive, long-lived antibodies in the candidate set and enhance the diversity of the antibody set, thereby increasing the convergence speed of the algorithm.

2. Overview of the Biological Forgetting Mechanism

2.1. Biological Forgetting Research Background. In 1885, Ebbinghaus [17] put forward the Ebbinghaus memory curve, which described the amount of forgetting over time. Over the next 100 years, numerous scholars proposed theories to explain how forgetting occurs. At present, there are various opinions as to the causes of forgetting, and the widely supported explanations include decay theory [18] and interference theory [19]. Decay theory holds that forgetting, similar to the decay of radioactive elements, is a process in which memories disappear spontaneously over time. According to interference theory, forgetting is caused by conflict between memory blocks; the similarity, quantity, and intensity of memory blocks determine the degree of forgetting. The more similar the contents of memory blocks are, the more times they are recalled, and the higher the memory intensity is, the less likely they are to be forgotten.

2.2. Forgetting Mechanism Affected by Rac1 Activity. Although both decay theory and interference theory are supported by experimental results, they lack an objective mechanism explaining why decay theory is applicable mainly to long-term memory while interference theory is more applicable to short-term memory.

Shuai et al. [20] studied the small G protein Rac1 in the Drosophila brain and found that Rac1 activity affected the degree of forgetting in a Drosophila olfactory avoidance experiment. Specifically, after Rac1 activity was inhibited in the Drosophila brain, the forgetting rate slowed down; after Rac1 activity was enhanced by stimulation, the forgetting rate increased. Changes in Rac1 activity in the Drosophila brain during the memory decay and memory interference mentioned in Section 2.1 of this paper also support these conclusions.

On this basis, Shuai et al. [20] proposed a forgetting mechanism governed by Rac1 activity, in which Rac1 activity changes according to memory processes of differing time lengths. When memory was disturbed over a short time, Rac1 activity also increased rapidly, causing the previous, outdated memory to be forgotten; Rac1 took longer to activate when the time span was large, which suggests that memory declines over time.

Liu et al.'s [21] studies on Rac1 protein activity in mouse hippocampal neurons and on object recognition memory in mice further support the role of Rac1 activation in forgetting.

2.3. Observations about Biological Forgetting. In complex biological systems, the process of forgetting occurs frequently. This paper argues that forgetting is a process of information loss and that information loss is meaningful under certain circumstances. In reference [22], the odor type that caused a fruit fly to receive an electric shock was repeatedly changed, and the fruit fly would remember the odor most recently paired with the shock so as to avoid that type of odor. This suggests that in complex external situations, memories that cannot adapt to the current environment need to be forgotten. Accordingly, we suggest that the forgetting mechanism is meaningful for behaviors requiring rapid adaptation to a new environment, such as clonal selection in a high-mutation environment.


At the same time, the idea of the decay theory of biological forgetting is introduced into the replacement process of the clonal selection algorithm. The number of iterations an antibody has spent in the antibody candidate set is taken as the time length, and whether the antibody participates in high-frequency mutation in a given iteration is taken as the basis for whether the antibody is remembered at that point in time, so that an antibody with weak memory can be replaced when the time span is large. Moreover, whereas the current replacement mechanism of clonal selection still replaces the d antibodies with the worst affinity, after the introduction of decay theory an antibody can be dynamically replaced according to its own characteristics.

3. Clonal Selection Algorithm Based on Forgetting Mechanism

3.1. Clonal Selection Algorithm and Forgetting Mechanism

3.1.1. Introduction to the Clonal Selection Algorithm. In 2002, De Castro and Von Zuben [23], inspired by biological as well as artificial immunology, proposed a clonal selection algorithm that uses the principle of clonal selection to solve multimodal and combinatorial optimization problems. The main flow of this algorithm is as follows:

Step 1. Initialize the size of the antibody set, the number of iterations, the number of clones, and other relevant parameters; randomly select an antigen from the antigen set and generate the candidate antibody set, which is composed of a memory set and a remaining set.
Step 2. Calculate the affinity between each antibody in the candidate antibody set and the antigen, and select the first n antibodies with the highest affinity.
Step 3. Clone the n antibodies; the number of clones of each antibody is positively correlated with its affinity to the antigen.
Step 4. Apply mutation to the antibody set generated by cloning; the higher the affinity, the lower the probability of antibody mutation.
Step 5. Calculate antibody affinities after mutation, select the antibody with the highest affinity, and compare it with the antibody in the current memory set; then put the antibody with the higher affinity into the memory set.
Step 6. Randomly generate d new antibodies to replace the d antibodies in the remaining set with the worst affinity to the antigen.
Step 7. Skip to Step 2 for the next iteration. When the number of iterations meets the termination condition, the algorithm terminates.

A runnable sketch of this flow is given below.
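The following minimal Python sketch walks through the seven steps for a real-valued maximization problem. All names and parameter values (affinity, n_select, n_replace, the mutation step sizes) are illustrative assumptions rather than the original CLONALG settings, and the clone-count and mutation rules are simplified stand-ins for the operators detailed in Section 3.3.

import random

def affinity(ab):
    # Illustrative affinity: negated sphere function, maximal (0) at the origin.
    return -sum(x * x for x in ab)

def clonal_selection(dim=2, pop_size=50, n_select=10, n_replace=5,
                     generations=1000, lo=-5.0, hi=5.0):
    # Step 1: random initial candidate antibody set plus a one-slot memory set.
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    memory = max(pop, key=affinity)
    for _ in range(generations):                       # Step 7: iterate
        # Step 2: keep the n antibodies with the highest affinity.
        pop.sort(key=affinity, reverse=True)
        selected = pop[:n_select]
        # Steps 3-4: clone count grows with rank; mutation step shrinks with rank.
        clones = []
        for rank, ab in enumerate(selected):
            for _ in range(n_select - rank):
                step = 0.05 * (rank + 1)
                clones.append([x + random.uniform(-step, step) for x in ab])
        # Step 5: the best mutated clone competes with the memory antibody.
        best_clone = max(clones, key=affinity)
        if affinity(best_clone) > affinity(memory):
            memory = best_clone
        # Step 6: replace the d worst antibodies with random newcomers,
        # then fold the best clone back in (reselection).
        pop[-n_replace:] = [[random.uniform(lo, hi) for _ in range(dim)]
                            for _ in range(n_replace)]
        pop[-1] = best_clone
    return memory

print(clonal_selection())  # converges toward [0.0, 0.0]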

It can be seen that the clonal selection algorithm selects, in Step 2, the first n antibodies of the candidate set with the highest affinity, and the subsequent steps primarily involve cloning and mutation of these n antibodies. The rest of the antibody candidate set does not participate in this round of mutation, which leads to the following situation: antibody A does not belong to the first n antibodies with the highest affinity in one iteration of the algorithm, and it is not replaced in Step 6. In subsequent iterations, antibody A still never falls among the d antibodies with the lowest affinity to the antigen. In this case, antibody A can no longer adapt to new changes, cannot participate in the high-frequency mutation process of the algorithm, and cannot be replaced in time as one of the worst d antibodies, thus dragging down the update rate of antibodies in the whole candidate antibody set.

3.1.2. Clonal Selection Algorithm Inspired by the Forgetting Mechanism. In order to explain more intuitively the state of the candidate antibodies in a given iteration, the affinity between antigen and antibody is expressed in the form of distance. As described in the previous section, after all antibodies are sorted by affinity, the clonal selection algorithm selects the n best antibodies in the antibody candidate set for cloning and mutation. As shown in Figure 1, in the reselect step, the original antibodies and the mutated antibodies are mixed and screened for the next iteration. At this point, there are antibodies that were not selected for cloning in the previous round. We divide the best n antibodies into the first layer and the worst d antibodies among the remaining antibodies into the third layer; the second layer then contains the intermediate antibodies.

Since the total number P of antibodies remains unchanged, after the completion of one iteration of the CSA all of the antibodies in the third layer are replaced with new antibodies, and the antibodies in the candidate set are reordered according to their affinity with the antigen, so some of the antibodies in the entire candidate set migrate between layers. A part of the antibodies in layer I migrate into layer II, and some of the antibodies in layer II migrate into layer III. Consider the layer II antibodies at this time: they include antibodies that migrate into layer I or layer III as well as original antibodies that have not migrated out of layer II. After multiple rounds of iteration, there are still some unmigrated native antibodies in layer II, which have neither moved into layer I to participate in the algorithm's high-frequency mutation nor been replaced by new antibodies in layer III. We suggest that these antibodies, which persist in layer II, constitute a memory that cannot adapt to the current environment and should be updated along with the antibodies in layer III.

Inspired by the forgetting mechanism, for each antibody in the candidate antibody set we count the number of times it is selected among the top n affinities and the number of iterations it has spent in the candidate antibody set, and we use these two counts as antibody characteristics. These characteristics serve as the basis for changes in Rac1 activity, and the degree of Rac1 activation determines whether to eliminate a "stranded" antibody from the candidate antibody set.

As shown in Figure 2, antibodies in the candidate set are divided into two layers according to affinity, with layer II containing all the antibodies that remain outside layer I.


For the antibodies in the candidate set, the Rac1 activity of an antibody in layer I is significantly lower than that of one in layer II. When the Rac1 activity of a layer II antibody exceeds the activity threshold, that antibody is replaced, and the entire antibody candidate set is updated. In this way, the clonal selection algorithm avoids the antibody "black hole" formed by the partially unmigrated original antibodies in layer II.

3.2. Abstract Definition of the Forgetting Mechanism. The following definitions relate to the forgetting mechanism:

(1) Antibody survival time is the number of iterations an antibody has spent in the antibody candidate set.

(2) Time T is the execution time of the clonal selection algorithm. In this paper, T refers to the number of algorithm iterations.

(3) Appropriate memory is an attribute of each candidate antibody. In an iteration, if the antibody belongs to the best n antibodies, this counts as an appropriate memory.

(4) Appropriate memory strength is the appropriate memory accumulated by a candidate antibody during time T.

(5) Rac1 protein activity is the index governing antibody forgetting, determined by the antibody's survival time in the candidate set and its appropriate memory strength. Rac1 protein activity is proportional to the survival time of the antibody and inversely proportional to its appropriate memory strength.
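Taken together, these definitions admit a compact formula. A minimal reading, consistent with equation (4) and with line (16) of Algorithm 1 below (the exact functional form is our inference from the stated ratio, not a formula written out at this point in the text), is

$$\mathrm{Rac1}(ab) = \frac{T_{ab}}{S_{ab}},$$

where $T_{ab}$ is the survival time of antibody $ab$ and $S_{ab}$ is its appropriate memory strength; the antibody is forgotten once this ratio exceeds the threshold $c$.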

3.3. Improved Clonal Selection Algorithm. In this paper, an improved clonal selection algorithm (FCSA) inspired by the forgetting mechanism is proposed. Its core idea is to replace the receptor editing mechanism [24] in the CSA with a dedicated forgetting mechanism.

The specific implementation method is as follows. In each iteration of the algorithm, the appropriate memory strength and survival time of each antibody in the candidate set are recorded. After several iterations of the algorithm, antibody forgetting is determined based on whether the Rac1 protein activity has reached the threshold.

[Figure 1: Antigen and antibody distribution structure.]

[Figure 2: Antigen-antibody distribution structure affected by Rac1.]

3.3.1. Affinity Calculation. To simplify the calculation, the affinity of an antibody to the antigen is taken to be the value of the target test function, which can be expressed as

$$\mathrm{Aff}_{ab_i} = f(x_1, x_2, \ldots, x_D), \qquad (1)$$

where $ab_i = \{x_i \mid i = 1, 2, \ldots, D\}$ is the antibody and $D$ is the dimension of the antibody.

3.3.2. Cloning Method. According to the affinity between antibody and antigen, the cloning method performs the cloning operation on each antibody; the higher the affinity, the greater the number of clones produced. The specific cloning formula is

$$Ab_c = \left\{ ab_{ij} \;\middle|\; i \in [0, \mathrm{population\_size} - 1],\; j = \max\!\left(1, \operatorname{int}\!\left(\frac{\mathrm{Aff}_{ab_i} + a}{a \cdot \mathrm{max\_clone}}\right)\right) \right\}, \quad a > 0, \qquad (2)$$

where population_size represents the number of antibodies in the candidate set of antibodies, $\mathrm{Aff}_{ab_i}$ represents the affinity between antibody $ab_i$ and the antigen, max_clone represents the initial clone number, and $j$ represents the number of clones of antibody $ab_i$.
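As a concrete illustration, the sketch below implements equation (2) in Python. The grouping of the fraction follows our reading of the published layout of the formula, and the default parameter values echo Table 1 (a = 2 for the rate-like constant, max_clone = 5); both are assumptions.

import random

def clone_antibodies(population, affinity, a=2.0, max_clone=5):
    # Equation (2): antibody ab_i receives
    # j = max(1, int((Aff(ab_i) + a) / (a * max_clone))) clones.
    clones = []
    for ab in population:
        j = max(1, int((affinity(ab) + a) / (a * max_clone)))
        clones.extend([list(ab) for _ in range(j)])  # copy, don't alias
    return clones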

3.3.3. Variation Method. The mutation method operates on each cloned antibody $ab_i$ and determines the degree of mutation according to the affinity between the antibody and the antigen: the higher the affinity, the lower the probability and degree of mutation.

The specific variation formula is

$$ab = \{x_i \mid i = 1, 2, \ldots, D\},$$
$$x_i = \begin{cases} x_i + \mathrm{random}(-a, a), & \mathrm{random}(0, 1) < e^{-r \cdot (\mathrm{aff}_{ab} / \mathrm{max\_aff})}, \\ x_i, & \mathrm{random}(0, 1) \ge e^{-r \cdot (\mathrm{aff}_{ab} / \mathrm{max\_aff})}, \end{cases} \qquad (3)$$

where $r$ is the mutation rate, $a$ is the variation range ($a > 0$), and max_aff is the maximum affinity among the antibodies in the set.
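A hedged Python rendering of equation (3) follows; r defaults to the mutation rate of Table 1, a is an assumed variation range, and the snippet presumes aff and max_aff share the same sign convention.

import math
import random

def mutate(ab, aff, max_aff, r=2.0, a=0.5):
    # Equation (3): each component is perturbed by random(-a, a)
    # with probability exp(-r * aff / max_aff), so higher-affinity
    # antibodies mutate less often.
    p_mutate = math.exp(-r * aff / max_aff)
    return [x + random.uniform(-a, a) if random.random() < p_mutate else x
            for x in ab]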

3.3.4. Forgetting Method. The forgetting method determines whether an antibody must be forgotten based on the survival time of the antibody, its appropriate memory strength, and the activity of the Rac1 protein.

The specific forgetting formula is

$$ab = \{x_i \mid i = 1, 2, \ldots, D\},$$
$$x_i = \begin{cases} \mathrm{random}(\mathrm{minDomain}, \mathrm{maxDomain}), & ab_{\mathrm{time}} / ab_{\mathrm{strength}} > c, \\ x_i, & \text{otherwise}, \end{cases} \qquad (4)$$

where $ab_{\mathrm{time}}$ is the antibody survival time, $ab_{\mathrm{strength}}$ is the appropriate memory strength, and $c$ is the Rac1 protein activity threshold.
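In code, equation (4) reduces to a single guarded reinitialization. The sketch below uses c = 3 from Table 1; the domain bounds are illustrative assumptions.

import random

def forget(ab, ab_time, ab_strength, c=3.0, min_domain=-5.0, max_domain=5.0):
    # Equation (4): when survival time over appropriate memory strength
    # (the Rac1 activity proxy) exceeds the threshold c, the antibody
    # is forgotten, i.e., reinitialized uniformly at random.
    if ab_time / ab_strength > c:
        return [random.uniform(min_domain, max_domain) for _ in ab]
    return ab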

3.3.5. Algorithm Flow. The flow of the improved algorithm proposed in this paper is shown in Algorithm 1.

The termination conditions of the algorithm in Algorithm 1 can be chosen according to specific needs; common termination conditions include reaching a maximum number of function evaluations or a maximum number of generations.

In the algorithm, Rac1 protein activity is an inherent property of each candidate antibody; it is computed from the antibody's survival time and appropriate memory strength, starting when the antibody first enters the candidate set, and it changes dynamically as the algorithm executes. When this property reaches the threshold, the antibody has not mutated in a better direction within the expected time and is not sufficiently competitive with the other candidate antibodies, so the forgetting operation is performed on it.

4. Experiment and Algorithm Evaluation

Thirteen CEC test functions were selected as experimental test functions, and the search accuracy of the proposed algorithm (FCSA) was tested against the elitist clonal selection algorithm (ECSA) proposed in [13], the clonal selection algorithm with Baldwinian learning (BCSA) proposed in [14], and the genetic algorithm (GA). The experimental steps are as follows.

First, initialize the various parameters of the algorithms. The termination criterion in this experiment is to run GA, BCSA, ECSA, and FCSA until the number of function evaluations reaches the maximum value of 350,000.

Second, find the optimal solution of each test function. The algorithms were executed repeatedly, and the average, maximum, and minimum of the optimal solutions obtained over 100 executions were analyzed.

The purpose of the experiment in this paper is to analyze the effectiveness of the forgetting mechanism applied to the clonal selection algorithm. The performance of an algorithm is mainly evaluated by the quality of the results obtained under identical termination conditions. This article reports the mean and standard deviation of the results of multiple runs of each algorithm to evaluate result quality; these two indicators reflect the central tendency and the dispersion of the experimental data, respectively. Therefore, this paper uses these two indicators to verify the effectiveness of the improved algorithm.

Finally, we obtained the results of GA, CSA, and FCSA over 1000 generations and plotted them as line charts. The purpose is to analyze the accuracy and speed of each algorithm by characterizing the relationship between generation and algorithm result.

The algorithm parameters are set as shown in Table 1, and the execution environment of the algorithms is shown in Table 2.

4.1. Test Functions. The test functions selected in this paper are shown in Table 3. Their common feature is that each has a global minimum [25] and a complex landscape with multiple local minima; correspondingly, the negative of each test function has a global maximum.

Consider the test functions in Table 3. Since the antibodies selected when comparing antigen-antibody affinities are those with high affinity, this paper takes the negative of each test function as the optimization objective; the global maximum of the negated function is then equivalent to the global minimum of the original test function in Table 3.
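For instance, the Ackley function f1 from Table 3 can be negated to serve as the affinity of equation (1); the code below is a small self-contained illustration of this convention.

import math

def ackley(x):
    # Ackley function f1 from Table 3; global minimum 0 at the origin.
    d = len(x)
    s1 = sum(xi * xi for xi in x) / d
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / d
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def affinity(ab):
    # Maximizing the negated test function minimizes the original one.
    return -ackley(ab)

print(affinity([0.0, 0.0]))  # ~0.0, the global maximum of the negated function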

4.2. Experimental Results. The results of our experiment are shown in Table 4. The closer the average optimal solution obtained by an algorithm is to the reference value, the higher the accuracy of the algorithm under the given termination condition. Because the test functions f_1, f_6, f_8, f_9, and f_12 admit high-dimensional solutions, we set d = 50 and d = 100 to verify the convergence of FCSA on high-dimensional test functions and compared the results with ECSA and BCSA, as shown in Tables 5 and 6.

After replacing the updating operator in BCSA with the forgetting operator, we again set d = 50 and d = 100 to test the updating ability of the forgetting operator against the original updating operator in different dimensions.

FCSA
Input: N (the size of the population), n (the number of antibodies selected for cloning), nc (the number of clones), m (the degree of variation), c (Rac1 protein activity threshold)
Output: the best antibody

(1) Begin
(2) Randomly generate N antibodies to form the initial candidate set Ab
(3) while not meet algorithm termination conditions do
(4)   Calculate the affinity Aff_abi of each antibody abi for the antigen in the candidate set and record the antibody survival time T_abi
(5)   Sort the antibodies in the candidate set according to their affinity and put the best n antibodies into the antibody set Abs
(6)   for abi in Abs
(7)     Update the appropriate memory strength of antibody abi: S_abi <- S_abi + 1. See CLONING METHOD: clone antibody abi according to nc and Aff_abi, and put all antibodies obtained by cloning into the antibody set Abc
(8)   end for
(9)   for abi in Abc
(10)    See VARIATION METHOD: mutate abi according to the degree of variation m and the affinity Aff_abi of the antibody for the antigen
(11)    if antibody abi is a variant antibody
(12)      Reset the abi survival time T_abi <- 0 and the appropriate memory strength S_abi <- 1
(13)    end if
(14)  end for
(15)  Select the N antibodies with the highest antigen affinity in Abc and Ab to replace the N antibodies in Ab
(16)  See FORGETTING METHOD: calculate the Rac1 protein activity of each antibody in Ab as the ratio of T_abi to S_abi
(17)  if the Rac1 protein activity of antibody abi > threshold c
(18)    forget the antibody abi
(19)  end if
(20) end while
(21) Choose the best antibody as the final output

ALGORITHM 1: Forgetting-based clonal selection algorithm.
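The listing translates almost line for line into Python. The sketch below is one reading of Algorithm 1 under assumed parameter values (clone count, variation range, domain bounds); it tracks survival time T and appropriate memory strength S alongside each antibody and inlines simplified versions of the cloning, variation, and forgetting methods of Section 3.3.

import math
import random

def fcsa(affinity, dim, N=50, n=10, r=2.0, a=0.5, c=3.0,
         generations=1000, lo=-5.0, hi=5.0):
    new = lambda: [random.uniform(lo, hi) for _ in range(dim)]
    # Each entry: [antibody, survival time T, appropriate memory strength S].
    pop = [[new(), 0, 1] for _ in range(N)]                        # lines (1)-(2)
    for _ in range(generations):                                   # line (3)
        for entry in pop:
            entry[1] += 1                                          # line (4): record T
        pop.sort(key=lambda e: affinity(e[0]), reverse=True)       # line (5)
        best = pop[:n]
        max_aff = affinity(pop[0][0]) or 1e-12                     # guard against /0
        clones = []
        for entry in best:
            entry[2] += 1                                          # line (7): S += 1
            for _ in range(5):                                     # assumed clone count nc
                ab = list(entry[0])
                if random.random() < math.exp(-r * affinity(ab) / max_aff):
                    ab = [x + random.uniform(-a, a) for x in ab]   # lines (9)-(10)
                    clones.append([ab, 0, 1])                      # lines (11)-(13)
                else:
                    clones.append([ab, entry[1], entry[2]])
        pop = sorted(pop + clones, key=lambda e: affinity(e[0]),
                     reverse=True)[:N]                             # line (15)
        for entry in pop:
            if entry[1] / entry[2] > c:                            # lines (16)-(17)
                entry[0], entry[1], entry[2] = new(), 0, 1         # line (18): forget
    return max(pop, key=lambda e: affinity(e[0]))[0]               # line (21)

# Example: best = fcsa(lambda ab: -sum(x * x for x in ab), dim=2)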

Table 1: Initialization parameters.

Algorithm parameter    GA    CSA   FCSA   ECSA   BCSA
Cross rate             0.5   —     —      —      —
Mutation rate          0.13  2     2      2      2
Initial clone number   —     5     5      5      5
Rac1 threshold         —     —     3      —      —

Table 2: Execution environment.

OS                Windows 10 Professional Edition
CPU               Intel(R) Core(TM) i3-3217U CPU, 1.80 GHz
RAM               12.0 GB
Compiler version  Python 3.6


Table 3: Test functions.

Ackley Function: $f_1(x) = -20\exp\left(-0.2\sqrt{(1/d)\sum_{i=1}^{d} x_i^2}\right) - \exp\left((1/d)\sum_{i=1}^{d}\cos(2\pi x_i)\right) + 20 + \exp(1)$; optimum 0.
Bukin Function N.6: $f_2(x) = 100\sqrt{|x_2 - 0.01x_1^2|} + 0.01|x_1 + 10|$; optimum 0.
Cross-in-Tray Function: $f_3(x) = -0.0001\left(\left|\sin x_1 \sin x_2 \exp\left(\left|100 - \sqrt{x_1^2 + x_2^2}/\pi\right|\right)\right| + 1\right)^{0.1}$; optimum $-2.06261$.
Drop-Wave Function: $f_4(x) = -\left(1 + \cos\left(12\sqrt{x_1^2 + x_2^2}\right)\right)/\left(0.5(x_1^2 + x_2^2) + 2\right)$; optimum $-1$.
Eggholder Function: $f_5(x) = -(x_2 + 47)\sin\left(\sqrt{|x_2 + (x_1/2) + 47|}\right) - x_1\sin\left(\sqrt{|x_1 - x_2 - 47|}\right)$; optimum $-959.6407$.
Griewank Function: $f_6(x) = \sum_{i=1}^{d} x_i^2/4000 - \prod_{i=1}^{d}\cos(x_i/\sqrt{i}) + 1$; optimum 0.
Holder Table Function: $f_7(x) = -\left|\sin(x_1)\cos(x_2)\exp\left(\left|1 - \sqrt{x_1^2 + x_2^2}/\pi\right|\right)\right|$; optimum $-19.2085$.
Levy Function: $f_8(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1}(w_i - 1)^2[1 + 10\sin^2(\pi w_i + 1)] + (w_d - 1)^2[1 + \sin^2(2\pi w_d)]$, where $w_i = 1 + (x_i - 1)/4$ for all $i = 1, \ldots, d$; optimum 0.
Rastrigin Function: $f_9(x) = 10d + \sum_{i=1}^{d}[x_i^2 - 10\cos(2\pi x_i)]$; optimum 0.
Schaffer Function N.2: $f_{10}(x) = 0.5 + \left(\sin^2(x_1^2 - x_2^2) - 0.5\right)/\left[1 + 0.001(x_1^2 + x_2^2)\right]^2$; optimum 0.
Schaffer Function N.4: $f_{11}(x) = 0.5 + \left(\cos(\sin(|x_1^2 - x_2^2|)) - 0.5\right)/\left[1 + 0.001(x_1^2 + x_2^2)\right]^2$; optimum 0.5.
Schwefel Function: $f_{12}(x) = 418.9829d - \sum_{i=1}^{d} x_i\sin\left(\sqrt{|x_i|}\right)$; optimum 0.
Shubert Function: $f_{13}(x) = \left(\sum_{i=1}^{5} i\cos((i+1)x_1 + i)\right)\left(\sum_{i=1}^{5} i\cos((i+1)x_2 + i)\right)$; optimum $-186.7309$.

Finally, the comparison of the three algorithms GA, CSA, and FCSA at the 1000th generation is shown in Figure 3. The abscissa represents the current generation number of the algorithm, and the ordinate represents the current result of the algorithm; the dashed line represents GA, the thin solid line represents CSA, and the remaining solid line represents FCSA.

In the test results of the Ackley Function, over 100 executions of the algorithms, the optimal interval of FCSA is (−1, 0), the optimal interval of CSA is (−1.5, 0), and the optimal interval of GA is (−2.5, 0). The degree of convergence of GA is worse than that of CSA and FCSA, and FCSA has the best degree of convergence.

Among the test results of the Bukin Function N.6, Drop-Wave Function, Rastrigin Function, and Schaffer Function N.4, FCSA has the best optimization accuracy and convergence stability, while GA has poorer optimization accuracy and convergence than CSA.

In the test results of the Cross-in-Tray Function, Holder Table Function, Levy Function, and Shubert Function, both CSA and FCSA converge stably to the global optimum, with FCSA achieving the best average search accuracy and stability, while GA still produces deviation points and does not converge stably.

Table 4: Results of GA, CSA, and FCSA when D = 2.

ALGs   GA Mean         GA Std         CSA Mean        CSA Std        FCSA Mean       FCSA Std
f1     −5.0923e−001    5.1699e−001    −4.1302e−001    3.4025e−001    −1.9901e−001    2.0546e−001
f2     −7.3939e+000    4.7669e+000    −1.2512e+000    6.3021e−001    −9.7695e−001    5.3091e−001
f3     2.0601e+000     5.0685e−003    2.06255e+000    6.1577e−005    2.06258e+000    2.9248e−005
f4     9.2563e−001     4.3546e−002    9.6047e−001     2.3679e−002    9.7154e−001     2.0048e−002
f5     9.3533e+002     1.4155e−002    9.5836e+002     1.7805e+000    9.5926e+002     6.3550e−001
f6     −1.6845e−002    2.7008e−002    −2.5327e−002    1.5556e−002    −2.3604e−002    1.3131e−002
f7     1.8907e+001     3.8513e−001    1.9203e+001     6.1897e−003    1.9206e+001     3.1154e−003
f8     −9.7077e−003    2.2512e−002    −7.2359e−004    7.3241e−004    −3.3312e−004    3.5434e−004
f9     −1.4031e+000    9.6354e−001    −1.5528e−001    1.5445e−001    −7.9310e−002    7.6989e−002
f10    −2.0083e−004    6.4135e−004    −2.7670e−003    3.2928e−003    −1.1855e−003    1.6805e−003
f11    −5.00096e−001   3.4944e−006    −5.00094e−001   1.7190e−006    −5.00094e−001   1.5678e−006
f12    −2.5903e−003    4.8844e−003    −4.4234e−002    4.5491e−002    −3.0112e−002    3.1088e−002
f13    1.6615e+002     2.3853e+001    1.8632e+002     3.9038e−001    1.8649e+002     2.9016e−001

Table 5: Results of BCSA, ECSA, and FCSA when D = 50.

ALGs   BCSA Mean       BCSA Std       ECSA Mean       ECSA Std       FCSA Mean       FCSA Std
f1     −1.2838e+001    2.3931e−001    −2.0213e+001    1.1482e−001    −1.2749e+001    2.3649e−001
f6     −7.5633e+002    3.3740e+001    −7.4943e+002    3.4107e+001    −7.4891e+002    3.6480e+001
f8     −2.5742e+002    1.7774e+001    −2.5256e+002    1.9350e+001    −2.4812e+002    1.6304e+001
f9     −6.3444e+002    1.9519e+001    −6.3492e−001    1.9929e+001    −6.2661e+002    1.6450e+001
f12    −1.4640e+004    3.1504e+002    −1.4646e+004    3.0211e+002    −1.4721e+004    2.7377e+002

Table 6: Results of BCSA, ECSA, and FCSA when D = 100.

ALGs   BCSA Mean       BCSA Std       ECSA Mean       ECSA Std       FCSA Mean       FCSA Std
f1     −1.3768e+001    1.1996e−001    −1.3756e+001    1.2061e−001    −1.3697e+001    1.1556e−001
f6     −1.9237e+003    5.7814e+001    −1.9198e+003    6.4516e+001    −1.9070e+003    6.5405e+001
f8     −7.1533e+002    2.5226e+001    −7.0166e+002    4.0096e+001    −6.9408e+002    2.7792e+001
f9     −1.4416e+003    2.6102e+001    −1.4370e+003    3.2709e+001    −1.4283e+003    2.2737e+001
f12    −3.2839e+004    4.5766e+002    −3.2866e+004    3.7953e+002    −3.2949e+004    3.6880e+002

[Figure 3: Results of GA, CSA, and FCSA when D = 2 and ALGs = f_1. Line chart titled "1000 gen result when ALGS = f_1, D = 2"; abscissa: generation (1 to 989); ordinate: result (0 to 25); series: CSA, GA, FCSA.]

According to the test results of the Eggholder Function, the convergence stability of CSA and FCSA is worse than that of GA, but their optimization accuracy is better than GA's.

In the test results of the Griewank Function, the optimization accuracy of CSA and FCSA is better than that of GA, and GA shows a few deviation points as well as poor convergence stability. Among the improved CSAs, FCSA has the best stability when d = 50 and the best optimization accuracy when d = 100.

In the test results of the Schaffer Function N.2 and the Schwefel Function, GA has better convergence stability and optimization accuracy than CSA and FCSA. For the Schwefel Function, FCSA shows more stable convergence in higher dimensions, while BCSA shows more accurate convergence in higher dimensions.

As can be seen from the experimental results in Table 7, after the forgetting mechanism is introduced into the BCSA algorithm, the mean and standard deviation of the optimal solutions on the test functions f_1, f_6, f_8, and f_9 are both better than the results of the original BCSA in Tables 5 and 6.

On the other hand, it can be seen from Figure 3 that the clonal selection algorithms converge at around 300 generations and come very close to the global minimum of the function (which is 0). In particular, the improved algorithm proposed in this paper is more accurate than the comparison algorithms and consistently finds better results at the same generation.

4.3. Experiment Analysis. From the experimental data obtained in Section 4.2 by running GA, CSA, and FCSA on the 13 test functions to find the optimal solution (global maximum), the following can be seen. On the Ackley Function, Bukin Function, Eggholder Function, Levy Function, Rastrigin Function, Shubert Function, and others, the proposed algorithm, run in the same initial experimental environment as CSA and GA, converges stably and reliably with higher optimization precision. GA can converge stably only on the Schaffer Function N.2 and the Schwefel Function; it cannot converge stably on the other 11 test functions and easily falls into local optima.

Overall, the experimental results show that FCSA has higher optimization accuracy and stability than CSA and, on most test functions, higher optimization accuracy and convergence stability than GA.

The high-dimensional experiments on BCSA, ECSA, and FCSA show that FCSA holds an advantage over ECSA and BCSA in terms of convergence stability and accuracy. Due to the characteristics of the test functions themselves, the higher the dimension, the more complex the function landscape, which decreases the optimization accuracy and stability of every algorithm.

By applying the forgetting mechanism to BCSA, the number of antibodies to be replaced is no longer fixed by manual definition but is instead determined by the attributes of the antibodies themselves. The forgetting mechanism has a positive effect on improving the convergence speed and convergence stability of such algorithms.

5. Conclusion

To solve the problem that the CSA cannot eliminate, in a timely way, antibodies that are not adapted to the new environment and therefore form an antibody black hole, we change the receptor editing mechanism of the clonal selection algorithm to a new forgetting mechanism, so that the antibody candidate set is replaced and updated under the regulation of Rac1 protein activity. Experiments show that FCSA is an effective improved algorithm compared with ECSA and BCSA in terms of optimization efficiency, optimization accuracy, and convergence stability.

Because FCSA changes the replacement step of the current clonal selection algorithm, it outperforms the existing improved clonal selection algorithms. However, as the experimental performance on high-dimensional test functions shows, FCSA still suffers from low optimization precision there. In the future, FCSA will be combined with existing improved clonal selection algorithms to further improve the precision and stability of high-dimensional optimization.

We also note that Luo et al. [26] proposed a clonal selection method for dynamic multimodal optimization problems and proved its effectiveness. When the global peaks of a problem change with time, how to use the forgetting mechanism to quickly adapt to the new heights of the global peaks and forget outdated experience in time will be our future research direction.

At the same time, as an effective updating mechanism, the forgetting mechanism can also be applied to other heuristic algorithms that need to update their populations.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Table 7: Results of the BCSA algorithm combined with the forgetting mechanism when D = 50, 100.

ALGs   D = 50 Mean     D = 50 Std     D = 100 Mean    D = 100 Std
f1     −8.5121e+000    1.4299e−001    −9.1801e+000    9.4149e−002
f6     −7.5371e+002    3.7047e+001    −1.9109e+003    7.2792e+001
f8     −2.4475e+002    1.9376e+001    −6.8859e+002    3.1488e+001
f9     −6.2326e+002    1.9837e+001    −1.4202e+003    3.0524e+001
f12    −1.4873e+004    2.7891e+002    −3.3390e+004    3.6963e+002


Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61977021), the National Social Science Fund (15CGL074), the Intelligent Information Processing and Real-Time Industrial System Hubei Provincial Key Laboratory Open Fund Project (znxx2018MS05), and the Open Project of the Hubei Key Laboratory of Applied Mathematics (HBAM201902).

References

[1] W. Zhang, K. Gao, W. Zhang, X. Wang, Q. Zhang, and H. Wang, "A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization," Applied Intelligence, vol. 49, no. 2, pp. 819–836, 2019.

[2] G. Yang, H. Jin, and X. Zhu, "Optimization algorithm based on differential evolution and clonal selection mechanism," Computer Engineering and Applications, vol. 49, no. 10, pp. 50–52, 2013.

[3] J. Kim and P. Bentley, "Immune memory and gene library evolution in the dynamic clonal selection algorithm," Genetic Programming and Evolvable Machines, vol. 5, no. 4, pp. 361–391, 2004.

[4] D. Zhang, K. Wang, Y. Long, and X. Zhao, "Multi-objective clonal selection algorithm applying to biomass power plant location model," Journal of Geomatics, vol. 43, no. 2, pp. 19–23, 2018.

[5] M. Rafiei, T. Niknam, and M. H. Khooban, "Probabilistic electricity price forecasting by improved clonal selection algorithm and wavelet preprocessing," Neural Computing and Applications, vol. 28, no. 12, pp. 1–13, 2016.

[6] G. Lou and Z. Cai, "Improved hybrid immune clonal selection genetic algorithm and its application in hybrid shop scheduling," Cluster Computing, vol. 22, no. S2, pp. 3419–3429, 2019.

[7] Y. Jing and Z. Zhang, "A study on car flow organization in the loading end of heavy haul railway based on immune clonal selection algorithm," Neural Computing and Applications, vol. 31, no. 5, pp. 1455–1465, 2019.

[8] Z. Zareizadeh, M. S. Helfroush, A. Rahideh, and K. Kazemi, "A robust gene clustering algorithm based on clonal selection in multiobjective optimization framework," Expert Systems with Applications, vol. 113, no. 15, pp. 301–314, 2018.

[9] S. Kamada and T. Ichimura, "A generation method of immunological memory in clonal selection algorithm by using restricted Boltzmann machines," in Proceedings of the IEEE International Conference on Systems, October 2016.

[10] S. Mohapatra, P. M. Khilar, and R. Ranjan Swain, "Fault diagnosis in wireless sensor network using clonal selection principle and probabilistic neural network approach," International Journal of Communication Systems, vol. 32, no. 16, p. e4138, 2019.

[11] C. Yavuz Burcu, Y. Nilufer, and O. Ozkan, "Prediction of protein secondary structure with clonal selection algorithm and multilayer perceptron," IEEE Access, vol. 6, pp. 45256–45261, 2018.

[12] W. Luo and X. Lin, "Recent advances in clonal selection algorithms and applications," in Proceedings of the IEEE Symposium Series on Computational Intelligence, November 2018.

[13] A. Gálvez, A. Iglesias, A. Avila, C. Otero, R. Arias, and C. Manchado, "Elitist clonal selection algorithm for optimal choice of free knots in B-spline data fitting," Applied Soft Computing, vol. 26, pp. 90–106, 2015.

[14] M. Gong, L. Jiao, and L. Zhang, "Baldwinian learning in clonal selection algorithm for optimization," Information Sciences, vol. 180, no. 8, pp. 1218–1236, 2010.

[15] B. S. Rao and K. Vaisakh, "Multi-objective adaptive clonal selection algorithm for solving optimal power flow considering multi-type FACTS devices and load uncertainty," Applied Soft Computing, vol. 23, pp. 286–297, 2014.

[16] L. Hong, C. L. Gong, J. Z. Wang, and Z. C. Ji, "Convergence rate estimation of elitist clonal selection algorithm," Acta Electronica Sinica, vol. 43, no. 5, pp. 916–921, 2015, in Chinese.

[17] H. Ebbinghaus, Memory, Columbia University, New York, NY, USA, 1913.

[18] T. J. Ricker, E. Vergauwe, and N. Cowan, "Decay theory of immediate memory from Brown (1958) to today (2014)," Quarterly Journal of Experimental Psychology, vol. 69, no. 10, pp. 1969–1995, 2016.

[19] M. Anderson, "Rethinking interference theory: executive control and the mechanisms of forgetting," Journal of Memory and Language, vol. 49, no. 4, pp. 415–445, 2003.

[20] Y. Shuai, B. Lu, Y. Hu, L. Wang, K. Sun, and Y. Zhong, "Forgetting is regulated through Rac activity in Drosophila," Cell, vol. 140, no. 4, pp. 579–589, 2010.

[21] Y. Liu, S. Du, L. Lv et al., "Hippocampal activation of Rac1 regulates the forgetting of object recognition memory," Current Biology, vol. 26, no. 17, pp. 2351–2357, 2016.

[22] T. Tully, S. Boynton, C. Brandes et al., "Genetic dissection of memory formation in Drosophila melanogaster," Cold Spring Harbor Symposia on Quantitative Biology, vol. 55, no. 1, pp. 203–211, 1990.

[23] L. N. De Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle," IEEE Transactions on Evolutionary Computation, vol. 6, no. 3, pp. 239–251, 2002.

[24] B. Ulutas and A. A. Islier, "Dynamic facility layout problem in footwear industry," Journal of Manufacturing Systems, vol. 36, pp. 55–61, 2015.

[25] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," 2018, http://www.sfu.ca/~ssurjano/optimization.html.

[26] W. Luo, X. Lin, T. Zhu, and P. Xu, "A clonal selection algorithm for dynamic multimodal function optimization," Swarm and Evolutionary Computation, vol. 50, 2019.

10 Complexity

Page 2: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

theoretical analysis Hong et al [16] analyzed the conver-gence of elitist clonal selection algorithm from the per-spective of the state transition probability matrix Markovchain and other random process-related theories andproposed a method to evaluate the convergence property ofthe algorithm which has a certain reference value isalgorithm is better than the genetic algorithm in terms ofautomatic node adjustment of B-spline curves

It can be seen from the above literature that the currentclonal selection algorithm flow mainly includes steps ofselection cloning mutation reelection and replacement Atpresent a large number of improved algorithms areintended to improve the steps of selection cloning andmutation without considering the improvement and opti-mization of the replacement and update steps of antibody-concentrated antibodies by clonal selection algorithm Inorder to improve the update efficiency of the antibody setthis paper tries to find a new scheme to replace the re-placement steps used in the current clonal selection algo-rithm so as to improve the overall search accuracy andconvergence stability of the algorithm

e antibody candidate set is a set of antibodies pro-duced by the clonal selection algorithm during initializatione clonal selection algorithm sorts the affinity of the an-tibodies in the set and selects the first n antibodies with highaffinity to clone and mutate Since n depends on manualexperience it is generally set to 10 to 50 of the populationsize [12] this selection mechanism inevitably leads to a largenumber of antibodies with general affinity ldquoretentionrdquo in theantibody candidate set Some were replaced by new anti-bodies due to low affinity and the other part did not satisfythe conditions of being replaced and they remained in theantibody candidate set for a long time is was not con-ducive to the rapid update of the antibody set and affectedthe efficiency of the algorithm in finding the optimalsolution

is kind of antibody that stays in the candidate set ofantibodies for a long time can neither converge nor updatequickly jumping out of the local optimal interval ereforethere is a phenomenon in which antibodies cannot be se-lected and update iterations in a short period of time whichis vividly called ldquoblack holerdquo in this paper

In this paper an improved clonal selection algorithmbased on the mechanism of biological forgetting (FCSA) isproposed Aiming at the black hole formed by antibodieswhose affinity does not meet the mutation and updateconditions the forgetting mechanism is applied to the an-tibody candidate set update process of the algorithm eaim is to update these inactive and long-lasting antibodies inthe candidate antibody set and enhance the diversity ofantibodies in the antibody set thereby increasing the con-vergence speed of the algorithm

2 Overview of the BiologicalForgetting Mechanism

21 Biological Forgetting Research Background In 1885Ebbinghaus [17] put forward the Ebbinghaus memory curvewhich described the amount of forgetting over time Over

the next 100 years numerous scholars have proposed the-ories to explain how forgetting occurs At present there arevarious opinions as to the causes of forgetting and thewidely supported explanations include decay theory [18] andinterference theory [19] Decay theory holds that the decayof forgetting which is similar to the decay of radioactiveelements is a process in which memories disappear spon-taneously over time According to the interference theoryforgetting is caused by the conflict between memory blockse similarity quantity and intensity of memory blocksdetermine the degree of forgetting e more similar thecontents between memory blocks are the more times theyare remembered and the higher the memory intensity is theless likely they are to be forgotten

22 Forgetting Mechanism Affected by Rac1 ActivityAlthough both attenuation theory and interference theoryare supported by experimental results they lack an objectivemechanism to explain why attenuation theory is only ap-plicable to long-term memory while interference theory ismore applicable to short-term memory

Shuai et al [20] studied the small G protein Rac1 in thebrain of drosophila and found that Rac1 activity affected thedegree of forgetting in a drosophila olfactory avoidanceexperiment Specifically after Rac1 activity was inhibited inthe brain of drosophila the forgetting rate slowed downafter Rac1 activity was enhanced by stimulation the for-getting rate increased Changes in Rac1 activity in drosophilafly brain during memory decay and memory interferencementioned in Section 21 of this paper also support the aboveconclusions

erefore Shuai et al [20] proposed a set of forgettingmechanisms affected by Rac1 activity e activity of Rac1would also change according to the memory process withdiffering time lengths When memory was disturbed for ashort time Rac1 activity also increased rapidly making theprevious outdated memory forgotten Rac1 takes longer toactivate when the time span is large suggesting that memorydeclines over time

Liu et alrsquos [21] studies on Rac1 protein activity in mousehippocampal neurons and object recognition memory inmice further support the role of Rac1 activation inforgetting

23 Observations about Biological Forgetting In the complexbiological system the process of forgetting often occursispaper argues that forgetting is a process of information lossand information loss is meaningful under certain circum-stances In reference [22] the odor type that would cause thefruit fly to receive an electric shock was repeatedly changedand the fruit fly would remember the odor that would re-ceive an electric shock the most recently so as to avoid thistype of odor is suggests that in complex external situa-tions memories that cannot be adapted to the current en-vironment need to be forgotten Above we suggest that theforgetting mechanism is meaningful for behaviors requiringrapid adaptation to the new environment such as cloningselection for a high-mutation environment

2 Complexity

At the same time the idea of attenuation theory of bi-ological forgetting is introduced into the replacementprocess of the clonal selection algorithm e number ofiterations of antibodies in the antibody candidate set is takenas the time length and whether the antibody participates inhigh-frequency mutation in a certain iteration is taken as thebasis of whether the antibody is remembered at the point intime so as to realize the purpose of replacing the antibodywith weak memory degree when the time span is largeMoreover since the current replacement mechanism ofclonal selection is still to replace the d antibodies with theworst affinity of antibodies the antibody can be dynamicallyreplaced according to its own characteristics after the in-troduction of attenuation theory

3 Clonal Selection Algorithm Based onForgetting Mechanism

31 Clonal Selection Algorithm and Forgetting Mechanism

311 Introduction to Clonal Selection Algorithm In 2002De Castro and Von Zuben [23] inspired by biological as wellas artificial immunology proposed a clonal selection algo-rithm to solve the problems of multimodal and combina-torial optimization by using the principle of clonal selectione main flow of this algorithm is

Step 1 Initialize the size of the antibody set iterationnumber cloning number and other relevant param-eters randomly select an antigen from the antigen setand generate the candidate antibody set which iscomposed of a memory set and residual setStep 2 Calculate the affinity between each antibody andantigen in the candidate antibody concentration andselect the first n antibodies with the highest affinityStep 3 Clone the n antibodies and the number ofantibody clones is positively correlated with their af-finity to the antigenStep 4 Carry out mutation treatment on the antibodyset generated after cloning and the higher the affinitythe lower the probability of antibody mutationStep 5 Calculate the antibody affinity after mutationselect the antibody with the highest affinity andcompare it with the antibody in the current memoryset en select the antibody with the highest affinityand put it into the memory setStep 6 Randomly select d new antibodies to replace thed antibodies with the worst affinity to the antigen in theremaining setStep 7 Skip to step 2 for the next iteration When thenumber of iterations meets the termination conditionthe algorithm terminates

It can be seen that the clonal selection algorithm selects the first n antibodies with the highest affinity from the candidate antibody set in Step 2, and the subsequent steps primarily involve cloning and mutation of these n antibodies. The rest of the antibody candidate set does not participate in the current round of mutation, which leads to the following situation: antibody A does not belong to the first n antibodies with the highest affinity in one iteration of the algorithm, and it is not replaced in Step 6. In subsequent iterations, antibody A never becomes one of the d antibodies with the lowest affinity to the antigen. In this case, antibody A can no longer adapt to new changes, cannot participate in the high-mutation process of the algorithm, and cannot be replaced in time as one of the worst d antibodies, thus affecting the update rate of antibodies in the whole candidate antibody set.

3.1.2. Clonal Selection Algorithm Inspired by Forgetting Mechanism. In order to explain more intuitively the state of the candidate antibodies in a given round of iteration, the affinity between the antigen and an antibody is expressed in the form of distance. As described in the previous section, after all antibodies are sorted according to their affinity, the clonal selection algorithm selects the n best antibodies in the antibody candidate set for cloning and mutation. As shown in Figure 1, in the reselect step, the original antibodies and the mutated antibodies are mixed and screened for the next iteration. At this time, there are antibodies that were not selected as clones in the previous round. We divide the best n antibodies into the first layer and the worst d antibodies among the remaining antibodies into the third layer; the second layer then contains the intermediate antibodies.

Since the total number P of antibodies remains unchanged after the completion of one iteration of the CSA, all of the antibodies in the third layer will be replaced with new antibodies, and the antibodies in the candidate set will be reordered according to their affinity with the antigen, so some of the antibodies in the entire candidate set will migrate. In the first layer, a part of the antibodies will migrate to layer II. In addition, some of the antibodies in layer II migrate to layer III. Consider the layer II antibodies at this point: they include antibodies that migrated into layers I and III as well as the original antibodies that did not migrate out of layer II. After multiple rounds of iteration, there are still some unmigrated native antibodies in layer II, which have neither migrated to layer I to participate in the algorithm's high-frequency mutations nor been replaced with new antibodies in layer III. We suggest that these unmigrated antibodies that persist in layer II are a memory that cannot adapt to the current environment and should be updated along with the antibodies in layer III.

Inspired by the forgetting mechanism, for each antibody in the candidate antibody set we calculate the number of times it has been selected among the top n affinities and the number of iterations it has spent in the candidate set, and we use these two values as antibody characteristics. These characteristics serve as the basis for the change of Rac1 activity, and the degree of Rac1 activation determines whether to eliminate a "stranded" antibody from the candidate antibody set.

As shown in Figure 2, antibodies in the candidate set are divided into two layers according to affinity: layer I contains the best n antibodies, and layer II contains all remaining antibodies.


For each antibody in the candidate set, the Rac1 activity of an antibody in layer I is significantly lower than that of one in layer II. When the Rac1 activity of a layer II antibody exceeds the activity threshold, the antibody is replaced and the entire antibody candidate set is updated. In this way, the clonal selection algorithm avoids the antibody "black hole" formed by the partially unmigrated original antibodies in layer II.

3.2. Abstract Definition of Forgetting Mechanism. The following definitions relate to the forgetting mechanism:

(1) Antibody survival time is the number of iterations that an antibody has participated in within the antibody candidate set.

(2) Time T is the execution time of the clonal selection algorithm. In this paper, T refers to the number of algorithm iterations.

(3) Appropriate memory is an attribute of each candidate antibody. In an iteration, if the antibody belongs to the best n antibodies, this is regarded as a suitable memory.

(4) Appropriate memory strength is the appropriate memory accumulated by a candidate antibody during time T.

(5) Rac1 protein activity is the index affecting antibody forgetting, determined by the survival time of an antibody in the candidate antibody set and its appropriate memory strength. Rac1 protein activity is proportional to the survival time of the antibody and inversely proportional to its appropriate memory strength.
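Read together, definitions (1)-(5) amount to two counters per antibody plus one ratio. Below is a minimal sketch, assuming the ratio form used in line (16) of Algorithm 1 (survival time over appropriate memory strength); the class and field names are illustrative, not from the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Antibody:
    genes: List[float]
    survival_time: int = 0     # definition (1): iterations spent in the candidate set
    memory_strength: int = 1   # definition (4): accumulated appropriate memories

    def rac1_activity(self) -> float:
        # Definition (5): proportional to survival time, inversely
        # proportional to appropriate memory strength.
        return self.survival_time / self.memory_strength

# An antibody that survived 10 iterations but entered the best n only twice
# has activity 10 / 2 = 5.0 and would be forgotten under a threshold c = 3.
```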

3.3. Improved Clonal Selection Algorithm. In this paper, an improved clonal selection algorithm (FCSA) inspired by the forgetting mechanism is proposed. Its core idea is to replace the receptor editing mechanism [24] in the CSA with a forgetting mechanism.

The specific implementation is as follows. In each iteration of the algorithm, the appropriate memory strength and survival time of each antibody in the candidate set are recorded. After several iterations of the algorithm, antibody forgetting is determined based on whether Rac1 protein activity has reached the threshold.

[Figure 1: Antigen and antibody distribution structure. Antibodies are arranged around the antigen by affinity (distance) and partitioned into layers I, II, and III.]

[Figure 2: Antigen-antibody distribution structure affected by Rac1, with antibodies partitioned into layers I and II.]



3.3.1. Affinity Calculation. To simplify the calculation, the affinity of an antibody to the antigen is the value of the target test function, which can be expressed as

\mathrm{Aff}_{ab_i} = f(x_1, x_2, \ldots, x_D) \qquad (1)

where $\{x_i \mid i = 1, 2, \ldots, D\}$ is the antibody and D is the dimension of the antibody.
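As a runnable illustration of equation (1), the affinity can be taken as the negated value of a minimisation benchmark (the "opposite value" convention discussed in Section 4.1), here the Ackley function from Table 3; the negation step is our reading of that convention.

```python
import math

def ackley(x):
    d = len(x)
    term1 = -20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / d))
    term2 = -math.exp(sum(math.cos(2 * math.pi * v) for v in x) / d)
    return term1 + term2 + 20 + math.e

def affinity(antibody):
    # Equation (1): affinity is the (negated) test-function value, so that a
    # higher affinity corresponds to a better antibody.
    return -ackley(antibody)

print(affinity([0.0, 0.0]))   # ~0.0, the best attainable affinity
```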

3.3.2. Cloning Method. According to the affinity between the antibody and the antigen, the cloning method performs the cloning operation on the antibody. The higher the affinity, the greater the number of clones produced. The specific cloning formula is

Ab_c = \left\{ ab_{ij} \;\middle|\; i \in [0, \mathrm{population\_size} - 1],\ j = \max\left(1, \mathrm{int}\left(\left|\frac{\mathrm{Aff}_{ab_i} + a}{a \cdot \mathrm{max\_clone}}\right|\right)\right) \right\}, \quad a > 0 \qquad (2)

where population_size represents the number of antibodies in the candidate set, $\mathrm{Aff}_{ab_i}$ represents the affinity between antibody $ab_i$ and the antigen, max_clone represents the initial clone number, and j represents the clone number of antibody $ab_i$.
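A small sketch of equation (2), under the grouping reconstructed above (the absolute value of (Aff + a)/(a · max_clone), floored and clamped to at least one clone); the default parameter values are assumptions.

```python
def clone_count(aff, a=1.0, max_clone=5):
    # Equation (2): the clone number j grows with affinity and is never below 1.
    return max(1, int(abs((aff + a) / (a * max_clone))))

def clone(population, affinity, a=1.0, max_clone=5):
    clones = []
    for ab in population:
        j = clone_count(affinity(ab), a, max_clone)
        clones.extend(list(ab) for _ in range(j))   # j independent copies of ab
    return clones
```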

3.3.3. Variation Method. The mutation method operates on each cloned antibody $ab_i$ and determines the degree of mutation according to the affinity between the antibody and the antigen. The higher the affinity, the lower the probability and degree of mutation.

The specific variation formula is

ab = \{x_i \mid i = 1, 2, \ldots, D\}

x_i = \begin{cases} x_i + \mathrm{random}(-a, a), & \mathrm{random}(0, 1) < e^{-r \cdot (\mathrm{aff}_{ab} / \mathrm{max\_aff})} \\ x_i, & \mathrm{random}(0, 1) \ge e^{-r \cdot (\mathrm{aff}_{ab} / \mathrm{max\_aff})} \end{cases} \qquad (3)

where r is the mutation rate, a is the variation range with a > 0, and max_aff is the maximum affinity among the antibodies in the candidate set.
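A sketch of equation (3): each gene of a cloned antibody gets an independent chance to be perturbed within (-a, a), and that chance decays exponentially as the antibody's affinity approaches max_aff. This assumes positive affinity values so the exponent behaves as described; the defaults are illustrative.

```python
import math
import random

def mutate(ab, aff, max_aff, r=2.0, a=1.0):
    # High affinity -> exponent near -r -> low mutation probability.
    p_mut = math.exp(-r * aff / max_aff)
    return [x + random.uniform(-a, a) if random.random() < p_mut else x
            for x in ab]
```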

3.3.4. Forgetting Method. The forgetting method determines whether an antibody must be forgotten based on the survival time of the antibody, its appropriate memory strength, and the activity of the Rac1 protein.

The specific forgetting formula is

ab = \{x_i \mid i = 1, 2, \ldots, D\}

x_i = \begin{cases} \mathrm{random}(\mathrm{minDomain}, \mathrm{maxDomain}), & ab_{\mathrm{time}} / ab_{\mathrm{strength}} > c \\ x_i, & \text{otherwise} \end{cases} \qquad (4)

where ab_time is the antibody survival time, ab_strength is the appropriate memory strength, and c is the Rac1 protein activity threshold.
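Equation (4) is the operational core of the forgetting mechanism: when the survival-time-to-memory-strength ratio (the Rac1 activity) exceeds the threshold c, the whole antibody is reinitialised uniformly over the search domain. A direct sketch, with assumed domain bounds:

```python
import random

def forget(ab, ab_time, ab_strength, c=3.0, min_domain=-5.0, max_domain=5.0):
    if ab_time / ab_strength > c:                   # Rac1 activity over threshold
        return [random.uniform(min_domain, max_domain) for _ in ab]
    return ab                                       # otherwise keep the antibody
```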

3.3.5. Algorithm Flow. The flow of the improved algorithm proposed in this paper is shown in Algorithm 1.

The termination conditions of Algorithm 1 can be determined according to specific needs. Common termination conditions include reaching the maximum number of function evaluations and reaching the maximum number of generations.

In the algorithm, Rac1 protein activity is an inherent property of each candidate antibody: it is initialized from antibody survival time and appropriate memory strength when the antibody is first selected into the candidate set, and it changes dynamically as the algorithm executes. When the property value of an antibody reaches the threshold, the antibody has not mutated in a better direction within the expected time and is not sufficiently competitive with the other candidate antibodies. Therefore, an antibody that meets the threshold undergoes the forgetting operation.

4. Experiment and Algorithm Evaluation

Thirteen CEC test functions were selected as experimental test functions, and the search accuracy of the proposed algorithm (FCSA), the elitist clonal selection algorithm (ECSA) proposed in [13], the clonal selection algorithm with Baldwinian learning (BCSA) proposed in [14], and the genetic algorithm (GA) was tested on them. The experimental steps are as follows.

First, initialize the various parameters of each algorithm. The termination criterion in this experiment is to run the GA, BCSA, ECSA, and FCSA until the number of function evaluations reaches the maximum value of 350,000.

Second, find the optimal solution of each test function. The algorithms were executed repeatedly to obtain the optimal solution generated by each run, and the average, maximum, and minimum optimal solutions over 100 executions were analyzed.

The purpose of the experiment in this paper is to analyze the effectiveness of the forgetting mechanism applied to the clonal selection algorithm. The performance of an algorithm is mainly evaluated by the quality of the results obtained under identical termination conditions. This article reports the mean and standard deviation of the results of multiple runs of each algorithm to evaluate the quality of the results. These two indicators reflect the central tendency and the degree of dispersion of the experimental data, respectively; therefore, this paper uses them to verify the effectiveness of the improved algorithm.

Finally, we obtained the results of GA, CSA, and FCSA over 1000 generations and plotted them as line charts. The purpose is to analyze the accuracy and speed of each algorithm by characterizing the relationship between generations and algorithm results.

The algorithm parameters are set as shown in Table 1, and the execution environment of the algorithms is shown in Table 2.

4.1. Test Function. The test functions selected in this paper are shown in Table 3. Their common feature is that they have a global minimum value [25] and a complex function landscape with multiple local minima.

Consider the test functions in Table 3. Since antibodies with high affinity are generally selected when comparing the affinity of antigens and antibodies, this paper takes the opposite (negated) value of each test function as the objective; its global maximum is equivalent to the global minimum of the original test function in Table 3.

4.2. Experimental Results. The results of our experiment are shown in Table 4. The closer the average optimal solution obtained by an algorithm is to the reference value, the higher the accuracy of the algorithm under the termination condition. For the test functions f_1, f_6, f_8, f_9, and f_12, which have high-dimensional solutions, in order to verify the convergence of the FCSA on high-dimensional test functions, we set d = 50 and d = 100 and compare the results with ECSA and BCSA, as shown in Tables 5 and 6.

After replacing the updating operator in BCSA with the forgetting operator, we set d = 50 and d = 100 to test the updating ability of the forgetting operator against the previous updating operator in different dimensions.

FCSA
Input: N (the size of the population), n (the number of antibodies selected for cloning), nc (the number of clones), m (the degree of variation), c (Rac1 protein activity threshold)
Output: the best antibody
(1) Begin
(2) Randomly generate N antibodies to form the initial candidate set
(3) while not meet algorithm termination conditions do
(4)   Calculate the affinity Aff_abi of each antibody for the antigen in the candidate set and record antibody survival time T_abi
(5)   Sort the antibodies in the candidate set according to their affinity and put the best n antibodies into the antibody set Abs
(6)   for abi in Abs
(7)     Update the appropriate memory of antibody abi: S_abi + 1. See CLONING METHOD: clone antibody abi according to nc and Aff_abi, and put all antibodies obtained by cloning into antibody set Abc
(8)   end for
(9)   for abi in Abc
(10)    See VARIATION METHOD: mutate abi according to the degree of variation m and the affinity of the antibody for the antigen Aff_abi
(11)    if antibody abi is a variant antibody
(12)      The abi survival time T_abi = 0; the appropriate memory intensity S_abi = 1
(13)    end if
(14)  end for
(15)  Select the N antibodies with the highest antigen affinity in Abc and Ab to replace the N antibodies in Ab
(16)  See FORGETTING METHOD: calculate the Rac1 protein activity of each antibody in Ab according to the ratio of T_abi to S_abi
(17)  if antibody abi Rac1 protein activity > threshold
(18)    forget the antibody abi
(19)  end if
(20) end while
(21) Choose the best antibody as the final output

Algorithm 1: Forgetting-based clonal selection algorithm.
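Assembled end to end, Algorithm 1 could look like the sketch below. It is one possible reading, not the authors' code: the dictionary bookkeeping, the clone-count and mutation parameters, and the domain bounds are all assumptions, and the line numbers in the comments refer to Algorithm 1.

```python
import math
import random

def fcsa(affinity, dim, lo, hi, N=50, n=10, nc=5, m=1.0, r=2.0, c=3.0,
         generations=1000):
    new = lambda: [random.uniform(lo, hi) for _ in range(dim)]
    pop = [{"x": new(), "T": 0, "S": 1} for _ in range(N)]        # line (2)
    for _ in range(generations):                                  # line (3)
        for ab in pop:                                            # line (4)
            ab["T"] += 1
        pop.sort(key=lambda ab: affinity(ab["x"]), reverse=True)  # line (5)
        max_aff = abs(affinity(pop[0]["x"])) or 1.0
        clones = []
        for ab in pop[:n]:                                        # lines (6)-(8)
            ab["S"] += 1                                          # appropriate memory
            j = max(1, int(abs((affinity(ab["x"]) + 1.0) / nc)))  # CLONING METHOD, a = 1
            clones += [dict(ab, x=list(ab["x"])) for _ in range(j)]
        for ab in clones:                                         # lines (9)-(14)
            p = math.exp(-r * affinity(ab["x"]) / max_aff)        # VARIATION METHOD
            mutated = [x + random.uniform(-m, m) if random.random() < p else x
                       for x in ab["x"]]
            if mutated != ab["x"]:                                # reset variant state
                ab.update(x=mutated, T=0, S=1)
        pop = sorted(pop + clones, key=lambda ab: affinity(ab["x"]),
                     reverse=True)[:N]                            # line (15)
        for ab in pop:                                            # lines (16)-(19)
            if ab["T"] / ab["S"] > c:                             # FORGETTING METHOD
                ab.update(x=new(), T=0, S=1)
    return max(pop, key=lambda ab: affinity(ab["x"]))["x"]        # line (21)

# Example: fcsa(lambda x: -sum(v * v for v in x), dim=2, lo=-5.0, hi=5.0,
#               generations=200) should return a point near the origin.
```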

Table 1: Initialization parameters.

Algorithm parameter   | GA   | CSA | FCSA | ECSA | BCSA
Cross rate            | 0.5  | —   | —    | —    | —
Mutation rate         | 0.13 | 2   | 2    | 2    | 2
Initial clone number  | —    | 5   | 5    | 5    | 5
Rac1 threshold        | —    | —   | 3    | —    | —

Table 2: Execution environment.

OS               | Windows 10 Professional Edition
CPU              | Intel(R) Core(TM) i3-3217U CPU @ 1.80 GHz
RAM              | 12.0 GB
Compiler version | Python 3.6


Table 3: Test functions.

Test function | Expression | Optimum
Ackley Function | $f_1(x) = -20\exp(-0.2\sqrt{(1/d)\sum_{i=1}^{d} x_i^2}) - \exp((1/d)\sum_{i=1}^{d}\cos(2\pi x_i)) + 20 + \exp(1)$ | 0
Bukin Function N.6 | $f_2(x) = 100\sqrt{|x_2 - 0.01x_1^2|} + 0.01|x_1 + 10|$ | 0
Cross-in-Tray Function | $f_3(x) = -0.0001(|\sin x_1 \sin x_2 \exp(|100 - \sqrt{x_1^2 + x_2^2}/\pi|)| + 1)^{0.1}$ | -2.06261
Drop-Wave Function | $f_4(x) = -(1 + \cos(12\sqrt{x_1^2 + x_2^2})) / (0.5(x_1^2 + x_2^2) + 2)$ | -1
Eggholder Function | $f_5(x) = -(x_2 + 47)\sin(\sqrt{|x_2 + (x_1/2) + 47|}) - x_1\sin(\sqrt{|x_1 - x_2 - 47|})$ | -959.6407
Griewank Function | $f_6(x) = \sum_{i=1}^{d} x_i^2/4000 - \prod_{i=1}^{d}\cos(x_i/\sqrt{i}) + 1$ | 0
Holder Table Function | $f_7(x) = -|\sin(x_1)\cos(x_2)\exp(|1 - \sqrt{x_1^2 + x_2^2}/\pi|)|$ | -19.2085
Levy Function | $f_8(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1}(w_i - 1)^2[1 + 10\sin^2(\pi w_i + 1)] + (w_d - 1)^2[1 + \sin^2(2\pi w_d)]$, where $w_i = 1 + (x_i - 1)/4$ for all $i = 1, \ldots, d$ | 0
Rastrigin Function | $f_9(x) = 10d + \sum_{i=1}^{d}[x_i^2 - 10\cos(2\pi x_i)]$ | 0
Schaffer Function N.2 | $f_{10}(x) = 0.5 + (\sin^2(x_1^2 - x_2^2) - 0.5)/[1 + 0.001(x_1^2 + x_2^2)]^2$ | 0
Schaffer Function N.4 | $f_{11}(x) = 0.5 + (\cos(\sin(|x_1^2 - x_2^2|)) - 0.5)/[1 + 0.001(x_1^2 + x_2^2)]^2$ | 0.5
Schwefel Function | $f_{12}(x) = 418.9829d - \sum_{i=1}^{d} x_i\sin(\sqrt{|x_i|})$ | 0
Shubert Function | $f_{13}(x) = (\sum_{i=1}^{5} i\cos((i+1)x_1 + i))(\sum_{i=1}^{5} i\cos((i+1)x_2 + i))$ | -186.7309


Finally, the comparison results of the three algorithms GA, CSA, and FCSA over 1000 generations are shown in Figure 3. The abscissa represents the current generation number of the algorithm, and the ordinate represents the current result of the algorithm; the dashed line represents GA, the thin solid line represents CSA, and the remaining line represents FCSA.

In the test results of the Ackley Function, over 100 executions of the algorithms, the optimal interval of FCSA is (-1, 0), the optimal interval of CSA is (-1.5, 0), and the optimal interval of GA is (-2.5, 0). The convergence of the GA is worse than that of CSA and FCSA, and FCSA converges best.

Among the test results of the Bukin Function N.6, Drop-Wave Function, Rastrigin Function, and Schaffer Function N.4, the FCSA has the best optimization accuracy and convergence stability, while the GA has poor optimization accuracy and convergence compared with CSA.

In the test results of the Cross-in-Tray Function, Holder Table Function, Levy Function, and Shubert Function, the CSA and FCSA algorithms converge stably to the global optimum, among which FCSA has the best average search accuracy and stability, while the GA still has deviation points and its convergence is not stable.

Table 4: Results of GA, CSA, and FCSA when D = 2.

ALGs | GA Mean | GA Std | CSA Mean | CSA Std | FCSA Mean | FCSA Std
f1  | -5.0923e-001 | 5.1699e-001 | -4.1302e-001 | 3.4025e-001 | -1.9901e-001 | 2.0546e-001
f2  | -7.3939e+000 | 4.7669e+000 | -1.2512e+000 | 6.3021e-001 | -9.7695e-001 | 5.3091e-001
f3  | 2.0601e+000  | 5.0685e-003 | 2.06255e+000 | 6.1577e-005 | 2.06258e+000 | 2.9248e-005
f4  | 9.2563e-001  | 4.3546e-002 | 9.6047e-001  | 2.3679e-002 | 9.7154e-001  | 2.0048e-002
f5  | 9.3533e+002  | 1.4155e-002 | 9.5836e+002  | 1.7805e+000 | 9.5926e+002  | 6.3550e-001
f6  | -1.6845e-002 | 2.7008e-002 | -2.5327e-002 | 1.5556e-002 | -2.3604e-002 | 1.3131e-002
f7  | 1.8907e+001  | 3.8513e-001 | 1.9203e+001  | 6.1897e-003 | 1.9206e+001  | 3.1154e-003
f8  | -9.7077e-003 | 2.2512e-002 | -7.2359e-004 | 7.3241e-004 | -3.3312e-004 | 3.5434e-004
f9  | -1.4031e+000 | 9.6354e-001 | -1.5528e-001 | 1.5445e-001 | -7.9310e-002 | 7.6989e-002
f10 | -2.0083e-004 | 6.4135e-004 | -2.7670e-003 | 3.2928e-003 | -1.1855e-003 | 1.6805e-003
f11 | -5.00096e-001 | 3.4944e-006 | -5.00094e-001 | 1.7190e-006 | -5.00094e-001 | 1.5678e-006
f12 | -2.5903e-003 | 4.8844e-003 | -4.4234e-002 | 4.5491e-002 | -3.0112e-002 | 3.1088e-002
f13 | 1.6615e+002  | 2.3853e+001 | 1.8632e+002  | 3.9038e-001 | 1.8649e+002  | 2.9016e-001

Table 5: Results of BCSA, ECSA, and FCSA when D = 50.

ALGs | BCSA Mean | BCSA Std | ECSA Mean | ECSA Std | FCSA Mean | FCSA Std
f1  | -1.2838e+001 | 2.3931e-001 | -2.0213e+001 | 1.1482e-001 | -1.2749e+001 | 2.3649e-001
f6  | -7.5633e+002 | 3.3740e+001 | -7.4943e+002 | 3.4107e+001 | -7.4891e+002 | 3.6480e+001
f8  | -2.5742e+002 | 1.7774e+001 | -2.5256e+002 | 1.9350e+001 | -2.4812e+002 | 1.6304e+001
f9  | -6.3444e+002 | 1.9519e+001 | -6.3492e-001 | 1.9929e+001 | -6.2661e+002 | 1.6450e+001
f12 | -1.4640e+004 | 3.1504e+002 | -1.4646e+004 | 3.0211e+002 | -1.4721e+004 | 2.7377e+002

Table 6: Results of BCSA, ECSA, and FCSA when D = 100.

ALGs | BCSA Mean | BCSA Std | ECSA Mean | ECSA Std | FCSA Mean | FCSA Std
f1  | -1.3768e+001 | 1.1996e-001 | -1.3756e+001 | 1.2061e-001 | -1.3697e+001 | 1.1556e-001
f6  | -1.9237e+003 | 5.7814e+001 | -1.9198e+003 | 6.4516e+001 | -1.9070e+003 | 6.5405e+001
f8  | -7.1533e+002 | 2.5226e+001 | -7.0166e+002 | 4.0096e+001 | -6.9408e+002 | 2.7792e+001
f9  | -1.4416e+003 | 2.6102e+001 | -1.4370e+003 | 3.2709e+001 | -1.4283e+003 | 2.2737e+001
f12 | -3.2839e+004 | 4.5766e+002 | -3.2866e+004 | 3.7953e+002 | -3.2949e+004 | 3.6880e+002

[Figure 3: Results of GA, CSA, and FCSA when D = 2 and ALGs = f_1. Line chart of the 1000-generation results; x-axis: generation (1 to 1000); y-axis: result (0 to 25); lines: CSA, GA, FCSA.]


According to the test results of the Eggholder Function, the convergence stability of the CSA and FCSA is worse than that of GA, but their optimization accuracy is better than GA's.

In the test results of the Griewank Function, the optimization accuracy of the CSA and FCSA is better than that of GA, and GA has a few deviation points as well as poor convergence stability. Among the improved CSAs, when d = 50, FCSA is the best in stability, and when d = 100, FCSA is the best in optimization accuracy.

In the test results of Schaffer Function N.2 and the Schwefel Function, the GA has better convergence stability and optimization accuracy than the CSA and FCSA. For the Schwefel Function, FCSA shows more stable convergence in higher dimensions, while BCSA shows more accurate convergence in higher dimensions.

As can be seen from the experimental results in Table 7, after the forgetting mechanism is introduced into the BCSA algorithm, both the mean value and the standard deviation of the optimal solution on the test functions f_1, f_6, f_8, and f_9 are better than the results of the original BCSA in Tables 5 and 6.

On the other hand, it can be seen from Figure 3 that the clonal selection algorithms converge around 300 generations and come very close to the global minimum of the function (the global minimum is 0). In particular, the improved algorithm proposed in this paper is more accurate than the comparison algorithms and always finds better results at the same generation.

4.3. Experiment Analysis. According to the experimental data in Section 4.2, in which GA, CSA, and FCSA were used to find the optimal solution (global maximum) of 13 test functions, the following can be seen. On the Ackley Function, Bukin Function, Eggholder Function, Levy Function, Rastrigin Function, Shubert Function, and others, the proposed algorithm, in the same initial experimental environment as the CSA and GA, converges stably and reliably with higher precision. The GA can only converge stably on Schaffer Function N.2 and the Schwefel Function; it cannot converge stably on the other 11 test functions and easily falls into local optima.

Overall, the experimental results show that FCSA has higher optimization accuracy and stability than CSA, and FCSA has higher optimization accuracy and convergence stability than GA on most test functions.

The high-dimensional experiments on BCSA, ECSA, and FCSA show that FCSA has advantages over ECSA and BCSA in terms of convergence stability and accuracy. Due to the characteristics of the test functions themselves, the higher the dimension, the more complex the function landscape, which decreases the optimization accuracy and stability of the algorithms.

By applying the forgetting mechanism to BCSA, the number of antibodies to be replaced, originally defined manually, is instead determined by the affinity attributes of the antibodies themselves. The forgetting mechanism has a positive effect on the convergence speed and convergence stability of such algorithms.

5. Conclusion

To solve the problem that the CSA cannot eliminate, in a timely way, antibodies that are not adapted to the new environment and thus form an antibody "black hole," we showed that by changing the receptor editing mechanism of the clonal selection algorithm to a new forgetting mechanism, the antibody candidate set can be replaced and updated under the regulation of Rac1 protein activity. Experiments show that FCSA is an effective improved algorithm compared with ECSA and BCSA in terms of optimization efficiency, optimization accuracy, and convergence stability.

Because FCSA changes the substitution step of the current clonal selection algorithm, it outperforms the existing improved clonal selection algorithms. However, judging from its performance on high-dimensional test functions, FCSA still suffers from low optimization precision. In the future, FCSA will be combined with existing improved clonal selection algorithms to further improve the precision and stability of high-dimensional optimization.

We also note that Luo et al. [26] proposed a clonal selection method for dynamic multimodal optimization problems and proved its effectiveness. When the global peaks of a problem change over time, how to use the forgetting mechanism to adapt quickly to the new heights of the global peaks and forget outdated experience in time will be our future research direction.

At the same time, as an effective updating mechanism, the forgetting mechanism can also be applied to other heuristic algorithms that need to update their populations.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Table 7: Results of the BCSA algorithm combined with the forgetting mechanism when D = 50 and D = 100.

ALGs | D = 50 Mean | D = 50 Std | D = 100 Mean | D = 100 Std
f1  | -8.5121e+000 | 1.4299e-001 | -9.1801e+000 | 9.4149e-002
f6  | -7.5371e+002 | 3.7047e+001 | -1.9109e+003 | 7.2792e+001
f8  | -2.4475e+002 | 1.9376e+001 | -6.8859e+002 | 3.1488e+001
f9  | -6.2326e+002 | 1.9837e+001 | -1.4202e+003 | 3.0524e+001
f12 | -1.4873e+004 | 2.7891e+002 | -3.3390e+004 | 3.6963e+002


Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61977021), the National Social Science Fund (15CGL074), the Intelligent Information Processing and Real-Time Industrial System Hubei Provincial Key Laboratory Open Fund Project (znxx2018MS05), and the Open Project of the Hubei Key Laboratory of Applied Mathematics (HBAM201902).

References

[1] W. Zhang, K. Gao, W. Zhang, X. Wang, Q. Zhang, and H. Wang, "A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization," Applied Intelligence, vol. 49, no. 2, pp. 819-836, 2019.

[2] G. Yang, H. Jin, and X. Zhu, "Optimization algorithm based on differential evolution and clonal selection mechanism," Computer Engineering and Applications, vol. 49, no. 10, pp. 50-52, 2013.

[3] J. Kim and P. Bentley, "Immune memory and gene library evolution in the dynamic clonal selection algorithm," Genetic Programming and Evolvable Machines, vol. 5, no. 4, pp. 361-391, 2004.

[4] D. Zhang, K. Wang, Y. Long, and X. Zhao, "Multi-objective clonal selection algorithm applying to biomass power plant location model," Journal of Geomatics, vol. 43, no. 2, pp. 19-23, 2018.

[5] M. Rafiei, T. Niknam, and M. H. Khooban, "Probabilistic electricity price forecasting by improved clonal selection algorithm and wavelet preprocessing," Neural Computing and Applications, vol. 28, no. 12, pp. 1-13, 2016.

[6] G. Lou and Z. Cai, "Improved hybrid immune clonal selection genetic algorithm and its application in hybrid shop scheduling," Cluster Computing, vol. 22, no. S2, pp. 3419-3429, 2019.

[7] Y. Jing and Z. Zhang, "A study on car flow organization in the loading end of heavy haul railway based on immune clonal selection algorithm," Neural Computing and Applications, vol. 31, no. 5, pp. 1455-1465, 2019.

[8] Z. Zareizadeh, M. S. Helfroush, A. Rahideh, and K. Kazemi, "A robust gene clustering algorithm based on clonal selection in multiobjective optimization framework," Expert Systems with Applications, vol. 113, no. 15, pp. 301-314, 2018.

[9] S. Kamada and T. Ichimura, "A generation method of immunological memory in clonal selection algorithm by using restricted Boltzmann machines," in Proceedings of the IEEE International Conference on Systems, October 2016.

[10] S. Mohapatra, P. M. Khilar, and R. Ranjan Swain, "Fault diagnosis in wireless sensor network using clonal selection principle and probabilistic neural network approach," International Journal of Communication Systems, vol. 32, no. 16, p. e4138, 2019.

[11] B. C. Yavuz, N. Yurtay, and O. Ozkan, "Prediction of protein secondary structure with clonal selection algorithm and multilayer perceptron," IEEE Access, vol. 6, pp. 45256-45261, 2018.

[12] W. Luo and X. Lin, "Recent advances in clonal selection algorithms and applications," in Proceedings of the IEEE Symposium Series on Computational Intelligence, November 2018.

[13] A. Galvez, A. Iglesias, A. Avila, C. Otero, R. Arias, and C. Manchado, "Elitist clonal selection algorithm for optimal choice of free knots in B-spline data fitting," Applied Soft Computing, vol. 26, pp. 90-106, 2015.

[14] M. Gong, L. Jiao, and L. Zhang, "Baldwinian learning in clonal selection algorithm for optimization," Information Sciences, vol. 180, no. 8, pp. 1218-1236, 2010.

[15] B. S. Rao and K. Vaisakh, "Multi-objective adaptive clonal selection algorithm for solving optimal power flow considering multi-type FACTS devices and load uncertainty," Applied Soft Computing, vol. 23, pp. 286-297, 2014.

[16] L. Hong, C. L. Gong, J. Z. Wang, and Z. C. Ji, "Convergence rate estimation of elitist clonal selection algorithm," Acta Electronica Sinica, vol. 43, no. 5, pp. 916-921, 2015, in Chinese.

[17] H. Ebbinghaus, Memory, Columbia University, New York, NY, USA, 1913.

[18] T. J. Ricker, E. Vergauwe, and N. Cowan, "Decay theory of immediate memory from Brown (1958) to today (2014)," Quarterly Journal of Experimental Psychology, vol. 69, no. 10, pp. 1969-1995, 2016.

[19] M. Anderson, "Rethinking interference theory: executive control and the mechanisms of forgetting," Journal of Memory and Language, vol. 49, no. 4, pp. 415-445, 2003.

[20] Y. Shuai, B. Lu, Y. Hu, L. Wang, K. Sun, and Y. Zhong, "Forgetting is regulated through Rac activity in Drosophila," Cell, vol. 140, no. 4, pp. 579-589, 2010.

[21] Y. Liu, S. Du, L. Lv et al., "Hippocampal activation of Rac1 regulates the forgetting of object recognition memory," Current Biology, vol. 26, no. 17, pp. 2351-2357, 2016.

[22] T. Tully, S. Boynton, C. Brandes et al., "Genetic dissection of memory formation in Drosophila melanogaster," Cold Spring Harbor Symposia on Quantitative Biology, vol. 55, no. 1, pp. 203-211, 1990.

[23] L. N. De Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle," IEEE Transactions on Evolutionary Computation, vol. 6, no. 3, pp. 239-251, 2002.

[24] B. Ulutas and A. A. Islier, "Dynamic facility layout problem in footwear industry," Journal of Manufacturing Systems, vol. 36, pp. 55-61, 2015.

[25] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," 2018, http://www.sfu.ca/~ssurjano/optimization.html.

[26] W. Luo, X. Lin, T. Zhu, and P. Xu, "A clonal selection algorithm for dynamic multimodal function optimization," Swarm and Evolutionary Computation, vol. 50, 2019.



[9] S Kamada and T Ichimura ldquoA generation method of im-munological memory in clonal selection algorithm by usingrestricted boltzmann machinesrdquo in Proceedings of the IEEEInternational Conference on Systems October 2016

[10] S Mohapatra P M Khilar and R Ranjan Swain ldquoFaultdiagnosis in wireless sensor network using clonal selectionprinciple and probabilistic neural network approachrdquo In-ternational Journal of Communication Systems vol 32 no 16p e4138 2019

[11] C Yavuz Burcu Y Nilufer and O Ozkan ldquoPrediction ofprotein secondary structure with clonal selection algorithmand multilayer perceptronrdquo IEEE ACCESS vol 6pp 45256ndash45261 2018

[12] W Luo and X Lin ldquoRecent advances in clonal selectionalgorithms and applicationsrdquo in Proceedings of the IEEESymposium Series on Computational Intelligence November2018

[13] A Galvez A Iglesias A Avila C Otero R Arias andC Manchado ldquoElitist clonal selection algorithm for optimalchoice of free knots in B-spline data fittingrdquo Applied SoftComputing vol 26 pp 90ndash106 2015

[14] M Gong L Jiao and L Zhang ldquoBaldwinian learning in clonalselection algorithm for optimizationrdquo Information Sciencesvol 180 no 8 pp 1218ndash1236 2010

[15] B S Rao and K Vaisakh ldquoMulti-objective adaptive clonalselection algorithm for solving optimal power flow consid-ering multi-type FACTS devices and load uncertaintyrdquo Ap-plied Soft Computing vol 23 pp 286ndash297 2014

[16] L Hong C L Gong J Z Wang and Z C Ji ldquoConvergencerate estimation of elitist clonal selection algorithmrdquo ActaElectronica Sinica vol 43 no 5 pp 916ndash921 2015 inChinese

[17] H Ebbinghaus Memory Columbia University New YorkNY USA 1913

[18] T J Ricker E Vergauwe and N Cowan ldquoDecay theory ofimmediate memory from Brown (1958) to today (2014)rdquoQuarterly Journal of Experimental Psychology vol 69 no 10pp 1969ndash1995 2016

[19] M Anderson ldquoRethinking interference theory executivecontrol and the mechanisms of forgettingrdquo Journal of Memoryand Language vol 49 no 4 pp 415ndash445 2003

[20] Y Shuai B Lu Y Hu L Wang K Sun and Y ZhongldquoForgetting is regulated through rac activity in DrosophilardquoCell vol 140 no 4 pp 579ndash589 2010

[21] Y Liu S Du L Lv et al ldquoHippocampal activation of Rac1regulates the forgetting of object recognition memoryrdquoCurrent Biology vol 26 no 17 pp 2351ndash2357 2016

[22] T Tully S Boynton C Brandes et al ldquoGenetic dissection ofmemory formation in Drosophila melanogasterrdquo Cold SpringHarbor Symposia on Quantitative Biology vol 55 no 1pp 203ndash211 1990

[23] L N De Castro and F J Von Zuben ldquoLearning and opti-mization using the clonal selection principlerdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 3pp 239ndash251 2002

[24] B Ulutas and A A Islier ldquoDynamic facility layout problem infootwear industryrdquo Journal of Manufacturing Systems vol 36no 36 pp 55ndash61 2015

[25] S Surjanovic and D Bingham ldquoVirtual library of simulationexperiments test functions and datasetsrdquo 2018 httpwwwsfucasimssurjanooptimizationhtml

[26] W Luo X Lin T Zhu and P Xu ldquoA clonal selection al-gorithm for dynamic multimodal function optimizationrdquoSwarm and Evolutionary Computation vol 50 2019

10 Complexity

Page 4: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

For each antibody in the candidate set, the Rac1 activity of an antibody in layer I is significantly lower than that of an antibody in layer II. When the Rac1 activity of a layer II antibody exceeds the activity threshold, the antibody is replaced and the entire antibody candidate set is updated. In this way, the clonal selection algorithm avoids the antibody "black hole" formed by the original antibodies in layer II that have not migrated.

3.2. Abstract Definition of Forgetting Mechanism. The following definitions relate to the forgetting mechanism:

(1) Antibody survival time is the number of iterations for which an antibody has participated in the antibody candidate set.

(2) Time T is the execution time of the clonal selection algorithm. In this paper, T refers to the number of algorithm iterations.

(3) Appropriate memory is an attribute of each candidate antibody. In an iteration, if the antibody belongs to the best n antibodies, this is regarded as one appropriate memory.

(4) Appropriate memory strength is the appropriate memory accumulated by a candidate antibody during time T.

(5) Rac1 protein activity is the index that determines antibody forgetting. It is computed from the survival time of an antibody in the candidate set and the strength of its appropriate memory: Rac1 protein activity is proportional to the survival time of the antibody and inversely proportional to its appropriate memory strength.
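To make this bookkeeping concrete, the following minimal Python sketch (the class and attribute names are our own illustrative choices, not identifiers from the paper) records the two quantities and derives Rac1 activity as their ratio, which is how Algorithm 1 later computes it in step (16):

```python
class Antibody:
    """Candidate antibody with the bookkeeping used by the forgetting mechanism."""

    def __init__(self, genes):
        self.genes = genes            # candidate solution x_1 ... x_D
        self.survival_time = 0        # definition (1): iterations spent in the candidate set
        self.memory_strength = 1      # definition (4): accumulated appropriate memory

    def rac1_activity(self):
        # Definition (5): proportional to survival time, inversely proportional
        # to appropriate memory strength (the ratio T_abi / S_abi of Algorithm 1).
        return self.survival_time / self.memory_strength

# An antibody that has survived 9 iterations but entered the best-n set only
# 3 times has activity 3.0; with the threshold c = 3 used later in Table 1,
# it is on the verge of being forgotten.
ab = Antibody([0.0, 0.0])
ab.survival_time, ab.memory_strength = 9, 3
print(ab.rac1_activity())  # 3.0
```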

3.3. Improved Clonal Selection Algorithm. In this paper, an improved clonal selection algorithm (FCSA) inspired by the forgetting mechanism is proposed. Its core idea is to replace the receptor editing mechanism [24] in the CSA with a forgetting mechanism.

The specific implementation method is as follows. In each iteration of the algorithm, the appropriate memory strength and survival time of each antibody in the candidate set are recorded. After several iterations of the algorithm, antibody forgetting is determined based on whether the Rac1 protein activity has reached the threshold.

Figure 1: Antigen and antibody distribution structure (antigens Ag and antibody Ab across layers I, II, and III).

Figure 2: Antigen-antibody distribution structure affected by Rac1 (antigens Ag and antibody Ab across layers I and II).

3.3.1. Affinity Calculation. To simplify the calculation, the affinity of an antibody to the antigen is the value of the target test function, which can be expressed as

$$\operatorname{Aff}_{ab_i} = f(x_1, x_2, \ldots, x_D), \tag{1}$$

where $\{x_i \mid i = 1, 2, \ldots, D\}$ is the antibody and $D$ is the dimension of the antibody.
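For instance, with the Ackley function f1 from Table 3 as the antigen and the sign convention of Section 4.1 (the opposite value of the test function is maximized), equation (1) might be coded as follows; the helper names are ours:

```python
import math

def ackley(x):
    # Ackley test function f1 from Table 3 (global minimum 0 at the origin).
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / d
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

def affinity(x):
    # Equation (1) with the opposite value taken, so that the global minimum
    # of the test function becomes the global maximum of the affinity.
    return -ackley(x)

print(affinity([0.0, 0.0]))  # ~0.0, the best attainable affinity
```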

3.3.2. Cloning Method. According to the affinity between the antibody and the antigen, the cloning method performs the cloning operation on the antibody. The higher the affinity, the greater the number of clones produced. The specific cloning formula is

$$Ab_c = \left\{ ab_{ij} \;\middle|\; i \in [0, \text{population size} - 1],\; j = \max\!\left(1, \operatorname{int}\!\left(\left|\frac{\operatorname{Aff}_{ab_i} + a}{a}\right| \cdot \text{max clone}\right)\right) \right\}, \quad a > 0, \tag{2}$$

where population size represents the number of antibodies in the candidate set, $\operatorname{Aff}_{ab_i}$ represents the affinity between antibody $ab_i$ and the antigen, max clone represents the initial clone number, and $j$ represents the clone number of antibody $ab_i$.
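A sketch of this operator under our reading of equation (2) (the grouping inside the garbled fraction is our reconstruction, so treat the exact scaling as an assumption; max_clone = 5 matches the initial clone number in Table 1):

```python
def clone_count(aff, a, max_clone):
    # Our reading of equation (2): the clone number grows with affinity,
    # scaled by the positive range parameter a, and is never below 1.
    return max(1, int(abs((aff + a) / a) * max_clone))

def clone_antibody(genes, aff, a, max_clone):
    # Produce j identical copies of the antibody; mutation happens afterwards.
    return [list(genes) for _ in range(clone_count(aff, a, max_clone))]

# A high-affinity antibody gets more clones than a low-affinity one.
print(clone_count(aff=-0.1, a=1.0, max_clone=5))  # 4
print(clone_count(aff=-0.9, a=1.0, max_clone=5))  # 1
```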

3.3.3. Variation Method. The mutation method acts on each cloned antibody $ab_i$ and determines the mutation degree of the antibody according to the affinity between the antibody and the antigen. The higher the affinity, the lower the probability and degree of mutation. The specific variation formula is

$$ab = \{x_i \mid i = 1, 2, \ldots, D\},$$
$$x_i = \begin{cases} x_i + \operatorname{random}(-a, a), & \operatorname{random}(0, 1) < e^{-r \cdot \operatorname{aff}_{ab} / \text{max aff}}, \\ x_i, & \operatorname{random}(0, 1) \geq e^{-r \cdot \operatorname{aff}_{ab} / \text{max aff}}, \end{cases} \tag{3}$$

where $r$ is the mutation rate, $a$ is the variation range ($a > 0$), and max aff is the maximum affinity in the antibody set.
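A sketch of equation (3) in the same vein (the exponent grouping is our reconstruction of the garbled formula, and affinities are assumed positive for this illustration; r = 2 matches the mutation rate in Table 1):

```python
import math
import random

def mutation_probability(aff, max_aff, r=2.0):
    # Equation (3): probability e^(-r * aff / max_aff). The closer an
    # antibody's affinity is to the best one, the less likely it mutates.
    return math.exp(-r * aff / max_aff)

def mutate(genes, aff, max_aff, r=2.0, a=1.0):
    p = mutation_probability(aff, max_aff, r)
    return [x + random.uniform(-a, a) if random.random() < p else x
            for x in genes]

random.seed(0)
print(mutation_probability(0.9, 1.0))  # ~0.165: near-best antibody, rarely mutated
print(mutation_probability(0.1, 1.0))  # ~0.819: poor antibody, usually mutated
print(mutate([0.5, -0.5], aff=0.1, max_aff=1.0))
```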

3.3.4. Forgetting Method. The forgetting method determines the necessity of antibody forgetting based on the survival time of the antibody, the appropriate memory strength, and the activity of the Rac1 protein. The specific forgetting formula is

$$ab = \{x_i \mid i = 1, 2, \ldots, D\},$$
$$x_i = \begin{cases} \operatorname{random}(\text{minDomain}, \text{maxDomain}), & ab_{\text{time}} / ab_{\text{strength}} > c, \\ x_i, & \text{otherwise}, \end{cases} \tag{4}$$

where $ab_{\text{time}}$ is the antibody survival time, $ab_{\text{strength}}$ is the appropriate memory strength, and $c$ is the Rac1 protein activity threshold.
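A direct sketch of equation (4), with c = 3 as in the Rac1 threshold of Table 1 (the function and argument names are ours):

```python
import random

def forget(genes, ab_time, ab_strength, c, min_domain, max_domain):
    # Equation (4): when the Rac1 activity (survival time / memory strength)
    # exceeds the threshold c, the stale antibody is re-initialized uniformly
    # over the search domain; otherwise it is kept as-is.
    if ab_time / ab_strength > c:
        return [random.uniform(min_domain, max_domain) for _ in genes]
    return genes

random.seed(1)
print(forget([0.3, 0.7], ab_time=10, ab_strength=2, c=3.0,
             min_domain=-5.0, max_domain=5.0))  # replaced: activity 5.0 > 3.0
print(forget([0.3, 0.7], ab_time=4, ab_strength=2, c=3.0,
             min_domain=-5.0, max_domain=5.0))  # kept: activity 2.0 <= 3.0
```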

3.3.5. Algorithm Flow. The flow of the improved algorithm proposed in this paper is shown in Algorithm 1.

The termination conditions of Algorithm 1 can be chosen according to specific needs. Common termination conditions include reaching a maximum number of function evaluations and reaching a maximum number of generations.

In the algorithm, Rac1 protein activity is an inherent property of each candidate antibody. It is computed from the antibody survival time and the appropriate memory strength, starting when the antibody is first selected into the candidate set, and it changes dynamically as the algorithm executes. When the property value of an antibody reaches the threshold, the antibody has not mutated in a better direction within the expected time and is not sufficiently competitive with the other candidate antibodies, so the forgetting operation is performed on it.

4. Experiment and Algorithm Evaluation

Thirteen CEC benchmark test functions were selected as experimental test functions, and the search accuracy of the proposed algorithm (FCSA), the elitist clonal selection algorithm (ECSA) proposed in [13], the Baldwinian-learning clonal selection algorithm (BCSA) proposed in [14], and the genetic algorithm (GA) was tested on them. The experimental steps are as follows.

First, initialize the parameters of each algorithm. The termination criterion in this experiment is to run GA, BCSA, ECSA, and FCSA until the number of function evaluations reaches the maximum value of 350,000.

Second, find the optimal solution of each test function. The three algorithms were executed repeatedly, recording the optimal solution produced by each execution, and the average, maximum, and minimum optimal solutions over 100 executions were analyzed.

The purpose of the experiment in this paper is to analyze the effectiveness of the forgetting mechanism applied to the clonal selection algorithm. The performance of an algorithm is mainly evaluated by the quality of the results obtained under identical termination conditions. This article reports the mean and standard deviation of the results of multiple runs of each algorithm to evaluate the quality of the results. These two indicators reflect, respectively, the central tendency and the degree of dispersion of the experimental data, so this paper uses them to verify the effectiveness of the improved algorithm.
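Concretely, the two indicators are computed over the recorded optima of the repeated runs, for example (the sample values below are made up):

```python
import statistics

# Hypothetical best-result values from five runs of one algorithm.
run_optima = [-0.21, -0.18, -0.25, -0.19, -0.22]

print("mean =", statistics.mean(run_optima))   # central tendency
print("std  =", statistics.stdev(run_optima))  # dispersion
```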

Finally, we recorded the results of GA, CSA, and FCSA over 1000 generations and plotted them as line charts. The purpose is to analyze the accuracy and speed of each algorithm by characterizing the relationship between the generation number and the algorithm's current result.

The algorithm parameters are set as shown in Table 1, and the execution environment of the algorithms is shown in Table 2.

4.1. Test Functions. The test functions selected in this paper are shown in Table 3. Their common feature is that each has a global minimum value [25] and a complex landscape with multiple local minima. Since antibodies with high affinity are generally selected when comparing the affinities of antigens and antibodies, this paper takes the opposite value of each test function as the objective, so that the global maximum of the objective corresponds to the global minimum of the test function in Table 3.

4.2. Experimental Results. The results of our experiment are shown in Table 4. The closer the average value of the optimal solutions obtained by an algorithm is to the reference value, the higher the accuracy of the algorithm under the termination condition. Because the test functions f_1, f_6, f_8, f_9, and f_12 admit high-dimensional instances, we set d = 50 and d = 100 to verify the convergence of FCSA on high-dimensional test functions and compare the results with ECSA and BCSA, as shown in Tables 5 and 6.

After replacing the updating operator in BCSA with the forgetting operator, we again set d = 50 and d = 100 to test the updating ability of the forgetting operator against the previous updating operator in different dimensions.

FCSA
Input: N (the size of the population), n (the number of antibodies selected for cloning), nc (the number of clones), m (the degree of variation), c (Rac1 protein activity threshold)
Output: the best antibody
(1) Begin
(2) Randomly generate N antibodies to form the initial candidate set Ab
(3) while the algorithm termination conditions are not met do
(4)   Calculate the affinity Aff_abi of each antibody for the antigen in the candidate set and record the antibody survival time T_abi
(5)   Sort the antibodies in the candidate set according to their affinity and put the best n antibodies into the antibody set Abs
(6)   for abi in Abs
(7)     Update the appropriate memory of antibody abi: S_abi += 1. See CLONING METHOD: clone antibody abi according to nc and Aff_abi, and put all antibodies obtained by cloning into the antibody set Abc
(8)   end for
(9)   for abi in Abc
(10)    See VARIATION METHOD: mutate abi according to the degree of variation m and the affinity Aff_abi of the antibody for the antigen
(11)    if antibody abi is a variant antibody
(12)      Reset its survival time T_abi = 0 and set its appropriate memory intensity S_abi = 1
(13)    end if
(14)  end for
(15)  Select the N antibodies with the highest antigen affinity in Abc and Ab to replace the N antibodies in Ab
(16)  See FORGETTING METHOD: calculate the Rac1 protein activity of each antibody in Ab according to the ratio of T_abi to S_abi
(17)  if the Rac1 protein activity of antibody abi > threshold
(18)    forget the antibody abi
(19)  end if
(20) end while
(21) Choose the best antibody as the final output

Algorithm 1: Forgetting-based clonal selection algorithm (FCSA).
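Tying the operators together, the following compact end-to-end sketch of Algorithm 1 is a minimal illustration, not the authors' implementation: the reconstructed groupings in equations (2) and (3) are applied literally, the negated 2-D Ackley function stands in for the antigen, and resetting a forgotten antibody's bookkeeping is our assumption.

```python
import math
import random

def affinity(x):
    # Negated 2-D Ackley (f1 in Table 3): global maximum 0 at the origin.
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / d
    return 20 * math.exp(-0.2 * math.sqrt(s1)) + math.exp(s2) - 20 - math.e

def fcsa(N=20, n=5, max_clone=5, r=2.0, a=1.0, c=3.0,
         lo=-5.0, hi=5.0, D=2, generations=300):
    # Antibody = [genes, survival_time T, memory_strength S].
    pop = [[[random.uniform(lo, hi) for _ in range(D)], 0, 1] for _ in range(N)]
    for _ in range(generations):
        for ab in pop:                                # step 4: survival time
            ab[1] += 1
        pop.sort(key=lambda ab: affinity(ab[0]), reverse=True)
        max_aff = affinity(pop[0][0])
        clones = []
        for ab in pop[:n]:                            # steps 5-8: select best n, clone
            ab[2] += 1                                # appropriate memory S += 1
            aff = affinity(ab[0])
            j = max(1, int(abs((aff + a) / a) * max_clone))   # equation (2)
            for _ in range(j):
                genes = list(ab[0])
                # Equation (3), applied literally (see Section 3.3.3).
                p = math.exp(-r * aff / max_aff) if max_aff != 0 else 1.0
                mutated = False
                for k in range(D):                    # steps 9-14: variation
                    if random.random() < p:
                        genes[k] += random.uniform(-a, a)
                        mutated = True
                # Step 12: a variant antibody restarts its bookkeeping.
                clones.append([genes, 0, 1] if mutated else [genes, ab[1], ab[2]])
        # Step 15: keep the N best antibodies from clones plus population.
        pop = sorted(pop + clones, key=lambda ab: affinity(ab[0]), reverse=True)[:N]
        for ab in pop:                                # steps 16-19: forgetting
            if ab[1] / ab[2] > c:                     # Rac1 activity over threshold
                ab[0] = [random.uniform(lo, hi) for _ in range(D)]
                ab[1], ab[2] = 0, 1                   # our assumption: reset bookkeeping
    return max(pop, key=lambda ab: affinity(ab[0]))[0]

random.seed(42)
best = fcsa()
print(best, affinity(best))  # best antibody found and its affinity
```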

Table 1: Initialization parameters.

Algorithm parameter     GA     CSA    FCSA   ECSA   BCSA
Cross rate              0.5    --     --     --     --
Mutation rate           0.13   2      2      2      2
Initial clone number    --     5      5      5      5
Rac1 threshold          --     --     3      --     --

Table 2: Execution environment.

OS: Windows 10 Professional Edition
CPU: Intel(R) Core(TM) i3-3217U CPU @ 1.80 GHz
RAM: 12.0 GB
Compiler version: Python 3.6

Table 3: Test functions.

f1, Ackley Function: $f_1(x) = -20 \exp(-0.2 \sqrt{(1/d) \sum_{i=1}^{d} x_i^2}) - \exp((1/d) \sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + \exp(1)$; optimum 0.
f2, Bukin Function n. 6: $f_2(x) = 100 \sqrt{|x_2 - 0.01 x_1^2|} + 0.01 |x_1 + 10|$; optimum 0.
f3, Cross-in-Tray Function: $f_3(x) = -0.0001 (|\sin x_1 \sin x_2 \exp(|100 - \sqrt{x_1^2 + x_2^2}/\pi|)| + 1)^{0.1}$; optimum -2.06261.
f4, Drop-Wave Function: $f_4(x) = -(1 + \cos(12 \sqrt{x_1^2 + x_2^2})) / (0.5 (x_1^2 + x_2^2) + 2)$; optimum -1.
f5, Eggholder Function: $f_5(x) = -(x_2 + 47) \sin\sqrt{|x_2 + x_1/2 + 47|} - x_1 \sin\sqrt{|x_1 - x_2 - 47|}$; optimum -959.6407.
f6, Griewank Function: $f_6(x) = \sum_{i=1}^{d} x_i^2/4000 - \prod_{i=1}^{d} \cos(x_i/\sqrt{i}) + 1$; optimum 0.
f7, Holder Table Function: $f_7(x) = -|\sin(x_1) \cos(x_2) \exp(|1 - \sqrt{x_1^2 + x_2^2}/\pi|)|$; optimum -19.2085.
f8, Levy Function: $f_8(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1} (w_i - 1)^2 [1 + 10 \sin^2(\pi w_i + 1)] + (w_d - 1)^2 [1 + \sin^2(2\pi w_d)]$, where $w_i = 1 + (x_i - 1)/4$ for all $i = 1, \ldots, d$; optimum 0.
f9, Rastrigin Function: $f_9(x) = 10d + \sum_{i=1}^{d} [x_i^2 - 10 \cos(2\pi x_i)]$; optimum 0.
f10, Schaffer Function n. 2: $f_{10}(x) = 0.5 + (\sin^2(x_1^2 - x_2^2) - 0.5)/[1 + 0.001 (x_1^2 + x_2^2)]^2$; optimum 0.
f11, Schaffer Function n. 4: $f_{11}(x) = 0.5 + (\cos(\sin|x_1^2 - x_2^2|) - 0.5)/[1 + 0.001 (x_1^2 + x_2^2)]^2$; optimum 0.5.
f12, Schwefel Function: $f_{12}(x) = 418.9829 d - \sum_{i=1}^{d} x_i \sin(\sqrt{|x_i|})$; optimum 0.
f13, Shubert Function: $f_{13}(x) = (\sum_{i=1}^{5} i \cos((i+1) x_1 + i))(\sum_{i=1}^{5} i \cos((i+1) x_2 + i))$; optimum -186.7309.

Finally, the comparison of the three algorithms GA, CSA, and FCSA over 1000 generations is shown in Figure 3. The abscissa represents the current generation number of the algorithm, and the ordinate represents the current result of the algorithm; the dashed line represents GA, the thin solid line represents CSA, and the remaining solid line represents FCSA.

In the test results of the Ackley Function, over 100 executions the optimal solutions of FCSA lie in the interval (-1, 0), those of CSA in (-1.5, 0), and those of GA in (-2.5, 0). The convergence of GA is worse than that of CSA and FCSA, and FCSA converges best.

Among the test results of the Bukin Function n. 6, Drop-Wave Function, Rastrigin Function, and Schaffer Function n. 4, FCSA has the best optimization accuracy and convergence stability, while GA has poorer optimization accuracy and convergence than CSA.

In the test results of the Cross-in-Tray Function, Holder Table Function, Levy Function, and Shubert Function, CSA and FCSA converge stably to the global optimum, with FCSA having the best average search accuracy and stability, while GA still produces deviation points and does not converge stably.

Table 4: Results of GA, CSA, and FCSA when D = 2.

ALGs   GA (Mean / Std)                  CSA (Mean / Std)                 FCSA (Mean / Std)
f1     -5.0923e-001 / 5.1699e-001       -4.1302e-001 / 3.4025e-001       -1.9901e-001 / 2.0546e-001
f2     -7.3939e+000 / 4.7669e+000       -1.2512e+000 / 6.3021e-001       -9.7695e-001 / 5.3091e-001
f3     2.0601e+000 / 5.0685e-003        2.06255e+000 / 6.1577e-005       2.06258e+000 / 2.9248e-005
f4     9.2563e-001 / 4.3546e-002        9.6047e-001 / 2.3679e-002        9.7154e-001 / 2.0048e-002
f5     9.3533e+002 / 1.4155e-002        9.5836e+002 / 1.7805e+000        9.5926e+002 / 6.3550e-001
f6     -1.6845e-002 / 2.7008e-002       -2.5327e-002 / 1.5556e-002       -2.3604e-002 / 1.3131e-002
f7     1.8907e+001 / 3.8513e-001        1.9203e+001 / 6.1897e-003        1.9206e+001 / 3.1154e-003
f8     -9.7077e-003 / 2.2512e-002       -7.2359e-004 / 7.3241e-004       -3.3312e-004 / 3.5434e-004
f9     -1.4031e+000 / 9.6354e-001       -1.5528e-001 / 1.5445e-001       -7.9310e-002 / 7.6989e-002
f10    -2.0083e-004 / 6.4135e-004       -2.7670e-003 / 3.2928e-003       -1.1855e-003 / 1.6805e-003
f11    -5.00096e-001 / 3.4944e-006      -5.00094e-001 / 1.7190e-006      -5.00094e-001 / 1.5678e-006
f12    -2.5903e-003 / 4.8844e-003       -4.4234e-002 / 4.5491e-002       -3.0112e-002 / 3.1088e-002
f13    1.6615e+002 / 2.3853e+001        1.8632e+002 / 3.9038e-001        1.8649e+002 / 2.9016e-001

Table 5: Results of BCSA, ECSA, and FCSA when D = 50.

ALGs   BCSA (Mean / Std)                ECSA (Mean / Std)                FCSA (Mean / Std)
f1     -1.2838e+001 / 2.3931e-001       -2.0213e+001 / 1.1482e-001       -1.2749e+001 / 2.3649e-001
f6     -7.5633e+002 / 3.3740e+001       -7.4943e+002 / 3.4107e+001       -7.4891e+002 / 3.6480e+001
f8     -2.5742e+002 / 1.7774e+001       -2.5256e+002 / 1.9350e+001       -2.4812e+002 / 1.6304e+001
f9     -6.3444e+002 / 1.9519e+001       -6.3492e+002 / 1.9929e+001       -6.2661e+002 / 1.6450e+001
f12    -1.4640e+004 / 3.1504e+002       -1.4646e+004 / 3.0211e+002       -1.4721e+004 / 2.7377e+002

Table 6: Results of BCSA, ECSA, and FCSA when D = 100.

ALGs   BCSA (Mean / Std)                ECSA (Mean / Std)                FCSA (Mean / Std)
f1     -1.3768e+001 / 1.1996e-001       -1.3756e+001 / 1.2061e-001       -1.3697e+001 / 1.1556e-001
f6     -1.9237e+003 / 5.7814e+001       -1.9198e+003 / 6.4516e+001       -1.9070e+003 / 6.5405e+001
f8     -7.1533e+002 / 2.5226e+001       -7.0166e+002 / 4.0096e+001       -6.9408e+002 / 2.7792e+001
f9     -1.4416e+003 / 2.6102e+001       -1.4370e+003 / 3.2709e+001       -1.4283e+003 / 2.2737e+001
f12    -3.2839e+004 / 4.5766e+002       -3.2866e+004 / 3.7953e+002       -3.2949e+004 / 3.6880e+002

Figure 3: Results of GA, CSA, and FCSA when D = 2 and ALGs = f_1 (line chart "1000 gen result when ALGS = f_1, D = 2"; abscissa: generation, 1-1000; ordinate: result, 0-25; lines: CSA, GA, FCSA).

According to the test results of the Eggholder Function, the convergence stability of CSA and FCSA is worse than that of GA, but their optimization accuracy is better.

In the test results of the Griewank Function, the optimization accuracy of CSA and FCSA is better than that of GA, and GA has a few deviation points as well as poor convergence stability. Among the improved CSAs, FCSA is the most stable when d = 50 and has the best optimization accuracy when d = 100.

In the test results of Schaffer Function n. 2 and the Schwefel Function, GA has better convergence stability and optimization accuracy than CSA and FCSA. For the Schwefel Function, FCSA shows more stable convergence in higher dimensions, while BCSA shows more accurate convergence in higher dimensions.

As can be seen from the experimental results in Table 7, after the forgetting mechanism is introduced into the BCSA algorithm, the mean value and standard deviation of the optimal solution on the test functions f_1, f_6, f_8, and f_9 are both better than the results of the original BCSA in Tables 5 and 6.

On the other hand, it can be seen from Figure 3 that the clonal selection algorithms converge at around 300 generations and come very close to the global optimum of the function (whose reference value is 0). In particular, the improved algorithm proposed in this paper is more accurate than the comparison algorithms and consistently finds better results at the same generation.

4.3. Experiment Analysis. From the experimental data obtained in Section 4.2 by running GA, CSA, and FCSA on the 13 test functions for the optimal solution (global maximum), the following can be seen. On the Ackley Function, Bukin Function, Eggholder Function, Levy Function, Rastrigin Function, Shubert Function, and similar functions, the proposed algorithm, run in the same initial experimental environment as CSA and GA, converges stably and reliably at higher precision. The GA can converge stably only on Schaffer Function n. 2 and the Schwefel Function; it cannot converge stably on the other 11 test functions and easily falls into local optima.

Overall, the experimental results show that FCSA has higher optimization accuracy and stability than CSA, and on most test functions FCSA has higher optimization accuracy and convergence stability than GA.

The high-dimensional experiments with BCSA, ECSA, and FCSA show that FCSA has advantages over ECSA and BCSA in terms of convergence stability and accuracy. Owing to the characteristics of the test functions themselves, the higher the dimension, the more complex the function becomes, which decreases the optimization accuracy and stability of all the algorithms.

By applying the forgetting mechanism to BCSA, the number of antibodies to be replaced is no longer fixed by a manual definition but is instead determined by the affinity-related attributes of the antibodies themselves. The forgetting mechanism has a positive effect on the convergence speed and convergence stability of such algorithms.

5. Conclusion

To solve the problem that the CSA cannot eliminate, in a timely way, antibodies that are not adapted to the new environment, which thus form an antibody "black hole," we replace the receptor editing mechanism of the clonal selection algorithm with a forgetting mechanism, so that the antibody candidate set is replaced and updated under the regulation of Rac1 protein activity. Experiments show that FCSA is an effective improved algorithm compared with ECSA and BCSA in terms of optimization efficiency, optimization accuracy, and convergence stability.

Because FCSA changes the substitution step of the current clonal selection algorithm, it outperforms the existing improved clonal selection algorithms. However, judging from its experimental performance on high-dimensional test functions, FCSA still suffers from low optimization precision. In the future, FCSA will be combined with existing improved clonal selection algorithms to further improve the precision and stability of high-dimensional optimization.

We also note that Luo et al. [26] proposed a clonal selection method for dynamic multimodal optimization problems and proved its effectiveness. When the global peaks of a problem change with time, how to use the forgetting mechanism to adapt quickly to the new heights of the global peaks and to forget outdated experience in time will be our future research direction.

At the same time, as an effective updating mechanism, the forgetting mechanism can also be applied to other heuristic algorithms that need to update their populations.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Table 7: Results of the BCSA algorithm combined with the forgetting mechanism when D = 50 and D = 100.

ALGs   D = 50 (Mean / Std)              D = 100 (Mean / Std)
f1     -8.5121e+000 / 1.4299e-001       -9.1801e+000 / 9.4149e-002
f6     -7.5371e+002 / 3.7047e+001       -1.9109e+003 / 7.2792e+001
f8     -2.4475e+002 / 1.9376e+001       -6.8859e+002 / 3.1488e+001
f9     -6.2326e+002 / 1.9837e+001       -1.4202e+003 / 3.0524e+001
f12    -1.4873e+004 / 2.7891e+002       -3.3390e+004 / 3.6963e+002


Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61977021), the National Social Science Fund (15CGL074), the Intelligent Information Processing and Real-Time Industrial System Hubei Provincial Key Laboratory Open Fund Project (znxx2018MS05), and the Open Project of the Hubei Key Laboratory of Applied Mathematics (HBAM201902).

References

[1] W. Zhang, K. Gao, W. Zhang, X. Wang, Q. Zhang, and H. Wang, "A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization," Applied Intelligence, vol. 49, no. 2, pp. 819-836, 2019.
[2] G. Yang, H. Jin, and X. Zhu, "Optimization algorithm based on differential evolution and clonal selection mechanism," Computer Engineering and Applications, vol. 49, no. 10, pp. 50-52, 2013.
[3] J. Kim and P. Bentley, "Immune memory and gene library evolution in the dynamic clonal selection algorithm," Genetic Programming and Evolvable Machines, vol. 5, no. 4, pp. 361-391, 2004.
[4] D. Zhang, K. Wang, Y. Long, and X. Zhao, "Multi-objective clonal selection algorithm applying to biomass power plant location model," Journal of Geomatics, vol. 43, no. 2, pp. 19-23, 2018.
[5] M. Rafiei, T. Niknam, and M. H. Khooban, "Probabilistic electricity price forecasting by improved clonal selection algorithm and wavelet preprocessing," Neural Computing and Applications, vol. 28, no. 12, pp. 1-13, 2016.
[6] G. Lou and Z. Cai, "Improved hybrid immune clonal selection genetic algorithm and its application in hybrid shop scheduling," Cluster Computing, vol. 22, no. S2, pp. 3419-3429, 2019.
[7] Y. Jing and Z. Zhang, "A study on car flow organization in the loading end of heavy haul railway based on immune clonal selection algorithm," Neural Computing and Applications, vol. 31, no. 5, pp. 1455-1465, 2019.
[8] Z. Zareizadeh, M. S. Helfroush, A. Rahideh, and K. Kazemi, "A robust gene clustering algorithm based on clonal selection in multiobjective optimization framework," Expert Systems with Applications, vol. 113, no. 15, pp. 301-314, 2018.
[9] S. Kamada and T. Ichimura, "A generation method of immunological memory in clonal selection algorithm by using restricted Boltzmann machines," in Proceedings of the IEEE International Conference on Systems, October 2016.
[10] S. Mohapatra, P. M. Khilar, and R. Ranjan Swain, "Fault diagnosis in wireless sensor network using clonal selection principle and probabilistic neural network approach," International Journal of Communication Systems, vol. 32, no. 16, p. e4138, 2019.
[11] C. Yavuz Burcu, Y. Nilufer, and O. Ozkan, "Prediction of protein secondary structure with clonal selection algorithm and multilayer perceptron," IEEE Access, vol. 6, pp. 45256-45261, 2018.
[12] W. Luo and X. Lin, "Recent advances in clonal selection algorithms and applications," in Proceedings of the IEEE Symposium Series on Computational Intelligence, November 2018.
[13] A. Galvez, A. Iglesias, A. Avila, C. Otero, R. Arias, and C. Manchado, "Elitist clonal selection algorithm for optimal choice of free knots in B-spline data fitting," Applied Soft Computing, vol. 26, pp. 90-106, 2015.
[14] M. Gong, L. Jiao, and L. Zhang, "Baldwinian learning in clonal selection algorithm for optimization," Information Sciences, vol. 180, no. 8, pp. 1218-1236, 2010.
[15] B. S. Rao and K. Vaisakh, "Multi-objective adaptive clonal selection algorithm for solving optimal power flow considering multi-type FACTS devices and load uncertainty," Applied Soft Computing, vol. 23, pp. 286-297, 2014.
[16] L. Hong, C. L. Gong, J. Z. Wang, and Z. C. Ji, "Convergence rate estimation of elitist clonal selection algorithm," Acta Electronica Sinica, vol. 43, no. 5, pp. 916-921, 2015, in Chinese.
[17] H. Ebbinghaus, Memory, Columbia University, New York, NY, USA, 1913.
[18] T. J. Ricker, E. Vergauwe, and N. Cowan, "Decay theory of immediate memory from Brown (1958) to today (2014)," Quarterly Journal of Experimental Psychology, vol. 69, no. 10, pp. 1969-1995, 2016.
[19] M. Anderson, "Rethinking interference theory: executive control and the mechanisms of forgetting," Journal of Memory and Language, vol. 49, no. 4, pp. 415-445, 2003.
[20] Y. Shuai, B. Lu, Y. Hu, L. Wang, K. Sun, and Y. Zhong, "Forgetting is regulated through Rac activity in Drosophila," Cell, vol. 140, no. 4, pp. 579-589, 2010.
[21] Y. Liu, S. Du, L. Lv et al., "Hippocampal activation of Rac1 regulates the forgetting of object recognition memory," Current Biology, vol. 26, no. 17, pp. 2351-2357, 2016.
[22] T. Tully, S. Boynton, C. Brandes et al., "Genetic dissection of memory formation in Drosophila melanogaster," Cold Spring Harbor Symposia on Quantitative Biology, vol. 55, no. 1, pp. 203-211, 1990.
[23] L. N. De Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle," IEEE Transactions on Evolutionary Computation, vol. 6, no. 3, pp. 239-251, 2002.
[24] B. Ulutas and A. A. Islier, "Dynamic facility layout problem in footwear industry," Journal of Manufacturing Systems, vol. 36, no. 36, pp. 55-61, 2015.
[25] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," 2018, http://www.sfu.ca/~ssurjano/optimization.html.
[26] W. Luo, X. Lin, T. Zhu, and P. Xu, "A clonal selection algorithm for dynamic multimodal function optimization," Swarm and Evolutionary Computation, vol. 50, 2019.

10 Complexity

Page 5: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

forgetting was determined based on whether Rac1 proteinactivity reached the threshold

331 Affinity Calculation To simplify the calculation thetarget test function of antibody affinity to antigen is thefunction value which can be expressed as

Affabi f x1 x2 xD( 1113857 (1)

where xi | i 1 2 D1113864 1113865 is the antibody and D is the di-mension of the antibody

332 Cloning Method According to the affinity corre-sponding to the antibody and antigen the cloning methodperforms the cloning operation on the antibody e higherthe affinity the greater the number of antibodies that will becloned e specific cloning formula is

Abc abij

i isin [0 population size minus 1]

j max 1 intAffabi

+ a

alowast max clone1113888 11138891113888 1113889

111386811138681113868111386811138681113868111386811138681113868111386811138681113868111386811138681113868111386811138681113868

⎧⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎩

⎫⎪⎪⎪⎪⎪⎬

⎪⎪⎪⎪⎪⎭

agt 0

(2)

where population size represents the number of antibodiesin the candidate set of antibodies Affabi

represents the af-finity between the antibody abi and the antigen max clonerepresents the initial clone number and j represents theclone number of the antibody abi

333 Variation Method e mutation method aims at theabi of the cloned antibody and determines the mutationdegree of the antibody according to the affinity between theantibody and the antigen e higher the affinity the less thepossibility and degree of the mutation

e specific variation formula is

ab xi | i 1 2 D1113864 1113865

xi xi + random(minus a a) random(0 1)lt eminus rlowastaffabmax aff( )

xi random(0 1)gt eminus rlowastaffabMAX aff( )

⎧⎨

(3)

where r is the mutation rate a is the variation range agt 0and max aff is the maximum affinity of the concentratedantibody

334 Forgetting Method e method of forgetting deter-mines the necessity of antibody forgetting based on thesurvival time of the antibody the appropriate memory in-tensity and the activity of the Rac1 protein

e specific forgetting formula is

ab xi | i 1 2 D1113864 1113865

xi

random(minDomain maxDomain)abtime

abstrengthgt c

xi otherwise

⎧⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎩

(4)

where abtime is the antibody survival time abstrength is theappropriate memory strength and c is the Rac1 proteinactivity threshold

335 Algorithm Flow e flow of the improved algorithmproposed in this paper is shown in Algorithm1

e suspension conditions of the algorithms inAlgorithm 1can be determined according to specific needsCommon termination conditions include reaching themaximum value of the function evaluation and reaching themaximum number of generations

In the algorithm Rac1 protein activity is an inherentproperty of each candidate antibody which is calculatedbased on antibody survival time and appropriate memorystrength when the antibody is first selected into the can-didate set And it changes dynamically with the execution ofthe algorithm When the property value of the antibodyreaches the threshold it means that the antibody has notmutated in a better direction within the time we expect andit is not sufficiently competitive with other candidate an-tibodies So in the algorithm the antibody that meets thethreshold value will perform the forgetting operation

4 Experiment and Algorithm Evaluation

irteen kinds of testing were done to select CEC testfunction optimization algorithm functions as experimentaltest functions respectively using the test function of thepresented algorithm (FCSA) proposed in [13] the elitistclonal selection algorithm (ECSA) proposed in literature[14] Baldwinian learning in clonal selection algorithm(BCSA) and search accuracy of the genetic algorithm (GA)for testing e experimental steps are as follows

First initialize various parameters of the algorithm etermination criterion in this experiment is to run the GABCSA ECSA and FCSA until the number of functionevaluations reaches the maximum value of 350000

Second find the optimal solution of the test functionree algorithms were executed to obtain the optimal so-lution generated by each execution of the algorithm eaverage optimal solution maximum optimal solution andminimum optimal solution after 100 executions wereanalyzed

e purpose of the experiment in this paper is to analyzethe effectiveness of the forgetting mechanism applied to theclonal selection algorithm e performance of the algo-rithm is mainly evaluated by the quality of the results ob-tained when the suspension conditions are consistent is

Complexity 5

article counts the mean and standard deviation of the resultsof multiple runs of the algorithm to evaluate the quality ofthe results ese two indicators reflect the concentrationtrend and the degree of dispersion of the experimental datarespectively erefore this paper uses these two indicatorsto verify the effectiveness of the improved algorithm

Finally we obtained the results of GA CSA and FCSArunning at 1000 generations and plotted them as line chartse purpose is to analyze the accuracy and speed of thealgorithm by characterizing the relationship between gen-erations and algorithm results

Among them the algorithm parameters are set as shownin Table 1 and the execution environment of the algorithm isshown in Table 2

41 Test Function e test functions selected in this paperare shown in Table 3eir common feature is that they havea global minimum value [25] and the function image iscomplex with multiple local minimum values Converselythe opposite value of the test function is the globalmaximum

Consider the test functions in Table 3 Since antibodieswith high affinity are generally selected when comparing theaffinity of antigens and antibodies this paper takes theopposite value corresponding to these test functions as theglobal maximum value which is equivalent to the globalminimum value of the test function in Table 3 e oppositeof the trial function is the global maximum

42 Experimental Results e results of our experiment areshown in Table 4 e closer the average value of the optimalsolution obtained by each algorithm is to the reference valuethe higher the accuracy of the algorithm under the termi-nation condition For the case that the test functions f_1 f_6f_8 f_9 and f_12 have high-dimensional solutions in orderto verify the convergence of the FCSA in high-dimensionaltest function set d 50 and d 100 and compare the resultswith ECSA and BCSA as shown in Tables 5 and 6

After replacing the updating operator in BCSA with theforgetting operator set d 50 and d 100 to test theupdating ability of the forgetting operator compared withthe previous updating operator in different dimensions

FCSAInput N (the size of the population) n (the number of antibodies selected for cloning) nc (the number of clones)m (the degreeof variation) c (Rac1 protein activity threshold)Output the best antibody

(1) Begin(2) Randomly generate N antibodies to form the initial candidate set(3) while not meet algorithm termination conditions do(4) Calculate the affinity Affabi

of each antibody for antigen in the candidate set and record antibody survival time Tabi

(5) Sort the antibodies in the candidate set according to their affinity and put the best n antibodies into the antibody set Abs

(6) forabiinAbs

(7) Update the value of the appropriate memory of antibody abi Sabi+ 1 See CLONING METHOD clone antibody abi

according to nc and Affabi and put all antibodies obtained by cloning into antibody set Abc

(8) end for(9) forabiinAbc

(10) See VARIATIONMETHOD according to the degree of variationm and the affinity of the antibody for the antigen Affabito

mutate abi

(11) if antibody abi is a variant antibody(12) e abi survival time Tabi

0 e appropriate memory intensity Sabi 1

(13) end if(14) end for(15) Select the N antibodies with the highest antigen affinity in Abc and Ab to replace the N antibodies in Ab(16) See FORGETTING METHOD calculate the Rac1 protein activity of each antibody in Ab according to the ratio of Tabi

to Sabi

(17) if antibody abi Rac1 protein activitygt threshold(18) forget the antibody abi

(19) end if(20) end while(21) Choose the best antibody as the final output

ALGORITHM 1 Forgotten-based clonal selection algorithm

Table 1 Initialization parameters

Algorithm parameter GA CSA FCSA ECSA BCSACross rate 05 mdash mdash mdash mdashMutation rate 013 2 2 2 2Initial clone number mdash 5 5 5 5Rac1 threshold mdash mdash 3 mdash mdash

Table 2 Execution environment

OS Windows 10 professional editionCPU Intel(R) Core(TM) i3-3217U CPU 180GHZRAM 120GBCompiler version Python 36

6 Complexity

Tabl

e3

Test

functio

n

Test

functio

nEx

pressio

nOptim

um

AckleyFu

nctio

nf1(

x)

minus20

exp(

minus02

(1

d)

1113936d i1

x2 i

1113969

)minusexp(

(1

d)

1113936d i1cos(2π

xi)

)+20

+exp(

1)0

BukinFu

nctio

nn

6f2(

x)

100

|x2

minus001

x2 1|

1113969+001

|x1

+10

|0

Cross-in

-TrayFu

nctio

nf3(

x)

minus00001

(|sin

x1sin

x2exp(

|100

minus(

x2 1

+x2 2

1113969π

)|)|

+1)

01

minus206261

Drop-WaveFu

nctio

nf4(

x)

minus

(1

+cos(12

x2 1

+x2 2

1113969

)05(

x2 1

+x2 2)

+2)

minus1

Eggh

olderFu

nctio

nf5(

x)

minus

(x2

+47

)sin

(

|x

2+

(x12)

+47

|1113968

)minus

x1sin

(

|x

1minus

x2

minus47

|1113968

)minus9596407

Griew

ankFu

nctio

nf6(

x)

1113936

d i1

x2 i4000

minus1113937

d i1cos(

xi

iradic

)+1

0

HolderTableFu

nctio

nf7(

x)

minus

|sin(

x1)cos(

x2)exp(

|1minus

(

x2 1

+x2 2

1113969π

)|)|

minus192085

Levy

Functio

nf8(

x)

sin

2 (πw

1)+

1113936d

minus1

i1

(w

iminus1)

2 [1+10

sin2 (πw

i+1)

]+

(w

dminus1)

2 [1+sin

2 (2π

wd)]

where

wi

1

+(

ximinus14)

for

alli

1

d0

RastriginFu

nctio

nf9(

x)

10

d+

1113936d i1[

x2 i

minus10cos(2π

xi)

]0

SchafferFu

nctio

nn

2f10

(x

)05

+(sin

2 (x2 1

minusx2 2)

minus05

[1+0001(

x2 1

+x2 2)

]2)

0

SchafferFu

nctio

nn

4f11

(x

)05

+(cos(sin

(|x

2 1minus

x2 2|

))minus05

[1+0001(

x2 1

+x2 2)

]2)

05

Schw

efel

Functio

nf12

(x

)4189829

dminus

1113936d i1

xisin

(

|x

i|1113968

)0

Shub

ertF

unction

f13

(x

)

(1113936

5 i1

icos

((i+1)

x1

+i)

)(1113936

5 i1

icos

((i+1)

x2

+i)

)minus1867309

Complexity 7

Finally the comparison results of the three algorithms ofGA CSA and FCSA in the 1000th generation are shown inFigure 3 e abscissa represents the current generationnumber of the algorithm the ordinate represents the currentresult of the algorithm the dashed line represents GA andthe thin solid line represents CSAe line represents FCSA

In the test results of the Ackley Function the optimalinterval of FCSA in the process of 100 executions of thealgorithm is (minus 1 0) the optimal interval of CSA is (minus 15 0)and the optimal interval of GA is (minus 25 0) e convergencedegree of the GA is worse than that of CSA and FCSA andFCSA has the best degree of convergence

Among the test results of the Bukin Function n 6drop-wave Function Rastrigin Function and SchafferFunction n 4 the FCSA has the best optimization ac-curacy and convergence stability while the GA has a pooroptimization accuracy and convergence degree comparedwith CSA

In the test results of functions Cross-in-Tray FunctionHolder Table Function Levy Function and Shubert Func-tion the CSA and FCSA algorithms converge stably to the

global optimal among which FCSA has the optimal averagesearch accuracy and stability while the GA still has deviationpoints and the convergence is not stable

Table 4 Results of GA CSA and FCSA when D 2

ALGsGA CSA FCSA

Mean Std Mean Std Mean Stdf1 minus 50923e minus 001 51699e minus 001 minus 41302e minus 001 34025e minus 001 minus 19901e minus 001 20546e minus 001f2 minus 73939e+ 000 47669e+ 000 minus 12512e+ 000 63021e minus 001 minus 97695e minus 001 53091e minus 001f3 20601e+ 000 50685e minus 003 206255e+ 00 61577e minus 005 206258e+ 00 29248e minus 005f4 92563e minus 001 43546e minus 002 96047e minus 001 23679e minus 002 97154e minus 001 20048e minus 002f5 93533e+ 002 14155e minus 002 95836e+ 002 17805e+ 000 95926e+ 002 63550e minus 001f6 minus 16845e minus 002 27008e minus 002 minus 25327e minus 002 15556e minus 002 minus 23604e minus 002 13131e minus 002f7 18907e+ 001 38513ndash001 19203e+ 001 61897e minus 003 19206e+ 001 31154e minus 003f8 minus 97077e minus 003 22512e minus 002 minus 72359e minus 004 73241e minus 004 minus 33312e minus 004 35434e minus 004f9 minus 14031e+ 000 96354e minus 001 minus 15528e minus 001 15445e minus 001 minus 79310e minus 002 76989e minus 002f10 minus 20083e minus 004 64135e minus 004 minus 27670e minus 003 32928e minus 003 minus 11855e minus 003 16805e minus 003f11 minus 500096e minus 01 34944e minus 006 minus 500094e minus 01 17190e minus 006 minus 500094e minus 01 15678e minus 006f12 minus 25903e minus 003 48844e minus 003 minus 44234e minus 002 45491e minus 002 minus 30112e minus 002 31088e minus 002f13 16615e+ 002 23853e+ 001 18632e+ 002 39038e minus 001 18649e+ 002 29016e minus 001

Table 5 Results of BCSA ECSA and FCSA when D 50

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 12838e+ 001 23931e minus 001 minus 20213e+ 001 11482e minus 001 minus 12749e+ 001 23649e minus 001f6 minus 75633e+ 002 33740e+ 001 minus 74943e+ 002 34107e+ 001 minus 74891e+ 002 36480e+ 001f8 minus 25742e+ 002 17774e+ 001 minus 25256e+ 002 19350e+ 001 minus 24812e+ 002 16304e+ 001f9 minus 63444e+ 002 19519e+ 001 minus 63492e minus 001 19929e+ 001 minus 62661e+ 002 16450e+ 001f12 minus 14640e+ 004 31504e+ 002 minus 14646e+ 004 30211e+ 002 minus 14721e+ 004 27377e+ 002

Table 6 Results of BCSA ECSA and FCSA when D 100

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 13768e+ 001 11996e minus 001 minus 13756e+ 001 12061e minus 001 minus 13697e+ 001 11556e minus 001f6 minus 19237e+ 003 57814e+ 001 minus 19198e+ 003 64516e+ 001 minus 19070e+ 003 65405e+ 001f8 minus 71533e+ 002 25226e+ 001 minus minus 70166e+ 002 40096e+ 001 minus 69408e+ 002 27792e+ 001f9 minus 14416e+ 003 26102e+ 001 minus 14370e+ 003 32709e+ 001 minus 14283e+ 003 22737e+ 001f12 minus 32839e+ 004 45766e+ 002 minus 32866e+ 004 37953e+ 002 minus 32949e+ 004 36880e+ 002

Generation

1 39 77 115

153

191

229

267

305

343

381

419

457

495

533

571

609

647

685

723

761

799

837

875

913

951

989

0

5

10

15

20

25

Resu

lt

CSAGAFCSA

1000 gen result when ALGS = f_1 D = 2

Figure 3 Results of GA CSA and FCSA when D 2 andALGs f_1

8 Complexity

According to the test results of Eggholder Function theconvergence stability of the CSA and FCSA is worse thanGA but the optimization accuracy is better than GA

In the test results of Griewank Function the optimi-zation accuracy of the CSA and FCSA is better than that ofGA and GA has a few deviation points as well as poorconvergence stability With the improved CSA when d 50FCSA is the best in algorithm stability and when d 100FCSA is the best in algorithm optimization accuracy

In the test results of Schaffer Function n 2 and SchwefelFunction the GA has better convergence stability and op-timization accuracy than the CSA and FCSA For SchwefelFunction FCSA shows a more stable degree of convergencein higher dimensions while BCSA shows more accurateconvergence in higher dimensions

As can be seen from the experimental results in Table 7 themean value and standard deviation of the optimal solution inthe test functions f_1 f_6 f_8 and f_9 after the forgettingmechanism is introduced into the BCSA algorithm are bothbetter than the results of the original BCSA in Tables 5 and 6

On the other hand it can be seen from Figure 3 that theclone selection algorithm converges around 300 generationsand is very close to the global minimum of the function (theglobal minimum is 0) In particular the improved algorithmproposed in this paper is more accurate when comparedwith the comparison algorithm and it can always find betterresults when the generation is the same

43 Experiment Analysis According to Section 42 of thispaper respectively using GA CSA and FCSA on 13 kinds oftest functions for the optimal solution (global maximum)experimental data the following results can be seen Usingthe Ackley Function Bukin Function Eggholder FunctionLevy Function Rastrigin Function Shubert Function etcwhen using this algorithm in the initial experimental en-vironment which is the same as the CSA and GA optimi-zation under the condition of higher precision convergenceis stable and reliableeGA can only converge stably on thetest functions Schaffer Function n 2 and Schwefel Functionbut it cannot converge stably on the other 11 test functionsand it is easy to fall into local optimum

Overall the experimental results show that FCSA hashigher optimization accuracy and stability than CSA andFCSA has higher optimization accuracy and convergencestability than GA in most test functions

It can be seen from the high-dimensional experiments ofBCSA ECSA and FCSA that FCSA has more advantagesover ECSA and BCSA in terms of convergence stability and

accuracy Due to the characteristics of the test function itselfthe higher the dimension the more complex the functionchange is which leads to decreased optimization accuracyand stability of the algorithm

By applying the forgetting mechanism to BCSA thenumber of antibodies to be replaced by the original manualdefinition is changed to the number of antibodies to bereplaced by the affinity attribute of antibodiese forgettingmechanism has a positive effect on improving the conver-gence speed and convergence stability of such algorithms

5 Conclusion

To solve the problem that the CSA cannot in a timely wayeliminate antibodies that are not adapted to the newenvironment and thus form an antibody black hole wesee that by changing the receptor editing mechanism ofthe clonal selection algorithm to a new forgettingmechanism the antibody candidate set can be replacedand updated under the regulation of Rac1 protein Ex-periments show that FCSA is an effective improved al-gorithm compared with ECSA and BCSA in terms ofoptimization efficiency optimization accuracy andconvergence stability

Because FCSA changes the substitution step in thecurrent clonal selection algorithm it is better than theexisting improved clonal selection algorithm Howeverfrom the experimental performance in high-dimensionaltest function FCSA still has the problem of low optimizationprecision In the future the FCSA will be combined with theexisting improved clonal selection algorithm to furtheroptimize the precision and stability of high-dimensionaloptimization


This article reports the mean and standard deviation of the results of multiple runs of each algorithm to evaluate the quality of the results. These two indicators reflect, respectively, the central tendency and the degree of dispersion of the experimental data. Therefore, this paper uses these two indicators to verify the effectiveness of the improved algorithm.
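As a minimal sketch, the evaluation protocol can be written as follows; the run_algorithm callable is a hypothetical stand-in for one complete execution of GA, CSA, or FCSA that returns the best value found:

```python
import statistics

def summarize_runs(run_algorithm, n_runs=100):
    # Execute the optimizer n_runs times and summarize the best values
    # found: the mean captures the central tendency, the standard
    # deviation the degree of dispersion across runs.
    bests = [run_algorithm() for _ in range(n_runs)]
    return statistics.mean(bests), statistics.stdev(bests)
```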

Finally, we recorded the results of GA, CSA, and FCSA over 1000 generations and plotted them as line charts. The purpose is to analyze the accuracy and speed of each algorithm by characterizing the relationship between the generation number and the algorithm's result.
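A sketch of how such line charts can be produced, assuming each algorithm logs its best-so-far value per generation (the `histories` format is an assumption, not the authors' code; the line styles mirror the Figure 3 convention, where GA is dashed and CSA is a thin solid line):

```python
import matplotlib.pyplot as plt

def plot_histories(histories):
    # histories maps an algorithm name ("GA", "CSA", "FCSA") to a list
    # of per-generation results.
    styles = {"GA": "--", "CSA": "-", "FCSA": "-."}
    for name, history in histories.items():
        plt.plot(range(1, len(history) + 1), history,
                 styles.get(name, "-"), label=name)
    plt.xlabel("Generation")
    plt.ylabel("Result")
    plt.legend()
    plt.show()
```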

The algorithm parameters are set as shown in Table 1, and the execution environment of the algorithms is described in Table 2.

4.1. Test Functions. The test functions selected in this paper are shown in Table 3. Their common feature is that each has a known global minimum [25] and a complex landscape with multiple local minima.

Since antibodies with high affinity are generally the ones selected when comparing the affinity of antigens and antibodies, this paper takes the negation of each test function as the objective, so that its global maximum corresponds to the global minimum of the original function in Table 3.
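In code, this amounts to a one-line wrapper; a minimal sketch:

```python
def as_affinity(test_function):
    # Turn a minimization test function into an affinity to be
    # maximized: higher affinity means a point closer to the optimum.
    return lambda x: -test_function(x)
```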

4.2. Experimental Results. The results of our experiment are shown in Table 4. The closer the average value of the optimal solution obtained by an algorithm is to the reference value, the higher the accuracy of that algorithm under the termination condition. Because the test functions f_1, f_6, f_8, f_9, and f_12 admit high-dimensional instances, we set d = 50 and d = 100 to verify the convergence of FCSA on high-dimensional test functions and compare the results with ECSA and BCSA, as shown in Tables 5 and 6.

After replacing the updating operator in BCSA with the forgetting operator, we again set d = 50 and d = 100 to test the updating ability of the forgetting operator against the previous updating operator in different dimensions.

FCSA
Input: N (the size of the population), n (the number of antibodies selected for cloning), nc (the number of clones), m (the degree of variation), c (Rac1 protein activity threshold).
Output: the best antibody.

(1) Begin
(2) Randomly generate N antibodies to form the initial candidate set Ab.
(3) while the algorithm termination conditions are not met do
(4)   Calculate the affinity Aff_abi of each antibody abi for the antigen in the candidate set, and record each antibody's survival time T_abi.
(5)   Sort the antibodies in the candidate set by affinity and put the best n antibodies into the antibody set Abs.
(6)   for abi in Abs:
(7)     Update the appropriate memory intensity of antibody abi: S_abi + 1. Following the CLONING METHOD, clone antibody abi according to nc and Aff_abi, and put all antibodies obtained by cloning into the antibody set Abc.
(8)   end for
(9)   for abi in Abc:
(10)    Following the VARIATION METHOD, mutate abi according to the degree of variation m and the affinity Aff_abi of the antibody for the antigen.
(11)    if antibody abi is a variant antibody:
(12)      Reset its survival time T_abi = 0 and its appropriate memory intensity S_abi = 1.
(13)    end if
(14)  end for
(15)  Select the N antibodies with the highest antigen affinity from Abc and Ab to replace the N antibodies in Ab.
(16)  Following the FORGETTING METHOD, calculate the Rac1 protein activity of each antibody in Ab as the ratio of T_abi to S_abi.
(17)  if the Rac1 protein activity of antibody abi > threshold c:
(18)    Forget the antibody abi.
(19)  end if
(20) end while
(21) Choose the best antibody as the final output.

ALGORITHM 1: Forgetting-based clonal selection algorithm.
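To make the control flow concrete, here is a compact Python sketch of Algorithm 1. It is a sketch under stated assumptions, not the authors' implementation: the cloning method (more clones for higher-ranked antibodies), the variation method (Gaussian hypermutation whose scale grows as affinity rank drops), and the fate of forgotten antibodies (re-initialized at random) are plausible but assumed details.

```python
import random

class Antibody:
    # Candidate solution with the bookkeeping Algorithm 1 needs:
    # survival time T and appropriate memory intensity S.
    def __init__(self, x):
        self.x = x
        self.T = 0
        self.S = 1

def fcsa(affinity, dim, bounds, N=50, n=10, nc=5, m=2.0, c=3.0,
         generations=1000):
    lo, hi = bounds
    rand_point = lambda: [random.uniform(lo, hi) for _ in range(dim)]
    Ab = [Antibody(rand_point()) for _ in range(N)]          # step 2
    best = max(Ab, key=lambda ab: affinity(ab.x))

    for _ in range(generations):                             # step 3
        for ab in Ab:
            ab.T += 1                                        # step 4
        Ab.sort(key=lambda ab: affinity(ab.x), reverse=True)
        selected = Ab[:n]                                    # step 5

        Abc = []
        for rank, ab in enumerate(selected):                 # steps 6-8
            ab.S += 1                                        # step 7
            for _ in range(max(1, nc - rank)):               # assumed cloning
                clone = Antibody(list(ab.x))
                scale = m * (rank + 1) / n                   # assumed variation
                clone.x = [min(hi, max(lo, xi + random.gauss(0.0, scale)))
                           for xi in clone.x]
                clone.T, clone.S = 0, 1                      # step 12
                Abc.append(clone)

        # step 15: the N best antibodies from Abc and Ab replace Ab
        Ab = sorted(Ab + Abc, key=lambda ab: affinity(ab.x),
                    reverse=True)[:N]

        # steps 16-19: Rac1 activity T/S; forget inactive antibodies
        for i, ab in enumerate(Ab):
            if ab.T / ab.S > c:
                Ab[i] = Antibody(rand_point())               # assumed fate

        gen_best = max(Ab, key=lambda ab: affinity(ab.x))
        if affinity(gen_best.x) > affinity(best.x):
            best = gen_best
    return best.x                                            # step 21
```

For example, `fcsa(as_affinity(ackley), dim=2, bounds=(-32.768, 32.768))` would search the negated Ackley function f_1 (implemented after Table 3 below) for its global maximum 0, with the clone number and Rac1 threshold defaults taken from Table 1.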

Table 1: Initialization parameters.

Algorithm parameter     GA      CSA     FCSA    ECSA    BCSA
Cross rate              0.5     —       —       —       —
Mutation rate           0.13    2       2       2       2
Initial clone number    —       5       5       5       5
Rac1 threshold          —       —       3       —       —

Table 2: Execution environment.

OS                  Windows 10 Professional Edition
CPU                 Intel(R) Core(TM) i3-3217U CPU @ 1.80 GHz
RAM                 12.0 GB
Compiler version    Python 3.6


Table 3: Test functions (expression and global optimum).

Ackley Function: $f_1(x) = -20\exp\left(-0.2\sqrt{\tfrac{1}{d}\sum_{i=1}^{d}x_i^2}\right) - \exp\left(\tfrac{1}{d}\sum_{i=1}^{d}\cos(2\pi x_i)\right) + 20 + \exp(1)$; optimum 0.

Bukin Function N. 6: $f_2(x) = 100\sqrt{|x_2 - 0.01x_1^2|} + 0.01|x_1 + 10|$; optimum 0.

Cross-in-Tray Function: $f_3(x) = -0.0001\left(\left|\sin x_1 \sin x_2 \exp\left(\left|100 - \sqrt{x_1^2 + x_2^2}/\pi\right|\right)\right| + 1\right)^{0.1}$; optimum -2.06261.

Drop-Wave Function: $f_4(x) = -\dfrac{1 + \cos\left(12\sqrt{x_1^2 + x_2^2}\right)}{0.5(x_1^2 + x_2^2) + 2}$; optimum -1.

Eggholder Function: $f_5(x) = -(x_2 + 47)\sin\left(\sqrt{|x_2 + x_1/2 + 47|}\right) - x_1\sin\left(\sqrt{|x_1 - x_2 - 47|}\right)$; optimum -959.6407.

Griewank Function: $f_6(x) = \sum_{i=1}^{d}\dfrac{x_i^2}{4000} - \prod_{i=1}^{d}\cos\left(\dfrac{x_i}{\sqrt{i}}\right) + 1$; optimum 0.

Holder Table Function: $f_7(x) = -\left|\sin(x_1)\cos(x_2)\exp\left(\left|1 - \sqrt{x_1^2 + x_2^2}/\pi\right|\right)\right|$; optimum -19.2085.

Levy Function: $f_8(x) = \sin^2(\pi w_1) + \sum_{i=1}^{d-1}(w_i - 1)^2\left[1 + 10\sin^2(\pi w_i + 1)\right] + (w_d - 1)^2\left[1 + \sin^2(2\pi w_d)\right]$, where $w_i = 1 + (x_i - 1)/4$ for all $i = 1, \ldots, d$; optimum 0.

Rastrigin Function: $f_9(x) = 10d + \sum_{i=1}^{d}\left[x_i^2 - 10\cos(2\pi x_i)\right]$; optimum 0.

Schaffer Function N. 2: $f_{10}(x) = 0.5 + \dfrac{\sin^2(x_1^2 - x_2^2) - 0.5}{\left[1 + 0.001(x_1^2 + x_2^2)\right]^2}$; optimum 0.

Schaffer Function N. 4: $f_{11}(x) = 0.5 + \dfrac{\cos\left(\sin(|x_1^2 - x_2^2|)\right) - 0.5}{\left[1 + 0.001(x_1^2 + x_2^2)\right]^2}$; optimum 0.5.

Schwefel Function: $f_{12}(x) = 418.9829d - \sum_{i=1}^{d} x_i\sin\left(\sqrt{|x_i|}\right)$; optimum 0.

Shubert Function: $f_{13}(x) = \left(\sum_{i=1}^{5} i\cos((i+1)x_1 + i)\right)\left(\sum_{i=1}^{5} i\cos((i+1)x_2 + i)\right)$; optimum -186.7309.
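As a cross-check of the Table 3 expressions, here are straightforward Python versions of three of them (a sketch; vectorized NumPy forms would be equivalent):

```python
import math

def ackley(x):
    # Ackley function f_1 from Table 3; global minimum 0 at the origin.
    d = len(x)
    s1 = sum(xi ** 2 for xi in x)
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x)
    return (-20 * math.exp(-0.2 * math.sqrt(s1 / d))
            - math.exp(s2 / d) + 20 + math.e)

def rastrigin(x):
    # Rastrigin function f_9 from Table 3; global minimum 0 at the origin.
    return 10 * len(x) + sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def schwefel(x):
    # Schwefel function f_12 from Table 3; global minimum 0.
    return 418.9829 * len(x) - sum(xi * math.sin(math.sqrt(abs(xi)))
                                   for xi in x)
```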

Finally, the comparison of GA, CSA, and FCSA over 1000 generations is shown in Figure 3. The abscissa is the current generation number and the ordinate is the algorithm's current result; the dashed line represents GA, the thin solid line represents CSA, and the remaining line represents FCSA.

In the test results of the Ackley Function, over 100 executions of the algorithms the optimal values of FCSA fall in the interval (-1, 0), those of CSA in (-1.5, 0), and those of GA in (-2.5, 0). The convergence of GA is worse than that of CSA and FCSA, and FCSA converges best.

Among the test results of the Bukin Function N. 6, Drop-Wave Function, Rastrigin Function, and Schaffer Function N. 4, FCSA has the best optimization accuracy and convergence stability, while GA has poorer optimization accuracy and convergence than CSA.

In the test results of the Cross-in-Tray Function, Holder Table Function, Levy Function, and Shubert Function, CSA and FCSA converge stably to the global optimum, with FCSA achieving the best average search accuracy and stability, while GA still produces deviation points and does not converge stably.

Table 4: Results of GA, CSA, and FCSA when D = 2.

       GA                            CSA                           FCSA
       Mean           Std            Mean           Std            Mean           Std
f1    -5.0923e-001   5.1699e-001    -4.1302e-001   3.4025e-001    -1.9901e-001   2.0546e-001
f2    -7.3939e+000   4.7669e+000    -1.2512e+000   6.3021e-001    -9.7695e-001   5.3091e-001
f3     2.0601e+000   5.0685e-003     2.06255e+000  6.1577e-005     2.06258e+000  2.9248e-005
f4     9.2563e-001   4.3546e-002     9.6047e-001   2.3679e-002     9.7154e-001   2.0048e-002
f5     9.3533e+002   1.4155e-002     9.5836e+002   1.7805e+000     9.5926e+002   6.3550e-001
f6    -1.6845e-002   2.7008e-002    -2.5327e-002   1.5556e-002    -2.3604e-002   1.3131e-002
f7     1.8907e+001   3.8513e-001     1.9203e+001   6.1897e-003     1.9206e+001   3.1154e-003
f8    -9.7077e-003   2.2512e-002    -7.2359e-004   7.3241e-004    -3.3312e-004   3.5434e-004
f9    -1.4031e+000   9.6354e-001    -1.5528e-001   1.5445e-001    -7.9310e-002   7.6989e-002
f10   -2.0083e-004   6.4135e-004    -2.7670e-003   3.2928e-003    -1.1855e-003   1.6805e-003
f11   -5.00096e-001  3.4944e-006    -5.00094e-001  1.7190e-006    -5.00094e-001  1.5678e-006
f12   -2.5903e-003   4.8844e-003    -4.4234e-002   4.5491e-002    -3.0112e-002   3.1088e-002
f13    1.6615e+002   2.3853e+001     1.8632e+002   3.9038e-001     1.8649e+002   2.9016e-001

Table 5: Results of BCSA, ECSA, and FCSA when D = 50.

       BCSA                          ECSA                          FCSA
       Mean           Std            Mean           Std            Mean           Std
f1    -1.2838e+001   2.3931e-001    -2.0213e+001   1.1482e-001    -1.2749e+001   2.3649e-001
f6    -7.5633e+002   3.3740e+001    -7.4943e+002   3.4107e+001    -7.4891e+002   3.6480e+001
f8    -2.5742e+002   1.7774e+001    -2.5256e+002   1.9350e+001    -2.4812e+002   1.6304e+001
f9    -6.3444e+002   1.9519e+001    -6.3492e+002   1.9929e+001    -6.2661e+002   1.6450e+001
f12   -1.4640e+004   3.1504e+002    -1.4646e+004   3.0211e+002    -1.4721e+004   2.7377e+002

Table 6: Results of BCSA, ECSA, and FCSA when D = 100.

       BCSA                          ECSA                          FCSA
       Mean           Std            Mean           Std            Mean           Std
f1    -1.3768e+001   1.1996e-001    -1.3756e+001   1.2061e-001    -1.3697e+001   1.1556e-001
f6    -1.9237e+003   5.7814e+001    -1.9198e+003   6.4516e+001    -1.9070e+003   6.5405e+001
f8    -7.1533e+002   2.5226e+001    -7.0166e+002   4.0096e+001    -6.9408e+002   2.7792e+001
f9    -1.4416e+003   2.6102e+001    -1.4370e+003   3.2709e+001    -1.4283e+003   2.2737e+001
f12   -3.2839e+004   4.5766e+002    -3.2866e+004   3.7953e+002    -3.2949e+004   3.6880e+002

[Figure 3: Results of GA, CSA, and FCSA when D = 2 and ALGs = f_1. Line chart of the result (y-axis, 0 to 25) against the generation number (x-axis, 1 to 1000) for CSA, GA, and FCSA.]


According to the test results of the Eggholder Function, the convergence stability of CSA and FCSA is worse than that of GA, but their optimization accuracy is better.

In the test results of the Griewank Function, the optimization accuracy of CSA and FCSA is better than that of GA, and GA has a few deviation points as well as poor convergence stability. Among the improved CSAs, FCSA has the best stability when d = 50 and the best optimization accuracy when d = 100.

In the test results of the Schaffer Function N. 2 and the Schwefel Function, GA has better convergence stability and optimization accuracy than CSA and FCSA. For the Schwefel Function in higher dimensions, FCSA converges more stably, while BCSA converges more accurately.

As can be seen from the experimental results in Table 7, after the forgetting mechanism is introduced into BCSA, both the mean value and the standard deviation of the optimal solution on the test functions f_1, f_6, f_8, and f_9 are better than the results of the original BCSA in Tables 5 and 6.

On the other hand, Figure 3 shows that the clonal selection algorithms converge around generation 300 and come very close to the global minimum of the function (which is 0). In particular, the improved algorithm proposed in this paper is more accurate than the comparison algorithms and consistently finds better results at the same generation.

4.3. Experiment Analysis. From the experimental data obtained in Section 4.2 by running GA, CSA, and FCSA on the 13 test functions to search for the optimal solution (global maximum), the following can be seen. On the Ackley Function, Bukin Function, Eggholder Function, Levy Function, Rastrigin Function, Shubert Function, and similar functions, FCSA converges stably and reliably with higher precision than CSA and GA in the same initial experimental environment. GA converges stably only on the Schaffer Function N. 2 and the Schwefel Function; it cannot converge stably on the other 11 test functions and easily falls into local optima.

Overall, the experimental results show that FCSA has higher optimization accuracy and stability than CSA and, on most test functions, higher optimization accuracy and convergence stability than GA.

The high-dimensional experiments with BCSA, ECSA, and FCSA show that FCSA has an advantage over ECSA and BCSA in convergence stability and accuracy. Owing to the characteristics of the test functions themselves, the higher the dimension, the more complex the function landscape becomes, which reduces the optimization accuracy and stability of all the algorithms.

By applying the forgetting mechanism to BCSA, the number of antibodies to be replaced is no longer a manually defined constant but is instead determined by the affinity-related attributes of the antibodies themselves. The forgetting mechanism thus has a positive effect on the convergence speed and convergence stability of such algorithms.
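The contrast can be sketched as follows, assuming antibodies carry the affinity, T, and S attributes used in the FCSA sketch above; both helper names and the new_antibody factory are hypothetical:

```python
def replace_fixed(population, k, new_antibody):
    # BCSA-style update: always replace the k worst antibodies,
    # where k is a manually chosen constant.
    population.sort(key=lambda ab: ab.affinity, reverse=True)
    population[-k:] = [new_antibody() for _ in range(k)]

def replace_by_forgetting(population, c, new_antibody):
    # Forgetting-style update: how many antibodies are replaced is not
    # fixed in advance but emerges from each antibody's Rac1 activity,
    # the ratio of survival time T to memory intensity S.
    for i, ab in enumerate(population):
        if ab.T / ab.S > c:
            population[i] = new_antibody()
```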

5. Conclusion

To solve the problem that the CSA cannot eliminate in a timely way antibodies that are not adapted to the new environment, which thus form an antibody "black hole", we replace the receptor editing mechanism of the clonal selection algorithm with a new forgetting mechanism, so that the antibody candidate set is replaced and updated under the regulation of Rac1 protein activity. Experiments show that FCSA is an effective improvement over ECSA and BCSA in terms of optimization efficiency, optimization accuracy, and convergence stability.

Because FCSA changes the substitution step of the current clonal selection algorithm, it outperforms the existing improved clonal selection algorithms. However, the experiments on high-dimensional test functions show that FCSA still suffers from limited optimization precision there. In the future, FCSA will be combined with existing improved clonal selection algorithms to further raise the precision and stability of high-dimensional optimization.

We also note that Luo et al. [26] proposed a clonal selection method for dynamic multimodal optimization problems and demonstrated its effectiveness. When the global peaks of the problem change over time, how to use the forgetting mechanism to adapt quickly to the new heights of the global peaks and to forget outdated experience in time will be a direction of our future research.

At the same time, as an effective updating mechanism, the forgetting mechanism can also be applied to other heuristic algorithms that need to update their populations.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Table 7: Results of the BCSA algorithm combined with the forgetting mechanism when D = 50 and D = 100.

       D = 50                        D = 100
       Mean           Std            Mean           Std
f1    -8.5121e+000   1.4299e-001    -9.1801e+000   9.4149e-002
f6    -7.5371e+002   3.7047e+001    -1.9109e+003   7.2792e+001
f8    -2.4475e+002   1.9376e+001    -6.8859e+002   3.1488e+001
f9    -6.2326e+002   1.9837e+001    -1.4202e+003   3.0524e+001
f12   -1.4873e+004   2.7891e+002    -3.3390e+004   3.6963e+002


Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61977021), the National Social Science Fund (15CGL074), the Intelligent Information Processing and Real Time Industrial System Hubei Provincial Key Laboratory Open Fund Project (znxx2018MS05), and the Open Project of the Hubei Key Laboratory of Applied Mathematics (HBAM201902).

References

[1] W. Zhang, K. Gao, W. Zhang, X. Wang, Q. Zhang, and H. Wang, "A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization," Applied Intelligence, vol. 49, no. 2, pp. 819–836, 2019.
[2] G. Yang, H. Jin, and X. Zhu, "Optimization algorithm based on differential evolution and clonal selection mechanism," Computer Engineering and Applications, vol. 49, no. 10, pp. 50–52, 2013.
[3] J. Kim and P. Bentley, "Immune memory and gene library evolution in the dynamic clonal selection algorithm," Genetic Programming and Evolvable Machines, vol. 5, no. 4, pp. 361–391, 2004.
[4] D. Zhang, K. Wang, Y. Long, and X. Zhao, "Multi-objective clonal selection algorithm applying to biomass power plant location model," Journal of Geomatics, vol. 43, no. 2, pp. 19–23, 2018.
[5] M. Rafiei, T. Niknam, and M. H. Khooban, "Probabilistic electricity price forecasting by improved clonal selection algorithm and wavelet preprocessing," Neural Computing and Applications, vol. 28, no. 12, pp. 1–13, 2016.
[6] G. Lou and Z. Cai, "Improved hybrid immune clonal selection genetic algorithm and its application in hybrid shop scheduling," Cluster Computing, vol. 22, no. S2, pp. 3419–3429, 2019.
[7] Y. Jing and Z. Zhang, "A study on car flow organization in the loading end of heavy haul railway based on immune clonal selection algorithm," Neural Computing and Applications, vol. 31, no. 5, pp. 1455–1465, 2019.
[8] Z. Zareizadeh, M. S. Helfroush, A. Rahideh, and K. Kazemi, "A robust gene clustering algorithm based on clonal selection in multiobjective optimization framework," Expert Systems with Applications, vol. 113, no. 15, pp. 301–314, 2018.
[9] S. Kamada and T. Ichimura, "A generation method of immunological memory in clonal selection algorithm by using restricted Boltzmann machines," in Proceedings of the IEEE International Conference on Systems, October 2016.
[10] S. Mohapatra, P. M. Khilar, and R. Ranjan Swain, "Fault diagnosis in wireless sensor network using clonal selection principle and probabilistic neural network approach," International Journal of Communication Systems, vol. 32, no. 16, p. e4138, 2019.
[11] B. C. Yavuz, N. Yurtay, and O. Ozkan, "Prediction of protein secondary structure with clonal selection algorithm and multilayer perceptron," IEEE Access, vol. 6, pp. 45256–45261, 2018.
[12] W. Luo and X. Lin, "Recent advances in clonal selection algorithms and applications," in Proceedings of the IEEE Symposium Series on Computational Intelligence, November 2018.
[13] A. Gálvez, A. Iglesias, A. Avila, C. Otero, R. Arias, and C. Manchado, "Elitist clonal selection algorithm for optimal choice of free knots in B-spline data fitting," Applied Soft Computing, vol. 26, pp. 90–106, 2015.
[14] M. Gong, L. Jiao, and L. Zhang, "Baldwinian learning in clonal selection algorithm for optimization," Information Sciences, vol. 180, no. 8, pp. 1218–1236, 2010.
[15] B. S. Rao and K. Vaisakh, "Multi-objective adaptive clonal selection algorithm for solving optimal power flow considering multi-type FACTS devices and load uncertainty," Applied Soft Computing, vol. 23, pp. 286–297, 2014.
[16] L. Hong, C. L. Gong, J. Z. Wang, and Z. C. Ji, "Convergence rate estimation of elitist clonal selection algorithm," Acta Electronica Sinica, vol. 43, no. 5, pp. 916–921, 2015, in Chinese.
[17] H. Ebbinghaus, Memory, Columbia University, New York, NY, USA, 1913.
[18] T. J. Ricker, E. Vergauwe, and N. Cowan, "Decay theory of immediate memory from Brown (1958) to today (2014)," Quarterly Journal of Experimental Psychology, vol. 69, no. 10, pp. 1969–1995, 2016.
[19] M. Anderson, "Rethinking interference theory: executive control and the mechanisms of forgetting," Journal of Memory and Language, vol. 49, no. 4, pp. 415–445, 2003.
[20] Y. Shuai, B. Lu, Y. Hu, L. Wang, K. Sun, and Y. Zhong, "Forgetting is regulated through Rac activity in Drosophila," Cell, vol. 140, no. 4, pp. 579–589, 2010.
[21] Y. Liu, S. Du, L. Lv et al., "Hippocampal activation of Rac1 regulates the forgetting of object recognition memory," Current Biology, vol. 26, no. 17, pp. 2351–2357, 2016.
[22] T. Tully, S. Boynton, C. Brandes et al., "Genetic dissection of memory formation in Drosophila melanogaster," Cold Spring Harbor Symposia on Quantitative Biology, vol. 55, no. 1, pp. 203–211, 1990.
[23] L. N. De Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle," IEEE Transactions on Evolutionary Computation, vol. 6, no. 3, pp. 239–251, 2002.
[24] B. Ulutas and A. A. Islier, "Dynamic facility layout problem in footwear industry," Journal of Manufacturing Systems, vol. 36, pp. 55–61, 2015.
[25] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," 2018, http://www.sfu.ca/~ssurjano/optimization.html.
[26] W. Luo, X. Lin, T. Zhu, and P. Xu, "A clonal selection algorithm for dynamic multimodal function optimization," Swarm and Evolutionary Computation, vol. 50, 2019.

10 Complexity

Page 7: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

Tabl

e3

Test

functio

n

Test

functio

nEx

pressio

nOptim

um

AckleyFu

nctio

nf1(

x)

minus20

exp(

minus02

(1

d)

1113936d i1

x2 i

1113969

)minusexp(

(1

d)

1113936d i1cos(2π

xi)

)+20

+exp(

1)0

BukinFu

nctio

nn

6f2(

x)

100

|x2

minus001

x2 1|

1113969+001

|x1

+10

|0

Cross-in

-TrayFu

nctio

nf3(

x)

minus00001

(|sin

x1sin

x2exp(

|100

minus(

x2 1

+x2 2

1113969π

)|)|

+1)

01

minus206261

Drop-WaveFu

nctio

nf4(

x)

minus

(1

+cos(12

x2 1

+x2 2

1113969

)05(

x2 1

+x2 2)

+2)

minus1

Eggh

olderFu

nctio

nf5(

x)

minus

(x2

+47

)sin

(

|x

2+

(x12)

+47

|1113968

)minus

x1sin

(

|x

1minus

x2

minus47

|1113968

)minus9596407

Griew

ankFu

nctio

nf6(

x)

1113936

d i1

x2 i4000

minus1113937

d i1cos(

xi

iradic

)+1

0

HolderTableFu

nctio

nf7(

x)

minus

|sin(

x1)cos(

x2)exp(

|1minus

(

x2 1

+x2 2

1113969π

)|)|

minus192085

Levy

Functio

nf8(

x)

sin

2 (πw

1)+

1113936d

minus1

i1

(w

iminus1)

2 [1+10

sin2 (πw

i+1)

]+

(w

dminus1)

2 [1+sin

2 (2π

wd)]

where

wi

1

+(

ximinus14)

for

alli

1

d0

RastriginFu

nctio

nf9(

x)

10

d+

1113936d i1[

x2 i

minus10cos(2π

xi)

]0

SchafferFu

nctio

nn

2f10

(x

)05

+(sin

2 (x2 1

minusx2 2)

minus05

[1+0001(

x2 1

+x2 2)

]2)

0

SchafferFu

nctio

nn

4f11

(x

)05

+(cos(sin

(|x

2 1minus

x2 2|

))minus05

[1+0001(

x2 1

+x2 2)

]2)

05

Schw

efel

Functio

nf12

(x

)4189829

dminus

1113936d i1

xisin

(

|x

i|1113968

)0

Shub

ertF

unction

f13

(x

)

(1113936

5 i1

icos

((i+1)

x1

+i)

)(1113936

5 i1

icos

((i+1)

x2

+i)

)minus1867309

Complexity 7

Finally the comparison results of the three algorithms ofGA CSA and FCSA in the 1000th generation are shown inFigure 3 e abscissa represents the current generationnumber of the algorithm the ordinate represents the currentresult of the algorithm the dashed line represents GA andthe thin solid line represents CSAe line represents FCSA

In the test results of the Ackley Function the optimalinterval of FCSA in the process of 100 executions of thealgorithm is (minus 1 0) the optimal interval of CSA is (minus 15 0)and the optimal interval of GA is (minus 25 0) e convergencedegree of the GA is worse than that of CSA and FCSA andFCSA has the best degree of convergence

Among the test results of the Bukin Function n 6drop-wave Function Rastrigin Function and SchafferFunction n 4 the FCSA has the best optimization ac-curacy and convergence stability while the GA has a pooroptimization accuracy and convergence degree comparedwith CSA

In the test results of functions Cross-in-Tray FunctionHolder Table Function Levy Function and Shubert Func-tion the CSA and FCSA algorithms converge stably to the

global optimal among which FCSA has the optimal averagesearch accuracy and stability while the GA still has deviationpoints and the convergence is not stable

Table 4 Results of GA CSA and FCSA when D 2

ALGsGA CSA FCSA

Mean Std Mean Std Mean Stdf1 minus 50923e minus 001 51699e minus 001 minus 41302e minus 001 34025e minus 001 minus 19901e minus 001 20546e minus 001f2 minus 73939e+ 000 47669e+ 000 minus 12512e+ 000 63021e minus 001 minus 97695e minus 001 53091e minus 001f3 20601e+ 000 50685e minus 003 206255e+ 00 61577e minus 005 206258e+ 00 29248e minus 005f4 92563e minus 001 43546e minus 002 96047e minus 001 23679e minus 002 97154e minus 001 20048e minus 002f5 93533e+ 002 14155e minus 002 95836e+ 002 17805e+ 000 95926e+ 002 63550e minus 001f6 minus 16845e minus 002 27008e minus 002 minus 25327e minus 002 15556e minus 002 minus 23604e minus 002 13131e minus 002f7 18907e+ 001 38513ndash001 19203e+ 001 61897e minus 003 19206e+ 001 31154e minus 003f8 minus 97077e minus 003 22512e minus 002 minus 72359e minus 004 73241e minus 004 minus 33312e minus 004 35434e minus 004f9 minus 14031e+ 000 96354e minus 001 minus 15528e minus 001 15445e minus 001 minus 79310e minus 002 76989e minus 002f10 minus 20083e minus 004 64135e minus 004 minus 27670e minus 003 32928e minus 003 minus 11855e minus 003 16805e minus 003f11 minus 500096e minus 01 34944e minus 006 minus 500094e minus 01 17190e minus 006 minus 500094e minus 01 15678e minus 006f12 minus 25903e minus 003 48844e minus 003 minus 44234e minus 002 45491e minus 002 minus 30112e minus 002 31088e minus 002f13 16615e+ 002 23853e+ 001 18632e+ 002 39038e minus 001 18649e+ 002 29016e minus 001

Table 5 Results of BCSA ECSA and FCSA when D 50

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 12838e+ 001 23931e minus 001 minus 20213e+ 001 11482e minus 001 minus 12749e+ 001 23649e minus 001f6 minus 75633e+ 002 33740e+ 001 minus 74943e+ 002 34107e+ 001 minus 74891e+ 002 36480e+ 001f8 minus 25742e+ 002 17774e+ 001 minus 25256e+ 002 19350e+ 001 minus 24812e+ 002 16304e+ 001f9 minus 63444e+ 002 19519e+ 001 minus 63492e minus 001 19929e+ 001 minus 62661e+ 002 16450e+ 001f12 minus 14640e+ 004 31504e+ 002 minus 14646e+ 004 30211e+ 002 minus 14721e+ 004 27377e+ 002

Table 6 Results of BCSA ECSA and FCSA when D 100

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 13768e+ 001 11996e minus 001 minus 13756e+ 001 12061e minus 001 minus 13697e+ 001 11556e minus 001f6 minus 19237e+ 003 57814e+ 001 minus 19198e+ 003 64516e+ 001 minus 19070e+ 003 65405e+ 001f8 minus 71533e+ 002 25226e+ 001 minus minus 70166e+ 002 40096e+ 001 minus 69408e+ 002 27792e+ 001f9 minus 14416e+ 003 26102e+ 001 minus 14370e+ 003 32709e+ 001 minus 14283e+ 003 22737e+ 001f12 minus 32839e+ 004 45766e+ 002 minus 32866e+ 004 37953e+ 002 minus 32949e+ 004 36880e+ 002

Generation

1 39 77 115

153

191

229

267

305

343

381

419

457

495

533

571

609

647

685

723

761

799

837

875

913

951

989

0

5

10

15

20

25

Resu

lt

CSAGAFCSA

1000 gen result when ALGS = f_1 D = 2

Figure 3 Results of GA CSA and FCSA when D 2 andALGs f_1

8 Complexity

According to the test results of Eggholder Function theconvergence stability of the CSA and FCSA is worse thanGA but the optimization accuracy is better than GA

In the test results of Griewank Function the optimi-zation accuracy of the CSA and FCSA is better than that ofGA and GA has a few deviation points as well as poorconvergence stability With the improved CSA when d 50FCSA is the best in algorithm stability and when d 100FCSA is the best in algorithm optimization accuracy

In the test results of Schaffer Function n 2 and SchwefelFunction the GA has better convergence stability and op-timization accuracy than the CSA and FCSA For SchwefelFunction FCSA shows a more stable degree of convergencein higher dimensions while BCSA shows more accurateconvergence in higher dimensions

As can be seen from the experimental results in Table 7 themean value and standard deviation of the optimal solution inthe test functions f_1 f_6 f_8 and f_9 after the forgettingmechanism is introduced into the BCSA algorithm are bothbetter than the results of the original BCSA in Tables 5 and 6

On the other hand it can be seen from Figure 3 that theclone selection algorithm converges around 300 generationsand is very close to the global minimum of the function (theglobal minimum is 0) In particular the improved algorithmproposed in this paper is more accurate when comparedwith the comparison algorithm and it can always find betterresults when the generation is the same

43 Experiment Analysis According to Section 42 of thispaper respectively using GA CSA and FCSA on 13 kinds oftest functions for the optimal solution (global maximum)experimental data the following results can be seen Usingthe Ackley Function Bukin Function Eggholder FunctionLevy Function Rastrigin Function Shubert Function etcwhen using this algorithm in the initial experimental en-vironment which is the same as the CSA and GA optimi-zation under the condition of higher precision convergenceis stable and reliableeGA can only converge stably on thetest functions Schaffer Function n 2 and Schwefel Functionbut it cannot converge stably on the other 11 test functionsand it is easy to fall into local optimum

Overall the experimental results show that FCSA hashigher optimization accuracy and stability than CSA andFCSA has higher optimization accuracy and convergencestability than GA in most test functions

It can be seen from the high-dimensional experiments ofBCSA ECSA and FCSA that FCSA has more advantagesover ECSA and BCSA in terms of convergence stability and

accuracy Due to the characteristics of the test function itselfthe higher the dimension the more complex the functionchange is which leads to decreased optimization accuracyand stability of the algorithm

By applying the forgetting mechanism to BCSA thenumber of antibodies to be replaced by the original manualdefinition is changed to the number of antibodies to bereplaced by the affinity attribute of antibodiese forgettingmechanism has a positive effect on improving the conver-gence speed and convergence stability of such algorithms

5 Conclusion

To solve the problem that the CSA cannot in a timely wayeliminate antibodies that are not adapted to the newenvironment and thus form an antibody black hole wesee that by changing the receptor editing mechanism ofthe clonal selection algorithm to a new forgettingmechanism the antibody candidate set can be replacedand updated under the regulation of Rac1 protein Ex-periments show that FCSA is an effective improved al-gorithm compared with ECSA and BCSA in terms ofoptimization efficiency optimization accuracy andconvergence stability

Because FCSA changes the substitution step in thecurrent clonal selection algorithm it is better than theexisting improved clonal selection algorithm Howeverfrom the experimental performance in high-dimensionaltest function FCSA still has the problem of low optimizationprecision In the future the FCSA will be combined with theexisting improved clonal selection algorithm to furtheroptimize the precision and stability of high-dimensionaloptimization

We also note that Luo et al [26] proposed a clonalselection method for dynamic multimodal optimizationproblems and proved the effectiveness of the method Whenthe global peaks of the problem change with time how to usethe forgetting mechanism to quickly adapt to the new heightof global peaks and forget the outdated experience in timewill be our future research direction

At the same time as an effective updating mechanismthe forgetting mechanism can also be applied to otherheuristic algorithms that need to update the population ofalgorithms

Data Availability

e data used to support the findings of this study areavailable from the corresponding author upon request

Table 7 Results of the BCSA algorithm combined with the forgetting mechanism when D 50 100

ALGsD 50 D 100

Mean Std Mean Stdf1 minus 85121e+ 000 14299e minus 001 minus 91801e+ 000 94149e minus 002f6 minus 75371e+ 002 37047e+ 001 minus 19109e+ 003 72792e+ 001f8 minus 24475e+ 002 19376e+ 001 minus 68859e+ 002 31488e+ 001f9 minus 62326e+ 002 19837e+ 001 minus 14202e+ 003 30524e+ 001f12 minus 14873e+ 004 27891e+ 002 minus 33390e+ 004 36963e+ 002

Complexity 9

Conflicts of Interest

e authors declare that they have no conflicts of interest

Acknowledgments

is work was supported by the National Natural ScienceFoundation of China (61977021) National Social ScienceFund (15CGL074) Intelligent Information Processing andReal Time Industrial System Hubei Provincial Key Labo-ratory Open Fund Project (znxx2018MS05) and Openproject of Hubei Key Laboratory of Applied Mathematics(HBAM201902)

References

[1] W Zhang K Gao W Zhang X Wang Q Zhang andH Wang ldquoA hybrid clonal selection algorithm with modifiedcombinatorial recombination and success-history basedadaptive mutation for numerical optimizationrdquo Applied In-telligence vol 49 no 2 pp 819ndash836 2019

[2] G Yang H Jin and X zhu ldquoOptimization algorithm basedon differential evolution and clonal selection mechanismrdquoComputer Engineering and Applications vol 49 no 10pp 50ndash52 2013

[3] J Kim and P Bentley ldquoImmune memory and gene libraryevolution in the dynamic clonal selection algorithmrdquo GeneticProgramming and Evolvable Machines vol 5 no 4pp 361ndash391 2004

[4] D Zhang K Wang Y Long and X Zhao ldquoMulti-objectiveclonal selection algorithm applying to biomass power plantlocation modelrdquo Journal of Geomatics vol 43 no 2pp 19ndash23 2018

[5] M Rafiei T Niknam and M H Khooban ldquoProbabilisticelectricity price forecasting by improved clonal selection al-gorithm and wavelet preprocessingrdquo Neural Computing andApplications vol 28 no 12 pp 1ndash13 2016

[6] G Lou and Z Cai ldquoImproved hybrid immune clonal selectiongenetic algorithm and its application in hybrid shop sched-ulingrdquo Cluster Computing vol 22 no S2 pp 3419ndash34292019

[7] Y Jing and Z Zhang ldquoA study on car flow organization in theloading end of heavy haul railway based on immune clonalselection algorithmrdquo Neural Computing and Applicationsvol 31 no 5 pp 1455ndash1465 2019

[8] Z Zareizadeh M S Helfroush A Rahideh and K KazemildquoA robust gene clustering algorithm based on clonal selectionin multiobjective optimization frameworkrdquo Expert Systemswith Applications vol 113 no 15 pp 301ndash314 2018

[9] S Kamada and T Ichimura ldquoA generation method of im-munological memory in clonal selection algorithm by usingrestricted boltzmann machinesrdquo in Proceedings of the IEEEInternational Conference on Systems October 2016

[10] S Mohapatra P M Khilar and R Ranjan Swain ldquoFaultdiagnosis in wireless sensor network using clonal selectionprinciple and probabilistic neural network approachrdquo In-ternational Journal of Communication Systems vol 32 no 16p e4138 2019

[11] C Yavuz Burcu Y Nilufer and O Ozkan ldquoPrediction ofprotein secondary structure with clonal selection algorithmand multilayer perceptronrdquo IEEE ACCESS vol 6pp 45256ndash45261 2018

[12] W Luo and X Lin ldquoRecent advances in clonal selectionalgorithms and applicationsrdquo in Proceedings of the IEEESymposium Series on Computational Intelligence November2018

[13] A Galvez A Iglesias A Avila C Otero R Arias andC Manchado ldquoElitist clonal selection algorithm for optimalchoice of free knots in B-spline data fittingrdquo Applied SoftComputing vol 26 pp 90ndash106 2015

[14] M Gong L Jiao and L Zhang ldquoBaldwinian learning in clonalselection algorithm for optimizationrdquo Information Sciencesvol 180 no 8 pp 1218ndash1236 2010

[15] B S Rao and K Vaisakh ldquoMulti-objective adaptive clonalselection algorithm for solving optimal power flow consid-ering multi-type FACTS devices and load uncertaintyrdquo Ap-plied Soft Computing vol 23 pp 286ndash297 2014

[16] L Hong C L Gong J Z Wang and Z C Ji ldquoConvergencerate estimation of elitist clonal selection algorithmrdquo ActaElectronica Sinica vol 43 no 5 pp 916ndash921 2015 inChinese

[17] H Ebbinghaus Memory Columbia University New YorkNY USA 1913

[18] T J Ricker E Vergauwe and N Cowan ldquoDecay theory ofimmediate memory from Brown (1958) to today (2014)rdquoQuarterly Journal of Experimental Psychology vol 69 no 10pp 1969ndash1995 2016

[19] M Anderson ldquoRethinking interference theory executivecontrol and the mechanisms of forgettingrdquo Journal of Memoryand Language vol 49 no 4 pp 415ndash445 2003

[20] Y Shuai B Lu Y Hu L Wang K Sun and Y ZhongldquoForgetting is regulated through rac activity in DrosophilardquoCell vol 140 no 4 pp 579ndash589 2010

[21] Y Liu S Du L Lv et al ldquoHippocampal activation of Rac1regulates the forgetting of object recognition memoryrdquoCurrent Biology vol 26 no 17 pp 2351ndash2357 2016

[22] T Tully S Boynton C Brandes et al ldquoGenetic dissection ofmemory formation in Drosophila melanogasterrdquo Cold SpringHarbor Symposia on Quantitative Biology vol 55 no 1pp 203ndash211 1990

[23] L N De Castro and F J Von Zuben ldquoLearning and opti-mization using the clonal selection principlerdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 3pp 239ndash251 2002

[24] B Ulutas and A A Islier ldquoDynamic facility layout problem infootwear industryrdquo Journal of Manufacturing Systems vol 36no 36 pp 55ndash61 2015

[25] S Surjanovic and D Bingham ldquoVirtual library of simulationexperiments test functions and datasetsrdquo 2018 httpwwwsfucasimssurjanooptimizationhtml

[26] W Luo X Lin T Zhu and P Xu ldquoA clonal selection al-gorithm for dynamic multimodal function optimizationrdquoSwarm and Evolutionary Computation vol 50 2019

10 Complexity

Page 8: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

Finally the comparison results of the three algorithms ofGA CSA and FCSA in the 1000th generation are shown inFigure 3 e abscissa represents the current generationnumber of the algorithm the ordinate represents the currentresult of the algorithm the dashed line represents GA andthe thin solid line represents CSAe line represents FCSA

In the test results of the Ackley Function the optimalinterval of FCSA in the process of 100 executions of thealgorithm is (minus 1 0) the optimal interval of CSA is (minus 15 0)and the optimal interval of GA is (minus 25 0) e convergencedegree of the GA is worse than that of CSA and FCSA andFCSA has the best degree of convergence

Among the test results of the Bukin Function n 6drop-wave Function Rastrigin Function and SchafferFunction n 4 the FCSA has the best optimization ac-curacy and convergence stability while the GA has a pooroptimization accuracy and convergence degree comparedwith CSA

In the test results of functions Cross-in-Tray FunctionHolder Table Function Levy Function and Shubert Func-tion the CSA and FCSA algorithms converge stably to the

global optimal among which FCSA has the optimal averagesearch accuracy and stability while the GA still has deviationpoints and the convergence is not stable

Table 4 Results of GA CSA and FCSA when D 2

ALGsGA CSA FCSA

Mean Std Mean Std Mean Stdf1 minus 50923e minus 001 51699e minus 001 minus 41302e minus 001 34025e minus 001 minus 19901e minus 001 20546e minus 001f2 minus 73939e+ 000 47669e+ 000 minus 12512e+ 000 63021e minus 001 minus 97695e minus 001 53091e minus 001f3 20601e+ 000 50685e minus 003 206255e+ 00 61577e minus 005 206258e+ 00 29248e minus 005f4 92563e minus 001 43546e minus 002 96047e minus 001 23679e minus 002 97154e minus 001 20048e minus 002f5 93533e+ 002 14155e minus 002 95836e+ 002 17805e+ 000 95926e+ 002 63550e minus 001f6 minus 16845e minus 002 27008e minus 002 minus 25327e minus 002 15556e minus 002 minus 23604e minus 002 13131e minus 002f7 18907e+ 001 38513ndash001 19203e+ 001 61897e minus 003 19206e+ 001 31154e minus 003f8 minus 97077e minus 003 22512e minus 002 minus 72359e minus 004 73241e minus 004 minus 33312e minus 004 35434e minus 004f9 minus 14031e+ 000 96354e minus 001 minus 15528e minus 001 15445e minus 001 minus 79310e minus 002 76989e minus 002f10 minus 20083e minus 004 64135e minus 004 minus 27670e minus 003 32928e minus 003 minus 11855e minus 003 16805e minus 003f11 minus 500096e minus 01 34944e minus 006 minus 500094e minus 01 17190e minus 006 minus 500094e minus 01 15678e minus 006f12 minus 25903e minus 003 48844e minus 003 minus 44234e minus 002 45491e minus 002 minus 30112e minus 002 31088e minus 002f13 16615e+ 002 23853e+ 001 18632e+ 002 39038e minus 001 18649e+ 002 29016e minus 001

Table 5 Results of BCSA ECSA and FCSA when D 50

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 12838e+ 001 23931e minus 001 minus 20213e+ 001 11482e minus 001 minus 12749e+ 001 23649e minus 001f6 minus 75633e+ 002 33740e+ 001 minus 74943e+ 002 34107e+ 001 minus 74891e+ 002 36480e+ 001f8 minus 25742e+ 002 17774e+ 001 minus 25256e+ 002 19350e+ 001 minus 24812e+ 002 16304e+ 001f9 minus 63444e+ 002 19519e+ 001 minus 63492e minus 001 19929e+ 001 minus 62661e+ 002 16450e+ 001f12 minus 14640e+ 004 31504e+ 002 minus 14646e+ 004 30211e+ 002 minus 14721e+ 004 27377e+ 002

Table 6 Results of BCSA ECSA and FCSA when D 100

ALGsBCSA ECSA FCSA

Mean Std Mean Std Mean Stdf1 minus 13768e+ 001 11996e minus 001 minus 13756e+ 001 12061e minus 001 minus 13697e+ 001 11556e minus 001f6 minus 19237e+ 003 57814e+ 001 minus 19198e+ 003 64516e+ 001 minus 19070e+ 003 65405e+ 001f8 minus 71533e+ 002 25226e+ 001 minus minus 70166e+ 002 40096e+ 001 minus 69408e+ 002 27792e+ 001f9 minus 14416e+ 003 26102e+ 001 minus 14370e+ 003 32709e+ 001 minus 14283e+ 003 22737e+ 001f12 minus 32839e+ 004 45766e+ 002 minus 32866e+ 004 37953e+ 002 minus 32949e+ 004 36880e+ 002

Generation

1 39 77 115

153

191

229

267

305

343

381

419

457

495

533

571

609

647

685

723

761

799

837

875

913

951

989

0

5

10

15

20

25

Resu

lt

CSAGAFCSA

1000 gen result when ALGS = f_1 D = 2

Figure 3 Results of GA CSA and FCSA when D 2 andALGs f_1

8 Complexity

According to the test results of Eggholder Function theconvergence stability of the CSA and FCSA is worse thanGA but the optimization accuracy is better than GA

In the test results of Griewank Function the optimi-zation accuracy of the CSA and FCSA is better than that ofGA and GA has a few deviation points as well as poorconvergence stability With the improved CSA when d 50FCSA is the best in algorithm stability and when d 100FCSA is the best in algorithm optimization accuracy

In the test results of Schaffer Function n 2 and SchwefelFunction the GA has better convergence stability and op-timization accuracy than the CSA and FCSA For SchwefelFunction FCSA shows a more stable degree of convergencein higher dimensions while BCSA shows more accurateconvergence in higher dimensions

As can be seen from the experimental results in Table 7 themean value and standard deviation of the optimal solution inthe test functions f_1 f_6 f_8 and f_9 after the forgettingmechanism is introduced into the BCSA algorithm are bothbetter than the results of the original BCSA in Tables 5 and 6

On the other hand it can be seen from Figure 3 that theclone selection algorithm converges around 300 generationsand is very close to the global minimum of the function (theglobal minimum is 0) In particular the improved algorithmproposed in this paper is more accurate when comparedwith the comparison algorithm and it can always find betterresults when the generation is the same

43 Experiment Analysis According to Section 42 of thispaper respectively using GA CSA and FCSA on 13 kinds oftest functions for the optimal solution (global maximum)experimental data the following results can be seen Usingthe Ackley Function Bukin Function Eggholder FunctionLevy Function Rastrigin Function Shubert Function etcwhen using this algorithm in the initial experimental en-vironment which is the same as the CSA and GA optimi-zation under the condition of higher precision convergenceis stable and reliableeGA can only converge stably on thetest functions Schaffer Function n 2 and Schwefel Functionbut it cannot converge stably on the other 11 test functionsand it is easy to fall into local optimum

Overall the experimental results show that FCSA hashigher optimization accuracy and stability than CSA andFCSA has higher optimization accuracy and convergencestability than GA in most test functions

It can be seen from the high-dimensional experiments ofBCSA ECSA and FCSA that FCSA has more advantagesover ECSA and BCSA in terms of convergence stability and

accuracy Due to the characteristics of the test function itselfthe higher the dimension the more complex the functionchange is which leads to decreased optimization accuracyand stability of the algorithm

By applying the forgetting mechanism to BCSA thenumber of antibodies to be replaced by the original manualdefinition is changed to the number of antibodies to bereplaced by the affinity attribute of antibodiese forgettingmechanism has a positive effect on improving the conver-gence speed and convergence stability of such algorithms

5 Conclusion

To solve the problem that the CSA cannot in a timely wayeliminate antibodies that are not adapted to the newenvironment and thus form an antibody black hole wesee that by changing the receptor editing mechanism ofthe clonal selection algorithm to a new forgettingmechanism the antibody candidate set can be replacedand updated under the regulation of Rac1 protein Ex-periments show that FCSA is an effective improved al-gorithm compared with ECSA and BCSA in terms ofoptimization efficiency optimization accuracy andconvergence stability

Because FCSA changes the substitution step in thecurrent clonal selection algorithm it is better than theexisting improved clonal selection algorithm Howeverfrom the experimental performance in high-dimensionaltest function FCSA still has the problem of low optimizationprecision In the future the FCSA will be combined with theexisting improved clonal selection algorithm to furtheroptimize the precision and stability of high-dimensionaloptimization

We also note that Luo et al [26] proposed a clonalselection method for dynamic multimodal optimizationproblems and proved the effectiveness of the method Whenthe global peaks of the problem change with time how to usethe forgetting mechanism to quickly adapt to the new heightof global peaks and forget the outdated experience in timewill be our future research direction

At the same time as an effective updating mechanismthe forgetting mechanism can also be applied to otherheuristic algorithms that need to update the population ofalgorithms

Data Availability

e data used to support the findings of this study areavailable from the corresponding author upon request

Table 7 Results of the BCSA algorithm combined with the forgetting mechanism when D 50 100

ALGsD 50 D 100

Mean Std Mean Stdf1 minus 85121e+ 000 14299e minus 001 minus 91801e+ 000 94149e minus 002f6 minus 75371e+ 002 37047e+ 001 minus 19109e+ 003 72792e+ 001f8 minus 24475e+ 002 19376e+ 001 minus 68859e+ 002 31488e+ 001f9 minus 62326e+ 002 19837e+ 001 minus 14202e+ 003 30524e+ 001f12 minus 14873e+ 004 27891e+ 002 minus 33390e+ 004 36963e+ 002

Complexity 9

Conflicts of Interest

e authors declare that they have no conflicts of interest

Acknowledgments

is work was supported by the National Natural ScienceFoundation of China (61977021) National Social ScienceFund (15CGL074) Intelligent Information Processing andReal Time Industrial System Hubei Provincial Key Labo-ratory Open Fund Project (znxx2018MS05) and Openproject of Hubei Key Laboratory of Applied Mathematics(HBAM201902)

References

[1] W Zhang K Gao W Zhang X Wang Q Zhang andH Wang ldquoA hybrid clonal selection algorithm with modifiedcombinatorial recombination and success-history basedadaptive mutation for numerical optimizationrdquo Applied In-telligence vol 49 no 2 pp 819ndash836 2019

[2] G Yang H Jin and X zhu ldquoOptimization algorithm basedon differential evolution and clonal selection mechanismrdquoComputer Engineering and Applications vol 49 no 10pp 50ndash52 2013

[3] J Kim and P Bentley ldquoImmune memory and gene libraryevolution in the dynamic clonal selection algorithmrdquo GeneticProgramming and Evolvable Machines vol 5 no 4pp 361ndash391 2004

[4] D Zhang K Wang Y Long and X Zhao ldquoMulti-objectiveclonal selection algorithm applying to biomass power plantlocation modelrdquo Journal of Geomatics vol 43 no 2pp 19ndash23 2018

[5] M Rafiei T Niknam and M H Khooban ldquoProbabilisticelectricity price forecasting by improved clonal selection al-gorithm and wavelet preprocessingrdquo Neural Computing andApplications vol 28 no 12 pp 1ndash13 2016

[6] G Lou and Z Cai ldquoImproved hybrid immune clonal selectiongenetic algorithm and its application in hybrid shop sched-ulingrdquo Cluster Computing vol 22 no S2 pp 3419ndash34292019

[7] Y Jing and Z Zhang ldquoA study on car flow organization in theloading end of heavy haul railway based on immune clonalselection algorithmrdquo Neural Computing and Applicationsvol 31 no 5 pp 1455ndash1465 2019

[8] Z Zareizadeh M S Helfroush A Rahideh and K KazemildquoA robust gene clustering algorithm based on clonal selectionin multiobjective optimization frameworkrdquo Expert Systemswith Applications vol 113 no 15 pp 301ndash314 2018

[9] S Kamada and T Ichimura ldquoA generation method of im-munological memory in clonal selection algorithm by usingrestricted boltzmann machinesrdquo in Proceedings of the IEEEInternational Conference on Systems October 2016

[10] S Mohapatra P M Khilar and R Ranjan Swain ldquoFaultdiagnosis in wireless sensor network using clonal selectionprinciple and probabilistic neural network approachrdquo In-ternational Journal of Communication Systems vol 32 no 16p e4138 2019

[11] C Yavuz Burcu Y Nilufer and O Ozkan ldquoPrediction ofprotein secondary structure with clonal selection algorithmand multilayer perceptronrdquo IEEE ACCESS vol 6pp 45256ndash45261 2018

[12] W Luo and X Lin ldquoRecent advances in clonal selectionalgorithms and applicationsrdquo in Proceedings of the IEEESymposium Series on Computational Intelligence November2018

[13] A Galvez A Iglesias A Avila C Otero R Arias andC Manchado ldquoElitist clonal selection algorithm for optimalchoice of free knots in B-spline data fittingrdquo Applied SoftComputing vol 26 pp 90ndash106 2015

[14] M Gong L Jiao and L Zhang ldquoBaldwinian learning in clonalselection algorithm for optimizationrdquo Information Sciencesvol 180 no 8 pp 1218ndash1236 2010

[15] B S Rao and K Vaisakh ldquoMulti-objective adaptive clonalselection algorithm for solving optimal power flow consid-ering multi-type FACTS devices and load uncertaintyrdquo Ap-plied Soft Computing vol 23 pp 286ndash297 2014

[16] L Hong C L Gong J Z Wang and Z C Ji ldquoConvergencerate estimation of elitist clonal selection algorithmrdquo ActaElectronica Sinica vol 43 no 5 pp 916ndash921 2015 inChinese

[17] H Ebbinghaus Memory Columbia University New YorkNY USA 1913

[18] T J Ricker E Vergauwe and N Cowan ldquoDecay theory ofimmediate memory from Brown (1958) to today (2014)rdquoQuarterly Journal of Experimental Psychology vol 69 no 10pp 1969ndash1995 2016

[19] M Anderson ldquoRethinking interference theory executivecontrol and the mechanisms of forgettingrdquo Journal of Memoryand Language vol 49 no 4 pp 415ndash445 2003

[20] Y Shuai B Lu Y Hu L Wang K Sun and Y ZhongldquoForgetting is regulated through rac activity in DrosophilardquoCell vol 140 no 4 pp 579ndash589 2010

[21] Y Liu S Du L Lv et al ldquoHippocampal activation of Rac1regulates the forgetting of object recognition memoryrdquoCurrent Biology vol 26 no 17 pp 2351ndash2357 2016

[22] T Tully S Boynton C Brandes et al ldquoGenetic dissection ofmemory formation in Drosophila melanogasterrdquo Cold SpringHarbor Symposia on Quantitative Biology vol 55 no 1pp 203ndash211 1990

[23] L N De Castro and F J Von Zuben ldquoLearning and opti-mization using the clonal selection principlerdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 3pp 239ndash251 2002

[24] B Ulutas and A A Islier ldquoDynamic facility layout problem infootwear industryrdquo Journal of Manufacturing Systems vol 36no 36 pp 55ndash61 2015

[25] S Surjanovic and D Bingham ldquoVirtual library of simulationexperiments test functions and datasetsrdquo 2018 httpwwwsfucasimssurjanooptimizationhtml

[26] W Luo X Lin T Zhu and P Xu ldquoA clonal selection al-gorithm for dynamic multimodal function optimizationrdquoSwarm and Evolutionary Computation vol 50 2019

10 Complexity

Page 9: ImprovedClonalSelectionAlgorithmBasedonBiological …downloads.hindawi.com/journals/complexity/2020/2807056.pdf · 2020. 4. 7. · clonal selection algorithm flow mainly includes

According to the test results of Eggholder Function theconvergence stability of the CSA and FCSA is worse thanGA but the optimization accuracy is better than GA

In the test results of Griewank Function the optimi-zation accuracy of the CSA and FCSA is better than that ofGA and GA has a few deviation points as well as poorconvergence stability With the improved CSA when d 50FCSA is the best in algorithm stability and when d 100FCSA is the best in algorithm optimization accuracy

In the test results of Schaffer Function n 2 and SchwefelFunction the GA has better convergence stability and op-timization accuracy than the CSA and FCSA For SchwefelFunction FCSA shows a more stable degree of convergencein higher dimensions while BCSA shows more accurateconvergence in higher dimensions

As can be seen from the experimental results in Table 7 themean value and standard deviation of the optimal solution inthe test functions f_1 f_6 f_8 and f_9 after the forgettingmechanism is introduced into the BCSA algorithm are bothbetter than the results of the original BCSA in Tables 5 and 6

On the other hand it can be seen from Figure 3 that theclone selection algorithm converges around 300 generationsand is very close to the global minimum of the function (theglobal minimum is 0) In particular the improved algorithmproposed in this paper is more accurate when comparedwith the comparison algorithm and it can always find betterresults when the generation is the same

43 Experiment Analysis According to Section 42 of thispaper respectively using GA CSA and FCSA on 13 kinds oftest functions for the optimal solution (global maximum)experimental data the following results can be seen Usingthe Ackley Function Bukin Function Eggholder FunctionLevy Function Rastrigin Function Shubert Function etcwhen using this algorithm in the initial experimental en-vironment which is the same as the CSA and GA optimi-zation under the condition of higher precision convergenceis stable and reliableeGA can only converge stably on thetest functions Schaffer Function n 2 and Schwefel Functionbut it cannot converge stably on the other 11 test functionsand it is easy to fall into local optimum

Overall, the experimental results show that FCSA has higher optimization accuracy and stability than CSA, and FCSA has higher optimization accuracy and convergence stability than GA on most test functions.

It can be seen from the high-dimensional experiments on BCSA, ECSA, and FCSA that FCSA has advantages over ECSA and BCSA in terms of convergence stability and accuracy. Due to the characteristics of the test functions themselves, the higher the dimension, the more complex the function landscape becomes, which leads to decreased optimization accuracy and stability of the algorithm.

By applying the forgetting mechanism to BCSA, the number of antibodies to be replaced is no longer defined manually but is instead derived from the affinity attribute of the antibodies. The forgetting mechanism has a positive effect on improving the convergence speed and convergence stability of such algorithms.
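
To make this idea concrete, the following minimal sketch shows one way such an affinity-driven replacement step could be written. The function name, the probabilistic forgetting rule, and the placeholder search range are our own illustrative assumptions abstracted from the Rac1-regulated forgetting of [20, 21]; they are not the authors' exact BCSA/FCSA formulation.

    import numpy as np

    def forgetting_replacement(population, affinity, rng):
        # Hypothetical affinity-driven forgetting step: rather than replacing
        # a fixed, manually chosen number of low-affinity antibodies, each
        # antibody is forgotten with a probability that rises as its
        # normalized affinity falls, so inactive antibodies cannot persist.
        aff = (affinity - affinity.min()) / (np.ptp(affinity) + 1e-12)
        forget = rng.random(population.shape[0]) < (1.0 - aff)
        # Forgotten antibodies are replaced by freshly generated random ones
        # (the search range [-1, 1] is an arbitrary placeholder).
        population[forget] = rng.uniform(-1.0, 1.0,
                                         size=population[forget].shape)
        return population

    # Toy usage: 20 antibodies in 5 dimensions, higher affinity is better.
    rng = np.random.default_rng(0)
    pop = rng.uniform(-1.0, 1.0, size=(20, 5))
    aff = -np.sum(pop ** 2, axis=1)
    pop = forgetting_replacement(pop, aff, rng)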

5. Conclusion

To solve the problem that the CSA cannot eliminate in a timely way antibodies that are not adapted to the new environment, which thus form an antibody "black hole", we change the receptor editing mechanism of the clonal selection algorithm to a new forgetting mechanism, so that the antibody candidate set can be replaced and updated under the regulation of the Rac1 protein. Experiments show that FCSA is an effective improved algorithm compared with ECSA and BCSA in terms of optimization efficiency, optimization accuracy, and convergence stability.

Because FCSA changes the substitution step of the current clonal selection algorithm, it outperforms the existing improved clonal selection algorithms. However, judging from its performance on high-dimensional test functions, FCSA still suffers from low optimization precision. In the future, FCSA will be combined with existing improved clonal selection algorithms to further improve the precision and stability of high-dimensional optimization.

We also note that Luo et al. [26] proposed a clonal selection method for dynamic multimodal optimization problems and proved its effectiveness. When the global peaks of the problem change with time, how to use the forgetting mechanism to adapt quickly to the new heights of the global peaks and forget outdated experience in time will be our future research direction.

At the same time, as an effective updating mechanism, the forgetting mechanism can also be applied to other heuristic algorithms that need to update their populations.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Table 7: Results of the BCSA algorithm combined with the forgetting mechanism when D = 50, 100.

ALGs    D = 50                        D = 100
        Mean           Std            Mean           Std
f1      -8.5121e+000   1.4299e-001    -9.1801e+000   9.4149e-002
f6      -7.5371e+002   3.7047e+001    -1.9109e+003   7.2792e+001
f8      -2.4475e+002   1.9376e+001    -6.8859e+002   3.1488e+001
f9      -6.2326e+002   1.9837e+001    -1.4202e+003   3.0524e+001
f12     -1.4873e+004   2.7891e+002    -3.3390e+004   3.6963e+002


Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (61977021), the National Social Science Fund (15CGL074), the Intelligent Information Processing and Real-Time Industrial System Hubei Provincial Key Laboratory Open Fund Project (znxx2018MS05), and the Open Project of the Hubei Key Laboratory of Applied Mathematics (HBAM201902).

References

[1] W. Zhang, K. Gao, W. Zhang, X. Wang, Q. Zhang, and H. Wang, "A hybrid clonal selection algorithm with modified combinatorial recombination and success-history based adaptive mutation for numerical optimization," Applied Intelligence, vol. 49, no. 2, pp. 819–836, 2019.

[2] G. Yang, H. Jin, and X. Zhu, "Optimization algorithm based on differential evolution and clonal selection mechanism," Computer Engineering and Applications, vol. 49, no. 10, pp. 50–52, 2013.

[3] J. Kim and P. Bentley, "Immune memory and gene library evolution in the dynamic clonal selection algorithm," Genetic Programming and Evolvable Machines, vol. 5, no. 4, pp. 361–391, 2004.

[4] D. Zhang, K. Wang, Y. Long, and X. Zhao, "Multi-objective clonal selection algorithm applying to biomass power plant location model," Journal of Geomatics, vol. 43, no. 2, pp. 19–23, 2018.

[5] M. Rafiei, T. Niknam, and M. H. Khooban, "Probabilistic electricity price forecasting by improved clonal selection algorithm and wavelet preprocessing," Neural Computing and Applications, vol. 28, no. 12, pp. 1–13, 2016.

[6] G. Lou and Z. Cai, "Improved hybrid immune clonal selection genetic algorithm and its application in hybrid shop scheduling," Cluster Computing, vol. 22, no. S2, pp. 3419–3429, 2019.

[7] Y. Jing and Z. Zhang, "A study on car flow organization in the loading end of heavy haul railway based on immune clonal selection algorithm," Neural Computing and Applications, vol. 31, no. 5, pp. 1455–1465, 2019.

[8] Z. Zareizadeh, M. S. Helfroush, A. Rahideh, and K. Kazemi, "A robust gene clustering algorithm based on clonal selection in multiobjective optimization framework," Expert Systems with Applications, vol. 113, no. 15, pp. 301–314, 2018.

[9] S. Kamada and T. Ichimura, "A generation method of immunological memory in clonal selection algorithm by using restricted Boltzmann machines," in Proceedings of the IEEE International Conference on Systems, October 2016.

[10] S. Mohapatra, P. M. Khilar, and R. Ranjan Swain, "Fault diagnosis in wireless sensor network using clonal selection principle and probabilistic neural network approach," International Journal of Communication Systems, vol. 32, no. 16, p. e4138, 2019.

[11] C. Yavuz Burcu, Y. Nilufer, and O. Ozkan, "Prediction of protein secondary structure with clonal selection algorithm and multilayer perceptron," IEEE Access, vol. 6, pp. 45256–45261, 2018.

[12] W. Luo and X. Lin, "Recent advances in clonal selection algorithms and applications," in Proceedings of the IEEE Symposium Series on Computational Intelligence, November 2018.

[13] A. Galvez, A. Iglesias, A. Avila, C. Otero, R. Arias, and C. Manchado, "Elitist clonal selection algorithm for optimal choice of free knots in B-spline data fitting," Applied Soft Computing, vol. 26, pp. 90–106, 2015.

[14] M. Gong, L. Jiao, and L. Zhang, "Baldwinian learning in clonal selection algorithm for optimization," Information Sciences, vol. 180, no. 8, pp. 1218–1236, 2010.

[15] B. S. Rao and K. Vaisakh, "Multi-objective adaptive clonal selection algorithm for solving optimal power flow considering multi-type FACTS devices and load uncertainty," Applied Soft Computing, vol. 23, pp. 286–297, 2014.

[16] L. Hong, C. L. Gong, J. Z. Wang, and Z. C. Ji, "Convergence rate estimation of elitist clonal selection algorithm," Acta Electronica Sinica, vol. 43, no. 5, pp. 916–921, 2015, in Chinese.

[17] H. Ebbinghaus, Memory, Columbia University, New York, NY, USA, 1913.

[18] T. J. Ricker, E. Vergauwe, and N. Cowan, "Decay theory of immediate memory from Brown (1958) to today (2014)," Quarterly Journal of Experimental Psychology, vol. 69, no. 10, pp. 1969–1995, 2016.

[19] M. Anderson, "Rethinking interference theory: executive control and the mechanisms of forgetting," Journal of Memory and Language, vol. 49, no. 4, pp. 415–445, 2003.

[20] Y. Shuai, B. Lu, Y. Hu, L. Wang, K. Sun, and Y. Zhong, "Forgetting is regulated through Rac activity in Drosophila," Cell, vol. 140, no. 4, pp. 579–589, 2010.

[21] Y. Liu, S. Du, L. Lv et al., "Hippocampal activation of Rac1 regulates the forgetting of object recognition memory," Current Biology, vol. 26, no. 17, pp. 2351–2357, 2016.

[22] T. Tully, S. Boynton, C. Brandes et al., "Genetic dissection of memory formation in Drosophila melanogaster," Cold Spring Harbor Symposia on Quantitative Biology, vol. 55, no. 1, pp. 203–211, 1990.

[23] L. N. De Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle," IEEE Transactions on Evolutionary Computation, vol. 6, no. 3, pp. 239–251, 2002.

[24] B. Ulutas and A. A. Islier, "Dynamic facility layout problem in footwear industry," Journal of Manufacturing Systems, vol. 36, no. 36, pp. 55–61, 2015.

[25] S. Surjanovic and D. Bingham, "Virtual library of simulation experiments: test functions and datasets," 2018, http://www.sfu.ca/~ssurjano/optimization.html.

[26] W. Luo, X. Lin, T. Zhu, and P. Xu, "A clonal selection algorithm for dynamic multimodal function optimization," Swarm and Evolutionary Computation, vol. 50, 2019.
