Research Article

Hybrid Soft Computing Schemes for the Prediction of Import Demand of Crude Oil in Taiwan

Yuehjen E. Shao,¹ Chi-Jie Lu,² and Chia-Ding Hou¹

¹Department of Statistics and Information Science, Fu Jen Catholic University, Xinzhuang, New Taipei, Taiwan
²Department of Industrial Management, Chien Hsin University of Science and Technology, Zhongli, Taoyuan, Taiwan

    Correspondence should be addressed to Chia-Ding Hou; [email protected] 

    Received February ; Accepted April ; Published April

    Academic Editor: Ker-Wei Yu

Copyright © Yuehjen E. Shao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Crude oil is the most important nonrenewable energy resource and a key element of the world's energy supply. In contrast to typical forecasts of the oil price, this study aims at forecasting the demand of imported crude oil (ICO). This study proposes different single-stage and two-stage hybrid forecasting models for the prediction of ICO in Taiwan. The single-stage forecasting modeling includes multiple linear regression (MLR), support vector regression (SVR), artificial neural network (ANN), and extreme learning machine (ELM) approaches. While the first step of the two-stage modeling is to select the fewer but more significant explanatory variables, the second step is to generate predictions by using these significant explanatory variables. The proposed two-stage hybrid models consist of integrations of different modeling components. Mean absolute percentage error, root mean square error, and mean absolute difference are utilized as the performance measures. A real data set of crude oil in Taiwan for the study period and twenty-three associated explanatory variables are sampled and investigated. The forecasting results reveal that the proposed two-stage hybrid modeling is able to accurately predict the demand of crude oil in Taiwan.

    1. Introduction

Natural resources are often classified into two groups: renewable and nonrenewable resources. Renewable energy is the kind which comes from natural resources such as sunlight, wind, and rain, and it is naturally replenished. It is reported that a substantial share of global final energy consumption, and of global electricity generation, comes from renewable energy []. In the United States, the renewable share of total domestic electricity has risen steadily in recent years. In China, wind power generation has increased markedly. In the European Union, renewables accounted for a large share of total electric capacity additions []. Nonrenewable resources form very slowly or do not naturally form in the environment. A good example of the nonrenewable is fossil fuels. Although many studies have reported that renewable sources of energy are currently receiving considerable attention, fossil fuels are still the most needed element of the world's energies []. In particular, crude oil is the most important nonrenewable energy resource and a key element for the world [, ].

The relationship between economic growth and oil consumption was addressed [], and the study determined the minimum statistical (lower-bound) annual oil consumption, in barrels per capita, for developed countries. By using the autoregressive distributed lag (ARDL) bounds testing approach of cointegration, another study discussed a long-run relationship among the quantity of crude oil imports, income, and the price of the imported crude in India []. The results showed the long-term income elasticity of imported crude in India and a unidirectional long-run causality running from economic growth to crude oil imports. The demand for ICO in South Africa as a function of real income and the price of crude oil was studied []. The estimated long-run price and income elasticities revealed that import demand for crude oil is price and income inelastic.

The study estimated the short-run and long-run elasticities of demand for crude oil in Turkey []. The comparative study was performed in the

Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2014, Article ID 257947, 11 pages, http://dx.doi.org/10.1155/2014/257947



study []. A decision support system for forecasting fossil fuel production in Turkey, using regression, ARIMA, and SARIMA methods, was developed. Also, a study applied the ARIMA and SARIMA methods to predict primary energy demand in Turkey [].

In addition to forecasting the oil demand or consumption, several forecasting methods were applied to predict different types of energies. For example, regression modeling was employed to forecast the coal, oil, and gas electricity requirement []. The MLR models were also applied to predictions of electricity consumption in Taiwan, Brazil, Delhi, and Hong Kong, respectively [–]. The study [] used multiple linear regression techniques to develop simple empirical relations for the estimation of daily and monthly evaporation in Kuwait. While the ARIMA approaches were commonly used for predictions of energy demand [, ], the ANN models were also widely used for predictions of energy demand [, –]. Another promising forecasting technique, SVR, was used to forecast the energy demand [, ]. Additionally, various hybrid modeling approaches were reported for forecasting energy demand in several studies [–], and the hybrid modeling schemes appear to be a promising technique. The study [] provided a detailed literature review of forecasting modeling in energy issues.

Extreme learning machine (ELM), proposed by Huang et al. [, ], is a novel learning algorithm for single-hidden-layer feedforward neural networks (SLFNs). It provides much better generalization performance with much faster learning speed and avoids many issues faced by traditional algorithms, such as the stopping criterion, learning rate, number of epochs, local minima, and over-tuning problems []. Moreover, the universal approximation capability of ELM has been analyzed and proven by [–] to show the effectiveness of ELM. Thus, ELM has attracted a lot of attention in recent years and has been used for various forecasting issues, such as sales forecasting [–], stock price forecasting [, ], and electricity price forecasting [].

Chinese Petroleum Corp. (CPC) was founded with its headquarters set up in Taipei under the direction of the Ministry of Economic Affairs. With service facilities covering the whole nation, its operations today include the import, exploration, development, refining, transport, marketing, and sale of petroleum and natural gas. CPC's total capital stands at the NT$ billion level, and its total revenues amounted to the NT$ trillion level.

Since crude oil is extremely important for the development of Taiwan's economy, predictions of the demand of imported crude oil are a must. Accordingly, this study is aimed at proposing single-stage and two-stage forecasting techniques to predict the demand of imported crude oil in Taiwan. The single-stage forecasting modeling includes the support vector regression (SVR), artificial neural network (ANN), extreme learning machine (ELM), and multiple linear regression (MLR) approaches. The two-stage models combine two modeling components. The first component of the model uses its own feature to capture the significant explanatory variables. Then, the second component generates the predictions based on these explanatory variables. In this study, the combinations of MLR and SVR (referred to as MLR-SVR), MLR and ANN (referred to as MLR-ANN), and MLR and ELM (referred to as MLR-ELM) are used as the two-stage models.

Real data are sampled for the ICO in Taiwan. According to the suggestion of [], twenty-three associated variables are collected to serve as the explanatory variables. The predictions for ICO in Taiwan can then be made based on the single-stage and two-stage forecasting models. This study uses the first ten years of data for model building, and the last four years' data are used for the purpose of confirmation. The mean absolute percentage error (MAPE), the root mean square error (RMSE), and the mean absolute difference (MAD) are used as the forecasting accuracy measures.
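The three accuracy measures are straightforward to compute. A minimal sketch in Python (the function names and sample values are illustrative, not taken from the paper):

```python
import numpy as np

def mape(actual, pred):
    """Mean absolute percentage error, in percent."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return np.mean(np.abs((actual - pred) / actual)) * 100.0

def rmse(actual, pred):
    """Root mean square error."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return np.sqrt(np.mean((actual - pred) ** 2))

def mad(actual, pred):
    """Mean absolute difference."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return np.mean(np.abs(actual - pred))
```

For example, with actual values (100, 200) and predictions (110, 180), MAPE is 10%, RMSE is √250 ≈ 15.81, and MAD is 15.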

The contents of this study are organized as follows. The following section introduces the proposed forecasting techniques. Section 3 presents the real data of ICO and the forecasting results; the performances of all of the forecasting models are demonstrated and discussed. The final section concludes this study.

    2. Research Methodologies

This study considers MLR, SVR, ANN, ELM, and their hybrid modeling schemes as possible forecasting models for the import demand of crude oil in Taiwan. These forecasting techniques are introduced in the subsequent sections.

2.1. Multiple Linear Regression. Multiple linear regression analysis is the procedure by which an algebraic equation is formulated to estimate the value of a dependent variable $y_i$, given the values of some other independent variables $x_{i1}, x_{i2}, \dots, x_{ik}$, $i = 1, 2, \dots, n$. The relationship between the dependent and independent variables is expressed by a linear algebraic model as follows:

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + \varepsilon_i, \quad i = 1, 2, \dots, n. \quad (1)$$

Based on the least squares or maximum likelihood criterion, the so-called normal equations and the point estimators of $\beta_0, \beta_1, \dots, \beta_k$ can be obtained. Under the normality and some mild assumptions, the sampling distributions, and hence the statistical inference for the estimators of $\beta_0, \beta_1, \dots, \beta_k$, can be derived.

To identify significant independent variables, the backward elimination, forward selection, or stepwise regression procedures can be applied. The backward elimination procedure begins with the model which includes all of the available explanatory variables and successively deletes one variable at a time from the model in such a way that, at each step, the variable deleted is the one contributing the least to the prediction of the dependent variable at that step. On the contrary, the forward selection procedure begins with the constant model that includes no explanatory variable and successively adds one variable at a time to the model in such a way that, at each step, the variable added is the one that contributes the most to the prediction of the dependent variable.
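The backward elimination procedure described above can be sketched as follows. This is a minimal illustration using ordinary least squares and t-test p-values; the function name, the 0.05 cutoff, and the synthetic data are assumptions, not taken from the paper:

```python
import numpy as np
from scipy import stats

def backward_elimination(X, y, names, alpha=0.05):
    """Repeatedly drop the predictor with the largest p-value above alpha."""
    keep = list(range(X.shape[1]))
    while keep:
        # Ordinary least squares with an intercept column
        Xk = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        resid = y - Xk @ beta
        dof = len(y) - Xk.shape[1]
        sigma2 = resid @ resid / dof
        cov = sigma2 * np.linalg.inv(Xk.T @ Xk)
        t = beta / np.sqrt(np.diag(cov))
        p = 2 * stats.t.sf(np.abs(t), dof)   # two-sided p-values
        worst = int(np.argmax(p[1:]))        # ignore the intercept
        if p[1:][worst] <= alpha:
            break                            # everything remaining is significant
        keep.pop(worst)
    return [names[i] for i in keep]
```

Forward selection is the mirror image: start from the constant model and add, at each step, the candidate whose inclusion is most significant.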


Accordingly, the ANN model in () accomplishes a nonlinear functional mapping from the inputs $(x_1, x_2, \dots, x_k)$ to the output $y$; that is,

$$y = f(x_1, x_2, \dots, x_k, w) + \varepsilon,$$

where $w$ is a vector of all model parameters and $f$ is a function determined by the ANN structure and connection weights.

2.4. Extreme Learning Machine. ELM randomly selects the input weights and analytically determines the output weights of SLFNs. One may randomly choose and fix the hidden node parameters, which is the key principle of ELM. After randomly choosing the hidden node parameters, the SLFN becomes a linear system where the output weights of the network can be analytically determined using a simple generalized inverse operation of the hidden layer output matrices [, ].

In general, the concept of ELM is similar to that of the random vector functional-link (RVFL) network, where the hidden neurons are randomly selected. However, the main difference between ELM and RVFL is the characteristics of the hidden neuron parameters. In ELM, all the hidden node parameters are randomly generated independently of the target functions and the training patterns []. In RVFL, the selection of hidden neurons is based on partial randomness, and the randomly generated hidden node parameters are not completely independent of the training data []. That is, the universal approximation capability of ELM can be linearly extended to RVFL [, ].

Consider $N$ arbitrary distinct samples $(x_i, t_i)$, where $x_i = [x_{i1}, x_{i2}, \dots, x_{in}]^T \in \mathbb{R}^n$ and $t_i = [t_{i1}, t_{i2}, \dots, t_{im}]^T \in \mathbb{R}^m$. SLFNs with $\tilde{N}$ hidden neurons and activation function $g(x)$ can approximate these $N$ samples with zero error. This means that

$$H\beta = T,$$

where

$$H(w_1, \dots, w_{\tilde{N}}, b_1, \dots, b_{\tilde{N}}, x_1, \dots, x_N) = \begin{bmatrix} g(w_1 \cdot x_1 + b_1) & \cdots & g(w_{\tilde{N}} \cdot x_1 + b_{\tilde{N}}) \\ \vdots & \ddots & \vdots \\ g(w_1 \cdot x_N + b_1) & \cdots & g(w_{\tilde{N}} \cdot x_N + b_{\tilde{N}}) \end{bmatrix}_{N \times \tilde{N}},$$

$$\beta_{\tilde{N} \times m} = \begin{bmatrix} \beta_1^T \\ \vdots \\ \beta_{\tilde{N}}^T \end{bmatrix}, \qquad T_{N \times m} = \begin{bmatrix} t_1^T \\ \vdots \\ t_N^T \end{bmatrix},$$

and $w_i = [w_{i1}, w_{i2}, \dots, w_{in}]^T$, $i = 1, 2, \dots, \tilde{N}$, is the weight vector connecting the $i$th hidden node and the input nodes, $\beta_i = [\beta_{i1}, \beta_{i2}, \dots, \beta_{im}]^T$ is the weight vector connecting the $i$th hidden node and the output nodes, and $b_i$ is the threshold of the $i$th hidden node. $w_i \cdot x_j$ denotes the inner product of $w_i$ and $x_j$. $H$ is called the hidden layer output matrix of the neural network; the $i$th column of $H$ is the $i$th hidden node output with respect to inputs $x_1, x_2, \dots, x_N$. Therefore, the determination of the output weights is as simple as finding the least-squares solution to the given linear system.
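The construction above can be sketched in a few lines. A minimal illustration (the function names, the tanh activation, and the toy data are assumptions for demonstration, not the paper's code):

```python
import numpy as np

def elm_fit(X, y, n_hidden, rng):
    """Train a single-hidden-layer ELM: random input weights and biases,
    output weights solved via the Moore-Penrose pseudo-inverse."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights w_i
    b = rng.standard_normal(n_hidden)                # random biases b_i
    H = np.tanh(X @ W + b)                           # hidden layer output matrix H
    beta = np.linalg.pinv(H) @ y                     # minimum-norm LS solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

With the hidden parameters fixed at random, training reduces to a single linear solve, which is the source of ELM's speed relative to iterative backpropagation.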

The minimum norm least-squares (LS) solution to the linear system is

$$\hat{\beta} = H^{\dagger} T,$$

where $H^{\dagger}$ is the Moore-Penrose generalized inverse of the matrix $H$. The minimum norm LS solution is unique and has the smallest norm among all the LS solutions. The first step of the ELM algorithm is to randomly assign the input weights $w_i$ and biases $b_i$. Then, the hidden layer output matrix $H$ is calculated. Finally, one can calculate the output weights $\hat{\beta} = H^{\dagger} T$, where $T = (t_1, \dots, t_N)$.

2.5. Hybrid Models. Recent research indicates that hybrid systems which integrate several standard models can help to achieve better performance for some applications. For example, hybrid modeling applications have been reported in forecasting [–], credit risk [], and manufacturing processes [–]. As a consequence, this study proposes

a two-stage hybrid data mining mechanism to predict the demand of ICO in Taiwan. This study proposes three hybrid data mining models, namely, the integration of MLR and ANN (i.e., MLR-ANN), MLR and SVR (i.e., MLR-SVR), and MLR and ELM (i.e., MLR-ELM). The concept of these hybrid models is as follows.

In the first stage of hybrid modeling, the more significant variables, say $x^*_1, x^*_2, \dots, x^*_m$, are selected using MLR with forward selection, backward elimination, or stepwise regression. In the second stage, the selected significant variables $x^*_1, x^*_2, \dots, x^*_m$ obtained in the first stage serve as the inputs of ANN, SVR, and ELM in order to establish the two-stage hybrid forecasting models. By providing the ANN, SVR, and ELM with good starting points, it is hoped that more effective models can be developed on the strength of their learning ability. Such two-stage hybrid models are then compared with the single-stage MLR, ANN, SVR, and ELM models.
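The two-stage idea can be sketched end to end. For brevity, the fragment below substitutes a simple correlation screen for the paper's MLR-based selection and uses an ELM as the second-stage learner; all names, thresholds, and data are illustrative assumptions:

```python
import numpy as np

def stage1_select(X, y, threshold=0.5):
    """Stage 1 (simplified): keep predictors whose absolute Pearson
    correlation with y exceeds a threshold (stand-in for MLR stepwise)."""
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return np.flatnonzero(np.abs(r) > threshold)

def stage2_elm(X, y, n_hidden, rng):
    """Stage 2: fit an ELM on the selected inputs only."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    return W, b, np.linalg.pinv(H) @ y

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))                      # five candidate predictors
y = 2.0 * X[:, 1] - 1.5 * X[:, 3] + 0.1 * rng.standard_normal(100)

selected = stage1_select(X, y)                         # informative columns only
W, b, beta = stage2_elm(X[:, selected], y, 20, rng)
pred = np.tanh(X[:, selected] @ W + b) @ beta
```

The design rationale is that the second-stage learner receives a lower-dimensional, more informative input space, which eases training and reduces overfitting.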

    3. Results and Analysis

To show the effectiveness of the proposed hybrid modeling, real data were sampled for the ICO from the Bureau of Energy in Taiwan []. Following the suggestion of [] and having discussed with the ICO practitioners, we sample twenty-three associated influential variables; Table 1 lists these variables. The yearly data were collected from the web sites of the Bureau of Energy in Taiwan []. The first ten years' data are used for model building, and the last four years' data are used for model confirmation.

3.1. Single-Stage Modeling. Figure 1 displays the yearly data of ICO in Taiwan. For MLR modeling, we first compute the variance inflation factors (VIFs) to examine the existence of collinearity.

Our numerical results reveal that the VIF values of all independent variables exceed the critical value except for the variables $x_4$ and $x_5$, indicating that there may exist


Table 1: Meaning of the influential variables for ICO model building.

Variable | Meaning
y | Imported crude oil
x1 | Gross domestic product
x2 | Consumer price index
x3 | Personal disposable income
x4 | Average temperature
x5 | Average sunshine per day (hours)
x6 | Average electricity households
x7 | Average electricity price
x8 | National income
x9 | Population
x10 | Foreign trade total
x11 | Wholesale prices
x12 | Consumer index
x13 | GNP deflators
x14 | Foreign exchange reserves
x15 | Total primary energy supply
x16 | Total final consumption
x17 | Total domestic consumption
x18 | Energy productivity
x19 | Energy intensity
x20 | Energy consumption of energy intensive industries
x21 | Value-added of energy intensive industries
x22 | Dependence on imported energy
x23 | Electricity average load

Figure 1: Historical yearly data of ICO in Taiwan. The vertical axis shows ICO (1000 bbl), ranging from 130000 to 260000; the horizontal axis shows time in years, 1–14.

high collinearity among the independent variables. To eliminate variables with high collinearity, we apply the Pearson correlation coefficient: if the correlation coefficient between two independent variables exceeds the chosen cutoff, we eliminate the variable which has the lower relationship with the dependent variable $y$. Table 2 lists the analysis results. As shown in Table 2, after excluding the variables with high collinearity, there remained six independent variables, namely $x_4$, $x_5$, $x_7$, $x_{13}$, $x_{19}$, and $x_{22}$. In Table 3, it can be found that all the VIF values of the remaining variables are below the critical value. As a consequence, there is no high collinearity among these independent variables. In addition, we apply the backward elimination, forward selection, and stepwise regression procedures to identify significant independent variables and use the chosen significance level to perform the MLR analysis. All three procedures give the same selection results. As listed in Table 4, the significant independent variables related to imported crude oil in Taiwan are average electricity price ($x_7$) and dependence on imported energy ($x_{22}$). Accordingly, the MLR model is derived as follows:

$$\hat{y} = -11862349.62 - 75695.53\,x_7 + 124776.53\,x_{22}.$$

The regression coefficients in the MLR model indicate that the higher the average electricity price is, the lower the imported crude oil is. On the contrary, the higher the dependence on imported energy is, the higher the imported crude oil is.
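The VIF screening used above can be sketched as follows. This is a minimal illustration; the function name and synthetic data are assumptions, and the common rule-of-thumb cutoff of VIF > 10 is used since the paper's own threshold digit did not survive extraction:

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on all the remaining columns (plus an intercept)."""
    X = np.asarray(X, float)
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        ss_res = resid @ resid
        ss_tot = ((X[:, j] - X[:, j].mean()) ** 2).sum()
        # 1/(1 - R^2) simplifies to ss_tot / ss_res
        out.append(ss_tot / ss_res if ss_res > 0 else np.inf)
    return out
```

A column that is nearly a linear combination of the others yields a very large VIF, flagging it for removal.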

For ANN modeling, since the backpropagation neural network (BPNN) structure has been widely used [, ], this study employs BPNN as the ANN modeling structure. In the BPNN structure, we have the input nodes and one output node. Five candidate numbers of hidden nodes, ranging around the number of input variables, were examined. The training and testing processes use the available data vectors for each possible parameter setting, and three candidate learning rates are tried, according to the suggestions of [].

After applying the ANN to the ICO data, the topology and learning rate providing the best result were obtained, where the notation {i-h-o} represents the numbers of nodes in the input layer, hidden layer, and output layer, respectively. For SVR modeling, as in the ANN modeling, we have the same input variables. The two parameters, C and gamma, were estimated as $2^{-15}$ and $2^{-15}$, respectively.

As mentioned earlier, the most important ELM parameter is the number of hidden nodes, and ELM tends to be unstable in single-run forecasting [, ]. Therefore, ELM models with different numbers of hidden nodes are constructed. For each number of nodes, the ELM model is repeated several times and the average RMSE of each setting is calculated. The number of hidden nodes that gives the smallest average RMSE value is selected as the best parameter of the ELM model. In modeling the ELM, the forecasting model with eight hidden nodes has the smallest average RMSE value and is therefore the best ELM model.
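The selection procedure just described can be sketched as below, with an inline minimal ELM. The candidate range of 2–12 nodes, the 10 repeats, and the toy data are assumptions rather than the paper's settings, and training RMSE is used for brevity where the paper evaluates forecasting RMSE:

```python
import numpy as np

def elm_rmse(X, y, n_hidden, rng):
    """Fit a minimal ELM once and return its training RMSE."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y
    return np.sqrt(np.mean((H @ beta - y) ** 2))

def select_hidden_nodes(X, y, candidates, n_repeats, rng):
    """Average the RMSE over repeated random initializations for each
    candidate node count; return the count with the smallest average."""
    avg = {n: np.mean([elm_rmse(X, y, n, rng) for _ in range(n_repeats)])
           for n in candidates}
    return min(avg, key=avg.get)

rng = np.random.default_rng(3)
X = np.linspace(-2, 2, 80).reshape(-1, 1)
y = np.sin(2 * X).ravel()
best = select_hidden_nodes(X, y, range(2, 13), 10, rng)
```

Averaging over repeated runs is what counteracts the single-run instability caused by the random hidden-layer parameters.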

3.2. Proposed Hybrid Modeling. A rational strategy for hybrid modeling is to use the fewer but more informative variables, which were selected by the first-stage modeling approaches, as the inputs for the second-stage approaches. Accordingly, in this study, the significant variables selected, that is, average electricity price ($x_7$) and dependence on imported energy ($x_{22}$), are used as the input variables of the ANN and SVR for hybrid modeling.

After completing the first stage of hybrid modeling, the ANN topology settings can be established. This study has found the topology and learning rate that provide the best result for the hybrid model; the network topology with the minimum testing RMSE is considered the optimal network topology. For the MLR-SVR hybrid modeling, the parameters C and gamma were still estimated as $2^{-15}$ and $2^{-15}$, respectively.


Table 2: Pearson correlations for pairs of variables $y, x_1, x_2, \dots, x_{23}$.


Table: Collinearity diagnosis for MLR modeling.

Variable: X4, X5, X7, X13, X19, X22, with one VIF value per variable (the numeric VIF entries are not legible in this copy).
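The collinearity diagnosis above screens the retained explanatory variables by their variance inflation factors. As an illustration of the standard computation (not the authors' code), the VIF of variable j is 1/(1 − R²ⱼ), where R²ⱼ comes from regressing that variable on the remaining ones; a minimal NumPy sketch:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 is the R-squared from
    regressing column j on the remaining columns plus an intercept.
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    factors = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # intercept + other columns
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        factors.append(1.0 / (1.0 - r2))
    return np.array(factors)
```

A VIF near 1 indicates a variable is nearly uncorrelated with the others; values above the common rule-of-thumb threshold of 10 flag problematic collinearity.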

Table: MLR model for ICO in Taiwan.

Columns: Variables, Estimated coefficient, Standard error, value. The first row is the constant term, whose estimated coefficient is negative (the numeric entries are not legible in this copy).
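The MLR table reports, for each variable, an estimated coefficient, its standard error, and a value (presumably the t statistic, i.e. coefficient divided by standard error). These are the standard ordinary-least-squares outputs; a minimal sketch under that assumption, not the authors' implementation:

```python
import numpy as np

def ols_summary(X, y):
    """Ordinary least squares with an intercept.

    Returns (coefficients, standard errors, t values); the first entry
    of each array corresponds to the constant term.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])    # constant term first
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = (resid @ resid) / (n - p - 1)  # unbiased residual variance
    cov = sigma2 * np.linalg.inv(A.T @ A)   # covariance of the estimates
    se = np.sqrt(np.diag(cov))
    return beta, se, beta / se
```

With the fitted coefficients in hand, a large |t value| for a variable indicates its contribution to explaining ICO demand is unlikely to be noise.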

