
Global Stability of a Class of Continuous-Time Recurrent Neural Networks

Sanqing Hu and Jun Wang, Senior Member, IEEE

Abstract—This paper investigates global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for existence and uniqueness of equilibrium of the neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions to ascertain the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for the neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement for the connection weight matrices, to ascertain the GES of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many existing GAS and GES results. Moreover, two higher exponential convergence rates are estimated.

Index Terms—Continuous time, global asymptotic stability, global exponential stability, globally Lipschitz continuous, recurrent neural networks.

I. INTRODUCTION

IN this paper, we consider a continuous-time recurrent neural network (RNN) model as follows:

$$\dot{x}(t) = -Dx(t) + g(Wx(t) + u) \eqno(1)$$

where $x = (x_1, \ldots, x_n)^T \in \mathbb{R}^n$ is a state vector, $D = \mathrm{diag}(d_1, \ldots, d_n)$ is a constant diagonal matrix with $d_i > 0$ $(i = 1, \ldots, n)$, $W = (w_{ij}) \in \mathbb{R}^{n \times n}$ is a constant connection weight matrix, $u = (u_1, \ldots, u_n)^T$ is an $n$-dimensional constant input vector, and $g(x) = (g_1(x_1), \ldots, g_n(x_n))^T$ is a vector-valued nonlinear activation function from $\mathbb{R}^n$ to $\mathbb{R}^n$. This model can be found in [1] and [2]. In addition, (1) includes [3, model (4)] as a special case.
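For readers who wish to experiment with (1), the following minimal Python sketch integrates the model with a forward-Euler scheme. The particular $D$, $W$, $u$, and the $\tanh$ activation are illustrative assumptions of this note, not data taken from the paper.

    import numpy as np

    def simulate_rnn(D, W, u, g, x0, dt=1e-3, steps=20000):
        """Forward-Euler integration of dx/dt = -D x + g(W x + u)."""
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x += dt * (-D @ x + g(W @ x + u))
        return x

    # Hypothetical two-neuron network (values are not from the paper).
    D = np.diag([1.0, 2.0])                  # d_i > 0
    W = np.array([[-0.5, 0.2], [0.1, -0.4]])
    u = np.array([0.3, -0.1])
    x_star = simulate_rnn(D, W, u, np.tanh, x0=[5.0, -5.0])
    print(x_star)   # approximate equilibrium, satisfying D x* = g(W x* + u)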

The RNN model (1) is widely applied to solve various optimization problems such as the linear variational inequality problem (VIP), which contains linear and convex quadratic programming problems and linear complementarity problems as special cases [3]–[5]. In solving an optimization problem, the equilibrium of the RNN model is also the optimum of the objective function subject to constraints. Due to this desirable property, exploring the global asymptotic stability (GAS) and global exponential stability (GES) of the RNN model (1) is a very important topic.

Manuscript received July 9, 2001; revised November 29, 2001. This work was supported by the Hong Kong Research Grants Council under Grant CUHK4174/00E. This paper was recommended by Associate Editor X. Yu.

The authors are with the Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, Hong Kong.

Publisher Item Identifier 10.1109/TCSI.2002.802360.

Two basic approaches are available for exploring the GAS and GES of the RNN model (1). In the first approach, the RNN model (1) is analyzed directly. For example, in [25], the GES of the RNN model is obtained for the bound-constrained linear VIP with positive definite matrices, under the condition that the unique equilibrium of the RNN model also satisfies a linear equation. In the second approach, under the conditions that $WD = DW$ and $W$ is nonsingular, by using $y = Wx + u$, the model (1) is transformed to

$$\dot{y}(t) = -Dy(t) + Wg(y(t)) + Du. \eqno(2)$$

Recently, the GAS and GES of the RNN model (2) have received much attention; e.g., see [4]–[25]. It was proved that the necessary and sufficient condition for absolute stability (ABST) of the neural network (2) with a symmetric connection weight matrix and a sigmoid activation function is that $W$ is negative semidefinite [6]. The ABST result was extended to absolute exponential stability (AEST) in [13]. With globally Lipschitz continuous and monotone nondecreasing activation functions, Lyapunov diagonal stability (LDS) results were reported in [7]. The LDS result for the GES was extended in [19]. It is shown in [15] and [16] that the LDS result extends many existing conditions in the literature, such as the $M$-matrix characteristic [18], lower triangular structure [4], negative semidefiniteness [5], diagonal stability [9], diagonal semistability [10], and the sufficient conditions in [1], [8], [12], and [17]. All of these existing GAS and GES results for the RNN model (2) can be applied to the RNN model (1) when $W$ and $D$ are commutative and $W$ is nonsingular. For example, a special case of the RNN model (1) was presented in [23] for solving the box-constrained linear VIP and was shown to be globally exponentially convergent under appropriate conditions. The box-constrained set may be bounded or unbounded. If bounded, the linear VIP with box constraints becomes a bound-constrained linear VIP. The GES of the RNN model for solving the box-constrained quadratic programming problem was investigated in [3]. The GES results in both [3] and [23] are subject to the nonsingularity condition of the connection weight matrix in the RNN model. However, when $WD \neq DW$ or $W$ is singular, whether or not the existing GAS and GES results for the RNN model (2) are effective for the RNN model (1) needs to be explored, since the two models are not equivalent in this case.

This paper is concerned with the GAS and GES of the continuous-time RNN model (1). A necessary and sufficient condition for existence and uniqueness of equilibrium of the neural network is first introduced. Two sufficient conditions for the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions are then given. Two GES results for the neural networks, whose activation functions may or may not be monotone nondecreasing, are also discussed. A Lyapunov diagonal stability condition for the GES of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions is provided, without the nonsingularity requirement for the connection weight matrices. This Lyapunov diagonal stability condition generalizes and unifies the previously obtained conditions for the RNN models in [3], [23], and [25]. In addition, two higher exponential convergence rates are derived.

The remainder of this paper is organized as follows. In Section II, some preliminaries are presented for RNNs. In Section III, the existence and uniqueness of the equilibrium are studied. GAS and GES of neural networks are discussed in Sections IV and V, respectively. Illustrative examples are given in Section VI. Finally, concluding remarks are made in Section VII.

II. PRELIMINARIES

In the following, let $\mathcal{G}$ denote the class of globally Lipschitz continuous (g.l.c.) and monotone nondecreasing activation functions; that is, those satisfying that there exist $\ell_i > 0$ $(i = 1, \ldots, n)$ such that

$$[g_i(\xi) - g_i(\zeta)](\xi - \zeta) \ge 0 \quad \text{and} \quad |g_i(\xi) - g_i(\zeta)| \le \ell_i |\xi - \zeta|, \qquad \forall \xi, \zeta \in \mathbb{R}. \eqno(3)$$

As shown in [20] and [21], the capacity of an associative memory model can be remarkably improved by replacing the usual sigmoid activation functions with nonmonotonic activation functions. Hence, it seems that, for some purposes, nonmonotonic (and not necessarily smooth) functions might be better candidates for neuron activation functions in designing and implementing artificial neural networks. In fact, in many electronic circuits, amplifiers with neither monotone increasing nor continuously differentiable input–output functions are frequently adopted. Motivated by these observations, some recent results [22] and [24] discussed the GAS or GES of the neural networks whose activation functions may or may not be differentiable and monotone.

When monotonicity of the activation functions is not required, let $\mathcal{H}$ denote the class of g.l.c. activation functions; that is, those satisfying that there exist $\ell_i > 0$ $(i = 1, \ldots, n)$ such that

$$|g_i(\xi) - g_i(\zeta)| \le \ell_i |\xi - \zeta|, \qquad \forall \xi, \zeta \in \mathbb{R}. \eqno(4)$$

Obviously, $\mathcal{G} \subset \mathcal{H}$. Moreover, given any $g \in \mathcal{G}$ or $g \in \mathcal{H}$, each $g_i$ has the Lipschitz constant $\ell_i$.

Many popular activation functions satisfy (3) or (4). Examples of activation functions satisfying (3) include sigmoidal functions such as $\tanh(s)$ and $1/(1 + e^{-s})$, the linear saturation function $(|s + 1| - |s - 1|)/2$, and the piecewise-linear projection function. An example of a function satisfying (4) is the radial basis function $e^{-(s - c)^2/\beta}$, where $\beta > 0$. As can be seen from these functions, a function satisfying the above assumptions may be bounded or unbounded.
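Membership in the classes defined by (3) and (4) can be probed numerically by sampling difference quotients. A small sketch, assuming the saturation and radial basis examples above (the sampling bounds are arbitrary choices):

    import numpy as np

    def sat(s):
        """Linear saturation activation: (|s + 1| - |s - 1|) / 2."""
        return np.clip(s, -1.0, 1.0)

    def rbf(s, c=0.0, beta=1.0):
        """Gaussian radial basis function: g.l.c. but not monotone."""
        return np.exp(-(s - c) ** 2 / beta)

    def quotient_range(g, n=100_000, span=10.0, seed=0):
        """Sample (g(xi) - g(zeta)) / (xi - zeta) to estimate the bounds in (3)/(4)."""
        rng = np.random.default_rng(seed)
        xi, zeta = rng.uniform(-span, span, (2, n))   # xi == zeta has probability 0
        q = (g(xi) - g(zeta)) / (xi - zeta)
        return q.min(), q.max()

    print(quotient_range(np.tanh))  # ~(0, 1): satisfies (3) with l_i = 1
    print(quotient_range(sat))      # ~(0, 1): satisfies (3) with l_i = 1
    print(quotient_range(rbf))      # negative minimum: satisfies (4) but not (3)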

From the global Lipschitz continuity of $g$, it follows that, for any initial point $x_0 \in \mathbb{R}^n$, there exists a unique global solution $x(t; t_0, x_0)$ of the autonomous system (1) with the initial condition $x(t_0) = x_0$ (see, e.g., [29, p. 38, Th. 25]).

Definition 1: An equilibrium $x^*$ of the RNN model (1), which satisfies $Dx^* = g(Wx^* + u)$, is said to be GAS if it is locally stable in the sense of Lyapunov and globally attractive. An equilibrium $x^*$ is said to be GES if there exist constants $\varepsilon \ge 1$ and $\eta > 0$ such that, for any $x_0 \in \mathbb{R}^n$, the positive half trajectory $x(t)$ of the neural network (1) with $x(t_0) = x_0$ satisfies

$$\|x(t) - x^*\| \le \varepsilon \|x_0 - x^*\| e^{-\eta (t - t_0)}, \qquad \forall t \ge t_0. \eqno(5)$$

Definition 2 ([7]): A real square matrix $A$ is said to be Lyapunov diagonally stable (LDS) if there exists a diagonal matrix $P = \mathrm{diag}(p_1, \ldots, p_n) > 0$ such that $PA + A^T P$ is positive definite. We denote by $\mathbf{LDS}$ the class of LDS matrices.
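Whether a given matrix belongs to $\mathbf{LDS}$ can be tested numerically by searching for a diagonal certificate. The sketch below maximizes the minimum eigenvalue of $PA + A^T P$ over positive diagonal $P$; this heuristic search is an assumption of this note, not a method used in the paper.

    import numpy as np
    from scipy.optimize import minimize

    def find_lds_certificate(A):
        """Search for a positive diagonal P such that P A + A^T P is positive definite."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]

        def neg_min_eig(logp):
            P = np.diag(np.exp(logp))      # parametrization keeps p_i > 0
            S = P @ A + A.T @ P            # symmetric by construction
            return -np.linalg.eigvalsh(S).min()

        res = minimize(neg_min_eig, np.zeros(n), method="Nelder-Mead")
        P = np.diag(np.exp(res.x))
        ok = np.linalg.eigvalsh(P @ A + A.T @ P).min() > 0
        return P if ok else None

    # Hypothetical test matrix (not from the paper).
    M = np.array([[3.0, 1.0], [1.0, 2.0]])
    print(find_lds_certificate(M))         # a diagonal certificate P, or None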

Definition 3 ([16] and [28]): Let $A = (a_{ij})$ be an $n \times n$ matrix with nonpositive off-diagonal elements. Then, each of the following conditions is equivalent to the statement "$A$ is a nonsingular $M$-matrix."

$M_1$: All principal minors of $A$ are positive.

$M_2$: The real part of each eigenvalue of $A$ is positive.

$M_3$: The diagonal elements of $A$ are all positive and there exists a positive diagonal matrix $\Gamma = \mathrm{diag}(\gamma_1, \ldots, \gamma_n)$ such that $A\Gamma$ is strictly diagonally row-dominant; that is,

$$a_{ii}\gamma_i > \sum_{j \neq i} |a_{ij}|\gamma_j, \qquad i = 1, \ldots, n.$$

$M_4$: The diagonal elements of $A$ are all positive and there exists a positive diagonal matrix $\Gamma = \mathrm{diag}(\gamma_1, \ldots, \gamma_n)$ such that $\Gamma A$ is strictly diagonally column-dominant; that is,

$$a_{jj}\gamma_j > \sum_{i \neq j} |a_{ij}|\gamma_i, \qquad j = 1, \ldots, n.$$

$M_5$: There exists a positive diagonal matrix $\Gamma$ such that $\Gamma A + A^T \Gamma$ is positive definite.
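Conditions $M_1$ and $M_2$ are directly machine-checkable. A short sketch; it exploits the standard fact that, for a matrix with nonpositive off-diagonal entries, positivity of the leading principal minors already implies $M_1$:

    import numpy as np

    def is_nonsingular_m_matrix(A, tol=1e-12):
        """Check M1/M2 of Definition 3 for a matrix with nonpositive off-diagonal entries."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        off = A - np.diag(np.diag(A))
        if (off > tol).any():
            raise ValueError("off-diagonal entries must be nonpositive")
        m2 = (np.linalg.eigvals(A).real > tol).all()          # M2
        m1 = all(np.linalg.det(A[:k, :k]) > tol               # M1 (leading minors)
                 for k in range(1, n + 1))
        return bool(m1 and m2)

    print(is_nonsingular_m_matrix([[2.0, -1.0], [-1.0, 2.0]]))  # True
    print(is_nonsingular_m_matrix([[1.0, -2.0], [-2.0, 1.0]]))  # False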

III. EXISTENCE AND UNIQUENESS OF EQUILIBRIUM

To discuss the GAS and GES of the RNN model (1), the existence and uniqueness of equilibrium of the RNN model (1) are prerequisites. The ensuing result provides a necessary and sufficient condition for the existence and uniqueness of equilibrium of the RNN model (1).

Theorem 1: (See Appendix for proof) The RNN model (1) has a unique equilibrium for any given $u \in \mathbb{R}^n$ if and only if $D - \Lambda W$ is nonsingular for any $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ with $0 \le \lambda_i \le \ell_i$ if $g \in \mathcal{G}$, or with $|\lambda_i| \le \ell_i$ if $g \in \mathcal{H}$, where the $\ell_i$ are as in (3) and (4), respectively.

In Theorem 1, for $g \in \mathcal{G}$ or $g \in \mathcal{H}$, nonsingularity of $D - \Lambda W$ is actually equivalent to stability of $-D + \Lambda W$ (i.e., all eigenvalues of $-D + \Lambda W$ are located in the left half of the complex plane). Thus, once $D - \Lambda W$ is nonsingular, or $-D + \Lambda W$ is stable, for any admissible $\Lambda$, the RNN model (1) has a unique equilibrium $x^*$. By means of the coordinate translation $y = x - x^*$, (1) can be put into the equivalent form

$$\dot{y}(t) = -Dy(t) + f(Wy(t)) \eqno(6)$$

where $y = (y_1, \ldots, y_n)^T$, $f(z) = (f_1(z_1), \ldots, f_n(z_n))^T$, and each $f_i$ is defined by

$$f_i(z_i) = g_i(z_i + (Wx^* + u)_i) - g_i((Wx^* + u)_i). \eqno(7)$$

We have $f(0) = 0$, and $f \in \mathcal{G}$ when $g \in \mathcal{G}$ (or $f \in \mathcal{H}$ when $g \in \mathcal{H}$). Let $f(Wy) = \Lambda(y)Wy$, where $\Lambda(y) = \mathrm{diag}(\lambda_1(y), \ldots, \lambda_n(y))$ satisfies $\lambda_i(y) = f_i((Wy)_i)/(Wy)_i$ if $(Wy)_i \neq 0$ and $\lambda_i(y) = 0$ otherwise. We can see that $0 \le \lambda_i(y) \le \ell_i$ when $g \in \mathcal{G}$ and $|\lambda_i(y)| \le \ell_i$ when $g \in \mathcal{H}$. From (6), it follows that

$$\dot{y}(t) = [-D + \Lambda(y)W]\, y(t). \eqno(8)$$

By the above analysis, we make a remark as follows.

Remark 1: If $D - \Lambda W$ is nonsingular (or $-D + \Lambda W$ is stable) for any admissible $\Lambda$, then the RNN model (1) has a unique equilibrium $x^*$ for any given $u \in \mathbb{R}^n$. In this case, the GAS and GES of the RNN model (1) at $x^*$ are equivalent to those of the model (6) or (8) at $y = 0$.
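Theorem 1 quantifies over every admissible $\Lambda$, which cannot be verified exhaustively in general; randomized sampling can, however, refute the condition. A sketch with hypothetical $D$, $W$, and $\ell_i$ (a test that finds no near-singular $D - \Lambda W$ is only evidence, not a proof):

    import numpy as np

    def refute_theorem1_condition(D, W, ell, monotone=True, samples=20_000, seed=0):
        """Sample admissible Lambda; return True if some D - Lambda W is (nearly) singular.

        For g in G, lambda_i lies in [0, l_i]; for g in H, in [-l_i, l_i]."""
        rng = np.random.default_rng(seed)
        ell = np.asarray(ell, dtype=float)
        lo = np.zeros_like(ell) if monotone else -ell
        for _ in range(samples):
            lam = rng.uniform(lo, ell)
            if abs(np.linalg.det(D - np.diag(lam) @ W)) < 1e-10:
                return True        # counterexample: the condition of Theorem 1 fails
        return False               # no counterexample found among the samples

    D = np.diag([1.0, 1.5])
    W = np.array([[0.4, -0.3], [0.2, 0.5]])   # hypothetical weights
    print(refute_theorem1_condition(D, W, ell=[1.0, 1.0]))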

IV. GLOBAL ASYMPTOTIC STABILITY

Lemma 1: Let $g \in \mathcal{G}$. If there exists a positive definite matrix $P$ such that $P(-D + \Lambda W) + (-D + \Lambda W)^T P$ is negative definite for any $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ with $0 \le \lambda_i \le \ell_i$, then the RNN model (1) is GAS.

Proof: As $P(-D + \Lambda W) + (-D + \Lambda W)^T P < 0$ for any admissible $\Lambda$, it follows that $-D + \Lambda W$ is a stable matrix. From Remark 1, the RNN model (1) has a unique equilibrium $x^*$ for any $u$, and we only need to consider (8) for the GAS. Define a Lyapunov function $V(y) = y^T P y$. Computing the time derivative of $V$ along the positive half trajectory of (8) yields

$$\dot{V}(y(t)) = y(t)^T \left[ P(-D + \Lambda(y)W) + (-D + \Lambda(y)W)^T P \right] y(t) < 0, \qquad \forall y(t) \neq 0,$$

in view that $P(-D + \Lambda W) + (-D + \Lambda W)^T P$ is negative definite for any admissible $\Lambda$. This shows that (8) is GAS at $y = 0$. As a result, the equilibrium $x^*$ of the RNN model (1) is GAS.

Since it is not straightforward to verify the condition $P(-D + \Lambda W) + (-D + \Lambda W)^T P < 0$ for any admissible $\Lambda$, especially for a large-scale neural network, we introduce two more testable methods.

Let

(9)

Then, $-D + \Lambda W$ can be rewritten in terms of the matrix defined in (9).

Theorem 2: Let $g \in \mathcal{G}$. The RNN model (1) is GAS if there exists a positive definite matrix $P$ such that

(10)

and

(11)

Proof: Since

…

in view of (10), (11), and …, Theorem 2 easily follows from Lemma 1.

Next, suppose … and …, as in the RNN models for solving the linear VIP [3], [23], and [25]. We supply a result as follows.

Theorem 3: Let … and …. The RNN model (1) is GAS if there exists a positive definite matrix $P$ such that

(12)

and there exists … such that

…

where … is defined in (9).

Proof: Since the condition … is equivalent to the condition …, if there exists a … such that … for any admissible $\Lambda$, then the conclusion in Lemma 1 also holds. Now compute

(13)

In view of (12), (13), and …, when …, it easily follows that

…

when …, it easily follows that

…

Remark 2: In Theorem 2, (10) has nothing to do with any of the variables …, and (11) depends only on the variables …. Especially, when …, (11) becomes …; in this case, we only need to check whether there exists a … to ensure … and …, which have nothing to do with any of the variables …. However, in Lemma 1, we need to consider all variables … to check whether there exists a … such that …. Hence, verifying the conditions in Theorem 2 is much easier than verifying those in Lemma 1. Comparing Theorem 3 with Theorem 2, we can see that checking the conditions in Theorem 3 is much easier than checking those in Theorem 2, since we do not need to consider any variable … in Theorem 3.

V. GLOBAL EXPONENTIAL STABILITY

In the sequel, let … where … is any real number. Let … where …, …, …, …, …, and …. Let $\lambda_{\min}(\cdot)$ (or $\lambda_{\max}(\cdot)$) be the minimum (or maximum) eigenvalue of a matrix. $\|\cdot\|$ denotes the Euclidean norm of a vector or matrix, defined by $\|x\| = (x^T x)^{1/2}$ or $\|A\| = [\lambda_{\max}(A^T A)]^{1/2}$. Let $y(t) = y(t; t_0, y_0)$ be the solution of (6) or (8) with the initial condition $y(t_0) = y_0$ at $t = t_0$. Define the upper-right Dini derivative of a function $v(t)$ [27]

$$D^+ v(t) = \limsup_{h \to 0^+} \frac{v(t + h) - v(t)}{h}.$$

From the result in Appendix B of [26], the upper-right Dini derivative of $|y_i(t)|$ is given by

$$D^+ |y_i(t)| = S(y_i(t), \dot{y}_i(t))\, \dot{y}_i(t)$$

for $i = 1, \ldots, n$, where the two-variable function $S(a, b)$ is defined by $S(a, b) = \mathrm{sgn}(a)$ when $a \neq 0$ and $S(a, b) = \mathrm{sgn}(b)$ when $a = 0$, and the function $\mathrm{sgn}(\cdot)$ is the signum function defined by $\mathrm{sgn}(s) = 1$ if $s > 0$, $\mathrm{sgn}(s) = -1$ if $s < 0$, and $\mathrm{sgn}(0) = 0$. It is easy to see that $D^+ |y_i(t)| \le |\dot{y}_i(t)|$.

Theorem 4: (See Appendix for proof) Let $g \in \mathcal{H}$. The RNN model (1) is GES if there exist … such that any one of the following three conditions holds:

(14)

(15)

(16)

Theorem 5: (See Appendix for proof) Let $g \in \mathcal{G}$. The RNN model (1) is GES if there exist … such that any one of the following three conditions holds:

(17)

(18)

(19)

Lemma 3 (Lemma 2 in [23]): Let the function … for …, where … is a constant. Then, … attains its maximal value at the unique maximum point …, and the maximal value of … is ….

Let …. For the RNN model (2), an elegant GES condition is …; see, e.g., Theorems 1–4 in [19] or Theorem 1 in [23], where …. This condition generalizes existing results such as the $M$-matrix characteristic [18], lower triangular structure [4], negative semidefiniteness [5], diagonal stability [9], diagonal semistability [10], and the sufficient conditions in [1], [8], [12], and [17]. Now one problem arises: can the condition … guarantee that the RNN model (1) is GES for any …? In the following, we will answer this question for one special case.

Theorem 6: (See Appendix for proof) Let …, …, and …. If …, then the RNN model (1) is GES with a convergence rate of at least …, where … is given in (43).

Remark 3: In Theorem 6, … is required. However, such a requirement is not needed in Theorems 4 and 5. Hence, Theorem 6 is different from Theorems 4 and 5. Although conditions (17)–(19) are weaker than conditions (14)–(16), respectively, Theorem 5 only deals with the special class $\mathcal{G}$. Thus, Theorem 5 is different from Theorem 4, which deals with the class $\mathcal{H}$.

Now consider a special case of the RNN model (1) as follows. For $i = 1, \ldots, n$,

…

where each … is a constant parameter and each $g_i$ is defined by

$$g_i(s) = \begin{cases} a_i, & \text{if } s < a_i \\ s, & \text{if } a_i \le s \le b_i \\ b_i, & \text{if } s > b_i \end{cases}$$

and $a_i$ and $b_i$ are finite or infinite, satisfying $a_i < b_i$. For the closed intervals $[a_i, b_i]$, the infinite cases are understood as $[a_i, +\infty)$ when $b_i = +\infty$, $(-\infty, b_i]$ when $a_i = -\infty$, and $(-\infty, +\infty)$ when $a_i = -\infty$ and $b_i = +\infty$, respectively. This model can be represented in the following compact vector-matrix form

(20)

where …, the diagonal matrix …, and ….

Define $X = \prod_{i=1}^{n} [a_i, b_i]$. Then, $X$ is a closed box set. The model (20) can be applied to solve the linear VIP (see [3], [23], and [25]); that is, determining a vector $x^*$ in a nonempty closed convex set $X$ such that

$$(x - x^*)^T (Mx^* + p) \ge 0, \qquad \forall x \in X. \eqno(21)$$

When $X$ is the closed box set defined above, which may be bounded or unbounded, the linear VIP is called box-constrained. When the box set $X$ is bounded, the linear VIP is called bound-constrained.
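As a concrete illustration of this use, the sketch below integrates the projection network of [23] for the box-constrained linear VIP (21), $\dot{x} = \lambda(-x + P_X(x - \alpha(Mx + p)))$, which is one network of the type covered by (20); the data $M$, $p$, and the box bounds are hypothetical.

    import numpy as np

    def project_box(z, a, b):
        """P_X: componentwise projection onto X = [a_1, b_1] x ... x [a_n, b_n]."""
        return np.clip(z, a, b)

    def lvi_network(M, p, a, b, lam=1.0, alpha=0.5, dt=1e-3, steps=50_000):
        """Integrate dx/dt = lam * (-x + P_X(x - alpha * (M x + p)))."""
        x = np.zeros_like(p)
        for _ in range(steps):
            x += dt * lam * (-x + project_box(x - alpha * (M @ x + p), a, b))
        return x

    # Hypothetical LVI: find x* in X with (x - x*)^T (M x* + p) >= 0 for all x in X.
    M = np.array([[3.0, 1.0], [1.0, 2.0]])
    p = np.array([-1.0, 1.0])
    a, b = np.zeros(2), np.ones(2)
    print(lvi_network(M, p, a, b))   # equilibrium = the LVI solution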

Comparing (20) with (1), we can see that …, …, …, and …. Then, from Theorem 6, we have an immediate result as follows.

Corollary 1: If $M \in \mathbf{LDS}$ (i.e., there exists a positive definite diagonal matrix $P$ such that $PM + M^T P$ is positive definite), then the RNN model (20) is GES for any … and any diagonal matrix …, with a convergence rate of at least

(22)

where

(23)

and ….

Proof: Given any … and any …, let

(24)

Since $M \in \mathbf{LDS}$, we have

…

that is, …. In this case, note … and thus …. In view of Theorem 6, we know that the RNN model (20) has a unique GES equilibrium with a convergence rate of at least

(25)

where … is as in (43) and can be rewritten as

(26)

To solve the bound-constrained linear VIP, the exponential convergence of the RNN model (20) is explored in [3] and [25]. When the matrix $M$ is real, symmetric, and positive definite, a GES result for the RNN model (20) is given in [3], provided that the matrix … is nonsingular (see Theorem 5 in [3]). By using the symmetric part of the matrix $M$, i.e., $(M + M^T)/2$, instead of $M$ in Theorem 5 of [3], we can see that when the symmetric or nonsymmetric matrix $M$ is positive definite, the above GES result is still effective for the RNN model (20), provided that … is nonsingular. As shown in [3], … is nonsingular for almost any choice of the constant parameters …. Especially, when … with …, … is nonsingular if and only if … is not an eigenvalue of ….

In [25], the exponential convergence of the RNN model (20) in the bound-constrained case is also studied. When … with … and $M$ is positive definite, it is proved that all the network trajectories starting from the bound-constrained set $X$ will exponentially converge to the unique equilibrium $x^*$, provided that the equilibrium satisfies the linear equation … (see Theorem 2 in [25]). If a positive half trajectory starts from outside the bounded set $X$, it will converge to the set $X$ exponentially. However, the convergence of these trajectories starting from outside $X$ to the equilibrium $x^*$ is not guaranteed in [25] unless each of these trajectories is shown to enter the set $X$ within some finite time. Furthermore, when $p$ is a nonzero vector, for any given positive definite matrix $M$, there exists a set of vectors $p$ with infinite Lebesgue measure in $\mathbb{R}^n$ such that … for any …. In fact, for any given positive definite matrix $M$, the vector … does not belong to the bounded set $X$ if only

…

That is, for any given positive definite matrix $M$, if …, then …. Hence, in this case, the unique equilibrium $x^*$ cannot satisfy the linear equation … for any …. Let …. We can see that if …, then the unique equilibrium $x^*$ cannot satisfy the linear equation … for any … and any ….

In the case of the general box-constrained linear VIP, the GES of the RNN model (20) is presented in [23] under the same nonsingularity condition on … as in Theorem 5 of [3]. This GES result is achieved in the general situation of a Lyapunov diagonally stable $M$, which includes the positive stable matrix … as a special case (see Theorem 3 in [23]). When …, in (9) of [23], a lower bound on the rate of exponential convergence of the RNN model (20) is presented by

(27)

where … is as in (43) and can be rewritten as

…

and … and … are as in (24). As shown in (45), … in (26) is less than or equal to … above. Hence, … in (25) is greater than or equal to … above.

Based on the above discussions on the exponential convergence of the RNN model (20), we make a remark as follows.

Remark 4: Corollary 1 generalizes and unifies all the previously obtained results on the exponential convergence of the RNN model (20) given in [3], [23], and [25]. Furthermore, the obtained lower bound on the rate of convergence in Corollary 1 is higher than the lower bound described in (27) of [23]. It is worth noting that the matrix … is required to be nonsingular in order to obtain the lower bound on the rate of exponential convergence in [23]. However, the lower bound on the rate of exponential convergence in Corollary 1 is always effective, no matter whether … is nonsingular.

Fig. 1. Exponential convergence of the positive half trajectories of the neural network in Example 2.

Theorem 7: (See Appendix for proof) Let …, where … and …. If $M \in \mathbf{LDS}$ (i.e., there exists a positive definite diagonal matrix $P$ such that $PM + M^T P$ is positive definite), then for any …, any …, and any diagonal matrix …, the RNN model (20) is GES with a convergence rate of at least

(28)

which is strictly monotone increasing with respect to …, is less than 1/2, and approaches 1/2 arbitrarily closely as … approaches ….

Remark 5: As given in Theorem 7, the exponential convergence rate … satisfies … and … as …; … can approach 1/2 arbitrarily closely by adjusting the parameter …. However, putting … into (23), we can see that … as …, and consequently, the exponential convergence rate … in (22) approaches 0 as …. Therefore, when … is large enough, the convergence rate … derived in Theorem 7 is higher than the convergence rate … in Corollary 1.

VI. ILLUSTRATIVE EXAMPLES

Example 1: Consider an RNN model (1) with …, …, …, $i = 1$, 2:

…

We can see that

…

Let …. We can check that

…

and

…

According to Theorem 3, this neural network is GAS for any …. We can check that Theorems 4 and 5 cannot be applied to analyze the GAS of the neural network. Since …, Theorem 6 cannot be used to analyze the GAS of the neural network either.

Example 2: Let … and …, $i = 1$, 2:

…

Let …. We can check that …, …, …, and … satisfy condition (14) for $i = 1$, 2. Then, according to Theorem 4, we see that this neural network is GES. To simulate, let …, …, and …. Obviously, …, and …. It can be seen from Fig. 1 that all the trajectories from the 40 random initial points in the set … exponentially converge to a unique equilibrium ….
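An experiment in the spirit of Example 2 and Fig. 1 can be reproduced as follows; since the example's exact network data are not reproduced here, the sketch uses hypothetical stand-ins and then estimates the exponential rate $\eta$ of (5) from the last trajectory.

    import numpy as np

    # Hypothetical stand-in data (not the values used in Example 2).
    D = np.diag([2.0, 2.0])
    W = np.array([[0.5, -0.3], [0.2, 0.4]])
    u = np.array([0.1, -0.2])
    dt, steps = 1e-3, 30_000

    rng = np.random.default_rng(1)
    finals = []
    for _ in range(40):                        # 40 random initial points, as in Fig. 1
        x = rng.uniform(-5.0, 5.0, 2)
        traj = [x.copy()]
        for _ in range(steps):
            x = x + dt * (-D @ x + np.tanh(W @ x + u))
            traj.append(x.copy())
        finals.append(x)
    finals = np.array(finals)
    x_star = finals.mean(axis=0)               # all runs agree -> unique equilibrium
    print(x_star, finals.std(axis=0))          # near-zero spread: global convergence

    # Empirical exponential rate: slope of log ||x(t) - x*|| for the last trajectory.
    err = np.linalg.norm(np.array(traj) - x_star, axis=1)
    t = np.arange(err.size) * dt
    mask = err > 1e-9
    print(-np.polyfit(t[mask], np.log(err[mask]), 1)[0])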

Example 3: Let … and …, $i = 1$, 2:

…

Clearly, …. Let … and …. Then

…

that is, …. From Theorem 6, it follows that this neural network is GES for any …. It can be seen that Theorems 4 and 5 are not applicable to analyze the GES of the neural network.

Example 4: Consider an RNN model (20) with

…

$M$ is not positive definite, and its eigenvalues are 0.7 and 1.3. Hence, Theorems 1 and 2 in [25] cannot be used to ascertain the GAS or GES of this neural network when …. When …, … is singular. Consequently, Theorem 3 in [23] is not applicable either to analyze the GES of this neural network. Let …. Then, it can be seen that …; that is, $M \in \mathbf{LDS}$. Based on Corollary 1, given any … and any …, this neural network is GES.

Fig. 2. A comparison among the convergence rates …, …, and … in Example 4.

Fig. 3. Exponential convergence comparison of three trajectories of the neural network with … = 0.5, … = 3, and … = 10 in Example 4.

Let …. Compute …. In terms of the exponential convergence rate … in (22),

we have

where

and

In terms of the exponential convergence rate … in (27), we have

where

In terms of the exponential convergence rate … in (28), we have

Fig. 2 shows a comparison among …, …, and …. We can see that, for any given …, …; for any given …, … is strictly monotone increasing with respect to … and approaches 1/2 by adjusting the parameter …. Let …, …, and the initial condition …. Fig. 3 shows the positive half trajectories of the neural network with three different parameters …. We can see that all three trajectories exponentially converge to the unique equilibrium … (i.e., the unique solution of Problem (21), where …), and the convergence rate increases as … increases. The monotonicity of the convergence rates of the trajectories is in accordance with the monotonicity of … above.

VII. CONCLUDING REMARKS

In this paper, we analyze the GAS and GES of a class of continuous-time RNNs with globally Lipschitz continuous activation functions. After introducing a necessary and sufficient condition for the existence and uniqueness of equilibrium of the neural networks, we first present two sufficient conditions for the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. Next, we give two GES conditions for the neural networks with or without nondecreasing monotonicity in the activation functions. With globally Lipschitz continuous and monotone nondecreasing activation functions, we also provide an LDS condition for the GES of the neural networks without the nonsingularity requirement for the connection weight matrices. This LDS condition generalizes and unifies the previously obtained results. Moreover, two higher exponential convergence rates are derived.

APPENDIX

THE PROOFS OF THEOREMS 1, 4, 5, 6, AND 7

A. Proof of Theorem 1

We only consider the case $g \in \mathcal{G}$; when $g \in \mathcal{H}$, the reasoning is similar.

B. Necessity

Suppose that $D - \Lambda W$ is singular for some diagonal matrix $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ with $0 \le \lambda_i \le \ell_i$. Now construct each $g_i$ as

…

Then, the equation characterizing the equilibrium becomes

…

which either has infinitely many solutions or no solution, depending on whether … is in the range space of … or not. A contradiction occurs.

C. Sufficiency

We first show the existence of the equilibrium given that $D - \Lambda W$ is nonsingular for any admissible $\Lambda$. The existence is proven by induction on the network order $n$. When $n = 1$, the equilibrium satisfies

(29)

Because $g_1$ is a nondecreasing function and satisfies (3), there exists … such that

Since …, the solution of (29) is

Now assume the existence of the equilibrium for a neural network of order $n - 1$. We aim to prove the existence of the equilibrium for an $n$th-order neural network. We partition …, …, …, …, and … as

…

in which … and … are $(n-1) \times (n-1)$ matrices, … and … are $(n-1)$-dimensional vectors, and … contains the first $n - 1$ entries of …. So, under this partitioning, the equilibrium must satisfy the following two equations:

(30)

and

(31)

Letting … be an … zero matrix, we easily have …, …, since … is nonsingular for any admissible …. Thus, given any …, (31) has a solution for

…

for some …. Substituting this into (30) yields an equation corresponding to an $(n-1)$th-order neural network

(32)

which has a solution provided that the matrix

(33)

is nonsingular for any admissible …. Since

…

is nonsingular for any admissible …, it is easy to see that (33) is nonsingular for any admissible …. This would then ensure that (32) has a solution. As a result, there exists an equilibrium of the neural network provided that $D - \Lambda W$ is nonsingular for any admissible $\Lambda$.

We then show that the equilibrium is unique, by contradiction. Suppose that there are two distinct solutions … and … for some nonzero …; that is,

…

and

…

We then have

…

where …. There exists a diagonal matrix … with … such that

…

or

…

has a nonzero solution. The latter implies that $D - \Lambda W$ is singular. A contradiction occurs. The proof is complete.

D. Proof of Theorem 4

Comparing conditions (14)–(16) with conditions $M_3$, $M_4$, and $M_5$ in Definition 3, respectively, we can see that $D - \Lambda W$ is nonsingular for any $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ with $|\lambda_i| \le \ell_i$. Thus, in view of Remark 1, any one of conditions (14)–(16) can guarantee that the RNN model (1) has a unique equilibrium $x^*$ for any $u \in \mathbb{R}^n$. In the following, we will focus on the equivalent system (8), where $\Lambda(y) = \mathrm{diag}(\lambda_1(y), \ldots, \lambda_n(y))$ with $|\lambda_i(y)| \le \ell_i$.

E. Case a

Condition (14) is satisfied. Let … be the index such that …. Let the function …. Then, from (8), it follows that

…

Based on condition (14), …. So, we have …, …. Hence, for …, from … we derive …, …. This means that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

F. Case b

Condition (15) is satisfied. Let the function

…

and

…

based on condition (15). Then, computing the time derivative of … along the positive half trajectory of (8) yields

…

So, we have …. Then, from …, it follows that …, …. This actually shows that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

G. Case c

Condition (16) is satisfied. Let the function …. Computing the time derivative of … along the positive half trajectory of (8) yields

(34)

where …, …, …, …, …, and …. By condition (16), we see that … is negative definite. From … and (34), it follows that

…

leading to …. Then, noting …, we can deduce

…

This shows that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

H. Proof of Theorem 5

Comparing conditions (17)–(19) with conditions $M_3$, $M_4$, and $M_5$ in Definition 3, respectively, we can see that $D - \Lambda W$ is nonsingular for any $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ with $0 \le \lambda_i \le \ell_i$. Thus, in view of Remark 1, any one of conditions (17)–(19) can guarantee that the RNN model (1) has a unique equilibrium $x^*$ for any $u \in \mathbb{R}^n$. In the following, we will focus on the equivalent system (8), where $\Lambda(y) = \mathrm{diag}(\lambda_1(y), \ldots, \lambda_n(y))$ with $0 \le \lambda_i(y) \le \ell_i$.

I. Case a

Condition (17) is satisfied. Let … be the index such that …. Let the function …. Then, from (8), it follows that the equation shown at the bottom of the page holds. Based on (17), it can be seen that …. So, we have …. Hence, for …, from … we derive …. This means that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

J. Case b

Condition (18) is satisfied. Let the function

…

and

…

based on (18). Then, computing the time derivative of … along the positive half trajectory of (8) yields

…

So, we have …, …. Then, from …, it follows that …, …, …. This actually shows that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

K. Case c

Condition (19) is satisfied. Let the function …. Computing the time derivative of … along the positive half trajectory of (8) yields

(35)

where …, …, …, …, and …. By (19), we see that … is negative definite. From … and (35), it follows that

…

leading to …. Then, noting …, we can deduce

…

This shows that (8) is GES at $y = 0$; that is, the RNN model (1) is GES at the equilibrium $x^*$.

L. Proof of Theorem 6

Since …, we know that there exists a matrix … such that …. Then: i) given any …, we have

…

which shows that … is stable or nonsingular, and consequently, … is nonsingular; ii) given any … where there exists at least some …, without loss of generality, assume … and …. Partition …, …, … as

…

respectively. Similar to case i), we can deduce that … is nonsingular. So, … is nonsingular. In terms of i) and ii), the condition … can ensure that … is nonsingular for any … satisfying …. Given any …, according to Theorem 1, the RNN model (1) has a unique equilibrium. Hence, in the following, we will focus on the equivalent system (6).

Since …, we have

(36)

We consider the following Lyapunov function

(37)

with any fixed number …, where … is as in (7), and

(38)

Computing the time derivative of … along the positive half trajectory of (6) yields

… (from …)

… (from (36))

… (from (38)). (39)


Since the second term on the right-hand side of (37) is nonnegative based on the first inequality of (36), in view of (36) and (37), we have

… (from the second inequality of (36))

(40)

where …, …, or … and ….

Let … and

(41)

Then, …. Thus, in view of the inequality (39), we have

…

which implies that

…

From this, we get

which shows that the equilibrium of (6) is GES; that is

which means that the equilibrium … of the RNN model (1) is GES and the rate of exponential convergence has a lower bound … with … defined in (41), where … is any fixed number.

We define …, …, and …. Then, the exponential convergence rate of the RNN model (1) has a lower bound

where

for ….

If …, then … and … for …, and hence …. In the case of …, according to Lemma 3, the lower bound … on the rate of convergence of the RNN model (1) can be computed by

(42)

which actually holds for all …, where …. According to the above three choices for …, let

(43)

Since the function

…

is monotone decreasing for …, the lower bound on the rate of convergence of the RNN model (1) becomes

(44)

where …. It is easy to see that

(45)

On the other hand,

Then, …. As a result, the lower bound … on the rate of convergence of the RNN model (1) is given by

(46)

where … is as in (43).

M. Proof of Theorem 7

Given any …, any …, and any diagonal matrix …. Since $M \in \mathbf{LDS}$, according to Corollary 1, the RNN model (20) has a unique GES equilibrium …, and hence (6) is GES at …. In this case, (6) can be rewritten as

(47)

where ….

Now, we give a new lower bound on the rate of convergence. Let

and

(48)

It is easy to see that … and


where … is as in (28). Then, from …, it follows that

(49)

We introduce a Lyapunov function as follows:

(50)

with

(51)

Computing the time derivative of … along the positive half trajectory of (47) yields

(52)

From the Lyapunov function … defined in (50), we have

…

Thus, (52) leads to

…

So, …, which yields

…

that is,

…

REFERENCES

[1] M. W. Hirsch, “Convergent activation dynamics in continuous time networks,” Neural Networks, vol. 2, no. 5, pp. 331–349, 1989.

[2] S. Grossberg, “Nonlinear neural networks: Principles, mechanisms and architectures,” Neural Networks, vol. 1, no. 1, pp. 17–61, 1988.

[3] X. B. Liang and J. Wang, “A recurrent neural network for nonlinear optimization with a continuously differentiable objective function and bound constraints,” IEEE Trans. Neural Networks, vol. 11, pp. 1251–1262, Nov. 2000.

[4] G. Avitabile, M. Forti, S. Manetti, and M. Marini, “On a class of nonsymmetrical neural networks with application to ADC,” IEEE Trans. Circuits Syst., vol. 38, pp. 202–209, Feb. 1991.

[5] M. Forti, S. Manetti, and M. Marini, “A condition for global convergence of a class of symmetric neural networks,” IEEE Trans. Circuits Syst. I, vol. 39, pp. 480–483, June 1992.

[6] ——, “Necessary and sufficient condition for absolute stability of neural networks,” IEEE Trans. Circuits Syst. I, vol. 41, pp. 491–494, July 1994.

[7] M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” IEEE Trans. Circuits Syst. I, vol. 42, pp. 354–366, July 1995.

[8] Z. H. Guan, G. R. Chen, and Y. Qin, “On equilibria, stability, and instability of Hopfield neural networks,” IEEE Trans. Neural Networks, vol. 11, pp. 534–540, Mar. 2000.

[9] E. Kaszkurewicz and A. Bhaya, “On a class of globally stable neural circuits,” IEEE Trans. Circuits Syst. I, vol. 41, pp. 171–174, July 1994.

[10] ——, “Comments on ‘Necessary and sufficient condition for absolute stability of neural networks’,” IEEE Trans. Circuits Syst. I, vol. 42, pp. 497–499, Aug. 1995.

[11] J. C. Juang, “Stability analysis of Hopfield-type neural networks,” IEEE Trans. Neural Networks, vol. 10, pp. 1366–1374, Nov. 1999.

[12] D. G. Kelly, “Stability in contractive nonlinear neural networks,” IEEE Trans. Biomed. Eng., vol. 37, pp. 231–242, Mar. 1990.

[13] X. B. Liang and T. Yamaguchi, “Necessary and sufficient conditions for absolute exponential stability of Hopfield-type neural networks,” IEICE Trans. Inf. Syst., vol. E79-D, pp. 990–993, 1996.

[14] X. B. Liang and L. D. Wu, “Comments on ‘New conditions for global stability of neural networks with application to linear and quadratic programming problems’,” IEEE Trans. Circuits Syst. I, vol. 44, pp. 1099–1101, Nov. 1997.

[15] ——, “Global exponential stability of a class of neural circuits,” IEEE Trans. Circuits Syst. I, vol. 46, pp. 748–751, June 1999.

[16] X. B. Liang, “A comment on ‘On equilibria, stability, and instability of Hopfield neural networks’,” IEEE Trans. Neural Networks, vol. 11, pp. 1506–1507, Nov. 2000.

[17] K. Matsuoka, “Stability conditions for nonlinear continuous neural networks with asymmetric connection weights,” Neural Networks, vol. 5, no. 3, pp. 495–500, 1992.

[18] T. Roska, “Some qualitative aspects of neural computing systems,” in Proc. IEEE ISCAS, Helsinki, Finland, 1989, pp. 751–754.

[19] Y. Zhang, P. A. Heng, and A. W. C. Fu, “Estimate of exponential convergence rate and exponential stability for neural networks,” IEEE Trans. Neural Networks, vol. 10, pp. 1487–1493, Nov. 1999.

[20] M. Morita, “Associative memory with nonmonotone dynamics,” Neural Networks, vol. 6, no. 1, pp. 115–126, 1993.

[21] S. Yoshizawa, M. Morita, and S. I. Amari, “Capacity of associative memory using a nonmonotonic neural model,” Neural Networks, vol. 6, no. 2, pp. 167–176, 1993.

[22] P. van den Driessche and X. F. Zou, “Global attractivity in delayed Hopfield neural network models,” SIAM J. Appl. Math., vol. 58, no. 6, pp. 1878–1890, 1998.

[23] X. B. Liang and J. Si, “Global exponential stability of neural networks with globally Lipschitz continuous activations and its application to linear variational inequality problem,” IEEE Trans. Neural Networks, vol. 12, pp. 349–359, Mar. 2001.

[24] H. Qiao, J. G. Peng, and Z. B. Xu, “Nonlinear measures: A new approach to exponential stability analysis for Hopfield-type neural networks,” IEEE Trans. Neural Networks, vol. 12, pp. 360–370, Mar. 2001.

[25] Y. Xia and J. Wang, “Global asymptotic and exponential stability of a dynamic neural system with asymmetric connection weights,” IEEE Trans. Automat. Contr., vol. 46, pp. 635–638, Apr. 2001.

[26] I. W. Sandberg, “Some theorems on the dynamic response of nonlinear transistor networks,” Bell Syst. Tech. J., vol. 48, no. 1, pp. 35–54, 1969.

[27] A. N. Michel and R. K. Miller, Qualitative Analysis of Large Scale Dynamical Systems. New York: Academic, 1977.

[28] A. Berman and R. J. Plemmons, Nonnegative Matrices in the Mathematical Sciences. New York: Academic, 1979.

[29] M. Vidyasagar, Nonlinear Systems Analysis, 2nd ed. Englewood Cliffs, NJ: Prentice-Hall, 1993.

Sanqing Hu received the B.S. degree in mathematics from Hunan Normal University, Changsha, China, in 1992, the M.S. degree in automatic control from Northeastern University, Shenyang, China, in 1996, and the Ph.D. degree in automation and computer-aided engineering from The Chinese University of Hong Kong, Hong Kong, in 2001.

He is currently a Visiting Scholar with the Department of Electrical and Computer Engineering, University of Illinois, Chicago. His interests include robust control, nonlinear systems, neural networks, and signal processing.

Jun Wang (S’89–M’90–SM’93) received the B.S. degree in electrical engineering and the M.S. degree in systems engineering from Dalian University of Technology, Dalian, China, in 1982 and 1985, respectively, and the Ph.D. degree in systems engineering from Case Western Reserve University, Cleveland, OH, in 1991.

He is currently a Professor of automation and computer-aided engineering at The Chinese University of Hong Kong, Hong Kong. In 1995, he was an Associate Professor at the University of North Dakota, Grand Forks. He is an Associate Editor of the IEEE TRANSACTIONS ON NEURAL NETWORKS and the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS. His current research interests include neural networks and their engineering applications.