Global exponential stability of a class of neural networks with unbounded delays
In this paper, the global exponential stability of a class of neural networks is investigated. The neural networks contain variable and unbounded delays. By constructing a suitable Lyapunov function and using the technique of matrix analysis, we obtain some new sufficient conditions for global exponential stability.
Ukrains'kyi Matematychnyi Zhurnal (Ukrainian Mathematical Journal), Vol. 60, No. 10 (2008), pp. 1401–1413. Published by the Institute of Mathematics, NAS of Ukraine. Online access: https://umj.imath.kiev.ua/index.php/umj/article/view/3254
UDC 517.9
Tran Thi Loan, Duong Anh Tuan (Hanoi Nat. Univ. Education, Vietnam)
GLOBAL EXPONENTIAL STABILITY OF A CLASS
OF NEURAL NETWORKS WITH UNBOUNDED DELAYS
Глобальна експоненціальна стійкість одного класу нейронних сіток з необмеженими загаюваннями
In this paper, the global exponential stability of a class of neural networks is investigated. The neural networks contain variable and unbounded delays. By constructing a suitable Lyapunov function and using the technique of matrix analysis, some new sufficient conditions for global exponential stability are obtained.
Досліджено глобальну експоненціальну стійкість одного класу нейронних сіток. Нейронні сітки містять змінні та необмежені загаювання. На основі побудови відповідної функції Ляпунова та техніки матричного аналізу отримано нові достатні умови глобальної експоненціальної стійкості.
1. Introduction. It is well-known that cellular neural networks (CNNs) proposed by
L. O. Chua and L. Yang in 1988 have been extensively studied both in theory and
applications. We refer the reader to [1 – 8] for more details on this matter. CNNs have been successfully applied in signal processing, pattern recognition, associative memories, and, especially, static image processing. Such applications rely on the qualitative properties of the neural networks. In hardware implementations, time delays occur due to the finite switching speed of amplifiers and communication time. Time delays may lead to oscillation and, furthermore, to instability of networks [1]. On the other hand, the processing of moving images requires the presence of delays in signal transmission among the networks [5]. Therefore, the study of the stability of neural networks with delay is of practical importance.
It is known that fixed time delays in models of delayed feedback systems serve as a good approximation of simple circuits with a small number of cells. A neural network usually has a spatial nature due to the presence of various parallel pathways. Thus, it is desirable to model such networks by introducing unbounded delays.
Usually, the Lyapunov functional method is used to study the qualitative properties of CNNs. The method proceeds in three steps. In step 1, we construct a Lyapunov function V(t). In step 2, we eliminate the time delays in V(t) in order to estimate V(t). In step 3, we impose conditions on the CNN so that V(t) satisfies the necessary properties, from which the qualitative properties of the CNN follow.
For autonomous CNNs, the existence, uniqueness, and stability of the equilibrium point of neural networks have been studied, and some useful results have already been obtained in [3, 4, 6 – 11].
The stability of nonautonomous CNNs has been studied in [2, 3, 12, 13]. However, the stability of neural networks with variable, unbounded delays and time-varying coefficients has not been widely studied; in particular, other authors have not used scaled Lyapunov functions in their studies (see [3, 7, 8, 14]). In this paper, we study the exponential stability of CNNs with variable, unbounded delays and time-varying coefficients by constructing proper scaled Lyapunov functions. Some new sufficient conditions for global exponential stability are obtained.
This paper is organized as follows. In Section 2, we introduce some definitions and
assumptions. In Section 3, the global exponential stability is obtained. In Section 4,
we obtain a series of corollaries.
© TRAN THI LOAN, DUONG ANH TUAN, 2008
ISSN 1027-3190. Ukr. mat. žurn., 2008, t. 60, # 10

2. Definitions and assumptions. In this paper we consider general neural networks with variable and unbounded time delays of the form

$$\frac{dx_i(t)}{dt} = -d_i(t)x_i(t) + \sum_{j=1}^{n} a_{ij}(t)f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}(t)g_j\big(x_j(t-\tau_{ij}(t))\big) + \sum_{j=1}^{n} c_{ij}(t)\int_{-\infty}^{t} k_{ij}(t-s)h_j(x_j(s))\,ds + I_i(t), \quad i = \overline{1,n}, \tag{1}$$

where $x_i$ is the state of neuron $i$, $i = 1,\dots,n$; $n$ is the number of neurons; $A(t) = (a_{ij}(t))_{n\times n}$, $B(t) = (b_{ij}(t))_{n\times n}$, $C(t) = (c_{ij}(t))_{n\times n}$ are the connection matrices; $I(t) = (I_1(t),\dots,I_n(t))^T$ is the input vector;

$$f(u) = (f_1(u_1),\dots,f_n(u_n))^T, \quad g(u) = (g_1(u_1),\dots,g_n(u_n))^T, \quad h(u) = (h_1(u_1),\dots,h_n(u_n))^T$$

are the activation functions of the neurons; $D(t) = \mathrm{diag}(d_1(t),\dots,d_n(t))$, where $d_i(t)$ represents the rate at which the $i$th unit resets its potential to the resting state in isolation when disconnected from the network; $k_{ij}(t)$, $i,j = 1,\dots,n$, are the kernel functions; $\tau_{ij}(t)$, $i,j = 1,\dots,n$, are the delays.
For convenience, we introduce some notation: $x = (x_1,\dots,x_n)^T \in \mathbb{R}^n$ denotes a column vector (the symbol $T$ denotes the transpose) with norm $\|x\| = \big(\sum_{i=1}^{n} x_i^2\big)^{1/2}$. For a matrix $A = (a_{ij})_{n\times n}$, $A^T$ denotes the transpose of $A$ and $A^{-1}$ denotes the inverse of $A$. If $A$, $B$ are symmetric matrices, $A > B$ ($A \ge B$) means that $A - B$ is positive definite (positive semidefinite). The minimum and maximum real eigenvalues of a symmetric matrix $A$ are denoted by $\lambda_{\min}(A)$ and $\lambda_{\max}(A)$, respectively.
We consider system (1) under the following assumptions.

(H1) The functions $d_i(t)$, $a_{ij}(t)$, $b_{ij}(t)$, $c_{ij}(t)$, and $I_i(t)$, $i,j = 1,\dots,n$, are defined, bounded, and continuous on $\mathbb{R}_+$. The functions $\tau_{ij}(t)$, $i,j = 1,\dots,n$, are defined, nonnegative, bounded, and continuously differentiable on $\mathbb{R}_+$, with $\inf_{t\in\mathbb{R}_+}\big(1 - \dot{\tau}_{ij}(t)\big) > 0$, where $\dot{\tau}_{ij}(t)$ is the derivative of $\tau_{ij}(t)$ with respect to $t$. The functions $k_{ij}\colon [0,\infty) \to [0,\infty)$, $i,j = 1,\dots,n$, are piecewise continuous on $[0,\infty)$ and satisfy $\int_0^\infty e^{\varepsilon s}k_{ij}(s)\,ds = p_{ij}(\varepsilon)$, where the $p_{ij}(\varepsilon)$ are continuous functions on $[0,\delta)$, $\delta > 0$, with $p_{ij}(0) = 1$.
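For instance, the exponential kernel $k(s) = \lambda e^{-\lambda s}$ satisfies this hypothesis with $p(\varepsilon) = \lambda/(\lambda - \varepsilon)$ for $0 \le \varepsilon < \lambda$, so that $p(0) = 1$. A quick numerical sketch (the kernel choice and truncation parameters are ours, for illustration only, not from the paper):

```python
import math

def p(eps, lam=2.0, T=50.0, n=200_000):
    # trapezoidal approximation of p(eps) = ∫_0^∞ e^{eps*s} k(s) ds,
    # truncated at s = T, for the illustrative kernel k(s) = lam*e^{-lam*s}
    f = lambda s: math.exp(eps * s) * lam * math.exp(-lam * s)
    h = T / n
    total = 0.5 * (f(0.0) + f(T))
    for i in range(1, n):
        total += f(i * h)
    return total * h

# closed form for this kernel: p(eps) = lam / (lam - eps), hence p(0) = 1
for eps in (0.0, 0.5, 1.0):
    assert abs(p(eps) - 2.0 / (2.0 - eps)) < 1e-3
```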
(H2) There are positive constants $H_i$, $K_i$, $L_i$, $i = 1,\dots,n$, such that

$$0 \le \frac{f_i(u) - f_i(u^*)}{u - u^*} \le H_i, \qquad |g_i(u) - g_i(u^*)| \le K_i|u - u^*|, \qquad |h_i(u) - h_i(u^*)| \le L_i|u - u^*|$$

for all $u, u^* \in \mathbb{R}$ and $i = 1,\dots,n$.
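As an illustration (our example, not from the paper), the common activation $f_i(u) = \tanh(u)$ satisfies the first condition of (H2) with $H_i = 1$, since $\tanh$ is nondecreasing and 1-Lipschitz; a small numerical spot check:

```python
import math
import random

random.seed(0)
H = 1.0  # sector bound for tanh
for _ in range(10_000):
    u = random.uniform(-5, 5)
    v = random.uniform(-5, 5)
    if u == v:
        continue
    q = (math.tanh(u) - math.tanh(v)) / (u - v)
    # 0 <= (f(u) - f(v))/(u - v) <= H for f = tanh
    assert -1e-12 <= q <= H + 1e-12
```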
(H3) There are a positive definite symmetric matrix $S$, diagonal matrices

$$\alpha = \mathrm{diag}(\alpha_1,\dots,\alpha_n) > 0, \quad \beta = \mathrm{diag}(\beta_1,\dots,\beta_n) > 0, \quad \omega = \mathrm{diag}(\omega_1,\dots,\omega_n) > 0, \quad \sigma = \mathrm{diag}(\sigma_1,\dots,\sigma_n) > 0, \quad \gamma = \mathrm{diag}(\gamma_1,\dots,\gamma_n) \ge 0,$$

and a constant $a > 0$ such that $\lambda_{\min}(D_1(t,\eta)) \ge a$ for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$, where

$$D_1(t,\eta) = SD(t) + D(t)S - nS(\alpha^{-1} + \omega^{-1})S - n\eta\gamma(\beta^{-1} + \sigma^{-1})\gamma\eta - \big(SA(t) - D(t)\gamma\big)\eta - \eta\big(A^T(t)S - \gamma D(t)\big) - \eta\big(\gamma A(t) + A^T(t)\gamma\big)\eta - \sum_{i=1}^{n}\big[(\alpha_i + \beta_i)B_i(t) + (\omega_i + \sigma_i)C_i(t)\big],$$

$$\eta = \mathrm{diag}(\eta_1,\dots,\eta_n), \qquad H = \mathrm{diag}(H_1,\dots,H_n),$$

$$C_i(t) = \mathrm{diag}\Big(L_1^2\int_{-\infty}^{0}k_{i1}(-s)c_{i1}^2(t-s)\,ds,\ \dots,\ L_n^2\int_{-\infty}^{0}k_{in}(-s)c_{in}^2(t-s)\,ds\Big),$$

$$B_i(t) = \mathrm{diag}\Big(K_1^2\,\frac{b_{i1}^2(\psi_{i1}^{-1}(t))}{1 - \dot{\tau}_{i1}(\psi_{i1}^{-1}(t))},\ \dots,\ K_n^2\,\frac{b_{in}^2(\psi_{in}^{-1}(t))}{1 - \dot{\tau}_{in}(\psi_{in}^{-1}(t))}\Big), \qquad i = 1,\dots,n,$$

where $\psi_{ij}^{-1}(t)$ is the inverse function of $\psi_{ij}(t) = t - \tau_{ij}(t)$.
(H4) There are a positive definite symmetric matrix $S$, diagonal matrices

$$\beta = \mathrm{diag}(\beta_1,\dots,\beta_n) > 0, \quad \sigma = \mathrm{diag}(\sigma_1,\dots,\sigma_n) > 0, \quad \gamma = \mathrm{diag}(\gamma_1,\dots,\gamma_n) > 0,$$

and a constant $a > 0$ such that $\gamma A(t) + A^T(t)\gamma < 0$ and $\lambda_{\min}(D_2(t,\eta)) \ge a$ for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$, where

$$D_2(t,\eta) = SD(t) + D(t)S - nS(\beta^{-1} + \sigma^{-1})S - \big(SA(t) - D(t)\gamma\big)\eta - \eta\big(A^T(t)S - \gamma D(t)\big) - \sum_{i=1}^{n}\big[(\beta_i - \alpha^*)B_i(t) + (\sigma_i - \omega^*)C_i(t)\big],$$

$$\alpha^* = \omega^* = n\Big[\inf_{t\in\mathbb{R}_+}\lambda_{\min}\Big(\gamma^{-1}\,\frac{\gamma A(t) + A^T(t)\gamma}{2}\,\gamma^{-1}\Big)\Big]^{-1},$$

and $E$ is the unit matrix.
We denote by $BC$ the Banach space of bounded continuous functions $\varphi\colon (-\infty, 0] \to \mathbb{R}^n$ with norm $\|\varphi\| = \sup_{-\infty < s \le 0}\|\varphi(s)\|$.

The initial condition associated with (1) is of the form

$$x(\theta) = \varphi(\theta), \quad \theta \in (-\infty, 0], \quad \text{where } \varphi \in BC. \tag{2}$$

It is well known that if hypotheses (H1), (H2) are satisfied, then system (1) has a unique solution $x(t) = (x_1(t),\dots,x_n(t))^T$ satisfying the initial condition (2) (see [15]).
Definition 1. The equilibrium point $x^*$ of system (1) is said to be globally exponentially stable (GES) if there exist constants $\lambda > 0$ and $M > 0$ such that for any solution $x(t)$ of system (1) with initial function $\varphi$ we have

$$\|x(t) - x^*\| \le M\|\varphi - x^*\|e^{-\lambda t} \quad \text{for all } t \in \mathbb{R}_+.$$

Definition 2. System (1) is said to be globally exponentially stable if there are constants $\varepsilon > 0$ and $M \ge 1$ such that for any two solutions $x(t)$, $y(t)$ of system (1) with initial functions $\varphi$, $\psi$, respectively, one has

$$\|x(t) - y(t)\| \le M\|\varphi - \psi\|e^{-\varepsilon t} \quad \text{for all } t \in \mathbb{R}_+.$$

An activation function $f$ is said to belong to the class $PLI$ (denoted $f \in PLI$) if, for each $j \in \{1,2,\dots,n\}$, $f_j\colon \mathbb{R} \to \mathbb{R}$ is a partially Lipschitz continuous and monotone increasing function. A function $f$ is said to be partially Lipschitz continuous in $\mathbb{R}$ if for any $\rho \in \mathbb{R}$ there exists a positive number $l_\rho$ such that $|f(\theta) - f(\rho)| \le l_\rho|\theta - \rho|$ for all $\theta \in \mathbb{R}$.

Definition 3. System (1) is said to be absolutely exponentially stable with respect to the class $PLI$ if it possesses a unique GES equilibrium point for all functions $g, f, h \in PLI$ and every input vector $I$.

Definition 4. A matrix $A$ is said to belong to the class $P_0$ (denoted $A \in P_0$) if all principal minors of $A$ are nonnegative.
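For small matrices, membership in $P_0$ can be checked directly by enumerating all principal minors. A minimal sketch (the determinant routine and the example matrices are ours, for illustration):

```python
from itertools import combinations

def det(M):
    # Laplace expansion along the first row (adequate for small matrices)
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def in_P0(A, tol=1e-12):
    # A ∈ P0  iff  every principal minor of A is nonnegative
    n = len(A)
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            sub = [[A[i][j] for j in idx] for i in idx]
            if det(sub) < -tol:
                return False
    return True

assert in_P0([[2.0, -1.0], [-1.0, 1.0]])        # minors 2, 1, det = 1
assert not in_P0([[-1.0, 0.0], [0.0, 1.0]])     # a 1x1 principal minor is -1
```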
3. Global exponential stability of CNNs. In this section, by constructing a suitable Lyapunov function and using the technique of matrix analysis, we give some sufficient conditions for the global exponential stability of solutions of system (1). The main result of this article is Theorem 1.

Theorem 1. If hypotheses (H1), (H2), and (H3) or (H4) are satisfied, then system (1) is globally exponentially stable.
Proof. Let $x(t)$, $y(t)$ be two arbitrary solutions of system (1) with initial functions $\varphi$, $\psi$, respectively. Setting $z(t) = x(t) - y(t)$, we have

$$\dot{z}(t) = -D(t)z(t) + A(t)\Phi(z(t)) + G(z(t-\tau(t))) + F(z(t)), \tag{3}$$

where

$$\Phi(z(t)) = (\Phi_1(z_1(t)),\dots,\Phi_n(z_n(t)))^T,$$
$$G(z(t-\tau(t))) = (G_1(z(t-\tau_1(t))),\dots,G_n(z(t-\tau_n(t))))^T,$$
$$F(z(t)) = (F_1(z(t)),\dots,F_n(z(t)))^T,$$
$$\Phi_i(z_i(t)) = f_i(x_i(t)) - f_i(y_i(t)),$$
$$G_i(z(t-\tau_i(t))) = \sum_{j=1}^{n} b_{ij}(t)\big[g_j(x_j(t-\tau_{ij}(t))) - g_j(y_j(t-\tau_{ij}(t)))\big],$$
$$F_i(z(t)) = \sum_{j=1}^{n} c_{ij}(t)\int_{-\infty}^{t} k_{ij}(t-s)\big[h_j(x_j(s)) - h_j(y_j(s))\big]\,ds, \quad i = 1,\dots,n.$$
Define a Lyapunov function as follows:

$$V(t,z_t) = e^{\varepsilon t}\Big[z^T(t)Sz(t) + 2\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds\Big] +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\alpha_i+\beta_i)\int_{t-\tau_{ij}(t)}^{t}\frac{b_{ij}^2(\psi_{ij}^{-1}(s))}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}\big[g_j(x_j(s))-g_j(y_j(s))\big]^2 e^{\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\,ds +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\omega_i+\sigma_i)\int_{-\infty}^{0}\int_{t+s}^{t}k_{ij}(-s)c_{ij}^2(u-s)\big[h_j(x_j(u))-h_j(y_j(u))\big]^2 e^{\varepsilon(u-s)}\,du\,ds, \tag{4}$$

where $S$, $\alpha$, $\beta$, $\gamma$, $\omega$, $\sigma$ are given by (H3), and $\varepsilon > 0$ is a constant which will be determined later on. Calculating the derivative of $V(t,z_t)$ along the solutions of equation (3), we get

$$\frac{dV(t,z_t)}{dt} = e^{\varepsilon t}\big[-2z^T(t)SD(t)z(t) + 2z^T(t)SA(t)\Phi(z(t)) + 2z^T(t)SG(z(t-\tau(t))) + 2z^T(t)SF(z(t)) -$$
$$- 2\Phi^T(z(t))\gamma D(t)z(t) + 2\Phi^T(z(t))\gamma A(t)\Phi(z(t)) + 2\Phi^T(z(t))\gamma G(z(t-\tau(t))) + 2\Phi^T(z(t))\gamma F(z(t))\big] +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\alpha_i+\beta_i)\Big\{\frac{b_{ij}^2(\psi_{ij}^{-1}(t))}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(t))}\big[g_j(x_j(t))-g_j(y_j(t))\big]^2 e^{\varepsilon(t+\tau_{ij}(\psi_{ij}^{-1}(t)))} -$$
$$- b_{ij}^2(t)\big[g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))\big]^2 e^{\varepsilon t}\Big\} +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\omega_i+\sigma_i)\Big\{\int_{-\infty}^{0}k_{ij}(-s)c_{ij}^2(t-s)\big[h_j(x_j(t))-h_j(y_j(t))\big]^2 e^{\varepsilon(t-s)}\,ds -$$
$$- \int_{-\infty}^{0}k_{ij}(-s)c_{ij}^2(t)\big[h_j(x_j(t+s))-h_j(y_j(t+s))\big]^2 e^{\varepsilon t}\,ds\Big\} +$$
$$+ \varepsilon e^{\varepsilon t}\Big[z^T(t)Sz(t) + 2\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds\Big]. \tag{5}$$
It follows by the Cauchy–Schwarz inequality that

$$n\sum_{j=1}^{n} b_{ij}^2(t)\big[g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))\big]^2 \ge \Big\{\sum_{j=1}^{n} b_{ij}(t)\big[g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))\big]\Big\}^2.$$
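This is the elementary estimate $\big(\sum_{j=1}^{n} a_j\big)^2 \le n\sum_{j=1}^{n} a_j^2$ (Cauchy–Schwarz against the all-ones vector); a quick random check (illustrative only):

```python
import random

random.seed(1)
for _ in range(1000):
    n = random.randint(1, 10)
    a = [random.uniform(-3, 3) for _ in range(n)]
    lhs = sum(a) ** 2
    rhs = n * sum(x * x for x in a)
    # (sum a_j)^2 <= n * sum a_j^2
    assert lhs <= rhs + 1e-9
```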
Since $\int_0^\infty k_{ij}(s)\,ds = 1$, we obtain

$$\int_{-\infty}^{0}k_{ij}(-s)\big[h_j(x_j(t+s))-h_j(y_j(t+s))\big]^2\,ds = \int_{-\infty}^{0}k_{ij}(-s)\big[h_j(x_j(t+s))-h_j(y_j(t+s))\big]^2\,ds\int_{-\infty}^{0}k_{ij}(-s)\,ds \ge$$
$$\ge \Big\{\int_{-\infty}^{0}k_{ij}(-s)\big[h_j(x_j(t+s))-h_j(y_j(t+s))\big]\,ds\Big\}^2$$

for all $i,j = 1,\dots,n$. Hence, we have

$$\sum_{j=1}^{n}c_{ij}^2(t)\int_{-\infty}^{0}k_{ij}(-s)\big[h_j(x_j(t+s))-h_j(y_j(t+s))\big]^2\,ds \ge \sum_{j=1}^{n}c_{ij}^2(t)\Big\{\int_{-\infty}^{t}k_{ij}(t-s)\big[h_j(x_j(s))-h_j(y_j(s))\big]\,ds\Big\}^2 \ge$$
$$\ge \frac{1}{n}\Big\{\sum_{j=1}^{n}c_{ij}(t)\int_{-\infty}^{t}k_{ij}(t-s)\big[h_j(x_j(s))-h_j(y_j(s))\big]\,ds\Big\}^2.$$
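Because the kernel has total mass one, the first step above is the Cauchy–Schwarz (Jensen-type) bound $\int k f^2 \ge \big(\int k f\big)^2$. A discrete analogue with weights summing to one (illustrative only):

```python
import random

random.seed(2)
for _ in range(1000):
    n = random.randint(1, 20)
    w = [random.random() + 1e-9 for _ in range(n)]
    s = sum(w)
    k = [x / s for x in w]          # discrete "kernel" with total mass 1
    f = [random.uniform(-2, 2) for _ in range(n)]
    mean_f2 = sum(ki * fi * fi for ki, fi in zip(k, f))
    mean_f = sum(ki * fi for ki, fi in zip(k, f))
    # ∑ k_i f_i^2 >= (∑ k_i f_i)^2 when ∑ k_i = 1
    assert mean_f2 >= mean_f ** 2 - 1e-9
```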
Firstly, we assume that (H3) holds. From (H1), (H2), we have

$$\frac{dV(t,z_t)}{dt} \le e^{\varepsilon t}\Big\{-2z^T(t)SD(t)z(t) + 2z^T(t)SA(t)\Phi(z(t)) + 2z^T(t)SG(z(t-\tau(t))) + 2z^T(t)SF(z(t)) -$$
$$- 2\Phi^T(z(t))\gamma D(t)z(t) + 2\Phi^T(z(t))\gamma A(t)\Phi(z(t)) + 2\Phi^T(z(t))\gamma G(z(t-\tau(t))) + 2\Phi^T(z(t))\gamma F(z(t)) +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\alpha_i+\beta_i)K_j^2\,\frac{b_{ij}^2(\psi_{ij}^{-1}(t))}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(t))}\,e^{\varepsilon\tau_{ij}(\psi_{ij}^{-1}(t))}z_j^2(t) -$$
$$- \frac{1}{n}\sum_{i=1}^{n}(\alpha_i+\beta_i)\Big[\sum_{j=1}^{n}b_{ij}(t)\big(g_j(x_j(t-\tau_{ij}(t)))-g_j(y_j(t-\tau_{ij}(t)))\big)\Big]^2 +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\omega_i+\sigma_i)\Big(\int_{-\infty}^{0}k_{ij}(-s)c_{ij}^2(t-s)e^{-\varepsilon s}\,ds\Big)L_j^2z_j^2(t) -$$
$$- \frac{1}{n}\sum_{i=1}^{n}(\omega_i+\sigma_i)\Big[\sum_{j=1}^{n}c_{ij}(t)\int_{-\infty}^{t}k_{ij}(t-s)\big(h_j(x_j(s))-h_j(y_j(s))\big)\,ds\Big]^2 +$$
$$+ \varepsilon z^T(t)Sz(t) + 2\varepsilon\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds\Big\} =$$
$$= e^{\varepsilon t}\Big\{-2z^T(t)SD(t)z(t) + 2z^T(t)SA(t)\Phi(z(t)) + 2z^T(t)SG(z(t-\tau(t))) + 2z^T(t)SF(z(t)) -$$
$$- 2\Phi^T(z(t))\gamma D(t)z(t) + 2\Phi^T(z(t))\gamma A(t)\Phi(z(t)) + 2\Phi^T(z(t))\gamma G(z(t-\tau(t))) + 2\Phi^T(z(t))\gamma F(z(t)) +$$
$$+ \sum_{i=1}^{n}(\alpha_i+\beta_i)z^T(t)B_i(t,\varepsilon)z(t) - \frac{1}{n}G^T(z(t-\tau(t)))(\alpha+\beta)G(z(t-\tau(t))) +$$
$$+ \sum_{i=1}^{n}(\omega_i+\sigma_i)z^T(t)C_i(t,\varepsilon)z(t) - \frac{1}{n}F^T(z(t))(\omega+\sigma)F(z(t)) +$$
$$+ \varepsilon z^T(t)Sz(t) + 2\varepsilon\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds\Big\}, \tag{6}$$

where

$$B_i(t,\varepsilon) = \mathrm{diag}\Big(K_1^2\,\frac{b_{i1}^2(\psi_{i1}^{-1}(t))}{1-\dot{\tau}_{i1}(\psi_{i1}^{-1}(t))}\,e^{\varepsilon\tau_{i1}(\psi_{i1}^{-1}(t))},\ \dots,\ K_n^2\,\frac{b_{in}^2(\psi_{in}^{-1}(t))}{1-\dot{\tau}_{in}(\psi_{in}^{-1}(t))}\,e^{\varepsilon\tau_{in}(\psi_{in}^{-1}(t))}\Big),$$

$$C_i(t,\varepsilon) = \mathrm{diag}\Big(L_1^2\int_{-\infty}^{0}k_{i1}(-s)c_{i1}^2(t-s)e^{-\varepsilon s}\,ds,\ \dots,\ L_n^2\int_{-\infty}^{0}k_{in}(-s)c_{in}^2(t-s)e^{-\varepsilon s}\,ds\Big)$$

for all $i = 1,\dots,n$.
Let $\eta(t) = \mathrm{diag}(\eta_1(t),\dots,\eta_n(t))$, where $\eta_i(t) = \eta_i(z_i(t))$ is chosen such that

$$\Phi(z(t)) = \eta(t)z(t) \quad (\forall t \in \mathbb{R}_+). \tag{7}$$

Using the inequality

$$2x^Ty - y^TDy \le x^TD^{-1}x, \quad \text{where } x, y \in \mathbb{R}^n,\ D > 0, \tag{8}$$

we have

$$2z^T(t)SG(z(t-\tau(t))) - \frac{1}{n}G^T(z(t-\tau(t)))\,\alpha\,G(z(t-\tau(t))) \le n\,z^T(t)S\alpha^{-1}Sz(t),$$
$$2z^T(t)SF(z(t)) - \frac{1}{n}F^T(z(t))\,\omega\,F(z(t)) \le n\,z^T(t)S\omega^{-1}Sz(t),$$
$$2\Phi^T(z(t))\gamma G(z(t-\tau(t))) - \frac{1}{n}G^T(z(t-\tau(t)))\,\beta\,G(z(t-\tau(t))) \le n\,\Phi^T(z(t))\gamma\beta^{-1}\gamma\Phi(z(t)),$$
$$2\Phi^T(z(t))\gamma F(z(t)) - \frac{1}{n}F^T(z(t))\,\sigma\,F(z(t)) \le n\,\Phi^T(z(t))\gamma\sigma^{-1}\gamma\Phi(z(t)).$$
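Inequality (8) is the completed-square bound $0 \le (y - D^{-1}x)^T D (y - D^{-1}x) = y^TDy - 2x^Ty + x^TD^{-1}x$. A random numerical check with a diagonal positive definite $D$ (example data ours, for illustration):

```python
import random

random.seed(3)
for _ in range(500):
    n = 3
    # D = diag(d) with d_i > 0 is positive definite; D^{-1} = diag(1/d)
    d = [random.uniform(0.1, 5.0) for _ in range(n)]
    x = [random.uniform(-2, 2) for _ in range(n)]
    y = [random.uniform(-2, 2) for _ in range(n)]
    lhs = 2 * sum(xi * yi for xi, yi in zip(x, y)) - sum(di * yi * yi for di, yi in zip(d, y))
    rhs = sum(xi * xi / di for di, xi in zip(d, x))
    # 2 x^T y - y^T D y <= x^T D^{-1} x
    assert lhs <= rhs + 1e-9
```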
This implies

$$\frac{dV(t,z_t)}{dt} \le -e^{\varepsilon t}\,z^T(t)\,D_1(t,\eta,\varepsilon)\,z(t), \tag{9}$$

where

$$D_1(t,\eta,\varepsilon) = SD(t) + D(t)S - \big(SA(t) - D(t)\gamma\big)\eta - \eta\big(A^T(t)S - \gamma D(t)\big) - \eta\big(\gamma A(t) + A^T(t)\gamma\big)\eta -$$
$$- nS(\omega^{-1} + \alpha^{-1})S - n\eta\gamma(\sigma^{-1} + \beta^{-1})\gamma\eta - \sum_{i=1}^{n}(\alpha_i+\beta_i)B_i(t,\varepsilon) - \sum_{i=1}^{n}(\omega_i+\sigma_i)C_i(t,\varepsilon) -$$
$$- \varepsilon S - \varepsilon\,\frac{2\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds}{\|z(t)\|^2}\,E.$$
Obviously, $\lim_{\varepsilon\to 0}B_i(t,\varepsilon) = B_i(t)$ and $\lim_{\varepsilon\to 0}C_i(t,\varepsilon) = C_i(t)$ uniformly for all $t \in \mathbb{R}_+$ and $i = 1,\dots,n$. From assumption (H2) we obtain

$$\frac{2}{\|z(t)\|^2}\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds \le \max_{1\le i\le n}\gamma_iH_i \quad (\forall t \in \mathbb{R}_+).$$

Hence, we have $\lim_{\varepsilon\to 0}D_1(t,\eta,\varepsilon) = D_1(t,\eta)$ uniformly for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$. Thus, by assumption (H3), there exists a constant $\varepsilon > 0$ such that $\lambda_{\min}(D_1(t,\eta,\varepsilon)) \ge \frac{a}{2}$ for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$. Therefore, by (9), we have

$$\frac{dV(t,z_t)}{dt} \le -\frac{a}{2}\,e^{\varepsilon t}z^T(t)z(t) \quad (\forall t \in \mathbb{R}_+). \tag{10}$$
Secondly, we assume that (H4) holds. By (8) we obtain

$$2\Phi^T(z(t))\gamma G(z(t-\tau(t))) + \Phi^T(z(t))\,\frac{\gamma A(t)+A^T(t)\gamma}{2}\,\Phi(z(t)) =$$
$$= 2\Phi^T(z(t))\gamma G(z(t-\tau(t))) - \Phi^T(z(t))\Big(-\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)\Phi(z(t)) \le$$
$$\le -G^T(z(t-\tau(t)))\,\gamma\Big(\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)^{-1}\gamma\,G(z(t-\tau(t))),$$

$$2\Phi^T(z(t))\gamma F(z(t)) + \Phi^T(z(t))\,\frac{\gamma A(t)+A^T(t)\gamma}{2}\,\Phi(z(t)) =$$
$$= 2\Phi^T(z(t))\gamma F(z(t)) - \Phi^T(z(t))\Big(-\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)\Phi(z(t)) \le$$
$$\le -F^T(z(t))\,\gamma\Big(\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)^{-1}\gamma\,F(z(t)),$$

$$2z^T(t)SG(z(t-\tau(t))) - \frac{1}{n}G^T(z(t-\tau(t)))\,\beta\,G(z(t-\tau(t))) \le n\,z^T(t)S\beta^{-1}Sz(t),$$
$$2z^T(t)SF(z(t)) - \frac{1}{n}F^T(z(t))\,\sigma\,F(z(t)) \le n\,z^T(t)S\sigma^{-1}Sz(t).$$
So, from (4) and (6) with $\alpha = -\alpha^*E$, $\omega = -\omega^*E$, we have

$$\frac{dV(t,z_t)}{dt} \le e^{\varepsilon t}\Big\{-2z^T(t)SD(t)z(t) + 2z^T(t)SA(t)\Phi(z(t)) -$$
$$- G^T(z(t-\tau(t)))\,\gamma\Big(\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)^{-1}\gamma\,G(z(t-\tau(t))) - F^T(z(t))\,\gamma\Big(\frac{\gamma A(t)+A^T(t)\gamma}{2}\Big)^{-1}\gamma\,F(z(t)) +$$
$$+ n\,z^T(t)S(\beta^{-1}+\sigma^{-1})Sz(t) - 2\Phi^T(z(t))\gamma D(t)z(t) +$$
$$+ \sum_{i=1}^{n}(\alpha_i+\beta_i)z^T(t)B_i(t,\varepsilon)z(t) + \sum_{i=1}^{n}(\omega_i+\sigma_i)z^T(t)C_i(t,\varepsilon)z(t) -$$
$$- \frac{1}{n}G^T(z(t-\tau(t)))\,\alpha\,G(z(t-\tau(t))) - \frac{1}{n}F^T(z(t))\,\omega\,F(z(t)) +$$
$$+ \varepsilon z^T(t)Sz(t) + 2\varepsilon\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds\Big\} \le$$
$$\le -e^{\varepsilon t}\,z^T(t)\,D_2(t,\eta,\varepsilon)\,z(t). \tag{11}$$

Let

$$D_2(t,\eta,\varepsilon) = SD(t) + D(t)S - \big(SA(t) - D(t)\gamma\big)\eta - \eta\big(A^T(t)S - \gamma D(t)\big) - nS(\beta^{-1}+\sigma^{-1})S -$$
$$- \sum_{i=1}^{n}(\beta_i-\alpha^*)B_i(t,\varepsilon) - \sum_{i=1}^{n}(\sigma_i-\omega^*)C_i(t,\varepsilon) - \varepsilon S - \varepsilon\,\frac{2\sum_{i=1}^{n}\gamma_i\int_0^{z_i(t)}\Phi_i(s)\,ds}{\|z(t)\|^2}\,E.$$

By a similar argument to that used for $D_1(t,\eta,\varepsilon)$, we also have $\lim_{\varepsilon\to 0}D_2(t,\eta,\varepsilon) = D_2(t,\eta)$ uniformly for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$. Thus, by assumption (H4), there exists a constant $\varepsilon > 0$ such that

$$\lambda_{\min}(D_2(t,\eta,\varepsilon)) \ge \frac{a}{2}.$$
Therefore, by (11), we finally have

$$\frac{dV(t,z_t)}{dt} \le -\frac{a}{2}\,e^{\varepsilon t}z^T(t)z(t) \quad (\forall t \in \mathbb{R}_+). \tag{12}$$

From (10), (12) we further obtain

$$V(t) \le V(0) \quad (\forall t \ge 0). \tag{13}$$

Directly from (4) and assumption (H2), we have

$$V(t) \ge z^T(t)Sz(t)e^{\varepsilon t} \ge \lambda_{\min}(S)\,e^{\varepsilon t}\|z(t)\|^2 \quad (\forall t \ge 0), \tag{14}$$
$$V(0) = z^T(0)Sz(0) + 2\sum_{i=1}^{n}\gamma_i\int_0^{z_i(0)}\Phi_i(s)\,ds +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\alpha_i+\beta_i)\int_{-\tau_{ij}(0)}^{0}\frac{b_{ij}^2(\psi_{ij}^{-1}(s))}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}\big[g_j(x_j(s))-g_j(y_j(s))\big]^2 e^{\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\,ds +$$
$$+ \sum_{i=1}^{n}\sum_{j=1}^{n}(\omega_i+\sigma_i)\int_{-\infty}^{0}\int_{s}^{0}k_{ij}(-s)c_{ij}^2(u-s)\big[h_j(x_j(u))-h_j(y_j(u))\big]^2 e^{\varepsilon(u-s)}\,du\,ds \le$$
$$\le \lambda_{\max}(S)\|\varphi-\psi\|^2 + \max_{1\le i\le n}(\gamma_iH_i)\|\varphi-\psi\|^2 + \sum_{i=1}^{n}(\alpha_i+\beta_i)P_i\|\varphi-\psi\|^2 + \sum_{i=1}^{n}(\omega_i+\sigma_i)O_i\|\varphi-\psi\|^2 =$$
$$= \Big[\lambda_{\max}(S) + \max_{1\le i\le n}(\gamma_iH_i) + \sum_{i=1}^{n}(\alpha_i+\beta_i)P_i + \sum_{i=1}^{n}(\omega_i+\sigma_i)O_i\Big]\|\varphi-\psi\|^2,$$

where

$$P_i = \max_{1\le j\le n}\ \tau\sup_{s\in[-\tau,0]}\Big\{K_j^2\,\frac{b_{ij}^2(\psi_{ij}^{-1}(s))}{1-\dot{\tau}_{ij}(\psi_{ij}^{-1}(s))}\,e^{\varepsilon(s+\tau_{ij}(\psi_{ij}^{-1}(s)))}\Big\}$$

(here $\tau$ denotes an upper bound of the bounded delays $\tau_{ij}(t)$), and

$$O_i = \max_{1\le j\le n}\Big(\sup_{t\in\mathbb{R}_+}c_{ij}^2(t)\Big)L_j^2\,\frac{p_{ij}(\varepsilon)-1}{\varepsilon}.$$

Hence, there exists $P_0 > 1$ such that $\|z(t)\|^2 \le P_0\|\varphi-\psi\|^2e^{-\varepsilon t}$ for all $t \in \mathbb{R}_+$, that is,

$$\|x(t)-y(t)\| \le \sqrt{P_0}\,\|\varphi-\psi\|\,e^{-\varepsilon t/2} \quad \text{for all } t \in \mathbb{R}_+.$$
This completes the proof of Theorem 1.
4. Corollaries. In this section, as special cases of Theorem 1, we derive some corollaries which seem to be advantageous for stability tests.
Corollary 1. Assume that hypotheses (H1), (H2) are satisfied. If there is a constant $a > 0$ such that

$$\lambda_{\min}\Big(2D(t) - A(t)\eta - \eta A^T(t) - \sum_{i=1}^{n}\big(B_i(t)+C_i(t)\big) - 2nE\Big) \ge a$$

for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$, then system (1) is globally exponentially stable.

Proof. Choosing $S = \alpha = \omega = \delta E$ ($\delta > 0$) and $\gamma = 0$, we obtain

$$D_1(t,\eta) = \delta\Big[2D(t) - A(t)\eta - \eta A^T(t) - \sum_{i=1}^{n}\big(B_i(t)+C_i(t)\big) - 2nE\Big] - \sum_{i=1}^{n}\big(\beta_iB_i(t) + \sigma_iC_i(t)\big).$$

According to Theorem 1, the corollary holds.
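For constant coefficients, the Corollary 1 test amounts to checking the smallest eigenvalue of a symmetric matrix over the box $0 \le \eta \le H$. A rough numerical sketch for a hypothetical 2-neuron example (all matrices and bounds below are invented for illustration and are not taken from the paper; numpy supplies the eigenvalue routine):

```python
import itertools
import numpy as np

n = 2
D = np.diag([10.0, 10.0])            # decay rates d_i (invented example)
A = np.array([[-1.0, 0.2], [0.3, -1.0]])
BCsum = np.diag([0.5, 0.5])          # stands in for sum_i (B_i + C_i)
Hbound = 1.0                         # sector bounds H_i = 1

def lam_min(eta_diag):
    # λ_min(2D - A η - η A^T - Σ(B_i + C_i) - 2nE) for a given diagonal η
    eta = np.diag(eta_diag)
    M = 2 * D - A @ eta - eta @ A.T - BCsum - 2 * n * np.eye(n)
    M = (M + M.T) / 2                # symmetrize before eigenvalue computation
    return np.linalg.eigvalsh(M).min()

# evaluate the condition on a grid of admissible 0 <= eta_i <= H_i
worst = min(lam_min(np.array(e))
            for e in itertools.product(np.linspace(0.0, Hbound, 11), repeat=n))
assert worst > 0   # the Corollary 1 condition holds for this toy example
```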
Corollary 2. Assume that hypotheses (H1), (H2) are satisfied. If $A(t) + A^T(t) < 0$ and there is a constant $a > 0$ such that

$$\lambda_{\min}\Big(2D(t) - \big(A(t)-D(t)\big)\eta - \eta\big(A^T(t)-D(t)\big) - 2nE - \sum_{i=1}^{n}\big[(1-\alpha^*)B_i(t) + (1-\omega^*)C_i(t)\big]\Big) \ge a$$

for all $t \in \mathbb{R}_+$ and $0 \le \eta \le H$, then system (1) is globally exponentially stable; here

$$\alpha^* = \omega^* = n\Big[\inf_{t\in\mathbb{R}_+}\lambda_{\min}\Big(\frac{A(t)+A^T(t)}{2}\Big)\Big]^{-1}.$$

Proof. By choosing $S = \beta = \gamma = \sigma = E$, we obtain

$$D_2(t,\eta) = 2D(t) - \big(A(t)-D(t)\big)\eta - \eta\big(A^T(t)-D(t)\big) - 2nE - \sum_{i=1}^{n}\big[(1-\alpha^*)B_i(t) + (1-\omega^*)C_i(t)\big].$$

Using Theorem 1, Corollary 2 holds.
When $D(t) \equiv D$, $A(t) \equiv A$, $B(t) \equiv B$, $C(t) \equiv C$, $\tau_{ij}(t) \equiv \tau$, $f = g = h$, and $I(t) \equiv I$ for all $t \in \mathbb{R}_+$, system (1) degenerates into

$$\frac{dx_i(t)}{dt} = -d_ix_i(t) + \sum_{j=1}^{n}a_{ij}g_j(x_j(t)) + \sum_{j=1}^{n}b_{ij}g_j(x_j(t-\tau)) + \sum_{j=1}^{n}c_{ij}\int_{-\infty}^{t}k_{ij}(t-s)g_j(x_j(s))\,ds + I_i, \quad i = \overline{1,n}. \tag{15}$$
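As a rough illustration of the global exponential stability asserted in Definition 2, one can integrate two solutions of a small instance of system (15) from different initial functions with a simple Euler scheme and a truncated exponential kernel, and watch $\|x(t)-y(t)\|$ decay. Everything below (the parameters, the kernel, the step sizes, the constant initial functions) is an invented toy example, not taken from the paper:

```python
import math

# toy 2-neuron instance of system (15):
#   dx_i/dt = -d_i x_i + Σ_j a_ij g(x_j(t)) + Σ_j b_ij g(x_j(t - tau))
#             + Σ_j c_ij ∫_{-∞}^t k(t-s) g(x_j(s)) ds + I_i
n, dt, tau, lam = 2, 0.002, 0.1, 2.0
d = [10.0, 10.0]
a = [[-1.0, 0.2], [0.3, -1.0]]
b = [[0.1, 0.05], [0.05, 0.1]]
c = [[0.1, 0.0], [0.0, 0.1]]
I = [0.5, -0.3]
g = math.tanh
K = 500                                   # truncate the kernel integral at K*dt
kw = [lam * math.exp(-lam * m * dt) * dt for m in range(K)]   # k(s) ds weights
lag_steps = int(tau / dt)

def simulate(x0, steps=2000):
    hist = [list(x0)]                     # constant initial function on (-inf, 0]
    ghist = [[g(v) for v in x0]]
    for _ in range(steps):
        x, gx = hist[-1], ghist[-1]
        lag = ghist[max(0, len(ghist) - 1 - lag_steps)]
        m_max = min(K, len(ghist))
        conv = [sum(kw[m] * ghist[-1 - m][j] for m in range(m_max))
                for j in range(n)]        # Riemann sum for the kernel term
        new = [x[i] + dt * (-d[i] * x[i]
                            + sum(a[i][j] * gx[j] for j in range(n))
                            + sum(b[i][j] * lag[j] for j in range(n))
                            + sum(c[i][j] * conv[j] for j in range(n))
                            + I[i])
               for i in range(n)]
        hist.append(new)
        ghist.append([g(v) for v in new])
    return hist

hx = simulate([1.0, -1.0])
hy = simulate([-0.5, 0.8])
dist = [math.dist(u, v) for u, v in zip(hx, hy)]
# the gap between the two solutions shrinks, as exponential stability predicts
assert dist[-1] < 0.01 * dist[0]
```

The strong decay rates $d_i = 10$ dominate the coupling here, so the two trajectories collapse onto each other quickly; this is only a plausibility check, not a verification of the theorem's hypotheses.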
We consider the following assumptions:

(H1') $-(A+B+C) \in P_0$; the functions $k_{ij}\colon [0,\infty) \to [0,\infty)$, $i,j = 1,\dots,n$, are piecewise continuous on $[0,\infty)$ and satisfy $\int_0^\infty e^{\varepsilon s}k_{ij}(s)\,ds = p_{ij}(\varepsilon)$, where the $p_{ij}(\varepsilon)$ are continuous functions on $[0,\delta)$, $\delta > 0$, with $p_{ij}(0) = 1$.

(H2') There are positive constants $H_i$, $i = 1,\dots,n$, such that $0 \le \dfrac{g_i(u) - g_i(u^*)}{u - u^*} \le H_i$ for all $u, u^* \in \mathbb{R}$ and $i = 1,\dots,n$.

(H3') There are a positive definite symmetric matrix $S$, diagonal matrices

$$\alpha = \mathrm{diag}(\alpha_1,\dots,\alpha_n) > 0, \quad \beta = \mathrm{diag}(\beta_1,\dots,\beta_n) > 0, \quad \omega = \mathrm{diag}(\omega_1,\dots,\omega_n) > 0, \quad \sigma = \mathrm{diag}(\sigma_1,\dots,\sigma_n) > 0, \quad \gamma = \mathrm{diag}(\gamma_1,\dots,\gamma_n) \ge 0,$$

such that $\lambda_{\min}(D_1(\eta)) > 0$ for all $0 \le \eta \le H$, where

$$D_1(\eta) = SD + DS - nS(\alpha^{-1} + \omega^{-1})S - n\eta\gamma(\beta^{-1} + \sigma^{-1})\gamma\eta - (SA - D\gamma)\eta - \eta(A^TS - \gamma D) - \eta(\gamma A + A^T\gamma)\eta - \sum_{i=1}^{n}\big[(\alpha_i+\beta_i)B_i + (\omega_i+\sigma_i)C_i\big],$$

where $\eta = \mathrm{diag}(\eta_1,\dots,\eta_n)$, $H = \mathrm{diag}(H_1,\dots,H_n)$, $B_i = \mathrm{diag}(b_{i1}^2H_1^2,\dots,b_{in}^2H_n^2)$, $C_i = \mathrm{diag}(c_{i1}^2L_1^2,\dots,c_{in}^2L_n^2)$, $i = 1,\dots,n$.
(H4') There are a positive definite symmetric matrix $S$ and diagonal matrices

$$\beta = \mathrm{diag}(\beta_1,\dots,\beta_n) > 0, \quad \sigma = \mathrm{diag}(\sigma_1,\dots,\sigma_n) > 0, \quad \gamma = \mathrm{diag}(\gamma_1,\dots,\gamma_n) > 0$$

such that $\gamma A + A^T\gamma < 0$ and $\lambda_{\min}(D_2(\eta)) > 0$ for all $0 \le \eta \le H$, where

$$D_2(\eta) = SD + DS - nS(\beta^{-1} + \sigma^{-1})S - (SA - D\gamma)\eta - \eta(A^TS - \gamma D) - \sum_{i=1}^{n}\big[(\beta_i-\alpha^*)B_i + (\sigma_i-\omega^*)C_i\big],$$

$$\alpha^* = \omega^* = n\Big[\lambda_{\min}\Big(\gamma^{-1}\,\frac{\gamma A + A^T\gamma}{2}\,\gamma^{-1}\Big)\Big]^{-1}.$$
We have the following results.

Corollary 3. Assume that hypotheses (H1'), (H2') and (H3') or (H4') are satisfied. Then system (15) is absolutely exponentially stable with respect to the class $PLI$.

Proof. By [13], system (15) has a unique equilibrium point for every function $g \in PLI$ and every input vector $I$ if and only if $-(A+B+C) \in P_0$. Hence, Corollary 3 follows from Theorem 1.
Further, as consequences of Corollaries 1 and 2, we have the following corollaries.

Corollary 4. Assume that hypotheses (H1'), (H2') are satisfied. If

$$\lambda_{\min}\Big(2D - A\eta - \eta A^T - \sum_{i=1}^{n}(B_i + C_i) - 2nE\Big) > 0$$

for all $0 \le \eta \le H$, then system (15) is absolutely exponentially stable with respect to the class $PLI$.

Corollary 5. Assume that hypotheses (H1'), (H2') are satisfied. If $A + A^T < 0$ and

$$\lambda_{\min}\Big(2D - (A - D)\eta - \eta(A^T - D) - 2nE - \sum_{i=1}^{n}\big[(1-\alpha^*)B_i + (1-\omega^*)C_i\big]\Big) > 0$$

for all $0 \le \eta \le H$, then system (15) is absolutely exponentially stable with respect to the class $PLI$.
5. Conclusions. In this paper, general neural networks with variable and unbounded time delays have been studied. We introduce two new important assumptions, (H3) and (H4), which ensure the global exponential stability of the systems. The results obtained in this paper are new and completely different from those given in [3, 6 – 8]. Compared with [2], the results in this article improve and extend those of [2] in many aspects. Here, the Lyapunov functional is a scaled function; this shows that scaled Lyapunov functionals can be used to study CNNs with variable and unbounded delays (see [7] for a criticism of this method).

Acknowledgments. The authors would like to thank the Associate Editor and the anonymous reviewers for their constructive comments and suggestions to improve the quality of the paper.
1. Civalleri P. P., Gilli M., Pandolfi L. On stability of cellular neural networks with delay // IEEE Trans. Circuits and Syst.-I. – 1993. – 40. – P. 157 – 164.
2. Jiang H., Teng Z. Global exponential stability of cellular neural networks with time-varying coefficients and delay // Neural Networks. – 2004. – 17. – P. 1415 – 1425.
3. Liu B., Huang L. Existence and exponential stability of almost periodic solutions for cellular neural
networks with continuously distributed delays // J. Korean Math. Soc. – 2006. – 43. – P. 445 – 459.
4. Lu W., Rong L., Chen T. Global convergence of delayed neural networks systems // Int. J. Neural
Syst. – 2003. – 13. – P. 193 – 204.
5. Roska T., Chua L. O. Cellular neural networks with delay type template elements and non-uniform grids // Int. J. Circuit Theory and Appl. – 1992. – 20, # 4. – P. 469 – 481.
6. Zhao H. Global asymptotic stability of Hopfield neural network involving distributed delays // Neural Networks. – 2004. – 17. – P. 45 – 53.
7. Zhang J. Absolute stability of a class of neural networks with unbounded delay // Int. J. Circuit
Theory and Appl. – 2004. – 32. – P. 11 – 21.
8. Zhang J., Suda Y., Iwasa T. Absolutely exponential stability of a class of neural networks with unbounded delay // Neural Networks. – 2003. – 17. – P. 391 – 397.
9. Cao J., Wang J. Global exponential stability and periodicity of recurrent neural networks with time
delays // IEEE Trans. Circuits and Syst.-I: Regular Paper. – 2005. – 52.
10. Cao J. Global stability conditions for delayed CNNs // IEEE Trans. Circuits and Syst.-I: Fundam
Theory and Appl. – 2001. – 48.
11. Lu H., Chung F., He Z. Some sufficient conditions for global exponential stability of delayed
Hopfield neural networks // Neural Networks. – 2004. – 17. – P. 537 – 544.
12. Jiang H., Li Z., Teng Z. Boundedness and stability for autonomous cellular neural networks with
delay // Phys. Lett. A. – 2003. – 306. – P. 313 – 325 .
13. Rehim M., Jiang H., Teng Z. Boundedness and stability for nonautonomous cellular neural
networks with delay // Neural Networks. – 2004. – 17. – P. 1017 – 1025.
14. Zhang J. Absolutely exponential stability in delayed cellular neural networks // Int. J. Circuit
Theory and Appl. – 2002. – 30. – P. 395 – 409.
15. Driver R. D. Existence and stability of solution of a delay-differential system // Arch. Ration.
Mech. and Anal. – 1962. – 10. – P. 401 – 426.
Received 29.12.06,
after revision — 31.01.08