Direct Spectral Problem for the Generalized Jacobi Hermitian Matrices

In this article we will introduce and investigate some generalized Jacobi matrices. These matrices have three-diagonal block structure and they are Hermitian. We will give necessary and sufficient conditions for selfadjointness of the operator which is generated by the matrix of such a type, and consider its generalized eigenvector expansion.

Saved in:
| Date: | 2009 |
|---|---|
| Main Author: | Ivasiuk, I. Ya. |
| Format: | Article |
| Language: | English |
| Published: | Інститут математики НАН України, 2009 |
| ISSN: | 1029-3531 |
| Online Access: | https://nasplib.isofts.kiev.ua/handle/123456789/5704 |
| Institution: | Digital Library of Periodicals of National Academy of Sciences of Ukraine |
| Cite this: | Direct spectral problem for the generalized Jacobi Hermitian matrices / I. Ya. Ivasiuk // Methods of Functional Analysis and Topology. — 2009. — Vol. 15, no. 1. — P. 3–14. — Bibliogr.: 7 titles. — In English. |
Methods of Functional Analysis and Topology
Vol. 15 (2009), no. 1, pp. 3–14
DIRECT SPECTRAL PROBLEM FOR THE GENERALIZED JACOBI HERMITIAN MATRICES
I. YA. IVASIUK
To the memory of A. Ya. Povzner
Abstract. In this article we will introduce and investigate some generalized Jacobi matrices. These matrices have three-diagonal block structure and they are Hermitian. We will give necessary and sufficient conditions for selfadjointness of the operator which is generated by the matrix of such a type, and consider its generalized eigenvector expansion.
1. Introduction
First, we recall the direct spectral problem for the classical Jacobi matrix (see, e.g., [1, 3, 6]). In the classical theory, one considers the Hermitian Jacobi matrix
\[
J = \begin{pmatrix}
b_0 & c_0 & 0 & 0 & 0 & \dots \\
a_0 & b_1 & c_1 & 0 & 0 & \dots \\
0 & a_1 & b_2 & c_2 & 0 & \dots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix},
\qquad b_n \in \mathbb{R},\quad a_n = c_n > 0,\quad n \in \mathbb{N}_0 = \{0, 1, 2, \dots\},
\tag{1}
\]
on the space $\ell_2$ of sequences $f = (f_n)_{n=0}^{\infty}$, $f_n \in \mathbb{C}$.
This matrix, defined on finite sequences $f \in \ell_{\mathrm{fin}}$, gives rise to an operator on $\ell_2$, which is Hermitian with equal deficiency numbers and, therefore, has a selfadjoint extension on $\ell_2$. Under some conditions imposed on $J$, e.g., $\sum_{n=0}^{\infty} \frac{1}{a_n} = \infty$, the closure $\tilde J$ of $J$ is selfadjoint.
The direct spectral problem, i.e., the eigenfunction expansion for $\tilde J$ (or for some selfadjoint extension of $J$), is constructed in the following way (for simplicity we will assume that $\tilde J$ is selfadjoint).
We introduce, $\forall \lambda \in \mathbb{R}$, a sequence of polynomials, $P(\lambda) = \bigl(P_n(\lambda)\bigr)_{n=0}^{\infty}$, as a solution of the equation $JP(\lambda) = \lambda P(\lambda)$, $P_0(\lambda) = 1$, i.e., $\forall n \in \mathbb{N}_0$
\[
a_{n-1}P_{n-1}(\lambda) + b_n P_n(\lambda) + a_n P_{n+1}(\lambda) = \lambda P_n(\lambda),
\qquad P_{-1}(\lambda) = 0,\quad P_0(\lambda) = 1.
\tag{2}
\]
This recurrence has a solution: it is constructed inductively, starting with $P_0(\lambda)$, which is possible since $a_n > 0$ for all $n$.
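As a concrete illustration (not part of the original text), the following sketch evaluates $P_0(\lambda), \dots, P_N(\lambda)$ directly from the recurrence (2); the sample coefficients at the end are purely illustrative.

```python
import numpy as np

def classical_jacobi_polynomials(lam, a, b):
    """Evaluate P_0(lam), ..., P_N(lam) from the three-term recurrence (2):
    a_{n-1} P_{n-1} + b_n P_n + a_n P_{n+1} = lam * P_n,  P_{-1} = 0,  P_0 = 1.
    Here a = (a_0, ..., a_{N-1}) with a_n > 0 and b = (b_0, ..., b_N)."""
    N = len(b) - 1
    P = np.zeros(N + 1)
    P[0] = 1.0
    if N >= 1:
        P[1] = (lam - b[0]) / a[0]
    for n in range(1, N):
        P[n + 1] = ((lam - b[n]) * P[n] - a[n - 1] * P[n - 1]) / a[n]
    return P

# Illustrative (Chebyshev-like) coefficients: b_n = 0, a_n = 1/2.
print(classical_jacobi_polynomials(0.3, a=[0.5] * 10, b=[0.0] * 11))
```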
2000 Mathematics Subject Classification. Primary 47B36, 47A70, 47A75.
Key words and phrases. Block three-diagonal matrix, Jacobi matrix, generalized eigenvector, eigenvector expansion.
The sequence of polynomials $P(\lambda)$ is a generalized eigenvector for $\tilde J$ with eigenvalue $\lambda$; we use some quasinuclear rigging of the space $H = \ell_2$,
\[
H_- \supset H_0 \supset H_+, \qquad P(\lambda) \in H_-.
\tag{3}
\]
The corresponding Fourier transformation $F = \widehat{\ \cdot\ }$ is given by
\[
\ell_2 \supset \ell_{\mathrm{fin}} \ni f = (f_n)_{n=0}^{\infty} \mapsto \widehat f(\lambda) = \sum_{n=0}^{\infty} f_n P_n(\lambda) \in L_2(\mathbb{R}, d\rho(\lambda)) =: L_2.
\tag{4}
\]
This mapping is a unitary operator (after taking the closure) between $\ell_2$ and $L_2$. The image of $\tilde J$ is the operator of multiplication by $\lambda$ on the space $L_2$. The polynomials $P_n(\lambda)$ are orthonormal w.r.t. the spectral measure $d\rho(\lambda)$,
\[
\int_{\mathbb{R}} P_j(\lambda) P_k(\lambda)\, d\rho(\lambda) = \delta_{j,k}, \qquad j, k \in \mathbb{N}_0.
\tag{5}
\]
Note that (5) is a consequence of the Parseval equality that holds true for the mapping (4),
\[
\forall f, g \in \ell_{\mathrm{fin}} \qquad (f, g)_{\ell_2} = \int_{\mathbb{R}} \widehat f(\lambda)\, \widehat g(\lambda)\, d\rho(\lambda).
\tag{6}
\]
In this paper we will deal with the following situation. The matrix under consideration has the same structure as in (1), but $a_i, b_i, c_i$, $i \in \mathbb{N}_0$, are matrices (with complex elements) of dimensions $(i+2)\times(i+1)$, $(i+1)\times(i+1)$, $(i+1)\times(i+2)$, respectively. We assume that this matrix is Hermitian. The detailed form of such a matrix is given in Section 2. Also in this section, we formulate a criterion and a sufficient condition for selfadjointness of the operator generated by such a matrix on the space $\mathbb{C}^1 \oplus \mathbb{C}^2 \oplus \mathbb{C}^3 \oplus \cdots$. These results have analogs in the classical theory (see [3], Ch. 7). It is necessary to say that matrices of this type appear in [4], but there they are normal matrices connected with a complex moment problem. Also, truncated Jacobi matrices of similar structure appear in papers of Yuan Xu (see, e.g., [7]).
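For readers who want to experiment, here is a minimal sketch (our illustration, not part of the paper) that assembles a finite truncation of such a block matrix with the block sizes just described, with $b_i = b_i^*$ and $a_i = c_i^*$, and checks that the result is Hermitian.

```python
import numpy as np

def random_block_jacobi(n_blocks, rng=np.random.default_rng(0)):
    """Assemble a finite section of a block Jacobi matrix of type (9):
    diagonal blocks b_i ((i+1)x(i+1), Hermitian), upper blocks c_i ((i+1)x(i+2)),
    lower blocks a_i = c_i^* ((i+2)x(i+1))."""
    sizes = [i + 1 for i in range(n_blocks)]
    offsets = np.concatenate(([0], np.cumsum(sizes)))
    N = offsets[-1]
    J = np.zeros((N, N), dtype=complex)
    for i in range(n_blocks):
        B = rng.normal(size=(i + 1, i + 1)) + 1j * rng.normal(size=(i + 1, i + 1))
        J[offsets[i]:offsets[i+1], offsets[i]:offsets[i+1]] = (B + B.conj().T) / 2  # b_i = b_i^*
        if i + 1 < n_blocks:
            c = rng.normal(size=(i + 1, i + 2)) + 1j * rng.normal(size=(i + 1, i + 2))
            J[offsets[i]:offsets[i+1], offsets[i+1]:offsets[i+2]] = c               # c_i
            J[offsets[i+1]:offsets[i+2], offsets[i]:offsets[i+1]] = c.conj().T      # a_i = c_i^*
    return J

J = random_block_jacobi(5)
print(np.allclose(J, J.conj().T))  # True: the truncation is Hermitian
```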
In Section 3 we will introduce an analog of the first order polynomials for the matrix under consideration. An essential difference in our case, as compared with the classical situation, is that the matrices $a_i, c_i$ are not invertible, so we have to assume that $\forall i \in \mathbb{N}_0$ $\operatorname{rank} c_i = i+1$. It is natural that these matrices are not invertible, since they are not square. Using these polynomials we will construct a generalized eigenvector expansion. It is necessary to note that, because the $a_i, c_i$ are not invertible, there is no scalar spectral measure $d\rho(\lambda)$. The obtained measure is an infinite-dimensional matrix-valued measure that is similar to the one in the case of partial difference equations (compare with [2]).
2. Hermitian block Jacobi-type matrices and selfadjointness of the corresponding operators
Let us consider the complex Hilbert space
\[
l_2 = H_0 \oplus H_1 \oplus H_2 \oplus \cdots, \qquad H_i = \mathbb{C}^{i+1},\ i \in \mathbb{N}_0,
\tag{7}
\]
of vectors $l_2 \ni f = (f_n)_{n=0}^{\infty}$, where $f_n = (f_{n;j})_{j=0}^{n} \in H_n$; $f = \sum_{n=0}^{\infty}\sum_{j=0}^{n} f_{n;j}\, e_{n;j}$ (here $e_{n;j}$, $n = 0, 1, \dots$, $j = 0, 1, \dots, n$, are elements of the standard basis in $l_2$) with the scalar product $(f, g)_{l_2} = \sum_{n=0}^{\infty} (f_n, g_n)_{H_n}$; $f, g \in l_2$. Consider the Hilbert space rigging
\[
l = (l_{\mathrm{fin}})' \supset l_2(p^{-1}) \supset l_2 \supset l_2(p) \supset l_{\mathrm{fin}},
\tag{8}
\]
where $l_{\mathrm{fin}}$ is the space of finite vectors, $l$ is the space of arbitrary vectors, and $l_2(p)$ is the space of infinite vectors with the scalar product $(f, g)_{l_2(p)} = \sum_{n=0}^{\infty} (f_n, g_n)_{H_n}\, p_n$; $f, g \in l_2(p)$ (here $p = (p_n)_{n=0}^{\infty}$, $p_n > 0$, is a given weight). In what follows, $p_n \ge 1$ and $\sum_{n=0}^{\infty} p_n^{-1} < \infty$; therefore the embedding of the positive space $l_2(p) \subset l_2$ is quasinuclear. So, the rigging (8) is quasinuclear.
Let us consider, in the space (7), the Hermitian matrix $J = (J_{j,k})_{j,k=0}^{\infty}$ with operator-valued complex elements $J_{j,k} \colon H_k \to H_j$, $J_{j,k} = (J_{j,k;\alpha,\beta})_{\alpha=0}^{j}{}_{\beta=0}^{k}$, of the following block Jacobi structure:
\[
J = \begin{pmatrix}
b_0 & c_0 & 0 & 0 & \dots \\
a_0 & b_1 & c_1 & 0 & \dots \\
0 & a_1 & b_2 & c_2 & \dots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix},
\qquad \text{where} \quad
a_i = J_{i+1,i} \colon H_i \to H_{i+1}, \quad
b_i = J_{i,i} \colon H_i \to H_i, \quad
c_i = J_{i,i+1} \colon H_{i+1} \to H_i.
\tag{9}
\]
For the matrix $J$ to be Hermitian, it is necessary and sufficient that $b_i = b_i^*$, $a_i = c_i^*$, where $*$ denotes the adjoint matrix. Also, we suppose that $\forall i \in \mathbb{N}_0$ $\operatorname{rank} c_i = i + 1$.

Remark 1. Actually, $b_i = b_i^* \Leftrightarrow (b_i u_i, v_i)_{H_i} = (u_i, b_i v_i)_{H_i}$ $\forall u_i, v_i \in H_i$, and $a_i = c_i^* \Leftrightarrow (a_i \xi_i, \zeta_{i+1})_{H_{i+1}} = (\xi_i, c_i \zeta_{i+1})_{H_i}$ $\forall \xi_i \in H_i$, $\zeta_{i+1} \in H_{i+1}$, $i \in \mathbb{N}_0$.
Let $u \in l_2$. Then the matrix $J$ acts on $u$ in the following way:
\[
(Ju)_j = a_{j-1} u_{j-1} + b_j u_j + c_j u_{j+1}, \qquad \text{where } u_{-1} = 0.
\tag{10}
\]
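A minimal sketch (illustrative only, not from the paper) of the block action (10) on a finite truncation, with the blocks kept as Python lists `a`, `b`, `c` and the vector as a list of blocks `u`; terms falling outside the truncation are dropped, which corresponds to $u_{-1} = 0$ and to cutting the matrix off.

```python
import numpy as np

def apply_block_jacobi(a, b, c, u):
    """Apply (10): (Ju)_j = a_{j-1} u_{j-1} + b_j u_j + c_j u_{j+1}, u_{-1} = 0.
    `u` is a list of blocks u_j in C^{j+1}; blocks beyond the truncation are ignored."""
    n = len(u)
    out = []
    for j in range(n):
        v = b[j] @ u[j]
        if j > 0:
            v = v + a[j - 1] @ u[j - 1]
        if j + 1 < n:
            v = v + c[j] @ u[j + 1]
        out.append(v)
    return out
```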
It is easy to show that $\forall k, l \in \mathbb{N}_0$, $k \le l$, the following analogue of Green's formula takes place: $\forall u, v \in l_2$
\[
\begin{aligned}
\sum_{j=k}^{l} \Bigl[ \bigl((Ju)_j, v_j\bigr)_{H_j} - \bigl(u_j, (Jv)_j\bigr)_{H_j} \Bigr]
&= \Bigl[ (c_l u_{l+1}, v_l)_{H_l} - (a_l u_l, v_{l+1})_{H_{l+1}} \Bigr] \\
&\quad - \Bigl[ (c_{k-1} u_k, v_{k-1})_{H_{k-1}} - (a_{k-1} u_{k-1}, v_k)_{H_k} \Bigr].
\end{aligned}
\tag{11}
\]
Using the rigging (8) and formula (11) we can construct, from the matrix $J$, an operator $J$ that acts on $l_2$ (see, e.g., a similar scheme of construction in [3], Ch. 7, § 1). The construction is as follows. Consider the operator $J'$ on finite vectors in $l_2$ that acts by formula (10), i.e., $(J'u)_j = (Ju)_j$, $u \in l_{\mathrm{fin}}$, $u_{-1} = 0$. Using (11) we can conclude that $J'$ is Hermitian. By $J$ we denote the closure of the operator $J'$. It is easy to see that the domain of the operator $J^*$ consists of those $v \in l_2$ for which $Jv \in l_2$.
Let us consider the equation which makes it possible to find eigenvectors of the operator $J$:
\[
(J\varphi(z))_j = a_{j-1}\varphi_{j-1}(z) + b_j \varphi_j(z) + c_j \varphi_{j+1}(z) = z\varphi_j(z), \qquad z \in \mathbb{C},\ j \in \mathbb{N}_0,
\tag{12}
\]
where $\varphi \in l$; $\varphi_{-1}(z) = 0$.
Remark 2. This equation can be considered as a recurrence relation for finding ϕj+1 by
using ϕj and ϕj−1. In Section 3 we will make a few assumptions. They guarantee that
(12) is solvable.
Proposition 1. The operator $J$ is selfadjoint if and only if any non-zero solution of system (12) satisfies $\sum_{j=0}^{\infty} \|\varphi_j(z)\|_{H_j}^2 = \infty$, where $z \in \mathbb{C} \setminus \mathbb{R}$.

Proof. Let $z \in \mathbb{C}$, $\operatorname{Im} z \ne 0$, be some fixed number. Consider the deficiency subspace $N_{\bar z}$ of the operator $J$, orthogonal to $R(J - \bar z E)$ (here $R(A)$ is the range of the operator $A$). It coincides with the subspace of solutions of the equation $J^*\varphi = z\varphi$. Due to the construction of $J^*$, we have $(J^*\varphi(z))_j = (J\varphi(z))_j = z\varphi_j(z)$, $\varphi_{-1}(z) = 0$, $\varphi \in D(J^*)$. So, the dimension of $N_{\bar z}$ is not equal to zero if and only if $\sum_{j=0}^{\infty} \|\varphi_j(z)\|_{H_j}^2 < \infty$, where $\varphi(z)$ is some non-zero solution of (12). $\square$
Theorem 1. Let the matrix $J$ be such that $\sum_{j=0}^{\infty} \bigl(\|a_j\|_{j;j+1} + \|c_j\|_{j+1;j}\bigr)^{-1} = \infty$, where $\|\cdot\|_{k;l}$ denotes the norm of an $(l+1)\times(k+1)$-matrix, or of the respective operator that acts from $H_k$ to $H_l$. Then the operator $J$ is selfadjoint.
Proof. Let $z \in \mathbb{C}$, $\operatorname{Im} z \ne 0$, be some fixed number. According to Proposition 1, it is sufficient to show that for any non-zero solution of (12), $\sum_{j=0}^{\infty} \|\varphi_j(z)\|_{H_j}^2 = \infty$. Consider some non-zero solution $\varphi(z) = (\varphi_0(z), \varphi_1(z), \dots)$ of equation (12). Let $n_0 \in \mathbb{N}_0$ be the index of the first non-zero element of $\varphi(z)$, i.e., $H_{n_0} \ni \varphi_{n_0}(z) \ne 0$. Since $\forall j = 0, \dots, n_0 - 1$ $\varphi_j(z) \equiv 0$, it follows from (12) that $\varphi_{n_0}(z)$ does not depend on $z$, i.e., $\varphi_{n_0}(z) = \varphi_{n_0}$. Consider the identity
\[
(z - \bar z) \sum_{i=0}^{j} \bigl(\varphi_i(z), \varphi_i(z)\bigr)_{H_i}
= \sum_{i=0}^{j} \Bigl[ \bigl((J\varphi(z))_i, \varphi_i(z)\bigr)_{H_i} - \bigl(\varphi_i(z), (J\varphi(z))_i\bigr)_{H_i} \Bigr].
\]
Using formula (11) we have
\[
(z - \bar z) \sum_{i=0}^{j} \bigl(\varphi_i(z), \varphi_i(z)\bigr)_{H_i}
= \bigl(c_j \varphi_{j+1}(z), \varphi_j(z)\bigr)_{H_j} - \bigl(a_j \varphi_j(z), \varphi_{j+1}(z)\bigr)_{H_{j+1}}.
\]
Let $j \ge n_0$. Then $c\, \|\varphi_{n_0}\|_{H_{n_0}}^2 \le \bigl(\|c_j\|_{j+1;j} + \|a_j\|_{j;j+1}\bigr) \|\varphi_j(z)\|_{H_j} \|\varphi_{j+1}(z)\|_{H_{j+1}}$, where $c > 0$ is some constant. So,
\[
\infty = \sum_{j=0}^{\infty} \bigl(\|c_j\|_{j+1;j} + \|a_j\|_{j;j+1}\bigr)^{-1}
< \frac{1}{c}\,\frac{1}{\|\varphi_{n_0}\|_{H_{n_0}}^{2}} \sum_{j=0}^{\infty} \|\varphi_j(z)\|_{H_j} \|\varphi_{j+1}(z)\|_{H_{j+1}}
\le \frac{1}{c}\,\frac{1}{\|\varphi_{n_0}\|_{H_{n_0}}^{2}} \sum_{j=0}^{\infty} \|\varphi_j(z)\|_{H_j}^{2}. \qquad \square
\]
In what follows, the operator J is assumed to be selfadjoint.
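For numerical experiments one can monitor the Carleman-type condition of Theorem 1 on a finite family of blocks by its partial sums; the sketch below (our illustration, not from the paper) uses spectral norms for $\|a_j\|_{j;j+1}$ and $\|c_j\|_{j+1;j}$.

```python
import numpy as np

def carleman_partial_sums(a_blocks, c_blocks):
    """Partial sums of sum_j (||a_j|| + ||c_j||)^{-1} appearing in Theorem 1
    (spectral norms); divergence of the full series guarantees selfadjointness."""
    terms = [1.0 / (np.linalg.norm(a, 2) + np.linalg.norm(c, 2))
             for a, c in zip(a_blocks, c_blocks)]
    return np.cumsum(terms)
```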
3. The direct spectral problem
Let us consider equation (12). Since eigenvalues of a selfadjoint operator are real, we have the following:
\[
\left.
\begin{aligned}
b_0 \varphi_0 + c_0 \varphi_1(\lambda) &= \lambda \varphi_0, \\
a_0 \varphi_0 + b_1 \varphi_1(\lambda) + c_1 \varphi_2(\lambda) &= \lambda \varphi_1(\lambda), \\
&\ \vdots \\
a_{j-1} \varphi_{j-1}(\lambda) + b_j \varphi_j(\lambda) + c_j \varphi_{j+1}(\lambda) &= \lambda \varphi_j(\lambda), \\
&\ \vdots
\end{aligned}
\right\}
\qquad j = 1, 2, 3, \dots, \quad \lambda \in \mathbb{R}.
\tag{13}
\]
As one can see, none of the equations in system (13) defines $\varphi_{j+1}(\lambda)$ in a unique way from $\varphi_j(\lambda)$ and $\varphi_{j-1}(\lambda)$. Let us find a solution in the following manner.
Assume that $\operatorname{rank} c_j = j+1$ and the matrix $c_j = \{c_{j;\alpha,\beta}\}_{\alpha=0}^{j}{}_{\beta=0}^{j+1}$ is as follows: for the matrix $\tilde c_j := \{c_{j;\alpha,\beta}\}_{\alpha=0}^{j}{}_{\beta=1}^{j+1}$ there exists an inverse $\tilde c_j^{-1}$, and $c_{j;\cdot,0}$ is linearly dependent on the columns of $\tilde c_j$, where $c_{j;\cdot,i} = \{c_{j;\alpha,i}\}_{\alpha=0}^{j}$, $i = 0, 1, \dots, j+1$, is the $i$th column of the matrix $c_j$. We will use notation of this type for other matrices as well. Also, let $\varphi_{j;0}(\lambda) = \varphi_{j;0} \in \mathbb{C}$, $j = 0, 1, \dots$, be some complex constants, where $\varphi_{0;0} := \varphi_0$, and all of the above indicated $\varphi_{j;0}$ generate a vector of "boundary conditions",
\[
\varphi_{\cdot;0} := \begin{pmatrix} \varphi_0 \\ \varphi_{1;0} \\ \varphi_{2;0} \\ \vdots \end{pmatrix}.
\]
Remark 3. We make this assumption to simplify the subsequent construction. In fact, it is easy to carry out the following calculations in the same way in the general case. Since $\operatorname{rank} c_j = j + 1$, there are $j + 1$ columns of this matrix which form a linearly independent system, and then one more column that is a linear combination of the independent ones.
In accordance with the assumption, system (13) can be rewritten in the following way:
\[
\begin{aligned}
\varphi_{1;1}(\lambda) &= \frac{1}{c_{0;0,1}}(\lambda - b_0)\varphi_0 - \frac{c_{0;0,0}}{c_{0;0,1}}\varphi_{1;0}, \\
\begin{pmatrix} \varphi_{j+1;1}(\lambda) \\ \vdots \\ \varphi_{j+1;j+1}(\lambda) \end{pmatrix}
&= \tilde c_j^{-1}(\lambda I_j - b_j)\varphi_j(\lambda) - \tilde c_j^{-1} a_{j-1}\varphi_{j-1}(\lambda)
- \tilde c_j^{-1} c_{j;\cdot,0}\varphi_{j+1;0}, \qquad j \in \mathbb{N},
\end{aligned}
\tag{14}
\]
where $I_j$ is the identity matrix on $H_j$.
Denote by $P_{\alpha;(j,\cdot)}(\lambda) := \bigl(P_{\alpha;(j,k)}(\lambda)\bigr)_{k=0}^{j}$, $\alpha = 0, 1, \dots$, $j = 0, 1, \dots$, a solution of equation (14) satisfying the boundary conditions $P_{\alpha;(j,0)} = \delta_{j,\alpha}$, $j = 0, 1, \dots$ (the vector that has its $\alpha$th coordinate equal to 1 and all the others equal to zero will be denoted by $\delta_\alpha$). Let us consider a procedure for its construction. For a better understanding, we will describe this procedure in the simplest case first and then give it in general.
$0^0$. Let $\varphi_{\cdot;0} = \delta_0$. Then the corresponding solutions $\varphi_{j;k}(\lambda)$, $j = 0, 1, \dots$, $k = 0, 1, \dots, j$, of (14) are given by the polynomials $P_{0;(j,k)}(\lambda)$, $j = 0, 1, \dots$, $k = 0, 1, \dots, j$. We obtain a set of complex-valued polynomials, which it is convenient to represent in the following order:
\[
\begin{matrix}
P_{0;(0,0)}(\lambda) & P_{0;(1,0)}(\lambda) & P_{0;(2,0)}(\lambda) & P_{0;(3,0)}(\lambda) & \dots & P_{0;(j,0)}(\lambda) & \dots \\
& P_{0;(1,1)}(\lambda) & P_{0;(2,1)}(\lambda) & P_{0;(3,1)}(\lambda) & \dots & P_{0;(j,1)}(\lambda) & \dots \\
& & P_{0;(2,2)}(\lambda) & P_{0;(3,2)}(\lambda) & \dots & P_{0;(j,2)}(\lambda) & \dots \\
& & & P_{0;(3,3)}(\lambda) & \dots & P_{0;(j,3)}(\lambda) & \dots \\
& & & & \ddots & \vdots & \\
& & & & & P_{0;(j,j)}(\lambda) & \dots
\end{matrix}
\tag{15}
\]
The construction of these polynomials is as follows:
a) Since $\varphi_{\cdot;0} = \delta_0$, we have $P_{0;(0,0)}(\lambda) = 1$.
b) From the boundary conditions, $P_{0;(1,0)}(\lambda) = 0$. From (14), we have
\[
P_{0;(1,1)}(\lambda) = \frac{1}{c_{0;0,1}}(\lambda - b_0)P_{0;(0,0)}(\lambda) - \frac{c_{0;0,0}}{c_{0;0,1}}P_{0;(1,0)}(\lambda) = \frac{1}{c_{0;0,1}}(\lambda - b_0).
\]
c) Since $P_{0;(2,0)}(\lambda) = 0$, from (14) we obtain
\[
\begin{pmatrix} P_{0;(2,1)}(\lambda) \\ P_{0;(2,2)}(\lambda) \end{pmatrix}
= \tilde c_1^{-1}(\lambda I_1 - b_1) \begin{pmatrix} 0 \\ P_{0;(1,1)}(\lambda) \end{pmatrix} - \tilde c_1^{-1} a_0.
\]
Thus
\[
\begin{aligned}
P_{0;(2,1)}(\lambda) &= \frac{1}{c_{0;0,1}} \bigl(\tilde c_1^{-1}(\lambda I_1 - b_1)\bigr)_{0,1}(\lambda - b_0) - (\tilde c_1^{-1} a_0)_0, \\
P_{0;(2,2)}(\lambda) &= \frac{1}{c_{0;0,1}} \bigl(\tilde c_1^{-1}(\lambda I_1 - b_1)\bigr)_{1,1}(\lambda - b_0) - (\tilde c_1^{-1} a_0)_1,
\end{aligned}
\]
where $\bigl(\tilde c_1^{-1}(\lambda I_1 - b_1)\bigr)_{m,n}$ is the matrix element in the $m$th row and $n$th column, and $(\tilde c_1^{-1} a_0)_m$ is the $m$th coordinate of this vector.
d) Since $\varphi_{\cdot;0} = \delta_0$, we have $P_{0;(j,0)}(\lambda) = 0$, $j = 2, 3, \dots$, and from (14) it follows that
\[
\begin{pmatrix} P_{0;(j,1)}(\lambda) \\ P_{0;(j,2)}(\lambda) \\ \vdots \\ P_{0;(j,j)}(\lambda) \end{pmatrix}
= \tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})
\begin{pmatrix} 0 \\ P_{0;(j-1,1)}(\lambda) \\ \vdots \\ P_{0;(j-1,j-1)}(\lambda) \end{pmatrix}
- \tilde c_{j-1}^{-1} a_{j-2}
\begin{pmatrix} 0 \\ P_{0;(j-2,1)}(\lambda) \\ \vdots \\ P_{0;(j-2,j-2)}(\lambda) \end{pmatrix}.
\]
Then $\forall k = 1, 2, \dots, j$,
\[
\begin{aligned}
P_{0;(j,k)}(\lambda) &= \bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,\cdot} P_{0;(j-1,\cdot)}(\lambda)
- \bigl(\tilde c_{j-1}^{-1} a_{j-2}\bigr)_{k-1,\cdot} P_{0;(j-2,\cdot)}(\lambda) \\
&= \sum_{i=0}^{j-1} \bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,i} P_{0;(j-1,i)}(\lambda)
- \sum_{i=0}^{j-2} \bigl(\tilde c_{j-1}^{-1} a_{j-2}\bigr)_{k-1,i} P_{0;(j-2,i)}(\lambda).
\end{aligned}
\]
$\alpha^0$. Let us consider the general situation. Let $\varphi_{\cdot;0} = \delta_\alpha$, where $\alpha = 1, 2, \dots$ is some fixed number. Then the solutions of (14) with these boundary conditions give polynomials $P_{\alpha;(j,k)}(\lambda)$, i.e., $\varphi_{j;k}(\lambda) = P_{\alpha;(j,k)}(\lambda)$, $j = 0, 1, \dots$, $k = 0, 1, \dots, j$. It is convenient to represent these polynomials in an order similar to (15):
\[
\begin{matrix}
P_{\alpha;(0,0)}(\lambda) & \dots & P_{\alpha;(\alpha-1,0)}(\lambda) & P_{\alpha;(\alpha,0)}(\lambda) & \dots & P_{\alpha;(j,0)}(\lambda) & \dots \\
& & P_{\alpha;(\alpha-1,1)}(\lambda) & P_{\alpha;(\alpha,1)}(\lambda) & \dots & P_{\alpha;(j,1)}(\lambda) & \dots \\
& & \vdots & \vdots & & \vdots & \\
& & P_{\alpha;(\alpha-1,\alpha-1)}(\lambda) & P_{\alpha;(\alpha,\alpha-1)}(\lambda) & \dots & P_{\alpha;(j,\alpha-1)}(\lambda) & \dots \\
& & & P_{\alpha;(\alpha,\alpha)}(\lambda) & \dots & P_{\alpha;(j,\alpha)}(\lambda) & \dots \\
& & & & \ddots & \vdots & \\
& & & & & P_{\alpha;(j,j)}(\lambda) & \dots
\end{matrix}
\tag{16}
\]
Because $\varphi_{\cdot;0} = \delta_\alpha$,
\[
P_{\alpha;(j,0)}(\lambda) = \varphi_{j;0} = \delta_{j,\alpha}, \qquad j = 0, 1, \dots
\tag{17}
\]
a) From (14) and (17) we obtain $P_{\alpha;(j,k)}(\lambda) = 0$ if $j = 0, 1, \dots, \alpha - 1$, $k = 0, 1, \dots, j$.
b) From (17), $P_{\alpha;(\alpha,0)}(\lambda) = 1$, and according to (14) we have
\[
\begin{pmatrix} P_{\alpha;(\alpha,1)}(\lambda) \\ P_{\alpha;(\alpha,2)}(\lambda) \\ \vdots \\ P_{\alpha;(\alpha,\alpha)}(\lambda) \end{pmatrix}
= -\tilde c_{\alpha-1}^{-1}\, c_{\alpha-1;\cdot,0}\, P_{\alpha;(\alpha,0)}(\lambda).
\]
Then $\forall k = 1, 2, \dots, \alpha$ we get
\[
P_{\alpha;(\alpha,k)}(\lambda) = -(\tilde c_{\alpha-1}^{-1})_{k-1,\cdot}\, c_{\alpha-1;\cdot,0}
= -\sum_{i=0}^{\alpha-1} (\tilde c_{\alpha-1}^{-1})_{k-1,i}\, c_{\alpha-1;i,0}.
\tag{18}
\]
c) From (17) it follows that $P_{\alpha;(\alpha+1,0)}(\lambda) = 0$. So (14) gives
\[
\begin{pmatrix} P_{\alpha;(\alpha+1,1)}(\lambda) \\ \vdots \\ P_{\alpha;(\alpha+1,\alpha+1)}(\lambda) \end{pmatrix}
= \tilde c_\alpha^{-1}(\lambda I_\alpha - b_\alpha)\, P_{\alpha;(\alpha,\cdot)}(\lambda),
\]
and then
\[
P_{\alpha;(\alpha+1,k)}(\lambda)
= \bigl(\tilde c_\alpha^{-1}(\lambda I_\alpha - b_\alpha)\bigr)_{k-1,\cdot} P_{\alpha;(\alpha,\cdot)}(\lambda)
= \sum_{i=0}^{\alpha} \bigl(\tilde c_\alpha^{-1}(\lambda I_\alpha - b_\alpha)\bigr)_{k-1,i} P_{\alpha;(\alpha,i)}(\lambda),
\qquad k = 1, 2, \dots, \alpha + 1.
\tag{19}
\]
d) Let us consider the general situation. From (14) and (17), $\forall j = \alpha+1, \alpha+2, \dots$ we get
\[
\begin{pmatrix} P_{\alpha;(j,1)}(\lambda) \\ \vdots \\ P_{\alpha;(j,j)}(\lambda) \end{pmatrix}
= \tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\, P_{\alpha;(j-1,\cdot)}(\lambda)
- \tilde c_{j-1}^{-1} a_{j-2}\, P_{\alpha;(j-2,\cdot)}(\lambda)
\]
or
\[
\begin{aligned}
P_{\alpha;(j,k)}(\lambda)
&= \bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,\cdot} P_{\alpha;(j-1,\cdot)}(\lambda)
- \bigl(\tilde c_{j-1}^{-1} a_{j-2}\bigr)_{k-1,\cdot} P_{\alpha;(j-2,\cdot)}(\lambda) \\
&= \sum_{i=0}^{j-1} \bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,i} P_{\alpha;(j-1,i)}(\lambda)
- \sum_{i=0}^{j-2} \bigl(\tilde c_{j-1}^{-1} a_{j-2}\bigr)_{k-1,i} P_{\alpha;(j-2,i)}(\lambda),
\qquad k = 1, 2, \dots, j.
\end{aligned}
\tag{20}
\]
So, from $0^0$ and $\alpha^0$ (mainly, from (17)–(20)) we can summarize the following.
Let $\alpha = 0, 1, \dots$ Then for any fixed $\alpha$ and $\forall k = 1, 2, \dots, j$ we get
\[
P_{\alpha;(j,k)}(\lambda) =
\begin{cases}
0, & j = 0, \dots, \alpha - 1, \\[2pt]
-(\tilde c_{j-1}^{-1})_{k-1,\cdot}\, c_{j-1;\cdot,0}, & j = \alpha, \\[2pt]
\bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,\cdot} P_{\alpha;(j-1,\cdot)}(\lambda), & j = \alpha + 1, \\[2pt]
\bigl(\tilde c_{j-1}^{-1}(\lambda I_{j-1} - b_{j-1})\bigr)_{k-1,\cdot} P_{\alpha;(j-1,\cdot)}(\lambda)
- \bigl(\tilde c_{j-1}^{-1} a_{j-2}\bigr)_{k-1,\cdot} P_{\alpha;(j-2,\cdot)}(\lambda), & j = \alpha + 2, \alpha + 3, \dots
\end{cases}
\tag{21}
\]
Using formulae (21) we can define and calculate, step by step, all the polynomials $P_{\alpha;(j,k)}(\lambda)$ for all permitted $\alpha, j, k$.
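The following sketch (ours, purely illustrative) implements the recursion (21) numerically for a finite number of blocks, under the assumption made above that for every $c_j$ the square submatrix $\tilde c_j$ built from its columns $1, \dots, j+1$ is invertible.

```python
import numpy as np

def block_polynomials(lam, a, b, c, alpha, J):
    """Compute the vectors P_{alpha;(j,.)}(lam), j = 0,...,J, by the recursion (21).
    a[j], b[j], c[j] are the blocks of (9) as NumPy arrays; c[j][:, 1:] plays the
    role of the square matrix ~c_j, assumed invertible.  Illustrative sketch only."""
    P = []
    for j in range(J + 1):
        vec = np.zeros(j + 1, dtype=complex)
        vec[0] = 1.0 if j == alpha else 0.0            # boundary condition (17)
        if 1 <= j and alpha <= j:
            ct_inv = np.linalg.inv(c[j - 1][:, 1:])    # (~c_{j-1})^{-1}
            if j == alpha:
                vec[1:] = -ct_inv @ c[j - 1][:, 0]     # row "j = alpha" of (21)
            else:
                rhs = ct_inv @ ((lam * np.eye(j) - b[j - 1]) @ P[j - 1])
                if j >= alpha + 2:
                    rhs = rhs - ct_inv @ (a[j - 2] @ P[j - 2])
                vec[1:] = rhs
        P.append(vec)
    return P
```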
Also, using formulae (21) it is easy to calculate the degree of the constructed polynomials. Now we will give the distribution of the degrees of $P_{\alpha;(j,k)}(\lambda)$ for any fixed $\alpha$, according to the order which was considered in (16):
\[
\begin{array}{c|cccccccc}
 & 0 & 1 & \dots & \alpha-1 & \alpha & \alpha+1 & \alpha+2 & \dots\ j \to \\
\hline
0 & 0 & 0 & \dots & 0 & 1 & 0 & 0 & \dots \\
1 & & 0 & \dots & 0 & [1] & [\lambda] & [\lambda^2] & \dots \\
\vdots & & & & \vdots & \vdots & \vdots & \vdots & \\
\alpha-1 & & & & 0 & [1] & [\lambda] & [\lambda^2] & \dots \\
\alpha & & & & & [1] & [\lambda] & [\lambda^2] & \dots \\
\alpha+1 & & & & & & [\lambda] & [\lambda^2] & \dots \\
\alpha+2 & & & & & & & [\lambda^2] & \dots \\
k \downarrow & & & & & & & &
\end{array}
\tag{22}
\]
Here $[\lambda^m]$, $m = 0, 1, \dots$, means that there is a polynomial of degree $m$ in the respective place. It is easy to see that the degree of the polynomial $P_{\alpha;(j,k)}(\lambda)$ equals $j - \alpha$ if $j \ge \alpha$.
Now we come back to solving equation (14) with some boundary conditions $\varphi_{\cdot;0}$. The procedure will be given in a constructive way by using induction. Consider the first equation in (14). Then the solution can be written in the form
\[
\begin{pmatrix} \varphi_{1;0} \\ \varphi_{1;1}(\lambda) \end{pmatrix}
= \begin{bmatrix} 0 & 1 \\ \dfrac{1}{c_{0;0,1}}(\lambda - b_0) & -\dfrac{c_{0;0,0}}{c_{0;0,1}} \end{bmatrix}
\begin{pmatrix} \varphi_0 \\ \varphi_{1;0} \end{pmatrix}
= \begin{pmatrix} P_{0;(1,0)}(\lambda) & P_{1;(1,0)}(\lambda) \\ P_{0;(1,1)}(\lambda) & P_{1;(1,1)}(\lambda) \end{pmatrix}
\begin{pmatrix} \varphi_0 \\ \varphi_{1;0} \end{pmatrix}
= \varphi_0 P_{0;(1,\cdot)}(\lambda) + \varphi_{1;0} P_{1;(1,\cdot)}(\lambda).
\tag{23}
\]
So, $\varphi_{1;1}(\lambda) = \varphi_0 P_{0;(1,1)}(\lambda) + \varphi_{1;0} P_{1;(1,1)}(\lambda)$.
Also, we consider the construction of $\varphi_2(\lambda)$. It will be useful for understanding the procedure of solving (14) in general. From (14) and (23) we have
\[
\begin{aligned}
\begin{pmatrix} \varphi_{2;1}(\lambda) \\ \varphi_{2;2}(\lambda) \end{pmatrix}
&= \tilde c_1^{-1}(\lambda I_1 - b_1)\bigl(\varphi_0 P_{0;(1,\cdot)}(\lambda) + \varphi_{1;0} P_{1;(1,\cdot)}(\lambda)\bigr)
- \tilde c_1^{-1} a_0 \varphi_0 - \tilde c_1^{-1} c_{1;\cdot,0}\varphi_{2;0} \\
&= \varphi_0 \bigl(\tilde c_1^{-1}(\lambda I_1 - b_1)P_{0;(1,\cdot)}(\lambda) - \tilde c_1^{-1} a_0\bigr)
+ \varphi_{1;0} \bigl(\tilde c_1^{-1}(\lambda I_1 - b_1)P_{1;(1,\cdot)}(\lambda)\bigr)
+ \varphi_{2;0} \bigl(- \tilde c_1^{-1} c_{1;\cdot,0}\bigr) \\
&= \varphi_0 \begin{pmatrix}
(\tilde c_1^{-1}(\lambda I_1 - b_1))_{0,\cdot} P_{0;(1,\cdot)}(\lambda) - (\tilde c_1^{-1})_{0,\cdot}\, a_0 \\
(\tilde c_1^{-1}(\lambda I_1 - b_1))_{1,\cdot} P_{0;(1,\cdot)}(\lambda) - (\tilde c_1^{-1})_{1,\cdot}\, a_0
\end{pmatrix}
+ \varphi_{1;0} \begin{pmatrix}
(\tilde c_1^{-1}(\lambda I_1 - b_1))_{0,\cdot} P_{1;(1,\cdot)}(\lambda) \\
(\tilde c_1^{-1}(\lambda I_1 - b_1))_{1,\cdot} P_{1;(1,\cdot)}(\lambda)
\end{pmatrix}
+ \varphi_{2;0} \begin{pmatrix}
-(\tilde c_1^{-1})_{0,\cdot}\, c_{1;\cdot,0} \\
-(\tilde c_1^{-1})_{1,\cdot}\, c_{1;\cdot,0}
\end{pmatrix}.
\end{aligned}
\tag{24}
\]
Then from (21) and (24) we obtain
\[
\varphi_2(\lambda) = \varphi_0 P_{0;(2,\cdot)}(\lambda) + \varphi_{1;0} P_{1;(2,\cdot)}(\lambda) + \varphi_{2;0} P_{2;(2,\cdot)}(\lambda)
= \sum_{\alpha=0}^{2} \varphi_{\alpha;0} P_{\alpha;(2,\cdot)}(\lambda)
\quad \text{or} \quad
\varphi_{2;k}(\lambda) = \sum_{\alpha=0}^{2} \varphi_{\alpha;0} P_{\alpha;(2,k)}(\lambda), \qquad k = 0, 1, 2.
\tag{25}
\]
Consider the general situation by using induction. Let us suppose that, for some fixed $j \in \mathbb{N}$,
\[
\varphi_m(\lambda) = \sum_{\alpha=0}^{m} \varphi_{\alpha;0} P_{\alpha;(m,\cdot)}(\lambda), \qquad m = j - 1,\ j.
\tag{26}
\]
From (14) and (26) we obtain
\[
\begin{aligned}
\begin{pmatrix} \varphi_{j+1;1}(\lambda) \\ \vdots \\ \varphi_{j+1;j+1}(\lambda) \end{pmatrix}
&= \tilde c_j^{-1}(\lambda I_j - b_j) \sum_{\alpha=0}^{j} \varphi_{\alpha;0} P_{\alpha;(j,\cdot)}(\lambda)
- \tilde c_j^{-1} a_{j-1} \sum_{\alpha=0}^{j-1} \varphi_{\alpha;0} P_{\alpha;(j-1,\cdot)}(\lambda)
- \tilde c_j^{-1} c_{j;\cdot,0}\varphi_{j+1;0} \\
&= \sum_{\alpha=0}^{j-1} \varphi_{\alpha;0} \bigl(\tilde c_j^{-1}(\lambda I_j - b_j) P_{\alpha;(j,\cdot)}(\lambda) - \tilde c_j^{-1} a_{j-1} P_{\alpha;(j-1,\cdot)}(\lambda)\bigr) \\
&\quad + \varphi_{j;0}\, \tilde c_j^{-1}(\lambda I_j - b_j) P_{j;(j,\cdot)}(\lambda) + \varphi_{j+1;0}\bigl(-\tilde c_j^{-1} c_{j;\cdot,0}\bigr).
\end{aligned}
\tag{27}
\]
So, from (21) we get that $\varphi_{j+1}(\lambda)$ has the form (26) for $m = j + 1$.
Thus, for $\varphi_j(\lambda)$, the following formulae take place, in the vector and coordinate forms:
\[
\varphi_j(\lambda) = \sum_{\alpha=0}^{j} \varphi_{\alpha;0} P_{\alpha;(j,\cdot)}(\lambda),
\qquad
\varphi_{j;k}(\lambda) = \sum_{\alpha=0}^{j} \varphi_{\alpha;0} P_{\alpha;(j,k)}(\lambda),
\qquad j = 0, 1, \dots,\ k = 0, 1, \dots, j.
\tag{28}
\]
So, from our calculations we obtain the following theorem.

Theorem 2. All solutions of equation (13) can be represented in the form (28), where $P_{\alpha;(j,k)}(\lambda)$ are polynomials that can be calculated by the recursion formulas (21) with the initial conditions $P_{\alpha;(j,0)}(\lambda) = \delta_{j,\alpha}$, $j, \alpha \in \mathbb{N}_0$.
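Combining Theorem 2 with the sketch after (21), one can assemble a solution numerically from prescribed boundary values; the helper below is our illustration only (it relies on `numpy` and on the `block_polynomials` sketch defined earlier).

```python
def solution_from_boundary(lam, a, b, c, phi0, J):
    """Assemble phi_j(lam), j = 0,...,J, from boundary values phi0[alpha] = phi_{alpha;0}
    via (28), reusing the illustrative block_polynomials() sketch given above."""
    phi = [np.zeros(j + 1, dtype=complex) for j in range(J + 1)]
    for alpha in range(J + 1):
        P = block_polynomials(lam, a, b, c, alpha, J)
        for j in range(J + 1):
            phi[j] = phi[j] + phi0[alpha] * P[j]   # phi_j = sum_alpha phi_{alpha;0} P_{alpha;(j,.)}
    return phi
```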
Now we will use the results from [3], Ch. 5, and [5], Ch. 15, about the generalized eigenvector expansion for a selfadjoint operator connected with the chain (8) in a standard way. For our operator $J$ we have the representation
\[
Jf = \int_{\mathbb{R}} \lambda \Phi(\lambda)\, d\sigma(\lambda) f, \qquad f \in l_2(p),
\tag{29}
\]
where $d\sigma(\lambda)$ is a spectral measure, $\Phi(\lambda) \colon l_2(p) \to l_2(p^{-1})$ is a generalized projection operator, and $\Phi(\lambda)$ is a positive definite kernel, i.e., $\forall f \in l_2(p)$ $(\Phi(\lambda)f, f)_{l_2} \ge 0$. For all $f, g \in l_2(p)$ we have the Parseval equality
\[
(f, g)_{l_2} = \int_{\mathbb{R}} (\Phi(\lambda)f, g)_{l_2}\, d\sigma(\lambda).
\tag{30}
\]
Let us denote by $\pi_n$ the operator of orthogonal projection onto $H_n$, $n \in \mathbb{N}_0$, in $l_2$. Hence $\forall f = (f_n)_{n=0}^{\infty} \in l_2$ we have $f_n = \pi_n f$. This operator acts analogously in the spaces $l_2(p)$ and $l_2(p^{-1})$.

Let us consider the operator matrix $(\Phi_{j,k}(\lambda))_{j,k=0}^{\infty}$, where
\[
\Phi_{j,k}(\lambda) = \pi_j \Phi(\lambda) \pi_k \colon l_2 \to H_j \quad (\text{or, in fact, } H_k \to H_j).
\tag{31}
\]
Remark 4. It is necessary to say that $\Phi_{j,k}(\lambda)$ is a $(j+1)\times(k+1)$-matrix which stands at the $j$th block of rows and the $k$th block of columns. Enumeration of the blocks starts with $0$ and is carried out in such a way that the $i$th block of rows (columns) consists of $i+1$ rows (columns). So $\Phi(\lambda)$ has the form
\[
\begin{pmatrix}
* & * & * & \dots \\
* & * & * & \dots \\
* & * & * & \dots \\
\vdots & \vdots & \vdots & \ddots
\end{pmatrix},
\]
where $*$ denotes elements of the matrix.
The Parseval identity (30) can be rewritten as follows: $\forall f, g \in l_{\mathrm{fin}}$
\[
(f, g)_{l_2} = \int_{\mathbb{R}} \sum_{j=0}^{\infty} (\pi_j\Phi(\lambda)f, \pi_j g)_{H_j}\, d\sigma(\lambda)
= \int_{\mathbb{R}} \sum_{j,k=0}^{\infty} (\Phi_{j,k}(\lambda)f_k, g_j)_{H_j}\, d\sigma(\lambda).
\tag{32}
\]
Lemma 1. For every fixed $j, k \in \mathbb{N}_0$, the elements of the matrix (31), $\Phi_{j,k}(\lambda) \colon H_k \to H_j$, have the following representation:
\[
\Phi_{j,k;l,m}(\lambda) = \sum_{\alpha=0}^{j} \sum_{\beta=0}^{k} \Phi_{\alpha,\beta;0,0}(\lambda)\, P_{\beta;(k,m)}(\lambda)\, P_{\alpha;(j,l)}(\lambda),
\qquad l = 0, \dots, j, \quad m = 0, \dots, k.
\tag{33}
\]
Proof. Since $\Phi(\lambda)$ is a projection onto generalized eigenvectors of the selfadjoint operator $J$ with the corresponding generalized eigenvalue $\lambda$, we see that the vector $\varphi(\lambda) = (\varphi_j(\lambda))_{j=0}^{\infty}$ such that $\varphi_{j;l}(\lambda) = \Phi_{j,k;l,m}(\lambda)$, $l = 0, \dots, j$, is a solution of (13) for any fixed $k, m$, $k \in \mathbb{N}_0$, $m = 0, \dots, k$. Indeed, since $\varphi(\lambda) = \Phi(\lambda)e_{k;m}$, $k = 0, 1, \dots$, $m = 0, \dots, k$, we have $0 = \bigl(\varphi(\lambda), (J - \lambda I)f\bigr)_{l_2}$, $f \in l_{\mathrm{fin}}$, $f_{-1} = 0$. Using Green's formula (11) we obtain that $0 = \bigl((J - \lambda I)\varphi(\lambda), f\bigr)_{l_2} - a_{-1}\varphi_{-1}f_0$. Since $f$ is an arbitrary vector from $l_{\mathrm{fin}}$, we have $\bigl((J - \lambda I)\varphi(\lambda)\bigr)_j = 0$, $j = 1, 2, \dots$, and $(b_0 - \lambda)\varphi_0 + c_0\varphi_1(\lambda) = 0$. Therefore, $\varphi(\lambda) \in l_2(p^{-1})$ exists as a usual solution of the difference equation $J\varphi = \lambda\varphi$.
Consider a solution of (13) in the form (28) with the same initial conditions. Then $\varphi_j(\lambda) = \sum_{\alpha=0}^{j} \varphi_{\alpha;0} P_{\alpha;(j,\cdot)}(\lambda)$, where $\varphi_0 = \Phi_{0,k;0,m}(\lambda)$, $\varphi_{1;0} = \Phi_{1,k;0,m}(\lambda)$, \dots, $\varphi_{j;0} = \Phi_{j,k;0,m}(\lambda)$.
Using Theorem 2 we see that
\[
\Phi_{j,k;l,m}(\lambda) = \sum_{\alpha=0}^{j} \Phi_{\alpha,k;0,m}(\lambda)\, P_{\alpha;(j,l)}(\lambda);
\qquad j, k \in \mathbb{N}_0,\ l = 0, \dots, j,\ m = 0, \dots, k.
\tag{34}
\]
Since the operator $J$ is selfadjoint, the operator $\Phi(\lambda) \colon l_2(p) \to l_2(p^{-1})$ is formally selfadjoint on $l_2$. So, $\Phi_{j,k}(\lambda) = (\Phi_{k,j}(\lambda))^*$, $j, k \in \mathbb{N}_0$; therefore $\Phi_{j,k;l,m}(\lambda) = \Phi_{k,j;m,l}(\lambda)$; $j, k \in \mathbb{N}_0$, $l = 0, \dots, j$, $m = 0, \dots, k$. Thus $\forall i, k \in \mathbb{N}_0$
\[
\Phi_{i,k;0,m}(\lambda) = \Phi_{k,i;m,0}(\lambda)
= \sum_{\beta=0}^{k} \Phi_{\beta,i;0,0}(\lambda)\, P_{\beta;(k,m)}(\lambda)
= \sum_{\beta=0}^{k} \Phi_{i,\beta;0,0}(\lambda)\, P_{\beta;(k,m)}(\lambda);
\qquad m = 0, \dots, k.
\tag{35}
\]
Substituting (35) into (34) we obtain (33). $\square$
It will be essential for us to rewrite the Parseval identities (30), (32) in a form which involves the polynomials $P_{\alpha;(j,k)}(\lambda)$ introduced above. Using the Parseval equality (32) and representation (33) we get: $\forall f, g \in l_{\mathrm{fin}}$
\[
\begin{aligned}
(f, g)_{l_2} &= \int_{\mathbb{R}} \sum_{j,k=0}^{\infty} \sum_{l=0}^{j} \sum_{m=0}^{k} \Phi_{j,k;l,m}(\lambda)\, f_{k;m}\, g_{j;l}\, d\sigma(\lambda) \\
&= \int_{\mathbb{R}} \sum_{j,k=0}^{\infty} \sum_{l=0}^{j} \sum_{m=0}^{k} \sum_{\alpha=0}^{j} \sum_{\beta=0}^{k}
\Phi_{\alpha,\beta;0,0}(\lambda)\, P_{\beta;(k,m)}(\lambda)\, P_{\alpha;(j,l)}(\lambda)\, f_{k;m}\, g_{j;l}\, d\sigma(\lambda) \\
&= \int_{\mathbb{R}} \sum_{j,k=0}^{\infty} \sum_{\alpha=0}^{j} \sum_{\beta=0}^{k}
\Phi_{\alpha,\beta;0,0}(\lambda)\, \bigl(f_k, P_{\beta;(k,\cdot)}(\lambda)\bigr)_{H_k} \bigl(g_j, P_{\alpha;(j,\cdot)}(\lambda)\bigr)_{H_j}\, d\sigma(\lambda).
\end{aligned}
\]
In the last expression the range of the indexes $(j, \alpha)$ is $j = 0, \dots, \infty$; $\alpha = 0, \dots, j$. We can also get all points of this range by taking $\alpha = 0, \dots, \infty$; $j = \alpha, \dots, \infty$. The same situation takes place for $(k, \beta)$. So, after changing the order of summation in the last expression we obtain: $\forall f, g \in l_{\mathrm{fin}}$
\[
(f, g)_{l_2} = \int_{\mathbb{R}} \sum_{\alpha,\beta=0}^{\infty} \Phi_{\alpha,\beta;0,0}(\lambda)
\sum_{k=\beta}^{\infty} \bigl(f_k, P_{\beta;(k,\cdot)}(\lambda)\bigr)_{H_k}
\sum_{j=\alpha}^{\infty} \bigl(g_j, P_{\alpha;(j,\cdot)}(\lambda)\bigr)_{H_j}\, d\sigma(\lambda).
\tag{36}
\]
Since $\Phi(\lambda) \ge 0$ and $\Phi_{\alpha,\beta;0,0}(\lambda) = (\Phi(\lambda)e_{\beta;0}, e_{\alpha;0})_{l_2}$, $\alpha, \beta = 0, 1, \dots$, it is easy to see that $\bigl(\Phi_{\alpha,\beta;0,0}(\lambda)\bigr)_{\alpha,\beta=0}^{\infty}$ is a positive definite matrix.
Let us construct the matrix spectral measure $\Sigma(\cdot)$ by the formula
\[
d\Sigma(\lambda) =
\begin{pmatrix}
\Phi_{0,0;0,0}(\lambda) & \Phi_{0,1;0,0}(\lambda) & \dots \\
\Phi_{1,0;0,0}(\lambda) & \Phi_{1,1;0,0}(\lambda) & \dots \\
\vdots & \vdots & \ddots
\end{pmatrix}
d\sigma(\lambda)
= \bigl(\Phi_{\alpha,\beta;0,0}(\lambda)\, d\sigma(\lambda)\bigr)_{\alpha,\beta=0}^{\infty}.
\tag{37}
\]
Consider the space of finite vectors $u(\lambda) = \bigl(u_0(\lambda), u_1(\lambda), \dots\bigr)$, $\lambda \in \mathbb{R}$ ($u_i(\cdot)$, $i = 0, 1, \dots$, are complex-valued functions of a real variable) with the scalar product
\[
(u(\lambda), v(\lambda))_{L_2(\mathbb{R}, d\Sigma(\lambda))} = \int_{\mathbb{R}} (d\Sigma(\lambda)u(\lambda), v(\lambda))_{\ell_2}.
\]
Let us introduce the Hilbert space $L_2(\mathbb{R}, d\Sigma(\lambda))$ as the completion of this space of finite vectors with respect to the scalar product $(\cdot,\cdot)_{L_2(\mathbb{R}, d\Sigma(\lambda))}$. Also, for $f \in l_{\mathrm{fin}}$ introduce a Fourier transform
\[
\widehat f(\lambda) = \begin{pmatrix} \widehat f_0(\lambda) \\ \widehat f_1(\lambda) \\ \vdots \end{pmatrix} \in L_2(\mathbb{R}, d\Sigma(\lambda))
\]
by the formula
\[
\widehat f_\beta(\lambda) = \sum_{k=\beta}^{\infty} \bigl(f_k, P_{\beta;(k,\cdot)}(\lambda)\bigr)_{H_k}, \qquad \beta \in \mathbb{N}_0.
\tag{38}
\]
According to (36) and (38), we have the following: $\forall f, g \in l_{\mathrm{fin}}$
\[
(f, g)_{l_2} = (\widehat f(\lambda), \widehat g(\lambda))_{L_2(\mathbb{R}, d\Sigma(\lambda))}.
\tag{39}
\]
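A small sketch (ours, for illustration) of the Fourier coefficient (38) for a finite vector $f = (f_0, \dots, f_K)$; it reuses the `block_polynomials` sketch and the `numpy` import from Section 3 and evaluates the inner products $(f_k, P_{\beta;(k,\cdot)}(\lambda))_{H_k}$ with the usual conjugation on the second argument.

```python
def fourier_coefficient(lam, a, b, c, f_blocks, beta):
    """Evaluate (38): f^_beta(lam) = sum_{k >= beta} ( f_k, P_{beta;(k,.)}(lam) )_{H_k}
    for a finite vector f = (f_0, ..., f_K)."""
    K = len(f_blocks) - 1
    P = block_polynomials(lam, a, b, c, beta, K)
    # (f_k, P)_{H_k} = sum_m f_{k;m} * conj(P_m)  ==  np.vdot(P, f_k)
    return sum(np.vdot(P[k], f_blocks[k]) for k in range(beta, K + 1))
```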
If we consider the vectors $f, g \in l_{\mathrm{fin}}$ of the form $f = e_{N;\xi}$, $g = e_{M;\zeta}$, then the Parseval identity (36) gives: $\forall N, M \in \mathbb{N}_0$, $\xi = 0, \dots, N$, $\zeta = 0, \dots, M$,
\[
\begin{aligned}
\delta_{N,M}\delta_{\xi,\zeta}
&= \int_{\mathbb{R}} \sum_{\beta=0}^{N} \sum_{\alpha=0}^{M} \Phi_{\alpha,\beta;0,0}(\lambda)\, P_{\beta;(N,\xi)}(\lambda)\, P_{\alpha;(M,\zeta)}(\lambda)\, d\sigma(\lambda) \\
&= \int_{\mathbb{R}} \Biggl( d\Sigma(\lambda)
\begin{pmatrix} P_{0;(N,\xi)}(\lambda) \\ \vdots \\ P_{N;(N,\xi)}(\lambda) \\ 0 \\ \vdots \end{pmatrix},
\begin{pmatrix} P_{0;(M,\zeta)}(\lambda) \\ \vdots \\ P_{M;(M,\zeta)}(\lambda) \\ 0 \\ \vdots \end{pmatrix}
\Biggr)_{\ell_2}.
\end{aligned}
\tag{40}
\]
It is easy to find the elements of the matrix $J$ in terms of the polynomials $P_{\alpha;(j,k)}(\lambda)$. The representations (29), (30) and (31) give: $\forall f, g \in l_{\mathrm{fin}}$
\[
(Jf, g)_{l_2} = \int_{\mathbb{R}} \lambda (\Phi(\lambda)f, g)_{l_2}\, d\sigma(\lambda)
= \int_{\mathbb{R}} \lambda \sum_{j,k=0}^{\infty} (\Phi_{j,k}(\lambda)f_k, g_j)_{H_j}\, d\sigma(\lambda)
= \bigl(\lambda \widehat f(\lambda), \widehat g(\lambda)\bigr)_{L_2(\mathbb{R}, d\Sigma(\lambda))}.
\tag{41}
\]
The last equality in (41) is obtained from representation (33) by changing the order of summation (similarly to (36)) and using the definition of the Fourier transform. If we consider in (41) $f = f_{(k;m)} = e_{k;m}$, $g = g_{(j;l)} = e_{j;l}$, we will get: $\forall j, k \in \mathbb{N}_0$ and $l = 0, \dots, j$, $m = 0, \dots, k$,
\[
\begin{aligned}
J_{j,k;l,m} = (Jf_{(k;m)}, g_{(j;l)})_{l_2}
&= \int_{\mathbb{R}} \sum_{\beta=0}^{k} \sum_{\alpha=0}^{j} \lambda\, \Phi_{\alpha,\beta;0,0}(\lambda)\, P_{\beta;(k,m)}(\lambda)\, P_{\alpha;(j,l)}(\lambda)\, d\sigma(\lambda) \\
&= \int_{\mathbb{R}} \lambda \Biggl( d\Sigma(\lambda)
\begin{pmatrix} P_{0;(k,m)}(\lambda) \\ \vdots \\ P_{k;(k,m)}(\lambda) \\ 0 \\ \vdots \end{pmatrix},
\begin{pmatrix} P_{0;(j,l)}(\lambda) \\ \vdots \\ P_{j;(j,l)}(\lambda) \\ 0 \\ \vdots \end{pmatrix}
\Biggr)_{\ell_2}.
\end{aligned}
\tag{42}
\]
Using the above-mentioned results we can formulate the following theorem.

Theorem 3. Let, in accordance with representation (29), $\Phi(\cdot)$ and $\sigma(\cdot)$ be the generalized projection operator and the spectral measure of the operator $J$. Construct the spectral matrix $d\Sigma(\cdot)$ using (37). Then the orthogonality relations (40) take place, and one has a generalized eigenvector expansion due to (38), (39). The operator $J$ can be reconstructed from formulas (41), (42).
Acknowledgments. The author is very grateful to Yu. M. Berezansky for overall help and stimulating discussions, and to the referee for useful remarks.
References
1. N. I. Akhiezer, The Classical Moment Problem and Some Related Questions in Analysis, Hafner Publishing Co., New York, 1965. (Russian edition: Fizmatgiz, Moscow, 1961)
2. Yu. M. Berezansky, The expansion in eigenfunctions of partial difference equations of order two, Trudy Mosk. Mat. Obschestva 5 (1956), 203–268. (Russian)
3. Yu. M. Berezanskii, Expansions in Eigenfunctions of Selfadjoint Operators, AMS, Providence, R.I., 1968. (Russian edition: Naukova Dumka, Kiev, 1965)
4. Yu. M. Berezansky and M. E. Dudkin, The complex moment problem and direct and inverse spectral problems for the block Jacobi type bounded normal matrices, Methods Funct. Anal. Topology 12 (2006), no. 1, 1–31.
5. Yu. M. Berezansky, Z. G. Sheftel, G. F. Us, Functional Analysis, Vols. 1, 2, Birkhäuser Verlag, Basel–Boston–Berlin, 1996. (Russian edition: Vyshcha shkola, Kiev, 1990)
6. G. Teschl, Jacobi Operators and Completely Integrable Nonlinear Lattices, AMS Mathematical Surveys and Monographs, Vol. 72, AMS, Providence, R.I., 2000.
7. Yuan Xu, Lecture notes on orthogonal polynomials of several variables, Inzell Lectures on Orthogonal Polynomials, Adv. Theory Spec. Funct. Orthogonal Polynomials, Nova Sci. Publ. 2 (2004), 135–188.
Kyiv National Taras Shevchenko University, Mechanics and Mathematics Faculty, Department of Mathematical Analysis, 64 Volodymyrs'ka, Kyiv, 01033, Ukraine
E-mail address: vanobsb@gmail.com
Received 26/05/2008; Revised 09/10/2008