# Vector-valued Reproducing Kernel Banach Spaces with Applications to Multi-task Learning

Motivated by multi-task machine learning with Banach spaces, we propose the notion of vector-valued reproducing kernel Banach spaces (RKBS). Basic properties of the spaces and the associated reproducing kernels are investigated. We also present feature map constructions and several concrete examples of vector-valued RKBS. The theory is then applied to multi-task machine learning. In particular, the representer theorem and characterization equations for the minimizer of regularized learning schemes in vector-valued RKBS are established.


## 1 Introduction

The purpose of this paper is to establish the notion of vector-valued reproducing kernel Banach spaces and demonstrate its applications to multi-task machine learning. Built on the theory of scalar-valued reproducing kernel Hilbert spaces (RKHS) [3], kernel methods have proven successful in single-task machine learning [10, 14, 29, 30, 33]. Multi-task learning, where the unknown target function to be learned from finite sample data is vector-valued, appears more often in practice. References [13, 25] proposed the development of kernel methods for learning multiple related tasks simultaneously. The mathematical foundation used there was the theory of vector-valued RKHS [5, 27]. Recent progress in vector-valued RKHS can be found in [7, 8, 9]. In such a framework, both the space of the candidate functions used for approximation and the output space are chosen to be Hilbert spaces.

There are occasions where it might be desirable to select the space of candidate functions, the output space, or both as Banach spaces. Hilbert spaces constitute a special and limited class of Banach spaces: any two Hilbert spaces over a common number field with the same dimension are isometrically isomorphic. By reaching out to other Banach spaces, one obtains more variety in geometric structures and norms that are potentially useful for learning and approximation. Moreover, training data might come with intrinsic structures that make it impossible or inappropriate to embed them into a Hilbert space; learning schemes based on features in a Hilbert space may not work well for such data. Finally, in some applications a Banach space norm is engaged for a particular purpose. A typical example is the linear programming regularization in coefficient-based regularization for machine learning [29], where the ℓ¹ norm is employed to obtain sparsity in the resulting minimizer.

There has been considerable work on learning a single task with Banach spaces (see, for example, [4, 6, 12, 15, 17, 20, 24, 26, 34, 39, 41]). The difficulty in mapping patterns into a Banach space and making use of these features for learning lies mainly in the lack of an inner product in Banach spaces. In particular, without an appropriate counterpart of the Riesz representation of continuous linear functionals, point evaluations do not have a kernel representation in these studies. Semi-inner products, a mathematical tool introduced by Lumer [23] for the purpose of extending Hilbert space type arguments to Banach spaces, appear to be a natural substitute for inner products in Banach spaces. An illustrative example is that we were able to extend the classical theory of frames and Riesz bases to Banach spaces via semi-inner products [38]. Semi-inner products were first applied to machine learning by Der and Lee [12] for the study of large margin classification by hyperplanes in a Banach space. With this tool, we established the notion of scalar-valued reproducing kernel Banach spaces (RKBS) and investigated regularized learning schemes in RKBS [36, 37]. There has been increasing interest in the application of this new theory [19, 31, 32, 40].

We attempt to build a mathematical foundation for multi-task learning with Banach spaces. Specifically, we shall propose a definition of vector-valued RKBS and investigate its fundamental properties in the next section. Feature map representations and several concrete examples of vector-valued RKBS will be presented in Sections 3 and 4, respectively. In Section 5, we investigate regularized learning schemes in vector-valued RKBS.

## 2 Definition and Basic Properties

We are concerned with spaces of functions from a fixed set to a vector space. We shall allow both the space of functions and the range space to be Banach spaces. Our key tool in dealing with a general Banach space is the semi-inner product [16, 23]. Recall that a semi-inner product on a Banach space V is a function from V × V to ℂ, denoted by [·,·]_V, such that for all f, g, h ∈ V and α ∈ ℂ:

1. (linearity with respect to the first variable) [f + g, h]_V = [f, h]_V + [g, h]_V and [αf, g]_V = α[f, g]_V;

2. (positivity) [f, f]_V > 0 for f ≠ 0;

3. (conjugate homogeneity with respect to the second variable) [f, αg]_V = ᾱ[f, g]_V;

4. (Cauchy-Schwartz inequality) |[f, g]_V|^2 ≤ [f, f]_V [g, g]_V.

A semi-inner product [·,·]_V on V is said to be compatible if

  [f, f]_V^{1/2} = ∥f∥_V for all f ∈ V,

where ∥·∥_V denotes the norm on V. Every Banach space has a compatible semi-inner product [16, 23]. Let [·,·]_V be a compatible semi-inner product on V. Then one sees by the Cauchy-Schwartz inequality that for each f ∈ V, the linear functional f^* on V defined by

  f^*(g) := [g, f]_V,  g ∈ V (2.1)

is bounded on V. In other words, f^* lies in the dual space V^* of V. Moreover, we have

  ∥f^*∥_{V^*} = ∥f∥_V (2.2)

and

  f^*(f) = ∥f∥_V ∥f^*∥_{V^*}. (2.3)

Introduce the duality mapping J_V from V to V^* by setting

  J_V(f) := f^*,  f ∈ V.
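
To make the preceding definitions concrete, the following sketch numerically checks the semi-inner product axioms and properties (2.1)-(2.3) on ℝ³ with the ℓ⁴ norm. The explicit Giles formula used for [·,·] below is an assumption of this illustration (it reappears as the compatible semi-inner product (3.4) on ℓ_γ spaces later); the test vectors are arbitrary.

```python
# Numeric sanity check of the semi-inner product axioms and of (2.1)-(2.3)
# on V = R^3 with the l^p norm, p = 4, using the Giles formula
#   [u, v]_V = sum_j u_j v_j |v_j|^(p-2) / ||v||_V^(p-2)  (real case).
p = 4.0

def norm(u):
    return sum(abs(x) ** p for x in u) ** (1.0 / p)

def sip(u, v):
    """Compatible semi-inner product [u, v] on real l^p."""
    return sum(x * y * abs(y) ** (p - 2) for x, y in zip(u, v)) / norm(v) ** (p - 2)

f = [1.0, -2.0, 0.5]
g = [0.3, 1.0, -1.0]

# compatibility: [f, f] = ||f||^2
assert abs(sip(f, f) - norm(f) ** 2) < 1e-9
# linearity with respect to the first variable
h = [a + 2.0 * b for a, b in zip(f, g)]
assert abs(sip(h, f) - (sip(f, f) + 2.0 * sip(g, f))) < 1e-9
# Cauchy-Schwartz inequality
assert abs(sip(g, f)) <= norm(g) * norm(f) + 1e-9
# (2.1)-(2.2): the dual element f* = J_V(f) lies in l^q, q = p/(p-1),
# and its dual norm equals ||f||
q = p / (p - 1.0)
fstar = [x * abs(x) ** (p - 2) / norm(f) ** (p - 2) for x in f]
assert abs(sum(abs(x) ** q for x in fstar) ** (1.0 / q) - norm(f)) < 1e-9
```

Replacing p by any other exponent in (1, +∞) leaves all four assertions valid, consistent with every such ℓ_p space admitting a compatible semi-inner product.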

We desire to represent the continuous linear functionals on the vector-valued RKBS to be introduced by the semi-inner product. However, the semi-inner product might not be able to fulfill this important role for an arbitrary Banach space. For instance, one verifies that there are continuous linear functionals μ on C([0,1]), endowed with the usual maximum norm, that cannot be represented as

  μ(g) = [g, f],  g ∈ C([0,1])

for any compatible semi-inner product [·,·] on C([0,1]) and any f ∈ C([0,1]).

The above example indicates that the duality mapping might not be surjective for a general Banach space. Other problems, such as non-uniqueness of compatible semi-inner products and non-injectivity of the duality mapping, may also occur. To overcome these difficulties, we shall focus on Banach spaces that are uniformly convex and uniformly Fréchet differentiable in this preliminary work on vector-valued RKBS. A Banach space V is uniformly convex if for all ε > 0 there exists a δ > 0 such that

  ∥f + g∥_V ≤ 2 − δ for all f, g ∈ V with ∥f∥_V = ∥g∥_V = 1 and ∥f − g∥_V ≥ ε.

Uniform convexity ensures the injectivity of the duality mapping and the existence and uniqueness of the best approximation to a closed convex subset of V [16]. We also say that V is uniformly Fréchet differentiable if for all f, g ∈ V,

  lim_{t∈ℝ, t→0} (∥f + tg∥_V − ∥f∥_V)/t (2.4)

exists and the limit is approached uniformly for f, g in the unit ball of V. If V is uniformly Fréchet differentiable then it has a unique compatible semi-inner product [16]. The differentiability (2.4) of the norm is useful for deriving characterization equations for the minimizer of regularized learning schemes in Banach spaces. For simplicity, we call a Banach space uniform if it is both uniformly convex and uniformly Fréchet differentiable. An analogue of the Riesz representation theorem holds for uniform Banach spaces.
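
For a numeric illustration of (2.4), the sketch below compares a finite difference of the ℓ³ norm on ℝ² with the quotient [g, f]_V/∥f∥_V supplied by the compatible semi-inner product. That this quotient is the limit in (2.4) in the real uniformly Fréchet differentiable case is a known fact from [16] that we assume here; the vectors are arbitrary.

```python
# Check that the limit (2.4) exists and equals [g, f]_V / ||f||_V
# on V = R^2 with the l^3 norm (an assumed concrete instance).
p = 3.0

def norm(u):
    return sum(abs(x) ** p for x in u) ** (1.0 / p)

def sip(u, v):
    """Compatible semi-inner product on real l^p (Giles formula)."""
    return sum(x * y * abs(y) ** (p - 2) for x, y in zip(u, v)) / norm(v) ** (p - 2)

f = [1.0, 2.0]
g = [3.0, -1.0]

t = 1e-7
finite_diff = (norm([f[j] + t * g[j] for j in range(2)]) - norm(f)) / t
assert abs(finite_diff - sip(g, f) / norm(f)) < 1e-4
```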

###### Lemma 2.1

(Giles [16]) Let V be a uniform Banach space. Then it has a unique compatible semi-inner product and the duality mapping J_V is bijective from V to V^*. In other words, for each μ ∈ V^* there exists a unique f ∈ V such that

  μ(g) = [g, f]_V for all g ∈ V.

In this case,

  [f^*, g^*]_{V^*} := [g, f]_V,  f, g ∈ V (2.5)

defines a compatible semi-inner product on V^*.

Let V be a uniform Banach space. We shall always denote by [·,·]_V the unique compatible semi-inner product on V. By Lemma 2.1 and equation (2.2), the duality mapping J_V is bijective and isometric from V to V^*. It is also conjugate homogeneous by property 3 of semi-inner products. However, it is non-additive unless V reduces to a Hilbert space. As a consequence, a compatible semi-inner product is in general conjugate homogeneous but non-additive with respect to its second variable. Namely,

  [f, g + h]_V ≠ [f, g]_V + [f, h]_V

in general.
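
A two-dimensional example makes this failure of additivity explicit. Assuming the compatible semi-inner product on real ℓ⁴ (the Giles formula, cf. (3.4) below), the sketch exhibits vectors with [f, g+h] ≠ [f, g] + [f, h].

```python
# The compatible semi-inner product on l^4 over R^2 is homogeneous but not
# additive in its second variable: [f, g+h] = 1/sqrt(2) while [f, g] + [f, h] = 1.
p = 4.0

def norm(u):
    return sum(abs(x) ** p for x in u) ** (1.0 / p)

def sip(u, v):
    return sum(x * y * abs(y) ** (p - 2) for x, y in zip(u, v)) / norm(v) ** (p - 2)

f, g, h = [1.0, 0.0], [1.0, 0.0], [0.0, 1.0]
lhs = sip(f, [g[0] + h[0], g[1] + h[1]])  # [f, g + h]
rhs = sip(f, g) + sip(f, h)               # [f, g] + [f, h]
assert abs(lhs - rhs) > 0.25              # the two sides differ substantially
```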

We are ready to present the definition of vector-valued RKBS. Let Λ be a Banach space, which we shall sometimes call the output space, and let X be a prescribed set, usually called the input space. A space B is called a Banach space of Λ-valued functions on X if it consists of certain functions from X to Λ and the norm on B is compatible with point evaluations in the sense that

  ∥f∥_B = 0 if and only if f(x) = 0 for all x ∈ X.

For instance, L¹([0,1]), whose elements are equivalence classes of functions equal almost everywhere, is not a Banach space of functions, while C([0,1]) is. We restrict our consideration to Banach spaces of functions so that point evaluations (usually referred to as “sampling” in applications) are well-defined.

###### Definition 2.2

We call B a Λ-valued RKBS on X if both B and Λ are uniform Banach spaces and B is a Banach space of functions from X to Λ such that for every x ∈ X, the point evaluation δ_x defined by

  δ_x(f) := f(x),  f ∈ B

is continuous from B to Λ.

We shall derive a reproducing kernel for a vector-valued RKBS so defined. Throughout the rest of the paper, we let [·,·]_B and [·,·]_Λ be the unique compatible semi-inner products, and J_B and J_Λ the associated duality mappings, on B and Λ, respectively. For two Banach spaces V₁, V₂, we denote by M(V₁,V₂) the set of all the bounded operators from V₁ to V₂ and by L(V₁,V₂) the subset of M(V₁,V₂) of those bounded operators that are also linear. When V₁ = V₂ = V, M(V,V) is abbreviated as M(V). For each T ∈ M(V₁,V₂), we denote by ∥T∥_{M(V₁,V₂)} the greatest lower bound of all the nonnegative constants α such that

  ∥Tu∥_{V₂} ≤ α∥u∥_{V₁} for all u ∈ V₁.

When T is also linear, this quantity equals the operator norm of T in L(V₁,V₂). In this language, we require that the point evaluation δ_x on a Λ-valued RKBS B on X belong to L(B,Λ) for all x ∈ X.

###### Theorem 2.3

Let B be a Λ-valued RKBS on X. Then there exists a unique function K from X × X to M(Λ) such that

(1)

K(x,·)ξ ∈ B for all x ∈ X and ξ ∈ Λ,

(2)

for all f ∈ B, x ∈ X, and ξ ∈ Λ,

  [f(x), ξ]_Λ = [f, K(x,·)ξ]_B, (2.6)

(3)

for all x, y ∈ X,

  ∥K(x,y)∥_{M(Λ)} ≤ ∥δ_x∥_{L(B,Λ)} ∥δ_y∥_{L(B,Λ)}. (2.7)

Proof: Let x ∈ X and ξ ∈ Λ. As δ_x ∈ L(B,Λ), we see that

  |[f(x), ξ]_Λ| ≤ ∥f(x)∥_Λ ∥ξ∥_Λ ≤ ∥δ_x∥_{L(B,Λ)} ∥f∥_B ∥ξ∥_Λ. (2.8)

The above inequality together with the linearity of the semi-inner product with respect to its first variable implies that

  f → [f(x), ξ]_Λ

is a bounded linear functional on B. By Lemma 2.1, there exists a unique g_{x,ξ} ∈ B such that

  [f(x), ξ]_Λ = [f, g_{x,ξ}]_B. (2.9)

Define a function K from X × X to the set of operators from Λ to Λ by setting

  K(x,y)ξ := g_{x,ξ}(y),  x, y ∈ X, ξ ∈ Λ.

Clearly, K satisfies the two requirements (1) and (2). It is also unique by the uniqueness of the function g_{x,ξ} satisfying (2.9). It remains to show that it is bounded. To this end, we get by (2.8) that

  ∥K(x,·)ξ∥_B = sup_{f∈B, ∥f∥_B≤1} |[f, K(x,·)ξ]_B| = sup_{f∈B, ∥f∥_B≤1} |[f(x), ξ]_Λ| ≤ ∥δ_x∥_{L(B,Λ)} ∥ξ∥_Λ.

It follows that

  ∥K(x,y)ξ∥_Λ ≤ ∥δ_y∥_{L(B,Λ)} ∥K(x,·)ξ∥_B ≤ ∥δ_x∥_{L(B,Λ)} ∥δ_y∥_{L(B,Λ)} ∥ξ∥_Λ,

which proves (2.7).

We call the above function K the reproducing kernel of B. It coincides with the usual reproducing kernel when B is a Hilbert space and Λ is the set of scalars, and with the vector-valued reproducing kernel when both B and Λ are Hilbert spaces. We next explore basic properties of vector-valued RKBS and their reproducing kernels for further investigation and applications.

Let (δ_x)^* be the adjoint operator of δ_x for all x ∈ X. Denote for a Banach space V by (·,·)_V the bilinear form on V × V^* defined by

  (v, μ)_V := μ(v),  v ∈ V, μ ∈ V^*.

Thus, (δ_x)^* is defined by

  (f, (δ_x)^* ξ^*)_B = (δ_x(f), ξ^*)_Λ = (f(x), ξ^*)_Λ = [f(x), ξ]_Λ,  f ∈ B, ξ ∈ Λ. (2.10)
###### Proposition 2.4

Let B be a Λ-valued RKBS on X and K its reproducing kernel. Then there holds for all x, y ∈ X and ξ, η ∈ Λ that

  [K(x,x)ξ, ξ]_Λ ≥ 0,  |[K(x,y)ξ, η]_Λ| ≤ [K(x,x)ξ, ξ]_Λ^{1/2} [K(y,y)η, η]_Λ^{1/2}, (2.11)
  ∥K(x,y)∥_{M(Λ)} ≤ ∥K(x,x)∥_{M(Λ)}^{1/2} ∥K(y,y)∥_{M(Λ)}^{1/2}, (2.12)
  K(x,·)ξ = J_B^{-1} (δ_x)^* J_Λ(ξ), (2.13)
  K(x,y)(αξ) = αK(x,y)ξ for all α ∈ ℂ, (2.14)
  ∥K(x,·)ξ∥_B ≤ ∥δ_x∥_{L(B,Λ)} ∥ξ∥_Λ,  ∥K(x,·)ξ∥_B ≤ ∥K(x,x)∥_{M(Λ)}^{1/2} ∥ξ∥_Λ, (2.15)
  (K(x,·)ξ)^* + (K(x,·)η)^* = (K(x,·)τ)^* whenever τ^* = ξ^* + η^*, (2.16)
  span{(K(x,·)ξ)^* : x ∈ X, ξ ∈ Λ} is dense in B^*. (2.17)

Proof: By (2.6),

  [K(x,x)ξ, ξ]_Λ = [K(x,·)ξ, K(x,·)ξ]_B = ∥K(x,·)ξ∥_B^2 ≥ 0, (2.18)

which proves the first inequality in equation (2.11). For the second one, we use the Cauchy-Schwartz inequality of semi-inner products to get that

  |[K(x,y)ξ, η]_Λ| = |[K(x,·)ξ, K(y,·)η]_B| ≤ [K(x,·)ξ, K(x,·)ξ]_B^{1/2} [K(y,·)η, K(y,·)η]_B^{1/2} = [K(x,x)ξ, ξ]_Λ^{1/2} [K(y,y)η, η]_Λ^{1/2}.

It follows from (2.11) that

  |[K(x,y)ξ, η]_Λ| ≤ ∥K(x,x)ξ∥_Λ^{1/2} ∥ξ∥_Λ^{1/2} ∥K(y,y)η∥_Λ^{1/2} ∥η∥_Λ^{1/2} ≤ ∥K(x,x)∥_{M(Λ)}^{1/2} ∥K(y,y)∥_{M(Λ)}^{1/2} ∥ξ∥_Λ ∥η∥_Λ.

Since ∥K(x,y)ξ∥_Λ = sup_{η∈Λ, ∥η∥_Λ=1} |[K(x,y)ξ, η]_Λ|, we have by the above equation that

  ∥K(x,y)ξ∥_Λ ≤ ∥K(x,x)∥_{M(Λ)}^{1/2} ∥K(y,y)∥_{M(Λ)}^{1/2} ∥ξ∥_Λ,

which proves (2.12).

Turning to (2.13), we notice for each f ∈ B that

  [f, J_B^{-1}(δ_x)^* J_Λ(ξ)]_B = (f, (δ_x)^* J_Λ(ξ))_B = (δ_x(f), ξ^*)_Λ = (f(x), ξ^*)_Λ = [f(x), ξ]_Λ,

which together with (2.6) confirms (2.13). Since the duality mappings are conjugate homogeneous, we have by (2.13) that

  K(x,·)(αξ) = J_B^{-1}(δ_x)^* J_Λ(αξ) = αJ_B^{-1}(δ_x)^* J_Λ(ξ) = αK(x,·)ξ,

which implies (2.14).

Recall that the duality mappings J_B and J_Λ are isometric. Note also that a bounded linear operator and its adjoint have equal operator norms. Using these two facts, we obtain from equation (2.13) that

  ∥K(x,·)ξ∥_B ≤ ∥(δ_x)^*∥_{L(Λ^*,B^*)} ∥ξ∥_Λ = ∥δ_x∥_{L(B,Λ)} ∥ξ∥_Λ,

which is the first inequality in (2.15). The second one follows immediately from (2.18).

Let ξ, η, τ ∈ Λ be such that τ^* = ξ^* + η^*. By (2.13),

  (K(x,·)ξ)^* + (K(x,·)η)^* = (δ_x)^* ξ^* + (δ_x)^* η^* = (δ_x)^* (ξ^* + η^*) = (δ_x)^* τ^* = (K(x,·)τ)^*.

Equation (2.16) hence holds true.

For the last property, let us assume that there exists some f ∈ B that, regarded as a functional on B^* by reflexivity, vanishes on span{(K(x,·)ξ)^* : x ∈ X, ξ ∈ Λ}. Then

  [f(x), ξ]_Λ = [f, K(x,·)ξ]_B = (f, (K(x,·)ξ)^*)_B = 0 for all x ∈ X, ξ ∈ Λ,

which implies that f(x) = 0 for all x ∈ X. As B is a Banach space of functions, f = 0 as a vector in the Banach space B. Therefore, (2.17) is true. The proof is complete.

We observe by the above proposition that the reproducing kernel of a vector-valued RKBS enjoys many properties similar to those of the reproducing kernel of a vector-valued RKHS. However, there are significant differences due to the nature of a semi-inner product. Firstly, although K(x,y) remains a homogeneous bounded operator on Λ for all x, y ∈ X, it is generally non-additive. This can be seen from (2.13), where J_B^{-1} or J_Λ is non-additive. Secondly, it is well-known that when Λ is a Hilbert space, a function K: X × X → L(Λ) is the reproducing kernel of some Λ-valued RKHS on X if and only if for all finite m ∈ ℕ, pairwise distinct x_j ∈ X, and ξ_j ∈ Λ, j ∈ ℕ_m,

  Σ_{j=1}^m Σ_{k=1}^m [K(x_j,x_k)ξ_j, ξ_k]_Λ ≥ 0. (2.19)

Although (2.19) still holds for the reproducing kernel of a vector-valued RKBS when m ≤ 2 and the number field is ℝ, it may cease to be true once the number of sampling points exceeds 2. An example will be constructed in the next section. Finally, the denseness property (2.17) in the dual space B^* does not necessarily imply that

  span{K(x,·)ξ : x ∈ X, ξ ∈ Λ} is dense in B. (2.20)

###### Proposition 2.5

Let B be a Λ-valued RKBS on X. Suppose that f_n ∈ B, n ∈ ℕ, converges to some f ∈ B in the norm of B. Then f_n(x) converges to f(x) in the norm topology of Λ for each x ∈ X, and the convergence is uniform on any set where ∥K(x,x)∥_{M(Λ)} is bounded.

Proof: Suppose that ∥f_n − f∥_B converges to zero as n tends to infinity. We get by (2.15) that

  ∥f_n(x) − f(x)∥_Λ = sup_{ξ∈Λ, ∥ξ∥_Λ=1} |[f_n(x) − f(x), ξ]_Λ| = sup_{ξ∈Λ, ∥ξ∥_Λ=1} |[f_n − f, K(x,·)ξ]_B| ≤ sup_{ξ∈Λ, ∥ξ∥_Λ=1} ∥f_n − f∥_B ∥K(x,·)ξ∥_B ≤ ∥f_n − f∥_B ∥K(x,x)∥_{M(Λ)}^{1/2}.

Therefore, f_n converges pointwise to f on X and the convergence is uniform on any set where ∥K(x,x)∥_{M(Λ)} is bounded.

## 3 Feature Map Representations

Feature map representations form the most important way of expressing reproducing kernels. To introduce feature maps for the reproducing kernel of a vector-valued RKBS, we need the notion of the generalized adjoint [22] of a bounded linear operator between Banach spaces. Let be two uniform Banach spaces with the compatible semi-inner products and , respectively. The generalized adjoint of a is an operator in defined by

 [Tu,v]V2=[u,T†v]V1,  u∈V1, v∈V2.

It can be identified that

 T†=J−1V1T∗JV2.

Thus, is indeed bounded as

 ∥T†∥M(V2,V1)=∥T∗∥L(V∗2,V∗1)=∥T∥L(V1,V2).

We are in a position to present a characterization of the reproducing kernel of a vector-valued RKBS.

###### Theorem 3.1

A function K: X × X → M(Λ) is the reproducing kernel of some Λ-valued RKBS on X if and only if there exist a uniform Banach space W and a mapping Φ: X → L(W,Λ) such that

  K(x,y) = Φ(y)Φ†(x),  x, y ∈ X, (3.1)

and

  span{(Φ†(x)ξ)^* : x ∈ X, ξ ∈ Λ} is dense in W^*. (3.2)

Here Φ† is the function from X to M(Λ,W) defined by Φ†(x) := (Φ(x))†, x ∈ X.

Proof: Suppose that K is the reproducing kernel of some Λ-valued RKBS B on X. Set W := B and define Φ: X → L(B,Λ) by

  (Φ(x))(f) := f(x),  f ∈ B,  x ∈ X.

To identify Φ†, we observe by the reproducing property (2.6) for all f ∈ B and ξ ∈ Λ that

  [f, Φ†(x)ξ]_B = [(Φ(x))f, ξ]_Λ = [f(x), ξ]_Λ = [f, K(x,·)ξ]_B,  x ∈ X,  ξ ∈ Λ,

which implies that Φ†(x)ξ = K(x,·)ξ for all x ∈ X and ξ ∈ Λ. Requirement (3.2) is fulfilled by (2.17). By the forms of Φ and Φ†, we obtain that

  Φ(y)Φ†(x)ξ = Φ(y)(K(x,·)ξ) = K(x,y)ξ,

which proves (3.1).

On the other hand, suppose that K is of the form (3.1) in terms of some mapping Φ satisfying the denseness condition (3.2). We shall construct an RKBS B that takes K as its reproducing kernel. For this purpose, we let B be composed of the functions from X to Λ of the following form:

  f_u(x) := Φ(x)u,  x ∈ X, for some u ∈ W.

Since each Φ(x) is a linear operator, B is a linear vector space. We impose a norm on B by setting

  ∥f_u∥_B := ∥u∥_W,  u ∈ W.

To verify that this is a well-defined norm, it suffices to show that the representer u ∈ W of a function f_u ∈ B is unique. Assume that f_u = 0. Then for all x ∈ X and ξ ∈ Λ,

  (u, (Φ†(x)ξ)^*)_W = [u, Φ†(x)ξ]_W = [Φ(x)u, ξ]_Λ = [0, ξ]_Λ = 0,

which combined with (3.2) implies that u = 0. The arguments also show that B is a Banach space of functions. Moreover, it is a uniform Banach space, as it is isometrically isomorphic to W. Clearly, we have for each x ∈ X and u ∈ W that

  ∥f_u(x)∥_Λ = ∥Φ(x)u∥_Λ ≤ ∥Φ(x)∥_{L(W,Λ)} ∥u∥_W = ∥Φ(x)∥_{L(W,Λ)} ∥f_u∥_B,

which shows that point evaluations are bounded on B. We conclude that B is a Λ-valued RKBS on X. It remains to prove that K is the reproducing kernel of B. To this end, we identify the unique compatible semi-inner product on B as

  [f_u, f_v]_B := [u, v]_W,  u, v ∈ W,

and observe for all u ∈ W, x ∈ X and ξ ∈ Λ that

  [f_u, K(x,·)ξ]_B = [f_u, Φ(·)Φ†(x)ξ]_B = [u, Φ†(x)ξ]_W = [Φ(x)u, ξ]_Λ = [f_u(x), ξ]_Λ,

which is what we want. The proof is complete.

We call the Banach space W and the mapping Φ in Theorem 3.1 a pair of feature space and feature map for K, respectively. The proof of Theorem 3.1 contains a construction of vector-valued RKBS by feature maps, which we state separately as a corollary below.

###### Corollary 3.2

Let W be a uniform Banach space and Φ: X → L(W,Λ) be a feature map that satisfies the denseness condition (3.2). Then the linear vector space

  B := {Φ(·)u : u ∈ W}

endowed with the norm

  ∥Φ(·)u∥_B := ∥u∥_W,  u ∈ W

and compatible semi-inner product

  [Φ(·)u, Φ(·)v]_B := [u, v]_W,  u, v ∈ W

is a Λ-valued RKBS on X with the reproducing kernel K given by (3.1).
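
The construction of Corollary 3.2 can be exercised in finite dimensions. The sketch below is an illustration with arbitrarily chosen data: Λ = ℝ² Euclidean (so J_Λ is the identity), W = ℝ² with the ℓ³ norm, a single input point x, and a fixed matrix for Φ(x). It verifies numerically the reproducing identity [f_u(x), ξ]_Λ = [f_u, K(x,·)ξ]_B = [u, Φ†(x)ξ]_W, where Φ†(x) = J_W^{-1}Φ(x)^*J_Λ and, on ℓ_s spaces, J_W^{-1} is the duality map of the conjugate exponent.

```python
# Lambda = R^2 Euclidean (J_Lambda = id), W = R^2 with the l^s norm, s = 3.
# Phi(x) is a fixed 2x2 matrix; generalized adjoint: Phi†(x) = J_W^{-1} Phi(x)^T J_Lambda.
s = 3.0
q = s / (s - 1.0)  # conjugate exponent: W* carries the l^q norm

def norm(u, g):
    return sum(abs(x) ** g for x in u) ** (1.0 / g)

def dualmap(u, g):
    """Duality map on real l^g: u -> u|u|^(g-2)/||u||^(g-2) (components nonzero here)."""
    n = norm(u, g)
    return [x * abs(x) ** (g - 2) / n ** (g - 2) for x in u]

def sip(u, v, g):
    """Compatible semi-inner product [u, v] on l^g."""
    return sum(x * y for x, y in zip(u, dualmap(v, g)))

Phi = [[1.0, 2.0], [0.5, -1.0]]  # Phi(x) as a matrix W -> Lambda

def apply(M, u):
    return [sum(M[i][j] * u[j] for j in range(2)) for i in range(2)]

def transpose(M):
    return [[M[j][i] for j in range(2)] for i in range(2)]

def gen_adjoint(xi):
    """Phi†(x)xi = J_W^{-1}(Phi(x)^T J_Lambda xi); on l^s, J_W^{-1} = J_{l^q}."""
    return dualmap(apply(transpose(Phi), xi), q)  # J_Lambda = id on Euclidean R^2

u = [1.0, 3.0]        # representer of f_u = Phi(.)u
xi = [2.0, 1.0]       # output-space direction
fu_x = apply(Phi, u)  # f_u(x) = Phi(x)u

lhs = sum(a * b for a, b in zip(fu_x, xi))  # [f_u(x), xi]_Lambda (Euclidean)
rhs = sip(u, gen_adjoint(xi), s)            # [u, Phi†(x)xi]_W
assert abs(lhs - rhs) < 1e-9                # both equal 11.5 for this data
```

Note that the identity holds exactly by the definition of the generalized adjoint; the numerics only confirm that the concrete duality-map formulas fit together.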

As an interesting application of Corollary 3.2, we shall show that a vector-valued RKBS is always isometrically isomorphic to a scalar-valued RKBS on a different input space.

###### Corollary 3.3

If B is a Λ-valued RKBS on X, then the linear vector space B̃ of complex-valued functions on X̃ := X × Λ of the form

  f̃(x,ξ) := [f(x), ξ]_Λ,  x ∈ X, ξ ∈ Λ, f ∈ B

is an RKBS on X̃ with the norm

  ∥f̃∥_{B̃} := ∥f∥_B,  f ∈ B

and the compatible semi-inner product

  [f̃, g̃]_{B̃} := [f, g]_B,  f, g ∈ B.

The reproducing kernel of B̃ is

  K̃((x,ξ), (y,η)) := [K(x,y)ξ, η]_Λ,  x, y ∈ X, ξ, η ∈ Λ.

Proof: It suffices to point out that B̃ is constructed by Corollary 3.2 via the choices

  Λ := ℂ,  W := B,  Φ(x,ξ) := (K(x,·)ξ)^*,  (x,ξ) ∈ X̃.

The feature map satisfies the denseness condition (3.2) by (2.17).

We shall next construct by Corollary 3.2 simple vector-valued RKBS to show that the reproducing kernel of a general vector-valued RKBS might not satisfy (2.19) or (2.20). Let p, q, r, s ∈ (1,+∞) satisfy

  1/p + 1/q = 1/r + 1/s = 1. (3.3)

Here, for the sake of convenience in enumerating elements from a finite set, we set ℕ_l := {1, 2, …, l} for l ∈ ℕ. For each l ∈ ℕ and γ ∈ (1,+∞), ℓ^l_γ denotes the Banach space of all vectors u = (u_j : j ∈ ℕ_l) ∈ ℂ^l with the norm

  ∥u∥_{ℓ^l_γ} := (Σ_{j=1}^l |u_j|^γ)^{1/γ}.

The space ℓ^l_γ is a uniform Banach space with the compatible semi-inner product

  [u, v]_{ℓ^l_γ} := Σ_{j=1}^l u_j v̄_j |v_j|^{γ−2} / ∥v∥_{ℓ^l_γ}^{γ−2},  u, v ∈ ℓ^l_γ.

The dual element of u ∈ ℓ^l_γ is hence given by

  u^* = ( ū_j |u_j|^{γ−2} / ∥u∥_{ℓ^l_γ}^{γ−2} : j ∈ ℕ_l ). (3.4)
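
A short numeric check of (3.4) in the real case: the dual element u^* recovers the norm, in that u^*(u) = ∥u∥² (compatibility, cf. (2.3)) and ∥u^*∥_{ℓ^l_δ} = ∥u∥_{ℓ^l_γ} for the conjugate exponent δ (cf. (2.2)). The exponent and vector below are arbitrary.

```python
# Check of (3.4) on R^3 with gamma = 2.5: u*(u) = ||u||^2 and ||u*||_delta = ||u||,
# where 1/gamma + 1/delta = 1.
gamma = 2.5
delta = gamma / (gamma - 1.0)

def norm(u, g):
    return sum(abs(x) ** g for x in u) ** (1.0 / g)

u = [1.0, -2.0, 3.0]
ustar = [x * abs(x) ** (gamma - 2) / norm(u, gamma) ** (gamma - 2) for x in u]

assert abs(sum(a * b for a, b in zip(u, ustar)) - norm(u, gamma) ** 2) < 1e-9
assert abs(norm(ustar, delta) - norm(u, gamma)) < 1e-9
```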

Non-completeness of the linear span of the reproducing kernel in B. We give a counterexample to (2.20) first. Let m ∈ ℕ. We choose the output space and the feature space as Λ := ℓ^n_p for some n ∈ ℕ and W := ℓ^m_r, respectively. Thus, we have that Λ^* = ℓ^n_q and W^* = ℓ^m_s. The input space will be chosen as a set of m distinct points X := {x_j : j ∈ ℕ_m}. A feature map Φ: X → L(W,Λ) should satisfy the denseness condition (3.2). We note by the definition of the generalized adjoint that this condition is equivalent to

  span{Φ^*(x)ξ^* : x ∈ X, ξ ∈ Λ} is dense in W^*, (3.5)

where Φ^*(x) := (Φ(x))^* for all x ∈ X.

Let us take a close look at equation (2.20). By Corollary 3.2, a general function in B is of the form f_u = Φ(·)u for some u ∈ W. Equation (2.20) fails if and only if there exists a nontrivial f_u ∈ B such that for all x ∈ X and ξ ∈ Λ,

  [K(x,·)ξ, f_u]_B = [Φ(·)Φ†(x)ξ, Φ(·)u]_B = [Φ†(x)ξ, u]_W = 0,

which in turn is equivalent to span{Φ†(x)ξ : x ∈ X, ξ ∈ Λ} not being dense in W. We conclude that to construct a Λ-valued RKBS B for which (2.20) is not true, it suffices to find a feature map Φ that satisfies (3.5) but

  span{Φ†(x)ξ : x ∈ X, ξ ∈ Λ} is not dense in W. (3.6)

To this end, we find a sequence of vectors w_j ∈ ℂ^m, j ∈ ℕ_m, and set

  Φ^*(x_j)ξ^* := (ξ^*)_1 w_j,  j ∈ ℕ_m, (3.7)

where (ξ^*)_1 is the first component of the vector ξ^* ∈ Λ^*. Since for each x ∈ X, Φ^*(x) is a linear operator from Λ^* to W^* and both of the spaces are finite-dimensional, Φ^*(x) is bounded. We reformulate (3.5) and (3.6) to get that they are respectively equivalent to

  span{w_j : j ∈ ℕ_m} = ℂ^m (3.8)

and

  span{J_W^{-1} w_j : j ∈ ℕ_m} ⫋ ℂ^m. (3.9)

Here, for a vector u ∈ W^* = ℓ^m_s, we get by (3.4) that

  J_W^{-1} u = ( ū_j |u_j|^{s−2} / ∥u∥_{ℓ^m_s}^{s−2} : j ∈ ℕ_m ).

Therefore, the task reduces to the search for a nonsingular matrix, with rows w_j, that becomes singular when we apply the map t → t̄|t|^{s−2}, up to a positive normalization factor for each row, to each of its components. We find two such matrices, shown below:

  m = 4, s = 4:  A_1 := [ 0 8 2 4
                          5 0 5 1
                          5 4 6 9
                          0 9 4 8 ],

and

  m = 4, s = 5:  A_2 := [ 9 9 9 9
                          8 6 0 2
                          6 9 2 1
                          7 4 9 9 ].
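
The claimed property of the first matrix can be verified with exact integer arithmetic: its determinant is nonzero, so its rows satisfy (3.8), while applying t → t|t|^{s−2} = t³ (for s = 4 and real entries) componentwise, which realizes J_W^{-1} on each row up to a positive per-row factor that does not affect singularity, produces a singular matrix, so (3.9) holds. A minimal stdlib-only check, assuming the digit-by-digit reading of A_1 given above:

```python
# Exact verification that A1 (m = 4, s = 4) is nonsingular while its
# componentwise cube, a row-rescaled image under J_W^{-1}, is singular.

def det(M):
    """Laplace expansion along the first row; exact for integer matrices."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

A1 = [[0, 8, 2, 4],
      [5, 0, 5, 1],
      [5, 4, 6, 9],
      [0, 9, 4, 8]]

assert det(A1) == 420  # rows span C^4, so (3.8) holds
cubed = [[a ** 3 for a in row] for row in A1]
assert det(cubed) == 0  # the J_W^{-1}-images are linearly dependent: (3.9) holds
```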

Non-positive-definiteness of the reproducing kernel of B. We shall give an example to show that (2.19) might not hold true for the reproducing kernel of a vector-valued RKBS when the number of sampling points exceeds 2. In fact, we let Λ, W and X be constructed as in the above example, with the vectors w_j to be appropriately chosen in the definition (3.7) of Φ^*. Our purpose is to find x_j ∈ X and ξ_j ∈ Λ, j ∈ ℕ_3, such that

  Σ_{j=1}^3 Σ_{k=1}^3 [K(x_j,x_k)ξ_j, ξ_k]_Λ < 0. (3.10)

We first note for all j, k ∈ ℕ_3 that

  [K(x_j,x_k)ξ_j, ξ_k]_Λ = [Φ(x_k)Φ†(x_j)ξ_j, ξ_k]_Λ = [Φ†(x_j)ξ_j, Φ†(x_k)ξ_k]_W = [(Φ†