
A Basic Treatment of the Distance Covariance

Dominic Edelmann, Tobias Terzer, and Donald Richards

Dominic Edelmann: Division of Biostatistics, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg, Germany. E-mail: dominic.edelmann@dkfz-heidelberg.de.
Tobias Terzer: Division of Biostatistics, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg, Germany. E-mail: t.terzer@dkfz-heidelberg.de.
Donald Richards: Department of Statistics, Pennsylvania State University, University Park, PA 16802, U.S.A. E-mail: richards@stat.psu.edu.

MSC 2010 subject classifications: Primary 62G10, 62H20; Secondary 60E10, 62G20.
Keywords and phrases: Asymptotic distribution; distance correlation; multivariate tests of independence; orthogonal transformations; U-statistics.
Abstract

The distance covariance of Székely, et al. [24] and Székely and Rizzo [22], a powerful measure of dependence between sets of multivariate random variables, has the crucial feature that it equals zero if and only if the sets are mutually independent. Hence the distance covariance can be applied to multivariate data to detect arbitrary types of non-linear associations between sets of variables. We provide in this article a basic, albeit rigorous, introductory treatment of the distance covariance. Our investigations yield an approach that can be used as the foundation for presentation of this important and timely topic even in advanced undergraduate- or junior graduate-level courses on mathematical statistics.

1 Introduction

The distance covariance, a measure of dependence between multivariate random variables $X$ and $Y$, was introduced by Székely, Rizzo, and Bakirov [24] and has since received extensive attention in the statistical literature. A crucial feature of the distance covariance is that it equals zero if and only if $X$ and $Y$ are mutually independent. Hence the distance covariance is sensitive to arbitrary dependencies; this is in contrast to the classical covariance, which is generally capable of detecting only linear dependencies. Figure 1 illustrates this property: tests based on the distance covariance detect numerous types of non-linear associations that tests based on the classical covariance fail to detect.

Figure 1: The sub-figures A-C represent scatter-plots of bivariate samples $({\boldsymbol X},{\boldsymbol Y})$ with $n=600$ data points to which independence tests, based on the distance covariance and classical covariance, were applied. In each case a distance covariance permutation test using 100,000 permutations yields $p$-values of $10^{-5}$, demonstrating that the distance covariance is able to detect these dependencies. The $p$-values of permutation tests based on the classical covariance with 100,000 permutations are 0.663, 0.129, and 0.889 for A, B, and C, respectively.

While the dependencies illustrated in Figure 1 are purely illustrative examples, the sensitivity of the distance covariance to arbitrary dependencies can be very useful in applications. This is demonstrated in Figure 2, where we show three dependencies between expression values of genes in the breast cancer data set of Van De Vijver, et al. [26]; all of these dependencies can be detected by the distance covariance but not by the classical covariance.

For comparisons of the distance covariance and classical covariance in applications to data, see the examples given by Székely and Rizzo [22, Section 5.2] and Dueck, et al. [3, Section 5]; for extensive numerical experiments and fast algorithms for computing the distance covariance, see Huo and Székely [11, Section 5]. We also refer to Sejdinovic, et al. [19], Dueck, et al. [3], Székely and Rizzo [22, 23], Huo and Székely [11], and Edelmann, et al. [6], representing only a few of the many authors who have given further theoretical results on the distance covariance and distance correlation coefficients; and to Zhou [28], Fiedler [7], and Edelmann, et al. [5] as among the applications to time series analyses. Many applications of the distance covariance and the distance correlation coefficient to data analysis are now available, including: Kong, et al. [12] on data in sociology, Martínez-Gómez, et al. [14] and Richards, et al. [18] on astrophysical databases and galaxy clusters, Dueck, et al. [3] on time series analyses of wind vectors at electricity-generating facilities, Richards [17] on the relationship between the strength of gun control laws and firearm-related homicides, Zhang, et al. [27] on remote sensing applications, and Ohana-Levi, et al. [15] on grapevine transpiration.

Figure 2: Sub-figures A-C represent three scatter-plots of the expression values of genes in a breast cancer data set provided by Van De Vijver, et al. [26] ($n=295$ samples) on which permutation tests, based on the distance covariance and classical covariance, were applied. The $p$-values of the distance covariance permutation tests using 100,000 permutations are A: $10^{-5}$; B: $10^{-5}$; C: $3.00\times 10^{-4}$. The $p$-values of permutation tests based on the classical covariance with 100,000 permutations are A: 0.079; B: 0.503; C: 0.930.

The original papers of Székely, et al. [24] and Székely and Rizzo [22] are now widely recognized as seminal contributions to measuring dependence between sets of random variables; however, the exposition therein includes some ingenious arguments that may make the material challenging to readers not having an advanced background in mathematical statistics. With the benefit of hindsight, we are able to provide in this article a simpler, albeit mathematically rigorous, introduction to the distance covariance that can be taught even in an undergraduate-level course covering the basic theory of U-statistics. Other than standard U-statistics theory and some well-known properties of characteristic functions, the requirements for our treatment are a knowledge of multidimensional integration and trigonometric inequalities, as covered in a course on undergraduate-level advanced calculus. Consequently, we hope that this treatment will prove beneficial to statisticians without extensive training in mathematical statistics.

Our presentation introduces the distance covariance as an important alternative to the classical covariance. Moreover, the distance covariance constitutes a particularly interesting example of a U-statistic since it includes both the “non-degenerate” and “first-order degenerate” cases of the asymptotic distribution theory of U-statistics; these correspond to the situations in which $X$ and $Y$ are dependent, leading to the non-degenerate case, or $X$ and $Y$ are independent, leading to the first-order degenerate case of the asymptotic theory.

Throughout the exposition, $\|\cdot\|$ denotes the Euclidean norm and $\langle\cdot,\cdot\rangle$ the corresponding inner product. Also, we denote by $|\cdot|$ the modulus in $\mathbb{C}$ or the absolute value in $\mathbb{R}$, and the imaginary unit is $i=\sqrt{-1}$.

2 The fundamental integral of distance covariance theory

Following Székely, et al. [24], we first establish a closed-form expression for an integral that plays a central role in this article, leading to the equivalence of two crucial expressions for the distance covariance. The first expression displays the distance covariance as an integrated distance between the joint characteristic function of $(X,Y)$ and the product of the marginal characteristic functions of $X$ and $Y$; we will deduce from this expression that the distance covariance equals zero if and only if $X$ and $Y$ are independent. The second expression allows us to derive consistent distance covariance estimators that are expressible as polynomials in the distances between random samples.

Since the ability to characterize independence and the existence of easily computable estimators are arguably the most important properties of the distance covariance, we will refer to this integral as the fundamental integral of distance covariance.

Lemma 2.1.

For $x\in\mathbb{R}^{p}$,

\[
\int_{\mathbb{R}^{p}}\frac{1-\cos\langle t,x\rangle}{\|t\|^{p+1}}\,\mathrm{d}t=\frac{\pi^{(p+1)/2}}{\Gamma\big((p+1)/2\big)}\,\|x\|. \tag{2.1}
\]

Proof. Since (2.1) is valid for $x=0$, we need only treat the case in which $x\neq 0$.

Denote by $I_p$ the integral in (2.1). For $p=1$, replacing $t$ by $t/x$ yields

\[
I_{1}=\int_{-\infty}^{\infty}\frac{1-\cos tx}{t^{2}}\,\mathrm{d}t=|x|\int_{-\infty}^{\infty}\frac{1-\cos t}{t^{2}}\,\mathrm{d}t. \tag{2.2}
\]

Denoting the latter integral in (2.2) by $c_1$, it follows by integration-by-parts that

\[
c_{1}=2\int_{0}^{\infty}\frac{1-\cos t}{t^{2}}\,\mathrm{d}t=2\int_{0}^{\infty}\frac{\sin t}{t}\,\mathrm{d}t=\pi, \tag{2.3}
\]

the last equality being classical in calculus (Spivak [20, Chapter 19, Problem 43]).
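Readers who wish to check (2.3) numerically can do so in a few lines. The following sketch (our own illustration, not part of the original papers; it assumes NumPy and SciPy) splits the Dirichlet integral $\int_0^\infty (\sin t)/t\,\mathrm{d}t$ into a bounded piece near the origin and an oscillatory tail handled by SciPy's Fourier-type quadrature:

```python
import numpy as np
from scipy.integrate import quad

# sin(t)/t is bounded on [0, 1]; np.sinc(t/pi) equals sin(t)/t.
head, _ = quad(lambda t: np.sinc(t / np.pi), 0.0, 1.0)

# Tail: QAWF quadrature integrates f(t) = 1/t against sin(t) over [1, oo).
tail, _ = quad(lambda t: 1.0 / t, 1.0, np.inf, weight="sin", wvar=1)

print(head + tail, np.pi / 2)   # both approximately 1.5707963
```

Doubling the result recovers $c_1=\pi$.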

For general $p$, note that $I_p$ is invariant under orthogonal transformations $H$ applied to $x$:

\begin{align*}
\int_{\mathbb{R}^{p}}\frac{1-\cos\langle t,Hx\rangle}{\|t\|^{p+1}}\,\mathrm{d}t &=\int_{\mathbb{R}^{p}}\frac{1-\cos\langle Ht,Hx\rangle}{\|Ht\|^{p+1}}\,\mathrm{d}t \\
&=\int_{\mathbb{R}^{p}}\frac{1-\cos\langle t,x\rangle}{\|t\|^{p+1}}\,\mathrm{d}t,
\end{align*}

where the first equality follows from the transformation $t\mapsto Ht$, which leaves the Lebesgue measure $\mathrm{d}t$ unchanged; and the second equality holds because the norm and the inner product are orthogonally invariant. Therefore, in evaluating $I_p$ we may replace $x$ by $\|x\|(1,0,\ldots,0)$; letting $t=(t_1,\ldots,t_p)$, we obtain

\[
I_{p}=\int_{\mathbb{R}^{p}}\frac{1-\cos(t_{1}\|x\|)}{\|t\|^{p+1}}\,\mathrm{d}t=\|x\|\int_{\mathbb{R}^{p}}\frac{1-\cos t_{1}}{\|t\|^{p+1}}\,\mathrm{d}t, \tag{2.4}
\]

the last equality obtained by replacing $t_j$ by $t_j/\|x\|$, $j=1,\ldots,p$.

Denoting by $c_p$ the latter integral in (2.4), we substitute in that integral $t_j=v_j$, $j=1,\ldots,p-1$, and $t_p=p^{-1/2}(v_1^2+\cdots+v_{p-1}^2)^{1/2}v_p$. As the Jacobian of this transformation is $p^{-1/2}(v_1^2+\cdots+v_{p-1}^2)^{1/2}$, we obtain

\begin{align*}
c_{p} &=p^{-1/2}\int_{\mathbb{R}^{p-1}}\frac{1-\cos v_{1}}{(v_{1}^{2}+\cdots+v_{p-1}^{2})^{p/2}}\,\mathrm{d}v_{1}\cdots\mathrm{d}v_{p-1}\cdot\int_{-\infty}^{\infty}\frac{\mathrm{d}v_{p}}{(1+p^{-1}v_{p}^{2})^{(p+1)/2}} \\
&=p^{-1/2}\,c_{p-1}\int_{-\infty}^{\infty}\frac{\mathrm{d}v_{p}}{(1+p^{-1}v_{p}^{2})^{(p+1)/2}}. \tag{2.5}
\end{align*}

As the remaining integral in (2.5) is, apart from normalization, the integral of the density of the Student's $t$-distribution on $p$ degrees of freedom, it equals $(p\pi)^{1/2}\,\Gamma(p/2)/\Gamma\big((p+1)/2\big)$, and we obtain

\[
c_{p}=\frac{\pi^{1/2}\,\Gamma(p/2)}{\Gamma\big((p+1)/2\big)}\,c_{p-1}.
\]

Starting with $c_1=\pi$, we solve this recursive equation for $c_p$; the product telescopes to $c_p=\pi^{(p+1)/2}/\Gamma\big((p+1)/2\big)$, which yields (2.1).
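As a small sanity check (again our own illustration, assuming only the Python standard library), one can verify numerically that the recursion and the closed form agree for the first several values of $p$:

```python
from math import gamma, pi

c = pi  # c_1 = pi, from (2.3)
for p in range(2, 11):
    c *= pi ** 0.5 * gamma(p / 2) / gamma((p + 1) / 2)   # the recursion above
    closed = pi ** ((p + 1) / 2) / gamma((p + 1) / 2)    # claimed closed form
    assert abs(c - closed) < 1e-12 * closed
print("recursion matches the closed form for p = 2, ..., 10")
```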

3 Two representations for the distance covariance

We now introduce the representations of the distance covariance mentioned above. Following Székely, et al. [24], we define the distance covariance through its characteristic function representation. For jointly distributed random vectors $X\in\mathbb{R}^p$ and $Y\in\mathbb{R}^q$, let $\phi_{X,Y}(s,t)=\mathbb{E}\,e^{i\langle s,X\rangle+i\langle t,Y\rangle}$ be the joint characteristic function of $(X,Y)$, and let $\phi_X(s)=\phi_{X,Y}(s,0)$ and $\phi_Y(t)=\phi_{X,Y}(0,t)$ be the corresponding marginal characteristic functions.

Definition 3.1.

The distance covariance $\mathcal{V}(X,Y)$ between $X$ and $Y$ is defined as the nonnegative square-root of

\[
\mathcal{V}^{2}(X,Y)=\frac{1}{c_{p}c_{q}}\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\frac{|\phi_{X,Y}(s,t)-\phi_{X}(s)\phi_{Y}(t)|^{2}}{\|s\|^{p+1}\,\|t\|^{q+1}}\,\mathrm{d}s\,\mathrm{d}t, \tag{3.1}
\]

where $c_p$ and $c_q$ are the normalizing constants from (2.1).

As the integrand in (3.1) is nonnegative, it follows that $\mathcal{V}^2(X,Y)\geq 0$. Further, we will show in Corollary 3.4 that $\mathcal{V}^2(X,Y)<\infty$ whenever $X$ and $Y$ have finite first moments.

An advantage of the representation (3.1) is that it directly implies one of the most important properties of the distance covariance, viz., the characterization of independence.

Theorem 3.2.

For all $X$ and $Y$, $\mathcal{V}^2(X,Y)=0$ if and only if $X$ and $Y$ are independent.

Proof. If $X$ and $Y$ are independent then $\phi_{X,Y}(s,t)=\phi_X(s)\phi_Y(t)$ for all $s$ and $t$; hence $\mathcal{V}^2(X,Y)=0$.

Conversely, if $X$ and $Y$ are not independent then the functions $\phi_{X,Y}(s,t)$ and $\phi_X(s)\phi_Y(t)$ are not identical (Van der Vaart [25, Lemma 2.15]). Since characteristic functions are continuous, there exists an open set $\mathcal{A}\subseteq\mathbb{R}^p\times\mathbb{R}^q$ such that $|\phi_{X,Y}(s,t)-\phi_X(s)\phi_Y(t)|^2>0$ for all $(s,t)\in\mathcal{A}$. Hence, by (3.1), $\mathcal{V}^2(X,Y)>0$.

For the purpose of deriving estimators for $\mathcal{V}^2(X,Y)$, we now apply Lemma 2.1 to obtain a second representation of the distance covariance.

Theorem 3.3.

Suppose that $(X_1,Y_1),\ldots,(X_4,Y_4)$ are independent, identically distributed (i.i.d.) copies of $(X,Y)$. Then

\[
\mathcal{V}^{2}(X,Y)=\mathbb{E}\big[\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{2}\|-2\,\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{3}\|+\|X_{1}-X_{2}\|\,\|Y_{3}-Y_{4}\|\big]. \tag{3.2}
\]

Proof. First, we observe that the numerator in the integrand in (3.1) equals

\begin{align*}
|\phi_{X,Y}(s,t)-\phi_{X}(s)\phi_{Y}(t)|^{2} &=\big(\phi_{X,Y}(s,t)-\phi_{X}(s)\phi_{Y}(t)\big)\,\overline{\big(\phi_{X,Y}(s,t)-\phi_{X}(s)\phi_{Y}(t)\big)} \\
&=\mathbb{E}\big[e^{i\langle s,X_{1}-X_{2}\rangle+i\langle t,Y_{1}-Y_{2}\rangle}-2\,e^{i\langle s,X_{1}-X_{2}\rangle+i\langle t,Y_{1}-Y_{3}\rangle}+e^{i\langle s,X_{1}-X_{2}\rangle+i\langle t,Y_{3}-Y_{4}\rangle}\big].
\end{align*}

Since the latter expression is real, any term of the form $e^{iz}$, $z\in\mathbb{R}$, can be replaced by $\cos z$. Hence, by (3.1),

\[
c_{p}c_{q}\,\mathcal{V}^{2}(X,Y)=\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\frac{A_{12}(s,t)-2\,A_{13}(s,t)+A_{34}(s,t)}{\|s\|^{p+1}\,\|t\|^{q+1}}\,\mathrm{d}s\,\mathrm{d}t, \tag{3.3}
\]

where, for each $(j,k)$,

\[
A_{jk}(s,t)=\mathbb{E}\cos\big(\langle s,X_{1}-X_{2}\rangle+\langle t,Y_{j}-Y_{k}\rangle\big). \tag{3.4}
\]

Replacing $t$ by $-t$ in (3.3), we also obtain

\[
c_{p}c_{q}\,\mathcal{V}^{2}(X,Y)=\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\frac{A_{12}(s,-t)-2\,A_{13}(s,-t)+A_{34}(s,-t)}{\|s\|^{p+1}\,\|t\|^{q+1}}\,\mathrm{d}s\,\mathrm{d}t, \tag{3.5}
\]

and by adding (3.3) and (3.5), we find that

\[
c_{p}c_{q}\,\mathcal{V}^{2}(X,Y)=\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\frac{B_{12}(s,t)-2\,B_{13}(s,t)+B_{34}(s,t)}{\|s\|^{p+1}\,\|t\|^{q+1}}\,\mathrm{d}s\,\mathrm{d}t,
\]

where, for each $(j,k)$,

\[
B_{jk}(s,t)=\frac{1}{2}\big(A_{jk}(s,t)+A_{jk}(s,-t)\big). \tag{3.6}
\]

On applying to (3.4) and (3.6) the trigonometric identity,

\[
\cos(x+y)+\cos(x-y)=2\cos x\,\cos y,
\]

we deduce that

\[
B_{jk}(s,t)=\mathbb{E}\big[\cos\langle s,X_{1}-X_{2}\rangle\,\cos\langle t,Y_{j}-Y_{k}\rangle\big]. \tag{3.7}
\]

For $j,k\in\{1,2,3,4\}$, we apply to (3.7) the elementary identity

\begin{align*}
\cos\langle s,X_{1}-X_{2}\rangle\,\cos\langle t,Y_{j}-Y_{k}\rangle &=\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{j}-Y_{k}\rangle\big) \\
&\qquad-1+\cos\langle s,X_{1}-X_{2}\rangle+\cos\langle t,Y_{j}-Y_{k}\rangle; \tag{3.8}
\end{align*}

then we obtain

\begin{align*}
c_{p}c_{q}\,\mathcal{V}^{2}(X,Y) =\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\Big(&\mathbb{E}\big[\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{1}-Y_{2}\rangle\big)\big] \\
&-2\,\mathbb{E}\big[\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{1}-Y_{3}\rangle\big)\big] \\
&+\mathbb{E}\big[\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{3}-Y_{4}\rangle\big)\big]\Big)\frac{\mathrm{d}s\,\mathrm{d}t}{\|s\|^{p+1}\,\|t\|^{q+1}},
\end{align*}

which is obtained by decomposing all summands on the right-hand side using (3.8) and observing that all terms which are not of the form $\mathbb{E}\big[\big(1-\cos\langle s,X_1-X_2\rangle\big)\big(1-\cos\langle t,Y_j-Y_k\rangle\big)\big]$ cancel each other; indeed, those remaining terms carry the coefficients $1-2+1=0$ because $Y_1-Y_2$, $Y_1-Y_3$, and $Y_3-Y_4$ are identically distributed. By applying the Fubini-Tonelli Theorem (the integrands are nonnegative) and the linearity of expectation and integration, we obtain

\begin{align*}
c_{p}c_{q}\,\mathcal{V}^{2}(X,Y) =\mathbb{E}\int_{\mathbb{R}^{p}}\int_{\mathbb{R}^{q}}\Big[&\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{1}-Y_{2}\rangle\big) \\
&-2\,\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{1}-Y_{3}\rangle\big) \\
&+\big(1-\cos\langle s,X_{1}-X_{2}\rangle\big)\big(1-\cos\langle t,Y_{3}-Y_{4}\rangle\big)\Big]\frac{\mathrm{d}s\,\mathrm{d}t}{\|s\|^{p+1}\,\|t\|^{q+1}}.
\end{align*}

The proof is completed by applying Lemma 2.1 twice to each of the three inner integrals; for instance, the first term factors into a product of two integrals of the form (2.1) and reduces to $c_p c_q\,\mathbb{E}\,\|X_1-X_2\|\,\|Y_1-Y_2\|$.
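To make the representation (3.2) concrete, the following Monte Carlo sketch (our own illustration, not from the original papers; it assumes NumPy, and the toy dependence $Y=X^2$ is a hypothetical choice) estimates the right-hand side of (3.2) by averaging the three-term expression over independent draws of four i.i.d. copies of $(X,Y)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def dcov_sq_mc(draw_xy, n_rep=100_000):
    """Monte Carlo estimate of (3.2), using four i.i.d. batches of (X, Y)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = (draw_xy(n_rep) for _ in range(4))
    d = lambda u, v: np.linalg.norm(u - v, axis=-1)   # row-wise Euclidean norm
    return np.mean(d(x1, x2) * d(y1, y2)
                   - 2.0 * d(x1, x2) * d(y1, y3)
                   + d(x1, x2) * d(y3, y4))

def dependent(n):                 # Y = X**2: dependent but uncorrelated
    x = rng.normal(size=(n, 1))
    return x, x ** 2

def independent(n):
    return rng.normal(size=(n, 1)), rng.normal(size=(n, 1))

print(dcov_sq_mc(dependent))      # clearly positive
print(dcov_sq_mc(independent))    # close to zero, as Theorem 3.2 predicts
```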

Before establishing estimators for 𝒱2(X,Y){\mathcal{V}}^{2}(X,Y), we remark briefly on the assumptions necessary for the existence of the distance covariance.

Corollary 3.4.

Suppose that $\mathbb{E}\|X\|<\infty$ and $\mathbb{E}\|Y\|<\infty$. Then $\mathcal{V}^2(X,Y)<\infty$.

Proof. From the representation (3.2), and since $\mathbb{E}\,\|X_1-X_2\|\,\|Y_1-Y_3\|=\mathbb{E}\,\|X_1-X_2\|\,\|Y_2-Y_3\|$ by exchangeability of the i.i.d. copies, we directly obtain the alternative representation

\[
\mathcal{V}^{2}(X,Y)=\mathbb{E}\big[\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{2}\|-\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{3}\|-\|X_{1}-X_{2}\|\,\|Y_{2}-Y_{3}\|+\|X_{1}-X_{2}\|\,\|Y_{3}-Y_{4}\|\big]. \tag{3.9}
\]

Applying the triangle inequality yields

\[
\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{2}\|-\|X_{1}-X_{2}\|\,\|Y_{1}-Y_{3}\|-\|X_{1}-X_{2}\|\,\|Y_{2}-Y_{3}\|\leq 0,
\]

and hence

\begin{align*}
0\leq\mathcal{V}^{2}(X,Y) &\leq\mathbb{E}\,\|X_{1}-X_{2}\|\,\|Y_{3}-Y_{4}\| \\
&=\mathbb{E}\,\|X_{1}-X_{2}\|\cdot\mathbb{E}\,\|Y_{3}-Y_{4}\|\leq 4\,\mathbb{E}\|X\|\,\mathbb{E}\|Y\|,
\end{align*}

where the equality holds because $(X_1,X_2)$ and $(Y_3,Y_4)$ are independent, and the last inequality follows again from the triangle inequality.

4 Asymptotic theory for estimating the distance covariance

Using the representation of the distance covariance given in (3.2), it is straightforward to derive a U-statistic estimator for $\mathcal{V}^2(X,Y)$. Specifically, we define the symmetric kernel function

\[
h\big((X_{1},Y_{1}),\ldots,(X_{4},Y_{4})\big)=\frac{1}{24}\sum\big(\|X_{i}-X_{j}\|\,\|Y_{i}-Y_{j}\|-2\,\|X_{i}-X_{j}\|\,\|Y_{i}-Y_{k}\|+\|X_{i}-X_{j}\|\,\|Y_{k}-Y_{l}\|\big), \tag{4.1}
\]

where the sum is over all $i,j,k,l\in\{1,2,3,4\}$ such that $i$, $j$, $k$, and $l$ are distinct.

It follows from the representation (3.2) that each of the 24 summands in (4.1) has expectation $\mathcal{V}^2(X,Y)$. Therefore,

\[
\mathbb{E}\,h\big((X_{1},Y_{1}),\ldots,(X_{4},Y_{4})\big)=\mathcal{V}^{2}(X,Y).
\]

Letting $(X_1,Y_1),\ldots,(X_n,Y_n)$ be a random sample from $(X,Y)$, we find that an unbiased estimator of $\mathcal{V}^2(X,Y)$ is

\[
\widehat{\Omega}=\binom{n}{4}^{-1}\sum_{1\leq i<j<k<l\leq n}h\big((X_{i},Y_{i}),(X_{j},Y_{j}),(X_{k},Y_{k}),(X_{l},Y_{l})\big). \tag{4.2}
\]
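For small samples, (4.1) and (4.2) can be coded directly. The following sketch (our own illustration, assuming NumPy; the function names are ours) enumerates the 24 permutations in the kernel and the $\binom{n}{4}$ subsets in the U-statistic, so it is $O(n^4)$ and suitable only for small $n$:

```python
import itertools
import numpy as np

def kernel_h(xs, ys):
    """Symmetric kernel (4.1), evaluated on four points xs[0..3], ys[0..3]."""
    d = lambda u, v: float(np.linalg.norm(u - v))
    total = 0.0
    for i, j, k, l in itertools.permutations(range(4)):   # 24 distinct tuples
        total += (d(xs[i], xs[j]) * d(ys[i], ys[j])
                  - 2.0 * d(xs[i], xs[j]) * d(ys[i], ys[k])
                  + d(xs[i], xs[j]) * d(ys[k], ys[l]))
    return total / 24.0

def omega_hat(x, y):
    """Unbiased U-statistic estimator (4.2) of V^2(X, Y); O(n^4)."""
    x, y = np.asarray(x), np.asarray(y)
    combos = list(itertools.combinations(range(len(x)), 4))
    return sum(kernel_h(x[list(c)], y[list(c)]) for c in combos) / len(combos)
```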

We can now derive the consistency and asymptotic distribution of this estimator using standard U-statistic theory (Lee [13]). For this purpose, let us define

\[
h_{1}(x,y)=\mathbb{E}\big[h\big((x,y),(X_{2},Y_{2}),(X_{3},Y_{3}),(X_{4},Y_{4})\big)\big]
\]

and

\[
h_{2}\big((x_{1},y_{1}),(x_{2},y_{2})\big)=\mathbb{E}\big[h\big((x_{1},y_{1}),(x_{2},y_{2}),(X_{3},Y_{3}),(X_{4},Y_{4})\big)\big].
\]

The preceding formulas and a classical result on U-statistics (Hoeffding [9, Theorem 7.1]) lead immediately to a proof of the following result.

Theorem 4.1.

Suppose that $0<\operatorname{Var}(h_1(X,Y))<\infty$. Then $\sqrt{n}\,\big(\widehat{\Omega}-\mathcal{V}^2(X,Y)\big)\stackrel{\mathcal{D}}{\longrightarrow}Z$ as $n\to\infty$, where $Z\sim\mathcal{N}\big(0,16\operatorname{Var}(h_1(X,Y))\big)$.

Except for pathological examples, Theorem 4.1 provides the asymptotic distribution of $\widehat{\Omega}$ when $X$ and $Y$ are dependent. For the crucial case of independent $X$ and $Y$, however, the asymptotic distribution of $\sqrt{n}\,\big(\widehat{\Omega}-\mathcal{V}^2(X,Y)\big)$ is degenerate; in this case, the asymptotic distribution can be derived using results on first-order degenerate U-statistics (Lee [13, Section 3.2.2]).

Lemma 4.2.

Let $X$ and $Y$ be independent, and let $(X_1,Y_1)$ and $(X_2,Y_2)$ be i.i.d. copies of $(X,Y)$. Then $h_1(x,y)\equiv 0$ and $\operatorname{Var}\big(h_2((X_1,Y_1),(X_2,Y_2))\big)=\mathcal{V}^2(X,X)\,\mathcal{V}^2(Y,Y)/36$.

The proof follows by elementary, but lengthy, transformations and may be left as an exercise for students. A complete proof is provided by Huang and Huo [10, Appendices B.6 and B.7].

Finally, the following result follows directly from Lemma 4.2 and classical results on the distributions of first-order degenerate U-statistics (Lee [13, Section 3.2.2]).

Theorem 4.3.

Let $X$ and $Y$ be independent, with $\mathbb{E}\|X\|<\infty$ and $\mathbb{E}\|Y\|<\infty$. Then,

\[
n\,\big(\widehat{\Omega}-\mathcal{V}^{2}(X,Y)\big)\stackrel{\mathcal{D}}{\longrightarrow}6\sum_{i=1}^{\infty}\lambda_{i}(Z_{i}^{2}-1), \tag{4.3}
\]

as $n\to\infty$, where $Z_1,Z_2,\ldots$ are i.i.d. standard normal random variables and $\lambda_1,\lambda_2,\ldots$ are the eigenvalues of the integral equation

\[
\mathbb{E}\big[h_{2}\big((x_{1},y_{1}),(X_{2},Y_{2})\big)\,f(X_{2},Y_{2})\big]=\lambda\,f(x_{1},y_{1}).
\]

5 Concluding Remarks

In this article, we have derived under minimal technical requirements the most important statistical properties of the distance covariance. From this starting point, there are several additional interesting topics that can be explored, e.g., as instructional assignments:

(i) The estimator (4.2) requires $O(n^4)$ operations and is computationally inefficient. A straightforward combinatorial computation shows that an $O(n^2)$ estimator of $\mathcal{V}^2(X,Y)$ is given by

\begin{align*}
\widetilde{\Omega}=\frac{1}{n(n-3)}\Bigg[&\sum_{i,j=1}^{n}\|X_{i}-X_{j}\|\,\|Y_{i}-Y_{j}\| \\
&+\frac{1}{(n-1)(n-2)}\sum_{i,j=1}^{n}\|X_{i}-X_{j}\|\cdot\sum_{i,j=1}^{n}\|Y_{i}-Y_{j}\| \\
&-\frac{2}{n-2}\sum_{i,j,k=1}^{n}\|X_{i}-X_{j}\|\,\|Y_{i}-Y_{k}\|\Bigg]; \tag{5.1}
\end{align*}

see Huo and Székely [11].
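A minimal NumPy/SciPy sketch of (5.1), together with a permutation test of the kind used for Figures 1 and 2, might look as follows (our own illustration; the function names are ours, and the paper's figures used 100,000 permutations rather than the default below):

```python
import numpy as np
from scipy.spatial.distance import cdist

def omega_tilde(x, y):
    """O(n^2) estimator (5.1); x has shape (n, p), y has shape (n, q), n >= 4."""
    n = len(x)
    a, b = cdist(x, x), cdist(y, y)               # pairwise Euclidean distances
    t1 = (a * b).sum()                            # sum_{i,j} a_ij * b_ij
    t2 = a.sum() * b.sum() / ((n - 1) * (n - 2))  # product of grand sums
    t3 = (a.sum(axis=1) * b.sum(axis=1)).sum()    # sum_{i,j,k} a_ij * b_ik
    return (t1 + t2 - 2.0 * t3 / (n - 2)) / (n * (n - 3))

def perm_pvalue(x, y, n_perm=1_000, seed=0):
    """Permutation test of independence based on omega_tilde."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x), np.asarray(y)
    obs = omega_tilde(x, y)
    exceed = sum(omega_tilde(x, y[rng.permutation(len(y))]) >= obs
                 for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)
```

Permuting the rows of $y$ breaks any dependence with $x$ while preserving the marginal distributions, so values of $\widetilde{\Omega}$ that are large relative to the permutation distribution constitute evidence against independence.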

(ii) We remark that although no assumption was provided in Theorem 4.1 to ensure that $\operatorname{Var}(h_1(X,Y))<\infty$, it can be shown that this condition holds whenever $X$ and $Y$ have finite second moments; see Edelmann, et al. [6].

(iii) Important contributions of Székely, et al. [24] and Székely and Rizzo [22] are based on the distance correlation coefficient, which is defined as the nonnegative square-root of

\[
\mathcal{R}^{2}(X,Y)=\frac{\mathcal{V}^{2}(X,Y)}{\sqrt{\mathcal{V}^{2}(X,X)\,\mathcal{V}^{2}(Y,Y)}}.
\]

Numerous properties of $\mathcal{R}^2(X,Y)$ (see, e.g., Székely, et al. [24, Theorem 3]) may be derived using the methods that we have presented here.

We also remark on the fundamental integral, (2.1), that underpins the entire distance covariance and distance correlation theory. As noted by Dueck, et al. [4], the fundamental integral and variants of it have appeared in functional analysis (Gelfand and Shilov [8, pp. 192–195]), in Fourier analysis (Stein [21, pp. 140 and 263]), and in the theory of fractional Brownian motion on generalized random fields (Chilès and Delfiner [2, p. 266]; Reed, et al. [16]).

The fundamental integral also extends further. For $m\in\mathbb{N}$ and $v\in\mathbb{R}$, define

\[
\cos_{m}(v):=\sum_{j=0}^{m-1}(-1)^{j}\,\frac{v^{2j}}{(2j)!}, \tag{5.2}
\]

the truncated Maclaurin expansion of the cosine function. Dueck, et al. [4] proved that for $\alpha\in\mathbb{C}$,

\[
\int_{\mathbb{R}^{p}}\frac{\cos_{m}(\langle t,x\rangle)-\cos(\langle t,x\rangle)}{\|t\|^{p+\alpha}}\,\mathrm{d}t=\frac{2\pi^{p/2}\,\Gamma(1-\alpha/2)}{\alpha\,2^{\alpha}\,\Gamma\big((p+\alpha)/2\big)}\,\|x\|^{\alpha}, \tag{5.3}
\]

with absolute convergence if and only if $2(m-1)<\Re(\alpha)<2m$. For $m=1$ and $\alpha=1$, (5.3) reduces to (2.1). Further, for $m=1$ and $0<\alpha<2$, the integral (5.3) provides the Lévy-Khintchine representation of the negative definite function $\|x\|^{\alpha}$, thereby linking the fundamental integral to the probability theory of the stable distributions.

In conclusion, the statistical analysis of data through distance covariance and distance correlation theory, by means of the fundamental integral, is seen to be linked closely to many areas of the mathematical sciences.


Acknowledgements. D. Edelmann gratefully acknowledges financial support from the Deutsche Forschungsgemeinschaft (Grant number 417754611). The authors are grateful to the editor of the special issue and to a reviewer for comments on the manuscript.

References

  • [2] Chilès, J. P., and Delfiner, P. (2012). Geostatistics: Modeling Spatial Uncertainty, second edition. Wiley, New York.
  • [3] Dueck, J., D. Edelmann, T. Gneiting, and D. Richards (2014). The affinely invariant distance correlation. Bernoulli, 20, 2305–2330.
  • [4] Dueck, J., Edelmann, D., and Richards, D. (2015). A generalization of an integral arising in the theory of distance correlation. Statist. Probab. Lett., 97, 116–119.
  • [5] Edelmann, D., K. Fokianos, and M. Pitsillou (2019). An updated literature review of distance correlation and its applications to time series. Internat. Statist. Rev., 87, 237–262.
  • [6] Edelmann, D., D. Richards, and D. Vogel (2020). The distance standard deviation. Ann. Statist., 48, 3395–3416.
  • [7] Fiedler, J. (2016). Distances, Gegenbauer Expansions, Curls, and Dimples: On Dependence Measures for Random Fields. Ph.D. dissertation, Heidelberg University.
  • [8] Gelfand, I. M., and Shilov, G. E. (1964). Generalized Functions, Volume 1. Academic Press, New York.
  • [9] Hoeffding, W. (1948). A class of statistics with asymptotically normal distribution. Ann. Math. Statist., 19, 293–325.
  • [10] Huang, C., and X. Huo (2017). A statistically and numerically efficient independence test based on random projections and distance covariance. Preprint, arXiv:1701.06054.
  • [11] Huo, X., and G. J. Székely (2016). Fast computing for distance covariance. Technometrics, 58, 435–447.
  • [12] Kong, J., B. E. Klein, R. Klein, K. E. Lee, and G. Wahba (2012). Using distance correlation and SS-ANOVA to assess associations of familial relationships, lifestyle factors, diseases, and mortality. Proc. Natl. Acad. Sci. U.S.A., 109, 20352–20357.
  • [13] Lee, A. J. (2019). U-Statistics: Theory and Practice. CRC Press, Boca Raton, FL.
  • [14] Martínez-Gómez, E., M. T. Richards, and D. St. P. Richards (2014). Distance correlation methods for discovering associations in large astrophysical databases. Astrophys. J. 781, 39 (11 pp.).
  • [15] Ohana-Levi, N., S. Munitz, A. Ben-Gal, A. Schwartz, A. Peeters, and Y. Netzer (2020). Multiseasonal grapevine water consumption - Drivers and forecasting. Agric. For. Meteorol., 280, 107796 (12 pp.).
  • [16] Reed, I. S., Lee, P. C., and Truong, T. K. (1995). Spectral representation of fractional Brownian motion in $n$ dimensions and its properties. IEEE Trans. Inform. Theory, 41, 1439–1451.
  • [17] Richards, D. St. P. (2017). Distance correlation: A new tool for detecting association and measuring correlation between data sets. Notices Amer. Math. Soc., 64, 16–18.
  • [18] Richards, M. T., D. St. P. Richards, and E. Martínez-Gómez (2014). Interpreting the distance correlation results for the COMBO-17 survey. Astrophys. J., 784, L34 (5 pp.).
  • [19] Sejdinovic, D., B. Sriperumbudur, A. Gretton, and K. Fukumizu (2013). Equivalence of distance-based and RKHS-based statistics in hypothesis testing. Ann. Statist., 41, 2263–2291.
  • [20] Spivak, M. (1994). Calculus, third edition. Publish or Perish, Houston, TX.
  • [21] Stein, E. M. (1970). Singular Integrals and Differentiability Properties of Functions. Princeton University Press, Princeton, N.J.
  • [22] Székely, G. J., and M. L. Rizzo (2009). Brownian distance covariance. Ann. Appl. Statist., 3, 1236–1265.
  • [23] Székely, G. J., and M. L. Rizzo (2014). Partial distance correlation with methods for dissimilarities. Ann. Statist., 42, 2382–2412.
  • [24] Székely, G. J., M. L. Rizzo, and N. K. Bakirov (2007). Measuring and testing independence by correlation of distances. Ann. Statist., 35, 2769–2794.
  • [25] Van der Vaart, A. W. (2000). Asymptotic Statistics. Cambridge University Press, New York.
  • [26] Van De Vijver, M. J., Y. D. He, L. J. Van’t Veer, et al. (2002). A gene-expression signature as a predictor of survival in breast cancer. N. Engl. J. Med., 347, 1999–2009.
  • [27] Zhang, X., M. Kano, and Y. Li (2018). Quality-relevant independent component regression model for virtual sensing application. Computers & Chem. Eng., 115, 141–149.
  • [28] Zhou, Z. (2012). Measuring nonlinear dependence in time-series, a distance correlation approach. J. Time Series Analysis, 33, 438–457.