A Basic Treatment of the Distance Covariance
Abstract
The distance covariance of Székely, et al. [24] and Székely and Rizzo [22], a powerful measure of dependence between sets of multivariate random variables, has the crucial feature that it equals zero if and only if the sets are mutually independent. Hence the distance covariance can be applied to multivariate data to detect arbitrary types of non-linear associations between sets of variables. We provide in this article a basic, albeit rigorous, introductory treatment of the distance covariance. Our investigations yield an approach that can be used as the foundation for presentation of this important and timely topic even in advanced undergraduate- or junior graduate-level courses on mathematical statistics.
1 Introduction
The distance covariance, a measure of dependence between multivariate random variables $X$ and $Y$, was introduced by Székely, Rizzo, and Bakirov [24] and has since received extensive attention in the statistical literature. A crucial feature of the distance covariance is that it equals zero if and only if $X$ and $Y$ are mutually independent. Hence the distance covariance is sensitive to arbitrary dependencies; this is in contrast to the classical covariance, which is generally capable of detecting only linear dependencies. This property is illustrated in Figure 1, which shows that tests based on the distance covariance are able to detect numerous types of non-linear associations even when tests based on the classical covariance fail to detect such statistical relationships.

While the dependencies illustrated in Figure 1 are purely illustrative examples, the sensitivity of the distance covariance to arbitrary dependencies can be very useful in applications. This is demonstrated in Figure 2, where we show three dependencies between expression values of genes in the breast cancer data set of Van De Vijver, et al. [26]; all these dependencies can be detected by the distance covariance but not by the classical covariance.
For comparisons of the distance covariance and classical covariance in applications to data, see the examples given by Székely and Rizzo [22, Section 5.2] and Dueck, et al. [3, Section 5]; for extensive numerical experiments and fast algorithms for computing the distance covariance, see Huo and Székely [11, Section 5]. We also refer to Sejdinovic, et al. [19], Dueck, et al. [3], Székely and Rizzo [22, 23], Huo and Székely [11], and Edelmann, et al. [6], representing only a few of the many authors who have given further theoretical results on the distance covariance and distance correlation coefficients; and to Zhou [28], Fiedler [7], and Edelmann, et al. [5] for applications to time series analysis. Many applications to data analysis of the distance correlation coefficient and the distance covariance are now available, including: Kong, et al. [12] on data in sociology, Martínez-Gómez, et al. [14] and Richards, et al. [18] on astrophysical databases and galaxy clusters, Dueck, et al. [3] on time series analyses of wind vectors at electricity-generating facilities, Richards [17] on the relationship between the strength of gun control laws and firearm-related homicides, Zhang, et al. [27] for remote sensing applications, and Ohana-Levi, et al. [15] on grapevine transpiration.

The original papers of Székely, et al. [24] and Székely and Rizzo [22] are now widely recognized as seminal and important contributions to measuring dependence between sets of random variables; however, the exposition therein includes some ingenious arguments that may make the material challenging to readers not having an advanced background in mathematical statistics. With the benefit of hindsight, we are able to provide in this article a simpler, albeit mathematically rigorous, introduction to the distance covariance that can be taught even in an undergraduate-level course covering the basic theory of U-statistics. Other than standard U-statistics theory and some well-known properties of characteristic functions, the requirements for our treatment are a knowledge of multidimensional integration and trigonometric inequalities, as covered in a course on undergraduate-level advanced calculus. Consequently, we hope that this treatment will prove to be beneficial to non-mathematical statisticians.
Our presentation introduces the distance covariance as an important alternative to the classical covariance. Moreover, the distance covariance constitutes a particularly interesting example of a U-statistic, since it includes both the “non-degenerate” and the “first-order degenerate” cases of the asymptotic distribution theory of U-statistics; these correspond to the situations in which $X$ and $Y$ are dependent, leading to the non-degenerate case, or $X$ and $Y$ are independent, leading to the first-order degenerate case of the asymptotic theory.
Throughout the exposition, $\|\cdot\|$ denotes the Euclidean norm and $\langle\cdot,\cdot\rangle$ the corresponding inner product. Also, we denote by $|\cdot|$ the modulus in $\mathbb{C}$ or the absolute value in $\mathbb{R}$, and the imaginary unit is $\mathrm{i} = \sqrt{-1}$.
2 The fundamental integral of distance covariance theory
Following Székely, et al. [24], we first establish a closed-form expression for an integral that plays a central role in this article, leading to the equivalence of two crucial expressions for the distance covariance. The first expression displays the distance covariance as an integrated distance between the joint characteristic function of $(X, Y)$ and the product of the marginal characteristic functions of $X$ and $Y$; we will deduce from this expression that the distance covariance equals zero if and only if $X$ and $Y$ are independent. The second expression allows us to derive consistent distance covariance estimators that are expressible as polynomials in the distances between random samples.
Since the ability to characterize independence and the existence of easily computable estimators are arguably the most important properties of the distance covariance, we will refer to this integral as the fundamental integral of distance covariance.
Lemma 2.1.
For $x \in \mathbb{R}^d$,
\[
\int_{\mathbb{R}^d} \frac{1 - \cos\langle t, x\rangle}{\|t\|^{d+1}}\,\mathrm{d}t = c_d\,\|x\|,
\tag{2.1}
\]
where $c_d = \pi^{(d+1)/2}/\Gamma\big(\tfrac{1}{2}(d+1)\big)$.
Proof. Since (2.1) is valid for $x = 0$, we need only treat the case in which $x \neq 0$.
Denote by $I_d(x)$ the integral in (2.1). For $d = 1$, replacing $t$ by $t/|x|$ yields
\[
I_1(x) = |x| \int_{-\infty}^{\infty} \frac{1 - \cos t}{t^2}\,\mathrm{d}t.
\tag{2.2}
\]
Denoting the latter integral in (2.2) by $K_1$, it follows by integration-by-parts that
\[
K_1 = \int_{-\infty}^{\infty} \frac{\sin t}{t}\,\mathrm{d}t = \pi,
\tag{2.3}
\]
the last equality being classical in calculus (Spivak [20, Chapter 19, Problem 43]).
For general $d$, note that $I_d$ is invariant under orthogonal transformations of $x$: for any orthogonal $d \times d$ matrix $Q$,
\[
I_d(Qx) = \int_{\mathbb{R}^d} \frac{1 - \cos\langle t, Qx\rangle}{\|t\|^{d+1}}\,\mathrm{d}t
= \int_{\mathbb{R}^d} \frac{1 - \cos\langle Qu, Qx\rangle}{\|Qu\|^{d+1}}\,\mathrm{d}u
= I_d(x),
\]
where the first equality follows from the transformation $t = Qu$, which leaves the Lebesgue measure unchanged; and the second equality holds because the norm and the inner product are orthogonally invariant. Therefore, in evaluating $I_d(x)$ we may replace $x$ by $(\|x\|, 0, \ldots, 0)$; letting $t = (t_1, \ldots, t_d)$, we obtain
\[
I_d(x) = \int_{\mathbb{R}^d} \frac{1 - \cos(t_1 \|x\|)}{\|t\|^{d+1}}\,\mathrm{d}t
= \|x\| \int_{\mathbb{R}^d} \frac{1 - \cos t_1}{\|t\|^{d+1}}\,\mathrm{d}t,
\tag{2.4}
\]
the last equality obtained by replacing $t_j$ by $t_j/\|x\|$, $j = 1, \ldots, d$.
Denoting by $K_d$ the latter integral in (2.4), we substitute in that integral $t_d = \|s\| v$, where $s = (t_1, \ldots, t_{d-1})$ and $v \in \mathbb{R}$. As the Jacobian of this transformation is $\|s\|$, we obtain
\[
K_d = \int_{\mathbb{R}^{d-1}} \frac{1 - \cos t_1}{\|s\|^{d}}\,\mathrm{d}s \int_{-\infty}^{\infty} (1 + v^2)^{-(d+1)/2}\,\mathrm{d}v
= K_{d-1} \int_{-\infty}^{\infty} (1 + v^2)^{-(d+1)/2}\,\mathrm{d}v.
\tag{2.5}
\]
As the remaining integral in (2.5) is the familiar normalizing constant of the Student's $t$-distribution on $d$ degrees-of-freedom, we obtain
\[
K_d = \frac{\pi^{1/2}\,\Gamma(d/2)}{\Gamma\big(\tfrac{1}{2}(d+1)\big)}\,K_{d-1}.
\]
Starting with $K_1 = \pi$, we solve this recursive equation for $K_d$, obtaining $K_d = c_d$ and hence (2.1). ∎
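The closed form in Lemma 2.1 can be checked numerically. The following sketch (our own illustration, not part of the original development; the helper names are ours) verifies the case $d = 1$, where (2.1) gives $I_1(x) = \pi |x|$, and confirms that the stated constant $c_d$ satisfies the recursion from the proof. The oscillatory tail of the integral is handled exactly via integration by parts.

```python
# Numerical check of Lemma 2.1 (an illustrative sketch; helper names are ours).
# For d = 1, (2.1) asserts that K_1 = int_{-inf}^{inf} (1 - cos u)/u^2 du = pi.
# We integrate over (0, M) by quadrature, writing 1 - cos u = 2 sin^2(u/2) to
# avoid cancellation, and evaluate the tail over (M, inf) exactly via
# integration by parts: int_M^inf (1 - cos u)/u^2 du = (1 - cos M)/M + pi/2 - Si(M).
import numpy as np
from scipy.integrate import quad
from scipy.special import sici, gamma

M = 50.0
head, _ = quad(lambda u: 2.0 * np.sin(u / 2.0) ** 2 / u ** 2, 0.0, M)
tail = (1.0 - np.cos(M)) / M + np.pi / 2.0 - sici(M)[0]
K1 = 2.0 * (head + tail)        # the integrand is even in u
print(K1, np.pi)                # both print 3.14159...

# The closed form c_d = pi^((d+1)/2) / Gamma((d+1)/2) satisfies the recursion
# c_d = sqrt(pi) Gamma(d/2) / Gamma((d+1)/2) * c_{d-1} obtained in the proof.
def c(d):
    return np.pi ** ((d + 1) / 2.0) / gamma((d + 1) / 2.0)

for d in range(2, 6):
    print(d, c(d), np.sqrt(np.pi) * gamma(d / 2.0) / gamma((d + 1) / 2.0) * c(d - 1))
```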
3 Two representations for the distance covariance
We now introduce the representations of the distance covariance mentioned above. Following Székely, et al. [24], we define the distance covariance through its characteristic function representation. For jointly distributed random vectors $X \in \mathbb{R}^p$ and $Y \in \mathbb{R}^q$, let $f_{X,Y}$ be the joint characteristic function of $X$ and $Y$, and let $f_X$ and $f_Y$ be the corresponding marginal characteristic functions.
Definition 3.1.
The distance covariance $\mathcal{V}(X, Y)$ between $X$ and $Y$ is defined as the nonnegative square-root of
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q} \int_{\mathbb{R}^{p+q}} \frac{|f_{X,Y}(t, s) - f_X(t)\,f_Y(s)|^2}{\|t\|^{p+1}\,\|s\|^{q+1}}\,\mathrm{d}t\,\mathrm{d}s,
\tag{3.1}
\]
where $c_d$ is the normalizing constant in (2.1).
As the integrand in (3.1) is nonnegative, it follows that $\mathcal{V}^2(X, Y) \geq 0$. Further, we will show in Corollary 3.4 that $\mathcal{V}^2(X, Y) < \infty$ whenever $X$ and $Y$ have finite first moments.
An advantage of the representation (3.1) is that it directly implies one of the most important properties of the distance covariance, viz., the characterization of independence.
Theorem 3.2.
For all $X$ and $Y$, $\mathcal{V}^2(X, Y) = 0$ if and only if $X$ and $Y$ are independent.
Proof. If $X$ and $Y$ are independent then $f_{X,Y}(t, s) = f_X(t)\,f_Y(s)$ for all $t$ and $s$; hence $\mathcal{V}^2(X, Y) = 0$.
Conversely, if $X$ and $Y$ are not independent then the functions $f_{X,Y}$ and $f_X f_Y$ are not identical (Van der Vaart [25, Lemma 2.15]). Since characteristic functions are continuous, there exists an open set $U \subseteq \mathbb{R}^{p+q}$ such that $|f_{X,Y}(t, s) - f_X(t)\,f_Y(s)| > 0$ for all $(t, s) \in U$. Hence, by (3.1), $\mathcal{V}^2(X, Y) > 0$. ∎
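The dichotomy in Theorem 3.2 can be made tangible with empirical characteristic functions. In the following sketch (our own, with illustrative choices of $(t, s)$ and distributions), the estimated integrand of (3.1) is negligible for independent samples but clearly positive for a dependent pair that the classical covariance cannot distinguish from independence, since $\mathrm{Cov}(X, X^2) = 0$ for standard normal $X$.

```python
# Estimating |f_{X,Y}(t,s) - f_X(t) f_Y(s)| by empirical characteristic
# functions (an illustrative sketch; the function and variable names are ours).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)
y_indep = rng.standard_normal(n)   # independent of x
y_dep = x ** 2                     # dependent on x, although Cov(x, x**2) = 0

def ecf_gap(x, y, t, s):
    """|empirical joint c.f. - product of empirical marginal c.f.s| at (t, s)."""
    joint = np.mean(np.exp(1j * (t * x + s * y)))
    marg = np.mean(np.exp(1j * t * x)) * np.mean(np.exp(1j * s * y))
    return abs(joint - marg)

t, s = 1.0, 0.5
print(ecf_gap(x, y_indep, t, s))   # ~0, of order n^(-1/2)
print(ecf_gap(x, y_dep, t, s))     # clearly bounded away from 0
```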
For the purpose of deriving estimators for $\mathcal{V}^2(X, Y)$, we now apply Lemma 2.1 to obtain a second representation of the distance covariance.
Theorem 3.3.
Suppose that $(X', Y')$ and $(X'', Y'')$ are independent, identically distributed (i.i.d.) copies of $(X, Y)$. Then
\[
\mathcal{V}^2(X, Y) = E\big[\|X - X'\|\,\|Y - Y'\|\big] + E\|X - X'\| \cdot E\|Y - Y'\| - 2\,E\big[\|X - X'\|\,\|Y - Y''\|\big].
\tag{3.2}
\]
Proof. First, we observe that the numerator in the integrand in (3.1) equals
\[
|f_{X,Y}(t,s)|^2 - 2\,\mathrm{Re}\big[f_{X,Y}(t,s)\,\overline{f_X(t)\,f_Y(s)}\big] + |f_X(t)|^2\,|f_Y(s)|^2
= E\big[e^{\mathrm{i}(\langle t, X - X'\rangle + \langle s, Y - Y'\rangle)}\big]
- 2\,\mathrm{Re}\,E\big[e^{\mathrm{i}(\langle t, X - X'\rangle + \langle s, Y - Y''\rangle)}\big]
+ E\big[e^{\mathrm{i}\langle t, X - X'\rangle}\big]\,E\big[e^{\mathrm{i}\langle s, Y - Y'\rangle}\big].
\]
Since the latter expression is real, any term of the form $e^{\mathrm{i}u}$, $u \in \mathbb{R}$, can be replaced by $\cos u$. Hence, by (3.1),
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q} \int_{\mathbb{R}^{p+q}} \frac{g_1(t,s) - 2\,g_2(t,s) + g_3(t,s)}{\|t\|^{p+1}\,\|s\|^{q+1}}\,\mathrm{d}t\,\mathrm{d}s,
\tag{3.3}
\]
where, for each $(t, s) \in \mathbb{R}^{p+q}$,
\[
g_1(t,s) = E\big[\cos(\langle t, X - X'\rangle + \langle s, Y - Y'\rangle)\big], \quad
g_2(t,s) = E\big[\cos(\langle t, X - X'\rangle + \langle s, Y - Y''\rangle)\big], \quad
g_3(t,s) = E\big[\cos\langle t, X - X'\rangle\big]\,E\big[\cos\langle s, Y - Y'\rangle\big].
\tag{3.4}
\]
Replacing $s$ by $-s$ in (3.3), we also obtain
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q} \int_{\mathbb{R}^{p+q}} \frac{g_1(t,-s) - 2\,g_2(t,-s) + g_3(t,-s)}{\|t\|^{p+1}\,\|s\|^{q+1}}\,\mathrm{d}t\,\mathrm{d}s,
\tag{3.5}
\]
and by adding (3.3) and (3.5), we find that
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q} \int_{\mathbb{R}^{p+q}} \frac{h_1(t,s) - 2\,h_2(t,s) + h_3(t,s)}{\|t\|^{p+1}\,\|s\|^{q+1}}\,\mathrm{d}t\,\mathrm{d}s,
\]
where for each $(t, s) \in \mathbb{R}^{p+q}$ and $k = 1, 2, 3$,
\[
h_k(t,s) = \tfrac{1}{2}\,\big[g_k(t,s) + g_k(t,-s)\big].
\tag{3.6}
\]
On applying to (3.4) and (3.6) the trigonometric identity,
\[
\cos(u + v) + \cos(u - v) = 2\,\cos u\,\cos v, \qquad u, v \in \mathbb{R},
\]
we deduce that
\[
h_1(t,s) = E\big[\cos\langle t, X - X'\rangle\,\cos\langle s, Y - Y'\rangle\big], \quad
h_2(t,s) = E\big[\cos\langle t, X - X'\rangle\,\cos\langle s, Y - Y''\rangle\big], \quad
h_3(t,s) = E\big[\cos\langle t, X - X'\rangle\big]\,E\big[\cos\langle s, Y - Y'\rangle\big].
\tag{3.7}
\]
For $u, v \in \mathbb{R}$, we apply to (3.7) the elementary identity,
\[
\cos u\,\cos v = 1 - (1 - \cos u) - (1 - \cos v) + (1 - \cos u)(1 - \cos v);
\tag{3.8}
\]
then we obtain
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q} \int_{\mathbb{R}^{p+q}} \frac{1}{\|t\|^{p+1}\,\|s\|^{q+1}} \Big( E\big[(1 - \cos\langle t, X - X'\rangle)(1 - \cos\langle s, Y - Y'\rangle)\big] + E\big[1 - \cos\langle t, X - X'\rangle\big]\,E\big[1 - \cos\langle s, Y - Y'\rangle\big] - 2\,E\big[(1 - \cos\langle t, X - X'\rangle)(1 - \cos\langle s, Y - Y''\rangle)\big] \Big)\,\mathrm{d}t\,\mathrm{d}s,
\]
which is obtained by decomposing all summands on the right-hand side using (3.8) and observing that all terms which are not of the form $(1 - \cos u)(1 - \cos v)$ cancel each other. By applying the Fubini-Tonelli Theorem and the linearity of expectation and integration, we obtain
\[
\mathcal{V}^2(X, Y) = \frac{1}{c_p c_q}\,\Big( E\big[I_p(X - X')\,I_q(Y - Y')\big] + E\big[I_p(X - X')\big]\,E\big[I_q(Y - Y')\big] - 2\,E\big[I_p(X - X')\,I_q(Y - Y'')\big] \Big),
\]
where $I_p$ and $I_q$ denote the integral in (2.1) for dimensions $p$ and $q$, respectively. The proof is completed by applying Lemma 2.1 to calculate these three integrals. ∎
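The representation (3.2) is easy to probe by simulation. The sketch below (ours; the sampler functions and names are illustrative) estimates the three expectations in (3.2) from independent copies and shows the combination vanishing under independence while remaining positive under dependence, in accordance with Theorem 3.2.

```python
# Monte Carlo evaluation of the right-hand side of (3.2) in dimension p = q = 1
# (an illustrative sketch; names are ours).
import numpy as np

rng = np.random.default_rng(1)

def dcov_sq_mc(draw, m=200_000):
    """Estimate E|X-X'||Y-Y'| + E|X-X'| E|Y-Y'| - 2 E|X-X'||Y-Y''| from
    three independent copies of (X, Y)."""
    x0, y0 = draw(m)
    x1, y1 = draw(m)
    x2, y2 = draw(m)
    t1 = np.mean(np.abs(x0 - x1) * np.abs(y0 - y1))
    t2 = np.mean(np.abs(x0 - x1)) * np.mean(np.abs(y0 - y1))
    t3 = np.mean(np.abs(x0 - x1) * np.abs(y0 - y2))
    return t1 + t2 - 2.0 * t3

def indep(m):
    return rng.standard_normal(m), rng.standard_normal(m)

def dep(m):
    x = rng.standard_normal(m)
    return x, x ** 2

print(dcov_sq_mc(indep))   # ~0
print(dcov_sq_mc(dep))     # clearly positive
```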
Before establishing estimators for $\mathcal{V}^2(X, Y)$, we remark briefly on the assumptions necessary for the existence of the distance covariance.
Corollary 3.4.
Suppose that $E\|X\| < \infty$ and $E\|Y\| < \infty$. Then $\mathcal{V}^2(X, Y) < \infty$.
Proof. From the representation (3.2), we directly obtain the alternative representation
\[
\mathcal{V}^2(X, Y) = E\big[\big(\|X - X'\| - \|X - X''\|\big)\,\big(\|Y - Y'\| - \|Y' - Y'''\|\big)\big],
\tag{3.9}
\]
where $(X', Y')$, $(X'', Y'')$, and $(X''', Y''')$ are i.i.d. copies of $(X, Y)$. Applying the triangle inequality yields
\[
\mathcal{V}^2(X, Y) \leq E\big[\|X' - X''\|\,\|Y - Y'''\|\big] = E\|X - X'\| \cdot E\|Y - Y'\|,
\]
the equality holding because $\|X' - X''\|$ and $\|Y - Y'''\|$ are independent; and hence
\[
\mathcal{V}^2(X, Y) \leq 4\,E\|X\|\,E\|Y\| < \infty,
\]
where the last inequality follows again by the triangle inequality. ∎
4 Asymptotic theory for estimating the distance covariance
Using the representation of the distance covariance given in (3.2), it is straightforward to derive a U-statistic estimator for $\mathcal{V}^2(X, Y)$. Specifically, writing $z_j = (x_j, y_j) \in \mathbb{R}^{p+q}$, we define the symmetric kernel function
\[
h(z_1, z_2, z_3, z_4) = \frac{1}{4!} \sum_{(i_1, i_2, i_3, i_4)} \big[ \|x_{i_1} - x_{i_2}\|\,\|y_{i_1} - y_{i_2}\| + \|x_{i_1} - x_{i_2}\|\,\|y_{i_3} - y_{i_4}\| - 2\,\|x_{i_1} - x_{i_2}\|\,\|y_{i_1} - y_{i_3}\| \big],
\tag{4.1}
\]
where the sum is over all $(i_1, i_2, i_3, i_4)$ drawn from $\{1, 2, 3, 4\}$ such that $i_1$, $i_2$, $i_3$, and $i_4$ are distinct.
It follows from the representation (3.2) that, for i.i.d. copies $Z_1, \ldots, Z_4$ of $Z = (X, Y)$, each of the summands in (4.1) has expectation $\mathcal{V}^2(X, Y)$. Therefore,
\[
E\big[h(Z_1, Z_2, Z_3, Z_4)\big] = \mathcal{V}^2(X, Y).
\]
Letting $Z_1, \ldots, Z_n$ be a random sample from the population $Z = (X, Y)$, we find that an unbiased estimator of $\mathcal{V}^2(X, Y)$ is
\[
\widehat{\mathcal{V}}_n^2(X, Y) = \binom{n}{4}^{-1} \sum_{1 \leq i_1 < i_2 < i_3 < i_4 \leq n} h(Z_{i_1}, Z_{i_2}, Z_{i_3}, Z_{i_4}).
\tag{4.2}
\]
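For small samples, the U-statistic (4.2) can be evaluated directly from its definition. The following sketch (our own; restricted to scalar $X$ and $Y$ for brevity) exploits the fact that averaging the symmetrized kernel (4.1) over all 4-subsets is the same as averaging the three bracketed terms over all ordered distinct index 4-tuples.

```python
# Naive O(n^4) evaluation of the U-statistic (4.2) (an illustrative sketch;
# function names are ours, and x, y are scalar samples for simplicity).
import itertools
import numpy as np

def dcov_sq_ustat_naive(x, y):
    n = len(x)
    a = np.abs(x[:, None] - x[None, :])   # a_ij = |x_i - x_j|
    b = np.abs(y[:, None] - y[None, :])   # b_ij = |y_i - y_j|
    u1 = u2 = u3 = 0.0
    for i1, i2, i3, i4 in itertools.permutations(range(n), 4):
        u1 += a[i1, i2] * b[i1, i2]
        u2 += a[i1, i2] * b[i3, i4]
        u3 += a[i1, i2] * b[i1, i3]
    m = n * (n - 1) * (n - 2) * (n - 3)   # number of ordered distinct 4-tuples
    return (u1 + u2 - 2.0 * u3) / m

rng = np.random.default_rng(2)
x = rng.standard_normal(12)
y = x ** 2 + 0.1 * rng.standard_normal(12)
print(dcov_sq_ustat_naive(x, y))
```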
We can now derive the consistency and asymptotic distribution of this estimator using standard U-statistic theory (Lee [13]). For this purpose, let us define the projections
\[
h_1(z_1) = E\big[h(z_1, Z_2, Z_3, Z_4)\big], \qquad
h_2(z_1, z_2) = E\big[h(z_1, z_2, Z_3, Z_4)\big],
\]
and the corresponding variance components
\[
\sigma_1^2 = \mathrm{Var}\big(h_1(Z_1)\big), \qquad
\sigma_2^2 = \mathrm{Var}\big(h_2(Z_1, Z_2)\big).
\]
The preceding formulas and a classical result on U-statistics (Hoeffding [9, Theorem 7.1]) lead immediately to a proof of the following result.
Theorem 4.1.
Suppose that $\sigma_1^2 > 0$. Then $\sqrt{n}\,\big(\widehat{\mathcal{V}}_n^2(X, Y) - \mathcal{V}^2(X, Y)\big) \xrightarrow{\mathcal{D}} \mathcal{N}(0, \sigma^2)$ as $n \to \infty$, where $\sigma^2 = 16\,\sigma_1^2$.
Except for pathological examples, Theorem 4.1 provides the asymptotic distribution of $\widehat{\mathcal{V}}_n^2(X, Y)$ if $X$ and $Y$ are dependent. For the crucial case of independent $X$ and $Y$, however, the asymptotic distribution of $\widehat{\mathcal{V}}_n^2(X, Y)$ is degenerate; in this case, the asymptotic distribution can be derived using results on first-order degenerate U-statistics (Lee [13, Section 3.2.2]).
Lemma 4.2.
Let $X$ and $Y$ be independent, and let $(X', Y')$ and $(X'', Y'')$ be i.i.d. copies of $(X, Y)$. Then $\sigma_1^2 = 0$ and $\sigma_2^2 > 0$.
The proof follows by elementary, but lengthy, transformations and may be left as an exercise for students. A complete proof is provided by Huang and Huo [10, Appendices B.6 and B.7].
Finally, the following result follows directly from Lemma 4.2 and classical results on the distributions of first-order degenerate U-statistics (Lee [13, Section 3.2.2]).
Theorem 4.3.
Let $X$ and $Y$ be independent, with $E\|X\|^2 < \infty$ and $E\|Y\|^2 < \infty$. Then,
\[
n\,\widehat{\mathcal{V}}_n^2(X, Y) \xrightarrow{\mathcal{D}} 6 \sum_{j=1}^{\infty} \lambda_j\,(W_j^2 - 1)
\tag{4.3}
\]
as $n \to \infty$, where $W_1, W_2, \ldots$ are i.i.d. standard normal random variables and $\lambda_1, \lambda_2, \ldots$ are the eigenvalues of the integral equation
\[
E\big[h_2(z_1, Z_2)\,\phi(Z_2)\big] = \lambda\,\phi(z_1).
\]
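Since the eigenvalues $\lambda_j$ in (4.3) depend on the underlying distributions and are rarely available in closed form, independence tests based on $\widehat{\mathcal{V}}_n^2$ are, in practice, commonly calibrated by permutation. A minimal sketch (ours; any estimator of $\mathcal{V}^2(X,Y)$, such as the one sketched after (4.2), may be passed as `stat`):

```python
# Permutation test for independence using a distance covariance statistic
# (an illustrative sketch; names are ours).
import numpy as np

def perm_pvalue(x, y, stat, n_perm=999, seed=0):
    """One-sided permutation p-value: large values of stat indicate dependence."""
    rng = np.random.default_rng(seed)
    observed = stat(x, y)
    exceed = sum(stat(x, rng.permutation(y)) >= observed for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)

# Example usage: perm_pvalue(x, y, dcov_sq_ustat_naive) with the earlier sketch.
```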
5 Concluding Remarks
In this article, we have derived under minimal technical requirements the most important statistical properties of the distance covariance. From this starting point, there are several additional interesting topics that can be explored, e.g., as instructional assignments:
(i) The naive computation of the estimator (4.2) requires $O(n^4)$ operations and is computationally inefficient. A straightforward combinatorial computation shows that an equivalent, $O(n^2)$-computable form of the estimator (4.2) is given by
\[
\widehat{\mathcal{V}}_n^2(X, Y) = \frac{1}{n(n-3)} \bigg[ \sum_{i \neq j} a_{ij} b_{ij} - \frac{2}{n-2} \sum_{i=1}^{n} a_{i\cdot}\,b_{i\cdot} + \frac{a_{\cdot\cdot}\,b_{\cdot\cdot}}{(n-1)(n-2)} \bigg],
\tag{5.1}
\]
where $a_{ij} = \|X_i - X_j\|$, $b_{ij} = \|Y_i - Y_j\|$, $a_{i\cdot} = \sum_{j=1}^n a_{ij}$, $b_{i\cdot} = \sum_{j=1}^n b_{ij}$, $a_{\cdot\cdot} = \sum_{i,j=1}^n a_{ij}$, and $b_{\cdot\cdot} = \sum_{i,j=1}^n b_{ij}$;
see Huo and Székely [11].
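A direct implementation of (5.1) (our own sketch, using the notation above) runs in $O(n^2)$ time and reproduces the naive $O(n^4)$ evaluation of (4.2) up to floating-point error:

```python
# O(n^2) evaluation of (5.1) (an illustrative sketch; names are ours).
import numpy as np

def dcov_sq_ustat_fast(x, y):
    n = len(x)
    a = np.abs(x[:, None] - x[None, :])       # a_ij, with zero diagonal
    b = np.abs(y[:, None] - y[None, :])       # b_ij, with zero diagonal
    term1 = np.sum(a * b)                     # sum_{i != j} a_ij b_ij
    term2 = a.sum(axis=1) @ b.sum(axis=1)     # sum_i a_i. b_i.
    term3 = a.sum() * b.sum()                 # a_.. b_..
    return (term1 - 2.0 * term2 / (n - 2)
            + term3 / ((n - 1) * (n - 2))) / (n * (n - 3))

rng = np.random.default_rng(2)
x = rng.standard_normal(12)
y = x ** 2 + 0.1 * rng.standard_normal(12)
print(dcov_sq_ustat_fast(x, y))   # agrees with dcov_sq_ustat_naive(x, y)
```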
(ii) We remark that although no assumption was provided in Theorem 4.1 to ensure that $\sigma^2 < \infty$, it can be shown that this condition holds whenever $X$ and $Y$ have finite second moments; see Edelmann, et al. [6].
(iii) Important contributions of Székely, et al. [24] and Székely and Rizzo [22] are based on the distance correlation coefficient, which is defined as the nonnegative square-root of
\[
\mathcal{R}^2(X, Y) = \frac{\mathcal{V}^2(X, Y)}{\sqrt{\mathcal{V}^2(X, X)\,\mathcal{V}^2(Y, Y)}}
\]
for $\mathcal{V}^2(X, X)\,\mathcal{V}^2(Y, Y) > 0$, and as zero otherwise. Numerous properties of $\mathcal{R}(X, Y)$ (see, e.g., Székely, et al. [24, Theorem 3]) may be derived using the methods that we have presented here.
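Using the estimator sketched above, a sample version of $\mathcal{R}^2(X, Y)$ can be assembled in a few lines (ours; note that with U-statistic inputs the ratio can be slightly negative in finite samples):

```python
# Sample squared distance correlation built from the O(n^2) estimator above
# (an illustrative sketch; names are ours).
import numpy as np

def dcor_sq(x, y):
    vxy = dcov_sq_ustat_fast(x, y)
    vxx = dcov_sq_ustat_fast(x, x)
    vyy = dcov_sq_ustat_fast(y, y)
    denom = np.sqrt(vxx * vyy)
    return vxy / denom if denom > 0 else 0.0
```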
We also remark on the fundamental integral, (2.1), that underpins the entire distance covariance and distance correlation theory. As noted by Dueck, et al. [4], the fundamental integral and variants of it have appeared in functional analysis (Gelfand and Shilov [8, pp. 192–195]), in Fourier analysis (Stein [21, pp. 140 and 263]), and in the theory of fractional Brownian motion and generalized random fields (Chilès and Delfiner [2, p. 266]; Reed, et al. [16]).
The fundamental integral also extends further. For $N \in \mathbb{N}$ and $u \in \mathbb{R}$, define
\[
\cos_N(u) = \sum_{k=0}^{N-1} \frac{(-1)^k\,u^{2k}}{(2k)!},
\tag{5.2}
\]
the truncated Maclaurin expansion of the cosine function. Dueck, et al. [4] proved that for $x \in \mathbb{R}^d$ and $\alpha > 0$,
\[
\int_{\mathbb{R}^d} \frac{\cos_N\langle t, x\rangle - \cos\langle t, x\rangle}{\|t\|^{d+\alpha}}\,\mathrm{d}t = C(d, \alpha)\,\|x\|^{\alpha},
\tag{5.3}
\]
where $C(d, \alpha)$ is a constant depending only on $d$ and $\alpha$, with absolute convergence if and only if $2(N-1) < \alpha < 2N$. For $N = 1$ and $\alpha = 1$, (5.3) reduces to (2.1). Further, for $N = 1$ and $0 < \alpha < 2$, the integral (5.3) provides the Lévy-Khintchine representation of the negative definite function $\|x\|^{\alpha}$, thereby linking the fundamental integral to the probability theory of the stable distributions.
In conclusion, the statistical analysis of data through distance covariance and distance correlation theory, by means of the fundamental integral, is seen to be linked closely to many areas of the mathematical sciences.
Acknowledgements. D. Edelmann gratefully acknowledges financial support from the Deutsche Forschungsgemeinschaft (Grant number 417754611). The authors are grateful to the editor of the special issue and to a reviewer for comments on the manuscript.
References
- [2] Chilès, J. P., and Delfiner, P. (2012). Geostatistics: Modeling Spatial Uncertainty, second edition. Wiley, New York.
- [3] Dueck, J., D. Edelmann, T. Gneiting, and D. Richards (2014). The affinely invariant distance correlation. Bernoulli, 20, 2305–2330.
- [4] Dueck, J., Edelmann, D., and Richards, D. (2015). A generalization of an integral arising in the theory of distance correlation. Statist. Probab. Lett., 97, 116–119.
- [5] Edelmann, D., K. Fokianos, and M. Pitsillou (2019). An updated literature review of distance correlation and its applications to time series. Internat. Statist. Rev., 87, 237–262.
- [6] Edelmann, D., D. Richards, and D. Vogel (2020). The distance standard deviation. Ann. Statist., 48, 3395–3416.
- [7] Fiedler, J. (2016). Distances, Gegenbauer Expansions, Curls, and Dimples: On Dependence Measures for Random Fields. Ph.D. dissertation, Heidelberg University.
- [8] Gelfand, I. M., and Shilov, G. E. (1964). Generalized Functions, Volume 1. Academic Press, New York.
- [9] Hoeffding, W. (1948). A class of statistics with asymptotically normal distribution. Ann. Math. Statist., 19, 293–325.
- [10] Huang, C., and X. Huo (2017). A statistically and numerically efficient independence test based on random projections and distance covariance. Preprint, arXiv:1701.06054.
- [11] Huo, X., and G. J. Székely (2016). Fast computing for distance covariance. Technometrics, 58, 435–447.
- [12] Kong, J., B. E. Klein, R. Klein, K. E. Lee, and G. Wahba (2012). Using distance correlation and SS-ANOVA to assess associations of familial relationships, lifestyle factors, diseases, and mortality. Proc. Natl. Acad. Sci. U.S.A., 109, 20352–20357.
- [13] Lee, A. J. (2019). U-Statistics: Theory and Practice. CRC Press, Boca Raton, FL.
- [14] Martínez-Gómez, E., M. T. Richards, and D. St. P. Richards (2014). Distance correlation methods for discovering associations in large astrophysical databases. Astrophys. J. 781, 39 (11 pp.).
- [15] Ohana-Levi, N., S. Munitz, A. Ben-Gal, A. Schwartz, A. Peeters, and Y. Netzer (2020). Multiseasonal grapevine water consumption - Drivers and forecasting. Agric. For. Meteorol., 280, 107796 (12 pp.).
- [16] Reed, I. S., Lee, P. C., and Truong, T. K. (1995). Spectral representation of fractional Brownian motion in $n$ dimensions and its properties. IEEE Trans. Inform. Theory, 41, 1439–1451.
- [17] Richards, D. St. P. (2017). Distance correlation: A new tool for detecting association and measuring correlation between data sets. Notices Amer. Math. Soc., 64, 16–18.
- [18] Richards, M. T., D. St. P. Richards, and E. Martínez-Gómez (2014). Interpreting the distance correlation results for the COMBO-17 survey. Astrophys. J., 784, L34 (5 pp.).
- [19] Sejdinovic, D., B. Sriperumbudur, A. Gretton, and K. Fukumizu (2013). Equivalence of distance-based and RKHS-based statistics in hypothesis testing. Ann. Statist., 41, 2263–2291.
- [20] Spivak, M. (1994). Calculus, third edition. Publish or Perish, Houston, TX.
- [21] Stein, E. M. (1970). Singular Integrals and Differentiability Properties of Functions. Princeton University Press, Princeton, N.J.
- [22] Székely, G. J., and M. L. Rizzo (2009). Brownian distance covariance. Ann. Appl. Statist., 3, 1236–1265.
- [23] Székely, G. J., and M. L. Rizzo (2014). Partial distance correlation with methods for dissimilarities. Ann. Statist., 42, 2382–2412.
- [24] Székely, G. J., M. L. Rizzo, and N. K. Bakirov (2007). Measuring and testing independence by correlation of distances. Ann. Statist., 35, 2769–2794.
- [25] Van der Vaart, A. W. (2000). Asymptotic Statistics. Cambridge University Press, New York.
- [26] Van De Vijver, M. J., Y. D. He, L. J. Van’t Veer, et al. (2002). A gene-expression signature as a predictor of survival in breast cancer. N. Engl. J. Med., 347, 1999–2009.
- [27] Zhang, X., M. Kano, and Y. Li (2018). Quality-relevant independent component regression model for virtual sensing application. Computers & Chem. Eng., 115, 141–149.
- [28] Zhou, Z. (2012). Measuring nonlinear dependence in time-series, a distance correlation approach. J. Time Series Analysis, 33, 438–457.