# HGR

**HGR** stands for the Hirschfeld–Gebelein–Rényi maximal correlation, a measure of statistical dependence between two random variables.
Unlike the commonly used Pearson correlation coefficient, it can detect non-linear statistical dependence. This generality comes at a
computational cost: while the Pearson coefficient can be computed directly from its definition, the HGR maximal correlation must in general
be approximated, for example with the Alternating Conditional Expectations (ACE) algorithm. HGR and its extensions have found applications
in information theory, machine learning, and related fields.

## History

The HGR maximal correlation was proposed independently by three mathematicians in the twentieth century.[1][2][3]
Gebelein showed that the correlation can be computed via a series expansion, an approach that is inefficient and was later superseded by the ACE algorithm.[4]

## Mathematical Description

Let $X$ and $Y$ be random variables, and let $f$ and $g$ range over real-valued transformations of $X$ and $Y$ respectively. The HGR maximal correlation is defined as

$$\rho_{\mathrm{HGR}}(X, Y) = \sup_{f,\, g} \; \mathbb{E}\left[f(X)\, g(Y)\right],$$

where the supremum is taken over all $f, g$ satisfying $\mathbb{E}[f(X)] = \mathbb{E}[g(Y)] = 0$ and $\mathbb{E}[f^2(X)] = \mathbb{E}[g^2(Y)] = 1$.
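As an illustrative sketch of the ACE idea (not a reference implementation), the power-iteration loop below estimates the maximal correlation of two discretely sampled variables by alternating the conditional-expectation updates $g(y) \leftarrow \mathbb{E}[f(X) \mid Y=y]$ and $f(x) \leftarrow \mathbb{E}[g(Y) \mid X=x]$, re-standardizing after each step. The function name `ace_hgr` and the bincount-based estimation of conditional means are our own choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def ace_hgr(x, y, n_iter=50):
    """Estimate the HGR maximal correlation of two discrete samples
    with a simple Alternating Conditional Expectations (ACE) loop."""
    # Map each sample to the index of its distinct value.
    _, x_idx = np.unique(x, return_inverse=True)
    _, y_idx = np.unique(y, return_inverse=True)
    # Start from a random score function f(X).
    f = rng.standard_normal(x_idx.max() + 1)[x_idx]
    for _ in range(n_iter):
        # g(y) <- E[f(X) | Y = y], estimated by within-group sample means.
        g = (np.bincount(y_idx, weights=f) / np.bincount(y_idx))[y_idx]
        g = (g - g.mean()) / g.std()   # re-standardize: zero mean, unit variance
        # f(x) <- E[g(Y) | X = x]
        f = (np.bincount(x_idx, weights=g) / np.bincount(x_idx))[x_idx]
        f = (f - f.mean()) / f.std()
    return float(np.mean(f * g))

# Y = X**2 depends on X only non-linearly: the Pearson correlation is near 0,
# while the maximal correlation is 1 (Y is a deterministic function of X).
x = rng.integers(-3, 4, size=20000)
y = x ** 2
print(round(ace_hgr(x, y), 3))  # → 1.0
```

Because $Y$ here is a deterministic function of $X$, one ACE step already yields $f = g$ pointwise, so the iteration reaches correlation 1 immediately; for noisy dependence the loop converges to the leading pair of score functions.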

## Applications

### Information Theory

The connection between the HGR maximal correlation and other information-theoretic quantities has been analyzed in detail under local (weak-dependence) assumptions.[5]
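For finite alphabets the maximal correlation has a closed-form spectral characterization, which underlies the universal-feature framework of [5]: it equals the second-largest singular value of the matrix $B$ with entries $B_{x,y} = P(x,y) / \sqrt{P(x)\,P(y)}$ (the top singular value is always 1, attained by constant functions). A minimal sketch, assuming NumPy; the function name `hgr_discrete` is ours:

```python
import numpy as np

def hgr_discrete(P):
    """HGR maximal correlation of a joint pmf P[i, j] over finite alphabets:
    the second-largest singular value of B[i, j] = P[i, j] / sqrt(Px[i] * Py[j])."""
    P = np.asarray(P, dtype=float)
    Px = P.sum(axis=1)  # marginal of X
    Py = P.sum(axis=0)  # marginal of Y
    B = P / np.sqrt(np.outer(Px, Py))
    s = np.linalg.svd(B, compute_uv=False)
    # s[0] == 1 corresponds to the constant functions, which are excluded
    # by the zero-mean constraint; the next singular value is the answer.
    return float(s[1])

# Independence gives maximal correlation 0; a deterministic bijection gives 1.
print(round(hgr_discrete(np.outer([0.5, 0.5], [0.3, 0.7])), 6))  # → 0.0
print(round(hgr_discrete(np.diag([0.5, 0.5])), 6))               # → 1.0
```

The singular vectors of $B$ (after dividing out the square-root marginals) recover the optimal score functions $f$ and $g$, so this decomposition also serves as an exact reference against which ACE-style iterative estimates can be checked.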

### Machine Learning

## Software Implementations

## References

1. Hirschfeld, Hermann O. "A connection between correlation and contingency." Mathematical Proceedings of the Cambridge Philosophical Society. Vol. 31. No. 4. Cambridge University Press, 1935.
2. Gebelein, Hans. "Das statistische Problem der Korrelation als Variations- und Eigenwertproblem und sein Zusammenhang mit der Ausgleichsrechnung" [The statistical problem of correlation as a variational and eigenvalue problem and its connection with the method of least squares]. ZAMM – Journal of Applied Mathematics and Mechanics / Zeitschrift für Angewandte Mathematik und Mechanik 21.6 (1941): 364–379.
3. Rényi, Alfréd. "On measures of dependence." Acta Mathematica Academiae Scientiarum Hungarica 10.3–4 (1959): 441–451.
4. Breiman, Leo, and Jerome H. Friedman. "Estimating optimal transformations for multiple regression and correlation." Journal of the American Statistical Association 80.391 (1985): 580–598.
5. Huang, Shao-Lun, Anuran Makur, Gregory W. Wornell, and Lizhong Zheng. "On Universal Features for High-Dimensional Learning and Inference." Foundations and Trends in Communications and Information Theory. Now Publishers, 2020.