# Semiparametric Gaussian copula classification

@article{Zhao2014SemiparametricGC, title={Semiparametric Gaussian copula classification}, author={Yue Zhao and Marten H. Wegkamp}, journal={arXiv: Statistics Theory}, year={2014} }

This paper studies the binary classification of two distributions with the same Gaussian copula in high dimensions. Under this semiparametric Gaussian copula setting, we derive an accurate semiparametric estimator of the log density ratio, which leads to our empirical decision rule and a bound on its associated excess risk. Our estimation procedure takes advantage of the potential sparsity as well as the low noise condition in the problem, which allows us to achieve a faster convergence rate of …
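The pipeline the abstract alludes to, mapping the unknown monotone margins to a Gaussian scale and then applying a plug-in linear rule based on the log density ratio, can be sketched as follows. This is a minimal illustration, not the authors' estimator: it assumes the two classes share the same (unknown) monotone marginal transformations and a common covariance, and the function and variable names are ours.

```python
import numpy as np
from scipy import stats

def to_normal_scores(x_train, x_eval):
    """Map each coordinate to a standard-normal scale via the Winsorized
    empirical CDF -- a common device for undoing unknown monotone margins."""
    n, p = x_train.shape
    delta = 1.0 / (4.0 * n ** 0.25 * np.sqrt(np.pi * np.log(n)))  # truncation level
    z = np.empty_like(x_eval, dtype=float)
    for j in range(p):
        u = np.searchsorted(np.sort(x_train[:, j]), x_eval[:, j], side="right") / n
        z[:, j] = stats.norm.ppf(np.clip(u, delta, 1.0 - delta))
    return z

def linear_rule(z0, z1):
    """Plug-in log density ratio for two Gaussians with a shared covariance:
    w . z + b with w = Sigma^{-1} (mu1 - mu0); classify as class 1 when positive."""
    mu0, mu1 = z0.mean(axis=0), z1.mean(axis=0)
    pooled = np.vstack([z0 - mu0, z1 - mu1])
    sigma = pooled.T @ pooled / (len(pooled) - 2)  # pooled covariance estimate
    w = np.linalg.solve(sigma, mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)
    return lambda z: (z @ w + b > 0).astype(int)
```

Because the normal-scores step depends only on ranks, the rule is unaffected by whatever strictly increasing transformation generated each observed margin.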

#### 2 Citations

High-Dimensional Gaussian Copula Regression: Adaptive Estimation and Statistical Inference

- Mathematics
- 2015

We develop adaptive estimation and inference methods for high-dimensional Gaussian copula regression that achieve the same performance without the knowledge of the marginal transformations as that …

A convex optimization approach to high-dimensional sparse quadratic discriminant analysis

- Mathematics
- 2019

In this paper, we study high-dimensional sparse Quadratic Discriminant Analysis (QDA) and aim to establish the optimal convergence rates for the classification error. Minimax lower bounds are …

#### References

Showing 1–10 of 30 references

Semiparametric Sparse Discriminant Analysis in Ultra-High Dimensions

- Mathematics
- 2013

In recent years, a considerable amount of work has been devoted to generalizing linear discriminant analysis to overcome its incompetence for high-dimensional classification (Witten & Tibshirani …

High Dimensional Semiparametric Gaussian Copula Graphical Models

- Computer Science, Mathematics
- ICML
- 2012

It is proved that the nonparanormal skeptic achieves the optimal parametric rates of convergence for both graph recovery and parameter estimation; this result suggests that nonparanormal graphical models can be used as a safe replacement for the popular Gaussian graphical models, even when the data are truly Gaussian.

Discriminant analysis through a semiparametric model

- Mathematics
- 2003

We consider a semiparametric generalisation of normal-theory discriminant analysis. The semiparametric model assumes that, after unspecified univariate monotone transformations, the class …

Asymptotic normality and optimalities in estimation of large Gaussian graphical models

- Mathematics
- 2015

The Gaussian graphical model, a popular paradigm for studying relationship among variables in a wide range of applications, has attracted great attention in recent years. This paper considers a …

CODA: high dimensional copula discriminant analysis

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2013

In high dimensional settings, it is proved that the sparsity pattern of the discriminant features can be consistently recovered at the parametric rate, and that the expected misclassification error converges to the Bayes risk.

Multivariate Analysis of Nonparametric Estimates of Large Correlation Matrices

- Mathematics
- 2014

We study concentration in spectral norm of nonparametric estimates of correlation matrices. We work within the confine of a Gaussian copula model. Two nonparametric estimators of the correlation …

High-Dimensional Covariance Estimation by Minimizing ℓ1-Penalized Log-Determinant Divergence (Pradeep Ravikumar)

- 2009

Given i.i.d. observations of a random vector X ∈ ℝ^p, we study the problem of estimating both its covariance matrix Σ* and its inverse covariance or concentration matrix Θ* = (Σ*)⁻¹. We estimate Θ* by …

Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas

- Mathematics
- 2016

We study the adaptive estimation of the copula correlation matrix $\Sigma$ for the semiparametric elliptical copula model. In this context, the correlations are connected to Kendall's tau through a sine …
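The sine relation this abstract refers to, $\Sigma_{jk} = \sin(\pi \tau_{jk} / 2)$ for a Gaussian copula, yields a simple rank-based plug-in estimator of the copula correlation matrix. A minimal sketch (the function name is ours):

```python
import numpy as np
from scipy import stats

def tau_to_correlation(x):
    """Plug-in copula correlation estimate: sigma_jk = sin(pi * tau_jk / 2),
    where tau_jk is Kendall's tau between coordinates j and k. Rank-based,
    hence invariant to strictly increasing marginal transformations."""
    p = x.shape[1]
    sigma = np.eye(p)
    for j in range(p):
        for k in range(j + 1, p):
            tau, _ = stats.kendalltau(x[:, j], x[:, k])
            sigma[j, k] = sigma[k, j] = np.sin(0.5 * np.pi * tau)
    return sigma
```

Because only ranks enter the computation, replacing any column `x[:, j]` by, say, `np.exp(x[:, j])` leaves the estimate unchanged — which is precisely why such estimators suit the semiparametric copula setting.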

Optimal Feature Selection in High-Dimensional Discriminant Analysis

- Mathematics, Medicine
- IEEE Transactions on Information Theory
- 2015

This paper establishes rates of convergence that are significantly faster than the best known results and admit an optimal scaling of the sample size n, dimensionality p, and sparsity level s in the high-dimensional setting.

Fast learning rates for plug-in classifiers

- Mathematics
- 2007

It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. …