We investigate the structure of a large precision matrix in Gaussian graphical models by decomposing it into a low-rank component and a remainder part whose precision matrix is sparse. Based on this decomposition, we propose to estimate the large precision matrix by inverting a principal orthogonal decomposition (IPOD). The IPOD approach has appealing practical interpretations in conditional graphical models given the low-rank component, and it connects to Gaussian graphical models with latent variables. Specifically, we show that the low-rank component in the decomposition of the large precision matrix can be viewed as the contribution of the latent variables in a Gaussian graphical model. Compared with existing approaches for latent variable graphical models, IPOD is convenient in practice: only a low-dimensional matrix needs to be inverted. To identify the number of latent variables, a question of independent interest, we investigate and justify an approach that examines the ratios of adjacent eigenvalues of the sample covariance matrix. Theoretical properties, numerical examples, and a real data application demonstrate the merits of the IPOD approach in its convenience, performance, and interpretability.
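To make the two computational ingredients above concrete, here is a minimal Python sketch, not the authors' code: it assumes a POET-style split of the sample covariance into its top-K principal part plus a remainder, selects K by the adjacent-eigenvalue-ratio rule, and inverts via the Woodbury identity so that only a K x K matrix is ever inverted. The function names and the `sparse_precision_estimator` argument are hypothetical placeholders.

```python
import numpy as np

def select_k_by_eigenvalue_ratio(X, k_max=10):
    """Pick K by maximizing lambda_k / lambda_{k+1} over the leading
    eigenvalues of the sample covariance matrix (illustrative sketch)."""
    S = np.cov(X, rowvar=False)                 # p x p sample covariance
    lam = np.linalg.eigvalsh(S)[::-1]           # eigenvalues, descending
    k_max = min(k_max, len(lam) - 1)
    ratios = lam[:k_max] / lam[1:k_max + 1]     # adjacent-eigenvalue ratios
    return int(np.argmax(ratios)) + 1           # 1-based K

def ipod_precision(S, K, sparse_precision_estimator):
    """Sketch of the IPOD idea: split S into its top-K principal part plus
    a remainder, estimate a sparse precision matrix Omega_u for the
    remainder, then invert the implied covariance with the Woodbury
    identity, so only a K x K matrix is inverted."""
    lam, V = np.linalg.eigh(S)
    order = np.argsort(lam)[::-1]               # sort eigenpairs, descending
    lam, V = lam[order], V[:, order]
    B = V[:, :K] * np.sqrt(lam[:K])             # p x K factor: B @ B.T is the low-rank part
    S_u = S - B @ B.T                           # principal orthogonal complement
    Omega_u = sparse_precision_estimator(S_u)   # placeholder for any sparse precision estimator
    # Woodbury identity:
    # (B B^T + Omega_u^{-1})^{-1} = Omega_u - Omega_u B (I_K + B^T Omega_u B)^{-1} B^T Omega_u
    M = np.eye(K) + B.T @ Omega_u @ B           # the only matrix inverted: K x K
    return Omega_u - Omega_u @ B @ np.linalg.solve(M, B.T @ Omega_u)
```

Any estimator that maps a covariance-like input to a sparse precision matrix can be plugged in for `sparse_precision_estimator` (the graphical lasso is one natural choice); since the remainder S_u is rank-deficient by construction, a small ridge added to its diagonal may be needed in practice to keep that step well conditioned.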