TY - JOUR
T1 - Learning additive exponential family graphical models via l2,1-norm regularized M-estimation
AU - Yuan, Xiao Tong
AU - Li, Ping
AU - Zhang, Tong
AU - Liu, Qingshan
AU - Liu, Guangcan
N1 - Funding Information:
Xiao-Tong Yuan and Ping Li were partially supported by NSF-Bigdata-1419210, NSF-III-1360971, ONR-N00014-13-1-0764, and AFOSR-FA9550-13-1-0137. Xiao-Tong Yuan is also partially supported by NSFC-61402232, NSFC-61522308, and NSFJP-BK20141003. Tong Zhang is supported by NSF-IIS-1407939 and NSF-IIS-1250985. Qingshan Liu is supported by NSFC-61532009. Guangcan Liu is supported by NSFC-61622305, NSFC-61502238 and NSFJP-BK20160040.
Publisher Copyright:
© 2016 NIPS Foundation - All Rights Reserved.
PY - 2016
Y1 - 2016
N2 - We investigate a subclass of exponential family graphical models whose sufficient statistics are defined by arbitrary additive forms. We propose two l2,1-norm regularized maximum likelihood estimators to learn the model parameters from i.i.d. samples. The first is a joint MLE estimator which estimates all the parameters simultaneously. The second is a node-wise conditional MLE estimator which estimates the parameters for each node individually. For both estimators, statistical analysis shows that under mild conditions the extra flexibility gained by the additive exponential family models comes at almost no cost in statistical efficiency. A Monte-Carlo approximation method is developed to efficiently optimize the proposed estimators. The advantages of our estimators over Gaussian graphical models and Nonparanormal estimators are demonstrated on synthetic and real data sets.
AB - We investigate a subclass of exponential family graphical models whose sufficient statistics are defined by arbitrary additive forms. We propose two l2,1-norm regularized maximum likelihood estimators to learn the model parameters from i.i.d. samples. The first is a joint MLE estimator which estimates all the parameters simultaneously. The second is a node-wise conditional MLE estimator which estimates the parameters for each node individually. For both estimators, statistical analysis shows that under mild conditions the extra flexibility gained by the additive exponential family models comes at almost no cost in statistical efficiency. A Monte-Carlo approximation method is developed to efficiently optimize the proposed estimators. The advantages of our estimators over Gaussian graphical models and Nonparanormal estimators are demonstrated on synthetic and real data sets.
UR - http://www.scopus.com/inward/record.url?scp=85019191994&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019191994&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85019191994
SN - 1049-5258
SP - 4374
EP - 4382
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 30th Annual Conference on Neural Information Processing Systems, NIPS 2016
Y2 - 5 December 2016 through 10 December 2016
ER -