TY - GEN
T1 - CSTAR: Towards Compact and Structured Deep Neural Networks with Adversarial Robustness
T2 - 37th AAAI Conference on Artificial Intelligence, AAAI 2023
AU - Phan, Huy
AU - Yin, Miao
AU - Sui, Yang
AU - Yuan, Bo
AU - Zonouz, Saman
N1 - Funding Information:
This work was partially supported by the National Science Foundation under Grant CCF-1955909 and the RTML, CMMI, and CPS programs.
Publisher Copyright:
Copyright © 2023, Association for the Advancement of Artificial Intelligence (www.aaai.org).
PY - 2023/6/27
AB - Model compression and model defense for deep neural networks (DNNs) have been extensively and individually studied. Considering the co-importance of model compactness and robustness in practical applications, several prior works have explored improving the adversarial robustness of sparse neural networks. However, the structured sparse models obtained by existing works suffer severe degradation in both benign and robust accuracy, creating a challenging dilemma between the robustness and structuredness of compact DNNs. To address this problem, in this paper we propose CSTAR, an efficient solution that simultaneously imposes Compactness, high STructuredness, and high Adversarial Robustness on the target DNN models. By formulating the structuredness and robustness requirements within the same framework, the compressed DNNs can simultaneously achieve high compression performance and strong adversarial robustness. Evaluations of various DNN models on different datasets demonstrate the effectiveness of CSTAR. Compared with state-of-the-art robust structured pruning methods, CSTAR shows consistently better performance. For instance, when compressing ResNet-18 on CIFAR-10, CSTAR achieves up to 20.07% and 11.91% improvements in benign accuracy and robust accuracy, respectively. When compressing ResNet-18 at a 16× compression ratio on ImageNet, CSTAR obtains an 8.58% benign accuracy gain and a 4.27% robust accuracy gain over existing robust structured pruning methods.
UR - http://www.scopus.com/inward/record.url?scp=85167695295&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85167695295&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85167695295
T3 - Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023
SP - 2065
EP - 2073
BT - AAAI-23 Technical Tracks 2
A2 - Williams, Brian
A2 - Chen, Yiling
A2 - Neville, Jennifer
PB - AAAI Press
Y2 - 7 February 2023 through 14 February 2023
ER -