[Paper Notes] (Defensive Distillation) Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks

Related papers on distillation:
(2006) Model Compression
(2014) Do Deep Nets Really Need to be Deep? --- paper notes
(2015) Distilling the Knowledge in a Neural Network --- paper notes

Abstract: This paper proposes defensive distillation, whose main idea is to use knowledge extracted from a DNN to reduce
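The mechanism shared by all the distillation papers listed above is a temperature-scaled softmax: raising the temperature T softens the output distribution, and defensive distillation trains both the teacher and the student at a high T. A minimal sketch of that operation (the function name `softmax_t` and the example logits are illustrative, not from the paper):

```python
import math

def softmax_t(logits, T=1.0):
    # Temperature-scaled softmax: exp(z_i / T) / sum_j exp(z_j / T).
    # Higher T yields a softer (more uniform) probability distribution,
    # which is the "knowledge" transferred during distillation.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [5.0, 2.0, 0.5]
hard = softmax_t(logits, T=1.0)    # close to a one-hot distribution
soft = softmax_t(logits, T=20.0)   # high temperature, as in defensive distillation
```

With T=1 the largest logit dominates; with a large T the probabilities spread out, exposing the relative similarity between classes that hard labels discard.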