Adversarial Defense via Neural Oscillation inspired Gradient Masking
Title: | Adversarial Defense via Neural Oscillation inspired Gradient Masking |
---|---|
Authors: | Jiang, Chunming; Zhang, Yilei |
Publication Year: | 2022 (published 3 November 2022) |
Collection: | Computer Science |
Subject Terms: | Computer Science - Machine Learning, Computer Science - Neural and Evolutionary Computing |
More Details: | Spiking neural networks (SNNs) attract wide attention due to their low power consumption, low latency, and biological plausibility. As they are increasingly deployed in neuromorphic devices for low-power brain-inspired computing, their security becomes increasingly important. However, compared to deep neural networks (DNNs), SNNs currently lack specifically designed defense methods against adversarial attacks. Inspired by neural membrane potential oscillation, we propose a novel neural model that incorporates the bio-inspired oscillation mechanism to enhance the security of SNNs. Our experiments show that SNNs with neural oscillation neurons resist adversarial attacks better than ordinary SNNs with LIF neurons across a range of architectures and datasets. Furthermore, we propose a defense method that changes the model's gradients by replacing the form of the oscillation, which hides the original training gradients and misleads the attacker into using the gradients of 'fake' neurons to generate invalid adversarial samples. Our experiments suggest that the proposed defense method can effectively resist both single-step and iterative attacks, with defense effectiveness comparable to, and computational cost much lower than, that of adversarial training methods on DNNs. To the best of our knowledge, this is the first work to establish adversarial defense through masking surrogate gradients on SNNs. |
Document Type: | Working Paper |
Access URL: | http://arxiv.org/abs/2211.02223 |
Accession Number: | edsarx.2211.02223 |
Database: | arXiv |
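The abstract contrasts ordinary LIF neurons with "neural oscillation" neurons whose firing behavior is modulated by a periodic term. The sketch below illustrates that contrast only at the level of spiking dynamics; the specific dynamics, the sinusoidal threshold modulation, and all parameter names and values (`tau`, `v_th`, `amp`, `period`) are illustrative assumptions, not the paper's actual model:

```python
import math

def lif_spikes(inputs, tau=2.0, v_th=1.0):
    """Plain leaky integrate-and-fire (LIF) neuron: the membrane
    potential leaks toward zero, integrates the input, and emits a
    spike (with a hard reset) when it crosses a fixed threshold."""
    v, spikes = 0.0, []
    for x in inputs:
        v = v / tau + x          # leak, then integrate input
        if v >= v_th:
            spikes.append(1)
            v = 0.0              # hard reset after a spike
        else:
            spikes.append(0)
    return spikes

def oscillating_spikes(inputs, tau=2.0, v_th=1.0, amp=0.3, period=4):
    """Hypothetical oscillation neuron: identical LIF dynamics, but
    the firing threshold is modulated by a sinusoidal term, loosely
    mimicking membrane-potential oscillation."""
    v, spikes = 0.0, []
    for t, x in enumerate(inputs):
        th = v_th + amp * math.sin(2 * math.pi * t / period)
        v = v / tau + x
        if v >= th:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

inputs = [0.6] * 8
print(lif_spikes(inputs))          # fixed threshold
print(oscillating_spikes(inputs))  # time-varying threshold: a different spike train
```

The gradient-masking idea in the abstract builds on this kind of contrast: since SNN training relies on surrogate gradients for the non-differentiable spike function, a defender can expose the surrogate gradient of one oscillation form while deploying another, so that gradient-based attacks are computed against 'fake' neuron dynamics.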