Adversarial attacks defense

Table of Contents

About The Project

Based on the paper "Neural Compression Restoration Against Gradient-based Adversarial Attacks".

In this work, the proposed defense strategy is evaluated against black-box attacks. In particular, we consider the Hop Skip Jump and the Square attacks.

More details about the project can be found in the paper or in the presentation.
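To give an intuition of what "black-box" means here: attacks like Square query only the model's outputs and never use gradients. The toy sketch below (illustrative only, not the paper's or the repo's implementation) shows a score-based random-search attack loop in that spirit, with a stand-in model.

```python
# Toy sketch of a score-based black-box attack in the spirit of Square:
# propose random bounded perturbations and keep a change only if it
# increases the model's loss, using nothing but output scores (no gradients).
import random

def toy_model(x):
    # Stand-in "classifier loss": distance from a fixed decision point.
    return sum((xi - 0.5) ** 2 for xi in x)

def random_search_attack(x, loss, eps=0.1, steps=200, seed=0):
    rng = random.Random(seed)
    best = list(x)
    best_loss = loss(best)
    for _ in range(steps):
        cand = list(best)
        i = rng.randrange(len(cand))  # perturb one coordinate per query
        # Stay within an eps-ball of the clean input, clipped to [0, 1]
        cand[i] = min(1.0, max(0.0, x[i] + rng.choice([-eps, eps])))
        if loss(cand) > best_loss:    # query-only acceptance rule
            best, best_loss = cand, loss(cand)
    return best

x_clean = [0.5, 0.5, 0.5]
x_adv = random_search_attack(x_clean, toy_model)
```

The real Square attack perturbs square-shaped image patches rather than single coordinates, and Hop Skip Jump is decision-based (it uses only the predicted label), but the query-and-accept structure above is the common black-box pattern.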

Built With

Installation

To get a local copy up and running, follow these steps:

  1. Clone the repo

git clone https://github.com/LorenzoAgnolucci/Adversarial_attacks_defense.git

  2. Run pip install -r requirements.txt in the root folder of the repo to install the requirements

  3. Run pip install -e adversarial-robustness-toolbox/ in the root folder to install the ART module with the custom files

Usage

  1. Download the dataset

  2. Change the path of the images and the parameters in jpeg_gan_hop_skip_jump_pytorch.py and jpeg_gan_square_pytorch.py

  3. Run jpeg_gan_hop_skip_jump_pytorch.py or jpeg_gan_square_pytorch.py to evaluate the defense strategy against the corresponding attack
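The core idea the scripts above evaluate is that lossy re-encoding (JPEG compression, followed by GAN-based restoration in the paper) removes small adversarial perturbations before classification. The minimal sketch below (not the repo's code) illustrates the mechanism with uniform quantization, the step at the heart of JPEG compression, applied to pixel values in [0, 1].

```python
# Sketch of why compression can defend against small perturbations:
# quantizing pixel values snaps a slightly perturbed image back onto
# the same representation as the clean image.

def quantize(pixels, levels=16):
    """Map each pixel to the nearest of `levels` uniform values in [0, 1]."""
    step = 1.0 / (levels - 1)
    return [round(p / step) * step for p in pixels]

clean = [0.20, 0.40, 0.80]
# An adversarial example typically adds a small perturbation per pixel
perturbed = [p + 0.02 for p in clean]

# After quantization, clean and perturbed inputs coincide, so a classifier
# fed the quantized input effectively sees the clean image again.
print(quantize(clean) == quantize(perturbed))
```

In practice the defense trades off quantization strength against the accuracy loss on clean images, which is why the restoration step matters in the paper.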

Authors

Acknowledgments

Visual and Multimedia Recognition course held by Professor Alberto Del Bimbo - Computer Engineering Master's Degree @ University of Florence
