[fabric_for_deep_learning_adversarial_samples_fashion_mnist.ipynb](fabric_for_deep_learning_adversarial_samples_fashion_mnist.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/fabric_for_deep_learning_adversarial_samples_fashion_mnist.ipynb)]
shows how to use ART with deep learning models trained with Fabric for Deep Learning (FfDL).

## Hugging Face

[huggingface_notebook.ipynb](huggingface_notebook.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/huggingface_notebook.ipynb)]
shows how to use ART with the Hugging Face API for image classification tasks.

[hugging_face_evasion.ipynb](hugging_face_evasion.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/hugging_face_evasion.ipynb)]
shows how to use ART to perform evasion attacks on Hugging Face image classification models and defend them using adversarial training.

[hugging_face_poisoning.ipynb](hugging_face_poisoning.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/hugging_face_poisoning.ipynb)]
shows how to use ART to poison Hugging Face image classification models and defend them using poisoning defenses.