ART 1.6.1
This release of ART 1.6.1 provides updates to ART 1.6.
Added
- Added a notebook showing an example of Expectation over Transformation (EoT) sampling with ART to generate adversarial examples that are robust against rotation in image classification tasks (a minimal sketch of the idea follows this list). (#1051)
- Added a check for valid combinations of `stride`, `freq_dim` and image size in `SimBA` attack. (#1037)
- Added accurate gradient estimation to `LFilter` audio preprocessing. (#1002)
- Added support for multiple layers to be targeted by `BullseyePolytopeAttackPyTorch` attack to increase effectiveness in end-to-end scenarios. (#1003)
- Added check and `ValueError` to provide an explanation for too large `nb_parallel` values in `ZooAttack`. (#988)
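
The EoT sampling shown in the new notebook (#1051) amounts to averaging loss gradients over randomly sampled transformations. The sketch below is a minimal illustration of that idea, not code from the notebook: `classifier` is assumed to be any ART classifier exposing `loss_gradient(x, y)`, inputs are assumed to be NHWC image batches, and the rotation range and sample count are arbitrary choices.

```python
import numpy as np
from scipy.ndimage import rotate


def eot_rotation_gradient(classifier, x, y, n_samples=32, max_angle=22.5):
    """Average loss gradients of `classifier` over randomly rotated copies of `x`.

    Assumes `classifier.loss_gradient(x, y)` returns an array shaped like `x`,
    as ART classifiers do, and that `x` is an NHWC batch of images.
    """
    grad = np.zeros_like(x, dtype=np.float64)
    for _ in range(n_samples):
        angle = np.random.uniform(-max_angle, max_angle)
        # Rotate every image in the batch in the (height, width) plane.
        x_rot = rotate(x, angle, axes=(1, 2), reshape=False, mode="nearest")
        g = classifier.loss_gradient(x_rot.astype(x.dtype), y)
        # Rotation is linear, so its adjoint is (approximately) the inverse
        # rotation; map the gradient back into the original image frame.
        grad += rotate(g, -angle, axes=(1, 2), reshape=False, mode="nearest")
    return (grad / n_samples).astype(x.dtype)


# One FGSM-style step on the averaged gradient (illustrative only):
# x_adv = np.clip(x + 0.03 * np.sign(eot_rotation_gradient(classifier, x, y)), 0.0, 1.0)
```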
Changed
- Changed `TensorFlowV2Classifier.get_activations` to accept negative layer indexes (see the example after this list). (#1054)
- Tested `BoundaryAttack` and `HopSkipJump` attacks with `batch_size` larger than 1 and changed the default value to `batch_size=64`. (#971)
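
A brief usage sketch of the negative layer indexes now accepted by `TensorFlowV2Classifier.get_activations` (#1054). The Keras model, shapes and constructor arguments below are illustrative assumptions rather than anything prescribed by the release.

```python
import numpy as np
import tensorflow as tf
from art.estimators.classification import TensorFlowV2Classifier

# Small illustrative Keras model (assumed for this example only).
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10),
])

classifier = TensorFlowV2Classifier(
    model=model,
    nb_classes=10,
    input_shape=(28, 28, 1),
    loss_object=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    clip_values=(0.0, 1.0),
)

x = np.random.rand(4, 28, 28, 1).astype(np.float32)

# As of 1.6.1 the layer index may be negative, counting from the end,
# so -1 addresses the last layer just like Python list indexing.
last_layer_activations = classifier.get_activations(x, layer=-1, batch_size=4)
```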
Removed
[None]
Fixed
- Fixed bug in `DPatch` attack which did not apply the patch being optimised to the images used for the loss gradient calculation, so each iteration ran with the constant, initially applied patch. (#1049)
- Fixed bug in `BullseyePolytopeAttackPyTorch` attack where attacking multiple layers of the underlying model only perturbed the first of all input images. (#1046)
- Fixed return value of `TensorFlowV2Classifier.get_activations` to a list of strings. (#1011)
- Fixed bug in `TensorFlowV2Classifier.loss_gradient` by adding labels to the application of the preprocessing step, enabling EoT preprocessing steps that increase the number of samples and labels. This change does not affect the accuracy of previously calculated loss gradients. (#1010)
- Fixed bug in `ElasticNet` attack to apply the `confidence` parameter when generating adversarial examples (see the sketch after this list). (#995)
- Fixed bug in `art.attacks.poisoning.perturbations.image_perturbations.insert_image` to correctly transpose input images when `channels_first=True` (a simplified helper is sketched after this list). (#1009)
- Fixed bug of missing method `compute_loss` in `PyTorchDeepSpeech`, `TensorFlowFasterRCNN` and `BlackBoxClassifier`. (#994, #1000)
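
With the `ElasticNet` fix (#995), the `confidence` parameter passed at construction time is now actually applied when generating the adversarial examples. A minimal sketch, assuming an already fitted ART classifier and arbitrary hyperparameter values:

```python
from art.attacks.evasion import ElasticNet


def elasticnet_with_margin(classifier, x, confidence=0.5):
    """Run the EAD attack with a non-zero confidence margin.

    `classifier` is assumed to be a fitted, gradient-capable ART classifier;
    the hyperparameters here are illustrative, not recommended settings.
    """
    attack = ElasticNet(classifier, confidence=confidence, max_iter=50, batch_size=16)
    return attack.generate(x=x)
```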
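
The `insert_image` fix (#1009) comes down to transposing channels-first inputs to channels-last before pasting the trigger image and transposing back afterwards. The helper below is a simplified, hypothetical re-creation of that idea in plain NumPy, with the placement logic reduced to a fixed top-left paste; it is not the library's implementation.

```python
import numpy as np


def paste_trigger(image: np.ndarray, trigger: np.ndarray, channels_first: bool = False) -> np.ndarray:
    """Paste a small trigger patch into the top-left corner of `image`.

    Both arrays are HWC unless `channels_first=True`, in which case they are
    CHW and are transposed to HWC for the paste and back again on return.
    """
    if channels_first:
        # CHW -> HWC so the spatial indexing below is uniform.
        image = np.transpose(image, (1, 2, 0))
        trigger = np.transpose(trigger, (1, 2, 0))

    patched = image.copy()
    h, w = trigger.shape[:2]
    patched[:h, :w, :] = trigger  # fixed placement; real code chooses the location

    if channels_first:
        # HWC -> CHW to match the input layout.
        patched = np.transpose(patched, (2, 0, 1))
    return patched
```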