1 change: 1 addition & 0 deletions docs/hub/security.md
@@ -20,4 +20,5 @@ For any other security questions, please feel free to send us an email at securi
- [Malware Scanning](./security-malware)
- [Pickle Scanning](./security-pickle)
- [Secrets Scanning](./security-secrets)
- [3rd-party scanners](./third-party-scanners)
- [Resource Groups](./security-resource-groups)
25 changes: 25 additions & 0 deletions docs/hub/third-party-scanners.md
@@ -0,0 +1,25 @@
# 3rd-party scanners

*Interested in joining our security partnership / providing scanning information on the Hub? Please get in touch with us over at security@huggingface.co.*

We partner with 3rd-party scanning providers to make the Hub safer. Files in public repositories are scanned by the 3rd-party scanners we integrate, in the same way they are scanned by our internal scanning system.

Our frontend has been redesigned specifically for this purpose, in order to accommodate new scanners:

<img class="block" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/token-leak-email-example.png"/>

Here is an example repository you can check out to see the feature in action: [mcpotato/42-eicar-street](https://huggingface.co/mcpotato/42-eicar-street).

## Model security refresher

To share models, we serialize the data structures we use to interact with them, which facilitates storage and transport. Some serialization formats are vulnerable to nasty exploits, such as arbitrary code execution (looking at you, pickle), which makes sharing models potentially dangerous.
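
To make this concrete, here is a minimal sketch (not part of the original doc) of how pickle's `__reduce__` hook can smuggle arbitrary code into a payload; the class name and the echo command are illustrative stand-ins, not a real exploit:

```python
# Minimal sketch of a malicious pickle payload.
import os
import pickle


class Malicious:
    # pickle calls __reduce__ to learn how to rebuild the object:
    # returning (callable, args) makes the callable run at *load* time.
    def __reduce__(self):
        return (os.system, ("echo 'arbitrary code ran on unpickling'",))


payload = pickle.dumps(Malicious())

# The victim only has to deserialize the file for the code to execute.
pickle.loads(payload)
```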

As Hugging Face has become the de facto platform for model sharing, we’d like to protect the community from this, which is why we have developed tools like [picklescan](https://github.com/mmaitre314/picklescan) and why we integrate 3rd-party scanners.
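
As a rough illustration of what such a scanner does, the sketch below checks a file with picklescan's Python entry point. The `scan_file_path` function and the result field used here are taken from picklescan's source and may differ across versions, so treat them as assumptions; the CLI documented in the project's README is the stable interface.

```python
# Sketch only: scan a suspicious pickle file without ever loading it.
# Assumes picklescan's scan_file_path entry point (pip install picklescan);
# the exact API may change between versions.
import os
import pickle

from picklescan.scanner import scan_file_path


class Malicious:
    def __reduce__(self):
        return (os.system, ("echo pwned",))


with open("suspicious.pkl", "wb") as f:
    pickle.dump(Malicious(), f)

# picklescan walks the pickle opcodes and flags dangerous imports
# (here os.system) instead of executing them.
result = scan_file_path("suspicious.pkl")
print(result.issues_count)  # expected to be non-zero for this payload
```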

Pickle is not the only exploitable format out there; [see for reference](https://github.com/Azure/counterfit/wiki/Abusing-ML-model-file-formats-to-create-malware-on-AI-systems:-A-proof-of-concept) how one can exploit Keras Lambda layers to achieve arbitrary code execution.
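
To illustrate the Keras vector in the same spirit, here is a hedged sketch: a `Lambda` layer wraps arbitrary Python whose bytecode is saved inside the model file, and recent Keras versions only deserialize it when `safe_mode` is explicitly disabled. The file name and the echo command are stand-ins:

```python
# Sketch of the Keras Lambda risk: the lambda's bytecode is serialized
# with the model and executes when the reloaded model is called.
# Requires tensorflow; the echo command stands in for a real payload.
import tensorflow as tf

evil = tf.keras.layers.Lambda(
    lambda x: (__import__("os").system("echo 'lambda payload executed'"), x)[1]
)

inputs = tf.keras.Input(shape=(1,))
model = tf.keras.Model(inputs, evil(inputs))
model.save("innocent_looking.keras")

# Recent Keras releases refuse to deserialize Lambda bytecode unless the
# loader explicitly opts out of safe mode, for exactly this reason.
reloaded = tf.keras.models.load_model("innocent_looking.keras", safe_mode=False)
reloaded(tf.constant([[1.0]]))  # the payload runs here
```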

## Protect AI's Guardian

[Protect AI](https://protectai.com/)'s [Guardian](https://protectai.com/guardian) catches both pickle and Keras exploits. Guardian also benefits from reports sent in by their community of bounty hunters on [Huntr](https://huntr.com/).

<!-- insert image of report -->