diff --git a/docs/cody/capabilities/supported-models.mdx b/docs/cody/capabilities/supported-models.mdx
index 631c6dac8..fb5439608 100644
--- a/docs/cody/capabilities/supported-models.mdx
+++ b/docs/cody/capabilities/supported-models.mdx
@@ -43,4 +43,6 @@ Cody uses a set of models for autocomplete which are suited for the low latency
The default autocomplete model for Cody Free, Pro and Enterprise users is DeepSeek-Coder-V2.
+The DeepSeek model used by Sourcegraph is hosted by Fireworks.ai as a single-tenant service in a US-based data center. For more information, see our [Cody FAQ](https://sourcegraph.com/docs/cody/faq#is-any-of-my-data-sent-to-deepseek).
+
Read here for [Ollama setup instructions](https://sourcegraph.com/docs/cody/clients/install-vscode#supported-local-ollama-models-with-cody). For information on context token limits, see our [documentation here](/cody/core-concepts/token-limits).
diff --git a/docs/cody/faq.mdx b/docs/cody/faq.mdx
index f6637e032..ebf2215e2 100644
--- a/docs/cody/faq.mdx
+++ b/docs/cody/faq.mdx
@@ -100,6 +100,14 @@ Cody does not support embeddings on Cody PLG and Cody Enterprise because we have
Leveraging Sourcegraph Search allowed us to deliver these enhancements.
+## LLM Data Sharing and Retention
+
+### Is any of my data sent to DeepSeek?
+
+Our autocomplete feature uses the open source DeepSeek-Coder-V2 model, which is hosted by Fireworks.ai in a secure single-tenant environment located in the USA. No customer chat or autocomplete data, such as chat messages or context (including code snippets or configuration), is stored by Fireworks.ai.
+
+Sourcegraph does not use models hosted by DeepSeek (the company) and does not send any data to it.
+
## Third party dependencies
### What is the default `sourcegraph` provider for completions?