diff --git a/README.md b/README.md
index f8e7fb2b..671c2bfb 100644
--- a/README.md
+++ b/README.md
@@ -107,4 +107,5 @@ Disclaimer: Examples contributed by the community and partners do not represent
| [Build a bank support agent with Pydantic AI and Mistral AI](third_party/PydanticAI/pydantic_bank_support_agent.ipynb)| Agent | Pydantic |
| [Mistral and MLflow Tracing](third_party/MLflow/mistral-mlflow-tracing.ipynb) | Tracing, Observability | MLflow |
| [Mistral OCR with Gradio](third_party/gradio/MistralOCR.md) | OCR | Gradio |
+| [Semantic search in Rust using SurrealDB](third_party/SurrealDB/rust-semantic-search-with-surrealdb.md) | RAG | SurrealDB |
| [prompt_optimization.ipynb](third_party/metagpt/prompt_optimization.ipynb)) |Prompting | Optimizing prompts without any supervision
diff --git a/third_party/SurrealDB/readme.md b/third_party/SurrealDB/readme.md
new file mode 100644
index 00000000..c2e98e4f
--- /dev/null
+++ b/third_party/SurrealDB/readme.md
@@ -0,0 +1,3 @@
+This page holds a collection of posts using Mistral AI with SurrealDB.
+
+# [Semantic search in Rust with Mistral AI and SurrealDB](rust-semantic-search-with-surrealdb.md)
\ No newline at end of file
diff --git a/third_party/SurrealDB/rust-semantic-search-with-surrealdb.md b/third_party/SurrealDB/rust-semantic-search-with-surrealdb.md
new file mode 100644
index 00000000..297da12d
--- /dev/null
+++ b/third_party/SurrealDB/rust-semantic-search-with-surrealdb.md
@@ -0,0 +1,498 @@
+# Semantic search in Rust using SurrealDB
+
+This post demonstrates how to use SurrealDB's Rust SDK to store Mistral AI embeddings as [SurrealDB vectors](https://www.surrealdb.com/docs/surrealdb/reference-guide/vector-search), which can then be queried natively in SurrealQL to perform semantic search.
+
+This guide uses Rust's [mistralai-client](https://crates.io/crates/mistralai-client) crate to generate embeddings, but the code below can be modified to suit [other languages](https://docs.mistral.ai/getting-started/clients/#rust) that have clients for Mistral AI's platform. If you are a Python user, check out [this page](https://www.surrealdb.com/docs/integrations/embeddings/mistral) in the documentation for another ready-made example.
+
+## Setup
+
+Setting up an embedded SurrealDB database requires no installation and can be done in just a few lines of code. After creating a new Cargo project with `cargo new project_name` and going into the project folder, add the following dependencies inside `Cargo.toml`:
+
+```toml
+anyhow = "1.0.98"
+mistralai-client = "0.14.0"
+serde = { version = "1.0.219", features = ["derive"] }
+surrealdb = { version = "2.3", features = ["kv-mem"] }
+tokio = { version = "1.45.0", features = ["macros", "rt-multi-thread"] }
+```
+
+
+You can add the same dependencies on the command line through a single command:
+
+```bash
+cargo add anyhow mistralai-client serde tokio surrealdb --features surrealdb/kv-mem,serde/derive,tokio/macros,tokio/rt-multi-thread
+```
+
+
+Setting up a SurrealDB database in Rust is as easy as calling the `connect` function with `"memory"` to instantiate an embedded database in memory. This code uses `anyhow` to allow the question mark operator to be used, but you can also just begin with `.unwrap()` everywhere and eventually move on to your own preferred error handling.
+
+```rust
+use anyhow::Error;
+use surrealdb::engine::any::connect;
+
+#[tokio::main]
+async fn main() -> Result<(), Error> {
+ let db = connect("memory").await?;
+ Ok(())
+}
+```
+
+
+If you have a running Cloud or local instance, you can pass that path into the `connect()` function instead.
+
+```rust
+// Cloud address
+let db = connect("wss://cloud-docs-068rp16e0hsnl62vgooa7omjks.aws-euw1.staging.surrealdb.cloud").await?;
+
+// Local address
+let db = connect("ws://localhost:8000").await?;
+```
+
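+If the instance you are connecting to was started with authentication enabled, you will also need to sign in before running queries. The following is a minimal sketch using root credentials; the username and password are placeholders for whatever your instance actually uses.
+
+```rust
+use surrealdb::opt::auth::Root;
+
+// Sign in as a root user before selecting a namespace and database.
+// Replace these placeholder credentials with your own.
+db.signin(Root {
+    username: "root",
+    password: "secret",
+})
+.await?;
+```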
+
+After connecting, we will select a namespace and database name, such as `ns` and `db`.
+
+```rust
+db.use_ns("ns").use_db("db").await?;
+```
+
+
+## Create a vector table and index
+
+Next we'll create a table called `document` to store documents and embeddings, along with an index for the embeddings. The statements look like this:
+
+```surql
+DEFINE TABLE document;
+DEFINE FIELD text ON document TYPE string;
+DEFINE FIELD embedding ON document TYPE array;
+DEFINE INDEX hnsw_embed ON document FIELDS embedding HNSW DIMENSION 1024 DIST COSINE;
+```
+
+
+The important piece to understand is the relationship between the `embedding` field, a simple array of floats, and the index that we have given the name `hnsw_embed`. The size of the vector (1024 here) represents the number of dimensions in the embedding, chosen to match Mistral AI's `mistral-embed` model, which [produces embeddings of length 1024](https://docs.mistral.ai/getting-started/models/models_overview/#premier-models).
+
+The [HNSW index](https://www.surrealdb.com/docs/surrealdb/reference-guide/vector-search#vector-indexes) is not strictly necessary to use the KNN operator (`<||>`) to find an embedding's closest neighbours. For our small sample we will use the simple [brute force method](https://www.surrealdb.com/docs/surrealql/operators#brute-force-method), which takes [a distance function](https://www.surrealdb.com/docs/surrealdb/reference-guide/vector-search#computation-on-vectors-vector-package-of-functions) such as Euclidean, Hamming, and so on. The following query uses cosine distance to find the four closest neighbours.
+
+```surql
+SELECT
+ text,
+    vector::distance::knn() AS distance
+FROM document
+WHERE embedding <|4,COSINE|> $embeds
+ORDER BY distance;
+```
+
+As the dataset grows, however, the syntax can be changed to use [the HNSW index](https://www.surrealdb.com/docs/surrealql/operators#hnsw-method) by replacing the distance function with a number that represents the size of the dynamic candidate list used during the search. This index is recommended when a small loss of accuracy is acceptable in order to preserve performance.
+
+```surql
+SELECT
+ text,
+    vector::distance::knn() AS distance
+FROM document
+WHERE embedding <|4,40|> $embeds
+ORDER BY distance;
+```
+
+Another option is to use the [MTREE](https://www.surrealdb.com/docs/surrealql/operators#mtree-index-method) index method, sketched below.
+
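+The following is a rough sketch of what that could look like, not code from the original setup: the index name `mtree_embed` is just an example. With an MTREE index in place, the KNN operator only needs the number of neighbours to return, e.g. `WHERE embedding <|4|> $embeds`.
+
+```rust
+// Sketch only: define an MTREE index over the same `embedding` field
+// (the index name `mtree_embed` is illustrative). `.check()` surfaces
+// any error returned by the statement.
+db.query("DEFINE INDEX mtree_embed ON document FIELDS embedding MTREE DIMENSION 1024 DIST COSINE;")
+    .await?
+    .check()?;
+```
+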
+Inside the Rust SDK we can put all four of these definition statements inside a single `.query()` call and then add a few lines to check whether any of them returned an error.
+
+```rust
+let mut res = db
+ .query(
+ "DEFINE TABLE document;
+DEFINE FIELD text ON document TYPE string;
+DEFINE FIELD embedding ON document TYPE array;
+DEFINE INDEX hnsw_embed ON document FIELDS embedding HNSW DIMENSION 1024 DIST COSINE;",
+ )
+ .await?;
+for (index, error) in res.take_errors() {
+ println!("Error in query {index}: {error}");
+}
+```
+
+
+## Generate Mistral AI embeddings
+
+At this point, you will need a [key](https://console.mistral.ai/api-keys) to interact with Mistral AI's platform. Mistral AI offers a free tier for experimentation, and once you have signed up you can create a key and use it in the code below.
+
+The code on this page will still compile without a valid key, but the request to the Mistral AI API will return the following error message.
+
+```
+Error: ApiError: 401 Unauthorized: {"detail":"Unauthorized"}
+```
+
+
+The best way to provide the key is through an environment variable, which we will read into a static called `KEY`. By default the client looks for one called `MISTRAL_API_KEY`, though you can change this when setting up the Mistral AI Rust client if you like.
+
+```rust
+// Looks for MISTRAL_API_KEY
+let client = Client::new(Some(KEY.to_string()), None, None, None)?;
+// Looks for OTHER_ENV_VAR
+let client = Client::new(Some(KEY.to_string()), Some("OTHER_ENV_VAR".to_string()), None, None)?;
+```
+
+Using a `LazyLock` lets us read it via the `std::env::var()` function the first time it is accessed. You can of course simply paste the key into a `const` when first testing, but remember to never hard-code API keys in production code.
+
+```rust
+use std::sync::LazyLock;
+
+static KEY: LazyLock<String> = LazyLock::new(|| {
+    std::env::var("MISTRAL_API_KEY").unwrap()
+});
+```
+
+
+And then run the code like this:
+
+```bash
+MISTRAL_API_KEY=whateverthekeyis cargo run
+```
+
+
+Or like this if you are using PowerShell on Windows.
+
+```powershell
+$env:MISTRAL_API_KEY = "whateverthekeyis"
+cargo run
+```
+
+
+We can also create a `const MODEL` to hold the Mistral AI model used, which in this case is an `EmbedModel::MistralEmbed`.
+
+```rust
+const MODEL: EmbedModel = EmbedModel::MistralEmbed;
+```
+
+Inside `main()`, we will then [create a client](https://docs.rs/mistralai-client/0.14.0/mistralai_client/v1/client/struct.Client.html#method.new) from the `mistralai-client` crate.
+
+```rust
+let client = Client::new(Some(KEY.to_string()), None, None, None)?;
+```
+
+
+
+We'll use that to generate a Mistral AI embedding using the [`mistral-embed`](https://docs.mistral.ai/getting-started/models/models_overview/#premier-models) model. The `mistralai-client` crate has both sync and async functions that take a `Vec<String>`, and since SurrealDB uses the tokio runtime, we'll call the async `.embeddings_async()` method.
+
+```rust
+let input = vec!["Joram is the main character in the Darksword Trilogy.".to_string()];
+
+let result = client.embeddings_async(MODEL, input, None).await?;
+println!("{:?}", result);
+```
+
+
+The output in your console should show a massive number of floats, 1024 of them to be precise. That's the embedding for this input!
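+
+If you want to verify that, a quick sanity check (a sketch building on the code above, not part of the original flow) can assert the expected length:
+
+```rust
+// mistral-embed produces 1024-dimensional embeddings, matching the
+// DIMENSION 1024 used in the index definition earlier.
+assert_eq!(result.data[0].embedding.len(), 1024);
+```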
+
+## Store embeddings in database
+
+Now that we have the embedding returned from the Mistral AI client, we can store it in the database. The [response](https://docs.rs/mistralai-client/0.14.0/mistralai_client/v1/embedding/struct.EmbeddingResponse.html) returned from the `mistralai-client` crate looks like this, with a `Vec` of `EmbeddingResponseDataItem` structs, each of which holds a `Vec<f32>`.
+
+```rust
+pub struct EmbeddingResponse {
+ pub id: String,
+ pub object: String,
+ pub model: EmbedModel,
+    pub data: Vec<EmbeddingResponseDataItem>,
+ pub usage: ResponseUsage,
+}
+
+pub struct EmbeddingResponseDataItem {
+ pub index: u32,
+    pub embedding: Vec<f32>,
+ pub object: String,
+}
+```
+
+
+We know that our simple request only returned a single embedding, so `.remove(0)` will do the job. In a more complex codebase you would probably opt for a match on `.get(0)` (or `.first()`) to handle the case where no embedding was returned; a sketch of that follows the snippet below.
+
+```rust
+let embeds = result.data.remove(0).embedding;
+```
+
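+A minimal sketch of that more defensive approach, assuming the same `result` and `anyhow` error type as above, might look like this:
+
+```rust
+// Handle the case where the API returned no embeddings instead of panicking.
+let embeds = match result.data.first() {
+    Some(item) => item.embedding.clone(),
+    None => return Err(anyhow::anyhow!("no embedding returned by Mistral AI")),
+};
+```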
+
+There are a [number of ways](https://www.surrealdb.com/docs/sdk/rust/concepts/flexible-typing) to work with or avoid structs when using the Rust SDK, but we'll just go with two basic structs: one to represent the input into a `.create()` statement, which will implement `Serialize`, and another that implements `Deserialize` to show the result.
+
+```rust
+#[derive(Serialize)]
+struct DocumentInput {
+ text: String,
+    embedding: Vec<f32>,
+}
+
+#[derive(Debug, Deserialize)]
+struct Document {
+ id: RecordId,
+    embedding: Vec<f32>,
+ text: String,
+}
+```
+
+
+Once that is done, we can print out the created documents as a `Document` struct. We'll adjust the code a bit so that `input` starts as a `&str`, which is turned into a `String` when requesting the embedding and reused when creating the `Document`.
+
+```rust
+let input = "Octopuses solve puzzles and escape enclosures, showing advanced intelligence.";
+
+let mut result = client
+ .embeddings_async(MODEL, vec![input.to_string()], None)
+ .await?;
+let embeds = result.data.remove(0).embedding;
+let in_db = db
+ .create::