From fac34010abf5c62312b8ab64809f6c403847b60e Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 21 Nov 2024 14:09:32 +0000 Subject: [PATCH 01/23] Moving Error section away from "Query D1". --- .../docs/d1/observability/debug-d1.mdx | 48 ++++++++++++++++++- 1 file changed, 46 insertions(+), 2 deletions(-) diff --git a/src/content/docs/d1/observability/debug-d1.mdx b/src/content/docs/d1/observability/debug-d1.mdx index 2fddfad7f26d3e4..98f2209c3410aa5 100644 --- a/src/content/docs/d1/observability/debug-d1.mdx +++ b/src/content/docs/d1/observability/debug-d1.mdx @@ -10,7 +10,7 @@ D1 allows you to capture exceptions and log errors returned when querying a data ## Handle errors -The D1 [client API](/d1/build-with-d1/d1-client-api/) returns detailed [error messages](/d1/build-with-d1/d1-client-api/#errors) within an `Error` object. +The D1 [Workers Binding API](/d1/worker-api/) returns detailed error messages within an `Error` object. To ensure you are capturing the full error message, log or return `e.message` as follows: @@ -29,6 +29,50 @@ try { */ ``` +### Errors + +The [`stmt.`](/d1/worker-api/prepared-statements/) and [`db.`](/d1/worker-api/d1-database/) methods throw an [Error object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error) whenever an error occurs. + +:::note +Prior to [`wrangler` 3.1.1](https://github.com/cloudflare/workers-sdk/releases/tag/wrangler%403.1.1), D1 JavaScript errors used the [cause property](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error/cause) for detailed error messages. + +To inspect these errors when using older versions of `wrangler`, you should log `error?.cause?.message`. +::: + +To capture exceptions, log the `Error.message` value. For example, the code below has a query with an invalid keyword - `INSERTZ` instead of `INSERT`: + +```js +try { + // This is an intentional mispelling + await db.exec("INSERTZ INTO my_table (name, employees) VALUES ()"); +} catch (e: any) { + console.error({ + message: e.message + }); +} +``` + +The code above throws the following error message: + +```json +{ + "message": "D1_EXEC_ERROR: Error in line 1: INSERTZ INTO my_table (name, employees) VALUES (): sql error: near \"INSERTZ\": syntax error in INSERTZ INTO my_table (name, employees) VALUES () at offset 0" +} +``` + +### Error list + +D1 returns the following error constants, in addition to the extended (detailed) error message: + +| Message | Cause | +| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `D1_ERROR` | Generic error. | +| `D1_TYPE_ERROR` | Returned when there is a mismatch in the type between a column and a value. A common cause is supplying an `undefined` variable (unsupported) instead of `null`. | +| `D1_COLUMN_NOTFOUND` | Column not found. | +| `D1_DUMP_ERROR` | Database dump error. | +| `D1_EXEC_ERROR` | Exec error in line x: y error. | + + ## View logs View a stream of live logs from your Worker by using [`wrangler tail`](/workers/observability/logs/real-time-logs#view-logs-using-wrangler-tail) or via the [Cloudflare dashboard](/workers/observability/logs/real-time-logs#view-logs-from-the-dashboard). @@ -43,7 +87,7 @@ You should include as much of the following in any bug report: * The ID of your database. Use `wrangler d1 list` to match a database name to its ID. * The query (or queries) you ran when you encountered an issue. 
Ensure you redact any personally identifying information (PII). -* The Worker code that makes the query, including any calls to `bind()` using the [client API](/d1/build-with-d1/d1-client-api/). +* The Worker code that makes the query, including any calls to `bind()` using the [Workers Binding API](/d1/worker-api/). * The full error text, including the content of [`error.cause.message`](#handle-errors). ## Related resources From 1b1583a6a0583fec2ca599c2b3df350869be2e2d Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 21 Nov 2024 14:10:54 +0000 Subject: [PATCH 02/23] Redirecting Query D1 to Workers Binding API section. --- public/_redirects | 1 + 1 file changed, 1 insertion(+) diff --git a/public/_redirects b/public/_redirects index a7aceb5ef52e64f..030a94ee947f2cf 100644 --- a/public/_redirects +++ b/public/_redirects @@ -259,6 +259,7 @@ # D1 /d1/client-api/ /d1/build-with-d1/d1-client-api/ 301 +/d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 /d1/learning/debug-d1/ /d1/observability/debug-d1/ 301 /d1/learning/using-indexes/ /d1/build-with-d1/use-indexes/ 301 From 6db092ae2c17aafa2f3376a934197642b0532418 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 21 Nov 2024 14:41:55 +0000 Subject: [PATCH 03/23] Replacing all interlinks to Query D1 chapter, to the new structure. --- src/content/changelogs/d1.yaml | 22 +++++++++---------- .../docs/d1/build-with-d1/foreign-keys.mdx | 6 ++--- .../d1/build-with-d1/import-export-data.mdx | 4 ++-- .../docs/d1/build-with-d1/query-json.mdx | 6 ++--- src/content/docs/d1/examples/d1-and-hono.mdx | 2 +- src/content/docs/d1/examples/d1-and-remix.mdx | 2 +- .../examples/query-d1-from-python-workers.mdx | 2 +- src/content/docs/d1/get-started.mdx | 8 +++---- src/content/docs/d1/index.mdx | 4 ++-- .../d1/observability/metrics-analytics.mdx | 4 ++-- .../tutorials/build-a-comments-api/index.mdx | 2 +- .../build-an-api-to-access-d1/index.mdx | 10 ++++----- .../docs/d1/worker-api/d1-database.mdx | 2 +- .../d1/worker-api/prepared-statements.mdx | 6 ++--- .../docs/d1/worker-api/return-object.mdx | 4 ++-- src/content/docs/pages/functions/bindings.mdx | 4 ++-- .../functions/wrangler-configuration.mdx | 2 +- .../docs/workers/platform/storage-options.mdx | 2 +- .../docs/workers/runtime-apis/bindings/D1.mdx | 2 +- .../static-assets/compatibility-matrix.mdx | 2 +- src/content/docs/workers/wrangler/api.mdx | 2 +- .../docs/workers/wrangler/configuration.mdx | 2 +- src/content/partials/workers/d1-pricing.mdx | 4 ++-- 23 files changed, 51 insertions(+), 53 deletions(-) diff --git a/src/content/changelogs/d1.yaml b/src/content/changelogs/d1.yaml index 3af43dac56c0f8f..7667f247df6a86a 100644 --- a/src/content/changelogs/d1.yaml +++ b/src/content/changelogs/d1.yaml @@ -19,7 +19,7 @@ entries: - publish_date: "2024-07-26" title: Fixed bug in TypeScript typings for run() API description: |- - The `run()` method as part of the [D1 Client API](/d1/build-with-d1/d1-client-api/) had an incorrect (outdated) type definition, which has now been addressed as of [`@cloudflare/workers-types`](https://www.npmjs.com/package/@cloudflare/workers-types) version `4.20240725.0`. + The `run()` method as part of the [D1 Client API](/d1/worker-api/) had an incorrect (outdated) type definition, which has now been addressed as of [`@cloudflare/workers-types`](https://www.npmjs.com/package/@cloudflare/workers-types) version `4.20240725.0`. 
The correct type definition is `stmt.run(): D1Result`, as `run()` returns the result rows of the query. The previously _incorrect_ type definition was `stmt.run(): D1Response`, which only returns query metadata and no results. @@ -28,7 +28,7 @@ entries: description: |- Previously, D1's [HTTP API](/api/operations/cloudflare-d1-query-database) returned a HTTP `500 Internal Server` error for queries that came in while a D1 database was overloaded. These requests now correctly return a `HTTP 429 Too Many Requests` error. - D1's [Workers API](/d1/build-with-d1/d1-client-api/) is unaffected by this change. + D1's [Workers API](/d1/worker-api/) is unaffected by this change. - publish_date: "2024-04-30" title: D1 alpha databases will stop accepting live SQL queries on August 15, 2024 @@ -42,7 +42,7 @@ entries: description: |- Previously, D1's [HTTP API](/api/operations/cloudflare-d1-query-database) returned a HTTP `500 Internal Server` error for an invalid query. An invalid SQL query now correctly returns a `HTTP 400 Bad Request` error. - D1's [Workers API](/d1/build-with-d1/d1-client-api/) is unaffected by this change. + D1's [Workers API](/d1/worker-api/) is unaffected by this change. - publish_date: "2024-04-05" title: D1 alpha databases are deprecated @@ -81,24 +81,24 @@ entries: - publish_date: "2024-02-16" title: API changes to `run()` description: |- - A previous change (made on 2024-02-13) to the `run()` [query statement method](/d1/build-with-d1/d1-client-api/#await-stmtrun) has been reverted. + A previous change (made on 2024-02-13) to the `run()` [query statement method](/d1/worker-api/prepared-statements/#run) has been reverted. - `run()` now returns a `D1Result`, including the result rows, matching its original behaviour prior to the change on 2024-02-13. + `run()` now returns a `D1Result`, including the result rows, matching its original behavior prior to the change on 2024-02-13. - Future change to `run()` to return a [`D1ExecResult`](/d1/build-with-d1/d1-client-api/#return-object), as originally intended and documented, will be gated behind a [compatibility date](/workers/configuration/compatibility-dates/) as to avoid breaking existing Workers relying on the way `run()` currently works. + Future change to `run()` to return a [`D1ExecResult`](/d1/worker-api/return-object/#d1execresult), as originally intended and documented, will be gated behind a [compatibility date](/workers/configuration/compatibility-dates/) as to avoid breaking existing Workers relying on the way `run()` currently works. - publish_date: "2024-02-13" title: API changes to `raw()`, `all()` and `run()` description: |- - D1's `raw()`, `all()` and `run()` [query statement methods](/d1/build-with-d1/d1-client-api/#query-statement-methods) have been updated to reflect their intended behaviour and improve compatibility with ORM libraries. + D1's `raw()`, `all()` and `run()` [query statement methods](/d1/worker-api/prepared-statements/) have been updated to reflect their intended behavior and improve compatibility with ORM libraries. `raw()` now correctly returns results as an array of arrays, allowing the correct handling of duplicate column names (such as when joining tables), as compared to `all()`, which is unchanged and returns an array of objects. To include an array of column names in the results when using `raw()`, use `raw({columnNames: true})`. 
- `run()` no longer incorrectly returns a `D1Result` and instead returns a [`D1ExecResult`](/d1/build-with-d1/d1-client-api/#return-object) as originally intended and documented. + `run()` no longer incorrectly returns a `D1Result` and instead returns a [`D1ExecResult`](/d1/worker-api/return-object/#d1execresult) as originally intended and documented. This may be a breaking change for some applications that expected `raw()` to return an array of objects. - Refer to [D1 client API](/d1/build-with-d1/d1-client-api/) to review D1's query methods, return types and TypeScript support in detail. + Refer to [D1 client API](/d1/worker-api/) to review D1's query methods, return types and TypeScript support in detail. - publish_date: "2024-01-18" title: Support for LIMIT on UPDATE and DELETE statements @@ -140,7 +140,7 @@ entries: description: |- D1 now returns a count of `rows_written` and `rows_read` for every query executed, allowing you to assess the cost of query for both [pricing](/d1/platform/pricing/) and [index optimization](/d1/build-with-d1/use-indexes/) purposes. - The `meta` object returned in [D1's Client API](/d1/build-with-d1/d1-client-api/) contains a total count of the rows read (`rows_read`) and rows written (`rows_written`) by that query. For example, a query that performs a full table scan (for example, `SELECT * FROM users`) from a table with 5000 rows would return a `rows_read` value of `5000`: + The `meta` object returned in [D1's Client API](/d1/worker-api/return-object/#d1result) contains a total count of the rows read (`rows_read`) and rows written (`rows_written`) by that query. For example, a query that performs a full table scan (for example, `SELECT * FROM users`) from a table with 5000 rows would return a `rows_read` value of `5000`: ```json "meta": { "duration": 0.20472300052642825, @@ -200,7 +200,7 @@ entries: - publish_date: "2023-06-12" title: Deprecating Error.cause description: |- - As of [`wrangler v3.1.1`](https://github.com/cloudflare/workers-sdk/releases/tag/wrangler%403.1.1) the [D1 client API](/d1/build-with-d1/d1-client-api/) now returns [detailed error messages](/d1/build-with-d1/d1-client-api/#errors) within the top-level `Error.message` property, and no longer requires developers to inspect the `Error.cause.message` property. + As of [`wrangler v3.1.1`](https://github.com/cloudflare/workers-sdk/releases/tag/wrangler%403.1.1) the [D1 client API](/d1/worker-api/) now returns [detailed error messages](/d1/observability/debug-d1/) within the top-level `Error.message` property, and no longer requires developers to inspect the `Error.cause.message` property. To facilitate a transition from the previous `Error.cause` behaviour, detailed error messages will continue to be populated within `Error.cause` as well as the top-level `Error` object until approximately July 14th, 2023. Future versions of both `wrangler` and the D1 client API will no longer populate `Error.cause` after this date. 
- publish_date: "2023-05-19" diff --git a/src/content/docs/d1/build-with-d1/foreign-keys.mdx b/src/content/docs/d1/build-with-d1/foreign-keys.mdx index e9ff25a87db6b35..27d274f2c77a4ab 100644 --- a/src/content/docs/d1/build-with-d1/foreign-keys.mdx +++ b/src/content/docs/d1/build-with-d1/foreign-keys.mdx @@ -16,7 +16,7 @@ By default, D1 enforces that foreign key constraints are valid within all querie ## Defer foreign key constraints -When running a [query](/d1/build-with-d1/d1-client-api/), [migration](/d1/reference/migrations/) or [importing data](/d1/build-with-d1/import-export-data/) against a D1 database, there may be situations in which you need to disable foreign key validation during table creation or changes to your schema. +When running a [query](/d1/worker-api/), [migration](/d1/reference/migrations/) or [importing data](/d1/build-with-d1/import-export-data/) against a D1 database, there may be situations in which you need to disable foreign key validation during table creation or changes to your schema. D1's foreign key enforcement is equivalent to SQLite's `PRAGMA foreign_keys = on` directive. Because D1 runs every query inside an implicit transaction, user queries cannot change this during a query or migration. @@ -87,7 +87,7 @@ There are five actions you can set when defining the `ON UPDATE` and/or `ON DELE :::caution[CASCADE usage] -Although `CASCADE` can be the desired behavior in some cases, deleting child rows across tables can have undesirable effects and/or result in unintended side effects for your users. +Although `CASCADE` can be the desired behavior in some cases, deleting child rows across tables can have undesirable effects and/or result in unintended side effects for your users. ::: In the following example, deleting a user from the `users` table will delete all related rows in the `scores` table as you have defined `ON DELETE CASCADE`. Delete all related rows in the `scores` table if you do not want to retain the scores for any users you have deleted entirely. This might mean that *other* users can no longer look up or refer to scores that were still valid. @@ -110,5 +110,5 @@ CREATE TABLE scores ( ## Next Steps * Read the SQLite [`FOREIGN KEY`](https://www.sqlite.org/foreignkeys.html) documentation. -* Learn how to [use the D1 client API](/d1/build-with-d1/d1-client-api/) from within a Worker. +* Learn how to [use the D1 Workers Binding API](/d1/worker-api/) from within a Worker. * Understand how [database migrations work](/d1/reference/migrations/) with D1. diff --git a/src/content/docs/d1/build-with-d1/import-export-data.mdx b/src/content/docs/d1/build-with-d1/import-export-data.mdx index af924164e6184d4..a4c14e898a80b9e 100644 --- a/src/content/docs/d1/build-with-d1/import-export-data.mdx +++ b/src/content/docs/d1/build-with-d1/import-export-data.mdx @@ -69,7 +69,7 @@ The `_cf_KV` table is a reserved table used by D1's underlying storage system. I ::: -From here, you can now query our new table from our Worker [using the D1 client API](/d1/build-with-d1/d1-client-api/). +From here, you can now query our new table from our Worker [using the D1 Workers Binding API](/d1/worker-api/). :::caution[Known limitations] @@ -201,5 +201,5 @@ VALUES ## Next Steps - Read the SQLite [`CREATE TABLE`](https://www.sqlite.org/lang_createtable.html) documentation. -- Learn how to [use the D1 client API](/d1/build-with-d1/d1-client-api/) from within a Worker. +- Learn how to [use the D1 Workers Binding API](/d1/worker-api/) from within a Worker. 
- Understand how [database migrations work](/d1/reference/migrations/) with D1. diff --git a/src/content/docs/d1/build-with-d1/query-json.mdx b/src/content/docs/d1/build-with-d1/query-json.mdx index c57e862aa6cffba..6b6832a152da3eb 100644 --- a/src/content/docs/d1/build-with-d1/query-json.mdx +++ b/src/content/docs/d1/build-with-d1/query-json.mdx @@ -19,7 +19,7 @@ This allows you to more precisely query over data and reduce the result set your ## Types -JSON data is stored as a `TEXT` column in D1. JSON types follow the same [type conversion rules](/d1/build-with-d1/d1-client-api/#type-conversion) as D1 in general, including: +JSON data is stored as a `TEXT` column in D1. JSON types follow the same [type conversion rules](/d1/worker-api/#type-conversion) as D1 in general, including: * A JSON null is treated as a D1 `NULL`. * A JSON number is treated as an `INTEGER` or `REAL`. @@ -184,7 +184,7 @@ To replace an existing value, use `json_replace()`, which will overwrite an exis Use `json_each` to expand an array into multiple rows. This can be useful when composing a `WHERE column IN (?)` query over several values. For example, if you wanted to update a list of users by their integer `id`, use `json_each` to return a table with each value as a column called `value`: ```sql -UPDATE users +UPDATE users SET last_audited = '2023-05-16T11:24:08+00:00' WHERE id IN (SELECT value FROM json_each('[183183, 13913, 94944]')) ``` @@ -208,7 +208,7 @@ key|value|type|id|fullkey|path 2|94944|integer|3|$[2]|$ ``` -You can use `json_each` with D1's [client API](/d1/build-with-d1/d1-client-api/) in a Worker by creating a statement and using `JSON.stringify` to pass an array as a [bound parameter](/d1/build-with-d1/d1-client-api/#parameter-binding): +You can use `json_each` with [D1 Workers Binding API](/d1/worker-api/) in a Worker by creating a statement and using `JSON.stringify` to pass an array as a [bound parameter](/d1/worker-api/d1-database/#guidance): ```ts const stmt = context.env.DB diff --git a/src/content/docs/d1/examples/d1-and-hono.mdx b/src/content/docs/d1/examples/d1-and-hono.mdx index 27760a04f32b9ad..9a2784c5f6e4059 100644 --- a/src/content/docs/d1/examples/d1-and-hono.mdx +++ b/src/content/docs/d1/examples/d1-and-hono.mdx @@ -19,7 +19,7 @@ Hono is a fast web framework for building API-first applications, and it include When using Workers: * Ensure you have configured [`wrangler.toml`](/d1/get-started/#3-bind-your-worker-to-your-d1-database) to bind your D1 database to your Worker. -* You can access your D1 databases via Hono's [`Context`](https://hono.dev/api/context) parameter: [bindings](https://hono.dev/getting-started/cloudflare-workers#bindings) are exposed on `context.env`. If you configured a [binding](/pages/functions/bindings/#d1-databases) named `DB`, then you would access D1's [client API](/d1/build-with-d1/d1-client-api/#query-statement-methods) methods via `c.env.DB`. +* You can access your D1 databases via Hono's [`Context`](https://hono.dev/api/context) parameter: [bindings](https://hono.dev/getting-started/cloudflare-workers#bindings) are exposed on `context.env`. If you configured a [binding](/pages/functions/bindings/#d1-databases) named `DB`, then you would access [D1 Workers Binding API](/d1/worker-api/prepared-statements/) methods via `c.env.DB`. * Refer to the Hono documentation for [Cloudflare Workers](https://hono.dev/getting-started/cloudflare-workers). 
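A minimal sketch of a Hono route on Workers querying D1 might look like the following (the `DB` binding name, `users` table, and route path are assumptions for illustration, not part of Hono's or D1's API):

```ts
import { Hono } from "hono";

// Assumes a D1 binding named DB configured in wrangler.toml.
// D1Database comes from @cloudflare/workers-types.
type Bindings = { DB: D1Database };

const app = new Hono<{ Bindings: Bindings }>();

app.get("/api/users", async (c) => {
  // Bindings are exposed on c.env, so c.env.DB gives you the D1 Workers Binding API.
  const { results } = await c.env.DB.prepare("SELECT * FROM users LIMIT 10").all();
  return c.json(results);
});

export default app;
```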
If you are using [Pages Functions](/pages/functions/): diff --git a/src/content/docs/d1/examples/d1-and-remix.mdx b/src/content/docs/d1/examples/d1-and-remix.mdx index 0150095ac2c2031..3cf86ffd579eef4 100644 --- a/src/content/docs/d1/examples/d1-and-remix.mdx +++ b/src/content/docs/d1/examples/d1-and-remix.mdx @@ -25,7 +25,7 @@ To set up a new Remix site on Cloudflare Pages that can query D1: The following example shows you how to define a Remix [`loader`](https://remix.run/docs/en/main/route/loader) that has a binding to a D1 database. * Bindings are passed through on the `context.env` parameter passed to a `LoaderFunction`. -* If you configured a [binding](/pages/functions/bindings/#d1-databases) named `DB`, then you would access D1's [client API](/d1/build-with-d1/d1-client-api/#query-statement-methods) methods via `context.env.DB`. +* If you configured a [binding](/pages/functions/bindings/#d1-databases) named `DB`, then you would access [D1 Workers Binding API](/d1/worker-api/prepared-statements/) methods via `context.env.DB`. diff --git a/src/content/docs/d1/examples/query-d1-from-python-workers.mdx b/src/content/docs/d1/examples/query-d1-from-python-workers.mdx index 9824524cc9e3019..ab4e19bc7375e13 100644 --- a/src/content/docs/d1/examples/query-d1-from-python-workers.mdx +++ b/src/content/docs/d1/examples/query-d1-from-python-workers.mdx @@ -120,5 +120,5 @@ If you receive an error deploying: ## Next steps - Refer to [Workers Python documentation](/workers/languages/python/) to learn more about how to use Python in Workers. -- Review the [D1 client API](/d1/build-with-d1/d1-client-api/) and how to query D1 databases. +- Review the [D1 Workers Binding API](/d1/worker-api/) and how to query D1 databases. - Learn [how to import data](/d1/build-with-d1/import-export-data/) to your D1 database. diff --git a/src/content/docs/d1/get-started.mdx b/src/content/docs/d1/get-started.mdx index e8e1643f77d300f..ca298fffa2a00df 100644 --- a/src/content/docs/d1/get-started.mdx +++ b/src/content/docs/d1/get-started.mdx @@ -187,14 +187,12 @@ You create bindings by updating your `wrangler.toml` file. - The value (string) you set for `binding` is the **binding name**, and is used to reference this database in your Worker. In this tutorial, name your binding `DB`. - The binding name must be [a valid JavaScript variable name](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Grammar_and_types#variables). For example, `binding = "MY_DB"` or `binding = "productionDB"` would both be valid names for the binding. - - Your binding is available in your Worker at `env.` and the D1 [client API](/d1/build-with-d1/d1-client-api/) is exposed on this binding. + - Your binding is available in your Worker at `env.` and the D1 [Workers Binding API](/d1/worker-api/) is exposed on this binding. :::note - -When you execute the `wrangler d1 create` command, the client API package (which implements the D1 API and database class) is automatically installed. For more information on the D1 Client API, refer to [D1 Client API](/d1/build-with-d1/d1-client-api/). - +When you execute the `wrangler d1 create` command, the client API package (which implements the D1 API and database class) is automatically installed. For more information on the D1 Workers Binding API, refer to [Workers Binding API](/d1/worker-api/). ::: You can also bind your D1 database to a [Pages Function](/pages/functions/). For more information, refer to [Functions Bindings for D1](/pages/functions/bindings/#d1-databases). 
@@ -336,7 +334,7 @@ After you have set up your database, run an SQL query from within your Worker. In the code above, you: 1. Define a binding to your D1 database in your TypeScript code. This binding matches the `binding` value you set in `wrangler.toml` under `[[d1_databases]]`. - 2. Query your database using `env.DB.prepare` to issue a [prepared query](/d1/build-with-d1/d1-client-api/) with a placeholder (the `?` in the query). + 2. Query your database using `env.DB.prepare` to issue a [prepared query](/d1/worker-api/d1-database/#prepare) with a placeholder (the `?` in the query). 3. Call `bind()` to safely and securely bind a value to that placeholder. In a real application, you would allow a user to define the `CompanyName` they want to list results for. Using `bind()` prevents users from executing arbitrary SQL (known as "SQL injection") against your application and deleting or otherwise modifying your database. 4. Execute the query by calling `all()` to return all rows (or none, if the query returns none). 5. Return your query results, if any, in JSON format with `Response.json(results)`. diff --git a/src/content/docs/d1/index.mdx b/src/content/docs/d1/index.mdx index cdf4ebab1499587..575899c16ff271e 100644 --- a/src/content/docs/d1/index.mdx +++ b/src/content/docs/d1/index.mdx @@ -25,7 +25,7 @@ Create new serverless SQL databases to query from your Workers and Pages project D1 is Cloudflare’s native serverless database. D1 allows you to build applications that handle large amounts of users at no extra cost. With D1, you can restore your database to any minute within the last 30 days. -Create your first D1 database by [following the Get started guide](/d1/get-started/), learn how to [import data into a database](/d1/build-with-d1/import-export-data/), and how to [interact with your database](/d1/build-with-d1/d1-client-api/) directly from [Workers](/workers/) or [Pages](/pages/functions/bindings/#d1-databases). +Create your first D1 database by [following the Get started guide](/d1/get-started/), learn how to [import data into a database](/d1/build-with-d1/import-export-data/), and how to [interact with your database](/d1/worker-api/) directly from [Workers](/workers/) or [Pages](/pages/functions/bindings/#d1-databases). *** @@ -38,7 +38,7 @@ Create your first D1 database, establish a schema, import data and query D1 dire - + Execute SQL with SQLite's SQL compatibility and D1 Client API. diff --git a/src/content/docs/d1/observability/metrics-analytics.mdx b/src/content/docs/d1/observability/metrics-analytics.mdx index cfc3c73c8e9ab35..f1b84d49bf5536f 100644 --- a/src/content/docs/d1/observability/metrics-analytics.mdx +++ b/src/content/docs/d1/observability/metrics-analytics.mdx @@ -27,7 +27,7 @@ Metrics can be queried (and are retained) for the past 31 days. ### Row counts -D1 returns the number of rows read, rows written (or both) in response to each individual query via [the client API](/d1/build-with-d1/d1-client-api/#return-object). +D1 returns the number of rows read, rows written (or both) in response to each individual query via [the Workers Binding API](/d1/worker-api/return-object/). Row counts are a precise count of how many rows were read (scanned) or written by that query. Inspect row counts to understand the performance and cost of a given query, including whether you can reduce the rows read [using indexes](/d1/build-with-d1/use-indexes/). 
Use query counts to understand the total volume of traffic against your databases and to discern which databases are actively in-use. @@ -142,7 +142,7 @@ query { D1 provides metrics that let you understand and debug query performance. You can access these via GraphQL's `d1QueriesAdaptiveGroups` or `wrangler d1 insights` command. -D1 captures your query strings to make it easier to analyze metrics across query executions. [Bound parameters](/d1/build-with-d1/d1-client-api/#parameter-binding) are not captured to remove any sensitive information. +D1 captures your query strings to make it easier to analyze metrics across query executions. [Bound parameters](/d1/worker-api/prepared-statements/#guidance) are not captured to remove any sensitive information. :::note diff --git a/src/content/docs/d1/tutorials/build-a-comments-api/index.mdx b/src/content/docs/d1/tutorials/build-a-comments-api/index.mdx index 113e93918017b13..8ab1409e0150162 100644 --- a/src/content/docs/d1/tutorials/build-a-comments-api/index.mdx +++ b/src/content/docs/d1/tutorials/build-a-comments-api/index.mdx @@ -159,7 +159,7 @@ app.get("/api/posts/:slug/comments", async (c) => { }); ``` -The above code makes use of the `prepare`, `bind`, and `all` functions on a D1 binding to prepare and execute a SQL statement. Refer to [D1 client API](/d1/build-with-d1/d1-client-api/) for a list of all methods available. +The above code makes use of the `prepare`, `bind`, and `all` functions on a D1 binding to prepare and execute a SQL statement. Refer to [D1 Workers Binding API](/d1/worker-api/) for a list of all methods available. In this function, you accept a `slug` URL query parameter and set up a new SQL statement where you select all comments with a matching `post_slug` value to your query parameter. You can then return it as a JSON response. diff --git a/src/content/docs/d1/tutorials/build-an-api-to-access-d1/index.mdx b/src/content/docs/d1/tutorials/build-an-api-to-access-d1/index.mdx index 59913f5ced32bf6..94d69661a5384e5 100644 --- a/src/content/docs/d1/tutorials/build-an-api-to-access-d1/index.mdx +++ b/src/content/docs/d1/tutorials/build-an-api-to-access-d1/index.mdx @@ -291,8 +291,8 @@ Your application can now access the D1 database. In this step, you will update t // Update the API routes /** - * Executes the `stmt.all()` method. - * https://developers.cloudflare.com/d1/build-with-d1/d1-client-api/#await-stmtall + * Executes the `stmt.run()` method. + * https://developers.cloudflare.com/d1/worker-api/prepared-statements/#run */ app.post('/api/all', async (c) => { @@ -304,7 +304,7 @@ Your application can now access the D1 database. In this step, you will update t stmt = stmt.bind(params); } - const result = await stmt.all(); + const result = await stmt.run(); return c.json(result); } catch (err) { return c.json({ error: `Failed to run query: ${err}` }, 500); @@ -313,7 +313,7 @@ Your application can now access the D1 database. In this step, you will update t /** * Executes the `db.exec()` method. - * https://developers.cloudflare.com/d1/build-with-d1/d1-client-api/#await-dbexec + * https://developers.cloudflare.com/d1/worker-api/d1-database/#exec */ app.post('/api/exec', async (c) => { @@ -329,7 +329,7 @@ Your application can now access the D1 database. In this step, you will update t /** * Executes the `db.batch()` method. 
- * https://developers.cloudflare.com/d1/build-with-d1/d1-client-api/#dbbatch + * https://developers.cloudflare.com/d1/worker-api/d1-database/#batch */ app.post('/api/batch', async (c) => { diff --git a/src/content/docs/d1/worker-api/d1-database.mdx b/src/content/docs/d1/worker-api/d1-database.mdx index 98a3992b56582a3..e3b0ebe834f1e4e 100644 --- a/src/content/docs/d1/worker-api/d1-database.mdx +++ b/src/content/docs/d1/worker-api/d1-database.mdx @@ -244,7 +244,7 @@ return Response.json(returnValue); #### Guidance -- If an error occurs, an exception is thrown with the query and error messages, execution stops and further statements are not executed. Refer to [Errors](/d1/build-with-d1/d1-client-api/#errors) to learn more. +- If an error occurs, an exception is thrown with the query and error messages, execution stops and further statements are not executed. Refer to [Errors](/d1/observability/#errors) to learn more. - This method can have poorer performance (prepared statements can be reused in some cases) and, more importantly, is less safe. - Only use this method for maintenance and one-shot tasks (for example, migration jobs). - The input can be one or multiple queries separated by `\n`. diff --git a/src/content/docs/d1/worker-api/prepared-statements.mdx b/src/content/docs/d1/worker-api/prepared-statements.mdx index 6eb13db7d62de8f..89e0f0065effaaa 100644 --- a/src/content/docs/d1/worker-api/prepared-statements.mdx +++ b/src/content/docs/d1/worker-api/prepared-statements.mdx @@ -70,7 +70,7 @@ return Response.json(returnValue); #### Guidance - `results` is empty for write operations such as `UPDATE`, `DELETE`, or `INSERT`. -- When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to [`D1PreparedStatement::run`](#run) to return a typed result object. +- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::run`](#run) to return a typed result object. - [`D1PreparedStatement::run`](#run) is functionally equivalent to `D1PreparedStatement::all`, and can be treated as an alias. - You can choose to extract only the results you expect from the statement by simply returning the `results` property of the return object. @@ -158,7 +158,7 @@ return Response.json(returnValue) #### Guidance -- When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) to return a typed result array. +- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) to return a typed result array. ### `first()` @@ -219,5 +219,5 @@ return Response.json(returnValue) - If the query returns rows but `column` does not exist, then [`D1PreparedStatement::first`](#first) throws the `D1_ERROR` exception. - [`D1PreparedStatement::first`](#first) does not alter the SQL query. To improve performance, consider appending `LIMIT 1` to your statement. -- When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to [`D1PreparedStatement::first`](#first) to return a typed result object. +- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::first`](#first) to return a typed result object. 
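As a hedged illustration of that last point, a typed `first()` call might look like the sketch below (the `Customers` table and `CustomerRow` shape are assumed for the example, not part of the API):

```ts
// Illustrative row shape only.
type CustomerRow = { CustomerId: number; CompanyName: string };

const row = await env.DB
  .prepare("SELECT CustomerId, CompanyName FROM Customers WHERE CustomerId = ?")
  .bind(1)
  .first<CustomerRow>();

// row is typed as CustomerRow | null, since first() returns null when no rows match.
console.log(row?.CompanyName);
```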
diff --git a/src/content/docs/d1/worker-api/return-object.mdx b/src/content/docs/d1/worker-api/return-object.mdx index 85dd2448b365499..b8fd8d7425a6f69 100644 --- a/src/content/docs/d1/worker-api/return-object.mdx +++ b/src/content/docs/d1/worker-api/return-object.mdx @@ -12,7 +12,7 @@ Some D1 Worker Binding APIs return a typed object. | [`D1PreparedStatement::run`](/d1/worker-api/prepared-statements/#run), [`D1Database::batch`](/d1/worker-api/d1-database/#batch)| `D1Result` | | [`D1Database::exec`](/d1/worker-api/d1-database/#exec) | `D1ExecResult`| -## D1Result +## `D1Result` The methods [`D1PreparedStatement::run`](/d1/worker-api/prepared-statements/#run) and [`D1Database::batch`](/d1/worker-api/d1-database/#batch) return a typed [`D1Result`](#d1result) object for each query statement. This object contains: @@ -73,7 +73,7 @@ return Response.json(result) } ``` -## D1ExecResult +## `D1ExecResult` The method [`D1Database::exec`](/d1/worker-api/d1-database/#exec) returns a typed [`D1ExecResult`](#d1execresult) object for each query statement. This object contains: diff --git a/src/content/docs/pages/functions/bindings.mdx b/src/content/docs/pages/functions/bindings.mdx index 833d69aae99c0e4..1c3e76b7fb92b20 100644 --- a/src/content/docs/pages/functions/bindings.mdx +++ b/src/content/docs/pages/functions/bindings.mdx @@ -281,7 +281,7 @@ By default, Wrangler automatically persists data to local storage. For more info ::: -Refer to the [D1 client API documentation](/d1/build-with-d1/d1-client-api/) for the API methods available on your D1 binding. +Refer to the [D1 Workers Binding API documentation](/d1/worker-api/) for the API methods available on your D1 binding. @@ -683,7 +683,7 @@ export const onRequest: PagesFunction = async (context) => { }; ``` - + ### Interact with your Hyperdrive binding locally diff --git a/src/content/docs/pages/functions/wrangler-configuration.mdx b/src/content/docs/pages/functions/wrangler-configuration.mdx index 63ea6b910290d8c..0f82eedf0f4e95e 100644 --- a/src/content/docs/pages/functions/wrangler-configuration.mdx +++ b/src/content/docs/pages/functions/wrangler-configuration.mdx @@ -389,7 +389,7 @@ A [binding](/pages/functions/bindings/) enables your Pages Functions to interact ### D1 databases -[D1](/d1/) is Cloudflare's serverless SQL database. A Function can query a D1 database (or databases) by creating a [binding](/workers/runtime-apis/bindings/) to each database for D1's [client API](/d1/build-with-d1/d1-client-api/). +[D1](/d1/) is Cloudflare's serverless SQL database. A Function can query a D1 database (or databases) by creating a [binding](/workers/runtime-apis/bindings/) to each database for [D1 Workers Binding API](/d1/worker-api/). :::note diff --git a/src/content/docs/workers/platform/storage-options.mdx b/src/content/docs/workers/platform/storage-options.mdx index 181a7209b77cc3c..45aa2b104934a75 100644 --- a/src/content/docs/workers/platform/storage-options.mdx +++ b/src/content/docs/workers/platform/storage-options.mdx @@ -134,7 +134,7 @@ To get started with D1: - Read [the documentation](/d1) - Follow the [Get started guide](/d1/get-started/) to provision your first D1 database. -- Review the [D1 client API](/d1/build-with-d1/d1-client-api/). +- Review the [D1 Workers Binding API](/d1/worker-api/). :::note If your working data size exceeds 10 GB (the maximum size for a D1 database), consider splitting the database into multiple, smaller D1 databases. 
diff --git a/src/content/docs/workers/runtime-apis/bindings/D1.mdx b/src/content/docs/workers/runtime-apis/bindings/D1.mdx index 27b45ad5d8b3d72..2ee88f952c8fd6f 100644 --- a/src/content/docs/workers/runtime-apis/bindings/D1.mdx +++ b/src/content/docs/workers/runtime-apis/bindings/D1.mdx @@ -1,7 +1,7 @@ --- pcx_content_type: navigation title: D1 -external_link: /d1/build-with-d1/d1-client-api/ +external_link: /d1/worker-api/ head: [] description: APIs available in Cloudflare Workers to interact with D1. D1 is Cloudflare's native serverless database. diff --git a/src/content/docs/workers/static-assets/compatibility-matrix.mdx b/src/content/docs/workers/static-assets/compatibility-matrix.mdx index f7ebc4368bb3980..131efd94df75442 100644 --- a/src/content/docs/workers/static-assets/compatibility-matrix.mdx +++ b/src/content/docs/workers/static-assets/compatibility-matrix.mdx @@ -57,7 +57,7 @@ We plan to bridge the gaps between Workers and Pages and provide ways to migrate | [Analytics Engine](/analytics/analytics-engine) | ✅ | ✅ | | [Assets](/workers/static-assets/binding/) | ✅ | ✅ | | [Browser Rendering](/browser-rendering) | ✅ | ✅ | -| [D1](/d1/build-with-d1/d1-client-api/) | ✅ | ✅ | +| [D1](/d1/worker-api/) | ✅ | ✅ | | [Email Workers](/email-routing/email-workers/send-email-workers/) | ✅ | ❌ | | [Environment Variables](/workers/configuration/environment-variables/) | ✅ | ✅ | | [Hyperdrive](/hyperdrive/) | ✅ | ❌ | diff --git a/src/content/docs/workers/wrangler/api.mdx b/src/content/docs/workers/wrangler/api.mdx index ed28c395bc614c4..e728fac453b2392 100644 --- a/src/content/docs/workers/wrangler/api.mdx +++ b/src/content/docs/workers/wrangler/api.mdx @@ -351,7 +351,7 @@ The bindings supported by `getPlatformProxy` are: - [Queue bindings](/queues/configuration/javascript-apis/) -- [D1 database bindings](/d1/build-with-d1/d1-client-api/) +- [D1 database bindings](/d1/worker-api/) - [Hyperdrive bindings](/hyperdrive) diff --git a/src/content/docs/workers/wrangler/configuration.mdx b/src/content/docs/workers/wrangler/configuration.mdx index 9e5a14721a763de..0673ada83f9ed5f 100644 --- a/src/content/docs/workers/wrangler/configuration.mdx +++ b/src/content/docs/workers/wrangler/configuration.mdx @@ -413,7 +413,7 @@ binding = "" ### D1 databases -[D1](/d1/) is Cloudflare's serverless SQL database. A Worker can query a D1 database (or databases) by creating a [binding](/workers/runtime-apis/bindings/) to each database for D1's [client API](/d1/build-with-d1/d1-client-api/). +[D1](/d1/) is Cloudflare's serverless SQL database. A Worker can query a D1 database (or databases) by creating a [binding](/workers/runtime-apis/bindings/) to each database for [D1 Workers Binding API](/d1/worker-api/). To bind D1 databases to your Worker, assign an array of the below object to the `[[d1_databases]]` key. diff --git a/src/content/partials/workers/d1-pricing.mdx b/src/content/partials/workers/d1-pricing.mdx index f902f4713c7fa80..e7c735a68a57ba7 100644 --- a/src/content/partials/workers/d1-pricing.mdx +++ b/src/content/partials/workers/d1-pricing.mdx @@ -9,14 +9,14 @@ | Storage (per GB stored) | 5 GB (total) | First 5 GB included + $0.75 / GB-mo | :::note[Track your D1 usage] -To accurately track your usage, use the [meta object](/d1/build-with-d1/d1-client-api/#return-object), [GraphQL Analytics API](/d1/observability/metrics-analytics/#query-via-the-graphql-api), or the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/workers/d1/). Select your D1 database, then view: Metrics > Row Metrics. 
+To accurately track your usage, use the [meta object](/d1/worker-api/return-object/), [GraphQL Analytics API](/d1/observability/metrics-analytics/#query-via-the-graphql-api), or the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/workers/d1/). Select your D1 database, then view: Metrics > Row Metrics. ::: ### Definitions 1. Rows read measure how many rows a query reads (scans), regardless of the size of each row. For example, if you have a table with 5000 rows and run a `SELECT * FROM table` as a full table scan, this would count as 5,000 rows read. A query that filters on an [unindexed column](/d1/build-with-d1/use-indexes/) may return fewer rows to your Worker, but is still required to read (scan) more rows to determine which subset to return. 2. Rows written measure how many rows were written to D1 database. Write operations include `INSERT`, `UPDATE`, and `DELETE`. Each of these operations contribute towards rows written. A query that `INSERT` 10 rows into a `users` table would count as 10 rows written. -3. DDL operations (for example, `CREATE`, `ALTER`, and `DROP`) are used to define or modify the structure of a database. They may contribute to a mix of read rows and write rows. Ensure you are accurately tracking your usage through the available tools ([meta object](/d1/build-with-d1/d1-client-api/#return-object), [GraphQL Analytics API](/d1/observability/metrics-analytics/#query-via-the-graphql-api), or the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/workers/d1/)). +3. DDL operations (for example, `CREATE`, `ALTER`, and `DROP`) are used to define or modify the structure of a database. They may contribute to a mix of read rows and write rows. Ensure you are accurately tracking your usage through the available tools ([meta object](/d1/worker-api/return-object/), [GraphQL Analytics API](/d1/observability/metrics-analytics/#query-via-the-graphql-api), or the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/workers/d1/)). 4. Row size or the number of columns in a row does not impact how rows are counted. A row that is 1 KB and a row that is 100 KB both count as one row. 5. Defining [indexes](/d1/build-with-d1/use-indexes/) on your table(s) reduces the number of rows read by a query when filtering on that indexed field. For example, if the `users` table has an index on a timestamp column `created_at`, the query `SELECT * FROM users WHERE created_at > ?1` would only need to read a subset of the table. 6. Indexes will add an additional written row when writes include the indexed column, as there are two rows written: one to the table itself, and one to the index. The performance benefit of an index and reduction in rows read will, in nearly all cases, offset this additional write. 
From 0996b32328ba8aa180570c8b5a8401dbfbe0a3fe Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 21 Nov 2024 14:46:55 +0000 Subject: [PATCH 04/23] Update src/content/docs/d1/worker-api/prepared-statements.mdx Co-authored-by: hyperlint-ai[bot] <154288675+hyperlint-ai[bot]@users.noreply.github.com> --- src/content/docs/d1/worker-api/prepared-statements.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/content/docs/d1/worker-api/prepared-statements.mdx b/src/content/docs/d1/worker-api/prepared-statements.mdx index 89e0f0065effaaa..235eacdf6e961d4 100644 --- a/src/content/docs/d1/worker-api/prepared-statements.mdx +++ b/src/content/docs/d1/worker-api/prepared-statements.mdx @@ -158,7 +158,7 @@ return Response.json(returnValue) #### Guidance -- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) to return a typed result array. +- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) return a typed result array. ### `first()` From 4dab6310466427893ba8c721d4c872d3f879fa91 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 21 Nov 2024 14:49:12 +0000 Subject: [PATCH 05/23] Fixing grammar. --- src/content/docs/d1/worker-api/prepared-statements.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/content/docs/d1/worker-api/prepared-statements.mdx b/src/content/docs/d1/worker-api/prepared-statements.mdx index 235eacdf6e961d4..89e0f0065effaaa 100644 --- a/src/content/docs/d1/worker-api/prepared-statements.mdx +++ b/src/content/docs/d1/worker-api/prepared-statements.mdx @@ -158,7 +158,7 @@ return Response.json(returnValue) #### Guidance -- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) return a typed result array. +- When using TypeScript, you can pass a [type parameter](/d1/worker-api/#typescript-support) to [`D1PreparedStatement::raw`](#raw) to return a typed result array. ### `first()` From 194a9531a43d4caa0814ff0cc2810a0d177e070d Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 17:15:28 +0000 Subject: [PATCH 06/23] Moving Wrangler commands out of Reference. Reshuffling chapters. 
--- .../docs/d1/build-with-d1/d1-client-api.mdx | 497 ------------------ .../docs/d1/build-with-d1/query-d1.mdx | 18 + src/content/docs/d1/configuration/index.mdx | 2 +- src/content/docs/d1/demos.mdx | 2 +- src/content/docs/d1/examples/index.mdx | 2 +- src/content/docs/d1/observability/index.mdx | 2 +- src/content/docs/d1/platform/index.mdx | 2 +- src/content/docs/d1/reference/index.mdx | 2 +- .../docs/d1/reference/wrangler-commands.mdx | 8 - src/content/docs/d1/{ => rest-api}/d1-api.mdx | 0 src/content/docs/d1/rest-api/index.mdx | 12 + .../docs/d1/rest-api/wrangler-commands.mdx | 11 + src/content/docs/d1/sql-api/index.mdx | 2 +- src/content/docs/d1/tutorials/index.mdx | 2 +- src/content/docs/d1/worker-api/index.mdx | 22 + .../docs/workers/wrangler/commands.mdx | 290 +--------- .../partials/workers/wrangler-commands/d1.mdx | 293 +++++++++++ 17 files changed, 366 insertions(+), 801 deletions(-) delete mode 100644 src/content/docs/d1/build-with-d1/d1-client-api.mdx create mode 100644 src/content/docs/d1/build-with-d1/query-d1.mdx delete mode 100644 src/content/docs/d1/reference/wrangler-commands.mdx rename src/content/docs/d1/{ => rest-api}/d1-api.mdx (100%) create mode 100644 src/content/docs/d1/rest-api/index.mdx create mode 100644 src/content/docs/d1/rest-api/wrangler-commands.mdx create mode 100644 src/content/partials/workers/wrangler-commands/d1.mdx diff --git a/src/content/docs/d1/build-with-d1/d1-client-api.mdx b/src/content/docs/d1/build-with-d1/d1-client-api.mdx deleted file mode 100644 index 81b33f307ec5c07..000000000000000 --- a/src/content/docs/d1/build-with-d1/d1-client-api.mdx +++ /dev/null @@ -1,497 +0,0 @@ ---- -title: Query D1 -pcx_content_type: concept -sidebar: - order: 1 ---- - -D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine. D1 client API allows you to interact with a D1 database from within a [Worker](/workers/). - -## Prepared and static statements - -D1 client API supports prepared and static statements. Best practice is to use prepared statements which are precompiled objects used by the database to run the SQL. This is because prepared statements lead to overall faster execution and prevent SQL injection attacks. - -Below is an example of a prepared statement: - -```js -const stmt = db.prepare("SELECT * FROM users WHERE name = ?1").bind("Joe"); -``` - -However, if you still choose to use a static statement you can use the following as an example: - -```js -const stmt = db.prepare('SELECT * FROM users WHERE name = "John Doe"'); -``` - -## Parameter binding - -D1 follows the [SQLite convention](https://www.sqlite.org/lang_expr.html#varparam) for prepared statements parameter binding. Currently, D1 only supports Ordered (`?NNNN`) and Anonymous (`?`) parameters. In the future, D1 will support named parameters as well. - -| Syntax | Type | Description | -| ------ | --------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `?NNN` | Ordered | A question mark followed by a number `NNN` holds a spot for the `NNN`-th parameter. 
`NNN` must be between `1` and `SQLITE_MAX_VARIABLE_NUMBER` | -| `?` | Anonymous | A question mark that is not followed by a number creates a parameter with a number one greater than the largest parameter number already assigned. If this means the parameter number is greater than SQLITE_MAX_VARIABLE_NUMBER, it is an error. This parameter format is provided for compatibility with other database engines. But because it is easy to miscount the question marks, the use of this parameter format is discouraged. Programmers are encouraged to use one of the symbolic formats below or the `?NNN` format above instead | - -To bind a parameter, use the `stmt.bind()` method. - -### Order and anonymous examples: - -```js -const stmt = db.prepare("SELECT * FROM users WHERE name = ?").bind("John Doe"); -``` - -```js -const stmt = db - .prepare("SELECT * FROM users WHERE name = ? AND age = ?") - .bind("John Doe", 41); -``` - -```js -const stmt = db - .prepare("SELECT * FROM users WHERE name = ?2 AND age = ?1") - .bind(41, "John Doe"); -``` - -## Type conversion - -D1 automatically converts supported JavaScript (including TypeScript) types passed as parameters via the client API to their associated D1 types. The type conversion is as follows: - -| JavaScript | D1 | -| -------------------- | ---------------------------------------------------------------------------- | -| null | `NULL` | -| Number | `REAL` | -| Number 1 | `INTEGER` | -| String | `TEXT` | -| Boolean 2 | `INTEGER` | -| ArrayBuffer | `BLOB` | -| undefined | Not supported. Queries with `undefined` values will return a `D1_TYPE_ERROR` | - -1 D1 supports 64-bit signed `INTEGER` values internally, however -[BigInts](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) -are not currently supported in the API yet. JavaScript integers are safe up to -[`Number.MAX_SAFE_INTEGER`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER). - -2 Booleans will be cast to an `INTEGER` type where `1` is `TRUE` and -`0` is `FALSE`. - -## Return object - -The methods `stmt.all()` and `db.batch()` return a typed `D1Result` object that contains the results (if applicable), the success status, and a meta object with the internal duration of the operation in milliseconds. 
- -```js -{ - results: array | null, // [] if empty, or null if it does not apply - success: boolean, // true if the operation was successful, false otherwise - meta: { - duration: number, // duration of the operation in milliseconds - rows_read: number, // the number of rows read (scanned) by this query - rows_written: number // the number of rows written by this query - } -} -``` - -Example: - -```js -const { duration } = ( - await db - .prepare("INSERT INTO users (name, age) VALUES (?1, ?2)") - .bind("John", 42) - .run() -).meta; - -console.log(duration); // 0.172 -``` - -The `db.exec()` method returns a `D1ExecResult` object: - -```js -{ - count: number, // the number of queries executed - duration: number // duration of the operation in milliseconds -} -``` - -## Query statement methods - -D1 client API supports the following query statement methods for querying against a D1 database: - -- [`await stmt.all()`](/d1/build-with-d1/d1-client-api/#await-stmtall) -- [`await stmt.raw()`](/d1/build-with-d1/d1-client-api/#await-stmtraw) -- [`await stmt.first( [column] )`](/d1/build-with-d1/d1-client-api/#await-stmtfirstcolumn) -- [`await stmt.run()`](/d1/build-with-d1/d1-client-api/#await-stmtrun) -- [`await db.dump()`](/d1/build-with-d1/d1-client-api/#await-dbdump) -- [`await db.exec()`](/d1/build-with-d1/d1-client-api/#await-dbexec) - -### await stmt.all() - -Returns all rows as an array of objects, with each result row represented as an object on the `results` property of the `D1Result` type. - -When joining tables with identical column names, only the leftmost column will be included in the row object. Use [`stmt.raw()`](#await-stmtraw) to return all rows as an array of arrays. - -```js -const stmt = db.prepare("SELECT name, age FROM users LIMIT 3"); -const { results } = await stmt.all(); -console.log(results); -/* -[ - { - name: "John", - age: 42, - }, - { - name: "Anthony", - age: 37, - }, - { - name: "Dave", - age: 29, - }, - ] -*/ -``` - -When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to `all()` to return a typed result object. - -### await stmt.raw() - -Returns results as an array of arrays, with each row represented by an array. The return type is an array of arrays, and does not include query metadata. - -Column names are not included in the result set by default. To include column names as the first row of the result array, set `.raw({columnNames: true})`. - -```js -const stmt = db.prepare("SELECT name, age FROM users LIMIT 3"); -const rows = await stmt.raw(); -console.log(rows); - -/* -[ - [ "John", 42 ], - [ "Anthony", 37 ], - [ "Dave", 29 ], -] -*/ - -// With columnNames: true -const stmt = db.prepare("SELECT name, age FROM users LIMIT 3"); -const [columns, ...rows] = await stmt.raw({ columnNames: true }); -console.log(columns); - -/* -[ "name", age ], // The first result array includes the column names -*/ -``` - -When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to `raw()` to return a typed result array. - -### await stmt.first(\[column]) - -Returns the first row of the results. This does not return metadata like the other methods. Instead, it returns the object directly. 
- -Get a specific column from the first row: - -```js -const stmt = db.prepare("SELECT COUNT(*) AS total FROM users"); -const total = await stmt.first("total"); -console.log(total); // 50 -``` - -Get all the columns from the first row: - -```js -const stmt = db.prepare("SELECT COUNT(*) AS total FROM users"); -const values = await stmt.first(); -console.log(values); // { total: 50 } -``` - -If the query returns no rows, then `first()` will return `null`. If the query returns rows, but `column` does not exist, then `first()` will throw the `D1_ERROR` exception. - -`stmt.first()` does not alter the SQL query. To improve performance, consider appending `LIMIT 1` to your statement. - -When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to `first()` to return a typed result object. - -### await stmt.run() - -Runs the query (or queries) and returns results. Returns all rows as an array of objects, with each result row represented as an object on the `results` property of the `D1Result` type. For write operations like UPDATE, DELETE or INSERT, `results` will be empty. - -Run is functionally equivalent to `stmt.all()` and can be treated as an alias. - -```js -const stmt = await db.prepare("SELECT name, age FROM users LIMIT 3"); -const { results } = await stmt.run(); -console.log(results); -/* -[ - { - name: "John", - age: 42, - }, - { - name: "Anthony", - age: 37, - }, - { - name: "Dave", - age: 29, - }, - ] -*/ -``` - -When using TypeScript, you can pass a [type parameter](/d1/build-with-d1/d1-client-api/#typescript-support) to `run()` to return a typed result object. - -### await db.dump() - -:::caution - -This API only works on databases created during D1's alpha period. Check which version your database uses with `wrangler d1 info `. - -::: - -Dumps the entire D1 database to an SQLite compatible file inside an ArrayBuffer. - -```js -const dump = await db.dump(); -return new Response(dump, { - status: 200, - headers: { - "Content-Type": "application/octet-stream", - }, -}); -``` - -### await db.exec() - -Executes one or more queries directly without prepared statements or parameters binding. This method can have poorer performance (prepared statements can be reused in some cases) and, more importantly, is less safe. Only use this method for maintenance and one-shot tasks (for example, migration jobs). The input can be one or multiple queries separated by `\n`. - -If an error occurs, an exception is thrown with the query and error messages, execution stops and further statements are not executed. Refer to [Errors](/d1/build-with-d1/d1-client-api/#errors) to learn more. - -```js -const migration = await fetch("/migration.sql"); -const out = await db.exec(migration.text()); -console.log(out); -/* -{ - count: 80, - duration: 76 -} -*/ -``` - -## TypeScript support - -D1 client API is fully-typed via the `@cloudflare/workers-types` package, and also supports [generic types](https://www.typescriptlang.org/docs/handbook/2/generics.html#generic-types) as part of its TypeScript API. A generic type allows you to provide an optional _type parameter_ so that a function understands the type of the data it is handling. - -When using the [query statement methods](#query-statement-methods) `stmt.all()`, `stmt.raw()` and `stmt.first()`, you can provide a type representing each database row. D1's API will [return the result object](#return-object) with the correct type. 
- -For example, providing an `OrderRow` type as a type parameter to `stmt.all()` will return a typed `Array` object instead of the default `Record` type: - -```ts -// Row definition -type OrderRow = { - Id: string; - CustomerName: string; - OrderDate: number; -}; - -// Elsewhere in your application -const result = await env.MY_DB.prepare( - "SELECT Id, CustomerName, OrderDate FROM [Order] ORDER BY ShippedDate DESC LIMIT 100", -).all(); -``` - -## Reuse prepared statements - -Prepared statements can be reused with new bindings: - -```js -const stmt = db.prepare("SELECT name, age FROM users WHERE age < ?1"); -const young = await stmt.bind(20).all(); -console.log(young); -/* -{ - results: [...], - success: true - meta: { - duration: 31, - } -} -*/ -const old = await stmt.bind(80).all(); -console.log(old); -/* -{ - results: [...], - success: true - meta: { - duration: 29, - } -} -*/ -``` - -## Search with LIKE - -Perform a search using SQL's `LIKE` operator: - -```js -const { results } = await env.DB.prepare( - "SELECT * FROM Customers WHERE CompanyName LIKE ?", -) - .bind("%eve%") - .all(); -console.log("results: ", results); -/* -results: [...] -*/ -``` - -## Batch statements - -Batching sends multiple SQL statements inside a single call to the database. This can have a huge performance impact as it reduces latency from network round trips to D1. D1 operates in auto-commit. Our implementation guarantees that each statement in the list will execute and commit, sequentially, non-concurrently. - -Batched statements are [SQL transactions](https://www.sqlite.org/lang_transaction.html). If a statement in the sequence fails, then an error is returned for that specific statement, and it aborts or rolls back the entire sequence. - -### db.batch() - -To send batch statements, provide `batch()` a list of prepared statements and get the results in the same order. - -```js -await db.batch([ - db.prepare("UPDATE users SET name = ?1 WHERE id = ?2").bind("John", 17), - db.prepare("UPDATE users SET age = ?1 WHERE id = ?2").bind(35, 19), -]); -``` - -You can construct batches reusing the same prepared statement: - -```js -const stmt = db.prepare("SELECT * FROM users WHERE name = ?1"); - -const rows = await db.batch([stmt.bind("John"), stmt.bind("Anthony")]); - -console.log(rows[0].results); -/* -[ - { - name: "John Clemente", - age: 42, - }, - { - name: "John Davis", - age: 37, - }, - ] -*/ -console.log(rows[1].results); -/* -[ - { - name: "Anthony Hopkins", - age: 66, - }, - ] -*/ -``` - -## PRAGMA statements - -D1 supports [SQLite PRAGMA](https://www.sqlite.org/pragma.html) statements. The PRAGMA statement is an SQL extension for SQLite. PRAGMA commands can be used to: - -- Modify the behavior of certain SQLite operations -- Query the SQLite library for internal data about schemas or tables (but note that PRAGMA statements cannot query the contents of a table) -- Control environmental variables - -For example, D1 supports `PRAGMA table_list` and `PRAGMA table_info`: - -```js -const r = await db.batch([ - db.prepare("PRAGMA table_list"), - db.prepare("PRAGMA table_info(my_table)"), -]); -console.log(r); -/* -[ - { - "results": [ - { - "schema": "main", - "name": "my_table", - "type": "table", - "ncol": 3, - "wr": 0, - "strict": 0 - }, - ... - ] - }, - { - "results": [ - { - "cid": 0, - "name": "cid", - "type": "INTEGER", - "notnull": 0, - "dflt_value": null, - "pk": 1 - }, - ... - ] - } -] - -*/ -``` - -:::caution - -D1 PRAGMA statements only apply to the current transaction. 
- -::: - -For the full list of PRAGMA statements supported by D1, see [SQL statements](/d1/sql-api/sql-statements). - -## Errors - -The `stmt.` and `db.` methods will throw an [Error object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error) whenever an error occurs. - -:::note - -Prior to [`wrangler` 3.1.1](https://github.com/cloudflare/workers-sdk/releases/tag/wrangler%403.1.1), D1 JavaScript errors used the [cause property](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error/cause) for detailed error messages. - -To inspect these errors when using older versions of `wrangler`, you should log `error?.cause?.message`. - -::: - -To capture exceptions, log the `Error.message` value. For example, the code below has a query with an invalid keyword - `INSERTZ` instead of `INSERT`: - -```js -try { - // This is an intentional mispelling - await db.exec("INSERTZ INTO my_table (name, employees) VALUES ()"); -} catch (e: any) { - console.error({ - message: e.message - }); -} -``` - -The code above would throw the following error message: - -```json -{ - "message": "D1_EXEC_ERROR: Error in line 1: INSERTZ INTO my_table (name, employees) VALUES (): sql error: near \"INSERTZ\": syntax error in INSERTZ INTO my_table (name, employees) VALUES () at offset 0" -} -``` - -## Error list - -D1 will return the following error constants, in addition to the extended (detailed) error message: - -| Message | Cause | -| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `D1_ERROR` | Generic error. | -| `D1_TYPE_ERROR` | Returned when there is a mismatch in the type between a column and a value. A common cause is supplying an `undefined` variable (unsupported) instead of `null`. | -| `D1_COLUMN_NOTFOUND` | Column not found. | -| `D1_DUMP_ERROR` | Database dump error. | -| `D1_EXEC_ERROR` | Exec error in line x: y error. | diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/build-with-d1/query-d1.mdx new file mode 100644 index 000000000000000..e66dd9dd9c9037a --- /dev/null +++ b/src/content/docs/d1/build-with-d1/query-d1.mdx @@ -0,0 +1,18 @@ +--- +title: Query D1 +pcx_content_type: concept +sidebar: + order: 1 +--- + +There are three primary ways you can query a D1 database: + +1. Using [D1 Workers Binding API](/d1/worker-api/) in your code. +2. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). +3. Using [D1 Wrangler commands](/d1/wrangler-commands/). 
+ +## Query D1 with Workers Binding API + +## Query D1 with REST API + +## Query D1 with Wrangler commands \ No newline at end of file diff --git a/src/content/docs/d1/configuration/index.mdx b/src/content/docs/d1/configuration/index.mdx index ba06f8814c033de..35b901b9e04f0a5 100644 --- a/src/content/docs/d1/configuration/index.mdx +++ b/src/content/docs/d1/configuration/index.mdx @@ -2,7 +2,7 @@ title: Configuration pcx_content_type: navigation sidebar: - order: 6 + order: 8 group: hideIndex: true --- diff --git a/src/content/docs/d1/demos.mdx b/src/content/docs/d1/demos.mdx index f06ad57badb19c1..eb856d104531d2c 100644 --- a/src/content/docs/d1/demos.mdx +++ b/src/content/docs/d1/demos.mdx @@ -2,7 +2,7 @@ pcx_content_type: navigation title: Demos and architectures sidebar: - order: 10 + order: 12 --- diff --git a/src/content/docs/d1/examples/index.mdx b/src/content/docs/d1/examples/index.mdx index 26bca1093d92752..8ad40e6d10db80d 100644 --- a/src/content/docs/d1/examples/index.mdx +++ b/src/content/docs/d1/examples/index.mdx @@ -4,7 +4,7 @@ hideChildren: false pcx_content_type: navigation title: Examples sidebar: - order: 8 + order: 10 group: hideIndex: true --- diff --git a/src/content/docs/d1/observability/index.mdx b/src/content/docs/d1/observability/index.mdx index c10d1fd64af068d..35c902c3124a360 100644 --- a/src/content/docs/d1/observability/index.mdx +++ b/src/content/docs/d1/observability/index.mdx @@ -2,7 +2,7 @@ title: Observability pcx_content_type: navigation sidebar: - order: 7 + order: 9 group: hideIndex: true --- diff --git a/src/content/docs/d1/platform/index.mdx b/src/content/docs/d1/platform/index.mdx index 8b14ea8352a012c..1720605aee02065 100644 --- a/src/content/docs/d1/platform/index.mdx +++ b/src/content/docs/d1/platform/index.mdx @@ -2,7 +2,7 @@ pcx_content_type: navigation title: Platform sidebar: - order: 10 + order: 12 group: hideIndex: true --- diff --git a/src/content/docs/d1/reference/index.mdx b/src/content/docs/d1/reference/index.mdx index b007e3d2bb291a4..adcefc0bc391465 100644 --- a/src/content/docs/d1/reference/index.mdx +++ b/src/content/docs/d1/reference/index.mdx @@ -2,7 +2,7 @@ pcx_content_type: navigation title: Reference sidebar: - order: 11 + order: 13 group: hideIndex: true --- diff --git a/src/content/docs/d1/reference/wrangler-commands.mdx b/src/content/docs/d1/reference/wrangler-commands.mdx deleted file mode 100644 index 9de635997e196bd..000000000000000 --- a/src/content/docs/d1/reference/wrangler-commands.mdx +++ /dev/null @@ -1,8 +0,0 @@ ---- -pcx_content_type: navigation -title: Wrangler commands -external_link: /workers/wrangler/commands/#d1 -sidebar: - order: 7 - ---- diff --git a/src/content/docs/d1/d1-api.mdx b/src/content/docs/d1/rest-api/d1-api.mdx similarity index 100% rename from src/content/docs/d1/d1-api.mdx rename to src/content/docs/d1/rest-api/d1-api.mdx diff --git a/src/content/docs/d1/rest-api/index.mdx b/src/content/docs/d1/rest-api/index.mdx new file mode 100644 index 000000000000000..f189f0f1e266e43 --- /dev/null +++ b/src/content/docs/d1/rest-api/index.mdx @@ -0,0 +1,12 @@ +--- +pcx_content_type: navigation +title: REST API +sidebar: + order: 5 + group: + hideIndex: true +--- + +import { DirectoryListing } from "~/components"; + + diff --git a/src/content/docs/d1/rest-api/wrangler-commands.mdx b/src/content/docs/d1/rest-api/wrangler-commands.mdx new file mode 100644 index 000000000000000..1a2679e4c34f195 --- /dev/null +++ b/src/content/docs/d1/rest-api/wrangler-commands.mdx @@ -0,0 +1,11 @@ +--- +pcx_content_type: 
concept +title: Wrangler commands +sidebar: + order: 6 + +--- + +import { Render, Type, MetaInfo } from "~/components" + + \ No newline at end of file diff --git a/src/content/docs/d1/sql-api/index.mdx b/src/content/docs/d1/sql-api/index.mdx index 8b74fb521d38d05..3228a42dd6a9bcd 100644 --- a/src/content/docs/d1/sql-api/index.mdx +++ b/src/content/docs/d1/sql-api/index.mdx @@ -2,7 +2,7 @@ title: SQL API pcx_content_type: navigation sidebar: - order: 5 + order: 6 group: hideIndex: true --- diff --git a/src/content/docs/d1/tutorials/index.mdx b/src/content/docs/d1/tutorials/index.mdx index 2edc0d395da3f03..11a1a82e0f5331e 100644 --- a/src/content/docs/d1/tutorials/index.mdx +++ b/src/content/docs/d1/tutorials/index.mdx @@ -4,7 +4,7 @@ pcx_content_type: navigation title: Tutorials hideChildren: true sidebar: - order: 9 + order: 11 --- diff --git a/src/content/docs/d1/worker-api/index.mdx b/src/content/docs/d1/worker-api/index.mdx index a6ed1f013880bb9..2560c7911b28da6 100644 --- a/src/content/docs/d1/worker-api/index.mdx +++ b/src/content/docs/d1/worker-api/index.mdx @@ -38,6 +38,28 @@ const result = await env.MY_DB.prepare( ).run(); ``` +## Type conversion + +D1 automatically converts supported JavaScript (including TypeScript) types passed as parameters via the Workers Binding API to their associated D1 types. The type conversion is as follows: + +| JavaScript | D1 | +| -------------------- | ---------------------------------------------------------------------------- | +| null | `NULL` | +| Number | `REAL` | +| Number 1 | `INTEGER` | +| String | `TEXT` | +| Boolean 2 | `INTEGER` | +| ArrayBuffer | `BLOB` | +| undefined | Not supported. Queries with `undefined` values will return a `D1_TYPE_ERROR` | + +1 D1 supports 64-bit signed `INTEGER` values internally, however +[BigInts](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) +are not currently supported in the API yet. JavaScript integers are safe up to +[`Number.MAX_SAFE_INTEGER`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER). + +2 Booleans will be cast to an `INTEGER` type where `1` is `TRUE` and +`0` is `FALSE`. + ## API playground The D1 Worker Binding API playground is an `index.js` file where you can test each of the documented Worker Binding APIs for D1. The file builds from the end-state of the [Get started](/d1/get-started/#write-queries-within-your-worker) code. diff --git a/src/content/docs/workers/wrangler/commands.mdx b/src/content/docs/workers/wrangler/commands.mdx index 9f34d3a860e2af2..6cee1bef8860627 100644 --- a/src/content/docs/workers/wrangler/commands.mdx +++ b/src/content/docs/workers/wrangler/commands.mdx @@ -184,295 +184,9 @@ wrangler generate [] [TEMPLATE] ## `d1` -Interact with Cloudflare's D1 service. +Interact with Cloudflare's [D1](/d1/) service. -### `create` - -Creates a new D1 database, and provides the binding and UUID that you will put in your `wrangler.toml` file. - -```txt -wrangler d1 create [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the new D1 database. -- `--location` - - Provide an optional [location hint](/d1/configuration/data-location/) for your database leader. - - Available options include `weur` (Western Europe), `eeur` (Eastern Europe), `apac` (Asia Pacific), `oc` (Oceania), `wnam` (Western North America), and `enam` (Eastern North America). - -### `info` - -Get information about a D1 database, including the current database size and state. 
- -```txt -wrangler d1 info [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to get information about. -- `--json` - - Return output as JSON rather than a table. - -### `list` - -List all D1 databases in your account. - -```txt -wrangler d1 list [OPTIONS] -``` - -- `--json` - - Return output as JSON rather than a table. - -### `delete` - -Delete a D1 database. - -```txt -wrangler d1 delete [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to delete. -- `-y, --skip-confirmation` - - Skip deletion confirmation prompt. - -### `execute` - -Execute a query on a D1 database. - -```txt -wrangler d1 execute [OPTIONS] -``` - -:::note - -You must provide either `--command` or `--file` for this command to run successfully. - -::: - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--command` - - The SQL query you wish to execute. -- `--file` - - Path to the SQL file you wish to execute. -- `-y, --yes` - - Answer `yes` to any prompts. -- `--local` - - Execute commands/files against a local database for use with [wrangler dev](#dev). -- `--remote` - - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev). -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--json` - - Return output as JSON rather than a table. -- `--preview` - - Execute commands/files against a preview D1 database (as defined by `preview_database_id` in [Wrangler.toml](/workers/wrangler/configuration/#d1-databases)). -- `--batch-size` - - Number of queries to send in a single batch. - -### `export` - -Export a D1 database or table's schema and/or content to a `.sql` file. - -```txt -wrangler d1 export [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to export. -- `--remote` - - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev). -- `--output` - - Path to the SQL file for your export. -- `--table` - - The name of the table within a D1 database to export. -- `--no-data` - - Controls whether export SQL file contains database data. Note that `--no-data=true` is not recommended due to a known wrangler limitation that intreprets the value as false. -- `--no-schema` - - Controls whether export SQL file contains database schema. Note that `--no-schema=true` is not recommended due to a known wrangler limitation that intreprets the value as false. - -### `time-travel restore` - -Restore a database to a specific point-in-time using [Time Travel](/d1/reference/time-travel/). - -```txt -wrangler d1 time-travel restore [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--bookmark` - - A D1 bookmark representing the state of a database at a specific point in time. -- `--timestamp` - - A UNIX timestamp or JavaScript date-time `string` within the last 30 days. -- `--json` - - Return output as JSON rather than a table. - -### `time-travel info` - -Inspect the current state of a database for a specific point-in-time using [Time Travel](/d1/reference/time-travel/). - -```txt -wrangler d1 time-travel info [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--timestamp` - - A UNIX timestamp or JavaScript date-time `string` within the last 30 days. -- `--json` b - - Return output as JSON rather than a table. - -### `backup create` - -:::caution - -This command only works on databases created during D1's alpha period. 
You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -Initiate a D1 backup. - -```txt -wrangler d1 backup create -``` - -- `DATABASE_NAME` - - The name of the D1 database to backup. - -### `backup list` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -List all available backups. - -```txt -wrangler d1 backup list -``` - -- `DATABASE_NAME` - - The name of the D1 database to list the backups of. - -### `backup restore` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -Restore a backup into a D1 database. - -```txt -wrangler d1 backup restore -``` - -- `DATABASE_NAME` - - The name of the D1 database to restore the backup into. -- `BACKUP_ID` - - The ID of the backup you wish to restore. - -### `backup download` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). To download existing data of a beta/GA database to your local machine refer to the `wrangler d1 export` command. Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backups for databases created during the beta/GA period. -::: - -Download existing data to your local machine. - -```txt -wrangler d1 backup download -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to download the backup of. -- `BACKUP_ID` - - The ID of the backup you wish to download. -- `--output` - - The `.sqlite3` file to write to (defaults to `'..sqlite3'`). - -### `migrations create` - -Create a new migration. - -This will generate a new versioned file inside the `migrations` folder. Name your migration file as a description of your change. This will make it easier for you to find your migration in the `migrations` folder. An example filename looks like: - -`0000_create_user_table.sql` - -The filename will include a version number and the migration name you specify below. - -```txt -wrangler d1 migrations create -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to create a migration for. -- `MIGRATION_NAME` - - A descriptive name for the migration you wish to create. - -### `migrations list` - -View a list of unapplied migration files. 
- -```txt -wrangler d1 migrations list [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to list unapplied migrations for. -- `--local` - - Show the list of unapplied migration files on your locally persisted D1 database. -- `--remote` - - Show the list of unapplied migration files on your remote D1 database. -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--preview` - - Show the list of unapplied migration files on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). - -### `migrations apply` - -Apply any unapplied migrations. - -This command will prompt you to confirm the migrations you are about to apply. Confirm that you would like to proceed. After, a backup will be captured. - -The progress of each migration will be printed in the console. - -When running the apply command in a CI/CD environment or another non-interactive command line, the confirmation step will be skipped, but the backup will still be captured. - -If applying a migration results in an error, this migration will be rolled back, and the previous successful migration will remain applied. - -```txt -wrangler d1 migrations apply [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to apply your migrations on. -- `--env` - - Specify which environment configuration to use for D1 binding -- `--local` - - Execute any unapplied migrations on your locally persisted D1 database. -- `--remote` - - Execute any unapplied migrations on your remote D1 database. -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--preview` - - Execute any unapplied migrations on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). -- `--batch-size` - - Number of queries to send in a single batch. - ---- + ## `hyperdrive` diff --git a/src/content/partials/workers/wrangler-commands/d1.mdx b/src/content/partials/workers/wrangler-commands/d1.mdx new file mode 100644 index 000000000000000..9360c278ed2f153 --- /dev/null +++ b/src/content/partials/workers/wrangler-commands/d1.mdx @@ -0,0 +1,293 @@ +--- +{} +--- + +import { AnchorHeading, Type, MetaInfo } from "~/components"; + + + +Creates a new D1 database, and provides the binding and UUID that you will put in your `wrangler.toml` file. + +```txt +wrangler d1 create [OPTIONS] +``` + +- `DATABASE_NAME` + - The name of the new D1 database. +- `--location` + - Provide an optional [location hint](/d1/configuration/data-location/) for your database leader. + - Available options include `weur` (Western Europe), `eeur` (Eastern Europe), `apac` (Asia Pacific), `oc` (Oceania), `wnam` (Western North America), and `enam` (Eastern North America). + + + +Get information about a D1 database, including the current database size and state. + +```txt +wrangler d1 info [OPTIONS] +``` + +- `DATABASE_NAME` + - The name of the D1 database to get information about. +- `--json` + - Return output as JSON rather than a table. + + + +List all D1 databases in your account. + +```txt +wrangler d1 list [OPTIONS] +``` + +- `--json` + - Return output as JSON rather than a table. + + + +Delete a D1 database. + +```txt +wrangler d1 delete [OPTIONS] +``` + +- `DATABASE_NAME` + - The name of the D1 database to delete. +- `-y, --skip-confirmation` + - Skip deletion confirmation prompt. 
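For instance, a typical create-then-delete round trip with the commands documented above might look like the following; `my-database` is only a placeholder name and the location hint is optional:

```sh
# Create a new D1 database whose leader is placed in Western Europe
npx wrangler d1 create my-database --location=weur

# Delete that database, skipping the confirmation prompt
npx wrangler d1 delete my-database -y
```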
+
+
+
+Execute a query on a D1 database.
+
+```txt
+wrangler d1 execute  [OPTIONS]
+```
+
+:::note
+
+You must provide either `--command` or `--file` for this command to run successfully.
+
+:::
+
+- `DATABASE_NAME`
+  - The name of the D1 database to execute a query on.
+- `--command`
+  - The SQL query you wish to execute.
+- `--file`
+  - Path to the SQL file you wish to execute.
+- `-y, --yes`
+  - Answer `yes` to any prompts.
+- `--local`
+  - Execute commands/files against a local database for use with [wrangler dev](#dev).
+- `--remote`
+  - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev).
+- `--persist-to`
+  - Specify directory to use for local persistence (for use in combination with `--local`).
+- `--json`
+  - Return output as JSON rather than a table.
+- `--preview`
+  - Execute commands/files against a preview D1 database (as defined by `preview_database_id` in [Wrangler.toml](/workers/wrangler/configuration/#d1-databases)).
+- `--batch-size`
+  - Number of queries to send in a single batch.
+
+
+
+Export a D1 database or table's schema and/or content to a `.sql` file.
+
+```txt
+wrangler d1 export  [OPTIONS]
+```
+
+- `DATABASE_NAME`
+  - The name of the D1 database to export.
+- `--remote`
+  - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev).
+- `--output`
+  - Path to the SQL file for your export.
+- `--table`
+  - The name of the table within a D1 database to export.
+- `--no-data`
+  - Controls whether export SQL file contains database data. Note that `--no-data=true` is not recommended due to a known wrangler limitation that interprets the value as false.
+- `--no-schema`
+  - Controls whether export SQL file contains database schema. Note that `--no-schema=true` is not recommended due to a known wrangler limitation that interprets the value as false.
+
+
+
+Restore a database to a specific point-in-time using [Time Travel](/d1/reference/time-travel/).
+
+```txt
+wrangler d1 time-travel restore  [OPTIONS]
+```
+
+- `DATABASE_NAME`
+  - The name of the D1 database to execute a query on.
+- `--bookmark`
+  - A D1 bookmark representing the state of a database at a specific point in time.
+- `--timestamp`
+  - A UNIX timestamp or JavaScript date-time `string` within the last 30 days.
+- `--json`
+  - Return output as JSON rather than a table.
+
+
+
+Inspect the current state of a database for a specific point-in-time using [Time Travel](/d1/reference/time-travel/).
+
+```txt
+wrangler d1 time-travel info  [OPTIONS]
+```
+
+- `DATABASE_NAME`
+  - The name of the D1 database to execute a query on.
+- `--timestamp`
+  - A UNIX timestamp or JavaScript date-time `string` within the last 30 days.
+- `--json`
+  - Return output as JSON rather than a table.
+
+
+
+:::caution
+
+This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `.
+
+This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period.
+:::
+
+Initiate a D1 backup.
+
+```txt
+wrangler d1 backup create 
+```
+
+- `DATABASE_NAME`
+  - The name of the D1 database to backup.
+
+
+
+:::caution
+
+This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `.
+ +This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. +::: + +List all available backups. + +```txt +wrangler d1 backup list +``` + +- `DATABASE_NAME` + - The name of the D1 database to list the backups of. + + + +:::caution + +This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. + +This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. +::: + +Restore a backup into a D1 database. + +```txt +wrangler d1 backup restore +``` + +- `DATABASE_NAME` + - The name of the D1 database to restore the backup into. +- `BACKUP_ID` + - The ID of the backup you wish to restore. + + + +:::caution + +This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. + +This command will not work on databases that are created during the beta period, or after general availability (GA). To download existing data of a beta/GA database to your local machine refer to the `wrangler d1 export` command. Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backups for databases created during the beta/GA period. +::: + +Download existing data to your local machine. + +```txt +wrangler d1 backup download +``` + +- `DATABASE_NAME` + - The name of the D1 database you wish to download the backup of. +- `BACKUP_ID` + - The ID of the backup you wish to download. +- `--output` + - The `.sqlite3` file to write to (defaults to `'..sqlite3'`). + + + +Create a new migration. + +This will generate a new versioned file inside the `migrations` folder. Name your migration file as a description of your change. This will make it easier for you to find your migration in the `migrations` folder. An example filename looks like: + +`0000_create_user_table.sql` + +The filename will include a version number and the migration name you specify below. + +```txt +wrangler d1 migrations create +``` + +- `DATABASE_NAME` + - The name of the D1 database you wish to create a migration for. +- `MIGRATION_NAME` + - A descriptive name for the migration you wish to create. + + + +View a list of unapplied migration files. + +```txt +wrangler d1 migrations list [OPTIONS] +``` + +- `DATABASE_NAME` + - The name of the D1 database you wish to list unapplied migrations for. +- `--local` + - Show the list of unapplied migration files on your locally persisted D1 database. +- `--remote` + - Show the list of unapplied migration files on your remote D1 database. +- `--persist-to` + - Specify directory to use for local persistence (for use in combination with `--local`). +- `--preview` + - Show the list of unapplied migration files on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). + + + +Apply any unapplied migrations. + +This command will prompt you to confirm the migrations you are about to apply. Confirm that you would like to proceed. 
After, a backup will be captured. + +The progress of each migration will be printed in the console. + +When running the apply command in a CI/CD environment or another non-interactive command line, the confirmation step will be skipped, but the backup will still be captured. + +If applying a migration results in an error, this migration will be rolled back, and the previous successful migration will remain applied. + +```txt +wrangler d1 migrations apply [OPTIONS] +``` + +- `DATABASE_NAME` + - The name of the D1 database you wish to apply your migrations on. +- `--env` + - Specify which environment configuration to use for D1 binding +- `--local` + - Execute any unapplied migrations on your locally persisted D1 database. +- `--remote` + - Execute any unapplied migrations on your remote D1 database. +- `--persist-to` + - Specify directory to use for local persistence (for use in combination with `--local`). +- `--preview` + - Execute any unapplied migrations on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). +- `--batch-size` + - Number of queries to send in a single batch. + +--- \ No newline at end of file From 0b2506c0c3f04ec9ddbe116e3df74241307488fe Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 17:28:52 +0000 Subject: [PATCH 07/23] Resolving merge conflict, adding global commands once at the bottom. --- .../docs/workers/wrangler/commands.mdx | 316 +----------------- .../partials/workers/wrangler-commands/d1.mdx | 6 +- 2 files changed, 5 insertions(+), 317 deletions(-) diff --git a/src/content/docs/workers/wrangler/commands.mdx b/src/content/docs/workers/wrangler/commands.mdx index c0e55b0ead6534b..ea758d8d0a989c4 100644 --- a/src/content/docs/workers/wrangler/commands.mdx +++ b/src/content/docs/workers/wrangler/commands.mdx @@ -183,321 +183,7 @@ wrangler generate [] [TEMPLATE] Interact with Cloudflare's D1 service. -### `create` - -Creates a new D1 database, and provides the binding and UUID that you will put in your `wrangler.toml` file. - -```txt -wrangler d1 create [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the new D1 database. -- `--location` - - Provide an optional [location hint](/d1/configuration/data-location/) for your database leader. - - Available options include `weur` (Western Europe), `eeur` (Eastern Europe), `apac` (Asia Pacific), `oc` (Oceania), `wnam` (Western North America), and `enam` (Eastern North America). - - - -### `info` - -Get information about a D1 database, including the current database size and state. - -```txt -wrangler d1 info [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to get information about. -- `--json` - - Return output as JSON rather than a table. - - - -### `list` - -List all D1 databases in your account. - -```txt -wrangler d1 list [OPTIONS] -``` - -- `--json` - - Return output as JSON rather than a table. - - - -### `delete` - -Delete a D1 database. - -```txt -wrangler d1 delete [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to delete. -- `-y, --skip-confirmation` - - Skip deletion confirmation prompt. - - - -### `execute` - -Execute a query on a D1 database. - -```txt -wrangler d1 execute [OPTIONS] -``` - -:::note - -You must provide either `--command` or `--file` for this command to run successfully. - -::: - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--command` - - The SQL query you wish to execute. -- `--file` - - Path to the SQL file you wish to execute. 
-- `-y, --yes` - - Answer `yes` to any prompts. -- `--local` - - Execute commands/files against a local database for use with [wrangler dev](#dev). -- `--remote` - - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev). -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--json` - - Return output as JSON rather than a table. -- `--preview` - - Execute commands/files against a preview D1 database (as defined by `preview_database_id` in [Wrangler.toml](/workers/wrangler/configuration/#d1-databases)). -- `--batch-size` - - Number of queries to send in a single batch. - - - -### `export` - -Export a D1 database or table's schema and/or content to a `.sql` file. - -```txt -wrangler d1 export [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to export. -- `--remote` - - Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev). -- `--output` - - Path to the SQL file for your export. -- `--table` - - The name of the table within a D1 database to export. -- `--no-data` - - Controls whether export SQL file contains database data. Note that `--no-data=true` is not recommended due to a known wrangler limitation that intreprets the value as false. -- `--no-schema` - - Controls whether export SQL file contains database schema. Note that `--no-schema=true` is not recommended due to a known wrangler limitation that intreprets the value as false. - - - -### `time-travel restore` - -Restore a database to a specific point-in-time using [Time Travel](/d1/reference/time-travel/). - -```txt -wrangler d1 time-travel restore [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--bookmark` - - A D1 bookmark representing the state of a database at a specific point in time. -- `--timestamp` - - A UNIX timestamp or JavaScript date-time `string` within the last 30 days. -- `--json` - - Return output as JSON rather than a table. - - - -### `time-travel info` - -Inspect the current state of a database for a specific point-in-time using [Time Travel](/d1/reference/time-travel/). - -```txt -wrangler d1 time-travel info [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database to execute a query on. -- `--timestamp` - - A UNIX timestamp or JavaScript date-time `string` within the last 30 days. -- `--json` b - - Return output as JSON rather than a table. - - - -### `backup create` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -Initiate a D1 backup. - -```txt -wrangler d1 backup create -``` - -- `DATABASE_NAME` - - The name of the D1 database to backup. - - - -### `backup list` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). 
Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -List all available backups. - -```txt -wrangler d1 backup list -``` - -- `DATABASE_NAME` - - The name of the D1 database to list the backups of. - - - -### `backup restore` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backup and restores for databases created during the beta/GA period. -::: - -Restore a backup into a D1 database. - -```txt -wrangler d1 backup restore -``` - -- `DATABASE_NAME` - - The name of the D1 database to restore the backup into. -- `BACKUP_ID` - - The ID of the backup you wish to restore. - - - -### `backup download` - -:::caution - -This command only works on databases created during D1's alpha period. You can check which version your database uses with `wrangler d1 info `. - -This command will not work on databases that are created during the beta period, or after general availability (GA). To download existing data of a beta/GA database to your local machine refer to the `wrangler d1 export` command. Refer to [Time Travel](/d1/reference/time-travel/) in the D1 documentation for more information on D1's approach to backups for databases created during the beta/GA period. -::: - -Download existing data to your local machine. - -```txt -wrangler d1 backup download -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to download the backup of. -- `BACKUP_ID` - - The ID of the backup you wish to download. -- `--output` - - The `.sqlite3` file to write to (defaults to `'..sqlite3'`). - - - -### `migrations create` - -Create a new migration. - -This will generate a new versioned file inside the `migrations` folder. Name your migration file as a description of your change. This will make it easier for you to find your migration in the `migrations` folder. An example filename looks like: - -`0000_create_user_table.sql` - -The filename will include a version number and the migration name you specify below. - -```txt -wrangler d1 migrations create -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to create a migration for. -- `MIGRATION_NAME` - - A descriptive name for the migration you wish to create. - - - -### `migrations list` - -View a list of unapplied migration files. - -```txt -wrangler d1 migrations list [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to list unapplied migrations for. -- `--local` - - Show the list of unapplied migration files on your locally persisted D1 database. -- `--remote` - - Show the list of unapplied migration files on your remote D1 database. -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--preview` - - Show the list of unapplied migration files on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). - - - -### `migrations apply` - -Apply any unapplied migrations. - -This command will prompt you to confirm the migrations you are about to apply. Confirm that you would like to proceed. 
After, a backup will be captured. - -The progress of each migration will be printed in the console. - -When running the apply command in a CI/CD environment or another non-interactive command line, the confirmation step will be skipped, but the backup will still be captured. - -If applying a migration results in an error, this migration will be rolled back, and the previous successful migration will remain applied. - -```txt -wrangler d1 migrations apply [OPTIONS] -``` - -- `DATABASE_NAME` - - The name of the D1 database you wish to apply your migrations on. -- `--env` - - Specify which environment configuration to use for D1 binding -- `--local` - - Execute any unapplied migrations on your locally persisted D1 database. -- `--remote` - - Execute any unapplied migrations on your remote D1 database. -- `--persist-to` - - Specify directory to use for local persistence (for use in combination with `--local`). -- `--preview` - - Execute any unapplied migrations on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). -- `--batch-size` - - Number of queries to send in a single batch. - - + --- diff --git a/src/content/partials/workers/wrangler-commands/d1.mdx b/src/content/partials/workers/wrangler-commands/d1.mdx index 9360c278ed2f153..8927a40ed45dc23 100644 --- a/src/content/partials/workers/wrangler-commands/d1.mdx +++ b/src/content/partials/workers/wrangler-commands/d1.mdx @@ -2,7 +2,7 @@ {} --- -import { AnchorHeading, Type, MetaInfo } from "~/components"; +import {Render, AnchorHeading, Type, MetaInfo } from "~/components"; @@ -290,4 +290,6 @@ wrangler d1 migrations apply [OPTIONS] - `--batch-size` - Number of queries to send in a single batch. ---- \ No newline at end of file +## Global commands + + \ No newline at end of file From cd9e478930190558f2f5f241b5d2160d1fce02d3 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 17:30:15 +0000 Subject: [PATCH 08/23] Moving "global commands" out of the partial. --- src/content/docs/d1/rest-api/wrangler-commands.mdx | 8 +++++++- src/content/partials/workers/wrangler-commands/d1.mdx | 4 ---- 2 files changed, 7 insertions(+), 5 deletions(-) diff --git a/src/content/docs/d1/rest-api/wrangler-commands.mdx b/src/content/docs/d1/rest-api/wrangler-commands.mdx index 1a2679e4c34f195..7e99c47ba990c02 100644 --- a/src/content/docs/d1/rest-api/wrangler-commands.mdx +++ b/src/content/docs/d1/rest-api/wrangler-commands.mdx @@ -8,4 +8,10 @@ sidebar: import { Render, Type, MetaInfo } from "~/components" - \ No newline at end of file +## `d1` + + + +## Global commands + + \ No newline at end of file diff --git a/src/content/partials/workers/wrangler-commands/d1.mdx b/src/content/partials/workers/wrangler-commands/d1.mdx index 8927a40ed45dc23..7406b5cd4edd538 100644 --- a/src/content/partials/workers/wrangler-commands/d1.mdx +++ b/src/content/partials/workers/wrangler-commands/d1.mdx @@ -289,7 +289,3 @@ wrangler d1 migrations apply [OPTIONS] - Execute any unapplied migrations on your preview D1 database (as defined by `preview_database_id` in [`wrangler.toml`](/workers/wrangler/configuration/#d1-databases)). - `--batch-size` - Number of queries to send in a single batch. - -## Global commands - - \ No newline at end of file From 52dfe8e2823e1dcd030404d11444c73473728121 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 17:41:08 +0000 Subject: [PATCH 09/23] Introducing APIs in Query D1. Introducing Wrangler commands properly. 
--- .../docs/d1/build-with-d1/query-d1.mdx | 23 ++++++++++++++++++- .../docs/d1/rest-api/wrangler-commands.mdx | 4 ++-- 2 files changed, 24 insertions(+), 3 deletions(-) diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/build-with-d1/query-d1.mdx index e66dd9dd9c9037a..bf4118b5512eac2 100644 --- a/src/content/docs/d1/build-with-d1/query-d1.mdx +++ b/src/content/docs/d1/build-with-d1/query-d1.mdx @@ -13,6 +13,27 @@ There are three primary ways you can query a D1 database: ## Query D1 with Workers Binding API +Workers Binding API primarily interacts with the data plane, and allows you to query your D1 database from your Worker. + +This requires you to: + +1. Bind your D1 database to your Worker. +2. Prepare a statement. +3. Run the statement. + +Refer to [Workers Binding API](/d1/worker-api/) for more information. + ## Query D1 with REST API -## Query D1 with Wrangler commands \ No newline at end of file +REST API primarily interacts with the control plane, and allows you to create/manage your D1 database. + +You can either use the REST API directly, or use Wrangler commands, which calls the REST API to perform its functions. + +- Refer to [D1 REST API](/api/operations/cloudflare-d1-create-database) for D1 REST API documentation. +- Refer to [D1 Wrangler commands](/d1/rest-api/wrangler-commands/) for the full list of D1 Wrangler commands. + +## Query D1 with SQL API + +D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine. + +- Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL queries. \ No newline at end of file diff --git a/src/content/docs/d1/rest-api/wrangler-commands.mdx b/src/content/docs/d1/rest-api/wrangler-commands.mdx index 7e99c47ba990c02..c44a67040b1b1f9 100644 --- a/src/content/docs/d1/rest-api/wrangler-commands.mdx +++ b/src/content/docs/d1/rest-api/wrangler-commands.mdx @@ -1,6 +1,6 @@ --- pcx_content_type: concept -title: Wrangler commands +title: D1 Wrangler commands sidebar: order: 6 @@ -8,7 +8,7 @@ sidebar: import { Render, Type, MetaInfo } from "~/components" -## `d1` +D1 Wrangler commands use REST APIs to interact with the control plane. This page lists the Wrangler commands for D1. From c6a77bbf4923fcd66f15e0fc382bb6b461fb6e96 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 17:56:46 +0000 Subject: [PATCH 10/23] Setting up redirects, fixing broken link. 
--- public/_redirects | 3 +++ src/content/docs/d1/build-with-d1/query-d1.mdx | 2 +- 2 files changed, 4 insertions(+), 1 deletion(-) diff --git a/public/_redirects b/public/_redirects index 22710f6552491d3..7b732150ef64e67 100644 --- a/public/_redirects +++ b/public/_redirects @@ -258,6 +258,7 @@ /constellation/ /workers-ai/ 301 # D1 +/d1/d1-api/ /d1/rest-api/d1-api 301/ /d1/client-api/ /d1/build-with-d1/d1-client-api/ 301 /d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 @@ -280,6 +281,7 @@ /d1/reference/client-api/ /d1/build-with-d1/d1-client-api/ 301 /d1/reference/environments/ /d1/configuration/environments/ 301 /d1/reference/metrics-analytics/ /d1/observability/metrics-analytics/ 301 +/d1/reference/wrangler-commands/ / /d1/rest-api/wrangler-commands/ 301 /d1/how-to/ /d1/build-with-d1/ 301 /d1/how-to/query-databases/ /d1/build-with-d1/d1-client-api/ 301 /d1/how-to/using-indexes/ /d1/build-with-d1/use-indexes/ 301 @@ -295,6 +297,7 @@ /d1/configuration/local-development/ /d1/build-with-d1/local-development/ 301 /d1/configuration/remote-development/ /d1/build-with-d1/remote-development/ 301 /d1/build-with-d1/import-data/ /d1/build-with-d1/import-export-data/ 301 +/d1/build-with-d1/d1-client-api/ /d1/build-with-d1/query-d1/ 301 /d1/reference/database-commands/ /d1/reference/sql-statements/ 301 /d1/reference/sql-statements/ /d1/sql-api/sql-statements/ 301 diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/build-with-d1/query-d1.mdx index bf4118b5512eac2..898455761d7f214 100644 --- a/src/content/docs/d1/build-with-d1/query-d1.mdx +++ b/src/content/docs/d1/build-with-d1/query-d1.mdx @@ -9,7 +9,7 @@ There are three primary ways you can query a D1 database: 1. Using [D1 Workers Binding API](/d1/worker-api/) in your code. 2. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). -3. Using [D1 Wrangler commands](/d1/wrangler-commands/). +3. Using [D1 Wrangler commands](/d1rest-api/wrangler-commands/). ## Query D1 with Workers Binding API From cd6be4c7b02d27cf5f6d2157939607667171d7b5 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 22 Nov 2024 18:55:39 +0000 Subject: [PATCH 11/23] Adding missing `/` --- src/content/docs/d1/build-with-d1/query-d1.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/build-with-d1/query-d1.mdx index 898455761d7f214..1ff0a4a5ca80dae 100644 --- a/src/content/docs/d1/build-with-d1/query-d1.mdx +++ b/src/content/docs/d1/build-with-d1/query-d1.mdx @@ -9,7 +9,7 @@ There are three primary ways you can query a D1 database: 1. Using [D1 Workers Binding API](/d1/worker-api/) in your code. 2. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). -3. Using [D1 Wrangler commands](/d1rest-api/wrangler-commands/). +3. Using [D1 Wrangler commands](/d1/rest-api/wrangler-commands/). ## Query D1 with Workers Binding API From 425ec467fe2753a21c235babe7ad26c7ac8f531a Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 25 Nov 2024 09:29:44 +0000 Subject: [PATCH 12/23] Fixing redirects. 
--- public/_redirects | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/public/_redirects b/public/_redirects index 7b732150ef64e67..ab6dff1f4ab4278 100644 --- a/public/_redirects +++ b/public/_redirects @@ -259,7 +259,7 @@ # D1 /d1/d1-api/ /d1/rest-api/d1-api 301/ -/d1/client-api/ /d1/build-with-d1/d1-client-api/ 301 +/d1/client-api/ /d1/worker-api/ 301 /d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 /d1/learning/debug-d1/ /d1/observability/debug-d1/ 301 @@ -273,26 +273,26 @@ /d1/migrations/ /d1/reference/migrations/ 301 /d1/platform/wrangler-commands/ /workers/wrangler/commands/#d1 301 /d1/platform/community-projects/ /d1/reference/community-projects/ 301 -/d1/platform/client-api/ /d1/build-with-d1/d1-client-api/ 301 +/d1/platform/client-api/ /d1/worker-api/ 301 /d1/platform/data-security/ /d1/reference/data-security/ 301 /d1/platform/environments/ /d1/configuration/environments/ 301 /d1/platform/metrics-analytics/ /d1/observability/metrics-analytics/ 301 /d1/platform/migrations/ /d1/reference/migrations/ 301 -/d1/reference/client-api/ /d1/build-with-d1/d1-client-api/ 301 +/d1/reference/client-api/ /d1/worker-api/ 301 /d1/reference/environments/ /d1/configuration/environments/ 301 /d1/reference/metrics-analytics/ /d1/observability/metrics-analytics/ 301 /d1/reference/wrangler-commands/ / /d1/rest-api/wrangler-commands/ 301 /d1/how-to/ /d1/build-with-d1/ 301 -/d1/how-to/query-databases/ /d1/build-with-d1/d1-client-api/ 301 +/d1/how-to/query-databases/ /d1/build-with-d1/query-d1/ 301 /d1/how-to/using-indexes/ /d1/build-with-d1/use-indexes/ 301 /d1/how-to/querying-json/ /d1/build-with-d1/query-json/ 301 /d1/how-to/importing-data/ /d1/build-with-d1/import-export-data/ 301 /d1/how-to/generated-columns/ /d1/reference/generated-columns/ 301 /d1/build-databases/ /d1/build-with-d1/ 301 -/d1/build-databases/query-databases/ /d1/build-with-d1/d1-client-api/ 301 +/d1/build-databases/query-databases/ /d1/build-with-d1/query-d1/ 301 /d1/build-databases/use-indexes/ /d1/build-with-d1/use-indexes/ 301 /d1/build-databases/import-data/ /d1/build-with-d1/import-export-data/ 301 -/d1/build-databases/client-api/ /d1/build-with-d1/d1-client-api/ 301 +/d1/build-databases/client-api/ /d1/worker-api/ 301 /d1/reference/query-json/ /d1/build-with-d1/query-json/ 301 /d1/configuration/local-development/ /d1/build-with-d1/local-development/ 301 /d1/configuration/remote-development/ /d1/build-with-d1/remote-development/ 301 From 8b38884a7a358df1bb600b737644aa330fbc4203 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 25 Nov 2024 09:53:05 +0000 Subject: [PATCH 13/23] Resolving two conflicting redirects. 
--- public/_redirects | 1 - 1 file changed, 1 deletion(-) diff --git a/public/_redirects b/public/_redirects index ab6dff1f4ab4278..4e3bc0ac0f8b3b9 100644 --- a/public/_redirects +++ b/public/_redirects @@ -297,7 +297,6 @@ /d1/configuration/local-development/ /d1/build-with-d1/local-development/ 301 /d1/configuration/remote-development/ /d1/build-with-d1/remote-development/ 301 /d1/build-with-d1/import-data/ /d1/build-with-d1/import-export-data/ 301 -/d1/build-with-d1/d1-client-api/ /d1/build-with-d1/query-d1/ 301 /d1/reference/database-commands/ /d1/reference/sql-statements/ 301 /d1/reference/sql-statements/ /d1/sql-api/sql-statements/ 301 From 890c682880ce75503148cd936e745027077adee5 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Tue, 26 Nov 2024 11:20:22 +0000 Subject: [PATCH 14/23] Fixing redirect --- public/_redirects | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/public/_redirects b/public/_redirects index 4e3bc0ac0f8b3b9..27409d943400fc2 100644 --- a/public/_redirects +++ b/public/_redirects @@ -258,7 +258,7 @@ /constellation/ /workers-ai/ 301 # D1 -/d1/d1-api/ /d1/rest-api/d1-api 301/ +/d1/d1-api/ /d1/rest-api/d1-api/ 301 /d1/client-api/ /d1/worker-api/ 301 /d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 From 5cc55062777b60ea2b319f1bad8fc433451cb85e Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Thu, 5 Dec 2024 18:05:35 +0000 Subject: [PATCH 15/23] Reordering API pages, moving Remote dev below Local dev. --- .../docs/d1/build-with-d1/remote-development.mdx | 2 +- .../docs/d1/build-with-d1/use-d1-from-pages.mdx | 2 +- src/content/docs/d1/{rest-api => }/d1-api.mdx | 2 +- src/content/docs/d1/rest-api/index.mdx | 12 ------------ src/content/docs/d1/sql-api/index.mdx | 2 +- .../docs/d1/{rest-api => }/wrangler-commands.mdx | 0 6 files changed, 4 insertions(+), 16 deletions(-) rename src/content/docs/d1/{rest-api => }/d1-api.mdx (92%) delete mode 100644 src/content/docs/d1/rest-api/index.mdx rename src/content/docs/d1/{rest-api => }/wrangler-commands.mdx (100%) diff --git a/src/content/docs/d1/build-with-d1/remote-development.mdx b/src/content/docs/d1/build-with-d1/remote-development.mdx index e287f36105a425a..95abccf312bf727 100644 --- a/src/content/docs/d1/build-with-d1/remote-development.mdx +++ b/src/content/docs/d1/build-with-d1/remote-development.mdx @@ -2,7 +2,7 @@ title: Remote development pcx_content_type: concept sidebar: - order: 4 + order: 9 --- diff --git a/src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx b/src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx index 3d205e72e444f2a..ded49eff452bd5e 100644 --- a/src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx +++ b/src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx @@ -3,6 +3,6 @@ pcx_content_type: navigation title: Use D1 from Pages external_link: /pages/functions/bindings/#d1-databases sidebar: - order: 10 + order: 11 --- diff --git a/src/content/docs/d1/rest-api/d1-api.mdx b/src/content/docs/d1/d1-api.mdx similarity index 92% rename from src/content/docs/d1/rest-api/d1-api.mdx rename to src/content/docs/d1/d1-api.mdx index 4c584b8c309de13..8663e4653b448aa 100644 --- a/src/content/docs/d1/rest-api/d1-api.mdx +++ b/src/content/docs/d1/d1-api.mdx @@ -3,6 +3,6 @@ pcx_content_type: navigation title: D1 REST API external_link: /api/operations/cloudflare-d1-create-database sidebar: - order: 5 + order: 6 --- diff --git a/src/content/docs/d1/rest-api/index.mdx b/src/content/docs/d1/rest-api/index.mdx deleted file 
mode 100644 index f189f0f1e266e43..000000000000000 --- a/src/content/docs/d1/rest-api/index.mdx +++ /dev/null @@ -1,12 +0,0 @@ ---- -pcx_content_type: navigation -title: REST API -sidebar: - order: 5 - group: - hideIndex: true ---- - -import { DirectoryListing } from "~/components"; - - diff --git a/src/content/docs/d1/sql-api/index.mdx b/src/content/docs/d1/sql-api/index.mdx index 3228a42dd6a9bcd..8b74fb521d38d05 100644 --- a/src/content/docs/d1/sql-api/index.mdx +++ b/src/content/docs/d1/sql-api/index.mdx @@ -2,7 +2,7 @@ title: SQL API pcx_content_type: navigation sidebar: - order: 6 + order: 5 group: hideIndex: true --- diff --git a/src/content/docs/d1/rest-api/wrangler-commands.mdx b/src/content/docs/d1/wrangler-commands.mdx similarity index 100% rename from src/content/docs/d1/rest-api/wrangler-commands.mdx rename to src/content/docs/d1/wrangler-commands.mdx From ac438d2958b57f83613b85fdae2f606c8cb7d450 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 6 Dec 2024 10:30:54 +0000 Subject: [PATCH 16/23] Editing chapter contents to action feedback. --- public/_redirects | 3 +- .../d1/build-with-d1/import-export-data.mdx | 12 +-- .../docs/d1/build-with-d1/query-d1.mdx | 77 ++++++++++++++++--- 3 files changed, 74 insertions(+), 18 deletions(-) diff --git a/public/_redirects b/public/_redirects index 298baa9f9508c7a..78eab1f80bc3b40 100644 --- a/public/_redirects +++ b/public/_redirects @@ -258,7 +258,6 @@ /constellation/ /workers-ai/ 301 # D1 -/d1/d1-api/ /d1/rest-api/d1-api/ 301 /d1/client-api/ /d1/worker-api/ 301 /d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 @@ -281,7 +280,7 @@ /d1/reference/client-api/ /d1/worker-api/ 301 /d1/reference/environments/ /d1/configuration/environments/ 301 /d1/reference/metrics-analytics/ /d1/observability/metrics-analytics/ 301 -/d1/reference/wrangler-commands/ / /d1/rest-api/wrangler-commands/ 301 +/d1/reference/wrangler-commands/ / /d1/wrangler-commands/ 301 /d1/how-to/ /d1/build-with-d1/ 301 /d1/how-to/query-databases/ /d1/build-with-d1/query-d1/ 301 /d1/how-to/using-indexes/ /d1/build-with-d1/use-indexes/ 301 diff --git a/src/content/docs/d1/build-with-d1/import-export-data.mdx b/src/content/docs/d1/build-with-d1/import-export-data.mdx index a4c14e898a80b9e..bda482d521767a3 100644 --- a/src/content/docs/d1/build-with-d1/import-export-data.mdx +++ b/src/content/docs/d1/build-with-d1/import-export-data.mdx @@ -105,12 +105,6 @@ Once you have run the above command, you will need to edit the output SQL file t You can then follow the steps to [import an existing database](#import-an-existing-database) into D1 by using the `.sql` file you generated from the database dump as the input to `wrangler d1 execute`. -## Foreign key constraints - -When importing data, you may need to temporarily disable [foreign key constraints](/d1/build-with-d1/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. - -Refer to the [foreign key documentation](/d1/build-with-d1/foreign-keys/) to learn more about how to work with foreign keys and D1. - ## Export an existing D1 database In addition to importing existing SQLite databases, you might want to export a D1 database for local development or testing. You can export a D1 database to a `.sql` file using [wrangler d1 export](/workers/wrangler/commands/#export) and then execute (import) with `d1 execute --file`. 
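For example, a round trip that exports a remote database and then re-imports the resulting file might look like this, assuming a database named `my-database` (substitute your own database name and output path):

```sh
# Export the remote database's schema and data to a local SQL file
npx wrangler d1 export my-database --remote --output=./backup.sql

# Import (execute) the exported file against a D1 database
npx wrangler d1 execute my-database --remote --file=./backup.sql
```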
@@ -198,6 +192,12 @@ VALUES ('1000', 'Boris Pewter', '2022-12-15 22:16:15'); ``` +## Foreign key constraints + +When importing data, you may need to temporarily disable [foreign key constraints](/d1/build-with-d1/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. + +Refer to the [foreign key documentation](/d1/build-with-d1/foreign-keys/) to learn more about how to work with foreign keys and D1. + ## Next Steps - Read the SQLite [`CREATE TABLE`](https://www.sqlite.org/lang_createtable.html) documentation. diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/build-with-d1/query-d1.mdx index 1ff0a4a5ca80dae..fd021ebe6495084 100644 --- a/src/content/docs/d1/build-with-d1/query-d1.mdx +++ b/src/content/docs/d1/build-with-d1/query-d1.mdx @@ -1,15 +1,16 @@ --- -title: Query D1 +title: Query a database pcx_content_type: concept sidebar: order: 1 --- -There are three primary ways you can query a D1 database: +There are a number of ways you can interact with a D1 database: 1. Using [D1 Workers Binding API](/d1/worker-api/) in your code. -2. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). -3. Using [D1 Wrangler commands](/d1/rest-api/wrangler-commands/). +2. Using [SQL API](/d1/sql-api/sql-statements/). +3. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). +4. Using [D1 Wrangler commands](/d1/wrangler-commands/). ## Query D1 with Workers Binding API @@ -21,19 +22,75 @@ This requires you to: 2. Prepare a statement. 3. Run the statement. +```js title="index.js" +export default { + async fetch(request, env) { + const {pathname} = new URL(request.url); + const companyName1 = `Bs Beverages`; + const companyName2 = `Around the Horn`; + const stmt = env.DB.prepare(`SELECT * FROM Customers WHERE CompanyName = ?`); + + if (pathname === `/RUN`) { + const returnValue = await stmt.bind(companyName1).run(); + return Response.json(returnValue); + } + + return new Response( + `Welcome to the D1 API Playground! + \nChange the URL to test the various methods inside your index.js file.`, + ); + }, +}; +``` + Refer to [Workers Binding API](/d1/worker-api/) for more information. +## Query D1 with SQL API + +D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine. + +```sh +npx wrangler d1 execute prod-d1-tutorial --local --command="SELECT * FROM Customers" +``` + +```sh output +🌀 Mapping SQL input into an array of statements +🌀 Executing on local database production-db-backend (database-id) from .wrangler/state/v3/d1: +┌────────────┬─────────────────────┬───────────────────┐ +│ CustomerId │ CompanyName │ ContactName │ +├────────────┼─────────────────────┼───────────────────┤ +│ 1 │ Alfreds Futterkiste │ Maria Anders │ +├────────────┼─────────────────────┼───────────────────┤ +│ 4 │ Around the Horn │ Thomas Hardy │ +├────────────┼─────────────────────┼───────────────────┤ +│ 11 │ Bs Beverages │ Victoria Ashworth │ +├────────────┼─────────────────────┼───────────────────┤ +│ 13 │ Bs Beverages │ Random Name │ +└────────────┴─────────────────────┴───────────────────┘ +``` + +Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL queries. + ## Query D1 with REST API REST API primarily interacts with the control plane, and allows you to create/manage your D1 database. -You can either use the REST API directly, or use Wrangler commands, which calls the REST API to perform its functions. 
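+
+For illustration, you could create a database directly with a `curl` call such as the following sketch (the account ID and API token are placeholders for your own values; the exact request and response shape is documented in the API reference):
+
+```sh
+# Illustrative sketch only: create a D1 database named "prod-d1-tutorial" via the REST API.
+# $ACCOUNT_ID and $CLOUDFLARE_API_TOKEN are placeholders you must supply yourself.
+curl "https://api.cloudflare.com/client/v4/accounts/$ACCOUNT_ID/d1/database" \
+  --header "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
+  --header "Content-Type: application/json" \
+  --data '{"name": "prod-d1-tutorial"}'
+```
+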
+Refer to [D1 REST API](/api/operations/cloudflare-d1-create-database) for D1 REST API documentation. -- Refer to [D1 REST API](/api/operations/cloudflare-d1-create-database) for D1 REST API documentation. -- Refer to [D1 Wrangler commands](/d1/rest-api/wrangler-commands/) for the full list of D1 Wrangler commands. +## Query D1 with Wrangler commands -## Query D1 with SQL API +You can use Wrangler commands to interact with the control plane. Note that Wrangler commands use REST APIs to perform its operations. -D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine. +```sh +npx wrangler d1 create prod-d1-tutorial +``` + +```sh output + +✅ Successfully created DB 'prod-d1-tutorial' -- Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL queries. \ No newline at end of file +[[d1_databases]] +binding = "DB" # available in your Worker on env.DB +database_name = "prod-d1-tutorial" +database_id = "" +``` \ No newline at end of file From 8ac8712cfcedd49d9aa7c8763e48ff3854b64fdb Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Fri, 6 Dec 2024 10:52:29 +0000 Subject: [PATCH 17/23] Renaming "Build with D1" to "Best practices". --- public/_redirects | 44 +++++++++++-------- src/content/changelogs/d1.yaml | 8 ++-- .../foreign-keys.mdx | 2 +- .../import-export-data.mdx | 6 +-- .../index.mdx | 2 +- .../local-development.mdx | 0 .../query-d1.mdx | 0 .../query-json.mdx | 2 +- .../remote-development.mdx | 0 .../use-d1-from-pages.mdx | 0 .../use-indexes.mdx | 0 .../examples/query-d1-from-python-workers.mdx | 2 +- src/content/docs/d1/index.mdx | 2 +- .../d1/observability/metrics-analytics.mdx | 2 +- src/content/docs/d1/platform/pricing.mdx | 2 +- src/content/docs/d1/reference/backups.mdx | 2 +- .../docs/d1/reference/generated-columns.mdx | 6 +-- src/content/docs/d1/reference/migrations.mdx | 4 +- .../docs/d1/sql-api/sql-statements.mdx | 4 +- .../d1/tutorials/d1-and-prisma-orm/index.mdx | 4 +- .../import-to-d1-with-rest-api/index.mdx | 2 +- src/content/docs/pages/functions/bindings.mdx | 2 +- .../workers/testing/local-development.mdx | 2 +- .../partials/d1/use-pragma-statements.mdx | 4 +- .../durable-objects/durable-objects-vs-d1.mdx | 8 ++-- src/content/partials/workers/d1-pricing.mdx | 4 +- 26 files changed, 61 insertions(+), 53 deletions(-) rename src/content/docs/d1/{build-with-d1 => best-practices}/foreign-keys.mdx (96%) rename src/content/docs/d1/{build-with-d1 => best-practices}/import-export-data.mdx (95%) rename src/content/docs/d1/{build-with-d1 => best-practices}/index.mdx (87%) rename src/content/docs/d1/{build-with-d1 => best-practices}/local-development.mdx (100%) rename src/content/docs/d1/{build-with-d1 => best-practices}/query-d1.mdx (100%) rename src/content/docs/d1/{build-with-d1 => best-practices}/query-json.mdx (98%) rename src/content/docs/d1/{build-with-d1 => best-practices}/remote-development.mdx (100%) rename src/content/docs/d1/{build-with-d1 => best-practices}/use-d1-from-pages.mdx (100%) rename src/content/docs/d1/{build-with-d1 => best-practices}/use-indexes.mdx (100%) diff --git a/public/_redirects b/public/_redirects index 78eab1f80bc3b40..8057fc0b3446ded 100644 --- a/public/_redirects +++ b/public/_redirects @@ -260,14 +260,23 @@ # D1 /d1/client-api/ /d1/worker-api/ 301 /d1/build-with-d1/d1-client-api/ /d1/worker-api/ 301 +/d1/build-with-d1/import-data/ /d1/best-practices/import-export-data/ 301 +/d1/build-with-d1/ /d1/best-practices/ 301 +/d1/build-with-d1/import-export-data/ 
/d1/best-practices/import-export-data/ 301 +/d1/build-with-d1/use-indexes/ /d1/best-practices/use-indexes/ 301 +/d1/build-with-d1/remote-development/ /d1/best-practices/remote-development/ 301 +/d1/build-with-d1/local-development/ /d1/best-practices/local-development/ 301 +/d1/build-with-d1/foreign-keys/ /d1/best-practices/foreign-keys/ 301 +/d1/build-with-d1/query-json/ /d1/best-practices/query-json/ 301 +/d1/build-with-d1/use-d1-from-pages/ /d1/best-practices/use-d1-from-pages/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 /d1/learning/debug-d1/ /d1/observability/debug-d1/ 301 -/d1/learning/using-indexes/ /d1/build-with-d1/use-indexes/ 301 -/d1/learning/querying-json/ /d1/build-with-d1/query-json/ 301 -/d1/learning/importing-data/ /d1/build-with-d1/import-export-data/ 301 +/d1/learning/using-indexes/ /d1/best-practices/use-indexes/ 301 +/d1/learning/querying-json/ /d1/best-practices/query-json/ 301 +/d1/learning/importing-data/ /d1/best-practices/import-export-data/ 301 /d1/learning/generated-columns/ /d1/reference/generated-columns/ 301 -/d1/learning/local-development/ /d1/build-with-d1/local-development/ 301 -/d1/learning/remote-development/ /d1/build-with-d1/remote-development/ 301 +/d1/learning/local-development/ /d1/best-practices/local-development/ 301 +/d1/learning/remote-development/ /d1/best-practices/remote-development/ 301 /d1/learning/data-location/ /d1/configuration/data-location/ 301 /d1/migrations/ /d1/reference/migrations/ 301 /d1/platform/wrangler-commands/ /workers/wrangler/commands/#d1 301 @@ -281,21 +290,20 @@ /d1/reference/environments/ /d1/configuration/environments/ 301 /d1/reference/metrics-analytics/ /d1/observability/metrics-analytics/ 301 /d1/reference/wrangler-commands/ / /d1/wrangler-commands/ 301 -/d1/how-to/ /d1/build-with-d1/ 301 -/d1/how-to/query-databases/ /d1/build-with-d1/query-d1/ 301 -/d1/how-to/using-indexes/ /d1/build-with-d1/use-indexes/ 301 -/d1/how-to/querying-json/ /d1/build-with-d1/query-json/ 301 -/d1/how-to/importing-data/ /d1/build-with-d1/import-export-data/ 301 +/d1/how-to/ /d1/best-practices/ 301 +/d1/how-to/query-databases/ /d1/best-practices/query-d1/ 301 +/d1/how-to/using-indexes/ /d1/best-practices/use-indexes/ 301 +/d1/how-to/querying-json/ /d1/best-practices/query-json/ 301 +/d1/how-to/importing-data/ /d1/best-practices/import-export-data/ 301 /d1/how-to/generated-columns/ /d1/reference/generated-columns/ 301 -/d1/build-databases/ /d1/build-with-d1/ 301 -/d1/build-databases/query-databases/ /d1/build-with-d1/query-d1/ 301 -/d1/build-databases/use-indexes/ /d1/build-with-d1/use-indexes/ 301 -/d1/build-databases/import-data/ /d1/build-with-d1/import-export-data/ 301 +/d1/build-databases/ /d1/best-practices/ 301 +/d1/build-databases/query-databases/ /d1/best-practices/query-d1/ 301 +/d1/build-databases/use-indexes/ /d1/best-practices/use-indexes/ 301 +/d1/build-databases/import-data/ /d1/best-practices/import-export-data/ 301 /d1/build-databases/client-api/ /d1/worker-api/ 301 -/d1/reference/query-json/ /d1/build-with-d1/query-json/ 301 -/d1/configuration/local-development/ /d1/build-with-d1/local-development/ 301 -/d1/configuration/remote-development/ /d1/build-with-d1/remote-development/ 301 -/d1/build-with-d1/import-data/ /d1/build-with-d1/import-export-data/ 301 +/d1/reference/query-json/ /d1/best-practices/query-json/ 301 +/d1/configuration/local-development/ /d1/best-practices/local-development/ 301 +/d1/configuration/remote-development/ /d1/best-practices/remote-development/ 301 
/d1/reference/database-commands/ /d1/reference/sql-statements/ 301 /d1/reference/sql-statements/ /d1/sql-api/sql-statements/ 301 diff --git a/src/content/changelogs/d1.yaml b/src/content/changelogs/d1.yaml index 7667f247df6a86a..6d1abe70d7d21e9 100644 --- a/src/content/changelogs/d1.yaml +++ b/src/content/changelogs/d1.yaml @@ -58,7 +58,7 @@ entries: * Developers with a Workers Paid plan now have a 10GB GB per-database limit (up from 2GB), which can be combined with existing limit of 50,000 databases per account. * Developers with a Workers Free plan retain the 500 MB per-database limit and can create up to 10 databases per account. - * D1 databases can be [exported](/d1/build-with-d1/import-export-data/#export-an-existing-d1-database) as a SQL file. + * D1 databases can be [exported](/d1/best-practices/import-export-data/#export-an-existing-d1-database) as a SQL file. - publish_date: "2024-03-12" title: Change in `wrangler d1 execute` default @@ -138,7 +138,7 @@ entries: - publish_date: "2023-08-19" title: Row count now returned per query description: |- - D1 now returns a count of `rows_written` and `rows_read` for every query executed, allowing you to assess the cost of query for both [pricing](/d1/platform/pricing/) and [index optimization](/d1/build-with-d1/use-indexes/) purposes. + D1 now returns a count of `rows_written` and `rows_read` for every query executed, allowing you to assess the cost of query for both [pricing](/d1/platform/pricing/) and [index optimization](/d1/best-practices/use-indexes/) purposes. The `meta` object returned in [D1's Client API](/d1/worker-api/return-object/#d1result) contains a total count of the rows read (`rows_read`) and rows written (`rows_written`) by that query. For example, a query that performs a full table scan (for example, `SELECT * FROM users`) from a table with 5000 rows would return a `rows_read` value of `5000`: ```json @@ -195,7 +195,7 @@ entries: New documentation has been published on how to use D1's support for [generated columns](/d1/reference/generated-columns/) to define columns that are dynamically generated on write (or read). Generated columns allow you to extract - data from [JSON objects](/d1/build-with-d1/query-json/) or use the output of other + data from [JSON objects](/d1/best-practices/query-json/) or use the output of other SQL functions. - publish_date: "2023-06-12" title: Deprecating Error.cause @@ -223,7 +223,7 @@ entries: - publish_date: "2023-05-17" title: Query JSON description: - "[New documentation](/d1/build-with-d1/query-json/) has been published + "[New documentation](/d1/best-practices/query-json/) has been published that covers D1's extensive JSON function support. JSON functions allow you to parse, query and modify JSON directly from your SQL queries, reducing the number of round trips to your database, or data queried." 
diff --git a/src/content/docs/d1/build-with-d1/foreign-keys.mdx b/src/content/docs/d1/best-practices/foreign-keys.mdx similarity index 96% rename from src/content/docs/d1/build-with-d1/foreign-keys.mdx rename to src/content/docs/d1/best-practices/foreign-keys.mdx index 27d274f2c77a4ab..0472d41719c26b3 100644 --- a/src/content/docs/d1/build-with-d1/foreign-keys.mdx +++ b/src/content/docs/d1/best-practices/foreign-keys.mdx @@ -16,7 +16,7 @@ By default, D1 enforces that foreign key constraints are valid within all querie ## Defer foreign key constraints -When running a [query](/d1/worker-api/), [migration](/d1/reference/migrations/) or [importing data](/d1/build-with-d1/import-export-data/) against a D1 database, there may be situations in which you need to disable foreign key validation during table creation or changes to your schema. +When running a [query](/d1/worker-api/), [migration](/d1/reference/migrations/) or [importing data](/d1/best-practices/import-export-data/) against a D1 database, there may be situations in which you need to disable foreign key validation during table creation or changes to your schema. D1's foreign key enforcement is equivalent to SQLite's `PRAGMA foreign_keys = on` directive. Because D1 runs every query inside an implicit transaction, user queries cannot change this during a query or migration. diff --git a/src/content/docs/d1/build-with-d1/import-export-data.mdx b/src/content/docs/d1/best-practices/import-export-data.mdx similarity index 95% rename from src/content/docs/d1/build-with-d1/import-export-data.mdx rename to src/content/docs/d1/best-practices/import-export-data.mdx index bda482d521767a3..4c360db83e90ba1 100644 --- a/src/content/docs/d1/build-with-d1/import-export-data.mdx +++ b/src/content/docs/d1/best-practices/import-export-data.mdx @@ -7,7 +7,7 @@ sidebar: D1 allows you to import existing SQLite tables and their data directly, enabling you to migrate existing data into D1 quickly and easily. This can be useful when migrating applications to use Workers and D1, or when you want to prototype a schema locally before importing it to your D1 database(s). -D1 also allows you to export a database. This can be useful for [local development](/d1/build-with-d1/local-development/) or testing. +D1 also allows you to export a database. This can be useful for [local development](/d1/best-practices/local-development/) or testing. ## Import an existing database @@ -194,9 +194,9 @@ VALUES ## Foreign key constraints -When importing data, you may need to temporarily disable [foreign key constraints](/d1/build-with-d1/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. +When importing data, you may need to temporarily disable [foreign key constraints](/d1/best-practices/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. -Refer to the [foreign key documentation](/d1/build-with-d1/foreign-keys/) to learn more about how to work with foreign keys and D1. +Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys and D1. 
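+
+As an illustrative sketch only, an import that relies on deferred foreign key checks could be run with Wrangler as follows, where `example-db` and `import.sql` are hypothetical names and the first statement inside `import.sql` is assumed to be `PRAGMA defer_foreign_keys = true;`:
+
+```sh
+# Sketch: run a SQL import whose first statement defers foreign key enforcement,
+# so later INSERTs can reference rows that only appear further down the same file.
+# "example-db" and ./import.sql are placeholders for your own database and dump file.
+npx wrangler d1 execute example-db --remote --file=./import.sql
+```
+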
## Next Steps diff --git a/src/content/docs/d1/build-with-d1/index.mdx b/src/content/docs/d1/best-practices/index.mdx similarity index 87% rename from src/content/docs/d1/build-with-d1/index.mdx rename to src/content/docs/d1/best-practices/index.mdx index d7c4c1c7e366171..57f21ba7643ffcc 100644 --- a/src/content/docs/d1/build-with-d1/index.mdx +++ b/src/content/docs/d1/best-practices/index.mdx @@ -1,5 +1,5 @@ --- -title: Build with D1 +title: Best practices pcx_content_type: navigation sidebar: order: 3 diff --git a/src/content/docs/d1/build-with-d1/local-development.mdx b/src/content/docs/d1/best-practices/local-development.mdx similarity index 100% rename from src/content/docs/d1/build-with-d1/local-development.mdx rename to src/content/docs/d1/best-practices/local-development.mdx diff --git a/src/content/docs/d1/build-with-d1/query-d1.mdx b/src/content/docs/d1/best-practices/query-d1.mdx similarity index 100% rename from src/content/docs/d1/build-with-d1/query-d1.mdx rename to src/content/docs/d1/best-practices/query-d1.mdx diff --git a/src/content/docs/d1/build-with-d1/query-json.mdx b/src/content/docs/d1/best-practices/query-json.mdx similarity index 98% rename from src/content/docs/d1/build-with-d1/query-json.mdx rename to src/content/docs/d1/best-practices/query-json.mdx index 6b6832a152da3eb..55f53a6781f4fef 100644 --- a/src/content/docs/d1/build-with-d1/query-json.mdx +++ b/src/content/docs/d1/best-practices/query-json.mdx @@ -78,7 +78,7 @@ ERROR 9015: SQL engine error: query error: Error code 1: SQL error or missing da D1's support for [generated columns](/d1/reference/generated-columns/) allows you to create dynamic columns that are generated based on the values of other columns, including extracted or calculated values of JSON data. -These columns can be queried like any other column, and can have [indexes](/d1/build-with-d1/use-indexes/) defined on them. If you have JSON data that you frequently query and filter over, creating a generated column and an index can dramatically improve query performance. +These columns can be queried like any other column, and can have [indexes](/d1/best-practices/use-indexes/) defined on them. If you have JSON data that you frequently query and filter over, creating a generated column and an index can dramatically improve query performance. 
For example, to define a column based on a value within a larger JSON object, use the `AS` keyword combined with a [JSON function](#supported-functions) to generate a typed column: diff --git a/src/content/docs/d1/build-with-d1/remote-development.mdx b/src/content/docs/d1/best-practices/remote-development.mdx similarity index 100% rename from src/content/docs/d1/build-with-d1/remote-development.mdx rename to src/content/docs/d1/best-practices/remote-development.mdx diff --git a/src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx b/src/content/docs/d1/best-practices/use-d1-from-pages.mdx similarity index 100% rename from src/content/docs/d1/build-with-d1/use-d1-from-pages.mdx rename to src/content/docs/d1/best-practices/use-d1-from-pages.mdx diff --git a/src/content/docs/d1/build-with-d1/use-indexes.mdx b/src/content/docs/d1/best-practices/use-indexes.mdx similarity index 100% rename from src/content/docs/d1/build-with-d1/use-indexes.mdx rename to src/content/docs/d1/best-practices/use-indexes.mdx diff --git a/src/content/docs/d1/examples/query-d1-from-python-workers.mdx b/src/content/docs/d1/examples/query-d1-from-python-workers.mdx index 59a7ab8f47f0aec..e5a2f198f6358d4 100644 --- a/src/content/docs/d1/examples/query-d1-from-python-workers.mdx +++ b/src/content/docs/d1/examples/query-d1-from-python-workers.mdx @@ -127,4 +127,4 @@ If you receive an error deploying: - Refer to [Workers Python documentation](/workers/languages/python/) to learn more about how to use Python in Workers. - Review the [D1 Workers Binding API](/d1/worker-api/) and how to query D1 databases. -- Learn [how to import data](/d1/build-with-d1/import-export-data/) to your D1 database. +- Learn [how to import data](/d1/best-practices/import-export-data/) to your D1 database. diff --git a/src/content/docs/d1/index.mdx b/src/content/docs/d1/index.mdx index f4852932cf3274b..f6992adcf967408 100644 --- a/src/content/docs/d1/index.mdx +++ b/src/content/docs/d1/index.mdx @@ -27,7 +27,7 @@ D1 is Cloudflare's managed, serverless database with SQLite's SQL semantics, bui D1 is designed for horizontal scale out across multiple, smaller (10 GB) databases, such as per-user, per-tenant or per-entity databases. D1 allows you to build applications with thousands of databases at no extra cost for isolating with multiple databases. D1 pricing is based only on query and storage costs. -Create your first D1 database by [following the Get started guide](/d1/get-started/), learn how to [import data into a database](/d1/build-with-d1/import-export-data/), and how to [interact with your database](/d1/worker-api/) directly from [Workers](/workers/) or [Pages](/pages/functions/bindings/#d1-databases). +Create your first D1 database by [following the Get started guide](/d1/get-started/), learn how to [import data into a database](/d1/best-practices/import-export-data/), and how to [interact with your database](/d1/worker-api/) directly from [Workers](/workers/) or [Pages](/pages/functions/bindings/#d1-databases). *** diff --git a/src/content/docs/d1/observability/metrics-analytics.mdx b/src/content/docs/d1/observability/metrics-analytics.mdx index f1b84d49bf5536f..63ac9f87d8aecbc 100644 --- a/src/content/docs/d1/observability/metrics-analytics.mdx +++ b/src/content/docs/d1/observability/metrics-analytics.mdx @@ -30,7 +30,7 @@ Metrics can be queried (and are retained) for the past 31 days. 
D1 returns the number of rows read, rows written (or both) in response to each individual query via [the Workers Binding API](/d1/worker-api/return-object/). Row counts are a precise count of how many rows were read (scanned) or written by that query. -Inspect row counts to understand the performance and cost of a given query, including whether you can reduce the rows read [using indexes](/d1/build-with-d1/use-indexes/). Use query counts to understand the total volume of traffic against your databases and to discern which databases are actively in-use. +Inspect row counts to understand the performance and cost of a given query, including whether you can reduce the rows read [using indexes](/d1/best-practices/use-indexes/). Use query counts to understand the total volume of traffic against your databases and to discern which databases are actively in-use. Refer to the [Pricing documentation](/d1/platform/pricing/) for more details on how rows are counted. diff --git a/src/content/docs/d1/platform/pricing.mdx b/src/content/docs/d1/platform/pricing.mdx index 4790ec8e1cae3a7..b8a77fc2f021b5a 100644 --- a/src/content/docs/d1/platform/pricing.mdx +++ b/src/content/docs/d1/platform/pricing.mdx @@ -64,7 +64,7 @@ Yes, any queries you run against your database, including inserting (`INSERT`) e ### Can I use an index to reduce the number of rows read by a query? -Yes, you can use an index to reduce the number of rows read by a query. [Creating indexes](/d1/build-with-d1/use-indexes/) for your most queried tables and filtered columns reduces how much data is scanned and improves query performance at the same time. If you have a read-heavy workload (most common), this can be particularly advantageous. Writing to columns referenced in an index will add at least one (1) additional row written to account for updating the index, but this is typically offset by the reduction in rows read due to the benefits of an index. +Yes, you can use an index to reduce the number of rows read by a query. [Creating indexes](/d1/best-practices/use-indexes/) for your most queried tables and filtered columns reduces how much data is scanned and improves query performance at the same time. If you have a read-heavy workload (most common), this can be particularly advantageous. Writing to columns referenced in an index will add at least one (1) additional row written to account for updating the index, but this is typically offset by the reduction in rows read due to the benefits of an index. ### Does a freshly created database, and/or an empty table with no rows, contribute to my storage? diff --git a/src/content/docs/d1/reference/backups.mdx b/src/content/docs/d1/reference/backups.mdx index 5b30f93e190eb6d..0b26855c41a4f5c 100644 --- a/src/content/docs/d1/reference/backups.mdx +++ b/src/content/docs/d1/reference/backups.mdx @@ -86,7 +86,7 @@ wrangler d1 backup download example-db 123a81a2-ab91-4c2e-8ebc-64d69633faf1 🌀 Done! ``` -The database backup will be download to the current working directory in native SQLite3 format. To import a local database, read [the documentation on importing data](/d1/build-with-d1/import-export-data/) to D1. +The database backup will be download to the current working directory in native SQLite3 format. To import a local database, read [the documentation on importing data](/d1/best-practices/import-export-data/) to D1. 
## Restoring a backup diff --git a/src/content/docs/d1/reference/generated-columns.mdx b/src/content/docs/d1/reference/generated-columns.mdx index e36001ac6fb46d9..c1ccfbec64c9a07 100644 --- a/src/content/docs/d1/reference/generated-columns.mdx +++ b/src/content/docs/d1/reference/generated-columns.mdx @@ -6,11 +6,11 @@ sidebar: --- -D1 allows you to define generated columns based on the values of one or more other columns, SQL functions, or even [extracted JSON values](/d1/build-with-d1/query-json/). +D1 allows you to define generated columns based on the values of one or more other columns, SQL functions, or even [extracted JSON values](/d1/best-practices/query-json/). This allows you to normalize your data as you write to it or read it from a table, making it easier to query and reducing the need for complex application logic. -Generated columns can also have [indexes defined](/d1/build-with-d1/use-indexes/) against them, which can dramatically increase query performance over frequently queried fields. +Generated columns can also have [indexes defined](/d1/best-practices/use-indexes/) against them, which can dramatically increase query performance over frequently queried fields. ## Types of generated columns @@ -48,7 +48,7 @@ As a concrete example, to automatically extract the `location` value from the fo } ``` -To define a generated column with the value of `$.measurement.location`, you can use the [`json_extract`](/d1/build-with-d1/query-json/#extract-values) function to extract the value from the `raw_data` column each time you write to that row: +To define a generated column with the value of `$.measurement.location`, you can use the [`json_extract`](/d1/best-practices/query-json/#extract-values) function to extract the value from the `raw_data` column each time you write to that row: ```sql CREATE TABLE sensor_readings ( diff --git a/src/content/docs/d1/reference/migrations.mdx b/src/content/docs/d1/reference/migrations.mdx index 495fbf6b207f104..07ef45c63bba263 100644 --- a/src/content/docs/d1/reference/migrations.mdx +++ b/src/content/docs/d1/reference/migrations.mdx @@ -42,6 +42,6 @@ migrations_dir = "" # Specify your custom migration directory ## Foreign key constraints -When applying a migration, you may need to temporarily disable [foreign key constraints](/d1/build-with-d1/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. +When applying a migration, you may need to temporarily disable [foreign key constraints](/d1/best-practices/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. -Refer to the [foreign key documentation](/d1/build-with-d1/foreign-keys/) to learn more about how to work with foreign keys and D1. +Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys and D1. diff --git a/src/content/docs/d1/sql-api/sql-statements.mdx b/src/content/docs/d1/sql-api/sql-statements.mdx index 06d3efbb51c2e3c..880f89dbaa1c365 100644 --- a/src/content/docs/d1/sql-api/sql-statements.mdx +++ b/src/content/docs/d1/sql-api/sql-statements.mdx @@ -73,6 +73,6 @@ results: [...] ## Related resources -- Learn [how to create indexes](/d1/build-with-d1/use-indexes/#list-indexes) in D1. -- Use D1's [JSON functions](/d1/build-with-d1/query-json/) to query JSON data. +- Learn [how to create indexes](/d1/best-practices/use-indexes/#list-indexes) in D1. 
+- Use D1's [JSON functions](/d1/best-practices/query-json/) to query JSON data. - Use [`wrangler dev`](/workers/wrangler/commands/#dev) to run your Worker and D1 locally and debug issues before deploying. diff --git a/src/content/docs/d1/tutorials/d1-and-prisma-orm/index.mdx b/src/content/docs/d1/tutorials/d1-and-prisma-orm/index.mdx index ee4ae04e79b439d..7542b338e88da60 100644 --- a/src/content/docs/d1/tutorials/d1-and-prisma-orm/index.mdx +++ b/src/content/docs/d1/tutorials/d1-and-prisma-orm/index.mdx @@ -193,8 +193,8 @@ CREATE UNIQUE INDEX "User_email_key" ON "User"("email"); You now need to use the `wrangler d1 migrations apply` command to send this SQL statement to D1. This command accepts two options: -- `--local`: Executes the statement against a _local_ version of D1. This local version of D1 is a SQLite database file that will be located in the `.wrangler/state` directory of your project. Use this approach when you want to develop and test your Worker on your local machine. Refer to [Local development](/d1/build-with-d1/local-development/) to learn more. -- `--remote`: Executes the statement against your _remote_ version of D1. This version is used by your _deployed_ Cloudflare Workers. Refer to [Remote development](/d1/build-with-d1/remote-development/) to learn more. +- `--local`: Executes the statement against a _local_ version of D1. This local version of D1 is a SQLite database file that will be located in the `.wrangler/state` directory of your project. Use this approach when you want to develop and test your Worker on your local machine. Refer to [Local development](/d1/best-practices/local-development/) to learn more. +- `--remote`: Executes the statement against your _remote_ version of D1. This version is used by your _deployed_ Cloudflare Workers. Refer to [Remote development](/d1/best-practices/remote-development/) to learn more. In this tutorial, you will do local and remote development. You will test the Worker locally and deploy your Worker afterwards. Open your terminal, and run both commands: diff --git a/src/content/docs/d1/tutorials/import-to-d1-with-rest-api/index.mdx b/src/content/docs/d1/tutorials/import-to-d1-with-rest-api/index.mdx index 89ae6ae0380f810..94bf5c06c6c46fd 100644 --- a/src/content/docs/d1/tutorials/import-to-d1-with-rest-api/index.mdx +++ b/src/content/docs/d1/tutorials/import-to-d1-with-rest-api/index.mdx @@ -449,7 +449,7 @@ In the previous steps, you have created functions to execute various processes i You will now see your target D1 table populated with the example data. :::note -If you encounter the `statement too long` error, you would need to break your SQL command into smaller chunks and upload them in batches. You can learn more about this error in the [D1 documentation](/d1/build-with-d1/import-export-data/#resolve-statement-too-long-error). +If you encounter the `statement too long` error, you would need to break your SQL command into smaller chunks and upload them in batches. You can learn more about this error in the [D1 documentation](/d1/best-practices/import-export-data/#resolve-statement-too-long-error). 
::: ## Summary diff --git a/src/content/docs/pages/functions/bindings.mdx b/src/content/docs/pages/functions/bindings.mdx index 467bcbdcc6d36a4..1163789ba7f1502 100644 --- a/src/content/docs/pages/functions/bindings.mdx +++ b/src/content/docs/pages/functions/bindings.mdx @@ -265,7 +265,7 @@ You can interact with your D1 database bindings locally in one of two ways: - Configure your Pages project's `wrangler.toml` file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1). - Pass arguments to `wrangler pages dev` directly. -To interact with a D1 database via the Wrangler CLI while [developing locally](/d1/build-with-d1/local-development/#develop-locally-with-pages), add `--d1 =` to the `wrangler pages dev` command. +To interact with a D1 database via the Wrangler CLI while [developing locally](/d1/best-practices/local-development/#develop-locally-with-pages), add `--d1 =` to the `wrangler pages dev` command. If your D1 database is bound to your Pages Function via the `NORTHWIND_DB` binding and the `database_id` in your `wrangler.toml` file is `xxxx-xxxx-xxxx-xxxx-xxxx`, access this database in local development by running: diff --git a/src/content/docs/workers/testing/local-development.mdx b/src/content/docs/workers/testing/local-development.mdx index e19bf6719c02c7e..4341374b7326b62 100644 --- a/src/content/docs/workers/testing/local-development.mdx +++ b/src/content/docs/workers/testing/local-development.mdx @@ -135,5 +135,5 @@ There is a bug associated with how outgoing requests are handled when using `wra ## Related resources -- [D1 local development](/d1/build-with-d1/local-development/) - The official D1 guide to local development and testing. +- [D1 local development](/d1/best-practices/local-development/) - The official D1 guide to local development and testing. - [DevTools](/workers/observability/dev-tools) - Guides to using DevTools to debug your Worker locally. diff --git a/src/content/partials/d1/use-pragma-statements.mdx b/src/content/partials/d1/use-pragma-statements.mdx index 1a522c79ab01129..491b804cac779aa 100644 --- a/src/content/partials/d1/use-pragma-statements.mdx +++ b/src/content/partials/d1/use-pragma-statements.mdx @@ -391,7 +391,7 @@ Toggles the foreign key constraint enforcement. When `PRAGMA foreign_keys` is se ### `PRAGMA defer_foreign_keys = (on|off)` -Allows you to defer the enforcement of [foreign key constraints](/d1/build-with-d1/foreign-keys/) until the end of the current transaction. This can be useful during [database migrations](/d1/reference/migrations/), as schema changes may temporarily violate constraints depending on the order in which they are applied. +Allows you to defer the enforcement of [foreign key constraints](/d1/best-practices/foreign-keys/) until the end of the current transaction. This can be useful during [database migrations](/d1/reference/migrations/), as schema changes may temporarily violate constraints depending on the order in which they are applied. This does not disable foreign key enforcement outside of the current transaction. If you have not resolved outstanding foreign key violations at the end of your transaction, it will fail with a `FOREIGN KEY constraint failed` error. @@ -410,4 +410,4 @@ ALTER TABLE users ... PRAGMA defer_foreign_keys = off ``` -Refer to the [foreign key documentation](/d1/build-with-d1/foreign-keys/) to learn more about how to work with foreign keys. +Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys. 
diff --git a/src/content/partials/durable-objects/durable-objects-vs-d1.mdx b/src/content/partials/durable-objects/durable-objects-vs-d1.mdx index 5867f5245526eef..b0913b72f52f6c5 100644 --- a/src/content/partials/durable-objects/durable-objects-vs-d1.mdx +++ b/src/content/partials/durable-objects/durable-objects-vs-d1.mdx @@ -6,11 +6,11 @@ Cloudflare Workers offers a SQLite-backed serverless database product - [D1](/d1/). How should you compare [SQLite in Durable Objects](/durable-objects/best-practices/access-durable-objects-storage/#sql-storage) and D1? -**D1 is a managed database product.** +**D1 is a managed database product.** -D1 fits into a familiar architecture for developers, where application servers communicate with a database over the network. Application servers are typically Workers; however, D1 also supports external, non-Worker access via an [HTTP API](https://developers.cloudflare.com/api/operations/cloudflare-d1-query-database), which helps unlock [third-party tooling](/d1/reference/community-projects/#_top) support for D1. +D1 fits into a familiar architecture for developers, where application servers communicate with a database over the network. Application servers are typically Workers; however, D1 also supports external, non-Worker access via an [HTTP API](https://developers.cloudflare.com/api/operations/cloudflare-d1-query-database), which helps unlock [third-party tooling](/d1/reference/community-projects/#_top) support for D1. -D1 aims for a "batteries included" feature set, including the above HTTP API, [database schema management](/d1/reference/migrations/#_top), [data import/export](/d1/build-with-d1/import-export-data/), and [database query insights](/d1/observability/metrics-analytics/#query-insights). +D1 aims for a "batteries included" feature set, including the above HTTP API, [database schema management](/d1/reference/migrations/#_top), [data import/export](/d1/best-practices/import-export-data/), and [database query insights](/d1/observability/metrics-analytics/#query-insights). With D1, your application code and SQL database queries are not colocated which can impact application performance. If performance is a concern with D1, Workers has [Smart Placement](/workers/configuration/smart-placement/#_top) to dynamically run your Worker in the best location to reduce total Worker request latency, considering everything your Worker talks to, including D1. @@ -19,7 +19,7 @@ With D1, your application code and SQL database queries are not colocated which By design, Durable Objects are accessed with Workers-only. Durable Objects require a bit more effort, but in return, give you more flexibility and control. With Durable Objects, you must implement two pieces of code that run in different places: a front-end Worker which routes incoming requests from the Internet to a unique Durable Object, and the Durable Object itself, which runs on the same machine as the SQLite database. You get to choose what runs where, and it may be that your application benefits from running some application business logic right next to the database. - + With SQLite in Durable Objects, you may also need to build some of your own database tooling that comes out-of-the-box with D1. SQL query pricing and limits are intended to be identical between D1 ([pricing](/d1/platform/pricing/), [limits](/d1/platform/limits/)) and SQLite in Durable Objects ([pricing](/durable-objects/platform/pricing/#sql-storage-billing), [limits](/durable-objects/platform/limits/)). 
During SQLite in Durable Objects beta, Storage per Durable Object is 1GB, which will be raised to mirror storage per D1 database (10GB) by general availability. \ No newline at end of file diff --git a/src/content/partials/workers/d1-pricing.mdx b/src/content/partials/workers/d1-pricing.mdx index e7c735a68a57ba7..564b86f14490f29 100644 --- a/src/content/partials/workers/d1-pricing.mdx +++ b/src/content/partials/workers/d1-pricing.mdx @@ -14,11 +14,11 @@ To accurately track your usage, use the [meta object](/d1/worker-api/return-obje ### Definitions -1. Rows read measure how many rows a query reads (scans), regardless of the size of each row. For example, if you have a table with 5000 rows and run a `SELECT * FROM table` as a full table scan, this would count as 5,000 rows read. A query that filters on an [unindexed column](/d1/build-with-d1/use-indexes/) may return fewer rows to your Worker, but is still required to read (scan) more rows to determine which subset to return. +1. Rows read measure how many rows a query reads (scans), regardless of the size of each row. For example, if you have a table with 5000 rows and run a `SELECT * FROM table` as a full table scan, this would count as 5,000 rows read. A query that filters on an [unindexed column](/d1/best-practices/use-indexes/) may return fewer rows to your Worker, but is still required to read (scan) more rows to determine which subset to return. 2. Rows written measure how many rows were written to D1 database. Write operations include `INSERT`, `UPDATE`, and `DELETE`. Each of these operations contribute towards rows written. A query that `INSERT` 10 rows into a `users` table would count as 10 rows written. 3. DDL operations (for example, `CREATE`, `ALTER`, and `DROP`) are used to define or modify the structure of a database. They may contribute to a mix of read rows and write rows. Ensure you are accurately tracking your usage through the available tools ([meta object](/d1/worker-api/return-object/), [GraphQL Analytics API](/d1/observability/metrics-analytics/#query-via-the-graphql-api), or the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/workers/d1/)). 4. Row size or the number of columns in a row does not impact how rows are counted. A row that is 1 KB and a row that is 100 KB both count as one row. -5. Defining [indexes](/d1/build-with-d1/use-indexes/) on your table(s) reduces the number of rows read by a query when filtering on that indexed field. For example, if the `users` table has an index on a timestamp column `created_at`, the query `SELECT * FROM users WHERE created_at > ?1` would only need to read a subset of the table. +5. Defining [indexes](/d1/best-practices/use-indexes/) on your table(s) reduces the number of rows read by a query when filtering on that indexed field. For example, if the `users` table has an index on a timestamp column `created_at`, the query `SELECT * FROM users WHERE created_at > ?1` would only need to read a subset of the table. 6. Indexes will add an additional written row when writes include the indexed column, as there are two rows written: one to the table itself, and one to the index. The performance benefit of an index and reduction in rows read will, in nearly all cases, offset this additional write. 7. Storage is based on gigabytes stored per month, and is based on the sum of all databases in your account. Tables and indexes both count towards storage consumed. 8. Free limits reset daily at 00:00 UTC. 
Monthly included limits reset based on your monthly subscription renewal date, which is determined by the day you first subscribed. From 8bf038df12ff902c214c11edf7cfbe77247f0f37 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 9 Dec 2024 10:51:18 +0000 Subject: [PATCH 18/23] Moving foreign keys and query json into SQL API. --- public/_redirects | 8 ++++---- src/content/changelogs/d1.yaml | 4 ++-- src/content/docs/d1/reference/generated-columns.mdx | 4 ++-- .../docs/d1/{best-practices => sql-api}/foreign-keys.mdx | 0 .../docs/d1/{best-practices => sql-api}/query-json.mdx | 0 src/content/docs/d1/sql-api/sql-statements.mdx | 2 +- 6 files changed, 9 insertions(+), 9 deletions(-) rename src/content/docs/d1/{best-practices => sql-api}/foreign-keys.mdx (100%) rename src/content/docs/d1/{best-practices => sql-api}/query-json.mdx (100%) diff --git a/public/_redirects b/public/_redirects index 8057fc0b3446ded..cd821e93479a1aa 100644 --- a/public/_redirects +++ b/public/_redirects @@ -267,12 +267,12 @@ /d1/build-with-d1/remote-development/ /d1/best-practices/remote-development/ 301 /d1/build-with-d1/local-development/ /d1/best-practices/local-development/ 301 /d1/build-with-d1/foreign-keys/ /d1/best-practices/foreign-keys/ 301 -/d1/build-with-d1/query-json/ /d1/best-practices/query-json/ 301 +/d1/build-with-d1/query-json/ /d1/sql-api/query-json/ 301 /d1/build-with-d1/use-d1-from-pages/ /d1/best-practices/use-d1-from-pages/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 /d1/learning/debug-d1/ /d1/observability/debug-d1/ 301 /d1/learning/using-indexes/ /d1/best-practices/use-indexes/ 301 -/d1/learning/querying-json/ /d1/best-practices/query-json/ 301 +/d1/learning/querying-json/ /d1/sql-api/query-json/ 301 /d1/learning/importing-data/ /d1/best-practices/import-export-data/ 301 /d1/learning/generated-columns/ /d1/reference/generated-columns/ 301 /d1/learning/local-development/ /d1/best-practices/local-development/ 301 @@ -293,7 +293,7 @@ /d1/how-to/ /d1/best-practices/ 301 /d1/how-to/query-databases/ /d1/best-practices/query-d1/ 301 /d1/how-to/using-indexes/ /d1/best-practices/use-indexes/ 301 -/d1/how-to/querying-json/ /d1/best-practices/query-json/ 301 +/d1/how-to/querying-json/ /d1/sql-api/query-json/ 301 /d1/how-to/importing-data/ /d1/best-practices/import-export-data/ 301 /d1/how-to/generated-columns/ /d1/reference/generated-columns/ 301 /d1/build-databases/ /d1/best-practices/ 301 @@ -301,7 +301,7 @@ /d1/build-databases/use-indexes/ /d1/best-practices/use-indexes/ 301 /d1/build-databases/import-data/ /d1/best-practices/import-export-data/ 301 /d1/build-databases/client-api/ /d1/worker-api/ 301 -/d1/reference/query-json/ /d1/best-practices/query-json/ 301 +/d1/reference/query-json/ /d1/sql-api/query-json/ 301 /d1/configuration/local-development/ /d1/best-practices/local-development/ 301 /d1/configuration/remote-development/ /d1/best-practices/remote-development/ 301 /d1/reference/database-commands/ /d1/reference/sql-statements/ 301 diff --git a/src/content/changelogs/d1.yaml b/src/content/changelogs/d1.yaml index 6d1abe70d7d21e9..f197aac7caaac02 100644 --- a/src/content/changelogs/d1.yaml +++ b/src/content/changelogs/d1.yaml @@ -195,7 +195,7 @@ entries: New documentation has been published on how to use D1's support for [generated columns](/d1/reference/generated-columns/) to define columns that are dynamically generated on write (or read). 
Generated columns allow you to extract - data from [JSON objects](/d1/best-practices/query-json/) or use the output of other + data from [JSON objects](/d1/sql-api/query-json/) or use the output of other SQL functions. - publish_date: "2023-06-12" title: Deprecating Error.cause @@ -223,7 +223,7 @@ entries: - publish_date: "2023-05-17" title: Query JSON description: - "[New documentation](/d1/best-practices/query-json/) has been published + "[New documentation](/d1/sql-api/query-json/) has been published that covers D1's extensive JSON function support. JSON functions allow you to parse, query and modify JSON directly from your SQL queries, reducing the number of round trips to your database, or data queried." diff --git a/src/content/docs/d1/reference/generated-columns.mdx b/src/content/docs/d1/reference/generated-columns.mdx index c1ccfbec64c9a07..6ad0527ab311332 100644 --- a/src/content/docs/d1/reference/generated-columns.mdx +++ b/src/content/docs/d1/reference/generated-columns.mdx @@ -6,7 +6,7 @@ sidebar: --- -D1 allows you to define generated columns based on the values of one or more other columns, SQL functions, or even [extracted JSON values](/d1/best-practices/query-json/). +D1 allows you to define generated columns based on the values of one or more other columns, SQL functions, or even [extracted JSON values](/d1/sql-api/query-json/). This allows you to normalize your data as you write to it or read it from a table, making it easier to query and reducing the need for complex application logic. @@ -48,7 +48,7 @@ As a concrete example, to automatically extract the `location` value from the fo } ``` -To define a generated column with the value of `$.measurement.location`, you can use the [`json_extract`](/d1/best-practices/query-json/#extract-values) function to extract the value from the `raw_data` column each time you write to that row: +To define a generated column with the value of `$.measurement.location`, you can use the [`json_extract`](/d1/sql-api/query-json/#extract-values) function to extract the value from the `raw_data` column each time you write to that row: ```sql CREATE TABLE sensor_readings ( diff --git a/src/content/docs/d1/best-practices/foreign-keys.mdx b/src/content/docs/d1/sql-api/foreign-keys.mdx similarity index 100% rename from src/content/docs/d1/best-practices/foreign-keys.mdx rename to src/content/docs/d1/sql-api/foreign-keys.mdx diff --git a/src/content/docs/d1/best-practices/query-json.mdx b/src/content/docs/d1/sql-api/query-json.mdx similarity index 100% rename from src/content/docs/d1/best-practices/query-json.mdx rename to src/content/docs/d1/sql-api/query-json.mdx diff --git a/src/content/docs/d1/sql-api/sql-statements.mdx b/src/content/docs/d1/sql-api/sql-statements.mdx index 880f89dbaa1c365..4419395b5e7cde8 100644 --- a/src/content/docs/d1/sql-api/sql-statements.mdx +++ b/src/content/docs/d1/sql-api/sql-statements.mdx @@ -74,5 +74,5 @@ results: [...] ## Related resources - Learn [how to create indexes](/d1/best-practices/use-indexes/#list-indexes) in D1. -- Use D1's [JSON functions](/d1/best-practices/query-json/) to query JSON data. +- Use D1's [JSON functions](/d1/sql-api/query-json/) to query JSON data. - Use [`wrangler dev`](/workers/wrangler/commands/#dev) to run your Worker and D1 locally and debug issues before deploying. From baa60a695a601511eba2ad6134b258267e0bb044 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 9 Dec 2024 10:52:16 +0000 Subject: [PATCH 19/23] Moving foreign keys and query json into sql api. 
--- public/_redirects | 2 +- src/content/docs/d1/best-practices/import-export-data.mdx | 4 ++-- src/content/docs/d1/reference/migrations.mdx | 4 ++-- src/content/partials/d1/use-pragma-statements.mdx | 4 ++-- 4 files changed, 7 insertions(+), 7 deletions(-) diff --git a/public/_redirects b/public/_redirects index cd821e93479a1aa..a889f52c3efa667 100644 --- a/public/_redirects +++ b/public/_redirects @@ -266,7 +266,7 @@ /d1/build-with-d1/use-indexes/ /d1/best-practices/use-indexes/ 301 /d1/build-with-d1/remote-development/ /d1/best-practices/remote-development/ 301 /d1/build-with-d1/local-development/ /d1/best-practices/local-development/ 301 -/d1/build-with-d1/foreign-keys/ /d1/best-practices/foreign-keys/ 301 +/d1/build-with-d1/foreign-keys/ /d1/sql-api/foreign-keys/ 301 /d1/build-with-d1/query-json/ /d1/sql-api/query-json/ 301 /d1/build-with-d1/use-d1-from-pages/ /d1/best-practices/use-d1-from-pages/ 301 /d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301 diff --git a/src/content/docs/d1/best-practices/import-export-data.mdx b/src/content/docs/d1/best-practices/import-export-data.mdx index 4c360db83e90ba1..dd09721bb806f6c 100644 --- a/src/content/docs/d1/best-practices/import-export-data.mdx +++ b/src/content/docs/d1/best-practices/import-export-data.mdx @@ -194,9 +194,9 @@ VALUES ## Foreign key constraints -When importing data, you may need to temporarily disable [foreign key constraints](/d1/best-practices/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. +When importing data, you may need to temporarily disable [foreign key constraints](/d1/sql-api/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. -Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys and D1. +Refer to the [foreign key documentation](/d1/sql-api/foreign-keys/) to learn more about how to work with foreign keys and D1. ## Next Steps diff --git a/src/content/docs/d1/reference/migrations.mdx b/src/content/docs/d1/reference/migrations.mdx index 07ef45c63bba263..bcbca87450c549c 100644 --- a/src/content/docs/d1/reference/migrations.mdx +++ b/src/content/docs/d1/reference/migrations.mdx @@ -42,6 +42,6 @@ migrations_dir = "" # Specify your custom migration directory ## Foreign key constraints -When applying a migration, you may need to temporarily disable [foreign key constraints](/d1/best-practices/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. +When applying a migration, you may need to temporarily disable [foreign key constraints](/d1/sql-api/foreign-keys/). To do so, call `PRAGMA defer_foreign_keys = true` before making changes that would violate foreign keys. -Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys and D1. +Refer to the [foreign key documentation](/d1/sql-api/foreign-keys/) to learn more about how to work with foreign keys and D1. diff --git a/src/content/partials/d1/use-pragma-statements.mdx b/src/content/partials/d1/use-pragma-statements.mdx index 491b804cac779aa..9870d815c42d63b 100644 --- a/src/content/partials/d1/use-pragma-statements.mdx +++ b/src/content/partials/d1/use-pragma-statements.mdx @@ -391,7 +391,7 @@ Toggles the foreign key constraint enforcement. 
When `PRAGMA foreign_keys` is se ### `PRAGMA defer_foreign_keys = (on|off)` -Allows you to defer the enforcement of [foreign key constraints](/d1/best-practices/foreign-keys/) until the end of the current transaction. This can be useful during [database migrations](/d1/reference/migrations/), as schema changes may temporarily violate constraints depending on the order in which they are applied. +Allows you to defer the enforcement of [foreign key constraints](/d1/sql-api/foreign-keys/) until the end of the current transaction. This can be useful during [database migrations](/d1/reference/migrations/), as schema changes may temporarily violate constraints depending on the order in which they are applied. This does not disable foreign key enforcement outside of the current transaction. If you have not resolved outstanding foreign key violations at the end of your transaction, it will fail with a `FOREIGN KEY constraint failed` error. @@ -410,4 +410,4 @@ ALTER TABLE users ... PRAGMA defer_foreign_keys = off ``` -Refer to the [foreign key documentation](/d1/best-practices/foreign-keys/) to learn more about how to work with foreign keys. +Refer to the [foreign key documentation](/d1/sql-api/foreign-keys/) to learn more about how to work with foreign keys. From ec24c8d95c66ac8b9933cd233fb671a6be9b63bb Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 9 Dec 2024 14:50:50 +0000 Subject: [PATCH 20/23] Removing "D1" from REST and Wrangler sidebar. --- src/content/docs/d1/d1-api.mdx | 2 +- src/content/docs/d1/wrangler-commands.mdx | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/src/content/docs/d1/d1-api.mdx b/src/content/docs/d1/d1-api.mdx index 8663e4653b448aa..065bc842cf8cd18 100644 --- a/src/content/docs/d1/d1-api.mdx +++ b/src/content/docs/d1/d1-api.mdx @@ -1,6 +1,6 @@ --- pcx_content_type: navigation -title: D1 REST API +title: REST API external_link: /api/operations/cloudflare-d1-create-database sidebar: order: 6 diff --git a/src/content/docs/d1/wrangler-commands.mdx b/src/content/docs/d1/wrangler-commands.mdx index c44a67040b1b1f9..08f3bf8b8978c1e 100644 --- a/src/content/docs/d1/wrangler-commands.mdx +++ b/src/content/docs/d1/wrangler-commands.mdx @@ -1,6 +1,6 @@ --- pcx_content_type: concept -title: D1 Wrangler commands +title: Wrangler commands sidebar: order: 6 From 05a130bd0c2a7e316663e7b0373bae00c73afcb1 Mon Sep 17 00:00:00 2001 From: Jun Lee Date: Mon, 9 Dec 2024 16:56:26 +0000 Subject: [PATCH 21/23] Introducing query JSON and foreign keys in Query databases chapter. --- .../docs/d1/best-practices/query-d1.mdx | 66 ++++++++++++++----- 1 file changed, 48 insertions(+), 18 deletions(-) diff --git a/src/content/docs/d1/best-practices/query-d1.mdx b/src/content/docs/d1/best-practices/query-d1.mdx index fd021ebe6495084..791730c9006ce16 100644 --- a/src/content/docs/d1/best-practices/query-d1.mdx +++ b/src/content/docs/d1/best-practices/query-d1.mdx @@ -12,6 +12,10 @@ There are a number of ways you can interact with a D1 database: 3. Using [D1 REST API](/api/operations/cloudflare-d1-create-database). 4. Using [D1 Wrangler commands](/d1/wrangler-commands/). +:::note +D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine. +::: + ## Query D1 with Workers Binding API Workers Binding API primarily interacts with the data plane, and allows you to query your D1 database from your Worker. @@ -47,29 +51,55 @@ Refer to [Workers Binding API](/d1/worker-api/) for more information. 
 ## Query D1 with SQL API
 
-D1 is compatible with most SQLite's SQL convention since it leverages SQLite's query engine.
+D1 understands SQLite semantics, which allows you to query a database using SQL statements via Workers Binding API or REST API (including Wrangler commands). Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL statements.
 
-```sh
-npx wrangler d1 execute prod-d1-tutorial --local --command="SELECT * FROM Customers"
+### Use foreign key relationships
+
+When using SQL with D1, you may wish to define and enforce foreign key constraints across tables in a database. Foreign key constraints allow you to enforce relationships across tables, or prevent you from deleting rows that reference rows in other tables. An example of a foreign key relationship is shown below.
+
+```sql
+CREATE TABLE users (
+    user_id INTEGER PRIMARY KEY,
+    email_address TEXT,
+    name TEXT,
+    metadata TEXT
+);
+
+CREATE TABLE orders (
+    order_id INTEGER PRIMARY KEY,
+    status INTEGER,
+    item_desc TEXT,
+    shipped_date INTEGER,
+    user_who_ordered INTEGER,
+    FOREIGN KEY(user_who_ordered) REFERENCES users(user_id)
+);
 ```
 
-```sh output
-🌀 Mapping SQL input into an array of statements
-🌀 Executing on local database production-db-backend (database-id) from .wrangler/state/v3/d1:
-┌────────────┬─────────────────────┬───────────────────┐
-│ CustomerId │ CompanyName │ ContactName │
-├────────────┼─────────────────────┼───────────────────┤
-│ 1 │ Alfreds Futterkiste │ Maria Anders │
-├────────────┼─────────────────────┼───────────────────┤
-│ 4 │ Around the Horn │ Thomas Hardy │
-├────────────┼─────────────────────┼───────────────────┤
-│ 11 │ Bs Beverages │ Victoria Ashworth │
-├────────────┼─────────────────────┼───────────────────┤
-│ 13 │ Bs Beverages │ Random Name │
-└────────────┴─────────────────────┴───────────────────┘
+Refer to [Define foreign keys](/d1/sql-api/foreign-keys/) for more information.
+
+### Query JSON
+
+D1 allows you to query and parse JSON data stored within a database. For example, you can extract a value inside a JSON object.
+
+Given the following JSON object (`type:blob`) in a column named `sensor_reading`, you can extract values from it directly.
+
+```json
+{
+    "measurement": {
+        "temp_f": "77.4",
+        "aqi": [21, 42, 58],
+        "o3": [18, 500],
+        "wind_mph": "13",
+        "location": "US-NY"
+    }
+}
+```
+```sql
+-- Extract the temperature value
+SELECT json_extract(sensor_reading, '$.measurement.temp_f') -- returns "77.4" as TEXT
 ```
 
-Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL queries.
+Refer to [Query JSON](/d1/sql-api/query-json/) to learn more about querying JSON objects.
 
 ## Query D1 with REST API
 
From 55a938571496f9d371dce9dc520e4f13e67c9616 Mon Sep 17 00:00:00 2001
From: Jun Lee
Date: Mon, 9 Dec 2024 17:18:29 +0000
Subject: [PATCH 22/23] Restructuring "Query database", changing the example of

---
 .../docs/d1/best-practices/query-d1.mdx | 102 +++++++++---------
 1 file changed, 52 insertions(+), 50 deletions(-)

diff --git a/src/content/docs/d1/best-practices/query-d1.mdx b/src/content/docs/d1/best-practices/query-d1.mdx
index 791730c9006ce16..80902941aee4f99 100644
--- a/src/content/docs/d1/best-practices/query-d1.mdx
+++ b/src/content/docs/d1/best-practices/query-d1.mdx
@@ -5,51 +5,15 @@ sidebar:
   order: 1
 ---
 
+D1 is compatible with most of SQLite's SQL conventions since it leverages SQLite's query engine. You can use SQL commands to query D1.
+
 There are a number of ways you can interact with a D1 database:
 
 1.
-2. Using [SQL API](/d1/sql-api/sql-statements/).
-3. Using [D1 REST API](/api/operations/cloudflare-d1-create-database).
-4. Using [D1 Wrangler commands](/d1/wrangler-commands/).
-
-:::note
-D1 is compatible with most of SQLite's SQL conventions since it leverages SQLite's query engine.
-:::
-
-## Query D1 with Workers Binding API
-
-Workers Binding API primarily interacts with the data plane, and allows you to query your D1 database from your Worker.
-
-This requires you to:
-
-1. Bind your D1 database to your Worker.
-2. Prepare a statement.
-3. Run the statement.
-
-```js title="index.js"
-export default {
-  async fetch(request, env) {
-    const {pathname} = new URL(request.url);
-    const companyName1 = `Bs Beverages`;
-    const companyName2 = `Around the Horn`;
-    const stmt = env.DB.prepare(`SELECT * FROM Customers WHERE CompanyName = ?`);
-
-    if (pathname === `/RUN`) {
-      const returnValue = await stmt.bind(companyName1).run();
-      return Response.json(returnValue);
-    }
-
-    return new Response(
-      `Welcome to the D1 API Playground!
-      \nChange the URL to test the various methods inside your index.js file.`,
-    );
-  },
-};
-```
-
-Refer to [Workers Binding API](/d1/worker-api/) for more information.
+2. Using [D1 REST API](/api/operations/cloudflare-d1-create-database).
+3. Using [D1 Wrangler commands](/d1/wrangler-commands/).
 
-## Query D1 with SQL API
+## Use SQL to query D1
 
 D1 understands SQLite semantics, which allows you to query a database using SQL statements via the Workers Binding API or REST API (including Wrangler commands). Refer to [D1 SQL API](/d1/sql-api/sql-statements/) to learn more about supported SQL statements.
 
@@ -101,6 +65,39 @@ SELECT json_extract(sensor_reading, '$.measurement.temp_f') -- returns "77.4" as
 
 Refer to [Query JSON](/d1/sql-api/query-json/) to learn more about querying JSON objects.
 
+## Query D1 with Workers Binding API
+
+Workers Binding API primarily interacts with the data plane, and allows you to query your D1 database from your Worker.
+
+This requires you to:
+
+1. Bind your D1 database to your Worker.
+2. Prepare a statement.
+3. Run the statement.
+
+```js title="index.js"
+export default {
+  async fetch(request, env) {
+    const {pathname} = new URL(request.url);
+    const companyName1 = `Bs Beverages`;
+    const companyName2 = `Around the Horn`;
+    const stmt = env.DB.prepare(`SELECT * FROM Customers WHERE CompanyName = ?`);
+
+    if (pathname === `/RUN`) {
+      const returnValue = await stmt.bind(companyName1).run();
+      return Response.json(returnValue);
+    }
+
+    return new Response(
+      `Welcome to the D1 API Playground!
+      \nChange the URL to test the various methods inside your index.js file.`,
+    );
+  },
+};
+```
+
+Refer to [Workers Binding API](/d1/worker-api/) for more information.
+
 ## Query D1 with REST API
 
 REST API primarily interacts with the control plane, and allows you to create/manage your D1 database.
 
 Refer to [D1 REST API](/api/operations/cloudflare-d1-create-database) for D1 REST API documentation.
 
 ## Query D1 with Wrangler commands
 
 You can use Wrangler commands to interact with the control plane. Note that Wrangler commands use REST APIs to perform its operations.
 
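The Workers Binding API section above assumes the database is already exposed to the Worker as `env.DB`. A minimal sketch of that binding in `wrangler.toml`, with placeholder name and ID (the Wrangler command example continues below):

```toml
# Sketch only: the binding that makes the database available as env.DB.
[[d1_databases]]
binding = "DB"                         # available in your Worker on env.DB
database_name = "prod-d1-tutorial"     # placeholder name
database_id = "placeholder-database-id"
```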
 ```sh
-npx wrangler d1 create prod-d1-tutorial
+npx wrangler d1 execute prod-d1-tutorial --command="SELECT * FROM Customers"
 ```
 
 ```sh output
-
-✅ Successfully created DB 'prod-d1-tutorial'
-
-[[d1_databases]]
-binding = "DB" # available in your Worker on env.DB
-database_name = "prod-d1-tutorial"
-database_id = ""
+🌀 Mapping SQL input into an array of statements
+🌀 Executing on local database prod-d1-tutorial () from .wrangler/state/v3/d1:
+┌────────────┬─────────────────────┬───────────────────┐
+│ CustomerId │ CompanyName         │ ContactName       │
+├────────────┼─────────────────────┼───────────────────┤
+│ 1          │ Alfreds Futterkiste │ Maria Anders      │
+├────────────┼─────────────────────┼───────────────────┤
+│ 4          │ Around the Horn     │ Thomas Hardy      │
+├────────────┼─────────────────────┼───────────────────┤
+│ 11         │ Bs Beverages        │ Victoria Ashworth │
+├────────────┼─────────────────────┼───────────────────┤
+│ 13         │ Bs Beverages        │ Random Name       │
+└────────────┴─────────────────────┴───────────────────┘
 ```
\ No newline at end of file

From 66776f55a611a7bb9d01acec2a9321e76b8de45e Mon Sep 17 00:00:00 2001
From: Jun Lee
Date: Mon, 9 Dec 2024 17:19:18 +0000
Subject: [PATCH 23/23] Editing Wrangler example. Restructuring to better explain how SQL fits into D1.

---
 src/content/docs/d1/best-practices/query-d1.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/content/docs/d1/best-practices/query-d1.mdx b/src/content/docs/d1/best-practices/query-d1.mdx
index 80902941aee4f99..bdbfe9e0c6ad4b6 100644
--- a/src/content/docs/d1/best-practices/query-d1.mdx
+++ b/src/content/docs/d1/best-practices/query-d1.mdx
@@ -106,7 +106,7 @@ Refer to [D1 REST API](/api/operations/cloudflare-d1-create-database) for D1 RES
 
 ## Query D1 with Wrangler commands
 
-You can use Wrangler commands to interact with the control plane. Note that Wrangler commands use REST APIs to perform its operations.
+You can use Wrangler commands to query a D1 database. Note that Wrangler commands use REST APIs to perform their operations.
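The output in the Wrangler example above ("Executing on local database ... from .wrangler/state/v3/d1") comes from a local run. A sketch of the remote variant follows; the flag is an assumption to verify against `npx wrangler d1 execute --help`:

```sh
# Sketch only: run the same query against the deployed (remote) database
# rather than the local development copy.
npx wrangler d1 execute prod-d1-tutorial --remote --command="SELECT * FROM Customers"
```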