`packages/docs/README.md`
Within the project's "Settings" for "Build and deployments", the "Build command" should be `pnpm build`, and the "Build output directory" should be set to `dist`.
## Algolia search
STILL WIP
Resource: https://docsearch.algolia.com/
### Crawler
Set up the crawler at https://crawler.algolia.com/
### Debug local site with crawler settings
To crawl the localhost site and test index settings for the content hierarchy, use this Docker command:
```shell
# create an apiKey via https://www.algolia.com/account/api-keys
```

See the guide for the [DocSearch-legacy docker command](https://docsearch.algolia.com/docs/legacy/run-your-own#run-the-crawl-from-the-docker-image).
> On a Mac, the Docker container can't reach the host's network via `localhost`; the workaround is to use `host.docker.internal`.
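For illustration, the setup for such a crawl might be sketched as below. This is a hypothetical sketch: the credentials, index name, and port are placeholders, and the `docker run` invocation (shown commented out) follows the shape documented in the DocSearch-legacy guide linked above rather than a command verified against this repo.

```shell
# Placeholder Algolia credentials (create the API key in the Algolia dashboard):
cat > .env <<'EOF'
APPLICATION_ID=YOUR_APP_ID
API_KEY=YOUR_ADMIN_API_KEY
EOF

# Minimal crawler config pointing at the local docs site; from inside the
# container, the host is reachable as host.docker.internal (see note above):
cat > config.json <<'EOF'
{
  "index_name": "docs",
  "start_urls": ["http://host.docker.internal:3000/"]
}
EOF

# Sanity-check that the config parses before handing it to the crawler:
jq -r '.index_name' config.json

# Then run the crawl per the DocSearch-legacy guide (requires Docker):
#   docker run -it --env-file=.env \
#     -e "CONFIG=$(cat config.json | jq -r tostring)" \
#     algolia/docsearch-scraper
```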
## Cloudflare Pages
Cloudflare's [wrangler](https://github.com/cloudflare/wrangler) CLI can be used to preview a production build locally. To start a local server, run:
```shell
pnpm serve
```
Then visit [http://localhost:8787/](http://localhost:8787/)
### Deployments
[Cloudflare Pages](https://pages.cloudflare.com/) are deployable through their [Git provider integrations](https://developers.cloudflare.com/pages/platform/git-integration/).
If you don't already have an account, then [create a Cloudflare account here](https://dash.cloudflare.com/sign-up/pages). Next, go to your dashboard and follow the [Cloudflare Pages deployment guide](https://developers.cloudflare.com/pages/framework-guides/deploy-anything/).
### Function Invocation Routes
Cloudflare Pages' [function-invocation-routes config](https://developers.cloudflare.com/pages/platform/functions/routing/#functions-invocation-routes) can be used to include, or exclude, certain paths from the worker functions. A `_routes.json` file gives you more granular control over when your Functions are invoked.
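As an illustration of this config's shape (the paths here are examples, not necessarily this project's actual settings), a `_routes.json` might look like:

```json
{
  "version": 1,
  "include": ["/*"],
  "exclude": ["/build/*", "/favicon.ico"]
}
```

Paths matching `include` are routed through the worker Functions (SSR), while `exclude` paths are served directly as static assets.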
By default, the Cloudflare Pages adaptor _does not_ include a `public/_routes.json` config; instead, it auto-generates `dist/_routes.json` at build time.
In the above example, it's saying _all_ pages should be SSR'd. However, the root static files such as `/favicon.ico` and any static assets in `/build/*` should be excluded from the Functions, and instead be treated as static files.
In most cases the generated `dist/_routes.json` file is ideal. However, if you need more granular control over each path, you can instead provide your own `public/_routes.json` file. When the project provides its own `public/_routes.json` file, the Cloudflare adaptor will not auto-generate the routes config and will instead use the one committed within the `public` directory.