
Commit fd19f4b

docs: text deduplication (#7501)
1 parent 2d2d9a4 commit fd19f4b

File tree

1 file changed: +27 −44

packages/docs/README.md

Lines changed: 27 additions & 44 deletions
````diff
@@ -60,49 +60,6 @@ If you don't already have an account, then [create a Cloudflare account here](ht
 
 Within the projects "Settings" for "Build and deployments", the "Build command" should be `pnpm build`, and the "Build output directory" should be set to `dist`.
 
-## Algolia search
-
-STILL WIP
-
-resource: https://docsearch.algolia.com/
-
-### Crawler
-
-Setup in https://crawler.algolia.com/
-
-### Debug local site with crawler settings
-
-To crawl localhost site for testing index settings for content hierarchy. use this docker command
-
-```shell
-# create apiKey via https://www.algolia.com/account/api-keys
-touch .env
-# APPLICATION_ID=APPLICATION_ID
-# API_KEY=API_KEY
-docker run -it --rm --env-file=.env -e "CONFIG=$(cat ./packages/docs/algolia.json | jq -r tostring)" algolia/docsearch-scraper
-```
-
-see guide of [DocSearch-legacy docker command](https://docsearch.algolia.com/docs/legacy/run-your-own#run-the-crawl-from-the-docker-image)
-
-> In mac machine, docker container can access host's network, workaround is to use `host.docker.internal`
-## Cloudflare Pages
-
-Cloudflare's [wrangler](https://github.com/cloudflare/wrangler) CLI can be used to preview a production build locally. To start a local server, run:
-
-```
-pnpm serve
-```
-
-Then visit [http://localhost:8787/](http://localhost:8787/)
-
-### Deployments
-
-[Cloudflare Pages](https://pages.cloudflare.com/) are deployable through their [Git provider integrations](https://developers.cloudflare.com/pages/platform/git-integration/).
-
-If you don't already have an account, then [create a Cloudflare account here](https://dash.cloudflare.com/sign-up/pages). Next go to your dashboard and follow the [Cloudflare Pages deployment guide](https://developers.cloudflare.com/pages/framework-guides/deploy-anything/).
-
-Within the projects "Settings" for "Build and deployments", the "Build command" should be `pnpm build`, and the "Build output directory" should be set to `dist`.
-
 ### Function Invocation Routes
 
 Cloudflare Page's [function-invocation-routes config](https://developers.cloudflare.com/pages/platform/functions/routing/#functions-invocation-routes) can be used to include, or exclude, certain paths to be used by the worker functions. Having a `_routes.json` file gives developers more granular control over when your Function is invoked.
````
````diff
@@ -130,4 +87,30 @@ By default, the Cloudflare pages adaptor _does not_ include a `public/_routes.js
 
 In the above example, it's saying _all_ pages should be SSR'd. However, the root static files such as `/favicon.ico` and any static assets in `/build/*` should be excluded from the Functions, and instead treated as a static file.
 
-In most cases the generated `dist/_routes.json` file is ideal. However, if you need more granular control over each path, you can instead provide you're own `public/_routes.json` file. When the project provides its own `public/_routes.json` file, then the Cloudflare adaptor will not auto-generate the routes config and instead use the committed one within the `public` directory.
+In most cases the generated `dist/_routes.json` file is ideal. However, if you need more granular control over each path, you can instead provide your own `public/_routes.json` file. When the project provides its own `public/_routes.json` file, then the Cloudflare adaptor will not auto-generate the routes config and instead use the one committed within the `public` directory.
+
````
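To make the include/exclude behavior concrete, here is a sketch of what a hand-written `public/_routes.json` could look like for the scenario described above (all pages SSR'd, with `/favicon.ico` and anything under `/build/*` served as static files). The route patterns are illustrative; adapt them to your project's actual static assets.

```json
{
  "version": 1,
  "include": ["/*"],
  "exclude": ["/favicon.ico", "/build/*"]
}
```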
````diff
+## Algolia search
+
+STILL WIP
+
+resource: https://docsearch.algolia.com/
+
+### Crawler
+
+Setup in https://crawler.algolia.com/
+
+### Debug local site with crawler settings
+
+To crawl the localhost site and test index settings for content hierarchy, use this Docker command:
+
+```shell
+# create apiKey via https://www.algolia.com/account/api-keys
+touch .env
+# APPLICATION_ID=APPLICATION_ID
+# API_KEY=API_KEY
+docker run -it --rm --env-file=.env -e "CONFIG=$(cat ./packages/docs/algolia.json | jq -r tostring)" algolia/docsearch-scraper
+```
+
+See the [DocSearch-legacy Docker command guide](https://docsearch.algolia.com/docs/legacy/run-your-own#run-the-crawl-from-the-docker-image).
+
+> On macOS, Docker containers can't access the host's network directly; a workaround is to use `host.docker.internal`.
````
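As a sketch of the `.env` setup above: the file can be written in one step instead of `touch` plus manual editing. Both values below are placeholders, not real credentials, and the `docker run` line is repeated commented out because it needs Docker and a real Algolia API key.

```shell
# Sketch only: write the two variables the crawl command expects into .env.
# Both values are placeholders; create a real API key at
# https://www.algolia.com/account/api-keys
cat > .env <<'EOF'
APPLICATION_ID=YOUR_APPLICATION_ID
API_KEY=YOUR_API_KEY
EOF

# The crawl itself (requires Docker and real credentials, so not run here):
# docker run -it --rm --env-file=.env \
#   -e "CONFIG=$(cat ./packages/docs/algolia.json | jq -r tostring)" \
#   algolia/docsearch-scraper

cat .env
```

When targeting a local dev server from inside the container on macOS, point the crawler's start URLs at `http://host.docker.internal:<port>` rather than `localhost`, per the note above.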
