Commit e5ba442 ("various minor edits"), 1 parent: abf7d6f

File tree: 1 file changed (+3 lines, -3 lines)

docs/AI-for-security/knowledge-base.asciidoc

Lines changed: 3 additions & 3 deletions
@@ -143,7 +143,7 @@ Refer to the following video for an example of adding an index to Knowledge Base
 
 You can use an {es} connector or web crawler to create an index that contains data you want to add to Knowledge Base.
 
-This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {ref}/crawler.html[Elastic web crawler].
+This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {enterprise-search-ref}/crawler.html[Elastic web crawler].
 
 [discrete]
 ==== Use a web crawler to add threat intelligence to Knowledge Base
@@ -154,12 +154,12 @@ First, you'll need to set up a web crawler to add the desired data to an index,
 . Click **New web crawler**.
 .. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**.
 .. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**.
-. The previous step opens a page with the details of your new crawler. Go to its **Mappings** tab, then click **Add field**.
+. The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**.
 +
 NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field.
 .. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**.
 . Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule.
-. Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go the its **Crawl rules** tab and click **Add crawl rule**.
+. Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules].
 .. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**.
 .. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**.
 .. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up".
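The semantic text mapping step described in the diff above can also be sketched as an Elasticsearch index mapping. This is a rough illustration only, assuming the index name `threat_intelligence_feed_1` from the example; the field name `body_content` is hypothetical and would depend on your crawler's schema:

```json
{
  "mappings": {
    "properties": {
      "body_content": {
        "type": "semantic_text",
        "inference_id": "elastic-security-ai-assistant-elser2"
      }
    }
  }
}
```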

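The ordering of the two crawl rules in the steps above matters: rules are evaluated top to bottom and the first match decides, so the `Allow`/`Contains` rule must sit before the `Disallow`/`.*` catch-all. A minimal, hypothetical sketch of that evaluation logic (not the crawler's actual implementation):

```python
import re

# Hypothetical model of the two crawl rules from the example above.
# Rules are checked in order; the first rule that matches a path decides
# whether that path is crawled.
rules = [
    ("allow", lambda path: "threat-intelligence" in path),              # Contains rule
    ("disallow", lambda path: re.fullmatch(r".*", path) is not None),   # Regex catch-all
]

def is_crawled(path: str) -> bool:
    for policy, matches in rules:
        if matches(path):
            return policy == "allow"
    return True  # assumed default when no rule matches

print(is_crawled("/threat-intelligence/feed-1"))  # True: first rule matches
print(is_crawled("/blog/unrelated-post"))         # False: caught by the Disallow catch-all
```

Swapping the two rules would disallow everything, since the `.*` catch-all would match first.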