src/content/docs/agentic-ai/knowledge-integration/overview.mdx: 37 additions & 14 deletions
@@ -13,17 +13,27 @@ redirects:
This feature is currently provided as part of a preview program pursuant to our [pre-release policies](/docs/licenses/license-information/referenced-policies/new-relic-pre-release-policy).
</Callout>

To provide more context-specific answers, New Relic AI can use a technique called Retrieval Augmented Generation (RAG) through the New Relic AI knowledge connector. While New Relic AI's LLMs have vast general knowledge, RAG enhances their responses by retrieving relevant information from your external data sources.

By setting up the New Relic AI knowledge connector, you can expect tangible outcomes such as faster incident resolution, more accurate and context-aware AI responses, and reduced manual searching across multiple documents. This unified approach helps your team make better decisions and respond to issues more efficiently.

## How it works

<img
  title="High-level visual of Knowledge connector"
  alt="High-level visual of Knowledge connector"
  src="/images/High-Level Visual Overview of the New Relic AI Knowledge Connector.webp"
/>

Your content and knowledge bases, such as Confluence, are uploaded to the New Relic AI platform. The system follows a three-step process to provide a context-aware answer:

- **Index:** Once your knowledge bases are connected to the New Relic AI platform, the knowledge connector performs an initial indexing of your documents. You can configure this process to run on a recurring basis, ensuring that New Relic AI always has access to the most up-to-date information as your documents evolve.

- **Retrieval:** When you ask a question in New Relic AI, the system searches the indexed content for information relevant to your query. This step ensures that the context is pulled directly from your trusted, internal documentation.

- **Generation:** The system combines the retrieved information with the powerful generative capabilities of the underlying LLM. This synthesis produces a comprehensive and context-aware answer, grounded in your specific data and best practices.

If a query does not require your organizational knowledge, New Relic AI will generate an answer using the underlying LLM's vast general knowledge. In both cases, the goal is to provide you with the most relevant and accurate information possible.

## Key features

With the New Relic AI knowledge connector, you can:
@@ -34,30 +44,41 @@ With New Relic AI knowledge connector, you can:
- "What are the standard triage steps for this type of alert?"
- "Show me the runbook for a `database connection limit exceeded` error."

## Data security and privacy

<Callout variant="important">
At this time, all indexed documents can be retrieved by all users within your organization's New Relic account. Verify that only appropriate content is indexed, as there's currently no option to restrict access or redact information after indexing.
</Callout>

Before you begin indexing, ensure that:

* Only documents suitable for organization-wide access are indexed.
* Sensitive information is redacted.
* All the documents to be indexed comply with your organization's internal data security and privacy policies.

## Use cases and value

Here are some New Relic AI tools and a brief overview of how the knowledge connector integration adds value to each:

When a critical incident occurs, you face a time-consuming investigation because historical context is fragmented across multiple sources like Confluence and runbooks.

The knowledge connector integrates all of these disparate sources, transforming the Retro DocSearch tool into a powerful historical intelligence engine that provides all the context in one place. This allows the tool to:

* Provide an immediate summary of an original incident's root cause.
* Present specific actionable steps from the runbook that resolved the issue.
* Identify the names of the experts or teams who solved it before.

While the Dashboard Analysis tool provides key insights, its analysis is limited to the data received by the dashboard. It lacks crucial business context, making it difficult to understand the true impact of an issue.

The knowledge connector integration grounds the Dashboard Analysis tool in your company's unique knowledge, acting as a bridge between raw telemetry and your internal documentation. This allows the tool to:

* Explain the business impact using indexed documents that define key business metrics.
* Curate suggestions based on internal runbooks to provide context-specific actionable steps.

@@ -67,7 +88,9 @@ While the Dashboard Analysis tool provides key insights, its analysis is limited
This tool generates a basic synthetic monitor creation script, but it lacks an understanding of your organization's unique standards and proprietary information. This creates manual work to ensure compliance and inject authentication tokens.

The knowledge connector integration elevates the tool by giving it access to your company's internal documents. This allows the tool to:

* Automate standardization by applying your company’s rules to the generated script.
* Inject specific non-public information from an indexed runbook or wiki into the monitor's configuration.

src/content/docs/apis/nerdgraph/examples/nerdgraph-rag.mdx: 42 additions & 18 deletions

@@ -6,11 +6,21 @@ redirects:
freshnessValidatedDate: never
---

<Callout title="preview">
We're still working on this feature, but we'd love for you to try it out!

This feature is currently provided as part of a preview program pursuant to our [pre-release policies](/docs/licenses/license-information/referenced-policies/new-relic-pre-release-policy).
</Callout>

With New Relic, you can enhance New Relic AI agents with RAG. This means that you can associate documentation, runbooks, incident retros, and even source code with your services, giving New Relic AI better insight into issues with your system.

To learn more about the knowledge connector, refer to [New Relic AI Knowledge connector](/docs/agentic-ai/knowledge-integration/overview).

## Get started with RAG [#get-started]

<Collapser id="get-org-id" title="Obtain your organization ID">

Before you begin, you'll need your organization ID to use with the following mutations and queries.

```graphql
{
@@ -21,10 +31,12 @@ With New Relic, you can enhance New Relic AI agents with RAG. This means that yo
  }
}
```
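
For reference, a complete organization lookup can be as simple as the following sketch. This is an assumed shape using the standard `actor.organization` fields (`id`, `name`); the exact fields you request may differ from the query shown in this doc.

```graphql
{
  actor {
    organization {
      # The organization ID you'll pass to the RAG mutations and the Blob API below
      id
      name
    }
  }
}
```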

</Collapser>

## Configure your RAG tool

<CollapserGroup>
<Collapser id="create-rag-tool" title="Create a RAG tool">

The name and description of your RAG tool help the LLM understand when and how to use it. Provide a clear name and accurate description so that the LLM is more likely to select the right tool for a given prompt and provide relevant, context-aware responses.

```graphql
mutation {
@@ -41,14 +53,16 @@ mutation {
  }
}
```

</Collapser>

<Collapser id="upload-document" title="Upload a document to the Blob API">

<Callout variant="important">
All indexed documents are visible to all users within your organization. Make sure the documents you index comply with your internal policies for use of the services.
</Callout>

NerdGraph is optimized for structured data queries and mutations, not for the efficient transfer of files. To upload documents, use the Blob API.

Here's an example of how to upload a document using a `curl` command:

```shell
curl -X POST https://blob-api.one-service.newrelic.com/v1/e/organizations/$ORGANIZATION_ID/RagDocuments \
@@ -66,8 +80,10 @@ The response will look like this:
  "blobVersionEntity": null
}
```

</Collapser>

<Collapser id="view-the-rag-document-entity-represented-in-nerdgraph" title="View the RAG document entity represented in NerdGraph">

You can view the RAG document entity in NerdGraph.

```graphql
{
@@ -89,10 +105,9 @@ The response will look like this:
  }
}
```
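
If you just need the document's GUID, a generic entity search by name is one way to locate it. This is a sketch under assumptions: it presumes the uploaded document is discoverable through the standard `actor.entitySearch` API, and `runbook.md` is a hypothetical document name; the entity type and the exact query used by this doc may differ.

```graphql
{
  actor {
    # Generic entity search; the name filter below is a placeholder
    entitySearch(query: "name = 'runbook.md'") {
      results {
        entities {
          guid
          name
          entityType
        }
      }
    }
  }
}
```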

</Collapser>

<Collapser id="create-a-relationship-between-the-rag-document-and-the-rag-tool" title="Create a relationship between the RAG document and the RAG tool">

Now that you have a RAG tool and a RAG document, you need to associate them with each other. This is done via the Entity Management APIs in NerdGraph.

```graphql
mutation {
@@ -123,8 +138,15 @@ mutation {
  }
}
```

</Collapser>
</CollapserGroup>

## Verify your configuration

<CollapserGroup>
<Collapser id="query-to-see-relationships-between-rag-documents-and-rag-tools" title="Query to see relationships between RAG documents and RAG tools">

You can query relationships between RAG documents and RAG tools.

```graphql
{
@@ -145,10 +167,10 @@ mutation {
  }
}
```

</Collapser>

<Collapser id="query-the-rag-tool" title="Query the RAG tool">

You can query your RAG tool and receive chunked matches based on the documents indexed for a given tool. You may use New Relic AI to summarize the returned chunk matches, or you may use the NerdGraph APIs to retrieve the match and use your own AI on your own systems.

```graphql
{
@@ -168,3 +190,5 @@ You can query your RAG Tool and receive chunked matches based on the documents i