This feature is currently provided as part of a preview program pursuant to our [pre-release policies](/docs/licenses/license-information/referenced-policies/new-relic-pre-release-policy).
</Callout>

New Relic AI uses large language models (LLMs) and New Relic's data platform to help you understand your systems and glean insights about their performance. It allows you to ask questions, troubleshoot issues, and explore telemetry data using plain language.

To provide more context-specific answers, New Relic AI can use a technique called Retrieval Augmented Generation (RAG) through the New Relic AI knowledge connector. While New Relic AI's underlying LLMs have vast general knowledge, RAG enhances their responses by retrieving relevant information from your external data sources.

By setting up the New Relic AI knowledge connector, you can expect tangible outcomes such as faster incident resolution, more accurate and context-aware AI responses, and reduced manual searching across multiple documents. This unified approach helps your team make better decisions and respond to issues more efficiently.
## Use cases and value
The following examples show how the knowledge connector integration helps tackle challenges like fragmented documentation and slow incident response by surfacing relevant information.

When a critical incident occurs, you face a time-consuming investigation because historical context is fragmented across multiple sources like Confluence and runbooks.

The knowledge connector integrates all of these disparate sources, transforming the Retro DocSearch tool into a powerful historical intelligence engine that provides all the context in one place. This allows the tool to:
* Provide an immediate summary of the original incident's root cause.
* Present specific actionable steps from the runbook that resolved the issue.
* Identify the names of the experts or teams who solved it before.

When an IT issue recurs, you often have to spend valuable time figuring out whether it has happened before. Historical context is often scattered across separate retrospective or postmortem documents, which makes it difficult to quickly find the information you need to resolve a recurring incident.

By leveraging RAG, the New Relic AI platform stores information from your existing retrospective or postmortem documents for future reference, saving you valuable time during an incident. With RAG, the **What happened previously** widget allows you to:
* Quickly find relevant information and learn from previous incidents.
* Get a summary of similar past issues, along with links to the retrospective documents for detailed analysis.

</Collapser>
</CollapserGroup>
## Key features
With the New Relic AI knowledge connector, you can:
- "Show me the runbook for a `database connection limit exceeded` error."
36
56
37
57
<Calloutvariant="important">
At this time, all indexed documents can be retrieved by all users within your organization's New Relic account. Please verify that only appropriate content is indexed, as there's currently no option to restrict access or redact information after indexing.
</Callout>
## How it works
<img
  title="High-level visual of Knowledge connector"
  alt="High-level visual of Knowledge connector"
  src="/images/High-Level Visual Overview of the New Relic AI Knowledge Connector.webp"
/>

The knowledge connector securely integrates with your content and knowledge bases, such as Confluence, to enhance New Relic AI's responses with your specific organizational knowledge. The process follows these steps:
- **Index:** Once your knowledge bases are connected to the New Relic AI platform, the knowledge connector performs an initial indexing of your documents. You can configure this process to run on a recurring basis, ensuring that New Relic AI always has access to the most up-to-date information as your documents evolve.
- **Retrieval:** When you ask a question in New Relic AI, the system searches the indexed content for the most relevant information. This step ensures that the context is pulled directly from your trusted, internal documentation.
- **Generation:** Finally, the system combines the retrieved information with the powerful generative capabilities of the underlying LLM. This synthesis produces a comprehensive and context-aware answer, grounded in your specific data and best practices.

If a query doesn't require your organizational knowledge, New Relic AI will generate an answer using the underlying LLM's vast general knowledge. In both cases, the goal is to provide you with the most relevant and accurate information possible.
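
To make these three steps more concrete, here is a minimal, self-contained Python sketch of the generic RAG pattern described above. It is an illustration only, not New Relic's implementation: the sample documents, the keyword-based scoring, and the `call_llm` placeholder are assumptions standing in for the knowledge connector's indexing, retrieval, and generation stages.

```python
# Illustrative sketch of a generic RAG flow (not New Relic's implementation).
from collections import Counter
from dataclasses import dataclass
from math import sqrt


@dataclass
class Document:
    title: str
    text: str


def tokenize(text: str) -> Counter:
    """Very simple bag-of-words representation of a document or question."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


# Index: build a searchable representation of each connected document.
knowledge_base = [
    Document("DB runbook", "If the database connection limit is exceeded, raise max_connections and recycle the pool."),
    Document("Incident retro", "Root cause: connection leak in the checkout service; fixed by closing idle sessions."),
]
index = [(doc, tokenize(doc.text)) for doc in knowledge_base]


# Retrieval: find the documents most relevant to the user's question.
def retrieve(question: str, top_k: int = 2) -> list[Document]:
    query = tokenize(question)
    ranked = sorted(index, key=lambda pair: cosine(query, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:top_k]]


def call_llm(prompt: str) -> str:
    """Placeholder for whichever LLM backend produces the final answer."""
    return f"(answer grounded in the prompt below)\n{prompt}"


# Generation: combine the retrieved context with the question and ask the LLM.
def answer(question: str) -> str:
    context = "\n".join(f"[{d.title}] {d.text}" for d in retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)


print(answer("What should I do when the database connection limit is exceeded?"))
```

In the knowledge connector itself, the indexing step runs on the schedule you configure, and retrieval happens automatically whenever New Relic AI determines that your organizational knowledge is relevant to a question.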
## Prerequisites
Before you begin using the New Relic AI knowledge connector, ensure that:
- Only documents suitable for organization-wide access are indexed.
- Sensitive information is redacted.
- All the documents to be indexed comply with your organization's internal data security and privacy policies.
- New Relic AI is enabled for your account.
- Appropriate user permissions are configured for indexing. Users who set up and manage knowledge connectors need the “Org Product Admin” role, because these actions may have future billing implications.

You have two options to assign the Org Product Admin role:
- **Apply to an existing user group:** Add the Org Product Admin role to an existing group of users who will be responsible for managing the knowledge connectors.
- **Create a dedicated group:** For more granular control, create a new user group specifically for this purpose and assign the Org Product Admin role to that group.