alerts-detectors-notifications/alerts-and-detectors/alerts-detectors-notifications.rst
Lines changed: 1 addition & 1 deletion
@@ -85,7 +85,7 @@ To learn more, see :ref:`condition-reference`.
  <h2>Alerts<a name="alerts" class="headerlink" href="#alerts" title="Permalink to this headline">¶</a></h2>
  </embed>

- When data in an input MTS matches a condition, the detector generates a trigger event and an alert that has a specific severity level. You can configure an alert to send a notification using Splunk On-Call. For more information, see the :new-page:`Splunk On-Call <https://help.victorops.com/>` documentation.
+ When data in an input MTS matches a condition, the detector generates a trigger event and an alert that has a specific severity level. You can configure an alert to send a notification using Splunk On-Call. For more information, see the :ref:`about-spoc` documentation.

  Alert rules use settings you specify for built-in alert conditions to define thresholds that trigger alerts. When a detector determines that the conditions for a rule are met, it triggers an alert, creates an event, and sends notifications (if specified). Detectors can send notifications via email, as well as via other systems, such as Slack, or via a webhook.
gdi/opentelemetry/components/receiver-creator-receiver.rst
Lines changed: 18 additions & 19 deletions
@@ -7,9 +7,9 @@ Receiver creator receiver
  .. meta::
    :description: Use the receiver creator to create receivers at runtime in the OpenTelemetry Collector based on rules. Read on to learn how to configure the component.

- The receiver creator receiver allows the Splunk Distribution of the OpenTelemetry Collector to create new receivers at runtime based on configured rules and observer extensions. The supported pipeline types are ``metrics``, ``traces``, and ``logs``. See :ref:`otel-data-processing` for more information.
+ Use the Receiver creator receiver with the Splunk Distribution of the OpenTelemetry Collector to create new receivers at runtime based on configured rules and observer extensions. The supported pipeline types are ``metrics``, ``traces``, and ``logs``. See :ref:`otel-data-processing` for more information.

- You can use any of the following observer extensions as listeners for the receiver creator:
+ You can use any of the following observer extensions as listeners for the Receiver creator:

  - ``docker_observer``: Detects and reports running container endpoints through the Docker API.
  - ``ecs_task_observer``: Detects and reports container endpoints for running ECS tasks.
@@ -29,17 +29,16 @@ Follow these steps to configure and activate the component:
  - :ref:`otel-install-windows`
  - :ref:`otel-install-k8s`

- 2. Configure the receiver creator receiver as described in the next section.
+ 2. Configure the Receiver creator receiver as described in the next section.
  3. Restart the Collector.

- Sample configurations
+ Sample configuration
  ----------------------

- To activate the receiver creator receiver, add the desired extensions to the ``extensions`` section of your configuration file, followed by ``receiver_creator`` instances in the ``receivers`` section. For example:
+ To activate the Receiver creator receiver, add the desired extensions to the ``extensions`` section of your configuration file, followed by ``receiver_creator`` instances in the ``receivers`` section. For example:

  .. code-block:: yaml

-
     extensions:
       # Configures the Kubernetes observer to watch for pod start and stop events.
       k8s_observer:
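The YAML body of this sample is cut off at the hunk boundary above. As a rough sketch only, assuming a nested ``redis`` receiver and a ``debug`` exporter that are not part of the diff, a complete configuration of this shape could look like the following:

.. code-block:: yaml

   extensions:
     # Configures the Kubernetes observer to watch for pod start and stop events.
     k8s_observer:
       auth_type: serviceAccount
       node: ${env:K8S_NODE_NAME}

   receivers:
     receiver_creator:
       # The observers listed here report the endpoints that rules are evaluated against.
       watch_observers: [k8s_observer]
       receivers:
         # Hypothetical nested receiver: starts a Redis receiver for each matching pod port.
         redis:
           rule: type == "port" && pod.name matches "redis"
           config:
             collection_interval: 30s

   exporters:
     # Placeholder exporter for illustration; replace with the exporter you actually use.
     debug:

   service:
     extensions: [k8s_observer]
     pipelines:
       metrics:
         receivers: [receiver_creator]
         exporters: [debug]

The observer runs as an extension, the ``receiver_creator`` entry references it through ``watch_observers``, and only ``receiver_creator`` itself appears in the pipeline.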
@@ -76,16 +75,10 @@ To activate the receiver creator receiver, add the desired extensions to the ``e
  You can nest and configure any supported receiver inside the ``receivers`` section of a ``receiver_creator`` configuration. Which receiver you can nest depends on the type of infrastructure the receiver creator is watching through the extensions defined in ``watch_observers``.

- Rules expressions
- ------------------------------------
-
- New receivers are created dynamically based on rules. Each rule must start with ``type == ("pod"|"port"|"hostport"|"container"|"k8s.node") &&`` such that the rule matches only one endpoint type. For a list of variable available to each endpoint type, see :new-page:`Rules expressions <https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/receiver/receivercreator/README.md#rule-expressions>` on GitHub.
-
-
- Docker observer example
+ Example: Docker observer
  ------------------------------------

- The following example shows how to configure the receiver creator using the Docker observer:
+ The following example shows how to configure the Receiver creator receiver using the Docker observer:

  .. code-block:: yaml
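The Docker observer example's YAML body is elided between these hunks. As an illustrative sketch under assumptions (the nested ``nginx`` receiver, its rule, and the stub_status path are not taken from the file), a Docker-observer-based configuration typically looks like this:

.. code-block:: yaml

   extensions:
     # Watches the local Docker daemon for container start and stop events.
     docker_observer:
       endpoint: unix:///var/run/docker.sock

   receivers:
     receiver_creator:
       watch_observers: [docker_observer]
       receivers:
         # Hypothetical nested receiver: scrapes NGINX status from matching containers.
         nginx:
           rule: type == "container" && image matches "nginx"
           config:
             # Backtick expressions expand to values reported by the observer.
             # The path depends on where your NGINX stub_status page is served.
             endpoint: '`endpoint`/nginx_status'
             collection_interval: 10s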
@@ -113,10 +106,10 @@ The following example shows how to configure the receiver creator using the Dock
  .. note:: See :new-page:`https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/extension/observer/dockerobserver/README.md` for a complete list of settings.

- Kubernetes observer example
+ Example: Kubernetes observer
  ------------------------------------

- The following example shows how to configure the receiver creator using the Kubernetes observer:
+ The following example shows how to configure the Receiver creator receiver using the Kubernetes observer:

  .. code-block:: yaml
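The Kubernetes observer example's YAML body is likewise elided. As a hedged sketch (the nested ``kubeletstats`` receiver and its settings are assumptions for illustration, not the file's contents), watching node endpoints rather than pod endpoints can look like this:

.. code-block:: yaml

   extensions:
     k8s_observer:
       auth_type: serviceAccount
       # Report node endpoints in addition to pods, so node-scoped rules can match.
       observe_nodes: true

   receivers:
     receiver_creator:
       watch_observers: [k8s_observer]
       receivers:
         # Hypothetical nested receiver: one kubelet stats scraper per discovered node.
         kubeletstats:
           rule: type == "k8s.node"
           config:
             auth_type: serviceAccount
             collection_interval: 10s
             endpoint: '`endpoint`:`kubelet_endpoint_port`'
             insecure_skip_verify: true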
@@ -149,10 +142,18 @@ The following example shows how to configure the receiver creator using the Kube
  .. note:: See :new-page:`https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/extension/observer/k8sobserver/README.md` for a complete list of settings.

+ Rules to create new receivers
+ ============================================
+
+ You can use this receiver to dynamically create new receivers based on rules. Each rule must start with ``type == ("pod"|"port"|"hostport"|"container"|"k8s.node") &&`` such that the rule matches only one endpoint type.
+
+ For a list of variables available to each endpoint type, see :new-page:`Rules expressions <https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/receiver/receivercreator/README.md#rule-expressions>` on GitHub.
+
+
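To make the rule syntax described in this added section concrete, the following hedged sketch (receiver names and match expressions are hypothetical) pins each nested receiver's rule to exactly one endpoint type:

.. code-block:: yaml

   receivers:
     receiver_creator:
       watch_observers: [k8s_observer, docker_observer]
       receivers:
         # "port" endpoints represent individual exposed pod ports.
         redis:
           rule: type == "port" && pod.name matches "redis" && port == 6379
         # "container" endpoints come from the Docker observer.
         nginx:
           rule: type == "container" && image matches "nginx"
         # "pod" endpoints match whole pods, for example by annotation.
         prometheus_simple:
           rule: type == "pod" && annotations["prometheus.io/scrape"] == "true"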
  Settings
  ======================

- The following table shows the configuration options for the receiver creator receiver:
+ The following table shows the configuration options for the Receiver creator receiver:

  .. raw:: html
@@ -161,8 +162,6 @@ The following table shows the configuration options for the receiver creator rec
get-started/overview.rst
Lines changed: 1 addition & 1 deletion
@@ -105,7 +105,7 @@ For more information, see :ref:`logs-intro-logconnect`.
  Splunk On-Call incident response software aligns log management, monitoring, chat tools, and more, for a single-pane of glass into system health. Splunk On-Call automates delivery of alerts to get the right alert, to the right person, at the right time.

- For more information, see the :new-page:`Splunk On-Call documentation <https://help.victorops.com/>`.
sp-oncall/admin/get-started/admin-getting-started.rst
Lines changed: 1 addition & 1 deletion
@@ -57,7 +57,7 @@ The Team page is your central location for configuring teams, schedules, rotatio
  :ref:`Create Escalation Policies <team-escalation-policy>` - Escalation policies determine which incidents are routed, to whom they are routed, and how they are escalated. Essentially, an escalation policy is how Splunk On-Call escalates a triggered event.

  - Best practice for setting up your escalation policy is to establish a minimum of three escalation paths: on-duty user, previous or next user in a rotation, and manager or team lead.
- - :ref:`Read this post <mult-escalation-policies>` for more tips and tricks on how to manage multiple alert behaviors within a single team.
+ - :ref:`Read this post <multi-escalation-policies>` for more tips and tricks on how to manage multiple alert behaviors within a single team.

  - :ref:`Configure Routing Keys <spoc-routing-keys>` - Routing keys tie the alerts from your monitoring tools to the specific team (or escalation policy) in Splunk On-Call. This helps get the right person on the problem and reduce alert noise for those unrelated to a specific incident. These can be found by navigating to :menuselection:`Settings` then :menuselection:`Routing Keys`.
  :description: Splunk On-Call system requirements, including browsers, mobile support, and incident requirements.


- The Splunk On-Call Team Dashboard provides a comprehensive overview of incidents. This view automatically defaults to the teams that you are a member of and allows teams to dive into the details and understand the status of alerts or incidents.
+ Splunk On-Call provides a comprehensive overview of incidents. This view automatically defaults to the teams that you are a member of and allows teams to dive into the details and understand the status of alerts or incidents.

  All incidents derived from integrated monitoring tools in the incident table include their respective logos to help you rapidly identify the source of an alert. Manually created incidents, along with incidents originating from the Email Endpoint or the REST API integrations, will remain logo free.

  Information Alerts can still be found on the Timeline Page.

  .. image:: /_images/spoc/team-dashboard.png
    :width: 100%
-   :alt: An image of the team dashboard. On-Call individuals listed on the left; Team incidents are shown in the main pane.
+   :alt: An image of the Team Dashboard. On-Call individuals listed on the left; Team incidents are shown in the main pane.


  Filters
@@ -44,12 +44,12 @@ Quickly identify responsible parties during a firefight by easily seeing which p
  Manual Incident Creation
  ----------------------------

- You can create a manual incident from the team dashboard by selecting :guilabel:`Create Incident` in the top right corner. For instructions, see :ref:`manual-incident`.
+ You can create a manual incident by selecting :guilabel:`Create Incident` in the top right corner. For instructions, see :ref:`manual-incident`.

  Incident War Rooms
  ----------------------------

- Access Incident Details directly from the Team Dashboard by selecting the incident number link. This will expand the incident and its event history in the :ref:`Incident War Room <war-room>`.
+ Access Incident Details by selecting the incident number link. This will expand the incident and its event history in the :ref:`Incident War Room <war-room>`.
sp-oncall/admin/sso/single-sign-sso.rst
Lines changed: 20 additions & 59 deletions
@@ -10,13 +10,17 @@ Configure Single Sign-On for Splunk On-Call
  .. toctree::
    :hidden:

- sp-sso-okta
- sp-sso-google
- sp-sso-adfs
+ Configure SSO for Okta <sp-sso-okta>
+ Configure SSO for Google <sp-sso-google>
+ Configure SSO for ADFS <sp-sso-adfs>
+ Configure SSO for other IDPs <sp-sso-other>
  sp-sso-users

- Requirements
- ==================
+ .. raw:: html
+
+    <embed>
+    <h2>Requirements<a name="requirements" class="headerlink" href="#requirements" title="Permalink to this headline">¶</a></h2>
+    </embed>

  This integration is compatible with the following versions of Splunk On-Call:
@@ -25,18 +29,16 @@ This integration is compatible with the following versions of Splunk On-Call:
  To enable single sign-on (SSO) for your organization, you will need to provide an updated metadata file and your IDP. If you are
  interested in setting up SSO, please contact :ref:`Splunk On-Call Support <spoc-support>`.

-
-
  Configure Single Sign On (SSO) between your Identity Provider (IDP) and Splunk On-Call. Our standard SSO setup uses SAML 2.0 protocol. As long as your IDP can use SAML 2.0 protocol, it can integrate with Splunk On-Call. The exact steps differ depending on which IDP you use, but the process typically involves exporting a .XML metadata file and sending it to our Support team. Once you have sent the .xml file, a Splunk On-Call support specialist will
  complete the setup on the back-end and respond with confirmation.

  If your IDP does not have SAML capability, please contact Splunk On-Call Support to explore what alternative options may be available. For details on how to contact Splunk On-Call Support, see :ref:`spoc-support`.

-
-
-
- Administrator Setup
- ==========================
+ .. raw:: html
+
+    <embed>
+    <h2>Configure SSO: Admin guides<a name="admin-setup" class="headerlink" href="#admin-setup" title="Permalink to this headline">¶</a></h2>
+    </embed>

  Instructions to complete the SSO configuration with Splunk On-Call and your IDP are provided for:
@@ -46,51 +48,10 @@ Instructions to complete the SSO configuration with Splunk On-Call and your IDP
  - :ref:`sso-azure-spoc`
  - :ref:`sso-aws-spoc`

+ .. raw:: html
+
+    <embed>
+    <h2>Sign in to Splunk On-Call through SSO: User guide<a name="user-guide" class="headerlink" href="#user-guide" title="Permalink to this headline">¶</a></h2>
+    </embed>

-
- .. _sso-onelogin-spoc:
-
-
- OneLogin
- -------------
-
- If you are configuring SSO for OneLogin, the Default relay state is:
  :description: Enable Splunk On-Call SSO for your organization.

- Requirements
- ==================
-
- This integration is compatible with the following versions of Splunk On-Call:
-
- - Full-Stack
-
-
-
- To enable single sign-on (SSO) for your organization, you will need to provide an updated metadata file and your IDP. If you are interested in setting up SSO, please contact :ref:`Splunk On-Call Support <spoc-support>`.
-
- Configure Single Sign On between your Identity Provider (IDP) and Splunk On-Call. Our standard SSO setup uses SAML 2.0 protocol. As long as your IDP can use SAML 2.0 protocol, it can integrate with Splunk On-Call. The exact steps differ depending on which IDP you use, but the process typically involves exporting a .XML metadata file and sending it to our Support team. Once you have sent the .xml file, a Splunk On-Call support specialist will
- complete the setup on the back-end and respond with confirmation.
-
- If your IDP does not have SAML capability, please contact Splunk On-Call Support to explore what alternative options may be available. For details on how to contact Splunk On-Call Support, see :ref:`spoc-support`.
-
-
- Administrator Setup
- ==========================
-
- Instructions to complete the SSO configuration with Splunk On-Call and your IDP are provided for:
-
- - :ref:`sso-okta-spoc`
- - :ref:`sso-google-spoc`
- -
-
-
- .. _sso-google-spoc:
-
- Google Apps
- ================
-
  To configure SSO for Splunk On-Call using Google Apps:

  #. Access the Admin portal for Google Apps and navigate to :guilabel:`Apps` then :guilabel:`SAML Apps`.
@@ -53,7 +21,7 @@ To configure SSO for Splunk On-Call using Google Apps:
    :width: 100%
    :alt: Splunk On-Call SSO Google Apps Setup 2

- #. From Step 2 of the wizard, select :guilabel:`Option 2` to download IDP metadata in XML format. Attach and send the downloaded .xml file to :ref:`Splunk On-Call Support <spoc-support>`.
+ #. From step 2 of the guided setup, select :guilabel:`Option 2` to download IDP metadata in XML format. Attach and send the downloaded .xml file to :ref:`Splunk On-Call Support <spoc-support>`.

  .. image:: /_images/spoc/sso-google3.png
    :width: 100%
@@ -67,11 +35,8 @@ To configure SSO for Splunk On-Call using Google Apps:
    :alt: Splunk On-Call SSO Google Apps Setup 5

  #. In the :guilabel:`Service Provider Details` step, enter the following values:
- - in the :guilabel:`ACS URL` field: :samp:`https://sso.victorops.com:443/sp/ACS.saml2`
- - in the :guilabel:`Entity ID` field: :samp:`victorops.com`
- - in the :guilabel:`Start URL` field, enter the following with the correct Organization Slug at the end: :samp:`https://portal.victorops.com/auth/sso/<<org-slug-here>>.`
-
-
- #. Skip the attribute mapping step and select :guilabel:`Finish`.
-
+ - In the :guilabel:`ACS URL` field: :samp:`https://sso.victorops.com:443/sp/ACS.saml2`
+ - In the :guilabel:`Entity ID` field: :samp:`victorops.com`
+ - In the :guilabel:`Start URL` field, enter the following with the correct Organization Slug at the end: :samp:`https://portal.victorops.com/auth/sso/<<org-slug-here>>.`

+ #. Skip the attribute mapping step and select :guilabel:`Finish`.