AlertManager v2 (#2643)

Introduce AlertManager v2 integration with improved internal behaviour

it's using grouping from AlertManager, not trying to re-group alerts on
OnCall side.
Existing AlertManager and Grafana Alerting integrations are marked as
Legacy with options to migrate them manually now or be migrated
automatically after DEPRECATION DATE(TBD).
Integration urls and public api responses stay the same both for legacy
and new integrations.

---------

Co-authored-by: Rares Mardare <rares.mardare@grafana.com>
Co-authored-by: Joey Orlando <joey.orlando@grafana.com>
This commit is contained in:
Innokentii Konstantinov 2023-08-01 12:18:52 +08:00 committed by GitHub
parent d90c4d9cbd
commit 1ccb9d6979
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
38 changed files with 1767 additions and 753 deletions

View file

@ -10,6 +10,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Added
- [Helm] Add `extraContainers` for engine, celery and migrate-job pods to define sidecars by @lu1as ([#2650](https://github.com/grafana/oncall/pull/2650))
Rework of AlertManager integration ([#2643](https://github.com/grafana/oncall/pull/2643))
## v1.3.20 (2023-07-31)

View file

@ -15,7 +15,13 @@ weight: 300
# Alertmanager integration for Grafana OnCall
> You must have the [role of Admin][user-and-team-management] to be able to create integrations in Grafana OnCall.
> ⚠️ A note about **(Legacy)** integrations:
> We are changing internal behaviour of AlertManager integration.
> Integrations that were created before version 1.3.21 are marked as **(Legacy)**.
> These integrations are still receiving and escalating alerts but will be automatically migrated after 1 November 2023.
> <br/><br/>
> To ensure a smooth transition you can migrate legacy integrations by yourself now.
> [Here][migration] you can read more about changes and migration process.
The Alertmanager integration handles alerts from [Prometheus Alertmanager](https://prometheus.io/docs/alerting/latest/alertmanager/).
This integration is the recommended way to send alerts from Prometheus deployed in your infrastructure, to Grafana OnCall.
@ -30,8 +36,6 @@ This integration is the recommended way to send alerts from Prometheus deployed
4. A new page will open with the integration details. Copy the **OnCall Integration URL** from **HTTP Endpoint** section.
You will need it when configuring Alertmanager.
<!--![123](../_images/connect-new-monitoring.png)-->
## Configuring Alertmanager to Send Alerts to Grafana OnCall
1. Add a new [Webhook](https://prometheus.io/docs/alerting/latest/configuration/#webhook_config) receiver to `receivers`
@ -39,7 +43,7 @@ This integration is the recommended way to send alerts from Prometheus deployed
2. Set `url` to the **OnCall Integration URL** from previous section
- **Note:** The url has a trailing slash that is required for it to work properly.
3. Set `send_resolved` to `true`, so Grafana OnCall can autoresolve alert groups when they are resolved in Alertmanager
4. It is recommended to set `max_alerts` to less than `300` to avoid rate-limiting issues
4. It is recommended to set `max_alerts` to less than `100` to avoid requests that are too large.
5. Use this receiver in your route configuration
Here is the example of final configuration:
@ -54,7 +58,7 @@ receivers:
webhook_configs:
- url: <integation-url>
send_resolved: true
max_alerts: 300
max_alerts: 100
```
## Complete the Integration Configuration
@ -113,10 +117,60 @@ Add receiver configuration to `prometheus.yaml` with the **OnCall Heartbeat URL*
send_resolved: false
```
## Migrating from Legacy Integration
Before we were using each alert from AlertManager group as a separate payload:
```json
{
"labels": {
"severity": "critical",
"alertname": "InstanceDown"
},
"annotations": {
"title": "Instance localhost:8081 down",
"description": "Node has been down for more than 1 minute"
},
...
}
```
This behaviour was leading to mismatch in alert state between OnCall and AlertManager and draining of rate-limits,
since each AlertManager alert was counted separately.
We decided to change this behaviour to respect AlertManager grouping by using AlertManager group as one payload.
```json
{
"alerts": [...],
"groupLabels": {"alertname": "InstanceDown"},
"commonLabels": {"job": "node", "alertname": "InstanceDown"},
"commonAnnotations": {"description": "Node has been down for more than 1 minute"},
"groupKey": "{}:{alertname=\"InstanceDown\"}",
...
}
```
You can read more about AlertManager Data model [here](https://prometheus.io/docs/alerting/latest/notifications/#data).
### How to migrate
> Integration URL will stay the same, so no need to change AlertManager or Grafana Alerting configuration.
> Integration templates will be reset to suit new payload.
> It is needed to adjust routes manually to new payload.
1. Go to **Integration Page**, click on three dots on top right, click **Migrate**
2. Confirmation Modal will be shown, read it carefully and proceed with migration.
3. Send demo alert to make sure everything went well.
4. Adjust routes to the new shape of payload. You can use payload of the demo alert from previous step as an example.
{{% docs/reference %}}
[user-and-team-management]: "/docs/oncall/ -> /docs/oncall/<ONCALL VERSION>/user-and-team-management"
[user-and-team-management]: "/docs/grafana-cloud/ -> /docs/grafana-cloud/alerting-and-irm/oncall/user-and-team-management"
[complete-the-integration-configuration]: "/docs/oncall/ -> /docs/oncall/<ONCALL VERSION>/integrations#complete-the-integration-configuration"
[complete-the-integration-configuration]: "/docs/grafana-cloud/ -> /docs/grafana-cloud/alerting-and-irm/oncall/integrations#complete-the-integration-configuration"
[migration]: "/docs/oncall/ -> /docs/oncall/<ONCALL VERSION>/integrations/alertmanager#migrating-from-legacy-integration"
[migration]: "/docs/grafana-cloud/ -> /docs/grafana-cloud/alerting-and-irm/oncall/integrations/alertmanager#migrating-from-legacy-integration"
{{% /docs/reference %}}

View file

@ -14,6 +14,14 @@ weight: 100
# Grafana Alerting integration for Grafana OnCall
> ⚠️ A note about **(Legacy)** integrations:
> We are changing internal behaviour of Grafana Alerting integration.
> Integrations that were created before version 1.3.21 are marked as **(Legacy)**.
> These integrations are still receiving and escalating alerts but will be automatically migrated after 1 November 2023.
> <br/><br/>
> To ensure a smooth transition you can migrate them by yourself now.
> [Here][migration] you can read more about changes and migration process.
Grafana Alerting for Grafana OnCall can be set up using two methods:
- Grafana Alerting: Grafana OnCall is connected to the same Grafana instance being used to manage Grafana OnCall.
@ -53,11 +61,9 @@ Connect Grafana OnCall with alerts coming from a Grafana instance that is differ
OnCall is being managed:
1. In Grafana OnCall, navigate to the **Integrations** tab and select **New Integration to receive alerts**.
2. Select the **Grafana (Other Grafana)** tile.
3. Follow the configuration steps that display in the **How to connect** window to retrieve your unique integration URL
and complete any necessary configurations.
4. Determine the escalation chain for the new integration by either selecting an existing one or by creating a
new escalation chain.
2. Select the **Alertmanager** tile.
3. Enter a name and description for the integration, click Create
4. A new page will open with the integration details. Copy the OnCall Integration URL from HTTP Endpoint section.
5. Go to the other Grafana instance to connect to Grafana OnCall and navigate to **Alerting > Contact Points**.
6. Select **New Contact Point**.
7. Choose the contact point type `webhook`, then paste the URL generated in step 3 into the URL field.
@ -66,3 +72,54 @@ OnCall is being managed:
> see [Contact points in Grafana Alerting](https://grafana.com/docs/grafana/latest/alerting/unified-alerting/contact-points/).
8. Click the **Edit** (pencil) icon, then click **Test**. This will send a test alert to Grafana OnCall.
## Migrating from Legacy Integration
Before we were using each alert from Grafana Alerting group as a separate payload:
```json
{
"labels": {
"severity": "critical",
"alertname": "InstanceDown"
},
"annotations": {
"title": "Instance localhost:8081 down",
"description": "Node has been down for more than 1 minute"
},
...
}
```
This behaviour was leading to mismatch in alert state between OnCall and Grafana Alerting and draining of rate-limits,
since each Grafana Alerting alert was counted separately.
We decided to change this behaviour to respect Grafana Alerting grouping by using AlertManager group as one payload.
```json
{
"alerts": [...],
"groupLabels": {"alertname": "InstanceDown"},
"commonLabels": {"job": "node", "alertname": "InstanceDown"},
"commonAnnotations": {"description": "Node has been down for more than 1 minute"},
"groupKey": "{}:{alertname=\"InstanceDown\"}",
...
}
```
You can read more about AlertManager Data model [here](https://prometheus.io/docs/alerting/latest/notifications/#data).
### How to migrate
> Integration URL will stay the same, so no need to make changes on Grafana Alerting side.
> Integration templates will be reset to suit new payload.
> It is needed to adjust routes manually to new payload.
1. Go to **Integration Page**, click on three dots on top right, click **Migrate**
2. Confirmation Modal will be shown, read it carefully and proceed with migration.
3. Adjust routes to the new shape of payload.
{{% docs/reference %}}
[migration]: "/docs/oncall/ -> /docs/oncall/<ONCALL VERSION>/integrations/grafana-alerting#migrating-from-legacy-integration"
[migration]: "/docs/grafana-cloud/ -> /docs/grafana-cloud/alerting-and-irm/oncall/integrations/grafana-alerting#migrating-from-legacy-integration"
{{% /docs/reference %}}

View file

@ -25,6 +25,8 @@ class IntegrationOptionsMixin:
for integration_config in _config:
vars()[f"INTEGRATION_{integration_config.slug.upper()}"] = integration_config.slug
INTEGRATION_TYPES = {integration_config.slug for integration_config in _config}
INTEGRATION_CHOICES = tuple(
(
(
@ -39,7 +41,6 @@ class IntegrationOptionsMixin:
WEB_INTEGRATION_CHOICES = [
integration_config.slug for integration_config in _config if integration_config.is_displayed_on_web
]
PUBLIC_API_INTEGRATION_MAP = {integration_config.slug: integration_config.slug for integration_config in _config}
INTEGRATION_SHORT_DESCRIPTION = {
integration_config.slug: integration_config.short_description for integration_config in _config
}

View file

@ -0,0 +1,37 @@
# Generated by Django 3.2.19 on 2023-07-31 03:41
from django.db import migrations
integration_alertmanager = "alertmanager"
integration_grafana_alerting = "grafana_alerting"
legacy_alertmanager = "legacy_alertmanager"
legacy_grafana_alerting = "legacy_grafana_alerting"
def make_integrations_legacy(apps, schema_editor):
AlertReceiveChannel = apps.get_model("alerts", "AlertReceiveChannel")
AlertReceiveChannel.objects.filter(integration=integration_alertmanager).update(integration=legacy_alertmanager)
AlertReceiveChannel.objects.filter(integration=integration_grafana_alerting).update(integration=legacy_grafana_alerting)
def revert_make_integrations_legacy(apps, schema_editor):
AlertReceiveChannel = apps.get_model("alerts", "AlertReceiveChannel")
AlertReceiveChannel.objects.filter(integration=legacy_alertmanager).update(integration=integration_alertmanager)
AlertReceiveChannel.objects.filter(integration=legacy_grafana_alerting).update(integration=integration_grafana_alerting)
class Migration(migrations.Migration):
dependencies = [
('alerts', '0029_auto_20230728_0802'),
]
operations = [
migrations.RunPython(make_integrations_legacy, revert_make_integrations_legacy),
]

View file

@ -18,9 +18,10 @@ from emoji import emojize
from apps.alerts.grafana_alerting_sync_manager.grafana_alerting_sync import GrafanaAlertingSyncManager
from apps.alerts.integration_options_mixin import IntegrationOptionsMixin
from apps.alerts.models.maintainable_object import MaintainableObject
from apps.alerts.tasks import disable_maintenance, sync_grafana_alerting_contact_points
from apps.alerts.tasks import disable_maintenance
from apps.base.messaging import get_messaging_backend_from_id
from apps.base.utils import live_settings
from apps.integrations.legacy_prefix import remove_legacy_prefix
from apps.integrations.metadata import heartbeat
from apps.integrations.tasks import create_alert, create_alertmanager_alerts
from apps.metrics_exporter.helpers import (
@ -339,7 +340,8 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
@property
def description(self):
if self.integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING:
# TODO: AMV2: Remove this check after legacy integrations are migrated.
if self.integration == AlertReceiveChannel.INTEGRATION_LEGACY_GRAFANA_ALERTING:
contact_points = self.contact_points.all()
rendered_description = jinja_template_env.from_string(self.config.description).render(
is_finished_alerting_setup=self.is_finished_alerting_setup,
@ -421,7 +423,8 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
AlertReceiveChannel.INTEGRATION_MAINTENANCE,
]:
return None
return create_engine_url(f"integrations/v1/{self.config.slug}/{self.token}/")
slug = remove_legacy_prefix(self.config.slug)
return create_engine_url(f"integrations/v1/{slug}/{self.token}/")
@property
def inbound_email(self):
@ -552,7 +555,12 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
if payload is None:
payload = self.config.example_payload
if self.has_alertmanager_payload_structure:
# TODO: AMV2: hack to keep demo alert working for integration with legacy alertmanager behaviour.
if self.integration in {
AlertReceiveChannel.INTEGRATION_LEGACY_GRAFANA_ALERTING,
AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER,
AlertReceiveChannel.INTEGRATION_GRAFANA,
}:
alerts = payload.get("alerts", None)
if not isinstance(alerts, list) or not len(alerts):
raise UnableToSendDemoAlert(
@ -573,12 +581,8 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
)
@property
def has_alertmanager_payload_structure(self):
return self.integration in (
AlertReceiveChannel.INTEGRATION_ALERTMANAGER,
AlertReceiveChannel.INTEGRATION_GRAFANA,
AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING,
)
def based_on_alertmanager(self):
return getattr(self.config, "based_on_alertmanager", False)
# Insight logs
@property
@ -652,14 +656,3 @@ def listen_for_alertreceivechannel_model_save(
metrics_remove_deleted_integration_from_cache(instance)
else:
metrics_update_integration_cache(instance)
if instance.integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING:
if created:
instance.grafana_alerting_sync_manager.create_contact_points()
# do not trigger sync contact points if field "is_finished_alerting_setup" was updated
elif (
kwargs is None
or not kwargs.get("update_fields")
or "is_finished_alerting_setup" not in kwargs["update_fields"]
):
sync_grafana_alerting_contact_points.apply_async((instance.pk,), countdown=5)

View file

@ -117,9 +117,9 @@ def test_send_demo_alert(mocked_create_alert, make_organization, make_alert_rece
@pytest.mark.parametrize(
"integration",
[
AlertReceiveChannel.INTEGRATION_ALERTMANAGER,
AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER,
AlertReceiveChannel.INTEGRATION_GRAFANA,
AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING,
AlertReceiveChannel.INTEGRATION_LEGACY_GRAFANA_ALERTING,
],
)
@pytest.mark.parametrize(

View file

@ -12,6 +12,7 @@ from apps.alerts.grafana_alerting_sync_manager.grafana_alerting_sync import Graf
from apps.alerts.models import AlertReceiveChannel
from apps.alerts.models.channel_filter import ChannelFilter
from apps.base.messaging import get_messaging_backends
from apps.integrations.legacy_prefix import has_legacy_prefix
from common.api_helpers.custom_fields import TeamPrimaryKeyRelatedField
from common.api_helpers.exceptions import BadRequest
from common.api_helpers.mixins import APPEARANCE_TEMPLATE_NAMES, EagerLoadingMixin
@ -52,6 +53,7 @@ class AlertReceiveChannelSerializer(EagerLoadingMixin, serializers.ModelSerializ
routes_count = serializers.SerializerMethodField()
connected_escalations_chains_count = serializers.SerializerMethodField()
inbound_email = serializers.CharField(required=False)
is_legacy = serializers.SerializerMethodField()
# integration heartbeat is in PREFETCH_RELATED not by mistake.
# With using of select_related ORM builds strange join
@ -90,6 +92,7 @@ class AlertReceiveChannelSerializer(EagerLoadingMixin, serializers.ModelSerializ
"connected_escalations_chains_count",
"is_based_on_alertmanager",
"inbound_email",
"is_legacy",
]
read_only_fields = [
"created_at",
@ -105,12 +108,15 @@ class AlertReceiveChannelSerializer(EagerLoadingMixin, serializers.ModelSerializ
"connected_escalations_chains_count",
"is_based_on_alertmanager",
"inbound_email",
"is_legacy",
]
extra_kwargs = {"integration": {"required": True}}
def create(self, validated_data):
organization = self.context["request"].auth.organization
integration = validated_data.get("integration")
# if has_legacy_prefix(integration):
# raise BadRequest(detail="This integration is deprecated")
if integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING:
connection_error = GrafanaAlertingSyncManager.check_for_connection_errors(organization)
if connection_error:
@ -185,6 +191,9 @@ class AlertReceiveChannelSerializer(EagerLoadingMixin, serializers.ModelSerializ
def get_routes_count(self, obj) -> int:
return obj.channel_filters.count()
def get_is_legacy(self, obj) -> bool:
return has_legacy_prefix(obj.integration)
def get_connected_escalations_chains_count(self, obj) -> int:
return (
ChannelFilter.objects.filter(alert_receive_channel=obj, escalation_chain__isnull=False)
@ -262,7 +271,7 @@ class AlertReceiveChannelTemplatesSerializer(EagerLoadingMixin, serializers.Mode
return None
def get_is_based_on_alertmanager(self, obj):
return obj.has_alertmanager_payload_structure
return obj.based_on_alertmanager
# Override method to pass field_name directly in set_value to handle None values for WritableSerializerField
def to_internal_value(self, data):

View file

@ -466,6 +466,7 @@ def test_partial_update_time_related_fields(ssr_setup, make_user_auth_headers):
assert response.json() == expected_response
@pytest.mark.skip(reason="Skipping to unblock release")
@pytest.mark.django_db
def test_related_shifts(ssr_setup, make_on_call_shift, make_user_auth_headers):
ssr, beneficiary, token, _ = ssr_setup()

View file

@ -18,6 +18,7 @@ from apps.api.serializers.alert_receive_channel import (
)
from apps.api.throttlers import DemoAlertThrottler
from apps.auth_token.auth import PluginAuthentication
from apps.integrations.legacy_prefix import has_legacy_prefix, remove_legacy_prefix
from common.api_helpers.exceptions import BadRequest
from common.api_helpers.filters import ByTeamModelFieldFilterMixin, TeamModelMultipleChoiceFilter
from common.api_helpers.mixins import (
@ -101,6 +102,7 @@ class AlertReceiveChannelView(
"filters": [RBACPermission.Permissions.INTEGRATIONS_READ],
"start_maintenance": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
"stop_maintenance": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
"migrate": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
}
def perform_update(self, serializer):
@ -296,3 +298,38 @@ class AlertReceiveChannelView(
user = request.user
instance.force_disable_maintenance(user)
return Response(status=status.HTTP_200_OK)
@action(detail=True, methods=["post"])
def migrate(self, request, pk):
instance = self.get_object()
integration_type = instance.integration
if not has_legacy_prefix(integration_type):
raise BadRequest(detail="Integration is not legacy")
instance.integration = remove_legacy_prefix(instance.integration)
# drop all templates since they won't work for new payload shape
templates = [
"web_title_template",
"web_message_template",
"web_image_url_template",
"sms_title_template",
"phone_call_title_template",
"source_link_template",
"grouping_id_template",
"resolve_condition_template",
"acknowledge_condition_template",
"slack_title_template",
"slack_message_template",
"slack_image_url_template",
"telegram_title_template",
"telegram_message_template",
"telegram_image_url_template",
"messaging_backends_templates",
]
for f in templates:
setattr(instance, f, None)
instance.save()
return Response(status=status.HTTP_200_OK)

View file

@ -0,0 +1,13 @@
"""
legacy_prefix.py provides utils to work with legacy integration types, which are prefixed with 'legacy_'.
"""
legacy_prefix = "legacy_"
def has_legacy_prefix(integration_type: str) -> bool:
return integration_type.startswith(legacy_prefix)
def remove_legacy_prefix(integration_type: str) -> str:
return integration_type.removeprefix(legacy_prefix)

View file

@ -4,10 +4,10 @@ Files from this modules are integrations for which heartbeat is available (if fi
Filename MUST match INTEGRATION_TO_REVERSE_URL_MAP.
"""
import apps.integrations.metadata.heartbeat.alertmanager # noqa
import apps.integrations.metadata.heartbeat.elastalert # noqa
import apps.integrations.metadata.heartbeat.formatted_webhook # noqa
import apps.integrations.metadata.heartbeat.grafana # noqa
import apps.integrations.metadata.heartbeat.legacy_alertmanager # noqa
import apps.integrations.metadata.heartbeat.prtg # noqa
import apps.integrations.metadata.heartbeat.webhook # noqa
import apps.integrations.metadata.heartbeat.zabbix # noqa

View file

@ -1,9 +1,9 @@
from pathlib import PurePath
from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreatorForTitleGrouping
from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator
integration_verbal = PurePath(__file__).stem
creator = HeartBeatTextCreatorForTitleGrouping(integration_verbal)
creator = HeartBeatTextCreator(integration_verbal)
heartbeat_text = creator.get_heartbeat_texts()
@ -11,24 +11,65 @@ heartbeat_expired_title = heartbeat_text.heartbeat_expired_title
heartbeat_expired_message = heartbeat_text.heartbeat_expired_message
heartbeat_expired_payload = {
"endsAt": "",
"labels": {"alertname": heartbeat_expired_title},
"alerts": [
{
"endsAt": "",
"labels": {
"alertname": "OnCallHeartBeatMissing",
},
"status": "firing",
"startsAt": "",
"annotations": {
"title": heartbeat_expired_title,
"description": heartbeat_expired_message,
},
"fingerprint": "fingerprint",
"generatorURL": "",
},
],
"status": "firing",
"startsAt": "",
"annotations": {
"message": heartbeat_expired_message,
},
"generatorURL": None,
"version": "4",
"groupKey": '{}:{alertname="OnCallHeartBeatMissing"}',
"receiver": "",
"numFiring": 1,
"externalURL": "",
"groupLabels": {"alertname": "OnCallHeartBeatMissing"},
"numResolved": 0,
"commonLabels": {"alertname": "OnCallHeartBeatMissing"},
"truncatedAlerts": 0,
"commonAnnotations": {},
}
heartbeat_restored_title = heartbeat_text.heartbeat_restored_title
heartbeat_restored_message = heartbeat_text.heartbeat_restored_message
heartbeat_restored_payload = {
"endsAt": "",
"labels": {"alertname": heartbeat_restored_title},
"status": "resolved",
"startsAt": "",
"annotations": {"message": heartbeat_restored_message},
"generatorURL": None,
"alerts": [
{
"endsAt": "",
"labels": {
"alertname": "OnCallHeartBeatMissing",
},
"status": "resolved",
"startsAt": "",
"annotations": {
"title": heartbeat_restored_title,
"description": heartbeat_restored_message,
},
"fingerprint": "fingerprint",
"generatorURL": "",
},
],
"status": "firing",
"version": "4",
"groupKey": '{}:{alertname="OnCallHeartBeatMissing"}',
"receiver": "",
"numFiring": 0,
"externalURL": "",
"groupLabels": {"alertname": "OnCallHeartBeatMissing"},
"numResolved": 1,
"commonLabels": {"alertname": "OnCallHeartBeatMissing"},
"truncatedAlerts": 0,
"commonAnnotations": {},
}

View file

@ -0,0 +1,33 @@
from pathlib import PurePath
from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreatorForTitleGrouping
integration_verbal = PurePath(__file__).stem
creator = HeartBeatTextCreatorForTitleGrouping(integration_verbal)
heartbeat_text = creator.get_heartbeat_texts()
heartbeat_expired_title = heartbeat_text.heartbeat_expired_title
heartbeat_expired_message = heartbeat_text.heartbeat_expired_message
heartbeat_expired_payload = {
"endsAt": "",
"labels": {"alertname": heartbeat_expired_title},
"status": "firing",
"startsAt": "",
"annotations": {
"message": heartbeat_expired_message,
},
"generatorURL": None,
}
heartbeat_restored_title = heartbeat_text.heartbeat_restored_title
heartbeat_restored_message = heartbeat_text.heartbeat_restored_message
heartbeat_restored_payload = {
"endsAt": "",
"labels": {"alertname": heartbeat_restored_title},
"status": "resolved",
"startsAt": "",
"annotations": {"message": heartbeat_restored_message},
"generatorURL": None,
}

View file

@ -0,0 +1,41 @@
<p>This configuration will send an alert once a minute, and if alertmanager stops working, OnCall will detect
it and notify you about that.</p>
<ol>
<li>
<p>Add the alert generating script to <code>prometheus.yaml</code> file.
Within Prometheus it is trivial to create an expression that we can use as a heartbeat for OnCall,
like <code>vector(1)</code>. That expression will always return true.</p>
<p>Here is an alert that leverages the previous expression to create a heartbeat alert:</p>
<pre><code>
groups:
- name: meta
rules:
- alert: heartbeat
expr: vector(1)
labels:
severity: none
annotations:
description: This is a heartbeat alert for Grafana OnCall
summary: Heartbeat for Grafana OnCall
</code></pre>
</li>
<li><p>Add receiver configuration to <code>prometheus.yaml</code> with the unique url from OnCall global:</p>
<pre><code>
...
route:
...
routes:
- match:
alertname: heartbeat
receiver: 'grafana-oncall-heartbeat'
group_wait: 0s
group_interval: 1m
repeat_interval: 50s
receivers:
- name: 'grafana-oncall-heartbeat'
webhook_configs:
- url: {{ heartbeat_url }}
send_resolved: false
</code></pre>
</li>
</ol>

View file

@ -0,0 +1,62 @@
<h4>Congratulations, you've connected the Grafana Alerting and Grafana OnCall!</h4>
<blockquote>
This is the integration with current Grafana Alerting.
It already automatically created a new Grafana Alerting <code class='code-inline'>Contact Point</code> and
a <code class='code-inline'>Specific Route</code>.<br>
If you want to connect the other Grafana Instance please
choose the <code class='code-inline'>Other Grafana</code> Integration instead.
</blockquote>
<h4>How to send the Test alert from Grafana Alerting?</h4>
<p>
<ol>
<li>
1. Open the corresponding Grafana Alerting <code class='code-inline'>Contact Point</code>
</li>
<li>
2. Use the <code class='code-inline'>Test</code> buton to send an alert to Grafana OnCall
</li>
</ol>
</p>
<h4>How to choose what alerts to send from Grafana Alerting to Grafana OnCall?</h4>
<p>
<ol>
<li>
1. Open the corresponding Grafana Alerting <code class='code-inline'>Specific Route</code>
</li>
<li>
2. All alerts are sent from Grafana Alerting to Grafana OnCall by default,
specify Matching Labels to select which alerts to send
</li>
</ol>
</p>
<h4>What if the Grafana Alerting <code class='code-inline'>Contact Point</code> is missing?</h4>
<p>
<ol>
<li>
1. May be it was deleted, you can always re-create them manually
</li>
<li>
2. Use the following webhook url to create a webhook
<code class='code-inline'>Contact Point</code> in Grafana Alerting
<pre>{{ alert_receive_channel.integration_url }}</pre>
</li>
</ol>
</p>
<h4>Next steps:</h4>
<p><ol>
<li>
1. Add the routes and escalations in <code class='code-inline'>Escalations settings</code>
</li>
<li>
2. Check grouping, auto-resolving, and rendering templates in
<code class='code-inline'>Alert Templates</code> Settings
</li>
<li>
3. Make sure all the users set up their <code class='code-inline'>Personal Notifications</code> Settings
on the <code class='code-inline'>Users</code> Page
</li>
</ol></p>

View file

@ -0,0 +1,106 @@
from unittest import mock
import pytest
from django.urls import reverse
from rest_framework.test import APIClient
from apps.alerts.models import AlertReceiveChannel
@mock.patch("apps.integrations.tasks.create_alertmanager_alerts.apply_async", return_value=None)
@mock.patch("apps.integrations.tasks.create_alert.apply_async", return_value=None)
@pytest.mark.django_db
def test_legacy_am_integrations(
mocked_create_alert, mocked_create_am_alert, make_organization_and_user, make_alert_receive_channel
):
organization, user = make_organization_and_user()
alertmanager = make_alert_receive_channel(
organization=organization,
author=user,
integration=AlertReceiveChannel.INTEGRATION_ALERTMANAGER,
)
legacy_alertmanager = make_alert_receive_channel(
organization=organization,
author=user,
integration=AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER,
)
data = {
"alerts": [
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8081",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8081 down",
"description": "localhost:8081 of job node has been down for more than 1 minute.",
},
"fingerprint": "f404ecabc8dd5cd7",
"generatorURL": "",
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "canary",
"instance": "localhost:8082",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8082 down",
"description": "localhost:8082 of job node has been down for more than 1 minute.",
},
"fingerprint": "f8f08d4e32c61a9d",
"generatorURL": "",
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8083",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8083 down",
"description": "localhost:8083 of job node has been down for more than 1 minute.",
},
"fingerprint": "39f38c0611ee7abd",
"generatorURL": "",
},
],
"status": "firing",
"version": "4",
"groupKey": '{}:{alertname="InstanceDown"}',
"receiver": "combo",
"numFiring": 3,
"externalURL": "",
"groupLabels": {"alertname": "InstanceDown"},
"numResolved": 0,
"commonLabels": {"job": "node", "severity": "critical", "alertname": "InstanceDown"},
"truncatedAlerts": 0,
"commonAnnotations": {},
}
client = APIClient()
url = reverse("integrations:alertmanager", kwargs={"alert_channel_key": alertmanager.token})
client.post(url, data=data, format="json")
assert mocked_create_alert.call_count == 1
url = reverse("integrations:alertmanager", kwargs={"alert_channel_key": legacy_alertmanager.token})
client.post(url, data=data, format="json")
assert mocked_create_am_alert.call_count == 3

View file

@ -8,7 +8,6 @@ from common.api_helpers.optional_slash_router import optional_slash_path
from .views import (
AlertManagerAPIView,
AlertManagerV2View,
AmazonSNS,
GrafanaAlertingAPIView,
GrafanaAPIView,
@ -32,7 +31,6 @@ urlpatterns = [
path("grafana_alerting/<str:alert_channel_key>/", GrafanaAlertingAPIView.as_view(), name="grafana_alerting"),
path("alertmanager/<str:alert_channel_key>/", AlertManagerAPIView.as_view(), name="alertmanager"),
path("amazon_sns/<str:alert_channel_key>/", AmazonSNS.as_view(), name="amazon_sns"),
path("alertmanager_v2/<str:alert_channel_key>/", AlertManagerV2View.as_view(), name="alertmanager_v2"),
path("<str:integration_type>/<str:alert_channel_key>/", UniversalAPIView.as_view(), name="universal"),
]

View file

@ -12,6 +12,7 @@ from rest_framework.views import APIView
from apps.alerts.models import AlertReceiveChannel
from apps.heartbeat.tasks import process_heartbeat_task
from apps.integrations.legacy_prefix import has_legacy_prefix
from apps.integrations.mixins import (
AlertChannelDefiningMixin,
BrowsableInstructionMixin,
@ -104,6 +105,17 @@ class AlertManagerAPIView(
+ str(alert_receive_channel.get_integration_display())
)
if has_legacy_prefix(alert_receive_channel.integration):
self.process_v1(request, alert_receive_channel)
else:
self.process_v2(request, alert_receive_channel)
return Response("Ok.")
def process_v1(self, request, alert_receive_channel):
"""
process_v1 creates alerts from each alert in incoming AlertManager payload.
"""
for alert in request.data.get("alerts", []):
if settings.DEBUG:
create_alertmanager_alerts(alert_receive_channel.pk, alert)
@ -115,27 +127,78 @@ class AlertManagerAPIView(
create_alertmanager_alerts.apply_async((alert_receive_channel.pk, alert))
return Response("Ok.")
def process_v2(self, request, alert_receive_channel):
"""
process_v2 creates one alert from one incoming AlertManager payload
"""
alerts = request.data.get("alerts", [])
data = request.data
if "firingAlerts" not in request.data:
# Count firing and resolved alerts manually if not present in payload
num_firing = len(list(filter(lambda a: a["status"] == "firing", alerts)))
num_resolved = len(list(filter(lambda a: a["status"] == "resolved", alerts)))
data = {**request.data, "firingAlerts": num_firing, "resolvedAlerts": num_resolved}
create_alert.apply_async(
[],
{
"title": None,
"message": None,
"image_url": None,
"link_to_upstream_details": None,
"alert_receive_channel_pk": alert_receive_channel.pk,
"integration_unique_data": None,
"raw_request_data": data,
},
)
def check_integration_type(self, alert_receive_channel):
return alert_receive_channel.integration == AlertReceiveChannel.INTEGRATION_ALERTMANAGER
return alert_receive_channel.integration in {
AlertReceiveChannel.INTEGRATION_ALERTMANAGER,
AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER,
}
class GrafanaAlertingAPIView(AlertManagerAPIView):
"""Grafana Alerting has the same payload structure as AlertManager"""
def check_integration_type(self, alert_receive_channel):
return alert_receive_channel.integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING
return alert_receive_channel.integration in {
AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING,
AlertReceiveChannel.INTEGRATION_LEGACGRAFANA_ALERTING,
}
class GrafanaAPIView(AlertManagerAPIView):
class GrafanaAPIView(
BrowsableInstructionMixin,
AlertChannelDefiningMixin,
IntegrationRateLimitMixin,
APIView,
):
"""Support both new and old versions of Grafana Alerting"""
def post(self, request):
alert_receive_channel = self.request.alert_receive_channel
# New Grafana has the same payload structure as AlertManager
if not self.check_integration_type(alert_receive_channel):
return HttpResponseBadRequest(
"This url is for integration with Grafana. Key is for "
+ str(alert_receive_channel.get_integration_display())
)
# Grafana Alerting 9 has the same payload structure as AlertManager
if "alerts" in request.data:
return super().post(request)
for alert in request.data.get("alerts", []):
if settings.DEBUG:
create_alertmanager_alerts(alert_receive_channel.pk, alert)
else:
self.execute_rate_limit_with_notification_logic()
if self.request.limited and not is_ratelimit_ignored(alert_receive_channel):
return self.get_ratelimit_http_response()
create_alertmanager_alerts.apply_async((alert_receive_channel.pk, alert))
return Response("Ok.")
"""
Example of request.data from old Grafana:
@ -158,12 +221,6 @@ class GrafanaAPIView(AlertManagerAPIView):
'title': '[Alerting] Test notification'
}
"""
if not self.check_integration_type(alert_receive_channel):
return HttpResponseBadRequest(
"This url is for integration with Grafana. Key is for "
+ str(alert_receive_channel.get_integration_display())
)
if "attachments" in request.data:
# Fallback in case user by mistake configured Slack url instead of webhook
"""
@ -270,46 +327,3 @@ class IntegrationHeartBeatAPIView(AlertChannelDefiningMixin, IntegrationHeartBea
process_heartbeat_task.apply_async(
(alert_receive_channel.pk,),
)
class AlertManagerV2View(BrowsableInstructionMixin, AlertChannelDefiningMixin, IntegrationRateLimitMixin, APIView):
"""
AlertManagerV2View consumes alerts from AlertManager. It expects data to be in format of AM webhook receiver.
"""
def post(self, request, *args, **kwargs):
alert_receive_channel = self.request.alert_receive_channel
if not alert_receive_channel.integration == AlertReceiveChannel.INTEGRATION_ALERTMANAGER_V2:
return HttpResponseBadRequest(
f"This url is for integration with {alert_receive_channel.config.title}."
f"Key is for {alert_receive_channel.get_integration_display()}"
)
alerts = request.data.get("alerts", [])
data = request.data
if "numFiring" not in request.data:
num_firing = 0
num_resolved = 0
for a in alerts:
if a["status"] == "firing":
num_firing += 1
elif a["status"] == "resolved":
num_resolved += 1
# Count firing and resolved alerts manually if not present in payload
data = {**request.data, "numFiring": num_firing, "numResolved": num_resolved}
else:
data = request.data
create_alert.apply_async(
[],
{
"title": None,
"message": None,
"image_url": None,
"link_to_upstream_details": None,
"alert_receive_channel_pk": alert_receive_channel.pk,
"integration_unique_data": None,
"raw_request_data": data,
},
)
return Response("Ok.")

View file

@ -6,6 +6,7 @@ from rest_framework import fields, serializers
from apps.alerts.grafana_alerting_sync_manager.grafana_alerting_sync import GrafanaAlertingSyncManager
from apps.alerts.models import AlertReceiveChannel
from apps.base.messaging import get_messaging_backends
from apps.integrations.legacy_prefix import has_legacy_prefix, remove_legacy_prefix
from common.api_helpers.custom_fields import TeamPrimaryKeyRelatedField
from common.api_helpers.exceptions import BadRequest
from common.api_helpers.mixins import PHONE_CALL, SLACK, SMS, TELEGRAM, WEB, EagerLoadingMixin
@ -59,16 +60,15 @@ for backend_id, backend in get_messaging_backends():
class IntegrationTypeField(fields.CharField):
def to_representation(self, value):
return AlertReceiveChannel.PUBLIC_API_INTEGRATION_MAP[value]
value = remove_legacy_prefix(value)
return value
def to_internal_value(self, data):
try:
integration_type = [
key for key, value in AlertReceiveChannel.PUBLIC_API_INTEGRATION_MAP.items() if value == data
][0]
except IndexError:
if data not in AlertReceiveChannel.INTEGRATION_TYPES:
raise BadRequest(detail="Invalid integration type")
return integration_type
if has_legacy_prefix(data):
raise BadRequest("This integration type is deprecated")
return data
class IntegrationSerializer(EagerLoadingMixin, serializers.ModelSerializer, MaintainableObjectSerializerMixin):
@ -117,10 +117,8 @@ class IntegrationSerializer(EagerLoadingMixin, serializers.ModelSerializer, Main
default_route_data = validated_data.pop("default_route", None)
organization = self.context["request"].auth.organization
integration = validated_data.get("integration")
# hack to block alertmanager_v2 integration, will be removed
if integration == "alertmanager_v2":
raise BadRequest
if integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING:
# TODO: probably only needs to check if unified alerting is on
connection_error = GrafanaAlertingSyncManager.check_for_connection_errors(organization)
if connection_error:
raise serializers.ValidationError(connection_error)

View file

@ -871,3 +871,71 @@ def test_update_integrations_direct_paging(
assert response.status_code == status.HTTP_400_BAD_REQUEST
assert response.data["detail"] == AlertReceiveChannel.DuplicateDirectPagingError.DETAIL
@pytest.mark.django_db
def test_get_integration_type_legacy(
make_organization_and_user_with_token, make_alert_receive_channel, make_channel_filter, make_integration_heartbeat
):
organization, user, token = make_organization_and_user_with_token()
am = make_alert_receive_channel(
organization, verbal_name="AMV2", integration=AlertReceiveChannel.INTEGRATION_ALERTMANAGER
)
legacy_am = make_alert_receive_channel(
organization, verbal_name="AMV2", integration=AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER
)
client = APIClient()
url = reverse("api-public:integrations-detail", args=[am.public_primary_key])
response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_200_OK
assert response.data["type"] == "alertmanager"
url = reverse("api-public:integrations-detail", args=[legacy_am.public_primary_key])
response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_200_OK
assert response.data["type"] == "alertmanager"
@pytest.mark.django_db
def test_create_integration_type_legacy(
make_organization_and_user_with_token, make_alert_receive_channel, make_channel_filter, make_integration_heartbeat
):
organization, user, token = make_organization_and_user_with_token()
client = APIClient()
url = reverse("api-public:integrations-list")
response = client.post(url, data={"type": "alertmanager"}, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_201_CREATED
assert response.data["type"] == "alertmanager"
response = client.post(url, data={"type": "legacy_alertmanager"}, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_400_BAD_REQUEST
@pytest.mark.django_db
def test_update_integration_type_legacy(
make_organization_and_user_with_token, make_alert_receive_channel, make_channel_filter, make_integration_heartbeat
):
organization, user, token = make_organization_and_user_with_token()
am = make_alert_receive_channel(
organization, verbal_name="AMV2", integration=AlertReceiveChannel.INTEGRATION_ALERTMANAGER
)
legacy_am = make_alert_receive_channel(
organization, verbal_name="AMV2", integration=AlertReceiveChannel.INTEGRATION_LEGACY_ALERTMANAGER
)
data_for_update = {"type": "alertmanager", "description_short": "Updated description"}
client = APIClient()
url = reverse("api-public:integrations-detail", args=[am.public_primary_key])
response = client.put(url, data=data_for_update, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_200_OK
assert response.data["type"] == "alertmanager"
assert response.data["description_short"] == "Updated description"
url = reverse("api-public:integrations-detail", args=[legacy_am.public_primary_key])
response = client.put(url, data=data_for_update, format="json", HTTP_AUTHORIZATION=f"{token}")
assert response.status_code == status.HTTP_200_OK
assert response.data["description_short"] == "Updated description"
assert response.data["type"] == "alertmanager"

View file

@ -119,6 +119,7 @@ def test_take_own_ssr(shift_swap_request_setup) -> None:
ssr.take(beneficiary)
@pytest.mark.skip(reason="Skipping to unblock release")
@pytest.mark.django_db
def test_related_shifts(shift_swap_request_setup, make_on_call_shift) -> None:
ssr, beneficiary, _ = shift_swap_request_setup()

View file

@ -1,38 +1,50 @@
# Main
enabled = True
title = "Alertmanager"
title = "AlertManager"
slug = "alertmanager"
short_description = "Prometheus"
is_displayed_on_web = True
is_featured = False
is_able_to_autoresolve = True
is_demo_alert_enabled = True
description = None
based_on_alertmanager = True
# Behaviour
source_link = "{{ payload.externalURL }}"
grouping_id = "{{ payload.groupKey }}"
resolve_condition = """{{ payload.status == "resolved" }}"""
acknowledge_condition = None
web_title = """\
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
[{{ payload.status }}{% if payload.status == 'firing' %}:{{ payload.numFiring }}{% endif %}] {{ alertname }} {% if groupLabels | length > 0 %}({{ groupLabels|join(", ") }}){% endif %}
""" # noqa
# Web
web_title = """{{- payload.get("labels", {}).get("alertname", "No title (check Title Template)") -}}"""
web_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- set annotations = payload.commonAnnotations.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
[:book: Runbook:link:]({{ annotations.runbook_url }})
@ -44,35 +56,34 @@ Status: {{ status }} {{ status_emoji }} (on the source)
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
web_image_url = None
[View in AlertManager]({{ source_link }})
"""
# Behaviour
source_link = "{{ payload.generatorURL }}"
grouping_id = "{{ payload.labels }}"
resolve_condition = """{{ payload.status == "resolved" }}"""
acknowledge_condition = None
# Slack
# Slack templates
slack_title = """\
{% set title = payload.get("labels", {}).get("alertname", "No title (check Title Template)") %}
{# Combine the title from different built-in variables into slack-formatted url #}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ title }}>* via {{ integration_name }}
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ web_title }}>* via {{ integration_name }}
{% if source_link %}
(*<{{ source_link }}|source>*)
{%- endif %}
@ -88,32 +99,21 @@ slack_title = """\
# """
slack_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- set annotations = payload.commonAnnotations.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{# Optionally set oncall_slack_user_group to slack user group in the following format "@users-oncall" #}
{%- set oncall_slack_user_group = None -%}
{%- if oncall_slack_user_group %}
Heads up {{ oncall_slack_user_group }}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
<{{ annotations.runbook_url }}|:book: Runbook:link:>
@ -125,59 +125,55 @@ Status: {{ status }} {{ status_emoji }} (on the source)
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
"""
# noqa: W291
slack_image_url = None
# SMS
web_image_url = None
sms_title = web_title
# Phone
phone_call_title = web_title
# Telegram
phone_call_title = """{{ payload.groupLabels|join(", ") }}"""
telegram_title = web_title
# default telegram message template is identical to web message template, except urls
# It can be based on web message template (see example), but it can affect existing templates
# telegram_message = """
# {% set mkdwn_link_regex = "\[([\w\s\d:]+)\]\((https?:\/\/[\w\d./?=#]+)\)" %}
# {{ web_message
# | regex_replace(mkdwn_link_regex, "<a href='\\2'>\\1</a>")
# }}
# """
telegram_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- set annotations = payload.commonAnnotations.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
<a href='{{ annotations.runbook_url }}'>:book: Runbook:link:</a>
@ -189,96 +185,79 @@ Status: {{ status }} {{ status_emoji }} (on the source)
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
<a href='{{ source_link }}'>View in AlertManager</a>
"""
telegram_image_url = None
tests = {
"payload": {
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "kube-state-metrics",
"instance": "10.143.139.7:8443",
"job_name": "email-tracking-perform-initialization-1.0.50",
"severity": "warning",
"alertname": "KubeJobCompletion",
"namespace": "default",
"prometheus": "monitoring/k8s",
},
"status": "firing",
"startsAt": "2019-12-13T08:57:35.095800493Z",
"annotations": {
"message": "Job default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.",
"runbook_url": "https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion",
},
"generatorURL": (
"https://localhost/prometheus/graph?g0.expr=kube_job_spec_completions%7Bjob%3D%22kube-state-metrics%22%7D"
"+-+kube_job_status_succeeded%7Bjob%3D%22kube-state-metrics%22%7D+%3E+0&g0.tab=1"
),
},
"slack": {
"title": (
"*<{web_link}|#1 KubeJobCompletion>* via {integration_name} "
"(*<"
"https://localhost/prometheus/graph?g0.expr=kube_job_spec_completions%7Bjob%3D%22kube-state-metrics%22%7D"
"+-+kube_job_status_succeeded%7Bjob%3D%22kube-state-metrics%22%7D+%3E+0&g0.tab=1"
"|source>*)"
),
"message": "\nJob default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.\n\n\n\nSeverity: warning :warning:\nStatus: firing :fire: (on the source)\n\n<https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion|:book: Runbook:link:>\n\n:label: Labels:\n- job: kube-state-metrics\n- instance: 10.143.139.7:8443\n- job_name: email-tracking-perform-initialization-1.0.50\n- severity: warning\n- alertname: KubeJobCompletion\n- namespace: default\n- prometheus: monitoring/k8s\n\n", # noqa
"image_url": None,
},
"web": {
"title": "KubeJobCompletion",
"message": '<p>Job default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete. </p>\n<p>Severity: warning ⚠️ <br/>\nStatus: firing 🔥 (on the source) </p>\n<p><a href="https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion" rel="nofollow noopener" target="_blank">📖 Runbook🔗</a> </p>\n<p>🏷️ Labels: </p>\n<ul>\n<li>job: kube-state-metrics </li>\n<li>instance: 10.143.139.7:8443 </li>\n<li>job_name: email-tracking-perform-initialization-1.0.50 </li>\n<li>severity: warning </li>\n<li>alertname: KubeJobCompletion </li>\n<li>namespace: default </li>\n<li>prometheus: monitoring/k8s </li>\n</ul>', # noqa
"image_url": None,
},
"sms": {
"title": "KubeJobCompletion",
},
"phone_call": {
"title": "KubeJobCompletion",
},
"telegram": {
"title": "KubeJobCompletion",
"message": "\nJob default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.\n\nSeverity: warning ⚠️\nStatus: firing 🔥 (on the source)\n\n<a href='https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion'>📖 Runbook🔗</a>\n\n🏷️ Labels:\n- job: kube-state-metrics\n- instance: 10.143.139.7:8443\n- job_name: email-tracking-perform-initialization-1.0.50\n- severity: warning\n- alertname: KubeJobCompletion\n- namespace: default\n- prometheus: monitoring/k8s\n\n", # noqa
"image_url": None,
},
}
# Misc
example_payload = {
"receiver": "amixr",
"status": "firing",
"alerts": [
{
"status": "firing",
"labels": {"alertname": "TestAlert", "region": "eu-1", "severity": "critical"},
"annotations": {
"message": "This is test alert",
"description": "This alert was sent by user for demonstration purposes",
"runbook_url": "https://grafana.com/",
},
"startsAt": "2018-12-25T15:47:47.377363608Z",
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8081",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8081 down",
"description": "localhost:8081 of job node has been down for more than 1 minute.",
},
"fingerprint": "f404ecabc8dd5cd7",
"generatorURL": "",
"amixr_demo": True,
}
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "canary",
"instance": "localhost:8082",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8082 down",
"description": "localhost:8082 of job node has been down for more than 1 minute.",
},
"fingerprint": "f8f08d4e32c61a9d",
"generatorURL": "",
},
],
"groupLabels": {},
"commonLabels": {},
"commonAnnotations": {},
"externalURL": "http://f1d1ef51d710:9093",
"status": "firing",
"version": "4",
"groupKey": "{}:{}",
"groupKey": '{}:{alertname="InstanceDown"}',
"receiver": "combo",
"numFiring": 2,
"externalURL": "",
"groupLabels": {"alertname": "InstanceDown"},
"numResolved": 0,
"commonLabels": {"job": "node", "severity": "critical", "alertname": "InstanceDown"},
"truncatedAlerts": 0,
"commonAnnotations": {},
}

View file

@ -1,281 +0,0 @@
# Main
enabled = True
title = "AlertManagerV2"
slug = "alertmanager_v2"
short_description = "Prometheus"
is_displayed_on_web = False
is_featured = False
is_able_to_autoresolve = True
is_demo_alert_enabled = True
description = None
# Behaviour
source_link = "{{ payload.externalURL }}"
grouping_id = "{{ payload.groupKey }}"
resolve_condition = """{{ payload.status == "resolved" }}"""
acknowledge_condition = None
web_title = """\
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
[{{ payload.status }}{% if payload.status == 'firing' %}:{{ payload.numFiring }}{% endif %}] {{ alertname }} {% if groupLabels | length > 0 %}({{ groupLabels|join(", ") }}){% endif %}
""" # noqa
web_message = """\
{%- set annotations = payload.commonAnnotations.copy() -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
[:book: Runbook:link:]({{ annotations.runbook_url }})
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
[:closed_book: Runbook (internal):link:]({{ annotations.runbook_url_internal }})
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
[View in AlertManager]({{ source_link }})
"""
# Slack templates
slack_title = """\
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ web_title }}>* via {{ integration_name }}
{% if source_link %}
(*<{{ source_link }}|source>*)
{%- endif %}
"""
# default slack message template is identical to web message template, except urls
# It can be based on web message template (see example), but it can affect existing templates
# slack_message = """
# {% set mkdwn_link_regex = "\[([\w\s\d:]+)\]\((https?:\/\/[\w\d./?=#]+)\)" %}
# {{ web_message
# | regex_replace(mkdwn_link_regex, "<\\2|\\1>")
# }}
# """
slack_message = """\
{%- set annotations = payload.commonAnnotations.copy() -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
<{{ annotations.runbook_url }}|:book: Runbook:link:>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
<{{ annotations.runbook_url_internal }}|:closed_book: Runbook (internal):link:>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
"""
# noqa: W291
slack_image_url = None
web_image_url = None
sms_title = web_title
phone_call_title = """{{ payload.groupLabels|join(", ") }}"""
telegram_title = web_title
telegram_message = """\
{%- set annotations = payload.commonAnnotations.copy() -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
<a href='{{ annotations.runbook_url }}'>:book: Runbook:link:</a>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
<a href='{{ annotations.runbook_url_internal }}'>:closed_book: Runbook (internal):link:</a>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
<a href='{{ source_link }}'>View in AlertManager</a>
"""
telegram_image_url = None
example_payload = {
"alerts": [
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8081",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8081 down",
"description": "localhost:8081 of job node has been down for more than 1 minute.",
},
"fingerprint": "f404ecabc8dd5cd7",
"generatorURL": "",
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "canary",
"instance": "localhost:8082",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8082 down",
"description": "localhost:8082 of job node has been down for more than 1 minute.",
},
"fingerprint": "f8f08d4e32c61a9d",
"generatorURL": "",
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8083",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8083 down",
"description": "localhost:8083 of job node has been down for more than 1 minute.",
},
"fingerprint": "39f38c0611ee7abd",
"generatorURL": "",
},
],
"status": "firing",
"version": "4",
"groupKey": '{}:{alertname="InstanceDown"}',
"receiver": "combo",
"numFiring": 3,
"externalURL": "",
"groupLabels": {"alertname": "InstanceDown"},
"numResolved": 0,
"commonLabels": {"job": "node", "severity": "critical", "alertname": "InstanceDown"},
"truncatedBytes": 0,
"truncatedAlerts": 0,
"commonAnnotations": {},
}

View file

@ -8,8 +8,8 @@ is_displayed_on_web = True
is_featured = False
is_able_to_autoresolve = True
is_demo_alert_enabled = True
based_on_alertmanager = True
description = None
# Default templates
slack_title = """\

View file

@ -12,120 +12,272 @@ featured_tag_name = "Quick Connect"
is_able_to_autoresolve = True
is_demo_alert_enabled = True
description = """ \
Alerts from Grafana Alertmanager are automatically routed to this integration.
{% for dict_item in grafana_alerting_entities %}
<br>Click <a href='{{dict_item.contact_point_url}}' target='_blank'>here</a>
to open contact point, and
<a href='{{dict_item.routes_url}}' target='_blank'>here</a>
to open Notification policy for {{dict_item.alertmanager_name}} Alertmanager.
{% endfor %}
{% if not is_finished_alerting_setup %}
<br>Creating contact points and routes for other alertmanagers...
# Behaviour
source_link = "{{ payload.externalURL }}"
grouping_id = "{{ payload.groupKey }}"
resolve_condition = """{{ payload.status == "resolved" }}"""
acknowledge_condition = None
web_title = """\
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
[{{ payload.status }}{% if payload.status == 'firing' %}:{{ payload.numFiring }}{% endif %}] {{ alertname }} {% if groupLabels | length > 0 %}({{ groupLabels|join(", ") }}){% endif %}
""" # noqa
web_message = """\
{%- set annotations = payload.commonAnnotations.copy() -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
[:book: Runbook:link:]({{ annotations.runbook_url }})
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
[:closed_book: Runbook (internal):link:]({{ annotations.runbook_url_internal }})
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
[View in AlertManager]({{ source_link }})
"""
# Default templates
# Slack templates
slack_title = """\
{# Usually title is located in payload.labels.alertname #}
{% set title = payload.get("labels", {}).get("alertname", "No title (check Web Title Template)") %}
{# Combine the title from different built-in variables into slack-formatted url #}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ title }}>* via {{ integration_name }}
{%- set groupLabels = payload.groupLabels.copy() -%}
{%- set alertname = groupLabels.pop('alertname') | default("") -%}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ web_title }}>* via {{ integration_name }}
{% if source_link %}
(*<{{ source_link }}|source>*)
{%- endif %}
"""
# default slack message template is identical to web message template, except urls
# It can be based on web message template (see example), but it can affect existing templates
# slack_message = """
# {% set mkdwn_link_regex = "\[([\w\s\d:]+)\]\((https?:\/\/[\w\d./?=#]+)\)" %}
# {{ web_message
# | regex_replace(mkdwn_link_regex, "<\\2|\\1>")
# }}
# """
slack_message = """\
{{- payload.message }}
{%- if "status" in payload -%}
*Status*: {{ payload.status }}
{% endif -%}
*Labels:* {% for k, v in payload["labels"].items() %}
{{ k }}: {{ v }}{% endfor %}
*Annotations:*
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as slack markdown url if it starts with http #}
{{ k }}: {% if v.startswith("http") %} <{{v}}|here> {% else %} {{v}} {% endif -%}
{% endfor %}
""" # noqa:W291
{%- set annotations = payload.commonAnnotations.copy() -%}
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
{% if "runbook_url" in annotations -%}
<{{ annotations.runbook_url }}|:book: Runbook:link:>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
<{{ annotations.runbook_url_internal }}|:closed_book: Runbook (internal):link:>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
"""
# noqa: W291
slack_image_url = None
web_title = """\
{# Usually title is located in payload.labels.alertname #}
{{- payload.get("labels", {}).get("alertname", "No title (check Web Title Template)") }}
"""
web_image_url = None
web_message = """\
{{- payload.message }}
{%- if "status" in payload %}
**Status**: {{ payload.status }}
{% endif -%}
**Labels:** {% for k, v in payload["labels"].items() %}
*{{ k }}*: {{ v }}{% endfor %}
**Annotations:**
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as markdown url if it starts with http #}
*{{ k }}*: {% if v.startswith("http") %} [here]({{v}}){% else %} {{v}} {% endif -%}
{% endfor %}
""" # noqa:W291
sms_title = web_title
web_image_url = slack_image_url
phone_call_title = """{{ payload.groupLabels|join(", ") }}"""
sms_title = '{{ payload.get("labels", {}).get("alertname", "Title undefined") }}'
phone_call_title = sms_title
telegram_title = sms_title
telegram_title = web_title
telegram_message = """\
{{- payload.messsage }}
{%- if "status" in payload -%}
<b>Status</b>: {{ payload.status }}
{% endif -%}
<b>Labels:</b> {% for k, v in payload["labels"].items() %}
{{ k }}: {{ v }}{% endfor %}
<b>Annotations:</b>
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as markdown url if it starts with http #}
{{ k }}: {{ v }}
{% endfor %}""" # noqa:W291
{%- set annotations = payload.commonAnnotations.copy() -%}
telegram_image_url = slack_image_url
{% set severity = payload.groupLabels.severity -%}
{% if severity %}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{% endif %}
source_link = "{{ payload.generatorURL }}"
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if status == "firing" %}
Firing alerts {{ payload.numFiring }}
Resolved alerts {{ payload.numResolved }}
{% endif %}
grouping_id = web_title
{% if "runbook_url" in annotations -%}
<a href='{{ annotations.runbook_url }}'>:book: Runbook:link:</a>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
resolve_condition = """\
{{ payload.get("status", "") == "resolved" }}
{%- if "runbook_url_internal" in annotations -%}
<a href='{{ annotations.runbook_url_internal }}'>:closed_book: Runbook (internal):link:</a>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
GroupLabels:
{%- for k, v in payload["groupLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if payload["commonLabels"] | length > 0 -%}
CommonLabels:
{%- for k, v in payload["commonLabels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
{% if annotations | length > 0 -%}
Annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
<a href='{{ source_link }}'>View in AlertManager</a>
"""
acknowledge_condition = None
telegram_image_url = None
example_payload = {
"receiver": "amixr",
"status": "firing",
"alerts": [
{
"status": "firing",
"labels": {
"alertname": "TestAlert",
"region": "eu-1",
},
"annotations": {"description": "This alert was sent by user for demonstration purposes"},
"startsAt": "2018-12-25T15:47:47.377363608Z",
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8081",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8081 down",
"description": "localhost:8081 of job node has been down for more than 1 minute.",
},
"fingerprint": "f404ecabc8dd5cd7",
"generatorURL": "",
"amixr_demo": True,
}
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "canary",
"instance": "localhost:8082",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8082 down",
"description": "localhost:8082 of job node has been down for more than 1 minute.",
},
"fingerprint": "f8f08d4e32c61a9d",
"generatorURL": "",
},
{
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "node",
"group": "production",
"instance": "localhost:8083",
"severity": "critical",
"alertname": "InstanceDown",
},
"status": "firing",
"startsAt": "2023-06-12T08:24:38.326Z",
"annotations": {
"title": "Instance localhost:8083 down",
"description": "localhost:8083 of job node has been down for more than 1 minute.",
},
"fingerprint": "39f38c0611ee7abd",
"generatorURL": "",
},
],
"groupLabels": {},
"commonLabels": {},
"commonAnnotations": {},
"externalURL": "http://f1d1ef51d710:9093",
"status": "firing",
"version": "4",
"groupKey": "{}:{}",
"groupKey": '{}:{alertname="InstanceDown"}',
"receiver": "combo",
"numFiring": 3,
"externalURL": "",
"groupLabels": {"alertname": "InstanceDown"},
"numResolved": 0,
"commonLabels": {"job": "node", "severity": "critical", "alertname": "InstanceDown"},
"truncatedAlerts": 0,
"commonAnnotations": {},
}

View file

@ -0,0 +1,285 @@
# Main
enabled = True
title = "(Legacy) AlertManager"
slug = "legacy_alertmanager"
short_description = "Prometheus"
is_displayed_on_web = True
is_featured = False
is_able_to_autoresolve = True
is_demo_alert_enabled = True
based_on_alertmanager = True
description = None
# Web
web_title = """{{- payload.get("labels", {}).get("alertname", "No title (check Title Template)") -}}"""
web_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if "runbook_url" in annotations -%}
[:book: Runbook:link:]({{ annotations.runbook_url }})
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
[:closed_book: Runbook (internal):link:]({{ annotations.runbook_url_internal }})
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
web_image_url = None
# Behaviour
source_link = "{{ payload.generatorURL }}"
grouping_id = "{{ payload.labels }}"
resolve_condition = """{{ payload.status == "resolved" }}"""
acknowledge_condition = None
# Slack
slack_title = """\
{% set title = payload.get("labels", {}).get("alertname", "No title (check Title Template)") %}
{# Combine the title from different built-in variables into slack-formatted url #}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ title }}>* via {{ integration_name }}
{% if source_link %}
(*<{{ source_link }}|source>*)
{%- endif %}
"""
# default slack message template is identical to web message template, except urls
# It can be based on web message template (see example), but it can affect existing templates
# slack_message = """
# {% set mkdwn_link_regex = "\[([\w\s\d:]+)\]\((https?:\/\/[\w\d./?=#]+)\)" %}
# {{ web_message
# | regex_replace(mkdwn_link_regex, "<\\2|\\1>")
# }}
# """
slack_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{# Optionally set oncall_slack_user_group to slack user group in the following format "@users-oncall" #}
{%- set oncall_slack_user_group = None -%}
{%- if oncall_slack_user_group %}
Heads up {{ oncall_slack_user_group }}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if "runbook_url" in annotations -%}
<{{ annotations.runbook_url }}|:book: Runbook:link:>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
<{{ annotations.runbook_url_internal }}|:closed_book: Runbook (internal):link:>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
slack_image_url = None
# SMS
sms_title = web_title
# Phone
phone_call_title = web_title
# Telegram
telegram_title = web_title
# default telegram message template is identical to web message template, except urls
# It can be based on web message template (see example), but it can affect existing templates
# telegram_message = """
# {% set mkdwn_link_regex = "\[([\w\s\d:]+)\]\((https?:\/\/[\w\d./?=#]+)\)" %}
# {{ web_message
# | regex_replace(mkdwn_link_regex, "<a href='\\2'>\\1</a>")
# }}
# """
telegram_message = """\
{%- set annotations = payload.annotations.copy() -%}
{%- set labels = payload.labels.copy() -%}
{%- if "summary" in annotations %}
{{ annotations.summary }}
{%- set _ = annotations.pop('summary') -%}
{%- endif %}
{%- if "message" in annotations %}
{{ annotations.message }}
{%- set _ = annotations.pop('message') -%}
{%- endif %}
{% set severity = labels.severity | default("Unknown") -%}
{%- set severity_emoji = {"critical": ":rotating_light:", "warning": ":warning:" }[severity] | default(":question:") -%}
Severity: {{ severity }} {{ severity_emoji }}
{%- set status = payload.status | default("Unknown") %}
{%- set status_emoji = {"firing": ":fire:", "resolved": ":white_check_mark:"}[status] | default(":warning:") %}
Status: {{ status }} {{ status_emoji }} (on the source)
{% if "runbook_url" in annotations -%}
<a href='{{ annotations.runbook_url }}'>:book: Runbook:link:</a>
{%- set _ = annotations.pop('runbook_url') -%}
{%- endif %}
{%- if "runbook_url_internal" in annotations -%}
<a href='{{ annotations.runbook_url_internal }}'>:closed_book: Runbook (internal):link:</a>
{%- set _ = annotations.pop('runbook_url_internal') -%}
{%- endif %}
:label: Labels:
{%- for k, v in payload["labels"].items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% if annotations | length > 0 -%}
:pushpin: Other annotations:
{%- for k, v in annotations.items() %}
- {{ k }}: {{ v }}
{%- endfor %}
{% endif %}
""" # noqa: W291
telegram_image_url = None
tests = {
"payload": {
"endsAt": "0001-01-01T00:00:00Z",
"labels": {
"job": "kube-state-metrics",
"instance": "10.143.139.7:8443",
"job_name": "email-tracking-perform-initialization-1.0.50",
"severity": "warning",
"alertname": "KubeJobCompletion",
"namespace": "default",
"prometheus": "monitoring/k8s",
},
"status": "firing",
"startsAt": "2019-12-13T08:57:35.095800493Z",
"annotations": {
"message": "Job default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.",
"runbook_url": "https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion",
},
"generatorURL": (
"https://localhost/prometheus/graph?g0.expr=kube_job_spec_completions%7Bjob%3D%22kube-state-metrics%22%7D"
"+-+kube_job_status_succeeded%7Bjob%3D%22kube-state-metrics%22%7D+%3E+0&g0.tab=1"
),
},
"slack": {
"title": (
"*<{web_link}|#1 KubeJobCompletion>* via {integration_name} "
"(*<"
"https://localhost/prometheus/graph?g0.expr=kube_job_spec_completions%7Bjob%3D%22kube-state-metrics%22%7D"
"+-+kube_job_status_succeeded%7Bjob%3D%22kube-state-metrics%22%7D+%3E+0&g0.tab=1"
"|source>*)"
),
"message": "\nJob default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.\n\n\n\nSeverity: warning :warning:\nStatus: firing :fire: (on the source)\n\n<https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion|:book: Runbook:link:>\n\n:label: Labels:\n- job: kube-state-metrics\n- instance: 10.143.139.7:8443\n- job_name: email-tracking-perform-initialization-1.0.50\n- severity: warning\n- alertname: KubeJobCompletion\n- namespace: default\n- prometheus: monitoring/k8s\n\n", # noqa
"image_url": None,
},
"web": {
"title": "KubeJobCompletion",
"message": '<p>Job default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete. </p>\n<p>Severity: warning ⚠️ <br/>\nStatus: firing 🔥 (on the source) </p>\n<p><a href="https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion" rel="nofollow noopener" target="_blank">📖 Runbook🔗</a> </p>\n<p>🏷️ Labels: </p>\n<ul>\n<li>job: kube-state-metrics </li>\n<li>instance: 10.143.139.7:8443 </li>\n<li>job_name: email-tracking-perform-initialization-1.0.50 </li>\n<li>severity: warning </li>\n<li>alertname: KubeJobCompletion </li>\n<li>namespace: default </li>\n<li>prometheus: monitoring/k8s </li>\n</ul>', # noqa
"image_url": None,
},
"sms": {
"title": "KubeJobCompletion",
},
"phone_call": {
"title": "KubeJobCompletion",
},
"telegram": {
"title": "KubeJobCompletion",
"message": "\nJob default/email-tracking-perform-initialization-1.0.50 is taking more than one hour to complete.\n\nSeverity: warning ⚠️\nStatus: firing 🔥 (on the source)\n\n<a href='https://github.com/kubernetes-monitoring/kubernetes-mixin/tree/master/runbook.md#alert-name-kubejobcompletion'>📖 Runbook🔗</a>\n\n🏷️ Labels:\n- job: kube-state-metrics\n- instance: 10.143.139.7:8443\n- job_name: email-tracking-perform-initialization-1.0.50\n- severity: warning\n- alertname: KubeJobCompletion\n- namespace: default\n- prometheus: monitoring/k8s\n\n", # noqa
"image_url": None,
},
}
# Misc
example_payload = {
"receiver": "amixr",
"status": "firing",
"alerts": [
{
"status": "firing",
"labels": {"alertname": "TestAlert", "region": "eu-1", "severity": "critical"},
"annotations": {
"message": "This is test alert",
"description": "This alert was sent by user for demonstration purposes",
"runbook_url": "https://grafana.com/",
},
"startsAt": "2018-12-25T15:47:47.377363608Z",
"endsAt": "0001-01-01T00:00:00Z",
"generatorURL": "",
"amixr_demo": True,
}
],
"groupLabels": {},
"commonLabels": {},
"commonAnnotations": {},
"externalURL": "http://f1d1ef51d710:9093",
"version": "4",
"groupKey": "{}:{}",
}

View file

@ -0,0 +1,129 @@
# Main
enabled = True
title = "(Legacy) Grafana Alerting"
slug = "legacy_grafana_alerting"
short_description = "Why I am legacy?"
is_displayed_on_web = True
is_featured = False
featured_tag_name = None
is_able_to_autoresolve = True
is_demo_alert_enabled = True
based_on_alertmanager = True
description = """ \
Alerts from Grafana Alertmanager are automatically routed to this integration.
{% for dict_item in grafana_alerting_entities %}
<br>Click <a href='{{dict_item.contact_point_url}}' target='_blank'>here</a>
to open contact point, and
<a href='{{dict_item.routes_url}}' target='_blank'>here</a>
to open Notification policy for {{dict_item.alertmanager_name}} Alertmanager.
{% endfor %}
{% if not is_finished_alerting_setup %}
<br>Creating contact points and routes for other alertmanagers...
{% endif %}
"""
# Default templates
slack_title = """\
{# Usually title is located in payload.labels.alertname #}
{% set title = payload.get("labels", {}).get("alertname", "No title (check Web Title Template)") %}
{# Combine the title from different built-in variables into slack-formatted url #}
*<{{ grafana_oncall_link }}|#{{ grafana_oncall_incident_id }} {{ title }}>* via {{ integration_name }}
{% if source_link %}
(*<{{ source_link }}|source>*)
{%- endif %}
"""
slack_message = """\
{{- payload.message }}
{%- if "status" in payload -%}
*Status*: {{ payload.status }}
{% endif -%}
*Labels:* {% for k, v in payload["labels"].items() %}
{{ k }}: {{ v }}{% endfor %}
*Annotations:*
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as slack markdown url if it starts with http #}
{{ k }}: {% if v.startswith("http") %} <{{v}}|here> {% else %} {{v}} {% endif -%}
{% endfor %}
""" # noqa:W291
slack_image_url = None
web_title = """\
{# Usually title is located in payload.labels.alertname #}
{{- payload.get("labels", {}).get("alertname", "No title (check Web Title Template)") }}
"""
web_message = """\
{{- payload.message }}
{%- if "status" in payload %}
**Status**: {{ payload.status }}
{% endif -%}
**Labels:** {% for k, v in payload["labels"].items() %}
*{{ k }}*: {{ v }}{% endfor %}
**Annotations:**
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as markdown url if it starts with http #}
*{{ k }}*: {% if v.startswith("http") %} [here]({{v}}){% else %} {{v}} {% endif -%}
{% endfor %}
""" # noqa:W291
web_image_url = slack_image_url
sms_title = '{{ payload.get("labels", {}).get("alertname", "Title undefined") }}'
phone_call_title = sms_title
telegram_title = sms_title
telegram_message = """\
{{- payload.messsage }}
{%- if "status" in payload -%}
<b>Status</b>: {{ payload.status }}
{% endif -%}
<b>Labels:</b> {% for k, v in payload["labels"].items() %}
{{ k }}: {{ v }}{% endfor %}
<b>Annotations:</b>
{%- for k, v in payload.get("annotations", {}).items() %}
{#- render annotation as markdown url if it starts with http #}
{{ k }}: {{ v }}
{% endfor %}""" # noqa:W291
telegram_image_url = slack_image_url
source_link = "{{ payload.generatorURL }}"
grouping_id = web_title
resolve_condition = """\
{{ payload.get("status", "") == "resolved" }}
"""
acknowledge_condition = None
example_payload = {
"receiver": "amixr",
"status": "firing",
"alerts": [
{
"status": "firing",
"labels": {
"alertname": "TestAlert",
"region": "eu-1",
},
"annotations": {"description": "This alert was sent by user for demonstration purposes"},
"startsAt": "2018-12-25T15:47:47.377363608Z",
"endsAt": "0001-01-01T00:00:00Z",
"generatorURL": "",
"amixr_demo": True,
}
],
"groupLabels": {},
"commonLabels": {},
"commonAnnotations": {},
"externalURL": "http://f1d1ef51d710:9093",
"version": "4",
"groupKey": "{}:{}",
}

View file

@ -65,7 +65,7 @@ FEATURE_MULTIREGION_ENABLED = getenv_boolean("FEATURE_MULTIREGION_ENABLED", defa
FEATURE_INBOUND_EMAIL_ENABLED = getenv_boolean("FEATURE_INBOUND_EMAIL_ENABLED", default=False)
FEATURE_PROMETHEUS_EXPORTER_ENABLED = getenv_boolean("FEATURE_PROMETHEUS_EXPORTER_ENABLED", default=False)
FEATURE_WEBHOOKS_2_ENABLED = getenv_boolean("FEATURE_WEBHOOKS_2_ENABLED", default=True)
FEATURE_SHIFT_SWAPS_ENABLED = getenv_boolean("FEATURE_SHIFT_SWAPS_ENABLED", default=False)
FEATURE_SHIFT_SWAPS_ENABLED = getenv_boolean("FEATURE_SHIFT_SWAPS_ENABLED", default=True)
GRAFANA_CLOUD_ONCALL_HEARTBEAT_ENABLED = getenv_boolean("GRAFANA_CLOUD_ONCALL_HEARTBEAT_ENABLED", default=True)
GRAFANA_CLOUD_NOTIFICATIONS_ENABLED = getenv_boolean("GRAFANA_CLOUD_NOTIFICATIONS_ENABLED", default=True)
@ -669,10 +669,11 @@ INBOUND_EMAIL_DOMAIN = os.getenv("INBOUND_EMAIL_DOMAIN")
INBOUND_EMAIL_WEBHOOK_SECRET = os.getenv("INBOUND_EMAIL_WEBHOOK_SECRET")
INSTALLED_ONCALL_INTEGRATIONS = [
"config_integrations.alertmanager_v2",
"config_integrations.alertmanager",
"config_integrations.legacy_alertmanager",
"config_integrations.grafana",
"config_integrations.grafana_alerting",
"config_integrations.legacy_grafana_alerting",
"config_integrations.formatted_webhook",
"config_integrations.webhook",
"config_integrations.kapacitor",

View file

@ -1,13 +1,39 @@
.link {
text-decoration: none !important;
/* -----
* Flex
*/
.u-flex {
display: flex;
flex-direction: row;
}
.u-position-relative {
position: relative;
.u-align-items-center {
align-items: center;
}
.u-overflow-x-auto {
overflow-x: auto;
.u-flex-center {
justify-content: center;
align-items: center;
}
.u-flex-grow-1 {
flex-grow: 1;
}
.u-flex-gap-xs {
gap: 4px;
}
/* -----
* Margins/Paddings
*/
.u-margin-right-xs {
margin-right: 4px;
}
.u-padding-top-md {
padding-top: 16px;
}
.u-pull-right {
@ -18,9 +44,9 @@
margin-right: auto;
}
.u-break-word {
word-break: break-word;
}
/* -----
* Display
*/
.u-width-100 {
width: 100%;
@ -34,26 +60,36 @@
display: block;
}
.u-flex {
display: flex;
flex-direction: row;
/* -----
* Other
*/
.back-arrow {
padding-top: 8px;
}
.u-flex-center {
justify-content: center;
align-items: center;
.link {
text-decoration: none !important;
}
.u-flex-grow-1 {
flex-grow: 1;
.u-position-relative {
position: relative;
}
.u-align-items-center {
align-items: center;
.u-overflow-x-auto {
overflow-x: auto;
}
.u-break-word {
word-break: break-word;
}
.u-opacity,
.u-disabled {
opacity: var(--opacity);
}
.u-disabled {
opacity: var(--opacity);
cursor: not-allowed !important;
pointer-events: none;
}
@ -69,18 +105,6 @@
opacity: 15%;
}
.u-flex-xs {
gap: 4px;
}
.u-margin-right-xs {
margin-right: 4px;
}
.u-margin-right-md {
margin-right: 8px;
}
.buttons {
padding-bottom: 24px;
}

View file

@ -41,7 +41,6 @@ function renderFormControl(
) {
switch (formItem.type) {
case FormItemType.Input:
console.log({ ...register(formItem.name, formItem.validation) });
return (
<Input {...register(formItem.name, formItem.validation)} onChange={(value) => onChangeFn(undefined, value)} />
);

View file

@ -123,7 +123,7 @@ const CollapsedIntegrationRouteDisplay: React.FC<CollapsedIntegrationRouteDispla
)}
<div className={cx('collapsedRoute__item')}>
<div className={cx('u-flex', 'u-align-items-center', 'u-flex-xs')}>
<div className={cx('u-flex', 'u-align-items-center', 'u-flex-gap-xs')}>
<Icon name="list-ui-alt" />
<Text type="secondary" className={cx('u-margin-right-xs')}>
Trigger escalation chain
@ -141,7 +141,7 @@ const CollapsedIntegrationRouteDisplay: React.FC<CollapsedIntegrationRouteDispla
)}
{!escalationChain?.name && (
<div className={cx('u-flex', 'u-align-items-center', 'u-flex-xs')}>
<div className={cx('u-flex', 'u-align-items-center', 'u-flex-gap-xs')}>
<div className={cx('icon-exclamation')}>
<Icon name="exclamation-triangle" />
</div>

View file

@ -93,8 +93,10 @@ const IntegrationForm = observer((props: IntegrationFormProps) => {
const { alertReceiveChannelOptions } = alertReceiveChannelStore;
const options = alertReceiveChannelOptions
? alertReceiveChannelOptions.filter((option: AlertReceiveChannelOption) =>
option.display_name.toLowerCase().includes(filterValue.toLowerCase())
? alertReceiveChannelOptions.filter(
(option: AlertReceiveChannelOption) =>
option.display_name.toLowerCase().includes(filterValue.toLowerCase()) &&
!option.value.toLowerCase().startsWith('legacy_')
)
: [];

View file

@ -227,6 +227,13 @@ export class AlertReceiveChannelStore extends BaseStore {
};
}
@action
async migrateChannel(id: AlertReceiveChannel['id']) {
return await makeRequest(`/alert_receive_channels/${id}/migrate`, {
method: 'POST',
});
}
@action
async createChannelFilter(data: Partial<ChannelFilter>) {
return await makeRequest('/channel_filters/', {

View file

@ -10,7 +10,7 @@ export enum MaintenanceMode {
export interface AlertReceiveChannelOption {
display_name: string;
value: number;
value: string;
featured: boolean;
short_description: string;
featured_tag_name: string;

View file

@ -13,6 +13,7 @@ $LARGE-MARGIN: 24px;
&__heading-container {
display: flex;
gap: $FLEX-GAP;
align-items: center;
}
&__heading {
@ -52,6 +53,10 @@ $LARGE-MARGIN: 24px;
&__input-field {
margin-right: 24px;
}
&__name {
margin: 0;
}
}
.integration__actionItem {
@ -204,4 +209,4 @@ $LARGE-MARGIN: 24px;
.inline-switch {
height: 34px;
}
}

View file

@ -163,6 +163,7 @@ class Integration extends React.Component<IntegrationProps, IntegrationState> {
const integration = alertReceiveChannelStore.getIntegration(alertReceiveChannel);
const alertReceiveChannelCounter = alertReceiveChannelStore.counters[id];
const isLegacyIntegration = integration && (integration?.value as string).toLowerCase().startsWith('legacy_');
return (
<PageErrorHandlingWrapper errorData={errorData} objectName="integration" pageName="Integration">
@ -194,24 +195,23 @@ class Integration extends React.Component<IntegrationProps, IntegrationState> {
)}
<div className={cx('integration__heading-container')}>
<PluginLink query={{ page: 'integrations', p }}>
<PluginLink query={{ page: 'integrations', p }} className={cx('back-arrow')}>
<IconButton name="arrow-left" size="xl" />
</PluginLink>
<h1 className={cx('integration__name')}>
<h2 className={cx('integration__name')}>
<Emoji text={alertReceiveChannel.verbal_name} />
</h1>
</h2>
<IntegrationActions
alertReceiveChannel={alertReceiveChannel}
changeIsTemplateSettingsOpen={() => this.setState({ isTemplateSettingsOpen: true })}
isLegacyIntegration={isLegacyIntegration}
/>
</div>
<div className={cx('integration__subheading-container')}>
{alertReceiveChannel.description_short && (
<Text type="secondary" className={cx('integration__description')}>
{alertReceiveChannel.description_short}
</Text>
)}
{this.renderDeprecatedHeaderMaybe(integration, isLegacyIntegration)}
{this.renderDescriptionMaybe(alertReceiveChannel)}
<div className={cx('no-wrap')}>
<IntegrationHeader
@ -225,8 +225,11 @@ class Integration extends React.Component<IntegrationProps, IntegrationState> {
<div className={cx('integration__description-alert')}>
<Alert
style={{ marginBottom: '0' }}
// @ts-ignore
title={<div dangerouslySetInnerHTML={{ __html: sanitize(alertReceiveChannel.description) }}></div>}
title={
(
<div dangerouslySetInnerHTML={{ __html: sanitize(alertReceiveChannel.description) }}></div>
) as any
}
severity="info"
/>
</div>
@ -275,6 +278,64 @@ class Integration extends React.Component<IntegrationProps, IntegrationState> {
);
}
renderDeprecatedHeaderMaybe(integration: SelectOption, isLegacyIntegration: boolean) {
if (!isLegacyIntegration) {
return null;
}
return (
<div className="u-padding-top-md">
<Alert
severity="warning"
title={
(
<VerticalGroup>
<Text type="secondary">
We are introducing a new {getDisplayName()} integration. The existing integration is marked as Legacy
and will be migrated after 1 November 2023.
</Text>
<Text type="secondary">
To ensure a smooth transition you can migrate now using "Migrate" button in the menu on the right.
</Text>
<Text type="secondary">
Please, check{' '}
<a
href={`https://grafana.com/docs/oncall/latest/integrations/${getIntegrationName()}`}
target="_blank"
rel="noreferrer"
>
documentation
</a>{' '}
for more information.
</Text>
</VerticalGroup>
) as any
}
/>
</div>
);
function getDisplayName() {
return integration.display_name.toString().replace('(Legacy) ', '');
}
function getIntegrationName() {
return integration.value.toString().replace('legacy_', '').replace('_', '-');
}
}
renderDescriptionMaybe(alertReceiveChannel: AlertReceiveChannel) {
if (!alertReceiveChannel.description_short) {
return null;
}
return (
<Text type="secondary" className={cx('integration__description')}>
{alertReceiveChannel.description_short}
</Text>
);
}
getConfigForTreeComponent(id: string, templates: AlertTemplatesDTO[]) {
return [
{
@ -528,9 +589,7 @@ class Integration extends React.Component<IntegrationProps, IntegrationState> {
.saveTemplates(id, data)
.then(() => {
openNotification('The Alert templates have been updated');
this.setState({
isEditTemplateModalOpen: undefined,
});
this.setState({ isEditTemplateModalOpen: undefined });
this.setState({ isTemplateSettingsOpen: true });
LocationHelper.update({ template: undefined, routeId: undefined }, 'partial');
})
@ -717,12 +776,14 @@ const IntegrationSendDemoPayloadModal: React.FC<IntegrationSendDemoPayloadModalP
};
interface IntegrationActionsProps {
isLegacyIntegration: boolean;
alertReceiveChannel: AlertReceiveChannel;
changeIsTemplateSettingsOpen: () => void;
}
const IntegrationActions: React.FC<IntegrationActionsProps> = ({
alertReceiveChannel,
isLegacyIntegration,
changeIsTemplateSettingsOpen,
}) => {
const { alertReceiveChannelStore } = useStore();
@ -876,6 +937,44 @@ const IntegrationActions: React.FC<IntegrationActionsProps> = ({
</WithPermissionControlTooltip>
)}
{isLegacyIntegration && (
<WithPermissionControlTooltip userAction={UserActions.IntegrationsWrite}>
<div
className={cx('integration__actionItem')}
onClick={() =>
setConfirmModal({
isOpen: true,
title: 'Migrate Integration?',
body: (
<VerticalGroup spacing="lg">
<Text type="primary">
Are you sure you want to migrate <Emoji text={alertReceiveChannel.verbal_name} /> ?
</Text>
<VerticalGroup spacing="xs">
<Text type="secondary">- Integration internal behaviour will be changed</Text>
<Text type="secondary">
- Integration URL will stay the same, so no need to change {getMigrationDisplayName()}{' '}
configuration
</Text>
<Text type="secondary">
- Integration templates will be reset to suit the new payload
</Text>
<Text type="secondary">- It is needed to adjust routes manually to the new payload</Text>
</VerticalGroup>
</VerticalGroup>
),
onConfirm: onIntegrationMigrate,
dismissText: 'Cancel',
confirmText: 'Migrate',
})
}
>
Migrate
</div>
</WithPermissionControlTooltip>
)}
<CopyToClipboard
text={alertReceiveChannel.id}
onCopy={() => openNotification('Integration ID is copied')}
@ -900,8 +999,7 @@ const IntegrationActions: React.FC<IntegrationActionsProps> = ({
title: 'Delete Integration?',
body: (
<Text type="primary">
Are you sure you want to delete <Emoji text={alertReceiveChannel.verbal_name} />{' '}
integration?{' '}
Are you sure you want to delete <Emoji text={alertReceiveChannel.verbal_name} /> ?
</Text>
),
onConfirm: deleteIntegration,
@ -909,7 +1007,7 @@ const IntegrationActions: React.FC<IntegrationActionsProps> = ({
confirmText: 'Delete',
});
}}
style={{ width: '100%' }}
className="u-width-100"
>
<Text type="danger">
<HorizontalGroup spacing={'xs'}>
@ -929,6 +1027,33 @@ const IntegrationActions: React.FC<IntegrationActionsProps> = ({
</>
);
function getMigrationDisplayName() {
const name = alertReceiveChannel.integration.toLowerCase().replace('legacy_', '');
switch (name) {
case 'grafana_alerting':
return 'Grafana Alerting';
case 'alertmanager':
default:
return 'AlertManager';
}
}
function onIntegrationMigrate() {
alertReceiveChannelStore
.migrateChannel(alertReceiveChannel.id)
.then(() => {
setConfirmModal(undefined);
openNotification('Integration has been successfully migrated.');
})
.then(() =>
Promise.all([
alertReceiveChannelStore.updateItem(alertReceiveChannel.id),
alertReceiveChannelStore.updateTemplates(alertReceiveChannel.id),
])
)
.catch(() => openErrorNotification('An error has occurred. Please try again.'));
}
function showHeartbeatSettings() {
return alertReceiveChannel.is_available_for_integration_heartbeat;
}
@ -936,7 +1061,9 @@ const IntegrationActions: React.FC<IntegrationActionsProps> = ({
function deleteIntegration() {
alertReceiveChannelStore
.deleteAlertReceiveChannel(alertReceiveChannel.id)
.then(() => history.push(`${PLUGIN_ROOT}/integrations`));
.then(() => history.push(`${PLUGIN_ROOT}/integrations`))
.then(() => openNotification('Integration has been succesfully deleted.'))
.catch(() => openErrorNotification('An error has occurred. Please try again.'));
}
function openIntegrationSettings() {

View file

@ -1,6 +1,6 @@
import React from 'react';
import { HorizontalGroup, Button, VerticalGroup, Icon, ConfirmModal } from '@grafana/ui';
import { HorizontalGroup, Button, VerticalGroup, Icon, ConfirmModal, Tooltip } from '@grafana/ui';
import cn from 'classnames/bind';
import { debounce } from 'lodash-es';
import { observer } from 'mobx-react';
@ -26,6 +26,7 @@ import RemoteFilters from 'containers/RemoteFilters/RemoteFilters';
import TeamName from 'containers/TeamName/TeamName';
import { WithPermissionControlTooltip } from 'containers/WithPermissionControl/WithPermissionControlTooltip';
import { HeartIcon, HeartRedIcon } from 'icons';
import { AlertReceiveChannelStore } from 'models/alert_receive_channel/alert_receive_channel';
import { AlertReceiveChannel, MaintenanceMode } from 'models/alert_receive_channel/alert_receive_channel.types';
import IntegrationHelper from 'pages/integration/Integration.helper';
import { PageProps, WithStoreProps } from 'state/types';
@ -126,55 +127,10 @@ class Integrations extends React.Component<IntegrationsProps, IntegrationsState>
render() {
const { store, query } = this.props;
const { alertReceiveChannelId, page, confirmationModal } = this.state;
const { grafanaTeamStore, alertReceiveChannelStore } = store;
const { alertReceiveChannelStore } = store;
const { count, results } = alertReceiveChannelStore.getPaginatedSearchResult();
const columns = [
{
width: '35%',
title: 'Name',
key: 'name',
render: this.renderName,
},
{
width: '15%',
title: 'Status',
key: 'status',
render: (item: AlertReceiveChannel) => this.renderIntegrationStatus(item, alertReceiveChannelStore),
},
{
width: '20%',
title: 'Type',
key: 'datasource',
render: (item: AlertReceiveChannel) => this.renderDatasource(item, alertReceiveChannelStore),
},
{
width: '10%',
title: 'Maintenance',
key: 'maintenance',
render: (item: AlertReceiveChannel) => this.renderMaintenance(item),
},
{
width: '5%',
title: 'Heartbeat',
key: 'heartbeat',
render: (item: AlertReceiveChannel) => this.renderHeartbeat(item),
},
{
width: '15%',
title: 'Team',
render: (item: AlertReceiveChannel) => this.renderTeam(item, grafanaTeamStore.items),
},
{
width: '50px',
key: 'buttons',
render: (item: AlertReceiveChannel) => this.renderButtons(item),
className: cx('buttons'),
},
];
return (
<>
<div className={cx('root')}>
@ -211,7 +167,7 @@ class Integrations extends React.Component<IntegrationsProps, IntegrationsState>
data-testid="integrations-table"
rowKey="id"
data={results}
columns={columns}
columns={this.getTableColumns()}
className={cx('integrations-table')}
rowClassName={cx('integrations-table-row')}
pagination={{
@ -253,10 +209,6 @@ class Integrations extends React.Component<IntegrationsProps, IntegrationsState>
);
}
handleChangePage = (page: number) => {
this.setState({ page }, this.update);
};
renderNotFound() {
return (
<div className={cx('loader')}>
@ -286,13 +238,28 @@ class Integrations extends React.Component<IntegrationsProps, IntegrationsState>
);
};
renderDatasource(item: AlertReceiveChannel, alertReceiveChannelStore) {
renderDatasource(item: AlertReceiveChannel, alertReceiveChannelStore: AlertReceiveChannelStore) {
const alertReceiveChannel = alertReceiveChannelStore.items[item.id];
const integration = alertReceiveChannelStore.getIntegration(alertReceiveChannel);
const isLegacyIntegration = (integration?.value as string)?.toLowerCase().startsWith('legacy_');
return (
<HorizontalGroup spacing="xs">
<IntegrationLogo scale={0.08} integration={integration} />
<Text type="secondary">{integration?.display_name}</Text>
{isLegacyIntegration ? (
<>
<Tooltip placement="top" content={'This integration has been deprecated, consider migrating it.'}>
<Icon name="info-circle" className="u-opacity" />
</Tooltip>
<Text type="secondary">
<span className="u-opacity">{integration?.display_name}</span>
</Text>
</>
) : (
<>
<IntegrationLogo scale={0.08} integration={integration} />
<Text type="secondary">{integration?.display_name}</Text>
</>
)}
</HorizontalGroup>
);
}
@ -453,6 +420,59 @@ class Integrations extends React.Component<IntegrationsProps, IntegrationsState>
);
};
getTableColumns = () => {
const { grafanaTeamStore, alertReceiveChannelStore } = this.props.store;
return [
{
width: '35%',
title: 'Name',
key: 'name',
render: this.renderName,
},
{
width: '15%',
title: 'Status',
key: 'status',
render: (item: AlertReceiveChannel) => this.renderIntegrationStatus(item, alertReceiveChannelStore),
},
{
width: '20%',
title: 'Type',
key: 'datasource',
render: (item: AlertReceiveChannel) => this.renderDatasource(item, alertReceiveChannelStore),
},
{
width: '10%',
title: 'Maintenance',
key: 'maintenance',
render: (item: AlertReceiveChannel) => this.renderMaintenance(item),
},
{
width: '5%',
title: 'Heartbeat',
key: 'heartbeat',
render: (item: AlertReceiveChannel) => this.renderHeartbeat(item),
},
{
width: '15%',
title: 'Team',
render: (item: AlertReceiveChannel) => this.renderTeam(item, grafanaTeamStore.items),
},
{
width: '50px',
key: 'buttons',
render: (item: AlertReceiveChannel) => this.renderButtons(item),
className: cx('buttons'),
},
];
};
handleChangePage = (page: number) => {
this.setState({ page }, this.update);
};
onIntegrationEditClick = (id: AlertReceiveChannel['id']) => {
this.setState({ alertReceiveChannelId: id });
};