Merge remote-tracking branch 'origin/matiasb-resolve-web-schedule' into new-schedules

Commit 725d62f33a
71 changed files with 921 additions and 228 deletions
CHANGELOG.md (21 changed lines)

@@ -1,11 +1,30 @@
 # Change Log

-## v1.0.5 (2022-07-12)
+## v1.0.10 (2022-07-22)
+
+- Speed-up of alert group web caching
+- Internal api for OnCall shifts
+
+## v1.0.9 (2022-07-21)
+
+- Frontend bug fixes & improvements
+- Support regex_replace() in templates
+- Bring back alert group caching and list view
+
+## v1.0.7 (2022-07-18)
+
+- Backend & frontend bug fixes
+- Deployment improvements
+- Reshape webhook payload for outgoing webhooks
+- Add escalation chain usage info on escalation chains page
+- Improve alert group list load speeds and simplify caching system
+
+## v1.0.6 (2022-07-12)
+
+- Manual Incidents enabled for teams
+- Fix phone notifications for OSS
+- Public API improvements
+
+## v1.0.5 (2022-07-06)

 - Bump Django to 3.2.14
 - Fix PagerDuty iCal parsing

 ## 1.0.4 (2022-06-28)

 - Allow Telegram DMs without channel connection.
README.md (14 changed lines)

@@ -56,6 +56,20 @@ Grafana Url: http://grafana:3000
 6. Enjoy! Check our [OSS docs](https://grafana.com/docs/grafana-cloud/oncall/open-source/) if you want to set up Slack, Telegram, Twilio or SMS/calls through Grafana Cloud.

+## Update version
+
+To update your Grafana OnCall hobby environment:
+
+```shell
+# Update Docker images
+docker-compose --env-file .env_hobby -f docker-compose.yml pull engine celery oncall_db_migration
+
+# Re-deploy
+docker-compose --env-file .env_hobby -f docker-compose.yml up -d --remove-orphans
+```
+
+After updating the engine, you'll also need to click the "Update" button on the [plugin version page](http://localhost:3000/plugins/grafana-oncall-app?page=version-history).
+See [Grafana docs](https://grafana.com/docs/grafana/latest/administration/plugin-management/#update-a-plugin) for more info on updating Grafana plugins.

 ## Join community

 <a href="https://github.com/grafana/oncall/discussions/categories/community-calls"><img width="200px" src="docs/img/community_call.png"></a>
@@ -29,7 +29,7 @@ services:
       oncall_db_migration:
         condition: service_completed_successfully
       rabbitmq:
-        condition: service_started
+        condition: service_healthy
       redis:
         condition: service_started

@@ -64,7 +64,7 @@ services:
       oncall_db_migration:
         condition: service_completed_successfully
       rabbitmq:
-        condition: service_started
+        condition: service_healthy
       redis:
         condition: service_started

@@ -92,7 +92,7 @@ services:
       mysql:
         condition: service_healthy
       rabbitmq:
-        condition: service_started
+        condition: service_healthy

   mysql:
     image: mysql:5.7

@@ -133,6 +133,11 @@ services:
       RABBITMQ_DEFAULT_USER: "rabbitmq"
       RABBITMQ_DEFAULT_PASS: $RABBITMQ_PASSWORD
       RABBITMQ_DEFAULT_VHOST: "/"
+    healthcheck:
+      test: rabbitmq-diagnostics -q ping
+      interval: 30s
+      timeout: 30s
+      retries: 3

   mysql_to_create_grafana_db:
     image: mysql:5.7
@@ -32,10 +32,8 @@ To automatically send alert data to a destination URL via outgoing webhook:

The format you use to call the variables must match the structure of how the fields are nested in the alert payload. The **Data** field can use the following five variables to auto-populate the webhook payload with information about the first alert in the alert group:

- `{{ alert_title }}`
- `{{ alert_message }}`
- `{{ alert_url }}`
- `{{ alert_payload }}`
- `{{ alert_group_id }}`

`alert_payload` is always the first level of any variable you want to call.
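As a hedged illustration of the "match the payload's nesting" rule above: a template variable like `{{ alert_payload.labels.severity }}` walks the nested payload one key per dot, starting at `alert_payload`. The `resolve` helper and the sample payload below are hypothetical, not part of OnCall:

```python
def resolve(payload: dict, dotted_path: str):
    """Walk a nested dict one key per dot, e.g. 'labels.severity'."""
    value = payload
    for key in dotted_path.split("."):
        value = value[key]
    return value

# Example payload as it might arrive from a monitoring source (illustrative)
alert_payload = {"labels": {"severity": "critical"}, "title": "CPU high"}
print(resolve(alert_payload, "labels.severity"))  # critical
```

This is why `alert_payload` must always be the first level of any nested field you reference: the lookup starts at the payload's root.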
@@ -159,3 +159,4 @@ Built-in functions:

- `tojson_pretty` - JSON prettified
- `iso8601_to_time` - converts time from iso8601 (`2015-02-17T18:30:20.000Z`) to datetime
- `datetimeformat` - converts time from datetime to the given format (`%H:%M / %d-%m-%Y` by default)
- `regex_replace` - performs a regex find and replace
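The new `regex_replace` filter presumably wraps Python's `re.sub`; a minimal sketch of the equivalent behavior (the filter's exact argument order in OnCall is an assumption here):

```python
import re

def regex_replace(value: str, pattern: str, replacement: str) -> str:
    # Equivalent of a template filter doing a regex find-and-replace
    return re.sub(pattern, replacement, value)

# e.g. strip a numeric suffix from an instance name
print(regex_replace("web-server-42", r"-\d+$", ""))  # web-server
```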
@@ -29,6 +29,11 @@ Check the [helm chart](https://github.com/grafana/oncall/tree/dev/helm/oncall) f

We'll always be happy to provide assistance with production deployment in [our communities](https://github.com/grafana/oncall#join-community)!

## Update Grafana OnCall OSS

To update an OSS installation of Grafana OnCall, please see the update docs:

- **Hobby** playground environment: [README.md](https://github.com/grafana/oncall#update-version)
- **Production** Helm environment: [Helm update](https://github.com/grafana/oncall/tree/dev/helm/oncall#update)

## Slack Setup

The Slack integration for Grafana OnCall leverages Slack API features to provide a customizable and useful integration. Refer to the following steps to configure the Slack integration:
New file (34 lines): engine/apps/alerts/incident_appearance/renderers/classic_markdown_renderer.py

```python
from apps.alerts.incident_appearance.renderers.base_renderer import AlertBaseRenderer, AlertGroupBaseRenderer
from apps.alerts.incident_appearance.templaters import AlertClassicMarkdownTemplater
from common.utils import str_or_backup


class AlertClassicMarkdownRenderer(AlertBaseRenderer):
    @property
    def templater_class(self):
        return AlertClassicMarkdownTemplater

    def render(self):
        templated_alert = self.templated_alert
        rendered_alert = {
            "title": str_or_backup(templated_alert.title, "Alert"),
            "message": str_or_backup(templated_alert.message, ""),
            "image_url": str_or_backup(templated_alert.image_url, None),
            "source_link": str_or_backup(templated_alert.source_link, None),
        }
        return rendered_alert


class AlertGroupClassicMarkdownRenderer(AlertGroupBaseRenderer):
    def __init__(self, alert_group):
        super().__init__(alert_group)

        # use the last alert to render content
        self.alert_renderer = self.alert_renderer_class(self.alert_group.alerts.last())

    @property
    def alert_renderer_class(self):
        return AlertClassicMarkdownRenderer

    def render(self):
        return self.alert_renderer.render()
```
engine/apps/alerts/incident_appearance/templaters/__init__.py

@@ -1,4 +1,5 @@
 from .alert_templater import TemplateLoader  # noqa: F401
+from .classic_markdown_templater import AlertClassicMarkdownTemplater  # noqa: F401
 from .email_templater import AlertEmailTemplater  # noqa: F401
 from .phone_call_templater import AlertPhoneCallTemplater  # noqa: F401
 from .slack_templater import AlertSlackTemplater  # noqa: F401
New file (20 lines): engine/apps/alerts/incident_appearance/templaters/classic_markdown_templater.py

```python
from apps.alerts.incident_appearance.templaters.alert_templater import AlertTemplater


class AlertClassicMarkdownTemplater(AlertTemplater):
    RENDER_FOR = "web"

    def _render_for(self):
        return self.RENDER_FOR

    def _postformat(self, templated_alert):
        if templated_alert.title:
            templated_alert.title = self._slack_format(templated_alert.title)
        if templated_alert.message:
            templated_alert.message = self._slack_format(templated_alert.message)
        return templated_alert

    def _slack_format(self, data):
        sf = self.slack_formatter
        sf.hyperlink_mention_format = "[{title}]({url})"
        return sf.format(data)
```
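The `hyperlink_mention_format` setting suggests the formatter rewrites Slack-style links (`<url|title>`) into Markdown. A rough standalone sketch of just that one transformation — the real `slack_formatter` does more, and this regex is an assumption:

```python
import re

HYPERLINK_MENTION_FORMAT = "[{title}]({url})"  # same format string as in the templater

def format_slack_links(text: str) -> str:
    # Rewrite Slack-style links <url|title> using the configured format string
    return re.sub(
        r"<(?P<url>https?://[^|>]+)\|(?P<title>[^>]+)>",
        lambda m: HYPERLINK_MENTION_FORMAT.format(title=m.group("title"), url=m.group("url")),
        text,
    )

print(format_slack_links("see <https://example.com|the docs>"))
# see [the docs](https://example.com)
```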
New file (28 lines): engine/apps/alerts/migrations/0004_auto_20220711_1106.py

```python
# Generated by Django 3.2.13 on 2022-07-11 11:06

from django.db import migrations


class Migration(migrations.Migration):
    """
    The previous version of this migration removes two fields:
    cached_render_for_web and active_cache_for_web_calculation_id.
    Now it doesn't do anything because it can be very slow and even fail on a write-heavy alertgroup table.
    This migration was released in version 1.0.7, so in order to bring back these fields in the later version
    there's a 0005 migration. Please see the next migration in alerts: 0005_alertgroup_cached_render_for_web.py
    """

    dependencies = [
        ('alerts', '0003_grafanaalertingcontactpoint_datasource_uid'),
    ]

    operations = [
        # migrations.RemoveField(
        #     model_name='alertgroup',
        #     name='active_cache_for_web_calculation_id',
        # ),
        # migrations.RemoveField(
        #     model_name='alertgroup',
        #     name='cached_render_for_web',
        # ),
    ]
```
New file (45 lines): engine/apps/alerts/migrations/0005_alertgroup_cached_render_for_web.py

```python
# Generated by Django 3.2.13 on 2022-07-20 09:04

from django.db import migrations, models, OperationalError


class AddFieldIfNotExists(migrations.AddField):
    """
    Adds a field and ignores "duplicate column" error in case the field already exists.
    When migrating back it will not delete the field.
    """

    def database_forwards(self, app_label, schema_editor, from_state, to_state):
        try:
            super().database_forwards(app_label, schema_editor, from_state, to_state)
        except OperationalError:
            pass

    def database_backwards(self, app_label, schema_editor, from_state, to_state):
        pass


class Migration(migrations.Migration):
    """
    This migration tries to create two fields: cached_render_for_web and active_cache_for_web_calculation_id.
    In case these fields already exist, this migration will do nothing.
    In case the database was already affected by the previous version of the 0004 migration,
    it will recreate these fields.
    """

    dependencies = [
        ('alerts', '0004_auto_20220711_1106'),
    ]

    operations = [
        AddFieldIfNotExists(
            model_name='alertgroup',
            name='cached_render_for_web',
            field=models.JSONField(default=dict),
        ),
        AddFieldIfNotExists(
            model_name='alertgroup',
            name='active_cache_for_web_calculation_id',
            field=models.CharField(default=None, max_length=100, null=True),
        ),
    ]
```
@@ -32,6 +32,7 @@ from apps.slack.constants import SLACK_RATE_LIMIT_DELAY, SLACK_RATE_LIMIT_TIMEOU
 from apps.slack.tasks import post_slack_rate_limit_message
 from apps.slack.utils import post_message_to_channel
 from apps.user_management.organization_log_creator import OrganizationLogType, create_organization_log
+from common.api_helpers.utils import create_engine_url
 from common.exceptions import TeamCanNotBeChangedError, UnableToSendDemoAlert
 from common.public_primary_keys import generate_public_primary_key, increase_public_primary_key_length

@@ -497,10 +498,7 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
             AlertReceiveChannel.INTEGRATION_MAINTENANCE,
         ]:
             return None
-        return urljoin(
-            settings.BASE_URL,
-            f"/integrations/v1/{self.config.slug}/{self.token}/",
-        )
+        return create_engine_url(f"integrations/v1/{self.config.slug}/{self.token}/")

     @property
     def inbound_email(self):
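The diff replaces an inline `urljoin` against `settings.BASE_URL` with `create_engine_url` from `common.api_helpers.utils`. Its real implementation is not shown in this diff; based on the code it replaces, a plausible sketch is (the base URL and slash handling are assumptions):

```python
from urllib.parse import urljoin

BASE_URL = "http://engine:8080"  # stand-in for settings.BASE_URL

def create_engine_url(path: str) -> str:
    # Join a relative path onto the engine's base URL, tolerating
    # trailing/leading slashes on either side
    return urljoin(BASE_URL.rstrip("/") + "/", path.lstrip("/"))

print(create_engine_url("integrations/v1/webhook/token123/"))
# http://engine:8080/integrations/v1/webhook/token123/
```

Centralizing the join in one helper avoids repeating the `urljoin(settings.BASE_URL, ...)` pattern at every call site.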
@@ -696,14 +694,19 @@ def listen_for_alertreceivechannel_model_save(sender, instance, created, *args,
             instance.organization, None, OrganizationLogType.TYPE_HEARTBEAT_CREATED, description
         )
     else:
-        logger.info(f"Drop AG cache. Reason: save alert_receive_channel {instance.pk}")
         if kwargs is not None:
             if "update_fields" in kwargs:
                 if kwargs["update_fields"] is not None:
+                    fields_to_not_to_invalidate_cache = [
+                        "rate_limit_message_task_id",
+                        "rate_limited_in_slack_at",
+                        "reason_to_skip_escalation",
+                    ]
                     # Hack to not to invalidate web cache on AlertReceiveChannel.start_send_rate_limit_message_task
-                    if "rate_limit_message_task_id" in kwargs["update_fields"]:
-                        return
+                    for f in fields_to_not_to_invalidate_cache:
+                        if f in kwargs["update_fields"]:
+                            return
+        logger.info(f"Drop AG cache. Reason: save alert_receive_channel {instance.pk}")
         invalidate_web_cache_for_alert_group.apply_async(kwargs={"channel_pk": instance.pk})

     if instance.integration == AlertReceiveChannel.INTEGRATION_GRAFANA_ALERTING:
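The nested `update_fields` guard in the signal handler reduces to a small predicate. A hedged restatement of the diff's logic — the field names come from the diff, while the standalone function itself is only illustrative:

```python
FIELDS_TO_NOT_INVALIDATE_CACHE = [
    "rate_limit_message_task_id",
    "rate_limited_in_slack_at",
    "reason_to_skip_escalation",
]

def should_invalidate_web_cache(update_fields) -> bool:
    # Skip cache invalidation when the save touched only
    # rate-limit / escalation bookkeeping fields
    if update_fields is None:
        return True
    return not any(f in update_fields for f in FIELDS_TO_NOT_INVALIDATE_CACHE)

print(should_invalidate_web_cache({"rate_limited_in_slack_at"}))  # False
print(should_invalidate_web_cache({"team"}))  # True
```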
@@ -118,11 +118,8 @@ class CustomButton(models.Model):
        elif self.data:
            rendered_data = Template(self.data).render(
                {
                    "alert_title": self._escape_string(alert.title),
                    "alert_message": self._escape_string(alert.message),
                    "alert_url": alert.link_to_upstream_details,
                    "alert_payload": self._escape_alert_payload(alert.raw_request_data),
                    "alert_payload_json": json.dumps(alert.raw_request_data),
                    "alert_group_id": alert.group.public_primary_key,
                }
            )
            post_kwargs["json"] = json.loads(rendered_data)
@@ -1,4 +1,3 @@
import random
import time

from django.apps import apps

@@ -356,37 +355,42 @@ def perform_notification(log_record_pk):
         message = f"{AlertGroupWebRenderer(alert_group).render().get('title', 'Incident')}"
         thread_id = f"{alert_group.channel.organization.public_primary_key}:{alert_group.public_primary_key}"
         devices_to_notify = APNSDevice.objects.filter(user_id=user.pk)
         sounds = ["alarm.aiff", "operation.aiff"]
         devices_to_notify.send_message(
             message,
             thread_id=thread_id,
             category="USER_NEW_INCIDENT",
             sound={"critical": 1, "name": f"{random.choice(sounds)}"},
             extra={
                 "orgId": f"{alert_group.channel.organization.public_primary_key}",
                 "orgName": f"{alert_group.channel.organization.stack_slug}",
                 "incidentId": f"{alert_group.public_primary_key}",
                 "status": f"{alert_group.status}",
                 "interruption-level": "critical",
                 "aps": {
                     "alert": f"{message}",
                     "sound": "bingbong.aiff",
                 },
             },
         )

     elif notification_channel == UserNotificationPolicy.NotificationChannel.MOBILE_PUSH_CRITICAL:
-        message = f"!!! {AlertGroupWebRenderer(alert_group).render().get('title', 'Incident')}"
+        message = f"{AlertGroupWebRenderer(alert_group).render().get('title', 'Incident')}"
         thread_id = f"{alert_group.channel.organization.public_primary_key}:{alert_group.public_primary_key}"
         devices_to_notify = APNSDevice.objects.filter(user_id=user.pk)
         sounds = ["ambulance.aiff"]
         devices_to_notify.send_message(
             message,
             thread_id=thread_id,
             category="USER_NEW_INCIDENT",
             sound={"critical": 1, "name": f"{random.choice(sounds)}"},
             extra={
                 "orgId": f"{alert_group.channel.organization.public_primary_key}",
                 "orgName": f"{alert_group.channel.organization.stack_slug}",
                 "incidentId": f"{alert_group.public_primary_key}",
                 "status": f"{alert_group.status}",
                 "interruption-level": "critical",
                 "aps": {
                     "alert": f"Critical page: {message}",
                     # This is disabled until we gain the Critical Alerts Api permission from apple
                     # "interruption-level": "critical",
                     "interruption-level": "time-sensitive",
                     "sound": "ambulance.aiff",
                 },
             },
         )
     else:
@@ -4,6 +4,7 @@ from datetime import datetime
 import humanize
 from rest_framework import serializers

+from apps.alerts.incident_appearance.renderers.classic_markdown_renderer import AlertGroupClassicMarkdownRenderer
 from apps.alerts.incident_appearance.renderers.web_renderer import AlertGroupWebRenderer
 from apps.alerts.models import AlertGroup
 from common.api_helpers.mixins import EagerLoadingMixin

@@ -56,6 +57,7 @@ class AlertGroupSerializer(EagerLoadingMixin, serializers.ModelSerializer):

     status = serializers.ReadOnlyField()
     render_for_web = serializers.SerializerMethodField()
+    render_for_classic_markdown = serializers.SerializerMethodField()

     PREFETCH_RELATED = [
         "alerts",

@@ -109,6 +111,7 @@ class AlertGroupSerializer(EagerLoadingMixin, serializers.ModelSerializer):
         "resolved_at_verbose",
         "render_for_web",
         "render_after_resolve_report_json",
+        "render_for_classic_markdown",
         "dependent_alert_groups",
         "root_alert_group",
         "status",

@@ -135,6 +138,9 @@ class AlertGroupSerializer(EagerLoadingMixin, serializers.ModelSerializer):

         return AlertSerializer(alerts, many=True).data

+    def get_render_for_classic_markdown(self, obj):
+        return AlertGroupClassicMarkdownRenderer(obj).render()
+
     def get_related_users(self, obj):
         users_ids = set()
         users = []
@@ -46,7 +46,6 @@ class OnCallShiftSerializer(EagerLoadingMixin, serializers.ModelSerializer):
-        "until",
         "frequency",
         "interval",
         "until",
         "by_day",
         "source",
         "rolling_users",

@@ -74,7 +73,7 @@ class OnCallShiftSerializer(EagerLoadingMixin, serializers.ModelSerializer):
         result = super().to_representation(instance)
         return result

-    def validate_name(self, name):  # todo
+    def validate_name(self, name):
         organization = self.context["request"].auth.organization
         if name is None:
             return name

@@ -147,12 +146,15 @@ class OnCallShiftSerializer(EagerLoadingMixin, serializers.ModelSerializer):
            "interval",
            "by_day",
            "until",
            "rotation_start",
        ]
        if event_type == CustomOnCallShift.TYPE_OVERRIDE:
            for field in fields_to_update_for_overrides:
                value = None
                if field == "priority_level":
                    value = 0
                elif field == "rotation_start":
                    value = validated_data["start"]
                validated_data[field] = value

        self._validate_frequency(
@@ -3,6 +3,7 @@ from datetime import timedelta
import humanize
import pytz
from django.apps import apps
from django.conf import settings
from django.utils import timezone
from rest_framework import fields, serializers

@@ -109,7 +110,25 @@ class CurrentOrganizationSerializer(OrganizationSerializer):

     def get_limits(self, obj):
         user = self.context["request"].user
-        return obj.notifications_limit_web_report(user)
+        if not settings.OSS_INSTALLATION:
+            return obj.notifications_limit_web_report(user)
+
+        # show a version warning on OSS installations in case backend and frontend are different versions
+        frontend_version = self.context["request"].headers.get("X-OnCall-Plugin-Version")
+        backend_version = settings.VERSION
+        version_warning = {}
+        if backend_version and frontend_version and backend_version != frontend_version:
+            text = (
+                "Version mismatch! Please make sure you have the same versions of the Grafana OnCall plugin "
+                "and Grafana OnCall engine, "
+                "otherwise there could be issues with your Grafana OnCall installation! "
+                f"Current plugin version: {frontend_version}, current engine version: {backend_version}. "
+                "Please see the update instructions: "
+                "https://grafana.com/docs/oncall/latest/open-source/#update-grafana-oncall-oss"
+            )
+            version_warning = {"period_title": "Version mismatch", "show_limits_warning": True, "warning_text": text}
+
+        return version_warning or obj.notifications_limit_web_report(user)

     def get_env_status(self, obj):
         LiveSetting.populate_settings_if_needed()
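The version check in `get_limits` is a plain string comparison between the plugin's `X-OnCall-Plugin-Version` header and `settings.VERSION`; distilled to a standalone function (illustrative only, not OnCall's API):

```python
def version_mismatch(backend_version, frontend_version) -> bool:
    # Warn only when both versions are known and differ;
    # a missing header or unset VERSION suppresses the warning
    return bool(backend_version and frontend_version and backend_version != frontend_version)

print(version_mismatch("1.0.10", "1.0.9"))   # True
print(version_mismatch("1.0.10", "1.0.10"))  # False
print(version_mismatch("1.0.10", None))      # False
```

Note this is an exact-string comparison, not a semver comparison, so even a patch-level difference triggers the warning banner.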
@@ -426,6 +426,7 @@ def test_events_calendar(
             "start": on_call_shift.start,
             "end": on_call_shift.start + on_call_shift.duration,
             "users": [{"display_name": user.username, "pk": user.public_primary_key}],
+            "missing_users": [],
             "priority_level": on_call_shift.priority_level,
             "source": "api",
             "calendar_type": OnCallSchedule.PRIMARY,
@@ -468,13 +469,13 @@ def test_filter_events_calendar(
         "by_day": ["MO", "FR"],
         "schedule": schedule,
     }
     on_call_shift = make_on_call_shift(
         organization=organization, shift_type=CustomOnCallShift.TYPE_RECURRENT_EVENT, **data
     )
     on_call_shift.users.add(user)

     url = reverse("api-internal:schedule-filter-events", kwargs={"pk": schedule.public_primary_key})
+    url += "?type=rotation"
     response = client.get(url, format="json", **make_user_auth_headers(user, token))

     # current week events are expected
@@ -490,6 +491,7 @@ def test_filter_events_calendar(
             "start": mon_start,
             "end": mon_start + on_call_shift.duration,
             "users": [{"display_name": user.username, "pk": user.public_primary_key}],
+            "missing_users": [],
             "priority_level": on_call_shift.priority_level,
             "source": "api",
             "calendar_type": OnCallSchedule.PRIMARY,
@@ -504,6 +506,7 @@ def test_filter_events_calendar(
             "start": fri_start,
             "end": fri_start + on_call_shift.duration,
             "users": [{"display_name": user.username, "pk": user.public_primary_key}],
+            "missing_users": [],
             "priority_level": on_call_shift.priority_level,
             "source": "api",
             "calendar_type": OnCallSchedule.PRIMARY,
@@ -522,6 +525,7 @@ def test_filter_events_calendar(
 @pytest.mark.django_db
 def test_filter_events_range_calendar(
     make_organization_and_user_with_plugin_token,
+    make_user_for_organization,
     make_user_auth_headers,
     make_schedule,
     make_on_call_shift,
@@ -537,6 +541,9 @@ def test_filter_events_range_calendar(

     now = timezone.now().replace(microsecond=0)
     start_date = now - timezone.timedelta(days=7)
+    mon_start = now - timezone.timedelta(days=start_date.weekday())
+    request_date = mon_start + timezone.timedelta(days=2)
+
     data = {
         "start": start_date,
         "rotation_start": start_date,
@@ -546,17 +553,27 @@ def test_filter_events_range_calendar(
         "by_day": ["MO", "FR"],
         "schedule": schedule,
     }
     on_call_shift = make_on_call_shift(
         organization=organization, shift_type=CustomOnCallShift.TYPE_RECURRENT_EVENT, **data
     )
     on_call_shift.users.add(user)

-    mon_start = now - timezone.timedelta(days=start_date.weekday())
-    request_date = mon_start + timezone.timedelta(days=2)
+    # add override shift
+    override_start = request_date + timezone.timedelta(seconds=3600)
+    override_data = {
+        "start": override_start,
+        "rotation_start": override_start,
+        "duration": timezone.timedelta(seconds=3600),
+        "schedule": schedule,
+    }
+    override = make_on_call_shift(
+        organization=organization, shift_type=CustomOnCallShift.TYPE_OVERRIDE, **override_data
+    )
+    other_user = make_user_for_organization(organization)
+    override.users.add(other_user)

     url = reverse("api-internal:schedule-filter-events", kwargs={"pk": schedule.public_primary_key})
-    url += "?date={}&days=3".format(request_date.strftime("%Y-%m-%d"))
+    url += "?date={}&days=3&type=rotation".format(request_date.strftime("%Y-%m-%d"))
     response = client.get(url, format="json", **make_user_auth_headers(user, token))

     # only friday occurrence is expected
@@ -571,6 +588,7 @@ def test_filter_events_range_calendar(
             "start": fri_start,
             "end": fri_start + on_call_shift.duration,
             "users": [{"display_name": user.username, "pk": user.public_primary_key}],
+            "missing_users": [],
             "priority_level": on_call_shift.priority_level,
             "source": "api",
             "calendar_type": OnCallSchedule.PRIMARY,
@ -586,6 +604,215 @@ def test_filter_events_range_calendar(
|
|||
assert response.data == expected_result
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_filter_events_overrides(
|
||||
make_organization_and_user_with_plugin_token,
|
||||
make_user_for_organization,
|
||||
make_user_auth_headers,
|
||||
make_schedule,
|
||||
make_on_call_shift,
|
||||
):
|
||||
organization, user, token = make_organization_and_user_with_plugin_token()
|
||||
client = APIClient()
|
||||
|
||||
schedule = make_schedule(
|
||||
organization,
|
||||
schedule_class=OnCallScheduleWeb,
|
||||
name="test_web_schedule",
|
||||
)
|
||||
|
||||
now = timezone.now().replace(microsecond=0)
|
||||
start_date = now - timezone.timedelta(days=7)
|
||||
mon_start = now - timezone.timedelta(days=start_date.weekday())
|
||||
request_date = mon_start + timezone.timedelta(days=2)
|
||||
|
||||
data = {
|
||||
"start": start_date,
|
||||
"rotation_start": start_date,
|
||||
"duration": timezone.timedelta(seconds=7200),
|
||||
"priority_level": 1,
|
||||
"frequency": CustomOnCallShift.FREQUENCY_WEEKLY,
|
||||
"by_day": ["MO", "FR"],
|
||||
"schedule": schedule,
|
||||
}
|
||||
on_call_shift = make_on_call_shift(
|
||||
organization=organization, shift_type=CustomOnCallShift.TYPE_RECURRENT_EVENT, **data
|
||||
)
|
||||
on_call_shift.users.add(user)
|
||||
|
||||
# add override shift
|
||||
override_start = request_date + timezone.timedelta(seconds=3600)
|
||||
override_data = {
|
||||
"start": override_start,
|
||||
"rotation_start": override_start,
|
||||
"duration": timezone.timedelta(seconds=3600),
|
||||
"schedule": schedule,
|
||||
}
|
||||
override = make_on_call_shift(
|
||||
organization=organization, shift_type=CustomOnCallShift.TYPE_OVERRIDE, **override_data
|
||||
)
|
||||
other_user = make_user_for_organization(organization)
|
||||
override.users.add(other_user)
|
||||
|
||||
url = reverse("api-internal:schedule-filter-events", kwargs={"pk": schedule.public_primary_key})
|
||||
url += "?date={}&days=3&type=override".format(request_date.strftime("%Y-%m-%d"))
|
||||
response = client.get(url, format="json", **make_user_auth_headers(user, token))
|
||||
|
||||
# only override occurrence is expected
|
||||
expected_result = {
|
||||
"id": schedule.public_primary_key,
|
||||
"name": "test_web_schedule",
|
||||
"type": 2,
|
||||
"events": [
|
||||
{
|
||||
"all_day": False,
|
||||
"start": override_start,
|
||||
"end": override_start + override.duration,
|
||||
"users": [{"display_name": other_user.username, "pk": other_user.public_primary_key}],
|
||||
"missing_users": [],
|
||||
"priority_level": None,
|
||||
"source": "api",
|
||||
"calendar_type": OnCallSchedule.OVERRIDES,
|
||||
"is_empty": False,
|
||||
"is_gap": False,
|
||||
"shift": {
|
||||
"pk": override.public_primary_key,
|
||||
},
|
||||
}
|
||||
],
|
||||
}
|
||||
assert response.status_code == status.HTTP_200_OK
|
||||
assert response.data == expected_result
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_filter_events_final_schedule(
|
||||
make_organization_and_user_with_plugin_token,
|
||||
make_user_for_organization,
|
||||
make_user_auth_headers,
|
||||
make_schedule,
|
||||
make_on_call_shift,
|
||||
):
|
||||
organization, user, token = make_organization_and_user_with_plugin_token()
|
||||
client = APIClient()
|
||||
|
||||
schedule = make_schedule(
|
||||
organization,
|
||||
schedule_class=OnCallScheduleWeb,
|
||||
name="test_web_schedule",
|
||||
)
|
||||
|
||||
now = timezone.now().replace(hour=0, minute=0, second=0, microsecond=0)
|
||||
start_date = now - timezone.timedelta(days=7)
|
||||
request_date = start_date
|
||||
|
||||
user_a, user_b, user_c, user_d, user_e = (make_user_for_organization(organization, username=i) for i in "ABCDE")
|
||||
|
||||
shifts = (
|
||||
# user, priority, start time (h), duration (hs)
|
||||
(user_a, 1, 10, 5), # r1-1: 10-15 / A
|
||||
(user_b, 1, 11, 2), # r1-2: 11-13 / B
|
||||
(user_a, 1, 16, 3), # r1-3: 16-19 / A
|
||||
(user_a, 1, 21, 1), # r1-4: 21-22 / A
|
||||
(user_b, 1, 22, 2), # r1-5: 22-00 / B
|
||||
(user_c, 2, 12, 2), # r2-1: 12-14 / C
|
||||
(user_d, 2, 14, 1), # r2-2: 14-15 / D
|
||||
(user_d, 2, 17, 1), # r2-3: 17-18 / D
|
||||
(user_d, 2, 20, 3), # r2-4: 20-23 / D
|
||||
)
|
||||
for user, priority, start_h, duration in shifts:
|
||||
data = {
|
||||
"start": start_date + timezone.timedelta(hours=start_h),
|
||||
"rotation_start": start_date,
|
||||
"duration": timezone.timedelta(hours=duration),
|
||||
"priority_level": priority,
|
||||
"frequency": CustomOnCallShift.FREQUENCY_DAILY,
|
||||
"schedule": schedule,
|
||||
}
|
||||
on_call_shift = make_on_call_shift(
|
||||
organization=organization, shift_type=CustomOnCallShift.TYPE_RECURRENT_EVENT, **data
|
||||
)
|
||||
on_call_shift.users.add(user)
|
||||
|
||||
# override: 22-23 / E
|
||||
override_data = {
|
||||
"start": start_date + timezone.timedelta(hours=22),
|
||||
"rotation_start": start_date,
|
||||
"duration": timezone.timedelta(hours=1),
|
||||
"schedule": schedule,
|
||||
}
|
||||
override = make_on_call_shift(
|
||||
organization=organization, shift_type=CustomOnCallShift.TYPE_OVERRIDE, **override_data
|
||||
)
|
||||
override.users.add(user_e)
|
||||
|
||||
url = reverse("api-internal:schedule-filter-events", kwargs={"pk": schedule.public_primary_key})
|
||||
url += "?date={}&days=1".format(request_date.strftime("%Y-%m-%d"))
|
||||
    response = client.get(url, format="json", **make_user_auth_headers(user, token))
    assert response.status_code == status.HTTP_200_OK

    expected = (
        # start (h), duration (H), user, priority, is_gap, is_override
        (0, 10, None, None, True, False),  # 0-10 gap
        (10, 2, "A", 1, False, False),  # 10-12 A
        (11, 1, "B", 1, False, False),  # 11-12 B
        (12, 2, "C", 2, False, False),  # 12-14 C
        (14, 1, "D", 2, False, False),  # 14-15 D
        (15, 1, None, None, True, False),  # 15-16 gap
        (16, 1, "A", 1, False, False),  # 16-17 A
        (17, 1, "D", 2, False, False),  # 17-18 D
        (18, 1, "A", 1, False, False),  # 18-19 A
        (19, 1, None, None, True, False),  # 19-20 gap
        (20, 2, "D", 2, False, False),  # 20-22 D
        (22, 1, "E", None, False, True),  # 22-23 E (override)
        (23, 1, "B", 1, False, False),  # 23-00 B
    )
    expected_events = [
        {
            "calendar_type": 1 if is_override else None if is_gap else 0,
            "end": start_date + timezone.timedelta(hours=start + duration),
            "is_gap": is_gap,
            "priority_level": priority,
            "start": start_date + timezone.timedelta(hours=start, milliseconds=1 if start == 0 else 0),
            "user": user,
        }
        for start, duration, user, priority, is_gap, is_override in expected
    ]
    returned_events = [
        {
            "calendar_type": e["calendar_type"],
            "end": e["end"],
            "is_gap": e["is_gap"],
            "priority_level": e["priority_level"],
            "start": e["start"],
            "user": e["users"][0]["display_name"] if e["users"] else None,
        }
        for e in response.data["events"]
    ]
    assert returned_events == expected_events


@pytest.mark.django_db
def test_filter_events_invalid_type(
    make_organization_and_user_with_plugin_token,
    make_user_auth_headers,
    make_schedule,
):
    organization, user, token = make_organization_and_user_with_plugin_token()
    client = APIClient()

    schedule = make_schedule(
        organization,
        schedule_class=OnCallScheduleWeb,
        name="test_web_schedule",
    )

    url = reverse("api-internal:schedule-filter-events", kwargs={"pk": schedule.public_primary_key})
    url += "?type=invalid"
    response = client.get(url, format="json", **make_user_auth_headers(user, token))
    assert response.status_code == status.HTTP_400_BAD_REQUEST


@pytest.mark.django_db
@pytest.mark.parametrize(
    "role,expected_status",
@@ -1,8 +1,6 @@
 import datetime
-from urllib.parse import urljoin

 import pytz
-from django.conf import settings
 from django.core.exceptions import ObjectDoesNotExist
 from django.db.models import OuterRef, Subquery
 from django.db.utils import IntegrityError
@@ -38,6 +36,11 @@ from common.api_helpers.mixins import (
     ShortSerializerMixin,
     UpdateSerializerMixin,
 )
+from common.api_helpers.utils import create_engine_url
+
+EVENTS_FILTER_BY_ROTATION = "rotation"
+EVENTS_FILTER_BY_OVERRIDE = "override"
+EVENTS_FILTER_BY_FINAL = "final"


 class ScheduleView(
@@ -191,9 +194,10 @@ class ScheduleView(

         return user_tz, date

-    def _filter_events(self, schedule, timezone, starting_date, days, with_empty, with_gap):
+    def _filter_events(self, schedule, user_timezone, starting_date, days, with_empty, with_gap):
         shifts = (
-            list_of_oncall_shifts_from_ical(schedule, starting_date, timezone, with_empty, with_gap, days=days) or []
+            list_of_oncall_shifts_from_ical(schedule, starting_date, user_timezone, with_empty, with_gap, days=days)
+            or []
         )
         events = []
         # for start, end, users, priority_level, source in shifts:
@@ -212,6 +216,7 @@ class ScheduleView(
                 }
                 for user in shift["users"]
             ],
+            "missing_users": shift["missing_users"],
             "priority_level": shift["priority"] if shift["priority"] != 0 else None,
             "source": shift["source"],
             "calendar_type": shift["calendar_type"],
@@ -256,8 +261,12 @@ class ScheduleView(
     @action(detail=True, methods=["get"])
     def filter_events(self, request, pk):
         user_tz, date = self.get_request_timezone()
-        with_empty = self.request.query_params.get("with_empty", False) == "true"
-        with_gap = self.request.query_params.get("with_gap", False) == "true"
+        filter_by = self.request.query_params.get("type")
+
+        valid_filters = (EVENTS_FILTER_BY_ROTATION, EVENTS_FILTER_BY_OVERRIDE, EVENTS_FILTER_BY_FINAL)
+        if filter_by is not None and filter_by not in valid_filters:
+            raise BadRequest(detail="Invalid type value")
+        resolve_schedule = filter_by is None or filter_by == EVENTS_FILTER_BY_FINAL

         starting_date = date if self.request.query_params.get("date") else None
         if starting_date is None:
@@ -271,9 +280,16 @@ class ScheduleView(

         schedule = self.original_get_object()
         events = self._filter_events(
-            schedule, user_tz, starting_date, days=days, with_empty=with_empty, with_gap=with_gap
+            schedule, user_tz, starting_date, days=days, with_empty=True, with_gap=resolve_schedule
         )

+        if filter_by == EVENTS_FILTER_BY_OVERRIDE:
+            events = [e for e in events if e["calendar_type"] == OnCallSchedule.OVERRIDES]
+        elif filter_by == EVENTS_FILTER_BY_ROTATION:
+            events = [e for e in events if e["calendar_type"] == OnCallSchedule.PRIMARY]
+        else:  # resolve_schedule
+            events = self._resolve_schedule(events)
+
         result = {
             "id": schedule.public_primary_key,
             "name": schedule.name,
@@ -282,6 +298,103 @@ class ScheduleView(
         }
         return Response(result, status=status.HTTP_200_OK)

+    def _resolve_schedule(self, events):
+        """Calculate final schedule shifts considering rotations and overrides."""
+        if not events:
+            return []
+
+        # sort schedule events by (type desc, priority desc, start timestamp asc)
+        events.sort(
+            key=lambda e: (
+                -e["calendar_type"] if e["calendar_type"] else 0,  # overrides: 1, shifts: 0, gaps: None
+                -e["priority_level"] if e["priority_level"] else 0,
+                e["start"],
+            )
+        )
+
+        def _merge_intervals(evs):
+            """Keep track of scheduled intervals."""
+            if not evs:
+                return []
+            intervals = [[e["start"], e["end"]] for e in evs]
+            result = [intervals[0]]
+            for interval in intervals[1:]:
+                previous_interval = result[-1]
+                if previous_interval[0] <= interval[0] <= previous_interval[1]:
+                    previous_interval[1] = max(previous_interval[1], interval[1])
+                else:
+                    result.append(interval)
+            return result
+
+        # iterate over events, reserving schedule slots based on their priority
+        # if the expected slot was already scheduled for a higher priority event,
+        # split the event, or fix start/end timestamps accordingly
+
+        # include overrides from start
+        resolved = [e for e in events if e["calendar_type"] == OnCallSchedule.TYPE_ICAL_OVERRIDES]
+        intervals = _merge_intervals(resolved)
+
+        pending = events[len(resolved) :]
+        if not pending:
+            return resolved
+
+        current_event_idx = 0  # current event to resolve
+        current_interval_idx = 0  # current scheduled interval being checked
+        current_priority = pending[0]["priority_level"]  # current priority level being resolved
+
+        while current_event_idx < len(pending):
+            ev = pending[current_event_idx]
+
+            if ev["priority_level"] != current_priority:
+                # update scheduled intervals on priority change
+                # and start from the beginning for the new priority level
+                resolved.sort(key=lambda e: e["start"])
+                intervals = _merge_intervals(resolved)
+                current_interval_idx = 0
+                current_priority = ev["priority_level"]
+
+            if current_interval_idx >= len(intervals):
+                # event outside scheduled intervals, add to resolved
+                resolved.append(ev)
+                current_event_idx += 1
+            elif ev["start"] < intervals[current_interval_idx][0] and ev["end"] <= intervals[current_interval_idx][0]:
+                # event starts and ends outside an already scheduled interval, add to resolved
+                resolved.append(ev)
+                current_event_idx += 1
+            elif ev["start"] < intervals[current_interval_idx][0] and ev["end"] > intervals[current_interval_idx][0]:
+                # event starts outside interval but overlaps with an already scheduled interval
+                # 1. add a split event copy to schedule the time before the already scheduled interval
+                to_add = ev.copy()
+                to_add["end"] = intervals[current_interval_idx][0]
+                resolved.append(to_add)
+                # 2. check if there is still time to be scheduled after the current scheduled interval ends
+                if ev["end"] > intervals[current_interval_idx][1]:
+                    # event ends after current interval, update event start timestamp to match the interval end
+                    # and process the updated event as any other event
+                    ev["start"] = intervals[current_interval_idx][1]
+                else:
+                    # done, go to next event
+                    current_event_idx += 1
+            elif ev["start"] >= intervals[current_interval_idx][0] and ev["end"] <= intervals[current_interval_idx][1]:
+                # event inside an already scheduled interval, ignore (go to next)
+                current_event_idx += 1
+            elif (
+                ev["start"] >= intervals[current_interval_idx][0]
+                and ev["start"] < intervals[current_interval_idx][1]
+                and ev["end"] > intervals[current_interval_idx][1]
+            ):
+                # event starts inside a scheduled interval but ends out of it
+                # update the event start timestamp to match the interval end
+                ev["start"] = intervals[current_interval_idx][1]
+                # move to next interval and process the updated event as any other event
+                current_interval_idx += 1
+            elif ev["start"] >= intervals[current_interval_idx][1]:
+                # event starts after the current interval, move to next interval and go through it
+                current_interval_idx += 1
+
+        resolved.sort(key=lambda e: e["start"])
+        return resolved
+
     @action(detail=False, methods=["get"])
     def type_options(self, request):
         # TODO: check if it needed
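The `_merge_intervals` helper above is the core of the resolution pass: higher-priority events reserve `[start, end]` slots, and lower-priority events are split around them. A standalone sketch of the merge step, assuming plain `[start, end]` pairs already sorted by start (names here are illustrative, not from the codebase):

```python
def merge_intervals(intervals):
    """Collapse overlapping or touching [start, end] pairs into reserved slots."""
    result = []
    for start, end in intervals:
        if result and start <= result[-1][1]:
            # Overlaps (or touches) the previous reserved slot: extend it
            result[-1][1] = max(result[-1][1], end)
        else:
            result.append([start, end])
    return result


print(merge_intervals([[0, 5], [3, 8], [10, 12]]))  # [[0, 8], [10, 12]]
```

Because each new interval can only extend or append to the last reserved slot, the merge runs in a single linear pass over the sorted input.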
@@ -328,10 +441,9 @@ class ScheduleView(
         except IntegrityError:
             raise Conflict("Schedule export token for user already exists")

-        export_url = urljoin(
-            settings.BASE_URL,
+        export_url = create_engine_url(
             reverse("api-public:schedules-export", kwargs={"pk": schedule.public_primary_key})
-            + f"?{SCHEDULE_EXPORT_TOKEN_NAME}={token}",
+            + f"?{SCHEDULE_EXPORT_TOKEN_NAME}={token}"
         )

         data = {"token": token, "created_at": instance.created_at, "export_url": export_url}
@@ -1,5 +1,4 @@
 import logging
-from urllib.parse import urljoin

 import pytz
 from django.apps import apps
@@ -45,6 +44,7 @@ from apps.user_management.organization_log_creator import OrganizationLogType, c
 from common.api_helpers.exceptions import Conflict
 from common.api_helpers.mixins import FilterSerializerMixin, PublicPrimaryKeyMixin
 from common.api_helpers.paginators import HundredPageSizePaginator
+from common.api_helpers.utils import create_engine_url
 from common.constants.role import Role

 logger = logging.getLogger(__name__)
@@ -101,7 +101,11 @@ class UserView(
     mixins.ListModelMixin,
     viewsets.GenericViewSet,
 ):
-    authentication_classes = (PluginAuthentication,)
+    authentication_classes = (
+        MobileAppAuthTokenAuthentication,
+        PluginAuthentication,
+    )
+
     permission_classes = (IsAuthenticated, ActionPermission)

     # Non-admin users are allowed to list and retrieve users
@@ -411,10 +415,9 @@ class UserView(
         except IntegrityError:
             raise Conflict("Schedule export token for user already exists")

-        export_url = urljoin(
-            settings.BASE_URL,
+        export_url = create_engine_url(
             reverse("api-public:users-schedule-export", kwargs={"pk": user.public_primary_key})
-            + f"?{SCHEDULE_EXPORT_TOKEN_NAME}={token}",
+            + f"?{SCHEDULE_EXPORT_TOKEN_NAME}={token}"
         )

         data = {"token": token, "created_at": instance.created_at, "export_url": export_url}
@@ -118,7 +118,7 @@ class LiveSetting(models.Model):
             "<a href='https://github.com/grafana/oncall/blob/dev/engine/apps/oss_installation/usage_stats.py#L29'> source code</a>."
         ),
         "GRAFANA_CLOUD_ONCALL_TOKEN": "Secret token for Grafana Cloud OnCall instance.",
-        "GRAFANA_CLOUD_ONCALL_HEARTBEAT_ENABLED": "Enable hearbeat integration with Grafana Cloud OnCall.",
+        "GRAFANA_CLOUD_ONCALL_HEARTBEAT_ENABLED": "Enable heartbeat integration with Grafana Cloud OnCall.",
         "GRAFANA_CLOUD_NOTIFICATIONS_ENABLED": "Enable SMS/call notifications via Grafana Cloud OnCall",
     }
@@ -8,12 +8,12 @@ from django.conf import settings
 class IntegrationHeartBeatText:
     heartbeat_expired_title: str = "heartbeat_expired"
     heartbeat_expired_message: str = "heartbeat_expired"
-    heartbeat_restored_title: str = "hearbeat_restored"
+    heartbeat_restored_title: str = "heartbeat_restored"
     heartbeat_restored_message: str = "heartbeat_restored"
     heartbeat_instruction_template: str = None


-class HearBeatTextCreator:
+class HeartBeatTextCreator:
     def __init__(self, integration_verbal):
         self.integration_verbal = integration_verbal.capitalize()
@@ -52,7 +52,7 @@ class HearBeatTextCreator:
         return f"heartbeat_instructions/{self.integration_verbal.lower()}.html"


-class HearBeatTextCreatorForTitleGrouping(HearBeatTextCreator):
+class HeartBeatTextCreatorForTitleGrouping(HeartBeatTextCreator):
     """
     Some integrations (Grafana, AlertManager) have default grouping template based on title
     """
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreatorForTitleGrouping
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreatorForTitleGrouping

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreatorForTitleGrouping(integration_verbal)
+creator = HeartBeatTextCreatorForTitleGrouping(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreator
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreator(integration_verbal)
+creator = HeartBeatTextCreator(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreator
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreator(integration_verbal)
+creator = HeartBeatTextCreator(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreatorForTitleGrouping
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreatorForTitleGrouping

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreatorForTitleGrouping(integration_verbal)
+creator = HeartBeatTextCreatorForTitleGrouping(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreator
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreator(integration_verbal)
+creator = HeartBeatTextCreator(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreator
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreator(integration_verbal)
+creator = HeartBeatTextCreator(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -1,9 +1,9 @@
 from pathlib import PurePath

-from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HearBeatTextCreator
+from apps.integrations.metadata.heartbeat._heartbeat_text_creator import HeartBeatTextCreator

 integration_verbal = PurePath(__file__).stem
-creator = HearBeatTextCreator(integration_verbal)
+creator = HeartBeatTextCreator(integration_verbal)
 heartbeat_text = creator.get_heartbeat_texts()

 heartbeat_instruction_template = heartbeat_text.heartbeat_instruction_template
@@ -23,7 +23,7 @@ def setup_heartbeat_integration(name=None):
     # don't specify a team in the data, so heartbeat integration will be created in the General.
     name = name or f"OnCall Cloud Heartbeat {settings.BASE_URL}"
     data = {"type": "formatted_webhook", "name": name}
-    url = urljoin(settings.GRAFANA_CLOUD_ONCALL_API_URL, "/api/v1/integrations/")
+    url = urljoin(settings.GRAFANA_CLOUD_ONCALL_API_URL, "api/v1/integrations/")
    try:
        headers = {"Authorization": api_token}
        r = requests.post(url=url, data=data, headers=headers, timeout=5)
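The one-character change above matters because `urljoin` treats a second argument that starts with `/` as absolute and discards the base URL's path. If `GRAFANA_CLOUD_ONCALL_API_URL` carries a path prefix, the old call would silently drop it (the base URL below is a hypothetical example):

```python
from urllib.parse import urljoin

base = "https://example.com/oncall/"  # hypothetical API base with a path prefix

# Leading slash: resolved against the host root, the /oncall prefix is lost
print(urljoin(base, "/api/v1/integrations/"))  # https://example.com/api/v1/integrations/

# No leading slash: resolved relative to the base path, prefix preserved
print(urljoin(base, "api/v1/integrations/"))   # https://example.com/oncall/api/v1/integrations/
```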
@@ -137,6 +137,7 @@ def list_of_oncall_shifts_from_ical(
                 "start": g.start if g.start else datetime_start,
                 "end": g.end if g.end else datetime_end,
                 "users": [],
+                "missing_users": [],
                 "priority": None,
                 "source": None,
                 "calendar_type": None,
@@ -157,6 +158,7 @@ def get_shifts_dict(calendar, calendar_type, schedule, datetime_start, datetime_
     priority = parse_priority_from_string(event.get(ICAL_SUMMARY, "[L0]"))
     pk, source = parse_event_uid(event.get(ICAL_UID))
     users = get_users_from_ical_event(event, schedule.organization)
+    missing_users = get_missing_users_from_ical_event(event, schedule.organization)
     # Define on-call shift out of ical event that has the actual user
     if len(users) > 0 or with_empty_shifts:
         if type(event[ICAL_DATETIME_START].dt) == datetime.date:
@@ -168,6 +170,7 @@ def get_shifts_dict(calendar, calendar_type, schedule, datetime_start, datetime_
                     "start": start,
                     "end": end,
                     "users": users,
+                    "missing_users": missing_users,
                     "priority": priority,
                     "source": source,
                     "calendar_type": calendar_type,
@@ -183,6 +186,7 @@ def get_shifts_dict(calendar, calendar_type, schedule, datetime_start, datetime_
                 "start": start,
                 "end": end,
                 "users": users,
+                "missing_users": missing_users,
                 "priority": priority,
                 "source": source,
                 "calendar_type": calendar_type,
@@ -379,6 +383,14 @@ def get_usernames_from_ical_event(event):
     return usernames_found, priority


+def get_missing_users_from_ical_event(event, organization):
+    all_usernames, _ = get_usernames_from_ical_event(event)
+    users = list(get_users_from_ical_event(event, organization))
+    found_usernames = [u.username for u in users]
+    found_emails = [u.email for u in users]
+    return [u for u in all_usernames if u != "" and u not in found_usernames and u not in found_emails]
+
+
 def get_users_from_ical_event(event, organization):
     usernames_from_ical, _ = get_usernames_from_ical_event(event)
     users = []
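The new `get_missing_users_from_ical_event` reduces to a set-difference over usernames and emails; the filtering logic can be sketched without the Django models (function and argument names here are illustrative, not from the codebase):

```python
def missing_users(all_usernames, known_usernames, known_emails):
    # Keep non-empty calendar names that match no known user by username or email
    return [
        name
        for name in all_usernames
        if name != "" and name not in known_usernames and name not in known_emails
    ]


print(missing_users(["alice", "bob@example.com", "ghost", ""], ["alice"], ["bob@example.com"]))
# ['ghost']
```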
@@ -1,17 +1,12 @@
 # Generated by Django 3.2.13 on 2022-07-12 08:03

 from django.db import migrations, models
+from django.db.models import F


 def fill_rotation_start_field(apps, schema_editor):
     CustomOnCallShift = apps.get_model("schedules", "CustomOnCallShift")
-    shifts = CustomOnCallShift.objects.all()
-    shifts_to_update = []
-    for shift in shifts:
-        shift.rotation_start = shift.start
-        shifts_to_update.append(shift)
-
-    CustomOnCallShift.objects.bulk_update(shifts_to_update, ["rotation_start"], batch_size=5000)
+    CustomOnCallShift.objects.update(rotation_start=F("start"))


 class Migration(migrations.Migration):
@@ -236,7 +236,8 @@ class CustomOnCallShift(models.Model):
         result += (
             f", frequency: {self.get_frequency_display()}, interval: {self.interval}, "
             f"week start: {self.week_start}, by day: {self.by_day}, by month: {self.by_month}, "
-            f"by monthday: {self.by_monthday}, until: {self.until.isoformat() if self.until else None}"
+            f"by monthday: {self.by_monthday}, rotation start: {self.rotation_start.isoformat()}, "
+            f"until: {self.until.isoformat() if self.until else None}"
         )
         return result
@@ -1,14 +1,13 @@
 import json
 import logging
-from urllib.parse import urljoin

 from django.apps import apps
-from django.conf import settings
 from django.db.models import Q
 from django.utils import timezone

 from apps.slack.scenarios import scenario_step
 from apps.slack.slack_client.exceptions import SlackAPIException
+from common.api_helpers.utils import create_engine_url

 from .step_mixins import CheckAlertIsUnarchivedMixin

@@ -607,7 +606,7 @@ class ResolutionNoteModalStep(CheckAlertIsUnarchivedMixin, scenario_step.Scenari

         if not blocks:
             # there aren't any resolution notes yet, display a hint instead
-            link_to_instruction = urljoin(settings.BASE_URL, "static/images/postmortem.gif")
+            link_to_instruction = create_engine_url("static/images/postmortem.gif")
             blocks = [
                 {
                     "type": "divider",
@@ -633,7 +632,7 @@ class ResolutionNoteModalStep(CheckAlertIsUnarchivedMixin, scenario_step.Scenari
         return blocks

     def get_invite_bot_tip_blocks(self, channel):
-        link_to_instruction = urljoin(settings.BASE_URL, "static/images/postmortem.gif")
+        link_to_instruction = create_engine_url("static/images/postmortem.gif")
         blocks = [
             {
                 "type": "divider",
@@ -1,10 +1,9 @@
 import json
-from urllib.parse import urljoin

 import pytest
-from django.conf import settings

 from apps.slack.scenarios.scenario_step import ScenarioStep
+from common.api_helpers.utils import create_engine_url


 @pytest.mark.django_db
@@ -22,7 +21,7 @@ def test_get_resolution_notes_blocks_default_if_empty(

     blocks = step.get_resolution_notes_blocks(alert_group, "", False)

-    link_to_instruction = urljoin(settings.BASE_URL, "static/images/postmortem.gif")
+    link_to_instruction = create_engine_url("static/images/postmortem.gif")
     expected_blocks = [
         {
             "type": "divider",
@@ -1,3 +1,4 @@
+from datetime import datetime
 from textwrap import wrap

 from apps.slack.slack_client import SlackClientWithErrorHandling
@@ -58,7 +59,8 @@ def post_message_to_channel(organization, channel_id, text):


 def format_datetime_to_slack(timestamp, format="date_short"):
-    return f"<!date^{timestamp}^{{{format}}} {{time}}|{timestamp}>"
+    fallback = datetime.utcfromtimestamp(timestamp).strftime("%Y-%m-%d %H:%M (UTC)")
+    return f"<!date^{timestamp}^{{{format}}} {{time}}|{fallback}>"


 def get_cache_key_update_incident_slack_message(alert_group_pk):
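The change to `format_datetime_to_slack` swaps the raw epoch for a readable UTC fallback; clients that cannot render Slack's `<!date^…>` token display the text after the final `|`. A self-contained version of the new behavior:

```python
from datetime import datetime


def format_datetime_to_slack(timestamp, format="date_short"):
    # Fallback text, shown by clients that can't render Slack date tokens
    fallback = datetime.utcfromtimestamp(timestamp).strftime("%Y-%m-%d %H:%M (UTC)")
    return f"<!date^{timestamp}^{{{format}}} {{time}}|{fallback}>"


print(format_datetime_to_slack(0))
# <!date^0^{date_short} {time}|1970-01-01 00:00 (UTC)>
```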
@@ -2,7 +2,6 @@ import logging
 import urllib.parse

 from django.apps import apps
-from django.conf import settings
 from django.urls import reverse
 from twilio.base.exceptions import TwilioRestException
 from twilio.rest import Client
@@ -10,6 +9,7 @@ from twilio.rest import Client
 from apps.base.utils import live_settings
 from apps.twilioapp.constants import TEST_CALL_TEXT, TwilioLogRecordStatus, TwilioLogRecordType
 from apps.twilioapp.utils import get_calling_code, get_gather_message, get_gather_url, parse_phone_number
+from common.api_helpers.utils import create_engine_url

 logger = logging.getLogger(__name__)

@@ -24,7 +24,7 @@ class TwilioClient:
         return live_settings.TWILIO_NUMBER

     def send_message(self, body, to):
-        status_callback = settings.BASE_URL + reverse("twilioapp:sms_status_events")
+        status_callback = create_engine_url(reverse("twilioapp:sms_status_events"))
         try:
             return self.twilio_api_client.messages.create(
                 body=body, to=to, from_=self.twilio_number, status_callback=status_callback
@@ -143,7 +143,7 @@ class TwilioClient:
         )

         url = "http://twimlets.com/echo?Twiml=" + twiml_query
-        status_callback = settings.BASE_URL + reverse("twilioapp:call_status_events")
+        status_callback = create_engine_url(reverse("twilioapp:call_status_events"))

         status_callback_events = ["initiated", "ringing", "answered", "completed"]
@@ -3,11 +3,12 @@ import re
 from string import digits

 from django.apps import apps
-from django.conf import settings
 from django.urls import reverse
 from phonenumbers import COUNTRY_CODE_TO_REGION_CODE
 from twilio.twiml.voice_response import Gather, VoiceResponse

+from common.api_helpers.utils import create_engine_url
+
 logger = logging.getLogger(__name__)

@@ -19,7 +20,7 @@ def get_calling_code(iso):


 def get_gather_url():
-    gather_url = settings.BASE_URL + reverse("twilioapp:gather")
+    gather_url = create_engine_url(reverse("twilioapp:gather"))
     return gather_url
@@ -8,7 +8,6 @@ from django.db.models.signals import post_save
 from django.dispatch import receiver
 from emoji import demojize

-from apps.alerts.tasks import invalidate_web_cache_for_alert_group
 from apps.schedules.tasks import drop_cached_ical_for_custom_events_for_organization
 from common.constants.role import Role
 from common.public_primary_keys import generate_public_primary_key, increase_public_primary_key_length
@@ -255,14 +254,6 @@ class User(models.Model):
 # TODO: check whether this signal can be moved to save method of the model
 @receiver(post_save, sender=User)
 def listen_for_user_model_save(sender, instance, created, *args, **kwargs):
-    # if kwargs is not None:
-    #     if "update_fields" in kwargs:
-    #         if kwargs["update_fields"] is not None:
-    #             if "username" not in kwargs["update_fields"]:
-    #                 return
-
     drop_cached_ical_for_custom_events_for_organization.apply_async(
         (instance.organization_id,),
     )
-    logger.info(f"Drop AG cache. Reason: save user {instance.pk}")
-    invalidate_web_cache_for_alert_group.apply_async(kwargs={"org_pk": instance.organization_id})
@@ -1,3 +1,5 @@
+from urllib.parse import urljoin
+
 import requests
 from django.conf import settings
 from icalendar import Calendar
@@ -50,3 +52,11 @@ def validate_ical_url(url):
             raise serializers.ValidationError("Ical parse failed")
         return url
     return None
+
+
+def create_engine_url(path):
+    base = settings.BASE_URL
+    if not base.endswith("/"):
+        base += "/"
+    trimmed_path = path.lstrip("/")
+    return urljoin(base, trimmed_path)
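`create_engine_url` normalizes both sides of the join so that a `BASE_URL` containing a sub-path (for example, OnCall served behind a path prefix) is preserved. A framework-free sketch, with the `base` parameter standing in for `settings.BASE_URL`:

```python
from urllib.parse import urljoin


def create_engine_url(path, base="http://localhost:8000/test123"):
    # Keep the base's path segment, and stop the path from resetting to the host root
    if not base.endswith("/"):
        base += "/"
    return urljoin(base, path.lstrip("/"))


print(create_engine_url("/destination"))  # http://localhost:8000/test123/destination
```

Stripping the leading slash from `path` and guaranteeing a trailing slash on `base` is what makes `urljoin` append rather than replace.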
@@ -1,4 +1,5 @@
 import json
+import re

 from django.utils.dateparse import parse_datetime

@@ -22,3 +23,10 @@ def to_pretty_json(value):
         return json.dumps(value, sort_keys=True, indent=4, separators=(",", ": "), ensure_ascii=False)
     except (ValueError, AttributeError, TypeError):
         return None
+
+
+def regex_replace(value, find, replace):
+    try:
+        return re.sub(find, replace, value)
+    except (ValueError, AttributeError, TypeError):
+        return None
@@ -2,7 +2,7 @@ from django.utils import timezone
 from jinja2 import BaseLoader
 from jinja2.sandbox import SandboxedEnvironment

-from .filters import datetimeformat, iso8601_to_time, to_pretty_json
+from .filters import datetimeformat, iso8601_to_time, regex_replace, to_pretty_json

 jinja_template_env = SandboxedEnvironment(loader=BaseLoader())

@@ -10,3 +10,4 @@ jinja_template_env.filters["datetimeformat"] = datetimeformat
 jinja_template_env.filters["iso8601_to_time"] = iso8601_to_time
 jinja_template_env.filters["tojson_pretty"] = to_pretty_json
 jinja_template_env.globals["time"] = timezone.now
+jinja_template_env.filters["regex_replace"] = regex_replace
engine/common/tests/test_create_engine_url.py (new file, 35 lines)
@@ -0,0 +1,35 @@
+from django.test.utils import override_settings
+
+from common.api_helpers.utils import create_engine_url
+
+
+@override_settings(BASE_URL="http://localhost:8000")
+def test_create_engine_url_no_slash():
+    assert create_engine_url("destination") == "http://localhost:8000/destination"
+    assert create_engine_url("/destination") == "http://localhost:8000/destination"
+    assert create_engine_url("destination/") == "http://localhost:8000/destination/"
+    assert create_engine_url("/destination/") == "http://localhost:8000/destination/"
+
+
+@override_settings(BASE_URL="http://localhost:8000/")
+def test_create_engine_url_slash():
+    assert create_engine_url("destination") == "http://localhost:8000/destination"
+    assert create_engine_url("/destination") == "http://localhost:8000/destination"
+    assert create_engine_url("destination/") == "http://localhost:8000/destination/"
+    assert create_engine_url("/destination/") == "http://localhost:8000/destination/"
+
+
+@override_settings(BASE_URL="http://localhost:8000/test123")
+def test_create_engine_url_prefix_no_slash():
+    assert create_engine_url("destination") == "http://localhost:8000/test123/destination"
+    assert create_engine_url("/destination") == "http://localhost:8000/test123/destination"
+    assert create_engine_url("destination/") == "http://localhost:8000/test123/destination/"
+    assert create_engine_url("/destination/") == "http://localhost:8000/test123/destination/"
+
+
+@override_settings(BASE_URL="http://localhost:8000/test123/")
+def test_create_engine_url_prefix_slash():
+    assert create_engine_url("destination") == "http://localhost:8000/test123/destination"
+    assert create_engine_url("/destination") == "http://localhost:8000/test123/destination"
+    assert create_engine_url("destination/") == "http://localhost:8000/test123/destination/"
+    assert create_engine_url("/destination/") == "http://localhost:8000/test123/destination/"
engine/common/tests/test_regex_replace.py (new file, 7 lines)
@@ -0,0 +1,7 @@
+from common.jinja_templater.filters import regex_replace
+
+
+def test_regex_replace_drop_field():
+    original = "[ var='D0' metric='my_metric' labels={} value=140 ]"
+    expected = "[ metric='my_metric' labels={} value=140 ]"
+    assert regex_replace(original, "var='[a-zA-Z0-9]+' ", "") == expected
@@ -36,7 +36,10 @@ django-log-request-id==1.6.0
 django-polymorphic==3.0.0
 django-rest-polymorphic==0.1.9
 pre-commit==2.15.0
-https://github.com/iskhakov/django-push-notifications/archive/refs/tags/2.0.0-hotfix-4.tar.gz
+django-push-notifications==3.0.0
+django-mirage-field==1.3.0
+django-mysql==4.6.0
 PyMySQL==1.0.2
 emoji==1.7.0
 apns2==0.7.2
||||
|
||||
|
|
|
|||
|
|
@@ -405,8 +405,9 @@ PUSH_NOTIFICATIONS_SETTINGS = {
    "APNS_TOPIC": os.environ.get("APNS_TOPIC", None),
    "APNS_AUTH_KEY_ID": os.environ.get("APNS_AUTH_KEY_ID", None),
    "APNS_TEAM_ID": os.environ.get("APNS_TEAM_ID", None),
    "APNS_USE_SANDBOX": True,
    "APNS_USE_SANDBOX": getenv_boolean("APNS_USE_SANDBOX", True),
    "USER_MODEL": "user_management.User",
    "UPDATE_ON_DUPLICATE_REG_ID": True,
}

SELF_HOSTED_SETTINGS = {
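The hardcoded sandbox flag becomes environment-driven via `getenv_boolean`. A plausible sketch of such a helper — the parsing rules here are an assumption, not necessarily OnCall's exact implementation:

```python
import os


def getenv_boolean(name, default=False):
    # Read an env var and interpret common truthy spellings;
    # fall back to the given default when the var is unset.
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes", "on")
```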
@@ -27,8 +27,11 @@ RABBITMQ_PASSWORD = os.environ.get("RABBITMQ_PASSWORD")
RABBITMQ_HOST = os.environ.get("RABBITMQ_HOST")
RABBITMQ_PORT = os.environ.get("RABBITMQ_PORT")
RABBITMQ_PROTOCOL = os.environ.get("RABBITMQ_PROTOCOL")
RABBITMQ_VHOST = os.environ.get("RABBITMQ_VHOST", "")

CELERY_BROKER_URL = f"{RABBITMQ_PROTOCOL}://{RABBITMQ_USERNAME}:{RABBITMQ_PASSWORD}@{RABBITMQ_HOST}:{RABBITMQ_PORT}"
CELERY_BROKER_URL = (
    f"{RABBITMQ_PROTOCOL}://{RABBITMQ_USERNAME}:{RABBITMQ_PASSWORD}@{RABBITMQ_HOST}:{RABBITMQ_PORT}/{RABBITMQ_VHOST}"
)

REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD")
REDIS_HOST = os.environ.get("REDIS_HOST")
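With the vhost appended, an unset `RABBITMQ_VHOST` leaves the broker URL ending in a bare `/`, which AMQP clients read as the default vhost. A standalone sketch of the construction (env var names from the diff; the defaults are illustrative only):

```python
import os

# Illustrative defaults; OnCall itself has no defaults for most of these.
protocol = os.environ.get("RABBITMQ_PROTOCOL", "amqp")
user = os.environ.get("RABBITMQ_USERNAME", "guest")
password = os.environ.get("RABBITMQ_PASSWORD", "guest")
host = os.environ.get("RABBITMQ_HOST", "localhost")
port = os.environ.get("RABBITMQ_PORT", "5672")
vhost = os.environ.get("RABBITMQ_VHOST", "")

# Empty vhost -> URL ends in "/", i.e. the default vhost.
broker_url = f"{protocol}://{user}:{password}@{host}:{port}/{vhost}"
```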
@@ -115,14 +115,15 @@ export const Root = observer((props: AppRootProps) => {
  }, []);

  useEffect(() => {
    const style = document.createElement('style');
    document.head.appendChild(style);
    const index = style.sheet.insertRule('.page-body {max-width: unset !important}');
    const index2 = style.sheet.insertRule('.page-container {max-width: unset !important}');
    let link = document.createElement('link');
    link.type = 'text/css';
    link.rel = 'stylesheet';
    link.href = '/public/plugins/grafana-oncall-app/img/grafanaGlobalStyles.css';

    document.head.appendChild(link);

    return () => {
      style.sheet.removeRule(index);
      style.sheet.removeRule(index2);
      document.head.removeChild(link);
    };
  }, []);
Binary file not shown (size before: 636 KiB, after: 688 KiB).
@@ -1,8 +1,13 @@
.root {
  display: flex;
  justify-content: space-between;
  align-items: center;
}

.search {
  max-width: 400px;
}

.icon-button {
  color: var(--secondary-text-color);
  margin-left: 8px;
}
@@ -1,6 +1,6 @@
import React, { ChangeEvent, FC, useCallback } from 'react';

import { Icon, Input, Button } from '@grafana/ui';
import { Icon, Input, Button, IconButton } from '@grafana/ui';
import cn from 'classnames/bind';

import styles from './EscalationsFilters.module.css';

@@ -45,9 +45,7 @@ const EscalationsFilters: FC<EscalationsFiltersProps> = (props) => {
        value={value.searchTerm}
        onChange={onSearchTermChangeCallback}
      />
      <Button variant="secondary" icon="times" onClick={handleClear} className={cx('clear-button')}>
        Clear filters
      </Button>
      <IconButton name="times" onClick={handleClear} className={cx('icon-button')} tooltip="Clear search input" />
    </div>
  );
};
@@ -1,22 +1,22 @@
export const logoCoors: { [key: string]: { x: number; y: number } } = {
  Grafana: { x: 9, y: 0 },
  'Grafana Alerting': { x: 9, y: 0 },
  Webhook: { x: 2, y: 14 },
  AlertManager: { x: 12, y: 4 },
  Kapacitor: { x: 10, y: 1 },
  Fabric: { x: 8, y: 7 },
  NewRelic: { x: 0, y: 11 },
  DataDog: { x: 3, y: 7 },
  PagerDuty: { x: 0, y: 0 },
  Pingdom: { x: 4, y: 0 },
  ElastAlert: { x: 0, y: 0 },
  'Amazon SNS': { x: 0, y: 2 },
  Curler: { x: 0, y: 0 },
  'Sentry Webhook (Onprem)': { x: 11, y: 12 },
  'Formatted Webhook': { x: 2, y: 14 },
  'HeartBeat Monitoring': { x: 2, y: 14 },
  grafana: { x: 9, y: 0 },
  grafana_alerting: { x: 9, y: 0 },
  webhook: { x: 2, y: 14 },
  alertmanager: { x: 12, y: 4 },
  kapacitor: { x: 10, y: 1 },
  fabric: { x: 8, y: 7 },
  newrelic: { x: 0, y: 11 },
  datadog: { x: 3, y: 7 },
  pagerduty: { x: 3, y: 1 },
  pingdom: { x: 4, y: 0 },
  elastalert: { x: 2, y: 1 },
  amazon_sns: { x: 0, y: 2 },
  curler: { x: 0, y: 0 },
  sentry: { x: 11, y: 12 },
  formatted_webhook: { x: 2, y: 14 },
  heartbeat_monitoring: { x: 2, y: 14 },
  Stackdriver: { x: 8, y: 8 },
  UptimeRobot: { x: 14, y: 8 },
  Zabbix: { x: 7, y: 14 },
  PRTG: { x: 12, y: 5 },
  uptimerobot: { x: 14, y: 8 },
  zabbix: { x: 7, y: 14 },
  prtg: { x: 12, y: 5 },
};
@@ -24,7 +24,7 @@ const IntegrationLogo: FC<IntegrationLogoProps> = (props) => {
    return null;
  }

  const coors = logoCoors[integration.display_name] || { x: 2, y: 14 };
  const coors = logoCoors[integration.value] || { x: 2, y: 14 };

  const bgStyle = {
    backgroundPosition: `-${coors?.x * LOGO_WIDTH * scale}px -${coors?.y * LOGO_WIDTH * scale}px`,
@@ -10,6 +10,7 @@
  overflow: auto;
  scroll-snap-type: y mandatory;
  padding: 0 10px 10px 0;
  min-width: 840px;
}

.cards_centered {
@@ -1,3 +1,4 @@
import plugin from '../../../package.json'; // eslint-disable-line
import React, { FC, useEffect, useState, useCallback } from 'react';

import { AppRootProps } from '@grafana/data';

@@ -88,14 +89,33 @@ const DefaultPageLayout: FC<DefaultPageLayoutProps> = observer((props) => {
          />
        </Alert>
      )}
      {currentTeam?.limits.show_limits_warning && !getItem(currentTeam.limits.warning_text) && (
        <Alert
          className={styles.alert}
          severity="warning"
          title={currentTeam?.limits.warning_text}
          onRemove={getRemoveAlertHandler(currentTeam?.limits.warning_text)}
        />
      )}
      {store.backendLicense === 'OpenSource' &&
        store.backendVersion &&
        plugin?.version &&
        store.backendVersion !== plugin?.version && (
          <Alert className={styles.alert} severity="warning" title={'Version mismatch!'}>
            Please make sure you have the same versions of the Grafana OnCall plugin and the Grafana OnCall engine,
            otherwise there could be issues with your Grafana OnCall installation!
            <br />
            {`Current plugin version: ${plugin.version}, current engine version: ${store.backendVersion}`}
            <br />
            Please see{' '}
            <a href={'https://grafana.com/docs/oncall/latest/open-source/#update-grafana-oncall-oss'}>
              the update instructions
            </a>
            .
          </Alert>
        )}
      {currentTeam?.limits.show_limits_warning &&
        currentTeam?.limits.period_title !== 'Version mismatch' && // don't show version mismatch warning twice
        !getItem(currentTeam.limits.warning_text) && (
          <Alert
            className={styles.alert}
            severity="warning"
            title={currentTeam?.limits.warning_text}
            onRemove={getRemoveAlertHandler(currentTeam?.limits.warning_text)}
          />
        )}
      {Boolean(
        currentTeam &&
          currentUser &&
@@ -1,3 +1,14 @@
.root {
  display: block;
}

.connected-integrations {
  padding: 2px 4px;
  background: rgba(27, 133, 94, 0.15);
  border: 1px solid var(--success-text-color);
  border-radius: 2px;
}

.icon {
  color: var(--success-text-color);
}
@@ -1,6 +1,6 @@
import React from 'react';

import { HorizontalGroup, VerticalGroup } from '@grafana/ui';
import { HorizontalGroup, Icon, VerticalGroup, Tooltip } from '@grafana/ui';
import cn from 'classnames/bind';
import { observer } from 'mobx-react';

@@ -29,9 +29,26 @@ const EscalationChainCard = observer((props: AlertReceiveChannelCardProps) => {
    <div className={cx('root')}>
      <HorizontalGroup align="flex-start">
        <VerticalGroup spacing="xs">
          <Text type="primary" size="medium">
            {escalationChain.name}
          </Text>
          <HorizontalGroup spacing="sm">
            <Text type="primary" size="medium">
              {escalationChain.name}
            </Text>
            {(escalationChain.number_of_integrations > 0 || escalationChain.number_of_routes > 0) && (
              <Tooltip
                placement="top"
                content={`Modifying this escalation chain will affect ${escalationChain.number_of_integrations} integrations and ${escalationChain.number_of_routes} routes.`}
              >
                <div className={cx('connected-integrations')}>
                  <HorizontalGroup spacing="xs">
                    <Icon className={cx('icon')} name="link" size="sm" />
                    <Text type="success" size="small">
                      {escalationChain.number_of_integrations}
                    </Text>
                  </HorizontalGroup>
                </div>
              </Tooltip>
            )}
          </HorizontalGroup>
          {/*<HorizontalGroup>
            <PluginLink
              query={{ page: 'incidents', integration: alertReceiveChannel.id }}
@@ -36,7 +36,7 @@ export const form: { name: string; fields: FormItem[] } = {
      name: 'data',
      getDisabled: (form_data) => Boolean(form_data.forward_whole_payload),
      type: FormItemType.TextArea,
      description: 'Available variables: {{ alert_title }}, {{ alert_message }}, {{ alert_url }}, {{ alert_payload }}',
      description: 'Available variables: {{ alert_payload }}, {{ alert_group_id }}',
      extra: {
        rows: 9,
      },
@@ -68,7 +68,7 @@ export const PluginConfigPage = (props: Props) => {
      provisioningConfig = await makeRequest('/plugin/self-hosted/install', { method: 'POST' });
    } catch (e) {
      if (e.response.status === 502) {
        console.warn('Could not connect to OnCall: ' + plugin.meta.jsonData.onCallApiUrl);
        console.warn('Could not connect to OnCall: ' + onCallApiUrl);
      } else if (e.response.status === 403) {
        console.warn('Invitation token is invalid or expired.');
      } else {

@@ -130,13 +130,11 @@ export const PluginConfigPage = (props: Props) => {

  const handleSyncException = useCallback((e) => {
    if (plugin.meta.jsonData?.onCallApiUrl) {
      setPluginStatusMessage(
        'Tried connecting to OnCall: ' +
          plugin.meta.jsonData.onCallApiUrl +
          '\n' +
          e +
          ', retry or check settings & re-initialize.'
      );
      let statusMessage = plugin.meta.jsonData.onCallApiUrl + '\n' + e + ', retry or check settings & re-initialize.';
      if (e.response.status == 404) {
        statusMessage += '\nIf Grafana OnCall was just installed, restart Grafana for OnCall routes to be available.';
      }
      setPluginStatusMessage(statusMessage);
      setRetrySync(true);
    } else {
      setPluginStatusMessage('OnCall has not been setup, configure & initialize below.');

@@ -151,7 +149,9 @@ export const PluginConfigPage = (props: Props) => {
      get_sync_response.version && get_sync_response.license
        ? ` (${get_sync_response.license}, ${get_sync_response.version})`
        : '';
      setPluginStatusMessage(`Connected to OnCall${versionInfo}: ${plugin.meta.jsonData.onCallApiUrl}`);
      setPluginStatusMessage(
        `Connected to OnCall${versionInfo}\n - OnCall URL: ${plugin.meta.jsonData.onCallApiUrl}\n - Grafana URL: ${plugin.meta.jsonData.grafanaUrl}`
      );
      setIsSelfHostedInstall(plugin.meta.jsonData?.license === 'OpenSource');
      setPluginStatusOk(true);
    } else {
@@ -148,7 +148,7 @@ const CloudPhoneSettings = observer((props: CloudPhoneSettingsProps) => {
          Updating...
        </Button>
      ) : (
        <Button variant="secondary" icon="sync" onClick={syncUser}>
        <Button variant="secondary" icon="sync" onClick={syncUser} disabled={userStatus === 0}>
          Update
        </Button>
      )}
grafana-plugin/src/img/grafanaGlobalStyles.css (new file, 35 lines)

@@ -0,0 +1,35 @@
.page-body {
  max-width: unset !important;
}

.page-container {
  max-width: unset !important;
}

/* This is for Grafana 8, remove later */
@media (max-width: 1540px) {
  .page-header__tabs > ul > li > a > div {
    display: none;
  }
}

@media (max-width: 1540px) {
  .page-header__tabs > div > div > a > div {
    display: none;
  }
}

@media (max-width: 1300px) {
  .sidemenu {
    position: fixed !important;
    height: 100%;
  }

  .main-view {
    padding-left: 50px;
  }

  .page-header__tabs li a {
    white-space: nowrap;
  }
}
@@ -29,35 +29,3 @@
.highlighted-row {
  background: var(--highlighted-row-bg);
}

/* This is for Grafana 8, remove later */
@media (max-width: 1540px) {
  .page-header__tabs > ul > li > a > div {
    display: none;
  }
}

@media (max-width: 1540px) {
  .page-header__tabs > div > div > a > div {
    display: none;
  }
}

@media (max-width: 1300px) {
  .sidemenu {
    position: fixed !important;
    height: 100%;
  }

  .grafana-app {
    position: relative !important;
  }

  .main-view {
    padding-left: 50px;
  }

  .page-header__tabs li a {
    white-space: nowrap;
  }
}
@@ -36,6 +36,12 @@ export interface ScheduleEvent {
  users: User[];
  is_empty: boolean;
  is_gap: boolean;
  missing_users: string[];
}

export interface CreateScheduleExportTokenResponse {
@@ -242,28 +242,6 @@ class EscalationChainsPage extends React.Component<EscalationChainsPageProps, Es
    const escalationChain = escalationChainStore.items[selectedEscalationChain];
    const escalationChainDetails = escalationChainStore.details[selectedEscalationChain];

    let warningAboutModifyingEscalationChain = null;
    if (escalationChain.number_of_integrations > 0 || escalationChain.number_of_routes > 0) {
      warningAboutModifyingEscalationChain = (
        <>
          Modifying this escalation chain will affect{' '}
          {escalationChain.number_of_integrations > 0 && (
            <Text strong>
              {escalationChain.number_of_integrations} integration
              {escalationChain.number_of_integrations === 1 ? '' : 's'}
            </Text>
          )}
          {escalationChain.number_of_routes > 0 && escalationChain.number_of_integrations > 0 && ' and '}
          {escalationChain.number_of_routes > 0 && (
            <Text strong>
              {escalationChain.number_of_routes} route{escalationChain.number_of_routes === 1 ? '' : 's'}
            </Text>
          )}
          . Escalation chains linked to multiple integrations cannot be removed.
        </>
      );
    }

    return (
      <>
        <Block withBackground className={cx('header')}>

@@ -288,7 +266,7 @@ class EscalationChainsPage extends React.Component<EscalationChainsPageProps, Es
          <WithPermissionControl userAction={UserAction.UpdateEscalationPolicies}>
            <WithConfirm title={`Are you sure to remove "${escalationChain.name}"?`} confirmText="Remove">
              <IconButton
                disabled={escalationChain.number_of_integrations > 1}
                disabled={escalationChain.number_of_integrations > 0}
                tooltip="Remove"
                tooltipPlacement="top"
                onClick={this.handleDeleteEscalationChain}

@@ -296,7 +274,7 @@ class EscalationChainsPage extends React.Component<EscalationChainsPageProps, Es
              />
            </WithConfirm>
          </WithPermissionControl>
          {escalationChain.number_of_integrations > 1 && (
          {escalationChain.number_of_integrations > 0 && (
            <Tooltip content="Escalation chains linked to multiple integrations cannot be removed">
              <Icon name="info-circle" />
            </Tooltip>

@@ -304,17 +282,13 @@ class EscalationChainsPage extends React.Component<EscalationChainsPageProps, Es
        </HorizontalGroup>
      </div>
    </Block>
    {warningAboutModifyingEscalationChain && (
      // @ts-ignore
      <Alert title={warningAboutModifyingEscalationChain} severity="warning" />
    )}
    <EscalationChainSteps id={selectedEscalationChain} />
    {escalationChainDetails ? (
      <Collapse
        headerWithBackground
        label={`${escalationChainDetails.length ? escalationChainDetails.length : 'No'} Linked integration${
        label={`${escalationChainDetails.length ? escalationChainDetails.length : 'No'} linked integration${
          escalationChainDetails.length === 1 ? '' : 's'
        }`}
        } will be affected by changes`}
        isOpen
      >
        {escalationChainDetails.length ? (
@@ -47,7 +47,6 @@

.priority-icon {
  width: 32px;
  height: 32px;
  border-radius: 50%;
  background: var(--secondary-background);
  line-height: 32px;

@@ -65,4 +64,5 @@
  border-radius: 50px;
  color: #ff5286;
  font-weight: 400;
  align-items: baseline;
}
@@ -477,11 +477,11 @@ const Event = ({ event }: EventProps) => {
  const dates = getDatesString(event.start, event.end, event.all_day);

  return (
    <HorizontalGroup align="flex-start" spacing="md">
    <>
      {!event.is_gap ? (
        <HorizontalGroup align="flex-start">
        <HorizontalGroup align="flex-start" spacing="sm">
          <div className={cx('priority-icon')}>
            <Text type="secondary">{`L${event.priority_level || '0'}`}</Text>
            <Text wrap type="secondary">{`L${event.priority_level || '0'}`}</Text>
          </div>
          <VerticalGroup>
            <div>

@@ -493,9 +493,17 @@ const Event = ({ event }: EventProps) => {
              </span>
            ))
          ) : (
            <HorizontalGroup>
            <HorizontalGroup spacing="sm">
              <Icon style={{ color: PENDING_COLOR }} name="exclamation-triangle" />
              <Text type="secondary">Empty shift (event without associated user or user with Viewer access)</Text>
              <Text type="secondary">Empty shift</Text>
              {event.missing_users[0] && (
                <Text type="secondary">
                  (check if {event.missing_users[0].includes(',') ? 'some of these users -' : 'user -'}{' '}
                  <Text type="secondary">"{event.missing_users[0]}"</Text>{' '}
                  {event.missing_users[0].includes(',') ? 'are' : 'is'} existing in OnCall or{' '}
                  {event.missing_users[0].includes(',') ? 'have' : 'has'} Viewer role)
                </Text>
              )}
            </HorizontalGroup>
          )}
          {event.source && <span> — source: {event.source}</span>}

@@ -507,11 +515,11 @@ const Event = ({ event }: EventProps) => {
        </HorizontalGroup>
      ) : (
        <div className={cx('gap-between-shifts')}>
          <Icon size="sm" name="exclamation-triangle" className={cx('gap-between-shifts-icon')} /> Gap! Nobody
          On-Call...
          <Icon name="exclamation-triangle" className={cx('gap-between-shifts-icon')} />
          <Text> Gap! Nobody On-Call...</Text>
        </div>
      )}
    </HorizontalGroup>
    </>
  );
};
@@ -38,6 +38,12 @@ export class RootBaseStore {
  @observable
  appLoading = true;

  @observable
  backendVersion = '';

  @observable
  backendLicense = '';

  @observable
  pluginIsInitialized = true;

@@ -134,6 +140,8 @@ export class RootBaseStore {
      this.initializationError = 'OnCall was not able to connect back to this Grafana';
      return;
    }
    this.backendVersion = get_sync_response.version;
    this.backendLicense = get_sync_response.license;
    this.appLoading = false;
  }
@@ -13,6 +13,7 @@ Architecture diagram can be found [here](https://raw.githubusercontent.com/grafa

### Cluster requirements
* Ensure you can run x86-64/amd64 workloads; the arm64 architecture is currently not supported
* Kubernetes 1.25+ is not supported when cert-manager is enabled

## Install
### Prepare the repo

@@ -143,6 +144,24 @@ externalRabbitmq:
  password:
```

## Update
```shell
# Add & upgrade the repository
helm repo add grafana https://grafana.github.io/helm-charts
helm repo update

# Re-deploy
helm upgrade \
    --install \
    --wait \
    --set base_url=example.com \
    --set grafana."grafana\.ini".server.domain=example.com \
    release-oncall \
    grafana/oncall
```

After re-deploying, please also update the Grafana OnCall plugin on the plugin version page. See the [Grafana docs](https://grafana.com/docs/grafana/latest/administration/plugin-management/#update-a-plugin) for more info on updating Grafana plugins.

## Uninstall
### Uninstalling the helm chart
```bash
@@ -106,6 +106,8 @@
          value: {{ include "snippet.rabbitmq.port" . }}
        - name: RABBITMQ_PROTOCOL
          value: {{ include "snippet.rabbitmq.protocol" . }}
        - name: RABBITMQ_VHOST
          value: {{ include "snippet.rabbitmq.vhost" . }}
{{- end }}

{{- define "snippet.rabbitmq.user" -}}

@@ -140,6 +142,14 @@
{{- end -}}
{{- end -}}

{{- define "snippet.rabbitmq.vhost" -}}
{{- if and (not .Values.rabbitmq.enabled) .Values.externalRabbitmq.vhost -}}
{{ .Values.externalRabbitmq.vhost | quote }}
{{- else -}}
""
{{- end -}}
{{- end -}}

{{- define "snippet.rabbitmq.password.secret.name" -}}
{{- if and (not .Values.rabbitmq.enabled) .Values.externalRabbitmq.password -}}
{{ include "oncall.fullname" . }}-rabbitmq-external
@@ -5,6 +5,10 @@ metadata:
  name: {{ include "oncall.engine.fullname" . }}-external
  labels:
    {{- include "oncall.engine.labels" . | nindent 4 }}
  {{- with .Values.service.annotations }}
  annotations:
    {{- toYaml . | nindent 4 }}
  {{- end }}
spec:
  type: {{ .Values.service.type }}
  ports:
@@ -16,6 +16,7 @@ service:
  enabled: false
  type: LoadBalancer
  port: 8080
  annotations: {}

# Engine pods configuration
engine:

@@ -117,6 +118,7 @@ externalRabbitmq:
  user:
  password:
  protocol:
  vhost:

# Redis is included in this release for convenience.
# It is recommended to host it separately from this release