v1.12.0
Commit: 8e4a2dde22
111 changed files with 3078 additions and 1002 deletions
@@ -41,6 +41,9 @@ Grafana OnCall Cloud includes SMS and Phone notifications.
OSS users can use the [Grafana OSS-Cloud Setup](ref:grafana-oss-cloud-setup) as a relay or configure this notification type using other providers like Twilio.
{{< /admonition >}}

Please note that not all countries are supported. Grafana OnCall aligns with Twilio’s suggested list of supported countries.
For details, see [SMS/Voice support by country](#smsvoice-support-by-country).

## SMS notification behavior

OnCall reduces alert noise and distraction by bundling SMS notifications.

@@ -78,8 +81,316 @@ There are no specific rate limits, but we reserve the right to stop sending SMS/
To find the phone number used by OnCall, make a test call from the “Phone Verification” tab.
### Phone calls or SMS not working

### SMS/Voice support by country

There are instances where OnCall may not be able to make phone calls or send SMS to certain regions or specific phone numbers. We are working to resolve these issues.
Please test your personal notification chain to ensure OnCall can notify you.
We also suggest backing up phone calls and SMS with other notification methods such as the [Mobile app](ref:mobile-app).
The following is a list of countries currently supported by Grafana OnCall.

{{< admonition type="note" >}}
Be aware that due to limitations in our telecom provider’s service, some numbers within supported countries may occasionally be flagged as “high-risk” when verifying your phone number, thereby preventing you from being able to use that number to receive notifications.

Ensure that you test your notification rules to confirm that OnCall can reach you. For added reliability, consider backing up phone calls and SMS notifications with additional methods, such as the [Mobile app](/docs/grafana-cloud/alerting-and-irm/irm/irm-mobile-app/push-notifications/).
{{< /admonition >}}

{{< collapse title="Europe" >}}

| Country | SMS/Voice Support |
| ------- | ----------------- |
| Andorra (+376) | ✅ |
| Albania (+355) | ✅ |
| Austria (+43) | ✅ |
| Aland Islands (+35818) | ✅ |
| Bosnia and Herzegovina (+387) | ✅ |
| Belgium (+32) | ✅ |
| Bulgaria (+359) | ✅ |
| Belarus (+375) | ✅ |
| Switzerland (+41) | ✅ |
| Czechia (+420) | ✅ |
| Germany (+49) | ✅ |
| Denmark (+45) | ✅ |
| Estonia (+372) | ✅ |
| Spain (+34) | ✅ |
| Finland (+358) | ✅ |
| Faroe Islands (+298) | ❌ |
| France (+33) | ✅ |
| United Kingdom (+44) | ✅ |
| Guernsey (+441481) | ✅ |
| Gibraltar (+350) | ✅ |
| Greece (+30) | ✅ |
| Croatia (+385) | ✅ |
| Hungary (+36) | ✅ |
| Ireland (+353) | ✅ |
| Isle Of Man (+441624) | ✅ |
| Iceland (+354) | ✅ |
| Italy (+39) | ✅ |
| Jersey (+441534) | ✅ |
| Liechtenstein (+423) | ❌ |
| Lithuania (+370) | ✅ |
| Luxembourg (+352) | ✅ |
| Latvia (+371) | ✅ |
| Monaco (+377) | ❌ |
| Moldova (Republic of) (+373) | ✅ |
| Montenegro (+382) | ✅ |
| North Macedonia (Republic of North Macedonia) (+389) | ✅ |
| Malta (+356) | ✅ |
| Netherlands (+31) | ✅ |
| Norway (+47) | ✅ |
| Poland (+48) | ✅ |
| Portugal (+351) | ✅ |
| Romania (+40) | ✅ |
| Serbia (+381) | ✅ |
| Russian Federation (+7) | ✅ |
| Sweden (+46) | ✅ |
| Slovenia (+386) | ✅ |
| Svalbard and Jan Mayen Islands (+4779) | ✅ |
| Slovakia (+421) | ✅ |
| San Marino (+378) | ❌ |
| Turkey (+90) | ✅ |
| Ukraine (+380) | ✅ |
| Holy See (Vatican City State) (+3906698) | ❌ |
| Kosovo (+383) | ❌ |
| Yugoslavia (+38) | ✅ |

{{< /collapse >}}

{{< collapse title="Asia" >}}

| Country | SMS Support |
| ------- | ----------- |
| United Arab Emirates (+971) | ✅ |
| Afghanistan (+93) | ❌ |
| Armenia (+374) | ❌ |
| Azerbaijan (+994) | ❌ |
| Bangladesh (+880) | ❌ |
| Bahrain (+973) | ✅ |
| Brunei Darussalam (+673) | ✅ |
| Bhutan (+975) | ✅ |
| Cocos (Keeling) Islands (+672) | ✅ |
| China (+86) | ❌ |
| Christmas Island (+6189164) | ✅ |
| Cyprus (+357) | ✅ |
| Georgia (+995) | ✅ |
| Hong Kong (Special Administrative Region of China) (+852) | ✅ |
| Indonesia (+62) | ✅ |
| Israel (+972) | ✅ |
| India (+91) | ✅ |
| British Indian Ocean Territory (+246) | ❌ |
| Iraq (+964) | ❌ |
| Iran (Islamic Republic of) (+98) | ❌ |
| Jordan (+962) | ❌ |
| Japan (+81) | ✅ |
| Kyrgyzstan (+996) | ❌ |
| Cambodia (+855) | ❌ |
| Democratic People's Republic of Korea (+850) | ❌ |
| Republic of Korea (+82) | ✅ |
| Kuwait (+965) | ❌ |
| Kazakhstan (+7) | ✅ |
| Lao People's Democratic Republic (+856) | ❌ |
| Lebanon (+961) | ❌ |
| Sri Lanka (+94) | ❌ |
| Myanmar (+95) | ❌ |
| Mongolia (+976) | ❌ |
| Macau (Special Administrative Region of China) (+853) | ✅ |
| Maldives (+960) | ✅ |
| Malaysia (+60) | ✅ |
| Nepal (+977) | ❌ |
| Oman (+968) | ❌ |
| Philippines (+63) | ✅ |
| Pakistan (+92) | ❌ |
| Palestinian Territory (Occupied) (+970) | ❌ |
| Qatar (+974) | ✅ |
| Saudi Arabia (+966) | ✅ |
| Singapore (+65) | ✅ |
| Syrian Arab Republic (+963) | ❌ |
| Thailand (+66) | ✅ |
| Tajikistan (+992) | ❌ |
| Timor-Leste (East Timor) (+670) | ❌ |
| Turkmenistan (+993) | ❌ |
| Taiwan (Province of China) (+886) | ✅ |
| Uzbekistan (+998) | ❌ |
| Vietnam (+84) | ❌ |
| Yemen (+967) | ❌ |

{{< /collapse >}}

{{< collapse title="North America" >}}

| Country | SMS Support |
| ------- | ----------- |
| Antigua and Barbuda (+1268) | ❌ |
| Anguilla (+1264) | ❌ |
| Netherlands Antilles (+599) | ✅ |
| Aruba (+297) | ✅ |
| Barbados (+1246) | ✅ |
| Saint Barthelemy (+590) | ✅ |
| Bermuda (+1441) | ✅ |
| Bonaire, Sint Eustatius And Saba (+5993) | ✅ |
| Bahamas (+1242) | ✅ |
| Belize (+501) | ❌ |
| Canada (+1) | ✅ |
| Costa Rica (+506) | ✅ |
| Cuba (+53) | ✅ |
| Dominica (+1767) | ❌ |
| Dominican Republic (+1809) | ✅ |
| Grenada (+1473) | ❌ |
| Greenland (+299) | ❌ |
| Guadeloupe (+590) | ❌ |
| Guatemala (+502) | ✅ |
| Honduras (+504) | ❌ |
| Haiti (+509) | ✅ |
| Jamaica (+1876) | ✅ |
| Saint Kitts and Nevis (+1869) | ❌ |
| Cayman Islands (+1345) | ✅ |
| Saint Lucia (+1758) | ❌ |
| Saint Martin French (+590) | ❌ |
| Martinique (+596) | ❌ |
| Montserrat (+1664) | ❌ |
| Mexico (+52) | ✅ |
| Nicaragua (+505) | ✅ |
| Panama (+507) | ✅ |
| Saint Pierre and Miquelon (+508) | ❌ |
| Puerto Rico (+1787) | ✅ |
| El Salvador (+503) | ❌ |
| Sint Maarten Dutch (+1721) | ✅ |
| Turks and Caicos Islands (+1649) | ❌ |
| Trinidad and Tobago (+1868) | ✅ |
| United States (+1) | ✅ |
| Saint Vincent and the Grenadines (+1784) | ❌ |
| Virgin Islands British (+1284) | ❌ |
| Virgin Islands US (+1340) | ❌ |

{{< /collapse >}}

{{< collapse title="Africa" >}}

| Country | SMS Support |
| ------- | ----------- |
| Angola (+244) | ❌ |
| Burkina Faso (+226) | ✅ |
| Burundi (+257) | ❌ |
| Benin (+229) | ❌ |
| Botswana (+267) | ✅ |
| Democratic Republic of the Congo (+243) | ❌ |
| Central African Republic (+236) | ❌ |
| Congo (+242) | ❌ |
| Cote d'Ivoire (+225) | ✅ |
| Cameroon (+237) | ❌ |
| Cape Verde (+238) | ✅ |
| Djibouti (+253) | ✅ |
| Algeria (+213) | ❌ |
| Egypt (+20) | ❌ |
| Western Sahara (+212) | ✅ |
| Eritrea (+291) | ❌ |
| Ethiopia (+251) | ❌ |
| Gabon (+241) | ❌ |
| Ghana (+233) | ❌ |
| Gambia (+220) | ❌ |
| Guinea (+224) | ❌ |
| Equatorial Guinea (+240) | ❌ |
| Guinea-Bissau (+245) | ❌ |
| Kenya (+254) | ✅ |
| Comoros (+269) | ❌ |
| Liberia (+231) | ❌ |
| Lesotho (+266) | ❌ |
| Libyan Arab Jamahiriya (+218) | ❌ |
| Morocco (+212) | ❌ |
| Madagascar (+261) | ❌ |
| Mali (+223) | ❌ |
| Mauritania (+222) | ✅ |
| Mauritius (+230) | ❌ |
| Malawi (+265) | ❌ |
| Mozambique (+258) | ✅ |
| Namibia (+264) | ✅ |
| Niger (+227) | ❌ |
| Nigeria (+234) | ✅ |
| Reunion (+262) | ❌ |
| Rwanda (+250) | ❌ |
| Seychelles (+248) | ❌ |
| Sudan (+249) | ❌ |
| Saint Helena (+290) | ✅ |
| Sierra Leone (+232) | ❌ |
| Senegal (+221) | ❌ |
| Somalia (+252) | ✅ |
| South Sudan (+211) | ❌ |
| Sao Tome and Principe (+239) | ❌ |
| Swaziland (+268) | ✅ |
| Chad (+235) | ❌ |
| Togo (+228) | ✅ |
| Tunisia (+216) | ✅ |
| Tanzania (United Republic of) (+255) | ✅ |
| Uganda (+256) | ✅ |
| Mayotte (+262269) | ✅ |
| South Africa (+27) | ✅ |
| Zambia (+260) | ❌ |
| Zimbabwe (+263) | ❌ |

{{< /collapse >}}

{{< collapse title="Antarctica" >}}

| Country | SMS Support |
| ------- | ----------- |
| Antarctica (+672) | ✅ |
| Bouvet Island (+47) | ✅ |
| South Georgia and The South Sandwich Islands (+500) | ✅ |
| Heard Island and McDonald Islands (+61) | ✅ |
| French Southern Territories (+1) | ✅ |

{{< /collapse >}}

{{< collapse title="South America" >}}

| Country | SMS Support |
| ------- | ----------- |
| Argentina (+54) | ✅ |
| Bolivia (+591) | ❌ |
| Brazil (+55) | ✅ |
| Chile (+56) | ✅ |
| Colombia (+57) | ✅ |
| Ecuador (+593) | ✅ |
| Falkland Islands (Malvinas) (+500) | ❌ |
| French Guiana (+594) | ❌ |
| Guyana (+592) | ❌ |
| Peru (+51) | ✅ |
| Paraguay (+595) | ✅ |
| Suriname (+597) | ✅ |
| Uruguay (+598) | ✅ |
| Venezuela (+58) | ✅ |

{{< /collapse >}}

{{< collapse title="Oceania" >}}

| Country | SMS Support |
| ------- | ----------- |
| American Samoa (+1684) | ❌ |
| Australia (+61) | ✅ |
| Cook Islands (+682) | ❌ |
| Curacao (+5999) | ✅ |
| Fiji (+679) | ✅ |
| Micronesia (Federated States of) (+691) | ❌ |
| Guam (+1671) | ✅ |
| Kiribati (+686) | ❌ |
| Marshall Islands (+692) | ❌ |
| Northern Mariana Islands (+1670) | ✅ |
| New Caledonia (+687) | ✅ |
| Norfolk Island (+672) | ✅ |
| Nauru (+674) | ❌ |
| Niue (+683) | ❌ |
| New Zealand (+64) | ✅ |
| French Polynesia (+689) | ✅ |
| Papua New Guinea (+675) | ❌ |
| Pitcairn (+64) | ✅ |
| Palau (+680) | ❌ |
| Solomon Islands (+677) | ❌ |
| Tokelau (+690) | ✅ |
| Tonga (+676) | ❌ |
| Tuvalu (+688) | ❌ |
| United States Minor Outlying Islands (+1) | ✅ |
| Vanuatu (+678) | ❌ |
| Wallis and Futuna Islands (+681) | ❌ |
| Samoa (+685) | ❌ |

{{< /collapse >}}

@@ -20,10 +20,13 @@ from apps.alerts.tasks import (
)
from apps.alerts.utils import is_declare_incident_step_enabled
from apps.schedules.ical_utils import list_users_to_notify_from_ical
from apps.user_management.models import User

if typing.TYPE_CHECKING:
    from apps.alerts.models.alert_group import AlertGroup
    from apps.schedules.models import OnCallSchedule
    from apps.slack.models import SlackUserGroup
    from apps.user_management.models import Team, User
    from apps.webhooks.models import Webhook


class EscalationPolicySnapshot:
@@ -57,24 +60,24 @@ class EscalationPolicySnapshot:

    def __init__(
        self,
        id,
        order,
        step,
        wait_delay,
        notify_to_users_queue,
        last_notified_user,
        from_time,
        to_time,
        num_alerts_in_window,
        num_minutes_in_window,
        custom_webhook,
        notify_schedule,
        notify_to_group,
        escalation_counter,
        passed_last_time,
        pause_escalation,
        notify_to_team_members=None,
        severity=None,
        id: int,
        order: int,
        step: int,
        wait_delay: typing.Optional[datetime.timedelta],
        notify_to_users_queue: typing.Optional[typing.Sequence["User"]],
        last_notified_user: typing.Optional["User"],
        from_time: typing.Optional[datetime.time],
        to_time: typing.Optional[datetime.time],
        num_alerts_in_window: typing.Optional[int],
        num_minutes_in_window: typing.Optional[int],
        custom_webhook: typing.Optional["Webhook"],
        notify_schedule: typing.Optional["OnCallSchedule"],
        notify_to_group: typing.Optional["SlackUserGroup"],
        escalation_counter: int,
        passed_last_time: typing.Optional[datetime.datetime],
        pause_escalation: bool,
        notify_to_team_members: typing.Optional["Team"] = None,
        severity: typing.Optional[str] = None,
    ):
        self.id = id
        self.order = order
@@ -107,11 +110,11 @@ class EscalationPolicySnapshot:
        return EscalationPolicy.objects.filter(pk=self.id).first()

    @property
    def sorted_users_queue(self) -> typing.List[User]:
    def sorted_users_queue(self) -> typing.List["User"]:
        return sorted(self.notify_to_users_queue, key=lambda user: (user.username or "", user.pk))

    @property
    def next_user_in_sorted_queue(self) -> User:
    def next_user_in_sorted_queue(self) -> "User":
        users_queue = self.sorted_users_queue
        try:
            last_user_index = users_queue.index(self.last_notified_user)
@@ -120,7 +123,7 @@ class EscalationPolicySnapshot:
            next_user = users_queue[(last_user_index + 1) % len(users_queue)]
        return next_user

    def execute(self, alert_group: "AlertGroup", reason) -> StepExecutionResultData:
    def execute(self, alert_group: "AlertGroup", reason: str) -> StepExecutionResultData:
        action_map: typing.Dict[typing.Optional[int], EscalationPolicySnapshot.StepExecutionFunc] = {
            EscalationPolicy.STEP_WAIT: self._escalation_step_wait,
            EscalationPolicy.STEP_FINAL_NOTIFYALL: self._escalation_step_notify_all,
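The `next_user_in_sorted_queue` hunk above advances through the sorted user queue round-robin style, wrapping with a modulo. A minimal standalone sketch of that selection (the `next_in_rotation` helper and its fallback to the first user are illustrative assumptions, not the module's actual code):

```python
def next_in_rotation(users: list, last_notified) -> object:
    """Pick the user after `last_notified`, wrapping around to the start."""
    try:
        last_index = users.index(last_notified)
    except ValueError:
        # one plausible fallback when the last-notified user left the queue
        return users[0]
    # same modulo wrap as in next_user_in_sorted_queue
    return users[(last_index + 1) % len(users)]

print(next_in_rotation(["alice", "bob", "carol"], "carol"))  # -> alice
```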
@@ -10,26 +10,69 @@ from apps.schedules.ical_utils import list_users_to_notify_from_ical

if typing.TYPE_CHECKING:
    from django.db.models.manager import RelatedManager

    from apps.alerts.escalation_snapshot.snapshot_classes import EscalationPolicySnapshot, EscalationSnapshot
    from apps.alerts.models import AlertGroup, AlertGroupLogRecord, ResolutionNote
    from apps.base.models import UserNotificationPolicy, UserNotificationPolicyLogRecord
    from apps.slack.models import SlackTeamIdentity
    from apps.user_management.models import User


class PlanLines(typing.TypedDict):
    plan_lines: typing.List[str]


class NotificationPlanLines(PlanLines):
    user_id: int
    plan_lines: typing.List[str]
    """
    rendered notification policy line
    """
    is_the_first_notification_step: bool


EscalationPlan = typing.Dict[timezone.timedelta, typing.List[PlanLines]]
"""
`dict` with `timedelta` as a key and values containing a `list` of escalation plan lines
"""

NotificationPlan = typing.Dict[timezone.timedelta, typing.List[NotificationPlanLines]]
"""
`dict` with `timedelta` as a key and values containing a `list` of notification plan lines
"""

MixedEscalationNotificationPlan = typing.Dict[
    timezone.timedelta, typing.List[typing.Union[PlanLines, NotificationPlanLines]]
]
"""
NOTE: this seems janky, maybe there's a better way for us to do this than jam two different object types into the same
list? Opportunity for future refactoring...
"""

FlattendEscalationPlan = typing.Dict[timezone.timedelta, typing.List[str]]

LogRecords = typing.List[typing.Union["AlertGroupLogRecord", "ResolutionNote", "UserNotificationPolicyLogRecord"]]

UsersToNotify = typing.Iterable["User"]
"""
`typing.Iterable` ([docs](https://docs.python.org/3/library/collections.abc.html#collections.abc.Iterable)):

ABC for classes that provide the __iter__() method

We just need this because some methods for fetching users return `QuerySet`s, others simply assign a list. In the end
we simply iterate over the users.
"""


class IncidentLogBuilder:
    def __init__(self, alert_group: "AlertGroup"):
        self.alert_group = alert_group

    def get_log_records_list(
        self, with_resolution_notes: bool = False
    ) -> typing.List[typing.Union["AlertGroupLogRecord", "ResolutionNote", "UserNotificationPolicyLogRecord"]]:
    def get_log_records_list(self, with_resolution_notes: bool = False) -> LogRecords:
        """
        Generates list of `AlertGroupLogRecord` and `UserNotificationPolicyLogRecord` logs.
        Generates list of `LogRecords`. `ResolutionNote`s are optionally included if `with_resolution_notes` is `True`.
        """
        all_log_records: LogRecords = list()

        `ResolutionNote`s are optionally included if `with_resolution_notes` is `True`.
        """
        all_log_records: typing.List[
            typing.Union["AlertGroupLogRecord", "ResolutionNote", "UserNotificationPolicyLogRecord"]
        ] = list()
        # get logs from AlertGroupLogRecord
        alert_group_log_records = self._get_log_records_for_after_resolve_report()
        all_log_records.extend(alert_group_log_records)
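The new type aliases above make the plan shape explicit: a dict keyed by `timedelta`, whose values are lists of dicts carrying `plan_lines`. A self-contained sketch of that shape and of the flattening the module's `FlattendEscalationPlan` represents (names simplified; the `flatten` helper is an illustrative assumption, not the module's code):

```python
import datetime
from typing import Dict, List, TypedDict

class PlanLines(TypedDict):
    plan_lines: List[str]

EscalationPlan = Dict[datetime.timedelta, List[PlanLines]]
FlattenedPlan = Dict[datetime.timedelta, List[str]]

def flatten(plan: EscalationPlan) -> FlattenedPlan:
    # collapse each timedelta bucket into a single list of rendered lines,
    # dropping buckets that end up empty
    out: FlattenedPlan = {}
    for td, entries in plan.items():
        lines = [line for entry in entries for line in entry["plan_lines"]]
        if lines:
            out[td] = lines
    return out

plan: EscalationPlan = {
    datetime.timedelta(minutes=5): [
        {"plan_lines": ["notify @alice"]},
        {"plan_lines": ["notify @bob"]},
    ]
}
# flatten(plan) -> {timedelta(minutes=5): ["notify @alice", "notify @bob"]}
```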
@@ -41,9 +84,9 @@ class IncidentLogBuilder:
        if with_resolution_notes:
            resolution_notes = self._get_resolution_notes()
            all_log_records.extend(resolution_notes)

        # sort logs by date
        all_log_records_sorted = sorted(all_log_records, key=lambda log: log.created_at)
        return all_log_records_sorted
        return sorted(all_log_records, key=lambda log: log.created_at)

    def _get_log_records_for_after_resolve_report(self) -> "RelatedManager['AlertGroupLogRecord']":
        from apps.alerts.models import AlertGroupLogRecord, EscalationPolicy
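`get_log_records_list` merges three record types into one list and orders it by `created_at`, exactly as the simplified return above shows. A minimal sketch of that merge-and-sort, with a hypothetical dataclass standing in for the Django models:

```python
import datetime
from dataclasses import dataclass

@dataclass
class Record:  # stands in for AlertGroupLogRecord / ResolutionNote / etc.
    name: str
    created_at: datetime.datetime

a = Record("escalation triggered", datetime.datetime(2024, 1, 1, 12, 0))
b = Record("resolution note", datetime.datetime(2024, 1, 1, 11, 30))
c = Record("user notified", datetime.datetime(2024, 1, 1, 12, 15))

# heterogeneous records sort fine as long as they all expose created_at
merged = sorted([a, b, c], key=lambda log: log.created_at)
# merged order: resolution note, escalation triggered, user notified
```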
@@ -120,25 +163,22 @@ class IncidentLogBuilder:
            "created_at"
        )

    def get_incident_escalation_plan(self, for_slack=False):
        """
        Generates dict with escalation plan with timedelta as keys and list with plan lines as values
        :param for_slack: (bool) add user slack id to plan line or not
        :return:
        """
        incident_escalation_plan = dict()
        incident_escalation_plan = self._add_invitation_plan(incident_escalation_plan, for_slack=for_slack)
        if not self.alert_group.acknowledged and not self.alert_group.is_silenced_forever:
            incident_escalation_plan = self._add_escalation_plan(incident_escalation_plan, for_slack=for_slack)
        final_incident_escalation_plan = self._finalize_escalation_plan_dict(incident_escalation_plan)
        return final_incident_escalation_plan
    def get_escalation_plan(self, for_slack: bool = False) -> FlattendEscalationPlan:
        escalation_plan: EscalationPlan = dict()
        escalation_plan = self._add_invitation_plan(escalation_plan, for_slack=for_slack)

    def _add_escalation_plan(self, escalation_plan_dict, for_slack=False):
        if not self.alert_group.acknowledged and not self.alert_group.is_silenced_forever:
            escalation_plan = self._add_escalation_plan(escalation_plan, for_slack=for_slack)

        return self._finalize_escalation_plan(escalation_plan)

    def _add_escalation_plan(
        self,
        escalation_plan: MixedEscalationNotificationPlan,
        for_slack: bool = False,
    ) -> MixedEscalationNotificationPlan:
        """
        Returns plan for future escalations
        :param escalation_plan_dict:
        :param for_slack:
        :return: {timedelta: [{"user_id": user.pk, "plan_lines": [#rendered escalation policy line, ]}, ..., ...], ...}
        """
        esc_timedelta = timezone.timedelta(seconds=0)  # timedelta for next escalation step
        now = timezone.now()
@@ -147,7 +187,7 @@ class IncidentLogBuilder:
        # We cannot generate escalation plan in this case
        escalation_snapshot = self.alert_group.escalation_snapshot
        if not self.alert_group.has_escalation_policies_snapshots:
            return escalation_plan_dict
            return escalation_plan

        if self.alert_group.silenced_until:
            timedelta = self.alert_group.silenced_until - now
@@ -160,18 +200,22 @@ class IncidentLogBuilder:
        stop_escalation_log_pk = stop_escalation_log.pk if stop_escalation_log else 0

        # render escalation plan from escalation_snapshot
        escalation_plan_dict = self._render_escalation_plan_from_escalation_snapshot(
            escalation_plan_dict,
        return self._render_escalation_plan_from_escalation_snapshot(
            escalation_plan,
            stop_escalation_log_pk,
            esc_timedelta,
            escalation_snapshot,
            for_slack,
        )
        return escalation_plan_dict

    def _render_escalation_plan_from_escalation_snapshot(
        self, escalation_plan_dict, stop_escalation_log_pk, esc_timedelta, escalation_snapshot, for_slack=False
    ):
        self,
        escalation_plan: EscalationPlan,
        stop_escalation_log_pk: int,
        esc_timedelta: timezone.timedelta,
        escalation_snapshot: "EscalationSnapshot",
        for_slack=False,
    ) -> MixedEscalationNotificationPlan:
        from apps.alerts.models import EscalationPolicy

        now = timezone.now()
@@ -203,9 +247,11 @@ class IncidentLogBuilder:

        if len(escalation_policies_snapshots) > 0 and not escalation_eta:
            future_step_timedelta = None

            for escalation_policy_snapshot in escalation_policies_snapshots:
                step_timedelta = esc_timedelta
                future_step = escalation_policy_snapshot.order >= escalation_policy_order  # step not passed yet

                if future_step and escalation_policy_snapshot.step == EscalationPolicy.STEP_WAIT:
                    wait_delay = escalation_policy_snapshot.wait_delay or EscalationPolicy.DEFAULT_WAIT_DELAY
                    esc_timedelta += wait_delay  # increase timedelta for next steps
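In the loop above, every future `STEP_WAIT` pushes all later steps out by its `wait_delay` (or a default). That accumulation can be sketched in isolation (the 5-minute default and the `("kind", delay)` tuples are assumptions for the sketch; the real default lives on `EscalationPolicy.DEFAULT_WAIT_DELAY`):

```python
import datetime

DEFAULT_WAIT_DELAY = datetime.timedelta(minutes=5)  # assumed value for this sketch

# (step_kind, wait_delay) pairs standing in for escalation policy snapshots
steps = [
    ("notify", None),
    ("wait", datetime.timedelta(minutes=10)),
    ("notify", None),
    ("wait", None),  # falls back to the default delay
    ("notify", None),
]

esc_timedelta = datetime.timedelta(0)
offsets = []
for kind, wait_delay in steps:
    offsets.append(esc_timedelta)  # offset at which this step fires
    if kind == "wait":
        esc_timedelta += wait_delay or DEFAULT_WAIT_DELAY  # shift later steps

# offsets: 0, 0, 10min, 10min, 15min
```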
@@ -215,6 +261,7 @@ class IncidentLogBuilder:
                    future_step_timedelta = esc_timedelta - last_log_timedelta
                elif not future_step:
                    passed_last_time = escalation_policy_snapshot.passed_last_time

                    if passed_last_time is not None:
                        step_timedelta = esc_timedelta - (now - passed_last_time)
                    else:
@@ -224,8 +271,8 @@ class IncidentLogBuilder:

                # stop plan generation if there is resolve step in escalation plan
                if future_step and escalation_policy_snapshot.step == EscalationPolicy.STEP_FINAL_RESOLVE:
                    escalation_plan_dict = IncidentLogBuilder._remove_future_plan(esc_timedelta, escalation_plan_dict)
                    escalation_step_plan_dict = self._render_escalation_step_plan_from_escalation_policy_snapshot(
                    escalation_plan = IncidentLogBuilder._remove_future_plan(esc_timedelta, escalation_plan)
                    escalation_step_plan = self._render_escalation_step_plan_from_escalation_policy_snapshot(
                        escalation_policy_snapshot,
                        escalation_snapshot,
                        for_slack=for_slack,
@@ -234,45 +281,48 @@ class IncidentLogBuilder:
                    )
                    step_timedelta += timezone.timedelta(seconds=5)  # make this step the last in plan

                    for timedelta, plan in escalation_step_plan_dict.items():
                    for timedelta, plan in escalation_step_plan.items():
                        timedelta += step_timedelta
                        escalation_plan_dict.setdefault(timedelta, []).extend(plan)
                        escalation_plan.setdefault(timedelta, []).extend(plan)
                    break

                # render escalation and notification plan lines for step
                escalation_step_plan_dict = self._render_escalation_step_plan_from_escalation_policy_snapshot(
                escalation_step_plan = self._render_escalation_step_plan_from_escalation_policy_snapshot(
                    escalation_policy_snapshot,
                    escalation_snapshot,
                    for_slack=for_slack,
                    future_step=future_step,
                    esc_timedelta=step_timedelta,
                )
                escalation_plan_dict = self._correct_users_notification_plan(
                    escalation_plan_dict, escalation_step_plan_dict, step_timedelta

                # NOTE: should probably refactor this? `_correct_users_notification_plan` expects a `NotificationPlan`
                # as its second arg, `escalation_step_plan` is of type
                escalation_plan = self._correct_users_notification_plan(
                    escalation_plan, escalation_step_plan, step_timedelta
                )
        return escalation_plan_dict

        return escalation_plan

    @staticmethod
    def _remove_future_plan(timedelta_to_remove, plan_dict):
    def _remove_future_plan(timedelta_to_remove: timezone.timedelta, escalation_plan: EscalationPlan) -> EscalationPlan:
        """
        Removes plan with higher timedelta (for events, that will start later, than selected time
        (timedelta_to_remove)).
        :param timedelta_to_remove:
        :param plan_dict:
        :return: new plan dict
        Removes plan with higher timedelta (for events, that will start later, than selected time (timedelta_to_remove))
        """
        new_plan_dict = dict()
        for timedelta in sorted(plan_dict):
            if timedelta <= timedelta_to_remove:
                new_plan_dict[timedelta] = plan_dict[timedelta]
        return new_plan_dict
        new_plan: EscalationPlan = dict()

    def _add_invitation_plan(self, escalation_plan_dict, for_slack=False):
        for timedelta in sorted(escalation_plan):
            if timedelta <= timedelta_to_remove:
                new_plan[timedelta] = escalation_plan[timedelta]

        return new_plan

    def _add_invitation_plan(
        self,
        escalation_plan: EscalationPlan,
        for_slack: bool = False,
    ) -> MixedEscalationNotificationPlan:
        """
        Adds notification plan for invitation
        :param escalation_plan_dict:
        :param for_slack:
        :return: {timedelta: [{"user_id": user.pk, "plan_lines": [#rendered escalation policy line, ]}, ..., ...], ...}
        """
        from apps.alerts.models import Invitation
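`_remove_future_plan` above keeps only the buckets at or before a cutoff, which matters once a `STEP_FINAL_RESOLVE` is reached. The same filter in isolation (a sketch with plain dicts, not the module's typed code):

```python
import datetime

def remove_future_plan(cutoff: datetime.timedelta, plan: dict) -> dict:
    # keep only buckets scheduled at or before the cutoff
    return {td: plan[td] for td in sorted(plan) if td <= cutoff}

plan = {
    datetime.timedelta(minutes=0): ["notify @alice"],
    datetime.timedelta(minutes=15): ["notify @bob"],
    datetime.timedelta(minutes=30): ["repeat escalation"],
}
trimmed = remove_future_plan(datetime.timedelta(minutes=15), plan)
# trimmed keeps the 0- and 15-minute buckets; the 30-minute bucket is dropped
```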
@@ -280,6 +330,7 @@ class IncidentLogBuilder:
        for invitation in self.alert_group.invitations.filter(is_active=True):
            invitation_timedelta = timezone.timedelta()
            current_attempt = invitation.attempt - 1

            # generate notification plan for each attempt
            for attempt in range(current_attempt, Invitation.ATTEMPTS_LIMIT + 1):
                notification_plan = self._get_notification_plan_for_user(
@@ -287,99 +338,107 @@ class IncidentLogBuilder:
                    for_slack=for_slack,
                    future_step=attempt >= invitation.attempt,
                )
                escalation_plan_dict = self._correct_users_notification_plan(
                    escalation_plan_dict, notification_plan, invitation_timedelta
                escalation_plan = self._correct_users_notification_plan(
                    escalation_plan, notification_plan, invitation_timedelta
                )

                started_timedelta = now - invitation.created_at
                invitation_timedelta += Invitation.get_delay_by_attempt(attempt) - started_timedelta
        return escalation_plan_dict

    def _correct_users_notification_plan(self, escalation_plan_dict, notification_plan_dict, esc_time):
        return escalation_plan

    def _correct_users_notification_plan(
        self,
        escalation_plan: EscalationPlan,
        notification_plan: NotificationPlan,
        esc_time: timezone.timedelta,
    ) -> MixedEscalationNotificationPlan:
        """
        Check if escalation_plan_dict has user notification events with higher timedelta
        than timedelta of current step. If it has, remove future notification events for users that
        repeatedly notified by current escalation step from current escalation_plan_dict
        because their notification chain will start from the beginning.
        Check if `escalation_plan` has user notification events with higher `timedelta` than `timedelta` of
        current step.

        :param escalation_plan_dict:
        :param notification_plan_dict:
        :param esc_time:
        :return:
        If it has, remove future notification events for users that repeatedly notified by current escalation step from
        current `escalation_plan` because their notification chain will start from the beginning.
        """

        future_step_timedelta = None

        later_events_exist = False
        for timedelta in escalation_plan_dict:
        for timedelta in escalation_plan:
            if timedelta > esc_time:
                later_events_exist = True
                break
        if later_events_exist:
            earliest_events = notification_plan_dict.get(timezone.timedelta(), [])
            notification_plans_to_remove = []
            for event_dict in earliest_events:  # [{"user_id": user.pk, "plan_lines": []}, {"plan_lines": []}]
                user_id = event_dict.get("user_id")
                if user_id:
                    notification_plans_to_remove.append(user_id)
            new_escalation_policies_dict = {}

            for timedelta in sorted(escalation_plan_dict):
        if later_events_exist:
            earliest_events = notification_plan.get(timezone.timedelta(), [])
            notification_plans_to_remove: list[int] = []

            for event_dict in earliest_events:
                if user_id := event_dict.get("user_id"):
                    notification_plans_to_remove.append(user_id)

            new_escalation_plan: EscalationPlan = {}

            for timedelta in sorted(escalation_plan):
                # do not add step from escalation plan if its timedelta < 0
                if timedelta < timezone.timedelta():
                    continue
                events_list = list()
                for event_dict in escalation_plan_dict[timedelta]:
                    if event_dict.get("is_the_first_notification_step"):

                events_list: list[PlanLines] = list()
                for plan in escalation_plan[timedelta]:
                    if plan.get("is_the_first_notification_step"):
                        if (
                            future_step_timedelta is None
                            and timedelta > esc_time
                            and event_dict.get("user_id") in notification_plans_to_remove
                            and plan.get("user_id") in notification_plans_to_remove
                        ):
                            future_step_timedelta = timedelta
                    if (
                        timedelta < esc_time
                        or event_dict.get("user_id") not in notification_plans_to_remove
                        or plan.get("user_id") not in notification_plans_to_remove
                        or future_step_timedelta is not None
                    ):
                        events_list.append(event_dict)
                        events_list.append(plan)

                if len(events_list) > 0:
                    new_escalation_policies_dict.setdefault(timedelta, []).extend(events_list)
                    new_escalation_plan.setdefault(timedelta, []).extend(events_list)

            escalation_plan_dict = new_escalation_policies_dict
            escalation_plan = new_escalation_plan

        for timedelta, plan in notification_plan_dict.items():
        for timedelta, plan in notification_plan.items():
            timedelta = esc_time + timedelta

            if future_step_timedelta is None or future_step_timedelta > timedelta:
                escalation_plan_dict.setdefault(timedelta, []).extend(plan)
                escalation_plan.setdefault(timedelta, []).extend(plan)

        return escalation_plan_dict
        return escalation_plan

    def _finalize_escalation_plan_dict(self, escalation_dict):
    def _finalize_escalation_plan(self, escalation_plan: MixedEscalationNotificationPlan) -> FlattendEscalationPlan:
        """
        It changes escalation dict structure
        from {timedelta: [{"user_id": user.pk, "plan_lines": []}, {"plan_lines": []}]}
        to {timedelta: [all plan lines for this timedelta]}
        :param escalation_dict:
        :return:
        It transforms `escalation_plan` from `MixedEscalationNotificationPlan` to `FlattendEscalationPlan`
        """
        final_escalation_dict = dict()
        for timedelta in escalation_dict:
            plan_lines_list = list()
            for event_dict in escalation_dict[timedelta]:
                plan_lines_list.extend(event_dict["plan_lines"])
            if len(plan_lines_list) > 0:
        final_escalation_plan: FlattendEscalationPlan = dict()

        for timedelta in escalation_plan:
            flattened_plan_lines: list[str] = list()

            for plan_lines in escalation_plan[timedelta]:
                flattened_plan_lines.extend(plan_lines["plan_lines"])

            if len(flattened_plan_lines) > 0:
|
||||
timedelta = timedelta if timedelta > timezone.timedelta() else timezone.timedelta()
|
||||
final_escalation_dict.setdefault(timedelta, []).extend(plan_lines_list)
|
||||
return final_escalation_dict
|
||||
final_escalation_plan.setdefault(timedelta, []).extend(flattened_plan_lines)
|
||||
|
||||
return final_escalation_plan
|
||||
|
||||
def _render_escalation_step_plan_from_escalation_policy_snapshot(
|
||||
self,
|
||||
escalation_policy_snapshot,
|
||||
escalation_snapshot,
|
||||
for_slack=False,
|
||||
future_step=False,
|
||||
esc_timedelta=None,
|
||||
):
|
||||
escalation_policy_snapshot: "EscalationPolicySnapshot",
|
||||
escalation_snapshot: "EscalationSnapshot",
|
||||
for_slack: bool = False,
|
||||
future_step: bool = False,
|
||||
esc_timedelta: typing.Optional[timezone.timedelta] = None,
|
||||
) -> EscalationPlan:
|
||||
"""
|
||||
Renders escalation and notification policies plan dict.
|
||||
|
||||
|
|
@ -388,112 +447,128 @@ class IncidentLogBuilder:
|
|||
         :param for_slack: (bool) add or not user slack id to user notification plan line
         :param future_step: (bool) step not passed yet
         :param esc_timedelta: timedelta of escalation step

         :return: dict with timedelta as a key and list with escalation and notification plan lines as a value
         """
         from apps.alerts.models import EscalationPolicy

-        escalation_plan_dict = {}
+        escalation_plan: EscalationPlan = {}
         timedelta = timezone.timedelta()

         if escalation_policy_snapshot.step in [
             EscalationPolicy.STEP_NOTIFY_MULTIPLE_USERS,
             EscalationPolicy.STEP_NOTIFY_MULTIPLE_USERS_IMPORTANT,
             EscalationPolicy.STEP_NOTIFY_USERS_QUEUE,
         ]:
-            users_to_notify = escalation_policy_snapshot.sorted_users_queue
+            users_to_notify: UsersToNotify = escalation_policy_snapshot.sorted_users_queue

             if future_step:
                 if users_to_notify:
                     plan_line = f'escalation step "{escalation_policy_snapshot.step_display}"'

                     if escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_USERS_QUEUE:
                         try:
                             last_user_index = users_to_notify.index(escalation_policy_snapshot.last_notified_user)
                         except ValueError:
                             last_user_index = -1

                         user_to_notify = users_to_notify[(last_user_index + 1) % len(users_to_notify)]
                         users_to_notify = [user_to_notify]

                 else:
                     plan_line = (
                         f'escalation step "{escalation_policy_snapshot.step_display}" with no recipients. ' f"Skipping"
                     )
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})
+
             elif escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_USERS_QUEUE:
                 last_notified_user = escalation_policy_snapshot.last_notified_user
                 users_to_notify = [last_notified_user] if last_notified_user else []

             for user_to_notify in users_to_notify:
-                notification_plan_dict = self._get_notification_plan_for_user(
+                notification_plan = self._get_notification_plan_for_user(
                     user_to_notify,
                     important=escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_MULTIPLE_USERS_IMPORTANT,
                     for_slack=for_slack,
                     future_step=future_step,
                 )
-                # notification_plan_dict structure - {timedelta: [{"user_id": user.pk, "plan_lines": []}]
-                for timedelta, notification_plan in notification_plan_dict.items():
-                    escalation_plan_dict.setdefault(timedelta, []).extend(notification_plan)
+
+                for timedelta, plan in notification_plan.items():
+                    escalation_plan.setdefault(timedelta, []).extend(plan)

         elif escalation_policy_snapshot.step == EscalationPolicy.STEP_FINAL_NOTIFYALL:
             channel_id = escalation_snapshot.slack_channel_id
-            users_to_notify = []
+            final_notify_all_users_to_notify: UsersToNotify = []

             if future_step:
                 if self.alert_group.is_presented_in_slack and channel_id:
                     plan_line = f'escalation step "{escalation_policy_snapshot.step_display}"'
-                    slack_team_identity = self.alert_group.slack_message.slack_team_identity
-                    users_to_notify = slack_team_identity.get_users_from_slack_conversation_for_organization(
-                        channel_id=channel_id,
-                        organization=self.alert_group.channel.organization,
+
+                    # "technically" `slack_message` could be `None` here, but because of the conditional check above
+                    # it's safe to cast this to `SlackTeamIdentity`
+                    slack_team_identity: SlackTeamIdentity = self.alert_group.slack_message.slack_team_identity
+
+                    final_notify_all_users_to_notify = (
+                        slack_team_identity.get_users_from_slack_conversation_for_organization(
+                            channel_id=channel_id,
+                            organization=self.alert_group.channel.organization,
+                        )
                     )
                 else:
                     plan_line = (
                         f'escalation step "{escalation_policy_snapshot.step_display}" is slack specific. ' f"Skipping"
                     )
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
-            else:
-                users_to_notify = escalation_policy_snapshot.notify_to_users_queue
-
-            for user_to_notify in users_to_notify:
-                notification_plan_dict = self._get_notification_plan_for_user(
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})
+            else:
+                final_notify_all_users_to_notify = escalation_policy_snapshot.notify_to_users_queue
+
+            for user_to_notify in final_notify_all_users_to_notify:
+                notification_plan = self._get_notification_plan_for_user(
                     user_to_notify,
                     important=escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_IMPORTANT,
                     for_slack=for_slack,
                     future_step=future_step,
                 )
-                # notification_plan_dict structure - {timedelta: [{"user_id": user.pk, "plan_lines": []}]
-                for timedelta, notification_plan in notification_plan_dict.items():
-                    escalation_plan_dict.setdefault(timedelta, []).extend(notification_plan)
+
+                for timedelta, plan in notification_plan.items():
+                    escalation_plan.setdefault(timedelta, []).extend(plan)

         elif escalation_policy_snapshot.step == EscalationPolicy.STEP_FINAL_RESOLVE:
             if future_step:
-                plan_line = "resolve automatically"
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": ["resolve automatically"]})

         elif escalation_policy_snapshot.step == EscalationPolicy.STEP_REPEAT_ESCALATION_N_TIMES:
             if future_step:
                 escalation_counter = escalation_policy_snapshot.escalation_counter
                 repeat_times = EscalationPolicy.MAX_TIMES_REPEAT - escalation_counter

                 if repeat_times > 0:
                     plan_line = f"repeat escalation from the beginning ({repeat_times} times)"
                 else:
                     plan_line = 'skip step "Repeat Escalation"'
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})

         elif escalation_policy_snapshot.step in [
             EscalationPolicy.STEP_NOTIFY_GROUP,
             EscalationPolicy.STEP_NOTIFY_GROUP_IMPORTANT,
         ]:
-            users_to_notify = []
+            notify_group_users_to_notify: UsersToNotify = []

             if future_step:
                 if self.alert_group.is_presented_in_slack:
                     user_group = escalation_policy_snapshot.notify_to_group

                     if user_group is not None:
-                        users_to_notify = user_group.get_users_from_members_for_organization(
+                        notify_group_users_to_notify = user_group.get_users_from_members_for_organization(
                             self.alert_group.channel.organization
                         )
                         user_group_handle = user_group.handle
                         important_text = ""

                         if escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_GROUP_IMPORTANT:
                             important_text = " (Important)"

                         plan_line = f'escalation step "Notify @{user_group_handle} User Group{important_text}"'
                 else:
                     plan_line = (

@@ -504,58 +579,65 @@ class IncidentLogBuilder:
                     plan_line = (
                         f'escalation step "{escalation_policy_snapshot.step_display}" is slack specific. Skipping'
                     )
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
-            else:
-                users_to_notify = escalation_policy_snapshot.notify_to_users_queue
-
-            for user_to_notify in users_to_notify:
-                notification_plan_dict = self._get_notification_plan_for_user(
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})
+            else:
+                notify_group_users_to_notify = escalation_policy_snapshot.notify_to_users_queue
+
+            for user_to_notify in notify_group_users_to_notify:
+                notification_plan = self._get_notification_plan_for_user(
                     user_to_notify,
                     important=escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_GROUP_IMPORTANT,
                     for_slack=for_slack,
                     future_step=future_step,
                 )
-                for timedelta, notification_plan in notification_plan_dict.items():
-                    escalation_plan_dict.setdefault(timedelta, []).extend(notification_plan)
+
+                for timedelta, plan in notification_plan.items():
+                    escalation_plan.setdefault(timedelta, []).extend(plan)

         elif escalation_policy_snapshot.step in [
             EscalationPolicy.STEP_NOTIFY_SCHEDULE,
             EscalationPolicy.STEP_NOTIFY_SCHEDULE_IMPORTANT,
         ]:
             schedule = escalation_policy_snapshot.notify_schedule
-            users_oncall = []
+            users_oncall: UsersToNotify = []

             if future_step:
                 if schedule is not None:
                     step_datetime = timezone.now() + esc_timedelta
                     users_oncall = list_users_to_notify_from_ical(schedule, step_datetime)
                     important_text = ""

                     if escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_SCHEDULE_IMPORTANT:
                         important_text = " (Important)"

                     plan_line = f"escalation step \"Notify on-call from Schedule '{schedule.name}'{important_text}\""
                     if users_oncall is None:
                         plan_line += ", but iCal import was failed. Skipping"
                     elif len(users_oncall) == 0:
                         plan_line += ", but there are no users to notify for this schedule slot. Skipping"

                 else:
                     plan_line = (
                         f'escalation step "{escalation_policy_snapshot.step_display}", but schedule is '
                         f"unspecified. Skipping"
                     )
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})
             else:
                 users_oncall = escalation_policy_snapshot.notify_to_users_queue

             for user_to_notify in users_oncall:
-                notification_plan_dict = self._get_notification_plan_for_user(
+                notification_plan = self._get_notification_plan_for_user(
                     user_to_notify,
                     for_slack=for_slack,
                     important=escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_SCHEDULE_IMPORTANT,
                     future_step=future_step,
                 )
-                # notification_plan_dict structure - {timedelta: [{"user_id": user.pk, "plan_lines": []}]
-                for timedelta, notification_plan in notification_plan_dict.items():
-                    escalation_plan_dict.setdefault(timedelta, []).extend(notification_plan)
+
+                for timedelta, plan in notification_plan.items():
+                    escalation_plan.setdefault(timedelta, []).extend(plan)

         elif escalation_policy_snapshot.step == EscalationPolicy.STEP_TRIGGER_CUSTOM_WEBHOOK:
             if future_step:
                 custom_webhook = escalation_policy_snapshot.custom_webhook

@@ -563,32 +645,34 @@ class IncidentLogBuilder:
                     plan_line = f'trigger outgoing webhook "{custom_webhook.name}"'
                 else:
                     plan_line = f'escalation step "{escalation_policy_snapshot.step_display}" but outgoing webhook is unspecified. Skipping'
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})

         elif escalation_policy_snapshot.step == EscalationPolicy.STEP_NOTIFY_IF_TIME:
             if future_step:
                 if escalation_policy_snapshot.from_time is not None and escalation_policy_snapshot.to_time is not None:
                     plan_line = 'escalation step "Continue escalation if time"'
                 else:
                     plan_line = 'escalation step "Continue escalation if time", but time is not configured. Skipping'
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
+
+                escalation_plan.setdefault(timedelta, []).append({"plan_lines": [plan_line]})

         elif escalation_policy_snapshot.step is None:
             if future_step:
-                plan_line = "escalation step is unspecified. Skipping"
-                plan = {"plan_lines": [plan_line]}
-                escalation_plan_dict.setdefault(timedelta, []).append(plan)
-        return escalation_plan_dict
+                escalation_plan.setdefault(timedelta, []).append(
+                    {"plan_lines": ["escalation step is unspecified. Skipping"]}
+                )
+
+        return escalation_plan

     def _render_user_notification_line(
-        self, user_to_notify: "User", notification_policy: "UserNotificationPolicy", for_slack=False
-    ):
+        self,
+        user_to_notify: "User",
+        notification_policy: "UserNotificationPolicy",
+        for_slack: bool = False,
+    ) -> str:
         """
         Renders user notification plan line
         :param user_to_notify:
         :param notification_policy:
         :param for_slack: (bool) add or not user slack id to user notification plan line
         :return: plan line
         """
         from apps.base.models import UserNotificationPolicy

@@ -616,15 +700,14 @@ class IncidentLogBuilder:
         return result

     def _get_notification_plan_for_user(
-        self, user_to_notify: "User", future_step=False, important=False, for_slack=False
-    ):
+        self,
+        user_to_notify: "User",
+        future_step: bool = False,
+        important: bool = False,
+        for_slack: bool = False,
+    ) -> NotificationPlan:
         """
         Renders user notification plan
         :param user_to_notify:
         :param future_step:
         :param important:
         :param for_slack: (bool) add or not user slack id to user notification plan line
         :return: {timedelta: [{"user_id": user.pk, "plan_lines": [#rendered notification policy line, ]}, ...], ...}
         """
         from apps.base.models import UserNotificationPolicy, UserNotificationPolicyLogRecord

@@ -632,12 +715,15 @@ class IncidentLogBuilder:
         is_the_first_notification_step = future_step  # escalation starts with this step or not

         # generate starter dict for notification plan
-        plan_lines_dict = {
-            "user_id": user_to_notify.pk,
-            "plan_lines": [],
-            "is_the_first_notification_step": is_the_first_notification_step,
-        }
-        notification_plan_dict = {timedelta: [plan_lines_dict]}
+        notification_plan: NotificationPlan = {
+            timedelta: [
+                {
+                    "user_id": user_to_notify.pk,
+                    "plan_lines": [],
+                    "is_the_first_notification_step": is_the_first_notification_step,
+                },
+            ],
+        }

         last_user_log = None

@@ -650,10 +736,12 @@ class IncidentLogBuilder:
             )
             .distinct()
         )

         # get lists of notification policies with scheduled but not triggered bundled notifications
         # and of all notification policies with bundled notifications
         notification_policy_ids_in_scheduled_bundle: typing.Set[int] = set()
         notification_policy_ids_in_bundle: typing.Set[int] = set()

         for notification_policy_in_bundle in notification_policies_in_bundle:
             if notification_policy_in_bundle["bundle_uuid"] is None:
                 notification_policy_ids_in_scheduled_bundle.add(notification_policy_in_bundle["notification_policy"])

@@ -683,6 +771,7 @@ class IncidentLogBuilder:
                 if last_user_log.notification_step is not None
                 else last_user_log.notification_policy.step
             )

+            # get order of the next notification step
             if notification_step == UserNotificationPolicy.Step.WAIT:
                 # do not exclude wait step, because we need it to count timedelta

@@ -701,10 +790,11 @@ class IncidentLogBuilder:
                 notification_policy.order >= notification_policy_order
                 and notification_policy.id not in notification_policy_ids_in_bundle
             )

             if notification_policy.step == UserNotificationPolicy.Step.WAIT:
-                wait_delay = notification_policy.wait_delay
-                if wait_delay is not None:
+                if (wait_delay := notification_policy.wait_delay) is not None:
                     timedelta += wait_delay  # increase timedelta for next steps

             elif future_notification or is_scheduled_bundled_notification:
                 notification_timedelta = (
                     timedelta + timezone.timedelta(seconds=BUNDLED_NOTIFICATION_DELAY_SECONDS)

@@ -714,10 +804,12 @@ class IncidentLogBuilder:
                 plan_line = self._render_user_notification_line(
                     user_to_notify, notification_policy, for_slack=for_slack
                 )

                 # add plan_line to user plan_lines list
-                if not notification_plan_dict.get(notification_timedelta):
+                if not notification_plan.get(notification_timedelta):
                     plan = {"user_id": user_to_notify.pk, "plan_lines": [plan_line]}
-                    notification_plan_dict.setdefault(notification_timedelta, []).append(plan)
+                    notification_plan.setdefault(notification_timedelta, []).append(plan)
                 else:
-                    notification_plan_dict[notification_timedelta][0]["plan_lines"].append(plan_line)
-        return notification_plan_dict
+                    notification_plan[notification_timedelta][0]["plan_lines"].append(plan_line)
+
+        return notification_plan
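The `_finalize_escalation_plan` change above flattens a per-user plan structure into plain lines keyed by time offset. A standalone sketch of that transformation, using `datetime.timedelta` in place of Django's `timezone.timedelta` (function name and sample data are illustrative, not from the repo):

```python
from datetime import timedelta

def finalize_escalation_plan(escalation_plan):
    # Collapse {timedelta: [{"user_id": ..., "plan_lines": [...]}, ...]}
    # into {timedelta: [all plan lines]}, clamping negative offsets to zero
    # and dropping offsets whose entries carry no plan lines.
    final_plan = {}
    for offset, entries in escalation_plan.items():
        lines = []
        for entry in entries:
            lines.extend(entry["plan_lines"])
        if lines:
            final_plan.setdefault(max(offset, timedelta()), []).extend(lines)
    return final_plan

plan = {
    timedelta(minutes=-5): [{"user_id": 1, "plan_lines": ["notify Alice by SMS"]}],
    timedelta(minutes=10): [{"plan_lines": ["trigger webhook"]}, {"user_id": 2, "plan_lines": []}],
}
flattened = finalize_escalation_plan(plan)
```

Negative offsets collapse onto `timedelta(0)` and empty entries drop out, mirroring the clamp and length check in the method above.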
@@ -0,0 +1,36 @@
+# Generated by Django 4.2.16 on 2024-11-01 11:20
+
+from django.db import migrations, models
+import django.db.models.deletion
+import django_migration_linter as linter
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('slack', '0005_slackteamidentity__unified_slack_app_installed'),
+        ('alerts', '0061_alter_alertgroup_resolved_by_alert'),
+    ]
+
+    operations = [
+        linter.IgnoreMigration(),
+        migrations.RenameField(
+            model_name='channelfilter',
+            old_name='slack_channel_id',
+            new_name='_slack_channel_id',
+        ),
+        migrations.RenameField(
+            model_name='resolutionnoteslackmessage',
+            old_name='slack_channel_id',
+            new_name='_slack_channel_id',
+        ),
+        migrations.AddField(
+            model_name='channelfilter',
+            name='slack_channel',
+            field=models.ForeignKey(default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='slack.slackchannel'),
+        ),
+        migrations.AddField(
+            model_name='resolutionnoteslackmessage',
+            name='slack_channel',
+            field=models.ForeignKey(default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='slack.slackchannel'),
+        ),
+    ]
@@ -0,0 +1,68 @@
+# Generated by Django 4.2.16 on 2024-11-01 10:58
+import logging
+
+from django.db import migrations
+import django_migration_linter as linter
+
+logger = logging.getLogger(__name__)
+
+
+def populate_slack_channel(apps, schema_editor):
+    ChannelFilter = apps.get_model("alerts", "ChannelFilter")
+    SlackChannel = apps.get_model("slack", "SlackChannel")
+
+    logger.info("Starting migration to populate slack_channel field.")
+
+    queryset = ChannelFilter.objects.filter(
+        _slack_channel_id__isnull=False,
+        alert_receive_channel__organization__slack_team_identity__isnull=False,
+    )
+    total_channel_filters = queryset.count()
+    updated_channel_filters = 0
+    missing_channel_filters = 0
+    channel_filters_to_update = []
+
+    logger.info(f"Total channel filters to process: {total_channel_filters}")
+
+    for channel_filter in queryset:
+        slack_id = channel_filter._slack_channel_id
+        slack_team_identity = channel_filter.alert_receive_channel.organization.slack_team_identity
+
+        try:
+            slack_channel = SlackChannel.objects.get(slack_id=slack_id, slack_team_identity=slack_team_identity)
+            channel_filter.slack_channel = slack_channel
+            channel_filters_to_update.append(channel_filter)
+
+            updated_channel_filters += 1
+            logger.info(
+                f"ChannelFilter {channel_filter.id} updated with SlackChannel {slack_channel.id} "
+                f"(slack_id: {slack_id})."
+            )
+        except SlackChannel.DoesNotExist:
+            missing_channel_filters += 1
+            logger.warning(
+                f"SlackChannel with slack_id {slack_id} and slack_team_identity {slack_team_identity} "
+                f"does not exist for ChannelFilter {channel_filter.id}."
+            )
+
+    if channel_filters_to_update:
+        ChannelFilter.objects.bulk_update(channel_filters_to_update, ["slack_channel"])
+        logger.info(f"Bulk updated {len(channel_filters_to_update)} ChannelFilters with their Slack channel.")
+
+    logger.info(
+        f"Finished migration. Total channel filters processed: {total_channel_filters}. "
+        f"Channel filters updated: {updated_channel_filters}. Missing SlackChannels: {missing_channel_filters}."
+    )
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('alerts', '0062_rename_slack_channel_id_channelfilter__slack_channel_id_and_more'),
+    ]
+
+    operations = [
+        # simply setting this new field is okay, we are not deleting the value of channel
+        # therefore, no need to revert it
+        linter.IgnoreMigration(),
+        migrations.RunPython(populate_slack_channel, migrations.RunPython.noop),
+    ]
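The data migration above follows a common backfill shape: resolve each legacy Slack channel ID to a channel row, collect the matches, and write them back in a single bulk update. A minimal standalone sketch of that pattern with plain dicts standing in for the Django models (all names and sample data here are illustrative):

```python
def backfill_slack_channels(channel_filters, channel_lookup):
    """channel_lookup maps (slack_id, team) -> a SlackChannel-like object."""
    to_update, missing = [], []
    for cf in channel_filters:
        channel = channel_lookup.get((cf["_slack_channel_id"], cf["team"]))
        if channel is None:
            # the channel was deleted in Slack but the filter still holds its ID
            missing.append(cf["id"])
            continue
        cf["slack_channel"] = channel
        to_update.append(cf)
    # a real migration would now issue one bulk_update(to_update, ["slack_channel"])
    return to_update, missing

filters = [
    {"id": 1, "_slack_channel_id": "C01", "team": "T1"},
    {"id": 2, "_slack_channel_id": "C99", "team": "T1"},
]
lookup = {("C01", "T1"): {"slack_id": "C01", "name": "alerts"}}
updated, missing = backfill_slack_channels(filters, lookup)
```

Collecting the objects and writing once keeps the migration to two queries per model instead of one `save()` per row.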
@@ -0,0 +1,73 @@
+# Generated by Django 4.2.16 on 2024-11-01 10:58
+import logging
+
+from django.db import migrations
+import django_migration_linter as linter
+
+logger = logging.getLogger(__name__)
+
+
+def populate_slack_channel(apps, schema_editor):
+    ResolutionNoteSlackMessage = apps.get_model("alerts", "ResolutionNoteSlackMessage")
+    SlackChannel = apps.get_model("slack", "SlackChannel")
+
+    logger.info("Starting migration to populate slack_channel field.")
+
+    queryset = ResolutionNoteSlackMessage.objects.filter(
+        _slack_channel_id__isnull=False,
+        alert_group__channel__organization__slack_team_identity__isnull=False,
+    )
+    total_resolution_notes = queryset.count()
+    updated_resolution_notes = 0
+    missing_resolution_notes = 0
+    resolution_notes_to_update = []
+
+    logger.info(f"Total resolution note slack messages to process: {total_resolution_notes}")
+
+    for resolution_note in queryset:
+        slack_id = resolution_note._slack_channel_id
+        slack_team_identity = resolution_note.alert_group.channel.organization.slack_team_identity
+
+        try:
+            slack_channel = SlackChannel.objects.get(slack_id=slack_id, slack_team_identity=slack_team_identity)
+            resolution_note.slack_channel = slack_channel
+            resolution_notes_to_update.append(resolution_note)
+
+            updated_resolution_notes += 1
+            logger.info(
+                f"ResolutionNoteSlackMessage {resolution_note.id} updated with SlackChannel {slack_channel.id} "
+                f"(slack_id: {slack_id})."
+            )
+        except SlackChannel.DoesNotExist:
+            missing_resolution_notes += 1
+            logger.warning(
+                f"SlackChannel with slack_id {slack_id} and slack_team_identity {slack_team_identity} "
+                f"does not exist for ResolutionNoteSlackMessage {resolution_note.id}."
+            )
+
+    if resolution_notes_to_update:
+        ResolutionNoteSlackMessage.objects.bulk_update(resolution_notes_to_update, ["slack_channel"])
+        logger.info(
+            f"Bulk updated {len(resolution_notes_to_update)} ResolutionNoteSlackMessage with their Slack channel."
+        )
+
+    logger.info(
+        f"Finished migration. Total resolution note slack messages processed: {total_resolution_notes}. "
+        f"Resolution note slack messages updated: {updated_resolution_notes}. "
+        f"Missing SlackChannels: {missing_resolution_notes}."
+    )
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('alerts', '0063_migrate_channelfilter_slack_channel_id'),
+    ]
+
+    operations = [
+        # simply setting this new field is okay, we are not deleting the value of channel
+        # therefore, no need to revert it
+        linter.IgnoreMigration(),
+        migrations.RunPython(populate_slack_channel, migrations.RunPython.noop),
+    ]
@@ -43,6 +43,7 @@ if typing.TYPE_CHECKING:
         AlertGroupLogRecord,
         AlertReceiveChannel,
+        BundledNotification,
         Invitation,
         RelatedIncident,
         ResolutionNote,
         ResolutionNoteSlackMessage,

@@ -193,11 +194,13 @@ class AlertGroup(AlertGroupSlackRenderingMixin, EscalationSnapshotMixin, models.
     acknowledged_by_user: typing.Optional["User"]
     alerts: "RelatedManager['Alert']"
+    bundled_notifications: "RelatedManager['BundledNotification']"
-    related_incidents: "RelatedManager['RelatedIncident']"
-    dependent_alert_groups: "RelatedManager['AlertGroup']"
     channel: "AlertReceiveChannel"
+    dependent_alert_groups: "RelatedManager['AlertGroup']"
     invitations: "RelatedManager['Invitation']"
+    labels: "RelatedManager['AlertGroupAssociatedLabel']"
     log_records: "RelatedManager['AlertGroupLogRecord']"
     personal_log_records: "RelatedManager['UserNotificationPolicyLogRecord']"
+    related_incidents: "RelatedManager['RelatedIncident']"
     resolution_notes: "RelatedManager['ResolutionNote']"
     resolution_note_slack_messages: "RelatedManager['ResolutionNoteSlackMessage']"
     resolved_by_user: typing.Optional["User"]

@@ -205,7 +208,6 @@ class AlertGroup(AlertGroupSlackRenderingMixin, EscalationSnapshotMixin, models.
     silenced_by_user: typing.Optional["User"]
     slack_messages: "RelatedManager['SlackMessage']"
     users: "RelatedManager['User']"
-    labels: "RelatedManager['AlertGroupAssociatedLabel']"

     objects: models.Manager["AlertGroup"] = AlertGroupQuerySet.as_manager()

@@ -1980,13 +1982,10 @@ class AlertGroup(AlertGroupSlackRenderingMixin, EscalationSnapshotMixin, models.
     def slack_channel_id(self) -> str | None:
         if not self.channel.organization.slack_team_identity:
             return None
-
-        if self.slack_message:
+        elif self.slack_message:
             return self.slack_message.channel_id
-
-        if self.channel_filter:
-            return self.channel_filter.slack_channel_id_or_general_log_id
-
+        elif self.channel_filter:
+            return self.channel_filter.slack_channel_id_or_org_default_id
         return None

     @property
@@ -617,17 +617,17 @@ class AlertReceiveChannel(IntegrationOptionsMixin, MaintainableObject):
     def force_disable_maintenance(self, user):
         disable_maintenance(alert_receive_channel_id=self.pk, force=True, user_id=user.pk)

-    def notify_about_maintenance_action(self, text, send_to_general_log_channel=True):
+    def notify_about_maintenance_action(self, text: str, send_to_general_log_channel=True) -> None:
         # TODO: this method should be refactored.
         # It's binded to slack and sending maintenance notification only there.
         channel_ids = list(
-            self.channel_filters.filter(slack_channel_id__isnull=False, notify_in_slack=False).values_list(
+            self.channel_filters.filter(slack_channel__isnull=False, notify_in_slack=False).values_list(
                 "slack_channel_id", flat=True
             )
         )

         if send_to_general_log_channel:
-            general_log_channel_id = self.organization.general_log_channel_id
+            general_log_channel_id = self.organization.default_slack_channel_slack_id
             if general_log_channel_id is not None:
                 channel_ids.append(general_log_channel_id)
         unique_channels_id = set(channel_ids)
@@ -21,6 +21,7 @@ if typing.TYPE_CHECKING:
     from apps.alerts.models import Alert, AlertGroup, AlertReceiveChannel
     from apps.labels.types import AlertLabels, LabelPair
+    from apps.slack.models import SlackChannel

 logger = logging.getLogger(__name__)

@@ -45,7 +46,9 @@ class ChannelFilter(OrderedModel):
     """

     alert_groups: "RelatedManager['AlertGroup']"
     alert_receive_channel: "AlertReceiveChannel"
     filtering_labels: typing.Optional[list["LabelPair"]]
+    slack_channel: typing.Optional["SlackChannel"]

     order_with_respect_to = ["alert_receive_channel_id", "is_default"]

@@ -67,7 +70,15 @@ class ChannelFilter(OrderedModel):
     notify_in_slack = models.BooleanField(null=True, default=True)
     notify_in_telegram = models.BooleanField(null=True, default=False)

-    slack_channel_id = models.CharField(max_length=100, null=True, default=None)
+    # TODO: remove _slack_channel_id in future release
+    _slack_channel_id = models.CharField(max_length=100, null=True, default=None)
+    slack_channel = models.ForeignKey(
+        "slack.SlackChannel",
+        null=True,
+        default=None,
+        on_delete=models.SET_NULL,
+        related_name="+",
+    )

     telegram_channel = models.ForeignKey(
         "telegram.TelegramToOrganizationConnector",

@@ -161,15 +172,18 @@ class ChannelFilter(OrderedModel):
         return False

     @property
-    def slack_channel_id_or_general_log_id(self):
+    def slack_channel_slack_id(self) -> typing.Optional[str]:
+        return self.slack_channel.slack_id if self.slack_channel else None
+
+    @property
+    def slack_channel_id_or_org_default_id(self):
         organization = self.alert_receive_channel.organization
-        slack_team_identity = organization.slack_team_identity
-        if slack_team_identity is None:
+        if organization.slack_team_identity is None:
             return None
-        if self.slack_channel_id is None:
-            return organization.general_log_channel_id
-        else:
-            return self.slack_channel_id
+        elif self.slack_channel_slack_id is None:
+            return organization.default_slack_channel_slack_id
+        return self.slack_channel_slack_id

     @property
     def str_for_clients(self):

@@ -203,17 +217,10 @@ class ChannelFilter(OrderedModel):
             "slack_notification_enabled": self.notify_in_slack,
             "telegram_notification_enabled": self.notify_in_telegram,
         }
-        if self.slack_channel_id:
-            from apps.slack.models import SlackChannel
-
-            sti = self.alert_receive_channel.organization.slack_team_identity
-            slack_channel = SlackChannel.objects.filter(
-                slack_team_identity=sti, slack_id=self.slack_channel_id
-            ).first()
-            if slack_channel is not None:
-                # Case when slack channel was deleted, but channel filter still has it's id
-                result["slack_channel"] = slack_channel.name
+        if self.slack_channel:
+            result["slack_channel"] = self.slack_channel.name

         # TODO: use names instead of pks for telegram and other notifications backends.
         # It's needed to rework messaging backends for that
         if self.telegram_channel:
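The fallback logic this commit introduces on `ChannelFilter` can be illustrated standalone. A minimal sketch, assuming simplified stand-in dataclasses rather than the real Django models (`SlackChannel`, `Organization`, and `ChannelFilter` here are illustrative only):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SlackChannel:
    slack_id: str


@dataclass
class Organization:
    slack_team_identity: Optional[object]
    default_slack_channel: Optional[SlackChannel]

    @property
    def default_slack_channel_slack_id(self) -> Optional[str]:
        return self.default_slack_channel.slack_id if self.default_slack_channel else None


@dataclass
class ChannelFilter:
    organization: Organization
    slack_channel: Optional[SlackChannel]

    @property
    def slack_channel_slack_id(self) -> Optional[str]:
        # Prefer the FK's Slack id; None when no channel is linked.
        return self.slack_channel.slack_id if self.slack_channel else None

    @property
    def slack_channel_id_or_org_default_id(self) -> Optional[str]:
        if self.organization.slack_team_identity is None:
            return None  # Slack is not connected at all
        if self.slack_channel_slack_id is None:
            # Fall back to the organization-wide default channel.
            return self.organization.default_slack_channel_slack_id
        return self.slack_channel_slack_id
```

The point of the refactor is that the Slack channel id is now always derived from the `SlackChannel` row rather than a free-form `CharField`, so the fallback chain reads channel → org default → None.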
@@ -1,11 +1,17 @@
+import datetime
 import logging
+import typing
 from functools import partial

 from django.db import models, transaction

 from apps.alerts import tasks

+if typing.TYPE_CHECKING:
+    from apps.alerts.models import AlertGroup
+    from apps.user_management.models import User
+
 logger = logging.getLogger(__name__)
 logger.setLevel(logging.DEBUG)

@@ -15,6 +21,10 @@ class Invitation(models.Model):
     It's an invitation of a user to join working on Alert Group
     """

+    alert_group: "AlertGroup"
+    author: typing.Optional["User"]
+    invitee: typing.Optional["User"]
+
     ATTEMPTS_LIMIT = 10

     time_deltas_by_attempts = [

@@ -45,18 +55,18 @@ class Invitation(models.Model):
     attempt = models.IntegerField(default=0)

     @property
-    def attempts_left(self):
+    def attempts_left(self) -> int:
         return Invitation.ATTEMPTS_LIMIT - self.attempt

     @staticmethod
-    def get_delay_by_attempt(attempt):
+    def get_delay_by_attempt(attempt: int) -> datetime.timedelta:
         countdown = Invitation.time_deltas_by_attempts[-1]
         if attempt < len(Invitation.time_deltas_by_attempts):
             countdown = Invitation.time_deltas_by_attempts[attempt]
         return countdown

     @staticmethod
-    def invite_user(invitee_user, alert_group, user):
+    def invite_user(invitee_user: "User", alert_group: "AlertGroup", user: "User") -> None:
         from apps.alerts.models import AlertGroupLogRecord

         # RFCT - why atomic? without select for update?

@@ -97,7 +107,7 @@ class Invitation(models.Model):
             transaction.on_commit(partial(tasks.invite_user_to_join_incident.delay, invitation.pk))

     @staticmethod
-    def stop_invitation(invitation_pk, user):
+    def stop_invitation(invitation_pk: int, user: "User") -> None:
         from apps.alerts.models import AlertGroupLogRecord

         with transaction.atomic():
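`Invitation.get_delay_by_attempt` indexes into a ladder of delays and clamps to the last entry once the attempt number runs past it. A standalone sketch of that lookup — the concrete timedelta values below are assumed for illustration, since `time_deltas_by_attempts` itself is not shown in this diff:

```python
import datetime

# Assumed sample ladder; the real time_deltas_by_attempts values are not in this diff.
time_deltas_by_attempts = [
    datetime.timedelta(minutes=1),
    datetime.timedelta(minutes=5),
    datetime.timedelta(minutes=15),
]


def get_delay_by_attempt(attempt: int) -> datetime.timedelta:
    # Same shape as Invitation.get_delay_by_attempt: default to the last
    # (largest) delay, then use the ladder entry while attempt is in range.
    countdown = time_deltas_by_attempts[-1]
    if attempt < len(time_deltas_by_attempts):
        countdown = time_deltas_by_attempts[attempt]
    return countdown
```

This gives a capped backoff: escalating delays per retry, flattening out at the last ladder entry instead of growing unboundedly.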
@@ -13,6 +13,7 @@ from common.utils import clean_markup
 if typing.TYPE_CHECKING:
     from apps.alerts.models import AlertGroup
+    from apps.slack.models import SlackChannel


 def generate_public_primary_key_for_alert_group_postmortem():

@@ -54,6 +55,7 @@ class ResolutionNoteSlackMessageQueryset(models.QuerySet):
 class ResolutionNoteSlackMessage(models.Model):
     alert_group: "AlertGroup"
     resolution_note: typing.Optional["ResolutionNote"]
+    slack_channel: typing.Optional["SlackChannel"]

     alert_group = models.ForeignKey(
         "alerts.AlertGroup",

@@ -73,7 +75,17 @@ class ResolutionNoteSlackMessage(models.Model):
         related_name="added_resolution_note_slack_messages",
     )
     text = models.TextField(max_length=3000, default=None, null=True)
-    slack_channel_id = models.CharField(max_length=100, null=True, default=None)
+
+    # TODO: remove _slack_channel_id in future release
+    _slack_channel_id = models.CharField(max_length=100, null=True, default=None)
+    slack_channel = models.ForeignKey(
+        "slack.SlackChannel",
+        null=True,
+        default=None,
+        on_delete=models.SET_NULL,
+        related_name="+",
+    )
+
     ts = models.CharField(max_length=100, null=True, default=None)
     thread_ts = models.CharField(max_length=100, null=True, default=None)
     permalink = models.CharField(max_length=250, null=True, default=None)

@@ -88,6 +100,10 @@ class ResolutionNoteSlackMessage(models.Model):
         models.Index(fields=["ts", "thread_ts", "slack_channel_id"]),
     ]

+    @property
+    def slack_channel_slack_id(self) -> typing.Optional[str]:
+        return self.slack_channel.slack_id if self.slack_channel else None
+
     def get_resolution_note(self) -> typing.Optional["ResolutionNote"]:
         try:
             return self.resolution_note
@@ -1,4 +1,5 @@
 import logging
+import typing

 from django.conf import settings

@@ -11,6 +12,9 @@ from common.incident_api.client import (
     IncidentAPIException,
 )

+if typing.TYPE_CHECKING:
+    from apps.alerts.models import AlertGroup, EscalationPolicy
+
 logger = logging.getLogger(__name__)

 ATTACHMENT_CAPTION = "OnCall Alert Group"

@@ -19,7 +23,13 @@ MAX_RETRIES = 1 if settings.DEBUG else 10
 MAX_ATTACHED_ALERT_GROUPS_PER_INCIDENT = 5


-def _attach_alert_group_to_incident(alert_group, incident_id, incident_title, escalation_policy, attached=False):
+def _attach_alert_group_to_incident(
+    alert_group: "AlertGroup",
+    incident_id: str,
+    incident_title: str,
+    escalation_policy: "EscalationPolicy",
+    attached: bool = False,
+) -> None:
     from apps.alerts.models import AlertGroupLogRecord, EscalationPolicy, RelatedIncident

     declared_incident, _ = RelatedIncident.objects.get_or_create(

@@ -41,7 +51,9 @@ def _attach_alert_group_to_incident(alert_group, incident_id, incident_title, es
     )


-def _create_error_log_record(alert_group, escalation_policy, reason=""):
+def _create_error_log_record(
+    alert_group: "AlertGroup", escalation_policy: "EscalationPolicy", reason: str = ""
+) -> None:
     from apps.alerts.models import AlertGroupLogRecord, EscalationPolicy

     AlertGroupLogRecord.objects.create(

@@ -55,7 +67,7 @@ def _create_error_log_record(alert_group, escalation_policy, reason=""):


 @shared_dedicated_queue_retry_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=MAX_RETRIES)
-def declare_incident(alert_group_pk, escalation_policy_pk, severity=None):
+def declare_incident(alert_group_pk: int, escalation_policy_pk: int, severity: typing.Optional[str] = None) -> None:
     from apps.alerts.models import AlertGroup, EscalationPolicy, RelatedIncident

     alert_group = AlertGroup.objects.get(pk=alert_group_pk)
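`_attach_alert_group_to_incident` leans on `RelatedIncident.objects.get_or_create` so that re-delivered tasks do not attach the same alert group twice. A rough standalone sketch of that idempotency pattern, using a plain in-memory dict as a hypothetical stand-in for the `RelatedIncident` table (not the real model or manager):

```python
from typing import Optional

# In-memory stand-in for a database table keyed by incident id (illustrative only).
_incidents: dict[str, dict] = {}


def get_or_create(incident_id: str, defaults: Optional[dict] = None) -> tuple[dict, bool]:
    # Mirrors the shape of Django's QuerySet.get_or_create: returns
    # (record, created). Repeated calls with the same key are no-ops.
    if incident_id in _incidents:
        return _incidents[incident_id], False
    record = {"incident_id": incident_id, **(defaults or {})}
    _incidents[incident_id] = record
    return record, True
```

With this shape, a Celery retry of `declare_incident` that reaches the attach step twice simply gets the existing record back (`created=False`) instead of creating a duplicate.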
@@ -1,3 +1,5 @@
+import typing
+
 from django.conf import settings

 from apps.slack.scenarios import scenario_step

@@ -11,7 +13,7 @@ from .task_logger import task_logger
 @shared_dedicated_queue_retry_task(
     autoretry_for=(Exception,), retry_backoff=True, max_retries=1 if settings.DEBUG else None
 )
-def notify_group_task(alert_group_pk, escalation_policy_snapshot_order=None):
+def notify_group_task(alert_group_pk: int, escalation_policy_snapshot_order: typing.Optional[int] = None):
     from apps.alerts.models import AlertGroup, AlertGroupLogRecord, EscalationPolicy
     from apps.base.models import UserNotificationPolicy
@@ -62,7 +62,7 @@ def notify_ical_schedule_shift(schedule_pk):
     try:
         schedule = OnCallSchedule.objects.get(
-            pk=schedule_pk, cached_ical_file_primary__isnull=False, channel__isnull=False
+            pk=schedule_pk, cached_ical_file_primary__isnull=False, slack_channel__isnull=False
         )
     except OnCallSchedule.DoesNotExist:
         task_logger.info(f"Trying to notify ical schedule shift for non-existing schedule {schedule_pk}")

@@ -170,7 +170,7 @@ def notify_ical_schedule_shift(schedule_pk):
     try:
         slack_client.chat_postMessage(
-            channel=schedule.channel,
+            channel=schedule.slack_channel_slack_id,
             blocks=report_blocks,
             text=f"On-call shift for schedule {schedule.name} has changed",
         )
@@ -88,12 +88,15 @@ def test_delete(
     make_alert_receive_channel,
     make_alert_group,
     make_alert,
+    make_slack_channel,
     make_slack_message,
     make_resolution_note_slack_message,
     django_capture_on_commit_callbacks,
 ):
     """test alert group deleting"""
     organization, slack_team_identity = make_organization_with_slack_team_identity()
+    slack_channel1 = make_slack_channel(slack_team_identity)
+    slack_channel2 = make_slack_channel(slack_team_identity)
     user = make_user(organization=organization)

     alert_receive_channel = make_alert_receive_channel(organization)

@@ -108,7 +111,7 @@ def test_delete(
         user=user,
         added_by_user=user,
         posted_by_bot=True,
-        slack_channel_id="test1_channel_id",
+        slack_channel=slack_channel1,
         ts="test1_ts",
     )
     resolution_note_2 = make_resolution_note_slack_message(

@@ -116,7 +119,7 @@ def test_delete(
         user=user,
         added_by_user=user,
         added_to_resolution_note=True,
-        slack_channel_id="test2_channel_id",
+        slack_channel=slack_channel2,
         ts="test2_ts",
     )

@@ -168,11 +171,15 @@ def test_delete_slack_ratelimit(
     make_alert_receive_channel,
     make_alert_group,
     make_alert,
+    make_slack_channel,
     make_slack_message,
     make_resolution_note_slack_message,
     django_capture_on_commit_callbacks,
 ):
     organization, slack_team_identity = make_organization_with_slack_team_identity()
+    slack_channel1 = make_slack_channel(slack_team_identity)
+    slack_channel2 = make_slack_channel(slack_team_identity)
+
     user = make_user(organization=organization)

     alert_receive_channel = make_alert_receive_channel(organization)

@@ -187,7 +194,7 @@ def test_delete_slack_ratelimit(
         user=user,
         added_by_user=user,
         posted_by_bot=True,
-        slack_channel_id="test1_channel_id",
+        slack_channel=slack_channel1,
         ts="test1_ts",
     )
     make_resolution_note_slack_message(

@@ -195,7 +202,7 @@ def test_delete_slack_ratelimit(
         user=user,
         added_by_user=user,
         added_to_resolution_note=True,
-        slack_channel_id="test2_channel_id",
+        slack_channel=slack_channel2,
         ts="test2_ts",
     )

@@ -237,10 +244,13 @@ def test_delete_slack_api_error_other_than_ratelimit(
     make_alert_receive_channel,
     make_alert_group,
     make_alert,
+    make_slack_channel,
     make_slack_message,
     make_resolution_note_slack_message,
 ):
     organization, slack_team_identity = make_organization_with_slack_team_identity()
+    slack_channel1 = make_slack_channel(slack_team_identity)
+    slack_channel2 = make_slack_channel(slack_team_identity)
     user = make_user(organization=organization)

     alert_receive_channel = make_alert_receive_channel(organization)

@@ -255,7 +265,7 @@ def test_delete_slack_api_error_other_than_ratelimit(
         user=user,
         added_by_user=user,
         posted_by_bot=True,
-        slack_channel_id="test1_channel_id",
+        slack_channel=slack_channel1,
         ts="test1_ts",
     )
     make_resolution_note_slack_message(

@@ -263,7 +273,7 @@ def test_delete_slack_api_error_other_than_ratelimit(
         user=user,
         added_by_user=user,
         added_to_resolution_note=True,
-        slack_channel_id="test2_channel_id",
+        slack_channel=slack_channel2,
         ts="test2_ts",
     )
@@ -167,7 +167,7 @@ def test_send_demo_alert_not_enabled(mocked_create_alert, make_organization, mak
 @pytest.mark.django_db
 def test_notify_maintenance_no_general_channel(make_organization, make_alert_receive_channel):
-    organization = make_organization(general_log_channel_id=None)
+    organization = make_organization(default_slack_channel=None)
     alert_receive_channel = make_alert_receive_channel(organization)

     with patch("apps.alerts.models.alert_receive_channel.post_message_to_channel") as mock_post_message:

@@ -177,21 +177,34 @@ def test_notify_maintenance_no_general_channel(make_organization, make_alert_rec
 @pytest.mark.django_db
-def test_notify_maintenance_with_general_channel(make_organization, make_alert_receive_channel):
-    organization = make_organization(general_log_channel_id="CHANNEL-ID")
+def test_notify_maintenance_with_general_channel(
+    make_organization,
+    make_alert_receive_channel,
+    make_slack_team_identity,
+    make_slack_channel,
+):
+    slack_channel = make_slack_channel(make_slack_team_identity())
+    organization = make_organization(default_slack_channel=slack_channel)
     alert_receive_channel = make_alert_receive_channel(organization)

     with patch("apps.alerts.models.alert_receive_channel.post_message_to_channel") as mock_post_message:
         alert_receive_channel.notify_about_maintenance_action("maintenance mode enabled")

     mock_post_message.assert_called_once_with(
-        organization, organization.general_log_channel_id, "maintenance mode enabled"
+        organization, organization.default_slack_channel.slack_id, "maintenance mode enabled"
     )


 @pytest.mark.django_db
-def test_get_or_create_manual_integration_deleted_team(make_organization, make_team, make_alert_receive_channel):
-    organization = make_organization(general_log_channel_id="CHANNEL-ID")
+def test_get_or_create_manual_integration_deleted_team(
+    make_organization,
+    make_team,
+    make_slack_team_identity,
+    make_slack_channel,
+):
+    slack_channel = make_slack_channel(make_slack_team_identity())
+    organization = make_organization(default_slack_channel=slack_channel)

     # setup general manual integration
     general_manual = AlertReceiveChannel.get_or_create_manual_integration(
         organization=organization, team=None, integration=AlertReceiveChannel.INTEGRATION_MANUAL, defaults={}
@@ -36,7 +36,7 @@ def test_escalation_plan_messaging_backends(
     alert_group.save()

     log_builder = IncidentLogBuilder(alert_group=alert_group)
-    plan = log_builder.get_incident_escalation_plan()
+    plan = log_builder.get_escalation_plan()
     assert list(plan.values()) == [["send test only backend message to {}".format(user.username)]]

@@ -156,5 +156,5 @@ def test_escalation_plan_custom_webhooks(
     alert_group.save()

     log_builder = IncidentLogBuilder(alert_group=alert_group)
-    plan = log_builder.get_incident_escalation_plan()
+    plan = log_builder.get_escalation_plan()
     assert list(plan.values()) == [[f'trigger outgoing webhook "{custom_webhook.name}"']]
@@ -11,6 +11,7 @@ from apps.base.models.user_notification_policy import UserNotificationPolicy
 def test_notify_all(
     make_organization,
     make_slack_team_identity,
+    make_slack_channel,
     make_user,
     make_user_notification_policy,
     make_escalation_chain,

@@ -21,6 +22,7 @@ def test_notify_all(
 ):
     organization = make_organization()
     slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
     organization.slack_team_identity = slack_team_identity
     organization.save()

@@ -37,7 +39,7 @@ def test_notify_all(
         alert_receive_channel,
         escalation_chain=escalation_chain,
         notify_in_slack=True,
-        slack_channel_id="slack-channel-id",
+        slack_channel=slack_channel,
     )
     # note this is the only escalation step, with order=1
     notify_all = make_escalation_policy(
@@ -16,12 +16,14 @@ def test_notify_group(
     make_escalation_chain,
     make_escalation_policy,
     make_channel_filter,
+    make_slack_channel,
     make_slack_user_group,
     make_alert_receive_channel,
     make_alert_group,
 ):
     organization = make_organization()
     slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
     organization.slack_team_identity = slack_team_identity
     organization.save()

@@ -40,7 +42,7 @@ def test_notify_group(
         alert_receive_channel,
         escalation_chain=escalation_chain,
         notify_in_slack=True,
-        slack_channel_id="slack-channel-id",
+        slack_channel=slack_channel,
     )
     usergroup = make_slack_user_group(slack_team_identity)
     # note this is the only escalation step, with order=1
@@ -54,16 +54,20 @@ END:VCALENDAR
 @pytest.mark.django_db
 def test_current_overrides_ical_schedule_is_none(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_schedule,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)

     ical_schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleICal,
         name="test_ical_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         ical_url_primary="url",
         prev_ical_file_primary=ICAL_DATA,
         cached_ical_file_primary=ICAL_DATA,

@@ -77,11 +81,15 @@ def test_current_overrides_ical_schedule_is_none(
 @pytest.mark.django_db
 def test_next_shift_notification_long_shifts(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_schedule,
     make_user,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     make_user(organization=organization, username="user1")
     make_user(organization=organization, username="user2")

@@ -89,7 +97,7 @@ def test_next_shift_notification_long_shifts(
         organization,
         schedule_class=OnCallScheduleICal,
         name="test_ical_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         ical_url_primary="url",
         prev_ical_file_primary=ICAL_DATA,
         cached_ical_file_primary=ICAL_DATA,

@@ -111,12 +119,16 @@ def test_next_shift_notification_long_shifts(
 @pytest.mark.django_db
 def test_overrides_changes_no_current_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     ical_before = textwrap.dedent(

@@ -172,7 +184,7 @@ def test_overrides_changes_no_current_no_triggering_notification(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=ical_before,
     )

@@ -209,19 +221,23 @@ def test_overrides_changes_no_current_no_triggering_notification(
 @pytest.mark.django_db
 def test_no_changes_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -255,19 +271,23 @@ def test_no_changes_no_triggering_notification(
 @pytest.mark.django_db
 def test_current_shift_changes_trigger_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -302,14 +322,18 @@ def test_current_shift_changes_trigger_notification(
 @pytest.mark.django_db
 @pytest.mark.parametrize("swap_taken", [False, True])
 def test_current_shift_changes_swap_split(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
     make_shift_swap_request,
     swap_taken,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2")

@@ -317,7 +341,7 @@ def test_current_shift_changes_swap_split(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -364,13 +388,17 @@ def test_current_shift_changes_swap_split(
 @pytest.mark.django_db
 def test_current_shift_changes_end_affected_by_swap(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
     make_shift_swap_request,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2")

@@ -378,7 +406,7 @@ def test_current_shift_changes_end_affected_by_swap(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -431,12 +459,16 @@ def test_current_shift_changes_end_affected_by_swap(
 @pytest.mark.django_db
 def test_next_shift_changes_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2")

@@ -444,7 +476,7 @@ def test_next_shift_changes_no_triggering_notification(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -500,17 +532,22 @@ def test_next_shift_changes_no_triggering_notification(
 @pytest.mark.django_db
 def test_current_shifts_using_microseconds(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -539,12 +576,16 @@ def test_current_shifts_using_microseconds(
 @pytest.mark.django_db
 def test_lower_priority_changes_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2")

@@ -552,7 +593,7 @@ def test_lower_priority_changes_no_triggering_notification(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -604,11 +645,15 @@ def test_lower_priority_changes_no_triggering_notification(
 @pytest.mark.django_db
 def test_vtimezone_changes_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     make_user(organization=organization, username="user1")

     ical_before = textwrap.dedent(

@@ -706,7 +751,7 @@ def test_vtimezone_changes_no_triggering_notification(
         organization,
         schedule_class=OnCallScheduleICal,
         name="test_ical_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         ical_url_primary="url",
         prev_ical_file_primary=None,
         cached_ical_file_primary=ical_before,

@@ -732,19 +777,23 @@ def test_vtimezone_changes_no_triggering_notification(
 @pytest.mark.django_db
 def test_no_changes_no_triggering_notification_from_old_to_new_task_version(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -787,12 +836,16 @@ def test_no_changes_no_triggering_notification_from_old_to_new_task_version(
 @pytest.mark.django_db
 def test_current_shift_changes_trigger_notification_from_old_to_new_task_version(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2")

@@ -800,7 +853,7 @@ def test_current_shift_changes_trigger_notification_from_old_to_new_task_version
         organization,
         schedule_class=OnCallScheduleCalendar,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )

@@ -847,12 +900,16 @@ def test_current_shift_changes_trigger_notification_from_old_to_new_task_version
 @pytest.mark.django_db
 def test_next_shift_notification_long_and_short_shifts(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
     make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
|
||||
user1 = make_user(organization=organization, username="user1")
|
||||
user2 = make_user(organization=organization, username="user2")
|
||||
user3 = make_user(organization=organization, username="user3")
|
||||
|
|
@ -861,7 +918,7 @@ def test_next_shift_notification_long_and_short_shifts(
|
|||
organization,
|
||||
schedule_class=OnCallScheduleWeb,
|
||||
name="test_schedule",
|
||||
channel="channel",
|
||||
slack_channel=slack_channel,
|
||||
prev_ical_file_overrides=None,
|
||||
cached_ical_file_overrides=None,
|
||||
)
|
||||
|
|
|
|||
|
|
@@ -1,12 +1,14 @@
import typing

from rest_framework import serializers

from apps.alerts.models import AlertReceiveChannel, ChannelFilter, EscalationChain
from apps.api.serializers.labels import LabelPairSerializer
+from apps.api.serializers.slack_channel import SlackChannelDetails, SlackChannelSerializer
from apps.base.messaging import get_messaging_backend_from_id
from apps.telegram.models import TelegramToOrganizationConnector
-from common.api_helpers.custom_fields import OrganizationFilteredPrimaryKeyRelatedField
+from common.api_helpers.custom_fields import (
+    OrganizationFilteredPrimaryKeyRelatedField,
+    SlackChannelsFilteredByOrganizationSlackWorkspaceField,
+)
from common.api_helpers.exceptions import BadRequest
from common.api_helpers.mixins import EagerLoadingMixin
from common.api_helpers.utils import valid_jinja_template_for_serializer_method_field

@@ -14,12 +16,6 @@ from common.jinja_templater.apply_jinja_template import JinjaTemplateError
from common.utils import is_regex_valid


-class SlackChannelDetails(typing.TypedDict):
-    display_name: str
-    slack_id: str
-    id: str


class ChannelFilterSerializer(EagerLoadingMixin, serializers.ModelSerializer):
    class TelegramChannelDetailsSerializer(serializers.Serializer):
        display_name = serializers.CharField(source="channel_name")

@@ -33,7 +29,10 @@ class ChannelFilterSerializer(EagerLoadingMixin, serializers.ModelSerializer):
        allow_null=True,
        required=False,
    )
-   slack_channel = serializers.SerializerMethodField()
+   slack_channel = SlackChannelSerializer(read_only=True)

    # TODO: we probably don't need both telegram_channel and telegram_channel_details, research which one isn't needed
    # and get rid of it
    # Duplicated telegram channel and telegram_channel_details field for backwards compatibility for old integration page
    telegram_channel = OrganizationFilteredPrimaryKeyRelatedField(
        queryset=TelegramToOrganizationConnector.objects, filter_field="organization", allow_null=True, required=False

@@ -41,11 +40,12 @@ class ChannelFilterSerializer(EagerLoadingMixin, serializers.ModelSerializer):
    telegram_channel_details = TelegramChannelDetailsSerializer(
        source="telegram_channel", read_only=True, allow_null=True
    )

    filtering_term_as_jinja2 = serializers.SerializerMethodField()
    filtering_term = serializers.CharField(required=False, allow_null=True, allow_blank=True)
    filtering_labels = LabelPairSerializer(many=True, required=False)

-   SELECT_RELATED = ["escalation_chain", "alert_receive_channel"]
+   SELECT_RELATED = ["escalation_chain", "alert_receive_channel", "slack_channel"]

    class Meta:
        model = ChannelFilter

@@ -96,30 +96,13 @@ class ChannelFilterSerializer(EagerLoadingMixin, serializers.ModelSerializer):
            raise serializers.ValidationError(["Filtering labels field is required"])
        else:
            raise serializers.ValidationError(["Expression type is incorrect"])

        if "slack_channel_id" in data:
            slack_channel = data.pop("slack_channel_id", None)
            data["slack_channel"] = slack_channel

        return data

-   def get_slack_channel(self, obj) -> SlackChannelDetails | None:
-       if obj.slack_channel_id is None:
-           return None
-       # display_name and id appears via annotate in ChannelFilterView.get_queryset()
-       return {
-           "display_name": obj.slack_channel_name,
-           "slack_id": obj.slack_channel_id,
-           "id": obj.slack_channel_pk,
-       }

-   def validate_slack_channel(self, slack_channel_id):
-       from apps.slack.models import SlackChannel
-
-       if slack_channel_id is not None:
-           slack_channel_id = slack_channel_id.upper()
-           organization = self.context["request"].auth.organization
-           try:
-               organization.slack_team_identity.get_cached_channels().get(slack_id=slack_channel_id)
-           except SlackChannel.DoesNotExist:
-               raise serializers.ValidationError(["Slack channel does not exist"])
-       return slack_channel_id

    def validate_notification_backends(self, notification_backends):
        # NOTE: updates the whole field, handling dict updates per backend
        if notification_backends is not None:

@@ -160,8 +143,11 @@ class ChannelFilterSerializer(EagerLoadingMixin, serializers.ModelSerializer):

class ChannelFilterCreateSerializer(ChannelFilterSerializer):
    alert_receive_channel = OrganizationFilteredPrimaryKeyRelatedField(queryset=AlertReceiveChannel.objects)
-   slack_channel = serializers.CharField(allow_null=True, required=False, source="slack_channel_id")
+   slack_channel = SlackChannelsFilteredByOrganizationSlackWorkspaceField(
+       allow_null=True,
+       required=False,
+       write_only=True,
+   )

    class Meta:
        model = ChannelFilter

@@ -182,25 +168,13 @@ class ChannelFilterCreateSerializer(ChannelFilterSerializer):
        ]
        read_only_fields = ["created_at", "is_default"]

-   def _get_slack_channel(self, obj) -> SlackChannelDetails | None:
-       if obj.slack_channel_id is None:
-           return None
-       slack_team_identity = self.context["request"].auth.organization.slack_team_identity
-       if slack_team_identity is None:
-           return None
-       slack_channel = slack_team_identity.get_cached_channels(slack_id=obj.slack_channel_id).first()
-       if slack_channel is None:
-           return None
-       return {
-           "display_name": slack_channel.name,
-           "slack_id": slack_channel.slack_id,
-           "id": slack_channel.public_primary_key,
-       }

    def to_representation(self, obj):
-       """add correct slack channel data to result after instance creation/update"""
+       """
+       This feels hacky.. it's because the UI currently POST/PUTs using "slack_channel", which is the SLACK ID of
+       the slack channel that we'd like to set it to, whereas what we return is an object with more details
+       """
        result = super().to_representation(obj)
-       result["slack_channel"] = self._get_slack_channel(obj)
+       result["slack_channel"] = SlackChannelSerializer(obj.slack_channel).data if obj.slack_channel else None
        return result

    def create(self, validated_data):

@@ -233,4 +207,4 @@ class ChannelFilterUpdateResponseSerializer(ChannelFilterUpdateSerializer):
    slack_channel = serializers.SerializerMethodField()

-   def get_slack_channel(self, obj) -> SlackChannelDetails | None:
-       return self._get_slack_channel(obj)
+   ...

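Review note: the `to_representation` override above swaps the write-format value (a raw Slack ID) for a nested read-format object. A standalone plain-Python sketch of that asymmetry, with no Django/DRF dependency and purely illustrative field names:

```python
def to_representation(instance: dict) -> dict:
    """Sketch of the read side: the client writes a Slack ID, but the
    response carries a nested object (or None when no channel is set)."""
    result = {"id": instance["id"]}
    slack_channel = instance.get("slack_channel")
    # mirrors: SlackChannelSerializer(obj.slack_channel).data if obj.slack_channel else None
    result["slack_channel"] = (
        {
            "id": slack_channel["public_primary_key"],
            "display_name": slack_channel["name"],
            "slack_id": slack_channel["slack_id"],
        }
        if slack_channel
        else None
    )
    return result


print(to_representation({"id": "CF1", "slack_channel": None}))
print(
    to_representation(
        {"id": "CF1", "slack_channel": {"public_primary_key": "SC1", "name": "alerts", "slack_id": "C123"}}
    )
)
```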
@@ -2,6 +2,7 @@ from dataclasses import asdict

from rest_framework import serializers

+from apps.api.serializers.slack_channel import SlackChannelSerializer
from apps.base.messaging import get_messaging_backend_from_id
from apps.base.models import LiveSetting
from apps.phone_notifications.phone_provider import get_phone_provider

@@ -21,12 +22,12 @@ class OrganizationSerializer(EagerLoadingMixin, serializers.ModelSerializer):
    slack_team_identity = FastSlackTeamIdentitySerializer(read_only=True)

    name = serializers.CharField(required=False, allow_null=True, allow_blank=True, source="org_title")
-   slack_channel = serializers.SerializerMethodField()
+   slack_channel = SlackChannelSerializer(read_only=True, source="default_slack_channel")

    rbac_enabled = serializers.BooleanField(read_only=True, source="is_rbac_permissions_enabled")
    grafana_incident_enabled = serializers.BooleanField(read_only=True, source="is_grafana_incident_enabled")

-   SELECT_RELATED = ["slack_team_identity"]
+   SELECT_RELATED = ["slack_team_identity", "slack_channel"]

    class Meta:
        model = Organization

@@ -47,22 +48,6 @@ class OrganizationSerializer(EagerLoadingMixin, serializers.ModelSerializer):
        "grafana_incident_enabled",
    ]

-   def get_slack_channel(self, obj):
-       from apps.slack.models import SlackChannel
-
-       if obj.general_log_channel_id is None or obj.slack_team_identity is None:
-           return None
-       try:
-           channel = obj.slack_team_identity.get_cached_channels().get(slack_id=obj.general_log_channel_id)
-       except SlackChannel.DoesNotExist:
-           return {"display_name": None, "slack_id": obj.general_log_channel_id, "id": None}
-
-       return {
-           "display_name": channel.name,
-           "slack_id": channel.slack_id,
-           "id": channel.public_primary_key,
-       }


class CurrentOrganizationSerializer(OrganizationSerializer):
    env_status = serializers.SerializerMethodField()

@@ -1,5 +1,6 @@
from rest_framework import serializers

+from apps.api.serializers.slack_channel import SlackChannelSerializer
from apps.api.serializers.user_group import UserGroupSerializer
from apps.schedules.models import OnCallSchedule
from apps.schedules.tasks import schedule_notify_about_empty_shifts_in_schedule, schedule_notify_about_gaps_in_schedule

@@ -12,7 +13,7 @@ class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):
    id = serializers.CharField(read_only=True, source="public_primary_key")
    organization = serializers.HiddenField(default=CurrentOrganizationDefault())
    team = TeamPrimaryKeyRelatedField(allow_null=True, required=False)
-   slack_channel = serializers.SerializerMethodField()
+   slack_channel = SlackChannelSerializer(read_only=True)
    user_group = UserGroupSerializer()
    warnings = serializers.SerializerMethodField()
    on_call_now = serializers.SerializerMethodField()

@@ -37,7 +38,7 @@ class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):
        "enable_web_overrides",
    ]

-   SELECT_RELATED = ["organization", "team", "user_group"]
+   SELECT_RELATED = ["organization", "team", "user_group", "slack_channel"]

    CANT_UPDATE_USER_GROUP_WARNING = (
        "Cannot update the user group, make sure to grant user group modification rights to "

@@ -46,15 +47,6 @@ class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):
    SCHEDULE_HAS_GAPS_WARNING = "Schedule has unassigned time periods during next 7 days"
    SCHEDULE_HAS_EMPTY_SHIFTS_WARNING = "Schedule has empty shifts during next 7 days"

-   def get_slack_channel(self, obj):
-       if obj.channel is None:
-           return None
-       return {
-           "display_name": obj.slack_channel_name,
-           "slack_id": obj.channel,
-           "id": obj.slack_channel_pk,
-       }

    def get_warnings(self, obj):
        can_update_user_groups = self.context.get("can_update_user_groups", False)
        warnings = []

@@ -83,8 +75,8 @@ class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):

    def validate(self, attrs):
        if "slack_channel_id" in attrs:
-           slack_channel_id = attrs.pop("slack_channel_id", None)
-           attrs["channel"] = slack_channel_id.slack_id if slack_channel_id is not None else None
+           # this is set in the serializer classes which subclass ScheduleBaseSerializer
+           attrs["slack_channel"] = attrs.pop("slack_channel_id", None)
        return attrs

    def create(self, validated_data):

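Review note: the `validate` change above is the behavioural pivot of this commit for schedules — the write-only `slack_channel_id` input used to be flattened into a raw Slack ID on `attrs["channel"]`, and now the related object is passed through under the FK-style `slack_channel` key. A minimal dependency-free sketch of the new key-renaming step (inputs are illustrative):

```python
def validate(attrs: dict) -> dict:
    """Sketch of the new ScheduleBaseSerializer.validate: pop the write-only
    input key and re-home it under "slack_channel"; None unsets the channel."""
    if "slack_channel_id" in attrs:
        attrs["slack_channel"] = attrs.pop("slack_channel_id", None)
    return attrs


print(validate({"name": "sched", "slack_channel_id": "<SlackChannel instance>"}))
print(validate({"name": "sched"}))  # untouched when the key is absent
```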
@@ -26,6 +26,7 @@ class ScheduleCalendarCreateSerializer(ScheduleCalendarSerializer):
        queryset=SlackChannel.objects,
        required=False,
        allow_null=True,
+       write_only=True,
    )
    user_group = OrganizationFilteredPrimaryKeyRelatedField(
        filter_field="slack_team_identity__organizations",

@@ -33,6 +33,7 @@ class ScheduleICalCreateSerializer(ScheduleICalSerializer):
        queryset=SlackChannel.objects,
        required=False,
        allow_null=True,
+       write_only=True,
    )
    user_group = OrganizationFilteredPrimaryKeyRelatedField(
        filter_field="slack_team_identity__organizations",

@@ -41,12 +42,6 @@ class ScheduleICalCreateSerializer(ScheduleICalSerializer):
        allow_null=True,
    )

-   def create(self, validated_data):
-       created_schedule = super().create(validated_data)
-       # for iCal-based schedules we need to refresh final schedule information
-       refresh_ical_final_schedule.apply_async((created_schedule.pk,))
-       return created_schedule

    class Meta:
        model = OnCallScheduleICal
        fields = [

@@ -60,6 +55,12 @@ class ScheduleICalCreateSerializer(ScheduleICalSerializer):
        "ical_url_overrides": {"required": False, "allow_null": True},
    }

+   def create(self, validated_data):
+       created_schedule = super().create(validated_data)
+       # for iCal-based schedules we need to refresh final schedule information
+       refresh_ical_final_schedule.apply_async((created_schedule.pk,))
+       return created_schedule


class ScheduleICalUpdateSerializer(ScheduleICalCreateSerializer):
    class Meta:

@@ -12,7 +12,7 @@ from common.api_helpers.mixins import EagerLoadingMixin


class PolymorphicScheduleSerializer(EagerLoadingMixin, PolymorphicSerializer):
-   SELECT_RELATED = ["organization", "user_group", "team"]
+   SELECT_RELATED = ["organization", "team", "user_group", "slack_channel"]

    resource_type_field_name = "type"

@@ -22,6 +22,7 @@ class ScheduleWebCreateSerializer(ScheduleWebSerializer):
        queryset=SlackChannel.objects,
        required=False,
        allow_null=True,
+       write_only=True,
    )
    user_group = OrganizationFilteredPrimaryKeyRelatedField(
        filter_field="slack_team_identity__organizations",

@@ -1,8 +1,16 @@
+import typing
+
from rest_framework import serializers

from apps.slack.models import SlackChannel


+class SlackChannelDetails(typing.TypedDict):
+    display_name: str
+    slack_id: str
+    id: str
+
+
class SlackChannelSerializer(serializers.ModelSerializer):
    id = serializers.CharField(read_only=True, source="public_primary_key")
    display_name = serializers.CharField(source="name")

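Review note: the `SlackChannelDetails` shape that the serializers in this commit converge on can be exercised standalone. A minimal plain-Python sketch (no Django/DRF; values are hypothetical) of the field mapping the `ModelSerializer` declares — `display_name` from `name`, `id` from `public_primary_key`:

```python
import typing


class SlackChannelDetails(typing.TypedDict):
    """Mirror of the TypedDict moved into slack_channel.py above."""
    display_name: str
    slack_id: str
    id: str


def serialize_slack_channel(name: str, slack_id: str, public_primary_key: str) -> SlackChannelDetails:
    # same mapping SlackChannelSerializer declares via `source=` arguments
    return {"display_name": name, "slack_id": slack_id, "id": public_primary_key}


print(serialize_slack_channel("alerts", "C123456", "KX1ABC"))
```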
@@ -685,3 +685,64 @@ def test_channel_filter_long_filtering_term(

    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert "Expression is too long" in response.json()["non_field_errors"][0]


@pytest.mark.django_db
def test_channel_filter_with_slack_channel_crud(
    make_organization,
    make_user_for_organization,
    make_token_for_organization,
    make_slack_team_identity,
    make_slack_channel,
    make_alert_receive_channel,
    make_user_auth_headers,
):
    slack_team_identity = make_slack_team_identity()
    slack_channel1 = make_slack_channel(slack_team_identity)
    slack_channel2 = make_slack_channel(slack_team_identity)

    organization = make_organization(slack_team_identity=slack_team_identity)
    user = make_user_for_organization(organization, role=LegacyAccessControlRole.ADMIN)
    _, token = make_token_for_organization(organization)

    alert_receive_channel = make_alert_receive_channel(organization)

    client = APIClient()
    auth_headers = make_user_auth_headers(user, token)

    # create the channel filter
    response = client.post(
        reverse("api-internal:channel_filter-list"),
        data={
            "alert_receive_channel": alert_receive_channel.public_primary_key,
            "slack_channel": slack_channel1.slack_id,
        },
        format="json",
        **auth_headers,
    )
    created_channel_filter = response.json()

    assert response.status_code == status.HTTP_201_CREATED
    assert created_channel_filter["slack_channel"] == {
        "id": slack_channel1.public_primary_key,
        "display_name": slack_channel1.name,
        "slack_id": slack_channel1.slack_id,
    }

    # update the slack channel
    url = reverse("api-internal:channel_filter-detail", kwargs={"pk": created_channel_filter["id"]})

    response = client.patch(url, data={"slack_channel": slack_channel2.slack_id}, format="json", **auth_headers)

    assert response.status_code == status.HTTP_200_OK
    assert response.json()["slack_channel"] == {
        "id": slack_channel2.public_primary_key,
        "display_name": slack_channel2.name,
        "slack_id": slack_channel2.slack_id,
    }

    # remove the slack channel
    response = client.patch(url, data={"slack_channel": None}, format="json", **auth_headers)

    assert response.status_code == status.HTTP_200_OK
    assert response.json()["slack_channel"] is None

@@ -240,6 +240,7 @@ def test_organization_get_channel_verification_code_invalid(
@pytest.mark.django_db
def test_get_organization_slack_config_checks(
    make_organization_and_user_with_plugin_token,
+   make_slack_channel,
    make_slack_team_identity,
    make_alert_receive_channel,
    make_channel_filter,

@@ -283,7 +284,7 @@ def test_get_organization_slack_config_checks(
    assert response.json() == expected_result

    # connect integration to Slack (set a channel)
-   channel_filter.slack_channel_id = "C123456"
+   channel_filter.slack_channel = make_slack_channel(slack_team_identity)
    channel_filter.save()

    response = client.get(url, format="json", **make_user_auth_headers(user, token))

@@ -640,7 +640,6 @@ def test_create_calendar_schedule(schedule_internal_api_setup, make_user_auth_he
        "type": 0,
        "name": "created_calendar_schedule",
        "time_zone": "UTC",
-       "slack_channel_id": None,
        "user_group": None,
        "team": None,
        "warnings": [],

@@ -671,7 +670,6 @@ def test_create_ical_schedule(schedule_internal_api_setup, make_user_auth_header
        "ical_url_overrides": None,
        "name": "created_ical_schedule",
        "type": 1,
-       "slack_channel_id": None,
        "user_group": None,
        "team": None,
        "warnings": [],

@@ -706,7 +704,6 @@ def test_create_web_schedule(schedule_internal_api_setup, make_user_auth_headers
        "name": "created_web_schedule",
        "type": 2,
        "time_zone": "UTC",
-       "slack_channel_id": None,
        "user_group": None,
        "team": None,
        "warnings": [],

@@ -2459,6 +2456,90 @@ def test_team_not_updated_if_not_in_data(
    assert schedule.team == team


# we don't need to validate the ical URL when creating an ical schedule.. so just patch that functionality
@patch("apps.api.serializers.schedule_ical.ScheduleICalSerializer.validate_ical_url_primary", return_value=ICAL_URL)
@pytest.mark.parametrize(
    "schedule_type,other_create_data",
    [
        (0, {}),
        (1, {"ical_url_primary": ICAL_URL}),
        (2, {}),
    ],
)
@pytest.mark.django_db
def test_can_update_slack_channel(
    _mock_validate_ical_url_primary,
    make_organization_and_user_with_plugin_token,
    make_slack_team_identity,
    make_slack_channel,
    make_user_auth_headers,
    schedule_type,
    other_create_data,
):
    organization, user, token = make_organization_and_user_with_plugin_token()
    auth_headers = make_user_auth_headers(user, token)
    slack_team_identity = make_slack_team_identity()
    organization.slack_team_identity = slack_team_identity
    organization.save()

    slack_channel1 = make_slack_channel(slack_team_identity)
    slack_channel2 = make_slack_channel(slack_team_identity)

    client = APIClient()

    # we can set it when creating
    response = client.post(
        reverse("api-internal:schedule-list"),
        {
            "name": "created_schedule",
            "type": schedule_type,
            "slack_channel_id": slack_channel1.public_primary_key,
            **other_create_data,
        },
        format="json",
        **auth_headers,
    )

    assert response.status_code == status.HTTP_201_CREATED

    response_data = response.json()
    schedule_id = response_data["id"]
    url = reverse("api-internal:schedule-detail", kwargs={"pk": schedule_id})

    # NOTE: the response returned by the POST/PUT endpoint currently doesn't include slack_channel_id
    # as it's not used by the UI.. additionally, there was already a bug in it that despite specifying it, it
    # would return null.. the proper way to refactor this is to change the name of slack_channel_id used in the
    # request (as this clashes with the slack_channel_id db column)
    def _assert_slack_channel_updated(new_slack_channel):
        response = client.get(url, **auth_headers)

        assert response.status_code == status.HTTP_200_OK
        assert response.json()["slack_channel"] == new_slack_channel

    # we can update it
    response = client.patch(
        url,
        data={
            "slack_channel_id": slack_channel2.public_primary_key,
        },
        format="json",
        **auth_headers,
    )
    assert response.status_code == status.HTTP_200_OK
    _assert_slack_channel_updated(
        {
            "id": slack_channel2.public_primary_key,
            "display_name": slack_channel2.name,
            "slack_id": slack_channel2.slack_id,
        }
    )

    # we can unset it
    response = client.patch(url, data={"slack_channel_id": None}, format="json", **auth_headers)
    assert response.status_code == status.HTTP_200_OK
    _assert_slack_channel_updated(None)


@patch.object(SlackUserGroup, "can_be_updated", new_callable=PropertyMock)
@pytest.mark.django_db
def test_can_update_user_groups(

@@ -1,36 +0,0 @@
from unittest.mock import patch

import pytest
from django.urls import reverse
from rest_framework import status
from rest_framework.response import Response
from rest_framework.test import APIClient

from apps.api.permissions import LegacyAccessControlRole


# Testing permissions, not view itself. So mock is ok here
@pytest.mark.django_db
@pytest.mark.parametrize(
    "role,expected_status",
    [
        (LegacyAccessControlRole.ADMIN, status.HTTP_200_OK),
        (LegacyAccessControlRole.EDITOR, status.HTTP_403_FORBIDDEN),
        (LegacyAccessControlRole.VIEWER, status.HTTP_403_FORBIDDEN),
        (LegacyAccessControlRole.NONE, status.HTTP_403_FORBIDDEN),
    ],
)
def test_set_general_log_channel_permissions(
    make_organization_and_user_with_plugin_token,
    make_user_auth_headers,
    role,
    expected_status,
):
    _, user, token = make_organization_and_user_with_plugin_token(role)
    client = APIClient()

    url = reverse("api-internal:api-set-general-log-channel")
    with patch("apps.api.views.organization.SetGeneralChannel.post", return_value=Response(status=status.HTTP_200_OK)):
        response = client.post(url, format="json", **make_user_auth_headers(user, token))

    assert response.status_code == expected_status
engine/apps/api/tests/test_set_org_default_slack_channel.py (new file, 89 lines)

@@ -0,0 +1,89 @@
from unittest.mock import patch

import pytest
from django.urls import reverse
from rest_framework import status
from rest_framework.response import Response
from rest_framework.test import APIClient

from apps.api.permissions import LegacyAccessControlRole


# Testing permissions, not view itself. So mock is ok here
@pytest.mark.django_db
@pytest.mark.parametrize(
    "role,expected_status",
    [
        (LegacyAccessControlRole.ADMIN, status.HTTP_200_OK),
        (LegacyAccessControlRole.EDITOR, status.HTTP_403_FORBIDDEN),
        (LegacyAccessControlRole.VIEWER, status.HTTP_403_FORBIDDEN),
        (LegacyAccessControlRole.NONE, status.HTTP_403_FORBIDDEN),
    ],
)
def test_set_org_default_slack_channel_permissions(
    make_organization_and_user_with_plugin_token,
    make_user_auth_headers,
    role,
    expected_status,
):
    _, user, token = make_organization_and_user_with_plugin_token(role)
    client = APIClient()

    url = reverse("api-internal:set-default-slack-channel")
    with patch(
        "apps.api.views.organization.SetDefaultSlackChannel.post", return_value=Response(status=status.HTTP_200_OK)
    ):
        response = client.post(url, format="json", **make_user_auth_headers(user, token))

    assert response.status_code == expected_status


@pytest.mark.django_db
def test_set_organization_slack_default_channel(
    make_organization_and_user_with_plugin_token,
    make_slack_team_identity,
    make_slack_channel,
    make_user_auth_headers,
):
    slack_team_identity = make_slack_team_identity()
    slack_channel = make_slack_channel(slack_team_identity)

    organization, user, token = make_organization_and_user_with_plugin_token()
    organization.slack_team_identity = slack_team_identity
    organization.save()

    auth_headers = make_user_auth_headers(user, token)

    assert organization.default_slack_channel is None

    client = APIClient()

    def _update_default_slack_channel(slack_channel_id):
        # this endpoint doesn't return any data..
        response = client.post(
            reverse("api-internal:set-default-slack-channel"),
            data={
                "id": slack_channel_id,
            },
            format="json",
            **auth_headers,
        )
        assert response.status_code == status.HTTP_200_OK

    def _assert_default_slack_channel_is_updated(slack_channel_id):
        response = client.get(reverse("api-internal:api-organization"), format="json", **auth_headers)
        assert response.status_code == status.HTTP_200_OK
        assert response.json()["slack_channel"] == slack_channel_id

    _update_default_slack_channel(slack_channel.public_primary_key)
    _assert_default_slack_channel_is_updated(
        {
            "id": slack_channel.public_primary_key,
            "display_name": slack_channel.name,
            "slack_id": slack_channel.slack_id,
        }
    )

    # NOTE: currently the endpoint doesn't allow to remove default slack channel, if and when it does, uncomment this
    # _update_default_slack_channel(None)
    # _assert_default_slack_channel_is_updated(None)

@@ -22,7 +22,7 @@ from .views.organization import (
    GetChannelVerificationCode,
    GetTelegramVerificationCode,
    OrganizationConfigChecksView,
-   SetGeneralChannel,
+   SetDefaultSlackChannel,
)
from .views.preview_template_options import PreviewTemplateOptionsView
from .views.public_api_tokens import PublicApiTokenView

@@ -71,7 +71,7 @@ router.register(r"shift_swaps", ShiftSwapViewSet, basename="shift_swap")
urlpatterns = [
    path("", include(router.urls)),
    optional_slash_path("user", CurrentUserView.as_view(), name="api-user"),
-   optional_slash_path("set_general_channel", SetGeneralChannel.as_view(), name="api-set-general-log-channel"),
+   optional_slash_path("set_general_channel", SetDefaultSlackChannel.as_view(), name="set-default-slack-channel"),
    optional_slash_path("organization", CurrentOrganizationView.as_view(), name="api-organization"),
    optional_slash_path(
        "organization/config-checks",

@@ -1,4 +1,3 @@
-from django.db.models import OuterRef, Subquery
 from django_filters import rest_framework as filters
 from drf_spectacular.utils import extend_schema, extend_schema_view
 from rest_framework import status

@@ -15,7 +14,6 @@ from apps.api.serializers.channel_filter import (
     ChannelFilterUpdateSerializer,
 )
 from apps.auth_token.auth import PluginAuthentication
-from apps.slack.models import SlackChannel
 from common.api_helpers.exceptions import BadRequest
 from common.api_helpers.filters import ModelFieldFilterMixin, MultipleChoiceCharFilter, get_integration_queryset
 from common.api_helpers.mixins import (

@@ -82,22 +80,15 @@ class ChannelFilterView(
     TEAM_LOOKUP = "alert_receive_channel__team"

     def get_queryset(self, ignore_filtering_by_available_teams=False):
-        slack_channels_subq = SlackChannel.objects.filter(
-            slack_id=OuterRef("slack_channel_id"),
-            slack_team_identity=self.request.auth.organization.slack_team_identity,
-        ).order_by("pk")
-
         queryset = ChannelFilter.objects.filter(
             alert_receive_channel__organization=self.request.auth.organization,
             alert_receive_channel__deleted_at=None,
-        ).annotate(
-            slack_channel_name=Subquery(slack_channels_subq.values("name")[:1]),
-            slack_channel_pk=Subquery(slack_channels_subq.values("public_primary_key")[:1]),
         )
         if not ignore_filtering_by_available_teams:
             queryset = queryset.filter(*self.available_teams_lookup_args).distinct()
-        queryset = self.serializer_class.setup_eager_loading(queryset)
-        return queryset
+
+        return self.serializer_class.setup_eager_loading(queryset)

     def destroy(self, request, *args, **kwargs):
         instance = self.get_object()
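The hunk above drops a correlated `Subquery` annotation in favor of resolving the Slack channel through a foreign key. A plain-Python sketch of what the removed annotation computed — for each channel filter, look up the matching channel's name within the organization's workspace — using hypothetical sample data rather than the actual OnCall models:

```python
# Hypothetical sample data standing in for SlackChannel rows.
slack_channels = [
    {"slack_id": "C1", "team": "T1", "name": "alerts"},
    {"slack_id": "C2", "team": "T1", "name": "ops"},
]


def annotate_slack_channel_name(channel_filters, channels, team):
    """Emulate Subquery(OuterRef("slack_channel_id")) with a dict lookup:
    each filter row gets the name of its Slack channel, or None."""
    by_id = {c["slack_id"]: c["name"] for c in channels if c["team"] == team}
    return [
        {**f, "slack_channel_name": by_id.get(f["slack_channel_id"])}
        for f in channel_filters
    ]


filters_ = [{"id": 1, "slack_channel_id": "C2"}, {"id": 2, "slack_channel_id": None}]
result = annotate_slack_channel_name(filters_, slack_channels, "T1")
```

With a real `slack_channel` foreign key, this per-row lookup becomes unnecessary: `select_related("slack_channel")` fetches the related row in the same query, which is what `setup_eager_loading` now does.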
@@ -108,7 +108,7 @@ class GetChannelVerificationCode(APIView):
         return Response(code)


-class SetGeneralChannel(APIView):
+class SetDefaultSlackChannel(APIView):
     authentication_classes = (PluginAuthentication,)
     permission_classes = (IsAuthenticated, RBACPermission)

@@ -127,6 +127,6 @@ class SetGeneralChannel(APIView):
             public_primary_key=slack_channel_id, slack_team_identity=slack_team_identity
         )

-        organization.set_general_log_channel(slack_channel.slack_id, slack_channel.name, request.user)
+        organization.set_default_slack_channel(slack_channel, request.user)

         return Response(status=200)
@@ -37,7 +37,6 @@ from apps.mobile_app.auth import MobileAppAuthTokenAuthentication
 from apps.schedules.constants import PREFETCHED_SHIFT_SWAPS
 from apps.schedules.ical_utils import get_oncall_users_for_multiple_schedules
 from apps.schedules.models import OnCallSchedule, ShiftSwapRequest
-from apps.slack.models import SlackChannel
 from apps.slack.tasks import update_slack_user_group_for_schedules
 from common.api_helpers.exceptions import BadRequest, Conflict
 from common.api_helpers.filters import ByTeamModelFieldFilterMixin, ModelFieldFilterMixin, TeamModelMultipleChoiceFilter

@@ -171,11 +170,6 @@ class ScheduleView(
     def _annotate_queryset(self, queryset):
         """Annotate queryset with additional schedule metadata."""
-        organization = self.request.auth.organization
-        slack_channels = SlackChannel.objects.filter(
-            slack_team_identity=organization.slack_team_identity,
-            slack_id=OuterRef("channel"),
-        )
         escalation_policies = (
             EscalationPolicy.objects.values("notify_schedule")
             .order_by("notify_schedule")

@@ -183,8 +177,6 @@ class ScheduleView(
             .filter(notify_schedule=OuterRef("id"))
         )
         queryset = queryset.annotate(
-            slack_channel_name=Subquery(slack_channels.values("name")[:1]),
-            slack_channel_pk=Subquery(slack_channels.values("public_primary_key")[:1]),
             num_escalation_chains=Subquery(escalation_policies.values("num_escalation_chains")[:1]),
         )
         return queryset

@@ -317,13 +309,14 @@ class ScheduleView(
         datetime_end = datetime_start + datetime.timedelta(days=1)
         events = schedule.filter_events(datetime_start, datetime_end, with_empty=with_empty, with_gap=with_gap)

+        schedule_slack_channel = schedule.slack_channel
         slack_channel = (
             {
-                "id": schedule.slack_channel_pk,
-                "slack_id": schedule.channel,
-                "display_name": schedule.slack_channel_name,
+                "id": schedule_slack_channel.public_primary_key,
+                "slack_id": schedule_slack_channel.slack_id,
+                "display_name": schedule_slack_channel.name,
             }
-            if schedule.channel is not None
+            if schedule_slack_channel is not None
             else None
         )
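The last hunk builds the response's `slack_channel` payload from an optional related object instead of scattered annotated fields. A self-contained sketch of that shape, with a hypothetical dataclass standing in for `apps.slack.models.SlackChannel`:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SlackChannelStub:
    """Hypothetical stand-in for the SlackChannel model."""

    public_primary_key: str
    slack_id: str
    name: str


def slack_channel_payload(channel: Optional[SlackChannelStub]) -> Optional[dict]:
    # Mirrors the new response shape: None when no channel is attached,
    # otherwise a dict built from the related object's attributes.
    if channel is None:
        return None
    return {
        "id": channel.public_primary_key,
        "slack_id": channel.slack_id,
        "display_name": channel.name,
    }


payload = slack_channel_payload(SlackChannelStub("SC1", "C123", "alerts"))
```

Keeping the null check on the object itself (rather than on a raw `channel` ID column) means the three fields always come from one consistent row.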
@@ -9,7 +9,7 @@ from rest_framework import exceptions
 from rest_framework.authentication import BaseAuthentication, get_authorization_header
 from rest_framework.request import Request

-from apps.api.permissions import GrafanaAPIPermissions, LegacyAccessControlRole, RBACPermission, user_is_authorized
+from apps.api.permissions import GrafanaAPIPermissions, LegacyAccessControlRole
 from apps.grafana_plugin.helpers.gcom import check_token
 from apps.grafana_plugin.sync_data import SyncPermission, SyncUser
 from apps.user_management.exceptions import OrganizationDeletedException, OrganizationMovedException

@@ -52,10 +52,8 @@ class ApiTokenAuthentication(BaseAuthentication):
         auth = get_authorization_header(request).decode("utf-8")
         user, auth_token = self.authenticate_credentials(auth)

-        if not user.is_active or not user_is_authorized(user, [RBACPermission.Permissions.API_KEYS_WRITE]):
-            raise exceptions.AuthenticationFailed(
-                "Only users with Admin permissions are allowed to perform this action."
-            )
+        if not user.is_active:
+            raise exceptions.AuthenticationFailed("Only active users are allowed to perform this action.")

         return user, auth_token
@@ -82,7 +82,6 @@ def make_schedule_with_on_call_shift(make_schedule, make_on_call_shift):
     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
-        channel="channel",
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -169,9 +169,17 @@ def notify_about_integration_ratelimit_in_slack(organization_id, text, **kwargs)
     else:
         cache.set(cache_key, True, 60 * 15)  # Set cache before sending message to make sure we don't ratelimit slack
         slack_team_identity = organization.slack_team_identity
-        if slack_team_identity is not None:
+        org_default_slack_channel_id = organization.default_slack_channel_slack_id
+
+        if slack_team_identity is not None and org_default_slack_channel_id is not None:
             try:
                 sc = SlackClient(slack_team_identity, enable_ratelimit_retry=True)
-                sc.chat_postMessage(channel=organization.general_log_channel_id, text=text)
+                sc.chat_postMessage(channel=org_default_slack_channel_id, text=text)
             except SlackAPIError as e:
                 logger.warning(f"Slack exception {e} while sending message for organization {organization_id}")
+        else:
+            logger.info(
+                f"Slack team identity or general log channel is not set for organization {organization_id} "
+                f"skipping rest of notify_about_integration_ratelimit_in_slack "
+                f"slack_team_identity={slack_team_identity} org_default_slack_channel_id={org_default_slack_channel_id}"
+            )
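The task above sets the cache key *before* posting so that concurrent or retried runs can't double-send the rate-limit notice. A minimal self-contained sketch of that guard, with a fake in-memory cache standing in for `django.core.cache.cache` and a list standing in for `chat_postMessage`:

```python
import time


class FakeCache:
    """Minimal stand-in for Django's cache with get/set and a timeout."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        entry = self._data.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]
        return None

    def set(self, key, value, timeout):
        self._data[key] = (value, time.monotonic() + timeout)


sent = []  # stand-in for messages actually posted to Slack


def notify_once(cache, org_id, text):
    """Send at most one notice per org per window; set the cache before
    sending so a concurrent caller sees the key and bails out."""
    key = f"ratelimit_notify_{org_id}"
    if cache.get(key):
        return False
    cache.set(key, True, 60 * 15)
    sent.append((org_id, text))
    return True


cache = FakeCache()
first = notify_once(cache, 1, "rate limited")
second = notify_once(cache, 1, "rate limited")  # suppressed by the guard
```

Note the window: if the Slack call fails after the key is set, no retry happens for 15 minutes — a deliberate trade-off favoring "at most once" over "at least once".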
@@ -2,7 +2,10 @@ from rest_framework import fields, serializers

 from apps.alerts.models import AlertReceiveChannel, ChannelFilter, EscalationChain
 from apps.base.messaging import get_messaging_backend_from_id, get_messaging_backends
-from common.api_helpers.custom_fields import OrganizationFilteredPrimaryKeyRelatedField
+from common.api_helpers.custom_fields import (
+    OrganizationFilteredPrimaryKeyRelatedField,
+    SlackChannelsFilteredByOrganizationSlackWorkspaceField,
+)
 from common.api_helpers.exceptions import BadRequest
 from common.api_helpers.mixins import EagerLoadingMixin
 from common.api_helpers.utils import valid_jinja_template_for_serializer_method_field

@@ -11,6 +14,11 @@ from common.ordered_model.serializer import OrderedModelSerializer
 from common.utils import is_regex_valid


+class SlackSerializer(serializers.Serializer):
+    channel_id = SlackChannelsFilteredByOrganizationSlackWorkspaceField(required=False, allow_null=True)
+    enabled = serializers.BooleanField(required=False, allow_null=True)
+
+
 class BaseChannelFilterSerializer(OrderedModelSerializer):
     """Base Channel Filter serializer with validation methods"""

@@ -25,13 +33,19 @@ class BaseChannelFilterSerializer(OrderedModelSerializer):
             self._declared_fields[field] = serializers.DictField(required=False)
             self.Meta.fields.append(field)

-    def to_representation(self, instance):
-        result = super().to_representation(instance)
-        result["slack"] = {"channel_id": instance.slack_channel_id, "enabled": bool(instance.notify_in_slack)}
-        result["telegram"] = {
-            "id": instance.telegram_channel.public_primary_key if instance.telegram_channel else None,
-            "enabled": bool(instance.notify_in_telegram),
+    def to_representation(self, instance: ChannelFilter):
+        result = {
+            **super().to_representation(instance),
+            "slack": {
+                "channel_id": instance.slack_channel_slack_id,
+                "enabled": bool(instance.notify_in_slack),
+            },
+            "telegram": {
+                "id": instance.telegram_channel.public_primary_key if instance.telegram_channel else None,
+                "enabled": bool(instance.notify_in_telegram),
+            },
         }

         # add representation for other messaging backends
         for backend_id, backend in get_messaging_backends():
             if backend is None:

@@ -45,18 +59,16 @@ class BaseChannelFilterSerializer(OrderedModelSerializer):
             result[field] = {"id": channel_id, "enabled": notification_enabled}
         return result

-    def _correct_validated_data(self, validated_data):
+    def _correct_validated_data(self, validated_data: dict) -> dict:
         organization = self.context["request"].auth.organization

-        slack_field = validated_data.pop("slack", {})
-        if slack_field:
+        if slack_field := validated_data.pop("slack", {}):
             if "channel_id" in slack_field:
-                validated_data["slack_channel_id"] = self._validate_slack_channel_id(slack_field.get("channel_id"))
+                validated_data["slack_channel"] = slack_field["channel_id"]
             if "enabled" in slack_field:
-                validated_data["notify_in_slack"] = bool(slack_field.get("enabled"))
+                validated_data["notify_in_slack"] = slack_field["enabled"]

-        telegram_field = validated_data.pop("telegram", {})
-        if telegram_field:
+        if telegram_field := validated_data.pop("telegram", {}):
             if "id" in telegram_field:
                 validated_data["telegram_channel"] = self._validate_telegram_channel(telegram_field.get("id"))
             if "enabled" in telegram_field:

@@ -78,23 +90,9 @@ class BaseChannelFilterSerializer(OrderedModelSerializer):
                 notification_backends[backend_id] = notification_backend
         if notification_backends:
             validated_data["notification_backends"] = notification_backends

         return validated_data

-    def _validate_slack_channel_id(self, slack_channel_id):
-        from apps.slack.models import SlackChannel
-
-        if slack_channel_id is not None:
-            slack_channel_id = slack_channel_id.upper()
-            organization = self.context["request"].auth.organization
-            slack_team_identity = organization.slack_team_identity
-            if not slack_team_identity:
-                raise BadRequest(detail="Slack isn't connected to this workspace")
-            try:
-                slack_team_identity.get_cached_channels().get(slack_id=slack_channel_id)
-            except SlackChannel.DoesNotExist:
-                raise BadRequest(detail="Slack channel does not exist")
-        return slack_channel_id
-
     def _validate_telegram_channel(self, telegram_channel_id):
         from apps.telegram.models import TelegramToOrganizationConnector

@@ -132,7 +130,7 @@ class RoutingTypeField(fields.CharField):

 class ChannelFilterSerializer(EagerLoadingMixin, BaseChannelFilterSerializer):
     id = serializers.CharField(read_only=True, source="public_primary_key")
-    slack = serializers.DictField(required=False)
+    slack = SlackSerializer(required=False)
     telegram = serializers.DictField(required=False)
     routing_type = RoutingTypeField(allow_null=False, required=False, source="filtering_term_type")
     routing_regex = serializers.CharField(allow_null=False, required=True, source="filtering_term")

@@ -147,7 +145,7 @@ class ChannelFilterSerializer(EagerLoadingMixin, BaseChannelFilterSerializer):
     is_the_last_route = serializers.BooleanField(read_only=True, source="is_default")

-    SELECT_RELATED = ["alert_receive_channel", "escalation_chain"]
+    SELECT_RELATED = ["alert_receive_channel", "escalation_chain", "slack_channel"]

     class Meta:
         model = ChannelFilter

@@ -214,7 +212,7 @@ class ChannelFilterUpdateSerializer(ChannelFilterSerializer):

 class DefaultChannelFilterSerializer(BaseChannelFilterSerializer):
     id = serializers.CharField(read_only=True, source="public_primary_key")
-    slack = serializers.DictField(required=False)
+    slack = SlackSerializer(required=False)
     telegram = serializers.DictField(required=False)
     escalation_chain_id = OrganizationFilteredPrimaryKeyRelatedField(
         queryset=EscalationChain.objects,
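The rewritten `_correct_validated_data` above leans on the walrus operator to pop an optional nested payload and process it only when it is non-empty. A dependency-free sketch of that pattern (the dict keys mirror the serializer's, but this is not the actual OnCall code):

```python
def correct_validated_data(validated_data: dict) -> dict:
    """Pop the optional nested "slack" payload; if it's present and
    non-empty, flatten its keys onto the top-level validated data."""
    if slack_field := validated_data.pop("slack", {}):
        if "channel_id" in slack_field:
            validated_data["slack_channel"] = slack_field["channel_id"]
        if "enabled" in slack_field:
            validated_data["notify_in_slack"] = slack_field["enabled"]
    return validated_data


out = correct_validated_data(
    {"slack": {"channel_id": "C1", "enabled": True}, "routing_regex": ".*"}
)
untouched = correct_validated_data({"routing_regex": ".*"})
```

Because `pop("slack", {})` defaults to an empty dict, the same line handles both "key absent" and "key present but empty" without a separate existence check.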
@@ -1,28 +1,41 @@
 import datetime
+import typing

 from rest_framework import serializers

 from apps.schedules.ical_utils import list_users_to_notify_from_ical
-from apps.slack.models import SlackUserGroup
-from common.api_helpers.custom_fields import TeamPrimaryKeyRelatedField
-from common.api_helpers.exceptions import BadRequest
+from common.api_helpers.custom_fields import (
+    SlackChannelsFilteredByOrganizationSlackWorkspaceField,
+    SlackUserGroupsFilteredByOrganizationSlackWorkspaceField,
+    TeamPrimaryKeyRelatedField,
+)
 from common.api_helpers.mixins import EagerLoadingMixin

+if typing.TYPE_CHECKING:
+    from apps.schedules.models import OnCallSchedule
+
+
+class SlackSerializer(serializers.Serializer):
+    channel_id = SlackChannelsFilteredByOrganizationSlackWorkspaceField(required=False, allow_null=True)
+    user_group_id = SlackUserGroupsFilteredByOrganizationSlackWorkspaceField(required=False, allow_null=True)
+
+
+# TODO: update the following once we bump mypy to 1.11 (which supports generics)
+# class ScheduleBaseSerializer[M: "OnCallSchedule"](EagerLoadingMixin, serializers.ModelSerializer[M]):
 class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):
     id = serializers.CharField(read_only=True, source="public_primary_key")
     on_call_now = serializers.SerializerMethodField()
-    slack = serializers.DictField(required=False)
+    slack = SlackSerializer(required=False)
     team_id = TeamPrimaryKeyRelatedField(required=False, allow_null=True, source="team")

-    SELECT_RELATED = ["team", "user_group"]
+    SELECT_RELATED = ["team", "user_group", "slack_channel"]

     def create(self, validated_data):
         validated_data = self._correct_validated_data(validated_data)
         validated_data["organization"] = self.context["request"].auth.organization
         return super().create(validated_data)

-    def get_on_call_now(self, obj):
+    def get_on_call_now(self, obj: "OnCallSchedule") -> typing.List[str]:
         users_on_call = list_users_to_notify_from_ical(obj, datetime.datetime.now(datetime.timezone.utc))
         if users_on_call is not None:
             return [user.public_primary_key for user in users_on_call]

@@ -30,54 +43,24 @@ class ScheduleBaseSerializer(EagerLoadingMixin, serializers.ModelSerializer):
         return []

     def _correct_validated_data(self, validated_data):
-        slack_field = validated_data.pop("slack", {})
-        if "channel_id" in slack_field:
-            validated_data["channel"] = slack_field["channel_id"]
+        if slack_field := validated_data.pop("slack", {}):
+            if "channel_id" in slack_field:
+                validated_data["slack_channel"] = slack_field["channel_id"]

-        if "user_group_id" in slack_field:
-            validated_data["user_group"] = SlackUserGroup.objects.filter(slack_id=slack_field["user_group_id"]).first()
+            if "user_group_id" in slack_field:
+                validated_data["user_group"] = slack_field["user_group_id"]

         return validated_data

-    def validate_slack(self, slack_field):
-        from apps.slack.models import SlackChannel
-
-        slack_channel_id = slack_field.get("channel_id")
-        user_group_id = slack_field.get("user_group_id")
-
-        organization = self.context["request"].auth.organization
-        slack_team_identity = organization.slack_team_identity
-
-        if (slack_channel_id or user_group_id) and not slack_team_identity:
-            raise BadRequest(detail="Slack isn't connected to this workspace")
-
-        if slack_channel_id is not None:
-            slack_channel_id = slack_channel_id.upper()
-            try:
-                slack_team_identity.get_cached_channels().get(slack_id=slack_channel_id)
-            except SlackChannel.DoesNotExist:
-                raise BadRequest(detail="Slack channel does not exist")
-
-        if user_group_id is not None:
-            user_group_id = user_group_id.upper()
-            try:
-                slack_team_identity.usergroups.get(slack_id=user_group_id)
-            except SlackUserGroup.DoesNotExist:
-                raise BadRequest(detail="Slack user group does not exist")
-
-        return slack_field
-
-    def to_representation(self, instance):
-        result = super().to_representation(instance)
-
-        user_group_id = instance.user_group.slack_id if instance.user_group is not None else None
-        result["slack"] = {
-            "channel_id": instance.channel or None,
-            "user_group_id": user_group_id,
-        }
-
-        return result
+    def to_representation(self, instance: "OnCallSchedule"):
+        return {
+            **super().to_representation(instance),
+            "slack": {
+                "channel_id": instance.slack_channel_slack_id,
+                "user_group_id": instance.user_group.slack_id if instance.user_group is not None else None,
+            },
+        }


 class FinalShiftQueryParamsSerializer(serializers.Serializer):
     start_date = serializers.DateTimeField(required=True, input_formats=["%Y-%m-%dT%H:%M", "%Y-%m-%d"])
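The new `to_representation` above builds its result in one expression by spreading the parent serializer's output and appending a nested `"slack"` block. A dependency-free sketch of that composition (plain classes standing in for the DRF serializers; field names are illustrative):

```python
class Base:
    """Stand-in for the parent serializer's to_representation."""

    def to_representation(self, instance):
        return {"id": instance["id"], "name": instance["name"]}


class ScheduleLike(Base):
    def to_representation(self, instance):
        # Spread the parent's dict, then add the nested "slack" block,
        # instead of mutating the parent's result in place.
        return {
            **super().to_representation(instance),
            "slack": {
                "channel_id": instance.get("slack_channel_slack_id"),
                "user_group_id": instance.get("user_group_slack_id"),
            },
        }


rep = ScheduleLike().to_representation(
    {"id": "S1", "name": "primary", "slack_channel_slack_id": "C9"}
)
```

The single-expression form makes it impossible to return before the `"slack"` key is attached, which the old mutate-then-return version could not guarantee.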
@@ -9,6 +9,8 @@ from common.api_helpers.custom_fields import TimeZoneField, UsersFilteredByOrgan
 from common.api_helpers.exceptions import BadRequest


+# TODO: update the following once we bump mypy to 1.11 (which supports generics)
+# class ScheduleCalendarSerializer(ScheduleBaseSerializer[OnCallScheduleCalendar]):
 class ScheduleCalendarSerializer(ScheduleBaseSerializer):
     time_zone = TimeZoneField(required=True)
     shifts = UsersFilteredByOrganizationField(

@@ -10,6 +10,8 @@ from common.api_helpers.custom_fields import TeamPrimaryKeyRelatedField
 from common.api_helpers.utils import validate_ical_url


+# TODO: update the following once we bump mypy to 1.11 (which supports generics)
+# class ScheduleICalSerializer(ScheduleBaseSerializer[OnCallScheduleICal]):
 class ScheduleICalSerializer(ScheduleBaseSerializer):
     class Meta:
         model = OnCallScheduleICal

@@ -8,6 +8,8 @@ from apps.schedules.tasks import (
 from common.api_helpers.custom_fields import TimeZoneField, UsersFilteredByOrganizationField


+# TODO: update the following once we bump mypy to 1.11 (which supports generics)
+# class ScheduleWebSerializer(ScheduleBaseSerializer[OnCallScheduleWeb]):
 class ScheduleWebSerializer(ScheduleBaseSerializer):
     time_zone = TimeZoneField(required=True)
     shifts = UsersFilteredByOrganizationField(
engine/apps/public_api/tests/test_rbac_permissions.py (new file, 98 additions)
@@ -0,0 +1,98 @@
+from unittest.mock import patch
+
+import pytest
+from django.urls import reverse
+from rest_framework import status
+from rest_framework.response import Response
+from rest_framework.test import APIClient
+
+from apps.api.permissions import GrafanaAPIPermission, LegacyAccessControlRole, get_most_authorized_role
+from apps.public_api.urls import router
+
+
+@pytest.mark.parametrize(
+    "rbac_enabled,role,give_perm",
+    [
+        # rbac disabled: we will check the role is enough based on get_most_authorized_role for the perm
+        (False, "admin", None),
+        (False, "editor", None),
+        (False, "viewer", None),
+        (False, None, None),
+        # rbac enabled: having role None, check the perm is required
+        (True, None, False),
+        (True, None, True),
+    ],
+)
+@pytest.mark.django_db
+def test_rbac_permissions(
+    make_organization_and_user_with_token,
+    rbac_enabled,
+    role,
+    give_perm,
+):
+    # APIView default actions
+    # (name, http method, detail-based)
+    default_actions = {
+        "create": ("post", False),
+        "list": ("get", False),
+        "retrieve": ("get", True),
+        "update": ("put", True),
+        "partial_update": ("patch", True),
+        "destroy": ("delete", True),
+    }
+
+    organization, user, token = make_organization_and_user_with_token()
+    if organization.is_rbac_permissions_enabled != rbac_enabled:
+        # skip if the organization's rbac_enabled is not the expected by the test
+        return
+
+    client = APIClient()
+    # check all actions for all public API viewsets
+    for _, viewset, _basename in router.registry:
+        if viewset.__name__ == "ActionView":
+            # old actions (webhooks) are deprecated, no RBAC support
+            continue
+        for viewset_method_name, required_perms in viewset.rbac_permissions.items():
+            # setup user's role and permissions
+            if rbac_enabled:
+                # set the user's role to None and assign the permission or not based on the flag
+                user.role = LegacyAccessControlRole.NONE
+                user.permissions = []
+                expected = status.HTTP_403_FORBIDDEN
+                if give_perm:
+                    # if permissions are given, expect a 200 response
+                    user.permissions = [GrafanaAPIPermission(action=perm.value) for perm in required_perms]
+                    expected = status.HTTP_200_OK
+                user.save()
+            else:
+                # set the user's role to the given role
+                user.role = LegacyAccessControlRole[role.upper()] if role else LegacyAccessControlRole.NONE
+                user.save()
+                # check what the minimum required role for the perms is
+                required_role = get_most_authorized_role(required_perms)
+                # set expected depending on the user's role
+                expected = status.HTTP_200_OK if user.role <= required_role else status.HTTP_403_FORBIDDEN
+
+            # iterate over all viewset actions, making an API request for each,
+            # using the user's token and confirming the response status code
+            if viewset_method_name in default_actions:
+                http_method, detail = default_actions[viewset_method_name]
+            else:
+                action_method = getattr(viewset, viewset_method_name)
+                http_method = list(action_method.mapping.keys())[0]
+                detail = action_method.detail
+
+            method_path = f"{viewset.__module__}.{viewset.__name__}.{viewset_method_name}"
+            success = Response(status=status.HTTP_200_OK)
+            kwargs = {"pk": "NONEXISTENT"} if detail else None
+            if viewset_method_name in default_actions and detail:
+                url = reverse(f"api-public:{_basename}-detail", kwargs=kwargs)
+            elif viewset_method_name in default_actions and not detail:
+                url = reverse(f"api-public:{_basename}-list", kwargs=kwargs)
+            else:
+                name = viewset_method_name.replace("_", "-")
+                url = reverse(f"api-public:{_basename}-{name}", kwargs=kwargs)
+
+            with patch(method_path, return_value=success):
+                response = client.generic(path=url, method=http_method, HTTP_AUTHORIZATION=token)
+                assert response.status_code == expected
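The test above compares the user's role to the least privileged role that still satisfies every required permission, where lower enum values mean more access. A minimal dependency-free sketch of that comparison (the enum and helper are hypothetical stand-ins for `LegacyAccessControlRole` and `get_most_authorized_role`, not the actual implementations):

```python
from enum import IntEnum


class Role(IntEnum):
    # Mirrors the ordering assumed by `user.role <= required_role`:
    # a lower value means a more privileged role.
    ADMIN = 0
    EDITOR = 1
    VIEWER = 2
    NONE = 3


def most_authorized_role(required_roles):
    """Hypothetical stand-in: with "lower = more privileged", the role
    that satisfies every requirement is the minimum of the list."""
    return min(required_roles, default=Role.ADMIN)


def expected_status(user_role: Role, required_role: Role) -> int:
    # Access is granted iff the user's role is at least as privileged.
    return 200 if user_role <= required_role else 403
```

Encoding privilege as an `IntEnum` keeps the authorization check a single integer comparison, which is why the test can derive the expected HTTP status without touching the permission system itself.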
@ -12,28 +12,36 @@ TEST_MESSAGING_BACKEND_ID = "TESTBACKENDID"
|
|||
|
||||
@pytest.fixture()
|
||||
def route_public_api_setup(
|
||||
make_organization_and_user_with_token,
|
||||
make_organization,
|
||||
make_user_for_organization,
|
||||
make_public_api_token,
|
||||
make_alert_receive_channel,
|
||||
make_escalation_chain,
|
||||
make_channel_filter,
|
||||
make_slack_team_identity,
|
||||
make_slack_channel,
|
||||
):
|
||||
organization, user, token = make_organization_and_user_with_token()
|
||||
slack_team_identity = make_slack_team_identity()
|
||||
slack_channel = make_slack_channel(slack_team_identity)
|
||||
|
||||
organization = make_organization(slack_team_identity=slack_team_identity)
|
||||
user = make_user_for_organization(organization)
|
||||
_, token = make_public_api_token(user, organization)
|
||||
|
||||
alert_receive_channel = make_alert_receive_channel(organization)
|
||||
escalation_chain = make_escalation_chain(organization)
|
||||
channel_filter = make_channel_filter(
|
||||
alert_receive_channel,
|
||||
is_default=True,
|
||||
slack_channel_id="TEST_SLACK_ID",
|
||||
slack_channel=slack_channel,
|
||||
escalation_chain=escalation_chain,
|
||||
)
|
||||
return organization, user, token, alert_receive_channel, escalation_chain, channel_filter
|
||||
return organization, token, alert_receive_channel, escalation_chain, channel_filter
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_get_route(
|
||||
route_public_api_setup,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
def test_get_route(route_public_api_setup):
|
||||
_, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
|
||||
|
|
@ -48,7 +56,7 @@ def test_get_route(
|
|||
"routing_regex": channel_filter.filtering_term,
|
||||
"position": channel_filter.order,
|
||||
"is_the_last_route": channel_filter.is_default,
|
||||
"slack": {"channel_id": channel_filter.slack_channel_id, "enabled": True},
|
||||
"slack": {"channel_id": channel_filter.slack_channel_slack_id, "enabled": True},
|
||||
"telegram": {"id": None, "enabled": False},
|
||||
TEST_MESSAGING_BACKEND_FIELD: {"id": None, "enabled": False},
|
||||
}
|
||||
|
|
@ -58,10 +66,8 @@ def test_get_route(
|
|||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_get_routes_list(
|
||||
route_public_api_setup,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
def test_get_routes_list(route_public_api_setup):
|
||||
_, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
|
||||
|
|
@ -81,7 +87,7 @@ def test_get_routes_list(
|
|||
"routing_regex": channel_filter.filtering_term,
|
||||
"position": channel_filter.order,
|
||||
"is_the_last_route": channel_filter.is_default,
|
||||
"slack": {"channel_id": channel_filter.slack_channel_id, "enabled": True},
|
||||
"slack": {"channel_id": channel_filter.slack_channel_slack_id, "enabled": True},
|
||||
"telegram": {"id": None, "enabled": False},
|
||||
TEST_MESSAGING_BACKEND_FIELD: {"id": None, "enabled": False},
|
||||
}
|
||||
|
|
@ -96,10 +102,8 @@ def test_get_routes_list(
|
|||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_get_routes_filter_by_integration_id(
|
||||
route_public_api_setup,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
def test_get_routes_filter_by_integration_id(route_public_api_setup):
|
||||
_, token, alert_receive_channel, escalation_chain, channel_filter = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
|
||||
|
|
@ -121,7 +125,7 @@ def test_get_routes_filter_by_integration_id(
|
|||
"routing_regex": channel_filter.filtering_term,
|
||||
"position": channel_filter.order,
|
||||
"is_the_last_route": channel_filter.is_default,
|
||||
"slack": {"channel_id": channel_filter.slack_channel_id, "enabled": True},
|
||||
"slack": {"channel_id": channel_filter.slack_channel_slack_id, "enabled": True},
|
||||
"telegram": {"id": None, "enabled": False},
|
||||
TEST_MESSAGING_BACKEND_FIELD: {"id": None, "enabled": False},
|
||||
}
|
||||
|
|
@ -136,10 +140,8 @@ def test_get_routes_filter_by_integration_id(
|
|||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_create_route(
|
||||
route_public_api_setup,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
def test_create_route(route_public_api_setup):
|
||||
_, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
|
||||
|
|
@ -170,7 +172,7 @@ def test_create_route(
|
|||
|
||||
@pytest.mark.django_db
|
||||
def test_create_route_without_escalation_chain(route_public_api_setup):
|
||||
_, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
_, token, alert_receive_channel, _, _ = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
url = reverse("api-public:routes-list")
|
||||
|
|
@ -203,7 +205,7 @@ def test_create_route_without_escalation_chain(route_public_api_setup):
|
|||
def test_invalid_route_data(
|
||||
route_public_api_setup,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
_, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
|
||||
client = APIClient()
|
||||
|
||||
|
|
@ -219,11 +221,8 @@ def test_invalid_route_data(
|
|||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_update_route(
|
||||
route_public_api_setup,
|
||||
make_channel_filter,
|
||||
):
|
||||
_, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
def test_update_route(route_public_api_setup, make_channel_filter):
|
||||
_, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
|
||||
new_channel_filter = make_channel_filter(
|
||||
alert_receive_channel,
|
||||
is_default=False,
|
||||
|
|
@ -251,7 +250,7 @@ def test_update_route(
|
|||
"routing_regex": data_to_update["routing_regex"],
|
||||
"position": new_channel_filter.order,
|
||||
"is_the_last_route": new_channel_filter.is_default,
|
||||
"slack": {"channel_id": new_channel_filter.slack_channel_id, "enabled": True},
|
||||
"slack": {"channel_id": new_channel_filter.slack_channel_slack_id, "enabled": True},
|
||||
"telegram": {"id": None, "enabled": False},
|
||||
TEST_MESSAGING_BACKEND_FIELD: {"id": None, "enabled": False},
|
||||
}
|
||||
|
|
@ -261,11 +260,8 @@ def test_update_route(
|
|||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_delete_route(
|
||||
route_public_api_setup,
|
||||
make_channel_filter,
|
||||
):
|
||||
_, _, token, alert_receive_channel, _, _ = route_public_api_setup
|
||||
def test_delete_route(route_public_api_setup, make_channel_filter):
|
||||
_, token, alert_receive_channel, _, _ = route_public_api_setup
|
||||
new_channel_filter = make_channel_filter(
|
||||
alert_receive_channel,
|
||||
is_default=False,
|
||||
|
|
@@ -285,47 +281,110 @@ def test_delete_route(
 @pytest.mark.django_db
 def test_create_route_slack_error(
     route_public_api_setup,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_alert_receive_channel,
 ):
-    _, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
+    _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
     client = APIClient()

     url = reverse("api-public:routes-list")
-    data_for_create = {
-        "integration_id": alert_receive_channel.public_primary_key,
-        "routing_regex": "testreg",
-        "escalation_chain_id": escalation_chain.public_primary_key,
-        "slack": {"channel_id": "TEST_SLACK_ID"},
-    }
-    response = client.post(url, format="json", HTTP_AUTHORIZATION=token, data=data_for_create)
+
+    # Channel does not exist
+    response = client.post(
+        url,
+        format="json",
+        HTTP_AUTHORIZATION=token,
+        data={
+            "integration_id": alert_receive_channel.public_primary_key,
+            "routing_regex": "testreg",
+            "escalation_chain_id": escalation_chain.public_primary_key,
+            "slack": {"channel_id": "TEST_SLACK_ID"},
+        },
+    )

     assert response.status_code == status.HTTP_400_BAD_REQUEST
-    assert response.data["detail"] == "Slack isn't connected to this workspace"
+    assert response.json()["slack"]["channel_id"][0] == "Slack channel does not exist"
+
+    # org doesn't have slack workspace connected
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization()
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+    alert_receive_channel = make_alert_receive_channel(organization)
+
+    assert organization.slack_team_identity is None
+
+    response = client.post(
+        url,
+        format="json",
+        HTTP_AUTHORIZATION=token,
+        data={
+            "integration_id": alert_receive_channel.public_primary_key,
+            "slack": {"channel_id": slack_channel.slack_id},
+        },
+    )
+
+    assert response.status_code == status.HTTP_400_BAD_REQUEST
+    assert response.json()["detail"] == "Slack isn't connected to this workspace"


 @pytest.mark.django_db
 def test_update_route_slack_error(
     route_public_api_setup,
     make_channel_filter,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_alert_receive_channel,
 ):
-    _, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
-    new_channel_filter = make_channel_filter(
-        alert_receive_channel,
-        is_default=False,
-        filtering_term="testreg",
-    )
+    _, token, alert_receive_channel, _, _ = route_public_api_setup
+    new_channel_filter = make_channel_filter(alert_receive_channel, is_default=False, filtering_term="testreg")

     client = APIClient()

-    url = reverse("api-public:routes-detail", kwargs={"pk": new_channel_filter.public_primary_key})
-    data_to_update = {
-        "slack": {"channel_id": "TEST_SLACK_ID"},
-    }
-
-    response = client.put(url, format="json", HTTP_AUTHORIZATION=token, data=data_to_update)
+    # Channel does not exist
+    url = reverse("api-public:routes-detail", kwargs={"pk": new_channel_filter.public_primary_key})
+    response = client.put(
+        url,
+        format="json",
+        HTTP_AUTHORIZATION=token,
+        data={
+            "slack": {"channel_id": "TEST_SLACK_ID"},
+        },
+    )

     assert response.status_code == status.HTTP_400_BAD_REQUEST
-    assert response.data["detail"] == "Slack isn't connected to this workspace"
+    assert response.json()["slack"]["channel_id"][0] == "Slack channel does not exist"
+
+    # org doesn't have slack workspace connected
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization()
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+    alert_receive_channel = make_alert_receive_channel(organization)
+    new_channel_filter = make_channel_filter(alert_receive_channel, is_default=False, filtering_term="testreg")
+
+    assert organization.slack_team_identity is None
+
+    url = reverse("api-public:routes-detail", kwargs={"pk": new_channel_filter.public_primary_key})
+    response = client.put(
+        url,
+        format="json",
+        HTTP_AUTHORIZATION=token,
+        data={
+            "slack": {"channel_id": slack_channel.slack_id},
+        },
+    )
+
+    assert response.status_code == status.HTTP_400_BAD_REQUEST
+    assert response.json()["detail"] == "Slack isn't connected to this workspace"


 @pytest.mark.django_db
@@ -334,7 +393,7 @@ def test_create_route_with_messaging_backend(
     make_slack_team_identity,
     make_slack_channel,
 ):
-    organization, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
+    organization, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
     slack_team_identity = make_slack_team_identity()
     organization.slack_team_identity = slack_team_identity
     organization.save(update_fields=["slack_team_identity"])
@@ -381,7 +440,7 @@ def test_update_route_with_messaging_backend(
     make_slack_team_identity,
     make_slack_channel,
 ):
-    organization, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
+    organization, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
     slack_team_identity = make_slack_team_identity()
     organization.slack_team_identity = slack_team_identity
     organization.save(update_fields=["slack_team_identity"])
@@ -407,7 +466,7 @@ def test_update_route_with_messaging_backend(
     }

     # check if route data is different
-    assert new_channel_filter.slack_channel_id != slack_channel.slack_id
+    assert new_channel_filter.slack_channel_slack_id != slack_channel.slack_id
     assert new_channel_filter.notify_in_slack != data_to_update["slack"]["enabled"]
     assert new_channel_filter.notify_in_telegram != data_to_update["telegram"]["enabled"]
     assert new_channel_filter.notification_backends is None
@@ -433,7 +492,7 @@ def test_update_route_with_messaging_backend(
     new_channel_filter.refresh_from_db()

     # check if route data is different was changed correctly
-    assert new_channel_filter.slack_channel_id == slack_channel.slack_id
+    assert new_channel_filter.slack_channel_slack_id == slack_channel.slack_id
     assert new_channel_filter.notify_in_slack == data_to_update["slack"]["enabled"]
     assert new_channel_filter.notify_in_telegram == data_to_update["telegram"]["enabled"]
     assert new_channel_filter.notification_backends == {
@@ -467,7 +526,7 @@ def test_update_route_with_messaging_backend(
     new_channel_filter.refresh_from_db()

     # check if route data is different was changed correctly
-    assert new_channel_filter.slack_channel_id == data_to_update["slack"]["channel_id"]
+    assert new_channel_filter.slack_channel_slack_id == data_to_update["slack"]["channel_id"]
     assert new_channel_filter.notify_in_slack == data_to_update["slack"]["enabled"]
     assert new_channel_filter.notify_in_telegram == data_to_update["telegram"]["enabled"]
     assert new_channel_filter.notification_backends == {TestOnlyBackend.backend_id: {"channel": None, "enabled": True}}
@@ -504,11 +563,8 @@ def test_update_route_with_manual_ordering(


 @pytest.mark.django_db
-def test_routes_long_filtering_term(
-    route_public_api_setup,
-    make_channel_filter,
-):
-    organization, _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
+def test_routes_long_filtering_term(route_public_api_setup, make_channel_filter):
+    _, token, alert_receive_channel, escalation_chain, _ = route_public_api_setup
     client = APIClient()
     long_filtering_term = "a" * (ChannelFilter.FILTERING_TERM_MAX_LENGTH + 1)

@@ -63,25 +63,29 @@ def assert_expected_shifts_export_response(response, users, expected_on_call_tim

 @pytest.mark.django_db
 def test_get_calendar_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)

-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleCalendar,
-        channel=slack_channel_id,
-    )
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleCalendar, slack_channel=slack_channel)
+
+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})
-    response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.get(url, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": schedule.name,
@@ -90,16 +94,13 @@ def test_get_calendar_schedule(
         "on_call_now": [],
         "shifts": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
         "ical_url_overrides": None,
         "enable_web_overrides": False,
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    assert response.json() == result


 @pytest.mark.django_db
 def test_create_calendar_schedule(make_organization_and_user_with_token):
@@ -115,7 +116,7 @@ def test_create_calendar_schedule(make_organization_and_user_with_token):
         "type": "calendar",
     }

-    response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     schedule = OnCallSchedule.objects.get(public_primary_key=response.data["id"])

     result = {
@@ -166,7 +167,7 @@ def test_create_calendar_schedule_with_shifts(make_organization_and_user_with_to
         "shifts": [on_call_shift.public_primary_key],
     }

-    response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     schedule = OnCallSchedule.objects.get(public_primary_key=response.data["id"])

     result = {
@@ -191,20 +192,24 @@ def test_create_calendar_schedule_with_shifts(make_organization_and_user_with_to

 @pytest.mark.django_db
 def test_update_calendar_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)

-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleCalendar,
-        channel=slack_channel_id,
-    )
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleCalendar, slack_channel=slack_channel)
+
+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

     data = {
@@ -215,9 +220,14 @@ def test_update_calendar_schedule(
     assert schedule.name != data["name"]
     assert schedule.time_zone != data["time_zone"]

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    schedule.refresh_from_db()
+    assert schedule.name == data["name"]
+    assert schedule.time_zone == data["time_zone"]
+
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": data["name"],
@@ -226,19 +236,13 @@ def test_update_calendar_schedule(
         "on_call_now": [],
         "shifts": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
         "ical_url_overrides": None,
         "enable_web_overrides": False,
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    schedule.refresh_from_db()
-    assert schedule.name == data["name"]
-    assert schedule.time_zone == data["time_zone"]
-    assert response.json() == result


 @pytest.mark.django_db
 def test_update_calendar_schedule_enable_web_overrides(
@@ -258,7 +262,7 @@ def test_update_calendar_schedule_enable_web_overrides(
     data = {
         "enable_web_overrides": True,
     }
-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)

     result = {
         "id": schedule.public_primary_key,
@@ -281,25 +285,29 @@ def test_update_calendar_schedule_enable_web_overrides(

 @pytest.mark.django_db
 def test_get_web_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)

-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleWeb,
-        channel=slack_channel_id,
-    )
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb, slack_channel=slack_channel)
+
+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})
-    response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.get(url, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": schedule.name,
@@ -308,14 +316,11 @@ def test_get_web_schedule(
         "on_call_now": [],
         "shifts": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    assert response.json() == result


 @pytest.mark.django_db
 def test_create_schedules_same_name(make_organization_and_user_with_token):
@@ -332,7 +337,7 @@ def test_create_schedules_same_name(make_organization_and_user_with_token):
     }

     for _ in range(2):
-        response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+        response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
         assert response.status_code == status.HTTP_201_CREATED

     schedules = OnCallSchedule.objects.filter(name="same-name", organization=organization)
@@ -344,17 +349,10 @@ def test_update_web_schedule(
     make_organization_and_user_with_token,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
+    organization, _, token = make_organization_and_user_with_token()
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb)

     client = APIClient()

-    slack_channel_id = "SLACKCHANNELID"
-
-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleWeb,
-        channel=slack_channel_id,
-    )
-
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

     data = {
@@ -365,35 +363,38 @@ def test_update_web_schedule(
     assert schedule.name != data["name"]
     assert schedule.time_zone != data["time_zone"]

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     assert response.status_code == status.HTTP_400_BAD_REQUEST
     assert response.json() == {"detail": "Web schedule update is not enabled through API"}


 @pytest.mark.django_db
 def test_update_ical_url_overrides_calendar_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)

-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleCalendar,
-        channel=slack_channel_id,
-    )
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleCalendar, slack_channel=slack_channel)
+
+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

-    data = {"ical_url_overrides": ICAL_URL}
-
     with patch("common.api_helpers.utils.validate_ical_url", return_value=ICAL_URL):
-        response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+        response = client.put(url, data={"ical_url_overrides": ICAL_URL}, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": schedule.name,
@@ -402,33 +403,36 @@ def test_update_ical_url_overrides_calendar_schedule(
         "on_call_now": [],
         "shifts": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
         "ical_url_overrides": ICAL_URL,
         "enable_web_overrides": False,
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    assert response.json() == result


 @pytest.mark.django_db
 def test_update_calendar_schedule_with_custom_event(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
-    slack_channel_id = "SLACKCHANNELID"
-
-    schedule = make_schedule(
-        organization,
-        schedule_class=OnCallScheduleCalendar,
-        channel=slack_channel_id,
-    )
+    slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)
+
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)
+
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleCalendar, slack_channel=slack_channel)
+
+    client = APIClient()
     start_date = timezone.now().replace(microsecond=0)
     data = {
         "start": start_date,
@@ -447,9 +451,13 @@ def test_update_calendar_schedule_with_custom_event(

     assert len(schedule.custom_on_call_shifts.all()) == 0

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    schedule.refresh_from_db()
+    assert len(schedule.custom_on_call_shifts.all()) == 1
+
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": schedule.name,
@@ -458,18 +466,13 @@ def test_update_calendar_schedule_with_custom_event(
         "on_call_now": [],
         "shifts": data["shifts"],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
         "ical_url_overrides": None,
         "enable_web_overrides": False,
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    schedule.refresh_from_db()
-    assert len(schedule.custom_on_call_shifts.all()) == 1
-    assert response.json() == result


 @pytest.mark.django_db
 def test_update_calendar_schedule_invalid_override(
@@ -498,7 +501,7 @@ def test_update_calendar_schedule_invalid_override(
         "shifts": [on_call_shift.public_primary_key],
     }

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     assert response.status_code == status.HTTP_400_BAD_REQUEST
     assert response.json() == {"detail": "Shifts of type override are not supported in this schedule"}

@@ -521,7 +524,7 @@ def test_update_schedule_invalid_timezone(make_organization_and_user_with_token,

     data = {"time_zone": "asdfasdf"}

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     assert response.status_code == status.HTTP_400_BAD_REQUEST
     assert response.json() == {"time_zone": ["Invalid timezone"]}

@@ -553,7 +556,7 @@ def test_update_web_schedule_with_override(
         "shifts": [on_call_shift.public_primary_key],
     }

-    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     assert response.status_code == status.HTTP_400_BAD_REQUEST
     assert response.json() == {"detail": "Web schedule update is not enabled through API"}

@@ -570,7 +573,7 @@ def test_delete_calendar_schedule(

     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

-    response = client.delete(url, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.delete(url, format="json", HTTP_AUTHORIZATION=token)

     assert response.status_code == status.HTTP_204_NO_CONTENT

@@ -580,26 +583,35 @@ def test_delete_calendar_schedule(

 @pytest.mark.django_db
 def test_get_ical_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)
+
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleICal,
-        channel=slack_channel_id,
+        slack_channel=slack_channel,
         ical_url_primary=ICAL_URL,
     )

+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

-    response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.get(url, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": schedule.name,
@@ -608,18 +620,15 @@ def test_get_ical_schedule(
         "ical_url_overrides": None,
         "on_call_now": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
     }
-
-    assert response.status_code == status.HTTP_200_OK
-    assert response.json() == result


 @pytest.mark.django_db
 def test_create_ical_schedule(make_organization_and_user_with_token):
-    organization, user, token = make_organization_and_user_with_token()
+    _, _, token = make_organization_and_user_with_token()
     client = APIClient()

     url = reverse("api-public:schedules-list")
@@ -634,7 +643,7 @@ def test_create_ical_schedule(make_organization_and_user_with_token):
         "apps.public_api.serializers.schedules_ical.ScheduleICalSerializer.validate_ical_url_primary",
         return_value=ICAL_URL,
     ), patch("apps.schedules.tasks.refresh_ical_final_schedule.apply_async") as mock_refresh_final:
-        response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+        response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
     schedule = OnCallSchedule.objects.get(public_primary_key=response.data["id"])

     result = {
@@ -658,21 +667,29 @@ def test_create_ical_schedule(make_organization_and_user_with_token):

 @pytest.mark.django_db
 def test_update_ical_schedule(
-    make_organization_and_user_with_token,
+    make_organization,
+    make_user_for_organization,
+    make_public_api_token,
+    make_slack_team_identity,
+    make_slack_channel,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
-    client = APIClient()
-
     slack_channel_id = "SLACKCHANNELID"
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)
+
+    organization = make_organization(slack_team_identity=slack_team_identity)
+    user = make_user_for_organization(organization)
+    _, token = make_public_api_token(user, organization)

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleICal,
-        channel=slack_channel_id,
+        slack_channel=slack_channel,
         ical_url_primary=ICAL_URL,
     )

+    client = APIClient()
     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

     data = {
@@ -682,9 +699,10 @@ def test_update_ical_schedule(
     assert schedule.name != data["name"]

     with patch("apps.schedules.tasks.refresh_ical_final_schedule.apply_async") as mock_refresh_final:
-        response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
+        response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)

-    result = {
+    assert response.status_code == status.HTTP_200_OK
+    assert response.json() == {
         "id": schedule.public_primary_key,
         "team_id": None,
         "name": data["name"],
@@ -693,15 +711,13 @@ def test_update_ical_schedule(
         "ical_url_overrides": None,
         "on_call_now": [],
         "slack": {
-            "channel_id": "SLACKCHANNELID",
+            "channel_id": slack_channel_id,
             "user_group_id": None,
         },
     }
-
-    assert response.status_code == status.HTTP_200_OK
+
     schedule.refresh_from_db()
     assert schedule.name == data["name"]
-    assert response.json() == result
     assert not mock_refresh_final.called


@@ -710,7 +726,7 @@ def test_delete_ical_schedule(
     make_organization_and_user_with_token,
     make_schedule,
 ):
-    organization, user, token = make_organization_and_user_with_token()
+    organization, _, token = make_organization_and_user_with_token()
     client = APIClient()

     schedule = make_schedule(
@@ -721,7 +737,7 @@ def test_delete_ical_schedule(

     url = reverse("api-public:schedules-detail", kwargs={"pk": schedule.public_primary_key})

-    response = client.delete(url, format="json", HTTP_AUTHORIZATION=f"{token}")
+    response = client.delete(url, format="json", HTTP_AUTHORIZATION=token)

     assert response.status_code == status.HTTP_204_NO_CONTENT

@@ -736,26 +752,31 @@ def test_get_schedule_list(
     make_user_for_organization,
     make_public_api_token,
     make_slack_user_group,
+    make_slack_channel,
     make_schedule,
 ):
+    slack_channel_id = "SLACKCHANNELID"
+    user_group_id = "SLACKGROUPID"
+
     slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity, slack_id=slack_channel_id)
     organization = make_organization(slack_team_identity=slack_team_identity)
     user = make_user_for_organization(organization=organization)
     _, token = make_public_api_token(user, organization)

-    slack_channel_id = "SLACKCHANNELID"
-    user_group_id = "SLACKGROUPID"
-
     user_group = make_slack_user_group(slack_team_identity, slack_id=user_group_id)

     schedule_calendar = make_schedule(
-        organization, schedule_class=OnCallScheduleCalendar, channel=slack_channel_id, user_group=user_group
+        organization,
+        schedule_class=OnCallScheduleCalendar,
+        slack_channel=slack_channel,
+        user_group=user_group,
     )

     schedule_ical = make_schedule(
         organization,
         schedule_class=OnCallScheduleICal,
-        channel=slack_channel_id,
+        slack_channel=slack_channel,
         ical_url_primary=ICAL_URL,
         user_group=user_group,
     )
@ -763,9 +784,10 @@ def test_get_schedule_list(
|
|||
client = APIClient()
|
||||
url = reverse("api-public:schedules-list")
|
||||
|
||||
response = client.get(url, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.get(url, format="json", HTTP_AUTHORIZATION=token)
|
||||
|
||||
result = {
|
||||
assert response.status_code == status.HTTP_200_OK
|
||||
assert response.json() == {
|
||||
"count": 2,
|
||||
"next": None,
|
||||
"previous": None,
|
||||
|
|
@ -798,9 +820,6 @@ def test_get_schedule_list(
|
|||
"total_pages": 1,
|
||||
}
|
||||
|
||||
assert response.status_code == status.HTTP_200_OK
|
||||
assert response.json() == result
|
||||
|
||||
|
||||
@pytest.mark.django_db
|
||||
def test_create_schedule_wrong_type(make_organization_and_user_with_token):
|
||||
|
|
@ -819,7 +838,7 @@ def test_create_schedule_wrong_type(make_organization_and_user_with_token):
|
|||
"apps.public_api.serializers.schedules_ical.ScheduleICalSerializer.validate_ical_url_primary",
|
||||
return_value=ICAL_URL,
|
||||
):
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
|
||||
|
|
@ -839,7 +858,7 @@ def test_create_schedule_invalid_timezone(make_organization_and_user_with_token,
|
|||
"type": schedule_type,
|
||||
}
|
||||
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
assert response.json() == {"time_zone": ["Invalid timezone"]}
|
||||
|
||||
|
|
@ -861,7 +880,7 @@ def test_create_calendar_schedule_slack_error(make_organization_and_user_with_to
|
|||
},
|
||||
}
|
||||
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
assert response.data["detail"] == "Slack isn't connected to this workspace"
|
||||
# with slack user group id
|
||||
|
|
@ -875,7 +894,7 @@ def test_create_calendar_schedule_slack_error(make_organization_and_user_with_to
|
|||
},
|
||||
}
|
||||
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
assert response.data["detail"] == "Slack isn't connected to this workspace"
|
||||
|
||||
|
|
@ -892,13 +911,13 @@ def test_update_calendar_schedule_slack_error(
|
|||
|
||||
data = {"slack": {"channel_id": "TEST_SLACK_ID"}}
|
||||
|
||||
response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
assert response.data["detail"] == "Slack isn't connected to this workspace"
|
||||
|
||||
data = {"slack": {"user_group_id": "TEST_SLACK_ID"}}
|
||||
|
||||
response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.put(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
assert response.data["detail"] == "Slack isn't connected to this workspace"
|
||||
|
||||
|
|
@ -914,7 +933,7 @@ def test_create_ical_schedule_without_ical_url(make_organization_and_user_with_t
|
|||
"name": "schedule test name",
|
||||
"type": "ical",
|
||||
}
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
|
||||
data = {
|
||||
|
|
@ -923,7 +942,7 @@ def test_create_ical_schedule_without_ical_url(make_organization_and_user_with_t
|
|||
"ical_url_primary": None,
|
||||
"type": "ical",
|
||||
}
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
|
||||
response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=token)
|
||||
assert response.status_code == status.HTTP_400_BAD_REQUEST
|
||||
|
||||
|
||||
|
|
|
|||
|
|
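A quick aside on the repeated `HTTP_AUTHORIZATION=f"{token}"` → `HTTP_AUTHORIZATION=token` change in the tests above: when the token is already a `str`, the f-string wrapper is a no-op, so dropping it is purely cosmetic. A minimal check in plain Python (the token value here is a made-up placeholder, not a real OnCall token):

```python
# For any str, wrapping it in an f-string yields an equal string, so
# HTTP_AUTHORIZATION=f"{token}" and HTTP_AUTHORIZATION=token are equivalent.
token = "gloc_example_token"  # hypothetical token value for illustration
assert f"{token}" == token

# The f-string only matters for non-str objects, where it calls str():
class TokenObj:
    def __str__(self):
        return "gloc_example_token"

assert f"{TokenObj()}" == "gloc_example_token"
```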
@@ -11,6 +11,7 @@ from apps.alerts.constants import ActionSource
 from apps.alerts.models import AlertGroup, AlertReceiveChannel
 from apps.alerts.tasks import delete_alert_group, wipe
 from apps.api.label_filtering import parse_label_query
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.constants import VALID_DATE_FOR_DELETE_INCIDENT
 from apps.public_api.helpers import is_valid_group_creation_date, team_has_slack_token_for_deleting

@@ -57,7 +58,19 @@ class AlertGroupView(
     GenericViewSet,
 ):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.ALERT_GROUPS_READ],
+        "retrieve": [RBACPermission.Permissions.ALERT_GROUPS_READ],
+        "destroy": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "acknowledge": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "unacknowledge": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "resolve": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "unresolve": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "silence": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+        "unsilence": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],
+    }

     throttle_classes = [UserThrottle]

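The pattern repeated across the view files in this commit — replacing a bare `IsAuthenticated` with `RBACPermission` plus an action-to-permission map — can be sketched as a DRF-style permission class. This is a simplified illustration under assumed names (a `user_permissions` attribute on the request, an `action` attribute on the view); Grafana OnCall's real `RBACPermission` lives in `apps.api.permissions` and is more involved:

```python
class RBACPermission:
    """Sketch: allow a request only if the user holds every permission
    mapped to the current view action in view.rbac_permissions."""

    def has_permission(self, request, view) -> bool:
        # Map the view action (e.g. "list", "destroy") to required permissions.
        required = view.rbac_permissions.get(view.action, [])
        granted = set(getattr(request, "user_permissions", []))  # assumed attribute
        return all(permission in granted for permission in required)


# Minimal stand-ins for a DRF request/view pair, for illustration only:
class FakeRequest:
    def __init__(self, perms):
        self.user_permissions = perms


class FakeView:
    action = "destroy"
    rbac_permissions = {
        "destroy": ["alert_groups:write"],
        "list": ["alert_groups:read"],
    }


assert RBACPermission().has_permission(FakeRequest(["alert_groups:write"]), FakeView())
assert not RBACPermission().has_permission(FakeRequest(["alert_groups:read"]), FakeView())
```

The dict-per-action layout keeps the access rules visible at the top of each viewset instead of buried in per-method checks.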
@@ -6,6 +6,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import GenericViewSet

 from apps.alerts.models import Alert
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers.alerts import AlertSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -19,7 +20,11 @@ class AlertFilter(filters.FilterSet):

 class AlertView(RateLimitHeadersMixin, mixins.ListModelMixin, GenericViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.ALERT_GROUPS_READ],
+    }

     throttle_classes = [UserThrottle]

@@ -4,6 +4,7 @@ from rest_framework.response import Response
 from rest_framework.views import APIView

 from apps.alerts.paging import DirectPagingAlertGroupResolvedError, DirectPagingUserTeamValidationError, direct_paging
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import AlertGroupSerializer, EscalationSerializer
 from apps.public_api.throttlers import UserThrottle

@@ -16,7 +17,11 @@ class EscalationView(APIView):
     """

     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "post": [RBACPermission.Permissions.ALERT_GROUPS_DIRECT_PAGING],
+    }

     throttle_classes = [UserThrottle]

@@ -4,6 +4,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import ModelViewSet

 from apps.alerts.models import EscalationChain
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import EscalationChainSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -15,7 +16,16 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class EscalationChainView(RateLimitHeadersMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.ESCALATION_CHAINS_READ],
+        "retrieve": [RBACPermission.Permissions.ESCALATION_CHAINS_READ],
+        "create": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "update": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "partial_update": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "destroy": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -4,6 +4,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import ModelViewSet

 from apps.alerts.models import EscalationPolicy
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import EscalationPolicySerializer, EscalationPolicyUpdateSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -14,7 +15,16 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class EscalationPolicyView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.ESCALATION_CHAINS_READ],
+        "retrieve": [RBACPermission.Permissions.ESCALATION_CHAINS_READ],
+        "create": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "update": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "partial_update": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+        "destroy": [RBACPermission.Permissions.ESCALATION_CHAINS_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -2,13 +2,18 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.response import Response
 from rest_framework.views import APIView

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.throttlers import InfoThrottler


 class InfoView(APIView):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "get": [RBACPermission.Permissions.OTHER_SETTINGS_READ],
+    }

     throttle_classes = [InfoThrottler]

@@ -4,6 +4,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import ModelViewSet

 from apps.alerts.models import AlertReceiveChannel
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import IntegrationSerializer, IntegrationUpdateSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -24,7 +25,18 @@ class IntegrationView(
     ModelViewSet,
 ):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.INTEGRATIONS_READ],
+        "retrieve": [RBACPermission.Permissions.INTEGRATIONS_READ],
+        "create": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "update": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "partial_update": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "destroy": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "maintenance_start": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "maintenance_stop": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -4,6 +4,7 @@ from rest_framework.exceptions import NotFound
 from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import ModelViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import CustomOnCallShiftSerializer, CustomOnCallShiftUpdateSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -16,7 +17,16 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class CustomOnCallShiftView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.SCHEDULES_READ],
+        "retrieve": [RBACPermission.Permissions.SCHEDULES_READ],
+        "create": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "partial_update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "destroy": [RBACPermission.Permissions.SCHEDULES_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -2,6 +2,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.settings import api_settings
 from rest_framework.viewsets import ReadOnlyModelViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import OrganizationSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -15,7 +16,11 @@ class OrganizationView(
     ReadOnlyModelViewSet,
 ):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "retrieve": [RBACPermission.Permissions.OTHER_SETTINGS_READ],
+    }

     throttle_classes = [UserThrottle]

@@ -4,6 +4,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.response import Response
 from rest_framework.viewsets import ModelViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.base.models import UserNotificationPolicy
 from apps.public_api.serializers import PersonalNotificationRuleSerializer, PersonalNotificationRuleUpdateSerializer

@@ -17,7 +18,16 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class PersonalNotificationView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.USER_SETTINGS_READ],
+        "retrieve": [RBACPermission.Permissions.USER_SETTINGS_READ],
+        "create": [RBACPermission.Permissions.USER_SETTINGS_WRITE],
+        "update": [RBACPermission.Permissions.USER_SETTINGS_WRITE],
+        "partial_update": [RBACPermission.Permissions.USER_SETTINGS_WRITE],
+        "destroy": [RBACPermission.Permissions.USER_SETTINGS_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -18,7 +18,6 @@ class ResolutionNoteView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelView
     permission_classes = (IsAuthenticated, RBACPermission)

     rbac_permissions = {
-        "metadata": [RBACPermission.Permissions.ALERT_GROUPS_READ],
         "list": [RBACPermission.Permissions.ALERT_GROUPS_READ],
         "retrieve": [RBACPermission.Permissions.ALERT_GROUPS_READ],
         "create": [RBACPermission.Permissions.ALERT_GROUPS_WRITE],

@@ -6,6 +6,7 @@ from rest_framework.response import Response
 from rest_framework.viewsets import ModelViewSet

 from apps.alerts.models import ChannelFilter
+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers import ChannelFilterSerializer, ChannelFilterUpdateSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -17,7 +18,16 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class ChannelFilterView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.INTEGRATIONS_READ],
+        "retrieve": [RBACPermission.Permissions.INTEGRATIONS_READ],
+        "create": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "update": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "partial_update": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+        "destroy": [RBACPermission.Permissions.INTEGRATIONS_WRITE],
+    }

     throttle_classes = [UserThrottle]

@@ -8,6 +8,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.views import Response
 from rest_framework.viewsets import ModelViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication, ScheduleExportAuthentication
 from apps.public_api.custom_renderers import CalendarRenderer
 from apps.public_api.serializers import PolymorphicScheduleSerializer, PolymorphicScheduleUpdateSerializer

@@ -28,7 +29,17 @@ logger = logging.getLogger(__name__)

 class OnCallScheduleChannelView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.SCHEDULES_READ],
+        "retrieve": [RBACPermission.Permissions.SCHEDULES_READ],
+        "create": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "partial_update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "destroy": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "final_shifts": [RBACPermission.Permissions.SCHEDULES_READ],
+    }

     throttle_classes = [UserThrottle]

@@ -8,7 +8,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.response import Response
 from rest_framework.serializers import BaseSerializer

-from apps.api.permissions import AuthenticatedRequest
+from apps.api.permissions import AuthenticatedRequest, RBACPermission
 from apps.api.views.shift_swap import BaseShiftSwapViewSet
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -24,7 +24,17 @@ logger = logging.getLogger(__name__)
 class ShiftSwapViewSet(RateLimitHeadersMixin, BaseShiftSwapViewSet):
     # set authentication and permission classes
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.SCHEDULES_READ],
+        "retrieve": [RBACPermission.Permissions.SCHEDULES_READ],
+        "create": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "partial_update": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "destroy": [RBACPermission.Permissions.SCHEDULES_WRITE],
+        "take": [RBACPermission.Permissions.SCHEDULES_WRITE],
+    }

     # public API customizations
     throttle_classes = [UserThrottle]

@@ -2,6 +2,7 @@ from rest_framework import mixins
 from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import GenericViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers.slack_channel import SlackChannelSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -12,7 +13,12 @@ from common.api_helpers.paginators import FiftyPageSizePaginator

 class SlackChannelView(RateLimitHeadersMixin, mixins.ListModelMixin, GenericViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.CHATOPS_READ],
+    }

     pagination_class = FiftyPageSizePaginator
     throttle_classes = [UserThrottle]

@@ -2,6 +2,7 @@ from rest_framework import viewsets
 from rest_framework.mixins import ListModelMixin, RetrieveModelMixin
 from rest_framework.permissions import IsAuthenticated

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers.teams import TeamSerializer
 from apps.public_api.tf_sync import is_request_from_terraform, sync_teams_on_tf_request

@@ -14,7 +15,12 @@ from common.api_helpers.paginators import FiftyPageSizePaginator
 class TeamView(PublicPrimaryKeyMixin, RetrieveModelMixin, ListModelMixin, viewsets.GenericViewSet):
     serializer_class = TeamSerializer
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.USER_SETTINGS_READ],
+        "retrieve": [RBACPermission.Permissions.USER_SETTINGS_READ],
+    }

     model = Team
     pagination_class = FiftyPageSizePaginator

@@ -2,6 +2,7 @@ from rest_framework import mixins
 from rest_framework.permissions import IsAuthenticated
 from rest_framework.viewsets import GenericViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers.user_groups import UserGroupSerializer
 from apps.public_api.throttlers.user_throttle import UserThrottle

@@ -12,7 +13,12 @@ from common.api_helpers.paginators import FiftyPageSizePaginator

 class UserGroupView(RateLimitHeadersMixin, mixins.ListModelMixin, GenericViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.CHATOPS_READ],
+    }

     pagination_class = FiftyPageSizePaginator
     throttle_classes = [UserThrottle]

@@ -5,7 +5,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.views import Response
 from rest_framework.viewsets import ReadOnlyModelViewSet

-from apps.api.permissions import LegacyAccessControlRole
+from apps.api.permissions import LegacyAccessControlRole, RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication, UserScheduleExportAuthentication
 from apps.public_api.custom_renderers import CalendarRenderer
 from apps.public_api.serializers import FastUserSerializer, UserSerializer

@@ -36,7 +36,12 @@ class UserFilter(filters.FilterSet):

 class UserView(RateLimitHeadersMixin, ShortSerializerMixin, ReadOnlyModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.USER_SETTINGS_READ],
+        "retrieve": [RBACPermission.Permissions.USER_SETTINGS_READ],
+    }

     model = User
     pagination_class = HundredPageSizePaginator

@@ -5,6 +5,7 @@ from rest_framework.permissions import IsAuthenticated
 from rest_framework.response import Response
 from rest_framework.viewsets import ModelViewSet

+from apps.api.permissions import RBACPermission
 from apps.auth_token.auth import ApiTokenAuthentication
 from apps.public_api.serializers.webhooks import (
     WebhookCreateSerializer,

@@ -21,7 +22,18 @@ from common.insight_log import EntityEvent, write_resource_insight_log

 class WebhooksView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
     authentication_classes = (ApiTokenAuthentication,)
-    permission_classes = (IsAuthenticated,)
+    permission_classes = (IsAuthenticated, RBACPermission)
+
+    rbac_permissions = {
+        "list": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_READ],
+        "retrieve": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_READ],
+        "create": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_WRITE],
+        "update": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_WRITE],
+        "partial_update": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_WRITE],
+        "destroy": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_WRITE],
+        "responses": [RBACPermission.Permissions.OUTGOING_WEBHOOKS_READ],
+    }

     pagination_class = FiftyPageSizePaginator
     throttle_classes = [UserThrottle]

@@ -345,7 +345,7 @@ def list_users_to_notify_from_ical(
     schedule: "OnCallSchedule",
     events_datetime: typing.Optional[datetime.datetime] = None,
     from_cached_final: bool = False,
-) -> typing.Sequence["User"]:
+) -> typing.List["User"]:
     """
     Retrieve on-call users for the current time
     """

@@ -363,7 +363,7 @@ def list_users_to_notify_from_ical_for_period(
     start_datetime: datetime.datetime,
     end_datetime: datetime.datetime,
     from_cached_final: bool = False,
-) -> typing.Sequence["User"]:
+) -> typing.List["User"]:
     if from_cached_final and schedule.cached_ical_final_schedule:
         events = schedule.filter_events(start_datetime, end_datetime, from_cached_final=True)
     else:

@@ -606,7 +606,7 @@ def get_missing_users_from_ical_event(event, organization: "Organization"):
     return [u for u in all_usernames if u != "" and u not in found_usernames and u.lower() not in found_emails]


-def get_users_from_ical_event(event, organization: "Organization") -> typing.Sequence["User"]:
+def get_users_from_ical_event(event, organization: "Organization") -> typing.List["User"]:
     usernames_from_ical, _ = get_usernames_from_ical_event(event)
     users = []
     if len(usernames_from_ical) != 0:

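The `typing.Sequence["User"]` → `typing.List["User"]` changes above narrow the declared return type to match what the functions actually build. One practical consequence: `Sequence` does not promise mutability, so type checkers flag callers that append or sort the result; `List` makes those operations type-safe. A small illustration with plain strings standing in for `User` objects:

```python
import typing


def on_call_users() -> typing.List[str]:
    # The function builds and returns a real list, so List is the honest type;
    # a Sequence annotation would forbid callers from mutating the result.
    return ["alice", "bob"]


users = on_call_users()
users.append("carol")  # fine under List: list methods like append/sort are guaranteed
assert users == ["alice", "bob", "carol"]
```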
@@ -0,0 +1,20 @@
+# Generated by Django 4.2.16 on 2024-10-21 17:34
+
+from django.db import migrations, models
+import django.db.models.deletion
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('slack', '0005_slackteamidentity__unified_slack_app_installed'),
+        ('schedules', '0017_alter_oncallschedule_polymorphic_ctype'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='oncallschedule',
+            name='slack_channel',
+            field=models.ForeignKey(default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='slack.slackchannel'),
+        ),
+    ]
engine/apps/schedules/migrations/0019_auto_20241021_1735.py (new file, 65 lines)
|
|
@ -0,0 +1,65 @@
|
|||
# Generated by Django 4.2.16 on 2024-10-21 17:35
|
||||
import logging
|
||||
|
||||
from django.db import migrations
|
||||
import django_migration_linter as linter
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def populate_slack_channel(apps, schema_editor):
|
||||
OnCallSchedule = apps.get_model("schedules", "OnCallSchedule")
|
||||
    SlackChannel = apps.get_model("slack", "SlackChannel")

    logger.info("Starting migration to populate slack_channel field.")

    queryset = OnCallSchedule.objects.filter(channel__isnull=False, organization__slack_team_identity__isnull=False)
    total_schedules = queryset.count()
    updated_schedules = 0
    missing_channels = 0
    schedules_to_update = []

    logger.info(f"Total schedules to process: {total_schedules}")

    for schedule in queryset:
        slack_id = schedule.channel
        slack_team_identity = schedule.organization.slack_team_identity

        try:
            slack_channel = SlackChannel.objects.get(slack_id=slack_id, slack_team_identity=slack_team_identity)

            schedule.slack_channel = slack_channel
            schedules_to_update.append(schedule)

            updated_schedules += 1
            logger.info(
                f"Schedule {schedule.id} updated with SlackChannel {slack_channel.id} (slack_id: {slack_id})."
            )
        except SlackChannel.DoesNotExist:
            missing_channels += 1
            logger.warning(
                f"SlackChannel with slack_id {slack_id} and slack_team_identity {slack_team_identity} "
                f"does not exist for Schedule {schedule.id}."
            )

    if schedules_to_update:
        OnCallSchedule.objects.bulk_update(schedules_to_update, ["slack_channel"])
        logger.info(f"Bulk updated {len(schedules_to_update)} OnCallSchedules with their Slack channel.")

    logger.info(
        f"Finished migration. Total schedules processed: {total_schedules}. "
        f"Schedules updated: {updated_schedules}. Missing SlackChannels: {missing_channels}."
    )


class Migration(migrations.Migration):

    dependencies = [
        ('schedules', '0018_oncallschedule_slack_channel'),
    ]

    operations = [
        # simply setting this new field is okay, we are not deleting the value of channel
        # therefore, no need to revert it
        linter.IgnoreMigration(),
        migrations.RunPython(populate_slack_channel, migrations.RunPython.noop),
    ]
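The data migration above follows a common collect-then-bulk-write pattern: resolve each legacy string channel ID to a channel row, accumulate the schedules that matched, count the misses, and persist everything in one `bulk_update`. A minimal framework-free sketch of the same pattern (the classes and data here are illustrative, not OnCall's actual models):

```python
from dataclasses import dataclass


@dataclass
class Schedule:
    id: int
    channel: str                  # legacy Slack channel ID stored as a string
    slack_channel: object = None  # will point at the resolved channel object


def populate(schedules, channels_by_slack_id):
    """Resolve legacy string IDs to channel objects; return (to_update, missing)."""
    to_update, missing = [], 0
    for schedule in schedules:
        channel = channels_by_slack_id.get(schedule.channel)
        if channel is None:
            missing += 1  # no matching SlackChannel row; logged as a warning in the migration
            continue
        schedule.slack_channel = channel
        to_update.append(schedule)
    # in the Django migration this list feeds a single bulk_update(..., ["slack_channel"])
    return to_update, missing


channels = {"C1": "chan-obj-1", "C2": "chan-obj-2"}
schedules = [Schedule(1, "C1"), Schedule(2, "C9"), Schedule(3, "C2")]
updated, missing = populate(schedules, channels)
```

Collecting the objects first and writing once keeps the migration to a constant number of write queries regardless of how many schedules exist.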
@@ -53,7 +53,7 @@ if typing.TYPE_CHECKING:
     from apps.alerts.models import EscalationPolicy
     from apps.auth_token.models import ScheduleExportAuthToken
     from apps.schedules.models import ShiftSwapRequest
-    from apps.slack.models import SlackUserGroup
+    from apps.slack.models import SlackChannel, SlackUserGroup
     from apps.user_management.models import Organization, Team
@@ -167,8 +167,9 @@ class OnCallSchedule(PolymorphicModel):
     custom_shifts: "RelatedManager['CustomOnCallShift']"
     organization: "Organization"
     shift_swap_requests: "RelatedManager['ShiftSwapRequest']"
     slack_user_group: typing.Optional["SlackUserGroup"]
+    slack_channel: typing.Optional["SlackChannel"]
     team: typing.Optional["Team"]
     user_group: typing.Optional["SlackUserGroup"]

     objects: models.Manager["OnCallSchedule"] = PolymorphicManager.from_queryset(OnCallScheduleQuerySet)()
@@ -207,7 +208,16 @@ class OnCallSchedule(PolymorphicModel):
     )

     name = models.CharField(max_length=200)
-    channel = models.CharField(max_length=100, null=True, default=None)
+
+    # TODO: drop this field in a subsequent release, this has been migrated to slack_channel field
+    channel = models.CharField(max_length=100, null=True, default=None)
+    slack_channel = models.ForeignKey(
+        "slack.SlackChannel",
+        null=True,
+        default=None,
+        on_delete=models.SET_NULL,
+        related_name="+",
+    )

     # Slack user group to be updated when on-call users change for this schedule
     user_group = models.ForeignKey(
@@ -258,6 +268,10 @@ class OnCallSchedule(PolymorphicModel):
     def slack_url(self) -> str:
         return f"<{self.web_detail_page_link}|{self.name}>"

+    @property
+    def slack_channel_slack_id(self) -> typing.Optional[str]:
+        return self.slack_channel.slack_id if self.slack_channel else None
+
     def get_icalendars(self) -> typing.Tuple[typing.Optional[icalendar.Calendar], typing.Optional[icalendar.Calendar]]:
         """Returns list of calendars. Primary calendar should always be the first"""
         # if self._ical_file_(primary|overrides) is None -> no cache, will trigger a refresh
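The new `slack_channel_slack_id` property is a small null-safe accessor: callers get either the linked channel's Slack ID or `None`, without repeating the `if self.slack_channel` guard at every call site. The same idea outside Django (plain illustrative classes, not the real models):

```python
from typing import Optional


class SlackChannel:
    def __init__(self, slack_id: str):
        self.slack_id = slack_id


class OnCallSchedule:
    def __init__(self, slack_channel: Optional[SlackChannel] = None):
        self.slack_channel = slack_channel

    @property
    def slack_channel_slack_id(self) -> Optional[str]:
        # None-safe accessor: avoids AttributeError when no channel is linked
        return self.slack_channel.slack_id if self.slack_channel else None


linked = OnCallSchedule(SlackChannel("C1234ASDFJ"))
unlinked = OnCallSchedule()
```

Centralizing the guard in one property is what lets later hunks in this commit replace `schedule.channel` with `schedule.slack_channel_slack_id` one-for-one.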
@@ -1043,14 +1057,11 @@ class OnCallSchedule(PolymorphicModel):
             result["team_id"] = self.team.public_primary_key
         else:
             result["team"] = "General"
-        if self.organization.slack_team_identity:
-            if self.channel:
-                from apps.slack.models import SlackChannel
-
-                sti = self.organization.slack_team_identity
-                slack_channel = SlackChannel.objects.filter(slack_team_identity=sti, slack_id=self.channel).first()
-                if slack_channel:
-                    result["slack_channel"] = slack_channel.name
+
+        if self.organization.slack_team_identity:
+            if self.slack_channel is not None:
+                result["slack_channel"] = self.slack_channel.name

         if self.user_group is not None:
             result["user_group"] = self.user_group.handle
@@ -1058,6 +1069,7 @@ class OnCallSchedule(PolymorphicModel):
         result["current_shift_notification"] = self.mention_oncall_start
         result["next_shift_notification"] = self.mention_oncall_next
         result["notify_empty_oncall"] = self.get_notify_empty_oncall_display()
+
         return result

     @property
@@ -169,7 +169,7 @@ class ShiftSwapRequest(models.Model):
         This is only set if the schedule associated with the shift swap request
         has a Slack channel configured for it.
         """
-        return self.schedule.channel
+        return self.schedule.slack_channel_slack_id

     @property
     def organization(self) -> "Organization":
@@ -32,7 +32,7 @@ def start_notify_about_empty_shifts_in_schedule():
     week_ago = today - timezone.timedelta(days=7)
     schedules = OnCallScheduleICal.objects.filter(
         empty_shifts_report_sent_at__lte=week_ago,
-        channel__isnull=False,
+        slack_channel__isnull=False,
         organization__deleted_at__isnull=True,
     )
@@ -54,7 +54,7 @@ def notify_about_empty_shifts_in_schedule_task(schedule_pk):
     if current_task_id != cached_task_id and cached_task_id is not None:
         return
     try:
-        schedule = OnCallSchedule.objects.get(pk=schedule_pk, channel__isnull=False)
+        schedule = OnCallSchedule.objects.get(pk=schedule_pk, slack_channel__isnull=False)
     except OnCallSchedule.DoesNotExist:
         task_logger.info(f"Tried to notify_about_empty_shifts_in_schedule_task for non-existing schedule {schedule_pk}")
         return
@@ -94,7 +94,7 @@ def notify_about_empty_shifts_in_schedule_task(schedule_pk):
             text += f"_From {OnCallSchedule.CALENDAR_TYPE_VERBAL[empty_shift.calendar_type]} calendar_\n"
             if idx != len(empty_shifts) - 1:
                 text += "\n\n"
-        post_message_to_channel(schedule.organization, schedule.channel, text)
+        post_message_to_channel(schedule.organization, schedule.slack_channel_slack_id, text)
     else:
         schedule.has_empty_shifts = False
     schedule.save(update_fields=["empty_shifts_report_sent_at", "has_empty_shifts"])
@@ -31,7 +31,7 @@ def start_notify_about_gaps_in_schedule():
     week_ago = today - timezone.timedelta(days=7)
     schedules = OnCallSchedule.objects.filter(
         gaps_report_sent_at__lte=week_ago,
-        channel__isnull=False,
+        slack_channel__isnull=False,
         organization__deleted_at__isnull=True,
     )
@@ -54,7 +54,7 @@ def notify_about_gaps_in_schedule_task(schedule_pk):
         return

     try:
-        schedule = OnCallSchedule.objects.get(pk=schedule_pk, channel__isnull=False)
+        schedule = OnCallSchedule.objects.get(pk=schedule_pk, slack_channel__isnull=False)
     except OnCallSchedule.DoesNotExist:
         task_logger.info(f"Tried to notify_about_gaps_in_schedule_task for non-existing schedule {schedule_pk}")
         return
@@ -77,7 +77,7 @@ def notify_about_gaps_in_schedule_task(schedule_pk):
             text += f"From {start_verbal} to {end_verbal} (your TZ)\n"
             if idx != len(gaps) - 1:
                 text += "\n\n"
-        post_message_to_channel(schedule.organization, schedule.channel, text)
+        post_message_to_channel(schedule.organization, schedule.slack_channel_slack_id, text)
     else:
         schedule.has_gaps = False
     schedule.save(update_fields=["gaps_report_sent_at", "has_gaps"])
@@ -51,7 +51,7 @@ def refresh_ical_file(schedule_pk):
         return

     schedule.refresh_ical_file()
-    if schedule.channel is not None:
+    if schedule.slack_channel is not None:
         notify_ical_schedule_shift.apply_async((schedule.pk,))

     run_task_primary = False
@@ -33,7 +33,7 @@ def test_notify_beneficiary_about_taken_shift_swap_request_no_configured_slack_channel(
    shift_swap_request_setup,
 ):
     ssr, _, _ = shift_swap_request_setup()
-    assert ssr.schedule.channel is None
+    assert ssr.schedule.slack_channel is None

     notify_beneficiary_about_taken_shift_swap_request(ssr.pk)
@@ -55,16 +55,15 @@ def test_notify_beneficiary_about_taken_shift_swap_request_post_message_to_channel(
     mock_notify_beneficiary_about_taken_shift_swap_request_via_push_notification,
     shift_swap_request_setup,
     make_slack_team_identity,
+    make_slack_channel,
 ):
-    slack_channel_id = "C1234ASDFJ"
-
     ssr, _, _ = shift_swap_request_setup()
     schedule = ssr.schedule
     organization = schedule.organization

     slack_team_identity = make_slack_team_identity()

-    schedule.channel = slack_channel_id
+    schedule.slack_channel = make_slack_channel(slack_team_identity)
     schedule.save()

     organization.slack_team_identity = slack_team_identity
@@ -31,7 +31,7 @@ def shift_swap_request_test_setup(
     slack_channel = make_slack_channel(slack_team_identity)
     slack_message = make_slack_message(alert_group=None, organization=organization, slack_id="12345")

-    schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb, channel=slack_channel.slack_id)
+    schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb, slack_channel=slack_channel)

     if swap_start is None:
         swap_start = timezone.now() + timezone.timedelta(days=7)
@@ -21,7 +21,7 @@ def test_create_shift_swap_request_message_no_configured_slack_channel_for_schedule(
     shift_swap_request_setup,
 ):
     ssr, _, _ = shift_swap_request_setup()
-    assert ssr.schedule.channel is None
+    assert ssr.schedule.slack_channel is None

     slack_msg_tasks.create_shift_swap_request_message(ssr.pk)
@@ -36,9 +36,8 @@ def test_create_shift_swap_request_message_post_message_to_channel_called(
     shift_swap_request_setup,
     make_slack_message,
     make_slack_team_identity,
+    make_slack_channel,
 ):
-    slack_channel_id = "C1234ASDFJ"
-
     ssr, _, _ = shift_swap_request_setup()
     schedule = ssr.schedule
     organization = schedule.organization
@@ -48,7 +47,7 @@ def test_create_shift_swap_request_message_post_message_to_channel_called(

     MockBaseShiftSwapRequestStep.return_value.create_message.return_value = slack_message

-    schedule.channel = slack_channel_id
+    schedule.slack_channel = make_slack_channel(slack_team_identity)
     schedule.save()

     organization.slack_team_identity = slack_team_identity
@@ -79,7 +78,7 @@ def test_update_shift_swap_request_message_no_configured_slack_channel_for_schedule(
     shift_swap_request_setup,
 ):
     ssr, _, _ = shift_swap_request_setup()
-    assert ssr.schedule.channel is None
+    assert ssr.schedule.slack_channel is None

     slack_msg_tasks.update_shift_swap_request_message(ssr.pk)
@@ -93,16 +92,15 @@ def test_update_shift_swap_request_message_post_message_to_channel_called(
     MockBaseShiftSwapRequestStep,
     shift_swap_request_setup,
     make_slack_team_identity,
+    make_slack_channel,
 ):
-    slack_channel_id = "C1234ASDFJ"
-
     ssr, _, _ = shift_swap_request_setup()
     schedule = ssr.schedule
     organization = schedule.organization

     slack_team_identity = make_slack_team_identity()

-    schedule.channel = slack_channel_id
+    schedule.slack_channel = make_slack_channel(slack_team_identity)
     schedule.save()

     organization.slack_team_identity = slack_team_identity
@@ -11,19 +11,23 @@ from apps.schedules.tasks import notify_about_empty_shifts_in_schedule_task

 @pytest.mark.django_db
 def test_no_empty_shifts_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -58,19 +62,23 @@ def test_no_empty_shifts_no_triggering_notification(

 @pytest.mark.django_db
 def test_empty_shifts_trigger_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1", role=LegacyAccessControlRole.VIEWER)

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -105,12 +113,16 @@ def test_empty_shifts_trigger_notification(

 @pytest.mark.django_db
 def test_empty_non_empty_shifts_trigger_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")
     user2 = make_user(organization=organization, username="user2", role=LegacyAccessControlRole.VIEWER)
@@ -118,7 +130,7 @@ def test_empty_non_empty_shifts_trigger_notification(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -10,19 +10,23 @@ from apps.schedules.tasks import notify_about_gaps_in_schedule_task

 @pytest.mark.django_db
 def test_no_gaps_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -57,19 +61,23 @@ def test_no_gaps_no_triggering_notification(

 @pytest.mark.django_db
 def test_gaps_in_the_past_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -120,19 +128,23 @@ def test_gaps_in_the_past_no_triggering_notification(

 @pytest.mark.django_db
 def test_gaps_now_trigger_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -170,19 +182,23 @@ def test_gaps_now_trigger_notification(

 @pytest.mark.django_db
 def test_gaps_near_future_trigger_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     schedule = make_schedule(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -221,12 +237,16 @@ def test_gaps_near_future_trigger_notification(

 @pytest.mark.django_db
 def test_gaps_later_than_7_days_no_triggering_notification(
-    make_organization_and_user_with_slack_identities,
+    make_slack_team_identity,
+    make_slack_channel,
+    make_organization,
+    make_user,
     make_schedule,
     make_on_call_shift,
 ):
-    organization, _, _, _ = make_organization_and_user_with_slack_identities()
+    slack_team_identity = make_slack_team_identity()
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity)
     user1 = make_user(organization=organization, username="user1")

     now = timezone.now().replace(microsecond=0)
@@ -235,7 +255,7 @@ def test_gaps_later_than_7_days_no_triggering_notification(
         organization,
         schedule_class=OnCallScheduleWeb,
         name="test_schedule",
-        channel="channel",
+        slack_channel=slack_channel,
         prev_ical_file_overrides=None,
         cached_ical_file_overrides=None,
     )
@@ -43,3 +43,7 @@ class SlackChannel(models.Model):

     class Meta:
         unique_together = ("slack_id", "slack_team_identity")

+    def __str__(self):
+        return f"{self.name}: {self.slack_id}"
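Defining `__str__` on the model gives readable log lines wherever a channel instance is interpolated (for example the migration's `logger.warning` calls, or the Django admin). The behavior is the same outside Django; a sketch with a plain class:

```python
class SlackChannel:
    def __init__(self, name: str, slack_id: str):
        self.name = name
        self.slack_id = slack_id

    def __str__(self):
        # same format as the model above: "<name>: <slack_id>"
        return f"{self.name}: {self.slack_id}"


channel = SlackChannel("alerts", "C1234ASDFJ")
```

Any `str(channel)`, `f"{channel}"`, or `"%s" % channel` now renders the human-readable form instead of the default object repr.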
@@ -29,6 +29,8 @@ class SlackMessage(models.Model):
     id = models.CharField(primary_key=True, default=uuid.uuid4, editable=False, max_length=36)

     slack_id = models.CharField(max_length=100)
+
+    # TODO: convert this to a foreign key field to SlackChannel
     channel_id = models.CharField(max_length=100, null=True, default=None)

     organization = models.ForeignKey(
@@ -19,11 +19,15 @@ from apps.user_management.models import Organization, User
 if typing.TYPE_CHECKING:
     from django.db.models.manager import RelatedManager

+    from apps.slack.models import SlackChannel, SlackUserGroup
+
 logger = logging.getLogger(__name__)


 class SlackTeamIdentity(models.Model):
+    cached_channels: "RelatedManager['SlackChannel']"
+    organizations: "RelatedManager[Organization]"
+    usergroups: "RelatedManager['SlackUserGroup']"
+
     id = models.AutoField(primary_key=True)
     slack_id = models.CharField(max_length=100)
@@ -68,10 +68,10 @@ class AlertShootingStep(scenario_step.ScenarioStep):
         if num_updated_rows == 1:
             try:
                 channel_id = (
-                    alert.group.channel_filter.slack_channel_id_or_general_log_id
+                    alert.group.channel_filter.slack_channel_id_or_org_default_id
                     if alert.group.channel_filter
                     # if channel filter is deleted mid escalation, use default Slack channel
-                    else alert.group.channel.organization.general_log_channel_id
+                    else alert.group.channel.organization.default_slack_channel_slack_id
                 )
                 self._send_first_alert(alert, channel_id)
             except (SlackAPIError, TimeoutError):
@@ -71,7 +71,7 @@ class AddToResolutionNoteStep(scenario_step.ScenarioStep):
         predefined_org: typing.Optional["Organization"] = None,
     ) -> None:
         from apps.alerts.models import ResolutionNote, ResolutionNoteSlackMessage
-        from apps.slack.models import SlackMessage, SlackUserIdentity
+        from apps.slack.models import SlackChannel, SlackMessage, SlackUserIdentity

         try:
             channel_id = payload["channel"]["id"]
@@ -156,12 +156,17 @@ class AddToResolutionNoteStep(scenario_step.ScenarioStep):
         except ResolutionNoteSlackMessage.DoesNotExist:
             text = payload["message"]["text"]
             text = text.replace("```", "")
+
+            slack_channel = SlackChannel.objects.get(
+                slack_id=channel_id, slack_team_identity=slack_team_identity
+            )
             slack_message = SlackMessage.objects.get(
                 slack_id=thread_ts,
                 _slack_team_identity=slack_team_identity,
                 channel_id=channel_id,
             )
             alert_group = slack_message.alert_group
+
             try:
                 author_slack_user_identity = SlackUserIdentity.objects.get(
                     slack_id=payload["message"]["user"], slack_team_identity=slack_team_identity
@@ -174,12 +179,13 @@ class AddToResolutionNoteStep(scenario_step.ScenarioStep):
             )
             self.open_warning_window(payload, warning_text)
             return
+
         resolution_note_slack_message = ResolutionNoteSlackMessage(
             alert_group=alert_group,
             user=author_user,
             added_by_user=self.user,
             text=text,
-            slack_channel_id=channel_id,
+            slack_channel=slack_channel,
             thread_ts=thread_ts,
             ts=message_ts,
             permalink=permalink,
@@ -188,6 +194,7 @@ class AddToResolutionNoteStep(scenario_step.ScenarioStep):
         resolution_note_slack_message.added_to_resolution_note = True
         resolution_note_slack_message.save()
         resolution_note = resolution_note_slack_message.get_resolution_note()
+
         if resolution_note is None:
             ResolutionNote(
                 alert_group=alert_group,
@@ -197,6 +204,7 @@ class AddToResolutionNoteStep(scenario_step.ScenarioStep):
             ).save()
         else:
             resolution_note.recreate()
+
         try:
             self._slack_client.reactions_add(
                 channel=channel_id,
@@ -225,14 +233,14 @@ class UpdateResolutionNoteStep(scenario_step.ScenarioStep):
         )

     def remove_resolution_note_slack_message(self, resolution_note: "ResolutionNote") -> None:
-        resolution_note_slack_message = resolution_note.resolution_note_slack_message
-        if resolution_note_slack_message is not None:
+        if (resolution_note_slack_message := resolution_note.resolution_note_slack_message) is not None:
             resolution_note_slack_message.added_to_resolution_note = False
             resolution_note_slack_message.save(update_fields=["added_to_resolution_note"])

             if resolution_note_slack_message.posted_by_bot:
                 try:
                     self._slack_client.chat_delete(
-                        channel=resolution_note_slack_message.slack_channel_id,
+                        channel=resolution_note_slack_message.slack_channel_slack_id,
                         ts=resolution_note_slack_message.ts,
                     )
                 except RESOLUTION_NOTE_EXCEPTIONS:
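The refactor above folds the fetch-then-check into one line with an assignment expression (`:=`, Python 3.8+); the bound name remains available inside the guarded block. A self-contained illustration of the same shape (the data here is made up, not the OnCall models):

```python
# a lookup that may or may not yield an object, like resolution_note.resolution_note_slack_message
notes = {
    "with-slack-msg": {"added": True},
    "without-slack-msg": None,
}


def remove_message(note_key: str) -> bool:
    # walrus operator: assign and test in a single expression
    if (msg := notes[note_key]) is not None:
        msg["added"] = False  # msg is in scope inside the block
        return True
    return False
```

The behavior is identical to the two-line form; the win is that the variable's scope visually matches the only place it is used.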
@@ -242,17 +250,23 @@ class UpdateResolutionNoteStep(scenario_step.ScenarioStep):

     def post_or_update_resolution_note_in_thread(self, resolution_note: "ResolutionNote") -> None:
         from apps.alerts.models import ResolutionNoteSlackMessage
+        from apps.slack.models import SlackChannel

         resolution_note_slack_message = resolution_note.resolution_note_slack_message
         alert_group = resolution_note.alert_group
         alert_group_slack_message = alert_group.slack_message
+        slack_channel_id = alert_group_slack_message.channel_id
         blocks = self.get_resolution_note_blocks(resolution_note)

+        slack_channel = SlackChannel.objects.get(
+            slack_id=slack_channel_id, slack_team_identity=self.slack_team_identity
+        )
+
         if resolution_note_slack_message is None:
             resolution_note_text = Truncator(resolution_note.text)
             try:
                 result = self._slack_client.chat_postMessage(
-                    channel=alert_group_slack_message.channel_id,
+                    channel=slack_channel_id,
                     thread_ts=alert_group_slack_message.slack_id,
                     text=resolution_note_text.chars(BLOCK_SECTION_TEXT_MAX_SIZE),
                     blocks=blocks,
@@ -261,17 +275,14 @@ class UpdateResolutionNoteStep(scenario_step.ScenarioStep):
                 pass
             else:
                 message_ts = result["message"]["ts"]
-                result_permalink = self._slack_client.chat_getPermalink(
-                    channel=alert_group_slack_message.channel_id,
-                    message_ts=message_ts,
-                )
+                result_permalink = self._slack_client.chat_getPermalink(channel=slack_channel_id, message_ts=message_ts)

                 resolution_note_slack_message = ResolutionNoteSlackMessage(
                     alert_group=alert_group,
                     user=resolution_note.author,
                     added_by_user=resolution_note.author,
                     text=resolution_note.text,
-                    slack_channel_id=alert_group_slack_message.channel_id,
+                    slack_channel=slack_channel,
                     thread_ts=result["ts"],
                     ts=message_ts,
                     permalink=result_permalink["permalink"],
@@ -305,7 +316,7 @@ class UpdateResolutionNoteStep(scenario_step.ScenarioStep):
     def add_resolution_note_reaction(self, slack_thread_message: "ResolutionNoteSlackMessage"):
         try:
             self._slack_client.reactions_add(
-                channel=slack_thread_message.slack_channel_id,
+                channel=slack_thread_message.slack_channel_slack_id,
                 name="memo",
                 timestamp=slack_thread_message.ts,
             )
@@ -315,7 +326,7 @@ class UpdateResolutionNoteStep(scenario_step.ScenarioStep):
     def remove_resolution_note_reaction(self, slack_thread_message: "ResolutionNoteSlackMessage") -> None:
         try:
             self._slack_client.reactions_remove(
-                channel=slack_thread_message.slack_channel_id,
+                channel=slack_thread_message.slack_channel_slack_id,
                 name="memo",
                 timestamp=slack_thread_message.ts,
             )
@@ -57,8 +57,6 @@ class SlackChannelDeletedEventStep(scenario_step.ScenarioStep):
             slack_id=slack_id,
             slack_team_identity=slack_team_identity,
         ).delete()
-        # even if channel is deleted run the task to clean possible leftovers
-        clean_slack_channel_leftovers.apply_async((slack_team_identity.id, slack_id))


 class SlackChannelArchivedEventStep(scenario_step.ScenarioStep):
@@ -83,7 +81,7 @@ class SlackChannelArchivedEventStep(scenario_step.ScenarioStep):
         clean_slack_channel_leftovers.apply_async((slack_team_identity.id, slack_id))


-class SlackChannelUnArchivedEventStep(scenario_step.ScenarioStep):
+class SlackChannelUnarchivedEventStep(scenario_step.ScenarioStep):
     def process_scenario(
         self,
         slack_user_identity: "SlackUserIdentity",
@@ -128,6 +126,6 @@ STEPS_ROUTING: ScenarioRoute.RoutingSteps = [
     {
         "payload_type": PayloadType.EVENT_CALLBACK,
         "event_type": EventType.CHANNEL_UNARCHIVED,
-        "step": SlackChannelUnArchivedEventStep,
+        "step": SlackChannelUnarchivedEventStep,
     },
 ]
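The class rename has to be applied in both hunks because `STEPS_ROUTING` is a plain dispatch table: incoming Slack events are matched on their type and handed to the referenced step class. A stripped-down sketch of that dispatch shape (names and event format are illustrative, not OnCall's actual `ScenarioRoute` API):

```python
class UnarchivedStep:
    def process(self, event):
        return f"unarchived {event['channel']}"


# routing table: each entry names the event it handles and the step class to run
STEPS_ROUTING = [
    {"event_type": "channel_unarchived", "step": UnarchivedStep},
]


def dispatch(event):
    # linear scan over the table; first matching route wins
    for route in STEPS_ROUTING:
        if route["event_type"] == event["type"]:
            return route["step"]().process(event)
    return None  # unrecognized events are ignored


result = dispatch({"type": "channel_unarchived", "channel": "C42"})
```

Because the table holds class references rather than strings, a rename that misses either the definition or the table entry fails at import time instead of silently dropping events.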
@@ -44,7 +44,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
         self, slack_user_identity: "SlackUserIdentity", payload: EventPayload
     ) -> None:
         from apps.alerts.models import ResolutionNoteSlackMessage
-        from apps.slack.models import SlackMessage
+        from apps.slack.models import SlackChannel, SlackMessage

         if slack_user_identity is None:
             logger.warning(
@@ -53,7 +53,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
             )
             return

-        channel = payload["event"]["channel"]
+        channel_id = payload["event"]["channel"]
         thread_ts = payload["event"].get("thread_ts") or payload["event"]["message"]["thread_ts"]
         # sometimes we get messages with empty text, probably because it's an image or attachment
         event_text = payload["event"].get("text")
@@ -68,18 +68,23 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
         try:
             slack_message = SlackMessage.objects.get(
                 slack_id=thread_ts,
-                channel_id=channel,
+                channel_id=channel_id,
                 _slack_team_identity=self.slack_team_identity,
             )
         except SlackMessage.DoesNotExist:
             return

+        try:
+            slack_channel = SlackChannel.objects.get(slack_id=channel_id, slack_team_identity=self.slack_team_identity)
+        except SlackChannel.DoesNotExist:
+            return
+
         if not slack_message.alert_group:
             # SlackMessage instances without alert_group set (e.g., SSR Slack messages)
             return

         try:
-            result = self._slack_client.chat_getPermalink(channel=channel, message_ts=message_ts)
+            result = self._slack_client.chat_getPermalink(channel=channel_id, message_ts=message_ts)
         except RESOLUTION_NOTE_EXCEPTIONS:
             return
@@ -89,7 +94,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):

         if len(text) > 2900:
             self._slack_client.chat_postEphemeral(
-                channel=channel,
+                channel=channel_id,
                 user=slack_user_identity.slack_id,
                 text=":warning: Unable to show the <{}|message> in Resolution Note: the message is too long ({}). "
                 "Max length - 2900 symbols.".format(permalink, len(text)),
@@ -104,7 +109,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
                 "user": self.user,
                 "added_by_user": self.user,
                 "text": text,
-                "slack_channel_id": channel,
+                "slack_channel": slack_channel,
                 "permalink": permalink,
             },
         )
@@ -132,7 +137,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
             slack_thread_message = ResolutionNoteSlackMessage.objects.get(
                 ts=message_ts,
                 thread_ts=thread_ts,
-                slack_channel_id=channel_id,
+                slack_channel__slack_id=channel_id,
             )
         except ResolutionNoteSlackMessage.DoesNotExist:
             pass
@@ -31,7 +31,7 @@ class AlertGroupLogSlackRenderer:
         log_builder = IncidentLogBuilder(alert_group)
         result = ""
         if not (alert_group.resolved or alert_group.wiped_at or alert_group.root_alert_group):
-            escalation_policies_plan = log_builder.get_incident_escalation_plan(for_slack=True)
+            escalation_policies_plan = log_builder.get_escalation_plan(for_slack=True)
             if escalation_policies_plan:
                 result += "\n:arrow_down: :arrow_down: :arrow_down: Plan:\n\n"
                 # humanize time, create plan text
@@ -1,6 +1,6 @@
 import logging
 import random
-from typing import Optional
+import typing

 from celery import uuid as celery_uuid
 from celery.exceptions import Retry

@@ -198,8 +198,8 @@ def unpopulate_slack_user_identities(organization_pk, force=False, ts=None):
     if force:
         organization.slack_team_identity = None
-        organization.general_log_channel_id = None
-        organization.save(update_fields=["slack_team_identity", "general_log_channel_id"])
+        organization.default_slack_channel = None
+        organization.save(update_fields=["slack_team_identity", "default_slack_channel"])


 @shared_dedicated_queue_retry_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=0)

@@ -302,16 +302,16 @@ def post_slack_rate_limit_message(integration_id):
             f"Active: {integration.rate_limit_message_task_id}"
         )
         return

     default_route = integration.channel_filters.get(is_default=True)
-    slack_channel = default_route.slack_channel_id_or_general_log_id
-    if slack_channel:
+    if (slack_channel_id := default_route.slack_channel_id_or_org_default_id) is not None:
         text = (
             f"Delivering and updating alert groups of integration {integration.verbal_name} in Slack is "
             f"temporarily stopped due to rate limit. You could find new alert groups at "
             f"<{integration.new_incidents_web_link}|web page "
             '"Alert Groups">'
         )
-        post_message_to_channel(integration.organization, slack_channel, text)
+        post_message_to_channel(integration.organization, slack_channel_id, text)


 @shared_dedicated_queue_retry_task(

@@ -433,7 +433,7 @@ def populate_slack_channels():

 def start_populate_slack_channels_for_team(
-    slack_team_identity_id: int, delay: int, cursor: Optional[str] = None
+    slack_team_identity_id: int, delay: int, cursor: typing.Optional[str] = None
 ) -> None:
     # save active task id in cache to make only one populate task active per team
     task_id = celery_uuid()

@@ -445,7 +445,7 @@ def start_populate_slack_channels_for_team(
 @shared_dedicated_queue_retry_task(
     autoretry_for=(Exception,), retry_backoff=True, max_retries=1 if settings.DEBUG else None
 )
-def populate_slack_channels_for_team(slack_team_identity_id: int, cursor: Optional[str] = None) -> None:
+def populate_slack_channels_for_team(slack_team_identity_id: int, cursor: typing.Optional[str] = None) -> None:
     """
     Make paginated request to get slack channels. On ratelimit - update info for got channels, save collected channels
     ids in cache and restart the task with the last successful pagination cursor to avoid any data loss during delay

@@ -539,7 +539,7 @@ def populate_slack_channels_for_team(slack_team_identity_id: int, cursor: Option

 @shared_dedicated_queue_retry_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=0)
-def clean_slack_integration_leftovers(organization_id, *args, **kwargs):
+def clean_slack_integration_leftovers(organization_id: int, *args, **kwargs) -> None:
     """
     This task removes binding to slack (e.g ChannelFilter's slack channel) for a given organization.
     It is used when user changes slack integration.

@@ -548,19 +548,25 @@ def clean_slack_integration_leftovers(organization_id, *args, **kwargs):
     from apps.schedules.models import OnCallSchedule

     logger.info(f"Cleaning up for organization {organization_id}")
-    ChannelFilter.objects.filter(alert_receive_channel__organization_id=organization_id).update(slack_channel_id=None)
-    OnCallSchedule.objects.filter(organization_id=organization_id).update(channel=None, user_group=None)
+    ChannelFilter.objects.filter(alert_receive_channel__organization_id=organization_id).update(slack_channel=None)
+    OnCallSchedule.objects.filter(organization_id=organization_id).update(slack_channel=None, user_group=None)


 @shared_dedicated_queue_retry_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=10)
-def clean_slack_channel_leftovers(slack_team_identity_id, slack_channel_id):
+def clean_slack_channel_leftovers(slack_team_identity_id: int, slack_channel_id: str) -> None:
     """
-    This task removes binding to slack channel after channel arcived or deleted in slack.
+    This task removes binding to slack channel after a channel is archived in Slack.
+
+    **NOTE**: this is only needed for Slack Channel archive. If a channel is deleted, we simply remove references
+    to that channel via `on_delete=models.SET_NULL`.
     """
     from apps.alerts.models import ChannelFilter
+    from apps.schedules.models import OnCallSchedule
     from apps.slack.models import SlackTeamIdentity
     from apps.user_management.models import Organization

+    orgs_to_clean_default_slack_channel: typing.List[Organization] = []
+
     try:
         sti = SlackTeamIdentity.objects.get(id=slack_team_identity_id)
     except SlackTeamIdentity.DoesNotExist:

@@ -569,16 +575,26 @@ def clean_slack_channel_leftovers(slack_team_identity_id, slack_channel_id):
         )
         return

-    orgs_to_clean_general_log_channel_id = []
     for org in sti.organizations.all():
-        if org.general_log_channel_id == slack_channel_id:
-            logger.info(
-                f"Set general_log_channel_id to None for org_id={org.id} slack_channel_id={slack_channel_id} since slack_channel is arcived or deleted"
-            )
-            org.general_log_channel_id = None
-            orgs_to_clean_general_log_channel_id.append(org)
-        ChannelFilter.objects.filter(alert_receive_channel__organization=org, slack_channel_id=slack_channel_id).update(
-            slack_channel_id=None
-        )
+        org_id = org.id
+
+        if org.default_slack_channel_slack_id == slack_channel_id:
+            logger.info(
+                f"Set default_slack_channel to None for org_id={org_id} slack_channel_id={slack_channel_id} since slack_channel is arcived or deleted"
+            )
+            org.default_slack_channel = None
+            orgs_to_clean_default_slack_channel.append(org)

-    Organization.objects.bulk_update(orgs_to_clean_general_log_channel_id, ["general_log_channel_id"], batch_size=5000)
+        # The channel no longer exists, so update any integration routes (ie. ChannelFilter) or schedules
+        # that reference it
+        ChannelFilter.objects.filter(
+            alert_receive_channel__organization=org,
+            slack_channel__slack_id=slack_channel_id,
+        ).update(slack_channel=None)
+
+        OnCallSchedule.objects.filter(
+            organization_id=org_id,
+            slack_channel__slack_id=slack_channel_id,
+        ).update(slack_channel=None)
+
+    Organization.objects.bulk_update(orgs_to_clean_default_slack_channel, ["default_slack_channel"], batch_size=5000)
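The `post_slack_rate_limit_message` hunk above replaces a lookup-then-check pair with an assignment expression, and also switches from a truthiness test to an explicit `is not None` check. A minimal standalone sketch of that pattern, using a stand-in `Route` dataclass rather than OnCall's actual `ChannelFilter` model:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Route:
    # stand-in for a ChannelFilter-like object; the field name mirrors the diff
    slack_channel_id_or_org_default_id: Optional[str]


def notify(route: Route) -> Optional[str]:
    # walrus operator (PEP 572): bind and test in a single expression, as in
    # `if (slack_channel_id := default_route.slack_channel_id_or_org_default_id) is not None:`
    if (slack_channel_id := route.slack_channel_id_or_org_default_id) is not None:
        return f"posting to {slack_channel_id}"
    return None


print(notify(Route("C123")))  # posting to C123
print(notify(Route(None)))    # None
```

Note that `is not None` differs subtly from the old `if slack_channel:` — an empty-string channel id would now pass the check, which only matters if the property can return one.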
@@ -32,7 +32,7 @@ class SlackUserGroupFactory(factory.DjangoModelFactory):

 class SlackChannelFactory(factory.DjangoModelFactory):
-    slack_id = UniqueFaker("sentence", nb_words=3)
+    slack_id = factory.Sequence(lambda n: f"TEST_SLACK_ID_{n}")
     name = factory.Faker("word")

     class Meta:
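The factory hunk above swaps faker-generated sentences for a deterministic `factory.Sequence`, so every generated `slack_id` is unique and predictable. A rough plain-Python imitation of what `Sequence` does under the hood (a shared counter feeding a callable; `factory_boy` itself is not used here):

```python
import itertools

# shared counter, incremented once per generated object, like factory.Sequence
_counter = itertools.count()


def make_slack_id() -> str:
    # each call yields TEST_SLACK_ID_0, TEST_SLACK_ID_1, ...
    return f"TEST_SLACK_ID_{next(_counter)}"


first, second = make_slack_id(), make_slack_id()
print(first, second)  # TEST_SLACK_ID_0 TEST_SLACK_ID_1
```

Deterministic ids avoid the rare-but-real collision risk of faker sentences and make test failure output easier to read.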
@@ -55,6 +55,7 @@ def test_skip_escalations_error(
 @pytest.mark.django_db
 def test_timeout_error(
     make_slack_team_identity,
+    make_slack_channel,
     make_organization,
     make_alert_receive_channel,
     make_alert_group,

@@ -62,9 +63,8 @@ def test_timeout_error(
 ):
     SlackAlertShootingStep = ScenarioStep.get_step("distribute_alerts", "AlertShootingStep")
     slack_team_identity = make_slack_team_identity()
-    organization = make_organization(
-        slack_team_identity=slack_team_identity, general_log_channel_id="DEFAULT_CHANNEL_ID"
-    )
+    slack_channel = make_slack_channel(slack_team_identity)
+    organization = make_organization(slack_team_identity=slack_team_identity, default_slack_channel=slack_channel)
     alert_receive_channel = make_alert_receive_channel(organization)
     alert_group = make_alert_group(alert_receive_channel)
     alert = make_alert(alert_group, raw_request_data="{}")

@@ -89,15 +89,15 @@ def test_timeout_error(
 def test_alert_shooting_no_channel_filter(
     mock_post_alert_group_to_slack,
     make_slack_team_identity,
+    make_slack_channel,
     make_organization,
     make_alert_receive_channel,
     make_alert_group,
     make_alert,
 ):
     slack_team_identity = make_slack_team_identity()
-    organization = make_organization(
-        slack_team_identity=slack_team_identity, general_log_channel_id="DEFAULT_CHANNEL_ID"
-    )
+    slack_channel = make_slack_channel(slack_team_identity, slack_id="DEFAULT_CHANNEL_ID")
+    organization = make_organization(slack_team_identity=slack_team_identity, default_slack_channel=slack_channel)
     alert_receive_channel = make_alert_receive_channel(organization)

     # simulate an alert group with channel filter deleted in the middle of the escalation
@@ -145,17 +145,23 @@ def test_post_or_update_resolution_note_in_thread_truncate_message_text(
     make_alert_receive_channel,
     make_alert_group,
     make_slack_message,
+    make_slack_channel,
     make_resolution_note,
 ):
-    UpdateResolutionNoteStep = ScenarioStep.get_step("resolution_note", "UpdateResolutionNoteStep")
     organization, user, slack_team_identity, _ = make_organization_and_user_with_slack_identities()
-    step = UpdateResolutionNoteStep(slack_team_identity)
-
     alert_receive_channel = make_alert_receive_channel(organization)
     alert_group = make_alert_group(alert_receive_channel)
-    make_slack_message(alert_group=alert_group, channel_id="RANDOM_CHANNEL_ID", slack_id="RANDOM_MESSAGE_ID")
+
+    slack_channel = make_slack_channel(slack_team_identity)
+    slack_channel_id = slack_channel.slack_id
+
+    make_slack_message(alert_group=alert_group, channel_id=slack_channel_id)
+
     resolution_note = make_resolution_note(alert_group=alert_group, author=user, message_text="a" * 3000)

+    UpdateResolutionNoteStep = ScenarioStep.get_step("resolution_note", "UpdateResolutionNoteStep")
+    step = UpdateResolutionNoteStep(slack_team_identity)
+
     with patch("apps.slack.client.SlackClient.api_call") as mock_slack_api_call:
         mock_slack_api_call.return_value = {
             "ts": "timestamp",

@@ -176,16 +182,19 @@ def test_post_or_update_resolution_note_in_thread_update_truncate_message_text(
     make_alert_receive_channel,
     make_alert_group,
     make_slack_message,
+    make_slack_channel,
     make_resolution_note,
     make_resolution_note_slack_message,
 ):
-    UpdateResolutionNoteStep = ScenarioStep.get_step("resolution_note", "UpdateResolutionNoteStep")
     organization, user, slack_team_identity, _ = make_organization_and_user_with_slack_identities()
-    step = UpdateResolutionNoteStep(slack_team_identity)
-
     alert_receive_channel = make_alert_receive_channel(organization)
     alert_group = make_alert_group(alert_receive_channel)
-    make_slack_message(alert_group=alert_group, channel_id="RANDOM_CHANNEL_ID", slack_id="RANDOM_MESSAGE_ID")
+
+    slack_channel = make_slack_channel(slack_team_identity)
+    slack_channel_id = slack_channel.slack_id
+
+    make_slack_message(alert_group=alert_group, channel_id=slack_channel_id)
+
     resolution_note = make_resolution_note(alert_group=alert_group, author=user, message_text="a" * 3000)
     make_resolution_note_slack_message(
         alert_group=alert_group,

@@ -197,6 +206,9 @@ def test_post_or_update_resolution_note_in_thread_update_truncate_message_text(
         text=resolution_note.text,
     )

+    UpdateResolutionNoteStep = ScenarioStep.get_step("resolution_note", "UpdateResolutionNoteStep")
+    step = UpdateResolutionNoteStep(slack_team_identity)
+
     with patch("apps.slack.client.SlackClient.api_call") as mock_slack_api_call:
         mock_slack_api_call.return_value = {
             "ts": "timestamp",

@@ -345,16 +357,21 @@ def test_add_to_resolution_note(
     make_alert_group,
     make_alert,
     make_slack_message,
+    make_slack_channel,
     settings,
 ):
     organization, user, slack_team_identity, slack_user_identity = make_organization_and_user_with_slack_identities()
     alert_receive_channel = make_alert_receive_channel(organization)
     alert_group = make_alert_group(alert_receive_channel)
     make_alert(alert_group=alert_group, raw_request_data={})
-    slack_message = make_slack_message(alert_group=alert_group)
+
+    slack_channel = make_slack_channel(slack_team_identity)
+    slack_channel_id = slack_channel.slack_id
+
+    slack_message = make_slack_message(alert_group=alert_group, channel_id=slack_channel_id)

     payload = {
-        "channel": {"id": slack_message.channel_id},
+        "channel": {"id": slack_channel_id},
         "message_ts": "random_ts",
         "message": {
             "type": "message",
engine/apps/slack/tests/scenario_steps/test_slack_channel.py  (new file, 217 lines)

@@ -0,0 +1,217 @@
+from unittest.mock import patch
+
+import pytest
+from django.utils import timezone
+
+from apps.slack.models import SlackChannel
+from apps.slack.scenarios import slack_channel as slack_channel_scenarios
+
+
+@pytest.mark.django_db
+class TestSlackChannelCreatedOrRenamedEventStep:
+    def test_process_scenario_channel_created(
+        self,
+        make_organization_and_user_with_slack_identities,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel_id = "C12345678"
+        channel_name = "new-channel"
+        payload = {
+            "event": {
+                "channel": {
+                    "id": slack_channel_id,
+                    "name": channel_name,
+                }
+            }
+        }
+
+        # Ensure the SlackChannel does not exist
+        assert not SlackChannel.objects.filter(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        ).exists()
+
+        step = slack_channel_scenarios.SlackChannelCreatedOrRenamedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, payload)
+
+        # Now the SlackChannel should exist with correct data
+        slack_channel = SlackChannel.objects.get(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        )
+        assert slack_channel.name == channel_name
+        assert slack_channel.last_populated == timezone.now().date()
+
+    def test_process_scenario_channel_renamed(
+        self,
+        make_organization_and_user_with_slack_identities,
+        make_slack_channel,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+        new_name = "renamed-channel"
+        payload = {
+            "event": {
+                "channel": {
+                    "id": slack_channel_id,
+                    "name": new_name,
+                }
+            }
+        }
+
+        step = slack_channel_scenarios.SlackChannelCreatedOrRenamedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, payload)
+
+        slack_channel.refresh_from_db()
+        assert slack_channel.name == new_name
+        assert slack_channel.last_populated == timezone.now().date()
+
+
+@pytest.mark.django_db
+class TestSlackChannelDeletedEventStep:
+    def test_process_scenario_channel_deleted(
+        self,
+        make_organization_and_user_with_slack_identities,
+        make_slack_channel,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+
+        # Ensure the SlackChannel exists
+        assert SlackChannel.objects.filter(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        ).exists()
+
+        step = slack_channel_scenarios.SlackChannelDeletedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, {"event": {"channel": slack_channel_id}})
+
+        # Now the SlackChannel should not exist
+        assert not SlackChannel.objects.filter(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        ).exists()
+
+    def test_process_scenario_channel_does_not_exist(
+        self,
+        make_organization_and_user_with_slack_identities,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel_id = "C12345678"
+
+        # Ensure the SlackChannel does not exist
+        assert not SlackChannel.objects.filter(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        ).exists()
+
+        step = slack_channel_scenarios.SlackChannelDeletedEventStep(slack_team_identity, organization, user)
+        # The step should not raise an exception even if the channel does not exist
+        step.process_scenario(slack_user_identity, slack_team_identity, {"event": {"channel": slack_channel_id}})
+
+        # Still, the SlackChannel does not exist
+        assert not SlackChannel.objects.filter(
+            slack_id=slack_channel_id,
+            slack_team_identity=slack_team_identity,
+        ).exists()
+
+
+@pytest.mark.django_db
+class TestSlackChannelArchivedEventStep:
+    @patch("apps.slack.scenarios.slack_channel.clean_slack_channel_leftovers")
+    def test_process_scenario(
+        self,
+        mock_clean_slack_channel_leftovers,
+        make_organization_and_user_with_slack_identities,
+        make_slack_channel,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+
+        assert slack_channel.is_archived is False
+
+        step = slack_channel_scenarios.SlackChannelArchivedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, {"event": {"channel": slack_channel_id}})
+
+        slack_channel.refresh_from_db()
+
+        assert slack_channel.is_archived is True
+        mock_clean_slack_channel_leftovers.apply_async.assert_called_once_with(
+            (slack_team_identity.id, slack_channel_id)
+        )
+
+
+@pytest.mark.django_db
+class TestSlackChannelUnarchivedEventStep:
+    def test_process_scenario_channel_unarchived(
+        self,
+        make_organization_and_user_with_slack_identities,
+        make_slack_channel,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity, is_archived=True)
+        slack_channel_id = slack_channel.slack_id
+
+        assert slack_channel.is_archived is True
+
+        step = slack_channel_scenarios.SlackChannelUnarchivedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, {"event": {"channel": slack_channel_id}})
+
+        slack_channel.refresh_from_db()
+        assert slack_channel.is_archived is False
+
+    def test_process_scenario_channel_already_unarchived(
+        self,
+        make_organization_and_user_with_slack_identities,
+        make_slack_channel,
+    ) -> None:
+        (
+            organization,
+            user,
+            slack_team_identity,
+            slack_user_identity,
+        ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity, is_archived=False)
+        slack_channel_id = slack_channel.slack_id
+
+        assert slack_channel.is_archived is False
+
+        step = slack_channel_scenarios.SlackChannelUnarchivedEventStep(slack_team_identity, organization, user)
+        step.process_scenario(slack_user_identity, slack_team_identity, {"event": {"channel": slack_channel_id}})
+
+        slack_channel.refresh_from_db()
+        # Ensure that is_archived remains False
+        assert slack_channel.is_archived is False
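The new test file above pins down how channel lifecycle events are handled: archive flips a flag, unarchive is idempotent, and delete removes the record while missing channels are ignored. A plain-Python sketch of that behavior (no Django or Slack APIs; the registry class and method names here are illustrative, not OnCall's):

```python
from dataclasses import dataclass
from typing import Dict


@dataclass
class Channel:
    slack_id: str
    is_archived: bool = False


class ChannelRegistry:
    def __init__(self) -> None:
        self.channels: Dict[str, Channel] = {}

    def handle_archived(self, slack_id: str) -> None:
        # archive is a flag flip, not a delete, so existing references stay resolvable
        if (channel := self.channels.get(slack_id)) is not None:
            channel.is_archived = True

    def handle_unarchived(self, slack_id: str) -> None:
        # idempotent: unarchiving an already-unarchived channel is a no-op
        if (channel := self.channels.get(slack_id)) is not None:
            channel.is_archived = False

    def handle_deleted(self, slack_id: str) -> None:
        # deletion removes the record outright; missing ids are silently ignored,
        # matching test_process_scenario_channel_does_not_exist
        self.channels.pop(slack_id, None)


registry = ChannelRegistry()
registry.channels["C12345678"] = Channel("C12345678")
registry.handle_archived("C12345678")
print(registry.channels["C12345678"].is_archived)  # True
registry.handle_deleted("C12345678")
print("C12345678" in registry.channels)  # False
```

In the real code the delete path leans on `on_delete=models.SET_NULL`, so dependent rows are nulled by the database rather than by explicit cleanup.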
@@ -219,6 +219,7 @@ class TestSlackChannelMessageEventStep:
         make_alert_receive_channel,
         make_alert_group,
         make_slack_message,
+        make_slack_channel,
     ) -> None:
         (
             organization,

@@ -229,11 +230,13 @@ class TestSlackChannelMessageEventStep:
         integration = make_alert_receive_channel(organization)
         alert_group = make_alert_group(integration)

-        channel = "potato"
         ts = 88945.4849
         thread_ts = 16789.123

-        make_slack_message(alert_group, slack_id=thread_ts, channel_id=channel)
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+
+        make_slack_message(alert_group, slack_id=thread_ts, channel_id=slack_channel_id)

         mock_permalink = "http://example.com"

@@ -243,7 +246,7 @@ class TestSlackChannelMessageEventStep:
         payload = {
             "event": {
-                "channel": channel,
+                "channel": slack_channel_id,
                 "ts": ts,
                 "thread_ts": thread_ts,
                 "text": "h" * 2901,

@@ -272,6 +275,7 @@ class TestSlackChannelMessageEventStep:
         make_alert_receive_channel,
         make_alert_group,
         make_slack_message,
+        make_slack_channel,
     ) -> None:
         (
             organization,

@@ -282,11 +286,13 @@ class TestSlackChannelMessageEventStep:
         integration = make_alert_receive_channel(organization)
         alert_group = make_alert_group(integration)

-        channel = "potato"
         ts = 88945.4849
         thread_ts = 16789.123

-        make_slack_message(alert_group, slack_id=thread_ts, channel_id=channel)
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+
+        make_slack_message(alert_group, slack_id=thread_ts, channel_id=slack_channel_id)

         step = SlackChannelMessageEventStep(slack_team_identity, organization, user)
         step._slack_client = Mock()

@@ -296,7 +302,7 @@ class TestSlackChannelMessageEventStep:
         payload = {
             "event": {
-                "channel": channel,
+                "channel": slack_channel_id,
                 "ts": ts,
                 "thread_ts": thread_ts,
                 "text": "h" * 2901,

@@ -318,6 +324,7 @@ class TestSlackChannelMessageEventStep:
         make_alert_receive_channel,
         make_alert_group,
         make_slack_message,
+        make_slack_channel,
         make_resolution_note_slack_message,
         resolution_note_slack_message_already_exists,
     ) -> None:

@@ -332,12 +339,13 @@ class TestSlackChannelMessageEventStep:
         original_text = "original text"
         new_text = "new text"

-        channel = "potato"
         ts = 88945.4849
         thread_ts = 16789.123

-        make_slack_message(alert_group, slack_id=thread_ts, channel_id=channel)
+        slack_channel = make_slack_channel(slack_team_identity)
+        slack_channel_id = slack_channel.slack_id
+
+        make_slack_message(alert_group, slack_id=thread_ts, channel_id=slack_channel_id)

         resolution_note_slack_message = None
         if resolution_note_slack_message_already_exists:

@@ -353,7 +361,7 @@ class TestSlackChannelMessageEventStep:
         payload = {
             "event": {
-                "channel": channel,
+                "channel": slack_channel_id,
                 "ts": ts,
                 "thread_ts": thread_ts,
                 "text": new_text,

@@ -433,23 +441,25 @@ class TestSlackChannelMessageEventStep:
         make_alert_receive_channel,
         make_alert_group,
         make_resolution_note_slack_message,
+        make_slack_channel,
     ) -> None:
+        channel_id = "potato"
+        ts = 88945.4849
+        thread_ts = 16789.123
+
         (
             organization,
             user,
             slack_team_identity,
             slack_user_identity,
         ) = make_organization_and_user_with_slack_identities()
+        slack_channel = make_slack_channel(slack_team_identity, slack_id=channel_id)
         integration = make_alert_receive_channel(organization)
         alert_group = make_alert_group(integration)

-        channel = "potato"
-        ts = 88945.4849
-        thread_ts = 16789.123
-
         payload = {
             "event": {
-                "channel": channel,
+                "channel": channel_id,
                 "previous_message": {
                     "ts": ts,
                     "thread_ts": thread_ts,

@@ -458,7 +468,7 @@ class TestSlackChannelMessageEventStep:
         }

         make_resolution_note_slack_message(
-            alert_group, user, user, ts=ts, thread_ts=thread_ts, slack_channel_id=channel
+            alert_group, user, user, ts=ts, thread_ts=thread_ts, slack_channel=slack_channel
         )

         step = SlackChannelMessageEventStep(slack_team_identity, organization, user)

@@ -471,7 +481,7 @@ class TestSlackChannelMessageEventStep:
             ResolutionNoteSlackMessage.objects.filter(
                 ts=ts,
                 thread_ts=thread_ts,
-                slack_channel_id=channel,
+                slack_channel=slack_channel,
             ).count()
             == 0
         )
Some files were not shown because too many files have changed in this diff.