upgrade to Python 3.12 (#3456)

# What this PR does

Upgrade to Python 3.12 and fix several invalid test assertions that led
to test failures under the latest version of `pytest`:
```
AttributeError: 'called_once_with' is not a valid assertion. Use a spec for the mock if 'called_once_with' is meant to be an attribute.. Did you mean: 'assert_called_once_with'?
```
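This `AttributeError` comes from `unittest.mock`, which in Python 3.12 rejects misspelled assertion names such as `called_once_with` outright. A minimal sketch of the failure mode this PR fixes (the `do_thing` mock is hypothetical; the version-dependent behavior described in the comments is the key point):

```python
from unittest.mock import Mock

m = Mock()
m.do_thing(1)

# Correct form: raises AssertionError if the call doesn't match.
m.do_thing.assert_called_once_with(1)

# Buggy form: `called_once_with` is not an assert method. On Python <= 3.11,
# attribute access just creates a truthy child Mock, so `assert` passes even
# with the wrong arguments; Python 3.12's mock raises the AttributeError above.
try:
    print(bool(m.do_thing.called_once_with("wrong args")))  # True on <= 3.11
except AttributeError as exc:
    print(exc)  # the error quoted above, on >= 3.12
```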

## Checklist

- [ ] Unit, integration, and e2e (if applicable) tests updated
- [x] Documentation added (or `pr:no public docs` PR label added if not
required)
- [x] `CHANGELOG.md` updated (or `pr:no changelog` PR label added if not
required)
Commit 7c4b40a046 (parent 6ef0a3a683), authored by Joey Orlando 2023-11-30 08:47:41 -05:00, committed by GitHub (GPG key ID: 4AEE18F83AFDEB23).
37 changed files with 125 additions and 111 deletions.


```diff
@@ -53,7 +53,7 @@ steps:
 - refs/tags/v*.*.*
 - name: Lint Backend
-image: python:3.11.4
+image: python:3.12.0
 environment:
 DJANGO_SETTINGS_MODULE: settings.ci-test
 commands:
@@ -63,7 +63,7 @@ steps:
 - pre-commit run flake8 --all-files
 - name: Unit Test Backend
-image: python:3.11.4
+image: python:3.12.0
 environment:
 RABBITMQ_URI: amqp://rabbitmq:rabbitmq@rabbit_test:5672
 DJANGO_SETTINGS_MODULE: settings.ci-test
@@ -379,4 +379,4 @@ kind: secret
 name: github_api_token
 ---
 kind: signature
-hmac: b9e499a424faecd9a8f41552cc307bd3431cb0e3fac77f3ee99ce19258fc0fec
+hmac: 36a7d2e2906bad4f186adfa488057ffb9107ef1cdbb3e4d7cb61165b00870c6b
```


```diff
@@ -26,7 +26,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
@@ -117,7 +117,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
@@ -175,7 +175,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
@@ -225,7 +225,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
@@ -263,7 +263,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
@@ -282,7 +282,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: tools/pagerduty-migrator/requirements.txt
 - name: Unit Test PD Migrator
@@ -298,7 +298,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: |
 engine/requirements.txt
```


```diff
@@ -17,7 +17,7 @@ jobs:
 - uses: actions/checkout@v3
 - uses: actions/setup-python@v4
 with:
-python-version: "3.11.4"
+python-version: "3.12.0"
 cache: "pip"
 cache-dependency-path: engine/requirements.txt
 - uses: actions/setup-node@v3
```


```diff
@@ -16,7 +16,7 @@ repos:
 args: [--settings-file=dev/scripts/.isort.cfg, --filter-files]
 - repo: https://github.com/psf/black
-rev: 23.7.0
+rev: 23.11.0
 hooks:
 - id: black
 files: ^engine
@@ -29,7 +29,7 @@ repos:
 files: ^dev/scripts
 - repo: https://github.com/pycqa/flake8
-rev: 6.0.0
+rev: 6.1.0
 hooks:
 - id: flake8
 files: ^engine
```


```diff
@@ -21,7 +21,11 @@ Minor bugfixes + dependency updates :)
 ### Added
-- Add options to customize table columns in AlertGroup page ([3281](https://github.com/grafana/oncall/pull/3281))
+- Add options to customize table columns in AlertGroup page ([#3281](https://github.com/grafana/oncall/pull/3281))
+### Changed
+- Upgrade to Python 3.12 by @joeyorlando ([#3456](https://github.com/grafana/oncall/pull/3456))
 ### Fixed
@@ -39,9 +43,9 @@ Minor bugfixes + dependency updates :)
 ### Added
 - Add ability to use Grafana Service Account Tokens for OnCall API (This is only enabled for resolution_notes
-endpoint currently) @mderynck ([#3189](https://github.com/grafana/oncall/pull/3189))
+  endpoint currently) @mderynck ([#3189](https://github.com/grafana/oncall/pull/3189))
 - Add ability for webhook presets to mask sensitive headers @mderynck
-([#3189](https://github.com/grafana/oncall/pull/3189))
+  ([#3189](https://github.com/grafana/oncall/pull/3189))
 ### Changed
@@ -50,7 +54,7 @@ endpoint currently) @mderynck ([#3189](https://github.com/grafana/oncall/pull/31
 ### Fixed
 - Fixed issue that blocked saving webhooks with presets if the preset is controlling the URL @mderynck
-([#3189](https://github.com/grafana/oncall/pull/3189))
+  ([#3189](https://github.com/grafana/oncall/pull/3189))
 - User filter doesn't display current value on Alert Groups page ([1714](https://github.com/grafana/oncall/issues/1714))
 - Remove displaying rotation modal for Terraform/API based schedules
 - Filters polishing ([3183](https://github.com/grafana/oncall/issues/3183))
```


```diff
@@ -155,12 +155,8 @@ cleanup: stop ## this will remove all of the images, containers, volumes, and n
 docker system prune --filter label="$(DOCKER_COMPOSE_DEV_LABEL)" --all --volumes
 install-pre-commit:
-@if [ ! -x "$$(command -v pre-commit)" ]; then \
-echo "installing pre-commit"; \
-pip install $$(grep "pre-commit" $(ENGINE_DIR)/requirements-dev.txt); \
-else \
-echo "pre-commit already installed"; \
-fi
+echo "installing pre-commit"
+pip install $$(grep "pre-commit" $(ENGINE_DIR)/requirements-dev.txt)
 lint: install-pre-commit ## run both frontend and backend linters
 ## may need to run `yarn install` from within `grafana-plugin`
```


````diff
@@ -58,7 +58,7 @@ Related: [How to develop integrations](/engine/config_integrations/README.md)
 ```yaml
 env:
 - name: FEATURE_LABELS_ENABLED_FOR_ALL
-value: 'True'
+value: "True"
 ```
 3. Wait until all resources are green and open <http://localhost:3000/a/grafana-oncall-app> (user: oncall, password: oncall)
@@ -215,8 +215,8 @@ See the `django-silk` documentation [here](https://github.com/jazzband/django-si
 By default everything runs inside Docker. If you would like to run the backend services outside of Docker
 (for integrating w/ PyCharm for example), follow these instructions:
-1. Create a Python 3.11 virtual environment using a method of your choosing (ex.
-[venv](https://docs.python.org/3.11/library/venv.html) or [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)).
+1. Create a Python 3.12 virtual environment using a method of your choosing (ex.
+[venv](https://docs.python.org/3.12/library/venv.html) or [pyenv-virtualenv](https://github.com/pyenv/pyenv-virtualenv)).
 Make sure the virtualenv is "activated".
 2. `postgres` is a dependency on some of our Python dependencies (notably `psycopg2`
 ([docs](https://www.psycopg.org/docs/install.html#prerequisites))). Please visit
````


```diff
@@ -10,7 +10,7 @@ capable of generating the following objects:
 ## Prerequisites
-1. Create/active a Python 3.11 virtual environment
+1. Create/active a Python 3.12 virtual environment
 2. `pip install -r requirements.txt`
 3. Must have a local version of Grafana and OnCall up and running
 4. Generate an API key inside of Grafana OnCall
```


```diff
@@ -1,4 +1,4 @@
-FROM python:3.11.4-alpine3.18 AS base
+FROM python:3.12.0-alpine3.18 AS base
 # Create a group and user to run an app
 ENV APP_USER=appuser
```


```diff
@@ -203,7 +203,7 @@ class GrafanaAlertingSyncManager:
 if config is None:
 logger.warning(
 f"GrafanaAlertingSyncManager: Got config None in get_alerting_config_for_datasource "
-f"for is_grafana_datasource {datasource_uid==cls.GRAFANA_ALERTING_DATASOURCE}, "
+f"for is_grafana_datasource {datasource_uid == cls.GRAFANA_ALERTING_DATASOURCE}, "
 f"response: {response_info}"
 )
 return
@@ -232,7 +232,7 @@ class GrafanaAlertingSyncManager:
 if response is None:
 logger.warning(
 f"GrafanaAlertingSyncManager: Failed to update contact point (POST) for is_grafana_datasource "
-f"{datasource_uid==cls.GRAFANA_ALERTING_DATASOURCE}; response: {response_info}"
+f"{datasource_uid == cls.GRAFANA_ALERTING_DATASOURCE}; response: {response_info}"
 )
 if response_info.get("status_code") == status.HTTP_400_BAD_REQUEST:
 logger.warning(f"GrafanaAlertingSyncManager: Config: {config}, Updated config: {updated_config}")
```
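The whitespace-only changes inside these f-strings are consistent with Python 3.12's PEP 701, under which f-string replacement fields are tokenized as ordinary expressions, so linters and formatters can now normalize the spacing inside the braces. A small sketch (the variable names are illustrative, not the real ones):

```python
datasource_uid = "abc"
GRAFANA_ALERTING_DATASOURCE = "grafana"

# Both spellings evaluate identically; the diff above only normalizes
# the spacing around `==` inside the replacement field.
old = f"is_grafana_datasource {datasource_uid==GRAFANA_ALERTING_DATASOURCE}"
new = f"is_grafana_datasource {datasource_uid == GRAFANA_ALERTING_DATASOURCE}"
assert old == new  # both render "is_grafana_datasource False"
```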


```diff
@@ -1845,11 +1845,11 @@ class AlertGroup(AlertGroupSlackRenderingMixin, EscalationSnapshotMixin, models.
 result_log_report = list()
 for log_record in log_records_list:
-if type(log_record) == AlertGroupLogRecord:
+if type(log_record) is AlertGroupLogRecord:
 result_log_report.append(log_record.render_log_line_json())
-elif type(log_record) == UserNotificationPolicyLogRecord:
+elif type(log_record) is UserNotificationPolicyLogRecord:
 result_log_report.append(log_record.rendered_notification_log_line_json)
-elif type(log_record) == ResolutionNote:
+elif type(log_record) is ResolutionNote:
 result_log_report.append(log_record.render_log_line_json())
 return result_log_report
```
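Many hunks in this PR swap `type(x) == T` for `type(x) is T`. This matches flake8's E721 rule (part of the 6.1.0 bump above): `==` on types goes through overridable equality, while `is` compares the type objects themselves. A minimal illustration with hypothetical classes:

```python
class Base:
    pass

class Child(Base):
    pass

c = Child()

# For plain classes the two spellings agree, but `is` is an identity
# check that cannot be fooled by a custom __eq__ on a metaclass:
assert type(c) == Child
assert type(c) is Child

# Exact-type checks deliberately ignore inheritance:
assert type(c) is not Base
# Use isinstance() when subclasses should also match:
assert isinstance(c, Base)
```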


```diff
@@ -1,4 +1,4 @@
-from unittest.mock import patch
+from unittest.mock import call, patch
 import pytest
 from django.utils import timezone
@@ -74,10 +74,16 @@ def test_direct_paging_user(make_organization, make_user_for_organization):
 assert alert.message == msg
 # notifications sent
-for u, important in ((user, False), (other_user, True)):
-assert notify_task.apply_async.called_with(
-(u.pk, ag.pk), {"important": important, "notify_even_acknowledged": True, "notify_anyway": True}
-)
+notifications_sent = ((user, False), (other_user, True))
+notify_task.apply_async.assert_has_calls(
+[
+call((u.pk, ag.pk), {"important": important, "notify_even_acknowledged": True, "notify_anyway": True})
+for u, important in notifications_sent
+]
+)
+for u, important in notifications_sent:
+expected_info = {"user": u.public_primary_key, "important": important}
+assert_log_record(ag, f"{from_user.username} paged user {u.username}", expected_info=expected_info)
@@ -173,7 +179,7 @@ def test_direct_paging_reusing_alert_group(
 # notifications sent
 ag = alert_groups.get()
-assert notify_task.apply_async.called_with(
+notify_task.apply_async.assert_called_with(
 (user.pk, ag.pk), {"important": False, "notify_even_acknowledged": True, "notify_anyway": True}
 )
@@ -243,11 +249,17 @@ def test_direct_paging_always_create_group(make_organization, make_user_for_orga
 assert alert_groups.count() == 2
 # notifications sent
-assert notify_task.apply_async.called_with(
-(user.pk, alert_groups[0].pk), {"important": False, "notify_even_acknowledged": True, "notify_anyway": True}
-)
-assert notify_task.apply_async.called_with(
-(user.pk, alert_groups[1].pk), {"important": False, "notify_even_acknowledged": True, "notify_anyway": True}
+notify_task.apply_async.assert_has_calls(
+[
+call(
+(user.pk, alert_groups[0].pk),
+{"important": False, "notify_even_acknowledged": True, "notify_anyway": True},
+),
+call(
+(user.pk, alert_groups[1].pk),
+{"important": False, "notify_even_acknowledged": True, "notify_anyway": True},
+),
+]
 )
```
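`called_with` and `called_once_with` are not real `Mock` methods; accessing them just creates a truthy child mock, so the old `assert notify_task.apply_async.called_with(...)` lines could never fail. The rewrite switches to the real assertion helpers; a sketch of the two used here (`call` records an expected invocation, the `task` mock is hypothetical):

```python
from unittest.mock import Mock, call

task = Mock()
task.apply_async((1, 10), {"important": False})
task.apply_async((2, 10), {"important": True})

# assert_has_calls verifies an ordered subsequence of recorded calls:
task.apply_async.assert_has_calls(
    [
        call((1, 10), {"important": False}),
        call((2, 10), {"important": True}),
    ]
)
# assert_called_with only checks the *most recent* call:
task.apply_async.assert_called_with((2, 10), {"important": True})
```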


```diff
@@ -85,10 +85,10 @@ def test_list_alert_receive_channel_skip_pagination_for_grafana_alerting(
 assert response.status_code == status.HTTP_200_OK
 if should_be_unpaginated:
-assert type(results) == list
+assert type(results) is list
 assert len(results) > 0
 else:
-assert type(results["results"]) == list
+assert type(results["results"]) is list
 assert len(results["results"]) > 0
```


```diff
@@ -271,7 +271,7 @@ class AlertReceiveChannelView(
 if payload is None:
 return channel.alert_groups.last().alerts.first()
 else:
-if type(payload) != dict:
+if type(payload) is not dict:
 raise PreviewTemplateException("Payload must be a valid json object")
 # Build Alert and AlertGroup objects to pass to templater without saving them to db
 alert_group_to_template = AlertGroup(channel=channel)
@@ -336,7 +336,7 @@ class AlertReceiveChannelView(
 try:
 instance.start_maintenance(mode, duration, request.user)
 except MaintenanceCouldNotBeStartedError as e:
-if type(instance) == AlertReceiveChannel:
+if type(instance) is AlertReceiveChannel:
 detail = {"alert_receive_channel_id": ["Already on maintenance"]}
 else:
 detail = str(e)
```


```diff
@@ -40,7 +40,7 @@ def test_multi_type_support(value):
 LiveSetting.objects.create(name="SOME_NEW_FEATURE_ENABLED", value=value)
 setting_value = LiveSetting.get_setting("SOME_NEW_FEATURE_ENABLED")
-assert type(setting_value) == type(value)
+assert type(setting_value) is type(value)
 assert setting_value == value
```


```diff
@@ -31,7 +31,7 @@ class TestIsRbacEnabledForStack:
 api_client = GcomAPIClient("someFakeApiToken")
 assert api_client.is_rbac_enabled_for_stack(stack_id) == expected
-assert mocked_gcom_api_client_api_get.called_once_with(f"instances/{stack_id}?config=true")
+mocked_gcom_api_client_api_get.assert_called_once_with(f"instances/{stack_id}?config=true")
 @pytest.mark.parametrize(
 "instance_info_feature_toggles,delimiter,expected",
```


```diff
@@ -20,7 +20,7 @@ def test_it_triggers_an_organization_sync_and_saves_the_grafana_token(
 response = client.post(reverse("grafana-plugin:install"), format="json", **auth_headers)
 assert response.status_code == status.HTTP_204_NO_CONTENT
-assert mocked_sync_organization.called_once_with(organization)
+mocked_sync_organization.assert_called_once_with(organization)
 # make sure api token is saved on the org
 organization.refresh_from_db()
```


```diff
@@ -74,8 +74,8 @@ def test_it_properly_handles_errors_from_the_grafana_api(
 url = reverse("grafana-plugin:self-hosted-install")
 response = client.post(url, format="json", **make_self_hosted_install_header(GRAFANA_TOKEN))
-assert mocked_grafana_api_client.called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
-assert mocked_grafana_api_client.return_value.check_token.called_once_with()
+mocked_grafana_api_client.assert_called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
+mocked_grafana_api_client.return_value.check_token.assert_called_once_with()
 assert response.status_code == status.HTTP_400_BAD_REQUEST
 assert response.data["error"] == expected_error_msg
@@ -106,13 +106,13 @@ def test_if_organization_exists_it_is_updated(
 url = reverse("grafana-plugin:self-hosted-install")
 response = client.post(url, format="json", **make_self_hosted_install_header(GRAFANA_TOKEN))
-assert mocked_grafana_api_client.called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
-assert mocked_grafana_api_client.return_value.check_token.called_once_with()
-assert mocked_grafana_api_client.return_value.is_rbac_enabled_for_organization.called_once_with()
+mocked_grafana_api_client.assert_called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
+mocked_grafana_api_client.return_value.check_token.assert_called_once_with()
+mocked_grafana_api_client.return_value.is_rbac_enabled_for_organization.assert_called_once_with()
-assert mocked_sync_organization.called_once_with(organization)
-assert mocked_provision_plugin.called_once_with()
-assert mocked_revoke_plugin.called_once_with()
+mocked_sync_organization.assert_called_once_with(organization)
+mocked_provision_plugin.assert_called_once_with()
+mocked_revoke_plugin.assert_called_once_with()
 assert response.status_code == status.HTTP_201_CREATED
 assert response.data == {"error": None, **provision_plugin_response}
@@ -151,12 +151,12 @@ def test_if_organization_does_not_exist_it_is_created(
 organization = Organization.objects.filter(stack_id=STACK_ID, org_id=ORG_ID).first()
-assert mocked_grafana_api_client.called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
-assert mocked_grafana_api_client.return_value.check_token.called_once_with()
-assert mocked_grafana_api_client.return_value.is_rbac_enabled_for_organization.called_once_with()
+mocked_grafana_api_client.assert_called_once_with(api_url=GRAFANA_API_URL, api_token=GRAFANA_TOKEN)
+mocked_grafana_api_client.return_value.check_token.assert_called_once_with()
+mocked_grafana_api_client.return_value.is_rbac_enabled_for_organization.assert_called_once_with()
-assert mocked_sync_organization.called_once_with(organization)
-assert mocked_provision_plugin.called_once_with()
+mocked_sync_organization.assert_called_once_with(organization)
+mocked_provision_plugin.assert_called_once_with()
 assert not mocked_revoke_plugin.called
 assert response.status_code == status.HTTP_201_CREATED
```


```diff
@@ -4,7 +4,7 @@ import pytest
 from django.core.files.uploadedfile import SimpleUploadedFile
 from django.db import OperationalError
 from django.urls import reverse
-from pytest_django.plugin import _DatabaseBlocker
+from pytest_django import DjangoDbBlocker
 from rest_framework import status
 from rest_framework.test import APIClient
@@ -12,9 +12,12 @@ from apps.alerts.models import AlertReceiveChannel
 from apps.integrations.mixins import AlertChannelDefiningMixin
-class DatabaseBlocker(_DatabaseBlocker):
+class DatabaseBlocker(DjangoDbBlocker):
 """Customize pytest_django db blocker to raise OperationalError exception."""
+def __init__(self, *args) -> None:
+super().__init__(_ispytest=True)
 def _blocking_wrapper(*args, **kwargs):
 __tracebackhide__ = True
 __tracebackhide__  # Silence pyflakes
```


```diff
@@ -47,7 +47,7 @@ def setup_heartbeat_integration(name=None):
 }
 )
 else:
-setup_heartbeat_integration(f"{name} { random.randint(1, 1024)}")
+setup_heartbeat_integration(f"{name} {random.randint(1, 1024)}")
 except requests.Timeout:
 logger.warning("Unable to create cloud heartbeat integration. Request timeout.")
 except requests.exceptions.RequestException as e:
```


```diff
@@ -141,7 +141,7 @@ def test_notify_by_provider_call_limits_warning(
 phone_backend = PhoneBackend()
 phone_backend._notify_by_provider_call(user, "some_message")
-assert mock_add_call_limit_warning.called_once_with(2, "some_message")
+mock_add_call_limit_warning.assert_called_once_with(2, "some_message")
 @pytest.mark.django_db
```


```diff
@@ -150,7 +150,7 @@ def test_notify_by_provider_sms_limits_warning(
 phone_backend = PhoneBackend()
 phone_backend._notify_by_provider_sms(user, "some_message")
-assert mock_add_sms_limit_warning.called_once_with(2, "some_message")
+mock_add_sms_limit_warning.assert_called_once_with(2, "some_message")
 @pytest.mark.django_db
```


```diff
@@ -78,7 +78,7 @@ def test_export_calendar(make_organization_and_user_with_token, make_user_for_or
 cal = Calendar.from_ical(response.data)
-assert type(cal) == Calendar
+assert type(cal) is Calendar
 # check there are events
 assert len(cal.subcomponents) > 0
 for component in cal.walk():
@@ -112,7 +112,7 @@ def test_export_user_calendar(make_organization_and_user_with_token, make_schedu
 cal = Calendar.from_ical(response.data)
-assert type(cal) == Calendar
+assert type(cal) is Calendar
 assert cal.get("x-wr-calname") == "On-Call Schedule for {0}".format(user.username)
 assert cal.get("x-wr-timezone") == "UTC"
 assert cal.get("calscale") == "GREGORIAN"
```


```diff
@@ -184,7 +184,7 @@ def list_of_oncall_shifts_from_ical(
 pytz_tz = pytz.timezone("UTC")
 return (
 datetime.datetime.combine(e["start"], datetime.datetime.min.time(), tzinfo=pytz_tz)
-if type(e["start"]) == datetime.date
+if type(e["start"]) is datetime.date
 else e["start"]
 )
@@ -231,7 +231,7 @@ def get_shifts_dict(
 )
 # Define on-call shift out of ical event that has the actual user
 if len(users) > 0 or with_empty_shifts:
-if type(event[ICAL_DATETIME_START].dt) == datetime.date:
+if type(event[ICAL_DATETIME_START].dt) is datetime.date:
 start = event[ICAL_DATETIME_START].dt
 end = event[ICAL_DATETIME_END].dt
 result_date.append(
@@ -623,7 +623,7 @@ def is_icals_equal(first, second):
 def ical_date_to_datetime(date, tz, start):
 datetime_to_combine = datetime.time.min
 all_day = False
-if type(date) == datetime.date:
+if type(date) is datetime.date:
 all_day = True
 calendar_timezone_offset = datetime.datetime.now().astimezone(tz).utcoffset()
 date = datetime.datetime.combine(date, datetime_to_combine).astimezone(tz) - calendar_timezone_offset
@@ -776,7 +776,7 @@ def start_end_with_respect_to_all_day(event: IcalEvent, calendar_tz):
 def event_start_end_all_day_with_respect_to_type(event: IcalEvent, calendar_tz):
 all_day = False
-if type(event[ICAL_DATETIME_START].dt) == datetime.date:
+if type(event[ICAL_DATETIME_START].dt) is datetime.date:
 start, end = start_end_with_respect_to_all_day(event, calendar_tz)
 all_day = True
 else:
```
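Note that `type(...) is datetime.date` does real work in this ical code: `datetime.datetime` subclasses `datetime.date`, so an `isinstance` check could not tell an all-day (date-only) event apart from a timed one:

```python
import datetime

d = datetime.date(2023, 11, 30)
dt = datetime.datetime(2023, 11, 30, 8, 47)

# datetime is a subclass of date, so isinstance() matches both:
assert isinstance(dt, datetime.date)
# An exact-type check distinguishes them, which is why the ical
# helpers use `type(...) is datetime.date` for all-day detection:
assert type(d) is datetime.date
assert type(dt) is not datetime.date
```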


```diff
@@ -375,7 +375,7 @@ class OnCallSchedule(PolymorphicModel):
 events: ScheduleEvents = []
 for shift in shifts:
 start = shift["start"]
-all_day = type(start) == datetime.date
+all_day = type(start) is datetime.date
 # fix confusing end date for all-day event
 end = shift["end"] - datetime.timedelta(days=1) if all_day else shift["end"]
 if all_day and all_day_datetime:
@@ -500,7 +500,7 @@ class OnCallSchedule(PolymorphicModel):
 # check if event was ended or cancelled, update ical
 dtend = component.get(ICAL_DATETIME_END)
 dtend_datetime = dtend.dt if dtend else None
-if dtend_datetime and type(dtend_datetime) == datetime.date:
+if dtend_datetime and type(dtend_datetime) is datetime.date:
 # shift or overrides coming from ical calendars can be all day events, change to datetime
 dtend_datetime = datetime.datetime.combine(
 dtend.dt, datetime.datetime.min.time(), tzinfo=pytz.UTC
@@ -1120,6 +1120,7 @@ class OnCallScheduleICal(OnCallSchedule):
 class OnCallScheduleCalendar(OnCallSchedule):
 custom_on_call_shifts: "RelatedManager['CustomOnCallShift']"
 escalation_policies: "RelatedManager['EscalationPolicy']"
+objects: models.Manager["OnCallScheduleCalendar"]
 schedule_export_token: "RelatedManager['ScheduleExportAuthToken']"
```


```diff
@@ -23,7 +23,7 @@ def test_soft_delete(shift_swap_request_setup):
 ssr.refresh_from_db()
 assert ssr.deleted_at is not None
-assert mock_refresh_final.apply_async.called_with((ssr.schedule.pk,))
+mock_refresh_final.apply_async.assert_called_with((ssr.schedule.pk,))
 assert ShiftSwapRequest.objects.all().count() == 0
 assert ShiftSwapRequest.objects_with_deleted.all().count() == 1
@@ -100,7 +100,7 @@ def test_take(
 mock_notify_beneficiary_about_taken_shift_swap_request.apply_async.assert_called_once_with((ssr.pk,))
 # final schedule refresh was triggered
-assert mock_refresh_final.apply_async.called_with((ssr.schedule.pk,))
+mock_refresh_final.apply_async.assert_called_with((ssr.schedule.pk,))
 @pytest.mark.django_db
```


```diff
@@ -142,7 +142,7 @@ class SlackChannelMessageEventStep(scenario_step.ScenarioStep):
 STEPS_ROUTING: ScenarioRoute.RoutingSteps = [
 typing.cast(
-ScenarioRoute.EventCallbackChannelMessageScenarioRoute,
+ScenarioRoute.EventCallbackScenarioRoute,
 {
 "payload_type": PayloadType.EVENT_CALLBACK,
 "event_type": EventType.MESSAGE,
```


```diff
@@ -22,9 +22,9 @@ class AlertGroupLogSlackRenderer:
 # get rendered logs
 result = ""
 for log_record in all_log_records:  # list of AlertGroupLogRecord and UserNotificationPolicyLogRecord logs
-if type(log_record) == AlertGroupLogRecord:
+if type(log_record) is AlertGroupLogRecord:
 result += f"{log_record.rendered_incident_log_line(for_slack=True)}\n"
-elif type(log_record) == UserNotificationPolicyLogRecord:
+elif type(log_record) is UserNotificationPolicyLogRecord:
 result += f"{log_record.rendered_notification_log_line(for_slack=True)}\n"
 attachments.append(
```


```diff
@@ -242,7 +242,9 @@ def test_trigger_paging_additional_responders(make_organization_and_user_with_sl
 with patch.object(step._slack_client, "api_call"):
 step.process_scenario(slack_user_identity, slack_team_identity, payload)
-mock_direct_paging.called_once_with(organization, user, "The Message", team, [(user, True)])
+mock_direct_paging.assert_called_once_with(
+organization=organization, from_user=user, message="The Message", team=team, users=[(user, True)]
+)
 @pytest.mark.django_db
@@ -256,7 +258,13 @@ def test_page_team(make_organization_and_user_with_slack_identities, make_team):
 with patch.object(step._slack_client, "api_call"):
 step.process_scenario(slack_user_identity, slack_team_identity, payload)
-mock_direct_paging.called_once_with(organization, user, "The Message", team)
+mock_direct_paging.assert_called_once_with(
+organization=organization,
+from_user=user,
+message="The Message",
+team=team,
+users=[],
+)
 @pytest.mark.django_db
```


```diff
@@ -19,17 +19,7 @@ class ScenarioRoute:
 class EventCallbackScenarioRoute(_Base):
 payload_type: typing.Literal[PayloadType.EVENT_CALLBACK]
 event_type: EventType
-class EventCallbackChannelMessageScenarioRoute(EventCallbackScenarioRoute):
-"""
-NOTE: the reason why we need to subclass `EventCallbackScenarioRoute` is because in Python 3.11 there is currently
-no way to specify keys as optional in a `typing.TypedDict`. See [PEP-692](https://peps.python.org/pep-0692/) which
-will implement this typing feature in Python 3.12.
-When we upgrade to 3.12 we should update this type.
-"""
-message_channel_type: typing.Literal[EventType.MESSAGE_CHANNEL]
+message_channel_type: typing.NotRequired[typing.Literal[EventType.MESSAGE_CHANNEL]]
 class InteractiveMessageScenarioRoute(_Base):
 payload_type: typing.Literal[PayloadType.INTERACTIVE_MESSAGE]
@@ -51,7 +41,6 @@ class ScenarioRoute:
 RoutingStep = (
 BlockActionsScenarioRoute
 | EventCallbackScenarioRoute
-| EventCallbackChannelMessageScenarioRoute
 | InteractiveMessageScenarioRoute
 | MessageActionScenarioRoute
 | SlashCommandScenarioRoute
```


```diff
@@ -270,7 +270,7 @@ def test_organization_moved_middleware_amazon_sns_headers(
 response = client.post(url, data, format="json", **expected_sns_headers)
 assert mocked_make_request.called
 for k in AMAZON_SNS_HEADERS:
-assert expected_sns_headers.get(f'HTTP_{k.upper().replace("-","_")}') == mocked_make_request.call_args.args[
+assert expected_sns_headers.get(f'HTTP_{k.upper().replace("-", "_")}') == mocked_make_request.call_args.args[
 2
 ].get(k)
 assert response.content == expected_message
```


```diff
@@ -358,8 +358,8 @@ def test_sync_organization_is_rbac_permissions_enabled_cloud(mocked_gcom_client,
 organization.refresh_from_db()
-assert mocked_gcom_client.return_value.called_once_with("mockedToken")
-assert mocked_gcom_client.return_value.is_rbac_enabled_for_stack.called_once_with(stack_id)
+mocked_gcom_client.assert_called_once_with("mockedToken")
+mocked_gcom_client.return_value.is_rbac_enabled_for_stack.assert_called_once_with(stack_id)
 assert organization.is_rbac_permissions_enabled == gcom_api_response
```


```diff
@@ -150,7 +150,7 @@ def isoformat_with_tz_suffix(value):
 def is_string_with_visible_characters(string):
-return type(string) == str and not string.isspace() and not string == ""
+return type(string) is str and not string.isspace() and not string == ""
 def str_or_backup(string, backup):
```


```diff
@@ -2,12 +2,13 @@
 profile = "black"
 line_length=120
 float_to_top=true
+# TODO: update py_version to 312 once isort supports it
 py_version=311
 extend_skip_glob = "**/migrations/**"
 [tool.black]
 line-length = 120
-target-version = ["py311"]
+target-version = ["py312"]
 force-exclude = "migrations"
 [tool.mypy]
```


```diff
@@ -1,12 +1,12 @@
 celery-types==0.18.0
 django-filter-stubs==0.1.3
-django-stubs[compatible-mypy]==4.2.2
-djangorestframework-stubs[compatible-mypy]==3.14.2
-mypy==1.4.1
-pre-commit==2.15.0
-pytest==7.3.1
-pytest-django==4.5.2
-pytest_factoryboy==2.5.1
+django-stubs==4.2.6
+djangorestframework-stubs==3.14.4
+mypy==1.7.1
+pre-commit==3.5.0
+pytest==7.4.3
+pytest-django==4.7.0
+pytest_factoryboy==2.6.0
 types-beautifulsoup4==4.12.0.5
 types-PyMySQL==1.0.19.7
 types-python-dateutil==2.8.19.13
```


```diff
@@ -50,7 +50,7 @@ pymdown-extensions==10.0
 requests==2.31.0
 urllib3==1.26.18
 prometheus_client==0.16.0
-lxml==4.9.2
+lxml==4.9.3
 babel==2.12.1
 drf-spectacular==0.26.5
-grpcio==1.57.0
+grpcio==1.59.0
```


```diff
@@ -1,4 +1,4 @@
-FROM python:3.11.4-alpine
+FROM python:3.12.0-alpine
 ENV PYTHONUNBUFFERED=1
 WORKDIR /app
```