fake-data generation script + fixes for django-silk and django-debug-toolbar (#1128)

# What this PR does

## Main stuff

- adds a Python script to populate a local Grafana/OnCall setup with large
amounts of fake data. Right now the data types that can be generated
are:
- teams and Admin users, via the Grafana API (these must be synced manually
by visiting the plugin UI before moving on to the next step)
- calendar schedules which have three 8h on-call shifts, via the OnCall
public API
- fixes `django-debug-toolbar` when running locally via `docker-compose`

## Other stuff
- documents how to easily modify the Grafana `docker-compose` container
provisioning configuration
- documents solutions for two backend setup issues encountered when
running the engine/celery workers locally, outside of
`docker-compose`, on an Apple silicon Mac
- fixes a small bug in `grafana_plugin.helpers.client.APIClient.call_api`
where it would call `response.json()` for all requests, regardless of
whether the response actually contained data
- in `engine/settings/dev.py`, properly set up `django-silk` and document
the steps to use it locally
- makes it possible to log debug SQL queries by setting the
`DEV_DEBUG_VIEW_SQL_QUERIES` env var, rather than having to uncomment
a section of `settings/dev.py`
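For context on the `call_api` fix: decoding an empty response body as JSON raises an error, which the added guard avoids. A minimal stdlib illustration (using `json` directly rather than a live HTTP response):

```python
import json

# An empty body (e.g. from a HEAD request) is not valid JSON,
# so decoding it raises json.JSONDecodeError:
try:
    json.loads("")
except json.JSONDecodeError:
    print("empty body cannot be decoded")  # → empty body cannot be decoded

# The fix mirrors this guard: only decode when there is content.
body = b""
data = json.loads(body) if body else None
print(data)  # → None
```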

## Which issue(s) this PR fixes

- Some local setup issues when trying to use `django-silk` and
`django-debug-toolbar`
- Makes it much easier to populate your local setup with a lot of fake
data
- Makes it possible to easily modify your local Grafana's provisioning
configuration

## Checklist

- [ ] Tests updated (N/A)
- [ ] Documentation added (N/A)
- [ ] `CHANGELOG.md` updated (N/A)
Commit 98241b9a10 (parent cc3fdab8fb), authored by Joey Orlando on 2023-01-20 09:19:41 +01:00 and committed via GitHub. 13 changed files with 496 additions and 26 deletions.


@ -10,6 +10,10 @@ repos:
files: ^tools/pagerduty-migrator
args:
[--settings-file=tools/pagerduty-migrator/.isort.cfg, --filter-files]
- id: isort
name: isort - dev/scripts
files: ^dev/scripts
args: [--settings-file=dev/scripts/.isort.cfg, --filter-files]
- repo: https://github.com/psf/black
rev: 22.3.0
@ -20,6 +24,9 @@ repos:
- id: black
name: black - pd-migrator
files: ^tools/pagerduty-migrator
- id: black
name: black - dev/scripts
files: ^dev/scripts
- repo: https://github.com/pycqa/flake8
rev: 3.9.2
@ -40,6 +47,17 @@ repos:
"--select=C,E,F,W,B,B950",
"--extend-ignore=E203,E501",
]
- id: flake8
name: flake8 - dev/scripts
files: ^dev/scripts
# Make sure config is compatible with black
# https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
args:
[
--max-line-length=88,
"--select=C,E,F,W,B,B950",
"--extend-ignore=E203,E501",
]
- repo: https://github.com/pre-commit/mirrors-eslint
rev: v8.25.0
@ -90,4 +108,4 @@ repos:
entry: markdownlint --ignore grafana-plugin/node_modules --ignore grafana-plugin/dist --ignore docs **/*.md
- id: markdownlint
name: markdownlint - docs
entry: markdownlint --ignore grafana-plugin/node_modules --ignore grafana-plugin/dist -c ./docs/.markdownlint.json ./docs/**/*.md
entry: markdownlint -c ./docs/.markdownlint.json ./docs/**/*.md

dev/.gitignore vendored

@ -1 +1,2 @@
.env.dev
grafana.dev.ini


@ -3,6 +3,8 @@
- [Running the project](#running-the-project)
- [`COMPOSE_PROFILES`](#compose_profiles)
- [`GRAFANA_VERSION`](#grafana_version)
- [Configuring Grafana](#configuring-grafana)
- [Django Silk Profiling](#django-silk-profiling)
- [Running backend services outside Docker](#running-backend-services-outside-docker)
- [Useful `make` commands](#useful-make-commands)
- [Setting environment variables](#setting-environment-variables)
@ -14,6 +16,8 @@
- [django.db.utils.OperationalError: (1366, "Incorrect string value")](#djangodbutilsoperationalerror-1366-incorrect-string-value)
- [/bin/sh: line 0: cd: grafana-plugin: No such file or directory](#binsh-line-0-cd-grafana-plugin-no-such-file-or-directory)
- [Encountered error while trying to install package - grpcio](#encountered-error-while-trying-to-install-package---grpcio)
- [distutils.errors.CompileError: command '/usr/bin/clang' failed with exit code 1](#distutilserrorscompileerror-command-usrbinclang-failed-with-exit-code-1)
- [symbol not found in flat namespace '\_EVP_DigestSignUpdate'](#symbol-not-found-in-flat-namespace-_evp_digestsignupdate)
- [IDE Specific Instructions](#ide-specific-instructions)
- [PyCharm](#pycharm)
@ -80,6 +84,33 @@ If you would like to change the version of Grafana being run, simply pass in a `
to `make start` (or alternatively set it in your `.env.dev` file). The value of this environment variable should be a
valid `grafana/grafana` published Docker [image tag](https://hub.docker.com/r/grafana/grafana/tags).
### Configuring Grafana
This section is applicable for when you are running a Grafana container inside of `docker-compose` and you would like
to modify your Grafana instance's provisioning configuration.
The following commands assume you run them from the root of the project:
```bash
touch ./dev/grafana.dev.ini
# make desired changes to ./dev/grafana.dev.ini then run
touch .env && ./dev/add_env_var.sh GRAFANA_DEV_PROVISIONING ./dev/grafana.dev.ini .env
```
The next time you start the project via `docker-compose`, `./dev/grafana.dev.ini` will be volume-mounted into the
`grafana` container as `/etc/grafana/grafana.ini`.
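Since `grafana.dev.ini` is a standard ini file, it can also be generated programmatically. A hypothetical sketch using Python's `configparser` (the `auth.anonymous` section is just an example of a Grafana setting you might override, not something this repo requires):

```python
import configparser

# Hypothetical example: write a minimal dev override for Grafana.
# Section/option names follow Grafana's documented grafana.ini layout.
config = configparser.ConfigParser()
config["auth.anonymous"] = {"enabled": "true"}

with open("grafana.dev.ini", "w") as f:
    config.write(f)
```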
### Django Silk Profiling
In order to set up [`django-silk`](https://github.com/jazzband/django-silk) for local profiling, perform the following
steps:
1. `make engine-manage CMD="createsuperuser"` - follow the CLI prompts to create a Django superuser
2. Visit <http://localhost:8080/django-admin> and log in using the credentials you created in step #1
You should now be able to visit <http://localhost:8080/silk/> and see the Django Silk UI.
See the `django-silk` documentation [here](https://github.com/jazzband/django-silk) for more information.
### Running backend services outside Docker
By default everything runs inside Docker. If you would like to run the backend services outside of Docker
@ -306,6 +337,49 @@ Use a `conda` virtualenv, and then run the following when installing the engine
GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1 GRPC_PYTHON_BUILD_SYSTEM_ZLIB=1 pip install -r requirements.txt
```
### distutils.errors.CompileError: command '/usr/bin/clang' failed with exit code 1
See solution for "Encountered error while trying to install package - grpcio" [here](#encountered-error-while-trying-to-install-package---grpcio)
### symbol not found in flat namespace '\_EVP_DigestSignUpdate'
**Problem:**
This problem seems to occur when running the Celery process outside of `docker-compose`
(via `make run-backend-celery`) while using a `conda` virtual environment.
<!-- markdownlint-disable MD013 -->
```bash
conda create --name oncall-dev python=3.9.13
conda activate oncall-dev
make backend-bootstrap
make run-backend-celery
File "~/oncall/engine/engine/__init__.py", line 5, in <module>
from .celery import app as celery_app
File "~/oncall/engine/engine/celery.py", line 11, in <module>
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
File "/opt/homebrew/Caskroom/miniconda/base/envs/oncall-dev/lib/python3.9/site-packages/opentelemetry/exporter/otlp/proto/grpc/trace_exporter/__init__.py", line 20, in <module>
from grpc import ChannelCredentials, Compression
File "/opt/homebrew/Caskroom/miniconda/base/envs/oncall-dev/lib/python3.9/site-packages/grpc/__init__.py", line 22, in <module>
from grpc import _compression
File "/opt/homebrew/Caskroom/miniconda/base/envs/oncall-dev/lib/python3.9/site-packages/grpc/_compression.py", line 20, in <module>
from grpc._cython import cygrpc
ImportError: dlopen(/opt/homebrew/Caskroom/miniconda/base/envs/oncall-dev/lib/python3.9/site-packages/grpc/_cython/cygrpc.cpython-39-darwin.so, 0x0002): symbol not found in flat namespace '_EVP_DigestSignUpdate'
```
<!-- markdownlint-enable MD013 -->
**Solution:**
[This solution](https://github.com/grpc/grpc/issues/15510#issuecomment-392012594), posted in a GitHub issue thread for
the `grpc/grpc` repository, fixes the issue:
```bash
conda install grpcio
make run-backend-celery
```
## IDE Specific Instructions
### PyCharm

dev/scripts/.isort.cfg

@ -0,0 +1,2 @@
[settings]
profile=black


@ -0,0 +1,45 @@
# Fake Data Generator Script
This script can be used to easily populate your local Grafana/OnCall setup with fake data. Currently the script is
capable of generating the following objects:
- teams
- users
- schedules
- schedule on call shifts
## Prerequisites
1. Create/activate a Python 3.9 virtual environment
2. `pip install -r requirements.txt`
3. Must have a local version of Grafana and OnCall up and running
4. Generate an API key inside of Grafana OnCall
## How to run
**Note**: The below flag values assume you are running a `grafana` container locally via the `docker-compose` setup.
The reason there are a few separate steps involved is that we first need to create teams and users in the Grafana
instance. Later on, in order to create OnCall schedules/on-call shifts, we need the OnCall user IDs. There is
currently no way to trigger a Grafana -> OnCall data sync via the public API, hence the manual step in the middle
to have data synced between Grafana and OnCall.
1. Create teams and users in Grafana. The `teams` and `users` flags represent the number of teams and users you would
like to create respectively:
```bash
# by default this will generate 10 teams and 1000 users
python main.py generate_teams_and_users
```
See `python main.py generate_teams_and_users -h` for more information on how to run the command.
2. Head to your OnCall setup, and trigger a Grafana -> OnCall data sync by visiting the plugin page.
3. Create schedules and on-call shifts in OnCall. The `schedules` flag represents the number of OnCall schedules you
would like to generate. **Note** that three 8h on-call shifts are created and shared across all generated schedules:
```bash
# by default this will generate 100 schedules
python main.py generate_schedules_and_oncall_shifts --oncall-api-token=<oncall-api-key>
```
See `python main.py generate_schedules_and_oncall_shifts -h` for more information on how to run the command.
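The ordering constraint above can be sketched in plain Python: on-call shift payloads are built from OnCall user IDs, which only exist after the Grafana -> OnCall sync. The response shape below is a simplified stand-in for `GET /api/v1/users`:

```python
# Simplified stand-in for the OnCall public API's list-users response.
synced_users = {"results": [{"id": "U1"}, {"id": "U2"}]}

# Shift payloads reference these IDs, so they cannot be built
# until the sync has populated OnCall with users.
user_ids = [u["id"] for u in synced_users["results"]]
shift_payload = {
    "type": "rolling_users",
    "rolling_users": [[uid] for uid in user_ids],
}
print(shift_payload["rolling_users"])  # → [['U1'], ['U2']]
```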


@ -0,0 +1,305 @@
import argparse
import asyncio
import math
import random
import typing
import uuid
from datetime import datetime
import aiohttp
from faker import Faker
from tqdm.asyncio import tqdm
fake = Faker()
TEAMS_USERS_COMMAND = "generate_teams_and_users"
SCHEDULES_ONCALL_SHIFTS_COMMAND = "generate_schedules_and_oncall_shifts"
GRAFANA_API_URL = None
ONCALL_API_URL = None
ONCALL_API_TOKEN = None
class OnCallApiUser(typing.TypedDict):
id: str
class OnCallApiOnCallShift(typing.TypedDict):
id: str
class OnCallApiListUsersResponse(typing.TypedDict):
results: typing.List[OnCallApiUser]
class GrafanaAPIUser(typing.TypedDict):
id: int
def _generate_unique_email() -> str:
user = fake.profile()
return f'{uuid.uuid4()}-{user["mail"]}'
async def _grafana_api_request(
http_session: aiohttp.ClientSession, method: str, url: str, **request_kwargs
) -> typing.Awaitable[typing.Dict]:
resp = await http_session.request(
method, f"{GRAFANA_API_URL}{url}", **request_kwargs
)
return await resp.json()
async def _oncall_api_request(
http_session: aiohttp.ClientSession, method: str, url: str, **request_kwargs
) -> typing.Awaitable[typing.Dict]:
resp = await http_session.request(
method,
f"{ONCALL_API_URL}{url}",
headers={"Authorization": ONCALL_API_TOKEN},
**request_kwargs,
)
return await resp.json()
def generate_team(
http_session: aiohttp.ClientSession, org_id: int
) -> typing.Callable[[], typing.Awaitable[typing.Dict]]:
"""
https://grafana.com/docs/grafana/latest/developers/http_api/team/#add-team
"""
def _generate_team() -> typing.Awaitable[typing.Dict]:
return _grafana_api_request(
http_session,
"POST",
"/api/teams",
json={
"name": str(uuid.uuid4()),
"email": _generate_unique_email(),
"orgId": org_id,
},
)
return _generate_team
def generate_user(
http_session: aiohttp.ClientSession, org_id: int
) -> typing.Callable[[], typing.Awaitable[typing.Dict]]:
"""
https://grafana.com/docs/grafana/latest/developers/http_api/admin/#global-users
"""
async def _generate_user() -> typing.Awaitable[typing.Dict]:
user = fake.profile()
# create the user in grafana
grafana_user: GrafanaAPIUser = await _grafana_api_request(
http_session,
"POST",
"/api/admin/users",
json={
"name": user["name"],
"email": _generate_unique_email(),
"login": str(uuid.uuid4()),
"password": fake.password(length=20),
"OrgId": org_id,
},
)
# update the user's basic role in grafana to Admin
# https://grafana.com/docs/grafana/latest/developers/http_api/org/#updates-the-given-user
await _grafana_api_request(
http_session,
"PATCH",
f'/api/org/users/{grafana_user["id"]}',
json={"role": "Admin"},
)
return grafana_user
return _generate_user
def generate_schedule(
http_session: aiohttp.ClientSession, oncall_shift_ids: typing.List[str]
) -> typing.Callable[[], typing.Awaitable[typing.Dict]]:
def _generate_schedule() -> typing.Awaitable[typing.Dict]:
# Create a schedule
# https://grafana.com/docs/oncall/latest/oncall-api-reference/schedules/#create-a-schedule
return _oncall_api_request(
http_session,
"POST",
"/api/v1/schedules",
json={
"name": f"Schedule {uuid.uuid4()}",
"type": "calendar",
"time_zone": "UTC",
"shifts": oncall_shift_ids,
},
)
return _generate_schedule
def _bulk_generate_data(
iterations: int,
data_generator_func: typing.Callable[[], typing.Awaitable[typing.Dict]],
) -> typing.Awaitable[typing.List[typing.Dict]]:
return tqdm.gather(
*[asyncio.ensure_future(data_generator_func()) for _ in range(iterations)]
)
async def _generate_grafana_teams_and_users(
args: argparse.Namespace, http_session: aiohttp.ClientSession
) -> typing.Awaitable[None]:
global GRAFANA_API_URL
GRAFANA_API_URL = args.grafana_api_url
org_id = args.grafana_org_id
print("Generating team(s)")
await _bulk_generate_data(args.teams, generate_team(http_session, org_id))
print("Generating user(s)")
await _bulk_generate_data(args.users, generate_user(http_session, org_id))
print(
f"""
Grafana teams and users generated
Now head to the OnCall plugin and manually visit the plugin to trigger a sync. This will sync grafana
teams/users to OnCall. Once completed, you can run the {SCHEDULES_ONCALL_SHIFTS_COMMAND} command.
"""
)
async def _generate_oncall_schedules_and_oncall_shifts(
args: argparse.Namespace, http_session: aiohttp.ClientSession
) -> typing.Awaitable[None]:
global ONCALL_API_URL, ONCALL_API_TOKEN
ONCALL_API_URL = args.oncall_api_url
ONCALL_API_TOKEN = args.oncall_api_token
today = datetime.now()
print("Fetching users from OnCall API")
# Fetch users from the OnCall API
users: OnCallApiListUsersResponse = await _oncall_api_request(
http_session, "GET", "/api/v1/users"
)
user_ids: typing.List[str] = [u["id"] for u in users["results"]]
num_users = len(user_ids)
print(f"Fetched {num_users} user(s) from the OnCall API")
async def _create_oncall_shift(shift_start_time: str) -> typing.Awaitable[str]:
"""
Creates an eight hour shift.
`shift_start_time` - ex. 09:00:00, 15:00:00
https://grafana.com/docs/oncall/latest/oncall-api-reference/on_call_shifts/#create-an-oncall-shift
"""
new_shift: OnCallApiOnCallShift = await _oncall_api_request(
http_session,
"POST",
"/api/v1/on_call_shifts",
json={
"name": f"On call shift {uuid.uuid4()}",
"type": "rolling_users",
"start": today.strftime(f"%Y-%m-%dT{shift_start_time}"),
"time_zone": "UTC",
"duration": 60 * 60 * 8, # 8 hours
"frequency": "daily",
"week_start": "MO",
"rolling_users": [
[u] for u in random.choices(user_ids, k=math.floor(num_users / 2))
],
"start_rotation_from_user_index": 0,
"team_id": None,
},
)
oncall_shift_id = new_shift["id"]
print(f"Generated OnCall shift w/ ID {oncall_shift_id}")
return oncall_shift_id
print("Creating three 8h on-call shifts")
morning_shift_id = await _create_oncall_shift("00:00:00")
afternoon_shift_id = await _create_oncall_shift("08:00:00")
evening_shift_id = await _create_oncall_shift("16:00:00")
print("Generating schedule(s)")
await _bulk_generate_data(
args.schedules,
generate_schedule(
http_session, [morning_shift_id, afternoon_shift_id, evening_shift_id]
),
)
async def main() -> typing.Awaitable[None]:
parser = argparse.ArgumentParser(
description="Set of commands to help generate fake data in a Grafana OnCall setup."
)
subparsers = parser.add_subparsers(help="sub-command help")
grafana_command_parser = subparsers.add_parser(
TEAMS_USERS_COMMAND,
description="Command to generate teams and users in Grafana",
)
grafana_command_parser.set_defaults(func=_generate_grafana_teams_and_users)
grafana_command_parser.add_argument(
"--grafana-api-url",
help="Grafana API URL. This should include the basic authentication username/password in the URL. ex. http://oncall:oncall@localhost:3000",
default="http://oncall:oncall@localhost:3000",
)
grafana_command_parser.add_argument(
"--grafana-org-id",
help="Org ID, in Grafana, of the org that you would like to generate data for",
type=int,
default=1,
)
grafana_command_parser.add_argument(
"-t", "--teams", help="Number of teams to generate", default=10, type=int
)
grafana_command_parser.add_argument(
"-u", "--users", help="Number of users to generate", default=1_000, type=int
)
oncall_command_parser = subparsers.add_parser(
SCHEDULES_ONCALL_SHIFTS_COMMAND,
description="Command to generate schedules and on-call shifts in OnCall",
)
oncall_command_parser.set_defaults(
func=_generate_oncall_schedules_and_oncall_shifts
)
oncall_command_parser.add_argument(
"--oncall-api-url",
help="OnCall API URL",
default="http://localhost:8080",
)
oncall_command_parser.add_argument(
"--oncall-api-token", help="OnCall API token", required=True
)
oncall_command_parser.add_argument(
"-s",
"--schedules",
help="Number of schedules to generate",
default=100,
type=int,
)
args = parser.parse_args()
async with aiohttp.ClientSession(
connector=aiohttp.TCPConnector(limit=5)
) as session:
await args.func(args, session)
if __name__ == "__main__":
asyncio.run(main())


@ -0,0 +1,3 @@
aiohttp==3.8.3
Faker==16.4.0
tqdm==4.64.1


@ -296,6 +296,7 @@ services:
volumes:
- grafanadata_dev:/var/lib/grafana
- ./grafana-plugin:/var/lib/grafana/plugins/grafana-plugin
- ${GRAFANA_DEV_PROVISIONING:-/dev/null}:/etc/grafana/grafana.ini
depends_on:
postgres:
condition: service_healthy


@ -1,6 +1,7 @@
import enum
import typing
from django.conf import settings
from rest_framework import permissions
from rest_framework.authentication import BasicAuthentication, SessionAuthentication
from rest_framework.request import Request
@ -184,6 +185,11 @@ class RBACPermission(permissions.BasePermission):
return view.action if isinstance(view, ViewSetMixin) else request.method.lower()
def has_permission(self, request: Request, view: ViewSetOrAPIView) -> bool:
# the django-debug-toolbar UI makes OPTIONS calls. Without this statement the debug UI can't gather the
# necessary info it needs to work properly
if settings.DEBUG and request.method == "OPTIONS":
return True
action = self._get_view_action(request, view)
rbac_permissions: RBACPermissionsAttribute = getattr(view, RBAC_PERMISSIONS_ATTR, None)


@ -82,7 +82,9 @@ class APIClient:
if response.status_code == status.HTTP_204_NO_CONTENT:
return {}, call_status
return response.json(), call_status
# ex. a HEAD call (self.api_head) would have a response.content of b''
# and hence calling response.json() throws a json.JSONDecodeError
return response.json() if response.content else None, call_status
except (
requests.exceptions.ConnectionError,
requests.exceptions.HTTPError,


@ -70,5 +70,6 @@ if settings.DEBUG:
path("__debug__/", include(debug_toolbar.urls)),
] + urlpatterns
urlpatterns += [path("silk/", include("silk.urls", namespace="silk"))]
admin.site.site_header = settings.ADMIN_SITE_HEADER


@ -13,7 +13,7 @@ django-cors-headers==3.7.0
django-debug-toolbar==3.2.1
django-sns-view==0.1.2
python-telegram-bot==13.13
django-silk==4.1.0
django-silk==5.0.3
django-redis-cache==3.0.0
hiredis==1.0.0
django-ratelimit==2.0.0
@ -23,7 +23,7 @@ recurring-ical-events==0.1.16b0
slack-export-viewer==1.0.0
beautifulsoup4==4.8.1
social-auth-app-django==3.1.0
cryptography==39.0.0
cryptography==38.0.4 # version 39.0.0 introduced an issue - https://stackoverflow.com/a/75053968/3902555
pytest==5.4.3
pytest-django==3.9.0
pytest_factoryboy==2.0.3


@ -1,5 +1,6 @@
# flake8: noqa
import os
import socket
import sys
from .base import *
@ -30,30 +31,31 @@ SILKY_PYTHON_PROFILER = True
# For any requests that come in with that header/value, request.is_secure() will return True.
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
# Uncomment this to view SQL queries
# LOGGING = {
# 'version': 1,
# 'filters': {
# 'require_debug_true': {
# '()': 'django.utils.log.RequireDebugTrue',
# }
# },
# 'handlers': {
# 'console': {
# 'level': 'DEBUG',
# 'filters': ['require_debug_true'],
# 'class': 'logging.StreamHandler',
# }
# },
# 'loggers': {
# 'django.db.backends': {
# 'level': 'DEBUG',
# 'handlers': ['console'],
# }
# }
# }
if getenv_boolean("DEV_DEBUG_VIEW_SQL_QUERIES", default=False):
LOGGING = {
"version": 1,
"filters": {
"require_debug_true": {
"()": "django.utils.log.RequireDebugTrue",
}
},
"handlers": {
"console": {
"level": "DEBUG",
"filters": ["require_debug_true"],
"class": "logging.StreamHandler",
}
},
"loggers": {
"django.db.backends": {
"level": "DEBUG",
"handlers": ["console"],
}
},
}
SILKY_INTERCEPT_PERCENT = 100
MIDDLEWARE += ["silk.middleware.SilkyMiddleware"]
SWAGGER_SETTINGS = {
"SECURITY_DEFINITIONS": {
@ -67,3 +69,13 @@ if TESTING:
EXTRA_MESSAGING_BACKENDS = [("apps.base.tests.messaging_backend.TestOnlyBackend", 42)]
TELEGRAM_TOKEN = "0000000000:XXXXXXXXXXXXXXXXXXXXXXXXXXXX-XXXXXX"
TWILIO_AUTH_TOKEN = "twilio_auth_token"
INTERNAL_IPS = [
"127.0.0.1",
]
# the below two lines make it possible to use django-debug-toolbar inside of docker locally
# https://knasmueller.net/fix-djangos-debug-toolbar-not-showing-inside-docker
# https://stackoverflow.com/questions/10517765/django-debug-toolbar-not-showing-up
hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
INTERNAL_IPS += [".".join(ip.split(".")[:-1] + ["1"]) for ip in ips]
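The final comprehension in `dev.py` above can be illustrated standalone: for each of the container's IPs, it swaps the last octet for `1`, yielding the Docker bridge-network gateway address that the browser's requests appear to originate from (the `gateway_of` helper name is illustrative, not part of the codebase):

```python
def gateway_of(ip: str) -> str:
    # Replace the last octet with "1", e.g. 172.18.0.5 -> 172.18.0.1,
    # the default gateway of a Docker bridge network.
    return ".".join(ip.split(".")[:-1] + ["1"])

print(gateway_of("172.18.0.5"))  # → 172.18.0.1
print(gateway_of("192.168.65.4"))  # → 192.168.65.1
```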