Compare commits
215 Commits
master...306eb20257
| SHA1 | Author | Date | |
|---|---|---|---|
| 306eb20257 | |||
| 5ae85083a5 | |||
| 1ac83f1066 | |||
| 7e0e6e9652 | |||
| ab1b152a46 | |||
| 8821cd86c6 | |||
| 4ea93c9eff | |||
| 40ca2064d9 | |||
| e840fbabac | |||
| ff2a8ff6d4 | |||
| d33256edb8 | |||
| 10b9e9280f | |||
| ee7f301f44 | |||
| 64984d69d5 | |||
| 4f856c0705 | |||
| 8b4da52760 | |||
| aa171375e5 | |||
| 1674266890 | |||
| dd71f984ed | |||
| 269b29b2c1 | |||
| df41885cca | |||
| 2a75608701 | |||
| e1cf455384 | |||
| 93e286fa62 | |||
| e3966ca5cb | |||
| c335ed9fb9 | |||
| ad90255570 | |||
| cab6cacd0b | |||
| 1475a80316 | |||
| b9148933f4 | |||
| 5bb52a9d67 | |||
| 8cf57c07b2 | |||
| 20aa3aba9d | |||
| d53cb3e935 | |||
| 25ceacdff9 | |||
| 0adcacf375 | |||
| c31d776504 | |||
| 69b77a675b | |||
| ac9b3c8ede | |||
| 825ed03dbd | |||
| 561914ee78 | |||
| ccea2a5ea6 | |||
| c52e87acd2 | |||
| 294c848415 | |||
| 1f6381bf07 | |||
| 73d8ff32d5 | |||
| 46a9008cda | |||
| eaa1e2412a | |||
| a3c21c64ed | |||
| 0e444f0502 | |||
| 4778803ee0 | |||
| 9c3eaf05c7 | |||
| f05e47af1c | |||
| 1de9548111 | |||
| 8202a9324c | |||
| f5ce00795e | |||
| 4e04cfae7e | |||
| d2ed8df2ac | |||
| 712ab223ba | |||
| ed9affbef6 | |||
| cb5a6a3ee8 | |||
| bc697bd4bd | |||
| d075fecbce | |||
| 0c07586787 | |||
| 9661e98a70 | |||
| 69d77e1d0c | |||
| 62ada47352 | |||
| 474016f776 | |||
| 6a21a9d094 | |||
| b4c12def13 | |||
| 9097bced4a | |||
| 61e6732b19 | |||
| fd4c765dc4 | |||
| 36ae12af3d | |||
| 0a86d4e0a3 | |||
| a53aebb5b8 | |||
| 1af0348c89 | |||
| 8ab8ef5b1c | |||
| bf9da835b2 | |||
| 7b28149d7e | |||
| 87a2ee5a45 | |||
| ab231e9a89 | |||
| 1661601caf | |||
| e690953b82 | |||
| f7a61fe6c0 | |||
| 5ea092dba6 | |||
| 876e4cdfa7 | |||
| a0468899b0 | |||
| 2ce435fb5d | |||
| f9ad81ddac | |||
| faecdf5495 | |||
| 2618eb295b | |||
| a2360a882d | |||
| 6bd8307fbc | |||
| f8305be4cd | |||
| 05c6cbc839 | |||
| b453c821c7 | |||
| 73c7c471ea | |||
| 87394c2955 | |||
| a5cda0b203 | |||
| 86b6a929be | |||
| cd19c26e82 | |||
| 695e4d7c5d | |||
| d89cc98b44 | |||
| 0e183099ed | |||
| 4bebc56a28 | |||
| 112770eddf | |||
| 934817da8a | |||
| 894d7b0149 | |||
| ba585c64bb | |||
| e9e363a13a | |||
| 9097fd5310 | |||
| c2cd3b0301 | |||
| 4c3878a300 | |||
| 9ec25ed109 | |||
| 46da13a0df | |||
| f32f9ff27f | |||
| 276e65e0b4 | |||
| 246eba6654 | |||
| 8bd11f363b | |||
| b0eaf0c531 | |||
| fceee3146e | |||
| d2a65bd1fe | |||
| a53a37021c | |||
| 180cc8bc02 | |||
| 61af53eecb | |||
| cfc0e45439 | |||
| 5a89563d4a | |||
| 80cd5baa18 | |||
| e873ff71e8 | |||
| 1fed61f47f | |||
| 05a3b9c95d | |||
| 8c8ac863dd | |||
| ec844297ee | |||
| da42ea4bcf | |||
| 861573a7df | |||
| 78d67a3fbb | |||
| 156b0afcb9 | |||
| a24afd114e | |||
| 91707c4553 | |||
| b2c1d8bde9 | |||
| 4eef440a2a | |||
| 3e25f3030c | |||
| c1bf1c34f0 | |||
| 8a4d43cfb2 | |||
| 836d3a75bf | |||
| 112533bbab | |||
| dbc57cbcd1 | |||
| 44a530ce9e | |||
| 1d60f2fa41 | |||
| aad15891b0 | |||
| 3195f936d3 | |||
| ffd5b5b013 | |||
| a4593dc0c5 | |||
| ce8e1f00e9 | |||
| 6a44022afb | |||
| dfc8f6a09c | |||
| 520f5ad0d8 | |||
| fbc61614d7 | |||
| 9c147b2a6d | |||
| aa61918d69 | |||
| 85046e5a5a | |||
| b8e088fc11 | |||
| a7746dfdea | |||
| 12dcd94573 | |||
| a87b9c7e72 | |||
| 59d5a1c3dc | |||
| 094e0ef1d2 | |||
| 669d26600a | |||
| cf02bce45e | |||
| 478ce58c17 | |||
| 8f0dd9d248 | |||
| 6107324be0 | |||
| f92aa8d1b2 | |||
| 428903acfd | |||
| e983ca64bd | |||
| ff398d8e7f | |||
| e44b77d0b3 | |||
| 36f5ee8b44 | |||
| e94aeb2440 | |||
| 1ee5e57547 | |||
| a805ce6777 | |||
| d4676a5dd1 | |||
| af7289614f | |||
| fb665aff5d | |||
| 66f02bdb05 | |||
| ec59092d34 | |||
| 2dd2b766a6 | |||
| d29f89758b | |||
| 81dc3a385c | |||
| f65bfb1564 | |||
| 2c46cde8e6 | |||
| b1395a4b5d | |||
| fd7b504d01 | |||
| 65a6c9f90c | |||
| 17512d14b8 | |||
| 4a00ecd691 | |||
| 4733d9ac7c | |||
| e3b744be70 | |||
| d17d6831dd | |||
| 4bbf0d85e0 | |||
| 789808e815 | |||
| adb10c3d95 | |||
| 5b788ca28f | |||
| ab8858154b | |||
| 4e03abcac8 | |||
| cd360d9ac7 | |||
| 849a14014c | |||
| e9e09104ad | |||
| 95a847cee0 | |||
| 8281c1a4c9 | |||
| cbae38e893 | |||
| efd940f34f | |||
| 893f441ddd | |||
| cd76659ed9 |
3
.gitignore
vendored
@@ -6,4 +6,5 @@ node_modules/
*.egg-info/
*.egg

db.sqlite3
instance/settings/settings.py
20
.pre-commit-config.yaml
Normal file
@@ -0,0 +1,20 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v2.3.0
    hooks:
      - id: check-yaml
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/psf/black
    rev: 23.1.0
    hooks:
      - id: black
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.0.292
    hooks:
      - id: ruff
        args: [--fix, --exit-non-zero-on-fix]
  - repo: https://github.com/PyCQA/docformatter.git
    rev: v1.5.1
    hooks:
      - id: docformatter
73
README.md
@@ -1,20 +1,64 @@

Platform to manage a radio, schedules, website, and so on. We use the power of great tools like Django or Liquidsoap.

This project is distributed under GPL version 3. More information in the LICENSE file, except for some files whose license is indicated.
A platform to manage radio schedules, website content, and more. It uses the power of great tools such as Django and Liquidsoap.

This project is distributed under GPL version 3. More information in the LICENSE file, except for some files whose license is indicated inside the source code.

## Features
* **streams**: multiple random music streams when no program is played. A time range and a frequency can also be specified for each;
* **diffusions**: generate diffusion time slots for programs that have schedule information; check for conflicts and reruns;
* **liquidsoap**: create a configuration to use Liquidsoap as a stream generator, and provide an interface to control it;
* **sounds**: each program has a folder where sounds can be put, which are detected by the system. Quality can be checked and reported for later use. Later, we plan to have uploaders to external platforms. Sounds can be defined as excerpts or as archives.
* **cms**: application that can be used as a basis for a website;
* **sounds**: each program has a folder for its podcasts. Aircox detects updates, can run quality checks, and can import related playlists (timestamped or by position in the track list). Sounds can be defined as excerpts or as archives.
* **log**: keep a trace of every sound played or loaded on the stream generator;
* **admin**: admin user interface;
* **cms**: content management system.


## Scripts
## Architecture and concepts
Aircox is divided in two main modules:
* `aircox`: the basics of Aircox (programs, diffusions, sounds, etc.) and the interface for managing a website with Aircox elements (playlists, timetable, players on the website);
* `aircox_streamer`: interacts with the application that generates the audio stream (Liquidsoap).

## Development setup
Start by installing a virtual environment and the requirements:

```
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
pip install -r requirements_tests.txt
```

Then copy the default settings and initialise the database:

```
cp instance/settings/sample.py instance/settings/settings.py
python -c "from django.core.management.utils import get_random_secret_key; print('SECRET_KEY = \"%s\"' % get_random_secret_key())" >> instance/settings/settings.py
DJANGO_SETTINGS_MODULE=instance.settings.dev ./manage.py migrate
```

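If you plan to use the admin interface mentioned below, you will probably also want a superuser account. This uses the standard Django command; the settings module shown here simply follows the development setup above:

```
DJANGO_SETTINGS_MODULE=instance.settings.dev ./manage.py createsuperuser
```
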
Finally, test and run the instance using the development settings, then point your browser to http://localhost:8000:

```
DJANGO_SETTINGS_MODULE=instance.settings.dev pytest
DJANGO_SETTINGS_MODULE=instance.settings.dev ./manage.py runserver
```

Before requesting a merge, enable pre-commit:

```
pip install pre-commit
pre-commit install
```

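The hooks then run automatically on each commit; to check the whole tree manually before pushing, pre-commit can also be invoked directly:

```
pre-commit run --all-files
```
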
## Installation
Running Aircox in production involves:
* the Aircox modules and a running Django project;
* a supervisor for common tasks (sound monitoring, stream control, etc.) -- `supervisord`;
* a WSGI server and an HTTP server -- `gunicorn`, `nginx` (see the sketch after this list);
* a database supported by Django (MySQL, SQLite, PostgreSQL).

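As a rough illustration of the WSGI piece only: the exact module path depends on how the Django project is laid out, so treat `instance.wsgi` below as an assumption rather than a documented entry point.

```
gunicorn instance.wsgi:application --bind 127.0.0.1:8000
```
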
### Scripts
Various configuration scripts are included to ease setup. They
assume that the project is present in `/srv/apps/aircox`:

@@ -26,9 +70,6 @@ The scripts are written with a combination of `cron`, `supervisord`, `nginx`
and `gunicorn` in mind.


## Installation
Later, we plan to provide an installation script to reduce the number of steps above.

### Dependencies
For Python dependencies, take a peek at the `requirements.txt` file, plus
dependencies specific to Django (e.g. for the database: `mysqlclient` for MySQL
@@ -50,7 +91,7 @@ Development dependencies:
All scripts and files assume that:
- you have cloned aircox in `/srv/apps/` (such that `/srv/apps/aircox/README.md` exists)
- you have a supervisor running (we have scripts for `supervisord`)
- you want to use `gunicorn` as WSGI server (otherwise, you'll need to remove it from the requirement list)
- you use `gunicorn` as WSGI server (otherwise, you'll need to remove it from the requirement list)

This installation process uses a virtualenv, including for all provided scripts.

@@ -63,8 +104,8 @@ pip install -r requirements.txt
```

### Configuration
You must write a settings.py file in the `instance` directory (you can just
copy and paste `instance/sample_settings.py`. There still is configuration
You must write a settings.py file in the `instance/settings` directory (you can just
copy and paste `instance/settings/sample.py`). There is still some configuration
required in this file; check it for more information.

@@ -87,8 +128,7 @@ server from this directory:
./manage.py runserver
```

You can access to the django admin interface at `http://127.0.0.1:8000/admin`
and to the cms interface at `http://127.0.0.1:8000/cms/`.
You can access the Django admin interface at `http://127.0.0.1:8000/admin`.

From the admin interface:
* create a Station
@@ -96,8 +136,6 @@ From the admin interface:
* define Outputs for the streamer (look at the Liquidsoap documentation for
more information on how to configure it)

TODO: cms related documentation here

Once the configuration is okay, you must start the *controllers monitor*,
which creates the configuration files for the audio streams using the new information
and runs the appropriate application (note that you don't need to restart it
@@ -107,5 +145,4 @@ If you use supervisord and our script with it, you can use the services defined
in it instead of running commands manually.

## More information
There are extra informations in `aircox/README.md`.

There is extra information in `aircox/README.md` and `aircox_streamer/README.md`.

@@ -6,16 +6,13 @@ A Station contains programs that can be scheduled or streamed. A *Scheduled Prog

Each program has a directory on the server where the user puts its podcasts (in **AIRCOX_PROGRAM_DIR**). It contains the directories **archives** (complete shows' podcasts) and **excerpts** (partial or other podcasts).


## manage.py's commands
* **diffusions**: update/create, check and clean diffusions based on programs schedules;
* **import_playlist**: import a playlist from a csv file, and associate it to a sound;
* **sound_monitor**: check for existing and missing sounds files in programs directories and synchronize the database. It can check for the quality of file and update sound info.
* **sound_quality_check**: check for the quality of the file (don't update database);
* **streamer**: audio stream generation and control it;
* `diffusions`: update/create, check and clean diffusions based on program schedules;
* `import_playlist`: import a playlist from a CSV file and associate it with a sound (see the example after this list);
* `sounds_monitor`: check for existing and missing sound files in program directories and synchronize the database. It can check file quality and update sound info;
* `sounds_quality_check`: check the quality of the files (does not update the database);

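For illustration, a playlist CSV consistent with the column order and `;` delimiter defined in `aircox.conf` (`IMPORT_PLAYLIST_CSV_COLS`, `IMPORT_PLAYLIST_CSV_DELIMITER`) could look like the following; the file name and values are purely examples, not part of the project:

```
# artist;title;minutes;seconds;tags;info
Some Artist;Opening Track;0;0;ambient;live session
Another Artist;Second Track;12;34;rock;
```
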
## Requirements
* Sox (and soxi): sound file monitoring and quality checks
* requirements.txt for Python dependencies

@ -1 +0,0 @@
|
||||
|
||||
|
||||
@ -1,8 +1,26 @@
|
||||
from . import filters
|
||||
from .article import ArticleAdmin
|
||||
from .episode import DiffusionAdmin, EpisodeAdmin
|
||||
from .diffusion import DiffusionAdmin
|
||||
from .episode import EpisodeAdmin
|
||||
from .log import LogAdmin
|
||||
from .page import PageAdmin, StaticPageAdmin
|
||||
from .program import ProgramAdmin, ScheduleAdmin, StreamAdmin
|
||||
from .program import ProgramAdmin, StreamAdmin
|
||||
from .schedule import ScheduleAdmin
|
||||
from .sound import SoundAdmin, TrackAdmin
|
||||
from .station import StationAdmin
|
||||
|
||||
__all__ = (
|
||||
"filters",
|
||||
"ArticleAdmin",
|
||||
"DiffusionAdmin",
|
||||
"EpisodeAdmin",
|
||||
"LogAdmin",
|
||||
"PageAdmin",
|
||||
"StaticPageAdmin",
|
||||
"ProgramAdmin",
|
||||
"ScheduleAdmin",
|
||||
"StreamAdmin",
|
||||
"SoundAdmin",
|
||||
"TrackAdmin",
|
||||
"StationAdmin",
|
||||
)
|
||||
|
||||
@ -1,17 +1,12 @@
|
||||
import copy
|
||||
|
||||
from django.contrib import admin
|
||||
|
||||
from ..models import Article
|
||||
from .page import PageAdmin
|
||||
|
||||
|
||||
__all__ = ['ArticleAdmin']
|
||||
__all__ = ["ArticleAdmin"]
|
||||
|
||||
|
||||
@admin.register(Article)
|
||||
class ArticleAdmin(PageAdmin):
|
||||
search_fields = PageAdmin.search_fields + ('parent__title',)
|
||||
search_fields = PageAdmin.search_fields + ("parent__title",)
|
||||
# TODO: readonly field
|
||||
|
||||
|
||||
|
||||
48
aircox/admin/diffusion.py
Normal file
@ -0,0 +1,48 @@
|
||||
from django.contrib import admin
|
||||
from django.utils.translation import gettext as _
|
||||
|
||||
from aircox.models import Diffusion
|
||||
|
||||
|
||||
__all__ = ("DiffusionBaseAdmin", "DiffusionAdmin", "DiffusionInline")
|
||||
|
||||
|
||||
class DiffusionBaseAdmin:
|
||||
fields = ("type", "start", "end", "schedule")
|
||||
readonly_fields = ("schedule",)
|
||||
|
||||
def get_readonly_fields(self, request, obj=None):
|
||||
fields = super().get_readonly_fields(request, obj)
|
||||
if not request.user.has_perm("aircox_program.scheduling"):
|
||||
fields = fields + ("program", "start", "end")
|
||||
return [field for field in fields if field in self.fields]
|
||||
|
||||
|
||||
@admin.register(Diffusion)
|
||||
class DiffusionAdmin(DiffusionBaseAdmin, admin.ModelAdmin):
|
||||
def start_date(self, obj):
|
||||
return obj.local_start.strftime("%Y/%m/%d %H:%M")
|
||||
|
||||
start_date.short_description = _("start")
|
||||
|
||||
def end_date(self, obj):
|
||||
return obj.local_end.strftime("%H:%M")
|
||||
|
||||
end_date.short_description = _("end")
|
||||
|
||||
list_display = ("episode", "start", "end", "type", "initial")
|
||||
list_filter = ("type", "start", "program")
|
||||
list_editable = ("type", "start", "end")
|
||||
ordering = ("-start", "id")
|
||||
|
||||
fields = ("type", "start", "end", "initial", "program", "schedule")
|
||||
readonly_fields = ("schedule",)
|
||||
|
||||
|
||||
class DiffusionInline(DiffusionBaseAdmin, admin.TabularInline):
|
||||
model = Diffusion
|
||||
fk_name = "episode"
|
||||
extra = 0
|
||||
|
||||
def has_add_permission(self, request, obj):
|
||||
return request.user.has_perm("aircox_program.scheduling")
|
||||
@ -1,68 +1,40 @@
|
||||
from copy import copy
|
||||
|
||||
from adminsortable2.admin import SortableAdminBase
|
||||
from django.contrib import admin
|
||||
from django.forms import ModelForm
|
||||
from django.utils.translation import gettext as _
|
||||
|
||||
from ..models import Episode, Diffusion
|
||||
|
||||
from aircox.models import Episode
|
||||
from .page import PageAdmin
|
||||
from .sound import SoundInline, TrackInline
|
||||
|
||||
|
||||
class DiffusionBaseAdmin:
|
||||
fields = ('type', 'start', 'end', 'schedule')
|
||||
readonly_fields = ('schedule',)
|
||||
|
||||
def get_readonly_fields(self, request, obj=None):
|
||||
fields = super().get_readonly_fields(request, obj)
|
||||
if not request.user.has_perm('aircox_program.scheduling'):
|
||||
fields = fields + ('program', 'start', 'end')
|
||||
return [field for field in fields if field in self.fields]
|
||||
|
||||
|
||||
@admin.register(Diffusion)
|
||||
class DiffusionAdmin(DiffusionBaseAdmin, admin.ModelAdmin):
|
||||
def start_date(self, obj):
|
||||
return obj.local_start.strftime('%Y/%m/%d %H:%M')
|
||||
start_date.short_description = _('start')
|
||||
|
||||
def end_date(self, obj):
|
||||
return obj.local_end.strftime('%H:%M')
|
||||
end_date.short_description = _('end')
|
||||
|
||||
list_display = ('episode', 'start_date', 'end_date', 'type', 'initial')
|
||||
list_filter = ('type', 'start', 'program')
|
||||
list_editable = ('type',)
|
||||
ordering = ('-start', 'id')
|
||||
|
||||
fields = ('type', 'start', 'end', 'initial', 'program', 'schedule')
|
||||
readonly_fields = ('schedule',)
|
||||
|
||||
|
||||
class DiffusionInline(DiffusionBaseAdmin, admin.TabularInline):
|
||||
model = Diffusion
|
||||
fk_name = 'episode'
|
||||
extra = 0
|
||||
|
||||
def has_add_permission(self, request, obj):
|
||||
return request.user.has_perm('aircox_program.scheduling')
|
||||
from .diffusion import DiffusionInline
|
||||
|
||||
|
||||
class EpisodeAdminForm(ModelForm):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
self.fields['parent'].required = True
|
||||
self.fields["parent"].required = True
|
||||
|
||||
|
||||
@admin.register(Episode)
|
||||
class EpisodeAdmin(PageAdmin):
|
||||
class EpisodeAdmin(SortableAdminBase, PageAdmin):
|
||||
form = EpisodeAdminForm
|
||||
list_display = PageAdmin.list_display
|
||||
list_filter = PageAdmin.list_filter + ('diffusion__start',)
|
||||
search_fields = PageAdmin.search_fields + ('parent__title',)
|
||||
list_filter = tuple(f for f in PageAdmin.list_filter if f != "pub_date") + (
|
||||
"diffusion__start",
|
||||
"pub_date",
|
||||
)
|
||||
search_fields = PageAdmin.search_fields + ("parent__title",)
|
||||
# readonly_fields = ('parent',)
|
||||
|
||||
inlines = [TrackInline, SoundInline, DiffusionInline]
|
||||
|
||||
def add_view(self, request, object_id, form_url="", context=None):
|
||||
context = context or {}
|
||||
context["init_app"] = True
|
||||
context["init_el"] = "#inline-tracks"
|
||||
return super().change_view(request, object_id, form_url, context)
|
||||
|
||||
def change_view(self, request, object_id, form_url="", context=None):
|
||||
context = context or {}
|
||||
context["init_app"] = True
|
||||
context["init_el"] = "#inline-tracks"
|
||||
return super().change_view(request, object_id, form_url, context)
|
||||
|
||||
72
aircox/admin/filters.py
Normal file
@ -0,0 +1,72 @@
|
||||
from django.contrib.admin import filters
|
||||
from django.db import models
|
||||
from django.utils.http import urlencode
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
__all__ = ("DateFieldFilter", "DateTimeFieldFilter")
|
||||
|
||||
|
||||
class DateFieldFilter(filters.FieldListFilter):
|
||||
"""Display date input."""
|
||||
|
||||
template = "admin/aircox/filters/date_filter.html"
|
||||
input_type = "date"
|
||||
|
||||
def __init__(self, field, request, params, model, model_admin, field_path):
|
||||
self.field_generic = f"{field_path}__"
|
||||
self.date_params = {k: v for k, v in params.items() if k.startswith(self.field_generic)}
|
||||
|
||||
exact_lookup = "date" if isinstance(field, models.DateTimeField) else "exact"
|
||||
|
||||
# links as: (label, param, input_type|None, value)
|
||||
self.links = [
|
||||
(_("Exact"), self.field_generic + exact_lookup, self.input_type),
|
||||
(_("Since"), self.field_generic + "gte", self.input_type),
|
||||
(_("Until"), self.field_generic + "lte", self.input_type),
|
||||
]
|
||||
if field.null:
|
||||
self.links.insert(0, (_("None"), self.field_generic + "isnull", None, "1"))
|
||||
|
||||
self.query_attrs = {k: v for k, v in request.GET.items() if k not in self.date_params}
|
||||
self.query_string = urlencode(self.query_attrs)
|
||||
super().__init__(field, request, params, model, model_admin, field_path)
|
||||
|
||||
def expected_parameters(self):
|
||||
return [link[1] for link in self.links]
|
||||
|
||||
def choices(self, changelist):
|
||||
yield {
|
||||
"label": _("Any"),
|
||||
"type": None,
|
||||
"query_string": self.query_string,
|
||||
}
|
||||
|
||||
for link in self.links:
|
||||
value = len(link) > 3 and link[3] or self.date_params.get(link[1])
|
||||
yield {
|
||||
"label": link[0],
|
||||
"name": link[1],
|
||||
"value": value,
|
||||
"type": link[2],
|
||||
"query_attrs": self.query_attrs,
|
||||
"query_string": urlencode({link[1]: value}) + "&" + self.query_string if value else self.query_string,
|
||||
}
|
||||
|
||||
|
||||
class DateTimeFieldFilter(DateFieldFilter):
|
||||
"""Display datetime input."""
|
||||
|
||||
input_type = "datetime-local"
|
||||
|
||||
|
||||
filters.FieldListFilter.register(
|
||||
lambda f: isinstance(f, models.DateField),
|
||||
DateFieldFilter,
|
||||
take_priority=True,
|
||||
)
|
||||
|
||||
filters.FieldListFilter.register(
|
||||
lambda f: isinstance(f, models.DateTimeField),
|
||||
DateTimeFieldFilter,
|
||||
take_priority=True,
|
||||
)
|
||||
@ -2,12 +2,10 @@ from django.contrib import admin
|
||||
|
||||
from ..models import Log
|
||||
|
||||
|
||||
__all__ = ['LogAdmin']
|
||||
__all__ = ["LogAdmin"]
|
||||
|
||||
|
||||
@admin.register(Log)
|
||||
class LogAdmin(admin.ModelAdmin):
|
||||
list_display = ['id', 'date', 'station', 'source', 'type', 'comment']
|
||||
list_filter = ['date', 'source', 'station']
|
||||
|
||||
list_display = ["id", "date", "station", "source", "type", "comment"]
|
||||
list_filter = ["date", "source", "station"]
|
||||
|
||||
@ -1,42 +0,0 @@
|
||||
class UnrelatedInlineMixin:
|
||||
"""
|
||||
Inline class that can be included in an admin change view whose model
|
||||
is not directly related to inline's model.
|
||||
"""
|
||||
view_model = None
|
||||
parent_model = None
|
||||
parent_fk = ''
|
||||
|
||||
def __init__(self, parent_model, admin_site):
|
||||
self.view_model = parent_model
|
||||
super().__init__(self.parent_model, admin_site)
|
||||
|
||||
def get_parent(self, view_obj):
|
||||
""" Get formset's instance from `obj` of AdminSite's change form. """
|
||||
field = self.parent_model._meta.get_field(self.parent_fk).remote_field
|
||||
return getattr(view_obj, field.name, None)
|
||||
|
||||
def save_parent(self, parent, view_obj):
|
||||
""" Save formset's instance. """
|
||||
setattr(parent, self.parent_fk, view_obj)
|
||||
parent.save()
|
||||
return parent
|
||||
|
||||
def get_formset(self, request, obj):
|
||||
ParentFormSet = super().get_formset(request, obj)
|
||||
inline = self
|
||||
class FormSet(ParentFormSet):
|
||||
view_obj = None
|
||||
|
||||
def __init__(self, *args, instance=None, **kwargs):
|
||||
self.view_obj = instance
|
||||
instance = inline.get_parent(instance)
|
||||
self.instance = instance
|
||||
super().__init__(*args, instance=instance, **kwargs)
|
||||
|
||||
def save(self):
|
||||
inline.save_parent(self.instance, self.view_obj)
|
||||
return super().save()
|
||||
return FormSet
|
||||
|
||||
|
||||
@ -1,74 +1,78 @@
|
||||
from copy import deepcopy
|
||||
|
||||
from adminsortable2.admin import SortableInlineAdminMixin
|
||||
from django.contrib import admin
|
||||
from django.http import QueryDict
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from adminsortable2.admin import SortableInlineAdminMixin
|
||||
|
||||
from ..models import Category, Comment, NavItem, Page, StaticPage
|
||||
|
||||
|
||||
__all__ = ['CategoryAdmin', 'PageAdmin', 'NavItemInline']
|
||||
__all__ = ("CategoryAdmin", "PageAdmin", "NavItemInline")
|
||||
|
||||
|
||||
@admin.register(Category)
|
||||
class CategoryAdmin(admin.ModelAdmin):
|
||||
list_display = ['pk', 'title', 'slug']
|
||||
list_editable = ['title', 'slug']
|
||||
search_fields = ['title']
|
||||
fields = ['title', 'slug']
|
||||
list_display = ["pk", "title", "slug"]
|
||||
list_editable = ["title", "slug"]
|
||||
search_fields = ["title"]
|
||||
fields = ["title", "slug"]
|
||||
prepopulated_fields = {"slug": ("title",)}
|
||||
|
||||
|
||||
class BasePageAdmin(admin.ModelAdmin):
|
||||
list_display = ('cover_thumb', 'title', 'status', 'parent')
|
||||
list_display_links = ('cover_thumb', 'title')
|
||||
list_editable = ('status',)
|
||||
list_filter = ('status',)
|
||||
list_display = ("cover_thumb", "title", "status", "parent")
|
||||
list_display_links = ("cover_thumb", "title")
|
||||
list_editable = ("status",)
|
||||
list_filter = ("status",)
|
||||
prepopulated_fields = {"slug": ("title",)}
|
||||
|
||||
# prepopulate fields using changelist's filters
|
||||
prepopulated_filters = ('parent',)
|
||||
prepopulated_filters = ("parent",)
|
||||
|
||||
search_fields = ('title',)
|
||||
search_fields = ("title",)
|
||||
|
||||
fieldsets = [
|
||||
('', {
|
||||
'fields': ['title', 'slug', 'cover', 'content'],
|
||||
}),
|
||||
(_('Publication Settings'), {
|
||||
'fields': ['status', 'parent'],
|
||||
}),
|
||||
(
|
||||
"",
|
||||
{
|
||||
"fields": ["title", "slug", "cover", "content"],
|
||||
},
|
||||
),
|
||||
(
|
||||
_("Publication Settings"),
|
||||
{
|
||||
"fields": ["status", "parent"],
|
||||
},
|
||||
),
|
||||
]
|
||||
|
||||
change_form_template = 'admin/aircox/page_change_form.html'
|
||||
change_form_template = "admin/aircox/page_change_form.html"
|
||||
|
||||
def cover_thumb(self, obj):
|
||||
return mark_safe('<img src="{}"/>'.format(obj.cover.icons['64'])) \
|
||||
if obj.cover else ''
|
||||
if obj.cover and obj.cover.thumbnails:
|
||||
return mark_safe('<img src="{}"/>'.format(obj.cover.icons["64"]))
|
||||
return ""
|
||||
|
||||
def get_changeform_initial_data(self, request):
|
||||
data = super().get_changeform_initial_data(request)
|
||||
filters = QueryDict(request.GET.get('_changelist_filters', ''))
|
||||
data['parent'] = filters.get('parent', None)
|
||||
filters = QueryDict(request.GET.get("_changelist_filters", ""))
|
||||
data["parent"] = filters.get("parent", None)
|
||||
return data
|
||||
|
||||
def _get_common_context(self, query, extra_context=None):
|
||||
extra_context = extra_context or {}
|
||||
parent = query.get('parent', None)
|
||||
extra_context['parent'] = None if parent is None else \
|
||||
Page.objects.get_subclass(id=parent)
|
||||
parent = query.get("parent", None)
|
||||
extra_context["parent"] = None if parent is None else Page.objects.get_subclass(id=parent)
|
||||
return extra_context
|
||||
|
||||
def render_change_form(self, request, context, *args, **kwargs):
|
||||
if context['original'] and not 'parent' in context:
|
||||
context['parent'] = context['original'].parent
|
||||
if context["original"] and "parent" not in context:
|
||||
context["parent"] = context["original"].parent
|
||||
return super().render_change_form(request, context, *args, **kwargs)
|
||||
|
||||
def add_view(self, request, form_url='', extra_context=None):
|
||||
filters = QueryDict(request.GET.get('_changelist_filters', ''))
|
||||
def add_view(self, request, form_url="", extra_context=None):
|
||||
filters = QueryDict(request.GET.get("_changelist_filters", ""))
|
||||
extra_context = self._get_common_context(filters, extra_context)
|
||||
return super().add_view(request, form_url, extra_context)
|
||||
|
||||
@ -78,30 +82,32 @@ class BasePageAdmin(admin.ModelAdmin):
|
||||
|
||||
|
||||
class PageAdmin(BasePageAdmin):
|
||||
change_list_template = 'admin/aircox/page_change_list.html'
|
||||
change_list_template = "admin/aircox/page_change_list.html"
|
||||
|
||||
list_display = BasePageAdmin.list_display + ('category',)
|
||||
list_editable = BasePageAdmin.list_editable + ('category',)
|
||||
list_filter = BasePageAdmin.list_editable + ('category',)
|
||||
search_fields = ('category__title',)
|
||||
list_display = BasePageAdmin.list_display + ("category",)
|
||||
list_editable = BasePageAdmin.list_editable + ("category",)
|
||||
list_filter = BasePageAdmin.list_filter + ("category", "pub_date")
|
||||
search_fields = BasePageAdmin.search_fields + ("category__title",)
|
||||
fieldsets = deepcopy(BasePageAdmin.fieldsets)
|
||||
|
||||
fieldsets[0][1]['fields'].insert(fieldsets[0][1]['fields'].index('slug') + 1, 'category')
|
||||
fieldsets[1][1]['fields'] += ('featured', 'allow_comments')
|
||||
fieldsets[0][1]["fields"].insert(fieldsets[0][1]["fields"].index("slug") + 1, "category")
|
||||
fieldsets[1][1]["fields"] += ("featured", "allow_comments")
|
||||
|
||||
|
||||
@admin.register(StaticPage)
|
||||
class StaticPageAdmin(BasePageAdmin):
|
||||
list_display = BasePageAdmin.list_display + ('attach_to',)
|
||||
list_display = BasePageAdmin.list_display + ("attach_to",)
|
||||
list_editable = BasePageAdmin.list_editable + ("attach_to",)
|
||||
fieldsets = deepcopy(BasePageAdmin.fieldsets)
|
||||
|
||||
fieldsets[1][1]['fields'] += ('attach_to',)
|
||||
fieldsets[1][1]["fields"] += ("attach_to",)
|
||||
|
||||
|
||||
@admin.register(Comment)
|
||||
class CommentAdmin(admin.ModelAdmin):
|
||||
list_display = ('page_title', 'date', 'nickname')
|
||||
list_filter = ('date',)
|
||||
search_fields = ('page__title', 'nickname')
|
||||
list_display = ("page_title", "date", "nickname")
|
||||
list_filter = ("date",)
|
||||
search_fields = ("page__title", "nickname")
|
||||
|
||||
def page_title(self, obj):
|
||||
return obj.page.title
|
||||
@ -109,4 +115,3 @@ class CommentAdmin(admin.ModelAdmin):
|
||||
|
||||
class NavItemInline(SortableInlineAdminMixin, admin.TabularInline):
|
||||
model = NavItem
|
||||
|
||||
|
||||
@ -1,33 +1,17 @@
|
||||
from copy import copy
|
||||
|
||||
from django.contrib import admin
|
||||
from django.forms import ModelForm
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from ..models import Program, Schedule, Stream
|
||||
from aircox.models import Program, Schedule, Stream
|
||||
from .page import PageAdmin
|
||||
from .schedule import ScheduleInline
|
||||
|
||||
|
||||
# In order to simplify schedule_post_save algorithm, an existing schedule can't
|
||||
# update the following fields: "frequency", "date"
|
||||
class ScheduleInlineForm(ModelForm):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
if self.initial:
|
||||
self.fields['date'].disabled = True
|
||||
self.fields['frequency'].disabled = True
|
||||
|
||||
|
||||
class ScheduleInline(admin.TabularInline):
|
||||
model = Schedule
|
||||
form = ScheduleInlineForm
|
||||
readonly_fields = ('timezone',)
|
||||
extra = 1
|
||||
__all__ = ("ProgramAdmin", "StreamInline", "StreamAdmin")
|
||||
|
||||
|
||||
class StreamInline(admin.TabularInline):
|
||||
model = Stream
|
||||
fields = ['delay', 'begin', 'end']
|
||||
fields = ["delay", "begin", "end"]
|
||||
extra = 1
|
||||
|
||||
|
||||
@ -39,48 +23,27 @@ class ProgramAdmin(PageAdmin):
|
||||
schedule.boolean = True
|
||||
schedule.short_description = _("Schedule")
|
||||
|
||||
list_display = PageAdmin.list_display + ('schedule', 'station', 'active')
|
||||
list_filter = PageAdmin.list_filter + ('station', 'active')
|
||||
prepopulated_fields = {'slug': ('title',)}
|
||||
search_fields = ('title',)
|
||||
list_display = PageAdmin.list_display + ("schedule", "station", "active")
|
||||
list_filter = PageAdmin.list_filter + ("station", "active")
|
||||
prepopulated_fields = {"slug": ("title",)}
|
||||
search_fields = ("title",)
|
||||
|
||||
inlines = [ScheduleInline, StreamInline]
|
||||
|
||||
def get_fieldsets(self, request, obj=None):
|
||||
fields = super().get_fieldsets(request, obj)
|
||||
if request.user.has_perm('aircox.program.scheduling'):
|
||||
if request.user.has_perm("aircox.program.scheduling"):
|
||||
fields = fields + [
|
||||
(_('Program Settings'), {
|
||||
'fields': ['active', 'station', 'sync'],
|
||||
})
|
||||
(
|
||||
_("Program Settings"),
|
||||
{
|
||||
"fields": ["active", "station", "sync"],
|
||||
},
|
||||
)
|
||||
]
|
||||
return fields
|
||||
|
||||
|
||||
@admin.register(Schedule)
|
||||
class ScheduleAdmin(admin.ModelAdmin):
|
||||
def program_title(self, obj):
|
||||
return obj.program.title
|
||||
program_title.short_description = _('Program')
|
||||
|
||||
def freq(self, obj):
|
||||
return obj.get_frequency_verbose()
|
||||
freq.short_description = _('Day')
|
||||
|
||||
list_filter = ['frequency', 'program']
|
||||
list_display = ['program_title', 'freq', 'time', 'timezone', 'duration',
|
||||
'initial']
|
||||
list_editable = ['time', 'duration', 'initial']
|
||||
|
||||
def get_readonly_fields(self, request, obj=None):
|
||||
if obj:
|
||||
return ['program', 'date', 'frequency']
|
||||
else:
|
||||
return []
|
||||
|
||||
|
||||
@admin.register(Stream)
|
||||
class StreamAdmin(admin.ModelAdmin):
|
||||
list_display = ('id', 'program', 'delay', 'begin', 'end')
|
||||
|
||||
|
||||
list_display = ("id", "program", "delay", "begin", "end")
|
||||
|
||||
55
aircox/admin/schedule.py
Normal file
@ -0,0 +1,55 @@
|
||||
from django.contrib import admin
|
||||
from django.forms import ModelForm
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from aircox.models import Schedule
|
||||
|
||||
|
||||
__all__ = ("ScheduleInlineForm", "ScheduleInline", "ScheduleAdmin")
|
||||
|
||||
|
||||
# In order to simplify schedule_post_save algorithm, an existing schedule can't
|
||||
# update the following fields: "frequency", "date"
|
||||
class ScheduleInlineForm(ModelForm):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
if self.initial:
|
||||
self.fields["date"].disabled = True
|
||||
self.fields["frequency"].disabled = True
|
||||
|
||||
|
||||
class ScheduleInline(admin.TabularInline):
|
||||
model = Schedule
|
||||
form = ScheduleInlineForm
|
||||
readonly_fields = ("timezone",)
|
||||
extra = 1
|
||||
|
||||
|
||||
@admin.register(Schedule)
|
||||
class ScheduleAdmin(admin.ModelAdmin):
|
||||
def program_title(self, obj):
|
||||
return obj.program.title
|
||||
|
||||
program_title.short_description = _("Program")
|
||||
|
||||
def freq(self, obj):
|
||||
return obj.get_frequency_display()
|
||||
|
||||
freq.short_description = _("Day")
|
||||
|
||||
list_filter = ["frequency", "program"]
|
||||
list_display = [
|
||||
"program_title",
|
||||
"freq",
|
||||
"time",
|
||||
"timezone",
|
||||
"duration",
|
||||
"initial",
|
||||
]
|
||||
list_editable = ["time", "duration", "initial"]
|
||||
|
||||
def get_readonly_fields(self, request, obj=None):
|
||||
if obj:
|
||||
return ["program", "date", "frequency"]
|
||||
else:
|
||||
return []
|
||||
@ -1,83 +1,152 @@
|
||||
import math
|
||||
|
||||
from adminsortable2.admin import SortableAdminBase
|
||||
from django.contrib import admin
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from adminsortable2.admin import SortableInlineAdminMixin
|
||||
|
||||
from ..models import Sound, Track
|
||||
|
||||
|
||||
class TrackInline(SortableInlineAdminMixin, admin.TabularInline):
|
||||
template = 'admin/aircox/playlist_inline.html'
|
||||
class TrackInline(admin.TabularInline):
|
||||
template = "admin/aircox/playlist_inline.html"
|
||||
model = Track
|
||||
extra = 0
|
||||
fields = ('position', 'artist', 'title', 'info', 'tags')
|
||||
fields = ("position", "artist", "title", "tags", "album", "year", "info")
|
||||
|
||||
list_display = ["artist", "album", "title", "tags", "related"]
|
||||
list_filter = ["artist", "album", "title", "tags"]
|
||||
|
||||
list_display = ['artist', 'title', 'tags', 'related']
|
||||
list_filter = ['artist', 'title', 'tags']
|
||||
|
||||
class SoundTrackInline(TrackInline):
|
||||
fields = TrackInline.fields + ('timestamp',)
|
||||
fields = TrackInline.fields + ("timestamp",)
|
||||
|
||||
|
||||
class SoundInline(admin.TabularInline):
|
||||
model = Sound
|
||||
fields = ['type', 'name', 'audio', 'duration', 'is_good_quality', 'is_public']
|
||||
readonly_fields = ['type', 'audio', 'duration', 'is_good_quality']
|
||||
fields = [
|
||||
"type",
|
||||
"name",
|
||||
"audio",
|
||||
"duration",
|
||||
"is_good_quality",
|
||||
"is_public",
|
||||
"is_downloadable",
|
||||
]
|
||||
readonly_fields = ["type", "audio", "duration", "is_good_quality"]
|
||||
extra = 0
|
||||
max_num = 0
|
||||
|
||||
def audio(self, obj):
|
||||
return mark_safe('<audio src="{}" controls></audio>'.format(obj.url()))
|
||||
audio.short_descripton = _('Audio')
|
||||
return mark_safe('<audio src="{}" controls></audio>'.format(obj.file.url))
|
||||
|
||||
audio.short_description = _("Audio")
|
||||
|
||||
def get_queryset(self, request):
|
||||
return super().get_queryset(request).available()
|
||||
|
||||
|
||||
@admin.register(Sound)
|
||||
class SoundAdmin(admin.ModelAdmin):
|
||||
class SoundAdmin(SortableAdminBase, admin.ModelAdmin):
|
||||
fields = None
|
||||
list_display = ['id', 'name', 'related',
|
||||
'type', 'duration', 'is_public', 'is_good_quality',
|
||||
'audio']
|
||||
list_filter = ('type', 'is_good_quality', 'is_public')
|
||||
list_editable = ['name', 'type', 'is_public']
|
||||
|
||||
search_fields = ['name', 'program__title']
|
||||
fieldsets = [
|
||||
(None, {'fields': ['name', 'path', 'type', 'program', 'episode']}),
|
||||
(None, {'fields': ['duration', 'is_public', 'is_good_quality', 'mtime']}),
|
||||
list_display = [
|
||||
"id",
|
||||
"name",
|
||||
"related",
|
||||
"type",
|
||||
"duration",
|
||||
"is_public",
|
||||
"is_good_quality",
|
||||
"is_downloadable",
|
||||
"audio",
|
||||
]
|
||||
readonly_fields = ('path', 'duration',)
|
||||
list_filter = ("type", "is_good_quality", "is_public")
|
||||
list_editable = ["name", "is_public", "is_downloadable"]
|
||||
|
||||
search_fields = ["name", "program__title"]
|
||||
fieldsets = [
|
||||
(None, {"fields": ["name", "file", "type", "program", "episode"]}),
|
||||
(
|
||||
None,
|
||||
{
|
||||
"fields": [
|
||||
"duration",
|
||||
"is_public",
|
||||
"is_downloadable",
|
||||
"is_good_quality",
|
||||
"mtime",
|
||||
]
|
||||
},
|
||||
),
|
||||
]
|
||||
readonly_fields = ("file", "duration", "type")
|
||||
inlines = [SoundTrackInline]
|
||||
|
||||
def related(self, obj):
|
||||
# TODO: link to episode or program edit
|
||||
return obj.episode.title if obj.episode else\
|
||||
obj.program.title if obj.program else ''
|
||||
related.short_description = _('Program / Episode')
|
||||
return obj.episode.title if obj.episode else obj.program.title if obj.program else ""
|
||||
|
||||
related.short_description = _("Program / Episode")
|
||||
|
||||
def audio(self, obj):
|
||||
return mark_safe('<audio src="{}" controls></audio>'.format(obj.url()))
|
||||
audio.short_descripton = _('Audio')
|
||||
return (
|
||||
mark_safe('<audio src="{}" controls></audio>'.format(obj.file.url))
|
||||
if obj.type != Sound.TYPE_REMOVED
|
||||
else ""
|
||||
)
|
||||
|
||||
audio.short_description = _("Audio")
|
||||
|
||||
def add_view(self, request, form_url="", context=None):
|
||||
context = context or {}
|
||||
context["init_app"] = True
|
||||
context["init_el"] = "#inline-tracks"
|
||||
context["track_timestamp"] = True
|
||||
return super().add_view(request, form_url, context)
|
||||
|
||||
def change_view(self, request, object_id, form_url="", context=None):
|
||||
context = context or {}
|
||||
context["init_app"] = True
|
||||
context["init_el"] = "#inline-tracks"
|
||||
context["track_timestamp"] = True
|
||||
return super().change_view(request, object_id, form_url, context)
|
||||
|
||||
|
||||
@admin.register(Track)
|
||||
class TrackAdmin(admin.ModelAdmin):
|
||||
def tag_list(self, obj):
|
||||
return u", ".join(o.name for o in obj.tags.all())
|
||||
return ", ".join(o.name for o in obj.tags.all())
|
||||
|
||||
list_display = ['pk', 'artist', 'title', 'tag_list', 'episode', 'sound', 'timestamp']
|
||||
list_editable = ['artist', 'title']
|
||||
list_filter = ['artist', 'title', 'tags']
|
||||
list_display = [
|
||||
"pk",
|
||||
"artist",
|
||||
"title",
|
||||
"tag_list",
|
||||
"episode",
|
||||
"sound",
|
||||
"ts",
|
||||
]
|
||||
list_editable = ["artist", "title"]
|
||||
list_filter = ["artist", "title", "tags"]
|
||||
|
||||
search_fields = ['artist', 'title']
|
||||
search_fields = ["artist", "title"]
|
||||
fieldsets = [
|
||||
(_('Playlist'), {'fields': ['episode', 'sound', 'position', 'timestamp']}),
|
||||
(_('Info'), {'fields': ['artist', 'title', 'info', 'tags']}),
|
||||
(
|
||||
_("Playlist"),
|
||||
{"fields": ["episode", "sound", "position", "timestamp"]},
|
||||
),
|
||||
(_("Info"), {"fields": ["artist", "title", "info", "tags"]}),
|
||||
]
|
||||
|
||||
# TODO on edit: readonly_fields = ['episode', 'sound']
|
||||
|
||||
def ts(self, obj):
|
||||
ts = obj.timestamp
|
||||
if ts is None:
|
||||
return ""
|
||||
h = math.floor(ts / 3600)
|
||||
m = math.floor((ts - h) / 60)
|
||||
s = ts - h * 3600 - m * 60
|
||||
return "{:0>2}:{:0>2}:{:0>2}".format(h, m, s)
|
||||
|
||||
ts.short_description = _("timestamp")
|
||||
|
||||
@ -1,10 +1,10 @@
|
||||
from adminsortable2.admin import SortableAdminBase
|
||||
from django.contrib import admin
|
||||
|
||||
from ..models import Port, Station
|
||||
from .page import NavItemInline
|
||||
|
||||
|
||||
__all__ = ['PortInline', 'StationAdmin']
|
||||
__all__ = ["PortInline", "StationAdmin"]
|
||||
|
||||
|
||||
class PortInline(admin.StackedInline):
|
||||
@ -13,8 +13,6 @@ class PortInline(admin.StackedInline):
|
||||
|
||||
|
||||
@admin.register(Station)
|
||||
class StationAdmin(admin.ModelAdmin):
|
||||
prepopulated_fields = {'slug': ('name',)}
|
||||
class StationAdmin(SortableAdminBase, admin.ModelAdmin):
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
inlines = (PortInline, NavItemInline)
|
||||
|
||||
|
||||
|
||||
@ -1,20 +1,18 @@
|
||||
from django.contrib import admin
|
||||
from django.urls import path, include, reverse
|
||||
from django.urls import include, path, reverse
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from rest_framework.routers import DefaultRouter
|
||||
|
||||
from .models import Comment, Diffusion, Program
|
||||
from . import models
|
||||
from .views.admin import StatisticsView
|
||||
|
||||
|
||||
__all__ = ['AdminSite']
|
||||
__all__ = ["AdminSite"]
|
||||
|
||||
|
||||
class AdminSite(admin.AdminSite):
|
||||
extra_urls = None
|
||||
tools = [
|
||||
(_('Statistics'), 'admin:tools-stats'),
|
||||
(_("Statistics"), "admin:tools-stats"),
|
||||
]
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
@ -25,41 +23,45 @@ class AdminSite(admin.AdminSite):
|
||||
|
||||
def each_context(self, request):
|
||||
context = super().each_context(request)
|
||||
context.update({
|
||||
# all programs
|
||||
'programs': Program.objects.active().values('pk', 'title') \
|
||||
.order_by('title'),
|
||||
# today's diffusions
|
||||
'diffusions': Diffusion.objects.on_air().date().order_by('start') \
|
||||
.select_related('episode'),
|
||||
# TODO: only for dashboard
|
||||
# last comments
|
||||
'comments': Comment.objects.order_by('-date')
|
||||
.select_related('page')[0:10],
|
||||
})
|
||||
context.update(
|
||||
{
|
||||
# all programs
|
||||
"programs": models.Program.objects.active().values("pk", "title").order_by("title"),
|
||||
# today's diffusions
|
||||
"diffusions": models.Diffusion.objects.date().order_by("start").select_related("episode"),
|
||||
# TODO: only for dashboard
|
||||
# last comments
|
||||
"comments": models.Comment.objects.order_by("-date").select_related("page")[0:10],
|
||||
"latests": models.Page.objects.select_subclasses().order_by("-pub_date")[0:10],
|
||||
}
|
||||
)
|
||||
return context
|
||||
|
||||
def get_urls(self):
|
||||
urls = super().get_urls() + [
|
||||
path('api/', include((self.router.urls, 'api'))),
|
||||
path('tools/statistics/',
|
||||
self.admin_view(StatisticsView.as_view()),
|
||||
name='tools-stats'),
|
||||
path('tools/statistics/<date:date>/',
|
||||
self.admin_view(StatisticsView.as_view()),
|
||||
name='tools-stats'),
|
||||
] + self.extra_urls
|
||||
urls = (
|
||||
[
|
||||
path("api/", include((self.router.urls, "api"))),
|
||||
path(
|
||||
"tools/statistics/",
|
||||
self.admin_view(StatisticsView.as_view()),
|
||||
name="tools-stats",
|
||||
),
|
||||
path(
|
||||
"tools/statistics/<date:date>/",
|
||||
self.admin_view(StatisticsView.as_view()),
|
||||
name="tools-stats",
|
||||
),
|
||||
]
|
||||
+ self.extra_urls
|
||||
+ super().get_urls()
|
||||
)
|
||||
return urls
|
||||
|
||||
def get_tools(self):
|
||||
return [(label, reverse(url)) for label, url in self.tools]
|
||||
|
||||
def route_view(self, url, view, name, admin_view=True, label=None):
|
||||
self.extra_urls.append(path(
|
||||
url, self.admin_view(view) if admin_view else view, name=name
|
||||
))
|
||||
self.extra_urls.append(path(url, self.admin_view(view) if admin_view else view, name=name))
|
||||
|
||||
if label:
|
||||
self.tools.append((label, 'admin:' + name))
|
||||
|
||||
|
||||
self.tools.append((label, "admin:" + name))
|
||||
|
||||
@ -3,11 +3,9 @@ from django.contrib.admin.apps import AdminConfig
|
||||
|
||||
|
||||
class AircoxConfig(AppConfig):
|
||||
name = 'aircox'
|
||||
verbose_name = 'Aircox'
|
||||
name = "aircox"
|
||||
verbose_name = "Aircox"
|
||||
|
||||
|
||||
class AircoxAdminConfig(AdminConfig):
|
||||
default_site = 'aircox.admin_site.AdminSite'
|
||||
|
||||
|
||||
default_site = "aircox.admin_site.AdminSite"
|
||||
|
||||
180
aircox/conf.py
Executable file
@ -0,0 +1,180 @@
|
||||
import os
|
||||
|
||||
import inspect
|
||||
|
||||
from django.conf import settings as d_settings
|
||||
|
||||
|
||||
__all__ = ("Settings", "settings")
|
||||
|
||||
|
||||
# code from django-fox
|
||||
class BaseSettings:
|
||||
"""Utility class used to load and save settings, can be used as model.
|
||||
|
||||
Some members are excluded from being configuration:
|
||||
- Protected/private members;
|
||||
- On django model, "objects" and "Meta";
|
||||
- Class declaration and callables
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
class MySettings(Settings):
|
||||
a = 13
|
||||
b = 12
|
||||
|
||||
my_settings = MySettings().load('MY_SETTINGS_KEY')
|
||||
print(my_settings.a, my_settings.get('b'))
|
||||
```
|
||||
|
||||
This will load values from django project settings.
|
||||
"""
|
||||
|
||||
def __init__(self, key, module=None):
|
||||
self.load(key, module)
|
||||
|
||||
def load(self, key, module=None):
|
||||
"""Load settings from module's item specified by its member name. When
|
||||
no module is provided, uses ``django.conf.settings``.
|
||||
|
||||
:param str key: module member name.
|
||||
:param module: configuration object.
|
||||
:returns self
|
||||
"""
|
||||
if module is None:
|
||||
module = d_settings
|
||||
settings = getattr(module, key, None)
|
||||
if settings:
|
||||
self.update(settings)
|
||||
return self
|
||||
|
||||
def update(self, settings):
|
||||
"""Update self's values from provided settings. ``settings`` can be an
|
||||
iterable of ``(key, value)``.
|
||||
|
||||
:param dict|Settings|iterable settings: value to update from.
|
||||
"""
|
||||
if isinstance(settings, (dict, Settings)):
|
||||
settings = settings.items()
|
||||
for key, value in settings:
|
||||
if self.is_config_item(key, value):
|
||||
setattr(self, key, value)
|
||||
|
||||
def get(self, key, default=None):
|
||||
"""Return settings' value for provided key."""
|
||||
return getattr(self, key, default)
|
||||
|
||||
def items(self):
|
||||
"""Iterate over items members, as tupple of ``key, value``."""
|
||||
for key in dir(self):
|
||||
value = getattr(self, key)
|
||||
if self.is_config_item(key, value):
|
||||
yield key, value
|
||||
|
||||
def is_config_item(self, key, value):
|
||||
"""Return True if key/value item is a configuration setting."""
|
||||
if key.startswith("_") or callable(value) or inspect.isclass(value):
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
class Settings(BaseSettings):
|
||||
# --- Global & misc
|
||||
DEFAULT_USER_GROUPS = {
|
||||
"radio hosts": (
|
||||
# TODO include content_type in order to avoid clash with potential
|
||||
# extra applications
|
||||
# aircox
|
||||
"view_program",
|
||||
"view_episode",
|
||||
"change_diffusion",
|
||||
"add_comment",
|
||||
"change_comment",
|
||||
"delete_comment",
|
||||
"add_article",
|
||||
"change_article",
|
||||
"delete_article",
|
||||
"change_sound",
|
||||
"add_track",
|
||||
"change_track",
|
||||
"delete_track",
|
||||
# taggit
|
||||
"add_tag",
|
||||
"change_tag",
|
||||
"delete_tag",
|
||||
# filer
|
||||
"add_folder",
|
||||
"change_folder",
|
||||
"delete_folder",
|
||||
"can_use_directory_listing",
|
||||
"add_image",
|
||||
"change_image",
|
||||
"delete_image",
|
||||
),
|
||||
}
|
||||
"""Groups to assign to users at their creation, along with the permissions
|
||||
to add to each group."""
|
||||
PROGRAMS_DIR = "programs"
|
||||
"""Directory for the programs data."""
|
||||
|
||||
@property
|
||||
def PROGRAMS_DIR_ABS(self):
|
||||
return os.path.join(d_settings.MEDIA_ROOT, self.PROGRAMS_DIR)
|
||||
|
||||
# --- Programs & episodes
|
||||
EPISODE_TITLE = "{program.title} - {date}"
|
||||
"""Default title for episodes."""
|
||||
EPISODE_TITLE_DATE_FORMAT = "%-d %B %Y"
|
||||
"""Date format in episode title (python's strftime)"""
|
||||
|
||||
# --- Logs & archives
|
||||
LOGS_ARCHIVES_DIR = "logs/archives"
|
||||
"""Directory where to save logs' archives."""
|
||||
|
||||
@property
|
||||
def LOGS_ARCHIVES_DIR_ABS(self):
|
||||
return os.path.join(d_settings.PROJECT_ROOT, self.LOGS_ARCHIVES_DIR)
|
||||
|
||||
LOGS_ARCHIVES_AGE = 60
|
||||
"""In days, minimal age of a log before it is archived."""
|
||||
|
||||
# --- Sounds
|
||||
SOUND_ARCHIVES_SUBDIR = "archives"
|
||||
"""Sub directory used for the complete episode sounds."""
|
||||
SOUND_EXCERPTS_SUBDIR = "excerpts"
|
||||
"""Sub directory used for the excerpts of the episode."""
|
||||
SOUND_QUALITY = {
|
||||
"attribute": "RMS lev dB",
|
||||
"range": (-18.0, -8.0),
|
||||
"sample_length": 120,
|
||||
}
|
||||
"""Quality attributes passed to sound_quality_check from sounds_monitor
|
||||
(Soxi parameters)."""
|
||||
SOUND_FILE_EXT = (".ogg", ".flac", ".wav", ".mp3", ".opus")
|
||||
"""Extension of sound files."""
|
||||
SOUND_KEEP_DELETED = False
|
||||
"""Tag sounds as deleted instead of deleting them when file has been
|
||||
removed from filesystem (sound monitoring)."""
|
||||
|
||||
# --- Streamer & Controllers
|
||||
CONTROLLERS_WORKING_DIR = "/tmp/aircox"
|
||||
"""Controllers working directory."""
|
||||
|
||||
# --- Playlist import from CSV
|
||||
IMPORT_PLAYLIST_CSV_COLS = (
|
||||
"artist",
|
||||
"title",
|
||||
"minutes",
|
||||
"seconds",
|
||||
"tags",
|
||||
"info",
|
||||
)
|
||||
"""Columns for CSV file."""
|
||||
IMPORT_PLAYLIST_CSV_DELIMITER = ";"
|
||||
"""Column delimiter of csv text files."""
|
||||
IMPORT_PLAYLIST_CSV_TEXT_QUOTE = '"'
|
||||
"""Text delimiter of csv text files."""
|
||||
|
||||
|
||||
settings = Settings("AIRCOX")
|
||||
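Given the loader above (`Settings("AIRCOX")` reads the `AIRCOX` member of the Django settings module, and `update()` accepts a dict), a project should be able to override these defaults from its own settings file. The keys exist in the class above, but the values below are only an illustration:

```python
# instance/settings/settings.py (sketch; assumes the AIRCOX dict is picked up as described above)
AIRCOX = {
    "EPISODE_TITLE": "{program.title} - {date}",
    "LOGS_ARCHIVES_AGE": 30,
    "SOUND_FILE_EXT": (".ogg", ".flac", ".mp3"),
}
```
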
4
aircox/context_processors/__init__.py
Normal file
@ -0,0 +1,4 @@
|
||||
def station(request):
|
||||
station = request.station
|
||||
audio_streams = station.streams if station else None
|
||||
return {"station": station, "audio_streams": audio_streams}
|
||||
8
aircox/controllers/README.md
Normal file
@@ -0,0 +1,8 @@
# aircox.controllers
This module provides the following controller classes:
- `log_archiver.LogArchiver`: dump and load gzip archives of Log models.
- `sound_file.SoundFile`: handle synchronisation between filesystem and database for a sound file.
- `sound_monitor.SoundMonitor`: monitor the filesystem for changes on audio files and synchronise the database.
- `sound_stats.SoundStats` (+ `SoxStats`): get audio statistics of an audio file using Sox.
- `diffusions.Diffusions`: generate, update and clean diffusions.
- `playlist_import.PlaylistImport`: import playlists from CSV (see the sketch after this list).
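A minimal usage sketch based only on the signatures visible in this diff (`LogArchiver.archive(qs, keep=False)` and `PlaylistImport(path, **track_kwargs).run()`); the queryset filter and file path are placeholders, not documented API:

```python
from aircox.controllers.log_archiver import LogArchiver
from aircox.controllers.playlist_import import PlaylistImport
from aircox.models import Log, Sound

# Dump old logs to gzip archives; archived rows are deleted unless keep=True.
archived_count = LogArchiver().archive(Log.objects.filter(date__year__lt=2020), keep=False)

# Read a CSV playlist and attach its tracks to an existing Sound.
sound = Sound.objects.first()
PlaylistImport("/srv/media/programs/example/playlist.csv", sound=sound).run()
```
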
0
aircox/controllers/__init__.py
Normal file
58
aircox/controllers/diffusion_monitor.py
Normal file
@ -0,0 +1,58 @@
|
||||
from datetime import datetime, time
|
||||
import logging
|
||||
|
||||
from django.db import transaction
|
||||
from django.utils import timezone as tz
|
||||
|
||||
from aircox.models import Diffusion, Schedule
|
||||
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
__all__ = ("DiffusionMonitor",)
|
||||
|
||||
|
||||
class DiffusionMonitor:
|
||||
"""Handle generation and update of Diffusion instances."""
|
||||
|
||||
date = None
|
||||
|
||||
def __init__(self, date):
|
||||
self.date = date or date.today()
|
||||
|
||||
def update(self):
|
||||
episodes, diffusions = [], []
|
||||
for schedule in Schedule.objects.filter(program__active=True, initial__isnull=True):
|
||||
eps, diffs = schedule.diffusions_of_month(self.date)
|
||||
if eps:
|
||||
episodes += eps
|
||||
if diffs:
|
||||
diffusions += diffs
|
||||
|
||||
logger.info(
|
||||
"[update] %s: %d episodes, %d diffusions and reruns",
|
||||
str(schedule),
|
||||
len(eps),
|
||||
len(diffs),
|
||||
)
|
||||
|
||||
with transaction.atomic():
|
||||
logger.info(
|
||||
"[update] save %d episodes and %d diffusions",
|
||||
len(episodes),
|
||||
len(diffusions),
|
||||
)
|
||||
for episode in episodes:
|
||||
episode.save()
|
||||
for diffusion in diffusions:
|
||||
# force episode id's update
|
||||
diffusion.episode = diffusion.episode
|
||||
diffusion.save()
|
||||
|
||||
def clean(self):
|
||||
qs = Diffusion.objects.filter(
|
||||
type=Diffusion.TYPE_UNCONFIRMED,
|
||||
start__lt=tz.make_aware(datetime.combine(self.date, time.min)),
|
||||
)
|
||||
logger.info("[clean] %d diffusions will be removed", qs.count())
|
||||
qs.delete()
|
||||
107
aircox/controllers/log_archiver.py
Normal file
@ -0,0 +1,107 @@
|
||||
import gzip
|
||||
import os
|
||||
|
||||
import yaml
|
||||
from django.utils.functional import cached_property
|
||||
|
||||
from aircox.conf import settings
|
||||
from aircox.models import Diffusion, Sound, Track, Log
|
||||
|
||||
|
||||
__all__ = ("LogArchiver",)
|
||||
|
||||
|
||||
class LogArchiver:
|
||||
"""Commodity class used to manage archives of logs."""
|
||||
|
||||
@cached_property
|
||||
def fields(self):
|
||||
return Log._meta.get_fields()
|
||||
|
||||
@staticmethod
|
||||
def get_path(station, date):
|
||||
return os.path.join(
|
||||
settings.LOGS_ARCHIVES_DIR_ABS,
|
||||
"{}_{}.log.gz".format(date.strftime("%Y%m%d"), station.pk),
|
||||
)
|
||||
|
||||
def archive(self, qs, keep=False):
|
||||
"""Archive logs of the given queryset.
|
||||
|
||||
Delete archived logs if not `keep`. Return the count of archived
|
||||
logs
|
||||
"""
|
||||
if not qs.exists():
|
||||
return 0
|
||||
|
||||
os.makedirs(settings.LOGS_ARCHIVES_DIR_ABS, exist_ok=True)
|
||||
count = qs.count()
|
||||
logs = self.sort_logs(qs)
|
||||
|
||||
# Note: since we use Yaml, we can just append new logs when file
|
||||
# exists yet <3
|
||||
for (station, date), logs in logs.items():
|
||||
path = self.get_path(station, date)
|
||||
# FIXME: remove binary mode
|
||||
with gzip.open(path, "ab") as archive:
|
||||
data = yaml.dump([self.serialize(line) for line in logs]).encode("utf8")
|
||||
archive.write(data)
|
||||
|
||||
if not keep:
|
||||
qs.delete()
|
||||
|
||||
return count
|
||||
|
||||
@staticmethod
|
||||
def sort_logs(qs):
|
||||
"""Sort logs by station and date and return a dict of `{
|
||||
(station,date): [logs] }`."""
|
||||
qs = qs.order_by("date")
|
||||
logs = {}
|
||||
for log in qs:
|
||||
key = (log.station, log.date.date())
|
||||
logs.setdefault(key, []).append(log)
|
||||
return logs
|
||||
|
||||
def serialize(self, log):
|
||||
"""Serialize log."""
|
||||
return {i.attname: getattr(log, i.attname) for i in self.fields}
|
||||
|
||||
def load(self, station, date):
|
||||
"""Load an archive returning logs in a list."""
|
||||
path = self.get_path(station, date)
|
||||
|
||||
if not os.path.exists(path):
|
||||
return []
|
||||
return self.load_file(path)
|
||||
|
||||
def load_file(self, path):
|
||||
with gzip.open(path, "rb") as archive:
|
||||
data = archive.read()
|
||||
logs = yaml.safe_load(data)
|
||||
|
||||
# we need to preload diffusions, sounds and tracks
|
||||
rels = {
|
||||
"diffusion": self.get_relations(logs, Diffusion, "diffusion"),
|
||||
"sound": self.get_relations(logs, Sound, "sound"),
|
||||
"track": self.get_relations(logs, Track, "track"),
|
||||
}
|
||||
|
||||
def rel_obj(log, attr):
|
||||
rel_id = log.get(attr + "_id")
|
||||
return rels[attr][rel_id] if rel_id else None
|
||||
|
||||
return [
|
||||
Log(
|
||||
diffusion=rel_obj(log, "diffusion"), sound=rel_obj(log, "sound"), track=rel_obj(log, "track"), **log
|
||||
)
|
||||
for log in logs
|
||||
]
|
||||
|
||||
@staticmethod
|
||||
def get_relations(logs, model, attr):
|
||||
"""From a list of dict representing logs, retrieve related objects of
|
||||
the given type."""
|
||||
attr_id = attr + "_id"
|
||||
pks = {log[attr_id] for log in logs if attr_id in log}
|
||||
return {rel.pk: rel for rel in model.objects.filter(pk__in=pks)}
|
||||
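A sketch of a LogArchiver round-trip (the date is an example, `Station.objects.first()` is a placeholder lookup, and some Log rows are assumed to exist for that day):

import datetime

from aircox.controllers.log_archiver import LogArchiver
from aircox.models import Log, Station

day = datetime.date(2023, 4, 1)                    # example date
station = Station.objects.first()                  # placeholder lookup
archiver = LogArchiver()
count = archiver.archive(Log.objects.filter(date__date=day), keep=True)
logs = archiver.load(station, day)                 # list of unsaved Log instances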
100
aircox/controllers/playlist_import.py
Normal file
@ -0,0 +1,100 @@
|
||||
import csv
|
||||
import logging
|
||||
import os
|
||||
|
||||
|
||||
from aircox.conf import settings
|
||||
from aircox.models import Track
|
||||
|
||||
|
||||
__all__ = ("PlaylistImport",)
|
||||
|
||||
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class PlaylistImport:
|
||||
"""Import one or more playlist for the given sound. Attach it to the
|
||||
provided sound.
|
||||
|
||||
Playlists are in CSV format, where columns are separated with a
|
||||
'{settings.IMPORT_PLAYLIST_CSV_DELIMITER}'. Text quote is
|
||||
{settings.IMPORT_PLAYLIST_CSV_TEXT_QUOTE}.
|
||||
|
||||
If 'minutes' or 'seconds' are given, position will be expressed as timed
|
||||
position, instead of position in playlist.
|
||||
"""
|
||||
|
||||
path = None
|
||||
data = None
|
||||
tracks = None
|
||||
track_kwargs = {}
|
||||
|
||||
def __init__(self, path=None, **track_kwargs):
|
||||
self.path = path
|
||||
self.track_kwargs = track_kwargs
|
||||
|
||||
def reset(self):
|
||||
self.data = None
|
||||
self.tracks = None
|
||||
|
||||
def run(self):
|
||||
self.read()
|
||||
if self.track_kwargs.get("sound") is not None:
|
||||
self.make_playlist()
|
||||
|
||||
def read(self):
|
||||
if not os.path.exists(self.path):
|
||||
return True
|
||||
with open(self.path, "r") as file:
|
||||
logger.info("start reading csv " + self.path)
|
||||
self.data = list(
|
||||
csv.DictReader(
|
||||
(row for row in file if not (row.startswith("#") or row.startswith("\ufeff#")) and row.strip()),
|
||||
fieldnames=settings.IMPORT_PLAYLIST_CSV_COLS,
|
||||
delimiter=settings.IMPORT_PLAYLIST_CSV_DELIMITER,
|
||||
quotechar=settings.IMPORT_PLAYLIST_CSV_TEXT_QUOTE,
|
||||
)
|
||||
)
|
||||
|
||||
def make_playlist(self):
|
||||
"""Make a playlist from the read data, and return it.
|
||||
|
||||
If save is true, save it into the database
|
||||
"""
|
||||
if self.track_kwargs.get("sound") is None:
|
||||
logger.error("related track's sound is missing. Skip import of " + self.path + ".")
|
||||
return
|
||||
|
||||
maps = settings.IMPORT_PLAYLIST_CSV_COLS
|
||||
tracks = []
|
||||
|
||||
logger.info("parse csv file " + self.path)
|
||||
        has_timestamp = "minutes" in maps or "seconds" in maps
|
||||
for index, line in enumerate(self.data):
|
||||
if ("title" or "artist") not in line:
|
||||
return
|
||||
try:
|
||||
timestamp = (
|
||||
int(line.get("minutes") or 0) * 60 + int(line.get("seconds") or 0) if has_timestamp else None
|
||||
)
|
||||
|
||||
track, created = Track.objects.get_or_create(
|
||||
title=line.get("title"), artist=line.get("artist"), position=index, **self.track_kwargs
|
||||
)
|
||||
track.timestamp = timestamp
|
||||
track.info = line.get("info")
|
||||
tags = line.get("tags")
|
||||
if tags:
|
||||
track.tags.add(*tags.lower().split(","))
|
||||
except Exception as err:
|
||||
logger.warning(
|
||||
"an error occured for track {index}, it may not "
|
||||
"have been saved: {err}".format(index=index, err=err)
|
||||
)
|
||||
continue
|
||||
|
||||
track.save()
|
||||
tracks.append(track)
|
||||
self.tracks = tracks
|
||||
return tracks
|
||||
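A sketch of importing a playlist for an existing Sound; the column order, delimiter and quote character come from the `IMPORT_PLAYLIST_CSV_*` settings, which are not shown in this diff, and the file path is made up:

from aircox.controllers.playlist_import import PlaylistImport
from aircox.models import Sound

sound = Sound.objects.first()                       # placeholder lookup
importer = PlaylistImport("/srv/playlists/show.csv", sound=sound)
importer.run()                                      # read() then make_playlist()
print(len(importer.tracks or []), "tracks imported")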
213
aircox/controllers/sound_file.py
Normal file
@ -0,0 +1,213 @@
|
||||
#! /usr/bin/env python3
|
||||
"""Provide SoundFile which is used to link between database and file system.
|
||||
|
||||
File name
|
||||
=========
|
||||
It tries to parse the file name to get the date of the diffusion of an
|
||||
episode and associate the file with it; We use the following format:
|
||||
yyyymmdd[_n][_][name]
|
||||
|
||||
Where:
|
||||
'yyyy' the year of the episode's diffusion;
|
||||
'mm' the month of the episode's diffusion;
|
||||
'dd' the day of the episode's diffusion;
|
||||
'n' the number of the episode (if multiple episodes);
|
||||
'name' the title of the sound;
|
||||
|
||||
Sound Quality
|
||||
=============
|
||||
To check quality of files, call the command sound_quality_check using the
|
||||
parameters given by the setting SOUND_QUALITY. This script requires
|
||||
Sox (and soxi).
|
||||
"""
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
from datetime import date
|
||||
|
||||
import mutagen
|
||||
from django.conf import settings as conf
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.translation import gettext as _
|
||||
|
||||
from aircox import utils
|
||||
from aircox.models import Program, Sound, Track
|
||||
|
||||
from .playlist_import import PlaylistImport
|
||||
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class SoundFile:
|
||||
"""Handle synchronisation between sounds on files and database."""
|
||||
|
||||
path = None
|
||||
info = None
|
||||
path_info = None
|
||||
sound = None
|
||||
|
||||
def __init__(self, path):
|
||||
self.path = path
|
||||
|
||||
@property
|
||||
def sound_path(self):
|
||||
"""Relative path name."""
|
||||
return self.path.replace(conf.MEDIA_ROOT + "/", "")
|
||||
|
||||
@property
|
||||
def episode(self):
|
||||
return self.sound and self.sound.episode
|
||||
|
||||
def sync(self, sound=None, program=None, deleted=False, keep_deleted=False, **kwargs):
|
||||
"""Update related sound model and save it."""
|
||||
if deleted:
|
||||
return self._on_delete(self.path, keep_deleted)
|
||||
|
||||
# FIXME: sound.program as not null
|
||||
if not program:
|
||||
program = Program.get_from_path(self.path)
|
||||
logger.debug('program from path "%s" -> %s', self.path, program)
|
||||
kwargs["program_id"] = program.pk
|
||||
|
||||
if sound:
|
||||
created = False
|
||||
else:
|
||||
sound, created = Sound.objects.get_or_create(file=self.sound_path, defaults=kwargs)
|
||||
|
||||
self.sound = sound
|
||||
self.path_info = self.read_path(self.path)
|
||||
|
||||
sound.program = program
|
||||
if created or sound.check_on_file():
|
||||
sound.name = self.path_info.get("name")
|
||||
self.info = self.read_file_info()
|
||||
if self.info is not None:
|
||||
sound.duration = utils.seconds_to_time(self.info.info.length)
|
||||
|
||||
# check for episode
|
||||
if sound.episode is None and "year" in self.path_info:
|
||||
sound.episode = self.find_episode(sound, self.path_info)
|
||||
sound.save()
|
||||
|
||||
# check for playlist
|
||||
self.find_playlist(sound)
|
||||
return sound
|
||||
|
||||
def _on_delete(self, path, keep_deleted):
|
||||
# TODO: remove from db on delete
|
||||
if keep_deleted:
|
||||
sound = Sound.objects.path(self.path).first()
|
||||
if sound:
|
||||
if keep_deleted:
|
||||
sound.type = sound.TYPE_REMOVED
|
||||
sound.check_on_file()
|
||||
sound.save()
|
||||
return sound
|
||||
else:
|
||||
Sound.objects.path(self.path).delete()
|
||||
|
||||
def read_path(self, path):
|
||||
"""Parse path name returning dictionary of extracted info. It can
|
||||
contain:
|
||||
|
||||
- `year`, `month`, `day`: diffusion date
|
||||
- `hour`, `minute`: diffusion time
|
||||
- `n`: sound arbitrary number (used for sound ordering)
|
||||
- `name`: cleaned name extracted or file name (without extension)
|
||||
"""
|
||||
basename = os.path.basename(path)
|
||||
basename = os.path.splitext(basename)[0]
|
||||
reg_match = self._path_re.search(basename)
|
||||
if reg_match:
|
||||
info = reg_match.groupdict()
|
||||
for k in ("year", "month", "day", "hour", "minute", "n"):
|
||||
if info.get(k) is not None:
|
||||
info[k] = int(info[k])
|
||||
|
||||
name = info.get("name")
|
||||
info["name"] = name and self._into_name(name) or basename
|
||||
else:
|
||||
info = {"name": basename}
|
||||
return info
|
||||
|
||||
_path_re = re.compile(
|
||||
"^(?P<year>[0-9]{4})(?P<month>[0-9]{2})(?P<day>[0-9]{2})"
|
||||
"(_(?P<hour>[0-9]{2})h(?P<minute>[0-9]{2}))?"
|
||||
"(_(?P<n>[0-9]+))?"
|
||||
"_?[ -]*(?P<name>.*)$"
|
||||
)
|
||||
|
||||
def _into_name(self, name):
|
||||
name = name.replace("_", " ")
|
||||
return " ".join(r.capitalize() for r in name.split(" "))
|
||||
|
||||
def read_file_info(self):
|
||||
"""Read file information and metadata."""
|
||||
try:
|
||||
if os.path.exists(self.path):
|
||||
return mutagen.File(self.path)
|
||||
except Exception:
|
||||
pass
|
||||
return None
|
||||
|
||||
def find_episode(self, sound, path_info):
|
||||
"""For a given program, check if there is an initial diffusion to
|
||||
associate to, using the date info we have. Update self.sound and save
|
||||
it consequently.
|
||||
|
||||
We only allow initial diffusion since there should be no rerun.
|
||||
"""
|
||||
program, pi = sound.program, path_info
|
||||
if "year" not in pi or not sound or sound.episode:
|
||||
return None
|
||||
|
||||
year, month, day = pi.get("year"), pi.get("month"), pi.get("day")
|
||||
if pi.get("hour") is not None:
|
||||
at = tz.datetime(year, month, day, pi.get("hour", 0), pi.get("minute", 0))
|
||||
at = tz.make_aware(at)
|
||||
else:
|
||||
at = date(year, month, day)
|
||||
|
||||
diffusion = program.diffusion_set.at(at).first()
|
||||
if not diffusion:
|
||||
return None
|
||||
|
||||
logger.debug("%s <--> %s", sound.file.name, str(diffusion.episode))
|
||||
return diffusion.episode
|
||||
|
||||
def find_playlist(self, sound=None, use_meta=True):
|
||||
"""Find a playlist file corresponding to the sound path, such as:
|
||||
my_sound.ogg => my_sound.csv.
|
||||
|
||||
Use sound's file metadata if no corresponding playlist has been
|
||||
found and `use_meta` is True.
|
||||
"""
|
||||
if sound is None:
|
||||
sound = self.sound
|
||||
if sound.track_set.count() > 1:
|
||||
return
|
||||
|
||||
# import playlist
|
||||
path_noext, ext = os.path.splitext(self.sound.file.path)
|
||||
path = path_noext + ".csv"
|
||||
if os.path.exists(path):
|
||||
PlaylistImport(path, sound=sound).run()
|
||||
# use metadata
|
||||
elif use_meta:
|
||||
if self.info is None:
|
||||
self.read_file_info()
|
||||
if self.info and self.info.tags:
|
||||
tags = self.info.tags
|
||||
title, artist, album, year = tuple(
|
||||
t and ", ".join(t) for t in (tags.get(k) for k in ("title", "artist", "album", "year"))
|
||||
)
|
||||
title = title or (self.path_info and self.path_info.get("name")) or os.path.basename(path_noext)
|
||||
info = "{} ({})".format(album, year) if album and year else album or year or ""
|
||||
track = Track(
|
||||
sound=sound,
|
||||
position=int(tags.get("tracknumber", 0)),
|
||||
title=title,
|
||||
artist=artist or _("unknown"),
|
||||
info=info,
|
||||
)
|
||||
track.save()
|
||||
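To illustrate the `yyyymmdd[_n][_][name]` parsing documented above, a sketch with a made-up path; the expected dictionary follows from `_path_re` and `read_path`:

from aircox.controllers.sound_file import SoundFile

sf = SoundFile("/media/programs/foo/archives/20230401_02_morning show.ogg")
info = sf.read_path(sf.path)
# info == {"year": 2023, "month": 4, "day": 1, "hour": None, "minute": None,
#          "n": 2, "name": "Morning Show"}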
310
aircox/controllers/sound_monitor.py
Normal file
@ -0,0 +1,310 @@
|
||||
#! /usr/bin/env python3
|
||||
|
||||
"""Monitor sound files; For each program, check for:
|
||||
|
||||
- new files;
|
||||
- deleted files;
|
||||
- differences between files and sound;
|
||||
- quality of the files;
|
||||
|
||||
It tries to parse the file name to get the date of the diffusion of an
|
||||
episode and associate the file with it; We use the following format:
|
||||
yyyymmdd[_n][_][name]
|
||||
|
||||
Where:
|
||||
    'yyyy' the year of the episode's diffusion;
|
||||
    'mm' the month of the episode's diffusion;
|
||||
'dd' the day of the episode's diffusion;
|
||||
'n' the number of the episode (if multiple episodes);
|
||||
    'name' the title of the sound;
|
||||
|
||||
|
||||
To check quality of files, call the command sound_quality_check using the
|
||||
parameters given by the setting SOUND_QUALITY. This script requires
|
||||
Sox (and soxi).
|
||||
"""
|
||||
import atexit
|
||||
from concurrent import futures
|
||||
import logging
|
||||
import time
|
||||
import os
|
||||
|
||||
# from datetime import datetime, timedelta
|
||||
|
||||
from django.utils.timezone import datetime, timedelta
|
||||
|
||||
from watchdog.observers import Observer
|
||||
from watchdog.events import PatternMatchingEventHandler
|
||||
|
||||
from aircox.conf import settings
|
||||
from aircox.models import Sound, Program
|
||||
|
||||
from .sound_file import SoundFile
|
||||
|
||||
|
||||
# FIXME: logger should be different in used classes (e.g. "aircox.commands")
|
||||
# defaulting to logging.
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
__all__ = (
|
||||
"Task",
|
||||
"CreateTask",
|
||||
"DeleteTask",
|
||||
"MoveTask",
|
||||
"ModifiedTask",
|
||||
"MonitorHandler",
|
||||
)
|
||||
|
||||
|
||||
class Task:
|
||||
"""Base class used to execute a specific task on file change event.
|
||||
|
||||
Handlers are sent to a multithread pool.
|
||||
"""
|
||||
|
||||
future = None
|
||||
"""Future that promised the handler's call."""
|
||||
log_msg = None
|
||||
"""Log message to display on event happens."""
|
||||
timestamp = None
|
||||
"""Last ping timestamp (the event happened)."""
|
||||
|
||||
def __init__(self, logger=logging):
|
||||
self.ping()
|
||||
|
||||
def ping(self):
|
||||
""""""
|
||||
self.timestamp = datetime.now()
|
||||
|
||||
def __call__(self, event, path=None, logger=logging, **kw):
|
||||
sound_file = SoundFile(path or event.src_path)
|
||||
if self.log_msg:
|
||||
msg = self.log_msg.format(event=event, sound_file=sound_file)
|
||||
logger.info(msg)
|
||||
|
||||
sound_file.sync(**kw)
|
||||
return sound_file
|
||||
|
||||
|
||||
class CreateTask(Task):
|
||||
log_msg = "Sound file created: {sound_file.path}"
|
||||
|
||||
|
||||
class DeleteTask(Task):
|
||||
log_msg = "Sound file deleted: {sound_file.path}"
|
||||
|
||||
def __call__(self, *args, **kwargs):
|
||||
kwargs["deleted"] = True
|
||||
return super().__call__(*args, **kwargs)
|
||||
|
||||
|
||||
class MoveTask(Task):
|
||||
log_msg = "Sound file moved: {event.src_path} -> {event.dest_path}"
|
||||
|
||||
def __call__(self, event, **kw):
|
||||
sound = Sound.objects.filter(file=event.src_path).first()
|
||||
if sound:
|
||||
kw["sound"] = sound
|
||||
kw["path"] = event.src_path
|
||||
else:
|
||||
kw["path"] = event.dest_path
|
||||
return super().__call__(event, **kw)
|
||||
|
||||
|
||||
class ModifiedTask(Task):
|
||||
timeout_delta = timedelta(seconds=30)
|
||||
log_msg = "Sound file updated: {sound_file.path}"
|
||||
|
||||
def wait(self):
|
||||
# multiple call of this handler can be done consecutively, we block
|
||||
# its thread using timeout
|
||||
# Note: this method may be subject to some race conflicts, but this
|
||||
        # should not be a real issue in practice.
|
||||
timeout = self.timestamp + self.timeout_delta
|
||||
while datetime.now() < timeout:
|
||||
time.sleep(self.timeout_delta.total_seconds())
|
||||
timeout = self.timestamp + self.timeout_delta
|
||||
|
||||
def __call__(self, event, **kw):
|
||||
self.wait()
|
||||
return super().__call__(event, **kw)
|
||||
|
||||
|
||||
class MonitorHandler(PatternMatchingEventHandler):
|
||||
"""MonitorHandler is used as a Watchdog event handler.
|
||||
|
||||
It uses a multithread pool in order to execute tasks on events. If a
|
||||
job already exists for this file and event, it pings existing job
|
||||
without creating a new one.
|
||||
"""
|
||||
|
||||
pool = None
|
||||
jobs = None
|
||||
|
||||
def __init__(self, subdir, pool, jobs=None, **sync_kw):
|
||||
"""
|
||||
:param str subdir: sub-directory in program dirs to monitor \
|
||||
(SOUND_ARCHIVES_SUBDIR or SOUND_EXCERPTS_SUBDIR);
|
||||
:param concurrent.futures.Executor pool: pool executing jobs on file
|
||||
change;
|
||||
:param **sync_kw: kwargs passed to `SoundFile.sync`;
|
||||
"""
|
||||
self.subdir = subdir
|
||||
self.pool = pool
|
||||
self.jobs = jobs or {}
|
||||
self.sync_kw = sync_kw
|
||||
|
||||
patterns = ["*/{}/*{}".format(self.subdir, ext) for ext in settings.SOUND_FILE_EXT]
|
||||
super().__init__(patterns=patterns, ignore_directories=True)
|
||||
|
||||
def on_created(self, event):
|
||||
self._submit(CreateTask(), event, "new", **self.sync_kw)
|
||||
|
||||
def on_deleted(self, event):
|
||||
self._submit(DeleteTask(), event, "del")
|
||||
|
||||
def on_moved(self, event):
|
||||
self._submit(MoveTask(), event, "mv", **self.sync_kw)
|
||||
|
||||
def on_modified(self, event):
|
||||
self._submit(ModifiedTask(), event, "up", **self.sync_kw)
|
||||
|
||||
def _submit(self, handler, event, job_key_prefix, **kwargs):
|
||||
"""Send handler job to pool if not already running.
|
||||
|
||||
Return tuple with running job and boolean indicating if its a
|
||||
new one.
|
||||
"""
|
||||
key = job_key_prefix + ":" + event.src_path
|
||||
job = self.jobs.get(key)
|
||||
if job and not job.future.done():
|
||||
job.ping()
|
||||
return job, False
|
||||
|
||||
handler.future = self.pool.submit(handler, event, **kwargs)
|
||||
self.jobs[key] = handler
|
||||
|
||||
def done(r):
|
||||
if self.jobs.get(key) is handler:
|
||||
del self.jobs[key]
|
||||
|
||||
handler.future.add_done_callback(done)
|
||||
return handler, True
|
||||
|
||||
|
||||
class SoundMonitor:
|
||||
"""Monitor for filesystem changes in order to synchronise database and
|
||||
analyse files of a provided program."""
|
||||
|
||||
def report(self, program=None, component=None, *content, logger=logging):
|
||||
content = " ".join([str(c) for c in content])
|
||||
logger.info(f"{program}: {content}" if not component else f"{program}, {component}: {content}")
|
||||
|
||||
def scan(self, logger=logging):
|
||||
"""For all programs, scan dirs.
|
||||
|
||||
Return scanned directories.
|
||||
"""
|
||||
logger.info("scan all programs...")
|
||||
programs = Program.objects.filter()
|
||||
|
||||
dirs = []
|
||||
for program in programs:
|
||||
logger.info(f"#{program.id} {program.title}")
|
||||
self.scan_for_program(
|
||||
program,
|
||||
settings.SOUND_ARCHIVES_SUBDIR,
|
||||
logger=logger,
|
||||
type=Sound.TYPE_ARCHIVE,
|
||||
)
|
||||
self.scan_for_program(
|
||||
program,
|
||||
settings.SOUND_EXCERPTS_SUBDIR,
|
||||
logger=logger,
|
||||
type=Sound.TYPE_EXCERPT,
|
||||
)
|
||||
dirs.append(program.abspath)
|
||||
return dirs
|
||||
|
||||
def scan_for_program(self, program, subdir, logger=logging, **sound_kwargs):
|
||||
"""Scan a given directory that is associated to the given program, and
|
||||
update sounds information."""
|
||||
logger.info("- %s/", subdir)
|
||||
if not program.ensure_dir(subdir):
|
||||
return
|
||||
|
||||
subdir = os.path.join(program.abspath, subdir)
|
||||
sounds = []
|
||||
|
||||
# sounds in directory
|
||||
for path in os.listdir(subdir):
|
||||
path = os.path.join(subdir, path)
|
||||
if not path.endswith(settings.SOUND_FILE_EXT):
|
||||
continue
|
||||
|
||||
sound_file = SoundFile(path)
|
||||
sound_file.sync(program=program, **sound_kwargs)
|
||||
sounds.append(sound_file.sound.pk)
|
||||
|
||||
# sounds in db & unchecked
|
||||
sounds = Sound.objects.filter(file__startswith=subdir).exclude(pk__in=sounds)
|
||||
self.check_sounds(sounds, program=program)
|
||||
|
||||
def check_sounds(self, qs, **sync_kwargs):
|
||||
"""Only check for the sound existence or update."""
|
||||
# check files
|
||||
for sound in qs:
|
||||
if sound.check_on_file():
|
||||
SoundFile(sound.file.path).sync(sound=sound, **sync_kwargs)
|
||||
|
||||
_running = False
|
||||
|
||||
    def monitor(self, logger=logging):
        """Run in monitor mode."""
        if self._running:
            raise RuntimeError("already running")

|
||||
with futures.ThreadPoolExecutor() as pool:
|
||||
archives_handler = MonitorHandler(
|
||||
settings.SOUND_ARCHIVES_SUBDIR,
|
||||
pool,
|
||||
type=Sound.TYPE_ARCHIVE,
|
||||
logger=logger,
|
||||
)
|
||||
excerpts_handler = MonitorHandler(
|
||||
settings.SOUND_EXCERPTS_SUBDIR,
|
||||
pool,
|
||||
type=Sound.TYPE_EXCERPT,
|
||||
logger=logger,
|
||||
)
|
||||
|
||||
observer = Observer()
|
||||
observer.schedule(
|
||||
archives_handler,
|
||||
settings.PROGRAMS_DIR_ABS,
|
||||
recursive=True,
|
||||
)
|
||||
observer.schedule(
|
||||
excerpts_handler,
|
||||
settings.PROGRAMS_DIR_ABS,
|
||||
recursive=True,
|
||||
)
|
||||
observer.start()
|
||||
|
||||
def leave():
|
||||
observer.stop()
|
||||
observer.join()
|
||||
|
||||
atexit.register(leave)
|
||||
|
||||
self._running = True
|
||||
while self._running:
|
||||
time.sleep(1)
|
||||
|
||||
leave()
|
||||
atexit.unregister(leave)
|
||||
|
||||
def stop(self):
|
||||
"""Stop monitor() loop."""
|
||||
self._running = False
|
||||
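A sketch of driving SoundMonitor outside the management command; the threading choice and logger are illustrative only:

import logging
import threading

from aircox.controllers.sound_monitor import SoundMonitor

logger = logging.getLogger("aircox.commands")
monitor = SoundMonitor()
monitor.scan(logger=logger)                      # one-shot synchronisation

thread = threading.Thread(target=monitor.monitor, kwargs={"logger": logger})
thread.start()
# ... later, from another thread:
monitor.stop()
thread.join()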
115
aircox/controllers/sound_stats.py
Normal file
@ -0,0 +1,115 @@
|
||||
"""Provide sound analysis class using Sox."""
|
||||
import logging
|
||||
import re
|
||||
import subprocess
|
||||
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
__all__ = ("SoxStats", "SoundStats")
|
||||
|
||||
|
||||
class SoxStats:
|
||||
"""Run Sox process and parse output."""
|
||||
|
||||
attributes = [
|
||||
"DC offset",
|
||||
"Min level",
|
||||
"Max level",
|
||||
"Pk lev dB",
|
||||
"RMS lev dB",
|
||||
"RMS Pk dB",
|
||||
"RMS Tr dB",
|
||||
"Flat factor",
|
||||
"Length s",
|
||||
]
|
||||
|
||||
values = None
|
||||
|
||||
def __init__(self, path=None, **kwargs):
|
||||
"""If path is given, call analyse with path and kwargs."""
|
||||
if path:
|
||||
self.analyse(path, **kwargs)
|
||||
|
||||
def analyse(self, path, at=None, length=None):
|
||||
"""If at and length are given use them as excerpt to analyse."""
|
||||
args = ["sox", path, "-n"]
|
||||
if at is not None and length is not None:
|
||||
args += ["trim", str(at), str(length)]
|
||||
args.append("stats")
|
||||
|
||||
p = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
|
||||
# sox outputs to stderr (my god WHYYYY)
|
||||
out_, out = p.communicate()
|
||||
self.values = self.parse(str(out, encoding="utf-8"))
|
||||
|
||||
def parse(self, output):
|
||||
"""Parse sox output, settubg values from it."""
|
||||
values = {}
|
||||
for attr in self.attributes:
|
||||
value = re.search(attr + r"\s+(?P<value>\S+)", output)
|
||||
value = value and value.groupdict()
|
||||
if value:
|
||||
try:
|
||||
value = float(value.get("value"))
|
||||
except ValueError:
|
||||
value = None
|
||||
values[attr] = value
|
||||
values["length"] = values.pop("Length s", None)
|
||||
return values
|
||||
|
||||
def get(self, attr):
|
||||
return self.values.get(attr)
|
||||
|
||||
|
||||
class SoundStats:
|
||||
path = None # file path
|
||||
sample_length = 120 # default sample length in seconds
|
||||
stats = None # list of samples statistics
|
||||
bad = None # list of bad samples
|
||||
good = None # list of good samples
|
||||
|
||||
def __init__(self, path, sample_length=None):
|
||||
self.path = path
|
||||
if sample_length is not None:
|
||||
self.sample_length = sample_length
|
||||
|
||||
def get_file_stats(self):
|
||||
return self.stats and self.stats[0] or None
|
||||
|
||||
def analyse(self):
|
||||
logger.debug("complete file analysis")
|
||||
self.stats = [SoxStats(self.path)]
|
||||
position = 0
|
||||
length = self.stats[0].get("length")
|
||||
if not self.sample_length:
|
||||
return
|
||||
|
||||
logger.debug("start samples analysis...")
|
||||
while position < length:
|
||||
stats = SoxStats(self.path, at=position, length=self.sample_length)
|
||||
self.stats.append(stats)
|
||||
position += self.sample_length
|
||||
|
||||
def check(self, name, min_val, max_val):
|
||||
self.good = [index for index, stats in enumerate(self.stats) if min_val <= stats.get(name) <= max_val]
|
||||
self.bad = [index for index, stats in enumerate(self.stats) if index not in self.good]
|
||||
self.resume()
|
||||
|
||||
def resume(self):
|
||||
if self.good:
|
||||
logger.debug(
|
||||
self.path + " -> good: \033[92m%s\033[0m",
|
||||
", ".join(self._view(self.good)),
|
||||
)
|
||||
if self.bad:
|
||||
logger.debug(
|
||||
self.path + " -> bad: \033[91m%s\033[0m",
|
||||
", ".join(self._view(self.bad)),
|
||||
)
|
||||
|
||||
def _view(self, array):
|
||||
return [
|
||||
"file" if index == 0 else "sample {} (at {} seconds)".format(index, (index - 1) * self.sample_length)
|
||||
for index in array
|
||||
]
|
||||
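A sketch of the stats classes above; it requires the `sox` binary on PATH, and the path and thresholds are examples only:

from aircox.controllers.sound_stats import SoundStats

stats = SoundStats("/srv/sounds/episode.ogg", sample_length=120)
stats.analyse()                          # whole file plus 120s samples
stats.check("RMS lev dB", -30.0, -5.0)   # fills stats.good / stats.bad
print(stats.good, stats.bad)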
@ -1,50 +1,51 @@
|
||||
import datetime
|
||||
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.urls.converters import StringConverter
|
||||
from django.utils.safestring import mark_safe
|
||||
|
||||
from .utils import str_to_date
|
||||
__all__ = ("PagePathConverter", "WeekConverter", "DateConverter")
|
||||
|
||||
|
||||
class PagePathConverter(StringConverter):
|
||||
""" Match path for pages, including surrounding slashes. """
|
||||
regex = r'/?|([-_a-zA-Z0-9]+/)*?'
|
||||
"""Match path for pages, including surrounding slashes."""
|
||||
|
||||
regex = r"/?|([-_a-zA-Z0-9]+/)*?"
|
||||
|
||||
def to_python(self, value):
|
||||
if not value or value[0] != '/':
|
||||
value = '/' + value
|
||||
if len(value) > 1 and value[-1] != '/':
|
||||
value = value + '/'
|
||||
if not value or value[0] != "/":
|
||||
value = "/" + value
|
||||
if len(value) > 1 and value[-1] != "/":
|
||||
value = value + "/"
|
||||
return value
|
||||
|
||||
def to_url(self, value):
|
||||
if value[0] == '/':
|
||||
if value[0] == "/":
|
||||
value = value[1:]
|
||||
if value[-1] != '/':
|
||||
value = value + '/'
|
||||
if value[-1] != "/":
|
||||
value = value + "/"
|
||||
return mark_safe(value)
|
||||
|
||||
|
||||
class WeekConverter:
|
||||
""" Converter for date as YYYYY/WW """
|
||||
regex = r'[0-9]{4}/[0-9]{2}'
|
||||
"""Converter for date as YYYYY/WW."""
|
||||
|
||||
regex = r"[0-9]{4}/[0-9]{2}"
|
||||
|
||||
def to_python(self, value):
|
||||
return datetime.datetime.strptime(value + '/1', '%G/%V/%u').date()
|
||||
return datetime.datetime.strptime(value + "/1", "%G/%V/%u").date()
|
||||
|
||||
def to_url(self, value):
|
||||
return value if isinstance(value, str) else \
|
||||
'{:04d}/{:02d}'.format(*value.isocalendar())
|
||||
return value if isinstance(value, str) else "{:04d}/{:02d}".format(*value.isocalendar())
|
||||
|
||||
|
||||
class DateConverter:
|
||||
""" Converter for date as YYYY/MM/DD """
|
||||
regex = r'[0-9]{4}/[0-9]{2}/[0-9]{2}'
|
||||
"""Converter for date as YYYY/MM/DD."""
|
||||
|
||||
regex = r"[0-9]{4}/[0-9]{2}/[0-9]{2}"
|
||||
|
||||
def to_python(self, value):
|
||||
value = value.split('/')[:3]
|
||||
value = value.split("/")[:3]
|
||||
return datetime.date(int(value[0]), int(value[1]), int(value[2]))
|
||||
|
||||
def to_url(self, value):
|
||||
return value if isinstance(value, str) else \
|
||||
'{:04d}/{:02d}/{:02d}'.format(value.year, value.month, value.day)
|
||||
return value if isinstance(value, str) else "{:04d}/{:02d}/{:02d}".format(value.year, value.month, value.day)
|
||||
|
||||
31
aircox/filters.py
Normal file
@ -0,0 +1,31 @@
|
||||
import django_filters as filters
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from .models import Episode, Page
|
||||
|
||||
|
||||
class PageFilters(filters.FilterSet):
|
||||
q = filters.CharFilter(method="search_filter", label=_("Search"))
|
||||
|
||||
class Meta:
|
||||
model = Page
|
||||
fields = {
|
||||
"category__id": ["in", "exact"],
|
||||
"pub_date": ["exact", "gte", "lte"],
|
||||
}
|
||||
|
||||
def search_filter(self, queryset, name, value):
|
||||
return queryset.search(value)
|
||||
|
||||
|
||||
class EpisodeFilters(PageFilters):
|
||||
podcast = filters.BooleanFilter(method="podcast_filter", label=_("Podcast"))
|
||||
|
||||
class Meta:
|
||||
model = Episode
|
||||
fields = PageFilters.Meta.fields.copy()
|
||||
|
||||
def podcast_filter(self, queryset, name, value):
|
||||
if value:
|
||||
return queryset.filter(sound__is_public=True).distinct()
|
||||
return queryset.filter(sound__isnull=True)
|
||||
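A sketch of the usual django-filter wiring for these filtersets; the view integration is assumed and not shown in this changeset:

from aircox.filters import EpisodeFilters
from aircox.models import Episode

filterset = EpisodeFilters(
    {"q": "jazz", "podcast": "true"},        # e.g. request.GET
    queryset=Episode.objects.all(),
)
episodes = filterset.qs                      # search and podcast filters applied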
@ -1,7 +1,12 @@
|
||||
from django import forms
|
||||
from django.forms import ModelForm
|
||||
from django.forms import ModelForm, ImageField, FileField
|
||||
|
||||
from .models import Comment
|
||||
from ckeditor.fields import RichTextField
|
||||
from filer.models.imagemodels import Image
|
||||
from filer.models.filemodels import File
|
||||
|
||||
from aircox.models import Comment, Episode, Program
|
||||
from aircox.controllers.sound_file import SoundFile
|
||||
|
||||
|
||||
class CommentForm(ModelForm):
|
||||
@ -9,12 +14,45 @@ class CommentForm(ModelForm):
|
||||
email = forms.EmailField(required=False)
|
||||
content = forms.CharField(widget=forms.Textarea())
|
||||
|
||||
nickname.widget.attrs.update({'class': 'input'})
|
||||
email.widget.attrs.update({'class': 'input'})
|
||||
content.widget.attrs.update({'class': 'textarea'})
|
||||
nickname.widget.attrs.update({"class": "input"})
|
||||
email.widget.attrs.update({"class": "input"})
|
||||
content.widget.attrs.update({"class": "textarea"})
|
||||
|
||||
class Meta:
|
||||
model = Comment
|
||||
fields = ['nickname', 'email', 'content']
|
||||
fields = ["nickname", "email", "content"]
|
||||
|
||||
|
||||
class ProgramForm(ModelForm):
|
||||
content = RichTextField()
|
||||
new_cover = ImageField(required=False)
|
||||
|
||||
class Meta:
|
||||
model = Program
|
||||
fields = ["content"]
|
||||
|
||||
def save(self, commit=True):
|
||||
file_obj = self.cleaned_data["new_cover"]
|
||||
if file_obj:
|
||||
obj, _ = Image.objects.get_or_create(original_filename=file_obj.name, file=file_obj)
|
||||
self.instance.cover = obj
|
||||
super().save(commit=commit)
|
||||
|
||||
|
||||
class EpisodeForm(ModelForm):
|
||||
content = RichTextField()
|
||||
new_podcast = FileField(required=False)
|
||||
|
||||
class Meta:
|
||||
model = Episode
|
||||
fields = ["content"]
|
||||
|
||||
def save(self, commit=True):
|
||||
file_obj = self.cleaned_data["new_podcast"]
|
||||
if file_obj:
|
||||
obj, _ = File.objects.get_or_create(original_filename=file_obj.name, file=file_obj)
|
||||
sound_file = SoundFile(obj.path)
|
||||
sound_file.sync(
|
||||
program=self.instance.program, episode=self.instance, type=0, is_public=True, is_downloadable=True
|
||||
)
|
||||
super().save(commit=commit)
|
||||
|
||||
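A sketch of how the new EpisodeForm save() path might be used in a view; the module path `aircox.forms` and the `request`/`episode` names are assumptions:

from aircox.forms import EpisodeForm

form = EpisodeForm(request.POST, request.FILES, instance=episode)
if form.is_valid():
    form.save()   # creates a filer File and syncs it through SoundFile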
Binary file not shown.
File diff suppressed because it is too large
@ -1,41 +1,47 @@
|
||||
"""Handle archiving of logs in order to keep database light and fast.
|
||||
|
||||
The logs are archived in gzip files, per day.
|
||||
"""
|
||||
Handle archiving of logs in order to keep database light and fast. The
|
||||
logs are archived in gzip files, per day.
|
||||
"""
|
||||
from argparse import RawTextHelpFormatter
|
||||
import datetime
|
||||
import logging
|
||||
from argparse import RawTextHelpFormatter
|
||||
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.utils import timezone as tz
|
||||
|
||||
import aircox.settings as settings
|
||||
from aircox.models import Log, Station
|
||||
from aircox.conf import settings
|
||||
from aircox.models import Log
|
||||
from aircox.models.log import LogArchiver
|
||||
|
||||
logger = logging.getLogger('aircox.commands')
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class Command (BaseCommand):
|
||||
__all__ = ("Command",)
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = __doc__
|
||||
|
||||
def add_arguments(self, parser):
|
||||
parser.formatter_class = RawTextHelpFormatter
|
||||
group = parser.add_argument_group('actions')
|
||||
group = parser.add_argument_group("actions")
|
||||
group.add_argument(
|
||||
'-a', '--age', type=int,
|
||||
default=settings.AIRCOX_LOGS_ARCHIVES_AGE,
|
||||
help='minimal age in days of logs to archive. Default is '
|
||||
'settings.AIRCOX_LOGS_ARCHIVES_AGE'
|
||||
"-a",
|
||||
"--age",
|
||||
type=int,
|
||||
default=settings.LOGS_ARCHIVES_AGE,
|
||||
help="minimal age in days of logs to archive. Default is " "settings.LOGS_ARCHIVES_AGE",
|
||||
)
|
||||
group.add_argument(
|
||||
'-k', '--keep', action='store_true',
|
||||
help='keep logs in database instead of deleting them'
|
||||
"-k",
|
||||
"--keep",
|
||||
action="store_true",
|
||||
help="keep logs in database instead of deleting them",
|
||||
)
|
||||
|
||||
def handle(self, *args, age, keep, **options):
|
||||
date = datetime.date.today() - tz.timedelta(days=age)
|
||||
# FIXME: mysql support?
|
||||
logger.info('archive logs for %s and earlier', date)
|
||||
logger.info("archive logs for %s and earlier", date)
|
||||
count = LogArchiver().archive(Log.objects.filter(date__date__lte=date))
|
||||
logger.info('total log archived %d', count)
|
||||
logger.info("total log archived %d", count)
|
||||
|
||||
@ -1,5 +1,4 @@
|
||||
"""
|
||||
Manage diffusions using schedules, to update, clean up or check diffusions.
|
||||
"""Manage diffusions using schedules, to update, clean up or check diffusions.
|
||||
|
||||
A generated diffusion can be unconfirmed, that means that the user must confirm
|
||||
it by changing its type to "normal". The behaviour is controlled using
|
||||
@ -10,47 +9,11 @@ import logging
|
||||
from argparse import RawTextHelpFormatter
|
||||
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.db import transaction
|
||||
from django.utils import timezone as tz
|
||||
|
||||
from aircox.models import Schedule, Diffusion
|
||||
from aircox.controllers.diffusion_monitor import DiffusionMonitor
|
||||
|
||||
logger = logging.getLogger('aircox.commands')
|
||||
|
||||
|
||||
class Actions:
|
||||
date = None
|
||||
|
||||
def __init__(self, date):
|
||||
self.date = date or datetime.date.today()
|
||||
|
||||
def update(self):
|
||||
episodes, diffusions = [], []
|
||||
for schedule in Schedule.objects.filter(program__active=True,
|
||||
initial__isnull=True):
|
||||
eps, diffs = schedule.diffusions_of_month(self.date)
|
||||
|
||||
episodes += eps
|
||||
diffusions += diffs
|
||||
|
||||
logger.info('[update] %s: %d episodes, %d diffusions and reruns',
|
||||
str(schedule), len(eps), len(diffs))
|
||||
|
||||
with transaction.atomic():
|
||||
logger.info('[update] save %d episodes and %d diffusions',
|
||||
len(episodes), len(diffusions))
|
||||
for episode in episodes:
|
||||
episode.save()
|
||||
for diffusion in diffusions:
|
||||
# force episode id's update
|
||||
diffusion.episode = diffusion.episode
|
||||
diffusion.save()
|
||||
|
||||
def clean(self):
|
||||
qs = Diffusion.objects.filter(type=Diffusion.TYPE_UNCONFIRMED,
|
||||
start__lt=self.date)
|
||||
logger.info('[clean] %d diffusions will be removed', qs.count())
|
||||
qs.delete()
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
@ -60,45 +23,54 @@ class Command(BaseCommand):
|
||||
parser.formatter_class = RawTextHelpFormatter
|
||||
today = datetime.date.today()
|
||||
|
||||
group = parser.add_argument_group('action')
|
||||
group = parser.add_argument_group("action")
|
||||
group.add_argument(
|
||||
'-u', '--update', action='store_true',
|
||||
help='generate (unconfirmed) diffusions for the given month. '
|
||||
'These diffusions must be confirmed manually by changing '
|
||||
'their type to "normal"'
|
||||
"-u",
|
||||
"--update",
|
||||
action="store_true",
|
||||
help="generate (unconfirmed) diffusions for the given month. "
|
||||
"These diffusions must be confirmed manually by changing "
|
||||
'their type to "normal"',
|
||||
)
|
||||
group.add_argument(
|
||||
'-l', '--clean', action='store_true',
|
||||
help='remove unconfirmed diffusions older than the given month'
|
||||
"-l",
|
||||
"--clean",
|
||||
action="store_true",
|
||||
help="remove unconfirmed diffusions older than the given month",
|
||||
)
|
||||
|
||||
group = parser.add_argument_group('date')
|
||||
group = parser.add_argument_group("date")
|
||||
group.add_argument(
|
||||
'--year', type=int, default=today.year,
|
||||
help='used by update, default is today\'s year')
|
||||
"--year",
|
||||
type=int,
|
||||
default=today.year,
|
||||
help="used by update, default is today's year",
|
||||
)
|
||||
group.add_argument(
|
||||
'--month', type=int, default=today.month,
|
||||
help='used by update, default is today\'s month')
|
||||
"--month",
|
||||
type=int,
|
||||
default=today.month,
|
||||
help="used by update, default is today's month",
|
||||
)
|
||||
group.add_argument(
|
||||
'--next-month', action='store_true',
|
||||
help='set the date to the next month of given date'
|
||||
' (if next month from today'
|
||||
"--next-month",
|
||||
action="store_true",
|
||||
help="set the date to the next month of given date" " (if next month from today",
|
||||
)
|
||||
|
||||
def handle(self, *args, **options):
|
||||
date = datetime.date(year=options['year'], month=options['month'],
|
||||
day=1)
|
||||
if options.get('next_month'):
|
||||
month = options.get('month')
|
||||
date = datetime.date(year=options["year"], month=options["month"], day=1)
|
||||
if options.get("next_month"):
|
||||
month = options.get("month")
|
||||
date += tz.timedelta(days=28)
|
||||
if date.month == month:
|
||||
date += tz.timedelta(days=28)
|
||||
date = date.replace(day=1)
|
||||
|
||||
actions = Actions(date)
|
||||
if options.get('update'):
|
||||
actions = DiffusionMonitor(date)
|
||||
if options.get("update"):
|
||||
actions.update()
|
||||
if options.get('clean'):
|
||||
if options.get("clean"):
|
||||
actions.clean()
|
||||
if options.get('check'):
|
||||
if options.get("check"):
|
||||
actions.check()
|
||||
|
||||
@ -1,111 +1,32 @@
|
||||
"""
|
||||
Import one or more playlist for the given sound. Attach it to the provided
|
||||
"""Import one or more playlist for the given sound. Attach it to the provided
|
||||
sound.
|
||||
|
||||
Playlists are in CSV format, where columns are separated with a
|
||||
'{settings.AIRCOX_IMPORT_PLAYLIST_CSV_DELIMITER}'. Text quote is
|
||||
{settings.AIRCOX_IMPORT_PLAYLIST_CSV_TEXT_QUOTE}.
|
||||
The order of the elements is: {settings.AIRCOX_IMPORT_PLAYLIST_CSV_COLS}
|
||||
'{settings.IMPORT_PLAYLIST_CSV_DELIMITER}'. Text quote is
|
||||
{settings.IMPORT_PLAYLIST_CSV_TEXT_QUOTE}.
|
||||
The order of the elements is: {settings.IMPORT_PLAYLIST_CSV_COLS}
|
||||
|
||||
If 'minutes' or 'seconds' are given, position will be expressed as timed
|
||||
position, instead of position in playlist.
|
||||
"""
|
||||
import os
|
||||
import csv
|
||||
import logging
|
||||
import os
|
||||
from argparse import RawTextHelpFormatter
|
||||
|
||||
from django.core.management.base import BaseCommand, CommandError
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.management.base import BaseCommand
|
||||
|
||||
from aircox.conf import settings
|
||||
from aircox.models import Sound
|
||||
from aircox.controllers.playlist_import import PlaylistImport
|
||||
|
||||
from aircox import settings
|
||||
from aircox.models import *
|
||||
|
||||
__doc__ = __doc__.format(settings=settings)
|
||||
|
||||
logger = logging.getLogger('aircox.commands')
|
||||
|
||||
__all__ = ("Command",)
|
||||
|
||||
|
||||
class PlaylistImport:
|
||||
path = None
|
||||
data = None
|
||||
tracks = None
|
||||
track_kwargs = {}
|
||||
|
||||
def __init__(self, path=None, **track_kwargs):
|
||||
self.path = path
|
||||
self.track_kwargs = track_kwargs
|
||||
|
||||
def reset(self):
|
||||
self.data = None
|
||||
self.tracks = None
|
||||
|
||||
def run(self):
|
||||
self.read()
|
||||
if self.track_kwargs.get('sound') is not None:
|
||||
self.make_playlist()
|
||||
|
||||
def read(self):
|
||||
if not os.path.exists(self.path):
|
||||
return True
|
||||
with open(self.path, 'r') as file:
|
||||
logger.info('start reading csv ' + self.path)
|
||||
self.data = list(csv.DictReader(
|
||||
(row for row in file
|
||||
if not (row.startswith('#') or row.startswith('\ufeff#'))
|
||||
and row.strip()),
|
||||
fieldnames=settings.AIRCOX_IMPORT_PLAYLIST_CSV_COLS,
|
||||
delimiter=settings.AIRCOX_IMPORT_PLAYLIST_CSV_DELIMITER,
|
||||
quotechar=settings.AIRCOX_IMPORT_PLAYLIST_CSV_TEXT_QUOTE,
|
||||
))
|
||||
|
||||
def make_playlist(self):
|
||||
"""
|
||||
Make a playlist from the read data, and return it. If save is
|
||||
true, save it into the database
|
||||
"""
|
||||
if self.track_kwargs.get('sound') is None:
|
||||
logger.error('related track\'s sound is missing. Skip import of ' +
|
||||
self.path + '.')
|
||||
return
|
||||
|
||||
maps = settings.AIRCOX_IMPORT_PLAYLIST_CSV_COLS
|
||||
tracks = []
|
||||
|
||||
logger.info('parse csv file ' + self.path)
|
||||
has_timestamp = ('minutes' or 'seconds') in maps
|
||||
for index, line in enumerate(self.data):
|
||||
if ('title' or 'artist') not in line:
|
||||
return
|
||||
try:
|
||||
timestamp = int(line.get('minutes') or 0) * 60 + \
|
||||
int(line.get('seconds') or 0) \
|
||||
if has_timestamp else None
|
||||
|
||||
track, created = Track.objects.get_or_create(
|
||||
title=line.get('title'),
|
||||
artist=line.get('artist'),
|
||||
position=index,
|
||||
**self.track_kwargs
|
||||
)
|
||||
track.timestamp = timestamp
|
||||
print('track', track, timestamp)
|
||||
track.info = line.get('info')
|
||||
tags = line.get('tags')
|
||||
if tags:
|
||||
track.tags.add(*tags.split(','))
|
||||
except Exception as err:
|
||||
logger.warning(
|
||||
'an error occured for track {index}, it may not '
|
||||
'have been saved: {err}'
|
||||
.format(index=index, err=err)
|
||||
)
|
||||
continue
|
||||
|
||||
track.save()
|
||||
tracks.append(track)
|
||||
self.tracks = tracks
|
||||
return tracks
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
@ -114,33 +35,36 @@ class Command(BaseCommand):
|
||||
def add_arguments(self, parser):
|
||||
parser.formatter_class = RawTextHelpFormatter
|
||||
parser.add_argument(
|
||||
'path', metavar='PATH', type=str,
|
||||
help='path of the input playlist to read'
|
||||
"path",
|
||||
metavar="PATH",
|
||||
type=str,
|
||||
help="path of the input playlist to read",
|
||||
)
|
||||
parser.add_argument(
|
||||
'--sound', '-s', type=str,
|
||||
help='generate a playlist for the sound of the given path. '
|
||||
'If not given, try to match a sound with the same path.'
|
||||
"--sound",
|
||||
"-s",
|
||||
type=str,
|
||||
help="generate a playlist for the sound of the given path. "
|
||||
"If not given, try to match a sound with the same path.",
|
||||
)
|
||||
|
||||
def handle(self, path, *args, **options):
|
||||
# FIXME: absolute/relative path of sounds vs given path
|
||||
if options.get('sound'):
|
||||
sound = Sound.objects.filter(path__icontains=options.get('sound'))\
|
||||
.first()
|
||||
if options.get("sound"):
|
||||
sound = Sound.objects.filter(file__icontains=options.get("sound")).first()
|
||||
else:
|
||||
path_, ext = os.path.splitext(path)
|
||||
sound = Sound.objects.filter(path__icontains=path_).first()
|
||||
|
||||
if not sound:
|
||||
logger.error('no sound found in the database for the path '
|
||||
'{path}'.format(path=path))
|
||||
logger.error("no sound found in the database for the path " "{path}".format(path=path))
|
||||
return
|
||||
|
||||
# FIXME: auto get sound.episode if any
|
||||
importer = PlaylistImport(path, sound=sound).run()
|
||||
for track in importer.tracks:
|
||||
logger.info('track #{pos} imported: {title}, by {artist}'.format(
|
||||
pos=track.position, title=track.title, artist=track.artist
|
||||
))
|
||||
|
||||
logger.info(
|
||||
"track #{pos} imported: {title}, by {artist}".format(
|
||||
pos=track.position, title=track.title, artist=track.artist
|
||||
)
|
||||
)
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
#! /usr/bin/env python3
|
||||
|
||||
"""
|
||||
Monitor sound files; For each program, check for:
|
||||
"""Monitor sound files; For each program, check for:
|
||||
|
||||
- new files;
|
||||
- deleted files;
|
||||
- differences between files and sound;
|
||||
@ -20,333 +20,47 @@ Where:
|
||||
|
||||
|
||||
To check quality of files, call the command sound_quality_check using the
|
||||
parameters given by the setting AIRCOX_SOUND_QUALITY. This script requires
|
||||
parameters given by the setting SOUND_QUALITY. This script requires
|
||||
Sox (and soxi).
|
||||
"""
|
||||
from argparse import RawTextHelpFormatter
|
||||
import concurrent.futures as futures
|
||||
import datetime
|
||||
import atexit
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
import time
|
||||
from argparse import RawTextHelpFormatter
|
||||
|
||||
import mutagen
|
||||
from watchdog.observers import Observer
|
||||
from watchdog.events import PatternMatchingEventHandler, FileModifiedEvent
|
||||
from django.core.management.base import BaseCommand
|
||||
|
||||
from django.core.management.base import BaseCommand, CommandError
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.translation import gettext as _
|
||||
from aircox.controllers.sound_monitor import SoundMonitor
|
||||
|
||||
from aircox import settings, utils
|
||||
from aircox.models import Diffusion, Program, Sound, Track
|
||||
from .import_playlist import PlaylistImport
|
||||
|
||||
logger = logging.getLogger('aircox.commands')
|
||||
|
||||
|
||||
sound_path_re = re.compile(
|
||||
'^(?P<year>[0-9]{4})(?P<month>[0-9]{2})(?P<day>[0-9]{2})'
|
||||
'(_(?P<hour>[0-9]{2})h(?P<minute>[0-9]{2}))?'
|
||||
'(_(?P<n>[0-9]+))?'
|
||||
'_?(?P<name>.*)$'
|
||||
)
|
||||
|
||||
|
||||
class SoundFile:
|
||||
path = None
|
||||
info = None
|
||||
path_info = None
|
||||
sound = None
|
||||
|
||||
def __init__(self, path):
|
||||
self.path = path
|
||||
|
||||
def sync(self, sound=None, program=None, deleted=False, **kwargs):
|
||||
"""
|
||||
Update related sound model and save it.
|
||||
"""
|
||||
if deleted:
|
||||
sound = Sound.objects.filter(path=self.path).first()
|
||||
if sound:
|
||||
sound.type = sound.TYPE_REMOVED
|
||||
sound.check_on_file()
|
||||
sound.save()
|
||||
return sound
|
||||
|
||||
# FIXME: sound.program as not null
|
||||
program = kwargs['program'] = Program.get_from_path(self.path)
|
||||
sound, created = Sound.objects.get_or_create(path=self.path, defaults=kwargs) \
|
||||
if not sound else (sound, False)
|
||||
self.sound = sound
|
||||
|
||||
sound.program = program
|
||||
if created or sound.check_on_file():
|
||||
logger.info('sound is new or have been modified -> %s', self.path)
|
||||
self.read_path()
|
||||
sound.name = self.path_info.get('name')
|
||||
|
||||
self.read_file_info()
|
||||
if self.info is not None:
|
||||
sound.duration = utils.seconds_to_time(self.info.info.length)
|
||||
|
||||
# check for episode
|
||||
if sound.episode is None and self.read_path():
|
||||
self.find_episode(program)
|
||||
|
||||
sound.save()
|
||||
if self.info is not None:
|
||||
self.find_playlist(sound)
|
||||
return sound
|
||||
|
||||
def read_path(self):
|
||||
"""
|
||||
Parse file name to get info on the assumption it has the correct
|
||||
format (given in Command.help). Return True if path contains informations.
|
||||
"""
|
||||
if self.path_info:
|
||||
return 'year' in self.path_info
|
||||
|
||||
name = os.path.splitext(os.path.basename(self.path))[0]
|
||||
match = sound_path_re.search(name)
|
||||
if match:
|
||||
path_info = match.groupdict()
|
||||
for k in ('year', 'month', 'day', 'hour', 'minute'):
|
||||
if path_info.get(k) is not None:
|
||||
path_info[k] = int(path_info[k])
|
||||
self.path_info = path_info
|
||||
return True
|
||||
else:
|
||||
self.path_info = {'name': name}
|
||||
return False
|
||||
|
||||
def read_file_info(self):
|
||||
""" Read file information and metadata. """
|
||||
if os.path.exists(self.path):
|
||||
self.info = mutagen.File(self.path)
|
||||
else:
|
||||
self.info = None
|
||||
|
||||
def find_episode(self, program):
|
||||
"""
|
||||
For a given program, check if there is an initial diffusion
|
||||
to associate to, using the date info we have. Update self.sound
|
||||
and save it consequently.
|
||||
|
||||
We only allow initial diffusion since there should be no
|
||||
rerun.
|
||||
"""
|
||||
pi = self.path_info
|
||||
if 'year' not in pi or not self.sound or self.sound.episode:
|
||||
return None
|
||||
|
||||
if pi.get('hour') is not None:
|
||||
date = tz.datetime(pi.get('year'), pi.get('month'), pi.get('day'),
|
||||
pi.get('hour') or 0, pi.get('minute') or 0)
|
||||
date = tz.get_current_timezone().localize(date)
|
||||
else:
|
||||
date = datetime.date(pi.get('year'), pi.get('month'), pi.get('day'))
|
||||
|
||||
diffusion = program.diffusion_set.at(date).first()
|
||||
if not diffusion:
|
||||
return None
|
||||
|
||||
logger.info('%s <--> %s', self.sound.path, str(diffusion.episode))
|
||||
self.sound.episode = diffusion.episode
|
||||
return diffusion
|
||||
|
||||
def find_playlist(self, sound=None, use_meta=True):
|
||||
"""
|
||||
Find a playlist file corresponding to the sound path, such as:
|
||||
my_sound.ogg => my_sound.csv
|
||||
|
||||
Use sound's file metadata if no corresponding playlist has been
|
||||
found and `use_meta` is True.
|
||||
"""
|
||||
if sound is None:
|
||||
sound = self.sound
|
||||
|
||||
if sound.track_set.count():
|
||||
return
|
||||
|
||||
# import playlist
|
||||
path = os.path.splitext(self.sound.path)[0] + '.csv'
|
||||
if os.path.exists(path):
|
||||
PlaylistImport(path, sound=sound).run()
|
||||
# use metadata
|
||||
elif use_meta:
|
||||
if self.info is None:
|
||||
self.read_file_info()
|
||||
if self.info.tags:
|
||||
tags = self.info.tags
|
||||
info = '{} ({})'.format(tags.get('album'), tags.get('year')) \
|
||||
if ('album' and 'year' in tags) else tags.get('album') \
|
||||
if 'album' in tags else tags.get('year', '')
|
||||
|
||||
track = Track(sound=sound,
|
||||
position=int(tags.get('tracknumber', 0)),
|
||||
title=tags.get('title', self.path_info['name']),
|
||||
artist=tags.get('artist', _('unknown')),
|
||||
info=info)
|
||||
track.save()
|
||||
|
||||
|
||||
class MonitorHandler(PatternMatchingEventHandler):
|
||||
"""
|
||||
Event handler for watchdog, in order to be used in monitoring.
|
||||
"""
|
||||
pool = None
|
||||
|
||||
def __init__(self, subdir, pool):
|
||||
"""
|
||||
subdir: AIRCOX_SOUND_ARCHIVES_SUBDIR or AIRCOX_SOUND_EXCERPTS_SUBDIR
|
||||
"""
|
||||
self.subdir = subdir
|
||||
self.pool = pool
|
||||
|
||||
if self.subdir == settings.AIRCOX_SOUND_ARCHIVES_SUBDIR:
|
||||
self.sound_kwargs = {'type': Sound.TYPE_ARCHIVE}
|
||||
else:
|
||||
self.sound_kwargs = {'type': Sound.TYPE_EXCERPT}
|
||||
|
||||
patterns = ['*/{}/*{}'.format(self.subdir, ext)
|
||||
for ext in settings.AIRCOX_SOUND_FILE_EXT]
|
||||
super().__init__(patterns=patterns, ignore_directories=True)
|
||||
|
||||
def on_created(self, event):
|
||||
self.on_modified(event)
|
||||
|
||||
def on_modified(self, event):
|
||||
logger.info('sound modified: %s', event.src_path)
|
||||
def updated(event, sound_kwargs):
|
||||
SoundFile(event.src_path).sync(**sound_kwargs)
|
||||
self.pool.submit(updated, event, self.sound_kwargs)
|
||||
|
||||
def on_moved(self, event):
|
||||
logger.info('sound moved: %s -> %s', event.src_path, event.dest_path)
|
||||
def moved(event, sound_kwargs):
|
||||
sound = Sound.objects.filter(path=event.src_path)
|
||||
sound_file = SoundFile(event.dest_path) if not sound else sound
|
||||
sound_file.sync(**sound_kwargs)
|
||||
self.pool.submit(moved, event, self.sound_kwargs)
|
||||
|
||||
def on_deleted(self, event):
|
||||
logger.info('sound deleted: %s', event.src_path)
|
||||
def deleted(event):
|
||||
SoundFile(event.src_path).sync(deleted=True)
|
||||
self.pool.submit(deleted, event.src_path)
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = __doc__
|
||||
|
||||
def report(self, program=None, component=None, *content):
|
||||
if not component:
|
||||
logger.info('%s: %s', str(program),
|
||||
' '.join([str(c) for c in content]))
|
||||
else:
|
||||
logger.info('%s, %s: %s', str(program), str(component),
|
||||
' '.join([str(c) for c in content]))
|
||||
|
||||
def scan(self):
|
||||
"""
|
||||
For all programs, scan dirs
|
||||
"""
|
||||
logger.info('scan all programs...')
|
||||
programs = Program.objects.filter()
|
||||
|
||||
dirs = []
|
||||
for program in programs:
|
||||
logger.info('#%d %s', program.id, program.title)
|
||||
self.scan_for_program(
|
||||
program, settings.AIRCOX_SOUND_ARCHIVES_SUBDIR,
|
||||
type=Sound.TYPE_ARCHIVE,
|
||||
)
|
||||
self.scan_for_program(
|
||||
program, settings.AIRCOX_SOUND_EXCERPTS_SUBDIR,
|
||||
type=Sound.TYPE_EXCERPT,
|
||||
)
|
||||
dirs.append(os.path.join(program.path))
|
||||
|
||||
def scan_for_program(self, program, subdir, **sound_kwargs):
|
||||
"""
|
||||
Scan a given directory that is associated to the given program, and
|
||||
update sounds information.
|
||||
"""
|
||||
logger.info('- %s/', subdir)
|
||||
if not program.ensure_dir(subdir):
|
||||
return
|
||||
|
||||
subdir = os.path.join(program.path, subdir)
|
||||
sounds = []
|
||||
|
||||
# sounds in directory
|
||||
for path in os.listdir(subdir):
|
||||
path = os.path.join(subdir, path)
|
||||
if not path.endswith(settings.AIRCOX_SOUND_FILE_EXT):
|
||||
continue
|
||||
|
||||
sound_file = SoundFile(path)
|
||||
sound_file.sync(program=program, **sound_kwargs)
|
||||
sounds.append(sound_file.sound.pk)
|
||||
|
||||
# sounds in db & unchecked
|
||||
sounds = Sound.objects.filter(path__startswith=subdir). \
|
||||
exclude(pk__in=sounds)
|
||||
self.check_sounds(sounds, program=program)
|
||||
|
||||
def check_sounds(self, qs, **sync_kwargs):
|
||||
""" Only check for the sound existence or update """
|
||||
# check files
|
||||
for sound in qs:
|
||||
if sound.check_on_file():
|
||||
SoundFile(sound.path).sync(sound=sound, **sync_kwargs)
|
||||
|
||||
def monitor(self):
|
||||
""" Run in monitor mode """
|
||||
with futures.ThreadPoolExecutor() as pool:
|
||||
archives_handler = MonitorHandler(settings.AIRCOX_SOUND_ARCHIVES_SUBDIR, pool)
|
||||
excerpts_handler = MonitorHandler(settings.AIRCOX_SOUND_EXCERPTS_SUBDIR, pool)
|
||||
|
||||
observer = Observer()
|
||||
observer.schedule(archives_handler, settings.AIRCOX_PROGRAMS_DIR,
|
||||
recursive=True)
|
||||
observer.schedule(excerpts_handler, settings.AIRCOX_PROGRAMS_DIR,
|
||||
recursive=True)
|
||||
observer.start()
|
||||
|
||||
def leave():
|
||||
observer.stop()
|
||||
observer.join()
|
||||
atexit.register(leave)
|
||||
|
||||
while True:
|
||||
time.sleep(1)
|
||||
|
||||
def add_arguments(self, parser):
|
||||
parser.formatter_class = RawTextHelpFormatter
|
||||
parser.add_argument(
|
||||
'-q', '--quality_check', action='store_true',
|
||||
help='Enable quality check using sound_quality_check on all '
|
||||
'sounds marqued as not good'
|
||||
"-q",
|
||||
"--quality_check",
|
||||
action="store_true",
|
||||
help="Enable quality check using sound_quality_check on all " "sounds marqued as not good",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-s', '--scan', action='store_true',
|
||||
help='Scan programs directories for changes, plus check for a '
|
||||
' matching diffusion on sounds that have not been yet assigned'
|
||||
"-s",
|
||||
"--scan",
|
||||
action="store_true",
|
||||
help="Scan programs directories for changes, plus check for a "
|
||||
" matching diffusion on sounds that have not been yet assigned",
|
||||
)
|
||||
parser.add_argument(
|
||||
'-m', '--monitor', action='store_true',
|
||||
help='Run in monitor mode, watch for modification in the filesystem '
|
||||
'and react in consequence'
|
||||
"-m",
|
||||
"--monitor",
|
||||
action="store_true",
|
||||
help="Run in monitor mode, watch for modification in the " "filesystem and react in consequence",
|
||||
)
|
||||
|
||||
    def handle(self, *args, **options):
        monitor = SoundMonitor()
        if options.get("scan"):
            monitor.scan()
        # if options.get('quality_check'):
        #     self.check_quality(check=(not options.get('scan')))
        if options.get("monitor"):
            monitor.monitor()

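    # Illustrative invocation (the command's registered name is not visible in this
    # hunk; "sounds_monitor" is assumed from context):
    #
    #   ./manage.py sounds_monitor --scan       # one-shot scan of program directories
    #   ./manage.py sounds_monitor --monitor    # keep watching the filesystem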
@@ -1,119 +1,15 @@
|
||||
"""
|
||||
Analyse and check files using Sox, prints good and bad files.
|
||||
"""
|
||||
import sys
|
||||
"""Analyse and check files using Sox, prints good and bad files."""
|
||||
import logging
|
||||
import re
|
||||
import subprocess
|
||||
from argparse import RawTextHelpFormatter
|
||||
|
||||
from django.core.management.base import BaseCommand, CommandError
|
||||
|
||||
logger = logging.getLogger('aircox.commands')
|
||||
from aircox.controllers.sound_stats import SoundStats, SoxStats
|
||||
|
||||
logger = logging.getLogger("aircox.commands")
|
||||
|
||||
|
||||
class Stats:
|
||||
attributes = [
|
||||
'DC offset', 'Min level', 'Max level',
|
||||
'Pk lev dB', 'RMS lev dB', 'RMS Pk dB',
|
||||
'RMS Tr dB', 'Flat factor', 'Length s',
|
||||
]
|
||||
|
||||
def __init__(self, path, **kwargs):
|
||||
"""
|
||||
If path is given, call analyse with path and kwargs
|
||||
"""
|
||||
self.values = {}
|
||||
if path:
|
||||
self.analyse(path, **kwargs)
|
||||
|
||||
def get(self, attr):
|
||||
return self.values.get(attr)
|
||||
|
||||
def parse(self, output):
|
||||
for attr in Stats.attributes:
|
||||
value = re.search(attr + r'\s+(?P<value>\S+)', output)
|
||||
value = value and value.groupdict()
|
||||
if value:
|
||||
try:
|
||||
value = float(value.get('value'))
|
||||
except ValueError:
|
||||
value = None
|
||||
self.values[attr] = value
|
||||
self.values['length'] = self.values['Length s']
|
||||
|
||||
def analyse(self, path, at=None, length=None):
|
||||
"""
|
||||
If at and length are given use them as excerpt to analyse.
|
||||
"""
|
||||
args = ['sox', path, '-n']
|
||||
|
||||
if at is not None and length is not None:
|
||||
args += ['trim', str(at), str(length)]
|
||||
|
||||
args.append('stats')
|
||||
|
||||
p = subprocess.Popen(args, stdout=subprocess.PIPE,
|
||||
stderr=subprocess.PIPE)
|
||||
# sox outputs to stderr (my god WHYYYY)
|
||||
out_, out = p.communicate()
|
||||
self.parse(str(out, encoding='utf-8'))
|
||||
|
||||
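# For reference, `sox <file> -n stats` writes its report to stderr, one attribute
# per line, roughly in the form (values illustrative):
#
#   DC offset     0.000003
#   Pk lev dB        -1.25
#   RMS lev dB      -20.52
#   Length s       120.000
#
# which is why parse() above matches each attribute name followed by whitespace and
# a single token, and why analyse() parses the second value returned by
# communicate() (stderr) rather than stdout.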
|
||||
class SoundStats:
|
||||
path = None # file path
|
||||
sample_length = 120 # default sample length in seconds
|
||||
stats = None # list of samples statistics
|
||||
bad = None # list of bad samples
|
||||
good = None # list of good samples
|
||||
|
||||
def __init__(self, path, sample_length=None):
|
||||
self.path = path
|
||||
self.sample_length = sample_length if sample_length is not None \
|
||||
else self.sample_length
|
||||
|
||||
def get_file_stats(self):
|
||||
return self.stats and self.stats[0]
|
||||
|
||||
def analyse(self):
|
||||
logger.info('complete file analysis')
|
||||
self.stats = [Stats(self.path)]
|
||||
position = 0
|
||||
length = self.stats[0].get('length')
|
||||
|
||||
if not self.sample_length:
|
||||
return
|
||||
|
||||
logger.info('start samples analysis...')
|
||||
while position < length:
|
||||
stats = Stats(self.path, at=position, length=self.sample_length)
|
||||
self.stats.append(stats)
|
||||
position += self.sample_length
|
||||
|
||||
def check(self, name, min_val, max_val):
|
||||
self.good = [index for index, stats in enumerate(self.stats)
|
||||
if min_val <= stats.get(name) <= max_val]
|
||||
self.bad = [index for index, stats in enumerate(self.stats)
|
||||
if index not in self.good]
|
||||
self.resume()
|
||||
|
||||
    def resume(self):
        def view(array):
            return [
                'file' if index == 0 else
                'sample {} (at {} seconds)'.format(
                    index, (index - 1) * self.sample_length)
                for index in array
            ]
|
||||
|
||||
if self.good:
|
||||
logger.info(self.path + ' -> good: \033[92m%s\033[0m',
|
||||
', '.join(view(self.good)))
|
||||
if self.bad:
|
||||
logger.info(self.path + ' -> bad: \033[91m%s\033[0m',
|
||||
', '.join(view(self.bad)))
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = __doc__
|
||||
sounds = None
|
||||
|
||||
@@ -121,46 +17,56 @@ class Command (BaseCommand):
|
||||
parser.formatter_class = RawTextHelpFormatter
|
||||
|
||||
        parser.add_argument(
            "files",
            metavar="FILE",
            type=str,
            nargs="+",
            help="file(s) to analyse",
        )
        parser.add_argument(
            "-s",
            "--sample_length",
            type=int,
            default=120,
            help="size of sample to analyse in seconds. If not set (or 0), "
                 "does not analyse by sample",
        )
        parser.add_argument(
            "-a",
            "--attribute",
            type=str,
            help="attribute name to check; one of:\n"
                 + ", ".join(['"{}"'.format(attr) for attr in SoxStats.attributes]),
        )
        parser.add_argument(
            "-r",
            "--range",
            type=float,
            nargs=2,
            help="range of minimal and maximal accepted values, e.g. --range min max",
        )
        parser.add_argument(
            "-i",
            "--resume",
            action="store_true",
            help="print a summary of good and bad files",
        )

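    # Illustrative invocation (attribute names come from SoxStats.attributes; the
    # thresholds are only an example):
    #
    #   ./manage.py sound_quality_check /path/to/sound.wav \
    #       -a "RMS lev dB" -r -30 -6 -i
    #
    # i.e. analyse the file in 120 s samples and report which samples have an RMS
    # level inside the accepted -30..-6 dB range (good) and which fall outside (bad).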
    def handle(self, *args, **options):
        # parameters
        minmax = options.get("range")
        if not minmax:
            raise CommandError("no range specified")

        attr = options.get("attribute")
        if not attr:
            raise CommandError("no attribute specified")

        # sound analyse and checks
        self.sounds = [SoundStats(path, options.get("sample_length")) for path in options.get("files")]
        self.bad = []
        self.good = []
        for sound in self.sounds:
            logger.info("analyse " + sound.path)
            sound.analyse()
            sound.check(attr, minmax[0], minmax[1])
            if sound.bad:
@@ -169,8 +75,8 @@ class Command (BaseCommand):
|
||||
                self.good.append(sound)

        # resume
        if options.get("resume"):
            for sound in self.good:
                logger.info("\033[92m+ %s\033[0m", sound.path)
            for sound in self.bad:
                logger.info("\033[91m+ %s\033[0m", sound.path)
|
||||
|
||||
@@ -1,48 +1,47 @@
|
||||
from zoneinfo import ZoneInfo

from django.db.models import Q
from django.utils import timezone as tz

from .models import Station
from .utils import Redirect


__all__ = ("AircoxMiddleware",)


class AircoxMiddleware(object):
    """Middleware used to get default info for the given website.

    It provides the following request attributes:
    - ``station``: current Station

    This middleware must be set after
    'django.contrib.auth.middleware.AuthenticationMiddleware'.
    """

    timezone_session_key = "aircox.timezone"

    def __init__(self, get_response):
        self.get_response = get_response

    def get_station(self, request):
        """Return station for the provided request."""
        host = request.get_host()
        expr = Q(default=True) | Q(hosts=host) | Q(hosts__contains=host + "\n")
        return Station.objects.filter(expr).order_by("default").first()

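    # Note on get_station(): Station.hosts appears to store one hostname per line,
    # so a request host matches either the exact field value or a "host\n" prefix of
    # it. Ordering by "default" makes an explicit host match win over the fallback
    # default station.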
    def init_timezone(self, request):
        # note: later we can use http://freegeoip.net/ on user side if
        # required
        timezone = None
        try:
            timezone = request.session.get(self.timezone_session_key)
            if timezone:
                timezone = ZoneInfo(timezone)
                tz.activate(timezone)
        except Exception:
            pass

        if not timezone:
            timezone = tz.get_current_timezone()
            tz.activate(timezone)

    def __call__(self, request):
        self.init_timezone(request)
        request.station = self.get_station(request)
        # a middleware must hand the request on and return the response
        return self.get_response(request)

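A minimal sketch of enabling this middleware in a project's settings, assuming the
class is importable as "aircox.middleware.AircoxMiddleware" (the module path is not
shown in this hunk). Per the docstring, it must come after Django's
AuthenticationMiddleware; it also relies on the session machinery for the timezone
key:

    MIDDLEWARE = [
        "django.middleware.security.SecurityMiddleware",
        "django.contrib.sessions.middleware.SessionMiddleware",
        "django.middleware.common.CommonMiddleware",
        "django.middleware.csrf.CsrfViewMiddleware",
        "django.contrib.auth.middleware.AuthenticationMiddleware",
        "django.contrib.messages.middleware.MessageMiddleware",
        # after AuthenticationMiddleware, as the docstring above requires
        "aircox.middleware.AircoxMiddleware",
    ]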
1706 aircox/migrations/0001_initial.py (new file; diff suppressed because it is too large)
17 aircox/migrations/0002_auto_20200526_1516.py (new file)
@@ -0,0 +1,17 @@
|
||||
# Generated by Django 3.0.6 on 2020-05-26 13:16
|
||||
|
||||
from django.db import migrations
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0001_initial"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RenameField(
|
||||
model_name="staticpage",
|
||||
old_name="view",
|
||||
new_name="attach_to",
|
||||
),
|
||||
]
|
||||
146 aircox/migrations/0003_auto_20200530_1116.py (new file)
@@ -0,0 +1,146 @@
|
||||
# Generated by Django 3.0.6 on 2020-05-30 11:16
|
||||
|
||||
from django.conf import settings
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
import filer.fields.image
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
migrations.swappable_dependency(settings.FILER_IMAGE_MODEL),
|
||||
("aircox", "0002_auto_20200526_1516"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelOptions(
|
||||
name="log",
|
||||
options={"verbose_name": "Log", "verbose_name_plural": "Logs"},
|
||||
),
|
||||
migrations.AlterModelOptions(
|
||||
name="page",
|
||||
options={
|
||||
"verbose_name": "Publication",
|
||||
"verbose_name_plural": "Publications",
|
||||
},
|
||||
),
|
||||
migrations.AlterModelOptions(
|
||||
name="program",
|
||||
options={
|
||||
"verbose_name": "Program",
|
||||
"verbose_name_plural": "Programs",
|
||||
},
|
||||
),
|
||||
migrations.RemoveField(
|
||||
model_name="article",
|
||||
name="is_static",
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name="diffusion",
|
||||
name="schedule",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="aircox.Schedule",
|
||||
verbose_name="schedule",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="diffusion",
|
||||
name="initial",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
limit_choices_to={"initial__isnull": True},
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="rerun_set",
|
||||
to="aircox.Diffusion",
|
||||
verbose_name="rerun of",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="navitem",
|
||||
name="page",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
limit_choices_to={"attach_to__isnull": True},
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="aircox.StaticPage",
|
||||
verbose_name="page",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="frequency",
|
||||
field=models.SmallIntegerField(
|
||||
choices=[
|
||||
(0, "ponctual"),
|
||||
(1, "1st {day} of the month"),
|
||||
(2, "2nd {day} of the month"),
|
||||
(4, "3rd {day} of the month"),
|
||||
(8, "4th {day} of the month"),
|
||||
(16, "last {day} of the month"),
|
||||
(5, "1st and 3rd {day} of the month"),
|
||||
(10, "2nd and 4th {day} of the month"),
|
||||
(31, "every {day}"),
|
||||
(32, "one {day} on two"),
|
||||
],
|
||||
verbose_name="frequency",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="initial",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
limit_choices_to={"initial__isnull": True},
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="rerun_set",
|
||||
to="aircox.Schedule",
|
||||
verbose_name="rerun of",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="staticpage",
|
||||
name="attach_to",
|
||||
field=models.SmallIntegerField(
|
||||
blank=True,
|
||||
choices=[
|
||||
(0, "Home page"),
|
||||
(1, "Diffusions page"),
|
||||
(2, "Logs page"),
|
||||
(3, "Programs list"),
|
||||
(4, "Episodes list"),
|
||||
(5, "Articles list"),
|
||||
],
|
||||
help_text="display this page content to related element",
|
||||
null=True,
|
||||
verbose_name="attach to",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="station",
|
||||
name="default_cover",
|
||||
field=filer.fields.image.FilerImageField(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="+",
|
||||
to=settings.FILER_IMAGE_MODEL,
|
||||
verbose_name="Default pages' cover",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="track",
|
||||
name="timestamp",
|
||||
field=models.PositiveSmallIntegerField(
|
||||
blank=True,
|
||||
help_text="position (in seconds)",
|
||||
null=True,
|
||||
verbose_name="timestamp",
|
||||
),
|
||||
),
|
||||
]
|
||||
55 aircox/migrations/0004_auto_20200921_2356.py (new file)
@@ -0,0 +1,55 @@
|
||||
# Generated by Django 3.1.1 on 2020-09-21 23:56
|
||||
|
||||
import ckeditor_uploader.fields
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0003_auto_20200530_1116"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelOptions(
|
||||
name="comment",
|
||||
options={
|
||||
"verbose_name": "Comment",
|
||||
"verbose_name_plural": "Comments",
|
||||
},
|
||||
),
|
||||
migrations.AlterModelOptions(
|
||||
name="navitem",
|
||||
options={
|
||||
"ordering": ("order", "pk"),
|
||||
"verbose_name": "Menu item",
|
||||
"verbose_name_plural": "Menu items",
|
||||
},
|
||||
),
|
||||
migrations.RemoveField(
|
||||
model_name="sound",
|
||||
name="embed",
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="page",
|
||||
name="content",
|
||||
field=ckeditor_uploader.fields.RichTextUploadingField(blank=True, null=True, verbose_name="content"),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="sound",
|
||||
name="program",
|
||||
field=models.ForeignKey(
|
||||
default=1,
|
||||
help_text="program related to it",
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="aircox.program",
|
||||
verbose_name="program",
|
||||
),
|
||||
preserve_default=False,
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="staticpage",
|
||||
name="content",
|
||||
field=ckeditor_uploader.fields.RichTextUploadingField(blank=True, null=True, verbose_name="content"),
|
||||
),
|
||||
]
|
||||
839 aircox/migrations/0005_auto_20220318_1205.py (new file)
@@ -0,0 +1,839 @@
|
||||
# Generated by Django 3.2.12 on 2022-03-18 12:05
|
||||
|
||||
import aircox.models.sound
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
import django.utils.timezone
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0004_auto_20200921_2356"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RemoveField(
|
||||
model_name="sound",
|
||||
name="path",
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name="sound",
|
||||
name="file",
|
||||
field=models.FileField(
|
||||
default="",
|
||||
upload_to=aircox.models.sound.Sound._upload_to,
|
||||
verbose_name="file",
|
||||
),
|
||||
preserve_default=False,
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="category",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="comment",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="diffusion",
|
||||
name="end",
|
||||
field=models.DateTimeField(db_index=True, verbose_name="end"),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="diffusion",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="diffusion",
|
||||
name="start",
|
||||
field=models.DateTimeField(db_index=True, verbose_name="start"),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="log",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="navitem",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="navitem",
|
||||
name="page",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="aircox.staticpage",
|
||||
verbose_name="page",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="page",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="port",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="timezone",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
(
|
||||
"America/Argentina/Buenos_Aires",
|
||||
"America/Argentina/Buenos_Aires",
|
||||
),
|
||||
(
|
||||
"America/Argentina/Catamarca",
|
||||
"America/Argentina/Catamarca",
|
||||
),
|
||||
(
|
||||
"America/Argentina/ComodRivadavia",
|
||||
"America/Argentina/ComodRivadavia",
|
||||
),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
(
|
||||
"America/Argentina/La_Rioja",
|
||||
"America/Argentina/La_Rioja",
|
||||
),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
(
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
(
|
||||
"America/Argentina/San_Juan",
|
||||
"America/Argentina/San_Juan",
|
||||
),
|
||||
(
|
||||
"America/Argentina/San_Luis",
|
||||
"America/Argentina/San_Luis",
|
||||
),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
(
|
||||
"America/Indiana/Indianapolis",
|
||||
"America/Indiana/Indianapolis",
|
||||
),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
(
|
||||
"America/Indiana/Petersburg",
|
||||
"America/Indiana/Petersburg",
|
||||
),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
(
|
||||
"America/Kentucky/Louisville",
|
||||
"America/Kentucky/Louisville",
|
||||
),
|
||||
(
|
||||
"America/Kentucky/Monticello",
|
||||
"America/Kentucky/Monticello",
|
||||
),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
(
|
||||
"America/North_Dakota/Beulah",
|
||||
"America/North_Dakota/Beulah",
|
||||
),
|
||||
(
|
||||
"America/North_Dakota/Center",
|
||||
"America/North_Dakota/Center",
|
||||
),
|
||||
(
|
||||
"America/North_Dakota/New_Salem",
|
||||
"America/North_Dakota/New_Salem",
|
||||
),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("Australia/South", "Australia/South"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("CET", "CET"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("Cuba", "Cuba"),
|
||||
("EET", "EET"),
|
||||
("EST", "EST"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Egypt", "Egypt"),
|
||||
("Eire", "Eire"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("GB", "GB"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("GMT", "GMT"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("GMT0", "GMT0"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("HST", "HST"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Iceland", "Iceland"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("Iran", "Iran"),
|
||||
("Israel", "Israel"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("Japan", "Japan"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Libya", "Libya"),
|
||||
("MET", "MET"),
|
||||
("MST", "MST"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("NZ", "NZ"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Navajo", "Navajo"),
|
||||
("PRC", "PRC"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("Poland", "Poland"),
|
||||
("Portugal", "Portugal"),
|
||||
("ROC", "ROC"),
|
||||
("ROK", "ROK"),
|
||||
("Singapore", "Singapore"),
|
||||
("Turkey", "Turkey"),
|
||||
("UCT", "UCT"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("US/Central", "US/Central"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("UTC", "UTC"),
|
||||
("Universal", "Universal"),
|
||||
("W-SU", "W-SU"),
|
||||
("WET", "WET"),
|
||||
("Zulu", "Zulu"),
|
||||
],
|
||||
default=django.utils.timezone.get_current_timezone,
|
||||
help_text="timezone used for the date",
|
||||
max_length=100,
|
||||
verbose_name="timezone",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="sound",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="sound",
|
||||
name="program",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
help_text="program related to it",
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="aircox.program",
|
||||
verbose_name="program",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="staticpage",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="station",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="stream",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="track",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
]
|
||||
23 aircox/migrations/0006_alter_sound_file.py (new file)
@@ -0,0 +1,23 @@
|
||||
# Generated by Django 3.2.12 on 2022-03-26 15:21
|
||||
|
||||
import aircox.models.sound
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0005_auto_20220318_1205"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name="sound",
|
||||
name="file",
|
||||
field=models.FileField(
|
||||
db_index=True,
|
||||
max_length=256,
|
||||
upload_to=aircox.models.sound.Sound._upload_to,
|
||||
verbose_name="file",
|
||||
),
|
||||
),
|
||||
]
|
||||
@@ -0,0 +1,710 @@
|
||||
# Generated by Django 4.1 on 2022-10-06 13:47
|
||||
|
||||
from django.db import migrations, models
|
||||
import django.utils.timezone
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0006_alter_sound_file"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name="sound",
|
||||
name="is_downloadable",
|
||||
field=models.BooleanField(
|
||||
default=False,
|
||||
help_text="whether it can be publicly downloaded by visitors (sound must be public)",
|
||||
verbose_name="downloadable",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="page",
|
||||
name="pub_date",
|
||||
field=models.DateTimeField(
|
||||
blank=True,
|
||||
db_index=True,
|
||||
null=True,
|
||||
verbose_name="publication date",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="timezone",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
(
|
||||
"America/Argentina/Buenos_Aires",
|
||||
"America/Argentina/Buenos_Aires",
|
||||
),
|
||||
(
|
||||
"America/Argentina/Catamarca",
|
||||
"America/Argentina/Catamarca",
|
||||
),
|
||||
(
|
||||
"America/Argentina/ComodRivadavia",
|
||||
"America/Argentina/ComodRivadavia",
|
||||
),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
(
|
||||
"America/Argentina/La_Rioja",
|
||||
"America/Argentina/La_Rioja",
|
||||
),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
(
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
(
|
||||
"America/Argentina/San_Juan",
|
||||
"America/Argentina/San_Juan",
|
||||
),
|
||||
(
|
||||
"America/Argentina/San_Luis",
|
||||
"America/Argentina/San_Luis",
|
||||
),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
(
|
||||
"America/Indiana/Indianapolis",
|
||||
"America/Indiana/Indianapolis",
|
||||
),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
(
|
||||
"America/Indiana/Petersburg",
|
||||
"America/Indiana/Petersburg",
|
||||
),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
(
|
||||
"America/Kentucky/Louisville",
|
||||
"America/Kentucky/Louisville",
|
||||
),
|
||||
(
|
||||
"America/Kentucky/Monticello",
|
||||
"America/Kentucky/Monticello",
|
||||
),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
(
|
||||
"America/North_Dakota/Beulah",
|
||||
"America/North_Dakota/Beulah",
|
||||
),
|
||||
(
|
||||
"America/North_Dakota/Center",
|
||||
"America/North_Dakota/Center",
|
||||
),
|
||||
(
|
||||
"America/North_Dakota/New_Salem",
|
||||
"America/North_Dakota/New_Salem",
|
||||
),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("Australia/South", "Australia/South"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("CET", "CET"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("Cuba", "Cuba"),
|
||||
("EET", "EET"),
|
||||
("EST", "EST"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Egypt", "Egypt"),
|
||||
("Eire", "Eire"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("Europe/Kyiv", "Europe/Kyiv"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("GB", "GB"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("GMT", "GMT"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("GMT0", "GMT0"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("HST", "HST"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Iceland", "Iceland"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("Iran", "Iran"),
|
||||
("Israel", "Israel"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("Japan", "Japan"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Libya", "Libya"),
|
||||
("MET", "MET"),
|
||||
("MST", "MST"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("NZ", "NZ"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Navajo", "Navajo"),
|
||||
("PRC", "PRC"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("Poland", "Poland"),
|
||||
("Portugal", "Portugal"),
|
||||
("ROC", "ROC"),
|
||||
("ROK", "ROK"),
|
||||
("Singapore", "Singapore"),
|
||||
("Turkey", "Turkey"),
|
||||
("UCT", "UCT"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("US/Central", "US/Central"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("UTC", "UTC"),
|
||||
("Universal", "Universal"),
|
||||
("W-SU", "W-SU"),
|
||||
("WET", "WET"),
|
||||
("Zulu", "Zulu"),
|
||||
],
default=django.utils.timezone.get_current_timezone,
help_text="timezone used for the date",
max_length=100,
verbose_name="timezone",
),
),
migrations.AlterField(
model_name="sound",
name="is_public",
field=models.BooleanField(
default=False,
help_text="whether it is publicly available as podcast",
verbose_name="public",
),
),
migrations.AlterField(
model_name="stream",
name="begin",
field=models.TimeField(
blank=True,
help_text="used to define a time range this stream is played",
null=True,
verbose_name="begin",
),
),
migrations.AlterField(
model_name="stream",
name="end",
field=models.TimeField(
blank=True,
help_text="used to define a time range this stream is played",
null=True,
verbose_name="end",
),
),
]
@@ -0,0 +1,44 @@
# Generated by Django 4.1 on 2022-12-09 13:46

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0007_sound_is_downloadable_alter_page_pub_date_and_more"),
    ]

    operations = [
        migrations.AlterModelOptions(
            name="diffusion",
            options={
                "permissions": (("programming", "edit the diffusions' planification"),),
                "verbose_name": "Diffusion",
                "verbose_name_plural": "Diffusions",
            },
        ),
        migrations.AddField(
            model_name="track",
            name="album",
            field=models.CharField(default="", max_length=128, verbose_name="album"),
        ),
        migrations.AlterField(
            model_name="schedule",
            name="frequency",
            field=models.SmallIntegerField(
                choices=[
                    (0, "ponctual"),
                    (1, "1st {day} of the month"),
                    (2, "2nd {day} of the month"),
                    (4, "3rd {day} of the month"),
                    (8, "4th {day} of the month"),
                    (16, "last {day} of the month"),
                    (5, "1st and 3rd {day} of the month"),
                    (10, "2nd and 4th {day} of the month"),
                    (31, "{day}"),
                    (32, "one {day} on two"),
                ],
                verbose_name="frequency",
            ),
        ),
    ]
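The frequency codes above read as a small bitmask over the weeks of the month: 1st=1, 2nd=2, 3rd=4, 4th=8, last=16, which is why "1st and 3rd" is 5 and "2nd and 4th" is 10; 31 selects every week and 32 flags the fortnightly case. A minimal sketch of how a date could be checked against such a value; the helper name is hypothetical and the fortnightly flag is not handled here, so this is not aircox's actual Schedule logic:

    import calendar
    from datetime import date

    def matches_frequency(day: date, frequency: int) -> bool:
        # Hypothetical helper: bits 1/2/4/8 select the 1st..4th occurrence of
        # this weekday in the month, bit 16 selects the last occurrence.
        # The fortnightly value 32 ("one {day} on two") is left out.
        nth = (day.day - 1) // 7              # 0-based occurrence in the month
        is_last = day.day + 7 > calendar.monthrange(day.year, day.month)[1]
        return bool(frequency & (1 << nth)) or (is_last and bool(frequency & 16))

    # 2022-12-09 is the 2nd Friday of December 2022
    assert matches_frequency(date(2022, 12, 9), 10)      # "2nd and 4th": matches
    assert not matches_frequency(date(2022, 12, 9), 5)   # "1st and 3rd": no match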
17
aircox/migrations/0009_track_year.py
Normal file
@@ -0,0 +1,17 @@
# Generated by Django 4.1 on 2022-12-09 13:50

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0008_alter_diffusion_options_track_album_and_more"),
    ]

    operations = [
        migrations.AddField(
            model_name="track",
            name="year",
            field=models.IntegerField(blank=True, null=True, verbose_name="year"),
        ),
    ]
17
aircox/migrations/0010_alter_track_album.py
Normal file
@@ -0,0 +1,17 @@
# Generated by Django 4.1 on 2022-12-09 18:13

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0009_track_year"),
    ]

    operations = [
        migrations.AlterField(
            model_name="track",
            name="album",
            field=models.CharField(blank=True, max_length=128, null=True, verbose_name="album"),
        ),
    ]
46
aircox/migrations/0011_usersettings.py
Normal file
@@ -0,0 +1,46 @@
# Generated by Django 4.1 on 2022-12-11 12:24

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):
    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ("aircox", "0010_alter_track_album"),
    ]

    operations = [
        migrations.CreateModel(
            name="UserSettings",
            fields=[
                (
                    "id",
                    models.BigAutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                (
                    "playlist_editor_columns",
                    models.JSONField(verbose_name="Playlist Editor Columns"),
                ),
                (
                    "playlist_editor_sep",
                    models.CharField(max_length=16, verbose_name="Playlist Editor Separator"),
                ),
                (
                    "user",
                    models.OneToOneField(
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="aircox_settings",
                        to=settings.AUTH_USER_MODEL,
                        verbose_name="User",
                    ),
                ),
            ],
        ),
    ]
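Since the user column is a OneToOneField with related_name="aircox_settings", each user gets at most one settings row. A possible access pattern; the import path and the get_or_create defaults are assumptions for illustration, not taken from aircox:

    # Hypothetical usage; UserSettings is assumed importable from aircox.models.
    from aircox.models import UserSettings

    def playlist_editor_prefs(user):
        prefs, _ = UserSettings.objects.get_or_create(
            user=user,
            defaults={"playlist_editor_columns": ["title", "artist"], "playlist_editor_sep": ";"},
        )
        return prefs.playlist_editor_columns, prefs.playlist_editor_sep

    # The reverse relation also works through the related_name:
    # user.aircox_settings.playlist_editor_columns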
@@ -0,0 +1,33 @@
# Generated by Django 4.1 on 2023-01-25 15:18

import aircox.models.sound
from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0011_usersettings"),
    ]

    operations = [
        migrations.AlterField(
            model_name="sound",
            name="file",
            field=models.FileField(
                db_index=True,
                max_length=256,
                unique=True,
                upload_to=aircox.models.sound.Sound._upload_to,
                verbose_name="file",
            ),
        ),
        migrations.AlterField(
            model_name="station",
            name="default",
            field=models.BooleanField(
                default=False,
                help_text="use this station as the main one.",
                verbose_name="default station",
            ),
        ),
    ]
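Here upload_to points at Sound._upload_to rather than a plain path string, so Django serializes it into the migration as a dotted reference; that only works because the callable sits at an importable path (a method on the model class). A rough sketch of the (instance, filename) signature such a callable must have; the body below is an assumption for illustration, not Sound's real implementation:

    import os

    class Sound:  # illustrative stand-in for aircox.models.sound.Sound
        @staticmethod
        def _upload_to(instance, filename):
            # Django calls upload_to callables with (instance, filename) and
            # expects a path relative to MEDIA_ROOT; this guess files sounds
            # per program, using a hypothetical instance.program_id.
            return os.path.join("sounds", str(instance.program_id), filename)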
@@ -0,0 +1,675 @@
# Generated by Django 4.2.1 on 2023-09-28 11:07

import aircox.models.schedule
from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("aircox", "0012_alter_sound_file_alter_station_default"),
]

operations = [
migrations.AlterField(
model_name="schedule",
name="timezone",
field=models.CharField(
choices=[
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("UTC", "UTC"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("ROC", "ROC"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
(
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
"America/Argentina/Rio_Gallegos",
|
||||
),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("CET", "CET"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
(
|
||||
"America/Kentucky/Monticello",
|
||||
"America/Kentucky/Monticello",
|
||||
),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("MST", "MST"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
("Singapore", "Singapore"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("HST", "HST"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
(
|
||||
"America/Argentina/San_Juan",
|
||||
"America/Argentina/San_Juan",
|
||||
),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Europe/Kyiv", "Europe/Kyiv"),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
(
|
||||
"America/Argentina/La_Rioja",
|
||||
"America/Argentina/La_Rioja",
|
||||
),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
(
|
||||
"America/North_Dakota/New_Salem",
|
||||
"America/North_Dakota/New_Salem",
|
||||
),
|
||||
("NZ", "NZ"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("Iran", "Iran"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
(
|
||||
"America/Indiana/Petersburg",
|
||||
"America/Indiana/Petersburg",
|
||||
),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("GMT0", "GMT0"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Zulu", "Zulu"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
(
|
||||
"America/North_Dakota/Beulah",
|
||||
"America/North_Dakota/Beulah",
|
||||
),
|
||||
(
|
||||
"America/Argentina/Buenos_Aires",
|
||||
"America/Argentina/Buenos_Aires",
|
||||
),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Poland", "Poland"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
(
|
||||
"America/Argentina/Catamarca",
|
||||
"America/Argentina/Catamarca",
|
||||
),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("W-SU", "W-SU"),
|
||||
("Factory", "Factory"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Egypt", "Egypt"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("Eire", "Eire"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Cuba", "Cuba"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("MET", "MET"),
|
||||
("Navajo", "Navajo"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
(
|
||||
"America/Indiana/Indianapolis",
|
||||
"America/Indiana/Indianapolis",
|
||||
),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Iceland", "Iceland"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
(
|
||||
"America/North_Dakota/Center",
|
||||
"America/North_Dakota/Center",
|
||||
),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("GB", "GB"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("US/Central", "US/Central"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("EET", "EET"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
(
|
||||
"America/Argentina/San_Luis",
|
||||
"America/Argentina/San_Luis",
|
||||
),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Universal", "Universal"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("PRC", "PRC"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Australia/South", "Australia/South"),
|
||||
(
|
||||
"America/Kentucky/Louisville",
|
||||
"America/Kentucky/Louisville",
|
||||
),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("WET", "WET"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
(
|
||||
"America/Argentina/ComodRivadavia",
|
||||
"America/Argentina/ComodRivadavia",
|
||||
),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("UCT", "UCT"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Portugal", "Portugal"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("ROK", "ROK"),
|
||||
("Libya", "Libya"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Israel", "Israel"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("Turkey", "Turkey"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("GMT", "GMT"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Japan", "Japan"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("EST", "EST"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
],
default=aircox.models.schedule.current_timezone_key,
help_text="timezone used for the date",
max_length=100,
verbose_name="timezone",
),
),
migrations.AlterField(
model_name="station",
name="hosts",
field=models.TextField(
blank=True,
help_text="specify one domain per line, without 'http://' prefix",
max_length=512,
null=True,
verbose_name="website's urls",
),
),
]
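Compared with the earlier migration, the field's default moves from django.utils.timezone.get_current_timezone (which returns a tzinfo object) to a project helper, presumably so the CharField gets the timezone's key as a string matching the choices. A minimal sketch of what such a helper could look like; only the dotted name comes from the migration, the body is an assumption:

    from django.utils import timezone

    def current_timezone_key() -> str:
        # Assumed behaviour: return the active timezone's IANA name
        # (e.g. "Europe/Brussels"), suitable as a CharField default.
        return timezone.get_current_timezone_name()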
623
aircox/migrations/0014_alter_schedule_timezone.py
Normal file
@@ -0,0 +1,623 @@
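This migration touches the same field again: the choices in 0013 come out in arbitrary set-iteration order, while here they are alphabetically sorted, the kind of churn that appears when the list is regenerated from the installed tz database. A sketch of how such a choices list is commonly built in Django projects; this is an assumption, not necessarily how aircox generates it:

    import zoneinfo

    # (value, label) pairs from the locally installed tz database; sorting
    # keeps the generated list stable from one run to the next.
    TIMEZONE_CHOICES = [(key, key) for key in sorted(zoneinfo.available_timezones())]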
# Generated by Django 4.2.5 on 2023-10-18 07:26

import aircox.models.schedule
from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("aircox", "0013_alter_schedule_timezone_alter_station_hosts"),
]

operations = [
migrations.AlterField(
model_name="schedule",
name="timezone",
field=models.CharField(
choices=[
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
("America/Argentina/Buenos_Aires", "America/Argentina/Buenos_Aires"),
|
||||
("America/Argentina/Catamarca", "America/Argentina/Catamarca"),
|
||||
("America/Argentina/ComodRivadavia", "America/Argentina/ComodRivadavia"),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
("America/Argentina/La_Rioja", "America/Argentina/La_Rioja"),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
("America/Argentina/Rio_Gallegos", "America/Argentina/Rio_Gallegos"),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
("America/Argentina/San_Juan", "America/Argentina/San_Juan"),
|
||||
("America/Argentina/San_Luis", "America/Argentina/San_Luis"),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
("America/Indiana/Indianapolis", "America/Indiana/Indianapolis"),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
("America/Indiana/Petersburg", "America/Indiana/Petersburg"),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
("America/Kentucky/Louisville", "America/Kentucky/Louisville"),
|
||||
("America/Kentucky/Monticello", "America/Kentucky/Monticello"),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
("America/North_Dakota/Beulah", "America/North_Dakota/Beulah"),
|
||||
("America/North_Dakota/Center", "America/North_Dakota/Center"),
|
||||
("America/North_Dakota/New_Salem", "America/North_Dakota/New_Salem"),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("Australia/South", "Australia/South"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("CET", "CET"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("Cuba", "Cuba"),
|
||||
("EET", "EET"),
|
||||
("EST", "EST"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Egypt", "Egypt"),
|
||||
("Eire", "Eire"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("Europe/Kyiv", "Europe/Kyiv"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("Factory", "Factory"),
|
||||
("GB", "GB"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("GMT", "GMT"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("GMT0", "GMT0"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("HST", "HST"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Iceland", "Iceland"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("Iran", "Iran"),
|
||||
("Israel", "Israel"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("Japan", "Japan"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Libya", "Libya"),
|
||||
("MET", "MET"),
|
||||
("MST", "MST"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("NZ", "NZ"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Navajo", "Navajo"),
|
||||
("PRC", "PRC"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("Poland", "Poland"),
|
||||
("Portugal", "Portugal"),
|
||||
("ROC", "ROC"),
|
||||
("ROK", "ROK"),
|
||||
("Singapore", "Singapore"),
|
||||
("Turkey", "Turkey"),
|
||||
("UCT", "UCT"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("US/Central", "US/Central"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("UTC", "UTC"),
|
||||
("Universal", "Universal"),
|
||||
("W-SU", "W-SU"),
|
||||
("WET", "WET"),
|
||||
("Zulu", "Zulu"),
|
||||
("localtime", "localtime"),
|
||||
                ],
                default=aircox.models.schedule.current_timezone_key,
                help_text="timezone used for the date",
                max_length=100,
                verbose_name="timezone",
            ),
        ),
    ]
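Both versions of this field take their default from `aircox.models.schedule.current_timezone_key`, a callable that is referenced but not shown in this diff. A minimal sketch of what such a helper could look like, assuming it simply returns the name of the currently active Django timezone (this is an assumption for illustration, not the project's actual code):

# Hypothetical sketch of the default callable referenced above; a migration
# default must be a plain importable callable returning a string key.
from django.utils import timezone


def current_timezone_key():
    # Returns e.g. "Europe/Brussels" for the currently active timezone.
    return timezone.get_current_timezone_name()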
@@ -0,0 +1,641 @@
# Generated by Django 4.2.1 on 2023-11-24 21:11

import aircox.models.schedule
from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0014_alter_schedule_timezone"),
    ]

    operations = [
        migrations.AlterField(
            model_name="schedule",
            name="timezone",
            field=models.CharField(
                choices=[
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
("America/Argentina/Buenos_Aires", "America/Argentina/Buenos_Aires"),
|
||||
("America/Argentina/Catamarca", "America/Argentina/Catamarca"),
|
||||
("America/Argentina/ComodRivadavia", "America/Argentina/ComodRivadavia"),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
("America/Argentina/La_Rioja", "America/Argentina/La_Rioja"),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
("America/Argentina/Rio_Gallegos", "America/Argentina/Rio_Gallegos"),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
("America/Argentina/San_Juan", "America/Argentina/San_Juan"),
|
||||
("America/Argentina/San_Luis", "America/Argentina/San_Luis"),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
("America/Indiana/Indianapolis", "America/Indiana/Indianapolis"),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
("America/Indiana/Petersburg", "America/Indiana/Petersburg"),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
("America/Kentucky/Louisville", "America/Kentucky/Louisville"),
|
||||
("America/Kentucky/Monticello", "America/Kentucky/Monticello"),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
("America/North_Dakota/Beulah", "America/North_Dakota/Beulah"),
|
||||
("America/North_Dakota/Center", "America/North_Dakota/Center"),
|
||||
("America/North_Dakota/New_Salem", "America/North_Dakota/New_Salem"),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("Australia/South", "Australia/South"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("CET", "CET"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("Cuba", "Cuba"),
|
||||
("EET", "EET"),
|
||||
("EST", "EST"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Egypt", "Egypt"),
|
||||
("Eire", "Eire"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("Europe/Kyiv", "Europe/Kyiv"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("Factory", "Factory"),
|
||||
("GB", "GB"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("GMT", "GMT"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("GMT0", "GMT0"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("HST", "HST"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Iceland", "Iceland"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("Iran", "Iran"),
|
||||
("Israel", "Israel"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("Japan", "Japan"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Libya", "Libya"),
|
||||
("MET", "MET"),
|
||||
("MST", "MST"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("NZ", "NZ"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Navajo", "Navajo"),
|
||||
("PRC", "PRC"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("Poland", "Poland"),
|
||||
("Portugal", "Portugal"),
|
||||
("ROC", "ROC"),
|
||||
("ROK", "ROK"),
|
||||
("Singapore", "Singapore"),
|
||||
("Turkey", "Turkey"),
|
||||
("UCT", "UCT"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("US/Central", "US/Central"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("UTC", "UTC"),
|
||||
("Universal", "Universal"),
|
||||
("W-SU", "W-SU"),
|
||||
("WET", "WET"),
|
||||
("Zulu", "Zulu"),
|
||||
                ],
                default=aircox.models.schedule.current_timezone_key,
                help_text="timezone used for the date",
                max_length=100,
                verbose_name="timezone",
            ),
        ),
        migrations.AlterField(
            model_name="staticpage",
            name="attach_to",
            field=models.SmallIntegerField(
                blank=True,
                choices=[
                    (0, "Home page"),
                    (1, "Diffusions page"),
                    (2, "Logs page"),
                    (3, "Programs list"),
                    (4, "Episodes list"),
                    (5, "Articles list"),
                    (6, "Publications list"),
                ],
                help_text="display this page content to related element",
                null=True,
                verbose_name="attach to",
            ),
        ),
    ]
aircox/migrations/0015_program_editors.py (new file, 25 lines)
@@ -0,0 +1,25 @@
# Generated by Django 4.2.5 on 2023-10-18 13:50

from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):
    dependencies = [
        ("auth", "0012_alter_user_first_name_max_length"),
        ("aircox", "0014_alter_schedule_timezone"),
    ]

    operations = [
        migrations.AddField(
            model_name="program",
            name="editors",
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                to="auth.group",
                verbose_name="editors",
            ),
        ),
    ]
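The new `editors` field links each Program to a single auth Group. As a rough illustration of how that relation could drive a permission check (the helper below is an assumption for illustration, not code from this changeset):

# Hypothetical helper; Program.editors is the ForeignKey added above.
def user_can_edit_program(user, program):
    # Superusers can always edit; other users must belong to the
    # program's editors group, when one is set.
    if user.is_superuser:
        return True
    if program.editors_id is None:
        return False
    return user.groups.filter(pk=program.editors_id).exists()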
aircox/migrations/0016_alter_staticpage_attach_to.py (new file, 32 lines)
@@ -0,0 +1,32 @@
# Generated by Django 4.2.1 on 2023-11-28 01:15

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0015_alter_schedule_timezone_alter_staticpage_attach_to"),
    ]

    operations = [
        migrations.AlterField(
            model_name="staticpage",
            name="attach_to",
            field=models.SmallIntegerField(
                blank=True,
                choices=[
                    (0, "Home page"),
                    (1, "Diffusions page"),
                    (2, "Logs page"),
                    (3, "Programs list"),
                    (4, "Episodes list"),
                    (5, "Articles list"),
                    (6, "Publications list"),
                    (7, "Podcasts list"),
                ],
                help_text="display this page content to related element",
                null=True,
                verbose_name="attach to",
            ),
        ),
    ]
@@ -0,0 +1,36 @@
# Generated by Django 4.2.1 on 2023-12-12 16:58

from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("aircox", "0016_alter_staticpage_attach_to"),
    ]

    operations = [
        migrations.AlterField(
            model_name="navitem",
            name="text",
            field=models.CharField(blank=True, max_length=64, null=True, verbose_name="title"),
        ),
        migrations.AlterField(
            model_name="staticpage",
            name="attach_to",
            field=models.SmallIntegerField(
                blank=True,
                choices=[
                    (0, "Home page"),
                    (1, "Diffusions page"),
                    (3, "Programs list"),
                    (4, "Episodes list"),
                    (5, "Articles list"),
                    (6, "Publications list"),
                    (7, "Podcasts list"),
                ],
                help_text="display this page content to related element",
                null=True,
                verbose_name="attach to",
            ),
        ),
    ]
32
aircox/migrations/0018_alter_staticpage_attach_to.py
Normal file
@ -0,0 +1,32 @@
|
||||
# Generated by Django 4.2.1 on 2023-12-12 18:17
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0017_alter_navitem_text_alter_staticpage_attach_to"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name="staticpage",
|
||||
name="attach_to",
|
||||
field=models.CharField(
|
||||
blank=True,
|
||||
choices=[
|
||||
("", "Home Page"),
|
||||
("timetable-list", "Timetable"),
|
||||
("program-list", "Programs list"),
|
||||
("episode-list", "Episodes list"),
|
||||
("article-list", "Articles list"),
|
||||
("page-list", "Publications list"),
|
||||
("podcast-list", "Podcasts list"),
|
||||
],
|
||||
help_text="display this page content to related element",
|
||||
max_length=32,
|
||||
null=True,
|
||||
verbose_name="attach to",
|
||||
),
|
||||
),
|
||||
]
|
||||
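Migration 0018 above switches StaticPage.attach_to from integer choices to view-name-like string keys. A hedged sketch of how such a key could be used to look up the page attached to a given list view (the helper is an illustrative assumption, not code from this changeset):

# Hypothetical lookup: the static page whose content is shown on a list view.
def get_attached_page(url_name):
    from aircox.models import StaticPage

    return StaticPage.objects.filter(attach_to=url_name).first()

# e.g. the page attached to the programs list:
page = get_attached_page("program-list")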
12
aircox/migrations/0019_merge_20240119_1022.py
Normal file
@ -0,0 +1,12 @@
|
||||
# Generated by Django 4.2.7 on 2024-01-19 09:22
|
||||
|
||||
from django.db import migrations
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0015_program_editors"),
|
||||
("aircox", "0018_alter_staticpage_attach_to"),
|
||||
]
|
||||
|
||||
operations = []
|
||||
42
aircox/migrations/0019_station_program_streams_title_and_more.py
Normal file
@ -0,0 +1,42 @@
|
||||
# Generated by Django 4.2.1 on 2024-02-01 18:12
|
||||
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0018_alter_staticpage_attach_to"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name="station",
|
||||
name="music_stream_title",
|
||||
field=models.CharField(
|
||||
default="Music stream",
|
||||
max_length=64,
|
||||
verbose_name="Music stream's title",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name="staticpage",
|
||||
name="attach_to",
|
||||
field=models.CharField(
|
||||
blank=True,
|
||||
choices=[
|
||||
("", "None"),
|
||||
("home", "Home Page"),
|
||||
("timetable-list", "Timetable"),
|
||||
("program-list", "Programs list"),
|
||||
("episode-list", "Episodes list"),
|
||||
("article-list", "Articles list"),
|
||||
("page-list", "Publications list"),
|
||||
("podcast-list", "Podcasts list"),
|
||||
],
|
||||
help_text="display this page content to related element",
|
||||
max_length=32,
|
||||
null=True,
|
||||
verbose_name="attach to",
|
||||
),
|
||||
),
|
||||
]
|
||||
12
aircox/migrations/0020_merge_20240205_1027.py
Normal file
@ -0,0 +1,12 @@
|
||||
# Generated by Django 4.2.7 on 2024-02-05 09:27
|
||||
|
||||
from django.db import migrations
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0019_merge_20240119_1022"),
|
||||
("aircox", "0019_station_program_streams_title_and_more"),
|
||||
]
|
||||
|
||||
operations = []
|
||||
623
aircox/migrations/0021_alter_schedule_timezone.py
Normal file
@ -0,0 +1,623 @@
|
||||
# Generated by Django 4.2.7 on 2024-02-06 08:13
|
||||
|
||||
import aircox.models.schedule
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0020_merge_20240205_1027"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name="schedule",
|
||||
name="timezone",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("Africa/Abidjan", "Africa/Abidjan"),
|
||||
("Africa/Accra", "Africa/Accra"),
|
||||
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
|
||||
("Africa/Algiers", "Africa/Algiers"),
|
||||
("Africa/Asmara", "Africa/Asmara"),
|
||||
("Africa/Asmera", "Africa/Asmera"),
|
||||
("Africa/Bamako", "Africa/Bamako"),
|
||||
("Africa/Bangui", "Africa/Bangui"),
|
||||
("Africa/Banjul", "Africa/Banjul"),
|
||||
("Africa/Bissau", "Africa/Bissau"),
|
||||
("Africa/Blantyre", "Africa/Blantyre"),
|
||||
("Africa/Brazzaville", "Africa/Brazzaville"),
|
||||
("Africa/Bujumbura", "Africa/Bujumbura"),
|
||||
("Africa/Cairo", "Africa/Cairo"),
|
||||
("Africa/Casablanca", "Africa/Casablanca"),
|
||||
("Africa/Ceuta", "Africa/Ceuta"),
|
||||
("Africa/Conakry", "Africa/Conakry"),
|
||||
("Africa/Dakar", "Africa/Dakar"),
|
||||
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
|
||||
("Africa/Djibouti", "Africa/Djibouti"),
|
||||
("Africa/Douala", "Africa/Douala"),
|
||||
("Africa/El_Aaiun", "Africa/El_Aaiun"),
|
||||
("Africa/Freetown", "Africa/Freetown"),
|
||||
("Africa/Gaborone", "Africa/Gaborone"),
|
||||
("Africa/Harare", "Africa/Harare"),
|
||||
("Africa/Johannesburg", "Africa/Johannesburg"),
|
||||
("Africa/Juba", "Africa/Juba"),
|
||||
("Africa/Kampala", "Africa/Kampala"),
|
||||
("Africa/Khartoum", "Africa/Khartoum"),
|
||||
("Africa/Kigali", "Africa/Kigali"),
|
||||
("Africa/Kinshasa", "Africa/Kinshasa"),
|
||||
("Africa/Lagos", "Africa/Lagos"),
|
||||
("Africa/Libreville", "Africa/Libreville"),
|
||||
("Africa/Lome", "Africa/Lome"),
|
||||
("Africa/Luanda", "Africa/Luanda"),
|
||||
("Africa/Lubumbashi", "Africa/Lubumbashi"),
|
||||
("Africa/Lusaka", "Africa/Lusaka"),
|
||||
("Africa/Malabo", "Africa/Malabo"),
|
||||
("Africa/Maputo", "Africa/Maputo"),
|
||||
("Africa/Maseru", "Africa/Maseru"),
|
||||
("Africa/Mbabane", "Africa/Mbabane"),
|
||||
("Africa/Mogadishu", "Africa/Mogadishu"),
|
||||
("Africa/Monrovia", "Africa/Monrovia"),
|
||||
("Africa/Nairobi", "Africa/Nairobi"),
|
||||
("Africa/Ndjamena", "Africa/Ndjamena"),
|
||||
("Africa/Niamey", "Africa/Niamey"),
|
||||
("Africa/Nouakchott", "Africa/Nouakchott"),
|
||||
("Africa/Ouagadougou", "Africa/Ouagadougou"),
|
||||
("Africa/Porto-Novo", "Africa/Porto-Novo"),
|
||||
("Africa/Sao_Tome", "Africa/Sao_Tome"),
|
||||
("Africa/Timbuktu", "Africa/Timbuktu"),
|
||||
("Africa/Tripoli", "Africa/Tripoli"),
|
||||
("Africa/Tunis", "Africa/Tunis"),
|
||||
("Africa/Windhoek", "Africa/Windhoek"),
|
||||
("America/Adak", "America/Adak"),
|
||||
("America/Anchorage", "America/Anchorage"),
|
||||
("America/Anguilla", "America/Anguilla"),
|
||||
("America/Antigua", "America/Antigua"),
|
||||
("America/Araguaina", "America/Araguaina"),
|
||||
("America/Argentina/Buenos_Aires", "America/Argentina/Buenos_Aires"),
|
||||
("America/Argentina/Catamarca", "America/Argentina/Catamarca"),
|
||||
("America/Argentina/ComodRivadavia", "America/Argentina/ComodRivadavia"),
|
||||
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
|
||||
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
|
||||
("America/Argentina/La_Rioja", "America/Argentina/La_Rioja"),
|
||||
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
|
||||
("America/Argentina/Rio_Gallegos", "America/Argentina/Rio_Gallegos"),
|
||||
("America/Argentina/Salta", "America/Argentina/Salta"),
|
||||
("America/Argentina/San_Juan", "America/Argentina/San_Juan"),
|
||||
("America/Argentina/San_Luis", "America/Argentina/San_Luis"),
|
||||
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
|
||||
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
|
||||
("America/Aruba", "America/Aruba"),
|
||||
("America/Asuncion", "America/Asuncion"),
|
||||
("America/Atikokan", "America/Atikokan"),
|
||||
("America/Atka", "America/Atka"),
|
||||
("America/Bahia", "America/Bahia"),
|
||||
("America/Bahia_Banderas", "America/Bahia_Banderas"),
|
||||
("America/Barbados", "America/Barbados"),
|
||||
("America/Belem", "America/Belem"),
|
||||
("America/Belize", "America/Belize"),
|
||||
("America/Blanc-Sablon", "America/Blanc-Sablon"),
|
||||
("America/Boa_Vista", "America/Boa_Vista"),
|
||||
("America/Bogota", "America/Bogota"),
|
||||
("America/Boise", "America/Boise"),
|
||||
("America/Buenos_Aires", "America/Buenos_Aires"),
|
||||
("America/Cambridge_Bay", "America/Cambridge_Bay"),
|
||||
("America/Campo_Grande", "America/Campo_Grande"),
|
||||
("America/Cancun", "America/Cancun"),
|
||||
("America/Caracas", "America/Caracas"),
|
||||
("America/Catamarca", "America/Catamarca"),
|
||||
("America/Cayenne", "America/Cayenne"),
|
||||
("America/Cayman", "America/Cayman"),
|
||||
("America/Chicago", "America/Chicago"),
|
||||
("America/Chihuahua", "America/Chihuahua"),
|
||||
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
|
||||
("America/Coral_Harbour", "America/Coral_Harbour"),
|
||||
("America/Cordoba", "America/Cordoba"),
|
||||
("America/Costa_Rica", "America/Costa_Rica"),
|
||||
("America/Creston", "America/Creston"),
|
||||
("America/Cuiaba", "America/Cuiaba"),
|
||||
("America/Curacao", "America/Curacao"),
|
||||
("America/Danmarkshavn", "America/Danmarkshavn"),
|
||||
("America/Dawson", "America/Dawson"),
|
||||
("America/Dawson_Creek", "America/Dawson_Creek"),
|
||||
("America/Denver", "America/Denver"),
|
||||
("America/Detroit", "America/Detroit"),
|
||||
("America/Dominica", "America/Dominica"),
|
||||
("America/Edmonton", "America/Edmonton"),
|
||||
("America/Eirunepe", "America/Eirunepe"),
|
||||
("America/El_Salvador", "America/El_Salvador"),
|
||||
("America/Ensenada", "America/Ensenada"),
|
||||
("America/Fort_Nelson", "America/Fort_Nelson"),
|
||||
("America/Fort_Wayne", "America/Fort_Wayne"),
|
||||
("America/Fortaleza", "America/Fortaleza"),
|
||||
("America/Glace_Bay", "America/Glace_Bay"),
|
||||
("America/Godthab", "America/Godthab"),
|
||||
("America/Goose_Bay", "America/Goose_Bay"),
|
||||
("America/Grand_Turk", "America/Grand_Turk"),
|
||||
("America/Grenada", "America/Grenada"),
|
||||
("America/Guadeloupe", "America/Guadeloupe"),
|
||||
("America/Guatemala", "America/Guatemala"),
|
||||
("America/Guayaquil", "America/Guayaquil"),
|
||||
("America/Guyana", "America/Guyana"),
|
||||
("America/Halifax", "America/Halifax"),
|
||||
("America/Havana", "America/Havana"),
|
||||
("America/Hermosillo", "America/Hermosillo"),
|
||||
("America/Indiana/Indianapolis", "America/Indiana/Indianapolis"),
|
||||
("America/Indiana/Knox", "America/Indiana/Knox"),
|
||||
("America/Indiana/Marengo", "America/Indiana/Marengo"),
|
||||
("America/Indiana/Petersburg", "America/Indiana/Petersburg"),
|
||||
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
|
||||
("America/Indiana/Vevay", "America/Indiana/Vevay"),
|
||||
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
|
||||
("America/Indiana/Winamac", "America/Indiana/Winamac"),
|
||||
("America/Indianapolis", "America/Indianapolis"),
|
||||
("America/Inuvik", "America/Inuvik"),
|
||||
("America/Iqaluit", "America/Iqaluit"),
|
||||
("America/Jamaica", "America/Jamaica"),
|
||||
("America/Jujuy", "America/Jujuy"),
|
||||
("America/Juneau", "America/Juneau"),
|
||||
("America/Kentucky/Louisville", "America/Kentucky/Louisville"),
|
||||
("America/Kentucky/Monticello", "America/Kentucky/Monticello"),
|
||||
("America/Knox_IN", "America/Knox_IN"),
|
||||
("America/Kralendijk", "America/Kralendijk"),
|
||||
("America/La_Paz", "America/La_Paz"),
|
||||
("America/Lima", "America/Lima"),
|
||||
("America/Los_Angeles", "America/Los_Angeles"),
|
||||
("America/Louisville", "America/Louisville"),
|
||||
("America/Lower_Princes", "America/Lower_Princes"),
|
||||
("America/Maceio", "America/Maceio"),
|
||||
("America/Managua", "America/Managua"),
|
||||
("America/Manaus", "America/Manaus"),
|
||||
("America/Marigot", "America/Marigot"),
|
||||
("America/Martinique", "America/Martinique"),
|
||||
("America/Matamoros", "America/Matamoros"),
|
||||
("America/Mazatlan", "America/Mazatlan"),
|
||||
("America/Mendoza", "America/Mendoza"),
|
||||
("America/Menominee", "America/Menominee"),
|
||||
("America/Merida", "America/Merida"),
|
||||
("America/Metlakatla", "America/Metlakatla"),
|
||||
("America/Mexico_City", "America/Mexico_City"),
|
||||
("America/Miquelon", "America/Miquelon"),
|
||||
("America/Moncton", "America/Moncton"),
|
||||
("America/Monterrey", "America/Monterrey"),
|
||||
("America/Montevideo", "America/Montevideo"),
|
||||
("America/Montreal", "America/Montreal"),
|
||||
("America/Montserrat", "America/Montserrat"),
|
||||
("America/Nassau", "America/Nassau"),
|
||||
("America/New_York", "America/New_York"),
|
||||
("America/Nipigon", "America/Nipigon"),
|
||||
("America/Nome", "America/Nome"),
|
||||
("America/Noronha", "America/Noronha"),
|
||||
("America/North_Dakota/Beulah", "America/North_Dakota/Beulah"),
|
||||
("America/North_Dakota/Center", "America/North_Dakota/Center"),
|
||||
("America/North_Dakota/New_Salem", "America/North_Dakota/New_Salem"),
|
||||
("America/Nuuk", "America/Nuuk"),
|
||||
("America/Ojinaga", "America/Ojinaga"),
|
||||
("America/Panama", "America/Panama"),
|
||||
("America/Pangnirtung", "America/Pangnirtung"),
|
||||
("America/Paramaribo", "America/Paramaribo"),
|
||||
("America/Phoenix", "America/Phoenix"),
|
||||
("America/Port-au-Prince", "America/Port-au-Prince"),
|
||||
("America/Port_of_Spain", "America/Port_of_Spain"),
|
||||
("America/Porto_Acre", "America/Porto_Acre"),
|
||||
("America/Porto_Velho", "America/Porto_Velho"),
|
||||
("America/Puerto_Rico", "America/Puerto_Rico"),
|
||||
("America/Punta_Arenas", "America/Punta_Arenas"),
|
||||
("America/Rainy_River", "America/Rainy_River"),
|
||||
("America/Rankin_Inlet", "America/Rankin_Inlet"),
|
||||
("America/Recife", "America/Recife"),
|
||||
("America/Regina", "America/Regina"),
|
||||
("America/Resolute", "America/Resolute"),
|
||||
("America/Rio_Branco", "America/Rio_Branco"),
|
||||
("America/Rosario", "America/Rosario"),
|
||||
("America/Santa_Isabel", "America/Santa_Isabel"),
|
||||
("America/Santarem", "America/Santarem"),
|
||||
("America/Santiago", "America/Santiago"),
|
||||
("America/Santo_Domingo", "America/Santo_Domingo"),
|
||||
("America/Sao_Paulo", "America/Sao_Paulo"),
|
||||
("America/Scoresbysund", "America/Scoresbysund"),
|
||||
("America/Shiprock", "America/Shiprock"),
|
||||
("America/Sitka", "America/Sitka"),
|
||||
("America/St_Barthelemy", "America/St_Barthelemy"),
|
||||
("America/St_Johns", "America/St_Johns"),
|
||||
("America/St_Kitts", "America/St_Kitts"),
|
||||
("America/St_Lucia", "America/St_Lucia"),
|
||||
("America/St_Thomas", "America/St_Thomas"),
|
||||
("America/St_Vincent", "America/St_Vincent"),
|
||||
("America/Swift_Current", "America/Swift_Current"),
|
||||
("America/Tegucigalpa", "America/Tegucigalpa"),
|
||||
("America/Thule", "America/Thule"),
|
||||
("America/Thunder_Bay", "America/Thunder_Bay"),
|
||||
("America/Tijuana", "America/Tijuana"),
|
||||
("America/Toronto", "America/Toronto"),
|
||||
("America/Tortola", "America/Tortola"),
|
||||
("America/Vancouver", "America/Vancouver"),
|
||||
("America/Virgin", "America/Virgin"),
|
||||
("America/Whitehorse", "America/Whitehorse"),
|
||||
("America/Winnipeg", "America/Winnipeg"),
|
||||
("America/Yakutat", "America/Yakutat"),
|
||||
("America/Yellowknife", "America/Yellowknife"),
|
||||
("Antarctica/Casey", "Antarctica/Casey"),
|
||||
("Antarctica/Davis", "Antarctica/Davis"),
|
||||
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
|
||||
("Antarctica/Macquarie", "Antarctica/Macquarie"),
|
||||
("Antarctica/Mawson", "Antarctica/Mawson"),
|
||||
("Antarctica/McMurdo", "Antarctica/McMurdo"),
|
||||
("Antarctica/Palmer", "Antarctica/Palmer"),
|
||||
("Antarctica/Rothera", "Antarctica/Rothera"),
|
||||
("Antarctica/South_Pole", "Antarctica/South_Pole"),
|
||||
("Antarctica/Syowa", "Antarctica/Syowa"),
|
||||
("Antarctica/Troll", "Antarctica/Troll"),
|
||||
("Antarctica/Vostok", "Antarctica/Vostok"),
|
||||
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
|
||||
("Asia/Aden", "Asia/Aden"),
|
||||
("Asia/Almaty", "Asia/Almaty"),
|
||||
("Asia/Amman", "Asia/Amman"),
|
||||
("Asia/Anadyr", "Asia/Anadyr"),
|
||||
("Asia/Aqtau", "Asia/Aqtau"),
|
||||
("Asia/Aqtobe", "Asia/Aqtobe"),
|
||||
("Asia/Ashgabat", "Asia/Ashgabat"),
|
||||
("Asia/Ashkhabad", "Asia/Ashkhabad"),
|
||||
("Asia/Atyrau", "Asia/Atyrau"),
|
||||
("Asia/Baghdad", "Asia/Baghdad"),
|
||||
("Asia/Bahrain", "Asia/Bahrain"),
|
||||
("Asia/Baku", "Asia/Baku"),
|
||||
("Asia/Bangkok", "Asia/Bangkok"),
|
||||
("Asia/Barnaul", "Asia/Barnaul"),
|
||||
("Asia/Beirut", "Asia/Beirut"),
|
||||
("Asia/Bishkek", "Asia/Bishkek"),
|
||||
("Asia/Brunei", "Asia/Brunei"),
|
||||
("Asia/Calcutta", "Asia/Calcutta"),
|
||||
("Asia/Chita", "Asia/Chita"),
|
||||
("Asia/Choibalsan", "Asia/Choibalsan"),
|
||||
("Asia/Chongqing", "Asia/Chongqing"),
|
||||
("Asia/Chungking", "Asia/Chungking"),
|
||||
("Asia/Colombo", "Asia/Colombo"),
|
||||
("Asia/Dacca", "Asia/Dacca"),
|
||||
("Asia/Damascus", "Asia/Damascus"),
|
||||
("Asia/Dhaka", "Asia/Dhaka"),
|
||||
("Asia/Dili", "Asia/Dili"),
|
||||
("Asia/Dubai", "Asia/Dubai"),
|
||||
("Asia/Dushanbe", "Asia/Dushanbe"),
|
||||
("Asia/Famagusta", "Asia/Famagusta"),
|
||||
("Asia/Gaza", "Asia/Gaza"),
|
||||
("Asia/Harbin", "Asia/Harbin"),
|
||||
("Asia/Hebron", "Asia/Hebron"),
|
||||
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
|
||||
("Asia/Hong_Kong", "Asia/Hong_Kong"),
|
||||
("Asia/Hovd", "Asia/Hovd"),
|
||||
("Asia/Irkutsk", "Asia/Irkutsk"),
|
||||
("Asia/Istanbul", "Asia/Istanbul"),
|
||||
("Asia/Jakarta", "Asia/Jakarta"),
|
||||
("Asia/Jayapura", "Asia/Jayapura"),
|
||||
("Asia/Jerusalem", "Asia/Jerusalem"),
|
||||
("Asia/Kabul", "Asia/Kabul"),
|
||||
("Asia/Kamchatka", "Asia/Kamchatka"),
|
||||
("Asia/Karachi", "Asia/Karachi"),
|
||||
("Asia/Kashgar", "Asia/Kashgar"),
|
||||
("Asia/Kathmandu", "Asia/Kathmandu"),
|
||||
("Asia/Katmandu", "Asia/Katmandu"),
|
||||
("Asia/Khandyga", "Asia/Khandyga"),
|
||||
("Asia/Kolkata", "Asia/Kolkata"),
|
||||
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
|
||||
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
|
||||
("Asia/Kuching", "Asia/Kuching"),
|
||||
("Asia/Kuwait", "Asia/Kuwait"),
|
||||
("Asia/Macao", "Asia/Macao"),
|
||||
("Asia/Macau", "Asia/Macau"),
|
||||
("Asia/Magadan", "Asia/Magadan"),
|
||||
("Asia/Makassar", "Asia/Makassar"),
|
||||
("Asia/Manila", "Asia/Manila"),
|
||||
("Asia/Muscat", "Asia/Muscat"),
|
||||
("Asia/Nicosia", "Asia/Nicosia"),
|
||||
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
|
||||
("Asia/Novosibirsk", "Asia/Novosibirsk"),
|
||||
("Asia/Omsk", "Asia/Omsk"),
|
||||
("Asia/Oral", "Asia/Oral"),
|
||||
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
|
||||
("Asia/Pontianak", "Asia/Pontianak"),
|
||||
("Asia/Pyongyang", "Asia/Pyongyang"),
|
||||
("Asia/Qatar", "Asia/Qatar"),
|
||||
("Asia/Qostanay", "Asia/Qostanay"),
|
||||
("Asia/Qyzylorda", "Asia/Qyzylorda"),
|
||||
("Asia/Rangoon", "Asia/Rangoon"),
|
||||
("Asia/Riyadh", "Asia/Riyadh"),
|
||||
("Asia/Saigon", "Asia/Saigon"),
|
||||
("Asia/Sakhalin", "Asia/Sakhalin"),
|
||||
("Asia/Samarkand", "Asia/Samarkand"),
|
||||
("Asia/Seoul", "Asia/Seoul"),
|
||||
("Asia/Shanghai", "Asia/Shanghai"),
|
||||
("Asia/Singapore", "Asia/Singapore"),
|
||||
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
|
||||
("Asia/Taipei", "Asia/Taipei"),
|
||||
("Asia/Tashkent", "Asia/Tashkent"),
|
||||
("Asia/Tbilisi", "Asia/Tbilisi"),
|
||||
("Asia/Tehran", "Asia/Tehran"),
|
||||
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
|
||||
("Asia/Thimbu", "Asia/Thimbu"),
|
||||
("Asia/Thimphu", "Asia/Thimphu"),
|
||||
("Asia/Tokyo", "Asia/Tokyo"),
|
||||
("Asia/Tomsk", "Asia/Tomsk"),
|
||||
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
|
||||
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
|
||||
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
|
||||
("Asia/Urumqi", "Asia/Urumqi"),
|
||||
("Asia/Ust-Nera", "Asia/Ust-Nera"),
|
||||
("Asia/Vientiane", "Asia/Vientiane"),
|
||||
("Asia/Vladivostok", "Asia/Vladivostok"),
|
||||
("Asia/Yakutsk", "Asia/Yakutsk"),
|
||||
("Asia/Yangon", "Asia/Yangon"),
|
||||
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
|
||||
("Asia/Yerevan", "Asia/Yerevan"),
|
||||
("Atlantic/Azores", "Atlantic/Azores"),
|
||||
("Atlantic/Bermuda", "Atlantic/Bermuda"),
|
||||
("Atlantic/Canary", "Atlantic/Canary"),
|
||||
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
|
||||
("Atlantic/Faeroe", "Atlantic/Faeroe"),
|
||||
("Atlantic/Faroe", "Atlantic/Faroe"),
|
||||
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
|
||||
("Atlantic/Madeira", "Atlantic/Madeira"),
|
||||
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
|
||||
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
|
||||
("Atlantic/St_Helena", "Atlantic/St_Helena"),
|
||||
("Atlantic/Stanley", "Atlantic/Stanley"),
|
||||
("Australia/ACT", "Australia/ACT"),
|
||||
("Australia/Adelaide", "Australia/Adelaide"),
|
||||
("Australia/Brisbane", "Australia/Brisbane"),
|
||||
("Australia/Broken_Hill", "Australia/Broken_Hill"),
|
||||
("Australia/Canberra", "Australia/Canberra"),
|
||||
("Australia/Currie", "Australia/Currie"),
|
||||
("Australia/Darwin", "Australia/Darwin"),
|
||||
("Australia/Eucla", "Australia/Eucla"),
|
||||
("Australia/Hobart", "Australia/Hobart"),
|
||||
("Australia/LHI", "Australia/LHI"),
|
||||
("Australia/Lindeman", "Australia/Lindeman"),
|
||||
("Australia/Lord_Howe", "Australia/Lord_Howe"),
|
||||
("Australia/Melbourne", "Australia/Melbourne"),
|
||||
("Australia/NSW", "Australia/NSW"),
|
||||
("Australia/North", "Australia/North"),
|
||||
("Australia/Perth", "Australia/Perth"),
|
||||
("Australia/Queensland", "Australia/Queensland"),
|
||||
("Australia/South", "Australia/South"),
|
||||
("Australia/Sydney", "Australia/Sydney"),
|
||||
("Australia/Tasmania", "Australia/Tasmania"),
|
||||
("Australia/Victoria", "Australia/Victoria"),
|
||||
("Australia/West", "Australia/West"),
|
||||
("Australia/Yancowinna", "Australia/Yancowinna"),
|
||||
("Brazil/Acre", "Brazil/Acre"),
|
||||
("Brazil/DeNoronha", "Brazil/DeNoronha"),
|
||||
("Brazil/East", "Brazil/East"),
|
||||
("Brazil/West", "Brazil/West"),
|
||||
("CET", "CET"),
|
||||
("CST6CDT", "CST6CDT"),
|
||||
("Canada/Atlantic", "Canada/Atlantic"),
|
||||
("Canada/Central", "Canada/Central"),
|
||||
("Canada/Eastern", "Canada/Eastern"),
|
||||
("Canada/Mountain", "Canada/Mountain"),
|
||||
("Canada/Newfoundland", "Canada/Newfoundland"),
|
||||
("Canada/Pacific", "Canada/Pacific"),
|
||||
("Canada/Saskatchewan", "Canada/Saskatchewan"),
|
||||
("Canada/Yukon", "Canada/Yukon"),
|
||||
("Chile/Continental", "Chile/Continental"),
|
||||
("Chile/EasterIsland", "Chile/EasterIsland"),
|
||||
("Cuba", "Cuba"),
|
||||
("EET", "EET"),
|
||||
("EST", "EST"),
|
||||
("EST5EDT", "EST5EDT"),
|
||||
("Egypt", "Egypt"),
|
||||
("Eire", "Eire"),
|
||||
("Etc/GMT", "Etc/GMT"),
|
||||
("Etc/GMT+0", "Etc/GMT+0"),
|
||||
("Etc/GMT+1", "Etc/GMT+1"),
|
||||
("Etc/GMT+10", "Etc/GMT+10"),
|
||||
("Etc/GMT+11", "Etc/GMT+11"),
|
||||
("Etc/GMT+12", "Etc/GMT+12"),
|
||||
("Etc/GMT+2", "Etc/GMT+2"),
|
||||
("Etc/GMT+3", "Etc/GMT+3"),
|
||||
("Etc/GMT+4", "Etc/GMT+4"),
|
||||
("Etc/GMT+5", "Etc/GMT+5"),
|
||||
("Etc/GMT+6", "Etc/GMT+6"),
|
||||
("Etc/GMT+7", "Etc/GMT+7"),
|
||||
("Etc/GMT+8", "Etc/GMT+8"),
|
||||
("Etc/GMT+9", "Etc/GMT+9"),
|
||||
("Etc/GMT-0", "Etc/GMT-0"),
|
||||
("Etc/GMT-1", "Etc/GMT-1"),
|
||||
("Etc/GMT-10", "Etc/GMT-10"),
|
||||
("Etc/GMT-11", "Etc/GMT-11"),
|
||||
("Etc/GMT-12", "Etc/GMT-12"),
|
||||
("Etc/GMT-13", "Etc/GMT-13"),
|
||||
("Etc/GMT-14", "Etc/GMT-14"),
|
||||
("Etc/GMT-2", "Etc/GMT-2"),
|
||||
("Etc/GMT-3", "Etc/GMT-3"),
|
||||
("Etc/GMT-4", "Etc/GMT-4"),
|
||||
("Etc/GMT-5", "Etc/GMT-5"),
|
||||
("Etc/GMT-6", "Etc/GMT-6"),
|
||||
("Etc/GMT-7", "Etc/GMT-7"),
|
||||
("Etc/GMT-8", "Etc/GMT-8"),
|
||||
("Etc/GMT-9", "Etc/GMT-9"),
|
||||
("Etc/GMT0", "Etc/GMT0"),
|
||||
("Etc/Greenwich", "Etc/Greenwich"),
|
||||
("Etc/UCT", "Etc/UCT"),
|
||||
("Etc/UTC", "Etc/UTC"),
|
||||
("Etc/Universal", "Etc/Universal"),
|
||||
("Etc/Zulu", "Etc/Zulu"),
|
||||
("Europe/Amsterdam", "Europe/Amsterdam"),
|
||||
("Europe/Andorra", "Europe/Andorra"),
|
||||
("Europe/Astrakhan", "Europe/Astrakhan"),
|
||||
("Europe/Athens", "Europe/Athens"),
|
||||
("Europe/Belfast", "Europe/Belfast"),
|
||||
("Europe/Belgrade", "Europe/Belgrade"),
|
||||
("Europe/Berlin", "Europe/Berlin"),
|
||||
("Europe/Bratislava", "Europe/Bratislava"),
|
||||
("Europe/Brussels", "Europe/Brussels"),
|
||||
("Europe/Bucharest", "Europe/Bucharest"),
|
||||
("Europe/Budapest", "Europe/Budapest"),
|
||||
("Europe/Busingen", "Europe/Busingen"),
|
||||
("Europe/Chisinau", "Europe/Chisinau"),
|
||||
("Europe/Copenhagen", "Europe/Copenhagen"),
|
||||
("Europe/Dublin", "Europe/Dublin"),
|
||||
("Europe/Gibraltar", "Europe/Gibraltar"),
|
||||
("Europe/Guernsey", "Europe/Guernsey"),
|
||||
("Europe/Helsinki", "Europe/Helsinki"),
|
||||
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
|
||||
("Europe/Istanbul", "Europe/Istanbul"),
|
||||
("Europe/Jersey", "Europe/Jersey"),
|
||||
("Europe/Kaliningrad", "Europe/Kaliningrad"),
|
||||
("Europe/Kiev", "Europe/Kiev"),
|
||||
("Europe/Kirov", "Europe/Kirov"),
|
||||
("Europe/Kyiv", "Europe/Kyiv"),
|
||||
("Europe/Lisbon", "Europe/Lisbon"),
|
||||
("Europe/Ljubljana", "Europe/Ljubljana"),
|
||||
("Europe/London", "Europe/London"),
|
||||
("Europe/Luxembourg", "Europe/Luxembourg"),
|
||||
("Europe/Madrid", "Europe/Madrid"),
|
||||
("Europe/Malta", "Europe/Malta"),
|
||||
("Europe/Mariehamn", "Europe/Mariehamn"),
|
||||
("Europe/Minsk", "Europe/Minsk"),
|
||||
("Europe/Monaco", "Europe/Monaco"),
|
||||
("Europe/Moscow", "Europe/Moscow"),
|
||||
("Europe/Nicosia", "Europe/Nicosia"),
|
||||
("Europe/Oslo", "Europe/Oslo"),
|
||||
("Europe/Paris", "Europe/Paris"),
|
||||
("Europe/Podgorica", "Europe/Podgorica"),
|
||||
("Europe/Prague", "Europe/Prague"),
|
||||
("Europe/Riga", "Europe/Riga"),
|
||||
("Europe/Rome", "Europe/Rome"),
|
||||
("Europe/Samara", "Europe/Samara"),
|
||||
("Europe/San_Marino", "Europe/San_Marino"),
|
||||
("Europe/Sarajevo", "Europe/Sarajevo"),
|
||||
("Europe/Saratov", "Europe/Saratov"),
|
||||
("Europe/Simferopol", "Europe/Simferopol"),
|
||||
("Europe/Skopje", "Europe/Skopje"),
|
||||
("Europe/Sofia", "Europe/Sofia"),
|
||||
("Europe/Stockholm", "Europe/Stockholm"),
|
||||
("Europe/Tallinn", "Europe/Tallinn"),
|
||||
("Europe/Tirane", "Europe/Tirane"),
|
||||
("Europe/Tiraspol", "Europe/Tiraspol"),
|
||||
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
|
||||
("Europe/Uzhgorod", "Europe/Uzhgorod"),
|
||||
("Europe/Vaduz", "Europe/Vaduz"),
|
||||
("Europe/Vatican", "Europe/Vatican"),
|
||||
("Europe/Vienna", "Europe/Vienna"),
|
||||
("Europe/Vilnius", "Europe/Vilnius"),
|
||||
("Europe/Volgograd", "Europe/Volgograd"),
|
||||
("Europe/Warsaw", "Europe/Warsaw"),
|
||||
("Europe/Zagreb", "Europe/Zagreb"),
|
||||
("Europe/Zaporozhye", "Europe/Zaporozhye"),
|
||||
("Europe/Zurich", "Europe/Zurich"),
|
||||
("Factory", "Factory"),
|
||||
("GB", "GB"),
|
||||
("GB-Eire", "GB-Eire"),
|
||||
("GMT", "GMT"),
|
||||
("GMT+0", "GMT+0"),
|
||||
("GMT-0", "GMT-0"),
|
||||
("GMT0", "GMT0"),
|
||||
("Greenwich", "Greenwich"),
|
||||
("HST", "HST"),
|
||||
("Hongkong", "Hongkong"),
|
||||
("Iceland", "Iceland"),
|
||||
("Indian/Antananarivo", "Indian/Antananarivo"),
|
||||
("Indian/Chagos", "Indian/Chagos"),
|
||||
("Indian/Christmas", "Indian/Christmas"),
|
||||
("Indian/Cocos", "Indian/Cocos"),
|
||||
("Indian/Comoro", "Indian/Comoro"),
|
||||
("Indian/Kerguelen", "Indian/Kerguelen"),
|
||||
("Indian/Mahe", "Indian/Mahe"),
|
||||
("Indian/Maldives", "Indian/Maldives"),
|
||||
("Indian/Mauritius", "Indian/Mauritius"),
|
||||
("Indian/Mayotte", "Indian/Mayotte"),
|
||||
("Indian/Reunion", "Indian/Reunion"),
|
||||
("Iran", "Iran"),
|
||||
("Israel", "Israel"),
|
||||
("Jamaica", "Jamaica"),
|
||||
("Japan", "Japan"),
|
||||
("Kwajalein", "Kwajalein"),
|
||||
("Libya", "Libya"),
|
||||
("MET", "MET"),
|
||||
("MST", "MST"),
|
||||
("MST7MDT", "MST7MDT"),
|
||||
("Mexico/BajaNorte", "Mexico/BajaNorte"),
|
||||
("Mexico/BajaSur", "Mexico/BajaSur"),
|
||||
("Mexico/General", "Mexico/General"),
|
||||
("NZ", "NZ"),
|
||||
("NZ-CHAT", "NZ-CHAT"),
|
||||
("Navajo", "Navajo"),
|
||||
("PRC", "PRC"),
|
||||
("PST8PDT", "PST8PDT"),
|
||||
("Pacific/Apia", "Pacific/Apia"),
|
||||
("Pacific/Auckland", "Pacific/Auckland"),
|
||||
("Pacific/Bougainville", "Pacific/Bougainville"),
|
||||
("Pacific/Chatham", "Pacific/Chatham"),
|
||||
("Pacific/Chuuk", "Pacific/Chuuk"),
|
||||
("Pacific/Easter", "Pacific/Easter"),
|
||||
("Pacific/Efate", "Pacific/Efate"),
|
||||
("Pacific/Enderbury", "Pacific/Enderbury"),
|
||||
("Pacific/Fakaofo", "Pacific/Fakaofo"),
|
||||
("Pacific/Fiji", "Pacific/Fiji"),
|
||||
("Pacific/Funafuti", "Pacific/Funafuti"),
|
||||
("Pacific/Galapagos", "Pacific/Galapagos"),
|
||||
("Pacific/Gambier", "Pacific/Gambier"),
|
||||
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
|
||||
("Pacific/Guam", "Pacific/Guam"),
|
||||
("Pacific/Honolulu", "Pacific/Honolulu"),
|
||||
("Pacific/Johnston", "Pacific/Johnston"),
|
||||
("Pacific/Kanton", "Pacific/Kanton"),
|
||||
("Pacific/Kiritimati", "Pacific/Kiritimati"),
|
||||
("Pacific/Kosrae", "Pacific/Kosrae"),
|
||||
("Pacific/Kwajalein", "Pacific/Kwajalein"),
|
||||
("Pacific/Majuro", "Pacific/Majuro"),
|
||||
("Pacific/Marquesas", "Pacific/Marquesas"),
|
||||
("Pacific/Midway", "Pacific/Midway"),
|
||||
("Pacific/Nauru", "Pacific/Nauru"),
|
||||
("Pacific/Niue", "Pacific/Niue"),
|
||||
("Pacific/Norfolk", "Pacific/Norfolk"),
|
||||
("Pacific/Noumea", "Pacific/Noumea"),
|
||||
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
|
||||
("Pacific/Palau", "Pacific/Palau"),
|
||||
("Pacific/Pitcairn", "Pacific/Pitcairn"),
|
||||
("Pacific/Pohnpei", "Pacific/Pohnpei"),
|
||||
("Pacific/Ponape", "Pacific/Ponape"),
|
||||
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
|
||||
("Pacific/Rarotonga", "Pacific/Rarotonga"),
|
||||
("Pacific/Saipan", "Pacific/Saipan"),
|
||||
("Pacific/Samoa", "Pacific/Samoa"),
|
||||
("Pacific/Tahiti", "Pacific/Tahiti"),
|
||||
("Pacific/Tarawa", "Pacific/Tarawa"),
|
||||
("Pacific/Tongatapu", "Pacific/Tongatapu"),
|
||||
("Pacific/Truk", "Pacific/Truk"),
|
||||
("Pacific/Wake", "Pacific/Wake"),
|
||||
("Pacific/Wallis", "Pacific/Wallis"),
|
||||
("Pacific/Yap", "Pacific/Yap"),
|
||||
("Poland", "Poland"),
|
||||
("Portugal", "Portugal"),
|
||||
("ROC", "ROC"),
|
||||
("ROK", "ROK"),
|
||||
("Singapore", "Singapore"),
|
||||
("Turkey", "Turkey"),
|
||||
("UCT", "UCT"),
|
||||
("US/Alaska", "US/Alaska"),
|
||||
("US/Aleutian", "US/Aleutian"),
|
||||
("US/Arizona", "US/Arizona"),
|
||||
("US/Central", "US/Central"),
|
||||
("US/East-Indiana", "US/East-Indiana"),
|
||||
("US/Eastern", "US/Eastern"),
|
||||
("US/Hawaii", "US/Hawaii"),
|
||||
("US/Indiana-Starke", "US/Indiana-Starke"),
|
||||
("US/Michigan", "US/Michigan"),
|
||||
("US/Mountain", "US/Mountain"),
|
||||
("US/Pacific", "US/Pacific"),
|
||||
("US/Samoa", "US/Samoa"),
|
||||
("UTC", "UTC"),
|
||||
("Universal", "Universal"),
|
||||
("W-SU", "W-SU"),
|
||||
("WET", "WET"),
|
||||
("Zulu", "Zulu"),
|
||||
("localtime", "localtime"),
|
||||
],
|
||||
default=aircox.models.schedule.current_timezone_key,
|
||||
help_text="timezone used for the date",
|
||||
max_length=100,
|
||||
verbose_name="timezone",
|
||||
),
|
||||
),
|
||||
]
|
||||
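Compared with the earlier timezone list, this migration also picks up the "localtime" key. A choices list of this shape, together with a callable default like current_timezone_key, can be produced along these lines; this is a hedged sketch of the likely mechanism, not code taken from aircox.models.schedule.

# Sketch only: (key, key) pairs in the same shape as the serialized choices,
# plus a default returning the currently active timezone's key.
from zoneinfo import available_timezones

from django.utils import timezone


def timezone_choices():
    return [(key, key) for key in sorted(available_timezones())]


def current_timezone_key():
    return timezone.get_current_timezone_name()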
18
aircox/migrations/0022_set_group_ownership.py
Normal file
@ -0,0 +1,18 @@
|
||||
from django.db import migrations
|
||||
|
||||
from aircox.models import Program
|
||||
|
||||
|
||||
def set_group_ownership(*args):
|
||||
for program in Program.objects.all():
|
||||
program.set_group_ownership()
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
dependencies = [
|
||||
("aircox", "0021_alter_schedule_timezone"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(set_group_ownership),
|
||||
]
|
||||
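This data migration calls Program.set_group_ownership() for every existing program. A hedged variant of the same migration with an explicit no-op reverse, so it can also be unapplied, could look like this (structure assumed, not part of the changeset):

from django.db import migrations


def set_group_ownership(apps, schema_editor):
    # Imported lazily, as in the original migration.
    from aircox.models import Program

    for program in Program.objects.all():
        program.set_group_ownership()


class Migration(migrations.Migration):
    dependencies = [("aircox", "0021_alter_schedule_timezone")]

    operations = [
        # Reversing is a no-op: group ownership is left in place on unapply.
        migrations.RunPython(set_group_ownership, migrations.RunPython.noop),
    ]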
0
aircox/migrations/__init__.py
Normal file
aircox/models/__init__.py
@ -1,11 +1,39 @@
|
||||
from .article import Article
|
||||
from .page import Category, Page, StaticPage, Comment, NavItem
|
||||
from .program import Program, Stream, Schedule
|
||||
from .episode import Episode, Diffusion
|
||||
from .log import Log
|
||||
from .sound import Sound, Track
|
||||
from .station import Station, Port
|
||||
|
||||
from . import signals
|
||||
from .article import Article
|
||||
from .diffusion import Diffusion, DiffusionQuerySet
|
||||
from .episode import Episode
|
||||
from .log import Log, LogQuerySet
|
||||
from .page import Category, Comment, NavItem, Page, PageQuerySet, StaticPage
|
||||
from .program import Program, ProgramChildQuerySet, ProgramQuerySet, Stream
|
||||
from .schedule import Schedule
|
||||
from .sound import Sound, SoundQuerySet, Track
|
||||
from .station import Port, Station, StationQuerySet
|
||||
from .user_settings import UserSettings
|
||||
|
||||
|
||||
__all__ = (
|
||||
"signals",
|
||||
"Article",
|
||||
"Episode",
|
||||
"Diffusion",
|
||||
"DiffusionQuerySet",
|
||||
"Log",
|
||||
"LogQuerySet",
|
||||
"Category",
|
||||
"PageQuerySet",
|
||||
"Page",
|
||||
"StaticPage",
|
||||
"Comment",
|
||||
"NavItem",
|
||||
"Program",
|
||||
"ProgramQuerySet",
|
||||
"Stream",
|
||||
"Schedule",
|
||||
"ProgramChildQuerySet",
|
||||
"Sound",
|
||||
"SoundQuerySet",
|
||||
"Track",
|
||||
"Station",
|
||||
"StationQuerySet",
|
||||
"Port",
|
||||
"UserSettings",
|
||||
)
|
||||
|
||||
Binary files not shown (9 files).
aircox/models/article.py
@ -1,16 +1,17 @@
|
||||
from django.db import models
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from .page import Page, PageQuerySet
|
||||
from .program import Program, ProgramChildQuerySet
|
||||
from .page import Page
|
||||
from .program import ProgramChildQuerySet
|
||||
|
||||
__all__ = ("Article",)
|
||||
|
||||
|
||||
class Article(Page):
|
||||
detail_url_name = 'article-detail'
|
||||
detail_url_name = "article-detail"
|
||||
template_prefix = "article"
|
||||
|
||||
objects = ProgramChildQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Article')
|
||||
verbose_name_plural = _('Articles')
|
||||
|
||||
verbose_name = _("Article")
|
||||
verbose_name_plural = _("Articles")
|
||||
|
||||
265
aircox/models/diffusion.py
Normal file
@ -0,0 +1,265 @@
|
||||
import datetime
|
||||
|
||||
from django.db import models
|
||||
from django.db.models import Q
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.functional import cached_property
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from aircox import utils
|
||||
|
||||
from .episode import Episode
|
||||
from .schedule import Schedule
|
||||
from .rerun import Rerun, RerunQuerySet
|
||||
|
||||
|
||||
__all__ = ("Diffusion", "DiffusionQuerySet")
|
||||
|
||||
|
||||
class DiffusionQuerySet(RerunQuerySet):
|
||||
def episode(self, episode=None, id=None):
|
||||
"""Diffusions for this episode."""
|
||||
return self.filter(episode=episode) if id is None else self.filter(episode__id=id)
|
||||
|
||||
def on_air(self):
|
||||
"""On air diffusions."""
|
||||
return self.filter(type=Diffusion.TYPE_ON_AIR)
|
||||
|
||||
# TODO: rename to `datetime`
|
||||
def now(self, now=None, order=True):
|
||||
"""Diffusions occuring now."""
|
||||
now = now or tz.now()
|
||||
qs = self.filter(start__lte=now, end__gte=now).distinct()
|
||||
return qs.order_by("start") if order else qs
|
||||
|
||||
def date(self, date=None, order=True):
|
||||
"""Diffusions occuring date."""
|
||||
date = date or datetime.date.today()
|
||||
start = tz.make_aware(tz.datetime.combine(date, datetime.time()))
|
||||
end = tz.make_aware(tz.datetime.combine(date, datetime.time(23, 59, 59, 999)))
|
||||
# start = tz.get_current_timezone().localize(start)
|
||||
# end = tz.get_current_timezone().localize(end)
|
||||
qs = self.filter(start__range=(start, end))
|
||||
return qs.order_by("start") if order else qs
|
||||
|
||||
def at(self, date, order=True):
|
||||
"""Return diffusions at specified date or datetime."""
|
||||
return self.now(date, order) if isinstance(date, tz.datetime) else self.date(date, order)
|
||||
|
||||
def after(self, date=None):
|
||||
"""Return a queryset of diffusions that happen after the given date
|
||||
(default: today)."""
|
||||
date = utils.date_or_default(date)
|
||||
if isinstance(date, tz.datetime):
|
||||
qs = self.filter(Q(start__gte=date) | Q(end__gte=date))
|
||||
else:
|
||||
qs = self.filter(Q(start__date__gte=date) | Q(end__date__gte=date))
|
||||
return qs.order_by("start")
|
||||
|
||||
def before(self, date=None):
|
||||
"""Return a queryset of diffusions that finish before the given date
|
||||
(default: today)."""
|
||||
date = utils.date_or_default(date)
|
||||
if isinstance(date, tz.datetime):
|
||||
qs = self.filter(start__lt=date)
|
||||
else:
|
||||
qs = self.filter(start__date__lt=date)
|
||||
return qs.order_by("start")
|
||||
|
||||
def range(self, start, end):
|
||||
# FIXME can return dates that are out of range...
|
||||
return self.after(start).before(end)
|
||||
|
||||
|
||||
class Diffusion(Rerun):
|
||||
"""A Diffusion is an occurrence of a Program that is scheduled on the
|
||||
station's timetable. It can be a rerun of a previous diffusion. In such a
|
||||
case, use rerun's info instead of its own.
|
||||
|
||||
A Diffusion without any rerun is named Episode (previously, a
|
||||
Diffusion was different from an Episode, but in the end, an
|
||||
episode only has a name, a linked program, and a list of sounds, so we
|
||||
finally merged them).
|
||||
|
||||
A Diffusion can have different types:
|
||||
- default: a simple diffusion that is planned / has occurred
|
||||
- unconfirmed: a generated diffusion that has not been confirmed and thus
|
||||
is not yet planned
|
||||
- cancel: the diffusion has been canceled
|
||||
- stop: the diffusion has been manually stopped
|
||||
"""
|
||||
|
||||
list_url_name = "timetable-list"
|
||||
|
||||
objects = DiffusionQuerySet.as_manager()
|
||||
|
||||
TYPE_ON_AIR = 0x00
|
||||
TYPE_UNCONFIRMED = 0x01
|
||||
TYPE_CANCEL = 0x02
|
||||
TYPE_CHOICES = (
|
||||
(TYPE_ON_AIR, _("on air")),
|
||||
(TYPE_UNCONFIRMED, _("not confirmed")),
|
||||
(TYPE_CANCEL, _("cancelled")),
|
||||
)
|
||||
|
||||
episode = models.ForeignKey(
|
||||
Episode,
|
||||
models.CASCADE,
|
||||
verbose_name=_("episode"),
|
||||
)
|
||||
schedule = models.ForeignKey(
|
||||
Schedule,
|
||||
models.CASCADE,
|
||||
verbose_name=_("schedule"),
|
||||
blank=True,
|
||||
null=True,
|
||||
)
|
||||
type = models.SmallIntegerField(
|
||||
verbose_name=_("type"),
|
||||
default=TYPE_ON_AIR,
|
||||
choices=TYPE_CHOICES,
|
||||
)
|
||||
start = models.DateTimeField(_("start"), db_index=True)
|
||||
end = models.DateTimeField(_("end"), db_index=True)
|
||||
# port = models.ForeignKey(
|
||||
# 'self',
|
||||
# verbose_name = _('port'),
|
||||
# blank = True, null = True,
|
||||
# on_delete=models.SET_NULL,
|
||||
# help_text = _('use this input port'),
|
||||
# )
|
||||
|
||||
class Meta:
|
||||
verbose_name = _("Diffusion")
|
||||
verbose_name_plural = _("Diffusions")
|
||||
permissions = (("programming", _("edit the diffusions' planification")),)
|
||||
|
||||
def __str__(self):
|
||||
str_ = "{episode} - {date}".format(
|
||||
episode=self.episode and self.episode.title,
|
||||
date=self.local_start.strftime("%Y/%m/%d %H:%M%z"),
|
||||
)
|
||||
if self.initial:
|
||||
str_ += " ({})".format(_("rerun"))
|
||||
return str_
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
super().save(*args, **kwargs)
|
||||
if self.is_initial and self.episode != self._initial["episode"]:
|
||||
self.rerun_set.update(episode=self.episode, program=self.program)
|
||||
|
||||
# def save(self, no_check=False, *args, **kwargs):
|
||||
# if self.start != self._initial['start'] or \
|
||||
# self.end != self._initial['end']:
|
||||
# self.check_conflicts()
|
||||
|
||||
def save_rerun(self):
|
||||
self.episode = self.initial.episode
|
||||
super().save_rerun()
|
||||
|
||||
def save_initial(self):
|
||||
self.program = self.episode.program
|
||||
|
||||
@property
|
||||
def duration(self):
|
||||
return self.end - self.start
|
||||
|
||||
@property
|
||||
def date(self):
|
||||
"""Return diffusion start as a date."""
|
||||
|
||||
return utils.cast_date(self.start)
|
||||
|
||||
@cached_property
|
||||
def local_start(self):
|
||||
"""Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want to get
|
||||
it as local time."""
|
||||
|
||||
return tz.localtime(self.start, tz.get_current_timezone())
|
||||
|
||||
@property
|
||||
def local_end(self):
|
||||
"""Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want to get
|
||||
it as local time."""
|
||||
|
||||
return tz.localtime(self.end, tz.get_current_timezone())
|
||||
|
||||
@property
|
||||
def is_now(self):
|
||||
"""True if diffusion is currently running."""
|
||||
now = tz.now()
|
||||
return self.type == self.TYPE_ON_AIR and self.start <= now and self.end >= now
|
||||
|
||||
@property
|
||||
def is_today(self):
|
||||
"""True if diffusion is currently today."""
|
||||
return self.start.date() == datetime.date.today()
|
||||
|
||||
@property
|
||||
def is_live(self):
|
||||
"""True if Diffusion is live (False if there are sounds files)."""
|
||||
return self.type == self.TYPE_ON_AIR and not self.episode.sound_set.archive().count()
|
||||
|
||||
def get_playlist(self, **types):
|
||||
"""Returns sounds as a playlist (list of *local* archive file path).
|
||||
|
||||
The given arguments are passed to ``get_sounds``.
|
||||
"""
|
||||
from .sound import Sound
|
||||
|
||||
return list(
|
||||
self.get_sounds(**types).filter(path__isnull=False, type=Sound.TYPE_ARCHIVE).values_list("path", flat=True)
|
||||
)
|
||||
|
||||
def get_sounds(self, **types):
|
||||
"""Return a queryset of sounds related to this diffusion, ordered by
|
||||
type then path.
|
||||
|
||||
**types: filter on the given sound type names, e.g. `archive=True`
|
||||
"""
|
||||
from .sound import Sound
|
||||
|
||||
sounds = (self.initial or self).sound_set.order_by("type", "path")
|
||||
_in = [getattr(Sound.Type, name) for name, value in types.items() if value]
|
||||
|
||||
return sounds.filter(type__in=_in)
|
||||
|
||||
def is_date_in_range(self, date=None):
|
||||
"""Return true if the given date is in the diffusion's start-end
|
||||
range."""
|
||||
date = date or tz.now()
|
||||
|
||||
return self.start < date < self.end
|
||||
|
||||
def get_conflicts(self):
|
||||
"""Return conflicting diffusions queryset."""
|
||||
|
||||
# conflicts=Diffusion.objects.filter(
|
||||
# Q(start__lt=OuterRef('start'), end__gt=OuterRef('end')) |
|
||||
# Q(start__gt=OuterRef('start'), start__lt=OuterRef('end'))
|
||||
# )
|
||||
# diffs= Diffusion.objects.annotate(conflict_with=Exists(conflicts))
|
||||
# .filter(conflict_with=True)
|
||||
return (
|
||||
Diffusion.objects.filter(
|
||||
Q(start__lt=self.start, end__gt=self.start) | Q(start__gt=self.start, start__lt=self.end)
|
||||
)
|
||||
.exclude(pk=self.pk)
|
||||
.distinct()
|
||||
)
|
||||
|
||||
def check_conflicts(self):
|
||||
conflicts = self.get_conflicts()
|
||||
self.conflicts.set(conflicts)
|
||||
|
||||
_initial = None
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
self._initial = {
|
||||
"start": self.start,
|
||||
"end": self.end,
|
||||
"episode": getattr(self, "episode", None),
|
||||
}
|
||||
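The DiffusionQuerySet defined above is chainable, so its helpers compose with regular ORM filters. A minimal usage sketch, assuming the import path exposed by aircox/models/__init__.py:

import datetime

from aircox.models import Diffusion

# Everything on air today, ordered by start time:
today_qs = Diffusion.objects.on_air().date(datetime.date.today())

# What is playing right now (start <= now <= end):
current = Diffusion.objects.on_air().now().first()

# A week of planning through range(), i.e. after(start) combined with before(end):
start = datetime.date.today()
week = Diffusion.objects.range(start, start + datetime.timedelta(days=7))

# Local file paths of archived sounds for the current diffusion (get_playlist):
paths = current.get_playlist(archive=True) if current else []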
aircox/models/episode.py
@ -1,56 +1,63 @@
|
||||
import datetime
|
||||
|
||||
from django.db import models
|
||||
from django.db.models import Q
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.utils.functional import cached_property
|
||||
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from easy_thumbnails.files import get_thumbnailer
|
||||
|
||||
from aircox import settings, utils
|
||||
from .program import Program, ProgramChildQuerySet, \
|
||||
BaseRerun, BaseRerunQuerySet, Schedule
|
||||
from .page import Page, PageQuerySet
|
||||
from aircox.conf import settings
|
||||
|
||||
from .page import Page
|
||||
from .program import ProgramChildQuerySet
|
||||
|
||||
__all__ = ("Episode",)
|
||||
|
||||
|
||||
__all__ = ['Episode', 'Diffusion', 'DiffusionQuerySet']
|
||||
class EpisodeQuerySet(ProgramChildQuerySet):
|
||||
def with_podcasts(self):
|
||||
return self.filter(sound__is_public=True).distinct()
|
||||
|
||||
|
||||
class Episode(Page):
|
||||
objects = ProgramChildQuerySet.as_manager()
|
||||
detail_url_name = 'episode-detail'
|
||||
item_template_name = 'aircox/widgets/episode_item.html'
|
||||
objects = EpisodeQuerySet.as_manager()
|
||||
detail_url_name = "episode-detail"
|
||||
list_url_name = "episode-list"
|
||||
template_prefix = "episode"
|
||||
|
||||
@property
|
||||
def program(self):
|
||||
return getattr(self.parent, 'program', None)
|
||||
|
||||
@cached_property
|
||||
def podcasts(self):
|
||||
""" Return serialized data about podcasts. """
|
||||
from ..serializers import PodcastSerializer
|
||||
podcasts = [PodcastSerializer(s).data
|
||||
for s in self.sound_set.public().order_by('type') ]
|
||||
if self.cover:
|
||||
options = {'size': (128,128), 'crop':'scale'}
|
||||
cover = get_thumbnailer(self.cover).get_thumbnail(options).url
|
||||
else:
|
||||
cover = None
|
||||
|
||||
for index, podcast in enumerate(podcasts):
|
||||
podcasts[index]['cover'] = cover
|
||||
podcasts[index]['page_url'] = self.get_absolute_url()
|
||||
podcasts[index]['page_title'] = self.title
|
||||
return podcasts
|
||||
return self.parent_subclass
|
||||
|
||||
@program.setter
|
||||
def program(self, value):
|
||||
self.parent = value
|
||||
|
||||
@cached_property
|
||||
def podcasts(self):
|
||||
"""Return serialized data about podcasts."""
|
||||
from .sound import Sound
|
||||
from ..serializers import PodcastSerializer
|
||||
|
||||
podcasts = [PodcastSerializer(s).data for s in self.sound_set.public().order_by("type")]
|
||||
if self.cover:
|
||||
options = {"size": (128, 128), "crop": "scale"}
|
||||
cover = get_thumbnailer(self.cover).get_thumbnail(options).url
|
||||
else:
|
||||
cover = None
|
||||
|
||||
archive_index = 1
|
||||
for index, podcast in enumerate(podcasts):
|
||||
if podcast["type"] == Sound.TYPE_ARCHIVE:
|
||||
if archive_index > 1:
|
||||
podcast["name"] = f"{self.title} - {archive_index}"
|
||||
else:
|
||||
podcast["name"] = self.title
|
||||
|
||||
podcasts[index]["cover"] = cover
|
||||
podcasts[index]["page_url"] = self.get_absolute_url()
|
||||
podcasts[index]["page_title"] = self.title
|
||||
return podcasts
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Episode')
|
||||
verbose_name_plural = _('Episodes')
|
||||
verbose_name = _("Episode")
|
||||
verbose_name_plural = _("Episodes")
|
||||
|
||||
def get_absolute_url(self):
|
||||
if not self.is_published:
|
||||
@ -59,260 +66,25 @@ class Episode(Page):
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if self.parent is None:
|
||||
raise ValueError('missing parent program')
|
||||
raise ValueError("missing parent program")
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
@classmethod
|
||||
def get_init_kwargs_from(cls, page, date, title=None, **kwargs):
|
||||
""" Get default Episode's title """
|
||||
title = settings.AIRCOX_EPISODE_TITLE.format(
|
||||
def get_default_title(cls, page, date):
|
||||
return settings.EPISODE_TITLE.format(
|
||||
program=page,
|
||||
date=date.strftime(settings.AIRCOX_EPISODE_TITLE_DATE_FORMAT),
|
||||
) if title is None else title
|
||||
return super().get_init_kwargs_from(page, title=title, program=page,
|
||||
**kwargs)
|
||||
|
||||
|
||||
class DiffusionQuerySet(BaseRerunQuerySet):
|
||||
def episode(self, episode=None, id=None):
|
||||
""" Diffusions for this episode """
|
||||
return self.filter(episode=episode) if id is None else \
|
||||
self.filter(episode__id=id)
|
||||
|
||||
def on_air(self):
|
||||
""" On air diffusions """
|
||||
return self.filter(type=Diffusion.TYPE_ON_AIR)
|
||||
|
||||
# TODO: rename to `datetime`
|
||||
def now(self, now=None, order=True):
|
||||
""" Diffusions occuring now """
|
||||
now = now or tz.now()
|
||||
qs = self.filter(start__lte=now, end__gte=now).distinct()
|
||||
return qs.order_by('start') if order else qs
|
||||
|
||||
def date(self, date=None, order=True):
|
||||
""" Diffusions occuring date. """
|
||||
date = date or datetime.date.today()
|
||||
start = tz.datetime.combine(date, datetime.time())
|
||||
end = tz.datetime.combine(date, datetime.time(23, 59, 59, 999))
|
||||
# start = tz.get_current_timezone().localize(start)
|
||||
# end = tz.get_current_timezone().localize(end)
|
||||
qs = self.filter(start__range = (start, end))
|
||||
return qs.order_by('start') if order else qs
|
||||
|
||||
def at(self, date, order=True):
|
||||
""" Return diffusions at specified date or datetime """
|
||||
return self.now(date, order) if isinstance(date, tz.datetime) else \
|
||||
self.date(date, order)
|
||||
|
||||
def after(self, date=None):
|
||||
"""
|
||||
Return a queryset of diffusions that happen after the given
|
||||
date (default: today).
|
||||
"""
|
||||
date = utils.date_or_default(date)
|
||||
if isinstance(date, tz.datetime):
|
||||
qs = self.filter(Q(start__gte=date) | Q(end__gte=date))
|
||||
else:
|
||||
qs = self.filter(Q(start__date__gte=date) | Q(end__date__gte=date))
|
||||
return qs.order_by('start')
|
||||
|
||||
def before(self, date=None):
|
||||
"""
|
||||
Return a queryset of diffusions that finish before the given
|
||||
date (default: today).
|
||||
"""
|
||||
date = utils.date_or_default(date)
|
||||
if isinstance(date, tz.datetime):
|
||||
qs = self.filter(start__lt=date)
|
||||
else:
|
||||
qs = self.filter(start__date__lt=date)
|
||||
return qs.order_by('start')
|
||||
|
||||
def range(self, start, end):
|
||||
# FIXME can return dates that are out of range...
|
||||
return self.after(start).before(end)
|
||||
|
||||
|
||||
class Diffusion(BaseRerun):
|
||||
"""
|
||||
A Diffusion is an occurrence of a Program that is scheduled on the
|
||||
station's timetable. It can be a rerun of a previous diffusion. In such
|
||||
a case, use rerun's info instead of its own.
|
||||
|
||||
A Diffusion without any rerun is named Episode (previously, a
|
||||
Diffusion was different from an Episode, but in the end, an
|
||||
episode only has a name, a linked program, and a list of sounds, so we
|
||||
finally merge theme).
|
||||
|
||||
A Diffusion can have different types:
|
||||
- default: simple diffusion that is planified / did occurred
|
||||
- unconfirmed: a generated diffusion that has not been confirmed and thus
|
||||
is not yet planified
|
||||
- cancel: the diffusion has been canceled
|
||||
- stop: the diffusion has been manually stopped
|
||||
"""
|
||||
objects = DiffusionQuerySet.as_manager()
|
||||
|
||||
TYPE_ON_AIR = 0x00
|
||||
TYPE_UNCONFIRMED = 0x01
|
||||
TYPE_CANCEL = 0x02
|
||||
TYPE_CHOICES = (
|
||||
(TYPE_ON_AIR, _('on air')),
|
||||
(TYPE_UNCONFIRMED, _('not confirmed')),
|
||||
(TYPE_CANCEL, _('cancelled')),
|
||||
)
|
||||
|
||||
episode = models.ForeignKey(
|
||||
Episode, models.CASCADE, verbose_name=_('episode'),
|
||||
)
|
||||
schedule = models.ForeignKey(
|
||||
Schedule, models.CASCADE, verbose_name=_('schedule'),
|
||||
blank=True, null=True,
|
||||
)
|
||||
type = models.SmallIntegerField(
|
||||
verbose_name=_('type'), default=TYPE_ON_AIR, choices=TYPE_CHOICES,
|
||||
)
|
||||
start = models.DateTimeField(_('start'), db_index=True)
|
||||
end = models.DateTimeField(_('end'), db_index=True)
|
||||
# port = models.ForeignKey(
|
||||
# 'self',
|
||||
# verbose_name = _('port'),
|
||||
# blank = True, null = True,
|
||||
# on_delete=models.SET_NULL,
|
||||
# help_text = _('use this input port'),
|
||||
# )
|
||||
|
||||
item_template_name = 'aircox/widgets/diffusion_item.html'
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Diffusion')
|
||||
verbose_name_plural = _('Diffusions')
|
||||
permissions = (
|
||||
('programming', _('edit the diffusion\'s planification')),
|
||||
date=date.strftime(settings.EPISODE_TITLE_DATE_FORMAT),
|
||||
)
|
||||
|
||||
def __str__(self):
|
||||
str_ = '{episode} - {date}'.format(
|
||||
self=self, episode=self.episode and self.episode.title,
|
||||
date=self.local_start.strftime('%Y/%m/%d %H:%M%z'),
|
||||
@classmethod
|
||||
def get_init_kwargs_from(cls, page, date=None, title=None, **kwargs):
|
||||
"""Get default Episode's title."""
|
||||
title = (
|
||||
settings.EPISODE_TITLE.format(
|
||||
program=page,
|
||||
date=date.strftime(settings.EPISODE_TITLE_DATE_FORMAT),
|
||||
)
|
||||
if title is None
|
||||
else title
|
||||
)
|
||||
if self.initial:
|
||||
str_ += ' ({})'.format(_('rerun'))
|
||||
return str_
|
||||
|
||||
#def save(self, no_check=False, *args, **kwargs):
|
||||
#if self.start != self._initial['start'] or \
|
||||
# self.end != self._initial['end']:
|
||||
# self.check_conflicts()
|
||||
|
||||
def save_rerun(self):
|
||||
self.episode = self.initial.episode
|
||||
self.program = self.episode.program
|
||||
|
||||
def save_initial(self):
|
||||
self.program = self.episode.program
|
||||
if self.episode != self._initial['episode']:
|
||||
self.rerun_set.update(episode=self.episode, program=self.program)
|
||||
|
||||
@property
|
||||
def duration(self):
|
||||
return self.end - self.start
|
||||
|
||||
@property
|
||||
def date(self):
|
||||
""" Return diffusion start as a date. """
|
||||
|
||||
return utils.cast_date(self.start)
|
||||
|
||||
@cached_property
|
||||
def local_start(self):
|
||||
"""
|
||||
Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want
|
||||
to get it as local time.
|
||||
"""
|
||||
|
||||
return tz.localtime(self.start, tz.get_current_timezone())
|
||||
|
||||
@property
|
||||
def local_end(self):
|
||||
"""
|
||||
Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want
|
||||
to get it as local time.
|
||||
"""
|
||||
|
||||
return tz.localtime(self.end, tz.get_current_timezone())
|
||||
|
||||
@property
|
||||
def is_now(self):
|
||||
""" True if diffusion is currently running """
|
||||
now = tz.now()
|
||||
return self.type == self.TYPE_ON_AIR and \
|
||||
self.start <= now and self.end >= now
|
||||
|
||||
# TODO: property?
|
||||
def is_live(self):
|
||||
""" True if Diffusion is live (False if there are sounds files). """
|
||||
return self.type == self.TYPE_ON_AIR and \
|
||||
not self.episode.sound_set.archive().count()
|
||||
|
||||
def get_playlist(self, **types):
|
||||
"""
|
||||
Returns sounds as a playlist (list of *local* archive file path).
|
||||
The given arguments are passed to ``get_sounds``.
|
||||
"""
|
||||
from .sound import Sound
|
||||
return list(self.get_sounds(**types)
|
||||
.filter(path__isnull=False, type=Sound.TYPE_ARCHIVE)
|
||||
.values_list('path', flat=True))
|
||||
|
||||
def get_sounds(self, **types):
|
||||
"""
|
||||
Return a queryset of sounds related to this diffusion,
|
||||
ordered by type then path.
|
||||
|
||||
**types: filter on the given sound types name, as `archive=True`
|
||||
"""
|
||||
from .sound import Sound
|
||||
sounds = (self.initial or self).sound_set.order_by('type', 'path')
|
||||
_in = [getattr(Sound.Type, name)
|
||||
for name, value in types.items() if value]
|
||||
|
||||
return sounds.filter(type__in=_in)
|
||||
|
||||
def is_date_in_range(self, date=None):
|
||||
"""
|
||||
Return true if the given date is in the diffusion's start-end
|
||||
range.
|
||||
"""
|
||||
date = date or tz.now()
|
||||
|
||||
return self.start < date < self.end
|
||||
|
||||
def get_conflicts(self):
|
||||
""" Return conflicting diffusions queryset """
|
||||
|
||||
# conflicts=Diffusion.objects.filter(Q(start__lt=OuterRef('start'), end__gt=OuterRef('end')) | Q(start__gt=OuterRef('start'), start__lt=OuterRef('end')))
|
||||
# diffs= Diffusion.objects.annotate(conflict_with=Exists(conflicts)).filter(conflict_with=True)
|
||||
return Diffusion.objects.filter(
|
||||
Q(start__lt=self.start, end__gt=self.start) |
|
||||
Q(start__gt=self.start, start__lt=self.end)
|
||||
).exclude(pk=self.pk).distinct()
|
||||
|
||||
def check_conflicts(self):
|
||||
conflicts = self.get_conflicts()
|
||||
self.conflicts.set(conflicts)
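
The commented-out lines above hint at a single-query variant using `Exists`; here is a hedged sketch of that approach (not part of this changeset), reusing the same overlap condition as `get_conflicts`:

from django.db.models import Exists, OuterRef, Q

overlapping = Diffusion.objects.filter(
    Q(start__lt=OuterRef("start"), end__gt=OuterRef("start"))
    | Q(start__gt=OuterRef("start"), start__lt=OuterRef("end"))
).exclude(pk=OuterRef("pk"))

# Every diffusion annotated with a flag telling whether another one overlaps it.
conflicting = Diffusion.objects.annotate(conflict_with=Exists(overlapping)).filter(conflict_with=True)
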
|
||||
|
||||
_initial = None
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
self._initial = {
|
||||
'start': self.start,
|
||||
'end': self.end,
|
||||
'episode': getattr(self, 'episode', None),
|
||||
}
||||
@@ -1,44 +1,36 @@
|
||||
from collections import deque
|
||||
import datetime
|
||||
import gzip
|
||||
import logging
|
||||
import os
|
||||
|
||||
import yaml
|
||||
import operator
|
||||
from collections import deque
|
||||
|
||||
from django.db import models
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.functional import cached_property
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from aircox import settings
|
||||
from .episode import Diffusion
|
||||
from .diffusion import Diffusion
|
||||
from .sound import Sound, Track
|
||||
from .station import Station
|
||||
from .page import Renderable
|
||||
|
||||
logger = logging.getLogger("aircox")
|
||||
|
||||
|
||||
logger = logging.getLogger('aircox')
|
||||
|
||||
|
||||
__all__ = ['Log', 'LogQuerySet', 'LogArchiver']
|
||||
__all__ = ("Log", "LogQuerySet")
|
||||
|
||||
|
||||
class LogQuerySet(models.QuerySet):
|
||||
def station(self, station=None, id=None):
|
||||
return self.filter(station=station) if id is None else \
|
||||
self.filter(station_id=id)
|
||||
return self.filter(station=station) if id is None else self.filter(station_id=id)
|
||||
|
||||
def date(self, date):
|
||||
start = tz.datetime.combine(date, datetime.time())
|
||||
end = tz.datetime.combine(date, datetime.time(23, 59, 59, 999))
|
||||
return self.filter(date__range = (start, end))
|
||||
return self.filter(date__range=(start, end))
|
||||
# this filter does not work with mysql
|
||||
# return self.filter(date__date=date)
|
||||
|
||||
def after(self, date):
|
||||
return self.filter(date__gte=date) \
|
||||
if isinstance(date, tz.datetime) else \
|
||||
self.filter(date__date__gte=date)
|
||||
return self.filter(date__gte=date) if isinstance(date, tz.datetime) else self.filter(date__date__gte=date)
|
||||
|
||||
def on_air(self):
|
||||
return self.filter(type=Log.TYPE_ON_AIR)
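
A short sketch of how these queryset helpers combine, assuming `station` is a Station instance; the MySQL note above is why `date()` builds an explicit datetime range:

import datetime
from django.utils import timezone as tz

today_logs = Log.objects.station(station).date(tz.localdate()).on_air()
# after() accepts either a datetime (date__gte) or a plain date (date__date__gte).
recent = Log.objects.after(tz.now() - datetime.timedelta(hours=2))
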
|
||||
@@ -56,65 +48,83 @@ class LogQuerySet(models.QuerySet):
|
||||
return self.filter(track__isnull=not with_it)
|
||||
|
||||
|
||||
class Log(models.Model):
|
||||
"""
|
||||
Log sounds and diffusions that are played on the station.
|
||||
class Log(Renderable, models.Model):
|
||||
"""Log sounds and diffusions that are played on the station.
|
||||
|
||||
    This only remembers what has been played on the outputs, not on each
    source; Source designates here which source is responsible for that.
|
||||
"""
|
||||
|
||||
template_prefix = "log"
|
||||
|
||||
TYPE_STOP = 0x00
|
||||
""" Source has been stopped, e.g. manually """
|
||||
"""Source has been stopped, e.g. manually."""
|
||||
# Rule: \/ diffusion != null \/ sound != null
|
||||
TYPE_START = 0x01
|
||||
""" Diffusion or sound has been request to be played. """
|
||||
"""Diffusion or sound has been request to be played."""
|
||||
TYPE_CANCEL = 0x02
|
||||
""" Diffusion has been canceled. """
|
||||
"""Diffusion has been canceled."""
|
||||
# Rule: \/ sound != null /\ track == null
|
||||
# \/ sound == null /\ track != null
|
||||
# \/ sound == null /\ track == null /\ comment = sound_path
|
||||
TYPE_ON_AIR = 0x03
|
||||
""" Sound or diffusion occured on air """
|
||||
"""Sound or diffusion occured on air."""
|
||||
TYPE_OTHER = 0x04
|
||||
""" Other log """
|
||||
"""Other log."""
|
||||
TYPE_CHOICES = (
|
||||
(TYPE_STOP, _('stop')), (TYPE_START, _('start')),
|
||||
(TYPE_CANCEL, _('cancelled')), (TYPE_ON_AIR, _('on air')),
|
||||
(TYPE_OTHER, _('other'))
|
||||
(TYPE_STOP, _("stop")),
|
||||
(TYPE_START, _("start")),
|
||||
(TYPE_CANCEL, _("cancelled")),
|
||||
(TYPE_ON_AIR, _("on air")),
|
||||
(TYPE_OTHER, _("other")),
|
||||
)
|
||||
|
||||
station = models.ForeignKey(
|
||||
Station, models.CASCADE,
|
||||
verbose_name=_('station'), help_text=_('related station'),
|
||||
Station,
|
||||
models.CASCADE,
|
||||
verbose_name=_("station"),
|
||||
help_text=_("related station"),
|
||||
)
|
||||
type = models.SmallIntegerField(_('type'), choices=TYPE_CHOICES)
|
||||
date = models.DateTimeField(_('date'), default=tz.now, db_index=True)
|
||||
type = models.SmallIntegerField(_("type"), choices=TYPE_CHOICES)
|
||||
date = models.DateTimeField(_("date"), default=tz.now, db_index=True)
|
||||
source = models.CharField(
|
||||
# we use a CharField to avoid loosing logs information if the
|
||||
# source is removed
|
||||
max_length=64, blank=True, null=True,
|
||||
verbose_name=_('source'),
|
||||
help_text=_('identifier of the source related to this log'),
|
||||
max_length=64,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_("source"),
|
||||
help_text=_("identifier of the source related to this log"),
|
||||
)
|
||||
comment = models.CharField(
|
||||
max_length=512, blank=True, null=True,
|
||||
verbose_name=_('comment'),
|
||||
max_length=512,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_("comment"),
|
||||
)
|
||||
sound = models.ForeignKey(
|
||||
Sound, models.SET_NULL,
|
||||
blank=True, null=True, db_index=True,
|
||||
verbose_name=_('Sound'),
|
||||
Sound,
|
||||
models.SET_NULL,
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
verbose_name=_("Sound"),
|
||||
)
|
||||
track = models.ForeignKey(
|
||||
Track, models.SET_NULL,
|
||||
blank=True, null=True, db_index=True,
|
||||
verbose_name=_('Track'),
|
||||
Track,
|
||||
models.SET_NULL,
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
verbose_name=_("Track"),
|
||||
)
|
||||
diffusion = models.ForeignKey(
|
||||
Diffusion, models.SET_NULL,
|
||||
blank=True, null=True, db_index=True,
|
||||
verbose_name=_('Diffusion'),
|
||||
Diffusion,
|
||||
models.SET_NULL,
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
verbose_name=_("Diffusion"),
|
||||
)
|
||||
|
||||
objects = LogQuerySet.as_manager()
|
||||
@@ -126,11 +136,9 @@ class Log(models.Model):
|
||||
# FIXME: required????
|
||||
@property
|
||||
def local_date(self):
|
||||
"""
|
||||
Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want
|
||||
to get it as local time.
|
||||
"""
|
||||
"""Return a version of self.date that is localized to self.timezone;
|
||||
This is needed since datetime are stored as UTC date and we want to get
|
||||
it as local time."""
|
||||
return tz.localtime(self.date, tz.get_current_timezone())
|
||||
|
||||
# prepare for the future on crash + ease the use in merged lists with
|
||||
@@ -140,34 +148,38 @@ class Log(models.Model):
|
||||
return self.date
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Log')
|
||||
verbose_name_plural = _('Logs')
|
||||
verbose_name = _("Log")
|
||||
verbose_name_plural = _("Logs")
|
||||
|
||||
def __str__(self):
|
||||
return '#{} ({}, {}, {})'.format(
|
||||
self.pk, self.get_type_display(),
|
||||
self.source, self.local_date.strftime('%Y/%m/%d %H:%M%z'))
|
||||
return "#{} ({}, {}, {})".format(
|
||||
self.pk,
|
||||
self.get_type_display(),
|
||||
self.source,
|
||||
self.local_date.strftime("%Y/%m/%d %H:%M%z"),
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def __list_append(cls, object_list, items):
|
||||
object_list += [cls(obj) for obj in items]
|
||||
|
||||
@classmethod
|
||||
def merge_diffusions(cls, logs, diffs, count=None):
|
||||
def merge_diffusions(cls, logs, diffs, count=None, diff_count=None, group_logs=False):
|
||||
"""Merge logs and diffusions together.
|
||||
|
||||
`logs` can either be a queryset or a list ordered by `Log.date`.
|
||||
"""
|
||||
Merge logs and diffusions together. `logs` can either be a queryset
|
||||
or a list ordered by `Log.date`.
|
||||
"""
|
||||
# TODO: limit count
|
||||
# FIXME: log may be iterable (in stats view)
|
||||
if isinstance(logs, models.QuerySet):
|
||||
logs = list(logs.order_by('-date'))
|
||||
diffs = deque(diffs.on_air().before().order_by('-start'))
|
||||
logs = list(logs.order_by("-date"))
|
||||
diffs = diffs.on_air().order_by("-start")
|
||||
if diff_count:
|
||||
diffs = diffs[:diff_count]
|
||||
diffs = deque(diffs)
|
||||
object_list = []
|
||||
|
||||
while True:
|
||||
if not len(diffs):
|
||||
object_list += logs
|
||||
cls._append_logs(object_list, logs, len(logs), group=group_logs)
|
||||
break
|
||||
|
||||
if not len(logs):
|
||||
@@ -177,21 +189,17 @@ class Log(models.Model):
|
||||
diff = diffs.popleft()
|
||||
|
||||
# - takes all logs after diff start
|
||||
index = next((i for i, v in enumerate(logs)
|
||||
if v.date <= diff.end), len(logs))
|
||||
if index is not None and index > 0:
|
||||
object_list += logs[:index]
|
||||
logs = logs[index:]
|
||||
index = cls._next_index(logs, diff.end, len(logs), pred=operator.le)
|
||||
cls._append_logs(object_list, logs, index, group=group_logs)
|
||||
|
||||
if len(logs):
|
||||
# FIXME
|
||||
# - last log while diff is running
|
||||
#if logs[0].date > diff.start:
|
||||
# if logs[0].date > diff.start:
|
||||
# object_list.append(logs[0])
|
||||
|
||||
# - skips logs while diff is running
|
||||
index = next((i for i, v in enumerate(logs)
|
||||
if v.date < diff.start), len(logs))
|
||||
index = cls._next_index(logs, diff.start, len(logs))
|
||||
if index is not None and index > 0:
|
||||
logs = logs[index:]
|
||||
|
||||
@@ -200,112 +208,51 @@ class Log(models.Model):
|
||||
|
||||
return object_list if count is None else object_list[:count]
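
A hedged usage sketch of `merge_diffusions`, e.g. from a view assembling a day's timeline; the querysets below are illustrative, not code from this changeset:

logs = Log.objects.station(station).on_air().filter(track__isnull=False)
diffs = Diffusion.objects.filter(program__station=station)

# Most recent first: played tracks are interleaved with diffusions, and
# consecutive logs can be grouped by hour with group_logs=True.
timeline = Log.merge_diffusions(logs, diffs, count=50, diff_count=10, group_logs=True)
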
|
||||
|
||||
@classmethod
|
||||
def _next_index(cls, items, date, default, pred=operator.lt):
|
||||
iter = (i for i, v in enumerate(items) if pred(v.date, date))
|
||||
return next(iter, default)
|
||||
|
||||
@classmethod
|
||||
def _append_logs(cls, object_list, logs, count, group=False):
|
||||
logs = logs[:count]
|
||||
if not logs:
|
||||
return object_list
|
||||
|
||||
if group:
|
||||
grouped = cls._group_logs_by_time(logs)
|
||||
object_list.extend(grouped)
|
||||
else:
|
||||
object_list += logs
|
||||
return object_list
|
||||
|
||||
@classmethod
|
||||
def _group_logs_by_time(cls, logs):
|
||||
last_time = -1
|
||||
cum = []
|
||||
for log in logs:
|
||||
hour = log.date.time().hour
|
||||
if hour != last_time:
|
||||
if cum:
|
||||
yield cum
|
||||
cum = []
|
||||
last_time = hour
|
||||
# reverse from lowest to highest date
|
||||
cum.insert(0, log)
|
||||
if cum:
|
||||
yield cum
|
||||
|
||||
def print(self):
|
||||
r = []
|
||||
if self.diffusion:
|
||||
r.append('diff: ' + str(self.diffusion_id))
|
||||
r.append("diff: " + str(self.diffusion_id))
|
||||
if self.sound:
|
||||
r.append('sound: ' + str(self.sound_id))
|
||||
r.append("sound: " + str(self.sound_id))
|
||||
if self.track:
|
||||
r.append('track: ' + str(self.track_id))
|
||||
logger.info('log %s: %s%s', str(self), self.comment or '',
|
||||
' (' + ', '.join(r) + ')' if r else '')
|
||||
|
||||
|
||||
|
||||
class LogArchiver:
|
||||
""" Commodity class used to manage archives of logs. """
|
||||
@cached_property
|
||||
def fields(self):
|
||||
return Log._meta.get_fields()
|
||||
|
||||
@staticmethod
|
||||
def get_path(station, date):
|
||||
return os.path.join(
|
||||
settings.AIRCOX_LOGS_ARCHIVES_DIR,
|
||||
'{}_{}.log.gz'.format(date.strftime("%Y%m%d"), station.pk)
|
||||
r.append("track: " + str(self.track_id))
|
||||
logger.info(
|
||||
"log %s: %s%s",
|
||||
str(self),
|
||||
self.comment or "",
|
||||
" (" + ", ".join(r) + ")" if r else "",
|
||||
)
|
||||
|
||||
def archive(self, qs, keep=False):
|
||||
"""
|
||||
Archive logs of the given queryset. Delete archived logs if not
|
||||
`keep`. Return the count of archived logs
|
||||
"""
|
||||
if not qs.exists():
|
||||
return 0
|
||||
|
||||
os.makedirs(settings.AIRCOX_LOGS_ARCHIVES_DIR, exist_ok=True)
|
||||
count = qs.count()
|
||||
logs = self.sort_logs(qs)
|
||||
|
||||
# Note: since we use Yaml, we can just append new logs when file
|
||||
# exists yet <3
|
||||
for (station, date), logs in logs.items():
|
||||
path = self.get_path(station, date)
|
||||
with gzip.open(path, 'ab') as archive:
|
||||
data = yaml.dump([self.serialize(l) for l in logs]).encode('utf8')
|
||||
archive.write(data)
|
||||
|
||||
if not keep:
|
||||
qs.delete()
|
||||
|
||||
return count
|
||||
|
||||
@staticmethod
|
||||
def sort_logs(qs):
|
||||
"""
|
||||
Sort logs by station and date and return a dict of
|
||||
`{ (station,date): [logs] }`.
|
||||
"""
|
||||
qs = qs.order_by('date')
|
||||
logs = {}
|
||||
for log in qs:
|
||||
key = (log.station, log.date)
|
||||
if key not in logs:
|
||||
logs[key] = [log]
|
||||
else:
|
||||
logs[key].append(log)
|
||||
return logs
|
||||
|
||||
def serialize(self, log):
|
||||
""" Serialize log """
|
||||
return {i.attname: getattr(log, i.attname)
|
||||
for i in self.fields}
|
||||
|
||||
def load(self, station, date):
|
||||
""" Load an archive returning logs in a list. """
|
||||
path = self.get_path(station, date)
|
||||
|
||||
if not os.path.exists(path):
|
||||
return []
|
||||
|
||||
with gzip.open(path, 'rb') as archive:
|
||||
data = archive.read()
|
||||
logs = yaml.load(data)
|
||||
|
||||
# we need to preload diffusions, sounds and tracks
|
||||
rels = {
|
||||
'diffusion': self.get_relations(logs, Diffusion, 'diffusion'),
|
||||
'sound': self.get_relations(logs, Sound, 'sound'),
|
||||
'track': self.get_relations(logs, Track, 'track'),
|
||||
}
|
||||
|
||||
def rel_obj(log, attr):
|
||||
rel_id = log.get(attr + '_id')
|
||||
return rels[attr][rel_id] if rel_id else None
|
||||
|
||||
return [Log(diffusion=rel_obj(log, 'diffusion'),
|
||||
sound=rel_obj(log, 'sound'),
|
||||
track=rel_obj(log, 'track'),
|
||||
**log) for log in logs]
|
||||
|
||||
@staticmethod
|
||||
def get_relations(logs, model, attr):
|
||||
"""
|
||||
From a list of dict representing logs, retrieve related objects
|
||||
of the given type.
|
||||
"""
|
||||
attr_id = attr + '_id'
|
||||
pks = (log[attr_id] for log in logs if attr_id in log)
|
||||
return {rel.pk: rel for rel in model.objects.filter(pk__in=pks)}
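
A small usage sketch of `LogArchiver`, assuming logs older than a retention window should be moved into the gzip/YAML archives handled above (`station` stands for a Station instance):

import datetime
from django.utils import timezone as tz

archiver = LogArchiver()
limit = tz.now() - datetime.timedelta(days=60)

# Writes one "<YYYYMMDD>_<station pk>.log.gz" file per station and day,
# then deletes the archived rows (keep=False by default).
archived = archiver.archive(Log.objects.filter(date__lt=limit))

# Reads a given day back as unsaved Log instances, with related
# diffusions, sounds and tracks preloaded.
day_logs = archiver.load(station, limit.date())
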
|
||||
|
||||
|
||||
@@ -1,44 +1,57 @@
|
||||
from enum import IntEnum
|
||||
import re
|
||||
|
||||
from django.db import models
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.text import slugify
|
||||
from django.utils.html import format_html
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.utils.functional import cached_property
|
||||
|
||||
import bleach
|
||||
from ckeditor_uploader.fields import RichTextUploadingField
|
||||
from django.db import models
|
||||
from django.urls import reverse
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.functional import cached_property
|
||||
from django.utils.html import format_html
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.utils.text import slugify
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from filer.fields.image import FilerImageField
|
||||
from model_utils.managers import InheritanceQuerySet
|
||||
|
||||
from .station import Station
|
||||
|
||||
|
||||
__all__ = ['Category', 'PageQuerySet', 'Page', 'Comment', 'NavItem']
|
||||
__all__ = (
|
||||
"Renderable",
|
||||
"Category",
|
||||
"PageQuerySet",
|
||||
"Page",
|
||||
"StaticPage",
|
||||
"Comment",
|
||||
"NavItem",
|
||||
)
|
||||
|
||||
|
||||
headline_re = re.compile(r'(<p>)?'
|
||||
r'(?P<headline>[^\n]{1,140}(\n|[^\.]*?\.))'
|
||||
r'(</p>)?')
|
||||
headline_clean_re = re.compile(r"\n(\s| )+", re.MULTILINE)
|
||||
headline_re = re.compile(r"(?P<headline>([\S+]|\s+){1,240}\S+)", re.MULTILINE)
|
||||
|
||||
|
||||
class Renderable:
|
||||
template_prefix = "page"
|
||||
template_name = "aircox/widgets/{prefix}.html"
|
||||
|
||||
def get_template_name(self, widget):
|
||||
"""Return template name for the provided widget."""
|
||||
return self.template_name.format(prefix=self.template_prefix, widget=widget)
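
A minimal sketch of the template resolution this mixin performs; the `Card` class and widget name are made up for illustration:

class Card(Renderable):
    template_prefix = "card"

# The default pattern only interpolates {prefix}; the widget argument is
# there for subclasses that override template_name.
assert Card().get_template_name("item") == "aircox/widgets/card.html"
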
|
||||
|
||||
|
||||
class Category(models.Model):
|
||||
title = models.CharField(_('title'), max_length=64)
|
||||
slug = models.SlugField(_('slug'), max_length=64, db_index=True)
|
||||
title = models.CharField(_("title"), max_length=64)
|
||||
slug = models.SlugField(_("slug"), max_length=64, db_index=True)
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Category')
|
||||
verbose_name_plural = _('Categories')
|
||||
verbose_name = _("Category")
|
||||
verbose_name_plural = _("Categories")
|
||||
|
||||
def __str__(self):
|
||||
return self.title
|
||||
|
||||
|
||||
class PageQuerySet(InheritanceQuerySet):
|
||||
class BasePageQuerySet(InheritanceQuerySet):
|
||||
def draft(self):
|
||||
return self.filter(status=Page.STATUS_DRAFT)
|
||||
|
||||
@@ -48,64 +61,90 @@ class PageQuerySet(InheritanceQuerySet):
|
||||
def trash(self):
|
||||
return self.filter(status=Page.STATUS_TRASH)
|
||||
|
||||
def by_last(self):
|
||||
return self.order_by("-pub_date")
|
||||
|
||||
def parent(self, parent=None, id=None):
|
||||
""" Return pages having this parent. """
|
||||
return self.filter(parent=parent) if id is None else \
|
||||
self.filter(parent__id=id)
|
||||
"""Return pages having this parent."""
|
||||
return self.filter(parent=parent) if id is None else self.filter(parent__id=id)
|
||||
|
||||
def search(self, q, search_content=True):
|
||||
if search_content:
|
||||
return self.filter(models.Q(title__icontains=q) | models.Q(content__icontains=q))
|
||||
return self.filter(title__icontains=q)
|
||||
|
||||
|
||||
class BasePage(models.Model):
|
||||
""" Base class for publishable content """
|
||||
class BasePage(Renderable, models.Model):
|
||||
"""Base class for publishable content."""
|
||||
|
||||
STATUS_DRAFT = 0x00
|
||||
STATUS_PUBLISHED = 0x10
|
||||
STATUS_TRASH = 0x20
|
||||
STATUS_CHOICES = (
|
||||
(STATUS_DRAFT, _('draft')),
|
||||
(STATUS_PUBLISHED, _('published')),
|
||||
(STATUS_TRASH, _('trash')),
|
||||
(STATUS_DRAFT, _("draft")),
|
||||
(STATUS_PUBLISHED, _("published")),
|
||||
(STATUS_TRASH, _("trash")),
|
||||
)
|
||||
|
||||
parent = models.ForeignKey('self', models.CASCADE, blank=True, null=True,
|
||||
db_index=True, related_name='child_set')
|
||||
parent = models.ForeignKey(
|
||||
"self",
|
||||
models.CASCADE,
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
related_name="child_set",
|
||||
)
|
||||
title = models.CharField(max_length=100)
|
||||
slug = models.SlugField(_('slug'), max_length=120, blank=True, unique=True,
|
||||
db_index=True)
|
||||
slug = models.SlugField(_("slug"), max_length=120, blank=True, unique=True, db_index=True)
|
||||
status = models.PositiveSmallIntegerField(
|
||||
_('status'), default=STATUS_DRAFT, choices=STATUS_CHOICES,
|
||||
_("status"),
|
||||
default=STATUS_DRAFT,
|
||||
choices=STATUS_CHOICES,
|
||||
)
|
||||
cover = FilerImageField(
|
||||
on_delete=models.SET_NULL,
|
||||
verbose_name=_('cover'), null=True, blank=True,
|
||||
verbose_name=_("cover"),
|
||||
null=True,
|
||||
blank=True,
|
||||
)
|
||||
content = RichTextUploadingField(
|
||||
_('content'), blank=True, null=True,
|
||||
_("content"),
|
||||
blank=True,
|
||||
null=True,
|
||||
)
|
||||
|
||||
objects = PageQuerySet.as_manager()
|
||||
objects = BasePageQuerySet.as_manager()
|
||||
|
||||
detail_url_name = None
|
||||
item_template_name = 'aircox/widgets/page_item.html'
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
@property
|
||||
def cover_url(self):
|
||||
return self.cover_id and self.cover.url
|
||||
|
||||
def __str__(self):
|
||||
return '{}'.format(self.title or self.pk)
|
||||
return "{}".format(self.title or self.pk)
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if not self.slug:
|
||||
self.slug = slugify(self.title)[:100]
|
||||
count = Page.objects.filter(slug__startswith=self.slug).count()
|
||||
if count:
|
||||
self.slug += '-' + str(count)
|
||||
self.slug += "-" + str(count)
|
||||
|
||||
if self.parent and not self.cover:
|
||||
self.cover = self.parent.cover
|
||||
if self.parent:
|
||||
if self.parent == self:
|
||||
self.parent = None
|
||||
if not self.cover:
|
||||
self.cover = self.parent.cover
|
||||
super().save(*args, **kwargs)
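
A short illustration of the slug handling in `save`, assuming two pages saved with the same title:

a = Page(title="Morning show")
a.save()                      # slug == "morning-show"

b = Page(title="Morning show")
b.save()                      # one slug already starts with "morning-show",
                              # so the count is appended: "morning-show-1"
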
|
||||
|
||||
def get_absolute_url(self):
|
||||
return reverse(self.detail_url_name, kwargs={'slug': self.slug}) \
|
||||
if self.is_published else '#'
|
||||
if self.is_published:
|
||||
return reverse(self.detail_url_name, kwargs={"slug": self.slug})
|
||||
return ""
|
||||
|
||||
@property
|
||||
def is_draft(self):
|
||||
@@ -121,22 +160,33 @@ class BasePage(models.Model):
|
||||
|
||||
@property
|
||||
def display_title(self):
|
||||
if self.is_published():
|
||||
if self.is_published:
|
||||
return self.title
|
||||
return self.parent.display_title()
|
||||
return self.parent and self.parent.title or ""
|
||||
|
||||
@cached_property
|
||||
def headline(self):
|
||||
if not self.content:
|
||||
return ''
|
||||
def display_headline(self):
|
||||
if not self.content or not self.is_published:
|
||||
return self.parent and self.parent.display_headline or ""
|
||||
content = bleach.clean(self.content, tags=[], strip=True)
|
||||
content = headline_clean_re.sub("\n", content)
|
||||
if content.startswith("\n"):
|
||||
content = content[1:]
|
||||
headline = headline_re.search(content)
|
||||
return mark_safe(headline.groupdict()['headline']) if headline else ''
|
||||
if not headline:
|
||||
return ""
|
||||
|
||||
headline = headline.groupdict()["headline"]
|
||||
suffix = "<b>...</b>" if len(headline) < len(content) else ""
|
||||
|
||||
headline = headline.split("\n")[:3]
|
||||
headline[-1] += suffix
|
||||
return mark_safe(" ".join(headline))
|
||||
|
||||
@classmethod
|
||||
def get_init_kwargs_from(cls, page, **kwargs):
|
||||
kwargs.setdefault('cover', page.cover)
|
||||
kwargs.setdefault('category', page.category)
|
||||
kwargs.setdefault("cover", page.cover)
|
||||
kwargs.setdefault("category", page.category)
|
||||
return kwargs
|
||||
|
||||
@classmethod
|
||||
@@ -144,23 +194,54 @@ class BasePage(models.Model):
|
||||
return cls(**cls.get_init_kwargs_from(page, **kwargs))
|
||||
|
||||
|
||||
class PageQuerySet(BasePageQuerySet):
|
||||
def published(self):
|
||||
return self.filter(status=Page.STATUS_PUBLISHED, pub_date__lte=tz.now())
|
||||
|
||||
|
||||
class Page(BasePage):
|
||||
""" Base Page model used for articles and other dated content. """
|
||||
"""Base Page model used for articles and other dated content."""
|
||||
|
||||
category = models.ForeignKey(
|
||||
Category, models.SET_NULL,
|
||||
verbose_name=_('category'), blank=True, null=True, db_index=True
|
||||
Category,
|
||||
models.SET_NULL,
|
||||
verbose_name=_("category"),
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
)
|
||||
pub_date = models.DateTimeField(blank=True, null=True)
|
||||
pub_date = models.DateTimeField(_("publication date"), blank=True, null=True, db_index=True)
|
||||
featured = models.BooleanField(
|
||||
_('featured'), default=False,
|
||||
_("featured"),
|
||||
default=False,
|
||||
)
|
||||
allow_comments = models.BooleanField(
|
||||
_('allow comments'), default=True,
|
||||
_("allow comments"),
|
||||
default=True,
|
||||
)
|
||||
|
||||
objects = PageQuerySet.as_manager()
|
||||
detail_url_name = ""
|
||||
list_url_name = "page-list"
|
||||
|
||||
@cached_property
|
||||
def parent_subclass(self):
|
||||
if self.parent_id:
|
||||
return Page.objects.get_subclass(id=self.parent_id)
|
||||
return None
|
||||
|
||||
def get_absolute_url(self):
|
||||
if not self.is_published and self.parent_subclass:
|
||||
return self.parent_subclass.get_absolute_url()
|
||||
return super().get_absolute_url()
|
||||
|
||||
@classmethod
|
||||
def get_list_url(cls, kwargs={}):
|
||||
return reverse(cls.list_url_name, kwargs=kwargs)
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Publication')
|
||||
verbose_name_plural = _('Publications')
|
||||
verbose_name = _("Publication")
|
||||
verbose_name_plural = _("Publications")
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if self.is_published and self.pub_date is None:
|
||||
@@ -174,89 +255,108 @@ class Page(BasePage):
|
||||
|
||||
|
||||
class StaticPage(BasePage):
|
||||
""" Static page that eventually can be attached to a specific view. """
|
||||
detail_url_name = 'static-page-detail'
|
||||
"""Static page that eventually can be attached to a specific view."""
|
||||
|
||||
ATTACH_TO_HOME = 0x00
|
||||
ATTACH_TO_DIFFUSIONS = 0x01
|
||||
ATTACH_TO_LOGS = 0x02
|
||||
ATTACH_TO_PROGRAMS = 0x03
|
||||
ATTACH_TO_EPISODES = 0x04
|
||||
ATTACH_TO_ARTICLES = 0x05
|
||||
detail_url_name = "static-page-detail"
|
||||
|
||||
ATTACH_TO_CHOICES = (
|
||||
(ATTACH_TO_HOME, _('Home page')),
|
||||
(ATTACH_TO_DIFFUSIONS, _('Diffusions page')),
|
||||
(ATTACH_TO_LOGS, _('Logs page')),
|
||||
(ATTACH_TO_PROGRAMS, _('Programs list')),
|
||||
(ATTACH_TO_EPISODES, _('Episodes list')),
|
||||
(ATTACH_TO_ARTICLES, _('Articles list')),
|
||||
class Target(models.TextChoices):
|
||||
NONE = "", _("None")
|
||||
HOME = "home", _("Home Page")
|
||||
TIMETABLE = "timetable-list", _("Timetable")
|
||||
PROGRAMS = "program-list", _("Programs list")
|
||||
EPISODES = "episode-list", _("Episodes list")
|
||||
ARTICLES = "article-list", _("Articles list")
|
||||
PAGES = "page-list", _("Publications list")
|
||||
PODCASTS = "podcast-list", _("Podcasts list")
|
||||
|
||||
attach_to = models.CharField(
|
||||
_("attach to"),
|
||||
choices=Target.choices,
|
||||
max_length=32,
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("display this page content to related element"),
|
||||
)
|
||||
VIEWS = {
|
||||
ATTACH_TO_HOME: 'home',
|
||||
ATTACH_TO_DIFFUSIONS: 'diffusion-list',
|
||||
ATTACH_TO_LOGS: 'log-list',
|
||||
ATTACH_TO_PROGRAMS: 'program-list',
|
||||
ATTACH_TO_EPISODES: 'episode-list',
|
||||
ATTACH_TO_ARTICLES: 'article-list',
|
||||
}
|
||||
|
||||
attach_to = models.SmallIntegerField(
|
||||
_('attach to'), choices=ATTACH_TO_CHOICES, blank=True, null=True,
|
||||
help_text=_('display this page content to related element'),
|
||||
)
|
||||
def get_related_view(self):
|
||||
from ..views import attached
|
||||
|
||||
return self.attach_to and attached.get(self.attach_to) or None
|
||||
|
||||
def get_absolute_url(self):
|
||||
if self.attach_to:
|
||||
return reverse(self.VIEWS[self.attach_to])
|
||||
return reverse(self.attach_to)
|
||||
return super().get_absolute_url()
|
||||
|
||||
|
||||
class Comment(models.Model):
|
||||
class Comment(Renderable, models.Model):
|
||||
page = models.ForeignKey(
|
||||
Page, models.CASCADE, verbose_name=_('related page'),
|
||||
Page,
|
||||
models.CASCADE,
|
||||
verbose_name=_("related page"),
|
||||
db_index=True,
|
||||
# TODO: allow_comment filter
|
||||
)
|
||||
nickname = models.CharField(_('nickname'), max_length=32)
|
||||
email = models.EmailField(_('email'), max_length=32)
|
||||
nickname = models.CharField(_("nickname"), max_length=32)
|
||||
email = models.EmailField(_("email"), max_length=32)
|
||||
date = models.DateTimeField(auto_now_add=True)
|
||||
content = models.TextField(_('content'), max_length=1024)
|
||||
content = models.TextField(_("content"), max_length=1024)
|
||||
|
||||
template_prefix = "comment"
|
||||
|
||||
@cached_property
|
||||
def parent(self):
|
||||
"""Return Page as its subclass."""
|
||||
return Page.objects.select_subclasses().filter(id=self.page_id).first()
|
||||
|
||||
def get_absolute_url(self):
|
||||
return self.parent.get_absolute_url() + f"#comment-{self.pk}"
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Comment')
|
||||
verbose_name_plural = _('Comments')
|
||||
verbose_name = _("Comment")
|
||||
verbose_name_plural = _("Comments")
|
||||
|
||||
|
||||
class NavItem(models.Model):
|
||||
""" Navigation menu items """
|
||||
station = models.ForeignKey(
|
||||
Station, models.CASCADE, verbose_name=_('station'))
|
||||
menu = models.SlugField(_('menu'), max_length=24)
|
||||
order = models.PositiveSmallIntegerField(_('order'))
|
||||
text = models.CharField(_('title'), max_length=64)
|
||||
url = models.CharField(_('url'), max_length=256, blank=True, null=True)
|
||||
page = models.ForeignKey(StaticPage, models.CASCADE, db_index=True,
|
||||
verbose_name=_('page'), blank=True, null=True)
|
||||
"""Navigation menu items."""
|
||||
|
||||
station = models.ForeignKey(Station, models.CASCADE, verbose_name=_("station"))
|
||||
menu = models.SlugField(_("menu"), max_length=24)
|
||||
order = models.PositiveSmallIntegerField(_("order"))
|
||||
text = models.CharField(_("title"), max_length=64, blank=True, null=True)
|
||||
url = models.CharField(_("url"), max_length=256, blank=True, null=True)
|
||||
page = models.ForeignKey(
|
||||
StaticPage,
|
||||
models.CASCADE,
|
||||
db_index=True,
|
||||
verbose_name=_("page"),
|
||||
blank=True,
|
||||
null=True,
|
||||
)
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Menu item')
|
||||
verbose_name_plural = _('Menu items')
|
||||
ordering = ('order', 'pk')
|
||||
verbose_name = _("Menu item")
|
||||
verbose_name_plural = _("Menu items")
|
||||
ordering = ("order", "pk")
|
||||
|
||||
def get_url(self):
|
||||
return self.url if self.url else \
|
||||
self.page.get_absolute_url() if self.page else None
|
||||
return self.url if self.url else self.page.get_absolute_url() if self.page else None
|
||||
|
||||
def render(self, request, css_class='', active_class=''):
|
||||
def get_label(self):
|
||||
if self.text:
|
||||
return self.text
|
||||
elif self.page:
|
||||
return self.page.title
|
||||
|
||||
def render(self, request, css_class="", active_class=""):
|
||||
url = self.get_url()
|
||||
label = self.get_label()
|
||||
if active_class and request.path.startswith(url):
|
||||
css_class += ' ' + active_class
|
||||
css_class += " " + active_class
|
||||
|
||||
if not url:
|
||||
return self.text
|
||||
return label
|
||||
elif not css_class:
|
||||
return format_html('<a href="{}">{}</a>', url, self.text)
|
||||
return format_html('<a href="{}">{}</a>', url, label)
|
||||
else:
|
||||
return format_html('<a href="{}" class="{}">{}</a>', url,
|
||||
css_class, self.text)
|
||||
|
||||
return format_html('<a href="{}" class="{}">{}</a>', url, css_class, label)
|
||||
|
||||
@@ -1,30 +1,29 @@
|
||||
import calendar
|
||||
from collections import OrderedDict
|
||||
import datetime
|
||||
from enum import IntEnum
|
||||
import logging
|
||||
import os
|
||||
import shutil
|
||||
|
||||
import pytz
|
||||
from django.core.exceptions import ValidationError
|
||||
from django.conf import settings as conf
|
||||
from django.contrib.auth.models import Group, Permission
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.db import models
|
||||
from django.db.models import F, Q
|
||||
from django.db.models import F
|
||||
from django.db.models.functions import Concat, Substr
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.utils.functional import cached_property
|
||||
|
||||
from aircox import settings, utils
|
||||
from aircox.conf import settings
|
||||
|
||||
from .page import Page, PageQuerySet
|
||||
from .station import Station
|
||||
|
||||
|
||||
logger = logging.getLogger('aircox')
|
||||
logger = logging.getLogger("aircox")
|
||||
|
||||
|
||||
__all__ = ['Program', 'ProgramQuerySet', 'Stream', 'Schedule',
|
||||
'ProgramChildQuerySet', 'BaseRerun', 'BaseRerunQuerySet']
|
||||
__all__ = (
|
||||
"Program",
|
||||
"ProgramChildQuerySet",
|
||||
"ProgramQuerySet",
|
||||
"Stream",
|
||||
)
|
||||
|
||||
|
||||
class ProgramQuerySet(PageQuerySet):
|
||||
@@ -37,8 +36,7 @@ class ProgramQuerySet(PageQuerySet):
|
||||
|
||||
|
||||
class Program(Page):
|
||||
"""
|
||||
A Program can either be a Streamed or a Scheduled program.
|
||||
"""A Program can either be a Streamed or a Scheduled program.
|
||||
|
||||
A Streamed program is used to generate non-stop random playlists when there
|
||||
is not scheduled diffusion. In such a case, a Stream is used to describe
|
||||
@@ -49,38 +47,50 @@ class Program(Page):
|
||||
    Renaming a Program renames the corresponding directory to match the new
    name if it does not exist.
|
||||
"""
|
||||
|
||||
# explicit foreign key in order to avoid related name clashes
|
||||
station = models.ForeignKey(Station, models.CASCADE,
|
||||
verbose_name=_('station'))
|
||||
station = models.ForeignKey(Station, models.CASCADE, verbose_name=_("station"))
|
||||
active = models.BooleanField(
|
||||
_('active'),
|
||||
_("active"),
|
||||
default=True,
|
||||
help_text=_('if not checked this program is no longer active')
|
||||
help_text=_("if not checked this program is no longer active"),
|
||||
)
|
||||
sync = models.BooleanField(
|
||||
_('syncronise'),
|
||||
_("syncronise"),
|
||||
default=True,
|
||||
help_text=_('update later diffusions according to schedule changes')
|
||||
help_text=_("update later diffusions according to schedule changes"),
|
||||
)
|
||||
editors = models.ForeignKey(Group, models.CASCADE, blank=True, null=True, verbose_name=_("editors"))
|
||||
|
||||
objects = ProgramQuerySet.as_manager()
|
||||
detail_url_name = 'program-detail'
|
||||
detail_url_name = "program-detail"
|
||||
list_url_name = "program-list"
|
||||
|
||||
@property
|
||||
def path(self):
|
||||
""" Return program's directory path """
|
||||
return os.path.join(settings.AIRCOX_PROGRAMS_DIR,
|
||||
self.slug.replace('-', '_'))
|
||||
"""Return program's directory path."""
|
||||
return os.path.join(settings.PROGRAMS_DIR, self.slug.replace("-", "_"))
|
||||
|
||||
@property
|
||||
def abspath(self):
|
||||
"""Return absolute path to program's dir."""
|
||||
return os.path.join(conf.MEDIA_ROOT, self.path)
|
||||
|
||||
@property
|
||||
def archives_path(self):
|
||||
return os.path.join(self.path, settings.AIRCOX_SOUND_ARCHIVES_SUBDIR)
|
||||
return os.path.join(self.path, settings.SOUND_ARCHIVES_SUBDIR)
|
||||
|
||||
@property
|
||||
def excerpts_path(self):
|
||||
return os.path.join(
|
||||
self.path, settings.AIRCOX_SOUND_ARCHIVES_SUBDIR
|
||||
)
|
||||
return os.path.join(self.path, settings.SOUND_ARCHIVES_SUBDIR)
|
||||
|
||||
@property
|
||||
def editors_group_name(self):
|
||||
return f"{self.title} editors"
|
||||
|
||||
@property
|
||||
def change_permission_codename(self):
|
||||
return f"change_program_{self.slug}"
|
||||
|
||||
def __init__(self, *kargs, **kwargs):
|
||||
super().__init__(*kargs, **kwargs)
|
||||
@@ -90,32 +100,43 @@ class Program(Page):
|
||||
|
||||
@classmethod
|
||||
def get_from_path(cl, path):
|
||||
"""
|
||||
Return a Program from the given path. We assume the path has been
|
||||
given in a previous time by this model (Program.path getter).
|
||||
"""
|
||||
path = path.replace(settings.AIRCOX_PROGRAMS_DIR, '')
|
||||
"""Return a Program from the given path.
|
||||
|
||||
while path[0] == '/':
|
||||
We assume the path has been given in a previous time by this
|
||||
model (Program.path getter).
|
||||
"""
|
||||
if path.startswith(settings.PROGRAMS_DIR_ABS):
|
||||
path = path.replace(settings.PROGRAMS_DIR_ABS, "")
|
||||
while path[0] == "/":
|
||||
path = path[1:]
|
||||
|
||||
path = path[:path.index('/')]
|
||||
return cl.objects.filter(slug=path.replace('_','-')).first()
|
||||
path = path[: path.index("/")]
|
||||
return cl.objects.filter(slug=path.replace("_", "-")).first()
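
A sketch of the path round trip `get_from_path` relies on, assuming `station` is a Station instance and that `PROGRAMS_DIR_ABS` resolves to the same absolute directory used by `abspath`; the file names are illustrative:

import os

program = Program.objects.create(title="Night waves", station=station)
# program.path ends with "night_waves" (slug with "-" mapped to "_").
sound_path = os.path.join(program.abspath, "archives", "example.mp3")

# Strip the absolute prefix, take the first path component, map "_" back
# to "-", and look the slug up again.
assert Program.get_from_path(sound_path) == program
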
|
||||
|
||||
def ensure_dir(self, subdir=None):
|
||||
"""
|
||||
Make sur the program's dir exists (and optionally subdir). Return True
|
||||
if the dir (or subdir) exists.
|
||||
"""
|
||||
path = os.path.join(self.path, subdir) if subdir else \
|
||||
self.path
|
||||
os.makedirs(path, exist_ok=True)
|
||||
"""Make sur the program's dir exists (and optionally subdir).
|
||||
|
||||
Return True if the dir (or subdir) exists.
|
||||
"""
|
||||
path = os.path.join(self.abspath, subdir) if subdir else self.abspath
|
||||
os.makedirs(path, exist_ok=True)
|
||||
return os.path.exists(path)
|
||||
|
||||
def set_group_ownership(self):
|
||||
editors, created = Group.objects.get_or_create(name=self.editors_group_name)
|
||||
if created:
|
||||
self.editors = editors
|
||||
super().save()
|
||||
permission, _ = Permission.objects.get_or_create(
|
||||
name=f"change program {self.title}",
|
||||
codename=self.change_permission_codename,
|
||||
content_type=ContentType.objects.get_for_model(self),
|
||||
)
|
||||
if permission not in editors.permissions.all():
|
||||
editors.permissions.add(permission)
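
What `set_group_ownership` creates can then be checked with the standard permission machinery; a sketch, assuming the app label is `aircox` and `alice` is an existing user:

from django.contrib.auth.models import Group, User

editors = Group.objects.get(name=program.editors_group_name)
editors.user_set.add(User.objects.get(username="alice"))

# Group permissions are cached on the user object, hence the refetch.
alice = User.objects.get(username="alice")
assert alice.has_perm(f"aircox.{program.change_permission_codename}")
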
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Program')
|
||||
verbose_name_plural = _('Programs')
|
||||
verbose_name = _("Program")
|
||||
verbose_name_plural = _("Programs")
|
||||
|
||||
def __str__(self):
|
||||
return self.title
|
||||
@@ -126,349 +147,62 @@ class Program(Page):
|
||||
super().save(*kargs, **kwargs)
|
||||
|
||||
# TODO: move in signals
|
||||
path_ = getattr(self, '__initial_path', None)
|
||||
if path_ is not None and path_ != self.path and \
|
||||
os.path.exists(path_) and not os.path.exists(self.path):
|
||||
logger.info('program #%s\'s dir changed to %s - update it.',
|
||||
self.id, self.title)
|
||||
path_ = getattr(self, "__initial_path", None)
|
||||
abspath = path_ and os.path.join(conf.MEDIA_ROOT, path_)
|
||||
if path_ is not None and path_ != self.path and os.path.exists(abspath) and not os.path.exists(self.abspath):
|
||||
logger.info(
|
||||
"program #%s's dir changed to %s - update it.",
|
||||
self.id,
|
||||
self.title,
|
||||
)
|
||||
|
||||
shutil.move(path_, self.path)
|
||||
Sound.objects.filter(path__startswith=path_) \
|
||||
.update(path=Concat('path', Substr(F('path'), len(path_))))
|
||||
shutil.move(abspath, self.abspath)
|
||||
Sound.objects.filter(path__startswith=path_).update(file=Concat("file", Substr(F("file"), len(path_))))
|
||||
|
||||
self.set_group_ownership()
|
||||
|
||||
|
||||
class ProgramChildQuerySet(PageQuerySet):
|
||||
def station(self, station=None, id=None):
|
||||
return self.filter(parent__program__station=station) if id is None else \
|
||||
self.filter(parent__program__station__id=id)
|
||||
return (
|
||||
self.filter(parent__program__station=station)
|
||||
if id is None
|
||||
else self.filter(parent__program__station__id=id)
|
||||
)
|
||||
|
||||
def program(self, program=None, id=None):
|
||||
return self.parent(program, id)
|
||||
|
||||
|
||||
class BaseRerunQuerySet(models.QuerySet):
|
||||
""" Queryset for BaseRerun (sub)classes. """
|
||||
def station(self, station=None, id=None):
|
||||
return self.filter(program__station=station) if id is None else \
|
||||
self.filter(program__station__id=id)
|
||||
|
||||
def program(self, program=None, id=None):
|
||||
return self.filter(program=program) if id is None else \
|
||||
self.filter(program__id=id)
|
||||
|
||||
def rerun(self):
|
||||
return self.filter(initial__isnull=False)
|
||||
|
||||
def initial(self):
|
||||
return self.filter(initial__isnull=True)
|
||||
|
||||
|
||||
class BaseRerun(models.Model):
|
||||
"""
|
||||
Abstract model offering rerun facilities. Assume `start` is a
|
||||
datetime field or attribute implemented by subclass.
|
||||
"""
|
||||
program = models.ForeignKey(
|
||||
Program, models.CASCADE, db_index=True,
|
||||
verbose_name=_('related program'),
|
||||
)
|
||||
initial = models.ForeignKey(
|
||||
'self', models.SET_NULL, related_name='rerun_set',
|
||||
verbose_name=_('rerun of'),
|
||||
limit_choices_to={'initial__isnull': True},
|
||||
blank=True, null=True, db_index=True,
|
||||
)
|
||||
|
||||
objects = BaseRerunQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if self.initial is not None:
|
||||
self.initial = self.initial.get_initial()
|
||||
if self.initial == self:
|
||||
self.initial = None
|
||||
|
||||
if self.is_rerun:
|
||||
self.save_rerun()
|
||||
else:
|
||||
self.save_initial()
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
def save_rerun(self):
|
||||
pass
|
||||
|
||||
def save_initial(self):
|
||||
pass
|
||||
|
||||
@property
|
||||
def is_initial(self):
|
||||
return self.initial is None
|
||||
|
||||
@property
|
||||
def is_rerun(self):
|
||||
return self.initial is not None
|
||||
|
||||
def get_initial(self):
|
||||
""" Return the initial schedule (self or initial) """
|
||||
return self if self.initial is None else self.initial.get_initial()
|
||||
|
||||
def clean(self):
|
||||
super().clean()
|
||||
if self.initial is not None and self.initial.start >= self.start:
|
||||
raise ValidationError({
|
||||
'initial': _('rerun must happen after original')
|
||||
})
|
||||
|
||||
|
||||
# ? BIG FIXME: self.date is still used as datetime
|
||||
class Schedule(BaseRerun):
|
||||
"""
|
||||
A Schedule defines time slots of programs' diffusions. It can be an initial
|
||||
run or a rerun (in such case it is linked to the related schedule).
|
||||
"""
|
||||
# Frequency for schedules. Basically, it is a mask of bits where each bit is
|
||||
# a week. Bits > rank 5 are used for special schedules.
|
||||
# Important: the first week is always the first week where the weekday of
|
||||
# the schedule is present.
|
||||
# For ponctual programs, there is no need for a schedule, only a diffusion
|
||||
class Frequency(IntEnum):
|
||||
ponctual = 0b000000
|
||||
first = 0b000001
|
||||
second = 0b000010
|
||||
third = 0b000100
|
||||
fourth = 0b001000
|
||||
last = 0b010000
|
||||
first_and_third = 0b000101
|
||||
second_and_fourth = 0b001010
|
||||
every = 0b011111
|
||||
one_on_two = 0b100000
|
||||
|
||||
date = models.DateField(
|
||||
_('date'), help_text=_('date of the first diffusion'),
|
||||
)
|
||||
time = models.TimeField(
|
||||
_('time'), help_text=_('start time'),
|
||||
)
|
||||
timezone = models.CharField(
|
||||
_('timezone'),
|
||||
default=tz.get_current_timezone, max_length=100,
|
||||
choices=[(x, x) for x in pytz.all_timezones],
|
||||
help_text=_('timezone used for the date')
|
||||
)
|
||||
duration = models.TimeField(
|
||||
_('duration'),
|
||||
help_text=_('regular duration'),
|
||||
)
|
||||
frequency = models.SmallIntegerField(
|
||||
_('frequency'),
|
||||
choices=[(int(y), {
|
||||
'ponctual': _('ponctual'),
|
||||
'first': _('1st {day} of the month'),
|
||||
'second': _('2nd {day} of the month'),
|
||||
'third': _('3rd {day} of the month'),
|
||||
'fourth': _('4th {day} of the month'),
|
||||
'last': _('last {day} of the month'),
|
||||
'first_and_third': _('1st and 3rd {day} of the month'),
|
||||
'second_and_fourth': _('2nd and 4th {day} of the month'),
|
||||
'every': _('every {day}'),
|
||||
'one_on_two': _('one {day} on two'),
|
||||
}[x]) for x, y in Frequency.__members__.items()],
|
||||
)
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Schedule')
|
||||
verbose_name_plural = _('Schedules')
|
||||
|
||||
def __str__(self):
|
||||
return '{} - {}, {}'.format(
|
||||
self.program.title, self.get_frequency_verbose(),
|
||||
self.time.strftime('%H:%M')
|
||||
)
|
||||
|
||||
def save_rerun(self, *args, **kwargs):
|
||||
self.program = self.initial.program
|
||||
self.duration = self.initial.duration
|
||||
self.frequency = self.initial.frequency
|
||||
|
||||
@cached_property
|
||||
def tz(self):
|
||||
""" Pytz timezone of the schedule. """
|
||||
import pytz
|
||||
return pytz.timezone(self.timezone)
|
||||
|
||||
@cached_property
|
||||
def start(self):
|
||||
""" Datetime of the start (timezone unaware) """
|
||||
return tz.datetime.combine(self.date, self.time)
|
||||
|
||||
@cached_property
|
||||
def end(self):
|
||||
""" Datetime of the end """
|
||||
return self.start + utils.to_timedelta(self.duration)
|
||||
|
||||
def get_frequency_verbose(self):
|
||||
""" Return frequency formated for display """
|
||||
from django.template.defaultfilters import date
|
||||
return self.get_frequency_display().format(
|
||||
day=date(self.date, 'l')
|
||||
).capitalize()
|
||||
|
||||
# initial cached data
|
||||
__initial = None
|
||||
|
||||
def changed(self, fields=['date', 'duration', 'frequency', 'timezone']):
|
||||
initial = self._Schedule__initial
|
||||
|
||||
if not initial:
|
||||
return
|
||||
|
||||
this = self.__dict__
|
||||
|
||||
for field in fields:
|
||||
if initial.get(field) != this.get(field):
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
def normalize(self, date):
|
||||
"""
|
||||
Return a datetime set to schedule's time for the provided date,
|
||||
handling timezone (based on schedule's timezone).
|
||||
"""
|
||||
date = tz.datetime.combine(date, self.time)
|
||||
return self.tz.normalize(self.tz.localize(date))
|
||||
|
||||
def dates_of_month(self, date):
|
||||
""" Return normalized diffusion dates of provided date's month. """
|
||||
if self.frequency == Schedule.Frequency.ponctual:
|
||||
return []
|
||||
|
||||
sched_wday, freq = self.date.weekday(), self.frequency
|
||||
date = date.replace(day=1)
|
||||
|
||||
# last of the month
|
||||
if freq == Schedule.Frequency.last:
|
||||
date = date.replace(
|
||||
day=calendar.monthrange(date.year, date.month)[1])
|
||||
date_wday = date.weekday()
|
||||
|
||||
# end of month before the wanted weekday: move one week back
|
||||
if date_wday < sched_wday:
|
||||
date -= tz.timedelta(days=7)
|
||||
date += tz.timedelta(days=sched_wday - date_wday)
|
||||
return [self.normalize(date)]
|
||||
|
||||
# move to the first day of the month that matches the schedule's weekday
|
||||
# check on SO#3284452 for the formula
|
||||
date_wday, month = date.weekday(), date.month
|
||||
date += tz.timedelta(days=(7 if date_wday > sched_wday else 0) -
|
||||
date_wday + sched_wday)
|
||||
|
||||
if freq == Schedule.Frequency.one_on_two:
|
||||
# - adjust date with modulo 14 (= 2 weeks in days)
|
||||
# - there are max 3 "weeks on two" per month
|
||||
if (date - self.date).days % 14:
|
||||
date += tz.timedelta(days=7)
|
||||
dates = (date + tz.timedelta(days=14*i) for i in range(0, 3))
|
||||
else:
|
||||
dates = (date + tz.timedelta(days=7*week) for week in range(0, 5)
|
||||
if freq & (0b1 << week))
|
||||
|
||||
return [self.normalize(date) for date in dates if date.month == month]
|
||||
|
||||
|
||||
def _exclude_existing_date(self, dates):
|
||||
from .episode import Diffusion
|
||||
saved = set(Diffusion.objects.filter(start__in=dates)
|
||||
.values_list('start', flat=True))
|
||||
return [date for date in dates if date not in saved]
|
||||
|
||||
|
||||
def diffusions_of_month(self, date):
|
||||
"""
|
||||
Get episodes and diffusions for month of provided date, including
|
||||
reruns.
|
||||
:returns: tuple([Episode], [Diffusion])
|
||||
"""
|
||||
from .episode import Diffusion, Episode
|
||||
if self.initial is not None or \
|
||||
self.frequency == Schedule.Frequency.ponctual:
|
||||
return []
|
||||
|
||||
# dates for self and reruns as (date, initial)
|
||||
reruns = [(rerun, rerun.date - self.date)
|
||||
for rerun in self.rerun_set.all()]
|
||||
|
||||
dates = OrderedDict((date, None) for date in self.dates_of_month(date))
|
||||
dates.update([(rerun.normalize(date.date() + delta), date)
|
||||
for date in dates.keys() for rerun, delta in reruns])
|
||||
|
||||
# remove dates corresponding to existing diffusions
|
||||
saved = set(Diffusion.objects.filter(start__in=dates.keys(),
|
||||
program=self.program,
|
||||
schedule=self)
|
||||
.values_list('start', flat=True))
|
||||
|
||||
# make diffs
|
||||
duration = utils.to_timedelta(self.duration)
|
||||
diffusions = {}
|
||||
episodes = {}
|
||||
|
||||
for date, initial in dates.items():
|
||||
if date in saved:
|
||||
continue
|
||||
|
||||
if initial is None:
|
||||
episode = Episode.from_page(self.program, date=date)
|
||||
episode.date = date
|
||||
episodes[date] = episode
|
||||
else:
|
||||
episode = episodes[initial]
|
||||
initial = diffusions[initial]
|
||||
|
||||
diffusions[date] = Diffusion(
|
||||
episode=episode, schedule=self, type=Diffusion.TYPE_ON_AIR,
|
||||
initial=initial, start=date, end=date+duration
|
||||
)
|
||||
return episodes.values(), diffusions.values()
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
|
||||
# TODO/FIXME: use validators?
|
||||
if self.initial is not None and self.date > self.date:
|
||||
raise ValueError('initial must be later')
|
||||
|
||||
|
||||
class Stream(models.Model):
|
||||
"""
|
||||
When there are no program scheduled, it is possible to play sounds
|
||||
in order to avoid blanks. A Stream is a Program that plays this role,
|
||||
and whose linked to a Stream.
|
||||
"""When there are no program scheduled, it is possible to play sounds in
|
||||
order to avoid blanks. A Stream is a Program that plays this role, and
|
||||
whose linked to a Stream.
|
||||
|
||||
    All sounds that are marked as good and that are under the related
    program's archive dir are eligible for the sound selection.
|
||||
"""
|
||||
|
||||
program = models.ForeignKey(
|
||||
Program, models.CASCADE,
|
||||
verbose_name=_('related program'),
|
||||
Program,
|
||||
models.CASCADE,
|
||||
verbose_name=_("related program"),
|
||||
)
|
||||
delay = models.TimeField(
|
||||
_('delay'), blank=True, null=True,
|
||||
help_text=_('minimal delay between two sound plays')
|
||||
_("delay"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("minimal delay between two sound plays"),
|
||||
)
|
||||
begin = models.TimeField(
|
||||
_('begin'), blank=True, null=True,
|
||||
help_text=_('used to define a time range this stream is'
|
||||
'played')
|
||||
_("begin"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("used to define a time range this stream is " "played"),
|
||||
)
|
||||
end = models.TimeField(
|
||||
_('end'),
|
||||
blank=True, null=True,
|
||||
help_text=_('used to define a time range this stream is'
|
||||
'played')
|
||||
_("end"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("used to define a time range this stream is " "played"),
|
||||
)
|
||||
|
||||
|
||||
|
||||
aircox/models/rerun.py (new file, 92 lines)
@@ -0,0 +1,92 @@
|
||||
from django.core.exceptions import ValidationError
|
||||
from django.db import models
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from .program import Program
|
||||
|
||||
|
||||
__all__ = (
|
||||
"Rerun",
|
||||
"RerunQuerySet",
|
||||
)
|
||||
|
||||
|
||||
class RerunQuerySet(models.QuerySet):
|
||||
"""Queryset for Rerun (sub)classes."""
|
||||
|
||||
def station(self, station=None, id=None):
|
||||
return self.filter(program__station=station) if id is None else self.filter(program__station__id=id)
|
||||
|
||||
def program(self, program=None, id=None):
|
||||
return self.filter(program=program) if id is None else self.filter(program__id=id)
|
||||
|
||||
def rerun(self):
|
||||
return self.filter(initial__isnull=False)
|
||||
|
||||
def initial(self):
|
||||
return self.filter(initial__isnull=True)
|
||||
|
||||
|
||||
class Rerun(models.Model):
|
||||
"""Abstract model offering rerun facilities.
|
||||
|
||||
Assume `start` is a datetime field or attribute implemented by
|
||||
subclass.
|
||||
"""
|
||||
|
||||
program = models.ForeignKey(
|
||||
Program,
|
||||
models.CASCADE,
|
||||
db_index=True,
|
||||
verbose_name=_("related program"),
|
||||
)
|
||||
initial = models.ForeignKey(
|
||||
"self",
|
||||
models.SET_NULL,
|
||||
related_name="rerun_set",
|
||||
verbose_name=_("rerun of"),
|
||||
limit_choices_to={"initial__isnull": True},
|
||||
blank=True,
|
||||
null=True,
|
||||
db_index=True,
|
||||
)
|
||||
|
||||
objects = RerunQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
abstract = True
|
||||
|
||||
@property
|
||||
def is_initial(self):
|
||||
return self.initial is None
|
||||
|
||||
@property
|
||||
def is_rerun(self):
|
||||
return self.initial is not None
|
||||
|
||||
def get_initial(self):
|
||||
"""Return the initial schedule (self or initial)"""
|
||||
return self if self.initial is None else self.initial.get_initial()
|
||||
|
||||
def clean(self):
|
||||
super().clean()
|
||||
if hasattr(self, "start") and self.initial is not None and self.initial.start >= self.start:
|
||||
raise ValidationError({"initial": _("rerun must happen after original")})
|
||||
|
||||
def save_rerun(self):
|
||||
self.program = self.initial.program
|
||||
|
||||
def save_initial(self):
|
||||
pass
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if self.initial is not None:
|
||||
self.initial = self.initial.get_initial()
|
||||
if self.initial == self:
|
||||
self.initial = None
|
||||
|
||||
if self.is_rerun:
|
||||
self.save_rerun()
|
||||
else:
|
||||
self.save_initial()
|
||||
super().save(*args, **kwargs)
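
A small sketch of the initial-resolution done in `save`, using Schedule (a concrete Rerun subclass defined in the next file); `original` and `rerun_1` stand for saved schedules with `rerun_1.initial == original`, and the date/time values are placeholders:

rerun_2 = Schedule(initial=rerun_1, date=some_date, time=some_time)
rerun_2.save()

# save() walked `initial` up to the root and, via save_rerun(), copied the
# program (plus duration and frequency on Schedule) from it.
assert rerun_2.initial == original
assert rerun_2.program == original.program
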
|
||||
aircox/models/schedule.py (new file, 226 lines)
@@ -0,0 +1,226 @@
import calendar
import zoneinfo

from django.db import models
from django.utils import timezone as tz
from django.utils.functional import cached_property
from django.utils.translation import gettext_lazy as _

from aircox import utils

from .rerun import Rerun


__all__ = ("Schedule",)


def current_timezone_key():
    return tz.get_current_timezone().key


# ? BIG FIXME: self.date is still used as datetime
class Schedule(Rerun):
    """A Schedule defines time slots of programs' diffusions.

    It can be an initial run or a rerun (in such case it is linked to
    the related schedule).
    """

    # Frequency for schedules. Basically, it is a mask of bits where each bit
    # is a week. Bits > rank 5 are used for special schedules.
    # Important: the first week is always the first week where the weekday of
    # the schedule is present.
    # For ponctual programs, there is no need for a schedule, only a diffusion
    class Frequency(models.IntegerChoices):
        ponctual = 0b000000, _("ponctual")
        first = 0b000001, _("1st {day} of the month")
        second = 0b000010, _("2nd {day} of the month")
        third = 0b000100, _("3rd {day} of the month")
        fourth = 0b001000, _("4th {day} of the month")
        last = 0b010000, _("last {day} of the month")
        first_and_third = 0b000101, _("1st and 3rd {day} of the month")
        second_and_fourth = 0b001010, _("2nd and 4th {day} of the month")
        every = 0b011111, _("{day}")
        one_on_two = 0b100000, _("one {day} on two")
        # every_weekday = 0b10000000 _("from Monday to Friday")

    date = models.DateField(
        _("date"),
        help_text=_("date of the first diffusion"),
    )
    time = models.TimeField(
        _("time"),
        help_text=_("start time"),
    )
    timezone = models.CharField(
        _("timezone"),
        default=current_timezone_key,
        max_length=100,
        choices=sorted([(x, x) for x in zoneinfo.available_timezones()]),
        help_text=_("timezone used for the date"),
    )
    duration = models.TimeField(
        _("duration"),
        help_text=_("regular duration"),
    )
    frequency = models.SmallIntegerField(
        _("frequency"),
        choices=Frequency.choices,
    )

    class Meta:
        verbose_name = _("Schedule")
        verbose_name_plural = _("Schedules")

    def __init__(self, *args, **kwargs):
        self._initial = kwargs
        super().__init__(*args, **kwargs)

    def __str__(self):
        return "{} - {}, {}".format(
            self.program.title,
            self.get_frequency_display(),
            self.time.strftime("%H:%M"),
        )

    def save_rerun(self):
        super().save_rerun()
        self.duration = self.initial.duration
        self.frequency = self.initial.frequency

    @cached_property
    def tz(self):
        """Pytz timezone of the schedule."""
        return zoneinfo.ZoneInfo(self.timezone)

    @cached_property
    def start(self):
        """Datetime of the start (timezone unaware)"""
        return tz.datetime.combine(self.date, self.time)

    @cached_property
    def end(self):
        """Datetime of the end."""
        return self.start + utils.to_timedelta(self.duration)

    def get_frequency_display(self):
        """Return frequency formated for display."""
        from django.template.defaultfilters import date

        return self._get_FIELD_display(self._meta.get_field("frequency")).format(day=date(self.date, "l")).capitalize()

    def normalize(self, date):
        """Return a datetime set to schedule's time for the provided date,
        handling timezone (based on schedule's timezone)."""
        date = tz.datetime.combine(date, self.time)
        return date.replace(tzinfo=self.tz)

    def dates_of_month(self, date, frequency=None, sched_date=None):
        """Return normalized diffusion dates of provided date's month.

        :param Date date: date of the month to get dates from;
        :param Schedule.Frequency frequency: frequency (defaults to ``self.frequency``)
        :param Date sched_date: schedule start date (defaults to ``self.date``)
        :return list of diffusion dates
        """
        if frequency is None:
            frequency = self.frequency

        if sched_date is None:
            sched_date = self.date

        if frequency == Schedule.Frequency.ponctual:
            return []

        sched_wday = sched_date.weekday()
        date = date.replace(day=1)

        # last of the month
        if frequency == Schedule.Frequency.last:
            date = date.replace(day=calendar.monthrange(date.year, date.month)[1])
            date_wday = date.weekday()

            # end of month before the wanted weekday: move one week back
            if date_wday < sched_wday:
                date -= tz.timedelta(days=7)
            date += tz.timedelta(days=sched_wday - date_wday)
            return [self.normalize(date)]

        # move to the first day of the month that matches the schedule's
        # weekday. Check on SO#3284452 for the formula
        date_wday, month = date.weekday(), date.month
        date += tz.timedelta(days=(7 if date_wday > sched_wday else 0) - date_wday + sched_wday)

        if frequency == Schedule.Frequency.one_on_two:
            # - adjust date with modulo 14 (= 2 weeks in days)
            # - there are max 3 "weeks on two" per month
            if (date - sched_date).days % 14:
                date += tz.timedelta(days=7)
            dates = (date + tz.timedelta(days=14 * i) for i in range(0, 3))
        else:
            dates = (date + tz.timedelta(days=7 * week) for week in range(0, 5) if frequency & (0b1 << week))

        return [self.normalize(date) for date in dates if date.month == month]

    def diffusions_of_month(self, date, frequency=None, sched_date=None):
        """Get episodes and diffusions for month of provided date, including
        reruns.

        :param Date date: date of the month to get diffusions from;
        :param Schedule.Frequency frequency: frequency (defaults to ``self.frequency``)
        :param Date sched_date: schedule start date (defaults to ``self.date``)
        :returns: tuple([Episode], [Diffusion])
        """
        from .diffusion import Diffusion
        from .episode import Episode

        if frequency is None:
            frequency = self.frequency

        if sched_date is None:
            sched_date = self.date

        if self.initial is not None or frequency == Schedule.Frequency.ponctual:
            return [], []

        # dates for self and reruns as (date, initial)
        reruns = [(rerun, rerun.date - sched_date) for rerun in self.rerun_set.all()]

        dates = {date: None for date in self.dates_of_month(date, frequency, sched_date)}
        dates.update(
            (rerun.normalize(date.date() + delta), date) for date in list(dates.keys()) for rerun, delta in reruns
        )

        # remove dates corresponding to existing diffusions
        saved = set(
            Diffusion.objects.filter(start__in=dates.keys(), program=self.program, schedule=self).values_list(
                "start", flat=True
            )
        )

        # make diffs
        duration = utils.to_timedelta(self.duration)
        diffusions = {}
        episodes = {}

        for date, initial in dates.items():
            if date in saved:
                continue

            if initial is None:
                episode = Episode.from_page(self.program, date=date)
                episode.date = date
                episodes[date] = episode
            else:
                episode = episodes[initial]
                initial = diffusions[initial]

            diffusions[date] = Diffusion(
                episode=episode,
                schedule=self,
                type=Diffusion.TYPE_ON_AIR,
                initial=initial,
                start=date,
                end=date + duration,
            )
        return episodes.values(), diffusions.values()
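The trickiest part of the new Schedule model is the date arithmetic in dates_of_month(): first align on the earliest day of the month that falls on the schedule's weekday, then pick weeks out of the frequency bit mask. A stand-alone illustration (editor's sketch, standard library only; the function name and the January 2023 values are invented for the example, and the `last` / `one_on_two` frequencies are handled by separate branches in the model):

from datetime import date, timedelta

# Mirrors the low-order bits of Schedule.Frequency.
FIRST, SECOND, THIRD, FOURTH = 0b0001, 0b0010, 0b0100, 0b1000

def month_dates(year, month, sched_weekday, frequency):
    # Days of (year, month) selected by `frequency` for a schedule held on `sched_weekday`.
    first = date(year, month, 1)
    # Jump to the first day of the month falling on the schedule's weekday
    # (same formula as in dates_of_month, see SO#3284452).
    wday = first.weekday()
    first += timedelta(days=(7 if wday > sched_weekday else 0) - wday + sched_weekday)
    # Each low-order bit of the mask selects one of the (at most five) matching weeks.
    picked = (first + timedelta(days=7 * week) for week in range(5) if frequency & (1 << week))
    return [d for d in picked if d.month == month]

# 1st and 3rd Wednesday (weekday 2) of January 2023: 2023-01-04 and 2023-01-18
print(month_dates(2023, 1, 2, FIRST | THIRD))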
@ -1,13 +1,16 @@
|
||||
import pytz
|
||||
|
||||
from django.contrib.auth.models import User, Group, Permission
|
||||
from django.contrib.auth.models import Group, Permission, User
|
||||
from django.db import transaction
|
||||
from django.db.models import F, signals
|
||||
from django.db.models import signals
|
||||
from django.dispatch import receiver
|
||||
from django.utils import timezone as tz
|
||||
|
||||
from .. import settings, utils
|
||||
from . import Diffusion, Episode, Page, Program, Schedule
|
||||
from aircox import utils
|
||||
from aircox.conf import settings
|
||||
from .diffusion import Diffusion
|
||||
from .episode import Episode
|
||||
from .page import Page
|
||||
from .program import Program
|
||||
from .schedule import Schedule
|
||||
|
||||
|
||||
# Add a default group to a user when it is created. It also assigns a list
|
||||
@ -18,21 +21,18 @@ from . import Diffusion, Episode, Page, Program, Schedule
|
||||
#
|
||||
@receiver(signals.post_save, sender=User)
|
||||
def user_default_groups(sender, instance, created, *args, **kwargs):
|
||||
"""
|
||||
Set users to different default groups
|
||||
"""
|
||||
"""Set users to different default groups."""
|
||||
if not created or instance.is_superuser:
|
||||
return
|
||||
|
||||
for group_name, permissions in settings.AIRCOX_DEFAULT_USER_GROUPS.items():
|
||||
for group_name, permissions in settings.DEFAULT_USER_GROUPS.items():
|
||||
if instance.groups.filter(name=group_name).count():
|
||||
continue
|
||||
|
||||
group, created = Group.objects.get_or_create(name=group_name)
|
||||
if created and permissions:
|
||||
for codename in permissions:
|
||||
permission = Permission.objects.filter(
|
||||
codename=codename).first()
|
||||
permission = Permission.objects.filter(codename=codename).first()
|
||||
if permission:
|
||||
group.permissions.add(permission)
|
||||
group.save()
|
||||
@ -41,44 +41,36 @@ def user_default_groups(sender, instance, created, *args, **kwargs):
|
||||
|
||||
@receiver(signals.post_save, sender=Page)
|
||||
def page_post_save(sender, instance, created, *args, **kwargs):
|
||||
if not created and instance.cover:
|
||||
Page.objects.filter(parent=instance, cover__isnull=True) \
|
||||
.update(cover=instance.cover)
|
||||
if not created and instance.cover and "raw" not in kwargs:
|
||||
Page.objects.filter(parent=instance, cover__isnull=True).update(cover=instance.cover)
|
||||
|
||||
|
||||
@receiver(signals.post_save, sender=Program)
|
||||
def program_post_save(sender, instance, created, *args, **kwargs):
|
||||
"""
|
||||
Clean-up later diffusions when a program becomes inactive
|
||||
"""
|
||||
"""Clean-up later diffusions when a program becomes inactive."""
|
||||
if not instance.active:
|
||||
Diffusion.object.program(instance).after(tz.now()).delete()
|
||||
Episode.object.parent(instance).filter(diffusion__isnull=True) \
|
||||
.delete()
|
||||
Diffusion.objects.program(instance).after(tz.now()).delete()
|
||||
Episode.objects.parent(instance).filter(diffusion__isnull=True).delete()
|
||||
|
||||
cover = getattr(instance, '__initial_cover', None)
|
||||
cover = getattr(instance, "__initial_cover", None)
|
||||
if cover is None and instance.cover is not None:
|
||||
Episode.objects.parent(instance) \
|
||||
.filter(cover__isnull=True) \
|
||||
.update(cover=instance.cover)
|
||||
|
||||
Episode.objects.parent(instance).filter(cover__isnull=True).update(cover=instance.cover)
|
||||
|
||||
|
||||
@receiver(signals.pre_save, sender=Schedule)
|
||||
def schedule_pre_save(sender, instance, *args, **kwargs):
|
||||
if getattr(instance, 'pk') is not None:
|
||||
if getattr(instance, "pk") is not None and "raw" not in kwargs:
|
||||
instance._initial = Schedule.objects.get(pk=instance.pk)
|
||||
|
||||
|
||||
@receiver(signals.post_save, sender=Schedule)
|
||||
def schedule_post_save(sender, instance, created, *args, **kwargs):
|
||||
"""
|
||||
Handles Schedule's time, duration and timezone changes and update
|
||||
corresponding diffusions accordingly.
|
||||
"""
|
||||
initial = getattr(instance, '_initial', None)
|
||||
if not initial or ((instance.time, instance.duration, instance.timezone) ==
|
||||
(initial.time, initial.duration, initial.timezone)):
|
||||
"""Handles Schedule's time, duration and timezone changes and update
|
||||
corresponding diffusions accordingly."""
|
||||
initial = getattr(instance, "_initial", None)
|
||||
if not initial or (
|
||||
(instance.time, instance.duration, instance.timezone) == (initial.time, initial.duration, initial.timezone)
|
||||
):
|
||||
return
|
||||
|
||||
today = tz.datetime.today()
|
||||
@ -94,14 +86,11 @@ def schedule_post_save(sender, instance, created, *args, **kwargs):
|
||||
|
||||
@receiver(signals.pre_delete, sender=Schedule)
|
||||
def schedule_pre_delete(sender, instance, *args, **kwargs):
|
||||
""" Delete later corresponding diffusion to a changed schedule. """
|
||||
"""Delete later corresponding diffusion to a changed schedule."""
|
||||
Diffusion.objects.filter(schedule=instance).after(tz.now()).delete()
|
||||
Episode.objects.filter(diffusion__isnull=True, content__isnull=True,
|
||||
sound__isnull=True).delete()
|
||||
Episode.objects.filter(diffusion__isnull=True, content__isnull=True, sound__isnull=True).delete()
|
||||
|
||||
|
||||
@receiver(signals.post_delete, sender=Diffusion)
|
||||
def diffusion_post_delete(sender, instance, *args, **kwargs):
|
||||
Episode.objects.filter(diffusion__isnull=True, content__isnull=True,
|
||||
sound__isnull=True).delete()
|
||||
|
||||
|
||||
Episode.objects.filter(diffusion__isnull=True, content__isnull=True, sound__isnull=True).delete()
|
||||
|
||||
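The handlers in this signals module all follow Django's receiver pattern: connect with @receiver, then return early for the cases the handler should ignore. A generic, runnable sketch (editor's illustration wired to a bare Signal so no configured project is needed; the created/raw guard shown is the conventional form, not a verbatim copy of the handlers above):

from django.dispatch import Signal, receiver

page_saved = Signal()  # stand-in for django.db.models.signals.post_save

@receiver(page_saved)
def on_page_saved(sender, instance, created, raw=False, **kwargs):
    # Conventional guard: ignore fixture loading (raw) and plain updates.
    if raw or not created:
        return
    print("post-create work for", instance)

page_saved.send(sender=None, instance="my page", created=True)   # handler runs
page_saved.send(sender=None, instance="my page", created=False)  # handler returns early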
@ -1,24 +1,22 @@
|
||||
from enum import IntEnum
|
||||
import logging
|
||||
import os
|
||||
|
||||
from django.conf import settings as main_settings
|
||||
from django.conf import settings as conf
|
||||
from django.db import models
|
||||
from django.db.models import Q
|
||||
from django.utils import timezone as tz
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from taggit.managers import TaggableManager
|
||||
|
||||
from aircox import settings
|
||||
from .program import Program
|
||||
from aircox.conf import settings
|
||||
|
||||
from .episode import Episode
|
||||
from .program import Program
|
||||
|
||||
logger = logging.getLogger("aircox")
|
||||
|
||||
|
||||
logger = logging.getLogger('aircox')
|
||||
|
||||
|
||||
__all__ = ['Sound', 'SoundQuerySet', 'Track']
|
||||
__all__ = ("Sound", "SoundQuerySet", "Track")
|
||||
|
||||
|
||||
class SoundQuerySet(models.QuerySet):
|
||||
@ -38,142 +36,174 @@ class SoundQuerySet(models.QuerySet):
|
||||
return self.exclude(type=Sound.TYPE_REMOVED)
|
||||
|
||||
def public(self):
|
||||
""" Return sounds available as podcasts """
|
||||
"""Return sounds available as podcasts."""
|
||||
return self.filter(is_public=True)
|
||||
|
||||
def downloadable(self):
|
||||
"""Return sounds available as podcasts."""
|
||||
return self.filter(is_downloadable=True)
|
||||
|
||||
def archive(self):
|
||||
""" Return sounds that are archives """
|
||||
"""Return sounds that are archives."""
|
||||
return self.filter(type=Sound.TYPE_ARCHIVE)
|
||||
|
||||
def paths(self, archive=True, order_by=True):
|
||||
"""
|
||||
Return paths as a flat list (exclude sound without path).
|
||||
def path(self, paths):
|
||||
if isinstance(paths, str):
|
||||
return self.filter(file=paths.replace(conf.MEDIA_ROOT + "/", ""))
|
||||
return self.filter(file__in=(p.replace(conf.MEDIA_ROOT + "/", "") for p in paths))
|
||||
|
||||
def playlist(self, archive=True, order_by=True):
|
||||
"""Return files absolute paths as a flat list (exclude sound without
|
||||
path).
|
||||
|
||||
If `order_by` is True, order by path.
|
||||
"""
|
||||
if archive:
|
||||
self = self.archive()
|
||||
if order_by:
|
||||
self = self.order_by('path')
|
||||
return self.filter(path__isnull=False).values_list('path', flat=True)
|
||||
self = self.order_by("file")
|
||||
return [
|
||||
os.path.join(conf.MEDIA_ROOT, file)
|
||||
for file in self.filter(file__isnull=False).values_list("file", flat=True)
|
||||
]
|
||||
|
||||
def search(self, query):
|
||||
return self.filter(
|
||||
Q(name__icontains=query) | Q(path__icontains=query) |
|
||||
Q(program__title__icontains=query) |
|
||||
Q(episode__title__icontains=query)
|
||||
Q(name__icontains=query)
|
||||
| Q(file__icontains=query)
|
||||
| Q(program__title__icontains=query)
|
||||
| Q(episode__title__icontains=query)
|
||||
)
|
||||
|
||||
|
||||
# TODO:
|
||||
# - provide a default name based on program and episode
|
||||
class Sound(models.Model):
|
||||
"""
|
||||
A Sound is the representation of a sound file that can be either an excerpt
|
||||
or a complete archive of the related diffusion.
|
||||
"""
|
||||
"""A Sound is the representation of a sound file that can be either an
|
||||
excerpt or a complete archive of the related diffusion."""
|
||||
|
||||
TYPE_OTHER = 0x00
|
||||
TYPE_ARCHIVE = 0x01
|
||||
TYPE_EXCERPT = 0x02
|
||||
TYPE_REMOVED = 0x03
|
||||
TYPE_CHOICES = (
|
||||
(TYPE_OTHER, _('other')), (TYPE_ARCHIVE, _('archive')),
|
||||
(TYPE_EXCERPT, _('excerpt')), (TYPE_REMOVED, _('removed'))
|
||||
(TYPE_OTHER, _("other")),
|
||||
(TYPE_ARCHIVE, _("archive")),
|
||||
(TYPE_EXCERPT, _("excerpt")),
|
||||
(TYPE_REMOVED, _("removed")),
|
||||
)
|
||||
|
||||
name = models.CharField(_('name'), max_length=64)
|
||||
name = models.CharField(_("name"), max_length=64)
|
||||
program = models.ForeignKey(
|
||||
Program, models.CASCADE, blank=True, # NOT NULL
|
||||
verbose_name=_('program'),
|
||||
help_text=_('program related to it'),
|
||||
Program,
|
||||
models.CASCADE,
|
||||
blank=True, # NOT NULL
|
||||
verbose_name=_("program"),
|
||||
help_text=_("program related to it"),
|
||||
db_index=True,
|
||||
)
|
||||
episode = models.ForeignKey(
|
||||
Episode, models.SET_NULL, blank=True, null=True,
|
||||
verbose_name=_('episode'),
|
||||
Episode,
|
||||
models.SET_NULL,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_("episode"),
|
||||
db_index=True,
|
||||
)
|
||||
type = models.SmallIntegerField(_('type'), choices=TYPE_CHOICES)
|
||||
type = models.SmallIntegerField(_("type"), choices=TYPE_CHOICES)
|
||||
position = models.PositiveSmallIntegerField(
|
||||
_('order'), default=0, help_text=_('position in the playlist'),
|
||||
_("order"),
|
||||
default=0,
|
||||
help_text=_("position in the playlist"),
|
||||
)
|
||||
# FIXME: url() does not use the same directory than here
|
||||
# should we use FileField for more reliability?
|
||||
path = models.FilePathField(
|
||||
_('file'),
|
||||
path=settings.AIRCOX_PROGRAMS_DIR,
|
||||
match=r'(' + '|'.join(settings.AIRCOX_SOUND_FILE_EXT)
|
||||
.replace('.', r'\.') + ')$',
|
||||
recursive=True, max_length=255,
|
||||
blank=True, null=True, unique=True,
|
||||
|
||||
def _upload_to(self, filename):
|
||||
subdir = settings.SOUND_ARCHIVES_SUBDIR if self.type == self.TYPE_ARCHIVE else settings.SOUND_EXCERPTS_SUBDIR
|
||||
return os.path.join(self.program.path, subdir, filename)
|
||||
|
||||
file = models.FileField(
|
||||
_("file"),
|
||||
upload_to=_upload_to,
|
||||
max_length=256,
|
||||
db_index=True,
|
||||
unique=True,
|
||||
)
|
||||
#embed = models.TextField(
|
||||
# _('embed'),
|
||||
# blank=True, null=True,
|
||||
# help_text=_('HTML code to embed a sound from an external plateform'),
|
||||
#)
|
||||
duration = models.TimeField(
|
||||
_('duration'),
|
||||
blank=True, null=True,
|
||||
help_text=_('duration of the sound'),
|
||||
_("duration"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("duration of the sound"),
|
||||
)
|
||||
mtime = models.DateTimeField(
|
||||
_('modification time'),
|
||||
blank=True, null=True,
|
||||
help_text=_('last modification date and time'),
|
||||
_("modification time"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("last modification date and time"),
|
||||
)
|
||||
is_good_quality = models.BooleanField(
|
||||
_('good quality'), help_text=_('sound meets quality requirements'),
|
||||
blank=True, null=True
|
||||
_("good quality"),
|
||||
help_text=_("sound meets quality requirements"),
|
||||
blank=True,
|
||||
null=True,
|
||||
)
|
||||
is_public = models.BooleanField(
|
||||
_('public'), help_text=_('if it can be podcasted from the server'),
|
||||
_("public"),
|
||||
help_text=_("whether it is publicly available as podcast"),
|
||||
default=False,
|
||||
)
|
||||
is_downloadable = models.BooleanField(
|
||||
_("downloadable"),
|
||||
help_text=_("whether it can be publicly downloaded by visitors (sound must be " "public)"),
|
||||
default=False,
|
||||
)
|
||||
|
||||
objects = SoundQuerySet.as_manager()
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Sound')
|
||||
verbose_name_plural = _('Sounds')
|
||||
verbose_name = _("Sound")
|
||||
verbose_name_plural = _("Sounds")
|
||||
|
||||
@property
|
||||
def url(self):
|
||||
return self.file and self.file.url
|
||||
|
||||
def __str__(self):
|
||||
return '/'.join(self.path.split('/')[-3:])
|
||||
return "/".join(self.file.path.split("/")[-3:])
|
||||
|
||||
def save(self, check=True, *args, **kwargs):
|
||||
if self.episode is not None and self.program is None:
|
||||
self.program = self.episode.program
|
||||
if check:
|
||||
self.check_on_file()
|
||||
if not self.is_public:
|
||||
self.is_downloadable = False
|
||||
self.__check_name()
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
def url(self):
|
||||
""" Return an url to the file. """
|
||||
path = self.path.replace(main_settings.MEDIA_ROOT, '', 1)
|
||||
return (main_settings.MEDIA_URL + path).replace('//','/')
|
||||
|
||||
# TODO: rename get_file_mtime(self)
|
||||
def get_mtime(self):
|
||||
"""
|
||||
Get the last modification date from file
|
||||
"""
|
||||
mtime = os.stat(self.path).st_mtime
|
||||
"""Get the last modification date from file."""
|
||||
mtime = os.stat(self.file.path).st_mtime
|
||||
mtime = tz.datetime.fromtimestamp(mtime)
|
||||
mtime = mtime.replace(microsecond=0)
|
||||
return tz.make_aware(mtime, tz.get_current_timezone())
|
||||
|
||||
def file_exists(self):
|
||||
""" Return true if the file still exists. """
|
||||
"""Return true if the file still exists."""
|
||||
|
||||
return os.path.exists(self.path)
|
||||
return os.path.exists(self.file.path)
|
||||
|
||||
# TODO: rename to sync_fs()
|
||||
def check_on_file(self):
|
||||
"""
|
||||
Check sound file info again'st self, and update informations if
|
||||
needed (do not save). Return True if there was changes.
|
||||
"""Check sound file info again'st self, and update informations if
|
||||
needed (do not save).
|
||||
|
||||
Return True if there were changes.
|
||||
"""
|
||||
if not self.file_exists():
|
||||
if self.type == self.TYPE_REMOVED:
|
||||
return
|
||||
logger.info('sound %s: has been removed', self.path)
|
||||
logger.debug("sound %s: has been removed", self.file.name)
|
||||
self.type = self.TYPE_REMOVED
|
||||
return True
|
||||
|
||||
@ -182,9 +212,9 @@ class Sound(models.Model):
|
||||
|
||||
if self.type == self.TYPE_REMOVED and self.program:
|
||||
changed = True
|
||||
self.type = self.TYPE_ARCHIVE \
|
||||
if self.path.startswith(self.program.archives_path) else \
|
||||
self.TYPE_EXCERPT
|
||||
self.type = (
|
||||
self.TYPE_ARCHIVE if self.file.name.startswith(self.program.archives_path) else self.TYPE_EXCERPT
|
||||
)
|
||||
|
||||
# check mtime -> reset quality if changed (assume file changed)
|
||||
mtime = self.get_mtime()
|
||||
@ -192,18 +222,20 @@ class Sound(models.Model):
|
||||
if self.mtime != mtime:
|
||||
self.mtime = mtime
|
||||
self.is_good_quality = None
|
||||
logger.info('sound %s: m_time has changed. Reset quality info',
|
||||
self.path)
|
||||
logger.debug(
|
||||
"sound %s: m_time has changed. Reset quality info",
|
||||
self.file.name,
|
||||
)
|
||||
return True
|
||||
|
||||
return changed
|
||||
|
||||
def __check_name(self):
|
||||
if not self.name and self.path:
|
||||
if not self.name and self.file and self.file.name:
|
||||
# FIXME: later, remove date?
|
||||
self.name = os.path.basename(self.path)
|
||||
self.name = os.path.splitext(self.name)[0]
|
||||
self.name = self.name.replace('_', ' ')
|
||||
name = os.path.basename(self.file.name)
|
||||
name = os.path.splitext(name)[0]
|
||||
self.name = name.replace("_", " ").strip()
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
@ -211,50 +243,62 @@ class Sound(models.Model):
|
||||
|
||||
|
||||
class Track(models.Model):
|
||||
"""Track of a playlist of an object.
|
||||
|
||||
The position can either be expressed as the position in the playlist
|
||||
or as the moment in seconds it started.
|
||||
"""
|
||||
Track of a playlist of an object. The position can either be expressed
|
||||
as the position in the playlist or as the moment in seconds it started.
|
||||
"""
|
||||
|
||||
episode = models.ForeignKey(
|
||||
Episode, models.CASCADE, blank=True, null=True,
|
||||
verbose_name=_('episode'),
|
||||
Episode,
|
||||
models.CASCADE,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_("episode"),
|
||||
)
|
||||
sound = models.ForeignKey(
|
||||
Sound, models.CASCADE, blank=True, null=True,
|
||||
verbose_name=_('sound'),
|
||||
Sound,
|
||||
models.CASCADE,
|
||||
blank=True,
|
||||
null=True,
|
||||
verbose_name=_("sound"),
|
||||
)
|
||||
position = models.PositiveSmallIntegerField(
|
||||
_('order'), default=0, help_text=_('position in the playlist'),
|
||||
_("order"),
|
||||
default=0,
|
||||
help_text=_("position in the playlist"),
|
||||
)
|
||||
timestamp = models.PositiveSmallIntegerField(
|
||||
_('timestamp'),
|
||||
blank=True, null=True,
|
||||
help_text=_('position (in seconds)')
|
||||
_("timestamp"),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_("position (in seconds)"),
|
||||
)
|
||||
title = models.CharField(_('title'), max_length=128)
|
||||
artist = models.CharField(_('artist'), max_length=128)
|
||||
tags = TaggableManager(verbose_name=_('tags'), blank=True,)
|
||||
title = models.CharField(_("title"), max_length=128)
|
||||
artist = models.CharField(_("artist"), max_length=128)
|
||||
album = models.CharField(_("album"), max_length=128, null=True, blank=True)
|
||||
tags = TaggableManager(verbose_name=_("tags"), blank=True)
|
||||
year = models.IntegerField(_("year"), blank=True, null=True)
|
||||
# FIXME: remove?
|
||||
info = models.CharField(
|
||||
_('information'),
|
||||
_("information"),
|
||||
max_length=128,
|
||||
blank=True, null=True,
|
||||
help_text=_('additional informations about this track, such as '
|
||||
'the version, if is it a remix, features, etc.'),
|
||||
blank=True,
|
||||
null=True,
|
||||
help_text=_(
|
||||
"additional informations about this track, such as " "the version, if is it a remix, features, etc."
|
||||
),
|
||||
)
|
||||
|
||||
class Meta:
|
||||
verbose_name = _('Track')
|
||||
verbose_name_plural = _('Tracks')
|
||||
ordering = ('position',)
|
||||
verbose_name = _("Track")
|
||||
verbose_name_plural = _("Tracks")
|
||||
ordering = ("position",)
|
||||
|
||||
def __str__(self):
|
||||
return '{self.artist} -- {self.title} -- {self.position}'.format(
|
||||
self=self)
|
||||
return "{self.artist} -- {self.title} -- {self.position}".format(self=self)
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if (self.sound is None and self.episode is None) or \
|
||||
(self.sound is not None and self.episode is not None):
|
||||
raise ValueError('sound XOR episode is required')
|
||||
if (self.sound is None and self.episode is None) or (self.sound is not None and self.episode is not None):
|
||||
raise ValueError("sound XOR episode is required")
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
|
||||
|
||||
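Sound.check_on_file() above compares the file's on-disk modification time with the stored mtime and resets the quality flag when it changed. How such an aware mtime can be derived from the filesystem, sketched with the standard library only (editor's illustration; the real get_mtime() uses django.utils.timezone and the current timezone, the zone name below is just an arbitrary example):

import os
from datetime import datetime
from zoneinfo import ZoneInfo

def file_mtime(path, tzname="Europe/Brussels"):
    # st_mtime is a POSIX timestamp; drop microseconds and attach a timezone.
    mtime = datetime.fromtimestamp(os.stat(path).st_mtime).replace(microsecond=0)
    return mtime.replace(tzinfo=ZoneInfo(tzname))

print(file_mtime(__file__))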
@ -1,24 +1,20 @@
|
||||
import os
|
||||
|
||||
from django.db import models
|
||||
from django.utils.functional import cached_property
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from filer.fields.image import FilerImageField
|
||||
|
||||
from .. import settings
|
||||
from aircox.conf import settings
|
||||
|
||||
|
||||
__all__ = ['Station', 'StationQuerySet', 'Port']
|
||||
__all__ = ("Station", "StationQuerySet", "Port")
|
||||
|
||||
|
||||
class StationQuerySet(models.QuerySet):
|
||||
def default(self, station=None):
|
||||
"""
|
||||
Return station model instance, using defaults or
|
||||
given one.
|
||||
"""
|
||||
"""Return station model instance, using defaults or given one."""
|
||||
if station is None:
|
||||
return self.order_by('-default', 'pk').first()
|
||||
return self.order_by("-default", "pk").first()
|
||||
return self.filter(pk=station).first()
|
||||
|
||||
def active(self):
|
||||
@ -26,61 +22,82 @@ class StationQuerySet(models.QuerySet):
|
||||
|
||||
|
||||
class Station(models.Model):
|
||||
"""
|
||||
Represents a radio station, to which multiple programs are attached
|
||||
and that is used as the top object for everything.
|
||||
"""Represents a radio station, to which multiple programs are attached and
|
||||
that is used as the top object for everything.
|
||||
|
||||
A Station holds controllers for the audio stream generation too.
|
||||
Theses are set up when needed (at the first access to these elements)
|
||||
then cached.
|
||||
These are set up when needed (at the first access to these
|
||||
elements) then cached.
|
||||
"""
|
||||
name = models.CharField(_('name'), max_length=64)
|
||||
slug = models.SlugField(_('slug'), max_length=64, unique=True)
|
||||
|
||||
name = models.CharField(_("name"), max_length=64)
|
||||
slug = models.SlugField(_("slug"), max_length=64, unique=True)
|
||||
# FIXME: remove - should be decided only by Streamer controller + settings
|
||||
path = models.CharField(
|
||||
_('path'),
|
||||
help_text=_('path to the working directory'),
|
||||
_("path"),
|
||||
help_text=_("path to the working directory"),
|
||||
max_length=256,
|
||||
blank=True,
|
||||
)
|
||||
default = models.BooleanField(
|
||||
_('default station'),
|
||||
default=True,
|
||||
help_text=_('use this station as the main one.')
|
||||
_("default station"),
|
||||
default=False,
|
||||
help_text=_("use this station as the main one."),
|
||||
)
|
||||
active = models.BooleanField(
|
||||
_('active'),
|
||||
_("active"),
|
||||
default=True,
|
||||
help_text=_('whether this station is still active or not.')
|
||||
help_text=_("whether this station is still active or not."),
|
||||
)
|
||||
logo = FilerImageField(
|
||||
on_delete=models.SET_NULL, null=True, blank=True,
|
||||
verbose_name=_('Logo'),
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
verbose_name=_("Logo"),
|
||||
)
|
||||
hosts = models.TextField(
|
||||
_("website's urls"), max_length=512, null=True, blank=True,
|
||||
help_text=_('specify one url per line')
|
||||
_("website's urls"),
|
||||
max_length=512,
|
||||
null=True,
|
||||
blank=True,
|
||||
help_text=_("specify one domain per line, without 'http://' prefix"),
|
||||
)
|
||||
audio_streams = models.TextField(
|
||||
_("audio streams"), max_length=2048, null=True, blank=True,
|
||||
help_text=_("Audio streams urls used by station's player. One url "
|
||||
"a line.")
|
||||
_("audio streams"),
|
||||
max_length=2048,
|
||||
null=True,
|
||||
blank=True,
|
||||
help_text=_("Audio streams urls used by station's player. One url a line."),
|
||||
)
|
||||
default_cover = FilerImageField(
|
||||
on_delete=models.SET_NULL,
|
||||
verbose_name=_('Default pages\' cover'), null=True, blank=True,
|
||||
related_name='+',
|
||||
verbose_name=_("Default pages' cover"),
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name="+",
|
||||
)
|
||||
music_stream_title = models.CharField(
|
||||
_("Music stream's title"),
|
||||
max_length=64,
|
||||
default=_("Music stream"),
|
||||
)
|
||||
|
||||
objects = StationQuerySet.as_manager()
|
||||
|
||||
@cached_property
|
||||
def streams(self):
|
||||
"""Audio streams as list of urls."""
|
||||
return self.audio_streams.split("\n") if self.audio_streams else []
|
||||
|
||||
def __str__(self):
|
||||
return self.name
|
||||
|
||||
def save(self, make_sources=True, *args, **kwargs):
|
||||
if not self.path:
|
||||
self.path = os.path.join(settings.AIRCOX_CONTROLLERS_WORKING_DIR,
|
||||
self.slug.replace('-', '_'))
|
||||
self.path = os.path.join(
|
||||
settings.CONTROLLERS_WORKING_DIR,
|
||||
self.slug.replace("-", "_"),
|
||||
)
|
||||
|
||||
if self.default:
|
||||
qs = Station.objects.filter(default=True)
|
||||
@ -93,22 +110,20 @@ class Station(models.Model):
|
||||
|
||||
class PortQuerySet(models.QuerySet):
|
||||
def active(self, value=True):
|
||||
""" Active ports """
|
||||
"""Active ports."""
|
||||
return self.filter(active=value)
|
||||
|
||||
def output(self):
|
||||
""" Filter in output ports """
|
||||
"""Filter in output ports."""
|
||||
return self.filter(direction=Port.DIRECTION_OUTPUT)
|
||||
|
||||
def input(self):
|
||||
""" Fitler in input ports """
|
||||
"""Fitler in input ports."""
|
||||
return self.filter(direction=Port.DIRECTION_INPUT)
|
||||
|
||||
|
||||
class Port(models.Model):
|
||||
"""
|
||||
Represent an audio input/output for the audio stream
|
||||
generation.
|
||||
"""Represent an audio input/output for the audio stream generation.
|
||||
|
||||
You might want to take a look to LiquidSoap's documentation
|
||||
for the options available for each kind of input/output.
|
||||
@ -116,10 +131,13 @@ class Port(models.Model):
|
||||
Some port types may be not available depending on the
|
||||
direction of the port.
|
||||
"""
|
||||
|
||||
DIRECTION_INPUT = 0x00
|
||||
DIRECTION_OUTPUT = 0x01
|
||||
DIRECTION_CHOICES = ((DIRECTION_INPUT, _('input')),
|
||||
(DIRECTION_OUTPUT, _('output')))
|
||||
DIRECTION_CHOICES = (
|
||||
(DIRECTION_INPUT, _("input")),
|
||||
(DIRECTION_OUTPUT, _("output")),
|
||||
)
|
||||
|
||||
TYPE_JACK = 0x00
|
||||
TYPE_ALSA = 0x01
|
||||
@ -129,27 +147,28 @@ class Port(models.Model):
|
||||
TYPE_HTTPS = 0x05
|
||||
TYPE_FILE = 0x06
|
||||
TYPE_CHOICES = (
|
||||
(TYPE_JACK, 'jack'), (TYPE_ALSA, 'alsa'),
|
||||
(TYPE_PULSEAUDIO, 'pulseaudio'), (TYPE_ICECAST, 'icecast'),
|
||||
(TYPE_HTTP, 'http'), (TYPE_HTTPS, 'https'),
|
||||
(TYPE_FILE, _('file'))
|
||||
(TYPE_JACK, "jack"),
|
||||
(TYPE_ALSA, "alsa"),
|
||||
(TYPE_PULSEAUDIO, "pulseaudio"),
|
||||
(TYPE_ICECAST, "icecast"),
|
||||
(TYPE_HTTP, "http"),
|
||||
(TYPE_HTTPS, "https"),
|
||||
(TYPE_FILE, _("file")),
|
||||
)
|
||||
|
||||
station = models.ForeignKey(
|
||||
Station, models.CASCADE, verbose_name=_('station'))
|
||||
direction = models.SmallIntegerField(
|
||||
_('direction'), choices=DIRECTION_CHOICES)
|
||||
type = models.SmallIntegerField(_('type'), choices=TYPE_CHOICES)
|
||||
active = models.BooleanField(
|
||||
_('active'), default=True,
|
||||
help_text=_('this port is active')
|
||||
)
|
||||
station = models.ForeignKey(Station, models.CASCADE, verbose_name=_("station"))
|
||||
direction = models.SmallIntegerField(_("direction"), choices=DIRECTION_CHOICES)
|
||||
type = models.SmallIntegerField(_("type"), choices=TYPE_CHOICES)
|
||||
active = models.BooleanField(_("active"), default=True, help_text=_("this port is active"))
|
||||
settings = models.TextField(
|
||||
_('port settings'),
|
||||
help_text=_('list of comma separated params available; '
|
||||
'this is put in the output config file as raw code; '
|
||||
'plugin related'),
|
||||
blank=True, null=True
|
||||
_("port settings"),
|
||||
help_text=_(
|
||||
"list of comma separated params available; "
|
||||
"this is put in the output config file as raw code; "
|
||||
"plugin related"
|
||||
),
|
||||
blank=True,
|
||||
null=True,
|
||||
)
|
||||
|
||||
objects = PortQuerySet.as_manager()
|
||||
@ -157,28 +176,20 @@ class Port(models.Model):
|
||||
def __str__(self):
|
||||
return "{direction}: {type} #{id}".format(
|
||||
direction=self.get_direction_display(),
|
||||
type=self.get_type_display(), id=self.pk or ''
|
||||
type=self.get_type_display(),
|
||||
id=self.pk or "",
|
||||
)
|
||||
|
||||
def is_valid_type(self):
|
||||
"""
|
||||
Return True if the type is available for the given direction.
|
||||
"""
|
||||
"""Return True if the type is available for the given direction."""
|
||||
|
||||
if self.direction == self.DIRECTION_INPUT:
|
||||
return self.type not in (
|
||||
self.TYPE_ICECAST, self.TYPE_FILE
|
||||
)
|
||||
return self.type not in (self.TYPE_ICECAST, self.TYPE_FILE)
|
||||
|
||||
return self.type not in (
|
||||
self.TYPE_HTTP, self.TYPE_HTTPS
|
||||
)
|
||||
return self.type not in (self.TYPE_HTTP, self.TYPE_HTTPS)
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if not self.is_valid_type():
|
||||
raise ValueError(
|
||||
"port type is not allowed with the given port direction"
|
||||
)
|
||||
raise ValueError("port type is not allowed with the given port direction")
|
||||
|
||||
return super().save(*args, **kwargs)
|
||||
|
||||
|
||||
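Port.is_valid_type() above restricts port types per direction: icecast and file ports only make sense as outputs, http/https stream ports only as inputs. A pure-Python restatement of that rule (editor's sketch; the constants are local stand-ins for the model's DIRECTION_*/TYPE_* values):

DIRECTION_INPUT, DIRECTION_OUTPUT = 0x00, 0x01
TYPE_JACK, TYPE_ALSA, TYPE_PULSEAUDIO, TYPE_ICECAST, TYPE_HTTP, TYPE_HTTPS, TYPE_FILE = range(7)

# Types rejected for each direction, mirroring is_valid_type().
FORBIDDEN = {
    DIRECTION_INPUT: {TYPE_ICECAST, TYPE_FILE},
    DIRECTION_OUTPUT: {TYPE_HTTP, TYPE_HTTPS},
}

def is_valid_type(direction, type_):
    return type_ not in FORBIDDEN[direction]

assert is_valid_type(DIRECTION_OUTPUT, TYPE_ICECAST)   # icecast output: allowed
assert not is_valid_type(DIRECTION_INPUT, TYPE_FILE)   # file input: rejected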
18 aircox/models/user_settings.py Normal file
@ -0,0 +1,18 @@
from django.contrib.auth.models import User
from django.db import models
from django.utils.translation import gettext_lazy as _

__all__ = ("UserSettings",)


class UserSettings(models.Model):
    """Store user's settings."""

    user = models.OneToOneField(
        User,
        models.CASCADE,
        verbose_name=_("User"),
        related_name="aircox_settings",
    )
    playlist_editor_columns = models.JSONField(_("Playlist Editor Columns"))
    playlist_editor_sep = models.CharField(_("Playlist Editor Separator"), max_length=16)
12 aircox/serializers/__init__.py Normal file
@ -0,0 +1,12 @@
from .admin import TrackSerializer, UserSettingsSerializer
from .log import LogInfo, LogInfoSerializer
from .sound import PodcastSerializer, SoundSerializer

__all__ = (
    "TrackSerializer",
    "UserSettingsSerializer",
    "LogInfo",
    "LogInfoSerializer",
    "SoundSerializer",
    "PodcastSerializer",
)
39 aircox/serializers/admin.py Normal file
@ -0,0 +1,39 @@
from rest_framework import serializers
from taggit.serializers import TaggitSerializer, TagListSerializerField

from ..models import Track, UserSettings

__all__ = ("TrackSerializer", "UserSettingsSerializer")


class TrackSerializer(TaggitSerializer, serializers.ModelSerializer):
    tags = TagListSerializerField()

    class Meta:
        model = Track
        fields = (
            "pk",
            "artist",
            "title",
            "album",
            "year",
            "position",
            "info",
            "tags",
            "episode",
            "sound",
            "timestamp",
        )


class UserSettingsSerializer(serializers.ModelSerializer):
    # TODO: validate fields values (playlist_editor_columns at least)
    class Meta:
        model = UserSettings
        fields = ("playlist_editor_columns", "playlist_editor_sep")

    def create(self, validated_data):
        user = self.context.get("user")
        if user:
            validated_data["user_id"] = user.id
        return super().create(validated_data)
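UserSettingsSerializer.create() fills user_id from the serializer context rather than from client-supplied data, so a client cannot write settings for another user. The idea, reduced to plain Python so it runs without DRF (editor's sketch; the class name is invented):

class CreateWithUserFromContext:
    def __init__(self, context=None):
        self.context = context or {}

    def create(self, validated_data):
        user_id = self.context.get("user")
        if user_id is not None:
            validated_data["user_id"] = user_id  # the real serializer stores user.id
        return validated_data

print(CreateWithUserFromContext({"user": 42}).create({"playlist_editor_sep": ","}))
# -> {'playlist_editor_sep': ',', 'user_id': 42}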
@ -1,15 +1,14 @@
|
||||
from rest_framework import serializers
|
||||
|
||||
from .models import Diffusion, Log, Sound
|
||||
from ..models import Diffusion, Log
|
||||
|
||||
|
||||
__all__ = ['LogInfo', 'LogInfoSerializer']
|
||||
__all__ = ("LogInfo", "LogInfoSerializer")
|
||||
|
||||
|
||||
class LogInfo:
|
||||
obj = None
|
||||
start, end = None, None
|
||||
title, artist = '', ''
|
||||
title, artist = "", ""
|
||||
url, cover = None, None
|
||||
info = None
|
||||
|
||||
@ -20,17 +19,17 @@ class LogInfo:
|
||||
elif isinstance(obj, Log):
|
||||
self.from_log(obj)
|
||||
else:
|
||||
raise ValueError('`obj` must be a Diffusion or a Track Log.')
|
||||
raise ValueError("`obj` must be a Diffusion or a Track Log.")
|
||||
|
||||
@property
|
||||
def type(self):
|
||||
return 'track' if isinstance(self.obj, Log) else 'diffusion'
|
||||
return "track" if isinstance(self.obj, Log) else "diffusion"
|
||||
|
||||
def from_diffusion(self, obj):
|
||||
episode = obj.episode
|
||||
self.start, self.end = obj.start, obj.end
|
||||
self.title, self.url = episode.title, episode.get_absolute_url()
|
||||
self.cover = episode.cover and episode.cover.icons['64']
|
||||
self.cover = episode.cover and episode.cover.icons["64"]
|
||||
self.info = episode.category and episode.category.title
|
||||
self.obj = obj
|
||||
|
||||
@ -51,28 +50,3 @@ class LogInfoSerializer(serializers.Serializer):
|
||||
info = serializers.CharField(max_length=200, required=False)
|
||||
url = serializers.URLField(required=False)
|
||||
cover = serializers.URLField(required=False)
|
||||
|
||||
|
||||
class SoundSerializer(serializers.ModelSerializer):
|
||||
# serializers.HyperlinkedIdentityField(view_name='sound', format='html')
|
||||
|
||||
class Meta:
|
||||
model = Sound
|
||||
fields = ['pk', 'name', 'program', 'episode', 'type',
|
||||
'duration', 'mtime', 'is_good_quality', 'is_public', 'url']
|
||||
|
||||
def get_field_names(self, *args):
|
||||
names = super().get_field_names(*args)
|
||||
if 'request' in self.context and self.context['request'].user.is_staff and \
|
||||
self.instance.is_public:
|
||||
names.push('path')
|
||||
return names
|
||||
|
||||
class PodcastSerializer(serializers.ModelSerializer):
|
||||
# serializers.HyperlinkedIdentityField(view_name='sound', format='html')
|
||||
|
||||
class Meta:
|
||||
model = Sound
|
||||
fields = ['pk', 'name', 'program', 'episode', 'type',
|
||||
'duration', 'mtime', 'url']
|
||||
|
||||
43 aircox/serializers/sound.py Normal file
@ -0,0 +1,43 @@
from rest_framework import serializers

from ..models import Sound

__all__ = ("SoundSerializer", "PodcastSerializer")


class SoundSerializer(serializers.ModelSerializer):
    file = serializers.FileField(use_url=False)

    class Meta:
        model = Sound
        fields = [
            "pk",
            "name",
            "program",
            "episode",
            "type",
            "file",
            "duration",
            "mtime",
            "is_good_quality",
            "is_public",
            "url",
        ]


class PodcastSerializer(serializers.ModelSerializer):
    # serializers.HyperlinkedIdentityField(view_name='sound', format='html')

    class Meta:
        model = Sound
        fields = [
            "pk",
            "name",
            "program",
            "episode",
            "type",
            "duration",
            "mtime",
            "url",
            "is_downloadable",
        ]
@ -1,159 +0,0 @@
|
||||
import os
|
||||
|
||||
from django.conf import settings
|
||||
|
||||
# TODO:
|
||||
# - items() iteration
|
||||
# - sub-settings as values
|
||||
# - validate() settings
|
||||
# - Meta inner-class?
|
||||
# - custom settings class instead of default
|
||||
#class BaseSettings:
|
||||
# deprecated = set()
|
||||
#
|
||||
# def __init__(self, user_conf):
|
||||
# if user_conf:
|
||||
# for key, value in user_conf.items():
|
||||
# if not hasattr(self, key):
|
||||
# if key in self.deprecated:
|
||||
# raise ValueError('"{}" config is deprecated'.format(key))
|
||||
# else:
|
||||
# raise ValueError('"{}" is not a config value'.format(key))
|
||||
# setattr(self, key, value)
|
||||
#
|
||||
#
|
||||
#class Settings(BaseSettings):
|
||||
# default_user_groups = {
|
||||
#
|
||||
# }
|
||||
#
|
||||
# programs_dir = os.path.join(settings.MEDIA_ROOT, 'programs'),
|
||||
# """ Programs data directory. """
|
||||
# episode_title = '{program.title} - {date}'
|
||||
# """ Default episodes title. """
|
||||
# episode_title_date_format = '%-d %B %Y'
|
||||
# """ Date format used in episode title. """
|
||||
#
|
||||
# logs_archives_dir = os.path.join(settings.PROJECT_ROOT, 'logs/archives')
|
||||
# """ Directory where logs are saved once archived """
|
||||
# logs_archive_age = 30
|
||||
# """ Default age of log before being archived """
|
||||
#
|
||||
# sounds_default_dir = os.path.join(settings.MEDIA_ROOT, 'programs/defaults')
|
||||
# sound_archive_dir = 'archives'
|
||||
# sound_excerpt_dir = 'excerpts'
|
||||
# sound_quality = {
|
||||
# 'attribute': 'RMS lev dB',
|
||||
# 'range': (-18.0, -8.0),
|
||||
# 'sample_length': 120,
|
||||
# }
|
||||
# sound_ext = ('.ogg', '.flac', '.wav', '.mp3', '.opus')
|
||||
#
|
||||
# # TODO: move into aircox_streamer
|
||||
# streamer_working_dir = '/tmp/aircox'
|
||||
#
|
||||
#
|
||||
#
|
||||
|
||||
def ensure(key, default):
|
||||
globals()[key] = getattr(settings, key, default)
|
||||
|
||||
|
||||
########################################################################
|
||||
# Global & misc
|
||||
########################################################################
|
||||
# group to assign to users at their creation, along with the permissions
|
||||
# to add to each group.
|
||||
ensure('AIRCOX_DEFAULT_USER_GROUPS', {
|
||||
'radio hosts': (
|
||||
# TODO include content_type in order to avoid clash with potential
|
||||
# extra applications
|
||||
|
||||
# aircox
|
||||
'change_program', 'change_episode', 'change_diffusion',
|
||||
'add_comment', 'change_comment', 'delete_comment',
|
||||
'add_article', 'change_article', 'delete_article',
|
||||
'change_sound',
|
||||
'add_track', 'change_track', 'delete_track',
|
||||
|
||||
# taggit
|
||||
'add_tag', 'change_tag', 'delete_tag',
|
||||
|
||||
# filer
|
||||
'add_folder', 'change_folder', 'delete_folder', 'can_use_directory_listing',
|
||||
'add_image', 'change_image', 'delete_image',
|
||||
),
|
||||
})
|
||||
|
||||
# Directory for the programs data
|
||||
# TODO: rename to PROGRAMS_ROOT
|
||||
ensure('AIRCOX_PROGRAMS_DIR',
|
||||
os.path.join(settings.MEDIA_ROOT, 'programs'))
|
||||
|
||||
|
||||
########################################################################
|
||||
# Programs & Episodes
|
||||
########################################################################
|
||||
# default title for episodes
|
||||
ensure('AIRCOX_EPISODE_TITLE', '{program.title} - {date}')
|
||||
# date format in episode title (python's strftime)
|
||||
ensure('AIRCOX_EPISODE_TITLE_DATE_FORMAT', '%-d %B %Y')
|
||||
|
||||
########################################################################
|
||||
# Logs & Archives
|
||||
########################################################################
|
||||
# Directory where to save logs' archives
|
||||
ensure('AIRCOX_LOGS_ARCHIVES_DIR', os.path.join(settings.PROJECT_ROOT, 'logs/archives'))
|
||||
# In days, minimal age of a log before it is archived
|
||||
ensure('AIRCOX_LOGS_ARCHIVES_AGE', 60)
|
||||
|
||||
|
||||
########################################################################
|
||||
# Sounds
|
||||
########################################################################
|
||||
# Sub directory used for the complete episode sounds
|
||||
ensure('AIRCOX_SOUND_ARCHIVES_SUBDIR', 'archives')
|
||||
# Sub directory used for the excerpts of the episode
|
||||
ensure('AIRCOX_SOUND_EXCERPTS_SUBDIR', 'excerpts')
|
||||
|
||||
# Quality attributes passed to sound_quality_check from sounds_monitor
|
||||
ensure('AIRCOX_SOUND_QUALITY', {
|
||||
'attribute': 'RMS lev dB',
|
||||
'range': (-18.0, -8.0),
|
||||
'sample_length': 120,
|
||||
}
|
||||
)
|
||||
|
||||
# Extension of sound files
|
||||
ensure(
|
||||
'AIRCOX_SOUND_FILE_EXT',
|
||||
('.ogg', '.flac', '.wav', '.mp3', '.opus')
|
||||
)
|
||||
|
||||
|
||||
########################################################################
|
||||
# Streamer & Controllers
|
||||
########################################################################
|
||||
# Controllers working directory
|
||||
ensure('AIRCOX_CONTROLLERS_WORKING_DIR', '/tmp/aircox')
|
||||
|
||||
|
||||
########################################################################
|
||||
# Playlist import from CSV
|
||||
########################################################################
|
||||
# Columns for CSV file
|
||||
ensure(
|
||||
'AIRCOX_IMPORT_PLAYLIST_CSV_COLS',
|
||||
('artist', 'title', 'minutes', 'seconds', 'tags', 'info')
|
||||
)
|
||||
# Column delimiter of csv text files
|
||||
ensure('AIRCOX_IMPORT_PLAYLIST_CSV_DELIMITER', ';')
|
||||
# Text delimiter of csv text files
|
||||
ensure('AIRCOX_IMPORT_PLAYLIST_CSV_TEXT_QUOTE', '"')
|
||||
|
||||
|
||||
if settings.MEDIA_ROOT not in AIRCOX_PROGRAMS_DIR:
|
||||
# PROGRAMS_DIR must be in MEDIA_ROOT for easy files url resolution
|
||||
# later should this restriction disappear.
|
||||
raise ValueError("settings: AIRCOX_PROGRAMS_DIR must be in MEDIA_ROOT")
|
||||
|
||||
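The removed module built every AIRCOX_* value through the small ensure() helper: take the project's setting when defined, fall back to the default otherwise, and publish the result as a module-level name. The same pattern, runnable on its own (editor's sketch; the stand-in settings object and the override value are invented):

class _ProjectSettings:
    # Hypothetical project override; any name not defined here keeps its default.
    AIRCOX_EPISODE_TITLE = "{program.title} / {date}"

settings = _ProjectSettings()

def ensure(key, default):
    # Publish the project's value when it defines one, the default otherwise.
    globals()[key] = getattr(settings, key, default)

ensure("AIRCOX_EPISODE_TITLE", "{program.title} - {date}")
ensure("AIRCOX_EPISODE_TITLE_DATE_FORMAT", "%-d %B %Y")
print(AIRCOX_EPISODE_TITLE, AIRCOX_EPISODE_TITLE_DATE_FORMAT)
# -> "{program.title} / {date}" (overridden) and "%-d %B %Y" (default)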
File diff suppressed because it is too large.
12 aircox/static/aircox/admin.html Normal file
@ -0,0 +1,12 @@
<!DOCTYPE html>
<html lang="">
  <head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width,initial-scale=1.0">
    <title>Vue App</title>
    <script defer src="js/chunk-vendors.js"></script><script defer src="js/chunk-common.js"></script><script defer src="js/admin.js"></script><link href="css/chunk-vendors.css" rel="stylesheet"><link href="css/chunk-common.css" rel="stylesheet"><link href="css/admin.css" rel="stylesheet"></head>
  <body>
    <div id="app"></div>
  </body>
</html>
Diffs suppressed for the following generated static files (too large or lines too long):
2883 aircox/static/aircox/css/admin.css Normal file
7058 aircox/static/aircox/css/chunk-common.css Normal file
1261 aircox/static/aircox/css/chunk-vendors.css Normal file
7766 aircox/static/aircox/css/public.css Normal file
Binary file not shown.
Some files were not shown because too many files have changed in this diff.