Compare commits

...

174 Commits

Author SHA1 Message Date
Giacomo Leidi 584186e831
Update system dependencies. 2024-03-20 00:41:57 +01:00
Giacomo Leidi 9763ca5fbf
Update system dependencies. 2024-03-14 12:39:16 +01:00
Giacomo Leidi 904aa06629
Fix docker image. 2024-03-02 00:12:05 +01:00
Giacomo Leidi be715d201c
Update dependencies. 2024-03-01 23:38:11 +01:00
Giacomo Leidi 647925acd3
Update dependencies 2024-02-28 23:45:20 +01:00
Giacomo Leidi e5100d499e
Update channels-lock.scm 2024-02-18 14:00:38 +01:00
Giacomo Leidi 9ac1d55d02
Update channels-lock.scm 2024-02-01 20:28:14 +01:00
Giacomo Leidi bf1c18f347
update guix config 2024-01-28 21:50:00 +01:00
Giacomo Leidi e381c1b522
Migrate to importlib and update some dependencies (#189)
* Migrate to importlib.

* Update CI
2024-01-28 21:08:17 +01:00
Giacomo Leidi 77a881980b
Update README.md. 2024-01-07 22:44:11 +01:00
Giacomo Leidi 5710d46874
Update README.md. 2024-01-07 22:42:51 +01:00
Giacomo Leidi 9794b00cc0
add badges 2024-01-07 22:37:58 +01:00
Giacomo Leidi 5ebaa04f3d
Revert "Update .envrc"
This reverts commit 1e43a4e12d.
2023-10-30 20:58:53 +01:00
Giacomo Leidi 0f19cf4a9e
Revert "temporary lock"
This reverts commit 8d3026523a.
2023-10-30 20:57:07 +01:00
Giacomo Leidi 8d3026523a
temporary lock 2023-10-10 19:55:39 +02:00
Giacomo Leidi 1e43a4e12d
Update .envrc 2023-10-10 18:04:05 +02:00
Giacomo Leidi 45e1f551d8
Update release.yml 2023-07-17 17:48:32 +02:00
Giacomo Leidi 056e0217aa
Release v0.3.6. 2023-07-16 15:33:26 +02:00
Giacomo Leidi acce3a83fe
Enable publishing events by UUID. (#187) 2023-07-16 15:09:08 +02:00
Giacomo Leidi 775fb89cf6
telegram: Add support for topics. (#184) 2023-07-16 13:27:00 +02:00
Giacomo Leidi bf3170cb6f
Decouple DB instantiation from logger instantion. (#188) 2023-07-11 22:24:56 +02:00
Giacomo Leidi ff7567dc1b
Update docker image. 2023-07-11 19:20:08 +02:00
Giacomo Leidi f16cffa44e
Update Guix channel references. 2023-07-11 17:23:21 +02:00
Giacomo Leidi 7bcb374891
scripts/scheduler.py: Pass dry_run configuration. 2023-07-11 15:50:12 +02:00
Giacomo Leidi b17dc556d7
Fix scheduler. 2023-07-11 15:23:40 +02:00
Giacomo Leidi 201e259d37
Update CI 2023-07-11 02:19:01 +02:00
Giacomo Leidi 9744f436ae
Update build_docker_image.sh. 2023-07-11 01:50:19 +02:00
Giacomo Leidi 34ebd8f982
Update release.yml 2023-06-11 00:07:35 +02:00
Giacomo Leidi aaff82fe98
Update .envrc. 2023-06-10 23:34:12 +02:00
Giacomo Leidi 1c7e3c7ed5
Update docker-compose.yml 2023-06-10 23:31:04 +02:00
Giacomo Leidi c40a7aca35
Release v0.3.5. 2023-06-10 23:27:34 +02:00
Giacomo Leidi 4757cc6ec8
Hotfix release script 2023-06-10 23:24:59 +02:00
Giacomo Leidi 6bd2d606df
Store notifications. (#180) 2023-05-22 13:00:37 +02:00
Giacomo Leidi 3874acf247
Failures (#183)
* Update beautifulsoup4 to 4.11.

* Update docker image.

* Add type annotations.

* Publish where you can and notify errors.

* Add tests for partial publishing.
2023-05-22 12:53:32 +02:00
Giacomo Leidi 6b72b2630f
Update tweepy to 4.13. 2023-04-18 01:16:16 +02:00
Giacomo Leidi 28446c3401
Update .envrc. 2023-01-03 18:55:50 +01:00
Giacomo Leidi 1d3e5047e8
Update development-environment-with-guix.md 2023-01-03 18:54:19 +01:00
Giacomo Leidi 1c9a95db84
Update development-environment-with-guix.md 2023-01-03 16:46:35 +01:00
Giacomo Leidi 1678e836c9
Update development-environment-with-guix.md 2023-01-03 16:42:19 +01:00
Giacomo Leidi a445eedaea
Switch to TWC DockerHub account (#182) 2022-12-20 22:36:38 +01:00
Simone Robutti 370e00d187
decouple dataclasses from models (#181)
* fixed parsing bug

* implemented events and publications endpoints

split endpoints by entity

removed credentials

* add pagination (#179)

* added pagination

* integrated pagination with tortoise

* added test for publications

* removed converter file

* moved publications to dataclasses module

* implemented import pattern on dataclasses to prevent circular imports

* removed redundant fetch

* removed unused query

* split build_publications

* split failed_publications

* removed redundant query functions

* split publication retrieve

* split all read functions

* removed redundant write function

* fixed lock
2022-12-11 14:15:04 +01:00
Simone Robutti ddc706e201
Add publication crud (#178)
* fixed parsing bug

* implemented events and publications endpoints

split endpoints by entity

removed credentials

* add pagination (#179)

* added pagination

* integrated pagination with tortoise

* added test for publications

* removed converter file

* updated dependencies
2022-12-07 21:46:57 +01:00
Giacomo Leidi 081ca87857
Update dependencies. (#177) 2022-11-13 14:31:03 +01:00
Giacomo Leidi cf9ffd2149 query: Add get_event_publications. 2022-10-26 00:43:25 +02:00
Giacomo Leidi 04f29e37e4
Drop Guix CI for master and PRs. (#176) 2022-10-20 19:16:57 +02:00
Simone Robutti 44340fde8f
Introduced FastAPI and Postgres support (#173)
* added poc

* added check for sqlite db

* added events test

* draft docker-compose-test.yml

* improved docker-compose

* added support for postgres migrations

* add documentation

* added some qol to migrations

* added migration generation script

* removed settings.toml

* waiting for postgress in script

* commented script

* added sample web config

* fixed tests

* mock memory db

* reviewed PR
2022-10-14 22:11:27 +02:00
Simone Robutti 63a30bb483
refactor coordinators (#171)
* introduced CommandConfig object

* added dry_run for start command

* added test to start_dry_run

* added dry_run recap

* fixed import

* improved printing

* fixed zulip debug info

* improved recap dry-run print

* moved coordinator classes into dedicated package

* removed unused init

* divided event coordinators from recap coordinators

* added some docstrings
2022-09-24 16:46:03 +02:00
Giacomo Leidi 2d328a30bf
Update development-environment-with-guix.md 2022-08-16 01:04:51 +02:00
Giacomo Leidi 4ce6c7b171
Update dependencies. (#170)
* Update Guix commit.

* Update dependencies

* Drop CONNECTION_NAME hack.
2022-07-26 22:23:47 +02:00
Simone Robutti b6e203577e
add dry run (#172)
* introduced CommandConfig object

* added dry_run for start command

* added test to start_dry_run

* added dry_run recap

* fixed import

* improved printing

* fixed zulip debug info

* improved recap dry-run print
2022-07-25 19:26:47 +02:00
Simone Robutti 9d71ef36b9
add dry run (#169)
* introduced CommandConfig object

* added dry_run for start command

* added test to start_dry_run

* added dry_run recap

* fixed import

* improved printing

* fixed zulip debug info

* improved recap dry-run print
2022-07-12 07:39:56 +02:00
Giacomo Leidi cf2fabefb4 Release v0.3.2. 2022-06-05 19:12:08 +02:00
Giacomo Leidi 9810c9d5a5
Move Guix packages to a channel. (#168)
* Move Guix packages to a channel.

This patch moves the core Mobilizon Reshare package definition to
https://github.com/fishinthecalculator/mobilizon-reshare-guix .
2022-05-26 22:46:59 +02:00
Giacomo Leidi b66c94c8a2
Add publish command. (#167)
* Add publish command.

* publish: Add tests.

* Add list-platforms and test-configuration.

* Update Guix dependencies.

* Move publishing of events and publications to retry.
2022-05-17 23:14:52 +02:00
Giacomo Leidi 7f3ce9a55a Release v0.3.1. 2022-04-19 21:53:02 +02:00
Giacomo Leidi a9b90be963 Hotfix release script. 2022-04-19 21:52:45 +02:00
Simone Robutti 94c85d8b48
introduce locale (#165)
* introduced locale option

* added tests
2022-04-11 18:13:35 +02:00
Giacomo Leidi 4e40a1979e
Add completion scripts. (#166) 2022-04-11 08:36:10 +02:00
Giacomo Leidi 5f51d68c82 Release v0.3.0. 2022-04-04 11:35:24 +02:00
Giacomo Leidi 05a8c9d5b0
Update docker-compose.yml 2022-03-30 21:21:31 +02:00
Giacomo Leidi 981fcf0486
Fix recap (#161)
* Drop dependency on python-telegram-bot, as it's not used.

* Align default templates.

This patch tries to bring some consistency to the default templates and
fixes some small bug in the output of some formatters.
2022-03-26 22:46:33 +01:00
Giacomo Leidi 002399161d
Pull publish (#157)
* Add pull command.

* Add publish command.

* Enforce start's old semantics.

* Add pull test cases:

- Pull with no events anywhere
- Pull with one event on mobilizon and none on the db
- Pull with one event on mobilizon and another on the db
- Pull with one event on mobilizon and the same event on the db

* Add pull tests:

- Sequence: pull->start
- Sequence: start->pull
- Sequence: pull->new event on mobilizon -> pull

* manifest.scm: Add cloc.

* query: read: Prefer return list to generators.

* tests: pull: Actually test that models are written to the DB.
2022-03-22 21:16:34 +01:00
Simone Robutti 3e7b9097a4
Update twitter.tmpl.j2 (#160)
* Update twitter.tmpl.j2

* Update twitter_recap.tmpl.j2
2022-03-15 21:52:14 +01:00
Giacomo Leidi af45f2b5ea
Run CI for pull requests. (#159) 2022-03-13 21:51:25 +01:00
Giacomo Leidi f8a614750c
Upstream python-tweepy. (#155)
* Upstream python-tweepy.

* Update requests to 2.26.

* Remove python-requests-2.25 as it's not needed.

* Add guide to update dependencies.

* Fix link.
2022-03-12 11:29:56 +01:00
Giacomo Leidi e558034194 Release v0.2.3. 2022-03-07 10:20:13 +01:00
Simone Robutti 1217b17326
fix event retry (#152)
* added twitter error handling

* added facebook tests

* added header format test

* added multiple newlines check

* added test list command

* fixed commands structure

* fixed event retry

* fixed publication retry

* added publication tests

* removed unused option

* fixed list begin/end window

* added test retry failures

* linting

* refactored sender

* added timezone freeze

* fixed facebook-sdk and beatifulsoup errors
2022-03-06 10:41:02 +01:00
Simone Robutti 8b81ceedd0
Merge pull request #154 from Tech-Workers-Coalition-Italia/set_timezone
set timezone
2022-03-04 10:57:01 +01:00
Simone Robutti dae9dfd889 set timezone to CUT 2022-03-04 10:56:20 +01:00
Simone Robutti cb0fe7b5fc
set timezone to Rome (#153) 2022-03-04 10:45:11 +01:00
Simone Robutti f04942eefe
test commands (#151)
* added twitter error handling

* added facebook tests

* added header format test

* added multiple newlines check

* added test list command

* fixed commands structure
2022-03-02 08:59:57 +01:00
Simone Robutti 420f823dd4
platform tests (#150)
* added twitter error handling

* added facebook tests

* added header format test

* added multiple newlines check
2022-03-02 08:59:49 +01:00
Giacomo Leidi 9c77afa456 Restore tests in Guix package. 2022-03-01 23:42:32 +01:00
Giacomo Leidi db659e9cdc Release v0.2.2. 2022-02-24 00:06:54 +01:00
Giacomo Leidi ca878454b4 mobilizon-reshare.scm: Hotfix applied to the image
This patch allows mobilizon-reshare to be installed in a Guix profile.
2022-02-24 00:04:20 +01:00
Giacomo Leidi 5afbcd2192 Release v0.2.1. 2022-02-23 23:10:38 +01:00
magowiz 8ff6555c60
Feature/autogenerate api doc (#147)
* initial commit with generate-doc.sh script and generated documentation, added pdoc3 dependency

* fixed import errors due to missing SECRETS_FOR_DYNACONF environment value, updated doc, force/overwrite if documentation exists

* moved pdoc3 dependency from dependencies to dev-dependencies

* removed generated doc

* add api-documentation (pdoc3 target directory) to .gitignore

* switch to sphinx which supports better typehints notation

* remove module path prefix from methods name

* add support for typehints

* remove autogenerated files

* ignore autogenerated files

* fixed typo in .gitignore

* added empty _static folder to avoid warning on build

* initial commit with generate-doc.sh script and generated documentation, added pdoc3 dependency

* fixed import errors due to missing SECRETS_FOR_DYNACONF environment value, updated doc, force/overwrite if documentation exists

* moved pdoc3 dependency from dependencies to dev-dependencies

* removed generated doc

* switch to sphinx which supports better typehints notation

* remove autogenerated files

* remove gui browser documentation open, use /bin/sh instead of /bin/bash

* remove windows/dos documentation generation make, drop windows/dos support

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2022-02-23 17:29:37 +01:00
Giacomo Leidi 529f83825e
Pave the way for using only EventPublications outside of storage module. (#149)
* Move `EventPublication.from_orm` to `storage.query.converter`.

Co-authored-by: Simone Robutti <simone.robutti@protonmail.com>
2022-02-23 17:26:13 +01:00
Simone Robutti bda4b8ee0d
fix timezone in test (#145)
* added timezone fixture

* Make output dependent on local timezone.

* Fix non deterministic behavior in html to text conversion.

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2022-02-22 11:28:16 +01:00
Giacomo Leidi e003cf9a90
README.md: Add Pypi link. (#148) 2022-02-21 15:26:03 +01:00
Giacomo Leidi d19f3ac5ca
scheduler.py: Fix intervals. (#146)
APScheduler uses 0-indexed week days.
2022-02-20 12:45:50 +01:00
Simone Robutti b0e88a9e1f
fixed missing whitespace (#143) 2022-02-16 15:54:10 +01:00
Simone Robutti 11de1e1213
fixed facebook format (#142) 2022-02-16 13:51:00 +01:00
Giacomo Leidi 9c44c8d730
Enforce the same dependencies in dev and prod. (#141)
This patch restores Guix' full sanity check, which runs the CLI
entrypoint to verify all required dependencies are present.
2022-02-16 12:23:31 +01:00
Giacomo Leidi e96ee55b48 Release v0.2.0. 2022-02-14 21:12:31 +01:00
Simone Robutti f0a7449336
rework maintenance cli (#140)
* added default

* moved object before verb

* added event retry

* added publication_retry

* renamed inspect to list

* fixed retry publication

* fixed retry event and structure
2022-02-14 21:10:27 +01:00
Simone Robutti ade3204c54
added default (#138) 2022-02-14 21:10:14 +01:00
magowiz 45802ecbdd
Feature/optimizing download (#118)
* add column last_update_time

* save last_update_time in event db record

* use id + updatedAt for comparison instead of mobilizon_id, this will treat updated events like new ones

* rework event selection/comparison to include unpublished with updates to ones need to be saved

* added update for unpublished events

* tests: test_update: Add create_unpublished_events tests.

* Move `MobilizonEvent.to_model` to `storage.query`

* Move `MobilizonEvent.from_model` to `storage.query`

* Move `MobilizonEvent.compute_status` to `storage.query`

* Move `publishers.exception.EventNotFound` to `storage.query.exceptions`
2022-02-09 00:54:56 +01:00
Giacomo Leidi eeb9b04e3e
Update README.md 2022-02-06 22:29:39 +01:00
Giacomo Leidi ad8647704d
Add event name to notifications. (#136) 2022-02-05 18:46:48 +01:00
Giacomo Leidi 3ef8b1f97b
Python scheduler docker image (#134)
* Move scheduler.py into scripts/ .

* docker: Python scheduler image.

* Update channels-lock.scm.

* Use tortoise 0.18.1 and aerich 0.6.2.

* Use fishinthecalculator/publish-docker-image-action@v0.1.10 .

* Make intervals configurable.

* Add platforms to README.md .
2022-02-05 18:46:02 +01:00
Simone Robutti 8c16812ed0
added scheduling (#133) 2022-01-27 20:54:14 +01:00
Giacomo Leidi 1f1ff2e5c2
Debug image (#131)
* config: get_settings_files_paths: Return first existing path.

* tests: Set SETTINGS_FOR_DYNACONF at import time.

* config: get_settings_files_paths: Log config path.

* manifest.scm: Add docker-compose.

* build_docker_image.sh: Allow building debug image.

* storage: db: Use the same configuration for Tortoise and Aerich.

* Distribuite migration queries.

* storage: implement_db_changes: Use pkg_resources.
2022-01-26 10:11:16 +01:00
Giacomo Leidi e109106af9
Use test script everywhere in the pipeline. (#130) 2022-01-24 22:34:52 +01:00
Simone Robutti 7aa1587b2b moved run tests to script 2022-01-20 11:58:44 +01:00
Simone Robutti 994102c937
added timezone_sensitive marker (#129) 2022-01-20 11:30:11 +01:00
Simone Robutti 5804b55dd8
Fix markdown cropping (#128)
* fixed error log

* changed telegram publisher to use HTML format

* fixed dataclass update issue

* docker: Add python-telegram-bot.

* docker: mobilizon-reshare: Reenable non deterministic tests.

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2022-01-18 01:39:00 +01:00
Giacomo Leidi 434438fd20
Add Docker Hub link to README.md. (#127) 2022-01-12 01:59:58 +01:00
Giacomo Leidi 3d704cc6b2
docker: Upstream python-markdownify. (#126) 2022-01-09 04:12:34 +01:00
Simone Robutti a0a1d43fa0
rework config (#121)
* simplified config behavior

* temp

* removed redundant tests

* removed publication window

* removed settings_file cli option

* add pre_test code, in order to set environment variables

* Revert "add pre_test code, in order to set environment variables"

This reverts commit 0d25f9313a.

Co-authored-by: magowiz <magowiz@gmail.com>
Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2022-01-08 00:54:27 +01:00
Giacomo Leidi 352c49ca94
Rewrite README.md. (#125)
* Rewrite README.md.

This patch moves most information about the internals of
mobilizon-reshare to the manual to keep the README light and
easy to follow.
2022-01-07 23:49:36 +01:00
Giacomo Leidi 8569a903f1
Run CI tests for every push to master (#123)
* workflows: Separate release workflow from main.
2022-01-05 23:52:25 +01:00
Simone Robutti 5bce70b4a1
retry command (#119)
* added failed publication retrieval

* completed retry command

* added more error handling

* added atomic to query
2022-01-04 21:35:43 +01:00
Giacomo Leidi 43d653f37c
Update guix package definition after core-updates merge. (#122) 2022-01-03 20:14:31 +01:00
Simone Robutti c63ff905e5
Merge pull request #117 from Tech-Workers-Coalition-Italia/fix/aerich
add intermediate variable since aerich doesn't play nice with functions as configuration entrypoint
2021-12-18 13:00:59 +01:00
magowiz 86e698271c add intermediate variable since aerich doesn't play nice with functions as configuration entrypoint 2021-12-18 12:54:09 +01:00
Simone Robutti 31da51b69b
Merge pull request #110 from magowiz/feature/aerich
Feature/aerich
2021-12-14 22:54:50 +01:00
magowiz 6d84c26614 Revert "ignore pylintrc"
This reverts commit f8ace3fead.
2021-12-14 22:47:51 +01:00
magowiz e836e51ee4 fixed exit on migration exception 2021-12-14 22:37:49 +01:00
Simone Robutti 58580e6f27 fixed cli 2021-12-14 20:30:08 +01:00
Simone Robutti 94904a8917
Merge pull request #109 from Tech-Workers-Coalition-Italia/inspect
cli: inspect: Refactor command.
2021-12-11 11:17:59 +01:00
Giacomo Leidi f273873f67
docker: mobilizon-reshare: Upstream dynaconf. (#115) 2021-12-08 21:11:41 +01:00
Giacomo Leidi a5737d91a8 cli: Move settings option to subcommands. 2021-12-08 13:32:45 +01:00
Simone Robutti 2c0b0fb46d
Merge pull request #113 from Tech-Workers-Coalition-Italia/telegram_log_error
telegram log error
2021-12-08 12:29:04 +01:00
Simone Robutti 572821d24f
Merge pull request #111 from Tech-Workers-Coalition-Italia/test_new_event
added new test
2021-12-08 12:28:53 +01:00
magowiz c27a18cc17 completed chain, handling exception 2021-12-07 16:58:12 +01:00
magowiz f8ace3fead ignore pylintrc 2021-12-07 16:54:17 +01:00
magowiz 42ffc25221 ignore test_run.sh 2021-12-07 12:10:44 +01:00
magowiz e6f25178a5 handle aerich migrations not found 2021-12-07 12:10:32 +01:00
magowiz 681baed1d6 change first script name 2021-12-07 11:34:21 +01:00
magowiz fa431c3775 remove test file 2021-12-05 18:43:28 +01:00
magowiz 87e0d8d77b Merge branch 'feature/prova' into feature/aerich 2021-12-05 18:40:12 +01:00
magowiz 14bd3afa5b move aerich conf to storage.db 2021-12-05 18:37:35 +01:00
Simone Robutti 5d9d556803 improved escape characters for telegram 2021-12-05 17:29:30 +01:00
magowiz 9080f26e1c prova.txt 2021-12-05 17:28:47 +01:00
Simone Robutti b26270eb9a added log 2021-12-05 17:21:43 +01:00
magowiz 6cb1f3eedd changed name of __implement_db_changes__ to _implement_db_changes and using logging facility for printing starting update database 2021-12-05 17:11:24 +01:00
Simone Robutti 7c39d94b48 added new test 2021-12-05 16:50:37 +01:00
magowiz 6538abcb1b re-init 2021-12-04 20:26:16 +01:00
magowiz f77e11fd9e add aenrich with migration initialized and auto-upgrade on start 2021-12-04 19:34:00 +01:00
magowiz b04b0ba923 add initial configuration, replace get_configuration with toml.load to avoid validation error on aerich init 2021-12-04 18:24:09 +01:00
magowiz 653d98e076 add aerich configuration 2021-12-04 17:56:31 +01:00
magowiz 1f27f2f78c add aerich upgrade before init 2021-12-04 17:55:43 +01:00
magowiz 1195aef57e installed aerich 2021-12-04 14:40:34 +01:00
Giacomo Leidi c07c0ad30d cli: cli: Add some consistency to help messages. 2021-12-04 01:20:34 +01:00
Simone Robutti a6df8eb494
added badge 2021-12-03 17:57:06 +01:00
Giacomo Leidi 833a390839 cli: inspect: Make two subcommands for events and publications. 2021-12-03 00:06:51 +01:00
Giacomo Leidi 65ed092204 cli: cli: Add option shortcuts. 2021-12-03 00:04:36 +01:00
Giacomo Leidi 6f8f96d5b6 cli: inspect: Refactor command.
Now the inspect command can display informations about
different kind of objects including events and publications.

This patch also changes the cli to output columnar values
suitable for further processing with standard Unix
tool, such as awk.

$ mobilizon-reshare.sh inspect publication -s completed | awk '{ print  }' | sort | uniq -c
[2021-12-01 01:05:55,321] [20] [INFO] Tortoise-ORM shutdown
      2 mastodon
      2 telegram
      2 zulip
2021-12-01 01:13:59 +01:00
Giacomo Leidi 2d8855f6fe cli: inspect_event: Separate paged fields with \t. 2021-11-30 23:46:07 +01:00
Giacomo Leidi 2966e90d9d cli: cli: Add --version. 2021-11-30 23:45:01 +01:00
Giacomo Leidi 64c9d168c3 storage: publications_with_status: Return a list[Publication]. 2021-11-30 00:05:11 +01:00
Giacomo Leidi a079241801 Add pytest-cov dependency. 2021-11-28 17:27:12 +01:00
Giacomo Leidi fee794be48 test: puslishers: Adapt Zulip tests to HTTPResponseError. 2021-11-28 17:25:43 +01:00
Giacomo Leidi d6714f2142
publishers: zulip: Catch general HTTPErrors. (#107)
* publishers: zulip: Catch general HTTPErrors.
2021-11-28 16:50:22 +01:00
Giacomo Leidi 227ce22d57
scripts: Add mobilizon-reshare.sh. (#106)
* scripts: Add mobilizon-reshare.sh.

This script lets you run mobilizon-reshare commands from your git
checkout. This is very useful for debugging new features or in general
just to have a quick feedback loop from your modifications.

Example usage:

$ scripts/mobilizon-reshare.sh inspect all
2021-11-28 16:42:42 +01:00
Simone Robutti 49b79de38f
Merge pull request #108 from magowiz/feature/test_coverage
Feature/test coverage
2021-11-28 10:11:49 +01:00
magowiz 18e97c863d
change report format from html to term-missing 2021-11-27 23:35:30 +01:00
Giacomo Leidi f8bbd1df41
Fix publication exception handling for notification. (#105) 2021-11-27 23:31:44 +01:00
magowiz d60c5eabfb add pytest coverage configuration 2021-11-27 18:47:43 +01:00
Giacomo Leidi 1efa191771
Query refactoring (#102)
* Rename query modules.

* storage: save_publication_report: Create publications.

* Remove placeholder PublicationStatus.UNSAVED

* Minor fixes.
2021-11-24 23:58:06 +01:00
Giacomo Leidi ed8f752fe6
Make logs a little bit more informative. (#100) 2021-11-20 15:53:38 +01:00
Simone Robutti 2476686c33
added facebook publisher (#99)
* added facebook publisher

* mobilizon-reshare.git: [propagated-inputs]: Add python-sdk-facebook.

This package definition has been generated with `guix import pypi -r
facebook-sdk`.

* mobilizon-reshare.git: [propagated-inputs]: Use python-facebook-sdk.git.

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2021-11-20 15:40:10 +01:00
Giacomo Leidi a91e72c3ef mobilizon-reshare.git: Disable test_format_event.
These tests somehow depend on the system timezone. While we try and make
them reproducible, they'll have to be disabled in prod since they
prevent the image from building.
2021-11-12 02:40:50 +01:00
Giacomo Leidi 4fd4a778fc mobilizon-reshare.git: [native-inputs]: Add pytest-lazy-fixture. 2021-11-12 01:05:39 +01:00
Giacomo Leidi f12992c5d9
publishers: zulip: Let the user specify their instance. (#95) 2021-11-11 16:25:09 +01:00
Simone Robutti 5335ed8cc3
command tests (#97)
* filtering publications with inactive publishers

* filtering publications with inactive publishers

* WIP: Generate publications at runtime.

TODO:
- change `MobilizonEvent.compute_status`'s contract and break everything
- while we're at it we should remove `PublicationStatus.WAITING`
- test `storage.query.create_publications_for_publishers`

* cli: inspect_events: Unnest if-then-else.

* publishers: abstract: Remove `EventPublication.make`.

* fixed tests

* split query.py file

* added tests for get_unpublished_events

* added tests

* more tests

* added start test

* refactored start test

* added test start with db event

* added test recap

* added failed publication test

* added format test

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2021-11-11 16:20:50 +01:00
Simone Robutti 4dc1e4080a
fix publisher deactivation (#93)
* filtering publications with inactive publishers

* filtering publications with inactive publishers

* WIP: Generate publications at runtime.

TODO:
- change `MobilizonEvent.compute_status`'s contract and break everything
- while we're at it we should remove `PublicationStatus.WAITING`
- test `storage.query.create_publications_for_publishers`

* cli: inspect_events: Unnest if-then-else.

* publishers: abstract: Remove `EventPublication.make`.

* fixed tests

* split query.py file

* added tests for get_unpublished_events

* added tests

* more tests

* added start test

* main: start: Remove filter_publications_with_inactive_publishers.

Co-authored-by: Giacomo Leidi <goodoldpaul@autistici.org>
2021-11-11 15:18:04 +01:00
Giacomo Leidi 4f24d47f19
docker: image: Run cron jobs with root privileges. (#98) 2021-11-11 01:21:24 +01:00
Giacomo Leidi ba3eef4341
Docker image (#84)
* Add [guix](https://guix.gnu.org/) package.

This enables:

- [direnv](https://direnv.net/) integration to setup and tear down
  a suitable development environment;
- if you're not a direnv user you can always
  `guix environment -l guix.scm` to spawn a shell
  with all the necessary dependencies;
- Export of Mobilizon Reshare and its dependencies
  to one of the formats supported by `guix pack`. Right now they are:

    + tarball       Self-contained tarball, ready to run on another machine
    + squashfs      Squashfs image suitable for Singularity
    + docker        Docker image ready for 'docker load'
    + deb           Debian archive installable via dpkg/apt

* Add docker image and docker-compose.yml.

* Add Github CI workflow.
2021-10-31 01:55:25 +02:00
Giacomo Leidi 84e54a503e
Prevent dynaconf from merging different files. (#89) 2021-10-28 21:28:12 +02:00
Simone Robutti bc61ad6123
cli help messages (#85)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs

* implemented publisher and formatter

* fixed API for recap

* removed redundant config validation

* added config sample

* added mobilizon link to templates

* added link format to telegram

* added mobilizon link to recap

* fixed config and emoji

* refactored commands

* added help messages

* improved format
2021-10-25 13:43:38 +02:00
Simone Robutti 41e82f5035
better error handling (#92)
* safe logging of notifier failure to send

* added test
2021-10-24 21:43:09 +02:00
Simone Robutti e16dd19a7c
simplified validation (#91) 2021-10-24 21:32:28 +02:00
Giacomo Leidi 6430de4a84
Add Mastodon publisher. (#83)
* Add Mastodon publisher.

This commit enables publishing on Mastodon and tries to define the
minimal requirements for adding a new platform to Mobilizon Reshare.

* publishers: exceptions: Add HTTPError.

* platforms: mastodon: Make toot length customizable.
2021-10-20 00:08:58 +02:00
Simone Robutti 8de70ef857
mobilizon link in template (#80)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs

* implemented publisher and formatter

* fixed API for recap

* removed redundant config validation

* added config sample

* added mobilizon link to templates

* added link format to telegram

* added mobilizon link to recap

* fixed config and emoji
2021-10-19 07:35:18 +02:00
Simone Robutti 489d41179e
recap header (#79)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs

* implemented publisher and formatter

* fixed API for recap

* removed redundant config validation

* added config sample

* added active publisher test

* added recap header template
2021-10-17 14:09:24 +02:00
Simone Robutti 71b65342b9
fix telegram active (#78)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs

* implemented publisher and formatter

* fixed API for recap

* removed redundant config validation

* added config sample

* added active publisher test
2021-10-17 14:05:25 +02:00
Simone Robutti c14cdfb67f
twitter (#77)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs

* implemented publisher and formatter

* fixed API for recap

* removed redundant config validation

* added config sample
2021-10-17 14:05:16 +02:00
Simone Robutti 5e171216d2
event recap (#69)
* added basic recap feature (no error handling)

* introduced abstractpublication

* extracted base reports

* added error report to recap

* added test

* added docs
2021-10-16 01:25:45 +02:00
Simone Robutti 5db7fb8597
Metadata (#68)
* decoupled notifiers from event

* stub

* publishers working

* fixed format CLI

* fixed unit tests

* renamed abstractnotifier

* added another excluded character

* restored bundled secrets file

* test telegram escape

* tested telegram event validation

* added telegram response validation

* added pragma

* added zulip response validation test

* added metadata to pyproject.toml

* improved documentation
2021-10-05 15:32:18 +02:00
Simone Robutti b61a2c5c3c
More tests (#67)
* decoupled notifiers from event

* stub

* publishers working

* fixed format CLI

* fixed unit tests

* renamed abstractnotifier

* added another excluded character

* restored bundled secrets file

* test telegram escape

* tested telegram event validation

* added telegram response validation

* added pragma

* added zulip response validation test
2021-10-05 15:32:07 +02:00
Simone Robutti bc212e7801
Check length (#66)
* decoupled notifiers from event

* stub

* publishers working

* fixed format CLI

* fixed unit tests

* renamed abstractnotifier

* added another excluded character

* restored bundled secrets file
2021-10-03 13:19:37 +02:00
Simone Robutti b6b2402767
Refactor publication (#65)
* decoupled notifiers from event

* stub

* publishers working

* fixed format CLI

* fixed unit tests

* renamed abstractnotifier

* added another excluded character
2021-10-02 18:09:03 +02:00
186 changed files with 7773 additions and 2441 deletions

.coveragerc (new file)

@ -0,0 +1,2 @@
[run]
omit = tests/*

.envrc (new file)

@ -0,0 +1,30 @@
if has guix; then
GUIX_PROFILE="${PWD}/.guix-root"
rm -f "$GUIX_PROFILE"
eval "$(guix time-machine -C channels-lock.scm -- shell -r "$GUIX_PROFILE" -D -f guix.scm -m manifest.scm --search-paths)"
# Add development scripts to PATH
PATH_add "$(pwd)/scripts"
venv_dir=".venv"
if [ ! -e "$venv_dir/bin/python" ] ; then
rm -rvf "$venv_dir"
pre-commit uninstall
fi
if [ ! -d "$venv_dir" ] ; then
virtualenv -p `which python3` "$venv_dir"
poetry install
pre-commit install
fi
clear
git-cal --author="$(git config user.name)"
cat << EOF
The 'scripts' directory has been added to your PATH: you can now invoke scripts without typing the relative path.
EOF
fi

.github/workflows/main.yml (new file)

@ -0,0 +1,71 @@
# This is a basic workflow to help you get started with Actions
name: CI
# Controls when the workflow will run
on:
pull_request:
paths-ignore:
- 'guix.scm'
- 'manifest.scm'
- 'channels-lock.scm'
- '.envrc'
- '.gitignore'
- 'pre-commit-*.yaml'
- Dockerfile
- README.*
- LICENSE
- 'sample_settings/**'
- 'etc/**'
push:
# Sequence of patterns matched against refs/tags
branches: ["master"]
paths-ignore:
- 'guix.scm'
- 'manifest.scm'
- 'channels-lock.scm'
- '.envrc'
- '.gitignore'
- 'pre-commit-*.yaml'
- Dockerfile
- README.*
- LICENSE
- 'sample_settings/**'
- 'etc/**'
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
run-tests-dev:
strategy:
fail-fast: false
matrix:
python-version: ["3.10", "3.11"]
poetry-version: ["1.1.12", "1.7.0"]
os: [ubuntu-latest]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Run image
uses: abatilo/actions-poetry@v2
with:
poetry-version: ${{ matrix.poetry-version }}
- name: Setup a local virtual environment
run: |
poetry config virtualenvs.create true --local
poetry config virtualenvs.in-project true --local
- uses: actions/cache@v3
name: Define a cache for the virtual environment based on the dependencies lock file
with:
path: ./.venv
key: venv-${{ hashFiles('poetry.lock') }}
- name: Install dependencies
run: scripts/install_github_actions_dev_dependencies.sh
- name: Run tests in dev env
run: scripts/run_pipeline_tests.sh

.github/workflows/release.yml (new file)

@ -0,0 +1,67 @@
# This is a basic workflow to help you get started with Actions
name: Publish release to Docker Hub
# Controls when the workflow will run
on:
push:
# Sequence of patterns matched against refs/tags
tags:
- v*
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
# Runs a single command using the runners shell
- name: Install GNU Guix
uses: PromyLOPh/guix-install-action@v1.4
# Runs a set of commands using the runners shell
- name: Build image
run: scripts/build_docker_image.sh -r
- name: Upload pack (Docker)
uses: actions/upload-artifact@v2
with:
name: mobilizon-reshare-docker
path: docker-image.tar.gz
publish:
# The type of runner that the job will run on
runs-on: ubuntu-latest
needs: build
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- name: Get release tag
id: vars
run: echo ::set-output name=tag::${GITHUB_REF#refs/*/}
- name: Download image
uses: actions/download-artifact@v2
with:
name: mobilizon-reshare-docker
- name: Publish to Docker Hub
uses: fishinthecalculator/publish-docker-image-action@v0.1.10
env:
IMAGE_TAG: ${{ steps.vars.outputs.tag }}
IMAGE_NAME_TAG: mobilizon-reshare-scheduler-python:latest
with:
name: twcita/mobilizon-reshare
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
image: docker-image.tar.gz

.gitignore

@ -176,3 +176,17 @@ crashlytics-build.properties
fabric.properties
.idea
*/local_testing.toml
.direnv/
var/
docker-image.tar.gz
.guix-root
# test run script
test_run.sh
# directory where sphinx documents resides
api_documentation/source/*
!api_documentation/source/conf.py
!api_documentation/source/index.rst
!api_documentation/source/_static/
./settings.toml

.img/license.svg (new file)

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="118.9" height="20"><linearGradient id="smooth" x2="0" y2="100%"><stop offset="0" stop-color="#bbb" stop-opacity=".1"/><stop offset="1" stop-opacity=".1"/></linearGradient><clipPath id="round"><rect width="118.9" height="20" rx="3" fill="#fff"/></clipPath><g clip-path="url(#round)"><rect width="56.2" height="20" fill="#555"/><rect x="56.2" width="62.7" height="20" fill="#007ec6"/><rect width="118.9" height="20" fill="url(#smooth)"/></g><g fill="#fff" text-anchor="middle" font-family="DejaVu Sans,Verdana,Geneva,sans-serif" font-size="110"><text x="291.0" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="462.0" lengthAdjust="spacing">LICENSE</text><text x="291.0" y="140" transform="scale(0.1)" textLength="462.0" lengthAdjust="spacing">LICENSE</text><text x="865.5000000000001" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="527.0" lengthAdjust="spacing">Coopyleft</text><text x="865.5000000000001" y="140" transform="scale(0.1)" textLength="527.0" lengthAdjust="spacing">Coopyleft</text><a xlink:href="https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/LICENSE"><rect width="56.2" height="20" fill="rgba(0,0,0,0)"/></a><a xlink:href="https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/LICENSE"><rect x="56.2" width="62.7" height="20" fill="rgba(0,0,0,0)"/></a></g></svg>


.img/pypi.svg (new file)

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="71.6" height="20"><linearGradient id="smooth" x2="0" y2="100%"><stop offset="0" stop-color="#bbb" stop-opacity=".1"/><stop offset="1" stop-opacity=".1"/></linearGradient><clipPath id="round"><rect width="71.6" height="20" rx="3" fill="#fff"/></clipPath><g clip-path="url(#round)"><rect width="33.6" height="20" fill="#555"/><rect x="33.6" width="38.0" height="20" fill="#007ec6"/><rect width="71.6" height="20" fill="url(#smooth)"/></g><g fill="#fff" text-anchor="middle" font-family="DejaVu Sans,Verdana,Geneva,sans-serif" font-size="110"><text x="178.0" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="236.0" lengthAdjust="spacing">pypi</text><text x="178.0" y="140" transform="scale(0.1)" textLength="236.0" lengthAdjust="spacing">pypi</text><text x="516.0" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="280.0" lengthAdjust="spacing">0.3.6</text><text x="516.0" y="140" transform="scale(0.1)" textLength="280.0" lengthAdjust="spacing">0.3.6</text><a xlink:href="https://pypi.org/project/mobilizon-reshare/"><rect width="33.6" height="20" fill="rgba(0,0,0,0)"/></a><a xlink:href="https://pypi.org/project/mobilizon-reshare/"><rect x="33.6" width="38.0" height="20" fill="rgba(0,0,0,0)"/></a></g></svg>


.img/python.svg (new file)

@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="131.5" height="20"><linearGradient id="smooth" x2="0" y2="100%"><stop offset="0" stop-color="#bbb" stop-opacity=".1"/><stop offset="1" stop-opacity=".1"/></linearGradient><clipPath id="round"><rect width="131.5" height="20" rx="3" fill="#fff"/></clipPath><g clip-path="url(#round)"><rect width="65.5" height="20" fill="#555"/><rect x="65.5" width="66.0" height="20" fill="#007ec6"/><rect width="131.5" height="20" fill="url(#smooth)"/></g><g fill="#fff" text-anchor="middle" font-family="DejaVu Sans,Verdana,Geneva,sans-serif" font-size="110"><image x="5" y="3" width="14" height="14" xlink:href="data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHZpZXdCb3g9IjAgMCAxMDAgMTAwIj4KICA8ZGVmcz4KICAgIDxsaW5lYXJHcmFkaWVudCBpZD0icHlZZWxsb3ciIGdyYWRpZW50VHJhbnNmb3JtPSJyb3RhdGUoNDUpIj4KICAgICAgPHN0b3Agc3RvcC1jb2xvcj0iI2ZlNSIgb2Zmc2V0PSIwLjYiLz4KICAgICAgPHN0b3Agc3RvcC1jb2xvcj0iI2RhMSIgb2Zmc2V0PSIxIi8+CiAgICA8L2xpbmVhckdyYWRpZW50PgogICAgPGxpbmVhckdyYWRpZW50IGlkPSJweUJsdWUiIGdyYWRpZW50VHJhbnNmb3JtPSJyb3RhdGUoNDUpIj4KICAgICAgPHN0b3Agc3RvcC1jb2xvcj0iIzY5ZiIgb2Zmc2V0PSIwLjQiLz4KICAgICAgPHN0b3Agc3RvcC1jb2xvcj0iIzQ2OCIgb2Zmc2V0PSIxIi8+CiAgICA8L2xpbmVhckdyYWRpZW50PgogIDwvZGVmcz4KCiAgPHBhdGggZD0iTTI3LDE2YzAtNyw5LTEzLDI0LTEzYzE1LDAsMjMsNiwyMywxM2wwLDIyYzAsNy01LDEyLTExLDEybC0yNCwwYy04LDAtMTQsNi0xNCwxNWwwLDEwbC05LDBjLTgsMC0xMy05LTEzLTI0YzAtMTQsNS0yMywxMy0yM2wzNSwwbDAtM2wtMjQsMGwwLTlsMCwweiBNODgsNTB2MSIgZmlsbD0idXJsKCNweUJsdWUpIi8+CiAgPHBhdGggZD0iTTc0LDg3YzAsNy04LDEzLTIzLDEzYy0xNSwwLTI0LTYtMjQtMTNsMC0yMmMwLTcsNi0xMiwxMi0xMmwyNCwwYzgsMCwxNC03LDE0LTE1bDAtMTBsOSwwYzcsMCwxMyw5LDEzLDIzYzAsMTUtNiwyNC0xMywyNGwtMzUsMGwwLDNsMjMsMGwwLDlsMCwweiBNMTQwLDUwdjEiIGZpbGw9InVybCgjcHlZZWxsb3cpIi8+CgogIDxjaXJjbGUgcj0iNCIgY3g9IjY0IiBjeT0iODgiIGZpbGw9IiNGRkYiLz4KICA8Y2lyY2xlIHI9IjQiIGN4PSIzNyIgY3k9IjE1IiBmaWxsPSIjRkZGIi8+Cjwvc3ZnPgo="/><text x="422.5" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="385.0" lengthAdjust="spacing">python</text><text x="422.5" y="140" transform="scale(0.1)" textLength="385.0" lengthAdjust="spacing">python</text><text x="975.0" y="150" fill="#010101" fill-opacity=".3" transform="scale(0.1)" textLength="560.0" lengthAdjust="spacing">3.10, 3.11</text><text x="975.0" y="140" transform="scale(0.1)" textLength="560.0" lengthAdjust="spacing">3.10, 3.11</text><a xlink:href="https://www.python.org/"><rect width="65.5" height="20" fill="rgba(0,0,0,0)"/></a><a xlink:href="https://www.python.org/"><rect x="65.5" width="66.0" height="20" fill="rgba(0,0,0,0)"/></a></g></svg>


.pre-commit-config.yaml

@ -3,7 +3,7 @@ repos:
rev: stable
hooks:
- id: black
language_version: python3.9
language_version: python3.10
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v1.2.3
hooks:

Dockerfile (new file)

@ -0,0 +1,24 @@
FROM python:3.10-alpine3.16
ENV ENV_FOR_DYNACONF=${ENV_FOR_DYNACONF} \
PYTHONFAULTHANDLER=1 \
PYTHONUNBUFFERED=1 \
PYTHONHASHSEED=random \
PIP_NO_CACHE_DIR=off \
PIP_DISABLE_PIP_VERSION_CHECK=on \
PIP_DEFAULT_TIMEOUT=100 \
POETRY_VERSION=1.0.0
# System deps:
RUN pip install "poetry==$POETRY_VERSION"
# Copy only requirements to cache them in docker layer
WORKDIR /app
COPY poetry.lock pyproject.toml /app/
# Project initialization:
RUN poetry config virtualenvs.create false \
&& poetry install $(test "$ENV_FOR_DYNACONF" == production && echo "--no-dev") --no-interaction --no-ansi
# Creating folders, and files for a project:
COPY . /app

README.md

@ -1,64 +1,69 @@
The goal of mobilizon_reshare is to provide a suite to reshare Mobilizon events on a broad selection of platforms. This
[![CI](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/actions/workflows/main.yml/badge.svg?branch=master)](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/actions/workflows/main.yml)
[![Python versions](https://raw.githubusercontent.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/master/.img/python.svg)](https://python.org)
[![PyPI version](https://raw.githubusercontent.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/master/.img/pypi.svg)](https://pypi.org/project/mobilizon-reshare/)
[![License](https://raw.githubusercontent.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/master/.img/license.svg)](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/LICENSE)
The goal of `mobilizon_reshare` is to provide a suite to reshare Mobilizon events on a broad selection of platforms. This
tool enables an organization to automate their social media strategy in regards
to events and their promotion.
# Platforms
`mobilizon-reshare` currently supports the following social platforms:
- Facebook
- Mastodon
- Twitter
- Telegram
- Zulip
# Usage
## Scheduling and temporal logic
The tool is designed to work in combination with a scheduler that executes it at
regular intervals. mobilizon_reshare allows fine-grained control over the logic to decide when
regular intervals. `mobilizon_reshare` allows fine-grained control over the logic to decide when
to publish an event, with the minimization of human effort as its first priority.
## Configuration
## Installation
The configuration is implemented through Dynaconf. It allows a variety of ways to specify configuration keys.
Refer to their [documentation](https://www.dynaconf.com/) to discover how configuration files and environment variables can be specified.
`mobilizon_reshare` is distributed through [Pypi](https://pypi.org/project/mobilizon-reshare/) and [DockerHub](https://hub.docker.com/r/twcita/mobilizon-reshare). Use
We provide a sample configuration in the [settings.toml](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/mobilizon_reshare/settings.toml) file.
```shell
$ pip install mobilizon-reshare
```
### Event selection
to install the tool in your system or virtualenv.
### Publishers
This should install the command `mobilizon-reshare` in your system. Use it to access the CLI and discover the available
commands and their description.
### Notifiers
### Guix package
If you run Guix you can install `mobilizon-reshare` by adding our [Guix channel](https://git.sr.ht/~fishinthecalculator/mobilizon-reshare-guix#configure) to your `.config/guix/channels.scm`.
# Contributing
To run `mobilizon-reshare` from master you can run the following command from the root of the repository:
We welcome contributions from anybody. Currently our process is not structured yet but feel free to open or take issues through Github in case you want to help us.
``` shell
$ guix time-machine -C channels-lock.scm -- install -L . mobilizon-reshare.git
```
## Core Concepts
## Run on your local system
### Publisher
Once you have installed `mobilizon_reshare` you can schedule the refresh from Mobilizon with your system's `cron`:
A Publisher is responsible for formatting and publishing an event on a given platform.
```bash
$ sudo crontab -l
*/15 * * * * mobilizon-reshare start
```
Currently the following publishers are supported:
## Deploying through Docker Compose
* Telegram
* Zulip
To run `mobilizon_reshare` in a production environment you can use the image published to DockerHub. We also provide an example [`docker-compose.yml`](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/docker-compose.yml).
### Notifier
# Contributing
Notifiers are similar to Publishers and share most of the implementation. Their purpose is to
notify the maintainers when something unexpected happens.
### Publication Strategy
A Publication Strategy is responsible for selecting the event to publish. Currently it's possible to publish only one
event per run, under the assumption that the user will implement a social media strategy that doesn't require
concurrent publishing of multiple events on the same platform. Through proper scheduling and configuration is still
possible to achieve such behavior if required.
## Develop
To run pre-commit hooks run `pre-commit install` after cloning the repository.
Make sure to have `pre-commit` installed in your active python environment. To install: `pip install pre-commit`. For more info: https://pre-commit.com/
We welcome contributions from anybody. Currently our process is not structured but feel free to open or take issues through Github in case you want to help us. We have setup some instructions to setup a development environment [here](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/doc/contributing.md).

Makefile for the Sphinx documentation (new file)

@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

api_documentation/source/conf.py (new file)

@ -0,0 +1,54 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
# -- Project information -----------------------------------------------------
project = "mobilizon-reshare"
copyright = "2022, -"
author = "-"
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ["sphinxcontrib.napoleon", "sphinx_autodoc_typehints"]
# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = "sphinx_material"
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]
# disable full module path in methods list
add_module_names = False

api_documentation/source/index.rst (new file)

@ -0,0 +1,20 @@
.. mobilizon-reshare documentation master file, created by
sphinx-quickstart on Mon Feb 21 12:57:17 2022.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to mobilizon-reshare's documentation!
=============================================
.. toctree::
:maxdepth: 2
:caption: Contents:
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

channels-lock.scm (new file)

@ -0,0 +1,18 @@
(define-module (channels-lock)
#:use-module (guix channels))
(list
(channel
(name 'mobilizon-reshare)
(url "https://git.sr.ht/~fishinthecalculator/mobilizon-reshare-guix")
(branch "main"))
(channel
(name 'guix)
(url "https://git.savannah.gnu.org/git/guix.git")
(commit
"b7eb1a8116b2caee7acf26fb963ae998fbdb4253")
(introduction
(make-channel-introduction
"afb9f2752315f131e4ddd44eba02eed403365085"
(openpgp-fingerprint
"BBB0 2DDF 2CEA F6A8 0D1D E643 A2A0 6DF2 A33A 54FA")))))

doc/add-new-publisher.md (new file)

@ -0,0 +1,9 @@
# Add a new publisher
To add a new publishing platform to Mobilizon Reshare you need to follow these steps.
## Add an example configuration in `mobilizon_reshare/.secrets.toml`
## Add suitable validators to `mobilizon_reshare/config/notifiers.py` and `mobilizon_reshare/config/publishers.py`
## Create a new file inside `mobilizon_reshare/publishers/platforms`
## Add suitable mappings inside `mobilizon_reshare/publishers/platform_mapping.py`
## Create suitable message templates inside `mobilizon_reshare/publishers/templates`
## Add some unit tests inside `tests/publishers`

doc/contributing.md (new file)

@ -0,0 +1,49 @@
# Contributing
## Develop
To run pre-commit hooks run `pre-commit install` after cloning the repository.
Make sure to have `pre-commit` installed in your active python environment. To install: `pip install pre-commit`. For more info: https://pre-commit.com/.
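For example, a typical first-time setup looks like this (a sketch; the hooks that actually run are whatever `.pre-commit-config.yaml` defines):
```shell
$ pip install pre-commit         # if it's not already in your environment
$ pre-commit install             # register the git hook in .git/hooks
$ pre-commit run --all-files     # optionally, run every hook once over the whole tree
```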
To install dependencies you can either use [Guix](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/doc/development-environment-with-guix.md) or poetry:
```shell
$ poetry install
Installing dependencies from lock file
Package operations: 44 installs, 0 updates, 0 removals
[...]
Installing the current project: mobilizon-reshare (0.1.0)
$
```
### Testing
To run the test suite, run `scripts/run_pipeline_tests.sh` from the root of the repository.
At the moment integration tests are not automated and are executed manually. Reach out to us if you want
access to the testing environment or if you want to help automate the integration tests.
### How to handle migrations
Changes to the data model need to be handled through migrations. We use aerich to manage the migration files.
Both our CLI and our web service are configured in such a way that migrations are run transparently when the package is
updated. If you want to test that the update doesn't corrupt your data, we suggest trying the update in a test database.
To create a new migration file, use the aerich CLI. It will take care of generating the file. If further code is necessary,
add it to the new migration file.
Since we support two databases (SQLite and Postgres) that have slightly different dialects, and since aerich doesn't
really support this scenario, it is necessary to generate migrations separately and place the migration files in the
respective folders.
Aerich picks up the migrations according to the scheme of the database configured in the settings.
Currently the consistency of the migrations for the different databases is not tested, so please take extra care when
committing a change and request a dedicated review.
The aerich configuration is specified in the pyproject.toml file. Since aerich doesn't support multiple databases, we provide two
configuration files that allow running aerich against either database when invoked from the respective migration folder.
You can find them in mobilizon_reshare/migrations.
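As a rough sketch of that flow (the subfolder names and migration name below are illustrative assumptions; the commands are aerich's standard CLI):
```shell
$ cd mobilizon_reshare/migrations/sqlite   # assumed folder layout; use the postgres folder for Postgres
$ aerich migrate --name add_some_column    # generate a new migration file for this database
$ aerich upgrade                           # apply the pending migrations to the configured database
```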

doc/dependency-hell.md (new file)

@ -0,0 +1,76 @@
# Beating dependency hell with GNU Guix
`mobilizon-reshare`'s distribution process relies quite a bit upon GNU Guix. It's involved in our CI pipeline, and it builds the [OCI compliant](https://opencontainers.org/) container image available on Docker Hub. It provides us with [inspectable](https://hpc.guix.info/blog/2021/10/when-docker-images-become-fixed-point/), [bit-for-bit reproducible](https://reproducible-builds.org/) and [fully bootstrappable](https://bootstrappable.org) images, which in turn allows for strong control over what code is actually bundled within the image and should prevent entire classes of supply-chain attacks, from the [Trusting Trust attack](https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_ReflectionsonTrustingTrust.pdf) up to the many recent [attacks](https://www.sonatype.com/resources/state-of-the-software-supply-chain-2021) on FOSS software registries.
To allow for interoperability with the Python ecosystem, we also ship a `pyproject.toml` that we handle with [Poetry](https://python-poetry.org/). The next paragraph will elaborate on the interactions between Poetry and Guix.
## Update the dependency graph of mobilizon-reshare
> **Beware!** - Dependency updates are better delivered to master as a single commit, to avoid confusing the CI.
### Python dependencies
We **must** keep the Poetry and Guix versions as aligned as possible, to prevent unpredictable behavior. All the following content assumes this invariant.
Everything starts from `pyproject.toml`: usually your IDE warns you about outdated dependencies, so let's assume you want to bump the version of a Python package. First keep in mind that Poetry's [tilde requirements](https://python-poetry.org/docs/dependency-specification/#tilde-requirements) are SemVer compatible but stricter than caret requirements, so they should make matching the Guix version easier. Then it's time to actually edit `pyproject.toml` and bump the version of the package.
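For instance, a tilde requirement in `pyproject.toml` looks like the following sketch (the package and versions are illustrative, borrowed from dependencies mentioned elsewhere in this repository, not necessarily the current pins):
```toml
[tool.poetry.dependencies]
python = "^3.10"
# "~4.13" accepts 4.13.x patch releases but not 4.14, which keeps the
# constraint close to the exact version packaged on the Guix side.
tweepy = "~4.13"
```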
To update Python dependencies and test your changes the steps are:
```shell
$ poetry update
$ guix time-machine -C channels-lock.scm -- build -f guix.scm
$ scripts/build_docker_image.sh
```
If these steps succeed you can safely commit your changes. If Guix fails, you have to examine the output of the command that failed and figure out the problem. 99% of the time it'll be a version mismatch, as Guix's [`python-build-system`](https://guix.gnu.org/en/manual/devel/en/guix.html#index-python_002dbuild_002dsystem) has a `sanity-check` phase that tries to instantiate the entry point generated by Poetry and, among other things, checks runtime dependency versions, erroring out if it finds a mismatch between the version actually available in the runtime environment and the version defined in `pyproject.toml`.
You now have two alternatives:
1. You try to follow the next step about system dependencies. `channels-lock.scm` locks everything: as long as the Guix commit specified in that file does not change, `guix time-machine` will look for the exact same package graph. This means that every time we build the image we get the exact same dependencies we asked for, but these semantics differ slightly from Poetry's lock file, which instead tracks the **latest version** (within the constraints) available on PyPI. A more recent Guix commit may therefore provide a more up-to-date mapping of PyPI.
2. You find the package (or packages) responsible for the mismatch and try to manipulate it to follow Poetry's constraints. This requires some basic Scheme understanding but nothing complex. There are many ways a Guix package can be programmatically manipulated, as it's just a structured Scheme record; you can start by looking into [package variants](https://guix.gnu.org/en/manual/devel/en/guix.html#Defining-Package-Variants) or directly at the [channel code](https://github.com/fishinthecalculator/mobilizon-reshare-guix/tree/main/modules/mobilizon-reshare).
### System dependencies
Python's own dependencies are dependencies too! Guix freezes the whole dependency graph of an artifact with [channels specifications](https://guix.gnu.org/en/manual/devel/en/guix.html#Replicating-Guix), so to update "system" dependencies you need to follow these steps.
First let's update our Guix version to the latest commit:
```shell
$ guix pull
Updating channel 'guix' from Git repository at 'https://git.savannah.gnu.org/git/guix.git'...
Authenticating channel 'guix', commits 9edb3f6 to d41c82b (162 new commits)...
Building from these channels:
guix https://git.savannah.gnu.org/git/guix.git d41c82b
substitute: updating substitutes from 'https://ci.guix.gnu.org'... 100.0%\
[...]
building package cache...
building profile with 3 packages...
$
```
The channels specification defines the Guix commit that should be used to fetch the right dependency graph, so what we want to do is replace the commit in `channels-lock.scm` with the one we just pulled:
```shell
$ guix describe
Generation 31 Mar 12 2022 12:35:00 (current)
guix d41c82b
repository URL: https://git.savannah.gnu.org/git/guix.git
branch: master
commit: d41c82b481fd0f5c7d45d6e2629fdf9d2085205b
$ vim channels-lock.scm
```
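As an alternative sketch, if your `channels-lock.scm` pins only the official `guix` channel (an assumption: if it pins additional channels you should keep editing the commit by hand), you can regenerate it from the Guix you just pulled:
```shell
$ guix describe -f channels > channels-lock.scm
```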
To test our change we can run:
```shell
$ guix time-machine -C channels-lock.scm -- build -f guix.scm
```
But a better test would be to build the Docker image, as that actually bundles all required runtime dependencies:
```shell
$ scripts/build_docker_image.sh
```


@ -0,0 +1,123 @@
# Hacking with Guix on Mobilizon Reshare
To setup a development environment to hack on `mobilizon-reshare` you can use [Guix](https://guix.gnu.org/) and [direnv](https://direnv.net/).
If you already have `guix` and `direnv` installed on your system, the development environment setup is as easy as:
```shell
$ git clone https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare
$ cd mobilizon-reshare/
direnv: error .envrc is blocked. Run `direnv allow` to approve its content.
$ direnv allow
direnv: loading .envrc
[...]
direnv: export +CPLUS_INCLUDE_PATH +C_INCLUDE_PATH +LIBRARY_PATH ~GUIX_LOCPATH ~PATH ~PYTHONPATH
$
```
Hurray 🎉 ! Now you can hack on `mobilizon-reshare` without worrying about dependencies.
## Guix
### Installation
*Caveat:* Guix currently runs only on Linux; if you run a different OS you're probably better off with something like [poetry](https://python-poetry.org/). Just beware that you may end up with slightly different behavior, since `poetry` only locks Python dependencies.
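A minimal sketch of the Poetry-based alternative, assuming you already have a recent Python and Poetry available:
```shell
$ git clone https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare
$ cd mobilizon-reshare/
$ poetry install                        # install the Python dependencies declared in pyproject.toml
$ poetry run mobilizon-reshare --help   # run the CLI inside Poetry's virtual environment
```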
#### Debian/Ubuntu/Linux Mint and derivatives
If you are running Debian or a derivative such as Ubuntu or Linux Mint installing Guix is achieved with:
```shell
$ sudo apt install guix
```
#### openSUSE
Likewise on openSUSE:
```shell
$ sudo zypper install guix
```
#### Arch Linux
The Arch Wiki has a very good [article](https://wiki.archlinux.org/title/Guix).
#### Other distributions
For every other distribution you can install Guix with the installer script. It will guide you through the process of installing Guix.
```shell
$ curl https://git.savannah.gnu.org/cgit/guix.git/plain/etc/guix-install.sh | sudo bash
```
Beware that piping to `sudo bash` is usually a *very* bad idea. Before running the above command please read the script and the [Guix manual](https://guix.gnu.org/en/manual/en/guix.html#Binary-Installation).
### Configuring Guix
To make Guix applications work out of the box you should add the following variables to your `.bash_profile` (or its equivalent for shells other than Bash):
```shell
GUIX_PROFILE="${HOME}/.guix-profile"
. "$GUIX_PROFILE/etc/profile"
export GUIX_LOCPATH="$GUIX_PROFILE/lib/locale"
export SSL_CERT_DIR="$GUIX_PROFILE/etc/ssl/certs"
export SSL_CERT_FILE="$GUIX_PROFILE/etc/ssl/certs/ca-certificates.crt"
export GIT_SSL_CAINFO="$SSL_CERT_FILE"
export CURL_CA_BUNDLE="$SSL_CERT_FILE"
export INFOPATH="$GUIX_PROFILE${INFOPATH:+:}${INFOPATH}"
export MANPATH="$GUIX_PROFILE${MANPATH:+:}${MANPATH}"
GUIX_PROFILE="$XDG_CONFIG_HOME/guix/current"
. "$GUIX_PROFILE/etc/profile"
```
and then run **in a new shell**
```shell
$ guix install nss-certs
$ sudo -i guix install glibc-locales
```
## direnv
### Installation
Once you have Guix properly setup, you can install `direnv` with:
```shell
$ guix install direnv
```
then you should [hook it](https://direnv.net/docs/hook.html) into your shell.
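For Bash, for example, hooking direnv means adding the following line at the end of your `~/.bashrc` (the linked documentation covers the other shells):
```shell
eval "$(direnv hook bash)"
```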
## Troubleshooting
Guix sometimes prints somewhat scary messages like:
```shell
$ guix install hello
The following package will be installed:
hello 2.10
The following derivation will be built:
/gnu/store/15s9gs89i6bf16skwb1c03bm4wj9h30a-profile.drv
building CA certificate bundle...
listing Emacs sub-directories...
building fonts directory...
building directory of Info manuals...
building database for manual pages...
building profile with 1 package...
hint: Consider setting the necessary environment variables by running:
GUIX_PROFILE="~/.guix-profile/hello-profile"
. "$GUIX_PROFILE/etc/profile"
Alternately, see `guix package --search-paths'.
$
```
When you see a message like that, you can either run the suggested commands to make the current shell aware of the packages Guix just installed, or close the current shell and spawn a new one, which will put the Guix packages in the right place in your `PATH`.

doc/manual.md Normal file

@ -0,0 +1,88 @@
# Manual
## What is mobilizon-reshare?
`mobilizon_reshare` is a Python application to publish events from Mobilizon on a broad selection of platforms.
This tool enables an organization to automate its social media strategy with regard to events and their promotion.
## Configuration
The configuration is implemented through Dynaconf. It allows a variety of ways to specify configuration keys.
Refer to their [documentation](https://www.dynaconf.com/) to discover how configuration files and environment variables can be specified.
We provide a sample configuration in the
[settings.toml](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/mobilizon_reshare/settings.toml) file and
[.secrets.toml](https://github.com/Tech-Workers-Coalition-Italia/mobilizon-reshare/blob/master/mobilizon_reshare/.secrets.toml) file.
Use these files as the base for your custom configuration.
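Since Dynaconf also reads environment variables, individual keys can be overridden without touching any file. A small sketch, using the `MOBILIZON_RESHARE_` prefix from the application's Dynaconf setup and Dynaconf's `__` separator for nested keys:
```shell
# Override the selection strategy and the source Mobilizon group for this shell session.
$ export MOBILIZON_RESHARE_SELECTION__STRATEGY="next_event"
$ export MOBILIZON_RESHARE_SOURCE__MOBILIZON__GROUP="my-group"
$ mobilizon-reshare start
```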
### Publishers and notifiers
The first step to deploy your custom configuration is to specify the source Mobilizon instance and group, and then
activate the publishers and notifiers you're interested in. For each of them, you have to specify credentials and
options in the `.secrets.toml` file.
### Publishing strategy
The second important step is to define when and how your posts should be published. `mobilizon-reshare` takes over the
responsibility of deciding *what* and *when* to publish, freeing humans from the need to curate the social media
strategy around your events. This might mean very different things for different users, and covering common use cases is
a goal of this project. To do so, though, you need to guide the tool and understand in detail the options available to
configure it in accordance with the requirements of your desired social media strategy.
The first element is the selection strategy. This is the way the tool decides which event to pick and
publish among all those available. At every execution, `mobilizon-reshare` will publish at most one event, so you have
to consider how the selected strategy will interact with the external scheduling. Unless specified otherwise, the
strategies assume that the schedule will fire at regular intervals. These intervals can vary, but they should be small
compared to the publishing window: ideally a few minutes to a couple of hours.
Currently only one strategy is supported: `next_event`. Its semantics are the following: pick the next
event in chronological order that hasn't been published yet, and publish it only if at least
`break_between_events_in_minutes` minutes have passed since the last publication.
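As a sketch, the external scheduling could be as simple as a cron entry firing at a regular interval (the actual schedule is up to you, as long as the interval stays small compared to the publishing window):
```shell
# crontab entry: run mobilizon-reshare every 15 minutes; each run publishes at most one event.
*/15 * * * * mobilizon-reshare start
```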
## Recap
In addition to the event publication feature, `mobilizon-reshare` allows you to publish a periodic recap of your events.
In the current version, the two features are handled separately and triggered by different CLI commands (respectively
`mobilizon-reshare start` and `mobilizon-reshare recap`).
The recap command, when executed, retrieves the list of already published events and summarizes them in a single message
published on all the active publishers. At the moment it doesn't support any decision logic and will always publish
when triggered.
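For example, to preview the recap message without actually posting it, you can use the `--dry-run` flag:
```shell
$ mobilizon-reshare recap --dry-run
```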
## Core Concepts
### Publisher
A Publisher is responsible for publishing an event or a message on a given platform.
Currently the following publishers are supported:
- Facebook
- Mastodon
- Twitter
- Telegram
- Zulip
### Notifier
Notifiers are similar to Publishers and share most of the implementation. Their purpose is to
notify the maintainers when something unexpected happens.
### Formatter
A Formatter is responsible for the formatting and validation of an event or a message on a given platform.
Different platforms require different templates and rules, and therefore formatting is considered a platform-specific
issue.
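For instance, you can check how a specific event would be formatted for a given platform from the command line (the event id below is a placeholder UUID):
```shell
$ mobilizon-reshare event format 11111111-2222-3333-4444-555555555555 telegram
```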
### Publication Strategy
A Publication Strategy is responsible for selecting the event to publish. Currently it's possible to publish only one
event per run, under the assumption that the user will implement a social media strategy that doesn't require
concurrent publishing of multiple events on the same platform. Through proper scheduling and configuration it is still
possible to achieve such behavior if required.
### Coordinator
A Coordinator is responsible for publishing a message or an event across different platforms, using different logic depending on the context.
It uses publishers and formatters to compose and send the message, and compiles a report of how the publication went.


@ -0,0 +1,12 @@
version: "3.7"
services:
db:
image: postgres:13
env_file:
- ./.env
healthcheck:
test: ["CMD", "pg_isready", "-U", "mobilizon_reshare"]
interval: 5s
retries: 5
ports:
- 5432:5432

docker-compose-web.yml Normal file

@ -0,0 +1,34 @@
version: "3.7"
services:
db:
image: postgres:13
env_file:
- ./.env
volumes:
- postgres-db-volume:/var/lib/postgresql/data
healthcheck:
test: ["CMD", "pg_isready", "-U", "mobilizon_reshare"]
interval: 5s
retries: 5
ports:
- 5432:5432
web:
build: .
command: poetry run mobilizon-reshare web
#command: sh
environment:
SECRETS_FOR_DYNACONF: /app/.secrets.toml
SETTINGS_FILE_FOR_DYNACONF: /app/settings.toml
ENV_FOR_DYNACONF: development
volumes:
- ./sample_settings/docker_web/.sample_secrets.toml:/app/.secrets.toml
- ./sample_settings/docker_web/settings.toml:/app/settings.toml
- /etc/localtime:/etc/localtime:ro
- /etc/timezone:/etc/timezone:ro
- postgres-db-volume:/var/lib/postgresql
- ./:/app
ports:
- 8000:8000
volumes:
postgres-db-volume:

docker-compose.yml Normal file

@ -0,0 +1,14 @@
version: "3.7"
services:
mobilizon-reshare:
image: twcita/mobilizon-reshare:v0.3.6
environment:
SECRETS_FOR_DYNACONF: /etc/xdg/mobilizon-reshare/0.3.6/.secrets.toml
ENV_FOR_DYNACONF: production
MOBILIZON_RESHARE_INTERVAL: "*/15 10-18 * * 0-4"
volumes:
- ./.secrets.toml:/etc/xdg/mobilizon-reshare/0.3.6/.secrets.toml:ro
- ./mobilizon_reshare.toml:/etc/xdg/mobilizon-reshare/0.3.6/mobilizon_reshare.toml:ro
- ./var:/var/lib/mobilizon-reshare
- /etc/localtime:/etc/localtime:ro
- /etc/timezone:/etc/timezone:ro


@ -0,0 +1,29 @@
_mobilizon_reshare_completion() {
local IFS=$'\n'
local response
response=$(env COMP_WORDS="${COMP_WORDS[*]}" COMP_CWORD=$COMP_CWORD _MOBILIZON_RESHARE_COMPLETE=bash_complete $1)
for completion in $response; do
IFS=',' read type value <<< "$completion"
if [[ $type == 'dir' ]]; then
COMPREPLY=()
compopt -o dirnames
elif [[ $type == 'file' ]]; then
COMPREPLY=()
compopt -o default
elif [[ $type == 'plain' ]]; then
COMPREPLY+=($value)
fi
done
return 0
}
_mobilizon_reshare_completion_setup() {
complete -o nosort -F _mobilizon_reshare_completion mobilizon-reshare
}
_mobilizon_reshare_completion_setup;


@ -0,0 +1,22 @@
function _mobilizon_reshare_completion;
set -l response;
for value in (env _MOBILIZON_RESHARE_COMPLETE=fish_complete COMP_WORDS=(commandline -cp) COMP_CWORD=(commandline -t) mobilizon-reshare);
set response $response $value;
end;
for completion in $response;
set -l metadata (string split "," $completion);
if test $metadata[1] = "dir";
__fish_complete_directories $metadata[2];
else if test $metadata[1] = "file";
__fish_complete_path $metadata[2];
else if test $metadata[1] = "plain";
echo $metadata[2];
end;
end;
end;
complete --no-files --command mobilizon-reshare --arguments "(_mobilizon_reshare_completion)";


@ -0,0 +1,35 @@
#compdef mobilizon-reshare
_mobilizon_reshare_completion() {
local -a completions
local -a completions_with_descriptions
local -a response
(( ! $+commands[mobilizon-reshare] )) && return 1
response=("${(@f)$(env COMP_WORDS="${words[*]}" COMP_CWORD=$((CURRENT-1)) _MOBILIZON_RESHARE_COMPLETE=zsh_complete mobilizon-reshare)}")
for type key descr in ${response}; do
if [[ "$type" == "plain" ]]; then
if [[ "$descr" == "_" ]]; then
completions+=("$key")
else
completions_with_descriptions+=("$key":"$descr")
fi
elif [[ "$type" == "dir" ]]; then
_path_files -/
elif [[ "$type" == "file" ]]; then
_path_files -f
fi
done
if [ -n "$completions_with_descriptions" ]; then
_describe -V unsorted completions_with_descriptions -U
fi
if [ -n "$completions" ]; then
compadd -U -V unsorted -a completions
fi
}
compdef _mobilizon_reshare_completion mobilizon-reshare;

guix.scm Normal file

@ -0,0 +1,71 @@
(define-module (guix)
#:use-module (guix git-download)
#:use-module (guix build-system python)
#:use-module (guix gexp)
#:use-module (guix packages)
#:use-module (guix utils)
#:use-module (gnu packages markup) ;; for python-markdownify
#:use-module (gnu packages python-web) ;; for python-fastapi-pagination-minimal and uvicorn
#:use-module (gnu packages python-xyz) ;; for python-apscheduler
#:use-module (mobilizon-reshare package)
#:use-module (mobilizon-reshare dependencies)
#:use-module (ice-9 rdelim)
#:use-module (ice-9 popen))
(define %source-dir (getcwd))
(define mobilizon-reshare-git-origin
(local-file %source-dir
#:recursive? #t
#:select? (git-predicate %source-dir)))
(define mobilizon-reshare.git
(let ((source-version (with-input-from-file
(string-append %source-dir
"/mobilizon_reshare/VERSION")
read-line))
(revision "0")
(commit (read-line
(open-input-pipe "git show HEAD | head -1 | cut -d ' ' -f 2"))))
((package-input-rewriting/spec `(("python-fastapi" . ,(const python-fastapi))
("python-dotenv" . ,(const python-dotenv-0.13.0))
("python-uvicorn" . ,(const python-uvicorn))))
(package (inherit mobilizon-reshare)
(name "mobilizon-reshare.git")
(version (git-version source-version revision commit))
(source mobilizon-reshare-git-origin)
(propagated-inputs
(modify-inputs (package-propagated-inputs mobilizon-reshare)
(replace "python-uvicorn" python-uvicorn)
(replace "python-fastapi" python-fastapi)
(replace "python-fastapi-pagination-minimal"
(package
(inherit python-fastapi-pagination-minimal)
(propagated-inputs
(modify-inputs (package-propagated-inputs python-fastapi-pagination-minimal)
(replace "python-fastapi" python-fastapi)))))
(replace "python-markdownify" python-markdownify)))))))
(define-public mobilizon-reshare-scheduler
(package (inherit mobilizon-reshare.git)
(name "mobilizon-reshare-scheduler")
(build-system python-build-system)
(arguments
(list
#:phases
#~(modify-phases %standard-phases
(delete 'configure)
(delete 'build)
(delete 'check)
(replace 'install
(lambda _
(let ((bin (string-append #$output "/bin")))
(mkdir-p bin)
(install-file "scripts/scheduler.py" bin)))))))
(propagated-inputs (list mobilizon-reshare.git
python-apscheduler))
(synopsis "Mobilizon Reshare's scheduler")
(description "This script is intended to start a scheduler
running @code{mobilizon-reshare}.")))
mobilizon-reshare.git

manifest.scm Normal file

@ -0,0 +1,17 @@
(define-module (manifest)
#:use-module (mobilizon-reshare package)
#:use-module (gnu packages)
#:use-module (guix channels)
#:use-module (guix inferior)
#:use-module (guix packages)
#:use-module (guix profiles)
#:use-module (srfi srfi-1))
(packages->manifest
(append
(map cadr (package-direct-inputs mobilizon-reshare))
(map specification->package+output
'("git-cal" "man-db" "texinfo"
"pre-commit" "cloc"
"ripgrep" "python-semver"
"fd" "docker-compose" "poetry"))))


@ -1,34 +1,56 @@
[default.publisher.telegram]
active=true
chat_id="xxx"
msg_template_path="xxx"
message_thread_id="xxx"
token="xxx"
username="xxx"
[default.publisher.facebook]
active=false
[default.publisher.zulip]
active=true
instance="xxx"
chat_id="xxx"
subject="xxx"
bot_token="xxx"
bot_email="xxx"
[default.publisher.twitter]
active=false
active=true
api_key="xxx"
api_key_secret="xxx"
access_token="xxx"
access_secret="xxx"
[default.publisher.mastodon]
active=false
active=true
instance="xxx"
token="xxx"
name="xxx"
toot_length=500
[default.publisher.facebook]
active=true
page_access_token="xxx"
[default.notifier.telegram]
active=true
chat_id="xxx"
message_thread_id="xxx"
token="xxx"
username="xxx"
[default.notifier.zulip]
active=true
instance="xxx"
chat_id="xxx"
subject="xxx"
bot_token="xxx"
bot_email="xxx"
[default.notifier.twitter]
active=false
active=true
api_key="xxx"
api_key_secret="xxx"
access_token="xxx"
access_secret="xxx"
[default.notifier.mastodon]
active=false
[default.notifier.facebook]
active=false
page_access_token="xxx"


@ -1 +1 @@
0.1.0
0.3.6


@ -1,39 +1,38 @@
import asyncio
import functools
import logging
import sys
import traceback
from logging.config import dictConfig
from pathlib import Path
from mobilizon_reshare.config.config import get_settings
from mobilizon_reshare.storage.db import tear_down, MoReDB
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.config.config import init_logging
from mobilizon_reshare.storage.db import tear_down, init
logger = logging.getLogger(__name__)
async def graceful_exit(code):
async def graceful_exit():
await tear_down()
exit(code)
async def init(settings_file):
settings = get_settings(settings_file)
dictConfig(settings["logging"])
db_path = Path(settings.db_path)
db = MoReDB(db_path)
await db.setup()
async def _safe_execution(function):
init_logging()
await init()
async def _safe_execution(f, settings_file):
await init(settings_file)
return_code = 1
try:
return_code = await f()
return_code = await function()
except Exception:
traceback.print_exc()
finally:
logger.debug("Closing")
await graceful_exit(return_code)
await graceful_exit()
return return_code
def safe_execution(f, settings_file):
asyncio.run(_safe_execution(f, settings_file))
def safe_execution(function, command_config: CommandConfig = None):
if command_config:
function = functools.partial(function, command_config)
code = asyncio.run(_safe_execution(function))
sys.exit(code)


@ -1,79 +1,265 @@
import functools
import click
from arrow import Arrow
import uvicorn
from click import pass_context
from mobilizon_reshare.cli import safe_execution
from mobilizon_reshare.cli.format import format_event
from mobilizon_reshare.cli.inspect_event import inspect_events
from mobilizon_reshare.cli.main import main
from mobilizon_reshare.event.event import EventPublicationStatus
from mobilizon_reshare.cli.commands.format.format import format_event
from mobilizon_reshare.cli.commands.list.list_event import list_events
from mobilizon_reshare.cli.commands.list.list_publication import list_publications
from mobilizon_reshare.cli.commands.publish.main import publish_command as publish_main
from mobilizon_reshare.cli.commands.pull.main import pull_command as pull_main
from mobilizon_reshare.cli.commands.recap.main import recap_command as recap_main
from mobilizon_reshare.cli.commands.retry.main import (
retry_event_command,
retry_publication_command,
)
from mobilizon_reshare.cli.commands.start.main import start_command as start_main
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.config.config import current_version, get_settings, init_logging
from mobilizon_reshare.config.publishers import publisher_names
from mobilizon_reshare.dataclasses.event import _EventPublicationStatus
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers import get_active_publishers
settings_file_option = click.option("--settings-file", type=click.Path(exists=True))
def test_settings(ctx, param, value):
if not value or ctx.resilient_parsing:
return
settings = get_settings()
init_logging(settings)
click.echo("OK!")
ctx.exit()
def print_version(ctx, param, value):
if not value or ctx.resilient_parsing:
return
click.echo(current_version())
ctx.exit()
def print_platforms(ctx, param, value):
if not value or ctx.resilient_parsing:
return
for platform in get_active_publishers():
click.echo(platform)
ctx.exit()
status_name_to_enum = {
"event": {
"waiting": _EventPublicationStatus.WAITING,
"completed": _EventPublicationStatus.COMPLETED,
"failed": _EventPublicationStatus.FAILED,
"partial": _EventPublicationStatus.PARTIAL,
"all": None,
},
"publication": {
"completed": PublicationStatus.COMPLETED,
"failed": PublicationStatus.FAILED,
"all": None,
},
}
from_date_option = click.option(
"-b",
"--begin",
type=click.DateTime(),
expose_value=True,
help="Include only events that begin after this datetime",
help="Include only events that begin after this datetime.",
)
to_date_option = click.option(
"-e",
"--end",
type=click.DateTime(),
expose_value=True,
help="Include only events that begin before this datetime",
help="Include only events that end before this datetime.",
)
event_status_argument = click.argument(
"status",
type=click.Choice(list(status_name_to_enum["event"].keys())),
default="all",
expose_value=True,
)
publication_status_argument = click.argument(
"status",
type=click.Choice(list(status_name_to_enum["publication"].keys())),
default="all",
expose_value=True,
)
force_publish_option = click.option(
"-F",
"--force",
type=click.UUID,
expose_value=True,
help="Publish the given event, bypassing all selection logic. This command WILL publish"
"regardless of the configured strategy, so use it with care.",
)
platform_name_option = click.option(
"-p",
"--platform",
type=str,
expose_value=True,
help="Restrict the platforms where the event will be published. This makes sense only in"
" case of force-publishing.",
)
list_supported_option = click.option(
"--list-platforms",
is_flag=True,
callback=print_platforms,
expose_value=False,
is_eager=True,
help="Show all active platforms.",
)
test_configuration = click.option(
"-t",
"--test-configuration",
is_flag=True,
callback=test_settings,
expose_value=False,
is_eager=True,
help="Validate the current configuration.",
)
@click.group()
def mobilizon_reshare():
@test_configuration
@list_supported_option
@click.option(
"--version",
is_flag=True,
callback=print_version,
expose_value=False,
is_eager=True,
help="Show the current version.",
)
@pass_context
def mobilizon_reshare(obj):
pass
@mobilizon_reshare.command()
@settings_file_option
def start(settings_file):
safe_execution(main, settings_file=settings_file)
@mobilizon_reshare.command(
help="Synchronize and publish events. It is equivalent to running consecutively pull and then publish."
)
@click.option(
"--dry-run",
is_flag=True,
help="Prevents data to be published to platforms. WARNING: it will download and write new events to the database",
default=False,
)
def start(dry_run):
safe_execution(start_main, CommandConfig(dry_run=dry_run))
@mobilizon_reshare.command()
@mobilizon_reshare.command(help="Publish a recap of already published events.")
@click.option(
"--dry-run",
"dry_run",
is_flag=True,
help="Prevents data to be published to platforms. WARNING: it will download and write new events to the database",
default=False,
)
def recap(dry_run):
safe_execution(recap_main, CommandConfig(dry_run=dry_run))
@mobilizon_reshare.command(
help="Fetch the latest events from Mobilizon, store them if they are unknown, "
"update them if they are known and changed."
)
def pull():
safe_execution(pull_main,)
@mobilizon_reshare.command(
help="Select an event with the current configured strategy"
" and publish it to all active platforms."
)
@force_publish_option
@platform_name_option
@click.option(
"--dry-run",
"dry_run",
is_flag=True,
help="Prevents data to be published to platforms.",
default=False,
)
def publish(event, platform, dry_run):
safe_execution(functools.partial(
publish_main, event, platform
), CommandConfig(dry_run=dry_run))
@mobilizon_reshare.group(help="Operations that pertain to events")
def event():
pass
@mobilizon_reshare.group(help="Operations that pertain to publications")
def publication():
pass
@event.command(help="Query for events in the database.", name="list")
@event_status_argument
@from_date_option
@to_date_option
@click.argument("target", type=str)
@settings_file_option
@pass_context
def inspect(ctx, target, begin, end, settings_file):
ctx.ensure_object(dict)
begin = Arrow.fromdatetime(begin) if begin else None
end = Arrow.fromdatetime(end) if end else None
target_to_status = {
"waiting": EventPublicationStatus.WAITING,
"completed": EventPublicationStatus.COMPLETED,
"failed": EventPublicationStatus.FAILED,
"partial": EventPublicationStatus.PARTIAL,
"all": None,
}
def event_list(status, begin, end):
safe_execution(
functools.partial(
inspect_events,
target_to_status[target],
frm=begin,
to=end,
list_events, status_name_to_enum["event"][status], frm=begin, to=end,
),
settings_file,
)
@mobilizon_reshare.command()
@settings_file_option
@click.argument("event-id", type=str)
@click.argument("publisher", type=str)
def format(settings_file, event_id, publisher):
@publication.command(help="Query for publications in the database.", name="list")
@publication_status_argument
@from_date_option
@to_date_option
def publication_list(status, begin, end):
safe_execution(
functools.partial(format_event, event_id, publisher),
settings_file,
functools.partial(
list_publications,
status_name_to_enum["publication"][status],
frm=begin,
to=end,
),
)
@event.command(
help="Format and print event with EVENT-ID using the publisher's format named "
"PUBLISHER."
)
@click.argument("event-id", type=click.UUID)
@click.argument("publisher", type=click.Choice(publisher_names))
def format(
event_id, publisher,
):
safe_execution(functools.partial(format_event, event_id, publisher),)
@event.command(name="retry", help="Retries all the failed publications")
@click.argument("event-id", type=click.UUID)
def event_retry(event_id):
safe_execution(functools.partial(retry_event_command, event_id),)
@publication.command(name="retry", help="Retries a specific publication")
@click.argument("publication-id", type=click.UUID)
def publication_retry(publication_id):
safe_execution(functools.partial(retry_publication_command, publication_id),)
@mobilizon_reshare.command("web")
def web():
uvicorn.run(
"mobilizon_reshare.web.backend.main:app", host="0.0.0.0", port=8000, reload=True
)
if __name__ == "__main__":
mobilizon_reshare()
mobilizon_reshare(obj={})


@ -0,0 +1,5 @@
import click
def print_reports(reports) -> None:
click.echo(reports)


@ -1,11 +1,11 @@
import click
from mobilizon_reshare.event.event import MobilizonEvent
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.models.event import Event
from mobilizon_reshare.publishers.coordinator import PublisherCoordinator
from mobilizon_reshare.publishers.platforms.platform_mapping import get_formatter_class
async def format_event(event_id, publisher):
async def format_event(event_id, publisher_name: str):
event = await Event.get_or_none(mobilizon_id=event_id).prefetch_related(
"publications__publisher"
)
@ -13,5 +13,5 @@ async def format_event(event_id, publisher):
click.echo(f"Event with mobilizon_id {event_id} not found.")
return
event = MobilizonEvent.from_model(event)
message = PublisherCoordinator.get_formatted_message(event, publisher)
message = get_formatter_class(publisher_name)().get_message_from_event(event)
click.echo(message)


@ -0,0 +1,72 @@
from datetime import datetime
from typing import Iterable, Optional
import click
from arrow import Arrow
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.dataclasses.event import (
_EventPublicationStatus,
get_all_mobilizon_events,
get_published_events,
get_mobilizon_events_with_status,
get_mobilizon_events_without_publications,
)
from mobilizon_reshare.event.event_selection_strategies import select_unpublished_events
status_to_color = {
_EventPublicationStatus.COMPLETED: "green",
_EventPublicationStatus.FAILED: "red",
_EventPublicationStatus.PARTIAL: "yellow",
_EventPublicationStatus.WAITING: "white",
}
def show_events(events: Iterable[MobilizonEvent]):
click.echo_via_pager("\n".join(map(pretty, events)))
def pretty(event: MobilizonEvent):
return (
f"{event.name : ^40}{click.style(event.status.name, fg=status_to_color[event.status]) : ^22}"
f"{str(event.mobilizon_id) : <40}"
f"{event.begin_datetime.to('local').isoformat() : <29}"
f"{event.end_datetime.to('local').isoformat()}"
)
async def list_unpublished_events(frm: Arrow = None, to: Arrow = None):
return select_unpublished_events(
list(await get_published_events(from_date=frm, to_date=to)),
list(
await get_mobilizon_events_without_publications(from_date=frm, to_date=to)
),
)
async def list_events(
status: Optional[_EventPublicationStatus] = None,
frm: Optional[datetime] = None,
to: Optional[datetime] = None,
):
frm = Arrow.fromdatetime(frm) if frm else None
to = Arrow.fromdatetime(to) if to else None
if status is None:
events = await get_all_mobilizon_events(from_date=frm, to_date=to)
elif status == _EventPublicationStatus.WAITING:
events = await list_unpublished_events(frm=frm, to=to)
else:
events = await get_mobilizon_events_with_status(
[status], from_date=frm, to_date=to
)
events = list(events)
if events:
show_events(events)
else:
message = (
f"No event found with status: {status.name}"
if status is not None
else "No event found"
)
click.echo(message)


@ -0,0 +1,51 @@
from datetime import datetime
from typing import Iterable, Optional
import click
from arrow import Arrow
from mobilizon_reshare.models.publication import Publication, PublicationStatus
from mobilizon_reshare.storage.query.read import (
get_all_publications,
publications_with_status,
)
status_to_color = {
PublicationStatus.COMPLETED: "green",
PublicationStatus.FAILED: "red",
}
def show_publications(publications: Iterable[Publication]):
click.echo_via_pager("\n".join(map(pretty, publications)))
def pretty(publication: Publication):
return (
f"{str(publication.id) : <40}{publication.timestamp.isoformat() : <36}"
f"{click.style(publication.status.name, fg=status_to_color[publication.status]) : <22}"
f"{publication.publisher.name : <12}{str(publication.event.mobilizon_id)}"
)
async def list_publications(
status: PublicationStatus = None,
frm: Optional[datetime] = None,
to: Optional[datetime] = None,
):
frm = Arrow.fromdatetime(frm) if frm else None
to = Arrow.fromdatetime(to) if to else None
if status is None:
publications = await get_all_publications(from_date=frm, to_date=to)
else:
publications = await publications_with_status(status, from_date=frm, to_date=to)
if publications:
show_publications(publications)
else:
message = (
f"No publication found with status: {status.name}"
if status is not None
else "No publication found"
)
click.echo(message)


@ -0,0 +1,23 @@
import logging
import click
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.main.publish import select_and_publish, publish_by_mobilizon_id
logger = logging.getLogger(__name__)
async def publish_command(event_mobilizon_id: click.UUID, platform: str, command_config: CommandConfig):
"""
Select an event with the current configured strategy
and publish it to all active platforms.
"""
if event_mobilizon_id is not None:
report = await publish_by_mobilizon_id(
event_mobilizon_id,
command_config,
[platform] if platform is not None else None,
)
else:
report = await select_and_publish(command_config)
return 0 if report and report.successful else 1


@ -0,0 +1,10 @@
from mobilizon_reshare.main.pull import pull
async def pull_command():
"""
STUB
:return:
"""
await pull()
return 0


@ -0,0 +1,15 @@
import logging.config
from mobilizon_reshare.cli.commands import print_reports
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.main.recap import recap
logger = logging.getLogger(__name__)
async def recap_command(command_config: CommandConfig):
reports = await recap(command_config)
if command_config.dry_run and reports:
print_reports(reports)
return 0 if reports and reports.successful else 1


@ -0,0 +1,11 @@
from mobilizon_reshare.main.retry import retry_publication, retry_event
async def retry_event_command(event_id):
reports = await retry_event(event_id)
return 0 if reports and reports.successful else 1
async def retry_publication_command(publication_id):
reports = await retry_publication(publication_id)
return 0 if reports and reports.successful else 1


@ -0,0 +1,14 @@
from mobilizon_reshare.cli.commands import print_reports
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.main.start import start
async def start_command(command_config: CommandConfig):
"""
STUB
:return:
"""
reports = await start(command_config)
if command_config.dry_run and reports:
print_reports(reports)
return 0 if reports and reports.successful else 1


@ -1,44 +0,0 @@
from typing import Iterable
import click
from arrow import Arrow
from mobilizon_reshare.event.event import EventPublicationStatus
from mobilizon_reshare.event.event import MobilizonEvent
from mobilizon_reshare.storage.query import get_all_events
from mobilizon_reshare.storage.query import events_with_status
status_to_color = {
EventPublicationStatus.COMPLETED: "green",
EventPublicationStatus.FAILED: "red",
EventPublicationStatus.PARTIAL: "yellow",
EventPublicationStatus.WAITING: "white",
}
def show_events(events: Iterable[MobilizonEvent]):
click.echo_via_pager("\n".join(map(pretty, events)))
def pretty(event: MobilizonEvent):
return (
f"{event.name}|{click.style(event.status.name, fg=status_to_color[event.status])}"
f"|{event.mobilizon_id}|{event.begin_datetime.isoformat()}->{event.end_datetime.isoformat()}"
)
async def inspect_events(
status: EventPublicationStatus = None, frm: Arrow = None, to: Arrow = None
):
events = (
await events_with_status([status], from_date=frm, to_date=to)
if status
else await get_all_events(from_date=frm, to_date=to)
)
if events:
show_events(events)
else:
click.echo(f"No event found with status: {status}")


@ -1,58 +0,0 @@
import logging.config
from mobilizon_reshare.event.event_selection_strategies import select_event_to_publish
from mobilizon_reshare.mobilizon.events import get_unpublished_events
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers.coordinator import (
PublicationFailureNotifiersCoordinator,
)
from mobilizon_reshare.publishers.coordinator import PublisherCoordinator
from mobilizon_reshare.storage.query import (
get_published_events,
get_unpublished_events as get_db_unpublished_events,
create_unpublished_events,
save_publication_report,
publications_with_status,
)
logger = logging.getLogger(__name__)
async def main():
"""
STUB
:return:
"""
# TODO: the logic to get published and unpublished events is probably redundant.
# We need a simpler way to bring together events from mobilizon, unpublished events from the db
# and published events from the DB
# Load past events
published_events = list(await get_published_events())
# Pull unpublished events from Mobilizon
unpublished_events = get_unpublished_events(published_events)
# Store in the DB only the ones we didn't know about
await create_unpublished_events(unpublished_events)
event = select_event_to_publish(
published_events,
# We must load unpublished events from DB since it contains
# merged state between Mobilizon and previous WAITING events.
list(await get_db_unpublished_events()),
)
if event:
waiting_publications = await publications_with_status(
status=PublicationStatus.WAITING,
event_mobilizon_id=event.mobilizon_id,
)
logger.debug(f"Event to publish found: {event.name}")
report = PublisherCoordinator(event, waiting_publications).run()
await save_publication_report(report, waiting_publications)
PublicationFailureNotifiersCoordinator(event, report).notify_failures()
return 0 if report.successful else 1
else:
return 0


@ -0,0 +1,6 @@
import dataclasses
@dataclasses.dataclass
class CommandConfig:
dry_run: bool = dataclasses.field(default=False)


@ -1,5 +1,6 @@
import importlib.resources
import os
import importlib
import logging
from logging.config import dictConfig
from pathlib import Path
from typing import Optional
@ -9,23 +10,18 @@ from dynaconf import Dynaconf, Validator
import mobilizon_reshare
from mobilizon_reshare.config import strategies, publishers, notifiers
from mobilizon_reshare.config.notifiers import notifier_names
from mobilizon_reshare.config.publishers import publisher_names
logger = logging.getLogger(__name__)
base_validators = [
# strategy to decide events to publish
Validator("selection.strategy", must_exist=True, is_type_of=str),
Validator(
"publishing.window.begin",
must_exist=True,
is_type_of=int,
gte=0,
lte=24,
),
Validator("publishing.window.end", must_exist=True, is_type_of=int, gte=0, lte=24),
# url of the main Mobilizon instance to download events from
Validator("source.mobilizon.url", must_exist=True, is_type_of=str),
Validator("source.mobilizon.group", must_exist=True, is_type_of=str),
Validator("db_url", must_exist=True, is_type_of=str),
Validator("locale", must_exist=True, is_type_of=str, default="en-us"),
]
activeness_validators = [
@ -42,43 +38,54 @@ def current_version() -> str:
return fp.read()
def build_settings(
settings_file: Optional[str] = None, validators: Optional[list[Validator]] = None
):
def init_logging(settings: Optional[Dynaconf] = None):
if settings is None:
settings = get_settings()
dictConfig(settings["logging"])
def get_settings_files_paths() -> Optional[str]:
dirs = AppDirs(appname="mobilizon-reshare", version=current_version())
bundled_settings_ref = importlib.resources.files(
"mobilizon_reshare"
) / "settings.toml"
with importlib.resources.as_file(bundled_settings_ref) as bundled_settings_path:
for config_path in [
Path(dirs.user_config_dir, "mobilizon_reshare.toml").absolute(),
Path(dirs.site_config_dir, "mobilizon_reshare.toml").absolute(),
bundled_settings_path.absolute(),
]:
if config_path and Path(config_path).exists():
logger.debug(f"Loading configuration from {config_path}")
return config_path
def build_settings(validators: Optional[list[Validator]] = None) -> Dynaconf:
"""
Creates a Dynaconf base object. Configuration files are checked in this order:
1. CLI argument
2. `MOBILIZION_RESHARE_SETTINGS_FILE` environment variable;
3. User configuration directory. On Linux that's `$XDG_CONFIG_HOME/mobilizon_reshare/<mobilizon-reshare-version>`;
4. User configuration directory. On Linux that's the first element of
1. User configuration directory. On Linux that's `$XDG_CONFIG_HOME/mobilizon_reshare/<mobilizon-reshare-version>`;
2. System configuration directory. On Linux that's the first element of
`$XDG_CONFIG_DIRS` + `/mobilizon_reshare/<mobilizon-reshare-version>`.
5. The default configuration distributed with the package.
3. The default configuration distributed with the package.
The first available configuration file will be loaded.
"""
dirs = AppDirs(appname="mobilizon-reshare", version=current_version())
with importlib.resources.path(
mobilizon_reshare, "settings.toml"
) as bundled_settings_path:
SETTINGS_FILE = [
bundled_settings_path,
Path(dirs.site_config_dir, "mobilizon_reshare.toml"),
Path(dirs.user_config_dir, "mobilizon_reshare.toml"),
os.environ.get("MOBILIZION_RESHARE_SETTINGS_FILE"),
settings_file,
]
ENVVAR_PREFIX = "MOBILIZON_RESHARE"
return Dynaconf(
config = Dynaconf(
environments=True,
envvar_prefix=ENVVAR_PREFIX,
settings_files=SETTINGS_FILE,
settings_files=get_settings_files_paths(),
validators=validators or [],
)
# TODO use validation control in dynaconf 3.2.0 once released
config.validators.validate()
return config
def build_and_validate_settings(settings_file: Optional[str] = None):
def build_and_validate_settings() -> Dynaconf:
"""
Creates a settings object to be used in the application. It collects and apply generic validators and validators
specific for each publisher, notifier and publication strategy.
@ -86,9 +93,7 @@ def build_and_validate_settings(settings_file: Optional[str] = None):
# we first do a preliminary load of the settings without validation. We will later use them to determine which
# publishers, notifiers and strategy have been selected
raw_settings = build_settings(
settings_file=settings_file, validators=activeness_validators
)
raw_settings = build_settings(validators=activeness_validators)
# we retrieve validators that are conditional. Each module will analyze the settings and decide which validators
# need to be applied.
@ -98,14 +103,12 @@ def build_and_validate_settings(settings_file: Optional[str] = None):
# we rebuild the settings, providing all the selected validators.
settings = build_settings(
settings_file,
base_validators
+ strategy_validators
+ publisher_validators
+ notifier_validators,
)
# TODO use validation control in dynaconf 3.2.0 once released
settings.validators.validate()
return settings
@ -115,29 +118,26 @@ def build_and_validate_settings(settings_file: Optional[str] = None):
# The normal Dynaconf options to specify the settings files are also not a valid option because of the two steps
# validation that prevents us to employ their mechanism to specify settings files. This could probably be reworked
# better in the future.
class CustomConfig:
_instance = None
_settings_file = None
def __new__(cls, settings_file: Optional[str] = None):
if (
settings_file is None and cls._settings_file is not None
): # normal access, I don't want to reload
return cls._instance
if (
cls._instance is None and cls._settings_file is None
) or settings_file != cls._settings_file:
cls._settings_file = settings_file
cls._instance = super(CustomConfig, cls).__new__(cls)
cls.settings = build_and_validate_settings(settings_file)
@classmethod
def get_instance(cls):
if not hasattr(cls, "_instance") or cls._instance is None:
cls._instance = cls()
return cls._instance
def update(self, settings_file: Optional[str] = None):
self.settings = build_and_validate_settings(settings_file)
def __init__(self):
self.settings = build_and_validate_settings()
@classmethod
def clear(cls):
cls._instance = None
def get_settings(settings_file: Optional[str] = None):
config = CustomConfig(settings_file)
return config.settings
def get_settings() -> Dynaconf:
return CustomConfig.get_instance().settings
def get_settings_without_validation() -> Dynaconf:
return build_settings()


@ -4,20 +4,34 @@ from dynaconf import Validator
telegram_validators = [
Validator("notifier.telegram.chat_id", must_exist=True),
Validator("notifier.telegram.message_thread_id", default=None),
Validator("notifier.telegram.token", must_exist=True),
Validator("notifier.telegram.username", must_exist=True),
]
zulip_validators = [
Validator("publisher.zulip.chat_id", must_exist=True),
Validator("publisher.zulip.subject", must_exist=True),
Validator("publisher.zulip.msg_template_path", must_exist=True, default=None),
Validator("publisher.zulip.bot_token", must_exist=True),
Validator("publisher.zulip.bot_email", must_exist=True),
Validator("notifier.zulip.chat_id", must_exist=True),
Validator("notifier.zulip.subject", must_exist=True),
Validator("notifier.zulip.bot_token", must_exist=True),
Validator("notifier.zulip.bot_email", must_exist=True),
]
mastodon_validators = [
Validator("publisher.zulip.instance", must_exist=True),
Validator("notifier.mastodon.instance", must_exist=True),
Validator("notifier.mastodon.token", must_exist=True),
Validator("notifier.mastodon.name", must_exist=True),
]
twitter_validators = [
Validator("publisher.twitter.api_key", must_exist=True),
Validator("publisher.twitter.api_key_secret", must_exist=True),
Validator("publisher.twitter.access_token", must_exist=True),
Validator("publisher.twitter.access_secret", must_exist=True),
]
mastodon_validators = []
twitter_validators = []
facebook_validators = [
Validator("publisher.facebook.page_access_token", must_exist=True),
]
notifier_name_to_validators = {
"facebook": facebook_validators,
"telegram": telegram_validators,
"twitter": twitter_validators,
"mastodon": mastodon_validators,


@ -3,20 +3,57 @@ from dynaconf import Validator
telegram_validators = [
Validator("publisher.telegram.chat_id", must_exist=True),
Validator("publisher.telegram.message_thread_id", default=None),
Validator("publisher.telegram.msg_template_path", must_exist=True, default=None),
Validator("publisher.telegram.recap_template_path", must_exist=True, default=None),
Validator(
"publisher.telegram.recap_header_template_path", must_exist=True, default=None
),
Validator("publisher.telegram.token", must_exist=True),
Validator("publisher.telegram.username", must_exist=True),
]
zulip_validators = [
Validator("publisher.zulip.instance", must_exist=True),
Validator("publisher.zulip.chat_id", must_exist=True),
Validator("publisher.zulip.subject", must_exist=True),
Validator("publisher.zulip.msg_template_path", must_exist=True, default=None),
Validator("publisher.zulip.recap_template_path", must_exist=True, default=None),
Validator(
"publisher.zulip.recap_header_template_path", must_exist=True, default=None
),
Validator("publisher.zulip.bot_token", must_exist=True),
Validator("publisher.zulip.bot_email", must_exist=True),
]
mastodon_validators = []
twitter_validators = []
facebook_validators = []
mastodon_validators = [
Validator("publisher.mastodon.instance", must_exist=True),
Validator("publisher.mastodon.token", must_exist=True),
Validator("publisher.mastodon.toot_length", default=500),
Validator("publisher.mastodon.msg_template_path", must_exist=True, default=None),
Validator("publisher.mastodon.recap_template_path", must_exist=True, default=None),
Validator(
"publisher.mastodon.recap_header_template_path", must_exist=True, default=None
),
Validator("publisher.mastodon.name", must_exist=True),
]
twitter_validators = [
Validator("publisher.twitter.msg_template_path", must_exist=True, default=None),
Validator("publisher.twitter.recap_template_path", must_exist=True, default=None),
Validator(
"publisher.twitter.recap_header_template_path", must_exist=True, default=None
),
Validator("publisher.twitter.api_key", must_exist=True),
Validator("publisher.twitter.api_key_secret", must_exist=True),
Validator("publisher.twitter.access_token", must_exist=True),
Validator("publisher.twitter.access_secret", must_exist=True),
]
facebook_validators = [
Validator("publisher.facebook.msg_template_path", must_exist=True, default=None),
Validator("publisher.facebook.recap_template_path", must_exist=True, default=None),
Validator(
"publisher.facebook.recap_header_template_path", must_exist=True, default=None
),
Validator("publisher.facebook.page_access_token", must_exist=True),
]
publisher_name_to_validators = {
"telegram": telegram_validators,
@ -29,9 +66,11 @@ publisher_names = publisher_name_to_validators.keys()
def get_active_publishers(settings):
return filter(
lambda publisher_name: settings["publisher"][publisher_name]["active"],
publisher_names,
return list(
filter(
lambda publisher_name: settings["publisher"][publisher_name]["active"],
publisher_names,
)
)


@ -0,0 +1,13 @@
from mobilizon_reshare.dataclasses.event import _MobilizonEvent
from mobilizon_reshare.dataclasses.event_publication_status import (
_EventPublicationStatus,
)
from mobilizon_reshare.dataclasses.publication import (
_EventPublication,
_PublicationNotification,
)
EventPublication = _EventPublication
MobilizonEvent = _MobilizonEvent
EventPublicationStatus = _EventPublicationStatus
PublicationNotification = _PublicationNotification


@ -0,0 +1,164 @@
from dataclasses import dataclass, asdict
from typing import Optional, Iterable
from uuid import UUID
import arrow
from arrow import Arrow
from jinja2 import Template
from mobilizon_reshare.config.config import get_settings
from mobilizon_reshare.dataclasses.event_publication_status import (
_EventPublicationStatus,
_compute_event_status,
)
from mobilizon_reshare.models.event import Event
from mobilizon_reshare.storage.query.read import (
get_all_events,
get_event,
get_events_without_publications,
)
@dataclass
class _MobilizonEvent:
"""Class representing an event retrieved from Mobilizon."""
name: str
description: Optional[str]
begin_datetime: arrow.Arrow
end_datetime: arrow.Arrow
mobilizon_link: str
mobilizon_id: UUID
last_update_time: arrow.Arrow
thumbnail_link: Optional[str] = None
location: Optional[str] = None
publication_time: Optional[dict[str, arrow.Arrow]] = None
status: _EventPublicationStatus = _EventPublicationStatus.WAITING
def __post_init__(self):
assert self.begin_datetime.tzinfo == self.end_datetime.tzinfo
assert self.begin_datetime < self.end_datetime
if self.publication_time is None:
self.publication_time = {}
if self.publication_time:
assert self.status in [
_EventPublicationStatus.COMPLETED,
_EventPublicationStatus.PARTIAL,
_EventPublicationStatus.FAILED,
]
def _fill_template(self, pattern: Template) -> str:
config = get_settings()
return pattern.render(locale=config["locale"], **asdict(self))
def format(self, pattern: Template) -> str:
return self._fill_template(pattern)
@classmethod
def from_model(cls, event: Event):
publication_status = _compute_event_status(list(event.publications))
publication_time = {}
for pub in event.publications:
if publication_status != _EventPublicationStatus.WAITING:
assert pub.timestamp is not None
publication_time[pub.publisher.name] = arrow.get(pub.timestamp).to(
"local"
)
return cls(
name=event.name,
description=event.description,
begin_datetime=arrow.get(event.begin_datetime).to("local"),
end_datetime=arrow.get(event.end_datetime).to("local"),
mobilizon_link=event.mobilizon_link,
mobilizon_id=event.mobilizon_id,
thumbnail_link=event.thumbnail_link,
location=event.location,
publication_time=publication_time,
status=publication_status,
last_update_time=arrow.get(event.last_update_time).to("local"),
)
def to_model(self, db_id: Optional[UUID] = None) -> Event:
kwargs = {
"name": self.name,
"description": self.description,
"mobilizon_id": self.mobilizon_id,
"mobilizon_link": self.mobilizon_link,
"thumbnail_link": self.thumbnail_link,
"location": self.location,
"begin_datetime": self.begin_datetime.astimezone(
self.begin_datetime.tzinfo
),
"end_datetime": self.end_datetime.astimezone(self.end_datetime.tzinfo),
"last_update_time": self.last_update_time.astimezone(
self.last_update_time.tzinfo
),
}
if db_id is not None:
kwargs.update({"id": db_id})
return Event(**kwargs)
@classmethod
async def retrieve(cls, mobilizon_id):
return cls.from_model(await get_event(mobilizon_id))
async def get_all_mobilizon_events(
from_date: Optional[Arrow] = None, to_date: Optional[Arrow] = None,
) -> list[_MobilizonEvent]:
return [_MobilizonEvent.from_model(event) for event in await get_all_events(from_date, to_date)]
async def get_published_events(
from_date: Optional[Arrow] = None, to_date: Optional[Arrow] = None
) -> Iterable[_MobilizonEvent]:
"""
Retrieves events that are not waiting. Function could be renamed to something more fitting
:return:
"""
return await get_mobilizon_events_with_status(
[
_EventPublicationStatus.COMPLETED,
_EventPublicationStatus.PARTIAL,
_EventPublicationStatus.FAILED,
],
from_date=from_date,
to_date=to_date,
)
async def get_mobilizon_events_with_status(
status: list[_EventPublicationStatus],
from_date: Optional[Arrow] = None,
to_date: Optional[Arrow] = None,
) -> Iterable[_MobilizonEvent]:
def _filter_event_with_status(event: Event) -> bool:
# This computes the status client-side instead of running in the DB. It shouldn't pose a performance problem
# in the short term, but should be moved to the query if possible.
event_status = _compute_event_status(list(event.publications))
return event_status in status
return map(
_MobilizonEvent.from_model,
filter(_filter_event_with_status, await get_all_events(from_date, to_date)),
)
async def get_mobilizon_events_without_publications(
from_date: Optional[Arrow] = None, to_date: Optional[Arrow] = None,
) -> list[_MobilizonEvent]:
return [
_MobilizonEvent.from_model(event)
for event in await get_events_without_publications(
from_date=from_date, to_date=to_date
)
]
async def get_mobilizon_event_by_id(
event_id: UUID,
) -> _MobilizonEvent:
event = await get_event(event_id)
return _MobilizonEvent.from_model(event)


@ -0,0 +1,27 @@
from enum import IntEnum
from mobilizon_reshare.models.publication import Publication, PublicationStatus
class _EventPublicationStatus(IntEnum):
WAITING = 1
FAILED = 2
COMPLETED = 3
PARTIAL = 4
def _compute_event_status(publications: list[Publication],) -> _EventPublicationStatus:
if not publications:
return _EventPublicationStatus.WAITING
unique_statuses: set[PublicationStatus] = set(pub.status for pub in publications)
if unique_statuses == {
PublicationStatus.COMPLETED,
PublicationStatus.FAILED,
}:
return _EventPublicationStatus.PARTIAL
elif len(unique_statuses) == 1:
return _EventPublicationStatus[unique_statuses.pop().name]
raise ValueError(f"Illegal combination of PublicationStatus: {unique_statuses}")


@ -0,0 +1,77 @@
from dataclasses import dataclass
from functools import partial
from typing import List, Iterator
from uuid import UUID
from tortoise.transactions import atomic
from mobilizon_reshare.dataclasses.event import _MobilizonEvent
from mobilizon_reshare.models.publication import Publication
from mobilizon_reshare.publishers.abstract import (
AbstractPlatform,
AbstractEventFormatter,
)
from mobilizon_reshare.storage.query.read import (
get_event,
prefetch_publication_relations,
)
@dataclass
class BasePublication:
publisher: AbstractPlatform
formatter: AbstractEventFormatter
@dataclass
class _EventPublication(BasePublication):
event: _MobilizonEvent
id: UUID
@classmethod
def from_orm(cls, model: Publication, event: _MobilizonEvent):
# imported here to avoid circular dependencies
from mobilizon_reshare.publishers.platforms.platform_mapping import (
get_publisher_class,
get_formatter_class,
)
publisher = get_publisher_class(model.publisher.name)()
formatter = get_formatter_class(model.publisher.name)()
return cls(publisher, formatter, event, model.id,)
@classmethod
async def retrieve(cls, publication_id):
publication = await prefetch_publication_relations(
Publication.get(id=publication_id)
)
event = _MobilizonEvent.from_model(publication.event)
return cls.from_orm(publication, event)
@dataclass
class RecapPublication(BasePublication):
events: List[_MobilizonEvent]
@dataclass
class _PublicationNotification(BasePublication):
publication: _EventPublication
@atomic()
async def build_publications_for_event(
event: _MobilizonEvent, publishers: Iterator[str]
) -> list[_EventPublication]:
publication_models = await event.to_model().build_publications(publishers)
return [_EventPublication.from_orm(m, event) for m in publication_models]
async def get_failed_publications_for_event(
event: _MobilizonEvent,
) -> List[_EventPublication]:
event_model = await get_event(event.mobilizon_id)
failed_publications = await event_model.get_failed_publications()
return list(
map(partial(_EventPublication.from_orm, event=event), failed_publications)
)

View File

@ -1,107 +0,0 @@
from dataclasses import dataclass, asdict
from enum import IntEnum
from typing import Optional, Set
from uuid import UUID
import arrow
import tortoise.timezone
from jinja2 import Template
from mobilizon_reshare.models.event import Event
from mobilizon_reshare.models.publication import PublicationStatus, Publication
class EventPublicationStatus(IntEnum):
WAITING = 1
FAILED = 2
COMPLETED = 3
PARTIAL = 4
@dataclass
class MobilizonEvent:
"""Class representing an event retrieved from Mobilizon."""
name: str
description: Optional[str]
begin_datetime: arrow.Arrow
end_datetime: arrow.Arrow
mobilizon_link: str
mobilizon_id: UUID
thumbnail_link: Optional[str] = None
location: Optional[str] = None
publication_time: Optional[dict[str, arrow.Arrow]] = None
status: EventPublicationStatus = EventPublicationStatus.WAITING
def __post_init__(self):
assert self.begin_datetime.tzinfo == self.end_datetime.tzinfo
assert self.begin_datetime < self.end_datetime
if self.publication_time:
assert self.status in [
EventPublicationStatus.COMPLETED,
EventPublicationStatus.PARTIAL,
EventPublicationStatus.FAILED,
]
def _fill_template(self, pattern: Template) -> str:
return pattern.render(**asdict(self))
def format(self, pattern: Template) -> str:
return self._fill_template(pattern)
def to_model(self) -> Event:
return Event(
name=self.name,
description=self.description,
mobilizon_id=self.mobilizon_id,
mobilizon_link=self.mobilizon_link,
thumbnail_link=self.thumbnail_link,
location=self.location,
begin_datetime=self.begin_datetime.astimezone(self.begin_datetime.tzinfo),
end_datetime=self.end_datetime.astimezone(self.end_datetime.tzinfo),
)
@staticmethod
def compute_status(publications: list[Publication]) -> EventPublicationStatus:
unique_statuses: Set[PublicationStatus] = set(
pub.status for pub in publications
)
if PublicationStatus.FAILED in unique_statuses:
return EventPublicationStatus.FAILED
elif unique_statuses == {
PublicationStatus.COMPLETED,
PublicationStatus.WAITING,
}:
return EventPublicationStatus.PARTIAL
elif len(unique_statuses) == 1:
return EventPublicationStatus[unique_statuses.pop().name]
raise ValueError(f"Illegal combination of PublicationStatus: {unique_statuses}")
@staticmethod
def from_model(event: Event, tz: str = "UTC"):
publication_status = MobilizonEvent.compute_status(list(event.publications))
return MobilizonEvent(
name=event.name,
description=event.description,
begin_datetime=arrow.get(
tortoise.timezone.localtime(value=event.begin_datetime, timezone=tz)
).to("local"),
end_datetime=arrow.get(
tortoise.timezone.localtime(value=event.end_datetime, timezone=tz)
).to("local"),
mobilizon_link=event.mobilizon_link,
mobilizon_id=event.mobilizon_id,
thumbnail_link=event.thumbnail_link,
location=event.location,
publication_time={
pub.publisher.name: arrow.get(
tortoise.timezone.localtime(value=pub.timestamp, timezone=tz)
).to("local")
for pub in event.publications
}
if publication_status != PublicationStatus.WAITING
else None,
status=publication_status,
)

View File

@ -5,7 +5,7 @@ from typing import List, Optional
import arrow
from mobilizon_reshare.config.config import get_settings
from mobilizon_reshare.event.event import MobilizonEvent
from mobilizon_reshare.dataclasses import MobilizonEvent
logger = logging.getLogger(__name__)
@ -16,27 +16,17 @@ class EventSelectionStrategy(ABC):
published_events: List[MobilizonEvent],
unpublished_events: List[MobilizonEvent],
) -> Optional[MobilizonEvent]:
if not self.is_in_publishing_window():
return None
return self._select(published_events, unpublished_events)
def is_in_publishing_window(self) -> bool:
settings = get_settings()
window_beginning = settings["publishing"]["window"]["begin"]
window_end = settings["publishing"]["window"]["end"]
now_hour = arrow.now().datetime.hour
if window_beginning <= window_end:
return window_beginning <= now_hour < window_end
else:
return now_hour >= window_beginning or now_hour < window_end
selected = self._select(published_events, unpublished_events)
if selected:
return selected[0]
return None
@abstractmethod
def _select(
self,
published_events: List[MobilizonEvent],
unpublished_events: List[MobilizonEvent],
) -> Optional[MobilizonEvent]:
) -> Optional[List[MobilizonEvent]]:
pass
@ -45,21 +35,19 @@ class SelectNextEventStrategy(EventSelectionStrategy):
self,
published_events: List[MobilizonEvent],
unpublished_events: List[MobilizonEvent],
) -> Optional[MobilizonEvent]:
) -> Optional[List[MobilizonEvent]]:
# if there are no unpublished events, there's nothing I can do
if not unpublished_events:
logger.debug("No event to publish.")
return None
first_unpublished_event = unpublished_events[0]
return []
# if there's no published event (first execution) I return the next in queue
if not published_events:
logger.debug(
"First Execution with an available event. Picking next event in the queue."
)
return first_unpublished_event
return unpublished_events
last_published_event = published_events[-1]
now = arrow.now()
@ -84,21 +72,31 @@ class SelectNextEventStrategy(EventSelectionStrategy):
logger.debug(
"Last event was published recently. No event is going to be published."
)
return None
return []
return first_unpublished_event
return unpublished_events
STRATEGY_NAME_TO_STRATEGY_CLASS = {"next_event": SelectNextEventStrategy}
def select_event_to_publish(
published_events: List[MobilizonEvent],
unpublished_events: List[MobilizonEvent],
def select_unpublished_events(
published_events: List[MobilizonEvent], unpublished_events: List[MobilizonEvent],
):
strategy = STRATEGY_NAME_TO_STRATEGY_CLASS[
get_settings()["selection"]["strategy"]
]()
return strategy._select(published_events, unpublished_events)
def select_event_to_publish(
published_events: List[MobilizonEvent], unpublished_events: List[MobilizonEvent],
) -> Optional[MobilizonEvent]:
strategy = STRATEGY_NAME_TO_STRATEGY_CLASS[
get_settings()["selection"]["strategy"]
]()
return strategy.select(published_events, unpublished_events)
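A hedged sketch (not from this changeset) of the split introduced here, assuming SelectNextEventStrategy lives in mobilizon_reshare.event.event_selection_strategies alongside select_event_to_publish: _select now returns the full ordered queue, while the public select wrapper keeps the old single-event contract.

from mobilizon_reshare.event.event_selection_strategies import SelectNextEventStrategy

strategy = SelectNextEventStrategy()
# with nothing unpublished, the queue is empty and the wrapper yields no event
assert strategy._select(published_events=[], unpublished_events=[]) == []
assert strategy.select(published_events=[], unpublished_events=[]) is None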

View File

@ -1,28 +1,28 @@
from typing import List
from bs4 import BeautifulSoup, Tag
import markdownify
def get_bottom_paragraphs(soup: BeautifulSoup) -> List[Tag]:
def get_bottom_paragraphs(soup: BeautifulSoup) -> list[Tag]:
return [d for d in soup.findAll("p") if not d.find("p")]
def html_to_plaintext(content):
def html_to_plaintext(content) -> str:
"""
Transform a HTML in a plaintext sting that can be more easily processed by the publishers.
Transform a HTML in a plaintext string that can be more easily processed by the publishers.
:param content:
:return:
"""
# TODO: support links and quotes
soup = BeautifulSoup(content)
return "\n".join(
" ".join(tag.stripped_strings) for tag in get_bottom_paragraphs(soup)
)
soup = BeautifulSoup(content, features="html.parser")
p_list = get_bottom_paragraphs(soup)
if p_list:
return "\n".join(" ".join(tag.stripped_strings) for tag in p_list)
return soup.text
def html_to_markdown(content):
def html_to_markdown(content) -> str:
markdown = markdownify.markdownify(content)
escaped_markdown = markdown.replace(">", "\\>")
return escaped_markdown.strip()
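A hedged usage sketch (not from this changeset) of the two helpers, both assumed importable from mobilizon_reshare.formatting.description as in the imports elsewhere in this diff; the exact markdownify output may differ slightly:

from mobilizon_reshare.formatting.description import html_to_plaintext, html_to_markdown

html = "<p>Dinner <b>and</b> talk</p><p>Bring a friend</p>"
print(html_to_plaintext(html))  # -> "Dinner and talk\nBring a friend"
print(html_to_markdown(html))   # roughly: "Dinner **and** talk\n\nBring a friend"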

View File

@ -0,0 +1,99 @@
import logging.config
from typing import Optional, Iterator
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.dataclasses.event import (
get_published_events,
get_mobilizon_events_without_publications,
get_mobilizon_event_by_id,
)
from mobilizon_reshare.dataclasses.publication import (
_EventPublication,
build_publications_for_event,
)
from mobilizon_reshare.event.event_selection_strategies import select_event_to_publish
from mobilizon_reshare.publishers import get_active_publishers
from mobilizon_reshare.publishers.coordinators.event_publishing.dry_run import (
DryRunPublisherCoordinator,
)
from mobilizon_reshare.publishers.coordinators.event_publishing.notify import (
PublicationFailureNotifiersCoordinator,
)
from mobilizon_reshare.publishers.coordinators.event_publishing.publish import (
PublisherCoordinatorReport,
PublisherCoordinator,
)
from mobilizon_reshare.storage.query.write import (
save_publication_report,
save_notification_report,
)
logger = logging.getLogger(__name__)
async def publish_publications(
publications: list[_EventPublication],
) -> PublisherCoordinatorReport:
publishers_report = PublisherCoordinator(publications).run()
await save_publication_report(publishers_report)
for publication_report in publishers_report.reports:
if not publication_report.successful:
notifiers_report = PublicationFailureNotifiersCoordinator(publication_report,).notify_failure()
if notifiers_report:
await save_notification_report(notifiers_report)
return publishers_report
def perform_dry_run(publications: list[_EventPublication]):
return DryRunPublisherCoordinator(publications).run()
async def publish_event(
event: MobilizonEvent,
command_config: CommandConfig,
publishers: Optional[Iterator[str]] = None,
) -> PublisherCoordinatorReport:
logger.info(f"Event to publish found: {event.name}")
if not (publishers and all(publishers)):
publishers = get_active_publishers()
publications = await build_publications_for_event(event, publishers)
if command_config.dry_run:
logger.info("Executing in dry run mode. No event is going to be published.")
return perform_dry_run(publications)
else:
return await publish_publications(publications)
async def publish_by_mobilizon_id(
event_mobilizon_id,
command_config: CommandConfig,
publishers: Optional[Iterator[str]] = None,
):
event = await get_mobilizon_event_by_id(event_mobilizon_id)
return await publish_event(event, command_config, publishers)
async def select_and_publish(
command_config: CommandConfig,
unpublished_events: Optional[list[MobilizonEvent]] = None,
) -> Optional[PublisherCoordinatorReport]:
"""
STUB
:return:
"""
if unpublished_events is None:
unpublished_events = await get_mobilizon_events_without_publications()
event = select_event_to_publish(
list(await get_published_events()), unpublished_events,
)
if event:
return await publish_event(event, command_config)
else:
logger.info("No event to publish found")

View File

@ -0,0 +1,22 @@
import logging.config
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.mobilizon.events import get_mobilizon_future_events
from mobilizon_reshare.storage.query.write import create_unpublished_events
logger = logging.getLogger(__name__)
async def pull() -> list[MobilizonEvent]:
"""
Fetches the latest events from Mobilizon and stores them.
:return:
"""
# Pull future events from Mobilizon
future_events = get_mobilizon_future_events()
logger.info(f"Pulled {len(future_events)} events from Mobilizon.")
# Store in the DB only the ones we didn't know about
events = await create_unpublished_events(future_events)
logger.debug(f"There are now {len(events)} unpublished events.")
return events

View File

@ -0,0 +1,62 @@
import logging
from typing import Optional, List
from arrow import now
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.dataclasses import EventPublicationStatus
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.dataclasses.event import get_mobilizon_events_with_status
from mobilizon_reshare.dataclasses.publication import RecapPublication
from mobilizon_reshare.publishers import get_active_publishers
from mobilizon_reshare.publishers.coordinators import BaseCoordinatorReport
from mobilizon_reshare.publishers.coordinators.event_publishing.notify import (
PublicationFailureNotifiersCoordinator,
)
from mobilizon_reshare.publishers.coordinators.recap_publishing.dry_run import (
DryRunRecapCoordinator,
)
from mobilizon_reshare.publishers.coordinators.recap_publishing.recap import (
RecapCoordinator,
)
from mobilizon_reshare.publishers.platforms.platform_mapping import (
get_publisher_class,
get_formatter_class,
)
logger = logging.getLogger(__name__)
async def select_events_to_recap() -> List[MobilizonEvent]:
return list(
await get_mobilizon_events_with_status(
status=[EventPublicationStatus.COMPLETED], from_date=now()
)
)
async def recap(command_config: CommandConfig) -> Optional[BaseCoordinatorReport]:
# I want to recap only the events that have been successfully published and that haven't happened yet
events_to_recap = await select_events_to_recap()
if events_to_recap:
logger.info(f"Found {len(events_to_recap)} events to recap.")
recap_publications = [
RecapPublication(
get_publisher_class(publisher)(),
get_formatter_class(publisher)(),
events_to_recap,
)
for publisher in get_active_publishers()
]
if command_config.dry_run:
reports = DryRunRecapCoordinator(recap_publications).run()
else:
reports = RecapCoordinator(recap_publications).run()
for report in reports.reports:
if report.status == EventPublicationStatus.FAILED:
PublicationFailureNotifiersCoordinator(report).notify_failure()
return reports
else:
logger.info("Found no events")

View File

@ -0,0 +1,52 @@
import logging
from typing import Optional
from uuid import UUID
from tortoise.exceptions import DoesNotExist
from mobilizon_reshare.dataclasses import MobilizonEvent, EventPublication
from mobilizon_reshare.dataclasses.publication import get_failed_publications_for_event
from mobilizon_reshare.main.publish import publish_publications
from mobilizon_reshare.publishers.coordinators.event_publishing.publish import (
PublisherCoordinatorReport,
)
from mobilizon_reshare.storage.query.exceptions import EventNotFound
logger = logging.getLogger(__name__)
async def retry_event_publications(event_id) -> Optional[PublisherCoordinatorReport]:
event = await MobilizonEvent.retrieve(event_id)
failed_publications = await get_failed_publications_for_event(event)
if not failed_publications:
logger.info("No failed publications found.")
return
logger.info(f"Found {len(failed_publications)} publications.")
return await publish_publications(failed_publications)
async def retry_publication(publication_id) -> Optional[PublisherCoordinatorReport]:
try:
publication = await EventPublication.retrieve(publication_id)
except DoesNotExist:
logger.info(f"Publication {publication_id} not found.")
return
logger.info(f"Publication {publication_id} found.")
return await publish_publications([publication])
async def retry_event(
mobilizon_event_id: UUID = None,
) -> Optional[PublisherCoordinatorReport]:
if mobilizon_event_id is None:
raise NotImplementedError(
"Autonomous retry not implemented yet, please specify an event_id"
)
try:
return await retry_event_publications(mobilizon_event_id)
except EventNotFound as e:
logger.debug(e, exc_info=True)
logger.error(f"Event with id {mobilizon_event_id} not found")
return

View File

@ -0,0 +1,19 @@
import logging.config
from mobilizon_reshare.config.command import CommandConfig
from mobilizon_reshare.main.publish import select_and_publish
from mobilizon_reshare.main.pull import pull
from mobilizon_reshare.publishers.coordinators.event_publishing.publish import (
PublisherCoordinatorReport,
)
logger = logging.getLogger(__name__)
async def start(command_config: CommandConfig) -> PublisherCoordinatorReport:
"""
STUB
:return:
"""
events = await pull()
return await select_and_publish(command_config, events,)

View File

@ -0,0 +1,41 @@
-- upgrade --
CREATE TABLE IF NOT EXISTS "event" (
"id" UUID NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"description" TEXT,
"mobilizon_id" UUID NOT NULL,
"mobilizon_link" TEXT NOT NULL,
"thumbnail_link" TEXT,
"location" TEXT,
"begin_datetime" TIMESTAMPTZ NOT NULL,
"end_datetime" TIMESTAMPTZ NOT NULL,
"last_update_time" TIMESTAMPTZ NOT NULL
);
CREATE TABLE IF NOT EXISTS "publisher" (
"id" UUID NOT NULL PRIMARY KEY,
"name" VARCHAR(256) NOT NULL,
"account_ref" TEXT
);
CREATE TABLE IF NOT EXISTS "publication" (
"id" UUID NOT NULL PRIMARY KEY,
"status" SMALLINT NOT NULL,
"timestamp" TIMESTAMPTZ NOT NULL,
"reason" TEXT,
"event_id" UUID NOT NULL REFERENCES "event" ("id") ON DELETE CASCADE,
"publisher_id" UUID NOT NULL REFERENCES "publisher" ("id") ON DELETE CASCADE
);
COMMENT ON COLUMN "publication"."status" IS 'FAILED: 0\nCOMPLETED: 1';
CREATE TABLE IF NOT EXISTS "notification" (
"id" UUID NOT NULL PRIMARY KEY,
"status" SMALLINT NOT NULL,
"message" TEXT NOT NULL,
"publication_id" UUID REFERENCES "publication" ("id") ON DELETE CASCADE,
"target_id" UUID REFERENCES "publisher" ("id") ON DELETE CASCADE
);
COMMENT ON COLUMN "notification"."status" IS 'WAITING: 1\nFAILED: 2\nPARTIAL: 3\nCOMPLETED: 4';
CREATE TABLE IF NOT EXISTS "aerich" (
"id" SERIAL NOT NULL PRIMARY KEY,
"version" VARCHAR(255) NOT NULL,
"app" VARCHAR(100) NOT NULL,
"content" JSONB NOT NULL
);

View File

@ -0,0 +1,4 @@
[tool.aerich]
tortoise_orm = "mobilizon_reshare.storage.db.TORTOISE_ORM"
location = "./"
src_folder = "./."

View File

@ -0,0 +1,38 @@
-- upgrade --
CREATE TABLE IF NOT EXISTS "event" (
"id" CHAR(36) NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"description" TEXT,
"mobilizon_id" CHAR(36) NOT NULL,
"mobilizon_link" TEXT NOT NULL,
"thumbnail_link" TEXT,
"location" TEXT,
"begin_datetime" TIMESTAMP NOT NULL,
"end_datetime" TIMESTAMP NOT NULL
);
CREATE TABLE IF NOT EXISTS "publisher" (
"id" CHAR(36) NOT NULL PRIMARY KEY,
"name" VARCHAR(256) NOT NULL,
"account_ref" TEXT
);
CREATE TABLE IF NOT EXISTS "publication" (
"id" CHAR(36) NOT NULL PRIMARY KEY,
"status" SMALLINT NOT NULL /* FAILED: 0\nCOMPLETED: 1 */,
"timestamp" TIMESTAMP,
"reason" TEXT,
"event_id" CHAR(36) NOT NULL REFERENCES "event" ("id") ON DELETE CASCADE,
"publisher_id" CHAR(36) NOT NULL REFERENCES "publisher" ("id") ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS "notification" (
"id" CHAR(36) NOT NULL PRIMARY KEY,
"status" SMALLINT NOT NULL /* WAITING: 1\nFAILED: 2\nPARTIAL: 3\nCOMPLETED: 4 */,
"message" TEXT NOT NULL,
"publication_id" CHAR(36) REFERENCES "publication" ("id") ON DELETE CASCADE,
"target_id" CHAR(36) REFERENCES "publisher" ("id") ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS "aerich" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
"version" VARCHAR(255) NOT NULL,
"app" VARCHAR(20) NOT NULL,
"content" JSON NOT NULL
);

View File

@ -0,0 +1,4 @@
-- upgrade --
ALTER TABLE "event" ADD "last_update_time" TIMESTAMP NOT NULL DEFAULT 0;
-- downgrade --
ALTER TABLE "event" DROP COLUMN "last_update_time";

View File

@ -0,0 +1,4 @@
[tool.aerich]
tortoise_orm = "mobilizon_reshare.storage.db.TORTOISE_ORM"
location = "."
src_folder = "./."

View File

@ -8,7 +8,7 @@ import arrow
import requests
from mobilizon_reshare.config.config import get_settings
from mobilizon_reshare.event.event import MobilizonEvent, EventPublicationStatus
from mobilizon_reshare.dataclasses import MobilizonEvent, _EventPublicationStatus
logger = logging.getLogger(__name__)
@ -24,8 +24,8 @@ def parse_location(data):
return f"{addr['description']}, {addr['locality']}, {addr['region']}"
elif "onlineAddress" in data and data["onlineAddress"]:
return data["onlineAddress"]
else:
return None
return None
def parse_picture(data):
@ -43,7 +43,8 @@ def parse_event(data):
thumbnail_link=parse_picture(data),
location=parse_location(data),
publication_time=None,
status=EventPublicationStatus.WAITING,
status=_EventPublicationStatus.WAITING,
last_update_time=arrow.get(data["updatedAt"]) if "updatedAt" in data else None,
)
@ -53,6 +54,7 @@ query_future_events = """{{
elements {{
title,
url,
updatedAt,
beginsOn,
endsOn,
options {{
@ -76,20 +78,6 @@ query_future_events = """{{
}}"""
def get_unpublished_events(published_events: List[MobilizonEvent]):
# I take all the future events
future_events = get_mobilizon_future_events()
# I get the ids of all the published events coming from the DB
published_events_id = set(map(lambda x: x.mobilizon_id, published_events))
# I keep the future events only the ones that haven't been published
# Note: some events might exist in the DB and be unpublished. Here they should be ignored because the information
# in the DB might be old and the event might have been updated.
# We assume the published_events list doesn't contain such events.
return list(
filter(lambda x: x.mobilizon_id not in published_events_id, future_events)
)
def get_mobilizon_future_events(
page: int = 1, from_date: Optional[arrow.Arrow] = None
) -> List[MobilizonEvent]:

View File

@ -0,0 +1,7 @@
from tortoise.contrib.pydantic import pydantic_model_creator
class WithPydantic:
@classmethod
def to_pydantic(cls):
return pydantic_model_creator(cls)

View File

@ -1,8 +1,15 @@
from typing import Iterator
from tortoise import fields
from tortoise.models import Model
from tortoise.transactions import atomic
from mobilizon_reshare.models import WithPydantic
from mobilizon_reshare.models.publication import PublicationStatus, Publication
from mobilizon_reshare.models.publisher import Publisher
class Event(Model):
class Event(Model, WithPydantic):
id = fields.UUIDField(pk=True)
name = fields.TextField()
description = fields.TextField(null=True)
@ -15,14 +22,40 @@ class Event(Model):
begin_datetime = fields.DatetimeField()
end_datetime = fields.DatetimeField()
last_update_time = fields.DatetimeField()
publications: fields.ReverseRelation["Publication"]
def __str__(self):
return self.name
return str(self.name)
def __repr__(self):
return f"{self.id} - {self.name}"
class Meta:
table = "event"
async def build_publication_by_publisher_name(
self, publisher_name: str, status: PublicationStatus = PublicationStatus.FAILED
) -> Publication:
publisher = await Publisher.filter(name=publisher_name).first()
return Publication(
status=status,
event_id=self.id,
publisher_id=publisher.id,
publisher=publisher,
)
async def build_publications(self, publishers: Iterator[str]):
return [
await self.build_publication_by_publisher_name(name) for name in publishers
]
@atomic()
async def get_failed_publications(self,) -> list[Publication]:
return list(
filter(
lambda publications: publications.status == PublicationStatus.FAILED,
self.publications,
)
)

View File

@ -5,10 +5,8 @@ from tortoise.models import Model
class NotificationStatus(IntEnum):
WAITING = 1
FAILED = 2
PARTIAL = 3
COMPLETED = 4
FAILED = 0
COMPLETED = 1
class Notification(Model):
@ -17,9 +15,7 @@ class Notification(Model):
message = fields.TextField()
target = fields.ForeignKeyField(
"models.Publisher", related_name="notifications", null=True
)
target = fields.ForeignKeyField("models.Publisher", null=True, related_name=False,)
publication = fields.ForeignKeyField(
"models.Publication", related_name="notifications", null=True

View File

@ -3,20 +3,19 @@ from enum import IntEnum
from tortoise import fields
from tortoise.models import Model
from mobilizon_reshare.models import WithPydantic
class PublicationStatus(IntEnum):
WAITING = 1
FAILED = 2
COMPLETED = 3
FAILED = 0
COMPLETED = 1
class Publication(Model):
class Publication(Model, WithPydantic):
id = fields.UUIDField(pk=True)
status = fields.IntEnumField(PublicationStatus)
# When a Publication's status is WAITING
# we don't need a timestamp nor a reason
timestamp = fields.DatetimeField(null=True)
timestamp = fields.DatetimeField()
reason = fields.TextField(null=True)
event = fields.ForeignKeyField("models.Event", related_name="publications")

View File

@ -1,11 +1,13 @@
from typing import Iterator
import mobilizon_reshare.config.notifiers
import mobilizon_reshare.config.publishers
from mobilizon_reshare.config.config import get_settings
def get_active_publishers():
def get_active_publishers() -> Iterator[str]:
return mobilizon_reshare.config.publishers.get_active_publishers(get_settings())
def get_active_notifiers():
def get_active_notifiers() -> Iterator[str]:
return mobilizon_reshare.config.notifiers.get_active_notifiers(get_settings())

View File

@ -1,66 +1,22 @@
import importlib
import inspect
import logging
from abc import ABC, abstractmethod
from typing import Optional
from dynaconf.utils.boxing import DynaBox
from jinja2 import Environment, FileSystemLoader, Template
from requests import Response
from mobilizon_reshare.config.config import get_settings
from mobilizon_reshare.event.event import MobilizonEvent
from .exceptions import PublisherError, InvalidAttribute
from .exceptions import InvalidAttribute
from ..dataclasses import _MobilizonEvent
JINJA_ENV = Environment(loader=FileSystemLoader("/"))
logger = logging.getLogger(__name__)
class AbstractNotifier(ABC):
"""
Generic notifier class.
Shall be inherited from specific subclasses that will manage validation
process for messages and credentials, text formatting, posting, etc.
Attributes:
- ``message``: a formatted ``str``
"""
# Non-abstract subclasses should define ``_conf`` as a 2-tuple, where the
# first element is the type of class (either 'notifier' or 'publisher') and
# the second the name of its service (ie: 'facebook', 'telegram')
_conf = tuple()
def __repr__(self):
return type(self).__name__
__str__ = __repr__
@property
def conf(self) -> DynaBox:
"""
Retrieves class's settings.
"""
cls = type(self)
if cls in (AbstractPublisher, AbstractNotifier):
raise InvalidAttribute(
"Abstract classes cannot access notifiers/publishers' settings"
)
try:
t, n = cls._conf or tuple()
return get_settings()[t][n]
except (KeyError, ValueError):
raise InvalidAttribute(
f"Class {cls.__name__} has invalid ``_conf`` attribute"
f" (should be 2-tuple)"
)
@abstractmethod
def send(self, message):
"""
Sends a message to the target channel
"""
raise NotImplementedError
class LoggerMixin:
def _log_debug(self, msg, *args, **kwargs):
self.__log(logging.DEBUG, msg, *args, **kwargs)
@ -76,18 +32,69 @@ class AbstractNotifier(ABC):
def _log_critical(self, msg, *args, **kwargs):
self.__log(logging.CRITICAL, msg, *args, **kwargs)
def __log(self, level, msg, raise_error: PublisherError = None, *args, **kwargs):
def __log(self, level, msg, raise_error: type = None, *args, **kwargs):
method = inspect.currentframe().f_back.f_back.f_code.co_name
logger.log(level, f"{self}.{method}(): {msg}", *args, **kwargs)
if raise_error is not None:
raise raise_error(msg)
def are_credentials_valid(self) -> bool:
class ConfLoaderMixin:
_conf = tuple()
@property
def conf(self) -> DynaBox:
"""
Retrieves class's settings.
"""
cls = type(self)
try:
self.validate_credentials()
except PublisherError:
return False
return True
t, n = cls._conf or tuple()
return get_settings()[t][n]
except (KeyError, ValueError):
raise InvalidAttribute(
f"Class {cls.__name__} has invalid ``_conf`` attribute"
f" (should be 2-tuple)"
)
class AbstractPlatform(ABC, LoggerMixin, ConfLoaderMixin):
"""
Generic notifier class.
Shall be inherited from specific subclasses that will manage validation
process for messages and credentials, text formatting, posting, etc.
Attributes:
- ``message``: a formatted ``str``
"""
# Non-abstract subclasses should define ``_conf`` as a 2-tuple, where the
# first element is the type of class (either 'notifier' or 'publisher') and
# the second the name of its service (ie: 'facebook', 'telegram')
def __repr__(self):
return self.name
@property
@abstractmethod
def name(self):
pass
@abstractmethod
def _send(self, message: str, event: Optional[_MobilizonEvent] = None):
raise NotImplementedError # pragma: no cover
def send(self, message: str, event: Optional[_MobilizonEvent] = None):
"""
Sends a message to the target channel
"""
response = self._send(message, event)
self._validate_response(response)
@abstractmethod
def _validate_response(self, response):
raise NotImplementedError # pragma: no cover
@abstractmethod
def validate_credentials(self) -> None:
@ -96,99 +103,100 @@ class AbstractNotifier(ABC):
Should raise ``PublisherError`` (or one of its subclasses) if
credentials are not valid.
"""
raise NotImplementedError
raise NotImplementedError # pragma: no cover
class AbstractEventFormatter(LoggerMixin, ConfLoaderMixin):
@abstractmethod
def publish(self) -> None:
"""
Publishes the actual post on social media.
Should raise ``PublisherError`` (or one of its subclasses) if
anything goes wrong.
"""
raise NotImplementedError
def is_message_valid(self) -> bool:
try:
self.validate_message()
except PublisherError:
return False
return True
@abstractmethod
def validate_message(self) -> None:
"""
Validates notifier's message.
Should raise ``PublisherError`` (or one of its subclasses) if message
is not valid.
"""
raise NotImplementedError
class AbstractPublisher(AbstractNotifier):
"""
Generic publisher class.
Shall be inherited from specific subclasses that will manage validation
process for events and credentials, text formatting, posting, etc.
Attributes:
- ``event``: a ``MobilizonEvent`` containing every useful info from
the event
- ``message``: a formatted ``str``
"""
_conf = tuple()
def __init__(self, event: MobilizonEvent):
self.event = event
super().__init__()
def is_event_valid(self) -> bool:
try:
self.validate_event()
except PublisherError:
return False
return True
@abstractmethod
def validate_event(self) -> None:
def _validate_event(self, event: _MobilizonEvent) -> None:
"""
Validates publisher's event.
Should raise ``PublisherError`` (or one of its subclasses) if event
is not valid.
"""
raise NotImplementedError
raise NotImplementedError # pragma: no cover
def _preprocess_event(self):
@abstractmethod
def _validate_message(self, message: str) -> None:
"""
Validates notifier's message.
Should raise ``PublisherError`` (or one of its subclasses) if message
is not valid.
"""
raise NotImplementedError # pragma: no cover
def _get_name(self) -> str:
return self._conf[1]
def _get_template(self, configured_template, default_generator) -> Template:
if configured_template:
return JINJA_ENV.get_template(configured_template)
else:
template_ref = default_generator()
with importlib.resources.as_file(template_ref) as template_path:
return JINJA_ENV.get_template(template_path.as_posix())
def get_default_template_path(self, type=""):
return importlib.resources.files(
"mobilizon_reshare.publishers.templates"
) / f"{self._get_name()}{type}.tmpl.j2"
def get_default_recap_template_path(self):
return self.get_default_template_path(type="_recap")
def get_default_recap_header_template_path(self):
return self.get_default_template_path(type="_recap_header")
def validate_event(self, event: _MobilizonEvent) -> None:
self._validate_event(event)
self._validate_message(self.get_message_from_event(event))
@abstractmethod
def _preprocess_event(self, event):
"""
Allows publishers to preprocess events before feeding them to the template
"""
pass
return event
def get_message_from_event(self) -> str:
def get_message_from_event(self, event: _MobilizonEvent) -> str:
"""
Retrieves a message from the event itself.
"""
self._preprocess_event()
return self.event.format(self.get_message_template())
event = self._preprocess_event(event)
message = event.format(self.get_message_template())
message = self._preprocess_message(message)
return message
def get_message_template(self) -> Template:
"""
Retrieves publisher's message template.
"""
template_path = self.conf.msg_template_path or self.default_template_path
return JINJA_ENV.get_template(template_path)
return self._get_template(self.conf.msg_template_path, self.get_default_template_path)
@abstractmethod
def _send(self, message) -> Response:
pass
def get_recap_header(self) -> Template:
return self._get_template(
self.conf.recap_header_template_path,
self.get_default_recap_header_template_path
)
@abstractmethod
def _validate_response(self, response: Response) -> None:
pass
def send(self, message):
res = self._send(message)
self._validate_response(res)
def get_recap_fragment_template(self) -> Template:
return self._get_template(
self.conf.recap_template_path,
self.get_default_recap_template_path
)
def publish(self) -> None:
self.send(message=self.get_message_from_event())
def get_recap_fragment(self, event: _MobilizonEvent) -> str:
"""
Retrieves the fragment that describes a single event inside the event recap.
"""
event = self._preprocess_event(event)
return event.format(self.get_recap_fragment_template())
def _preprocess_message(self, message: str):
return message
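A hedged sketch (not from this changeset) of the minimum a concrete platform provides under the new split; "example" is a hypothetical service name, and a real platform also ships a matching AbstractEventFormatter plus a configuration section backing its _conf tuple:

from typing import Optional

from mobilizon_reshare.dataclasses import _MobilizonEvent
from mobilizon_reshare.publishers.abstract import AbstractPlatform

class ExamplePlatform(AbstractPlatform):
    _conf = ("publisher", "example")  # (class type, service name)
    name = "example"

    def _send(self, message: str, event: Optional[_MobilizonEvent] = None):
        print(message)                # a real platform performs the API call here

    def _validate_response(self, response) -> None:
        pass                          # raise InvalidResponse on a bad reply

    def validate_credentials(self) -> None:
        pass                          # raise InvalidCredentials when the check fails

ExamplePlatform().send("hello")       # runs _send, then _validate_response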

View File

@ -1,164 +0,0 @@
import logging
from dataclasses import dataclass, field
from uuid import UUID
from mobilizon_reshare.event.event import MobilizonEvent
from mobilizon_reshare.models.publication import Publication
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers import get_active_notifiers, get_active_publishers
from mobilizon_reshare.publishers.abstract import AbstractPublisher
from mobilizon_reshare.publishers.exceptions import PublisherError
from mobilizon_reshare.publishers.telegram import TelegramPublisher
from mobilizon_reshare.publishers.zulip import ZulipPublisher
logger = logging.getLogger(__name__)
name_to_publisher_class = {"telegram": TelegramPublisher, "zulip": ZulipPublisher}
class BuildPublisherMixin:
@staticmethod
def build_publishers(
event: MobilizonEvent, publisher_names
) -> dict[str, AbstractPublisher]:
return {
publisher_name: name_to_publisher_class[publisher_name](event)
for publisher_name in publisher_names
}
@dataclass
class PublicationReport:
status: PublicationStatus
reason: str
publication_id: UUID
@dataclass
class PublisherCoordinatorReport:
publishers: dict[UUID, AbstractPublisher]
reports: dict[UUID, PublicationReport] = field(default_factory={})
@property
def successful(self):
return all(
r.status == PublicationStatus.COMPLETED for r in self.reports.values()
)
class PublisherCoordinator(BuildPublisherMixin):
def __init__(self, event: MobilizonEvent, publications: dict[UUID, Publication]):
publishers = self.build_publishers(event, get_active_publishers())
self.publishers_by_publication_id = {
publication_id: publishers[publication.publisher.name]
for publication_id, publication in publications.items()
}
def run(self) -> PublisherCoordinatorReport:
errors = self._validate()
if errors:
return PublisherCoordinatorReport(
reports=errors, publishers=self.publishers_by_publication_id
)
return self._post()
def _make_successful_report(self, failed_ids):
return {
publication_id: PublicationReport(
status=PublicationStatus.COMPLETED,
reason="",
publication_id=publication_id,
)
for publication_id in self.publishers_by_publication_id
if publication_id not in failed_ids
}
def _post(self):
failed_publishers_reports = {}
for publication_id, p in self.publishers_by_publication_id.items():
try:
p.publish()
except PublisherError as e:
failed_publishers_reports[publication_id] = PublicationReport(
status=PublicationStatus.FAILED,
reason=str(e),
publication_id=publication_id,
)
reports = failed_publishers_reports | self._make_successful_report(
failed_publishers_reports.keys()
)
return PublisherCoordinatorReport(
publishers=self.publishers_by_publication_id, reports=reports
)
def _validate(self):
errors: dict[UUID, PublicationReport] = {}
for publication_id, p in self.publishers_by_publication_id.items():
reason = []
if not p.are_credentials_valid():
reason.append("Invalid credentials")
if not p.is_event_valid():
reason.append("Invalid event")
if not p.is_message_valid():
reason.append("Invalid message")
if len(reason) > 0:
errors[publication_id] = PublicationReport(
status=PublicationStatus.FAILED,
reason=", ".join(reason),
publication_id=publication_id,
)
return errors
@staticmethod
def get_formatted_message(event: MobilizonEvent, publisher: str) -> str:
"""
Returns the formatted message for a given event and publisher.
"""
if publisher not in name_to_publisher_class:
raise ValueError(
f"Publisher {publisher} does not exist.\nSupported publishers: "
f"{', '.join(list(name_to_publisher_class.keys()))}"
)
return name_to_publisher_class[publisher](event).get_message_from_event()
class AbstractNotifiersCoordinator(BuildPublisherMixin):
def __init__(self, event: MobilizonEvent):
self.event = event
self.notifiers = self.build_publishers(event, get_active_notifiers())
def send_to_all(self, message):
# TODO: failure to notify should fail safely and write to a dedicated log
for notifier in self.notifiers.values():
notifier.send(message)
class PublicationFailureNotifiersCoordinator(AbstractNotifiersCoordinator):
def __init__(
self,
event: MobilizonEvent,
publisher_coordinator_report: PublisherCoordinatorReport,
):
self.report = publisher_coordinator_report
super(PublicationFailureNotifiersCoordinator, self).__init__(event)
def build_failure_message(self, report: PublicationReport):
return (
f"Publication {report.publication_id} failed with status: {report.status}.\n"
f"Reason: {report.reason}"
)
def notify_failures(self):
for publication_id, report in self.report.reports.items():
logger.info(
f"Sending failure notifications for publication: {publication_id}"
)
if report.status == PublicationStatus.FAILED:
self.send_to_all(self.build_failure_message(report))

View File

@ -0,0 +1,32 @@
import logging
from dataclasses import dataclass
from typing import Optional, Sequence
from mobilizon_reshare.models.publication import PublicationStatus
@dataclass
class BasePublicationReport:
status: PublicationStatus
reason: Optional[str]
@property
def successful(self):
return self.status == PublicationStatus.COMPLETED
def get_failure_message(self):
return (
f"Publication failed with status: {self.status.name}.\n" f"Reason: {self.reason}"
)
@dataclass
class BaseCoordinatorReport:
reports: Sequence[BasePublicationReport]
@property
def successful(self):
return all(r.successful for r in self.reports)
logger = logging.getLogger(__name__)

View File

@ -0,0 +1,66 @@
import dataclasses
import logging
from dataclasses import dataclass
from typing import List, Optional, Sequence
from mobilizon_reshare.dataclasses.publication import _EventPublication
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers.coordinators import BasePublicationReport
logger = logging.getLogger(__name__)
@dataclass
class EventPublicationReport(BasePublicationReport):
publication: _EventPublication
published_content: Optional[str] = dataclasses.field(default=None)
def get_failure_message(self):
if not self.reason:
logger.error("Report of failure without reason.", exc_info=True)
return (
f"Publication {self.publication.id} failed with status: {self.status.name}.\n"
f"Reason: {self.reason}\n"
f"Publisher: {self.publication.publisher.name}\n"
f"Event: {self.publication.event.name}"
)
class BaseEventPublishingCoordinator:
def __init__(self, publications: List[_EventPublication]):
self.publications = publications
def _safe_run(self, reasons, f, *args, **kwargs):
try:
f(*args, **kwargs)
return reasons
except Exception as e:
return reasons + [str(e)]
def _validate(self) -> List[EventPublicationReport]:
errors = []
for publication in self.publications:
reasons = []
reasons = self._safe_run(
reasons, publication.publisher.validate_credentials,
)
reasons = self._safe_run(
reasons, publication.formatter.validate_event, publication.event
)
if len(reasons) > 0:
errors.append(
EventPublicationReport(
status=PublicationStatus.FAILED,
reason=", ".join(reasons),
publication=publication,
)
)
return errors
def _filter_publications(self, errors: Sequence[EventPublicationReport]) -> List[_EventPublication]:
publishers_with_errors = set(e.publication.publisher for e in errors)
return [p for p in self.publications if p.publisher not in publishers_with_errors]
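A hedged illustration (not from this changeset) of the _safe_run accumulator used by _validate above: each failing callable appends its stringified exception to the running list of reasons, so a single publication can collect several failure reasons before being reported.

from mobilizon_reshare.publishers.coordinators.event_publishing import (
    BaseEventPublishingCoordinator,
)

coordinator = BaseEventPublishingCoordinator(publications=[])
reasons = coordinator._safe_run([], lambda: None)        # callable succeeds -> []
reasons = coordinator._safe_run(reasons, lambda: 1 / 0)  # -> ["division by zero"]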

View File

@ -0,0 +1,36 @@
import logging
from typing import List, Sequence
from mobilizon_reshare.dataclasses import _EventPublication
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers.coordinators.event_publishing.publish import (
PublisherCoordinator,
EventPublicationReport,
)
logger = logging.getLogger(__name__)
class DryRunPublisherCoordinator(PublisherCoordinator):
"""
Coordinator to perform a dry-run on the event publication
"""
def _publish(self, publications: Sequence[_EventPublication]) -> List[EventPublicationReport]:
reports = [
EventPublicationReport(
status=PublicationStatus.COMPLETED,
publication=publication,
reason=None,
published_content=publication.formatter.get_message_from_event(
publication.event
),
)
for publication in publications
]
logger.info("The following events would be published:")
for r in reports:
event_name = r.publication.event.name
publisher_name = r.publication.publisher.name
logger.info(f"{event_name}{publisher_name}")
return reports

View File

@ -0,0 +1,126 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Optional, Sequence
from mobilizon_reshare.dataclasses import PublicationNotification, EventPublication
from mobilizon_reshare.models.notification import NotificationStatus
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers import get_active_notifiers
from mobilizon_reshare.publishers.abstract import (
AbstractPlatform,
)
from mobilizon_reshare.publishers.coordinators import (
logger,
BasePublicationReport,
BaseCoordinatorReport,
)
from mobilizon_reshare.publishers.coordinators.event_publishing import (
EventPublicationReport,
)
from mobilizon_reshare.publishers.platforms.platform_mapping import (
get_notifier_class,
get_formatter_class,
)
@dataclass
class PublicationNotificationReport(BasePublicationReport):
status: NotificationStatus
notification: PublicationNotification
@property
def successful(self):
return self.status == NotificationStatus.COMPLETED
def get_failure_message(self):
if not self.reason:
logger.error("Report of failure without reason.", exc_info=True)
return (
f"Failed with status: {self.status.name}.\n"
f"Reason: {self.reason}\n"
f"Publisher: {self.notification.publisher.name}\n"
f"Publication: {self.notification.publication.id}"
)
@dataclass
class NotifierCoordinatorReport(BaseCoordinatorReport):
reports: Sequence[PublicationNotificationReport]
notifications: Sequence[PublicationNotification] = field(default_factory=list)
class Sender:
def __init__(
self,
message: str,
publication: EventPublication,
platforms: List[AbstractPlatform] = None,
):
self.message = message
self.platforms = platforms
self.publication = publication
def send_to_all(self) -> NotifierCoordinatorReport:
reports = []
notifications = []
for platform in self.platforms:
notification = PublicationNotification(
platform, get_formatter_class(platform.name)(), self.publication
)
try:
platform.send(self.message)
report = PublicationNotificationReport(
NotificationStatus.COMPLETED, self.message, notification
)
except Exception as e:
msg = f"[{platform.name}] Failed to notify failure of message:\n{self.message}"
logger.critical(msg)
logger.exception(e)
report = PublicationNotificationReport(
NotificationStatus.FAILED, msg, notification
)
notifications.append(notification)
reports.append(report)
return NotifierCoordinatorReport(reports=reports, notifications=notifications)
class AbstractNotifiersCoordinator(ABC):
def __init__(
self, report: BasePublicationReport, notifiers: List[AbstractPlatform] = None
):
self.platforms = notifiers or [
get_notifier_class(notifier)() for notifier in get_active_notifiers()
]
self.report = report
@abstractmethod
def notify_failure(self):
pass
class PublicationFailureNotifiersCoordinator(AbstractNotifiersCoordinator):
"""
Sends a notification of a failure report to the active platforms
"""
report: EventPublicationReport
platforms: List[AbstractPlatform]
def notify_failure(self) -> Optional[NotifierCoordinatorReport]:
logger.info("Sending failure notifications")
if self.report.status == PublicationStatus.FAILED:
return Sender(
self.report.get_failure_message(),
self.report.publication,
self.platforms,
).send_to_all()
class PublicationFailureLoggerCoordinator(PublicationFailureNotifiersCoordinator):
"""
Logs a report to console
"""
def notify_failure(self):
if self.report.status == PublicationStatus.FAILED:
logger.error(self.report.get_failure_message())

View File

@ -0,0 +1,86 @@
import dataclasses
import logging
from dataclasses import dataclass
from typing import Sequence, List
from mobilizon_reshare.dataclasses.publication import _EventPublication
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.publishers.coordinators import BaseCoordinatorReport
from mobilizon_reshare.publishers.coordinators.event_publishing import (
BaseEventPublishingCoordinator,
EventPublicationReport,
)
from mobilizon_reshare.publishers.exceptions import PublisherError
logger = logging.getLogger(__name__)
@dataclass
class PublisherCoordinatorReport(BaseCoordinatorReport):
reports: Sequence[EventPublicationReport]
publications: Sequence[_EventPublication] = dataclasses.field(default_factory=list)
def __str__(self):
platform_messages = []
for report in self.reports:
intro = f"Message for: {report.publication.publisher.name}"
platform_messages.append(
f"""{intro}
{"*" * len(intro)}
{report.published_content}
{"-" * 80}"""
)
return "\n".join(platform_messages)
class PublisherCoordinator(BaseEventPublishingCoordinator):
"""
Coordinator to publish an event on every active platform
"""
def run(self) -> PublisherCoordinatorReport:
validation_reports = self._validate()
valid_publications = self._filter_publications(validation_reports)
publishing_reports = self._publish(valid_publications)
return PublisherCoordinatorReport(
publications=self.publications,
reports=validation_reports + publishing_reports
)
def _publish(self, publications: Sequence[_EventPublication]) -> List[EventPublicationReport]:
reports = []
for publication in publications:
try:
publication_report = self._publish_publication(publication)
reports.append(publication_report)
except PublisherError as e:
logger.error(str(e))
reports.append(
EventPublicationReport(
status=PublicationStatus.FAILED,
reason=str(e),
publication=publication,
)
)
return reports
@staticmethod
def _publish_publication(publication):
"""
Publishes a single publication
:param publication:
:return:
"""
logger.info("Publishing to %s", publication.publisher.name)
message = publication.formatter.get_message_from_event(publication.event)
publication.publisher.send(message, publication.event)
return EventPublicationReport(
status=PublicationStatus.COMPLETED,
publication=publication,
reason=None,
published_content=message,
)

View File

@ -0,0 +1,20 @@
from mobilizon_reshare.publishers.coordinators.recap_publishing.recap import (
RecapCoordinator,
)
class DryRunRecapCoordinator(RecapCoordinator):
"""
Coordinator to perform a dry-run on the event recap
"""
def _send(self, content, recap_publication):
"""
Overrides the Recap Coordinator _send on the assumption that _send is just a side effect publishing
on a given platform. The report generated by RecapCoordinator should be sufficient to perform
the dry-run print at CLI level.
:param content:
:param recap_publication:
:return:
"""
pass

View File

@ -0,0 +1,78 @@
import dataclasses
from dataclasses import dataclass
from typing import Optional, Sequence, List
from mobilizon_reshare.models.publication import PublicationStatus
from mobilizon_reshare.dataclasses.publication import RecapPublication
from mobilizon_reshare.publishers.coordinators import (
BasePublicationReport,
BaseCoordinatorReport,
)
from mobilizon_reshare.publishers.exceptions import PublisherError
@dataclass
class RecapPublicationReport(BasePublicationReport):
publication: RecapPublication
published_content: Optional[str] = dataclasses.field(default=None)
@dataclass
class RecapCoordinatorReport(BaseCoordinatorReport):
reports: Sequence[RecapPublicationReport]
def __str__(self):
platform_messages = []
for report in self.reports:
intro = f"Message for: {report.publication.publisher.name}"
platform_messages.append(
f"""{intro}
{"*"*len(intro)}
{report.published_content}
{"-"*80}"""
)
return "\n".join(platform_messages)
class RecapCoordinator:
"""
Coordinator to publish a recap on future events
"""
def __init__(self, recap_publications: List[RecapPublication]):
self.recap_publications = recap_publications
def _build_recap_content(self, recap_publication: RecapPublication):
fragments = [recap_publication.formatter.get_recap_header()]
for event in recap_publication.events:
fragments.append(recap_publication.formatter.get_recap_fragment(event))
return "\n\n".join(fragments)
def _send(self, content, recap_publication):
recap_publication.publisher.send(content)
def run(self) -> RecapCoordinatorReport:
reports = []
for recap_publication in self.recap_publications:
try:
message = self._build_recap_content(recap_publication)
self._send(message, recap_publication)
reports.append(
RecapPublicationReport(
status=PublicationStatus.COMPLETED,
reason=None,
published_content=message,
publication=recap_publication,
)
)
except PublisherError as e:
reports.append(
RecapPublicationReport(
status=PublicationStatus.FAILED,
reason=str(e),
publication=recap_publication,
)
)
return RecapCoordinatorReport(reports=reports)

View File

@ -1,38 +1,38 @@
class PublisherError(Exception):
""" Generic publisher error """
def __init__(self, message):
""" :param str message: exception message """
super().__init__(message)
"""Generic publisher error"""
class InvalidAttribute(PublisherError):
""" Publisher defined with invalid or missing attribute """
"""Publisher defined with invalid or missing attribute"""
class InvalidBot(PublisherError):
""" Publisher refers to the wrong service bot """
"""Publisher refers to the wrong service bot"""
class InvalidCredentials(PublisherError):
""" Publisher cannot validate credentials """
"""Publisher cannot validate credentials"""
class InvalidEvent(PublisherError):
""" Publisher cannot validate events """
"""Publisher cannot validate events"""
class InvalidMessage(PublisherError):
""" Publisher cannot validate message """
"""Publisher cannot validate message"""
class InvalidResponse(PublisherError):
""" Publisher receives an invalid response from its service """
"""Publisher receives an invalid response from its service"""
class InvalidSettings(PublisherError):
""" Publisher settings are either missing or badly configured """
"""Publisher settings are either missing or badly configured"""
class ZulipError(PublisherError):
""" Publisher receives an error response from Zulip"""
"""Publisher receives an error response from Zulip"""
class HTTPResponseError(PublisherError):
"""Publisher receives a HTTP error"""

View File

@ -0,0 +1,85 @@
from typing import Optional
import facebook
from facebook import GraphAPIError
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.formatting.description import html_to_plaintext
from mobilizon_reshare.publishers.abstract import (
AbstractPlatform,
AbstractEventFormatter,
)
from mobilizon_reshare.publishers.exceptions import (
InvalidCredentials,
InvalidEvent,
InvalidMessage,
PublisherError,
)
class FacebookFormatter(AbstractEventFormatter):
_conf = ("publisher", "facebook")
def _validate_event(self, event: MobilizonEvent) -> None:
text = event.description
if not (text and text.strip()):
self._log_error("No text was found", raise_error=InvalidEvent)
def _validate_message(self, message) -> None:
if len(message) >= 63200:
self._log_error("Message is too long", raise_error=InvalidMessage)
def _preprocess_event(self, event: MobilizonEvent):
event.description = html_to_plaintext(event.description)
event.name = html_to_plaintext(event.name)
return event
class FacebookPlatform(AbstractPlatform):
"""
Facebook publisher class.
"""
name = "facebook"
def _get_api(self) -> facebook.GraphAPI:
return facebook.GraphAPI(access_token=self.conf["page_access_token"])
def _send(self, message: str, event: Optional[MobilizonEvent] = None):
try:
self._get_api().put_object(
parent_object="me",
connection_name="feed",
message=message,
link=event.mobilizon_link if event else None,
)
except GraphAPIError:
self._log_error(
"Facebook send failed", raise_error=PublisherError,
)
def validate_credentials(self):
try:
self._log_debug("Validating Facebook credentials")
self._get_api().get_object(id="me", field="name")
except GraphAPIError:
self._log_error(
"Invalid Facebook credentials. Authentication Failed",
raise_error=InvalidCredentials,
)
self._log_debug("Facebook credentials are valid")
def _validate_response(self, response):
pass
class FacebookPublisher(FacebookPlatform):
_conf = ("publisher", "facebook")
class FacebookNotifier(FacebookPlatform):
_conf = ("notifier", "facebook")

View File

@ -0,0 +1,95 @@
from typing import Optional
from urllib.parse import urljoin
import requests
from requests import Response
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.publishers.abstract import (
AbstractPlatform,
AbstractEventFormatter,
)
from mobilizon_reshare.publishers.exceptions import (
InvalidBot,
InvalidEvent,
InvalidResponse,
HTTPResponseError,
InvalidMessage,
)
class MastodonFormatter(AbstractEventFormatter):
_conf = ("publisher", "mastodon")
def _validate_event(self, event: MobilizonEvent) -> None:
text = event.description
if not (text and text.strip()):
self._log_error("No text was found", raise_error=InvalidEvent)
def _validate_message(self, message) -> None:
if len(message.encode("utf-8")) >= self.conf.toot_length:
self._log_error("Message is too long", raise_error=InvalidMessage)
class MastodonPlatform(AbstractPlatform):
"""
Mastodon publisher class.
"""
_conf = ("publisher", "mastodon")
api_uri = "api/v1/"
name = "mastodon"
def _send(self, message: str, event: Optional[MobilizonEvent] = None) -> Response:
"""
Send messages
"""
return requests.post(
url=urljoin(self.conf.instance, self.api_uri) + "statuses",
headers={"Authorization": f"Bearer {self.conf.token}"},
data={"status": message, "visibility": "public"},
)
def validate_credentials(self):
res = requests.get(
headers={"Authorization": f"Bearer {self.conf.token}"},
url=urljoin(self.conf.instance, self.api_uri) + "apps/verify_credentials",
)
data = self._validate_response(res)
if not self.conf.name == data["name"]:
self._log_error(
"Found a different bot than the expected one"
f"\n\tfound: {data['name']}"
f"\n\texpected: {self.conf.name}",
raise_error=InvalidBot,
)
def _validate_response(self, res: Response) -> dict:
try:
res.raise_for_status()
except requests.exceptions.HTTPError as e:
self._log_debug(str(res))
self._log_error(
str(e), raise_error=HTTPResponseError,
)
try:
data = res.json()
except Exception as e:
self._log_error(
f"Server returned invalid json data: {str(e)}",
raise_error=InvalidResponse,
)
return data
class MastodonPublisher(MastodonPlatform):
_conf = ("publisher", "mastodon")
class MastodonNotifier(MastodonPlatform):
_conf = ("notifier", "mastodon")

View File

@ -0,0 +1,64 @@
from mobilizon_reshare.publishers.platforms.facebook import (
FacebookPublisher,
FacebookFormatter,
FacebookNotifier,
)
from mobilizon_reshare.publishers.platforms.mastodon import (
MastodonPublisher,
MastodonFormatter,
MastodonNotifier,
)
from mobilizon_reshare.publishers.platforms.telegram import (
TelegramPublisher,
TelegramFormatter,
TelegramNotifier,
)
from mobilizon_reshare.publishers.platforms.twitter import (
TwitterPublisher,
TwitterFormatter,
TwitterNotifier,
)
from mobilizon_reshare.publishers.platforms.zulip import (
ZulipPublisher,
ZulipFormatter,
ZulipNotifier,
)
"""
This module is required to have an explicit mapping between platform names and the classes implementing those platforms.
It could be refactored in a different pattern but this way makes it more explicit and linear. Eventually this could be
turned into a plugin system with a plugin for each platform."""
name_to_publisher_class = {
"mastodon": MastodonPublisher,
"telegram": TelegramPublisher,
"zulip": ZulipPublisher,
"twitter": TwitterPublisher,
"facebook": FacebookPublisher,
}
name_to_formatter_class = {
"mastodon": MastodonFormatter,
"telegram": TelegramFormatter,
"zulip": ZulipFormatter,
"twitter": TwitterFormatter,
"facebook": FacebookFormatter,
}
name_to_notifier_class = {
"mastodon": MastodonNotifier,
"telegram": TelegramNotifier,
"zulip": ZulipNotifier,
"twitter": TwitterNotifier,
"facebook": FacebookNotifier,
}
def get_notifier_class(platform):
return name_to_notifier_class[platform]
def get_publisher_class(platform):
return name_to_publisher_class[platform]
def get_formatter_class(platform):
return name_to_formatter_class[platform]
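A hedged usage sketch (not from this changeset): callers such as _EventPublication.from_orm and the recap builder resolve classes by platform name and instantiate them with no arguments, since configuration is only read lazily through the conf property:

from mobilizon_reshare.publishers.platforms.platform_mapping import (
    get_publisher_class,
    get_formatter_class,
    get_notifier_class,
)

publisher = get_publisher_class("telegram")()  # TelegramPublisher
formatter = get_formatter_class("telegram")()  # TelegramFormatter
notifier = get_notifier_class("telegram")()    # TelegramNotifier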

View File

@ -0,0 +1,132 @@
import re
from typing import Optional
import requests
from bs4 import BeautifulSoup
from requests import Response
from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.publishers.abstract import (
AbstractEventFormatter,
AbstractPlatform,
)
from mobilizon_reshare.publishers.exceptions import (
InvalidBot,
InvalidEvent,
InvalidResponse,
InvalidMessage,
)


class TelegramFormatter(AbstractEventFormatter):
    _conf = ("publisher", "telegram")

    def _validate_event(self, event: MobilizonEvent) -> None:
        description = event.description
        if not (description and description.strip()):
            self._log_error("No description was found", raise_error=InvalidEvent)

    def _validate_message(self, message: str) -> None:
        # Telegram rejects messages longer than 4096 characters of visible text.
        if (
            len("".join(BeautifulSoup(message, "html.parser").findAll(text=True)))
            >= 4096
        ):
            self._log_error("Message is too long", raise_error=InvalidMessage)

    def _preprocess_message(self, message: str) -> str:
        """
        Convert HTML5 to Telegram's HTML dialect.
        :param message: an HTML5 string
        :return: an HTML string compatible with Telegram
        """
        html = BeautifulSoup(message, "html.parser")
        # replacing paragraphs and line breaks
        for tag in html.findAll(["p", "br"]):
            tag.append("\n")
            tag.replaceWithChildren()
        # replacing headers
        for tag in html.findAll(["h1", "h2", "h3"]):
            if tag.text:  # only if they are not empty
                tag.name = "b"
                tag.insert_after("\n")
                tag.insert_before("\n")
            else:
                tag.decompose()
        # removing lists
        for tag in html.findAll("ul"):
            tag.unwrap()
        # replacing list elements with dots
        for tag in html.findAll(["li"]):
            tag.insert(0, "")
            tag.unwrap()
        # removing whitespace from link targets
        for tag in html.findAll("a"):
            if tag.has_attr("href"):
                tag["href"] = tag["href"].replace(" ", "").strip()
        s = str(html)
        return re.sub(r"\n{2,}", "\n\n", s).strip()  # remove multiple newlines


class TelegramPlatform(AbstractPlatform):
    """
    Telegram publisher class.
    """

    name = "telegram"

    def validate_credentials(self):
        res = requests.get(f"https://api.telegram.org/bot{self.conf.token}/getMe")
        data = self._validate_response(res)

        if not self.conf.username == data.get("result", {}).get("username"):
            self._log_error(
                "Found a different bot than the expected one", raise_error=InvalidBot,
            )

    def _send(self, message: str, event: Optional[MobilizonEvent] = None) -> Response:
        json_message = {
            "chat_id": self.conf.chat_id,
            "text": message,
            "parse_mode": "html",
        }
        if self.conf.message_thread_id:
            json_message["message_thread_id"] = self.conf.message_thread_id

        return requests.post(
            url=f"https://api.telegram.org/bot{self.conf.token}/sendMessage",
            json=json_message,
        )

    def _validate_response(self, res):
        try:
            res.raise_for_status()
        except requests.exceptions.HTTPError as e:
            self._log_error(
                f"Server returned invalid data: {str(e)}\n{res.text}",
                raise_error=InvalidResponse,
            )

        try:
            data = res.json()
        except Exception as e:
            self._log_error(
                f"Server returned invalid json data: {str(e)}",
                raise_error=InvalidResponse,
            )

        if not data.get("ok"):
            self._log_error(
                f"Invalid request (response: {data})", raise_error=InvalidResponse,
            )

        return data


class TelegramPublisher(TelegramPlatform):
    _conf = ("publisher", "telegram")


class TelegramNotifier(TelegramPlatform):
    _conf = ("notifier", "telegram")

View File

@@ -0,0 +1,73 @@
from typing import Optional

from tweepy import OAuthHandler, API, TweepyException
from tweepy.models import Status

from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.publishers.abstract import (
    AbstractPlatform,
    AbstractEventFormatter,
)
from mobilizon_reshare.publishers.exceptions import (
    InvalidCredentials,
    PublisherError,
    InvalidMessage,
)


class TwitterFormatter(AbstractEventFormatter):
    _conf = ("publisher", "twitter")

    def _validate_event(self, event: MobilizonEvent) -> None:
        pass  # pragma: no cover

    def _validate_message(self, message) -> None:
        # TODO: this is not precise. Characters should be counted according to
        # Twitter's weighting rules, but Tweepy does not seem to support
        # client-side validation.
        if len(message.encode("utf-8")) > 280:
            self._log_error("Message is too long", raise_error=InvalidMessage)


class TwitterPlatform(AbstractPlatform):
    """
    Twitter publisher class.
    """

    _conf = ("publisher", "twitter")
    name = "twitter"

    def _get_api(self):
        api_key = self.conf.api_key
        api_key_secret = self.conf.api_key_secret
        access_token = self.conf.access_token
        access_secret = self.conf.access_secret
        auth = OAuthHandler(api_key, api_key_secret)
        auth.set_access_token(access_token, access_secret)
        return API(auth)

    def _send(self, message: str, event: Optional[MobilizonEvent] = None) -> Status:
        try:
            return self._get_api().update_status(message)
        except TweepyException as e:
            self._log_error(e.args[0], raise_error=PublisherError)

    def validate_credentials(self):
        if not self._get_api().verify_credentials():
            self._log_error(
                "Invalid Twitter credentials. Authentication Failed",
                raise_error=InvalidCredentials,
            )

    def _validate_response(self, res: Status) -> dict:
        pass  # pragma: no cover


class TwitterPublisher(TwitterPlatform):
    _conf = ("publisher", "twitter")


class TwitterNotifier(TwitterPlatform):
    _conf = ("notifier", "twitter")

View File

@@ -0,0 +1,121 @@
from typing import Optional
from urllib.parse import urljoin

import requests
from requests import Response
from requests.auth import HTTPBasicAuth

from mobilizon_reshare.dataclasses import MobilizonEvent
from mobilizon_reshare.formatting.description import html_to_markdown
from mobilizon_reshare.publishers.abstract import (
    AbstractPlatform,
    AbstractEventFormatter,
)
from mobilizon_reshare.publishers.exceptions import (
    InvalidBot,
    InvalidEvent,
    InvalidResponse,
    ZulipError,
    InvalidMessage,
    HTTPResponseError,
)


class ZulipFormatter(AbstractEventFormatter):
    _conf = ("publisher", "zulip")

    def _validate_event(self, event: MobilizonEvent) -> None:
        text = event.description
        if not (text and text.strip()):
            self._log_error("No text was found", raise_error=InvalidEvent)

    def _validate_message(self, message: str) -> None:
        if len(message.encode("utf-8")) >= 10000:
            self._log_error("Message is too long", raise_error=InvalidMessage)

    def _preprocess_event(self, event: MobilizonEvent):
        event.description = html_to_markdown(event.description)
        event.name = html_to_markdown(event.name)
        return event


class ZulipPlatform(AbstractPlatform):
    """
    Zulip publisher class.
    """

    _conf = ("publisher", "zulip")
    api_uri = "api/v1/"
    name = "zulip"

    def _send(self, message: str, event: Optional[MobilizonEvent] = None) -> Response:
        """
        Send stream messages
        """
        return requests.post(
            url=urljoin(self.conf.instance, self.api_uri) + "messages",
            auth=HTTPBasicAuth(self.conf.bot_email, self.conf.bot_token),
            data={
                "type": "stream",
                "to": self.conf.chat_id,
                "subject": self.conf.subject,
                "content": message,
            },
        )
    def validate_credentials(self):
        conf = self.conf
        res = requests.get(
            auth=HTTPBasicAuth(self.conf.bot_email, self.conf.bot_token),
            url=urljoin(self.conf.instance, self.api_uri) + "users/me",
        )
        data = self._validate_response(res)

        if not data["is_bot"]:
            self._log_error(
                "This user is not a bot", raise_error=InvalidBot,
            )

        if not conf.bot_email == data["email"]:
            self._log_error(
                "Found a different bot than the expected one"
                f"\n\tfound: {data['email']}"
                f"\n\texpected: {conf.bot_email}",
                raise_error=InvalidBot,
            )
    def _validate_response(self, response: Response) -> dict:
        try:
            response.raise_for_status()
        except requests.exceptions.HTTPError as e:
            self._log_debug(str(response.text))
            self._log_error(
                str(e), raise_error=HTTPResponseError,
            )

        # See https://zulip.com/api/rest-error-handling
        try:
            data = response.json()
        except Exception as e:
            self._log_error(
                f"Server returned invalid json data: {str(e)}",
                raise_error=InvalidResponse,
            )

        if data["result"] == "error":
            self._log_error(
                f"{response.status_code} Error - {data['msg']}", raise_error=ZulipError,
            )

        return data


class ZulipPublisher(ZulipPlatform):
    _conf = ("publisher", "zulip")


class ZulipNotifier(ZulipPlatform):
    _conf = ("notifier", "zulip")

View File

@@ -1,104 +0,0 @@
import pkg_resources
import requests
from requests import Response

from mobilizon_reshare.formatting.description import html_to_markdown
from mobilizon_reshare.publishers.abstract import AbstractPublisher
from mobilizon_reshare.publishers.exceptions import (
    InvalidBot,
    InvalidCredentials,
    InvalidEvent,
    InvalidResponse,
)


class TelegramPublisher(AbstractPublisher):
    """
    Telegram publisher class.
    """

    _conf = ("publisher", "telegram")
    default_template_path = pkg_resources.resource_filename(
        "mobilizon_reshare.publishers.templates", "telegram.tmpl.j2"
    )

    def _escape_message(self, message: str) -> str:
        message = (
            message.replace("-", "\\-")
            .replace(".", "\\.")
            .replace("(", "\\(")
            .replace(")", "\\)")
            .replace("#", "")
        )
        return message

    def _send(self, message: str) -> Response:
        return requests.post(
            url=f"https://api.telegram.org/bot{self.conf.token}/sendMessage",
            json={
                "chat_id": self.conf.chat_id,
                "text": self._escape_message(message),
                "parse_mode": "markdownv2",
            },
        )

    def validate_credentials(self):
        conf = self.conf
        chat_id = conf.chat_id
        token = conf.token
        username = conf.username
        err = []
        if not chat_id:
            err.append("chat ID")
        if not token:
            err.append("token")
        if not username:
            err.append("username")
        if err:
            self._log_error(
                ", ".join(err) + " is/are missing", raise_error=InvalidCredentials,
            )

        res = requests.get(f"https://api.telegram.org/bot{token}/getMe")
        data = self._validate_response(res)

        if not username == data.get("result", {}).get("username"):
            self._log_error(
                "Found a different bot than the expected one", raise_error=InvalidBot,
            )

    def validate_event(self) -> None:
        text = self.event.description
        if not (text and text.strip()):
            self._log_error("No text was found", raise_error=InvalidEvent)

    def _validate_response(self, res):
        try:
            res.raise_for_status()
        except requests.exceptions.HTTPError as e:
            self._log_error(
                f"Server returned invalid data: {str(e)}", raise_error=InvalidResponse,
            )

        try:
            data = res.json()
        except Exception as e:
            self._log_error(
                f"Server returned invalid json data: {str(e)}",
                raise_error=InvalidResponse,
            )

        if not data.get("ok"):
            self._log_error(
                f"Invalid request (response: {data})", raise_error=InvalidResponse,
            )

        return data

    def validate_message(self) -> None:
        # TODO implement
        pass

    def _preprocess_event(self):
        self.event.description = html_to_markdown(self.event.description)
        self.event.name = html_to_markdown(self.event.name)
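
A small worked example (not part of the diff) of what the MarkdownV2 escaping in the removed _escape_message produced; the input string is illustrative.

    sample = "Meet-up (online). #events"
    escaped = (
        sample.replace("-", "\\-")
        .replace(".", "\\.")
        .replace("(", "\\(")
        .replace(")", "\\)")
        .replace("#", "")
    )
    # escaped == "Meet\\-up \\(online\\)\\. events"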

View File

@@ -0,0 +1,11 @@
{{ name }}
🕒 {{ begin_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }} - {{ end_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }}
{% if location %}
📍 {{ location }}
{% endif %}
{{ description }}
🔗 Link: {{mobilizon_link}}

View File

@@ -0,0 +1,9 @@
{{ name }}
🕒 {{ begin_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }} - {{ end_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }}
{% if location %}
📍 {{ location }}
{% endif %}
🔗 Link: {{mobilizon_link}}
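
A minimal rendering sketch for templates like the two above (not part of the diff). It assumes Jinja2 templates fed with Arrow datetimes, which the `.to('local').format(...)` calls suggest; all variable values are illustrative.

    import arrow
    from jinja2 import Template

    template = Template(
        "{{ name }}\n"
        "🕒 {{ begin_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }}"
        " - {{ end_datetime.to('local').format('DD MMMM, HH:mm', locale=locale) }}\n"
        "{% if location %}📍 {{ location }}\n{% endif %}"
        "🔗 Link: {{ mobilizon_link }}"
    )
    print(
        template.render(
            name="Example event",  # illustrative values
            begin_datetime=arrow.utcnow(),
            end_datetime=arrow.utcnow().shift(hours=2),
            locale="en",
            location="Somewhere",
            mobilizon_link="https://mobilizon.example/events/1234",
        )
    )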

Some files were not shown because too many files have changed in this diff.