mirror of https://git.sr.ht/~tsileo/microblog.pub synced 2025-06-05 21:59:23 +02:00

61 Commits

Author SHA1 Message Date
5d95fd44ac Fix webmention discovery 2022-12-04 12:06:15 +01:00
a337b32bcd Blocking server also blocks subdomains 2022-12-04 11:51:52 +01:00
e8fcf5a9a2 Tweak video mode 2022-12-03 19:57:13 +01:00
7525744f82 Test new GIF mode for videos without sound 2022-12-03 19:47:11 +01:00
7d3fc35a24 More proxy client tweaks 2022-12-02 19:40:58 +01:00
73dceee0f5 Fix proxy client 2022-12-02 19:28:59 +01:00
34c7cdb5fb Fix Undo{Announce} recipients 2022-12-02 18:48:23 +01:00
0527e34476 Tweak proxy client 2022-12-02 18:48:05 +01:00
a82f619e89 Revert "fix unshare by getting recipients from Announce activity instead of Undo"
This reverts commit dcd44ec3b6.
2022-12-02 18:12:24 +01:00
a68b3e7318 Don't insert an empty div on the index when there are no pages 2022-11-30 20:11:20 +01:00
436d5ccf1b Tweak in reply to this xyz text 2022-11-30 19:30:26 +01:00
a273f26549 Only show local delete for local replies 2022-11-30 17:49:36 +01:00
9d357446d2 Tweak logging 2022-11-30 17:37:08 +01:00
6cabff21db Document running from subpath 2022-11-30 14:14:09 +01:00
5df4d420de Whitelist object types in the index query
Select the outbox object types that we want to show on the notes page
instead of removing objects that we don't want to show.
That way, it's easier to ensure that there are no objects messing up the
object count/empty checks.

Partially fixes https://todo.sr.ht/~tsileo/microblog.pub/65
2022-11-30 14:10:28 +01:00
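A minimal sketch of the whitelisting approach described in this commit; the allowed type list mirrors the index query change further down this page, and the helper name is illustrative:

from sqlalchemy import select

from app import models

# Object types allowed on the notes/index page
ALLOWED_INDEX_TYPES = ["Announce", "Note", "Video", "Question"]


def build_index_query():
    # Whitelist the types we want to show instead of excluding the ones we don't,
    # so stray object types cannot skew the count/empty-page checks.
    return select(models.OutboxObject).where(
        models.OutboxObject.is_deleted.is_(False),
        models.OutboxObject.ap_type.in_(ALLOWED_INDEX_TYPES),
    )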
68884d9afa Use <details> element for sensitive text
The sensitive text feature was implemented with <label> and hidden
checkbox <input> elements. There were two issues with this
implementation:
1. The user couldn't navigate to the "show/hide more" button using the
   keyboard.
2. The label indicated two actions at the same time ("show/hide more"),
   making it unclear what the function of the checkbox was and what the
   current show/collapse state was.

As it is generally preferable to use built-in HTML elements for the
best semantics, this commit switches to the <details> and <summary>
elements for the sensitive text feature. The browser will open/collapse
the content in <details> automatically when the user clicks on the
<summary>, and keyboard navigation support is built-in.

This commit also changes the button to display "show more" or "show
less" depending on the state for visual clarity. This button is hidden
from the accessibility tree using `aria-label="false"`, as the <details>
element already exposes its state to the tree and we want to avoid
duplicated information.

A few caveats:
* The "show/hide sensitive content" button for sensitive attachments
  hasn't been changed yet as I'd like to get more feedback about the new
  implementation.
* As the summary/content warning text itself is also part of the
  <summary> tag, the user can now also click on them to toggle the
  visibility of the sensitive text. This may not be desirable as the
  current interface does not make it clear that this could happen; the
  user may try to select some text from the summary and be surprised
  by the sensitive text being expanded. One way to improve this would
  be to add an event listener to the summary text and call
  `preventDefault`, but this would introduce JavaScript code.
2022-11-30 12:26:34 +01:00
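A minimal markup sketch of the <details>/<summary> pattern described above; the class names come from the stylesheet change further down this page, but the exact template markup may differ, and the visible "show more"/"show less" text is injected via CSS ::after:

<details>
  <summary>
    <span class="p-summary">Content warning</span>
    <span class="show-more-btn"></span>
  </summary>
  <div class="obj-content">
    The sensitive text goes here.
  </div>
</details>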
46a592b11e Switch back to HTTP1 for the media proxy client 2022-11-30 12:26:31 +01:00
5f0b8f5dfd Tweak media proxy client 2022-11-28 20:58:16 +01:00
5adb2bca9a Revert "Update deps"
This reverts commit 08cc74d928.
2022-11-28 20:35:53 +01:00
08cc74d928 Update deps 2022-11-28 20:30:37 +01:00
578581b4dc More mf2 improvements for shares/reposts 2022-11-27 16:29:49 +01:00
ec36272bb4 Allow to disable certain notification type 2022-11-27 12:11:42 +01:00
e30e0de10e No more HTTP sig check on the actor profile 2022-11-27 11:36:15 +01:00
e672d9b9f0 Update AUTHORS 2022-11-27 11:33:46 +01:00
Sam
dcd44ec3b6 fix unshare by getting recipients from Announce activity instead of Undo 2022-11-27 11:31:45 +01:00
Sam
71a4ea2425 fix typo on deleted object ap_type 2022-11-27 11:29:54 +01:00
441e3d90b1 Fix formatting 2022-11-23 21:58:59 +01:00
d9b9f596d3 Skip custom emojis which don't match emoji regexp
Otherwise, emojis containing forbidden symbols (for example, `-`)
appear in the emoji selector on the admin/new page, but are not replaced
with the emoji image on submit.

Also add a note to the documentation mentioning the allowed characters.
2022-11-23 21:54:02 +01:00
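A hedged sketch of the idea; the shortcode regexp and loader function below are purely illustrative, since the real emoji-loading code is not part of this diff:

import re
from pathlib import Path

# Illustrative pattern: only word characters, so shortcodes containing "-"
# (which would never be replaced on submit) are skipped up front.
EMOJI_SHORTCODE_RE = re.compile(r"^[a-zA-Z0-9_]+$")


def load_custom_emojis(emoji_dir: Path) -> dict[str, Path]:
    emojis = {}
    for image in sorted(emoji_dir.glob("*.png")):
        shortcode = image.stem
        if not EMOJI_SHORTCODE_RE.match(shortcode):
            # Don't offer it in the emoji selector since it can't be rendered
            continue
        emojis[shortcode] = image
    return emojis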
2cc4eda143 Bootstrap stream customization (API may change) 2022-11-22 20:30:35 +01:00
bd065446bf Hack in HTTP sig to drop Delete requests early on 2022-11-21 21:43:12 +01:00
8475f5bccd Fix admin session timeout 2022-11-21 20:43:51 +01:00
a435cd33c9 Allow to delete webmentions 2022-11-20 11:56:58 +01:00
d692ec060f Tweak webmention processing 2022-11-20 11:31:00 +01:00
4c6eb51ae2 Proper mf2 for replies 2022-11-20 11:12:34 +01:00
d36102255f Merge branch 'v2' into indieweb-merge-part2 2022-11-20 10:48:43 +01:00
cdbc545d5e Add a flag on new notifications 2022-11-20 10:13:17 +01:00
fbc46e0517 More logging for the admin session 2022-11-20 10:02:28 +01:00
ef4608f348 Switch back the proxy client to HTTP2 mode 2022-11-20 09:49:19 +01:00
4638b98fa8 Regenerate AUTHORS file 2022-11-20 09:49:00 +01:00
a9f41d6be7 Put 'with_icon' param in the correct macro call
Fix for https://todo.sr.ht/~tsileo/microblog.pub/66
2022-11-20 09:47:54 +01:00
59dfc3d128 Update the install guide 2022-11-19 08:38:51 +01:00
822280c280 Tweak proxy client (increased timeout, no more HTTP2) 2022-11-19 08:32:44 +01:00
c83dd30f41 Increase admin session validity to 3 days 2022-11-19 08:16:53 +01:00
9d312bc229 Fix typing 2022-11-19 08:15:36 +01:00
b37b77ad34 Make local actor icon optional
If a remote actor has no icon, we show our local default icon.

If we have no icon, we should allow remote instances to show their
default icon, instead of sending ours.
2022-11-19 08:12:49 +01:00
9ee3f3b971 More progress on webmention replies 2022-11-19 08:12:33 +01:00
066f5ec900 Merge branch 'v2' into indieweb-merge-part2 2022-11-18 20:36:58 +01:00
a2254f2674 Add return type to hmac_sha256 2022-11-18 20:30:29 +01:00
2151733e4f Add robots meta tags on pages in robots.txt
Useful when the app is served at a non-root path and we're not handling
top-level /robots.txt requests.
2022-11-18 20:30:29 +01:00
3cff4e4507 Use BASE_URL when generating {proxied,resized}_image_url
Necessary when running at a non-root path
2022-11-18 20:30:29 +01:00
120f92a9ed Display Webmention as replies when applicable 2022-11-18 20:20:58 +01:00
ae8029cd22 Fix template 2022-11-17 21:12:16 +01:00
434fd98cd9 Merge IndieWeb likes/reposts with their AP counterpart 2022-11-17 21:03:24 +01:00
89c90fba56 Start to merge IndieWeb and AP interactions 2022-11-17 09:18:06 +01:00
e29fe0a079 Fix DM admin page showing deleted objects 2022-11-15 23:07:10 +01:00
c5aee435f4 Tweak README 2022-11-15 22:22:56 +01:00
224f5d3f55 Add AUTHORS file 2022-11-15 22:20:28 +01:00
6583feb87d Tweak the documentation about contributions 2022-11-15 22:17:55 +01:00
04e75c78e0 Handle inbox delete handler for actors 2022-11-15 21:47:51 +01:00
68c27e083f Allow to click on picture to see the original one 2022-11-14 21:23:41 +01:00
d52528584a Tweak template for the local delete button 2022-11-13 18:32:38 +01:00
36 changed files with 1249 additions and 243 deletions

AUTHORS (new file, 9 lines)
View File

@ -0,0 +1,9 @@
Thomas Sileo <t@a4.io>
Kevin Wallace <doof@doof.net>
Miguel Jacq <mig@mig5.net>
Alexey Shpakovsky <alexey@shpakovsky.ru>
Josh Washburne <josh@jodh.us>
Sam <samr1.dev@pm.me>
Ash McAllan <acegiak@gmail.com>
Cassio Zen <cassio@hey.com>
Cocoa <momijizukamori@gmail.com>

View File

@ -58,7 +58,7 @@ All the development takes place on [sourcehut](https://sr.ht/~tsileo/microblog.p
- [Issue tracker](https://todo.sr.ht/~tsileo/microblog.pub)
- [Mailing list](https://sr.ht/~tsileo/microblog.pub/lists)
Contributions are welcomed, check out the [documentation](https://docs.microblog.pub) for more details.
Contributions are welcomed, check out the [contributing section of the documentation](https://docs.microblog.pub/developer_guide.html#contributing) for more details.
## License

View File

@ -0,0 +1,32 @@
"""Add Webmention.webmention_type
Revision ID: fadfd359ce78
Revises: b28c0551c236
Create Date: 2022-11-16 19:42:56.925512+00:00
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = 'fadfd359ce78'
down_revision = 'b28c0551c236'
branch_labels = None
depends_on = None
def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('webmention', schema=None) as batch_op:
batch_op.add_column(sa.Column('webmention_type', sa.Enum('UNKNOWN', 'LIKE', 'REPLY', 'REPOST', name='webmentiontype'), nullable=True))
# ### end Alembic commands ###
def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('webmention', schema=None) as batch_op:
batch_op.drop_column('webmention_type')
# ### end Alembic commands ###

View File

@ -135,11 +135,6 @@ ME = {
"url": config.ID + "/", # XXX: the path is important for Mastodon compat
"manuallyApprovesFollowers": config.CONFIG.manually_approves_followers,
"attachment": _LOCAL_ACTOR_METADATA,
"icon": {
"mediaType": mimetypes.guess_type(config.CONFIG.icon_url)[0],
"type": "Image",
"url": config.CONFIG.icon_url,
},
"publicKey": {
"id": f"{config.ID}#main-key",
"owner": config.ID,
@ -148,6 +143,13 @@ ME = {
"tag": dedup_tags(_LOCAL_ACTOR_TAGS),
}
if config.CONFIG.icon_url:
ME["icon"] = {
"mediaType": mimetypes.guess_type(config.CONFIG.icon_url)[0],
"type": "Image",
"url": config.CONFIG.icon_url,
}
if ALSO_KNOWN_AS:
ME["alsoKnownAs"] = [ALSO_KNOWN_AS]

View File

@ -12,6 +12,7 @@ from sqlalchemy.orm import joinedload
from app import activitypub as ap
from app import media
from app.config import BASE_URL
from app.database import AsyncSession
from app.utils.datetime import as_utc
from app.utils.datetime import now
@ -111,14 +112,14 @@ class Actor:
if self.icon_url:
return media.proxied_media_url(self.icon_url)
else:
return "/static/nopic.png"
return BASE_URL + "/static/nopic.png"
@property
def resized_icon_url(self) -> str:
if self.icon_url:
return media.resized_media_url(self.icon_url, 50)
else:
return "/static/nopic.png"
return BASE_URL + "/static/nopic.png"
@property
def tags(self) -> list[ap.RawObject]:

View File

@ -11,6 +11,7 @@ from fastapi.exceptions import HTTPException
from fastapi.responses import RedirectResponse
from loguru import logger
from sqlalchemy import and_
from sqlalchemy import delete
from sqlalchemy import func
from sqlalchemy import or_
from sqlalchemy import select
@ -29,6 +30,7 @@ from app.boxes import send_block
from app.boxes import send_follow
from app.boxes import send_unblock
from app.config import EMOJIS
from app.config import SESSION_TIMEOUT
from app.config import generate_csrf_token
from app.config import session_serializer
from app.config import verify_csrf_token
@ -61,14 +63,17 @@ async def user_session_or_redirect(
)
if not session:
logger.info("No existing admin session")
raise _RedirectToLoginPage
try:
loaded_session = session_serializer.loads(session, max_age=3600 * 12)
loaded_session = session_serializer.loads(session, max_age=SESSION_TIMEOUT)
except Exception:
logger.exception("Failed to validate admin session")
raise _RedirectToLoginPage
if not loaded_session.get("is_logged_in"):
logger.info(f"Admin session invalidated: {loaded_session}")
raise _RedirectToLoginPage
return None
@ -439,6 +444,7 @@ async def admin_direct_messages(
models.InboxObject.ap_context.is_not(None),
# Skip transient objects like poll replies
models.InboxObject.is_transient.is_(False),
models.InboxObject.is_deleted.is_(False),
)
.group_by(models.InboxObject.ap_context, models.InboxObject.actor_id)
)
@ -461,6 +467,7 @@ async def admin_direct_messages(
models.OutboxObject.ap_context.is_not(None),
# Skip transient objects like poll replies
models.OutboxObject.is_transient.is_(False),
models.OutboxObject.is_deleted.is_(False),
)
.group_by(models.OutboxObject.ap_context)
)
@ -716,13 +723,9 @@ async def get_notifications(
actors_metadata = await get_actors_metadata(
db_session, [notif.actor for notif in notifications if notif.actor]
)
for notif in notifications:
notif.is_new = False
await db_session.commit()
more_unread_count = 0
next_cursor = None
if notifications and remaining_count > page_size:
decoded_next_cursor = notifications[-1].created_at
next_cursor = pagination.encode_cursor(decoded_next_cursor)
@ -736,7 +739,8 @@ async def get_notifications(
)
)
return await templates.render_template(
# Render the template before we change the new flag on notifications
tpl_resp = await templates.render_template(
db_session,
request,
"notifications.html",
@ -748,6 +752,13 @@ async def get_notifications(
},
)
if len({notif.id for notif in notifications if notif.is_new}):
for notif in notifications:
notif.is_new = False
await db_session.commit()
return tpl_resp
@router.get("/object")
async def admin_object(
@ -874,6 +885,42 @@ async def admin_actions_force_delete(
return RedirectResponse(redirect_url, status_code=302)
@router.post("/actions/force_delete_webmention")
async def admin_actions_force_delete_webmention(
request: Request,
webmention_id: int = Form(),
redirect_url: str = Form(),
csrf_check: None = Depends(verify_csrf_token),
db_session: AsyncSession = Depends(get_db_session),
) -> RedirectResponse:
webmention = await boxes.get_webmention_by_id(db_session, webmention_id)
if not webmention:
raise ValueError(f"Cannot find {webmention_id}")
if not webmention.outbox_object:
raise ValueError(f"Missing related outbox object for {webmention_id}")
# TODO: move this
logger.info(f"Deleting {webmention_id}")
webmention.is_deleted = True
await db_session.flush()
from app.webmentions import _handle_webmention_side_effects
await _handle_webmention_side_effects(
db_session, webmention, webmention.outbox_object
)
# Delete related notifications
notif_deletion_result = await db_session.execute(
delete(models.Notification)
.where(models.Notification.webmention_id == webmention.id)
.execution_options(synchronize_session=False)
)
logger.info(
f"Deleted {notif_deletion_result.rowcount} notifications" # type: ignore
)
await db_session.commit()
return RedirectResponse(redirect_url, status_code=302)
@router.post("/actions/follow")
async def admin_actions_follow(
request: Request,

View File

@ -12,6 +12,7 @@ from app import activitypub as ap
from app.actor import LOCAL_ACTOR
from app.actor import Actor
from app.actor import RemoteActor
from app.config import ID
from app.media import proxied_media_url
from app.utils.datetime import now
from app.utils.datetime import parse_isoformat
@ -212,6 +213,15 @@ class Object:
def in_reply_to(self) -> str | None:
return self.ap_object.get("inReplyTo")
@property
def is_local_reply(self) -> bool:
if not self.in_reply_to:
return False
return bool(
self.in_reply_to.startswith(ID) and self.content # Hide votes from Question
)
@property
def is_in_reply_to_from_inbox(self) -> bool | None:
if not self.in_reply_to:

View File

@ -1,4 +1,5 @@
"""Actions related to the AP inbox/outbox."""
import datetime
import uuid
from collections import defaultdict
from dataclasses import dataclass
@ -27,10 +28,11 @@ from app.actor import save_actor
from app.actor import update_actor_if_needed
from app.ap_object import RemoteObject
from app.config import BASE_URL
from app.config import BLOCKED_SERVERS
from app.config import ID
from app.config import MANUALLY_APPROVES_FOLLOWERS
from app.config import set_moved_to
from app.config import stream_visibility_callback
from app.customization import ObjectInfo
from app.database import AsyncSession
from app.outgoing_activities import new_outgoing_activity
from app.source import dedup_tags
@ -41,11 +43,24 @@ from app.utils import webmentions
from app.utils.datetime import as_utc
from app.utils.datetime import now
from app.utils.datetime import parse_isoformat
from app.utils.facepile import WebmentionReply
from app.utils.text import slugify
from app.utils.url import is_hostname_blocked
AnyboxObject = models.InboxObject | models.OutboxObject
def is_notification_enabled(notification_type: models.NotificationType) -> bool:
"""Checks if a given notification type is enabled."""
if notification_type.value == "pending_incoming_follower":
# This one cannot be disabled as it would prevent manually reviewing
# follow requests.
return True
if notification_type.value in config.CONFIG.disabled_notifications:
return False
return True
def allocate_outbox_id() -> str:
return uuid.uuid4().hex
@ -164,12 +179,13 @@ async def send_block(db_session: AsyncSession, ap_actor_id: str) -> None:
await new_outgoing_activity(db_session, actor.inbox_url, outbox_object.id)
# 4. Create a notification
notif = models.Notification(
notification_type=models.NotificationType.BLOCK,
actor_id=actor.id,
outbox_object_id=outbox_object.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.BLOCK):
notif = models.Notification(
notification_type=models.NotificationType.BLOCK,
actor_id=actor.id,
outbox_object_id=outbox_object.id,
)
db_session.add(notif)
await db_session.commit()
@ -201,7 +217,7 @@ async def send_delete(db_session: AsyncSession, ap_object_id: str) -> None:
raise ValueError("Should never happen")
outbox_object_to_delete.is_deleted = True
await db_session.commit()
await db_session.flush()
# Compute the original recipients
recipients = await _compute_recipients(
@ -216,14 +232,17 @@ async def send_delete(db_session: AsyncSession, ap_object_id: str) -> None:
db_session, outbox_object_to_delete.in_reply_to
)
if replied_object:
new_replies_count = await _get_replies_count(
db_session, replied_object.ap_id
)
if replied_object.is_from_outbox:
# Different helper here because we also count webmentions
new_replies_count = await _get_outbox_replies_count(
db_session, replied_object # type: ignore
)
else:
new_replies_count = await _get_replies_count(
db_session, replied_object.ap_id
)
replied_object.replies_count = new_replies_count
if replied_object.replies_count < 0:
logger.warning("negative replies count for {replied_object.ap_id}")
replied_object.replies_count = 0
else:
logger.info(f"{outbox_object_to_delete.in_reply_to} not found")
@ -420,7 +439,9 @@ async def _send_undo(db_session: AsyncSession, ap_object_id: str) -> None:
announced_object.announced_via_outbox_object_ap_id = None
# Send the Undo to the original recipients
recipients = await _compute_recipients(db_session, outbox_object.ap_object)
recipients = await _compute_recipients(
db_session, outbox_object_to_undo.ap_object
)
for rcp in recipients:
await new_outgoing_activity(db_session, rcp, outbox_object.id)
elif outbox_object_to_undo.ap_type == "Block":
@ -440,12 +461,13 @@ async def _send_undo(db_session: AsyncSession, ap_object_id: str) -> None:
outbox_object.id,
)
notif = models.Notification(
notification_type=models.NotificationType.UNBLOCK,
actor_id=blocked_actor.id,
outbox_object_id=outbox_object.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UNBLOCK):
notif = models.Notification(
notification_type=models.NotificationType.UNBLOCK,
actor_id=blocked_actor.id,
outbox_object_id=outbox_object.id,
)
db_session.add(notif)
else:
raise ValueError("Should never happen")
@ -1048,6 +1070,32 @@ async def get_outbox_object_by_ap_id(
) # type: ignore
async def get_outbox_object_by_slug_and_short_id(
db_session: AsyncSession,
slug: str,
short_id: str,
) -> models.OutboxObject | None:
return (
(
await db_session.execute(
select(models.OutboxObject)
.options(
joinedload(models.OutboxObject.outbox_object_attachments).options(
joinedload(models.OutboxObjectAttachment.upload)
)
)
.where(
models.OutboxObject.public_id.like(f"{short_id}%"),
models.OutboxObject.slug == slug,
models.OutboxObject.is_deleted.is_(False),
)
)
)
.unique()
.scalar_one_or_none()
)
async def get_anybox_object_by_ap_id(
db_session: AsyncSession, ap_id: str
) -> AnyboxObject | None:
@ -1057,6 +1105,20 @@ async def get_anybox_object_by_ap_id(
return await get_inbox_object_by_ap_id(db_session, ap_id)
async def get_webmention_by_id(
db_session: AsyncSession, webmention_id: int
) -> models.Webmention | None:
return (
await db_session.execute(
select(models.Webmention)
.where(models.Webmention.id == webmention_id)
.options(
joinedload(models.Webmention.outbox_object),
)
)
).scalar_one_or_none() # type: ignore
async def _handle_delete_activity(
db_session: AsyncSession,
from_actor: models.Actor,
@ -1124,6 +1186,23 @@ async def _handle_delete_activity(
logger.info("Removing actor from follower")
await db_session.delete(follower)
# Also mark Follow activities for this actor as deleted
follow_activities = (
await db_session.scalars(
select(models.OutboxObject).where(
models.OutboxObject.ap_type == "Follow",
models.OutboxObject.relates_to_actor_id
== ap_object_to_delete.id,
models.OutboxObject.is_deleted.is_(False),
)
)
).all()
for follow_activity in follow_activities:
logger.info(
f"Marking Follow activity {follow_activity.ap_id} as deleted"
)
follow_activity.is_deleted = True
following = (
await db_session.scalars(
select(models.Following).where(
@ -1184,6 +1263,67 @@ async def _get_replies_count(
)
async def _get_outbox_replies_count(
db_session: AsyncSession,
outbox_object: models.OutboxObject,
) -> int:
return (await _get_replies_count(db_session, outbox_object.ap_id)) + (
await db_session.scalar(
select(func.count(models.Webmention.id)).where(
models.Webmention.is_deleted.is_(False),
models.Webmention.outbox_object_id == outbox_object.id,
models.Webmention.webmention_type == models.WebmentionType.REPLY,
)
)
)
async def _get_outbox_likes_count(
db_session: AsyncSession,
outbox_object: models.OutboxObject,
) -> int:
return (
await db_session.scalar(
select(func.count(models.InboxObject.id)).where(
models.InboxObject.ap_type == "Like",
models.InboxObject.relates_to_outbox_object_id == outbox_object.id,
models.InboxObject.is_deleted.is_(False),
)
)
) + (
await db_session.scalar(
select(func.count(models.Webmention.id)).where(
models.Webmention.is_deleted.is_(False),
models.Webmention.outbox_object_id == outbox_object.id,
models.Webmention.webmention_type == models.WebmentionType.LIKE,
)
)
)
async def _get_outbox_announces_count(
db_session: AsyncSession,
outbox_object: models.OutboxObject,
) -> int:
return (
await db_session.scalar(
select(func.count(models.InboxObject.id)).where(
models.InboxObject.ap_type == "Announce",
models.InboxObject.relates_to_outbox_object_id == outbox_object.id,
models.InboxObject.is_deleted.is_(False),
)
)
) + (
await db_session.scalar(
select(func.count(models.Webmention.id)).where(
models.Webmention.is_deleted.is_(False),
models.Webmention.outbox_object_id == outbox_object.id,
models.Webmention.webmention_type == models.WebmentionType.REPOST,
)
)
)
async def _revert_side_effect_for_deleted_object(
db_session: AsyncSession,
delete_activity: models.InboxObject | None,
@ -1214,8 +1354,8 @@ async def _revert_side_effect_for_deleted_object(
# also needs to be forwarded
is_delete_needs_to_be_forwarded = True
new_replies_count = await _get_replies_count(
db_session, replied_object.ap_id
new_replies_count = await _get_outbox_replies_count(
db_session, replied_object # type: ignore
)
await db_session.execute(
@ -1245,15 +1385,16 @@ async def _revert_side_effect_for_deleted_object(
)
if related_object:
if related_object.is_from_outbox:
likes_count = await _get_outbox_likes_count(db_session, related_object)
await db_session.execute(
update(models.OutboxObject)
.where(
models.OutboxObject.id == related_object.id,
)
.values(likes_count=models.OutboxObject.likes_count - 1)
.values(likes_count=likes_count - 1)
)
elif (
deleted_ap_object.ap_type == "Annouce"
deleted_ap_object.ap_type == "Announce"
and deleted_ap_object.activity_object_ap_id
):
related_object = await get_outbox_object_by_ap_id(
@ -1262,12 +1403,15 @@ async def _revert_side_effect_for_deleted_object(
)
if related_object:
if related_object.is_from_outbox:
announces_count = await _get_outbox_announces_count(
db_session, related_object
)
await db_session.execute(
update(models.OutboxObject)
.where(
models.OutboxObject.id == related_object.id,
)
.values(announces_count=models.OutboxObject.announces_count - 1)
.values(announces_count=announces_count - 1)
)
# Delete any Like/Announce
@ -1396,11 +1540,12 @@ async def _send_accept(
raise ValueError("Should never happen")
await new_outgoing_activity(db_session, from_actor.inbox_url, outbox_activity.id)
notif = models.Notification(
notification_type=models.NotificationType.NEW_FOLLOWER,
actor_id=from_actor.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.NEW_FOLLOWER):
notif = models.Notification(
notification_type=models.NotificationType.NEW_FOLLOWER,
actor_id=from_actor.id,
)
db_session.add(notif)
async def send_reject(
@ -1439,11 +1584,12 @@ async def _send_reject(
raise ValueError("Should never happen")
await new_outgoing_activity(db_session, from_actor.inbox_url, outbox_activity.id)
notif = models.Notification(
notification_type=models.NotificationType.REJECTED_FOLLOWER,
actor_id=from_actor.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.REJECTED_FOLLOWER):
notif = models.Notification(
notification_type=models.NotificationType.REJECTED_FOLLOWER,
actor_id=from_actor.id,
)
db_session.add(notif)
async def _handle_undo_activity(
@ -1469,11 +1615,12 @@ async def _handle_undo_activity(
models.Follower.inbox_object_id == ap_activity_to_undo.id
)
)
notif = models.Notification(
notification_type=models.NotificationType.UNFOLLOW,
actor_id=from_actor.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UNFOLLOW):
notif = models.Notification(
notification_type=models.NotificationType.UNFOLLOW,
actor_id=from_actor.id,
)
db_session.add(notif)
elif ap_activity_to_undo.ap_type == "Like":
if not ap_activity_to_undo.activity_object_ap_id:
@ -1489,14 +1636,21 @@ async def _handle_undo_activity(
)
return
liked_obj.likes_count = models.OutboxObject.likes_count - 1
notif = models.Notification(
notification_type=models.NotificationType.UNDO_LIKE,
actor_id=from_actor.id,
outbox_object_id=liked_obj.id,
inbox_object_id=ap_activity_to_undo.id,
liked_obj.likes_count = (
await _get_outbox_likes_count(
db_session,
liked_obj,
)
- 1
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UNDO_LIKE):
notif = models.Notification(
notification_type=models.NotificationType.UNDO_LIKE,
actor_id=from_actor.id,
outbox_object_id=liked_obj.id,
inbox_object_id=ap_activity_to_undo.id,
)
db_session.add(notif)
elif ap_activity_to_undo.ap_type == "Announce":
if not ap_activity_to_undo.activity_object_ap_id:
@ -1514,20 +1668,22 @@ async def _handle_undo_activity(
announced_obj_from_outbox.announces_count = (
models.OutboxObject.announces_count - 1
)
notif = models.Notification(
notification_type=models.NotificationType.UNDO_ANNOUNCE,
actor_id=from_actor.id,
outbox_object_id=announced_obj_from_outbox.id,
inbox_object_id=ap_activity_to_undo.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UNDO_ANNOUNCE):
notif = models.Notification(
notification_type=models.NotificationType.UNDO_ANNOUNCE,
actor_id=from_actor.id,
outbox_object_id=announced_obj_from_outbox.id,
inbox_object_id=ap_activity_to_undo.id,
)
db_session.add(notif)
elif ap_activity_to_undo.ap_type == "Block":
notif = models.Notification(
notification_type=models.NotificationType.UNBLOCKED,
actor_id=from_actor.id,
inbox_object_id=ap_activity_to_undo.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UNBLOCKED):
notif = models.Notification(
notification_type=models.NotificationType.UNBLOCKED,
actor_id=from_actor.id,
inbox_object_id=ap_activity_to_undo.id,
)
db_session.add(notif)
else:
logger.warning(f"Don't know how to undo {ap_activity_to_undo.ap_type} activity")
@ -1591,12 +1747,13 @@ async def _handle_move_activity(
else:
logger.info(f"Already following target {new_actor_id}")
notif = models.Notification(
notification_type=models.NotificationType.MOVE,
actor_id=new_actor.id,
inbox_object_id=move_activity.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.MOVE):
notif = models.Notification(
notification_type=models.NotificationType.MOVE,
actor_id=new_actor.id,
inbox_object_id=move_activity.id,
)
db_session.add(notif)
async def _handle_update_activity(
@ -1754,16 +1911,26 @@ async def _process_note_object(
is_from_following = ro.actor.ap_id in {f.ap_actor_id for f in following}
is_reply = bool(ro.in_reply_to)
is_local_reply = (
ro.in_reply_to
and ro.in_reply_to.startswith(BASE_URL)
and ro.content # Hide votes from Question
)
is_local_reply = ro.is_local_reply
is_mention = False
hashtags = []
tags = ro.ap_object.get("tag", [])
for tag in ap.as_list(tags):
if tag.get("name") == LOCAL_ACTOR.handle or tag.get("href") == LOCAL_ACTOR.url:
is_mention = True
if tag.get("type") == "Hashtag":
if tag_name := tag.get("name"):
hashtags.append(tag_name)
object_info = ObjectInfo(
is_reply=is_reply,
is_local_reply=is_local_reply,
is_mention=is_mention,
is_from_following=is_from_following,
hashtags=hashtags,
actor_handle=ro.actor.handle,
remote_object=ro,
)
inbox_object = models.InboxObject(
server=urlparse(ro.ap_id).hostname,
@ -1781,9 +1948,7 @@ async def _process_note_object(
activity_object_ap_id=ro.activity_object_ap_id,
og_meta=await opengraph.og_meta_from_note(db_session, ro),
# Hide replies from the stream
is_hidden_from_stream=not (
(not is_reply and is_from_following) or is_mention or is_local_reply
),
is_hidden_from_stream=not stream_visibility_callback(object_info),
# We may already have some replies in DB
replies_count=await _get_replies_count(db_session, ro.ap_id),
)
@ -1809,8 +1974,8 @@ async def _process_note_object(
replied_object, # type: ignore # outbox check below
)
else:
new_replies_count = await _get_replies_count(
db_session, replied_object.ap_id
new_replies_count = await _get_outbox_replies_count(
db_session, replied_object # type: ignore
)
await db_session.execute(
@ -1858,7 +2023,7 @@ async def _process_note_object(
inbox_object_id=parent_activity.id,
)
if is_mention:
if is_mention and is_notification_enabled(models.NotificationType.MENTION):
notif = models.Notification(
notification_type=models.NotificationType.MENTION,
actor_id=from_actor.id,
@ -1957,13 +2122,14 @@ async def _handle_announce_activity(
models.OutboxObject.announces_count + 1
)
notif = models.Notification(
notification_type=models.NotificationType.ANNOUNCE,
actor_id=actor.id,
outbox_object_id=relates_to_outbox_object.id,
inbox_object_id=announce_activity.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.ANNOUNCE):
notif = models.Notification(
notification_type=models.NotificationType.ANNOUNCE,
actor_id=actor.id,
outbox_object_id=relates_to_outbox_object.id,
inbox_object_id=announce_activity.id,
)
db_session.add(notif)
else:
# Only show the announce in the stream if it comes from an actor
# in the following collection
@ -2056,15 +2222,19 @@ async def _handle_like_activity(
)
await db_session.delete(like_activity)
else:
relates_to_outbox_object.likes_count = models.OutboxObject.likes_count + 1
notif = models.Notification(
notification_type=models.NotificationType.LIKE,
actor_id=actor.id,
outbox_object_id=relates_to_outbox_object.id,
inbox_object_id=like_activity.id,
relates_to_outbox_object.likes_count = await _get_outbox_likes_count(
db_session,
relates_to_outbox_object,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.LIKE):
notif = models.Notification(
notification_type=models.NotificationType.LIKE,
actor_id=actor.id,
outbox_object_id=relates_to_outbox_object.id,
inbox_object_id=like_activity.id,
)
db_session.add(notif)
async def _handle_block_activity(
@ -2081,12 +2251,13 @@ async def _handle_block_activity(
return
# Create a notification
notif = models.Notification(
notification_type=models.NotificationType.BLOCKED,
actor_id=actor.id,
inbox_object_id=block_activity.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.BLOCKED):
notif = models.Notification(
notification_type=models.NotificationType.BLOCKED,
actor_id=actor.id,
inbox_object_id=block_activity.id,
)
db_session.add(notif)
async def _process_transient_object(
@ -2141,7 +2312,7 @@ async def save_to_inbox(
logger.exception("Failed to fetch actor")
return
if actor.server in BLOCKED_SERVERS:
if is_hostname_blocked(actor.server):
logger.warning(f"Server {actor.server} is blocked")
return
@ -2289,12 +2460,13 @@ async def save_to_inbox(
if activity_ro.ap_type == "Accept"
else models.NotificationType.FOLLOW_REQUEST_REJECTED
)
notif = models.Notification(
notification_type=notif_type,
actor_id=actor.id,
inbox_object_id=inbox_object.id,
)
db_session.add(notif)
if is_notification_enabled(notif_type):
notif = models.Notification(
notification_type=notif_type,
actor_id=actor.id,
inbox_object_id=inbox_object.id,
)
db_session.add(notif)
if activity_ro.ap_type == "Accept":
following = models.Following(
@ -2467,11 +2639,21 @@ async def fetch_actor_collection(db_session: AsyncSession, url: str) -> list[Act
@dataclass
class ReplyTreeNode:
ap_object: AnyboxObject
ap_object: AnyboxObject | None
wm_reply: WebmentionReply | None
children: list["ReplyTreeNode"]
is_requested: bool = False
is_root: bool = False
@property
def published_at(self) -> datetime.datetime:
if self.ap_object:
return self.ap_object.ap_published_at # type: ignore
elif self.wm_reply:
return self.wm_reply.published_at
else:
raise ValueError(f"Should never happen: {self}")
async def get_replies_tree(
db_session: AsyncSession,
@ -2545,6 +2727,7 @@ async def get_replies_tree(
for child in index.get(node.ap_object.ap_id, []): # type: ignore
child_node = ReplyTreeNode(
ap_object=child,
wm_reply=None,
is_requested=child.ap_id == requested_object.ap_id, # type: ignore
children=[],
)
@ -2553,7 +2736,7 @@ async def get_replies_tree(
return sorted(
children,
key=lambda node: node.ap_object.ap_published_at, # type: ignore
key=lambda node: node.published_at,
)
if None in nodes_by_in_reply_to:
@ -2566,6 +2749,7 @@ async def get_replies_tree(
root_node = ReplyTreeNode(
ap_object=root_ap_object,
wm_reply=None,
is_root=True,
is_requested=root_ap_object.ap_id == requested_object.ap_id,
children=[],

View File

@ -16,6 +16,8 @@ from loguru import logger
from mistletoe import markdown # type: ignore
from app.customization import _CUSTOM_ROUTES
from app.customization import _StreamVisibilityCallback
from app.customization import default_stream_visibility_callback
from app.utils.emoji import _load_emojis
from app.utils.version import get_version_commit
@ -42,11 +44,14 @@ except FileNotFoundError:
JS_HASH = "none"
try:
# To keep things simple, we keep a single hash for the 2 files
js_data_common = (ROOT_DIR / "app" / "static" / "common-admin.js").read_bytes()
js_data_new = (ROOT_DIR / "app" / "static" / "new.js").read_bytes()
JS_HASH = hashlib.md5(
js_data_common + js_data_new, usedforsecurity=False
).hexdigest()
dat = b""
for j in [
ROOT_DIR / "app" / "static" / "common.js",
ROOT_DIR / "app" / "static" / "common-admin.js",
ROOT_DIR / "app" / "static" / "new.js",
]:
dat += j.read_bytes()
JS_HASH = hashlib.md5(dat, usedforsecurity=False).hexdigest()
except FileNotFoundError:
pass
@ -91,7 +96,7 @@ class Config(pydantic.BaseModel):
name: str
summary: str
https: bool
icon_url: str
icon_url: str | None = None
image_url: str | None = None
secret: str
debug: bool = False
@ -116,6 +121,10 @@ class Config(pydantic.BaseModel):
sqlalchemy_database: str | None = None
key_path: str | None = None
session_timeout: int = 3600 * 24 * 3 # in seconds, 3 days by default
disabled_notifications: list[str] = []
# Only set when the app is served on a non-root path
id: str | None = None
@ -171,6 +180,7 @@ ALSO_KNOWN_AS = CONFIG.also_known_as
CUSTOM_CONTENT_SECURITY_POLICY = CONFIG.custom_content_security_policy
INBOX_RETENTION_DAYS = CONFIG.inbox_retention_days
SESSION_TIMEOUT = CONFIG.session_timeout
CUSTOM_FOOTER = (
markdown(CONFIG.custom_footer.replace("{version}", VERSION))
if CONFIG.custom_footer
@ -257,5 +267,16 @@ def verify_csrf_token(
return None
def hmac_sha256():
def hmac_sha256() -> hmac.HMAC:
return hmac.new(CONFIG.secret.encode(), digestmod=hashlib.sha256)
stream_visibility_callback: _StreamVisibilityCallback
try:
from data.stream import ( # type: ignore # noqa: F401, E501
custom_stream_visibility_callback,
)
stream_visibility_callback = custom_stream_visibility_callback
except ImportError:
stream_visibility_callback = default_stream_visibility_callback

View File

@ -1,12 +1,19 @@
from dataclasses import dataclass
from pathlib import Path
from typing import TYPE_CHECKING
from typing import Any
from typing import Callable
from fastapi import APIRouter
from fastapi import Depends
from fastapi import Request
from loguru import logger
from starlette.responses import JSONResponse
if TYPE_CHECKING:
from app.ap_object import RemoteObject
_DATA_DIR = Path().parent.resolve() / "data"
_Handler = Callable[..., Any]
@ -110,3 +117,39 @@ def get_custom_router() -> APIRouter | None:
router.add_api_route(path, handler.handler)
return router
@dataclass
class ObjectInfo:
# Is it a reply?
is_reply: bool
# Is it a reply to an outbox object
is_local_reply: bool
# Is the object mentioning the local actor
is_mention: bool
# Is it from someone the local actor is following
is_from_following: bool
# List of hashtags, e.g. #microblogpub
hashtags: list[str]
# @dev@microblog.pub
actor_handle: str
remote_object: "RemoteObject"
_StreamVisibilityCallback = Callable[[ObjectInfo], bool]
def default_stream_visibility_callback(object_info: ObjectInfo) -> bool:
result = (
(not object_info.is_reply and object_info.is_from_following)
or object_info.is_mention
or object_info.is_local_reply
)
logger.info(f"{object_info=}/{result=}")
return result
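For illustration, a sketch of the user-provided data/stream.py module that the app/config.py hunk above tries to import; the specific rules (the hashtag and the handle) are made-up examples:

# data/stream.py
from app.customization import ObjectInfo
from app.customization import default_stream_visibility_callback


def custom_stream_visibility_callback(object_info: ObjectInfo) -> bool:
    # Always show posts tagged #microblogpub, never show a specific account,
    # and fall back to the stock rules for everything else.
    if "#microblogpub" in object_info.hashtags:
        return True
    if object_info.actor_handle == "@spam@example.com":
        return False
    return default_stream_visibility_callback(object_info)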

View File

@ -1,5 +1,6 @@
import base64
import hashlib
import json
import typing
from dataclasses import dataclass
from datetime import datetime
@ -22,12 +23,12 @@ from sqlalchemy import select
from app import activitypub as ap
from app import config
from app.config import BLOCKED_SERVERS
from app.config import KEY_PATH
from app.database import AsyncSession
from app.database import get_db_session
from app.key import Key
from app.utils.datetime import now
from app.utils.url import is_hostname_blocked
_KEY_CACHE: MutableMapping[str, Key] = LFUCache(256)
@ -183,7 +184,7 @@ async def httpsig_checker(
)
server = urlparse(key_id).hostname
if server in BLOCKED_SERVERS:
if is_hostname_blocked(server):
return HTTPSigInfo(
has_valid_signature=False,
server=server,
@ -198,6 +199,32 @@ async def httpsig_checker(
server=server,
)
# Try to drop Delete activity spam early on, as this prevents making an extra
# HTTP request trying to fetch an unavailable actor to verify the HTTP sig
try:
if request.method == "POST" and request.url.path.endswith("/inbox"):
from app import models # TODO: solve this circular import
activity = json.loads(body)
actor_id = ap.get_id(activity["actor"])
if (
ap.as_list(activity["type"])[0] == "Delete"
and actor_id == ap.get_id(activity["object"])
and not (
await db_session.scalars(
select(models.Actor).where(
models.Actor.ap_id == actor_id,
)
)
).one_or_none()
):
logger.info(f"Dropping Delete activity early for {body=}")
raise fastapi.HTTPException(status_code=202)
except fastapi.HTTPException as http_exc:
raise http_exc
except Exception:
logger.exception("Failed to check for Delete spam")
# logger.debug(f"hsig={hsig}")
signed_string, signature_date = _build_signed_string(
hsig["headers"],

View File

@ -73,6 +73,9 @@ from app.templates import is_current_user_admin
from app.uploads import UPLOAD_DIR
from app.utils import pagination
from app.utils.emoji import EMOJIS_BY_NAME
from app.utils.facepile import Face
from app.utils.facepile import WebmentionReply
from app.utils.facepile import merge_faces
from app.utils.highlight import HIGHLIGHT_CSS_HASH
from app.utils.url import check_url
from app.webfinger import get_remote_follow_template
@ -282,7 +285,6 @@ async def redirect_to_remote_instance(
async def index(
request: Request,
db_session: AsyncSession = Depends(get_db_session),
_: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
page: int | None = None,
) -> templates.TemplateResponse | ActivityPubResponse:
if is_activitypub_requested(request):
@ -294,7 +296,7 @@ async def index(
models.OutboxObject.visibility == ap.VisibilityEnum.PUBLIC,
models.OutboxObject.is_deleted.is_(False),
models.OutboxObject.is_hidden_from_homepage.is_(False),
models.OutboxObject.ap_type != "Article",
models.OutboxObject.ap_type.in_(["Announce", "Note", "Video", "Question"]),
)
q = select(models.OutboxObject).where(*where)
total_count = await db_session.scalar(
@ -724,7 +726,7 @@ async def _fetch_webmentions(
models.Webmention.outbox_object_id == outbox_object.id,
models.Webmention.is_deleted.is_(False),
)
.limit(10)
.limit(50)
)
).all()
@ -774,23 +776,90 @@ async def outbox_by_public_id(
is_current_user_admin=is_current_user_admin(request),
)
webmentions = await _fetch_webmentions(db_session, maybe_object)
likes = await _fetch_likes(db_session, maybe_object)
shares = await _fetch_shares(db_session, maybe_object)
webmentions = await _fetch_webmentions(db_session, maybe_object)
return await templates.render_template(
db_session,
request,
"object.html",
{
"replies_tree": replies_tree,
"replies_tree": _merge_replies(replies_tree, webmentions),
"outbox_object": maybe_object,
"likes": likes,
"shares": shares,
"webmentions": webmentions,
"likes": _merge_faces_from_inbox_object_and_webmentions(
likes,
webmentions,
models.WebmentionType.LIKE,
),
"shares": _merge_faces_from_inbox_object_and_webmentions(
shares,
webmentions,
models.WebmentionType.REPOST,
),
"webmentions": _filter_webmentions(webmentions),
},
)
def _filter_webmentions(
webmentions: list[models.Webmention],
) -> list[models.Webmention]:
return [
wm
for wm in webmentions
if wm.webmention_type
not in [
models.WebmentionType.LIKE,
models.WebmentionType.REPOST,
models.WebmentionType.REPLY,
]
]
def _merge_faces_from_inbox_object_and_webmentions(
inbox_objects: list[models.InboxObject],
webmentions: list[models.Webmention],
webmention_type: models.WebmentionType,
) -> list[Face]:
wm_faces = []
for wm in webmentions:
if wm.webmention_type != webmention_type:
continue
if face := Face.from_webmention(wm):
wm_faces.append(face)
return merge_faces(
[Face.from_inbox_object(obj) for obj in inbox_objects] + wm_faces
)
def _merge_replies(
reply_tree_node: boxes.ReplyTreeNode,
webmentions: list[models.Webmention],
) -> boxes.ReplyTreeNode:
# TODO: return None as we update the object in place
webmention_replies = []
for wm in [
wm for wm in webmentions if wm.webmention_type == models.WebmentionType.REPLY
]:
if rep := WebmentionReply.from_webmention(wm):
webmention_replies.append(
boxes.ReplyTreeNode(
ap_object=None,
wm_reply=rep,
is_requested=False,
children=[],
)
)
reply_tree_node.children = sorted(
reply_tree_node.children + webmention_replies,
key=lambda node: node.published_at,
reverse=True,
)
return reply_tree_node
@app.get("/articles/{short_id}/{slug}")
async def article_by_slug(
short_id: str,
@ -799,24 +868,8 @@ async def article_by_slug(
db_session: AsyncSession = Depends(get_db_session),
httpsig_info: httpsig.HTTPSigInfo = Depends(httpsig.httpsig_checker),
) -> ActivityPubResponse | templates.TemplateResponse | RedirectResponse:
maybe_object = (
(
await db_session.execute(
select(models.OutboxObject)
.options(
joinedload(models.OutboxObject.outbox_object_attachments).options(
joinedload(models.OutboxObjectAttachment.upload)
)
)
.where(
models.OutboxObject.public_id.like(f"{short_id}%"),
models.OutboxObject.slug == slug,
models.OutboxObject.is_deleted.is_(False),
)
)
)
.unique()
.scalar_one_or_none()
maybe_object = await boxes.get_outbox_object_by_slug_and_short_id(
db_session, slug, short_id
)
if not maybe_object:
raise HTTPException(status_code=404)
@ -840,11 +893,19 @@ async def article_by_slug(
request,
"object.html",
{
"replies_tree": replies_tree,
"replies_tree": _merge_replies(replies_tree, webmentions),
"outbox_object": maybe_object,
"likes": likes,
"shares": shares,
"webmentions": webmentions,
"likes": _merge_faces_from_inbox_object_and_webmentions(
likes,
webmentions,
models.WebmentionType.LIKE,
),
"shares": _merge_faces_from_inbox_object_and_webmentions(
shares,
webmentions,
models.WebmentionType.REPOST,
),
"webmentions": _filter_webmentions(webmentions),
},
)
@ -1118,11 +1179,11 @@ async def nodeinfo(
)
proxy_client = httpx.AsyncClient(follow_redirects=True, http2=True)
async def _proxy_get(
request: starlette.requests.Request, url: str, stream: bool
proxy_client: httpx.AsyncClient,
request: starlette.requests.Request,
url: str,
stream: bool,
) -> httpx.Response:
# Request the URL (and filter request headers)
proxy_req = proxy_client.build_request(
@ -1169,18 +1230,29 @@ async def serve_proxy_media(
exp: int,
sig: str,
encoded_url: str,
background_tasks: fastapi.BackgroundTasks,
) -> StreamingResponse | PlainTextResponse:
# Decode the base64-encoded URL
url = base64.urlsafe_b64decode(encoded_url).decode()
check_url(url)
media.verify_proxied_media_sig(exp, url, sig)
proxy_resp = await _proxy_get(request, url, stream=True)
proxy_client = httpx.AsyncClient(
follow_redirects=True,
timeout=httpx.Timeout(timeout=10.0),
transport=httpx.AsyncHTTPTransport(retries=1),
)
async def _close_proxy_client():
await proxy_client.aclose()
background_tasks.add_task(_close_proxy_client)
proxy_resp = await _proxy_get(proxy_client, request, url, stream=True)
if proxy_resp.status_code >= 300:
logger.info(f"failed to proxy {url}, got {proxy_resp.status_code}")
await proxy_resp.aclose()
return PlainTextResponse(
"proxy error",
status_code=proxy_resp.status_code,
)
@ -1213,6 +1285,7 @@ async def serve_proxy_media_resized(
sig: str,
encoded_url: str,
size: int,
background_tasks: fastapi.BackgroundTasks,
) -> PlainTextResponse:
if size not in {50, 740}:
raise ValueError("Unsupported size")
@ -1230,11 +1303,21 @@ async def serve_proxy_media_resized(
headers=resp_headers,
)
proxy_resp = await _proxy_get(request, url, stream=False)
proxy_client = httpx.AsyncClient(
follow_redirects=True,
timeout=httpx.Timeout(timeout=10.0),
transport=httpx.AsyncHTTPTransport(retries=1),
)
async def _close_proxy_client():
await proxy_client.aclose()
background_tasks.add_task(_close_proxy_client)
proxy_resp = await _proxy_get(proxy_client, request, url, stream=False)
if proxy_resp.status_code >= 300:
logger.info(f"failed to proxy {url}, got {proxy_resp.status_code}")
await proxy_resp.aclose()
return PlainTextResponse(
"proxy error",
status_code=proxy_resp.status_code,
)
@ -1398,7 +1481,7 @@ async def json_feed(
],
}
)
return {
result = {
"version": "https://jsonfeed.org/version/1",
"title": f"{LOCAL_ACTOR.display_name}'s microblog'",
"home_page_url": LOCAL_ACTOR.url,
@ -1406,10 +1489,12 @@ async def json_feed(
"author": {
"name": LOCAL_ACTOR.display_name,
"url": LOCAL_ACTOR.url,
"avatar": LOCAL_ACTOR.icon_url,
},
"items": data,
}
if LOCAL_ACTOR.icon_url:
result["author"]["avatar"] = LOCAL_ACTOR.icon_url # type: ignore
return result
async def _gen_rss_feed(
@ -1421,7 +1506,8 @@ async def _gen_rss_feed(
fg.description(f"{LOCAL_ACTOR.display_name}'s microblog")
fg.author({"name": LOCAL_ACTOR.display_name})
fg.link(href=LOCAL_ACTOR.url, rel="alternate")
fg.logo(LOCAL_ACTOR.icon_url)
if LOCAL_ACTOR.icon_url:
fg.logo(LOCAL_ACTOR.icon_url)
fg.language("en")
outbox_objects = await _get_outbox_for_feed(db_session)

View File

@ -468,6 +468,14 @@ class IndieAuthAccessToken(Base):
is_revoked = Column(Boolean, nullable=False, default=False)
@enum.unique
class WebmentionType(str, enum.Enum):
UNKNOWN = "unknown"
LIKE = "like"
REPLY = "reply"
REPOST = "repost"
class Webmention(Base):
__tablename__ = "webmention"
__table_args__ = (UniqueConstraint("source", "target", name="uix_source_target"),)
@ -484,6 +492,8 @@ class Webmention(Base):
outbox_object_id = Column(Integer, ForeignKey("outbox.id"), nullable=False)
outbox_object = relationship(OutboxObject, uselist=False)
webmention_type = Column(Enum(WebmentionType), nullable=True)
@property
def as_facepile_item(self) -> webmentions.Webmention | None:
if not self.source_microformats:
@ -493,6 +503,7 @@ class Webmention(Base):
self.source_microformats["items"], self.source
)
except Exception:
# TODO: return a facepile with the unknown image
logger.warning(
f"Failed to generate facefile item for Webmention id={self.id}"
)

View File

@ -51,17 +51,20 @@ $code-highlight-background: #f0f0f0;
.p-summary {
display: inline-block;
}
label {
.show-more-btn {
margin-left: 5px;
}
.show-more-state {
display: none;
summary {
display: inline-block;
}
.show-more-state ~ .obj-content {
margin-top: 0;
summary::-webkit-details-marker {
display: none
}
.show-more-state:checked ~ .obj-content {
display: none;
&:not([open]) .show-more-btn::after {
content: 'show more';
}
&[open] .show-more-btn::after {
content: 'show less';
}
}
.sensitive-attachment {
@ -467,6 +470,9 @@ a.label-btn {
span {
color: $muted-color;
}
span.new {
color: $secondary-color;
}
}
.actor-metadata {
color: $muted-color;
@ -541,3 +547,26 @@ a.label-btn {
content: ': ';
}
}
.margin-top-20 {
margin-top: 20px;
}
.video-wrapper {
position: relative;
}
.video-gif-overlay {
display: none;
}
.video-gif-mode + .video-gif-overlay {
display: block;
position: absolute;
top: 5px;
left: 5px;
padding: 0 3px;
font-size: 0.8em;
background: rgba(0,0,0,.5);
color: #fff;
}

app/static/common.js (new file, 32 lines)
View File

@ -0,0 +1,32 @@
function hasAudio (video) {
return video.mozHasAudio ||
Boolean(video.webkitAudioDecodedByteCount) ||
Boolean(video.audioTracks && video.audioTracks.length);
}
function setVideoInGIFMode(video) {
if (!hasAudio(video)) {
if (typeof video.loop == 'boolean' && video.duration <= 10.0) {
video.classList.add("video-gif-mode");
video.loop = true;
video.controls = false;
video.addEventListener("mouseover", () => {
video.play();
})
video.addEventListener("mouseleave", () => {
video.pause();
})
}
};
}
var items = document.getElementsByTagName("video")
for (var i = 0; i < items.length; i++) {
if (items[i].duration) {
setVideoInGIFMode(items[i]);
} else {
items[i].addEventListener("loadeddata", function() {
setVideoInGIFMode(this);
});
}
}

View File

@ -27,6 +27,7 @@ from app.ap_object import Object
from app.config import BASE_URL
from app.config import CUSTOM_FOOTER
from app.config import DEBUG
from app.config import SESSION_TIMEOUT
from app.config import VERSION
from app.config import generate_csrf_token
from app.config import session_serializer
@ -69,10 +70,10 @@ def is_current_user_admin(request: Request) -> bool:
try:
loaded_session = session_serializer.loads(
session_cookie,
max_age=3600 * 12,
max_age=SESSION_TIMEOUT,
)
except Exception:
pass
logger.exception("Failed to validate session timeout")
else:
is_admin = loaded_session.get("is_logged_in")
@ -335,6 +336,14 @@ def _clean_html(html: str, note: Object) -> str:
raise
def _clean_html_wm(html: str) -> str:
return bleach.clean(
html,
attributes=ALLOWED_ATTRIBUTES,
strip=True,
)
def _timeago(original_dt: datetime) -> str:
dt = original_dt
if dt.tzinfo:
@ -411,6 +420,7 @@ def _poll_item_pct(item: ap.RawObject, voters_count: int) -> int:
_templates.env.filters["domain"] = _filter_domain
_templates.env.filters["media_proxy_url"] = _media_proxy_url
_templates.env.filters["clean_html"] = _clean_html
_templates.env.filters["clean_html_wm"] = _clean_html_wm
_templates.env.filters["timeago"] = _timeago
_templates.env.filters["format_date"] = _format_date
_templates.env.filters["has_media_type"] = _has_media_type

View File

@ -3,6 +3,7 @@
{% block head %}
<title>{{ local_actor.display_name }}'s followers</title>
<meta name="robots" content="noindex, nofollow">
{% endblock %}
{% block content %}

View File

@ -3,6 +3,7 @@
{% block head %}
<title>{{ local_actor.display_name }}'s follows</title>
<meta name="robots" content="noindex, nofollow">
{% endblock %}
{% block content %}

View File

@ -26,24 +26,30 @@
<div class="h-feed">
<data class="p-name" value="{{ local_actor.display_name}}'s notes"></data>
{% for outbox_object in objects %}
{% if outbox_object.ap_type in ["Note", "Article", "Video", "Question"] %}
{% if outbox_object.ap_type in ["Note", "Video", "Question"] %}
{{ utils.display_object(outbox_object) }}
{% elif outbox_object.ap_type == "Announce" %}
<div class="shared-header"><strong>{{ utils.display_tiny_actor_icon(local_actor) }} {{ local_actor.display_name | clean_html(local_actor) | safe }}</strong> shared <span title="{{ outbox_object.ap_published_at.isoformat() }}">{{ outbox_object.ap_published_at | timeago }}</span></div>
{{ utils.display_object(outbox_object.relates_to_anybox_object) }}
<div class="h-entry" id="{{ outbox_object.permalink_id }}">
<div class="shared-header"><strong><a class="p-author h-card" href="{{ local_actor.url }}">{{ utils.display_tiny_actor_icon(local_actor) }} {{ local_actor.display_name | clean_html(local_actor) | safe }}</a></strong> shared <span title="{{ outbox_object.ap_published_at.isoformat() }}">{{ outbox_object.ap_published_at | timeago }}</span></div>
<div class="h-cite u-repost-of">
{{ utils.display_object(outbox_object.relates_to_anybox_object, is_h_entry=False) }}
</div>
</div>
{% endif %}
{% endfor %}
</div>
<div class="box">
{% if has_previous_page %}
<a href="{{ url_for("index") }}?page={{ current_page - 1 }}">Previous</a>
{% endif %}
{% if has_previous_page or has_next_page %}
<div class="box">
{% if has_previous_page %}
<a href="{{ url_for("index") }}?page={{ current_page - 1 }}">Previous</a>
{% endif %}
{% if has_next_page %}
<a href="{{ url_for("index") }}?page={{ current_page + 1 }}">Next</a>
{% endif %}
</div>
{% if has_next_page %}
<a href="{{ url_for("index") }}?page={{ current_page + 1 }}">Next</a>
{% endif %}
</div>
{% endif %}
{% else %}
<div class="empty-state">

View File

@ -55,5 +55,6 @@
{% if is_admin %}
<script src="{{ BASE_URL }}/static/common-admin.js?v={{ JS_HASH }}"></script>
{% endif %}
<script src="{{ BASE_URL }}/static/common.js?v={{ JS_HASH }}"></script>
</body>
</html>

View File

@ -1,5 +1,8 @@
{%- import "utils.html" as utils with context -%}
{% extends "layout.html" %}
{% block head %}
<meta name="robots" content="noindex, nofollow">
{% endblock %}
{% block main_tag %} class="main-flex"{% endblock %}
{% block content %}
<div class="centered">

View File

@ -10,6 +10,9 @@
<a href="{{ url_for("admin_profile") }}?actor_id={{ notif.actor.ap_id }}">
{% if with_icon %}{{ utils.display_tiny_actor_icon(notif.actor) }}{% endif %} {{ notif.actor.display_name | clean_html(notif.actor) | safe }}</a> {{ text }}
<span title="{{ notif.created_at.isoformat() }}">{{ notif.created_at | timeago }}</span>
{% if notif.is_new %}
<span class="new">new</span>
{% endif %}
</div>
{% endmacro %}
@ -66,8 +69,8 @@
{{ notif_actor_action(notif, "shared a post", with_icon=True) }}
{{ utils.display_object(notif.outbox_object) }}
{% elif notif.notification_type.value == "undo_announce" %}
{{ notif_actor_action(notif, "unshared a post") }}
{{ utils.display_object(notif.outbox_object, with_icon=True) }}
{{ notif_actor_action(notif, "unshared a post", with_icon=True) }}
{{ utils.display_object(notif.outbox_object) }}
{% elif notif.notification_type.value == "mention" %}
{{ notif_actor_action(notif, "mentioned you") }}
{{ utils.display_object(notif.inbox_object) }}

View File

@ -31,9 +31,16 @@
{% macro display_replies_tree(replies_tree_node) %}
{% if replies_tree_node.is_requested %}
{{ utils.display_object(replies_tree_node.ap_object, likes=likes, shares=shares, webmentions=webmentions, expanded=not replies_tree_node.is_root, is_object_page=True) }}
{{ utils.display_object(replies_tree_node.ap_object, likes=likes, shares=shares, webmentions=webmentions, expanded=not replies_tree_node.is_root, is_object_page=True, is_h_entry=False) }}
{% else %}
{{ utils.display_object(replies_tree_node.ap_object) }}
{% if replies_tree_node.wm_reply %}
{# u-comment h-cite is displayed by default for webmention #}
{{ utils.display_webmention_reply(replies_tree_node.wm_reply) }}
{% else %}
<div class="u-comment h-cite">
{{ utils.display_object(replies_tree_node.ap_object, is_h_entry=False) }}
</div>
{% endif %}
{% endif %}
{% for child in replies_tree_node.children %}
@ -42,6 +49,8 @@
{% endmacro %}
<div class="h-entry">
{{ display_replies_tree(replies_tree) }}
</div>
{% endblock %}

View File

@ -3,6 +3,7 @@
{% block head %}
<title>Remote follow {{ local_actor.display_name }}</title>
<meta name="robots" content="noindex, nofollow">
{% endblock %}
{% block content %}

View File

@ -3,6 +3,7 @@
{% block head %}
<title>Interact from your instance</title>
<meta name="robots" content="noindex, nofollow">
{% endblock %}
{% block content %}

View File

@ -142,6 +142,17 @@
{% endblock %}
{% endmacro %}
{% macro admin_force_delete_webmention_button(webmention_id, permalink_id=None) %}
{% block admin_force_delete_webmention_button scoped %}
<form action="{{ request.url_for("admin_actions_force_delete_webmention") }}" class="object-delete-form" method="POST">
{{ embed_csrf_token() }}
{{ embed_redirect_url(permalink_id) }}
<input type="hidden" name="webmention_id" value="{{ webmention_id }}">
<input type="submit" value="local delete">
</form>
{% endblock %}
{% endmacro %}
{% macro admin_announce_button(ap_object_id, permalink_id=None) %}
{% block admin_announce_button scoped %}
<form action="{{ request.url_for("admin_actions_announce") }}" method="POST">
@ -236,7 +247,7 @@
{% macro display_tiny_actor_icon(actor) %}
{% block display_tiny_actor_icon scoped %}
<img class="tiny-actor-icon" src="{{ actor.resized_icon_url }}" alt="{{ actor.display_name }}'s avatar">
<img class="tiny-actor-icon" src="{{ actor.resized_icon_url }}" alt="">
{% endblock %}
{% endmacro %}
@ -413,12 +424,17 @@
{% if attachment.type == "Image" or (attachment | has_media_type("image")) %}
{% if attachment.url not in object.inlined_images %}
<img src="{{ attachment.resized_url or attachment.proxied_url }}"{% if attachment.name %} title="{{ attachment.name }}" alt="{{ attachment.name }}"{% endif %} class="attachment">
<a class="media-link" href="{{ attachment.proxied_url }}" target="_blank">
<img src="{{ attachment.resized_url or attachment.proxied_url }}"{% if attachment.name %} title="{{ attachment.name }}" alt="{{ attachment.name }}"{% endif %} class="attachment u-photo">
</a>
{% endif %}
{% elif attachment.type == "Video" or (attachment | has_media_type("video")) %}
<video controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} title="{{ attachment.name }}"{% endif %}></video>
<div class="video-wrapper">
<video controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name %} title="{{ attachment.name }}"{% endif %} class="u-video"></video>
<div class="video-gif-overlay">GIF</div>
</div>
{% elif attachment.type == "Audio" or (attachment | has_media_type("audio")) %}
<audio controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name%} title="{{ attachment.name }}"{% endif %} class="attachment"></audio>
<audio controls preload="metadata" src="{{ attachment.url | media_proxy_url }}"{% if attachment.name%} title="{{ attachment.name }}"{% endif %} class="attachment u-audio"></audio>
{% elif attachment.type == "Link" %}
<a href="{{ attachment.url }}" class="attachment">{{ attachment.url | truncate(64, True) }}</a> ({{ attachment.mimetype}})
{% else %}
@ -439,11 +455,55 @@
{% endblock %}
{% endmacro %}
{% macro display_object(object, likes=[], shares=[], webmentions=[], expanded=False, actors_metadata={}, is_object_page=False) %}
{% macro display_webmention_reply(wm_reply) %}
{% block display_webmention_reply scoped %}
<div class="ap-object u-comment h-cite">
<div class="actor-box h-card p-author">
<div class="icon-box">
<img src="{{ wm_reply.face.picture_url }}" alt="{{ wm_reply.face.name }}'s avatar" class="actor-icon u-photo">
</div>
<a href="{{ wm_reply.face.url }}" class="u-url">
<div><strong class="p-name">{{ wm_reply.face.name | clean_html_wm | safe }}</strong></div>
<div class="actor-handle">{{ wm_reply.face.url | truncate(64, True) }}</div>
</a>
</div>
<p class="in-reply-to">in reply to <a href="{{ wm_reply.in_reply_to }}" title="{{ wm_reply.in_reply_to }}" rel="nofollow">
this object
</a></p>
<div class="obj-content margin-top-20">
<div class="e-content">
{{ wm_reply.content | clean_html_wm | safe }}
</div>
</div>
<nav class="flexbox activity-bar margin-top-20">
<ul>
<li>
<div><a href="{{ wm_reply.url }}" rel="nofollow" class="object-permalink u-url u-uid">permalink</a></div>
</li>
<li>
<time class="dt-published" datetime="{{ wm_reply.published_at.replace(microsecond=0).isoformat() }}" title="{{ wm_reply.published_at.replace(microsecond=0).isoformat() }}">{{ wm_reply.published_at | timeago }}</time>
</li>
{% if is_admin %}
<li>
{{ admin_force_delete_webmention_button(wm_reply.webmention_id) }}
</li>
{% endif %}
</ul>
</nav>
</div>
{% endblock %}
{% endmacro %}
{% macro display_object(object, likes=[], shares=[], webmentions=[], expanded=False, actors_metadata={}, is_object_page=False, is_h_entry=True) %}
{% block display_object scoped %}
{% set is_article_mode = object.is_from_outbox and object.ap_type == "Article" and is_object_page %}
{% if object.ap_type in ["Note", "Article", "Video", "Page", "Question", "Event"] %}
<div class="ap-object {% if expanded %}ap-object-expanded {% endif %}h-entry" id="{{ object.permalink_id }}">
<div class="ap-object {% if expanded %}ap-object-expanded {% endif %}{% if is_h_entry %}h-entry{% endif %}" id="{{ object.permalink_id }}">
{% if is_article_mode %}
<data class="h-card">
@ -457,7 +517,7 @@
{% if object.in_reply_to %}
<p class="in-reply-to">in reply to <a href="{% if is_admin and object.is_in_reply_to_from_inbox %}{{ url_for("get_lookup") }}?query={% endif %}{{ object.in_reply_to }}" title="{{ object.in_reply_to }}" rel="nofollow">
this {{ object.ap_type|lower }}
this object
</a></p>
{% endif %}
@ -492,12 +552,13 @@
{% endif %}
{% if object.summary %}
<div class="show-more-wrapper">
<div class="p-summary">
<p>{{ object.summary | clean_html(object) | safe }}</p>
</div>
<label for="show-more-{{ object.permalink_id }}" class="show-more-btn">show/hide more</label>
<input class="show-more-state" type="checkbox" aria-hidden="true" id="show-more-{{ object.permalink_id }}" checked>
<details class="show-more-wrapper">
<summary>
<div class="p-summary">
<p>{{ object.summary | clean_html(object) | safe }}</p>
</div>
<span class="show-more-btn" aria-hidden="true"></span>
</summary>
{% endif %}
<div class="obj-content">
<div class="e-content">
@ -558,7 +619,7 @@
</div>
{% if object.summary %}
</div>
</details>
{% endif %}
<div class="activity-attachment">
@ -693,7 +754,7 @@
{{ admin_expand_button(object) }}
</li>
{% endif %}
{% if object.is_from_inbox %}
{% if object.is_from_inbox and not object.announced_via_outbox_object_ap_id and object.is_local_reply %}
<li>
{{ admin_force_delete_button(object.ap_id) }}
</li>
@ -709,8 +770,8 @@
<div class="interactions-block">Likes
<div class="facepile-wrapper">
{% for like in likes %}
<a href="{% if is_admin %}{{ url_for("admin_profile") }}?actor_id={{ like.actor.ap_id }}{% else %}{{ like.actor.url }}{% endif %}" title="{{ like.actor.handle }}" rel="noreferrer">
<img src="{{ like.actor.resized_icon_url }}" alt="{{ like.actor.handle}}">
<a href="{% if is_admin and like.ap_actor_id %}{{ url_for("admin_profile") }}?actor_id={{ like.ap_actor_id }}{% else %}{{ like.url }}{% endif %}" title="{{ like.name }}" rel="noreferrer">
<img src="{{ like.picture_url }}" alt="{{ like.name }}">
</a>
{% endfor %}
{% if object.likes_count > likes | length %}
@ -726,8 +787,8 @@
<div class="interactions-block">Shares
<div class="facepile-wrapper">
{% for share in shares %}
<a href="{% if is_admin %}{{ url_for("admin_profile") }}?actor_id={{ share.actor.ap_id }}{% else %}{{ share.actor.url }}{% endif %}" title="{{ share.actor.handle }}" rel="noreferrer">
<img src="{{ share.actor.resized_icon_url }}" alt="{{ share.actor.handle}}">
<a href="{% if is_admin and share.ap_actor_id %}{{ url_for("admin_profile") }}?actor_id={{ share.ap_actor_id }}{% else %}{{ share.url }}{% endif %}" title="{{ share.name }}" rel="noreferrer">
<img src="{{ share.picture_url }}" alt="{{ share.name }}">
</a>
{% endfor %}
{% if object.announces_count > shares | length %}

View File

@ -23,6 +23,8 @@ def _load_emojis(root_dir: Path, base_url: str) -> None:
mt = mimetypes.guess_type(emoji.name)[0]
if mt and mt.startswith("image/"):
name = emoji.name.split(".")[0]
if not re.match(EMOJI_REGEX, f":{name}:"):
continue
ap_emoji: "RawObject" = {
"type": "Emoji",
"name": f":{name}:",

app/utils/facepile.py (new file, 159 lines)
View File

@ -0,0 +1,159 @@
import datetime
from dataclasses import dataclass
from typing import Any
from typing import Optional
from loguru import logger
from app import media
from app.models import InboxObject
from app.models import Webmention
from app.utils.datetime import parse_isoformat
from app.utils.url import make_abs
@dataclass
class Face:
ap_actor_id: str | None
url: str
name: str
picture_url: str
created_at: datetime.datetime
@classmethod
def from_inbox_object(cls, like: InboxObject) -> "Face":
return cls(
ap_actor_id=like.actor.ap_id,
url=like.actor.url, # type: ignore
name=like.actor.handle, # type: ignore
picture_url=like.actor.resized_icon_url,
created_at=like.created_at, # type: ignore
)
@classmethod
def from_webmention(cls, webmention: Webmention) -> Optional["Face"]:
items = webmention.source_microformats.get("items", []) # type: ignore
for item in items:
if item["type"][0] == "h-card":
try:
return cls(
ap_actor_id=None,
url=(
item["properties"]["url"][0]
if item["properties"].get("url")
else webmention.source
),
name=item["properties"]["name"][0],
picture_url=media.resized_media_url(
make_abs(
item["properties"]["photo"][0], webmention.source
), # type: ignore
50,
),
created_at=webmention.created_at, # type: ignore
)
except Exception:
logger.exception(
f"Failed to build Face for webmention id={webmention.id}"
)
break
elif item["type"][0] == "h-entry":
author = item["properties"]["author"][0]
try:
return cls(
ap_actor_id=None,
url=webmention.source,
name=author["properties"]["name"][0],
picture_url=media.resized_media_url(
make_abs(
author["properties"]["photo"][0], webmention.source
), # type: ignore
50,
),
created_at=webmention.created_at, # type: ignore
)
except Exception:
logger.exception(
f"Failed to build Face for webmention id={webmention.id}"
)
break
return None
def merge_faces(faces: list[Face]) -> list[Face]:
return sorted(
faces,
key=lambda f: f.created_at,
reverse=True,
)[:10]
def _parse_face(webmention: Webmention, items: list[dict[str, Any]]) -> Face | None:
for item in items:
if item["type"][0] == "h-card":
try:
return Face(
ap_actor_id=None,
url=(
item["properties"]["url"][0]
if item["properties"].get("url")
else webmention.source
),
name=item["properties"]["name"][0],
picture_url=media.resized_media_url(
make_abs(
item["properties"]["photo"][0], webmention.source
), # type: ignore
50,
),
created_at=webmention.created_at, # type: ignore
)
except Exception:
logger.exception(
f"Failed to build Face for webmention id={webmention.id}"
)
break
return None
@dataclass
class WebmentionReply:
face: Face
content: str
url: str
published_at: datetime.datetime
in_reply_to: str
webmention_id: int
@classmethod
def from_webmention(cls, webmention: Webmention) -> Optional["WebmentionReply"]:
items = webmention.source_microformats.get("items", []) # type: ignore
for item in items:
if item["type"][0] == "h-entry":
try:
face = _parse_face(webmention, item["properties"].get("author", []))
if not face:
logger.info(
"Failed to build WebmentionReply/Face for "
f"webmention id={webmention.id}"
)
break
return cls(
face=face,
content=item["properties"]["content"][0]["html"],
url=item["properties"]["url"][0],
published_at=parse_isoformat(
item["properties"]["published"][0]
).replace(tzinfo=None),
in_reply_to=webmention.target, # type: ignore
webmention_id=webmention.id, # type: ignore
)
except Exception:
logger.exception(
f"Failed to build Face for webmention id={webmention.id}"
)
break
return None

View File

@ -54,7 +54,7 @@ def is_url_valid(url: str) -> bool:
if not parsed.hostname or parsed.hostname.lower() in ["localhost"]:
return False
if parsed.hostname in BLOCKED_SERVERS:
if is_hostname_blocked(parsed.hostname):
logger.warning(f"{parsed.hostname} is blocked")
return False
@ -81,3 +81,11 @@ def check_url(url: str) -> None:
raise InvalidURLError(f'"{url}" is invalid')
return None
@functools.lru_cache(maxsize=256)
def is_hostname_blocked(hostname: str) -> bool:
for blocked_hostname in BLOCKED_SERVERS:
if hostname == blocked_hostname or hostname.endswith(f".{blocked_hostname}"):
return True
return False

View File

@ -24,7 +24,7 @@ async def _discover_webmention_endoint(url: str) -> str | None:
follow_redirects=True,
)
resp.raise_for_status()
except (httpx.HTTPError, httpx.HTTPStatusError):
except Exception:
logger.exception(f"Failed to discover webmention endpoint for {url}")
return None

View File

@ -1,3 +1,5 @@
from urllib.parse import urlparse
import httpx
from bs4 import BeautifulSoup # type: ignore
from fastapi import APIRouter
@ -6,13 +8,21 @@ from fastapi import HTTPException
from fastapi import Request
from fastapi.responses import JSONResponse
from loguru import logger
from sqlalchemy import func
from sqlalchemy import select
from app import models
from app.boxes import _get_outbox_announces_count
from app.boxes import _get_outbox_likes_count
from app.boxes import _get_outbox_replies_count
from app.boxes import get_outbox_object_by_ap_id
from app.boxes import get_outbox_object_by_slug_and_short_id
from app.boxes import is_notification_enabled
from app.database import AsyncSession
from app.database import get_db_session
from app.utils import microformats
from app.utils.facepile import Face
from app.utils.facepile import WebmentionReply
from app.utils.url import check_url
from app.utils.url import is_url_valid
@ -47,6 +57,7 @@ async def webmention_endpoint(
check_url(source)
check_url(target)
parsed_target_url = urlparse(target)
except Exception:
logger.exception("Invalid webmention request")
raise HTTPException(status_code=400, detail="Invalid payload")
@ -65,6 +76,16 @@ async def webmention_endpoint(
logger.info("Found existing Webmention, will try to update or delete")
mentioned_object = await get_outbox_object_by_ap_id(db_session, target)
if not mentioned_object and parsed_target_url.path.startswith("/articles/"):
try:
_, _, short_id, slug = parsed_target_url.path.split("/")
mentioned_object = await get_outbox_object_by_slug_and_short_id(
db_session, slug, short_id
)
except Exception:
logger.exception(f"Failed to match {target}")
if not mentioned_object:
logger.info(f"Invalid target {target=}")
@ -90,15 +111,21 @@ async def webmention_endpoint(
logger.warning(f"target {target=} not found in source")
if existing_webmention_in_db:
logger.info("Deleting existing Webmention")
mentioned_object.webmentions_count = mentioned_object.webmentions_count - 1
existing_webmention_in_db.is_deleted = True
await db_session.flush()
notif = models.Notification(
notification_type=models.NotificationType.DELETED_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=existing_webmention_in_db.id,
# Revert side effects
await _handle_webmention_side_effects(
db_session, existing_webmention_in_db, mentioned_object
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.DELETED_WEBMENTION):
notif = models.Notification(
notification_type=models.NotificationType.DELETED_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=existing_webmention_in_db.id,
)
db_session.add(notif)
await db_session.commit()
@ -110,36 +137,96 @@ async def webmention_endpoint(
else:
return JSONResponse(content={}, status_code=200)
webmention_type = models.WebmentionType.UNKNOWN
webmention: models.Webmention
if existing_webmention_in_db:
# Undelete if needed
existing_webmention_in_db.is_deleted = False
existing_webmention_in_db.source_microformats = data
await db_session.flush()
webmention = existing_webmention_in_db
notif = models.Notification(
notification_type=models.NotificationType.UPDATED_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=existing_webmention_in_db.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.UPDATED_WEBMENTION):
notif = models.Notification(
notification_type=models.NotificationType.UPDATED_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=existing_webmention_in_db.id,
)
db_session.add(notif)
else:
new_webmention = models.Webmention(
source=source,
target=target,
source_microformats=data,
outbox_object_id=mentioned_object.id,
webmention_type=webmention_type,
)
db_session.add(new_webmention)
await db_session.flush()
webmention = new_webmention
notif = models.Notification(
notification_type=models.NotificationType.NEW_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=new_webmention.id,
)
db_session.add(notif)
if is_notification_enabled(models.NotificationType.NEW_WEBMENTION):
notif = models.Notification(
notification_type=models.NotificationType.NEW_WEBMENTION,
outbox_object_id=mentioned_object.id,
webmention_id=new_webmention.id,
)
db_session.add(notif)
mentioned_object.webmentions_count = mentioned_object.webmentions_count + 1
# Determine the webmention type
for item in data.get("items", []):
if target in item.get("properties", {}).get(
"in-reply-to", []
) and WebmentionReply.from_webmention(webmention):
webmention_type = models.WebmentionType.REPLY
break
elif target in item.get("properties", {}).get(
"like-of", []
) and Face.from_webmention(webmention):
webmention_type = models.WebmentionType.LIKE
break
elif target in item.get("properties", {}).get(
"repost-of", []
) and Face.from_webmention(webmention):
webmention_type = models.WebmentionType.REPOST
break
if webmention_type != models.WebmentionType.UNKNOWN:
webmention.webmention_type = webmention_type
await db_session.flush()
# Handle side effect
await _handle_webmention_side_effects(db_session, webmention, mentioned_object)
await db_session.commit()
return JSONResponse(content={}, status_code=200)
async def _handle_webmention_side_effects(
db_session: AsyncSession,
webmention: models.Webmention,
mentioned_object: models.OutboxObject,
) -> None:
if webmention.webmention_type == models.WebmentionType.UNKNOWN:
# TODO: recount everything
mentioned_object.webmentions_count = await db_session.scalar(
select(func.count(models.Webmention.id)).where(
models.Webmention.is_deleted.is_(False),
models.Webmention.outbox_object_id == mentioned_object.id,
models.Webmention.webmention_type == models.WebmentionType.UNKNOWN,
)
)
elif webmention.webmention_type == models.WebmentionType.LIKE:
mentioned_object.likes_count = await _get_outbox_likes_count(
db_session, mentioned_object
)
elif webmention.webmention_type == models.WebmentionType.REPOST:
mentioned_object.announces_count = await _get_outbox_announces_count(
db_session, mentioned_object
)
elif webmention.webmention_type == models.WebmentionType.REPLY:
mentioned_object.replies_count = await _get_outbox_replies_count(
db_session, mentioned_object
)
else:
raise ValueError(f"Unhandled {webmention.webmention_type} webmention")

View File

@ -58,3 +58,24 @@ And check out the result by starting a static server using Python standard libra
cd docs/dist
python -m http.server 8001
```
## Contributing
Contributions/patches are welcome, but please start a discussion in a [ticket](https://todo.sr.ht/~tsileo/microblog.pub) or a [thread in the mailing list](https://lists.sr.ht/~tsileo/microblog.pub-devel) before working on anything substantial.
### Patches
Please ensure your code passes the code quality checks:
```bash
inv autoformat
inv lint
```
And that the test suite passes:
```bash
inv tests
```
Please also consider adding new test cases if needed.

View File

@ -191,6 +191,49 @@ http {
}
```
## (Advanced) Running from subpath
It is possible to configure microblogpub to run from a subpath.
To achieve this, apply the following configuration _between_ the config and start steps.
i.e. _after_ you run `make config` or `poetry run inv configuration-wizard`,
but _before_ you run `docker compose up` or `poetry run supervisord`.
Changing these settings on an instance that already has posts or has been seen by other instances will likely break links to those posts and federation (i.e. links to your instance, posts and profile from other instances).
The following steps explain how to configure the instance to be available at `https://example.com/subdir`.
Change them to your actual domain and subdir.
* Edit the `data/profile.toml` file and add this line:
id = "https://example.com/subdir"
* Edit the `misc/*-supervisord.conf` file that is relevant to you (it depends on how you start microblogpub - if in doubt, make the same change in all of them): in the `[program:uvicorn]` section, in the line that starts with `command`, add this argument at the very end: ` --root-path /subdir`
The above two steps are enough to configure microblogpub.
Next, you also need to configure the reverse proxy.
It might differ slightly if you plan to have other services running on the same domain, but for the [NGINX config shown above](#reverse-proxy), the following changes are enough (a combined sketch follows this list):
* Add the subdir to the location, so the location block starts like this:
location /subdir {
* Add a `/` at the end of the `proxy_pass` directive, like this:
proxy_pass http://localhost:8000/;
These two changes will instruct NGINX that requests sent to `https://example.com/subdir/...` should be forwarded to `http://localhost:8000/...`.
* Inside the `server` block, add redirects for the well-known URLs (add these lines after `client_max_body_size`, and remember to replace `subdir` with your actual subdir!):
location /.well-known/webfinger { return 301 /subdir$request_uri; }
location /.well-known/nodeinfo { return 301 /subdir$request_uri; }
location /.well-known/oauth-authorization-server { return 301 /subdir$request_uri; }
* Optionally, [check the robots.txt from a running microblogpub instance](https://microblog.pub/robots.txt) and integrate it into the robots.txt file at the root of your server - remember to prepend `subdir` to the URLs, so for example `Disallow: /admin` becomes `Disallow: /subdir/admin`.
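For reference, here is a minimal sketch of how the relevant parts of the NGINX `server` block could end up looking after the changes above. It assumes the subdir is `/subdir`, the domain is `example.com`, and microblogpub listens on `localhost:8000`; keep the TLS, `client_max_body_size`, and other directives from the full config shown earlier.

```
server {
    server_name example.com;

    # Redirects for the well-known URLs (placed after client_max_body_size)
    location /.well-known/webfinger { return 301 /subdir$request_uri; }
    location /.well-known/nodeinfo { return 301 /subdir$request_uri; }
    location /.well-known/oauth-authorization-server { return 301 /subdir$request_uri; }

    # The trailing slash on proxy_pass strips /subdir before forwarding
    location /subdir {
        proxy_pass http://localhost:8000/;
    }

    # ... keep the TLS and remaining directives from the full config above
}
```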
## YunoHost edition
[YunoHost](https://yunohost.org/) support is a work in progress.
[YunoHost](https://yunohost.org/) support is available (although it is not an official package for now): <https://git.sr.ht/~tsileo/microblog.pub_ynh>.
## Available tutorial/guides
- [Opalstack](https://community.opalstack.com/d/1055-howto-install-and-run-microblogpub-on-opalstack), thanks to [@defulmere@mastodon.social](https://mastodon.online/@defulmere).

View File

@ -98,6 +98,39 @@ privacy_replace = [
]
```
### Disabling certain notification types
All notifications are enabled by default.
You can disable specific notifications by adding them to the `disabled_notifications` list.
This example disables like and share notifications:
```
disabled_notifications = ["like", "announce"]
```
#### Available notification types
- `new_follower`
- `rejected_follower`
- `unfollow`
- `follow_request_accepted`
- `follow_request_rejected`
- `move`
- `like`
- `undo_like`
- `announce`
- `undo_announce`
- `mention`
- `new_webmention`
- `updated_webmention`
- `deleted_webmention`
- `blocked`
- `unblocked`
- `block`
- `unblock`
### Customization
#### Default emoji
@ -113,6 +146,7 @@ You can copy/paste them from [getemoji.com](https://getemoji.com/).
#### Custom emoji
You can add custom emoji to the `data/custom_emoji` directory and they will be picked up automatically.
Do not use exotic characters in the filename: only letters, numbers, and the underscore `_` are allowed (other filenames are skipped by the emoji loader, as illustrated in the sketch below).
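As a rough illustration, the loader builds the shortcode from the filename and skips anything that does not match its emoji regex. This is only a sketch: the real check lives in the codebase (`EMOJI_REGEX` in the diff above), and the exact pattern below is an assumption based on the rule stated here.

```python
import re
from pathlib import Path

# Assumed pattern: letters, numbers and underscores only (the real EMOJI_REGEX may differ)
EMOJI_REGEX = re.compile(r"^:[a-zA-Z0-9_]+:$")


def is_valid_custom_emoji(path: Path) -> bool:
    name = path.name.split(".")[0]  # strip the extension, as the loader does
    return bool(EMOJI_REGEX.match(f":{name}:"))


print(is_valid_custom_emoji(Path("party_parrot.png")))  # True
print(is_valid_custom_emoji(Path("party-parrot.png")))  # False, dash is not allowed
```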
#### Custom CSS
@ -465,6 +499,7 @@ make self-destruct
If the server is not (re)starting, you can:
- [Ensure that the configuration is valid](/user_guide.html#configuration-checking)
- [Verify if you haven't any syntax error in the custom theme by recompiling the CSS](/user_guide.html#recompiling-css-files)
- Look at the log files
- [Ensure that the configuration is valid](/user_guide.html#configuration-checking).
- [Verify that there are no syntax errors in the custom theme by recompiling the CSS](/user_guide.html#recompiling-css-files).
- Look at the log files (`data/uvicorn.log`, `data/incoming.log` and `data/outgoing.log`).
- If the CSS is not working, ensure your reverse proxy is serving the static files correctly.

View File

@ -75,9 +75,10 @@ def main() -> None:
proto = "http"
print("Note that you can put your icon/avatar in the static/ directory")
dat["icon_url"] = prompt(
if icon_url := prompt(
"icon URL: ", default=f'{proto}://{dat["domain"]}/static/nopic.png'
)
):
dat["icon_url"] = icon_url
dat["secret"] = os.urandom(16).hex()
with config_file.open("w") as f:

tests/test_utils.py (new file, 19 lines)
View File

@ -0,0 +1,19 @@
from unittest import mock
import pytest
from app.utils.url import is_hostname_blocked
@pytest.mark.parametrize(
"hostname,should_be_blocked",
[
("example.com", True),
("subdomain.example.com", True),
("example.xyz", False),
],
)
def test_is_hostname_blocked(hostname: str, should_be_blocked: bool) -> None:
with mock.patch("app.utils.url.BLOCKED_SERVERS", ["example.com"]):
is_hostname_blocked.cache_clear()
assert is_hostname_blocked(hostname) is should_be_blocked