social media crossposting tool. 3rd time's the charm
mastodon misskey crossposting bluesky

+5 -14
.dockerignore
···
- # Python-generated files
- __pycache__/
- *.py[oc]
- build/
- dist/
- wheels/
- *.egg-info
-
- # Virtual environments
- .venv
-
- # Random junk
- .gitignore
.env
.env.*
.DS_Store
- data/
···
.env
.env.*
+ .gitignore
.DS_Store
+ *.swp
+ *~
+ __pycache__/
+ .venv
+2 -4
.gitignore
···
# Virtual environments
.venv
- # Random junk
- .env
- .env.*
- .DS_Store
data/
···
# Virtual environments
.venv
+ .idea/
+ .vscode/
data/
+50
.tangled/workflows/build-images.yml
···
···
when:
  - event: ["push", "manual"]
    branch: master

engine: nixery

dependencies:
  nixpkgs:
    - kaniko
    - regctl

environment:
  GHCR_USER: "zenfyrdev"

steps:
  - name: create auth configs
    command: |
      mkdir -p $HOME/.docker $HOME/.regctl

      cat > $HOME/.docker/config.json <<EOF
      {"auths": {"ghcr.io": {"auth": "$(echo -n "$GHCR_USER:$GHCR_PAT" | base64 -w0)"}}}
      EOF

      cat > $HOME/.regctl/config.json <<EOF
      {"hosts": {"ghcr.io": {"user": "$GHCR_USER","pass": "$GHCR_PAT"}}}
      EOF

  - name: build amd64
    command: |
      executor \
        --context=dir://. \
        --dockerfile=Containerfile \
        --verbosity=info \
        --destination=ghcr.io/$GHCR_USER/xpost:amd64-latest \
        --custom-platform=linux/amd64

  - name: build arm64
    command: |
      executor \
        --context=dir://. \
        --dockerfile=Containerfile \
        --verbosity=info \
        --destination=ghcr.io/$GHCR_USER/xpost:arm64-latest \
        --custom-platform=linux/arm64

  - name: tag latest artifact
    command: |
      regctl index create ghcr.io/$GHCR_USER/xpost:latest \
        --ref ghcr.io/$GHCR_USER/xpost:amd64-latest --platform linux/amd64 \
        --ref ghcr.io/$GHCR_USER/xpost:arm64-latest --platform linux/arm64
+21
LICENSE
···
···
MIT License

Copyright (c) 2025

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
+169
README.md
···
XPost is a social media cross-posting tool that differs from others by using streaming APIs to allow instant, zero-input cross-posting. this means you can continue posting on your preferred platform without using special apps.
XPost tries to support as many features as possible. for example, when cross-posting from mastodon to bluesky, unsupported file types will be attached as links. posts with mixed media or too many files will be split and spread across text.
···
XPost is a social media cross-posting tool that differs from others by using streaming APIs to allow instant, zero-input cross-posting. this means you can continue posting on your preferred platform without using special apps.
XPost tries to support as many features as possible. for example, when cross-posting from mastodon to bluesky, unsupported file types will be attached as links. posts with mixed media or too many files will be split and spread across text.
the tool may undergo breaking changes as new features are added, so proceed with caution when deploying.

# Installation

## Native

first install `ffmpeg`, `ffprobe` and `libmagic`, and make sure `ffmpeg` is available on PATH! `ffmpeg` and `libmagic` are required to crosspost media.

then get [uv](https://github.com/astral-sh/uv) and sync the project

```
uv sync
```

generate settings.json on first launch

```
uv run main.py
```
## Docker Compose

the official image is available on [docker hub](https://hub.docker.com/r/melontini/xpost). example `compose.yaml`. this assumes that the data dir is `./data`, and the env file is `./.config/docker.env`. add `:Z` to volume mounts for podman.

```yaml
services:
  xpost:
    image: melontini/xpost:latest
    restart: unless-stopped
    env_file: ./.config/docker.env
    volumes:
      - ./data:/app/data
```

# Settings

the tool allows you to specify an input and multiple outputs to post to.

some options accept an envvar syntax:

```json
{
  "token": "env:TOKEN"
}
```
## Inputs

all inputs have common options.

```json5
{
  "options": {
    "regex_filters": [ // posts matching any of the following regexes will be skipped
      "(?i)\\b(?:test|hello|hi)\\b"
    ]
  }
}
```

### Bluesky Jetstream

listens to repo operation events emitted by Jetstream. handle becomes optional if you specify a DID.

```json5
{
  "type": "bluesky-jetstream-wss",
  "handle": "env:BLUESKY_HANDLE", // handle (e.g. melontini.me)
  "did": "env:BLUESKY_DID", // use a DID instead of handle (avoids handle resolution)
  "jetstream": "wss://jetstream2.us-east.bsky.network/subscribe" // optional, change jetstream endpoint
}
```
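under the hood, the input narrows the firehose by passing query parameters to the subscribe endpoint so only this account's posts and reposts arrive. a sketch of the URL it ends up requesting (built with `urllib.parse` here for illustration; the tool concatenates the parameters by hand):

```python
from urllib.parse import urlencode

def jetstream_url(endpoint: str, did: str) -> str:
    """Build a Jetstream subscribe URL limited to one account's posts and reposts."""
    params = [
        # repeated keys subscribe to multiple collections
        ("wantedCollections", "app.bsky.feed.post"),
        ("wantedCollections", "app.bsky.feed.repost"),
        ("wantedDids", did),
    ]
    return endpoint + "?" + urlencode(params)

url = jetstream_url("wss://jetstream2.us-east.bsky.network/subscribe", "did:plc:example")
print(url)
```

events for any other DID or collection are filtered server-side, which keeps the websocket traffic minimal.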

### Mastodon WebSocket `mastodon-wss`

listens to the user's home timeline for new posts, crossposts only the public/unlisted ones by the user.

```json5
{
  "type": "mastodon-wss", // type
  "instance": "env:MASTODON_INSTANCE", // mastodon api compatible instance
  "token": "env:MASTODON_TOKEN", // must be a mastodon token. get one from something like phanpy + webtools, or https://getauth.thms.uk/?client_name=xpost&scopes=read:statuses%20write:statuses%20profile (doesn't work with all software)
  "options": {
    "allowed_visibility": [
      "public",
      "unlisted"
    ]
  }
}
```

any instance implementing `/api/v1/instance`, `/api/v1/accounts/verify_credentials` and `/api/v1/streaming?stream` will work fine.

confirmed supported:
- Mastodon
- Iceshrimp.NET
- Akkoma

confirmed unsupported:
- Mitra
- Sharkey
### Misskey WebSocket

listens to the homeTimeline channel for new posts, crossposts only the public/home ones by the user.

**IMPORTANT**: Misskey WSS does not support deletes, you must delete posts manually. if you know how i can listen to all note events, i would appreciate your help.

```json5
{
  "type": "misskey-wss", // type
  "instance": "env:MISSKEY_INSTANCE", // misskey instance
  "token": "env:MISSKEY_TOKEN", // access token with the `View your account information` scope
  "options": {
    "allowed_visibility": [
      "public",
      "home"
    ]
  }
}
```

the Misskey API is not very good, and this also wasn't tested on vanilla misskey.

confirmed supported:
- Sharkey
## Outputs

### Mastodon API

no remarks.

```json5
{
  "type": "mastodon",
  "token": "env:MASTODON_TOKEN", // must be a mastodon token. get one from something like phanpy + webtools, or https://getauth.thms.uk/?client_name=xpost&scopes=read%20write%20profile (doesn't work with all software)
  "instance": "env:MASTODON_INSTANCE", // mastodon api compatible instance
  "options": {
    "visibility": "public"
  }
}
```

### Bluesky

in the bluesky block, you can configure who is allowed to reply to and quote the new posts. handle becomes optional if you specify a DID.

```json5
{
  "type": "bluesky", // type
  "handle": "env:BLUESKY_HANDLE", // handle (e.g. melontini.me)
  "app_password": "env:BLUESKY_APP_PASSWORD", // https://bsky.app/settings/app-passwords
  "did": "env:BLUESKY_DID", // use a DID instead of handle (avoids handle resolution)
  "pds": "env:BLUESKY_PDS", // specify your PDS directly (avoids DID doc lookup)
  "bsky_appview": "env:BLUESKY_APPVIEW", // bypass suspensions by specifying a different appview (e.g. did:web:bsky.zeppelin.social)
  "options": {
    "encode_videos": true, // bluesky only accepts mp4 videos, try to convert if the video is not mp4
    "quote_gate": false, // block users from quoting the post
    "thread_gate": [ // block replies. leave empty to disable replies
      "mentioned",
      "following",
      "followers",
      "everybody" // allow everybody to reply (ignores other options)
    ]
  }
}
```
+196
bluesky/atproto2.py
···
···
from typing import Any

from atproto import AtUri, Client, IdResolver, client_utils
from atproto_client import models

from util.util import LOGGER


def resolve_identity(
    handle: str | None = None, did: str | None = None, pds: str | None = None
):
    """helper to try and resolve identity from provided parameters, a valid handle is enough"""

    if did and pds:
        return did, pds[:-1] if pds.endswith("/") else pds

    resolver = IdResolver()
    if not did:
        if not handle:
            raise Exception("ATP handle not specified!")
        LOGGER.info("Resolving ATP identity for %s...", handle)
        did = resolver.handle.resolve(handle)
        if not did:
            raise Exception("Failed to resolve DID!")

    if not pds:
        LOGGER.info("Resolving PDS from DID document...")
        did_doc = resolver.did.resolve(did)
        if not did_doc:
            raise Exception(f"Failed to resolve DID doc for '{did}'")
        pds = did_doc.get_pds_endpoint()
        if not pds:
            raise Exception("Failed to resolve PDS!")

    return did, pds[:-1] if pds.endswith("/") else pds


class Client2(Client):
    def __init__(self, base_url: str | None = None, *args: Any, **kwargs: Any) -> None:
        super().__init__(base_url, *args, **kwargs)

    def send_video(
        self,
        text: str | client_utils.TextBuilder,
        video: bytes,
        video_alt: str | None = None,
        video_aspect_ratio: models.AppBskyEmbedDefs.AspectRatio | None = None,
        reply_to: models.AppBskyFeedPost.ReplyRef | None = None,
        langs: list[str] | None = None,
        facets: list[models.AppBskyRichtextFacet.Main] | None = None,
        labels: models.ComAtprotoLabelDefs.SelfLabels | None = None,
        time_iso: str | None = None,
    ) -> models.AppBskyFeedPost.CreateRecordResponse:
        """same as send_video, but with labels"""

        if video_alt is None:
            video_alt = ""

        upload = self.upload_blob(video)

        return self.send_post(
            text,
            reply_to=reply_to,
            embed=models.AppBskyEmbedVideo.Main(
                video=upload.blob, alt=video_alt, aspect_ratio=video_aspect_ratio
            ),
            langs=langs,
            facets=facets,
            labels=labels,
            time_iso=time_iso,
        )

    def send_images(
        self,
        text: str | client_utils.TextBuilder,
        images: list[bytes],
        image_alts: list[str] | None = None,
        image_aspect_ratios: list[models.AppBskyEmbedDefs.AspectRatio] | None = None,
        reply_to: models.AppBskyFeedPost.ReplyRef | None = None,
        langs: list[str] | None = None,
        facets: list[models.AppBskyRichtextFacet.Main] | None = None,
        labels: models.ComAtprotoLabelDefs.SelfLabels | None = None,
        time_iso: str | None = None,
    ) -> models.AppBskyFeedPost.CreateRecordResponse:
        """same as send_images, but with labels"""

        if image_alts is None:
            image_alts = [""] * len(images)
        else:
            diff = len(images) - len(image_alts)
            image_alts = image_alts + [""] * diff

        if image_aspect_ratios is None:
            aligned_image_aspect_ratios = [None] * len(images)
        else:
            diff = len(images) - len(image_aspect_ratios)
            aligned_image_aspect_ratios = image_aspect_ratios + [None] * diff

        uploads = [self.upload_blob(image) for image in images]

        embed_images = [
            models.AppBskyEmbedImages.Image(
                alt=alt, image=upload.blob, aspect_ratio=aspect_ratio
            )
            for alt, upload, aspect_ratio in zip(
                image_alts, uploads, aligned_image_aspect_ratios
            )
        ]

        return self.send_post(
            text,
            reply_to=reply_to,
            embed=models.AppBskyEmbedImages.Main(images=embed_images),
            langs=langs,
            facets=facets,
            labels=labels,
            time_iso=time_iso,
        )

    def send_post(
        self,
        text: str | client_utils.TextBuilder,
        reply_to: models.AppBskyFeedPost.ReplyRef | None = None,
        embed: None
        | models.AppBskyEmbedImages.Main
        | models.AppBskyEmbedExternal.Main
        | models.AppBskyEmbedRecord.Main
        | models.AppBskyEmbedRecordWithMedia.Main
        | models.AppBskyEmbedVideo.Main = None,
        langs: list[str] | None = None,
        facets: list[models.AppBskyRichtextFacet.Main] | None = None,
        labels: models.ComAtprotoLabelDefs.SelfLabels | None = None,
        time_iso: str | None = None,
    ) -> models.AppBskyFeedPost.CreateRecordResponse:
        """same as send_post, but with labels"""

        if isinstance(text, client_utils.TextBuilder):
            facets = text.build_facets()
            text = text.build_text()

        repo = self.me and self.me.did
        if not repo:
            raise Exception("Client not logged in!")

        if not langs:
            langs = ["en"]

        record = models.AppBskyFeedPost.Record(
            created_at=time_iso or self.get_current_time_iso(),
            text=text,
            reply=reply_to or None,
            embed=embed or None,
            langs=langs,
            facets=facets or None,
            labels=labels or None,
        )
        return self.app.bsky.feed.post.create(repo, record)

    def create_gates(
        self,
        thread_gate_opts: list[str],
        quote_gate: bool,
        post_uri: str,
        time_iso: str | None = None,
    ):
        account = self.me
        if not account:
            raise Exception("Client not logged in!")

        rkey = AtUri.from_str(post_uri).rkey
        time_iso = time_iso or self.get_current_time_iso()

        if "everybody" not in thread_gate_opts:
            allow = []
            if thread_gate_opts:
                if "following" in thread_gate_opts:
                    allow.append(models.AppBskyFeedThreadgate.FollowingRule())
                if "followers" in thread_gate_opts:
                    allow.append(models.AppBskyFeedThreadgate.FollowerRule())
                if "mentioned" in thread_gate_opts:
                    allow.append(models.AppBskyFeedThreadgate.MentionRule())

            thread_gate = models.AppBskyFeedThreadgate.Record(
                post=post_uri, created_at=time_iso, allow=allow
            )

            self.app.bsky.feed.threadgate.create(account.did, thread_gate, rkey)

        if quote_gate:
            post_gate = models.AppBskyFeedPostgate.Record(
                post=post_uri,
                created_at=time_iso,
                embedding_rules=[models.AppBskyFeedPostgate.DisableRule()],
            )

            self.app.bsky.feed.postgate.create(account.did, post_gate, rkey)
+199
bluesky/common.py
···
···
import re

from atproto import client_utils

import cross
from util.media import MediaInfo
from util.util import canonical_label

# only for lexicon reference
SERVICE = "https://bsky.app"

# TODO this is terrible and stupid
ADULT_PATTERN = re.compile(
    r"\b(sexual content|nsfw|erotic|adult only|18\+)\b", re.IGNORECASE
)
PORN_PATTERN = re.compile(r"\b(porn|yiff|hentai|pornographic|fetish)\b", re.IGNORECASE)


class BlueskyPost(cross.Post):
    def __init__(
        self, record: dict, tokens: list[cross.Token], attachments: list[MediaInfo]
    ) -> None:
        super().__init__()
        self.uri = record["$xpost.strongRef"]["uri"]
        self.parent_uri = None
        if record.get("reply"):
            self.parent_uri = record["reply"]["parent"]["uri"]

        self.tokens = tokens
        self.timestamp = record["createdAt"]
        labels = record.get("labels", {}).get("values")
        self.spoiler = None
        if labels:
            self.spoiler = ", ".join(
                [str(label["val"]).replace("-", " ") for label in labels]
            )

        self.attachments = attachments
        self.languages = record.get("langs", [])

    # at:// of the post record
    def get_id(self) -> str:
        return self.uri

    def get_parent_id(self) -> str | None:
        return self.parent_uri

    def get_tokens(self) -> list[cross.Token]:
        return self.tokens

    def get_text_type(self) -> str:
        return "text/plain"

    def get_timestamp(self) -> str:
        return self.timestamp

    def get_attachments(self) -> list[MediaInfo]:
        return self.attachments

    def get_spoiler(self) -> str | None:
        return self.spoiler

    def get_languages(self) -> list[str]:
        return self.languages

    def is_sensitive(self) -> bool:
        return self.spoiler is not None

    def get_post_url(self) -> str | None:
        did, _, post_id = str(self.uri[len("at://") :]).split("/")

        return f"https://bsky.app/profile/{did}/post/{post_id}"


def tokenize_post(post: dict) -> list[cross.Token]:
    text: str = post.get("text", "")
    if not text:
        return []
    ut8_text = text.encode(encoding="utf-8")

    def decode(ut8: bytes) -> str:
        return ut8.decode(encoding="utf-8")

    facets: list[dict] = post.get("facets", [])
    if not facets:
        return [cross.TextToken(decode(ut8_text))]

    slices: list[tuple[int, int, str, str]] = []

    for facet in facets:
        features: list[dict] = facet.get("features", [])
        if not features:
            continue

        # we don't support overlapping facets/features
        feature = features[0]
        feature_type = feature["$type"]
        index = facet["index"]
        match feature_type:
            case "app.bsky.richtext.facet#tag":
                slices.append(
                    (index["byteStart"], index["byteEnd"], "tag", feature["tag"])
                )
            case "app.bsky.richtext.facet#link":
                slices.append(
                    (index["byteStart"], index["byteEnd"], "link", feature["uri"])
                )
            case "app.bsky.richtext.facet#mention":
                slices.append(
                    (index["byteStart"], index["byteEnd"], "mention", feature["did"])
                )

    if not slices:
        return [cross.TextToken(decode(ut8_text))]

    slices.sort(key=lambda s: s[0])
    unique: list[tuple[int, int, str, str]] = []
    current_end = 0
    for start, end, ttype, val in slices:
        if start >= current_end:
            unique.append((start, end, ttype, val))
            current_end = end

    if not unique:
        return [cross.TextToken(decode(ut8_text))]

    tokens: list[cross.Token] = []
    prev = 0

    for start, end, ttype, val in unique:
        if start > prev:
            # text between facets
            tokens.append(cross.TextToken(decode(ut8_text[prev:start])))
        # facet token
        match ttype:
            case "link":
                label = decode(ut8_text[start:end])

                # try to unflatten links
                split = val.split("://", 1)
                if len(split) > 1:
                    if split[1].startswith(label):
                        tokens.append(cross.LinkToken(val, ""))
                        prev = end
                        continue

                    if label.endswith("...") and split[1].startswith(label[:-3]):
                        tokens.append(cross.LinkToken(val, ""))
                        prev = end
                        continue

                tokens.append(cross.LinkToken(val, label))
            case "tag":
                tag = decode(ut8_text[start:end])
                tokens.append(cross.TagToken(tag[1:] if tag.startswith("#") else tag))
            case "mention":
                mention = decode(ut8_text[start:end])
                tokens.append(
                    cross.MentionToken(
                        mention[1:] if mention.startswith("@") else mention, val
                    )
                )
        prev = end

    if prev < len(ut8_text):
        tokens.append(cross.TextToken(decode(ut8_text[prev:])))

    return tokens


def tokens_to_richtext(tokens: list[cross.Token]) -> client_utils.TextBuilder | None:
    builder = client_utils.TextBuilder()

    def flatten_link(href: str):
        split = href.split("://", 1)
        if len(split) > 1:
            href = split[1]

        if len(href) > 32:
            href = href[:32] + "..."

        return href

    for token in tokens:
        if isinstance(token, cross.TextToken):
            builder.text(token.text)
        elif isinstance(token, cross.LinkToken):
            if canonical_label(token.label, token.href):
                builder.link(flatten_link(token.href), token.href)
                continue

            builder.link(token.label, token.href)
        elif isinstance(token, cross.TagToken):
            builder.tag("#" + token.tag, token.tag.lower())
        else:
            # fail on unsupported tokens
            return None

    return builder
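a note on `tokenize_post` above: facet `byteStart`/`byteEnd` indices are UTF-8 byte offsets, not character offsets, which is why the text is encoded before slicing and each slice is decoded back. a standalone illustration (the example text and offsets are made up):

```python
text = "héllo #tag"
utf8 = text.encode("utf-8")

# "é" is two bytes in UTF-8, so byte offsets drift from character offsets
start = utf8.index(b"#tag")
end = start + len(b"#tag")

print(text.index("#tag"))               # -> 6 (character offset)
print((start, end))                     # -> (7, 11) (byte offsets, as facets store them)
print(utf8[start:end].decode("utf-8"))  # -> #tag
```

slicing the decoded string with byte offsets would cut facets in the wrong place as soon as the post contains any multi-byte character.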
+203
bluesky/input.py
···
···
import asyncio
import json
import re
from typing import Any, Callable

import websockets
from atproto_client import models
from atproto_client.models.utils import get_or_create as get_model_or_create

import cross
import util.database as database
from bluesky.atproto2 import resolve_identity
from bluesky.common import SERVICE, BlueskyPost, tokenize_post
from util.database import DataBaseWorker
from util.media import MediaInfo, download_media
from util.util import LOGGER, as_envvar


class BlueskyInputOptions:
    def __init__(self, o: dict) -> None:
        self.filters = [re.compile(f) for f in o.get("regex_filters", [])]


class BlueskyInput(cross.Input):
    def __init__(self, settings: dict, db: DataBaseWorker) -> None:
        self.options = BlueskyInputOptions(settings.get("options", {}))
        did, pds = resolve_identity(
            handle=as_envvar(settings.get("handle")),
            did=as_envvar(settings.get("did")),
            pds=as_envvar(settings.get("pds")),
        )
        self.pds = pds

        # the PDS is not a service, the lexicon and rids are the same across pds
        super().__init__(SERVICE, did, settings, db)

    def _on_post(self, outputs: list[cross.Output], post: dict[str, Any]):
        post_uri = post["$xpost.strongRef"]["uri"]
        post_cid = post["$xpost.strongRef"]["cid"]

        parent_uri = None
        if post.get("reply"):
            parent_uri = post["reply"]["parent"]["uri"]

        embed = post.get("embed", {})
        if embed.get("$type") in (
            "app.bsky.embed.record",
            "app.bsky.embed.recordWithMedia",
        ):
            did, collection, rid = str(embed["record"]["uri"][len("at://") :]).split(
                "/"
            )
            if collection == "app.bsky.feed.post":
                LOGGER.info("Skipping '%s'! Quote..", post_uri)
                return

        success = database.try_insert_post(
            self.db, post_uri, parent_uri, self.user_id, self.service
        )
        if not success:
            LOGGER.info("Skipping '%s' as parent post was not found in db!", post_uri)
            return
        database.store_data(
            self.db, post_uri, self.user_id, self.service, {"cid": post_cid}
        )

        tokens = tokenize_post(post)
        if not cross.test_filters(tokens, self.options.filters):
            LOGGER.info("Skipping '%s'. Matched a filter!", post_uri)
            return

        LOGGER.info("Crossposting '%s'...", post_uri)

        def get_blob_url(blob: str):
            return f"{self.pds}/xrpc/com.atproto.sync.getBlob?did={self.user_id}&cid={blob}"

        attachments: list[MediaInfo] = []
        if embed.get("$type") == "app.bsky.embed.images":
            model = get_model_or_create(embed, model=models.AppBskyEmbedImages.Main)
            assert isinstance(model, models.AppBskyEmbedImages.Main)

            for image in model.images:
                url = get_blob_url(image.image.cid.encode())
                LOGGER.info("Downloading %s...", url)
                io = download_media(url, image.alt)
                if not io:
                    LOGGER.error("Skipping '%s'. Failed to download media!", post_uri)
                    return
                attachments.append(io)
        elif embed.get("$type") == "app.bsky.embed.video":
            model = get_model_or_create(embed, model=models.AppBskyEmbedVideo.Main)
            assert isinstance(model, models.AppBskyEmbedVideo.Main)
            url = get_blob_url(model.video.cid.encode())
            LOGGER.info("Downloading %s...", url)
            io = download_media(url, model.alt if model.alt else "")
            if not io:
                LOGGER.error("Skipping '%s'. Failed to download media!", post_uri)
                return
            attachments.append(io)

        cross_post = BlueskyPost(post, tokens, attachments)
        for output in outputs:
            output.accept_post(cross_post)

    def _on_delete_post(self, outputs: list[cross.Output], post_id: str, repost: bool):
        post = database.find_post(self.db, post_id, self.user_id, self.service)
        if not post:
            return

        LOGGER.info("Deleting '%s'...", post_id)
        if repost:
            for output in outputs:
                output.delete_repost(post_id)
        else:
            for output in outputs:
                output.delete_post(post_id)
        database.delete_post(self.db, post_id, self.user_id, self.service)

    def _on_repost(self, outputs: list[cross.Output], post: dict[str, Any]):
        post_uri = post["$xpost.strongRef"]["uri"]
        post_cid = post["$xpost.strongRef"]["cid"]

        reposted_uri = post["subject"]["uri"]

        success = database.try_insert_repost(
            self.db, post_uri, reposted_uri, self.user_id, self.service
        )
        if not success:
            LOGGER.info("Skipping '%s' as reposted post was not found in db!", post_uri)
            return
        database.store_data(
            self.db, post_uri, self.user_id, self.service, {"cid": post_cid}
        )

        LOGGER.info("Crossposting '%s'...", post_uri)
        for output in outputs:
            output.accept_repost(post_uri, reposted_uri)


class BlueskyJetstreamInput(BlueskyInput):
    def __init__(self, settings: dict, db: DataBaseWorker) -> None:
        super().__init__(settings, db)
        self.jetstream = settings.get(
            "jetstream", "wss://jetstream2.us-east.bsky.network/subscribe"
        )

    def __on_commit(self, outputs: list[cross.Output], msg: dict):
        if msg.get("did") != self.user_id:
            return

        commit: dict = msg.get("commit", {})
        if not commit:
            return

        commit_type = commit["operation"]
        match commit_type:
            case "create":
                record = dict(commit.get("record", {}))
                record["$xpost.strongRef"] = {
                    "cid": commit["cid"],
                    "uri": f"at://{self.user_id}/{commit['collection']}/{commit['rkey']}",
                }

                match commit["collection"]:
                    case "app.bsky.feed.post":
                        self._on_post(outputs, record)
                    case "app.bsky.feed.repost":
                        self._on_repost(outputs, record)
            case "delete":
                post_id: str = (
                    f"at://{self.user_id}/{commit['collection']}/{commit['rkey']}"
                )
                match commit["collection"]:
                    case "app.bsky.feed.post":
                        self._on_delete_post(outputs, post_id, False)
                    case "app.bsky.feed.repost":
                        self._on_delete_post(outputs, post_id, True)

    async def listen(
        self, outputs: list[cross.Output], submit: Callable[[Callable[[], Any]], Any]
    ):
        uri = self.jetstream + "?"
        uri += "wantedCollections=app.bsky.feed.post"
        uri += "&wantedCollections=app.bsky.feed.repost"
        uri += f"&wantedDids={self.user_id}"

        async for ws in websockets.connect(
            uri, extra_headers={"User-Agent": "XPost/0.0.3"}
        ):
            try:
                LOGGER.info("Listening to %s...", self.jetstream)

                async def listen_for_messages():
                    async for msg in ws:
                        # bind msg now; a bare closure would see the loop's latest value
                        submit(lambda msg=msg: self.__on_commit(outputs, json.loads(msg)))

                listen = asyncio.create_task(listen_for_messages())

                await asyncio.gather(listen)
            except websockets.ConnectionClosedError as e:
                LOGGER.error(e, stack_info=True, exc_info=True)
                LOGGER.info("Reconnecting to %s...", self.jetstream)
                continue
+481
bluesky/output.py
···
···
+
from atproto import Request, client_utils
+
from atproto_client import models
+
from httpx import Timeout
+
+
import cross
+
import misskey.mfm_util as mfm_util
+
import util.database as database
+
from bluesky.atproto2 import Client2, resolve_identity
+
from bluesky.common import ADULT_PATTERN, PORN_PATTERN, SERVICE, tokens_to_richtext
+
from util.database import DataBaseWorker
+
from util.media import (
+
MediaInfo,
+
compress_image,
+
convert_to_mp4,
+
get_filename_from_url,
+
get_media_meta,
+
)
+
from util.util import LOGGER, as_envvar
+
+
ALLOWED_GATES = ["mentioned", "following", "followers", "everybody"]
+
+
+
class BlueskyOutputOptions:
+
def __init__(self, o: dict) -> None:
+
self.quote_gate: bool = False
+
self.thread_gate: list[str] = ["everybody"]
+
self.encode_videos: bool = True
+
+
quote_gate = o.get("quote_gate")
+
if quote_gate is not None:
+
self.quote_gate = bool(quote_gate)
+
+
thread_gate = o.get("thread_gate")
+
if thread_gate is not None:
+
if any([v not in ALLOWED_GATES for v in thread_gate]):
+
raise ValueError(
+
f"'thread_gate' only accepts {', '.join(ALLOWED_GATES)} or [], got: {thread_gate}"
+
)
+
self.thread_gate = thread_gate
+
+
encode_videos = o.get("encode_videos")
+
if encode_videos is not None:
+
self.encode_videos = bool(encode_videos)
+
+
+
class BlueskyOutput(cross.Output):
+
def __init__(self, input: cross.Input, settings: dict, db: DataBaseWorker) -> None:
+
super().__init__(input, settings, db)
+
self.options = BlueskyOutputOptions(settings.get("options") or {})
+
+
if not as_envvar(settings.get("app-password")):
+
raise Exception("Account app password not provided!")
+
+
did, pds = resolve_identity(
+
handle=as_envvar(settings.get("handle")),
+
did=as_envvar(settings.get("did")),
+
pds=as_envvar(settings.get("pds")),
+
)
+
+
reqs = Request(timeout=Timeout(None, connect=30.0))
+
+
self.bsky = Client2(pds, request=reqs)
+
self.bsky.configure_proxy_header(
+
service_type="bsky_appview",
+
did=as_envvar(settings.get("bsky_appview")) or "did:web:api.bsky.app",
+
)
+
self.bsky.login(did, as_envvar(settings.get("app-password")))
+
+
def __check_login(self):
+
login = self.bsky.me
+
if not login:
+
raise Exception("Client not logged in!")
+
return login
+
+
def _find_parent(self, parent_id: str):
+
login = self.__check_login()
+
+
thread_tuple = database.find_mapped_thread(
+
self.db,
+
parent_id,
+
self.input.user_id,
+
self.input.service,
+
login.did,
+
SERVICE,
+
)
+
+
if not thread_tuple:
+
LOGGER.error("Failed to find thread tuple in the database!")
+
return None
+
+
root_uri: str = thread_tuple[0]
+
reply_uri: str = thread_tuple[1]
+
+
root_cid = database.fetch_data(self.db, root_uri, login.did, SERVICE)["cid"]
+
reply_cid = database.fetch_data(self.db, root_uri, login.did, SERVICE)["cid"]
+
+
root_record = models.AppBskyFeedPost.CreateRecordResponse(
+
uri=root_uri, cid=root_cid
+
)
+
reply_record = models.AppBskyFeedPost.CreateRecordResponse(
+
uri=reply_uri, cid=reply_cid
+
)
+
+
return (
+
models.create_strong_ref(root_record),
+
models.create_strong_ref(reply_record),
+
thread_tuple[2],
+
thread_tuple[3],
+
)
+
+
def _split_attachments(self, attachments: list[MediaInfo]):
+
sup_media: list[MediaInfo] = []
+
unsup_media: list[MediaInfo] = []
+
+
for a in attachments:
+
if a.mime.startswith("image/") or a.mime.startswith(
+
"video/"
+
): # TODO convert gifs to videos
+
sup_media.append(a)
+
else:
+
unsup_media.append(a)
+
+
return (sup_media, unsup_media)
+
+
def _split_media_per_post(
+
self, tokens: list[client_utils.TextBuilder], media: list[MediaInfo]
+
):
+
posts: list[dict] = [{"tokens": tokens, "attachments": []} for tokens in tokens]
+
available_indices: list[int] = list(range(len(posts)))
+
+
current_image_post_idx: int | None = None
+
+
def make_blank_post() -> dict:
+
return {"tokens": [client_utils.TextBuilder().text("")], "attachments": []}
+
+
def pop_next_empty_index() -> int:
+
if available_indices:
+
return available_indices.pop(0)
+
else:
+
new_idx = len(posts)
+
posts.append(make_blank_post())
+
return new_idx
+
+
for att in media:
+
if att.mime.startswith("video/"):
+
current_image_post_idx = None
+
idx = pop_next_empty_index()
+
posts[idx]["attachments"].append(att)
+
elif att.mime.startswith("image/"):
+
if (
+
current_image_post_idx is not None
+
and len(posts[current_image_post_idx]["attachments"]) < 4
+
):
+
posts[current_image_post_idx]["attachments"].append(att)
+
else:
+
idx = pop_next_empty_index()
+
posts[idx]["attachments"].append(att)
+
current_image_post_idx = idx
+
+
result: list[tuple[client_utils.TextBuilder, list[MediaInfo]]] = []
+
for p in posts:
+
result.append((p["tokens"], p["attachments"]))
+
return result
+
+
def accept_post(self, post: cross.Post):
+
login = self.__check_login()
+
+
parent_id = post.get_parent_id()
+
+
# used for db insertion
+
new_root_id = None
+
new_parent_id = None
+
+
root_ref = None
+
reply_ref = None
+
if parent_id:
+
parents = self._find_parent(parent_id)
+
if not parents:
+
return
+
root_ref, reply_ref, new_root_id, new_parent_id = parents
+
+
tokens = post.get_tokens().copy()
+
+
unique_labels: set[str] = set()
+
cw = post.get_spoiler()
+
if cw:
+
tokens.insert(0, cross.TextToken("CW: " + cw + "\n\n"))
+
unique_labels.add("graphic-media")
+
+
# from bsky.app, a post can only have one of those labels
+
if PORN_PATTERN.search(cw):
+
unique_labels.add("porn")
+
elif ADULT_PATTERN.search(cw):
+
unique_labels.add("sexual")
+
+
if post.is_sensitive():
+
unique_labels.add("graphic-media")
+
+
labels = (
+
models.ComAtprotoLabelDefs.SelfLabels(
+
values=[
+
models.ComAtprotoLabelDefs.SelfLabel(val=label)
+
for label in unique_labels
+
]
+
)
+
if unique_labels
+
else None
+
)
+
+
sup_media, unsup_media = self._split_attachments(post.get_attachments())
+
+
if unsup_media:
+
if tokens:
+
tokens.append(cross.TextToken("\n"))
+
for i, attachment in enumerate(unsup_media):
+
tokens.append(
+
cross.LinkToken(
+
attachment.url, f"[{get_filename_from_url(attachment.url)}]"
+
)
+
)
+
tokens.append(cross.TextToken(" "))
+
+
if post.get_text_type() == "text/x.misskeymarkdown":
+
tokens, status = mfm_util.strip_mfm(tokens)
+
post_url = post.get_post_url()
+
if status and post_url:
+
tokens.append(cross.TextToken("\n"))
+
tokens.append(
+
cross.LinkToken(post_url, "[Post contains MFM, see original]")
+
)
+
+
split_tokens: list[list[cross.Token]] = cross.split_tokens(tokens, 300)
+
post_text: list[client_utils.TextBuilder] = []
+
+
# convert tokens into rich text. skip post if contains unsupported tokens
+
for block in split_tokens:
+
rich_text = tokens_to_richtext(block)
+
+
if not rich_text:
+
LOGGER.error(
+
"Skipping '%s' as it contains invalid rich text types!",
+
post.get_id(),
+
)
+
return
+
post_text.append(rich_text)
+
+
if not post_text:
+
post_text = [client_utils.TextBuilder().text("")]
+
+
for m in sup_media:
+
if m.mime.startswith("image/"):
+
if len(m.io) > 2_000_000:
+
LOGGER.error(
+
"Skipping post_id '%s', image attachment too large!",
+
post.get_id(),
+
)
+
return
+
+
if m.mime.startswith("video/"):
+
if m.mime != "video/mp4" and not self.options.encode_videos:
+
LOGGER.info(
+
"Video is not mp4, but encoding is disabled. Skipping '%s'...",
+
post.get_id(),
+
)
+
return
+
+
if len(m.io) > 100_000_000:
+
LOGGER.error(
+
"Skipping post_id '%s', video attachment too large!",
+
post.get_id(),
+
)
+
return
+
+
created_records: list[models.AppBskyFeedPost.CreateRecordResponse] = []
+
baked_media = self._split_media_per_post(post_text, sup_media)
+
+
for text, attachments in baked_media:
+
if not attachments:
+
if reply_ref and root_ref:
+
new_post = self.bsky.send_post(
+
text,
+
reply_to=models.AppBskyFeedPost.ReplyRef(
+
parent=reply_ref, root=root_ref
+
),
+
labels=labels,
+
time_iso=post.get_timestamp(),
+
)
+
else:
+
new_post = self.bsky.send_post(
+
text, labels=labels, time_iso=post.get_timestamp()
+
)
+
if not root_ref:
+
root_ref = models.create_strong_ref(new_post)
+
+
self.bsky.create_gates(
+
self.options.thread_gate,
+
self.options.quote_gate,
+
new_post.uri,
+
time_iso=post.get_timestamp(),
+
)
+
reply_ref = models.create_strong_ref(new_post)
+
created_records.append(new_post)
+
else:
+
# if a single post is an image - everything else is an image
+
if attachments[0].mime.startswith("image/"):
+
images: list[bytes] = []
+
image_alts: list[str] = []
+
image_aspect_ratios: list[models.AppBskyEmbedDefs.AspectRatio] = []
+
+
for attachment in attachments:
+
image_io = compress_image(attachment.io, quality=100)
+
metadata = get_media_meta(image_io)
+
+
if len(image_io) > 1_000_000:
+
LOGGER.info("Compressing %s...", attachment.name)
+
image_io = compress_image(image_io)
+
+
images.append(image_io)
+
image_alts.append(attachment.alt)
+
image_aspect_ratios.append(
+
models.AppBskyEmbedDefs.AspectRatio(
+
width=metadata["width"], height=metadata["height"]
+
)
+
)
+
+
new_post = self.bsky.send_images(
+
text=text,
+
images=images,
+
image_alts=image_alts,
+
image_aspect_ratios=image_aspect_ratios,
+
reply_to=models.AppBskyFeedPost.ReplyRef(
+
parent=reply_ref, root=root_ref
+
)
+
if root_ref and reply_ref
+
else None,
+
labels=labels,
+
time_iso=post.get_timestamp(),
+
)
+
if not root_ref:
+
root_ref = models.create_strong_ref(new_post)
+
+
self.bsky.create_gates(
+
self.options.thread_gate,
+
self.options.quote_gate,
+
new_post.uri,
+
time_iso=post.get_timestamp(),
+
)
+
reply_ref = models.create_strong_ref(new_post)
+
created_records.append(new_post)
+
else: # a video post is guaranteed to have exactly one attachment
+
metadata = get_media_meta(attachments[0].io)
+
if metadata["duration"] > 180:
+
LOGGER.info(
+
"Skipping post_id '%s', video attachment too long!",
+
post.get_id(),
+
)
+
return
+
+
video_io = attachments[0].io
+
if attachments[0].mime != "video/mp4":
+
LOGGER.info("Converting %s to mp4...", attachments[0].name)
+
video_io = convert_to_mp4(video_io)
+
+
aspect_ratio = models.AppBskyEmbedDefs.AspectRatio(
+
width=metadata["width"], height=metadata["height"]
+
)
+
+
new_post = self.bsky.send_video(
+
text=text,
+
video=video_io,
+
video_aspect_ratio=aspect_ratio,
+
video_alt=attachments[0].alt,
+
reply_to=models.AppBskyFeedPost.ReplyRef(
+
parent=reply_ref, root=root_ref
+
)
+
if root_ref and reply_ref
+
else None,
+
labels=labels,
+
time_iso=post.get_timestamp(),
+
)
+
if not root_ref:
+
root_ref = models.create_strong_ref(new_post)
+
+
self.bsky.create_gates(
+
self.options.thread_gate,
+
self.options.quote_gate,
+
new_post.uri,
+
time_iso=post.get_timestamp(),
+
)
+
reply_ref = models.create_strong_ref(new_post)
+
created_records.append(new_post)
+
+
db_post = database.find_post(
+
self.db, post.get_id(), self.input.user_id, self.input.service
+
)
+
assert db_post, "original post missing from database"
+
+
if new_root_id is None or new_parent_id is None:
+
new_root_id = database.insert_post(
+
self.db, created_records[0].uri, login.did, SERVICE
+
)
+
database.store_data(
+
self.db,
+
created_records[0].uri,
+
login.did,
+
SERVICE,
+
{"cid": created_records[0].cid},
+
)
+
+
new_parent_id = new_root_id
+
database.insert_mapping(self.db, db_post["id"], new_parent_id)
+
created_records = created_records[1:]
+
+
for record in created_records:
+
new_parent_id = database.insert_reply(
+
self.db, record.uri, login.did, SERVICE, new_parent_id, new_root_id
+
)
+
database.store_data(
+
self.db, record.uri, login.did, SERVICE, {"cid": record.cid}
+
)
+
database.insert_mapping(self.db, db_post["id"], new_parent_id)
+
+
def delete_post(self, identifier: str):
+
login = self.__check_login()
+
+
post = database.find_post(
+
self.db, identifier, self.input.user_id, self.input.service
+
)
+
if not post:
+
return
+
+
mappings = database.find_mappings(self.db, post["id"], SERVICE, login.did)
+
for mapping in mappings[::-1]:
+
LOGGER.info("Deleting '%s'...", mapping[0])
+
self.bsky.delete_post(mapping[0])
+
database.delete_post(self.db, mapping[0], login.did, SERVICE)
+
+
def accept_repost(self, repost_id: str, reposted_id: str):
+
login, repost = self.__delete_repost(repost_id)
+
if not (login and repost):
+
return
+
+
reposted = database.find_post(
+
self.db, reposted_id, self.input.user_id, self.input.service
+
)
+
if not reposted:
+
return
+
+
# mappings of the reposted post
+
mappings = database.find_mappings(self.db, reposted["id"], SERVICE, login.did)
+
if mappings:
+
cid = database.fetch_data(self.db, mappings[0][0], login.did, SERVICE)[
+
"cid"
+
]
+
rsp = self.bsky.repost(mappings[0][0], cid)
+
+
internal_id = database.insert_repost(
+
self.db, rsp.uri, reposted["id"], login.did, SERVICE
+
)
+
database.store_data(self.db, rsp.uri, login.did, SERVICE, {"cid": rsp.cid})
+
database.insert_mapping(self.db, repost["id"], internal_id)
+
+
def __delete_repost(
+
self, repost_id: str
+
) -> tuple[models.AppBskyActorDefs.ProfileViewDetailed | None, dict | None]:
+
login = self.__check_login()
+
+
repost = database.find_post(
+
self.db, repost_id, self.input.user_id, self.input.service
+
)
+
if not repost:
+
return None, None
+
+
mappings = database.find_mappings(self.db, repost["id"], SERVICE, login.did)
+
if mappings:
+
LOGGER.info("Deleting '%s'...", mappings[0][0])
+
self.bsky.unrepost(mappings[0][0])
+
database.delete_post(self.db, mappings[0][0], login.did, SERVICE)
+
return login, repost
+
+
def delete_repost(self, repost_id: str):
+
self.__delete_repost(repost_id)
+237
cross.py
···
···
+
import re
+
from abc import ABC, abstractmethod
+
from datetime import datetime, timezone
+
from typing import Any, Callable
+
+
from util.database import DataBaseWorker
+
from util.media import MediaInfo
+
from util.util import LOGGER, canonical_label
+
+
ALTERNATE = re.compile(r"\S+|\s+")
+
+
+
# generic token
+
class Token:
+
def __init__(self, type: str) -> None:
+
self.type = type
+
+
+
class TextToken(Token):
+
def __init__(self, text: str) -> None:
+
super().__init__("text")
+
self.text = text
+
+
+
# token that represents a link to a website. e.g. [link](https://google.com/)
+
class LinkToken(Token):
+
def __init__(self, href: str, label: str) -> None:
+
super().__init__("link")
+
self.href = href
+
self.label = label
+
+
+
# token that represents a hashtag. e.g. #SocialMedia
+
class TagToken(Token):
+
def __init__(self, tag: str) -> None:
+
super().__init__("tag")
+
self.tag = tag
+
+
+
# token that represents a mention of a user.
+
class MentionToken(Token):
+
def __init__(self, username: str, uri: str) -> None:
+
super().__init__("mention")
+
self.username = username
+
self.uri = uri
+
+
+
class MediaMeta:
+
def __init__(self, width: int, height: int, duration: float) -> None:
+
self.width = width
+
self.height = height
+
self.duration = duration
+
+
def get_width(self) -> int:
+
return self.width
+
+
def get_height(self) -> int:
+
return self.height
+
+
def get_duration(self) -> float:
+
return self.duration
+
+
+
class Post(ABC):
+
@abstractmethod
+
def get_id(self) -> str:
+
return ""
+
+
@abstractmethod
+
def get_parent_id(self) -> str | None:
+
pass
+
+
@abstractmethod
+
def get_tokens(self) -> list[Token]:
+
pass
+
+
# returns input text type.
+
# text/plain, text/markdown, text/x.misskeymarkdown
+
@abstractmethod
+
def get_text_type(self) -> str:
+
pass
+
+
# post iso timestamp
+
@abstractmethod
+
def get_timestamp(self) -> str:
+
pass
+
+
def get_attachments(self) -> list[MediaInfo]:
+
return []
+
+
def get_spoiler(self) -> str | None:
+
return None
+
+
def get_languages(self) -> list[str]:
+
return []
+
+
def is_sensitive(self) -> bool:
+
return False
+
+
def get_post_url(self) -> str | None:
+
return None
+
+
+
# generic input service.
+
# user and service for db queries
+
class Input:
+
def __init__(
+
self, service: str, user_id: str, settings: dict, db: DataBaseWorker
+
) -> None:
+
self.service = service
+
self.user_id = user_id
+
self.settings = settings
+
self.db = db
+
+
async def listen(self, outputs: list, handler: Callable[[Post], Any]):
+
pass
+
+
+
class Output:
+
def __init__(self, input: Input, settings: dict, db: DataBaseWorker) -> None:
+
self.input = input
+
self.settings = settings
+
self.db = db
+
+
def accept_post(self, post: Post):
+
LOGGER.warning('Not Implemented.. "posted" %s', post.get_id())
+
+
def delete_post(self, identifier: str):
+
LOGGER.warning('Not Implemented.. "deleted" %s', identifier)
+
+
def accept_repost(self, repost_id: str, reposted_id: str):
+
LOGGER.warning('Not Implemented.. "reblogged" %s, %s', repost_id, reposted_id)
+
+
def delete_repost(self, repost_id: str):
+
LOGGER.warning('Not Implemented.. "removed reblog" %s', repost_id)
+
+
+
def test_filters(tokens: list[Token], filters: list[re.Pattern[str]]):
+
if not tokens or not filters:
+
return True
+
+
markdown = ""
+
+
for token in tokens:
+
if isinstance(token, TextToken):
+
markdown += token.text
+
elif isinstance(token, LinkToken):
+
markdown += f"[{token.label}]({token.href})"
+
elif isinstance(token, TagToken):
+
markdown += "#" + token.tag
+
elif isinstance(token, MentionToken):
+
markdown += token.username
+
+
for filter in filters:
+
if filter.search(markdown):
+
return False
+
+
return True
+
+
+
def split_tokens(
+
tokens: list[Token], max_chars: int, max_link_len: int = 35
+
) -> list[list[Token]]:
+
def new_block():
+
nonlocal blocks, block, length
+
if block:
+
blocks.append(block)
+
block = []
+
length = 0
+
+
def append_text(text_segment):
+
nonlocal block
+
# if the last element in the current block is also text, just append to it
+
if block and isinstance(block[-1], TextToken):
+
block[-1].text += text_segment
+
else:
+
block.append(TextToken(text_segment))
+
+
blocks: list[list[Token]] = []
+
block: list[Token] = []
+
length = 0
+
+
for tk in tokens:
+
if isinstance(tk, TagToken):
+
tag_len = 1 + len(tk.tag) # (#) + tag
+
if length + tag_len > max_chars:
+
new_block() # create new block if the current one is too large
+
+
block.append(tk)
+
length += tag_len
+
elif isinstance(tk, LinkToken): # TODO labels should probably be split too
+
link_len = len(tk.label)
+
if canonical_label(
+
tk.label, tk.href
+
): # cut down the link if the label is canonical
+
link_len = min(link_len, max_link_len)
+
+
if length + link_len > max_chars:
+
new_block()
+
block.append(tk)
+
length += link_len
+
elif isinstance(tk, TextToken):
+
segments: list[str] = ALTERNATE.findall(tk.text)
+
+
for seg in segments:
+
seg_len: int = len(seg)
+
if length + seg_len <= max_chars - (0 if seg.isspace() else 1):
+
append_text(seg)
+
length += seg_len
+
continue
+
+
if length > 0:
+
new_block()
+
+
if not seg.isspace():
+
while len(seg) > max_chars - 1:
+
chunk = seg[: max_chars - 1] + "-"
+
append_text(chunk)
+
new_block()
+
seg = seg[max_chars - 1 :]
+
else:
+
while len(seg) > max_chars:
+
chunk = seg[:max_chars]
+
append_text(chunk)
+
new_block()
+
seg = seg[max_chars:]
+
+
if seg:
+
append_text(seg)
+
length = len(seg)
+
else: # TODO fix mentions
+
block.append(tk)
+
+
if block:
+
blocks.append(block)
+
+
return blocks
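`split_tokens` is easier to see on plain strings. A reduced sketch of the same greedy strategy (hypothetical `split_text`, dropping the token types and the whitespace-chunking edge case): pack word/space segments while they fit, reserving one character for a trailing hyphen when a word must be broken:

```python
import re

ALTERNATE = re.compile(r"\S+|\s+")


def split_text(text: str, max_chars: int) -> list[str]:
    blocks: list[str] = []
    block = ""
    for seg in ALTERNATE.findall(text):
        # reserve one char for a possible hyphen on non-space segments
        if len(block) + len(seg) <= max_chars - (0 if seg.isspace() else 1):
            block += seg
            continue
        if block:
            blocks.append(block)
            block = ""
        if not seg.isspace():
            # hyphenate words longer than a block
            while len(seg) > max_chars - 1:
                blocks.append(seg[: max_chars - 1] + "-")
                seg = seg[max_chars - 1 :]
        else:
            seg = ""  # simplified: drop whitespace at block boundaries
        block = seg
    if block:
        blocks.append(block)
    return blocks
```

The real implementation does the same walk but keeps tokens intact, merging adjacent text segments and counting links at their (possibly shortened) label length.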
-68
database/migrations.py
···
-
import sqlite3
-
from pathlib import Path
-
-
from util.util import LOGGER
-
-
-
class DatabaseMigrator:
-
def __init__(self, db_path: Path, migrations_folder: Path) -> None:
-
self.db_path: Path = db_path
-
self.migrations_folder: Path = migrations_folder
-
self.conn: sqlite3.Connection = sqlite3.connect(db_path, autocommit=False)
-
self.conn.row_factory = sqlite3.Row
-
-
def close(self):
-
self.conn.close()
-
-
def get_version(self) -> int:
-
cursor = self.conn.cursor()
-
_ = cursor.execute("PRAGMA user_version")
-
return int(cursor.fetchone()[0])
-
-
def set_version(self, version: int):
-
cursor = self.conn.cursor()
-
_ = cursor.execute(f"PRAGMA user_version = {version}")
-
self.conn.commit()
-
-
def get_migrations(self) -> list[tuple[int, Path]]:
-
if not self.migrations_folder.exists():
-
return []
-
-
files: list[tuple[int, Path]] = []
-
for f in self.migrations_folder.glob("*.sql"):
-
try:
-
version = int(f.stem.split("_")[0])
-
files.append((version, f))
-
except (ValueError, IndexError):
-
LOGGER.warning(f"Warning: Skipping invalid migration file: {f.name}")
-
-
return sorted(files, key=lambda x: x[0])
-
-
def apply_migration(self, version: int, path: Path):
-
with open(path, "r") as f:
-
sql = f.read()
-
-
cursor = self.conn.cursor()
-
try:
-
_ = cursor.executescript(sql)
-
self.set_version(version)
-
LOGGER.info(f"Applied migration: {path.name}")
-
except sqlite3.Error as e:
-
self.conn.rollback()
-
raise Exception(f"Error applying migration {version}: {e}")
-
-
def migrate(self):
-
current_version = self.get_version()
-
migrations = self.get_migrations()
-
-
if not migrations:
-
LOGGER.warning("No migration files found.")
-
return
-
-
pending = [m for m in migrations if m[0] > current_version]
-
if not pending:
-
LOGGER.info("No pending migrations.")
-
return
-
-
for version, filepath in pending:
-
self.apply_migration(version, filepath)
···
+148 -18
main.py
···
-
from pathlib import Path
-
from database.migrations import DatabaseMigrator
-
from util.util import LOGGER
-
def main(data: Path):
-
if not data.exists():
-
data.mkdir(parents=True)
-
settings = data.joinpath("settings.json")
-
database = data.joinpath("db.sqlite")
-
if not settings.exists():
-
LOGGER.info("First launch detected! Creating %s and exiting!", settings)
return 0
LOGGER.info("Loading settings...")
-
# TODO
-
migrator = DatabaseMigrator(database, Path("./migrations"))
try:
-
migrator.migrate()
-
except Exception:
-
LOGGER.exception("Failed to migrate database!")
-
finally:
-
migrator.close()
if __name__ == "__main__":
-
main(Path("./data"))
···
+
import asyncio
+
import json
+
import os
+
import queue
+
import threading
+
import traceback
+
+
import cross
+
import util.database as database
+
from bluesky.input import BlueskyJetstreamInput
+
from bluesky.output import BlueskyOutput, BlueskyOutputOptions
+
from mastodon.input import MastodonInput, MastodonInputOptions
+
from mastodon.output import MastodonOutput
+
from misskey.input import MisskeyInput
+
from util.util import LOGGER, as_json
+
+
DEFAULT_SETTINGS: dict = {
+
"input": {
+
"type": "mastodon-wss",
+
"instance": "env:MASTODON_INSTANCE",
+
"token": "env:MASTODON_TOKEN",
+
"options": MastodonInputOptions({}),
+
},
+
"outputs": [
+
{
+
"type": "bluesky",
+
"handle": "env:BLUESKY_HANDLE",
+
"app-password": "env:BLUESKY_APP_PASSWORD",
+
"options": BlueskyOutputOptions({}),
+
}
+
],
+
}
+
+
INPUTS = {
+
"mastodon-wss": lambda settings, db: MastodonInput(settings, db),
+
"misskey-wss": lambda settings, db: MisskeyInput(settings, db),
+
"bluesky-jetstream-wss": lambda settings, db: BlueskyJetstreamInput(settings, db),
+
}
+
+
OUTPUTS = {
+
"bluesky": lambda input, settings, db: BlueskyOutput(input, settings, db),
+
"mastodon": lambda input, settings, db: MastodonOutput(input, settings, db),
+
}
+
def execute(data_dir):
+
if not os.path.exists(data_dir):
+
os.makedirs(data_dir)
+
settings_path = os.path.join(data_dir, "settings.json")
+
database_path = os.path.join(data_dir, "data.db")
+
if not os.path.exists(settings_path):
+
LOGGER.info("First launch detected! Creating %s and exiting!", settings_path)
+
with open(settings_path, "w") as f:
+
f.write(as_json(DEFAULT_SETTINGS, indent=2))
return 0
LOGGER.info("Loading settings...")
+
with open(settings_path, "rb") as f:
+
settings = json.load(f)
+
LOGGER.info("Starting database worker...")
+
db_worker = database.DataBaseWorker(os.path.abspath(database_path))
+
+
db_worker.execute("PRAGMA foreign_keys = ON;")
+
+
# create the posts table
+
# id - internal id of the post
+
# user_id - user id on the service (e.g. a724sknj5y9ydk0w)
+
# service - the service (e.g. https://shrimp.melontini.me)
+
# identifier - post id on the service (e.g. a8mpiyeej0fpjp0p)
+
# parent_id - the internal id of the parent
+
db_worker.execute(
+
"""
+
CREATE TABLE IF NOT EXISTS posts (
+
id INTEGER PRIMARY KEY AUTOINCREMENT,
+
user_id TEXT NOT NULL,
+
service TEXT NOT NULL,
+
identifier TEXT NOT NULL,
+
parent_id INTEGER NULL REFERENCES posts(id) ON DELETE SET NULL,
+
root_id INTEGER NULL REFERENCES posts(id) ON DELETE SET NULL
+
);
+
"""
+
)
+
+
columns = db_worker.execute("PRAGMA table_info(posts)")
+
column_names = [col[1] for col in columns]
+
if "reposted_id" not in column_names:
+
db_worker.execute("""
+
ALTER TABLE posts
+
ADD COLUMN reposted_id INTEGER NULL REFERENCES posts(id) ON DELETE SET NULL
+
""")
+
if "extra_data" not in column_names:
+
db_worker.execute("""
+
ALTER TABLE posts
+
ADD COLUMN extra_data TEXT NULL
+
""")
+
+
# create the mappings table
+
# original_post_id - the post this was mapped from
+
# mapped_post_id - the post this was mapped to
+
db_worker.execute(
+
"""
+
CREATE TABLE IF NOT EXISTS mappings (
+
original_post_id INTEGER NOT NULL REFERENCES posts(id) ON DELETE CASCADE,
+
mapped_post_id INTEGER NOT NULL
+
);
+
"""
+
)
+
+
input_settings = settings.get("input")
+
if not input_settings:
+
raise Exception("No input specified!")
+
outputs_settings = settings.get("outputs", [])
+
+
input = INPUTS[input_settings["type"]](input_settings, db_worker)
+
+
if not outputs_settings:
+
LOGGER.warning("No outputs specified! Check the config!")
+
+
outputs: list[cross.Output] = []
+
for output_settings in outputs_settings:
+
outputs.append(
+
OUTPUTS[output_settings["type"]](input, output_settings, db_worker)
+
)
+
+
LOGGER.info("Starting task worker...")
+
+
def worker(queue: queue.Queue):
+
while True:
+
task = queue.get()
+
if task is None:
+
break
+
+
try:
+
task()
+
except Exception as e:
+
LOGGER.error(f"Exception in worker thread!\n{e}")
+
traceback.print_exc()
+
finally:
+
queue.task_done()
+
+
task_queue = queue.Queue()
+
thread = threading.Thread(target=worker, args=(task_queue,), daemon=True)
+
thread.start()
+
+
LOGGER.info("Connecting to %s...", input.service)
try:
+
asyncio.run(input.listen(outputs, lambda x: task_queue.put(x)))
+
except KeyboardInterrupt:
+
LOGGER.info("Stopping...")
+
+
task_queue.join()
+
task_queue.put(None)
+
thread.join()
if __name__ == "__main__":
+
execute("./data")
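The worker thread in `execute` is a standard single-consumer queue with a `None` sentinel. A self-contained sketch of the same shutdown order (drain the queue with `join()` before posting the sentinel, then join the thread); `run_tasks` is a hypothetical name:

```python
import queue
import threading


def run_tasks(tasks):
    results = []
    q: queue.Queue = queue.Queue()

    def worker():
        while True:
            task = q.get()
            if task is None:  # sentinel: stop the worker
                break
            try:
                results.append(task())
            finally:
                q.task_done()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    for task in tasks:
        q.put(task)
    q.join()     # wait for all queued work to finish
    q.put(None)  # then ask the worker to exit
    t.join()
    return results
```

Because there is a single FIFO consumer, results come back in submission order.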
+52
mastodon/common.py
···
···
+
import cross
+
from util.media import MediaInfo
+
+
+
class MastodonPost(cross.Post):
+
def __init__(
+
self,
+
status: dict,
+
tokens: list[cross.Token],
+
media_attachments: list[MediaInfo],
+
) -> None:
+
super().__init__()
+
self.id = status["id"]
+
self.parent_id = status.get("in_reply_to_id")
+
self.tokens = tokens
+
self.content_type = status.get("content_type", "text/plain")
+
self.timestamp = status["created_at"]
+
self.media_attachments = media_attachments
+
self.spoiler = status.get("spoiler_text")
+
self.language = [status["language"]] if status.get("language") else []
+
self.sensitive = status.get("sensitive", False)
+
self.url = status.get("url")
+
+
def get_id(self) -> str:
+
return self.id
+
+
def get_parent_id(self) -> str | None:
+
return self.parent_id
+
+
def get_tokens(self) -> list[cross.Token]:
+
return self.tokens
+
+
def get_text_type(self) -> str:
+
return self.content_type
+
+
def get_timestamp(self) -> str:
+
return self.timestamp
+
+
def get_attachments(self) -> list[MediaInfo]:
+
return self.media_attachments
+
+
def get_spoiler(self) -> str | None:
+
return self.spoiler
+
+
def get_languages(self) -> list[str]:
+
return self.language
+
+
def is_sensitive(self) -> bool:
+
return self.sensitive or (self.spoiler is not None and self.spoiler != "")
+
+
def get_post_url(self) -> str | None:
+
return self.url
+225
mastodon/input.py
···
···
+
import asyncio
+
import json
+
import re
+
from typing import Any, Callable
+
+
import requests
+
import websockets
+
+
import cross
+
import util.database as database
+
import util.html_util as html_util
+
import util.md_util as md_util
+
from mastodon.common import MastodonPost
+
from util.database import DataBaseWorker
+
from util.media import MediaInfo, download_media
+
from util.util import LOGGER, as_envvar
+
+
ALLOWED_VISIBILITY = ["public", "unlisted"]
+
MARKDOWNY = ["text/x.misskeymarkdown", "text/markdown", "text/plain"]
+
+
+
class MastodonInputOptions:
+
def __init__(self, o: dict) -> None:
+
self.allowed_visibility = ALLOWED_VISIBILITY
+
self.filters = [re.compile(f) for f in o.get("regex_filters", [])]
+
+
allowed_visibility = o.get("allowed_visibility")
+
if allowed_visibility is not None:
+
if any([v not in ALLOWED_VISIBILITY for v in allowed_visibility]):
+
raise ValueError(
+
f"'allowed_visibility' only accepts {', '.join(ALLOWED_VISIBILITY)}, got: {allowed_visibility}"
+
)
+
self.allowed_visibility = allowed_visibility
+
+
+
class MastodonInput(cross.Input):
+
def __init__(self, settings: dict, db: DataBaseWorker) -> None:
+
self.options = MastodonInputOptions(settings.get("options", {}))
+
self.token = as_envvar(settings.get("token")) or (_ for _ in ()).throw(
+
ValueError("'token' is required")
+
)
+
instance: str = as_envvar(settings.get("instance")) or (_ for _ in ()).throw(
+
ValueError("'instance' is required")
+
)
+
+
service = instance[:-1] if instance.endswith("/") else instance
+
+
LOGGER.info("Verifying %s credentials...", service)
+
response = requests.get(
+
f"{service}/api/v1/accounts/verify_credentials",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
if response.status_code != 200:
+
LOGGER.error("Failed to validate user credentials!")
+
response.raise_for_status()
+
return
+
+
super().__init__(service, response.json()["id"], settings, db)
+
self.streaming = self._get_streaming_url()
+
+
if not self.streaming:
+
raise Exception(f"Instance {service} does not support streaming!")
+
+
def _get_streaming_url(self):
+
response = requests.get(f"{self.service}/api/v1/instance")
+
response.raise_for_status()
+
data: dict = response.json()
+
return (data.get("urls") or {}).get("streaming_api")
+
+
def __to_tokens(self, status: dict):
+
content_type = status.get("content_type", "text/plain")
+
raw_text = status.get("text")
+
+
tags: list[str] = []
+
for tag in status.get("tags", []):
+
tags.append(tag["name"])
+
+
mentions: list[tuple[str, str]] = []
+
for mention in status.get("mentions", []):
+
mentions.append(("@" + mention["username"], "@" + mention["acct"]))
+
+
if raw_text and content_type in MARKDOWNY:
+
return md_util.tokenize_markdown(raw_text, tags, mentions)
+
+
akkoma_ext: dict | None = status.get("akkoma", {}).get("source")
+
if akkoma_ext:
+
if akkoma_ext.get("mediaType") in MARKDOWNY:
+
return md_util.tokenize_markdown(akkoma_ext["content"], tags, mentions)
+
+
tokenizer = html_util.HTMLPostTokenizer()
+
tokenizer.mentions = mentions
+
tokenizer.tags = tags
+
tokenizer.feed(status.get("content", ""))
+
return tokenizer.get_tokens()
+
+
def _on_create_post(self, outputs: list[cross.Output], status: dict):
+
# skip events from other users
+
if (status.get("account") or {}).get("id") != self.user_id:
+
return
+
+
if status.get("visibility") not in self.options.allowed_visibility:
+
# Skip f/o and direct posts
+
LOGGER.info(
+
"Skipping '%s'! '%s' visibility..",
+
status["id"],
+
status.get("visibility"),
+
)
+
return
+
+
# TODO polls not supported on bsky. maybe 3rd party? skip for now
+
# we don't handle reblogs. possible with bridgy(?) and self
+
# we don't handle quotes.
+
if status.get("poll"):
+
LOGGER.info("Skipping '%s'! Contains a poll..", status["id"])
+
return
+
+
if status.get("quote_id") or status.get("quote"):
+
LOGGER.info("Skipping '%s'! Quote..", status["id"])
+
return
+
+
reblog: dict | None = status.get("reblog")
+
if reblog:
+
if (reblog.get("account") or {}).get("id") != self.user_id:
+
LOGGER.info("Skipping '%s'! Reblog of other user..", status["id"])
+
return
+
+
success = database.try_insert_repost(
+
self.db, status["id"], reblog["id"], self.user_id, self.service
+
)
+
if not success:
+
LOGGER.info(
+
"Skipping '%s' as reblogged post was not found in db!", status["id"]
+
)
+
return
+
+
for output in outputs:
+
output.accept_repost(status["id"], reblog["id"])
+
return
+
+
in_reply: str | None = status.get("in_reply_to_id")
+
in_reply_to: str | None = status.get("in_reply_to_account_id")
+
if in_reply_to and in_reply_to != self.user_id:
+
# We don't support replies.
+
LOGGER.info("Skipping '%s'! Reply to other user..", status["id"])
+
return
+
+
success = database.try_insert_post(
+
self.db, status["id"], in_reply, self.user_id, self.service
+
)
+
if not success:
+
LOGGER.info(
+
"Skipping '%s' as parent post was not found in db!", status["id"]
+
)
+
return
+
+
tokens = self.__to_tokens(status)
+
if not cross.test_filters(tokens, self.options.filters):
+
LOGGER.info("Skipping '%s'. Matched a filter!", status["id"])
+
return
+
+
LOGGER.info("Crossposting '%s'...", status["id"])
+
+
media_attachments: list[MediaInfo] = []
+
for attachment in status.get("media_attachments", []):
+
LOGGER.info("Downloading %s...", attachment["url"])
+
info = download_media(
+
attachment["url"], attachment.get("description") or ""
+
)
+
if not info:
+
LOGGER.error("Skipping '%s'. Failed to download media!", status["id"])
+
return
+
media_attachments.append(info)
+
+
cross_post = MastodonPost(status, tokens, media_attachments)
+
for output in outputs:
+
output.accept_post(cross_post)
+
+
def _on_delete_post(self, outputs: list[cross.Output], identifier: str):
+
post = database.find_post(self.db, identifier, self.user_id, self.service)
+
if not post:
+
return
+
+
LOGGER.info("Deleting '%s'...", identifier)
+
if post["reposted_id"]:
+
for output in outputs:
+
output.delete_repost(identifier)
+
else:
+
for output in outputs:
+
output.delete_post(identifier)
+
+
database.delete_post(self.db, identifier, self.user_id, self.service)
+
+
def _on_post(self, outputs: list[cross.Output], event: str, payload: str):
+
match event:
+
case "update":
+
self._on_create_post(outputs, json.loads(payload))
+
case "delete":
+
self._on_delete_post(outputs, payload)
+
+
async def listen(
+
self, outputs: list[cross.Output], submit: Callable[[Callable[[], Any]], Any]
+
):
+
uri = f"{self.streaming}/api/v1/streaming?stream=user&access_token={self.token}"
+
+
async for ws in websockets.connect(
+
uri, extra_headers={"User-Agent": "XPost/0.0.3"}
+
):
+
try:
+
LOGGER.info("Listening to %s...", self.streaming)
+
+
async def listen_for_messages():
+
async for msg in ws:
+
data = json.loads(msg)
+
event: str = data.get("event")
+
payload: str = data.get("payload")
+
+
submit(lambda event=event, payload=payload: self._on_post(outputs, str(event), str(payload)))
+
+
listen = asyncio.create_task(listen_for_messages())
+
+
await asyncio.gather(listen)
+
except websockets.ConnectionClosedError as e:
+
LOGGER.error(e, stack_info=True, exc_info=True)
+
LOGGER.info("Reconnecting to %s...", self.streaming)
+
continue
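The streaming loop hands each event to the worker via a lambda. In Python, closures capture loop variables by reference, so the values should be bound as default arguments (`lambda event=event, ...`) to avoid late-binding surprises when the task runs after the loop has advanced. A minimal demonstration of the difference:

```python
def make_tasks_late():
    # Closures capture the variable, not its value:
    # every task sees the final value of i.
    return [lambda: i for i in range(3)]


def make_tasks_bound():
    # Default arguments are evaluated at definition time,
    # so each task keeps its own value.
    return [lambda i=i: i for i in range(3)]
```

This matters here because the websocket loop keeps producing new `event`/`payload` values while earlier tasks may still be waiting in the queue.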
+448
mastodon/output.py
···
···
+
import time
+
+
import requests
+
+
import cross
+
import misskey.mfm_util as mfm_util
+
import util.database as database
+
from util.database import DataBaseWorker
+
from util.media import MediaInfo
+
from util.util import LOGGER, as_envvar, canonical_label
+
+
POSSIBLE_MIMES = [
+
"audio/ogg",
+
"audio/mp3",
+
"image/webp",
+
"image/jpeg",
+
"image/png",
+
"video/mp4",
+
"video/quicktime",
+
"video/webm",
+
]
+
+
TEXT_MIMES = ["text/x.misskeymarkdown", "text/markdown", "text/plain"]
+
+
ALLOWED_POSTING_VISIBILITY = ["public", "unlisted", "private"]
+
+
+
class MastodonOutputOptions:
+
def __init__(self, o: dict) -> None:
+
self.visibility = "public"
+
+
visibility = o.get("visibility")
+
if visibility is not None:
+
if visibility not in ALLOWED_POSTING_VISIBILITY:
+
raise ValueError(
+
f"'visibility' only accepts {', '.join(ALLOWED_POSTING_VISIBILITY)}, got: {visibility}"
+
)
+
self.visibility = visibility
+
+
+
class MastodonOutput(cross.Output):
+
def __init__(self, input: cross.Input, settings: dict, db: DataBaseWorker) -> None:
+
super().__init__(input, settings, db)
+
self.options = MastodonOutputOptions(settings.get("options") or {})
+
self.token = as_envvar(settings.get("token")) or (_ for _ in ()).throw(
+
ValueError("'token' is required")
+
)
+
instance: str = as_envvar(settings.get("instance")) or (_ for _ in ()).throw(
+
ValueError("'instance' is required")
+
)
+
+
self.service = instance[:-1] if instance.endswith("/") else instance
+
+
LOGGER.info("Verifying %s credentials...", self.service)
+
response = requests.get(
+
f"{self.service}/api/v1/accounts/verify_credentials",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
if response.status_code != 200:
+
LOGGER.error("Failed to validate user credentials!")
+
response.raise_for_status()
+
return
+
self.user_id: str = response.json()["id"]
+
+
LOGGER.info("Getting %s configuration...", self.service)
+
response = requests.get(
+
f"{self.service}/api/v1/instance",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
if response.status_code != 200:
+
LOGGER.error("Failed to get instance info!")
+
response.raise_for_status()
+
return
+
+
instance_info: dict = response.json()
+
configuration: dict = instance_info["configuration"]
+
+
statuses_config: dict = configuration.get("statuses", {})
+
self.max_characters: int = statuses_config.get("max_characters", 500)
+
self.max_media_attachments: int = statuses_config.get(
+
"max_media_attachments", 4
+
)
+
self.characters_reserved_per_url: int = statuses_config.get(
+
"characters_reserved_per_url", 23
+
)
+
+
media_config: dict = configuration.get("media_attachments", {})
+
self.image_size_limit: int = media_config.get("image_size_limit", 16777216)
+
self.video_size_limit: int = media_config.get("video_size_limit", 103809024)
+
self.supported_mime_types: list[str] = media_config.get(
+
"supported_mime_types", POSSIBLE_MIMES
+
)
+
+
# *oma: max post chars
+
max_toot_chars = instance_info.get("max_toot_chars")
+
if max_toot_chars:
+
self.max_characters: int = max_toot_chars
+
+
# *oma: max upload limit
+
upload_limit = instance_info.get("upload_limit")
+
if upload_limit:
+
self.image_size_limit: int = upload_limit
+
self.video_size_limit: int = upload_limit
+
+
# chuckya: supported text types
+
chuckya_text_mimes: list[str] = statuses_config.get("supported_mime_types", [])
+
self.text_format = next(
+
(mime for mime in TEXT_MIMES if mime in chuckya_text_mimes), "text/plain"
+
)
+
+
# *oma ext: supported text types
+
pleroma = instance_info.get("pleroma")
+
if pleroma:
+
post_formats: list[str] = pleroma.get("metadata", {}).get(
+
"post_formats", []
+
)
+
self.text_format = next(
+
(mime for mime in TEXT_MIMES if mime in post_formats), self.text_format
+
)
+
+
def upload_media(self, attachments: list[MediaInfo]) -> list[str] | None:
+
for a in attachments:
+
if a.mime.startswith("image/") and len(a.io) > self.image_size_limit:
+
return None
+
+
if a.mime.startswith("video/") and len(a.io) > self.video_size_limit:
+
return None
+
+
if not a.mime.startswith("image/") and not a.mime.startswith("video/"):
+
if len(a.io) > 7_000_000:
+
return None
+
+
uploads: list[dict] = []
+
for a in attachments:
+
data = {}
+
if a.alt:
+
data["description"] = a.alt
+
+
req = requests.post(
+
f"{self.service}/api/v2/media",
+
headers={"Authorization": f"Bearer {self.token}"},
+
files={"file": (a.name, a.io, a.mime)},
+
data=data,
+
)
+
+
if req.status_code == 200:
+
LOGGER.info("Uploaded %s! (%s)", a.name, req.json()["id"])
+
uploads.append({"done": True, "id": req.json()["id"]})
+
elif req.status_code == 202:
+
LOGGER.info("Waiting for %s to process!", a.name)
+
uploads.append({"done": False, "id": req.json()["id"]})
+
else:
+
LOGGER.error("Failed to upload %s! %s", a.name, req.text)
+
req.raise_for_status()
+
+
while any(not val["done"] for val in uploads):
+
LOGGER.info("Waiting for media to process...")
+
time.sleep(3)
+
for media in uploads:
+
if media["done"]:
+
continue
+
+
reqs = requests.get(
+
f"{self.service}/api/v1/media/{media['id']}",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
+
if reqs.status_code == 206:
+
continue
+
+
if reqs.status_code == 200:
+
media["done"] = True
+
continue
+
reqs.raise_for_status()
+
+
return [val["id"] for val in uploads]
+
+
def token_to_string(self, tokens: list[cross.Token]) -> str | None:
+
p_text: str = ""
+
+
for token in tokens:
+
if isinstance(token, cross.TextToken):
+
p_text += token.text
+
elif isinstance(token, cross.TagToken):
+
p_text += "#" + token.tag
+
elif isinstance(token, cross.LinkToken):
+
if canonical_label(token.label, token.href):
+
p_text += token.href
+
else:
+
if self.text_format == "text/plain":
+
p_text += f"{token.label} ({token.href})"
+
elif self.text_format in {
+
"text/x.misskeymarkdown",
+
"text/markdown",
+
}:
+
p_text += f"[{token.label}]({token.href})"
+
else:
+
return None
+
+
return p_text
+
+
def split_tokens_media(self, tokens: list[cross.Token], media: list[MediaInfo]):
+
split_tokens = cross.split_tokens(
+
tokens, self.max_characters, self.characters_reserved_per_url
+
)
+
post_text: list[str] = []
+
+
for block in split_tokens:
+
baked_text = self.token_to_string(block)
+
+
if baked_text is None:
+
return None
+
post_text.append(baked_text)
+
+
if not post_text:
+
post_text = [""]
+
+
posts: list[dict] = [
+
{"text": text, "attachments": []} for text in post_text
+
]
+
available_indices: list[int] = list(range(len(posts)))
+
+
current_image_post_idx: int | None = None
+
+
def make_blank_post() -> dict:
+
return {"text": "", "attachments": []}
+
+
def pop_next_empty_index() -> int:
+
if available_indices:
+
return available_indices.pop(0)
+
else:
+
new_idx = len(posts)
+
posts.append(make_blank_post())
+
return new_idx
+
+
for att in media:
+
if (
+
current_image_post_idx is not None
+
and len(posts[current_image_post_idx]["attachments"])
+
< self.max_media_attachments
+
):
+
posts[current_image_post_idx]["attachments"].append(att)
+
else:
+
idx = pop_next_empty_index()
+
posts[idx]["attachments"].append(att)
+
current_image_post_idx = idx
+
+
result: list[tuple[str, list[MediaInfo]]] = []
+
+
for p in posts:
+
result.append((p["text"], p["attachments"]))
+
+
return result
+
+
def accept_post(self, post: cross.Post):
+
parent_id = post.get_parent_id()
+
+
new_root_id: int | None = None
+
new_parent_id: int | None = None
+
+
reply_ref: str | None = None
+
if parent_id:
+
thread_tuple = database.find_mapped_thread(
+
self.db,
+
parent_id,
+
self.input.user_id,
+
self.input.service,
+
self.user_id,
+
self.service,
+
)
+
+
if not thread_tuple:
+
LOGGER.error("Failed to find thread tuple in the database!")
+
return None
+
+
_, reply_ref, new_root_id, new_parent_id = thread_tuple
+
+
lang: str
+
if post.get_languages():
+
lang = post.get_languages()[0]
+
else:
+
lang = "en"
+
+
post_tokens = post.get_tokens()
+
if post.get_text_type() == "text/x.misskeymarkdown":
+
post_tokens, had_mfm = mfm_util.strip_mfm(post_tokens)
+
post_url = post.get_post_url()
+
if had_mfm and post_url:
+
post_tokens.append(cross.TextToken("\n"))
+
post_tokens.append(
+
cross.LinkToken(post_url, "[Post contains MFM, see original]")
+
)
+
+
raw_statuses = self.split_tokens_media(post_tokens, post.get_attachments())
+
if not raw_statuses:
+
LOGGER.error("Failed to split post into statuses?")
+
return None
+
baked_statuses = []
+
+
for status, raw_media in raw_statuses:
+
media: list[str] | None = None
+
if raw_media:
+
media = self.upload_media(raw_media)
+
if not media:
+
LOGGER.error("Failed to upload attachments!")
+
return None
+
baked_statuses.append((status, media))
+
continue
+
baked_statuses.append((status, []))
+
+
created_statuses: list[str] = []
+
+
for status, media in baked_statuses:
+
payload = {
+
"status": status,
+
"media_ids": media or [],
+
"spoiler_text": post.get_spoiler() or "",
+
"visibility": self.options.get("visibility", "public"),
+
"content_type": self.text_format,
+
"language": lang,
+
}
+
+
if media:
+
payload["sensitive"] = post.is_sensitive()
+
+
if post.get_spoiler():
+
payload["sensitive"] = True
+
+
if not status:
+
payload["status"] = "๐Ÿ–ผ๏ธ"
+
+
if reply_ref:
+
payload["in_reply_to_id"] = reply_ref
+
+
reqs = requests.post(
+
f"{self.service}/api/v1/statuses",
+
headers={
+
"Authorization": f"Bearer {self.token}",
+
"Content-Type": "application/json",
+
},
+
json=payload,
+
)
+
+
if reqs.status_code != 200:
+
LOGGER.error(
+
"Failed to post status! %s - %s", reqs.status_code, reqs.text
+
)
+
reqs.raise_for_status()
+
+
reply_ref = reqs.json()["id"]
+
LOGGER.info("Created new status %s!", reply_ref)
+
+
created_statuses.append(reqs.json()["id"])
+
+
db_post = database.find_post(
+
self.db, post.get_id(), self.input.user_id, self.input.service
+
)
+
assert db_post, "original post missing from database"
+
+
if new_root_id is None or new_parent_id is None:
+
new_root_id = database.insert_post(
+
self.db, created_statuses[0], self.user_id, self.service
+
)
+
new_parent_id = new_root_id
+
database.insert_mapping(self.db, db_post["id"], new_parent_id)
+
created_statuses = created_statuses[1:]
+
+
for db_id in created_statuses:
+
new_parent_id = database.insert_reply(
+
self.db, db_id, self.user_id, self.service, new_parent_id, new_root_id
+
)
+
database.insert_mapping(self.db, db_post["id"], new_parent_id)
+
+
def delete_post(self, identifier: str):
+
post = database.find_post(
+
self.db, identifier, self.input.user_id, self.input.service
+
)
+
if not post:
+
return
+
+
mappings = database.find_mappings(
+
self.db, post["id"], self.service, self.user_id
+
)
+
for mapping in mappings[::-1]:
+
LOGGER.info("Deleting '%s'...", mapping[0])
+
requests.delete(
+
f"{self.service}/api/v1/statuses/{mapping[0]}",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
database.delete_post(self.db, mapping[0], self.user_id, self.service)
+
+
def accept_repost(self, repost_id: str, reposted_id: str):
+
repost = self.__delete_repost(repost_id)
+
if not repost:
+
return None
+
+
reposted = database.find_post(
+
self.db, reposted_id, self.input.user_id, self.input.service
+
)
+
if not reposted:
+
return
+
+
mappings = database.find_mappings(
+
self.db, reposted["id"], self.service, self.user_id
+
)
+
if mappings:
+
rsp = requests.post(
+
f"{self.service}/api/v1/statuses/{mappings[0][0]}/reblog",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
+
if rsp.status_code != 200:
+
LOGGER.error(
+
"Failed to boost status! status_code: %s, msg: %s",
+
rsp.status_code,
+
rsp.content,
+
)
+
return
+
+
internal_id = database.insert_repost(
+
self.db, rsp.json()["id"], reposted["id"], self.user_id, self.service
+
)
+
database.insert_mapping(self.db, repost["id"], internal_id)
+
+
def __delete_repost(self, repost_id: str) -> dict | None:
+
repost = database.find_post(
+
self.db, repost_id, self.input.user_id, self.input.service
+
)
+
if not repost:
+
return None
+
+
mappings = database.find_mappings(
+
self.db, repost["id"], self.service, self.user_id
+
)
+
reposted_mappings = database.find_mappings(
+
self.db, repost["reposted_id"], self.service, self.user_id
+
)
+
if mappings and reposted_mappings:
+
LOGGER.info("Deleting '%s'...", mappings[0][0])
+
requests.post(
+
f"{self.service}/api/v1/statuses/{reposted_mappings[0][0]}/unreblog",
+
headers={"Authorization": f"Bearer {self.token}"},
+
)
+
database.delete_post(self.db, mappings[0][0], self.user_id, self.service)
+
return repost
+
+
def delete_repost(self, repost_id: str):
+
self.__delete_repost(repost_id)
-15
migrations/001_initdb.sql
···
-
PRAGMA foreign_keys = ON;
-
-
CREATE TABLE IF NOT EXISTS posts (
-
id INTEGER PRIMARY KEY AUTOINCREMENT,
-
user_id TEXT NOT NULL,
-
service TEXT NOT NULL,
-
identifier TEXT NOT NULL,
-
parent_id INTEGER NULL REFERENCES posts(id),
-
root_id INTEGER NULL REFERENCES posts(id)
-
);
-
-
CREATE TABLE IF NOT EXISTS mappings (
-
original_post_id INTEGER NOT NULL REFERENCES posts(id) ON DELETE CASCADE,
-
mapped_post_id INTEGER NOT NULL REFERENCES posts(id) ON DELETE CASCADE
-
);
···
-2
migrations/002_add_reposted_column.sql
···
-
ALTER TABLE posts
-
ADD COLUMN reposted_id INTEGER NULL REFERENCES posts(id) ON DELETE SET NULL;
···
-2
migrations/003_add_extra_data.sql
···
-
ALTER TABLE posts
-
ADD COLUMN extra_data TEXT NULL;
···
+54
misskey/common.py
···
···
+
import cross
+
from util.media import MediaInfo
+
+
+
class MisskeyPost(cross.Post):
+
def __init__(
+
self,
+
instance_url: str,
+
note: dict,
+
tokens: list[cross.Token],
+
files: list[MediaInfo],
+
) -> None:
+
super().__init__()
+
self.note = note
+
self.id = note["id"]
+
self.parent_id = note.get("replyId")
+
self.tokens = tokens
+
self.timestamp = note["createdAt"]
+
self.media_attachments = files
+
self.spoiler = note.get("cw")
+
self.sensitive = any(
+
[a.get("isSensitive", False) for a in note.get("files", [])]
+
)
+
self.url = instance_url + "/notes/" + note["id"]
+
+
def get_id(self) -> str:
+
return self.id
+
+
def get_parent_id(self) -> str | None:
+
return self.parent_id
+
+
def get_tokens(self) -> list[cross.Token]:
+
return self.tokens
+
+
def get_text_type(self) -> str:
+
return "text/x.misskeymarkdown"
+
+
def get_timestamp(self) -> str:
+
return self.timestamp
+
+
def get_attachments(self) -> list[MediaInfo]:
+
return self.media_attachments
+
+
def get_spoiler(self) -> str | None:
+
return self.spoiler
+
+
def get_languages(self) -> list[str]:
+
return []
+
+
def is_sensitive(self) -> bool:
+
return self.sensitive or (self.spoiler is not None and self.spoiler != "")
+
+
def get_post_url(self) -> str | None:
+
return self.url
+202
misskey/input.py
···
···
+
import asyncio
+
import json
+
import re
+
import uuid
+
from typing import Any, Callable
+
+
import requests
+
import websockets
+
+
import cross
+
import util.database as database
+
import util.md_util as md_util
+
from misskey.common import MisskeyPost
+
from util.media import MediaInfo, download_media
+
from util.util import LOGGER, as_envvar
+
+
ALLOWED_VISIBILITY = ["public", "home"]
+
+
+
class MisskeyInputOptions:
+
def __init__(self, o: dict) -> None:
+
self.allowed_visibility = ALLOWED_VISIBILITY
+
self.filters = [re.compile(f) for f in o.get("regex_filters", [])]
+
+
allowed_visibility = o.get("allowed_visibility")
+
if allowed_visibility is not None:
+
if any([v not in ALLOWED_VISIBILITY for v in allowed_visibility]):
+
raise ValueError(
+
f"'allowed_visibility' only accepts {', '.join(ALLOWED_VISIBILITY)}, got: {allowed_visibility}"
+
)
+
self.allowed_visibility = allowed_visibility
+
+
+
class MisskeyInput(cross.Input):
+
def __init__(self, settings: dict, db: cross.DataBaseWorker) -> None:
+
self.options = MisskeyInputOptions(settings.get("options", {}))
+
self.token = as_envvar(settings.get("token")) or (_ for _ in ()).throw(
+
ValueError("'token' is required")
+
)
+
instance: str = as_envvar(settings.get("instance")) or (_ for _ in ()).throw(
+
ValueError("'instance' is required")
+
)
+
+
service = instance[:-1] if instance.endswith("/") else instance
+
+
LOGGER.info("Verifying %s credentials...", service)
+
response = requests.post(
+
f"{instance}/api/i",
+
json={"i": self.token},
+
headers={"Content-Type": "application/json"},
+
)
+
if response.status_code != 200:
+
LOGGER.error("Failed to validate user credentials!")
+
response.raise_for_status()
+
return
+
+
super().__init__(service, response.json()["id"], settings, db)
+
+
def _on_note(self, outputs: list[cross.Output], note: dict):
+
if note["userId"] != self.user_id:
+
return
+
+
if note.get("visibility") not in self.options.allowed_visibility:
+
LOGGER.info(
+
"Skipping '%s'! '%s' visibility..", note["id"], note.get("visibility")
+
)
+
return
+
+
# TODO polls not supported on bsky. maybe 3rd party? skip for now
+
# we don't handle reblogs. possible with bridgy(?) and self
+
if note.get("poll"):
+
LOGGER.info("Skipping '%s'! Contains a poll..", note["id"])
+
return
+
+
renote: dict | None = note.get("renote")
+
if renote:
+
if note.get("text") is not None:
+
LOGGER.info("Skipping '%s'! Quote..", note["id"])
+
return
+
+
if renote.get("userId") != self.user_id:
+
LOGGER.info("Skipping '%s'! Reblog of other user..", note["id"])
+
return
+
+
success = database.try_insert_repost(
+
self.db, note["id"], renote["id"], self.user_id, self.service
+
)
+
if not success:
+
LOGGER.info(
+
"Skipping '%s' as renoted note was not found in db!", note["id"]
+
)
+
return
+
+
for output in outputs:
+
output.accept_repost(note["id"], renote["id"])
+
return
+
+
reply_id: str | None = note.get("replyId")
+
if reply_id:
+
if (note.get("reply") or {}).get("userId") != self.user_id:
+
LOGGER.info("Skipping '%s'! Reply to other user..", note["id"])
+
return
+
+
success = database.try_insert_post(
+
self.db, note["id"], reply_id, self.user_id, self.service
+
)
+
if not success:
+
LOGGER.info("Skipping '%s' as parent note was not found in db!", note["id"])
+
return
+
+
mention_handles: dict = note.get("mentionHandles") or {}
+
tags: list[str] = note.get("tags") or []
+
+
handles: list[tuple[str, str]] = []
+
for handle in mention_handles.values():
+
handles.append((handle, handle))
+
+
tokens = md_util.tokenize_markdown(note.get("text", ""), tags, handles)
+
if not cross.test_filters(tokens, self.options.filters):
+
LOGGER.info("Skipping '%s'. Matched a filter!", note["id"])
+
return
+
+
LOGGER.info("Crossposting '%s'...", note["id"])
+
+
media_attachments: list[MediaInfo] = []
+
for attachment in note.get("files", []):
+
LOGGER.info("Downloading %s...", attachment["url"])
+
info = download_media(attachment["url"], attachment.get("comment") or "")
+
if not info:
+
LOGGER.error("Skipping '%s'. Failed to download media!", note["id"])
+
return
+
media_attachments.append(info)
+
+
cross_post = MisskeyPost(self.service, note, tokens, media_attachments)
+
for output in outputs:
+
output.accept_post(cross_post)
+
+
def _on_delete(self, outputs: list[cross.Output], note: dict):
+
# TODO handle deletes
+
pass
+
+
def _on_message(self, outputs: list[cross.Output], data: dict):
+
if data["type"] == "channel":
+
msg_type: str = data["body"]["type"]
+
if msg_type in ("note", "reply"):
+
note_body = data["body"]["body"]
+
self._on_note(outputs, note_body)
+
return
+
+
pass
+
+
async def _send_keepalive(self, ws: websockets.WebSocketClientProtocol):
+
while ws.open:
+
try:
+
await asyncio.sleep(120)
+
if ws.open:
+
await ws.send("h")
+
LOGGER.debug("Sent keepalive h..")
+
else:
+
LOGGER.info("WebSocket is closed, stopping keepalive task.")
+
break
+
except Exception as e:
+
LOGGER.error(f"Error sending keepalive: {e}")
+
break
+
+
async def _subscribe_to_home(self, ws: websockets.WebSocketClientProtocol):
+
await ws.send(
+
json.dumps(
+
{
+
"type": "connect",
+
"body": {"channel": "homeTimeline", "id": str(uuid.uuid4())},
+
}
+
)
+
)
+
LOGGER.info("Subscribed to 'homeTimeline' channel...")
+
+
async def listen(
+
self, outputs: list[cross.Output], submit: Callable[[Callable[[], Any]], Any]
+
):
+
streaming: str = f"wss://{self.service.split('://', 1)[1]}"
+
url: str = f"{streaming}/streaming?i={self.token}"
+
+
async for ws in websockets.connect(
+
url, extra_headers={"User-Agent": "XPost/0.0.3"}
+
):
+
try:
+
LOGGER.info("Listening to %s...", streaming)
+
await self._subscribe_to_home(ws)
+
+
async def listen_for_messages():
+
async for msg in ws:
+
# TODO listen to deletes somehow
+
submit(lambda msg=msg: self._on_message(outputs, json.loads(msg)))
+
+
keepalive = asyncio.create_task(self._send_keepalive(ws))
+
listen = asyncio.create_task(listen_for_messages())
+
+
await asyncio.gather(keepalive, listen)
+
except websockets.ConnectionClosedError as e:
+
LOGGER.error(e, stack_info=True, exc_info=True)
+
LOGGER.info("Reconnecting to %s...", streaming)
+
continue
+38
misskey/mfm_util.py
···
···
+
import re
+
+
import cross
+
+
MFM_PATTERN = re.compile(r"\$\[([^\[\]]+)\]")
+
+
+
def strip_mfm(tokens: list[cross.Token]) -> tuple[list[cross.Token], bool]:
+
modified = False
+
+
for tk in tokens:
+
if isinstance(tk, cross.TextToken):
+
original = tk.text
+
cleaned = __strip_mfm(original)
+
if cleaned != original:
+
modified = True
+
tk.text = cleaned
+
+
elif isinstance(tk, cross.LinkToken):
+
original = tk.label
+
cleaned = __strip_mfm(original)
+
if cleaned != original:
+
modified = True
+
tk.label = cleaned
+
+
return tokens, modified
+
+
+
def __strip_mfm(text: str) -> str:
+
def match_contents(match: re.Match[str]):
+
content = match.group(1).strip()
+
parts = content.split(" ", 1)
+
return parts[1] if len(parts) > 1 else ""
+
+
while MFM_PATTERN.search(text):
+
text = MFM_PATTERN.sub(match_contents, text)
+
+
return text
+6 -4
pyproject.toml
···
[project]
name = "xpost"
-
version = "0.1.0"
-
description = "social media crossposting tool"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"python-magic>=0.4.27",
-
"requests>=2.32.5",
-
"websockets>=15.0.1",
]
···
[project]
name = "xpost"
+
version = "0.0.3"
+
description = "mastodon -> bluesky crossposting tool"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
+
"atproto>=0.0.61",
+
"click>=8.2.1",
"python-magic>=0.4.27",
+
"requests>=2.32.3",
+
"websockets>=13.1",
]
+290
util/database.py
···
···
+
import json
+
import queue
+
import sqlite3
+
import threading
+
from concurrent.futures import Future
+
+
+
class DataBaseWorker:
+
def __init__(self, database: str) -> None:
+
super().__init__()
+
self.database = database
+
self.queue = queue.Queue()
+
self.thread = threading.Thread(target=self._run, daemon=True)
+
self.shutdown_event = threading.Event()
+
self.conn = sqlite3.connect(self.database, check_same_thread=False)
+
self.lock = threading.Lock()
+
self.thread.start()
+
+
def _run(self):
+
while not self.shutdown_event.is_set():
+
try:
+
task, future = self.queue.get(timeout=1)
+
try:
+
with self.lock:
+
result = task(self.conn)
+
future.set_result(result)
+
except Exception as e:
+
future.set_exception(e)
+
finally:
+
self.queue.task_done()
+
except queue.Empty:
+
continue
+
+
def execute(self, sql: str, params=()):
+
def task(conn: sqlite3.Connection):
+
cursor = conn.execute(sql, params)
+
conn.commit()
+
return cursor.fetchall()
+
+
future = Future()
+
self.queue.put((task, future))
+
return future.result()
+
+
def close(self):
+
self.shutdown_event.set()
+
self.thread.join()
+
with self.lock:
+
self.conn.close()
+
+
+
def try_insert_repost(
+
db: DataBaseWorker,
+
post_id: str,
+
reposted_id: str,
+
input_user: str,
+
input_service: str,
+
) -> bool:
+
reposted = find_post(db, reposted_id, input_user, input_service)
+
if not reposted:
+
return False
+
+
insert_repost(db, post_id, reposted["id"], input_user, input_service)
+
return True
+
+
+
def try_insert_post(
+
db: DataBaseWorker,
+
post_id: str,
+
in_reply: str | None,
+
input_user: str,
+
input_service: str,
+
) -> bool:
+
root_id = None
+
parent_id = None
+
+
if in_reply:
+
parent_post = find_post(db, in_reply, input_user, input_service)
+
if not parent_post:
+
return False
+
+
root_id = parent_post["id"]
+
parent_id = root_id
+
if parent_post["root_id"]:
+
root_id = parent_post["root_id"]
+
+
if root_id and parent_id:
+
insert_reply(db, post_id, input_user, input_service, parent_id, root_id)
+
else:
+
insert_post(db, post_id, input_user, input_service)
+
+
return True
+
+
+
def insert_repost(
+
db: DataBaseWorker, identifier: str, reposted_id: int, user_id: str, service: str
+
) -> int:
+
db.execute(
+
"""
+
INSERT INTO posts (user_id, service, identifier, reposted_id)
+
VALUES (?, ?, ?, ?);
+
""",
+
(user_id, service, identifier, reposted_id),
+
)
+
return db.execute("SELECT last_insert_rowid();", ())[0][0]
+
+
+
def insert_post(db: DataBaseWorker, identifier: str, user_id: str, service: str) -> int:
+
db.execute(
+
"""
+
INSERT INTO posts (user_id, service, identifier)
+
VALUES (?, ?, ?);
+
""",
+
(user_id, service, identifier),
+
)
+
return db.execute("SELECT last_insert_rowid();", ())[0][0]
+
+
+
def insert_reply(
+
db: DataBaseWorker,
+
identifier: str,
+
user_id: str,
+
service: str,
+
parent: int,
+
root: int,
+
) -> int:
+
db.execute(
+
"""
+
INSERT INTO posts (user_id, service, identifier, parent_id, root_id)
+
VALUES (?, ?, ?, ?, ?);
+
""",
+
(user_id, service, identifier, parent, root),
+
)
+
return db.execute("SELECT last_insert_rowid();", ())[0][0]
+
+
+
def insert_mapping(db: DataBaseWorker, original: int, mapped: int):
+
db.execute(
+
"""
+
INSERT INTO mappings (original_post_id, mapped_post_id)
+
VALUES (?, ?);
+
""",
+
(original, mapped),
+
)
+
+
+
def delete_post(db: DataBaseWorker, identifier: str, user_id: str, service: str):
+
db.execute(
+
"""
+
DELETE FROM posts
+
WHERE identifier = ?
+
AND service = ?
+
AND user_id = ?
+
""",
+
(identifier, service, user_id),
+
)
+
+
+
def fetch_data(db: DataBaseWorker, identifier: str, user_id: str, service: str) -> dict:
+
result = db.execute(
+
"""
+
SELECT extra_data
+
FROM posts
+
WHERE identifier = ?
+
AND user_id = ?
+
AND service = ?
+
""",
+
(identifier, user_id, service),
+
)
+
if not result or not result[0][0]:
+
return {}
+
return json.loads(result[0][0])
+
+
+
def store_data(
+
db: DataBaseWorker, identifier: str, user_id: str, service: str, extra_data: dict
+
) -> None:
+
db.execute(
+
"""
+
UPDATE posts
+
SET extra_data = ?
+
WHERE identifier = ?
+
AND user_id = ?
+
AND service = ?
+
""",
+
(json.dumps(extra_data), identifier, user_id, service),
+
)
+
+
+
def find_mappings(
+
db: DataBaseWorker, original_post: int, service: str, user_id: str
+
) -> list[tuple[str]]:
+
return db.execute(
+
"""
+
SELECT p.identifier
+
FROM posts AS p
+
JOIN mappings AS m
+
ON p.id = m.mapped_post_id
+
WHERE m.original_post_id = ?
+
AND p.service = ?
+
AND p.user_id = ?
+
ORDER BY p.id;
+
""",
+
(original_post, service, user_id),
+
)
+
+
+
def find_post_by_id(db: DataBaseWorker, id: int) -> dict | None:
+
result = db.execute(
+
"""
+
SELECT user_id, service, identifier, parent_id, root_id, reposted_id
+
FROM posts
+
WHERE id = ?
+
""",
+
(id,),
+
)
+
if not result:
+
return None
+
user_id, service, identifier, parent_id, root_id, reposted_id = result[0]
+
return {
+
"user_id": user_id,
+
"service": service,
+
"identifier": identifier,
+
"parent_id": parent_id,
+
"root_id": root_id,
+
"reposted_id": reposted_id,
+
}
+
+
+
def find_post(
+
db: DataBaseWorker, identifier: str, user_id: str, service: str
+
) -> dict | None:
+
result = db.execute(
+
"""
+
SELECT id, parent_id, root_id, reposted_id
+
FROM posts
+
WHERE identifier = ?
+
AND user_id = ?
+
AND service = ?
+
""",
+
(identifier, user_id, service),
+
)
+
if not result:
+
return None
+
id, parent_id, root_id, reposted_id = result[0]
+
return {
+
"id": id,
+
"parent_id": parent_id,
+
"root_id": root_id,
+
"reposted_id": reposted_id,
+
}
+
+
+
def find_mapped_thread(
+
db: DataBaseWorker,
+
parent_id: str,
+
input_user: str,
+
input_service: str,
+
output_user: str,
+
output_service: str,
+
):
+
reply_data: dict | None = find_post(db, parent_id, input_user, input_service)
+
if not reply_data:
+
return None
+
+
reply_mappings = find_mappings(
+
db, reply_data["id"], output_service, output_user
+
)
+
if not reply_mappings:
+
return None
+
+
reply_identifier = reply_mappings[-1]
+
root_identifier = reply_mappings[0]
+
if reply_data["root_id"]:
+
root_data = find_post_by_id(db, reply_data["root_id"])
+
if not root_data:
+
return None
+
+
root_mappings = find_mappings(
+
db, reply_data["root_id"], output_service, output_user
+
)
+
if not root_mappings:
+
return None
+
root_identifier = root_mappings[0]
+
+
return (
+
root_identifier[0], # real ids
+
reply_identifier[0],
+
reply_data["root_id"], # db ids
+
reply_data["id"],
+
)
+172
util/html_util.py
···
···
+
from html.parser import HTMLParser
+
+
import cross
+
+
+
class HTMLPostTokenizer(HTMLParser):
+
def __init__(self) -> None:
+
super().__init__()
+
self.tokens: list[cross.Token] = []
+
+
self.mentions: list[tuple[str, str]] = []
+
self.tags: list[str] = []
+
+
self.in_pre = False
+
self.in_code = False
+
+
self.current_tag_stack = []
+
self.list_stack = []
+
+
self.anchor_stack = []
+
self.anchor_data = []
+
+
def handle_starttag(self, tag: str, attrs: list[tuple[str, str | None]]) -> None:
+
attrs_dict = dict(attrs)
+
+
def append_newline():
+
if self.tokens:
+
last_token = self.tokens[-1]
+
if isinstance(
+
last_token, cross.TextToken
+
) and not last_token.text.endswith("\n"):
+
self.tokens.append(cross.TextToken("\n"))
+
+
match tag:
+
case "br":
+
self.tokens.append(cross.TextToken(" \n"))
+
case "a":
+
href = attrs_dict.get("href", "")
+
self.anchor_stack.append(href)
+
case "strong" | "b":
+
self.tokens.append(cross.TextToken("**"))
+
case "em" | "i":
+
self.tokens.append(cross.TextToken("*"))
+
case "del" | "s":
+
self.tokens.append(cross.TextToken("~~"))
+
case "code":
+
if not self.in_pre:
+
self.tokens.append(cross.TextToken("`"))
+
self.in_code = True
+
case "pre":
+
append_newline()
+
self.tokens.append(cross.TextToken("```\n"))
+
self.in_pre = True
+
case "blockquote":
+
append_newline()
+
self.tokens.append(cross.TextToken("> "))
+
case "ul" | "ol":
+
self.list_stack.append(tag)
+
append_newline()
+
case "li":
+
indent = " " * (len(self.list_stack) - 1)
+
if self.list_stack and self.list_stack[-1] == "ul":
+
self.tokens.append(cross.TextToken(f"{indent}- "))
+
elif self.list_stack and self.list_stack[-1] == "ol":
+
self.tokens.append(cross.TextToken(f"{indent}1. "))
+
case _:
+
if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
+
level = int(tag[1])
+
self.tokens.append(cross.TextToken("\n" + "#" * level + " "))
+
+
self.current_tag_stack.append(tag)
+
+
def handle_data(self, data: str) -> None:
+
if self.anchor_stack:
+
self.anchor_data.append(data)
+
else:
+
self.tokens.append(cross.TextToken(data))
+
+
def handle_endtag(self, tag: str) -> None:
+
if not self.current_tag_stack:
+
return
+
+
if tag in self.current_tag_stack:
+
self.current_tag_stack.remove(tag)
+
+
match tag:
+
case "p":
+
self.tokens.append(cross.TextToken("\n\n"))
+
case "a":
+
href = self.anchor_stack.pop()
+
anchor_data = "".join(self.anchor_data)
+
self.anchor_data = []
+
+
if anchor_data.startswith("#"):
+
as_tag = anchor_data[1:].lower()
+
if as_tag in self.tags:
+
self.tokens.append(cross.TagToken(anchor_data[1:]))
+
elif anchor_data.startswith("@"):
+
match = next(
+
(pair for pair in self.mentions if anchor_data in pair), None
+
)
+
+
if match:
+
self.tokens.append(cross.MentionToken(match[1], ""))
+
else:
+
self.tokens.append(cross.LinkToken(href, anchor_data))
+
case "strong" | "b":
+
self.tokens.append(cross.TextToken("**"))
+
case "em" | "i":
+
self.tokens.append(cross.TextToken("*"))
+
case "del" | "s":
+
self.tokens.append(cross.TextToken("~~"))
+
case "code":
+
if not self.in_pre and self.in_code:
+
self.tokens.append(cross.TextToken("`"))
+
self.in_code = False
+
case "pre":
+
self.tokens.append(cross.TextToken("\n```\n"))
+
self.in_pre = False
+
case "blockquote":
+
self.tokens.append(cross.TextToken("\n"))
+
case "ul" | "ol":
+
if self.list_stack:
+
self.list_stack.pop()
+
self.tokens.append(cross.TextToken("\n"))
+
case "li":
+
self.tokens.append(cross.TextToken("\n"))
+
case _:
+
if tag in ["h1", "h2", "h3", "h4", "h5", "h6"]:
+
self.tokens.append(cross.TextToken("\n"))
+
+
def get_tokens(self) -> list[cross.Token]:
+
if not self.tokens:
+
return []
+
+
combined: list[cross.Token] = []
+
buffer: list[str] = []
+
+
def flush_buffer():
+
if buffer:
+
merged = "".join(buffer)
+
combined.append(cross.TextToken(text=merged))
+
buffer.clear()
+
+
for token in self.tokens:
+
if isinstance(token, cross.TextToken):
+
buffer.append(token.text)
+
else:
+
flush_buffer()
+
combined.append(token)
+
+
flush_buffer()
+
+
if combined and isinstance(combined[-1], cross.TextToken):
+
if combined[-1].text.endswith("\n\n"):
+
combined[-1] = cross.TextToken(combined[-1].text[:-2])
+
return combined
+
+
def reset(self):
+
"""Reset the parser state for reuse."""
+
super().reset()
+
self.tokens = []
+
+
self.mentions = []
+
self.tags = []
+
+
self.in_pre = False
+
self.in_code = False
+
+
self.current_tag_stack = []
+
self.anchor_stack = []
+
self.list_stack = []
+123
util/md_util.py
···
···
+
import re
+
+
import cross
+
import util.html_util as html_util
+
import util.util as util
+
+
URL = re.compile(r"(?:(?:[A-Za-z][A-Za-z0-9+.-]*://)|mailto:)[^\s]+", re.IGNORECASE)
+
MD_INLINE_LINK = re.compile(
+
r"\[([^\]]+)\]\(\s*((?:(?:[A-Za-z][A-Za-z0-9+.\-]*://)|mailto:)[^\s\)]+)\s*\)",
+
re.IGNORECASE,
+
)
+
MD_AUTOLINK = re.compile(
+
r"<((?:(?:[A-Za-z][A-Za-z0-9+.\-]*://)|mailto:)[^\s>]+)>", re.IGNORECASE
+
)
+
HASHTAG = re.compile(r"(?<!\w)\#([\w]+)")
+
FEDIVERSE_HANDLE = re.compile(r"(?<![\w@])@([\w\.-]+)(?:@([\w\.-]+\.[\w\.-]+))?")
+
+
+
def tokenize_markdown(
+
text: str, tags: list[str], handles: list[tuple[str, str]]
+
) -> list[cross.Token]:
+
if not text:
+
return []
+
+
tokenizer = html_util.HTMLPostTokenizer()
+
tokenizer.mentions = handles
+
tokenizer.tags = tags
+
tokenizer.feed(text)
+
html_tokens = tokenizer.get_tokens()
+
+
tokens: list[cross.Token] = []
+
+
for tk in html_tokens:
+
if isinstance(tk, cross.TextToken):
+
tokens.extend(__tokenize_md(tk.text, tags, handles))
+
elif isinstance(tk, cross.LinkToken):
+
if not tk.label or util.canonical_label(tk.label, tk.href):
+
tokens.append(tk)
+
continue
+
+
tokens.extend(__tokenize_md(f"[{tk.label}]({tk.href})", tags, handles))
+
else:
+
tokens.append(tk)
+
+
return tokens
+
+
+
def __tokenize_md(
+
text: str, tags: list[str], handles: list[tuple[str, str]]
+
) -> list[cross.Token]:
+
index: int = 0
+
total: int = len(text)
+
buffer: list[str] = []
+
+
tokens: list[cross.Token] = []
+
+
def flush():
+
nonlocal buffer
+
if buffer:
+
tokens.append(cross.TextToken("".join(buffer)))
+
buffer = []
+
+
while index < total:
+
if text[index] == "[":
+
md_inline = MD_INLINE_LINK.match(text, index)
+
if md_inline:
+
flush()
+
label = md_inline.group(1)
+
href = md_inline.group(2)
+
tokens.append(cross.LinkToken(href, label))
+
index = md_inline.end()
+
continue
+
+
if text[index] == "<":
+
md_auto = MD_AUTOLINK.match(text, index)
+
if md_auto:
+
flush()
+
href = md_auto.group(1)
+
tokens.append(cross.LinkToken(href, href))
+
index = md_auto.end()
+
continue
+
+
if text[index] == "#":
+
tag = HASHTAG.match(text, index)
+
if tag:
+
tag_text = tag.group(1)
+
if tag_text.lower() in tags:
+
flush()
+
tokens.append(cross.TagToken(tag_text))
+
index = tag.end()
+
continue
+
+
if text[index] == "@":
+
handle = FEDIVERSE_HANDLE.match(text, index)
+
if handle:
+
handle_text = handle.group(0)
+
stripped_handle = handle_text.strip()
+
+
match = next(
+
(pair for pair in handles if stripped_handle in pair), None
+
)
+
+
if match:
+
flush()
+
tokens.append(
+
cross.MentionToken(match[1], "")
+
)  # TODO: misskey doesn't provide a uri
+
index = handle.end()
+
continue
+
+
url = URL.match(text, index)
+
if url:
+
flush()
+
href = url.group(0)
+
tokens.append(cross.LinkToken(href, href))
+
index = url.end()
+
continue
+
+
buffer.append(text[index])
+
index += 1
+
+
flush()
+
return tokens
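The scan-and-flush loop in `__tokenize_md` above can be reduced to a minimal sketch: plain text is buffered character by character, and whenever a pattern matches at the cursor the buffer is flushed as one text token before the matched token is emitted. `TextToken` and `LinkToken` here are hypothetical stand-ins for the `cross` token classes, not the real ones:

```python
import re
from dataclasses import dataclass


# Hypothetical stand-ins for the cross.Token classes used above.
@dataclass
class TextToken:
    text: str


@dataclass
class LinkToken:
    href: str
    label: str


MD_INLINE_LINK = re.compile(r"\[([^\]]+)\]\(([^)\s]+)\)")


def tokenize(text: str) -> list:
    """Scan left to right, flushing buffered plain text before each match."""
    tokens, buffer, index = [], [], 0
    while index < len(text):
        match = MD_INLINE_LINK.match(text, index)
        if match:
            if buffer:
                tokens.append(TextToken("".join(buffer)))
                buffer = []
            tokens.append(LinkToken(match.group(2), match.group(1)))
            index = match.end()
            continue
        buffer.append(text[index])
        index += 1
    if buffer:
        tokens.append(TextToken("".join(buffer)))
    return tokens
```

The trailing flush after the loop matters: without it, text after the last match is silently dropped.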
+160
util/media.py
···
···
+
import json
+
import os
+
import re
+
import subprocess
+
import urllib.parse
+
+
import magic
+
import requests
+
+
from util.util import LOGGER
+
+
FILENAME = re.compile(r'filename="?([^\";]*)"?')
+
MAGIC = magic.Magic(mime=True)
+
+
+
class MediaInfo:
+
def __init__(self, url: str, name: str, mime: str, alt: str, io: bytes) -> None:
+
self.url = url
+
self.name = name
+
self.mime = mime
+
self.alt = alt
+
self.io = io
+
+
+
def download_media(url: str, alt: str) -> MediaInfo | None:
+
name = get_filename_from_url(url)
+
io = download_blob(url, max_bytes=100_000_000)
+
if not io:
+
LOGGER.error("Failed to download media attachment! %s", url)
+
return None
+
mime = MAGIC.from_buffer(io)
+
if not mime:
+
mime = "application/octet-stream"
+
return MediaInfo(url, name, mime, alt, io)
+
+
+
def get_filename_from_url(url: str) -> str:
+
try:
+
response = requests.head(url, allow_redirects=True, timeout=20)
+
disposition = response.headers.get("Content-Disposition")
+
if disposition:
+
filename = FILENAME.findall(disposition)
+
if filename:
+
return filename[0]
+
except requests.RequestException:
+
pass
+
+
parsed_url = urllib.parse.urlparse(url)
+
base_name = os.path.basename(parsed_url.path)
+
+
# hardcoded fix to return the cid for pds
+
if base_name == "com.atproto.sync.getBlob":
+
qs = urllib.parse.parse_qs(parsed_url.query)
+
if qs and qs.get("cid"):
+
return qs["cid"][0]
+
+
return base_name
+
+
+
def probe_bytes(data: bytes) -> dict:
+
cmd = [
+
"ffprobe",
+
"-v", "error",
+
"-show_format",
+
"-show_streams",
+
"-print_format", "json",
+
"pipe:0",
+
]
+
proc = subprocess.run(
+
cmd, input=data, stdout=subprocess.PIPE, stderr=subprocess.PIPE
+
)
+
+
if proc.returncode != 0:
+
raise RuntimeError(f"ffprobe failed: {proc.stderr.decode()}")
+
+
return json.loads(proc.stdout)
+
+
+
def convert_to_mp4(video_bytes: bytes) -> bytes:
+
cmd = [
+
"ffmpeg",
+
"-i", "pipe:0",
+
"-c:v", "libx264",
+
"-crf", "30",
+
"-preset", "slow",
+
"-c:a", "aac",
+
"-b:a", "128k",
+
"-movflags", "frag_keyframe+empty_moov+default_base_moof",
+
"-f", "mp4",
+
"pipe:1",
+
]
+
+
proc = subprocess.Popen(
+
cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE
+
)
+
out_bytes, err = proc.communicate(input=video_bytes)
+
+
if proc.returncode != 0:
+
raise RuntimeError(f"ffmpeg compress failed: {err.decode()}")
+
+
return out_bytes
+
+
+
def compress_image(image_bytes: bytes, quality: int = 90):
+
cmd = [
+
"ffmpeg",
+
"-f", "image2pipe",
+
"-i", "pipe:0",
+
"-c:v", "webp",
+
"-q:v", str(quality),
+
"-f", "image2pipe",
+
"pipe:1",
+
]
+
+
proc = subprocess.Popen(
+
cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE
+
)
+
out_bytes, err = proc.communicate(input=image_bytes)
+
+
if proc.returncode != 0:
+
raise RuntimeError(f"ffmpeg compress failed: {err.decode()}")
+
+
return out_bytes
+
+
+
def download_blob(url: str, max_bytes: int = 5_000_000) -> bytes | None:
+
response = requests.get(url, stream=True, timeout=20)
+
if response.status_code != 200:
+
LOGGER.info("Failed to download %s! %s", url, response.text)
+
return None
+
+
downloaded_bytes = b""
+
current_size = 0
+
+
for chunk in response.iter_content(chunk_size=8192):
+
if not chunk:
+
continue
+
+
current_size += len(chunk)
+
if current_size > max_bytes:
+
response.close()
+
return None
+
+
downloaded_bytes += chunk
+
+
return downloaded_bytes
+
+
+
def get_media_meta(data: bytes):
+
probe = probe_bytes(data)
+
streams = [s for s in probe["streams"] if s["codec_type"] == "video"]
+
if not streams:
+
raise ValueError("No video stream found")
+
+
media = streams[0]
+
return {
+
"width": int(media["width"]),
+
"height": int(media["height"]),
+
"duration": float(media.get("duration", probe["format"].get("duration", -1))),
+
}
+39 -1
util/util.py
···
import logging
import sys
-
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
LOGGER = logging.getLogger("XPost")
···
+
import json
import logging
+
import os
import sys
+
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
LOGGER = logging.getLogger("XPost")
+
+
+
def as_json(obj, indent=None, sort_keys=False) -> str:
+
return json.dumps(
+
obj.__dict__ if not isinstance(obj, dict) else obj,
+
default=lambda o: o.__json__() if hasattr(o, "__json__") else o.__dict__,
+
indent=indent,
+
sort_keys=sort_keys,
+
)
+
+
+
def canonical_label(label: str | None, href: str) -> bool:
+
if not label or label == href:
+
return True
+
+
split = href.split("://", 1)
+
if len(split) > 1:
+
if split[1] == label:
+
return True
+
+
return False
+
+
+
def safe_get(obj: dict, key: str, default):
+
val = obj.get(key, default)
+
return val if val else default
+
+
+
def as_envvar(text: str | None) -> str | None:
+
if not text:
+
return None
+
+
if text.startswith("env:"):
+
return os.environ.get(text[4:], "")
+
+
return text
+381 -92
uv.lock
···
version = 1
-
revision = 3
requires-python = ">=3.12"
[[package]]
name = "certifi"
-
version = "2025.10.5"
source = { registry = "https://pypi.org/simple" }
-
sdist = { url = "https://files.pythonhosted.org/packages/4c/5b/b6ce21586237c77ce67d01dc5507039d444b630dd76611bbca2d8e5dcd91/certifi-2025.10.5.tar.gz", hash = "sha256:47c09d31ccf2acf0be3f701ea53595ee7e0b8fa08801c6624be771df09ae7b43", size = 164519, upload-time = "2025-10-05T04:12:15.808Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/e4/37/af0d2ef3967ac0d6113837b44a4f0bfe1328c2b9763bd5b1744520e5cfed/certifi-2025.10.5-py3-none-any.whl", hash = "sha256:0f212c2744a9bb6de0c56639a6f68afe01ecd92d91f14ae897c4fe7bbeeef0de", size = 163286, upload-time = "2025-10-05T04:12:14.03Z" },
]
[[package]]
name = "charset-normalizer"
-
version = "3.4.4"
source = { registry = "https://pypi.org/simple" }
-
sdist = { url = "https://files.pythonhosted.org/packages/13/69/33ddede1939fdd074bce5434295f38fae7136463422fe4fd3e0e89b98062/charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a", size = 129418, upload-time = "2025-10-14T04:42:32.879Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/f3/85/1637cd4af66fa687396e757dec650f28025f2a2f5a5531a3208dc0ec43f2/charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394", size = 208425, upload-time = "2025-10-14T04:40:53.353Z" },
-
{ url = "https://files.pythonhosted.org/packages/9d/6a/04130023fef2a0d9c62d0bae2649b69f7b7d8d24ea5536feef50551029df/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25", size = 148162, upload-time = "2025-10-14T04:40:54.558Z" },
-
{ url = "https://files.pythonhosted.org/packages/78/29/62328d79aa60da22c9e0b9a66539feae06ca0f5a4171ac4f7dc285b83688/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef", size = 144558, upload-time = "2025-10-14T04:40:55.677Z" },
-
{ url = "https://files.pythonhosted.org/packages/86/bb/b32194a4bf15b88403537c2e120b817c61cd4ecffa9b6876e941c3ee38fe/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d", size = 161497, upload-time = "2025-10-14T04:40:57.217Z" },
-
{ url = "https://files.pythonhosted.org/packages/19/89/a54c82b253d5b9b111dc74aca196ba5ccfcca8242d0fb64146d4d3183ff1/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8", size = 159240, upload-time = "2025-10-14T04:40:58.358Z" },
-
{ url = "https://files.pythonhosted.org/packages/c0/10/d20b513afe03acc89ec33948320a5544d31f21b05368436d580dec4e234d/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86", size = 153471, upload-time = "2025-10-14T04:40:59.468Z" },
-
{ url = "https://files.pythonhosted.org/packages/61/fa/fbf177b55bdd727010f9c0a3c49eefa1d10f960e5f09d1d887bf93c2e698/charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a", size = 150864, upload-time = "2025-10-14T04:41:00.623Z" },
-
{ url = "https://files.pythonhosted.org/packages/05/12/9fbc6a4d39c0198adeebbde20b619790e9236557ca59fc40e0e3cebe6f40/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f", size = 150647, upload-time = "2025-10-14T04:41:01.754Z" },
-
{ url = "https://files.pythonhosted.org/packages/ad/1f/6a9a593d52e3e8c5d2b167daf8c6b968808efb57ef4c210acb907c365bc4/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc", size = 145110, upload-time = "2025-10-14T04:41:03.231Z" },
-
{ url = "https://files.pythonhosted.org/packages/30/42/9a52c609e72471b0fc54386dc63c3781a387bb4fe61c20231a4ebcd58bdd/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf", size = 162839, upload-time = "2025-10-14T04:41:04.715Z" },
-
{ url = "https://files.pythonhosted.org/packages/c4/5b/c0682bbf9f11597073052628ddd38344a3d673fda35a36773f7d19344b23/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15", size = 150667, upload-time = "2025-10-14T04:41:05.827Z" },
-
{ url = "https://files.pythonhosted.org/packages/e4/24/a41afeab6f990cf2daf6cb8c67419b63b48cf518e4f56022230840c9bfb2/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9", size = 160535, upload-time = "2025-10-14T04:41:06.938Z" },
-
{ url = "https://files.pythonhosted.org/packages/2a/e5/6a4ce77ed243c4a50a1fecca6aaaab419628c818a49434be428fe24c9957/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0", size = 154816, upload-time = "2025-10-14T04:41:08.101Z" },
-
{ url = "https://files.pythonhosted.org/packages/a8/ef/89297262b8092b312d29cdb2517cb1237e51db8ecef2e9af5edbe7b683b1/charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26", size = 99694, upload-time = "2025-10-14T04:41:09.23Z" },
-
{ url = "https://files.pythonhosted.org/packages/3d/2d/1e5ed9dd3b3803994c155cd9aacb60c82c331bad84daf75bcb9c91b3295e/charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525", size = 107131, upload-time = "2025-10-14T04:41:10.467Z" },
-
{ url = "https://files.pythonhosted.org/packages/d0/d9/0ed4c7098a861482a7b6a95603edce4c0d9db2311af23da1fb2b75ec26fc/charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3", size = 100390, upload-time = "2025-10-14T04:41:11.915Z" },
-
{ url = "https://files.pythonhosted.org/packages/97/45/4b3a1239bbacd321068ea6e7ac28875b03ab8bc0aa0966452db17cd36714/charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794", size = 208091, upload-time = "2025-10-14T04:41:13.346Z" },
-
{ url = "https://files.pythonhosted.org/packages/7d/62/73a6d7450829655a35bb88a88fca7d736f9882a27eacdca2c6d505b57e2e/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed", size = 147936, upload-time = "2025-10-14T04:41:14.461Z" },
-
{ url = "https://files.pythonhosted.org/packages/89/c5/adb8c8b3d6625bef6d88b251bbb0d95f8205831b987631ab0c8bb5d937c2/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72", size = 144180, upload-time = "2025-10-14T04:41:15.588Z" },
-
{ url = "https://files.pythonhosted.org/packages/91/ed/9706e4070682d1cc219050b6048bfd293ccf67b3d4f5a4f39207453d4b99/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328", size = 161346, upload-time = "2025-10-14T04:41:16.738Z" },
-
{ url = "https://files.pythonhosted.org/packages/d5/0d/031f0d95e4972901a2f6f09ef055751805ff541511dc1252ba3ca1f80cf5/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede", size = 158874, upload-time = "2025-10-14T04:41:17.923Z" },
-
{ url = "https://files.pythonhosted.org/packages/f5/83/6ab5883f57c9c801ce5e5677242328aa45592be8a00644310a008d04f922/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894", size = 153076, upload-time = "2025-10-14T04:41:19.106Z" },
-
{ url = "https://files.pythonhosted.org/packages/75/1e/5ff781ddf5260e387d6419959ee89ef13878229732732ee73cdae01800f2/charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1", size = 150601, upload-time = "2025-10-14T04:41:20.245Z" },
-
{ url = "https://files.pythonhosted.org/packages/d7/57/71be810965493d3510a6ca79b90c19e48696fb1ff964da319334b12677f0/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490", size = 150376, upload-time = "2025-10-14T04:41:21.398Z" },
-
{ url = "https://files.pythonhosted.org/packages/e5/d5/c3d057a78c181d007014feb7e9f2e65905a6c4ef182c0ddf0de2924edd65/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44", size = 144825, upload-time = "2025-10-14T04:41:22.583Z" },
-
{ url = "https://files.pythonhosted.org/packages/e6/8c/d0406294828d4976f275ffbe66f00266c4b3136b7506941d87c00cab5272/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133", size = 162583, upload-time = "2025-10-14T04:41:23.754Z" },
-
{ url = "https://files.pythonhosted.org/packages/d7/24/e2aa1f18c8f15c4c0e932d9287b8609dd30ad56dbe41d926bd846e22fb8d/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3", size = 150366, upload-time = "2025-10-14T04:41:25.27Z" },
-
{ url = "https://files.pythonhosted.org/packages/e4/5b/1e6160c7739aad1e2df054300cc618b06bf784a7a164b0f238360721ab86/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e", size = 160300, upload-time = "2025-10-14T04:41:26.725Z" },
-
{ url = "https://files.pythonhosted.org/packages/7a/10/f882167cd207fbdd743e55534d5d9620e095089d176d55cb22d5322f2afd/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc", size = 154465, upload-time = "2025-10-14T04:41:28.322Z" },
-
{ url = "https://files.pythonhosted.org/packages/89/66/c7a9e1b7429be72123441bfdbaf2bc13faab3f90b933f664db506dea5915/charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac", size = 99404, upload-time = "2025-10-14T04:41:29.95Z" },
-
{ url = "https://files.pythonhosted.org/packages/c4/26/b9924fa27db384bdcd97ab83b4f0a8058d96ad9626ead570674d5e737d90/charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14", size = 107092, upload-time = "2025-10-14T04:41:31.188Z" },
-
{ url = "https://files.pythonhosted.org/packages/af/8f/3ed4bfa0c0c72a7ca17f0380cd9e4dd842b09f664e780c13cff1dcf2ef1b/charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2", size = 100408, upload-time = "2025-10-14T04:41:32.624Z" },
-
{ url = "https://files.pythonhosted.org/packages/2a/35/7051599bd493e62411d6ede36fd5af83a38f37c4767b92884df7301db25d/charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd", size = 207746, upload-time = "2025-10-14T04:41:33.773Z" },
-
{ url = "https://files.pythonhosted.org/packages/10/9a/97c8d48ef10d6cd4fcead2415523221624bf58bcf68a802721a6bc807c8f/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb", size = 147889, upload-time = "2025-10-14T04:41:34.897Z" },
-
{ url = "https://files.pythonhosted.org/packages/10/bf/979224a919a1b606c82bd2c5fa49b5c6d5727aa47b4312bb27b1734f53cd/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e", size = 143641, upload-time = "2025-10-14T04:41:36.116Z" },
-
{ url = "https://files.pythonhosted.org/packages/ba/33/0ad65587441fc730dc7bd90e9716b30b4702dc7b617e6ba4997dc8651495/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14", size = 160779, upload-time = "2025-10-14T04:41:37.229Z" },
-
{ url = "https://files.pythonhosted.org/packages/67/ed/331d6b249259ee71ddea93f6f2f0a56cfebd46938bde6fcc6f7b9a3d0e09/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191", size = 159035, upload-time = "2025-10-14T04:41:38.368Z" },
-
{ url = "https://files.pythonhosted.org/packages/67/ff/f6b948ca32e4f2a4576aa129d8bed61f2e0543bf9f5f2b7fc3758ed005c9/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838", size = 152542, upload-time = "2025-10-14T04:41:39.862Z" },
-
{ url = "https://files.pythonhosted.org/packages/16/85/276033dcbcc369eb176594de22728541a925b2632f9716428c851b149e83/charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6", size = 149524, upload-time = "2025-10-14T04:41:41.319Z" },
-
{ url = "https://files.pythonhosted.org/packages/9e/f2/6a2a1f722b6aba37050e626530a46a68f74e63683947a8acff92569f979a/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e", size = 150395, upload-time = "2025-10-14T04:41:42.539Z" },
-
{ url = "https://files.pythonhosted.org/packages/60/bb/2186cb2f2bbaea6338cad15ce23a67f9b0672929744381e28b0592676824/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c", size = 143680, upload-time = "2025-10-14T04:41:43.661Z" },
-
{ url = "https://files.pythonhosted.org/packages/7d/a5/bf6f13b772fbb2a90360eb620d52ed8f796f3c5caee8398c3b2eb7b1c60d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090", size = 162045, upload-time = "2025-10-14T04:41:44.821Z" },
-
{ url = "https://files.pythonhosted.org/packages/df/c5/d1be898bf0dc3ef9030c3825e5d3b83f2c528d207d246cbabe245966808d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152", size = 149687, upload-time = "2025-10-14T04:41:46.442Z" },
-
{ url = "https://files.pythonhosted.org/packages/a5/42/90c1f7b9341eef50c8a1cb3f098ac43b0508413f33affd762855f67a410e/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828", size = 160014, upload-time = "2025-10-14T04:41:47.631Z" },
-
{ url = "https://files.pythonhosted.org/packages/76/be/4d3ee471e8145d12795ab655ece37baed0929462a86e72372fd25859047c/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec", size = 154044, upload-time = "2025-10-14T04:41:48.81Z" },
-
{ url = "https://files.pythonhosted.org/packages/b0/6f/8f7af07237c34a1defe7defc565a9bc1807762f672c0fde711a4b22bf9c0/charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9", size = 99940, upload-time = "2025-10-14T04:41:49.946Z" },
-
{ url = "https://files.pythonhosted.org/packages/4b/51/8ade005e5ca5b0d80fb4aff72a3775b325bdc3d27408c8113811a7cbe640/charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c", size = 107104, upload-time = "2025-10-14T04:41:51.051Z" },
-
{ url = "https://files.pythonhosted.org/packages/da/5f/6b8f83a55bb8278772c5ae54a577f3099025f9ade59d0136ac24a0df4bde/charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2", size = 100743, upload-time = "2025-10-14T04:41:52.122Z" },
-
{ url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
]
[[package]]
name = "idna"
-
version = "3.11"
source = { registry = "https://pypi.org/simple" }
-
sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
]
[[package]]
···
[[package]]
name = "requests"
-
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
···
{ name = "idna" },
{ name = "urllib3" },
]
-
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]
[[package]]
name = "urllib3"
-
version = "2.5.0"
source = { registry = "https://pypi.org/simple" }
-
sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" },
]
[[package]]
name = "websockets"
-
version = "15.0.1"
source = { registry = "https://pypi.org/simple" }
-
sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016, upload-time = "2025-03-05T20:03:41.606Z" }
wheels = [
-
{ url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437, upload-time = "2025-03-05T20:02:16.706Z" },
-
{ url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096, upload-time = "2025-03-05T20:02:18.832Z" },
-
{ url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332, upload-time = "2025-03-05T20:02:20.187Z" },
-
{ url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152, upload-time = "2025-03-05T20:02:22.286Z" },
-
{ url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096, upload-time = "2025-03-05T20:02:24.368Z" },
-
{ url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523, upload-time = "2025-03-05T20:02:25.669Z" },
-
{ url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790, upload-time = "2025-03-05T20:02:26.99Z" },
-
{ url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165, upload-time = "2025-03-05T20:02:30.291Z" },
-
{ url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160, upload-time = "2025-03-05T20:02:31.634Z" },
-
{ url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395, upload-time = "2025-03-05T20:02:33.017Z" },
-
{ url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841, upload-time = "2025-03-05T20:02:34.498Z" },
-
{ url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440, upload-time = "2025-03-05T20:02:36.695Z" },
-
{ url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098, upload-time = "2025-03-05T20:02:37.985Z" },
-
{ url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329, upload-time = "2025-03-05T20:02:39.298Z" },
-
{ url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111, upload-time = "2025-03-05T20:02:40.595Z" },
-
{ url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054, upload-time = "2025-03-05T20:02:41.926Z" },
-
{ url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496, upload-time = "2025-03-05T20:02:43.304Z" },
-
{ url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829, upload-time = "2025-03-05T20:02:48.812Z" },
-
{ url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217, upload-time = "2025-03-05T20:02:50.14Z" },
-
{ url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195, upload-time = "2025-03-05T20:02:51.561Z" },
-
{ url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393, upload-time = "2025-03-05T20:02:53.814Z" },
{ url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837, upload-time = "2025-03-05T20:02:55.237Z" },
{ url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743, upload-time = "2025-03-05T20:03:39.41Z" },
]

[[package]]
name = "xpost"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "python-magic" },
{ name = "requests" },
{ name = "websockets" },
···
[package.metadata]
requires-dist = [
{ name = "python-magic", specifier = ">=0.4.27" },
{ name = "requests", specifier = ">=2.32.5" },
{ name = "websockets", specifier = ">=15.0.1" },
]
···
version = 1
revision = 2
requires-python = ">=3.12"

[[package]]
name = "annotated-types"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
]
[[package]]
name = "anyio"
version = "4.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "idna" },
{ name = "sniffio" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949, upload-time = "2025-03-17T00:02:54.77Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916, upload-time = "2025-03-17T00:02:52.713Z" },
]
[[package]]
name = "atproto"
version = "0.0.61"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "cryptography" },
{ name = "dnspython" },
{ name = "httpx" },
{ name = "libipld" },
{ name = "pydantic" },
{ name = "typing-extensions" },
{ name = "websockets" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/59/6f5074b3a45e0e3c1853544240e9039e86219feb30ff1bb5e8582c791547/atproto-0.0.61.tar.gz", hash = "sha256:98e022daf538d14f134ce7c91d42c4c973f3493ac56e43a84daa4c881f102beb", size = 189208, upload-time = "2025-04-19T00:20:11.918Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bd/b6/da9963bf54d4c0a8a590b6297d8858c395243dbb04cb581fdadb5fe7eac7/atproto-0.0.61-py3-none-any.whl", hash = "sha256:658da5832aaeea4a12a9a74235f9c90c11453e77d596fdccb1f8b39d56245b88", size = 380426, upload-time = "2025-04-19T00:20:10.026Z" },
]
[[package]]
name = "certifi"
version = "2025.4.26"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/9e/c05b3920a3b7d20d3d3310465f50348e5b3694f4f88c6daf736eef3024c4/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6", size = 160705, upload-time = "2025-04-26T02:12:29.51Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4a/7e/3db2bd1b1f9e95f7cddca6d6e75e2f2bd9f51b1246e546d88addca0106bd/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3", size = 159618, upload-time = "2025-04-26T02:12:27.662Z" },
]
[[package]]
name = "cffi"
version = "1.17.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycparser" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fc/97/c783634659c2920c3fc70419e3af40972dbaf758daa229a7d6ea6135c90d/cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824", size = 516621, upload-time = "2024-09-04T20:45:21.852Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5a/84/e94227139ee5fb4d600a7a4927f322e1d4aea6fdc50bd3fca8493caba23f/cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4", size = 183178, upload-time = "2024-09-04T20:44:12.232Z" },
{ url = "https://files.pythonhosted.org/packages/da/ee/fb72c2b48656111c4ef27f0f91da355e130a923473bf5ee75c5643d00cca/cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c", size = 178840, upload-time = "2024-09-04T20:44:13.739Z" },
{ url = "https://files.pythonhosted.org/packages/cc/b6/db007700f67d151abadf508cbfd6a1884f57eab90b1bb985c4c8c02b0f28/cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36", size = 454803, upload-time = "2024-09-04T20:44:15.231Z" },
{ url = "https://files.pythonhosted.org/packages/1a/df/f8d151540d8c200eb1c6fba8cd0dfd40904f1b0682ea705c36e6c2e97ab3/cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5", size = 478850, upload-time = "2024-09-04T20:44:17.188Z" },
{ url = "https://files.pythonhosted.org/packages/28/c0/b31116332a547fd2677ae5b78a2ef662dfc8023d67f41b2a83f7c2aa78b1/cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff", size = 485729, upload-time = "2024-09-04T20:44:18.688Z" },
{ url = "https://files.pythonhosted.org/packages/91/2b/9a1ddfa5c7f13cab007a2c9cc295b70fbbda7cb10a286aa6810338e60ea1/cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99", size = 471256, upload-time = "2024-09-04T20:44:20.248Z" },
{ url = "https://files.pythonhosted.org/packages/b2/d5/da47df7004cb17e4955df6a43d14b3b4ae77737dff8bf7f8f333196717bf/cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93", size = 479424, upload-time = "2024-09-04T20:44:21.673Z" },
{ url = "https://files.pythonhosted.org/packages/0b/ac/2a28bcf513e93a219c8a4e8e125534f4f6db03e3179ba1c45e949b76212c/cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3", size = 484568, upload-time = "2024-09-04T20:44:23.245Z" },
{ url = "https://files.pythonhosted.org/packages/d4/38/ca8a4f639065f14ae0f1d9751e70447a261f1a30fa7547a828ae08142465/cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8", size = 488736, upload-time = "2024-09-04T20:44:24.757Z" },
{ url = "https://files.pythonhosted.org/packages/86/c5/28b2d6f799ec0bdecf44dced2ec5ed43e0eb63097b0f58c293583b406582/cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65", size = 172448, upload-time = "2024-09-04T20:44:26.208Z" },
{ url = "https://files.pythonhosted.org/packages/50/b9/db34c4755a7bd1cb2d1603ac3863f22bcecbd1ba29e5ee841a4bc510b294/cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903", size = 181976, upload-time = "2024-09-04T20:44:27.578Z" },
{ url = "https://files.pythonhosted.org/packages/8d/f8/dd6c246b148639254dad4d6803eb6a54e8c85c6e11ec9df2cffa87571dbe/cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e", size = 182989, upload-time = "2024-09-04T20:44:28.956Z" },
{ url = "https://files.pythonhosted.org/packages/8b/f1/672d303ddf17c24fc83afd712316fda78dc6fce1cd53011b839483e1ecc8/cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2", size = 178802, upload-time = "2024-09-04T20:44:30.289Z" },
{ url = "https://files.pythonhosted.org/packages/0e/2d/eab2e858a91fdff70533cab61dcff4a1f55ec60425832ddfdc9cd36bc8af/cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3", size = 454792, upload-time = "2024-09-04T20:44:32.01Z" },
{ url = "https://files.pythonhosted.org/packages/75/b2/fbaec7c4455c604e29388d55599b99ebcc250a60050610fadde58932b7ee/cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683", size = 478893, upload-time = "2024-09-04T20:44:33.606Z" },
{ url = "https://files.pythonhosted.org/packages/4f/b7/6e4a2162178bf1935c336d4da8a9352cccab4d3a5d7914065490f08c0690/cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5", size = 485810, upload-time = "2024-09-04T20:44:35.191Z" },
{ url = "https://files.pythonhosted.org/packages/c7/8a/1d0e4a9c26e54746dc08c2c6c037889124d4f59dffd853a659fa545f1b40/cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4", size = 471200, upload-time = "2024-09-04T20:44:36.743Z" },
{ url = "https://files.pythonhosted.org/packages/26/9f/1aab65a6c0db35f43c4d1b4f580e8df53914310afc10ae0397d29d697af4/cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd", size = 479447, upload-time = "2024-09-04T20:44:38.492Z" },
{ url = "https://files.pythonhosted.org/packages/5f/e4/fb8b3dd8dc0e98edf1135ff067ae070bb32ef9d509d6cb0f538cd6f7483f/cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed", size = 484358, upload-time = "2024-09-04T20:44:40.046Z" },
{ url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469, upload-time = "2024-09-04T20:44:41.616Z" },
{ url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475, upload-time = "2024-09-04T20:44:43.733Z" },
{ url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009, upload-time = "2024-09-04T20:44:45.309Z" },
]

[[package]]
name = "charset-normalizer"
version = "3.4.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e4/33/89c2ced2b67d1c2a61c19c6751aa8902d46ce3dacb23600a283619f5a12d/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63", size = 126367, upload-time = "2025-05-02T08:34:42.01Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d7/a4/37f4d6035c89cac7930395a35cc0f1b872e652eaafb76a6075943754f095/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7", size = 199936, upload-time = "2025-05-02T08:32:33.712Z" },
{ url = "https://files.pythonhosted.org/packages/ee/8a/1a5e33b73e0d9287274f899d967907cd0bf9c343e651755d9307e0dbf2b3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3", size = 143790, upload-time = "2025-05-02T08:32:35.768Z" },
{ url = "https://files.pythonhosted.org/packages/66/52/59521f1d8e6ab1482164fa21409c5ef44da3e9f653c13ba71becdd98dec3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a", size = 153924, upload-time = "2025-05-02T08:32:37.284Z" },
{ url = "https://files.pythonhosted.org/packages/86/2d/fb55fdf41964ec782febbf33cb64be480a6b8f16ded2dbe8db27a405c09f/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214", size = 146626, upload-time = "2025-05-02T08:32:38.803Z" },
{ url = "https://files.pythonhosted.org/packages/8c/73/6ede2ec59bce19b3edf4209d70004253ec5f4e319f9a2e3f2f15601ed5f7/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a", size = 148567, upload-time = "2025-05-02T08:32:40.251Z" },
{ url = "https://files.pythonhosted.org/packages/09/14/957d03c6dc343c04904530b6bef4e5efae5ec7d7990a7cbb868e4595ee30/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd", size = 150957, upload-time = "2025-05-02T08:32:41.705Z" },
{ url = "https://files.pythonhosted.org/packages/0d/c8/8174d0e5c10ccebdcb1b53cc959591c4c722a3ad92461a273e86b9f5a302/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981", size = 145408, upload-time = "2025-05-02T08:32:43.709Z" },
{ url = "https://files.pythonhosted.org/packages/58/aa/8904b84bc8084ac19dc52feb4f5952c6df03ffb460a887b42615ee1382e8/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c", size = 153399, upload-time = "2025-05-02T08:32:46.197Z" },
{ url = "https://files.pythonhosted.org/packages/c2/26/89ee1f0e264d201cb65cf054aca6038c03b1a0c6b4ae998070392a3ce605/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b", size = 156815, upload-time = "2025-05-02T08:32:48.105Z" },
{ url = "https://files.pythonhosted.org/packages/fd/07/68e95b4b345bad3dbbd3a8681737b4338ff2c9df29856a6d6d23ac4c73cb/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d", size = 154537, upload-time = "2025-05-02T08:32:49.719Z" },
{ url = "https://files.pythonhosted.org/packages/77/1a/5eefc0ce04affb98af07bc05f3bac9094513c0e23b0562d64af46a06aae4/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f", size = 149565, upload-time = "2025-05-02T08:32:51.404Z" },
{ url = "https://files.pythonhosted.org/packages/37/a0/2410e5e6032a174c95e0806b1a6585eb21e12f445ebe239fac441995226a/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c", size = 98357, upload-time = "2025-05-02T08:32:53.079Z" },
{ url = "https://files.pythonhosted.org/packages/6c/4f/c02d5c493967af3eda9c771ad4d2bbc8df6f99ddbeb37ceea6e8716a32bc/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e", size = 105776, upload-time = "2025-05-02T08:32:54.573Z" },
{ url = "https://files.pythonhosted.org/packages/ea/12/a93df3366ed32db1d907d7593a94f1fe6293903e3e92967bebd6950ed12c/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0", size = 199622, upload-time = "2025-05-02T08:32:56.363Z" },
{ url = "https://files.pythonhosted.org/packages/04/93/bf204e6f344c39d9937d3c13c8cd5bbfc266472e51fc8c07cb7f64fcd2de/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf", size = 143435, upload-time = "2025-05-02T08:32:58.551Z" },
{ url = "https://files.pythonhosted.org/packages/22/2a/ea8a2095b0bafa6c5b5a55ffdc2f924455233ee7b91c69b7edfcc9e02284/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e", size = 153653, upload-time = "2025-05-02T08:33:00.342Z" },
{ url = "https://files.pythonhosted.org/packages/b6/57/1b090ff183d13cef485dfbe272e2fe57622a76694061353c59da52c9a659/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1", size = 146231, upload-time = "2025-05-02T08:33:02.081Z" },
{ url = "https://files.pythonhosted.org/packages/e2/28/ffc026b26f441fc67bd21ab7f03b313ab3fe46714a14b516f931abe1a2d8/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c", size = 148243, upload-time = "2025-05-02T08:33:04.063Z" },
{ url = "https://files.pythonhosted.org/packages/c0/0f/9abe9bd191629c33e69e47c6ef45ef99773320e9ad8e9cb08b8ab4a8d4cb/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691", size = 150442, upload-time = "2025-05-02T08:33:06.418Z" },
{ url = "https://files.pythonhosted.org/packages/67/7c/a123bbcedca91d5916c056407f89a7f5e8fdfce12ba825d7d6b9954a1a3c/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0", size = 145147, upload-time = "2025-05-02T08:33:08.183Z" },
{ url = "https://files.pythonhosted.org/packages/ec/fe/1ac556fa4899d967b83e9893788e86b6af4d83e4726511eaaad035e36595/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b", size = 153057, upload-time = "2025-05-02T08:33:09.986Z" },
{ url = "https://files.pythonhosted.org/packages/2b/ff/acfc0b0a70b19e3e54febdd5301a98b72fa07635e56f24f60502e954c461/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff", size = 156454, upload-time = "2025-05-02T08:33:11.814Z" },
{ url = "https://files.pythonhosted.org/packages/92/08/95b458ce9c740d0645feb0e96cea1f5ec946ea9c580a94adfe0b617f3573/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b", size = 154174, upload-time = "2025-05-02T08:33:13.707Z" },
{ url = "https://files.pythonhosted.org/packages/78/be/8392efc43487ac051eee6c36d5fbd63032d78f7728cb37aebcc98191f1ff/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148", size = 149166, upload-time = "2025-05-02T08:33:15.458Z" },
{ url = "https://files.pythonhosted.org/packages/44/96/392abd49b094d30b91d9fbda6a69519e95802250b777841cf3bda8fe136c/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7", size = 98064, upload-time = "2025-05-02T08:33:17.06Z" },
{ url = "https://files.pythonhosted.org/packages/e9/b0/0200da600134e001d91851ddc797809e2fe0ea72de90e09bec5a2fbdaccb/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980", size = 105641, upload-time = "2025-05-02T08:33:18.753Z" },
{ url = "https://files.pythonhosted.org/packages/20/94/c5790835a017658cbfabd07f3bfb549140c3ac458cfc196323996b10095a/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0", size = 52626, upload-time = "2025-05-02T08:34:40.053Z" },
]
[[package]]
name = "click"
version = "8.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
]
[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]
[[package]]
name = "cryptography"
version = "45.0.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/13/1f/9fa001e74a1993a9cadd2333bb889e50c66327b8594ac538ab8a04f915b7/cryptography-45.0.3.tar.gz", hash = "sha256:ec21313dd335c51d7877baf2972569f40a4291b76a0ce51391523ae358d05899", size = 744738, upload-time = "2025-05-25T14:17:24.777Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/82/b2/2345dc595998caa6f68adf84e8f8b50d18e9fc4638d32b22ea8daedd4b7a/cryptography-45.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:7573d9eebaeceeb55285205dbbb8753ac1e962af3d9640791d12b36864065e71", size = 7056239, upload-time = "2025-05-25T14:16:12.22Z" },
{ url = "https://files.pythonhosted.org/packages/71/3d/ac361649a0bfffc105e2298b720d8b862330a767dab27c06adc2ddbef96a/cryptography-45.0.3-cp311-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d377dde61c5d67eb4311eace661c3efda46c62113ff56bf05e2d679e02aebb5b", size = 4205541, upload-time = "2025-05-25T14:16:14.333Z" },
{ url = "https://files.pythonhosted.org/packages/70/3e/c02a043750494d5c445f769e9c9f67e550d65060e0bfce52d91c1362693d/cryptography-45.0.3-cp311-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fae1e637f527750811588e4582988932c222f8251f7b7ea93739acb624e1487f", size = 4433275, upload-time = "2025-05-25T14:16:16.421Z" },
{ url = "https://files.pythonhosted.org/packages/40/7a/9af0bfd48784e80eef3eb6fd6fde96fe706b4fc156751ce1b2b965dada70/cryptography-45.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:ca932e11218bcc9ef812aa497cdf669484870ecbcf2d99b765d6c27a86000942", size = 4209173, upload-time = "2025-05-25T14:16:18.163Z" },
{ url = "https://files.pythonhosted.org/packages/31/5f/d6f8753c8708912df52e67969e80ef70b8e8897306cd9eb8b98201f8c184/cryptography-45.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:af3f92b1dc25621f5fad065288a44ac790c5798e986a34d393ab27d2b27fcff9", size = 3898150, upload-time = "2025-05-25T14:16:20.34Z" },
{ url = "https://files.pythonhosted.org/packages/8b/50/f256ab79c671fb066e47336706dc398c3b1e125f952e07d54ce82cf4011a/cryptography-45.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:2f8f8f0b73b885ddd7f3d8c2b2234a7d3ba49002b0223f58cfde1bedd9563c56", size = 4466473, upload-time = "2025-05-25T14:16:22.605Z" },
{ url = "https://files.pythonhosted.org/packages/62/e7/312428336bb2df0848d0768ab5a062e11a32d18139447a76dfc19ada8eed/cryptography-45.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:9cc80ce69032ffa528b5e16d217fa4d8d4bb7d6ba8659c1b4d74a1b0f4235fca", size = 4211890, upload-time = "2025-05-25T14:16:24.738Z" },
{ url = "https://files.pythonhosted.org/packages/e7/53/8a130e22c1e432b3c14896ec5eb7ac01fb53c6737e1d705df7e0efb647c6/cryptography-45.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:c824c9281cb628015bfc3c59335163d4ca0540d49de4582d6c2637312907e4b1", size = 4466300, upload-time = "2025-05-25T14:16:26.768Z" },
{ url = "https://files.pythonhosted.org/packages/ba/75/6bb6579688ef805fd16a053005fce93944cdade465fc92ef32bbc5c40681/cryptography-45.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:5833bb4355cb377ebd880457663a972cd044e7f49585aee39245c0d592904578", size = 4332483, upload-time = "2025-05-25T14:16:28.316Z" },
{ url = "https://files.pythonhosted.org/packages/2f/11/2538f4e1ce05c6c4f81f43c1ef2bd6de7ae5e24ee284460ff6c77e42ca77/cryptography-45.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:9bb5bf55dcb69f7067d80354d0a348368da907345a2c448b0babc4215ccd3497", size = 4573714, upload-time = "2025-05-25T14:16:30.474Z" },
{ url = "https://files.pythonhosted.org/packages/f5/bb/e86e9cf07f73a98d84a4084e8fd420b0e82330a901d9cac8149f994c3417/cryptography-45.0.3-cp311-abi3-win32.whl", hash = "sha256:3ad69eeb92a9de9421e1f6685e85a10fbcfb75c833b42cc9bc2ba9fb00da4710", size = 2934752, upload-time = "2025-05-25T14:16:32.204Z" },
{ url = "https://files.pythonhosted.org/packages/c7/75/063bc9ddc3d1c73e959054f1fc091b79572e716ef74d6caaa56e945b4af9/cryptography-45.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:97787952246a77d77934d41b62fb1b6f3581d83f71b44796a4158d93b8f5c490", size = 3412465, upload-time = "2025-05-25T14:16:33.888Z" },
{ url = "https://files.pythonhosted.org/packages/71/9b/04ead6015229a9396890d7654ee35ef630860fb42dc9ff9ec27f72157952/cryptography-45.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:c92519d242703b675ccefd0f0562eb45e74d438e001f8ab52d628e885751fb06", size = 7031892, upload-time = "2025-05-25T14:16:36.214Z" },
{ url = "https://files.pythonhosted.org/packages/46/c7/c7d05d0e133a09fc677b8a87953815c522697bdf025e5cac13ba419e7240/cryptography-45.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5edcb90da1843df85292ef3a313513766a78fbbb83f584a5a58fb001a5a9d57", size = 4196181, upload-time = "2025-05-25T14:16:37.934Z" },
{ url = "https://files.pythonhosted.org/packages/08/7a/6ad3aa796b18a683657cef930a986fac0045417e2dc428fd336cfc45ba52/cryptography-45.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:38deed72285c7ed699864f964a3f4cf11ab3fb38e8d39cfcd96710cd2b5bb716", size = 4423370, upload-time = "2025-05-25T14:16:39.502Z" },
{ url = "https://files.pythonhosted.org/packages/4f/58/ec1461bfcb393525f597ac6a10a63938d18775b7803324072974b41a926b/cryptography-45.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:5555365a50efe1f486eed6ac7062c33b97ccef409f5970a0b6f205a7cfab59c8", size = 4197839, upload-time = "2025-05-25T14:16:41.322Z" },
{ url = "https://files.pythonhosted.org/packages/d4/3d/5185b117c32ad4f40846f579369a80e710d6146c2baa8ce09d01612750db/cryptography-45.0.3-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9e4253ed8f5948a3589b3caee7ad9a5bf218ffd16869c516535325fece163dcc", size = 3886324, upload-time = "2025-05-25T14:16:43.041Z" },
{ url = "https://files.pythonhosted.org/packages/67/85/caba91a57d291a2ad46e74016d1f83ac294f08128b26e2a81e9b4f2d2555/cryptography-45.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:cfd84777b4b6684955ce86156cfb5e08d75e80dc2585e10d69e47f014f0a5342", size = 4450447, upload-time = "2025-05-25T14:16:44.759Z" },
{ url = "https://files.pythonhosted.org/packages/ae/d1/164e3c9d559133a38279215c712b8ba38e77735d3412f37711b9f8f6f7e0/cryptography-45.0.3-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:a2b56de3417fd5f48773ad8e91abaa700b678dc7fe1e0c757e1ae340779acf7b", size = 4200576, upload-time = "2025-05-25T14:16:46.438Z" },
{ url = "https://files.pythonhosted.org/packages/71/7a/e002d5ce624ed46dfc32abe1deff32190f3ac47ede911789ee936f5a4255/cryptography-45.0.3-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:57a6500d459e8035e813bd8b51b671977fb149a8c95ed814989da682314d0782", size = 4450308, upload-time = "2025-05-25T14:16:48.228Z" },
{ url = "https://files.pythonhosted.org/packages/87/ad/3fbff9c28cf09b0a71e98af57d74f3662dea4a174b12acc493de00ea3f28/cryptography-45.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:f22af3c78abfbc7cbcdf2c55d23c3e022e1a462ee2481011d518c7fb9c9f3d65", size = 4325125, upload-time = "2025-05-25T14:16:49.844Z" },
{ url = "https://files.pythonhosted.org/packages/f5/b4/51417d0cc01802304c1984d76e9592f15e4801abd44ef7ba657060520bf0/cryptography-45.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:232954730c362638544758a8160c4ee1b832dc011d2c41a306ad8f7cccc5bb0b", size = 4560038, upload-time = "2025-05-25T14:16:51.398Z" },
{ url = "https://files.pythonhosted.org/packages/80/38/d572f6482d45789a7202fb87d052deb7a7b136bf17473ebff33536727a2c/cryptography-45.0.3-cp37-abi3-win32.whl", hash = "sha256:cb6ab89421bc90e0422aca911c69044c2912fc3debb19bb3c1bfe28ee3dff6ab", size = 2924070, upload-time = "2025-05-25T14:16:53.472Z" },
{ url = "https://files.pythonhosted.org/packages/91/5a/61f39c0ff4443651cc64e626fa97ad3099249152039952be8f344d6b0c86/cryptography-45.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:d54ae41e6bd70ea23707843021c778f151ca258081586f0cfa31d936ae43d1b2", size = 3395005, upload-time = "2025-05-25T14:16:55.134Z" },
]
[[package]]
name = "dnspython"
version = "2.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b5/4a/263763cb2ba3816dd94b08ad3a33d5fdae34ecb856678773cc40a3605829/dnspython-2.7.0.tar.gz", hash = "sha256:ce9c432eda0dc91cf618a5cedf1a4e142651196bbcd2c80e89ed5a907e5cfaf1", size = 345197, upload-time = "2024-10-05T20:14:59.362Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/68/1b/e0a87d256e40e8c888847551b20a017a6b98139178505dc7ffb96f04e954/dnspython-2.7.0-py3-none-any.whl", hash = "sha256:b4c34b7d10b51bcc3a5071e7b8dee77939f1e878477eeecc965e9835f63c6c86", size = 313632, upload-time = "2024-10-05T20:14:57.687Z" },
]
[[package]]
name = "h11"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
]

[[package]]
name = "httpcore"
version = "1.0.9"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
]

[[package]]
name = "httpx"
version = "0.28.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "certifi" },
{ name = "httpcore" },
{ name = "idna" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]

[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
]

[[package]]
name = "libipld"
version = "3.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d4/ad/b440c64e2d1ee84f2933979175399ff09bd0ba7b1b07c6bc20ba585825cd/libipld-3.0.1.tar.gz", hash = "sha256:2970752de70e5fdcac4646900cdefaa0dca08db9b5d59c40b5496d99e3bffa64", size = 4359070, upload-time = "2025-02-18T11:19:59.924Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b8/6b/87c3b3222a1ebc9b8654a2ec168d177e85c993a679b698f53f199b367e37/libipld-3.0.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:27313adb70ca9ecfaaa34f1ca6e45ee0569935b7ba9802f78c2f37f7a633a7dd", size = 307914, upload-time = "2025-02-18T11:18:13.449Z" },
{ url = "https://files.pythonhosted.org/packages/62/fc/9cd90e1bf5e50fa31ced3a9e4eced8b386a509f693d915ff483c320f8556/libipld-3.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:bf5a14647350aa6779d634b7dc0f6967296fe52e9ca1d6132e24aa388c77c68e", size = 295778, upload-time = "2025-02-18T11:18:15.223Z" },
{ url = "https://files.pythonhosted.org/packages/9b/17/c4ee7f38d43d513935179706011aa8fa5ef70d223626477de05ae301f4ae/libipld-3.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d9e619573d500eb4a4ab4a8ef90882305fba43a5a405eb80fcc0afe5d6e9dcd", size = 675489, upload-time = "2025-02-18T11:18:16.808Z" },
{ url = "https://files.pythonhosted.org/packages/8f/93/f7ba7d2ce896a774634f3a279a0d7900ea2b76e0d93c335727b01c564fd6/libipld-3.0.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a2fbfaed3fc98c95cd412e61e960cd41633fc880de24327613b0cb0b974d277b", size = 681145, upload-time = "2025-02-18T11:18:18.835Z" },
{ url = "https://files.pythonhosted.org/packages/92/16/c247088ec2194bfc5b5ed71059c468d1f16987696905fe9b5aaaac336521/libipld-3.0.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b36044476920455a26d30df75728eab069201c42c0af3e3610a30fd62b96ab55", size = 685159, upload-time = "2025-02-18T11:18:20.172Z" },
{ url = "https://files.pythonhosted.org/packages/e1/f3/3d0442d0bd92f2bbc5bc7259569c2886bd1398a6f090ea30cd19e8c45f00/libipld-3.0.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4586a3442f12065a64a36ae56d80c71d05a87413fbf17bae330c42793c8ecfac", size = 820381, upload-time = "2025-02-18T11:18:22.398Z" },
{ url = "https://files.pythonhosted.org/packages/c7/a7/63998349b924f0d2225ed194497d24bf088fad34fc02085fd97c4777164c/libipld-3.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d243ca7dea89e1579fd95f95ff612a7b56a980743c25e2a0b1a39cae7b67e55e", size = 681046, upload-time = "2025-02-18T11:18:23.954Z" },
{ url = "https://files.pythonhosted.org/packages/0b/5a/bdbadafe5cb3c5ae1b4e7fd1517a436d7bda8b63621f3d39af92622d905e/libipld-3.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1525c07363abb20e8cd416df7ca316ddfc4f592ed2da694b02e0e4a4af1b9418", size = 689931, upload-time = "2025-02-18T11:18:26.868Z" },
{ url = "https://files.pythonhosted.org/packages/b1/3c/759fcc3f12e41485ef374fab202b7ba84e9f001ca821d3811ff8cd030fdf/libipld-3.0.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:372768df5540867962c3c16fe80976f8b162a9771e8fe1b2175f18dabf23b9ce", size = 849420, upload-time = "2025-02-18T11:18:28.847Z" },
{ url = "https://files.pythonhosted.org/packages/c4/ac/d697be6d9f20c5176d11193edbac70d55bdeaa70cd110a156ac87aaecaae/libipld-3.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:47bf15f9fc5890ff4807c0c5cb0ff99d625bcea3cd222aaa500d57466da529bd", size = 841270, upload-time = "2025-02-18T11:18:30.588Z" },
{ url = "https://files.pythonhosted.org/packages/6e/91/5c64cd11e2daee21c968baa6a0669a0f402ead5fc99ad78b92e06a42e4e5/libipld-3.0.1-cp312-cp312-win32.whl", hash = "sha256:989d37ae0cb31380e6b76391e0272342de830adad2821c2de7b925b360fc45f3", size = 182583, upload-time = "2025-02-18T11:18:31.775Z" },
{ url = "https://files.pythonhosted.org/packages/84/b7/37f88ada4e6fb762a71e93366c320f58995022cf8f67c4ad91d4b9a4568d/libipld-3.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:4557f20d4b8e61ac6c89ab4cea04f3a518a266f3c3d7348cf4cc8ac9b02c89dc", size = 197643, upload-time = "2025-02-18T11:18:32.86Z" },
{ url = "https://files.pythonhosted.org/packages/3a/23/184f246a3ef1f6fe9775ad27851091a3779c14657e5591f6bdbe910bfe88/libipld-3.0.1-cp312-cp312-win_arm64.whl", hash = "sha256:92ec97dac2e978f09343ebb64b0bb9bed9c294e8a224490552cfc200e9101f5c", size = 176991, upload-time = "2025-02-18T11:18:34.147Z" },
{ url = "https://files.pythonhosted.org/packages/9d/a2/28c89265a107f9e92e32e308084edd7669e3fe40acb5e21b9e5af231f627/libipld-3.0.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:2cc452e533b7af10a66134aa33a064b40e05fe51fa4a509a969342768543953f", size = 305678, upload-time = "2025-02-18T11:18:36.125Z" },
{ url = "https://files.pythonhosted.org/packages/05/41/ccb2251240547e0903a55f84bcab0de3b766297f5112c9a3519ce0c66dee/libipld-3.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6cd8e21c0c7ee87831dc262794637cf6c47b55c55689bc917d2c3d2518221048", size = 295909, upload-time = "2025-02-18T11:18:37.246Z" },
{ url = "https://files.pythonhosted.org/packages/9b/01/93f4e7f751eaafb6e7ba2a5c2dc859eda743837f3edbd06b712a5e92e63e/libipld-3.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9de6425fc8ba0e9072c77826e66ece2dcb1d161f933cc35f2ad94470d5a304fb", size = 675461, upload-time = "2025-02-18T11:18:38.328Z" },
{ url = "https://files.pythonhosted.org/packages/5e/a7/d1ff7b19e48f814f4fc908bd0a9160d80539a0128fe9b51285af09f65625/libipld-3.0.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:23c84465181ed30760ba9483e3ae71027573903cfbadf173be9fdd44bd83d8bd", size = 681427, upload-time = "2025-02-18T11:18:39.638Z" },
{ url = "https://files.pythonhosted.org/packages/e2/42/7c3b45b9186f7f67015b0d717feeaa920ea215c51df675e27419f598ffb2/libipld-3.0.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:45052b7f9b6a61a425318ff611b115571965d00e42c2ca66dfd0c56a4f3002b4", size = 684988, upload-time = "2025-02-18T11:18:42.021Z" },
{ url = "https://files.pythonhosted.org/packages/33/02/dd30f423e8e74ba830dff5bbbd2d7f68c474e5df1d3b56fce5e59bc08a1e/libipld-3.0.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6d183c2543db326d9a4e21819ba5674ae4f1e69dcfd853c654fba471cfbbaa88", size = 820272, upload-time = "2025-02-18T11:18:46.181Z" },
{ url = "https://files.pythonhosted.org/packages/80/cd/bdd10568306ed1d71d24440e08b526ae69b93405d75a5289e0d54cf7b961/libipld-3.0.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ceb82681b6985e34609636186ac00b51105816d310ed510de1169cd65f903622", size = 680986, upload-time = "2025-02-18T11:18:48.285Z" },
{ url = "https://files.pythonhosted.org/packages/0a/20/d03eddce8c41f1f928efb37268424e336d97d2aca829bd267b1f12851759/libipld-3.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e3c71ffe0b9c182664bac3a2386e6c6580744f5aa46513d0d6823e671ab71d82", size = 689783, upload-time = "2025-02-18T11:18:49.501Z" },
{ url = "https://files.pythonhosted.org/packages/27/17/fdfcb6d0b0d7120eb3ad9361173cc6d5c24814b6ea2e7b135b3bb8d6920e/libipld-3.0.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:6ed68ff00bb8d63e18bf823eb89ec86e9f30b997c6d152a35ec6c4c8502ea080", size = 849382, upload-time = "2025-02-18T11:18:51.183Z" },
{ url = "https://files.pythonhosted.org/packages/6c/99/237d618fa6707300a60b8b4b859855e4e34dadb00233dc1e92d911166ae2/libipld-3.0.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:8d517c69b8f29acca27b0ced0ecb78f6e54f70952a35bc8f3060b628069c63ec", size = 841299, upload-time = "2025-02-18T11:18:53.398Z" },
{ url = "https://files.pythonhosted.org/packages/93/49/32c73fd530fab341bebc4e400657f5c2189a8d4d627bcdeb774eb37dd90f/libipld-3.0.1-cp313-cp313-win32.whl", hash = "sha256:21989622e02a3bd8be16e97c412af4f48b5ddf3b32f9b0da9d7c6b0724d01e91", size = 182567, upload-time = "2025-02-18T11:18:54.635Z" },
{ url = "https://files.pythonhosted.org/packages/7f/1e/ea73ea525d716ce836367daa212d4d0b1c25a89ffa281c9fee535cb99840/libipld-3.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:da81784d00597a0c9ac0a133ac820aaea60599b077778046dde4726e1a08685c", size = 196204, upload-time = "2025-02-18T11:18:55.706Z" },
{ url = "https://files.pythonhosted.org/packages/e2/ba/56e9082bdd997c41b3e58d3afb9d40cf08725cbd486f7e334538a41bc2a8/libipld-3.0.1-cp313-cp313-win_arm64.whl", hash = "sha256:d670dea8a76188e2977b5c3d780a6393bb270b0d04976436ce3afbc2cf4da516", size = 177044, upload-time = "2025-02-18T11:18:56.786Z" },
]

[[package]]
name = "pycparser"
version = "2.22"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1d/b2/31537cf4b1ca988837256c910a668b553fceb8f069bedc4b1c826024b52c/pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6", size = 172736, upload-time = "2024-03-30T13:22:22.564Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/13/a3/a812df4e2dd5696d1f351d58b8fe16a405b234ad2886a0dab9183fb78109/pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc", size = 117552, upload-time = "2024-03-30T13:22:20.476Z" },
]

[[package]]
name = "pydantic"
version = "2.11.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "annotated-types" },
{ name = "pydantic-core" },
{ name = "typing-extensions" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f0/86/8ce9040065e8f924d642c58e4a344e33163a07f6b57f836d0d734e0ad3fb/pydantic-2.11.5.tar.gz", hash = "sha256:7f853db3d0ce78ce8bbb148c401c2cdd6431b3473c0cdff2755c7690952a7b7a", size = 787102, upload-time = "2025-05-22T21:18:08.761Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b5/69/831ed22b38ff9b4b64b66569f0e5b7b97cf3638346eb95a2147fdb49ad5f/pydantic-2.11.5-py3-none-any.whl", hash = "sha256:f9c26ba06f9747749ca1e5c94d6a85cb84254577553c8785576fd38fa64dc0f7", size = 444229, upload-time = "2025-05-22T21:18:06.329Z" },
]

[[package]]
name = "pydantic-core"
version = "2.33.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" },
{ url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" },
{ url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" },
{ url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" },
{ url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" },
{ url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" },
{ url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" },
{ url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" },
{ url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" },
{ url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" },
{ url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" },
{ url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" },
{ url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" },
{ url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" },
{ url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" },
{ url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" },
{ url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" },
{ url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" },
{ url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" },
{ url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" },
{ url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" },
{ url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" },
{ url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" },
{ url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" },
{ url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" },
{ url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" },
{ url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" },
{ url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" },
{ url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" },
{ url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" },
{ url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" },
]

[[package]]
···

[[package]]
name = "requests"
version = "2.32.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
···
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218, upload-time = "2024-05-29T15:37:49.536Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928, upload-time = "2024-05-29T15:37:47.027Z" },
]

[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
]

[[package]]
name = "typing-extensions"
version = "4.14.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d1/bc/51647cd02527e87d05cb083ccc402f93e441606ff1f01739a62c8ad09ba5/typing_extensions-4.14.0.tar.gz", hash = "sha256:8676b788e32f02ab42d9e7c61324048ae4c6d844a399eebace3d4979d75ceef4", size = 107423, upload-time = "2025-06-02T14:52:11.399Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/69/e0/552843e0d356fbb5256d21449fa957fa4eff3bbc135a74a691ee70c7c5da/typing_extensions-4.14.0-py3-none-any.whl", hash = "sha256:a1514509136dd0b477638fc68d6a91497af5076466ad0fa6c338e44e359944af", size = 43839, upload-time = "2025-06-02T14:52:10.026Z" },
]

[[package]]
name = "typing-inspection"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" },
]

[[package]]
name = "urllib3"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8a/78/16493d9c386d8e60e442a35feac5e00f0913c0f4b7c217c11e8ec2ff53e0/urllib3-2.4.0.tar.gz", hash = "sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466", size = 390672, upload-time = "2025-04-10T15:23:39.232Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6b/11/cc635220681e93a0183390e26485430ca2c7b5f9d33b15c74c2861cb8091/urllib3-2.4.0-py3-none-any.whl", hash = "sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813", size = 128680, upload-time = "2025-04-10T15:23:37.377Z" },
]

[[package]]
name = "websockets"
version = "13.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e2/73/9223dbc7be3dcaf2a7bbf756c351ec8da04b1fa573edaf545b95f6b0c7fd/websockets-13.1.tar.gz", hash = "sha256:a3b3366087c1bc0a2795111edcadddb8b3b59509d5db5d7ea3fdd69f954a8878", size = 158549, upload-time = "2024-09-21T17:34:21.54Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/df/46/c426282f543b3c0296cf964aa5a7bb17e984f58dde23460c3d39b3148fcf/websockets-13.1-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:9d75baf00138f80b48f1eac72ad1535aac0b6461265a0bcad391fc5aba875cfc", size = 157821, upload-time = "2024-09-21T17:32:56.442Z" },
{ url = "https://files.pythonhosted.org/packages/aa/85/22529867010baac258da7c45848f9415e6cf37fef00a43856627806ffd04/websockets-13.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:9b6f347deb3dcfbfde1c20baa21c2ac0751afaa73e64e5b693bb2b848efeaa49", size = 155480, upload-time = "2024-09-21T17:32:57.698Z" },
{ url = "https://files.pythonhosted.org/packages/29/2c/bdb339bfbde0119a6e84af43ebf6275278698a2241c2719afc0d8b0bdbf2/websockets-13.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:de58647e3f9c42f13f90ac7e5f58900c80a39019848c5547bc691693098ae1bd", size = 155715, upload-time = "2024-09-21T17:32:59.429Z" },
{ url = "https://files.pythonhosted.org/packages/9f/d0/8612029ea04c5c22bf7af2fd3d63876c4eaeef9b97e86c11972a43aa0e6c/websockets-13.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1b54689e38d1279a51d11e3467dd2f3a50f5f2e879012ce8f2d6943f00e83f0", size = 165647, upload-time = "2024-09-21T17:33:00.495Z" },
{ url = "https://files.pythonhosted.org/packages/56/04/1681ed516fa19ca9083f26d3f3a302257e0911ba75009533ed60fbb7b8d1/websockets-13.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf1781ef73c073e6b0f90af841aaf98501f975d306bbf6221683dd594ccc52b6", size = 164592, upload-time = "2024-09-21T17:33:02.223Z" },
{ url = "https://files.pythonhosted.org/packages/38/6f/a96417a49c0ed132bb6087e8e39a37db851c70974f5c724a4b2a70066996/websockets-13.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d23b88b9388ed85c6faf0e74d8dec4f4d3baf3ecf20a65a47b836d56260d4b9", size = 165012, upload-time = "2024-09-21T17:33:03.288Z" },
{ url = "https://files.pythonhosted.org/packages/40/8b/fccf294919a1b37d190e86042e1a907b8f66cff2b61e9befdbce03783e25/websockets-13.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3c78383585f47ccb0fcf186dcb8a43f5438bd7d8f47d69e0b56f71bf431a0a68", size = 165311, upload-time = "2024-09-21T17:33:04.728Z" },
{ url = "https://files.pythonhosted.org/packages/c1/61/f8615cf7ce5fe538476ab6b4defff52beb7262ff8a73d5ef386322d9761d/websockets-13.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d6d300f8ec35c24025ceb9b9019ae9040c1ab2f01cddc2bcc0b518af31c75c14", size = 164692, upload-time = "2024-09-21T17:33:05.829Z" },
{ url = "https://files.pythonhosted.org/packages/5c/f1/a29dd6046d3a722d26f182b783a7997d25298873a14028c4760347974ea3/websockets-13.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a9dcaf8b0cc72a392760bb8755922c03e17a5a54e08cca58e8b74f6902b433cf", size = 164686, upload-time = "2024-09-21T17:33:06.823Z" },
{ url = "https://files.pythonhosted.org/packages/0f/99/ab1cdb282f7e595391226f03f9b498f52109d25a2ba03832e21614967dfa/websockets-13.1-cp312-cp312-win32.whl", hash = "sha256:2f85cf4f2a1ba8f602298a853cec8526c2ca42a9a4b947ec236eaedb8f2dc80c", size = 158712, upload-time = "2024-09-21T17:33:07.877Z" },
{ url = "https://files.pythonhosted.org/packages/46/93/e19160db48b5581feac8468330aa11b7292880a94a37d7030478596cc14e/websockets-13.1-cp312-cp312-win_amd64.whl", hash = "sha256:38377f8b0cdeee97c552d20cf1865695fcd56aba155ad1b4ca8779a5b6ef4ac3", size = 159145, upload-time = "2024-09-21T17:33:09.202Z" },
{ url = "https://files.pythonhosted.org/packages/51/20/2b99ca918e1cbd33c53db2cace5f0c0cd8296fc77558e1908799c712e1cd/websockets-13.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a9ab1e71d3d2e54a0aa646ab6d4eebfaa5f416fe78dfe4da2839525dc5d765c6", size = 157828, upload-time = "2024-09-21T17:33:10.987Z" },
{ url = "https://files.pythonhosted.org/packages/b8/47/0932a71d3d9c0e9483174f60713c84cee58d62839a143f21a2bcdbd2d205/websockets-13.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b9d7439d7fab4dce00570bb906875734df13d9faa4b48e261c440a5fec6d9708", size = 155487, upload-time = "2024-09-21T17:33:12.153Z" },
{ url = "https://files.pythonhosted.org/packages/a9/60/f1711eb59ac7a6c5e98e5637fef5302f45b6f76a2c9d64fd83bbb341377a/websockets-13.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:327b74e915cf13c5931334c61e1a41040e365d380f812513a255aa804b183418", size = 155721, upload-time = "2024-09-21T17:33:13.909Z" },
{ url = "https://files.pythonhosted.org/packages/6a/e6/ba9a8db7f9d9b0e5f829cf626ff32677f39824968317223605a6b419d445/websockets-13.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:325b1ccdbf5e5725fdcb1b0e9ad4d2545056479d0eee392c291c1bf76206435a", size = 165609, upload-time = "2024-09-21T17:33:14.967Z" },
{ url = "https://files.pythonhosted.org/packages/c1/22/4ec80f1b9c27a0aebd84ccd857252eda8418ab9681eb571b37ca4c5e1305/websockets-13.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:346bee67a65f189e0e33f520f253d5147ab76ae42493804319b5716e46dddf0f", size = 164556, upload-time = "2024-09-21T17:33:17.113Z" },
{ url = "https://files.pythonhosted.org/packages/27/ac/35f423cb6bb15600438db80755609d27eda36d4c0b3c9d745ea12766c45e/websockets-13.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:91a0fa841646320ec0d3accdff5b757b06e2e5c86ba32af2e0815c96c7a603c5", size = 164993, upload-time = "2024-09-21T17:33:18.168Z" },
{ url = "https://files.pythonhosted.org/packages/31/4e/98db4fd267f8be9e52e86b6ee4e9aa7c42b83452ea0ea0672f176224b977/websockets-13.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:18503d2c5f3943e93819238bf20df71982d193f73dcecd26c94514f417f6b135", size = 165360, upload-time = "2024-09-21T17:33:19.233Z" },
{ url = "https://files.pythonhosted.org/packages/3f/15/3f0de7cda70ffc94b7e7024544072bc5b26e2c1eb36545291abb755d8cdb/websockets-13.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:a9cd1af7e18e5221d2878378fbc287a14cd527fdd5939ed56a18df8a31136bb2", size = 164745, upload-time = "2024-09-21T17:33:20.361Z" },
{ url = "https://files.pythonhosted.org/packages/a1/6e/66b6b756aebbd680b934c8bdbb6dcb9ce45aad72cde5f8a7208dbb00dd36/websockets-13.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:70c5be9f416aa72aab7a2a76c90ae0a4fe2755c1816c153c1a2bcc3333ce4ce6", size = 164732, upload-time = "2024-09-21T17:33:23.103Z" },
{ url = "https://files.pythonhosted.org/packages/35/c6/12e3aab52c11aeb289e3dbbc05929e7a9d90d7a9173958477d3ef4f8ce2d/websockets-13.1-cp313-cp313-win32.whl", hash = "sha256:624459daabeb310d3815b276c1adef475b3e6804abaf2d9d2c061c319f7f187d", size = 158709, upload-time = "2024-09-21T17:33:24.196Z" },
{ url = "https://files.pythonhosted.org/packages/41/d8/63d6194aae711d7263df4498200c690a9c39fb437ede10f3e157a6343e0d/websockets-13.1-cp313-cp313-win_amd64.whl", hash = "sha256:c518e84bb59c2baae725accd355c8dc517b4a3ed8db88b4bc93c78dae2974bf2", size = 159144, upload-time = "2024-09-21T17:33:25.96Z" },
{ url = "https://files.pythonhosted.org/packages/56/27/96a5cd2626d11c8280656c6c71d8ab50fe006490ef9971ccd154e0c42cd2/websockets-13.1-py3-none-any.whl", hash = "sha256:a9a396a6ad26130cdae92ae10c36af09d9bfe6cafe69670fd3b6da9b07b4044f", size = 152134, upload-time = "2024-09-21T17:34:19.904Z" },
]

[[package]]
name = "xpost"
version = "0.0.3"
source = { virtual = "." }
dependencies = [
{ name = "atproto" },
{ name = "click" },
{ name = "python-magic" },
{ name = "requests" },
{ name = "websockets" },
]

[package.metadata]
requires-dist = [
{ name = "atproto", specifier = ">=0.0.61" },
{ name = "click", specifier = ">=8.2.1" },
{ name = "python-magic", specifier = ">=0.4.27" },
{ name = "requests", specifier = ">=2.32.3" },
{ name = "websockets", specifier = ">=13.1" },
]