Monorepo for wisp.place, a static site hosting service built on top of the AT Protocol. https://wisp.place


+6
.dockerignore
···
*.log
.vscode
.idea
+.prettierrc
+testDeploy
+.tangled
+.crush
+.claude
+hosting-service
+1
.gitignore
···
# production
/build
+/result
# misc
.DS_Store
-3
.gitmodules
···
-[submodule "cli/jacquard"]
-path = cli/jacquard
-url = https://tangled.org/@nonbinary.computer/jacquard
-1
.tangled/workflows/deploy-wisp.yml
···
- name: 'Deploy to Wisp.place'
command: |
-echo
./cli/target/release/wisp-cli \
"$WISP_HANDLE" \
--path "$SITE_PATH" \
+4
.tangled/workflows/test.yml
···
- name: install dependencies
command: |
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+# have to regenerate, otherwise it won't install the necessary dependencies to run
+rm -rf bun.lock package-lock.json
+bun install @oven/bun-linux-aarch64
bun install
- name: run all tests
+10 -15
Dockerfile
···
WORKDIR /app
# Copy package files
-COPY package.json bun.lock* ./
+COPY package.json ./
+
+# Copy Bun configuration
+COPY bunfig.toml ./
+
+COPY tsconfig.json ./
# Install dependencies
-RUN bun install --frozen-lockfile
+RUN bun install
# Copy source code
COPY src ./src
COPY public ./public
-# Build the application (if needed)
-# RUN bun run build
-
-# Set environment variables (can be overridden at runtime)
-ENV PORT=3000
+ENV PORT=8000
ENV NODE_ENV=production
-# Expose the application port
-EXPOSE 3000
-
-# Health check
-HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
-  CMD bun -e "fetch('http://localhost:3000/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
+EXPOSE 8000
-# Start the application
-CMD ["bun", "src/index.ts"]
+CMD ["bun", "start"]
+99 -5
README.md
···
# Wisp.place
-A static site hosting service built on the AT Protocol. [https://wisp.place](https://wisp.place)
-/src is the main backend
+Decentralized static site hosting on the AT Protocol. [https://wisp.place](https://wisp.place)
+
+## What is this?
-/hosting-service is the microservice that serves on-disk caches of sites pulled from the firehose and pdses
+Host static sites in your AT Protocol repo, served with CDN distribution. Your PDS holds the cryptographically signed manifest and files - the source of truth. Hosting services index and serve them fast.
-/cli is the wisp-cli, a way to upload sites directly to the pds
+## Quick Start
-full readme soon
+```bash
+# Using the web interface
+# Visit https://wisp.place and sign in
+
+# Or use the CLI
+cd cli
+cargo build --release
+./target/release/wisp-cli your-handle.bsky.social --path ./my-site --site my-site
+```
+
+Your site appears at `https://sites.wisp.place/{your-did}/{site-name}` or your custom domain.
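
A quick way to check a deployment from code, using `fetch` (the DID and site name below are hypothetical values, purely for illustration):

```ts
// Hypothetical DID and site name -- substitute your own.
const res = await fetch(
  "https://sites.wisp.place/did:plc:abc123xyz/my-site/index.html"
);
console.log(res.status, (await res.text()).slice(0, 80));
```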
+
+## Architecture
+
+- **`/src`** - Main backend (OAuth, site management, custom domains)
+- **`/hosting-service`** - Microservice that serves cached sites from disk
+- **`/cli`** - Rust CLI for direct PDS uploads
+- **`/public`** - React frontend
+
+### How it works
+
+1. Sites stored as `place.wisp.fs` records in your AT Protocol repo
+2. Files compressed (gzip) and base64-encoded as blobs (see the sketch below)
+3. Hosting service watches firehose, caches sites locally
+4. Sites served via custom domains or `*.wisp.place` subdomains
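
To make step 2 concrete, here is a minimal TypeScript sketch of the gzip + base64 encoding described above. The record shape and field names (`files`, `path`, `content`) are assumptions for the example, not the actual `place.wisp.fs` lexicon defined in this repo:

```ts
// Illustrative sketch only -- field names are assumptions, not the real lexicon.
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

function encodeFile(path: string): string {
  const raw = readFileSync(path);        // original file bytes
  const compressed = gzipSync(raw);      // step 2: gzip compression
  return compressed.toString("base64");  // step 2: base64 encoding for the blob payload
}

// Hypothetical record shape for a site manifest entry
const record = {
  $type: "place.wisp.fs",
  files: [{ path: "index.html", content: encodeFile("./index.html") }],
};
console.log(JSON.stringify(record).length, "bytes of record JSON");
```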
+
+## Development
+
+```bash
+# Backend
+bun install
+bun run src/index.ts
+
+# Hosting service
+cd hosting-service
+npm run start
+
+# CLI
+cd cli
+cargo build
+```
+
+## Features
+
+### URL Redirects and Rewrites
+
+The hosting service supports Netlify-style `_redirects` files for managing URLs. Place a `_redirects` file in your site root to enable:
+
+- **301/302 Redirects**: Permanent and temporary URL redirects
+- **200 Rewrites**: Serve different content without changing the URL
+- **404 Custom Pages**: Custom error pages for specific paths
+- **Splats & Placeholders**: Dynamic path matching (`/blog/:year/:month/:day`, `/news/*`)
+- **Query Parameter Matching**: Redirect based on URL parameters
+- **Conditional Redirects**: Route by country, language, or cookie presence
+- **Force Redirects**: Override existing files with redirects
+
+Example `_redirects`:
+```
+# Single-page app routing (React, Vue, etc.)
+/* /index.html 200
+
+# Simple redirects
+/home /
+/old-blog/* /blog/:splat
+
+# API proxy
+/api/* https://api.example.com/:splat 200
+
+# Country-based routing
+/ /us/ 302 Country=us
+/ /uk/ 302 Country=gb
+```
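
As a rough illustration of how a splat rule such as `/old-blog/* /blog/:splat` maps an incoming path, here is a minimal TypeScript sketch. It is not the hosting service's actual `_redirects` parser, just the matching idea:

```ts
// Minimal splat-matching sketch -- not the real implementation.
function applyRule(rulePath: string, target: string, requestPath: string): string | null {
  if (rulePath.endsWith("/*")) {
    const prefix = rulePath.slice(0, -1);            // "/old-blog/*" -> "/old-blog/"
    if (!requestPath.startsWith(prefix)) return null;
    const splat = requestPath.slice(prefix.length);  // remainder after the prefix
    return target.replace(":splat", splat);
  }
  return requestPath === rulePath ? target : null;   // exact-match rules like "/home /"
}

console.log(applyRule("/old-blog/*", "/blog/:splat", "/old-blog/2024/hello-world"));
// -> "/blog/2024/hello-world"
```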
+
+## Limits
+
+- Max file size: 100MB (PDS limit)
+- Max files: 2000
+
+## Tech Stack
+
+- Backend: Bun + Elysia + PostgreSQL
+- Frontend: React 19 + Tailwind 4 + Radix UI
+- Hosting: Node microservice using Hono
+- CLI: Rust + Jacquard (AT Protocol library)
+- Protocol: AT Protocol OAuth + custom lexicons
+
+## License
+
+MIT
+
+## Links
+
+- [AT Protocol](https://atproto.com)
+- [Jacquard Library](https://tangled.org/@nonbinary.computer/jacquard)
-41
api.md
···
-/**
- * AUTHENTICATION ROUTES
- *
- * Handles OAuth authentication flow for Bluesky/ATProto accounts
- * All routes are on the editor.wisp.place subdomain
- *
- * Routes:
- * POST /api/auth/signin - Initiate OAuth sign-in flow
- * GET /api/auth/callback - OAuth callback handler (redirect from PDS)
- * GET /api/auth/status - Check current authentication status
- * POST /api/auth/logout - Sign out and clear session
- */
-
-/**
- * CUSTOM DOMAIN ROUTES
- *
- * Handles custom domain (BYOD - Bring Your Own Domain) management
- * Users can claim custom domains with DNS verification (TXT + CNAME)
- * and map them to their sites
- *
- * Routes:
- * GET /api/check-domain - Fast verification check for routing (public)
- * GET /api/custom-domains - List user's custom domains
- * POST /api/custom-domains/check - Check domain availability and DNS config
- * POST /api/custom-domains/claim - Claim a custom domain
- * PUT /api/custom-domains/:id/site - Update site mapping
- * DELETE /api/custom-domains/:id - Remove a custom domain
- * POST /api/custom-domains/:id/verify - Manually trigger verification
- */
-
-/**
- * WISP SITE MANAGEMENT ROUTES
- *
- * API endpoints for managing user's Wisp sites stored in ATProto repos
- * Handles reading site metadata, fetching content, updating sites, and uploads
- * All routes are on the editor.wisp.place subdomain
- *
- * Routes:
- * GET /wisp/sites - List all sites for authenticated user
- * POST /wisp/upload-files - Upload and deploy files as a site
- */
+154 -35
bun.lock
···
{
"lockfileVersion": 1,
+"configVersion": 0,
"workspaces": {
"": {
"name": "elysia-static",
···
"@elysiajs/openapi": "^1.4.11",
"@elysiajs/opentelemetry": "^1.4.6",
"@elysiajs/static": "^1.4.2",
+"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-dialog": "^1.1.15",
"@radix-ui/react-label": "^2.1.7",
"@radix-ui/react-radio-group": "^1.3.8",
"@radix-ui/react-slot": "^1.2.3",
"@radix-ui/react-tabs": "^1.1.13",
"@tanstack/react-query": "^5.90.2",
+"actor-typeahead": "^0.1.1",
+"atproto-ui": "^0.11.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"elysia": "latest",
"iron-session": "^8.0.4",
"lucide-react": "^0.546.0",
+"multiformats": "^13.4.1",
+"prismjs": "^1.30.0",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"tailwind-merge": "^3.3.1",
···
"@types/react-dom": "^19.2.1",
"bun-plugin-tailwind": "^0.1.2",
"bun-types": "latest",
+"esbuild": "0.26.0",
},
},
},
"trustedDependencies": [
"core-js",
+"cbor-extract",
+"bun",
"protobufjs",
],
"packages": {
+
"@atcute/atproto": ["@atcute/atproto@3.1.9", "", { "dependencies": { "@atcute/lexicons": "^1.2.2" } }, "sha512-DyWwHCTdR4hY2BPNbLXgVmm7lI+fceOwWbE4LXbGvbvVtSn+ejSVFaAv01Ra3kWDha0whsOmbJL8JP0QPpf1+w=="],
+
+
"@atcute/bluesky": ["@atcute/bluesky@3.2.10", "", { "dependencies": { "@atcute/atproto": "^3.1.9", "@atcute/lexicons": "^1.2.2" } }, "sha512-qwQWTzRf3umnh2u41gdU+xWYkbzGlKDupc3zeOB+YjmuP1N9wEaUhwS8H7vgrqr0xC9SGNDjeUVcjC4m5BPLBg=="],
+
+
"@atcute/client": ["@atcute/client@4.0.5", "", { "dependencies": { "@atcute/identity": "^1.1.1", "@atcute/lexicons": "^1.2.2" } }, "sha512-R8Qen8goGmEkynYGg2m6XFlVmz0GTDvQ+9w+4QqOob+XMk8/WDpF4aImev7WKEde/rV2gjcqW7zM8E6W9NShDA=="],
+
+
"@atcute/identity": ["@atcute/identity@1.1.1", "", { "dependencies": { "@atcute/lexicons": "^1.2.2", "@badrap/valita": "^0.4.6" } }, "sha512-zax42n693VEhnC+5tndvO2KLDTMkHOz8UExwmklvJv7R9VujfEwiSWhcv6Jgwb3ellaG8wjiQ1lMOIjLLvwh0Q=="],
+
+
"@atcute/identity-resolver": ["@atcute/identity-resolver@1.1.4", "", { "dependencies": { "@atcute/lexicons": "^1.2.2", "@atcute/util-fetch": "^1.0.3", "@badrap/valita": "^0.4.6" }, "peerDependencies": { "@atcute/identity": "^1.0.0" } }, "sha512-/SVh8vf2cXFJenmBnGeYF2aY3WGQm3cJeew5NWTlkqoy3LvJ5wkvKq9PWu4Tv653VF40rPOp6LOdVr9Fa+q5rA=="],
+
+
"@atcute/lexicons": ["@atcute/lexicons@1.2.2", "", { "dependencies": { "@standard-schema/spec": "^1.0.0", "esm-env": "^1.2.2" } }, "sha512-bgEhJq5Z70/0TbK5sx+tAkrR8FsCODNiL2gUEvS5PuJfPxmFmRYNWaMGehxSPaXWpU2+Oa9ckceHiYbrItDTkA=="],
+
+
"@atcute/tangled": ["@atcute/tangled@1.0.10", "", { "dependencies": { "@atcute/atproto": "^3.1.8", "@atcute/lexicons": "^1.2.2" } }, "sha512-DGconZIN5TpLBah+aHGbWI1tMsL7XzyVEbr/fW4CbcLWYKICU6SAUZ0YnZ+5GvltjlORWHUy7hfftvoh4zodIA=="],
+
+
"@atcute/util-fetch": ["@atcute/util-fetch@1.0.3", "", { "dependencies": { "@badrap/valita": "^0.4.6" } }, "sha512-f8zzTb/xlKIwv2OQ31DhShPUNCmIIleX6p7qIXwWwEUjX6x8skUtpdISSjnImq01LXpltGV5y8yhV4/Mlb7CRQ=="],
+
"@atproto-labs/did-resolver": ["@atproto-labs/did-resolver@0.2.2", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "zod": "^3.23.8" } }, "sha512-ca2B7xR43tVoQ8XxBvha58DXwIH8cIyKQl6lpOKGkPUrJuFoO4iCLlDiSDi2Ueh+yE1rMDPP/qveHdajgDX3WQ=="],
"@atproto-labs/fetch": ["@atproto-labs/fetch@0.2.3", "", { "dependencies": { "@atproto-labs/pipe": "0.1.1" } }, "sha512-NZtbJOCbxKUFRFKMpamT38PUQMY0hX0p7TG5AEYOPhZKZEP7dHZ1K2s1aB8MdVH0qxmqX7nQleNrrvLf09Zfdw=="],
-
"@atproto-labs/fetch-node": ["@atproto-labs/fetch-node@0.1.10", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "ipaddr.js": "^2.1.0", "undici": "^6.14.1" } }, "sha512-o7hGaonA71A6p7O107VhM6UBUN/g9tTyYohMp1q0Kf6xQ4npnuZYRSHSf2g6reSfGQJ1GoFNjBObETTT1ge/jQ=="],
+
"@atproto-labs/fetch-node": ["@atproto-labs/fetch-node@0.2.0", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "ipaddr.js": "^2.1.0", "undici": "^6.14.1" } }, "sha512-Krq09nH/aeoiU2s9xdHA0FjTEFWG9B5FFenipv1iRixCcPc7V3DhTNDawxG9gI8Ny0k4dBVS9WTRN/IDzBx86Q=="],
"@atproto-labs/handle-resolver": ["@atproto-labs/handle-resolver@0.3.2", "", { "dependencies": { "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "zod": "^3.23.8" } }, "sha512-KIerCzh3qb+zZoqWbIvTlvBY0XPq0r56kwViaJY/LTe/3oPO2JaqlYKS/F4dByWBhHK6YoUOJ0sWrh6PMJl40A=="],
-
"@atproto-labs/handle-resolver-node": ["@atproto-labs/handle-resolver-node@0.1.20", "", { "dependencies": { "@atproto-labs/fetch-node": "0.1.10", "@atproto-labs/handle-resolver": "0.3.2", "@atproto/did": "0.2.1" } }, "sha512-094EL61XN9M7vm22cloSOxk/gcTRaCK52vN7BYgXgdoEI8uJJMTFXenQqu+LRGwiCcjvyclcBqbaz0DzJep50Q=="],
+
"@atproto-labs/handle-resolver-node": ["@atproto-labs/handle-resolver-node@0.1.21", "", { "dependencies": { "@atproto-labs/fetch-node": "0.2.0", "@atproto-labs/handle-resolver": "0.3.2", "@atproto/did": "0.2.1" } }, "sha512-fuJy5Px5pGF3lJX/ATdurbT8tbmaFWtf+PPxAQDFy7ot2no3t+iaAgymhyxYymrssOuWs6BwOP8tyF3VrfdwtQ=="],
"@atproto-labs/identity-resolver": ["@atproto-labs/identity-resolver@0.3.2", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver": "0.3.2" } }, "sha512-MYxO9pe0WsFyi5HFdKAwqIqHfiF2kBPoVhAIuH/4PYHzGr799ED47xLhNMxR3ZUYrJm5+TQzWXypGZ0Btw1Ffw=="],
···
"@atproto-labs/simple-store-memory": ["@atproto-labs/simple-store-memory@0.1.4", "", { "dependencies": { "@atproto-labs/simple-store": "0.3.0", "lru-cache": "^10.2.0" } }, "sha512-3mKY4dP8I7yKPFj9VKpYyCRzGJOi5CEpOLPlRhoJyLmgs3J4RzDrjn323Oakjz2Aj2JzRU/AIvWRAZVhpYNJHw=="],
-
"@atproto/api": ["@atproto/api@0.17.3", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-pdQXhUAapNPdmN00W0vX5ta/aMkHqfgBHATt20X02XwxQpY2AnrPm2Iog4FyjsZqoHooAtCNV/NWJ4xfddJzsg=="],
+
"@atproto/api": ["@atproto/api@0.17.7", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-V+OJBZq9chcrD21xk1bUa6oc5DSKfQj5DmUPf5rmZncqL1w9ZEbS38H5cMyqqdhfgo2LWeDRdZHD0rvNyJsIaw=="],
"@atproto/common": ["@atproto/common@0.4.12", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@ipld/dag-cbor": "^7.0.3", "cbor-x": "^1.5.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, "sha512-NC+TULLQiqs6MvNymhQS5WDms3SlbIKGLf4n33tpftRJcalh507rI+snbcUb7TLIkKw7VO17qMqxEXtIdd5auQ=="],
···
"@atproto/jwk-webcrypto": ["@atproto/jwk-webcrypto@0.2.0", "", { "dependencies": { "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "zod": "^3.23.8" } }, "sha512-UmgRrrEAkWvxwhlwe30UmDOdTEFidlIzBC7C3cCbeJMcBN1x8B3KH+crXrsTqfWQBG58mXgt8wgSK3Kxs2LhFg=="],
-
"@atproto/lex-cli": ["@atproto/lex-cli@0.9.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "chalk": "^4.1.2", "commander": "^9.4.0", "prettier": "^3.2.5", "ts-morph": "^24.0.0", "yesno": "^0.4.0", "zod": "^3.23.8" }, "bin": { "lex": "dist/index.js" } }, "sha512-zun4jhD1dbjD7IHtLIjh/1UsMx+6E8+OyOT2GXYAKIj9N6wmLKM/v2OeQBKxcyqUmtRi57lxWnGikWjjU7pplQ=="],
+
"@atproto/lex-cli": ["@atproto/lex-cli@0.9.6", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "chalk": "^4.1.2", "commander": "^9.4.0", "prettier": "^3.2.5", "ts-morph": "^24.0.0", "yesno": "^0.4.0", "zod": "^3.23.8" }, "bin": { "lex": "dist/index.js" } }, "sha512-EedEKmURoSP735YwSDHsFrLOhZ4P2it8goCHv5ApWi/R9DFpOKOpmYfIXJ9MAprK8cw+yBnjDJbzpLJy7UXlTg=="],
"@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
-
"@atproto/oauth-client": ["@atproto/oauth-client@0.5.7", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/fetch": "0.2.3", "@atproto-labs/handle-resolver": "0.3.2", "@atproto-labs/identity-resolver": "0.3.2", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/oauth-types": "0.4.2", "@atproto/xrpc": "0.7.5", "core-js": "^3", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-pDvbvy9DCxrAJv7bAbBUzWrHZKhFy091HvEMZhr+EyZA6gSCGYmmQJG/coDj0oICSVQeafAZd+IxR0YUCWwmEg=="],
+
"@atproto/oauth-client": ["@atproto/oauth-client@0.5.8", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/fetch": "0.2.3", "@atproto-labs/handle-resolver": "0.3.2", "@atproto-labs/identity-resolver": "0.3.2", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/oauth-types": "0.5.0", "@atproto/xrpc": "0.7.5", "core-js": "^3", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-7YEym6d97+Dd73qGdkQTXi5La8xvCQxwRUDzzlR/NVAARa9a4YP7MCmqBJVeP2anT0By+DSAPyPDLTsxcjIcCg=="],
-
"@atproto/oauth-client-node": ["@atproto/oauth-client-node@0.3.9", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver-node": "0.1.20", "@atproto-labs/simple-store": "0.3.0", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "@atproto/jwk-webcrypto": "0.2.0", "@atproto/oauth-client": "0.5.7", "@atproto/oauth-types": "0.4.2" } }, "sha512-JdzwDQ8Gczl0lgfJNm7lG7omkJ4yu99IuGkkRWixpEvKY/jY/mDZaho+3pfd29SrUvwQOOx4Bm4l7DGeYwxxyA=="],
+
"@atproto/oauth-client-node": ["@atproto/oauth-client-node@0.3.10", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver-node": "0.1.21", "@atproto-labs/simple-store": "0.3.0", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "@atproto/jwk-webcrypto": "0.2.0", "@atproto/oauth-client": "0.5.8", "@atproto/oauth-types": "0.5.0" } }, "sha512-6khKlJqu1Ed5rt3rzcTD5hymB6JUjKdOHWYXwiphw4inkAIo6GxLCighI4eGOqZorYk2j8ueeTNB6KsgH0kcRw=="],
-
"@atproto/oauth-types": ["@atproto/oauth-types@0.4.2", "", { "dependencies": { "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "zod": "^3.23.8" } }, "sha512-gcfNTyFsPJcYDf79M0iKHykWqzxloscioKoerdIN3MTS3htiNOSgZjm2p8ho7pdrElLzea3qktuhTQI39j1XFQ=="],
+
"@atproto/oauth-types": ["@atproto/oauth-types@0.5.0", "", { "dependencies": { "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "zod": "^3.23.8" } }, "sha512-33xz7HcXhbl+XRqbIMVu3GE02iK1nKe2oMWENASsfZEYbCz2b9ZOarOFuwi7g4LKqpGowGp0iRKsQHFcq4SDaQ=="],
"@atproto/syntax": ["@atproto/syntax@0.4.1", "", {}, "sha512-CJdImtLAiFO+0z3BWTtxwk6aY5w4t8orHTMVJgkf++QRJWTxPbIFko/0hrkADB7n2EruDxDSeAgfUGehpH6ngw=="],
"@atproto/xrpc": ["@atproto/xrpc@0.7.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "zod": "^3.23.8" } }, "sha512-MUYNn5d2hv8yVegRL0ccHvTHAVj5JSnW07bkbiaz96UH45lvYNRVwt44z+yYVnb0/mvBzyD3/ZQ55TRGt7fHkA=="],
"@atproto/xrpc-server": ["@atproto/xrpc-server@0.9.5", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@atproto/xrpc": "^0.7.5", "cbor-x": "^1.5.1", "express": "^4.17.2", "http-errors": "^2.0.0", "mime-types": "^2.1.35", "rate-limiter-flexible": "^2.4.1", "uint8arrays": "3.0.0", "ws": "^8.12.0", "zod": "^3.23.8" } }, "sha512-V0srjUgy6mQ5yf9+MSNBLs457m4qclEaWZsnqIE7RfYywvntexTAbMoo7J7ONfTNwdmA9Gw4oLak2z2cDAET4w=="],
+
+
"@badrap/valita": ["@badrap/valita@0.4.6", "", {}, "sha512-4kdqcjyxo/8RQ8ayjms47HCWZIF5981oE5nIenbfThKDxWXtEHKipAOWlflpPJzZx9y/JWYQkp18Awr7VuepFg=="],
"@borewit/text-codec": ["@borewit/text-codec@0.1.1", "", {}, "sha512-5L/uBxmjaCIX5h8Z+uu+kA9BQLkc/Wl06UGR5ajNRxu+/XjonB5i8JpgFMrPj3LXTCPA0pv8yxUvbUi+QthGGA=="],
···
"@elysiajs/cors": ["@elysiajs/cors@1.4.0", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-pb0SCzBfFbFSYA/U40HHO7R+YrcXBJXOWgL20eSViK33ol1e20ru2/KUaZYo5IMUn63yaTJI/bQERuQ+77ND8g=="],
-
"@elysiajs/eden": ["@elysiajs/eden@1.4.3", "", { "peerDependencies": { "elysia": ">= 1.4.0-exp.0" } }, "sha512-mX0v5cTvTJiDsDWNEEyuoqudOvW5J+tXsvp/ZOJXJF3iCIEJI0Brvm78ymPrvwiOG4nUr3lS8BxUfbNf32DSXA=="],
+
"@elysiajs/eden": ["@elysiajs/eden@1.4.4", "", { "peerDependencies": { "elysia": ">= 1.4.0-exp.0" } }, "sha512-/LVqflmgUcCiXb8rz1iRq9Rx3SWfIV/EkoNqDFGMx+TvOyo8QHAygFXAVQz7RHs+jk6n6mEgpI6KlKBANoErsQ=="],
"@elysiajs/openapi": ["@elysiajs/openapi@1.4.11", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-d75bMxYJpN6qSDi/z9L1S7SLk1S/8Px+cTb3W2lrYzU8uQ5E0kXdy1oOMJEfTyVsz3OA19NP9KNxE7ztSbLBLg=="],
"@elysiajs/opentelemetry": ["@elysiajs/opentelemetry@1.4.6", "", { "dependencies": { "@opentelemetry/api": "^1.9.0", "@opentelemetry/instrumentation": "^0.200.0", "@opentelemetry/sdk-node": "^0.200.0" }, "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-jR7t4M6ZvMnBqzzHsNTL6y3sNq9jbGi2vKxbkizi/OO5tlvlKl/rnBGyFjZUjQ1Hte7rCz+2kfmgOQMhkjk+Og=="],
-
"@elysiajs/static": ["@elysiajs/static@1.4.2", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-lAEvdxeBhU/jX/hTzfoP+1AtqhsKNCwW4Q+tfNwAShWU6s4ZPQxR1hLoHBveeApofJt4HWEq/tBGvfFz3ODuKg=="],
+
"@elysiajs/static": ["@elysiajs/static@1.4.6", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-cd61aY/DHOVhlnBjzTBX8E1XANIrsCH8MwEGHeLMaZzNrz0gD4Q8Qsde2dFMzu81I7ZDaaZ2Rim9blSLtUrYBg=="],
+
+
"@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.26.0", "", { "os": "aix", "cpu": "ppc64" }, "sha512-hj0sKNCQOOo2fgyII3clmJXP28VhgDfU5iy3GNHlWO76KG6N7x4D9ezH5lJtQTG+1J6MFDAJXC1qsI+W+LvZoA=="],
+
+
"@esbuild/android-arm": ["@esbuild/android-arm@0.26.0", "", { "os": "android", "cpu": "arm" }, "sha512-C0hkDsYNHZkBtPxxDx177JN90/1MiCpvBNjz1f5yWJo1+5+c5zr8apjastpEG+wtPjo9FFtGG7owSsAxyKiHxA=="],
+
+
"@esbuild/android-arm64": ["@esbuild/android-arm64@0.26.0", "", { "os": "android", "cpu": "arm64" }, "sha512-DDnoJ5eoa13L8zPh87PUlRd/IyFaIKOlRbxiwcSbeumcJ7UZKdtuMCHa1Q27LWQggug6W4m28i4/O2qiQQ5NZQ=="],
+
+
"@esbuild/android-x64": ["@esbuild/android-x64@0.26.0", "", { "os": "android", "cpu": "x64" }, "sha512-bKDkGXGZnj0T70cRpgmv549x38Vr2O3UWLbjT2qmIkdIWcmlg8yebcFWoT9Dku7b5OV3UqPEuNKRzlNhjwUJ9A=="],
+
+
"@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.26.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-6Z3naJgOuAIB0RLlJkYc81An3rTlQ/IeRdrU3dOea8h/PvZSgitZV+thNuIccw0MuK1GmIAnAmd5TrMZad8FTQ=="],
+
+
"@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.26.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-OPnYj0zpYW0tHusMefyaMvNYQX5pNQuSsHFTHUBNp3vVXupwqpxofcjVsUx11CQhGVkGeXjC3WLjh91hgBG2xw=="],
+
+
"@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.26.0", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-jix2fa6GQeZhO1sCKNaNMjfj5hbOvoL2F5t+w6gEPxALumkpOV/wq7oUBMHBn2hY2dOm+mEV/K+xfZy3mrsxNQ=="],
+
+
"@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.26.0", "", { "os": "freebsd", "cpu": "x64" }, "sha512-tccJaH5xHJD/239LjbVvJwf6T4kSzbk6wPFerF0uwWlkw/u7HL+wnAzAH5GB2irGhYemDgiNTp8wJzhAHQ64oA=="],
+
+
"@esbuild/linux-arm": ["@esbuild/linux-arm@0.26.0", "", { "os": "linux", "cpu": "arm" }, "sha512-JY8NyU31SyRmRpuc5W8PQarAx4TvuYbyxbPIpHAZdr/0g4iBr8KwQBS4kiiamGl2f42BBecHusYCsyxi7Kn8UQ=="],
+
+
"@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.26.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-IMJYN7FSkLttYyTbsbme0Ra14cBO5z47kpamo16IwggzzATFY2lcZAwkbcNkWiAduKrTgFJP7fW5cBI7FzcuNQ=="],
+
+
"@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.26.0", "", { "os": "linux", "cpu": "ia32" }, "sha512-XITaGqGVLgk8WOHw8We9Z1L0lbLFip8LyQzKYFKO4zFo1PFaaSKsbNjvkb7O8kEXytmSGRkYpE8LLVpPJpsSlw=="],
+
+
"@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-MkggfbDIczStUJwq9wU7gQ7kO33d8j9lWuOCDifN9t47+PeI+9m2QVh51EI/zZQ1spZtFMC1nzBJ+qNGCjJnsg=="],
+
+
"@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-fUYup12HZWAeccNLhQ5HwNBPr4zXCPgUWzEq2Rfw7UwqwfQrFZ0SR/JljaURR8xIh9t+o1lNUFTECUTmaP7yKA=="],
+
+
"@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.26.0", "", { "os": "linux", "cpu": "ppc64" }, "sha512-MzRKhM0Ip+//VYwC8tialCiwUQ4G65WfALtJEFyU0GKJzfTYoPBw5XNWf0SLbCUYQbxTKamlVwPmcw4DgZzFxg=="],
+
+
"@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-QhCc32CwI1I4Jrg1enCv292sm3YJprW8WHHlyxJhae/dVs+KRWkbvz2Nynl5HmZDW/m9ZxrXayHzjzVNvQMGQA=="],
+
+
"@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.26.0", "", { "os": "linux", "cpu": "s390x" }, "sha512-1D6vi6lfI18aNT1aTf2HV+RIlm6fxtlAp8eOJ4mmnbYmZ4boz8zYDar86sIYNh0wmiLJEbW/EocaKAX6Yso2fw=="],
+
+
"@esbuild/linux-x64": ["@esbuild/linux-x64@0.26.0", "", { "os": "linux", "cpu": "x64" }, "sha512-rnDcepj7LjrKFvZkx+WrBv6wECeYACcFjdNPvVPojCPJD8nHpb3pv3AuR9CXgdnjH1O23btICj0rsp0L9wAnHA=="],
+
+
"@esbuild/netbsd-arm64": ["@esbuild/netbsd-arm64@0.26.0", "", { "os": "none", "cpu": "arm64" }, "sha512-FSWmgGp0mDNjEXXFcsf12BmVrb+sZBBBlyh3LwB/B9ac3Kkc8x5D2WimYW9N7SUkolui8JzVnVlWh7ZmjCpnxw=="],
+
+
"@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.26.0", "", { "os": "none", "cpu": "x64" }, "sha512-0QfciUDFryD39QoSPUDshj4uNEjQhp73+3pbSAaxjV2qGOEDsM67P7KbJq7LzHoVl46oqhIhJ1S+skKGR7lMXA=="],
+
+
"@esbuild/openbsd-arm64": ["@esbuild/openbsd-arm64@0.26.0", "", { "os": "openbsd", "cpu": "arm64" }, "sha512-vmAK+nHhIZWImwJ3RNw9hX3fU4UGN/OqbSE0imqljNbUQC3GvVJ1jpwYoTfD6mmXmQaxdJY6Hn4jQbLGJKg5Yw=="],
+
+
"@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.26.0", "", { "os": "openbsd", "cpu": "x64" }, "sha512-GPXF7RMkJ7o9bTyUsnyNtrFMqgM3X+uM/LWw4CeHIjqc32fm0Ir6jKDnWHpj8xHFstgWDUYseSABK9KCkHGnpg=="],
+
+
"@esbuild/openharmony-arm64": ["@esbuild/openharmony-arm64@0.26.0", "", { "os": "none", "cpu": "arm64" }, "sha512-nUHZ5jEYqbBthbiBksbmHTlbb5eElyVfs/s1iHQ8rLBq1eWsd5maOnDpCocw1OM8kFK747d1Xms8dXJHtduxSw=="],
+
+
"@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.26.0", "", { "os": "sunos", "cpu": "x64" }, "sha512-TMg3KCTCYYaVO+R6P5mSORhcNDDlemUVnUbb8QkboUtOhb5JWKAzd5uMIMECJQOxHZ/R+N8HHtDF5ylzLfMiLw=="],
+
+
"@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.26.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-apqYgoAUd6ZCb9Phcs8zN32q6l0ZQzQBdVXOofa6WvHDlSOhwCWgSfVQabGViThS40Y1NA4SCvQickgZMFZRlA=="],
+
+
"@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.26.0", "", { "os": "win32", "cpu": "ia32" }, "sha512-FGJAcImbJNZzLWu7U6WB0iKHl4RuY4TsXEwxJPl9UZLS47agIZuILZEX3Pagfw7I4J3ddflomt9f0apfaJSbaw=="],
+
+
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.26.0", "", { "os": "win32", "cpu": "x64" }, "sha512-WAckBKaVnmFqbEhbymrPK7M086DQMpL1XoRbpmN0iW8k5JSXjDRQBhcZNa0VweItknLq9eAeCL34jK7/CDcw7A=="],
-
"@grpc/grpc-js": ["@grpc/grpc-js@1.14.0", "", { "dependencies": { "@grpc/proto-loader": "^0.8.0", "@js-sdsl/ordered-map": "^4.4.2" } }, "sha512-N8Jx6PaYzcTRNzirReJCtADVoq4z7+1KQ4E70jTg/koQiMoUSN1kbNjPOqpPbhMFhfU1/l7ixspPl8dNY+FoUg=="],
+
"@grpc/grpc-js": ["@grpc/grpc-js@1.14.1", "", { "dependencies": { "@grpc/proto-loader": "^0.8.0", "@js-sdsl/ordered-map": "^4.4.2" } }, "sha512-sPxgEWtPUR3EnRJCEtbGZG2iX8LQDUls2wUS3o27jg07KqJFMq6YDeWvMo1wfpmy3rqRdS0rivpLwhqQtEyCuQ=="],
"@grpc/proto-loader": ["@grpc/proto-loader@0.8.0", "", { "dependencies": { "lodash.camelcase": "^4.3.0", "long": "^5.0.0", "protobufjs": "^7.5.3", "yargs": "^17.7.2" }, "bin": { "proto-loader-gen-types": "build/bin/proto-loader-gen-types.js" } }, "sha512-rc1hOQtjIWGxcxpb9aHAfLpIctjEnsDehj0DAiVfBlmT84uvR0uUtN2hEi/ecvWVjXUGf5qPF4qEgiLOx1YIMQ=="],
···
"@opentelemetry/sdk-trace-node": ["@opentelemetry/sdk-trace-node@2.0.0", "", { "dependencies": { "@opentelemetry/context-async-hooks": "2.0.0", "@opentelemetry/core": "2.0.0", "@opentelemetry/sdk-trace-base": "2.0.0" }, "peerDependencies": { "@opentelemetry/api": ">=1.0.0 <1.10.0" } }, "sha512-omdilCZozUjQwY3uZRBwbaRMJ3p09l4t187Lsdf0dGMye9WKD4NGcpgZRvqhI1dwcH6og+YXQEtoO9Wx3ykilg=="],
-
"@opentelemetry/semantic-conventions": ["@opentelemetry/semantic-conventions@1.37.0", "", {}, "sha512-JD6DerIKdJGmRp4jQyX5FlrQjA4tjOw1cvfsPAZXfOOEErMUHjPcPSICS+6WnM0nB0efSFARh0KAZss+bvExOA=="],
+
"@opentelemetry/semantic-conventions": ["@opentelemetry/semantic-conventions@1.38.0", "", {}, "sha512-kocjix+/sSggfJhwXqClZ3i9Y/MI0fp7b+g7kCRm6psy2dsf8uApTRclwG18h8Avm7C9+fnt+O36PspJ/OzoWg=="],
-
"@oven/bun-darwin-aarch64": ["@oven/bun-darwin-aarch64@1.3.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-WeXSaL29ylJEZMYHHW28QZ6rgAbxQ1KuNSZD9gvd3fPlo0s6s2PglvPArjjP07nmvIK9m4OffN0k4M98O7WmAg=="],
+
"@oven/bun-darwin-aarch64": ["@oven/bun-darwin-aarch64@1.3.2", "", { "os": "darwin", "cpu": "arm64" }, "sha512-licBDIbbLP5L5/S0+bwtJynso94XD3KyqSP48K59Sq7Mude6C7dR5ZujZm4Ut4BwZqUFfNOfYNMWBU5nlL7t1A=="],
-
"@oven/bun-darwin-x64": ["@oven/bun-darwin-x64@1.3.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-CFKjoUWQH0Oz3UHYfKbdKLq0wGryrFsTJEYq839qAwHQSECvVZYAnxVVDYUDa0yQFonhO2qSHY41f6HK+b7xtw=="],
+
"@oven/bun-darwin-x64": ["@oven/bun-darwin-x64@1.3.2", "", { "os": "darwin", "cpu": "x64" }, "sha512-hn8lLzsYyyh6ULo2E8v2SqtrWOkdQKJwapeVy1rDw7juTTeHY3KDudGWf4mVYteC9riZU6HD88Fn3nGwyX0eIg=="],
-
"@oven/bun-darwin-x64-baseline": ["@oven/bun-darwin-x64-baseline@1.3.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-+FSr/ub5vA/EkD3fMhHJUzYioSf/sXd50OGxNDAntVxcDu4tXL/81Ka3R/gkZmjznpLFIzovU/1Ts+b7dlkrfw=="],
+
"@oven/bun-darwin-x64-baseline": ["@oven/bun-darwin-x64-baseline@1.3.2", "", { "os": "darwin", "cpu": "x64" }, "sha512-UHxdtbyxdtNJUNcXtIrjx3Lmq8ji3KywlXtIHV/0vn9A8W5mulqOcryqUWMFVH9JTIIzmNn6Q/qVmXHTME63Ww=="],
-
"@oven/bun-linux-aarch64": ["@oven/bun-linux-aarch64@1.3.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-WHthS/eLkCNcp9pk4W8aubRl9fIUgt2XhHyLrP0GClB1FVvmodu/zIOtG0NXNpzlzB8+gglOkGo4dPjfVf4Z+g=="],
+
"@oven/bun-linux-aarch64": ["@oven/bun-linux-aarch64@1.3.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-5uZzxzvHU/z+3cZwN/A0H8G+enQ+9FkeJVZkE2fwK2XhiJZFUGAuWajCpy7GepvOWlqV7VjPaKi2+Qmr4IX7nQ=="],
-
"@oven/bun-linux-aarch64-musl": ["@oven/bun-linux-aarch64-musl@1.3.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-HT5sr7N8NDYbQRjAnT7ISpx64y+ewZZRQozOJb0+KQObKvg4UUNXGm4Pn1xA4/WPMZDDazjO8E2vtOQw1nJlAQ=="],
+
"@oven/bun-linux-aarch64-musl": ["@oven/bun-linux-aarch64-musl@1.3.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-OD9DYkjes7WXieBn4zQZGXWhRVZhIEWMDGCetZ3H4vxIuweZ++iul/CNX5jdpNXaJ17myb1ROMvmRbrqW44j3w=="],
-
"@oven/bun-linux-x64": ["@oven/bun-linux-x64@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-sGEWoJQXO4GDr0x4t/yJQ/Bq1yNkOdX9tHbZZ+DBGJt3z3r7jeb4Digv8xQUk6gdTFC9vnGHuin+KW3/yD1Aww=="],
+
"@oven/bun-linux-x64": ["@oven/bun-linux-x64@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-EoEuRP9bxAxVKuvi6tZ0ZENjueP4lvjz0mKsMzdG0kwg/2apGKiirH1l0RIcdmvfDGGuDmNiv/XBpkoXq1x8ug=="],
-
"@oven/bun-linux-x64-baseline": ["@oven/bun-linux-x64-baseline@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-OmlEH3nlxQyv7HOvTH21vyNAZGv9DIPnrTznzvKiOQxkOphhCyKvPTlF13ydw4s/i18iwaUrhHy+YG9HSSxa4Q=="],
+
"@oven/bun-linux-x64-baseline": ["@oven/bun-linux-x64-baseline@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-m9Ov9YH8KjRLui87eNtQQFKVnjGsNk3xgbrR9c8d2FS3NfZSxmVjSeBvEsDjzNf1TXLDriHb/NYOlpiMf/QzDg=="],
-
"@oven/bun-linux-x64-musl": ["@oven/bun-linux-x64-musl@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-rtzUEzCynl3Rhgn/iR9DQezSFiZMcAXAbU+xfROqsweMGKwvwIA2ckyyckO08psEP8XcUZTs3LT9CH7PnaMiEA=="],
+
"@oven/bun-linux-x64-musl": ["@oven/bun-linux-x64-musl@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-3TuOsRVoG8K+soQWRo+Cp5ACpRs6rTFSu5tAqc/6WrqwbNWmqjov/eWJPTgz3gPXnC7uNKVG7RxxAmV8r2EYTQ=="],
-
"@oven/bun-linux-x64-musl-baseline": ["@oven/bun-linux-x64-musl-baseline@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-hrr7mDvUjMX1tuJaXz448tMsgKIqGJBY8+rJqztKOw1U5+a/v2w5HuIIW1ce7ut0ZwEn+KIDvAujlPvpH33vpQ=="],
+
"@oven/bun-linux-x64-musl-baseline": ["@oven/bun-linux-x64-musl-baseline@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-q8Hto8hcpofPJjvuvjuwyYvhOaAzPw1F5vRUUeOJDmDwZ4lZhANFM0rUwchMzfWUJCD6jg8/EVQ8MiixnZWU0A=="],
-
"@oven/bun-windows-x64": ["@oven/bun-windows-x64@1.3.0", "", { "os": "win32", "cpu": "x64" }, "sha512-xXwtpZVVP7T+vkxcF/TUVVOGRjEfkByO4mKveKYb4xnHWV4u4NnV0oNmzyMKkvmj10to5j2h0oZxA4ZVVv4gfA=="],
+
"@oven/bun-windows-x64": ["@oven/bun-windows-x64@1.3.2", "", { "os": "win32", "cpu": "x64" }, "sha512-nZJUa5NprPYQ4Ii4cMwtP9PzlJJTp1XhxJ+A9eSn1Jfr6YygVWyN2KLjenyI93IcuBouBAaepDAVZZjH2lFBhg=="],
-
"@oven/bun-windows-x64-baseline": ["@oven/bun-windows-x64-baseline@1.3.0", "", { "os": "win32", "cpu": "x64" }, "sha512-/jVZ8eYjpYHLDFNoT86cP+AjuWvpkzFY+0R0a1bdeu0sQ6ILuy1FV6hz1hUAP390E09VCo5oP76fnx29giHTtA=="],
+
"@oven/bun-windows-x64-baseline": ["@oven/bun-windows-x64-baseline@1.3.2", "", { "os": "win32", "cpu": "x64" }, "sha512-s00T99MjB+xLOWq+t+wVaVBrry+oBOZNiTJijt+bmkp/MJptYS3FGvs7a+nkjLNzoNDoWQcXgKew6AaHES37Bg=="],
"@protobufjs/aspromise": ["@protobufjs/aspromise@1.1.2", "", {}, "sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ=="],
···
"@protobufjs/utf8": ["@protobufjs/utf8@1.1.0", "", {}, "sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw=="],
"@radix-ui/primitive": ["@radix-ui/primitive@1.1.3", "", {}, "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg=="],
+
+
"@radix-ui/react-checkbox": ["@radix-ui/react-checkbox@1.3.3", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-presence": "1.1.5", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-controllable-state": "1.2.2", "@radix-ui/react-use-previous": "1.1.1", "@radix-ui/react-use-size": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-wBbpv+NQftHDdG86Qc0pIyXk5IR3tM8Vd0nWLKDcX8nNn4nXFOFwsKuqw2okA/1D/mpaAkmuyndrPJTYDNZtFw=="],
"@radix-ui/react-collection": ["@radix-ui/react-collection@1.1.7", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-slot": "1.2.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-Fh9rGN0MoI4ZFUNyfFVNU4y9LUz93u9/0K+yLgA2bwRojxM8JU1DyvvMBabnZPBgMWREAJvU2jjVzq+LrFUglw=="],
···
"@radix-ui/react-id": ["@radix-ui/react-id@1.1.1", "", { "dependencies": { "@radix-ui/react-use-layout-effect": "1.1.1" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-kGkGegYIdQsOb4XjsfM97rXsiHaBwco+hFI66oO4s9LU+PLAC5oJ7khdOVFxkhsmlbpUqDAvXw11CluXP+jkHg=="],
-
"@radix-ui/react-label": ["@radix-ui/react-label@2.1.7", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-YT1GqPSL8kJn20djelMX7/cTRp/Y9w5IZHvfxQTVHrOqa2yMl7i/UfMqKRU5V7mEyKTrUVgJXhNQPVCG8PBLoQ=="],
+
"@radix-ui/react-label": ["@radix-ui/react-label@2.1.8", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.4" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-FmXs37I6hSBVDlO4y764TNz1rLgKwjJMQ0EGte6F3Cb3f4bIuHB/iLa/8I9VKkmOy+gNHq8rql3j686ACVV21A=="],
"@radix-ui/react-portal": ["@radix-ui/react-portal@1.1.9", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-layout-effect": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-bpIxvq03if6UNwXZ+HTK71JLh4APvnXntDc6XOX8UVq4XQOVl7lwok0AvIl+b8zgCw3fSaVTZMpAPPagXbKmHQ=="],
···
"@radix-ui/react-roving-focus": ["@radix-ui/react-roving-focus@1.1.11", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-collection": "1.1.7", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-direction": "1.1.1", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-callback-ref": "1.1.1", "@radix-ui/react-use-controllable-state": "1.2.2" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-7A6S9jSgm/S+7MdtNDSb+IU859vQqJ/QAtcYQcfFC6W8RS4IxIZDldLR0xqCFZ6DCyrQLjLPsxtTNch5jVA4lA=="],
-
"@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
"@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.4", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-Jl+bCv8HxKnlTLVrcDE8zTMJ09R9/ukw4qBs/oZClOfoQk/cOTbDn+NceXfV7j09YPVQUryJPHurafcSg6EVKA=="],
"@radix-ui/react-tabs": ["@radix-ui/react-tabs@1.1.13", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-direction": "1.1.1", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-presence": "1.1.5", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-roving-focus": "1.1.11", "@radix-ui/react-use-controllable-state": "1.2.2" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-7xdcatg7/U+7+Udyoj2zodtI9H/IIopqo+YOIcZOq1nJwXWBZ9p8xiu5llXlekDbZkca79a/fozEYQXIA4sW6A=="],
···
"@sinclair/typebox": ["@sinclair/typebox@0.34.41", "", {}, "sha512-6gS8pZzSXdyRHTIqoqSVknxolr1kzfy4/CeDnrzsVz8TTIWUbOBr6gnzOmTYJ3eXQNh4IYHIGi5aIL7sOZ2G/g=="],
-
"@tanstack/query-core": ["@tanstack/query-core@5.90.2", "", {}, "sha512-k/TcR3YalnzibscALLwxeiLUub6jN5EDLwKDiO7q5f4ICEoptJ+n9+7vcEFy5/x/i6Q+Lb/tXrsKCggf5uQJXQ=="],
+
"@standard-schema/spec": ["@standard-schema/spec@1.0.0", "", {}, "sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA=="],
+
+
"@tanstack/query-core": ["@tanstack/query-core@5.90.7", "", {}, "sha512-6PN65csiuTNfBMXqQUxQhCNdtm1rV+9kC9YwWAIKcaxAauq3Wu7p18j3gQY3YIBJU70jT/wzCCZ2uqto/vQgiQ=="],
-
"@tanstack/react-query": ["@tanstack/react-query@5.90.2", "", { "dependencies": { "@tanstack/query-core": "5.90.2" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-CLABiR+h5PYfOWr/z+vWFt5VsOA2ekQeRQBFSKlcoW6Ndx/f8rfyVmq4LbgOM4GG2qtxAxjLYLOpCNTYm4uKzw=="],
+
"@tanstack/react-query": ["@tanstack/react-query@5.90.7", "", { "dependencies": { "@tanstack/query-core": "5.90.7" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-wAHc/cgKzW7LZNFloThyHnV/AX9gTg3w5yAv0gvQHPZoCnepwqCMtzbuPbb2UvfvO32XZ46e8bPOYbfZhzVnnQ=="],
"@tokenizer/inflate": ["@tokenizer/inflate@0.2.7", "", { "dependencies": { "debug": "^4.4.0", "fflate": "^0.8.2", "token-types": "^6.0.0" } }, "sha512-MADQgmZT1eKjp06jpI2yozxaU9uVs4GzzgSL+uEq7bVcJ9V1ZXQkeGNql1fsSI0gMy1vhvNTNbUqrx+pZfJVmg=="],
···
"@ts-morph/common": ["@ts-morph/common@0.25.0", "", { "dependencies": { "minimatch": "^9.0.4", "path-browserify": "^1.0.1", "tinyglobby": "^0.2.9" } }, "sha512-kMnZz+vGGHi4GoHnLmMhGNjm44kGtKUXGnOvrKmMwAuvNjM/PgKVGfUnL7IDvK7Jb2QQ82jq3Zmp04Gy+r3Dkg=="],
-
"@types/node": ["@types/node@24.7.2", "", { "dependencies": { "undici-types": "~7.14.0" } }, "sha512-/NbVmcGTP+lj5oa4yiYxxeBjRivKQ5Ns1eSZeB99ExsEQ6rX5XYU1Zy/gGxY/ilqtD4Etx9mKyrPxZRetiahhA=="],
+
"@types/node": ["@types/node@24.10.0", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-qzQZRBqkFsYyaSWXuEHc2WR9c0a0CXwiE5FWUvn7ZM+vdy1uZLfCunD38UzhuB7YN/J11ndbDBcTmOdxJo9Q7A=="],
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
-
"@types/react-dom": ["@types/react-dom@19.2.1", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-/EEvYBdT3BflCWvTMO7YkYBHVE9Ci6XdqZciZANQgKpaiDRGOLIlRo91jbTNRQjgPFWVaRxcYc0luVNFitz57A=="],
+
"@types/react-dom": ["@types/react-dom@19.2.2", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-9KQPoO6mZCi7jcIStSnlOWn2nEF3mNmyr3rIAsGnAbQKYbRLyqmeSc39EVgtxXVia+LMT8j3knZLAZAh+xLmrw=="],
"@types/shimmer": ["@types/shimmer@1.2.0", "", {}, "sha512-UE7oxhQLLd9gub6JKIAhDq06T0F6FnztwMNRvYgjeQSBeMc1ZG/tA47EwfduvkuQS8apbkM/lpLpWsaCeYsXVg=="],
···
"acorn-import-attributes": ["acorn-import-attributes@1.9.5", "", { "peerDependencies": { "acorn": "^8" } }, "sha512-n02Vykv5uA3eHGM/Z2dQrcD56kL8TyDb2p1+0P83PClMnC/nc+anbQRhIOWnSq4Ke/KvDPrY3C9hDtC/A3eHnQ=="],
+
"actor-typeahead": ["actor-typeahead@0.1.1", "", {}, "sha512-ilsBwzplKwMSBiO6Tg6RdaZ5xxqgXds5jCQuHV+ib9Aq3ja9g0T7u2Y1PmihotmS7l5RxhpGI/tPm3ljoRDRwg=="],
+
"ansi-regex": ["ansi-regex@5.0.1", "", {}, "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ=="],
"ansi-styles": ["ansi-styles@4.3.0", "", { "dependencies": { "color-convert": "^2.0.1" } }, "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg=="],
···
"atomic-sleep": ["atomic-sleep@1.0.0", "", {}, "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ=="],
+
"atproto-ui": ["atproto-ui@0.11.3", "", { "dependencies": { "@atcute/atproto": "^3.1.7", "@atcute/bluesky": "^3.2.3", "@atcute/client": "^4.0.3", "@atcute/identity-resolver": "^1.1.3", "@atcute/tangled": "^1.0.10" }, "peerDependencies": { "react": "^18.2.0 || ^19.0.0", "react-dom": "^18.2.0 || ^19.0.0" }, "optionalPeers": ["react-dom"] }, "sha512-NIBsORuo9lpCpr1SNKcKhNvqOVpsEy9IoHqFe1CM9gNTArpQL1hUcoP1Cou9a1O5qzCul9kaiu5xBHnB81I/WQ=="],
+
"await-lock": ["await-lock@2.2.2", "", {}, "sha512-aDczADvlvTGajTDjcjpJMqRkOF6Qdz3YbPZm/PyW6tKPkx2hlYBzxMhEywM/tU72HrVZjgl5VCdRuMlA7pZ8Gw=="],
"balanced-match": ["balanced-match@1.0.2", "", {}, "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw=="],
···
"buffer": ["buffer@6.0.3", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" } }, "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA=="],
-
"bun": ["bun@1.3.0", "", { "optionalDependencies": { "@oven/bun-darwin-aarch64": "1.3.0", "@oven/bun-darwin-x64": "1.3.0", "@oven/bun-darwin-x64-baseline": "1.3.0", "@oven/bun-linux-aarch64": "1.3.0", "@oven/bun-linux-aarch64-musl": "1.3.0", "@oven/bun-linux-x64": "1.3.0", "@oven/bun-linux-x64-baseline": "1.3.0", "@oven/bun-linux-x64-musl": "1.3.0", "@oven/bun-linux-x64-musl-baseline": "1.3.0", "@oven/bun-windows-x64": "1.3.0", "@oven/bun-windows-x64-baseline": "1.3.0" }, "os": [ "linux", "win32", "darwin", ], "cpu": [ "x64", "arm64", ], "bin": { "bun": "bin/bun.exe", "bunx": "bin/bunx.exe" } }, "sha512-YI7mFs7iWc/VsGsh2aw6eAPD2cjzn1j+LKdYVk09x1CrdTWKYIHyd+dG5iQoN9//3hCDoZj8U6vKpZzEf5UARA=="],
+
"bun": ["bun@1.3.2", "", { "optionalDependencies": { "@oven/bun-darwin-aarch64": "1.3.2", "@oven/bun-darwin-x64": "1.3.2", "@oven/bun-darwin-x64-baseline": "1.3.2", "@oven/bun-linux-aarch64": "1.3.2", "@oven/bun-linux-aarch64-musl": "1.3.2", "@oven/bun-linux-x64": "1.3.2", "@oven/bun-linux-x64-baseline": "1.3.2", "@oven/bun-linux-x64-musl": "1.3.2", "@oven/bun-linux-x64-musl-baseline": "1.3.2", "@oven/bun-windows-x64": "1.3.2", "@oven/bun-windows-x64-baseline": "1.3.2" }, "os": [ "linux", "win32", "darwin", ], "cpu": [ "x64", "arm64", ], "bin": { "bun": "bin/bun.exe", "bunx": "bin/bunx.exe" } }, "sha512-x75mPJiEfhO1j4Tfc65+PtW6ZyrAB6yTZInydnjDZXF9u9PRAnr6OK3v0Q9dpDl0dxRHkXlYvJ8tteJxc8t4Sw=="],
"bun-plugin-tailwind": ["bun-plugin-tailwind@0.1.2", "", { "peerDependencies": { "bun": ">=1.0.0" } }, "sha512-41jNC1tZRSK3s1o7pTNrLuQG8kL/0vR/JgiTmZAJ1eHwe0w5j6HFPKeqEk0WAD13jfrUC7+ULuewFBBCoADPpg=="],
-
"bun-types": ["bun-types@1.3.0", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-u8X0thhx+yJ0KmkxuEo9HAtdfgCBaM/aI9K90VQcQioAmkVp3SG3FkwWGibUFz3WdXAdcsqOcbU40lK7tbHdkQ=="],
+
"bun-types": ["bun-types@1.3.2", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-i/Gln4tbzKNuxP70OWhJRZz1MRfvqExowP7U6JKoI8cntFrtxg7RJK3jvz7wQW54UuvNC8tbKHHri5fy74FVqg=="],
"bytes": ["bytes@3.1.2", "", {}, "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg=="],
···
"ee-first": ["ee-first@1.1.1", "", {}, "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow=="],
-
"elysia": ["elysia@1.4.11", "", { "dependencies": { "cookie": "^1.0.2", "exact-mirror": "0.2.2", "fast-decode-uri-component": "^1.0.1" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["typescript"] }, "sha512-cphuzQj0fRw1ICRvwHy2H3xQio9bycaZUVHnDHJQnKqBfMNlZ+Hzj6TMmt9lc0Az0mvbCnPXWVF7y1MCRhUuOA=="],
+
"elysia": ["elysia@1.4.15", "", { "dependencies": { "cookie": "^1.0.2", "exact-mirror": "0.2.2", "fast-decode-uri-component": "^1.0.1", "memoirist": "^0.4.0" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "@types/bun": ">= 1.2.0", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["@types/bun", "typescript"] }, "sha512-RaDqqZdLuC4UJetfVRQ4Z5aVpGgEtQ+pZnsbI4ZzEaf3l/MzuHcqSVoL/Fue3d6qE4RV9HMB2rAZaHyPIxkyzg=="],
"emoji-regex": ["emoji-regex@8.0.0", "", {}, "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="],
···
"es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA=="],
+
"esbuild": ["esbuild@0.26.0", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.26.0", "@esbuild/android-arm": "0.26.0", "@esbuild/android-arm64": "0.26.0", "@esbuild/android-x64": "0.26.0", "@esbuild/darwin-arm64": "0.26.0", "@esbuild/darwin-x64": "0.26.0", "@esbuild/freebsd-arm64": "0.26.0", "@esbuild/freebsd-x64": "0.26.0", "@esbuild/linux-arm": "0.26.0", "@esbuild/linux-arm64": "0.26.0", "@esbuild/linux-ia32": "0.26.0", "@esbuild/linux-loong64": "0.26.0", "@esbuild/linux-mips64el": "0.26.0", "@esbuild/linux-ppc64": "0.26.0", "@esbuild/linux-riscv64": "0.26.0", "@esbuild/linux-s390x": "0.26.0", "@esbuild/linux-x64": "0.26.0", "@esbuild/netbsd-arm64": "0.26.0", "@esbuild/netbsd-x64": "0.26.0", "@esbuild/openbsd-arm64": "0.26.0", "@esbuild/openbsd-x64": "0.26.0", "@esbuild/openharmony-arm64": "0.26.0", "@esbuild/sunos-x64": "0.26.0", "@esbuild/win32-arm64": "0.26.0", "@esbuild/win32-ia32": "0.26.0", "@esbuild/win32-x64": "0.26.0" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-3Hq7jri+tRrVWha+ZeIVhl4qJRha/XjRNSopvTsOaCvfPHrflTYTcUFcEjMKdxofsXXsdc4zjg5NOTnL4Gl57Q=="],
+
"escalade": ["escalade@3.2.0", "", {}, "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA=="],
"escape-html": ["escape-html@1.0.3", "", {}, "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow=="],
+
+
"esm-env": ["esm-env@1.2.2", "", {}, "sha512-Epxrv+Nr/CaL4ZcFGPJIYLWFom+YeV1DqMLHJoEd9SYRxNbaFruBwfEX/kkHUJf55j2+TUbmDcmuilbP1TmXHA=="],
"etag": ["etag@1.8.1", "", {}, "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg=="],
···
"media-typer": ["media-typer@0.3.0", "", {}, "sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ=="],
+
"memoirist": ["memoirist@0.4.0", "", {}, "sha512-zxTgA0mSYELa66DimuNQDvyLq36AwDlTuVRbnQtB+VuTcKWm5Qc4z3WkSpgsFWHNhexqkIooqpv4hdcqrX5Nmg=="],
+
"merge-descriptors": ["merge-descriptors@1.0.3", "", {}, "sha512-gaNvAS7TZ897/rVaZ0nMtAyxNyi/pdbjbAwUpFQpN70GqnVfOiXpeUUMKRBmzXaSQ8DdTX4/0ms62r2K+hE6mQ=="],
"methods": ["methods@1.1.2", "", {}, "sha512-iclAHeNqNm68zFtnZ0e+1L2yUIdvzNoauKU4WBA3VvH/vPFieF7qfRlwUZU+DA9P9bPXIS90ulxoUoCH23sV2w=="],
···
"ms": ["ms@2.0.0", "", {}, "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="],
-
"multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"multiformats": ["multiformats@13.4.1", "", {}, "sha512-VqO6OSvLrFVAYYjgsr8tyv62/rCQhPgsZUXLTqoFLSgdkgiUYKYeArbt1uWLlEpkjxQe+P0+sHlbPEte1Bi06Q=="],
"negotiator": ["negotiator@0.6.3", "", {}, "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg=="],
···
"pino-std-serializers": ["pino-std-serializers@6.2.2", "", {}, "sha512-cHjPPsE+vhj/tnhCy/wiMh3M3z3h/j15zHQX+S9GkTBgqJuTuJzYJ4gUyACLhDaJ7kk9ba9iRDmbH2tJU03OiA=="],
"prettier": ["prettier@3.6.2", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ=="],
+
+
"prismjs": ["prismjs@1.30.0", "", {}, "sha512-DEvV2ZF2r2/63V+tK8hQvrR2ZGn10srHbXviTlcv7Kpzw8jWiNTqbVgjO3IY8RxrrOUF8VPMQQFysYYYv0YZxw=="],
"process": ["process@0.11.10", "", {}, "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A=="],
···
"tailwind-merge": ["tailwind-merge@3.3.1", "", {}, "sha512-gBXpgUm/3rp1lMZZrM/w7D8GKqshif0zAymAhbCyIt8KMe+0v9DQ7cdYLR4FHH/cKpdTXb+A/tKKU3eolfsI+g=="],
-
"tailwindcss": ["tailwindcss@4.1.14", "", {}, "sha512-b7pCxjGO98LnxVkKjaZSDeNuljC4ueKUddjENJOADtubtdo8llTaJy7HwBMeLNSSo2N5QIAgklslK1+Ir8r6CA=="],
+
"tailwindcss": ["tailwindcss@4.1.17", "", {}, "sha512-j9Ee2YjuQqYT9bbRTfTZht9W/ytp5H+jJpZKiYdP/bpnXARAuELt9ofP0lPnmHjbga7SNQIxdTAXCmtKVYjN+Q=="],
"thread-stream": ["thread-stream@2.7.0", "", { "dependencies": { "real-require": "^0.2.0" } }, "sha512-qQiRWsU/wvNolI6tbbCKd9iKaTnCXsTwVxhhKM6nctPdujTyztjlbUkUTUymidWcMnZ5pWR0ej4a0tjsW021vw=="],
"tinyglobby": ["tinyglobby@0.2.15", "", { "dependencies": { "fdir": "^6.5.0", "picomatch": "^4.0.3" } }, "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ=="],
-
"tlds": ["tlds@1.260.0", "", { "bin": { "tlds": "bin.js" } }, "sha512-78+28EWBhCEE7qlyaHA9OR3IPvbCLiDh3Ckla593TksfFc9vfTsgvH7eS+dr3o9qr31gwGbogcI16yN91PoRjQ=="],
+
"tlds": ["tlds@1.261.0", "", { "bin": { "tlds": "bin.js" } }, "sha512-QXqwfEl9ddlGBaRFXIvNKK6OhipSiLXuRuLJX5DErz0o0Q0rYxulWLdFryTkV5PkdZct5iMInwYEGe/eR++1AA=="],
"toidentifier": ["toidentifier@1.0.1", "", {}, "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="],
···
"undici": ["undici@6.22.0", "", {}, "sha512-hU/10obOIu62MGYjdskASR3CUAiYaFTtC9Pa6vHyf//mAipSvSQg6od2CnJswq7fvzNS3zJhxoRkgNVaHurWKw=="],
-
"undici-types": ["undici-types@7.14.0", "", {}, "sha512-QQiYxHuyZ9gQUIrmPo3IA+hUl4KYk8uSA7cHrcKd/l3p1OTpZcM0Tbp9x7FAtXdAYhlasd60ncPpgu6ihG6TOA=="],
+
"undici-types": ["undici-types@7.16.0", "", {}, "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw=="],
"unpipe": ["unpipe@1.0.0", "", {}, "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ=="],
···
"zod": ["zod@3.25.76", "", {}, "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ=="],
+
"@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/common/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/common-web/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/jwk/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/oauth-client/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@ipld/dag-cbor/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@radix-ui/react-collection/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
+
"@radix-ui/react-dialog/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
+
"@radix-ui/react-label/@radix-ui/react-primitive": ["@radix-ui/react-primitive@2.1.4", "", { "dependencies": { "@radix-ui/react-slot": "1.2.4" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-9hQc4+GNVtJAIEPEqlYqW5RiYdrr8ea5XQ0ZOnD6fgru+83kqT15mq2OCcbe8KnjRZl5vF3ks69AKz3kh1jrhg=="],
+
+
"@radix-ui/react-primitive/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
"@tokenizer/inflate/debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
"express/cookie": ["cookie@0.7.1", "", {}, "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w=="],
···
"send/encodeurl": ["encodeurl@1.0.2", "", {}, "sha512-TPJXq8JqFaVYm2CWmPvnP2Iyo4ZSM7/QKcSmuMLDObfpH5fi7RUGmd/rTDf+rut/saiDiQEeVTNgAmJEdAOx0w=="],
"send/ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
+
+
"uint8arrays/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
"@tokenizer/inflate/debug/ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
+25
cli/.gitignore
···
+test/
+.DS_STORE
+jacquard/
+binaries/
+# Generated by Cargo
+# will have compiled files and executables
+debug
+target
+
+# These are backup files generated by rustfmt
+**/*.rs.bk
+
+# MSVC Windows builds of rustc generate these, which store debugging information
+*.pdb
+
+# Generated by cargo mutants
+# Contains mutation testing data
+**/mutants.out*/
+
+# RustRover
+# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
+# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
+# and can be added to the global gitignore or merged into this file. For a more nuclear
+# option (not recommended) you can uncomment the following to ignore the entire idea folder.
+#.idea/
+581 -170
cli/Cargo.lock
···
[[package]]
name = "async-compression"
-
version = "0.4.32"
+
version = "0.4.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "5a89bce6054c720275ac2432fbba080a66a2106a44a1b804553930ca6909f4e0"
+
checksum = "93c1f86859c1af3d514fa19e8323147ff10ea98684e6c7b307912509f50e67b2"
dependencies = [
"compression-codecs",
"compression-core",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
[[package]]
+
name = "axum"
+
version = "0.7.9"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "edca88bc138befd0323b20752846e6587272d3b03b0343c8ea28a6f819e6e71f"
+
dependencies = [
+
"async-trait",
+
"axum-core",
+
"bytes",
+
"futures-util",
+
"http",
+
"http-body",
+
"http-body-util",
+
"hyper",
+
"hyper-util",
+
"itoa",
+
"matchit",
+
"memchr",
+
"mime",
+
"percent-encoding",
+
"pin-project-lite",
+
"rustversion",
+
"serde",
+
"serde_json",
+
"serde_path_to_error",
+
"serde_urlencoded",
+
"sync_wrapper",
+
"tokio",
+
"tower 0.5.2",
+
"tower-layer",
+
"tower-service",
+
"tracing",
+
]
+
+
[[package]]
+
name = "axum-core"
+
version = "0.4.5"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "09f2bd6146b97ae3359fa0cc6d6b376d9539582c7b4220f041a33ec24c226199"
+
dependencies = [
+
"async-trait",
+
"bytes",
+
"futures-util",
+
"http",
+
"http-body",
+
"http-body-util",
+
"mime",
+
"pin-project-lite",
+
"rustversion",
+
"sync_wrapper",
+
"tower-layer",
+
"tower-service",
+
"tracing",
+
]
+
+
[[package]]
name = "backtrace"
version = "0.3.76"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"proc-macro2",
"quote",
"rustversion",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43"
[[package]]
+
name = "byteorder"
+
version = "1.5.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "1fd0f2584146f6f2ef48085050886acf353beff7305ebd1ae69500e27c67f64b"
+
+
[[package]]
name = "bytes"
version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "cc"
-
version = "1.2.44"
+
version = "1.2.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "37521ac7aabe3d13122dc382493e20c9416f299d2ccd5b3a5340a2570cdeb0f3"
+
checksum = "35900b6c8d709fb1d854671ae27aeaa9eec2f8b01b364e1619a40da3e6fe2afe"
dependencies = [
"find-msvc-tools",
"shlex",
···
"heck 0.5.0",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
name = "compression-codecs"
-
version = "0.4.31"
+
version = "0.4.32"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "ef8a506ec4b81c460798f572caead636d57d3d7e940f998160f52bd254bf2d23"
+
checksum = "680dc087785c5230f8e8843e2e57ac7c1c90488b6a91b88caa265410568f441b"
dependencies = [
"compression-core",
"flate2",
···
[[package]]
name = "compression-core"
-
version = "0.4.29"
+
version = "0.4.30"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "e47641d3deaf41fb1538ac1f54735925e275eaf3bf4d55c81b137fba797e5cbb"
+
checksum = "3a9b614a5787ef0c8802a55766480563cb3a93b435898c422ed2a359cf811582"
[[package]]
name = "const-oid"
···
checksum = "2f421161cb492475f1661ddc9815a745a1c894592070661180fdec3d4872e9c3"
[[package]]
+
name = "cordyceps"
+
version = "0.3.4"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "688d7fbb8092b8de775ef2536f36c8c31f2bc4006ece2e8d8ad2d17d00ce0a2a"
+
dependencies = [
+
"loom",
+
"tracing",
+
]
+
+
[[package]]
name = "core-foundation"
version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"proc-macro2",
"quote",
"strsim",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
dependencies = [
"darling_core",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "8d162beedaa69905488a8da94f5ac3edb4dd4788b732fadb7bd120b2625c1976"
dependencies = [
"data-encoding",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
]
[[package]]
+
name = "derive_more"
+
version = "1.0.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "4a9b99b9cbbe49445b21764dc0625032a89b145a2642e67603e1c936f5458d05"
+
dependencies = [
+
"derive_more-impl",
+
]
+
+
[[package]]
+
name = "derive_more-impl"
+
version = "1.0.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "cb7330aeadfbe296029522e6c40f315320aba36fc43a5b3632f3795348f3bd22"
+
dependencies = [
+
"proc-macro2",
+
"quote",
+
"syn 2.0.110",
+
"unicode-xid",
+
]
+
+
[[package]]
+
name = "diatomic-waker"
+
version = "0.2.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ab03c107fafeb3ee9f5925686dbb7a73bc76e3932abb0d2b365cb64b169cf04c"
+
+
[[package]]
name = "digest"
version = "0.10.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
"heck 0.5.0",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
-
name = "foreign-types"
-
version = "0.3.2"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
-
dependencies = [
-
"foreign-types-shared",
-
]
-
-
[[package]]
-
name = "foreign-types-shared"
-
version = "0.1.1"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
-
-
[[package]]
name = "form_urlencoded"
version = "1.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
]
[[package]]
+
name = "futures-buffered"
+
version = "0.2.12"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "a8e0e1f38ec07ba4abbde21eed377082f17ccb988be9d988a5adbf4bafc118fd"
+
dependencies = [
+
"cordyceps",
+
"diatomic-waker",
+
"futures-core",
+
"pin-project-lite",
+
"spin 0.10.0",
+
]
+
+
[[package]]
name = "futures-channel"
version = "0.3.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "9e5c1b78ca4aae1ac06c48a526a655760685149f0d465d21f37abfe57ce075c6"
[[package]]
+
name = "futures-lite"
+
version = "2.6.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f78e10609fe0e0b3f4157ffab1876319b5b0db102a2c60dc4626306dc46b44ad"
+
dependencies = [
+
"fastrand",
+
"futures-core",
+
"futures-io",
+
"parking",
+
"pin-project-lite",
+
]
+
+
[[package]]
name = "futures-macro"
version = "0.3.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
"pin-project-lite",
"pin-utils",
"slab",
+
]
+
+
[[package]]
+
name = "generator"
+
version = "0.8.7"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "605183a538e3e2a9c1038635cc5c2d194e2ee8fd0d1b66b8349fad7dbacce5a2"
+
dependencies = [
+
"cc",
+
"cfg-if",
+
"libc",
+
"log",
+
"rustversion",
+
"windows",
[[package]]
···
"markup5ever",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
[[package]]
+
name = "http-range-header"
+
version = "0.4.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9171a2ea8a68358193d15dd5d70c1c10a2afc3e7e4c5bc92bc9f025cebd7359c"
+
+
[[package]]
name = "httparse"
version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "hyper"
-
version = "1.7.0"
+
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "eb3aa54a13a0dfe7fbe3a59e0c76093041720fdc77b110cc0fc260fafb4dc51e"
+
checksum = "1744436df46f0bde35af3eda22aeaba453aada65d8f1c171cd8a5f59030bd69f"
dependencies = [
"atomic-waker",
"bytes",
···
"http",
"http-body",
"httparse",
+
"httpdate",
"itoa",
"pin-project-lite",
"pin-utils",
···
[[package]]
-
name = "hyper-tls"
-
version = "0.6.0"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "70206fc6890eaca9fde8a0bf71caa2ddfc9fe045ac9e5c70df101a7dbde866e0"
-
dependencies = [
-
"bytes",
-
"http-body-util",
-
"hyper",
-
"hyper-util",
-
"native-tls",
-
"tokio",
-
"tokio-native-tls",
-
"tower-service",
-
]
-
-
[[package]]
name = "hyper-util"
version = "0.1.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"js-sys",
"log",
"wasm-bindgen",
-
"windows-core",
+
"windows-core 0.62.2",
[[package]]
···
[[package]]
name = "iri-string"
-
version = "0.7.8"
+
version = "0.7.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "dbc5ebe9c3a1a7a5127f920a418f7585e9e758e911d0466ed004f393b0e380b2"
+
checksum = "4f867b9d1d896b67beb18518eda36fdb77a32ea590de864f1325b294a6d14397"
dependencies = [
"memchr",
"serde",
···
[[package]]
name = "jacquard"
version = "0.9.0"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"bytes",
"getrandom 0.2.16",
···
[[package]]
name = "jacquard-api"
version = "0.9.0"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"bon",
"bytes",
···
[[package]]
name = "jacquard-common"
version = "0.9.0"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"base64 0.22.1",
"bon",
"bytes",
"chrono",
+
"ciborium",
"cid",
+
"futures",
"getrandom 0.2.16",
"getrandom 0.3.4",
"http",
···
"miette",
"multibase",
"multihash",
+
"n0-future",
"ouroboros",
"p256",
"rand 0.9.2",
···
"smol_str",
"thiserror 2.0.17",
"tokio",
+
"tokio-tungstenite-wasm",
"tokio-util",
"trait-variant",
"url",
···
[[package]]
name = "jacquard-derive"
version = "0.9.0"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"heck 0.5.0",
"jacquard-lexicon",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
name = "jacquard-identity"
-
version = "0.9.0"
+
version = "0.9.1"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"bon",
"bytes",
···
[[package]]
name = "jacquard-lexicon"
-
version = "0.9.0"
+
version = "0.9.1"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"cid",
"dashmap",
···
"serde_repr",
"serde_with",
"sha2",
-
"syn 2.0.108",
+
"syn 2.0.110",
"thiserror 2.0.17",
"unicode-segmentation",
···
[[package]]
name = "jacquard-oauth"
version = "0.9.0"
+
source = "git+https://tangled.org/@nonbinary.computer/jacquard#5c79bb76de544cbd4fa8d5d8b01ba6e828f8ba65"
dependencies = [
"base64 0.22.1",
"bytes",
···
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe"
dependencies = [
-
"spin",
+
"spin 0.9.8",
[[package]]
···
checksum = "34080505efa8e45a4b816c349525ebe327ceaa8559756f0356cba97ef3bf7432"
[[package]]
+
name = "loom"
+
version = "0.7.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "419e0dc8046cb947daa77eb95ae174acfbddb7673b4151f56d1eed8e93fbfaca"
+
dependencies = [
+
"cfg-if",
+
"generator",
+
"scoped-tls",
+
"tracing",
+
"tracing-subscriber",
+
]
+
+
[[package]]
name = "lru-cache"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
+
name = "matchers"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "d1525a2a28c7f4fa0fc98bb91ae755d1e2d1505079e05539e35bc876b5d65ae9"
+
dependencies = [
+
"regex-automata",
+
]
+
+
[[package]]
+
name = "matchit"
+
version = "0.7.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "0e7465ac9959cc2b1404e8e2367b43684a6d13790fe23056cc8c6c5a6b7bcb94"
+
+
[[package]]
name = "memchr"
version = "2.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
[[package]]
-
name = "native-tls"
-
version = "0.2.14"
+
name = "n0-future"
+
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "87de3442987e9dbec73158d5c715e7ad9072fda936bb03d19d7fa10e00520f0e"
+
checksum = "7bb0e5d99e681ab3c938842b96fcb41bf8a7bb4bfdb11ccbd653a7e83e06c794"
dependencies = [
-
"libc",
-
"log",
-
"openssl",
-
"openssl-probe",
-
"openssl-sys",
-
"schannel",
-
"security-framework",
-
"security-framework-sys",
-
"tempfile",
+
"cfg_aliases",
+
"derive_more",
+
"futures-buffered",
+
"futures-lite",
+
"futures-util",
+
"js-sys",
+
"pin-project",
+
"send_wrapper",
+
"tokio",
+
"tokio-util",
+
"wasm-bindgen",
+
"wasm-bindgen-futures",
+
"web-time",
[[package]]
···
[[package]]
+
name = "nu-ansi-term"
+
version = "0.50.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "7957b9740744892f114936ab4a57b3f487491bbeafaf8083688b16841a4240e5"
+
dependencies = [
+
"windows-sys 0.61.2",
+
]
+
+
[[package]]
name = "num-bigint-dig"
-
version = "0.8.5"
+
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "82c79c15c05d4bf82b6f5ef163104cc81a760d8e874d38ac50ab67c8877b647b"
+
checksum = "e661dda6640fad38e827a6d4a310ff4763082116fe217f279885c97f511bb0b7"
dependencies = [
"lazy_static",
"libm",
···
checksum = "384b8ab6d37215f3c5301a95a4accb5d64aa607f1fcb26a11b5303878451b4fe"
[[package]]
-
name = "openssl"
-
version = "0.10.74"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "24ad14dd45412269e1a30f52ad8f0664f0f4f4a89ee8fe28c3b3527021ebb654"
-
dependencies = [
-
"bitflags",
-
"cfg-if",
-
"foreign-types",
-
"libc",
-
"once_cell",
-
"openssl-macros",
-
"openssl-sys",
-
]
-
-
[[package]]
-
name = "openssl-macros"
-
version = "0.1.1"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
-
dependencies = [
-
"proc-macro2",
-
"quote",
-
"syn 2.0.108",
-
]
-
-
[[package]]
name = "openssl-probe"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d05e27ee213611ffe7d6348b942e8f942b37114c00cc03cec254295a4a17852e"
-
-
[[package]]
-
name = "openssl-sys"
-
version = "0.9.110"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "0a9f0075ba3c21b09f8e8b2026584b1d18d49388648f2fbbf3c97ea8deced8e2"
-
dependencies = [
-
"cc",
-
"libc",
-
"pkg-config",
-
"vcpkg",
-
]
[[package]]
name = "option-ext"
···
"proc-macro2",
"proc-macro2-diagnostics",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
"elliptic-curve",
"primeorder",
+
+
[[package]]
+
name = "parking"
+
version = "2.2.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f38d5652c16fde515bb1ecef450ab0f6a219d619a7274976324d5e377f7dceba"
[[package]]
name = "parking_lot"
···
[[package]]
+
name = "pin-project"
+
version = "1.1.10"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "677f1add503faace112b9f1373e43e9e054bfdd22ff1a63c1bc485eaec6a6a8a"
+
dependencies = [
+
"pin-project-internal",
+
]
+
+
[[package]]
+
name = "pin-project-internal"
+
version = "1.1.10"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861"
+
dependencies = [
+
"proc-macro2",
+
"quote",
+
"syn 2.0.110",
+
]
+
+
[[package]]
name = "pin-project-lite"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"der",
"spki",
-
-
[[package]]
-
name = "pkg-config"
-
version = "0.3.32"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "7edddbd0b52d732b21ad9a5fab5c704c14cd949e5e9a1ec5929a24fded1b904c"
[[package]]
name = "potential_utf"
···
checksum = "479ca8adacdd7ce8f1fb39ce9ecccbfe93a3f1344b3d0d97f20bc0196208f62b"
dependencies = [
"proc-macro2",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"version_check",
"yansi",
···
[[package]]
name = "quote"
-
version = "1.0.41"
+
version = "1.0.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "ce25767e7b499d1b604768e7cde645d14cc8584231ea6b295e9c9eb22c02e1d1"
+
checksum = "a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f"
dependencies = [
"proc-macro2",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
"http-body-util",
"hyper",
"hyper-rustls",
-
"hyper-tls",
"hyper-util",
"js-sys",
"log",
"mime",
-
"native-tls",
"percent-encoding",
"pin-project-lite",
"quinn",
···
"serde_urlencoded",
"sync_wrapper",
"tokio",
-
"tokio-native-tls",
"tokio-rustls",
"tokio-util",
-
"tower",
-
"tower-http",
+
"tower 0.5.2",
+
"tower-http 0.6.6",
"tower-service",
"url",
"wasm-bindgen",
···
[[package]]
name = "rustls"
-
version = "0.23.34"
+
version = "0.23.35"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "6a9586e9ee2b4f8fab52a0048ca7334d7024eef48e2cb9407e3497bb7cab7fa7"
+
checksum = "533f54bc6a7d4f647e46ad909549eda97bf5afc1585190ef692b4286b198bd8f"
dependencies = [
"once_cell",
"ring",
···
"rustls-webpki",
"subtle",
"zeroize",
+
]
+
+
[[package]]
+
name = "rustls-native-certs"
+
version = "0.8.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9980d917ebb0c0536119ba501e90834767bffc3d60641457fd84a1f3fd337923"
+
dependencies = [
+
"openssl-probe",
+
"rustls-pki-types",
+
"schannel",
+
"security-framework",
[[package]]
···
[[package]]
name = "schemars"
-
version = "1.0.4"
+
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "82d20c4491bc164fa2f6c5d44565947a52ad80b9505d8e36f8d54c27c739fcd0"
+
checksum = "9558e172d4e8533736ba97870c4b2cd63f84b382a3d6eb063da41b91cce17289"
dependencies = [
"dyn-clone",
"ref-cast",
···
[[package]]
+
name = "scoped-tls"
+
version = "1.0.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294"
+
+
[[package]]
name = "scopeguard"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "security-framework"
-
version = "2.11.1"
+
version = "3.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "897b2245f0b511c87893af39b033e5ca9cce68824c4d7e7630b5a1d339658d02"
+
checksum = "b3297343eaf830f66ede390ea39da1d462b6b0c1b000f420d0a83f898bbbe6ef"
dependencies = [
"bitflags",
-
"core-foundation 0.9.4",
+
"core-foundation 0.10.1",
"core-foundation-sys",
"libc",
"security-framework-sys",
···
[[package]]
+
name = "send_wrapper"
+
version = "0.6.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "cd0b0ec5f1c1ca621c432a25813d8d60c88abe6d3e08a3eb9cf37d97a0fe3d73"
+
+
[[package]]
name = "serde"
version = "1.0.228"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
[[package]]
+
name = "serde_path_to_error"
+
version = "0.1.20"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "10a9ff822e371bb5403e391ecd83e182e0e77ba7f6fe0160b795797109d1b457"
+
dependencies = [
+
"itoa",
+
"serde",
+
"serde_core",
+
]
+
+
[[package]]
name = "serde_repr"
version = "0.1.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
"indexmap 1.9.3",
"indexmap 2.12.0",
"schemars 0.9.0",
-
"schemars 1.0.4",
+
"schemars 1.1.0",
"serde_core",
"serde_json",
"serde_with_macros",
···
"darling",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+
]
+
+
[[package]]
+
name = "sha1"
+
version = "0.10.6"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "e3bf829a2d51ab4a5ddf1352d8470c140cadc8301b2ae1789db023f01cedd6ba"
+
dependencies = [
+
"cfg-if",
+
"cpufeatures",
+
"digest",
[[package]]
···
"cfg-if",
"cpufeatures",
"digest",
+
]
+
+
[[package]]
+
name = "sharded-slab"
+
version = "0.1.7"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6"
+
dependencies = [
+
"lazy_static",
[[package]]
···
checksum = "6980e8d7511241f8acf4aebddbb1ff938df5eebe98691418c4468d0b72a96a67"
[[package]]
+
name = "spin"
+
version = "0.10.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "d5fe4ccb98d9c292d56fec89a5e07da7fc4cf0dc11e156b41793132775d3e591"
+
+
[[package]]
name = "spki"
version = "0.7.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"quote",
"serde",
"sha2",
-
"syn 2.0.108",
+
"syn 2.0.110",
"thiserror 1.0.69",
···
[[package]]
name = "syn"
-
version = "2.0.108"
+
version = "2.0.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "da58917d35242480a05c2897064da0a80589a2a0476c9a3f2fdc83b53502e917"
+
checksum = "a99801b5bd34ede4cf3fc688c5919368fea4e4814a4664359503e6015b280aea"
dependencies = [
"proc-macro2",
"quote",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+
]
+
+
[[package]]
+
name = "thread_local"
+
version = "1.1.9"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f60246a4944f24f6e018aa17cdeffb7818b76356965d03b07d6a9886e8962185"
+
dependencies = [
+
"cfg-if",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
-
name = "tokio-native-tls"
-
version = "0.3.1"
+
name = "tokio-rustls"
+
version = "0.26.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
+
checksum = "1729aa945f29d91ba541258c8df89027d5792d85a8841fb65e8bf0f4ede4ef61"
dependencies = [
-
"native-tls",
+
"rustls",
"tokio",
[[package]]
-
name = "tokio-rustls"
-
version = "0.26.4"
+
name = "tokio-tungstenite"
+
version = "0.24.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "1729aa945f29d91ba541258c8df89027d5792d85a8841fb65e8bf0f4ede4ef61"
+
checksum = "edc5f74e248dc973e0dbb7b74c7e0d6fcc301c694ff50049504004ef4d0cdcd9"
dependencies = [
+
"futures-util",
+
"log",
"rustls",
+
"rustls-native-certs",
+
"rustls-pki-types",
"tokio",
+
"tokio-rustls",
+
"tungstenite",
+
]
+
+
[[package]]
+
name = "tokio-tungstenite-wasm"
+
version = "0.4.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "e21a5c399399c3db9f08d8297ac12b500e86bca82e930253fdc62eaf9c0de6ae"
+
dependencies = [
+
"futures-channel",
+
"futures-util",
+
"http",
+
"httparse",
+
"js-sys",
+
"rustls",
+
"thiserror 1.0.69",
+
"tokio",
+
"tokio-tungstenite",
+
"wasm-bindgen",
+
"web-sys",
[[package]]
name = "tokio-util"
-
version = "0.7.16"
+
version = "0.7.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "14307c986784f72ef81c89db7d9e28d6ac26d16213b109ea501696195e6e3ce5"
+
checksum = "2efa149fe76073d6e8fd97ef4f4eca7b67f599660115591483572e406e165594"
dependencies = [
"bytes",
"futures-core",
"futures-sink",
+
"futures-util",
"pin-project-lite",
"tokio",
+
]
+
+
[[package]]
+
name = "tower"
+
version = "0.4.13"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "b8fa9be0de6cf49e536ce1851f987bd21a43b771b09473c3549a6c853db37c1c"
+
dependencies = [
+
"tower-layer",
+
"tower-service",
+
"tracing",
[[package]]
···
"tokio",
"tower-layer",
"tower-service",
+
"tracing",
+
]
+
+
[[package]]
+
name = "tower-http"
+
version = "0.5.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "1e9cd434a998747dd2c4276bc96ee2e0c7a2eadf3cae88e52be55a05fa9053f5"
+
dependencies = [
+
"async-compression",
+
"bitflags",
+
"bytes",
+
"futures-core",
+
"futures-util",
+
"http",
+
"http-body",
+
"http-body-util",
+
"http-range-header",
+
"httpdate",
+
"mime",
+
"mime_guess",
+
"percent-encoding",
+
"pin-project-lite",
+
"tokio",
+
"tokio-util",
+
"tower-layer",
+
"tower-service",
+
"tracing",
[[package]]
···
"http-body",
"iri-string",
"pin-project-lite",
-
"tower",
+
"tower 0.5.2",
"tower-layer",
"tower-service",
···
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "784e0ac535deb450455cbfa28a6f0df145ea1bb7ae51b821cf5e7927fdcfbdd0"
dependencies = [
+
"log",
"pin-project-lite",
"tracing-attributes",
"tracing-core",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
checksum = "b9d12581f227e93f094d3af2ae690a574abb8a2b9b7a96e7cfe9647b2b617678"
dependencies = [
"once_cell",
+
"valuable",
+
]
+
+
[[package]]
+
name = "tracing-log"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3"
+
dependencies = [
+
"log",
+
"once_cell",
+
"tracing-core",
+
]
+
+
[[package]]
+
name = "tracing-subscriber"
+
version = "0.3.20"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "2054a14f5307d601f88daf0553e1cbf472acc4f2c51afab632431cdcd72124d5"
+
dependencies = [
+
"matchers",
+
"nu-ansi-term",
+
"once_cell",
+
"regex-automata",
+
"sharded-slab",
+
"smallvec",
+
"thread_local",
+
"tracing",
+
"tracing-core",
+
"tracing-log",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
checksum = "e421abadd41a4225275504ea4d6566923418b7f05506fbc9c0fe86ba7396114b"
[[package]]
+
name = "tungstenite"
+
version = "0.24.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "18e5b8366ee7a95b16d32197d0b2604b43a0be89dc5fac9f8e96ccafbaedda8a"
+
dependencies = [
+
"byteorder",
+
"bytes",
+
"data-encoding",
+
"http",
+
"httparse",
+
"log",
+
"rand 0.8.5",
+
"rustls",
+
"rustls-pki-types",
+
"sha1",
+
"thiserror 1.0.69",
+
"utf-8",
+
]
+
+
[[package]]
name = "twoway"
version = "0.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "b4ac048d71ede7ee76d585517add45da530660ef4390e49b098733c6e897f254"
[[package]]
+
name = "unicode-xid"
+
version = "0.2.6"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
+
+
[[package]]
name = "unsigned-varint"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821"
[[package]]
-
name = "vcpkg"
-
version = "0.2.15"
+
name = "valuable"
+
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "accd4ea62f7bb7a82fe23066fb0957d48ef677f6eeb8215f372f52e48bb32426"
+
checksum = "ba73ea9cf16a25df0c8caa16c51acb937d5712a8429db78a3ee29d5dcacd3a65"
[[package]]
name = "version_check"
···
"bumpalo",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"wasm-bindgen-shared",
···
[[package]]
+
name = "windows"
+
version = "0.61.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9babd3a767a4c1aef6900409f85f5d53ce2544ccdfaa86dad48c91782c6d6893"
+
dependencies = [
+
"windows-collections",
+
"windows-core 0.61.2",
+
"windows-future",
+
"windows-link 0.1.3",
+
"windows-numerics",
+
]
+
+
[[package]]
+
name = "windows-collections"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "3beeceb5e5cfd9eb1d76b381630e82c4241ccd0d27f1a39ed41b2760b255c5e8"
+
dependencies = [
+
"windows-core 0.61.2",
+
]
+
+
[[package]]
+
name = "windows-core"
+
version = "0.61.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "c0fdd3ddb90610c7638aa2b3a3ab2904fb9e5cdbecc643ddb3647212781c4ae3"
+
dependencies = [
+
"windows-implement",
+
"windows-interface",
+
"windows-link 0.1.3",
+
"windows-result 0.3.4",
+
"windows-strings 0.4.2",
+
]
+
+
[[package]]
name = "windows-core"
version = "0.62.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
+
name = "windows-future"
+
version = "0.2.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "fc6a41e98427b19fe4b73c550f060b59fa592d7d686537eebf9385621bfbad8e"
+
dependencies = [
+
"windows-core 0.61.2",
+
"windows-link 0.1.3",
+
"windows-threading",
+
]
+
+
[[package]]
name = "windows-implement"
version = "0.60.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f0805222e57f7521d6a62e36fa9163bc891acd422f971defe97d64e70d0a4fe5"
+
+
[[package]]
+
name = "windows-numerics"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9150af68066c4c5c07ddc0ce30421554771e528bde427614c61038bc2c92c2b1"
+
dependencies = [
+
"windows-core 0.61.2",
+
"windows-link 0.1.3",
+
]
[[package]]
name = "windows-registry"
···
[[package]]
+
name = "windows-threading"
+
version = "0.1.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "b66463ad2e0ea3bbf808b7f1d371311c80e115c0b71d60efc142cafbcfb057a6"
+
dependencies = [
+
"windows-link 0.1.3",
+
]
+
+
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "wisp-cli"
-
version = "0.1.0"
+
version = "0.2.0"
dependencies = [
+
"axum",
"base64 0.22.1",
"bytes",
+
"chrono",
"clap",
"flate2",
"futures",
···
"jacquard-oauth",
"miette",
"mime_guess",
+
"multibase",
+
"multihash",
+
"n0-future",
"reqwest",
"rustversion",
"serde",
"serde_json",
+
"sha2",
"shellexpand",
"tokio",
+
"tower 0.4.13",
+
"tower-http 0.5.2",
+
"url",
"walkdir",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"synstructure",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"synstructure",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+19 -9
cli/Cargo.toml
···
[package]
name = "wisp-cli"
-
version = "0.1.0"
+
version = "0.2.0"
edition = "2024"
[features]
···
place_wisp = []
[dependencies]
-
jacquard = { path = "jacquard/crates/jacquard", features = ["loopback"] }
-
jacquard-oauth = { path = "jacquard/crates/jacquard-oauth" }
-
jacquard-api = { path = "jacquard/crates/jacquard-api" }
-
jacquard-common = { path = "jacquard/crates/jacquard-common" }
-
jacquard-identity = { path = "jacquard/crates/jacquard-identity", features = ["dns"] }
-
jacquard-derive = { path = "jacquard/crates/jacquard-derive" }
-
jacquard-lexicon = { path = "jacquard/crates/jacquard-lexicon" }
+
jacquard = { git = "https://tangled.org/@nonbinary.computer/jacquard", features = ["loopback"] }
+
jacquard-oauth = { git = "https://tangled.org/@nonbinary.computer/jacquard" }
+
jacquard-api = { git = "https://tangled.org/@nonbinary.computer/jacquard" }
+
jacquard-common = { git = "https://tangled.org/@nonbinary.computer/jacquard", features = ["websocket"] }
+
jacquard-identity = { git = "https://tangled.org/@nonbinary.computer/jacquard", features = ["dns"] }
+
jacquard-derive = { git = "https://tangled.org/@nonbinary.computer/jacquard" }
+
jacquard-lexicon = { git = "https://tangled.org/@nonbinary.computer/jacquard" }
clap = { version = "4.5.51", features = ["derive"] }
tokio = { version = "1.48", features = ["full"] }
miette = { version = "7.6.0", features = ["fancy"] }
serde_json = "1.0.145"
serde = { version = "1.0", features = ["derive"] }
shellexpand = "3.1.1"
-
reqwest = "0.12"
+
#reqwest = "0.12"
+
reqwest = { version = "0.12", default-features = false, features = ["rustls-tls"] }
rustversion = "1.0"
flate2 = "1.0"
base64 = "0.22"
···
mime_guess = "2.0"
bytes = "1.10"
futures = "0.3.31"
+
multihash = "0.19.3"
+
multibase = "0.9"
+
sha2 = "0.10"
+
axum = "0.7"
+
tower-http = { version = "0.5", features = ["fs", "compression-gzip"] }
+
tower = "0.4"
+
n0-future = "0.1"
+
chrono = "0.4"
+
url = "2.5"
+271
cli/README.md
···
+
# Wisp CLI
+
+
A command-line tool for deploying static sites to your AT Protocol repo, where they are served by [wisp.place](https://wisp.place), an AT Protocol indexer for such sites.
+
+
## Why?
+
+
The PDS gives you a way to verifiably, cryptographically prove that you own your site - that it was you (or at least someone who controls your account) who uploaded it. The record is also a manifest of every file in the site, ensuring file integrity. Keeping hosting separate means you can move your site across other servers or even serverless solutions for speedy delivery, while staying backed by an absolute source of truth: the manifest record and the blobs of each file in your repo.
+
+
## Features
+
+
- Deploy static sites directly to your AT Protocol repo
+
- Supports both OAuth and app password authentication
+
- Preserves directory structure and file integrity
+
+
## Soon
+
+
- Host sites
+
- Manage and delete sites
+
- Metrics and logs for self-hosting
+
+
## Installation
+
+
### From Source
+
+
```bash
+
cargo build --release
+
```
+
+
Check out the build scripts for cross compilation using nix-shell.
+
+
The binary will be available at `target/release/wisp-cli`.
+
+
## Usage
+
+
### Basic Deployment
+
+
Deploy the current directory:
+
+
```bash
+
wisp-cli nekomimi.ppet --path . --site my-site
+
```
+
+
Deploy a specific directory:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./dist/ --site my-site
+
```
+
+
### Authentication Methods
+
+
#### OAuth (Recommended)
+
+
By default, the CLI uses OAuth authentication with a local loopback server:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site
+
```
+
+
This will:
+
1. Open your browser for authentication
+
2. Save the session to a file (default: `/tmp/wisp-oauth-session.json`)
+
3. Reuse the session for future deployments
+
+
Specify a custom session file location:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site --store ~/.wisp-session.json
+
```
+
+
#### App Password
+
+
For headless environments or CI/CD, use an app password:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site --password YOUR_APP_PASSWORD
+
```
+
+
**Note:** When using `--password`, the `--store` option is ignored.
+
+
## Command-Line Options
+
+
```
+
wisp-cli [OPTIONS] <INPUT>
+
+
Arguments:
+
<INPUT> Handle (e.g., alice.bsky.social), DID, or PDS URL
+
+
Options:
+
-p, --path <PATH> Path to the directory containing your static site [default: .]
+
-s, --site <SITE> Site name (defaults to directory name)
+
--store <STORE> Path to auth store file (only used with OAuth) [default: /tmp/wisp-oauth-session.json]
+
--password <PASSWORD> App Password for authentication (alternative to OAuth)
+
-h, --help Print help
+
-V, --version Print version
+
```
+
+
## How It Works
+
+
1. **Authentication**: Authenticates using OAuth or app password
+
2. **File Processing**:
+
- Recursively walks the directory tree
+
- Skips hidden files (starting with `.`)
+
- Detects MIME types automatically
+
- Compresses files with gzip
+
- Base64 encodes compressed content
+
3. **Upload**:
+
- Uploads files as blobs to your PDS
+
- Processes up to 5 files concurrently (see the sketch after this list)
+
- Creates a `place.wisp.fs` record with the site manifest
+
4. **Deployment**: Site is immediately available at `https://sites.wisp.place/{did}/{site-name}`
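Step 3's bounded concurrency uses `buffer_unordered` from the `futures` crate. A minimal sketch, assuming a hypothetical `upload_one` helper (the real code in `cli/src/main.rs` uploads through the jacquard `Agent` and builds manifest entries as it goes):

```rust
use futures::{stream, StreamExt};

/// Upload files with at most 5 requests in flight at a time.
async fn upload_all(paths: Vec<String>) -> Vec<Result<String, String>> {
    stream::iter(paths)
        .map(|path| async move { upload_one(path).await })
        .buffer_unordered(5) // cap concurrent uploads at 5
        .collect()
        .await
}

/// Hypothetical per-file upload used only for this sketch; the actual CLI
/// gzips, base64-encodes, and calls com.atproto.repo.uploadBlob here.
async fn upload_one(path: String) -> Result<String, String> {
    Ok(path)
}
```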
+
+
## File Processing
+
+
All files are automatically:
+
+
- **Compressed** with gzip (level 9)
+
- **Base64 encoded** to bypass PDS content sniffing
+
- **Uploaded** as `application/octet-stream` blobs
+
- **Stored** with original MIME type metadata
+
+
The hosting service automatically decompresses non-HTML/CSS/JS files when serving them.
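As a rough sketch of the compress-and-encode step (the full pipeline, including CID computation and blob reuse, lives in `cli/src/main.rs` and `cli/src/cid.rs`):

```rust
use base64::Engine;
use flate2::{write::GzEncoder, Compression};
use std::io::Write;

/// Gzip a file's bytes at level 9, then base64-encode the compressed data.
/// The resulting bytes are what gets uploaded as an application/octet-stream blob.
fn compress_and_encode(file_data: &[u8]) -> std::io::Result<Vec<u8>> {
    let mut encoder = GzEncoder::new(Vec::new(), Compression::new(9));
    encoder.write_all(file_data)?;
    let gzipped = encoder.finish()?;
    Ok(base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes())
}
```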
+
+
## Limitations
+
+
- **Max file size**: 100MB per file after compression (this is a PDS limit; the CLI doesn't enforce it in case yours is higher)
+
- **Max file count**: 2000 files
+
- **Site name** must follow AT Protocol rkey format rules (alphanumeric, hyphens, underscores)
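A minimal sketch of the site-name rule as stated above (assumption: only the characters listed here; the full AT Protocol rkey grammar allows a few more, and the PDS rejects invalid record keys anyway):

```rust
/// Rough client-side check: alphanumeric characters, hyphens, and underscores.
fn is_valid_site_name(name: &str) -> bool {
    !name.is_empty()
        && name
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '_')
}
```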
+
+
## Deploy with CI/CD
+
+
### GitHub Actions
+
+
```yaml
+
name: Deploy to Wisp
+
on:
+
push:
+
branches: [main]
+
+
jobs:
+
deploy:
+
runs-on: ubuntu-latest
+
steps:
+
- uses: actions/checkout@v3
+
+
- name: Setup Node
+
uses: actions/setup-node@v3
+
with:
+
node-version: '25'
+
+
- name: Install dependencies
+
run: npm install
+
+
- name: Build site
+
run: npm run build
+
+
- name: Download Wisp CLI
+
run: |
+
curl -L https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
- name: Deploy to Wisp
+
env:
+
WISP_APP_PASSWORD: ${{ secrets.WISP_APP_PASSWORD }}
+
run: |
+
./wisp-cli alice.bsky.social \
+
--path ./dist \
+
--site my-site \
+
--password "$WISP_APP_PASSWORD"
+
```
+
+
### Tangled.org
+
+
```yaml
+
when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
submodules: false
+
+
dependencies:
+
nixpkgs:
+
- nodejs
+
- coreutils
+
- curl
+
github:NixOS/nixpkgs/nixpkgs-unstable:
+
- bun
+
+
environment:
+
SITE_PATH: 'dist'
+
SITE_NAME: 'my-site'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: build site
+
command: |
+
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+
# regenerate lockfile
+
rm package-lock.json bun.lock
+
bun install @rolldown/binding-linux-arm64-gnu --save-optional
+
bun install
+
+
# build with vite
+
bun node_modules/.bin/vite build
+
+
- name: deploy to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli \
+
"$WISP_HANDLE" \
+
--path "$SITE_PATH" \
+
--site "$SITE_NAME" \
+
--password "$WISP_APP_PASSWORD"
+
```
+
+
### Generic Shell Script
+
+
```bash
+
# Use app password from environment variable
+
wisp-cli alice.bsky.social --path ./dist --site my-site --password "$WISP_APP_PASSWORD"
+
```
+
+
## Output
+
+
Upon successful deployment, you'll see:
+
+
```
+
Deployed site 'my-site': at://did:plc:abc123xyz/place.wisp.fs/my-site
+
Available at: https://sites.wisp.place/did:plc:abc123xyz/my-site
+
```
+
+
## Dependencies
+
+
- **jacquard**: AT Protocol client library
+
- **clap**: Command-line argument parsing
+
- **tokio**: Async runtime
+
- **flate2**: Gzip compression
+
- **base64**: Base64 encoding
+
- **walkdir**: Directory traversal
+
- **mime_guess**: MIME type detection
+
+
## License
+
+
MIT License
+
+
## Contributing
+
+
Just don't hand me pure Claude slop, especially not in the PR description itself. You should be responsible for the code you submit and aware of what it is you're submitting.
+
+
## Links
+
+
- **Website**: https://wisp.place
+
- **Main Repository**: https://tangled.org/@nekomimi.pet/wisp.place-monorepo
+
- **AT Protocol**: https://atproto.com
+
- **Jacquard Library**: https://tangled.org/@nonbinary.computer/jacquard
+
+
## Support
+
+
For issues and questions:
+
- Check the main wisp.place documentation
+
- Open an issue in the main repository
+23
cli/build-linux.sh
···
+
#!/usr/bin/env bash
+
# Build Linux binaries (statically linked)
+
set -e
+
mkdir -p binaries
+
+
# Build Linux binaries
+
echo "Building Linux binaries..."
+
+
echo "Building Linux ARM64 (static)..."
+
nix-shell -p rustup --run '
+
rustup target add aarch64-unknown-linux-musl
+
RUSTFLAGS="-C target-feature=+crt-static" cargo zigbuild --release --target aarch64-unknown-linux-musl
+
'
+
cp target/aarch64-unknown-linux-musl/release/wisp-cli binaries/wisp-cli-aarch64-linux
+
+
echo "Building Linux x86_64 (static)..."
+
nix-shell -p rustup --run '
+
rustup target add x86_64-unknown-linux-musl
+
RUSTFLAGS="-C target-feature=+crt-static" cargo build --release --target x86_64-unknown-linux-musl
+
'
+
cp target/x86_64-unknown-linux-musl/release/wisp-cli binaries/wisp-cli-x86_64-linux
+
+
echo "Done! Binaries in ./binaries/"
+15
cli/build-macos.sh
···
+
#!/bin/bash
+
# Build macOS binaries
+
+
set -e
+
+
mkdir -p binaries
+
rm -rf target
+
+
# Build macOS binaries natively
+
echo "Building macOS binaries..."
+
rustup target add aarch64-apple-darwin
+
+
echo "Building macOS arm64 binary."
+
RUSTFLAGS="-C target-feature=+crt-static" cargo build --release --target aarch64-apple-darwin
+
cp target/aarch64-apple-darwin/release/wisp-cli binaries/wisp-cli-macos-arm64
+85
cli/src/blob_map.rs
···
+
use jacquard_common::types::blob::BlobRef;
+
use jacquard_common::IntoStatic;
+
use std::collections::HashMap;
+
+
use crate::place_wisp::fs::{Directory, EntryNode};
+
+
/// Extract blob information from a directory tree
+
/// Returns a map of file paths to their blob refs and CIDs
+
///
+
/// This mirrors the TypeScript implementation in src/lib/wisp-utils.ts lines 275-302
+
pub fn extract_blob_map(
+
directory: &Directory,
+
) -> HashMap<String, (BlobRef<'static>, String)> {
+
extract_blob_map_recursive(directory, String::new())
+
}
+
+
fn extract_blob_map_recursive(
+
directory: &Directory,
+
current_path: String,
+
) -> HashMap<String, (BlobRef<'static>, String)> {
+
let mut blob_map = HashMap::new();
+
+
for entry in &directory.entries {
+
let full_path = if current_path.is_empty() {
+
entry.name.to_string()
+
} else {
+
format!("{}/{}", current_path, entry.name)
+
};
+
+
match &entry.node {
+
EntryNode::File(file_node) => {
+
// Extract CID from blob ref
+
// BlobRef is an enum with Blob variant, which has a ref field (CidLink)
+
let blob_ref = &file_node.blob;
+
let cid_string = blob_ref.blob().r#ref.to_string();
+
+
// Store with full path (mirrors TypeScript implementation)
+
blob_map.insert(
+
full_path,
+
(blob_ref.clone().into_static(), cid_string)
+
);
+
}
+
EntryNode::Directory(subdir) => {
+
let sub_map = extract_blob_map_recursive(subdir, full_path);
+
blob_map.extend(sub_map);
+
}
+
EntryNode::Unknown(_) => {
+
// Skip unknown node types
+
}
+
}
+
}
+
+
blob_map
+
}
+
+
/// Normalize file path by removing base folder prefix
+
/// Example: "cobblemon/index.html" -> "index.html"
+
///
+
/// Note: This function is kept for reference but is no longer used in production code.
+
/// The TypeScript server has a similar normalization (src/routes/wisp.ts line 291) to handle
+
/// uploads that include a base folder prefix, but our CLI doesn't need this since we
+
/// track full paths consistently.
+
#[allow(dead_code)]
+
pub fn normalize_path(path: &str) -> String {
+
// Remove base folder prefix (everything before first /)
+
if let Some(idx) = path.find('/') {
+
path[idx + 1..].to_string()
+
} else {
+
path.to_string()
+
}
+
}
+
+
#[cfg(test)]
+
mod tests {
+
use super::*;
+
+
#[test]
+
fn test_normalize_path() {
+
assert_eq!(normalize_path("index.html"), "index.html");
+
assert_eq!(normalize_path("cobblemon/index.html"), "index.html");
+
assert_eq!(normalize_path("folder/subfolder/file.txt"), "subfolder/file.txt");
+
assert_eq!(normalize_path("a/b/c/d.txt"), "b/c/d.txt");
+
}
+
}
+
+66
cli/src/cid.rs
···
+
use jacquard_common::types::cid::IpldCid;
+
use sha2::{Digest, Sha256};
+
+
/// Compute CID (Content Identifier) for blob content
+
/// Uses the same algorithm as AT Protocol: CIDv1 with raw codec (0x55) and SHA-256
+
///
+
/// CRITICAL: This must be called on BASE64-ENCODED GZIPPED content, not just gzipped content
+
///
+
/// Based on @atproto/common/src/ipld.ts sha256RawToCid implementation
+
pub fn compute_cid(content: &[u8]) -> String {
+
// Compute the SHA-256 hash of the content (same digest AT Protocol uses)
+
let hash = Sha256::digest(content);
+
+
// Create multihash (code 0x12 = sha2-256)
+
let multihash = multihash::Multihash::wrap(0x12, &hash)
+
.expect("SHA-256 hash should always fit in multihash");
+
+
// Create CIDv1 with raw codec (0x55)
+
let cid = IpldCid::new_v1(0x55, multihash);
+
+
// Convert to base32 string representation
+
cid.to_string_of_base(multibase::Base::Base32Lower)
+
.unwrap_or_else(|_| cid.to_string())
+
}
+
+
#[cfg(test)]
+
mod tests {
+
use super::*;
+
use base64::Engine;
+
+
#[test]
+
fn test_compute_cid() {
+
// Test with a simple string: "hello"
+
let content = b"hello";
+
let cid = compute_cid(content);
+
+
// CID should start with 'baf' for raw codec base32
+
assert!(cid.starts_with("baf"));
+
}
+
+
#[test]
+
fn test_compute_cid_base64_encoded() {
+
// Simulate the actual use case: gzipped then base64 encoded
+
use flate2::write::GzEncoder;
+
use flate2::Compression;
+
use std::io::Write;
+
+
let original = b"hello world";
+
+
// Gzip compress
+
let mut encoder = GzEncoder::new(Vec::new(), Compression::default());
+
encoder.write_all(original).unwrap();
+
let gzipped = encoder.finish().unwrap();
+
+
// Base64 encode the gzipped data
+
let base64_bytes = base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes();
+
+
// Compute CID on the base64 bytes
+
let cid = compute_cid(&base64_bytes);
+
+
// Should be a valid CID
+
assert!(cid.starts_with("baf"));
+
assert!(cid.len() > 10);
+
}
+
}
+
+71
cli/src/download.rs
···
+
use base64::Engine;
+
use bytes::Bytes;
+
use flate2::read::GzDecoder;
+
use jacquard_common::types::blob::BlobRef;
+
use miette::IntoDiagnostic;
+
use std::io::Read;
+
use url::Url;
+
+
/// Download a blob from the PDS
+
pub async fn download_blob(pds_url: &Url, blob_ref: &BlobRef<'_>, did: &str) -> miette::Result<Bytes> {
+
// Extract CID from blob ref
+
let cid = blob_ref.blob().r#ref.to_string();
+
+
// Construct blob download URL
+
// The correct endpoint is: /xrpc/com.atproto.sync.getBlob?did={did}&cid={cid}
+
let blob_url = pds_url
+
.join(&format!("/xrpc/com.atproto.sync.getBlob?did={}&cid={}", did, cid))
+
.into_diagnostic()?;
+
+
let client = reqwest::Client::new();
+
let response = client
+
.get(blob_url)
+
.send()
+
.await
+
.into_diagnostic()?;
+
+
if !response.status().is_success() {
+
return Err(miette::miette!(
+
"Failed to download blob: {}",
+
response.status()
+
));
+
}
+
+
let bytes = response.bytes().await.into_diagnostic()?;
+
Ok(bytes)
+
}
+
+
/// Decompress and decode a blob (base64 + gzip)
+
pub fn decompress_blob(data: &[u8], is_base64: bool, is_gzipped: bool) -> miette::Result<Vec<u8>> {
+
let mut current_data = data.to_vec();
+
+
// First, decode base64 if needed
+
if is_base64 {
+
current_data = base64::prelude::BASE64_STANDARD
+
.decode(&current_data)
+
.into_diagnostic()?;
+
}
+
+
// Then, decompress gzip if needed
+
if is_gzipped {
+
let mut decoder = GzDecoder::new(&current_data[..]);
+
let mut decompressed = Vec::new();
+
decoder.read_to_end(&mut decompressed).into_diagnostic()?;
+
current_data = decompressed;
+
}
+
+
Ok(current_data)
+
}
+
+
/// Download and decompress a blob
+
pub async fn download_and_decompress_blob(
+
pds_url: &Url,
+
blob_ref: &BlobRef<'_>,
+
did: &str,
+
is_base64: bool,
+
is_gzipped: bool,
+
) -> miette::Result<Vec<u8>> {
+
let data = download_blob(pds_url, blob_ref, did).await?;
+
decompress_blob(&data, is_base64, is_gzipped)
+
}
+
+243 -56
cli/src/main.rs
···
mod builder_types;
mod place_wisp;
+
mod cid;
+
mod blob_map;
+
mod metadata;
+
mod download;
+
mod pull;
+
mod serve;
-
use clap::Parser;
+
use clap::{Parser, Subcommand};
use jacquard::CowStr;
-
use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession};
+
use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession, AgentSession};
use jacquard::oauth::client::OAuthClient;
use jacquard::oauth::loopback::LoopbackConfig;
use jacquard::prelude::IdentityResolver;
···
use jacquard_common::types::blob::MimeType;
use miette::IntoDiagnostic;
use std::path::{Path, PathBuf};
+
use std::collections::HashMap;
use flate2::Compression;
use flate2::write::GzEncoder;
use std::io::Write;
···
use place_wisp::fs::*;
#[derive(Parser, Debug)]
-
#[command(author, version, about = "Deploy a static site to wisp.place")]
+
#[command(author, version, about = "wisp.place CLI tool")]
struct Args {
+
#[command(subcommand)]
+
command: Option<Commands>,
+
+
// Deploy arguments (when no subcommand is specified)
/// Handle (e.g., alice.bsky.social), DID, or PDS URL
-
input: CowStr<'static>,
+
#[arg(global = true, conflicts_with = "command")]
+
input: Option<CowStr<'static>>,
/// Path to the directory containing your static site
-
#[arg(short, long, default_value = ".")]
-
path: PathBuf,
+
#[arg(short, long, global = true, conflicts_with = "command")]
+
path: Option<PathBuf>,
/// Site name (defaults to directory name)
-
#[arg(short, long)]
+
#[arg(short, long, global = true, conflicts_with = "command")]
site: Option<String>,
-
/// Path to auth store file (will be created if missing, only used with OAuth)
-
#[arg(long, default_value = "/tmp/wisp-oauth-session.json")]
-
store: String,
+
/// Path to auth store file
+
#[arg(long, global = true, conflicts_with = "command")]
+
store: Option<String>,
-
/// App Password for authentication (alternative to OAuth)
-
#[arg(long)]
+
/// App Password for authentication
+
#[arg(long, global = true, conflicts_with = "command")]
password: Option<CowStr<'static>>,
}
+
#[derive(Subcommand, Debug)]
+
enum Commands {
+
/// Deploy a static site to wisp.place (default command)
+
Deploy {
+
/// Handle (e.g., alice.bsky.social), DID, or PDS URL
+
input: CowStr<'static>,
+
+
/// Path to the directory containing your static site
+
#[arg(short, long, default_value = ".")]
+
path: PathBuf,
+
+
/// Site name (defaults to directory name)
+
#[arg(short, long)]
+
site: Option<String>,
+
+
/// Path to auth store file (will be created if missing, only used with OAuth)
+
#[arg(long, default_value = "/tmp/wisp-oauth-session.json")]
+
store: String,
+
+
/// App Password for authentication (alternative to OAuth)
+
#[arg(long)]
+
password: Option<CowStr<'static>>,
+
},
+
/// Pull a site from the PDS to a local directory
+
Pull {
+
/// Handle (e.g., alice.bsky.social) or DID
+
input: CowStr<'static>,
+
+
/// Site name (record key)
+
#[arg(short, long)]
+
site: String,
+
+
/// Output directory for the downloaded site
+
#[arg(short, long, default_value = ".")]
+
output: PathBuf,
+
},
+
/// Serve a site locally with real-time firehose updates
+
Serve {
+
/// Handle (e.g., alice.bsky.social) or DID
+
input: CowStr<'static>,
+
+
/// Site name (record key)
+
#[arg(short, long)]
+
site: String,
+
+
/// Output directory for the site files
+
#[arg(short, long, default_value = ".")]
+
output: PathBuf,
+
+
/// Port to serve on
+
#[arg(short, long, default_value = "8080")]
+
port: u16,
+
},
+
}
+
#[tokio::main]
async fn main() -> miette::Result<()> {
let args = Args::parse();
-
// Dispatch to appropriate authentication method
-
if let Some(password) = args.password {
-
run_with_app_password(args.input, password, args.path, args.site).await
-
} else {
-
run_with_oauth(args.input, args.store, args.path, args.site).await
+
match args.command {
+
Some(Commands::Deploy { input, path, site, store, password }) => {
+
// Dispatch to appropriate authentication method
+
if let Some(password) = password {
+
run_with_app_password(input, password, path, site).await
+
} else {
+
run_with_oauth(input, store, path, site).await
+
}
+
}
+
Some(Commands::Pull { input, site, output }) => {
+
pull::pull_site(input, CowStr::from(site), output).await
+
}
+
Some(Commands::Serve { input, site, output, port }) => {
+
serve::serve_site(input, CowStr::from(site), output, port).await
+
}
+
None => {
+
// Legacy mode: if input is provided, assume deploy command
+
if let Some(input) = args.input {
+
let path = args.path.unwrap_or_else(|| PathBuf::from("."));
+
let store = args.store.unwrap_or_else(|| "/tmp/wisp-oauth-session.json".to_string());
+
+
// Dispatch to appropriate authentication method
+
if let Some(password) = args.password {
+
run_with_app_password(input, password, path, args.site).await
+
} else {
+
run_with_oauth(input, store, path, args.site).await
+
}
+
} else {
+
// No command and no input, show help
+
use clap::CommandFactory;
+
Args::command().print_help().into_diagnostic()?;
+
Ok(())
+
}
+
}
}
}
···
println!("Deploying site '{}'...", site_name);
-
// Build directory tree
-
let root_dir = build_directory(agent, &path).await?;
+
// Try to fetch existing manifest for incremental updates
+
let existing_blob_map: HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)> = {
+
use jacquard_common::types::string::AtUri;
+
+
// Get the DID for this session
+
let session_info = agent.session_info().await;
+
if let Some((did, _)) = session_info {
+
// Construct the AT URI for the record
+
let uri_string = format!("at://{}/place.wisp.fs/{}", did, site_name);
+
if let Ok(uri) = AtUri::new(&uri_string) {
+
match agent.get_record::<Fs>(&uri).await {
+
Ok(response) => {
+
match response.into_output() {
+
Ok(record_output) => {
+
let existing_manifest = record_output.value;
+
let blob_map = blob_map::extract_blob_map(&existing_manifest.root);
+
println!("Found existing manifest with {} files, checking for changes...", blob_map.len());
+
blob_map
+
}
+
Err(_) => {
+
println!("No existing manifest found, uploading all files...");
+
HashMap::new()
+
}
+
}
+
}
+
Err(_) => {
+
// Record doesn't exist yet - this is a new site
+
println!("No existing manifest found, uploading all files...");
+
HashMap::new()
+
}
+
}
+
} else {
+
println!("No existing manifest found (invalid URI), uploading all files...");
+
HashMap::new()
+
}
+
} else {
+
println!("No existing manifest found (could not get DID), uploading all files...");
+
HashMap::new()
+
}
+
};
-
// Count total files
-
let file_count = count_files(&root_dir);
+
// Build directory tree
+
let (root_dir, total_files, reused_count) = build_directory(agent, &path, &existing_blob_map, String::new()).await?;
+
let uploaded_count = total_files - reused_count;
// Create the Fs record
let fs_record = Fs::new()
.site(CowStr::from(site_name.clone()))
.root(root_dir)
-
.file_count(file_count as i64)
+
.file_count(total_files as i64)
.created_at(Datetime::now())
.build();
···
.and_then(|s| s.split('/').next())
.ok_or_else(|| miette::miette!("Failed to parse DID from URI"))?;
-
println!("Deployed site '{}': {}", site_name, output.uri);
-
println!("Available at: https://sites.wisp.place/{}/{}", did, site_name);
+
println!("\nโœ“ Deployed site '{}': {}", site_name, output.uri);
+
println!(" Total files: {} ({} reused, {} uploaded)", total_files, reused_count, uploaded_count);
+
println!(" Available at: https://sites.wisp.place/{}/{}", did, site_name);
Ok(())
}
/// Recursively build a Directory from a filesystem path
+
/// current_path is the path from the root of the site (e.g., "" for root, "config" for config dir)
fn build_directory<'a>(
agent: &'a Agent<impl jacquard::client::AgentSession + IdentityResolver + 'a>,
dir_path: &'a Path,
-
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<Directory<'static>>> + 'a>>
+
existing_blobs: &'a HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
current_path: String,
+
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<(Directory<'static>, usize, usize)>> + 'a>>
{
Box::pin(async move {
// Collect all directory entries first
···
let metadata = entry.metadata().into_diagnostic()?;
if metadata.is_file() {
-
file_tasks.push((name_str, path));
+
// Construct full path for this file (for blob map lookup)
+
let full_path = if current_path.is_empty() {
+
name_str.clone()
+
} else {
+
format!("{}/{}", current_path, name_str)
+
};
+
file_tasks.push((name_str, path, full_path));
} else if metadata.is_dir() {
dir_tasks.push((name_str, path));
}
}
// Process files concurrently with a limit of 5
-
let file_entries: Vec<Entry> = stream::iter(file_tasks)
-
.map(|(name, path)| async move {
-
let file_node = process_file(agent, &path).await?;
-
Ok::<_, miette::Report>(Entry::new()
+
let file_results: Vec<(Entry<'static>, bool)> = stream::iter(file_tasks)
+
.map(|(name, path, full_path)| async move {
+
let (file_node, reused) = process_file(agent, &path, &full_path, existing_blobs).await?;
+
let entry = Entry::new()
.name(CowStr::from(name))
.node(EntryNode::File(Box::new(file_node)))
-
.build())
+
.build();
+
Ok::<_, miette::Report>((entry, reused))
})
.buffer_unordered(5)
.collect::<Vec<_>>()
.await
.into_iter()
.collect::<miette::Result<Vec<_>>>()?;
+
+
let mut file_entries = Vec::new();
+
let mut reused_count = 0;
+
let mut total_files = 0;
+
+
for (entry, reused) in file_results {
+
file_entries.push(entry);
+
total_files += 1;
+
if reused {
+
reused_count += 1;
+
}
+
}
// Process directories recursively (sequentially to avoid too much nesting)
let mut dir_entries = Vec::new();
for (name, path) in dir_tasks {
-
let subdir = build_directory(agent, &path).await?;
+
// Construct full path for subdirectory
+
let subdir_path = if current_path.is_empty() {
+
name.clone()
+
} else {
+
format!("{}/{}", current_path, name)
+
};
+
let (subdir, sub_total, sub_reused) = build_directory(agent, &path, existing_blobs, subdir_path).await?;
dir_entries.push(Entry::new()
.name(CowStr::from(name))
.node(EntryNode::Directory(Box::new(subdir)))
.build());
+
total_files += sub_total;
+
reused_count += sub_reused;
}
// Combine file and directory entries
let mut entries = file_entries;
entries.extend(dir_entries);
-
Ok(Directory::new()
+
let directory = Directory::new()
.r#type(CowStr::from("directory"))
.entries(entries)
-
.build())
+
.build();
+
+
Ok((directory, total_files, reused_count))
})
}
-
/// Process a single file: gzip -> base64 -> upload blob
+
/// Process a single file: gzip -> base64 -> upload blob (or reuse existing)
+
/// Returns (File, reused: bool)
+
/// file_path_key is the full path from the site root (e.g., "config/file.json") for blob map lookup
async fn process_file(
agent: &Agent<impl jacquard::client::AgentSession + IdentityResolver>,
file_path: &Path,
-
) -> miette::Result<File<'static>>
+
file_path_key: &str,
+
existing_blobs: &HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
) -> miette::Result<(File<'static>, bool)>
{
// Read file
let file_data = std::fs::read(file_path).into_diagnostic()?;
···
// Base64 encode the gzipped data
let base64_bytes = base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes();
-
// Upload blob as octet-stream
+
// Compute the CID for this file (CRITICAL: computed over the base64-encoded gzipped bytes, i.e. exactly the content uploaded as the blob)
+
let file_cid = cid::compute_cid(&base64_bytes);
+
+
// Check if we have an existing blob with the same CID
+
let existing_blob = existing_blobs.get(file_path_key);
+
+
if let Some((existing_blob_ref, existing_cid)) = existing_blob {
+
if existing_cid == &file_cid {
+
// CIDs match - reuse existing blob
+
println!(" โœ“ Reusing blob for {} (CID: {})", file_path_key, file_cid);
+
return Ok((
+
File::new()
+
.r#type(CowStr::from("file"))
+
.blob(existing_blob_ref.clone())
+
.encoding(CowStr::from("gzip"))
+
.mime_type(CowStr::from(original_mime))
+
.base64(true)
+
.build(),
+
true
+
));
+
}
+
}
+
+
// File is new or changed - upload it
+
println!(" โ†‘ Uploading {} ({} bytes, CID: {})", file_path_key, base64_bytes.len(), file_cid);
let blob = agent.upload_blob(
base64_bytes,
MimeType::new_static("application/octet-stream"),
).await?;
-
Ok(File::new()
-
.r#type(CowStr::from("file"))
-
.blob(blob)
-
.encoding(CowStr::from("gzip"))
-
.mime_type(CowStr::from(original_mime))
-
.base64(true)
-
.build())
+
Ok((
+
File::new()
+
.r#type(CowStr::from("file"))
+
.blob(blob)
+
.encoding(CowStr::from("gzip"))
+
.mime_type(CowStr::from(original_mime))
+
.base64(true)
+
.build(),
+
false
+
))
}
-
/// Count total files in a directory tree
-
fn count_files(dir: &Directory) -> usize {
-
let mut count = 0;
-
for entry in &dir.entries {
-
match &entry.node {
-
EntryNode::File(_) => count += 1,
-
EntryNode::Directory(subdir) => count += count_files(subdir),
-
_ => {} // Unknown variants
-
}
-
}
-
count
-
}
+46
cli/src/metadata.rs
···
+
use serde::{Deserialize, Serialize};
+
use std::collections::HashMap;
+
use std::path::Path;
+
use miette::IntoDiagnostic;
+
+
/// Metadata tracking file CIDs for incremental updates
+
#[derive(Debug, Clone, Serialize, Deserialize)]
+
pub struct SiteMetadata {
+
/// Record CID from the PDS
+
pub record_cid: String,
+
/// Map of file paths to their blob CIDs
+
pub file_cids: HashMap<String, String>,
+
/// Timestamp when the site was last synced
+
pub last_sync: i64,
+
}
+
+
impl SiteMetadata {
+
pub fn new(record_cid: String, file_cids: HashMap<String, String>) -> Self {
+
Self {
+
record_cid,
+
file_cids,
+
last_sync: chrono::Utc::now().timestamp(),
+
}
+
}
+
+
/// Load metadata from a directory
+
pub fn load(dir: &Path) -> miette::Result<Option<Self>> {
+
let metadata_path = dir.join(".wisp-metadata.json");
+
if !metadata_path.exists() {
+
return Ok(None);
+
}
+
+
let contents = std::fs::read_to_string(&metadata_path).into_diagnostic()?;
+
let metadata: SiteMetadata = serde_json::from_str(&contents).into_diagnostic()?;
+
Ok(Some(metadata))
+
}
+
+
/// Save metadata to a directory
+
pub fn save(&self, dir: &Path) -> miette::Result<()> {
+
let metadata_path = dir.join(".wisp-metadata.json");
+
let contents = serde_json::to_string_pretty(self).into_diagnostic()?;
+
std::fs::write(&metadata_path, contents).into_diagnostic()?;
+
Ok(())
+
}
+
}
+
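For reference, `SiteMetadata` serializes with serde's default (snake_case) field names, so the `.wisp-metadata.json` file written alongside a pulled site has roughly the shape sketched below. This is an illustrative sketch only — the CID strings and file paths are placeholders, not real values.

```typescript
// Approximate shape of .wisp-metadata.json (keys mirror the Rust struct above;
// all concrete values here are hypothetical placeholders).
interface WispSiteMetadata {
  record_cid: string;                   // CID of the place.wisp.fs record on the PDS
  file_cids: Record<string, string>;    // site-root-relative path -> blob CID
  last_sync: number;                    // Unix timestamp (seconds) of the last sync
}

const example: WispSiteMetadata = {
  record_cid: "bafyrei...",             // placeholder
  file_cids: {
    "index.html": "bafkrei...",         // placeholder
    "config/site.json": "bafkrei..."    // placeholder
  },
  last_sync: 1730000000
};
```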
+305
cli/src/pull.rs
···
+
use crate::blob_map;
+
use crate::download;
+
use crate::metadata::SiteMetadata;
+
use crate::place_wisp::fs::*;
+
use jacquard::CowStr;
+
use jacquard::prelude::IdentityResolver;
+
use jacquard_common::types::string::Did;
+
use jacquard_common::xrpc::XrpcExt;
+
use jacquard_identity::PublicResolver;
+
use miette::IntoDiagnostic;
+
use std::collections::HashMap;
+
use std::path::{Path, PathBuf};
+
use url::Url;
+
+
/// Pull a site from the PDS to a local directory
+
pub async fn pull_site(
+
input: CowStr<'static>,
+
rkey: CowStr<'static>,
+
output_dir: PathBuf,
+
) -> miette::Result<()> {
+
println!("Pulling site {} from {}...", rkey, input);
+
+
// Resolve handle to DID if needed
+
let resolver = PublicResolver::default();
+
let did = if input.starts_with("did:") {
+
Did::new(&input).into_diagnostic()?
+
} else {
+
// It's a handle, resolve it
+
let handle = jacquard_common::types::string::Handle::new(&input).into_diagnostic()?;
+
resolver.resolve_handle(&handle).await.into_diagnostic()?
+
};
+
+
// Resolve PDS endpoint for the DID
+
let pds_url = resolver.pds_for_did(&did).await.into_diagnostic()?;
+
println!("Resolved PDS: {}", pds_url);
+
+
// Fetch the place.wisp.fs record
+
+
println!("Fetching record from PDS...");
+
let client = reqwest::Client::new();
+
+
// Use com.atproto.repo.getRecord
+
use jacquard::api::com_atproto::repo::get_record::GetRecord;
+
use jacquard_common::types::string::Rkey as RkeyType;
+
let rkey_parsed = RkeyType::new(&rkey).into_diagnostic()?;
+
+
use jacquard_common::types::ident::AtIdentifier;
+
use jacquard_common::types::string::RecordKey;
+
let request = GetRecord::new()
+
.repo(AtIdentifier::Did(did.clone()))
+
.collection(CowStr::from("place.wisp.fs"))
+
.rkey(RecordKey::from(rkey_parsed))
+
.build();
+
+
let response = client
+
.xrpc(pds_url.clone())
+
.send(&request)
+
.await
+
.into_diagnostic()?;
+
+
let record_output = response.into_output().into_diagnostic()?;
+
let record_cid = record_output.cid.as_ref().map(|c| c.to_string()).unwrap_or_default();
+
+
// Parse the record value as Fs
+
use jacquard_common::types::value::from_data;
+
let fs_record: Fs = from_data(&record_output.value).into_diagnostic()?;
+
+
let file_count = fs_record.file_count.map(|c| c.to_string()).unwrap_or_else(|| "?".to_string());
+
println!("Found site '{}' with {} files", fs_record.site, file_count);
+
+
// Load existing metadata for incremental updates
+
let existing_metadata = SiteMetadata::load(&output_dir)?;
+
let existing_file_cids = existing_metadata
+
.as_ref()
+
.map(|m| m.file_cids.clone())
+
.unwrap_or_default();
+
+
// Extract blob map from the new manifest
+
let new_blob_map = blob_map::extract_blob_map(&fs_record.root);
+
let new_file_cids: HashMap<String, String> = new_blob_map
+
.iter()
+
.map(|(path, (_blob_ref, cid))| (path.clone(), cid.clone()))
+
.collect();
+
+
// Clean up any leftover temp directories from previous failed attempts
+
let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new("."));
+
let output_name = output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new("site")).to_string_lossy();
+
let temp_prefix = format!(".tmp-{}-", output_name);
+
+
if let Ok(entries) = parent.read_dir() {
+
for entry in entries.flatten() {
+
let name = entry.file_name();
+
if name.to_string_lossy().starts_with(&temp_prefix) {
+
let _ = std::fs::remove_dir_all(entry.path());
+
}
+
}
+
}
+
+
// Check if we need to update (but only if output directory actually exists with files)
+
if let Some(metadata) = &existing_metadata {
+
if metadata.record_cid == record_cid {
+
// Verify that the output directory actually exists and has content
+
let has_content = output_dir.exists() &&
+
output_dir.read_dir()
+
.map(|mut entries| entries.any(|e| {
+
if let Ok(entry) = e {
+
!entry.file_name().to_string_lossy().starts_with(".wisp-metadata")
+
} else {
+
false
+
}
+
}))
+
.unwrap_or(false);
+
+
if has_content {
+
println!("Site is already up to date!");
+
return Ok(());
+
}
+
}
+
}
+
+
// Create temporary directory for atomic update
+
// Place temp dir in parent directory to avoid issues with non-existent output_dir
+
let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new("."));
+
let temp_dir_name = format!(
+
".tmp-{}-{}",
+
output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new("site")).to_string_lossy(),
+
chrono::Utc::now().timestamp()
+
);
+
let temp_dir = parent.join(temp_dir_name);
+
std::fs::create_dir_all(&temp_dir).into_diagnostic()?;
+
+
println!("Downloading files...");
+
let mut downloaded = 0;
+
let mut reused = 0;
+
+
// Download files recursively
+
let download_result = download_directory(
+
&fs_record.root,
+
&temp_dir,
+
&pds_url,
+
did.as_str(),
+
&new_blob_map,
+
&existing_file_cids,
+
&output_dir,
+
String::new(),
+
&mut downloaded,
+
&mut reused,
+
)
+
.await;
+
+
// If download failed, clean up temp directory
+
if let Err(e) = download_result {
+
let _ = std::fs::remove_dir_all(&temp_dir);
+
return Err(e);
+
}
+
+
println!(
+
"Downloaded {} files, reused {} files",
+
downloaded, reused
+
);
+
+
// Save metadata
+
let metadata = SiteMetadata::new(record_cid, new_file_cids);
+
metadata.save(&temp_dir)?;
+
+
// Move files from temp to output directory
+
let output_abs = std::fs::canonicalize(&output_dir).unwrap_or_else(|_| output_dir.clone());
+
let current_dir = std::env::current_dir().into_diagnostic()?;
+
+
// Special handling for pulling to current directory
+
if output_abs == current_dir {
+
// Move files from temp to current directory
+
for entry in std::fs::read_dir(&temp_dir).into_diagnostic()? {
+
let entry = entry.into_diagnostic()?;
+
let dest = current_dir.join(entry.file_name());
+
+
// Remove existing file/dir if it exists
+
if dest.exists() {
+
if dest.is_dir() {
+
std::fs::remove_dir_all(&dest).into_diagnostic()?;
+
} else {
+
std::fs::remove_file(&dest).into_diagnostic()?;
+
}
+
}
+
+
// Move from temp to current dir
+
std::fs::rename(entry.path(), dest).into_diagnostic()?;
+
}
+
+
// Clean up temp directory
+
std::fs::remove_dir_all(&temp_dir).into_diagnostic()?;
+
} else {
+
// If the output directory already exists, remove it so the rename below can replace it cleanly
+
if output_dir.exists() {
+
std::fs::remove_dir_all(&output_dir).into_diagnostic()?;
+
}
+
+
// Ensure parent directory exists
+
if let Some(parent) = output_dir.parent() {
+
if !parent.as_os_str().is_empty() && !parent.exists() {
+
std::fs::create_dir_all(parent).into_diagnostic()?;
+
}
+
}
+
+
// Rename temp to final location
+
match std::fs::rename(&temp_dir, &output_dir) {
+
Ok(_) => {},
+
Err(e) => {
+
// Clean up temp directory on failure
+
let _ = std::fs::remove_dir_all(&temp_dir);
+
return Err(miette::miette!("Failed to move temp directory: {}", e));
+
}
+
}
+
}
+
+
println!("โœ“ Site pulled successfully to {}", output_dir.display());
+
+
Ok(())
+
}
+
+
/// Recursively download a directory
+
fn download_directory<'a>(
+
dir: &'a Directory<'_>,
+
output_dir: &'a Path,
+
pds_url: &'a Url,
+
did: &'a str,
+
new_blob_map: &'a HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
existing_file_cids: &'a HashMap<String, String>,
+
existing_output_dir: &'a Path,
+
path_prefix: String,
+
downloaded: &'a mut usize,
+
reused: &'a mut usize,
+
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<()>> + Send + 'a>> {
+
Box::pin(async move {
+
for entry in &dir.entries {
+
let entry_name = entry.name.as_str();
+
let current_path = if path_prefix.is_empty() {
+
entry_name.to_string()
+
} else {
+
format!("{}/{}", path_prefix, entry_name)
+
};
+
+
match &entry.node {
+
EntryNode::File(file) => {
+
let output_path = output_dir.join(entry_name);
+
+
// Check if file CID matches existing
+
if let Some((_blob_ref, new_cid)) = new_blob_map.get(&current_path) {
+
if let Some(existing_cid) = existing_file_cids.get(&current_path) {
+
if existing_cid == new_cid {
+
// File unchanged, copy from existing directory
+
let existing_path = existing_output_dir.join(&current_path);
+
if existing_path.exists() {
+
std::fs::copy(&existing_path, &output_path).into_diagnostic()?;
+
*reused += 1;
+
println!(" โœ“ Reused {}", current_path);
+
continue;
+
}
+
}
+
}
+
}
+
+
// File is new or changed, download it
+
println!(" โ†“ Downloading {}", current_path);
+
let data = download::download_and_decompress_blob(
+
pds_url,
+
&file.blob,
+
did,
+
file.base64.unwrap_or(false),
+
file.encoding.as_ref().map(|e| e.as_str() == "gzip").unwrap_or(false),
+
)
+
.await?;
+
+
std::fs::write(&output_path, data).into_diagnostic()?;
+
*downloaded += 1;
+
}
+
EntryNode::Directory(subdir) => {
+
let subdir_path = output_dir.join(entry_name);
+
std::fs::create_dir_all(&subdir_path).into_diagnostic()?;
+
+
download_directory(
+
subdir,
+
&subdir_path,
+
pds_url,
+
did,
+
new_blob_map,
+
existing_file_cids,
+
existing_output_dir,
+
current_path,
+
downloaded,
+
reused,
+
)
+
.await?;
+
}
+
EntryNode::Unknown(_) => {
+
// Skip unknown node types
+
println!(" โš  Skipping unknown node type for {}", current_path);
+
}
+
}
+
}
+
+
Ok(())
+
})
+
}
+
+202
cli/src/serve.rs
···
+
use crate::pull::pull_site;
+
use axum::Router;
+
use jacquard::CowStr;
+
use jacquard_common::jetstream::{CommitOperation, JetstreamMessage, JetstreamParams};
+
use jacquard_common::types::string::Did;
+
use jacquard_common::xrpc::{SubscriptionClient, TungsteniteSubscriptionClient};
+
use miette::IntoDiagnostic;
+
use n0_future::StreamExt;
+
use std::path::PathBuf;
+
use std::sync::Arc;
+
use tokio::sync::RwLock;
+
use tower_http::compression::CompressionLayer;
+
use tower_http::services::ServeDir;
+
use url::Url;
+
+
/// Shared state for the server
+
#[derive(Clone)]
+
struct ServerState {
+
did: CowStr<'static>,
+
rkey: CowStr<'static>,
+
output_dir: PathBuf,
+
last_cid: Arc<RwLock<Option<String>>>,
+
}
+
+
/// Serve a site locally with real-time firehose updates
+
pub async fn serve_site(
+
input: CowStr<'static>,
+
rkey: CowStr<'static>,
+
output_dir: PathBuf,
+
port: u16,
+
) -> miette::Result<()> {
+
println!("Serving site {} from {} on port {}...", rkey, input, port);
+
+
// Resolve handle to DID if needed
+
use jacquard_identity::PublicResolver;
+
use jacquard::prelude::IdentityResolver;
+
+
let resolver = PublicResolver::default();
+
let did = if input.starts_with("did:") {
+
Did::new(&input).into_diagnostic()?
+
} else {
+
// It's a handle, resolve it
+
let handle = jacquard_common::types::string::Handle::new(&input).into_diagnostic()?;
+
resolver.resolve_handle(&handle).await.into_diagnostic()?
+
};
+
+
println!("Resolved to DID: {}", did.as_str());
+
+
// Create output directory if it doesn't exist
+
std::fs::create_dir_all(&output_dir).into_diagnostic()?;
+
+
// Initial pull of the site
+
println!("Performing initial pull...");
+
let did_str = CowStr::from(did.as_str().to_string());
+
pull_site(did_str.clone(), rkey.clone(), output_dir.clone()).await?;
+
+
// Create shared state
+
let state = ServerState {
+
did: did_str.clone(),
+
rkey: rkey.clone(),
+
output_dir: output_dir.clone(),
+
last_cid: Arc::new(RwLock::new(None)),
+
};
+
+
// Start firehose listener in background
+
let firehose_state = state.clone();
+
tokio::spawn(async move {
+
if let Err(e) = watch_firehose(firehose_state).await {
+
eprintln!("Firehose error: {}", e);
+
}
+
});
+
+
// Create HTTP server with gzip compression
+
let app = Router::new()
+
.fallback_service(
+
ServeDir::new(&output_dir)
+
.precompressed_gzip()
+
)
+
.layer(CompressionLayer::new())
+
.with_state(state);
+
+
let addr = format!("0.0.0.0:{}", port);
+
let listener = tokio::net::TcpListener::bind(&addr)
+
.await
+
.into_diagnostic()?;
+
+
println!("\nโœ“ Server running at http://localhost:{}", port);
+
println!(" Watching for updates on the firehose...\n");
+
+
axum::serve(listener, app).await.into_diagnostic()?;
+
+
Ok(())
+
}
+
+
/// Watch the firehose for updates to the specific site
+
fn watch_firehose(state: ServerState) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<()>> + Send>> {
+
Box::pin(async move {
+
let jetstream_url = Url::parse("wss://jetstream1.us-east.fire.hose.cam")
+
.into_diagnostic()?;
+
+
println!("[Firehose] Connecting to Jetstream...");
+
+
// Create subscription client
+
let client = TungsteniteSubscriptionClient::from_base_uri(jetstream_url);
+
+
// Subscribe with no filters (we'll filter manually)
+
// Jetstream doesn't support filtering by collection in the params builder
+
let params = JetstreamParams::new().build();
+
+
let stream = client.subscribe(&params).await.into_diagnostic()?;
+
println!("[Firehose] Connected! Watching for updates...");
+
+
// Convert to typed message stream
+
let (_sink, mut messages) = stream.into_stream();
+
+
loop {
+
match messages.next().await {
+
Some(Ok(msg)) => {
+
if let Err(e) = handle_firehose_message(&state, msg).await {
+
eprintln!("[Firehose] Error handling message: {}", e);
+
}
+
}
+
Some(Err(e)) => {
+
eprintln!("[Firehose] Stream error: {}", e);
+
// Try to reconnect after a delay
+
tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;
+
return Box::pin(watch_firehose(state)).await;
+
}
+
None => {
+
println!("[Firehose] Stream ended, reconnecting...");
+
tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;
+
return Box::pin(watch_firehose(state)).await;
+
}
+
}
+
}
+
})
+
}
+
+
/// Handle a firehose message
+
async fn handle_firehose_message(
+
state: &ServerState,
+
msg: JetstreamMessage<'_>,
+
) -> miette::Result<()> {
+
match msg {
+
JetstreamMessage::Commit {
+
did,
+
commit,
+
..
+
} => {
+
// Check if this is our site
+
if did.as_str() == state.did.as_str()
+
&& commit.collection.as_str() == "place.wisp.fs"
+
&& commit.rkey.as_str() == state.rkey.as_str()
+
{
+
match commit.operation {
+
CommitOperation::Create | CommitOperation::Update => {
+
let new_cid = commit.cid.as_ref().map(|c| c.to_string());
+
+
// Check if CID changed
+
let should_update = {
+
let last_cid = state.last_cid.read().await;
+
new_cid != *last_cid
+
};
+
+
if should_update {
+
println!("\n[Update] Detected change to site {} (CID: {:?})", state.rkey, new_cid);
+
println!("[Update] Pulling latest version...");
+
+
// Pull the updated site
+
match pull_site(
+
state.did.clone(),
+
state.rkey.clone(),
+
state.output_dir.clone(),
+
)
+
.await
+
{
+
Ok(_) => {
+
// Update last CID
+
let mut last_cid = state.last_cid.write().await;
+
*last_cid = new_cid;
+
println!("[Update] โœ“ Site updated successfully!\n");
+
}
+
Err(e) => {
+
eprintln!("[Update] Failed to pull site: {}", e);
+
}
+
}
+
}
+
}
+
CommitOperation::Delete => {
+
println!("\n[Update] Site {} was deleted", state.rkey);
+
}
+
}
+
}
+
}
+
_ => {
+
// Ignore identity and account messages
+
}
+
}
+
+
Ok(())
+
}
+
-14
cli/test_headers.rs
···
-
use http::Request;
-
-
fn main() {
-
let builder = Request::builder()
-
.header(http::header::CONTENT_TYPE, "*/*")
-
.header(http::header::CONTENT_TYPE, "application/octet-stream");
-
-
let req = builder.body(()).unwrap();
-
-
println!("Content-Type headers:");
-
for value in req.headers().get_all(http::header::CONTENT_TYPE) {
-
println!(" {:?}", value);
-
}
-
}
+90
crates.nix
···
+
{...}: {
+
perSystem = {
+
pkgs,
+
config,
+
lib,
+
inputs',
+
...
+
}: {
+
# declare projects
+
nci.projects."wisp-place-cli" = {
+
path = ./cli;
+
export = false;
+
};
+
nci.toolchains.mkBuild = _:
+
with inputs'.fenix.packages;
+
combine [
+
minimal.rustc
+
minimal.cargo
+
targets.x86_64-pc-windows-gnu.latest.rust-std
+
targets.x86_64-unknown-linux-gnu.latest.rust-std
+
targets.aarch64-apple-darwin.latest.rust-std
+
targets.aarch64-unknown-linux-gnu.latest.rust-std
+
];
+
# configure crates
+
nci.crates."wisp-cli" = {
+
profiles = {
+
dev.runTests = false;
+
release.runTests = false;
+
};
+
targets."x86_64-unknown-linux-gnu" = let
+
targetPkgs = pkgs.pkgsCross.gnu64;
+
targetCC = targetPkgs.stdenv.cc;
+
targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
+
in rec {
+
default = true;
+
depsDrvConfig.mkDerivation = {
+
nativeBuildInputs = [targetCC];
+
};
+
depsDrvConfig.env = rec {
+
TARGET_CC = "${targetCC.targetPrefix}cc";
+
"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
+
};
+
drvConfig = depsDrvConfig;
+
};
+
targets."x86_64-pc-windows-gnu" = let
+
targetPkgs = pkgs.pkgsCross.mingwW64;
+
targetCC = targetPkgs.stdenv.cc;
+
targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
+
in rec {
+
depsDrvConfig.mkDerivation = {
+
nativeBuildInputs = [targetCC];
+
buildInputs = with targetPkgs; [windows.pthreads];
+
};
+
depsDrvConfig.env = rec {
+
TARGET_CC = "${targetCC.targetPrefix}cc";
+
"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
+
};
+
drvConfig = depsDrvConfig;
+
};
+
targets."aarch64-apple-darwin" = let
+
targetPkgs = pkgs.pkgsCross.aarch64-darwin;
+
targetCC = targetPkgs.stdenv.cc;
+
targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
+
in rec {
+
depsDrvConfig.mkDerivation = {
+
nativeBuildInputs = [targetCC];
+
};
+
depsDrvConfig.env = rec {
+
TARGET_CC = "${targetCC.targetPrefix}cc";
+
"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
+
};
+
drvConfig = depsDrvConfig;
+
};
+
targets."aarch64-unknown-linux-gnu" = let
+
targetPkgs = pkgs.pkgsCross.aarch64-multiplatform;
+
targetCC = targetPkgs.stdenv.cc;
+
targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
+
in rec {
+
depsDrvConfig.mkDerivation = {
+
nativeBuildInputs = [targetCC];
+
};
+
depsDrvConfig.env = rec {
+
TARGET_CC = "${targetCC.targetPrefix}cc";
+
"CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
+
};
+
drvConfig = depsDrvConfig;
+
};
+
};
+
};
+
}
+318
flake.lock
···
+
{
+
"nodes": {
+
"crane": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1758758545,
+
"narHash": "sha256-NU5WaEdfwF6i8faJ2Yh+jcK9vVFrofLcwlD/mP65JrI=",
+
"owner": "ipetkov",
+
"repo": "crane",
+
"rev": "95d528a5f54eaba0d12102249ce42f4d01f4e364",
+
"type": "github"
+
},
+
"original": {
+
"owner": "ipetkov",
+
"ref": "v0.21.1",
+
"repo": "crane",
+
"type": "github"
+
}
+
},
+
"dream2nix": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
],
+
"purescript-overlay": "purescript-overlay",
+
"pyproject-nix": "pyproject-nix"
+
},
+
"locked": {
+
"lastModified": 1754978539,
+
"narHash": "sha256-nrDovydywSKRbWim9Ynmgj8SBm8LK3DI2WuhIqzOHYI=",
+
"owner": "nix-community",
+
"repo": "dream2nix",
+
"rev": "fbec3263cb4895ac86ee9506cdc4e6919a1a2214",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nix-community",
+
"repo": "dream2nix",
+
"type": "github"
+
}
+
},
+
"fenix": {
+
"inputs": {
+
"nixpkgs": [
+
"nixpkgs"
+
],
+
"rust-analyzer-src": "rust-analyzer-src"
+
},
+
"locked": {
+
"lastModified": 1762584108,
+
"narHash": "sha256-wZUW7dlXMXaRdvNbaADqhF8gg9bAfFiMV+iyFQiDv+Y=",
+
"owner": "nix-community",
+
"repo": "fenix",
+
"rev": "32f3ad3b6c690061173e1ac16708874975ec6056",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nix-community",
+
"repo": "fenix",
+
"type": "github"
+
}
+
},
+
"flake-compat": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1696426674,
+
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
+
"owner": "edolstra",
+
"repo": "flake-compat",
+
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
+
"type": "github"
+
},
+
"original": {
+
"owner": "edolstra",
+
"repo": "flake-compat",
+
"type": "github"
+
}
+
},
+
"mk-naked-shell": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1681286841,
+
"narHash": "sha256-3XlJrwlR0nBiREnuogoa5i1b4+w/XPe0z8bbrJASw0g=",
+
"owner": "90-008",
+
"repo": "mk-naked-shell",
+
"rev": "7612f828dd6f22b7fb332cc69440e839d7ffe6bd",
+
"type": "github"
+
},
+
"original": {
+
"owner": "90-008",
+
"repo": "mk-naked-shell",
+
"type": "github"
+
}
+
},
+
"nci": {
+
"inputs": {
+
"crane": "crane",
+
"dream2nix": "dream2nix",
+
"mk-naked-shell": "mk-naked-shell",
+
"nixpkgs": [
+
"nixpkgs"
+
],
+
"parts": "parts",
+
"rust-overlay": "rust-overlay",
+
"treefmt": "treefmt"
+
},
+
"locked": {
+
"lastModified": 1762582646,
+
"narHash": "sha256-MMzE4xccG+8qbLhdaZoeFDUKWUOn3B4lhp5dZmgukmM=",
+
"owner": "90-008",
+
"repo": "nix-cargo-integration",
+
"rev": "0993c449377049fa8868a664e8290ac6658e0b9a",
+
"type": "github"
+
},
+
"original": {
+
"owner": "90-008",
+
"repo": "nix-cargo-integration",
+
"type": "github"
+
}
+
},
+
"nixpkgs": {
+
"locked": {
+
"lastModified": 1762361079,
+
"narHash": "sha256-lz718rr1BDpZBYk7+G8cE6wee3PiBUpn8aomG/vLLiY=",
+
"owner": "nixos",
+
"repo": "nixpkgs",
+
"rev": "ffcdcf99d65c61956d882df249a9be53e5902ea5",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nixos",
+
"ref": "nixpkgs-unstable",
+
"repo": "nixpkgs",
+
"type": "github"
+
}
+
},
+
"parts": {
+
"inputs": {
+
"nixpkgs-lib": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762440070,
+
"narHash": "sha256-xxdepIcb39UJ94+YydGP221rjnpkDZUlykKuF54PsqI=",
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"rev": "26d05891e14c88eb4a5d5bee659c0db5afb609d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"type": "github"
+
}
+
},
+
"parts_2": {
+
"inputs": {
+
"nixpkgs-lib": [
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762440070,
+
"narHash": "sha256-xxdepIcb39UJ94+YydGP221rjnpkDZUlykKuF54PsqI=",
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"rev": "26d05891e14c88eb4a5d5bee659c0db5afb609d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"type": "github"
+
}
+
},
+
"purescript-overlay": {
+
"inputs": {
+
"flake-compat": "flake-compat",
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"nixpkgs"
+
],
+
"slimlock": "slimlock"
+
},
+
"locked": {
+
"lastModified": 1728546539,
+
"narHash": "sha256-Sws7w0tlnjD+Bjck1nv29NjC5DbL6nH5auL9Ex9Iz2A=",
+
"owner": "thomashoneyman",
+
"repo": "purescript-overlay",
+
"rev": "4ad4c15d07bd899d7346b331f377606631eb0ee4",
+
"type": "github"
+
},
+
"original": {
+
"owner": "thomashoneyman",
+
"repo": "purescript-overlay",
+
"type": "github"
+
}
+
},
+
"pyproject-nix": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1752481895,
+
"narHash": "sha256-luVj97hIMpCbwhx3hWiRwjP2YvljWy8FM+4W9njDhLA=",
+
"owner": "pyproject-nix",
+
"repo": "pyproject.nix",
+
"rev": "16ee295c25107a94e59a7fc7f2e5322851781162",
+
"type": "github"
+
},
+
"original": {
+
"owner": "pyproject-nix",
+
"repo": "pyproject.nix",
+
"type": "github"
+
}
+
},
+
"root": {
+
"inputs": {
+
"fenix": "fenix",
+
"nci": "nci",
+
"nixpkgs": "nixpkgs",
+
"parts": "parts_2"
+
}
+
},
+
"rust-analyzer-src": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1762438844,
+
"narHash": "sha256-ApIKJf6CcMsV2nYBXhGF95BmZMO/QXPhgfSnkA/rVUo=",
+
"owner": "rust-lang",
+
"repo": "rust-analyzer",
+
"rev": "4bf516ee5a960c1e2eee9fedd9b1c9e976a19c86",
+
"type": "github"
+
},
+
"original": {
+
"owner": "rust-lang",
+
"ref": "nightly",
+
"repo": "rust-analyzer",
+
"type": "github"
+
}
+
},
+
"rust-overlay": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762569282,
+
"narHash": "sha256-vINZAJpXQTZd5cfh06Rcw7hesH7sGSvi+Tn+HUieJn8=",
+
"owner": "oxalica",
+
"repo": "rust-overlay",
+
"rev": "a35a6144b976f70827c2fe2f5c89d16d8f9179d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "oxalica",
+
"repo": "rust-overlay",
+
"type": "github"
+
}
+
},
+
"slimlock": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"purescript-overlay",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1688756706,
+
"narHash": "sha256-xzkkMv3neJJJ89zo3o2ojp7nFeaZc2G0fYwNXNJRFlo=",
+
"owner": "thomashoneyman",
+
"repo": "slimlock",
+
"rev": "cf72723f59e2340d24881fd7bf61cb113b4c407c",
+
"type": "github"
+
},
+
"original": {
+
"owner": "thomashoneyman",
+
"repo": "slimlock",
+
"type": "github"
+
}
+
},
+
"treefmt": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762410071,
+
"narHash": "sha256-aF5fvoZeoXNPxT0bejFUBXeUjXfHLSL7g+mjR/p5TEg=",
+
"owner": "numtide",
+
"repo": "treefmt-nix",
+
"rev": "97a30861b13c3731a84e09405414398fbf3e109f",
+
"type": "github"
+
},
+
"original": {
+
"owner": "numtide",
+
"repo": "treefmt-nix",
+
"type": "github"
+
}
+
}
+
},
+
"root": "root",
+
"version": 7
+
}
+59
flake.nix
···
+
{
+
inputs.nixpkgs.url = "github:nixos/nixpkgs/nixpkgs-unstable";
+
inputs.nci.url = "github:90-008/nix-cargo-integration";
+
inputs.nci.inputs.nixpkgs.follows = "nixpkgs";
+
inputs.parts.url = "github:hercules-ci/flake-parts";
+
inputs.parts.inputs.nixpkgs-lib.follows = "nixpkgs";
+
inputs.fenix = {
+
url = "github:nix-community/fenix";
+
inputs.nixpkgs.follows = "nixpkgs";
+
};
+
+
outputs = inputs @ {
+
parts,
+
nci,
+
...
+
}:
+
parts.lib.mkFlake {inherit inputs;} {
+
systems = ["x86_64-linux" "aarch64-darwin"];
+
imports = [
+
nci.flakeModule
+
./crates.nix
+
];
+
perSystem = {
+
pkgs,
+
config,
+
...
+
}: let
+
crateOutputs = config.nci.outputs."wisp-cli";
+
mkRenamedPackage = name: pkg: isWindows: pkgs.runCommand name {} ''
+
mkdir -p $out/bin
+
if [ -f ${pkg}/bin/wisp-cli.exe ]; then
+
cp ${pkg}/bin/wisp-cli.exe $out/bin/${name}
+
elif [ -f ${pkg}/bin/wisp-cli ]; then
+
cp ${pkg}/bin/wisp-cli $out/bin/${name}
+
else
+
echo "Error: Could not find wisp-cli binary in ${pkg}/bin/"
+
ls -la ${pkg}/bin/ || true
+
exit 1
+
fi
+
'';
+
in {
+
devShells.default = crateOutputs.devShell;
+
packages.default = crateOutputs.packages.release;
+
packages.wisp-cli-x86_64-linux = mkRenamedPackage "wisp-cli-x86_64-linux" crateOutputs.packages.release false;
+
packages.wisp-cli-aarch64-linux = mkRenamedPackage "wisp-cli-aarch64-linux" crateOutputs.allTargets."aarch64-unknown-linux-gnu".packages.release false;
+
packages.wisp-cli-x86_64-windows = mkRenamedPackage "wisp-cli-x86_64-windows.exe" crateOutputs.allTargets."x86_64-pc-windows-gnu".packages.release true;
+
packages.wisp-cli-aarch64-darwin = mkRenamedPackage "wisp-cli-aarch64-darwin" crateOutputs.allTargets."aarch64-apple-darwin".packages.release false;
+
packages.all = pkgs.symlinkJoin {
+
name = "wisp-cli-all";
+
paths = [
+
config.packages.wisp-cli-x86_64-linux
+
config.packages.wisp-cli-aarch64-linux
+
config.packages.wisp-cli-x86_64-windows
+
config.packages.wisp-cli-aarch64-darwin
+
];
+
};
+
};
+
};
+
}
+7 -7
hosting-service/Dockerfile
···
-
# Use official Bun image
-
FROM oven/bun:1.3 AS base
+
# Use official Node.js Alpine image
+
FROM node:alpine AS base
# Set working directory
WORKDIR /app
# Copy package files
-
COPY package.json bun.lock ./
+
COPY package.json ./
# Install dependencies
-
RUN bun install --frozen-lockfile --production
+
RUN npm install
# Copy source code
COPY src ./src
···
# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
-
CMD bun -e "fetch('http://localhost:3001/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
+
CMD node -e "fetch('http://localhost:3001/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
-
# Start the application
-
CMD ["bun", "src/index.ts"]
+
# Start the application (can override with 'npm run backfill' in compose)
+
CMD ["npm", "run", "start"]
-123
hosting-service/EXAMPLE.md
···
-
# HTML Path Rewriting Example
-
-
This document demonstrates how HTML path rewriting works when serving sites via the `/s/:identifier/:site/*` route.
-
-
## Problem
-
-
When you create a static site with absolute paths like `/style.css` or `/images/logo.png`, these paths work fine when served from the root domain. However, when served from a subdirectory like `/s/alice.bsky.social/mysite/`, these absolute paths break because they resolve to the server root instead of the site root.
-
-
## Solution
-
-
The hosting service automatically rewrites absolute paths in HTML files to work correctly in the subdirectory context.
-
-
## Example
-
-
**Original HTML file (index.html):**
-
```html
-
<!DOCTYPE html>
-
<html>
-
<head>
-
<meta charset="UTF-8">
-
<title>My Site</title>
-
<link rel="stylesheet" href="/style.css">
-
<link rel="icon" href="/favicon.ico">
-
<script src="/app.js"></script>
-
</head>
-
<body>
-
<header>
-
<img src="/images/logo.png" alt="Logo">
-
<nav>
-
<a href="/">Home</a>
-
<a href="/about">About</a>
-
<a href="/contact">Contact</a>
-
</nav>
-
</header>
-
-
<main>
-
<h1>Welcome</h1>
-
<img src="/images/hero.jpg"
-
srcset="/images/hero.jpg 1x, /images/hero@2x.jpg 2x"
-
alt="Hero">
-
-
<form action="/submit" method="post">
-
<input type="text" name="email">
-
<button>Submit</button>
-
</form>
-
</main>
-
-
<footer>
-
<a href="https://example.com">External Link</a>
-
<a href="#top">Back to Top</a>
-
</footer>
-
</body>
-
</html>
-
```
-
-
**When accessed via `/s/alice.bsky.social/mysite/`, the HTML is rewritten to:**
-
```html
-
<!DOCTYPE html>
-
<html>
-
<head>
-
<meta charset="UTF-8">
-
<title>My Site</title>
-
<link rel="stylesheet" href="/s/alice.bsky.social/mysite/style.css">
-
<link rel="icon" href="/s/alice.bsky.social/mysite/favicon.ico">
-
<script src="/s/alice.bsky.social/mysite/app.js"></script>
-
</head>
-
<body>
-
<header>
-
<img src="/s/alice.bsky.social/mysite/images/logo.png" alt="Logo">
-
<nav>
-
<a href="/s/alice.bsky.social/mysite/">Home</a>
-
<a href="/s/alice.bsky.social/mysite/about">About</a>
-
<a href="/s/alice.bsky.social/mysite/contact">Contact</a>
-
</nav>
-
</header>
-
-
<main>
-
<h1>Welcome</h1>
-
<img src="/s/alice.bsky.social/mysite/images/hero.jpg"
-
srcset="/s/alice.bsky.social/mysite/images/hero.jpg 1x, /s/alice.bsky.social/mysite/images/hero@2x.jpg 2x"
-
alt="Hero">
-
-
<form action="/s/alice.bsky.social/mysite/submit" method="post">
-
<input type="text" name="email">
-
<button>Submit</button>
-
</form>
-
</main>
-
-
<footer>
-
<a href="https://example.com">External Link</a>
-
<a href="#top">Back to Top</a>
-
</footer>
-
</body>
-
</html>
-
```
-
-
## What's Preserved
-
-
Notice that:
-
- โœ… Absolute paths are rewritten: `/style.css` โ†’ `/s/alice.bsky.social/mysite/style.css`
-
- โœ… External URLs are preserved: `https://example.com` stays the same
-
- โœ… Anchors are preserved: `#top` stays the same
-
- โœ… The rewriting is safe and won't break your site
-
-
## Supported Attributes
-
-
The rewriter handles these HTML attributes:
-
- `src` - images, scripts, iframes, videos, audio
-
- `href` - links, stylesheets
-
- `action` - forms
-
- `data` - objects
-
- `poster` - video posters
-
- `srcset` - responsive images
-
-
## Testing Your Site
-
-
To test if your site works with path rewriting:
-
-
1. Upload your site to your PDS as a `place.wisp.fs` record
-
2. Access it via: `https://hosting.wisp.place/s/YOUR_HANDLE/SITE_NAME/`
-
3. Check that all resources load correctly
-
-
If you're using relative paths already (like `./style.css` or `../images/logo.png`), they'll work without any rewriting.
+32
hosting-service/docker-entrypoint.sh
···
+
#!/bin/sh
+
set -e
+
+
# Run in one of several modes based on the MODE environment variable
+
# Modes:
+
# - server (default): Start the hosting service
+
# - backfill: Run cache backfill and exit
+
# - backfill-server: Run cache backfill, then start the server
+
+
MODE="${MODE:-server}"
+
+
case "$MODE" in
+
backfill)
+
echo "๐Ÿ”„ Running in backfill-only mode..."
+
exec npm run backfill
+
;;
+
backfill-server)
+
echo "๐Ÿ”„ Running backfill, then starting server..."
+
npm run backfill
+
echo "โœ… Backfill complete, starting server..."
+
exec npm run start
+
;;
+
server)
+
echo "๐Ÿš€ Starting server..."
+
exec npm run start
+
;;
+
*)
+
echo "โŒ Unknown MODE: $MODE"
+
echo "Valid modes: server, backfill, backfill-server"
+
exit 1
+
;;
+
esac
+134
hosting-service/example-_redirects
···
+
# Example _redirects file for Wisp hosting
+
# Place this file in the root directory of your site as "_redirects"
+
# Lines starting with # are comments
+
+
# ===================================
+
# SIMPLE REDIRECTS
+
# ===================================
+
+
# Redirect home page
+
# /home /
+
+
# Redirect old URLs to new ones
+
# /old-blog /blog
+
# /about-us /about
+
+
# ===================================
+
# SPLAT REDIRECTS (WILDCARDS)
+
# ===================================
+
+
# Redirect entire directories
+
# /news/* /blog/:splat
+
# /old-site/* /new-site/:splat
+
+
# ===================================
+
# PLACEHOLDER REDIRECTS
+
# ===================================
+
+
# Restructure blog URLs
+
# /blog/:year/:month/:day/:slug /posts/:year-:month-:day/:slug
+
+
# Capture multiple parameters
+
# /products/:category/:id /shop/:category/item/:id
+
+
# ===================================
+
# STATUS CODES
+
# ===================================
+
+
# Permanent redirect (301) - default if not specified
+
# /permanent-move /new-location 301
+
+
# Temporary redirect (302)
+
# /temp-redirect /temp-location 302
+
+
# Rewrite (200) - serves different content, URL stays the same
+
# /api/* /functions/:splat 200
+
+
# Custom 404 page
+
# /shop/* /shop-closed.html 404
+
+
# ===================================
+
# FORCE REDIRECTS
+
# ===================================
+
+
# Force redirect even if file exists (note the ! after status code)
+
# /override-file /other-file.html 200!
+
+
# ===================================
+
# CONDITIONAL REDIRECTS
+
# ===================================
+
+
# Country-based redirects (ISO 3166-1 alpha-2 codes)
+
# / /us/ 302 Country=us
+
# / /uk/ 302 Country=gb
+
# / /anz/ 302 Country=au,nz
+
+
# Language-based redirects
+
# /products /en/products 301 Language=en
+
# /products /de/products 301 Language=de
+
# /products /fr/products 301 Language=fr
+
+
# Cookie-based redirects (checks if cookie exists)
+
# /* /legacy/:splat 200 Cookie=is_legacy
+
+
# ===================================
+
# QUERY PARAMETERS
+
# ===================================
+
+
# Match specific query parameters
+
# /store id=:id /blog/:id 301
+
+
# Multiple parameters
+
# /search q=:query category=:cat /find/:cat/:query 301
+
+
# ===================================
+
# DOMAIN-LEVEL REDIRECTS
+
# ===================================
+
+
# Redirect to different domain (must include protocol)
+
# /external https://example.com/path
+
+
# Redirect entire subdomain
+
# http://blog.example.com/* https://example.com/blog/:splat 301!
+
# https://blog.example.com/* https://example.com/blog/:splat 301!
+
+
# ===================================
+
# COMMON PATTERNS
+
# ===================================
+
+
# Remove .html extensions
+
# /page.html /page
+
+
# Add trailing slash
+
# /about /about/
+
+
# Single-page app fallback (serve index.html for all paths)
+
# /* /index.html 200
+
+
# API proxy
+
# /api/* https://api.example.com/:splat 200
+
+
# ===================================
+
# CUSTOM ERROR PAGES
+
# ===================================
+
+
# Language-specific 404 pages
+
# /en/* /en/404.html 404
+
# /de/* /de/404.html 404
+
+
# Section-specific 404 pages
+
# /shop/* /shop/not-found.html 404
+
# /blog/* /blog/404.html 404
+
+
# ===================================
+
# NOTES
+
# ===================================
+
#
+
# - Rules are processed in order (first match wins)
+
# - More specific rules should come before general ones
+
# - Splats (*) can only be used at the end of a path
+
# - Query parameters are automatically preserved for 200, 301, 302
+
# - Trailing slashes are normalized (/ and no / are treated the same)
+
# - Default status code is 301 if not specified
+
#
+
+3 -2
hosting-service/package.json
···
"version": "1.0.0",
"type": "module",
"scripts": {
-
"dev": "tsx watch src/index.ts",
+
"dev": "tsx --env-file=.env watch src/index.ts",
"build": "tsc",
-
"start": "tsx src/index.ts"
+
"start": "tsx src/index.ts",
+
"backfill": "tsx src/index.ts --backfill"
},
"dependencies": {
"@atproto/api": "^0.17.4",
+36 -1
hosting-service/src/index.ts
···
import { FirehoseWorker } from './lib/firehose';
import { logger } from './lib/observability';
import { mkdirSync, existsSync } from 'fs';
+
import { backfillCache } from './lib/backfill';
+
import { startDomainCacheCleanup, stopDomainCacheCleanup, setCacheOnlyMode } from './lib/db';
const PORT = process.env.PORT ? parseInt(process.env.PORT) : 3001;
-
const CACHE_DIR = './cache/sites';
+
const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';
+
+
// Parse CLI arguments
+
const args = process.argv.slice(2);
+
const hasBackfillFlag = args.includes('--backfill');
+
const backfillOnStartup = hasBackfillFlag || process.env.BACKFILL_ON_STARTUP === 'true';
+
+
// Cache-only mode: service will only cache files locally, no DB writes
+
const hasCacheOnlyFlag = args.includes('--cache-only');
+
export const CACHE_ONLY_MODE = hasCacheOnlyFlag || process.env.CACHE_ONLY_MODE === 'true';
+
+
// Configure cache-only mode in database module
+
if (CACHE_ONLY_MODE) {
+
setCacheOnlyMode(true);
+
}
// Ensure cache directory exists
if (!existsSync(CACHE_DIR)) {
···
console.log('Created cache directory:', CACHE_DIR);
}
+
// Start domain cache cleanup
+
startDomainCacheCleanup();
+
// Start firehose worker with observability logger
const firehose = new FirehoseWorker((msg, data) => {
logger.info(msg, data);
···
firehose.start();
+
// Run backfill if requested
+
if (backfillOnStartup) {
+
console.log('๐Ÿ”„ Backfill requested, starting cache backfill...');
+
backfillCache({
+
skipExisting: true,
+
concurrency: 3,
+
}).then((stats) => {
+
console.log('โœ… Cache backfill completed');
+
}).catch((err) => {
+
console.error('โŒ Cache backfill error:', err);
+
});
+
}
+
// Add health check endpoint
app.get('/health', (c) => {
const firehoseHealth = firehose.getHealth();
···
Health: http://localhost:${PORT}/health
Cache: ${CACHE_DIR}
Firehose: Connected to Firehose
+
Cache-Only: ${CACHE_ONLY_MODE ? 'ENABLED (no DB writes)' : 'DISABLED'}
`);
// Graceful shutdown
process.on('SIGINT', async () => {
console.log('\n๐Ÿ›‘ Shutting down...');
firehose.stop();
+
stopDomainCacheCleanup();
server.close();
process.exit(0);
});
···
process.on('SIGTERM', async () => {
console.log('\n๐Ÿ›‘ Shutting down...');
firehose.stop();
+
stopDomainCacheCleanup();
server.close();
process.exit(0);
});
+136
hosting-service/src/lib/backfill.ts
···
+
import { getAllSites } from './db';
+
import { fetchSiteRecord, getPdsForDid, downloadAndCacheSite, isCached } from './utils';
+
import { logger } from './observability';
+
+
export interface BackfillOptions {
+
skipExisting?: boolean; // Skip sites already in cache
+
concurrency?: number; // Number of sites to cache concurrently
+
maxSites?: number; // Maximum number of sites to backfill (for testing)
+
}
+
+
export interface BackfillStats {
+
total: number;
+
cached: number;
+
skipped: number;
+
failed: number;
+
duration: number;
+
}
+
+
/**
+
* Backfill all sites from the database into the local cache
+
*/
+
export async function backfillCache(options: BackfillOptions = {}): Promise<BackfillStats> {
+
const {
+
skipExisting = true,
+
concurrency = 3,
+
maxSites,
+
} = options;
+
+
const startTime = Date.now();
+
const stats: BackfillStats = {
+
total: 0,
+
cached: 0,
+
skipped: 0,
+
failed: 0,
+
duration: 0,
+
};
+
+
logger.info('Starting cache backfill', { skipExisting, concurrency, maxSites });
+
console.log(`
+
โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—
+
โ•‘ CACHE BACKFILL STARTING โ•‘
+
โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
+
`);
+
+
try {
+
// Get all sites from database
+
let sites = await getAllSites();
+
stats.total = sites.length;
+
+
logger.info(`Found ${sites.length} sites in database`);
+
console.log(`๐Ÿ“Š Found ${sites.length} sites in database`);
+
+
// Limit if specified
+
if (maxSites && maxSites > 0) {
+
sites = sites.slice(0, maxSites);
+
console.log(`โš™๏ธ Limited to ${maxSites} sites for backfill`);
+
}
+
+
// Process sites in batches
+
const batches: typeof sites[] = [];
+
for (let i = 0; i < sites.length; i += concurrency) {
+
batches.push(sites.slice(i, i + concurrency));
+
}
+
+
let processed = 0;
+
for (const batch of batches) {
+
await Promise.all(
+
batch.map(async (site) => {
+
try {
+
// Check if already cached
+
if (skipExisting && isCached(site.did, site.rkey)) {
+
stats.skipped++;
+
processed++;
+
logger.debug(`Skipping already cached site`, { did: site.did, rkey: site.rkey });
+
console.log(`โญ๏ธ [${processed}/${sites.length}] Skipped (cached): ${site.display_name || site.rkey}`);
+
return;
+
}
+
+
// Fetch site record
+
const siteData = await fetchSiteRecord(site.did, site.rkey);
+
if (!siteData) {
+
stats.failed++;
+
processed++;
+
logger.error('Site record not found during backfill', null, { did: site.did, rkey: site.rkey });
+
console.log(`โŒ [${processed}/${sites.length}] Failed (not found): ${site.display_name || site.rkey}`);
+
return;
+
}
+
+
// Get PDS endpoint
+
const pdsEndpoint = await getPdsForDid(site.did);
+
if (!pdsEndpoint) {
+
stats.failed++;
+
processed++;
+
logger.error('PDS not found during backfill', null, { did: site.did });
+
console.log(`โŒ [${processed}/${sites.length}] Failed (no PDS): ${site.display_name || site.rkey}`);
+
return;
+
}
+
+
// Download and cache site
+
await downloadAndCacheSite(site.did, site.rkey, siteData.record, pdsEndpoint, siteData.cid);
+
stats.cached++;
+
processed++;
+
logger.info('Successfully cached site during backfill', { did: site.did, rkey: site.rkey });
+
console.log(`โœ… [${processed}/${sites.length}] Cached: ${site.display_name || site.rkey}`);
+
} catch (err) {
+
stats.failed++;
+
processed++;
+
logger.error('Failed to cache site during backfill', err, { did: site.did, rkey: site.rkey });
+
console.log(`โŒ [${processed}/${sites.length}] Failed: ${site.display_name || site.rkey}`);
+
}
+
})
+
);
+
}
+
+
stats.duration = Date.now() - startTime;
+
+
console.log(`
+
โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—
+
โ•‘ CACHE BACKFILL COMPLETED โ•‘
+
โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
+
+
๐Ÿ“Š Total Sites: ${stats.total}
+
โœ… Cached: ${stats.cached}
+
โญ๏ธ Skipped: ${stats.skipped}
+
โŒ Failed: ${stats.failed}
+
โฑ๏ธ Duration: ${(stats.duration / 1000).toFixed(2)}s
+
`);
+
+
logger.info('Cache backfill completed', stats);
+
} catch (err) {
+
logger.error('Cache backfill failed', err);
+
console.error('โŒ Cache backfill failed:', err);
+
}
+
+
return stats;
+
}
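A minimal usage sketch for the exported `backfillCache` helper, using only the `BackfillOptions` and `BackfillStats` fields defined above; the `maxSites` cap and the import path are illustrative assumptions.

```typescript
import { backfillCache } from './lib/backfill';

// Bounded, one-off backfill - e.g. for smoke-testing the cache path locally.
const stats = await backfillCache({
  skipExisting: true, // don't re-download sites already cached on disk
  concurrency: 3,     // sites processed in parallel per batch
  maxSites: 25,       // arbitrary cap for testing
});

console.log(
  `cached ${stats.cached}/${stats.total}, skipped ${stats.skipped}, ` +
  `failed ${stats.failed} in ${(stats.duration / 1000).toFixed(1)}s`
);
```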
+177
hosting-service/src/lib/cache.ts
···
+
// In-memory LRU cache for file contents and metadata
+
+
interface CacheEntry<T> {
+
value: T;
+
size: number;
+
timestamp: number;
+
}
+
+
interface CacheStats {
+
hits: number;
+
misses: number;
+
evictions: number;
+
currentSize: number;
+
currentCount: number;
+
}
+
+
export class LRUCache<T> {
+
private cache: Map<string, CacheEntry<T>>;
+
private maxSize: number;
+
private maxCount: number;
+
private currentSize: number;
+
private stats: CacheStats;
+
+
constructor(maxSize: number, maxCount: number) {
+
this.cache = new Map();
+
this.maxSize = maxSize;
+
this.maxCount = maxCount;
+
this.currentSize = 0;
+
this.stats = {
+
hits: 0,
+
misses: 0,
+
evictions: 0,
+
currentSize: 0,
+
currentCount: 0,
+
};
+
}
+
+
get(key: string): T | null {
+
const entry = this.cache.get(key);
+
if (!entry) {
+
this.stats.misses++;
+
return null;
+
}
+
+
// Move to end (most recently used)
+
this.cache.delete(key);
+
this.cache.set(key, entry);
+
+
this.stats.hits++;
+
return entry.value;
+
}
+
+
set(key: string, value: T, size: number): void {
+
// Remove existing entry if present
+
if (this.cache.has(key)) {
+
const existing = this.cache.get(key)!;
+
this.currentSize -= existing.size;
+
this.cache.delete(key);
+
}
+
+
// Evict entries if needed
+
while (
+
(this.cache.size >= this.maxCount || this.currentSize + size > this.maxSize) &&
+
this.cache.size > 0
+
) {
+
const firstKey = this.cache.keys().next().value;
+
if (!firstKey) break; // Should never happen, but satisfy TypeScript
+
const firstEntry = this.cache.get(firstKey);
+
if (!firstEntry) break; // Should never happen, but satisfy TypeScript
+
this.cache.delete(firstKey);
+
this.currentSize -= firstEntry.size;
+
this.stats.evictions++;
+
}
+
+
// Add new entry
+
this.cache.set(key, {
+
value,
+
size,
+
timestamp: Date.now(),
+
});
+
this.currentSize += size;
+
+
// Update stats
+
this.stats.currentSize = this.currentSize;
+
this.stats.currentCount = this.cache.size;
+
}
+
+
delete(key: string): boolean {
+
const entry = this.cache.get(key);
+
if (!entry) return false;
+
+
this.cache.delete(key);
+
this.currentSize -= entry.size;
+
this.stats.currentSize = this.currentSize;
+
this.stats.currentCount = this.cache.size;
+
return true;
+
}
+
+
// Invalidate all entries for a specific site
+
invalidateSite(did: string, rkey: string): number {
+
const prefix = `${did}:${rkey}:`;
+
let count = 0;
+
+
for (const key of Array.from(this.cache.keys())) {
+
if (key.startsWith(prefix)) {
+
this.delete(key);
+
count++;
+
}
+
}
+
+
return count;
+
}
+
+
// Get cache size
+
size(): number {
+
return this.cache.size;
+
}
+
+
clear(): void {
+
this.cache.clear();
+
this.currentSize = 0;
+
this.stats.currentSize = 0;
+
this.stats.currentCount = 0;
+
}
+
+
getStats(): CacheStats {
+
return { ...this.stats };
+
}
+
+
// Get cache hit rate
+
getHitRate(): number {
+
const total = this.stats.hits + this.stats.misses;
+
return total === 0 ? 0 : (this.stats.hits / total) * 100;
+
}
+
}
+
+
// File metadata cache entry
+
export interface FileMetadata {
+
encoding?: 'gzip';
+
mimeType: string;
+
}
+
+
// Global cache instances
+
const FILE_CACHE_SIZE = 100 * 1024 * 1024; // 100MB
+
const FILE_CACHE_COUNT = 500;
+
const METADATA_CACHE_COUNT = 2000;
+
+
export const fileCache = new LRUCache<Buffer>(FILE_CACHE_SIZE, FILE_CACHE_COUNT);
+
export const metadataCache = new LRUCache<FileMetadata>(1024 * 1024, METADATA_CACHE_COUNT); // 1MB for metadata
+
export const rewrittenHtmlCache = new LRUCache<Buffer>(50 * 1024 * 1024, 200); // 50MB for rewritten HTML
+
+
// Helper to generate cache keys
+
export function getCacheKey(did: string, rkey: string, filePath: string, suffix?: string): string {
+
const base = `${did}:${rkey}:${filePath}`;
+
return suffix ? `${base}:${suffix}` : base;
+
}
+
+
// Invalidate all caches for a site
+
export function invalidateSiteCache(did: string, rkey: string): void {
+
const fileCount = fileCache.invalidateSite(did, rkey);
+
const metaCount = metadataCache.invalidateSite(did, rkey);
+
const htmlCount = rewrittenHtmlCache.invalidateSite(did, rkey);
+
+
console.log(`[Cache] Invalidated site ${did}:${rkey} - ${fileCount} files, ${metaCount} metadata, ${htmlCount} HTML`);
+
}
+
+
// Get overall cache statistics
+
export function getCacheStats() {
+
return {
+
files: fileCache.getStats(),
+
fileHitRate: fileCache.getHitRate(),
+
metadata: metadataCache.getStats(),
+
metadataHitRate: metadataCache.getHitRate(),
+
rewrittenHtml: rewrittenHtmlCache.getStats(),
+
rewrittenHtmlHitRate: rewrittenHtmlCache.getHitRate(),
+
};
+
}
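A short sketch of how these exports are intended to be used together: keys come from `getCacheKey`, each entry carries an explicit byte size for the LRU budget, and an update to a site drops all of its cached entries at once. The DID, rkey, and file contents below are made-up placeholders.

```typescript
import { fileCache, metadataCache, getCacheKey, invalidateSiteCache } from './lib/cache';

// Hypothetical identifiers, for illustration only.
const did = 'did:plc:example';
const rkey = 'my-site';
const contents = Buffer.from('<!doctype html><h1>hello</h1>');

// Cache a served file plus its metadata; the size argument feeds the LRU budget.
const key = getCacheKey(did, rkey, 'index.html');
fileCache.set(key, contents, contents.length);
metadataCache.set(key, { mimeType: 'text/html', encoding: 'gzip' }, 64);

// A hit returns the Buffer and bumps the entry to most-recently-used.
const cached = fileCache.get(key); // Buffer | null

// On a place.wisp.fs update for this site, drop everything cached for it.
invalidateSiteCache(did, rkey);
```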
+104
hosting-service/src/lib/db.ts
···
import postgres from 'postgres';
import { createHash } from 'crypto';
+
// Global cache-only mode flag (set by index.ts)
+
let cacheOnlyMode = false;
+
+
export function setCacheOnlyMode(enabled: boolean) {
+
cacheOnlyMode = enabled;
+
if (enabled) {
+
console.log('[DB] Cache-only mode enabled - database writes will be skipped');
+
}
+
}
+
const sql = postgres(
process.env.DATABASE_URL || 'postgres://postgres:postgres@localhost:5432/wisp',
{
···
}
);
+
// Domain lookup cache with TTL
+
const DOMAIN_CACHE_TTL = 5 * 60 * 1000; // 5 minutes
+
+
interface CachedDomain<T> {
+
value: T;
+
timestamp: number;
+
}
+
+
const domainCache = new Map<string, CachedDomain<DomainLookup | null>>();
+
const customDomainCache = new Map<string, CachedDomain<CustomDomainLookup | null>>();
+
+
let cleanupInterval: NodeJS.Timeout | null = null;
+
+
export function startDomainCacheCleanup() {
+
if (cleanupInterval) return;
+
+
cleanupInterval = setInterval(() => {
+
const now = Date.now();
+
+
for (const [key, entry] of domainCache.entries()) {
+
if (now - entry.timestamp > DOMAIN_CACHE_TTL) {
+
domainCache.delete(key);
+
}
+
}
+
+
for (const [key, entry] of customDomainCache.entries()) {
+
if (now - entry.timestamp > DOMAIN_CACHE_TTL) {
+
customDomainCache.delete(key);
+
}
+
}
+
}, 30 * 60 * 1000); // Run every 30 minutes
+
}
+
+
export function stopDomainCacheCleanup() {
+
if (cleanupInterval) {
+
clearInterval(cleanupInterval);
+
cleanupInterval = null;
+
}
+
}
+
export interface DomainLookup {
did: string;
rkey: string | null;
···
export async function getWispDomain(domain: string): Promise<DomainLookup | null> {
const key = domain.toLowerCase();
+
// Check cache first
+
const cached = domainCache.get(key);
+
if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
+
return cached.value;
+
}
+
// Query database
const result = await sql<DomainLookup[]>`
SELECT did, rkey FROM domains WHERE domain = ${key} LIMIT 1
`;
const data = result[0] || null;
+
+
// Cache the result
+
domainCache.set(key, { value: data, timestamp: Date.now() });
return data;
}
export async function getCustomDomain(domain: string): Promise<CustomDomainLookup | null> {
const key = domain.toLowerCase();
+
+
// Check cache first
+
const cached = customDomainCache.get(key);
+
if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
+
return cached.value;
+
}
// Query database
const result = await sql<CustomDomainLookup[]>`
···
`;
const data = result[0] || null;
+
// Cache the result
+
customDomainCache.set(key, { value: data, timestamp: Date.now() });
+
return data;
}
export async function getCustomDomainByHash(hash: string): Promise<CustomDomainLookup | null> {
+
const key = `hash:${hash}`;
+
+
// Check cache first
+
const cached = customDomainCache.get(key);
+
if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
+
return cached.value;
+
}
+
// Query database
const result = await sql<CustomDomainLookup[]>`
SELECT id, domain, did, rkey, verified FROM custom_domains
···
`;
const data = result[0] || null;
+
// Cache the result
+
customDomainCache.set(key, { value: data, timestamp: Date.now() });
+
return data;
}
export async function upsertSite(did: string, rkey: string, displayName?: string) {
+
// Skip database writes in cache-only mode
+
if (cacheOnlyMode) {
+
console.log('[DB] Skipping upsertSite (cache-only mode)', { did, rkey });
+
return;
+
}
+
try {
// Only set display_name if provided (not undefined/null/empty)
const cleanDisplayName = displayName && displayName.trim() ? displayName.trim() : null;
···
`;
} catch (err) {
console.error('Failed to upsert site', err);
+
}
+
}
+
+
export interface SiteRecord {
+
did: string;
+
rkey: string;
+
display_name?: string;
+
}
+
+
export async function getAllSites(): Promise<SiteRecord[]> {
+
try {
+
const result = await sql<SiteRecord[]>`
+
SELECT did, rkey, display_name FROM sites
+
ORDER BY created_at DESC
+
`;
+
return result;
+
} catch (err) {
+
console.error('Failed to get all sites', err);
+
return [];
}
}
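A sketch of how the new db helpers are meant to be wired up in a worker: cache-only mode is flipped on before any events are handled, the TTL cleanup timer runs once per process, and repeat domain lookups within the five-minute TTL are answered from the in-memory maps instead of Postgres. The domain, DID, and rkey values are placeholders.

```typescript
import {
  setCacheOnlyMode,
  startDomainCacheCleanup,
  stopDomainCacheCleanup,
  getWispDomain,
  upsertSite,
} from './lib/db';

// Opt this worker out of database writes before handling any firehose events.
setCacheOnlyMode(true);
startDomainCacheCleanup();

// First lookup hits Postgres; a repeat within DOMAIN_CACHE_TTL is served from memory.
const first = await getWispDomain('example.wisp.place');
const second = await getWispDomain('example.wisp.place'); // cached

// With cache-only mode enabled this logs and returns without touching the DB.
await upsertSite('did:plc:example', 'my-site');

// On shutdown, clear the cleanup interval so the process can exit cleanly.
stopDomainCacheCleanup();
```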
+268 -219
hosting-service/src/lib/firehose.ts
···
-
import { existsSync, rmSync } from 'fs';
-
import { getPdsForDid, downloadAndCacheSite, extractBlobCid, fetchSiteRecord } from './utils';
-
import { upsertSite, tryAcquireLock, releaseLock } from './db';
-
import { safeFetch } from './safe-fetch';
-
import { isRecord, validateRecord } from '../lexicon/types/place/wisp/fs';
-
import { Firehose } from '@atproto/sync';
-
import { IdResolver } from '@atproto/identity';
+
import { existsSync, rmSync } from 'fs'
+
import {
+
getPdsForDid,
+
downloadAndCacheSite,
+
extractBlobCid,
+
fetchSiteRecord
+
} from './utils'
+
import { upsertSite, tryAcquireLock, releaseLock } from './db'
+
import { safeFetch } from './safe-fetch'
+
import { isRecord, validateRecord } from '../lexicon/types/place/wisp/fs'
+
import { Firehose } from '@atproto/sync'
+
import { IdResolver } from '@atproto/identity'
+
import { invalidateSiteCache } from './cache'
-
const CACHE_DIR = './cache/sites';
+
const CACHE_DIR = './cache/sites'
export class FirehoseWorker {
-
private firehose: Firehose | null = null;
-
private idResolver: IdResolver;
-
private isShuttingDown = false;
-
private lastEventTime = Date.now();
+
private firehose: Firehose | null = null
+
private idResolver: IdResolver
+
private isShuttingDown = false
+
private lastEventTime = Date.now()
-
constructor(
-
private logger?: (msg: string, data?: Record<string, unknown>) => void,
-
) {
-
this.idResolver = new IdResolver();
-
}
+
constructor(
+
private logger?: (msg: string, data?: Record<string, unknown>) => void
+
) {
+
this.idResolver = new IdResolver()
+
}
-
private log(msg: string, data?: Record<string, unknown>) {
-
const log = this.logger || console.log;
-
log(`[FirehoseWorker] ${msg}`, data || {});
-
}
+
private log(msg: string, data?: Record<string, unknown>) {
+
const log = this.logger || console.log
+
log(`[FirehoseWorker] ${msg}`, data || {})
+
}
-
start() {
-
this.log('Starting firehose worker');
-
this.connect();
-
}
+
start() {
+
this.log('Starting firehose worker')
+
this.connect()
+
}
-
stop() {
-
this.log('Stopping firehose worker');
-
this.isShuttingDown = true;
+
stop() {
+
this.log('Stopping firehose worker')
+
this.isShuttingDown = true
-
if (this.firehose) {
-
this.firehose.destroy();
-
this.firehose = null;
-
}
-
}
+
if (this.firehose) {
+
this.firehose.destroy()
+
this.firehose = null
+
}
+
}
-
private connect() {
-
if (this.isShuttingDown) return;
+
private connect() {
+
if (this.isShuttingDown) return
-
this.log('Connecting to AT Protocol firehose');
+
this.log('Connecting to AT Protocol firehose')
-
this.firehose = new Firehose({
-
idResolver: this.idResolver,
-
service: 'wss://bsky.network',
-
filterCollections: ['place.wisp.fs'],
-
handleEvent: async (evt: any) => {
-
this.lastEventTime = Date.now();
+
this.firehose = new Firehose({
+
idResolver: this.idResolver,
+
service: 'wss://bsky.network',
+
filterCollections: ['place.wisp.fs'],
+
handleEvent: async (evt: any) => {
+
this.lastEventTime = Date.now()
-
// Watch for write events
-
if (evt.event === 'create' || evt.event === 'update') {
-
const record = evt.record;
+
// Watch for write events
+
if (evt.event === 'create' || evt.event === 'update') {
+
const record = evt.record
-
// If the write is a valid place.wisp.fs record
-
if (
-
evt.collection === 'place.wisp.fs' &&
-
isRecord(record) &&
-
validateRecord(record).success
-
) {
-
this.log('Received place.wisp.fs event', {
-
did: evt.did,
-
event: evt.event,
-
rkey: evt.rkey,
-
});
+
// If the write is a valid place.wisp.fs record
+
if (
+
evt.collection === 'place.wisp.fs' &&
+
isRecord(record) &&
+
validateRecord(record).success
+
) {
+
this.log('Received place.wisp.fs event', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey
+
})
-
try {
-
await this.handleCreateOrUpdate(evt.did, evt.rkey, record, evt.cid?.toString());
-
} catch (err) {
-
this.log('Error handling event', {
-
did: evt.did,
-
event: evt.event,
-
rkey: evt.rkey,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
-
}
-
} else if (evt.event === 'delete' && evt.collection === 'place.wisp.fs') {
-
this.log('Received delete event', {
-
did: evt.did,
-
rkey: evt.rkey,
-
});
+
try {
+
await this.handleCreateOrUpdate(
+
evt.did,
+
evt.rkey,
+
record,
+
evt.cid?.toString()
+
)
+
} catch (err) {
+
this.log('Error handling event', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error
+
? err.message
+
: String(err)
+
})
+
}
+
}
+
} else if (
+
evt.event === 'delete' &&
+
evt.collection === 'place.wisp.fs'
+
) {
+
this.log('Received delete event', {
+
did: evt.did,
+
rkey: evt.rkey
+
})
-
try {
-
await this.handleDelete(evt.did, evt.rkey);
-
} catch (err) {
-
this.log('Error handling delete', {
-
did: evt.did,
-
rkey: evt.rkey,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
-
}
-
},
-
onError: (err: any) => {
-
this.log('Firehose error', {
-
error: err instanceof Error ? err.message : String(err),
-
stack: err instanceof Error ? err.stack : undefined,
-
fullError: err,
-
});
-
console.error('Full firehose error:', err);
-
},
-
});
+
try {
+
await this.handleDelete(evt.did, evt.rkey)
+
} catch (err) {
+
this.log('Error handling delete', {
+
did: evt.did,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error ? err.message : String(err)
+
})
+
}
+
}
+
},
+
onError: (err: any) => {
+
this.log('Firehose error', {
+
error: err instanceof Error ? err.message : String(err),
+
stack: err instanceof Error ? err.stack : undefined,
+
fullError: err
+
})
+
console.error('Full firehose error:', err)
+
}
+
})
+
+
this.firehose.start()
+
this.log('Firehose started')
+
}
+
+
private async handleCreateOrUpdate(
+
did: string,
+
site: string,
+
record: any,
+
eventCid?: string
+
) {
+
this.log('Processing create/update', { did, site })
-
this.firehose.start();
-
this.log('Firehose started');
-
}
+
// Record is already validated in handleEvent
+
const fsRecord = record
-
private async handleCreateOrUpdate(did: string, site: string, record: any, eventCid?: string) {
-
this.log('Processing create/update', { did, site });
+
const pdsEndpoint = await getPdsForDid(did)
+
if (!pdsEndpoint) {
+
this.log('Could not resolve PDS for DID', { did })
+
return
+
}
-
// Record is already validated in handleEvent
-
const fsRecord = record;
+
this.log('Resolved PDS', { did, pdsEndpoint })
-
const pdsEndpoint = await getPdsForDid(did);
-
if (!pdsEndpoint) {
-
this.log('Could not resolve PDS for DID', { did });
-
return;
-
}
+
// Verify record exists on PDS and fetch its CID
+
let verifiedCid: string
+
try {
+
const result = await fetchSiteRecord(did, site)
-
this.log('Resolved PDS', { did, pdsEndpoint });
+
if (!result) {
+
this.log('Record not found on PDS, skipping cache', {
+
did,
+
site
+
})
+
return
+
}
-
// Verify record exists on PDS and fetch its CID
-
let verifiedCid: string;
-
try {
-
const result = await fetchSiteRecord(did, site);
+
verifiedCid = result.cid
-
if (!result) {
-
this.log('Record not found on PDS, skipping cache', { did, site });
-
return;
-
}
+
// Verify event CID matches PDS CID (prevent cache poisoning)
+
if (eventCid && eventCid !== verifiedCid) {
+
this.log('CID mismatch detected - potential spoofed event', {
+
did,
+
site,
+
eventCid,
+
verifiedCid
+
})
+
return
+
}
-
verifiedCid = result.cid;
+
this.log('Record verified on PDS', { did, site, cid: verifiedCid })
+
} catch (err) {
+
this.log('Failed to verify record on PDS', {
+
did,
+
site,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
return
+
}
-
// Verify event CID matches PDS CID (prevent cache poisoning)
-
if (eventCid && eventCid !== verifiedCid) {
-
this.log('CID mismatch detected - potential spoofed event', {
-
did,
-
site,
-
eventCid,
-
verifiedCid
-
});
-
return;
-
}
+
// Invalidate in-memory caches before updating
+
invalidateSiteCache(did, site)
-
this.log('Record verified on PDS', { did, site, cid: verifiedCid });
-
} catch (err) {
-
this.log('Failed to verify record on PDS', {
-
did,
-
site,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
return;
-
}
+
// Cache the record with verified CID (uses atomic swap internally)
+
// All instances cache locally for edge serving
+
await downloadAndCacheSite(
+
did,
+
site,
+
fsRecord,
+
pdsEndpoint,
+
verifiedCid
+
)
-
// Cache the record with verified CID (uses atomic swap internally)
-
// All instances cache locally for edge serving
-
await downloadAndCacheSite(did, site, fsRecord, pdsEndpoint, verifiedCid);
+
// Acquire distributed lock only for database write to prevent duplicate writes
+
// Note: upsertSite will check cache-only mode internally and skip if needed
+
const lockKey = `db:upsert:${did}:${site}`
+
const lockAcquired = await tryAcquireLock(lockKey)
-
// Acquire distributed lock only for database write to prevent duplicate writes
-
const lockKey = `db:upsert:${did}:${site}`;
-
const lockAcquired = await tryAcquireLock(lockKey);
+
if (!lockAcquired) {
+
this.log('Another instance is writing to DB, skipping upsert', {
+
did,
+
site
+
})
+
this.log('Successfully processed create/update (cached locally)', {
+
did,
+
site
+
})
+
return
+
}
-
if (!lockAcquired) {
-
this.log('Another instance is writing to DB, skipping upsert', { did, site });
-
this.log('Successfully processed create/update (cached locally)', { did, site });
-
return;
-
}
+
try {
+
// Upsert site to database (only one instance does this)
+
// In cache-only mode, this will be a no-op
+
await upsertSite(did, site, fsRecord.site)
+
this.log(
+
'Successfully processed create/update (cached + DB updated)',
+
{ did, site }
+
)
+
} finally {
+
// Always release lock, even if DB write fails
+
await releaseLock(lockKey)
+
}
+
}
-
try {
-
// Upsert site to database (only one instance does this)
-
await upsertSite(did, site, fsRecord.site);
-
this.log('Successfully processed create/update (cached + DB updated)', { did, site });
-
} finally {
-
// Always release lock, even if DB write fails
-
await releaseLock(lockKey);
-
}
-
}
+
private async handleDelete(did: string, site: string) {
+
this.log('Processing delete', { did, site })
-
private async handleDelete(did: string, site: string) {
-
this.log('Processing delete', { did, site });
+
// All instances should delete their local cache (no lock needed)
+
const pdsEndpoint = await getPdsForDid(did)
+
if (!pdsEndpoint) {
+
this.log('Could not resolve PDS for DID', { did })
+
return
+
}
-
// All instances should delete their local cache (no lock needed)
-
const pdsEndpoint = await getPdsForDid(did);
-
if (!pdsEndpoint) {
-
this.log('Could not resolve PDS for DID', { did });
-
return;
-
}
+
// Verify record is actually deleted from PDS
+
try {
+
const recordUrl = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(site)}`
+
const recordRes = await safeFetch(recordUrl)
-
// Verify record is actually deleted from PDS
-
try {
-
const recordUrl = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(site)}`;
-
const recordRes = await safeFetch(recordUrl);
+
if (recordRes.ok) {
+
this.log('Record still exists on PDS, not deleting cache', {
+
did,
+
site
+
})
+
return
+
}
-
if (recordRes.ok) {
-
this.log('Record still exists on PDS, not deleting cache', {
-
did,
-
site,
-
});
-
return;
-
}
+
this.log('Verified record is deleted from PDS', {
+
did,
+
site,
+
status: recordRes.status
+
})
+
} catch (err) {
+
this.log('Error verifying deletion on PDS', {
+
did,
+
site,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
-
this.log('Verified record is deleted from PDS', {
-
did,
-
site,
-
status: recordRes.status,
-
});
-
} catch (err) {
-
this.log('Error verifying deletion on PDS', {
-
did,
-
site,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
+
// Invalidate in-memory caches
+
invalidateSiteCache(did, site)
-
// Delete cache
-
this.deleteCache(did, site);
+
// Delete disk cache
+
this.deleteCache(did, site)
-
this.log('Successfully processed delete', { did, site });
-
}
+
this.log('Successfully processed delete', { did, site })
+
}
-
private deleteCache(did: string, site: string) {
-
const cacheDir = `${CACHE_DIR}/${did}/${site}`;
+
private deleteCache(did: string, site: string) {
+
const cacheDir = `${CACHE_DIR}/${did}/${site}`
-
if (!existsSync(cacheDir)) {
-
this.log('Cache directory does not exist, nothing to delete', {
-
did,
-
site,
-
});
-
return;
-
}
+
if (!existsSync(cacheDir)) {
+
this.log('Cache directory does not exist, nothing to delete', {
+
did,
+
site
+
})
+
return
+
}
-
try {
-
rmSync(cacheDir, { recursive: true, force: true });
-
this.log('Cache deleted', { did, site, path: cacheDir });
-
} catch (err) {
-
this.log('Failed to delete cache', {
-
did,
-
site,
-
path: cacheDir,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
-
}
+
try {
+
rmSync(cacheDir, { recursive: true, force: true })
+
this.log('Cache deleted', { did, site, path: cacheDir })
+
} catch (err) {
+
this.log('Failed to delete cache', {
+
did,
+
site,
+
path: cacheDir,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
+
}
-
getHealth() {
-
const isConnected = this.firehose !== null;
-
const timeSinceLastEvent = Date.now() - this.lastEventTime;
+
getHealth() {
+
const isConnected = this.firehose !== null
+
const timeSinceLastEvent = Date.now() - this.lastEventTime
-
return {
-
connected: isConnected,
-
lastEventTime: this.lastEventTime,
-
timeSinceLastEvent,
-
healthy: isConnected && timeSinceLastEvent < 300000, // 5 minutes
-
};
-
}
+
return {
+
connected: isConnected,
+
lastEventTime: this.lastEventTime,
+
timeSinceLastEvent,
+
healthy: isConnected && timeSinceLastEvent < 300000 // 5 minutes
+
}
+
}
}
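
For context, a sketch of how the worker might be wired into the hosting-service entrypoint, based only on the public API shown above (`constructor(logger?)`, `start()`, `stop()`, `getHealth()`); the actual entrypoint is not part of this diff:

```ts
// Hypothetical wiring; the real entrypoint may differ.
import { FirehoseWorker } from './lib/firehose'

const worker = new FirehoseWorker((msg, data) => console.log(msg, data))
worker.start()

// getHealth() reports unhealthy when the connection is gone or no event
// has been seen for 5 minutes, so it can back a /health endpoint.
setInterval(() => {
    const health = worker.getHealth()
    if (!health.healthy) console.warn('firehose unhealthy', health)
}, 60_000)

process.on('SIGTERM', () => {
    worker.stop() // destroys the underlying Firehose connection
    process.exit(0)
})
```
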
+457
hosting-service/src/lib/html-rewriter.test.ts
···
+
import { describe, test, expect } from 'bun:test'
+
import { rewriteHtmlPaths, isHtmlContent } from './html-rewriter'
+
+
describe('rewriteHtmlPaths', () => {
+
const basePath = '/identifier/site/'
+
+
describe('absolute paths', () => {
+
test('rewrites absolute paths with leading slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites nested absolute paths', () => {
+
const html = '<link href="/css/style.css">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<link href="/identifier/site/css/style.css">')
+
})
+
})
+
+
describe('relative paths from root document', () => {
+
test('rewrites relative paths with ./ prefix', () => {
+
const html = '<img src="./image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites relative paths without prefix', () => {
+
const html = '<img src="image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites relative paths with ../ (should stay at root)', () => {
+
const html = '<img src="../image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
})
+
+
describe('relative paths from nested documents', () => {
+
test('rewrites relative path from nested document', () => {
+
const html = '<img src="./photo.jpg">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/photo.jpg">'
+
)
+
})
+
+
test('rewrites plain filename from nested document', () => {
+
const html = '<script src="app.js"></script>'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<script src="/identifier/site/folder1/folder2/app.js"></script>'
+
)
+
})
+
+
test('rewrites ../ to go up one level', () => {
+
const html = '<img src="../image.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/image.png">'
+
)
+
})
+
+
test('rewrites multiple ../ to go up multiple levels', () => {
+
const html = '<link href="../../css/style.css">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<link href="/identifier/site/folder1/css/style.css">'
+
)
+
})
+
+
test('rewrites ../ with additional path segments', () => {
+
const html = '<img src="../assets/logo.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'pages/about/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/pages/assets/logo.png">'
+
)
+
})
+
+
test('handles complex nested relative paths', () => {
+
const html = '<script src="../../lib/vendor/jquery.js"></script>'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'pages/blog/post/index.html'
+
)
+
expect(result).toBe(
+
'<script src="/identifier/site/pages/lib/vendor/jquery.js"></script>'
+
)
+
})
+
+
test('handles ../ going past root (stays at root)', () => {
+
const html = '<img src="../../../image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'folder1/index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
})
+
+
describe('external URLs and special schemes', () => {
+
test('does not rewrite http URLs', () => {
+
const html = '<img src="http://example.com/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="http://example.com/image.png">')
+
})
+
+
test('does not rewrite https URLs', () => {
+
const html = '<link href="https://cdn.example.com/style.css">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<link href="https://cdn.example.com/style.css">'
+
)
+
})
+
+
test('does not rewrite protocol-relative URLs', () => {
+
const html = '<script src="//cdn.example.com/script.js"></script>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<script src="//cdn.example.com/script.js"></script>'
+
)
+
})
+
+
test('does not rewrite data URIs', () => {
+
const html =
+
'<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUA">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUA">'
+
)
+
})
+
+
test('does not rewrite mailto links', () => {
+
const html = '<a href="mailto:test@example.com">Email</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="mailto:test@example.com">Email</a>')
+
})
+
+
test('does not rewrite tel links', () => {
+
const html = '<a href="tel:+1234567890">Call</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="tel:+1234567890">Call</a>')
+
})
+
})
+
+
describe('different HTML attributes', () => {
+
test('rewrites src attribute', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites href attribute', () => {
+
const html = '<a href="/page.html">Link</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="/identifier/site/page.html">Link</a>')
+
})
+
+
test('rewrites action attribute', () => {
+
const html = '<form action="/submit"></form>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<form action="/identifier/site/submit"></form>')
+
})
+
+
test('rewrites data attribute', () => {
+
const html = '<object data="/document.pdf"></object>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<object data="/identifier/site/document.pdf"></object>'
+
)
+
})
+
+
test('rewrites poster attribute', () => {
+
const html = '<video poster="/thumbnail.jpg"></video>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<video poster="/identifier/site/thumbnail.jpg"></video>'
+
)
+
})
+
+
test('rewrites srcset attribute with single URL', () => {
+
const html = '<img srcset="/image.png 1x">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/image.png 1x">'
+
)
+
})
+
+
test('rewrites srcset attribute with multiple URLs', () => {
+
const html = '<img srcset="/image-1x.png 1x, /image-2x.png 2x">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/image-1x.png 1x, /identifier/site/image-2x.png 2x">'
+
)
+
})
+
+
test('rewrites srcset with width descriptors', () => {
+
const html = '<img srcset="/small.jpg 320w, /large.jpg 1024w">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/small.jpg 320w, /identifier/site/large.jpg 1024w">'
+
)
+
})
+
+
test('rewrites srcset with relative paths from nested document', () => {
+
const html = '<img srcset="../img1.png 1x, ../img2.png 2x">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<img srcset="/identifier/site/folder1/img1.png 1x, /identifier/site/folder1/img2.png 2x">'
+
)
+
})
+
})
+
+
describe('quote handling', () => {
+
test('handles double quotes', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles single quotes', () => {
+
const html = "<img src='/image.png'>"
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe("<img src='/identifier/site/image.png'>")
+
})
+
+
test('handles mixed quotes in same document', () => {
+
const html = '<img src="/img1.png"><link href=\'/style.css\'>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="/identifier/site/img1.png"><link href=\'/identifier/site/style.css\'>'
+
)
+
})
+
})
+
+
describe('multiple rewrites in same document', () => {
+
test('rewrites multiple attributes in complex HTML', () => {
+
const html = `
+
<!DOCTYPE html>
+
<html>
+
<head>
+
<link href="/css/style.css" rel="stylesheet">
+
<script src="/js/app.js"></script>
+
</head>
+
<body>
+
<img src="/images/logo.png" alt="Logo">
+
<a href="/about.html">About</a>
+
<form action="/submit">
+
<button type="submit">Submit</button>
+
</form>
+
</body>
+
</html>
+
`
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toContain('href="/identifier/site/css/style.css"')
+
expect(result).toContain('src="/identifier/site/js/app.js"')
+
expect(result).toContain('src="/identifier/site/images/logo.png"')
+
expect(result).toContain('href="/identifier/site/about.html"')
+
expect(result).toContain('action="/identifier/site/submit"')
+
})
+
+
test('handles mix of relative and absolute paths', () => {
+
const html = `
+
<img src="/abs/image.png">
+
<img src="./rel/image.png">
+
<img src="../parent/image.png">
+
<img src="https://external.com/image.png">
+
`
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/page.html'
+
)
+
expect(result).toContain('src="/identifier/site/abs/image.png"')
+
expect(result).toContain(
+
'src="/identifier/site/folder1/folder2/rel/image.png"'
+
)
+
expect(result).toContain(
+
'src="/identifier/site/folder1/parent/image.png"'
+
)
+
expect(result).toContain('src="https://external.com/image.png"')
+
})
+
})
+
+
describe('edge cases', () => {
+
test('handles empty src attribute', () => {
+
const html = '<img src="">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="">')
+
})
+
+
test('handles basePath without trailing slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, '/identifier/site', 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles basePath with trailing slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
'/identifier/site/',
+
'index.html'
+
)
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles whitespace around equals sign', () => {
+
const html = '<img src = "/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('preserves query strings in URLs', () => {
+
const html = '<img src="/image.png?v=123">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png?v=123">')
+
})
+
+
test('preserves hash fragments in URLs', () => {
+
const html = '<a href="/page.html#section">Link</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<a href="/identifier/site/page.html#section">Link</a>'
+
)
+
})
+
+
test('handles paths with special characters', () => {
+
const html = '<img src="/folder-name/file_name.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="/identifier/site/folder-name/file_name.png">'
+
)
+
})
+
})
+
+
describe('real-world scenario', () => {
+
test('handles the example from the bug report', () => {
+
// HTML file at: /folder1/folder2/folder3/index.html
+
// Image at: /folder1/folder2/img.png
+
// Reference: src="../img.png"
+
const html = '<img src="../img.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/img.png">'
+
)
+
})
+
+
test('handles deeply nested static site structure', () => {
+
// A typical static site with nested pages and shared assets
+
const html = `
+
<!DOCTYPE html>
+
<html>
+
<head>
+
<link href="../../css/style.css" rel="stylesheet">
+
<link href="../../css/theme.css" rel="stylesheet">
+
<script src="../../js/main.js"></script>
+
</head>
+
<body>
+
<img src="../../images/logo.png" alt="Logo">
+
<img src="./post-image.jpg" alt="Post">
+
<a href="../index.html">Back to Blog</a>
+
<a href="../../index.html">Home</a>
+
</body>
+
</html>
+
`
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'blog/posts/my-post.html'
+
)
+
+
// Assets two levels up
+
expect(result).toContain('href="/identifier/site/css/style.css"')
+
expect(result).toContain('href="/identifier/site/css/theme.css"')
+
expect(result).toContain('src="/identifier/site/js/main.js"')
+
expect(result).toContain('src="/identifier/site/images/logo.png"')
+
+
// Same directory
+
expect(result).toContain(
+
'src="/identifier/site/blog/posts/post-image.jpg"'
+
)
+
+
// One level up
+
expect(result).toContain('href="/identifier/site/blog/index.html"')
+
+
// Two levels up
+
expect(result).toContain('href="/identifier/site/index.html"')
+
})
+
})
+
})
+
+
describe('isHtmlContent', () => {
+
test('identifies HTML by content type', () => {
+
expect(isHtmlContent('file.txt', 'text/html')).toBe(true)
+
expect(isHtmlContent('file.txt', 'text/html; charset=utf-8')).toBe(
+
true
+
)
+
})
+
+
test('identifies HTML by .html extension', () => {
+
expect(isHtmlContent('index.html')).toBe(true)
+
expect(isHtmlContent('page.html', undefined)).toBe(true)
+
expect(isHtmlContent('/path/to/file.html')).toBe(true)
+
})
+
+
test('identifies HTML by .htm extension', () => {
+
expect(isHtmlContent('index.htm')).toBe(true)
+
expect(isHtmlContent('page.htm', undefined)).toBe(true)
+
})
+
+
test('handles case-insensitive extensions', () => {
+
expect(isHtmlContent('INDEX.HTML')).toBe(true)
+
expect(isHtmlContent('page.HTM')).toBe(true)
+
expect(isHtmlContent('File.HtMl')).toBe(true)
+
})
+
+
test('returns false for non-HTML files', () => {
+
expect(isHtmlContent('script.js')).toBe(false)
+
expect(isHtmlContent('style.css')).toBe(false)
+
expect(isHtmlContent('image.png')).toBe(false)
+
expect(isHtmlContent('data.json')).toBe(false)
+
})
+
+
test('returns false for files with no extension', () => {
+
expect(isHtmlContent('README')).toBe(false)
+
expect(isHtmlContent('Makefile')).toBe(false)
+
})
+
})
+178 -99
hosting-service/src/lib/html-rewriter.ts
···
*/
const REWRITABLE_ATTRIBUTES = [
-
'src',
-
'href',
-
'action',
-
'data',
-
'poster',
-
'srcset',
-
] as const;
+
'src',
+
'href',
+
'action',
+
'data',
+
'poster',
+
'srcset'
+
] as const
/**
* Check if a path should be rewritten
*/
function shouldRewritePath(path: string): boolean {
-
// Don't rewrite empty paths
-
if (!path) return false;
+
// Don't rewrite empty paths
+
if (!path) return false
-
// Don't rewrite external URLs (http://, https://, //)
-
if (path.startsWith('http://') || path.startsWith('https://') || path.startsWith('//')) {
-
return false;
-
}
+
// Don't rewrite external URLs (http://, https://, //)
+
if (
+
path.startsWith('http://') ||
+
path.startsWith('https://') ||
+
path.startsWith('//')
+
) {
+
return false
+
}
-
// Don't rewrite data URIs or other schemes (except file paths)
-
if (path.includes(':') && !path.startsWith('./') && !path.startsWith('../')) {
-
return false;
-
}
+
// Don't rewrite data URIs or other schemes (except file paths)
+
if (
+
path.includes(':') &&
+
!path.startsWith('./') &&
+
!path.startsWith('../')
+
) {
+
return false
+
}
-
// Don't rewrite pure anchors or paths that start with /#
-
if (path.startsWith('#') || path.startsWith('/#')) return false;
+
// Rewrite absolute paths (/) and relative paths (./ or ../ or plain filenames)
+
return true
+
}
+
+
/**
+
* Normalize a path by resolving . and .. segments
+
*/
+
function normalizePath(path: string): string {
+
const parts = path.split('/')
+
const result: string[] = []
+
+
for (const part of parts) {
+
if (part === '.' || part === '') {
+
// Skip current directory and empty parts (but keep leading empty for absolute paths)
+
if (part === '' && result.length === 0) {
+
result.push(part)
+
}
+
continue
+
}
+
if (part === '..') {
+
// Go up one directory (but not past root)
+
if (result.length > 0 && result[result.length - 1] !== '..') {
+
result.pop()
+
}
+
continue
+
}
+
result.push(part)
+
}
-
// Don't rewrite relative paths (./ or ../)
-
if (path.startsWith('./') || path.startsWith('../')) return false;
+
return result.join('/')
+
}
-
// Rewrite absolute paths (/)
-
return true;
+
/**
+
* Get the directory path from a file path
+
* e.g., "folder1/folder2/file.html" -> "folder1/folder2/"
+
*/
+
function getDirectory(filepath: string): string {
+
const lastSlash = filepath.lastIndexOf('/')
+
if (lastSlash === -1) {
+
return ''
+
}
+
return filepath.substring(0, lastSlash + 1)
}
/**
* Rewrite a single path
*/
-
function rewritePath(path: string, basePath: string): string {
-
if (!shouldRewritePath(path)) {
-
return path;
-
}
+
function rewritePath(
+
path: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
if (!shouldRewritePath(path)) {
+
return path
+
}
+
+
// Handle absolute paths: /file.js -> /base/file.js
+
if (path.startsWith('/')) {
+
return basePath + path.slice(1)
+
}
+
+
// Handle relative paths by resolving against document directory
+
const documentDir = getDirectory(documentPath)
+
let resolvedPath: string
-
// Handle absolute paths: /file.js -> /base/file.js
-
if (path.startsWith('/')) {
-
return basePath + path.slice(1);
-
}
+
if (path.startsWith('./')) {
+
// ./file.js relative to current directory
+
resolvedPath = documentDir + path.slice(2)
+
} else if (path.startsWith('../')) {
+
// ../file.js relative to parent directory
+
resolvedPath = documentDir + path
+
} else {
+
// file.js (no prefix) - treat as relative to current directory
+
resolvedPath = documentDir + path
+
}
-
// At this point, only plain filenames without ./ or ../ prefix should reach here
-
// But since we're filtering those in shouldRewritePath, this shouldn't happen
-
return path;
+
// Normalize the path to resolve .. and .
+
resolvedPath = normalizePath(resolvedPath)
+
+
return basePath + resolvedPath
}
/**
* Rewrite srcset attribute (can contain multiple URLs)
* Format: "url1 1x, url2 2x" or "url1 100w, url2 200w"
*/
-
function rewriteSrcset(srcset: string, basePath: string): string {
-
return srcset
-
.split(',')
-
.map(part => {
-
const trimmed = part.trim();
-
const spaceIndex = trimmed.indexOf(' ');
+
function rewriteSrcset(
+
srcset: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
return srcset
+
.split(',')
+
.map((part) => {
+
const trimmed = part.trim()
+
const spaceIndex = trimmed.indexOf(' ')
-
if (spaceIndex === -1) {
-
// No descriptor, just URL
-
return rewritePath(trimmed, basePath);
-
}
+
if (spaceIndex === -1) {
+
// No descriptor, just URL
+
return rewritePath(trimmed, basePath, documentPath)
+
}
-
const url = trimmed.substring(0, spaceIndex);
-
const descriptor = trimmed.substring(spaceIndex);
-
return rewritePath(url, basePath) + descriptor;
-
})
-
.join(', ');
+
const url = trimmed.substring(0, spaceIndex)
+
const descriptor = trimmed.substring(spaceIndex)
+
return rewritePath(url, basePath, documentPath) + descriptor
+
})
+
.join(', ')
}
/**
-
* Rewrite absolute paths in HTML content
+
* Rewrite absolute and relative paths in HTML content
* Uses simple regex matching for safety (no full HTML parsing)
*/
-
export function rewriteHtmlPaths(html: string, basePath: string): string {
-
// Ensure base path ends with /
-
const normalizedBase = basePath.endsWith('/') ? basePath : basePath + '/';
+
export function rewriteHtmlPaths(
+
html: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
// Ensure base path ends with /
+
const normalizedBase = basePath.endsWith('/') ? basePath : basePath + '/'
-
let rewritten = html;
+
let rewritten = html
-
// Rewrite each attribute type
-
// Use more specific patterns to prevent ReDoS attacks
-
for (const attr of REWRITABLE_ATTRIBUTES) {
-
if (attr === 'srcset') {
-
// Special handling for srcset - use possessive quantifiers via atomic grouping simulation
-
// Limit whitespace to reasonable amount (max 5 spaces) to prevent ReDoS
-
const srcsetRegex = new RegExp(
-
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
-
'gi'
-
);
-
rewritten = rewritten.replace(srcsetRegex, (match, value) => {
-
const rewrittenValue = rewriteSrcset(value, normalizedBase);
-
return `${attr}="${rewrittenValue}"`;
-
});
-
} else {
-
// Regular attributes with quoted values
-
// Limit whitespace to prevent catastrophic backtracking
-
const doubleQuoteRegex = new RegExp(
-
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
-
'gi'
-
);
-
const singleQuoteRegex = new RegExp(
-
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}'([^']*)'`,
-
'gi'
-
);
+
// Rewrite each attribute type
+
// Use more specific patterns to prevent ReDoS attacks
+
for (const attr of REWRITABLE_ATTRIBUTES) {
+
if (attr === 'srcset') {
+
// Special handling for srcset, which can contain multiple comma-separated URLs
+
// Limit whitespace to reasonable amount (max 5 spaces) to prevent ReDoS
+
const srcsetRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
+
'gi'
+
)
+
rewritten = rewritten.replace(srcsetRegex, (match, value) => {
+
const rewrittenValue = rewriteSrcset(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}="${rewrittenValue}"`
+
})
+
} else {
+
// Regular attributes with quoted values
+
// Limit whitespace to prevent catastrophic backtracking
+
const doubleQuoteRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
+
'gi'
+
)
+
const singleQuoteRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}'([^']*)'`,
+
'gi'
+
)
-
rewritten = rewritten.replace(doubleQuoteRegex, (match, value) => {
-
const rewrittenValue = rewritePath(value, normalizedBase);
-
return `${attr}="${rewrittenValue}"`;
-
});
+
rewritten = rewritten.replace(doubleQuoteRegex, (match, value) => {
+
const rewrittenValue = rewritePath(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}="${rewrittenValue}"`
+
})
-
rewritten = rewritten.replace(singleQuoteRegex, (match, value) => {
-
const rewrittenValue = rewritePath(value, normalizedBase);
-
return `${attr}='${rewrittenValue}'`;
-
});
-
}
-
}
+
rewritten = rewritten.replace(singleQuoteRegex, (match, value) => {
+
const rewrittenValue = rewritePath(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}='${rewrittenValue}'`
+
})
+
}
+
}
-
return rewritten;
+
return rewritten
}
/**
* Check if content is HTML based on content or filename
*/
-
export function isHtmlContent(
-
filepath: string,
-
contentType?: string
-
): boolean {
-
if (contentType && contentType.includes('text/html')) {
-
return true;
-
}
+
export function isHtmlContent(filepath: string, contentType?: string): boolean {
+
if (contentType && contentType.includes('text/html')) {
+
return true
+
}
-
const ext = filepath.toLowerCase().split('.').pop();
-
return ext === 'html' || ext === 'htm';
+
const ext = filepath.toLowerCase().split('.').pop()
+
return ext === 'html' || ext === 'htm'
}
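
Putting the two exports together: when a cached file is HTML, its links are rewritten relative to both the site's base path and the document's own location. A usage sketch (`prepareResponseBody` is an illustrative name; the surrounding handler is not part of this diff):

```ts
import { rewriteHtmlPaths, isHtmlContent } from './html-rewriter'

function prepareResponseBody(
    basePath: string,     // e.g. '/did:plc:abc/my-site/'
    documentPath: string, // file path within the site, e.g. 'blog/post.html'
    body: string,
    contentType?: string
): string {
    if (!isHtmlContent(documentPath, contentType)) return body
    // '../img.png' inside blog/post.html resolves to
    // '/did:plc:abc/my-site/img.png'; external URLs, data: URIs,
    // mailto: and tel: links are left untouched.
    return rewriteHtmlPaths(body, basePath, documentPath)
}
```
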
+215
hosting-service/src/lib/redirects.test.ts
···
+
import { describe, it, expect } from 'bun:test'
+
import { parseRedirectsFile, matchRedirectRule } from './redirects';
+
+
describe('parseRedirectsFile', () => {
+
it('should parse simple redirects', () => {
+
const content = `
+
# Comment line
+
/old-path /new-path
+
/home / 301
+
`;
+
const rules = parseRedirectsFile(content);
+
expect(rules).toHaveLength(2);
+
expect(rules[0]).toMatchObject({
+
from: '/old-path',
+
to: '/new-path',
+
status: 301,
+
force: false,
+
});
+
expect(rules[1]).toMatchObject({
+
from: '/home',
+
to: '/',
+
status: 301,
+
force: false,
+
});
+
});
+
+
it('should parse redirects with different status codes', () => {
+
const content = `
+
/temp-redirect /target 302
+
/rewrite /content 200
+
/not-found /404 404
+
`;
+
const rules = parseRedirectsFile(content);
+
expect(rules).toHaveLength(3);
+
expect(rules[0]?.status).toBe(302);
+
expect(rules[1]?.status).toBe(200);
+
expect(rules[2]?.status).toBe(404);
+
});
+
+
it('should parse force redirects', () => {
+
const content = `/force-path /target 301!`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.force).toBe(true);
+
expect(rules[0]?.status).toBe(301);
+
});
+
+
it('should parse splat redirects', () => {
+
const content = `/news/* /blog/:splat`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.from).toBe('/news/*');
+
expect(rules[0]?.to).toBe('/blog/:splat');
+
});
+
+
it('should parse placeholder redirects', () => {
+
const content = `/blog/:year/:month/:day /posts/:year-:month-:day`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.from).toBe('/blog/:year/:month/:day');
+
expect(rules[0]?.to).toBe('/posts/:year-:month-:day');
+
});
+
+
it('should parse country-based redirects', () => {
+
const content = `/ /anz 302 Country=au,nz`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.conditions?.country).toEqual(['au', 'nz']);
+
});
+
+
it('should parse language-based redirects', () => {
+
const content = `/products /en/products 301 Language=en`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.conditions?.language).toEqual(['en']);
+
});
+
+
it('should parse cookie-based redirects', () => {
+
const content = `/* /legacy/:splat 200 Cookie=is_legacy,my_cookie`;
+
const rules = parseRedirectsFile(content);
+
expect(rules[0]?.conditions?.cookie).toEqual(['is_legacy', 'my_cookie']);
+
});
+
});
+
+
describe('matchRedirectRule', () => {
+
it('should match exact paths', () => {
+
const rules = parseRedirectsFile('/old-path /new-path');
+
const match = matchRedirectRule('/old-path', rules);
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/new-path');
+
expect(match?.status).toBe(301);
+
});
+
+
it('should match paths with trailing slash', () => {
+
const rules = parseRedirectsFile('/old-path /new-path');
+
const match = matchRedirectRule('/old-path/', rules);
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/new-path');
+
});
+
+
it('should match splat patterns', () => {
+
const rules = parseRedirectsFile('/news/* /blog/:splat');
+
const match = matchRedirectRule('/news/2024/01/15/my-post', rules);
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/blog/2024/01/15/my-post');
+
});
+
+
it('should match placeholder patterns', () => {
+
const rules = parseRedirectsFile('/blog/:year/:month/:day /posts/:year-:month-:day');
+
const match = matchRedirectRule('/blog/2024/01/15', rules);
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/posts/2024-01-15');
+
});
+
+
it('should preserve query strings for 301/302 redirects', () => {
+
const rules = parseRedirectsFile('/old /new 301');
+
const match = matchRedirectRule('/old', rules, {
+
queryParams: { foo: 'bar', baz: 'qux' },
+
});
+
expect(match?.targetPath).toContain('?');
+
expect(match?.targetPath).toContain('foo=bar');
+
expect(match?.targetPath).toContain('baz=qux');
+
});
+
+
it('should match based on query parameters', () => {
+
const rules = parseRedirectsFile('/store id=:id /blog/:id 301');
+
const match = matchRedirectRule('/store', rules, {
+
queryParams: { id: 'my-post' },
+
});
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toContain('/blog/my-post');
+
});
+
+
it('should not match when query params are missing', () => {
+
const rules = parseRedirectsFile('/store id=:id /blog/:id 301');
+
const match = matchRedirectRule('/store', rules, {
+
queryParams: {},
+
});
+
expect(match).toBeNull();
+
});
+
+
it('should match based on country header', () => {
+
const rules = parseRedirectsFile('/ /aus 302 Country=au');
+
const match = matchRedirectRule('/', rules, {
+
headers: { 'cf-ipcountry': 'AU' },
+
});
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/aus');
+
});
+
+
it('should not match wrong country', () => {
+
const rules = parseRedirectsFile('/ /aus 302 Country=au');
+
const match = matchRedirectRule('/', rules, {
+
headers: { 'cf-ipcountry': 'US' },
+
});
+
expect(match).toBeNull();
+
});
+
+
it('should match based on language header', () => {
+
const rules = parseRedirectsFile('/products /en/products 301 Language=en');
+
const match = matchRedirectRule('/products', rules, {
+
headers: { 'accept-language': 'en-US,en;q=0.9' },
+
});
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/en/products');
+
});
+
+
it('should match based on cookie presence', () => {
+
const rules = parseRedirectsFile('/* /legacy/:splat 200 Cookie=is_legacy');
+
const match = matchRedirectRule('/some-path', rules, {
+
cookies: { is_legacy: 'true' },
+
});
+
expect(match).toBeTruthy();
+
expect(match?.targetPath).toBe('/legacy/some-path');
+
});
+
+
it('should return first matching rule', () => {
+
const content = `
+
/path /first
+
/path /second
+
`;
+
const rules = parseRedirectsFile(content);
+
const match = matchRedirectRule('/path', rules);
+
expect(match?.targetPath).toBe('/first');
+
});
+
+
it('should match more specific rules before general ones', () => {
+
const content = `
+
/jobs/customer-ninja /careers/support
+
/jobs/* /careers/:splat
+
`;
+
const rules = parseRedirectsFile(content);
+
+
const match1 = matchRedirectRule('/jobs/customer-ninja', rules);
+
expect(match1?.targetPath).toBe('/careers/support');
+
+
const match2 = matchRedirectRule('/jobs/developer', rules);
+
expect(match2?.targetPath).toBe('/careers/developer');
+
});
+
+
it('should handle SPA routing pattern', () => {
+
const rules = parseRedirectsFile('/* /index.html 200');
+
+
// Should match any path
+
const match1 = matchRedirectRule('/about', rules);
+
expect(match1).toBeTruthy();
+
expect(match1?.targetPath).toBe('/index.html');
+
expect(match1?.status).toBe(200);
+
+
const match2 = matchRedirectRule('/users/123/profile', rules);
+
expect(match2).toBeTruthy();
+
expect(match2?.targetPath).toBe('/index.html');
+
expect(match2?.status).toBe(200);
+
+
const match3 = matchRedirectRule('/', rules);
+
expect(match3).toBeTruthy();
+
expect(match3?.targetPath).toBe('/index.html');
+
});
+
});
+
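
Taken together, the rules exercised by these tests correspond to a `_redirects` file like the one below (illustrative only; the trailing `!` sets the force flag, and rule order matters because the first match wins):

```
# Basic redirects (status defaults to 301)
/old-path        /new-path
/temp-redirect   /target         302
/force-path      /target         301!

# Splats and named placeholders
/news/*                   /blog/:splat
/blog/:year/:month/:day   /posts/:year-:month-:day

# Query-parameter, country, language, and cookie conditions
/store id=:id    /blog/:id       301
/                /anz            302  Country=au,nz
/products        /en/products    301  Language=en
/*               /legacy/:splat  200  Cookie=is_legacy

# SPA fallback (200 rewrite, serves index.html for any path)
/*               /index.html     200
```
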
+413
hosting-service/src/lib/redirects.ts
···
+
import { readFile } from 'fs/promises';
+
import { existsSync } from 'fs';
+
+
export interface RedirectRule {
+
from: string;
+
to: string;
+
status: number;
+
force: boolean;
+
conditions?: {
+
country?: string[];
+
language?: string[];
+
role?: string[];
+
cookie?: string[];
+
};
+
// For pattern matching
+
fromPattern?: RegExp;
+
fromParams?: string[]; // Named parameters from the pattern
+
queryParams?: Record<string, string>; // Expected query parameters
+
}
+
+
export interface RedirectMatch {
+
rule: RedirectRule;
+
targetPath: string;
+
status: number;
+
}
+
+
/**
+
* Parse a _redirects file into an array of redirect rules
+
*/
+
export function parseRedirectsFile(content: string): RedirectRule[] {
+
const lines = content.split('\n');
+
const rules: RedirectRule[] = [];
+
+
for (let lineNum = 0; lineNum < lines.length; lineNum++) {
+
const lineRaw = lines[lineNum];
+
if (!lineRaw) continue;
+
+
const line = lineRaw.trim();
+
+
// Skip empty lines and comments
+
if (!line || line.startsWith('#')) {
+
continue;
+
}
+
+
try {
+
const rule = parseRedirectLine(line);
+
if (rule && rule.fromPattern) {
+
rules.push(rule);
+
}
+
} catch (err) {
+
console.warn(`Failed to parse redirect rule on line ${lineNum + 1}: ${line}`, err);
+
}
+
}
+
+
return rules;
+
}
+
+
/**
+
* Parse a single redirect rule line
+
* Format: /from [query_params] /to [status] [conditions]
+
*/
+
function parseRedirectLine(line: string): RedirectRule | null {
+
// Split on whitespace (quoted strings are not handled; they are rarely used in practice)
+
const parts = line.split(/\s+/);
+
+
if (parts.length < 2) {
+
return null;
+
}
+
+
let idx = 0;
+
const from = parts[idx++];
+
+
if (!from) {
+
return null;
+
}
+
+
let status = 301; // Default status
+
let force = false;
+
const conditions: NonNullable<RedirectRule['conditions']> = {};
+
const queryParams: Record<string, string> = {};
+
+
// Parse query parameters that come before the destination path
+
// They look like: key=:value (and don't start with /)
+
while (idx < parts.length) {
+
const part = parts[idx];
+
if (!part) {
+
idx++;
+
continue;
+
}
+
+
// If it starts with / or http, it's the destination path
+
if (part.startsWith('/') || part.startsWith('http://') || part.startsWith('https://')) {
+
break;
+
}
+
+
// If it contains = and comes before the destination, it's a query param
+
if (part.includes('=')) {
+
const splitIndex = part.indexOf('=');
+
const key = part.slice(0, splitIndex);
+
const value = part.slice(splitIndex + 1);
+
+
if (key && value) {
+
queryParams[key] = value;
+
}
+
idx++;
+
} else {
+
// Not a query param, must be destination or something else
+
break;
+
}
+
}
+
+
// Next part should be the destination
+
if (idx >= parts.length) {
+
return null;
+
}
+
+
const to = parts[idx++];
+
if (!to) {
+
return null;
+
}
+
+
// Parse remaining parts for status code and conditions
+
for (let i = idx; i < parts.length; i++) {
+
const part = parts[i];
+
+
if (!part) continue;
+
+
// Check for status code (with optional ! for force)
+
if (/^\d+!?$/.test(part)) {
+
if (part.endsWith('!')) {
+
force = true;
+
status = parseInt(part.slice(0, -1));
+
} else {
+
status = parseInt(part);
+
}
+
continue;
+
}
+
+
// Check for condition parameters (Country=, Language=, Role=, Cookie=)
+
if (part.includes('=')) {
+
const splitIndex = part.indexOf('=');
+
const key = part.slice(0, splitIndex);
+
const value = part.slice(splitIndex + 1);
+
+
if (!key || !value) continue;
+
+
const keyLower = key.toLowerCase();
+
+
if (keyLower === 'country') {
+
conditions.country = value.split(',').map(v => v.trim().toLowerCase());
+
} else if (keyLower === 'language') {
+
conditions.language = value.split(',').map(v => v.trim().toLowerCase());
+
} else if (keyLower === 'role') {
+
conditions.role = value.split(',').map(v => v.trim());
+
} else if (keyLower === 'cookie') {
+
conditions.cookie = value.split(',').map(v => v.trim().toLowerCase());
+
}
+
}
+
}
+
+
// Parse the 'from' pattern
+
const { pattern, params } = convertPathToRegex(from);
+
+
return {
+
from,
+
to,
+
status,
+
force,
+
conditions: Object.keys(conditions).length > 0 ? conditions : undefined,
+
queryParams: Object.keys(queryParams).length > 0 ? queryParams : undefined,
+
fromPattern: pattern,
+
fromParams: params,
+
};
+
}
+
+
/**
+
* Convert a path pattern with placeholders and splats to a regex
+
* Examples:
+
* /blog/:year/:month/:day -> captures year, month, day
+
* /news/* -> captures splat
+
*/
+
function convertPathToRegex(pattern: string): { pattern: RegExp; params: string[] } {
+
const params: string[] = [];
+
let regexStr = '^';
+
+
// Split by query string if present
+
const pathPart = pattern.split('?')[0] || pattern;
+
+
// Escape special regex characters except * and :
+
let escaped = pathPart.replace(/[.+^${}()|[\]\\]/g, '\\$&');
+
+
// Replace :param with named capture groups
+
escaped = escaped.replace(/:([a-zA-Z_][a-zA-Z0-9_]*)/g, (match, paramName) => {
+
params.push(paramName);
+
// Match path segment (everything except / and ?)
+
return '([^/?]+)';
+
});
+
+
// Replace * with splat capture (matches everything including /)
+
if (escaped.includes('*')) {
+
escaped = escaped.replace(/\*/g, '(.*)');
+
params.push('splat');
+
}
+
+
regexStr += escaped;
+
+
// Make trailing slash optional
+
if (!regexStr.endsWith('.*')) {
+
regexStr += '/?';
+
}
+
+
regexStr += '$';
+
+
return {
+
pattern: new RegExp(regexStr),
+
params,
+
};
+
}
+
+
/**
+
* Match a request path against redirect rules
+
*/
+
export function matchRedirectRule(
+
requestPath: string,
+
rules: RedirectRule[],
+
context?: {
+
queryParams?: Record<string, string>;
+
headers?: Record<string, string>;
+
cookies?: Record<string, string>;
+
}
+
): RedirectMatch | null {
+
// Normalize path: ensure a leading slash (trailing slashes are handled by the pattern's optional /?)
+
let normalizedPath = requestPath.startsWith('/') ? requestPath : `/${requestPath}`;
+
+
for (const rule of rules) {
+
// Check query parameter conditions first (if any)
+
if (rule.queryParams) {
+
// If rule requires query params but none provided, skip this rule
+
if (!context?.queryParams) {
+
continue;
+
}
+
+
const queryMatches = Object.keys(rule.queryParams).every(
+
(key) => context.queryParams?.[key] !== undefined
+
);
+
+
if (!queryMatches) {
+
continue;
+
}
+
}
+
+
// Check conditional redirects (country, language, role, cookie)
+
if (rule.conditions) {
+
if (rule.conditions.country && context?.headers) {
+
const cfCountry = context.headers['cf-ipcountry'];
+
const xCountry = context.headers['x-country'];
+
const country = (cfCountry?.toLowerCase() || xCountry?.toLowerCase());
+
if (!country || !rule.conditions.country.includes(country)) {
+
continue;
+
}
+
}
+
+
if (rule.conditions.language && context?.headers) {
+
const acceptLang = context.headers['accept-language'];
+
if (!acceptLang) {
+
continue;
+
}
+
// Parse accept-language header (simplified)
+
const langs = acceptLang.split(',').map(l => {
+
const langPart = l.split(';')[0];
+
return langPart ? langPart.trim().toLowerCase() : '';
+
}).filter(l => l !== '');
+
const hasMatch = rule.conditions.language.some(lang =>
+
langs.some(l => l === lang || l.startsWith(lang + '-'))
+
);
+
if (!hasMatch) {
+
continue;
+
}
+
}
+
+
if (rule.conditions.cookie && context?.cookies) {
+
const hasCookie = rule.conditions.cookie.some(cookieName =>
+
context.cookies && cookieName in context.cookies
+
);
+
if (!hasCookie) {
+
continue;
+
}
+
}
+
+
// Role-based redirects would need JWT verification - skip for now
+
if (rule.conditions.role) {
+
continue;
+
}
+
}
+
+
// Match the path pattern
+
const match = rule.fromPattern?.exec(normalizedPath);
+
if (!match) {
+
continue;
+
}
+
+
// Build the target path by replacing placeholders
+
let targetPath = rule.to;
+
+
// Replace captured parameters
+
if (rule.fromParams && match.length > 1) {
+
for (let i = 0; i < rule.fromParams.length; i++) {
+
const paramName = rule.fromParams[i];
+
const paramValue = match[i + 1];
+
+
if (!paramName || !paramValue) continue;
+
+
if (paramName === 'splat') {
+
targetPath = targetPath.replace(':splat', paramValue);
+
} else {
+
targetPath = targetPath.replace(`:${paramName}`, paramValue);
+
}
+
}
+
}
+
+
// Handle query parameter replacements
+
if (rule.queryParams && context?.queryParams) {
+
for (const [key, placeholder] of Object.entries(rule.queryParams)) {
+
const actualValue = context.queryParams[key];
+
if (actualValue && placeholder && placeholder.startsWith(':')) {
+
const paramName = placeholder.slice(1);
+
if (paramName) {
+
targetPath = targetPath.replace(`:${paramName}`, actualValue);
+
}
+
}
+
}
+
}
+
+
// Preserve query string for 200, 301, 302 redirects (unless target already has one)
+
if ([200, 301, 302].includes(rule.status) && context?.queryParams && !targetPath.includes('?')) {
+
const queryString = Object.entries(context.queryParams)
+
.map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
+
.join('&');
+
if (queryString) {
+
targetPath += `?${queryString}`;
+
}
+
}
+
+
return {
+
rule,
+
targetPath,
+
status: rule.status,
+
};
+
}
+
+
return null;
+
}
+
+
/**
+
* Load redirect rules from a cached site
+
*/
+
export async function loadRedirectRules(did: string, rkey: string): Promise<RedirectRule[]> {
+
const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';
+
const redirectsPath = `${CACHE_DIR}/${did}/${rkey}/_redirects`;
+
+
if (!existsSync(redirectsPath)) {
+
return [];
+
}
+
+
try {
+
const content = await readFile(redirectsPath, 'utf-8');
+
return parseRedirectsFile(content);
+
} catch (err) {
+
console.error('Failed to load _redirects file', err);
+
return [];
+
}
+
}
+
+
/**
+
* Parse cookies from Cookie header
+
*/
+
export function parseCookies(cookieHeader?: string): Record<string, string> {
+
if (!cookieHeader) return {};
+
+
const cookies: Record<string, string> = {};
+
const parts = cookieHeader.split(';');
+
+
for (const part of parts) {
+
const [key, ...valueParts] = part.split('=');
+
if (key && valueParts.length > 0) {
+
cookies[key.trim()] = valueParts.join('=').trim();
+
}
+
}
+
+
return cookies;
+
}
+
+
/**
+
* Parse query string into object
+
*/
+
export function parseQueryString(url: string): Record<string, string> {
+
const queryStart = url.indexOf('?');
+
if (queryStart === -1) return {};
+
+
const queryString = url.slice(queryStart + 1);
+
const params: Record<string, string> = {};
+
+
for (const pair of queryString.split('&')) {
+
const [key, value] = pair.split('=');
+
if (key) {
+
params[decodeURIComponent(key)] = value ? decodeURIComponent(value) : '';
+
}
+
}
+
+
return params;
+
}
+
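
A sketch of how these exports might fit together at request time; the hosting service's actual handler is not shown in this diff, and `resolveRedirect` is an illustrative name:

```ts
import {
    loadRedirectRules,
    matchRedirectRule,
    parseCookies,
    parseQueryString
} from './redirects';

async function resolveRedirect(did: string, rkey: string, req: Request) {
    const url = new URL(req.url);
    const rules = await loadRedirectRules(did, rkey); // [] when no _redirects file is cached
    const match = matchRedirectRule(url.pathname, rules, {
        queryParams: parseQueryString(req.url),
        headers: Object.fromEntries(req.headers),
        cookies: parseCookies(req.headers.get('cookie') ?? undefined)
    });
    if (!match) return null;
    // status 200 is a rewrite: serve match.targetPath without redirecting;
    // 301/302/404 are returned to the client as-is.
    return match;
}
```
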
+1 -1
hosting-service/src/lib/safe-fetch.ts
···
const FETCH_TIMEOUT_BLOB = 120000; // 2 minutes for blob downloads
const MAX_RESPONSE_SIZE = 10 * 1024 * 1024; // 10MB
const MAX_JSON_SIZE = 1024 * 1024; // 1MB
-
const MAX_BLOB_SIZE = 100 * 1024 * 1024; // 100MB
+
const MAX_BLOB_SIZE = 500 * 1024 * 1024; // 500MB
const MAX_REDIRECTS = 10;
function isBlockedHost(hostname: string): boolean {
+198 -37
hosting-service/src/lib/utils.ts
···
import { safeFetchJson, safeFetchBlob } from './safe-fetch';
import { CID } from 'multiformats';
-
const CACHE_DIR = './cache/sites';
+
const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';
const CACHE_TTL = 14 * 24 * 60 * 60 * 1000; // 14 days cache TTL
interface CacheMetadata {
···
cachedAt: number;
did: string;
rkey: string;
+
// Map of file path to blob CID for incremental updates
+
fileCids?: Record<string, string>;
+
}
+
+
/**
+
* Determines if a MIME type should benefit from gzip compression.
+
* Returns true for text-based web assets (HTML, CSS, JS, JSON, XML, SVG).
+
* Returns false for already-compressed formats (images, video, audio, PDFs).
+
*
+
*/
+
export function shouldCompressMimeType(mimeType: string | undefined): boolean {
+
if (!mimeType) return false;
+
+
const mime = mimeType.toLowerCase();
+
+
// Text-based web assets that benefit from compression
+
const compressibleTypes = [
+
'text/html',
+
'text/css',
+
'text/javascript',
+
'application/javascript',
+
'application/x-javascript',
+
'text/xml',
+
'application/xml',
+
'application/json',
+
'text/plain',
+
'image/svg+xml',
+
];
+
+
if (compressibleTypes.some(type => mime === type || mime.startsWith(type))) {
+
return true;
+
}
+
+
// Already-compressed formats that should NOT be double-compressed
+
const alreadyCompressedPrefixes = [
+
'video/',
+
'audio/',
+
'image/',
+
'application/pdf',
+
'application/zip',
+
'application/gzip',
+
];
+
+
if (alreadyCompressedPrefixes.some(prefix => mime.startsWith(prefix))) {
+
return false;
+
}
+
+
// Default to not compressing for unknown types
+
return false;
}
interface IpldLink {
···
throw new Error('Invalid record structure: root missing entries array');
}
+
// Get existing cache metadata to check for incremental updates
+
const existingMetadata = await getCacheMetadata(did, rkey);
+
const existingFileCids = existingMetadata?.fileCids || {};
+
// Use a temporary directory with timestamp to avoid collisions
const tempSuffix = `.tmp-${Date.now()}-${Math.random().toString(36).slice(2, 9)}`;
const tempDir = `${CACHE_DIR}/${did}/${rkey}${tempSuffix}`;
const finalDir = `${CACHE_DIR}/${did}/${rkey}`;
try {
-
// Download to temporary directory
-
await cacheFiles(did, rkey, record.root.entries, pdsEndpoint, '', tempSuffix);
-
await saveCacheMetadata(did, rkey, recordCid, tempSuffix);
+
// Collect file CIDs from the new record
+
const newFileCids: Record<string, string> = {};
+
collectFileCidsFromEntries(record.root.entries, '', newFileCids);
+
+
// Download/copy files to temporary directory (with incremental logic)
+
await cacheFiles(did, rkey, record.root.entries, pdsEndpoint, '', tempSuffix, existingFileCids, finalDir);
+
await saveCacheMetadata(did, rkey, recordCid, tempSuffix, newFileCids);
// Atomically replace old cache with new cache
// On POSIX systems (Linux/macOS), rename is atomic
···
}
}
+
/**
+
* Recursively collect file CIDs from entries for incremental update tracking
+
*/
+
function collectFileCidsFromEntries(entries: Entry[], pathPrefix: string, fileCids: Record<string, string>): void {
+
for (const entry of entries) {
+
const currentPath = pathPrefix ? `${pathPrefix}/${entry.name}` : entry.name;
+
const node = entry.node;
+
+
if ('type' in node && node.type === 'directory' && 'entries' in node) {
+
collectFileCidsFromEntries(node.entries, currentPath, fileCids);
+
} else if ('type' in node && node.type === 'file' && 'blob' in node) {
+
const fileNode = node as File;
+
const cid = extractBlobCid(fileNode.blob);
+
if (cid) {
+
fileCids[currentPath] = cid;
+
}
+
}
+
}
+
}
+
async function cacheFiles(
did: string,
site: string,
entries: Entry[],
pdsEndpoint: string,
pathPrefix: string,
-
dirSuffix: string = ''
+
dirSuffix: string = '',
+
existingFileCids: Record<string, string> = {},
+
existingCacheDir?: string
): Promise<void> {
-
// Collect all file blob download tasks first
+
// Collect file tasks, separating unchanged files from new/changed files
const downloadTasks: Array<() => Promise<void>> = [];
-
+
const copyTasks: Array<() => Promise<void>> = [];
+
function collectFileTasks(
entries: Entry[],
currentPathPrefix: string
···
collectFileTasks(node.entries, currentPath);
} else if ('type' in node && node.type === 'file' && 'blob' in node) {
const fileNode = node as File;
-
downloadTasks.push(() => cacheFileBlob(
-
did,
-
site,
-
currentPath,
-
fileNode.blob,
-
pdsEndpoint,
-
fileNode.encoding,
-
fileNode.mimeType,
-
fileNode.base64,
-
dirSuffix
-
));
+
const cid = extractBlobCid(fileNode.blob);
+
+
// Check if file is unchanged (same CID as existing cache)
+
if (cid && existingFileCids[currentPath] === cid && existingCacheDir) {
+
// File unchanged - copy from existing cache instead of downloading
+
copyTasks.push(() => copyExistingFile(
+
did,
+
site,
+
currentPath,
+
dirSuffix,
+
existingCacheDir
+
));
+
} else {
+
// File new or changed - download it
+
downloadTasks.push(() => cacheFileBlob(
+
did,
+
site,
+
currentPath,
+
fileNode.blob,
+
pdsEndpoint,
+
fileNode.encoding,
+
fileNode.mimeType,
+
fileNode.base64,
+
dirSuffix
+
));
+
}
}
}
}
collectFileTasks(entries, pathPrefix);
-
// Execute downloads concurrently with a limit of 3 at a time
-
const concurrencyLimit = 3;
-
for (let i = 0; i < downloadTasks.length; i += concurrencyLimit) {
-
const batch = downloadTasks.slice(i, i + concurrencyLimit);
+
console.log(`[Incremental Update] Files to copy: ${copyTasks.length}, Files to download: ${downloadTasks.length}`);
+
+
// Copy unchanged files in parallel (fast local operations)
+
const copyLimit = 10;
+
for (let i = 0; i < copyTasks.length; i += copyLimit) {
+
const batch = copyTasks.slice(i, i + copyLimit);
+
await Promise.all(batch.map(task => task()));
+
}
+
+
// Download new/changed files concurrently with a limit of 3 at a time
+
const downloadLimit = 3;
+
for (let i = 0; i < downloadTasks.length; i += downloadLimit) {
+
const batch = downloadTasks.slice(i, i + downloadLimit);
await Promise.all(batch.map(task => task()));
}
}
+
/**
+
* Copy an unchanged file from existing cache to new cache location
+
*/
+
async function copyExistingFile(
+
did: string,
+
site: string,
+
filePath: string,
+
dirSuffix: string,
+
existingCacheDir: string
+
): Promise<void> {
+
const { copyFile } = await import('fs/promises');
+
+
const sourceFile = `${existingCacheDir}/${filePath}`;
+
const destFile = `${CACHE_DIR}/${did}/${site}${dirSuffix}/${filePath}`;
+
const destDir = destFile.substring(0, destFile.lastIndexOf('/'));
+
+
// Create destination directory if needed
+
if (destDir && !existsSync(destDir)) {
+
mkdirSync(destDir, { recursive: true });
+
}
+
+
try {
+
// Copy the file
+
await copyFile(sourceFile, destFile);
+
+
// Copy metadata file if it exists
+
const sourceMetaFile = `${sourceFile}.meta`;
+
const destMetaFile = `${destFile}.meta`;
+
if (existsSync(sourceMetaFile)) {
+
await copyFile(sourceMetaFile, destMetaFile);
+
}
+
+
console.log(`[Incremental] Copied unchanged file: ${filePath}`);
+
} catch (err) {
+
console.error(`[Incremental] Failed to copy file ${filePath}:`, err);
+
throw err;
+
}
+
}
+
async function cacheFileBlob(
did: string,
site: string,
···
const blobUrl = `${pdsEndpoint}/xrpc/com.atproto.sync.getBlob?did=${encodeURIComponent(did)}&cid=${encodeURIComponent(cid)}`;
-
// Allow up to 100MB per file blob, with 2 minute timeout
-
let content = await safeFetchBlob(blobUrl, { maxSize: 100 * 1024 * 1024, timeout: 120000 });
+
// Allow up to 500MB per file blob, with 5 minute timeout
+
let content = await safeFetchBlob(blobUrl, { maxSize: 500 * 1024 * 1024, timeout: 300000 });
console.log(`[DEBUG] ${filePath}: fetched ${content.length} bytes, base64=${base64}, encoding=${encoding}, mimeType=${mimeType}`);
-
// If content is base64-encoded, decode it back to binary (gzipped or not)
+
// If content is base64-encoded, decode it back to raw binary (gzipped or not)
if (base64) {
const originalSize = content.length;
-
// The content from the blob is base64 text, decode it directly to binary
-
const buffer = Buffer.from(content);
-
const base64String = buffer.toString('ascii'); // Use ascii for base64 text, not utf-8
-
console.log(`[DEBUG] ${filePath}: base64 string first 100 chars: ${base64String.substring(0, 100)}`);
+
// Decode the base64 text payload back to raw binary
+
// The blob contains base64-encoded text as raw bytes, decode it in-place
+
const textDecoder = new TextDecoder();
+
const base64String = textDecoder.decode(content);
content = Buffer.from(base64String, 'base64');
-
console.log(`[DEBUG] ${filePath}: decoded from ${originalSize} bytes to ${content.length} bytes`);
+
console.log(`[DEBUG] ${filePath}: decoded base64 from ${originalSize} bytes to ${content.length} bytes`);
// Check if it's actually gzipped by looking at magic bytes
if (content.length >= 2) {
-
const magic = content[0] === 0x1f && content[1] === 0x8b;
-
const byte0 = content[0];
-
const byte1 = content[1];
-
console.log(`[DEBUG] ${filePath}: has gzip magic bytes: ${magic} (0x${byte0?.toString(16)}, 0x${byte1?.toString(16)})`);
+
const hasGzipMagic = content[0] === 0x1f && content[1] === 0x8b;
+
console.log(`[DEBUG] ${filePath}: has gzip magic bytes: ${hasGzipMagic}`);
}
}
···
mkdirSync(fileDir, { recursive: true });
}
+
// Use the shared function to determine if this should remain compressed
+
const shouldStayCompressed = shouldCompressMimeType(mimeType);
+
+
// Decompress files that shouldn't be stored compressed
+
if (encoding === 'gzip' && !shouldStayCompressed && content.length >= 2 &&
+
content[0] === 0x1f && content[1] === 0x8b) {
+
console.log(`[DEBUG] ${filePath}: decompressing non-compressible type (${mimeType}) before caching`);
+
try {
+
const { gunzipSync } = await import('zlib');
+
const decompressed = gunzipSync(content);
+
console.log(`[DEBUG] ${filePath}: decompressed from ${content.length} to ${decompressed.length} bytes`);
+
content = decompressed;
+
// Clear the encoding flag since we're storing decompressed
+
encoding = undefined;
+
} catch (error) {
+
console.log(`[DEBUG] ${filePath}: failed to decompress, storing original gzipped content. Error:`, error);
+
}
+
}
+
await writeFile(cacheFile, content);
-
// Store metadata if file is compressed
+
// Store metadata only if file is still compressed
if (encoding === 'gzip' && mimeType) {
const metaFile = `${cacheFile}.meta`;
await writeFile(metaFile, JSON.stringify({ encoding, mimeType }));
···
return existsSync(`${CACHE_DIR}/${did}/${site}`);
}
-
async function saveCacheMetadata(did: string, rkey: string, recordCid: string, dirSuffix: string = ''): Promise<void> {
+
async function saveCacheMetadata(did: string, rkey: string, recordCid: string, dirSuffix: string = '', fileCids?: Record<string, string>): Promise<void> {
const metadata: CacheMetadata = {
recordCid,
cachedAt: Date.now(),
did,
-
rkey
+
rkey,
+
fileCids
};
const metadataPath = `${CACHE_DIR}/${did}/${rkey}${dirSuffix}/.metadata.json`;
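
Net effect of the `utils.ts` changes above: each cached site now records a path → blob CID map in its `.metadata.json`, so a re-cache only downloads blobs whose CID changed and copies the rest from the previous cache directory. A minimal sketch of that decision step (names are illustrative, not the actual implementation):

```ts
// Sketch of the incremental-update plan, assuming per-file blob CIDs collected
// from the new record (newCids) and read back from .metadata.json (oldCids).
type FileCids = Record<string, string>;

function planUpdate(newCids: FileCids, oldCids: FileCids) {
  const copy: string[] = [];     // unchanged: reuse from the existing cache dir
  const download: string[] = []; // new or changed: fetch the blob from the PDS

  for (const [path, cid] of Object.entries(newCids)) {
    if (oldCids[path] === cid) copy.push(path);
    else download.push(path);
  }
  return { copy, download };
}

// Example: only index.html changed, so it is the only blob re-downloaded.
planUpdate(
  { 'index.html': 'bafk...new', 'app.js': 'bafk...same' },
  { 'index.html': 'bafk...old', 'app.js': 'bafk...same' }
);
// => { copy: ['app.js'], download: ['index.html'] }
```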
+364 -149
hosting-service/src/server.ts
···
import { Hono } from 'hono';
import { getWispDomain, getCustomDomain, getCustomDomainByHash } from './lib/db';
-
import { resolveDid, getPdsForDid, fetchSiteRecord, downloadAndCacheSite, getCachedFilePath, isCached, sanitizePath } from './lib/utils';
+
import { resolveDid, getPdsForDid, fetchSiteRecord, downloadAndCacheSite, getCachedFilePath, isCached, sanitizePath, shouldCompressMimeType } from './lib/utils';
import { rewriteHtmlPaths, isHtmlContent } from './lib/html-rewriter';
-
import { existsSync, readFileSync } from 'fs';
+
import { existsSync } from 'fs';
+
import { readFile, access } from 'fs/promises';
import { lookup } from 'mime-types';
import { logger, observabilityMiddleware, observabilityErrorHandler, logCollector, errorTracker, metricsCollector } from './lib/observability';
+
import { fileCache, metadataCache, rewrittenHtmlCache, getCacheKey, type FileMetadata } from './lib/cache';
+
import { loadRedirectRules, matchRedirectRule, parseCookies, parseQueryString, type RedirectRule } from './lib/redirects';
const BASE_HOST = process.env.BASE_HOST || 'wisp.place';
···
return validRkeyPattern.test(rkey);
}
+
/**
+
* Async file existence check
+
*/
+
async function fileExists(path: string): Promise<boolean> {
+
try {
+
await access(path);
+
return true;
+
} catch {
+
return false;
+
}
+
}
+
+
// Cache for redirect rules (per site)
+
const redirectRulesCache = new Map<string, RedirectRule[]>();
+
+
/**
+
* Clear redirect rules cache for a specific site
+
* Should be called when a site is updated/recached
+
*/
+
export function clearRedirectRulesCache(did: string, rkey: string) {
+
const cacheKey = `${did}:${rkey}`;
+
redirectRulesCache.delete(cacheKey);
+
}
+
// Helper to serve files from cache
-
async function serveFromCache(did: string, rkey: string, filePath: string) {
+
async function serveFromCache(
+
did: string,
+
rkey: string,
+
filePath: string,
+
fullUrl?: string,
+
headers?: Record<string, string>
+
) {
+
// Check for redirect rules first
+
const redirectCacheKey = `${did}:${rkey}`;
+
let redirectRules = redirectRulesCache.get(redirectCacheKey);
+
+
if (redirectRules === undefined) {
+
// Load rules for the first time
+
redirectRules = await loadRedirectRules(did, rkey);
+
redirectRulesCache.set(redirectCacheKey, redirectRules);
+
}
+
+
// Apply redirect rules if any exist
+
if (redirectRules.length > 0) {
+
const requestPath = '/' + (filePath || '');
+
const queryParams = fullUrl ? parseQueryString(fullUrl) : {};
+
const cookies = parseCookies(headers?.['cookie']);
+
+
const redirectMatch = matchRedirectRule(requestPath, redirectRules, {
+
queryParams,
+
headers,
+
cookies,
+
});
+
+
if (redirectMatch) {
+
const { targetPath, status } = redirectMatch;
+
+
// Handle different status codes
+
if (status === 200) {
+
// Rewrite: serve different content but keep URL the same
+
// Remove leading slash for internal path resolution
+
const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
+
return serveFileInternal(did, rkey, rewritePath);
+
} else if (status === 301 || status === 302) {
+
// External redirect: change the URL
+
return new Response(null, {
+
status,
+
headers: {
+
'Location': targetPath,
+
'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',
+
},
+
});
+
} else if (status === 404) {
+
// Custom 404 page
+
const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
+
const response = await serveFileInternal(did, rkey, custom404Path);
+
// Override status to 404
+
return new Response(response.body, {
+
status: 404,
+
headers: response.headers,
+
});
+
}
+
}
+
}
+
+
// No redirect matched, serve normally
+
return serveFileInternal(did, rkey, filePath);
+
}
+
+
// Internal function to serve a file (used by both normal serving and rewrites)
+
async function serveFileInternal(did: string, rkey: string, filePath: string) {
// Default to index.html if path is empty or ends with /
let requestPath = filePath || 'index.html';
if (requestPath.endsWith('/')) {
requestPath += 'index.html';
}
+
const cacheKey = getCacheKey(did, rkey, requestPath);
const cachedFile = getCachedFilePath(did, rkey, requestPath);
-
if (existsSync(cachedFile)) {
-
const content = readFileSync(cachedFile);
+
// Check in-memory cache first
+
let content = fileCache.get(cacheKey);
+
let meta = metadataCache.get(cacheKey);
+
+
if (!content && await fileExists(cachedFile)) {
+
// Read from disk and cache
+
content = await readFile(cachedFile);
+
fileCache.set(cacheKey, content, content.length);
+
const metaFile = `${cachedFile}.meta`;
+
if (await fileExists(metaFile)) {
+
const metaJson = await readFile(metaFile, 'utf-8');
+
meta = JSON.parse(metaJson);
+
metadataCache.set(cacheKey, meta!, JSON.stringify(meta).length);
+
}
+
}
-
console.log(`[DEBUG SERVE] ${requestPath}: file size=${content.length} bytes, metaFile exists=${existsSync(metaFile)}`);
+
if (content) {
+
// Build headers with caching
+
const headers: Record<string, string> = {};
-
// Check if file has compression metadata
-
if (existsSync(metaFile)) {
-
const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-
console.log(`[DEBUG SERVE] ${requestPath}: meta=${JSON.stringify(meta)}`);
-
-
// Check actual content for gzip magic bytes
-
if (content.length >= 2) {
-
const hasGzipMagic = content[0] === 0x1f && content[1] === 0x8b;
-
const byte0 = content[0];
-
const byte1 = content[1];
-
console.log(`[DEBUG SERVE] ${requestPath}: has gzip magic bytes=${hasGzipMagic} (0x${byte0?.toString(16)}, 0x${byte1?.toString(16)})`);
+
if (meta && meta.encoding === 'gzip' && meta.mimeType) {
+
const shouldServeCompressed = shouldCompressMimeType(meta.mimeType);
+
+
if (!shouldServeCompressed) {
+
const { gunzipSync } = await import('zlib');
+
const decompressed = gunzipSync(content);
+
headers['Content-Type'] = meta.mimeType;
+
headers['Cache-Control'] = 'public, max-age=31536000, immutable';
+
return new Response(decompressed, { headers });
}
-
-
if (meta.encoding === 'gzip' && meta.mimeType) {
-
// Don't serve already-compressed media formats with Content-Encoding: gzip
-
// These formats (video, audio, images) are already compressed and the browser
-
// can't decode them if we add another layer of compression
-
const alreadyCompressedTypes = [
-
'video/', 'audio/', 'image/jpeg', 'image/jpg', 'image/png',
-
'image/gif', 'image/webp', 'application/pdf'
-
];
-
-
const isAlreadyCompressed = alreadyCompressedTypes.some(type =>
-
meta.mimeType.toLowerCase().startsWith(type)
-
);
-
-
if (isAlreadyCompressed) {
-
// Decompress the file before serving
-
console.log(`[DEBUG SERVE] ${requestPath}: decompressing already-compressed media type`);
-
const { gunzipSync } = await import('zlib');
-
const decompressed = gunzipSync(content);
-
console.log(`[DEBUG SERVE] ${requestPath}: decompressed from ${content.length} to ${decompressed.length} bytes`);
-
return new Response(decompressed, {
-
headers: {
-
'Content-Type': meta.mimeType,
-
},
-
});
-
}
-
-
// Serve gzipped content with proper headers (for HTML, CSS, JS, etc.)
-
console.log(`[DEBUG SERVE] ${requestPath}: serving as gzipped with Content-Encoding header`);
-
return new Response(content, {
-
headers: {
-
'Content-Type': meta.mimeType,
-
'Content-Encoding': 'gzip',
-
},
-
});
-
}
+
+
headers['Content-Type'] = meta.mimeType;
+
headers['Content-Encoding'] = 'gzip';
+
headers['Cache-Control'] = meta.mimeType.startsWith('text/html')
+
? 'public, max-age=300'
+
: 'public, max-age=31536000, immutable';
+
return new Response(content, { headers });
}
-
// Serve non-compressed files normally
+
// Non-compressed files
const mimeType = lookup(cachedFile) || 'application/octet-stream';
-
return new Response(content, {
-
headers: {
-
'Content-Type': mimeType,
-
},
-
});
+
headers['Content-Type'] = mimeType;
+
headers['Cache-Control'] = mimeType.startsWith('text/html')
+
? 'public, max-age=300'
+
: 'public, max-age=31536000, immutable';
+
return new Response(content, { headers });
}
// Try index.html for directory-like paths
if (!requestPath.includes('.')) {
-
const indexFile = getCachedFilePath(did, rkey, `${requestPath}/index.html`);
-
if (existsSync(indexFile)) {
-
const content = readFileSync(indexFile);
-
const metaFile = `${indexFile}.meta`;
+
const indexPath = `${requestPath}/index.html`;
+
const indexCacheKey = getCacheKey(did, rkey, indexPath);
+
const indexFile = getCachedFilePath(did, rkey, indexPath);
-
// Check if file has compression metadata
-
if (existsSync(metaFile)) {
-
const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-
if (meta.encoding === 'gzip' && meta.mimeType) {
-
return new Response(content, {
-
headers: {
-
'Content-Type': meta.mimeType,
-
'Content-Encoding': 'gzip',
-
},
-
});
-
}
+
let indexContent = fileCache.get(indexCacheKey);
+
let indexMeta = metadataCache.get(indexCacheKey);
+
+
if (!indexContent && await fileExists(indexFile)) {
+
indexContent = await readFile(indexFile);
+
fileCache.set(indexCacheKey, indexContent, indexContent.length);
+
+
const indexMetaFile = `${indexFile}.meta`;
+
if (await fileExists(indexMetaFile)) {
+
const metaJson = await readFile(indexMetaFile, 'utf-8');
+
indexMeta = JSON.parse(metaJson);
+
metadataCache.set(indexCacheKey, indexMeta!, JSON.stringify(indexMeta).length);
}
+
}
-
return new Response(content, {
-
headers: {
-
'Content-Type': 'text/html; charset=utf-8',
-
},
-
});
+
if (indexContent) {
+
const headers: Record<string, string> = {
+
'Content-Type': 'text/html; charset=utf-8',
+
'Cache-Control': 'public, max-age=300',
+
};
+
+
if (indexMeta && indexMeta.encoding === 'gzip') {
+
headers['Content-Encoding'] = 'gzip';
+
}
+
+
return new Response(indexContent, { headers });
}
}
···
did: string,
rkey: string,
filePath: string,
-
basePath: string
+
basePath: string,
+
fullUrl?: string,
+
headers?: Record<string, string>
) {
+
// Check for redirect rules first
+
const redirectCacheKey = `${did}:${rkey}`;
+
let redirectRules = redirectRulesCache.get(redirectCacheKey);
+
+
if (redirectRules === undefined) {
+
// Load rules for the first time
+
redirectRules = await loadRedirectRules(did, rkey);
+
redirectRulesCache.set(redirectCacheKey, redirectRules);
+
}
+
+
// Apply redirect rules if any exist
+
if (redirectRules.length > 0) {
+
const requestPath = '/' + (filePath || '');
+
const queryParams = fullUrl ? parseQueryString(fullUrl) : {};
+
const cookies = parseCookies(headers?.['cookie']);
+
+
const redirectMatch = matchRedirectRule(requestPath, redirectRules, {
+
queryParams,
+
headers,
+
cookies,
+
});
+
+
if (redirectMatch) {
+
const { targetPath, status } = redirectMatch;
+
+
// Handle different status codes
+
if (status === 200) {
+
// Rewrite: serve different content but keep URL the same
+
const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
+
return serveFileInternalWithRewrite(did, rkey, rewritePath, basePath);
+
} else if (status === 301 || status === 302) {
+
// External redirect: change the URL
+
// For sites.wisp.place, we need to adjust the target path to include the base path
+
// unless it's an absolute URL
+
let redirectTarget = targetPath;
+
if (!targetPath.startsWith('http://') && !targetPath.startsWith('https://')) {
+
redirectTarget = basePath + (targetPath.startsWith('/') ? targetPath.slice(1) : targetPath);
+
}
+
return new Response(null, {
+
status,
+
headers: {
+
'Location': redirectTarget,
+
'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',
+
},
+
});
+
} else if (status === 404) {
+
// Custom 404 page
+
const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
+
const response = await serveFileInternalWithRewrite(did, rkey, custom404Path, basePath);
+
// Override status to 404
+
return new Response(response.body, {
+
status: 404,
+
headers: response.headers,
+
});
+
}
+
}
+
}
+
+
// No redirect matched, serve normally
+
return serveFileInternalWithRewrite(did, rkey, filePath, basePath);
+
}
+
+
// Internal function to serve a file with rewriting
+
async function serveFileInternalWithRewrite(did: string, rkey: string, filePath: string, basePath: string) {
// Default to index.html if path is empty or ends with /
let requestPath = filePath || 'index.html';
if (requestPath.endsWith('/')) {
requestPath += 'index.html';
}
+
const cacheKey = getCacheKey(did, rkey, requestPath);
const cachedFile = getCachedFilePath(did, rkey, requestPath);
-
if (existsSync(cachedFile)) {
-
const metaFile = `${cachedFile}.meta`;
-
let mimeType = lookup(cachedFile) || 'application/octet-stream';
-
let isGzipped = false;
+
// Check for rewritten HTML in cache first (if it's HTML)
+
const mimeTypeGuess = lookup(requestPath) || 'application/octet-stream';
+
if (isHtmlContent(requestPath, mimeTypeGuess)) {
+
const rewrittenKey = getCacheKey(did, rkey, requestPath, `rewritten:${basePath}`);
+
const rewrittenContent = rewrittenHtmlCache.get(rewrittenKey);
+
if (rewrittenContent) {
+
return new Response(rewrittenContent, {
+
headers: {
+
'Content-Type': 'text/html; charset=utf-8',
+
'Content-Encoding': 'gzip',
+
'Cache-Control': 'public, max-age=300',
+
},
+
});
+
}
+
}
-
// Check if file has compression metadata
-
if (existsSync(metaFile)) {
-
const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-
if (meta.encoding === 'gzip' && meta.mimeType) {
-
mimeType = meta.mimeType;
-
isGzipped = true;
-
}
+
// Check in-memory file cache
+
let content = fileCache.get(cacheKey);
+
let meta = metadataCache.get(cacheKey);
+
+
if (!content && await fileExists(cachedFile)) {
+
// Read from disk and cache
+
content = await readFile(cachedFile);
+
fileCache.set(cacheKey, content, content.length);
+
+
const metaFile = `${cachedFile}.meta`;
+
if (await fileExists(metaFile)) {
+
const metaJson = await readFile(metaFile, 'utf-8');
+
meta = JSON.parse(metaJson);
+
metadataCache.set(cacheKey, meta!, JSON.stringify(meta).length);
}
+
}
+
+
if (content) {
+
const mimeType = meta?.mimeType || lookup(cachedFile) || 'application/octet-stream';
+
const isGzipped = meta?.encoding === 'gzip';
// Check if this is HTML content that needs rewriting
-
// Note: For gzipped HTML with path rewriting, we need to decompress, rewrite, and serve uncompressed
-
// This is a trade-off for the sites.wisp.place domain which needs path rewriting
if (isHtmlContent(requestPath, mimeType)) {
-
let content: string;
+
let htmlContent: string;
if (isGzipped) {
const { gunzipSync } = await import('zlib');
-
const compressed = readFileSync(cachedFile);
-
content = gunzipSync(compressed).toString('utf-8');
+
htmlContent = gunzipSync(content).toString('utf-8');
} else {
-
content = readFileSync(cachedFile, 'utf-8');
+
htmlContent = content.toString('utf-8');
}
-
const rewritten = rewriteHtmlPaths(content, basePath);
-
return new Response(rewritten, {
+
const rewritten = rewriteHtmlPaths(htmlContent, basePath, requestPath);
+
+
// Recompress and cache the rewritten HTML
+
const { gzipSync } = await import('zlib');
+
const recompressed = gzipSync(Buffer.from(rewritten, 'utf-8'));
+
+
const rewrittenKey = getCacheKey(did, rkey, requestPath, `rewritten:${basePath}`);
+
rewrittenHtmlCache.set(rewrittenKey, recompressed, recompressed.length);
+
+
return new Response(recompressed, {
headers: {
'Content-Type': 'text/html; charset=utf-8',
+
'Content-Encoding': 'gzip',
+
'Cache-Control': 'public, max-age=300',
},
});
}
-
// Non-HTML files: serve gzipped content as-is with proper headers
-
const content = readFileSync(cachedFile);
+
// Non-HTML files: serve as-is
+
const headers: Record<string, string> = {
+
'Content-Type': mimeType,
+
'Cache-Control': 'public, max-age=31536000, immutable',
+
};
+
if (isGzipped) {
-
// Don't serve already-compressed media formats with Content-Encoding: gzip
-
const alreadyCompressedTypes = [
-
'video/', 'audio/', 'image/jpeg', 'image/jpg', 'image/png',
-
'image/gif', 'image/webp', 'application/pdf'
-
];
-
-
const isAlreadyCompressed = alreadyCompressedTypes.some(type =>
-
mimeType.toLowerCase().startsWith(type)
-
);
-
-
if (isAlreadyCompressed) {
-
// Decompress the file before serving
+
const shouldServeCompressed = shouldCompressMimeType(mimeType);
+
if (!shouldServeCompressed) {
const { gunzipSync } = await import('zlib');
const decompressed = gunzipSync(content);
-
return new Response(decompressed, {
-
headers: {
-
'Content-Type': mimeType,
-
},
-
});
+
return new Response(decompressed, { headers });
}
-
-
return new Response(content, {
+
headers['Content-Encoding'] = 'gzip';
+
}
+
+
return new Response(content, { headers });
+
}
+
+
// Try index.html for directory-like paths
+
if (!requestPath.includes('.')) {
+
const indexPath = `${requestPath}/index.html`;
+
const indexCacheKey = getCacheKey(did, rkey, indexPath);
+
const indexFile = getCachedFilePath(did, rkey, indexPath);
+
+
// Check for rewritten index.html in cache
+
const rewrittenKey = getCacheKey(did, rkey, indexPath, `rewritten:${basePath}`);
+
const rewrittenContent = rewrittenHtmlCache.get(rewrittenKey);
+
if (rewrittenContent) {
+
return new Response(rewrittenContent, {
headers: {
-
'Content-Type': mimeType,
+
'Content-Type': 'text/html; charset=utf-8',
'Content-Encoding': 'gzip',
+
'Cache-Control': 'public, max-age=300',
},
});
}
-
return new Response(content, {
-
headers: {
-
'Content-Type': mimeType,
-
},
-
});
-
}
-
// Try index.html for directory-like paths
-
if (!requestPath.includes('.')) {
-
const indexFile = getCachedFilePath(did, rkey, `${requestPath}/index.html`);
-
if (existsSync(indexFile)) {
-
const metaFile = `${indexFile}.meta`;
-
let isGzipped = false;
+
let indexContent = fileCache.get(indexCacheKey);
+
let indexMeta = metadataCache.get(indexCacheKey);
-
if (existsSync(metaFile)) {
-
const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-
if (meta.encoding === 'gzip') {
-
isGzipped = true;
-
}
+
if (!indexContent && await fileExists(indexFile)) {
+
indexContent = await readFile(indexFile);
+
fileCache.set(indexCacheKey, indexContent, indexContent.length);
+
+
const indexMetaFile = `${indexFile}.meta`;
+
if (await fileExists(indexMetaFile)) {
+
const metaJson = await readFile(indexMetaFile, 'utf-8');
+
indexMeta = JSON.parse(metaJson);
+
metadataCache.set(indexCacheKey, indexMeta!, JSON.stringify(indexMeta).length);
}
+
}
-
// HTML needs path rewriting, so decompress if needed
-
let content: string;
+
if (indexContent) {
+
const isGzipped = indexMeta?.encoding === 'gzip';
+
+
let htmlContent: string;
if (isGzipped) {
const { gunzipSync } = await import('zlib');
-
const compressed = readFileSync(indexFile);
-
content = gunzipSync(compressed).toString('utf-8');
+
htmlContent = gunzipSync(indexContent).toString('utf-8');
} else {
-
content = readFileSync(indexFile, 'utf-8');
+
htmlContent = indexContent.toString('utf-8');
}
-
const rewritten = rewriteHtmlPaths(content, basePath);
-
return new Response(rewritten, {
+
const rewritten = rewriteHtmlPaths(htmlContent, basePath, indexPath);
+
+
const { gzipSync } = await import('zlib');
+
const recompressed = gzipSync(Buffer.from(rewritten, 'utf-8'));
+
+
rewrittenHtmlCache.set(rewrittenKey, recompressed, recompressed.length);
+
+
return new Response(recompressed, {
headers: {
'Content-Type': 'text/html; charset=utf-8',
+
'Content-Encoding': 'gzip',
+
'Cache-Control': 'public, max-age=300',
},
});
}
···
try {
await downloadAndCacheSite(did, rkey, siteData.record, pdsEndpoint, siteData.cid);
+
// Clear redirect rules cache since the site was updated
+
clearRedirectRulesCache(did, rkey);
logger.info('Site cached successfully', { did, rkey });
return true;
} catch (err) {
···
// Serve with HTML path rewriting to handle absolute paths
const basePath = `/${identifier}/${site}/`;
-
return serveFromCacheWithRewrite(did, site, filePath, basePath);
+
const headers: Record<string, string> = {};
+
c.req.raw.headers.forEach((value, key) => {
+
headers[key.toLowerCase()] = value;
+
});
+
return serveFromCacheWithRewrite(did, site, filePath, basePath, c.req.url, headers);
}
// Check if this is a DNS hash subdomain
···
return c.text('Site not found', 404);
}
-
return serveFromCache(customDomain.did, rkey, path);
+
const headers: Record<string, string> = {};
+
c.req.raw.headers.forEach((value, key) => {
+
headers[key.toLowerCase()] = value;
+
});
+
return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);
}
// Route 2: Registered subdomains - /*.wisp.place/*
···
return c.text('Site not found', 404);
}
-
return serveFromCache(domainInfo.did, rkey, path);
+
const headers: Record<string, string> = {};
+
c.req.raw.headers.forEach((value, key) => {
+
headers[key.toLowerCase()] = value;
+
});
+
return serveFromCache(domainInfo.did, rkey, path, c.req.url, headers);
}
// Route 1: Custom domains - /*
···
return c.text('Site not found', 404);
}
-
return serveFromCache(customDomain.did, rkey, path);
+
const headers: Record<string, string> = {};
+
c.req.raw.headers.forEach((value, key) => {
+
headers[key.toLowerCase()] = value;
+
});
+
return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);
});
// Internal observability endpoints (for admin panel)
···
const timeWindow = query.timeWindow ? parseInt(query.timeWindow as string) : 3600000;
const stats = metricsCollector.getStats('hosting-service', timeWindow);
return c.json({ stats, timeWindow });
+
});
+
+
app.get('/__internal__/observability/cache', async (c) => {
+
const { getCacheStats } = await import('./lib/cache');
+
const stats = getCacheStats();
+
return c.json({ cache: stats });
});
export default app;
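
The server now checks the in-memory `fileCache`/`metadataCache` before touching disk, falling back to an async read and warming the cache on a miss. A condensed sketch of that read path, assuming a cache with `get`/`set` roughly like `lib/cache.ts` (not the exact code):

```ts
// Memory-first read with async disk fallback, as used by serveFileInternal.
// `cache` is a plain Map here for illustration; the real cache also tracks sizes.
import { readFile, access } from 'fs/promises';

async function readCached(
  key: string,
  diskPath: string,
  cache: Map<string, Buffer>
): Promise<Buffer | undefined> {
  const hit = cache.get(key);
  if (hit) return hit; // hot path: no filesystem I/O

  try {
    await access(diskPath);
  } catch {
    return undefined; // not on disk either
  }

  const content = await readFile(diskPath);
  cache.set(key, content); // warm the cache for the next request
  return content;
}
```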
+3 -1
hosting-service/tsconfig.json
···
/* Code doesn't run in DOM */
"lib": ["es2022"],
-
}
+
},
+
"include": ["src/**/*"],
+
"exclude": ["node_modules", "cache", "dist"]
}
+9 -1
package.json
···
"@elysiajs/openapi": "^1.4.11",
"@elysiajs/opentelemetry": "^1.4.6",
"@elysiajs/static": "^1.4.2",
+
"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-dialog": "^1.1.15",
"@radix-ui/react-label": "^2.1.7",
"@radix-ui/react-radio-group": "^1.3.8",
"@radix-ui/react-slot": "^1.2.3",
"@radix-ui/react-tabs": "^1.1.13",
"@tanstack/react-query": "^5.90.2",
+
"actor-typeahead": "^0.1.1",
+
"atproto-ui": "^0.11.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"elysia": "latest",
"iron-session": "^8.0.4",
"lucide-react": "^0.546.0",
+
"multiformats": "^13.4.1",
+
"prismjs": "^1.30.0",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"tailwind-merge": "^3.3.1",
···
"@types/react": "^19.2.2",
"@types/react-dom": "^19.2.1",
"bun-plugin-tailwind": "^0.1.2",
-
"bun-types": "latest"
+
"bun-types": "latest",
+
"esbuild": "0.26.0"
},
"module": "src/index.js",
"trustedDependencies": [
+
"bun",
+
"cbor-extract",
"core-js",
"protobufjs"
]
+379
public/acceptable-use/acceptable-use.tsx
···
+
import { createRoot } from 'react-dom/client'
+
import Layout from '@public/layouts'
+
import { Button } from '@public/components/ui/button'
+
import { Card } from '@public/components/ui/card'
+
import { ArrowLeft, Shield, AlertCircle, CheckCircle, Scale } from 'lucide-react'
+
+
function AcceptableUsePage() {
+
return (
+
<div className="min-h-screen bg-background">
+
{/* Header */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() => window.location.href = '/'}
+
>
+
<ArrowLeft className="w-4 h-4 mr-2" />
+
Back to Home
+
</Button>
+
</div>
+
</header>
+
+
{/* Hero Section */}
+
<div className="bg-gradient-to-b from-accent/10 to-background border-b border-border/40">
+
<div className="container mx-auto px-4 py-16 max-w-4xl text-center">
+
<div className="inline-flex items-center justify-center w-16 h-16 rounded-full bg-accent/20 mb-6">
+
<Shield className="w-8 h-8 text-accent" />
+
</div>
+
<h1 className="text-4xl md:text-5xl font-bold mb-4">Acceptable Use Policy</h1>
+
<div className="flex items-center justify-center gap-6 text-sm text-muted-foreground">
+
<div className="flex items-center gap-2">
+
<span className="font-medium">Effective:</span>
+
<span>November 10, 2025</span>
+
</div>
+
<div className="h-4 w-px bg-border"></div>
+
<div className="flex items-center gap-2">
+
<span className="font-medium">Last Updated:</span>
+
<span>November 10, 2025</span>
+
</div>
+
</div>
+
</div>
+
</div>
+
+
{/* Content */}
+
<div className="container mx-auto px-4 py-12 max-w-4xl">
+
<article className="space-y-12">
+
{/* Our Philosophy */}
+
<section>
+
<h2 className="text-3xl font-bold mb-6 text-foreground">Our Philosophy</h2>
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
wisp.place exists to give you a corner of the internet that's truly yours: a place to create, experiment, and express yourself freely. We believe in the open web and the fundamental importance of free expression. We're not here to police your thoughts, moderate your aesthetics, or judge your taste.
+
</p>
+
<p>
+
That said, we're also real people running real servers in real jurisdictions (the United States and the Netherlands), and there are legal and practical limits to what we can host. This policy aims to be as permissive as possible while keeping the lights on and staying on the right side of the law.
+
</p>
+
</div>
+
</section>
+
+
{/* What You Can Do */}
+
<Card className="bg-green-500/5 border-green-500/20 p-8">
+
<div className="flex items-start gap-4">
+
<div className="flex-shrink-0">
+
<CheckCircle className="w-8 h-8 text-green-500" />
+
</div>
+
<div className="space-y-4">
+
<h2 className="text-3xl font-bold text-foreground">What You Can Do</h2>
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
<strong className="text-green-600 dark:text-green-400">Almost anything.</strong> Seriously. Build weird art projects. Write controversial essays. Create spaces that would make corporate platforms nervous. Express unpopular opinions. Make things that are strange, provocative, uncomfortable, or just plain yours.
+
</p>
+
<p>
+
We support creative and personal expression in all its forms, including adult content, political speech, counter-cultural work, and experimental projects.
+
</p>
+
</div>
+
</div>
+
</div>
+
</Card>
+
+
{/* What You Can't Do */}
+
<section>
+
<div className="flex items-center gap-3 mb-6">
+
<AlertCircle className="w-8 h-8 text-red-500" />
+
<h2 className="text-3xl font-bold text-foreground">What You Can't Do</h2>
+
</div>
+
+
<div className="space-y-8">
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Illegal Content</h3>
+
<p className="text-muted-foreground mb-4">
+
Don't host content that's illegal in the United States or the Netherlands. This includes but isn't limited to:
+
</p>
+
<ul className="space-y-3 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Child sexual abuse material (CSAM)</strong> involving real minors in any form</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Realistic or AI-generated depictions</strong> of minors in sexual contexts, including photorealistic renders, deepfakes, or AI-generated imagery</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Non-consensual intimate imagery</strong> (revenge porn, deepfakes, hidden camera footage, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Content depicting or facilitating human trafficking, sexual exploitation, or sexual violence</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Instructions for manufacturing explosives, biological weapons, or other instruments designed for mass harm</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Content that facilitates imminent violence or terrorism</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Stolen financial information, credentials, or personal data used for fraud</span>
+
</li>
+
</ul>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Intellectual Property Violations</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
Don't host content that clearly violates someone else's copyright, trademark, or other intellectual property rights. We're required to respond to valid DMCA takedown notices.
+
</p>
+
<p>
+
We understand that copyright law is complicated and sometimes ridiculous. We're not going to proactively scan your site or nitpick over fair use. But if we receive a legitimate legal complaint, we'll have to act on it.
+
</p>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2 border-red-500/30 bg-red-500/5">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Hate Content</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
You can express controversial ideas. You can be offensive. You can make people uncomfortable. But pure hate - content that exists solely to dehumanize, threaten, or incite violence against people based on race, ethnicity, religion, gender, sexual orientation, disability, or similar characteristics - isn't welcome here.
+
</p>
+
<p>
+
There's a difference between "I have deeply unpopular opinions about X" and "People like X should be eliminated." The former is protected expression. The latter isn't.
+
</p>
+
<div className="bg-background/50 border-l-4 border-red-500 p-4 rounded">
+
<p className="font-medium text-foreground">
+
<strong>A note on enforcement:</strong> While we're generally permissive and believe in giving people the benefit of the doubt, hate content is where we draw a hard line. I will be significantly more aggressive in moderating this type of content than anything else on this list. If your site exists primarily to spread hate or recruit people into hateful ideologies, you will be removed swiftly and without extensive appeals. This is non-negotiable.
+
</p>
+
</div>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Adult Content Guidelines</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
Adult content is allowed. This includes sexually explicit material, erotica, adult artwork, and NSFW creative expression.
+
</p>
+
<p className="font-medium">However:</p>
+
<ul className="space-y-2">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No content involving real minors in any sexual context whatsoever</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No photorealistic, AI-generated, or otherwise realistic depictions of minors in sexual contexts</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-green-500 mt-1">•</span>
+
<span>Clearly stylized drawings and written fiction are permitted, provided they remain obviously non-photographic in nature</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No non-consensual content (revenge porn, voyeurism, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No content depicting illegal sexual acts (bestiality, necrophilia, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-yellow-500 mt-1">•</span>
+
<span>Adult content should be clearly marked as such if discoverable through public directories or search</span>
+
</li>
+
</ul>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Malicious Technical Activity</h3>
+
<p className="text-muted-foreground mb-4">Don't use your site to:</p>
+
<ul className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Distribute malware, viruses, or exploits</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Conduct phishing or social engineering attacks</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Launch DDoS attacks or network abuse</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Mine cryptocurrency without explicit user consent</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Scrape, spam, or abuse other services</span>
+
</li>
+
</ul>
+
</Card>
+
</div>
+
</section>
+
+
{/* Our Approach to Enforcement */}
+
<section>
+
<div className="flex items-center gap-3 mb-6">
+
<Scale className="w-8 h-8 text-accent" />
+
<h2 className="text-3xl font-bold text-foreground">Our Approach to Enforcement</h2>
+
</div>
+
<div className="space-y-6">
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
<strong>We actively monitor for obvious violations.</strong> Not to censor your creativity or police your opinions, but to catch the clear-cut stuff that threatens the service's existence and makes this a worse place for everyone. We're looking for the blatantly illegal, the obviously harmful: the stuff that would get servers seized and communities destroyed.
+
</p>
+
<p>
+
We're not reading your blog posts looking for wrongthink. We're making sure this platform doesn't become a haven for the kind of content that ruins good things.
+
</p>
+
</div>
+
+
<Card className="p-6 bg-muted/30">
+
<p className="font-semibold mb-3 text-foreground">We take action when:</p>
+
<ol className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">1.</span>
+
<span>We identify content that clearly violates this policy during routine monitoring</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">2.</span>
+
<span>We receive a valid legal complaint (DMCA, court order, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">3.</span>
+
<span>Someone reports content that violates this policy and we can verify the violation</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">4.</span>
+
<span>Your site is causing technical problems for the service or other users</span>
+
</li>
+
</ol>
+
</Card>
+
+
<Card className="p-6 bg-muted/30">
+
<p className="font-semibold mb-3 text-foreground">When we do need to take action, we'll try to:</p>
+
<ul className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Contact you first when legally and practically possible</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Be transparent about what's happening and why</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Give you an opportunity to address the issue if appropriate</span>
+
</li>
+
</ul>
+
</Card>
+
+
<p className="text-muted-foreground">
+
For serious or repeated violations, we may suspend or terminate your account.
+
</p>
+
</div>
+
</section>
+
+
{/* Regional Compliance */}
+
<Card className="p-6 bg-blue-500/5 border-blue-500/20">
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Regional Compliance</h2>
+
<p className="text-muted-foreground">
+
Our servers are located in the United States and the Netherlands. Content hosted on wisp.place must comply with the laws of both jurisdictions. While we aim to provide broad creative freedom, these legal requirements are non-negotiable.
+
</p>
+
</Card>
+
+
{/* Changes to This Policy */}
+
<section>
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Changes to This Policy</h2>
+
<p className="text-muted-foreground">
+
We may update this policy as legal requirements or service realities change. If we make significant changes, we'll notify active users.
+
</p>
+
</section>
+
+
{/* Questions or Reports */}
+
<section>
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Questions or Reports</h2>
+
<p className="text-muted-foreground">
+
If you have questions about this policy or need to report a violation, contact us at{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
.
+
</p>
+
</section>
+
+
{/* Final Message */}
+
<Card className="p-8 bg-accent/10 border-accent/30 border-2">
+
<p className="text-lg leading-relaxed text-foreground">
+
<strong>Remember:</strong> This policy exists to keep the service running and this community healthy, not to limit your creativity. When in doubt, ask yourself: "Is this likely to get real-world authorities knocking on doors or make this place worse for everyone?" If the answer is yes, it probably doesn't belong here. Everything else? Go wild.
+
</p>
+
</Card>
+
</article>
+
</div>
+
+
{/* Footer */}
+
<footer className="border-t border-border/40 bg-muted/20 mt-12">
+
<div className="container mx-auto px-4 py-8">
+
<div className="text-center text-sm text-muted-foreground">
+
<p>
+
Built by{' '}
+
<a
+
href="https://bsky.app/profile/nekomimi.pet"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
@nekomimi.pet
+
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
+
</p>
+
</div>
+
</div>
+
</footer>
+
</div>
+
)
+
}
+
+
const root = createRoot(document.getElementById('elysia')!)
+
root.render(
+
<Layout className="gap-6">
+
<AcceptableUsePage />
+
</Layout>
+
)
+35
public/acceptable-use/index.html
···
+
<!doctype html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>Acceptable Use Policy - wisp.place</title>
+
<meta name="description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/acceptable-use" />
+
<meta property="og:title" content="Acceptable Use Policy - wisp.place" />
+
<meta property="og:description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary_large_image" />
+
<meta name="twitter:url" content="https://wisp.place/acceptable-use" />
+
<meta name="twitter:title" content="Acceptable Use Policy - wisp.place" />
+
<meta name="twitter:description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
+
<link rel="icon" type="image/x-icon" href="../favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="../favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="../favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="../apple-touch-icon.png">
+
<link rel="manifest" href="../site.webmanifest">
+
</head>
+
<body>
+
<div id="elysia"></div>
+
<script type="module" src="./acceptable-use.tsx"></script>
+
</body>
+
</html>
+7 -1
public/admin/index.html
···
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Admin Dashboard - Wisp.place</title>
+
<title>wisp.place</title>
+
<meta name="description" content="Admin dashboard for wisp.place decentralized static site hosting." />
+
<meta name="robots" content="noindex, nofollow" />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
<link rel="stylesheet" href="./styles.css" />
</head>
<body>
public/android-chrome-192x192.png

This is a binary file and will not be displayed.

public/android-chrome-512x512.png

This is a binary file and will not be displayed.

public/apple-touch-icon.png

This is a binary file and will not be displayed.

+30
public/components/ui/checkbox.tsx
···
+
import * as React from "react"
+
import * as CheckboxPrimitive from "@radix-ui/react-checkbox"
+
import { CheckIcon } from "lucide-react"
+
+
import { cn } from "@public/lib/utils"
+
+
function Checkbox({
+
className,
+
...props
+
}: React.ComponentProps<typeof CheckboxPrimitive.Root>) {
+
return (
+
<CheckboxPrimitive.Root
+
data-slot="checkbox"
+
className={cn(
+
"peer border-input dark:bg-input/30 data-[state=checked]:bg-primary data-[state=checked]:text-primary-foreground dark:data-[state=checked]:bg-primary data-[state=checked]:border-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive size-4 shrink-0 rounded-[4px] border shadow-xs transition-shadow outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
+
className
+
)}
+
{...props}
+
>
+
<CheckboxPrimitive.Indicator
+
data-slot="checkbox-indicator"
+
className="grid place-content-center text-current transition-none"
+
>
+
<CheckIcon className="size-3.5" />
+
</CheckboxPrimitive.Indicator>
+
</CheckboxPrimitive.Root>
+
)
+
}
+
+
export { Checkbox }
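
A hypothetical call site for the new Checkbox (the id, label text, and state wiring are illustrative, not taken from the repo):

```tsx
import { useState } from 'react'
import { Checkbox } from '@public/components/ui/checkbox'
import { Label } from '@public/components/ui/label'

// Illustrative only: a controlled checkbox bound to local state.
export function ExampleConsentRow() {
  const [checked, setChecked] = useState(false)
  return (
    <div className="flex items-center gap-2">
      <Checkbox
        id="consent"
        checked={checked}
        onCheckedChange={(value) => setChecked(value === true)}
      />
      <Label htmlFor="consent">I have read the Acceptable Use Policy</Label>
    </div>
  )
}
```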
+104
public/components/ui/code-block.tsx
···
+
import { useEffect, useRef, useState } from 'react'
+
+
declare global {
+
interface Window {
+
Prism: {
+
languages: Record<string, any>
+
highlightElement: (element: HTMLElement) => void
+
highlightAll: () => void
+
}
+
}
+
}
+
+
interface CodeBlockProps {
+
code: string
+
language?: 'bash' | 'yaml'
+
className?: string
+
}
+
+
export function CodeBlock({ code, language = 'bash', className = '' }: CodeBlockProps) {
+
const [isThemeLoaded, setIsThemeLoaded] = useState(false)
+
const codeRef = useRef<HTMLElement>(null)
+
+
useEffect(() => {
+
// Load Catppuccin theme CSS
+
const loadTheme = async () => {
+
// Detect if user prefers dark mode
+
const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches
+
const theme = prefersDark ? 'mocha' : 'latte'
+
+
// Remove any existing theme CSS
+
const existingTheme = document.querySelector('link[data-prism-theme]')
+
if (existingTheme) {
+
existingTheme.remove()
+
}
+
+
// Load the appropriate Catppuccin theme
+
const link = document.createElement('link')
+
link.rel = 'stylesheet'
+
link.href = `https://prismjs.catppuccin.com/${theme}.css`
+
link.setAttribute('data-prism-theme', theme)
+
document.head.appendChild(link)
+
+
// Load PrismJS if not already loaded
+
if (!window.Prism) {
+
const script = document.createElement('script')
+
script.src = 'https://cdnjs.cloudflare.com/ajax/libs/prism/1.30.0/prism.min.js'
+
script.onload = () => {
+
// Load language support if needed
+
if (language === 'yaml' && !window.Prism.languages.yaml) {
+
const yamlScript = document.createElement('script')
+
yamlScript.src = 'https://cdnjs.cloudflare.com/ajax/libs/prism/1.30.0/components/prism-yaml.min.js'
+
yamlScript.onload = () => setIsThemeLoaded(true)
+
document.head.appendChild(yamlScript)
+
} else {
+
setIsThemeLoaded(true)
+
}
+
}
+
document.head.appendChild(script)
+
} else {
+
setIsThemeLoaded(true)
+
}
+
}
+
+
loadTheme()
+
+
// Listen for theme changes
+
const mediaQuery = window.matchMedia('(prefers-color-scheme: dark)')
+
const handleThemeChange = () => loadTheme()
+
mediaQuery.addEventListener('change', handleThemeChange)
+
+
return () => {
+
mediaQuery.removeEventListener('change', handleThemeChange)
+
}
+
}, [language])
+
+
// Highlight code when Prism is loaded and component is mounted
+
useEffect(() => {
+
if (isThemeLoaded && codeRef.current && window.Prism) {
+
window.Prism.highlightElement(codeRef.current)
+
}
+
}, [isThemeLoaded, code])
+
+
if (!isThemeLoaded) {
+
return (
+
<pre className={`p-4 bg-muted rounded-lg overflow-x-auto ${className}`}>
+
<code>{code.trim()}</code>
+
</pre>
+
)
+
}
+
+
// Map language to Prism language class
+
const languageMap = {
+
'bash': 'language-bash',
+
'yaml': 'language-yaml'
+
}
+
+
const prismLanguage = languageMap[language] || 'language-bash'
+
+
return (
+
<pre className={`p-4 rounded-lg overflow-x-auto ${className}`}>
+
<code ref={codeRef} className={prismLanguage}>{code.trim()}</code>
+
</pre>
+
)
+
}
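
Hypothetical usage of the CodeBlock component above; the component handles Prism loading and theme selection itself, so call sites only pass the snippet and its language:

```tsx
import { CodeBlock } from '@public/components/ui/code-block'

// Illustrative only: render the CLI build commands with bash highlighting.
export function BuildInstructions() {
  return (
    <CodeBlock
      language="bash"
      code={`cd cli
cargo build --release`}
    />
  )
}
```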
+1 -1
public/components/ui/radio-group.tsx
···
<RadioGroupPrimitive.Item
data-slot="radio-group-item"
className={cn(
-
"border-input text-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive dark:bg-input/30 aspect-square size-4 shrink-0 rounded-full border shadow-xs transition-[color,box-shadow] outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
+
"border-input text-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive dark:bg-input/50 aspect-square size-4 shrink-0 rounded-full border border-black/30 dark:border-white/30 shadow-inner transition-[color,box-shadow] outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
className
)}
{...props}
+2 -2
public/components/ui/tabs.tsx
···
<TabsPrimitive.List
data-slot="tabs-list"
className={cn(
-
"bg-muted text-muted-foreground inline-flex h-9 w-fit items-center justify-center rounded-lg p-[3px]",
+
"bg-muted dark:bg-muted/80 text-muted-foreground inline-flex h-9 w-fit items-center justify-center rounded-lg p-[3px]",
className
)}
{...props}
···
<TabsPrimitive.Trigger
data-slot="tabs-trigger"
className={cn(
-
"data-[state=active]:bg-background dark:data-[state=active]:text-foreground focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:outline-ring dark:data-[state=active]:border-input dark:data-[state=active]:bg-input/30 text-foreground dark:text-muted-foreground inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-2 py-1 text-sm font-medium whitespace-nowrap transition-[color,box-shadow] focus-visible:ring-[3px] focus-visible:outline-1 disabled:pointer-events-none disabled:opacity-50 data-[state=active]:shadow-sm [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
+
"data-[state=active]:bg-background dark:data-[state=active]:bg-background dark:data-[state=active]:text-foreground dark:data-[state=active]:border-border focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:outline-ring dark:data-[state=active]:border-border dark:data-[state=active]:shadow-sm text-foreground dark:text-muted-foreground/70 inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-2 py-1 text-sm font-medium whitespace-nowrap transition-[color,box-shadow] focus-visible:ring-[3px] focus-visible:outline-1 disabled:pointer-events-none disabled:opacity-50 data-[state=active]:shadow-sm [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
className
)}
{...props}
+65
public/editor/components/TabSkeleton.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
+
// Shimmer animation for skeleton loading
+
const Shimmer = () => (
+
<div className="animate-pulse">
+
<div className="h-4 bg-muted rounded w-3/4 mb-2"></div>
+
<div className="h-4 bg-muted rounded w-1/2"></div>
+
</div>
+
)
+
+
const SkeletonLine = ({ className = '' }: { className?: string }) => (
+
<div className={`animate-pulse bg-muted rounded ${className}`}></div>
+
)
+
+
export function TabSkeleton() {
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="space-y-2">
+
<SkeletonLine className="h-6 w-1/3" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
{/* Skeleton content items */}
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
</CardContent>
+
</Card>
+
+
<Card>
+
<CardHeader>
+
<div className="space-y-2">
+
<SkeletonLine className="h-6 w-1/4" />
+
<SkeletonLine className="h-4 w-1/2" />
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-3">
+
<SkeletonLine className="h-10 w-full" />
+
<SkeletonLine className="h-4 w-3/4" />
+
</CardContent>
+
</Card>
+
</div>
+
)
+
}
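
For reference, a minimal sketch of how `TabSkeleton` might be dropped into one of the dashboard tabs as a loading placeholder. The wrapper component, its `loading` prop, and the import path are illustrative assumptions; the changeset itself renders the tab components directly.

```tsx
// Hypothetical usage sketch (not part of this changeset):
// show the skeleton until the tab's data has finished loading.
import type { ReactNode } from 'react'
import { TabsContent } from '@public/components/ui/tabs'
import { TabSkeleton } from './components/TabSkeleton'

export function SitesTabContent({ loading, children }: { loading: boolean; children: ReactNode }) {
    return (
        <TabsContent value="sites">
            {loading ? <TabSkeleton /> : children}
        </TabsContent>
    )
}
```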
+249 -1144
public/editor/editor.tsx
···
import { createRoot } from 'react-dom/client'
import { Button } from '@public/components/ui/button'
import {
-
Card,
-
CardContent,
-
CardDescription,
-
CardHeader,
-
CardTitle
-
} from '@public/components/ui/card'
-
import { Input } from '@public/components/ui/input'
-
import { Label } from '@public/components/ui/label'
-
import {
Tabs,
TabsContent,
TabsList,
TabsTrigger
} from '@public/components/ui/tabs'
-
import { Badge } from '@public/components/ui/badge'
import {
Dialog,
DialogContent,
···
DialogTitle,
DialogFooter
} from '@public/components/ui/dialog'
+
import { Checkbox } from '@public/components/ui/checkbox'
+
import { Label } from '@public/components/ui/label'
+
import { Badge } from '@public/components/ui/badge'
import {
-
Globe,
-
Upload,
-
ExternalLink,
-
CheckCircle2,
-
XCircle,
-
AlertCircle,
Loader2,
Trash2,
-
RefreshCw,
-
Settings
+
LogOut
} from 'lucide-react'
-
import { RadioGroup, RadioGroupItem } from '@public/components/ui/radio-group'
-
import Layout from '@public/layouts'
-
-
interface UserInfo {
-
did: string
-
handle: string
-
}
-
-
interface Site {
-
did: string
-
rkey: string
-
display_name: string | null
-
created_at: number
-
updated_at: number
-
}
-
-
interface CustomDomain {
-
id: string
-
domain: string
-
did: string
-
rkey: string
-
verified: boolean
-
last_verified_at: number | null
-
created_at: number
-
}
-
-
interface WispDomain {
-
domain: string
-
rkey: string | null
-
}
+
import { useUserInfo } from './hooks/useUserInfo'
+
import { useSiteData, type SiteWithDomains } from './hooks/useSiteData'
+
import { useDomainData } from './hooks/useDomainData'
+
import { SitesTab } from './tabs/SitesTab'
+
import { DomainsTab } from './tabs/DomainsTab'
+
import { UploadTab } from './tabs/UploadTab'
+
import { CLITab } from './tabs/CLITab'
function Dashboard() {
-
// User state
-
const [userInfo, setUserInfo] = useState<UserInfo | null>(null)
-
const [loading, setLoading] = useState(true)
-
-
// Sites state
-
const [sites, setSites] = useState<Site[]>([])
-
const [sitesLoading, setSitesLoading] = useState(true)
-
const [isSyncing, setIsSyncing] = useState(false)
+
// Use custom hooks
+
const { userInfo, loading, fetchUserInfo } = useUserInfo()
+
const { sites, sitesLoading, isSyncing, fetchSites, syncSites, deleteSite } = useSiteData()
+
const {
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
fetchDomains,
+
addCustomDomain,
+
verifyDomain,
+
deleteCustomDomain,
+
mapWispDomain,
+
deleteWispDomain,
+
mapCustomDomain,
+
claimWispDomain,
+
checkWispAvailability
+
} = useDomainData()
-
// Domains state
-
const [wispDomain, setWispDomain] = useState<WispDomain | null>(null)
-
const [customDomains, setCustomDomains] = useState<CustomDomain[]>([])
-
const [domainsLoading, setDomainsLoading] = useState(true)
-
-
// Site configuration state
-
const [configuringSite, setConfiguringSite] = useState<Site | null>(null)
-
const [selectedDomain, setSelectedDomain] = useState<string>('')
+
// Site configuration modal state (shared across components)
+
const [configuringSite, setConfiguringSite] = useState<SiteWithDomains | null>(null)
+
const [selectedDomains, setSelectedDomains] = useState<Set<string>>(new Set())
const [isSavingConfig, setIsSavingConfig] = useState(false)
const [isDeletingSite, setIsDeletingSite] = useState(false)
-
// Upload state
-
const [siteMode, setSiteMode] = useState<'existing' | 'new'>('existing')
-
const [selectedSiteRkey, setSelectedSiteRkey] = useState<string>('')
-
const [newSiteName, setNewSiteName] = useState('')
-
const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
-
const [isUploading, setIsUploading] = useState(false)
-
const [uploadProgress, setUploadProgress] = useState('')
-
const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
-
const [uploadedCount, setUploadedCount] = useState(0)
-
-
// Custom domain modal state
-
const [addDomainModalOpen, setAddDomainModalOpen] = useState(false)
-
const [customDomain, setCustomDomain] = useState('')
-
const [isAddingDomain, setIsAddingDomain] = useState(false)
-
const [verificationStatus, setVerificationStatus] = useState<{
-
[id: string]: 'idle' | 'verifying' | 'success' | 'error'
-
}>({})
-
const [viewDomainDNS, setViewDomainDNS] = useState<string | null>(null)
-
-
// Wisp domain claim state
-
const [wispHandle, setWispHandle] = useState('')
-
const [isClaimingWisp, setIsClaimingWisp] = useState(false)
-
const [wispAvailability, setWispAvailability] = useState<{
-
available: boolean | null
-
checking: boolean
-
}>({ available: null, checking: false })
-
-
// Fetch user info on mount
+
// Fetch initial data on mount
useEffect(() => {
fetchUserInfo()
fetchSites()
fetchDomains()
}, [])
-
// Auto-switch to 'new' mode if no sites exist
-
useEffect(() => {
-
if (!sitesLoading && sites.length === 0 && siteMode === 'existing') {
-
setSiteMode('new')
-
}
-
}, [sites, sitesLoading, siteMode])
+
// Handle site configuration modal
+
const handleConfigureSite = (site: SiteWithDomains) => {
+
setConfiguringSite(site)
-
const fetchUserInfo = async () => {
-
try {
-
const response = await fetch('/api/user/info')
-
const data = await response.json()
-
setUserInfo(data)
-
} catch (err) {
-
console.error('Failed to fetch user info:', err)
-
} finally {
-
setLoading(false)
-
}
-
}
+
// Build set of currently mapped domains
+
const mappedDomains = new Set<string>()
-
const fetchSites = async () => {
-
try {
-
const response = await fetch('/api/user/sites')
-
const data = await response.json()
-
setSites(data.sites || [])
-
} catch (err) {
-
console.error('Failed to fetch sites:', err)
-
} finally {
-
setSitesLoading(false)
-
}
-
}
-
-
const syncSites = async () => {
-
setIsSyncing(true)
-
try {
-
const response = await fetch('/api/user/sync', {
-
method: 'POST'
+
if (site.domains) {
+
site.domains.forEach(domainInfo => {
+
if (domainInfo.type === 'wisp') {
+
// For wisp domains, use the domain itself as the identifier
+
mappedDomains.add(`wisp:${domainInfo.domain}`)
+
} else if (domainInfo.id) {
+
mappedDomains.add(domainInfo.id)
+
}
})
-
const data = await response.json()
-
if (data.success) {
-
console.log(`Synced ${data.synced} sites from PDS`)
-
// Refresh sites list
-
await fetchSites()
-
}
-
} catch (err) {
-
console.error('Failed to sync sites:', err)
-
alert('Failed to sync sites from PDS')
-
} finally {
-
setIsSyncing(false)
}
-
}
-
const fetchDomains = async () => {
-
try {
-
const response = await fetch('/api/user/domains')
-
const data = await response.json()
-
setWispDomain(data.wispDomain)
-
setCustomDomains(data.customDomains || [])
-
} catch (err) {
-
console.error('Failed to fetch domains:', err)
-
} finally {
-
setDomainsLoading(false)
-
}
+
setSelectedDomains(mappedDomains)
}
-
const getSiteUrl = (site: Site) => {
-
// Check if this site is mapped to the wisp.place domain
-
if (wispDomain && wispDomain.rkey === site.rkey) {
-
return `https://${wispDomain.domain}`
-
}
-
-
// Check if this site is mapped to any custom domain
-
const customDomain = customDomains.find((d) => d.rkey === site.rkey)
-
if (customDomain) {
-
return `https://${customDomain.domain}`
-
}
-
-
// Default fallback URL
-
if (!userInfo) return '#'
-
return `https://sites.wisp.place/${site.did}/${site.rkey}`
-
}
+
const handleSaveSiteConfig = async () => {
+
if (!configuringSite) return
-
const getSiteDomainName = (site: Site) => {
-
if (wispDomain && wispDomain.rkey === site.rkey) {
-
return wispDomain.domain
-
}
-
-
const customDomain = customDomains.find((d) => d.rkey === site.rkey)
-
if (customDomain) {
-
return customDomain.domain
-
}
-
-
return `sites.wisp.place/${site.did}/${site.rkey}`
-
}
-
-
const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
-
if (e.target.files && e.target.files.length > 0) {
-
setSelectedFiles(e.target.files)
-
}
-
}
-
-
const handleUpload = async () => {
-
const siteName = siteMode === 'existing' ? selectedSiteRkey : newSiteName
-
-
if (!siteName) {
-
alert(siteMode === 'existing' ? 'Please select a site' : 'Please enter a site name')
-
return
-
}
-
-
setIsUploading(true)
-
setUploadProgress('Preparing files...')
-
+
setIsSavingConfig(true)
try {
-
const formData = new FormData()
-
formData.append('siteName', siteName)
-
-
if (selectedFiles) {
-
for (let i = 0; i < selectedFiles.length; i++) {
-
formData.append('files', selectedFiles[i])
-
}
-
}
-
-
setUploadProgress('Uploading to AT Protocol...')
-
const response = await fetch('/wisp/upload-files', {
-
method: 'POST',
-
body: formData
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
setUploadProgress('Upload complete!')
-
setSkippedFiles(data.skippedFiles || [])
-
setUploadedCount(data.uploadedCount || data.fileCount || 0)
-
setSelectedSiteRkey('')
-
setNewSiteName('')
-
setSelectedFiles(null)
-
-
// Refresh sites list
-
await fetchSites()
+
// Handle wisp domain mappings
+
const selectedWispDomainIds = Array.from(selectedDomains).filter(id => id.startsWith('wisp:'))
+
const selectedWispDomains = selectedWispDomainIds.map(id => id.replace('wisp:', ''))
-
// Reset form - give more time if there are skipped files
-
const resetDelay = data.skippedFiles && data.skippedFiles.length > 0 ? 4000 : 1500
-
setTimeout(() => {
-
setUploadProgress('')
-
setSkippedFiles([])
-
setUploadedCount(0)
-
setIsUploading(false)
-
}, resetDelay)
-
} else {
-
throw new Error(data.error || 'Upload failed')
-
}
-
} catch (err) {
-
console.error('Upload error:', err)
-
alert(
-
`Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
// Get currently mapped wisp domains
+
const currentlyMappedWispDomains = wispDomains.filter(
+
d => d.rkey === configuringSite.rkey
)
-
setIsUploading(false)
-
setUploadProgress('')
-
}
-
}
-
const handleAddCustomDomain = async () => {
-
if (!customDomain) {
-
alert('Please enter a domain')
-
return
-
}
-
-
setIsAddingDomain(true)
-
try {
-
const response = await fetch('/api/domain/custom/add', {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ domain: customDomain })
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
setCustomDomain('')
-
setAddDomainModalOpen(false)
-
await fetchDomains()
-
-
// Automatically show DNS configuration for the newly added domain
-
setViewDomainDNS(data.id)
-
} else {
-
throw new Error(data.error || 'Failed to add domain')
+
// Unmap wisp domains that are no longer selected
+
for (const domain of currentlyMappedWispDomains) {
+
if (!selectedWispDomains.includes(domain.domain)) {
+
await mapWispDomain(domain.domain, null)
+
}
}
-
} catch (err) {
-
console.error('Add domain error:', err)
-
alert(
-
`Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`
-
)
-
} finally {
-
setIsAddingDomain(false)
-
}
-
}
-
const handleVerifyDomain = async (id: string) => {
-
setVerificationStatus({ ...verificationStatus, [id]: 'verifying' })
-
-
try {
-
const response = await fetch('/api/domain/custom/verify', {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ id })
-
})
-
-
const data = await response.json()
-
if (data.success && data.verified) {
-
setVerificationStatus({ ...verificationStatus, [id]: 'success' })
-
await fetchDomains()
-
} else {
-
setVerificationStatus({ ...verificationStatus, [id]: 'error' })
-
if (data.error) {
-
alert(`Verification failed: ${data.error}`)
+
// Map newly selected wisp domains
+
for (const domainName of selectedWispDomains) {
+
const isAlreadyMapped = currentlyMappedWispDomains.some(d => d.domain === domainName)
+
if (!isAlreadyMapped) {
+
await mapWispDomain(domainName, configuringSite.rkey)
}
}
-
} catch (err) {
-
console.error('Verify domain error:', err)
-
setVerificationStatus({ ...verificationStatus, [id]: 'error' })
-
alert(
-
`Verification failed: ${err instanceof Error ? err.message : 'Unknown error'}`
-
)
-
}
-
}
-
const handleDeleteCustomDomain = async (id: string) => {
-
if (!confirm('Are you sure you want to remove this custom domain?')) {
-
return
-
}
-
-
try {
-
const response = await fetch(`/api/domain/custom/${id}`, {
-
method: 'DELETE'
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
await fetchDomains()
-
} else {
-
throw new Error('Failed to delete domain')
-
}
-
} catch (err) {
-
console.error('Delete domain error:', err)
-
alert(
-
`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
// Handle custom domain mappings
+
const selectedCustomDomainIds = Array.from(selectedDomains).filter(id => !id.startsWith('wisp:'))
+
const currentlyMappedCustomDomains = customDomains.filter(
+
d => d.rkey === configuringSite.rkey
)
-
}
-
}
-
const handleConfigureSite = (site: Site) => {
-
setConfiguringSite(site)
-
-
// Determine current domain mapping
-
if (wispDomain && wispDomain.rkey === site.rkey) {
-
setSelectedDomain('wisp')
-
} else {
-
const customDomain = customDomains.find((d) => d.rkey === site.rkey)
-
if (customDomain) {
-
setSelectedDomain(customDomain.id)
-
} else {
-
setSelectedDomain('none')
+
// Unmap domains that are no longer selected
+
for (const domain of currentlyMappedCustomDomains) {
+
if (!selectedCustomDomainIds.includes(domain.id)) {
+
await mapCustomDomain(domain.id, null)
+
}
}
-
}
-
}
-
const handleSaveSiteConfig = async () => {
-
if (!configuringSite) return
-
-
setIsSavingConfig(true)
-
try {
-
if (selectedDomain === 'wisp') {
-
// Map to wisp.place domain
-
const response = await fetch('/api/domain/wisp/map-site', {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ siteRkey: configuringSite.rkey })
-
})
-
const data = await response.json()
-
if (!data.success) throw new Error('Failed to map site')
-
} else if (selectedDomain === 'none') {
-
// Unmap from all domains
-
// Unmap wisp domain if this site was mapped to it
-
if (wispDomain && wispDomain.rkey === configuringSite.rkey) {
-
await fetch('/api/domain/wisp/map-site', {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ siteRkey: null })
-
})
-
}
-
-
// Unmap from custom domains
-
const mappedCustom = customDomains.find(
-
(d) => d.rkey === configuringSite.rkey
-
)
-
if (mappedCustom) {
-
await fetch(`/api/domain/custom/${mappedCustom.id}/map-site`, {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ siteRkey: null })
-
})
+
// Map newly selected domains
+
for (const domainId of selectedCustomDomainIds) {
+
const isAlreadyMapped = currentlyMappedCustomDomains.some(d => d.id === domainId)
+
if (!isAlreadyMapped) {
+
await mapCustomDomain(domainId, configuringSite.rkey)
}
-
} else {
-
// Map to a custom domain
-
const response = await fetch(
-
`/api/domain/custom/${selectedDomain}/map-site`,
-
{
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ siteRkey: configuringSite.rkey })
-
}
-
)
-
const data = await response.json()
-
if (!data.success) throw new Error('Failed to map site')
}
-
// Refresh domains to get updated mappings
+
// Refresh both domains and sites to get updated mappings
await fetchDomains()
+
await fetchSites()
setConfiguringSite(null)
} catch (err) {
console.error('Save config error:', err)
···
}
setIsDeletingSite(true)
-
try {
-
const response = await fetch(`/api/site/${configuringSite.rkey}`, {
-
method: 'DELETE'
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
// Refresh sites list
-
await fetchSites()
-
// Refresh domains in case this site was mapped
-
await fetchDomains()
-
setConfiguringSite(null)
-
} else {
-
throw new Error(data.error || 'Failed to delete site')
-
}
-
} catch (err) {
-
console.error('Delete site error:', err)
-
alert(
-
`Failed to delete site: ${err instanceof Error ? err.message : 'Unknown error'}`
-
)
-
} finally {
-
setIsDeletingSite(false)
+
const success = await deleteSite(configuringSite.rkey)
+
if (success) {
+
// Refresh domains in case this site was mapped
+
await fetchDomains()
+
setConfiguringSite(null)
}
+
setIsDeletingSite(false)
}
-
const checkWispAvailability = async (handle: string) => {
-
const trimmedHandle = handle.trim().toLowerCase()
-
if (!trimmedHandle) {
-
setWispAvailability({ available: null, checking: false })
-
return
-
}
-
-
setWispAvailability({ available: null, checking: true })
-
try {
-
const response = await fetch(`/api/domain/check?handle=${encodeURIComponent(trimmedHandle)}`)
-
const data = await response.json()
-
setWispAvailability({ available: data.available, checking: false })
-
} catch (err) {
-
console.error('Check availability error:', err)
-
setWispAvailability({ available: false, checking: false })
-
}
+
const handleUploadComplete = async () => {
+
await fetchSites()
}
-
const handleClaimWispDomain = async () => {
-
const trimmedHandle = wispHandle.trim().toLowerCase()
-
if (!trimmedHandle) {
-
alert('Please enter a handle')
-
return
-
}
-
-
setIsClaimingWisp(true)
+
const handleLogout = async () => {
try {
-
const response = await fetch('/api/domain/claim', {
+
const response = await fetch('/api/auth/logout', {
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ handle: trimmedHandle })
+
credentials: 'include'
})
-
-
const data = await response.json()
-
if (data.success) {
-
setWispHandle('')
-
setWispAvailability({ available: null, checking: false })
-
await fetchDomains()
+
const result = await response.json()
+
if (result.success) {
+
// Redirect to home page after successful logout
+
window.location.href = '/'
} else {
-
throw new Error(data.error || 'Failed to claim domain')
+
alert('Logout failed: ' + (result.error || 'Unknown error'))
}
} catch (err) {
-
console.error('Claim domain error:', err)
-
const errorMessage = err instanceof Error ? err.message : 'Unknown error'
-
-
// Handle "Already claimed" error more gracefully
-
if (errorMessage.includes('Already claimed')) {
-
alert('You have already claimed a wisp.place subdomain. Please refresh the page.')
-
await fetchDomains()
-
} else {
-
alert(`Failed to claim domain: ${errorMessage}`)
-
}
-
} finally {
-
setIsClaimingWisp(false)
+
alert('Logout failed: ' + (err instanceof Error ? err.message : 'Unknown error'))
}
}
···
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
<div className="flex items-center gap-2">
-
<div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
-
<Globe className="w-5 h-5 text-primary-foreground" />
-
</div>
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
<span className="text-xl font-semibold text-foreground">
wisp.place
</span>
···
<span className="text-sm text-muted-foreground">
{userInfo?.handle || 'Loading...'}
</span>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={handleLogout}
+
className="h-8 px-2"
+
>
+
<LogOut className="w-4 h-4" />
+
</Button>
</div>
</div>
</header>
···
</div>
<Tabs defaultValue="sites" className="space-y-6 w-full">
-
<TabsList className="grid w-full grid-cols-3 max-w-md">
+
<TabsList className="grid w-full grid-cols-4">
<TabsTrigger value="sites">Sites</TabsTrigger>
<TabsTrigger value="domains">Domains</TabsTrigger>
<TabsTrigger value="upload">Upload</TabsTrigger>
+
<TabsTrigger value="cli">CLI</TabsTrigger>
</TabsList>
{/* Sites Tab */}
-
<TabsContent value="sites" className="space-y-4 min-h-[400px]">
-
<Card>
-
<CardHeader>
-
<div className="flex items-center justify-between">
-
<div>
-
<CardTitle>Your Sites</CardTitle>
-
<CardDescription>
-
View and manage all your deployed sites
-
</CardDescription>
-
</div>
-
<Button
-
variant="outline"
-
size="sm"
-
onClick={syncSites}
-
disabled={isSyncing || sitesLoading}
-
>
-
<RefreshCw
-
className={`w-4 h-4 mr-2 ${isSyncing ? 'animate-spin' : ''}`}
-
/>
-
Sync from PDS
-
</Button>
-
</div>
-
</CardHeader>
-
<CardContent className="space-y-4">
-
{sitesLoading ? (
-
<div className="flex items-center justify-center py-8">
-
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
-
</div>
-
) : sites.length === 0 ? (
-
<div className="text-center py-8 text-muted-foreground">
-
<p>No sites yet. Upload your first site!</p>
-
</div>
-
) : (
-
sites.map((site) => (
-
<div
-
key={`${site.did}-${site.rkey}`}
-
className="flex items-center justify-between p-4 border border-border rounded-lg hover:bg-muted/50 transition-colors"
-
>
-
<div className="flex-1">
-
<div className="flex items-center gap-3 mb-2">
-
<h3 className="font-semibold text-lg">
-
{site.display_name || site.rkey}
-
</h3>
-
<Badge
-
variant="secondary"
-
className="text-xs"
-
>
-
active
-
</Badge>
-
</div>
-
<a
-
href={getSiteUrl(site)}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-sm text-accent hover:text-accent/80 flex items-center gap-1"
-
>
-
{getSiteDomainName(site)}
-
<ExternalLink className="w-3 h-3" />
-
</a>
-
</div>
-
<Button
-
variant="outline"
-
size="sm"
-
onClick={() => handleConfigureSite(site)}
-
>
-
<Settings className="w-4 h-4 mr-2" />
-
Configure
-
</Button>
-
</div>
-
))
-
)}
-
</CardContent>
-
</Card>
+
<TabsContent value="sites">
+
<SitesTab
+
sites={sites}
+
sitesLoading={sitesLoading}
+
isSyncing={isSyncing}
+
userInfo={userInfo}
+
onSyncSites={syncSites}
+
onConfigureSite={handleConfigureSite}
+
/>
</TabsContent>
{/* Domains Tab */}
-
<TabsContent value="domains" className="space-y-4 min-h-[400px]">
-
<Card>
-
<CardHeader>
-
<CardTitle>wisp.place Subdomain</CardTitle>
-
<CardDescription>
-
Your free subdomain on the wisp.place network
-
</CardDescription>
-
</CardHeader>
-
<CardContent>
-
{domainsLoading ? (
-
<div className="flex items-center justify-center py-4">
-
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
-
</div>
-
) : wispDomain ? (
-
<>
-
<div className="flex flex-col gap-2 p-4 bg-muted/50 rounded-lg">
-
<div className="flex items-center gap-2">
-
<CheckCircle2 className="w-5 h-5 text-green-500" />
-
<span className="font-mono text-lg">
-
{wispDomain.domain}
-
</span>
-
</div>
-
{wispDomain.rkey && (
-
<p className="text-xs text-muted-foreground ml-7">
-
→ Mapped to site: {wispDomain.rkey}
-
</p>
-
)}
-
</div>
-
<p className="text-sm text-muted-foreground mt-3">
-
{wispDomain.rkey
-
? 'This domain is mapped to a specific site'
-
: 'This domain is not mapped to any site yet. Configure it from the Sites tab.'}
-
</p>
-
</>
-
) : (
-
<div className="space-y-4">
-
<div className="p-4 bg-muted/30 rounded-lg">
-
<p className="text-sm text-muted-foreground mb-4">
-
Claim your free wisp.place subdomain
-
</p>
-
<div className="space-y-3">
-
<div className="space-y-2">
-
<Label htmlFor="wisp-handle">Choose your handle</Label>
-
<div className="flex gap-2">
-
<div className="flex-1 relative">
-
<Input
-
id="wisp-handle"
-
placeholder="mysite"
-
value={wispHandle}
-
onChange={(e) => {
-
setWispHandle(e.target.value)
-
if (e.target.value.trim()) {
-
checkWispAvailability(e.target.value)
-
} else {
-
setWispAvailability({ available: null, checking: false })
-
}
-
}}
-
disabled={isClaimingWisp}
-
className="pr-24"
-
/>
-
<span className="absolute right-3 top-1/2 -translate-y-1/2 text-sm text-muted-foreground">
-
.wisp.place
-
</span>
-
</div>
-
</div>
-
{wispAvailability.checking && (
-
<p className="text-xs text-muted-foreground flex items-center gap-1">
-
<Loader2 className="w-3 h-3 animate-spin" />
-
Checking availability...
-
</p>
-
)}
-
{!wispAvailability.checking && wispAvailability.available === true && (
-
<p className="text-xs text-green-600 flex items-center gap-1">
-
<CheckCircle2 className="w-3 h-3" />
-
Available
-
</p>
-
)}
-
{!wispAvailability.checking && wispAvailability.available === false && (
-
<p className="text-xs text-red-600 flex items-center gap-1">
-
<XCircle className="w-3 h-3" />
-
Not available
-
</p>
-
)}
-
</div>
-
<Button
-
onClick={handleClaimWispDomain}
-
disabled={!wispHandle.trim() || isClaimingWisp || wispAvailability.available !== true}
-
className="w-full"
-
>
-
{isClaimingWisp ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Claiming...
-
</>
-
) : (
-
'Claim Subdomain'
-
)}
-
</Button>
-
</div>
-
</div>
-
</div>
-
)}
-
</CardContent>
-
</Card>
-
-
<Card>
-
<CardHeader>
-
<CardTitle>Custom Domains</CardTitle>
-
<CardDescription>
-
Bring your own domain with DNS verification
-
</CardDescription>
-
</CardHeader>
-
<CardContent className="space-y-4">
-
<Button
-
onClick={() => setAddDomainModalOpen(true)}
-
className="w-full"
-
>
-
Add Custom Domain
-
</Button>
-
-
{domainsLoading ? (
-
<div className="flex items-center justify-center py-4">
-
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
-
</div>
-
) : customDomains.length === 0 ? (
-
<div className="text-center py-4 text-muted-foreground text-sm">
-
No custom domains added yet
-
</div>
-
) : (
-
<div className="space-y-2">
-
{customDomains.map((domain) => (
-
<div
-
key={domain.id}
-
className="flex items-center justify-between p-3 border border-border rounded-lg"
-
>
-
<div className="flex flex-col gap-1 flex-1">
-
<div className="flex items-center gap-2">
-
{domain.verified ? (
-
<CheckCircle2 className="w-4 h-4 text-green-500" />
-
) : (
-
<XCircle className="w-4 h-4 text-red-500" />
-
)}
-
<span className="font-mono">
-
{domain.domain}
-
</span>
-
</div>
-
{domain.rkey && domain.rkey !== 'self' && (
-
<p className="text-xs text-muted-foreground ml-6">
-
→ Mapped to site: {domain.rkey}
-
</p>
-
)}
-
</div>
-
<div className="flex items-center gap-2">
-
<Button
-
variant="outline"
-
size="sm"
-
onClick={() =>
-
setViewDomainDNS(domain.id)
-
}
-
>
-
View DNS
-
</Button>
-
{domain.verified ? (
-
<Badge variant="secondary">
-
Verified
-
</Badge>
-
) : (
-
<Button
-
variant="outline"
-
size="sm"
-
onClick={() =>
-
handleVerifyDomain(domain.id)
-
}
-
disabled={
-
verificationStatus[
-
domain.id
-
] === 'verifying'
-
}
-
>
-
{verificationStatus[
-
domain.id
-
] === 'verifying' ? (
-
<>
-
<Loader2 className="w-3 h-3 mr-1 animate-spin" />
-
Verifying...
-
</>
-
) : (
-
'Verify DNS'
-
)}
-
</Button>
-
)}
-
<Button
-
variant="ghost"
-
size="sm"
-
onClick={() =>
-
handleDeleteCustomDomain(
-
domain.id
-
)
-
}
-
>
-
<Trash2 className="w-4 h-4" />
-
</Button>
-
</div>
-
</div>
-
))}
-
</div>
-
)}
-
</CardContent>
-
</Card>
+
<TabsContent value="domains">
+
<DomainsTab
+
wispDomains={wispDomains}
+
customDomains={customDomains}
+
domainsLoading={domainsLoading}
+
verificationStatus={verificationStatus}
+
userInfo={userInfo}
+
onAddCustomDomain={addCustomDomain}
+
onVerifyDomain={verifyDomain}
+
onDeleteCustomDomain={deleteCustomDomain}
+
onDeleteWispDomain={deleteWispDomain}
+
onClaimWispDomain={claimWispDomain}
+
onCheckWispAvailability={checkWispAvailability}
+
/>
</TabsContent>
{/* Upload Tab */}
-
<TabsContent value="upload" className="space-y-4 min-h-[400px]">
-
<Card>
-
<CardHeader>
-
<CardTitle>Upload Site</CardTitle>
-
<CardDescription>
-
Deploy a new site from a folder or Git repository
-
</CardDescription>
-
</CardHeader>
-
<CardContent className="space-y-6">
-
<div className="space-y-4">
-
<RadioGroup
-
value={siteMode}
-
onValueChange={(value) => setSiteMode(value as 'existing' | 'new')}
-
disabled={isUploading}
-
>
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="existing" id="existing" />
-
<Label htmlFor="existing" className="cursor-pointer">
-
Update existing site
-
</Label>
-
</div>
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="new" id="new" />
-
<Label htmlFor="new" className="cursor-pointer">
-
Create new site
-
</Label>
-
</div>
-
</RadioGroup>
-
-
{siteMode === 'existing' ? (
-
<div className="space-y-2">
-
<Label htmlFor="site-select">Select Site</Label>
-
{sitesLoading ? (
-
<div className="flex items-center justify-center py-4">
-
<Loader2 className="w-5 h-5 animate-spin text-muted-foreground" />
-
</div>
-
) : sites.length === 0 ? (
-
<div className="p-4 border border-dashed rounded-lg text-center text-sm text-muted-foreground">
-
No sites available. Create a new site instead.
-
</div>
-
) : (
-
<select
-
id="site-select"
-
className="flex h-10 w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background file:border-0 file:bg-transparent file:text-sm file:font-medium placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
-
value={selectedSiteRkey}
-
onChange={(e) => setSelectedSiteRkey(e.target.value)}
-
disabled={isUploading}
-
>
-
<option value="">Select a site...</option>
-
{sites.map((site) => (
-
<option key={site.rkey} value={site.rkey}>
-
{site.display_name || site.rkey}
-
</option>
-
))}
-
</select>
-
)}
-
</div>
-
) : (
-
<div className="space-y-2">
-
<Label htmlFor="new-site-name">New Site Name</Label>
-
<Input
-
id="new-site-name"
-
placeholder="my-awesome-site"
-
value={newSiteName}
-
onChange={(e) => setNewSiteName(e.target.value)}
-
disabled={isUploading}
-
/>
-
</div>
-
)}
-
-
<p className="text-xs text-muted-foreground">
-
File limits: 100MB per file, 300MB total
-
</p>
-
</div>
-
-
<div className="grid md:grid-cols-2 gap-4">
-
<Card className="border-2 border-dashed hover:border-accent transition-colors cursor-pointer">
-
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
-
<Upload className="w-12 h-12 text-muted-foreground mb-4" />
-
<h3 className="font-semibold mb-2">
-
Upload Folder
-
</h3>
-
<p className="text-sm text-muted-foreground mb-4">
-
Drag and drop or click to upload your
-
static site files
-
</p>
-
<input
-
type="file"
-
id="file-upload"
-
multiple
-
onChange={handleFileSelect}
-
className="hidden"
-
{...(({ webkitdirectory: '', directory: '' } as any))}
-
disabled={isUploading}
-
/>
-
<label htmlFor="file-upload">
-
<Button
-
variant="outline"
-
type="button"
-
onClick={() =>
-
document
-
.getElementById('file-upload')
-
?.click()
-
}
-
disabled={isUploading}
-
>
-
Choose Folder
-
</Button>
-
</label>
-
{selectedFiles && selectedFiles.length > 0 && (
-
<p className="text-sm text-muted-foreground mt-3">
-
{selectedFiles.length} files selected
-
</p>
-
)}
-
</CardContent>
-
</Card>
-
-
<Card className="border-2 border-dashed opacity-50">
-
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
-
<Globe className="w-12 h-12 text-muted-foreground mb-4" />
-
<h3 className="font-semibold mb-2">
-
Connect Git Repository
-
</h3>
-
<p className="text-sm text-muted-foreground mb-4">
-
Link your GitHub, GitLab, or any Git
-
repository
-
</p>
-
<Badge variant="secondary">Coming soon!</Badge>
-
</CardContent>
-
</Card>
-
</div>
-
-
{uploadProgress && (
-
<div className="space-y-3">
-
<div className="p-4 bg-muted rounded-lg">
-
<div className="flex items-center gap-2">
-
<Loader2 className="w-4 h-4 animate-spin" />
-
<span className="text-sm">{uploadProgress}</span>
-
</div>
-
</div>
-
-
{skippedFiles.length > 0 && (
-
<div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
-
<div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
-
<AlertCircle className="w-4 h-4 mt-0.5 flex-shrink-0" />
-
<div className="flex-1">
-
<span className="font-medium">
-
{skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
-
</span>
-
{uploadedCount > 0 && (
-
<span className="text-sm ml-2">
-
({uploadedCount} uploaded successfully)
-
</span>
-
)}
-
</div>
-
</div>
-
<div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
-
{skippedFiles.slice(0, 5).map((file, idx) => (
-
<div key={idx} className="text-xs">
-
<span className="font-mono">{file.name}</span>
-
<span className="text-muted-foreground"> - {file.reason}</span>
-
</div>
-
))}
-
{skippedFiles.length > 5 && (
-
<div className="text-xs text-muted-foreground">
-
...and {skippedFiles.length - 5} more
-
</div>
-
)}
-
</div>
-
</div>
-
)}
-
</div>
-
)}
+
<TabsContent value="upload">
+
<UploadTab
+
sites={sites}
+
sitesLoading={sitesLoading}
+
onUploadComplete={handleUploadComplete}
+
/>
+
</TabsContent>
-
<Button
-
onClick={handleUpload}
-
className="w-full"
-
disabled={
-
(siteMode === 'existing' ? !selectedSiteRkey : !newSiteName) ||
-
isUploading ||
-
(siteMode === 'existing' && (!selectedFiles || selectedFiles.length === 0))
-
}
-
>
-
{isUploading ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Uploading...
-
</>
-
) : (
-
<>
-
{siteMode === 'existing' ? (
-
'Update Site'
-
) : (
-
selectedFiles && selectedFiles.length > 0
-
? 'Upload & Deploy'
-
: 'Create Empty Site'
-
)}
-
</>
-
)}
-
</Button>
-
</CardContent>
-
</Card>
+
{/* CLI Tab */}
+
<TabsContent value="cli">
+
<CLITab />
</TabsContent>
</Tabs>
</div>
-
{/* Add Custom Domain Modal */}
-
<Dialog open={addDomainModalOpen} onOpenChange={setAddDomainModalOpen}>
-
<DialogContent className="sm:max-w-lg">
-
<DialogHeader>
-
<DialogTitle>Add Custom Domain</DialogTitle>
-
<DialogDescription>
-
Enter your domain name. After adding, you'll see the DNS
-
records to configure.
-
</DialogDescription>
-
</DialogHeader>
-
<div className="space-y-4 py-4">
-
<div className="space-y-2">
-
<Label htmlFor="new-domain">Domain Name</Label>
-
<Input
-
id="new-domain"
-
placeholder="example.com"
-
value={customDomain}
-
onChange={(e) => setCustomDomain(e.target.value)}
-
/>
-
<p className="text-xs text-muted-foreground">
-
After adding, click "View DNS" to see the records you
-
need to configure.
-
</p>
-
</div>
+
{/* Footer */}
+
<footer className="border-t border-border/40 bg-muted/20 mt-12">
+
<div className="container mx-auto px-4 py-8">
+
<div className="text-center text-sm text-muted-foreground">
+
<p>
+
Built by{' '}
+
<a
+
href="https://bsky.app/profile/nekomimi.pet"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
@nekomimi.pet
+
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
+
</p>
</div>
-
<DialogFooter className="flex-col sm:flex-row gap-2">
-
<Button
-
variant="outline"
-
onClick={() => {
-
setAddDomainModalOpen(false)
-
setCustomDomain('')
-
}}
-
className="w-full sm:w-auto"
-
disabled={isAddingDomain}
-
>
-
Cancel
-
</Button>
-
<Button
-
onClick={handleAddCustomDomain}
-
disabled={!customDomain || isAddingDomain}
-
className="w-full sm:w-auto"
-
>
-
{isAddingDomain ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Adding...
-
</>
-
) : (
-
'Add Domain'
-
)}
-
</Button>
-
</DialogFooter>
-
</DialogContent>
-
</Dialog>
+
</div>
+
</footer>
{/* Site Configuration Modal */}
<Dialog
···
>
<DialogContent className="sm:max-w-lg">
<DialogHeader>
-
<DialogTitle>Configure Site Domain</DialogTitle>
+
<DialogTitle>Configure Site Domains</DialogTitle>
<DialogDescription>
-
Choose which domain this site should use
+
Select which domains should be mapped to this site. You can select multiple domains.
</DialogDescription>
</DialogHeader>
{configuringSite && (
···
</p>
</div>
-
<RadioGroup
-
value={selectedDomain}
-
onValueChange={setSelectedDomain}
-
>
-
{wispDomain && (
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="wisp" id="wisp" />
-
<Label
-
htmlFor="wisp"
-
className="flex-1 cursor-pointer"
-
>
-
<div className="flex items-center justify-between">
-
<span className="font-mono text-sm">
-
{wispDomain.domain}
-
</span>
-
<Badge variant="secondary" className="text-xs ml-2">
-
Free
-
</Badge>
-
</div>
-
</Label>
-
</div>
-
)}
+
<div className="space-y-3">
+
<p className="text-sm font-medium">Available Domains:</p>
+
+
{wispDomains.map((wispDomain) => {
+
const domainId = `wisp:${wispDomain.domain}`
+
return (
+
<div key={domainId} className="flex items-center space-x-3 p-3 border rounded-lg hover:bg-muted/30">
+
<Checkbox
+
id={domainId}
+
checked={selectedDomains.has(domainId)}
+
onCheckedChange={(checked) => {
+
const newSelected = new Set(selectedDomains)
+
if (checked) {
+
newSelected.add(domainId)
+
} else {
+
newSelected.delete(domainId)
+
}
+
setSelectedDomains(newSelected)
+
}}
+
/>
+
<Label
+
htmlFor={domainId}
+
className="flex-1 cursor-pointer"
+
>
+
<div className="flex items-center justify-between">
+
<span className="font-mono text-sm">
+
{wispDomain.domain}
+
</span>
+
<Badge variant="secondary" className="text-xs ml-2">
+
Wisp
+
</Badge>
+
</div>
+
</Label>
+
</div>
+
)
+
})}
{customDomains
.filter((d) => d.verified)
.map((domain) => (
<div
key={domain.id}
-
className="flex items-center space-x-2"
+
className="flex items-center space-x-3 p-3 border rounded-lg hover:bg-muted/30"
>
-
<RadioGroupItem
-
value={domain.id}
+
<Checkbox
id={domain.id}
+
checked={selectedDomains.has(domain.id)}
+
onCheckedChange={(checked) => {
+
const newSelected = new Set(selectedDomains)
+
if (checked) {
+
newSelected.add(domain.id)
+
} else {
+
newSelected.delete(domain.id)
+
}
+
setSelectedDomains(newSelected)
+
}}
/>
<Label
htmlFor={domain.id}
···
</div>
))}
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="none" id="none" />
-
<Label htmlFor="none" className="flex-1 cursor-pointer">
-
<div className="flex flex-col">
-
<span className="text-sm">Default URL</span>
-
<span className="text-xs text-muted-foreground font-mono break-all">
-
sites.wisp.place/{configuringSite.did}/
-
{configuringSite.rkey}
-
</span>
-
</div>
-
</Label>
-
</div>
-
</RadioGroup>
+
{customDomains.filter(d => d.verified).length === 0 && wispDomains.length === 0 && (
+
<p className="text-sm text-muted-foreground py-4 text-center">
+
No domains available. Add a custom domain or claim a wisp.place subdomain.
+
</p>
+
)}
+
</div>
+
+
<div className="p-3 bg-muted/20 rounded-lg border-l-4 border-blue-500/50">
+
<p className="text-xs text-muted-foreground">
+
<strong>Note:</strong> If no domains are selected, the site will be accessible at:{' '}
+
<span className="font-mono">
+
sites.wisp.place/{userInfo?.handle || '...'}/{configuringSite.rkey}
+
</span>
+
</p>
+
</div>
</div>
)}
<DialogFooter className="flex flex-col sm:flex-row sm:justify-between gap-2">
···
)}
</Button>
</div>
-
</DialogFooter>
-
</DialogContent>
-
</Dialog>
-
-
{/* View DNS Records Modal */}
-
<Dialog
-
open={viewDomainDNS !== null}
-
onOpenChange={(open) => !open && setViewDomainDNS(null)}
-
>
-
<DialogContent className="sm:max-w-lg">
-
<DialogHeader>
-
<DialogTitle>DNS Configuration</DialogTitle>
-
<DialogDescription>
-
Add these DNS records to your domain provider
-
</DialogDescription>
-
</DialogHeader>
-
{viewDomainDNS && userInfo && (
-
<>
-
{(() => {
-
const domain = customDomains.find(
-
(d) => d.id === viewDomainDNS
-
)
-
if (!domain) return null
-
-
return (
-
<div className="space-y-4 py-4">
-
<div className="p-3 bg-muted/30 rounded-lg">
-
<p className="text-sm font-medium mb-1">
-
Domain:
-
</p>
-
<p className="font-mono text-sm">
-
{domain.domain}
-
</p>
-
</div>
-
-
<div className="space-y-3">
-
<div className="p-3 bg-background rounded border border-border">
-
<div className="flex justify-between items-start mb-2">
-
<span className="text-xs font-semibold text-muted-foreground">
-
TXT Record (Verification)
-
</span>
-
</div>
-
<div className="font-mono text-xs space-y-2">
-
<div>
-
<span className="text-muted-foreground">
-
Name:
-
</span>{' '}
-
<span className="select-all">
-
_wisp.{domain.domain}
-
</span>
-
</div>
-
<div>
-
<span className="text-muted-foreground">
-
Value:
-
</span>{' '}
-
<span className="select-all break-all">
-
{userInfo.did}
-
</span>
-
</div>
-
</div>
-
</div>
-
-
<div className="p-3 bg-background rounded border border-border">
-
<div className="flex justify-between items-start mb-2">
-
<span className="text-xs font-semibold text-muted-foreground">
-
CNAME Record (Pointing)
-
</span>
-
</div>
-
<div className="font-mono text-xs space-y-2">
-
<div>
-
<span className="text-muted-foreground">
-
Name:
-
</span>{' '}
-
<span className="select-all">
-
{domain.domain}
-
</span>
-
</div>
-
<div>
-
<span className="text-muted-foreground">
-
Value:
-
</span>{' '}
-
<span className="select-all">
-
{domain.id}.dns.wisp.place
-
</span>
-
</div>
-
</div>
-
<p className="text-xs text-muted-foreground mt-2">
-
Some DNS providers may require you to use @ or leave it blank for the root domain
-
</p>
-
</div>
-
</div>
-
-
<div className="p-3 bg-muted/30 rounded-lg">
-
<p className="text-xs text-muted-foreground">
-
💡 After configuring DNS, click "Verify DNS"
-
to check if everything is set up correctly.
-
DNS changes can take a few minutes to
-
propagate.
-
</p>
-
</div>
-
</div>
-
)
-
})()}
-
</>
-
)}
-
<DialogFooter>
-
<Button
-
variant="outline"
-
onClick={() => setViewDomainDNS(null)}
-
className="w-full sm:w-auto"
-
>
-
Close
-
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
+239
public/editor/hooks/useDomainData.ts
···
+
import { useState } from 'react'
+
+
export interface CustomDomain {
+
id: string
+
domain: string
+
did: string
+
rkey: string
+
verified: boolean
+
last_verified_at: number | null
+
created_at: number
+
}
+
+
export interface WispDomain {
+
domain: string
+
rkey: string | null
+
}
+
+
type VerificationStatus = 'idle' | 'verifying' | 'success' | 'error'
+
+
export function useDomainData() {
+
const [wispDomains, setWispDomains] = useState<WispDomain[]>([])
+
const [customDomains, setCustomDomains] = useState<CustomDomain[]>([])
+
const [domainsLoading, setDomainsLoading] = useState(true)
+
const [verificationStatus, setVerificationStatus] = useState<{
+
[id: string]: VerificationStatus
+
}>({})
+
+
const fetchDomains = async () => {
+
try {
+
const response = await fetch('/api/user/domains')
+
const data = await response.json()
+
setWispDomains(data.wispDomains || [])
+
setCustomDomains(data.customDomains || [])
+
} catch (err) {
+
console.error('Failed to fetch domains:', err)
+
} finally {
+
setDomainsLoading(false)
+
}
+
}
+
+
const addCustomDomain = async (domain: string) => {
+
try {
+
const response = await fetch('/api/domain/custom/add', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ domain })
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return { success: true, id: data.id }
+
} else {
+
throw new Error(data.error || 'Failed to add domain')
+
}
+
} catch (err) {
+
console.error('Add domain error:', err)
+
alert(
+
`Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return { success: false }
+
}
+
}
+
+
const verifyDomain = async (id: string) => {
+
setVerificationStatus({ ...verificationStatus, [id]: 'verifying' })
+
+
try {
+
const response = await fetch('/api/domain/custom/verify', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ id })
+
})
+
+
const data = await response.json()
+
if (data.success && data.verified) {
+
setVerificationStatus({ ...verificationStatus, [id]: 'success' })
+
await fetchDomains()
+
} else {
+
setVerificationStatus({ ...verificationStatus, [id]: 'error' })
+
if (data.error) {
+
alert(`Verification failed: ${data.error}`)
+
}
+
}
+
} catch (err) {
+
console.error('Verify domain error:', err)
+
setVerificationStatus({ ...verificationStatus, [id]: 'error' })
+
alert(
+
`Verification failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
}
+
}
+
+
const deleteCustomDomain = async (id: string) => {
+
if (!confirm('Are you sure you want to remove this custom domain?')) {
+
return false
+
}
+
+
try {
+
const response = await fetch(`/api/domain/custom/${id}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return true
+
} else {
+
throw new Error('Failed to delete domain')
+
}
+
} catch (err) {
+
console.error('Delete domain error:', err)
+
alert(
+
`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
const mapWispDomain = async (domain: string, siteRkey: string | null) => {
+
try {
+
const response = await fetch('/api/domain/wisp/map-site', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ domain, siteRkey })
+
})
+
const data = await response.json()
+
if (!data.success) throw new Error('Failed to map wisp domain')
+
return true
+
} catch (err) {
+
console.error('Map wisp domain error:', err)
+
throw err
+
}
+
}
+
+
const deleteWispDomain = async (domain: string) => {
+
if (!confirm('Are you sure you want to remove this wisp.place domain?')) {
+
return false
+
}
+
+
try {
+
const response = await fetch(`/api/domain/wisp/${encodeURIComponent(domain)}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return true
+
} else {
+
throw new Error('Failed to delete domain')
+
}
+
} catch (err) {
+
console.error('Delete wisp domain error:', err)
+
alert(
+
`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
const mapCustomDomain = async (domainId: string, siteRkey: string | null) => {
+
try {
+
const response = await fetch(`/api/domain/custom/${domainId}/map-site`, {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ siteRkey })
+
})
+
const data = await response.json()
+
if (!data.success) throw new Error(`Failed to map custom domain ${domainId}`)
+
return true
+
} catch (err) {
+
console.error('Map custom domain error:', err)
+
throw err
+
}
+
}
+
+
const claimWispDomain = async (handle: string) => {
+
try {
+
const response = await fetch('/api/domain/claim', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ handle })
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return { success: true }
+
} else {
+
throw new Error(data.error || 'Failed to claim domain')
+
}
+
} catch (err) {
+
console.error('Claim domain error:', err)
+
const errorMessage = err instanceof Error ? err.message : 'Unknown error'
+
+
// Handle domain limit error more gracefully
+
if (errorMessage.includes('Domain limit reached')) {
+
alert('You have already claimed 3 wisp.place subdomains (maximum limit).')
+
await fetchDomains()
+
} else {
+
alert(`Failed to claim domain: ${errorMessage}`)
+
}
+
return { success: false, error: errorMessage }
+
}
+
}
+
+
const checkWispAvailability = async (handle: string) => {
+
const trimmedHandle = handle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
return { available: null }
+
}
+
+
try {
+
const response = await fetch(`/api/domain/check?handle=${encodeURIComponent(trimmedHandle)}`)
+
const data = await response.json()
+
return { available: data.available }
+
} catch (err) {
+
console.error('Check availability error:', err)
+
return { available: false }
+
}
+
}
+
+
return {
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
fetchDomains,
+
addCustomDomain,
+
verifyDomain,
+
deleteCustomDomain,
+
mapWispDomain,
+
deleteWispDomain,
+
mapCustomDomain,
+
claimWispDomain,
+
checkWispAvailability
+
}
+
}
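
A rough sketch of how a component other than the dashboard could consume `useDomainData`, mirroring the fetch-on-mount pattern used in `editor.tsx` above. The `DomainList` component and its markup are assumptions for illustration, not part of the changeset.

```tsx
// Illustrative consumer of useDomainData (hypothetical, not part of this changeset).
import { useEffect } from 'react'
import { useDomainData } from './useDomainData'

export function DomainList() {
    const {
        customDomains,
        domainsLoading,
        verificationStatus,
        fetchDomains,
        verifyDomain
    } = useDomainData()

    // Load domains once on mount, the same way the dashboard does.
    useEffect(() => {
        fetchDomains()
    }, [])

    if (domainsLoading) return <p>Loading domains...</p>

    return (
        <ul>
            {customDomains.map((d) => (
                <li key={d.id}>
                    {d.domain}: {d.verified ? 'verified' : (verificationStatus[d.id] ?? 'unverified')}
                    {!d.verified && (
                        <button onClick={() => verifyDomain(d.id)}>Verify DNS</button>
                    )}
                </li>
            ))}
        </ul>
    )
}
```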
+112
public/editor/hooks/useSiteData.ts
···
+
import { useState } from 'react'
+
+
export interface Site {
+
did: string
+
rkey: string
+
display_name: string | null
+
created_at: number
+
updated_at: number
+
}
+
+
export interface DomainInfo {
+
type: 'wisp' | 'custom'
+
domain: string
+
verified?: boolean
+
id?: string
+
}
+
+
export interface SiteWithDomains extends Site {
+
domains?: DomainInfo[]
+
}
+
+
export function useSiteData() {
+
const [sites, setSites] = useState<SiteWithDomains[]>([])
+
const [sitesLoading, setSitesLoading] = useState(true)
+
const [isSyncing, setIsSyncing] = useState(false)
+
+
const fetchSites = async () => {
+
try {
+
const response = await fetch('/api/user/sites')
+
const data = await response.json()
+
const sitesData: Site[] = data.sites || []
+
+
// Fetch domain info for each site
+
const sitesWithDomains = await Promise.all(
+
sitesData.map(async (site) => {
+
try {
+
const domainsResponse = await fetch(`/api/user/site/${site.rkey}/domains`)
+
const domainsData = await domainsResponse.json()
+
return {
+
...site,
+
domains: domainsData.domains || []
+
}
+
} catch (err) {
+
console.error(`Failed to fetch domains for site ${site.rkey}:`, err)
+
return {
+
...site,
+
domains: []
+
}
+
}
+
})
+
)
+
+
setSites(sitesWithDomains)
+
} catch (err) {
+
console.error('Failed to fetch sites:', err)
+
} finally {
+
setSitesLoading(false)
+
}
+
}
+
+
const syncSites = async () => {
+
setIsSyncing(true)
+
try {
+
const response = await fetch('/api/user/sync', {
+
method: 'POST'
+
})
+
const data = await response.json()
+
if (data.success) {
+
console.log(`Synced ${data.synced} sites from PDS`)
+
// Refresh sites list
+
await fetchSites()
+
}
+
} catch (err) {
+
console.error('Failed to sync sites:', err)
+
alert('Failed to sync sites from PDS')
+
} finally {
+
setIsSyncing(false)
+
}
+
}
+
+
const deleteSite = async (rkey: string) => {
+
try {
+
const response = await fetch(`/api/site/${rkey}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
// Refresh sites list
+
await fetchSites()
+
return true
+
} else {
+
throw new Error(data.error || 'Failed to delete site')
+
}
+
} catch (err) {
+
console.error('Delete site error:', err)
+
alert(
+
`Failed to delete site: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
return {
+
sites,
+
sitesLoading,
+
isSyncing,
+
fetchSites,
+
syncSites,
+
deleteSite
+
}
+
}
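
Similarly, a short sketch of how the `SiteWithDomains` shape returned by `useSiteData` could be turned into links. The fallback to `sites.wisp.place/{did}/{rkey}` follows the behavior of the `getSiteUrl` helper removed from `editor.tsx`, but the `SiteList` component and `siteUrl` helper are hypothetical.

```tsx
// Illustrative consumer of useSiteData (hypothetical, not part of this changeset).
import { useEffect } from 'react'
import { useSiteData, type SiteWithDomains } from './useSiteData'

// Prefer the first mapped domain; otherwise fall back to the default URL.
function siteUrl(site: SiteWithDomains): string {
    const mapped = site.domains?.[0]
    return mapped
        ? `https://${mapped.domain}`
        : `https://sites.wisp.place/${site.did}/${site.rkey}`
}

export function SiteList() {
    const { sites, sitesLoading, fetchSites } = useSiteData()

    useEffect(() => {
        fetchSites()
    }, [])

    if (sitesLoading) return <p>Loading sites...</p>

    return (
        <ul>
            {sites.map((site) => (
                <li key={`${site.did}-${site.rkey}`}>
                    <a href={siteUrl(site)}>{site.display_name || site.rkey}</a>
                </li>
            ))}
        </ul>
    )
}
```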
+29
public/editor/hooks/useUserInfo.ts
···
+
import { useState } from 'react'
+
+
export interface UserInfo {
+
did: string
+
handle: string
+
}
+
+
export function useUserInfo() {
+
const [userInfo, setUserInfo] = useState<UserInfo | null>(null)
+
const [loading, setLoading] = useState(true)
+
+
const fetchUserInfo = async () => {
+
try {
+
const response = await fetch('/api/user/info')
+
const data = await response.json()
+
setUserInfo(data)
+
} catch (err) {
+
console.error('Failed to fetch user info:', err)
+
} finally {
+
setLoading(false)
+
}
+
}
+
+
return {
+
userInfo,
+
loading,
+
fetchUserInfo
+
}
+
}
+41 -1
public/editor/index.html
···
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Elysia Static</title>
+
<title>wisp.place</title>
+
<meta name="description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/editor" />
+
<meta property="og:title" content="Editor - wisp.place" />
+
<meta property="og:description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary" />
+
<meta name="twitter:url" content="https://wisp.place/editor" />
+
<meta name="twitter:title" content="Editor - wisp.place" />
+
<meta name="twitter:description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
<link rel="icon" type="image/x-icon" href="../favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="../favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="../favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="../apple-touch-icon.png">
+
<link rel="manifest" href="../site.webmanifest">
+
<style>
+
/* Dark theme fallback styles for before JS loads */
+
@media (prefers-color-scheme: dark) {
+
body {
+
background-color: oklch(0.23 0.015 285);
+
color: oklch(0.90 0.005 285);
+
}
+
+
pre {
+
background-color: oklch(0.33 0.015 285) !important;
+
color: oklch(0.90 0.005 285) !important;
+
}
+
+
.bg-muted {
+
background-color: oklch(0.33 0.015 285) !important;
+
}
+
}
+
</style>
</head>
<body>
<div id="elysia"></div>
+322
public/editor/tabs/CLITab.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Badge } from '@public/components/ui/badge'
+
import { ExternalLink } from 'lucide-react'
+
import { CodeBlock } from '@public/components/ui/code-block'
+
+
export function CLITab() {
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="flex items-center gap-2 mb-2">
+
<CardTitle>Wisp CLI Tool</CardTitle>
+
<Badge variant="secondary" className="text-xs">v0.2.0</Badge>
+
<Badge variant="outline" className="text-xs">Alpha</Badge>
+
</div>
+
<CardDescription>
+
Deploy static sites directly from your terminal
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-6">
+
<div className="prose prose-sm max-w-none dark:prose-invert">
+
<p className="text-sm text-muted-foreground">
+
The Wisp CLI is a command-line tool for deploying static websites directly to your AT Protocol account.
+
Authenticate with an app password or OAuth and deploy from CI/CD pipelines.
+
</p>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Features</h3>
+
<ul className="text-sm text-muted-foreground space-y-2 list-disc list-inside">
+
<li><strong>Deploy:</strong> Push static sites directly from your terminal</li>
+
<li><strong>Pull:</strong> Download sites from the PDS for development or backup</li>
+
<li><strong>Serve:</strong> Run a local server with real-time firehose updates</li>
+
</ul>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Download v0.2.0</h3>
+
<div className="grid gap-2">
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-darwin"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">macOS (Apple Silicon)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">SHA-1: a8c27ea41c5e2672bfecb3476ece1c801741d759</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-linux"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Linux (ARM64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">SHA-1: fd7ee689c7600fc953179ea755b0357c8481a622</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Linux (x86_64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">SHA-1: 8bca6992559e19e1d29ab3d2fcc6d09b28e5a485</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-windows.exe"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Windows (x86_64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">SHA-1: 90ea3987a06597fa6c42e1df9009e9758e92dd54</span>
+
</div>
+
</div>
+
</div>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Deploy a Site</h3>
+
<CodeBlock
+
code={`# Download and make executable
+
curl -O https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-darwin
+
chmod +x wisp-cli-aarch64-darwin
+
+
# Deploy your site
+
./wisp-cli-aarch64-darwin deploy your-handle.bsky.social \\
+
--path ./dist \\
+
--site my-site \\
+
--password your-app-password
+
+
# Your site will be available at:
+
# https://sites.wisp.place/your-handle/my-site`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Pull a Site from PDS</h3>
+
<p className="text-xs text-muted-foreground">
+
Download a site from the PDS to your local machine (uses OAuth authentication):
+
</p>
+
<CodeBlock
+
code={`# Pull a site to a specific directory
+
wisp-cli pull your-handle.bsky.social \\
+
--site my-site \\
+
--output ./my-site
+
+
# Pull to current directory
+
wisp-cli pull your-handle.bsky.social \\
+
--site my-site
+
+
# Opens browser for OAuth authentication on first run`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Serve a Site Locally with Real-Time Updates</h3>
+
<p className="text-xs text-muted-foreground">
+
Run a local server that monitors the firehose for real-time updates (uses OAuth authentication):
+
</p>
+
<CodeBlock
+
code={`# Serve on http://localhost:8080 (default)
+
wisp-cli serve your-handle.bsky.social \\
+
--site my-site
+
+
# Serve on a custom port
+
wisp-cli serve your-handle.bsky.social \\
+
--site my-site \\
+
--port 3000
+
+
# Downloads site, serves it, and watches firehose for live updates!`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">CI/CD with Tangled Spindle</h3>
+
<p className="text-xs text-muted-foreground">
+
Deploy automatically on every push using{' '}
+
<a
+
href="https://blog.tangled.org/ci"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:underline"
+
>
+
Tangled Spindle
+
</a>
+
</p>
+
+
<div className="space-y-4">
+
<div>
+
<h4 className="text-xs font-semibold mb-2 flex items-center gap-2">
+
<span>Example 1: Simple Asset Publishing</span>
+
<Badge variant="secondary" className="text-xs">Copy Files</Badge>
+
</h4>
+
<CodeBlock
+
code={`when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
+
dependencies:
+
nixpkgs:
+
- coreutils
+
- curl
+
+
environment:
+
SITE_PATH: '.' # Copy entire repo
+
SITE_NAME: 'myWebbedSite'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: deploy assets to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli deploy \\
+
"$WISP_HANDLE" \\
+
--path "$SITE_PATH" \\
+
--site "$SITE_NAME" \\
+
--password "$WISP_APP_PASSWORD"
+
+
# Output
+
# Deployed site 'myWebbedSite': at://did:plc:ttdrpj45ibqunmfhdsb4zdwq/place.wisp.fs/myWebbedSite
+
# Available at: https://sites.wisp.place/did:plc:ttdrpj45ibqunmfhdsb4zdwq/myWebbedSite
+
`}
+
language="yaml"
+
/>
+
</div>
+
+
<div>
+
<h4 className="text-xs font-semibold mb-2 flex items-center gap-2">
+
<span>Example 2: React/Vite Build & Deploy</span>
+
<Badge variant="secondary" className="text-xs">Full Build</Badge>
+
</h4>
+
<CodeBlock
+
code={`when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
submodules: false
+
+
dependencies:
+
nixpkgs:
+
- nodejs
+
- coreutils
+
- curl
+
github:NixOS/nixpkgs/nixpkgs-unstable:
+
- bun
+
+
environment:
+
SITE_PATH: 'dist'
+
SITE_NAME: 'my-react-site'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: build site
+
command: |
+
# necessary to ensure bun is in PATH
+
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+
bun install --frozen-lockfile
+
+
# build with vite, run directly to get around env issues
+
bun node_modules/.bin/vite build
+
+
- name: deploy to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli deploy \\
+
"$WISP_HANDLE" \\
+
--path "$SITE_PATH" \\
+
--site "$SITE_NAME" \\
+
--password "$WISP_APP_PASSWORD"`}
+
language="yaml"
+
/>
+
</div>
+
</div>
+
+
<div className="p-3 bg-muted/30 rounded-lg border-l-4 border-accent">
+
<p className="text-xs text-muted-foreground">
+
<strong className="text-foreground">Note:</strong> Set <code className="px-1.5 py-0.5 bg-background rounded text-xs">WISP_APP_PASSWORD</code> as a secret in your Tangled Spindle repository settings.
+
Generate an app password from your AT Protocol account settings.
+
</p>
+
</div>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Learn More</h3>
+
<div className="grid gap-2">
+
<a
+
href="https://tangled.org/@nekomimi.pet/wisp.place-monorepo/tree/main/cli"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border"
+
>
+
<span className="text-sm">Source Code</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<a
+
href="https://blog.tangled.org/ci"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border"
+
>
+
<span className="text-sm">Tangled Spindle CI/CD</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
</div>
+
</div>
+
</CardContent>
+
</Card>
+
</div>
+
)
+
}
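The download cards above publish a SHA-1 checksum next to each binary. A minimal verification sketch (not part of the repo; the filename and expected hash below are copied from the Linux x86_64 card, swap in your own platform's values):

```ts
// verify-cli.ts — compare a downloaded wisp-cli binary against its published SHA-1
import { createHash } from 'node:crypto'
import { readFile } from 'node:fs/promises'

// Values taken from the Linux (x86_64) download card above
const expected = '8bca6992559e19e1d29ab3d2fcc6d09b28e5a485'
const binaryPath = './wisp-cli-x86_64-linux'

const actual = createHash('sha1').update(await readFile(binaryPath)).digest('hex')

if (actual !== expected) {
    console.error(`Checksum mismatch: got ${actual}`)
    process.exit(1)
}
console.log('Checksum OK')
```

Run it with `bun verify-cli.ts` after downloading; if the hashes disagree, re-download rather than running the binary.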
+524
public/editor/tabs/DomainsTab.tsx
···
+
import { useState } from 'react'
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Input } from '@public/components/ui/input'
+
import { Label } from '@public/components/ui/label'
+
import { Badge } from '@public/components/ui/badge'
+
import {
+
Dialog,
+
DialogContent,
+
DialogDescription,
+
DialogHeader,
+
DialogTitle,
+
DialogFooter
+
} from '@public/components/ui/dialog'
+
import {
+
CheckCircle2,
+
XCircle,
+
Loader2,
+
Trash2
+
} from 'lucide-react'
+
import type { WispDomain, CustomDomain } from '../hooks/useDomainData'
+
import type { UserInfo } from '../hooks/useUserInfo'
+
+
interface DomainsTabProps {
+
wispDomains: WispDomain[]
+
customDomains: CustomDomain[]
+
domainsLoading: boolean
+
verificationStatus: { [id: string]: 'idle' | 'verifying' | 'success' | 'error' }
+
userInfo: UserInfo | null
+
onAddCustomDomain: (domain: string) => Promise<{ success: boolean; id?: string }>
+
onVerifyDomain: (id: string) => Promise<void>
+
onDeleteCustomDomain: (id: string) => Promise<boolean>
+
onDeleteWispDomain: (domain: string) => Promise<boolean>
+
onClaimWispDomain: (handle: string) => Promise<{ success: boolean; error?: string }>
+
onCheckWispAvailability: (handle: string) => Promise<{ available: boolean | null }>
+
}
+
+
export function DomainsTab({
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
userInfo,
+
onAddCustomDomain,
+
onVerifyDomain,
+
onDeleteCustomDomain,
+
onDeleteWispDomain,
+
onClaimWispDomain,
+
onCheckWispAvailability
+
}: DomainsTabProps) {
+
// Wisp domain claim state
+
const [wispHandle, setWispHandle] = useState('')
+
const [isClaimingWisp, setIsClaimingWisp] = useState(false)
+
const [wispAvailability, setWispAvailability] = useState<{
+
available: boolean | null
+
checking: boolean
+
}>({ available: null, checking: false })
+
+
// Custom domain modal state
+
const [addDomainModalOpen, setAddDomainModalOpen] = useState(false)
+
const [customDomain, setCustomDomain] = useState('')
+
const [isAddingDomain, setIsAddingDomain] = useState(false)
+
const [viewDomainDNS, setViewDomainDNS] = useState<string | null>(null)
+
+
const checkWispAvailability = async (handle: string) => {
+
const trimmedHandle = handle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
setWispAvailability({ available: null, checking: false })
+
return
+
}
+
+
setWispAvailability({ available: null, checking: true })
+
const result = await onCheckWispAvailability(trimmedHandle)
+
setWispAvailability({ available: result.available, checking: false })
+
}
+
+
const handleClaimWispDomain = async () => {
+
const trimmedHandle = wispHandle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
alert('Please enter a handle')
+
return
+
}
+
+
setIsClaimingWisp(true)
+
const result = await onClaimWispDomain(trimmedHandle)
+
if (result.success) {
+
setWispHandle('')
+
setWispAvailability({ available: null, checking: false })
+
}
+
setIsClaimingWisp(false)
+
}
+
+
const handleAddCustomDomain = async () => {
+
if (!customDomain) {
+
alert('Please enter a domain')
+
return
+
}
+
+
setIsAddingDomain(true)
+
const result = await onAddCustomDomain(customDomain)
+
setIsAddingDomain(false)
+
+
if (result.success) {
+
setCustomDomain('')
+
setAddDomainModalOpen(false)
+
// Automatically show DNS configuration for the newly added domain
+
if (result.id) {
+
setViewDomainDNS(result.id)
+
}
+
}
+
}
+
+
return (
+
<>
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<CardTitle>wisp.place Subdomains</CardTitle>
+
<CardDescription>
+
Your free subdomains on the wisp.place network (up to 3)
+
</CardDescription>
+
</CardHeader>
+
<CardContent>
+
{domainsLoading ? (
+
<div className="flex items-center justify-center py-4">
+
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
+
</div>
+
) : (
+
<div className="space-y-4">
+
{wispDomains.length > 0 && (
+
<div className="space-y-2">
+
{wispDomains.map((domain) => (
+
<div
+
key={domain.domain}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-1 flex-1">
+
<div className="flex items-center gap-2">
+
<CheckCircle2 className="w-4 h-4 text-green-500" />
+
<span className="font-mono">
+
{domain.domain}
+
</span>
+
</div>
+
{domain.rkey && (
+
<p className="text-xs text-muted-foreground ml-6">
+
→ Mapped to site: {domain.rkey}
+
</p>
+
)}
+
</div>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() => onDeleteWispDomain(domain.domain)}
+
>
+
<Trash2 className="w-4 h-4" />
+
</Button>
+
</div>
+
))}
+
</div>
+
)}
+
+
{wispDomains.length < 3 && (
+
<div className="p-4 bg-muted/30 rounded-lg">
+
<p className="text-sm text-muted-foreground mb-4">
+
{wispDomains.length === 0
+
? 'Claim your free wisp.place subdomain'
+
: `Claim another wisp.place subdomain (${wispDomains.length}/3)`}
+
</p>
+
<div className="space-y-3">
+
<div className="space-y-2">
+
<Label htmlFor="wisp-handle">Choose your handle</Label>
+
<div className="flex gap-2">
+
<div className="flex-1 relative">
+
<Input
+
id="wisp-handle"
+
placeholder="mysite"
+
value={wispHandle}
+
onChange={(e) => {
+
setWispHandle(e.target.value)
+
if (e.target.value.trim()) {
+
checkWispAvailability(e.target.value)
+
} else {
+
setWispAvailability({ available: null, checking: false })
+
}
+
}}
+
disabled={isClaimingWisp}
+
className="pr-24"
+
/>
+
<span className="absolute right-3 top-1/2 -translate-y-1/2 text-sm text-muted-foreground">
+
.wisp.place
+
</span>
+
</div>
+
</div>
+
{wispAvailability.checking && (
+
<p className="text-xs text-muted-foreground flex items-center gap-1">
+
<Loader2 className="w-3 h-3 animate-spin" />
+
Checking availability...
+
</p>
+
)}
+
{!wispAvailability.checking && wispAvailability.available === true && (
+
<p className="text-xs text-green-600 flex items-center gap-1">
+
<CheckCircle2 className="w-3 h-3" />
+
Available
+
</p>
+
)}
+
{!wispAvailability.checking && wispAvailability.available === false && (
+
<p className="text-xs text-red-600 flex items-center gap-1">
+
<XCircle className="w-3 h-3" />
+
Not available
+
</p>
+
)}
+
</div>
+
<Button
+
onClick={handleClaimWispDomain}
+
disabled={!wispHandle.trim() || isClaimingWisp || wispAvailability.available !== true}
+
className="w-full"
+
>
+
{isClaimingWisp ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Claiming...
+
</>
+
) : (
+
'Claim Subdomain'
+
)}
+
</Button>
+
</div>
+
</div>
+
)}
+
+
{wispDomains.length === 3 && (
+
<div className="p-3 bg-muted/30 rounded-lg text-center">
+
<p className="text-sm text-muted-foreground">
+
You have claimed the maximum of 3 wisp.place subdomains
+
</p>
+
</div>
+
)}
+
</div>
+
)}
+
</CardContent>
+
</Card>
+
+
<Card>
+
<CardHeader>
+
<CardTitle>Custom Domains</CardTitle>
+
<CardDescription>
+
Bring your own domain with DNS verification
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
<Button
+
onClick={() => setAddDomainModalOpen(true)}
+
className="w-full"
+
>
+
Add Custom Domain
+
</Button>
+
+
{domainsLoading ? (
+
<div className="flex items-center justify-center py-4">
+
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
+
</div>
+
) : customDomains.length === 0 ? (
+
<div className="text-center py-4 text-muted-foreground text-sm">
+
No custom domains added yet
+
</div>
+
) : (
+
<div className="space-y-2">
+
{customDomains.map((domain) => (
+
<div
+
key={domain.id}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-1 flex-1">
+
<div className="flex items-center gap-2">
+
{domain.verified ? (
+
<CheckCircle2 className="w-4 h-4 text-green-500" />
+
) : (
+
<XCircle className="w-4 h-4 text-red-500" />
+
)}
+
<span className="font-mono">
+
{domain.domain}
+
</span>
+
</div>
+
{domain.rkey && domain.rkey !== 'self' && (
+
<p className="text-xs text-muted-foreground ml-6">
+
→ Mapped to site: {domain.rkey}
+
</p>
+
)}
+
</div>
+
<div className="flex items-center gap-2">
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() =>
+
setViewDomainDNS(domain.id)
+
}
+
>
+
View DNS
+
</Button>
+
{domain.verified ? (
+
<Badge variant="secondary">
+
Verified
+
</Badge>
+
) : (
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() =>
+
onVerifyDomain(domain.id)
+
}
+
disabled={
+
verificationStatus[
+
domain.id
+
] === 'verifying'
+
}
+
>
+
{verificationStatus[
+
domain.id
+
] === 'verifying' ? (
+
<>
+
<Loader2 className="w-3 h-3 mr-1 animate-spin" />
+
Verifying...
+
</>
+
) : (
+
'Verify DNS'
+
)}
+
</Button>
+
)}
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() =>
+
onDeleteCustomDomain(
+
domain.id
+
)
+
}
+
>
+
<Trash2 className="w-4 h-4" />
+
</Button>
+
</div>
+
</div>
+
))}
+
</div>
+
)}
+
</CardContent>
+
</Card>
+
</div>
+
+
{/* Add Custom Domain Modal */}
+
<Dialog open={addDomainModalOpen} onOpenChange={setAddDomainModalOpen}>
+
<DialogContent className="sm:max-w-lg">
+
<DialogHeader>
+
<DialogTitle>Add Custom Domain</DialogTitle>
+
<DialogDescription>
+
Enter your domain name. After adding, you'll see the DNS
+
records to configure.
+
</DialogDescription>
+
</DialogHeader>
+
<div className="space-y-4 py-4">
+
<div className="space-y-2">
+
<Label htmlFor="new-domain">Domain Name</Label>
+
<Input
+
id="new-domain"
+
placeholder="example.com"
+
value={customDomain}
+
onChange={(e) => setCustomDomain(e.target.value)}
+
/>
+
<p className="text-xs text-muted-foreground">
+
After adding, click "View DNS" to see the records you
+
need to configure.
+
</p>
+
</div>
+
</div>
+
<DialogFooter className="flex-col sm:flex-row gap-2">
+
<Button
+
variant="outline"
+
onClick={() => {
+
setAddDomainModalOpen(false)
+
setCustomDomain('')
+
}}
+
className="w-full sm:w-auto"
+
disabled={isAddingDomain}
+
>
+
Cancel
+
</Button>
+
<Button
+
onClick={handleAddCustomDomain}
+
disabled={!customDomain || isAddingDomain}
+
className="w-full sm:w-auto"
+
>
+
{isAddingDomain ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Adding...
+
</>
+
) : (
+
'Add Domain'
+
)}
+
</Button>
+
</DialogFooter>
+
</DialogContent>
+
</Dialog>
+
+
{/* View DNS Records Modal */}
+
<Dialog
+
open={viewDomainDNS !== null}
+
onOpenChange={(open) => !open && setViewDomainDNS(null)}
+
>
+
<DialogContent className="sm:max-w-lg">
+
<DialogHeader>
+
<DialogTitle>DNS Configuration</DialogTitle>
+
<DialogDescription>
+
Add these DNS records to your domain provider
+
</DialogDescription>
+
</DialogHeader>
+
{viewDomainDNS && userInfo && (
+
<>
+
{(() => {
+
const domain = customDomains.find(
+
(d) => d.id === viewDomainDNS
+
)
+
if (!domain) return null
+
+
return (
+
<div className="space-y-4 py-4">
+
<div className="p-3 bg-muted/30 rounded-lg">
+
<p className="text-sm font-medium mb-1">
+
Domain:
+
</p>
+
<p className="font-mono text-sm">
+
{domain.domain}
+
</p>
+
</div>
+
+
<div className="space-y-3">
+
<div className="p-3 bg-background rounded border border-border">
+
<div className="flex justify-between items-start mb-2">
+
<span className="text-xs font-semibold text-muted-foreground">
+
TXT Record (Verification)
+
</span>
+
</div>
+
<div className="font-mono text-xs space-y-2">
+
<div>
+
<span className="text-muted-foreground">
+
Name:
+
</span>{' '}
+
<span className="select-all">
+
_wisp.{domain.domain}
+
</span>
+
</div>
+
<div>
+
<span className="text-muted-foreground">
+
Value:
+
</span>{' '}
+
<span className="select-all break-all">
+
{userInfo.did}
+
</span>
+
</div>
+
</div>
+
</div>
+
+
<div className="p-3 bg-background rounded border border-border">
+
<div className="flex justify-between items-start mb-2">
+
<span className="text-xs font-semibold text-muted-foreground">
+
CNAME Record (Pointing)
+
</span>
+
</div>
+
<div className="font-mono text-xs space-y-2">
+
<div>
+
<span className="text-muted-foreground">
+
Name:
+
</span>{' '}
+
<span className="select-all">
+
{domain.domain}
+
</span>
+
</div>
+
<div>
+
<span className="text-muted-foreground">
+
Value:
+
</span>{' '}
+
<span className="select-all">
+
{domain.id}.dns.wisp.place
+
</span>
+
</div>
+
</div>
+
<p className="text-xs text-muted-foreground mt-2">
+
Note: Some DNS providers (like Cloudflare) flatten CNAMEs to A records - this is fine and won't affect verification.
+
</p>
+
</div>
+
</div>
+
+
<div className="p-3 bg-muted/30 rounded-lg">
+
<p className="text-xs text-muted-foreground">
+
💡 After configuring DNS, click "Verify DNS"
+
to check if everything is set up correctly.
+
DNS changes can take a few minutes to
+
propagate.
+
</p>
+
</div>
+
</div>
+
)
+
})()}
+
</>
+
)}
+
<DialogFooter>
+
<Button
+
variant="outline"
+
onClick={() => setViewDomainDNS(null)}
+
className="w-full sm:w-auto"
+
>
+
Close
+
</Button>
+
</DialogFooter>
+
</DialogContent>
+
</Dialog>
+
</>
+
)
+
}
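The DNS modal above asks for a `_wisp.{domain}` TXT record containing your DID and a CNAME pointing at `{id}.dns.wisp.place`. A small local pre-check sketch (this is not the service's verifier; the domain, record id, and DID are placeholders to replace with the values shown in the modal):

```ts
// dns-precheck.ts — confirm the records from the DNS Configuration modal have propagated
import { resolveTxt, resolveCname } from 'node:dns/promises'

// Placeholders — substitute the values shown in the modal
const domain = 'example.com'
const recordId = 'your-domain-id'
const did = 'did:plc:your-did'

const txt = (await resolveTxt(`_wisp.${domain}`)).flat()
console.log('TXT record found:', txt.includes(did))

try {
    const cname = await resolveCname(domain)
    console.log('CNAME record found:', cname.includes(`${recordId}.dns.wisp.place`))
} catch {
    // Providers such as Cloudflare flatten CNAMEs to A records; the modal notes this is fine
    console.log('No CNAME returned (possibly flattened); rely on the TXT check')
}
```

Once both checks pass locally, the "Verify DNS" button should succeed on its next attempt, allowing for propagation delay.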
+196
public/editor/tabs/SitesTab.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Badge } from '@public/components/ui/badge'
+
import {
+
Globe,
+
ExternalLink,
+
CheckCircle2,
+
AlertCircle,
+
Loader2,
+
RefreshCw,
+
Settings
+
} from 'lucide-react'
+
import type { SiteWithDomains } from '../hooks/useSiteData'
+
import type { UserInfo } from '../hooks/useUserInfo'
+
+
interface SitesTabProps {
+
sites: SiteWithDomains[]
+
sitesLoading: boolean
+
isSyncing: boolean
+
userInfo: UserInfo | null
+
onSyncSites: () => Promise<void>
+
onConfigureSite: (site: SiteWithDomains) => void
+
}
+
+
export function SitesTab({
+
sites,
+
sitesLoading,
+
isSyncing,
+
userInfo,
+
onSyncSites,
+
onConfigureSite
+
}: SitesTabProps) {
+
const getSiteUrl = (site: SiteWithDomains) => {
+
// Use the first mapped domain if available
+
if (site.domains && site.domains.length > 0) {
+
return `https://${site.domains[0].domain}`
+
}
+
+
// Default fallback URL - use handle instead of DID
+
if (!userInfo) return '#'
+
return `https://sites.wisp.place/${userInfo.handle}/${site.rkey}`
+
}
+
+
const getSiteDomainName = (site: SiteWithDomains) => {
+
// Return the first domain if available
+
if (site.domains && site.domains.length > 0) {
+
return site.domains[0].domain
+
}
+
+
// Use handle instead of DID for display
+
if (!userInfo) return `sites.wisp.place/.../${site.rkey}`
+
return `sites.wisp.place/${userInfo.handle}/${site.rkey}`
+
}
+
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="flex items-center justify-between">
+
<div>
+
<CardTitle>Your Sites</CardTitle>
+
<CardDescription>
+
View and manage all your deployed sites
+
</CardDescription>
+
</div>
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={onSyncSites}
+
disabled={isSyncing || sitesLoading}
+
>
+
<RefreshCw
+
className={`w-4 h-4 mr-2 ${isSyncing ? 'animate-spin' : ''}`}
+
/>
+
Sync from PDS
+
</Button>
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
{sitesLoading ? (
+
<div className="flex items-center justify-center py-8">
+
<Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
+
</div>
+
) : sites.length === 0 ? (
+
<div className="text-center py-8 text-muted-foreground">
+
<p>No sites yet. Upload your first site!</p>
+
</div>
+
) : (
+
sites.map((site) => (
+
<div
+
key={`${site.did}-${site.rkey}`}
+
className="flex items-center justify-between p-4 border border-border rounded-lg hover:bg-muted/50 transition-colors"
+
>
+
<div className="flex-1">
+
<div className="flex items-center gap-3 mb-2">
+
<h3 className="font-semibold text-lg">
+
{site.display_name || site.rkey}
+
</h3>
+
<Badge
+
variant="secondary"
+
className="text-xs"
+
>
+
active
+
</Badge>
+
</div>
+
+
{/* Display all mapped domains */}
+
{site.domains && site.domains.length > 0 ? (
+
<div className="space-y-1">
+
{site.domains.map((domainInfo, idx) => (
+
<div key={`${domainInfo.domain}-${idx}`} className="flex items-center gap-2">
+
<a
+
href={`https://${domainInfo.domain}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-sm text-accent hover:text-accent/80 flex items-center gap-1"
+
>
+
<Globe className="w-3 h-3" />
+
{domainInfo.domain}
+
<ExternalLink className="w-3 h-3" />
+
</a>
+
<Badge
+
variant={domainInfo.type === 'wisp' ? 'default' : 'outline'}
+
className="text-xs"
+
>
+
{domainInfo.type}
+
</Badge>
+
{domainInfo.type === 'custom' && (
+
<Badge
+
variant={domainInfo.verified ? 'default' : 'secondary'}
+
className="text-xs"
+
>
+
{domainInfo.verified ? (
+
<>
+
<CheckCircle2 className="w-3 h-3 mr-1" />
+
verified
+
</>
+
) : (
+
<>
+
<AlertCircle className="w-3 h-3 mr-1" />
+
pending
+
</>
+
)}
+
</Badge>
+
)}
+
</div>
+
))}
+
</div>
+
) : (
+
<a
+
href={getSiteUrl(site)}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-sm text-muted-foreground hover:text-accent flex items-center gap-1"
+
>
+
{getSiteDomainName(site)}
+
<ExternalLink className="w-3 h-3" />
+
</a>
+
)}
+
</div>
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() => onConfigureSite(site)}
+
>
+
<Settings className="w-4 h-4 mr-2" />
+
Configure
+
</Button>
+
</div>
+
))
+
)}
+
</CardContent>
+
</Card>
+
+
<div className="p-4 bg-muted/30 rounded-lg border-l-4 border-yellow-500/50">
+
<div className="flex items-start gap-2">
+
<AlertCircle className="w-4 h-4 text-yellow-600 dark:text-yellow-400 mt-0.5 shrink-0" />
+
<div className="flex-1 space-y-1">
+
<p className="text-xs font-semibold text-yellow-600 dark:text-yellow-400">
+
Note about sites.wisp.place URLs
+
</p>
+
<p className="text-xs text-muted-foreground">
+
Complex sites hosted on <code className="px-1 py-0.5 bg-background rounded text-xs">sites.wisp.place</code> may have broken assets if they use absolute paths (e.g., <code className="px-1 py-0.5 bg-background rounded text-xs">/folder/script.js</code>) in CSS or JavaScript files. While HTML paths are automatically rewritten, CSS and JS files are served as-is. For best results, use a wisp.place subdomain or custom domain, or ensure your site uses relative paths.
+
</p>
+
</div>
+
</div>
+
</div>
+
</div>
+
)
+
}
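The warning above applies only to absolute references inside CSS and JavaScript, since HTML paths are rewritten by the host. A rough pre-upload check (a heuristic sketch, not part of wisp.place; it only scans built CSS/JS under `./dist` for `url(/...)` patterns):

```ts
// check-abs-paths.ts — flag absolute url(...) references before relying on a sites.wisp.place URL
import { readdir, readFile } from 'node:fs/promises'
import { join } from 'node:path'

async function* walk(dir: string): AsyncGenerator<string> {
    for (const entry of await readdir(dir, { withFileTypes: true })) {
        const path = join(dir, entry.name)
        if (entry.isDirectory()) yield* walk(path)
        else yield path
    }
}

for await (const file of walk('./dist')) {
    if (!/\.(css|js)$/.test(file)) continue
    const text = await readFile(file, 'utf8')
    // url(/foo/bar.png) style references are the ones the note above warns about
    const hits = text.match(/url\(\s*['"]?\/[^)'"]+/g) ?? []
    if (hits.length > 0) {
        console.warn(`${file}: ${hits.length} absolute url(...) reference(s)`)
    }
}
```

It errs toward false positives; anything it flags is worth switching to a relative path or serving from a wisp.place subdomain or custom domain.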
+323
public/editor/tabs/UploadTab.tsx
···
+
import { useState, useEffect } from 'react'
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Input } from '@public/components/ui/input'
+
import { Label } from '@public/components/ui/label'
+
import { RadioGroup, RadioGroupItem } from '@public/components/ui/radio-group'
+
import { Badge } from '@public/components/ui/badge'
+
import {
+
Globe,
+
Upload,
+
AlertCircle,
+
Loader2
+
} from 'lucide-react'
+
import type { SiteWithDomains } from '../hooks/useSiteData'
+
+
interface UploadTabProps {
+
sites: SiteWithDomains[]
+
sitesLoading: boolean
+
onUploadComplete: () => Promise<void>
+
}
+
+
export function UploadTab({
+
sites,
+
sitesLoading,
+
onUploadComplete
+
}: UploadTabProps) {
+
// Upload state
+
const [siteMode, setSiteMode] = useState<'existing' | 'new'>('existing')
+
const [selectedSiteRkey, setSelectedSiteRkey] = useState<string>('')
+
const [newSiteName, setNewSiteName] = useState('')
+
const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
+
const [isUploading, setIsUploading] = useState(false)
+
const [uploadProgress, setUploadProgress] = useState('')
+
const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
+
const [uploadedCount, setUploadedCount] = useState(0)
+
+
// Auto-switch to 'new' mode if no sites exist
+
useEffect(() => {
+
if (!sitesLoading && sites.length === 0 && siteMode === 'existing') {
+
setSiteMode('new')
+
}
+
}, [sites, sitesLoading, siteMode])
+
+
const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
+
if (e.target.files && e.target.files.length > 0) {
+
setSelectedFiles(e.target.files)
+
}
+
}
+
+
const handleUpload = async () => {
+
const siteName = siteMode === 'existing' ? selectedSiteRkey : newSiteName
+
+
if (!siteName) {
+
alert(siteMode === 'existing' ? 'Please select a site' : 'Please enter a site name')
+
return
+
}
+
+
setIsUploading(true)
+
setUploadProgress('Preparing files...')
+
+
try {
+
const formData = new FormData()
+
formData.append('siteName', siteName)
+
+
if (selectedFiles) {
+
for (let i = 0; i < selectedFiles.length; i++) {
+
formData.append('files', selectedFiles[i])
+
}
+
}
+
+
setUploadProgress('Uploading to AT Protocol...')
+
const response = await fetch('/wisp/upload-files', {
+
method: 'POST',
+
body: formData
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
setUploadProgress('Upload complete!')
+
setSkippedFiles(data.skippedFiles || [])
+
setUploadedCount(data.uploadedCount || data.fileCount || 0)
+
setSelectedSiteRkey('')
+
setNewSiteName('')
+
setSelectedFiles(null)
+
+
// Refresh sites list
+
await onUploadComplete()
+
+
// Reset form - give more time if there are skipped files
+
const resetDelay = data.skippedFiles && data.skippedFiles.length > 0 ? 4000 : 1500
+
setTimeout(() => {
+
setUploadProgress('')
+
setSkippedFiles([])
+
setUploadedCount(0)
+
setIsUploading(false)
+
}, resetDelay)
+
} else {
+
throw new Error(data.error || 'Upload failed')
+
}
+
} catch (err) {
+
console.error('Upload error:', err)
+
alert(
+
`Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
setIsUploading(false)
+
setUploadProgress('')
+
}
+
}
+
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<CardTitle>Upload Site</CardTitle>
+
<CardDescription>
+
Deploy a new site from a folder or Git repository
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-6">
+
<div className="space-y-4">
+
<div className="p-4 bg-muted/50 rounded-lg">
+
<RadioGroup
+
value={siteMode}
+
onValueChange={(value) => setSiteMode(value as 'existing' | 'new')}
+
disabled={isUploading}
+
>
+
<div className="flex items-center space-x-2">
+
<RadioGroupItem value="existing" id="existing" />
+
<Label htmlFor="existing" className="cursor-pointer">
+
Update existing site
+
</Label>
+
</div>
+
<div className="flex items-center space-x-2">
+
<RadioGroupItem value="new" id="new" />
+
<Label htmlFor="new" className="cursor-pointer">
+
Create new site
+
</Label>
+
</div>
+
</RadioGroup>
+
</div>
+
+
{siteMode === 'existing' ? (
+
<div className="space-y-2">
+
<Label htmlFor="site-select">Select Site</Label>
+
{sitesLoading ? (
+
<div className="flex items-center justify-center py-4">
+
<Loader2 className="w-5 h-5 animate-spin text-muted-foreground" />
+
</div>
+
) : sites.length === 0 ? (
+
<div className="p-4 border border-dashed rounded-lg text-center text-sm text-muted-foreground">
+
No sites available. Create a new site instead.
+
</div>
+
) : (
+
<select
+
id="site-select"
+
className="flex h-10 w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background file:border-0 file:bg-transparent file:text-sm file:font-medium placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
+
value={selectedSiteRkey}
+
onChange={(e) => setSelectedSiteRkey(e.target.value)}
+
disabled={isUploading}
+
>
+
<option value="">Select a site...</option>
+
{sites.map((site) => (
+
<option key={site.rkey} value={site.rkey}>
+
{site.display_name || site.rkey}
+
</option>
+
))}
+
</select>
+
)}
+
</div>
+
) : (
+
<div className="space-y-2">
+
<Label htmlFor="new-site-name">New Site Name</Label>
+
<Input
+
id="new-site-name"
+
placeholder="my-awesome-site"
+
value={newSiteName}
+
onChange={(e) => setNewSiteName(e.target.value)}
+
disabled={isUploading}
+
/>
+
</div>
+
)}
+
+
<p className="text-xs text-muted-foreground">
+
File limits: 100MB per file, 300MB total
+
</p>
+
</div>
+
+
<div className="grid md:grid-cols-2 gap-4">
+
<Card className="border-2 border-dashed hover:border-accent transition-colors cursor-pointer">
+
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
+
<Upload className="w-12 h-12 text-muted-foreground mb-4" />
+
<h3 className="font-semibold mb-2">
+
Upload Folder
+
</h3>
+
<p className="text-sm text-muted-foreground mb-4">
+
Drag and drop or click to upload your
+
static site files
+
</p>
+
<input
+
type="file"
+
id="file-upload"
+
multiple
+
onChange={handleFileSelect}
+
className="hidden"
+
{...(({ webkitdirectory: '', directory: '' } as any))}
+
disabled={isUploading}
+
/>
+
<label htmlFor="file-upload">
+
<Button
+
variant="outline"
+
type="button"
+
onClick={() =>
+
document
+
.getElementById('file-upload')
+
?.click()
+
}
+
disabled={isUploading}
+
>
+
Choose Folder
+
</Button>
+
</label>
+
{selectedFiles && selectedFiles.length > 0 && (
+
<p className="text-sm text-muted-foreground mt-3">
+
{selectedFiles.length} files selected
+
</p>
+
)}
+
</CardContent>
+
</Card>
+
+
<Card className="border-2 border-dashed opacity-50">
+
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
+
<Globe className="w-12 h-12 text-muted-foreground mb-4" />
+
<h3 className="font-semibold mb-2">
+
Connect Git Repository
+
</h3>
+
<p className="text-sm text-muted-foreground mb-4">
+
Link your GitHub, GitLab, or any Git
+
repository
+
</p>
+
<Badge variant="secondary">Coming soon!</Badge>
+
</CardContent>
+
</Card>
+
</div>
+
+
{uploadProgress && (
+
<div className="space-y-3">
+
<div className="p-4 bg-muted rounded-lg">
+
<div className="flex items-center gap-2">
+
<Loader2 className="w-4 h-4 animate-spin" />
+
<span className="text-sm">{uploadProgress}</span>
+
</div>
+
</div>
+
+
{skippedFiles.length > 0 && (
+
<div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
+
<div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
+
<AlertCircle className="w-4 h-4 mt-0.5 shrink-0" />
+
<div className="flex-1">
+
<span className="font-medium">
+
{skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
+
</span>
+
{uploadedCount > 0 && (
+
<span className="text-sm ml-2">
+
({uploadedCount} uploaded successfully)
+
</span>
+
)}
+
</div>
+
</div>
+
<div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
+
{skippedFiles.slice(0, 5).map((file, idx) => (
+
<div key={idx} className="text-xs">
+
<span className="font-mono">{file.name}</span>
+
<span className="text-muted-foreground"> - {file.reason}</span>
+
</div>
+
))}
+
{skippedFiles.length > 5 && (
+
<div className="text-xs text-muted-foreground">
+
...and {skippedFiles.length - 5} more
+
</div>
+
)}
+
</div>
+
</div>
+
)}
+
</div>
+
)}
+
+
<Button
+
onClick={handleUpload}
+
className="w-full"
+
disabled={
+
(siteMode === 'existing' ? !selectedSiteRkey : !newSiteName) ||
+
isUploading ||
+
(siteMode === 'existing' && (!selectedFiles || selectedFiles.length === 0))
+
}
+
>
+
{isUploading ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Uploading...
+
</>
+
) : (
+
<>
+
{siteMode === 'existing' ? (
+
'Update Site'
+
) : (
+
selectedFiles && selectedFiles.length > 0
+
? 'Upload & Deploy'
+
: 'Create Empty Site'
+
)}
+
</>
+
)}
+
</Button>
+
</CardContent>
+
</Card>
+
</div>
+
)
+
}
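The upload handler above posts multipart form data with a `siteName` field and repeated `files` entries to `/wisp/upload-files`. A programmatic sketch of the same request (assuming an already-authenticated browser session; the endpoint, field names, and response fields are taken from the component, the file contents are illustrative):

```ts
// upload-sketch.ts — mirror what the Upload tab sends to /wisp/upload-files
const form = new FormData()
form.append('siteName', 'my-awesome-site')
// One append per file; the tab loops over the selected FileList the same way
form.append(
    'files',
    new Blob(['<h1>hello wisp</h1>'], { type: 'text/html' }),
    'index.html'
)

const res = await fetch('/wisp/upload-files', {
    method: 'POST',
    body: form,
    credentials: 'include' // relies on the signed `did` session cookie set at login
})

const data = await res.json()
if (data.success) {
    console.log(`Uploaded ${data.uploadedCount ?? data.fileCount ?? 0} file(s)`)
    if (data.skippedFiles?.length) console.warn('Skipped:', data.skippedFiles)
} else {
    console.error('Upload failed:', data.error)
}
```

The `skippedFiles` and `uploadedCount` fields are the same ones the progress banner in the tab displays.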
public/favicon-16x16.png

This is a binary file and will not be displayed.

public/favicon-32x32.png

This is a binary file and will not be displayed.

public/favicon.ico

This is a binary file and will not be displayed.

-14
public/favicon.svg
···
-
<!--?xml version="1.0" encoding="utf-8"?-->
-
<svg width="64" height="64" viewBox="0 0 64 64" xmlns="http://www.w3.org/2000/svg" role="img" aria-label="Centered large wisp on black background">
-
<!-- black background -->
-
<rect width="64" height="64" fill="#000000"></rect>
-
-
<!-- outer faint glow -->
-
<circle cx="32" cy="32" r="14" fill="none" stroke="#7CF5D8" stroke-opacity="0.35" stroke-width="2.6"></circle>
-
-
<!-- bright halo -->
-
<circle cx="32" cy="32" r="10" fill="none" stroke="#CFF8EE" stroke-opacity="0.95" stroke-width="2.4"></circle>
-
-
<!-- bright core -->
-
<circle cx="32" cy="32" r="4" fill="#FFFFFF"></circle>
-
</svg>
+23 -1
public/index.html
···
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Elysia Static</title>
+
<title>wisp.place</title>
+
<meta name="description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution. Built on Bluesky's decentralized network." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/" />
+
<meta property="og:title" content="wisp.place - Decentralized Static Site Hosting" />
+
<meta property="og:description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary_large_image" />
+
<meta name="twitter:url" content="https://wisp.place/" />
+
<meta name="twitter:title" content="wisp.place - Decentralized Static Site Hosting" />
+
<meta name="twitter:description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
<link rel="icon" type="image/x-icon" href="./favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="./favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="./favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="./apple-touch-icon.png">
+
<link rel="manifest" href="./site.webmanifest">
</head>
<body>
<div id="elysia"></div>
+428 -16
public/index.tsx
···
-
import { useState, useRef, useEffect } from 'react'
+
import React, { useState, useRef, useEffect } from 'react'
import { createRoot } from 'react-dom/client'
import {
ArrowRight,
···
Code,
Server
} from 'lucide-react'
-
import Layout from '@public/layouts'
import { Button } from '@public/components/ui/button'
import { Card } from '@public/components/ui/card'
+
import { BlueskyPostList, BlueskyProfile, BlueskyPost, AtProtoProvider, useLatestRecord, type AtProtoStyles, type FeedPostRecord } from 'atproto-ui'
+
+
// Credit to https://tangled.org/@jakelazaroff.com/actor-typeahead
+
interface Actor {
+
handle: string
+
avatar?: string
+
displayName?: string
+
}
+
+
interface ActorTypeaheadProps {
+
children: React.ReactElement<React.InputHTMLAttributes<HTMLInputElement>>
+
host?: string
+
rows?: number
+
onSelect?: (handle: string) => void
+
autoSubmit?: boolean
+
}
+
+
const ActorTypeahead: React.FC<ActorTypeaheadProps> = ({
+
children,
+
host = 'https://public.api.bsky.app',
+
rows = 5,
+
onSelect,
+
autoSubmit = false
+
}) => {
+
const [actors, setActors] = useState<Actor[]>([])
+
const [index, setIndex] = useState(-1)
+
const [pressed, setPressed] = useState(false)
+
const [isOpen, setIsOpen] = useState(false)
+
const containerRef = useRef<HTMLDivElement>(null)
+
const inputRef = useRef<HTMLInputElement>(null)
+
const lastQueryRef = useRef<string>('')
+
const previousValueRef = useRef<string>('')
+
const preserveIndexRef = useRef(false)
+
+
const handleInput = async (e: React.FormEvent<HTMLInputElement>) => {
+
const query = e.currentTarget.value
+
+
// Check if the value actually changed (filter out arrow key events)
+
if (query === previousValueRef.current) {
+
return
+
}
+
previousValueRef.current = query
+
+
if (!query) {
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
lastQueryRef.current = ''
+
return
+
}
+
+
// Store the query for this request
+
const currentQuery = query
+
lastQueryRef.current = currentQuery
+
+
try {
+
const url = new URL('xrpc/app.bsky.actor.searchActorsTypeahead', host)
+
url.searchParams.set('q', query)
+
url.searchParams.set('limit', `${rows}`)
+
+
const res = await fetch(url)
+
const json = await res.json()
+
+
// Only update if this is still the latest query
+
if (lastQueryRef.current === currentQuery) {
+
setActors(json.actors || [])
+
// Only reset index if we're not preserving it
+
if (!preserveIndexRef.current) {
+
setIndex(-1)
+
}
+
preserveIndexRef.current = false
+
setIsOpen(true)
+
}
+
} catch (error) {
+
console.error('Failed to fetch actors:', error)
+
if (lastQueryRef.current === currentQuery) {
+
setActors([])
+
setIsOpen(false)
+
}
+
}
+
}
+
+
const handleKeyDown = (e: React.KeyboardEvent<HTMLInputElement>) => {
+
const navigationKeys = ['ArrowDown', 'ArrowUp', 'PageDown', 'PageUp', 'Enter', 'Escape']
+
+
// Mark that we should preserve the index for navigation keys
+
if (navigationKeys.includes(e.key)) {
+
preserveIndexRef.current = true
+
}
+
+
if (!isOpen || actors.length === 0) return
+
+
switch (e.key) {
+
case 'ArrowDown':
+
e.preventDefault()
+
setIndex((prev) => {
+
const newIndex = prev < 0 ? 0 : Math.min(prev + 1, actors.length - 1)
+
return newIndex
+
})
+
break
+
case 'PageDown':
+
e.preventDefault()
+
setIndex(actors.length - 1)
+
break
+
case 'ArrowUp':
+
e.preventDefault()
+
setIndex((prev) => {
+
const newIndex = prev < 0 ? 0 : Math.max(prev - 1, 0)
+
return newIndex
+
})
+
break
+
case 'PageUp':
+
e.preventDefault()
+
setIndex(0)
+
break
+
case 'Escape':
+
e.preventDefault()
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
break
+
case 'Enter':
+
if (index >= 0 && index < actors.length) {
+
e.preventDefault()
+
selectActor(actors[index].handle)
+
}
+
break
+
}
+
}
+
+
const selectActor = (handle: string) => {
+
if (inputRef.current) {
+
inputRef.current.value = handle
+
}
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
onSelect?.(handle)
+
+
// Auto-submit the form if enabled
+
if (autoSubmit && inputRef.current) {
+
const form = inputRef.current.closest('form')
+
if (form) {
+
// Use setTimeout to ensure the value is set before submission
+
setTimeout(() => {
+
form.requestSubmit()
+
}, 0)
+
}
+
}
+
}
+
+
const handleFocusOut = (e: React.FocusEvent) => {
+
if (pressed) return
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
}
+
+
// Clone the input element and add our event handlers
+
const input = React.cloneElement(children, {
+
ref: (el: HTMLInputElement) => {
+
inputRef.current = el
+
// Preserve the original ref if it exists
+
const originalRef = (children as any).ref
+
if (typeof originalRef === 'function') {
+
originalRef(el)
+
} else if (originalRef) {
+
originalRef.current = el
+
}
+
},
+
onInput: (e: React.FormEvent<HTMLInputElement>) => {
+
handleInput(e)
+
children.props.onInput?.(e)
+
},
+
onKeyDown: (e: React.KeyboardEvent<HTMLInputElement>) => {
+
handleKeyDown(e)
+
children.props.onKeyDown?.(e)
+
},
+
onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+
handleFocusOut(e)
+
children.props.onBlur?.(e)
+
},
+
autoComplete: 'off'
+
} as any)
+
+
return (
+
<div ref={containerRef} style={{ position: 'relative', display: 'block' }}>
+
{input}
+
{isOpen && actors.length > 0 && (
+
<ul
+
style={{
+
display: 'flex',
+
flexDirection: 'column',
+
position: 'absolute',
+
left: 0,
+
marginTop: '4px',
+
width: '100%',
+
listStyle: 'none',
+
overflow: 'hidden',
+
backgroundColor: 'rgba(255, 255, 255, 0.8)',
+
backgroundClip: 'padding-box',
+
backdropFilter: 'blur(12px)',
+
WebkitBackdropFilter: 'blur(12px)',
+
border: '1px solid rgba(0, 0, 0, 0.1)',
+
borderRadius: '8px',
+
boxShadow: '0 6px 6px -4px rgba(0, 0, 0, 0.2)',
+
padding: '4px',
+
margin: 0,
+
zIndex: 1000
+
}}
+
onMouseDown={() => setPressed(true)}
+
onMouseUp={() => {
+
setPressed(false)
+
inputRef.current?.focus()
+
}}
+
>
+
{actors.map((actor, i) => (
+
<li key={actor.handle}>
+
<button
+
type="button"
+
onClick={() => selectActor(actor.handle)}
+
style={{
+
all: 'unset',
+
boxSizing: 'border-box',
+
display: 'flex',
+
alignItems: 'center',
+
gap: '8px',
+
padding: '6px 8px',
+
width: '100%',
+
height: 'calc(1.5rem + 12px)',
+
borderRadius: '4px',
+
cursor: 'pointer',
+
backgroundColor: i === index ? 'hsl(var(--accent) / 0.5)' : 'transparent',
+
transition: 'background-color 0.1s'
+
}}
+
onMouseEnter={() => setIndex(i)}
+
>
+
<div
+
style={{
+
width: '1.5rem',
+
height: '1.5rem',
+
borderRadius: '50%',
+
backgroundColor: 'hsl(var(--muted))',
+
overflow: 'hidden',
+
flexShrink: 0
+
}}
+
>
+
{actor.avatar && (
+
<img
+
src={actor.avatar}
+
alt=""
+
style={{
+
display: 'block',
+
width: '100%',
+
height: '100%',
+
objectFit: 'cover'
+
}}
+
/>
+
)}
+
</div>
+
<span
+
style={{
+
whiteSpace: 'nowrap',
+
overflow: 'hidden',
+
textOverflow: 'ellipsis',
+
color: '#000000'
+
}}
+
>
+
{actor.handle}
+
</span>
+
</button>
+
</li>
+
))}
+
</ul>
+
)}
+
</div>
+
)
+
}
+
+
const LatestPostWithPrefetch: React.FC<{ did: string }> = ({ did }) => {
+
const { record, rkey, loading } = useLatestRecord<FeedPostRecord>(
+
did,
+
'app.bsky.feed.post'
+
)
+
+
if (loading) return <span>Loading…</span>
+
if (!record || !rkey) return <span>No posts yet.</span>
+
+
return <BlueskyPost did={did} rkey={rkey} record={record} showParent={true} />
+
}
function App() {
const [showForm, setShowForm] = useState(false)
+
const [checkingAuth, setCheckingAuth] = useState(true)
const inputRef = useRef<HTMLInputElement>(null)
useEffect(() => {
+
// Check authentication status on mount
+
const checkAuth = async () => {
+
try {
+
const response = await fetch('/api/auth/status', {
+
credentials: 'include'
+
})
+
const data = await response.json()
+
if (data.authenticated) {
+
// User is already authenticated, redirect to editor
+
window.location.href = '/editor'
+
return
+
}
+
// If not authenticated, clear any stale cookies
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
+
} catch (error) {
+
console.error('Auth check failed:', error)
+
// Clear cookies on error as well
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
+
} finally {
+
setCheckingAuth(false)
+
}
+
}
+
+
checkAuth()
+
}, [])
+
+
useEffect(() => {
if (showForm) {
setTimeout(() => inputRef.current?.focus(), 500)
}
}, [showForm])
+
if (checkingAuth) {
+
return (
+
<div className="min-h-screen bg-background flex items-center justify-center">
+
<div className="w-8 h-8 border-2 border-primary border-t-transparent rounded-full animate-spin"></div>
+
</div>
+
)
+
}
+
return (
<>
<div className="min-h-screen">
···
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
<div className="flex items-center gap-2">
-
<div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
-
<Globe className="w-5 h-5 text-primary-foreground" />
-
</div>
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
<span className="text-xl font-semibold text-foreground">
wisp.place
</span>
···
<Button
size="sm"
className="bg-accent text-accent-foreground hover:bg-accent/90"
+
onClick={() => setShowForm(true)}
>
Get Started
</Button>
···
<div className="max-w-4xl mx-auto text-center">
<div className="inline-flex items-center gap-2 px-4 py-2 rounded-full bg-accent/10 border border-accent/20 mb-8">
<span className="w-2 h-2 bg-accent rounded-full animate-pulse"></span>
-
<span className="text-sm text-accent-foreground">
+
<span className="text-sm text-foreground">
Built on AT Protocol
</span>
</div>
···
'Login failed:',
error
)
+
// Clear any invalid cookies
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
alert('Authentication failed')
}
}}
className="space-y-3"
>
-
<input
-
ref={inputRef}
-
type="text"
-
name="handle"
-
placeholder="Enter your handle (e.g., alice.bsky.social)"
-
className="w-full py-4 px-4 text-lg bg-input border border-border rounded-lg focus:outline-none focus:ring-2 focus:ring-accent"
-
/>
+
<ActorTypeahead
+
autoSubmit={true}
+
onSelect={(handle) => {
+
if (inputRef.current) {
+
inputRef.current.value = handle
+
}
+
}}
+
>
+
<input
+
ref={inputRef}
+
type="text"
+
name="handle"
+
placeholder="Enter your handle (e.g., alice.bsky.social)"
+
className="w-full py-4 px-4 text-lg bg-input border border-border rounded-lg focus:outline-none focus:ring-2 focus:ring-accent"
+
/>
+
</ActorTypeahead>
<button
type="submit"
className="w-full bg-accent hover:bg-accent/90 text-accent-foreground font-semibold py-4 px-6 text-lg rounded-lg inline-flex items-center justify-center transition-colors"
···
{/* CTA Section */}
<section className="container mx-auto px-4 py-20">
+
<div className="max-w-6xl mx-auto">
+
<div className="text-center mb-12">
+
<h2 className="text-3xl md:text-4xl font-bold">
+
Follow on Bluesky for updates
+
</h2>
+
</div>
+
<div className="grid md:grid-cols-2 gap-8 items-center">
+
<Card
+
className="shadow-lg border-2 border-border overflow-hidden !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
'--atproto-color-link': 'hsl(var(--accent))',
+
'--atproto-color-link-hover': 'hsl(var(--accent))',
+
'--atproto-color-border': 'transparent',
+
} as AtProtoStyles}
+
>
+
<BlueskyPostList did="wisp.place" />
+
</Card>
+
<div className="space-y-6 w-full max-w-md mx-auto">
+
<Card
+
className="shadow-lg border-2 overflow-hidden relative !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
} as AtProtoStyles}
+
>
+
<BlueskyProfile did="wisp.place" />
+
</Card>
+
<Card
+
className="shadow-lg border-2 overflow-hidden relative !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
} as AtProtoStyles}
+
>
+
<LatestPostWithPrefetch did="wisp.place" />
+
</Card>
+
</div>
+
</div>
+
</div>
+
</section>
+
+
{/* Ready to Deploy CTA */}
+
<section className="container mx-auto px-4 py-20">
<div className="max-w-3xl mx-auto text-center bg-accent/5 border border-accent/20 rounded-2xl p-12">
<h2 className="text-3xl md:text-4xl font-bold mb-4">
Ready to deploy?
···
>
@nekomimi.pet
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
</p>
</div>
</div>
···
const root = createRoot(document.getElementById('elysia')!)
root.render(
-
<Layout className="gap-6">
-
<App />
-
</Layout>
+
<AtProtoProvider>
+
<Layout className="gap-6">
+
<App />
+
</Layout>
+
</AtProtoProvider>
)
+24
public/layouts/index.tsx
···
import type { PropsWithChildren } from 'react'
+
import { useEffect } from 'react'
import { QueryClientProvider, QueryClient } from '@tanstack/react-query'
import clsx from 'clsx'
···
}
export default function Layout({ children, className }: LayoutProps) {
+
useEffect(() => {
+
// Function to update dark mode based on system preference
+
const updateDarkMode = (e: MediaQueryList | MediaQueryListEvent) => {
+
if (e.matches) {
+
document.documentElement.classList.add('dark')
+
} else {
+
document.documentElement.classList.remove('dark')
+
}
+
}
+
+
// Create media query
+
const darkModeQuery = window.matchMedia('(prefers-color-scheme: dark)')
+
+
// Set initial value
+
updateDarkMode(darkModeQuery)
+
+
// Listen for changes
+
darkModeQuery.addEventListener('change', updateDarkMode)
+
+
// Cleanup
+
return () => darkModeQuery.removeEventListener('change', updateDarkMode)
+
}, [])
+
return (
<QueryClientProvider client={client}>
<div
+18 -1
public/onboarding/index.html
···
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Get Started - wisp.place</title>
+
<title>wisp.place</title>
+
<meta name="description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/onboarding" />
+
<meta property="og:title" content="Get Started - wisp.place" />
+
<meta property="og:description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary" />
+
<meta name="twitter:url" content="https://wisp.place/onboarding" />
+
<meta name="twitter:title" content="Get Started - wisp.place" />
+
<meta name="twitter:description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
</head>
<body>
<div id="elysia"></div>
+21
public/robots.txt
···
+
# robots.txt for wisp.place
+
+
User-agent: *
+
+
# Allow indexing of landing page
+
Allow: /$
+
+
# Disallow application pages
+
Disallow: /editor
+
Disallow: /admin
+
Disallow: /onboarding
+
+
# Disallow API routes
+
Disallow: /api/
+
Disallow: /wisp/
+
+
# Allow static assets
+
Allow: /favicon.ico
+
Allow: /favicon-*.png
+
Allow: /apple-touch-icon.png
+
Allow: /site.webmanifest
+1
public/site.webmanifest
···
+
{"name":"","short_name":"","icons":[{"src":"/android-chrome-192x192.png","sizes":"192x192","type":"image/png"},{"src":"/android-chrome-512x512.png","sizes":"512x512","type":"image/png"}],"theme_color":"#ffffff","background_color":"#ffffff","display":"standalone"}
+62 -36
public/styles/global.css
···
@import "tailwindcss";
@import "tw-animate-css";
-
@custom-variant dark (&:is(.dark *));
+
@custom-variant dark (@media (prefers-color-scheme: dark));
:root {
+
color-scheme: light;
+
/* Warm beige background inspired by Sunset design #E9DDD8 */
--background: oklch(0.90 0.012 35);
/* Very dark brown text for strong contrast #2A2420 */
···
}
.dark {
-
/* #413C58 - violet background for dark mode */
-
--background: oklch(0.28 0.04 285);
-
/* #F2E7C9 - parchment text */
-
--foreground: oklch(0.93 0.03 85);
+
color-scheme: dark;
-
--card: oklch(0.32 0.04 285);
-
--card-foreground: oklch(0.93 0.03 85);
+
/* Slate violet background - #2C2C2C with violet tint */
+
--background: oklch(0.23 0.015 285);
+
/* Light gray text - #E4E4E4 */
+
--foreground: oklch(0.90 0.005 285);
-
--popover: oklch(0.32 0.04 285);
-
--popover-foreground: oklch(0.93 0.03 85);
+
/* Slightly lighter slate for cards */
+
--card: oklch(0.28 0.015 285);
+
--card-foreground: oklch(0.90 0.005 285);
-
/* #FFAAD2 - pink primary in dark mode */
-
--primary: oklch(0.78 0.15 345);
-
--primary-foreground: oklch(0.32 0.04 285);
+
--popover: oklch(0.28 0.015 285);
+
--popover-foreground: oklch(0.90 0.005 285);
-
--accent: oklch(0.78 0.15 345);
-
--accent-foreground: oklch(0.32 0.04 285);
+
/* Lavender buttons - #B39CD0 */
+
--primary: oklch(0.70 0.10 295);
+
--primary-foreground: oklch(0.23 0.015 285);
-
--secondary: oklch(0.56 0.08 220);
-
--secondary-foreground: oklch(0.93 0.03 85);
+
/* Soft pink accent - #FFC1CC */
+
--accent: oklch(0.85 0.08 5);
+
--accent-foreground: oklch(0.23 0.015 285);
-
--muted: oklch(0.38 0.03 285);
-
--muted-foreground: oklch(0.75 0.02 85);
+
/* Light cyan secondary - #A8DADC */
+
--secondary: oklch(0.82 0.05 200);
+
--secondary-foreground: oklch(0.23 0.015 285);
-
--border: oklch(0.42 0.03 285);
-
--input: oklch(0.42 0.03 285);
-
--ring: oklch(0.78 0.15 345);
+
/* Muted slate areas */
+
--muted: oklch(0.33 0.015 285);
+
--muted-foreground: oklch(0.72 0.01 285);
-
--destructive: oklch(0.577 0.245 27.325);
-
--destructive-foreground: oklch(0.985 0 0);
+
/* Subtle borders */
+
--border: oklch(0.38 0.02 285);
+
--input: oklch(0.30 0.015 285);
+
--ring: oklch(0.70 0.10 295);
-
--chart-1: oklch(0.78 0.15 345);
-
--chart-2: oklch(0.93 0.03 85);
-
--chart-3: oklch(0.56 0.08 220);
-
--chart-4: oklch(0.85 0.02 130);
-
--chart-5: oklch(0.32 0.04 285);
-
--sidebar: oklch(0.205 0 0);
-
--sidebar-foreground: oklch(0.985 0 0);
-
--sidebar-primary: oklch(0.488 0.243 264.376);
-
--sidebar-primary-foreground: oklch(0.985 0 0);
-
--sidebar-accent: oklch(0.269 0 0);
-
--sidebar-accent-foreground: oklch(0.985 0 0);
-
--sidebar-border: oklch(0.269 0 0);
-
--sidebar-ring: oklch(0.439 0 0);
+
/* Warm destructive color */
+
--destructive: oklch(0.60 0.22 27);
+
--destructive-foreground: oklch(0.98 0.01 85);
+
+
/* Chart colors using the accent palette */
+
--chart-1: oklch(0.85 0.08 5);
+
--chart-2: oklch(0.82 0.05 200);
+
--chart-3: oklch(0.70 0.10 295);
+
--chart-4: oklch(0.75 0.08 340);
+
--chart-5: oklch(0.65 0.08 180);
+
+
/* Sidebar slate */
+
--sidebar: oklch(0.20 0.015 285);
+
--sidebar-foreground: oklch(0.90 0.005 285);
+
--sidebar-primary: oklch(0.70 0.10 295);
+
--sidebar-primary-foreground: oklch(0.20 0.015 285);
+
--sidebar-accent: oklch(0.28 0.015 285);
+
--sidebar-accent-foreground: oklch(0.90 0.005 285);
+
--sidebar-border: oklch(0.32 0.02 285);
+
--sidebar-ring: oklch(0.70 0.10 295);
}
@theme inline {
···
.arrow-animate {
animation: arrow-bounce 1.5s ease-in-out infinite;
}
+
+
/* Shiki syntax highlighting styles */
+
.shiki-wrapper {
+
border-radius: 0.5rem;
+
padding: 1rem;
+
overflow-x: auto;
+
border: 1px solid hsl(var(--border));
+
}
+
+
.shiki-wrapper pre {
+
margin: 0 !important;
+
padding: 0 !important;
+
}
public/transparent-full-size-ico.png

This is a binary file and will not be displayed.

+2 -2
scripts/change-admin-password.ts
···
// Change admin password
-
import { adminAuth } from './src/lib/admin-auth'
-
import { db } from './src/lib/db'
+
import { adminAuth } from '../src/lib/admin-auth'
+
import { db } from '../src/lib/db'
import { randomBytes, createHash } from 'crypto'
// Get username and new password from command line
+35 -14
src/index.ts
···
import { Elysia } from 'elysia'
+
import type { Context } from 'elysia'
import { cors } from '@elysiajs/cors'
-
import { openapi, fromTypes } from '@elysiajs/openapi'
import { staticPlugin } from '@elysiajs/static'
import type { Config } from './lib/types'
···
cleanupExpiredSessions,
rotateKeysIfNeeded
} from './lib/oauth-client'
+
import { getCookieSecret } from './lib/db'
import { authRoutes } from './routes/auth'
import { wispRoutes } from './routes/wisp'
import { domainRoutes } from './routes/domain'
···
// Initialize admin setup (prompt if no admin exists)
await promptAdminSetup()
+
// Get or generate cookie signing secret
+
const cookieSecret = await getCookieSecret()
+
const client = await getOAuthClient(config)
// Periodic maintenance: cleanup expired sessions and rotate keys
···
dnsVerifier.start()
logger.info('DNS Verifier Started - checking custom domains every 10 minutes')
-
export const app = new Elysia()
-
.use(openapi({
-
references: fromTypes()
-
}))
+
export const app = new Elysia({
+
serve: {
+
maxRequestBodySize: 1024 * 1024 * 128 * 3,
+
development: Bun.env.NODE_ENV !== 'production' ? true : false,
+
id: Bun.env.NODE_ENV !== 'production' ? undefined : null,
+
},
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
// Observability middleware
.onBeforeHandle(observabilityMiddleware('main-app').beforeHandle)
-
.onAfterHandle((ctx) => {
+
.onAfterHandle((ctx: Context) => {
observabilityMiddleware('main-app').afterHandle(ctx)
// Security headers middleware
const { set } = ctx
···
})
.onError(observabilityMiddleware('main-app').onError)
.use(csrfProtection())
-
.use(authRoutes(client))
-
.use(wispRoutes(client))
-
.use(domainRoutes(client))
-
.use(userRoutes(client))
-
.use(siteRoutes(client))
-
.use(adminRoutes())
+
.use(authRoutes(client, cookieSecret))
+
.use(wispRoutes(client, cookieSecret))
+
.use(domainRoutes(client, cookieSecret))
+
.use(userRoutes(client, cookieSecret))
+
.use(siteRoutes(client, cookieSecret))
+
.use(adminRoutes(cookieSecret))
.use(
await staticPlugin({
prefix: '/'
})
)
-
.get('/client-metadata.json', (c) => {
+
.get('/client-metadata.json', () => {
return createClientMetadata(config)
})
-
.get('/jwks.json', async (c) => {
+
.get('/jwks.json', async ({ set }) => {
+
// Prevent caching to ensure clients always get fresh keys after rotation
+
set.headers['Cache-Control'] = 'no-store, no-cache, must-revalidate, max-age=0'
+
set.headers['Pragma'] = 'no-cache'
+
set.headers['Expires'] = '0'
+
const keys = await getCurrentKeys()
if (!keys.length) return { keys: [] }
···
error: error instanceof Error ? error.message : String(error)
}
}
+
})
+
.get('/.well-known/atproto-did', ({ set }) => {
+
// Return plain text DID for AT Protocol domain verification
+
set.headers['Content-Type'] = 'text/plain'
+
return 'did:plc:7puq73yz2hkvbcpdhnsze2qw'
})
.use(cors({
origin: config.domain,
+182 -15
src/lib/db.ts
···
)
`;
-
// Domains table maps subdomain -> DID
+
// Cookie secrets table for signed cookies
+
await db`
+
CREATE TABLE IF NOT EXISTS cookie_secrets (
+
id TEXT PRIMARY KEY DEFAULT 'default',
+
secret TEXT NOT NULL,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
// Domains table maps subdomain -> DID (now supports up to 3 domains per user)
await db`
CREATE TABLE IF NOT EXISTS domains (
domain TEXT PRIMARY KEY,
-
did TEXT UNIQUE NOT NULL,
+
did TEXT NOT NULL,
rkey TEXT,
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
)
···
// Column might already exist, ignore
}
+
// Remove the unique constraint on domains.did to allow multiple domains per user
+
try {
+
await db`ALTER TABLE domains DROP CONSTRAINT IF EXISTS domains_did_key`;
+
} catch (err) {
+
// Constraint might already be removed, ignore
+
}
+
// Custom domains table for BYOD (bring your own domain)
await db`
CREATE TABLE IF NOT EXISTS custom_domains (
···
)
`;
+
// Create indexes for common query patterns
+
await Promise.all([
+
// oauth_states cleanup queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_states_expires_at ON oauth_states(expires_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_states_expires_at:', err);
+
}
+
}),
+
+
// oauth_sessions cleanup queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_sessions_expires_at ON oauth_sessions(expires_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_sessions_expires_at:', err);
+
}
+
}),
+
+
// oauth_keys key rotation queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_keys_created_at ON oauth_keys(created_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_keys_created_at:', err);
+
}
+
}),
+
+
// domains queries by (did, rkey)
+
db`CREATE INDEX IF NOT EXISTS idx_domains_did_rkey ON domains(did, rkey)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_domains_did_rkey:', err);
+
}
+
}),
+
+
// custom_domains queries by did
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_did ON custom_domains(did)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_did:', err);
+
}
+
}),
+
+
// custom_domains queries by (did, rkey)
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_did_rkey ON custom_domains(did, rkey)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_did_rkey:', err);
+
}
+
}),
+
+
// custom_domains DNS verification worker queries
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_verified ON custom_domains(verified)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_verified:', err);
+
}
+
}),
+
+
// sites queries by did
+
db`CREATE INDEX IF NOT EXISTS idx_sites_did ON sites(did)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_sites_did:', err);
+
}
+
})
+
]);
+
const RESERVED_HANDLES = new Set([
"www",
"api",
···
export const toDomain = (handle: string): string => `${handle.toLowerCase()}.${BASE_HOST}`;
export const getDomainByDid = async (did: string): Promise<string | null> => {
-
const rows = await db`SELECT domain FROM domains WHERE did = ${did}`;
+
const rows = await db`SELECT domain FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
return rows[0]?.domain ?? null;
};
export const getWispDomainInfo = async (did: string) => {
-
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did}`;
+
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
return rows[0] ?? null;
+
};
+
+
export const getAllWispDomains = async (did: string) => {
+
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC`;
+
return rows;
+
};
+
+
export const countWispDomains = async (did: string): Promise<number> => {
+
const rows = await db`SELECT COUNT(*) as count FROM domains WHERE did = ${did}`;
+
return Number(rows[0]?.count ?? 0);
};
export const getDidByDomain = async (domain: string): Promise<string | null> => {
···
export const claimDomain = async (did: string, handle: string): Promise<string> => {
const h = handle.trim().toLowerCase();
if (!isValidHandle(h)) throw new Error('invalid_handle');
+
+
// Check if user already has 3 domains
+
const existingCount = await countWispDomains(did);
+
if (existingCount >= 3) {
+
throw new Error('domain_limit_reached');
+
}
+
const domain = toDomain(h);
try {
await db`
···
VALUES (${domain}, ${did})
`;
} catch (err) {
-
// Unique constraint violations -> already taken or DID already claimed
+
// Unique constraint violations -> already taken
throw new Error('conflict');
}
return domain;
···
}
};
-
export const updateWispDomainSite = async (did: string, siteRkey: string | null): Promise<void> => {
+
export const updateWispDomainSite = async (domain: string, siteRkey: string | null): Promise<void> => {
await db`
UPDATE domains
SET rkey = ${siteRkey}
-
WHERE did = ${did}
+
WHERE domain = ${domain}
`;
};
export const getWispDomainSite = async (did: string): Promise<string | null> => {
-
const rows = await db`SELECT rkey FROM domains WHERE did = ${did}`;
+
const rows = await db`SELECT rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
return rows[0]?.rkey ?? null;
};
+
export const deleteWispDomain = async (domain: string): Promise<void> => {
+
await db`DELETE FROM domains WHERE domain = ${domain}`;
+
};
+
// Session timeout configuration (30 days in seconds)
const SESSION_TIMEOUT = 30 * 24 * 60 * 60; // 2592000 seconds
// OAuth state timeout (1 hour in seconds)
···
const stateStore = {
async set(key: string, data: any) {
-
console.debug('[stateStore] set', key)
const expiresAt = Math.floor(Date.now() / 1000) + STATE_TIMEOUT;
await db`
INSERT INTO oauth_states (key, data, created_at, expires_at)
···
`;
},
async get(key: string) {
-
console.debug('[stateStore] get', key)
const now = Math.floor(Date.now() / 1000);
const result = await db`
SELECT data, expires_at
···
// Check if expired
const expiresAt = Number(result[0].expires_at);
if (expiresAt && now > expiresAt) {
-
console.debug('[stateStore] State expired, deleting', key);
await db`DELETE FROM oauth_states WHERE key = ${key}`;
return undefined;
}
···
return JSON.parse(result[0].data);
},
async del(key: string) {
-
console.debug('[stateStore] del', key)
await db`DELETE FROM oauth_states WHERE key = ${key}`;
}
};
const sessionStore = {
async set(sub: string, data: any) {
-
console.debug('[sessionStore] set', sub)
const expiresAt = Math.floor(Date.now() / 1000) + SESSION_TIMEOUT;
await db`
INSERT INTO oauth_sessions (sub, data, updated_at, expires_at)
···
`;
},
async get(sub: string) {
-
console.debug('[sessionStore] get', sub)
const now = Math.floor(Date.now() / 1000);
const result = await db`
SELECT data, expires_at
···
return JSON.parse(result[0].data);
},
async del(sub: string) {
-
console.debug('[sessionStore] del', sub)
await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
}
};
···
return { success: false, error: err };
}
};
+
+
// Get all domains (wisp + custom) mapped to a specific site
+
export const getDomainsBySite = async (did: string, rkey: string) => {
+
const domains: Array<{
+
type: 'wisp' | 'custom';
+
domain: string;
+
verified?: boolean;
+
id?: string;
+
}> = [];
+
+
// Check wisp domain
+
const wispDomain = await db`
+
SELECT domain, rkey FROM domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
if (wispDomain.length > 0) {
+
domains.push({
+
type: 'wisp',
+
domain: wispDomain[0].domain,
+
});
+
}
+
+
// Check custom domains
+
const customDomains = await db`
+
SELECT id, domain, verified FROM custom_domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
ORDER BY created_at DESC
+
`;
+
for (const cd of customDomains) {
+
domains.push({
+
type: 'custom',
+
domain: cd.domain,
+
verified: cd.verified,
+
id: cd.id,
+
});
+
}
+
+
return domains;
+
};
+
+
// Get count of domains mapped to a specific site
+
export const getDomainCountBySite = async (did: string, rkey: string) => {
+
const wispCount = await db`
+
SELECT COUNT(*) as count FROM domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
+
const customCount = await db`
+
SELECT COUNT(*) as count FROM custom_domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
+
return {
+
wisp: Number(wispCount[0]?.count || 0),
+
custom: Number(customCount[0]?.count || 0),
+
total: Number(wispCount[0]?.count || 0) + Number(customCount[0]?.count || 0),
+
};
+
};
+
+
// Cookie secret management - ensure we have a secret for signing cookies
+
export const getCookieSecret = async (): Promise<string> => {
+
// Check if secret already exists
+
const rows = await db`SELECT secret FROM cookie_secrets WHERE id = 'default' LIMIT 1`;
+
+
if (rows.length > 0) {
+
return rows[0].secret as string;
+
}
+
+
// Generate new secret if none exists
+
const secret = crypto.randomUUID() + crypto.randomUUID(); // 72 character random string
+
await db`
+
INSERT INTO cookie_secrets (id, secret, created_at)
+
VALUES ('default', ${secret}, EXTRACT(EPOCH FROM NOW()))
+
`;
+
+
console.log('[CookieSecret] Generated new cookie signing secret');
+
return secret;
+
};
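
The new `cookie_secrets` table gives every process the same HMAC key, so signed cookies keep verifying across restarts and multiple instances. A minimal sketch of the wiring (this mirrors what `src/index.ts` does in the diff above; the import path and variable names are illustrative, not a prescribed bootstrap):

```ts
import { Elysia } from 'elysia'
import { getCookieSecret } from './lib/db'

// Load (or lazily create) the shared signing secret once at startup
const cookieSecret = await getCookieSecret()

// Every route factory receives the same secret, so the signed `did`
// cookie set during OAuth callback verifies everywhere.
const app = new Elysia({
    cookie: {
        secrets: cookieSecret,
        sign: ['did']
    }
})
```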
+19 -3
src/lib/dns-verify.ts
···
}
/**
-
* Verify both TXT and CNAME records for a custom domain
+
* Verify custom domain using TXT record as authoritative proof
+
* CNAME check is optional/advisory - TXT record is sufficient for verification
+
*
+
* This approach works with CNAME flattening (e.g., Cloudflare) where the CNAME
+
* is resolved to A/AAAA records and won't be visible in DNS queries.
*/
export const verifyCustomDomain = async (
domain: string,
expectedDid: string,
expectedHash: string
): Promise<VerificationResult> => {
+
// TXT record is authoritative - it proves ownership
const txtResult = await verifyDomainOwnership(domain, expectedDid)
if (!txtResult.verified) {
return txtResult
}
+
// CNAME check is advisory only - we still check it for logging/debugging
+
// but don't fail verification if it's missing (could be flattened)
const cnameResult = await verifyCNAME(domain, expectedHash)
+
+
// Log CNAME status for debugging, but don't fail on it
if (!cnameResult.verified) {
-
return cnameResult
+
console.log(`[DNS Verify] ⚠️ CNAME verification failed (may be flattened):`, cnameResult.error)
}
-
return { verified: true }
+
// TXT verification is sufficient
+
return {
+
verified: true,
+
found: {
+
txt: txtResult.found?.txt,
+
cname: cnameResult.found?.cname
+
}
+
}
}
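
For reference, the TXT-first approach comes down to a single resolver lookup; below is a rough sketch using Node's DNS promises API. This is not the repo's `verifyDomainOwnership` - the record name (`_wisp.<domain>`) and value format are assumptions for illustration only:

```ts
import { resolveTxt } from 'node:dns/promises'

// Returns true if any TXT record on the verification name contains the DID.
// Works even when the CNAME has been flattened, since only TXT is consulted.
async function hasOwnershipTxt(domain: string, expectedDid: string): Promise<boolean> {
    try {
        const records = await resolveTxt(`_wisp.${domain}`)
        return records.some(chunks => chunks.join('').includes(expectedDid))
    } catch {
        // NXDOMAIN or no TXT records -> not verified
        return false
    }
}
```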
-1
src/lib/oauth-client.ts
···
`;
},
async get(sub: string) {
-
console.debug('[sessionStore] get', sub)
const now = Math.floor(Date.now() / 1000);
const result = await db`
SELECT data, expires_at
+10 -6
src/lib/observability.ts
···
service
)
-
logCollector.error(
-
`Request failed: ${request.method} ${url.pathname}`,
-
service,
-
error,
-
{ statusCode: set.status || 500 }
-
)
+
// Don't log 404 errors
+
const statusCode = set.status || 500
+
if (statusCode !== 404) {
+
logCollector.error(
+
`Request failed: ${request.method} ${url.pathname}`,
+
service,
+
error,
+
{ statusCode }
+
)
+
}
}
}
}
+360
src/lib/wisp-utils.test.ts
···
processUploadedFiles,
createManifest,
updateFileBlobs,
+
computeCID,
+
extractBlobMap,
type UploadedFile,
type FileUploadResult,
} from './wisp-utils'
···
}
})
})
+
+
describe('computeCID', () => {
+
test('should compute CID for gzipped+base64 encoded content', () => {
+
// This simulates the actual flow: gzip -> base64 -> compute CID
+
const originalContent = Buffer.from('Hello, World!')
+
const gzipped = compressFile(originalContent)
+
const base64Content = Buffer.from(gzipped.toString('base64'), 'binary')
+
+
const cid = computeCID(base64Content)
+
+
// CID should be a valid CIDv1 string starting with 'bafkrei'
+
expect(cid).toMatch(/^bafkrei[a-z0-9]+$/)
+
expect(cid.length).toBeGreaterThan(10)
+
})
+
+
test('should compute deterministic CIDs for identical content', () => {
+
const content = Buffer.from('Test content for CID calculation')
+
const gzipped = compressFile(content)
+
const base64Content = Buffer.from(gzipped.toString('base64'), 'binary')
+
+
const cid1 = computeCID(base64Content)
+
const cid2 = computeCID(base64Content)
+
+
expect(cid1).toBe(cid2)
+
})
+
+
test('should compute different CIDs for different content', () => {
+
const content1 = Buffer.from('Content A')
+
const content2 = Buffer.from('Content B')
+
+
const gzipped1 = compressFile(content1)
+
const gzipped2 = compressFile(content2)
+
+
const base64Content1 = Buffer.from(gzipped1.toString('base64'), 'binary')
+
const base64Content2 = Buffer.from(gzipped2.toString('base64'), 'binary')
+
+
const cid1 = computeCID(base64Content1)
+
const cid2 = computeCID(base64Content2)
+
+
expect(cid1).not.toBe(cid2)
+
})
+
+
test('should handle empty content', () => {
+
const emptyContent = Buffer.from('')
+
const gzipped = compressFile(emptyContent)
+
const base64Content = Buffer.from(gzipped.toString('base64'), 'binary')
+
+
const cid = computeCID(base64Content)
+
+
expect(cid).toMatch(/^bafkrei[a-z0-9]+$/)
+
})
+
+
test('should compute same CID as PDS for base64-encoded content', () => {
+
// Test that binary encoding produces correct bytes for CID calculation
+
const testContent = Buffer.from('<!DOCTYPE html><html><body>Hello</body></html>')
+
const gzipped = compressFile(testContent)
+
const base64Content = Buffer.from(gzipped.toString('base64'), 'binary')
+
+
// Compute CID twice to ensure consistency
+
const cid1 = computeCID(base64Content)
+
const cid2 = computeCID(base64Content)
+
+
expect(cid1).toBe(cid2)
+
expect(cid1).toMatch(/^bafkrei/)
+
})
+
+
test('should use binary encoding for base64 strings', () => {
+
// This test verifies we're using the correct encoding method
+
// For base64 strings, 'binary' encoding ensures each character becomes exactly one byte
+
const content = Buffer.from('Test content')
+
const gzipped = compressFile(content)
+
const base64String = gzipped.toString('base64')
+
+
// Using binary encoding (what we use in production)
+
const base64Content = Buffer.from(base64String, 'binary')
+
+
// Verify the length matches the base64 string length
+
expect(base64Content.length).toBe(base64String.length)
+
+
// Verify CID is computed correctly
+
const cid = computeCID(base64Content)
+
expect(cid).toMatch(/^bafkrei/)
+
})
+
})
+
+
describe('extractBlobMap', () => {
+
test('should extract blob map from flat directory structure', () => {
+
const mockCid = CID.parse(TEST_CID_STRING)
+
const mockBlob = new BlobRef(mockCid, 'text/html', 100)
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'index.html',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob,
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(1)
+
expect(blobMap.has('index.html')).toBe(true)
+
+
const entry = blobMap.get('index.html')
+
expect(entry?.cid).toBe(TEST_CID_STRING)
+
expect(entry?.blobRef).toBe(mockBlob)
+
})
+
+
test('should extract blob map from nested directory structure', () => {
+
const mockCid1 = CID.parse(TEST_CID_STRING)
+
const mockCid2 = CID.parse('bafkreiabaduc3573q6snt2xgxzpglwuaojkzflocncrh2vj5j3jykdpqhi')
+
+
const mockBlob1 = new BlobRef(mockCid1, 'text/html', 100)
+
const mockBlob2 = new BlobRef(mockCid2, 'text/css', 50)
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'index.html',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob1,
+
},
+
},
+
{
+
name: 'assets',
+
node: {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'styles.css',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob2,
+
},
+
},
+
],
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(2)
+
expect(blobMap.has('index.html')).toBe(true)
+
expect(blobMap.has('assets/styles.css')).toBe(true)
+
+
expect(blobMap.get('index.html')?.cid).toBe(TEST_CID_STRING)
+
expect(blobMap.get('assets/styles.css')?.cid).toBe('bafkreiabaduc3573q6snt2xgxzpglwuaojkzflocncrh2vj5j3jykdpqhi')
+
})
+
+
test('should handle deeply nested directory structures', () => {
+
const mockCid = CID.parse(TEST_CID_STRING)
+
const mockBlob = new BlobRef(mockCid, 'text/javascript', 200)
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'src',
+
node: {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'lib',
+
node: {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'utils.js',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob,
+
},
+
},
+
],
+
},
+
},
+
],
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(1)
+
expect(blobMap.has('src/lib/utils.js')).toBe(true)
+
expect(blobMap.get('src/lib/utils.js')?.cid).toBe(TEST_CID_STRING)
+
})
+
+
test('should handle empty directory', () => {
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(0)
+
})
+
+
test('should correctly extract CID from BlobRef instances (not plain objects)', () => {
+
// This test verifies the fix: AT Protocol SDK returns BlobRef instances,
+
// not plain objects with $type and $link properties
+
const mockCid = CID.parse(TEST_CID_STRING)
+
const mockBlob = new BlobRef(mockCid, 'application/octet-stream', 500)
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'test.bin',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob,
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
// The fix: we call .toString() on the CID instance instead of accessing $link
+
expect(blobMap.get('test.bin')?.cid).toBe(TEST_CID_STRING)
+
expect(blobMap.get('test.bin')?.blobRef.ref.toString()).toBe(TEST_CID_STRING)
+
})
+
+
test('should handle multiple files in same directory', () => {
+
const mockCid1 = CID.parse(TEST_CID_STRING)
+
const mockCid2 = CID.parse('bafkreiabaduc3573q6snt2xgxzpglwuaojkzflocncrh2vj5j3jykdpqhi')
+
const mockCid3 = CID.parse('bafkreieb3ixgchss44kw7xiavnkns47emdfsqbhcdfluo3p6n3o53fl3vq')
+
+
const mockBlob1 = new BlobRef(mockCid1, 'image/png', 1000)
+
const mockBlob2 = new BlobRef(mockCid2, 'image/png', 2000)
+
const mockBlob3 = new BlobRef(mockCid3, 'image/png', 3000)
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'images',
+
node: {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'logo.png',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob1,
+
},
+
},
+
{
+
name: 'banner.png',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob2,
+
},
+
},
+
{
+
name: 'icon.png',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: mockBlob3,
+
},
+
},
+
],
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(3)
+
expect(blobMap.has('images/logo.png')).toBe(true)
+
expect(blobMap.has('images/banner.png')).toBe(true)
+
expect(blobMap.has('images/icon.png')).toBe(true)
+
})
+
+
test('should handle mixed directory and file structure', () => {
+
const mockCid1 = CID.parse(TEST_CID_STRING)
+
const mockCid2 = CID.parse('bafkreiabaduc3573q6snt2xgxzpglwuaojkzflocncrh2vj5j3jykdpqhi')
+
const mockCid3 = CID.parse('bafkreieb3ixgchss44kw7xiavnkns47emdfsqbhcdfluo3p6n3o53fl3vq')
+
+
const directory: Directory = {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'index.html',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: new BlobRef(mockCid1, 'text/html', 100),
+
},
+
},
+
{
+
name: 'assets',
+
node: {
+
$type: 'place.wisp.fs#directory',
+
type: 'directory',
+
entries: [
+
{
+
name: 'styles.css',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: new BlobRef(mockCid2, 'text/css', 50),
+
},
+
},
+
],
+
},
+
},
+
{
+
name: 'README.md',
+
node: {
+
$type: 'place.wisp.fs#file',
+
type: 'file',
+
blob: new BlobRef(mockCid3, 'text/markdown', 200),
+
},
+
},
+
],
+
}
+
+
const blobMap = extractBlobMap(directory)
+
+
expect(blobMap.size).toBe(3)
+
expect(blobMap.has('index.html')).toBe(true)
+
expect(blobMap.has('assets/styles.css')).toBe(true)
+
expect(blobMap.has('README.md')).toBe(true)
+
})
+
})
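
The `'binary'` (latin1) encoding these tests exercise matters because it maps each base64 character to exactly one byte, so the buffer that gets hashed is byte-for-byte what the PDS receives. A tiny illustration of that property (plain Node `Buffer` behaviour, not code from the repo):

```ts
// base64 output is pure ASCII, so 'binary' yields one byte per character
// and the buffer length equals the string length.
const b64 = Buffer.from('Hello, World!').toString('base64') // "SGVsbG8sIFdvcmxkIQ=="
const bytes = Buffer.from(b64, 'binary')
console.log(bytes.length === b64.length) // true
```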
+63 -2
src/lib/wisp-utils.ts
···
import type { Record, Directory, File, Entry } from "../lexicons/types/place/wisp/fs";
import { validateRecord } from "../lexicons/types/place/wisp/fs";
import { gzipSync } from 'zlib';
+
import { CID } from 'multiformats/cid';
+
import { sha256 } from 'multiformats/hashes/sha2';
+
import * as raw from 'multiformats/codecs/raw';
+
import { createHash } from 'crypto';
+
import * as mf from 'multiformats';
export interface UploadedFile {
name: string;
···
}
/**
-
* Compress a file using gzip
+
* Compress a file using gzip with deterministic output
*/
export function compressFile(content: Buffer): Buffer {
-
return gzipSync(content, { level: 9 });
+
return gzipSync(content, {
+
level: 9
+
});
}
/**
···
const directoryMap = new Map<string, UploadedFile[]>();
for (const file of files) {
+
// Skip undefined/null files (defensive)
+
if (!file || !file.name) {
+
console.error('Skipping undefined or invalid file in processUploadedFiles');
+
continue;
+
}
+
// Remove any base folder name from the path
const normalizedPath = file.name.replace(/^[^\/]*\//, '');
const parts = normalizedPath.split('/');
···
return result;
}
+
+
/**
+
* Compute CID (Content Identifier) for blob content
+
* Uses the same algorithm as AT Protocol: CIDv1 with raw codec and SHA-256
+
* Based on @atproto/common/src/ipld.ts sha256RawToCid implementation
+
*/
+
export function computeCID(content: Buffer): string {
+
// Use node crypto to compute sha256 hash (same as AT Protocol)
+
const hash = createHash('sha256').update(content).digest();
+
// Create digest object from hash bytes
+
const digest = mf.digest.create(sha256.code, hash);
+
// Create CIDv1 with raw codec
+
const cid = CID.createV1(raw.code, digest);
+
return cid.toString();
+
}
+
+
/**
+
* Extract blob information from a directory tree
+
* Returns a map of file paths to their blob refs and CIDs
+
*/
+
export function extractBlobMap(
+
directory: Directory,
+
currentPath: string = ''
+
): Map<string, { blobRef: BlobRef; cid: string }> {
+
const blobMap = new Map<string, { blobRef: BlobRef; cid: string }>();
+
+
for (const entry of directory.entries) {
+
const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
+
if ('type' in entry.node && entry.node.type === 'file') {
+
const fileNode = entry.node as File;
+
// AT Protocol SDK returns BlobRef class instances, not plain objects
+
// The ref is a CID instance that can be converted to string
+
if (fileNode.blob && fileNode.blob.ref) {
+
const cidString = fileNode.blob.ref.toString();
+
blobMap.set(fullPath, {
+
blobRef: fileNode.blob,
+
cid: cidString
+
});
+
}
+
} else if ('type' in entry.node && entry.node.type === 'directory') {
+
const subMap = extractBlobMap(entry.node as Directory, fullPath);
+
subMap.forEach((value, key) => blobMap.set(key, value));
+
}
+
}
+
+
return blobMap;
+
}
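
Together, these helpers enable the incremental upload path in `src/routes/wisp.ts` further down: hash the exact bytes that would be uploaded and skip the upload when the CID matches a blob already referenced by the existing manifest. A condensed sketch - `existingManifest` stands in for a previously fetched `place.wisp.fs` record value and is declared only for illustration:

```ts
import type { Directory } from '../lexicons/types/place/wisp/fs'
import { compressFile, computeCID, extractBlobMap } from './wisp-utils'

// Assume this came from com.atproto.repo.getRecord (see the route below)
declare const existingManifest: { root: Directory }

// Map of path -> { blobRef, cid } from the record already on the PDS
const existingBlobs = extractBlobMap(existingManifest.root)

// Reproduce the upload encoding: gzip, then base64, then one byte per char
const gzipped = compressFile(Buffer.from('<h1>hi</h1>'))
const base64Bytes = Buffer.from(gzipped.toString('base64'), 'binary')
const cid = computeCID(base64Bytes)

const prior = existingBlobs.get('index.html')
const needsUpload = !prior || prior.cid !== cid // unchanged files reuse the old BlobRef
```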
+106 -9
src/routes/admin.ts
···
import { logCollector, errorTracker, metricsCollector } from '../lib/observability'
import { db } from '../lib/db'
-
export const adminRoutes = () =>
+
export const adminRoutes = (cookieSecret: string) =>
new Elysia({ prefix: '/api/admin' })
// Login
.post(
···
body: t.Object({
username: t.String(),
password: t.String()
+
}),
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
})
}
)
···
}
cookie.admin_session.remove()
return { success: true }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Check auth status
···
authenticated: true,
username: session.username
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get logs (protected)
···
// Get logs from hosting service
let hostingLogs: any[] = []
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
const params = new URLSearchParams()
if (query.level) params.append('level', query.level as string)
if (query.service) params.append('service', query.service as string)
···
if (query.eventType) params.append('eventType', query.eventType as string)
params.append('limit', String(filter.limit || 100))
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/logs?${params}`)
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/logs?${params}`)
if (response.ok) {
const data = await response.json()
hostingLogs = data.logs
···
)
return { logs: allLogs.slice(0, filter.limit || 100) }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get errors (protected)
···
// Get errors from hosting service
let hostingErrors: any[] = []
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
const params = new URLSearchParams()
if (query.service) params.append('service', query.service as string)
params.append('limit', String(filter.limit || 100))
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/errors?${params}`)
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/errors?${params}`)
if (response.ok) {
const data = await response.json()
hostingErrors = data.errors
···
)
return { errors: allErrors.slice(0, filter.limit || 100) }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get metrics (protected)
···
}
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/metrics?timeWindow=${timeWindow}`)
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/metrics?timeWindow=${timeWindow}`)
if (response.ok) {
const data = await response.json()
hostingServiceStats = data.stats
···
hostingService: hostingServiceStats,
timeWindow
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get database stats (protected)
···
// Get recent sites (including those without domains)
const recentSites = await db`
-
SELECT
+
SELECT
s.did,
s.rkey,
s.display_name,
···
message: error instanceof Error ? error.message : String(error)
}
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get cache stats (protected)
+
.get('/cache', async ({ cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
try {
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/cache`)
+
+
if (response.ok) {
+
const data = await response.json()
+
return data
+
} else {
+
set.status = 503
+
return {
+
error: 'Failed to fetch cache stats from hosting service',
+
message: 'Hosting service unavailable'
+
}
+
}
+
} catch (error) {
+
set.status = 500
+
return {
+
error: 'Failed to fetch cache stats',
+
message: error instanceof Error ? error.message : String(error)
+
}
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get sites listing (protected)
···
try {
const sites = await db`
-
SELECT
+
SELECT
s.did,
s.rkey,
s.display_name,
···
message: error instanceof Error ? error.message : String(error)
}
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
// Get system health (protected)
···
},
timestamp: new Date().toISOString()
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
})
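
Every admin route above now repeats the same signed-cookie schema. One way to keep that in a single place - a sketch only, not code from the repo - is to build the `t.Cookie` validator once per secret and reuse it:

```ts
import { t } from 'elysia'

// Shared cookie schema for all /api/admin routes
const adminCookie = (cookieSecret: string) =>
    t.Cookie(
        { admin_session: t.Optional(t.String()) },
        { secrets: cookieSecret, sign: ['admin_session'] }
    )

// usage inside adminRoutes(cookieSecret):
//   .get('/status', handler, { cookie: adminCookie(cookieSecret) })
```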
+20 -6
src/routes/auth.ts
···
-
import { Elysia } from 'elysia'
+
import { Elysia, t } from 'elysia'
import { NodeOAuthClient } from '@atproto/oauth-client-node'
-
import { getSitesByDid, getDomainByDid } from '../lib/db'
+
import { getSitesByDid, getDomainByDid, getCookieSecret } from '../lib/db'
import { syncSitesFromPDS } from '../lib/sync-sites'
import { authenticateRequest } from '../lib/wisp-auth'
import { logger } from '../lib/observability'
-
export const authRoutes = (client: NodeOAuthClient) => new Elysia()
+
export const authRoutes = (client: NodeOAuthClient, cookieSecret: string) => new Elysia({
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
.post('/api/auth/signin', async (c) => {
let handle = 'unknown'
try {
···
if (!session) {
logger.error('[Auth] OAuth callback failed: no session returned')
+
c.cookie.did.remove()
return c.redirect('/?error=auth_failed')
}
const cookieSession = c.cookie
-
cookieSession.did.value = session.did
+
cookieSession.did.set({
+
value: session.did,
+
httpOnly: true,
+
secure: process.env.NODE_ENV === 'production',
+
sameSite: 'lax',
+
maxAge: 30 * 24 * 60 * 60 // 30 days
+
})
// Sync sites from PDS to database cache
logger.debug('[Auth] Syncing sites from PDS for', session.did)
···
} catch (err) {
// This catches state validation failures and other OAuth errors
logger.error('[Auth] OAuth callback error', err)
+
c.cookie.did.remove()
return c.redirect('/?error=auth_failed')
}
})
···
const did = cookieSession.did?.value
// Clear the session cookie
-
cookieSession.did.value = ''
-
cookieSession.did.maxAge = 0
+
cookieSession.did.remove()
// If we have a DID, try to revoke the OAuth session
if (did && typeof did === 'string') {
···
const auth = await authenticateRequest(client, c.cookie)
if (!auth) {
+
c.cookie.did.remove()
return { authenticated: false }
}
···
}
} catch (err) {
logger.error('[Auth] Status check error', err)
+
c.cookie.did.remove()
return { authenticated: false }
}
})

+65 -14
src/routes/domain.ts
···
isValidHandle,
toDomain,
updateDomain,
+
countWispDomains,
+
deleteWispDomain,
getCustomDomainInfo,
getCustomDomainById,
claimCustomDomain,
···
import { verifyCustomDomain } from '../lib/dns-verify'
import { logger } from '../lib/logger'
-
export const domainRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/api/domain' })
+
export const domainRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/domain',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
// Public endpoints (no auth required)
.get('/check', async ({ query }) => {
try {
···
try {
const { handle } = body as { handle?: string };
const normalizedHandle = (handle || "").trim().toLowerCase();
-
+
if (!isValidHandle(normalizedHandle)) {
throw new Error("Invalid handle");
}
-
// ensure user hasn't already claimed
-
const existing = await getDomainByDid(auth.did);
-
if (existing) {
-
throw new Error("Already claimed");
-
}
-
+
// Check if user already has 3 domains (handled in claimDomain)
// claim in DB
let domain: string;
try {
domain = await claimDomain(auth.did, normalizedHandle);
} catch (err) {
-
throw new Error("Handle taken");
+
const message = err instanceof Error ? err.message : 'Unknown error';
+
if (message === 'domain_limit_reached') {
+
throw new Error("Domain limit reached: You can only claim up to 3 wisp.place domains");
+
}
+
throw new Error("Handle taken or error claiming domain");
}
-
// write place.wisp.domain record rkey = self
+
// write place.wisp.domain record with unique rkey
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
+
const rkey = normalizedHandle; // Use handle as rkey for uniqueness
await agent.com.atproto.repo.putRecord({
repo: auth.did,
collection: "place.wisp.domain",
-
rkey: "self",
+
rkey,
record: {
$type: "place.wisp.domain",
domain,
···
})
.post('/wisp/map-site', async ({ body, auth }) => {
try {
-
const { siteRkey } = body as { siteRkey: string | null };
+
const { domain, siteRkey } = body as { domain: string; siteRkey: string | null };
+
+
if (!domain) {
+
throw new Error('Domain parameter required');
+
}
// Update wisp.place domain to point to this site
-
await updateWispDomainSite(auth.did, siteRkey);
+
await updateWispDomainSite(domain, siteRkey);
return { success: true };
} catch (err) {
logger.error('[Domain] Wisp domain map error', err);
throw new Error(`Failed to map site: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.delete('/wisp/:domain', async ({ params, auth }) => {
+
try {
+
const { domain } = params;
+
+
// Verify domain belongs to user
+
const domainLower = domain.toLowerCase().trim();
+
const info = await isDomainRegistered(domainLower);
+
+
if (!info.registered || info.type !== 'wisp') {
+
throw new Error('Domain not found');
+
}
+
+
if (info.did !== auth.did) {
+
throw new Error('Unauthorized: You do not own this domain');
+
}
+
+
// Delete from database
+
await deleteWispDomain(domainLower);
+
+
// Delete from PDS
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
+
const handle = domainLower.replace(`.${process.env.BASE_DOMAIN || 'wisp.place'}`, '');
+
try {
+
await agent.com.atproto.repo.deleteRecord({
+
repo: auth.did,
+
collection: "place.wisp.domain",
+
rkey: handle,
+
});
+
} catch (err) {
+
// Record might not exist in PDS, continue anyway
+
logger.warn('[Domain] Could not delete wisp domain from PDS', err);
+
}
+
+
return { success: true };
+
} catch (err) {
+
logger.error('[Domain] Wisp domain delete error', err);
+
throw new Error(`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
}
})
.post('/custom/:id/map-site', async ({ params, body, auth }) => {
+8 -2
src/routes/site.ts
···
import { deleteSite } from '../lib/db'
import { logger } from '../lib/logger'
-
export const siteRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/api/site' })
+
export const siteRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/site',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
.derive(async ({ cookie }) => {
const auth = await requireAuth(client, cookie)
return { auth }
+30 -10
src/routes/user.ts
···
-
import { Elysia } from 'elysia'
+
import { Elysia, t } from 'elysia'
import { requireAuth } from '../lib/wisp-auth'
import { NodeOAuthClient } from '@atproto/oauth-client-node'
import { Agent } from '@atproto/api'
-
import { getSitesByDid, getDomainByDid, getCustomDomainsByDid, getWispDomainInfo } from '../lib/db'
+
import { getSitesByDid, getDomainByDid, getCustomDomainsByDid, getWispDomainInfo, getDomainsBySite, getAllWispDomains } from '../lib/db'
import { syncSitesFromPDS } from '../lib/sync-sites'
import { logger } from '../lib/logger'
-
export const userRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/api/user' })
+
export const userRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/user',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
.derive(async ({ cookie }) => {
const auth = await requireAuth(client, cookie)
return { auth }
···
})
.get('/domains', async ({ auth }) => {
try {
-
// Get wisp.place subdomain with mapping
-
const wispDomainInfo = await getWispDomainInfo(auth.did)
+
// Get all wisp.place subdomains with mappings (up to 3)
+
const wispDomains = await getAllWispDomains(auth.did)
// Get custom domains
const customDomains = await getCustomDomainsByDid(auth.did)
return {
-
wispDomain: wispDomainInfo ? {
-
domain: wispDomainInfo.domain,
-
rkey: wispDomainInfo.rkey || null
-
} : null,
+
wispDomains: wispDomains.map(d => ({
+
domain: d.domain,
+
rkey: d.rkey || null
+
})),
customDomains
}
} catch (err) {
···
throw new Error('Failed to sync sites')
}
})
+
.get('/site/:rkey/domains', async ({ auth, params }) => {
+
try {
+
const { rkey } = params
+
const domains = await getDomainsBySite(auth.did, rkey)
+
+
return {
+
rkey,
+
domains
+
}
+
} catch (err) {
+
logger.error('[User] Site domains error', err)
+
throw new Error('Failed to get domains for site')
+
}
+
})
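
A client-side sketch for the new per-site domains endpoint (hypothetical fetch wrapper; the response shape mirrors `getDomainsBySite` in `src/lib/db.ts` above):

```ts
type SiteDomain = { type: 'wisp' | 'custom'; domain: string; verified?: boolean; id?: string }

async function fetchSiteDomains(rkey: string): Promise<SiteDomain[]> {
    const res = await fetch(`/api/user/site/${encodeURIComponent(rkey)}/domains`, {
        credentials: 'include' // make sure the signed `did` cookie is sent
    })
    if (!res.ok) throw new Error(`Failed to load domains for ${rkey}`)
    const data = (await res.json()) as { rkey: string; domains: SiteDomain[] }
    return data.domains
}
```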
+138 -12
src/routes/wisp.ts
···
createManifest,
updateFileBlobs,
shouldCompressFile,
-
compressFile
+
compressFile,
+
computeCID,
+
extractBlobMap
} from '../lib/wisp-utils'
import { upsertSite } from '../lib/db'
import { logger } from '../lib/observability'
···
return true;
}
-
export const wispRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/wisp' })
+
export const wispRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/wisp',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
.derive(async ({ cookie }) => {
const auth = await requireAuth(client, cookie)
return { auth }
···
siteName: string;
files: File | File[]
};
+
+
console.log('=== UPLOAD FILES START ===');
+
console.log('Site name:', siteName);
+
console.log('Files received:', Array.isArray(files) ? files.length : 'single file');
try {
if (!siteName) {
···
// Create agent with OAuth session
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
+
console.log('Agent created for DID:', auth.did);
+
+
// Try to fetch existing record to enable incremental updates
+
let existingBlobMap = new Map<string, { blobRef: any; cid: string }>();
+
console.log('Attempting to fetch existing record...');
+
try {
+
const rkey = siteName;
+
const existingRecord = await agent.com.atproto.repo.getRecord({
+
repo: auth.did,
+
collection: 'place.wisp.fs',
+
rkey: rkey
+
});
+
console.log('Existing record found!');
+
+
if (existingRecord.data.value && typeof existingRecord.data.value === 'object' && 'root' in existingRecord.data.value) {
+
const manifest = existingRecord.data.value as any;
+
existingBlobMap = extractBlobMap(manifest.root);
+
console.log(`Found existing manifest with ${existingBlobMap.size} files for incremental update`);
+
logger.info(`Found existing manifest with ${existingBlobMap.size} files for incremental update`);
+
}
+
} catch (error: any) {
+
console.log('No existing record found or error:', error?.message || error);
+
// Record doesn't exist yet, this is a new site
+
if (error?.status !== 400 && error?.error !== 'RecordNotFound') {
+
logger.warn('Failed to fetch existing record, proceeding with full upload', error);
+
}
+
}
// Convert File objects to UploadedFile format
// Elysia gives us File objects directly, handle both single file and array
···
const uploadedFiles: UploadedFile[] = [];
const skippedFiles: Array<{ name: string; reason: string }> = [];
-
+
console.log('Processing files, count:', fileArray.length);
for (let i = 0; i < fileArray.length; i++) {
const file = fileArray[i];
+
console.log(`Processing file ${i + 1}/${fileArray.length}:`, file.name, file.size, 'bytes');
// Skip files that are too large (limit to 100MB per file)
const maxSize = MAX_FILE_SIZE; // 100MB
···
// Compress and base64 encode ALL files
const compressedContent = compressFile(originalContent);
// Base64 encode the gzipped content to prevent PDS content sniffing
-
const base64Content = Buffer.from(compressedContent.toString('base64'), 'utf-8');
+
// Convert base64 string to bytes using binary encoding (each char becomes exactly one byte)
+
// This is what PDS receives and computes CID on
+
const base64Content = Buffer.from(compressedContent.toString('base64'), 'binary');
const compressionRatio = (compressedContent.length / originalContent.length * 100).toFixed(1);
+
console.log(`Compressing ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%), base64: ${base64Content.length} bytes`);
logger.info(`Compressing ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%), base64: ${base64Content.length} bytes`);
uploadedFiles.push({
name: file.name,
-
content: base64Content,
+
content: base64Content, // This is the gzipped+base64 content that will be uploaded and CID-computed
mimeType: originalMimeType,
size: base64Content.length,
compressed: true,
···
}
// Process files into directory structure
-
const { directory, fileCount } = processUploadedFiles(uploadedFiles);
+
console.log('Processing uploaded files into directory structure...');
+
console.log('uploadedFiles array length:', uploadedFiles.length);
+
console.log('uploadedFiles contents:', uploadedFiles.map((f, i) => `${i}: ${f?.name || 'UNDEFINED'}`));
-
// Upload files as blobs in parallel
+
// Filter out any undefined/null/invalid entries (defensive)
+
const validUploadedFiles = uploadedFiles.filter((f, i) => {
+
if (!f) {
+
console.error(`Filtering out undefined/null file at index ${i}`);
+
return false;
+
}
+
if (!f.name) {
+
console.error(`Filtering out file with no name at index ${i}:`, f);
+
return false;
+
}
+
if (!f.content) {
+
console.error(`Filtering out file with no content at index ${i}:`, f.name);
+
return false;
+
}
+
return true;
+
});
+
if (validUploadedFiles.length !== uploadedFiles.length) {
+
console.warn(`Filtered out ${uploadedFiles.length - validUploadedFiles.length} invalid files`);
+
}
+
console.log('validUploadedFiles length:', validUploadedFiles.length);
+
+
const { directory, fileCount } = processUploadedFiles(validUploadedFiles);
+
console.log('Directory structure created, file count:', fileCount);
+
+
// Upload files as blobs in parallel (or reuse existing blobs with matching CIDs)
+
console.log('Starting blob upload/reuse phase...');
// For compressed files, we upload as octet-stream and store the original MIME type in metadata
// For text/html files, we also use octet-stream as a workaround for PDS image pipeline issues
-
const uploadPromises = uploadedFiles.map(async (file, i) => {
+
const uploadPromises = validUploadedFiles.map(async (file, i) => {
try {
+
// Skip undefined files (shouldn't happen after filter, but defensive)
+
if (!file || !file.name) {
+
console.error(`ERROR: Undefined file at index ${i} in validUploadedFiles!`);
+
throw new Error(`Undefined file at index ${i}`);
+
}
+
+
// Compute CID for this file to check if it already exists
+
// Note: file.content is already gzipped+base64 encoded
+
const fileCID = computeCID(file.content);
+
+
// Normalize the file path for comparison (remove base folder prefix like "cobblemon/")
+
const normalizedPath = file.name.replace(/^[^\/]*\//, '');
+
+
// Check if we have an existing blob with the same CID
+
// Try both the normalized path and the full path
+
const existingBlob = existingBlobMap.get(normalizedPath) || existingBlobMap.get(file.name);
+
+
if (existingBlob && existingBlob.cid === fileCID) {
+
// Reuse existing blob - no need to upload
+
logger.info(`[File Upload] Reusing existing blob for: ${file.name} (CID: ${fileCID})`);
+
+
return {
+
result: {
+
hash: existingBlob.cid,
+
blobRef: existingBlob.blobRef,
+
...(file.compressed && {
+
encoding: 'gzip' as const,
+
mimeType: file.originalMimeType || file.mimeType,
+
base64: true
+
})
+
},
+
filePath: file.name,
+
sentMimeType: file.mimeType,
+
returnedMimeType: existingBlob.blobRef.mimeType,
+
reused: true
+
};
+
}
+
+
// File is new or changed - upload it
// If compressed, always upload as octet-stream
// Otherwise, workaround: PDS incorrectly processes text/html through image pipeline
const uploadMimeType = file.compressed || file.mimeType.startsWith('text/html')
···
: file.mimeType;
const compressionInfo = file.compressed ? ' (gzipped)' : '';
-
logger.info(`[File Upload] Uploading file: ${file.name} (original: ${file.mimeType}, sending as: ${uploadMimeType}, ${file.size} bytes${compressionInfo})`);
+
logger.info(`[File Upload] Uploading new/changed file: ${file.name} (original: ${file.mimeType}, sending as: ${uploadMimeType}, ${file.size} bytes${compressionInfo}, CID: ${fileCID})`);
const uploadResult = await agent.com.atproto.repo.uploadBlob(
file.content,
···
},
filePath: file.name,
sentMimeType: file.mimeType,
-
returnedMimeType: returnedBlobRef.mimeType
+
returnedMimeType: returnedBlobRef.mimeType,
+
reused: false
};
} catch (uploadError) {
logger.error('Upload failed for file', uploadError);
···
// Wait for all uploads to complete
const uploadedBlobs = await Promise.all(uploadPromises);
+
// Count reused vs uploaded blobs
+
const reusedCount = uploadedBlobs.filter(b => (b as any).reused).length;
+
const uploadedCount = uploadedBlobs.filter(b => !(b as any).reused).length;
+
console.log(`Blob statistics: ${reusedCount} reused, ${uploadedCount} uploaded, ${uploadedBlobs.length} total`);
+
logger.info(`Blob statistics: ${reusedCount} reused, ${uploadedCount} uploaded, ${uploadedBlobs.length} total`);
+
// Extract results and file paths in correct order
const uploadResults: FileUploadResult[] = uploadedBlobs.map(blob => blob.result);
const filePaths: string[] = uploadedBlobs.map(blob => blob.filePath);
// Update directory with file blobs
+
console.log('Updating directory with blob references...');
const updatedDirectory = updateFileBlobs(directory, uploadResults, filePaths);
// Create manifest
+
console.log('Creating manifest...');
const manifest = createManifest(siteName, updatedDirectory, fileCount);
+
console.log('Manifest created successfully');
// Use site name as rkey
const rkey = siteName;
let record;
try {
+
console.log('Putting record to PDS with rkey:', rkey);
record = await agent.com.atproto.repo.putRecord({
repo: auth.did,
collection: 'place.wisp.fs',
rkey: rkey,
record: manifest
});
+
console.log('Record successfully created on PDS:', record.data.uri);
} catch (putRecordError: any) {
+
console.error('FAILED to create record on PDS:', putRecordError);
logger.error('Failed to create record on PDS', putRecordError);
throw putRecordError;
···
fileCount,
siteName,
skippedFiles,
-
uploadedCount: uploadedFiles.length
+
uploadedCount: validUploadedFiles.length
};
+
console.log('=== UPLOAD FILES COMPLETE ===');
return result;
} catch (error) {
+
console.error('=== UPLOAD ERROR ===');
+
console.error('Error details:', error);
+
console.error('Stack trace:', error instanceof Error ? error.stack : 'N/A');
logger.error('Upload error', error, {
message: error instanceof Error ? error.message : 'Unknown error',
name: error instanceof Error ? error.name : undefined