Monorepo for [wisp.place](https://wisp.place), a static site hosting service built on top of the AT Protocol.

Compare changes


Changed files
+26359 -12395
.tangled
apps
hosting-service
main-app
public
scripts
src
cli
hosting-service
lexicons
packages
public
scripts
src
testDeploy
+6
.dockerignore
···
*.log
.vscode
.idea
+
.prettierrc
+
testDeploy
+
.tangled
+
.crush
+
.claude
+
hosting-service
+6 -1
.gitignore
···
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
.env
# dependencies
-
/node_modules
+
node_modules
+
**/node_modules
/.pnp
.pnp.js
···
# production
/build
+
/result
+
dist
+
**/dist
+
*.tsbuildinfo
# misc
.DS_Store
-3
.gitmodules
···
-
[submodule "cli/jacquard"]
-
path = cli/jacquard
-
url = https://tangled.org/@nonbinary.computer/jacquard
-7
.prettierrc
···
-
{
-
"useTabs": true,
-
"tabWidth": 4,
-
"semi": false,
-
"singleQuote": true,
-
"trailingComma": "none"
-
}
-1
.tangled/workflows/deploy-wisp.yml
···
- name: 'Deploy to Wisp.place'
command: |
-
echo
./cli/target/release/wisp-cli \
"$WISP_HANDLE" \
--path "$SITE_PATH" \
+4
.tangled/workflows/test.yml
···
- name: install dependencies
command: |
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+
# have to regenerate, otherwise it won't install the dependencies needed to run
+
rm -rf bun.lock package-lock.json
+
bun install @oven/bun-linux-aarch64
bun install
- name: run all tests
+10 -15
Dockerfile
···
WORKDIR /app
# Copy package files
-
COPY package.json bun.lock* ./
+
COPY package.json ./
+
+
# Copy Bun configuration
+
COPY bunfig.toml ./
+
+
COPY tsconfig.json ./
# Install dependencies
-
RUN bun install --frozen-lockfile
+
RUN bun install
# Copy source code
COPY src ./src
COPY public ./public
-
# Build the application (if needed)
-
# RUN bun run build
-
-
# Set environment variables (can be overridden at runtime)
-
ENV PORT=3000
+
ENV PORT=8000
ENV NODE_ENV=production
-
# Expose the application port
-
EXPOSE 3000
-
-
# Health check
-
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
-
CMD bun -e "fetch('http://localhost:3000/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
+
EXPOSE 8000
-
# Start the application
-
CMD ["bun", "src/index.ts"]
+
CMD ["bun", "start"]
+99 -5
README.md
···
# Wisp.place
-
A static site hosting service built on the AT Protocol. [https://wisp.place](https://wisp.place)
-
/src is the main backend
+
Decentralized static site hosting on the AT Protocol. [https://wisp.place](https://wisp.place)
+
+
## What is this?
-
/hosting-service is the microservice that serves on-disk caches of sites pulled from the firehose and pdses
+
Host static sites in your AT Protocol repo, served with CDN distribution. Your PDS holds the cryptographically signed manifest and files - the source of truth. Hosting services index and serve them fast.
-
/cli is the wisp-cli, a way to upload sites directly to the pds
+
## Quick Start
-
full readme soon
+
```bash
+
# Using the web interface
+
Visit https://wisp.place and sign in
+
+
# Or use the CLI
+
cd cli
+
cargo build --release
+
./target/release/wisp-cli your-handle.bsky.social --path ./my-site --site my-site
+
```
+
+
Your site appears at `https://sites.wisp.place/{your-did}/{site-name}` or your custom domain.
+
+
## Architecture
+
+
- **`/src`** - Main backend (OAuth, site management, custom domains)
+
- **`/hosting-service`** - Microservice that serves cached sites from disk
+
- **`/cli`** - Rust CLI for direct PDS uploads
+
- **`/public`** - React frontend
+
+
### How it works
+
+
1. Sites stored as `place.wisp.fs` records in your AT Protocol repo
+
2. Files compressed (gzip) and base64-encoded as blobs
+
3. Hosting service watches firehose, caches sites locally
+
4. Sites served via custom domains or `*.wisp.place` subdomains
+
+
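The compression step above can be sketched in TypeScript using Node's zlib (a minimal illustration; `encodeFile`/`decodeFile` are hypothetical names, not the actual record helpers):

```typescript
// Hypothetical sketch: gzip + base64 round-trip for a site file.
import { gzipSync, gunzipSync } from "node:zlib";

export function encodeFile(contents: string): string {
  // Compress, then base64-encode so the payload can travel as a blob field.
  return gzipSync(Buffer.from(contents, "utf8")).toString("base64");
}

export function decodeFile(encoded: string): string {
  return gunzipSync(Buffer.from(encoded, "base64")).toString("utf8");
}
```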
## Development
+
+
```bash
+
# Backend
+
bun install
+
bun run src/index.ts
+
+
# Hosting service
+
cd hosting-service
+
npm run start
+
+
# CLI
+
cd cli
+
cargo build
+
```
+
+
## Features
+
+
### URL Redirects and Rewrites
+
+
The hosting service supports Netlify-style `_redirects` files for managing URLs. Place a `_redirects` file in your site root to enable:
+
+
- **301/302 Redirects**: Permanent and temporary URL redirects
+
- **200 Rewrites**: Serve different content without changing the URL
+
- **404 Custom Pages**: Custom error pages for specific paths
+
- **Splats & Placeholders**: Dynamic path matching (`/blog/:year/:month/:day`, `/news/*`)
+
- **Query Parameter Matching**: Redirect based on URL parameters
+
- **Conditional Redirects**: Route by country, language, or cookie presence
+
- **Force Redirects**: Override existing files with redirects
+
+
Example `_redirects`:
+
```
+
# Single-page app routing (React, Vue, etc.)
+
/* /index.html 200
+
+
# Simple redirects
+
/home /
+
/old-blog/* /blog/:splat
+
+
# API proxy
+
/api/* https://api.example.com/:splat 200
+
+
# Country-based routing
+
/ /us/ 302 Country=us
+
/ /uk/ 302 Country=gb
+
```
+
+
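A rule like `/old-blog/* /blog/:splat` can be matched roughly as below (an illustrative sketch; `matchRule` is a hypothetical helper, and the real parser also handles status codes, query parameters, and conditions):

```typescript
// Hypothetical sketch of _redirects rule matching (splats and placeholders).
type Rule = { from: string; to: string; status: number };

export function matchRule(rule: Rule, path: string): string | null {
  const params: Record<string, string> = {};
  const fromParts = rule.from.split("/").filter(Boolean);
  const pathParts = path.split("/").filter(Boolean);

  for (let i = 0; i < fromParts.length; i++) {
    const pattern = fromParts[i];
    const segment = pathParts[i];
    if (pattern === "*") {
      // A splat swallows the rest of the path.
      params.splat = pathParts.slice(i).join("/");
      return substitute(rule.to, params);
    }
    if (segment === undefined) return null;
    if (pattern.startsWith(":")) {
      params[pattern.slice(1)] = segment; // placeholder, e.g. :year
    } else if (pattern !== segment) {
      return null;
    }
  }
  // Without a splat, segment counts must match exactly.
  if (fromParts.length !== pathParts.length) return null;
  return substitute(rule.to, params);
}

function substitute(to: string, params: Record<string, string>): string {
  return to.replace(/:(\w+)/g, (_, name) => params[name] ?? "");
}
```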
## Limits
+
+
- Max file size: 100MB (PDS limit)
+
- Max files: 2000
+
+
## Tech Stack
+
+
- Backend: Bun + Elysia + PostgreSQL
+
- Frontend: React 19 + Tailwind 4 + Radix UI
+
- Hosting: Node microservice using Hono
+
- CLI: Rust + Jacquard (AT Protocol library)
+
- Protocol: AT Protocol OAuth + custom lexicons
+
+
## License
+
+
MIT
+
+
## Links
+
+
- [AT Protocol](https://atproto.com)
+
- [Jacquard Library](https://tangled.org/@nonbinary.computer/jacquard)
-41
api.md
···
-
/**
-
* AUTHENTICATION ROUTES
-
*
-
* Handles OAuth authentication flow for Bluesky/ATProto accounts
-
* All routes are on the editor.wisp.place subdomain
-
*
-
* Routes:
-
* POST /api/auth/signin - Initiate OAuth sign-in flow
-
* GET /api/auth/callback - OAuth callback handler (redirect from PDS)
-
* GET /api/auth/status - Check current authentication status
-
* POST /api/auth/logout - Sign out and clear session
-
*/
-
-
/**
-
* CUSTOM DOMAIN ROUTES
-
*
-
* Handles custom domain (BYOD - Bring Your Own Domain) management
-
* Users can claim custom domains with DNS verification (TXT + CNAME)
-
* and map them to their sites
-
*
-
* Routes:
-
* GET /api/check-domain - Fast verification check for routing (public)
-
* GET /api/custom-domains - List user's custom domains
-
* POST /api/custom-domains/check - Check domain availability and DNS config
-
* POST /api/custom-domains/claim - Claim a custom domain
-
* PUT /api/custom-domains/:id/site - Update site mapping
-
* DELETE /api/custom-domains/:id - Remove a custom domain
-
* POST /api/custom-domains/:id/verify - Manually trigger verification
-
*/
-
-
/**
-
* WISP SITE MANAGEMENT ROUTES
-
*
-
* API endpoints for managing user's Wisp sites stored in ATProto repos
-
* Handles reading site metadata, fetching content, updating sites, and uploads
-
* All routes are on the editor.wisp.place subdomain
-
*
-
* Routes:
-
* GET /wisp/sites - List all sites for authenticated user
-
* POST /wisp/upload-files - Upload and deploy files as a site
-
*/
+34
apps/hosting-service/.dockerignore
···
+
# Dependencies
+
node_modules
+
+
# Environment files
+
.env
+
.env.*
+
!.env.example
+
+
# Git
+
.git
+
.gitignore
+
+
# Cache
+
cache
+
+
# Documentation
+
*.md
+
!README.md
+
+
# Logs
+
*.log
+
npm-debug.log*
+
bun-debug.log*
+
+
# OS files
+
.DS_Store
+
Thumbs.db
+
+
# IDE
+
.vscode
+
.idea
+
*.swp
+
*.swo
+
*~
+6
apps/hosting-service/.env.example
···
+
# Database
+
DATABASE_URL=postgres://postgres:postgres@localhost:5432/wisp
+
+
# Server
+
PORT=3001
+
BASE_HOST=wisp.place
+8
apps/hosting-service/.gitignore
···
+
node_modules/
+
cache/
+
.env
+
.env.local
+
*.log
+
dist/
+
build/
+
.DS_Store
+31
apps/hosting-service/Dockerfile
···
+
# Use official Node.js Alpine image
+
FROM node:alpine AS base
+
+
# Set working directory
+
WORKDIR /app
+
+
# Copy package files
+
COPY package.json ./
+
+
# Install dependencies
+
RUN npm install
+
+
# Copy source code
+
COPY src ./src
+
+
# Create cache directory
+
RUN mkdir -p ./cache/sites
+
+
# Set environment variables (can be overridden at runtime)
+
ENV PORT=3001
+
ENV NODE_ENV=production
+
+
# Expose the application port
+
EXPOSE 3001
+
+
# Health check
+
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
+
CMD node -e "fetch('http://localhost:3001/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
+
+
# Start the application (can override with 'npm run backfill' in compose)
+
CMD ["npm", "run", "start"]
+130
apps/hosting-service/README.md
···
+
# Wisp Hosting Service
+
+
Minimal microservice for hosting static sites from the AT Protocol. Built with Hono and Bun.
+
+
## Features
+
+
- **Custom Domain Hosting**: Serve verified custom domains
+
- **Wisp.place Subdomains**: Serve registered `*.wisp.place` subdomains
+
- **DNS Hash Routing**: Support DNS verification via `hash.dns.wisp.place`
+
- **Direct File Serving**: Access sites via `sites.wisp.place/:identifier/:site/*` (no DB lookup)
+
- **Firehose Worker**: Listens to AT Protocol firehose for new `place.wisp.fs` records
+
- **Automatic Caching**: Downloads and caches sites locally on first access or firehose event
+
- **SSRF Protection**: Hardened fetch with timeout, size limits, and private IP blocking
+
+
## Routes
+
+
1. **Custom Domains** (`/*`)
+
- Serves verified custom domains (example.com)
+
- DB lookup: `custom_domains` table
+
+
2. **Wisp Subdomains** (`/*.wisp.place/*`)
+
- Serves registered subdomains (alice.wisp.place)
+
- DB lookup: `domains` table
+
+
3. **DNS Hash Routing** (`/hash.dns.wisp.place/*`)
+
- DNS verification routing for custom domains
+
- DB lookup: `custom_domains` by hash
+
+
4. **Direct Serving** (`/sites.wisp.place/:identifier/:site/*`)
+
- Direct access without DB lookup
+
- `:identifier` can be DID or handle
+
- Fetches from PDS if not cached
+
- **Automatic HTML path rewriting**: Absolute paths (`/style.css`) are rewritten to relative paths (`sites.wisp.place/:identifier/:site/style.css`)
+
+
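The host-based dispatch above can be sketched as a pure decision function (illustrative only; `classifyHost` is a hypothetical name, the check order is assumed, and the real service also consults the database):

```typescript
// Hypothetical sketch of the host-based routing decision (most specific
// hosts checked first).
type RouteKind = "direct" | "dns-hash" | "subdomain" | "custom";

export function classifyHost(host: string, baseHost = "wisp.place"): RouteKind {
  if (host === `sites.${baseHost}`) return "direct";        // route 4
  if (host.endsWith(`.dns.${baseHost}`)) return "dns-hash"; // route 3
  if (host.endsWith(`.${baseHost}`)) return "subdomain";    // route 2
  return "custom";                                          // route 1
}
```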
## Setup
+
+
```bash
+
# Install dependencies
+
bun install
+
+
# Copy environment file
+
cp .env.example .env
+
+
# Run in development
+
bun run dev
+
+
# Run in production
+
bun run start
+
```
+
+
## Environment Variables
+
+
- `DATABASE_URL` - PostgreSQL connection string
+
- `PORT` - HTTP server port (default: 3001)
+
- `BASE_HOST` - Base domain (default: wisp.place)
+
+
## Architecture
+
+
- **Hono**: Minimal web framework
+
- **Postgres**: Database for domain/site lookups
+
- **AT Protocol**: Decentralized storage
+
- **Jetstream**: Firehose consumer for real-time updates
+
- **Bun**: Runtime and file serving
+
+
## Cache Structure
+
+
```
+
cache/sites/
+
did:plc:abc123/
+
sitename/
+
index.html
+
style.css
+
assets/
+
logo.png
+
```
+
+
## Health Check
+
+
```bash
+
curl http://localhost:3001/health
+
```
+
+
Returns firehose connection status and last event time.
+
+
## HTML Path Rewriting
+
+
When serving sites via the `/s/:identifier/:site/*` route, HTML files are automatically processed to rewrite absolute paths to work correctly in the subdirectory context.
+
+
**What gets rewritten:**
+
- `src` attributes (images, scripts, iframes)
+
- `href` attributes (links, stylesheets)
+
- `action` attributes (forms)
+
- `poster`, `data` attributes (media)
+
- `srcset` attributes (responsive images)
+
+
**What's preserved:**
+
- External URLs (`https://example.com/style.css`)
+
- Protocol-relative URLs (`//cdn.example.com/script.js`)
+
- Data URIs (`data:image/png;base64,...`)
+
- Anchors (`/#section`)
+
- Already relative paths (`./style.css`, `../images/logo.png`)
+
+
**Example:**
+
```html
+
<!-- Original HTML -->
+
<link rel="stylesheet" href="/style.css">
+
<img src="/images/logo.png">
+
+
<!-- Served at /s/did:plc:abc123/mysite/ becomes -->
+
<link rel="stylesheet" href="sites.wisp.place/did:plc:abc123/mysite/style.css">
+
<img src="sites.wisp.place/did:plc:abc123/mysite/images/logo.png">
+
```
+
+
This ensures sites work correctly when served from subdirectories without requiring manual path adjustments.
+
+
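The rewrite rule can be approximated with a single regex pass (a simplified sketch, not the actual implementation; `rewriteHtmlPaths` is a hypothetical name, and the real service also handles `srcset` and other edge cases):

```typescript
// Simplified sketch of absolute-path rewriting for served HTML.
// Skips protocol-relative URLs ("//cdn...") and anchors ("/#section");
// external URLs, data: URIs, and relative paths never start with a bare "/"
// after the quote, so they pass through untouched.
export function rewriteHtmlPaths(html: string, prefix: string): string {
  return html.replace(
    /\b(src|href|action|poster|data)=(["'])\/(?!\/|#)/g,
    (_match, attr, quote) => `${attr}=${quote}${prefix}/`
  );
}
```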
## Security
+
+
### SSRF Protection
+
+
All external HTTP requests are protected against Server-Side Request Forgery (SSRF) attacks:
+
+
- **5-second timeout** on all requests
+
- **Size limits**: 1MB for JSON, 10MB default, 100MB for file blobs
+
- **Blocked private IP ranges**:
+
- Loopback (127.0.0.0/8, ::1)
+
- Private networks (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16)
+
- Link-local (169.254.0.0/16, fe80::/10)
+
- Cloud metadata endpoints (169.254.169.254)
+
- **Protocol validation**: Only HTTP/HTTPS allowed
+
- **Streaming with size enforcement**: Prevents memory exhaustion from large responses
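The private-range check can be sketched as follows (IPv4 only, with a hypothetical `isPrivateIPv4` helper; per the list above, the real implementation also covers the IPv6 ranges `::1` and `fe80::/10`):

```typescript
// Hypothetical sketch of the IPv4 deny-list applied before fetching.
export function isPrivateIPv4(ip: string): boolean {
  const parts = ip.split(".").map(Number);
  if (parts.length !== 4 || parts.some((p) => !Number.isInteger(p) || p < 0 || p > 255)) {
    return false; // not a dotted-quad IPv4 literal
  }
  const [a, b] = parts;
  return (
    a === 127 ||                         // loopback 127.0.0.0/8
    a === 10 ||                          // private 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) || // private 172.16.0.0/12
    (a === 192 && b === 168) ||          // private 192.168.0.0/16
    (a === 169 && b === 254)             // link-local, incl. metadata 169.254.169.254
  );
}
```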
+376
apps/hosting-service/bun.lock
···
+
{
+
"lockfileVersion": 1,
+
"configVersion": 0,
+
"workspaces": {
+
"": {
+
"name": "wisp-hosting-service",
+
"dependencies": {
+
"@atproto/api": "^0.17.4",
+
"@atproto/identity": "^0.4.9",
+
"@atproto/lexicon": "^0.5.1",
+
"@atproto/sync": "^0.1.36",
+
"@atproto/xrpc": "^0.7.5",
+
"@hono/node-server": "^1.19.6",
+
"hono": "^4.10.4",
+
"mime-types": "^2.1.35",
+
"multiformats": "^13.4.1",
+
"postgres": "^3.4.5",
+
},
+
"devDependencies": {
+
"@types/bun": "^1.3.1",
+
"@types/mime-types": "^2.1.4",
+
"@types/node": "^22.10.5",
+
"tsx": "^4.19.2",
+
},
+
},
+
},
+
"packages": {
+
"@atproto/api": ["@atproto/api@0.17.4", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, ""],
+
+
"@atproto/common": ["@atproto/common@0.4.12", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@ipld/dag-cbor": "^7.0.3", "cbor-x": "^1.5.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, ""],
+
+
"@atproto/common-web": ["@atproto/common-web@0.4.3", "", { "dependencies": { "graphemer": "^1.4.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "zod": "^3.23.8" } }, ""],
+
+
"@atproto/crypto": ["@atproto/crypto@0.4.4", "", { "dependencies": { "@noble/curves": "^1.7.0", "@noble/hashes": "^1.6.1", "uint8arrays": "3.0.0" } }, ""],
+
+
"@atproto/identity": ["@atproto/identity@0.4.9", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/crypto": "^0.4.4" } }, ""],
+
+
"@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, ""],
+
+
"@atproto/repo": ["@atproto/repo@0.8.10", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/common-web": "^0.4.3", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@ipld/dag-cbor": "^7.0.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "varint": "^6.0.0", "zod": "^3.23.8" } }, ""],
+
+
"@atproto/sync": ["@atproto/sync@0.1.36", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/identity": "^0.4.9", "@atproto/lexicon": "^0.5.1", "@atproto/repo": "^0.8.10", "@atproto/syntax": "^0.4.1", "@atproto/xrpc-server": "^0.9.5", "multiformats": "^9.9.0", "p-queue": "^6.6.2", "ws": "^8.12.0" } }, "sha512-HyF835Bmn8ps9BuXkmGjRrbgfv4K3fJdfEvXimEhTCntqIxQg0ttmOYDg/WBBmIRfkCB5ab+wS1PCGN8trr+FQ=="],
+
+
"@atproto/syntax": ["@atproto/syntax@0.4.1", "", {}, ""],
+
+
"@atproto/xrpc": ["@atproto/xrpc@0.7.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "zod": "^3.23.8" } }, ""],
+
+
"@atproto/xrpc-server": ["@atproto/xrpc-server@0.9.5", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@atproto/xrpc": "^0.7.5", "cbor-x": "^1.5.1", "express": "^4.17.2", "http-errors": "^2.0.0", "mime-types": "^2.1.35", "rate-limiter-flexible": "^2.4.1", "uint8arrays": "3.0.0", "ws": "^8.12.0", "zod": "^3.23.8" } }, ""],
+
+
"@cbor-extract/cbor-extract-darwin-arm64": ["@cbor-extract/cbor-extract-darwin-arm64@2.2.0", "", { "os": "darwin", "cpu": "arm64" }, ""],
+
+
"@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.25.11", "", { "os": "aix", "cpu": "ppc64" }, "sha512-Xt1dOL13m8u0WE8iplx9Ibbm+hFAO0GsU2P34UNoDGvZYkY8ifSiy6Zuc1lYxfG7svWE2fzqCUmFp5HCn51gJg=="],
+
+
"@esbuild/android-arm": ["@esbuild/android-arm@0.25.11", "", { "os": "android", "cpu": "arm" }, "sha512-uoa7dU+Dt3HYsethkJ1k6Z9YdcHjTrSb5NUy66ZfZaSV8hEYGD5ZHbEMXnqLFlbBflLsl89Zke7CAdDJ4JI+Gg=="],
+
+
"@esbuild/android-arm64": ["@esbuild/android-arm64@0.25.11", "", { "os": "android", "cpu": "arm64" }, "sha512-9slpyFBc4FPPz48+f6jyiXOx/Y4v34TUeDDXJpZqAWQn/08lKGeD8aDp9TMn9jDz2CiEuHwfhRmGBvpnd/PWIQ=="],
+
+
"@esbuild/android-x64": ["@esbuild/android-x64@0.25.11", "", { "os": "android", "cpu": "x64" }, "sha512-Sgiab4xBjPU1QoPEIqS3Xx+R2lezu0LKIEcYe6pftr56PqPygbB7+szVnzoShbx64MUupqoE0KyRlN7gezbl8g=="],
+
+
"@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.25.11", "", { "os": "darwin", "cpu": "arm64" }, "sha512-VekY0PBCukppoQrycFxUqkCojnTQhdec0vevUL/EDOCnXd9LKWqD/bHwMPzigIJXPhC59Vd1WFIL57SKs2mg4w=="],
+
+
"@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.25.11", "", { "os": "darwin", "cpu": "x64" }, "sha512-+hfp3yfBalNEpTGp9loYgbknjR695HkqtY3d3/JjSRUyPg/xd6q+mQqIb5qdywnDxRZykIHs3axEqU6l1+oWEQ=="],
+
+
"@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.25.11", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-CmKjrnayyTJF2eVuO//uSjl/K3KsMIeYeyN7FyDBjsR3lnSJHaXlVoAK8DZa7lXWChbuOk7NjAc7ygAwrnPBhA=="],
+
+
"@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.25.11", "", { "os": "freebsd", "cpu": "x64" }, "sha512-Dyq+5oscTJvMaYPvW3x3FLpi2+gSZTCE/1ffdwuM6G1ARang/mb3jvjxs0mw6n3Lsw84ocfo9CrNMqc5lTfGOw=="],
+
+
"@esbuild/linux-arm": ["@esbuild/linux-arm@0.25.11", "", { "os": "linux", "cpu": "arm" }, "sha512-TBMv6B4kCfrGJ8cUPo7vd6NECZH/8hPpBHHlYI3qzoYFvWu2AdTvZNuU/7hsbKWqu/COU7NIK12dHAAqBLLXgw=="],
+
+
"@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.25.11", "", { "os": "linux", "cpu": "arm64" }, "sha512-Qr8AzcplUhGvdyUF08A1kHU3Vr2O88xxP0Tm8GcdVOUm25XYcMPp2YqSVHbLuXzYQMf9Bh/iKx7YPqECs6ffLA=="],
+
+
"@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.25.11", "", { "os": "linux", "cpu": "ia32" }, "sha512-TmnJg8BMGPehs5JKrCLqyWTVAvielc615jbkOirATQvWWB1NMXY77oLMzsUjRLa0+ngecEmDGqt5jiDC6bfvOw=="],
+
+
"@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-DIGXL2+gvDaXlaq8xruNXUJdT5tF+SBbJQKbWy/0J7OhU8gOHOzKmGIlfTTl6nHaCOoipxQbuJi7O++ldrxgMw=="],
+
+
"@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-Osx1nALUJu4pU43o9OyjSCXokFkFbyzjXb6VhGIJZQ5JZi8ylCQ9/LFagolPsHtgw6himDSyb5ETSfmp4rpiKQ=="],
+
+
"@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.25.11", "", { "os": "linux", "cpu": "ppc64" }, "sha512-nbLFgsQQEsBa8XSgSTSlrnBSrpoWh7ioFDUmwo158gIm5NNP+17IYmNWzaIzWmgCxq56vfr34xGkOcZ7jX6CPw=="],
+
+
"@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-HfyAmqZi9uBAbgKYP1yGuI7tSREXwIb438q0nqvlpxAOs3XnZ8RsisRfmVsgV486NdjD7Mw2UrFSw51lzUk1ww=="],
+
+
"@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.25.11", "", { "os": "linux", "cpu": "s390x" }, "sha512-HjLqVgSSYnVXRisyfmzsH6mXqyvj0SA7pG5g+9W7ESgwA70AXYNpfKBqh1KbTxmQVaYxpzA/SvlB9oclGPbApw=="],
+
+
"@esbuild/linux-x64": ["@esbuild/linux-x64@0.25.11", "", { "os": "linux", "cpu": "x64" }, "sha512-HSFAT4+WYjIhrHxKBwGmOOSpphjYkcswF449j6EjsjbinTZbp8PJtjsVK1XFJStdzXdy/jaddAep2FGY+wyFAQ=="],
+
+
"@esbuild/netbsd-arm64": ["@esbuild/netbsd-arm64@0.25.11", "", { "os": "none", "cpu": "arm64" }, "sha512-hr9Oxj1Fa4r04dNpWr3P8QKVVsjQhqrMSUzZzf+LZcYjZNqhA3IAfPQdEh1FLVUJSiu6sgAwp3OmwBfbFgG2Xg=="],
+
+
"@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.25.11", "", { "os": "none", "cpu": "x64" }, "sha512-u7tKA+qbzBydyj0vgpu+5h5AeudxOAGncb8N6C9Kh1N4n7wU1Xw1JDApsRjpShRpXRQlJLb9wY28ELpwdPcZ7A=="],
+
+
"@esbuild/openbsd-arm64": ["@esbuild/openbsd-arm64@0.25.11", "", { "os": "openbsd", "cpu": "arm64" }, "sha512-Qq6YHhayieor3DxFOoYM1q0q1uMFYb7cSpLD2qzDSvK1NAvqFi8Xgivv0cFC6J+hWVw2teCYltyy9/m/14ryHg=="],
+
+
"@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.25.11", "", { "os": "openbsd", "cpu": "x64" }, "sha512-CN+7c++kkbrckTOz5hrehxWN7uIhFFlmS/hqziSFVWpAzpWrQoAG4chH+nN3Be+Kzv/uuo7zhX716x3Sn2Jduw=="],
+
+
"@esbuild/openharmony-arm64": ["@esbuild/openharmony-arm64@0.25.11", "", { "os": "none", "cpu": "arm64" }, "sha512-rOREuNIQgaiR+9QuNkbkxubbp8MSO9rONmwP5nKncnWJ9v5jQ4JxFnLu4zDSRPf3x4u+2VN4pM4RdyIzDty/wQ=="],
+
+
"@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.25.11", "", { "os": "sunos", "cpu": "x64" }, "sha512-nq2xdYaWxyg9DcIyXkZhcYulC6pQ2FuCgem3LI92IwMgIZ69KHeY8T4Y88pcwoLIjbed8n36CyKoYRDygNSGhA=="],
+
+
"@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.25.11", "", { "os": "win32", "cpu": "arm64" }, "sha512-3XxECOWJq1qMZ3MN8srCJ/QfoLpL+VaxD/WfNRm1O3B4+AZ/BnLVgFbUV3eiRYDMXetciH16dwPbbHqwe1uU0Q=="],
+
+
"@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.25.11", "", { "os": "win32", "cpu": "ia32" }, "sha512-3ukss6gb9XZ8TlRyJlgLn17ecsK4NSQTmdIXRASVsiS2sQ6zPPZklNJT5GR5tE/MUarymmy8kCEf5xPCNCqVOA=="],
+
+
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.25.11", "", { "os": "win32", "cpu": "x64" }, "sha512-D7Hpz6A2L4hzsRpPaCYkQnGOotdUpDzSGRIv9I+1ITdHROSFUWW95ZPZWQmGka1Fg7W3zFJowyn9WGwMJ0+KPA=="],
+
+
"@hono/node-server": ["@hono/node-server@1.19.6", "", { "peerDependencies": { "hono": "^4" } }, "sha512-Shz/KjlIeAhfiuE93NDKVdZ7HdBVLQAfdbaXEaoAVO3ic9ibRSLGIQGkcBbFyuLr+7/1D5ZCINM8B+6IvXeMtw=="],
+
+
"@ipld/dag-cbor": ["@ipld/dag-cbor@7.0.3", "", { "dependencies": { "cborg": "^1.6.0", "multiformats": "^9.5.4" } }, ""],
+
+
"@noble/curves": ["@noble/curves@1.9.7", "", { "dependencies": { "@noble/hashes": "1.8.0" } }, ""],
+
+
"@noble/hashes": ["@noble/hashes@1.8.0", "", {}, ""],
+
+
"@types/bun": ["@types/bun@1.3.1", "", { "dependencies": { "bun-types": "1.3.1" } }, "sha512-4jNMk2/K9YJtfqwoAa28c8wK+T7nvJFOjxI4h/7sORWcypRNxBpr+TPNaCfVWq70tLCJsqoFwcf0oI0JU/fvMQ=="],
+
+
"@types/mime-types": ["@types/mime-types@2.1.4", "", {}, "sha512-lfU4b34HOri+kAY5UheuFMWPDOI+OPceBSHZKp69gEyTL/mmJ4cnU6Y/rlme3UL3GyOn6Y42hyIEw0/q8sWx5w=="],
+
+
"@types/node": ["@types/node@22.18.12", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-BICHQ67iqxQGFSzfCFTT7MRQ5XcBjG5aeKh5Ok38UBbPe5fxTyE+aHFxwVrGyr8GNlqFMLKD1D3P2K/1ks8tog=="],
+
+
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
+
+
"abort-controller": ["abort-controller@3.0.0", "", { "dependencies": { "event-target-shim": "^5.0.0" } }, ""],
+
+
"accepts": ["accepts@1.3.8", "", { "dependencies": { "mime-types": "~2.1.34", "negotiator": "0.6.3" } }, ""],
+
+
"array-flatten": ["array-flatten@1.1.1", "", {}, ""],
+
+
"atomic-sleep": ["atomic-sleep@1.0.0", "", {}, ""],
+
+
"await-lock": ["await-lock@2.2.2", "", {}, ""],
+
+
"base64-js": ["base64-js@1.5.1", "", {}, ""],
+
+
"body-parser": ["body-parser@1.20.3", "", { "dependencies": { "bytes": "3.1.2", "content-type": "~1.0.5", "debug": "2.6.9", "depd": "2.0.0", "destroy": "1.2.0", "http-errors": "2.0.0", "iconv-lite": "0.4.24", "on-finished": "2.4.1", "qs": "6.13.0", "raw-body": "2.5.2", "type-is": "~1.6.18", "unpipe": "1.0.0" } }, ""],
+
+
"buffer": ["buffer@6.0.3", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" } }, ""],
+
+
"bun-types": ["bun-types@1.3.1", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-NMrcy7smratanWJ2mMXdpatalovtxVggkj11bScuWuiOoXTiKIu2eVS1/7qbyI/4yHedtsn175n4Sm4JcdHLXw=="],
+
+
"bytes": ["bytes@3.1.2", "", {}, ""],
+
+
"call-bind-apply-helpers": ["call-bind-apply-helpers@1.0.2", "", { "dependencies": { "es-errors": "^1.3.0", "function-bind": "^1.1.2" } }, ""],
+
+
"call-bound": ["call-bound@1.0.4", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "get-intrinsic": "^1.3.0" } }, ""],
+
+
"cbor-extract": ["cbor-extract@2.2.0", "", { "dependencies": { "node-gyp-build-optional-packages": "5.1.1" }, "optionalDependencies": { "@cbor-extract/cbor-extract-darwin-arm64": "2.2.0" }, "bin": { "download-cbor-prebuilds": "bin/download-prebuilds.js" } }, ""],
+
+
"cbor-x": ["cbor-x@1.6.0", "", { "optionalDependencies": { "cbor-extract": "^2.2.0" } }, ""],
+
+
"cborg": ["cborg@1.10.2", "", { "bin": "cli.js" }, ""],
+
+
"content-disposition": ["content-disposition@0.5.4", "", { "dependencies": { "safe-buffer": "5.2.1" } }, ""],
+
+
"content-type": ["content-type@1.0.5", "", {}, ""],
+
+
"cookie": ["cookie@0.7.1", "", {}, ""],
+
+
"cookie-signature": ["cookie-signature@1.0.6", "", {}, ""],
+
+
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
+
+
"debug": ["debug@2.6.9", "", { "dependencies": { "ms": "2.0.0" } }, ""],
+
+
"depd": ["depd@2.0.0", "", {}, ""],
+
+
"destroy": ["destroy@1.2.0", "", {}, ""],
+
+
"detect-libc": ["detect-libc@2.1.2", "", {}, ""],
+
+
"dunder-proto": ["dunder-proto@1.0.1", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.1", "es-errors": "^1.3.0", "gopd": "^1.2.0" } }, ""],
+
+
"ee-first": ["ee-first@1.1.1", "", {}, ""],
+
+
"encodeurl": ["encodeurl@2.0.0", "", {}, ""],
+
+
"es-define-property": ["es-define-property@1.0.1", "", {}, ""],
+
+
"es-errors": ["es-errors@1.3.0", "", {}, ""],
+
+
"es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, ""],
+
+
"esbuild": ["esbuild@0.25.11", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.25.11", "@esbuild/android-arm": "0.25.11", "@esbuild/android-arm64": "0.25.11", "@esbuild/android-x64": "0.25.11", "@esbuild/darwin-arm64": "0.25.11", "@esbuild/darwin-x64": "0.25.11", "@esbuild/freebsd-arm64": "0.25.11", "@esbuild/freebsd-x64": "0.25.11", "@esbuild/linux-arm": "0.25.11", "@esbuild/linux-arm64": "0.25.11", "@esbuild/linux-ia32": "0.25.11", "@esbuild/linux-loong64": "0.25.11", "@esbuild/linux-mips64el": "0.25.11", "@esbuild/linux-ppc64": "0.25.11", "@esbuild/linux-riscv64": "0.25.11", "@esbuild/linux-s390x": "0.25.11", "@esbuild/linux-x64": "0.25.11", "@esbuild/netbsd-arm64": "0.25.11", "@esbuild/netbsd-x64": "0.25.11", "@esbuild/openbsd-arm64": "0.25.11", "@esbuild/openbsd-x64": "0.25.11", "@esbuild/openharmony-arm64": "0.25.11", "@esbuild/sunos-x64": "0.25.11", "@esbuild/win32-arm64": "0.25.11", "@esbuild/win32-ia32": "0.25.11", "@esbuild/win32-x64": "0.25.11" }, "bin": "bin/esbuild" }, "sha512-KohQwyzrKTQmhXDW1PjCv3Tyspn9n5GcY2RTDqeORIdIJY8yKIF7sTSopFmn/wpMPW4rdPXI0UE5LJLuq3bx0Q=="],
+
+
"escape-html": ["escape-html@1.0.3", "", {}, ""],
+
+
"etag": ["etag@1.8.1", "", {}, ""],
+
+
"event-target-shim": ["event-target-shim@5.0.1", "", {}, ""],
+
+
"eventemitter3": ["eventemitter3@4.0.7", "", {}, ""],
+
+
"events": ["events@3.3.0", "", {}, ""],
+
+
"express": ["express@4.21.2", "", { "dependencies": { "accepts": "~1.3.8", "array-flatten": "1.1.1", "body-parser": "1.20.3", "content-disposition": "0.5.4", "content-type": "~1.0.4", "cookie": "0.7.1", "cookie-signature": "1.0.6", "debug": "2.6.9", "depd": "2.0.0", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "etag": "~1.8.1", "finalhandler": "1.3.1", "fresh": "0.5.2", "http-errors": "2.0.0", "merge-descriptors": "1.0.3", "methods": "~1.1.2", "on-finished": "2.4.1", "parseurl": "~1.3.3", "path-to-regexp": "0.1.12", "proxy-addr": "~2.0.7", "qs": "6.13.0", "range-parser": "~1.2.1", "safe-buffer": "5.2.1", "send": "0.19.0", "serve-static": "1.16.2", "setprototypeof": "1.2.0", "statuses": "2.0.1", "type-is": "~1.6.18", "utils-merge": "1.0.1", "vary": "~1.1.2" } }, ""],
+
+
"fast-redact": ["fast-redact@3.5.0", "", {}, ""],
+
+
"finalhandler": ["finalhandler@1.3.1", "", { "dependencies": { "debug": "2.6.9", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "on-finished": "2.4.1", "parseurl": "~1.3.3", "statuses": "2.0.1", "unpipe": "~1.0.0" } }, ""],
+
+
"forwarded": ["forwarded@0.2.0", "", {}, ""],
+
+
"fresh": ["fresh@0.5.2", "", {}, ""],
+
+
"fsevents": ["fsevents@2.3.3", "", { "os": "darwin" }, "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw=="],
+
+
"function-bind": ["function-bind@1.1.2", "", {}, ""],
+
+
"get-intrinsic": ["get-intrinsic@1.3.0", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "es-define-property": "^1.0.1", "es-errors": "^1.3.0", "es-object-atoms": "^1.1.1", "function-bind": "^1.1.2", "get-proto": "^1.0.1", "gopd": "^1.2.0", "has-symbols": "^1.1.0", "hasown": "^2.0.2", "math-intrinsics": "^1.1.0" } }, ""],
+
+
"get-proto": ["get-proto@1.0.1", "", { "dependencies": { "dunder-proto": "^1.0.1", "es-object-atoms": "^1.0.0" } }, ""],
+
+
"get-tsconfig": ["get-tsconfig@4.13.0", "", { "dependencies": { "resolve-pkg-maps": "^1.0.0" } }, "sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ=="],
+
+
"gopd": ["gopd@1.2.0", "", {}, ""],
+
+
"graphemer": ["graphemer@1.4.0", "", {}, ""],
+
+
"has-symbols": ["has-symbols@1.1.0", "", {}, ""],
+
+
"hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, ""],
+
+
"hono": ["hono@4.10.4", "", {}, "sha512-YG/fo7zlU3KwrBL5vDpWKisLYiM+nVstBQqfr7gCPbSYURnNEP9BDxEMz8KfsDR9JX0lJWDRNc6nXX31v7ZEyg=="],
+
+
"http-errors": ["http-errors@2.0.0", "", { "dependencies": { "depd": "2.0.0", "inherits": "2.0.4", "setprototypeof": "1.2.0", "statuses": "2.0.1", "toidentifier": "1.0.1" } }, ""],
+
+
"iconv-lite": ["iconv-lite@0.4.24", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3" } }, ""],
+
+
"ieee754": ["ieee754@1.2.1", "", {}, ""],
+
+
"inherits": ["inherits@2.0.4", "", {}, ""],
+
+
"ipaddr.js": ["ipaddr.js@1.9.1", "", {}, ""],
+
+
"iso-datestring-validator": ["iso-datestring-validator@2.2.2", "", {}, ""],
+
+
"math-intrinsics": ["math-intrinsics@1.1.0", "", {}, ""],
+
+
"media-typer": ["media-typer@0.3.0", "", {}, ""],
+
+
"merge-descriptors": ["merge-descriptors@1.0.3", "", {}, ""],
+
+
"methods": ["methods@1.1.2", "", {}, ""],
+
+
"mime": ["mime@1.6.0", "", { "bin": "cli.js" }, ""],
+
+
"mime-db": ["mime-db@1.52.0", "", {}, ""],
+
+
"mime-types": ["mime-types@2.1.35", "", { "dependencies": { "mime-db": "1.52.0" } }, ""],
+
+
"ms": ["ms@2.0.0", "", {}, ""],
+
+
"multiformats": ["multiformats@13.4.1", "", {}, ""],
+
+
"negotiator": ["negotiator@0.6.3", "", {}, ""],
+
+
"node-gyp-build-optional-packages": ["node-gyp-build-optional-packages@5.1.1", "", { "dependencies": { "detect-libc": "^2.0.1" }, "bin": { "node-gyp-build-optional-packages": "bin.js", "node-gyp-build-optional-packages-optional": "optional.js", "node-gyp-build-optional-packages-test": "build-test.js" } }, ""],
+
+
"object-inspect": ["object-inspect@1.13.4", "", {}, ""],
+
+
"on-exit-leak-free": ["on-exit-leak-free@2.1.2", "", {}, ""],
+
+
"on-finished": ["on-finished@2.4.1", "", { "dependencies": { "ee-first": "1.1.1" } }, ""],
+
+
"p-finally": ["p-finally@1.0.0", "", {}, ""],
+
+
"p-queue": ["p-queue@6.6.2", "", { "dependencies": { "eventemitter3": "^4.0.4", "p-timeout": "^3.2.0" } }, ""],
+
+
"p-timeout": ["p-timeout@3.2.0", "", { "dependencies": { "p-finally": "^1.0.0" } }, ""],
+
+
"parseurl": ["parseurl@1.3.3", "", {}, ""],
+
+
"path-to-regexp": ["path-to-regexp@0.1.12", "", {}, ""],
+
+
"pino": ["pino@8.21.0", "", { "dependencies": { "atomic-sleep": "^1.0.0", "fast-redact": "^3.1.1", "on-exit-leak-free": "^2.1.0", "pino-abstract-transport": "^1.2.0", "pino-std-serializers": "^6.0.0", "process-warning": "^3.0.0", "quick-format-unescaped": "^4.0.3", "real-require": "^0.2.0", "safe-stable-stringify": "^2.3.1", "sonic-boom": "^3.7.0", "thread-stream": "^2.6.0" }, "bin": "bin.js" }, ""],
+
+
"pino-abstract-transport": ["pino-abstract-transport@1.2.0", "", { "dependencies": { "readable-stream": "^4.0.0", "split2": "^4.0.0" } }, ""],
+
+
"pino-std-serializers": ["pino-std-serializers@6.2.2", "", {}, ""],
+
+
"postgres": ["postgres@3.4.7", "", {}, ""],
+
+
"process": ["process@0.11.10", "", {}, ""],
+
+
"process-warning": ["process-warning@3.0.0", "", {}, ""],
+
+
"proxy-addr": ["proxy-addr@2.0.7", "", { "dependencies": { "forwarded": "0.2.0", "ipaddr.js": "1.9.1" } }, ""],
+
+
"qs": ["qs@6.13.0", "", { "dependencies": { "side-channel": "^1.0.6" } }, ""],
+
+
"quick-format-unescaped": ["quick-format-unescaped@4.0.4", "", {}, ""],
+
+
"range-parser": ["range-parser@1.2.1", "", {}, ""],
+
+
"rate-limiter-flexible": ["rate-limiter-flexible@2.4.2", "", {}, ""],
+
+
"raw-body": ["raw-body@2.5.2", "", { "dependencies": { "bytes": "3.1.2", "http-errors": "2.0.0", "iconv-lite": "0.4.24", "unpipe": "1.0.0" } }, ""],
+
+
"readable-stream": ["readable-stream@4.7.0", "", { "dependencies": { "abort-controller": "^3.0.0", "buffer": "^6.0.3", "events": "^3.3.0", "process": "^0.11.10", "string_decoder": "^1.3.0" } }, ""],
+
+
"real-require": ["real-require@0.2.0", "", {}, ""],
+
+
"resolve-pkg-maps": ["resolve-pkg-maps@1.0.0", "", {}, "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw=="],
+
+
"safe-buffer": ["safe-buffer@5.2.1", "", {}, ""],
+
+
"safe-stable-stringify": ["safe-stable-stringify@2.5.0", "", {}, ""],
+
+
"safer-buffer": ["safer-buffer@2.1.2", "", {}, ""],
+
+
"send": ["send@0.19.0", "", { "dependencies": { "debug": "2.6.9", "depd": "2.0.0", "destroy": "1.2.0", "encodeurl": "~1.0.2", "escape-html": "~1.0.3", "etag": "~1.8.1", "fresh": "0.5.2", "http-errors": "2.0.0", "mime": "1.6.0", "ms": "2.1.3", "on-finished": "2.4.1", "range-parser": "~1.2.1", "statuses": "2.0.1" } }, ""],
+
+
"serve-static": ["serve-static@1.16.2", "", { "dependencies": { "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "parseurl": "~1.3.3", "send": "0.19.0" } }, ""],
+
+
"setprototypeof": ["setprototypeof@1.2.0", "", {}, ""],
+
+
"side-channel": ["side-channel@1.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3", "side-channel-list": "^1.0.0", "side-channel-map": "^1.0.1", "side-channel-weakmap": "^1.0.2" } }, ""],
+
+
"side-channel-list": ["side-channel-list@1.0.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3" } }, ""],
+
+
"side-channel-map": ["side-channel-map@1.0.1", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3" } }, ""],
+
+
"side-channel-weakmap": ["side-channel-weakmap@1.0.2", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3", "side-channel-map": "^1.0.1" } }, ""],
+
+
"sonic-boom": ["sonic-boom@3.8.1", "", { "dependencies": { "atomic-sleep": "^1.0.0" } }, ""],
+
+
"split2": ["split2@4.2.0", "", {}, ""],
+
+
"statuses": ["statuses@2.0.1", "", {}, ""],
+
+
"string_decoder": ["string_decoder@1.3.0", "", { "dependencies": { "safe-buffer": "~5.2.0" } }, ""],
+
+
"thread-stream": ["thread-stream@2.7.0", "", { "dependencies": { "real-require": "^0.2.0" } }, ""],
+
+
"tlds": ["tlds@1.261.0", "", { "bin": "bin.js" }, ""],
+
+
"toidentifier": ["toidentifier@1.0.1", "", {}, ""],
+
+
"tsx": ["tsx@4.20.6", "", { "dependencies": { "esbuild": "~0.25.0", "get-tsconfig": "^4.7.5" }, "optionalDependencies": { "fsevents": "~2.3.3" }, "bin": "dist/cli.mjs" }, "sha512-ytQKuwgmrrkDTFP4LjR0ToE2nqgy886GpvRSpU0JAnrdBYppuY5rLkRUYPU1yCryb24SsKBTL/hlDQAEFVwtZg=="],
+
+
"type-is": ["type-is@1.6.18", "", { "dependencies": { "media-typer": "0.3.0", "mime-types": "~2.1.24" } }, ""],
+
+
"uint8arrays": ["uint8arrays@3.0.0", "", { "dependencies": { "multiformats": "^9.4.2" } }, ""],
+
+
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
+
+
"unpipe": ["unpipe@1.0.0", "", {}, ""],
+
+
"utils-merge": ["utils-merge@1.0.1", "", {}, ""],
+
+
"varint": ["varint@6.0.0", "", {}, ""],
+
+
"vary": ["vary@1.1.2", "", {}, ""],
+
+
"ws": ["ws@8.18.3", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": ">=5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, ""],
+
+
"zod": ["zod@3.25.76", "", {}, ""],
+
+
"@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@atproto/common/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@atproto/common-web/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@atproto/repo/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@atproto/sync/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"@ipld/dag-cbor/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
+
"send/encodeurl": ["encodeurl@1.0.2", "", {}, ""],
+
+
"send/ms": ["ms@2.1.3", "", {}, ""],
+
+
"uint8arrays/multiformats": ["multiformats@9.9.0", "", {}, ""],
+
}
+
}
+32
apps/hosting-service/docker-entrypoint.sh
···
#!/bin/sh
set -e

# Run different modes based on the MODE environment variable
# Modes:
#   - server (default): Start the hosting service
#   - backfill: Run cache backfill and exit
#   - backfill-server: Run cache backfill, then start the server

MODE="${MODE:-server}"

case "$MODE" in
  backfill)
    echo "🔄 Running in backfill-only mode..."
    exec npm run backfill
    ;;
  backfill-server)
    echo "🔄 Running backfill, then starting server..."
    npm run backfill
    echo "✅ Backfill complete, starting server..."
    exec npm run start
    ;;
  server)
    echo "🚀 Starting server..."
    exec npm run start
    ;;
  *)
    echo "❌ Unknown MODE: $MODE"
    echo "Valid modes: server, backfill, backfill-server"
    exit 1
    ;;
esac
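The mode dispatch above can be exercised outside Docker; a minimal standalone sketch (the function name and echoed strings are illustrative, not the service's real output):

```shell
# Illustrative re-creation of the entrypoint's MODE dispatch.
dispatch() {
  mode="${1:-server}"
  case "$mode" in
    backfill)        echo "backfill" ;;
    backfill-server) echo "backfill+server" ;;
    server)          echo "server" ;;
    *)               echo "unknown mode: $mode" >&2; return 1 ;;
  esac
}

dispatch                 # prints "server" (the default)
dispatch backfill-server # prints "backfill+server"
```

At deploy time the same selection happens through the container environment, e.g. `docker run -e MODE=backfill-server <image>`.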
+36
apps/hosting-service/package.json
···
{
  "name": "wisp-hosting-service",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "tsx --env-file=.env src/index.ts",
    "build": "tsc",
    "start": "tsx src/index.ts",
    "backfill": "tsx src/index.ts --backfill"
  },
  "dependencies": {
    "@wisp/lexicons": "workspace:*",
    "@wisp/constants": "workspace:*",
    "@wisp/observability": "workspace:*",
    "@wisp/atproto-utils": "workspace:*",
    "@wisp/database": "workspace:*",
    "@wisp/fs-utils": "workspace:*",
    "@wisp/safe-fetch": "workspace:*",
    "@atproto/api": "^0.17.4",
    "@atproto/identity": "^0.4.9",
    "@atproto/lexicon": "^0.5.1",
    "@atproto/sync": "^0.1.36",
    "@atproto/xrpc": "^0.7.5",
    "@hono/node-server": "^1.19.6",
    "hono": "^4.10.4",
    "mime-types": "^2.1.35",
    "multiformats": "^13.4.1",
    "postgres": "^3.4.5"
  },
  "devDependencies": {
    "@types/bun": "^1.3.1",
    "@types/mime-types": "^2.1.4",
    "@types/node": "^22.10.5",
    "tsx": "^4.19.2"
  }
}
+97
apps/hosting-service/src/index.ts
···
import app from './server';
import { serve } from '@hono/node-server';
import { FirehoseWorker } from './lib/firehose';
import { createLogger } from '@wisp/observability';
import { mkdirSync, existsSync } from 'fs';
import { backfillCache } from './lib/backfill';
import { startDomainCacheCleanup, stopDomainCacheCleanup, setCacheOnlyMode } from './lib/db';

const logger = createLogger('hosting-service');

const PORT = process.env.PORT ? parseInt(process.env.PORT, 10) : 3001;
const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';

// Parse CLI arguments
const args = process.argv.slice(2);
const hasBackfillFlag = args.includes('--backfill');
const backfillOnStartup = hasBackfillFlag || process.env.BACKFILL_ON_STARTUP === 'true';

// Cache-only mode: service will only cache files locally, no DB writes
const hasCacheOnlyFlag = args.includes('--cache-only');
export const CACHE_ONLY_MODE = hasCacheOnlyFlag || process.env.CACHE_ONLY_MODE === 'true';

// Configure cache-only mode in database module
if (CACHE_ONLY_MODE) {
  setCacheOnlyMode(true);
}

// Ensure cache directory exists
if (!existsSync(CACHE_DIR)) {
  mkdirSync(CACHE_DIR, { recursive: true });
  console.log('Created cache directory:', CACHE_DIR);
}

// Start domain cache cleanup
startDomainCacheCleanup();

// Start firehose worker with observability logger
const firehose = new FirehoseWorker((msg, data) => {
  logger.info(msg, data);
});

firehose.start();

// Run backfill if requested
if (backfillOnStartup) {
  console.log('🔄 Backfill requested, starting cache backfill...');
  backfillCache({
    skipExisting: true,
    concurrency: 3,
  }).then(() => {
    console.log('✅ Cache backfill completed');
  }).catch((err) => {
    console.error('❌ Cache backfill error:', err);
  });
}

// Add health check endpoint
app.get('/health', (c) => {
  const firehoseHealth = firehose.getHealth();
  return c.json({
    status: 'ok',
    firehose: firehoseHealth,
  });
});

// Start HTTP server with Node.js adapter
const server = serve({
  fetch: app.fetch,
  port: PORT,
});

console.log(`
Wisp Hosting Service

Server:     http://localhost:${PORT}
Health:     http://localhost:${PORT}/health
Cache:      ${CACHE_DIR}
Firehose:   Connected to Firehose
Cache-Only: ${CACHE_ONLY_MODE ? 'ENABLED (no DB writes)' : 'DISABLED'}
`);

// Graceful shutdown (same steps for SIGINT and SIGTERM)
const shutdown = () => {
  console.log('\n🛑 Shutting down...');
  firehose.stop();
  stopDomainCacheCleanup();
  server.close();
  process.exit(0);
};

process.on('SIGINT', shutdown);
process.on('SIGTERM', shutdown);
+150
apps/hosting-service/src/lib/backfill.ts
···
import { getAllSites } from './db';
import { fetchSiteRecord, getPdsForDid, downloadAndCacheSite, isCached } from './utils';
import { createLogger } from '@wisp/observability';
import { markSiteAsBeingCached, unmarkSiteAsBeingCached } from './cache';
import { clearRedirectRulesCache } from './site-cache';

const logger = createLogger('hosting-service');

export interface BackfillOptions {
  skipExisting?: boolean; // Skip sites already in cache
  concurrency?: number; // Number of sites to cache concurrently
  maxSites?: number; // Maximum number of sites to backfill (for testing)
}

export interface BackfillStats {
  total: number;
  cached: number;
  skipped: number;
  failed: number;
  duration: number;
}

/**
 * Backfill all sites from the database into the local cache
 */
export async function backfillCache(options: BackfillOptions = {}): Promise<BackfillStats> {
  const {
    skipExisting = true,
    concurrency = 10, // Increased from 3 to 10 for better parallelization
    maxSites,
  } = options;

  const startTime = Date.now();
  const stats: BackfillStats = {
    total: 0,
    cached: 0,
    skipped: 0,
    failed: 0,
    duration: 0,
  };

  logger.info('Starting cache backfill', { skipExisting, concurrency, maxSites });
  console.log(`
╔═══════════════════════════════════════════╗
║          CACHE BACKFILL STARTING          ║
╚═══════════════════════════════════════════╝
`);

  try {
    // Get all sites from database
    let sites = await getAllSites();
    stats.total = sites.length;

    logger.info(`Found ${sites.length} sites in database`);
    console.log(`📊 Found ${sites.length} sites in database`);

    // Limit if specified
    if (maxSites && maxSites > 0) {
      sites = sites.slice(0, maxSites);
      console.log(`⚙️ Limited to ${maxSites} sites for backfill`);
    }

    // Process sites in batches
    const batches: (typeof sites)[] = [];
    for (let i = 0; i < sites.length; i += concurrency) {
      batches.push(sites.slice(i, i + concurrency));
    }

    let processed = 0;
    for (const batch of batches) {
      await Promise.all(
        batch.map(async (site) => {
          try {
            // Check if already cached
            if (skipExisting && isCached(site.did, site.rkey)) {
              stats.skipped++;
              processed++;
              logger.debug(`Skipping already cached site`, { did: site.did, rkey: site.rkey });
              console.log(`⏭️ [${processed}/${sites.length}] Skipped (cached): ${site.display_name || site.rkey}`);
              return;
            }

            // Fetch site record
            const siteData = await fetchSiteRecord(site.did, site.rkey);
            if (!siteData) {
              stats.failed++;
              processed++;
              logger.error('Site record not found during backfill', null, { did: site.did, rkey: site.rkey });
              console.log(`❌ [${processed}/${sites.length}] Failed (not found): ${site.display_name || site.rkey}`);
              return;
            }

            // Get PDS endpoint
            const pdsEndpoint = await getPdsForDid(site.did);
            if (!pdsEndpoint) {
              stats.failed++;
              processed++;
              logger.error('PDS not found during backfill', null, { did: site.did });
              console.log(`❌ [${processed}/${sites.length}] Failed (no PDS): ${site.display_name || site.rkey}`);
              return;
            }

            // Mark site as being cached to prevent serving stale content during update
            markSiteAsBeingCached(site.did, site.rkey);

            try {
              // Download and cache site
              await downloadAndCacheSite(site.did, site.rkey, siteData.record, pdsEndpoint, siteData.cid);
              // Clear redirect rules cache since the site was updated
              clearRedirectRulesCache(site.did, site.rkey);
              stats.cached++;
              processed++;
              logger.info('Successfully cached site during backfill', { did: site.did, rkey: site.rkey });
              console.log(`✅ [${processed}/${sites.length}] Cached: ${site.display_name || site.rkey}`);
            } finally {
              // Always unmark, even if caching fails
              unmarkSiteAsBeingCached(site.did, site.rkey);
            }
          } catch (err) {
            stats.failed++;
            processed++;
            logger.error('Failed to cache site during backfill', err, { did: site.did, rkey: site.rkey });
            console.log(`❌ [${processed}/${sites.length}] Failed: ${site.display_name || site.rkey}`);
          }
        })
      );
    }

    stats.duration = Date.now() - startTime;

    console.log(`
╔═══════════════════════════════════════════╗
║         CACHE BACKFILL COMPLETED          ║
╚═══════════════════════════════════════════╝

📊 Total Sites: ${stats.total}
✅ Cached:      ${stats.cached}
⏭️ Skipped:     ${stats.skipped}
❌ Failed:      ${stats.failed}
⏱️ Duration:    ${(stats.duration / 1000).toFixed(2)}s
`);

    logger.info('Cache backfill completed', stats);
  } catch (err) {
    logger.error('Cache backfill failed', err);
    console.error('❌ Cache backfill failed:', err);
  }

  return stats;
}
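The batch loop above (slice the work list into groups of `concurrency`, `Promise.all` each group before starting the next) can be sketched as a standalone helper; `inBatches` is a hypothetical name, not part of the codebase:

```typescript
// Run an async function over items in fixed-size batches: all items in a
// batch run concurrently, but the next batch waits for the previous one.
async function inBatches<T, R>(
  items: T[],
  size: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const out: R[] = [];
  for (let i = 0; i < items.length; i += size) {
    const batch = items.slice(i, i + size);
    out.push(...(await Promise.all(batch.map(fn))));
  }
  return out;
}

// e.g. inBatches([1, 2, 3, 4, 5], 2, async (n) => n * 2)
// resolves to [2, 4, 6, 8, 10], two items in flight at a time
```

Unlike a work queue, this pattern lets a single slow site stall its whole batch, which is why the comment above notes the concurrency bump from 3 to 10.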
+196
apps/hosting-service/src/lib/cache.ts
···
// In-memory LRU cache for file contents and metadata

interface CacheEntry<T> {
  value: T;
  size: number;
  timestamp: number;
}

interface CacheStats {
  hits: number;
  misses: number;
  evictions: number;
  currentSize: number;
  currentCount: number;
}

export class LRUCache<T> {
  private cache: Map<string, CacheEntry<T>>;
  private maxSize: number;
  private maxCount: number;
  private currentSize: number;
  private stats: CacheStats;

  constructor(maxSize: number, maxCount: number) {
    this.cache = new Map();
    this.maxSize = maxSize;
    this.maxCount = maxCount;
    this.currentSize = 0;
    this.stats = {
      hits: 0,
      misses: 0,
      evictions: 0,
      currentSize: 0,
      currentCount: 0,
    };
  }

  get(key: string): T | null {
    const entry = this.cache.get(key);
    if (!entry) {
      this.stats.misses++;
      return null;
    }

    // Move to end (most recently used)
    this.cache.delete(key);
    this.cache.set(key, entry);

    this.stats.hits++;
    return entry.value;
  }

  set(key: string, value: T, size: number): void {
    // Remove existing entry if present
    if (this.cache.has(key)) {
      const existing = this.cache.get(key)!;
      this.currentSize -= existing.size;
      this.cache.delete(key);
    }

    // Evict entries if needed
    while (
      (this.cache.size >= this.maxCount || this.currentSize + size > this.maxSize) &&
      this.cache.size > 0
    ) {
      const firstKey = this.cache.keys().next().value;
      if (!firstKey) break; // Should never happen, but satisfy TypeScript
      const firstEntry = this.cache.get(firstKey);
      if (!firstEntry) break; // Should never happen, but satisfy TypeScript
      this.cache.delete(firstKey);
      this.currentSize -= firstEntry.size;
      this.stats.evictions++;
    }

    // Add new entry
    this.cache.set(key, {
      value,
      size,
      timestamp: Date.now(),
    });
    this.currentSize += size;

    // Update stats
    this.stats.currentSize = this.currentSize;
    this.stats.currentCount = this.cache.size;
  }

  delete(key: string): boolean {
    const entry = this.cache.get(key);
    if (!entry) return false;

    this.cache.delete(key);
    this.currentSize -= entry.size;
    this.stats.currentSize = this.currentSize;
    this.stats.currentCount = this.cache.size;
    return true;
  }

  // Invalidate all entries for a specific site
  invalidateSite(did: string, rkey: string): number {
    const prefix = `${did}:${rkey}:`;
    let count = 0;

    for (const key of Array.from(this.cache.keys())) {
      if (key.startsWith(prefix)) {
        this.delete(key);
        count++;
      }
    }

    return count;
  }

  // Get cache size
  size(): number {
    return this.cache.size;
  }

  clear(): void {
    this.cache.clear();
    this.currentSize = 0;
    this.stats.currentSize = 0;
    this.stats.currentCount = 0;
  }

  getStats(): CacheStats {
    return { ...this.stats };
  }

  // Get cache hit rate
  getHitRate(): number {
    const total = this.stats.hits + this.stats.misses;
    return total === 0 ? 0 : (this.stats.hits / total) * 100;
  }
}
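The eviction order the class implements rests on JavaScript's `Map` preserving insertion order: `get()` re-inserts an entry to mark it most recently used, and eviction removes from the front. A minimal standalone sketch of just that ordering (`MiniLRU` is illustrative and tracks only entry count, not byte size):

```typescript
// Map-based LRU ordering: front of the Map = least recently used.
class MiniLRU<T> {
  private m = new Map<string, T>();
  constructor(private maxCount: number) {}

  get(k: string): T | null {
    const v = this.m.get(k);
    if (v === undefined) return null;
    this.m.delete(k);
    this.m.set(k, v); // re-insert to move to the end (most recent)
    return v;
  }

  set(k: string, v: T): void {
    if (this.m.has(k)) this.m.delete(k);
    while (this.m.size >= this.maxCount) {
      const oldest = this.m.keys().next().value as string; // front entry
      this.m.delete(oldest);
    }
    this.m.set(k, v);
  }
}

const c = new MiniLRU<string>(2);
c.set('a', 'alpha');
c.set('b', 'beta');
c.get('a');            // touch 'a' so it becomes most recently used
c.set('c', 'gamma');   // capacity reached: evicts 'b', not 'a'
console.log(c.get('b')); // null
console.log(c.get('a')); // "alpha"
```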
// File metadata cache entry
export interface FileMetadata {
  encoding?: 'gzip';
  mimeType: string;
}

// Global cache instances
const FILE_CACHE_SIZE = 100 * 1024 * 1024; // 100MB
const FILE_CACHE_COUNT = 500;
const METADATA_CACHE_COUNT = 2000;

export const fileCache = new LRUCache<Buffer>(FILE_CACHE_SIZE, FILE_CACHE_COUNT);
export const metadataCache = new LRUCache<FileMetadata>(1024 * 1024, METADATA_CACHE_COUNT); // 1MB for metadata
export const rewrittenHtmlCache = new LRUCache<Buffer>(50 * 1024 * 1024, 200); // 50MB for rewritten HTML

// Helper to generate cache keys
export function getCacheKey(did: string, rkey: string, filePath: string, suffix?: string): string {
  const base = `${did}:${rkey}:${filePath}`;
  return suffix ? `${base}:${suffix}` : base;
}

// Invalidate all caches for a site
export function invalidateSiteCache(did: string, rkey: string): void {
  const fileCount = fileCache.invalidateSite(did, rkey);
  const metaCount = metadataCache.invalidateSite(did, rkey);
  const htmlCount = rewrittenHtmlCache.invalidateSite(did, rkey);

  console.log(`[Cache] Invalidated site ${did}:${rkey} - ${fileCount} files, ${metaCount} metadata, ${htmlCount} HTML`);
}

// Track sites currently being cached (to prevent serving stale cache during updates)
const sitesBeingCached = new Set<string>();

export function markSiteAsBeingCached(did: string, rkey: string): void {
  sitesBeingCached.add(`${did}:${rkey}`);
}

export function unmarkSiteAsBeingCached(did: string, rkey: string): void {
  sitesBeingCached.delete(`${did}:${rkey}`);
}

export function isSiteBeingCached(did: string, rkey: string): boolean {
  return sitesBeingCached.has(`${did}:${rkey}`);
}

// Get overall cache statistics
export function getCacheStats() {
  return {
    files: fileCache.getStats(),
    fileHitRate: fileCache.getHitRate(),
    metadata: metadataCache.getStats(),
    metadataHitRate: metadataCache.getHitRate(),
    rewrittenHtml: rewrittenHtmlCache.getStats(),
    rewrittenHtmlHitRate: rewrittenHtmlCache.getHitRate(),
    sitesBeingCached: sitesBeingCached.size,
  };
}
+216
apps/hosting-service/src/lib/db.ts
···
import postgres from 'postgres';
import { createHash } from 'crypto';
import type { DomainLookup, CustomDomainLookup } from '@wisp/database';

// Global cache-only mode flag (set by index.ts)
let cacheOnlyMode = false;

export function setCacheOnlyMode(enabled: boolean) {
  cacheOnlyMode = enabled;
  if (enabled) {
    console.log('[DB] Cache-only mode enabled - database writes will be skipped');
  }
}

const sql = postgres(
  process.env.DATABASE_URL || 'postgres://postgres:postgres@localhost:5432/wisp',
  {
    max: 10,
    idle_timeout: 20,
  }
);

// Domain lookup cache with TTL
const DOMAIN_CACHE_TTL = 5 * 60 * 1000; // 5 minutes

interface CachedDomain<T> {
  value: T;
  timestamp: number;
}

const domainCache = new Map<string, CachedDomain<DomainLookup | null>>();
const customDomainCache = new Map<string, CachedDomain<CustomDomainLookup | null>>();

let cleanupInterval: NodeJS.Timeout | null = null;

export function startDomainCacheCleanup() {
  if (cleanupInterval) return;

  cleanupInterval = setInterval(() => {
    const now = Date.now();

    for (const [key, entry] of domainCache.entries()) {
      if (now - entry.timestamp > DOMAIN_CACHE_TTL) {
        domainCache.delete(key);
      }
    }

    for (const [key, entry] of customDomainCache.entries()) {
      if (now - entry.timestamp > DOMAIN_CACHE_TTL) {
        customDomainCache.delete(key);
      }
    }
  }, 30 * 60 * 1000); // Run every 30 minutes (lookups check the TTL themselves; this only frees memory)
}

export function stopDomainCacheCleanup() {
  if (cleanupInterval) {
    clearInterval(cleanupInterval);
    cleanupInterval = null;
  }
}

export async function getWispDomain(domain: string): Promise<DomainLookup | null> {
  const key = domain.toLowerCase();

  // Check cache first
  const cached = domainCache.get(key);
  if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
    return cached.value;
  }

  // Query database
  const result = await sql<DomainLookup[]>`
    SELECT did, rkey FROM domains WHERE domain = ${key} LIMIT 1
  `;
  const data = result[0] || null;

  // Cache the result
  domainCache.set(key, { value: data, timestamp: Date.now() });

  return data;
}

export async function getCustomDomain(domain: string): Promise<CustomDomainLookup | null> {
  const key = domain.toLowerCase();

  // Check cache first
  const cached = customDomainCache.get(key);
  if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
    return cached.value;
  }

  // Query database
  const result = await sql<CustomDomainLookup[]>`
    SELECT id, domain, did, rkey, verified FROM custom_domains
    WHERE domain = ${key} AND verified = true LIMIT 1
  `;
  const data = result[0] || null;

  // Cache the result
  customDomainCache.set(key, { value: data, timestamp: Date.now() });

  return data;
}

export async function getCustomDomainByHash(hash: string): Promise<CustomDomainLookup | null> {
  const key = `hash:${hash}`;

  // Check cache first
  const cached = customDomainCache.get(key);
  if (cached && Date.now() - cached.timestamp < DOMAIN_CACHE_TTL) {
    return cached.value;
  }

  // Query database
  const result = await sql<CustomDomainLookup[]>`
    SELECT id, domain, did, rkey, verified FROM custom_domains
    WHERE id = ${hash} AND verified = true LIMIT 1
  `;
  const data = result[0] || null;

  // Cache the result
  customDomainCache.set(key, { value: data, timestamp: Date.now() });

  return data;
}

export async function upsertSite(did: string, rkey: string, displayName?: string) {
  // Skip database writes in cache-only mode
  if (cacheOnlyMode) {
    console.log('[DB] Skipping upsertSite (cache-only mode)', { did, rkey });
    return;
  }

  try {
    // Only set display_name if provided (not undefined/null/empty)
    const cleanDisplayName = displayName && displayName.trim() ? displayName.trim() : null;

    await sql`
      INSERT INTO sites (did, rkey, display_name, created_at, updated_at)
      VALUES (${did}, ${rkey}, ${cleanDisplayName}, EXTRACT(EPOCH FROM NOW()), EXTRACT(EPOCH FROM NOW()))
      ON CONFLICT (did, rkey)
      DO UPDATE SET
        display_name = CASE
          WHEN EXCLUDED.display_name IS NOT NULL THEN EXCLUDED.display_name
          ELSE sites.display_name
        END,
        updated_at = EXTRACT(EPOCH FROM NOW())
    `;
  } catch (err) {
    console.error('Failed to upsert site', err);
  }
}

export interface SiteRecord {
  did: string;
  rkey: string;
  display_name?: string;
}

export async function getAllSites(): Promise<SiteRecord[]> {
  try {
    const result = await sql<SiteRecord[]>`
      SELECT did, rkey, display_name FROM sites
      ORDER BY created_at DESC
    `;
    return result;
  } catch (err) {
    console.error('Failed to get all sites', err);
    return [];
  }
}

/**
 * Generate a numeric lock ID from a string key
 * PostgreSQL advisory locks use bigint (64-bit signed integer)
 */
function stringToLockId(key: string): bigint {
  const hash = createHash('sha256').update(key).digest('hex');
  // Take first 16 hex characters (64 bits) and convert to bigint
  const hashNum = BigInt('0x' + hash.substring(0, 16));
  // Keep within signed int64 range
  return hashNum & 0x7FFFFFFFFFFFFFFFn;
}

/**
 * Acquire a distributed lock using PostgreSQL advisory locks
 * Returns true if lock was acquired, false if already held by another instance
 * pg_try_advisory_lock takes a session-level lock: it is held until explicitly
 * released or the connection closes (it does not end with the transaction)
 */
export async function tryAcquireLock(key: string): Promise<boolean> {
  const lockId = stringToLockId(key);

  try {
    // Pass the bigint through directly; coercing via Number() would lose
    // precision for IDs above 2^53 and could target the wrong lock
    const result = await sql`SELECT pg_try_advisory_lock(${lockId}) as acquired`;
    return result[0]?.acquired === true;
  } catch (err) {
    console.error('Failed to acquire lock', { key, error: err });
    return false;
  }
}

/**
 * Release a distributed lock
 */
export async function releaseLock(key: string): Promise<void> {
  const lockId = stringToLockId(key);

  try {
    await sql`SELECT pg_advisory_unlock(${lockId})`;
  } catch (err) {
    console.error('Failed to release lock', { key, error: err });
  }
}

export { sql };
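The lock-ID derivation can be checked in isolation: every instance must map the same key to the same bigint, and the result must fit PostgreSQL's signed int64. A sketch with hypothetical keys (`lockIdFor` mirrors `stringToLockId` above):

```typescript
import { createHash } from 'crypto';

// Same derivation as stringToLockId: first 64 bits of the key's SHA-256,
// masked into the non-negative half of the signed int64 range.
function lockIdFor(key: string): bigint {
  const hash = createHash('sha256').update(key).digest('hex');
  return BigInt('0x' + hash.substring(0, 16)) & 0x7fffffffffffffffn;
}

// Hypothetical keys for illustration only.
const a = lockIdFor('backfill:did:plc:abc123');
const b = lockIdFor('backfill:did:plc:abc123');
const c = lockIdFor('backfill:did:plc:xyz789');

console.log(a === b); // true: deterministic, so all instances agree on the ID
console.log(a >= 0n && a <= 0x7fffffffffffffffn); // true: fits a Postgres bigint
```

Distinct keys can in principle collide on the same lock ID, but with 63 bits of hash the chance is negligible for the handful of keys used here.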
+839
apps/hosting-service/src/lib/file-serving.ts
···
/**
 * Core file serving logic for the hosting service
 * Handles file retrieval, caching, redirects, and HTML rewriting
 */

import { readFile } from 'fs/promises';
import { lookup } from 'mime-types';
import type { Record as WispSettings } from '@wisp/lexicons/types/place/wisp/settings';
import { shouldCompressMimeType } from '@wisp/atproto-utils/compression';
import { fileCache, metadataCache, rewrittenHtmlCache, getCacheKey, isSiteBeingCached } from './cache';
import { getCachedFilePath, getCachedSettings } from './utils';
import { loadRedirectRules, matchRedirectRule, parseCookies, parseQueryString } from './redirects';
import { rewriteHtmlPaths, isHtmlContent } from './html-rewriter';
import { generate404Page, generateDirectoryListing, siteUpdatingResponse } from './page-generators';
import { getIndexFiles, applyCustomHeaders, fileExists } from './request-utils';
import { getRedirectRulesFromCache, setRedirectRulesInCache } from './site-cache';

/**
 * Helper to serve files from cache (for custom domains and subdomains)
 */
export async function serveFromCache(
  did: string,
  rkey: string,
  filePath: string,
  fullUrl?: string,
  headers?: Record<string, string>
): Promise<Response> {
  // Load settings for this site
  const settings = await getCachedSettings(did, rkey);
  const indexFiles = getIndexFiles(settings);

  // Check for redirect rules first (_redirects wins over settings)
  let redirectRules = getRedirectRulesFromCache(did, rkey);

  if (redirectRules === undefined) {
    // Load rules for the first time
    redirectRules = await loadRedirectRules(did, rkey);
    setRedirectRulesInCache(did, rkey, redirectRules);
  }

  // Apply redirect rules if any exist
  if (redirectRules.length > 0) {
    const requestPath = '/' + (filePath || '');
    const queryParams = fullUrl ? parseQueryString(fullUrl) : {};
    const cookies = parseCookies(headers?.['cookie']);

    const redirectMatch = matchRedirectRule(requestPath, redirectRules, {
      queryParams,
      headers,
      cookies,
    });

    if (redirectMatch) {
      const { rule, targetPath, status } = redirectMatch;

      // If not forced, check if the requested file exists before redirecting
      if (!rule.force) {
        // Build the expected file path
        let checkPath: string = filePath || indexFiles[0] || 'index.html';
        if (checkPath.endsWith('/')) {
          checkPath += indexFiles[0] || 'index.html';
        }

        const cachedFile = getCachedFilePath(did, rkey, checkPath);
        const fileExistsOnDisk = await fileExists(cachedFile);

        // If file exists and redirect is not forced, serve the file normally
        if (fileExistsOnDisk) {
          return serveFileInternal(did, rkey, filePath, settings);
        }
      }

      // Handle different status codes
      if (status === 200) {
        // Rewrite: serve different content but keep URL the same
        // Remove leading slash for internal path resolution
        const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
        return serveFileInternal(did, rkey, rewritePath, settings);
      } else if (status === 301 || status === 302) {
        // External redirect: change the URL
        return new Response(null, {
          status,
          headers: {
            'Location': targetPath,
            'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',
          },
        });
      } else if (status === 404) {
        // Custom 404 page from _redirects (wins over settings.custom404)
        const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
        const response = await serveFileInternal(did, rkey, custom404Path, settings);
        // Override status to 404
        return new Response(response.body, {
          status: 404,
          headers: response.headers,
        });
      }
    }
  }

  // No redirect matched, serve normally with settings
  return serveFileInternal(did, rkey, filePath, settings);
}
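`matchRedirectRule` itself lives in `./redirects` and is not part of this diff; as a hypothetical illustration of the Netlify-style splat rewriting such `_redirects` rules typically perform (e.g. `/old/* /new/:splat 301`):

```typescript
// Illustrative mini-matcher, not the service's real rule engine.
// "/old/* /new/:splat" maps /old/a/b to /new/a/b; exact patterns must
// match the whole path; returns null when the rule does not apply.
function applySplat(pattern: string, target: string, path: string): string | null {
  if (!pattern.endsWith('/*')) {
    return path === pattern ? target : null;
  }
  const prefix = pattern.slice(0, -1); // drop '*', keep the trailing slash
  if (!path.startsWith(prefix)) return null;
  return target.replace(':splat', path.slice(prefix.length));
}

console.log(applySplat('/old/*', '/new/:splat', '/old/a/b')); // "/new/a/b"
console.log(applySplat('/old/*', '/new/:splat', '/other'));   // null
```

The `force` flag handled above is what distinguishes shadowing rules (apply even when the file exists) from fallback rules (apply only on a miss).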
+
/**
+
* Internal function to serve a file (used by both normal serving and rewrites)
+
*/
+
export async function serveFileInternal(
+
did: string,
+
rkey: string,
+
filePath: string,
+
settings: WispSettings | null = null
+
): Promise<Response> {
+
// Check if site is currently being cached - if so, return updating response
+
if (isSiteBeingCached(did, rkey)) {
+
return siteUpdatingResponse();
+
}
+
+
const indexFiles = getIndexFiles(settings);
+
+
// Normalize the request path (keep empty for root, remove trailing slash for others)
+
let requestPath = filePath || '';
+
if (requestPath.endsWith('/') && requestPath.length > 1) {
+
requestPath = requestPath.slice(0, -1);
+
}
+
+
// Check if this path is a directory first
+
const directoryPath = getCachedFilePath(did, rkey, requestPath);
+
if (await fileExists(directoryPath)) {
+
const { stat, readdir } = await import('fs/promises');
+
try {
+
const stats = await stat(directoryPath);
+
if (stats.isDirectory()) {
+
// It's a directory, try each index file in order
+
for (const indexFile of indexFiles) {
+
const indexPath = requestPath ? `${requestPath}/${indexFile}` : indexFile;
+
const indexFilePath = getCachedFilePath(did, rkey, indexPath);
+
if (await fileExists(indexFilePath)) {
+
return serveFileInternal(did, rkey, indexPath, settings);
+
}
+
}
+
// No index file found - check if directory listing is enabled
+
if (settings?.directoryListing) {
+
const { stat } = await import('fs/promises');
+
const entries = await readdir(directoryPath);
+
// Filter out .meta files and other hidden files
+
const visibleEntries = entries.filter(entry => !entry.endsWith('.meta') && entry !== '.metadata.json');
+
+
// Check which entries are directories
+
const entriesWithType = await Promise.all(
+
visibleEntries.map(async (name) => {
+
try {
+
const entryPath = `${directoryPath}/${name}`;
+
const stats = await stat(entryPath);
+
return { name, isDirectory: stats.isDirectory() };
+
} catch {
+
return { name, isDirectory: false };
+
}
+
})
+
);
+
+
const html = generateDirectoryListing(requestPath, entriesWithType);
+
return new Response(html, {
+
headers: {
+
'Content-Type': 'text/html; charset=utf-8',
+
'Cache-Control': 'public, max-age=300',
+
},
+
});
+
}
+
// Fall through to 404/SPA handling
+
}
+
} catch (err) {
+
// If stat fails, continue with normal flow
+
}
+
}

    // Not a directory, try to serve as a file
    const fileRequestPath: string = requestPath || indexFiles[0] || 'index.html';
    const cacheKey = getCacheKey(did, rkey, fileRequestPath);
    const cachedFile = getCachedFilePath(did, rkey, fileRequestPath);

    // Check in-memory cache first
    let content = fileCache.get(cacheKey);
    let meta = metadataCache.get(cacheKey);

    if (!content && await fileExists(cachedFile)) {
        // Read from disk and cache
        content = await readFile(cachedFile);
        fileCache.set(cacheKey, content, content.length);

        const metaFile = `${cachedFile}.meta`;
        if (await fileExists(metaFile)) {
            const metaJson = await readFile(metaFile, 'utf-8');
            meta = JSON.parse(metaJson);
            metadataCache.set(cacheKey, meta!, JSON.stringify(meta).length);
        }
    }

    if (content) {
        // Build headers with caching
        const headers: Record<string, string> = {};

        if (meta && meta.encoding === 'gzip' && meta.mimeType) {
            const shouldServeCompressed = shouldCompressMimeType(meta.mimeType);

            if (!shouldServeCompressed) {
                // Verify content is actually gzipped before attempting decompression
                const isGzipped = content.length >= 2 && content[0] === 0x1f && content[1] === 0x8b;
                if (isGzipped) {
                    const { gunzipSync } = await import('zlib');
                    const decompressed = gunzipSync(content);
                    headers['Content-Type'] = meta.mimeType;
                    headers['Cache-Control'] = 'public, max-age=31536000, immutable';
                    applyCustomHeaders(headers, fileRequestPath, settings);
                    return new Response(decompressed, { headers });
                } else {
                    // Meta says gzipped but content isn't - serve as-is
                    console.warn(`File ${fileRequestPath} has gzip encoding in meta but content lacks gzip magic bytes`);
                    headers['Content-Type'] = meta.mimeType;
                    headers['Cache-Control'] = 'public, max-age=31536000, immutable';
                    applyCustomHeaders(headers, fileRequestPath, settings);
                    return new Response(content, { headers });
                }
            }

            headers['Content-Type'] = meta.mimeType;
            headers['Content-Encoding'] = 'gzip';
            headers['Cache-Control'] = meta.mimeType.startsWith('text/html')
                ? 'public, max-age=300'
                : 'public, max-age=31536000, immutable';
            applyCustomHeaders(headers, fileRequestPath, settings);
            return new Response(content, { headers });
        }

        // Non-compressed files
        const mimeType = lookup(cachedFile) || 'application/octet-stream';
        headers['Content-Type'] = mimeType;
        headers['Cache-Control'] = mimeType.startsWith('text/html')
            ? 'public, max-age=300'
            : 'public, max-age=31536000, immutable';
        applyCustomHeaders(headers, fileRequestPath, settings);
        return new Response(content, { headers });
    }

    // Try index files for directory-like paths
    if (!fileRequestPath.includes('.')) {
        for (const indexFileName of indexFiles) {
            const indexPath = fileRequestPath ? `${fileRequestPath}/${indexFileName}` : indexFileName;
            const indexCacheKey = getCacheKey(did, rkey, indexPath);
            const indexFile = getCachedFilePath(did, rkey, indexPath);

            let indexContent = fileCache.get(indexCacheKey);
            let indexMeta = metadataCache.get(indexCacheKey);

            if (!indexContent && await fileExists(indexFile)) {
                indexContent = await readFile(indexFile);
                fileCache.set(indexCacheKey, indexContent, indexContent.length);

                const indexMetaFile = `${indexFile}.meta`;
                if (await fileExists(indexMetaFile)) {
                    const metaJson = await readFile(indexMetaFile, 'utf-8');
                    indexMeta = JSON.parse(metaJson);
                    metadataCache.set(indexCacheKey, indexMeta!, JSON.stringify(indexMeta).length);
                }
            }

            if (indexContent) {
                const headers: Record<string, string> = {
                    'Content-Type': 'text/html; charset=utf-8',
                    'Cache-Control': 'public, max-age=300',
                };

                if (indexMeta && indexMeta.encoding === 'gzip') {
                    headers['Content-Encoding'] = 'gzip';
                }

                applyCustomHeaders(headers, indexPath, settings);
                return new Response(indexContent, { headers });
            }
        }
    }

    // Try clean URLs: /about -> /about.html
    if (settings?.cleanUrls && !fileRequestPath.includes('.')) {
        const htmlPath = `${fileRequestPath}.html`;
        const htmlFile = getCachedFilePath(did, rkey, htmlPath);
        if (await fileExists(htmlFile)) {
            return serveFileInternal(did, rkey, htmlPath, settings);
        }

        // Also try /about/index.html
        for (const indexFileName of indexFiles) {
            const indexPath = fileRequestPath ? `${fileRequestPath}/${indexFileName}` : indexFileName;
            const indexFile = getCachedFilePath(did, rkey, indexPath);
            if (await fileExists(indexFile)) {
                return serveFileInternal(did, rkey, indexPath, settings);
            }
        }
    }

    // SPA mode: serve the SPA file for all non-existing routes (wins over custom404 but loses to _redirects)
    if (settings?.spaMode) {
        const spaFile = settings.spaMode;
        const spaFilePath = getCachedFilePath(did, rkey, spaFile);
        if (await fileExists(spaFilePath)) {
            return serveFileInternal(did, rkey, spaFile, settings);
        }
    }

    // Custom 404: serve the configured 404 file (wins over auto-detected 404 pages)
    if (settings?.custom404) {
        const custom404File = settings.custom404;
        const custom404Path = getCachedFilePath(did, rkey, custom404File);
        if (await fileExists(custom404Path)) {
            const response: Response = await serveFileInternal(did, rkey, custom404File, settings);
            // Override status to 404
            return new Response(response.body, {
                status: 404,
                headers: response.headers,
            });
        }
    }

    // Autodetect 404 pages (GitHub Pages: 404.html, Neocities/Nekoweb: not_found.html)
    const auto404Pages = ['404.html', 'not_found.html'];
    for (const auto404Page of auto404Pages) {
        const auto404Path = getCachedFilePath(did, rkey, auto404Page);
        if (await fileExists(auto404Path)) {
            const response: Response = await serveFileInternal(did, rkey, auto404Page, settings);
            // Override status to 404
            return new Response(response.body, {
                status: 404,
                headers: response.headers,
            });
        }
    }

    // Directory listing fallback: if enabled, show the root directory listing on 404
    if (settings?.directoryListing) {
        const rootPath = getCachedFilePath(did, rkey, '');
        if (await fileExists(rootPath)) {
            const { stat, readdir } = await import('fs/promises');
            try {
                const stats = await stat(rootPath);
                if (stats.isDirectory()) {
                    const entries = await readdir(rootPath);
                    // Filter out .meta files and metadata
                    const visibleEntries = entries.filter(entry =>
                        !entry.endsWith('.meta') && entry !== '.metadata.json'
                    );

                    // Check which entries are directories
                    const entriesWithType = await Promise.all(
                        visibleEntries.map(async (name) => {
                            try {
                                const entryPath = `${rootPath}/${name}`;
                                const entryStats = await stat(entryPath);
                                return { name, isDirectory: entryStats.isDirectory() };
                            } catch {
                                return { name, isDirectory: false };
                            }
                        })
                    );

                    const html = generateDirectoryListing('', entriesWithType);
                    return new Response(html, {
                        status: 404,
                        headers: {
                            'Content-Type': 'text/html; charset=utf-8',
                            'Cache-Control': 'public, max-age=300',
                        },
                    });
                }
            } catch (err) {
                // If directory listing fails, fall through to 404
            }
        }
    }

    // Default styled 404 page
    const html = generate404Page();
    return new Response(html, {
        status: 404,
        headers: {
            'Content-Type': 'text/html; charset=utf-8',
            'Cache-Control': 'public, max-age=300',
        },
    });
}
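
The in-memory-then-disk lookup used above is a read-through cache. A minimal standalone sketch of the pattern (the names `memory` and `readThrough` are illustrative, not from the codebase; the real code also tracks entry sizes and a `.meta` sidecar file):

```typescript
import { readFile } from 'node:fs/promises';

// Read-through cache sketch: check memory first, fall back to disk,
// and populate the in-memory cache on a miss.
const memory = new Map<string, Buffer>();

async function readThrough(key: string, diskPath: string): Promise<Buffer | null> {
    const hit = memory.get(key);
    if (hit) return hit;
    try {
        const content = await readFile(diskPath);
        memory.set(key, content); // populate cache for subsequent requests
        return content;
    } catch {
        return null; // not cached and not on disk
    }
}
```

Subsequent requests for the same key never touch the filesystem, which is why cache invalidation on site updates (see `invalidateSiteCache`) matters.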

/**
 * Helper to serve files from cache with HTML path rewriting for sites.wisp.place routes
 */
export async function serveFromCacheWithRewrite(
    did: string,
    rkey: string,
    filePath: string,
    basePath: string,
    fullUrl?: string,
    headers?: Record<string, string>
): Promise<Response> {
    // Load settings for this site
    const settings = await getCachedSettings(did, rkey);
    const indexFiles = getIndexFiles(settings);

    // Check for redirect rules first (_redirects wins over settings)
    let redirectRules = getRedirectRulesFromCache(did, rkey);

    if (redirectRules === undefined) {
        // Load rules for the first time
        redirectRules = await loadRedirectRules(did, rkey);
        setRedirectRulesInCache(did, rkey, redirectRules);
    }

    // Apply redirect rules if any exist
    if (redirectRules.length > 0) {
        const requestPath = '/' + (filePath || '');
        const queryParams = fullUrl ? parseQueryString(fullUrl) : {};
        const cookies = parseCookies(headers?.['cookie']);

        const redirectMatch = matchRedirectRule(requestPath, redirectRules, {
            queryParams,
            headers,
            cookies,
        });

        if (redirectMatch) {
            const { rule, targetPath, status } = redirectMatch;

            // If not forced, check if the requested file exists before redirecting
            if (!rule.force) {
                // Build the expected file path
                let checkPath: string = filePath || indexFiles[0] || 'index.html';
                if (checkPath.endsWith('/')) {
                    checkPath += indexFiles[0] || 'index.html';
                }

                const cachedFile = getCachedFilePath(did, rkey, checkPath);
                const fileExistsOnDisk = await fileExists(cachedFile);

                // If the file exists and the redirect is not forced, serve the file normally
                if (fileExistsOnDisk) {
                    return serveFileInternalWithRewrite(did, rkey, filePath, basePath, settings);
                }
            }

            // Handle different status codes
            if (status === 200) {
                // Rewrite: serve different content but keep the URL the same
                const rewritePath = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
                return serveFileInternalWithRewrite(did, rkey, rewritePath, basePath, settings);
            } else if (status === 301 || status === 302) {
                // External redirect: change the URL
                // For sites.wisp.place, adjust the target path to include the base path
                // unless it's an absolute URL
                let redirectTarget = targetPath;
                if (!targetPath.startsWith('http://') && !targetPath.startsWith('https://')) {
                    redirectTarget = basePath + (targetPath.startsWith('/') ? targetPath.slice(1) : targetPath);
                }
                return new Response(null, {
                    status,
                    headers: {
                        'Location': redirectTarget,
                        'Cache-Control': status === 301 ? 'public, max-age=31536000' : 'public, max-age=0',
                    },
                });
            } else if (status === 404) {
                // Custom 404 page from _redirects (wins over settings.custom404)
                const custom404Path = targetPath.startsWith('/') ? targetPath.slice(1) : targetPath;
                const response = await serveFileInternalWithRewrite(did, rkey, custom404Path, basePath, settings);
                // Override status to 404
                return new Response(response.body, {
                    status: 404,
                    headers: response.headers,
                });
            }
        }
    }

    // No redirect matched, serve normally with settings
    return serveFileInternalWithRewrite(did, rkey, filePath, basePath, settings);
}

/**
 * Internal function to serve a file with HTML path rewriting
 */
export async function serveFileInternalWithRewrite(
    did: string,
    rkey: string,
    filePath: string,
    basePath: string,
    settings: WispSettings | null = null
): Promise<Response> {
    // Check if the site is currently being cached - if so, return an updating response
    if (isSiteBeingCached(did, rkey)) {
        return siteUpdatingResponse();
    }

    const indexFiles = getIndexFiles(settings);

    // Normalize the request path (keep empty for root, remove trailing slash for others)
    let requestPath = filePath || '';
    if (requestPath.endsWith('/') && requestPath.length > 1) {
        requestPath = requestPath.slice(0, -1);
    }

    // Check if this path is a directory first
    const directoryPath = getCachedFilePath(did, rkey, requestPath);
    if (await fileExists(directoryPath)) {
        const { stat, readdir } = await import('fs/promises');
        try {
            const stats = await stat(directoryPath);
            if (stats.isDirectory()) {
                // It's a directory, try each index file in order
                for (const indexFile of indexFiles) {
                    const indexPath = requestPath ? `${requestPath}/${indexFile}` : indexFile;
                    const indexFilePath = getCachedFilePath(did, rkey, indexPath);
                    if (await fileExists(indexFilePath)) {
                        return serveFileInternalWithRewrite(did, rkey, indexPath, basePath, settings);
                    }
                }
                // No index file found - check if directory listing is enabled
                if (settings?.directoryListing) {
                    const entries = await readdir(directoryPath);
                    // Filter out .meta files and other hidden files
                    const visibleEntries = entries.filter(entry => !entry.endsWith('.meta') && entry !== '.metadata.json');

                    // Check which entries are directories
                    const entriesWithType = await Promise.all(
                        visibleEntries.map(async (name) => {
                            try {
                                const entryPath = `${directoryPath}/${name}`;
                                const stats = await stat(entryPath);
                                return { name, isDirectory: stats.isDirectory() };
                            } catch {
                                return { name, isDirectory: false };
                            }
                        })
                    );

                    const html = generateDirectoryListing(requestPath, entriesWithType);
                    return new Response(html, {
                        headers: {
                            'Content-Type': 'text/html; charset=utf-8',
                            'Cache-Control': 'public, max-age=300',
                        },
                    });
                }
                // Fall through to 404/SPA handling
            }
        } catch (err) {
            // If stat fails, continue with normal flow
        }
    }

    // Not a directory, try to serve as a file
    const fileRequestPath: string = requestPath || indexFiles[0] || 'index.html';
    const cacheKey = getCacheKey(did, rkey, fileRequestPath);
    const cachedFile = getCachedFilePath(did, rkey, fileRequestPath);

    // Check for rewritten HTML in cache first (if it's HTML)
    const mimeTypeGuess = lookup(fileRequestPath) || 'application/octet-stream';
    if (isHtmlContent(fileRequestPath, mimeTypeGuess)) {
        const rewrittenKey = getCacheKey(did, rkey, fileRequestPath, `rewritten:${basePath}`);
        const rewrittenContent = rewrittenHtmlCache.get(rewrittenKey);
        if (rewrittenContent) {
            const headers: Record<string, string> = {
                'Content-Type': 'text/html; charset=utf-8',
                'Content-Encoding': 'gzip',
                'Cache-Control': 'public, max-age=300',
            };
            applyCustomHeaders(headers, fileRequestPath, settings);
            return new Response(rewrittenContent, { headers });
        }
    }

    // Check in-memory file cache
    let content = fileCache.get(cacheKey);
    let meta = metadataCache.get(cacheKey);

    if (!content && await fileExists(cachedFile)) {
        // Read from disk and cache
        content = await readFile(cachedFile);
        fileCache.set(cacheKey, content, content.length);

        const metaFile = `${cachedFile}.meta`;
        if (await fileExists(metaFile)) {
            const metaJson = await readFile(metaFile, 'utf-8');
            meta = JSON.parse(metaJson);
            metadataCache.set(cacheKey, meta!, JSON.stringify(meta).length);
        }
    }

    if (content) {
        const mimeType = meta?.mimeType || lookup(cachedFile) || 'application/octet-stream';
        const isGzipped = meta?.encoding === 'gzip';

        // Check if this is HTML content that needs rewriting
        if (isHtmlContent(fileRequestPath, mimeType)) {
            let htmlContent: string;
            if (isGzipped) {
                // Verify content is actually gzipped
                const hasGzipMagic = content.length >= 2 && content[0] === 0x1f && content[1] === 0x8b;
                if (hasGzipMagic) {
                    const { gunzipSync } = await import('zlib');
                    htmlContent = gunzipSync(content).toString('utf-8');
                } else {
                    console.warn(`File ${fileRequestPath} marked as gzipped but lacks magic bytes, serving as-is`);
                    htmlContent = content.toString('utf-8');
                }
            } else {
                htmlContent = content.toString('utf-8');
            }
            const rewritten = rewriteHtmlPaths(htmlContent, basePath, fileRequestPath);

            // Recompress and cache the rewritten HTML
            const { gzipSync } = await import('zlib');
            const recompressed = gzipSync(Buffer.from(rewritten, 'utf-8'));

            const rewrittenKey = getCacheKey(did, rkey, fileRequestPath, `rewritten:${basePath}`);
            rewrittenHtmlCache.set(rewrittenKey, recompressed, recompressed.length);

            const htmlHeaders: Record<string, string> = {
                'Content-Type': 'text/html; charset=utf-8',
                'Content-Encoding': 'gzip',
                'Cache-Control': 'public, max-age=300',
            };
            applyCustomHeaders(htmlHeaders, fileRequestPath, settings);
            return new Response(recompressed, { headers: htmlHeaders });
        }

        // Non-HTML files: serve as-is
        const headers: Record<string, string> = {
            'Content-Type': mimeType,
            'Cache-Control': 'public, max-age=31536000, immutable',
        };

        if (isGzipped) {
            const shouldServeCompressed = shouldCompressMimeType(mimeType);
            if (!shouldServeCompressed) {
                // Verify content is actually gzipped
                const hasGzipMagic = content.length >= 2 && content[0] === 0x1f && content[1] === 0x8b;
                if (hasGzipMagic) {
                    const { gunzipSync } = await import('zlib');
                    const decompressed = gunzipSync(content);
                    applyCustomHeaders(headers, fileRequestPath, settings);
                    return new Response(decompressed, { headers });
                } else {
                    console.warn(`File ${fileRequestPath} marked as gzipped but lacks magic bytes, serving as-is`);
                    applyCustomHeaders(headers, fileRequestPath, settings);
                    return new Response(content, { headers });
                }
            }
            headers['Content-Encoding'] = 'gzip';
        }

        applyCustomHeaders(headers, fileRequestPath, settings);
        return new Response(content, { headers });
    }

    // Try index files for directory-like paths
    if (!fileRequestPath.includes('.')) {
        for (const indexFileName of indexFiles) {
            const indexPath = fileRequestPath ? `${fileRequestPath}/${indexFileName}` : indexFileName;
            const indexCacheKey = getCacheKey(did, rkey, indexPath);
            const indexFile = getCachedFilePath(did, rkey, indexPath);

            // Check for a rewritten index file in cache
            const rewrittenKey = getCacheKey(did, rkey, indexPath, `rewritten:${basePath}`);
            const rewrittenContent = rewrittenHtmlCache.get(rewrittenKey);
            if (rewrittenContent) {
                const headers: Record<string, string> = {
                    'Content-Type': 'text/html; charset=utf-8',
                    'Content-Encoding': 'gzip',
                    'Cache-Control': 'public, max-age=300',
                };
                applyCustomHeaders(headers, indexPath, settings);
                return new Response(rewrittenContent, { headers });
            }

            let indexContent = fileCache.get(indexCacheKey);
            let indexMeta = metadataCache.get(indexCacheKey);

            if (!indexContent && await fileExists(indexFile)) {
                indexContent = await readFile(indexFile);
                fileCache.set(indexCacheKey, indexContent, indexContent.length);

                const indexMetaFile = `${indexFile}.meta`;
                if (await fileExists(indexMetaFile)) {
                    const metaJson = await readFile(indexMetaFile, 'utf-8');
                    indexMeta = JSON.parse(metaJson);
                    metadataCache.set(indexCacheKey, indexMeta!, JSON.stringify(indexMeta).length);
                }
            }

            if (indexContent) {
                const isGzipped = indexMeta?.encoding === 'gzip';

                let htmlContent: string;
                if (isGzipped) {
                    // Verify content is actually gzipped
                    const hasGzipMagic = indexContent.length >= 2 && indexContent[0] === 0x1f && indexContent[1] === 0x8b;
                    if (hasGzipMagic) {
                        const { gunzipSync } = await import('zlib');
                        htmlContent = gunzipSync(indexContent).toString('utf-8');
                    } else {
                        console.warn('Index file marked as gzipped but lacks magic bytes, serving as-is');
                        htmlContent = indexContent.toString('utf-8');
                    }
                } else {
                    htmlContent = indexContent.toString('utf-8');
                }
                const rewritten = rewriteHtmlPaths(htmlContent, basePath, indexPath);

                const { gzipSync } = await import('zlib');
                const recompressed = gzipSync(Buffer.from(rewritten, 'utf-8'));

                rewrittenHtmlCache.set(rewrittenKey, recompressed, recompressed.length);

                const headers: Record<string, string> = {
                    'Content-Type': 'text/html; charset=utf-8',
                    'Content-Encoding': 'gzip',
                    'Cache-Control': 'public, max-age=300',
                };
                applyCustomHeaders(headers, indexPath, settings);
                return new Response(recompressed, { headers });
            }
        }
    }

    // Try clean URLs: /about -> /about.html
    if (settings?.cleanUrls && !fileRequestPath.includes('.')) {
        const htmlPath = `${fileRequestPath}.html`;
        const htmlFile = getCachedFilePath(did, rkey, htmlPath);
        if (await fileExists(htmlFile)) {
            return serveFileInternalWithRewrite(did, rkey, htmlPath, basePath, settings);
        }

        // Also try /about/index.html
        for (const indexFileName of indexFiles) {
            const indexPath = fileRequestPath ? `${fileRequestPath}/${indexFileName}` : indexFileName;
            const indexFile = getCachedFilePath(did, rkey, indexPath);
            if (await fileExists(indexFile)) {
                return serveFileInternalWithRewrite(did, rkey, indexPath, basePath, settings);
            }
        }
    }

    // SPA mode: serve the SPA file for all non-existing routes
    if (settings?.spaMode) {
        const spaFile = settings.spaMode;
        const spaFilePath = getCachedFilePath(did, rkey, spaFile);
        if (await fileExists(spaFilePath)) {
            return serveFileInternalWithRewrite(did, rkey, spaFile, basePath, settings);
        }
    }

    // Custom 404: serve the configured 404 file (wins over auto-detected 404 pages)
    if (settings?.custom404) {
        const custom404File = settings.custom404;
        const custom404Path = getCachedFilePath(did, rkey, custom404File);
        if (await fileExists(custom404Path)) {
            const response: Response = await serveFileInternalWithRewrite(did, rkey, custom404File, basePath, settings);
            // Override status to 404
            return new Response(response.body, {
                status: 404,
                headers: response.headers,
            });
        }
    }

    // Autodetect 404 pages (GitHub Pages: 404.html, Neocities/Nekoweb: not_found.html)
    const auto404Pages = ['404.html', 'not_found.html'];
    for (const auto404Page of auto404Pages) {
        const auto404Path = getCachedFilePath(did, rkey, auto404Page);
        if (await fileExists(auto404Path)) {
            const response: Response = await serveFileInternalWithRewrite(did, rkey, auto404Page, basePath, settings);
            // Override status to 404
            return new Response(response.body, {
                status: 404,
                headers: response.headers,
            });
        }
    }

    // Directory listing fallback: if enabled, show the root directory listing on 404
    if (settings?.directoryListing) {
        const rootPath = getCachedFilePath(did, rkey, '');
        if (await fileExists(rootPath)) {
            const { stat, readdir } = await import('fs/promises');
            try {
                const stats = await stat(rootPath);
                if (stats.isDirectory()) {
                    const entries = await readdir(rootPath);
                    // Filter out .meta files and metadata
                    const visibleEntries = entries.filter(entry =>
                        !entry.endsWith('.meta') && entry !== '.metadata.json'
                    );

                    // Check which entries are directories
                    const entriesWithType = await Promise.all(
                        visibleEntries.map(async (name) => {
                            try {
                                const entryPath = `${rootPath}/${name}`;
                                const entryStats = await stat(entryPath);
                                return { name, isDirectory: entryStats.isDirectory() };
                            } catch {
                                return { name, isDirectory: false };
                            }
                        })
                    );

                    const html = generateDirectoryListing('', entriesWithType);
                    return new Response(html, {
                        status: 404,
                        headers: {
                            'Content-Type': 'text/html; charset=utf-8',
                            'Cache-Control': 'public, max-age=300',
                        },
                    });
                }
            } catch (err) {
                // If directory listing fails, fall through to 404
            }
        }
    }

    // Default styled 404 page
    const html = generate404Page();
    return new Response(html, {
        status: 404,
        headers: {
            'Content-Type': 'text/html; charset=utf-8',
            'Cache-Control': 'public, max-age=300',
        },
    });
}
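
The two serve functions above share a fixed fallback precedence for missing routes: clean URLs, then SPA mode, then `custom404`, then auto-detected 404 pages, then directory listing, then the default 404 page. A hedged sketch of just the 404-fallback portion (the names `FallbackSettings` and `resolveFallback` are illustrative, not from the codebase; file existence is abstracted into a predicate):

```typescript
// Illustrative model of the 404 fallback precedence, not the real implementation.
interface FallbackSettings {
    spaMode?: string;   // e.g. 'index.html'
    custom404?: string; // e.g. 'errors/404.html'
}

function resolveFallback(
    settings: FallbackSettings | null,
    exists: (path: string) => boolean
): { path: string; status: number } | null {
    // SPA mode serves the app shell with a 200 status
    if (settings?.spaMode && exists(settings.spaMode)) {
        return { path: settings.spaMode, status: 200 };
    }
    // A configured custom 404 wins over auto-detected pages
    if (settings?.custom404 && exists(settings.custom404)) {
        return { path: settings.custom404, status: 404 };
    }
    // Auto-detected conventions: GitHub Pages, then Neocities/Nekoweb
    for (const page of ['404.html', 'not_found.html']) {
        if (exists(page)) return { path: page, status: 404 };
    }
    return null; // caller falls through to directory listing / default 404
}
```

Note that `_redirects` rules are applied before any of this in `serveFromCacheWithRewrite`, so a 404 rule in `_redirects` overrides the whole chain.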
+430
apps/hosting-service/src/lib/firehose.ts
···
+
import { existsSync, rmSync } from 'fs'
+
import {
+
getPdsForDid,
+
downloadAndCacheSite,
+
fetchSiteRecord
+
} from './utils'
+
import { upsertSite, tryAcquireLock, releaseLock } from './db'
+
import { safeFetch } from '@wisp/safe-fetch'
+
import { isRecord, validateRecord } from '@wisp/lexicons/types/place/wisp/fs'
+
import { Firehose } from '@atproto/sync'
+
import { IdResolver } from '@atproto/identity'
+
import { invalidateSiteCache, markSiteAsBeingCached, unmarkSiteAsBeingCached } from './cache'
+
import { clearRedirectRulesCache } from './site-cache'
+
+
const CACHE_DIR = './cache/sites'
+
+
export class FirehoseWorker {
+
private firehose: Firehose | null = null
+
private idResolver: IdResolver
+
private isShuttingDown = false
+
private lastEventTime = Date.now()
+
+
constructor(
+
private logger?: (msg: string, data?: Record<string, unknown>) => void
+
) {
+
this.idResolver = new IdResolver()
+
}
+
+
private log(msg: string, data?: Record<string, unknown>) {
+
const log = this.logger || console.log
+
log(`[FirehoseWorker] ${msg}`, data || {})
+
}
+
+
start() {
+
this.log('Starting firehose worker')
+
this.connect()
+
}
+
+
stop() {
+
this.log('Stopping firehose worker')
+
this.isShuttingDown = true
+
+
if (this.firehose) {
+
this.firehose.destroy()
+
this.firehose = null
+
}
+
}
+
+
private connect() {
+
if (this.isShuttingDown) return
+
+
this.log('Connecting to AT Protocol firehose')
+
+
this.firehose = new Firehose({
+
idResolver: this.idResolver,
+
service: 'wss://bsky.network',
+
filterCollections: ['place.wisp.fs', 'place.wisp.settings'],
+
handleEvent: async (evt: any) => {
+
this.lastEventTime = Date.now()
+
+
// Watch for write events
+
if (evt.event === 'create' || evt.event === 'update') {
+
const record = evt.record
+
+
// If the write is a valid place.wisp.fs record
+
if (
+
evt.collection === 'place.wisp.fs' &&
+
isRecord(record) &&
+
validateRecord(record).success
+
) {
+
this.log('Received place.wisp.fs event', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey
+
})
+
+
try {
+
await this.handleCreateOrUpdate(
+
evt.did,
+
evt.rkey,
+
record,
+
evt.cid?.toString()
+
)
+
} catch (err) {
+
console.error('Full error details:', err);
+
this.log('Error handling event', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error
+
? err.message
+
: String(err)
+
})
+
}
+
}
+
// Handle settings changes
+
else if (evt.collection === 'place.wisp.settings') {
+
this.log('Received place.wisp.settings event', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey
+
})
+
+
try {
+
await this.handleSettingsChange(evt.did, evt.rkey)
+
} catch (err) {
+
this.log('Error handling settings change', {
+
did: evt.did,
+
event: evt.event,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error
+
? err.message
+
: String(err)
+
})
+
}
+
}
+
} else if (
+
evt.event === 'delete' &&
+
evt.collection === 'place.wisp.fs'
+
) {
+
this.log('Received delete event', {
+
did: evt.did,
+
rkey: evt.rkey
+
})
+
+
try {
+
await this.handleDelete(evt.did, evt.rkey)
+
} catch (err) {
+
this.log('Error handling delete', {
+
did: evt.did,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error ? err.message : String(err)
+
})
+
}
+
} else if (
+
evt.event === 'delete' &&
+
evt.collection === 'place.wisp.settings'
+
) {
+
this.log('Received settings delete event', {
+
did: evt.did,
+
rkey: evt.rkey
+
})
+
+
try {
+
await this.handleSettingsChange(evt.did, evt.rkey)
+
} catch (err) {
+
this.log('Error handling settings delete', {
+
did: evt.did,
+
rkey: evt.rkey,
+
error:
+
err instanceof Error ? err.message : String(err)
+
})
+
}
+
}
+
},
+
onError: (err: any) => {
+
this.log('Firehose error', {
+
error: err instanceof Error ? err.message : String(err),
+
stack: err instanceof Error ? err.stack : undefined,
+
fullError: err
+
})
+
console.error('Full firehose error:', err)
+
}
+
})
+
+
this.firehose.start()
+
this.log('Firehose started')
+
}
+
+
private async handleCreateOrUpdate(
+
did: string,
+
site: string,
+
record: any,
+
eventCid?: string
+
) {
+
this.log('Processing create/update', { did, site })
+
+
// Record is already validated in handleEvent
+
const fsRecord = record
+
+
const pdsEndpoint = await getPdsForDid(did)
+
if (!pdsEndpoint) {
+
this.log('Could not resolve PDS for DID', { did })
+
return
+
}
+
+
this.log('Resolved PDS', { did, pdsEndpoint })
+
+
// Verify record exists on PDS and fetch its CID
+
this.log('Verifying record on PDS', { did, site })
+
let verifiedCid: string
+
try {
+
const result = await fetchSiteRecord(did, site)
+
+
if (!result) {
+
this.log('Record not found on PDS, skipping cache', {
+
did,
+
site
+
})
+
return
+
}
+
+
verifiedCid = result.cid
+
+
// Verify event CID matches PDS CID (prevent cache poisoning)
+
if (eventCid && eventCid !== verifiedCid) {
+
this.log('CID mismatch detected - potential spoofed event', {
+
did,
+
site,
+
eventCid,
+
verifiedCid
+
})
+
return
+
}
+
+
this.log('Record verified on PDS', { did, site, cid: verifiedCid })
+
} catch (err) {
+
this.log('Failed to verify record on PDS', {
+
did,
+
site,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
return
+
}
+
+
// Invalidate in-memory caches before updating
+
invalidateSiteCache(did, site)
+
+
// Mark site as being cached to prevent serving stale content during update
+
markSiteAsBeingCached(did, site)
+
+
try {
+
// Cache the record with verified CID (uses atomic swap internally)
+
// All instances cache locally for edge serving
+
await downloadAndCacheSite(
+
did,
+
site,
+
fsRecord,
+
pdsEndpoint,
+
verifiedCid
+
)
+
+
// Clear redirect rules cache since the site was updated
+
clearRedirectRulesCache(did, site)
+
+
// Acquire distributed lock only for database write to prevent duplicate writes
+
// Note: upsertSite will check cache-only mode internally and skip if needed
+
const lockKey = `db:upsert:${did}:${site}`
+
const lockAcquired = await tryAcquireLock(lockKey)
+
+
if (!lockAcquired) {
+
this.log('Another instance is writing to DB, skipping upsert', {
+
did,
+
site
+
})
+
this.log('Successfully processed create/update (cached locally)', {
+
did,
+
site
+
})
+
return
+
}
+
+
try {
+
// Upsert site to database (only one instance does this)
+
// In cache-only mode, this will be a no-op
+
await upsertSite(did, site, fsRecord.site)
+
this.log(
+
'Successfully processed create/update (cached + DB updated)',
+
{ did, site }
+
)
+
} finally {
+
// Always release lock, even if DB write fails
+
await releaseLock(lockKey)
+
}
+
} finally {
+
// Always unmark, even if caching fails
+
unmarkSiteAsBeingCached(did, site)
+
}
+
}
+
+
private async handleDelete(did: string, site: string) {
+
this.log('Processing delete', { did, site })
+
+
// All instances should delete their local cache (no lock needed)
+
const pdsEndpoint = await getPdsForDid(did)
+
if (!pdsEndpoint) {
+
this.log('Could not resolve PDS for DID', { did })
+
return
+
}
+
+
// Verify record is actually deleted from PDS
+
try {
+
const recordUrl = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(site)}`
+
const recordRes = await safeFetch(recordUrl)
+
+
if (recordRes.ok) {
+
this.log('Record still exists on PDS, not deleting cache', {
+
did,
+
site
+
})
+
return
+
}
+
+
this.log('Verified record is deleted from PDS', {
+
did,
+
site,
+
status: recordRes.status
+
})
+
} catch (err) {
+
this.log('Error verifying deletion on PDS', {
+
did,
+
site,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
+
+
// Invalidate in-memory caches
+
invalidateSiteCache(did, site)
+
+
// Delete disk cache
+
this.deleteCache(did, site)
+
+
this.log('Successfully processed delete', { did, site })
+
}
+
+
private async handleSettingsChange(did: string, rkey: string) {
+
this.log('Processing settings change', { did, rkey })
+
+
// Invalidate in-memory caches (includes metadata which stores settings)
+
invalidateSiteCache(did, rkey)
+
+
// Check if site is already cached
+
const cacheDir = `${CACHE_DIR}/${did}/${rkey}`
+
const isCached = existsSync(cacheDir)
+
+
if (!isCached) {
+
this.log('Site not cached yet, checking if fs record exists', { did, rkey })
+
+
// If site exists on PDS, cache it (which will include the new settings)
+
try {
+
const siteRecord = await fetchSiteRecord(did, rkey)
+
+
if (siteRecord) {
+
this.log('Site record found, triggering full cache with settings', { did, rkey })
+
const pdsEndpoint = await getPdsForDid(did)
+
+
if (pdsEndpoint) {
+
// Mark as being cached
+
markSiteAsBeingCached(did, rkey)
+
+
try {
+
await downloadAndCacheSite(did, rkey, siteRecord.record, pdsEndpoint, siteRecord.cid)
+
this.log('Successfully cached site with new settings', { did, rkey })
+
} finally {
+
unmarkSiteAsBeingCached(did, rkey)
+
}
+
} else {
+
this.log('Could not resolve PDS for DID', { did })
+
}
+
} else {
+
this.log('No fs record found for site, skipping cache', { did, rkey })
+
}
+
} catch (err) {
+
this.log('Failed to cache site after settings change', {
+
did,
+
rkey,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
+
+
this.log('Successfully processed settings change (new cache)', { did, rkey })
+
return
+
}
+
+
// Site is already cached, just update the settings in metadata
+
try {
+
const { fetchSiteSettings, updateCacheMetadataSettings } = await import('./utils')
+
const settings = await fetchSiteSettings(did, rkey)
+
await updateCacheMetadataSettings(did, rkey, settings)
+
this.log('Updated cached settings', { did, rkey, hasSettings: !!settings })
+
} catch (err) {
+
this.log('Failed to update cached settings', {
+
did,
+
rkey,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
+
+
this.log('Successfully processed settings change', { did, rkey })
+
}
+
+
private deleteCache(did: string, site: string) {
+
const cacheDir = `${CACHE_DIR}/${did}/${site}`
+
+
if (!existsSync(cacheDir)) {
+
this.log('Cache directory does not exist, nothing to delete', {
+
did,
+
site
+
})
+
return
+
}
+
+
try {
+
rmSync(cacheDir, { recursive: true, force: true })
+
this.log('Cache deleted', { did, site, path: cacheDir })
+
} catch (err) {
+
this.log('Failed to delete cache', {
+
did,
+
site,
+
path: cacheDir,
+
error: err instanceof Error ? err.message : String(err)
+
})
+
}
+
}
+
+
getHealth() {
+
const isConnected = this.firehose !== null
+
const timeSinceLastEvent = Date.now() - this.lastEventTime
+
+
return {
+
connected: isConnected,
+
lastEventTime: this.lastEventTime,
+
timeSinceLastEvent,
+
healthy: isConnected && timeSinceLastEvent < 300000 // 5 minutes
+
}
+
}
+
}
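The staleness heuristic in `getHealth()` above can be sketched standalone: a firehose consumer is healthy when it is connected and has seen an event within the last five minutes. This is a hedged sketch; `computeHealth` and `STALE_THRESHOLD_MS` are illustrative names, not part of the service.

```typescript
// Healthy = connected AND last event seen within a 5-minute window.
const STALE_THRESHOLD_MS = 5 * 60 * 1000

function computeHealth(connected: boolean, lastEventTime: number, now: number) {
    const timeSinceLastEvent = now - lastEventTime
    return {
        connected,
        timeSinceLastEvent,
        healthy: connected && timeSinceLastEvent < STALE_THRESHOLD_MS
    }
}

// An event 10 seconds ago is healthy; six minutes of silence is not.
console.log(computeHealth(true, 1_000_000, 1_010_000).healthy) // true
console.log(computeHealth(true, 1_000_000, 1_000_000 + 6 * 60 * 1000).healthy) // false
```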
+457
apps/hosting-service/src/lib/html-rewriter.test.ts
···
+
import { describe, test, expect } from 'bun:test'
+
import { rewriteHtmlPaths, isHtmlContent } from './html-rewriter'
+
+
describe('rewriteHtmlPaths', () => {
+
const basePath = '/identifier/site/'
+
+
describe('absolute paths', () => {
+
test('rewrites absolute paths with leading slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites nested absolute paths', () => {
+
const html = '<link href="/css/style.css">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<link href="/identifier/site/css/style.css">')
+
})
+
})
+
+
describe('relative paths from root document', () => {
+
test('rewrites relative paths with ./ prefix', () => {
+
const html = '<img src="./image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites relative paths without prefix', () => {
+
const html = '<img src="image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites relative paths with ../ (should stay at root)', () => {
+
const html = '<img src="../image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
})
+
+
describe('relative paths from nested documents', () => {
+
test('rewrites relative path from nested document', () => {
+
const html = '<img src="./photo.jpg">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/photo.jpg">'
+
)
+
})
+
+
test('rewrites plain filename from nested document', () => {
+
const html = '<script src="app.js"></script>'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<script src="/identifier/site/folder1/folder2/app.js"></script>'
+
)
+
})
+
+
test('rewrites ../ to go up one level', () => {
+
const html = '<img src="../image.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/image.png">'
+
)
+
})
+
+
test('rewrites multiple ../ to go up multiple levels', () => {
+
const html = '<link href="../../css/style.css">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<link href="/identifier/site/folder1/css/style.css">'
+
)
+
})
+
+
test('rewrites ../ with additional path segments', () => {
+
const html = '<img src="../assets/logo.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'pages/about/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/pages/assets/logo.png">'
+
)
+
})
+
+
test('handles complex nested relative paths', () => {
+
const html = '<script src="../../lib/vendor/jquery.js"></script>'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'pages/blog/post/index.html'
+
)
+
expect(result).toBe(
+
'<script src="/identifier/site/pages/lib/vendor/jquery.js"></script>'
+
)
+
})
+
+
test('handles ../ going past root (stays at root)', () => {
+
const html = '<img src="../../../image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'folder1/index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
})
+
+
describe('external URLs and special schemes', () => {
+
test('does not rewrite http URLs', () => {
+
const html = '<img src="http://example.com/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="http://example.com/image.png">')
+
})
+
+
test('does not rewrite https URLs', () => {
+
const html = '<link href="https://cdn.example.com/style.css">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<link href="https://cdn.example.com/style.css">'
+
)
+
})
+
+
test('does not rewrite protocol-relative URLs', () => {
+
const html = '<script src="//cdn.example.com/script.js"></script>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<script src="//cdn.example.com/script.js"></script>'
+
)
+
})
+
+
test('does not rewrite data URIs', () => {
+
const html =
+
'<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUA">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUA">'
+
)
+
})
+
+
test('does not rewrite mailto links', () => {
+
const html = '<a href="mailto:test@example.com">Email</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="mailto:test@example.com">Email</a>')
+
})
+
+
test('does not rewrite tel links', () => {
+
const html = '<a href="tel:+1234567890">Call</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="tel:+1234567890">Call</a>')
+
})
+
})
+
+
describe('different HTML attributes', () => {
+
test('rewrites src attribute', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('rewrites href attribute', () => {
+
const html = '<a href="/page.html">Link</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<a href="/identifier/site/page.html">Link</a>')
+
})
+
+
test('rewrites action attribute', () => {
+
const html = '<form action="/submit"></form>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<form action="/identifier/site/submit"></form>')
+
})
+
+
test('rewrites data attribute', () => {
+
const html = '<object data="/document.pdf"></object>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<object data="/identifier/site/document.pdf"></object>'
+
)
+
})
+
+
test('rewrites poster attribute', () => {
+
const html = '<video poster="/thumbnail.jpg"></video>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<video poster="/identifier/site/thumbnail.jpg"></video>'
+
)
+
})
+
+
test('rewrites srcset attribute with single URL', () => {
+
const html = '<img srcset="/image.png 1x">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/image.png 1x">'
+
)
+
})
+
+
test('rewrites srcset attribute with multiple URLs', () => {
+
const html = '<img srcset="/image-1x.png 1x, /image-2x.png 2x">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/image-1x.png 1x, /identifier/site/image-2x.png 2x">'
+
)
+
})
+
+
test('rewrites srcset with width descriptors', () => {
+
const html = '<img srcset="/small.jpg 320w, /large.jpg 1024w">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img srcset="/identifier/site/small.jpg 320w, /identifier/site/large.jpg 1024w">'
+
)
+
})
+
+
test('rewrites srcset with relative paths from nested document', () => {
+
const html = '<img srcset="../img1.png 1x, ../img2.png 2x">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/index.html'
+
)
+
expect(result).toBe(
+
'<img srcset="/identifier/site/folder1/img1.png 1x, /identifier/site/folder1/img2.png 2x">'
+
)
+
})
+
})
+
+
describe('quote handling', () => {
+
test('handles double quotes', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles single quotes', () => {
+
const html = "<img src='/image.png'>"
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe("<img src='/identifier/site/image.png'>")
+
})
+
+
test('handles mixed quotes in same document', () => {
+
const html = '<img src="/img1.png"><link href=\'/style.css\'>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="/identifier/site/img1.png"><link href=\'/identifier/site/style.css\'>'
+
)
+
})
+
})
+
+
describe('multiple rewrites in same document', () => {
+
test('rewrites multiple attributes in complex HTML', () => {
+
const html = `
+
<!DOCTYPE html>
+
<html>
+
<head>
+
<link href="/css/style.css" rel="stylesheet">
+
<script src="/js/app.js"></script>
+
</head>
+
<body>
+
<img src="/images/logo.png" alt="Logo">
+
<a href="/about.html">About</a>
+
<form action="/submit">
+
<button type="submit">Submit</button>
+
</form>
+
</body>
+
</html>
+
`
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toContain('href="/identifier/site/css/style.css"')
+
expect(result).toContain('src="/identifier/site/js/app.js"')
+
expect(result).toContain('src="/identifier/site/images/logo.png"')
+
expect(result).toContain('href="/identifier/site/about.html"')
+
expect(result).toContain('action="/identifier/site/submit"')
+
})
+
+
test('handles mix of relative and absolute paths', () => {
+
const html = `
+
<img src="/abs/image.png">
+
<img src="./rel/image.png">
+
<img src="../parent/image.png">
+
<img src="https://external.com/image.png">
+
`
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/page.html'
+
)
+
expect(result).toContain('src="/identifier/site/abs/image.png"')
+
expect(result).toContain(
+
'src="/identifier/site/folder1/folder2/rel/image.png"'
+
)
+
expect(result).toContain(
+
'src="/identifier/site/folder1/parent/image.png"'
+
)
+
expect(result).toContain('src="https://external.com/image.png"')
+
})
+
})
+
+
describe('edge cases', () => {
+
test('handles empty src attribute', () => {
+
const html = '<img src="">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="">')
+
})
+
+
test('handles basePath without trailing slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(html, '/identifier/site', 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles basePath with trailing slash', () => {
+
const html = '<img src="/image.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
'/identifier/site/',
+
'index.html'
+
)
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('handles whitespace around equals sign', () => {
+
const html = '<img src = "/image.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png">')
+
})
+
+
test('preserves query strings in URLs', () => {
+
const html = '<img src="/image.png?v=123">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe('<img src="/identifier/site/image.png?v=123">')
+
})
+
+
test('preserves hash fragments in URLs', () => {
+
const html = '<a href="/page.html#section">Link</a>'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<a href="/identifier/site/page.html#section">Link</a>'
+
)
+
})
+
+
test('handles paths with special characters', () => {
+
const html = '<img src="/folder-name/file_name.png">'
+
const result = rewriteHtmlPaths(html, basePath, 'index.html')
+
expect(result).toBe(
+
'<img src="/identifier/site/folder-name/file_name.png">'
+
)
+
})
+
})
+
+
describe('real-world scenario', () => {
+
test('handles the example from the bug report', () => {
+
// HTML file at: /folder1/folder2/folder3/index.html
+
// Image at: /folder1/folder2/img.png
+
// Reference: src="../img.png"
+
const html = '<img src="../img.png">'
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'folder1/folder2/folder3/index.html'
+
)
+
expect(result).toBe(
+
'<img src="/identifier/site/folder1/folder2/img.png">'
+
)
+
})
+
+
test('handles deeply nested static site structure', () => {
+
// A typical static site with nested pages and shared assets
+
const html = `
+
<!DOCTYPE html>
+
<html>
+
<head>
+
<link href="../../css/style.css" rel="stylesheet">
+
<link href="../../css/theme.css" rel="stylesheet">
+
<script src="../../js/main.js"></script>
+
</head>
+
<body>
+
<img src="../../images/logo.png" alt="Logo">
+
<img src="./post-image.jpg" alt="Post">
+
<a href="../index.html">Back to Blog</a>
+
<a href="../../index.html">Home</a>
+
</body>
+
</html>
+
`
+
const result = rewriteHtmlPaths(
+
html,
+
basePath,
+
'blog/posts/my-post.html'
+
)
+
+
// Assets two levels up
+
expect(result).toContain('href="/identifier/site/css/style.css"')
+
expect(result).toContain('href="/identifier/site/css/theme.css"')
+
expect(result).toContain('src="/identifier/site/js/main.js"')
+
expect(result).toContain('src="/identifier/site/images/logo.png"')
+
+
// Same directory
+
expect(result).toContain(
+
'src="/identifier/site/blog/posts/post-image.jpg"'
+
)
+
+
// One level up
+
expect(result).toContain('href="/identifier/site/blog/index.html"')
+
+
// Two levels up
+
expect(result).toContain('href="/identifier/site/index.html"')
+
})
+
})
+
})
+
+
describe('isHtmlContent', () => {
+
test('identifies HTML by content type', () => {
+
expect(isHtmlContent('file.txt', 'text/html')).toBe(true)
+
expect(isHtmlContent('file.txt', 'text/html; charset=utf-8')).toBe(
+
true
+
)
+
})
+
+
test('identifies HTML by .html extension', () => {
+
expect(isHtmlContent('index.html')).toBe(true)
+
expect(isHtmlContent('page.html', undefined)).toBe(true)
+
expect(isHtmlContent('/path/to/file.html')).toBe(true)
+
})
+
+
test('identifies HTML by .htm extension', () => {
+
expect(isHtmlContent('index.htm')).toBe(true)
+
expect(isHtmlContent('page.htm', undefined)).toBe(true)
+
})
+
+
test('handles case-insensitive extensions', () => {
+
expect(isHtmlContent('INDEX.HTML')).toBe(true)
+
expect(isHtmlContent('page.HTM')).toBe(true)
+
expect(isHtmlContent('File.HtMl')).toBe(true)
+
})
+
+
test('returns false for non-HTML files', () => {
+
expect(isHtmlContent('script.js')).toBe(false)
+
expect(isHtmlContent('style.css')).toBe(false)
+
expect(isHtmlContent('image.png')).toBe(false)
+
expect(isHtmlContent('data.json')).toBe(false)
+
})
+
+
test('returns false for files with no extension', () => {
+
expect(isHtmlContent('README')).toBe(false)
+
expect(isHtmlContent('Makefile')).toBe(false)
+
})
+
})
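The srcset cases exercised above boil down to: split on commas, rewrite only the URL half of each candidate, and keep the width/density descriptor attached. A minimal sketch of that split, assuming absolute paths and a trailing-slash base; `prefixSrcset` is an illustrative name (the implementation under test is `rewriteSrcset` in html-rewriter.ts):

```typescript
// Prefix each srcset candidate's URL with a base path, preserving descriptors.
function prefixSrcset(srcset: string, base: string): string {
    return srcset
        .split(',')
        .map((part) => {
            const trimmed = part.trim()
            const space = trimmed.indexOf(' ')
            // Candidate with no descriptor: the whole thing is the URL.
            if (space === -1) return base + trimmed.replace(/^\//, '')
            // Rewrite the URL, keep " 320w" / " 2x" etc. verbatim.
            return base + trimmed.slice(0, space).replace(/^\//, '') + trimmed.slice(space)
        })
        .join(', ')
}

console.log(prefixSrcset('/small.jpg 320w, /large.jpg 1024w', '/identifier/site/'))
// → /identifier/site/small.jpg 320w, /identifier/site/large.jpg 1024w
```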
+226
apps/hosting-service/src/lib/html-rewriter.ts
···
+
/**
+
* Safely rewrites absolute and relative paths in HTML so they resolve under a base path
+
* Only processes common HTML attributes and preserves external URLs, data URIs, etc.
+
*/
+
+
const REWRITABLE_ATTRIBUTES = [
+
'src',
+
'href',
+
'action',
+
'data',
+
'poster',
+
'srcset'
+
] as const
+
+
/**
+
* Check if a path should be rewritten
+
*/
+
function shouldRewritePath(path: string): boolean {
+
// Don't rewrite empty paths
+
if (!path) return false
+
+
// Don't rewrite external URLs (http://, https://, //)
+
if (
+
path.startsWith('http://') ||
+
path.startsWith('https://') ||
+
path.startsWith('//')
+
) {
+
return false
+
}
+
+
// Don't rewrite data URIs or other schemes (except file paths)
+
if (
+
path.includes(':') &&
+
!path.startsWith('./') &&
+
!path.startsWith('../')
+
) {
+
return false
+
}
+
+
// Rewrite absolute paths (/) and relative paths (./ or ../ or plain filenames)
+
return true
+
}
+
+
/**
+
* Normalize a path by resolving . and .. segments
+
*/
+
function normalizePath(path: string): string {
+
const parts = path.split('/')
+
const result: string[] = []
+
+
for (const part of parts) {
+
if (part === '.' || part === '') {
+
// Skip current directory and empty parts (but keep leading empty for absolute paths)
+
if (part === '' && result.length === 0) {
+
result.push(part)
+
}
+
continue
+
}
+
if (part === '..') {
+
// Go up one directory (but not past root)
+
if (result.length > 0 && result[result.length - 1] !== '..') {
+
result.pop()
+
}
+
continue
+
}
+
result.push(part)
+
}
+
+
return result.join('/')
+
}
+
+
/**
+
* Get the directory path from a file path
+
* e.g., "folder1/folder2/file.html" -> "folder1/folder2/"
+
*/
+
function getDirectory(filepath: string): string {
+
const lastSlash = filepath.lastIndexOf('/')
+
if (lastSlash === -1) {
+
return ''
+
}
+
return filepath.substring(0, lastSlash + 1)
+
}
+
+
/**
+
* Rewrite a single path
+
*/
+
function rewritePath(
+
path: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
if (!shouldRewritePath(path)) {
+
return path
+
}
+
+
// Handle absolute paths: /file.js -> /base/file.js
+
if (path.startsWith('/')) {
+
return basePath + path.slice(1)
+
}
+
+
// Handle relative paths by resolving against document directory
+
const documentDir = getDirectory(documentPath)
+
let resolvedPath: string
+
+
if (path.startsWith('./')) {
+
// ./file.js relative to current directory
+
resolvedPath = documentDir + path.slice(2)
+
} else if (path.startsWith('../')) {
+
// ../file.js relative to parent directory
+
resolvedPath = documentDir + path
+
} else {
+
// file.js (no prefix) - treat as relative to current directory
+
resolvedPath = documentDir + path
+
}
+
+
// Normalize the path to resolve .. and .
+
resolvedPath = normalizePath(resolvedPath)
+
+
return basePath + resolvedPath
+
}
+
+
/**
+
* Rewrite srcset attribute (can contain multiple URLs)
+
* Format: "url1 1x, url2 2x" or "url1 100w, url2 200w"
+
*/
+
function rewriteSrcset(
+
srcset: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
return srcset
+
.split(',')
+
.map((part) => {
+
const trimmed = part.trim()
+
const spaceIndex = trimmed.indexOf(' ')
+
+
if (spaceIndex === -1) {
+
// No descriptor, just URL
+
return rewritePath(trimmed, basePath, documentPath)
+
}
+
+
const url = trimmed.substring(0, spaceIndex)
+
const descriptor = trimmed.substring(spaceIndex)
+
return rewritePath(url, basePath, documentPath) + descriptor
+
})
+
.join(', ')
+
}
+
+
/**
+
* Rewrite absolute and relative paths in HTML content
+
* Uses simple regex matching for safety (no full HTML parsing)
+
*/
+
export function rewriteHtmlPaths(
+
html: string,
+
basePath: string,
+
documentPath: string
+
): string {
+
// Ensure base path ends with /
+
const normalizedBase = basePath.endsWith('/') ? basePath : basePath + '/'
+
+
let rewritten = html
+
+
// Rewrite each attribute type
+
// Use more specific patterns to prevent ReDoS attacks
+
for (const attr of REWRITABLE_ATTRIBUTES) {
+
if (attr === 'srcset') {
+
// Special handling for srcset, which may contain multiple comma-separated URLs
+
// Bound the whitespace around "=" (max 5 spaces/tabs) to prevent catastrophic backtracking (ReDoS)
+
const srcsetRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
+
'gi'
+
)
+
rewritten = rewritten.replace(srcsetRegex, (match, value) => {
+
const rewrittenValue = rewriteSrcset(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}="${rewrittenValue}"`
+
})
+
} else {
+
// Regular attributes with quoted values
+
// Limit whitespace to prevent catastrophic backtracking
+
const doubleQuoteRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
+
'gi'
+
)
+
const singleQuoteRegex = new RegExp(
+
`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}'([^']*)'`,
+
'gi'
+
)
+
+
rewritten = rewritten.replace(doubleQuoteRegex, (match, value) => {
+
const rewrittenValue = rewritePath(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}="${rewrittenValue}"`
+
})
+
+
rewritten = rewritten.replace(singleQuoteRegex, (match, value) => {
+
const rewrittenValue = rewritePath(
+
value,
+
normalizedBase,
+
documentPath
+
)
+
return `${attr}='${rewrittenValue}'`
+
})
+
}
+
}
+
+
return rewritten
+
}
+
+
/**
+
* Check if content is HTML based on content type or filename extension
+
*/
+
export function isHtmlContent(filepath: string, contentType?: string): boolean {
+
if (contentType && contentType.includes('text/html')) {
+
return true
+
}
+
+
const ext = filepath.toLowerCase().split('.').pop()
+
return ext === 'html' || ext === 'htm'
+
}
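The relative-path resolution above — join the reference with the document's directory, drop `.` segments, and clamp `..` at the site root before prefixing the base path — can be condensed into a standalone sketch. `resolveAgainst` is an illustrative name; the real code splits this across `getDirectory`, `normalizePath`, and `rewritePath`:

```typescript
// Resolve a relative reference against a document directory,
// clamping ".." so the result never escapes the site root.
function resolveAgainst(docDir: string, ref: string): string {
    const out: string[] = []
    for (const part of (docDir + ref).split('/')) {
        if (part === '' || part === '.') continue
        if (part === '..') {
            out.pop() // no-op when already at root, i.e. clamps
            continue
        }
        out.push(part)
    }
    return out.join('/')
}

console.log(resolveAgainst('folder1/folder2/folder3/', '../img.png'))
// → folder1/folder2/img.png  (the bug-report case from the tests)
```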
+362
apps/hosting-service/src/lib/page-generators.ts
···
+
/**
+
* HTML page generation utilities for hosting service
+
* Generates 404 pages, directory listings, and updating pages
+
*/
+
+
/**
+
* Generate 404 page HTML
+
*/
+
export function generate404Page(): string {
+
const html = `<!DOCTYPE html>
+
<html>
+
<head>
+
<meta charset="utf-8">
+
<meta name="viewport" content="width=device-width, initial-scale=1">
+
<title>404 - Not Found</title>
+
<style>
+
@media (prefers-color-scheme: light) {
+
:root {
+
/* Warm beige background */
+
--background: oklch(0.90 0.012 35);
+
/* Very dark brown text */
+
--foreground: oklch(0.18 0.01 30);
+
--border: oklch(0.75 0.015 30);
+
/* Bright pink accent for links */
+
--accent: oklch(0.78 0.15 345);
+
}
+
}
+
@media (prefers-color-scheme: dark) {
+
:root {
+
/* Slate violet background */
+
--background: oklch(0.23 0.015 285);
+
/* Light gray text */
+
--foreground: oklch(0.90 0.005 285);
+
/* Subtle borders */
+
--border: oklch(0.38 0.02 285);
+
/* Soft pink accent */
+
--accent: oklch(0.85 0.08 5);
+
}
+
}
+
body {
+
font-family: 'SF Mono', Monaco, 'Cascadia Code', 'Roboto Mono', Consolas, 'Courier New', monospace;
+
background: var(--background);
+
color: var(--foreground);
+
padding: 2rem;
+
max-width: 800px;
+
margin: 0 auto;
+
display: flex;
+
flex-direction: column;
+
min-height: 100vh;
+
justify-content: center;
+
align-items: center;
+
text-align: center;
+
}
+
h1 {
+
font-size: 6rem;
+
margin: 0;
+
font-weight: 700;
+
line-height: 1;
+
}
+
h2 {
+
font-size: 1.5rem;
+
margin: 1rem 0 2rem;
+
font-weight: 400;
+
opacity: 0.8;
+
}
+
p {
+
font-size: 1rem;
+
opacity: 0.7;
+
margin-bottom: 2rem;
+
}
+
a {
+
color: var(--accent);
+
text-decoration: none;
+
font-size: 1rem;
+
}
+
a:hover {
+
text-decoration: underline;
+
}
+
footer {
+
margin-top: 2rem;
+
padding-top: 1.5rem;
+
border-top: 1px solid var(--border);
+
text-align: center;
+
font-size: 0.875rem;
+
opacity: 0.7;
+
color: var(--foreground);
+
}
+
footer a {
+
color: var(--accent);
+
text-decoration: none;
+
display: inline;
+
}
+
footer a:hover {
+
text-decoration: underline;
+
}
+
</style>
+
</head>
+
<body>
+
<div>
+
<h1>404</h1>
+
<h2>Page not found</h2>
+
<p>The page you're looking for doesn't exist.</p>
+
<a href="/">← Back to home</a>
+
</div>
+
<footer>
+
Hosted on <a href="https://wisp.place" target="_blank" rel="noopener">wisp.place</a> - Made by <a href="https://bsky.app/profile/nekomimi.pet" target="_blank" rel="noopener">@nekomimi.pet</a>
+
</footer>
+
</body>
+
</html>`;
+
return html;
+
}
+
+
/**
+
* Generate directory listing HTML
+
*/
+
export function generateDirectoryListing(path: string, entries: Array<{name: string, isDirectory: boolean}>): string {
+
const title = path || 'Index';
+
+
// Sort: directories first, then files, alphabetically within each group
+
const sortedEntries = [...entries].sort((a, b) => {
+
if (a.isDirectory && !b.isDirectory) return -1;
+
if (!a.isDirectory && b.isDirectory) return 1;
+
return a.name.localeCompare(b.name);
+
});
+
+
const html = `<!DOCTYPE html>
+
<html>
+
<head>
+
<meta charset="utf-8">
+
<meta name="viewport" content="width=device-width, initial-scale=1">
+
<title>Index of /${path}</title>
+
<style>
+
@media (prefers-color-scheme: light) {
+
:root {
+
/* Warm beige background */
+
--background: oklch(0.90 0.012 35);
+
/* Very dark brown text */
+
--foreground: oklch(0.18 0.01 30);
+
--border: oklch(0.75 0.015 30);
+
/* Bright pink accent for links */
+
--accent: oklch(0.78 0.15 345);
+
/* Lavender for folders */
+
--folder: oklch(0.60 0.12 295);
+
--icon: oklch(0.28 0.01 30);
+
}
+
}
+
@media (prefers-color-scheme: dark) {
+
:root {
+
/* Slate violet background */
+
--background: oklch(0.23 0.015 285);
+
/* Light gray text */
+
--foreground: oklch(0.90 0.005 285);
+
/* Subtle borders */
+
--border: oklch(0.38 0.02 285);
+
/* Soft pink accent */
+
--accent: oklch(0.85 0.08 5);
+
/* Lavender for folders */
+
--folder: oklch(0.70 0.10 295);
+
--icon: oklch(0.85 0.005 285);
+
}
+
}
+
body {
+
font-family: 'SF Mono', Monaco, 'Cascadia Code', 'Roboto Mono', Consolas, 'Courier New', monospace;
+
background: var(--background);
+
color: var(--foreground);
+
padding: 2rem;
+
max-width: 800px;
+
margin: 0 auto;
+
}
+
h1 {
+
font-size: 1.5rem;
+
margin-bottom: 2rem;
+
padding-bottom: 0.5rem;
+
border-bottom: 1px solid var(--border);
+
}
+
ul {
+
list-style: none;
+
padding: 0;
+
}
+
li {
+
padding: 0.5rem 0;
+
border-bottom: 1px solid var(--border);
+
}
+
li:last-child {
+
border-bottom: none;
+
}
+
li a {
+
color: var(--accent);
+
text-decoration: none;
+
display: flex;
+
align-items: center;
+
gap: 0.75rem;
+
}
+
li a:hover {
+
text-decoration: underline;
+
}
+
.folder {
+
color: var(--folder);
+
font-weight: 600;
+
}
+
.file {
+
color: var(--accent);
+
}
+
.folder::before,
+
.file::before,
+
.parent::before {
+
content: "";
+
display: inline-block;
+
width: 1.25em;
+
height: 1.25em;
+
background-color: var(--icon);
+
flex-shrink: 0;
+
-webkit-mask-size: contain;
+
mask-size: contain;
+
-webkit-mask-repeat: no-repeat;
+
mask-repeat: no-repeat;
+
-webkit-mask-position: center;
+
mask-position: center;
+
}
+
.folder::before {
+
-webkit-mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 64 64"><path d="M64 15v37a5.006 5.006 0 0 1-5 5H5a5.006 5.006 0 0 1-5-5V12a5.006 5.006 0 0 1 5-5h14.116a6.966 6.966 0 0 1 5.466 2.627l5 6.247A2.983 2.983 0 0 0 31.922 17H59a1 1 0 0 1 0 2H31.922a4.979 4.979 0 0 1-3.9-1.876l-5-6.247A4.976 4.976 0 0 0 19.116 9H5a3 3 0 0 0-3 3v40a3 3 0 0 0 3 3h54a3 3 0 0 0 3-3V15a3 3 0 0 0-3-3H30a1 1 0 0 1 0-2h29a5.006 5.006 0 0 1 5 5z"/></svg>');
+
mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 64 64"><path d="M64 15v37a5.006 5.006 0 0 1-5 5H5a5.006 5.006 0 0 1-5-5V12a5.006 5.006 0 0 1 5-5h14.116a6.966 6.966 0 0 1 5.466 2.627l5 6.247A2.983 2.983 0 0 0 31.922 17H59a1 1 0 0 1 0 2H31.922a4.979 4.979 0 0 1-3.9-1.876l-5-6.247A4.976 4.976 0 0 0 19.116 9H5a3 3 0 0 0-3 3v40a3 3 0 0 0 3 3h54a3 3 0 0 0 3-3V15a3 3 0 0 0-3-3H30a1 1 0 0 1 0-2h29a5.006 5.006 0 0 1 5 5z"/></svg>');
+
}
+
.file::before {
+
-webkit-mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 25 25"><g><path d="M18 8.28a.59.59 0 0 0-.13-.18l-4-3.9h-.05a.41.41 0 0 0-.15-.2.41.41 0 0 0-.19 0h-9a.5.5 0 0 0-.5.5v19a.5.5 0 0 0 .5.5h13a.5.5 0 0 0 .5-.5V8.43a.58.58 0 0 0 .02-.15zM16.3 8H14V5.69zM5 23V5h8v3.5a.49.49 0 0 0 .15.36.5.5 0 0 0 .35.14l3.5-.06V23z"/><path d="M20.5 1h-13a.5.5 0 0 0-.5.5V3a.5.5 0 0 0 1 0V2h12v18h-1a.5.5 0 0 0 0 1h1.5a.5.5 0 0 0 .5-.5v-19a.5.5 0 0 0-.5-.5z"/><path d="M7.5 8h3a.5.5 0 0 0 0-1h-3a.5.5 0 0 0 0 1zM7.5 11h4a.5.5 0 0 0 0-1h-4a.5.5 0 0 0 0 1zM13.5 13h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1zM13.5 16h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1zM13.5 19h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1z"/></g></svg>');
+
mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 25 25"><g><path d="M18 8.28a.59.59 0 0 0-.13-.18l-4-3.9h-.05a.41.41 0 0 0-.15-.2.41.41 0 0 0-.19 0h-9a.5.5 0 0 0-.5.5v19a.5.5 0 0 0 .5.5h13a.5.5 0 0 0 .5-.5V8.43a.58.58 0 0 0 .02-.15zM16.3 8H14V5.69zM5 23V5h8v3.5a.49.49 0 0 0 .15.36.5.5 0 0 0 .35.14l3.5-.06V23z"/><path d="M20.5 1h-13a.5.5 0 0 0-.5.5V3a.5.5 0 0 0 1 0V2h12v18h-1a.5.5 0 0 0 0 1h1.5a.5.5 0 0 0 .5-.5v-19a.5.5 0 0 0-.5-.5z"/><path d="M7.5 8h3a.5.5 0 0 0 0-1h-3a.5.5 0 0 0 0 1zM7.5 11h4a.5.5 0 0 0 0-1h-4a.5.5 0 0 0 0 1zM13.5 13h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1zM13.5 16h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1zM13.5 19h-6a.5.5 0 0 0 0 1h6a.5.5 0 0 0 0-1z"/></g></svg>');
+
}
+
.parent::before {
+
-webkit-mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path d="M7.41 15.41L12 10.83l4.59 4.58L18 14l-6-6-6 6z"/></svg>');
+
mask-image: url('data:image/svg+xml,<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><path d="M7.41 15.41L12 10.83l4.59 4.58L18 14l-6-6-6 6z"/></svg>');
+
}
+
footer {
+
margin-top: 2rem;
+
padding-top: 1.5rem;
+
border-top: 1px solid var(--border);
+
text-align: center;
+
font-size: 0.875rem;
+
opacity: 0.7;
+
color: var(--foreground);
+
}
+
footer a {
+
color: var(--accent);
+
text-decoration: none;
+
display: inline;
+
}
+
footer a:hover {
+
text-decoration: underline;
+
}
+
</style>
+
</head>
+
<body>
+
<h1>Index of /${path}</h1>
+
<ul>
+
${path ? '<li><a href="../" class="parent">../</a></li>' : ''}
+
${sortedEntries.map(e =>
+
`<li><a href="${e.name}${e.isDirectory ? '/' : ''}" class="${e.isDirectory ? 'folder' : 'file'}">${e.name}${e.isDirectory ? '/' : ''}</a></li>`
+
).join('\n ')}
+
</ul>
+
<footer>
+
Hosted on <a href="https://wisp.place" target="_blank" rel="noopener">wisp.place</a> - Made by <a href="https://bsky.app/profile/nekomimi.pet" target="_blank" rel="noopener">@nekomimi.pet</a>
+
</footer>
+
</body>
+
</html>`;
+
return html;
+
}
+
+
/**
+
* Generate the "site updating" page HTML
+
*/
+
export function generateSiteUpdatingPage(): string {
+
const html = `<!DOCTYPE html>
+
<html>
+
<head>
+
<meta charset="utf-8">
+
<meta name="viewport" content="width=device-width, initial-scale=1">
+
<title>Site Updating</title>
+
<style>
+
@media (prefers-color-scheme: light) {
+
:root {
+
--background: oklch(0.90 0.012 35);
+
--foreground: oklch(0.18 0.01 30);
+
--primary: oklch(0.35 0.02 35);
+
--accent: oklch(0.78 0.15 345);
+
}
+
}
+
@media (prefers-color-scheme: dark) {
+
:root {
+
--background: oklch(0.23 0.015 285);
+
--foreground: oklch(0.90 0.005 285);
+
--primary: oklch(0.70 0.10 295);
+
--accent: oklch(0.85 0.08 5);
+
}
+
}
+
body {
+
font-family: system-ui, -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
+
display: flex;
+
align-items: center;
+
justify-content: center;
+
min-height: 100vh;
+
margin: 0;
+
background: var(--background);
+
color: var(--foreground);
+
}
+
.container {
+
text-align: center;
+
padding: 2rem;
+
max-width: 500px;
+
}
+
h1 {
+
font-size: 2.5rem;
+
margin-bottom: 1rem;
+
font-weight: 600;
+
color: var(--primary);
+
}
+
p {
+
font-size: 1.25rem;
+
opacity: 0.8;
+
margin-bottom: 2rem;
+
color: var(--foreground);
+
}
+
.spinner {
+
border: 4px solid var(--accent);
+
border-radius: 50%;
+
border-top: 4px solid var(--primary);
+
width: 40px;
+
height: 40px;
+
animation: spin 1s linear infinite;
+
margin: 0 auto;
+
}
+
@keyframes spin {
+
0% { transform: rotate(0deg); }
+
100% { transform: rotate(360deg); }
+
}
+
</style>
+
<meta http-equiv="refresh" content="3">
+
</head>
+
<body>
+
<div class="container">
+
<h1>Site Updating</h1>
+
<p>This site is undergoing an update right now. Check back in a moment...</p>
+
<div class="spinner"></div>
+
</div>
+
</body>
+
</html>`;
+
+
return html;
+
}
+
+
/**
+
* Create a Response for site updating
+
*/
+
export function siteUpdatingResponse(): Response {
+
return new Response(generateSiteUpdatingPage(), {
+
status: 503,
+
headers: {
+
'Content-Type': 'text/html; charset=utf-8',
+
'Cache-Control': 'no-store, no-cache, must-revalidate',
+
'Retry-After': '3',
+
},
+
});
+
}
+
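The updating page pairs a 503 status with a `Retry-After` header and a meta-refresh, so both browsers and well-behaved clients retry shortly. A minimal, self-contained sketch of how a request handler might use this pattern — `isBeingCached` here is a hypothetical stand-in for the service's real cache-state lookup, not an actual export:

```typescript
// Sketch: serve the "updating" placeholder while a site is mid-recache.
// `isBeingCached` is a hypothetical stand-in for the real cache-state check.
function handleRequest(isBeingCached: boolean, body: string): Response {
    if (isBeingCached) {
        return new Response(body, {
            status: 503, // temporarily unavailable
            headers: {
                'Content-Type': 'text/html; charset=utf-8',
                'Cache-Control': 'no-store', // never cache the placeholder
                'Retry-After': '3', // hint clients to retry in ~3 seconds
            },
        });
    }
    return new Response(body, { status: 200 });
}
```

The `Retry-After` value matches the page's own `<meta http-equiv="refresh" content="3">`, so HTTP clients and browsers poll on the same cadence.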
+215
apps/hosting-service/src/lib/redirects.test.ts
···
import { describe, it, expect } from 'bun:test';
import { parseRedirectsFile, matchRedirectRule } from './redirects';

describe('parseRedirectsFile', () => {
    it('should parse simple redirects', () => {
        const content = `
# Comment line
/old-path /new-path
/home / 301
`;
        const rules = parseRedirectsFile(content);
        expect(rules).toHaveLength(2);
        expect(rules[0]).toMatchObject({
            from: '/old-path',
            to: '/new-path',
            status: 301,
            force: false,
        });
        expect(rules[1]).toMatchObject({
            from: '/home',
            to: '/',
            status: 301,
            force: false,
        });
    });

    it('should parse redirects with different status codes', () => {
        const content = `
/temp-redirect /target 302
/rewrite /content 200
/not-found /404 404
`;
        const rules = parseRedirectsFile(content);
        expect(rules).toHaveLength(3);
        expect(rules[0]?.status).toBe(302);
        expect(rules[1]?.status).toBe(200);
        expect(rules[2]?.status).toBe(404);
    });

    it('should parse force redirects', () => {
        const content = `/force-path /target 301!`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.force).toBe(true);
        expect(rules[0]?.status).toBe(301);
    });

    it('should parse splat redirects', () => {
        const content = `/news/* /blog/:splat`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.from).toBe('/news/*');
        expect(rules[0]?.to).toBe('/blog/:splat');
    });

    it('should parse placeholder redirects', () => {
        const content = `/blog/:year/:month/:day /posts/:year-:month-:day`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.from).toBe('/blog/:year/:month/:day');
        expect(rules[0]?.to).toBe('/posts/:year-:month-:day');
    });

    it('should parse country-based redirects', () => {
        const content = `/ /anz 302 Country=au,nz`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.conditions?.country).toEqual(['au', 'nz']);
    });

    it('should parse language-based redirects', () => {
        const content = `/products /en/products 301 Language=en`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.conditions?.language).toEqual(['en']);
    });

    it('should parse cookie-based redirects', () => {
        const content = `/* /legacy/:splat 200 Cookie=is_legacy,my_cookie`;
        const rules = parseRedirectsFile(content);
        expect(rules[0]?.conditions?.cookie).toEqual(['is_legacy', 'my_cookie']);
    });
});

describe('matchRedirectRule', () => {
    it('should match exact paths', () => {
        const rules = parseRedirectsFile('/old-path /new-path');
        const match = matchRedirectRule('/old-path', rules);
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/new-path');
        expect(match?.status).toBe(301);
    });

    it('should match paths with trailing slash', () => {
        const rules = parseRedirectsFile('/old-path /new-path');
        const match = matchRedirectRule('/old-path/', rules);
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/new-path');
    });

    it('should match splat patterns', () => {
        const rules = parseRedirectsFile('/news/* /blog/:splat');
        const match = matchRedirectRule('/news/2024/01/15/my-post', rules);
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/blog/2024/01/15/my-post');
    });

    it('should match placeholder patterns', () => {
        const rules = parseRedirectsFile('/blog/:year/:month/:day /posts/:year-:month-:day');
        const match = matchRedirectRule('/blog/2024/01/15', rules);
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/posts/2024-01-15');
    });

    it('should preserve query strings for 301/302 redirects', () => {
        const rules = parseRedirectsFile('/old /new 301');
        const match = matchRedirectRule('/old', rules, {
            queryParams: { foo: 'bar', baz: 'qux' },
        });
        expect(match?.targetPath).toContain('?');
        expect(match?.targetPath).toContain('foo=bar');
        expect(match?.targetPath).toContain('baz=qux');
    });

    it('should match based on query parameters', () => {
        const rules = parseRedirectsFile('/store id=:id /blog/:id 301');
        const match = matchRedirectRule('/store', rules, {
            queryParams: { id: 'my-post' },
        });
        expect(match).toBeTruthy();
        expect(match?.targetPath).toContain('/blog/my-post');
    });

    it('should not match when query params are missing', () => {
        const rules = parseRedirectsFile('/store id=:id /blog/:id 301');
        const match = matchRedirectRule('/store', rules, {
            queryParams: {},
        });
        expect(match).toBeNull();
    });

    it('should match based on country header', () => {
        const rules = parseRedirectsFile('/ /aus 302 Country=au');
        const match = matchRedirectRule('/', rules, {
            headers: { 'cf-ipcountry': 'AU' },
        });
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/aus');
    });

    it('should not match wrong country', () => {
        const rules = parseRedirectsFile('/ /aus 302 Country=au');
        const match = matchRedirectRule('/', rules, {
            headers: { 'cf-ipcountry': 'US' },
        });
        expect(match).toBeNull();
    });

    it('should match based on language header', () => {
        const rules = parseRedirectsFile('/products /en/products 301 Language=en');
        const match = matchRedirectRule('/products', rules, {
            headers: { 'accept-language': 'en-US,en;q=0.9' },
        });
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/en/products');
    });

    it('should match based on cookie presence', () => {
        const rules = parseRedirectsFile('/* /legacy/:splat 200 Cookie=is_legacy');
        const match = matchRedirectRule('/some-path', rules, {
            cookies: { is_legacy: 'true' },
        });
        expect(match).toBeTruthy();
        expect(match?.targetPath).toBe('/legacy/some-path');
    });

    it('should return first matching rule', () => {
        const content = `
/path /first
/path /second
`;
        const rules = parseRedirectsFile(content);
        const match = matchRedirectRule('/path', rules);
        expect(match?.targetPath).toBe('/first');
    });

    it('should match more specific rules before general ones', () => {
        const content = `
/jobs/customer-ninja /careers/support
/jobs/* /careers/:splat
`;
        const rules = parseRedirectsFile(content);

        const match1 = matchRedirectRule('/jobs/customer-ninja', rules);
        expect(match1?.targetPath).toBe('/careers/support');

        const match2 = matchRedirectRule('/jobs/developer', rules);
        expect(match2?.targetPath).toBe('/careers/developer');
    });

    it('should handle SPA routing pattern', () => {
        const rules = parseRedirectsFile('/* /index.html 200');

        // Should match any path
        const match1 = matchRedirectRule('/about', rules);
        expect(match1).toBeTruthy();
        expect(match1?.targetPath).toBe('/index.html');
        expect(match1?.status).toBe(200);

        const match2 = matchRedirectRule('/users/123/profile', rules);
        expect(match2).toBeTruthy();
        expect(match2?.targetPath).toBe('/index.html');
        expect(match2?.status).toBe(200);

        const match3 = matchRedirectRule('/', rules);
        expect(match3).toBeTruthy();
        expect(match3?.targetPath).toBe('/index.html');
    });
});
+459
apps/hosting-service/src/lib/redirects.ts
···
import { readFile } from 'fs/promises';
import { existsSync } from 'fs';

export interface RedirectRule {
    from: string;
    to: string;
    status: number;
    force: boolean;
    conditions?: {
        country?: string[];
        language?: string[];
        role?: string[];
        cookie?: string[];
    };
    // For pattern matching
    fromPattern?: RegExp;
    fromParams?: string[]; // Named parameters from the pattern
    queryParams?: Record<string, string>; // Expected query parameters
}

export interface RedirectMatch {
    rule: RedirectRule;
    targetPath: string;
    status: number;
}

// Maximum number of redirect rules, to bound parsing work on hostile input
const MAX_REDIRECT_RULES = 1000;

/**
 * Parse a _redirects file into an array of redirect rules
 */
export function parseRedirectsFile(content: string): RedirectRule[] {
    const lines = content.split('\n');
    const rules: RedirectRule[] = [];

    for (let lineNum = 0; lineNum < lines.length; lineNum++) {
        const lineRaw = lines[lineNum];
        if (!lineRaw) continue;

        const line = lineRaw.trim();

        // Skip empty lines and comments
        if (!line || line.startsWith('#')) {
            continue;
        }

        // Enforce the max rules limit
        if (rules.length >= MAX_REDIRECT_RULES) {
            console.warn(`Redirect rules limit reached (${MAX_REDIRECT_RULES}), ignoring remaining rules`);
            break;
        }

        try {
            const rule = parseRedirectLine(line);
            if (rule && rule.fromPattern) {
                rules.push(rule);
            }
        } catch (err) {
            console.warn(`Failed to parse redirect rule on line ${lineNum + 1}: ${line}`, err);
        }
    }

    return rules;
}

/**
 * Parse a single redirect rule line
 * Format: /from [query_params] /to [status] [conditions]
 */
function parseRedirectLine(line: string): RedirectRule | null {
    // Split on whitespace (quoted strings are not supported)
    const parts = line.split(/\s+/);

    if (parts.length < 2) {
        return null;
    }

    let idx = 0;
    const from = parts[idx++];

    if (!from) {
        return null;
    }

    let status = 301; // Default status
    let force = false;
    const conditions: NonNullable<RedirectRule['conditions']> = {};
    const queryParams: Record<string, string> = {};

    // Parse query parameters that come before the destination path.
    // They look like key=:value and don't start with /
    while (idx < parts.length) {
        const part = parts[idx];
        if (!part) {
            idx++;
            continue;
        }

        // If it starts with / or http, it's the destination path
        if (part.startsWith('/') || part.startsWith('http://') || part.startsWith('https://')) {
            break;
        }

        // If it contains = and comes before the destination, it's a query param
        if (part.includes('=')) {
            const splitIndex = part.indexOf('=');
            const key = part.slice(0, splitIndex);
            const value = part.slice(splitIndex + 1);

            if (key && value) {
                queryParams[key] = value;
            }
            idx++;
        } else {
            // Not a query param; must be the destination or something else
            break;
        }
    }

    // The next part should be the destination
    if (idx >= parts.length) {
        return null;
    }

    const to = parts[idx++];
    if (!to) {
        return null;
    }

    // Parse the remaining parts for a status code and conditions
    for (let i = idx; i < parts.length; i++) {
        const part = parts[i];

        if (!part) continue;

        // Check for a status code (with optional ! for force)
        if (/^\d+!?$/.test(part)) {
            if (part.endsWith('!')) {
                force = true;
                status = parseInt(part.slice(0, -1), 10);
            } else {
                status = parseInt(part, 10);
            }
            continue;
        }

        // Check for condition parameters (Country=, Language=, Role=, Cookie=)
        if (part.includes('=')) {
            const splitIndex = part.indexOf('=');
            const key = part.slice(0, splitIndex);
            const value = part.slice(splitIndex + 1);

            if (!key || !value) continue;

            const keyLower = key.toLowerCase();

            if (keyLower === 'country') {
                conditions.country = value.split(',').map(v => v.trim().toLowerCase());
            } else if (keyLower === 'language') {
                conditions.language = value.split(',').map(v => v.trim().toLowerCase());
            } else if (keyLower === 'role') {
                conditions.role = value.split(',').map(v => v.trim());
            } else if (keyLower === 'cookie') {
                conditions.cookie = value.split(',').map(v => v.trim().toLowerCase());
            }
        }
    }

    // Parse the 'from' pattern
    const { pattern, params } = convertPathToRegex(from);

    return {
        from,
        to,
        status,
        force,
        conditions: Object.keys(conditions).length > 0 ? conditions : undefined,
        queryParams: Object.keys(queryParams).length > 0 ? queryParams : undefined,
        fromPattern: pattern,
        fromParams: params,
    };
}

/**
 * Convert a path pattern with placeholders and splats to a regex
 * Examples:
 *   /blog/:year/:month/:day -> captures year, month, day
 *   /news/*                 -> captures splat
 */
function convertPathToRegex(pattern: string): { pattern: RegExp; params: string[] } {
    const params: string[] = [];
    let regexStr = '^';

    // Drop any query string from the pattern
    const pathPart = pattern.split('?')[0] || pattern;

    // Escape special regex characters except * and :
    let escaped = pathPart.replace(/[.+^${}()|[\]\\]/g, '\\$&');

    // Replace :param with capture groups (tracked by position in `params`)
    escaped = escaped.replace(/:([a-zA-Z_][a-zA-Z0-9_]*)/g, (match, paramName) => {
        params.push(paramName);
        // Match a single path segment (everything except / and ?)
        return '([^/?]+)';
    });

    // Replace * with a splat capture (matches everything, including /)
    if (escaped.includes('*')) {
        escaped = escaped.replace(/\*/g, '(.*)');
        params.push('splat');
    }

    regexStr += escaped;

    // Make a trailing slash optional
    if (!regexStr.endsWith('.*')) {
        regexStr += '/?';
    }

    regexStr += '$';

    return {
        pattern: new RegExp(regexStr),
        params,
    };
}

/**
 * Match a request path against redirect rules, with loop detection
 */
export function matchRedirectRule(
    requestPath: string,
    rules: RedirectRule[],
    context?: {
        queryParams?: Record<string, string>;
        headers?: Record<string, string>;
        cookies?: Record<string, string>;
    },
    visitedPaths: Set<string> = new Set()
): RedirectMatch | null {
    // Normalize the path: ensure a leading slash (trailing slashes are handled by the pattern)
    const normalizedPath = requestPath.startsWith('/') ? requestPath : `/${requestPath}`;

    // Detect redirect loops
    if (visitedPaths.has(normalizedPath)) {
        console.warn(`Redirect loop detected for path: ${normalizedPath}`);
        return null;
    }

    // Track this path to detect loops
    visitedPaths.add(normalizedPath);

    // Limit redirect chain depth to 10
    if (visitedPaths.size > 10) {
        console.warn(`Redirect chain too deep (>10) for path: ${normalizedPath}`);
        return null;
    }

    for (const rule of rules) {
        // Check query parameter conditions first (if any)
        if (rule.queryParams) {
            // If the rule requires query params but none were provided, skip it
            if (!context?.queryParams) {
                continue;
            }

            // Check that all required query params are present.
            // The value in rule.queryParams is either a literal or a placeholder (:name)
            const queryMatches = Object.entries(rule.queryParams).every(([key, expectedValue]) => {
                const actualValue = context.queryParams?.[key];

                // The query param must exist
                if (actualValue === undefined) {
                    return false;
                }

                // A placeholder (:name) accepts any value; a literal must match exactly
                if (expectedValue && !expectedValue.startsWith(':')) {
                    return actualValue === expectedValue;
                }

                return true;
            });

            if (!queryMatches) {
                continue;
            }
        }

        // Check conditional redirects (country, language, role, cookie)
        if (rule.conditions) {
            if (rule.conditions.country && context?.headers) {
                const cfCountry = context.headers['cf-ipcountry'];
                const xCountry = context.headers['x-country'];
                const country = cfCountry?.toLowerCase() || xCountry?.toLowerCase();
                if (!country || !rule.conditions.country.includes(country)) {
                    continue;
                }
            }

            if (rule.conditions.language && context?.headers) {
                const acceptLang = context.headers['accept-language'];
                if (!acceptLang) {
                    continue;
                }
                // Parse the Accept-Language header (simplified: q-values are ignored)
                const langs = acceptLang.split(',').map(l => {
                    const langPart = l.split(';')[0];
                    return langPart ? langPart.trim().toLowerCase() : '';
                }).filter(l => l !== '');
                const hasMatch = rule.conditions.language.some(lang =>
                    langs.some(l => l === lang || l.startsWith(lang + '-'))
                );
                if (!hasMatch) {
                    continue;
                }
            }

            if (rule.conditions.cookie && context?.cookies) {
                const hasCookie = rule.conditions.cookie.some(cookieName =>
                    context.cookies && cookieName in context.cookies
                );
                if (!hasCookie) {
                    continue;
                }
            }

            // Role-based redirects would need JWT verification - skip for now
            if (rule.conditions.role) {
                continue;
            }
        }

        // Match the path pattern
        const match = rule.fromPattern?.exec(normalizedPath);
        if (!match) {
            continue;
        }

        // Build the target path by replacing placeholders
        let targetPath = rule.to;

        // Replace captured parameters (with URL encoding)
        if (rule.fromParams && match.length > 1) {
            for (let i = 0; i < rule.fromParams.length; i++) {
                const paramName = rule.fromParams[i];
                const paramValue = match[i + 1];

                if (!paramName || !paramValue) continue;

                // URL-encode captured values to prevent invalid URLs
                const encodedValue = encodeURIComponent(paramValue);

                if (paramName === 'splat') {
                    // For splats, preserve slashes by re-decoding them
                    const splatValue = encodedValue.replace(/%2F/g, '/');
                    targetPath = targetPath.replace(':splat', splatValue);
                } else {
                    targetPath = targetPath.replace(`:${paramName}`, encodedValue);
                }
            }
        }

        // Handle query parameter replacements (with URL encoding)
        if (rule.queryParams && context?.queryParams) {
            for (const [key, placeholder] of Object.entries(rule.queryParams)) {
                const actualValue = context.queryParams[key];
                if (actualValue && placeholder && placeholder.startsWith(':')) {
                    const paramName = placeholder.slice(1);
                    if (paramName) {
                        // URL-encode query parameter values
                        const encodedValue = encodeURIComponent(actualValue);
                        targetPath = targetPath.replace(`:${paramName}`, encodedValue);
                    }
                }
            }
        }

        // Preserve the query string for 200, 301, and 302 redirects (unless the target already has one)
        if ([200, 301, 302].includes(rule.status) && context?.queryParams && !targetPath.includes('?')) {
            const queryString = Object.entries(context.queryParams)
                .map(([k, v]) => `${encodeURIComponent(k)}=${encodeURIComponent(v)}`)
                .join('&');
            if (queryString) {
                targetPath += `?${queryString}`;
            }
        }

        return {
            rule,
            targetPath,
            status: rule.status,
        };
    }

    return null;
}

/**
 * Load redirect rules from a cached site
 */
export async function loadRedirectRules(did: string, rkey: string): Promise<RedirectRule[]> {
    const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';
    const redirectsPath = `${CACHE_DIR}/${did}/${rkey}/_redirects`;

    if (!existsSync(redirectsPath)) {
        return [];
    }

    try {
        const content = await readFile(redirectsPath, 'utf-8');
        return parseRedirectsFile(content);
    } catch (err) {
        console.error('Failed to load _redirects file', err);
        return [];
    }
}

/**
 * Parse cookies from a Cookie header
 */
export function parseCookies(cookieHeader?: string): Record<string, string> {
    if (!cookieHeader) return {};

    const cookies: Record<string, string> = {};
    const parts = cookieHeader.split(';');

    for (const part of parts) {
        const [key, ...valueParts] = part.split('=');
        if (key && valueParts.length > 0) {
            cookies[key.trim()] = valueParts.join('=').trim();
        }
    }

    return cookies;
}

/**
 * Parse a query string into an object
 */
export function parseQueryString(url: string): Record<string, string> {
    const queryStart = url.indexOf('?');
    if (queryStart === -1) return {};

    const queryString = url.slice(queryStart + 1);
    const params: Record<string, string> = {};

    for (const pair of queryString.split('&')) {
        const [key, value] = pair.split('=');
        if (key) {
            params[decodeURIComponent(key)] = value ? decodeURIComponent(value) : '';
        }
    }

    return params;
}
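The splat rewrite in `redirects.ts` can be illustrated end to end. This is a self-contained sketch of the same idea (the module's own `convertPathToRegex` is private, so the code below re-derives the core transformation rather than importing it):

```typescript
// Sketch of the splat rewrite: /news/* -> /blog/:splat
// Mirrors the module's approach: regex metacharacters are escaped, '*'
// becomes a capture group matching anything (including '/'), and ':splat'
// in the target is replaced with the captured text.
function rewriteSplat(from: string, to: string, path: string): string | null {
    const escaped = from
        .replace(/[.+^${}()|[\]\\]/g, '\\$&') // escape regex metacharacters
        .replace(/\*/g, '(.*)');              // splat -> capture group
    const match = new RegExp(`^${escaped}$`).exec(path);
    if (!match) return null;
    return to.replace(':splat', match[1] ?? '');
}
```

With `rewriteSplat('/news/*', '/blog/:splat', '/news/2024/post')` the capture is `2024/post`, so the result is `/blog/2024/post`; a non-matching path returns `null`.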
+96
apps/hosting-service/src/lib/request-utils.ts
···
/**
 * Request utilities for validation and helper functions
 */

import type { Record as WispSettings } from '@wisp/lexicons/types/place/wisp/settings';
import { access } from 'fs/promises';

/**
 * Default index file names to check for directory requests.
 * They are checked in order until one is found.
 */
export const DEFAULT_INDEX_FILES = ['index.html', 'index.htm'];

/**
 * Get the index files list from settings, or fall back to the defaults
 */
export function getIndexFiles(settings: WispSettings | null): string[] {
    if (settings?.indexFiles && settings.indexFiles.length > 0) {
        return settings.indexFiles;
    }
    return DEFAULT_INDEX_FILES;
}

/**
 * Match a file path against a glob pattern.
 * Supports the * wildcard and basic path matching.
 */
export function matchGlob(path: string, pattern: string): boolean {
    // Normalize paths
    const normalizedPath = path.startsWith('/') ? path : '/' + path;
    const normalizedPattern = pattern.startsWith('/') ? pattern : '/' + pattern;

    // Convert the glob pattern to a regex
    const regexPattern = normalizedPattern
        .replace(/\./g, '\\.')
        .replace(/\*/g, '.*')
        .replace(/\?/g, '.');

    const regex = new RegExp('^' + regexPattern + '$');
    return regex.test(normalizedPath);
}

/**
 * Apply custom headers from settings to response headers
 */
export function applyCustomHeaders(headers: Record<string, string>, filePath: string, settings: WispSettings | null) {
    if (!settings?.headers || settings.headers.length === 0) return;

    for (const customHeader of settings.headers) {
        // If a path glob is specified, check whether it matches
        if (customHeader.path) {
            if (!matchGlob(filePath, customHeader.path)) {
                continue;
            }
        }
        // Apply the header
        headers[customHeader.name] = customHeader.value;
    }
}

/**
 * Validate a site name (rkey) to prevent injection attacks.
 * Must match the AT Protocol rkey format.
 */
export function isValidRkey(rkey: string): boolean {
    if (!rkey || typeof rkey !== 'string') return false;
    if (rkey.length < 1 || rkey.length > 512) return false;
    if (rkey === '.' || rkey === '..') return false;
    if (rkey.includes('/') || rkey.includes('\\') || rkey.includes('\0')) return false;
    const validRkeyPattern = /^[a-zA-Z0-9._~:-]+$/;
    return validRkeyPattern.test(rkey);
}

/**
 * Async file existence check
 */
export async function fileExists(path: string): Promise<boolean> {
    try {
        await access(path);
        return true;
    } catch {
        return false;
    }
}

/**
 * Extract headers from a request, normalizing names to lowercase
 */
export function extractHeaders(rawHeaders: Headers): Record<string, string> {
    const headers: Record<string, string> = {};
    rawHeaders.forEach((value, key) => {
        headers[key.toLowerCase()] = value;
    });
    return headers;
}
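The glob-to-regex conversion used by `matchGlob` is small enough to show in isolation. A standalone sketch of the same conversion (not the exported function itself): `.` is escaped, `*` becomes `.*`, and `?` becomes `.`:

```typescript
// Sketch of matchGlob's pattern conversion:
// '.' is escaped, '*' matches any run of characters, '?' matches one.
function globToRegex(pattern: string): RegExp {
    const source = pattern
        .replace(/\./g, '\\.')
        .replace(/\*/g, '.*')
        .replace(/\?/g, '.');
    return new RegExp('^' + source + '$');
}
```

So `/assets/*` matches `/assets/app.js` (and, unlike the redirect splats, `*` here also crosses `/` boundaries), while `/*.css` does not match `/style.js`.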
+79
apps/hosting-service/src/lib/site-cache.ts
···
/**
 * Site caching management utilities
 */

import { createLogger } from '@wisp/observability';
import { fetchSiteRecord, getPdsForDid, downloadAndCacheSite, isCached } from './utils';
import { markSiteAsBeingCached, unmarkSiteAsBeingCached } from './cache';
import type { RedirectRule } from './redirects';

const logger = createLogger('hosting-service');

// Cache for redirect rules (per site)
const redirectRulesCache = new Map<string, RedirectRule[]>();

/**
 * Clear the redirect rules cache for a specific site.
 * Should be called whenever a site is updated or re-cached.
 */
export function clearRedirectRulesCache(did: string, rkey: string) {
    const cacheKey = `${did}:${rkey}`;
    redirectRulesCache.delete(cacheKey);
}

/**
 * Get redirect rules from the cache
 */
export function getRedirectRulesFromCache(did: string, rkey: string): RedirectRule[] | undefined {
    const cacheKey = `${did}:${rkey}`;
    return redirectRulesCache.get(cacheKey);
}

/**
 * Set redirect rules in the cache
 */
export function setRedirectRulesInCache(did: string, rkey: string, rules: RedirectRule[]) {
    const cacheKey = `${did}:${rkey}`;
    redirectRulesCache.set(cacheKey, rules);
}

/**
 * Ensure a site is cached locally.
 * Returns true if the site is successfully cached, false otherwise.
 */
export async function ensureSiteCached(did: string, rkey: string): Promise<boolean> {
    if (isCached(did, rkey)) {
        return true;
    }

    // Fetch and cache the site
    const siteData = await fetchSiteRecord(did, rkey);
    if (!siteData) {
        logger.error('Site record not found', null, { did, rkey });
        return false;
    }

    const pdsEndpoint = await getPdsForDid(did);
    if (!pdsEndpoint) {
        logger.error('PDS not found for DID', null, { did });
        return false;
    }

    // Mark the site as being cached to prevent serving stale content during the update
    markSiteAsBeingCached(did, rkey);

    try {
        await downloadAndCacheSite(did, rkey, siteData.record, pdsEndpoint, siteData.cid);
        // Clear the redirect rules cache since the site was updated
        clearRedirectRulesCache(did, rkey);
        logger.info('Site cached successfully', { did, rkey });
        return true;
    } catch (err) {
        logger.error('Failed to cache site', err, { did, rkey });
        return false;
    } finally {
        // Always unmark, even if caching fails
        unmarkSiteAsBeingCached(did, rkey);
    }
}
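The per-site redirect cache keys entries by `did:rkey` and is invalidated whenever a site is re-cached. The same pattern in miniature — a generic keyed cache sketch, not the module's actual exports:

```typescript
// Minimal keyed cache mirroring the `${did}:${rkey}` key pattern above.
class SiteKeyedCache<T> {
    private map = new Map<string, T>();

    private key(did: string, rkey: string): string {
        return `${did}:${rkey}`;
    }

    get(did: string, rkey: string): T | undefined {
        return this.map.get(this.key(did, rkey));
    }

    set(did: string, rkey: string, value: T): void {
        this.map.set(this.key(did, rkey), value);
    }

    // Invalidate on site update, as ensureSiteCached does after a recache
    clear(did: string, rkey: string): void {
        this.map.delete(this.key(did, rkey));
    }
}
```

Keeping the key derivation in one place avoids the classic bug where one call site formats the key as `did:rkey` and another as `did/rkey`.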
+27
apps/hosting-service/src/lib/types.ts
···
import type { BlobRef } from '@atproto/api';

export interface WispFsRecord {
    $type: 'place.wisp.fs';
    site: string;
    root: Directory;
    fileCount?: number;
    createdAt: string;
}

export interface File {
    $type?: 'place.wisp.fs#file';
    type: 'file';
    blob: BlobRef;
}

export interface Directory {
    $type?: 'place.wisp.fs#directory';
    type: 'directory';
    entries: Entry[];
}

export interface Entry {
    $type?: 'place.wisp.fs#entry';
    name: string;
    node: File | Directory | { $type: string };
}
+169
apps/hosting-service/src/lib/utils.test.ts
···
import { describe, test, expect } from 'bun:test';
import { sanitizePath, extractBlobCid } from './utils';
import { CID } from 'multiformats';

describe('sanitizePath', () => {
    test('allows normal file paths', () => {
        expect(sanitizePath('index.html')).toBe('index.html');
        expect(sanitizePath('css/styles.css')).toBe('css/styles.css');
        expect(sanitizePath('images/logo.png')).toBe('images/logo.png');
        expect(sanitizePath('js/app.js')).toBe('js/app.js');
    });

    test('allows deeply nested paths', () => {
        expect(sanitizePath('assets/images/icons/favicon.ico')).toBe('assets/images/icons/favicon.ico');
        expect(sanitizePath('a/b/c/d/e/f.txt')).toBe('a/b/c/d/e/f.txt');
    });

    test('removes leading slashes', () => {
        expect(sanitizePath('/index.html')).toBe('index.html');
        expect(sanitizePath('//index.html')).toBe('index.html');
        expect(sanitizePath('///index.html')).toBe('index.html');
        expect(sanitizePath('/css/styles.css')).toBe('css/styles.css');
    });

    test('blocks parent directory traversal', () => {
        expect(sanitizePath('../etc/passwd')).toBe('etc/passwd');
        expect(sanitizePath('../../etc/passwd')).toBe('etc/passwd');
        expect(sanitizePath('../../../etc/passwd')).toBe('etc/passwd');
        expect(sanitizePath('css/../../../etc/passwd')).toBe('css/etc/passwd');
    });

    test('blocks directory traversal in the middle of a path', () => {
        expect(sanitizePath('images/../../../etc/passwd')).toBe('images/etc/passwd');
        // Note: sanitizePath only filters out ".." segments; it doesn't resolve paths
        expect(sanitizePath('a/b/../c')).toBe('a/b/c');
        expect(sanitizePath('a/../b/../c')).toBe('a/b/c');
    });

    test('removes current directory references', () => {
        expect(sanitizePath('./index.html')).toBe('index.html');
        expect(sanitizePath('././index.html')).toBe('index.html');
        expect(sanitizePath('css/./styles.css')).toBe('css/styles.css');
        expect(sanitizePath('./css/./styles.css')).toBe('css/styles.css');
    });

    test('removes empty path segments', () => {
        expect(sanitizePath('css//styles.css')).toBe('css/styles.css');
        expect(sanitizePath('css///styles.css')).toBe('css/styles.css');
        expect(sanitizePath('a//b//c')).toBe('a/b/c');
    });

    test('blocks null bytes', () => {
        // Null bytes cause the entire segment to be filtered out
        expect(sanitizePath('index.html\0.txt')).toBe('');
        expect(sanitizePath('test\0')).toBe('');
        // Null byte in a middle segment
        expect(sanitizePath('css/bad\0name/styles.css')).toBe('css/styles.css');
    });

    test('handles mixed attacks', () => {
        expect(sanitizePath('/../../../etc/passwd')).toBe('etc/passwd');
        expect(sanitizePath('/./././../etc/passwd')).toBe('etc/passwd');
        expect(sanitizePath('//../../.\0./etc/passwd')).toBe('etc/passwd');
    });

    test('handles edge cases', () => {
        expect(sanitizePath('')).toBe('');
        expect(sanitizePath('/')).toBe('');
        expect(sanitizePath('//')).toBe('');
        expect(sanitizePath('.')).toBe('');
        expect(sanitizePath('..')).toBe('');
        expect(sanitizePath('../..')).toBe('');
    });

    test('preserves valid special characters in filenames', () => {
        expect(sanitizePath('file-name.html')).toBe('file-name.html');
        expect(sanitizePath('file_name.html')).toBe('file_name.html');
        expect(sanitizePath('file.name.html')).toBe('file.name.html');
        expect(sanitizePath('file (1).html')).toBe('file (1).html');
        expect(sanitizePath('file@2x.png')).toBe('file@2x.png');
    });

    test('handles Unicode characters', () => {
        expect(sanitizePath('ๆ–‡ไปถ.html')).toBe('ๆ–‡ไปถ.html');
        expect(sanitizePath('ั„ะฐะนะป.html')).toBe('ั„ะฐะนะป.html');
        expect(sanitizePath('ใƒ•ใ‚กใ‚คใƒซ.html')).toBe('ใƒ•ใ‚กใ‚คใƒซ.html');
    });
});

describe('extractBlobCid', () => {
    const TEST_CID = 'bafkreid7ybejd5s2vv2j7d4aajjlmdgazguemcnuliiyfn6coxpwp2mi6y';

    test('extracts CID from an IPLD link', () => {
        const blobRef = { $link: TEST_CID };
        expect(extractBlobCid(blobRef)).toBe(TEST_CID);
    });

    test('extracts CID from a typed BlobRef with a CID object', () => {
        const cid = CID.parse(TEST_CID);
        const blobRef = { ref: cid };
        const result = extractBlobCid(blobRef);
        expect(result).toBe(TEST_CID);
    });

    test('extracts CID from a typed BlobRef with an IPLD link', () => {
        const blobRef = {
            ref: { $link: TEST_CID }
        };
        expect(extractBlobCid(blobRef)).toBe(TEST_CID);
    });

    test('extracts CID from an untyped BlobRef', () => {
        const blobRef = { cid: TEST_CID };
        expect(extractBlobCid(blobRef)).toBe(TEST_CID);
    });

    test('returns null for an invalid blob ref', () => {
        expect(extractBlobCid(null)).toBe(null);
        expect(extractBlobCid(undefined)).toBe(null);
        expect(extractBlobCid({})).toBe(null);
        expect(extractBlobCid('not-an-object')).toBe(null);
        expect(extractBlobCid(123)).toBe(null);
    });

    test('returns null for malformed objects', () => {
        expect(extractBlobCid({ wrongKey: 'value' })).toBe(null);
        expect(extractBlobCid({ ref: 'not-a-cid' })).toBe(null);
        expect(extractBlobCid({ ref: {} })).toBe(null);
    });

    test('handles nested structures from the AT Protocol API', () => {
        // Real structure from AT Protocol
        const blobRef = {
            $type: 'blob',
            ref: CID.parse(TEST_CID),
            mimeType: 'text/html',
            size: 1234
        };
        expect(extractBlobCid(blobRef)).toBe(TEST_CID);
    });

    test('handles a BlobRef with additional properties', () => {
        const blobRef = {
            ref: { $link: TEST_CID },
            mimeType: 'image/png',
            size: 5678,
            someOtherField: 'value'
        };
        expect(extractBlobCid(blobRef)).toBe(TEST_CID);
    });

    test('prioritizes checking the IPLD link first', () => {
        // A direct $link takes precedence
        const directLink = { $link: TEST_CID };
        expect(extractBlobCid(directLink)).toBe(TEST_CID);
    });

    test('handles CIDv0 format', () => {
        const cidV0 = 'QmZ4tDuvesekSs4qM5ZBKpXiZGun7S2CYtEZRB3DYXkjGx';
        const blobRef = { $link: cidV0 };
        expect(extractBlobCid(blobRef)).toBe(cidV0);
    });

    test('handles CIDv1 format', () => {
        const cidV1 = 'bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi';
        const blobRef = { $link: cidV1 };
        expect(extractBlobCid(blobRef)).toBe(cidV1);
    });
});
+666
apps/hosting-service/src/lib/utils.ts
···
import { AtpAgent } from '@atproto/api';
import type { Record as WispFsRecord, Directory, Entry, File } from '@wisp/lexicons/types/place/wisp/fs';
import type { Record as SubfsRecord } from '@wisp/lexicons/types/place/wisp/subfs';
import type { Record as WispSettings } from '@wisp/lexicons/types/place/wisp/settings';
import { existsSync, mkdirSync, readFileSync, rmSync } from 'fs';
import { writeFile, readFile, rename } from 'fs/promises';
import { safeFetchJson, safeFetchBlob } from '@wisp/safe-fetch';
import { CID } from 'multiformats';
import { extractBlobCid } from '@wisp/atproto-utils';
import { sanitizePath, collectFileCidsFromEntries } from '@wisp/fs-utils';
import { shouldCompressMimeType } from '@wisp/atproto-utils/compression';

// Re-export shared utilities for local usage and tests
export { extractBlobCid, sanitizePath };

const CACHE_DIR = process.env.CACHE_DIR || './cache/sites';
const CACHE_TTL = 14 * 24 * 60 * 60 * 1000; // 14-day cache TTL

interface CacheMetadata {
    recordCid: string;
    cachedAt: number;
    did: string;
    rkey: string;
    // Map of file path to blob CID, used for incremental updates
    fileCids?: Record<string, string>;
    // Site settings
    settings?: WispSettings;
}

export async function resolveDid(identifier: string): Promise<string | null> {
    try {
        // If it's already a DID, return it
        if (identifier.startsWith('did:')) {
            return identifier;
        }

        // Otherwise, resolve the handle using the agent's built-in method
        const agent = new AtpAgent({ service: 'https://public.api.bsky.app' });
        const response = await agent.resolveHandle({ handle: identifier });
        return response.data.did;
    } catch (err) {
        console.error('Failed to resolve identifier', identifier, err);
        return null;
    }
}

export async function getPdsForDid(did: string): Promise<string | null> {
    try {
        let doc;

        if (did.startsWith('did:plc:')) {
            doc = await safeFetchJson(`https://plc.directory/${encodeURIComponent(did)}`);
        } else if (did.startsWith('did:web:')) {
            const didUrl = didWebToHttps(did);
            doc = await safeFetchJson(didUrl);
        } else {
            console.error('Unsupported DID method', did);
            return null;
        }

        const services = doc.service || [];
        const pdsService = services.find((s: any) => s.id === '#atproto_pds');

        return pdsService?.serviceEndpoint || null;
    } catch (err) {
        console.error('Failed to get PDS for DID', did, err);
        return null;
    }
+
}
+
+
function didWebToHttps(did: string): string {
+
const didParts = did.split(':');
+
if (didParts.length < 3 || didParts[0] !== 'did' || didParts[1] !== 'web') {
+
throw new Error('Invalid did:web format');
+
}
+
+
const domain = didParts[2];
+
const pathParts = didParts.slice(3);
+
+
if (pathParts.length === 0) {
+
return `https://${domain}/.well-known/did.json`;
+
} else {
+
const path = pathParts.join('/');
+
return `https://${domain}/${path}/did.json`;
+
}
+
}
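A quick check of the mapping above (restated inline so the example runs standalone; it mirrors `didWebToHttps`):

```typescript
// Standalone restatement of the did:web -> DID document URL mapping above.
function didWebUrl(did: string): string {
    const parts = did.split(':')
    if (parts.length < 3 || parts[0] !== 'did' || parts[1] !== 'web') {
        throw new Error('Invalid did:web format')
    }
    const domain = parts[2]
    const pathParts = parts.slice(3)
    // Bare domain -> well-known location; extra segments -> path components
    return pathParts.length === 0
        ? `https://${domain}/.well-known/did.json`
        : `https://${domain}/${pathParts.join('/')}/did.json`
}
```

So `did:web:example.com` resolves to `https://example.com/.well-known/did.json`, while colon-separated segments become path components: `did:web:example.com:users:alice` resolves to `https://example.com/users/alice/did.json`.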
+
+
export async function fetchSiteRecord(did: string, rkey: string): Promise<{ record: WispFsRecord; cid: string } | null> {
+
try {
+
const pdsEndpoint = await getPdsForDid(did);
+
if (!pdsEndpoint) return null;
+
+
const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(rkey)}`;
+
const data = await safeFetchJson(url);
+
+
return {
+
record: data.value as WispFsRecord,
+
cid: data.cid || ''
+
};
+
} catch (err) {
+
console.error('Failed to fetch site record', did, rkey, err);
+
return null;
+
}
+
}
+
+
export async function fetchSiteSettings(did: string, rkey: string): Promise<WispSettings | null> {
+
try {
+
const pdsEndpoint = await getPdsForDid(did);
+
if (!pdsEndpoint) return null;
+
+
const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.settings&rkey=${encodeURIComponent(rkey)}`;
+
const data = await safeFetchJson(url);
+
+
return data.value as WispSettings;
+
} catch (err) {
+
// Settings are optional, so return null if not found
+
return null;
+
}
+
}
+
+
/**
+
* Extract all subfs URIs from a directory tree with their mount paths
+
*/
+
function extractSubfsUris(directory: Directory, currentPath: string = ''): Array<{ uri: string; path: string }> {
+
const uris: Array<{ uri: string; path: string }> = [];
+
+
for (const entry of directory.entries) {
+
const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
+
if ('type' in entry.node) {
+
if (entry.node.type === 'subfs') {
+
// Subfs node with subject URI
+
const subfsNode = entry.node as any;
+
if (subfsNode.subject) {
+
uris.push({ uri: subfsNode.subject, path: fullPath });
+
}
+
} else if (entry.node.type === 'directory') {
+
// Recursively search subdirectories
+
const subUris = extractSubfsUris(entry.node as Directory, fullPath);
+
uris.push(...subUris);
+
}
+
}
+
}
+
+
return uris;
+
}
+
+
/**
+
* Fetch a subfs record from the PDS
+
*/
+
async function fetchSubfsRecord(uri: string, pdsEndpoint: string): Promise<SubfsRecord | null> {
+
try {
+
// Parse URI: at://did/collection/rkey
+
const parts = uri.replace('at://', '').split('/');
+
if (parts.length < 3) {
+
console.error('Invalid subfs URI:', uri);
+
return null;
+
}
+
+
const did = parts[0] || '';
+
const collection = parts[1] || '';
+
const rkey = parts[2] || '';
+
+
// Fetch the record from PDS
+
const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=${encodeURIComponent(collection)}&rkey=${encodeURIComponent(rkey)}`;
+
const response = await safeFetchJson(url);
+
+
if (!response || !response.value) {
+
console.error('Subfs record not found:', uri);
+
return null;
+
}
+
+
return response.value as SubfsRecord;
+
} catch (err) {
+
console.error('Failed to fetch subfs record:', uri, err);
+
return null;
+
}
+
}

+
+
/**
+
* Replace subfs nodes in a directory tree with their actual content
+
* Subfs entries are "merged" - their root entries are hoisted into the parent directory
+
*/
+
async function expandSubfsNodes(directory: Directory, pdsEndpoint: string): Promise<Directory> {
+
// Extract all subfs URIs
+
const subfsUris = extractSubfsUris(directory);
+
+
if (subfsUris.length === 0) {
+
// No subfs nodes, return as-is
+
return directory;
+
}
+
+
console.log(`Found ${subfsUris.length} subfs records, fetching...`);
+
+
// Fetch all subfs records in parallel
+
const subfsRecords = await Promise.all(
+
subfsUris.map(async ({ uri, path }) => {
+
const record = await fetchSubfsRecord(uri, pdsEndpoint);
+
return { record, path };
+
})
+
);
+
+
// Build a map of path -> root entries to merge
+
// Note: SubFS entries are compatible with FS entries at runtime
+
const subfsMap = new Map<string, Entry[]>();
+
for (const { record, path } of subfsRecords) {
+
if (record && record.root && record.root.entries) {
+
subfsMap.set(path, record.root.entries as unknown as Entry[]);
+
}
+
}
+
+
// Replace subfs nodes by merging their root entries into the parent directory
+
function replaceSubfsInEntries(entries: Entry[], currentPath: string = ''): Entry[] {
+
const result: Entry[] = [];
+
+
for (const entry of entries) {
+
const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
const node = entry.node;
+
+
if ('type' in node && node.type === 'subfs') {
+
// Check if this is a flat merge or subdirectory merge (default to flat if not specified)
+
const subfsNode = node as any;
+
const isFlat = subfsNode.flat !== false; // Default to true
+
const subfsEntries = subfsMap.get(fullPath);
+
+
if (subfsEntries) {
+
console.log(`Merging subfs node at ${fullPath} (${subfsEntries.length} entries, flat: ${isFlat})`);
+
+
if (isFlat) {
+
// Flat merge: hoist entries directly into parent directory
+
const processedEntries = replaceSubfsInEntries(subfsEntries, currentPath);
+
result.push(...processedEntries);
+
} else {
+
// Subdirectory merge: create a directory with the subfs node's name
+
const processedEntries = replaceSubfsInEntries(subfsEntries, fullPath);
+
const directoryNode: Directory = {
+
type: 'directory',
+
entries: processedEntries
+
};
+
result.push({
+
name: entry.name,
+
node: directoryNode as any // Type assertion needed due to lexicon type complexity
+
});
+
}
+
} else {
+
// If fetch failed, skip this entry
+
console.warn(`Failed to fetch subfs at ${fullPath}, skipping`);
+
}
+
} else if ('type' in node && node.type === 'directory' && 'entries' in node) {
+
// Recursively process subdirectories
+
result.push({
+
...entry,
+
node: {
+
...node,
+
entries: replaceSubfsInEntries(node.entries, fullPath)
+
}
+
});
+
} else {
+
// Regular file entry
+
result.push(entry);
+
}
+
}
+
+
return result;
+
}
+
+
return {
+
...directory,
+
entries: replaceSubfsInEntries(directory.entries)
+
};
+
}
+
+
+
export async function downloadAndCacheSite(did: string, rkey: string, record: WispFsRecord, pdsEndpoint: string, recordCid: string): Promise<void> {
+
console.log('Caching site', did, rkey);
+
+
if (!record.root) {
+
console.error('Record missing root directory:', JSON.stringify(record, null, 2));
+
throw new Error('Invalid record structure: missing root directory');
+
}
+
+
if (!record.root.entries || !Array.isArray(record.root.entries)) {
+
console.error('Record root missing entries array:', JSON.stringify(record.root, null, 2));
+
throw new Error('Invalid record structure: root missing entries array');
+
}
+
+
// Expand subfs nodes before caching
+
const expandedRoot = await expandSubfsNodes(record.root, pdsEndpoint);
+
+
// Get existing cache metadata to check for incremental updates
+
const existingMetadata = await getCacheMetadata(did, rkey);
+
const existingFileCids = existingMetadata?.fileCids || {};
+
+
// Use a temporary directory with timestamp to avoid collisions
+
const tempSuffix = `.tmp-${Date.now()}-${Math.random().toString(36).slice(2, 9)}`;
+
const tempDir = `${CACHE_DIR}/${did}/${rkey}${tempSuffix}`;
+
const finalDir = `${CACHE_DIR}/${did}/${rkey}`;
+
+
try {
+
// Collect file CIDs from the new record (using expanded root)
+
const newFileCids: Record<string, string> = {};
+
collectFileCidsFromEntries(expandedRoot.entries, '', newFileCids);
+
+
// Fetch site settings (optional)
+
const settings = await fetchSiteSettings(did, rkey);
+
+
// Download/copy files to temporary directory (with incremental logic, using expanded root)
+
await cacheFiles(did, rkey, expandedRoot.entries, pdsEndpoint, '', tempSuffix, existingFileCids, finalDir);
+
await saveCacheMetadata(did, rkey, recordCid, tempSuffix, newFileCids, settings);
+
+
// Atomically replace old cache with new cache
+
// On POSIX systems (Linux/macOS), rename is atomic
+
if (existsSync(finalDir)) {
+
// Rename old directory to backup
+
const backupDir = `${finalDir}.old-${Date.now()}`;
+
await rename(finalDir, backupDir);
+
+
try {
+
// Rename new directory to final location
+
await rename(tempDir, finalDir);
+
+
// Clean up old backup
+
rmSync(backupDir, { recursive: true, force: true });
+
} catch (err) {
+
// If rename failed, restore backup
+
if (existsSync(backupDir) && !existsSync(finalDir)) {
+
await rename(backupDir, finalDir);
+
}
+
throw err;
+
}
+
} else {
+
// No existing cache, just rename temp to final
+
await rename(tempDir, finalDir);
+
}
+
+
console.log('Successfully cached site atomically', did, rkey);
+
} catch (err) {
+
// Clean up temp directory on failure
+
if (existsSync(tempDir)) {
+
rmSync(tempDir, { recursive: true, force: true });
+
}
+
throw err;
+
}
+
}
+
+
+
async function cacheFiles(
+
did: string,
+
site: string,
+
entries: Entry[],
+
pdsEndpoint: string,
+
pathPrefix: string,
+
dirSuffix: string = '',
+
existingFileCids: Record<string, string> = {},
+
existingCacheDir?: string
+
): Promise<void> {
+
// Collect file tasks, separating unchanged files from new/changed files
+
const downloadTasks: Array<() => Promise<void>> = [];
+
const copyTasks: Array<() => Promise<void>> = [];
+
+
function collectFileTasks(
+
entries: Entry[],
+
currentPathPrefix: string
+
) {
+
for (const entry of entries) {
+
const currentPath = currentPathPrefix ? `${currentPathPrefix}/${entry.name}` : entry.name;
+
const node = entry.node;
+
+
if ('type' in node && node.type === 'directory' && 'entries' in node) {
+
collectFileTasks(node.entries, currentPath);
+
} else if ('type' in node && node.type === 'file' && 'blob' in node) {
+
const fileNode = node as File;
+
const cid = extractBlobCid(fileNode.blob);
+
+
// Check if file is unchanged (same CID as existing cache)
+
if (cid && existingFileCids[currentPath] === cid && existingCacheDir) {
+
// File unchanged - copy from existing cache instead of downloading
+
copyTasks.push(() => copyExistingFile(
+
did,
+
site,
+
currentPath,
+
dirSuffix,
+
existingCacheDir
+
));
+
} else {
+
// File new or changed - download it
+
downloadTasks.push(() => cacheFileBlob(
+
did,
+
site,
+
currentPath,
+
fileNode.blob,
+
pdsEndpoint,
+
fileNode.encoding,
+
fileNode.mimeType,
+
fileNode.base64,
+
dirSuffix
+
));
+
}
+
}
+
}
+
}
+
+
collectFileTasks(entries, pathPrefix);
+
+
console.log(`[Incremental Update] Files to copy: ${copyTasks.length}, Files to download: ${downloadTasks.length}`);
+
+
// Copy unchanged files in parallel (fast local operations)
+
const copyLimit = 50;
+
for (let i = 0; i < copyTasks.length; i += copyLimit) {
+
const batch = copyTasks.slice(i, i + copyLimit);
+
await Promise.all(batch.map(task => task()));
+
if (copyTasks.length > copyLimit) {
+
console.log(`[Cache Progress] Copied ${Math.min(i + copyLimit, copyTasks.length)}/${copyTasks.length} unchanged files`);
+
}
+
}
+
+
// Download new/changed files concurrently (batches of 20)
+
const downloadLimit = 20;
+
let successCount = 0;
+
let failureCount = 0;
+
+
for (let i = 0; i < downloadTasks.length; i += downloadLimit) {
+
const batch = downloadTasks.slice(i, i + downloadLimit);
+
const results = await Promise.allSettled(batch.map(task => task()));
+
+
// Count successes and failures
+
results.forEach((result, index) => {
+
if (result.status === 'fulfilled') {
+
successCount++;
+
} else {
+
failureCount++;
+
console.error(`[Cache] Failed to download file (continuing with others):`, result.reason);
+
}
+
});
+
+
if (downloadTasks.length > downloadLimit) {
+
console.log(`[Cache Progress] Downloaded ${Math.min(i + downloadLimit, downloadTasks.length)}/${downloadTasks.length} files (${failureCount} failed)`);
+
}
+
}
+
+
if (failureCount > 0) {
+
console.warn(`[Cache] Completed with ${successCount} successful and ${failureCount} failed file downloads`);
+
}
+
}
+
+
/**
+
* Copy an unchanged file from existing cache to new cache location
+
*/
+
async function copyExistingFile(
+
did: string,
+
site: string,
+
filePath: string,
+
dirSuffix: string,
+
existingCacheDir: string
+
): Promise<void> {
+
const { copyFile } = await import('fs/promises');
+
+
const sourceFile = `${existingCacheDir}/${filePath}`;
+
const destFile = `${CACHE_DIR}/${did}/${site}${dirSuffix}/${filePath}`;
+
const destDir = destFile.substring(0, destFile.lastIndexOf('/'));
+
+
// Create destination directory if needed
+
if (destDir && !existsSync(destDir)) {
+
mkdirSync(destDir, { recursive: true });
+
}
+
+
try {
+
// Copy the file
+
await copyFile(sourceFile, destFile);
+
+
// Copy metadata file if it exists
+
const sourceMetaFile = `${sourceFile}.meta`;
+
const destMetaFile = `${destFile}.meta`;
+
if (existsSync(sourceMetaFile)) {
+
await copyFile(sourceMetaFile, destMetaFile);
+
}
+
} catch (err) {
+
console.error(`Failed to copy cached file ${filePath}:`, err);
+
throw err;
+
}
+
}
+
+
async function cacheFileBlob(
+
did: string,
+
site: string,
+
filePath: string,
+
blobRef: any,
+
pdsEndpoint: string,
+
encoding?: 'gzip',
+
mimeType?: string,
+
base64?: boolean,
+
dirSuffix: string = ''
+
): Promise<void> {
+
const cid = extractBlobCid(blobRef);
+
if (!cid) {
+
console.error('Could not extract CID from blob', blobRef);
+
return;
+
}
+
+
const blobUrl = `${pdsEndpoint}/xrpc/com.atproto.sync.getBlob?did=${encodeURIComponent(did)}&cid=${encodeURIComponent(cid)}`;
+
+
console.log(`[Cache] Fetching blob for file: ${filePath}, CID: ${cid}`);
+
+
// Allow up to 500MB per file blob, with 5 minute timeout
+
let content = await safeFetchBlob(blobUrl, { maxSize: 500 * 1024 * 1024, timeout: 300000 });
+
+
// If content is base64-encoded, decode it back to raw binary (gzipped or not)
+
if (base64) {
+
// The blob stores base64-encoded text as raw bytes; decode the
+
// text back into raw binary (the payload may itself be gzipped)
+
const textDecoder = new TextDecoder();
+
const base64String = textDecoder.decode(content);
+
content = Buffer.from(base64String, 'base64');
+
}
+
+
const cacheFile = `${CACHE_DIR}/${did}/${site}${dirSuffix}/${filePath}`;
+
const fileDir = cacheFile.substring(0, cacheFile.lastIndexOf('/'));
+
+
if (fileDir && !existsSync(fileDir)) {
+
mkdirSync(fileDir, { recursive: true });
+
}
+
+
// Use the shared function to determine if this should remain compressed
+
const shouldStayCompressed = shouldCompressMimeType(mimeType);
+
+
// Decompress files that shouldn't be stored compressed
+
if (encoding === 'gzip' && !shouldStayCompressed && content.length >= 2 &&
+
content[0] === 0x1f && content[1] === 0x8b) {
+
try {
+
const { gunzipSync } = await import('zlib');
+
const decompressed = gunzipSync(content);
+
content = decompressed;
+
// Clear the encoding flag since we're storing decompressed
+
encoding = undefined;
+
} catch (error) {
+
console.error(`Failed to decompress ${filePath}, storing original gzipped content:`, error);
+
}
+
}
+
+
await writeFile(cacheFile, content);
+
+
// Store metadata only if file is still compressed
+
if (encoding === 'gzip' && mimeType) {
+
const metaFile = `${cacheFile}.meta`;
+
await writeFile(metaFile, JSON.stringify({ encoding, mimeType }));
+
console.log('Cached file', filePath, content.length, 'bytes (gzipped,', mimeType + ')');
+
} else {
+
console.log('Cached file', filePath, content.length, 'bytes');
+
}
+
}
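The decompression guard above keys off the two-byte gzip magic number before calling `gunzipSync`. In isolation (a sketch, assuming Node's built-in `zlib`):

```typescript
import { gzipSync } from 'zlib'

// gzip streams always begin with the magic bytes 0x1f 0x8b;
// checking them avoids attempting gunzip on non-gzip content.
function looksGzipped(buf: Buffer): boolean {
    return buf.length >= 2 && buf[0] === 0x1f && buf[1] === 0x8b
}
```

This is a cheap pre-check, not a guarantee of a valid stream, which is why the code above still wraps the actual decompression in a try/catch.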
+
+
+
export function getCachedFilePath(did: string, site: string, filePath: string): string {
+
const sanitizedPath = sanitizePath(filePath);
+
return `${CACHE_DIR}/${did}/${site}/${sanitizedPath}`;
+
}
+
+
export function isCached(did: string, site: string): boolean {
+
return existsSync(`${CACHE_DIR}/${did}/${site}`);
+
}
+
+
async function saveCacheMetadata(did: string, rkey: string, recordCid: string, dirSuffix: string = '', fileCids?: Record<string, string>, settings?: WispSettings | null): Promise<void> {
+
const metadata: CacheMetadata = {
+
recordCid,
+
cachedAt: Date.now(),
+
did,
+
rkey,
+
fileCids,
+
settings: settings || undefined
+
};
+
+
const metadataPath = `${CACHE_DIR}/${did}/${rkey}${dirSuffix}/.metadata.json`;
+
const metadataDir = metadataPath.substring(0, metadataPath.lastIndexOf('/'));
+
+
if (!existsSync(metadataDir)) {
+
mkdirSync(metadataDir, { recursive: true });
+
}
+
+
await writeFile(metadataPath, JSON.stringify(metadata, null, 2));
+
}
+
+
async function getCacheMetadata(did: string, rkey: string): Promise<CacheMetadata | null> {
+
try {
+
const metadataPath = `${CACHE_DIR}/${did}/${rkey}/.metadata.json`;
+
if (!existsSync(metadataPath)) return null;
+
+
const content = await readFile(metadataPath, 'utf-8');
+
return JSON.parse(content) as CacheMetadata;
+
} catch (err) {
+
console.error('Failed to read cache metadata', err);
+
return null;
+
}
+
}
+
+
export async function getCachedSettings(did: string, rkey: string): Promise<WispSettings | null> {
+
const metadata = await getCacheMetadata(did, rkey);
+
+
// If metadata has settings, return them
+
if (metadata?.settings) {
+
return metadata.settings;
+
}
+
+
// If metadata exists but has no settings, try to fetch from PDS and update cache
+
if (metadata) {
+
console.log('[Cache] Metadata missing settings, fetching from PDS', { did, rkey });
+
try {
+
const settings = await fetchSiteSettings(did, rkey);
+
if (settings) {
+
// Update the cached metadata with the fetched settings
+
await updateCacheMetadataSettings(did, rkey, settings);
+
console.log('[Cache] Updated metadata with fetched settings', { did, rkey });
+
return settings;
+
}
+
} catch (err) {
+
console.error('[Cache] Failed to fetch/update settings', { did, rkey, err });
+
}
+
}
+
+
return null;
+
}
+
+
export async function updateCacheMetadataSettings(did: string, rkey: string, settings: WispSettings | null): Promise<void> {
+
const metadataPath = `${CACHE_DIR}/${did}/${rkey}/.metadata.json`;
+
+
if (!existsSync(metadataPath)) {
+
console.warn('Metadata file does not exist, cannot update settings', { did, rkey });
+
return;
+
}
+
+
try {
+
// Read existing metadata
+
const content = await readFile(metadataPath, 'utf-8');
+
const metadata = JSON.parse(content) as CacheMetadata;
+
+
// Update settings field
+
metadata.settings = settings || undefined;
+
+
// Write back to disk
+
await writeFile(metadataPath, JSON.stringify(metadata, null, 2), 'utf-8');
+
console.log('Updated metadata settings', { did, rkey, hasSettings: !!settings });
+
} catch (err) {
+
console.error('Failed to update metadata settings', err);
+
throw err;
+
}
+
}
+
+
export async function isCacheValid(did: string, rkey: string, currentRecordCid?: string): Promise<boolean> {
+
const metadata = await getCacheMetadata(did, rkey);
+
if (!metadata) return false;
+
+
// Check if cache has expired (14 days TTL)
+
const cacheAge = Date.now() - metadata.cachedAt;
+
if (cacheAge > CACHE_TTL) {
+
console.log('[Cache] Cache expired for', did, rkey);
+
return false;
+
}
+
+
// If current CID is provided, verify it matches
+
if (currentRecordCid && metadata.recordCid !== currentRecordCid) {
+
console.log('[Cache] CID mismatch for', did, rkey, 'cached:', metadata.recordCid, 'current:', currentRecordCid);
+
return false;
+
}
+
+
return true;
+
}
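The validity rule above reduces to a pure predicate (a sketch of the same logic; names invented here for illustration):

```typescript
const CACHE_TTL_MS = 14 * 24 * 60 * 60 * 1000 // 14 days, matching CACHE_TTL above

// A cache entry is valid when it is within TTL and, if the caller
// knows the current record CID, the cached CID matches it.
function cacheStillValid(
    cachedAt: number,
    cachedCid: string,
    now: number,
    currentCid?: string
): boolean {
    if (now - cachedAt > CACHE_TTL_MS) return false
    if (currentCid && cachedCid !== currentCid) return false
    return true
}
```

When no current CID is supplied, the check degrades to TTL-only, which is what lets the hosting service serve from cache without hitting the PDS on every request.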
+234
apps/hosting-service/src/server.ts
···
+
/**
+
* Main server entry point for the hosting service
+
* Handles routing and request dispatching
+
*/
+
+
import { Hono } from 'hono';
+
import { cors } from 'hono/cors';
+
import { getWispDomain, getCustomDomain, getCustomDomainByHash } from './lib/db';
+
import { resolveDid } from './lib/utils';
+
import { logCollector, errorTracker, metricsCollector } from '@wisp/observability';
+
import { observabilityMiddleware, observabilityErrorHandler } from '@wisp/observability/middleware/hono';
+
import { sanitizePath } from '@wisp/fs-utils';
+
import { isSiteBeingCached } from './lib/cache';
+
import { isValidRkey, extractHeaders } from './lib/request-utils';
+
import { siteUpdatingResponse } from './lib/page-generators';
+
import { ensureSiteCached } from './lib/site-cache';
+
import { serveFromCache, serveFromCacheWithRewrite } from './lib/file-serving';
+
+
const BASE_HOST = process.env.BASE_HOST || 'wisp.place';
+
+
const app = new Hono();
+
+
// Add CORS middleware - allow all origins for static site hosting
+
app.use('*', cors({
+
origin: '*',
+
allowMethods: ['GET', 'HEAD', 'OPTIONS'],
+
allowHeaders: ['Content-Type', 'Authorization'],
+
exposeHeaders: ['Content-Length', 'Content-Type', 'Content-Encoding', 'Cache-Control'],
+
maxAge: 86400, // 24 hours
+
credentials: false,
+
}));
+
+
// Add observability middleware
+
app.use('*', observabilityMiddleware('hosting-service'));
+
+
// Error handler
+
app.onError(observabilityErrorHandler('hosting-service'));
+
+
// Main site serving route
+
app.get('/*', async (c) => {
+
const url = new URL(c.req.url);
+
const hostname = c.req.header('host') || '';
+
const rawPath = url.pathname.replace(/^\//, '');
+
const path = sanitizePath(rawPath);
+
+
// Check if this is sites.wisp.place subdomain (strip port for comparison)
+
const hostnameWithoutPort = hostname.split(':')[0];
+
if (hostnameWithoutPort === `sites.${BASE_HOST}`) {
+
// Sanitize the path FIRST to prevent path traversal
+
const sanitizedFullPath = sanitizePath(rawPath);
+
+
// Extract identifier and site from sanitized path: did:plc:123abc/sitename/file.html
+
const pathParts = sanitizedFullPath.split('/');
+
if (pathParts.length < 2) {
+
return c.text('Invalid path format. Expected: /identifier/sitename/path', 400);
+
}
+
+
const identifier = pathParts[0];
+
const site = pathParts[1];
+
const filePath = pathParts.slice(2).join('/');
+
+
// Additional validation: identifier must be a valid DID or handle format
+
if (!identifier || identifier.length < 3 || identifier.includes('..') || identifier.includes('\0')) {
+
return c.text('Invalid identifier', 400);
+
}
+
+
// Validate site parameter exists
+
if (!site) {
+
return c.text('Site name required', 400);
+
}
+
+
// Validate site name (rkey)
+
if (!isValidRkey(site)) {
+
return c.text('Invalid site name', 400);
+
}
+
+
// Resolve identifier to DID
+
const did = await resolveDid(identifier);
+
if (!did) {
+
return c.text('Invalid identifier', 400);
+
}
+
+
// Check if site is currently being cached - return updating response early
+
if (isSiteBeingCached(did, site)) {
+
return siteUpdatingResponse();
+
}
+
+
// Ensure site is cached
+
const cached = await ensureSiteCached(did, site);
+
if (!cached) {
+
return c.text('Site not found', 404);
+
}
+
+
// Serve with HTML path rewriting to handle absolute paths
+
const basePath = `/${identifier}/${site}/`;
+
const headers = extractHeaders(c.req.raw.headers);
+
return serveFromCacheWithRewrite(did, site, filePath, basePath, c.req.url, headers);
+
}
+
+
// Check if this is a DNS hash subdomain
+
const dnsMatch = hostname.match(/^([a-f0-9]{16})\.dns\.(.+)$/);
+
if (dnsMatch) {
+
const hash = dnsMatch[1];
+
const baseDomain = dnsMatch[2];
+
+
if (!hash) {
+
return c.text('Invalid DNS hash', 400);
+
}
+
+
if (baseDomain !== BASE_HOST) {
+
return c.text('Invalid base domain', 400);
+
}
+
+
const customDomain = await getCustomDomainByHash(hash);
+
if (!customDomain) {
+
return c.text('Custom domain not found or not verified', 404);
+
}
+
+
if (!customDomain.rkey) {
+
return c.text('Domain not mapped to a site', 404);
+
}
+
+
const rkey = customDomain.rkey;
+
if (!isValidRkey(rkey)) {
+
return c.text('Invalid site configuration', 500);
+
}
+
+
// Check if site is currently being cached - return updating response early
+
if (isSiteBeingCached(customDomain.did, rkey)) {
+
return siteUpdatingResponse();
+
}
+
+
const cached = await ensureSiteCached(customDomain.did, rkey);
+
if (!cached) {
+
return c.text('Site not found', 404);
+
}
+
+
const headers = extractHeaders(c.req.raw.headers);
+
return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);
+
}
+
+
// Route 2: Registered subdomains - /*.wisp.place/*
+
if (hostname.endsWith(`.${BASE_HOST}`)) {
+
const domainInfo = await getWispDomain(hostname);
+
if (!domainInfo) {
+
return c.text('Subdomain not registered', 404);
+
}
+
+
if (!domainInfo.rkey) {
+
return c.text('Domain not mapped to a site', 404);
+
}
+
+
const rkey = domainInfo.rkey;
+
if (!isValidRkey(rkey)) {
+
return c.text('Invalid site configuration', 500);
+
}
+
+
// Check if site is currently being cached - return updating response early
+
if (isSiteBeingCached(domainInfo.did, rkey)) {
+
return siteUpdatingResponse();
+
}
+
+
const cached = await ensureSiteCached(domainInfo.did, rkey);
+
if (!cached) {
+
return c.text('Site not found', 404);
+
}
+
+
const headers = extractHeaders(c.req.raw.headers);
+
return serveFromCache(domainInfo.did, rkey, path, c.req.url, headers);
+
}
+
+
// Route 1: Custom domains - /*
+
const customDomain = await getCustomDomain(hostname);
+
if (!customDomain) {
+
return c.text('Custom domain not found or not verified', 404);
+
}
+
+
if (!customDomain.rkey) {
+
return c.text('Domain not mapped to a site', 404);
+
}
+
+
const rkey = customDomain.rkey;
+
if (!isValidRkey(rkey)) {
+
return c.text('Invalid site configuration', 500);
+
}
+
+
// Check if site is currently being cached - return updating response early
+
if (isSiteBeingCached(customDomain.did, rkey)) {
+
return siteUpdatingResponse();
+
}
+
+
const cached = await ensureSiteCached(customDomain.did, rkey);
+
if (!cached) {
+
return c.text('Site not found', 404);
+
}
+
+
const headers = extractHeaders(c.req.raw.headers);
+
return serveFromCache(customDomain.did, rkey, path, c.req.url, headers);
+
});
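The handler above dispatches on the hostname in four ordered cases. A hypothetical classifier summarizing that order (names invented; the real handler inlines this logic, and note this sketch strips the port for every branch, whereas the handler above does so only for the sites check):

```typescript
type HostRoute =
    | { kind: 'sites-path' }
    | { kind: 'dns-hash'; hash: string }
    | { kind: 'wisp-subdomain' }
    | { kind: 'custom-domain' }

// Mirrors the branch order of the handler above: sites subdomain,
// DNS-hash subdomain, registered *.wisp.place subdomain, then custom domain.
function classifyHost(hostname: string, baseHost: string): HostRoute {
    const host = hostname.split(':')[0] ?? ''
    if (host === `sites.${baseHost}`) return { kind: 'sites-path' }
    const dns = host.match(/^([a-f0-9]{16})\.dns\.(.+)$/)
    if (dns && dns[2] === baseHost) return { kind: 'dns-hash', hash: dns[1]! }
    if (host.endsWith(`.${baseHost}`)) return { kind: 'wisp-subdomain' }
    return { kind: 'custom-domain' }
}
```

Order matters: `sites.wisp.place` and `<hash>.dns.wisp.place` both end with `.wisp.place`, so they must be checked before the generic subdomain branch.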
+
+
// Internal observability endpoints (for admin panel)
+
app.get('/__internal__/observability/logs', (c) => {
+
const query = c.req.query();
+
const filter: any = {};
+
if (query.level) filter.level = query.level;
+
if (query.service) filter.service = query.service;
+
if (query.search) filter.search = query.search;
+
if (query.eventType) filter.eventType = query.eventType;
+
if (query.limit) filter.limit = parseInt(query.limit as string);
+
return c.json({ logs: logCollector.getLogs(filter) });
+
});
+
+
app.get('/__internal__/observability/errors', (c) => {
+
const query = c.req.query();
+
const filter: any = {};
+
if (query.service) filter.service = query.service;
+
if (query.limit) filter.limit = parseInt(query.limit as string);
+
return c.json({ errors: errorTracker.getErrors(filter) });
+
});
+
+
app.get('/__internal__/observability/metrics', (c) => {
+
const query = c.req.query();
+
const timeWindow = query.timeWindow ? parseInt(query.timeWindow as string) : 3600000;
+
const stats = metricsCollector.getStats('hosting-service', timeWindow);
+
return c.json({ stats, timeWindow });
+
});
+
+
app.get('/__internal__/observability/cache', async (c) => {
+
const { getCacheStats } = await import('./lib/cache');
+
const stats = getCacheStats();
+
return c.json({ cache: stats });
+
});
+
+
export default app;
+36
apps/hosting-service/tsconfig.json
···
+
{
+
"compilerOptions": {
+
/* Base Options */
+
"esModuleInterop": true,
+
"skipLibCheck": true,
+
"target": "es2022",
+
"allowJs": true,
+
"resolveJsonModule": true,
+
"moduleDetection": "force",
+
"isolatedModules": true,
+
"verbatimModuleSyntax": true,
+
+
/* Strictness */
+
"strict": true,
+
"noUncheckedIndexedAccess": true,
+
"noImplicitOverride": true,
+
"forceConsistentCasingInFileNames": true,
+
+
/* Transpiling with TypeScript */
+
"module": "ESNext",
+
"moduleResolution": "bundler",
+
"outDir": "dist",
+
"sourceMap": true,
+
+
/* Code doesn't run in DOM */
+
"lib": ["es2022"],
+
+
/* Workspace Paths */
+
"baseUrl": ".",
+
"paths": {
+
"@wisp/*": ["../../packages/@wisp/*/src"]
+
}
+
},
+
"include": ["src/**/*"],
+
"exclude": ["node_modules", "cache", "dist"]
+
}
+67
apps/main-app/package.json
···
+
{
+
"name": "@wisp/main-app",
+
"version": "1.0.50",
+
"private": true,
+
"scripts": {
+
"test": "bun test",
+
"dev": "bun run --watch src/index.ts",
+
"start": "bun run src/index.ts",
+
"build": "bun build --compile --target bun --outfile server src/index.ts",
+
"screenshot": "bun run scripts/screenshot-sites.ts"
+
},
+
"dependencies": {
+
"@wisp/lexicons": "workspace:*",
+
"@wisp/constants": "workspace:*",
+
"@wisp/observability": "workspace:*",
+
"@wisp/atproto-utils": "workspace:*",
+
"@wisp/database": "workspace:*",
+
"@wisp/fs-utils": "workspace:*",
+
"@atproto/api": "^0.17.3",
+
"@atproto/lex-cli": "^0.9.5",
+
"@atproto/oauth-client-node": "^0.3.9",
+
"@atproto/xrpc-server": "^0.9.5",
+
"@elysiajs/cors": "^1.4.0",
+
"@elysiajs/eden": "^1.4.3",
+
"@elysiajs/openapi": "^1.4.11",
+
"@elysiajs/opentelemetry": "^1.4.6",
+
"@elysiajs/static": "^1.4.2",
+
"@radix-ui/react-checkbox": "^1.3.3",
+
"@radix-ui/react-dialog": "^1.1.15",
+
"@radix-ui/react-label": "^2.1.7",
+
"@radix-ui/react-radio-group": "^1.3.8",
+
"@radix-ui/react-slot": "^1.2.3",
+
"@radix-ui/react-tabs": "^1.1.13",
+
"@tanstack/react-query": "^5.90.2",
+
"actor-typeahead": "^0.1.1",
+
"atproto-ui": "^0.11.3",
+
"class-variance-authority": "^0.7.1",
+
"clsx": "^2.1.1",
+
"elysia": "latest",
+
"iron-session": "^8.0.4",
+
"lucide-react": "^0.546.0",
+
"multiformats": "^13.4.1",
+
"prismjs": "^1.30.0",
+
"react": "^19.2.0",
+
"react-dom": "^19.2.0",
+
"tailwind-merge": "^3.3.1",
+
"tailwindcss": "4",
+
"tw-animate-css": "^1.4.0",
+
"typescript": "^5.9.3",
+
"zlib": "^1.0.5"
+
},
+
"devDependencies": {
+
"@types/react": "^19.2.2",
+
"@types/react-dom": "^19.2.1",
+
"bun-plugin-tailwind": "^0.1.2",
+
"bun-types": "latest",
+
"esbuild": "0.26.0",
+
"playwright": "^1.49.0"
+
},
+
"module": "src/index.js",
+
"trustedDependencies": [
+
"bun",
+
"cbor-extract",
+
"core-js",
+
"protobufjs"
+
]
+
}
+379
apps/main-app/public/acceptable-use/acceptable-use.tsx
···
+
import { createRoot } from 'react-dom/client'
+
import Layout from '@public/layouts'
+
import { Button } from '@public/components/ui/button'
+
import { Card } from '@public/components/ui/card'
+
import { ArrowLeft, Shield, AlertCircle, CheckCircle, Scale } from 'lucide-react'
+
+
function AcceptableUsePage() {
+
return (
+
<div className="min-h-screen bg-background">
+
{/* Header */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() => window.location.href = '/'}
+
>
+
<ArrowLeft className="w-4 h-4 mr-2" />
+
Back to Home
+
</Button>
+
</div>
+
</header>
+
+
{/* Hero Section */}
+
<div className="bg-gradient-to-b from-accent/10 to-background border-b border-border/40">
+
<div className="container mx-auto px-4 py-16 max-w-4xl text-center">
+
<div className="inline-flex items-center justify-center w-16 h-16 rounded-full bg-accent/20 mb-6">
+
<Shield className="w-8 h-8 text-accent" />
+
</div>
+
<h1 className="text-4xl md:text-5xl font-bold mb-4">Acceptable Use Policy</h1>
+
<div className="flex items-center justify-center gap-6 text-sm text-muted-foreground">
+
<div className="flex items-center gap-2">
+
<span className="font-medium">Effective:</span>
+
<span>November 10, 2025</span>
+
</div>
+
<div className="h-4 w-px bg-border"></div>
+
<div className="flex items-center gap-2">
+
<span className="font-medium">Last Updated:</span>
+
<span>November 10, 2025</span>
+
</div>
+
</div>
+
</div>
+
</div>
+
+
{/* Content */}
+
<div className="container mx-auto px-4 py-12 max-w-4xl">
+
<article className="space-y-12">
+
{/* Our Philosophy */}
+
<section>
+
<h2 className="text-3xl font-bold mb-6 text-foreground">Our Philosophy</h2>
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
wisp.place exists to give you a corner of the internet that's truly yours—a place to create, experiment, and express yourself freely. We believe in the open web and the fundamental importance of free expression. We're not here to police your thoughts, moderate your aesthetics, or judge your taste.
+
</p>
+
<p>
+
That said, we're also real people running real servers in real jurisdictions (the United States and the Netherlands), and there are legal and practical limits to what we can host. This policy aims to be as permissive as possible while keeping the lights on and staying on the right side of the law.
+
</p>
+
</div>
+
</section>
+
+
{/* What You Can Do */}
+
<Card className="bg-green-500/5 border-green-500/20 p-8">
+
<div className="flex items-start gap-4">
+
<div className="flex-shrink-0">
+
<CheckCircle className="w-8 h-8 text-green-500" />
+
</div>
+
<div className="space-y-4">
+
<h2 className="text-3xl font-bold text-foreground">What You Can Do</h2>
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
<strong className="text-green-600 dark:text-green-400">Almost anything.</strong> Seriously. Build weird art projects. Write controversial essays. Create spaces that would make corporate platforms nervous. Express unpopular opinions. Make things that are strange, provocative, uncomfortable, or just plain yours.
+
</p>
+
<p>
+
We support creative and personal expression in all its forms, including adult content, political speech, counter-cultural work, and experimental projects.
+
</p>
+
</div>
+
</div>
+
</div>
+
</Card>
+
+
{/* What You Can't Do */}
+
<section>
+
<div className="flex items-center gap-3 mb-6">
+
<AlertCircle className="w-8 h-8 text-red-500" />
+
<h2 className="text-3xl font-bold text-foreground">What You Can't Do</h2>
+
</div>
+
+
<div className="space-y-8">
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Illegal Content</h3>
+
<p className="text-muted-foreground mb-4">
+
Don't host content that's illegal in the United States or the Netherlands. This includes but isn't limited to:
+
</p>
+
<ul className="space-y-3 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Child sexual abuse material (CSAM)</strong> involving real minors in any form</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Realistic or AI-generated depictions</strong> of minors in sexual contexts, including photorealistic renders, deepfakes, or AI-generated imagery</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span><strong>Non-consensual intimate imagery</strong> (revenge porn, deepfakes, hidden camera footage, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Content depicting or facilitating human trafficking, sexual exploitation, or sexual violence</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Instructions for manufacturing explosives, biological weapons, or other instruments designed for mass harm</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Content that facilitates imminent violence or terrorism</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Stolen financial information, credentials, or personal data used for fraud</span>
+
</li>
+
</ul>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Intellectual Property Violations</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
Don't host content that clearly violates someone else's copyright, trademark, or other intellectual property rights. We're required to respond to valid DMCA takedown notices.
+
</p>
+
<p>
+
We understand that copyright law is complicated and sometimes ridiculous. We're not going to proactively scan your site or nitpick over fair use. But if we receive a legitimate legal complaint, we'll have to act on it.
+
</p>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2 border-red-500/30 bg-red-500/5">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Hate Content</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
You can express controversial ideas. You can be offensive. You can make people uncomfortable. But pure hate—content that exists solely to dehumanize, threaten, or incite violence against people based on race, ethnicity, religion, gender, sexual orientation, disability, or similar characteristics—isn't welcome here.
+
</p>
+
<p>
+
There's a difference between "I have deeply unpopular opinions about X" and "People like X should be eliminated." The former is protected expression. The latter isn't.
+
</p>
+
<div className="bg-background/50 border-l-4 border-red-500 p-4 rounded">
+
<p className="font-medium text-foreground">
+
<strong>A note on enforcement:</strong> While we're generally permissive and believe in giving people the benefit of the doubt, hate content is where we draw a hard line. I will be significantly more aggressive in moderating this type of content than anything else on this list. If your site exists primarily to spread hate or recruit people into hateful ideologies, you will be removed swiftly and without extensive appeals. This is non-negotiable.
+
</p>
+
</div>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Adult Content Guidelines</h3>
+
<div className="space-y-4 text-muted-foreground">
+
<p>
+
Adult content is allowed. This includes sexually explicit material, erotica, adult artwork, and NSFW creative expression.
+
</p>
+
<p className="font-medium">However:</p>
+
<ul className="space-y-2">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No content involving real minors in any sexual context whatsoever</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No photorealistic, AI-generated, or otherwise realistic depictions of minors in sexual contexts</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-green-500 mt-1">•</span>
+
<span>Clearly stylized drawings and written fiction are permitted, provided they remain obviously non-photographic in nature</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No non-consensual content (revenge porn, voyeurism, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>No content depicting illegal sexual acts (bestiality, necrophilia, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-yellow-500 mt-1">•</span>
+
<span>Adult content should be clearly marked as such if discoverable through public directories or search</span>
+
</li>
+
</ul>
+
</div>
+
</Card>
+
+
<Card className="p-6 border-2">
+
<h3 className="text-2xl font-semibold mb-4 text-foreground">Malicious Technical Activity</h3>
+
<p className="text-muted-foreground mb-4">Don't use your site to:</p>
+
<ul className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Distribute malware, viruses, or exploits</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Conduct phishing or social engineering attacks</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Launch DDoS attacks or network abuse</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Mine cryptocurrency without explicit user consent</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-red-500 mt-1">•</span>
+
<span>Scrape, spam, or abuse other services</span>
+
</li>
+
</ul>
+
</Card>
+
</div>
+
</section>
+
+
{/* Our Approach to Enforcement */}
+
<section>
+
<div className="flex items-center gap-3 mb-6">
+
<Scale className="w-8 h-8 text-accent" />
+
<h2 className="text-3xl font-bold text-foreground">Our Approach to Enforcement</h2>
+
</div>
+
<div className="space-y-6">
+
<div className="space-y-4 text-lg leading-relaxed text-muted-foreground">
+
<p>
+
<strong>We actively monitor for obvious violations.</strong> Not to censor your creativity or police your opinions, but to catch the clear-cut stuff that threatens the service's existence and makes this a worse place for everyone. We're looking for the blatantly illegal, the obviously harmful—the stuff that would get servers seized and communities destroyed.
+
</p>
+
<p>
+
We're not reading your blog posts looking for wrongthink. We're making sure this platform doesn't become a haven for the kind of content that ruins good things.
+
</p>
+
</div>
+
+
<Card className="p-6 bg-muted/30">
+
<p className="font-semibold mb-3 text-foreground">We take action when:</p>
+
<ol className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">1.</span>
+
<span>We identify content that clearly violates this policy during routine monitoring</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">2.</span>
+
<span>We receive a valid legal complaint (DMCA, court order, etc.)</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">3.</span>
+
<span>Someone reports content that violates this policy and we can verify the violation</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="font-bold text-accent">4.</span>
+
<span>Your site is causing technical problems for the service or other users</span>
+
</li>
+
</ol>
+
</Card>
+
+
<Card className="p-6 bg-muted/30">
+
<p className="font-semibold mb-3 text-foreground">When we do need to take action, we'll try to:</p>
+
<ul className="space-y-2 text-muted-foreground">
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Contact you first when legally and practically possible</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Be transparent about what's happening and why</span>
+
</li>
+
<li className="flex items-start gap-3">
+
<span className="text-accent">•</span>
+
<span>Give you an opportunity to address the issue if appropriate</span>
+
</li>
+
</ul>
+
</Card>
+
+
<p className="text-muted-foreground">
+
For serious or repeated violations, we may suspend or terminate your account.
+
</p>
+
</div>
+
</section>
+
+
{/* Regional Compliance */}
+
<Card className="p-6 bg-blue-500/5 border-blue-500/20">
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Regional Compliance</h2>
+
<p className="text-muted-foreground">
+
Our servers are located in the United States and the Netherlands. Content hosted on wisp.place must comply with the laws of both jurisdictions. While we aim to provide broad creative freedom, these legal requirements are non-negotiable.
+
</p>
+
</Card>
+
+
{/* Changes to This Policy */}
+
<section>
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Changes to This Policy</h2>
+
<p className="text-muted-foreground">
+
We may update this policy as legal requirements or service realities change. If we make significant changes, we'll notify active users.
+
</p>
+
</section>
+
+
{/* Questions or Reports */}
+
<section>
+
<h2 className="text-2xl font-bold mb-4 text-foreground">Questions or Reports</h2>
+
<p className="text-muted-foreground">
+
If you have questions about this policy or need to report a violation, contact us at{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
.
+
</p>
+
</section>
+
+
{/* Final Message */}
+
<Card className="p-8 bg-accent/10 border-accent/30 border-2">
+
<p className="text-lg leading-relaxed text-foreground">
+
<strong>Remember:</strong> This policy exists to keep the service running and this community healthy, not to limit your creativity. When in doubt, ask yourself: "Is this likely to get real-world authorities knocking on doors or make this place worse for everyone?" If the answer is yes, it probably doesn't belong here. Everything else? Go wild.
+
</p>
+
</Card>
+
</article>
+
</div>
+
+
{/* Footer */}
+
<footer className="border-t border-border/40 bg-muted/20 mt-12">
+
<div className="container mx-auto px-4 py-8">
+
<div className="text-center text-sm text-muted-foreground">
+
<p>
+
Built by{' '}
+
<a
+
href="https://bsky.app/profile/nekomimi.pet"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
@nekomimi.pet
+
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
+
</p>
+
</div>
+
</div>
+
</footer>
+
</div>
+
)
+
}
+
+
const root = createRoot(document.getElementById('elysia')!)
+
root.render(
+
<Layout className="gap-6">
+
<AcceptableUsePage />
+
</Layout>
+
)
+35
apps/main-app/public/acceptable-use/index.html
···
+
<!doctype html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>Acceptable Use Policy - wisp.place</title>
+
<meta name="description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/acceptable-use" />
+
<meta property="og:title" content="Acceptable Use Policy - wisp.place" />
+
<meta property="og:description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary_large_image" />
+
<meta name="twitter:url" content="https://wisp.place/acceptable-use" />
+
<meta name="twitter:title" content="Acceptable Use Policy - wisp.place" />
+
<meta name="twitter:description" content="Acceptable Use Policy for wisp.place - Guidelines for hosting content on our decentralized static site hosting platform." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
+
<link rel="icon" type="image/x-icon" href="../favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="../favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="../favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="../apple-touch-icon.png">
+
<link rel="manifest" href="../site.webmanifest">
+
</head>
+
<body>
+
<div id="elysia"></div>
+
<script type="module" src="./acceptable-use.tsx"></script>
+
</body>
+
</html>
+820
apps/main-app/public/admin/admin.tsx
···
+
import { StrictMode, useState, useEffect } from 'react'
+
import { createRoot } from 'react-dom/client'
+
import './styles.css'
+
+
// Types
+
interface LogEntry {
+
id: string
+
timestamp: string
+
level: 'info' | 'warn' | 'error' | 'debug'
+
message: string
+
service: string
+
context?: Record<string, any>
+
eventType?: string
+
}
+
+
interface ErrorEntry {
+
id: string
+
timestamp: string
+
message: string
+
stack?: string
+
service: string
+
count: number
+
lastSeen: string
+
}
+
+
interface MetricsStats {
+
totalRequests: number
+
avgDuration: number
+
p50Duration: number
+
p95Duration: number
+
p99Duration: number
+
errorRate: number
+
requestsPerMinute: number
+
}
+
+
// Helper function to format Unix timestamp from database
+
function formatDbDate(timestamp: number | string): Date {
+
const num = typeof timestamp === 'string' ? parseFloat(timestamp) : timestamp
+
return new Date(num * 1000) // Convert seconds to milliseconds
+
}
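As a standalone check of the conversion above (the helper is re-declared here so the snippet runs on its own; the timestamp value is illustrative):

```typescript
// Mirrors formatDbDate above: the database stores Unix timestamps in
// seconds, while the JS Date constructor expects milliseconds, hence * 1000.
function formatDbDate(timestamp: number | string): Date {
	const num = typeof timestamp === 'string' ? parseFloat(timestamp) : timestamp
	return new Date(num * 1000)
}

// Numeric and string rows map to the same instant.
console.log(formatDbDate(1731196800).toISOString()) // 2024-11-10T00:00:00.000Z
console.log(formatDbDate('1731196800').toISOString()) // same instant
```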
+
+
// Login Component
+
function Login({ onLogin }: { onLogin: () => void }) {
+
const [username, setUsername] = useState('')
+
const [password, setPassword] = useState('')
+
const [error, setError] = useState('')
+
const [loading, setLoading] = useState(false)
+
+
const handleSubmit = async (e: React.FormEvent) => {
+
e.preventDefault()
+
setError('')
+
setLoading(true)
+
+
try {
+
const res = await fetch('/api/admin/login', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ username, password }),
+
credentials: 'include'
+
})
+
+
if (res.ok) {
+
onLogin()
+
} else {
+
setError('Invalid credentials')
+
}
+
} catch (err) {
+
setError('Failed to login')
+
} finally {
+
setLoading(false)
+
}
+
}
+
+
return (
+
<div className="min-h-screen bg-gray-950 flex items-center justify-center p-4">
+
<div className="w-full max-w-md">
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-8 shadow-xl">
+
<h1 className="text-2xl font-bold text-white mb-6">Admin Login</h1>
+
<form onSubmit={handleSubmit} className="space-y-4">
+
<div>
+
<label className="block text-sm font-medium text-gray-300 mb-2">
+
Username
+
</label>
+
<input
+
type="text"
+
value={username}
+
onChange={(e) => setUsername(e.target.value)}
+
className="w-full px-3 py-2 bg-gray-800 border border-gray-700 rounded text-white focus:outline-none focus:border-blue-500"
+
required
+
/>
+
</div>
+
<div>
+
<label className="block text-sm font-medium text-gray-300 mb-2">
+
Password
+
</label>
+
<input
+
type="password"
+
value={password}
+
onChange={(e) => setPassword(e.target.value)}
+
className="w-full px-3 py-2 bg-gray-800 border border-gray-700 rounded text-white focus:outline-none focus:border-blue-500"
+
required
+
/>
+
</div>
+
{error && (
+
<div className="text-red-400 text-sm">{error}</div>
+
)}
+
<button
+
type="submit"
+
disabled={loading}
+
className="w-full bg-blue-600 hover:bg-blue-700 disabled:bg-gray-700 text-white font-medium py-2 px-4 rounded transition-colors"
+
>
+
{loading ? 'Logging in...' : 'Login'}
+
</button>
+
</form>
+
</div>
+
</div>
+
</div>
+
)
+
}
+
+
// Dashboard Component
+
function Dashboard() {
+
const [tab, setTab] = useState('overview')
+
const [logs, setLogs] = useState<LogEntry[]>([])
+
const [errors, setErrors] = useState<ErrorEntry[]>([])
+
const [metrics, setMetrics] = useState<any>(null)
+
const [database, setDatabase] = useState<any>(null)
+
const [sites, setSites] = useState<any>(null)
+
const [health, setHealth] = useState<any>(null)
+
const [autoRefresh, setAutoRefresh] = useState(true)
+
+
// Filters
+
const [logLevel, setLogLevel] = useState('')
+
const [logService, setLogService] = useState('')
+
const [logSearch, setLogSearch] = useState('')
+
const [logEventType, setLogEventType] = useState('')
+
+
const fetchLogs = async () => {
+
const params = new URLSearchParams()
+
if (logLevel) params.append('level', logLevel)
+
if (logService) params.append('service', logService)
+
if (logSearch) params.append('search', logSearch)
+
if (logEventType) params.append('eventType', logEventType)
+
params.append('limit', '100')
+
+
const res = await fetch(`/api/admin/logs?${params}`, { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setLogs(data.logs)
+
}
+
}
+
+
const fetchErrors = async () => {
+
const res = await fetch('/api/admin/errors', { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setErrors(data.errors)
+
}
+
}
+
+
const fetchMetrics = async () => {
+
const res = await fetch('/api/admin/metrics', { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setMetrics(data)
+
}
+
}
+
+
const fetchDatabase = async () => {
+
const res = await fetch('/api/admin/database', { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setDatabase(data)
+
}
+
}
+
+
const fetchSites = async () => {
+
const res = await fetch('/api/admin/sites', { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setSites(data)
+
}
+
}
+
+
const fetchHealth = async () => {
+
const res = await fetch('/api/admin/health', { credentials: 'include' })
+
if (res.ok) {
+
const data = await res.json()
+
setHealth(data)
+
}
+
}
+
+
const logout = async () => {
+
await fetch('/api/admin/logout', { method: 'POST', credentials: 'include' })
+
window.location.reload()
+
}
+
+
useEffect(() => {
+
fetchMetrics()
+
fetchDatabase()
+
fetchHealth()
+
fetchLogs()
+
fetchErrors()
+
fetchSites()
+
}, [])
+
+
useEffect(() => {
+
fetchLogs()
+
}, [logLevel, logService, logSearch, logEventType])
+
+
useEffect(() => {
+
if (!autoRefresh) return
+
+
const interval = setInterval(() => {
+
if (tab === 'overview') {
+
fetchMetrics()
+
fetchHealth()
+
} else if (tab === 'logs') {
+
fetchLogs()
+
} else if (tab === 'errors') {
+
fetchErrors()
+
} else if (tab === 'database') {
+
fetchDatabase()
+
} else if (tab === 'sites') {
+
fetchSites()
+
}
+
}, 5000)
+
+
return () => clearInterval(interval)
+
}, [tab, autoRefresh, logLevel, logService, logSearch, logEventType])
+
+
const formatDuration = (ms: number) => {
+
if (ms < 1000) return `${ms}ms`
+
return `${(ms / 1000).toFixed(2)}s`
+
}
+
+
const formatUptime = (seconds: number) => {
+
const hours = Math.floor(seconds / 3600)
+
const minutes = Math.floor((seconds % 3600) / 60)
+
return `${hours}h ${minutes}m`
+
}
+
+
return (
+
<div className="min-h-screen bg-gray-950 text-white">
+
{/* Header */}
+
<div className="bg-gray-900 border-b border-gray-800 px-6 py-4">
+
<div className="flex items-center justify-between">
+
<h1 className="text-2xl font-bold">Wisp.place Admin</h1>
+
<div className="flex items-center gap-4">
+
<label className="flex items-center gap-2 text-sm text-gray-400">
+
<input
+
type="checkbox"
+
checked={autoRefresh}
+
onChange={(e) => setAutoRefresh(e.target.checked)}
+
className="rounded"
+
/>
+
Auto-refresh
+
</label>
+
<button
+
onClick={logout}
+
className="px-4 py-2 bg-gray-800 hover:bg-gray-700 rounded text-sm"
+
>
+
Logout
+
</button>
+
</div>
+
</div>
+
</div>
+
+
{/* Tabs */}
+
<div className="bg-gray-900 border-b border-gray-800 px-6">
+
<div className="flex gap-1">
+
{['overview', 'logs', 'errors', 'database', 'sites'].map((t) => (
+
<button
+
key={t}
+
onClick={() => setTab(t)}
+
className={`px-4 py-3 text-sm font-medium capitalize transition-colors ${
+
tab === t
+
? 'text-white border-b-2 border-blue-500'
+
: 'text-gray-400 hover:text-white'
+
}`}
+
>
+
{t}
+
</button>
+
))}
+
</div>
+
</div>
+
+
{/* Content */}
+
<div className="p-6">
+
{tab === 'overview' && (
+
<div className="space-y-6">
+
{/* Health */}
+
{health && (
+
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">Uptime</div>
+
<div className="text-2xl font-bold">{formatUptime(health.uptime)}</div>
+
</div>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">Memory Used</div>
+
<div className="text-2xl font-bold">{health.memory.heapUsed} MB</div>
+
</div>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">RSS</div>
+
<div className="text-2xl font-bold">{health.memory.rss} MB</div>
+
</div>
+
</div>
+
)}
+
+
{/* Metrics */}
+
{metrics && (
+
<div>
+
<h2 className="text-xl font-bold mb-4">Performance Metrics</h2>
+
<div className="space-y-4">
+
{/* Overall */}
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<h3 className="text-lg font-semibold mb-3">Overall (Last Hour)</h3>
+
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
+
<div>
+
<div className="text-sm text-gray-400">Total Requests</div>
+
<div className="text-xl font-bold">{metrics.overall.totalRequests}</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Avg Duration</div>
+
<div className="text-xl font-bold">{metrics.overall.avgDuration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">P95 Duration</div>
+
<div className="text-xl font-bold">{metrics.overall.p95Duration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Error Rate</div>
+
<div className="text-xl font-bold">{metrics.overall.errorRate.toFixed(2)}%</div>
+
</div>
+
</div>
+
</div>
+
+
{/* Main App */}
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<h3 className="text-lg font-semibold mb-3">Main App</h3>
+
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
+
<div>
+
<div className="text-sm text-gray-400">Requests</div>
+
<div className="text-xl font-bold">{metrics.mainApp.totalRequests}</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Avg</div>
+
<div className="text-xl font-bold">{metrics.mainApp.avgDuration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">P95</div>
+
<div className="text-xl font-bold">{metrics.mainApp.p95Duration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Req/min</div>
+
<div className="text-xl font-bold">{metrics.mainApp.requestsPerMinute}</div>
+
</div>
+
</div>
+
</div>
+
+
{/* Hosting Service */}
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<h3 className="text-lg font-semibold mb-3">Hosting Service</h3>
+
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
+
<div>
+
<div className="text-sm text-gray-400">Requests</div>
+
<div className="text-xl font-bold">{metrics.hostingService.totalRequests}</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Avg</div>
+
<div className="text-xl font-bold">{metrics.hostingService.avgDuration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">P95</div>
+
<div className="text-xl font-bold">{metrics.hostingService.p95Duration}ms</div>
+
</div>
+
<div>
+
<div className="text-sm text-gray-400">Req/min</div>
+
<div className="text-xl font-bold">{metrics.hostingService.requestsPerMinute}</div>
+
</div>
+
</div>
+
</div>
+
</div>
+
</div>
+
)}
+
</div>
+
)}
+
+
{tab === 'logs' && (
+
<div className="space-y-4">
+
<div className="flex gap-4">
+
<select
+
value={logLevel}
+
onChange={(e) => setLogLevel(e.target.value)}
+
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
+
>
+
<option value="">All Levels</option>
+
<option value="info">Info</option>
+
<option value="warn">Warn</option>
+
<option value="error">Error</option>
+
<option value="debug">Debug</option>
+
</select>
+
<select
+
value={logService}
+
onChange={(e) => setLogService(e.target.value)}
+
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
+
>
+
<option value="">All Services</option>
+
<option value="main-app">Main App</option>
+
<option value="hosting-service">Hosting Service</option>
+
</select>
+
<select
+
value={logEventType}
+
onChange={(e) => setLogEventType(e.target.value)}
+
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
+
>
+
<option value="">All Event Types</option>
+
<option value="DNS Verifier">DNS Verifier</option>
+
<option value="Auth">Auth</option>
+
<option value="User">User</option>
+
<option value="Domain">Domain</option>
+
<option value="Site">Site</option>
+
<option value="File Upload">File Upload</option>
+
<option value="Sync">Sync</option>
+
<option value="Maintenance">Maintenance</option>
+
<option value="KeyRotation">Key Rotation</option>
+
<option value="Cleanup">Cleanup</option>
+
<option value="Cache">Cache</option>
+
<option value="FirehoseWorker">Firehose Worker</option>
+
</select>
+
<input
+
type="text"
+
value={logSearch}
+
onChange={(e) => setLogSearch(e.target.value)}
+
placeholder="Search logs..."
+
className="flex-1 px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
+
/>
+
</div>
+
+
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
+
<div className="max-h-[600px] overflow-y-auto">
+
<table className="w-full text-sm">
+
<thead className="bg-gray-800 sticky top-0">
+
<tr>
+
<th className="px-4 py-2 text-left">Time</th>
+
<th className="px-4 py-2 text-left">Level</th>
+
<th className="px-4 py-2 text-left">Service</th>
+
<th className="px-4 py-2 text-left">Event Type</th>
+
<th className="px-4 py-2 text-left">Message</th>
+
</tr>
+
</thead>
+
<tbody>
+
{logs.map((log) => (
+
<tr key={log.id} className="border-t border-gray-800 hover:bg-gray-800">
+
<td className="px-4 py-2 text-gray-400 whitespace-nowrap">
+
{new Date(log.timestamp).toLocaleTimeString()}
+
</td>
+
<td className="px-4 py-2">
+
<span
+
className={`px-2 py-1 rounded text-xs font-medium ${
+
log.level === 'error'
+
? 'bg-red-900 text-red-200'
+
: log.level === 'warn'
+
? 'bg-yellow-900 text-yellow-200'
+
: log.level === 'info'
+
? 'bg-blue-900 text-blue-200'
+
: 'bg-gray-700 text-gray-300'
+
}`}
+
>
+
{log.level}
+
</span>
+
</td>
+
<td className="px-4 py-2 text-gray-400">{log.service}</td>
+
<td className="px-4 py-2">
+
{log.eventType && (
+
<span className="px-2 py-1 bg-purple-900 text-purple-200 rounded text-xs font-medium">
+
{log.eventType}
+
</span>
+
)}
+
</td>
+
<td className="px-4 py-2">
+
<div>{log.message}</div>
+
{log.context && Object.keys(log.context).length > 0 && (
+
<div className="text-xs text-gray-500 mt-1">
+
{JSON.stringify(log.context)}
+
</div>
+
)}
+
</td>
+
</tr>
+
))}
+
</tbody>
+
</table>
+
</div>
+
</div>
+
</div>
+
)}
+
+
{tab === 'errors' && (
+
<div className="space-y-4">
+
<h2 className="text-xl font-bold">Recent Errors</h2>
+
<div className="space-y-3">
+
{errors.map((error) => (
+
<div key={error.id} className="bg-gray-900 border border-red-900 rounded-lg p-4">
+
<div className="flex items-start justify-between mb-2">
+
<div className="flex-1">
+
<div className="font-semibold text-red-400">{error.message}</div>
+
<div className="text-sm text-gray-400 mt-1">
+
Service: {error.service} • Count: {error.count} • Last seen:{' '}
+
{new Date(error.lastSeen).toLocaleString()}
+
</div>
+
</div>
+
</div>
+
{error.stack && (
+
<pre className="text-xs text-gray-500 bg-gray-950 p-2 rounded mt-2 overflow-x-auto">
+
{error.stack}
+
</pre>
+
)}
+
</div>
+
))}
+
{errors.length === 0 && (
+
<div className="text-center text-gray-500 py-8">No errors found</div>
+
)}
+
</div>
+
</div>
+
)}
+
+
{tab === 'database' && database && (
+
<div className="space-y-6">
+
{/* Stats */}
+
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">Total Sites</div>
+
<div className="text-3xl font-bold">{database.stats.totalSites}</div>
+
</div>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">Wisp Subdomains</div>
+
<div className="text-3xl font-bold">{database.stats.totalWispSubdomains}</div>
+
</div>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
+
<div className="text-sm text-gray-400 mb-1">Custom Domains</div>
+
<div className="text-3xl font-bold">{database.stats.totalCustomDomains}</div>
+
</div>
+
</div>
+
+
{/* Recent Sites */}
+
<div>
+
<h3 className="text-lg font-semibold mb-3">Recent Sites</h3>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
+
<table className="w-full text-sm">
+
<thead className="bg-gray-800">
+
<tr>
+
<th className="px-4 py-2 text-left">Site Name</th>
+
<th className="px-4 py-2 text-left">Subdomain</th>
+
<th className="px-4 py-2 text-left">DID</th>
+
<th className="px-4 py-2 text-left">RKey</th>
+
<th className="px-4 py-2 text-left">Created</th>
+
<th className="px-4 py-2 text-left">PDSls</th>
+
</tr>
+
</thead>
+
<tbody>
+
{database.recentSites.map((site: any, i: number) => (
+
<tr key={i} className="border-t border-gray-800">
+
<td className="px-4 py-2">{site.display_name || 'Untitled'}</td>
+
<td className="px-4 py-2">
+
{site.subdomain ? (
+
<a
+
href={`https://${site.subdomain}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:underline"
+
>
+
{site.subdomain}
+
</a>
+
) : (
+
<span className="text-gray-500">No domain</span>
+
)}
+
</td>
+
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
+
{site.did.slice(0, 20)}...
+
</td>
+
<td className="px-4 py-2 text-gray-400">{site.rkey || 'self'}</td>
+
<td className="px-4 py-2 text-gray-400">
+
{formatDbDate(site.created_at).toLocaleDateString()}
+
</td>
+
<td className="px-4 py-2">
+
<a
+
href={`https://pdsls.dev/at://${site.did}/place.wisp.fs/${site.rkey || 'self'}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:text-blue-300 transition-colors"
+
title="View on PDSls.dev"
+
>
+
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
+
</svg>
+
</a>
+
</td>
+
</tr>
+
))}
+
</tbody>
+
</table>
+
</div>
+
</div>
+
+
{/* Recent Domains */}
+
<div>
+
<h3 className="text-lg font-semibold mb-3">Recent Custom Domains</h3>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
+
<table className="w-full text-sm">
+
<thead className="bg-gray-800">
+
<tr>
+
<th className="px-4 py-2 text-left">Domain</th>
+
<th className="px-4 py-2 text-left">DID</th>
+
<th className="px-4 py-2 text-left">Verified</th>
+
<th className="px-4 py-2 text-left">Created</th>
+
</tr>
+
</thead>
+
<tbody>
+
{database.recentDomains.map((domain: any, i: number) => (
+
<tr key={i} className="border-t border-gray-800">
+
<td className="px-4 py-2">{domain.domain}</td>
+
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
+
{domain.did.slice(0, 20)}...
+
</td>
+
<td className="px-4 py-2">
+
<span
+
className={`px-2 py-1 rounded text-xs ${
+
domain.verified
+
? 'bg-green-900 text-green-200'
+
: 'bg-yellow-900 text-yellow-200'
+
}`}
+
>
+
{domain.verified ? 'Yes' : 'No'}
+
</span>
+
</td>
+
<td className="px-4 py-2 text-gray-400">
+
{formatDbDate(domain.created_at).toLocaleDateString()}
+
</td>
+
</tr>
+
))}
+
</tbody>
+
</table>
+
</div>
+
</div>
+
</div>
+
)}
+
+
{tab === 'sites' && sites && (
+
<div className="space-y-6">
+
{/* All Sites */}
+
<div>
+
<h3 className="text-lg font-semibold mb-3">All Sites</h3>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
+
<table className="w-full text-sm">
+
<thead className="bg-gray-800">
+
<tr>
+
<th className="px-4 py-2 text-left">Site Name</th>
+
<th className="px-4 py-2 text-left">Subdomain</th>
+
<th className="px-4 py-2 text-left">DID</th>
+
<th className="px-4 py-2 text-left">RKey</th>
+
<th className="px-4 py-2 text-left">Created</th>
+
<th className="px-4 py-2 text-left">PDSls</th>
+
</tr>
+
</thead>
+
<tbody>
+
{sites.sites.map((site: any, i: number) => (
+
<tr key={i} className="border-t border-gray-800 hover:bg-gray-800">
+
<td className="px-4 py-2">{site.display_name || 'Untitled'}</td>
+
<td className="px-4 py-2">
+
{site.subdomain ? (
+
<a
+
href={`https://${site.subdomain}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:underline"
+
>
+
{site.subdomain}
+
</a>
+
) : (
+
<span className="text-gray-500">No domain</span>
+
)}
+
</td>
+
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
+
{site.did.slice(0, 30)}...
+
</td>
+
<td className="px-4 py-2 text-gray-400">{site.rkey || 'self'}</td>
+
<td className="px-4 py-2 text-gray-400">
+
{formatDbDate(site.created_at).toLocaleString()}
+
</td>
+
<td className="px-4 py-2">
+
<a
+
href={`https://pdsls.dev/at://${site.did}/place.wisp.fs/${site.rkey || 'self'}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:text-blue-300 transition-colors"
+
title="View on PDSls.dev"
+
>
+
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
+
</svg>
+
</a>
+
</td>
+
</tr>
+
))}
+
</tbody>
+
</table>
+
</div>
+
</div>
+
+
{/* Custom Domains */}
+
<div>
+
<h3 className="text-lg font-semibold mb-3">Custom Domains</h3>
+
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
+
<table className="w-full text-sm">
+
<thead className="bg-gray-800">
+
<tr>
+
<th className="px-4 py-2 text-left">Domain</th>
+
<th className="px-4 py-2 text-left">Verified</th>
+
<th className="px-4 py-2 text-left">DID</th>
+
<th className="px-4 py-2 text-left">RKey</th>
+
<th className="px-4 py-2 text-left">Created</th>
+
<th className="px-4 py-2 text-left">PDSls</th>
+
</tr>
+
</thead>
+
<tbody>
+
{sites.customDomains.map((domain: any, i: number) => (
+
<tr key={i} className="border-t border-gray-800 hover:bg-gray-800">
+
<td className="px-4 py-2">
+
{domain.verified ? (
+
<a
+
href={`https://${domain.domain}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:underline"
+
>
+
{domain.domain}
+
</a>
+
) : (
+
<span className="text-gray-400">{domain.domain}</span>
+
)}
+
</td>
+
<td className="px-4 py-2">
+
<span
+
className={`px-2 py-1 rounded text-xs ${
+
domain.verified
+
? 'bg-green-900 text-green-200'
+
: 'bg-yellow-900 text-yellow-200'
+
}`}
+
>
+
{domain.verified ? 'Yes' : 'Pending'}
+
</span>
+
</td>
+
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
+
{domain.did.slice(0, 30)}...
+
</td>
+
<td className="px-4 py-2 text-gray-400">{domain.rkey || 'self'}</td>
+
<td className="px-4 py-2 text-gray-400">
+
{formatDbDate(domain.created_at).toLocaleString()}
+
</td>
+
<td className="px-4 py-2">
+
<a
+
href={`https://pdsls.dev/at://${domain.did}/place.wisp.fs/${domain.rkey || 'self'}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-blue-400 hover:text-blue-300 transition-colors"
+
title="View on PDSls.dev"
+
>
+
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
+
</svg>
+
</a>
+
</td>
+
</tr>
+
))}
+
</tbody>
+
</table>
+
</div>
+
</div>
+
</div>
+
)}
+
</div>
+
</div>
+
)
+
}
+
+
// Main App
+
function App() {
+
const [authenticated, setAuthenticated] = useState(false)
+
const [checking, setChecking] = useState(true)
+
+
useEffect(() => {
+
fetch('/api/admin/status', { credentials: 'include' })
+
.then((res) => res.json())
+
.then((data) => {
+
setAuthenticated(data.authenticated)
+
setChecking(false)
+
})
+
.catch(() => {
+
setChecking(false)
+
})
+
}, [])
+
+
if (checking) {
+
return (
+
<div className="min-h-screen bg-gray-950 flex items-center justify-center">
+
<div className="text-white">Loading...</div>
+
</div>
+
)
+
}
+
+
if (!authenticated) {
+
return <Login onLogin={() => setAuthenticated(true)} />
+
}
+
+
return <Dashboard />
+
}
+
+
createRoot(document.getElementById('root')!).render(
+
<StrictMode>
+
<App />
+
</StrictMode>
+
)
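
The `App` component above only reads a single `authenticated` boolean from `/api/admin/status`. A minimal sketch of a defensive parser for that response — the endpoint path and field name come from the fetch call above, but the `AdminStatus` type and `parseStatus` helper are hypothetical, not part of the repo:

```typescript
// Hypothetical response shape for /api/admin/status, inferred from the
// fetch call in App; only `authenticated` is actually consumed.
interface AdminStatus {
  authenticated: boolean
}

// Coerce unknown JSON into AdminStatus, defaulting to unauthenticated
// on any malformed payload (mirrors the component's .catch fallback).
function parseStatus(json: unknown): AdminStatus {
  const obj = json as { authenticated?: unknown } | null
  return { authenticated: obj?.authenticated === true }
}

console.log(parseStatus({ authenticated: true })) // { authenticated: true }
console.log(parseStatus(null))                    // { authenticated: false }
```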
+19
apps/main-app/public/admin/index.html
···
+
<!DOCTYPE html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>wisp.place</title>
+
<meta name="description" content="Admin dashboard for wisp.place decentralized static site hosting." />
+
<meta name="robots" content="noindex, nofollow" />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
+
<link rel="stylesheet" href="./styles.css" />
+
</head>
+
<body>
+
<div id="root"></div>
+
<script type="module" src="./admin.tsx"></script>
+
</body>
+
</html>
+1
apps/main-app/public/admin/styles.css
···
+
@import "tailwindcss";
apps/main-app/public/android-chrome-192x192.png

This is a binary file and will not be displayed.

apps/main-app/public/android-chrome-512x512.png

This is a binary file and will not be displayed.

apps/main-app/public/apple-touch-icon.png

This is a binary file and will not be displayed.

+46
apps/main-app/public/components/ui/badge.tsx
···
+
import * as React from "react"
+
import { Slot } from "@radix-ui/react-slot"
+
import { cva, type VariantProps } from "class-variance-authority"
+
+
import { cn } from "@public/lib/utils"
+
+
const badgeVariants = cva(
+
"inline-flex items-center justify-center rounded-full border px-2 py-0.5 text-xs font-medium w-fit whitespace-nowrap shrink-0 [&>svg]:size-3 gap-1 [&>svg]:pointer-events-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive transition-[color,box-shadow] overflow-hidden",
+
{
+
variants: {
+
variant: {
+
default:
+
"border-transparent bg-primary text-primary-foreground [a&]:hover:bg-primary/90",
+
secondary:
+
"border-transparent bg-secondary text-secondary-foreground [a&]:hover:bg-secondary/90",
+
destructive:
+
"border-transparent bg-destructive text-white [a&]:hover:bg-destructive/90 focus-visible:ring-destructive/20 dark:focus-visible:ring-destructive/40 dark:bg-destructive/60",
+
outline:
+
"text-foreground [a&]:hover:bg-accent [a&]:hover:text-accent-foreground",
+
},
+
},
+
defaultVariants: {
+
variant: "default",
+
},
+
}
+
)
+
+
function Badge({
+
className,
+
variant,
+
asChild = false,
+
...props
+
}: React.ComponentProps<"span"> &
+
VariantProps<typeof badgeVariants> & { asChild?: boolean }) {
+
const Comp = asChild ? Slot : "span"
+
+
return (
+
<Comp
+
data-slot="badge"
+
className={cn(badgeVariants({ variant }), className)}
+
{...props}
+
/>
+
)
+
}
+
+
export { Badge, badgeVariants }
+60
apps/main-app/public/components/ui/button.tsx
···
+
import * as React from "react"
+
import { Slot } from "@radix-ui/react-slot"
+
import { cva, type VariantProps } from "class-variance-authority"
+
+
import { cn } from "@public/lib/utils"
+
+
const buttonVariants = cva(
+
"inline-flex items-center justify-center gap-2 whitespace-nowrap rounded-md text-sm font-medium transition-all disabled:pointer-events-none disabled:opacity-50 [&_svg]:pointer-events-none [&_svg:not([class*='size-'])]:size-4 shrink-0 [&_svg]:shrink-0 outline-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
+
{
+
variants: {
+
variant: {
+
default: "bg-primary text-primary-foreground hover:bg-primary/90",
+
destructive:
+
"bg-destructive text-white hover:bg-destructive/90 focus-visible:ring-destructive/20 dark:focus-visible:ring-destructive/40 dark:bg-destructive/60",
+
outline:
+
"border bg-background shadow-xs hover:bg-accent hover:text-accent-foreground dark:bg-input/30 dark:border-input dark:hover:bg-input/50",
+
secondary:
+
"bg-secondary text-secondary-foreground hover:bg-secondary/80",
+
ghost:
+
"hover:bg-accent hover:text-accent-foreground dark:hover:bg-accent/50",
+
link: "text-primary underline-offset-4 hover:underline",
+
},
+
size: {
+
default: "h-9 px-4 py-2 has-[>svg]:px-3",
+
sm: "h-8 rounded-md gap-1.5 px-3 has-[>svg]:px-2.5",
+
lg: "h-10 rounded-md px-6 has-[>svg]:px-4",
+
icon: "size-9",
+
"icon-sm": "size-8",
+
"icon-lg": "size-10",
+
},
+
},
+
defaultVariants: {
+
variant: "default",
+
size: "default",
+
},
+
}
+
)
+
+
function Button({
+
className,
+
variant,
+
size,
+
asChild = false,
+
...props
+
}: React.ComponentProps<"button"> &
+
VariantProps<typeof buttonVariants> & {
+
asChild?: boolean
+
}) {
+
const Comp = asChild ? Slot : "button"
+
+
return (
+
<Comp
+
data-slot="button"
+
className={cn(buttonVariants({ variant, size, className }))}
+
{...props}
+
/>
+
)
+
}
+
+
export { Button, buttonVariants }
+92
apps/main-app/public/components/ui/card.tsx
···
+
import * as React from "react"
+
+
import { cn } from "@public/lib/utils"
+
+
function Card({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card"
+
className={cn(
+
"bg-card text-card-foreground flex flex-col gap-6 rounded-xl border py-6 shadow-sm",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardHeader({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-header"
+
className={cn(
+
"@container/card-header grid auto-rows-min grid-rows-[auto_auto] items-start gap-2 px-6 has-data-[slot=card-action]:grid-cols-[1fr_auto] [.border-b]:pb-6",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardTitle({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-title"
+
className={cn("leading-none font-semibold", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardDescription({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-description"
+
className={cn("text-muted-foreground text-sm", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardAction({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-action"
+
className={cn(
+
"col-start-2 row-span-2 row-start-1 self-start justify-self-end",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardContent({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-content"
+
className={cn("px-6", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function CardFooter({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="card-footer"
+
className={cn("flex items-center px-6 [.border-t]:pt-6", className)}
+
{...props}
+
/>
+
)
+
}
+
+
export {
+
Card,
+
CardHeader,
+
CardFooter,
+
CardTitle,
+
CardAction,
+
CardDescription,
+
CardContent,
+
}
+30
apps/main-app/public/components/ui/checkbox.tsx
···
+
import * as React from "react"
+
import * as CheckboxPrimitive from "@radix-ui/react-checkbox"
+
import { CheckIcon } from "lucide-react"
+
+
import { cn } from "@public/lib/utils"
+
+
function Checkbox({
+
className,
+
...props
+
}: React.ComponentProps<typeof CheckboxPrimitive.Root>) {
+
return (
+
<CheckboxPrimitive.Root
+
data-slot="checkbox"
+
className={cn(
+
"peer border-input dark:bg-input/30 data-[state=checked]:bg-primary data-[state=checked]:text-primary-foreground dark:data-[state=checked]:bg-primary data-[state=checked]:border-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive size-4 shrink-0 rounded-[4px] border shadow-xs transition-shadow outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
+
className
+
)}
+
{...props}
+
>
+
<CheckboxPrimitive.Indicator
+
data-slot="checkbox-indicator"
+
className="grid place-content-center text-current transition-none"
+
>
+
<CheckIcon className="size-3.5" />
+
</CheckboxPrimitive.Indicator>
+
</CheckboxPrimitive.Root>
+
)
+
}
+
+
export { Checkbox }
+104
apps/main-app/public/components/ui/code-block.tsx
···
+
import { useEffect, useRef, useState } from 'react'
+
+
declare global {
+
interface Window {
+
Prism: {
+
languages: Record<string, any>
+
highlightElement: (element: HTMLElement) => void
+
highlightAll: () => void
+
}
+
}
+
}
+
+
interface CodeBlockProps {
+
code: string
+
language?: 'bash' | 'yaml'
+
className?: string
+
}
+
+
export function CodeBlock({ code, language = 'bash', className = '' }: CodeBlockProps) {
+
const [isThemeLoaded, setIsThemeLoaded] = useState(false)
+
const codeRef = useRef<HTMLElement>(null)
+
+
useEffect(() => {
+
// Load Catppuccin theme CSS
+
const loadTheme = async () => {
+
// Detect if user prefers dark mode
+
const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches
+
const theme = prefersDark ? 'mocha' : 'latte'
+
+
// Remove any existing theme CSS
+
const existingTheme = document.querySelector('link[data-prism-theme]')
+
if (existingTheme) {
+
existingTheme.remove()
+
}
+
+
// Load the appropriate Catppuccin theme
+
const link = document.createElement('link')
+
link.rel = 'stylesheet'
+
link.href = `https://prismjs.catppuccin.com/${theme}.css`
+
link.setAttribute('data-prism-theme', theme)
+
document.head.appendChild(link)
+
+
// Load PrismJS if not already loaded
+
if (!window.Prism) {
+
const script = document.createElement('script')
+
script.src = 'https://cdnjs.cloudflare.com/ajax/libs/prism/1.30.0/prism.min.js'
+
script.onload = () => {
+
// Load language support if needed
+
if (language === 'yaml' && !window.Prism.languages.yaml) {
+
const yamlScript = document.createElement('script')
+
yamlScript.src = 'https://cdnjs.cloudflare.com/ajax/libs/prism/1.30.0/components/prism-yaml.min.js'
+
yamlScript.onload = () => setIsThemeLoaded(true)
+
document.head.appendChild(yamlScript)
+
} else {
+
setIsThemeLoaded(true)
+
}
+
}
+
document.head.appendChild(script)
+
} else {
+
setIsThemeLoaded(true)
+
}
+
}
+
+
loadTheme()
+
+
// Listen for theme changes
+
const mediaQuery = window.matchMedia('(prefers-color-scheme: dark)')
+
const handleThemeChange = () => loadTheme()
+
mediaQuery.addEventListener('change', handleThemeChange)
+
+
return () => {
+
mediaQuery.removeEventListener('change', handleThemeChange)
+
}
+
}, [language])
+
+
// Highlight code when Prism is loaded and component is mounted
+
useEffect(() => {
+
if (isThemeLoaded && codeRef.current && window.Prism) {
+
window.Prism.highlightElement(codeRef.current)
+
}
+
}, [isThemeLoaded, code])
+
+
if (!isThemeLoaded) {
+
return (
+
<pre className={`p-4 bg-muted rounded-lg overflow-x-auto ${className}`}>
+
<code>{code.trim()}</code>
+
</pre>
+
)
+
}
+
+
// Map language to Prism language class
+
const languageMap = {
+
'bash': 'language-bash',
+
'yaml': 'language-yaml'
+
}
+
+
const prismLanguage = languageMap[language] || 'language-bash'
+
+
return (
+
<pre className={`p-4 rounded-lg overflow-x-auto ${className}`}>
+
<code ref={codeRef} className={prismLanguage}>{code.trim()}</code>
+
</pre>
+
)
+
}
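
The `languageMap` fallback inside `CodeBlock` can be sketched as a standalone function — a minimal sketch mirroring the component's mapping; `prismClass` is a hypothetical name, the component computes this inline:

```typescript
// Mirrors the language → Prism class mapping in CodeBlock above;
// unknown languages fall back to bash highlighting.
const languageMap: Record<string, string> = {
  bash: 'language-bash',
  yaml: 'language-yaml',
}

function prismClass(language: string): string {
  return languageMap[language] ?? 'language-bash'
}

console.log(prismClass('yaml')) // language-yaml
console.log(prismClass('toml')) // language-bash (fallback)
```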
+141
apps/main-app/public/components/ui/dialog.tsx
···
+
import * as React from "react"
+
import * as DialogPrimitive from "@radix-ui/react-dialog"
+
import { XIcon } from "lucide-react"
+
+
import { cn } from "@public/lib/utils"
+
+
function Dialog({
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Root>) {
+
return <DialogPrimitive.Root data-slot="dialog" {...props} />
+
}
+
+
function DialogTrigger({
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Trigger>) {
+
return <DialogPrimitive.Trigger data-slot="dialog-trigger" {...props} />
+
}
+
+
function DialogPortal({
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Portal>) {
+
return <DialogPrimitive.Portal data-slot="dialog-portal" {...props} />
+
}
+
+
function DialogClose({
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Close>) {
+
return <DialogPrimitive.Close data-slot="dialog-close" {...props} />
+
}
+
+
function DialogOverlay({
+
className,
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Overlay>) {
+
return (
+
<DialogPrimitive.Overlay
+
data-slot="dialog-overlay"
+
className={cn(
+
"data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 fixed inset-0 z-50 bg-black/50",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function DialogContent({
+
className,
+
children,
+
showCloseButton = true,
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Content> & {
+
showCloseButton?: boolean
+
}) {
+
return (
+
<DialogPortal data-slot="dialog-portal">
+
<DialogOverlay />
+
<DialogPrimitive.Content
+
data-slot="dialog-content"
+
className={cn(
+
"bg-background data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 fixed top-[50%] left-[50%] z-50 grid w-full max-w-[calc(100%-2rem)] translate-x-[-50%] translate-y-[-50%] gap-4 rounded-lg border p-6 shadow-lg duration-200 sm:max-w-lg",
+
className
+
)}
+
{...props}
+
>
+
{children}
+
{showCloseButton && (
+
<DialogPrimitive.Close
+
data-slot="dialog-close"
+
className="ring-offset-background focus:ring-ring data-[state=open]:bg-accent data-[state=open]:text-muted-foreground absolute top-4 right-4 rounded-xs opacity-70 transition-opacity hover:opacity-100 focus:ring-2 focus:ring-offset-2 focus:outline-hidden disabled:pointer-events-none [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4"
+
>
+
<XIcon />
+
<span className="sr-only">Close</span>
+
</DialogPrimitive.Close>
+
)}
+
</DialogPrimitive.Content>
+
</DialogPortal>
+
)
+
}
+
+
function DialogHeader({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="dialog-header"
+
className={cn("flex flex-col gap-2 text-center sm:text-left", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function DialogFooter({ className, ...props }: React.ComponentProps<"div">) {
+
return (
+
<div
+
data-slot="dialog-footer"
+
className={cn(
+
"flex flex-col-reverse gap-2 sm:flex-row sm:justify-end",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function DialogTitle({
+
className,
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Title>) {
+
return (
+
<DialogPrimitive.Title
+
data-slot="dialog-title"
+
className={cn("text-lg leading-none font-semibold", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function DialogDescription({
+
className,
+
...props
+
}: React.ComponentProps<typeof DialogPrimitive.Description>) {
+
return (
+
<DialogPrimitive.Description
+
data-slot="dialog-description"
+
className={cn("text-muted-foreground text-sm", className)}
+
{...props}
+
/>
+
)
+
}
+
+
export {
+
Dialog,
+
DialogClose,
+
DialogContent,
+
DialogDescription,
+
DialogFooter,
+
DialogHeader,
+
DialogOverlay,
+
DialogPortal,
+
DialogTitle,
+
DialogTrigger,
+
}
+21
apps/main-app/public/components/ui/input.tsx
···
+
import * as React from "react"
+
+
import { cn } from "@public/lib/utils"
+
+
function Input({ className, type, ...props }: React.ComponentProps<"input">) {
+
return (
+
<input
+
type={type}
+
data-slot="input"
+
className={cn(
+
"file:text-foreground placeholder:text-muted-foreground selection:bg-primary selection:text-primary-foreground dark:bg-input/30 border-input h-9 w-full min-w-0 rounded-md border bg-transparent px-3 py-1 text-base shadow-xs transition-[color,box-shadow] outline-none file:inline-flex file:h-7 file:border-0 file:bg-transparent file:text-sm file:font-medium disabled:pointer-events-none disabled:cursor-not-allowed disabled:opacity-50 md:text-sm",
+
"focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px]",
+
"aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
export { Input }
+22
apps/main-app/public/components/ui/label.tsx
···
+
import * as React from "react"
+
import * as LabelPrimitive from "@radix-ui/react-label"
+
+
import { cn } from "@public/lib/utils"
+
+
function Label({
+
className,
+
...props
+
}: React.ComponentProps<typeof LabelPrimitive.Root>) {
+
return (
+
<LabelPrimitive.Root
+
data-slot="label"
+
className={cn(
+
"flex items-center gap-2 text-sm leading-none font-medium select-none group-data-[disabled=true]:pointer-events-none group-data-[disabled=true]:opacity-50 peer-disabled:cursor-not-allowed peer-disabled:opacity-50",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
export { Label }
+45
apps/main-app/public/components/ui/radio-group.tsx
···
+
"use client"
+
+
import * as React from "react"
+
import * as RadioGroupPrimitive from "@radix-ui/react-radio-group"
+
import { CircleIcon } from "lucide-react"
+
+
import { cn } from "@public/lib/utils"
+
+
function RadioGroup({
+
className,
+
...props
+
}: React.ComponentProps<typeof RadioGroupPrimitive.Root>) {
+
return (
+
<RadioGroupPrimitive.Root
+
data-slot="radio-group"
+
className={cn("grid gap-3", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function RadioGroupItem({
+
className,
+
...props
+
}: React.ComponentProps<typeof RadioGroupPrimitive.Item>) {
+
return (
+
<RadioGroupPrimitive.Item
+
data-slot="radio-group-item"
+
className={cn(
+
"border-input text-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive dark:bg-input/50 aspect-square size-4 shrink-0 rounded-full border border-black/30 dark:border-white/30 shadow-inner transition-[color,box-shadow] outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
+
className
+
)}
+
{...props}
+
>
+
<RadioGroupPrimitive.Indicator
+
data-slot="radio-group-indicator"
+
className="relative flex items-center justify-center"
+
>
+
<CircleIcon className="fill-primary absolute top-1/2 left-1/2 size-2 -translate-x-1/2 -translate-y-1/2" />
+
</RadioGroupPrimitive.Indicator>
+
</RadioGroupPrimitive.Item>
+
)
+
}
+
+
export { RadioGroup, RadioGroupItem }
+31
apps/main-app/public/components/ui/skeleton.tsx
···
+
import * as React from 'react'
+
import { cn } from '@public/lib/utils'
+
+
interface SkeletonProps extends React.HTMLAttributes<HTMLDivElement> {}
+
+
function Skeleton({ className, ...props }: SkeletonProps) {
+
return (
+
<div
+
className={cn(
+
'animate-pulse rounded-md bg-muted',
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
interface SkeletonShimmerProps extends React.HTMLAttributes<HTMLDivElement> {}
+
+
function SkeletonShimmer({ className, ...props }: SkeletonShimmerProps) {
+
return (
+
<div
+
className={cn(
+
'relative overflow-hidden rounded-md bg-muted before:absolute before:inset-0 before:-translate-x-full before:animate-[shimmer_2s_infinite] before:bg-gradient-to-r before:from-transparent before:via-white/10 before:to-transparent',
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
export { Skeleton, SkeletonShimmer }
+64
apps/main-app/public/components/ui/tabs.tsx
···
+
import * as React from "react"
+
import * as TabsPrimitive from "@radix-ui/react-tabs"
+
+
import { cn } from "@public/lib/utils"
+
+
function Tabs({
+
className,
+
...props
+
}: React.ComponentProps<typeof TabsPrimitive.Root>) {
+
return (
+
<TabsPrimitive.Root
+
data-slot="tabs"
+
className={cn("flex flex-col gap-2", className)}
+
{...props}
+
/>
+
)
+
}
+
+
function TabsList({
+
className,
+
...props
+
}: React.ComponentProps<typeof TabsPrimitive.List>) {
+
return (
+
<TabsPrimitive.List
+
data-slot="tabs-list"
+
className={cn(
+
"bg-muted dark:bg-muted/80 text-muted-foreground inline-flex h-9 w-fit items-center justify-center rounded-lg p-[3px]",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function TabsTrigger({
+
className,
+
...props
+
}: React.ComponentProps<typeof TabsPrimitive.Trigger>) {
+
return (
+
<TabsPrimitive.Trigger
+
data-slot="tabs-trigger"
+
className={cn(
+
"data-[state=active]:bg-background dark:data-[state=active]:bg-background dark:data-[state=active]:text-foreground dark:data-[state=active]:border-border focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:outline-ring dark:data-[state=active]:border-border dark:data-[state=active]:shadow-sm text-foreground dark:text-muted-foreground/70 inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-2 py-1 text-sm font-medium whitespace-nowrap transition-[color,box-shadow] focus-visible:ring-[3px] focus-visible:outline-1 disabled:pointer-events-none disabled:opacity-50 data-[state=active]:shadow-sm [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
+
className
+
)}
+
{...props}
+
/>
+
)
+
}
+
+
function TabsContent({
+
className,
+
...props
+
}: React.ComponentProps<typeof TabsPrimitive.Content>) {
+
return (
+
<TabsPrimitive.Content
+
data-slot="tabs-content"
+
className={cn("outline-none", className)}
+
{...props}
+
/>
+
)
+
}
+
+
export { Tabs, TabsList, TabsTrigger, TabsContent }
+65
apps/main-app/public/editor/components/TabSkeleton.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
+
// Shimmer animation for skeleton loading
+
const Shimmer = () => (
+
<div className="animate-pulse">
+
<div className="h-4 bg-muted rounded w-3/4 mb-2"></div>
+
<div className="h-4 bg-muted rounded w-1/2"></div>
+
</div>
+
)
+
+
const SkeletonLine = ({ className = '' }: { className?: string }) => (
+
<div className={`animate-pulse bg-muted rounded ${className}`}></div>
+
)
+
+
export function TabSkeleton() {
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="space-y-2">
+
<SkeletonLine className="h-6 w-1/3" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
{/* Skeleton content items */}
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
<div className="p-4 border border-border rounded-lg">
+
<SkeletonLine className="h-5 w-1/2 mb-3" />
+
<SkeletonLine className="h-4 w-3/4 mb-2" />
+
<SkeletonLine className="h-4 w-2/3" />
+
</div>
+
</CardContent>
+
</Card>
+
+
<Card>
+
<CardHeader>
+
<div className="space-y-2">
+
<SkeletonLine className="h-6 w-1/4" />
+
<SkeletonLine className="h-4 w-1/2" />
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-3">
+
<SkeletonLine className="h-10 w-full" />
+
<SkeletonLine className="h-4 w-3/4" />
+
</CardContent>
+
</Card>
+
</div>
+
)
+
}
+861
apps/main-app/public/editor/editor.tsx
···
+
import { useState, useEffect } from 'react'
+
import { createRoot } from 'react-dom/client'
+
import { Button } from '@public/components/ui/button'
+
import {
+
Tabs,
+
TabsContent,
+
TabsList,
+
TabsTrigger
+
} from '@public/components/ui/tabs'
+
import {
+
Dialog,
+
DialogContent,
+
DialogDescription,
+
DialogHeader,
+
DialogTitle,
+
DialogFooter
+
} from '@public/components/ui/dialog'
+
import { Checkbox } from '@public/components/ui/checkbox'
+
import { Label } from '@public/components/ui/label'
+
import { Badge } from '@public/components/ui/badge'
+
import { SkeletonShimmer } from '@public/components/ui/skeleton'
+
import { Input } from '@public/components/ui/input'
+
import { RadioGroup, RadioGroupItem } from '@public/components/ui/radio-group'
+
import {
+
Loader2,
+
Trash2,
+
LogOut
+
} from 'lucide-react'
+
import Layout from '@public/layouts'
+
import { useUserInfo } from './hooks/useUserInfo'
+
import { useSiteData, type SiteWithDomains } from './hooks/useSiteData'
+
import { useDomainData } from './hooks/useDomainData'
+
import { SitesTab } from './tabs/SitesTab'
+
import { DomainsTab } from './tabs/DomainsTab'
+
import { UploadTab } from './tabs/UploadTab'
+
import { CLITab } from './tabs/CLITab'
+
+
function Dashboard() {
+
// Use custom hooks
+
const { userInfo, loading, fetchUserInfo } = useUserInfo()
+
const { sites, sitesLoading, isSyncing, fetchSites, syncSites, deleteSite } = useSiteData()
+
const {
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
fetchDomains,
+
addCustomDomain,
+
verifyDomain,
+
deleteCustomDomain,
+
mapWispDomain,
+
deleteWispDomain,
+
mapCustomDomain,
+
claimWispDomain,
+
checkWispAvailability
+
} = useDomainData()
+
+
// Site configuration modal state (shared across components)
+
const [configuringSite, setConfiguringSite] = useState<SiteWithDomains | null>(null)
+
const [selectedDomains, setSelectedDomains] = useState<Set<string>>(new Set())
+
const [isSavingConfig, setIsSavingConfig] = useState(false)
+
const [isDeletingSite, setIsDeletingSite] = useState(false)
+
+
// Site settings state
+
type RoutingMode = 'default' | 'spa' | 'directory' | 'custom404'
+
const [routingMode, setRoutingMode] = useState<RoutingMode>('default')
+
const [spaFile, setSpaFile] = useState('index.html')
+
const [custom404File, setCustom404File] = useState('404.html')
+
const [indexFiles, setIndexFiles] = useState<string[]>(['index.html'])
+
const [newIndexFile, setNewIndexFile] = useState('')
+
const [cleanUrls, setCleanUrls] = useState(false)
+
const [corsEnabled, setCorsEnabled] = useState(false)
+
const [corsOrigin, setCorsOrigin] = useState('*')
+
+
// Fetch initial data on mount
+
useEffect(() => {
+
fetchUserInfo()
+
fetchSites()
+
fetchDomains()
+
}, [])
+
+
// Handle site configuration modal
+
const handleConfigureSite = async (site: SiteWithDomains) => {
+
setConfiguringSite(site)
+
+
// Build set of currently mapped domains
+
const mappedDomains = new Set<string>()
+
+
if (site.domains) {
+
site.domains.forEach(domainInfo => {
+
if (domainInfo.type === 'wisp') {
+
// For wisp domains, use the domain itself as the identifier
+
mappedDomains.add(`wisp:${domainInfo.domain}`)
+
} else if (domainInfo.id) {
+
mappedDomains.add(domainInfo.id)
+
}
+
})
+
}
+
+
setSelectedDomains(mappedDomains)
+
+
// Fetch and populate settings for this site
+
try {
+
const response = await fetch(`/api/site/${site.rkey}/settings`, {
+
credentials: 'include'
+
})
+
if (response.ok) {
+
const settings = await response.json()
+
+
// Determine routing mode based on settings
+
if (settings.spaMode) {
+
setRoutingMode('spa')
+
setSpaFile(settings.spaMode)
+
} else if (settings.directoryListing) {
+
setRoutingMode('directory')
+
} else if (settings.custom404) {
+
setRoutingMode('custom404')
+
setCustom404File(settings.custom404)
+
} else {
+
setRoutingMode('default')
+
}
+
+
// Set other settings
+
setIndexFiles(settings.indexFiles || ['index.html'])
+
setCleanUrls(settings.cleanUrls || false)
+
+
// Check for CORS headers
+
const corsHeader = settings.headers?.find((h: any) => h.name === 'Access-Control-Allow-Origin')
+
if (corsHeader) {
+
setCorsEnabled(true)
+
setCorsOrigin(corsHeader.value)
+
} else {
+
setCorsEnabled(false)
+
setCorsOrigin('*')
+
}
+
} else {
+
// Reset to defaults if no settings found
+
setRoutingMode('default')
+
setSpaFile('index.html')
+
setCustom404File('404.html')
+
setIndexFiles(['index.html'])
+
setCleanUrls(false)
+
setCorsEnabled(false)
+
setCorsOrigin('*')
+
}
+
} catch (err) {
+
console.error('Failed to fetch settings:', err)
+
// Use defaults on error
+
setRoutingMode('default')
+
setSpaFile('index.html')
+
setCustom404File('404.html')
+
setIndexFiles(['index.html'])
+
setCleanUrls(false)
+
setCorsEnabled(false)
+
setCorsOrigin('*')
+
}
+
}
+
+
const handleSaveSiteConfig = async () => {
+
if (!configuringSite) return
+
+
setIsSavingConfig(true)
+
try {
+
// Handle wisp domain mappings
+
const selectedWispDomainIds = Array.from(selectedDomains).filter(id => id.startsWith('wisp:'))
+
const selectedWispDomains = selectedWispDomainIds.map(id => id.replace('wisp:', ''))
+
+
// Get currently mapped wisp domains
+
const currentlyMappedWispDomains = wispDomains.filter(
+
d => d.rkey === configuringSite.rkey
+
)
+
+
// Unmap wisp domains that are no longer selected
+
for (const domain of currentlyMappedWispDomains) {
+
if (!selectedWispDomains.includes(domain.domain)) {
+
await mapWispDomain(domain.domain, null)
+
}
+
}
+
+
// Map newly selected wisp domains
+
for (const domainName of selectedWispDomains) {
+
const isAlreadyMapped = currentlyMappedWispDomains.some(d => d.domain === domainName)
+
if (!isAlreadyMapped) {
+
await mapWispDomain(domainName, configuringSite.rkey)
+
}
+
}
+
+
// Handle custom domain mappings
+
const selectedCustomDomainIds = Array.from(selectedDomains).filter(id => !id.startsWith('wisp:'))
+
const currentlyMappedCustomDomains = customDomains.filter(
+
d => d.rkey === configuringSite.rkey
+
)
+
+
// Unmap domains that are no longer selected
+
for (const domain of currentlyMappedCustomDomains) {
+
if (!selectedCustomDomainIds.includes(domain.id)) {
+
await mapCustomDomain(domain.id, null)
+
}
+
}
+
+
// Map newly selected domains
+
for (const domainId of selectedCustomDomainIds) {
+
const isAlreadyMapped = currentlyMappedCustomDomains.some(d => d.id === domainId)
+
if (!isAlreadyMapped) {
+
await mapCustomDomain(domainId, configuringSite.rkey)
+
}
+
}
+
+
// Save site settings
+
const settings: any = {
+
cleanUrls,
+
indexFiles: indexFiles.filter(f => f.trim() !== '')
+
}
+
+
// Set routing mode based on selection
+
if (routingMode === 'spa') {
+
settings.spaMode = spaFile
+
} else if (routingMode === 'directory') {
+
settings.directoryListing = true
+
} else if (routingMode === 'custom404') {
+
settings.custom404 = custom404File
+
}
+
+
// Add CORS header if enabled
+
if (corsEnabled) {
+
settings.headers = [
+
{
+
name: 'Access-Control-Allow-Origin',
+
value: corsOrigin
+
}
+
]
+
}
+
+
const settingsResponse = await fetch(`/api/site/${configuringSite.rkey}/settings`, {
+
method: 'POST',
+
headers: {
+
'Content-Type': 'application/json'
+
},
+
credentials: 'include',
+
body: JSON.stringify(settings)
+
})
+
+
if (!settingsResponse.ok) {
+
const error = await settingsResponse.json()
+
throw new Error(error.error || 'Failed to save settings')
+
}
+
+
// Refresh both domains and sites to get updated mappings
+
await fetchDomains()
+
await fetchSites()
+
setConfiguringSite(null)
+
} catch (err) {
+
console.error('Save config error:', err)
+
alert(
+
`Failed to save configuration: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
} finally {
+
setIsSavingConfig(false)
+
}
+
}
+
+
const handleDeleteSite = async () => {
+
if (!configuringSite) return
+
+
if (!confirm(`Are you sure you want to delete "${configuringSite.display_name || configuringSite.rkey}"? This action cannot be undone.`)) {
+
return
+
}
+
+
setIsDeletingSite(true)
+
const success = await deleteSite(configuringSite.rkey)
+
if (success) {
+
// Refresh domains in case this site was mapped
+
await fetchDomains()
+
setConfiguringSite(null)
+
}
+
setIsDeletingSite(false)
+
}
+
+
const handleUploadComplete = async () => {
+
await fetchSites()
+
}
+
+
const handleLogout = async () => {
+
try {
+
const response = await fetch('/api/auth/logout', {
+
method: 'POST',
+
credentials: 'include'
+
})
+
const result = await response.json()
+
if (result.success) {
+
// Redirect to home page after successful logout
+
window.location.href = '/'
+
} else {
+
alert('Logout failed: ' + (result.error || 'Unknown error'))
+
}
+
} catch (err) {
+
alert('Logout failed: ' + (err instanceof Error ? err.message : 'Unknown error'))
+
}
+
}
+
+
if (loading) {
+
return (
+
<div className="w-full min-h-screen bg-background">
+
{/* Header Skeleton */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
<div className="flex items-center gap-3">
+
<SkeletonShimmer className="h-5 w-32" />
+
<SkeletonShimmer className="h-8 w-8 rounded" />
+
</div>
+
</div>
+
</header>
+
+
<div className="container mx-auto px-4 py-8 max-w-6xl w-full">
+
{/* Title Skeleton */}
+
<div className="mb-8 space-y-2">
+
<SkeletonShimmer className="h-9 w-48" />
+
<SkeletonShimmer className="h-5 w-64" />
+
</div>
+
+
{/* Tabs Skeleton */}
+
<div className="space-y-6 w-full">
+
<div className="inline-flex h-10 items-center justify-center rounded-md bg-muted p-1 text-muted-foreground w-full">
+
<SkeletonShimmer className="h-8 w-1/4 mx-1" />
+
<SkeletonShimmer className="h-8 w-1/4 mx-1" />
+
<SkeletonShimmer className="h-8 w-1/4 mx-1" />
+
<SkeletonShimmer className="h-8 w-1/4 mx-1" />
+
</div>
+
+
{/* Content Skeleton */}
+
<div className="space-y-4">
+
<div className="rounded-lg border border-border bg-card text-card-foreground shadow-sm">
+
<div className="flex flex-col space-y-1.5 p-6">
+
<SkeletonShimmer className="h-7 w-40" />
+
<SkeletonShimmer className="h-4 w-64" />
+
</div>
+
<div className="p-6 pt-0 space-y-4">
+
{[...Array(3)].map((_, i) => (
+
<div
+
key={i}
+
className="flex items-center justify-between p-4 border border-border rounded-lg"
+
>
+
<div className="flex-1 space-y-3">
+
<div className="flex items-center gap-3">
+
<SkeletonShimmer className="h-6 w-48" />
+
<SkeletonShimmer className="h-5 w-16" />
+
</div>
+
<SkeletonShimmer className="h-4 w-64" />
+
</div>
+
<SkeletonShimmer className="h-9 w-28" />
+
</div>
+
))}
+
</div>
+
</div>
+
</div>
+
</div>
+
</div>
+
</div>
+
)
+
}
+
+
return (
+
<div className="w-full min-h-screen bg-background">
+
{/* Header */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
<div className="flex items-center gap-3">
+
<span className="text-sm text-muted-foreground">
+
{userInfo?.handle || 'Loading...'}
+
</span>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={handleLogout}
+
className="h-8 px-2"
+
>
+
<LogOut className="w-4 h-4" />
+
</Button>
+
</div>
+
</div>
+
</header>
+
+
<div className="container mx-auto px-4 py-8 max-w-6xl w-full">
+
<div className="mb-8">
+
<h1 className="text-3xl font-bold mb-2">Dashboard</h1>
+
<p className="text-muted-foreground">
+
Manage your sites and domains
+
</p>
+
</div>
+
+
<Tabs defaultValue="sites" className="space-y-6 w-full">
+
<TabsList className="grid w-full grid-cols-4">
+
<TabsTrigger value="sites">Sites</TabsTrigger>
+
<TabsTrigger value="domains">Domains</TabsTrigger>
+
<TabsTrigger value="upload">Upload</TabsTrigger>
+
<TabsTrigger value="cli">CLI</TabsTrigger>
+
</TabsList>
+
+
{/* Sites Tab */}
+
<TabsContent value="sites">
+
<SitesTab
+
sites={sites}
+
sitesLoading={sitesLoading}
+
isSyncing={isSyncing}
+
userInfo={userInfo}
+
onSyncSites={syncSites}
+
onConfigureSite={handleConfigureSite}
+
/>
+
</TabsContent>
+
+
{/* Domains Tab */}
+
<TabsContent value="domains">
+
<DomainsTab
+
wispDomains={wispDomains}
+
customDomains={customDomains}
+
domainsLoading={domainsLoading}
+
verificationStatus={verificationStatus}
+
userInfo={userInfo}
+
onAddCustomDomain={addCustomDomain}
+
onVerifyDomain={verifyDomain}
+
onDeleteCustomDomain={deleteCustomDomain}
+
onDeleteWispDomain={deleteWispDomain}
+
onClaimWispDomain={claimWispDomain}
+
onCheckWispAvailability={checkWispAvailability}
+
/>
+
</TabsContent>
+
+
{/* Upload Tab */}
+
<TabsContent value="upload">
+
<UploadTab
+
sites={sites}
+
sitesLoading={sitesLoading}
+
onUploadComplete={handleUploadComplete}
+
/>
+
</TabsContent>
+
+
{/* CLI Tab */}
+
<TabsContent value="cli">
+
<CLITab />
+
</TabsContent>
+
</Tabs>
+
</div>
+
+
{/* Footer */}
+
<footer className="border-t border-border/40 bg-muted/20 mt-12">
+
<div className="container mx-auto px-4 py-8">
+
<div className="text-center text-sm text-muted-foreground">
+
<p>
+
Built by{' '}
+
<a
+
href="https://bsky.app/profile/nekomimi.pet"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
@nekomimi.pet
+
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
+
</p>
+
</div>
+
</div>
+
</footer>
+
+
{/* Site Configuration Modal */}
+
<Dialog
+
open={configuringSite !== null}
+
onOpenChange={(open) => !open && setConfiguringSite(null)}
+
>
+
<DialogContent className="sm:max-w-2xl max-h-[90vh] overflow-y-auto">
+
<DialogHeader>
+
<DialogTitle>Configure Site</DialogTitle>
+
<DialogDescription>
+
Configure domains and settings for this site.
+
</DialogDescription>
+
</DialogHeader>
+
{configuringSite && (
+
<div className="space-y-4 py-4">
+
<div className="p-3 bg-muted/30 rounded-lg">
+
<p className="text-sm font-medium mb-1">Site:</p>
+
<p className="font-mono text-sm">
+
{configuringSite.display_name ||
+
configuringSite.rkey}
+
</p>
+
</div>
+
+
<Tabs defaultValue="domains" className="w-full">
+
<TabsList className="grid w-full grid-cols-2">
+
<TabsTrigger value="domains">Domains</TabsTrigger>
+
<TabsTrigger value="settings">Settings</TabsTrigger>
+
</TabsList>
+
+
{/* Domains Tab */}
+
<TabsContent value="domains" className="space-y-3 mt-4">
+
<p className="text-sm font-medium">Available Domains:</p>
+
+
{wispDomains.map((wispDomain) => {
+
const domainId = `wisp:${wispDomain.domain}`
+
return (
+
<div key={domainId} className="flex items-center space-x-3 p-3 border rounded-lg hover:bg-muted/30">
+
<Checkbox
+
id={domainId}
+
checked={selectedDomains.has(domainId)}
+
onCheckedChange={(checked) => {
+
const newSelected = new Set(selectedDomains)
+
if (checked) {
+
newSelected.add(domainId)
+
} else {
+
newSelected.delete(domainId)
+
}
+
setSelectedDomains(newSelected)
+
}}
+
/>
+
<Label
+
htmlFor={domainId}
+
className="flex-1 cursor-pointer"
+
>
+
<div className="flex items-center justify-between">
+
<span className="font-mono text-sm">
+
{wispDomain.domain}
+
</span>
+
<Badge variant="secondary" className="text-xs ml-2">
+
Wisp
+
</Badge>
+
</div>
+
</Label>
+
</div>
+
)
+
})}
+
+
{customDomains
+
.filter((d) => d.verified)
+
.map((domain) => (
+
<div
+
key={domain.id}
+
className="flex items-center space-x-3 p-3 border rounded-lg hover:bg-muted/30"
+
>
+
<Checkbox
+
id={domain.id}
+
checked={selectedDomains.has(domain.id)}
+
onCheckedChange={(checked) => {
+
const newSelected = new Set(selectedDomains)
+
if (checked) {
+
newSelected.add(domain.id)
+
} else {
+
newSelected.delete(domain.id)
+
}
+
setSelectedDomains(newSelected)
+
}}
+
/>
+
<Label
+
htmlFor={domain.id}
+
className="flex-1 cursor-pointer"
+
>
+
<div className="flex items-center justify-between">
+
<span className="font-mono text-sm">
+
{domain.domain}
+
</span>
+
<Badge
+
variant="outline"
+
className="text-xs ml-2"
+
>
+
Custom
+
</Badge>
+
</div>
+
</Label>
+
</div>
+
))}
+
+
{customDomains.filter(d => d.verified).length === 0 && wispDomains.length === 0 && (
+
<p className="text-sm text-muted-foreground py-4 text-center">
+
No domains available. Add a custom domain or claim a wisp.place subdomain.
+
</p>
+
)}
+
+
<div className="p-3 bg-muted/20 rounded-lg border-l-4 border-blue-500/50 mt-4">
+
<p className="text-xs text-muted-foreground">
+
<strong>Note:</strong> If no domains are selected, the site will be accessible at:{' '}
+
<span className="font-mono">
+
sites.wisp.place/{userInfo?.handle || '...'}/{configuringSite.rkey}
+
</span>
+
</p>
+
</div>
+
</TabsContent>
+
+
{/* Settings Tab */}
+
<TabsContent value="settings" className="space-y-4 mt-4">
+
{/* Routing Mode */}
+
<div className="space-y-3">
+
<Label className="text-sm font-medium">Routing Mode</Label>
+
<RadioGroup value={routingMode} onValueChange={(value) => setRoutingMode(value as RoutingMode)}>
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<RadioGroupItem value="default" id="mode-default" />
+
<Label htmlFor="mode-default" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">Default</p>
+
<p className="text-xs text-muted-foreground">Standard static file serving</p>
+
</div>
+
</Label>
+
</div>
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<RadioGroupItem value="spa" id="mode-spa" />
+
<Label htmlFor="mode-spa" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">SPA Mode</p>
+
<p className="text-xs text-muted-foreground">Route all requests to a single file</p>
+
</div>
+
</Label>
+
</div>
+
{routingMode === 'spa' && (
+
<div className="ml-7 space-y-2">
+
<Label htmlFor="spa-file" className="text-sm">SPA File</Label>
+
<Input
+
id="spa-file"
+
value={spaFile}
+
onChange={(e) => setSpaFile(e.target.value)}
+
placeholder="index.html"
+
/>
+
</div>
+
)}
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<RadioGroupItem value="directory" id="mode-directory" />
+
<Label htmlFor="mode-directory" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">Directory Listing</p>
+
<p className="text-xs text-muted-foreground">Show directory contents on 404</p>
+
</div>
+
</Label>
+
</div>
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<RadioGroupItem value="custom404" id="mode-custom404" />
+
<Label htmlFor="mode-custom404" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">Custom 404 Page</p>
+
<p className="text-xs text-muted-foreground">Serve custom error page</p>
+
</div>
+
</Label>
+
</div>
+
{routingMode === 'custom404' && (
+
<div className="ml-7 space-y-2">
+
<Label htmlFor="404-file" className="text-sm">404 File</Label>
+
<Input
+
id="404-file"
+
value={custom404File}
+
onChange={(e) => setCustom404File(e.target.value)}
+
placeholder="404.html"
+
/>
+
</div>
+
)}
+
</RadioGroup>
+
</div>
+
+
{/* Index Files */}
+
<div className="space-y-3">
+
<Label className={`text-sm font-medium ${routingMode === 'spa' ? 'text-muted-foreground' : ''}`}>
+
Index Files
+
{routingMode === 'spa' && (
+
<span className="ml-2 text-xs">(disabled in SPA mode)</span>
+
)}
+
</Label>
+
<p className="text-xs text-muted-foreground">Files to try when serving a directory (in order)</p>
+
<div className="space-y-2">
+
{indexFiles.map((file, idx) => (
+
<div key={idx} className="flex items-center gap-2">
+
<Input
+
value={file}
+
onChange={(e) => {
+
const newFiles = [...indexFiles]
+
newFiles[idx] = e.target.value
+
setIndexFiles(newFiles)
+
}}
+
placeholder="index.html"
+
disabled={routingMode === 'spa'}
+
/>
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() => {
+
setIndexFiles(indexFiles.filter((_, i) => i !== idx))
+
}}
+
disabled={routingMode === 'spa'}
+
className="w-20"
+
>
+
Remove
+
</Button>
+
</div>
+
))}
+
<div className="flex items-center gap-2">
+
<Input
+
value={newIndexFile}
+
onChange={(e) => setNewIndexFile(e.target.value)}
+
placeholder="Add index file..."
+
onKeyDown={(e) => {
+
if (e.key === 'Enter' && newIndexFile.trim()) {
+
setIndexFiles([...indexFiles, newIndexFile.trim()])
+
setNewIndexFile('')
+
}
+
}}
+
disabled={routingMode === 'spa'}
+
/>
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() => {
+
if (newIndexFile.trim()) {
+
setIndexFiles([...indexFiles, newIndexFile.trim()])
+
setNewIndexFile('')
+
}
+
}}
+
disabled={routingMode === 'spa'}
+
className="w-20"
+
>
+
Add
+
</Button>
+
</div>
+
</div>
+
</div>
+
+
{/* Clean URLs */}
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<Checkbox
+
id="clean-urls"
+
checked={cleanUrls}
+
onCheckedChange={(checked) => setCleanUrls(!!checked)}
+
/>
+
<Label htmlFor="clean-urls" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">Clean URLs</p>
+
<p className="text-xs text-muted-foreground">
+
Serve /about as /about.html or /about/index.html
+
</p>
+
</div>
+
</Label>
+
</div>
+
+
{/* CORS */}
+
<div className="space-y-3">
+
<div className="flex items-center space-x-3 p-3 border rounded-lg">
+
<Checkbox
+
id="cors-enabled"
+
checked={corsEnabled}
+
onCheckedChange={(checked) => setCorsEnabled(!!checked)}
+
/>
+
<Label htmlFor="cors-enabled" className="flex-1 cursor-pointer">
+
<div>
+
<p className="font-medium">Enable CORS</p>
+
<p className="text-xs text-muted-foreground">
+
Allow cross-origin requests
+
</p>
+
</div>
+
</Label>
+
</div>
+
{corsEnabled && (
+
<div className="ml-7 space-y-2">
+
<Label htmlFor="cors-origin" className="text-sm">Allowed Origin</Label>
+
<Input
+
id="cors-origin"
+
value={corsOrigin}
+
onChange={(e) => setCorsOrigin(e.target.value)}
+
placeholder="*"
+
/>
+
<p className="text-xs text-muted-foreground">
+
Use * for all origins, or specify a domain like https://example.com
+
</p>
+
</div>
+
)}
+
</div>
+
</TabsContent>
+
</Tabs>
+
</div>
+
)}
+
<DialogFooter className="flex flex-col sm:flex-row sm:justify-between gap-2">
+
<Button
+
variant="destructive"
+
onClick={handleDeleteSite}
+
disabled={isSavingConfig || isDeletingSite}
+
className="sm:mr-auto"
+
>
+
{isDeletingSite ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Deleting...
+
</>
+
) : (
+
<>
+
<Trash2 className="w-4 h-4 mr-2" />
+
Delete Site
+
</>
+
)}
+
</Button>
+
<div className="flex flex-col sm:flex-row gap-2 w-full sm:w-auto">
+
<Button
+
variant="outline"
+
onClick={() => setConfiguringSite(null)}
+
disabled={isSavingConfig || isDeletingSite}
+
className="w-full sm:w-auto"
+
>
+
Cancel
+
</Button>
+
<Button
+
onClick={handleSaveSiteConfig}
+
disabled={isSavingConfig || isDeletingSite}
+
className="w-full sm:w-auto"
+
>
+
{isSavingConfig ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Saving...
+
</>
+
) : (
+
'Save'
+
)}
+
</Button>
+
</div>
+
</DialogFooter>
+
</DialogContent>
+
</Dialog>
+
</div>
+
)
+
}
+
+
const root = createRoot(document.getElementById('elysia')!)
+
root.render(
+
<Layout className="gap-6">
+
<Dashboard />
+
</Layout>
+
)
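The `handleSaveSiteConfig` handler above assembles the settings payload inline from the routing-mode state. That mapping can be isolated as a pure, testable helper — a sketch only; `buildSiteSettings` and its input type are hypothetical, not part of the repo:

```typescript
type RoutingMode = 'default' | 'spa' | 'directory' | 'custom404'

// Hypothetical input bundling the modal's settings state
interface SettingsInput {
    routingMode: RoutingMode
    spaFile: string
    custom404File: string
    indexFiles: string[]
    cleanUrls: boolean
    corsEnabled: boolean
    corsOrigin: string
}

// Mirrors the payload shape POSTed to /api/site/:rkey/settings
function buildSiteSettings(input: SettingsInput): Record<string, unknown> {
    const settings: Record<string, unknown> = {
        cleanUrls: input.cleanUrls,
        // Drop blank entries, matching the handler's filter
        indexFiles: input.indexFiles.filter((f) => f.trim() !== '')
    }
    // Exactly one routing-mode key is set, so modes stay mutually exclusive
    if (input.routingMode === 'spa') settings.spaMode = input.spaFile
    else if (input.routingMode === 'directory') settings.directoryListing = true
    else if (input.routingMode === 'custom404') settings.custom404 = input.custom404File
    if (input.corsEnabled) {
        settings.headers = [
            { name: 'Access-Control-Allow-Origin', value: input.corsOrigin }
        ]
    }
    return settings
}
```

Keeping the payload construction pure makes the mode exclusivity (SPA vs. directory listing vs. custom 404) checkable without mounting the dialog.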
+239
apps/main-app/public/editor/hooks/useDomainData.ts
···
+
import { useState } from 'react'
+
+
export interface CustomDomain {
+
id: string
+
domain: string
+
did: string
+
rkey: string
+
verified: boolean
+
last_verified_at: number | null
+
created_at: number
+
}
+
+
export interface WispDomain {
+
domain: string
+
rkey: string | null
+
}
+
+
type VerificationStatus = 'idle' | 'verifying' | 'success' | 'error'
+
+
export function useDomainData() {
+
const [wispDomains, setWispDomains] = useState<WispDomain[]>([])
+
const [customDomains, setCustomDomains] = useState<CustomDomain[]>([])
+
const [domainsLoading, setDomainsLoading] = useState(true)
+
const [verificationStatus, setVerificationStatus] = useState<{
+
[id: string]: VerificationStatus
+
}>({})
+
+
const fetchDomains = async () => {
+
try {
+
const response = await fetch('/api/user/domains')
+
const data = await response.json()
+
setWispDomains(data.wispDomains || [])
+
setCustomDomains(data.customDomains || [])
+
} catch (err) {
+
console.error('Failed to fetch domains:', err)
+
} finally {
+
setDomainsLoading(false)
+
}
+
}
+
+
const addCustomDomain = async (domain: string) => {
+
try {
+
const response = await fetch('/api/domain/custom/add', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ domain })
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return { success: true, id: data.id }
+
} else {
+
throw new Error(data.error || 'Failed to add domain')
+
}
+
} catch (err) {
+
console.error('Add domain error:', err)
+
alert(
+
`Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return { success: false }
+
}
+
}
+
+
const verifyDomain = async (id: string) => {
+
setVerificationStatus(prev => ({ ...prev, [id]: 'verifying' }))
+
+
try {
+
const response = await fetch('/api/domain/custom/verify', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ id })
+
})
+
+
const data = await response.json()
+
if (data.success && data.verified) {
+
setVerificationStatus(prev => ({ ...prev, [id]: 'success' }))
+
await fetchDomains()
+
} else {
+
setVerificationStatus(prev => ({ ...prev, [id]: 'error' }))
+
if (data.error) {
+
alert(`Verification failed: ${data.error}`)
+
}
+
}
+
} catch (err) {
+
console.error('Verify domain error:', err)
+
setVerificationStatus(prev => ({ ...prev, [id]: 'error' }))
+
alert(
+
`Verification failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
}
+
}
+
+
const deleteCustomDomain = async (id: string) => {
+
if (!confirm('Are you sure you want to remove this custom domain?')) {
+
return false
+
}
+
+
try {
+
const response = await fetch(`/api/domain/custom/${id}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return true
+
} else {
+
throw new Error('Failed to delete domain')
+
}
+
} catch (err) {
+
console.error('Delete domain error:', err)
+
alert(
+
`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
const mapWispDomain = async (domain: string, siteRkey: string | null) => {
+
try {
+
const response = await fetch('/api/domain/wisp/map-site', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ domain, siteRkey })
+
})
+
const data = await response.json()
+
if (!data.success) throw new Error('Failed to map wisp domain')
+
return true
+
} catch (err) {
+
console.error('Map wisp domain error:', err)
+
throw err
+
}
+
}
+
+
const deleteWispDomain = async (domain: string) => {
+
if (!confirm('Are you sure you want to remove this wisp.place domain?')) {
+
return false
+
}
+
+
try {
+
const response = await fetch(`/api/domain/wisp/${encodeURIComponent(domain)}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return true
+
} else {
+
throw new Error('Failed to delete domain')
+
}
+
} catch (err) {
+
console.error('Delete wisp domain error:', err)
+
alert(
+
`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
const mapCustomDomain = async (domainId: string, siteRkey: string | null) => {
+
try {
+
const response = await fetch(`/api/domain/custom/${domainId}/map-site`, {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ siteRkey })
+
})
+
const data = await response.json()
+
if (!data.success) throw new Error(`Failed to map custom domain ${domainId}`)
+
return true
+
} catch (err) {
+
console.error('Map custom domain error:', err)
+
throw err
+
}
+
}
+
+
const claimWispDomain = async (handle: string) => {
+
try {
+
const response = await fetch('/api/domain/claim', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ handle })
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
await fetchDomains()
+
return { success: true }
+
} else {
+
throw new Error(data.error || 'Failed to claim domain')
+
}
+
} catch (err) {
+
console.error('Claim domain error:', err)
+
const errorMessage = err instanceof Error ? err.message : 'Unknown error'
+
+
// Handle domain limit error more gracefully
+
if (errorMessage.includes('Domain limit reached')) {
+
alert('You have already claimed 3 wisp.place subdomains (maximum limit).')
+
await fetchDomains()
+
} else {
+
alert(`Failed to claim domain: ${errorMessage}`)
+
}
+
return { success: false, error: errorMessage }
+
}
+
}
+
+
const checkWispAvailability = async (handle: string) => {
+
const trimmedHandle = handle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
return { available: null }
+
}
+
+
try {
+
const response = await fetch(`/api/domain/check?handle=${encodeURIComponent(trimmedHandle)}`)
+
const data = await response.json()
+
return { available: data.available }
+
} catch (err) {
+
console.error('Check availability error:', err)
+
return { available: false }
+
}
+
}
+
+
return {
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
fetchDomains,
+
addCustomDomain,
+
verifyDomain,
+
deleteCustomDomain,
+
mapWispDomain,
+
deleteWispDomain,
+
mapCustomDomain,
+
claimWispDomain,
+
checkWispAvailability
+
}
+
}
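The save flow in the editor walks the currently mapped domains and the selected set, unmapping removals and mapping additions one request at a time. The diff itself is a small pure computation — sketched below under the assumption it operates on plain domain-identifier lists; `planMappingChanges` is illustrative, not an existing export:

```typescript
// Given the domains currently mapped to a site and the desired selection,
// compute which identifiers need an unmap call and which need a map call.
function planMappingChanges(current: string[], desired: string[]) {
    const desiredSet = new Set(desired)
    const currentSet = new Set(current)
    return {
        toUnmap: current.filter((d) => !desiredSet.has(d)),
        toMap: desired.filter((d) => !currentSet.has(d))
    }
}
```

Computing the plan first, then issuing the `mapWispDomain` / `mapCustomDomain` calls, keeps the network loop free of set logic and avoids redundant re-maps of unchanged domains.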
+112
apps/main-app/public/editor/hooks/useSiteData.ts
···
+
import { useState } from 'react'
+
+
export interface Site {
+
did: string
+
rkey: string
+
display_name: string | null
+
created_at: number
+
updated_at: number
+
}
+
+
export interface DomainInfo {
+
type: 'wisp' | 'custom'
+
domain: string
+
verified?: boolean
+
id?: string
+
}
+
+
export interface SiteWithDomains extends Site {
+
domains?: DomainInfo[]
+
}
+
+
export function useSiteData() {
+
const [sites, setSites] = useState<SiteWithDomains[]>([])
+
const [sitesLoading, setSitesLoading] = useState(true)
+
const [isSyncing, setIsSyncing] = useState(false)
+
+
const fetchSites = async () => {
+
try {
+
const response = await fetch('/api/user/sites')
+
const data = await response.json()
+
const sitesData: Site[] = data.sites || []
+
+
// Fetch domain info for each site
+
const sitesWithDomains = await Promise.all(
+
sitesData.map(async (site) => {
+
try {
+
const domainsResponse = await fetch(`/api/user/site/${site.rkey}/domains`)
+
const domainsData = await domainsResponse.json()
+
return {
+
...site,
+
domains: domainsData.domains || []
+
}
+
} catch (err) {
+
console.error(`Failed to fetch domains for site ${site.rkey}:`, err)
+
return {
+
...site,
+
domains: []
+
}
+
}
+
})
+
)
+
+
setSites(sitesWithDomains)
+
} catch (err) {
+
console.error('Failed to fetch sites:', err)
+
} finally {
+
setSitesLoading(false)
+
}
+
}
+
+
const syncSites = async () => {
+
setIsSyncing(true)
+
try {
+
const response = await fetch('/api/user/sync', {
+
method: 'POST'
+
})
+
const data = await response.json()
+
if (data.success) {
+
console.log(`Synced ${data.synced} sites from PDS`)
+
// Refresh sites list
+
await fetchSites()
+
}
+
} catch (err) {
+
console.error('Failed to sync sites:', err)
+
alert('Failed to sync sites from PDS')
+
} finally {
+
setIsSyncing(false)
+
}
+
}
+
+
const deleteSite = async (rkey: string) => {
+
try {
+
const response = await fetch(`/api/site/${rkey}`, {
+
method: 'DELETE'
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
// Refresh sites list
+
await fetchSites()
+
return true
+
} else {
+
throw new Error(data.error || 'Failed to delete site')
+
}
+
} catch (err) {
+
console.error('Delete site error:', err)
+
alert(
+
`Failed to delete site: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
return false
+
}
+
}
+
+
return {
+
sites,
+
sitesLoading,
+
isSyncing,
+
fetchSites,
+
syncSites,
+
deleteSite
+
}
+
}
+29
apps/main-app/public/editor/hooks/useUserInfo.ts
···
+
import { useState } from 'react'
+
+
export interface UserInfo {
+
did: string
+
handle: string
+
}
+
+
export function useUserInfo() {
+
const [userInfo, setUserInfo] = useState<UserInfo | null>(null)
+
const [loading, setLoading] = useState(true)
+
+
const fetchUserInfo = async () => {
+
try {
+
const response = await fetch('/api/user/info')
+
const data = await response.json()
+
setUserInfo(data)
+
} catch (err) {
+
console.error('Failed to fetch user info:', err)
+
} finally {
+
setLoading(false)
+
}
+
}
+
+
return {
+
userInfo,
+
loading,
+
fetchUserInfo
+
}
+
}
+53
apps/main-app/public/editor/index.html
···
+
<!doctype html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>wisp.place</title>
+
<meta name="description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/editor" />
+
<meta property="og:title" content="Editor - wisp.place" />
+
<meta property="og:description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary" />
+
<meta name="twitter:url" content="https://wisp.place/editor" />
+
<meta name="twitter:title" content="Editor - wisp.place" />
+
<meta name="twitter:description" content="Manage your decentralized static sites hosted on AT Protocol." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
+
<link rel="icon" type="image/x-icon" href="../favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="../favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="../favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="../apple-touch-icon.png">
+
<link rel="manifest" href="../site.webmanifest">
+
<style>
+
/* Dark theme fallback styles for before JS loads */
+
@media (prefers-color-scheme: dark) {
+
body {
+
background-color: oklch(0.23 0.015 285);
+
color: oklch(0.90 0.005 285);
+
}
+
+
pre {
+
background-color: oklch(0.33 0.015 285) !important;
+
color: oklch(0.90 0.005 285) !important;
+
}
+
+
.bg-muted {
+
background-color: oklch(0.33 0.015 285) !important;
+
}
+
}
+
</style>
+
</head>
+
<body>
+
<div id="elysia"></div>
+
<script type="module" src="./editor.tsx"></script>
+
</body>
+
</html>
+369
apps/main-app/public/editor/tabs/CLITab.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle,
+
} from "@public/components/ui/card";
+
import { Badge } from "@public/components/ui/badge";
+
import { ExternalLink } from "lucide-react";
+
import { CodeBlock } from "@public/components/ui/code-block";
+
+
export function CLITab() {
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="flex items-center gap-2 mb-2">
+
<CardTitle>Wisp CLI Tool</CardTitle>
+
<Badge variant="secondary" className="text-xs">
+
v0.2.0
+
</Badge>
+
<Badge variant="outline" className="text-xs">
+
Alpha
+
</Badge>
+
</div>
+
<CardDescription>
+
Deploy static sites directly from your terminal
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-6">
+
<div className="prose prose-sm max-w-none dark:prose-invert">
+
<p className="text-sm text-muted-foreground">
+
The Wisp CLI is a command-line tool for deploying static websites
+
directly to your AT Protocol account. Authenticate with an app
+
password or OAuth, and deploy from CI/CD pipelines.
+
</p>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Features</h3>
+
<ul className="text-sm text-muted-foreground space-y-2 list-disc list-inside">
+
<li>
+
<strong>Deploy:</strong> Push static sites directly from your
+
terminal
+
</li>
+
<li>
+
<strong>Pull:</strong> Download sites from the PDS for
+
development or backup
+
</li>
+
<li>
+
<strong>Serve:</strong> Run a local server with real-time
+
firehose updates
+
</li>
+
</ul>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Download v0.2.0</h3>
+
<div className="grid gap-2">
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-darwin"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">
+
macOS (Apple Silicon)
+
</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">
+
SHA-1: 9281454860f2eb07b39b80f7a9cc8e9bdcff491b
+
</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-linux"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Linux (ARM64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">
+
SHA-1: d460863150c4c162b7e7e3801a67746da3aaf9d9
+
</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Linux (x86_64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">
+
SHA-1: 94968abed20422df826b78c38cb506dd4b1b5885
+
</span>
+
</div>
+
</div>
+
<div className="p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border">
+
<a
+
href="https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-windows.exe"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between mb-2"
+
>
+
<span className="font-mono text-sm">Windows (x86_64)</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<div className="text-xs text-muted-foreground">
+
<span className="font-mono">
+
SHA-1: 45293e47da38b97ef35258a08cb2682eee64a659
+
</span>
+
</div>
+
</div>
+
</div>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Deploy a Site</h3>
+
<CodeBlock
+
code={`# Download and make executable
+
curl -O https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-aarch64-darwin
+
chmod +x wisp-cli-aarch64-darwin
+
+
# Deploy your site
+
./wisp-cli-aarch64-darwin deploy your-handle.bsky.social \\
+
--path ./dist \\
+
--site my-site \\
+
--password your-app-password
+
+
# Your site will be available at:
+
# https://sites.wisp.place/your-handle/my-site`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Pull a Site from PDS</h3>
+
<p className="text-xs text-muted-foreground">
+
Download a site from the PDS to your local machine (uses OAuth
+
authentication):
+
</p>
+
<CodeBlock
+
code={`# Pull a site to a specific directory
+
wisp-cli pull your-handle.bsky.social \\
+
--site my-site \\
+
--output ./my-site
+
+
# Pull to current directory
+
wisp-cli pull your-handle.bsky.social \\
+
--site my-site
+
+
# Opens browser for OAuth authentication on first run`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">
+
Serve a Site Locally with Real-Time Updates
+
</h3>
+
<p className="text-xs text-muted-foreground">
+
Run a local server that monitors the firehose for real-time
+
updates (uses OAuth authentication):
+
</p>
+
<CodeBlock
+
code={`# Serve on http://localhost:8080 (default)
+
wisp-cli serve your-handle.bsky.social \\
+
--site my-site
+
+
# Serve on a custom port
+
wisp-cli serve your-handle.bsky.social \\
+
--site my-site \\
+
--port 3000
+
+
# Downloads site, serves it, and watches firehose for live updates!`}
+
language="bash"
+
/>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">
+
CI/CD with Tangled Spindle
+
</h3>
+
<p className="text-xs text-muted-foreground">
+
Deploy automatically on every push using{" "}
+
<a
+
href="https://blog.tangled.org/ci"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:underline"
+
>
+
Tangled Spindle
+
</a>.
+
</p>
+
+
<div className="space-y-4">
+
<div>
+
<h4 className="text-xs font-semibold mb-2 flex items-center gap-2">
+
<span>Example 1: Simple Asset Publishing</span>
+
<Badge variant="secondary" className="text-xs">
+
Copy Files
+
</Badge>
+
</h4>
+
<CodeBlock
+
code={`when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
+
dependencies:
+
nixpkgs:
+
- coreutils
+
- curl
+
+
environment:
+
SITE_PATH: '.' # Copy entire repo
+
SITE_NAME: 'myWebbedSite'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: deploy assets to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli deploy \\
+
"$WISP_HANDLE" \\
+
--path "$SITE_PATH" \\
+
--site "$SITE_NAME" \\
+
--password "$WISP_APP_PASSWORD"
+
+
# Output
+
# Deployed site 'myWebbedSite': at://did:plc:ttdrpj45ibqunmfhdsb4zdwq/place.wisp.fs/myWebbedSite
+
# Available at: https://sites.wisp.place/did:plc:ttdrpj45ibqunmfhdsb4zdwq/myWebbedSite
+
`}
+
language="yaml"
+
/>
+
</div>
+
+
<div>
+
<h4 className="text-xs font-semibold mb-2 flex items-center gap-2">
+
<span>Example 2: React/Vite Build & Deploy</span>
+
<Badge variant="secondary" className="text-xs">
+
Full Build
+
</Badge>
+
</h4>
+
<CodeBlock
+
code={`when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
submodules: false
+
+
dependencies:
+
nixpkgs:
+
- nodejs
+
- coreutils
+
- curl
+
github:NixOS/nixpkgs/nixpkgs-unstable:
+
- bun
+
+
environment:
+
SITE_PATH: 'dist'
+
SITE_NAME: 'my-react-site'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: build site
+
command: |
+
# necessary to ensure bun is in PATH
+
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+
bun install --frozen-lockfile
+
+
# build with vite, run directly to get around env issues
+
bun node_modules/.bin/vite build
+
+
- name: deploy to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli deploy \\
+
"$WISP_HANDLE" \\
+
--path "$SITE_PATH" \\
+
--site "$SITE_NAME" \\
+
--password "$WISP_APP_PASSWORD"`}
+
language="yaml"
+
/>
+
</div>
+
</div>
+
+
<div className="p-3 bg-muted/30 rounded-lg border-l-4 border-accent">
+
<p className="text-xs text-muted-foreground">
+
<strong className="text-foreground">Note:</strong> Set{" "}
+
<code className="px-1.5 py-0.5 bg-background rounded text-xs">
+
WISP_APP_PASSWORD
+
</code>{" "}
+
as a secret in your Tangled Spindle repository settings.
+
Generate an app password from your AT Protocol account settings.
+
</p>
+
</div>
+
</div>
+
+
<div className="space-y-3">
+
<h3 className="text-sm font-semibold">Learn More</h3>
+
<div className="grid gap-2">
+
<a
+
href="https://docs.wisp.place/cli"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border"
+
>
+
<span className="text-sm">CLI Documentation</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<a
+
href="https://tangled.org/@nekomimi.pet/wisp.place-monorepo/tree/main/cli"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border"
+
>
+
<span className="text-sm">Source Code</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
<a
+
href="https://blog.tangled.org/ci"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="flex items-center justify-between p-3 bg-muted/50 hover:bg-muted rounded-lg transition-colors border border-border"
+
>
+
<span className="text-sm">Tangled Spindle CI/CD</span>
+
<ExternalLink className="w-4 h-4 text-muted-foreground" />
+
</a>
+
</div>
+
</div>
+
</CardContent>
+
</Card>
+
</div>
+
);
+
}
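The download list in the tab above publishes a SHA-1 checksum next to each binary. As a sketch of how a user would verify a download before running it (the file name and expected hash below are placeholders, not the real release artifacts):

```shell
# Verify a downloaded binary against a published SHA-1 checksum.
# "wisp-cli-demo" and the expected hash are stand-ins for this sketch.
printf 'hello\n' > wisp-cli-demo   # stand-in for the real curl download

expected="f572d396fae9206628714fb2ce00f72e94f2258f"
# use `shasum -a 1` instead of `sha1sum` on macOS
actual=$(sha1sum wisp-cli-demo | awk '{print $1}')

if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH: got $actual"
fi
```

Only after the checksum matches should the binary be marked executable with `chmod +x`.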
+567
apps/main-app/public/editor/tabs/DomainsTab.tsx
···
+
import { useState } from 'react'
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Input } from '@public/components/ui/input'
+
import { Label } from '@public/components/ui/label'
+
import { Badge } from '@public/components/ui/badge'
+
import { SkeletonShimmer } from '@public/components/ui/skeleton'
+
import {
+
Dialog,
+
DialogContent,
+
DialogDescription,
+
DialogHeader,
+
DialogTitle,
+
DialogFooter
+
} from '@public/components/ui/dialog'
+
import {
+
CheckCircle2,
+
XCircle,
+
Loader2,
+
Trash2
+
} from 'lucide-react'
+
import type { WispDomain, CustomDomain } from '../hooks/useDomainData'
+
import type { UserInfo } from '../hooks/useUserInfo'
+
+
interface DomainsTabProps {
+
wispDomains: WispDomain[]
+
customDomains: CustomDomain[]
+
domainsLoading: boolean
+
verificationStatus: { [id: string]: 'idle' | 'verifying' | 'success' | 'error' }
+
userInfo: UserInfo | null
+
onAddCustomDomain: (domain: string) => Promise<{ success: boolean; id?: string }>
+
onVerifyDomain: (id: string) => Promise<void>
+
onDeleteCustomDomain: (id: string) => Promise<boolean>
+
onDeleteWispDomain: (domain: string) => Promise<boolean>
+
onClaimWispDomain: (handle: string) => Promise<{ success: boolean; error?: string }>
+
onCheckWispAvailability: (handle: string) => Promise<{ available: boolean | null }>
+
}
+
+
export function DomainsTab({
+
wispDomains,
+
customDomains,
+
domainsLoading,
+
verificationStatus,
+
userInfo,
+
onAddCustomDomain,
+
onVerifyDomain,
+
onDeleteCustomDomain,
+
onDeleteWispDomain,
+
onClaimWispDomain,
+
onCheckWispAvailability
+
}: DomainsTabProps) {
+
// Wisp domain claim state
+
const [wispHandle, setWispHandle] = useState('')
+
const [isClaimingWisp, setIsClaimingWisp] = useState(false)
+
const [wispAvailability, setWispAvailability] = useState<{
+
available: boolean | null
+
checking: boolean
+
}>({ available: null, checking: false })
+
+
// Custom domain modal state
+
const [addDomainModalOpen, setAddDomainModalOpen] = useState(false)
+
const [customDomain, setCustomDomain] = useState('')
+
const [isAddingDomain, setIsAddingDomain] = useState(false)
+
const [viewDomainDNS, setViewDomainDNS] = useState<string | null>(null)
+
+
const checkWispAvailability = async (handle: string) => {
+
const trimmedHandle = handle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
setWispAvailability({ available: null, checking: false })
+
return
+
}
+
+
setWispAvailability({ available: null, checking: true })
+
const result = await onCheckWispAvailability(trimmedHandle)
+
setWispAvailability({ available: result.available, checking: false })
+
}
+
+
const handleClaimWispDomain = async () => {
+
const trimmedHandle = wispHandle.trim().toLowerCase()
+
if (!trimmedHandle) {
+
alert('Please enter a handle')
+
return
+
}
+
+
setIsClaimingWisp(true)
+
const result = await onClaimWispDomain(trimmedHandle)
+
if (result.success) {
+
setWispHandle('')
+
setWispAvailability({ available: null, checking: false })
+
}
+
setIsClaimingWisp(false)
+
}
+
+
const handleAddCustomDomain = async () => {
+
if (!customDomain) {
+
alert('Please enter a domain')
+
return
+
}
+
+
setIsAddingDomain(true)
+
const result = await onAddCustomDomain(customDomain)
+
setIsAddingDomain(false)
+
+
if (result.success) {
+
setCustomDomain('')
+
setAddDomainModalOpen(false)
+
// Automatically show DNS configuration for the newly added domain
+
if (result.id) {
+
setViewDomainDNS(result.id)
+
}
+
}
+
}
+
+
return (
+
<>
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<CardTitle>wisp.place Subdomains</CardTitle>
+
<CardDescription>
+
Your free subdomains on the wisp.place network (up to 3)
+
</CardDescription>
+
</CardHeader>
+
<CardContent>
+
{domainsLoading ? (
+
<div className="space-y-4">
+
<div className="space-y-2">
+
{[...Array(2)].map((_, i) => (
+
<div
+
key={i}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-2 flex-1">
+
<div className="flex items-center gap-2">
+
<SkeletonShimmer className="h-4 w-4 rounded-full" />
+
<SkeletonShimmer className="h-4 w-40" />
+
</div>
+
<SkeletonShimmer className="h-3 w-32 ml-6" />
+
</div>
+
<SkeletonShimmer className="h-8 w-8" />
+
</div>
+
))}
+
</div>
+
<div className="p-4 bg-muted/30 rounded-lg space-y-3">
+
<SkeletonShimmer className="h-4 w-full" />
+
<div className="space-y-2">
+
<SkeletonShimmer className="h-4 w-24" />
+
<SkeletonShimmer className="h-10 w-full" />
+
</div>
+
<SkeletonShimmer className="h-10 w-full" />
+
</div>
+
</div>
+
) : (
+
<div className="space-y-4">
+
{wispDomains.length > 0 && (
+
<div className="space-y-2">
+
{wispDomains.map((domain) => (
+
<div
+
key={domain.domain}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-1 flex-1">
+
<div className="flex items-center gap-2">
+
<CheckCircle2 className="w-4 h-4 text-green-500" />
+
<span className="font-mono">
+
{domain.domain}
+
</span>
+
</div>
+
{domain.rkey && (
+
<p className="text-xs text-muted-foreground ml-6">
+
→ Mapped to site: {domain.rkey}
+
</p>
+
)}
+
</div>
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() => onDeleteWispDomain(domain.domain)}
+
>
+
<Trash2 className="w-4 h-4" />
+
</Button>
+
</div>
+
))}
+
</div>
+
)}
+
+
{wispDomains.length < 3 && (
+
<div className="p-4 bg-muted/30 rounded-lg">
+
<p className="text-sm text-muted-foreground mb-4">
+
{wispDomains.length === 0
+
? 'Claim your free wisp.place subdomain'
+
: `Claim another wisp.place subdomain (${wispDomains.length}/3)`}
+
</p>
+
<div className="space-y-3">
+
<div className="space-y-2">
+
<Label htmlFor="wisp-handle">Choose your handle</Label>
+
<div className="flex gap-2">
+
<div className="flex-1 relative">
+
<Input
+
id="wisp-handle"
+
placeholder="mysite"
+
value={wispHandle}
+
onChange={(e) => {
+
setWispHandle(e.target.value)
+
if (e.target.value.trim()) {
+
checkWispAvailability(e.target.value)
+
} else {
+
setWispAvailability({ available: null, checking: false })
+
}
+
}}
+
disabled={isClaimingWisp}
+
className="pr-24"
+
/>
+
<span className="absolute right-3 top-1/2 -translate-y-1/2 text-sm text-muted-foreground">
+
.wisp.place
+
</span>
+
</div>
+
</div>
+
{wispAvailability.checking && (
+
<p className="text-xs text-muted-foreground flex items-center gap-1">
+
<Loader2 className="w-3 h-3 animate-spin" />
+
Checking availability...
+
</p>
+
)}
+
{!wispAvailability.checking && wispAvailability.available === true && (
+
<p className="text-xs text-green-600 flex items-center gap-1">
+
<CheckCircle2 className="w-3 h-3" />
+
Available
+
</p>
+
)}
+
{!wispAvailability.checking && wispAvailability.available === false && (
+
<p className="text-xs text-red-600 flex items-center gap-1">
+
<XCircle className="w-3 h-3" />
+
Not available
+
</p>
+
)}
+
</div>
+
<Button
+
onClick={handleClaimWispDomain}
+
disabled={!wispHandle.trim() || isClaimingWisp || wispAvailability.available !== true}
+
className="w-full"
+
>
+
{isClaimingWisp ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Claiming...
+
</>
+
) : (
+
'Claim Subdomain'
+
)}
+
</Button>
+
</div>
+
</div>
+
)}
+
+
{wispDomains.length === 3 && (
+
<div className="p-3 bg-muted/30 rounded-lg text-center">
+
<p className="text-sm text-muted-foreground">
+
You have claimed the maximum of 3 wisp.place subdomains
+
</p>
+
</div>
+
)}
+
</div>
+
)}
+
</CardContent>
+
</Card>
+
+
<Card>
+
<CardHeader>
+
<CardTitle>Custom Domains</CardTitle>
+
<CardDescription>
+
Bring your own domain with DNS verification
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
<Button
+
onClick={() => setAddDomainModalOpen(true)}
+
className="w-full"
+
>
+
Add Custom Domain
+
</Button>
+
+
{domainsLoading ? (
+
<div className="space-y-2">
+
{[...Array(2)].map((_, i) => (
+
<div
+
key={i}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-2 flex-1">
+
<div className="flex items-center gap-2">
+
<SkeletonShimmer className="h-4 w-4 rounded-full" />
+
<SkeletonShimmer className="h-4 w-48" />
+
</div>
+
<SkeletonShimmer className="h-3 w-36 ml-6" />
+
</div>
+
<div className="flex items-center gap-2">
+
<SkeletonShimmer className="h-8 w-20" />
+
<SkeletonShimmer className="h-8 w-20" />
+
<SkeletonShimmer className="h-8 w-8" />
+
</div>
+
</div>
+
))}
+
</div>
+
) : customDomains.length === 0 ? (
+
<div className="text-center py-4 text-muted-foreground text-sm">
+
No custom domains added yet
+
</div>
+
) : (
+
<div className="space-y-2">
+
{customDomains.map((domain) => (
+
<div
+
key={domain.id}
+
className="flex items-center justify-between p-3 border border-border rounded-lg"
+
>
+
<div className="flex flex-col gap-1 flex-1">
+
<div className="flex items-center gap-2">
+
{domain.verified ? (
+
<CheckCircle2 className="w-4 h-4 text-green-500" />
+
) : (
+
<XCircle className="w-4 h-4 text-red-500" />
+
)}
+
<span className="font-mono">
+
{domain.domain}
+
</span>
+
</div>
+
{domain.rkey && domain.rkey !== 'self' && (
+
<p className="text-xs text-muted-foreground ml-6">
+
→ Mapped to site: {domain.rkey}
+
</p>
+
)}
+
</div>
+
<div className="flex items-center gap-2">
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() =>
+
setViewDomainDNS(domain.id)
+
}
+
>
+
View DNS
+
</Button>
+
{domain.verified ? (
+
<Badge variant="secondary">
+
Verified
+
</Badge>
+
) : (
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() =>
+
onVerifyDomain(domain.id)
+
}
+
disabled={
+
verificationStatus[
+
domain.id
+
] === 'verifying'
+
}
+
>
+
{verificationStatus[
+
domain.id
+
] === 'verifying' ? (
+
<>
+
<Loader2 className="w-3 h-3 mr-1 animate-spin" />
+
Verifying...
+
</>
+
) : (
+
'Verify DNS'
+
)}
+
</Button>
+
)}
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() =>
+
onDeleteCustomDomain(
+
domain.id
+
)
+
}
+
>
+
<Trash2 className="w-4 h-4" />
+
</Button>
+
</div>
+
</div>
+
))}
+
</div>
+
)}
+
</CardContent>
+
</Card>
+
</div>
+
+
{/* Add Custom Domain Modal */}
+
<Dialog open={addDomainModalOpen} onOpenChange={setAddDomainModalOpen}>
+
<DialogContent className="sm:max-w-lg">
+
<DialogHeader>
+
<DialogTitle>Add Custom Domain</DialogTitle>
+
<DialogDescription>
+
Enter your domain name. After adding, you'll see the DNS
+
records to configure.
+
</DialogDescription>
+
</DialogHeader>
+
<div className="space-y-4 py-4">
+
<div className="space-y-2">
+
<Label htmlFor="new-domain">Domain Name</Label>
+
<Input
+
id="new-domain"
+
placeholder="example.com"
+
value={customDomain}
+
onChange={(e) => setCustomDomain(e.target.value)}
+
/>
+
<p className="text-xs text-muted-foreground">
+
After adding, click "View DNS" to see the records you
+
need to configure.
+
</p>
+
</div>
+
</div>
+
<DialogFooter className="flex-col sm:flex-row gap-2">
+
<Button
+
variant="outline"
+
onClick={() => {
+
setAddDomainModalOpen(false)
+
setCustomDomain('')
+
}}
+
className="w-full sm:w-auto"
+
disabled={isAddingDomain}
+
>
+
Cancel
+
</Button>
+
<Button
+
onClick={handleAddCustomDomain}
+
disabled={!customDomain || isAddingDomain}
+
className="w-full sm:w-auto"
+
>
+
{isAddingDomain ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Adding...
+
</>
+
) : (
+
'Add Domain'
+
)}
+
</Button>
+
</DialogFooter>
+
</DialogContent>
+
</Dialog>
+
+
{/* View DNS Records Modal */}
+
<Dialog
+
open={viewDomainDNS !== null}
+
onOpenChange={(open) => !open && setViewDomainDNS(null)}
+
>
+
<DialogContent className="sm:max-w-lg">
+
<DialogHeader>
+
<DialogTitle>DNS Configuration</DialogTitle>
+
<DialogDescription>
+
Add these DNS records to your domain provider
+
</DialogDescription>
+
</DialogHeader>
+
{viewDomainDNS && userInfo && (
+
<>
+
{(() => {
+
const domain = customDomains.find(
+
(d) => d.id === viewDomainDNS
+
)
+
if (!domain) return null
+
+
return (
+
<div className="space-y-4 py-4">
+
<div className="p-3 bg-muted/30 rounded-lg">
+
<p className="text-sm font-medium mb-1">
+
Domain:
+
</p>
+
<p className="font-mono text-sm">
+
{domain.domain}
+
</p>
+
</div>
+
+
<div className="space-y-3">
+
<div className="p-3 bg-background rounded border border-border">
+
<div className="flex justify-between items-start mb-2">
+
<span className="text-xs font-semibold text-muted-foreground">
+
TXT Record (Verification)
+
</span>
+
</div>
+
<div className="font-mono text-xs space-y-2">
+
<div>
+
<span className="text-muted-foreground">
+
Name:
+
</span>{' '}
+
<span className="select-all">
+
_wisp.{domain.domain}
+
</span>
+
</div>
+
<div>
+
<span className="text-muted-foreground">
+
Value:
+
</span>{' '}
+
<span className="select-all break-all">
+
{userInfo.did}
+
</span>
+
</div>
+
</div>
+
</div>
+
+
<div className="p-3 bg-background rounded border border-border">
+
<div className="flex justify-between items-start mb-2">
+
<span className="text-xs font-semibold text-muted-foreground">
+
CNAME Record (Pointing)
+
</span>
+
</div>
+
<div className="font-mono text-xs space-y-2">
+
<div>
+
<span className="text-muted-foreground">
+
Name:
+
</span>{' '}
+
<span className="select-all">
+
{domain.domain}
+
</span>
+
</div>
+
<div>
+
<span className="text-muted-foreground">
+
Value:
+
</span>{' '}
+
<span className="select-all">
+
{domain.id}.dns.wisp.place
+
</span>
+
</div>
+
</div>
+
<p className="text-xs text-muted-foreground mt-2">
+
Note: Some DNS providers (like Cloudflare) flatten CNAMEs to A records - this is fine and won't affect verification.
+
</p>
+
</div>
+
</div>
+
+
<div className="p-3 bg-muted/30 rounded-lg">
+
<p className="text-xs text-muted-foreground">
+
💡 After configuring DNS, click "Verify DNS"
+
to check if everything is set up correctly.
+
DNS changes can take a few minutes to
+
propagate.
+
</p>
+
</div>
+
</div>
+
)
+
})()}
+
</>
+
)}
+
<DialogFooter>
+
<Button
+
variant="outline"
+
onClick={() => setViewDomainDNS(null)}
+
className="w-full sm:w-auto"
+
>
+
Close
+
</Button>
+
</DialogFooter>
+
</DialogContent>
+
</Dialog>
+
</>
+
)
+
}
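The DNS modal above asks the user to create two records: a TXT record under the `_wisp.` label carrying the account DID, and a CNAME pointing the domain at `<domain-id>.dns.wisp.place`. In zone-file form (all values here are placeholders for illustration) that looks roughly like:

```dns
; verification record: _wisp.<domain> TXT <account DID>
_wisp.example.com.  300  IN  TXT    "did:plc:abc123examplexyz"

; pointing record: <domain> CNAME <domain-id>.dns.wisp.place
example.com.        300  IN  CNAME  a1b2c3d4.dns.wisp.place.
```

As the modal notes, providers that flatten apex CNAMEs into A records (such as Cloudflare) still pass verification, since the TXT lookup is what proves ownership.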
+216
apps/main-app/public/editor/tabs/SitesTab.tsx
···
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Badge } from '@public/components/ui/badge'
+
import { SkeletonShimmer } from '@public/components/ui/skeleton'
+
import {
+
Globe,
+
ExternalLink,
+
CheckCircle2,
+
AlertCircle,
+
Loader2,
+
RefreshCw,
+
Settings
+
} from 'lucide-react'
+
import type { SiteWithDomains } from '../hooks/useSiteData'
+
import type { UserInfo } from '../hooks/useUserInfo'
+
+
interface SitesTabProps {
+
sites: SiteWithDomains[]
+
sitesLoading: boolean
+
isSyncing: boolean
+
userInfo: UserInfo | null
+
onSyncSites: () => Promise<void>
+
onConfigureSite: (site: SiteWithDomains) => void
+
}
+
+
export function SitesTab({
+
sites,
+
sitesLoading,
+
isSyncing,
+
userInfo,
+
onSyncSites,
+
onConfigureSite
+
}: SitesTabProps) {
+
const getSiteUrl = (site: SiteWithDomains) => {
+
// Use the first mapped domain if available
+
if (site.domains && site.domains.length > 0) {
+
return `https://${site.domains[0].domain}`
+
}
+
+
// Default fallback URL - use handle instead of DID
+
if (!userInfo) return '#'
+
return `https://sites.wisp.place/${userInfo.handle}/${site.rkey}`
+
}
+
+
const getSiteDomainName = (site: SiteWithDomains) => {
+
// Return the first domain if available
+
if (site.domains && site.domains.length > 0) {
+
return site.domains[0].domain
+
}
+
+
// Use handle instead of DID for display
+
if (!userInfo) return `sites.wisp.place/.../${site.rkey}`
+
return `sites.wisp.place/${userInfo.handle}/${site.rkey}`
+
}
+
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<div className="flex items-center justify-between">
+
<div>
+
<CardTitle>Your Sites</CardTitle>
+
<CardDescription>
+
View and manage all your deployed sites
+
</CardDescription>
+
</div>
+
{userInfo && (
+
<Button
+
variant="outline"
+
size="sm"
+
asChild
+
>
+
<a
+
href={`https://pdsls.dev/at://${userInfo.did}/place.wisp.fs`}
+
target="_blank"
+
rel="noopener noreferrer"
+
>
+
<ExternalLink className="w-4 h-4 mr-2" />
+
View in PDS
+
</a>
+
</Button>
+
)}
+
</div>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
{sitesLoading ? (
+
<div className="space-y-4">
+
{[...Array(3)].map((_, i) => (
+
<div
+
key={i}
+
className="flex items-center justify-between p-4 border border-border rounded-lg"
+
>
+
<div className="flex-1 space-y-3">
+
<div className="flex items-center gap-3">
+
<SkeletonShimmer className="h-6 w-48" />
+
<SkeletonShimmer className="h-5 w-16" />
+
</div>
+
<SkeletonShimmer className="h-4 w-64" />
+
</div>
+
<SkeletonShimmer className="h-9 w-28" />
+
</div>
+
))}
+
</div>
+
) : sites.length === 0 ? (
+
<div className="text-center py-8 text-muted-foreground">
+
<p>No sites yet. Upload your first site!</p>
+
</div>
+
) : (
+
sites.map((site) => (
+
<div
+
key={`${site.did}-${site.rkey}`}
+
className="flex items-center justify-between p-4 border border-border rounded-lg hover:bg-muted/50 transition-colors"
+
>
+
<div className="flex-1">
+
<div className="flex items-center gap-3 mb-2">
+
<h3 className="font-semibold text-lg">
+
{site.display_name || site.rkey}
+
</h3>
+
<Badge
+
variant="secondary"
+
className="text-xs"
+
>
+
active
+
</Badge>
+
</div>
+
+
{/* Display all mapped domains */}
+
{site.domains && site.domains.length > 0 ? (
+
<div className="space-y-1">
+
{site.domains.map((domainInfo, idx) => (
+
<div key={`${domainInfo.domain}-${idx}`} className="flex items-center gap-2">
+
<a
+
href={`https://${domainInfo.domain}`}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-sm text-accent hover:text-accent/80 flex items-center gap-1"
+
>
+
<Globe className="w-3 h-3" />
+
{domainInfo.domain}
+
<ExternalLink className="w-3 h-3" />
+
</a>
+
<Badge
+
variant={domainInfo.type === 'wisp' ? 'default' : 'outline'}
+
className="text-xs"
+
>
+
{domainInfo.type}
+
</Badge>
+
{domainInfo.type === 'custom' && (
+
<Badge
+
variant={domainInfo.verified ? 'default' : 'secondary'}
+
className="text-xs"
+
>
+
{domainInfo.verified ? (
+
<>
+
<CheckCircle2 className="w-3 h-3 mr-1" />
+
verified
+
</>
+
) : (
+
<>
+
<AlertCircle className="w-3 h-3 mr-1" />
+
pending
+
</>
+
)}
+
</Badge>
+
)}
+
</div>
+
))}
+
</div>
+
) : (
+
<a
+
href={getSiteUrl(site)}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-sm text-muted-foreground hover:text-accent flex items-center gap-1"
+
>
+
{getSiteDomainName(site)}
+
<ExternalLink className="w-3 h-3" />
+
</a>
+
)}
+
</div>
+
<Button
+
variant="outline"
+
size="sm"
+
onClick={() => onConfigureSite(site)}
+
>
+
<Settings className="w-4 h-4 mr-2" />
+
Configure
+
</Button>
+
</div>
+
))
+
)}
+
</CardContent>
+
</Card>
+
+
<div className="p-4 bg-muted/30 rounded-lg border-l-4 border-yellow-500/50">
+
<div className="flex items-start gap-2">
+
<AlertCircle className="w-4 h-4 text-yellow-600 dark:text-yellow-400 mt-0.5 shrink-0" />
+
<div className="flex-1 space-y-1">
+
<p className="text-xs font-semibold text-yellow-600 dark:text-yellow-400">
+
Note about sites.wisp.place URLs
+
</p>
+
<p className="text-xs text-muted-foreground">
+
Complex sites hosted on <code className="px-1 py-0.5 bg-background rounded text-xs">sites.wisp.place</code> may have broken assets if they use absolute paths (e.g., <code className="px-1 py-0.5 bg-background rounded text-xs">/folder/script.js</code>) in CSS or JavaScript files. While HTML paths are automatically rewritten, CSS and JS files are served as-is. For best results, use a wisp.place subdomain or custom domain, or ensure your site uses relative paths.
+
</p>
+
</div>
+
</div>
+
</div>
+
</div>
+
)
+
}
+616
apps/main-app/public/editor/tabs/UploadTab.tsx
···
+
import { useState, useEffect, useRef } from 'react'
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Button } from '@public/components/ui/button'
+
import { Input } from '@public/components/ui/input'
+
import { Label } from '@public/components/ui/label'
+
import { RadioGroup, RadioGroupItem } from '@public/components/ui/radio-group'
+
import { Badge } from '@public/components/ui/badge'
+
import {
+
Globe,
+
Upload,
+
AlertCircle,
+
Loader2,
+
ChevronDown,
+
ChevronUp,
+
CheckCircle2,
+
XCircle,
+
RefreshCw
+
} from 'lucide-react'
+
import type { SiteWithDomains } from '../hooks/useSiteData'
+
+
type FileStatus = 'pending' | 'checking' | 'uploading' | 'uploaded' | 'reused' | 'failed'
+
+
interface FileProgress {
+
name: string
+
status: FileStatus
+
error?: string
+
}
+
+
interface UploadTabProps {
+
sites: SiteWithDomains[]
+
sitesLoading: boolean
+
onUploadComplete: () => Promise<void>
+
}
+
+
export function UploadTab({
+
sites,
+
sitesLoading,
+
onUploadComplete
+
}: UploadTabProps) {
+
// Upload state
+
const [siteMode, setSiteMode] = useState<'existing' | 'new'>('existing')
+
const [selectedSiteRkey, setSelectedSiteRkey] = useState<string>('')
+
const [newSiteName, setNewSiteName] = useState('')
+
const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
+
const [isUploading, setIsUploading] = useState(false)
+
const [uploadProgress, setUploadProgress] = useState('')
+
const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
+
const [failedFiles, setFailedFiles] = useState<Array<{ name: string; index: number; error: string; size: number }>>([])
+
const [uploadedCount, setUploadedCount] = useState(0)
+
const [fileProgressList, setFileProgressList] = useState<FileProgress[]>([])
+
const [showFileProgress, setShowFileProgress] = useState(false)
+
+
// Keep SSE connection alive across tab switches
+
const eventSourceRef = useRef<EventSource | null>(null)
+
const currentJobIdRef = useRef<string | null>(null)
+
+
// Auto-switch to 'new' mode if no sites exist
+
useEffect(() => {
+
if (!sitesLoading && sites.length === 0 && siteMode === 'existing') {
+
setSiteMode('new')
+
}
+
}, [sites, sitesLoading, siteMode])
+
+
// Cleanup SSE connection on unmount
+
useEffect(() => {
+
return () => {
+
// Don't close the connection on unmount (tab switch)
+
// It will be reused when the component remounts
+
}
+
}, [])
+
+
const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
+
if (e.target.files && e.target.files.length > 0) {
+
setSelectedFiles(e.target.files)
+
}
+
}
+
+
const setupSSE = (jobId: string) => {
+
// Close existing connection if any
+
if (eventSourceRef.current) {
+
eventSourceRef.current.close()
+
}
+
+
currentJobIdRef.current = jobId
+
const eventSource = new EventSource(`/wisp/upload-progress/${jobId}`)
+
eventSourceRef.current = eventSource
+
+
eventSource.addEventListener('progress', (event) => {
+
const progressData = JSON.parse(event.data)
+
const { progress, status } = progressData
+
+
// Update file progress list if we have current file info
+
if (progress.currentFile && progress.currentFileStatus) {
+
setFileProgressList(prev => {
+
const existingIndex = prev.findIndex(f => f.name === progress.currentFile)
+
if (existingIndex !== -1) {
+
// Update existing file status - create new array with single update
+
const updated = [...prev]
+
updated[existingIndex] = { ...updated[existingIndex], status: progress.currentFileStatus as FileStatus }
+
return updated
+
} else {
+
// Add new file
+
return [...prev, {
+
name: progress.currentFile,
+
status: progress.currentFileStatus as FileStatus
+
}]
+
}
+
})
+
}
+
+
// Update progress message based on phase
+
let message = 'Processing...'
+
if (progress.phase === 'validating') {
+
message = 'Validating files...'
+
} else if (progress.phase === 'compressing') {
+
const current = progress.filesProcessed || 0
+
const total = progress.totalFiles || 0
+
message = `Compressing files (${current}/${total})...`
+
if (progress.currentFile) {
+
message += ` - ${progress.currentFile}`
+
}
+
} else if (progress.phase === 'uploading') {
+
const uploaded = progress.filesUploaded || 0
+
const reused = progress.filesReused || 0
+
const total = progress.totalFiles || 0
+
message = `Uploading to PDS (${uploaded + reused}/${total})...`
+
} else if (progress.phase === 'creating_manifest') {
+
message = 'Creating manifest...'
+
} else if (progress.phase === 'finalizing') {
+
message = 'Finalizing upload...'
+
}
+
+
setUploadProgress(message)
+
})
+
+
eventSource.addEventListener('done', (event) => {
+
const result = JSON.parse(event.data)
+
eventSource.close()
+
eventSourceRef.current = null
+
currentJobIdRef.current = null
+
+
const hasIssues = (result.skippedFiles && result.skippedFiles.length > 0) ||
+
(result.failedFiles && result.failedFiles.length > 0)
+
+
// Update file progress list with failed files
+
if (result.failedFiles && result.failedFiles.length > 0) {
+
setFileProgressList(prev => {
+
const updated = [...prev]
+
result.failedFiles.forEach((failedFile: any) => {
+
const existingIndex = updated.findIndex(f => f.name === failedFile.name)
+
if (existingIndex !== -1) {
+
updated[existingIndex] = {
+
...updated[existingIndex],
+
status: 'failed',
+
error: failedFile.error
+
}
+
} else {
+
updated.push({
+
name: failedFile.name,
+
status: 'failed',
+
error: failedFile.error
+
})
+
}
+
})
+
return updated
+
})
+
}
+
+
setUploadProgress(hasIssues ? 'Upload completed with issues' : 'Upload complete!')
+
setSkippedFiles(result.skippedFiles || [])
+
setFailedFiles(result.failedFiles || [])
+
setUploadedCount(result.uploadedCount || result.fileCount || 0)
+
+
// Debug: log failed files
+
console.log('Failed files:', result.failedFiles)
+
+
// Check for 419/413 errors and show alert
+
const hasSizeError = result.failedFiles?.some((file: any) => {
+
const error = file.error?.toLowerCase() || ''
+
console.log('Checking error:', error, 'contains PDS?', error.includes('pds'))
+
return error.includes('pds is not allowing') ||
+
error.includes('your pds is not allowing') ||
+
error.includes('request entity too large')
+
})
+
+
console.log('Has size error:', hasSizeError)
+
+
if (hasSizeError) {
+
window.alert('Some files were too large for your PDS. Your PDS is not allowing uploads large enough to store your site. Please contact your PDS host. This may also happen if your PDS is behind the Cloudflare free tier.')
+
}
+
+
setSelectedSiteRkey('')
+
setNewSiteName('')
+
setSelectedFiles(null)
+
+
// Refresh sites list
+
onUploadComplete()
+
+
// Reset form (wait longer if there are issues to show)
+
const resetDelay = hasIssues ? 6000 : 1500
+
setTimeout(() => {
+
setUploadProgress('')
+
setSkippedFiles([])
+
setFailedFiles([])
+
setUploadedCount(0)
+
setFileProgressList([])
+
setIsUploading(false)
+
}, resetDelay)
+
})
+
+
eventSource.addEventListener('error', (event) => {
+
const errorData = JSON.parse((event as any).data || '{}')
+
eventSource.close()
+
eventSourceRef.current = null
+
currentJobIdRef.current = null
+
+
console.error('Upload error:', errorData)
+
alert(
+
`Upload failed: ${errorData.error || 'Unknown error'}`
+
)
+
setIsUploading(false)
+
setUploadProgress('')
+
})
+
+
eventSource.onerror = () => {
+
eventSource.close()
+
eventSourceRef.current = null
+
currentJobIdRef.current = null
+
+
console.error('SSE connection error')
+
alert('Lost connection to upload progress. The upload may still be processing.')
+
setIsUploading(false)
+
setUploadProgress('')
+
}
+
}
+
+
const handleUpload = async () => {
+
const siteName = siteMode === 'existing' ? selectedSiteRkey : newSiteName
+
+
if (!siteName) {
+
alert(siteMode === 'existing' ? 'Please select a site' : 'Please enter a site name')
+
return
+
}
+
+
setIsUploading(true)
+
setUploadProgress('Preparing files...')
+
+
try {
+
const formData = new FormData()
+
formData.append('siteName', siteName)
+
+
if (selectedFiles) {
+
for (let i = 0; i < selectedFiles.length; i++) {
+
formData.append('files', selectedFiles[i])
+
}
+
}
+
+
// If no files, handle synchronously (old behavior)
+
if (!selectedFiles || selectedFiles.length === 0) {
+
setUploadProgress('Creating empty site...')
+
const response = await fetch('/wisp/upload-files', {
+
method: 'POST',
+
body: formData
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
setUploadProgress('Site created!')
+
setSelectedSiteRkey('')
+
setNewSiteName('')
+
setSelectedFiles(null)
+
+
await onUploadComplete()
+
+
setTimeout(() => {
+
setUploadProgress('')
+
setIsUploading(false)
+
}, 1500)
+
} else {
+
throw new Error(data.error || 'Upload failed')
+
}
+
return
+
}
+
+
// For file uploads, use SSE for progress
+
setUploadProgress('Starting upload...')
+
const response = await fetch('/wisp/upload-files', {
+
method: 'POST',
+
body: formData
+
})
+
+
const data = await response.json()
+
if (!data.success || !data.jobId) {
+
throw new Error(data.error || 'Failed to start upload')
+
}
+
+
const jobId = data.jobId
+
setUploadProgress('Connecting to progress stream...')
+
+
// Setup SSE connection (persists across tab switches via ref)
+
setupSSE(jobId)
+
+
} catch (err) {
+
console.error('Upload error:', err)
+
alert(
+
`Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
setIsUploading(false)
+
setUploadProgress('')
+
}
+
}
+
+
return (
+
<div className="space-y-4 min-h-[400px]">
+
<Card>
+
<CardHeader>
+
<CardTitle>Upload Site</CardTitle>
+
<CardDescription>
+
Deploy a new site from a folder or Git repository
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-6">
+
<div className="space-y-4">
+
<div className="p-4 bg-muted/50 rounded-lg">
+
<RadioGroup
+
value={siteMode}
+
onValueChange={(value) => setSiteMode(value as 'existing' | 'new')}
+
disabled={isUploading}
+
>
+
<div className="flex items-center space-x-2">
+
<RadioGroupItem value="existing" id="existing" />
+
<Label htmlFor="existing" className="cursor-pointer">
+
Update existing site
+
</Label>
+
</div>
+
<div className="flex items-center space-x-2">
+
<RadioGroupItem value="new" id="new" />
+
<Label htmlFor="new" className="cursor-pointer">
+
Create new site
+
</Label>
+
</div>
+
</RadioGroup>
+
</div>
+
+
{siteMode === 'existing' ? (
+
<div className="space-y-2">
+
<Label htmlFor="site-select">Select Site</Label>
+
{sitesLoading ? (
+
<div className="flex items-center justify-center py-4">
+
<Loader2 className="w-5 h-5 animate-spin text-muted-foreground" />
+
</div>
+
) : sites.length === 0 ? (
+
<div className="p-4 border border-dashed rounded-lg text-center text-sm text-muted-foreground">
+
No sites available. Create a new site instead.
+
</div>
+
) : (
+
<select
+
id="site-select"
+
className="flex h-10 w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background file:border-0 file:bg-transparent file:text-sm file:font-medium placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
+
value={selectedSiteRkey}
+
onChange={(e) => setSelectedSiteRkey(e.target.value)}
+
disabled={isUploading}
+
>
+
<option value="">Select a site...</option>
+
{sites.map((site) => (
+
<option key={site.rkey} value={site.rkey}>
+
{site.display_name || site.rkey}
+
</option>
+
))}
+
</select>
+
)}
+
</div>
+
) : (
+
<div className="space-y-2">
+
<Label htmlFor="new-site-name">New Site Name</Label>
+
<Input
+
id="new-site-name"
+
placeholder="my-awesome-site"
+
value={newSiteName}
+
onChange={(e) => setNewSiteName(e.target.value)}
+
disabled={isUploading}
+
/>
+
</div>
+
)}
+
+
<p className="text-xs text-muted-foreground">
+
File limits: 100MB per file, 300MB total
+
</p>
+
</div>
+
+
<div className="grid md:grid-cols-2 gap-4">
+
<Card className="border-2 border-dashed hover:border-accent transition-colors cursor-pointer">
+
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
+
<Upload className="w-12 h-12 text-muted-foreground mb-4" />
+
<h3 className="font-semibold mb-2">
+
Upload Folder
+
</h3>
+
<p className="text-sm text-muted-foreground mb-4">
+
Drag and drop or click to upload your
+
static site files
+
</p>
+
<input
+
type="file"
+
id="file-upload"
+
multiple
+
onChange={handleFileSelect}
+
className="hidden"
+
{...(({ webkitdirectory: '', directory: '' } as any))}
+
disabled={isUploading}
+
/>
+
<label htmlFor="file-upload">
+
<Button
+
variant="outline"
+
type="button"
+
onClick={() =>
+
document
+
.getElementById('file-upload')
+
?.click()
+
}
+
disabled={isUploading}
+
>
+
Choose Folder
+
</Button>
+
</label>
+
{selectedFiles && selectedFiles.length > 0 && (
+
<p className="text-sm text-muted-foreground mt-3">
+
{selectedFiles.length} files selected
+
</p>
+
)}
+
</CardContent>
+
</Card>
+
+
<Card className="border-2 border-dashed opacity-50">
+
<CardContent className="flex flex-col items-center justify-center p-8 text-center">
+
<Globe className="w-12 h-12 text-muted-foreground mb-4" />
+
<h3 className="font-semibold mb-2">
+
Connect Git Repository
+
</h3>
+
<p className="text-sm text-muted-foreground mb-4">
+
Link your GitHub, GitLab, or any Git
+
repository
+
</p>
+
<Badge variant="secondary">Coming soon!</Badge>
+
</CardContent>
+
</Card>
+
</div>
+
+
{uploadProgress && (
+
<div className="space-y-3">
+
<div className="p-4 bg-muted rounded-lg">
+
<div className="flex items-center gap-2">
+
<Loader2 className="w-4 h-4 animate-spin" />
+
<span className="text-sm">{uploadProgress}</span>
+
</div>
+
</div>
+
+
{fileProgressList.length > 0 && (
+
<div className="border rounded-lg overflow-hidden">
+
<button
+
onClick={() => setShowFileProgress(!showFileProgress)}
+
className="w-full p-3 bg-muted/50 hover:bg-muted transition-colors flex items-center justify-between text-sm font-medium"
+
>
+
<span>
+
Processing files ({fileProgressList.filter(f => f.status === 'uploaded' || f.status === 'reused').length}/{fileProgressList.length})
+
</span>
+
{showFileProgress ? (
+
<ChevronUp className="w-4 h-4" />
+
) : (
+
<ChevronDown className="w-4 h-4" />
+
)}
+
</button>
+
{showFileProgress && (
+
<div className="max-h-64 overflow-y-auto p-3 space-y-1 bg-background">
+
{fileProgressList.map((file, idx) => (
+
<div
+
key={idx}
+
className="flex items-start gap-2 text-xs p-2 rounded hover:bg-muted/50 transition-colors"
+
>
+
{file.status === 'checking' && (
+
<Loader2 className="w-3 h-3 mt-0.5 animate-spin text-blue-500 shrink-0" />
+
)}
+
{file.status === 'uploading' && (
+
<Loader2 className="w-3 h-3 mt-0.5 animate-spin text-purple-500 shrink-0" />
+
)}
+
{file.status === 'uploaded' && (
+
<CheckCircle2 className="w-3 h-3 mt-0.5 text-green-500 shrink-0" />
+
)}
+
{file.status === 'reused' && (
+
<RefreshCw className="w-3 h-3 mt-0.5 text-cyan-500 shrink-0" />
+
)}
+
{file.status === 'failed' && (
+
<XCircle className="w-3 h-3 mt-0.5 text-red-500 shrink-0" />
+
)}
+
<div className="flex-1 min-w-0">
+
<div className="font-mono truncate">{file.name}</div>
+
{file.error && (
+
<div className="text-red-500 mt-0.5">
+
{file.error}
+
</div>
+
)}
+
{file.status === 'checking' && (
+
<div className="text-muted-foreground">Checking for changes...</div>
+
)}
+
{file.status === 'uploading' && (
+
<div className="text-muted-foreground">Uploading to PDS...</div>
+
)}
+
{file.status === 'reused' && (
+
<div className="text-muted-foreground">Reused (unchanged)</div>
+
)}
+
</div>
+
</div>
+
))}
+
</div>
+
)}
+
</div>
+
)}
+
+
{failedFiles.length > 0 && (
+
<div className="p-4 bg-red-500/10 border border-red-500/20 rounded-lg">
+
<div className="flex items-start gap-2 text-red-600 dark:text-red-400 mb-2">
+
<AlertCircle className="w-4 h-4 mt-0.5 shrink-0" />
+
<div className="flex-1">
+
<span className="font-medium">
+
{failedFiles.length} file{failedFiles.length > 1 ? 's' : ''} failed to upload
+
</span>
+
{uploadedCount > 0 && (
+
<span className="text-sm ml-2">
+
({uploadedCount} uploaded successfully)
+
</span>
+
)}
+
</div>
+
</div>
+
<div className="ml-6 space-y-1 max-h-40 overflow-y-auto">
+
{failedFiles.slice(0, 10).map((file, idx) => (
+
<div key={idx} className="text-xs">
+
<div className="font-mono font-semibold">{file.name}</div>
+
<div className="text-muted-foreground ml-2">
+
Error: {file.error}
+
{file.size > 0 && ` (${(file.size / 1024).toFixed(1)} KB)`}
+
</div>
+
</div>
+
))}
+
{failedFiles.length > 10 && (
+
<div className="text-xs text-muted-foreground">
+
...and {failedFiles.length - 10} more
+
</div>
+
)}
+
</div>
+
</div>
+
)}
+
+
{skippedFiles.length > 0 && (
+
<div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
+
<div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
+
<AlertCircle className="w-4 h-4 mt-0.5 shrink-0" />
+
<div className="flex-1">
+
<span className="font-medium">
+
{skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
+
</span>
+
</div>
+
</div>
+
<div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
+
{skippedFiles.slice(0, 5).map((file, idx) => (
+
<div key={idx} className="text-xs">
+
<span className="font-mono">{file.name}</span>
+
<span className="text-muted-foreground"> - {file.reason}</span>
+
</div>
+
))}
+
{skippedFiles.length > 5 && (
+
<div className="text-xs text-muted-foreground">
+
...and {skippedFiles.length - 5} more
+
</div>
+
)}
+
</div>
+
</div>
+
)}
+
</div>
+
)}
+
+
<Button
+
onClick={handleUpload}
+
className="w-full"
+
disabled={
+
(siteMode === 'existing' ? !selectedSiteRkey : !newSiteName) ||
+
isUploading ||
+
(siteMode === 'existing' && (!selectedFiles || selectedFiles.length === 0))
+
}
+
>
+
{isUploading ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Uploading...
+
</>
+
) : (
+
<>
+
{siteMode === 'existing' ? (
+
'Update Site'
+
) : (
+
selectedFiles && selectedFiles.length > 0
+
? 'Upload & Deploy'
+
: 'Create Empty Site'
+
)}
+
</>
+
)}
+
</Button>
+
</CardContent>
+
</Card>
+
</div>
+
)
+
}
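The `setupSSE` handler above folds the server's SSE `progress` events into a single status string keyed on an upload phase. Extracted as a standalone sketch for clarity — the payload field names (`phase`, `filesProcessed`, `filesUploaded`, `filesReused`, `totalFiles`, `currentFile`) are taken from the client component; the server-side event shape is assumed, not confirmed here:

```typescript
// Hypothetical shape of one SSE `progress` payload, inferred from the client.
interface UploadProgress {
    phase?: string
    filesProcessed?: number
    filesUploaded?: number
    filesReused?: number
    totalFiles?: number
    currentFile?: string
}

// Mirrors the phase -> message mapping used in UploadTab's progress listener.
function formatProgressMessage(progress: UploadProgress): string {
    let message = 'Processing...'
    if (progress.phase === 'validating') {
        message = 'Validating files...'
    } else if (progress.phase === 'compressing') {
        const current = progress.filesProcessed || 0
        const total = progress.totalFiles || 0
        message = `Compressing files (${current}/${total})...`
        if (progress.currentFile) message += ` - ${progress.currentFile}`
    } else if (progress.phase === 'uploading') {
        // Uploaded and reused (unchanged) files both count toward completion.
        const done = (progress.filesUploaded || 0) + (progress.filesReused || 0)
        message = `Uploading to PDS (${done}/${progress.totalFiles || 0})...`
    } else if (progress.phase === 'creating_manifest') {
        message = 'Creating manifest...'
    } else if (progress.phase === 'finalizing') {
        message = 'Finalizing upload...'
    }
    return message
}
```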
apps/main-app/public/favicon-16x16.png

This is a binary file and will not be displayed.

apps/main-app/public/favicon-32x32.png

This is a binary file and will not be displayed.

apps/main-app/public/favicon.ico

This is a binary file and will not be displayed.

+35
apps/main-app/public/index.html
···
+
<!doctype html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>wisp.place</title>
+
<meta name="description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution. Built on Bluesky's decentralized network." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/" />
+
<meta property="og:title" content="wisp.place - Decentralized Static Site Hosting" />
+
<meta property="og:description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary_large_image" />
+
<meta name="twitter:url" content="https://wisp.place/" />
+
<meta name="twitter:title" content="wisp.place - Decentralized Static Site Hosting" />
+
<meta name="twitter:description" content="Host static websites directly in your AT Protocol account. Keep full ownership and control with fast CDN distribution." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
+
<link rel="icon" type="image/x-icon" href="./favicon.ico">
+
<link rel="icon" type="image/png" sizes="32x32" href="./favicon-32x32.png">
+
<link rel="icon" type="image/png" sizes="16x16" href="./favicon-16x16.png">
+
<link rel="apple-touch-icon" sizes="180x180" href="./apple-touch-icon.png">
+
<link rel="manifest" href="./site.webmanifest">
+
</head>
+
<body>
+
<div id="elysia"></div>
+
<script type="module" src="./index.tsx"></script>
+
</body>
+
</html>
+751
apps/main-app/public/index.tsx
···
+
import React, { useState, useRef, useEffect } from 'react'
+
import { createRoot } from 'react-dom/client'
+
import { ArrowRight } from 'lucide-react'
+
import Layout from '@public/layouts'
+
import { Button } from '@public/components/ui/button'
+
import { Card } from '@public/components/ui/card'
+
import { BlueskyPostList, BlueskyProfile, BlueskyPost, AtProtoProvider, useLatestRecord, type AtProtoStyles, type FeedPostRecord } from 'atproto-ui'
+
+
// Credit to https://tangled.org/@jakelazaroff.com/actor-typeahead
+
interface Actor {
+
handle: string
+
avatar?: string
+
displayName?: string
+
}
+
+
interface ActorTypeaheadProps {
+
children: React.ReactElement<React.InputHTMLAttributes<HTMLInputElement>>
+
host?: string
+
rows?: number
+
onSelect?: (handle: string) => void
+
autoSubmit?: boolean
+
}
+
+
const ActorTypeahead: React.FC<ActorTypeaheadProps> = ({
+
children,
+
host = 'https://public.api.bsky.app',
+
rows = 5,
+
onSelect,
+
autoSubmit = false
+
}) => {
+
const [actors, setActors] = useState<Actor[]>([])
+
const [index, setIndex] = useState(-1)
+
const [pressed, setPressed] = useState(false)
+
const [isOpen, setIsOpen] = useState(false)
+
const containerRef = useRef<HTMLDivElement>(null)
+
const inputRef = useRef<HTMLInputElement>(null)
+
const lastQueryRef = useRef<string>('')
+
const previousValueRef = useRef<string>('')
+
const preserveIndexRef = useRef(false)
+
+
const handleInput = async (e: React.FormEvent<HTMLInputElement>) => {
+
const query = e.currentTarget.value
+
+
// Check if the value actually changed (filter out arrow key events)
+
if (query === previousValueRef.current) {
+
return
+
}
+
previousValueRef.current = query
+
+
if (!query) {
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
lastQueryRef.current = ''
+
return
+
}
+
+
// Store the query for this request
+
const currentQuery = query
+
lastQueryRef.current = currentQuery
+
+
try {
+
const url = new URL('xrpc/app.bsky.actor.searchActorsTypeahead', host)
+
url.searchParams.set('q', query)
+
url.searchParams.set('limit', `${rows}`)
+
+
const res = await fetch(url)
+
const json = await res.json()
+
+
// Only update if this is still the latest query
+
if (lastQueryRef.current === currentQuery) {
+
setActors(json.actors || [])
+
// Only reset index if we're not preserving it
+
if (!preserveIndexRef.current) {
+
setIndex(-1)
+
}
+
preserveIndexRef.current = false
+
setIsOpen(true)
+
}
+
} catch (error) {
+
console.error('Failed to fetch actors:', error)
+
if (lastQueryRef.current === currentQuery) {
+
setActors([])
+
setIsOpen(false)
+
}
+
}
+
}
+
+
const handleKeyDown = (e: React.KeyboardEvent<HTMLInputElement>) => {
+
const navigationKeys = ['ArrowDown', 'ArrowUp', 'PageDown', 'PageUp', 'Enter', 'Escape']
+
+
// Mark that we should preserve the index for navigation keys
+
if (navigationKeys.includes(e.key)) {
+
preserveIndexRef.current = true
+
}
+
+
if (!isOpen || actors.length === 0) return
+
+
switch (e.key) {
+
case 'ArrowDown':
+
e.preventDefault()
+
setIndex((prev) => {
+
const newIndex = prev < 0 ? 0 : Math.min(prev + 1, actors.length - 1)
+
return newIndex
+
})
+
break
+
case 'PageDown':
+
e.preventDefault()
+
setIndex(actors.length - 1)
+
break
+
case 'ArrowUp':
+
e.preventDefault()
+
setIndex((prev) => {
+
const newIndex = prev < 0 ? 0 : Math.max(prev - 1, 0)
+
return newIndex
+
})
+
break
+
case 'PageUp':
+
e.preventDefault()
+
setIndex(0)
+
break
+
case 'Escape':
+
e.preventDefault()
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
break
+
case 'Enter':
+
if (index >= 0 && index < actors.length) {
+
e.preventDefault()
+
selectActor(actors[index].handle)
+
}
+
break
+
}
+
}
+
+
const selectActor = (handle: string) => {
+
if (inputRef.current) {
+
inputRef.current.value = handle
+
}
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
onSelect?.(handle)
+
+
// Auto-submit the form if enabled
+
if (autoSubmit && inputRef.current) {
+
const form = inputRef.current.closest('form')
+
if (form) {
+
// Use setTimeout to ensure the value is set before submission
+
setTimeout(() => {
+
form.requestSubmit()
+
}, 0)
+
}
+
}
+
}
+
+
const handleFocusOut = (e: React.FocusEvent) => {
+
if (pressed) return
+
setActors([])
+
setIndex(-1)
+
setIsOpen(false)
+
}
+
+
// Clone the input element and add our event handlers
+
const input = React.cloneElement(children, {
+
ref: (el: HTMLInputElement) => {
+
inputRef.current = el
+
// Preserve the original ref if it exists
+
const originalRef = (children as any).ref
+
if (typeof originalRef === 'function') {
+
originalRef(el)
+
} else if (originalRef) {
+
originalRef.current = el
+
}
+
},
+
onInput: (e: React.FormEvent<HTMLInputElement>) => {
+
handleInput(e)
+
children.props.onInput?.(e)
+
},
+
onKeyDown: (e: React.KeyboardEvent<HTMLInputElement>) => {
+
handleKeyDown(e)
+
children.props.onKeyDown?.(e)
+
},
+
onBlur: (e: React.FocusEvent<HTMLInputElement>) => {
+
handleFocusOut(e)
+
children.props.onBlur?.(e)
+
},
+
autoComplete: 'off'
+
} as any)
+
+
return (
+
<div ref={containerRef} style={{ position: 'relative', display: 'block' }}>
+
{input}
+
{isOpen && actors.length > 0 && (
+
<ul
+
style={{
+
display: 'flex',
+
flexDirection: 'column',
+
position: 'absolute',
+
left: 0,
+
marginTop: '4px',
+
width: '100%',
+
listStyle: 'none',
+
overflow: 'hidden',
+
backgroundColor: 'rgba(255, 255, 255, 0.8)',
+
backgroundClip: 'padding-box',
+
backdropFilter: 'blur(12px)',
+
WebkitBackdropFilter: 'blur(12px)',
+
border: '1px solid rgba(0, 0, 0, 0.1)',
+
borderRadius: '8px',
+
boxShadow: '0 6px 6px -4px rgba(0, 0, 0, 0.2)',
+
padding: '4px',
+
margin: 0,
+
zIndex: 1000
+
}}
+
onMouseDown={() => setPressed(true)}
+
onMouseUp={() => {
+
setPressed(false)
+
inputRef.current?.focus()
+
}}
+
>
+
{actors.map((actor, i) => (
+
<li key={actor.handle}>
+
<button
+
type="button"
+
onClick={() => selectActor(actor.handle)}
+
style={{
+
all: 'unset',
+
boxSizing: 'border-box',
+
display: 'flex',
+
alignItems: 'center',
+
gap: '8px',
+
padding: '6px 8px',
+
width: '100%',
+
height: 'calc(1.5rem + 12px)',
+
borderRadius: '4px',
+
cursor: 'pointer',
+
backgroundColor: i === index ? 'hsl(var(--accent) / 0.5)' : 'transparent',
+
transition: 'background-color 0.1s'
+
}}
+
onMouseEnter={() => setIndex(i)}
+
>
+
<div
+
style={{
+
width: '1.5rem',
+
height: '1.5rem',
+
borderRadius: '50%',
+
backgroundColor: 'hsl(var(--muted))',
+
overflow: 'hidden',
+
flexShrink: 0
+
}}
+
>
+
{actor.avatar && (
+
<img
+
src={actor.avatar}
+
alt=""
+
style={{
+
display: 'block',
+
width: '100%',
+
height: '100%',
+
objectFit: 'cover'
+
}}
+
/>
+
)}
+
</div>
+
<span
+
style={{
+
whiteSpace: 'nowrap',
+
overflow: 'hidden',
+
textOverflow: 'ellipsis',
+
color: '#000000'
+
}}
+
>
+
{actor.handle}
+
</span>
+
</button>
+
</li>
+
))}
+
</ul>
+
)}
+
</div>
+
)
+
}
+
+
const LatestPostWithPrefetch: React.FC<{ did: string }> = ({ did }) => {
+
const { record, rkey, loading } = useLatestRecord<FeedPostRecord>(
+
did,
+
'app.bsky.feed.post'
+
)
+
+
if (loading) return <span>Loading…</span>
+
if (!record || !rkey) return <span>No posts yet.</span>
+
+
return <BlueskyPost did={did} rkey={rkey} record={record} showParent={true} />
+
}
+
+
function App() {
+
const [showForm, setShowForm] = useState(false)
+
const [checkingAuth, setCheckingAuth] = useState(true)
+
const [screenshots, setScreenshots] = useState<string[]>([])
+
const inputRef = useRef<HTMLInputElement>(null)
+
+
useEffect(() => {
+
// Check authentication status on mount
+
const checkAuth = async () => {
+
try {
+
const response = await fetch('/api/auth/status', {
+
credentials: 'include'
+
})
+
const data = await response.json()
+
if (data.authenticated) {
+
// User is already authenticated, redirect to editor
+
window.location.href = '/editor'
+
return
+
}
+
// If not authenticated, clear any stale cookies
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
+
} catch (error) {
+
console.error('Auth check failed:', error)
+
// Clear cookies on error as well
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
+
} finally {
+
setCheckingAuth(false)
+
}
+
}
+
+
checkAuth()
+
}, [])
+
+
useEffect(() => {
+
// Fetch screenshots list
+
const fetchScreenshots = async () => {
+
try {
+
const response = await fetch('/api/screenshots')
+
const data = await response.json()
+
setScreenshots(data.screenshots || [])
+
} catch (error) {
+
console.error('Failed to fetch screenshots:', error)
+
}
+
}
+
+
fetchScreenshots()
+
}, [])
+
+
useEffect(() => {
+
if (showForm) {
+
setTimeout(() => inputRef.current?.focus(), 500)
+
}
+
}, [showForm])
+
+
if (checkingAuth) {
+
return (
+
<div className="min-h-screen bg-background flex items-center justify-center">
+
<div className="w-8 h-8 border-2 border-primary border-t-transparent rounded-full animate-spin"></div>
+
</div>
+
)
+
}
+
+
return (
+
<>
+
<div className="min-h-screen">
+
{/* Header */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<img src="/transparent-full-size-ico.png" alt="wisp.place" className="w-8 h-8" />
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
<div className="flex items-center gap-3">
+
<Button
+
variant="ghost"
+
size="sm"
+
onClick={() => setShowForm(true)}
+
>
+
Sign In
+
</Button>
+
<Button
+
size="sm"
+
className="bg-accent text-accent-foreground hover:bg-accent/90"
+
asChild
+
>
+
<a href="https://docs.wisp.place" target="_blank" rel="noopener noreferrer">
+
Read the Docs
+
</a>
+
</Button>
+
</div>
+
</div>
+
</header>
+
+
{/* Hero Section */}
+
<section className="container mx-auto px-4 py-20 md:py-32">
+
<div className="max-w-4xl mx-auto text-center">
+
<div className="inline-flex items-center gap-2 px-4 py-2 rounded-full bg-accent/10 border border-accent/20 mb-8">
+
<span className="w-2 h-2 bg-accent rounded-full animate-pulse"></span>
+
<span className="text-sm text-foreground">
+
Built on AT Protocol
+
</span>
+
</div>
+
+
<h1 className="text-5xl md:text-7xl font-bold text-balance mb-6 leading-tight">
+
Your Website. Your Control. Lightning Fast.
+
</h1>
+
+
<p className="text-xl md:text-2xl text-muted-foreground text-balance mb-10 leading-relaxed max-w-3xl mx-auto">
+
Host static sites in your AT Protocol account. You
+
keep ownership and control. We just serve them fast
+
through our CDN.
+
</p>
+
+
<div className="max-w-md mx-auto relative">
+
<div
+
className={`transition-all duration-500 ease-in-out ${
+
showForm
+
? 'opacity-0 -translate-y-5 pointer-events-none'
+
: 'opacity-100 translate-y-0'
+
}`}
+
>
+
<Button
+
size="lg"
+
className="bg-primary text-primary-foreground hover:bg-primary/90 text-lg px-8 py-6 w-full"
+
onClick={() => setShowForm(true)}
+
>
+
Log in with AT Proto
+
<ArrowRight className="ml-2 w-5 h-5" />
+
</Button>
+
</div>
+
+
<div
+
className={`transition-all duration-500 ease-in-out absolute inset-0 ${
+
showForm
+
? 'opacity-100 translate-y-0'
+
: 'opacity-0 translate-y-5 pointer-events-none'
+
}`}
+
>
+
<form
+
onSubmit={async (e) => {
+
e.preventDefault()
+
try {
+
const handle =
+
inputRef.current?.value
+
const res = await fetch(
+
'/api/auth/signin',
+
{
+
method: 'POST',
+
headers: {
+
'Content-Type':
+
'application/json'
+
},
+
body: JSON.stringify({
+
handle
+
})
+
}
+
)
+
if (!res.ok)
+
throw new Error(
+
'Request failed'
+
)
+
const data = await res.json()
+
if (data.url) {
+
window.location.href = data.url
+
} else {
+
alert('Unexpected response')
+
}
+
} catch (error) {
+
console.error(
+
'Login failed:',
+
error
+
)
+
// Clear any invalid cookies
+
document.cookie = 'did=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT; SameSite=Lax'
+
alert('Authentication failed')
+
}
+
}}
+
className="space-y-3"
+
>
+
<ActorTypeahead
+
autoSubmit={true}
+
onSelect={(handle) => {
+
if (inputRef.current) {
+
inputRef.current.value = handle
+
}
+
}}
+
>
+
<input
+
ref={inputRef}
+
type="text"
+
name="handle"
+
placeholder="Enter your handle (e.g., alice.bsky.social)"
+
className="w-full py-4 px-4 text-lg bg-input border border-border rounded-lg focus:outline-none focus:ring-2 focus:ring-accent"
+
/>
+
</ActorTypeahead>
+
<button
+
type="submit"
+
className="w-full bg-accent hover:bg-accent/90 text-accent-foreground font-semibold py-4 px-6 text-lg rounded-lg inline-flex items-center justify-center transition-colors"
+
>
+
Continue
+
<ArrowRight className="ml-2 w-5 h-5" />
+
</button>
+
</form>
+
</div>
+
</div>
+
</div>
+
</section>
+
+
{/* How It Works */}
+
<section className="container mx-auto px-4 py-16 bg-muted/30">
+
<div className="max-w-3xl mx-auto text-center">
+
<h2 className="text-3xl md:text-4xl font-bold mb-8">
+
How it works
+
</h2>
+
<div className="space-y-6 text-left">
+
<div className="flex gap-4 items-start">
+
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
+
01
+
</div>
+
<div>
+
<h3 className="text-xl font-semibold mb-2">
+
Upload your static site
+
</h3>
+
<p className="text-muted-foreground">
+
Your HTML, CSS, and JavaScript files are
+
stored in your AT Protocol account as
+
gzipped blobs and a manifest record.
+
</p>
+
</div>
+
</div>
+
<div className="flex gap-4 items-start">
+
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
+
02
+
</div>
+
<div>
+
<h3 className="text-xl font-semibold mb-2">
+
We serve it globally
+
</h3>
+
<p className="text-muted-foreground">
+
Wisp.place reads your site from your
+
account and delivers it through our CDN
+
for fast loading anywhere.
+
</p>
+
</div>
+
</div>
+
<div className="flex gap-4 items-start">
+
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
+
03
+
</div>
+
<div>
+
<h3 className="text-xl font-semibold mb-2">
+
You stay in control
+
</h3>
+
<p className="text-muted-foreground">
+
Update or remove your site anytime
+
through your AT Protocol account. No
+
lock-in, no middleman ownership.
+
</p>
+
</div>
+
</div>
+
</div>
+
</div>
+
</section>
+
+
{/* Site Gallery */}
+
<section id="gallery" className="container mx-auto px-4 py-20">
+
<div className="text-center mb-16">
+
<h2 className="text-4xl md:text-5xl font-bold mb-4 text-balance">
+
Join 80+ sites just like yours:
+
</h2>
+
</div>
+
+
<div className="grid grid-cols-1 md:grid-cols-2 gap-6 max-w-5xl mx-auto">
+
{screenshots.map((filename, i) => {
+
// Remove .png extension
+
const baseName = filename.replace('.png', '')
+
+
// Construct site URL from filename
+
let siteUrl: string
+
if (baseName.startsWith('sites_wisp_place_did_plc_')) {
+
// Handle format: sites_wisp_place_did_plc_{identifier}_{sitename}
+
const match = baseName.match(/^sites_wisp_place_did_plc_([a-z0-9]+)_(.+)$/)
+
if (match) {
+
const [, identifier, sitename] = match
+
siteUrl = `https://sites.wisp.place/did:plc:${identifier}/${sitename}`
+
} else {
+
siteUrl = '#'
+
}
+
} else {
+
// Handle format: domain_tld or subdomain_domain_tld
+
// Replace underscores with dots
+
siteUrl = `https://${baseName.replace(/_/g, '.')}`
+
}
+
+
return (
+
<a
+
key={i}
+
href={siteUrl}
+
target="_blank"
+
rel="noopener noreferrer"
+
className="block"
+
>
+
<Card className="overflow-hidden hover:shadow-xl transition-all hover:scale-105 border-2 bg-card p-0 cursor-pointer">
+
<img
+
src={`/screenshots/${filename}`}
+
alt={`${baseName} screenshot`}
+
className="w-full h-auto object-cover aspect-video"
+
loading="lazy"
+
/>
+
</Card>
+
</a>
+
)
+
})}
+
</div>
+
</section>
+
+
{/* CTA Section */}
+
<section className="container mx-auto px-4 py-20">
+
<div className="max-w-6xl mx-auto">
+
<div className="text-center mb-12">
+
<h2 className="text-3xl md:text-4xl font-bold">
+
Follow on Bluesky for updates
+
</h2>
+
</div>
+
<div className="grid md:grid-cols-2 gap-8 items-center">
+
<Card
+
className="shadow-lg border-2 border-border overflow-hidden !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
'--atproto-color-link': 'hsl(var(--accent))',
+
'--atproto-color-link-hover': 'hsl(var(--accent))',
+
'--atproto-color-border': 'transparent',
+
} as AtProtoStyles}
+
>
+
<BlueskyPostList did="wisp.place" />
+
</Card>
+
<div className="space-y-6 w-full max-w-md mx-auto">
+
<Card
+
className="shadow-lg border-2 overflow-hidden relative !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
} as AtProtoStyles}
+
>
+
<BlueskyProfile did="wisp.place" />
+
</Card>
+
<Card
+
className="shadow-lg border-2 overflow-hidden relative !py-3"
+
style={{
+
'--atproto-color-bg': 'var(--card)',
+
'--atproto-color-bg-elevated': 'hsl(var(--muted) / 0.3)',
+
'--atproto-color-text': 'hsl(var(--foreground))',
+
'--atproto-color-text-secondary': 'hsl(var(--muted-foreground))',
+
} as AtProtoStyles}
+
>
+
<LatestPostWithPrefetch did="wisp.place" />
+
</Card>
+
</div>
+
</div>
+
</div>
+
</section>
+
+
{/* Ready to Deploy CTA */}
+
<section className="container mx-auto px-4 py-20">
+
<div className="max-w-3xl mx-auto text-center bg-accent/5 border border-accent/20 rounded-2xl p-12">
+
<h2 className="text-3xl md:text-4xl font-bold mb-4">
+
Ready to deploy?
+
</h2>
+
<p className="text-xl text-muted-foreground mb-8">
+
Host your static site on your own AT Protocol
+
account today
+
</p>
+
<Button
+
size="lg"
+
className="bg-accent text-accent-foreground hover:bg-accent/90 text-lg px-8 py-6"
+
onClick={() => setShowForm(true)}
+
>
+
Get Started
+
<ArrowRight className="ml-2 w-5 h-5" />
+
</Button>
+
</div>
+
</section>
+
+
{/* Footer */}
+
<footer className="border-t border-border/40 bg-muted/20">
+
<div className="container mx-auto px-4 py-8">
+
<div className="text-center text-sm text-muted-foreground">
+
<p>
+
Built by{' '}
+
<a
+
href="https://bsky.app/profile/nekomimi.pet"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
@nekomimi.pet
+
</a>
+
{' • '}
+
Contact:{' '}
+
<a
+
href="mailto:contact@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
contact@wisp.place
+
</a>
+
{' • '}
+
Legal/DMCA:{' '}
+
<a
+
href="mailto:legal@wisp.place"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
legal@wisp.place
+
</a>
+
</p>
+
<p className="mt-2">
+
<a
+
href="/acceptable-use"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Acceptable Use Policy
+
</a>
+
{' • '}
+
<a
+
href="https://docs.wisp.place"
+
target="_blank"
+
rel="noopener noreferrer"
+
className="text-accent hover:text-accent/80 transition-colors font-medium"
+
>
+
Documentation
+
</a>
+
</p>
+
</div>
+
</div>
+
</footer>
+
</div>
+
</>
+
)
+
}
+
+
const root = createRoot(document.getElementById('elysia')!)
+
root.render(
+
<AtProtoProvider>
+
<Layout className="gap-6">
+
<App />
+
</Layout>
+
</AtProtoProvider>
+
)
+51
apps/main-app/public/layouts/index.tsx
···
+
import type { PropsWithChildren } from 'react'
+
import { useEffect } from 'react'
+
+
import { QueryClientProvider, QueryClient } from '@tanstack/react-query'
+
import clsx from 'clsx'
+
+
import '@public/styles/global.css'
+
+
const client = new QueryClient()
+
+
interface LayoutProps extends PropsWithChildren {
+
className?: string
+
}
+
+
export default function Layout({ children, className }: LayoutProps) {
+
useEffect(() => {
+
// Function to update dark mode based on system preference
+
const updateDarkMode = (e: MediaQueryList | MediaQueryListEvent) => {
+
if (e.matches) {
+
document.documentElement.classList.add('dark')
+
} else {
+
document.documentElement.classList.remove('dark')
+
}
+
}
+
+
// Create media query
+
const darkModeQuery = window.matchMedia('(prefers-color-scheme: dark)')
+
+
// Set initial value
+
updateDarkMode(darkModeQuery)
+
+
// Listen for changes
+
darkModeQuery.addEventListener('change', updateDarkMode)
+
+
// Cleanup
+
return () => darkModeQuery.removeEventListener('change', updateDarkMode)
+
}, [])
+
+
return (
+
<QueryClientProvider client={client}>
+
<div
+
className={clsx(
+
'flex flex-col items-center w-full min-h-screen',
+
className
+
)}
+
>
+
{children}
+
</div>
+
</QueryClientProvider>
+
)
+
}
+8
apps/main-app/public/lib/api.ts
···
+
import { treaty } from '@elysiajs/eden'
+
+
import type { app } from '@server'
+
+
// Use the current host instead of hardcoded localhost
+
const apiHost = typeof window !== 'undefined' ? window.location.origin : 'http://localhost:8000'
+
+
export const api = treaty<typeof app>(apiHost)
+6
apps/main-app/public/lib/utils.ts
···
+
import { clsx, type ClassValue } from "clsx"
+
import { twMerge } from "tailwind-merge"
+
+
export function cn(...inputs: ClassValue[]) {
+
return twMerge(clsx(inputs))
+
}
+29
apps/main-app/public/onboarding/index.html
···
+
<!doctype html>
+
<html lang="en">
+
<head>
+
<meta charset="UTF-8" />
+
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
+
<title>wisp.place</title>
+
<meta name="description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
+
<!-- Open Graph / Facebook -->
+
<meta property="og:type" content="website" />
+
<meta property="og:url" content="https://wisp.place/onboarding" />
+
<meta property="og:title" content="Get Started - wisp.place" />
+
<meta property="og:description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
<meta property="og:site_name" content="wisp.place" />
+
+
<!-- Twitter -->
+
<meta name="twitter:card" content="summary" />
+
<meta name="twitter:url" content="https://wisp.place/onboarding" />
+
<meta name="twitter:title" content="Get Started - wisp.place" />
+
<meta name="twitter:description" content="Get started with wisp.place and host your first decentralized static site on AT Protocol." />
+
+
<!-- Theme -->
+
<meta name="theme-color" content="#7c3aed" />
+
</head>
+
<body>
+
<div id="elysia"></div>
+
<script type="module" src="./onboarding.tsx"></script>
+
</body>
+
</html>
+467
apps/main-app/public/onboarding/onboarding.tsx
···
+
import { useState, useEffect } from 'react'
+
import { createRoot } from 'react-dom/client'
+
import { Button } from '@public/components/ui/button'
+
import {
+
Card,
+
CardContent,
+
CardDescription,
+
CardHeader,
+
CardTitle
+
} from '@public/components/ui/card'
+
import { Input } from '@public/components/ui/input'
+
import { Label } from '@public/components/ui/label'
+
import { Globe, Upload, CheckCircle2, Loader2, AlertCircle } from 'lucide-react'
+
import Layout from '@public/layouts'
+
+
type OnboardingStep = 'domain' | 'upload' | 'complete'
+
+
function Onboarding() {
+
const [step, setStep] = useState<OnboardingStep>('domain')
+
const [handle, setHandle] = useState('')
+
const [isCheckingAvailability, setIsCheckingAvailability] = useState(false)
+
const [isAvailable, setIsAvailable] = useState<boolean | null>(null)
+
const [domain, setDomain] = useState('')
+
const [isClaimingDomain, setIsClaimingDomain] = useState(false)
+
const [claimedDomain, setClaimedDomain] = useState('')
+
+
const [siteName, setSiteName] = useState('')
+
const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
+
const [isUploading, setIsUploading] = useState(false)
+
const [uploadProgress, setUploadProgress] = useState('')
+
const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
+
const [uploadedCount, setUploadedCount] = useState(0)
+
+
// Check domain availability as user types
+
useEffect(() => {
+
if (!handle || handle.length < 3) {
+
setIsAvailable(null)
+
setDomain('')
+
return
+
}
+
+
const timeoutId = setTimeout(async () => {
+
setIsCheckingAvailability(true)
+
try {
+
const response = await fetch(
+
`/api/domain/check?handle=${encodeURIComponent(handle)}`
+
)
+
const data = await response.json()
+
setIsAvailable(data.available)
+
setDomain(data.domain || '')
+
} catch (err) {
+
console.error('Error checking availability:', err)
+
setIsAvailable(false)
+
} finally {
+
setIsCheckingAvailability(false)
+
}
+
}, 500)
+
+
return () => clearTimeout(timeoutId)
+
}, [handle])
+
+
const handleClaimDomain = async () => {
+
if (!handle || !isAvailable) return
+
+
setIsClaimingDomain(true)
+
try {
+
const response = await fetch('/api/domain/claim', {
+
method: 'POST',
+
headers: { 'Content-Type': 'application/json' },
+
body: JSON.stringify({ handle })
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
setClaimedDomain(data.domain)
+
setStep('upload')
+
} else {
+
throw new Error(data.error || 'Failed to claim domain')
+
}
+
} catch (err) {
+
console.error('Error claiming domain:', err)
+
const errorMessage = err instanceof Error ? err.message : 'Unknown error'
+
+
// Handle "Already claimed" error - redirect to editor
+
if (errorMessage.includes('Already claimed')) {
+
alert('You have already claimed a wisp.place subdomain. Redirecting to editor...')
+
window.location.href = '/editor'
+
} else {
+
alert(`Failed to claim domain: ${errorMessage}`)
+
}
+
} finally {
+
setIsClaimingDomain(false)
+
}
+
}
+
+
const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
+
if (e.target.files && e.target.files.length > 0) {
+
setSelectedFiles(e.target.files)
+
}
+
}
+
+
const handleUpload = async () => {
+
if (!siteName) {
+
alert('Please enter a site name')
+
return
+
}
+
+
setIsUploading(true)
+
setUploadProgress('Preparing files...')
+
+
try {
+
const formData = new FormData()
+
formData.append('siteName', siteName)
+
+
if (selectedFiles) {
+
for (let i = 0; i < selectedFiles.length; i++) {
+
formData.append('files', selectedFiles[i])
+
}
+
}
+
+
setUploadProgress('Uploading to AT Protocol...')
+
const response = await fetch('/wisp/upload-files', {
+
method: 'POST',
+
body: formData
+
})
+
+
const data = await response.json()
+
if (data.success) {
+
setUploadProgress('Upload complete!')
+
setSkippedFiles(data.skippedFiles || [])
+
setUploadedCount(data.uploadedCount || data.fileCount || 0)
+
+
// If there are skipped files, show them briefly before redirecting
+
if (data.skippedFiles && data.skippedFiles.length > 0) {
+
setTimeout(() => {
+
window.location.href = `https://${claimedDomain}`
+
}, 3000) // Give more time to see skipped files
+
} else {
+
setTimeout(() => {
+
window.location.href = `https://${claimedDomain}`
+
}, 1500)
+
}
+
} else {
+
throw new Error(data.error || 'Upload failed')
+
}
+
} catch (err) {
+
console.error('Upload error:', err)
+
alert(
+
`Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
+
)
+
setIsUploading(false)
+
setUploadProgress('')
+
}
+
}
+
+
const handleSkipUpload = () => {
+
// Redirect to editor without uploading
+
window.location.href = '/editor'
+
}
+
+
return (
+
<div className="w-full min-h-screen bg-background">
+
{/* Header */}
+
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
+
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
+
<div className="flex items-center gap-2">
+
<div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
+
<Globe className="w-5 h-5 text-primary-foreground" />
+
</div>
+
<span className="text-xl font-semibold text-foreground">
+
wisp.place
+
</span>
+
</div>
+
</div>
+
</header>
+
+
<div className="container mx-auto px-4 py-12 max-w-2xl">
+
{/* Progress indicator */}
+
<div className="mb-8">
+
<div className="flex items-center justify-center gap-2 mb-4">
+
<div
+
className={`w-8 h-8 rounded-full flex items-center justify-center ${
+
step === 'domain'
+
? 'bg-primary text-primary-foreground'
+
: 'bg-green-500 text-white'
+
}`}
+
>
+
{step === 'domain' ? (
+
'1'
+
) : (
+
<CheckCircle2 className="w-5 h-5" />
+
)}
+
</div>
+
<div className="w-16 h-0.5 bg-border"></div>
+
<div
+
className={`w-8 h-8 rounded-full flex items-center justify-center ${
+
step === 'upload'
+
? 'bg-primary text-primary-foreground'
+
: step === 'domain'
+
? 'bg-muted text-muted-foreground'
+
: 'bg-green-500 text-white'
+
}`}
+
>
+
{step === 'complete' ? (
+
<CheckCircle2 className="w-5 h-5" />
+
) : (
+
'2'
+
)}
+
</div>
+
</div>
+
<div className="text-center">
+
<h1 className="text-2xl font-bold mb-2">
+
{step === 'domain' && 'Claim Your Free Domain'}
+
{step === 'upload' && 'Deploy Your First Site'}
+
{step === 'complete' && 'All Set!'}
+
</h1>
+
<p className="text-muted-foreground">
+
{step === 'domain' &&
+
'Choose a subdomain on wisp.place'}
+
{step === 'upload' &&
+
'Upload your site or start with an empty one'}
+
{step === 'complete' && 'Redirecting to your site...'}
+
</p>
+
</div>
+
</div>
+
+
{/* Domain registration step */}
+
{step === 'domain' && (
+
<Card>
+
<CardHeader>
+
<CardTitle>Choose Your Domain</CardTitle>
+
<CardDescription>
+
Pick a unique handle for your free *.wisp.place
+
subdomain
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-4">
+
<div className="space-y-2">
+
<Label htmlFor="handle">Your Handle</Label>
+
<div className="flex gap-2">
+
<div className="relative flex-1">
+
<Input
+
id="handle"
+
placeholder="my-awesome-site"
+
value={handle}
+
onChange={(e) =>
+
setHandle(
+
e.target.value
+
.toLowerCase()
+
.replace(/[^a-z0-9-]/g, '')
+
)
+
}
+
className="pr-10"
+
/>
+
{isCheckingAvailability && (
+
<Loader2 className="absolute right-3 top-1/2 -translate-y-1/2 w-4 h-4 animate-spin text-muted-foreground" />
+
)}
+
{!isCheckingAvailability &&
+
isAvailable !== null && (
+
<div
+
className={`absolute right-3 top-1/2 -translate-y-1/2 ${
+
isAvailable
+
? 'text-green-500'
+
: 'text-red-500'
+
}`}
+
>
+
{isAvailable ? '✓' : '✗'}
+
</div>
+
)}
+
</div>
+
</div>
+
{domain && (
+
<p className="text-sm text-muted-foreground">
+
Your domain will be:{' '}
+
<span className="font-mono">{domain}</span>
+
</p>
+
)}
+
{isAvailable === false && handle.length >= 3 && (
+
<p className="text-sm text-red-500">
+
This handle is not available or invalid
+
</p>
+
)}
+
</div>
+
+
<Button
+
onClick={handleClaimDomain}
+
disabled={
+
!isAvailable ||
+
isClaimingDomain ||
+
isCheckingAvailability
+
}
+
className="w-full"
+
>
+
{isClaimingDomain ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Claiming Domain...
+
</>
+
) : (
+
<>Claim Domain</>
+
)}
+
</Button>
+
</CardContent>
+
</Card>
+
)}
+
+
{/* Upload step */}
+
{step === 'upload' && (
+
<Card>
+
<CardHeader>
+
<CardTitle>Deploy Your Site</CardTitle>
+
<CardDescription>
+
Upload your static site files or start with an empty
+
site (you can upload later)
+
</CardDescription>
+
</CardHeader>
+
<CardContent className="space-y-6">
+
<div className="p-4 bg-green-500/10 border border-green-500/20 rounded-lg">
+
<div className="flex items-center gap-2 text-green-600 dark:text-green-400">
+
<CheckCircle2 className="w-4 h-4" />
+
<span className="font-medium">
+
Domain claimed: {claimedDomain}
+
</span>
+
</div>
+
</div>
+
+
<div className="space-y-2">
+
<Label htmlFor="site-name">Site Name</Label>
+
<Input
+
id="site-name"
+
placeholder="my-site"
+
value={siteName}
+
onChange={(e) => setSiteName(e.target.value)}
+
/>
+
<p className="text-xs text-muted-foreground">
+
A unique identifier for this site in your account
+
</p>
+
</div>
+
+
<div className="space-y-2">
+
<Label>Upload Files (Optional)</Label>
+
<div className="border-2 border-dashed border-border rounded-lg p-8 text-center hover:border-accent transition-colors">
+
<Upload className="w-12 h-12 text-muted-foreground mx-auto mb-4" />
+
<input
+
type="file"
+
id="file-upload"
+
multiple
+
onChange={handleFileSelect}
+
className="hidden"
+
{...(({ webkitdirectory: '', directory: '' } as any))}
+
/>
+
<label
+
htmlFor="file-upload"
+
className="cursor-pointer"
+
>
+
<Button
+
variant="outline"
+
type="button"
+
onClick={() =>
+
document
+
.getElementById('file-upload')
+
?.click()
+
}
+
>
+
Choose Folder
+
</Button>
+
</label>
+
{selectedFiles && selectedFiles.length > 0 && (
+
<p className="text-sm text-muted-foreground mt-3">
+
{selectedFiles.length} files selected
+
</p>
+
)}
+
</div>
+
<p className="text-xs text-muted-foreground">
+
Supported: HTML, CSS, JS, images, fonts, and more
+
</p>
+
<p className="text-xs text-muted-foreground">
+
Limits: 100MB per file, 300MB total
+
</p>
+
</div>
+
+
{uploadProgress && (
+
<div className="space-y-3">
+
<div className="p-4 bg-muted rounded-lg">
+
<div className="flex items-center gap-2">
+
<Loader2 className="w-4 h-4 animate-spin" />
+
<span className="text-sm">
+
{uploadProgress}
+
</span>
+
</div>
+
</div>
+
+
{skippedFiles.length > 0 && (
+
<div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
+
<div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
+
<AlertCircle className="w-4 h-4 mt-0.5 flex-shrink-0" />
+
<div className="flex-1">
+
<span className="font-medium">
+
{skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
+
</span>
+
{uploadedCount > 0 && (
+
<span className="text-sm ml-2">
+
({uploadedCount} uploaded successfully)
+
</span>
+
)}
+
</div>
+
</div>
+
<div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
+
{skippedFiles.slice(0, 5).map((file, idx) => (
+
<div key={idx} className="text-xs">
+
<span className="font-mono">{file.name}</span>
+
<span className="text-muted-foreground"> - {file.reason}</span>
+
</div>
+
))}
+
{skippedFiles.length > 5 && (
+
<div className="text-xs text-muted-foreground">
+
...and {skippedFiles.length - 5} more
+
</div>
+
)}
+
</div>
+
</div>
+
)}
+
</div>
+
)}
+
+
<div className="flex gap-3">
+
<Button
+
onClick={handleSkipUpload}
+
variant="outline"
+
className="flex-1"
+
disabled={isUploading}
+
>
+
Skip for Now
+
</Button>
+
<Button
+
onClick={handleUpload}
+
className="flex-1"
+
disabled={!siteName || isUploading}
+
>
+
{isUploading ? (
+
<>
+
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
+
Uploading...
+
</>
+
) : (
+
<>
+
{selectedFiles && selectedFiles.length > 0
+
? 'Upload & Deploy'
+
: 'Create Empty Site'}
+
</>
+
)}
+
</Button>
+
</div>
+
</CardContent>
+
</Card>
+
)}
+
</div>
+
</div>
+
)
+
}
+
+
const root = createRoot(document.getElementById('elysia')!)
+
root.render(
+
<Layout>
+
<Onboarding />
+
</Layout>
+
)
+21
apps/main-app/public/robots.txt
···
+
# robots.txt for wisp.place
+
+
User-agent: *
+
+
# Allow indexing of landing page
+
Allow: /$
+
+
# Disallow application pages
+
Disallow: /editor
+
Disallow: /admin
+
Disallow: /onboarding
+
+
# Disallow API routes
+
Disallow: /api/
+
Disallow: /wisp/
+
+
# Allow static assets
+
Allow: /favicon.ico
+
Allow: /favicon-*.png
+
Allow: /apple-touch-icon.png
+
Allow: /site.webmanifest
apps/main-app/public/screenshots/atproto-ui_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/avalanche_moe.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/brotosolar_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/erisa_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/hayden_moe.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/kot_pink.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/moover_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/nekomimi_pet.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/pdsls_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/plc-bench_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/rainygoo_se.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/rd_jbcrn_dev.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/sites_wisp_place_did_plc_3whdb534faiczugsz5fnohh6_rafa.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/sites_wisp_place_did_plc_524tuhdhh3m7li5gycdn6boe_plcbundle-watch.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/system_grdnsys_no.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/tealfm_indexx_dev.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/tigwyk_wisp_place.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/wfr_jbc_lol.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/wisp_jbc_lol.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/wisp_soverth_f5_si.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/www_miriscient_org.png

This is a binary file and will not be displayed.

apps/main-app/public/screenshots/www_wlo_moe.png

This is a binary file and will not be displayed.

+1
apps/main-app/public/site.webmanifest
···
+
{"name":"","short_name":"","icons":[{"src":"/android-chrome-192x192.png","sizes":"192x192","type":"image/png"},{"src":"/android-chrome-512x512.png","sizes":"512x512","type":"image/png"}],"theme_color":"#ffffff","background_color":"#ffffff","display":"standalone"}
+198
apps/main-app/public/styles/global.css
···
+
@import "tailwindcss";
+
@import "tw-animate-css";
+
+
@custom-variant dark (@media (prefers-color-scheme: dark));
+
+
:root {
+
color-scheme: light;
+
+
/* Warm beige background inspired by Sunset design #E9DDD8 */
+
--background: oklch(0.90 0.012 35);
+
/* Very dark brown text for strong contrast #2A2420 */
+
--foreground: oklch(0.18 0.01 30);
+
+
/* Slightly lighter card background */
+
--card: oklch(0.93 0.01 35);
+
--card-foreground: oklch(0.18 0.01 30);
+
+
--popover: oklch(0.93 0.01 35);
+
--popover-foreground: oklch(0.18 0.01 30);
+
+
/* Dark brown primary inspired by #645343 */
+
--primary: oklch(0.35 0.02 35);
+
--primary-foreground: oklch(0.95 0.01 35);
+
+
/* Bright pink accent for links #FFAAD2 */
+
--accent: oklch(0.78 0.15 345);
+
--accent-foreground: oklch(0.18 0.01 30);
+
+
/* Medium taupe secondary inspired by #867D76 */
+
--secondary: oklch(0.52 0.015 30);
+
--secondary-foreground: oklch(0.95 0.01 35);
+
+
/* Light warm muted background */
+
--muted: oklch(0.88 0.01 35);
+
--muted-foreground: oklch(0.42 0.015 30);
+
+
--border: oklch(0.75 0.015 30);
+
--input: oklch(0.92 0.01 35);
+
--ring: oklch(0.72 0.08 15);
+
+
--destructive: oklch(0.577 0.245 27.325);
+
--destructive-foreground: oklch(0.985 0 0);
+
+
--chart-1: oklch(0.78 0.15 345);
+
--chart-2: oklch(0.32 0.04 285);
+
--chart-3: oklch(0.56 0.08 220);
+
--chart-4: oklch(0.85 0.02 130);
+
--chart-5: oklch(0.93 0.03 85);
+
+
--radius: 0.75rem;
+
--sidebar: oklch(0.985 0 0);
+
--sidebar-foreground: oklch(0.145 0 0);
+
--sidebar-primary: oklch(0.205 0 0);
+
--sidebar-primary-foreground: oklch(0.985 0 0);
+
--sidebar-accent: oklch(0.97 0 0);
+
--sidebar-accent-foreground: oklch(0.205 0 0);
+
--sidebar-border: oklch(0.922 0 0);
+
--sidebar-ring: oklch(0.708 0 0);
+
}
+
+
.dark {
+
color-scheme: dark;
+
+
/* Slate violet background - #2C2C2C with violet tint */
+
--background: oklch(0.23 0.015 285);
+
/* Light gray text - #E4E4E4 */
+
--foreground: oklch(0.90 0.005 285);
+
+
/* Slightly lighter slate for cards */
+
--card: oklch(0.28 0.015 285);
+
--card-foreground: oklch(0.90 0.005 285);
+
+
--popover: oklch(0.28 0.015 285);
+
--popover-foreground: oklch(0.90 0.005 285);
+
+
/* Lavender buttons - #B39CD0 */
+
--primary: oklch(0.70 0.10 295);
+
--primary-foreground: oklch(0.23 0.015 285);
+
+
/* Soft pink accent - #FFC1CC */
+
--accent: oklch(0.85 0.08 5);
+
--accent-foreground: oklch(0.23 0.015 285);
+
+
/* Light cyan secondary - #A8DADC */
+
--secondary: oklch(0.82 0.05 200);
+
--secondary-foreground: oklch(0.23 0.015 285);
+
+
/* Muted slate areas */
+
--muted: oklch(0.33 0.015 285);
+
--muted-foreground: oklch(0.72 0.01 285);
+
+
/* Subtle borders */
+
--border: oklch(0.38 0.02 285);
+
--input: oklch(0.30 0.015 285);
+
--ring: oklch(0.70 0.10 295);
+
+
/* Warm destructive color */
+
--destructive: oklch(0.60 0.22 27);
+
--destructive-foreground: oklch(0.98 0.01 85);
+
+
/* Chart colors using the accent palette */
+
--chart-1: oklch(0.85 0.08 5);
+
--chart-2: oklch(0.82 0.05 200);
+
--chart-3: oklch(0.70 0.10 295);
+
--chart-4: oklch(0.75 0.08 340);
+
--chart-5: oklch(0.65 0.08 180);
+
+
/* Sidebar slate */
+
--sidebar: oklch(0.20 0.015 285);
+
--sidebar-foreground: oklch(0.90 0.005 285);
+
--sidebar-primary: oklch(0.70 0.10 295);
+
--sidebar-primary-foreground: oklch(0.20 0.015 285);
+
--sidebar-accent: oklch(0.28 0.015 285);
+
--sidebar-accent-foreground: oklch(0.90 0.005 285);
+
--sidebar-border: oklch(0.32 0.02 285);
+
--sidebar-ring: oklch(0.70 0.10 295);
+
}
+
+
@theme inline {
+
/* optional: --font-sans, --font-serif, --font-mono if they are applied in the layout.tsx */
+
--color-background: var(--background);
+
--color-foreground: var(--foreground);
+
--color-card: var(--card);
+
--color-card-foreground: var(--card-foreground);
+
--color-popover: var(--popover);
+
--color-popover-foreground: var(--popover-foreground);
+
--color-primary: var(--primary);
+
--color-primary-foreground: var(--primary-foreground);
+
--color-secondary: var(--secondary);
+
--color-secondary-foreground: var(--secondary-foreground);
+
--color-muted: var(--muted);
+
--color-muted-foreground: var(--muted-foreground);
+
--color-accent: var(--accent);
+
--color-accent-foreground: var(--accent-foreground);
+
--color-destructive: var(--destructive);
+
--color-destructive-foreground: var(--destructive-foreground);
+
--color-border: var(--border);
+
--color-input: var(--input);
+
--color-ring: var(--ring);
+
--color-chart-1: var(--chart-1);
+
--color-chart-2: var(--chart-2);
+
--color-chart-3: var(--chart-3);
+
--color-chart-4: var(--chart-4);
+
--color-chart-5: var(--chart-5);
+
--radius-sm: calc(var(--radius) - 4px);
+
--radius-md: calc(var(--radius) - 2px);
+
--radius-lg: var(--radius);
+
--radius-xl: calc(var(--radius) + 4px);
+
--color-sidebar: var(--sidebar);
+
--color-sidebar-foreground: var(--sidebar-foreground);
+
--color-sidebar-primary: var(--sidebar-primary);
+
--color-sidebar-primary-foreground: var(--sidebar-primary-foreground);
+
--color-sidebar-accent: var(--sidebar-accent);
+
--color-sidebar-accent-foreground: var(--sidebar-accent-foreground);
+
--color-sidebar-border: var(--sidebar-border);
+
--color-sidebar-ring: var(--sidebar-ring);
+
}
+
+
@layer base {
+
* {
+
@apply border-border outline-ring/50;
+
}
+
body {
+
@apply bg-background text-foreground;
+
}
+
}
+
+
@keyframes arrow-bounce {
+
0%, 100% {
+
transform: translateX(0);
+
}
+
50% {
+
transform: translateX(4px);
+
}
+
}
+
+
.arrow-animate {
+
animation: arrow-bounce 1.5s ease-in-out infinite;
+
}
+
+
@keyframes shimmer {
+
100% {
+
transform: translateX(100%);
+
}
+
}
+
+
/* Shiki syntax highlighting styles */
+
.shiki-wrapper {
+
border-radius: 0.5rem;
+
padding: 1rem;
+
overflow-x: auto;
+
border: 1px solid var(--border); /* the theme vars hold full oklch() colors, so no hsl() wrapper */
+
}
+
+
.shiki-wrapper pre {
+
margin: 0 !important;
+
padding: 0 !important;
+
}
apps/main-app/public/transparent-full-size-ico.png

This is a binary file and will not be displayed.

+46
apps/main-app/scripts/change-admin-password.ts
···
+
// Change admin password
+
import { adminAuth } from '../src/lib/admin-auth'
+
import { db } from '../src/lib/db'
+
import { randomBytes, createHash } from 'crypto'
+
+
// Get username and new password from command line
+
const username = process.argv[2]
+
const newPassword = process.argv[3]
+
+
if (!username || !newPassword) {
+
console.error('Usage: bun run change-admin-password.ts <username> <new-password>')
+
process.exit(1)
+
}
+
+
if (newPassword.length < 8) {
+
console.error('Password must be at least 8 characters')
+
process.exit(1)
+
}
+
+
// Hash password
+
function hashPassword(password: string, salt: string): string {
+
return createHash('sha256').update(password + salt).digest('hex')
+
}
+
+
function generateSalt(): string {
+
return randomBytes(32).toString('hex')
+
}
+
+
// Initialize
+
await adminAuth.init()
+
+
// Check if user exists
+
const result = await db`SELECT username FROM admin_users WHERE username = ${username}`
+
if (result.length === 0) {
+
console.error(`Admin user '${username}' not found`)
+
process.exit(1)
+
}
+
+
// Update password
+
const salt = generateSalt()
+
const passwordHash = hashPassword(newPassword, salt)
+
+
await db`UPDATE admin_users SET password_hash = ${passwordHash}, salt = ${salt} WHERE username = ${username}`
+
+
console.log(`โœ“ Password updated for admin user '${username}'`)
+
process.exit(0)
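change-admin-password.ts reuses the salted SHA-256 scheme from `admin-auth.ts`: hash `password + salt`, store hash and salt, and verify by recomputing. A standalone sanity check of that round-trip (helpers reimplemented here for illustration, not imported from the app):

```typescript
import { createHash, randomBytes } from 'crypto'

// Reimplemented copies of the script's helpers (illustration only).
function hashPassword(password: string, salt: string): string {
    return createHash('sha256').update(password + salt).digest('hex')
}

const salt = randomBytes(32).toString('hex')
const stored = hashPassword('example-password', salt)

// Verification recomputes the hash with the stored salt and compares.
const matches = stored === hashPassword('example-password', salt)
const rejectsWrong = stored !== hashPassword('wrong-password', salt)
console.log(matches, rejectsWrong) // true true
```

Worth noting: a single salted SHA-256 is much faster to brute-force than a dedicated password hash (bcrypt/scrypt/argon2), something to revisit if the admin surface grows.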
+31
apps/main-app/scripts/create-admin.ts
···
+
// Quick script to create admin user with randomly generated password
+
import { adminAuth } from '../src/lib/admin-auth'
+
import { randomBytes } from 'crypto'
+
+
// Generate a secure random password
+
function generatePassword(length: number = 20): string {
+
const chars = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'
+
const bytes = randomBytes(length)
+
let password = ''
+
for (let i = 0; i < length; i++) {
+
password += chars[bytes[i] % chars.length]
+
}
+
return password
+
}
+
+
const username = 'admin'
+
const password = generatePassword(20)
+
+
await adminAuth.init()
+
await adminAuth.createAdmin(username, password)
+
+
console.log('\nโ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
+
console.log('โ•‘ ADMIN USER CREATED SUCCESSFULLY โ•‘')
+
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
+
console.log(`Username: ${username}`)
+
console.log(`Password: ${password}`)
+
console.log('\nโš ๏ธ IMPORTANT: Save this password securely!')
+
console.log('This password will not be shown again.\n')
+
console.log('Change it with: bun run change-admin-password.ts admin NEW_PASSWORD\n')
+
+
process.exit(0)
+229
apps/main-app/scripts/screenshot-sites.ts
···
+
#!/usr/bin/env bun
+
/**
+
* Screenshot Sites Script
+
*
+
* Takes screenshots of all sites in the database.
+
* Usage: bun run scripts/screenshot-sites.ts
+
*/
+
+
import { chromium } from 'playwright'
+
import { db } from '../src/lib/db'
+
import { mkdir } from 'fs/promises'
+
import { join } from 'path'
+
+
const SCREENSHOTS_DIR = join(process.cwd(), 'screenshots')
+
const VIEWPORT_WIDTH = 1920
+
const VIEWPORT_HEIGHT = 1080
+
const TIMEOUT = 10000 // 10 seconds
+
const MAX_RETRIES = 1
+
const CONCURRENCY = 10 // Number of parallel screenshots
+
+
interface Site {
+
did: string
+
rkey: string
+
}
+
+
/**
+
* Get all sites from the database
+
*/
+
async function getAllSites(): Promise<Site[]> {
+
const rows = await db`
+
SELECT did, rkey
+
FROM sites
+
ORDER BY created_at DESC
+
`
+
+
return rows as Site[]
+
}
+
+
/**
+
* Determine the URL to screenshot for a site
+
* Priority: custom domain (verified) โ†’ wisp domain โ†’ fallback to sites.wisp.place
+
*/
+
async function getSiteUrl(site: Site): Promise<string> {
+
// Check for custom domain mapped to this site
+
const customDomains = await db`
+
SELECT domain FROM custom_domains
+
WHERE did = ${site.did} AND rkey = ${site.rkey} AND verified = true
+
LIMIT 1
+
`
+
if (customDomains.length > 0) {
+
return `https://${customDomains[0].domain}`
+
}
+
+
// Check for wisp domain mapped to this site
+
const wispDomains = await db`
+
SELECT domain FROM domains
+
WHERE did = ${site.did} AND rkey = ${site.rkey}
+
LIMIT 1
+
`
+
if (wispDomains.length > 0) {
+
return `https://${wispDomains[0].domain}`
+
}
+
+
// Fallback to direct serving URL
+
return `https://sites.wisp.place/${site.did}/${site.rkey}`
+
}
+
+
/**
+
* Sanitize filename to remove invalid characters
+
*/
+
function sanitizeFilename(str: string): string {
+
return str.replace(/[^a-z0-9_-]/gi, '_').toLowerCase()
+
}
+
+
/**
+
* Take a screenshot of a site with retry logic
+
*/
+
async function screenshotSite(
+
page: import('playwright').Page,
+
site: Site,
+
retries: number = MAX_RETRIES
+
): Promise<{ success: boolean; error?: string }> {
+
const url = await getSiteUrl(site)
+
// Use the URL as filename (remove https:// and sanitize)
+
const urlForFilename = url.replace(/^https?:\/\//, '')
+
const filename = `${sanitizeFilename(urlForFilename)}.png`
+
const filepath = join(SCREENSHOTS_DIR, filename)
+
+
for (let attempt = 0; attempt <= retries; attempt++) {
+
try {
+
// Navigate to the site
+
await page.goto(url, {
+
waitUntil: 'networkidle',
+
timeout: TIMEOUT
+
})
+
+
// Wait a bit for any dynamic content
+
await page.waitForTimeout(1000)
+
+
// Take screenshot
+
await page.screenshot({
+
path: filepath,
+
fullPage: false, // Just viewport, not full scrollable page
+
type: 'png'
+
})
+
+
return { success: true }
+
+
} catch (error) {
+
const errorMsg = error instanceof Error ? error.message : String(error)
+
+
if (attempt < retries) {
+
continue
+
}
+
+
return { success: false, error: errorMsg }
+
}
+
}
+
+
return { success: false, error: 'Unknown error' }
+
}
+
+
/**
+
* Main function
+
*/
+
async function main() {
+
console.log('๐Ÿš€ Starting site screenshot process...\n')
+
+
// Create screenshots directory if it doesn't exist
+
await mkdir(SCREENSHOTS_DIR, { recursive: true })
+
console.log(`๐Ÿ“ Screenshots will be saved to: ${SCREENSHOTS_DIR}\n`)
+
+
// Get all sites
+
console.log('๐Ÿ“Š Fetching sites from database...')
+
const sites = await getAllSites()
+
console.log(` Found ${sites.length} sites\n`)
+
+
if (sites.length === 0) {
+
console.log('No sites to screenshot. Exiting.')
+
return
+
}
+
+
// Launch browser
+
console.log('๐ŸŒ Launching browser...\n')
+
const browser = await chromium.launch({
+
headless: true
+
})
+
+
const context = await browser.newContext({
+
viewport: {
+
width: VIEWPORT_WIDTH,
+
height: VIEWPORT_HEIGHT
+
},
+
userAgent: 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 WispScreenshotBot/1.0'
+
})
+
+
// Track results
+
const results = {
+
success: 0,
+
failed: 0,
+
errors: [] as { site: string; error: string }[]
+
}
+
+
// Process sites in parallel batches
+
console.log(`๐Ÿ“ธ Screenshotting ${sites.length} sites with concurrency ${CONCURRENCY}...\n`)
+
+
for (let i = 0; i < sites.length; i += CONCURRENCY) {
+
const batch = sites.slice(i, i + CONCURRENCY)
+
const batchNum = Math.floor(i / CONCURRENCY) + 1
+
const totalBatches = Math.ceil(sites.length / CONCURRENCY)
+
+
console.log(`[Batch ${batchNum}/${totalBatches}] Processing ${batch.length} sites...`)
+
+
// Create a page for each site in the batch
+
const batchResults = await Promise.all(
+
batch.map(async (site, idx) => {
+
const page = await context.newPage()
+
const globalIdx = i + idx + 1
+
console.log(` [${globalIdx}/${sites.length}] ${site.did}/${site.rkey}`)
+
+
const result = await screenshotSite(page, site)
+
await page.close()
+
+
return { site, result }
+
})
+
)
+
+
// Aggregate results
+
for (const { site, result } of batchResults) {
+
if (result.success) {
+
results.success++
+
} else {
+
results.failed++
+
results.errors.push({
+
site: `${site.did}/${site.rkey}`,
+
error: result.error || 'Unknown error'
+
})
+
}
+
}
+
+
console.log(` Batch complete: ${batchResults.filter(r => r.result.success).length}/${batch.length} successful\n`)
+
}
+
+
// Cleanup
+
await browser.close()
+
+
// Print summary
+
console.log('โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
+
console.log('โ•‘ SCREENSHOT SUMMARY โ•‘')
+
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
+
console.log(`Total sites: ${sites.length}`)
+
console.log(`โœ… Successful: ${results.success}`)
+
console.log(`โŒ Failed: ${results.failed}`)
+
+
if (results.errors.length > 0) {
+
console.log('\nFailed sites:')
+
for (const err of results.errors) {
+
console.log(` - ${err.site}: ${err.error}`)
+
}
+
}
+
+
console.log(`\n๐Ÿ“ Screenshots saved to: ${SCREENSHOTS_DIR}\n`)
+
}
+
+
// Run the script
+
main().catch((error) => {
+
console.error('Fatal error:', error)
+
process.exit(1)
+
})
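The batch loop above slices the site list into fixed-size chunks and awaits each chunk with `Promise.all`. The slicing step can be isolated as a small helper (sketch; the script inlines this logic rather than using a helper):

```typescript
// Fixed-size batching, as used by the screenshot loop (CONCURRENCY = 10 there).
function toBatches<T>(items: T[], size: number): T[][] {
    const batches: T[][] = []
    for (let i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size))
    }
    return batches
}

const batches = toBatches([1, 2, 3, 4, 5, 6, 7], 3)
console.log(batches.length) // 3
console.log(batches[2]) // [7]
```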
+195
apps/main-app/src/index.ts
···
+
import { Elysia } from 'elysia'
+
import type { Context } from 'elysia'
+
import { cors } from '@elysiajs/cors'
+
import { staticPlugin } from '@elysiajs/static'
+
+
import type { Config } from './lib/types'
+
import { BASE_HOST } from '@wisp/constants'
+
import {
+
createClientMetadata,
+
getOAuthClient,
+
getCurrentKeys,
+
cleanupExpiredSessions,
+
rotateKeysIfNeeded
+
} from './lib/oauth-client'
+
import { getCookieSecret } from './lib/db'
+
import { authRoutes } from './routes/auth'
+
import { wispRoutes } from './routes/wisp'
+
import { domainRoutes } from './routes/domain'
+
import { userRoutes } from './routes/user'
+
import { siteRoutes } from './routes/site'
+
import { csrfProtection } from './lib/csrf'
+
import { DNSVerificationWorker } from './lib/dns-verification-worker'
+
import { createLogger, logCollector } from '@wisp/observability'
+
import { observabilityMiddleware } from '@wisp/observability/middleware/elysia'
+
import { promptAdminSetup } from './lib/admin-auth'
+
import { adminRoutes } from './routes/admin'
+
+
const logger = createLogger('main-app')
+
+
const config: Config = {
+
domain: (Bun.env.DOMAIN ?? `https://${BASE_HOST}`) as Config['domain'],
+
clientName: Bun.env.CLIENT_NAME ?? 'PDS-View'
+
}
+
+
// Initialize admin setup (prompt if no admin exists)
+
await promptAdminSetup()
+
+
// Get or generate cookie signing secret
+
const cookieSecret = await getCookieSecret()
+
+
const client = await getOAuthClient(config)
+
+
// Periodic maintenance: cleanup expired sessions and rotate keys
+
// Run every hour
+
const runMaintenance = async () => {
+
console.log('[Maintenance] Running periodic maintenance...')
+
await cleanupExpiredSessions()
+
await rotateKeysIfNeeded()
+
}
+
+
// Run maintenance on startup
+
runMaintenance().catch((err) => console.error('[Maintenance] startup run failed:', err))
+
+
// Schedule maintenance to run every hour
+
setInterval(() => runMaintenance().catch((err) => console.error('[Maintenance] scheduled run failed:', err)), 60 * 60 * 1000)
+
+
// Start DNS verification worker (runs every 10 minutes)
+
const dnsVerifier = new DNSVerificationWorker(
+
10 * 60 * 1000, // 10 minutes
+
(msg, data) => {
+
logCollector.info(`[DNS Verifier] ${msg}`, 'main-app', data ? { data } : undefined)
+
}
+
)
+
+
dnsVerifier.start()
+
logger.info('DNS Verifier Started - checking custom domains every 10 minutes')
+
+
export const app = new Elysia({
+
serve: {
+
maxRequestBodySize: 1024 * 1024 * 128 * 3, // 384 MiB upload ceiling
+
development: Bun.env.NODE_ENV !== 'production',
+
id: Bun.env.NODE_ENV !== 'production' ? undefined : null,
+
},
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
+
// Observability middleware
+
.onBeforeHandle(observabilityMiddleware('main-app').beforeHandle)
+
.onAfterHandle((ctx: Context) => {
+
observabilityMiddleware('main-app').afterHandle(ctx)
+
// Security headers middleware
+
const { set } = ctx
+
// Prevent clickjacking attacks
+
set.headers['X-Frame-Options'] = 'DENY'
+
// Prevent MIME type sniffing
+
set.headers['X-Content-Type-Options'] = 'nosniff'
+
// Strict Transport Security (HSTS) - enforce HTTPS
+
set.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains'
+
// Referrer policy - limit referrer information
+
set.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin'
+
// Content Security Policy
+
set.headers['Content-Security-Policy'] =
+
"default-src 'self'; " +
+
"script-src 'self' 'unsafe-inline' 'unsafe-eval'; " +
+
"style-src 'self' 'unsafe-inline'; " +
+
"img-src 'self' data: https:; " +
+
"font-src 'self' data:; " +
+
"connect-src 'self' https:; " +
+
"frame-ancestors 'none'; " +
+
"base-uri 'self'; " +
+
"form-action 'self'"
+
// Additional security headers
+
set.headers['X-XSS-Protection'] = '1; mode=block'
+
set.headers['Permissions-Policy'] = 'geolocation=(), microphone=(), camera=()'
+
})
+
.onError(observabilityMiddleware('main-app').onError)
+
.use(csrfProtection())
+
.use(authRoutes(client, cookieSecret))
+
.use(wispRoutes(client, cookieSecret))
+
.use(domainRoutes(client, cookieSecret))
+
.use(userRoutes(client, cookieSecret))
+
.use(siteRoutes(client, cookieSecret))
+
.use(adminRoutes(cookieSecret))
+
.use(
+
await staticPlugin({
+
assets: 'apps/main-app/public',
+
prefix: '/'
+
})
+
)
+
.get('/client-metadata.json', () => {
+
return createClientMetadata(config)
+
})
+
.get('/jwks.json', async ({ set }) => {
+
// Prevent caching to ensure clients always get fresh keys after rotation
+
set.headers['Cache-Control'] = 'no-store, no-cache, must-revalidate, max-age=0'
+
set.headers['Pragma'] = 'no-cache'
+
set.headers['Expires'] = '0'
+
+
const keys = await getCurrentKeys()
+
if (!keys.length) return { keys: [] }
+
+
return {
+
keys: keys.map((k) => {
+
const jwk = k.publicJwk ?? k
+
const { d: _d, ...pub } = jwk // defensively drop any private 'd' component before publishing
+
return pub
+
})
+
}
+
})
+
.get('/api/health', () => {
+
const dnsVerifierHealth = dnsVerifier.getHealth()
+
return {
+
status: 'ok',
+
timestamp: new Date().toISOString(),
+
dnsVerifier: dnsVerifierHealth
+
}
+
})
+
.get('/api/screenshots', async () => {
+
const { Glob } = await import('bun')
+
const glob = new Glob('*.png')
+
const screenshots: string[] = []
+
+
for await (const file of glob.scan('./apps/main-app/public/screenshots')) {
+
screenshots.push(file)
+
}
+
+
return { screenshots }
+
})
+
.get('/api/admin/test', () => {
+
return { message: 'Admin routes test works!' }
+
})
+
.post('/api/admin/verify-dns', async () => {
+
try {
+
await dnsVerifier.trigger()
+
return {
+
success: true,
+
message: 'DNS verification triggered'
+
}
+
} catch (error) {
+
return {
+
success: false,
+
error: error instanceof Error ? error.message : String(error)
+
}
+
}
+
})
+
.get('/.well-known/atproto-did', ({ set }) => {
+
// Return plain text DID for AT Protocol domain verification
+
set.headers['Content-Type'] = 'text/plain'
+
return 'did:plc:7puq73yz2hkvbcpdhnsze2qw'
+
})
+
.use(cors({
+
origin: config.domain,
+
credentials: true,
+
methods: ['GET', 'POST', 'DELETE', 'PUT', 'PATCH', 'OPTIONS'],
+
allowedHeaders: ['Content-Type', 'Authorization', 'Origin', 'X-Forwarded-Host'],
+
exposeHeaders: ['Content-Type'],
+
maxAge: 86400 // 24 hours
+
}))
+
.listen(8000)
+
+
console.log(
+
`๐ŸฆŠ Elysia is running at ${app.server?.hostname}:${app.server?.port}`
+
)
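The `/jwks.json` handler maps stored keys to their public form before serving them. A published JWKS must never contain a key's private `d` component; rest destructuring is a compact way to drop it (sketch with hypothetical key values):

```typescript
// Hypothetical EC JWK; real keys come from the app's key store.
const fullJwk = { kty: 'EC', crv: 'P-256', x: 'abc', y: 'def', d: 'PRIVATE' }

// Rest destructuring splits off 'd', leaving only public material.
const { d, ...publicJwk } = fullJwk
console.log(Object.keys(publicJwk)) // ['kty', 'crv', 'x', 'y']
```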
+208
apps/main-app/src/lib/admin-auth.ts
···
+
// Admin authentication system
+
import { db } from './db'
+
import { randomBytes, createHash } from 'crypto'
+
+
interface AdminUser {
+
id: number
+
username: string
+
password_hash: string
+
created_at: Date
+
}
+
+
interface AdminSession {
+
sessionId: string
+
username: string
+
expiresAt: Date
+
}
+
+
// In-memory session storage
+
const sessions = new Map<string, AdminSession>()
+
const SESSION_DURATION = 24 * 60 * 60 * 1000 // 24 hours
+
+
// Hash password using SHA-256 with salt
+
function hashPassword(password: string, salt: string): string {
+
return createHash('sha256').update(password + salt).digest('hex')
+
}
+
+
// Generate random salt
+
function generateSalt(): string {
+
return randomBytes(32).toString('hex')
+
}
+
+
// Generate session ID
+
function generateSessionId(): string {
+
return randomBytes(32).toString('hex')
+
}
+
+
// Generate a secure random password
+
function generatePassword(length: number = 20): string {
+
const chars = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'
+
const bytes = randomBytes(length)
+
let password = ''
+
for (let i = 0; i < length; i++) {
+
password += chars[bytes[i] % chars.length]
+
}
+
return password
+
}
+
+
export const adminAuth = {
+
// Initialize admin table
+
async init() {
+
await db`
+
CREATE TABLE IF NOT EXISTS admin_users (
+
id SERIAL PRIMARY KEY,
+
username TEXT UNIQUE NOT NULL,
+
password_hash TEXT NOT NULL,
+
salt TEXT NOT NULL,
+
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
+
)
+
`
+
},
+
+
// Check if any admin exists
+
async hasAdmin(): Promise<boolean> {
+
const result = await db`SELECT COUNT(*) as count FROM admin_users`
+
return Number(result[0].count) > 0 // COUNT(*) may arrive as a string from the driver
+
},
+
+
// Create admin user
+
async createAdmin(username: string, password: string): Promise<boolean> {
+
try {
+
const salt = generateSalt()
+
const passwordHash = hashPassword(password, salt)
+
+
await db`INSERT INTO admin_users (username, password_hash, salt) VALUES (${username}, ${passwordHash}, ${salt})`
+
+
console.log(`โœ“ Admin user '${username}' created successfully`)
+
return true
+
} catch (error) {
+
console.error('Failed to create admin user:', error)
+
return false
+
}
+
},
+
+
// Verify admin credentials
+
async verify(username: string, password: string): Promise<boolean> {
+
try {
+
const result = await db`SELECT password_hash, salt FROM admin_users WHERE username = ${username}`
+
+
if (result.length === 0) {
+
return false
+
}
+
+
const { password_hash, salt } = result[0]
+
const hash = hashPassword(password, salt as string)
+
return hash === password_hash
+
} catch (error) {
+
console.error('Failed to verify admin:', error)
+
return false
+
}
+
},
+
+
// Create session
+
createSession(username: string): string {
+
const sessionId = generateSessionId()
+
const expiresAt = new Date(Date.now() + SESSION_DURATION)
+
+
sessions.set(sessionId, {
+
sessionId,
+
username,
+
expiresAt
+
})
+
+
// Clean up expired sessions
+
this.cleanupSessions()
+
+
return sessionId
+
},
+
+
// Verify session
+
verifySession(sessionId: string): AdminSession | null {
+
const session = sessions.get(sessionId)
+
+
if (!session) {
+
return null
+
}
+
+
if (session.expiresAt.getTime() < Date.now()) {
+
sessions.delete(sessionId)
+
return null
+
}
+
+
return session
+
},
+
+
// Delete session
+
deleteSession(sessionId: string) {
+
sessions.delete(sessionId)
+
},
+
+
// Cleanup expired sessions
+
cleanupSessions() {
+
const now = Date.now()
+
for (const [sessionId, session] of sessions.entries()) {
+
if (session.expiresAt.getTime() < now) {
+
sessions.delete(sessionId)
+
}
+
}
+
}
+
}
+
+
// Prompt for admin creation on startup
+
export async function promptAdminSetup() {
+
await adminAuth.init()
+
+
const hasAdmin = await adminAuth.hasAdmin()
+
if (hasAdmin) {
+
return
+
}
+
+
// Skip prompt if SKIP_ADMIN_SETUP is set
+
if (process.env.SKIP_ADMIN_SETUP === 'true') {
+
console.log('\nโ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
+
console.log('โ•‘ ADMIN SETUP REQUIRED โ•‘')
+
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
+
console.log('No admin user found.')
+
console.log('Create one with: bun run create-admin.ts\n')
+
return
+
}
+
+
console.log('\n===========================================')
+
console.log(' ADMIN SETUP REQUIRED')
+
console.log('===========================================\n')
+
console.log('No admin user found. Creating one automatically...\n')
+
+
// Auto-generate admin credentials with random password
+
const username = 'admin'
+
const password = generatePassword(20)
+
+
await adminAuth.createAdmin(username, password)
+
+
console.log('โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
+
console.log('โ•‘ ADMIN USER CREATED SUCCESSFULLY โ•‘')
+
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
+
console.log(`Username: ${username}`)
+
console.log(`Password: ${password}`)
+
console.log('\nโš ๏ธ IMPORTANT: Save this password securely!')
+
console.log('This password will not be shown again.\n')
+
console.log('Change it with: bun run change-admin-password.ts admin NEW_PASSWORD\n')
+
}
+
+
// Elysia middleware to protect admin routes
+
export function requireAdmin({ cookie, set }: any) {
+
const sessionId = cookie.admin_session?.value
+
+
if (!sessionId) {
+
set.status = 401
+
return { error: 'Unauthorized' }
+
}
+
+
const session = adminAuth.verifySession(sessionId)
+
if (!session) {
+
set.status = 401
+
return { error: 'Unauthorized' }
+
}
+
+
// Session is valid, continue
+
return
+
}
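`verifySession` above combines three steps: lookup, expiry check, and lazy eviction of the expired entry on access. The same pattern in a minimal standalone form (sketch, not the app's code):

```typescript
// Minimal session store mirroring verifySession's expiry handling.
interface Session { expiresAt: Date }
const store = new Map<string, Session>()
store.set('live', { expiresAt: new Date(Date.now() + 60_000) })
store.set('stale', { expiresAt: new Date(Date.now() - 60_000) })

function isLive(id: string): boolean {
    const s = store.get(id)
    if (!s) return false
    if (s.expiresAt.getTime() < Date.now()) {
        store.delete(id) // lazy eviction on access
        return false
    }
    return true
}

console.log(isLive('live'), isLive('stale'), store.has('stale')) // true false false
```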
+81
apps/main-app/src/lib/csrf.test.ts
···
+
import { describe, test, expect } from 'bun:test'
+
import { verifyRequestOrigin } from './csrf'
+
+
describe('verifyRequestOrigin', () => {
+
test('should accept matching origin and host', () => {
+
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
+
expect(verifyRequestOrigin('http://localhost:8000', ['localhost:8000'])).toBe(true)
+
expect(verifyRequestOrigin('https://app.example.com', ['app.example.com'])).toBe(true)
+
})
+
+
test('should accept origin matching one of multiple allowed hosts', () => {
+
const allowedHosts = ['example.com', 'app.example.com', 'localhost:8000']
+
expect(verifyRequestOrigin('https://example.com', allowedHosts)).toBe(true)
+
expect(verifyRequestOrigin('https://app.example.com', allowedHosts)).toBe(true)
+
expect(verifyRequestOrigin('http://localhost:8000', allowedHosts)).toBe(true)
+
})
+
+
test('should reject non-matching origin', () => {
+
expect(verifyRequestOrigin('https://evil.com', ['example.com'])).toBe(false)
+
expect(verifyRequestOrigin('https://fake-example.com', ['example.com'])).toBe(false)
+
expect(verifyRequestOrigin('https://example.com.evil.com', ['example.com'])).toBe(false)
+
})
+
+
test('should reject empty origin', () => {
+
expect(verifyRequestOrigin('', ['example.com'])).toBe(false)
+
})
+
+
test('should reject invalid URL format', () => {
+
expect(verifyRequestOrigin('not-a-url', ['example.com'])).toBe(false)
+
expect(verifyRequestOrigin('javascript:alert(1)', ['example.com'])).toBe(false)
+
expect(verifyRequestOrigin('file:///etc/passwd', ['example.com'])).toBe(false)
+
})
+
+
test('should handle different protocols correctly', () => {
+
// Same host, different protocols should match (we only check host)
+
expect(verifyRequestOrigin('http://example.com', ['example.com'])).toBe(true)
+
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
+
})
+
+
test('should handle port numbers correctly', () => {
+
expect(verifyRequestOrigin('http://localhost:3000', ['localhost:3000'])).toBe(true)
+
expect(verifyRequestOrigin('http://localhost:3000', ['localhost:8000'])).toBe(false)
+
expect(verifyRequestOrigin('http://localhost', ['localhost'])).toBe(true)
+
})
+
+
test('should handle subdomains correctly', () => {
+
expect(verifyRequestOrigin('https://sub.example.com', ['sub.example.com'])).toBe(true)
+
expect(verifyRequestOrigin('https://sub.example.com', ['example.com'])).toBe(false)
+
})
+
+
test('should handle case sensitivity (exact match required)', () => {
+
// URL host is automatically lowercased by URL parser
+
expect(verifyRequestOrigin('https://EXAMPLE.COM', ['example.com'])).toBe(true)
+
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
+
// But allowed hosts are case-sensitive
+
expect(verifyRequestOrigin('https://example.com', ['EXAMPLE.COM'])).toBe(false)
+
})
+
+
test('should handle trailing slashes in origin', () => {
+
expect(verifyRequestOrigin('https://example.com/', ['example.com'])).toBe(true)
+
})
+
+
test('should handle paths in origin (host extraction)', () => {
+
expect(verifyRequestOrigin('https://example.com/path/to/page', ['example.com'])).toBe(true)
+
expect(verifyRequestOrigin('https://evil.com/example.com', ['example.com'])).toBe(false)
+
})
+
+
test('should reject when allowed hosts is empty', () => {
+
expect(verifyRequestOrigin('https://example.com', [])).toBe(false)
+
})
+
+
test('should handle IPv4 addresses', () => {
+
expect(verifyRequestOrigin('http://127.0.0.1:8000', ['127.0.0.1:8000'])).toBe(true)
+
expect(verifyRequestOrigin('http://192.168.1.1', ['192.168.1.1'])).toBe(true)
+
})
+
+
test('should handle IPv6 addresses', () => {
+
expect(verifyRequestOrigin('http://[::1]:8000', ['[::1]:8000'])).toBe(true)
+
expect(verifyRequestOrigin('http://[2001:db8::1]', ['[2001:db8::1]'])).toBe(true)
+
})
+
})
+80
apps/main-app/src/lib/csrf.ts
···
+
import { Elysia } from 'elysia'
+
import { logger } from './logger'
+
+
/**
+
* CSRF Protection using Origin/Host header verification
+
* Based on Lucia's recommended approach for cookie-based authentication
+
*
+
* This validates that the Origin header matches the Host header for
+
* state-changing requests (POST, PUT, DELETE, PATCH).
+
*/
+
+
/**
+
* Verify that the request origin matches the expected host
+
* @param origin - The Origin header value
+
* @param allowedHosts - Array of allowed host values
+
* @returns true if origin is valid, false otherwise
+
*/
+
export function verifyRequestOrigin(origin: string, allowedHosts: string[]): boolean {
+
if (!origin) {
+
return false
+
}
+
+
try {
+
const originUrl = new URL(origin)
+
const originHost = originUrl.host
+
+
return allowedHosts.some(host => originHost === host)
+
} catch {
+
// Invalid URL
+
return false
+
}
+
}
+
+
/**
+
* CSRF Protection Middleware for Elysia
+
*
+
* Validates Origin header against Host header for non-GET requests
+
* to prevent CSRF attacks when using cookie-based authentication.
+
*
+
* Usage:
+
* ```ts
+
* import { csrfProtection } from './lib/csrf'
+
*
+
* new Elysia()
+
* .use(csrfProtection())
+
* .post('/api/protected', handler)
+
* ```
+
*/
+
export const csrfProtection = () => {
+
return new Elysia({ name: 'csrf-protection' })
+
.onBeforeHandle(({ request, set }) => {
+
const method = request.method.toUpperCase()
+
+
// Only protect state-changing methods
+
if (['GET', 'HEAD', 'OPTIONS'].includes(method)) {
+
return
+
}
+
+
// Get headers
+
const originHeader = request.headers.get('Origin')
+
// Use X-Forwarded-Host if behind a proxy, otherwise use Host
+
const hostHeader = request.headers.get('X-Forwarded-Host') || request.headers.get('Host')
+
+
// Validate origin matches host
+
if (!originHeader || !hostHeader || !verifyRequestOrigin(originHeader, [hostHeader])) {
+
logger.warn('[CSRF] Request blocked', {
+
method,
+
origin: originHeader,
+
host: hostHeader,
+
path: new URL(request.url).pathname
+
})
+
+
set.status = 403
+
return {
+
error: 'CSRF validation failed',
+
message: 'Request origin does not match host'
+
}
+
}
+
})
+
}
+116
apps/main-app/src/lib/db.test.ts
···
+
import { describe, test, expect, beforeAll, afterAll } from 'bun:test'
+
import {
+
claimCustomDomain,
+
getCustomDomainInfo,
+
deleteCustomDomain,
+
updateCustomDomainVerification,
+
db
+
} from './db'
+
+
describe('custom domain claiming', () => {
+
const testDid1 = 'did:plc:testuser1'
+
const testDid2 = 'did:plc:testuser2'
+
const testDomain = 'example-test-domain.com'
+
const hash1 = 'testhash12345678'
+
const hash2 = 'testhash87654321'
+
const hash3 = 'testhash11111111'
+
+
beforeAll(async () => {
+
// Clean up any existing test data
+
try {
+
await db`DELETE FROM custom_domains WHERE domain = ${testDomain}`
+
} catch (err) {
+
// Ignore errors if table doesn't exist or other issues
+
}
+
})
+
+
afterAll(async () => {
+
// Clean up test data
+
try {
+
await db`DELETE FROM custom_domains WHERE domain = ${testDomain}`
+
} catch (err) {
+
// Ignore cleanup errors
+
}
+
})
+
+
test('should allow first user to claim a domain', async () => {
+
const result = await claimCustomDomain(testDid1, testDomain, hash1)
+
expect(result.success).toBe(true)
+
expect(result.hash).toBe(hash1)
+
+
const domainInfo = await getCustomDomainInfo(testDomain)
+
expect(domainInfo).toBeTruthy()
+
expect(domainInfo!.domain).toBe(testDomain)
+
expect(domainInfo!.did).toBe(testDid1)
+
expect(domainInfo!.verified).toBe(false)
+
expect(domainInfo!.id).toBe(hash1)
+
})
+
+
test('should allow second user to claim an unverified domain', async () => {
+
const result = await claimCustomDomain(testDid2, testDomain, hash2)
+
expect(result.success).toBe(true)
+
expect(result.hash).toBe(hash2)
+
+
const domainInfo = await getCustomDomainInfo(testDomain)
+
expect(domainInfo).toBeTruthy()
+
expect(domainInfo!.domain).toBe(testDomain)
+
expect(domainInfo!.did).toBe(testDid2) // Should have changed
+
expect(domainInfo!.verified).toBe(false)
+
expect(domainInfo!.id).toBe(hash2) // Should have changed
+
})
+
+
test('should prevent claiming a verified domain', async () => {
+
// First verify the domain for testDid2
+
await updateCustomDomainVerification(hash2, true)
+
+
// Now try to claim it with testDid1 - should reject with 'conflict'
+
await expect(claimCustomDomain(testDid1, testDomain, hash3)).rejects.toThrow('conflict')
+
+
// Verify the domain is still owned by testDid2 and verified
+
const domainInfo = await getCustomDomainInfo(testDomain)
+
expect(domainInfo).toBeTruthy()
+
expect(domainInfo!.did).toBe(testDid2)
+
expect(domainInfo!.verified).toBe(true)
+
expect(domainInfo!.id).toBe(hash2)
+
})
+
+
test('should allow claiming after unverification', async () => {
+
// Unverify the domain
+
await updateCustomDomainVerification(hash2, false)
+
+
// Now should be claimable again
+
const result = await claimCustomDomain(testDid1, testDomain, hash3)
+
expect(result.success).toBe(true)
+
expect(result.hash).toBe(hash3)
+
+
const domainInfo = await getCustomDomainInfo(testDomain)
+
expect(domainInfo).toBeTruthy()
+
expect(domainInfo!.did).toBe(testDid1) // Should have changed back
+
expect(domainInfo!.verified).toBe(false)
+
expect(domainInfo!.id).toBe(hash3)
+
})
+
+
test('should handle concurrent claims gracefully', async () => {
+
// Both users try to claim at the same time - one should win
+
const promise1 = claimCustomDomain(testDid1, testDomain, hash1)
+
const promise2 = claimCustomDomain(testDid2, testDomain, hash2)
+
+
const [result1, result2] = await Promise.allSettled([promise1, promise2])
+
+
// At least one should succeed
+
const successCount = [result1, result2].filter(r => r.status === 'fulfilled').length
+
expect(successCount).toBeGreaterThan(0)
+
expect(successCount).toBeLessThanOrEqual(2)
+
+
// Final state should be consistent
+
const domainInfo = await getCustomDomainInfo(testDomain)
+
expect(domainInfo).toBeTruthy()
+
expect(domainInfo!.verified).toBe(false)
+
expect([hash1, hash2]).toContain(domainInfo!.id)
+
})
+
})
+528
apps/main-app/src/lib/db.ts
···
+
import { SQL } from "bun";
+
import { BASE_HOST } from "@wisp/constants";
+
+
export const db = new SQL(
+
process.env.NODE_ENV === 'production'
+
? process.env.DATABASE_URL || (() => {
+
throw new Error('DATABASE_URL environment variable is required in production');
+
})()
+
: process.env.DATABASE_URL || "postgres://postgres:postgres@localhost:5432/wisp"
+
);
+
+
await db`
+
CREATE TABLE IF NOT EXISTS oauth_states (
+
key TEXT PRIMARY KEY,
+
data TEXT NOT NULL,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
await db`
+
CREATE TABLE IF NOT EXISTS oauth_sessions (
+
sub TEXT PRIMARY KEY,
+
data TEXT NOT NULL,
+
updated_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
+
expires_at BIGINT NOT NULL DEFAULT EXTRACT(EPOCH FROM NOW()) + 2592000
+
)
+
`;
+
+
await db`
+
CREATE TABLE IF NOT EXISTS oauth_keys (
+
kid TEXT PRIMARY KEY,
+
jwk TEXT NOT NULL,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
// Cookie secrets table for signed cookies
+
await db`
+
CREATE TABLE IF NOT EXISTS cookie_secrets (
+
id TEXT PRIMARY KEY DEFAULT 'default',
+
secret TEXT NOT NULL,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
// Domains table maps subdomain -> DID (now supports up to 3 domains per user)
+
await db`
+
CREATE TABLE IF NOT EXISTS domains (
+
domain TEXT PRIMARY KEY,
+
did TEXT NOT NULL,
+
rkey TEXT,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
// Add columns if they don't exist (for existing databases)
+
try {
+
await db`ALTER TABLE domains ADD COLUMN IF NOT EXISTS rkey TEXT`;
+
} catch (err) {
+
// Column might already exist, ignore
+
}
+
+
try {
+
await db`ALTER TABLE oauth_sessions ADD COLUMN IF NOT EXISTS expires_at BIGINT NOT NULL DEFAULT EXTRACT(EPOCH FROM NOW()) + 2592000`;
+
} catch (err) {
+
// Column might already exist, ignore
+
}
+
+
try {
+
await db`ALTER TABLE oauth_keys ADD COLUMN IF NOT EXISTS created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())`;
+
} catch (err) {
+
// Column might already exist, ignore
+
}
+
+
try {
+
await db`ALTER TABLE oauth_states ADD COLUMN IF NOT EXISTS expires_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()) + 3600`;
+
} catch (err) {
+
// Column might already exist, ignore
+
}
+
+
// Remove the unique constraint on domains.did to allow multiple domains per user
+
try {
+
await db`ALTER TABLE domains DROP CONSTRAINT IF EXISTS domains_did_key`;
+
} catch (err) {
+
// Constraint might already be removed, ignore
+
}
+
+
// Custom domains table for BYOD (bring your own domain)
+
await db`
+
CREATE TABLE IF NOT EXISTS custom_domains (
+
id TEXT PRIMARY KEY,
+
domain TEXT UNIQUE NOT NULL,
+
did TEXT NOT NULL,
+
rkey TEXT,
+
verified BOOLEAN DEFAULT false,
+
last_verified_at BIGINT,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
+
)
+
`;
+
+
// Migrate existing tables to make rkey nullable and remove default
+
try {
+
await db`ALTER TABLE custom_domains ALTER COLUMN rkey DROP NOT NULL`;
+
} catch (err) {
+
// Column might already be nullable, ignore
+
}
+
try {
+
await db`ALTER TABLE custom_domains ALTER COLUMN rkey DROP DEFAULT`;
+
} catch (err) {
+
// Default might already be removed, ignore
+
}
+
+
// Sites table - cache of place.wisp.fs records from PDS
+
await db`
+
CREATE TABLE IF NOT EXISTS sites (
+
did TEXT NOT NULL,
+
rkey TEXT NOT NULL,
+
display_name TEXT,
+
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
+
updated_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
+
PRIMARY KEY (did, rkey)
+
)
+
`;
+
+
// Create indexes for common query patterns
+
await Promise.all([
+
// oauth_states cleanup queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_states_expires_at ON oauth_states(expires_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_states_expires_at:', err);
+
}
+
}),
+
+
// oauth_sessions cleanup queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_sessions_expires_at ON oauth_sessions(expires_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_sessions_expires_at:', err);
+
}
+
}),
+
+
// oauth_keys key rotation queries
+
db`CREATE INDEX IF NOT EXISTS idx_oauth_keys_created_at ON oauth_keys(created_at)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_oauth_keys_created_at:', err);
+
}
+
}),
+
+
// domains queries by (did, rkey)
+
db`CREATE INDEX IF NOT EXISTS idx_domains_did_rkey ON domains(did, rkey)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_domains_did_rkey:', err);
+
}
+
}),
+
+
// custom_domains queries by did
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_did ON custom_domains(did)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_did:', err);
+
}
+
}),
+
+
// custom_domains queries by (did, rkey)
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_did_rkey ON custom_domains(did, rkey)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_did_rkey:', err);
+
}
+
}),
+
+
// custom_domains DNS verification worker queries
+
db`CREATE INDEX IF NOT EXISTS idx_custom_domains_verified ON custom_domains(verified)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_custom_domains_verified:', err);
+
}
+
}),
+
+
// sites queries by did
+
db`CREATE INDEX IF NOT EXISTS idx_sites_did ON sites(did)`.catch(err => {
+
if (!err.message?.includes('already exists')) {
+
console.error('Failed to create idx_sites_did:', err);
+
}
+
})
+
]);
+
+
const RESERVED_HANDLES = new Set([
+
"www",
+
"api",
+
"admin",
+
"static",
+
"public",
+
"preview",
+
"slingshot",
+
"plc",
+
"constellation",
+
"cdn",
+
"pds",
+
"staging",
+
"auth"
+
]);
+
+
export const isValidHandle = (handle: string): boolean => {
+
const h = handle.trim().toLowerCase();
+
if (h.length < 3 || h.length > 63) return false;
+
if (!/^[a-z0-9-]+$/.test(h)) return false;
+
if (h.startsWith('-') || h.endsWith('-')) return false;
+
if (h.includes('--')) return false;
+
if (RESERVED_HANDLES.has(h)) return false;
+
return true;
+
};
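The validation rules above can be exercised standalone; a minimal sketch (with a trimmed-down reserved list for illustration, and `checkHandle` as a hypothetical stand-in name):

```typescript
// Standalone sketch of the handle rules above: DNS-label length limits,
// lowercase alphanumerics and hyphens only, no leading/trailing or double
// hyphens, and a reserved-word check (subset of the real list for brevity).
const RESERVED = new Set(['www', 'api', 'admin']);

const checkHandle = (handle: string): boolean => {
	const h = handle.trim().toLowerCase();
	if (h.length < 3 || h.length > 63) return false;
	if (!/^[a-z0-9-]+$/.test(h)) return false;
	if (h.startsWith('-') || h.endsWith('-')) return false;
	if (h.includes('--')) return false;
	return !RESERVED.has(h);
};

console.log(checkHandle('my-site')); // true
console.log(checkHandle('ab'));      // false: shorter than 3 chars
console.log(checkHandle('-site'));   // false: leading hyphen
console.log(checkHandle('a--b'));    // false: double hyphen
console.log(checkHandle('api'));     // false: reserved
```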
+
+
export const toDomain = (handle: string): string => `${handle.toLowerCase()}.${BASE_HOST}`;
+
+
export const getDomainByDid = async (did: string): Promise<string | null> => {
+
const rows = await db`SELECT domain FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
+
return rows[0]?.domain ?? null;
+
};
+
+
export const getWispDomainInfo = async (did: string) => {
+
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
+
return rows[0] ?? null;
+
};
+
+
export const getAllWispDomains = async (did: string): Promise<Array<{ domain: string; rkey: string | null }>> => {
+
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC`;
+
return rows;
+
};
+
+
export const countWispDomains = async (did: string): Promise<number> => {
+
const rows = await db`SELECT COUNT(*) as count FROM domains WHERE did = ${did}`;
+
return Number(rows[0]?.count ?? 0);
+
};
+
+
export const getDidByDomain = async (domain: string): Promise<string | null> => {
+
const rows = await db`SELECT did FROM domains WHERE domain = ${domain.toLowerCase()}`;
+
return rows[0]?.did ?? null;
+
};
+
+
export const isDomainAvailable = async (handle: string): Promise<boolean> => {
+
const h = handle.trim().toLowerCase();
+
if (!isValidHandle(h)) return false;
+
const domain = toDomain(h);
+
const rows = await db`SELECT 1 FROM domains WHERE domain = ${domain} LIMIT 1`;
+
return rows.length === 0;
+
};
+
+
export const isDomainRegistered = async (domain: string) => {
+
const domainLower = domain.toLowerCase().trim();
+
+
// Check wisp.place subdomains
+
const wispDomain = await db`
+
SELECT did, domain, rkey FROM domains WHERE domain = ${domainLower}
+
`;
+
+
if (wispDomain.length > 0) {
+
return {
+
registered: true,
+
type: 'wisp' as const,
+
domain: wispDomain[0].domain,
+
did: wispDomain[0].did,
+
rkey: wispDomain[0].rkey
+
};
+
}
+
+
// Check custom domains
+
const customDomain = await db`
+
SELECT id, domain, did, rkey, verified FROM custom_domains WHERE domain = ${domainLower}
+
`;
+
+
if (customDomain.length > 0) {
+
return {
+
registered: true,
+
type: 'custom' as const,
+
domain: customDomain[0].domain,
+
did: customDomain[0].did,
+
rkey: customDomain[0].rkey,
+
verified: customDomain[0].verified
+
};
+
}
+
+
return { registered: false };
+
};
+
+
export const claimDomain = async (did: string, handle: string): Promise<string> => {
+
const h = handle.trim().toLowerCase();
+
if (!isValidHandle(h)) throw new Error('invalid_handle');
+
+
// Check if user already has 3 domains
+
const existingCount = await countWispDomains(did);
+
if (existingCount >= 3) {
+
throw new Error('domain_limit_reached');
+
}
+
+
const domain = toDomain(h);
+
try {
+
await db`
+
INSERT INTO domains (domain, did)
+
VALUES (${domain}, ${did})
+
`;
+
} catch (err) {
+
// Unique constraint violations -> already taken
+
throw new Error('conflict');
+
}
+
return domain;
+
};
+
+
export const updateDomain = async (did: string, handle: string): Promise<string> => {
+
const h = handle.trim().toLowerCase();
+
if (!isValidHandle(h)) throw new Error('invalid_handle');
+
const domain = toDomain(h);
+
try {
+
const rows = await db`
+
UPDATE domains SET domain = ${domain}
+
WHERE did = ${did}
+
RETURNING domain
+
`;
+
if (rows.length > 0) return rows[0].domain as string;
+
// No existing row, behave like claim
+
return await claimDomain(did, handle);
+
} catch (err) {
+
// Preserve claimDomain's specific errors (e.g. domain_limit_reached);
+
// anything else is a unique-constraint violation -> already taken by someone else
+
if ((err as Error).message === 'domain_limit_reached') throw err;
+
throw new Error('conflict');
+
}
+
};
+
+
export const updateWispDomainSite = async (domain: string, siteRkey: string | null): Promise<void> => {
+
await db`
+
UPDATE domains
+
SET rkey = ${siteRkey}
+
WHERE domain = ${domain}
+
`;
+
};
+
+
export const getWispDomainSite = async (did: string): Promise<string | null> => {
+
const rows = await db`SELECT rkey FROM domains WHERE did = ${did} ORDER BY created_at ASC LIMIT 1`;
+
return rows[0]?.rkey ?? null;
+
};
+
+
export const deleteWispDomain = async (domain: string): Promise<void> => {
+
await db`DELETE FROM domains WHERE domain = ${domain}`;
+
};
+
+
export const getCustomDomainsByDid = async (did: string) => {
+
const rows = await db`SELECT * FROM custom_domains WHERE did = ${did} ORDER BY created_at DESC`;
+
return rows;
+
};
+
+
export const getCustomDomainInfo = async (domain: string) => {
+
const rows = await db`SELECT * FROM custom_domains WHERE domain = ${domain.toLowerCase()}`;
+
return rows[0] ?? null;
+
};
+
+
export const getCustomDomainByHash = async (hash: string) => {
+
const rows = await db`SELECT * FROM custom_domains WHERE id = ${hash}`;
+
return rows[0] ?? null;
+
};
+
+
export const getCustomDomainById = async (id: string) => {
+
const rows = await db`SELECT * FROM custom_domains WHERE id = ${id}`;
+
return rows[0] ?? null;
+
};
+
+
export const claimCustomDomain = async (did: string, domain: string, hash: string, rkey: string | null = null) => {
+
const domainLower = domain.toLowerCase();
+
try {
+
// Use UPSERT with ON CONFLICT to handle existing pending domains
+
const result = await db`
+
INSERT INTO custom_domains (id, domain, did, rkey, verified, created_at)
+
VALUES (${hash}, ${domainLower}, ${did}, ${rkey}, false, EXTRACT(EPOCH FROM NOW()))
+
ON CONFLICT (domain) DO UPDATE SET
+
id = EXCLUDED.id,
+
did = EXCLUDED.did,
+
rkey = EXCLUDED.rkey,
+
verified = EXCLUDED.verified,
+
created_at = EXCLUDED.created_at
+
WHERE custom_domains.verified = false
+
RETURNING *
+
`;
+
+
if (result.length === 0) {
+
// No rows were updated, meaning the domain exists and is verified
+
throw new Error('conflict');
+
}
+
+
return { success: true, hash };
+
} catch (err) {
+
// Re-throw the verified-domain conflict as-is; anything else is an
+
// unexpected DB error worth logging before surfacing as a conflict
+
if ((err as Error).message === 'conflict') throw err;
+
console.error('Failed to claim custom domain', err);
+
throw new Error('conflict');
+
}
+
};
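The claim semantics the UPSERT encodes — unverified rows are freely re-claimable, verified rows reject new claims — can be sketched in memory (the `Map` and `claim` helper below are hypothetical illustrations, not the app's API):

```typescript
// In-memory sketch of the claim rules above: a Map stands in for the
// custom_domains table. An unverified row is overwritten by a new claim;
// a verified row makes the claim throw 'conflict'.
interface Row { id: string; did: string; verified: boolean }
const table = new Map<string, Row>();

const claim = (did: string, domain: string, hash: string): Row => {
	const existing = table.get(domain);
	if (existing && existing.verified) throw new Error('conflict');
	const row: Row = { id: hash, did, verified: false };
	table.set(domain, row);
	return row;
};

claim('did:plc:alice', 'example.com', 'hash1');
claim('did:plc:bob', 'example.com', 'hash2');  // allowed: still unverified
table.get('example.com')!.verified = true;      // domain gets verified
try {
	claim('did:plc:alice', 'example.com', 'hash3'); // now rejected
} catch (e) {
	console.log((e as Error).message);              // "conflict"
}
```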
+
+
export const updateCustomDomainRkey = async (id: string, rkey: string | null) => {
+
const rows = await db`
+
UPDATE custom_domains
+
SET rkey = ${rkey}
+
WHERE id = ${id}
+
RETURNING *
+
`;
+
return rows[0] ?? null;
+
};
+
+
export const updateCustomDomainVerification = async (id: string, verified: boolean) => {
+
const rows = await db`
+
UPDATE custom_domains
+
SET verified = ${verified}, last_verified_at = EXTRACT(EPOCH FROM NOW())
+
WHERE id = ${id}
+
RETURNING *
+
`;
+
return rows[0] ?? null;
+
};
+
+
export const deleteCustomDomain = async (id: string) => {
+
await db`DELETE FROM custom_domains WHERE id = ${id}`;
+
};
+
+
export const getSitesByDid = async (did: string) => {
+
const rows = await db`SELECT * FROM sites WHERE did = ${did} ORDER BY created_at DESC`;
+
return rows;
+
};
+
+
export const upsertSite = async (did: string, rkey: string, displayName?: string) => {
+
try {
+
// Only set display_name if provided (not undefined/null/empty)
+
const cleanDisplayName = displayName && displayName.trim() ? displayName.trim() : null;
+
+
await db`
+
INSERT INTO sites (did, rkey, display_name, created_at, updated_at)
+
VALUES (${did}, ${rkey}, ${cleanDisplayName}, EXTRACT(EPOCH FROM NOW()), EXTRACT(EPOCH FROM NOW()))
+
ON CONFLICT (did, rkey)
+
DO UPDATE SET
+
display_name = CASE
+
WHEN EXCLUDED.display_name IS NOT NULL THEN EXCLUDED.display_name
+
ELSE sites.display_name
+
END,
+
updated_at = EXTRACT(EPOCH FROM NOW())
+
`;
+
return { success: true };
+
} catch (err) {
+
console.error('Failed to upsert site', err);
+
return { success: false, error: err };
+
}
+
};
+
+
export const deleteSite = async (did: string, rkey: string) => {
+
try {
+
await db`DELETE FROM sites WHERE did = ${did} AND rkey = ${rkey}`;
+
return { success: true };
+
} catch (err) {
+
console.error('Failed to delete site', err);
+
return { success: false, error: err };
+
}
+
};
+
+
// Get all domains (wisp + custom) mapped to a specific site
+
export const getDomainsBySite = async (did: string, rkey: string) => {
+
const domains: Array<{
+
type: 'wisp' | 'custom';
+
domain: string;
+
verified?: boolean;
+
id?: string;
+
}> = [];
+
+
// Check wisp domain
+
const wispDomain = await db`
+
SELECT domain, rkey FROM domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
if (wispDomain.length > 0) {
+
domains.push({
+
type: 'wisp',
+
domain: wispDomain[0].domain,
+
});
+
}
+
+
// Check custom domains
+
const customDomains = await db`
+
SELECT id, domain, verified FROM custom_domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
ORDER BY created_at DESC
+
`;
+
for (const cd of customDomains) {
+
domains.push({
+
type: 'custom',
+
domain: cd.domain,
+
verified: cd.verified,
+
id: cd.id,
+
});
+
}
+
+
return domains;
+
};
+
+
// Get count of domains mapped to a specific site
+
export const getDomainCountBySite = async (did: string, rkey: string) => {
+
const wispCount = await db`
+
SELECT COUNT(*) as count FROM domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
+
const customCount = await db`
+
SELECT COUNT(*) as count FROM custom_domains
+
WHERE did = ${did} AND rkey = ${rkey}
+
`;
+
+
return {
+
wisp: Number(wispCount[0]?.count || 0),
+
custom: Number(customCount[0]?.count || 0),
+
total: Number(wispCount[0]?.count || 0) + Number(customCount[0]?.count || 0),
+
};
+
};
+
+
// Cookie secret management - ensure we have a secret for signing cookies
+
export const getCookieSecret = async (): Promise<string> => {
+
// Check if secret already exists
+
const rows = await db`SELECT secret FROM cookie_secrets WHERE id = 'default' LIMIT 1`;
+
+
if (rows.length > 0) {
+
return rows[0].secret as string;
+
}
+
+
// Generate new secret if none exists
+
const secret = crypto.randomUUID() + crypto.randomUUID(); // 72 character random string
+
await db`
+
INSERT INTO cookie_secrets (id, secret, created_at)
+
VALUES ('default', ${secret}, EXTRACT(EPOCH FROM NOW()))
+
`;
+
+
console.log('[CookieSecret] Generated new cookie signing secret');
+
return secret;
+
};
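A minimal sketch of how a stored secret like this is typically used for cookie signing — hypothetical `sign`/`verify` helpers, not this app's actual cookie code:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Hypothetical sketch: append an HMAC-SHA256 of the value (keyed by the
// stored secret), and verify with a constant-time comparison.
const sign = (value: string, secret: string): string =>
	`${value}.${createHmac('sha256', secret).update(value).digest('hex')}`;

const verify = (signed: string, secret: string): string | null => {
	const idx = signed.lastIndexOf('.');
	if (idx < 0) return null;
	const value = signed.slice(0, idx);
	const mac = Buffer.from(signed.slice(idx + 1), 'hex');
	const expected = createHmac('sha256', secret).update(value).digest();
	if (mac.length !== expected.length || !timingSafeEqual(mac, expected)) return null;
	return value;
};

const secret = 'example-secret';
const cookie = sign('did:plc:alice', secret);
console.log(verify(cookie, secret));                        // "did:plc:alice"
console.log(verify(sign('did:plc:alice', 'other'), secret)); // null: bad signature
```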
+209
apps/main-app/src/lib/dns-verification-worker.ts
···
+
import { verifyCustomDomain } from './dns-verify';
+
import { db } from './db';
+
+
interface VerificationStats {
+
totalChecked: number;
+
verified: number;
+
failed: number;
+
errors: number;
+
}
+
+
export class DNSVerificationWorker {
+
private interval: Timer | null = null;
+
private isRunning = false;
+
private lastRunTime: number | null = null;
+
private stats: VerificationStats = {
+
totalChecked: 0,
+
verified: 0,
+
failed: 0,
+
errors: 0,
+
};
+
+
constructor(
+
private checkIntervalMs: number = 60 * 60 * 1000, // 1 hour default
+
private onLog?: (message: string, data?: any) => void
+
) {}
+
+
private log(message: string, data?: any) {
+
if (this.onLog) {
+
this.onLog(message, data);
+
}
+
}
+
+
async start() {
+
if (this.isRunning) {
+
this.log('DNS verification worker already running');
+
return;
+
}
+
+
this.isRunning = true;
+
this.log('Starting DNS verification worker', {
+
intervalMinutes: this.checkIntervalMs / 60000,
+
});
+
+
// Run immediately on start
+
await this.verifyAllDomains();
+
+
// Then run on interval
+
this.interval = setInterval(() => {
+
this.verifyAllDomains();
+
}, this.checkIntervalMs);
+
}
+
+
stop() {
+
if (this.interval) {
+
clearInterval(this.interval);
+
this.interval = null;
+
}
+
this.isRunning = false;
+
this.log('DNS verification worker stopped');
+
}
+
+
private async verifyAllDomains() {
+
this.log('Starting DNS verification check');
+
const startTime = Date.now();
+
+
const runStats: VerificationStats = {
+
totalChecked: 0,
+
verified: 0,
+
failed: 0,
+
errors: 0,
+
};
+
+
try {
+
// Get all custom domains (both verified and pending)
+
const domains = await db<Array<{
+
id: string;
+
domain: string;
+
did: string;
+
verified: boolean;
+
}>>`
+
SELECT id, domain, did, verified FROM custom_domains
+
`;
+
+
if (!domains || domains.length === 0) {
+
this.log('No custom domains to check');
+
this.lastRunTime = Date.now();
+
return;
+
}
+
+
const verifiedCount = domains.filter(d => d.verified).length;
+
const pendingCount = domains.filter(d => !d.verified).length;
+
this.log(`Checking ${domains.length} custom domains (${verifiedCount} verified, ${pendingCount} pending)`);
+
+
// Verify each domain
+
for (const row of domains) {
+
runStats.totalChecked++;
+
const { id, domain, did, verified: wasVerified } = row;
+
+
try {
+
// Extract hash from id (SHA256 of did:domain)
+
const expectedHash = id.substring(0, 16);
+
+
// Verify DNS records - this will only verify if TXT record matches this specific DID
+
const result = await verifyCustomDomain(domain, did, expectedHash);
+
+
if (result.verified) {
+
// Double-check: ensure this record is still the current owner in database
+
// This prevents race conditions where domain ownership changed during verification
+
const currentOwner = await db<Array<{ id: string; did: string; verified: boolean }>>`
+
SELECT id, did, verified FROM custom_domains WHERE domain = ${domain}
+
`;
+
+
const isStillOwner = currentOwner.length > 0 && currentOwner[0].id === id;
+
+
if (!isStillOwner) {
+
this.log(`โš ๏ธ Domain ownership changed during verification: ${domain}`, {
+
expectedId: id,
+
expectedDid: did,
+
actualId: currentOwner[0]?.id,
+
actualDid: currentOwner[0]?.did
+
});
+
runStats.failed++;
+
continue;
+
}
+
+
// Update verified status and last_verified_at timestamp
+
await db`
+
UPDATE custom_domains
+
SET verified = true,
+
last_verified_at = EXTRACT(EPOCH FROM NOW())
+
WHERE id = ${id}
+
`;
+
runStats.verified++;
+
if (!wasVerified) {
+
this.log(`Domain newly verified: ${domain}`, { did });
+
} else {
+
this.log(`Domain re-verified: ${domain}`, { did });
+
}
+
} else {
+
// Mark domain as unverified or keep it pending
+
await db`
+
UPDATE custom_domains
+
SET verified = false,
+
last_verified_at = EXTRACT(EPOCH FROM NOW())
+
WHERE id = ${id}
+
`;
+
runStats.failed++;
+
if (wasVerified) {
+
this.log(`Domain verification failed (was verified): ${domain}`, {
+
did,
+
error: result.error,
+
found: result.found,
+
});
+
} else {
+
this.log(`Domain still pending: ${domain}`, {
+
did,
+
error: result.error,
+
found: result.found,
+
});
+
}
+
}
+
} catch (error) {
+
runStats.errors++;
+
this.log(`Error verifying domain: ${domain}`, {
+
did,
+
error: error instanceof Error ? error.message : String(error),
+
});
+
}
+
}
+
+
// Update cumulative stats
+
this.stats.totalChecked += runStats.totalChecked;
+
this.stats.verified += runStats.verified;
+
this.stats.failed += runStats.failed;
+
this.stats.errors += runStats.errors;
+
+
const duration = Date.now() - startTime;
+
this.lastRunTime = Date.now();
+
+
this.log('DNS verification check completed', {
+
duration: `${duration}ms`,
+
...runStats,
+
});
+
} catch (error) {
+
this.log('Fatal error in DNS verification worker', {
+
error: error instanceof Error ? error.message : String(error),
+
});
+
}
+
}
+
+
getHealth() {
+
return {
+
isRunning: this.isRunning,
+
lastRunTime: this.lastRunTime,
+
intervalMs: this.checkIntervalMs,
+
stats: this.stats,
+
healthy: this.isRunning && (
+
this.lastRunTime === null ||
+
Date.now() - this.lastRunTime < this.checkIntervalMs * 2
+
),
+
};
+
}
+
+
// Manual trigger for testing
+
async trigger() {
+
this.log('Manual DNS verification triggered');
+
await this.verifyAllDomains();
+
}
+
}
+172
apps/main-app/src/lib/dns-verify.ts
···
+
import { promises as dns } from 'dns'
+
+
/**
+
* Result of a domain verification process
+
*/
+
export interface VerificationResult {
+
/** Whether the verification was successful */
+
verified: boolean
+
/** Error message if verification failed */
+
error?: string
+
/** DNS records found during verification */
+
found?: {
+
/** TXT records found (used for domain verification) */
+
txt?: string[]
+
/** CNAME record found (used for domain pointing) */
+
cname?: string
+
}
+
}
+
+
/**
+
* Verify domain ownership via TXT record at _wisp.{domain}
+
* Expected format: did:plc:xxx or did:web:xxx
+
*/
+
export const verifyDomainOwnership = async (
+
domain: string,
+
expectedDid: string
+
): Promise<VerificationResult> => {
+
try {
+
const txtDomain = `_wisp.${domain}`
+
+
console.log(`[DNS Verify] Checking TXT record for ${txtDomain}`)
+
console.log(`[DNS Verify] Expected DID: ${expectedDid}`)
+
+
// Query TXT records
+
const records = await dns.resolveTxt(txtDomain)
+
+
// Log what we found
+
const foundTxtValues = records.map((record) => record.join(''))
+
console.log(`[DNS Verify] Found TXT records:`, foundTxtValues)
+
+
// TXT records come as arrays of strings (for multi-part records)
+
// We need to join them and check if any match the expected DID
+
for (const record of records) {
+
const txtValue = record.join('')
+
if (txtValue === expectedDid) {
+
console.log(`[DNS Verify] โœ“ TXT record matches!`)
+
return { verified: true, found: { txt: foundTxtValues } }
+
}
+
}
+
+
console.log(`[DNS Verify] โœ— TXT record does not match`)
+
return {
+
verified: false,
+
error: `TXT record at ${txtDomain} does not match expected DID. Expected: ${expectedDid}`,
+
found: { txt: foundTxtValues }
+
}
+
} catch (err: any) {
+
console.log(`[DNS Verify] โœ— TXT lookup error:`, err.message)
+
if (err.code === 'ENOTFOUND' || err.code === 'ENODATA') {
+
return {
+
verified: false,
+
error: `No TXT record found at _wisp.${domain}`,
+
found: { txt: [] }
+
}
+
}
+
return {
+
verified: false,
+
error: `DNS lookup failed: ${err.message}`,
+
found: { txt: [] }
+
}
+
}
+
}
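The chunk-joining above matters because `dns.resolveTxt` returns `string[][]`: each TXT record may be split into multiple ≤255-byte strings. A minimal sketch with hard-coded records standing in for a live lookup:

```typescript
// Sketch of the TXT-matching logic above: each record is an array of
// chunks that must be joined before comparing against the expected DID.
const records: string[][] = [
	['did:plc:abc', '123xyz'], // one record split into two chunks
	['v=spf1 -all'],           // unrelated record on the same name
];
const expectedDid = 'did:plc:abc123xyz';

const matched = records.some((record) => record.join('') === expectedDid);
console.log(matched); // true: chunks join back into the full DID
```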
+
+
/**
+
* Verify CNAME record points to the expected hash target
+
* For custom domains, we expect: domain CNAME -> {hash}.dns.wisp.place
+
*/
+
export const verifyCNAME = async (
+
domain: string,
+
expectedHash: string
+
): Promise<VerificationResult> => {
+
try {
+
console.log(`[DNS Verify] Checking CNAME record for ${domain}`)
+
const expectedTarget = `${expectedHash}.dns.wisp.place`
+
console.log(`[DNS Verify] Expected CNAME: ${expectedTarget}`)
+
+
// Resolve CNAME for the domain
+
const cname = await dns.resolveCname(domain)
+
+
// Log what we found
+
const foundCname =
+
cname.length > 0
+
? cname[0]?.toLowerCase().replace(/\.$/, '')
+
: null
+
console.log(`[DNS Verify] Found CNAME:`, foundCname || 'none')
+
+
if (cname.length === 0 || !foundCname) {
+
console.log(`[DNS Verify] โœ— No CNAME record found`)
+
return {
+
verified: false,
+
error: `No CNAME record found for ${domain}`,
+
found: { cname: '' }
+
}
+
}
+
+
// Check if CNAME points to the expected target
+
const actualTarget = foundCname
+
+
if (actualTarget === expectedTarget.toLowerCase()) {
+
console.log(`[DNS Verify] โœ“ CNAME record matches!`)
+
return { verified: true, found: { cname: actualTarget } }
+
}
+
+
console.log(`[DNS Verify] โœ— CNAME record does not match`)
+
return {
+
verified: false,
+
error: `CNAME for ${domain} points to ${actualTarget}, expected ${expectedTarget}`,
+
found: { cname: actualTarget }
+
}
+
} catch (err: any) {
+
console.log(`[DNS Verify] โœ— CNAME lookup error:`, err.message)
+
if (err.code === 'ENOTFOUND' || err.code === 'ENODATA') {
+
return {
+
verified: false,
+
error: `No CNAME record found for ${domain}`,
+
found: { cname: '' }
+
}
+
}
+
return {
+
verified: false,
+
error: `DNS lookup failed: ${err.message}`,
+
found: { cname: '' }
+
}
+
}
+
}
+
+
/**
+
* Verify custom domain using TXT record as authoritative proof
+
* CNAME check is optional/advisory - TXT record is sufficient for verification
+
*
+
* This approach works with CNAME flattening (e.g., Cloudflare) where the CNAME
+
* is resolved to A/AAAA records and won't be visible in DNS queries.
+
*/
+
export const verifyCustomDomain = async (
+
domain: string,
+
expectedDid: string,
+
expectedHash: string
+
): Promise<VerificationResult> => {
+
// TXT record is authoritative - it proves ownership
+
const txtResult = await verifyDomainOwnership(domain, expectedDid)
+
if (!txtResult.verified) {
+
return txtResult
+
}
+
+
// CNAME check is advisory only - we still check it for logging/debugging
+
// but don't fail verification if it's missing (could be flattened)
+
const cnameResult = await verifyCNAME(domain, expectedHash)
+
+
// Log CNAME status for debugging, but don't fail on it
+
if (!cnameResult.verified) {
+
console.log(`[DNS Verify] โš ๏ธ CNAME verification failed (may be flattened):`, cnameResult.error)
+
}
+
+
// TXT verification is sufficient
+
return {
+
verified: true,
+
found: {
+
txt: txtResult.found?.txt,
+
cname: cnameResult.found?.cname
+
}
+
}
+
}
+9
apps/main-app/src/lib/logger.ts
···
+
/**
+
* Main app logger using @wisp/observability
+
*
+
* Note: This file is kept for backward compatibility.
+
* New code should import createLogger from @wisp/observability directly.
+
*/
+
import { createLogger } from '@wisp/observability'
+
+
export const logger = createLogger('main-app')
+252
apps/main-app/src/lib/oauth-client.ts
···
+
import { NodeOAuthClient, type ClientMetadata } from "@atproto/oauth-client-node";
+
import { JoseKey } from "@atproto/jwk-jose";
+
import { db } from "./db";
+
import { logger } from "./logger";
+
import { SlingshotHandleResolver } from "./slingshot-handle-resolver";
+
+
// Session timeout configuration (30 days in seconds)
+
const SESSION_TIMEOUT = 30 * 24 * 60 * 60; // 2592000 seconds
+
// OAuth state timeout (1 hour in seconds)
+
const STATE_TIMEOUT = 60 * 60; // 3600 seconds
+
+
const stateStore = {
+
async set(key: string, data: any) {
+
logger.debug('[stateStore] set', { key })
+
const expiresAt = Math.floor(Date.now() / 1000) + STATE_TIMEOUT;
+
await db`
+
INSERT INTO oauth_states (key, data, created_at, expires_at)
+
VALUES (${key}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
+
ON CONFLICT (key) DO UPDATE SET data = EXCLUDED.data, expires_at = ${expiresAt}
+
`;
+
},
+
async get(key: string) {
+
logger.debug('[stateStore] get', { key })
+
const now = Math.floor(Date.now() / 1000);
+
const result = await db`
+
SELECT data, expires_at
+
FROM oauth_states
+
WHERE key = ${key}
+
`;
+
if (!result[0]) return undefined;
+
+
// Check if expired
+
const expiresAt = Number(result[0].expires_at);
+
if (expiresAt && now > expiresAt) {
+
logger.debug('[stateStore] State expired, deleting', { key });
+
await db`DELETE FROM oauth_states WHERE key = ${key}`;
+
return undefined;
+
}
+
+
return JSON.parse(result[0].data);
+
},
+
async del(key: string) {
+
logger.debug('[stateStore] del', { key })
+
await db`DELETE FROM oauth_states WHERE key = ${key}`;
+
}
+
};
+
+
const sessionStore = {
+
async set(sub: string, data: any) {
+
logger.debug('[sessionStore] set', { sub })
+
const expiresAt = Math.floor(Date.now() / 1000) + SESSION_TIMEOUT;
+
await db`
+
INSERT INTO oauth_sessions (sub, data, updated_at, expires_at)
+
VALUES (${sub}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
+
ON CONFLICT (sub) DO UPDATE SET
+
data = EXCLUDED.data,
+
updated_at = EXTRACT(EPOCH FROM NOW()),
+
expires_at = ${expiresAt}
+
`;
+
},
+
async get(sub: string) {
+
const now = Math.floor(Date.now() / 1000);
+
const result = await db`
+
SELECT data, expires_at
+
FROM oauth_sessions
+
WHERE sub = ${sub}
+
`;
+
if (!result[0]) return undefined;
+
+
// Check if expired
+
const expiresAt = Number(result[0].expires_at);
+
if (expiresAt && now > expiresAt) {
+
logger.debug('[sessionStore] Session expired, deleting', { sub });
+
await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
+
return undefined;
+
}
+
+
return JSON.parse(result[0].data);
+
},
+
async del(sub: string) {
+
logger.debug('[sessionStore] del', { sub })
+
await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
+
}
+
};
+
+
export { sessionStore };
+
+
// Cleanup expired sessions and states
+
export const cleanupExpiredSessions = async () => {
+
const now = Math.floor(Date.now() / 1000);
+
try {
+
const sessionsDeleted = await db`
+
DELETE FROM oauth_sessions WHERE expires_at < ${now}
+
`;
+
const statesDeleted = await db`
+
DELETE FROM oauth_states WHERE expires_at IS NOT NULL AND expires_at < ${now}
+
`;
+
logger.info(`[Cleanup] Deleted ${sessionsDeleted.length} expired sessions and ${statesDeleted.length} expired states`);
+
return { sessions: sessionsDeleted.length, states: statesDeleted.length };
+
} catch (err) {
+
logger.error('[Cleanup] Failed to cleanup expired data', err);
+
return { sessions: 0, states: 0 };
+
}
+
};
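The TTL arithmetic both stores rely on is plain epoch-second math; a small sketch:

```typescript
// Sketch of the epoch-second TTL arithmetic used above: rows carry
// expires_at = now + TIMEOUT, and a read past that moment is a miss.
const SESSION_TIMEOUT = 30 * 24 * 60 * 60; // 2592000s = 30 days
const STATE_TIMEOUT = 60 * 60;             // 3600s = 1 hour

const issuedAt = Math.floor(Date.now() / 1000);
const sessionExpiry = issuedAt + SESSION_TIMEOUT;

const isExpired = (expiresAt: number, nowSec: number): boolean => nowSec > expiresAt;

console.log(SESSION_TIMEOUT);                                          // 2592000
console.log(isExpired(sessionExpiry, issuedAt));                       // false: just issued
console.log(isExpired(sessionExpiry, issuedAt + SESSION_TIMEOUT + 1)); // true: past TTL
```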
+
+
export const createClientMetadata = (config: { domain: `http://${string}` | `https://${string}`, clientName: string }): ClientMetadata => {
+
const isLocalDev = Bun.env.LOCAL_DEV === 'true';
+
+
if (isLocalDev) {
+
// Loopback client for local development
+
// For loopback, scopes and redirect_uri must be in client_id query string
+
const redirectUri = 'http://127.0.0.1:8000/api/auth/callback';
+
const scope = 'atproto repo:place.wisp.fs repo:place.wisp.domain repo:place.wisp.subfs repo:place.wisp.settings blob:*/* rpc:app.bsky.actor.getProfile?aud=did:web:api.bsky.app#bsky_appview';
+
const params = new URLSearchParams();
+
params.append('redirect_uri', redirectUri);
+
params.append('scope', scope);
+
+
return {
+
client_id: `http://localhost?${params.toString()}`,
+
client_name: config.clientName,
+
client_uri: `https://wisp.place`,
+
redirect_uris: [redirectUri],
+
grant_types: ['authorization_code', 'refresh_token'],
+
response_types: ['code'],
+
application_type: 'web',
+
token_endpoint_auth_method: 'none',
+
scope: scope,
+
dpop_bound_access_tokens: false,
+
subject_type: 'public',
+
authorization_signed_response_alg: 'ES256'
+
} as ClientMetadata;
+
}
+
+
// Production client with private_key_jwt
+
return {
+
client_id: `${config.domain}/client-metadata.json`,
+
client_name: config.clientName,
+
client_uri: `https://wisp.place`,
+
logo_uri: `${config.domain}/logo.png`,
+
tos_uri: `${config.domain}/tos`,
+
policy_uri: `${config.domain}/policy`,
+
redirect_uris: [`${config.domain}/api/auth/callback`],
+
grant_types: ['authorization_code', 'refresh_token'],
+
response_types: ['code'],
+
application_type: 'web',
+
token_endpoint_auth_method: 'private_key_jwt',
+
token_endpoint_auth_signing_alg: "ES256",
+
scope: "atproto repo:place.wisp.fs repo:place.wisp.domain repo:place.wisp.subfs repo:place.wisp.settings blob:*/* rpc:app.bsky.actor.getProfile?aud=did:web:api.bsky.app#bsky_appview",
+
dpop_bound_access_tokens: true,
+
jwks_uri: `${config.domain}/jwks.json`,
+
subject_type: 'public',
+
authorization_signed_response_alg: 'ES256'
+
} as ClientMetadata;
+
};
+
+
const persistKey = async (key: JoseKey) => {
+
const priv = key.privateJwk;
+
if (!priv) return;
+
const kid = key.kid ?? crypto.randomUUID();
+
await db`
+
INSERT INTO oauth_keys (kid, jwk, created_at)
+
VALUES (${kid}, ${JSON.stringify(priv)}, EXTRACT(EPOCH FROM NOW()))
+
ON CONFLICT (kid) DO UPDATE SET jwk = EXCLUDED.jwk
+
`;
+
};
+
+
const loadPersistedKeys = async (): Promise<JoseKey[]> => {
+
const rows = await db`SELECT kid, jwk, created_at FROM oauth_keys ORDER BY kid`;
+
const keys: JoseKey[] = [];
+
for (const row of rows) {
+
try {
+
const obj = JSON.parse(row.jwk);
+
const key = await JoseKey.fromImportable(obj as any, (obj as any).kid);
+
keys.push(key);
+
} catch (err) {
+
logger.error('[OAuth] Could not parse stored JWK', err);
+
}
+
}
+
return keys;
+
};
+
+
const ensureKeys = async (): Promise<JoseKey[]> => {
+
let keys = await loadPersistedKeys();
+
const needed: string[] = [];
+
for (let i = 1; i <= 3; i++) {
+
const kid = `key${i}`;
+
if (!keys.some(k => k.kid === kid)) needed.push(kid);
+
}
+
for (const kid of needed) {
+
const newKey = await JoseKey.generate(['ES256'], kid);
+
await persistKey(newKey);
+
keys.push(newKey);
+
}
+
keys.sort((a, b) => (a.kid ?? '').localeCompare(b.kid ?? ''));
+
return keys;
+
};
+
+
// Load keys from database every time (stateless - safe for horizontal scaling)
+
export const getCurrentKeys = async (): Promise<JoseKey[]> => {
+
return await loadPersistedKeys();
+
};
+
+
// Key rotation - rotate keys older than 30 days (monthly rotation)
+
const KEY_MAX_AGE = 30 * 24 * 60 * 60; // 30 days in seconds
+
+
export const rotateKeysIfNeeded = async (): Promise<boolean> => {
+
const now = Math.floor(Date.now() / 1000);
+
const cutoffTime = now - KEY_MAX_AGE;
+
+
try {
+
// Find keys older than 30 days
+
const oldKeys = await db`
+
SELECT kid, created_at FROM oauth_keys
+
WHERE created_at IS NOT NULL AND created_at < ${cutoffTime}
+
ORDER BY created_at ASC
+
`;
+
+
if (oldKeys.length === 0) {
+
logger.debug('[KeyRotation] No keys need rotation');
+
return false;
+
}
+
+
logger.info(`[KeyRotation] Found ${oldKeys.length} key(s) older than 30 days, rotating oldest key`);
+
+
// Rotate the oldest key
+
const oldestKey = oldKeys[0];
+
const oldKid = oldestKey.kid;
+
+
// Generate new key with same kid
+
const newKey = await JoseKey.generate(['ES256'], oldKid);
+
await persistKey(newKey);
+
+
logger.info(`[KeyRotation] Rotated key ${oldKid}`);
+
+
return true;
+
} catch (err) {
+
logger.error('[KeyRotation] Failed to rotate keys', err);
+
return false;
+
}
+
};
+
+
export const getOAuthClient = async (config: { domain: `http://${string}` | `https://${string}`, clientName: string }) => {
+
const keys = await ensureKeys();
+
+
return new NodeOAuthClient({
+
clientMetadata: createClientMetadata(config),
+
keyset: keys,
+
stateStore,
+
sessionStore,
+
handleResolver: new SlingshotHandleResolver()
+
});
+
};
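The rotation policy above can be sketched in isolation: any key older than `KEY_MAX_AGE` is a candidate, but only the single oldest key is replaced per pass. `StoredKey` and `pickKeyToRotate` are illustrative names (the real code reads `created_at` rows from the `oauth_keys` table), a minimal sketch rather than the actual implementation.

```typescript
// Illustrative sketch of rotateKeysIfNeeded's selection logic:
// keys past the cutoff are candidates; only the oldest is rotated.
const KEY_MAX_AGE = 30 * 24 * 60 * 60; // 30 days in seconds

interface StoredKey { kid: string; createdAt: number }

function pickKeyToRotate(keys: StoredKey[], now: number): string | null {
  const cutoff = now - KEY_MAX_AGE;
  const old = keys
    .filter((k) => k.createdAt < cutoff)
    .sort((a, b) => a.createdAt - b.createdAt);
  return old.length > 0 ? old[0].kid : null;
}
```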
+81
apps/main-app/src/lib/slingshot-handle-resolver.ts
···
+
import type { HandleResolver, ResolveHandleOptions, ResolvedHandle } from '@atproto-labs/handle-resolver';
+
import type { AtprotoDid } from '@atproto/did';
+
import { logger } from './logger';
+
+
/**
+
* Custom HandleResolver that uses Slingshot's identity resolver service
+
* to work around bugs in atproto-oauth-node when handles have redirects
+
* in their well-known configuration.
+
*
+
* Uses: https://slingshot.wisp.place/xrpc/com.atproto.identity.resolveHandle
+
*/
+
export class SlingshotHandleResolver implements HandleResolver {
+
private readonly endpoint = 'https://slingshot.wisp.place/xrpc/com.atproto.identity.resolveHandle';
+
+
async resolve(handle: string, options?: ResolveHandleOptions): Promise<ResolvedHandle> {
+
try {
+
logger.debug('[SlingshotHandleResolver] Resolving handle', { handle });
+
+
const url = new URL(this.endpoint);
+
url.searchParams.set('handle', handle);
+
+
const controller = new AbortController();
+
const timeoutId = setTimeout(() => controller.abort(), 5000); // 5s timeout
+
+
try {
+
const response = await fetch(url.toString(), {
+
signal: options?.signal || controller.signal,
+
headers: {
+
'Accept': 'application/json',
+
},
+
});
+
+
clearTimeout(timeoutId);
+
+
if (!response.ok) {
+
logger.error('[SlingshotHandleResolver] Failed to resolve handle', {
+
handle,
+
status: response.status,
+
statusText: response.statusText,
+
});
+
return null;
+
}
+
+
const data = await response.json() as { did: string };
+
+
if (!data.did) {
+
logger.warn('[SlingshotHandleResolver] No DID in response', { handle });
+
return null;
+
}
+
+
// Validate that it's a proper DID format
+
if (!data.did.startsWith('did:')) {
+
logger.error('[SlingshotHandleResolver] Invalid DID format', { handle, did: data.did });
+
return null;
+
}
+
+
logger.debug('[SlingshotHandleResolver] Successfully resolved handle', { handle, did: data.did });
+
return data.did as AtprotoDid;
+
} catch (fetchError) {
+
clearTimeout(timeoutId);
+
+
if (fetchError instanceof Error && fetchError.name === 'AbortError') {
+
logger.error('[SlingshotHandleResolver] Request aborted', { handle });
+
throw fetchError; // Re-throw abort errors
+
}
+
+
throw fetchError;
+
}
+
} catch (error) {
+
logger.error('[SlingshotHandleResolver] Error resolving handle', error, { handle });
+
+
// If it's an abort error, propagate it
+
if (error instanceof Error && error.name === 'AbortError') {
+
throw error;
+
}
+
+
// For other unexpected errors, return null (handle not found)
+
return null;
+
}
+
}
+
}
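The resolver's request and validation shape can be summarized as follows. `buildResolveUrl` and `looksLikeDid` are illustrative helpers, not exports of the app: the endpoint takes a `?handle=` query parameter, and the response's `did` field must at least start with `did:` before it is trusted.

```typescript
// Sketch of SlingshotHandleResolver's URL construction and DID sanity check.
const ENDPOINT = 'https://slingshot.wisp.place/xrpc/com.atproto.identity.resolveHandle';

function buildResolveUrl(handle: string): string {
  const url = new URL(ENDPOINT);
  url.searchParams.set('handle', handle);
  return url.toString();
}

function looksLikeDid(value: unknown): value is string {
  return typeof value === 'string' && value.startsWith('did:');
}
```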
+90
apps/main-app/src/lib/sync-sites.ts
···
+
import { Agent } from '@atproto/api'
+
import type { OAuthSession } from '@atproto/oauth-client-node'
+
import { upsertSite } from './db'
+
+
/**
+
* Sync sites from user's PDS into the database cache
+
* - Fetches all place.wisp.fs records from AT Protocol repo
+
* - Validates record structure
+
* - Backfills into sites table
+
*/
+
export async function syncSitesFromPDS(
+
did: string,
+
session: OAuthSession
+
): Promise<{ synced: number; errors: string[] }> {
+
console.log(`[Sync] Starting site sync for ${did}`)
+
+
const agent = new Agent((url, init) => session.fetchHandler(url, init))
+
const errors: string[] = []
+
let synced = 0
+
+
try {
+
// List all records in the place.wisp.fs collection
+
console.log('[Sync] Fetching place.wisp.fs records from PDS')
+
const records = await agent.com.atproto.repo.listRecords({
+
repo: did,
+
collection: 'place.wisp.fs',
+
limit: 100 // Single page; add cursor pagination if users can exceed 100 sites
+
})
+
+
console.log(`[Sync] Found ${records.data.records.length} records`)
+
+
// Process each record
+
for (const record of records.data.records) {
+
try {
+
const { uri, value } = record
+
+
// Extract rkey from URI (at://did/collection/rkey)
+
const rkey = uri.split('/').pop()
+
if (!rkey) {
+
errors.push(`Invalid URI format: ${uri}`)
+
continue
+
}
+
+
// Validate record structure
+
if (!value || typeof value !== 'object') {
+
errors.push(`Invalid record value for ${rkey}`)
+
continue
+
}
+
+
const siteValue = value as any
+
+
// Check for required fields
+
if (siteValue.$type !== 'place.wisp.fs') {
+
errors.push(
+
`Invalid $type for ${rkey}: ${siteValue.$type}`
+
)
+
continue
+
}
+
+
if (!siteValue.site || typeof siteValue.site !== 'string') {
+
errors.push(`Missing or invalid site name for ${rkey}`)
+
continue
+
}
+
+
// Upsert into database
+
const displayName = siteValue.site
+
await upsertSite(did, rkey, displayName)
+
+
console.log(
+
`[Sync] ✓ Synced site: ${displayName} (${rkey})`
+
)
+
synced++
+
} catch (err) {
+
const errorMsg = `Error processing record: ${err instanceof Error ? err.message : 'Unknown error'}`
+
console.error(`[Sync] ${errorMsg}`)
+
errors.push(errorMsg)
+
}
+
}
+
+
console.log(
+
`[Sync] Complete: ${synced} synced, ${errors.length} errors`
+
)
+
return { synced, errors }
+
} catch (err) {
+
const errorMsg = `Failed to fetch records from PDS: ${err instanceof Error ? err.message : 'Unknown error'}`
+
console.error(`[Sync] ${errorMsg}`)
+
errors.push(errorMsg)
+
return { synced, errors }
+
}
+
}
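The per-record checks `syncSitesFromPDS` performs before caching a site can be sketched as a pure function: extract the rkey from the `at://` URI, then validate the `$type` and `site` fields. `parseSiteRecord` and its error-string-vs-result return type are illustrative, not part of the app.

```typescript
// Sketch of the validation applied to each place.wisp.fs record.
interface ParsedSite { rkey: string; displayName: string }

function parseSiteRecord(uri: string, value: unknown): ParsedSite | string {
  // Extract rkey from URI (at://did/collection/rkey)
  const rkey = uri.split('/').pop();
  if (!rkey) return `Invalid URI format: ${uri}`;
  if (!value || typeof value !== 'object') return `Invalid record value for ${rkey}`;
  const rec = value as { $type?: unknown; site?: unknown };
  if (rec.$type !== 'place.wisp.fs') return `Invalid $type for ${rkey}: ${rec.$type}`;
  if (!rec.site || typeof rec.site !== 'string') return `Missing or invalid site name for ${rkey}`;
  return { rkey, displayName: rec.site };
}
```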
+10
apps/main-app/src/lib/types.ts
···
+
/**
+
* Configuration for the Wisp client
+
*/
+
export type Config = {
+
/** The base domain URL with HTTP or HTTPS protocol */
+
domain: `http://${string}` | `https://${string}`,
+
/** Name of the client application */
+
clientName: string
+
};
+202
apps/main-app/src/lib/upload-jobs.ts
···
+
import { createLogger } from '@wisp/observability';
+
+
const logger = createLogger('main-app');
+
+
export type UploadJobStatus = 'pending' | 'processing' | 'uploading' | 'completed' | 'failed';
+
+
export interface UploadProgress {
+
filesProcessed: number;
+
totalFiles: number;
+
filesUploaded: number;
+
filesReused: number;
+
currentFile?: string;
+
currentFileStatus?: 'checking' | 'uploading' | 'uploaded' | 'reused' | 'failed';
+
phase: 'validating' | 'compressing' | 'uploading' | 'creating_manifest' | 'finalizing' | 'done';
+
}
+
+
export interface UploadJob {
+
id: string;
+
did: string;
+
siteName: string;
+
status: UploadJobStatus;
+
progress: UploadProgress;
+
result?: {
+
success: boolean;
+
uri?: string;
+
cid?: string;
+
fileCount?: number;
+
siteName?: string;
+
skippedFiles?: Array<{ name: string; reason: string }>;
+
failedFiles?: Array<{ name: string; index: number; error: string; size: number }>;
+
uploadedCount?: number;
+
hasFailures?: boolean;
+
};
+
error?: string;
+
createdAt: number;
+
updatedAt: number;
+
}
+
+
// In-memory job storage
+
const jobs = new Map<string, UploadJob>();
+
+
// SSE connections for each job
+
const jobListeners = new Map<string, Set<(event: string, data: any) => void>>();
+
+
// Cleanup old jobs after 1 hour
+
const JOB_TTL = 60 * 60 * 1000;
+
+
export function createUploadJob(did: string, siteName: string, totalFiles: number): string {
+
const id = crypto.randomUUID();
+
const now = Date.now();
+
+
const job: UploadJob = {
+
id,
+
did,
+
siteName,
+
status: 'pending',
+
progress: {
+
filesProcessed: 0,
+
totalFiles,
+
filesUploaded: 0,
+
filesReused: 0,
+
phase: 'validating'
+
},
+
createdAt: now,
+
updatedAt: now
+
};
+
+
jobs.set(id, job);
+
logger.info(`Upload job created: ${id} for ${did}/${siteName} (${totalFiles} files)`);
+
+
// Schedule cleanup
+
setTimeout(() => {
+
jobs.delete(id);
+
jobListeners.delete(id);
+
logger.info(`Upload job cleaned up: ${id}`);
+
}, JOB_TTL);
+
+
return id;
+
}
+
+
export function getUploadJob(id: string): UploadJob | undefined {
+
return jobs.get(id);
+
}
+
+
export function updateUploadJob(
+
id: string,
+
updates: Partial<Omit<UploadJob, 'id' | 'did' | 'siteName' | 'createdAt'>>
+
): void {
+
const job = jobs.get(id);
+
if (!job) {
+
logger.warn(`Attempted to update non-existent job: ${id}`);
+
return;
+
}
+
+
Object.assign(job, updates, { updatedAt: Date.now() });
+
jobs.set(id, job);
+
+
// Notify all listeners
+
const listeners = jobListeners.get(id);
+
if (listeners && listeners.size > 0) {
+
const eventData = {
+
status: job.status,
+
progress: job.progress,
+
result: job.result,
+
error: job.error
+
};
+
+
const failedListeners: Array<(event: string, data: any) => void> = [];
+
listeners.forEach(listener => {
+
try {
+
listener('progress', eventData);
+
} catch (err) {
+
// Client disconnected, remove this listener
+
failedListeners.push(listener);
+
}
+
});
+
+
// Remove failed listeners
+
failedListeners.forEach(listener => listeners.delete(listener));
+
}
+
}
+
+
export function completeUploadJob(id: string, result: UploadJob['result']): void {
+
const job = getUploadJob(id);
+
if (!job) return;
+
updateUploadJob(id, {
+
status: 'completed',
+
progress: {
+
...job.progress,
+
phase: 'done'
+
},
+
result
+
});
+
+
// Send final event and close connections
+
setTimeout(() => {
+
const listeners = jobListeners.get(id);
+
if (listeners) {
+
listeners.forEach(listener => {
+
try {
+
listener('done', result);
+
} catch (err) {
+
// Client already disconnected, ignore
+
}
+
});
+
jobListeners.delete(id);
+
}
+
}, 100);
+
}
+
+
export function failUploadJob(id: string, error: string): void {
+
updateUploadJob(id, {
+
status: 'failed',
+
error
+
});
+
+
// Send error event and close connections
+
setTimeout(() => {
+
const listeners = jobListeners.get(id);
+
if (listeners) {
+
listeners.forEach(listener => {
+
try {
+
listener('error', { error });
+
} catch (err) {
+
// Client already disconnected, ignore
+
}
+
});
+
jobListeners.delete(id);
+
}
+
}, 100);
+
}
+
+
export function addJobListener(jobId: string, listener: (event: string, data: any) => void): () => void {
+
if (!jobListeners.has(jobId)) {
+
jobListeners.set(jobId, new Set());
+
}
+
jobListeners.get(jobId)!.add(listener);
+
+
// Return cleanup function
+
return () => {
+
const listeners = jobListeners.get(jobId);
+
if (listeners) {
+
listeners.delete(listener);
+
if (listeners.size === 0) {
+
jobListeners.delete(jobId);
+
}
+
}
+
};
+
}
+
+
export function updateJobProgress(
+
jobId: string,
+
progressUpdate: Partial<UploadProgress>
+
): void {
+
const job = getUploadJob(jobId);
+
if (!job) return;
+
+
updateUploadJob(jobId, {
+
progress: {
+
...job.progress,
+
...progressUpdate
+
}
+
});
+
}
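The SSE fan-out used by the job store above follows a simple pattern: a `Set` of listeners per job id, where any listener that throws during notification is treated as a disconnected client and pruned. This is a self-contained sketch of that pattern with illustrative names, not the module's actual exports.

```typescript
// Minimal per-job listener registry with prune-on-throw semantics.
type Listener = (event: string, data: unknown) => void;

const registry = new Map<string, Set<Listener>>();

function addListener(jobId: string, listener: Listener): () => void {
  if (!registry.has(jobId)) registry.set(jobId, new Set());
  registry.get(jobId)!.add(listener);
  // Return cleanup function, mirroring addJobListener above.
  return () => {
    const set = registry.get(jobId);
    if (set) {
      set.delete(listener);
      if (set.size === 0) registry.delete(jobId);
    }
  };
}

function notify(jobId: string, event: string, data: unknown): number {
  const set = registry.get(jobId);
  if (!set) return 0;
  const failed: Listener[] = [];
  for (const l of set) {
    try {
      l(event, data);
    } catch {
      failed.push(l); // throwing listener = disconnected client
    }
  }
  failed.forEach((l) => set.delete(l));
  return set.size; // listeners still attached after pruning
}
```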
+38
apps/main-app/src/lib/wisp-auth.ts
···
+
import { Did } from "@atproto/api";
+
import { NodeOAuthClient } from "@atproto/oauth-client-node";
+
import type { OAuthSession } from "@atproto/oauth-client-node";
+
import { Cookie } from "elysia";
+
import { logger } from "./logger";
+
+
+
export interface AuthenticatedContext {
+
did: Did;
+
session: OAuthSession;
+
}
+
+
export const authenticateRequest = async (
+
client: NodeOAuthClient,
+
cookies: Record<string, Cookie<unknown>>
+
): Promise<AuthenticatedContext | null> => {
+
try {
+
const did = cookies.did?.value as Did;
+
if (!did) return null;
+
+
const session = await client.restore(did, "auto");
+
return session ? { did, session } : null;
+
} catch (err) {
+
logger.error('[Auth] Authentication error', err);
+
return null;
+
}
+
};
+
+
export const requireAuth = async (
+
client: NodeOAuthClient,
+
cookies: Record<string, Cookie<unknown>>
+
): Promise<AuthenticatedContext> => {
+
const auth = await authenticateRequest(client, cookies);
+
if (!auth) {
+
throw new Error('Authentication required');
+
}
+
return auth;
+
};
+408
apps/main-app/src/routes/admin.ts
···
+
// Admin API routes
+
import { Elysia, t } from 'elysia'
+
import { adminAuth, requireAdmin } from '../lib/admin-auth'
+
import { logCollector, errorTracker, metricsCollector } from '@wisp/observability'
+
import { db } from '../lib/db'
+
+
export const adminRoutes = (cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/admin',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
}
+
})
+
// Login
+
.post(
+
'/login',
+
async ({ body, cookie, set }) => {
+
const { username, password } = body
+
+
const valid = await adminAuth.verify(username, password)
+
if (!valid) {
+
set.status = 401
+
return { error: 'Invalid credentials' }
+
}
+
+
const sessionId = adminAuth.createSession(username)
+
+
// Set cookie
+
cookie.admin_session.set({
+
value: sessionId,
+
httpOnly: true,
+
secure: process.env.NODE_ENV === 'production',
+
sameSite: 'lax',
+
maxAge: 24 * 60 * 60 // 24 hours
+
})
+
+
return { success: true }
+
},
+
{
+
body: t.Object({
+
username: t.String(),
+
password: t.String()
+
}),
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
}
+
)
+
+
// Logout
+
.post('/logout', ({ cookie }) => {
+
const sessionId = cookie.admin_session?.value
+
if (sessionId && typeof sessionId === 'string') {
+
adminAuth.deleteSession(sessionId)
+
}
+
cookie.admin_session.remove()
+
return { success: true }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Check auth status
+
.get('/status', ({ cookie }) => {
+
const sessionId = cookie.admin_session?.value
+
if (!sessionId || typeof sessionId !== 'string') {
+
return { authenticated: false }
+
}
+
+
const session = adminAuth.verifySession(sessionId)
+
if (!session) {
+
return { authenticated: false }
+
}
+
+
return {
+
authenticated: true,
+
username: session.username
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get logs (protected)
+
.get('/logs', async ({ query, cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
const filter: any = {}
+
+
if (query.level) filter.level = query.level
+
if (query.service) filter.service = query.service
+
if (query.search) filter.search = query.search
+
if (query.eventType) filter.eventType = query.eventType
+
if (query.limit) filter.limit = parseInt(query.limit as string)
+
+
// Get logs from main app
+
const mainLogs = logCollector.getLogs(filter)
+
+
// Get logs from hosting service
+
let hostingLogs: any[] = []
+
try {
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const params = new URLSearchParams()
+
if (query.level) params.append('level', query.level as string)
+
if (query.service) params.append('service', query.service as string)
+
if (query.search) params.append('search', query.search as string)
+
if (query.eventType) params.append('eventType', query.eventType as string)
+
params.append('limit', String(filter.limit || 100))
+
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/logs?${params}`)
+
if (response.ok) {
+
const data = await response.json()
+
hostingLogs = data.logs
+
}
+
} catch (err) {
+
// Hosting service might not be running
+
}
+
+
// Merge and sort by timestamp
+
const allLogs = [...mainLogs, ...hostingLogs].sort((a, b) =>
+
new Date(b.timestamp).getTime() - new Date(a.timestamp).getTime()
+
)
+
+
return { logs: allLogs.slice(0, filter.limit || 100) }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get errors (protected)
+
.get('/errors', async ({ query, cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
const filter: any = {}
+
+
if (query.service) filter.service = query.service
+
if (query.limit) filter.limit = parseInt(query.limit as string)
+
+
// Get errors from main app
+
const mainErrors = errorTracker.getErrors(filter)
+
+
// Get errors from hosting service
+
let hostingErrors: any[] = []
+
try {
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const params = new URLSearchParams()
+
if (query.service) params.append('service', query.service as string)
+
params.append('limit', String(filter.limit || 100))
+
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/errors?${params}`)
+
if (response.ok) {
+
const data = await response.json()
+
hostingErrors = data.errors
+
}
+
} catch (err) {
+
// Hosting service might not be running
+
}
+
+
// Merge and sort by last seen
+
const allErrors = [...mainErrors, ...hostingErrors].sort((a, b) =>
+
new Date(b.lastSeen).getTime() - new Date(a.lastSeen).getTime()
+
)
+
+
return { errors: allErrors.slice(0, filter.limit || 100) }
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get metrics (protected)
+
.get('/metrics', async ({ query, cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
const timeWindow = query.timeWindow
+
? parseInt(query.timeWindow as string)
+
: 3600000 // 1 hour default
+
+
const mainAppStats = metricsCollector.getStats('main-app', timeWindow)
+
const overallStats = metricsCollector.getStats(undefined, timeWindow)
+
+
// Get hosting service stats from its own endpoint
+
let hostingServiceStats = {
+
totalRequests: 0,
+
avgDuration: 0,
+
p50Duration: 0,
+
p95Duration: 0,
+
p99Duration: 0,
+
errorRate: 0,
+
requestsPerMinute: 0
+
}
+
+
try {
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/metrics?timeWindow=${timeWindow}`)
+
if (response.ok) {
+
const data = await response.json()
+
hostingServiceStats = data.stats
+
}
+
} catch (err) {
+
// Hosting service might not be running
+
}
+
+
return {
+
overall: overallStats,
+
mainApp: mainAppStats,
+
hostingService: hostingServiceStats,
+
timeWindow
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get database stats (protected)
+
.get('/database', async ({ cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
try {
+
// Get total counts
+
const allSitesResult = await db`SELECT COUNT(*) as count FROM sites`
+
const wispSubdomainsResult = await db`SELECT COUNT(*) as count FROM domains WHERE domain LIKE '%.wisp.place'`
+
const customDomainsResult = await db`SELECT COUNT(*) as count FROM custom_domains WHERE verified = true`
+
+
// Get recent sites (including those without domains)
+
const recentSites = await db`
+
SELECT
+
s.did,
+
s.rkey,
+
s.display_name,
+
s.created_at,
+
d.domain as subdomain
+
FROM sites s
+
LEFT JOIN domains d ON s.did = d.did AND s.rkey = d.rkey AND d.domain LIKE '%.wisp.place'
+
ORDER BY s.created_at DESC
+
LIMIT 10
+
`
+
+
// Get recent domains
+
const recentDomains = await db`SELECT domain, did, rkey, verified, created_at FROM custom_domains ORDER BY created_at DESC LIMIT 10`
+
+
return {
+
stats: {
+
totalSites: allSitesResult[0].count,
+
totalWispSubdomains: wispSubdomainsResult[0].count,
+
totalCustomDomains: customDomainsResult[0].count
+
},
+
recentSites: recentSites,
+
recentDomains: recentDomains
+
}
+
} catch (error) {
+
set.status = 500
+
return {
+
error: 'Failed to fetch database stats',
+
message: error instanceof Error ? error.message : String(error)
+
}
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get cache stats (protected)
+
.get('/cache', async ({ cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
try {
+
const hostingServiceUrl = process.env.HOSTING_SERVICE_URL || `http://localhost:${process.env.HOSTING_PORT || '3001'}`
+
const response = await fetch(`${hostingServiceUrl}/__internal__/observability/cache`)
+
+
if (response.ok) {
+
const data = await response.json()
+
return data
+
} else {
+
set.status = 503
+
return {
+
error: 'Failed to fetch cache stats from hosting service',
+
message: 'Hosting service unavailable'
+
}
+
}
+
} catch (error) {
+
set.status = 500
+
return {
+
error: 'Failed to fetch cache stats',
+
message: error instanceof Error ? error.message : String(error)
+
}
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get sites listing (protected)
+
.get('/sites', async ({ query, cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
const limit = query.limit ? parseInt(query.limit as string) : 50
+
const offset = query.offset ? parseInt(query.offset as string) : 0
+
+
try {
+
const sites = await db`
+
SELECT
+
s.did,
+
s.rkey,
+
s.display_name,
+
s.created_at,
+
d.domain as subdomain
+
FROM sites s
+
LEFT JOIN domains d ON s.did = d.did AND s.rkey = d.rkey AND d.domain LIKE '%.wisp.place'
+
ORDER BY s.created_at DESC
+
LIMIT ${limit} OFFSET ${offset}
+
`
+
+
const customDomains = await db`
+
SELECT
+
domain,
+
did,
+
rkey,
+
verified,
+
created_at
+
FROM custom_domains
+
ORDER BY created_at DESC
+
LIMIT ${limit} OFFSET ${offset}
+
`
+
+
return {
+
sites: sites,
+
customDomains: customDomains
+
}
+
} catch (error) {
+
set.status = 500
+
return {
+
error: 'Failed to fetch sites',
+
message: error instanceof Error ? error.message : String(error)
+
}
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+
// Get system health (protected)
+
.get('/health', ({ cookie, set }) => {
+
const check = requireAdmin({ cookie, set })
+
if (check) return check
+
+
const uptime = process.uptime()
+
const memory = process.memoryUsage()
+
+
return {
+
uptime: Math.floor(uptime),
+
memory: {
+
heapUsed: Math.round(memory.heapUsed / 1024 / 1024), // MB
+
heapTotal: Math.round(memory.heapTotal / 1024 / 1024), // MB
+
rss: Math.round(memory.rss / 1024 / 1024) // MB
+
},
+
timestamp: new Date().toISOString()
+
}
+
}, {
+
cookie: t.Cookie({
+
admin_session: t.Optional(t.String())
+
}, {
+
secrets: cookieSecret,
+
sign: ['admin_session']
+
})
+
})
+
+128
apps/main-app/src/routes/auth.ts
···
+
import { Elysia, t } from 'elysia'
+
import { NodeOAuthClient } from '@atproto/oauth-client-node'
+
import { getSitesByDid, getDomainByDid } from '../lib/db'
+
import { syncSitesFromPDS } from '../lib/sync-sites'
+
import { authenticateRequest } from '../lib/wisp-auth'
+
import { createLogger } from '@wisp/observability'
+
+
const logger = createLogger('main-app')
+
+
export const authRoutes = (client: NodeOAuthClient, cookieSecret: string) => new Elysia({
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
+
.post('/api/auth/signin', async (c) => {
+
let handle = 'unknown'
+
try {
+
const body = c.body as { handle: string }
+
handle = body.handle
+
logger.info('Sign-in attempt', { handle })
+
const state = crypto.randomUUID()
+
const url = await client.authorize(handle, { state })
+
logger.info('Authorization URL generated', { handle })
+
return { url: url.toString() }
+
} catch (err) {
+
logger.error('Signin error', err, { handle })
+
console.error('[Auth] Full error:', err)
+
return { error: 'Authentication failed', details: err instanceof Error ? err.message : String(err) }
+
}
+
})
+
.get('/api/auth/callback', async (c) => {
+
try {
+
const params = new URLSearchParams(c.query)
+
+
// client.callback() validates the state parameter internally
+
// It will throw an error if state validation fails (CSRF protection)
+
const { session } = await client.callback(params)
+
+
if (!session) {
+
logger.error('[Auth] OAuth callback failed: no session returned')
+
c.cookie.did.remove()
+
return c.redirect('/?error=auth_failed')
+
}
+
+
const cookieSession = c.cookie
+
cookieSession.did.set({
+
value: session.did,
+
httpOnly: true,
+
secure: process.env.NODE_ENV === 'production',
+
sameSite: 'lax',
+
maxAge: 30 * 24 * 60 * 60 // 30 days
+
})
+
+
// Sync sites from PDS to database cache
+
logger.debug('[Auth] Syncing sites from PDS', { did: session.did })
+
try {
+
const syncResult = await syncSitesFromPDS(session.did, session)
+
logger.debug(`[Auth] Sync complete: ${syncResult.synced} sites synced`)
+
if (syncResult.errors.length > 0) {
+
logger.debug('[Auth] Sync errors', { errors: syncResult.errors })
+
}
+
} catch (err) {
+
logger.error('[Auth] Failed to sync sites', err)
+
// Don't fail auth if sync fails, just log it
+
}
+
+
// Check if user has any sites or domain
+
const sites = await getSitesByDid(session.did)
+
const domain = await getDomainByDid(session.did)
+
+
// If no sites and no domain, redirect to onboarding
+
if (sites.length === 0 && !domain) {
+
return c.redirect('/onboarding')
+
}
+
+
return c.redirect('/editor')
+
} catch (err) {
+
// This catches state validation failures and other OAuth errors
+
logger.error('[Auth] OAuth callback error', err)
+
c.cookie.did.remove()
+
return c.redirect('/?error=auth_failed')
+
}
+
})
+
.post('/api/auth/logout', async (c) => {
+
try {
+
const cookieSession = c.cookie
+
const did = cookieSession.did?.value
+
+
// Clear the session cookie
+
cookieSession.did.remove()
+
+
// If we have a DID, try to revoke the OAuth session
+
if (did && typeof did === 'string') {
+
try {
+
await client.revoke(did)
+
logger.debug('[Auth] Revoked OAuth session', { did })
+
} catch (err) {
+
logger.error('[Auth] Failed to revoke session', err)
+
// Continue with logout even if revoke fails
+
}
+
}
+
+
return { success: true }
+
} catch (err) {
+
logger.error('[Auth] Logout error', err)
+
return { error: 'Logout failed' }
+
}
+
})
+
.get('/api/auth/status', async (c) => {
+
try {
+
const auth = await authenticateRequest(client, c.cookie)
+
+
if (!auth) {
+
c.cookie.did.remove()
+
return { authenticated: false }
+
}
+
+
return {
+
authenticated: true,
+
did: auth.did
+
}
+
} catch (err) {
+
logger.error('[Auth] Status check error', err)
+
c.cookie.did.remove()
+
return { authenticated: false }
+
}
+
})
+399
apps/main-app/src/routes/domain.ts
···
+
import { Elysia } from 'elysia'
+
import { requireAuth, type AuthenticatedContext } from '../lib/wisp-auth'
+
import { NodeOAuthClient } from '@atproto/oauth-client-node'
+
import { Agent } from '@atproto/api'
+
import {
+
claimDomain,
+
getDomainByDid,
+
isDomainAvailable,
+
isDomainRegistered,
+
isValidHandle,
+
toDomain,
+
updateDomain,
+
countWispDomains,
+
deleteWispDomain,
+
getCustomDomainInfo,
+
getCustomDomainById,
+
claimCustomDomain,
+
deleteCustomDomain,
+
updateCustomDomainVerification,
+
updateWispDomainSite,
+
updateCustomDomainRkey
+
} from '../lib/db'
+
import { createHash } from 'crypto'
+
import { verifyCustomDomain } from '../lib/dns-verify'
+
import { createLogger } from '@wisp/observability'
+
+
const logger = createLogger('main-app')
+
+
export const domainRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/domain',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
+
// Public endpoints (no auth required)
+
.get('/check', async ({ query }) => {
+
try {
+
const handle = (query.handle || "")
+
.trim()
+
.toLowerCase();
+
+
if (!isValidHandle(handle)) {
+
return {
+
available: false,
+
reason: "invalid"
+
};
+
}
+
+
const available = await isDomainAvailable(handle);
+
return {
+
available,
+
domain: toDomain(handle)
+
};
+
} catch (err) {
+
logger.error('[Domain] Check error', err);
+
return {
+
available: false
+
};
+
}
+
})
+
.get('/registered', async ({ query, set }) => {
+
try {
+
const domain = (query.domain || "").trim().toLowerCase();
+
+
if (!domain) {
+
set.status = 400;
+
return { error: 'Domain parameter required' };
+
}
+
+
const result = await isDomainRegistered(domain);
+
+
// For Caddy on-demand TLS: 200 = allow, 404 = deny
+
if (result.registered) {
+
set.status = 200;
+
return result;
+
} else {
+
set.status = 404;
+
return { registered: false };
+
}
+
} catch (err) {
+
logger.error('[Domain] Registered check error', err);
+
set.status = 500;
+
return { error: 'Failed to check domain' };
+
}
+
})
+
// Authenticated endpoints (require auth)
+
.derive(async ({ cookie }) => {
+
const auth = await requireAuth(client, cookie)
+
return { auth }
+
})
+
.post('/claim', async ({ body, auth }) => {
+
try {
+
const { handle } = body as { handle?: string };
+
const normalizedHandle = (handle || "").trim().toLowerCase();
+
+
if (!isValidHandle(normalizedHandle)) {
+
throw new Error("Invalid handle");
+
}
+
+
// Check if user already has 3 domains (handled in claimDomain)
+
// claim in DB
+
let domain: string;
+
try {
+
domain = await claimDomain(auth.did, normalizedHandle);
+
} catch (err) {
+
const message = err instanceof Error ? err.message : 'Unknown error';
+
if (message === 'domain_limit_reached') {
+
throw new Error("Domain limit reached: You can only claim up to 3 wisp.place domains");
+
}
+
throw new Error("Handle taken or error claiming domain");
+
}
+
+
// write place.wisp.domain record with unique rkey
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
+
const rkey = normalizedHandle; // Use handle as rkey for uniqueness
+
await agent.com.atproto.repo.putRecord({
+
repo: auth.did,
+
collection: "place.wisp.domain",
+
rkey,
+
record: {
+
$type: "place.wisp.domain",
+
domain,
+
createdAt: new Date().toISOString(),
+
} as any,
+
validate: false,
+
});
+
+
return { success: true, domain };
+
} catch (err) {
+
logger.error('[Domain] Claim error', err);
+
throw new Error(`Failed to claim: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.post('/update', async ({ body, auth }) => {
+
try {
+
const { handle } = body as { handle?: string };
+
const normalizedHandle = (handle || "").trim().toLowerCase();
+
+
if (!isValidHandle(normalizedHandle)) {
+
throw new Error("Invalid handle");
+
}
+
+
const desiredDomain = toDomain(normalizedHandle);
+
const current = await getDomainByDid(auth.did);
+
+
if (current === desiredDomain) {
+
return { success: true, domain: current };
+
}
+
+
let domain: string;
+
try {
+
domain = await updateDomain(auth.did, normalizedHandle);
+
} catch (err) {
+
throw new Error("Handle taken");
+
}
+
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
+
await agent.com.atproto.repo.putRecord({
+
repo: auth.did,
+
collection: "place.wisp.domain",
+
rkey: "self",
+
record: {
+
$type: "place.wisp.domain",
+
domain,
+
createdAt: new Date().toISOString(),
+
} as any,
+
validate: false,
+
});
+
+
return { success: true, domain };
+
} catch (err) {
+
logger.error('[Domain] Update error', err);
+
throw new Error(`Failed to update: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.post('/custom/add', async ({ body, auth }) => {
+
try {
+
const { domain } = body as { domain: string };
+
const domainLower = domain.toLowerCase().trim();
+
+
// Enhanced domain validation
+
// 1. Length check (RFC 1035: labels 1-63 chars, total max 253)
+
if (!domainLower || domainLower.length < 3 || domainLower.length > 253) {
+
throw new Error('Invalid domain: must be 3-253 characters');
+
}
+
+
// 2. Basic format validation
+
// - Must contain at least one dot (require TLD)
+
// - Valid characters: a-z, 0-9, hyphen, dot
+
// - No consecutive dots, no leading/trailing dots or hyphens
+
const domainPattern = /^(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z]{2,}$/;
+
if (!domainPattern.test(domainLower)) {
+
throw new Error('Invalid domain format');
+
}
+
+
// 3. Validate each label (part between dots)
+
const labels = domainLower.split('.');
+
for (const label of labels) {
+
if (label.length === 0 || label.length > 63) {
+
throw new Error('Invalid domain: label length must be 1-63 characters');
+
}
+
if (label.startsWith('-') || label.endsWith('-')) {
+
throw new Error('Invalid domain: labels cannot start or end with hyphen');
+
}
+
}
+
+
// 4. TLD validation (require valid TLD, block single-char TLDs and numeric TLDs)
+
const tld = labels[labels.length - 1];
+
if (tld.length < 2 || /^\d+$/.test(tld)) {
+
throw new Error('Invalid domain: TLD must be at least 2 characters and not all numeric');
+
}
+
+
// 5. Homograph attack protection - block domains with mixed scripts or confusables
+
// Block non-ASCII characters (Punycode domains should be pre-converted)
+
if (!/^[a-z0-9.-]+$/.test(domainLower)) {
+
throw new Error('Invalid domain: only ASCII alphanumeric, dots, and hyphens allowed');
+
}
+
+
// 6. Block localhost, internal IPs, and reserved domains
+
const blockedDomains = [
+
'localhost',
+
'example.com',
+
'example.org',
+
'example.net',
+
'test',
+
'invalid',
+
'local'
+
];
+
const blockedPatterns = [
+
/^(?:10|127|172\.(?:1[6-9]|2[0-9]|3[01])|192\.168)\./, // Private IPs
+
/^(?:\d{1,3}\.){3}\d{1,3}$/, // Any IP address
+
];
+
+
if (blockedDomains.includes(domainLower)) {
+
throw new Error('Invalid domain: reserved or blocked domain');
+
}
+
+
for (const pattern of blockedPatterns) {
+
if (pattern.test(domainLower)) {
+
throw new Error('Invalid domain: IP addresses not allowed');
+
}
+
}
+
+
// Check if already exists and is verified
+
const existing = await getCustomDomainInfo(domainLower);
+
if (existing && existing.verified) {
+
throw new Error('Domain already verified and claimed');
+
}
+
+
// Create hash for ID
+
const hash = createHash('sha256').update(`${auth.did}:${domainLower}`).digest('hex').substring(0, 16);
+
+
// Store in database only
+
await claimCustomDomain(auth.did, domainLower, hash);
+
+
return {
+
success: true,
+
id: hash,
+
domain: domainLower,
+
verified: false
+
};
+
} catch (err) {
+
logger.error('[Domain] Custom domain add error', err);
+
throw new Error(`Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
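The validation steps in `/custom/add` can be condensed into a single pure predicate. This is a sketch that mirrors the checks above — the function name is illustrative, and the private-IP prefix patterns from the handler are omitted for brevity:

```typescript
// Sketch of the /custom/add validation rules as one pure predicate.
// isValidCustomDomain is an illustrative name, not part of the codebase.
const DOMAIN_PATTERN = /^(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z]{2,}$/;
const BLOCKED = new Set([
  'localhost', 'example.com', 'example.org', 'example.net',
  'test', 'invalid', 'local'
]);

function isValidCustomDomain(input: string): boolean {
  const domain = input.trim().toLowerCase();
  if (domain.length < 3 || domain.length > 253) return false; // RFC 1035 total length
  if (!/^[a-z0-9.-]+$/.test(domain)) return false;            // ASCII only (homograph guard)
  if (!DOMAIN_PATTERN.test(domain)) return false;             // labels + required alphabetic TLD
  const tld = domain.split('.').pop() ?? '';
  if (tld.length < 2 || /^\d+$/.test(tld)) return false;      // no 1-char or numeric TLDs
  if (BLOCKED.has(domain)) return false;                      // reserved names
  if (/^(?:\d{1,3}\.){3}\d{1,3}$/.test(domain)) return false; // bare IP addresses
  return true;
}
```

Note that the label-level hyphen checks in the handler are already enforced by `DOMAIN_PATTERN` (each label must start and end with an alphanumeric), so the per-label loop above is defense in depth rather than a distinct rule.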
.post('/custom/verify', async ({ body, auth }) => {
+
try {
+
const { id } = body as { id: string };
+
+
// Get domain from database
+
const domainInfo = await getCustomDomainById(id);
+
if (!domainInfo) {
+
throw new Error('Domain not found');
+
}
+
+
// Verify DNS records (TXT + CNAME)
+
logger.debug(`[Domain] Verifying custom domain: ${domainInfo.domain}`);
+
const result = await verifyCustomDomain(domainInfo.domain, auth.did, id);
+
+
// Update verification status in database
+
await updateCustomDomainVerification(id, result.verified);
+
+
return {
+
success: true,
+
verified: result.verified,
+
error: result.error,
+
found: result.found
+
};
+
} catch (err) {
+
logger.error('[Domain] Custom domain verify error', err);
+
throw new Error(`Failed to verify domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.delete('/custom/:id', async ({ params, auth }) => {
+
try {
+
const { id } = params;
+
+
// Verify ownership before deleting
+
const domainInfo = await getCustomDomainById(id);
+
if (!domainInfo) {
+
throw new Error('Domain not found');
+
}
+
+
if (domainInfo.did !== auth.did) {
+
throw new Error('Unauthorized: You do not own this domain');
+
}
+
+
// Delete from database
+
await deleteCustomDomain(id);
+
+
return { success: true };
+
} catch (err) {
+
logger.error('[Domain] Custom domain delete error', err);
+
throw new Error(`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.post('/wisp/map-site', async ({ body, auth }) => {
+
try {
+
const { domain, siteRkey } = body as { domain: string; siteRkey: string | null };
+
+
if (!domain) {
+
throw new Error('Domain parameter required');
+
}
+
+
// Update wisp.place domain to point to this site
+
await updateWispDomainSite(domain, siteRkey);
+
+
return { success: true };
+
} catch (err) {
+
logger.error('[Domain] Wisp domain map error', err);
+
throw new Error(`Failed to map site: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.delete('/wisp/:domain', async ({ params, auth }) => {
+
try {
+
const { domain } = params;
+
+
// Verify domain belongs to user
+
const domainLower = domain.toLowerCase().trim();
+
const info = await isDomainRegistered(domainLower);
+
+
if (!info.registered || info.type !== 'wisp') {
+
throw new Error('Domain not found');
+
}
+
+
if (info.did !== auth.did) {
+
throw new Error('Unauthorized: You do not own this domain');
+
}
+
+
// Delete from database
+
await deleteWispDomain(domainLower);
+
+
// Delete from PDS
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
+
const handle = domainLower.replace(`.${process.env.BASE_DOMAIN || 'wisp.place'}`, '');
+
try {
+
await agent.com.atproto.repo.deleteRecord({
+
repo: auth.did,
+
collection: "place.wisp.domain",
+
rkey: handle,
+
});
+
} catch (err) {
+
// Record might not exist in PDS, continue anyway
+
logger.warn('[Domain] Could not delete wisp domain from PDS', err as any);
+
}
+
+
return { success: true };
+
} catch (err) {
+
logger.error('[Domain] Wisp domain delete error', err);
+
throw new Error(`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
})
+
.post('/custom/:id/map-site', async ({ params, body, auth }) => {
+
try {
+
const { id } = params;
+
const { siteRkey } = body as { siteRkey: string | null };
+
+
// Verify ownership before updating
+
const domainInfo = await getCustomDomainById(id);
+
if (!domainInfo) {
+
throw new Error('Domain not found');
+
}
+
+
if (domainInfo.did !== auth.did) {
+
throw new Error('Unauthorized: You do not own this domain');
+
}
+
+
// Update custom domain to point to this site
+
await updateCustomDomainRkey(id, siteRkey);
+
+
return { success: true };
+
} catch (err) {
+
logger.error('[Domain] Custom domain map error', err);
+
throw new Error(`Failed to map site: ${err instanceof Error ? err.message : 'Unknown error'}`);
+
}
+
});
+230
apps/main-app/src/routes/site.ts
···
+
import { Elysia } from 'elysia'
+
import { requireAuth } from '../lib/wisp-auth'
+
import { NodeOAuthClient } from '@atproto/oauth-client-node'
+
import { Agent } from '@atproto/api'
+
import { deleteSite } from '../lib/db'
+
import { createLogger } from '@wisp/observability'
+
import { extractSubfsUris } from '@wisp/atproto-utils'
+
+
const logger = createLogger('main-app')
+
+
export const siteRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/site',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
+
.derive(async ({ cookie }) => {
+
const auth = await requireAuth(client, cookie)
+
return { auth }
+
})
+
.delete('/:rkey', async ({ params, auth }) => {
+
const { rkey } = params
+
+
if (!rkey) {
+
return {
+
success: false,
+
error: 'Site rkey is required'
+
}
+
}
+
+
try {
+
// Create agent with OAuth session
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
+
+
// First, fetch the site record to find any subfs references
+
let subfsUris: Array<{ uri: string; path: string }> = [];
+
try {
+
const existingRecord = await agent.com.atproto.repo.getRecord({
+
repo: auth.did,
+
collection: 'place.wisp.fs',
+
rkey: rkey
+
});
+
+
if (existingRecord.data.value && typeof existingRecord.data.value === 'object' && 'root' in existingRecord.data.value) {
+
const manifest = existingRecord.data.value as any;
+
subfsUris = extractSubfsUris(manifest.root);
+
+
if (subfsUris.length > 0) {
+
console.log(`Found ${subfsUris.length} subfs records to delete`);
+
logger.info(`[Site] Found ${subfsUris.length} subfs records associated with ${rkey}`);
+
}
+
}
+
} catch (err) {
+
// Record might not exist, continue with deletion
+
console.log('Could not fetch site record for subfs cleanup, continuing...');
+
}
+
+
// Delete the main record from AT Protocol
+
try {
+
await agent.com.atproto.repo.deleteRecord({
+
repo: auth.did,
+
collection: 'place.wisp.fs',
+
rkey: rkey
+
})
+
logger.info(`[Site] Deleted site ${rkey} from PDS for ${auth.did}`)
+
} catch (err) {
+
logger.error(`[Site] Failed to delete site ${rkey} from PDS`, err)
+
throw new Error('Failed to delete site from AT Protocol')
+
}
+
+
// Delete associated subfs records
+
if (subfsUris.length > 0) {
+
console.log(`Deleting ${subfsUris.length} associated subfs records...`);
+
+
await Promise.all(
+
subfsUris.map(async ({ uri }) => {
+
try {
+
// Parse URI: at://did/collection/rkey
+
const parts = uri.replace('at://', '').split('/');
+
const subRkey = parts[2];
+
+
await agent.com.atproto.repo.deleteRecord({
+
repo: auth.did,
+
collection: 'place.wisp.subfs',
+
rkey: subRkey
+
});
+
+
console.log(` ๐Ÿ—‘๏ธ Deleted subfs: ${uri}`);
+
logger.info(`[Site] Deleted subfs record: ${uri}`);
+
} catch (err: any) {
+
// Log but don't fail if subfs deletion fails
+
console.warn(`Failed to delete subfs ${uri}:`, err?.message);
+
logger.warn(`[Site] Failed to delete subfs ${uri}`, err);
+
}
+
})
+
);
+
+
logger.info(`[Site] Deleted ${subfsUris.length} subfs records for ${rkey}`);
+
}
+
+
// Delete from database
+
const result = await deleteSite(auth.did, rkey)
+
if (!result.success) {
+
throw new Error('Failed to delete site from database')
+
}
+
+
logger.info(`[Site] Successfully deleted site ${rkey} for ${auth.did}`)
+
+
return {
+
success: true,
+
message: 'Site deleted successfully'
+
}
+
} catch (err) {
+
logger.error('[Site] Delete error', err)
+
return {
+
success: false,
+
error: err instanceof Error ? err.message : 'Failed to delete site'
+
}
+
}
+
})
+
.get('/:rkey/settings', async ({ params, auth }) => {
+
const { rkey } = params
+
+
if (!rkey) {
+
return {
+
success: false,
+
error: 'Site rkey is required'
+
}
+
}
+
+
try {
+
// Create agent with OAuth session
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
+
+
// Fetch settings record
+
try {
+
const record = await agent.com.atproto.repo.getRecord({
+
repo: auth.did,
+
collection: 'place.wisp.settings',
+
rkey: rkey
+
})
+
+
if (record.data.value) {
+
return record.data.value
+
}
+
} catch (err: any) {
+
// Record doesn't exist, return defaults
+
if (err?.error === 'RecordNotFound') {
+
return {
+
indexFiles: ['index.html'],
+
cleanUrls: false,
+
directoryListing: false
+
}
+
}
+
throw err
+
}
+
+
// Default settings
+
return {
+
indexFiles: ['index.html'],
+
cleanUrls: false,
+
directoryListing: false
+
}
+
} catch (err) {
+
logger.error('[Site] Get settings error', err)
+
return {
+
success: false,
+
error: err instanceof Error ? err.message : 'Failed to fetch settings'
+
}
+
}
+
})
+
.post('/:rkey/settings', async ({ params, body, auth }) => {
+
const { rkey } = params
+
+
if (!rkey) {
+
return {
+
success: false,
+
error: 'Site rkey is required'
+
}
+
}
+
+
// Validate settings
+
const settings = body as any
+
+
// Ensure mutual exclusivity of routing modes
+
const modes = [
+
settings.spaMode,
+
settings.directoryListing,
+
settings.custom404
+
].filter(Boolean)
+
+
if (modes.length > 1) {
+
return {
+
success: false,
+
error: 'Only one of spaMode, directoryListing, or custom404 can be enabled'
+
}
+
}
+
+
try {
+
// Create agent with OAuth session
+
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
+
+
// Create or update settings record
+
const record = await agent.com.atproto.repo.putRecord({
+
repo: auth.did,
+
collection: 'place.wisp.settings',
+
rkey: rkey,
+
record: {
+
$type: 'place.wisp.settings',
+
...settings
+
}
+
})
+
+
logger.info(`[Site] Saved settings for ${rkey} (${auth.did})`)
+
+
return {
+
success: true,
+
uri: record.data.uri,
+
cid: record.data.cid
+
}
+
} catch (err) {
+
logger.error('[Site] Save settings error', err)
+
return {
+
success: false,
+
error: err instanceof Error ? err.message : 'Failed to save settings'
+
}
+
}
+
})
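Both the delete handler above and the upload path parse `at://` URIs by stripping the scheme and splitting on `/`. A self-contained sketch of that parsing — `parseAtUri` is an illustrative name, not an export of this codebase:

```typescript
// Minimal parser for AT URIs of the shape at://<did>/<collection>/<rkey>,
// matching the ad-hoc split used when deleting subfs records.
// parseAtUri is an illustrative name, not an export of this codebase.
function parseAtUri(uri: string): { did: string; collection: string; rkey: string } {
  const parts = uri.replace('at://', '').split('/');
  if (parts.length < 3) {
    throw new Error(`Malformed AT URI: ${uri}`);
  }
  // DIDs contain colons but no slashes, so a plain split is safe here.
  return { did: parts[0], collection: parts[1], rkey: parts[2] };
}
```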
+127
apps/main-app/src/routes/user.ts
···
+
import { Elysia, t } from 'elysia'
+
import { requireAuth } from '../lib/wisp-auth'
+
import { NodeOAuthClient } from '@atproto/oauth-client-node'
+
import { Agent } from '@atproto/api'
+
import { getSitesByDid, getDomainByDid, getCustomDomainsByDid, getWispDomainInfo, getDomainsBySite, getAllWispDomains } from '../lib/db'
+
import { syncSitesFromPDS } from '../lib/sync-sites'
+
import { createLogger } from '@wisp/observability'
+
+
const logger = createLogger('main-app')
+
+
export const userRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
+
new Elysia({
+
prefix: '/api/user',
+
cookie: {
+
secrets: cookieSecret,
+
sign: ['did']
+
}
+
})
+
.derive(async ({ cookie }) => {
+
const auth = await requireAuth(client, cookie)
+
return { auth }
+
})
+
.get('/status', async ({ auth }) => {
+
try {
+
// Check if user has any sites
+
const sites = await getSitesByDid(auth.did)
+
+
// Check if user has claimed a domain
+
const domain = await getDomainByDid(auth.did)
+
+
return {
+
did: auth.did,
+
hasSites: sites.length > 0,
+
hasDomain: !!domain,
+
domain: domain || null,
+
sitesCount: sites.length
+
}
+
} catch (err) {
+
logger.error('[User] Status error', err)
+
throw new Error('Failed to get user status')
+
}
+
})
+
.get('/info', async ({ auth }) => {
+
try {
+
// Get user's handle from AT Protocol
+
const agent = new Agent(auth.session)
+
+
let handle = 'unknown'
+
try {
+
console.log('[User] Attempting to fetch profile for DID:', auth.did)
+
const profile = await agent.getProfile({ actor: auth.did })
+
console.log('[User] Profile fetched successfully:', profile.data.handle)
+
handle = profile.data.handle
+
} catch (err) {
+
console.error('[User] Failed to fetch profile - Full error:', err)
+
console.error('[User] Error message:', err instanceof Error ? err.message : String(err))
+
console.error('[User] Error stack:', err instanceof Error ? err.stack : 'No stack')
+
logger.error('[User] Failed to fetch profile', err)
+
}
+
+
return {
+
did: auth.did,
+
handle
+
}
+
} catch (err) {
+
logger.error('[User] Info error', err)
+
throw new Error('Failed to get user info')
+
}
+
})
+
.get('/sites', async ({ auth }) => {
+
try {
+
const sites = await getSitesByDid(auth.did)
+
return { sites }
+
} catch (err) {
+
logger.error('[User] Sites error', err)
+
throw new Error('Failed to get sites')
+
}
+
})
+
.get('/domains', async ({ auth }) => {
+
try {
+
// Get all wisp.place subdomains with mappings (up to 3)
+
const wispDomains = await getAllWispDomains(auth.did)
+
+
// Get custom domains
+
const customDomains = await getCustomDomainsByDid(auth.did)
+
+
return {
+
wispDomains: wispDomains.map(d => ({
+
domain: d.domain,
+
rkey: d.rkey || null
+
})),
+
customDomains
+
}
+
} catch (err) {
+
logger.error('[User] Domains error', err)
+
throw new Error('Failed to get domains')
+
}
+
})
+
.post('/sync', async ({ auth }) => {
+
try {
+
logger.debug('[User] Manual sync requested for', { did: auth.did })
+
const result = await syncSitesFromPDS(auth.did, auth.session)
+
+
return {
+
success: true,
+
synced: result.synced,
+
errors: result.errors
+
}
+
} catch (err) {
+
logger.error('[User] Sync error', err)
+
throw new Error('Failed to sync sites')
+
}
+
})
+
.get('/site/:rkey/domains', async ({ auth, params }) => {
+
try {
+
const { rkey } = params
+
const domains = await getDomainsBySite(auth.did, rkey)
+
+
return {
+
rkey,
+
domains
+
}
+
} catch (err) {
+
logger.error('[User] Site domains error', err)
+
throw new Error('Failed to get domains for site')
+
}
+
})
+1201
apps/main-app/src/routes/wisp.ts
···
+
import { Elysia } from 'elysia'
+
import { requireAuth, type AuthenticatedContext } from '../lib/wisp-auth'
+
import { NodeOAuthClient } from '@atproto/oauth-client-node'
+
import { Agent } from '@atproto/api'
+
import { TID } from '@atproto/common-web'
+
import {
+
type UploadedFile,
+
type FileUploadResult,
+
processUploadedFiles,
+
updateFileBlobs,
+
findLargeDirectories,
+
replaceDirectoryWithSubfs,
+
estimateDirectorySize
+
} from '@wisp/fs-utils'
+
import {
+
shouldCompressFile,
+
compressFile,
+
computeCID,
+
extractBlobMap,
+
extractSubfsUris
+
} from '@wisp/atproto-utils'
+
import { createManifest } from '@wisp/fs-utils'
+
import { upsertSite } from '../lib/db'
+
import { createLogger } from '@wisp/observability'
+
import { validateRecord, type Directory } from '@wisp/lexicons/types/place/wisp/fs'
+
import { validateRecord as validateSubfsRecord } from '@wisp/lexicons/types/place/wisp/subfs'
+
import { MAX_SITE_SIZE, MAX_FILE_SIZE, MAX_FILE_COUNT } from '@wisp/constants'
+
import {
+
createUploadJob,
+
getUploadJob,
+
updateJobProgress,
+
completeUploadJob,
+
failUploadJob,
+
addJobListener
+
} from '../lib/upload-jobs'
+
+
const logger = createLogger('main-app')
+
+
function isValidSiteName(siteName: string): boolean {
+
if (!siteName || typeof siteName !== 'string') return false;
+
+
// Length check (AT Protocol rkey limit)
+
if (siteName.length < 1 || siteName.length > 512) return false;
+
+
// Check for path traversal
+
if (siteName === '.' || siteName === '..') return false;
+
if (siteName.includes('/') || siteName.includes('\\')) return false;
+
if (siteName.includes('\0')) return false;
+
+
// AT Protocol record key format: alphanumeric, dots, dashes, underscores, tildes, colons
+
// Based on the atproto record-key syntax rules
+
const validRkeyPattern = /^[a-zA-Z0-9._~:-]+$/;
+
if (!validRkeyPattern.test(siteName)) return false;
+
+
return true;
+
}
+
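For a quick sanity check, the record-key rules above can be restated as a self-contained predicate (re-declared here so the accept/reject cases can be exercised in isolation; the character-class regex already excludes `/`, `\`, and NUL, so those checks collapse into it):

```typescript
// Self-contained restatement of the isValidSiteName checks above.
function isValidRkey(name: string): boolean {
  if (!name || name.length > 512) return false;    // AT Protocol rkey length limit
  if (name === '.' || name === '..') return false; // path traversal
  return /^[a-zA-Z0-9._~:-]+$/.test(name);         // allowed rkey characters
}
```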
+
async function processUploadInBackground(
+
jobId: string,
+
agent: Agent,
+
did: string,
+
siteName: string,
+
fileArray: File[]
+
): Promise<void> {
+
try {
+
// Try to fetch existing record to enable incremental updates
+
let existingBlobMap = new Map<string, { blobRef: any; cid: string }>();
+
let oldSubfsUris: Array<{ uri: string; path: string }> = [];
+
console.log('Attempting to fetch existing record...');
+
updateJobProgress(jobId, { phase: 'validating' });
+
+
try {
+
const rkey = siteName;
+
const existingRecord = await agent.com.atproto.repo.getRecord({
+
repo: did,
+
collection: 'place.wisp.fs',
+
rkey: rkey
+
});
+
console.log('Existing record found!');
+
+
if (existingRecord.data.value && typeof existingRecord.data.value === 'object' && 'root' in existingRecord.data.value) {
+
const manifest = existingRecord.data.value as any;
+
+
// Extract blob map from main record
+
existingBlobMap = extractBlobMap(manifest.root);
+
console.log(`Found existing manifest with ${existingBlobMap.size} files in main record`);
+
+
// Extract subfs URIs with their mount paths from main record
+
const subfsUris = extractSubfsUris(manifest.root);
+
oldSubfsUris = subfsUris; // Save for cleanup later
+
+
if (subfsUris.length > 0) {
+
console.log(`Found ${subfsUris.length} subfs records, fetching in parallel...`);
+
logger.info(`Fetching ${subfsUris.length} subfs records for blob reuse`);
+
+
// Fetch all subfs records in parallel
+
const subfsRecords = await Promise.all(
+
subfsUris.map(async ({ uri, path }) => {
+
try {
+
// Parse URI: at://did/collection/rkey
+
const parts = uri.replace('at://', '').split('/');
+
const subDid = parts[0];
+
const collection = parts[1];
+
const subRkey = parts[2];
+
+
const record = await agent.com.atproto.repo.getRecord({
+
repo: subDid,
+
collection: collection,
+
rkey: subRkey
+
});
+
+
return { record: record.data.value as any, mountPath: path };
+
} catch (err: any) {
+
logger.warn(`Failed to fetch subfs record ${uri}: ${err?.message}`, err);
+
return null;
+
}
+
})
+
);
+
+
// Merge blob maps from all subfs records
+
let totalSubfsBlobs = 0;
+
for (const subfsData of subfsRecords) {
+
if (subfsData && subfsData.record && 'root' in subfsData.record) {
+
// Extract blobs with the correct mount path prefix
+
const subfsMap = extractBlobMap(subfsData.record.root, subfsData.mountPath);
+
subfsMap.forEach((value, key) => {
+
existingBlobMap.set(key, value);
+
totalSubfsBlobs++;
+
});
+
}
+
}
+
+
console.log(`Merged ${totalSubfsBlobs} files from ${subfsUris.length} subfs records`);
+
logger.info(`Total blob map: ${existingBlobMap.size} files (main + subfs)`);
+
}
+
+
console.log(`Total existing blobs for reuse: ${existingBlobMap.size} files`);
+
logger.info(`Found existing manifest with ${existingBlobMap.size} files for incremental update`);
+
}
+
} catch (error: any) {
+
console.log('No existing record found or error:', error?.message || error);
+
if (error?.status !== 400 && error?.error !== 'RecordNotFound') {
+
logger.warn('Failed to fetch existing record, proceeding with full upload', error);
+
}
+
}
+
+
// Convert File objects to UploadedFile format
+
const uploadedFiles: UploadedFile[] = [];
+
const skippedFiles: Array<{ name: string; reason: string }> = [];
+
+
console.log('Processing files, count:', fileArray.length);
+
updateJobProgress(jobId, { phase: 'compressing' });
+
+
for (let i = 0; i < fileArray.length; i++) {
+
const file = fileArray[i];
+
+
// Skip undefined/null files
+
if (!file || !file.name) {
+
console.log(`Skipping undefined file at index ${i}`);
+
skippedFiles.push({
+
name: `[undefined file at index ${i}]`,
+
reason: 'Invalid file object'
+
});
+
continue;
+
}
+
+
console.log(`Processing file ${i + 1}/${fileArray.length}:`, file.name, file.size, 'bytes');
+
updateJobProgress(jobId, {
+
filesProcessed: i + 1,
+
currentFile: file.name
+
});
+
+
// Skip unwanted files and directories
+
const normalizedPath = file.name.replace(/^[^\/]*\//, ''); // strip the root folder segment from the uploaded path
+
const fileName = normalizedPath.split('/').pop() || '';
+
const pathParts = normalizedPath.split('/');
+
+
// .git directory (version control - thousands of files)
+
if (normalizedPath.startsWith('.git/') || normalizedPath === '.git') {
+
console.log(`Skipping .git file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: '.git directory excluded'
+
});
+
continue;
+
}
+
+
// .DS_Store (macOS metadata - can leak info)
+
if (fileName === '.DS_Store') {
+
console.log(`Skipping .DS_Store file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: '.DS_Store file excluded'
+
});
+
continue;
+
}
+
+
// .env files (environment variables with secrets)
+
if (fileName.startsWith('.env')) {
+
console.log(`Skipping .env file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'environment files excluded for security'
+
});
+
continue;
+
}
+
+
// node_modules (dependency folder - can be 100,000+ files)
+
if (pathParts.includes('node_modules')) {
+
console.log(`Skipping node_modules file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'node_modules excluded'
+
});
+
continue;
+
}
+
+
// OS metadata files
+
if (fileName === 'Thumbs.db' || fileName === 'desktop.ini' || fileName.startsWith('._')) {
+
console.log(`Skipping OS metadata file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'OS metadata file excluded'
+
});
+
continue;
+
}
+
+
// macOS system directories
+
if (pathParts.includes('.Spotlight-V100') || pathParts.includes('.Trashes') || pathParts.includes('.fseventsd')) {
+
console.log(`Skipping macOS system file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'macOS system directory excluded'
+
});
+
continue;
+
}
+
+
// Cache and temp directories
+
if (pathParts.some(part => part === '.cache' || part === '.temp' || part === '.tmp')) {
+
console.log(`Skipping cache/temp file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'cache/temp directory excluded'
+
});
+
continue;
+
}
+
+
// Python cache
+
if (pathParts.includes('__pycache__') || fileName.endsWith('.pyc')) {
+
console.log(`Skipping Python cache file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'Python cache excluded'
+
});
+
continue;
+
}
+
+
// Python virtual environments
+
if (pathParts.some(part => part === '.venv' || part === 'venv' || part === 'env')) {
+
console.log(`Skipping Python venv file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'Python virtual environment excluded'
+
});
+
continue;
+
}
+
+
// Editor swap files
+
if (fileName.endsWith('.swp') || fileName.endsWith('.swo') || fileName.endsWith('~')) {
+
console.log(`Skipping editor swap file: ${file.name}`);
+
skippedFiles.push({
+
name: file.name,
+
reason: 'editor swap file excluded'
+
});
+
continue;
+
}
+
+
// Skip files that are too large
+
const maxSize = MAX_FILE_SIZE;
+
if (file.size > maxSize) {
+
skippedFiles.push({
+
name: file.name,
+
reason: `file too large (${(file.size / 1024 / 1024).toFixed(2)}MB, max ${(maxSize / 1024 / 1024).toFixed(0)}MB)`
+
});
+
continue;
+
}
+
+
const arrayBuffer = await file.arrayBuffer();
+
const originalContent = Buffer.from(arrayBuffer);
+
const originalMimeType = file.type || 'application/octet-stream';
+
+
// Determine if file should be compressed (pass filename to exclude _redirects)
+
const shouldCompress = shouldCompressFile(originalMimeType, normalizedPath);
+
+
// Text files (HTML/CSS/JS) need base64 encoding to prevent PDS content sniffing
+
// Audio files just need compression without base64
+
const needsBase64 =
+
originalMimeType.startsWith('text/') ||
+
originalMimeType.startsWith('application/json') ||
+
originalMimeType.startsWith('application/xml') ||
+
originalMimeType === 'image/svg+xml';
+
+
let finalContent: Buffer;
+
let compressed = false;
+
let base64Encoded = false;
+
+
if (shouldCompress) {
+
const compressedContent = compressFile(originalContent);
+
compressed = true;
+
+
if (needsBase64) {
+
// Text files: compress AND base64 encode
+
finalContent = Buffer.from(compressedContent.toString('base64'), 'binary'); // base64 output is ASCII, so 'binary' (latin1) is byte-safe
+
base64Encoded = true;
+
const compressionRatio = (compressedContent.length / originalContent.length * 100).toFixed(1);
+
console.log(`Compressing+base64 ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%), base64: ${finalContent.length} bytes`);
+
logger.info(`Compressing+base64 ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%), base64: ${finalContent.length} bytes`);
+
} else {
+
// Audio files: just compress, no base64
+
finalContent = compressedContent;
+
const compressionRatio = (compressedContent.length / originalContent.length * 100).toFixed(1);
+
console.log(`Compressing ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%)`);
+
logger.info(`Compressing ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%)`);
+
}
+
} else {
+
// Binary files: upload directly
+
finalContent = originalContent;
+
console.log(`Uploading ${file.name} directly: ${originalContent.length} bytes (no compression)`);
+
logger.info(`Uploading ${file.name} directly: ${originalContent.length} bytes (binary)`);
+
}
+
+
uploadedFiles.push({
+
name: file.name,
+
content: finalContent,
+
mimeType: originalMimeType,
+
size: finalContent.length,
+
compressed,
+
base64Encoded,
+
originalMimeType
+
});
+
}
+
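The per-file transform above boils down to one question per MIME type: does the payload need base64 wrapping on top of compression? A sketch of that predicate — `needsBase64Encoding` is an illustrative name; the real `shouldCompressFile` lives in `@wisp/fs-utils`/`@wisp/atproto-utils`:

```typescript
// Sketch of the per-file transform decision above.
// needsBase64Encoding is an illustrative name, not part of the codebase.
function needsBase64Encoding(mimeType: string): boolean {
  // Text-like payloads are base64-wrapped after compression so the PDS
  // cannot content-sniff them; binary payloads are stored as-is.
  return (
    mimeType.startsWith('text/') ||
    mimeType.startsWith('application/json') ||
    mimeType.startsWith('application/xml') ||
    mimeType === 'image/svg+xml'
  );
}
```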
+
// Update total file count after filtering (important for progress tracking)
+
updateJobProgress(jobId, {
+
totalFiles: uploadedFiles.length
+
});
+
+
// Check total size limit
+
const totalSize = uploadedFiles.reduce((sum, file) => sum + file.size, 0);
+
const maxTotalSize = MAX_SITE_SIZE;
+
+
if (totalSize > maxTotalSize) {
+
throw new Error(`Total upload size ${(totalSize / 1024 / 1024).toFixed(2)}MB exceeds ${(maxTotalSize / 1024 / 1024).toFixed(0)}MB limit`);
+
}
+
+
// Check file count limit
+
if (uploadedFiles.length > MAX_FILE_COUNT) {
+
throw new Error(`File count ${uploadedFiles.length} exceeds ${MAX_FILE_COUNT} files limit`);
+
}
+
+
console.log(`After filtering: ${uploadedFiles.length} files to process (${skippedFiles.length} skipped)`);
+
+
if (uploadedFiles.length === 0) {
+
// Create empty manifest
+
const emptyManifest = {
+
$type: 'place.wisp.fs',
+
site: siteName,
+
root: {
+
type: 'directory',
+
entries: []
+
},
+
fileCount: 0,
+
createdAt: new Date().toISOString()
+
};
+
+
const validationResult = validateRecord(emptyManifest);
+
if (!validationResult.success) {
+
throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
+
}
+
+
const rkey = siteName;
+
updateJobProgress(jobId, { phase: 'finalizing' });
+
+
const record = await agent.com.atproto.repo.putRecord({
+
repo: did,
+
collection: 'place.wisp.fs',
+
rkey: rkey,
+
record: emptyManifest
+
});
+
+
await upsertSite(did, rkey, siteName);
+
+
completeUploadJob(jobId, {
+
success: true,
+
uri: record.data.uri,
+
cid: record.data.cid,
+
fileCount: 0,
+
siteName,
+
skippedFiles
+
});
+
return;
+
}

    // Process files into directory structure
    console.log('Processing uploaded files into directory structure...');
    const validUploadedFiles = uploadedFiles.filter((f, i) => {
      if (!f || !f.name || !f.content) {
        console.error(`Filtering out invalid file at index ${i}`);
        return false;
      }
      return true;
    });

    const { directory, fileCount } = processUploadedFiles(validUploadedFiles);
    console.log('Directory structure created, file count:', fileCount);

    // Upload files as blobs with retry logic for DPoP nonce conflicts
    console.log('Starting blob upload/reuse phase...');
    updateJobProgress(jobId, { phase: 'uploading' });

    // Helper function to upload blob with exponential backoff retry and timeout
    const uploadBlobWithRetry = async (
      agent: Agent,
      content: Buffer,
      mimeType: string,
      fileName: string,
      maxRetries = 5
    ) => {
      for (let attempt = 0; attempt < maxRetries; attempt++) {
        const controller = new AbortController();
        const timeoutMs = 300000; // 5 minute timeout per upload
        const timeoutId = setTimeout(() => controller.abort(), timeoutMs);

        try {
          console.log(`[File Upload] Starting upload attempt ${attempt + 1}/${maxRetries} for ${fileName} (${content.length} bytes, ${mimeType})`);

          const result = await agent.com.atproto.repo.uploadBlob(content, { encoding: mimeType });
          clearTimeout(timeoutId);
          console.log(`[File Upload] ✅ Successfully uploaded ${fileName} on attempt ${attempt + 1}`);
          return result;
        } catch (error: any) {
          clearTimeout(timeoutId);

          const isDPoPNonceError =
            error?.message?.toLowerCase().includes('nonce') ||
            error?.message?.toLowerCase().includes('dpop') ||
            error?.status === 409;

          const isTimeout = error?.name === 'AbortError' || error?.message === 'Upload timeout';
          const isRateLimited = error?.status === 429 || error?.message?.toLowerCase().includes('rate');
          const isRequestEntityTooLarge = error?.status === 419 || error?.status === 413;

          // Special handling for 419/413 Request Entity Too Large errors
          if (isRequestEntityTooLarge) {
            const customError = new Error('Your PDS is not allowing uploads large enough to store your site. Please contact your PDS host. This could also possibly be a result of it being behind Cloudflare free tier.');
            (customError as any).status = 419;
            throw customError;
          }

          // Retry on DPoP nonce conflicts, timeouts, or rate limits
          if ((isDPoPNonceError || isTimeout || isRateLimited) && attempt < maxRetries - 1) {
            let backoffMs: number;
            if (isRateLimited) {
              backoffMs = 2000 * Math.pow(2, attempt); // 2s, 4s, 8s, 16s for rate limits
            } else if (isTimeout) {
              backoffMs = 1000 * Math.pow(2, attempt); // 1s, 2s, 4s, 8s for timeouts
            } else {
              backoffMs = 100 * Math.pow(2, attempt); // 100ms, 200ms, 400ms for DPoP
            }

            const reason = isDPoPNonceError ? 'DPoP nonce conflict' : isTimeout ? 'timeout' : 'rate limit';
            logger.info(`[File Upload] 🔄 ${reason} for ${fileName}, retrying in ${backoffMs}ms (attempt ${attempt + 1}/${maxRetries})`);
            console.log(`[File Upload] 🔄 ${reason} for ${fileName}, retrying in ${backoffMs}ms`);
            await new Promise(resolve => setTimeout(resolve, backoffMs));
            continue;
          }

          // Log detailed error information before throwing
          logger.error(`[File Upload] ❌ Upload failed for ${fileName} (size: ${content.length} bytes, mimeType: ${mimeType}, attempt: ${attempt + 1}/${maxRetries})`, {
            error: error?.error || error?.message || 'Unknown error',
            status: error?.status,
            headers: error?.headers,
            success: error?.success
          });
          console.error(`[File Upload] ❌ Upload failed for ${fileName}:`, {
            error: error?.error || error?.message || 'Unknown error',
            status: error?.status,
            size: content.length,
            mimeType,
            attempt: attempt + 1
          });
          throw error;
        }
      }
      throw new Error(`Failed to upload ${fileName} after ${maxRetries} attempts`);
    };

    // Use sliding window concurrency for maximum throughput
    const CONCURRENCY_LIMIT = 20; // Maximum concurrent uploads
    const uploadedBlobs: Array<{
      result: FileUploadResult;
      filePath: string;
      sentMimeType: string;
      returnedMimeType: string;
      reused: boolean;
    }> = [];
    const failedFiles: Array<{
      name: string;
      index: number;
      error: string;
      size: number;
    }> = [];

    // Track completed files count for accurate progress
    let completedFilesCount = 0;

    // Process file with sliding window concurrency
    const processFile = async (file: UploadedFile, index: number) => {
      try {
        if (!file || !file.name) {
          throw new Error(`Undefined file at index ${index}`);
        }

        const fileCID = computeCID(file.content);
        const normalizedPath = file.name.replace(/^[^\/]*\//, '');
        const existingBlob = existingBlobMap.get(normalizedPath) || existingBlobMap.get(file.name);

        if (existingBlob && existingBlob.cid === fileCID) {
          logger.info(`[File Upload] ♻️ Reused: ${file.name} (unchanged, CID: ${fileCID})`);
          const reusedCount = (getUploadJob(jobId)?.progress.filesReused || 0) + 1;
          completedFilesCount++;
          updateJobProgress(jobId, {
            filesReused: reusedCount,
            currentFile: `${completedFilesCount}/${validUploadedFiles.length}: ${file.name} (reused)`
          });

          return {
            result: {
              hash: existingBlob.cid,
              blobRef: existingBlob.blobRef,
              ...(file.compressed && {
                encoding: 'gzip' as const,
                mimeType: file.originalMimeType || file.mimeType,
                base64: file.base64Encoded || false
              })
            },
            filePath: file.name,
            sentMimeType: file.mimeType,
            returnedMimeType: existingBlob.blobRef.mimeType,
            reused: true
          };
        }

        const uploadMimeType = file.compressed || file.mimeType.startsWith('text/html')
          ? 'application/octet-stream'
          : file.mimeType;

        const compressionInfo = file.compressed ? ' (gzipped)' : '';
        const fileSizeMB = (file.size / 1024 / 1024).toFixed(2);
        logger.info(`[File Upload] ⬆️ Uploading: ${file.name} (${fileSizeMB}MB${compressionInfo})`);

        const uploadResult = await uploadBlobWithRetry(
          agent,
          file.content,
          uploadMimeType,
          file.name
        );

        const returnedBlobRef = uploadResult.data.blob;
        const uploadedCount = (getUploadJob(jobId)?.progress.filesUploaded || 0) + 1;
        completedFilesCount++;
        updateJobProgress(jobId, {
          filesUploaded: uploadedCount,
          currentFile: `${completedFilesCount}/${validUploadedFiles.length}: ${file.name} (uploaded)`
        });
        logger.info(`[File Upload] ✅ Uploaded: ${file.name} (CID: ${fileCID})`);

        return {
          result: {
            hash: returnedBlobRef.ref.toString(),
            blobRef: returnedBlobRef,
            ...(file.compressed && {
              encoding: 'gzip' as const,
              mimeType: file.originalMimeType || file.mimeType,
              base64: file.base64Encoded || false
            })
          },
          filePath: file.name,
          sentMimeType: file.mimeType,
          returnedMimeType: returnedBlobRef.mimeType,
          reused: false
        };
      } catch (uploadError) {
        const fileName = file?.name || 'unknown';
        const fileSize = file?.size || 0;
        const errorMessage = uploadError instanceof Error ? uploadError.message : 'Unknown error';
        const errorDetails = {
          fileName,
          fileSize,
          index,
          error: errorMessage,
          stack: uploadError instanceof Error ? uploadError.stack : undefined
        };
        logger.error(`Upload failed for file: ${fileName} (${fileSize} bytes) at index ${index}`, errorDetails);
        console.error(`Upload failed for file: ${fileName} (${fileSize} bytes) at index ${index}`, errorDetails);

        completedFilesCount++;
        updateJobProgress(jobId, {
          currentFile: `${completedFilesCount}/${validUploadedFiles.length}: ${fileName} (failed)`
        });

        // Track failed file but don't throw - continue with other files
        failedFiles.push({
          name: fileName,
          index,
          error: errorMessage,
          size: fileSize
        });

        return null; // Return null to indicate failure
      }
    };

    // Sliding window concurrency control
    const processWithConcurrency = async () => {
      const results: any[] = [];
      let fileIndex = 0;
      const executing = new Map<Promise<void>, { index: number; name: string }>();

      for (const file of validUploadedFiles) {
        const currentIndex = fileIndex++;

        const promise = processFile(file, currentIndex)
          .then(result => {
            results[currentIndex] = result;
          })
          .catch(error => {
            // This shouldn't happen since processFile catches errors, but just in case
            logger.error(`Unexpected error processing file at index ${currentIndex}`, error);
            results[currentIndex] = null;
          })
          .finally(() => {
            executing.delete(promise);
          });

        executing.set(promise, { index: currentIndex, name: file.name });

        if (executing.size >= CONCURRENCY_LIMIT) {
          await Promise.race(executing.keys());
        }
      }

      // Wait for remaining uploads
      await Promise.all(executing.keys());
      console.log(`\n✅ Upload complete: ${completedFilesCount}/${validUploadedFiles.length} files processed\n`);
      return results.filter(r => r !== undefined && r !== null); // Filter out null (failed) and undefined entries
    };

    const allResults = await processWithConcurrency();
    uploadedBlobs.push(...allResults);

    const currentReused = uploadedBlobs.filter(b => b.reused).length;
    const currentUploaded = uploadedBlobs.filter(b => !b.reused).length;
    const successfulCount = uploadedBlobs.length;
    const failedCount = failedFiles.length;

    logger.info(`[File Upload] 🎉 Upload complete → ${successfulCount}/${validUploadedFiles.length} files succeeded (${currentUploaded} uploaded, ${currentReused} reused), ${failedCount} failed`);

    if (failedCount > 0) {
      logger.warn(`[File Upload] ⚠️ Failed files:`, failedFiles);
      console.warn(`[File Upload] ⚠️ ${failedCount} files failed to upload:`, failedFiles.map(f => f.name).join(', '));
    }

    const reusedCount = uploadedBlobs.filter(b => b.reused).length;
    const uploadedCount = uploadedBlobs.filter(b => !b.reused).length;
    logger.info(`[File Upload] 🎉 Upload phase complete! Total: ${successfulCount} files (${uploadedCount} uploaded, ${reusedCount} reused)`);

    const uploadResults: FileUploadResult[] = uploadedBlobs.map(blob => blob.result);
    const filePaths: string[] = uploadedBlobs.map(blob => blob.filePath);

    // Update directory with file blobs (only for successfully uploaded files)
    console.log('Updating directory with blob references...');
    updateJobProgress(jobId, { phase: 'creating_manifest' });

    // Create a set of successfully uploaded paths for quick lookup
    const successfulPaths = new Set(filePaths.map(path => path.replace(/^[^\/]*\//, '')));

    const updatedDirectory = updateFileBlobs(directory, uploadResults, filePaths, '', successfulPaths);

    // Calculate actual file count (only successfully uploaded files)
    const actualFileCount = uploadedBlobs.length;

    // Check if we need to split into subfs records
    // Split proactively if we have lots of files to avoid hitting manifest size limits
    const MAX_MANIFEST_SIZE = 140 * 1024; // 140KB to be safe (PDS limit is 150KB)
    const FILE_COUNT_THRESHOLD = 250; // Start splitting at this many files
    const TARGET_FILE_COUNT = 200; // Try to keep main manifest under this many files
    const subfsRecords: Array<{ uri: string; path: string }> = [];
    let workingDirectory = updatedDirectory;
    let currentFileCount = actualFileCount;

    // Create initial manifest to check size
    let manifest = createManifest(siteName, workingDirectory, actualFileCount);
    let manifestSize = JSON.stringify(manifest).length;

    // Split if we have lots of files OR if manifest is already too large
    if (actualFileCount >= FILE_COUNT_THRESHOLD || manifestSize > MAX_MANIFEST_SIZE) {
      console.log(`⚠️ Large site detected (${actualFileCount} files, ${(manifestSize / 1024).toFixed(1)}KB), splitting into subfs records...`);
      logger.info(`Large site with ${actualFileCount} files, splitting into subfs records`);

      // Keep splitting until manifest fits under limits (both size and file count)
      let attempts = 0;
      const MAX_ATTEMPTS = 100; // Allow many splits for very large sites

      while ((manifestSize > MAX_MANIFEST_SIZE || currentFileCount > TARGET_FILE_COUNT) && attempts < MAX_ATTEMPTS) {
        attempts++;

        // Find all directories sorted by size (largest first)
        const directories = findLargeDirectories(workingDirectory);
        directories.sort((a, b) => b.size - a.size);

        // Check if we can split subdirectories or need to split flat files
        if (directories.length > 0) {
          // Split the largest subdirectory
          const largestDir = directories[0];
          console.log(` Split #${attempts}: ${largestDir.path} (${largestDir.fileCount} files, ${(largestDir.size / 1024).toFixed(1)}KB)`);

          // Create a subfs record for this directory
          const subfsRkey = TID.nextStr();
          const subfsManifest = {
            $type: 'place.wisp.subfs' as const,
            root: largestDir.directory,
            fileCount: largestDir.fileCount,
            createdAt: new Date().toISOString()
          };

          // Validate subfs record
          const subfsValidation = validateSubfsRecord(subfsManifest);
          if (!subfsValidation.success) {
            throw new Error(`Invalid subfs manifest: ${subfsValidation.error?.message || 'Validation failed'}`);
          }

          // Upload subfs record to PDS
          const subfsRecord = await agent.com.atproto.repo.putRecord({
            repo: did,
            collection: 'place.wisp.subfs',
            rkey: subfsRkey,
            record: subfsManifest
          });

          const subfsUri = subfsRecord.data.uri;
          subfsRecords.push({ uri: subfsUri, path: largestDir.path });
          console.log(` ✅ Created subfs: ${subfsUri}`);
          logger.info(`Created subfs record for ${largestDir.path}: ${subfsUri}`);

          // Replace directory with subfs node in the main tree
          workingDirectory = replaceDirectoryWithSubfs(workingDirectory, largestDir.path, subfsUri);
          currentFileCount -= largestDir.fileCount;
        } else {
          // No subdirectories - split flat files at root level
          const rootFiles = workingDirectory.entries.filter(e => 'type' in e.node && e.node.type === 'file');

          if (rootFiles.length === 0) {
            throw new Error(
              `Cannot split manifest further - no files or directories available. ` +
              `Current: ${currentFileCount} files, ${(manifestSize / 1024).toFixed(1)}KB.`
            );
          }

          // Take a chunk of files (aim for ~100 files per chunk)
          const CHUNK_SIZE = 100;
          const chunkFiles = rootFiles.slice(0, Math.min(CHUNK_SIZE, rootFiles.length));
          console.log(` Split #${attempts}: flat root (${chunkFiles.length} files)`);

          // Create a directory with just these files
          const chunkDirectory: Directory = {
            $type: 'place.wisp.fs#directory' as const,
            type: 'directory' as const,
            entries: chunkFiles
          };

          // Create subfs record for this chunk
          const subfsRkey = TID.nextStr();
          const subfsManifest = {
            $type: 'place.wisp.subfs' as const,
            root: chunkDirectory,
            fileCount: chunkFiles.length,
            createdAt: new Date().toISOString()
          };

          // Validate subfs record
          const subfsValidation = validateSubfsRecord(subfsManifest);
          if (!subfsValidation.success) {
            throw new Error(`Invalid subfs manifest: ${subfsValidation.error?.message || 'Validation failed'}`);
          }

          // Upload subfs record to PDS
          const subfsRecord = await agent.com.atproto.repo.putRecord({
            repo: did,
            collection: 'place.wisp.subfs',
            rkey: subfsRkey,
            record: subfsManifest
          });

          const subfsUri = subfsRecord.data.uri;
          console.log(` ✅ Created flat subfs: ${subfsUri}`);
          logger.info(`Created flat subfs record with ${chunkFiles.length} files: ${subfsUri}`);

          // Remove these files from the working directory and add a subfs entry
          const remainingEntries = workingDirectory.entries.filter(
            e => !chunkFiles.some(cf => cf.name === e.name)
          );

          // Add subfs entry (will be merged flat when expanded)
          remainingEntries.push({
            name: `__subfs_${attempts}`, // Placeholder name, will be merged away
            node: {
              $type: 'place.wisp.fs#subfs' as const,
              type: 'subfs' as const,
              subject: subfsUri,
              flat: true // Merge entries directly into parent (default, but explicit for clarity)
            }
          });

          workingDirectory = {
            $type: 'place.wisp.fs#directory' as const,
            type: 'directory' as const,
            entries: remainingEntries
          };

          subfsRecords.push({ uri: subfsUri, path: `__subfs_${attempts}` });
          currentFileCount -= chunkFiles.length;
        }

        // Recreate manifest and check new size
        manifest = createManifest(siteName, workingDirectory, currentFileCount);
        manifestSize = JSON.stringify(manifest).length;
        const newSizeKB = (manifestSize / 1024).toFixed(1);
        console.log(` → Manifest now ${newSizeKB}KB with ${currentFileCount} files (${subfsRecords.length} subfs total)`);

        // Check if we're under both limits now
        if (manifestSize <= MAX_MANIFEST_SIZE && currentFileCount <= TARGET_FILE_COUNT) {
          console.log(` ✅ Manifest fits! (${currentFileCount} files, ${newSizeKB}KB)`);
          break;
        }
      }

      if (manifestSize > MAX_MANIFEST_SIZE || currentFileCount > TARGET_FILE_COUNT) {
        throw new Error(
          `Failed to fit manifest after splitting ${attempts} directories. ` +
          `Current: ${currentFileCount} files, ${(manifestSize / 1024).toFixed(1)}KB. ` +
          `This should never happen - please report this issue.`
        );
      }

      console.log(`✅ Split complete: ${subfsRecords.length} subfs records, ${currentFileCount} files in main, ${(manifestSize / 1024).toFixed(1)}KB manifest`);
      logger.info(`Split into ${subfsRecords.length} subfs records, ${currentFileCount} files remaining in main tree`);
    } else {
      const manifestSizeKB = (manifestSize / 1024).toFixed(1);
      console.log(`Manifest created (${fileCount} files, ${manifestSizeKB}KB JSON) - no splitting needed`);
    }

    const rkey = siteName;
    updateJobProgress(jobId, { phase: 'finalizing' });

    console.log('Putting record to PDS with rkey:', rkey);
    const record = await agent.com.atproto.repo.putRecord({
      repo: did,
      collection: 'place.wisp.fs',
      rkey: rkey,
      record: manifest
    });
    console.log('Record successfully created on PDS:', record.data.uri);

    // Store site in database cache
    await upsertSite(did, rkey, siteName);

    // Clean up old subfs records if we had any
    if (oldSubfsUris.length > 0) {
      console.log(`Cleaning up ${oldSubfsUris.length} old subfs records...`);
      logger.info(`Cleaning up ${oldSubfsUris.length} old subfs records`);

      // Delete old subfs records in parallel (don't wait for completion)
      Promise.all(
        oldSubfsUris.map(async ({ uri }) => {
          try {
            // Parse URI: at://did/collection/rkey
            const parts = uri.replace('at://', '').split('/');
            const subRkey = parts[2];

            await agent.com.atproto.repo.deleteRecord({
              repo: did,
              collection: 'place.wisp.subfs',
              rkey: subRkey
            });

            console.log(` 🗑️ Deleted old subfs: ${uri}`);
            logger.info(`Deleted old subfs record: ${uri}`);
          } catch (err: any) {
            // Don't fail the whole upload if cleanup fails
            console.warn(`Failed to delete old subfs ${uri}:`, err?.message);
            logger.warn(`Failed to delete old subfs ${uri}`, err);
          }
        })
      ).catch(err => {
        // Log but don't fail if cleanup fails
        logger.warn('Some subfs cleanup operations failed', err);
      });
    }

    completeUploadJob(jobId, {
      success: true,
      uri: record.data.uri,
      cid: record.data.cid,
      fileCount,
      siteName,
      skippedFiles,
      failedFiles,
      uploadedCount: validUploadedFiles.length - failedFiles.length,
      hasFailures: failedFiles.length > 0
    });

    console.log('=== UPLOAD FILES COMPLETE ===');
  } catch (error) {
    console.error('=== UPLOAD ERROR ===');
    console.error('Error details:', error);
    logger.error('Upload error', error);
    failUploadJob(jobId, error instanceof Error ? error.message : 'Unknown error');
  }
}

export const wispRoutes = (client: NodeOAuthClient, cookieSecret: string) =>
  new Elysia({
    prefix: '/wisp',
    cookie: {
      secrets: cookieSecret,
      sign: ['did']
    }
  })
    .derive(async ({ cookie }) => {
      const auth = await requireAuth(client, cookie)
      return { auth }
    })
    .get(
      '/upload-progress/:jobId',
      async ({ params: { jobId }, auth, set }) => {
        const job = getUploadJob(jobId);

        if (!job) {
          set.status = 404;
          return { error: 'Job not found' };
        }

        // Verify job belongs to authenticated user
        if (job.did !== auth.did) {
          set.status = 403;
          return { error: 'Unauthorized' };
        }

        // Set up SSE headers
        set.headers = {
          'Content-Type': 'text/event-stream',
          'Cache-Control': 'no-cache',
          'Connection': 'keep-alive'
        };

        let keepaliveInterval: ReturnType<typeof setInterval> | undefined;
        let removeListener: (() => void) | undefined;

        const stream = new ReadableStream({
          start(controller) {
            const encoder = new TextEncoder();

            // Send initial state
            const sendEvent = (event: string, data: any) => {
              try {
                const message = `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
                controller.enqueue(encoder.encode(message));
              } catch (err) {
                // Controller closed, ignore
              }
            };

            // Send keepalive comment every 15 seconds to prevent timeout
            keepaliveInterval = setInterval(() => {
              try {
                controller.enqueue(encoder.encode(': keepalive\n\n'));
              } catch (err) {
                // Controller closed, stop sending keepalives
                clearInterval(keepaliveInterval);
              }
            }, 15000);

            // Send current job state immediately
            sendEvent('progress', {
              status: job.status,
              progress: job.progress,
              result: job.result,
              error: job.error
            });

            // If job is already completed or failed, close the stream
            if (job.status === 'completed' || job.status === 'failed') {
              clearInterval(keepaliveInterval);
              controller.close();
              return;
            }

            // Listen for updates
            removeListener = addJobListener(jobId, (event, data) => {
              sendEvent(event, data);

              // Close stream after done or error event
              if (event === 'done' || event === 'error') {
                clearInterval(keepaliveInterval);
                setTimeout(() => {
                  try {
                    controller.close();
                  } catch (err) {
                    // Already closed
                  }
                }, 100);
              }
            });
          },
          // Cleanup on client disconnect. The Streams spec ignores a function
          // returned from start(), so teardown has to live in cancel().
          cancel() {
            if (keepaliveInterval) clearInterval(keepaliveInterval);
            removeListener?.();
          }
        });

        return new Response(stream);
      }
    )
    .post(
      '/upload-files',
      async ({ body, auth }) => {
        const { siteName, files } = body as {
          siteName: string;
          files: File | File[]
        };

        console.log('=== UPLOAD FILES START ===');
        console.log('Site name:', siteName);
        console.log('Files received:', Array.isArray(files) ? files.length : 'single file');

        try {
          if (!siteName) {
            throw new Error('Site name is required')
          }

          if (!isValidSiteName(siteName)) {
            throw new Error('Invalid site name: must be 1-512 characters and contain only alphanumeric, dots, dashes, underscores, tildes, and colons')
          }

          // Check if files were provided
          const hasFiles = files && (Array.isArray(files) ? files.length > 0 : !!files);

          if (!hasFiles) {
            // Handle empty upload synchronously (fast operation)
            const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))

            const emptyManifest = {
              $type: 'place.wisp.fs',
              site: siteName,
              root: {
                type: 'directory',
                entries: []
              },
              fileCount: 0,
              createdAt: new Date().toISOString()
            };

            const validationResult = validateRecord(emptyManifest);
            if (!validationResult.success) {
              throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
            }

            const rkey = siteName;

            const record = await agent.com.atproto.repo.putRecord({
              repo: auth.did,
              collection: 'place.wisp.fs',
              rkey: rkey,
              record: emptyManifest
            });

            await upsertSite(auth.did, rkey, siteName);

            return {
              success: true,
              uri: record.data.uri,
              cid: record.data.cid,
              fileCount: 0,
              siteName
            };
          }

          // For file uploads, create a job and process in background
          const fileArray = Array.isArray(files) ? files : [files];
          const jobId = createUploadJob(auth.did, siteName, fileArray.length);

          // Track upload speeds to estimate progress
          const uploadStats = {
            speeds: [] as number[], // MB/s from completed uploads
            getAverageSpeed(): number {
              if (this.speeds.length === 0) return 3; // Default 3 MB/s
              const sum = this.speeds.reduce((a, b) => a + b, 0);
              return sum / this.speeds.length;
            }
          };

          // Create agent with OAuth session and upload progress monitoring
          const wrappedFetchHandler = async (url: string, init?: RequestInit) => {
            // Check if this is an uploadBlob request with a body
            if (url.includes('uploadBlob') && init?.body) {
              const originalBody = init.body;
              const bodySize = originalBody instanceof Uint8Array ? originalBody.length :
                originalBody instanceof ArrayBuffer ? originalBody.byteLength :
                typeof originalBody === 'string' ? new TextEncoder().encode(originalBody).length : 0;

              const startTime = Date.now();

              if (bodySize > 10 * 1024 * 1024) { // Files over 10MB
                const sizeMB = (bodySize / 1024 / 1024).toFixed(1);
                const avgSpeed = uploadStats.getAverageSpeed();
                const estimatedDuration = (bodySize / 1024 / 1024) / avgSpeed;

                console.log(`[Upload Progress] Starting upload of ${sizeMB}MB file`);
                console.log(`[Upload Stats] Measured speeds from last ${uploadStats.speeds.length} files:`, uploadStats.speeds.map(s => s.toFixed(2) + ' MB/s').join(', '));
                console.log(`[Upload Stats] Average speed: ${avgSpeed.toFixed(2)} MB/s, estimated duration: ${estimatedDuration.toFixed(0)}s`);

                // Log estimated progress every 5 seconds
                const progressInterval = setInterval(() => {
                  const elapsed = (Date.now() - startTime) / 1000;
                  const estimatedPercent = Math.min(95, Math.round((elapsed / estimatedDuration) * 100));
                  const estimatedMB = Math.min(bodySize / 1024 / 1024, elapsed * avgSpeed).toFixed(1);
                  console.log(`[Upload Progress] ~${estimatedPercent}% (~${estimatedMB}/${sizeMB}MB) - ${elapsed.toFixed(0)}s elapsed`);
                }, 5000);

                try {
                  const result = await auth.session.fetchHandler(url, init);
                  clearInterval(progressInterval);
                  const totalTime = (Date.now() - startTime) / 1000;
                  const actualSpeed = (bodySize / 1024 / 1024) / totalTime;
                  uploadStats.speeds.push(actualSpeed);
                  // Keep only last 10 uploads for rolling average
                  if (uploadStats.speeds.length > 10) uploadStats.speeds.shift();
                  console.log(`[Upload Progress] ✅ Completed ${sizeMB}MB in ${totalTime.toFixed(1)}s (${actualSpeed.toFixed(1)} MB/s)`);
                  return result;
                } catch (err) {
                  clearInterval(progressInterval);
                  const elapsed = (Date.now() - startTime) / 1000;
                  console.error(`[Upload Progress] ❌ Upload failed after ${elapsed.toFixed(1)}s`);
                  throw err;
                }
              } else {
                // Track small files too for speed calculation
                const result = await auth.session.fetchHandler(url, init);
                const totalTime = (Date.now() - startTime) / 1000;
                if (totalTime > 0.5) { // Only track if > 0.5s
                  const actualSpeed = (bodySize / 1024 / 1024) / totalTime;
                  uploadStats.speeds.push(actualSpeed);
                  if (uploadStats.speeds.length > 10) uploadStats.speeds.shift();
                  console.log(`[Upload Stats] Small file: ${(bodySize / 1024).toFixed(1)}KB in ${totalTime.toFixed(2)}s = ${actualSpeed.toFixed(2)} MB/s`);
                }
                return result;
              }
            }

            // Normal request
            return auth.session.fetchHandler(url, init);
          };

          const agent = new Agent(wrappedFetchHandler)
          console.log('Agent created for DID:', auth.did);
          console.log('Created upload job:', jobId);

          // Start background processing (don't await)
          processUploadInBackground(jobId, agent, auth.did, siteName, fileArray).catch(err => {
            console.error('Background upload process failed:', err);
            logger.error('Background upload process failed', err);
          });

          // Return immediately with job ID
          return {
            success: true,
            jobId,
            message: 'Upload started. Connect to /wisp/upload-progress/' + jobId + ' for progress updates.'
          };
        } catch (error) {
          console.error('=== UPLOAD ERROR ===');
          console.error('Error details:', error);
          logger.error('Upload error', error);
          throw new Error(`Failed to upload files: ${error instanceof Error ? error.message : 'Unknown error'}`);
        }
      }
    )
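The upload route above leans on two reusable patterns: exponential-backoff retry around a flaky async call (as in `uploadBlobWithRetry`) and a sliding window that caps how many uploads are in flight at once (as in `processWithConcurrency`). A minimal standalone sketch of both; the names `withRetry` and `mapWithConcurrency` are illustrative and not part of this codebase:

```typescript
// Retry an async operation, doubling the wait between attempts
// (100ms, 200ms, 400ms, ...), rethrowing after the final attempt.
async function withRetry<T>(fn: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries - 1) throw err;
      await new Promise(resolve => setTimeout(resolve, 100 * 2 ** attempt));
    }
  }
}

// Run `worker` over `items`, keeping at most `limit` promises in flight.
// Results come back in input order regardless of completion order.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  worker: (item: T, index: number) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  const executing = new Set<Promise<void>>();
  for (let i = 0; i < items.length; i++) {
    const task = worker(items[i], i)
      .then(r => { results[i] = r; })
      .finally(() => { executing.delete(task); });
    executing.add(task);
    // Window full: wait for any task to settle before queuing the next one
    if (executing.size >= limit) await Promise.race(executing);
  }
  await Promise.all(executing);
  return results;
}
```

Racing the whole `executing` set (rather than awaiting tasks in order) is what makes this a sliding window: a slot reopens as soon as the fastest in-flight task settles, which is why the route keeps full throughput even when individual blob uploads vary from milliseconds to minutes.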
+14
apps/main-app/tsconfig.json
···
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@server": ["./src/index.ts"],
      "@server/*": ["./src/*"],
      "@public/*": ["./public/*"],
      "@wisp/*": ["../../packages/@wisp/*/src"]
    }
  },
  "include": ["src/**/*", "public/**/*", "scripts/**/*"],
  "exclude": ["node_modules"]
}
+447 -39
bun.lock
···
{
"lockfileVersion": 1,
+
"configVersion": 0,
"workspaces": {
"": {
"name": "elysia-static",
+
},
+
"apps/hosting-service": {
+
"name": "wisp-hosting-service",
+
"version": "1.0.0",
+
"dependencies": {
+
"@atproto/api": "^0.17.4",
+
"@atproto/identity": "^0.4.9",
+
"@atproto/lexicon": "^0.5.1",
+
"@atproto/sync": "^0.1.36",
+
"@atproto/xrpc": "^0.7.5",
+
"@hono/node-server": "^1.19.6",
+
"@wisp/atproto-utils": "workspace:*",
+
"@wisp/constants": "workspace:*",
+
"@wisp/database": "workspace:*",
+
"@wisp/fs-utils": "workspace:*",
+
"@wisp/lexicons": "workspace:*",
+
"@wisp/observability": "workspace:*",
+
"@wisp/safe-fetch": "workspace:*",
+
"hono": "^4.10.4",
+
"mime-types": "^2.1.35",
+
"multiformats": "^13.4.1",
+
"postgres": "^3.4.5",
+
},
+
"devDependencies": {
+
"@types/bun": "^1.3.1",
+
"@types/mime-types": "^2.1.4",
+
"@types/node": "^22.10.5",
+
"tsx": "^4.19.2",
+
},
+
},
+
"apps/main-app": {
+
"name": "@wisp/main-app",
+
"version": "1.0.50",
"dependencies": {
"@atproto/api": "^0.17.3",
"@atproto/lex-cli": "^0.9.5",
···
"@elysiajs/openapi": "^1.4.11",
"@elysiajs/opentelemetry": "^1.4.6",
"@elysiajs/static": "^1.4.2",
+
"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-dialog": "^1.1.15",
"@radix-ui/react-label": "^2.1.7",
"@radix-ui/react-radio-group": "^1.3.8",
"@radix-ui/react-slot": "^1.2.3",
"@radix-ui/react-tabs": "^1.1.13",
"@tanstack/react-query": "^5.90.2",
+
"@wisp/atproto-utils": "workspace:*",
+
"@wisp/constants": "workspace:*",
+
"@wisp/database": "workspace:*",
+
"@wisp/fs-utils": "workspace:*",
+
"@wisp/lexicons": "workspace:*",
+
"@wisp/observability": "workspace:*",
+
"actor-typeahead": "^0.1.1",
+
"atproto-ui": "^0.11.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"elysia": "latest",
"iron-session": "^8.0.4",
"lucide-react": "^0.546.0",
+
"multiformats": "^13.4.1",
+
"prismjs": "^1.30.0",
"react": "^19.2.0",
"react-dom": "^19.2.0",
"tailwind-merge": "^3.3.1",
···
"@types/react-dom": "^19.2.1",
"bun-plugin-tailwind": "^0.1.2",
"bun-types": "latest",
+
"esbuild": "0.26.0",
+
"playwright": "^1.49.0",
},
},
+
"packages/@wisp/atproto-utils": {
+
"name": "@wisp/atproto-utils",
+
"version": "1.0.0",
+
"dependencies": {
+
"@atproto/api": "^0.14.1",
+
"@wisp/lexicons": "workspace:*",
+
"multiformats": "^13.3.1",
+
},
+
},
+
"packages/@wisp/constants": {
+
"name": "@wisp/constants",
+
"version": "1.0.0",
+
},
+
"packages/@wisp/database": {
+
"name": "@wisp/database",
+
"version": "1.0.0",
+
"dependencies": {
+
"postgres": "^3.4.5",
+
},
+
"peerDependencies": {
+
"bun": "^1.0.0",
+
},
+
"optionalPeers": [
+
"bun",
+
],
+
},
+
"packages/@wisp/fs-utils": {
+
"name": "@wisp/fs-utils",
+
"version": "1.0.0",
+
"dependencies": {
+
"@atproto/api": "^0.14.1",
+
"@wisp/lexicons": "workspace:*",
+
},
+
},
+
"packages/@wisp/lexicons": {
+
"name": "@wisp/lexicons",
+
"version": "1.0.0",
+
"dependencies": {
+
"@atproto/lexicon": "^0.5.1",
+
"@atproto/xrpc-server": "^0.9.5",
+
},
+
"devDependencies": {
+
"@atproto/lex-cli": "^0.9.5",
+
},
+
},
+
"packages/@wisp/observability": {
+
"name": "@wisp/observability",
+
"version": "1.0.0",
+
"peerDependencies": {
+
"hono": "^4.0.0",
+
},
+
"optionalPeers": [
+
"hono",
+
],
+
},
+
"packages/@wisp/safe-fetch": {
+
"name": "@wisp/safe-fetch",
+
"version": "1.0.0",
+
},
},
"trustedDependencies": [
"core-js",
+
"cbor-extract",
+
"bun",
"protobufjs",
],
"packages": {
+ "@atcute/atproto": ["@atcute/atproto@3.1.9", "", { "dependencies": { "@atcute/lexicons": "^1.2.2" } }, "sha512-DyWwHCTdR4hY2BPNbLXgVmm7lI+fceOwWbE4LXbGvbvVtSn+ejSVFaAv01Ra3kWDha0whsOmbJL8JP0QPpf1+w=="],
+
+ "@atcute/bluesky": ["@atcute/bluesky@3.2.10", "", { "dependencies": { "@atcute/atproto": "^3.1.9", "@atcute/lexicons": "^1.2.2" } }, "sha512-qwQWTzRf3umnh2u41gdU+xWYkbzGlKDupc3zeOB+YjmuP1N9wEaUhwS8H7vgrqr0xC9SGNDjeUVcjC4m5BPLBg=="],
+
+ "@atcute/client": ["@atcute/client@4.0.5", "", { "dependencies": { "@atcute/identity": "^1.1.1", "@atcute/lexicons": "^1.2.2" } }, "sha512-R8Qen8goGmEkynYGg2m6XFlVmz0GTDvQ+9w+4QqOob+XMk8/WDpF4aImev7WKEde/rV2gjcqW7zM8E6W9NShDA=="],
+
+ "@atcute/identity": ["@atcute/identity@1.1.1", "", { "dependencies": { "@atcute/lexicons": "^1.2.2", "@badrap/valita": "^0.4.6" } }, "sha512-zax42n693VEhnC+5tndvO2KLDTMkHOz8UExwmklvJv7R9VujfEwiSWhcv6Jgwb3ellaG8wjiQ1lMOIjLLvwh0Q=="],
+
+ "@atcute/identity-resolver": ["@atcute/identity-resolver@1.1.4", "", { "dependencies": { "@atcute/lexicons": "^1.2.2", "@atcute/util-fetch": "^1.0.3", "@badrap/valita": "^0.4.6" }, "peerDependencies": { "@atcute/identity": "^1.0.0" } }, "sha512-/SVh8vf2cXFJenmBnGeYF2aY3WGQm3cJeew5NWTlkqoy3LvJ5wkvKq9PWu4Tv653VF40rPOp6LOdVr9Fa+q5rA=="],
+
+ "@atcute/lexicons": ["@atcute/lexicons@1.2.2", "", { "dependencies": { "@standard-schema/spec": "^1.0.0", "esm-env": "^1.2.2" } }, "sha512-bgEhJq5Z70/0TbK5sx+tAkrR8FsCODNiL2gUEvS5PuJfPxmFmRYNWaMGehxSPaXWpU2+Oa9ckceHiYbrItDTkA=="],
+
+ "@atcute/tangled": ["@atcute/tangled@1.0.10", "", { "dependencies": { "@atcute/atproto": "^3.1.8", "@atcute/lexicons": "^1.2.2" } }, "sha512-DGconZIN5TpLBah+aHGbWI1tMsL7XzyVEbr/fW4CbcLWYKICU6SAUZ0YnZ+5GvltjlORWHUy7hfftvoh4zodIA=="],
+
+ "@atcute/util-fetch": ["@atcute/util-fetch@1.0.3", "", { "dependencies": { "@badrap/valita": "^0.4.6" } }, "sha512-f8zzTb/xlKIwv2OQ31DhShPUNCmIIleX6p7qIXwWwEUjX6x8skUtpdISSjnImq01LXpltGV5y8yhV4/Mlb7CRQ=="],
+
"@atproto-labs/did-resolver": ["@atproto-labs/did-resolver@0.2.2", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "zod": "^3.23.8" } }, "sha512-ca2B7xR43tVoQ8XxBvha58DXwIH8cIyKQl6lpOKGkPUrJuFoO4iCLlDiSDi2Ueh+yE1rMDPP/qveHdajgDX3WQ=="],
"@atproto-labs/fetch": ["@atproto-labs/fetch@0.2.3", "", { "dependencies": { "@atproto-labs/pipe": "0.1.1" } }, "sha512-NZtbJOCbxKUFRFKMpamT38PUQMY0hX0p7TG5AEYOPhZKZEP7dHZ1K2s1aB8MdVH0qxmqX7nQleNrrvLf09Zfdw=="],
- "@atproto-labs/fetch-node": ["@atproto-labs/fetch-node@0.1.10", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "ipaddr.js": "^2.1.0", "undici": "^6.14.1" } }, "sha512-o7hGaonA71A6p7O107VhM6UBUN/g9tTyYohMp1q0Kf6xQ4npnuZYRSHSf2g6reSfGQJ1GoFNjBObETTT1ge/jQ=="],
+ "@atproto-labs/fetch-node": ["@atproto-labs/fetch-node@0.2.0", "", { "dependencies": { "@atproto-labs/fetch": "0.2.3", "@atproto-labs/pipe": "0.1.1", "ipaddr.js": "^2.1.0", "undici": "^6.14.1" } }, "sha512-Krq09nH/aeoiU2s9xdHA0FjTEFWG9B5FFenipv1iRixCcPc7V3DhTNDawxG9gI8Ny0k4dBVS9WTRN/IDzBx86Q=="],
"@atproto-labs/handle-resolver": ["@atproto-labs/handle-resolver@0.3.2", "", { "dependencies": { "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "zod": "^3.23.8" } }, "sha512-KIerCzh3qb+zZoqWbIvTlvBY0XPq0r56kwViaJY/LTe/3oPO2JaqlYKS/F4dByWBhHK6YoUOJ0sWrh6PMJl40A=="],
- "@atproto-labs/handle-resolver-node": ["@atproto-labs/handle-resolver-node@0.1.20", "", { "dependencies": { "@atproto-labs/fetch-node": "0.1.10", "@atproto-labs/handle-resolver": "0.3.2", "@atproto/did": "0.2.1" } }, "sha512-094EL61XN9M7vm22cloSOxk/gcTRaCK52vN7BYgXgdoEI8uJJMTFXenQqu+LRGwiCcjvyclcBqbaz0DzJep50Q=="],
+ "@atproto-labs/handle-resolver-node": ["@atproto-labs/handle-resolver-node@0.1.21", "", { "dependencies": { "@atproto-labs/fetch-node": "0.2.0", "@atproto-labs/handle-resolver": "0.3.2", "@atproto/did": "0.2.1" } }, "sha512-fuJy5Px5pGF3lJX/ATdurbT8tbmaFWtf+PPxAQDFy7ot2no3t+iaAgymhyxYymrssOuWs6BwOP8tyF3VrfdwtQ=="],
"@atproto-labs/identity-resolver": ["@atproto-labs/identity-resolver@0.3.2", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver": "0.3.2" } }, "sha512-MYxO9pe0WsFyi5HFdKAwqIqHfiF2kBPoVhAIuH/4PYHzGr799ED47xLhNMxR3ZUYrJm5+TQzWXypGZ0Btw1Ffw=="],
···
"@atproto-labs/simple-store-memory": ["@atproto-labs/simple-store-memory@0.1.4", "", { "dependencies": { "@atproto-labs/simple-store": "0.3.0", "lru-cache": "^10.2.0" } }, "sha512-3mKY4dP8I7yKPFj9VKpYyCRzGJOi5CEpOLPlRhoJyLmgs3J4RzDrjn323Oakjz2Aj2JzRU/AIvWRAZVhpYNJHw=="],
- "@atproto/api": ["@atproto/api@0.17.3", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-pdQXhUAapNPdmN00W0vX5ta/aMkHqfgBHATt20X02XwxQpY2AnrPm2Iog4FyjsZqoHooAtCNV/NWJ4xfddJzsg=="],
+ "@atproto/api": ["@atproto/api@0.14.22", "", { "dependencies": { "@atproto/common-web": "^0.4.1", "@atproto/lexicon": "^0.4.10", "@atproto/syntax": "^0.4.0", "@atproto/xrpc": "^0.6.12", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-ziXPau+sUdFovObSnsoN7JbOmUw1C5e5L28/yXf3P8vbEnSS3HVVGD1jYcscBYY34xQqi4bVDpwMYx/4yRsTuQ=="],
"@atproto/common": ["@atproto/common@0.4.12", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@ipld/dag-cbor": "^7.0.3", "cbor-x": "^1.5.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, "sha512-NC+TULLQiqs6MvNymhQS5WDms3SlbIKGLf4n33tpftRJcalh507rI+snbcUb7TLIkKw7VO17qMqxEXtIdd5auQ=="],
···
"@atproto/did": ["@atproto/did@0.2.1", "", { "dependencies": { "zod": "^3.23.8" } }, "sha512-1i5BTU2GnBaaeYWhxUOnuEKFVq9euT5+dQPFabHpa927BlJ54PmLGyBBaOI7/NbLmN5HWwBa18SBkMpg3jGZRA=="],
+ "@atproto/identity": ["@atproto/identity@0.4.10", "", { "dependencies": { "@atproto/common-web": "^0.4.4", "@atproto/crypto": "^0.4.4" } }, "sha512-nQbzDLXOhM8p/wo0cTh5DfMSOSHzj6jizpodX37LJ4S1TZzumSxAjHEZa5Rev3JaoD5uSWMVE0MmKEGWkPPvfQ=="],
+
"@atproto/jwk": ["@atproto/jwk@0.6.0", "", { "dependencies": { "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-bDoJPvt7TrQVi/rBfBrSSpGykhtIriKxeYCYQTiPRKFfyRhbgpElF0wPXADjIswnbzZdOwbY63az4E/CFVT3Tw=="],
"@atproto/jwk-jose": ["@atproto/jwk-jose@0.1.11", "", { "dependencies": { "@atproto/jwk": "0.6.0", "jose": "^5.2.0" } }, "sha512-i4Fnr2sTBYmMmHXl7NJh8GrCH+tDQEVWrcDMDnV5DjJfkgT17wIqvojIw9SNbSL4Uf0OtfEv6AgG0A+mgh8b5Q=="],
"@atproto/jwk-webcrypto": ["@atproto/jwk-webcrypto@0.2.0", "", { "dependencies": { "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "zod": "^3.23.8" } }, "sha512-UmgRrrEAkWvxwhlwe30UmDOdTEFidlIzBC7C3cCbeJMcBN1x8B3KH+crXrsTqfWQBG58mXgt8wgSK3Kxs2LhFg=="],
- "@atproto/lex-cli": ["@atproto/lex-cli@0.9.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "chalk": "^4.1.2", "commander": "^9.4.0", "prettier": "^3.2.5", "ts-morph": "^24.0.0", "yesno": "^0.4.0", "zod": "^3.23.8" }, "bin": { "lex": "dist/index.js" } }, "sha512-zun4jhD1dbjD7IHtLIjh/1UsMx+6E8+OyOT2GXYAKIj9N6wmLKM/v2OeQBKxcyqUmtRi57lxWnGikWjjU7pplQ=="],
+ "@atproto/lex-cbor": ["@atproto/lex-cbor@0.0.1", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "multiformats": "^9.9.0", "tslib": "^2.8.1" } }, "sha512-GCgowcC041tYmsoIxalIECJq4ZRHgREk6lFa4BzNRUZarMqwz57YF/7eUlo2Q6hoaMUL7Bjr6FvXwcZFaKrhvA=="],
+
+ "@atproto/lex-cli": ["@atproto/lex-cli@0.9.6", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "chalk": "^4.1.2", "commander": "^9.4.0", "prettier": "^3.2.5", "ts-morph": "^24.0.0", "yesno": "^0.4.0", "zod": "^3.23.8" }, "bin": { "lex": "dist/index.js" } }, "sha512-EedEKmURoSP735YwSDHsFrLOhZ4P2it8goCHv5ApWi/R9DFpOKOpmYfIXJ9MAprK8cw+yBnjDJbzpLJy7UXlTg=="],
+
+ "@atproto/lex-data": ["@atproto/lex-data@0.0.1", "", { "dependencies": { "@atproto/syntax": "0.4.1", "multiformats": "^9.9.0", "tslib": "^2.8.1", "uint8arrays": "3.0.0", "unicode-segmenter": "^0.14.0" } }, "sha512-DrS/8cQcQs3s5t9ELAFNtyDZ8/PdiCx47ALtFEP2GnX2uCBHZRkqWG7xmu6ehjc787nsFzZBvlnz3T/gov5fGA=="],
+
+ "@atproto/lex-json": ["@atproto/lex-json@0.0.1", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "tslib": "^2.8.1" } }, "sha512-ivcF7+pDRuD/P97IEKQ/9TruunXj0w58Khvwk3M6psaI5eZT6LRsRZ4cWcKaXiFX4SHnjy+x43g0f7pPtIsERg=="],
- "@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+ "@atproto/lexicon": ["@atproto/lexicon@0.5.2", "", { "dependencies": { "@atproto/common-web": "^0.4.4", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-lRmJgMA8f5j7VB5Iu5cp188ald5FuI4FlmZ7nn6EBrk1dgOstWVrI5Ft6K3z2vjyLZRG6nzknlsw+tDP63p7bQ=="],
+
+ "@atproto/oauth-client": ["@atproto/oauth-client@0.5.8", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/fetch": "0.2.3", "@atproto-labs/handle-resolver": "0.3.2", "@atproto-labs/identity-resolver": "0.3.2", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/oauth-types": "0.5.0", "@atproto/xrpc": "0.7.5", "core-js": "^3", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-7YEym6d97+Dd73qGdkQTXi5La8xvCQxwRUDzzlR/NVAARa9a4YP7MCmqBJVeP2anT0By+DSAPyPDLTsxcjIcCg=="],
+
+ "@atproto/oauth-client-node": ["@atproto/oauth-client-node@0.3.10", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver-node": "0.1.21", "@atproto-labs/simple-store": "0.3.0", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "@atproto/jwk-webcrypto": "0.2.0", "@atproto/oauth-client": "0.5.8", "@atproto/oauth-types": "0.5.0" } }, "sha512-6khKlJqu1Ed5rt3rzcTD5hymB6JUjKdOHWYXwiphw4inkAIo6GxLCighI4eGOqZorYk2j8ueeTNB6KsgH0kcRw=="],
- "@atproto/oauth-client": ["@atproto/oauth-client@0.5.7", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/fetch": "0.2.3", "@atproto-labs/handle-resolver": "0.3.2", "@atproto-labs/identity-resolver": "0.3.2", "@atproto-labs/simple-store": "0.3.0", "@atproto-labs/simple-store-memory": "0.1.4", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/oauth-types": "0.4.2", "@atproto/xrpc": "0.7.5", "core-js": "^3", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-pDvbvy9DCxrAJv7bAbBUzWrHZKhFy091HvEMZhr+EyZA6gSCGYmmQJG/coDj0oICSVQeafAZd+IxR0YUCWwmEg=="],
+ "@atproto/oauth-types": ["@atproto/oauth-types@0.5.0", "", { "dependencies": { "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "zod": "^3.23.8" } }, "sha512-33xz7HcXhbl+XRqbIMVu3GE02iK1nKe2oMWENASsfZEYbCz2b9ZOarOFuwi7g4LKqpGowGp0iRKsQHFcq4SDaQ=="],
- "@atproto/oauth-client-node": ["@atproto/oauth-client-node@0.3.9", "", { "dependencies": { "@atproto-labs/did-resolver": "0.2.2", "@atproto-labs/handle-resolver-node": "0.1.20", "@atproto-labs/simple-store": "0.3.0", "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "@atproto/jwk-jose": "0.1.11", "@atproto/jwk-webcrypto": "0.2.0", "@atproto/oauth-client": "0.5.7", "@atproto/oauth-types": "0.4.2" } }, "sha512-JdzwDQ8Gczl0lgfJNm7lG7omkJ4yu99IuGkkRWixpEvKY/jY/mDZaho+3pfd29SrUvwQOOx4Bm4l7DGeYwxxyA=="],
+ "@atproto/repo": ["@atproto/repo@0.8.11", "", { "dependencies": { "@atproto/common": "^0.5.0", "@atproto/common-web": "^0.4.4", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.2", "@ipld/dag-cbor": "^7.0.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "varint": "^6.0.0", "zod": "^3.23.8" } }, "sha512-b/WCu5ITws4ILHoXiZz0XXB5U9C08fUVzkBQDwpnme62GXv8gUaAPL/ttG61OusW09ARwMMQm4vxoP0hTFg+zA=="],
- "@atproto/oauth-types": ["@atproto/oauth-types@0.4.2", "", { "dependencies": { "@atproto/did": "0.2.1", "@atproto/jwk": "0.6.0", "zod": "^3.23.8" } }, "sha512-gcfNTyFsPJcYDf79M0iKHykWqzxloscioKoerdIN3MTS3htiNOSgZjm2p8ho7pdrElLzea3qktuhTQI39j1XFQ=="],
+ "@atproto/sync": ["@atproto/sync@0.1.38", "", { "dependencies": { "@atproto/common": "^0.5.0", "@atproto/identity": "^0.4.10", "@atproto/lexicon": "^0.5.2", "@atproto/repo": "^0.8.11", "@atproto/syntax": "^0.4.1", "@atproto/xrpc-server": "^0.10.0", "multiformats": "^9.9.0", "p-queue": "^6.6.2", "ws": "^8.12.0" } }, "sha512-2rE0SM21Nk4hWw/XcIYFnzlWO6/gBg8mrzuWbOvDhD49sA/wW4zyjaHZ5t1gvk28/SLok2VZiIR8nYBdbf7F5Q=="],
"@atproto/syntax": ["@atproto/syntax@0.4.1", "", {}, "sha512-CJdImtLAiFO+0z3BWTtxwk6aY5w4t8orHTMVJgkf++QRJWTxPbIFko/0hrkADB7n2EruDxDSeAgfUGehpH6ngw=="],
+ "@atproto/ws-client": ["@atproto/ws-client@0.0.3", "", { "dependencies": { "@atproto/common": "^0.5.0", "ws": "^8.12.0" } }, "sha512-eKqkTWBk6zuMY+6gs02eT7mS8Btewm8/qaL/Dp00NDCqpNC+U59MWvQsOWT3xkNGfd9Eip+V6VI4oyPvAfsfTA=="],
+
"@atproto/xrpc": ["@atproto/xrpc@0.7.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "zod": "^3.23.8" } }, "sha512-MUYNn5d2hv8yVegRL0ccHvTHAVj5JSnW07bkbiaz96UH45lvYNRVwt44z+yYVnb0/mvBzyD3/ZQ55TRGt7fHkA=="],
"@atproto/xrpc-server": ["@atproto/xrpc-server@0.9.5", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@atproto/xrpc": "^0.7.5", "cbor-x": "^1.5.1", "express": "^4.17.2", "http-errors": "^2.0.0", "mime-types": "^2.1.35", "rate-limiter-flexible": "^2.4.1", "uint8arrays": "3.0.0", "ws": "^8.12.0", "zod": "^3.23.8" } }, "sha512-V0srjUgy6mQ5yf9+MSNBLs457m4qclEaWZsnqIE7RfYywvntexTAbMoo7J7ONfTNwdmA9Gw4oLak2z2cDAET4w=="],
+
+ "@badrap/valita": ["@badrap/valita@0.4.6", "", {}, "sha512-4kdqcjyxo/8RQ8ayjms47HCWZIF5981oE5nIenbfThKDxWXtEHKipAOWlflpPJzZx9y/JWYQkp18Awr7VuepFg=="],
"@borewit/text-codec": ["@borewit/text-codec@0.1.1", "", {}, "sha512-5L/uBxmjaCIX5h8Z+uu+kA9BQLkc/Wl06UGR5ajNRxu+/XjonB5i8JpgFMrPj3LXTCPA0pv8yxUvbUi+QthGGA=="],
···
"@elysiajs/cors": ["@elysiajs/cors@1.4.0", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-pb0SCzBfFbFSYA/U40HHO7R+YrcXBJXOWgL20eSViK33ol1e20ru2/KUaZYo5IMUn63yaTJI/bQERuQ+77ND8g=="],
- "@elysiajs/eden": ["@elysiajs/eden@1.4.3", "", { "peerDependencies": { "elysia": ">= 1.4.0-exp.0" } }, "sha512-mX0v5cTvTJiDsDWNEEyuoqudOvW5J+tXsvp/ZOJXJF3iCIEJI0Brvm78ymPrvwiOG4nUr3lS8BxUfbNf32DSXA=="],
+ "@elysiajs/eden": ["@elysiajs/eden@1.4.4", "", { "peerDependencies": { "elysia": ">= 1.4.0-exp.0" } }, "sha512-/LVqflmgUcCiXb8rz1iRq9Rx3SWfIV/EkoNqDFGMx+TvOyo8QHAygFXAVQz7RHs+jk6n6mEgpI6KlKBANoErsQ=="],
"@elysiajs/openapi": ["@elysiajs/openapi@1.4.11", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-d75bMxYJpN6qSDi/z9L1S7SLk1S/8Px+cTb3W2lrYzU8uQ5E0kXdy1oOMJEfTyVsz3OA19NP9KNxE7ztSbLBLg=="],
"@elysiajs/opentelemetry": ["@elysiajs/opentelemetry@1.4.6", "", { "dependencies": { "@opentelemetry/api": "^1.9.0", "@opentelemetry/instrumentation": "^0.200.0", "@opentelemetry/sdk-node": "^0.200.0" }, "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-jR7t4M6ZvMnBqzzHsNTL6y3sNq9jbGi2vKxbkizi/OO5tlvlKl/rnBGyFjZUjQ1Hte7rCz+2kfmgOQMhkjk+Og=="],
- "@elysiajs/static": ["@elysiajs/static@1.4.2", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-lAEvdxeBhU/jX/hTzfoP+1AtqhsKNCwW4Q+tfNwAShWU6s4ZPQxR1hLoHBveeApofJt4HWEq/tBGvfFz3ODuKg=="],
+ "@elysiajs/static": ["@elysiajs/static@1.4.6", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-cd61aY/DHOVhlnBjzTBX8E1XANIrsCH8MwEGHeLMaZzNrz0gD4Q8Qsde2dFMzu81I7ZDaaZ2Rim9blSLtUrYBg=="],
+
+ "@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.26.0", "", { "os": "aix", "cpu": "ppc64" }, "sha512-hj0sKNCQOOo2fgyII3clmJXP28VhgDfU5iy3GNHlWO76KG6N7x4D9ezH5lJtQTG+1J6MFDAJXC1qsI+W+LvZoA=="],
+
+ "@esbuild/android-arm": ["@esbuild/android-arm@0.26.0", "", { "os": "android", "cpu": "arm" }, "sha512-C0hkDsYNHZkBtPxxDx177JN90/1MiCpvBNjz1f5yWJo1+5+c5zr8apjastpEG+wtPjo9FFtGG7owSsAxyKiHxA=="],
+
+ "@esbuild/android-arm64": ["@esbuild/android-arm64@0.26.0", "", { "os": "android", "cpu": "arm64" }, "sha512-DDnoJ5eoa13L8zPh87PUlRd/IyFaIKOlRbxiwcSbeumcJ7UZKdtuMCHa1Q27LWQggug6W4m28i4/O2qiQQ5NZQ=="],
+
+ "@esbuild/android-x64": ["@esbuild/android-x64@0.26.0", "", { "os": "android", "cpu": "x64" }, "sha512-bKDkGXGZnj0T70cRpgmv549x38Vr2O3UWLbjT2qmIkdIWcmlg8yebcFWoT9Dku7b5OV3UqPEuNKRzlNhjwUJ9A=="],
+
+ "@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.26.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-6Z3naJgOuAIB0RLlJkYc81An3rTlQ/IeRdrU3dOea8h/PvZSgitZV+thNuIccw0MuK1GmIAnAmd5TrMZad8FTQ=="],
+
+ "@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.26.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-OPnYj0zpYW0tHusMefyaMvNYQX5pNQuSsHFTHUBNp3vVXupwqpxofcjVsUx11CQhGVkGeXjC3WLjh91hgBG2xw=="],
+
+ "@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.26.0", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-jix2fa6GQeZhO1sCKNaNMjfj5hbOvoL2F5t+w6gEPxALumkpOV/wq7oUBMHBn2hY2dOm+mEV/K+xfZy3mrsxNQ=="],
+
+ "@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.26.0", "", { "os": "freebsd", "cpu": "x64" }, "sha512-tccJaH5xHJD/239LjbVvJwf6T4kSzbk6wPFerF0uwWlkw/u7HL+wnAzAH5GB2irGhYemDgiNTp8wJzhAHQ64oA=="],
+
+ "@esbuild/linux-arm": ["@esbuild/linux-arm@0.26.0", "", { "os": "linux", "cpu": "arm" }, "sha512-JY8NyU31SyRmRpuc5W8PQarAx4TvuYbyxbPIpHAZdr/0g4iBr8KwQBS4kiiamGl2f42BBecHusYCsyxi7Kn8UQ=="],
+
+ "@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.26.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-IMJYN7FSkLttYyTbsbme0Ra14cBO5z47kpamo16IwggzzATFY2lcZAwkbcNkWiAduKrTgFJP7fW5cBI7FzcuNQ=="],
+
+ "@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.26.0", "", { "os": "linux", "cpu": "ia32" }, "sha512-XITaGqGVLgk8WOHw8We9Z1L0lbLFip8LyQzKYFKO4zFo1PFaaSKsbNjvkb7O8kEXytmSGRkYpE8LLVpPJpsSlw=="],
+
+ "@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-MkggfbDIczStUJwq9wU7gQ7kO33d8j9lWuOCDifN9t47+PeI+9m2QVh51EI/zZQ1spZtFMC1nzBJ+qNGCjJnsg=="],
+
+ "@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-fUYup12HZWAeccNLhQ5HwNBPr4zXCPgUWzEq2Rfw7UwqwfQrFZ0SR/JljaURR8xIh9t+o1lNUFTECUTmaP7yKA=="],
+
+ "@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.26.0", "", { "os": "linux", "cpu": "ppc64" }, "sha512-MzRKhM0Ip+//VYwC8tialCiwUQ4G65WfALtJEFyU0GKJzfTYoPBw5XNWf0SLbCUYQbxTKamlVwPmcw4DgZzFxg=="],
+
+ "@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.26.0", "", { "os": "linux", "cpu": "none" }, "sha512-QhCc32CwI1I4Jrg1enCv292sm3YJprW8WHHlyxJhae/dVs+KRWkbvz2Nynl5HmZDW/m9ZxrXayHzjzVNvQMGQA=="],
+
+ "@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.26.0", "", { "os": "linux", "cpu": "s390x" }, "sha512-1D6vi6lfI18aNT1aTf2HV+RIlm6fxtlAp8eOJ4mmnbYmZ4boz8zYDar86sIYNh0wmiLJEbW/EocaKAX6Yso2fw=="],
+
+ "@esbuild/linux-x64": ["@esbuild/linux-x64@0.26.0", "", { "os": "linux", "cpu": "x64" }, "sha512-rnDcepj7LjrKFvZkx+WrBv6wECeYACcFjdNPvVPojCPJD8nHpb3pv3AuR9CXgdnjH1O23btICj0rsp0L9wAnHA=="],
+
+ "@esbuild/netbsd-arm64": ["@esbuild/netbsd-arm64@0.26.0", "", { "os": "none", "cpu": "arm64" }, "sha512-FSWmgGp0mDNjEXXFcsf12BmVrb+sZBBBlyh3LwB/B9ac3Kkc8x5D2WimYW9N7SUkolui8JzVnVlWh7ZmjCpnxw=="],
- "@grpc/grpc-js": ["@grpc/grpc-js@1.14.0", "", { "dependencies": { "@grpc/proto-loader": "^0.8.0", "@js-sdsl/ordered-map": "^4.4.2" } }, "sha512-N8Jx6PaYzcTRNzirReJCtADVoq4z7+1KQ4E70jTg/koQiMoUSN1kbNjPOqpPbhMFhfU1/l7ixspPl8dNY+FoUg=="],
+ "@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.26.0", "", { "os": "none", "cpu": "x64" }, "sha512-0QfciUDFryD39QoSPUDshj4uNEjQhp73+3pbSAaxjV2qGOEDsM67P7KbJq7LzHoVl46oqhIhJ1S+skKGR7lMXA=="],
+
+ "@esbuild/openbsd-arm64": ["@esbuild/openbsd-arm64@0.26.0", "", { "os": "openbsd", "cpu": "arm64" }, "sha512-vmAK+nHhIZWImwJ3RNw9hX3fU4UGN/OqbSE0imqljNbUQC3GvVJ1jpwYoTfD6mmXmQaxdJY6Hn4jQbLGJKg5Yw=="],
+
+ "@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.26.0", "", { "os": "openbsd", "cpu": "x64" }, "sha512-GPXF7RMkJ7o9bTyUsnyNtrFMqgM3X+uM/LWw4CeHIjqc32fm0Ir6jKDnWHpj8xHFstgWDUYseSABK9KCkHGnpg=="],
+
+ "@esbuild/openharmony-arm64": ["@esbuild/openharmony-arm64@0.26.0", "", { "os": "none", "cpu": "arm64" }, "sha512-nUHZ5jEYqbBthbiBksbmHTlbb5eElyVfs/s1iHQ8rLBq1eWsd5maOnDpCocw1OM8kFK747d1Xms8dXJHtduxSw=="],
+
+ "@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.26.0", "", { "os": "sunos", "cpu": "x64" }, "sha512-TMg3KCTCYYaVO+R6P5mSORhcNDDlemUVnUbb8QkboUtOhb5JWKAzd5uMIMECJQOxHZ/R+N8HHtDF5ylzLfMiLw=="],
+
+ "@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.26.0", "", { "os": "win32", "cpu": "arm64" }, "sha512-apqYgoAUd6ZCb9Phcs8zN32q6l0ZQzQBdVXOofa6WvHDlSOhwCWgSfVQabGViThS40Y1NA4SCvQickgZMFZRlA=="],
+
+ "@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.26.0", "", { "os": "win32", "cpu": "ia32" }, "sha512-FGJAcImbJNZzLWu7U6WB0iKHl4RuY4TsXEwxJPl9UZLS47agIZuILZEX3Pagfw7I4J3ddflomt9f0apfaJSbaw=="],
+
+ "@esbuild/win32-x64": ["@esbuild/win32-x64@0.26.0", "", { "os": "win32", "cpu": "x64" }, "sha512-WAckBKaVnmFqbEhbymrPK7M086DQMpL1XoRbpmN0iW8k5JSXjDRQBhcZNa0VweItknLq9eAeCL34jK7/CDcw7A=="],
+
+ "@grpc/grpc-js": ["@grpc/grpc-js@1.14.1", "", { "dependencies": { "@grpc/proto-loader": "^0.8.0", "@js-sdsl/ordered-map": "^4.4.2" } }, "sha512-sPxgEWtPUR3EnRJCEtbGZG2iX8LQDUls2wUS3o27jg07KqJFMq6YDeWvMo1wfpmy3rqRdS0rivpLwhqQtEyCuQ=="],
"@grpc/proto-loader": ["@grpc/proto-loader@0.8.0", "", { "dependencies": { "lodash.camelcase": "^4.3.0", "long": "^5.0.0", "protobufjs": "^7.5.3", "yargs": "^17.7.2" }, "bin": { "proto-loader-gen-types": "build/bin/proto-loader-gen-types.js" } }, "sha512-rc1hOQtjIWGxcxpb9aHAfLpIctjEnsDehj0DAiVfBlmT84uvR0uUtN2hEi/ecvWVjXUGf5qPF4qEgiLOx1YIMQ=="],
+
+ "@hono/node-server": ["@hono/node-server@1.19.6", "", { "peerDependencies": { "hono": "^4" } }, "sha512-Shz/KjlIeAhfiuE93NDKVdZ7HdBVLQAfdbaXEaoAVO3ic9ibRSLGIQGkcBbFyuLr+7/1D5ZCINM8B+6IvXeMtw=="],
"@ipld/dag-cbor": ["@ipld/dag-cbor@7.0.3", "", { "dependencies": { "cborg": "^1.6.0", "multiformats": "^9.5.4" } }, "sha512-1VVh2huHsuohdXC1bGJNE8WR72slZ9XE2T3wbBBq31dm7ZBatmKLLxrB+XAqafxfRFjv08RZmj/W/ZqaM13AuA=="],
···
"@opentelemetry/sdk-trace-node": ["@opentelemetry/sdk-trace-node@2.0.0", "", { "dependencies": { "@opentelemetry/context-async-hooks": "2.0.0", "@opentelemetry/core": "2.0.0", "@opentelemetry/sdk-trace-base": "2.0.0" }, "peerDependencies": { "@opentelemetry/api": ">=1.0.0 <1.10.0" } }, "sha512-omdilCZozUjQwY3uZRBwbaRMJ3p09l4t187Lsdf0dGMye9WKD4NGcpgZRvqhI1dwcH6og+YXQEtoO9Wx3ykilg=="],
- "@opentelemetry/semantic-conventions": ["@opentelemetry/semantic-conventions@1.37.0", "", {}, "sha512-JD6DerIKdJGmRp4jQyX5FlrQjA4tjOw1cvfsPAZXfOOEErMUHjPcPSICS+6WnM0nB0efSFARh0KAZss+bvExOA=="],
+ "@opentelemetry/semantic-conventions": ["@opentelemetry/semantic-conventions@1.38.0", "", {}, "sha512-kocjix+/sSggfJhwXqClZ3i9Y/MI0fp7b+g7kCRm6psy2dsf8uApTRclwG18h8Avm7C9+fnt+O36PspJ/OzoWg=="],
- "@oven/bun-darwin-aarch64": ["@oven/bun-darwin-aarch64@1.3.0", "", { "os": "darwin", "cpu": "arm64" }, "sha512-WeXSaL29ylJEZMYHHW28QZ6rgAbxQ1KuNSZD9gvd3fPlo0s6s2PglvPArjjP07nmvIK9m4OffN0k4M98O7WmAg=="],
+ "@oven/bun-darwin-aarch64": ["@oven/bun-darwin-aarch64@1.3.2", "", { "os": "darwin", "cpu": "arm64" }, "sha512-licBDIbbLP5L5/S0+bwtJynso94XD3KyqSP48K59Sq7Mude6C7dR5ZujZm4Ut4BwZqUFfNOfYNMWBU5nlL7t1A=="],
- "@oven/bun-darwin-x64": ["@oven/bun-darwin-x64@1.3.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-CFKjoUWQH0Oz3UHYfKbdKLq0wGryrFsTJEYq839qAwHQSECvVZYAnxVVDYUDa0yQFonhO2qSHY41f6HK+b7xtw=="],
+ "@oven/bun-darwin-x64": ["@oven/bun-darwin-x64@1.3.2", "", { "os": "darwin", "cpu": "x64" }, "sha512-hn8lLzsYyyh6ULo2E8v2SqtrWOkdQKJwapeVy1rDw7juTTeHY3KDudGWf4mVYteC9riZU6HD88Fn3nGwyX0eIg=="],
- "@oven/bun-darwin-x64-baseline": ["@oven/bun-darwin-x64-baseline@1.3.0", "", { "os": "darwin", "cpu": "x64" }, "sha512-+FSr/ub5vA/EkD3fMhHJUzYioSf/sXd50OGxNDAntVxcDu4tXL/81Ka3R/gkZmjznpLFIzovU/1Ts+b7dlkrfw=="],
+ "@oven/bun-darwin-x64-baseline": ["@oven/bun-darwin-x64-baseline@1.3.2", "", { "os": "darwin", "cpu": "x64" }, "sha512-UHxdtbyxdtNJUNcXtIrjx3Lmq8ji3KywlXtIHV/0vn9A8W5mulqOcryqUWMFVH9JTIIzmNn6Q/qVmXHTME63Ww=="],
- "@oven/bun-linux-aarch64": ["@oven/bun-linux-aarch64@1.3.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-WHthS/eLkCNcp9pk4W8aubRl9fIUgt2XhHyLrP0GClB1FVvmodu/zIOtG0NXNpzlzB8+gglOkGo4dPjfVf4Z+g=="],
+ "@oven/bun-linux-aarch64": ["@oven/bun-linux-aarch64@1.3.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-5uZzxzvHU/z+3cZwN/A0H8G+enQ+9FkeJVZkE2fwK2XhiJZFUGAuWajCpy7GepvOWlqV7VjPaKi2+Qmr4IX7nQ=="],
- "@oven/bun-linux-aarch64-musl": ["@oven/bun-linux-aarch64-musl@1.3.0", "", { "os": "linux", "cpu": "arm64" }, "sha512-HT5sr7N8NDYbQRjAnT7ISpx64y+ewZZRQozOJb0+KQObKvg4UUNXGm4Pn1xA4/WPMZDDazjO8E2vtOQw1nJlAQ=="],
+ "@oven/bun-linux-aarch64-musl": ["@oven/bun-linux-aarch64-musl@1.3.2", "", { "os": "linux", "cpu": "arm64" }, "sha512-OD9DYkjes7WXieBn4zQZGXWhRVZhIEWMDGCetZ3H4vxIuweZ++iul/CNX5jdpNXaJ17myb1ROMvmRbrqW44j3w=="],
- "@oven/bun-linux-x64": ["@oven/bun-linux-x64@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-sGEWoJQXO4GDr0x4t/yJQ/Bq1yNkOdX9tHbZZ+DBGJt3z3r7jeb4Digv8xQUk6gdTFC9vnGHuin+KW3/yD1Aww=="],
+ "@oven/bun-linux-x64": ["@oven/bun-linux-x64@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-EoEuRP9bxAxVKuvi6tZ0ZENjueP4lvjz0mKsMzdG0kwg/2apGKiirH1l0RIcdmvfDGGuDmNiv/XBpkoXq1x8ug=="],
- "@oven/bun-linux-x64-baseline": ["@oven/bun-linux-x64-baseline@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-OmlEH3nlxQyv7HOvTH21vyNAZGv9DIPnrTznzvKiOQxkOphhCyKvPTlF13ydw4s/i18iwaUrhHy+YG9HSSxa4Q=="],
+ "@oven/bun-linux-x64-baseline": ["@oven/bun-linux-x64-baseline@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-m9Ov9YH8KjRLui87eNtQQFKVnjGsNk3xgbrR9c8d2FS3NfZSxmVjSeBvEsDjzNf1TXLDriHb/NYOlpiMf/QzDg=="],
- "@oven/bun-linux-x64-musl": ["@oven/bun-linux-x64-musl@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-rtzUEzCynl3Rhgn/iR9DQezSFiZMcAXAbU+xfROqsweMGKwvwIA2ckyyckO08psEP8XcUZTs3LT9CH7PnaMiEA=="],
+ "@oven/bun-linux-x64-musl": ["@oven/bun-linux-x64-musl@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-3TuOsRVoG8K+soQWRo+Cp5ACpRs6rTFSu5tAqc/6WrqwbNWmqjov/eWJPTgz3gPXnC7uNKVG7RxxAmV8r2EYTQ=="],
- "@oven/bun-linux-x64-musl-baseline": ["@oven/bun-linux-x64-musl-baseline@1.3.0", "", { "os": "linux", "cpu": "x64" }, "sha512-hrr7mDvUjMX1tuJaXz448tMsgKIqGJBY8+rJqztKOw1U5+a/v2w5HuIIW1ce7ut0ZwEn+KIDvAujlPvpH33vpQ=="],
+ "@oven/bun-linux-x64-musl-baseline": ["@oven/bun-linux-x64-musl-baseline@1.3.2", "", { "os": "linux", "cpu": "x64" }, "sha512-q8Hto8hcpofPJjvuvjuwyYvhOaAzPw1F5vRUUeOJDmDwZ4lZhANFM0rUwchMzfWUJCD6jg8/EVQ8MiixnZWU0A=="],
- "@oven/bun-windows-x64": ["@oven/bun-windows-x64@1.3.0", "", { "os": "win32", "cpu": "x64" }, "sha512-xXwtpZVVP7T+vkxcF/TUVVOGRjEfkByO4mKveKYb4xnHWV4u4NnV0oNmzyMKkvmj10to5j2h0oZxA4ZVVv4gfA=="],
+ "@oven/bun-windows-x64": ["@oven/bun-windows-x64@1.3.2", "", { "os": "win32", "cpu": "x64" }, "sha512-nZJUa5NprPYQ4Ii4cMwtP9PzlJJTp1XhxJ+A9eSn1Jfr6YygVWyN2KLjenyI93IcuBouBAaepDAVZZjH2lFBhg=="],
- "@oven/bun-windows-x64-baseline": ["@oven/bun-windows-x64-baseline@1.3.0", "", { "os": "win32", "cpu": "x64" }, "sha512-/jVZ8eYjpYHLDFNoT86cP+AjuWvpkzFY+0R0a1bdeu0sQ6ILuy1FV6hz1hUAP390E09VCo5oP76fnx29giHTtA=="],
+ "@oven/bun-windows-x64-baseline": ["@oven/bun-windows-x64-baseline@1.3.2", "", { "os": "win32", "cpu": "x64" }, "sha512-s00T99MjB+xLOWq+t+wVaVBrry+oBOZNiTJijt+bmkp/MJptYS3FGvs7a+nkjLNzoNDoWQcXgKew6AaHES37Bg=="],
"@protobufjs/aspromise": ["@protobufjs/aspromise@1.1.2", "", {}, "sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ=="],
···
"@radix-ui/primitive": ["@radix-ui/primitive@1.1.3", "", {}, "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg=="],
+ "@radix-ui/react-checkbox": ["@radix-ui/react-checkbox@1.3.3", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-presence": "1.1.5", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-controllable-state": "1.2.2", "@radix-ui/react-use-previous": "1.1.1", "@radix-ui/react-use-size": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-wBbpv+NQftHDdG86Qc0pIyXk5IR3tM8Vd0nWLKDcX8nNn4nXFOFwsKuqw2okA/1D/mpaAkmuyndrPJTYDNZtFw=="],
+
"@radix-ui/react-collection": ["@radix-ui/react-collection@1.1.7", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-slot": "1.2.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-Fh9rGN0MoI4ZFUNyfFVNU4y9LUz93u9/0K+yLgA2bwRojxM8JU1DyvvMBabnZPBgMWREAJvU2jjVzq+LrFUglw=="],
"@radix-ui/react-compose-refs": ["@radix-ui/react-compose-refs@1.1.2", "", { "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-z4eqJvfiNnFMHIIvXP3CY57y2WJs5g2v3X0zm9mEJkrkNv4rDxu+sg9Jh8EkXyeqBkB7SOcboo9dMVqhyrACIg=="],
···
"@radix-ui/react-id": ["@radix-ui/react-id@1.1.1", "", { "dependencies": { "@radix-ui/react-use-layout-effect": "1.1.1" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-kGkGegYIdQsOb4XjsfM97rXsiHaBwco+hFI66oO4s9LU+PLAC5oJ7khdOVFxkhsmlbpUqDAvXw11CluXP+jkHg=="],
- "@radix-ui/react-label": ["@radix-ui/react-label@2.1.7", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-YT1GqPSL8kJn20djelMX7/cTRp/Y9w5IZHvfxQTVHrOqa2yMl7i/UfMqKRU5V7mEyKTrUVgJXhNQPVCG8PBLoQ=="],
+ "@radix-ui/react-label": ["@radix-ui/react-label@2.1.8", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.4" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-FmXs37I6hSBVDlO4y764TNz1rLgKwjJMQ0EGte6F3Cb3f4bIuHB/iLa/8I9VKkmOy+gNHq8rql3j686ACVV21A=="],
"@radix-ui/react-portal": ["@radix-ui/react-portal@1.1.9", "", { "dependencies": { "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-layout-effect": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-bpIxvq03if6UNwXZ+HTK71JLh4APvnXntDc6XOX8UVq4XQOVl7lwok0AvIl+b8zgCw3fSaVTZMpAPPagXbKmHQ=="],
···
"@radix-ui/react-roving-focus": ["@radix-ui/react-roving-focus@1.1.11", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-collection": "1.1.7", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-direction": "1.1.1", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-callback-ref": "1.1.1", "@radix-ui/react-use-controllable-state": "1.2.2" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-7A6S9jSgm/S+7MdtNDSb+IU859vQqJ/QAtcYQcfFC6W8RS4IxIZDldLR0xqCFZ6DCyrQLjLPsxtTNch5jVA4lA=="],
- "@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+ "@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.4", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-Jl+bCv8HxKnlTLVrcDE8zTMJ09R9/ukw4qBs/oZClOfoQk/cOTbDn+NceXfV7j09YPVQUryJPHurafcSg6EVKA=="],
"@radix-ui/react-tabs": ["@radix-ui/react-tabs@1.1.13", "", { "dependencies": { "@radix-ui/primitive": "1.1.3", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-direction": "1.1.1", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-presence": "1.1.5", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-roving-focus": "1.1.11", "@radix-ui/react-use-controllable-state": "1.2.2" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-7xdcatg7/U+7+Udyoj2zodtI9H/IIopqo+YOIcZOq1nJwXWBZ9p8xiu5llXlekDbZkca79a/fozEYQXIA4sW6A=="],
···
"@sinclair/typebox": ["@sinclair/typebox@0.34.41", "", {}, "sha512-6gS8pZzSXdyRHTIqoqSVknxolr1kzfy4/CeDnrzsVz8TTIWUbOBr6gnzOmTYJ3eXQNh4IYHIGi5aIL7sOZ2G/g=="],
-
"@tanstack/query-core": ["@tanstack/query-core@5.90.2", "", {}, "sha512-k/TcR3YalnzibscALLwxeiLUub6jN5EDLwKDiO7q5f4ICEoptJ+n9+7vcEFy5/x/i6Q+Lb/tXrsKCggf5uQJXQ=="],
+
"@standard-schema/spec": ["@standard-schema/spec@1.0.0", "", {}, "sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA=="],
-
"@tanstack/react-query": ["@tanstack/react-query@5.90.2", "", { "dependencies": { "@tanstack/query-core": "5.90.2" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-CLABiR+h5PYfOWr/z+vWFt5VsOA2ekQeRQBFSKlcoW6Ndx/f8rfyVmq4LbgOM4GG2qtxAxjLYLOpCNTYm4uKzw=="],
+
"@tanstack/query-core": ["@tanstack/query-core@5.90.7", "", {}, "sha512-6PN65csiuTNfBMXqQUxQhCNdtm1rV+9kC9YwWAIKcaxAauq3Wu7p18j3gQY3YIBJU70jT/wzCCZ2uqto/vQgiQ=="],
+
"@tanstack/react-query": ["@tanstack/react-query@5.90.7", "", { "dependencies": { "@tanstack/query-core": "5.90.7" }, "peerDependencies": { "react": "^18 || ^19" } }, "sha512-wAHc/cgKzW7LZNFloThyHnV/AX9gTg3w5yAv0gvQHPZoCnepwqCMtzbuPbb2UvfvO32XZ46e8bPOYbfZhzVnnQ=="],
"@tokenizer/inflate": ["@tokenizer/inflate@0.2.7", "", { "dependencies": { "debug": "^4.4.0", "fflate": "^0.8.2", "token-types": "^6.0.0" } }, "sha512-MADQgmZT1eKjp06jpI2yozxaU9uVs4GzzgSL+uEq7bVcJ9V1ZXQkeGNql1fsSI0gMy1vhvNTNbUqrx+pZfJVmg=="],
···
"@ts-morph/common": ["@ts-morph/common@0.25.0", "", { "dependencies": { "minimatch": "^9.0.4", "path-browserify": "^1.0.1", "tinyglobby": "^0.2.9" } }, "sha512-kMnZz+vGGHi4GoHnLmMhGNjm44kGtKUXGnOvrKmMwAuvNjM/PgKVGfUnL7IDvK7Jb2QQ82jq3Zmp04Gy+r3Dkg=="],
-
"@types/node": ["@types/node@24.7.2", "", { "dependencies": { "undici-types": "~7.14.0" } }, "sha512-/NbVmcGTP+lj5oa4yiYxxeBjRivKQ5Ns1eSZeB99ExsEQ6rX5XYU1Zy/gGxY/ilqtD4Etx9mKyrPxZRetiahhA=="],
+
"@types/bun": ["@types/bun@1.3.3", "", { "dependencies": { "bun-types": "1.3.3" } }, "sha512-ogrKbJ2X5N0kWLLFKeytG0eHDleBYtngtlbu9cyBKFtNL3cnpDZkNdQj8flVf6WTZUX5ulI9AY1oa7ljhSrp+g=="],
+
"@types/mime-types": ["@types/mime-types@2.1.4", "", {}, "sha512-lfU4b34HOri+kAY5UheuFMWPDOI+OPceBSHZKp69gEyTL/mmJ4cnU6Y/rlme3UL3GyOn6Y42hyIEw0/q8sWx5w=="],
+
"@types/node": ["@types/node@22.19.1", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-LCCV0HdSZZZb34qifBsyWlUmok6W7ouER+oQIGBScS8EsZsQbrtFTUrDX4hOl+CS6p7cnNC4td+qrSVGSCTUfQ=="],
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
-
"@types/react-dom": ["@types/react-dom@19.2.1", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-/EEvYBdT3BflCWvTMO7YkYBHVE9Ci6XdqZciZANQgKpaiDRGOLIlRo91jbTNRQjgPFWVaRxcYc0luVNFitz57A=="],
+
"@types/react-dom": ["@types/react-dom@19.2.2", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-9KQPoO6mZCi7jcIStSnlOWn2nEF3mNmyr3rIAsGnAbQKYbRLyqmeSc39EVgtxXVia+LMT8j3knZLAZAh+xLmrw=="],
"@types/shimmer": ["@types/shimmer@1.2.0", "", {}, "sha512-UE7oxhQLLd9gub6JKIAhDq06T0F6FnztwMNRvYgjeQSBeMc1ZG/tA47EwfduvkuQS8apbkM/lpLpWsaCeYsXVg=="],
+
"@wisp/atproto-utils": ["@wisp/atproto-utils@workspace:packages/@wisp/atproto-utils"],
+
"@wisp/constants": ["@wisp/constants@workspace:packages/@wisp/constants"],
+
"@wisp/database": ["@wisp/database@workspace:packages/@wisp/database"],
+
"@wisp/fs-utils": ["@wisp/fs-utils@workspace:packages/@wisp/fs-utils"],
+
"@wisp/lexicons": ["@wisp/lexicons@workspace:packages/@wisp/lexicons"],
+
"@wisp/main-app": ["@wisp/main-app@workspace:apps/main-app"],
+
"@wisp/observability": ["@wisp/observability@workspace:packages/@wisp/observability"],
+
"@wisp/safe-fetch": ["@wisp/safe-fetch@workspace:packages/@wisp/safe-fetch"],
+
"abort-controller": ["abort-controller@3.0.0", "", { "dependencies": { "event-target-shim": "^5.0.0" } }, "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg=="],
"accepts": ["accepts@1.3.8", "", { "dependencies": { "mime-types": "~2.1.34", "negotiator": "0.6.3" } }, "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw=="],
···
"acorn": ["acorn@8.15.0", "", { "bin": { "acorn": "bin/acorn" } }, "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg=="],
"acorn-import-attributes": ["acorn-import-attributes@1.9.5", "", { "peerDependencies": { "acorn": "^8" } }, "sha512-n02Vykv5uA3eHGM/Z2dQrcD56kL8TyDb2p1+0P83PClMnC/nc+anbQRhIOWnSq4Ke/KvDPrY3C9hDtC/A3eHnQ=="],
+
"actor-typeahead": ["actor-typeahead@0.1.1", "", {}, "sha512-ilsBwzplKwMSBiO6Tg6RdaZ5xxqgXds5jCQuHV+ib9Aq3ja9g0T7u2Y1PmihotmS7l5RxhpGI/tPm3ljoRDRwg=="],
"ansi-regex": ["ansi-regex@5.0.1", "", {}, "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ=="],
···
"atomic-sleep": ["atomic-sleep@1.0.0", "", {}, "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ=="],
+
"atproto-ui": ["atproto-ui@0.11.3", "", { "dependencies": { "@atcute/atproto": "^3.1.7", "@atcute/bluesky": "^3.2.3", "@atcute/client": "^4.0.3", "@atcute/identity-resolver": "^1.1.3", "@atcute/tangled": "^1.0.10" }, "peerDependencies": { "react": "^18.2.0 || ^19.0.0", "react-dom": "^18.2.0 || ^19.0.0" }, "optionalPeers": ["react-dom"] }, "sha512-NIBsORuo9lpCpr1SNKcKhNvqOVpsEy9IoHqFe1CM9gNTArpQL1hUcoP1Cou9a1O5qzCul9kaiu5xBHnB81I/WQ=="],
+
"await-lock": ["await-lock@2.2.2", "", {}, "sha512-aDczADvlvTGajTDjcjpJMqRkOF6Qdz3YbPZm/PyW6tKPkx2hlYBzxMhEywM/tU72HrVZjgl5VCdRuMlA7pZ8Gw=="],
"balanced-match": ["balanced-match@1.0.2", "", {}, "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw=="],
···
"buffer": ["buffer@6.0.3", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" } }, "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA=="],
-
"bun": ["bun@1.3.0", "", { "optionalDependencies": { "@oven/bun-darwin-aarch64": "1.3.0", "@oven/bun-darwin-x64": "1.3.0", "@oven/bun-darwin-x64-baseline": "1.3.0", "@oven/bun-linux-aarch64": "1.3.0", "@oven/bun-linux-aarch64-musl": "1.3.0", "@oven/bun-linux-x64": "1.3.0", "@oven/bun-linux-x64-baseline": "1.3.0", "@oven/bun-linux-x64-musl": "1.3.0", "@oven/bun-linux-x64-musl-baseline": "1.3.0", "@oven/bun-windows-x64": "1.3.0", "@oven/bun-windows-x64-baseline": "1.3.0" }, "os": [ "linux", "win32", "darwin", ], "cpu": [ "x64", "arm64", ], "bin": { "bun": "bin/bun.exe", "bunx": "bin/bunx.exe" } }, "sha512-YI7mFs7iWc/VsGsh2aw6eAPD2cjzn1j+LKdYVk09x1CrdTWKYIHyd+dG5iQoN9//3hCDoZj8U6vKpZzEf5UARA=="],
+
"bun": ["bun@1.3.2", "", { "optionalDependencies": { "@oven/bun-darwin-aarch64": "1.3.2", "@oven/bun-darwin-x64": "1.3.2", "@oven/bun-darwin-x64-baseline": "1.3.2", "@oven/bun-linux-aarch64": "1.3.2", "@oven/bun-linux-aarch64-musl": "1.3.2", "@oven/bun-linux-x64": "1.3.2", "@oven/bun-linux-x64-baseline": "1.3.2", "@oven/bun-linux-x64-musl": "1.3.2", "@oven/bun-linux-x64-musl-baseline": "1.3.2", "@oven/bun-windows-x64": "1.3.2", "@oven/bun-windows-x64-baseline": "1.3.2" }, "os": [ "linux", "win32", "darwin", ], "cpu": [ "x64", "arm64", ], "bin": { "bun": "bin/bun.exe", "bunx": "bin/bunx.exe" } }, "sha512-x75mPJiEfhO1j4Tfc65+PtW6ZyrAB6yTZInydnjDZXF9u9PRAnr6OK3v0Q9dpDl0dxRHkXlYvJ8tteJxc8t4Sw=="],
"bun-plugin-tailwind": ["bun-plugin-tailwind@0.1.2", "", { "peerDependencies": { "bun": ">=1.0.0" } }, "sha512-41jNC1tZRSK3s1o7pTNrLuQG8kL/0vR/JgiTmZAJ1eHwe0w5j6HFPKeqEk0WAD13jfrUC7+ULuewFBBCoADPpg=="],
-
"bun-types": ["bun-types@1.3.0", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-u8X0thhx+yJ0KmkxuEo9HAtdfgCBaM/aI9K90VQcQioAmkVp3SG3FkwWGibUFz3WdXAdcsqOcbU40lK7tbHdkQ=="],
+
"bun-types": ["bun-types@1.3.3", "", { "dependencies": { "@types/node": "*" } }, "sha512-z3Xwlg7j2l9JY27x5Qn3Wlyos8YAp0kKRlrePAOjgjMGS5IG6E7Jnlx736vH9UVI4wUICwwhC9anYL++XeOgTQ=="],
"bytes": ["bytes@3.1.2", "", {}, "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg=="],
···
"ee-first": ["ee-first@1.1.1", "", {}, "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow=="],
-
"elysia": ["elysia@1.4.11", "", { "dependencies": { "cookie": "^1.0.2", "exact-mirror": "0.2.2", "fast-decode-uri-component": "^1.0.1" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["typescript"] }, "sha512-cphuzQj0fRw1ICRvwHy2H3xQio9bycaZUVHnDHJQnKqBfMNlZ+Hzj6TMmt9lc0Az0mvbCnPXWVF7y1MCRhUuOA=="],
+
"elysia": ["elysia@1.4.16", "", { "dependencies": { "cookie": "^1.0.2", "exact-mirror": "0.2.3", "fast-decode-uri-component": "^1.0.1", "memoirist": "^0.4.0" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "@types/bun": ">= 1.2.0", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["@types/bun", "typescript"] }, "sha512-KZtKN160/bdWVKg2hEgyoNXY8jRRquc+m6PboyisaLZL891I+Ufb7Ja6lDAD7vMQur8sLEWIcidZOzj5lWw9UA=="],
"emoji-regex": ["emoji-regex@8.0.0", "", {}, "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="],
···
"es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA=="],
+
"esbuild": ["esbuild@0.26.0", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.26.0", "@esbuild/android-arm": "0.26.0", "@esbuild/android-arm64": "0.26.0", "@esbuild/android-x64": "0.26.0", "@esbuild/darwin-arm64": "0.26.0", "@esbuild/darwin-x64": "0.26.0", "@esbuild/freebsd-arm64": "0.26.0", "@esbuild/freebsd-x64": "0.26.0", "@esbuild/linux-arm": "0.26.0", "@esbuild/linux-arm64": "0.26.0", "@esbuild/linux-ia32": "0.26.0", "@esbuild/linux-loong64": "0.26.0", "@esbuild/linux-mips64el": "0.26.0", "@esbuild/linux-ppc64": "0.26.0", "@esbuild/linux-riscv64": "0.26.0", "@esbuild/linux-s390x": "0.26.0", "@esbuild/linux-x64": "0.26.0", "@esbuild/netbsd-arm64": "0.26.0", "@esbuild/netbsd-x64": "0.26.0", "@esbuild/openbsd-arm64": "0.26.0", "@esbuild/openbsd-x64": "0.26.0", "@esbuild/openharmony-arm64": "0.26.0", "@esbuild/sunos-x64": "0.26.0", "@esbuild/win32-arm64": "0.26.0", "@esbuild/win32-ia32": "0.26.0", "@esbuild/win32-x64": "0.26.0" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-3Hq7jri+tRrVWha+ZeIVhl4qJRha/XjRNSopvTsOaCvfPHrflTYTcUFcEjMKdxofsXXsdc4zjg5NOTnL4Gl57Q=="],
+
"escalade": ["escalade@3.2.0", "", {}, "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA=="],
"escape-html": ["escape-html@1.0.3", "", {}, "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow=="],
+
"esm-env": ["esm-env@1.2.2", "", {}, "sha512-Epxrv+Nr/CaL4ZcFGPJIYLWFom+YeV1DqMLHJoEd9SYRxNbaFruBwfEX/kkHUJf55j2+TUbmDcmuilbP1TmXHA=="],
+
"etag": ["etag@1.8.1", "", {}, "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg=="],
"event-target-shim": ["event-target-shim@5.0.1", "", {}, "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ=="],
+
"eventemitter3": ["eventemitter3@4.0.7", "", {}, "sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw=="],
+
"events": ["events@3.3.0", "", {}, "sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q=="],
-
"exact-mirror": ["exact-mirror@0.2.2", "", { "peerDependencies": { "@sinclair/typebox": "^0.34.15" }, "optionalPeers": ["@sinclair/typebox"] }, "sha512-CrGe+4QzHZlnrXZVlo/WbUZ4qQZq8C0uATQVGVgXIrNXgHDBBNFD1VRfssRA2C9t3RYvh3MadZSdg2Wy7HBoQA=="],
+
"exact-mirror": ["exact-mirror@0.2.3", "", { "peerDependencies": { "@sinclair/typebox": "^0.34.15" }, "optionalPeers": ["@sinclair/typebox"] }, "sha512-aLdARfO0W0ntufjDyytUJQMbNXoB9g+BbA8KcgIq4XOOTYRw48yUGON/Pr64iDrYNZKcKvKbqE0MPW56FF2BXA=="],
"express": ["express@4.21.2", "", { "dependencies": { "accepts": "~1.3.8", "array-flatten": "1.1.1", "body-parser": "1.20.3", "content-disposition": "0.5.4", "content-type": "~1.0.4", "cookie": "0.7.1", "cookie-signature": "1.0.6", "debug": "2.6.9", "depd": "2.0.0", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "etag": "~1.8.1", "finalhandler": "1.3.1", "fresh": "0.5.2", "http-errors": "2.0.0", "merge-descriptors": "1.0.3", "methods": "~1.1.2", "on-finished": "2.4.1", "parseurl": "~1.3.3", "path-to-regexp": "0.1.12", "proxy-addr": "~2.0.7", "qs": "6.13.0", "range-parser": "~1.2.1", "safe-buffer": "5.2.1", "send": "0.19.0", "serve-static": "1.16.2", "setprototypeof": "1.2.0", "statuses": "2.0.1", "type-is": "~1.6.18", "utils-merge": "1.0.1", "vary": "~1.1.2" } }, "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA=="],
···
"fresh": ["fresh@0.5.2", "", {}, "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q=="],
+
"fsevents": ["fsevents@2.3.2", "", { "os": "darwin" }, "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA=="],
+
"function-bind": ["function-bind@1.1.2", "", {}, "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA=="],
"get-caller-file": ["get-caller-file@2.0.5", "", {}, "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg=="],
···
"get-proto": ["get-proto@1.0.1", "", { "dependencies": { "dunder-proto": "^1.0.1", "es-object-atoms": "^1.0.0" } }, "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g=="],
+
"get-tsconfig": ["get-tsconfig@4.13.0", "", { "dependencies": { "resolve-pkg-maps": "^1.0.0" } }, "sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ=="],
+
"gopd": ["gopd@1.2.0", "", {}, "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg=="],
"graphemer": ["graphemer@1.4.0", "", {}, "sha512-EtKwoO6kxCL9WO5xipiHTZlSzBm7WLT627TqC/uVRd0HKmq8NXyebnNYxDoBi7wt8eTWrUrKXCOVaFq9x1kgag=="],
···
"hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ=="],
+
"hono": ["hono@4.10.6", "", {}, "sha512-BIdolzGpDO9MQ4nu3AUuDwHZZ+KViNm+EZ75Ae55eMXMqLVhDFqEMXxtUe9Qh8hjL+pIna/frs2j6Y2yD5Ua/g=="],
+
"http-errors": ["http-errors@2.0.0", "", { "dependencies": { "depd": "2.0.0", "inherits": "2.0.4", "setprototypeof": "1.2.0", "statuses": "2.0.1", "toidentifier": "1.0.1" } }, "sha512-FtwrG/euBzaEjYeRqOgly7G0qviiXoJWnvEH2Z1plBdXgbyjv34pHTSb9zoeHMyDy33+DWy5Wt9Wo+TURtOYSQ=="],
"iconv-lite": ["iconv-lite@0.4.24", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3" } }, "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA=="],
···
"inherits": ["inherits@2.0.4", "", {}, "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="],
-
"ipaddr.js": ["ipaddr.js@2.2.0", "", {}, "sha512-Ag3wB2o37wslZS19hZqorUnrnzSkpOVy+IiiDEiTqNubEYpYuHWIf6K4psgN2ZWKExS4xhVCrRVfb/wfW8fWJA=="],
+
"ipaddr.js": ["ipaddr.js@1.9.1", "", {}, "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g=="],
"iron-session": ["iron-session@8.0.4", "", { "dependencies": { "cookie": "^0.7.2", "iron-webcrypto": "^1.2.1", "uncrypto": "^0.1.3" } }, "sha512-9ivNnaKOd08osD0lJ3i6If23GFS2LsxyMU8Gf/uBUEgm8/8CC1hrrCHFDpMo3IFbpBgwoo/eairRsaD3c5itxA=="],
···
"math-intrinsics": ["math-intrinsics@1.1.0", "", {}, "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g=="],
"media-typer": ["media-typer@0.3.0", "", {}, "sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ=="],
+
"memoirist": ["memoirist@0.4.0", "", {}, "sha512-zxTgA0mSYELa66DimuNQDvyLq36AwDlTuVRbnQtB+VuTcKWm5Qc4z3WkSpgsFWHNhexqkIooqpv4hdcqrX5Nmg=="],
"merge-descriptors": ["merge-descriptors@1.0.3", "", {}, "sha512-gaNvAS7TZ897/rVaZ0nMtAyxNyi/pdbjbAwUpFQpN70GqnVfOiXpeUUMKRBmzXaSQ8DdTX4/0ms62r2K+hE6mQ=="],
···
"ms": ["ms@2.0.0", "", {}, "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="],
-
"multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"multiformats": ["multiformats@13.4.1", "", {}, "sha512-VqO6OSvLrFVAYYjgsr8tyv62/rCQhPgsZUXLTqoFLSgdkgiUYKYeArbt1uWLlEpkjxQe+P0+sHlbPEte1Bi06Q=="],
"negotiator": ["negotiator@0.6.3", "", {}, "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg=="],
···
"on-finished": ["on-finished@2.4.1", "", { "dependencies": { "ee-first": "1.1.1" } }, "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg=="],
"openapi-types": ["openapi-types@12.1.3", "", {}, "sha512-N4YtSYJqghVu4iek2ZUvcN/0aqH1kRDuNqzcycDxhOUpg7GdvLa2F3DgS6yBNhInhv2r/6I0Flkn7CqL8+nIcw=="],
+
"p-finally": ["p-finally@1.0.0", "", {}, "sha512-LICb2p9CB7FS+0eR1oqWnHhp0FljGLZCWBE9aix0Uye9W8LTQPwMTYVGWQWIw9RdQiDg4+epXQODwIYJtSJaow=="],
+
"p-queue": ["p-queue@6.6.2", "", { "dependencies": { "eventemitter3": "^4.0.4", "p-timeout": "^3.2.0" } }, "sha512-RwFpb72c/BhQLEXIZ5K2e+AhgNVmIejGlTgiB9MzZ0e93GRvqZ7uSi0dvRF7/XIXDeNkra2fNHBxTyPDGySpjQ=="],
+
"p-timeout": ["p-timeout@3.2.0", "", { "dependencies": { "p-finally": "^1.0.0" } }, "sha512-rhIwUycgwwKcP9yTOOFK/AKsAopjjCakVqLHePO3CC6Mir1Z99xT+R63jZxAT5lFZLa2inS5h+ZS2GvR99/FBg=="],
"parseurl": ["parseurl@1.3.3", "", {}, "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ=="],
···
"pino-std-serializers": ["pino-std-serializers@6.2.2", "", {}, "sha512-cHjPPsE+vhj/tnhCy/wiMh3M3z3h/j15zHQX+S9GkTBgqJuTuJzYJ4gUyACLhDaJ7kk9ba9iRDmbH2tJU03OiA=="],
+
"playwright": ["playwright@1.56.1", "", { "dependencies": { "playwright-core": "1.56.1" }, "optionalDependencies": { "fsevents": "2.3.2" }, "bin": { "playwright": "cli.js" } }, "sha512-aFi5B0WovBHTEvpM3DzXTUaeN6eN0qWnTkKx4NQaH4Wvcmc153PdaY2UBdSYKaGYw+UyWXSVyxDUg5DoPEttjw=="],
+
"playwright-core": ["playwright-core@1.56.1", "", { "bin": { "playwright-core": "cli.js" } }, "sha512-hutraynyn31F+Bifme+Ps9Vq59hKuUCz7H1kDOcBs+2oGguKkWTU50bBWrtz34OUWmIwpBTWDxaRPXrIXkgvmQ=="],
+
"postgres": ["postgres@3.4.7", "", {}, "sha512-Jtc2612XINuBjIl/QTWsV5UvE8UHuNblcO3vVADSrKsrc6RqGX6lOW1cEo3CM2v0XG4Nat8nI+YM7/f26VxXLw=="],
+
"prettier": ["prettier@3.6.2", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ=="],
+
"prismjs": ["prismjs@1.30.0", "", {}, "sha512-DEvV2ZF2r2/63V+tK8hQvrR2ZGn10srHbXviTlcv7Kpzw8jWiNTqbVgjO3IY8RxrrOUF8VPMQQFysYYYv0YZxw=="],
"process": ["process@0.11.10", "", {}, "sha512-cdGef/drWFoydD1JsMzuFf8100nZl+GT+yacc2bEced5f9Rjk4z+WtFUTBu9PhOi9j/jfmBPu0mMEY4wIdAF8A=="],
···
"resolve": ["resolve@1.22.11", "", { "dependencies": { "is-core-module": "^2.16.1", "path-parse": "^1.0.7", "supports-preserve-symlinks-flag": "^1.0.0" }, "bin": { "resolve": "bin/resolve" } }, "sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ=="],
+
"resolve-pkg-maps": ["resolve-pkg-maps@1.0.0", "", {}, "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw=="],
+
"safe-buffer": ["safe-buffer@5.2.1", "", {}, "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ=="],
"safe-stable-stringify": ["safe-stable-stringify@2.5.0", "", {}, "sha512-b3rppTKm9T+PsVCBEOUR46GWI7fdOs00VKZ1+9c1EWDaDMvjQc6tUwuFyIprgGgTcWoVHSKrU8H31ZHA2e0RHA=="],
···
"tailwind-merge": ["tailwind-merge@3.3.1", "", {}, "sha512-gBXpgUm/3rp1lMZZrM/w7D8GKqshif0zAymAhbCyIt8KMe+0v9DQ7cdYLR4FHH/cKpdTXb+A/tKKU3eolfsI+g=="],
-
"tailwindcss": ["tailwindcss@4.1.14", "", {}, "sha512-b7pCxjGO98LnxVkKjaZSDeNuljC4ueKUddjENJOADtubtdo8llTaJy7HwBMeLNSSo2N5QIAgklslK1+Ir8r6CA=="],
+
"tailwindcss": ["tailwindcss@4.1.17", "", {}, "sha512-j9Ee2YjuQqYT9bbRTfTZht9W/ytp5H+jJpZKiYdP/bpnXARAuELt9ofP0lPnmHjbga7SNQIxdTAXCmtKVYjN+Q=="],
"thread-stream": ["thread-stream@2.7.0", "", { "dependencies": { "real-require": "^0.2.0" } }, "sha512-qQiRWsU/wvNolI6tbbCKd9iKaTnCXsTwVxhhKM6nctPdujTyztjlbUkUTUymidWcMnZ5pWR0ej4a0tjsW021vw=="],
"tinyglobby": ["tinyglobby@0.2.15", "", { "dependencies": { "fdir": "^6.5.0", "picomatch": "^4.0.3" } }, "sha512-j2Zq4NyQYG5XMST4cbs02Ak8iJUdxRM0XI5QyxXuZOzKOINmWurp3smXu3y5wDcJrptwpSjgXHzIQxR0omXljQ=="],
-
"tlds": ["tlds@1.260.0", "", { "bin": { "tlds": "bin.js" } }, "sha512-78+28EWBhCEE7qlyaHA9OR3IPvbCLiDh3Ckla593TksfFc9vfTsgvH7eS+dr3o9qr31gwGbogcI16yN91PoRjQ=="],
+
"tlds": ["tlds@1.261.0", "", { "bin": { "tlds": "bin.js" } }, "sha512-QXqwfEl9ddlGBaRFXIvNKK6OhipSiLXuRuLJX5DErz0o0Q0rYxulWLdFryTkV5PkdZct5iMInwYEGe/eR++1AA=="],
"toidentifier": ["toidentifier@1.0.1", "", {}, "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="],
···
"ts-morph": ["ts-morph@24.0.0", "", { "dependencies": { "@ts-morph/common": "~0.25.0", "code-block-writer": "^13.0.3" } }, "sha512-2OAOg/Ob5yx9Et7ZX4CvTCc0UFoZHwLEJ+dpDPSUi5TgwwlTlX47w+iFRrEwzUZwYACjq83cgjS/Da50Ga37uw=="],
"tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],
+
"tsx": ["tsx@4.20.6", "", { "dependencies": { "esbuild": "~0.25.0", "get-tsconfig": "^4.7.5" }, "optionalDependencies": { "fsevents": "~2.3.3" }, "bin": { "tsx": "dist/cli.mjs" } }, "sha512-ytQKuwgmrrkDTFP4LjR0ToE2nqgy886GpvRSpU0JAnrdBYppuY5rLkRUYPU1yCryb24SsKBTL/hlDQAEFVwtZg=="],
"tw-animate-css": ["tw-animate-css@1.4.0", "", {}, "sha512-7bziOlRqH0hJx80h/3mbicLW7o8qLsH5+RaLR2t+OHM3D0JlWGODQKQ4cxbK7WlvmUxpcj6Kgu6EKqjrGFe3QQ=="],
···
"undici": ["undici@6.22.0", "", {}, "sha512-hU/10obOIu62MGYjdskASR3CUAiYaFTtC9Pa6vHyf//mAipSvSQg6od2CnJswq7fvzNS3zJhxoRkgNVaHurWKw=="],
-
"undici-types": ["undici-types@7.14.0", "", {}, "sha512-QQiYxHuyZ9gQUIrmPo3IA+hUl4KYk8uSA7cHrcKd/l3p1OTpZcM0Tbp9x7FAtXdAYhlasd60ncPpgu6ihG6TOA=="],
+
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
+
"unicode-segmenter": ["unicode-segmenter@0.14.0", "", {}, "sha512-AH4lhPCJANUnSLEKnM4byboctePJzltF4xj8b+NbNiYeAkAXGh7px2K/4NANFp7dnr6+zB3e6HLu8Jj8SKyvYg=="],
"unpipe": ["unpipe@1.0.0", "", {}, "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ=="],
···
"utils-merge": ["utils-merge@1.0.1", "", {}, "sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA=="],
+
"varint": ["varint@6.0.0", "", {}, "sha512-cXEIW6cfr15lFv563k4GuVuW/fiwjknytD37jIOLSdSWuOI6WnO/oKwmP2FQTU2l01LP8/M5TSAJpzUaGe3uWg=="],
+
"vary": ["vary@1.1.2", "", {}, "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg=="],
+
"wisp-hosting-service": ["wisp-hosting-service@workspace:apps/hosting-service"],
+
"wrap-ansi": ["wrap-ansi@7.0.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q=="],
"ws": ["ws@8.18.3", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": ">=5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, "sha512-PEIGCY5tSlUt50cqyMXfCzX+oOPqN0vuGqWzbcJ2xvnkzkq46oOpz7dQaTDBdfICb4N14+GARUDw2XV2N4tvzg=="],
···
"zod": ["zod@3.25.76", "", {}, "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ=="],
+
"@atproto-labs/fetch-node/ipaddr.js": ["ipaddr.js@2.2.0", "", {}, "sha512-Ag3wB2o37wslZS19hZqorUnrnzSkpOVy+IiiDEiTqNubEYpYuHWIf6K4psgN2ZWKExS4xhVCrRVfb/wfW8fWJA=="],
+
"@atproto/api/@atproto/lexicon": ["@atproto/lexicon@0.4.14", "", { "dependencies": { "@atproto/common-web": "^0.4.2", "@atproto/syntax": "^0.4.0", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-jiKpmH1QER3Gvc7JVY5brwrfo+etFoe57tKPQX/SmPwjvUsFnJAow5xLIryuBaJgFAhnTZViXKs41t//pahGHQ=="],
+
"@atproto/api/@atproto/xrpc": ["@atproto/xrpc@0.6.12", "", { "dependencies": { "@atproto/lexicon": "^0.4.10", "zod": "^3.23.8" } }, "sha512-Ut3iISNLujlmY9Gu8sNU+SPDJDvqlVzWddU8qUr0Yae5oD4SguaUFjjhireMGhQ3M5E0KljQgDbTmnBo1kIZ3w=="],
+
"@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/common/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/common-web/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/identity/@atproto/common-web": ["@atproto/common-web@0.4.5", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "@atproto/lex-json": "0.0.1", "zod": "^3.23.8" } }, "sha512-Tx0xUafLm3vRvOQpbBl5eb9V8xlC7TaRXs6dAulHRkDG3Kb+P9qn3pkDteq+aeMshbVXbVa1rm3Ok4vFyuoyYA=="],
+
"@atproto/jwk/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/lex-cbor/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/lex-cli/@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+
"@atproto/lex-data/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/lexicon/@atproto/common-web": ["@atproto/common-web@0.4.5", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "@atproto/lex-json": "0.0.1", "zod": "^3.23.8" } }, "sha512-Tx0xUafLm3vRvOQpbBl5eb9V8xlC7TaRXs6dAulHRkDG3Kb+P9qn3pkDteq+aeMshbVXbVa1rm3Ok4vFyuoyYA=="],
+
"@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/oauth-client/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/repo/@atproto/common": ["@atproto/common@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.5", "@atproto/lex-cbor": "0.0.1", "@atproto/lex-data": "0.0.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, "sha512-0S57sjzw4r9OLc5srJFi6uAz/aTKYl6btz3x36tSnGriL716m6h0x2IVtgd+FhUfIQfisevrqcqw8SfaGk8VTw=="],
+
"@atproto/repo/@atproto/common-web": ["@atproto/common-web@0.4.5", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "@atproto/lex-json": "0.0.1", "zod": "^3.23.8" } }, "sha512-Tx0xUafLm3vRvOQpbBl5eb9V8xlC7TaRXs6dAulHRkDG3Kb+P9qn3pkDteq+aeMshbVXbVa1rm3Ok4vFyuoyYA=="],
+
"@atproto/repo/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/sync/@atproto/common": ["@atproto/common@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.5", "@atproto/lex-cbor": "0.0.1", "@atproto/lex-data": "0.0.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, "sha512-0S57sjzw4r9OLc5srJFi6uAz/aTKYl6btz3x36tSnGriL716m6h0x2IVtgd+FhUfIQfisevrqcqw8SfaGk8VTw=="],
+
"@atproto/sync/@atproto/xrpc-server": ["@atproto/xrpc-server@0.10.1", "", { "dependencies": { "@atproto/common": "^0.5.1", "@atproto/crypto": "^0.4.4", "@atproto/lex-cbor": "0.0.1", "@atproto/lex-data": "0.0.1", "@atproto/lexicon": "^0.5.2", "@atproto/ws-client": "^0.0.3", "@atproto/xrpc": "^0.7.6", "express": "^4.17.2", "http-errors": "^2.0.0", "mime-types": "^2.1.35", "rate-limiter-flexible": "^2.4.1", "ws": "^8.12.0", "zod": "^3.23.8" } }, "sha512-kHXykL4inBV/49vefn5zR5zv/VM1//+BIRqk9OvB3+mbERw0jkFiHhc6PWyY/81VD4ciu7FZwUCpRy/mtQtIaA=="],
+
"@atproto/sync/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@atproto/ws-client/@atproto/common": ["@atproto/common@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.5", "@atproto/lex-cbor": "0.0.1", "@atproto/lex-data": "0.0.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, "sha512-0S57sjzw4r9OLc5srJFi6uAz/aTKYl6btz3x36tSnGriL716m6h0x2IVtgd+FhUfIQfisevrqcqw8SfaGk8VTw=="],
+
"@atproto/xrpc/@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+
"@atproto/xrpc-server/@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+
"@ipld/dag-cbor/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@radix-ui/react-collection/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
"@radix-ui/react-dialog/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
"@radix-ui/react-label/@radix-ui/react-primitive": ["@radix-ui/react-primitive@2.1.4", "", { "dependencies": { "@radix-ui/react-slot": "1.2.4" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-9hQc4+GNVtJAIEPEqlYqW5RiYdrr8ea5XQ0ZOnD6fgru+83kqT15mq2OCcbe8KnjRZl5vF3ks69AKz3kh1jrhg=="],
+
"@radix-ui/react-primitive/@radix-ui/react-slot": ["@radix-ui/react-slot@1.2.3", "", { "dependencies": { "@radix-ui/react-compose-refs": "1.1.2" }, "peerDependencies": { "@types/react": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react"] }, "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A=="],
+
"@tokenizer/inflate/debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
+
"@wisp/main-app/@atproto/api": ["@atproto/api@0.17.7", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-V+OJBZq9chcrD21xk1bUa6oc5DSKfQj5DmUPf5rmZncqL1w9ZEbS38H5cMyqqdhfgo2LWeDRdZHD0rvNyJsIaw=="],
+
+
"bun-types/@types/node": ["@types/node@24.10.0", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-qzQZRBqkFsYyaSWXuEHc2WR9c0a0CXwiE5FWUvn7ZM+vdy1uZLfCunD38UzhuB7YN/J11ndbDBcTmOdxJo9Q7A=="],
"express/cookie": ["cookie@0.7.1", "", {}, "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w=="],
"iron-session/cookie": ["cookie@0.7.2", "", {}, "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w=="],
-
"proxy-addr/ipaddr.js": ["ipaddr.js@1.9.1", "", {}, "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g=="],
+
"protobufjs/@types/node": ["@types/node@24.10.0", "", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-qzQZRBqkFsYyaSWXuEHc2WR9c0a0CXwiE5FWUvn7ZM+vdy1uZLfCunD38UzhuB7YN/J11ndbDBcTmOdxJo9Q7A=="],
"require-in-the-middle/debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],
···
"send/ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
+
"tsx/esbuild": ["esbuild@0.25.12", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.25.12", "@esbuild/android-arm": "0.25.12", "@esbuild/android-arm64": "0.25.12", "@esbuild/android-x64": "0.25.12", "@esbuild/darwin-arm64": "0.25.12", "@esbuild/darwin-x64": "0.25.12", "@esbuild/freebsd-arm64": "0.25.12", "@esbuild/freebsd-x64": "0.25.12", "@esbuild/linux-arm": "0.25.12", "@esbuild/linux-arm64": "0.25.12", "@esbuild/linux-ia32": "0.25.12", "@esbuild/linux-loong64": "0.25.12", "@esbuild/linux-mips64el": "0.25.12", "@esbuild/linux-ppc64": "0.25.12", "@esbuild/linux-riscv64": "0.25.12", "@esbuild/linux-s390x": "0.25.12", "@esbuild/linux-x64": "0.25.12", "@esbuild/netbsd-arm64": "0.25.12", "@esbuild/netbsd-x64": "0.25.12", "@esbuild/openbsd-arm64": "0.25.12", "@esbuild/openbsd-x64": "0.25.12", "@esbuild/openharmony-arm64": "0.25.12", "@esbuild/sunos-x64": "0.25.12", "@esbuild/win32-arm64": "0.25.12", "@esbuild/win32-ia32": "0.25.12", "@esbuild/win32-x64": "0.25.12" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-bbPBYYrtZbkt6Os6FiTLCTFxvq4tt3JKall1vRwshA3fdVztsLAatFaZobhkBC8/BrPetoa0oksYoKXoG4ryJg=="],
+
+
"tsx/fsevents": ["fsevents@2.3.3", "", { "os": "darwin" }, "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw=="],
+
+
"uint8arrays/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"wisp-hosting-service/@atproto/api": ["@atproto/api@0.17.7", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, "sha512-V+OJBZq9chcrD21xk1bUa6oc5DSKfQj5DmUPf5rmZncqL1w9ZEbS38H5cMyqqdhfgo2LWeDRdZHD0rvNyJsIaw=="],
+
+
"wisp-hosting-service/@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+
+
"@atproto/lex-cli/@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/sync/@atproto/common/@atproto/common-web": ["@atproto/common-web@0.4.5", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "@atproto/lex-json": "0.0.1", "zod": "^3.23.8" } }, "sha512-Tx0xUafLm3vRvOQpbBl5eb9V8xlC7TaRXs6dAulHRkDG3Kb+P9qn3pkDteq+aeMshbVXbVa1rm3Ok4vFyuoyYA=="],
+
+
"@atproto/sync/@atproto/xrpc-server/@atproto/xrpc": ["@atproto/xrpc@0.7.6", "", { "dependencies": { "@atproto/lexicon": "^0.5.2", "zod": "^3.23.8" } }, "sha512-RvCf4j0JnKYWuz3QzsYCntJi3VuiAAybQsMIUw2wLWcHhchO9F7UaBZINLL2z0qc/cYWPv5NSwcVydMseoCZLA=="],
+
+
"@atproto/ws-client/@atproto/common/@atproto/common-web": ["@atproto/common-web@0.4.5", "", { "dependencies": { "@atproto/lex-data": "0.0.1", "@atproto/lex-json": "0.0.1", "zod": "^3.23.8" } }, "sha512-Tx0xUafLm3vRvOQpbBl5eb9V8xlC7TaRXs6dAulHRkDG3Kb+P9qn3pkDteq+aeMshbVXbVa1rm3Ok4vFyuoyYA=="],
+
+
"@atproto/ws-client/@atproto/common/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/xrpc-server/@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"@atproto/xrpc/@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
"@tokenizer/inflate/debug/ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
+
"@wisp/main-app/@atproto/api/@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, "sha512-y8AEtYmfgVl4fqFxqXAeGvhesiGkxiy3CWoJIfsFDDdTlZUC8DFnZrYhcqkIop3OlCkkljvpSJi1hbeC1tbi8A=="],
+
+
"@wisp/main-app/@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"bun-types/@types/node/undici-types": ["undici-types@7.16.0", "", {}, "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw=="],
+
+
"protobufjs/@types/node/undici-types": ["undici-types@7.16.0", "", {}, "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw=="],
+
"require-in-the-middle/debug/ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],
+
+
"tsx/esbuild/@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.25.12", "", { "os": "aix", "cpu": "ppc64" }, "sha512-Hhmwd6CInZ3dwpuGTF8fJG6yoWmsToE+vYgD4nytZVxcu1ulHpUQRAB1UJ8+N1Am3Mz4+xOByoQoSZf4D+CpkA=="],
+
+
"tsx/esbuild/@esbuild/android-arm": ["@esbuild/android-arm@0.25.12", "", { "os": "android", "cpu": "arm" }, "sha512-VJ+sKvNA/GE7Ccacc9Cha7bpS8nyzVv0jdVgwNDaR4gDMC/2TTRc33Ip8qrNYUcpkOHUT5OZ0bUcNNVZQ9RLlg=="],
+
+
"tsx/esbuild/@esbuild/android-arm64": ["@esbuild/android-arm64@0.25.12", "", { "os": "android", "cpu": "arm64" }, "sha512-6AAmLG7zwD1Z159jCKPvAxZd4y/VTO0VkprYy+3N2FtJ8+BQWFXU+OxARIwA46c5tdD9SsKGZ/1ocqBS/gAKHg=="],
+
+
"tsx/esbuild/@esbuild/android-x64": ["@esbuild/android-x64@0.25.12", "", { "os": "android", "cpu": "x64" }, "sha512-5jbb+2hhDHx5phYR2By8GTWEzn6I9UqR11Kwf22iKbNpYrsmRB18aX/9ivc5cabcUiAT/wM+YIZ6SG9QO6a8kg=="],
+
+
"tsx/esbuild/@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.25.12", "", { "os": "darwin", "cpu": "arm64" }, "sha512-N3zl+lxHCifgIlcMUP5016ESkeQjLj/959RxxNYIthIg+CQHInujFuXeWbWMgnTo4cp5XVHqFPmpyu9J65C1Yg=="],
+
+
"tsx/esbuild/@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.25.12", "", { "os": "darwin", "cpu": "x64" }, "sha512-HQ9ka4Kx21qHXwtlTUVbKJOAnmG1ipXhdWTmNXiPzPfWKpXqASVcWdnf2bnL73wgjNrFXAa3yYvBSd9pzfEIpA=="],
+
+
"tsx/esbuild/@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.25.12", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-gA0Bx759+7Jve03K1S0vkOu5Lg/85dou3EseOGUes8flVOGxbhDDh/iZaoek11Y8mtyKPGF3vP8XhnkDEAmzeg=="],
+
+
"tsx/esbuild/@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.25.12", "", { "os": "freebsd", "cpu": "x64" }, "sha512-TGbO26Yw2xsHzxtbVFGEXBFH0FRAP7gtcPE7P5yP7wGy7cXK2oO7RyOhL5NLiqTlBh47XhmIUXuGciXEqYFfBQ=="],
+
+
"tsx/esbuild/@esbuild/linux-arm": ["@esbuild/linux-arm@0.25.12", "", { "os": "linux", "cpu": "arm" }, "sha512-lPDGyC1JPDou8kGcywY0YILzWlhhnRjdof3UlcoqYmS9El818LLfJJc3PXXgZHrHCAKs/Z2SeZtDJr5MrkxtOw=="],
+
+
"tsx/esbuild/@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.25.12", "", { "os": "linux", "cpu": "arm64" }, "sha512-8bwX7a8FghIgrupcxb4aUmYDLp8pX06rGh5HqDT7bB+8Rdells6mHvrFHHW2JAOPZUbnjUpKTLg6ECyzvas2AQ=="],
+
+
"tsx/esbuild/@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.25.12", "", { "os": "linux", "cpu": "ia32" }, "sha512-0y9KrdVnbMM2/vG8KfU0byhUN+EFCny9+8g202gYqSSVMonbsCfLjUO+rCci7pM0WBEtz+oK/PIwHkzxkyharA=="],
+
+
"tsx/esbuild/@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.25.12", "", { "os": "linux", "cpu": "none" }, "sha512-h///Lr5a9rib/v1GGqXVGzjL4TMvVTv+s1DPoxQdz7l/AYv6LDSxdIwzxkrPW438oUXiDtwM10o9PmwS/6Z0Ng=="],
+
+
"tsx/esbuild/@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.25.12", "", { "os": "linux", "cpu": "none" }, "sha512-iyRrM1Pzy9GFMDLsXn1iHUm18nhKnNMWscjmp4+hpafcZjrr2WbT//d20xaGljXDBYHqRcl8HnxbX6uaA/eGVw=="],
+
+
"tsx/esbuild/@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.25.12", "", { "os": "linux", "cpu": "ppc64" }, "sha512-9meM/lRXxMi5PSUqEXRCtVjEZBGwB7P/D4yT8UG/mwIdze2aV4Vo6U5gD3+RsoHXKkHCfSxZKzmDssVlRj1QQA=="],
+
+
"tsx/esbuild/@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.25.12", "", { "os": "linux", "cpu": "none" }, "sha512-Zr7KR4hgKUpWAwb1f3o5ygT04MzqVrGEGXGLnj15YQDJErYu/BGg+wmFlIDOdJp0PmB0lLvxFIOXZgFRrdjR0w=="],
+
+
"tsx/esbuild/@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.25.12", "", { "os": "linux", "cpu": "s390x" }, "sha512-MsKncOcgTNvdtiISc/jZs/Zf8d0cl/t3gYWX8J9ubBnVOwlk65UIEEvgBORTiljloIWnBzLs4qhzPkJcitIzIg=="],
+
+
"tsx/esbuild/@esbuild/linux-x64": ["@esbuild/linux-x64@0.25.12", "", { "os": "linux", "cpu": "x64" }, "sha512-uqZMTLr/zR/ed4jIGnwSLkaHmPjOjJvnm6TVVitAa08SLS9Z0VM8wIRx7gWbJB5/J54YuIMInDquWyYvQLZkgw=="],
+
+
"tsx/esbuild/@esbuild/netbsd-arm64": ["@esbuild/netbsd-arm64@0.25.12", "", { "os": "none", "cpu": "arm64" }, "sha512-xXwcTq4GhRM7J9A8Gv5boanHhRa/Q9KLVmcyXHCTaM4wKfIpWkdXiMog/KsnxzJ0A1+nD+zoecuzqPmCRyBGjg=="],
+
+
"tsx/esbuild/@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.25.12", "", { "os": "none", "cpu": "x64" }, "sha512-Ld5pTlzPy3YwGec4OuHh1aCVCRvOXdH8DgRjfDy/oumVovmuSzWfnSJg+VtakB9Cm0gxNO9BzWkj6mtO1FMXkQ=="],
+
+
"tsx/esbuild/@esbuild/openbsd-arm64": ["@esbuild/openbsd-arm64@0.25.12", "", { "os": "openbsd", "cpu": "arm64" }, "sha512-fF96T6KsBo/pkQI950FARU9apGNTSlZGsv1jZBAlcLL1MLjLNIWPBkj5NlSz8aAzYKg+eNqknrUJ24QBybeR5A=="],
+
+
"tsx/esbuild/@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.25.12", "", { "os": "openbsd", "cpu": "x64" }, "sha512-MZyXUkZHjQxUvzK7rN8DJ3SRmrVrke8ZyRusHlP+kuwqTcfWLyqMOE3sScPPyeIXN/mDJIfGXvcMqCgYKekoQw=="],
+
+
"tsx/esbuild/@esbuild/openharmony-arm64": ["@esbuild/openharmony-arm64@0.25.12", "", { "os": "none", "cpu": "arm64" }, "sha512-rm0YWsqUSRrjncSXGA7Zv78Nbnw4XL6/dzr20cyrQf7ZmRcsovpcRBdhD43Nuk3y7XIoW2OxMVvwuRvk9XdASg=="],
+
+
"tsx/esbuild/@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.25.12", "", { "os": "sunos", "cpu": "x64" }, "sha512-3wGSCDyuTHQUzt0nV7bocDy72r2lI33QL3gkDNGkod22EsYl04sMf0qLb8luNKTOmgF/eDEDP5BFNwoBKH441w=="],
+
+
"tsx/esbuild/@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.25.12", "", { "os": "win32", "cpu": "arm64" }, "sha512-rMmLrur64A7+DKlnSuwqUdRKyd3UE7oPJZmnljqEptesKM8wx9J8gx5u0+9Pq0fQQW8vqeKebwNXdfOyP+8Bsg=="],
+
+
"tsx/esbuild/@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.25.12", "", { "os": "win32", "cpu": "ia32" }, "sha512-HkqnmmBoCbCwxUKKNPBixiWDGCpQGVsrQfJoVGYLPT41XWF8lHuE5N6WhVia2n4o5QK5M4tYr21827fNhi4byQ=="],
+
+
"tsx/esbuild/@esbuild/win32-x64": ["@esbuild/win32-x64@0.25.12", "", { "os": "win32", "cpu": "x64" }, "sha512-alJC0uCZpTFrSL0CCDjcgleBXPnCrEAhTBILpeAp7M/OFgoqtAetfBzX0xM00MUsVVPpVjlPuMbREqnZCXaTnA=="],
+
+
"wisp-hosting-service/@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
+
+
"wisp-hosting-service/@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, "sha512-HoMUjhH9T8DDBNT+6xzkrd9ga/XiBI4xLr58LJACwK6G3HTOPeMz4nB4KJs33L2BelrIJa7P0VuNaVF3hMYfjg=="],
}
}
+25
cli/.gitignore
···
+
test/
+
.DS_Store
+
jacquard/
+
binaries/
+
# Generated by Cargo
+
# will have compiled files and executables
+
debug
+
target
+
+
# These are backup files generated by rustfmt
+
**/*.rs.bk
+
+
# MSVC Windows builds of rustc generate these, which store debugging information
+
*.pdb
+
+
# Generated by cargo mutants
+
# Contains mutation testing data
+
**/mutants.out*/
+
+
# RustRover
+
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
+
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
+
# and can be added to the global gitignore or merged into this file. For a more nuclear
+
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
+
#.idea/
+635 -204
cli/Cargo.lock
···
[[package]]
name = "anstyle-query"
-
version = "1.1.4"
+
version = "1.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "9e231f6134f61b71076a3eab506c379d4f36122f2af15a9ff04415ea4c3339e2"
+
checksum = "40c48f72fd53cd289104fc64099abca73db4166ad86ea0b4341abe65af83dadc"
dependencies = [
-
"windows-sys 0.60.2",
+
"windows-sys 0.61.2",
]
[[package]]
name = "anstyle-wincon"
-
version = "3.0.10"
+
version = "3.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "3e0633414522a32ffaac8ac6cc8f748e090c5717661fddeea04219e2344f5f2a"
+
checksum = "291e6a250ff86cd4a820112fb8898808a366d8f9f58ce16d1f538353ad55747d"
dependencies = [
"anstyle",
"once_cell_polyfill",
-
"windows-sys 0.60.2",
+
"windows-sys 0.61.2",
]
[[package]]
···
[[package]]
name = "async-compression"
-
version = "0.4.32"
+
version = "0.4.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "5a89bce6054c720275ac2432fbba080a66a2106a44a1b804553930ca6909f4e0"
+
checksum = "93c1f86859c1af3d514fa19e8323147ff10ea98684e6c7b307912509f50e67b2"
dependencies = [
"compression-codecs",
"compression-core",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
[[package]]
+
name = "axum"
+
version = "0.8.7"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "5b098575ebe77cb6d14fc7f32749631a6e44edbef6b796f89b020e99ba20d425"
+
dependencies = [
+
"axum-core",
+
"bytes",
+
"form_urlencoded",
+
"futures-util",
+
"http",
+
"http-body",
+
"http-body-util",
+
"hyper",
+
"hyper-util",
+
"itoa",
+
"matchit",
+
"memchr",
+
"mime",
+
"percent-encoding",
+
"pin-project-lite",
+
"serde_core",
+
"serde_json",
+
"serde_path_to_error",
+
"serde_urlencoded",
+
"sync_wrapper",
+
"tokio",
+
"tower",
+
"tower-layer",
+
"tower-service",
+
"tracing",
+
]
+
+
[[package]]
+
name = "axum-core"
+
version = "0.5.5"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "59446ce19cd142f8833f856eb31f3eb097812d1479ab224f54d72428ca21ea22"
+
dependencies = [
+
"bytes",
+
"futures-core",
+
"http",
+
"http-body",
+
"http-body-util",
+
"mime",
+
"pin-project-lite",
+
"sync_wrapper",
+
"tower-layer",
+
"tower-service",
+
"tracing",
+
]
+
+
[[package]]
name = "backtrace"
version = "0.3.76"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"proc-macro2",
"quote",
"rustversion",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "46c5e41b57b8bba42a04676d81cb89e9ee8e859a1a66f80a5a72e1cb76b34d43"
[[package]]
+
name = "byteorder"
+
version = "1.5.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "1fd0f2584146f6f2ef48085050886acf353beff7305ebd1ae69500e27c67f64b"
+
+
[[package]]
name = "bytes"
-
version = "1.10.1"
+
version = "1.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "d71b6127be86fdcfddb610f7182ac57211d4b18a3e9c82eb2d17662f2227ad6a"
+
checksum = "b35204fbdc0b3f4446b89fc1ac2cf84a8a68971995d0bf2e925ec7cd960f9cb3"
dependencies = [
"serde",
]
···
[[package]]
name = "cc"
-
version = "1.2.44"
+
version = "1.2.46"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "37521ac7aabe3d13122dc382493e20c9416f299d2ccd5b3a5340a2570cdeb0f3"
+
checksum = "b97463e1064cb1b1c1384ad0a0b9c8abd0988e2a91f52606c80ef14aadb63e36"
dependencies = [
"find-msvc-tools",
"shlex",
···
[[package]]
name = "clap"
-
version = "4.5.51"
+
version = "4.5.52"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "4c26d721170e0295f191a69bd9a1f93efcdb0aff38684b61ab5750468972e5f5"
+
checksum = "aa8120877db0e5c011242f96806ce3c94e0737ab8108532a76a3300a01db2ab8"
dependencies = [
"clap_builder",
"clap_derive",
···
[[package]]
name = "clap_builder"
-
version = "4.5.51"
+
version = "4.5.52"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "75835f0c7bf681bfd05abe44e965760fea999a5286c6eb2d59883634fd02011a"
+
checksum = "02576b399397b659c26064fbc92a75fede9d18ffd5f80ca1cd74ddab167016e1"
dependencies = [
"anstream",
"anstyle",
···
"heck 0.5.0",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
name = "compression-codecs"
-
version = "0.4.31"
+
version = "0.4.32"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "ef8a506ec4b81c460798f572caead636d57d3d7e940f998160f52bd254bf2d23"
+
checksum = "680dc087785c5230f8e8843e2e57ac7c1c90488b6a91b88caa265410568f441b"
dependencies = [
"compression-core",
"flate2",
···
[[package]]
name = "compression-core"
-
version = "0.4.29"
+
version = "0.4.30"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "e47641d3deaf41fb1538ac1f54735925e275eaf3bf4d55c81b137fba797e5cbb"
+
checksum = "3a9b614a5787ef0c8802a55766480563cb3a93b435898c422ed2a359cf811582"
[[package]]
name = "const-oid"
···
checksum = "2f421161cb492475f1661ddc9815a745a1c894592070661180fdec3d4872e9c3"
[[package]]
+
name = "cordyceps"
+
version = "0.3.4"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "688d7fbb8092b8de775ef2536f36c8c31f2bc4006ece2e8d8ad2d17d00ce0a2a"
+
dependencies = [
+
"loom",
+
"tracing",
+
]
+
+
[[package]]
name = "core-foundation"
version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"proc-macro2",
"quote",
"strsim",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
dependencies = [
"darling_core",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
checksum = "8d162beedaa69905488a8da94f5ac3edb4dd4788b732fadb7bd120b2625c1976"
dependencies = [
"data-encoding",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
]
[[package]]
+
name = "derive_more"
+
version = "1.0.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "4a9b99b9cbbe49445b21764dc0625032a89b145a2642e67603e1c936f5458d05"
+
dependencies = [
+
"derive_more-impl",
+
]
+
+
[[package]]
+
name = "derive_more-impl"
+
version = "1.0.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "cb7330aeadfbe296029522e6c40f315320aba36fc43a5b3632f3795348f3bd22"
+
dependencies = [
+
"proc-macro2",
+
"quote",
+
"syn 2.0.110",
+
"unicode-xid",
+
]
+
+
[[package]]
+
name = "diatomic-waker"
+
version = "0.2.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ab03c107fafeb3ee9f5925686dbb7a73bc76e3932abb0d2b365cb64b169cf04c"
+
+
[[package]]
name = "digest"
version = "0.10.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
"heck 0.5.0",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
name = "find-msvc-tools"
-
version = "0.1.4"
+
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "52051878f80a721bb68ebfbc930e07b65ba72f2da88968ea5c06fd6ca3d3a127"
+
checksum = "3a3076410a55c90011c298b04d0cfa770b00fa04e1e3c97d3f6c9de105a03844"
[[package]]
name = "flate2"
···
checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
-
name = "foreign-types"
-
version = "0.3.2"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
-
dependencies = [
-
"foreign-types-shared",
-
]
-
-
[[package]]
-
name = "foreign-types-shared"
-
version = "0.1.1"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
-
-
[[package]]
name = "form_urlencoded"
version = "1.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
]
[[package]]
+
name = "futures-buffered"
+
version = "0.2.12"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "a8e0e1f38ec07ba4abbde21eed377082f17ccb988be9d988a5adbf4bafc118fd"
+
dependencies = [
+
"cordyceps",
+
"diatomic-waker",
+
"futures-core",
+
"pin-project-lite",
+
"spin 0.10.0",
+
]
+
+
[[package]]
name = "futures-channel"
version = "0.3.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "9e5c1b78ca4aae1ac06c48a526a655760685149f0d465d21f37abfe57ce075c6"
[[package]]
+
name = "futures-lite"
+
version = "2.6.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f78e10609fe0e0b3f4157ffab1876319b5b0db102a2c60dc4626306dc46b44ad"
+
dependencies = [
+
"fastrand",
+
"futures-core",
+
"futures-io",
+
"parking",
+
"pin-project-lite",
+
]
+
+
[[package]]
name = "futures-macro"
version = "0.3.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
+
name = "generator"
+
version = "0.8.7"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "605183a538e3e2a9c1038635cc5c2d194e2ee8fd0d1b66b8349fad7dbacce5a2"
+
dependencies = [
+
"cc",
+
"cfg-if",
+
"libc",
+
"log",
+
"rustversion",
+
"windows",
+
]
+
+
[[package]]
name = "generic-array"
version = "0.14.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "e629b9b98ef3dd8afe6ca2bd0f89306cec16d43d907889945bc5d6687f2f13c7"
[[package]]
+
name = "gloo-storage"
+
version = "0.3.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "fbc8031e8c92758af912f9bc08fbbadd3c6f3cfcbf6b64cdf3d6a81f0139277a"
+
dependencies = [
+
"gloo-utils",
+
"js-sys",
+
"serde",
+
"serde_json",
+
"thiserror 1.0.69",
+
"wasm-bindgen",
+
"web-sys",
+
]
+
+
[[package]]
+
name = "gloo-utils"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "0b5555354113b18c547c1d3a98fbf7fb32a9ff4f6fa112ce823a21641a0ba3aa"
+
dependencies = [
+
"js-sys",
+
"serde",
+
"serde_json",
+
"wasm-bindgen",
+
"web-sys",
+
]
+
+
[[package]]
name = "group"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"markup5ever",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
+
name = "http-range-header"
+
version = "0.4.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9171a2ea8a68358193d15dd5d70c1c10a2afc3e7e4c5bc92bc9f025cebd7359c"
+
+
[[package]]
name = "httparse"
version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "hyper"
-
version = "1.7.0"
+
version = "1.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "eb3aa54a13a0dfe7fbe3a59e0c76093041720fdc77b110cc0fc260fafb4dc51e"
+
checksum = "2ab2d4f250c3d7b1c9fcdff1cece94ea4e2dfbec68614f7b87cb205f24ca9d11"
dependencies = [
"atomic-waker",
"bytes",
···
"http",
"http-body",
"httparse",
+
"httpdate",
"itoa",
"pin-project-lite",
"pin-utils",
···
[[package]]
-
name = "hyper-tls"
-
version = "0.6.0"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "70206fc6890eaca9fde8a0bf71caa2ddfc9fe045ac9e5c70df101a7dbde866e0"
-
dependencies = [
-
"bytes",
-
"http-body-util",
-
"hyper",
-
"hyper-util",
-
"native-tls",
-
"tokio",
-
"tokio-native-tls",
-
"tower-service",
-
]
-
-
[[package]]
name = "hyper-util"
-
version = "0.1.17"
+
version = "0.1.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "3c6995591a8f1380fcb4ba966a252a4b29188d51d2b89e3a252f5305be65aea8"
+
checksum = "52e9a2a24dc5c6821e71a7030e1e14b7b632acac55c40e9d2e082c621261bb56"
dependencies = [
"base64 0.22.1",
"bytes",
···
"js-sys",
"log",
"wasm-bindgen",
-
"windows-core",
+
"windows-core 0.62.2",
]
[[package]]
···
[[package]]
name = "iri-string"
-
version = "0.7.8"
+
version = "0.7.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "dbc5ebe9c3a1a7a5127f920a418f7585e9e758e911d0466ed004f393b0e380b2"
+
checksum = "4f867b9d1d896b67beb18518eda36fdb77a32ea590de864f1325b294a6d14397"
dependencies = [
"memchr",
"serde",
···
[[package]]
name = "jacquard"
-
version = "0.9.0"
+
version = "0.9.3"
dependencies = [
"bytes",
"getrandom 0.2.16",
+
"gloo-storage",
"http",
"jacquard-api",
"jacquard-common",
···
"jose-jwk",
"miette",
"regex",
+
"regex-lite",
"reqwest",
"serde",
"serde_html_form",
···
[[package]]
name = "jacquard-api"
-
version = "0.9.0"
+
version = "0.9.2"
dependencies = [
"bon",
"bytes",
···
[[package]]
name = "jacquard-common"
-
version = "0.9.0"
+
version = "0.9.2"
dependencies = [
"base64 0.22.1",
"bon",
"bytes",
"chrono",
+
"ciborium",
"cid",
+
"futures",
"getrandom 0.2.16",
"getrandom 0.3.4",
"http",
···
"miette",
"multibase",
"multihash",
+
"n0-future 0.1.3",
"ouroboros",
"p256",
"rand 0.9.2",
"regex",
+
"regex-lite",
"reqwest",
"serde",
"serde_html_form",
···
"smol_str",
"thiserror 2.0.17",
"tokio",
+
"tokio-tungstenite-wasm",
"tokio-util",
"trait-variant",
"url",
···
[[package]]
name = "jacquard-derive"
-
version = "0.9.0"
+
version = "0.9.3"
dependencies = [
"heck 0.5.0",
"jacquard-lexicon",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
name = "jacquard-identity"
-
version = "0.9.0"
+
version = "0.9.2"
dependencies = [
"bon",
"bytes",
···
[[package]]
name = "jacquard-lexicon"
-
version = "0.9.0"
+
version = "0.9.2"
dependencies = [
"cid",
"dashmap",
···
"serde_repr",
"serde_with",
"sha2",
-
"syn 2.0.108",
+
"syn 2.0.110",
"thiserror 2.0.17",
"unicode-segmentation",
[[package]]
name = "jacquard-oauth"
-
version = "0.9.0"
+
version = "0.9.2"
dependencies = [
"base64 0.22.1",
"bytes",
···
"serde_html_form",
"serde_json",
"sha2",
-
"signature",
"smol_str",
"thiserror 2.0.17",
"tokio",
···
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe"
dependencies = [
-
"spin",
+
"spin 0.9.8",
[[package]]
···
checksum = "34080505efa8e45a4b816c349525ebe327ceaa8559756f0356cba97ef3bf7432"
[[package]]
+
name = "loom"
+
version = "0.7.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "419e0dc8046cb947daa77eb95ae174acfbddb7673b4151f56d1eed8e93fbfaca"
+
dependencies = [
+
"cfg-if",
+
"generator",
+
"scoped-tls",
+
"tracing",
+
"tracing-subscriber",
+
]
+
+
[[package]]
name = "lru-cache"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
+
name = "matchers"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "d1525a2a28c7f4fa0fc98bb91ae755d1e2d1505079e05539e35bc876b5d65ae9"
+
dependencies = [
+
"regex-automata",
+
]
+
+
[[package]]
+
name = "matchit"
+
version = "0.8.4"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "47e1ffaa40ddd1f3ed91f717a33c8c0ee23fff369e3aa8772b9605cc1d22f4c3"
+
+
[[package]]
name = "memchr"
version = "2.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
name = "mini-moka"
-
version = "0.11.0"
-
source = "git+https://github.com/moka-rs/mini-moka?rev=da864e849f5d034f32e02197fee9bb5d5af36d3d#da864e849f5d034f32e02197fee9bb5d5af36d3d"
+
version = "0.10.99"
dependencies = [
"crossbeam-channel",
"crossbeam-utils",
···
[[package]]
-
name = "native-tls"
-
version = "0.2.14"
+
name = "n0-future"
+
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "87de3442987e9dbec73158d5c715e7ad9072fda936bb03d19d7fa10e00520f0e"
+
checksum = "7bb0e5d99e681ab3c938842b96fcb41bf8a7bb4bfdb11ccbd653a7e83e06c794"
dependencies = [
-
"libc",
-
"log",
-
"openssl",
-
"openssl-probe",
-
"openssl-sys",
-
"schannel",
-
"security-framework",
-
"security-framework-sys",
-
"tempfile",
+
"cfg_aliases",
+
"derive_more",
+
"futures-buffered",
+
"futures-lite",
+
"futures-util",
+
"js-sys",
+
"pin-project",
+
"send_wrapper",
+
"tokio",
+
"tokio-util",
+
"wasm-bindgen",
+
"wasm-bindgen-futures",
+
"web-time",
+
]
+
+
[[package]]
+
name = "n0-future"
+
version = "0.3.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "8c0709ac8235ce13b82bc4d180ee3c42364b90c1a8a628c3422d991d75a728b5"
+
dependencies = [
+
"cfg_aliases",
+
"derive_more",
+
"futures-buffered",
+
"futures-lite",
+
"futures-util",
+
"js-sys",
+
"pin-project",
+
"send_wrapper",
+
"tokio",
+
"tokio-util",
+
"wasm-bindgen",
+
"wasm-bindgen-futures",
+
"web-time",
[[package]]
···
[[package]]
+
name = "nu-ansi-term"
+
version = "0.50.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "7957b9740744892f114936ab4a57b3f487491bbeafaf8083688b16841a4240e5"
+
dependencies = [
+
"windows-sys 0.61.2",
+
]
+
+
[[package]]
name = "num-bigint-dig"
-
version = "0.8.5"
+
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "82c79c15c05d4bf82b6f5ef163104cc81a760d8e874d38ac50ab67c8877b647b"
+
checksum = "e661dda6640fad38e827a6d4a310ff4763082116fe217f279885c97f511bb0b7"
dependencies = [
"lazy_static",
"libm",
···
checksum = "384b8ab6d37215f3c5301a95a4accb5d64aa607f1fcb26a11b5303878451b4fe"
[[package]]
-
name = "openssl"
-
version = "0.10.74"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "24ad14dd45412269e1a30f52ad8f0664f0f4f4a89ee8fe28c3b3527021ebb654"
-
dependencies = [
-
"bitflags",
-
"cfg-if",
-
"foreign-types",
-
"libc",
-
"once_cell",
-
"openssl-macros",
-
"openssl-sys",
-
]
-
-
[[package]]
-
name = "openssl-macros"
-
version = "0.1.1"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
-
dependencies = [
-
"proc-macro2",
-
"quote",
-
"syn 2.0.108",
-
]
-
-
[[package]]
name = "openssl-probe"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d05e27ee213611ffe7d6348b942e8f942b37114c00cc03cec254295a4a17852e"
[[package]]
-
name = "openssl-sys"
-
version = "0.9.110"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "0a9f0075ba3c21b09f8e8b2026584b1d18d49388648f2fbbf3c97ea8deced8e2"
-
dependencies = [
-
"cc",
-
"libc",
-
"pkg-config",
-
"vcpkg",
-
]
-
-
[[package]]
name = "option-ext"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"proc-macro2",
"proc-macro2-diagnostics",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
]
[[package]]
···
[[package]]
+
name = "parking"
+
version = "2.2.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f38d5652c16fde515bb1ecef450ab0f6a219d619a7274976324d5e377f7dceba"
+
+
[[package]]
name = "parking_lot"
version = "0.12.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
+
name = "pin-project"
+
version = "1.1.10"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "677f1add503faace112b9f1373e43e9e054bfdd22ff1a63c1bc485eaec6a6a8a"
+
dependencies = [
+
"pin-project-internal",
+
]
+
+
[[package]]
+
name = "pin-project-internal"
+
version = "1.1.10"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "6e918e4ff8c4549eb882f14b3a4bc8c8bc93de829416eacf579f1207a8fbf861"
+
dependencies = [
+
"proc-macro2",
+
"quote",
+
"syn 2.0.110",
+
]
+
+
[[package]]
name = "pin-project-lite"
version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
-
name = "pkg-config"
-
version = "0.3.32"
-
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "7edddbd0b52d732b21ad9a5fab5c704c14cd949e5e9a1ec5929a24fded1b904c"
-
-
[[package]]
name = "potential_utf"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "479ca8adacdd7ce8f1fb39ce9ecccbfe93a3f1344b3d0d97f20bc0196208f62b"
dependencies = [
"proc-macro2",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"version_check",
"yansi",
···
[[package]]
name = "quote"
-
version = "1.0.41"
+
version = "1.0.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "ce25767e7b499d1b604768e7cde645d14cc8584231ea6b295e9c9eb22c02e1d1"
+
checksum = "a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f"
dependencies = [
"proc-macro2",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
[[package]]
+
name = "regex-lite"
+
version = "0.1.8"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "8d942b98df5e658f56f20d592c7f868833fe38115e65c33003d8cd224b0155da"
+
+
[[package]]
name = "regex-syntax"
version = "0.8.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"http-body-util",
"hyper",
"hyper-rustls",
-
"hyper-tls",
"hyper-util",
"js-sys",
"log",
"mime",
-
"native-tls",
"percent-encoding",
"pin-project-lite",
"quinn",
···
"serde_urlencoded",
"sync_wrapper",
"tokio",
-
"tokio-native-tls",
"tokio-rustls",
"tokio-util",
"tower",
···
[[package]]
name = "resolv-conf"
-
version = "0.7.5"
+
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "6b3789b30bd25ba102de4beabd95d21ac45b69b1be7d14522bab988c526d6799"
+
checksum = "1e061d1b48cb8d38042de4ae0a7a6401009d6143dc80d2e2d6f31f0bdd6470c7"
[[package]]
name = "rfc6979"
···
[[package]]
name = "rsa"
-
version = "0.9.8"
+
version = "0.9.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "78928ac1ed176a5ca1d17e578a1825f3d81ca54cf41053a592584b020cfd691b"
+
checksum = "40a0376c50d0358279d9d643e4bf7b7be212f1f4ff1da9070a7b54d22ef75c88"
dependencies = [
"const-oid",
"digest",
···
[[package]]
name = "rustls"
-
version = "0.23.34"
+
version = "0.23.35"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "6a9586e9ee2b4f8fab52a0048ca7334d7024eef48e2cb9407e3497bb7cab7fa7"
+
checksum = "533f54bc6a7d4f647e46ad909549eda97bf5afc1585190ef692b4286b198bd8f"
dependencies = [
"once_cell",
"ring",
···
[[package]]
+
name = "rustls-native-certs"
+
version = "0.8.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9980d917ebb0c0536119ba501e90834767bffc3d60641457fd84a1f3fd337923"
+
dependencies = [
+
"openssl-probe",
+
"rustls-pki-types",
+
"schannel",
+
"security-framework",
+
]
+
+
[[package]]
name = "rustls-pki-types"
version = "1.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "schemars"
-
version = "1.0.4"
+
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "82d20c4491bc164fa2f6c5d44565947a52ad80b9505d8e36f8d54c27c739fcd0"
+
checksum = "9558e172d4e8533736ba97870c4b2cd63f84b382a3d6eb063da41b91cce17289"
dependencies = [
"dyn-clone",
"ref-cast",
···
[[package]]
+
name = "scoped-tls"
+
version = "1.0.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294"
+
+
[[package]]
name = "scopeguard"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "security-framework"
-
version = "2.11.1"
+
version = "3.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "897b2245f0b511c87893af39b033e5ca9cce68824c4d7e7630b5a1d339658d02"
+
checksum = "b3297343eaf830f66ede390ea39da1d462b6b0c1b000f420d0a83f898bbbe6ef"
dependencies = [
"bitflags",
-
"core-foundation 0.9.4",
+
"core-foundation 0.10.1",
"core-foundation-sys",
"libc",
"security-framework-sys",
···
"core-foundation-sys",
"libc",
+
+
[[package]]
+
name = "send_wrapper"
+
version = "0.6.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "cd0b0ec5f1c1ca621c432a25813d8d60c88abe6d3e08a3eb9cf37d97a0fe3d73"
[[package]]
name = "serde"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
"itoa",
"memchr",
"ryu",
+
"serde",
+
"serde_core",
+
]
+
+
[[package]]
+
name = "serde_path_to_error"
+
version = "0.1.20"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "10a9ff822e371bb5403e391ecd83e182e0e77ba7f6fe0160b795797109d1b457"
+
dependencies = [
+
"itoa",
"serde",
"serde_core",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
[[package]]
name = "serde_with"
-
version = "3.15.1"
+
version = "3.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "aa66c845eee442168b2c8134fec70ac50dc20e760769c8ba0ad1319ca1959b04"
+
checksum = "10574371d41b0d9b2cff89418eda27da52bcaff2cc8741db26382a77c29131f1"
dependencies = [
"base64 0.22.1",
"chrono",
···
"indexmap 1.9.3",
"indexmap 2.12.0",
"schemars 0.9.0",
-
"schemars 1.0.4",
+
"schemars 1.1.0",
"serde_core",
"serde_json",
"serde_with_macros",
···
[[package]]
name = "serde_with_macros"
-
version = "3.15.1"
+
version = "3.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "b91a903660542fced4e99881aa481bdbaec1634568ee02e0b8bd57c64cb38955"
+
checksum = "08a72d8216842fdd57820dc78d840bef99248e35fb2554ff923319e60f2d686b"
dependencies = [
"darling",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+
]
+
+
[[package]]
+
name = "sha1"
+
version = "0.10.6"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "e3bf829a2d51ab4a5ddf1352d8470c140cadc8301b2ae1789db023f01cedd6ba"
+
dependencies = [
+
"cfg-if",
+
"cpufeatures",
+
"digest",
[[package]]
···
[[package]]
+
name = "sharded-slab"
+
version = "0.1.7"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6"
+
dependencies = [
+
"lazy_static",
+
]
+
+
[[package]]
name = "shellexpand"
version = "3.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "6980e8d7511241f8acf4aebddbb1ff938df5eebe98691418c4468d0b72a96a67"
[[package]]
+
name = "spin"
+
version = "0.10.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "d5fe4ccb98d9c292d56fec89a5e07da7fc4cf0dc11e156b41793132775d3e591"
+
+
[[package]]
name = "spki"
version = "0.7.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
"quote",
"serde",
"sha2",
-
"syn 2.0.108",
+
"syn 2.0.110",
"thiserror 1.0.69",
···
[[package]]
name = "syn"
-
version = "2.0.108"
+
version = "2.0.110"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "da58917d35242480a05c2897064da0a80589a2a0476c9a3f2fdc83b53502e917"
+
checksum = "a99801b5bd34ede4cf3fc688c5919368fea4e4814a4664359503e6015b280aea"
dependencies = [
"proc-macro2",
"quote",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+
]
+
+
[[package]]
+
name = "thread_local"
+
version = "1.1.9"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "f60246a4944f24f6e018aa17cdeffb7818b76356965d03b07d6a9886e8962185"
+
dependencies = [
+
"cfg-if",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+
]
+
+
[[package]]
+
name = "tokio-rustls"
+
version = "0.26.4"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "1729aa945f29d91ba541258c8df89027d5792d85a8841fb65e8bf0f4ede4ef61"
+
dependencies = [
+
"rustls",
+
"tokio",
[[package]]
-
name = "tokio-native-tls"
-
version = "0.3.1"
+
name = "tokio-tungstenite"
+
version = "0.24.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "bbae76ab933c85776efabc971569dd6119c580d8f5d448769dec1764bf796ef2"
+
checksum = "edc5f74e248dc973e0dbb7b74c7e0d6fcc301c694ff50049504004ef4d0cdcd9"
dependencies = [
-
"native-tls",
+
"futures-util",
+
"log",
+
"rustls",
+
"rustls-native-certs",
+
"rustls-pki-types",
"tokio",
+
"tokio-rustls",
+
"tungstenite",
[[package]]
-
name = "tokio-rustls"
-
version = "0.26.4"
+
name = "tokio-tungstenite-wasm"
+
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "1729aa945f29d91ba541258c8df89027d5792d85a8841fb65e8bf0f4ede4ef61"
+
checksum = "e21a5c399399c3db9f08d8297ac12b500e86bca82e930253fdc62eaf9c0de6ae"
dependencies = [
+
"futures-channel",
+
"futures-util",
+
"http",
+
"httparse",
+
"js-sys",
"rustls",
+
"thiserror 1.0.69",
"tokio",
+
"tokio-tungstenite",
+
"wasm-bindgen",
+
"web-sys",
[[package]]
name = "tokio-util"
-
version = "0.7.16"
+
version = "0.7.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "14307c986784f72ef81c89db7d9e28d6ac26d16213b109ea501696195e6e3ce5"
+
checksum = "2efa149fe76073d6e8fd97ef4f4eca7b67f599660115591483572e406e165594"
dependencies = [
"bytes",
"futures-core",
"futures-sink",
+
"futures-util",
"pin-project-lite",
"tokio",
···
"tokio",
"tower-layer",
"tower-service",
+
"tracing",
[[package]]
···
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "adc82fd73de2a9722ac5da747f12383d2bfdb93591ee6c58486e0097890f05f2"
dependencies = [
+
"async-compression",
"bitflags",
"bytes",
+
"futures-core",
"futures-util",
"http",
"http-body",
+
"http-body-util",
+
"http-range-header",
+
"httpdate",
"iri-string",
+
"mime",
+
"mime_guess",
+
"percent-encoding",
"pin-project-lite",
+
"tokio",
+
"tokio-util",
"tower",
"tower-layer",
"tower-service",
+
"tracing",
[[package]]
···
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "784e0ac535deb450455cbfa28a6f0df145ea1bb7ae51b821cf5e7927fdcfbdd0"
dependencies = [
+
"log",
"pin-project-lite",
"tracing-attributes",
"tracing-core",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
checksum = "b9d12581f227e93f094d3af2ae690a574abb8a2b9b7a96e7cfe9647b2b617678"
dependencies = [
"once_cell",
+
"valuable",
+
]
+
+
[[package]]
+
name = "tracing-log"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3"
+
dependencies = [
+
"log",
+
"once_cell",
+
"tracing-core",
+
]
+
+
[[package]]
+
name = "tracing-subscriber"
+
version = "0.3.20"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "2054a14f5307d601f88daf0553e1cbf472acc4f2c51afab632431cdcd72124d5"
+
dependencies = [
+
"matchers",
+
"nu-ansi-term",
+
"once_cell",
+
"regex-automata",
+
"sharded-slab",
+
"smallvec",
+
"thread_local",
+
"tracing",
+
"tracing-core",
+
"tracing-log",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
checksum = "e421abadd41a4225275504ea4d6566923418b7f05506fbc9c0fe86ba7396114b"
[[package]]
+
name = "tungstenite"
+
version = "0.24.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "18e5b8366ee7a95b16d32197d0b2604b43a0be89dc5fac9f8e96ccafbaedda8a"
+
dependencies = [
+
"byteorder",
+
"bytes",
+
"data-encoding",
+
"http",
+
"httparse",
+
"log",
+
"rand 0.8.5",
+
"rustls",
+
"rustls-pki-types",
+
"sha1",
+
"thiserror 1.0.69",
+
"utf-8",
+
]
+
+
[[package]]
name = "twoway"
version = "0.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "b4ac048d71ede7ee76d585517add45da530660ef4390e49b098733c6e897f254"
[[package]]
+
name = "unicode-xid"
+
version = "0.2.6"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
+
+
[[package]]
name = "unsigned-varint"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
checksum = "06abde3611657adf66d383f00b093d7faecc7fa57071cce2578660c9f1010821"
[[package]]
-
name = "vcpkg"
-
version = "0.2.15"
+
name = "valuable"
+
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "accd4ea62f7bb7a82fe23066fb0957d48ef677f6eeb8215f372f52e48bb32426"
+
checksum = "ba73ea9cf16a25df0c8caa16c51acb937d5712a8429db78a3ee29d5dcacd3a65"
[[package]]
name = "version_check"
···
"bumpalo",
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"wasm-bindgen-shared",
···
[[package]]
+
name = "windows"
+
version = "0.61.3"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9babd3a767a4c1aef6900409f85f5d53ce2544ccdfaa86dad48c91782c6d6893"
+
dependencies = [
+
"windows-collections",
+
"windows-core 0.61.2",
+
"windows-future",
+
"windows-link 0.1.3",
+
"windows-numerics",
+
]
+
+
[[package]]
+
name = "windows-collections"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "3beeceb5e5cfd9eb1d76b381630e82c4241ccd0d27f1a39ed41b2760b255c5e8"
+
dependencies = [
+
"windows-core 0.61.2",
+
]
+
+
[[package]]
+
name = "windows-core"
+
version = "0.61.2"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "c0fdd3ddb90610c7638aa2b3a3ab2904fb9e5cdbecc643ddb3647212781c4ae3"
+
dependencies = [
+
"windows-implement",
+
"windows-interface",
+
"windows-link 0.1.3",
+
"windows-result 0.3.4",
+
"windows-strings 0.4.2",
+
]
+
+
[[package]]
name = "windows-core"
version = "0.62.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
+
name = "windows-future"
+
version = "0.2.1"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "fc6a41e98427b19fe4b73c550f060b59fa592d7d686537eebf9385621bfbad8e"
+
dependencies = [
+
"windows-core 0.61.2",
+
"windows-link 0.1.3",
+
"windows-threading",
+
]
+
+
[[package]]
name = "windows-implement"
version = "0.60.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
checksum = "f0805222e57f7521d6a62e36fa9163bc891acd422f971defe97d64e70d0a4fe5"
[[package]]
+
name = "windows-numerics"
+
version = "0.2.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "9150af68066c4c5c07ddc0ce30421554771e528bde427614c61038bc2c92c2b1"
+
dependencies = [
+
"windows-core 0.61.2",
+
"windows-link 0.1.3",
+
]
+
+
[[package]]
name = "windows-registry"
-
version = "0.5.3"
+
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
-
checksum = "5b8a9ed28765efc97bbc954883f4e6796c33a06546ebafacbabee9696967499e"
+
checksum = "02752bf7fbdcce7f2a27a742f798510f3e5ad88dbe84871e5168e2120c3d5720"
dependencies = [
-
"windows-link 0.1.3",
-
"windows-result 0.3.4",
-
"windows-strings 0.4.2",
+
"windows-link 0.2.1",
+
"windows-result 0.4.1",
+
"windows-strings 0.5.1",
[[package]]
···
[[package]]
+
name = "windows-threading"
+
version = "0.1.0"
+
source = "registry+https://github.com/rust-lang/crates.io-index"
+
checksum = "b66463ad2e0ea3bbf808b7f1d371311c80e115c0b71d60efc142cafbcfb057a6"
+
dependencies = [
+
"windows-link 0.1.3",
+
]
+
+
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.42.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
···
[[package]]
name = "wisp-cli"
-
version = "0.1.0"
+
version = "0.3.0"
dependencies = [
+
"axum",
"base64 0.22.1",
"bytes",
+
"chrono",
"clap",
"flate2",
"futures",
···
"jacquard-oauth",
"miette",
"mime_guess",
+
"multibase",
+
"multihash",
+
"n0-future 0.3.1",
+
"regex",
"reqwest",
"rustversion",
"serde",
"serde_json",
+
"sha2",
"shellexpand",
"tokio",
+
"tower",
+
"tower-http",
+
"url",
"walkdir",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"synstructure",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
[[package]]
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
"synstructure",
···
dependencies = [
"proc-macro2",
"quote",
-
"syn 2.0.108",
+
"syn 2.0.110",
+20 -9
cli/Cargo.toml
···
[package]
name = "wisp-cli"
-
version = "0.1.0"
+
version = "0.3.0"
edition = "2024"
[features]
···
place_wisp = []
[dependencies]
-
jacquard = { path = "jacquard/crates/jacquard", features = ["loopback"] }
-
jacquard-oauth = { path = "jacquard/crates/jacquard-oauth" }
-
jacquard-api = { path = "jacquard/crates/jacquard-api" }
-
jacquard-common = { path = "jacquard/crates/jacquard-common" }
-
jacquard-identity = { path = "jacquard/crates/jacquard-identity", features = ["dns"] }
-
jacquard-derive = { path = "jacquard/crates/jacquard-derive" }
-
jacquard-lexicon = { path = "jacquard/crates/jacquard-lexicon" }
+
jacquard = { path = "/Users/regent/Developer/jacquard/crates/jacquard", features = ["loopback"] }
+
jacquard-oauth = { path = "/Users/regent/Developer/jacquard/crates/jacquard-oauth" }
+
jacquard-api = { path = "/Users/regent/Developer/jacquard/crates/jacquard-api", features = ["streaming"] }
+
jacquard-common = { path = "/Users/regent/Developer/jacquard/crates/jacquard-common", features = ["websocket"] }
+
jacquard-identity = { path = "/Users/regent/Developer/jacquard/crates/jacquard-identity", features = ["dns"] }
+
jacquard-derive = { path = "/Users/regent/Developer/jacquard/crates/jacquard-derive" }
+
jacquard-lexicon = { path = "/Users/regent/Developer/jacquard/crates/jacquard-lexicon" }
clap = { version = "4.5.51", features = ["derive"] }
tokio = { version = "1.48", features = ["full"] }
miette = { version = "7.6.0", features = ["fancy"] }
serde_json = "1.0.145"
serde = { version = "1.0", features = ["derive"] }
shellexpand = "3.1.1"
-
reqwest = "0.12"
+
#reqwest = "0.12"
+
reqwest = { version = "0.12", default-features = false, features = ["rustls-tls"] }
rustversion = "1.0"
flate2 = "1.0"
base64 = "0.22"
···
mime_guess = "2.0"
bytes = "1.10"
futures = "0.3.31"
+
multihash = "0.19.3"
+
multibase = "0.9"
+
sha2 = "0.10"
+
axum = "0.8.7"
+
tower-http = { version = "0.6.6", features = ["fs", "compression-gzip"] }
+
tower = "0.5.2"
+
n0-future = "0.3.1"
+
chrono = "0.4"
+
url = "2.5"
+
regex = "1.11"
+271
cli/README.md
···
+
# Wisp CLI
+
+
A command-line tool for deploying static sites to your AT Protocol repo, to be served on [wisp.place](https://wisp.place), an AT Protocol indexer that hosts such sites.
+
+
## Why?
+
+
The PDS serves as a way to verifiably, cryptographically prove that you own your site: that it was you (or at least someone who controls your account) who uploaded it. It is also a manifest of each file in the site to ensure file integrity. Keeping hosting separate means you could move your site across other servers, or even serverless solutions, for speedy delivery, while keeping it backed by an absolute source of truth: the manifest record and the blobs of each file in your repo.
+
+
## Features
+
+
- Deploy static sites directly to your AT Protocol repo
+
- Supports both OAuth and app password authentication
+
- Preserves directory structure and file integrity
+
+
## Soon
+
+
- Host sites
+
- Manage and delete sites
+
- Metrics and logs for self-hosting
+
+
## Installation
+
+
### From Source
+
+
```bash
+
cargo build --release
+
```
+
+
Check out the build scripts for cross-compilation using nix-shell.
+
+
The binary will be available at `target/release/wisp-cli`.
+
+
## Usage
+
+
### Basic Deployment
+
+
Deploy the current directory:
+
+
```bash
+
wisp-cli nekomimi.pet --path . --site my-site
+
```
+
+
Deploy a specific directory:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./dist/ --site my-site
+
```
+
+
### Authentication Methods
+
+
#### OAuth (Recommended)
+
+
By default, the CLI uses OAuth authentication with a local loopback server:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site
+
```
+
+
This will:
+
1. Open your browser for authentication
+
2. Save the session to a file (default: `/tmp/wisp-oauth-session.json`)
+
3. Reuse the session for future deployments
+
+
Specify a custom session file location:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site --store ~/.wisp-session.json
+
```
+
+
#### App Password
+
+
For headless environments or CI/CD, use an app password:
+
+
```bash
+
wisp-cli alice.bsky.social --path ./my-site --site my-site --password YOUR_APP_PASSWORD
+
```
+
+
**Note:** When using `--password`, the `--store` option is ignored.
+
+
## Command-Line Options
+
+
```
+
wisp-cli [OPTIONS] <INPUT>
+
+
Arguments:
+
<INPUT> Handle (e.g., alice.bsky.social), DID, or PDS URL
+
+
Options:
+
-p, --path <PATH> Path to the directory containing your static site [default: .]
+
-s, --site <SITE> Site name (defaults to directory name)
+
--store <STORE> Path to auth store file (only used with OAuth) [default: /tmp/wisp-oauth-session.json]
+
--password <PASSWORD> App Password for authentication (alternative to OAuth)
+
-h, --help Print help
+
-V, --version Print version
+
```
+
+
## How It Works
+
+
1. **Authentication**: Authenticates using OAuth or app password
+
2. **File Processing**:
+
- Recursively walks the directory tree
+
- Skips hidden files (starting with `.`)
+
- Detects MIME types automatically
+
- Compresses files with gzip
+
- Base64 encodes compressed content
+
3. **Upload**:
+
- Uploads files as blobs to your PDS
+
- Processes up to 5 files concurrently
+
- Creates a `place.wisp.fs` record with the site manifest
+
4. **Deployment**: Site is immediately available at `https://sites.wisp.place/{did}/{site-name}`
+
+
## File Processing
+
+
All files are automatically:
+
+
- **Compressed** with gzip (level 9)
+
- **Base64 encoded** to bypass PDS content sniffing
+
- **Uploaded** as `application/octet-stream` blobs
+
- **Stored** with original MIME type metadata
+
+
The hosting service automatically decompresses non-HTML/CSS/JS files when serving them.
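The compress-then-encode pipeline above can be reproduced with standard tools. This is a sketch only; the file names are hypothetical, and the real CLI performs these steps in Rust with `flate2` and `base64`:

```shell
# Create a sample file, then mimic the CLI's per-file processing:
printf '<h1>hello</h1>\n' > index.html

# gzip at maximum compression (level 9), then base64-encode the result
gzip -9 -c index.html | base64 > index.html.b64

# The base64 text is what gets uploaded as an application/octet-stream blob;
# decoding and gunzipping round-trips back to the original file.
base64 -d index.html.b64 | gunzip
```

Decoding then decompressing prints the original `<h1>hello</h1>`, which is exactly what the hosting service does on the way out.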
+
+
## Limitations
+
+
- **Max file size**: 100MB per file after compression (this is a PDS limit; the CLI does not enforce it, in case yours is higher)
+
- **Max file count**: 2000 files
+
- **Site name** must follow AT Protocol rkey format rules (alphanumeric, hyphens, underscores)
+
+
## Deploy with CI/CD
+
+
### GitHub Actions
+
+
```yaml
+
name: Deploy to Wisp
+
on:
+
push:
+
branches: [main]
+
+
jobs:
+
deploy:
+
runs-on: ubuntu-latest
+
steps:
+
- uses: actions/checkout@v3
+
+
- name: Setup Node
+
uses: actions/setup-node@v3
+
with:
+
node-version: '25'
+
+
- name: Install dependencies
+
run: npm install
+
+
- name: Build site
+
run: npm run build
+
+
- name: Download Wisp CLI
+
run: |
+
curl -L https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
- name: Deploy to Wisp
+
env:
+
WISP_APP_PASSWORD: ${{ secrets.WISP_APP_PASSWORD }}
+
run: |
+
./wisp-cli alice.bsky.social \
+
--path ./dist \
+
--site my-site \
+
--password "$WISP_APP_PASSWORD"
+
```
+
+
### Tangled.org
+
+
```yaml
+
when:
+
- event: ['push']
+
branch: ['main']
+
- event: ['manual']
+
+
engine: 'nixery'
+
+
clone:
+
skip: false
+
depth: 1
+
submodules: false
+
+
dependencies:
+
nixpkgs:
+
- nodejs
+
- coreutils
+
- curl
+
github:NixOS/nixpkgs/nixpkgs-unstable:
+
- bun
+
+
environment:
+
SITE_PATH: 'dist'
+
SITE_NAME: 'my-site'
+
WISP_HANDLE: 'your-handle.bsky.social'
+
+
steps:
+
- name: build site
+
command: |
+
export PATH="$HOME/.nix-profile/bin:$PATH"
+
+
# regenerate lockfile
+
rm package-lock.json bun.lock
+
bun install @rolldown/binding-linux-arm64-gnu --save-optional
+
bun install
+
+
# build with vite
+
bun node_modules/.bin/vite build
+
+
- name: deploy to wisp
+
command: |
+
# Download Wisp CLI
+
curl https://sites.wisp.place/nekomimi.pet/wisp-cli-binaries/wisp-cli-x86_64-linux -o wisp-cli
+
chmod +x wisp-cli
+
+
# Deploy to Wisp
+
./wisp-cli \
+
"$WISP_HANDLE" \
+
--path "$SITE_PATH" \
+
--site "$SITE_NAME" \
+
--password "$WISP_APP_PASSWORD"
+
```
+
+
### Generic Shell Script
+
+
```bash
+
# Use app password from environment variable
+
wisp-cli alice.bsky.social --path ./dist --site my-site --password "$WISP_APP_PASSWORD"
+
```
+
+
## Output
+
+
Upon successful deployment, you'll see:
+
+
```
+
Deployed site 'my-site': at://did:plc:abc123xyz/place.wisp.fs/my-site
+
Available at: https://sites.wisp.place/did:plc:abc123xyz/my-site
+
```
+
+
## Dependencies
+
+
- **jacquard**: AT Protocol client library
+
- **clap**: Command-line argument parsing
+
- **tokio**: Async runtime
+
- **flate2**: Gzip compression
+
- **base64**: Base64 encoding
+
- **walkdir**: Directory traversal
+
- **mime_guess**: MIME type detection
+
+
## License
+
+
MIT License
+
+
## Contributing
+
+
Just don't give me pure Claude slop, especially not in the PR description itself. You should be responsible for the code you submit and aware of what it is you're submitting.
+
+
## Links
+
+
- **Website**: https://wisp.place
+
- **Main Repository**: https://tangled.org/@nekomimi.pet/wisp.place-monorepo
+
- **AT Protocol**: https://atproto.com
+
- **Jacquard Library**: https://tangled.org/@nonbinary.computer/jacquard
+
+
## Support
+
+
For issues and questions:
+
- Check the main wisp.place documentation
+
- Open an issue in the main repository
+23
cli/build-linux.sh
···
+
#!/usr/bin/env bash
+
# Build Linux binaries (statically linked)
+
set -e
+
mkdir -p binaries
+
+
# Build Linux binaries
+
echo "Building Linux binaries..."
+
+
echo "Building Linux ARM64 (static)..."
+
nix-shell -p rustup --run '
+
rustup target add aarch64-unknown-linux-musl
+
RUSTFLAGS="-C target-feature=+crt-static" cargo zigbuild --release --target aarch64-unknown-linux-musl
+
'
+
cp target/aarch64-unknown-linux-musl/release/wisp-cli binaries/wisp-cli-aarch64-linux
+
+
echo "Building Linux x86_64 (static)..."
+
nix-shell -p rustup --run '
+
rustup target add x86_64-unknown-linux-musl
+
RUSTFLAGS="-C target-feature=+crt-static" cargo build --release --target x86_64-unknown-linux-musl
+
'
+
cp target/x86_64-unknown-linux-musl/release/wisp-cli binaries/wisp-cli-x86_64-linux
+
+
echo "Done! Binaries in ./binaries/"
+15
cli/build-macos.sh
···
+
#!/bin/bash
+
# Build macOS binaries
+
+
set -e
+
+
mkdir -p binaries
+
rm -rf target
+
+
# Build macOS binaries natively
+
echo "Building macOS binaries..."
+
rustup target add aarch64-apple-darwin
+
+
echo "Building macOS arm64 binary."
+
RUSTFLAGS="-C target-feature=+crt-static" cargo build --release --target aarch64-apple-darwin
+
cp target/aarch64-apple-darwin/release/wisp-cli binaries/wisp-cli-macos-arm64
-51
cli/lexicons/place/wisp/fs.json
···
-
{
-
"lexicon": 1,
-
"id": "place.wisp.fs",
-
"defs": {
-
"main": {
-
"type": "record",
-
"description": "Virtual filesystem manifest for a Wisp site",
-
"record": {
-
"type": "object",
-
"required": ["site", "root", "createdAt"],
-
"properties": {
-
"site": { "type": "string" },
-
"root": { "type": "ref", "ref": "#directory" },
-
"fileCount": { "type": "integer", "minimum": 0, "maximum": 1000 },
-
"createdAt": { "type": "string", "format": "datetime" }
-
}
-
}
-
},
-
"file": {
-
"type": "object",
-
"required": ["type", "blob"],
-
"properties": {
-
"type": { "type": "string", "const": "file" },
-
"blob": { "type": "blob", "accept": ["*/*"], "maxSize": 1000000, "description": "Content blob ref" },
-
"encoding": { "type": "string", "enum": ["gzip"], "description": "Content encoding (e.g., gzip for compressed files)" },
-
"mimeType": { "type": "string", "description": "Original MIME type before compression" },
-
"base64": { "type": "boolean", "description": "True if blob content is base64-encoded (used to bypass PDS content sniffing)" }
-
}
-
},
-
"directory": {
-
"type": "object",
-
"required": ["type", "entries"],
-
"properties": {
-
"type": { "type": "string", "const": "directory" },
-
"entries": {
-
"type": "array",
-
"maxLength": 500,
-
"items": { "type": "ref", "ref": "#entry" }
-
}
-
}
-
},
-
"entry": {
-
"type": "object",
-
"required": ["name", "node"],
-
"properties": {
-
"name": { "type": "string", "maxLength": 255 },
-
"node": { "type": "union", "refs": ["#file", "#directory"] }
-
}
-
}
-
}
-
}
+89
cli/src/blob_map.rs
···
+
use jacquard_common::types::blob::BlobRef;
+
use jacquard_common::IntoStatic;
+
use std::collections::HashMap;
+
+
use crate::place_wisp::fs::{Directory, EntryNode};
+
+
/// Extract blob information from a directory tree
+
/// Returns a map of file paths to their blob refs and CIDs
+
///
+
/// This mirrors the TypeScript implementation in src/lib/wisp-utils.ts lines 275-302
+
pub fn extract_blob_map(
+
directory: &Directory,
+
) -> HashMap<String, (BlobRef<'static>, String)> {
+
extract_blob_map_recursive(directory, String::new())
+
}
+
+
fn extract_blob_map_recursive(
+
directory: &Directory,
+
current_path: String,
+
) -> HashMap<String, (BlobRef<'static>, String)> {
+
let mut blob_map = HashMap::new();
+
+
for entry in &directory.entries {
+
let full_path = if current_path.is_empty() {
+
entry.name.to_string()
+
} else {
+
format!("{}/{}", current_path, entry.name)
+
};
+
+
match &entry.node {
+
EntryNode::File(file_node) => {
+
// Extract CID from blob ref
+
// BlobRef is an enum with Blob variant, which has a ref field (CidLink)
+
let blob_ref = &file_node.blob;
+
let cid_string = blob_ref.blob().r#ref.to_string();
+
+
// Store with full path (mirrors TypeScript implementation)
+
blob_map.insert(
+
full_path,
+
(blob_ref.clone().into_static(), cid_string)
+
);
+
}
+
EntryNode::Directory(subdir) => {
+
let sub_map = extract_blob_map_recursive(subdir, full_path);
+
blob_map.extend(sub_map);
+
}
+
EntryNode::Subfs(_) => {
+
// Subfs nodes don't contain blobs directly - they reference other records
+
// Skip them in blob map extraction
+
}
+
EntryNode::Unknown(_) => {
+
// Skip unknown node types
+
}
+
}
+
}
+
+
blob_map
+
}
+
+
/// Normalize file path by removing base folder prefix
+
/// Example: "cobblemon/index.html" -> "index.html"
+
///
+
/// Note: This function is kept for reference but is no longer used in production code.
+
/// The TypeScript server has a similar normalization (src/routes/wisp.ts line 291) to handle
+
/// uploads that include a base folder prefix, but our CLI doesn't need this since we
+
/// track full paths consistently.
+
#[allow(dead_code)]
+
pub fn normalize_path(path: &str) -> String {
+
// Remove base folder prefix (everything before first /)
+
if let Some(idx) = path.find('/') {
+
path[idx + 1..].to_string()
+
} else {
+
path.to_string()
+
}
+
}
+
+
#[cfg(test)]
+
mod tests {
+
use super::*;
+
+
#[test]
+
fn test_normalize_path() {
+
assert_eq!(normalize_path("index.html"), "index.html");
+
assert_eq!(normalize_path("cobblemon/index.html"), "index.html");
+
assert_eq!(normalize_path("folder/subfolder/file.txt"), "subfolder/file.txt");
+
assert_eq!(normalize_path("a/b/c/d.txt"), "b/c/d.txt");
+
}
+
}
+
+66
cli/src/cid.rs
···
+
use jacquard_common::types::cid::IpldCid;
+
use sha2::{Digest, Sha256};
+
+
/// Compute CID (Content Identifier) for blob content
+
/// Uses the same algorithm as AT Protocol: CIDv1 with raw codec (0x55) and SHA-256
+
///
+
/// CRITICAL: This must be called on BASE64-ENCODED GZIPPED content, not just gzipped content
+
///
+
/// Based on @atproto/common/src/ipld.ts sha256RawToCid implementation
+
pub fn compute_cid(content: &[u8]) -> String {
+
// Compute the SHA-256 hash of the content (same digest the AT Protocol uses)
+
let hash = Sha256::digest(content);
+
+
// Create multihash (code 0x12 = sha2-256)
+
let multihash = multihash::Multihash::wrap(0x12, &hash)
+
.expect("SHA-256 hash should always fit in multihash");
+
+
// Create CIDv1 with raw codec (0x55)
+
let cid = IpldCid::new_v1(0x55, multihash);
+
+
// Convert to base32 string representation
+
cid.to_string_of_base(multibase::Base::Base32Lower)
+
.unwrap_or_else(|_| cid.to_string())
+
}
+
+
#[cfg(test)]
+
mod tests {
+
use super::*;
+
use base64::Engine;
+
+
#[test]
+
fn test_compute_cid() {
+
// Test with a simple string: "hello"
+
let content = b"hello";
+
let cid = compute_cid(content);
+
+
// CID should start with 'baf' for raw codec base32
+
assert!(cid.starts_with("baf"));
+
}
+
+
#[test]
+
fn test_compute_cid_base64_encoded() {
+
// Simulate the actual use case: gzipped then base64 encoded
+
use flate2::write::GzEncoder;
+
use flate2::Compression;
+
use std::io::Write;
+
+
let original = b"hello world";
+
+
// Gzip compress
+
let mut encoder = GzEncoder::new(Vec::new(), Compression::default());
+
encoder.write_all(original).unwrap();
+
let gzipped = encoder.finish().unwrap();
+
+
// Base64 encode the gzipped data
+
let base64_bytes = base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes();
+
+
// Compute CID on the base64 bytes
+
let cid = compute_cid(&base64_bytes);
+
+
// Should be a valid CID
+
assert!(cid.starts_with("baf"));
+
assert!(cid.len() > 10);
+
}
+
}
+
+71
cli/src/download.rs
···
+
use base64::Engine;
+
use bytes::Bytes;
+
use flate2::read::GzDecoder;
+
use jacquard_common::types::blob::BlobRef;
+
use miette::IntoDiagnostic;
+
use std::io::Read;
+
use url::Url;
+
+
/// Download a blob from the PDS
+
pub async fn download_blob(pds_url: &Url, blob_ref: &BlobRef<'_>, did: &str) -> miette::Result<Bytes> {
+
// Extract CID from blob ref
+
let cid = blob_ref.blob().r#ref.to_string();
+
+
// Construct blob download URL
+
// Endpoint: /xrpc/com.atproto.sync.getBlob?did={did}&cid={cid}
+
let blob_url = pds_url
+
.join(&format!("/xrpc/com.atproto.sync.getBlob?did={}&cid={}", did, cid))
+
.into_diagnostic()?;
+
+
let client = reqwest::Client::new();
+
let response = client
+
.get(blob_url)
+
.send()
+
.await
+
.into_diagnostic()?;
+
+
if !response.status().is_success() {
+
return Err(miette::miette!(
+
"Failed to download blob: {}",
+
response.status()
+
));
+
}
+
+
let bytes = response.bytes().await.into_diagnostic()?;
+
Ok(bytes)
+
}
+
+
/// Decompress and decode a blob (base64 + gzip)
+
pub fn decompress_blob(data: &[u8], is_base64: bool, is_gzipped: bool) -> miette::Result<Vec<u8>> {
+
let mut current_data = data.to_vec();
+
+
// First, decode base64 if needed
+
if is_base64 {
+
current_data = base64::prelude::BASE64_STANDARD
+
.decode(&current_data)
+
.into_diagnostic()?;
+
}
+
+
// Then, decompress gzip if needed
+
if is_gzipped {
+
let mut decoder = GzDecoder::new(&current_data[..]);
+
let mut decompressed = Vec::new();
+
decoder.read_to_end(&mut decompressed).into_diagnostic()?;
+
current_data = decompressed;
+
}
+
+
Ok(current_data)
+
}
+
+
/// Download and decompress a blob
+
pub async fn download_and_decompress_blob(
+
pds_url: &Url,
+
blob_ref: &BlobRef<'_>,
+
did: &str,
+
is_base64: bool,
+
is_gzipped: bool,
+
) -> miette::Result<Vec<u8>> {
+
let data = download_blob(pds_url, blob_ref, did).await?;
+
decompress_blob(&data, is_base64, is_gzipped)
+
}
+
+9
cli/src/lib.rs
···
+
// @generated by jacquard-lexicon. DO NOT EDIT.
+
//
+
// This file was automatically generated from Lexicon schemas.
+
// Any manual changes will be overwritten on the next regeneration.
+
+
pub mod builder_types;
+
+
#[cfg(feature = "place_wisp")]
+
pub mod place_wisp;
+543 -68
cli/src/main.rs
···
mod builder_types;
mod place_wisp;
+
mod cid;
+
mod blob_map;
+
mod metadata;
+
mod download;
+
mod pull;
+
mod serve;
+
mod subfs_utils;
+
mod redirects;
-
use clap::Parser;
+
use clap::{Parser, Subcommand};
use jacquard::CowStr;
-
use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession};
+
use jacquard::client::{Agent, FileAuthStore, AgentSessionExt, MemoryCredentialSession, AgentSession};
use jacquard::oauth::client::OAuthClient;
use jacquard::oauth::loopback::LoopbackConfig;
use jacquard::prelude::IdentityResolver;
···
use jacquard_common::types::blob::MimeType;
use miette::IntoDiagnostic;
use std::path::{Path, PathBuf};
+
use std::collections::HashMap;
use flate2::Compression;
use flate2::write::GzEncoder;
use std::io::Write;
···
use place_wisp::fs::*;
#[derive(Parser, Debug)]
-
#[command(author, version, about = "Deploy a static site to wisp.place")]
+
#[command(author, version, about = "wisp.place CLI tool")]
struct Args {
+
#[command(subcommand)]
+
command: Option<Commands>,
+
+
// Deploy arguments (when no subcommand is specified)
/// Handle (e.g., alice.bsky.social), DID, or PDS URL
-
input: CowStr<'static>,
+
#[arg(global = true, conflicts_with = "command")]
+
input: Option<CowStr<'static>>,
/// Path to the directory containing your static site
-
#[arg(short, long, default_value = ".")]
-
path: PathBuf,
+
#[arg(short, long, global = true, conflicts_with = "command")]
+
path: Option<PathBuf>,
/// Site name (defaults to directory name)
-
#[arg(short, long)]
+
#[arg(short, long, global = true, conflicts_with = "command")]
site: Option<String>,
-
/// Path to auth store file (will be created if missing, only used with OAuth)
-
#[arg(long, default_value = "/tmp/wisp-oauth-session.json")]
-
store: String,
+
/// Path to auth store file
+
#[arg(long, global = true, conflicts_with = "command")]
+
store: Option<String>,
-
/// App Password for authentication (alternative to OAuth)
-
#[arg(long)]
+
/// App Password for authentication
+
#[arg(long, global = true, conflicts_with = "command")]
password: Option<CowStr<'static>>,
}
+
#[derive(Subcommand, Debug)]
+
enum Commands {
+
/// Deploy a static site to wisp.place (default command)
+
Deploy {
+
/// Handle (e.g., alice.bsky.social), DID, or PDS URL
+
input: CowStr<'static>,
+
+
/// Path to the directory containing your static site
+
#[arg(short, long, default_value = ".")]
+
path: PathBuf,
+
+
/// Site name (defaults to directory name)
+
#[arg(short, long)]
+
site: Option<String>,
+
+
/// Path to auth store file (will be created if missing, only used with OAuth)
+
#[arg(long, default_value = "/tmp/wisp-oauth-session.json")]
+
store: String,
+
+
/// App Password for authentication (alternative to OAuth)
+
#[arg(long)]
+
password: Option<CowStr<'static>>,
+
},
+
/// Pull a site from the PDS to a local directory
+
Pull {
+
/// Handle (e.g., alice.bsky.social) or DID
+
input: CowStr<'static>,
+
+
/// Site name (record key)
+
#[arg(short, long)]
+
site: String,
+
+
/// Output directory for the downloaded site
+
#[arg(short, long, default_value = ".")]
+
output: PathBuf,
+
},
+
/// Serve a site locally with real-time firehose updates
+
Serve {
+
/// Handle (e.g., alice.bsky.social) or DID
+
input: CowStr<'static>,
+
+
/// Site name (record key)
+
#[arg(short, long)]
+
site: String,
+
+
/// Output directory for the site files
+
#[arg(short, long, default_value = ".")]
+
output: PathBuf,
+
+
/// Port to serve on
+
#[arg(short, long, default_value = "8080")]
+
port: u16,
+
},
+
}
+
#[tokio::main]
async fn main() -> miette::Result<()> {
let args = Args::parse();
-
// Dispatch to appropriate authentication method
-
if let Some(password) = args.password {
-
run_with_app_password(args.input, password, args.path, args.site).await
-
} else {
-
run_with_oauth(args.input, args.store, args.path, args.site).await
+
let result = match args.command {
+
Some(Commands::Deploy { input, path, site, store, password }) => {
+
// Dispatch to appropriate authentication method
+
if let Some(password) = password {
+
run_with_app_password(input, password, path, site).await
+
} else {
+
run_with_oauth(input, store, path, site).await
+
}
+
}
+
Some(Commands::Pull { input, site, output }) => {
+
pull::pull_site(input, CowStr::from(site), output).await
+
}
+
Some(Commands::Serve { input, site, output, port }) => {
+
serve::serve_site(input, CowStr::from(site), output, port).await
+
}
+
None => {
+
// Legacy mode: if input is provided, assume deploy command
+
if let Some(input) = args.input {
+
let path = args.path.unwrap_or_else(|| PathBuf::from("."));
+
let store = args.store.unwrap_or_else(|| "/tmp/wisp-oauth-session.json".to_string());
+
+
// Dispatch to appropriate authentication method
+
if let Some(password) = args.password {
+
run_with_app_password(input, password, path, args.site).await
+
} else {
+
run_with_oauth(input, store, path, args.site).await
+
}
+
} else {
+
// No command and no input, show help
+
use clap::CommandFactory;
+
Args::command().print_help().into_diagnostic()?;
+
Ok(())
+
}
+
}
+
};
+
+
// Force exit to avoid hanging on background tasks/connections
+
match result {
+
Ok(_) => std::process::exit(0),
+
Err(e) => {
+
eprintln!("{:?}", e);
+
std::process::exit(1)
+
}
}
}
···
site: Option<String>,
) -> miette::Result<()> {
let (session, auth) =
-
MemoryCredentialSession::authenticated(input, password, None).await?;
+
MemoryCredentialSession::authenticated(input, password, None, None).await?;
println!("Signed in as {}", auth.handle);
let agent: Agent<_> = Agent::from(session);
···
path: PathBuf,
site: Option<String>,
) -> miette::Result<()> {
-
let oauth = OAuthClient::with_default_config(FileAuthStore::new(&store));
+
use jacquard::oauth::scopes::Scope;
+
use jacquard::oauth::atproto::AtprotoClientMetadata;
+
use jacquard::oauth::session::ClientData;
+
use url::Url;
+
+
// Request the necessary scopes for wisp.place
+
let scopes = Scope::parse_multiple("atproto repo:place.wisp.fs repo:place.wisp.subfs blob:*/*")
+
.map_err(|e| miette::miette!("Failed to parse scopes: {:?}", e))?;
+
+
// Create redirect URIs that match the loopback server (port 4000, path /oauth/callback)
+
let redirect_uris = vec![
+
Url::parse("http://127.0.0.1:4000/oauth/callback").into_diagnostic()?,
+
Url::parse("http://[::1]:4000/oauth/callback").into_diagnostic()?,
+
];
+
+
// Create client metadata with matching redirect URIs and scopes
+
let client_data = ClientData {
+
keyset: None,
+
config: AtprotoClientMetadata::new_localhost(
+
Some(redirect_uris),
+
Some(scopes),
+
),
+
};
+
+
let oauth = OAuthClient::new(FileAuthStore::new(&store), client_data);
+
let session = oauth
.login_with_local_server(input, Default::default(), LoopbackConfig::default())
.await?;
···
println!("Deploying site '{}'...", site_name);
+
// Try to fetch existing manifest for incremental updates
+
let (existing_blob_map, old_subfs_uris): (HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>, Vec<(String, String)>) = {
+
use jacquard_common::types::string::AtUri;
+
+
// Get the DID for this session
+
let session_info = agent.session_info().await;
+
if let Some((did, _)) = session_info {
+
// Construct the AT URI for the record
+
let uri_string = format!("at://{}/place.wisp.fs/{}", did, site_name);
+
if let Ok(uri) = AtUri::new(&uri_string) {
+
match agent.get_record::<Fs>(&uri).await {
+
Ok(response) => {
+
match response.into_output() {
+
Ok(record_output) => {
+
let existing_manifest = record_output.value;
+
let mut blob_map = blob_map::extract_blob_map(&existing_manifest.root);
+
println!("Found existing manifest with {} files in main record", blob_map.len());
+
+
// Extract subfs URIs from main record
+
let subfs_uris = subfs_utils::extract_subfs_uris(&existing_manifest.root, String::new());
+
+
if !subfs_uris.is_empty() {
+
println!("Found {} subfs records, fetching for blob reuse...", subfs_uris.len());
+
+
// Merge blob maps from all subfs records
+
match subfs_utils::merge_subfs_blob_maps(agent, subfs_uris.clone(), &mut blob_map).await {
+
Ok(merged_count) => {
+
println!("Total blob map: {} files (main + {} from subfs)", blob_map.len(), merged_count);
+
}
+
Err(e) => {
+
eprintln!("โš ๏ธ Failed to merge some subfs blob maps: {}", e);
+
}
+
}
+
+
(blob_map, subfs_uris)
+
} else {
+
(blob_map, Vec::new())
+
}
+
}
+
Err(_) => {
+
println!("No existing manifest found, uploading all files...");
+
(HashMap::new(), Vec::new())
+
}
+
}
+
}
+
Err(_) => {
+
// Record doesn't exist yet - this is a new site
+
println!("No existing manifest found, uploading all files...");
+
(HashMap::new(), Vec::new())
+
}
+
}
+
} else {
+
println!("No existing manifest found (invalid URI), uploading all files...");
+
(HashMap::new(), Vec::new())
+
}
+
} else {
+
println!("No existing manifest found (could not get DID), uploading all files...");
+
(HashMap::new(), Vec::new())
+
}
+
};
+
// Build directory tree
-
let root_dir = build_directory(agent, &path).await?;
+
let (root_dir, total_files, reused_count) = build_directory(agent, &path, &existing_blob_map, String::new()).await?;
+
let uploaded_count = total_files - reused_count;
+
+
// Check if we need to split into subfs records
+
const MAX_MANIFEST_SIZE: usize = 140 * 1024; // 140KB (PDS limit is 150KB)
+
const FILE_COUNT_THRESHOLD: usize = 250; // Start splitting at this many files
+
const TARGET_FILE_COUNT: usize = 200; // Keep main manifest under this
+
+
let mut working_directory = root_dir;
+
let mut current_file_count = total_files;
+
let mut new_subfs_uris: Vec<(String, String)> = Vec::new();
+
+
// Estimate initial manifest size
+
let mut manifest_size = subfs_utils::estimate_directory_size(&working_directory);
+
+
if total_files >= FILE_COUNT_THRESHOLD || manifest_size > MAX_MANIFEST_SIZE {
+
println!("\nโš ๏ธ Large site detected ({} files, {:.1}KB manifest), splitting into subfs records...",
+
total_files, manifest_size as f64 / 1024.0);
+
+
let mut attempts = 0;
+
const MAX_SPLIT_ATTEMPTS: usize = 50;
+
+
while (manifest_size > MAX_MANIFEST_SIZE || current_file_count > TARGET_FILE_COUNT) && attempts < MAX_SPLIT_ATTEMPTS {
+
attempts += 1;
+
+
// Find large directories to split
+
let directories = subfs_utils::find_large_directories(&working_directory, String::new());
+
+
if let Some(largest_dir) = directories.first() {
+
println!(" Split #{}: {} ({} files, {:.1}KB)",
+
attempts, largest_dir.path, largest_dir.file_count, largest_dir.size as f64 / 1024.0);
+
+
// Create a subfs record for this directory
+
use jacquard_common::types::string::Tid;
+
let subfs_tid = Tid::now_0();
+
let subfs_rkey = subfs_tid.to_string();
+
+
let subfs_manifest = crate::place_wisp::subfs::SubfsRecord::new()
+
.root(convert_fs_dir_to_subfs_dir(largest_dir.directory.clone()))
+
.file_count(Some(largest_dir.file_count as i64))
+
.created_at(Datetime::now())
+
.build();
+
+
// Upload subfs record
+
let subfs_output = agent.put_record(
+
RecordKey::from(Rkey::new(&subfs_rkey).into_diagnostic()?),
+
subfs_manifest
+
).await.into_diagnostic()?;
+
+
let subfs_uri = subfs_output.uri.to_string();
+
println!(" โœ… Created subfs: {}", subfs_uri);
+
+
// Replace directory with subfs node (flat: false to preserve structure)
+
working_directory = subfs_utils::replace_directory_with_subfs(
+
working_directory,
+
&largest_dir.path,
+
&subfs_uri,
+
false // Preserve directory structure
+
)?;
+
+
new_subfs_uris.push((subfs_uri, largest_dir.path.clone()));
+
current_file_count -= largest_dir.file_count;
+
+
// Recalculate manifest size
+
manifest_size = subfs_utils::estimate_directory_size(&working_directory);
+
println!(" โ†’ Manifest now {:.1}KB with {} files ({} subfs total)",
+
manifest_size as f64 / 1024.0, current_file_count, new_subfs_uris.len());
+
+
if manifest_size <= MAX_MANIFEST_SIZE && current_file_count <= TARGET_FILE_COUNT {
+
println!("โœ… Manifest now fits within limits");
+
break;
+
}
+
} else {
+
println!(" No more subdirectories to split - stopping");
+
break;
+
}
+
}
-
// Count total files
-
let file_count = count_files(&root_dir);
+
if attempts >= MAX_SPLIT_ATTEMPTS {
+
return Err(miette::miette!(
+
"Exceeded maximum split attempts ({}). Manifest still too large: {:.1}KB with {} files",
+
MAX_SPLIT_ATTEMPTS,
+
manifest_size as f64 / 1024.0,
+
current_file_count
+
));
+
}
+
+
println!("โœ… Split complete: {} subfs records, {} files in main manifest, {:.1}KB",
+
new_subfs_uris.len(), current_file_count, manifest_size as f64 / 1024.0);
+
} else {
+
println!("Manifest created ({} files, {:.1}KB) - no splitting needed",
+
total_files, manifest_size as f64 / 1024.0);
+
}
-
// Create the Fs record
+
// Create the final Fs record
let fs_record = Fs::new()
.site(CowStr::from(site_name.clone()))
-
.root(root_dir)
-
.file_count(file_count as i64)
+
.root(working_directory)
+
.file_count(current_file_count as i64)
.created_at(Datetime::now())
.build();
···
.and_then(|s| s.split('/').next())
.ok_or_else(|| miette::miette!("Failed to parse DID from URI"))?;
-
println!("Deployed site '{}': {}", site_name, output.uri);
-
println!("Available at: https://sites.wisp.place/{}/{}", did, site_name);
+
println!("\nโœ“ Deployed site '{}': {}", site_name, output.uri);
+
println!(" Total files: {} ({} reused, {} uploaded)", total_files, reused_count, uploaded_count);
+
println!(" Available at: https://sites.wisp.place/{}/{}", did, site_name);
+
+
// Clean up old subfs records
+
if !old_subfs_uris.is_empty() {
+
println!("\nCleaning up {} old subfs records...", old_subfs_uris.len());
+
+
let mut deleted_count = 0;
+
let mut failed_count = 0;
+
+
for (uri, _path) in old_subfs_uris {
+
match subfs_utils::delete_subfs_record(agent, &uri).await {
+
Ok(_) => {
+
deleted_count += 1;
+
println!(" ๐Ÿ—‘๏ธ Deleted old subfs: {}", uri);
+
}
+
Err(e) => {
+
failed_count += 1;
+
eprintln!(" โš ๏ธ Failed to delete {}: {}", uri, e);
+
}
+
}
+
}
+
+
if failed_count > 0 {
+
eprintln!("โš ๏ธ Cleanup completed with {} deleted, {} failed", deleted_count, failed_count);
+
} else {
+
println!("โœ… Cleanup complete: {} old subfs records deleted", deleted_count);
+
}
+
}
Ok(())
}
/// Recursively build a Directory from a filesystem path
+
/// current_path is the path from the root of the site (e.g., "" for root, "config" for config dir)
fn build_directory<'a>(
agent: &'a Agent<impl jacquard::client::AgentSession + IdentityResolver + 'a>,
dir_path: &'a Path,
-
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<Directory<'static>>> + 'a>>
+
existing_blobs: &'a HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
current_path: String,
+
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<(Directory<'static>, usize, usize)>> + 'a>>
{
Box::pin(async move {
// Collect all directory entries first
···
.ok_or_else(|| miette::miette!("Invalid filename: {:?}", name))?
.to_string();
-
// Skip hidden files
-
if name_str.starts_with('.') {
+
// Skip unwanted files and directories
+
+
// .git directory (version control - thousands of files)
+
if name_str == ".git" {
+
continue;
+
}
+
+
// .DS_Store (macOS metadata - can leak info)
+
if name_str == ".DS_Store" {
+
continue;
+
}
+
+
// .env files (environment variables with secrets)
+
if name_str.starts_with(".env") {
+
continue;
+
}
+
+
// node_modules (dependency folder - can be 100,000+ files)
+
if name_str == "node_modules" {
+
continue;
+
}
+
+
// OS metadata files
+
if name_str == "Thumbs.db" || name_str == "desktop.ini" || name_str.starts_with("._") {
+
continue;
+
}
+
+
// macOS system directories
+
if name_str == ".Spotlight-V100" || name_str == ".Trashes" || name_str == ".fseventsd" {
+
continue;
+
}
+
+
// Cache and temp directories
+
if name_str == ".cache" || name_str == ".temp" || name_str == ".tmp" {
+
continue;
+
}
+
+
// Python cache
+
if name_str == "__pycache__" || name_str.ends_with(".pyc") {
+
continue;
+
}
+
+
// Python virtual environments
+
if name_str == ".venv" || name_str == "venv" || name_str == "env" {
+
continue;
+
}
+
+
// Editor swap files
+
if name_str.ends_with(".swp") || name_str.ends_with(".swo") || name_str.ends_with("~") {
continue;
}
let metadata = entry.metadata().into_diagnostic()?;
if metadata.is_file() {
-
file_tasks.push((name_str, path));
+
// Construct full path for this file (for blob map lookup)
+
let full_path = if current_path.is_empty() {
+
name_str.clone()
+
} else {
+
format!("{}/{}", current_path, name_str)
+
};
+
file_tasks.push((name_str, path, full_path));
} else if metadata.is_dir() {
dir_tasks.push((name_str, path));
}
}
// Process files concurrently with a limit of 5
-
let file_entries: Vec<Entry> = stream::iter(file_tasks)
-
.map(|(name, path)| async move {
-
let file_node = process_file(agent, &path).await?;
-
Ok::<_, miette::Report>(Entry::new()
+
let file_results: Vec<(Entry<'static>, bool)> = stream::iter(file_tasks)
+
.map(|(name, path, full_path)| async move {
+
let (file_node, reused) = process_file(agent, &path, &full_path, existing_blobs).await?;
+
let entry = Entry::new()
.name(CowStr::from(name))
.node(EntryNode::File(Box::new(file_node)))
-
.build())
+
.build();
+
Ok::<_, miette::Report>((entry, reused))
})
.buffer_unordered(5)
.collect::<Vec<_>>()
.await
.into_iter()
.collect::<miette::Result<Vec<_>>>()?;
+
+
let mut file_entries = Vec::new();
+
let mut reused_count = 0;
+
let mut total_files = 0;
+
+
for (entry, reused) in file_results {
+
file_entries.push(entry);
+
total_files += 1;
+
if reused {
+
reused_count += 1;
+
}
+
}
// Process directories recursively (sequentially to avoid too much nesting)
let mut dir_entries = Vec::new();
for (name, path) in dir_tasks {
-
let subdir = build_directory(agent, &path).await?;
+
// Construct full path for subdirectory
+
let subdir_path = if current_path.is_empty() {
+
name.clone()
+
} else {
+
format!("{}/{}", current_path, name)
+
};
+
let (subdir, sub_total, sub_reused) = build_directory(agent, &path, existing_blobs, subdir_path).await?;
dir_entries.push(Entry::new()
.name(CowStr::from(name))
.node(EntryNode::Directory(Box::new(subdir)))
.build());
+
total_files += sub_total;
+
reused_count += sub_reused;
}
// Combine file and directory entries
let mut entries = file_entries;
entries.extend(dir_entries);
-
Ok(Directory::new()
+
let directory = Directory::new()
.r#type(CowStr::from("directory"))
.entries(entries)
-
.build())
+
.build();
+
+
Ok((directory, total_files, reused_count))
})
}
-
/// Process a single file: gzip -> base64 -> upload blob
+
/// Process a single file: gzip -> base64 -> upload blob (or reuse existing)
+
/// Returns (File, reused: bool)
+
/// file_path_key is the full path from the site root (e.g., "config/file.json") for blob map lookup
+
///
+
/// Special handling: _redirects files are NOT compressed (uploaded as-is)
async fn process_file(
agent: &Agent<impl jacquard::client::AgentSession + IdentityResolver>,
file_path: &Path,
-
) -> miette::Result<File<'static>>
+
file_path_key: &str,
+
existing_blobs: &HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
) -> miette::Result<(File<'static>, bool)>
{
// Read file
let file_data = std::fs::read(file_path).into_diagnostic()?;
···
.first_or_octet_stream()
.to_string();
-
// Gzip compress
-
let mut encoder = GzEncoder::new(Vec::new(), Compression::default());
-
encoder.write_all(&file_data).into_diagnostic()?;
-
let gzipped = encoder.finish().into_diagnostic()?;
+
// Check if this is a _redirects file (don't compress it)
+
let is_redirects_file = file_path.file_name()
+
.and_then(|n| n.to_str())
+
.map(|n| n == "_redirects")
+
.unwrap_or(false);
-
// Base64 encode the gzipped data
-
let base64_bytes = base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes();
+
let (upload_bytes, encoding, is_base64) = if is_redirects_file {
+
// Don't compress _redirects - upload as-is
+
(file_data.clone(), None, false)
+
} else {
+
// Gzip compress
+
let mut encoder = GzEncoder::new(Vec::new(), Compression::default());
+
encoder.write_all(&file_data).into_diagnostic()?;
+
let gzipped = encoder.finish().into_diagnostic()?;
-
// Upload blob as octet-stream
-
let blob = agent.upload_blob(
-
base64_bytes,
-
MimeType::new_static("application/octet-stream"),
-
).await?;
+
// Base64 encode the gzipped data
+
let base64_bytes = base64::prelude::BASE64_STANDARD.encode(&gzipped).into_bytes();
+
(base64_bytes, Some("gzip"), true)
+
};
+
+
// Compute CID for this file
+
let file_cid = cid::compute_cid(&upload_bytes);
+
+
// Check if we have an existing blob with the same CID
+
let existing_blob = existing_blobs.get(file_path_key);
-
Ok(File::new()
+
if let Some((existing_blob_ref, existing_cid)) = existing_blob {
+
if existing_cid == &file_cid {
+
// CIDs match - reuse existing blob
+
println!(" โœ“ Reusing blob for {} (CID: {})", file_path_key, file_cid);
+
let mut file_builder = File::new()
+
.r#type(CowStr::from("file"))
+
.blob(existing_blob_ref.clone())
+
.mime_type(CowStr::from(original_mime));
+
+
if let Some(enc) = encoding {
+
file_builder = file_builder.encoding(CowStr::from(enc));
+
}
+
if is_base64 {
+
file_builder = file_builder.base64(true);
+
}
+
+
return Ok((file_builder.build(), true));
+
}
+
}
+
+
// File is new or changed - upload it
+
let mime_type = if is_redirects_file {
+
MimeType::new_static("text/plain")
+
} else {
+
MimeType::new_static("application/octet-stream")
+
};
+
+
println!(" โ†‘ Uploading {} ({} bytes, CID: {})", file_path_key, upload_bytes.len(), file_cid);
+
let blob = agent.upload_blob(upload_bytes, mime_type).await?;
+
+
let mut file_builder = File::new()
.r#type(CowStr::from("file"))
.blob(blob)
-
.encoding(CowStr::from("gzip"))
-
.mime_type(CowStr::from(original_mime))
-
.base64(true)
-
.build())
-
}
+
.mime_type(CowStr::from(original_mime));
-
/// Count total files in a directory tree
-
fn count_files(dir: &Directory) -> usize {
-
let mut count = 0;
-
for entry in &dir.entries {
-
match &entry.node {
-
EntryNode::File(_) => count += 1,
-
EntryNode::Directory(subdir) => count += count_files(subdir),
-
_ => {} // Unknown variants
-
}
+
if let Some(enc) = encoding {
+
file_builder = file_builder.encoding(CowStr::from(enc));
}
-
count
+
if is_base64 {
+
file_builder = file_builder.base64(true);
+
}
+
+
Ok((file_builder.build(), false))
}
+
+
/// Convert fs::Directory to subfs::Directory
+
/// The two share the same structure but are distinct generated types
+
fn convert_fs_dir_to_subfs_dir(fs_dir: place_wisp::fs::Directory<'static>) -> place_wisp::subfs::Directory<'static> {
+
use place_wisp::subfs::{Directory as SubfsDirectory, Entry as SubfsEntry, EntryNode as SubfsEntryNode, File as SubfsFile};
+
+
let subfs_entries: Vec<SubfsEntry> = fs_dir.entries.into_iter().map(|entry| {
+
let node = match entry.node {
+
place_wisp::fs::EntryNode::File(file) => {
+
SubfsEntryNode::File(Box::new(SubfsFile::new()
+
.r#type(file.r#type)
+
.blob(file.blob)
+
.encoding(file.encoding)
+
.mime_type(file.mime_type)
+
.base64(file.base64)
+
.build()))
+
}
+
place_wisp::fs::EntryNode::Directory(dir) => {
+
SubfsEntryNode::Directory(Box::new(convert_fs_dir_to_subfs_dir(*dir)))
+
}
+
place_wisp::fs::EntryNode::Subfs(subfs) => {
+
// Nested subfs in the directory we're converting
+
// Note: subfs::Subfs doesn't have the 'flat' field - that's only in fs::Subfs
+
SubfsEntryNode::Subfs(Box::new(place_wisp::subfs::Subfs::new()
+
.r#type(subfs.r#type)
+
.subject(subfs.subject)
+
.build()))
+
}
+
place_wisp::fs::EntryNode::Unknown(unknown) => {
+
SubfsEntryNode::Unknown(unknown)
+
}
+
};
+
+
SubfsEntry::new()
+
.name(entry.name)
+
.node(node)
+
.build()
+
}).collect();
+
+
SubfsDirectory::new()
+
.r#type(fs_dir.r#type)
+
.entries(subfs_entries)
+
.build()
+
}
+
+46
cli/src/metadata.rs
···
+
use serde::{Deserialize, Serialize};
+
use std::collections::HashMap;
+
use std::path::Path;
+
use miette::IntoDiagnostic;
+
+
/// Metadata tracking file CIDs for incremental updates
+
#[derive(Debug, Clone, Serialize, Deserialize)]
+
pub struct SiteMetadata {
+
/// Record CID from the PDS
+
pub record_cid: String,
+
/// Map of file paths to their blob CIDs
+
pub file_cids: HashMap<String, String>,
+
/// Timestamp when the site was last synced
+
pub last_sync: i64,
+
}
+
+
impl SiteMetadata {
+
pub fn new(record_cid: String, file_cids: HashMap<String, String>) -> Self {
+
Self {
+
record_cid,
+
file_cids,
+
last_sync: chrono::Utc::now().timestamp(),
+
}
+
}
+
+
/// Load metadata from a directory
+
pub fn load(dir: &Path) -> miette::Result<Option<Self>> {
+
let metadata_path = dir.join(".wisp-metadata.json");
+
if !metadata_path.exists() {
+
return Ok(None);
+
}
+
+
let contents = std::fs::read_to_string(&metadata_path).into_diagnostic()?;
+
let metadata: SiteMetadata = serde_json::from_str(&contents).into_diagnostic()?;
+
Ok(Some(metadata))
+
}
+
+
/// Save metadata to a directory
+
pub fn save(&self, dir: &Path) -> miette::Result<()> {
+
let metadata_path = dir.join(".wisp-metadata.json");
+
let contents = serde_json::to_string_pretty(self).into_diagnostic()?;
+
std::fs::write(&metadata_path, contents).into_diagnostic()?;
+
Ok(())
+
}
+
}
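Given the serde derives above (no rename attribute, so field names serialize as-is), a `.wisp-metadata.json` file would look like the following; the CID values are placeholders:

```json
{
  "record_cid": "bafyreiexample",
  "file_cids": {
    "index.html": "bafkreiexample1",
    "assets/app.js": "bafkreiexample2"
  },
  "last_sync": 1700000000
}
```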
+
+261 -1
cli/src/place_wisp/fs.rs
···
description: None,
refs: vec![
::jacquard_common::CowStr::new_static("#file"),
-
::jacquard_common::CowStr::new_static("#directory")
+
::jacquard_common::CowStr::new_static("#directory"),
+
::jacquard_common::CowStr::new_static("#subfs")
],
closed: None,
}),
···
}),
}),
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("subfs"),
+
::jacquard_lexicon::lexicon::LexUserType::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_common::smol_str::SmolStr::new_static("subject")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("flat"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Boolean(::jacquard_lexicon::lexicon::LexBoolean {
+
description: None,
+
default: None,
+
r#const: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("subject"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: Some(
+
::jacquard_common::CowStr::new_static(
+
"AT-URI pointing to a place.wisp.subfs record containing this subtree.",
+
),
+
),
+
format: Some(
+
::jacquard_lexicon::lexicon::LexStringFormat::AtUri,
+
),
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map
+
},
+
}),
+
);
map
},
}
···
File(Box<crate::place_wisp::fs::File<'a>>),
#[serde(rename = "place.wisp.fs#directory")]
Directory(Box<crate::place_wisp::fs::Directory<'a>>),
+
#[serde(rename = "place.wisp.fs#subfs")]
+
Subfs(Box<crate::place_wisp::fs::Subfs<'a>>),
}
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for Entry<'a> {
···
});
+
Ok(())
+
}
+
}
+
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct Subfs<'a> {
+
/// If true, the subfs record's root entries are merged (flattened) into the parent directory, replacing the subfs entry. If false (default), the subfs entries are placed in a subdirectory with the subfs entry's name. Flat merging is useful for splitting large directories across multiple records while maintaining a flat structure.
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
pub flat: Option<bool>,
+
/// AT-URI pointing to a place.wisp.subfs record containing this subtree.
+
#[serde(borrow)]
+
pub subject: jacquard_common::types::string::AtUri<'a>,
+
#[serde(borrow)]
+
pub r#type: jacquard_common::CowStr<'a>,
+
}
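The `flat` doc comment above describes two expansion modes for a subfs entry. A minimal sketch of those merge semantics, using a simplified tree shape (this is illustrative code, not the hosting service's actual expansion logic; the names `Node` and `expand` are made up here):

```rust
// Entries are (name, node) pairs, standing in for place.wisp.fs entries.
#[derive(Debug, Clone, PartialEq)]
enum Node {
    File(&'static str),
    Dir(Vec<(String, Node)>),
}

// Expand a subfs entry named `name` whose resolved record yields `children`:
// flat = true  -> children are merged into the parent, replacing the entry;
// flat = false -> children go into a subdirectory carrying the entry's name.
fn expand(
    parent: &mut Vec<(String, Node)>,
    name: &str,
    children: Vec<(String, Node)>,
    flat: bool,
) {
    parent.retain(|(n, _)| n != name); // drop the subfs placeholder entry
    if flat {
        parent.extend(children);
    } else {
        parent.push((name.to_string(), Node::Dir(children)));
    }
}
```

Flat merging keeps a large directory's entries at one level while splitting them across multiple records; nested expansion preserves the entry name as a real subdirectory.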
+
+
pub mod subfs_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Type;
+
type Subject;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Type = Unset;
+
type Subject = Unset;
+
}
+
///State transition - sets the `type` field to Set
+
pub struct SetType<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetType<S> {}
+
impl<S: State> State for SetType<S> {
+
type Type = Set<members::r#type>;
+
type Subject = S::Subject;
+
}
+
///State transition - sets the `subject` field to Set
+
pub struct SetSubject<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetSubject<S> {}
+
impl<S: State> State for SetSubject<S> {
+
type Type = S::Type;
+
type Subject = Set<members::subject>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `type` field
+
pub struct r#type(());
+
///Marker type for the `subject` field
+
pub struct subject(());
+
}
+
}
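The state module above implements a typestate builder: zero-sized marker types track which required fields have been set, so `build` is only callable once both are. A self-contained sketch of that idea with two required fields (names here are illustrative, not the generated API, which additionally seals its `State` trait):

```rust
use std::marker::PhantomData;

struct Unset;
struct Set;

// The state parameters record, at the type level, which fields are filled.
struct Builder<Ty, Subj> {
    ty: Option<String>,
    subject: Option<String>,
    _state: PhantomData<(Ty, Subj)>,
}

impl Builder<Unset, Unset> {
    fn new() -> Self {
        Builder { ty: None, subject: None, _state: PhantomData }
    }
}

impl<Subj> Builder<Unset, Subj> {
    // Setting `ty` flips its state from Unset to Set; calling it twice
    // is a compile error because this impl requires Ty = Unset.
    fn ty(self, v: &str) -> Builder<Set, Subj> {
        Builder { ty: Some(v.to_string()), subject: self.subject, _state: PhantomData }
    }
}

impl<Ty> Builder<Ty, Unset> {
    fn subject(self, v: &str) -> Builder<Ty, Set> {
        Builder { ty: self.ty, subject: Some(v.to_string()), _state: PhantomData }
    }
}

impl Builder<Set, Set> {
    // Only exists when both fields are set, so the unwraps cannot panic.
    fn build(self) -> (String, String) {
        (self.ty.unwrap(), self.subject.unwrap())
    }
}
```

Fields can be set in either order, exactly as the generated `SetType<S>` / `SetSubject<S>` transitions compose in either nesting.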
+
+
/// Builder for constructing an instance of this type
+
pub struct SubfsBuilder<'a, S: subfs_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<bool>,
+
::core::option::Option<jacquard_common::types::string::AtUri<'a>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> Subfs<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> SubfsBuilder<'a, subfs_state::Empty> {
+
SubfsBuilder::new()
+
}
+
}
+
+
impl<'a> SubfsBuilder<'a, subfs_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S: subfs_state::State> SubfsBuilder<'a, S> {
+
/// Set the `flat` field (optional)
+
pub fn flat(mut self, value: impl Into<Option<bool>>) -> Self {
+
self.__unsafe_private_named.0 = value.into();
+
self
+
}
+
/// Set the `flat` field to an Option value (optional)
+
pub fn maybe_flat(mut self, value: Option<bool>) -> Self {
+
self.__unsafe_private_named.0 = value;
+
self
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Subject: subfs_state::IsUnset,
+
{
+
/// Set the `subject` field (required)
+
pub fn subject(
+
mut self,
+
value: impl Into<jacquard_common::types::string::AtUri<'a>>,
+
) -> SubfsBuilder<'a, subfs_state::SetSubject<S>> {
+
self.__unsafe_private_named.1 = ::core::option::Option::Some(value.into());
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Type: subfs_state::IsUnset,
+
{
+
/// Set the `type` field (required)
+
pub fn r#type(
+
mut self,
+
value: impl Into<jacquard_common::CowStr<'a>>,
+
) -> SubfsBuilder<'a, subfs_state::SetType<S>> {
+
self.__unsafe_private_named.2 = ::core::option::Option::Some(value.into());
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Type: subfs_state::IsSet,
+
S::Subject: subfs_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> Subfs<'a> {
+
Subfs {
+
flat: self.__unsafe_private_named.0,
+
subject: self.__unsafe_private_named.1.unwrap(),
+
r#type: self.__unsafe_private_named.2.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> Subfs<'a> {
+
Subfs {
+
flat: self.__unsafe_private_named.0,
+
subject: self.__unsafe_private_named.1.unwrap(),
+
r#type: self.__unsafe_private_named.2.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for Subfs<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.fs"
+
}
+
fn def_name() -> &'static str {
+
"subfs"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_fs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
Ok(())
+1408
cli/src/place_wisp/subfs.rs
···
+
// @generated by jacquard-lexicon. DO NOT EDIT.
+
//
+
// Lexicon: place.wisp.subfs
+
//
+
// This file was automatically generated from Lexicon schemas.
+
// Any manual changes will be overwritten on the next regeneration.
+
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct Directory<'a> {
+
#[serde(borrow)]
+
pub entries: Vec<crate::place_wisp::subfs::Entry<'a>>,
+
#[serde(borrow)]
+
pub r#type: jacquard_common::CowStr<'a>,
+
}
+
+
pub mod directory_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Type;
+
type Entries;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Type = Unset;
+
type Entries = Unset;
+
}
+
///State transition - sets the `type` field to Set
+
pub struct SetType<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetType<S> {}
+
impl<S: State> State for SetType<S> {
+
type Type = Set<members::r#type>;
+
type Entries = S::Entries;
+
}
+
///State transition - sets the `entries` field to Set
+
pub struct SetEntries<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetEntries<S> {}
+
impl<S: State> State for SetEntries<S> {
+
type Type = S::Type;
+
type Entries = Set<members::entries>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `type` field
+
pub struct r#type(());
+
///Marker type for the `entries` field
+
pub struct entries(());
+
}
+
}
+
+
/// Builder for constructing an instance of this type
+
pub struct DirectoryBuilder<'a, S: directory_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<Vec<crate::place_wisp::subfs::Entry<'a>>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> Directory<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> DirectoryBuilder<'a, directory_state::Empty> {
+
DirectoryBuilder::new()
+
}
+
}
+
+
impl<'a> DirectoryBuilder<'a, directory_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
DirectoryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> DirectoryBuilder<'a, S>
+
where
+
S: directory_state::State,
+
S::Entries: directory_state::IsUnset,
+
{
+
/// Set the `entries` field (required)
+
pub fn entries(
+
mut self,
+
value: impl Into<Vec<crate::place_wisp::subfs::Entry<'a>>>,
+
) -> DirectoryBuilder<'a, directory_state::SetEntries<S>> {
+
self.__unsafe_private_named.0 = ::core::option::Option::Some(value.into());
+
DirectoryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> DirectoryBuilder<'a, S>
+
where
+
S: directory_state::State,
+
S::Type: directory_state::IsUnset,
+
{
+
/// Set the `type` field (required)
+
pub fn r#type(
+
mut self,
+
value: impl Into<jacquard_common::CowStr<'a>>,
+
) -> DirectoryBuilder<'a, directory_state::SetType<S>> {
+
self.__unsafe_private_named.1 = ::core::option::Option::Some(value.into());
+
DirectoryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> DirectoryBuilder<'a, S>
+
where
+
S: directory_state::State,
+
S::Type: directory_state::IsSet,
+
S::Entries: directory_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> Directory<'a> {
+
Directory {
+
entries: self.__unsafe_private_named.0.unwrap(),
+
r#type: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> Directory<'a> {
+
Directory {
+
entries: self.__unsafe_private_named.0.unwrap(),
+
r#type: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
fn lexicon_doc_place_wisp_subfs() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
::jacquard_lexicon::lexicon::LexiconDoc {
+
lexicon: ::jacquard_lexicon::lexicon::Lexicon::Lexicon1,
+
id: ::jacquard_common::CowStr::new_static("place.wisp.subfs"),
+
revision: None,
+
description: None,
+
defs: {
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("directory"),
+
::jacquard_lexicon::lexicon::LexUserType::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_common::smol_str::SmolStr::new_static("entries")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("entries"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Array(::jacquard_lexicon::lexicon::LexArray {
+
description: None,
+
items: ::jacquard_lexicon::lexicon::LexArrayItem::Ref(::jacquard_lexicon::lexicon::LexRef {
+
description: None,
+
r#ref: ::jacquard_common::CowStr::new_static("#entry"),
+
}),
+
min_length: None,
+
max_length: Some(500usize),
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map
+
},
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("entry"),
+
::jacquard_lexicon::lexicon::LexUserType::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("name"),
+
::jacquard_common::smol_str::SmolStr::new_static("node")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("name"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: Some(255usize),
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("node"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Union(::jacquard_lexicon::lexicon::LexRefUnion {
+
description: None,
+
refs: vec![
+
::jacquard_common::CowStr::new_static("#file"),
+
::jacquard_common::CowStr::new_static("#directory"),
+
::jacquard_common::CowStr::new_static("#subfs")
+
],
+
closed: None,
+
}),
+
);
+
map
+
},
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("file"),
+
::jacquard_lexicon::lexicon::LexUserType::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_common::smol_str::SmolStr::new_static("blob")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("base64"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Boolean(::jacquard_lexicon::lexicon::LexBoolean {
+
description: None,
+
default: None,
+
r#const: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("blob"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Blob(::jacquard_lexicon::lexicon::LexBlob {
+
description: None,
+
accept: None,
+
max_size: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("encoding"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: Some(
+
::jacquard_common::CowStr::new_static(
+
"Content encoding (e.g., gzip for compressed files)",
+
),
+
),
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("mimeType"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: Some(
+
::jacquard_common::CowStr::new_static(
+
"Original MIME type before compression",
+
),
+
),
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map
+
},
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("main"),
+
::jacquard_lexicon::lexicon::LexUserType::Record(::jacquard_lexicon::lexicon::LexRecord {
+
description: Some(
+
::jacquard_common::CowStr::new_static(
+
"Virtual filesystem subtree referenced by place.wisp.fs records. When a subfs entry is expanded, its root entries are merged (flattened) into the parent directory, allowing large directories to be split across multiple records while maintaining a flat structure.",
+
),
+
),
+
key: None,
+
record: ::jacquard_lexicon::lexicon::LexRecordRecord::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("root"),
+
::jacquard_common::smol_str::SmolStr::new_static("createdAt")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static(
+
"createdAt",
+
),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: Some(
+
::jacquard_lexicon::lexicon::LexStringFormat::Datetime,
+
),
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static(
+
"fileCount",
+
),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Integer(::jacquard_lexicon::lexicon::LexInteger {
+
description: None,
+
default: None,
+
minimum: Some(0i64),
+
maximum: Some(1000i64),
+
r#enum: None,
+
r#const: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("root"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::Ref(::jacquard_lexicon::lexicon::LexRef {
+
description: None,
+
r#ref: ::jacquard_common::CowStr::new_static("#directory"),
+
}),
+
);
+
map
+
},
+
}),
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("subfs"),
+
::jacquard_lexicon::lexicon::LexUserType::Object(::jacquard_lexicon::lexicon::LexObject {
+
description: None,
+
required: Some(
+
vec![
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_common::smol_str::SmolStr::new_static("subject")
+
],
+
),
+
nullable: None,
+
properties: {
+
#[allow(unused_mut)]
+
let mut map = ::std::collections::BTreeMap::new();
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("subject"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: Some(
+
::jacquard_common::CowStr::new_static(
+
"AT-URI pointing to another place.wisp.subfs record for nested subtrees. When expanded, the referenced record's root entries are merged (flattened) into the parent directory, allowing recursive splitting of large directory structures.",
+
),
+
),
+
format: Some(
+
::jacquard_lexicon::lexicon::LexStringFormat::AtUri,
+
),
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map.insert(
+
::jacquard_common::smol_str::SmolStr::new_static("type"),
+
::jacquard_lexicon::lexicon::LexObjectProperty::String(::jacquard_lexicon::lexicon::LexString {
+
description: None,
+
format: None,
+
default: None,
+
min_length: None,
+
max_length: None,
+
min_graphemes: None,
+
max_graphemes: None,
+
r#enum: None,
+
r#const: None,
+
known_values: None,
+
}),
+
);
+
map
+
},
+
}),
+
);
+
map
+
},
+
}
+
}
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for Directory<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.subfs"
+
}
+
fn def_name() -> &'static str {
+
"directory"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_subfs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
+
{
+
let value = &self.entries;
+
#[allow(unused_comparisons)]
+
if value.len() > 500usize {
+
return Err(::jacquard_lexicon::validation::ConstraintError::MaxLength {
+
path: ::jacquard_lexicon::validation::ValidationPath::from_field(
+
"entries",
+
),
+
max: 500usize,
+
actual: value.len(),
+
});
+
}
+
}
+
Ok(())
+
}
+
}
+
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct Entry<'a> {
+
#[serde(borrow)]
+
pub name: jacquard_common::CowStr<'a>,
+
#[serde(borrow)]
+
pub node: EntryNode<'a>,
+
}
+
+
pub mod entry_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Name;
+
type Node;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Name = Unset;
+
type Node = Unset;
+
}
+
///State transition - sets the `name` field to Set
+
pub struct SetName<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetName<S> {}
+
impl<S: State> State for SetName<S> {
+
type Name = Set<members::name>;
+
type Node = S::Node;
+
}
+
///State transition - sets the `node` field to Set
+
pub struct SetNode<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetNode<S> {}
+
impl<S: State> State for SetNode<S> {
+
type Name = S::Name;
+
type Node = Set<members::node>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `name` field
+
pub struct name(());
+
///Marker type for the `node` field
+
pub struct node(());
+
}
+
}
+
+
/// Builder for constructing an instance of this type
+
pub struct EntryBuilder<'a, S: entry_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
::core::option::Option<EntryNode<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> Entry<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> EntryBuilder<'a, entry_state::Empty> {
+
EntryBuilder::new()
+
}
+
}
+
+
impl<'a> EntryBuilder<'a, entry_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
EntryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> EntryBuilder<'a, S>
+
where
+
S: entry_state::State,
+
S::Name: entry_state::IsUnset,
+
{
+
/// Set the `name` field (required)
+
pub fn name(
+
mut self,
+
value: impl Into<jacquard_common::CowStr<'a>>,
+
) -> EntryBuilder<'a, entry_state::SetName<S>> {
+
self.__unsafe_private_named.0 = ::core::option::Option::Some(value.into());
+
EntryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> EntryBuilder<'a, S>
+
where
+
S: entry_state::State,
+
S::Node: entry_state::IsUnset,
+
{
+
/// Set the `node` field (required)
+
pub fn node(
+
mut self,
+
value: impl Into<EntryNode<'a>>,
+
) -> EntryBuilder<'a, entry_state::SetNode<S>> {
+
self.__unsafe_private_named.1 = ::core::option::Option::Some(value.into());
+
EntryBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> EntryBuilder<'a, S>
+
where
+
S: entry_state::State,
+
S::Name: entry_state::IsSet,
+
S::Node: entry_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> Entry<'a> {
+
Entry {
+
name: self.__unsafe_private_named.0.unwrap(),
+
node: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> Entry<'a> {
+
Entry {
+
name: self.__unsafe_private_named.0.unwrap(),
+
node: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
#[jacquard_derive::open_union]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(tag = "$type")]
+
#[serde(bound(deserialize = "'de: 'a"))]
+
pub enum EntryNode<'a> {
+
#[serde(rename = "place.wisp.subfs#file")]
+
File(Box<crate::place_wisp::subfs::File<'a>>),
+
#[serde(rename = "place.wisp.subfs#directory")]
+
Directory(Box<crate::place_wisp::subfs::Directory<'a>>),
+
#[serde(rename = "place.wisp.subfs#subfs")]
+
Subfs(Box<crate::place_wisp::subfs::Subfs<'a>>),
+
}
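The `#[serde(tag = "$type")]` enum above is an open union: the record's `$type` string selects the variant, and (via the `open_union` derive) unrecognized tags are preserved rather than rejected. A hedged, serde-free sketch of that dispatch (the `EntryNodeKind` type here is invented for illustration):

```rust
#[derive(Debug, PartialEq)]
enum EntryNodeKind {
    File,
    Directory,
    Subfs,
    // Unknown tags are kept, so future lexicon additions round-trip.
    Unknown(String),
}

fn classify(type_tag: &str) -> EntryNodeKind {
    match type_tag {
        "place.wisp.subfs#file" => EntryNodeKind::File,
        "place.wisp.subfs#directory" => EntryNodeKind::Directory,
        "place.wisp.subfs#subfs" => EntryNodeKind::Subfs,
        other => EntryNodeKind::Unknown(other.to_string()),
    }
}
```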
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for Entry<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.subfs"
+
}
+
fn def_name() -> &'static str {
+
"entry"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_subfs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
+
{
+
let value = &self.name;
+
#[allow(unused_comparisons)]
+
if <str>::len(value.as_ref()) > 255usize {
+
return Err(::jacquard_lexicon::validation::ConstraintError::MaxLength {
+
path: ::jacquard_lexicon::validation::ValidationPath::from_field(
+
"name",
+
),
+
max: 255usize,
+
actual: <str>::len(value.as_ref()),
+
});
+
}
+
}
+
Ok(())
+
}
+
}
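Note that the generated `maxLength` check above measures `<str>::len`, i.e. UTF-8 byte length, not grapheme count (the lexicon's `maxGraphemes` would generate a different check). A standalone sketch of the same constraint with a simplified error type (`MaxLength` and `validate_name` are illustrative names, not the jacquard API):

```rust
#[derive(Debug, PartialEq)]
struct MaxLength {
    field: &'static str,
    max: usize,
    actual: usize,
}

// Mirrors the generated check: reject names longer than 255 bytes.
fn validate_name(name: &str) -> Result<(), MaxLength> {
    if name.len() > 255 {
        return Err(MaxLength { field: "name", max: 255, actual: name.len() });
    }
    Ok(())
}
```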
+
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct File<'a> {
+
/// True if blob content is base64-encoded (used to bypass PDS content sniffing)
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
pub base64: Option<bool>,
+
/// Content blob ref
+
#[serde(borrow)]
+
pub blob: jacquard_common::types::blob::BlobRef<'a>,
+
/// Content encoding (e.g., gzip for compressed files)
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
#[serde(borrow)]
+
pub encoding: Option<jacquard_common::CowStr<'a>>,
+
/// Original MIME type before compression
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
#[serde(borrow)]
+
pub mime_type: Option<jacquard_common::CowStr<'a>>,
+
#[serde(borrow)]
+
pub r#type: jacquard_common::CowStr<'a>,
+
}
+
+
pub mod file_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Type;
+
type Blob;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Type = Unset;
+
type Blob = Unset;
+
}
+
///State transition - sets the `type` field to Set
+
pub struct SetType<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetType<S> {}
+
impl<S: State> State for SetType<S> {
+
type Type = Set<members::r#type>;
+
type Blob = S::Blob;
+
}
+
///State transition - sets the `blob` field to Set
+
pub struct SetBlob<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetBlob<S> {}
+
impl<S: State> State for SetBlob<S> {
+
type Type = S::Type;
+
type Blob = Set<members::blob>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `type` field
+
pub struct r#type(());
+
///Marker type for the `blob` field
+
pub struct blob(());
+
}
+
}
+
+
/// Builder for constructing an instance of this type
+
pub struct FileBuilder<'a, S: file_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<bool>,
+
::core::option::Option<jacquard_common::types::blob::BlobRef<'a>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> File<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> FileBuilder<'a, file_state::Empty> {
+
FileBuilder::new()
+
}
+
}
+
+
impl<'a> FileBuilder<'a, file_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
FileBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None, None, None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S: file_state::State> FileBuilder<'a, S> {
+
/// Set the `base64` field (optional)
+
pub fn base64(mut self, value: impl Into<Option<bool>>) -> Self {
+
self.__unsafe_private_named.0 = value.into();
+
self
+
}
+
/// Set the `base64` field to an Option value (optional)
+
pub fn maybe_base64(mut self, value: Option<bool>) -> Self {
+
self.__unsafe_private_named.0 = value;
+
self
+
}
+
}
+
+
impl<'a, S> FileBuilder<'a, S>
+
where
+
S: file_state::State,
+
S::Blob: file_state::IsUnset,
+
{
+
/// Set the `blob` field (required)
+
pub fn blob(
+
mut self,
+
value: impl Into<jacquard_common::types::blob::BlobRef<'a>>,
+
) -> FileBuilder<'a, file_state::SetBlob<S>> {
+
self.__unsafe_private_named.1 = ::core::option::Option::Some(value.into());
+
FileBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S: file_state::State> FileBuilder<'a, S> {
+
/// Set the `encoding` field (optional)
+
pub fn encoding(
+
mut self,
+
value: impl Into<Option<jacquard_common::CowStr<'a>>>,
+
) -> Self {
+
self.__unsafe_private_named.2 = value.into();
+
self
+
}
+
/// Set the `encoding` field to an Option value (optional)
+
pub fn maybe_encoding(mut self, value: Option<jacquard_common::CowStr<'a>>) -> Self {
+
self.__unsafe_private_named.2 = value;
+
self
+
}
+
}
+
+
impl<'a, S: file_state::State> FileBuilder<'a, S> {
+
/// Set the `mimeType` field (optional)
+
pub fn mime_type(
+
mut self,
+
value: impl Into<Option<jacquard_common::CowStr<'a>>>,
+
) -> Self {
+
self.__unsafe_private_named.3 = value.into();
+
self
+
}
+
/// Set the `mimeType` field to an Option value (optional)
+
pub fn maybe_mime_type(
+
mut self,
+
value: Option<jacquard_common::CowStr<'a>>,
+
) -> Self {
+
self.__unsafe_private_named.3 = value;
+
self
+
}
+
}
+
+
impl<'a, S> FileBuilder<'a, S>
+
where
+
S: file_state::State,
+
S::Type: file_state::IsUnset,
+
{
+
/// Set the `type` field (required)
+
pub fn r#type(
+
mut self,
+
value: impl Into<jacquard_common::CowStr<'a>>,
+
) -> FileBuilder<'a, file_state::SetType<S>> {
+
self.__unsafe_private_named.4 = ::core::option::Option::Some(value.into());
+
FileBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> FileBuilder<'a, S>
+
where
+
S: file_state::State,
+
S::Type: file_state::IsSet,
+
S::Blob: file_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> File<'a> {
+
File {
+
base64: self.__unsafe_private_named.0,
+
blob: self.__unsafe_private_named.1.unwrap(),
+
encoding: self.__unsafe_private_named.2,
+
mime_type: self.__unsafe_private_named.3,
+
r#type: self.__unsafe_private_named.4.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> File<'a> {
+
File {
+
base64: self.__unsafe_private_named.0,
+
blob: self.__unsafe_private_named.1.unwrap(),
+
encoding: self.__unsafe_private_named.2,
+
mime_type: self.__unsafe_private_named.3,
+
r#type: self.__unsafe_private_named.4.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for File<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.subfs"
+
}
+
fn def_name() -> &'static str {
+
"file"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_subfs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
+
Ok(())
+
}
+
}
+
+
/// Virtual filesystem subtree referenced by place.wisp.fs records. When a subfs entry is expanded, its root entries are merged (flattened) into the parent directory, allowing large directories to be split across multiple records while maintaining a flat structure.
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct SubfsRecord<'a> {
+
pub created_at: jacquard_common::types::string::Datetime,
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
pub file_count: Option<i64>,
+
#[serde(borrow)]
+
pub root: crate::place_wisp::subfs::Directory<'a>,
+
}
+
+
pub mod subfs_record_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Root;
+
type CreatedAt;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Root = Unset;
+
type CreatedAt = Unset;
+
}
+
///State transition - sets the `root` field to Set
+
pub struct SetRoot<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetRoot<S> {}
+
impl<S: State> State for SetRoot<S> {
+
type Root = Set<members::root>;
+
type CreatedAt = S::CreatedAt;
+
}
+
///State transition - sets the `created_at` field to Set
+
pub struct SetCreatedAt<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetCreatedAt<S> {}
+
impl<S: State> State for SetCreatedAt<S> {
+
type Root = S::Root;
+
type CreatedAt = Set<members::created_at>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `root` field
+
pub struct root(());
+
///Marker type for the `created_at` field
+
pub struct created_at(());
+
}
+
}
+
+
/// Builder for constructing an instance of this type
+
pub struct SubfsRecordBuilder<'a, S: subfs_record_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<jacquard_common::types::string::Datetime>,
+
::core::option::Option<i64>,
+
::core::option::Option<crate::place_wisp::subfs::Directory<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> SubfsRecord<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> SubfsRecordBuilder<'a, subfs_record_state::Empty> {
+
SubfsRecordBuilder::new()
+
}
+
}
+
+
impl<'a> SubfsRecordBuilder<'a, subfs_record_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
SubfsRecordBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsRecordBuilder<'a, S>
+
where
+
S: subfs_record_state::State,
+
S::CreatedAt: subfs_record_state::IsUnset,
+
{
+
/// Set the `createdAt` field (required)
+
pub fn created_at(
+
mut self,
+
value: impl Into<jacquard_common::types::string::Datetime>,
+
) -> SubfsRecordBuilder<'a, subfs_record_state::SetCreatedAt<S>> {
+
self.__unsafe_private_named.0 = ::core::option::Option::Some(value.into());
+
SubfsRecordBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S: subfs_record_state::State> SubfsRecordBuilder<'a, S> {
+
/// Set the `fileCount` field (optional)
+
pub fn file_count(mut self, value: impl Into<Option<i64>>) -> Self {
+
self.__unsafe_private_named.1 = value.into();
+
self
+
}
+
/// Set the `fileCount` field to an Option value (optional)
+
pub fn maybe_file_count(mut self, value: Option<i64>) -> Self {
+
self.__unsafe_private_named.1 = value;
+
self
+
}
+
}
+
+
impl<'a, S> SubfsRecordBuilder<'a, S>
+
where
+
S: subfs_record_state::State,
+
S::Root: subfs_record_state::IsUnset,
+
{
+
/// Set the `root` field (required)
+
pub fn root(
+
mut self,
+
value: impl Into<crate::place_wisp::subfs::Directory<'a>>,
+
) -> SubfsRecordBuilder<'a, subfs_record_state::SetRoot<S>> {
+
self.__unsafe_private_named.2 = ::core::option::Option::Some(value.into());
+
SubfsRecordBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsRecordBuilder<'a, S>
+
where
+
S: subfs_record_state::State,
+
S::Root: subfs_record_state::IsSet,
+
S::CreatedAt: subfs_record_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> SubfsRecord<'a> {
+
SubfsRecord {
+
created_at: self.__unsafe_private_named.0.unwrap(),
+
file_count: self.__unsafe_private_named.1,
+
root: self.__unsafe_private_named.2.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> SubfsRecord<'a> {
+
SubfsRecord {
+
created_at: self.__unsafe_private_named.0.unwrap(),
+
file_count: self.__unsafe_private_named.1,
+
root: self.__unsafe_private_named.2.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
impl<'a> SubfsRecord<'a> {
+
pub fn uri(
+
uri: impl Into<jacquard_common::CowStr<'a>>,
+
) -> Result<
+
jacquard_common::types::uri::RecordUri<'a, SubfsRecordRecord>,
+
jacquard_common::types::uri::UriError,
+
> {
+
jacquard_common::types::uri::RecordUri::try_from_uri(
+
jacquard_common::types::string::AtUri::new_cow(uri.into())?,
+
)
+
}
+
}
+
+
/// Typed wrapper for GetRecord response with this collection's record type.
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct SubfsRecordGetRecordOutput<'a> {
+
#[serde(skip_serializing_if = "std::option::Option::is_none")]
+
#[serde(borrow)]
+
pub cid: std::option::Option<jacquard_common::types::string::Cid<'a>>,
+
#[serde(borrow)]
+
pub uri: jacquard_common::types::string::AtUri<'a>,
+
#[serde(borrow)]
+
pub value: SubfsRecord<'a>,
+
}
+
+
impl From<SubfsRecordGetRecordOutput<'_>> for SubfsRecord<'_> {
+
fn from(output: SubfsRecordGetRecordOutput<'_>) -> Self {
+
use jacquard_common::IntoStatic;
+
output.value.into_static()
+
}
+
}
+
+
impl jacquard_common::types::collection::Collection for SubfsRecord<'_> {
+
const NSID: &'static str = "place.wisp.subfs";
+
type Record = SubfsRecordRecord;
+
}
+
+
/// Marker type for deserializing records from this collection.
+
#[derive(Debug, serde::Serialize, serde::Deserialize)]
+
pub struct SubfsRecordRecord;
+
impl jacquard_common::xrpc::XrpcResp for SubfsRecordRecord {
+
const NSID: &'static str = "place.wisp.subfs";
+
const ENCODING: &'static str = "application/json";
+
type Output<'de> = SubfsRecordGetRecordOutput<'de>;
+
type Err<'de> = jacquard_common::types::collection::RecordError<'de>;
+
}
+
+
impl jacquard_common::types::collection::Collection for SubfsRecordRecord {
+
const NSID: &'static str = "place.wisp.subfs";
+
type Record = SubfsRecordRecord;
+
}
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for SubfsRecord<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.subfs"
+
}
+
fn def_name() -> &'static str {
+
"main"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_subfs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
+
if let Some(ref value) = self.file_count {
+
if *value > 1000i64 {
+
return Err(::jacquard_lexicon::validation::ConstraintError::Maximum {
+
path: ::jacquard_lexicon::validation::ValidationPath::from_field(
+
"file_count",
+
),
+
max: 1000i64,
+
actual: *value,
+
});
+
}
+
}
+
if let Some(ref value) = self.file_count {
+
if *value < 0i64 {
+
return Err(::jacquard_lexicon::validation::ConstraintError::Minimum {
+
path: ::jacquard_lexicon::validation::ValidationPath::from_field(
+
"file_count",
+
),
+
min: 0i64,
+
actual: *value,
+
});
+
}
+
}
+
Ok(())
+
}
+
}
+
+
#[jacquard_derive::lexicon]
+
#[derive(
+
serde::Serialize,
+
serde::Deserialize,
+
Debug,
+
Clone,
+
PartialEq,
+
Eq,
+
jacquard_derive::IntoStatic
+
)]
+
#[serde(rename_all = "camelCase")]
+
pub struct Subfs<'a> {
+
/// AT-URI pointing to another place.wisp.subfs record for nested subtrees. When expanded, the referenced record's root entries are merged (flattened) into the parent directory, allowing recursive splitting of large directory structures.
+
#[serde(borrow)]
+
pub subject: jacquard_common::types::string::AtUri<'a>,
+
#[serde(borrow)]
+
pub r#type: jacquard_common::CowStr<'a>,
+
}
+
+
pub mod subfs_state {
+
+
pub use crate::builder_types::{Set, Unset, IsSet, IsUnset};
+
#[allow(unused)]
+
use ::core::marker::PhantomData;
+
mod sealed {
+
pub trait Sealed {}
+
}
+
/// State trait tracking which required fields have been set
+
pub trait State: sealed::Sealed {
+
type Type;
+
type Subject;
+
}
+
/// Empty state - all required fields are unset
+
pub struct Empty(());
+
impl sealed::Sealed for Empty {}
+
impl State for Empty {
+
type Type = Unset;
+
type Subject = Unset;
+
}
+
///State transition - sets the `type` field to Set
+
pub struct SetType<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetType<S> {}
+
impl<S: State> State for SetType<S> {
+
type Type = Set<members::r#type>;
+
type Subject = S::Subject;
+
}
+
///State transition - sets the `subject` field to Set
+
pub struct SetSubject<S: State = Empty>(PhantomData<fn() -> S>);
+
impl<S: State> sealed::Sealed for SetSubject<S> {}
+
impl<S: State> State for SetSubject<S> {
+
type Type = S::Type;
+
type Subject = Set<members::subject>;
+
}
+
/// Marker types for field names
+
#[allow(non_camel_case_types)]
+
pub mod members {
+
///Marker type for the `type` field
+
pub struct r#type(());
+
///Marker type for the `subject` field
+
pub struct subject(());
+
}
+
}
+
+
/// Builder for constructing an instance of this type
+
pub struct SubfsBuilder<'a, S: subfs_state::State> {
+
_phantom_state: ::core::marker::PhantomData<fn() -> S>,
+
__unsafe_private_named: (
+
::core::option::Option<jacquard_common::types::string::AtUri<'a>>,
+
::core::option::Option<jacquard_common::CowStr<'a>>,
+
),
+
_phantom: ::core::marker::PhantomData<&'a ()>,
+
}
+
+
impl<'a> Subfs<'a> {
+
/// Create a new builder for this type
+
pub fn new() -> SubfsBuilder<'a, subfs_state::Empty> {
+
SubfsBuilder::new()
+
}
+
}
+
+
impl<'a> SubfsBuilder<'a, subfs_state::Empty> {
+
/// Create a new builder with all fields unset
+
pub fn new() -> Self {
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: (None, None),
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Subject: subfs_state::IsUnset,
+
{
+
/// Set the `subject` field (required)
+
pub fn subject(
+
mut self,
+
value: impl Into<jacquard_common::types::string::AtUri<'a>>,
+
) -> SubfsBuilder<'a, subfs_state::SetSubject<S>> {
+
self.__unsafe_private_named.0 = ::core::option::Option::Some(value.into());
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Type: subfs_state::IsUnset,
+
{
+
/// Set the `type` field (required)
+
pub fn r#type(
+
mut self,
+
value: impl Into<jacquard_common::CowStr<'a>>,
+
) -> SubfsBuilder<'a, subfs_state::SetType<S>> {
+
self.__unsafe_private_named.1 = ::core::option::Option::Some(value.into());
+
SubfsBuilder {
+
_phantom_state: ::core::marker::PhantomData,
+
__unsafe_private_named: self.__unsafe_private_named,
+
_phantom: ::core::marker::PhantomData,
+
}
+
}
+
}
+
+
impl<'a, S> SubfsBuilder<'a, S>
+
where
+
S: subfs_state::State,
+
S::Type: subfs_state::IsSet,
+
S::Subject: subfs_state::IsSet,
+
{
+
/// Build the final struct
+
pub fn build(self) -> Subfs<'a> {
+
Subfs {
+
subject: self.__unsafe_private_named.0.unwrap(),
+
r#type: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Default::default(),
+
}
+
}
+
/// Build the final struct with custom extra_data
+
pub fn build_with_data(
+
self,
+
extra_data: std::collections::BTreeMap<
+
jacquard_common::smol_str::SmolStr,
+
jacquard_common::types::value::Data<'a>,
+
>,
+
) -> Subfs<'a> {
+
Subfs {
+
subject: self.__unsafe_private_named.0.unwrap(),
+
r#type: self.__unsafe_private_named.1.unwrap(),
+
extra_data: Some(extra_data),
+
}
+
}
+
}
+
+
impl<'a> ::jacquard_lexicon::schema::LexiconSchema for Subfs<'a> {
+
fn nsid() -> &'static str {
+
"place.wisp.subfs"
+
}
+
fn def_name() -> &'static str {
+
"subfs"
+
}
+
fn lexicon_doc() -> ::jacquard_lexicon::lexicon::LexiconDoc<'static> {
+
lexicon_doc_place_wisp_subfs()
+
}
+
fn validate(
+
&self,
+
) -> ::std::result::Result<(), ::jacquard_lexicon::validation::ConstraintError> {
+
Ok(())
+
}
+
}
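The flat-merge semantics described in the `SubfsRecord` doc comment above can be sketched independently of the generated types. This is a minimal illustrative model, not the crate's API — the `Node` enum and `flatten` function here are hypothetical stand-ins for the generated `EntryNode` variants:

```rust
// Hypothetical model of subfs flat-merge: a Subfs node's root entries
// are hoisted into the parent directory, replacing the Subfs node itself.
#[derive(Debug, Clone, PartialEq)]
enum Node {
    File(&'static str),  // leaf file with a name
    Dir(Vec<Node>),      // nested directory
    Subfs(Vec<Node>),    // stand-in for a fetched subfs record's root entries
}

fn flatten(entries: Vec<Node>) -> Vec<Node> {
    entries
        .into_iter()
        .flat_map(|n| match n {
            // Flat merge: the subfs root's entries join the parent directly
            Node::Subfs(root) => flatten(root),
            // Ordinary directories are recursed into but kept as-is
            Node::Dir(children) => vec![Node::Dir(flatten(children))],
            file => vec![file],
        })
        .collect()
}

fn main() {
    let tree = vec![
        Node::File("index.html"),
        Node::Subfs(vec![Node::File("a.css"), Node::File("b.js")]),
    ];
    let flat = flatten(tree);
    assert_eq!(flat.len(), 3); // subfs entries merged into the parent
    println!("{}", flat.len());
}
```

This mirrors how large directories can be split across multiple records while readers still see a single flat structure.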
+2 -1
cli/src/place_wisp.rs
···
// This file was automatically generated from Lexicon schemas.
// Any manual changes will be overwritten on the next regeneration.
-
pub mod fs;
+
pub mod fs;
+
pub mod subfs;
+683
cli/src/pull.rs
···
+
use crate::blob_map;
+
use crate::download;
+
use crate::metadata::SiteMetadata;
+
use crate::place_wisp::fs::*;
+
use crate::subfs_utils;
+
use jacquard::CowStr;
+
use jacquard::prelude::IdentityResolver;
+
use jacquard_common::types::string::Did;
+
use jacquard_common::xrpc::XrpcExt;
+
use jacquard_identity::PublicResolver;
+
use miette::IntoDiagnostic;
+
use std::collections::HashMap;
+
use std::path::{Path, PathBuf};
+
use url::Url;
+
+
/// Pull a site from the PDS to a local directory
+
pub async fn pull_site(
+
input: CowStr<'static>,
+
rkey: CowStr<'static>,
+
output_dir: PathBuf,
+
) -> miette::Result<()> {
+
println!("Pulling site {} from {}...", rkey, input);
+
+
// Resolve handle to DID if needed
+
let resolver = PublicResolver::default();
+
let did = if input.starts_with("did:") {
+
Did::new(&input).into_diagnostic()?
+
} else {
+
// It's a handle, resolve it
+
let handle = jacquard_common::types::string::Handle::new(&input).into_diagnostic()?;
+
resolver.resolve_handle(&handle).await.into_diagnostic()?
+
};
+
+
// Resolve PDS endpoint for the DID
+
let pds_url = resolver.pds_for_did(&did).await.into_diagnostic()?;
+
println!("Resolved PDS: {}", pds_url);
+
+
// Fetch the place.wisp.fs record
+
+
println!("Fetching record from PDS...");
+
let client = reqwest::Client::new();
+
+
// Use com.atproto.repo.getRecord
+
use jacquard::api::com_atproto::repo::get_record::GetRecord;
+
use jacquard_common::types::string::Rkey as RkeyType;
+
let rkey_parsed = RkeyType::new(&rkey).into_diagnostic()?;
+
+
use jacquard_common::types::ident::AtIdentifier;
+
use jacquard_common::types::string::RecordKey;
+
let request = GetRecord::new()
+
.repo(AtIdentifier::Did(did.clone()))
+
.collection(CowStr::from("place.wisp.fs"))
+
.rkey(RecordKey::from(rkey_parsed))
+
.build();
+
+
let response = client
+
.xrpc(pds_url.clone())
+
.send(&request)
+
.await
+
.into_diagnostic()?;
+
+
let record_output = response.into_output().into_diagnostic()?;
+
let record_cid = record_output.cid.as_ref().map(|c| c.to_string()).unwrap_or_default();
+
+
// Parse the record value as Fs
+
use jacquard_common::types::value::from_data;
+
let fs_record: Fs = from_data(&record_output.value).into_diagnostic()?;
+
+
let file_count = fs_record.file_count.map(|c| c.to_string()).unwrap_or_else(|| "?".to_string());
+
println!("Found site '{}' with {} files (in main record)", fs_record.site, file_count);
+
+
// Check for and expand subfs nodes
+
let expanded_root = expand_subfs_in_pull(&fs_record.root, &pds_url, did.as_str()).await?;
+
let total_file_count = subfs_utils::count_files_in_directory(&expanded_root);
+
+
if total_file_count as i64 != fs_record.file_count.unwrap_or(0) {
+
println!("Total files after expanding subfs: {}", total_file_count);
+
}
+
+
// Load existing metadata for incremental updates
+
let existing_metadata = SiteMetadata::load(&output_dir)?;
+
let existing_file_cids = existing_metadata
+
.as_ref()
+
.map(|m| m.file_cids.clone())
+
.unwrap_or_default();
+
+
// Extract blob map from the expanded manifest
+
let new_blob_map = blob_map::extract_blob_map(&expanded_root);
+
let new_file_cids: HashMap<String, String> = new_blob_map
+
.iter()
+
.map(|(path, (_blob_ref, cid))| (path.clone(), cid.clone()))
+
.collect();
+
+
// Clean up any leftover temp directories from previous failed attempts
+
let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new("."));
+
let output_name = output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new("site")).to_string_lossy();
+
let temp_prefix = format!(".tmp-{}-", output_name);
+
+
if let Ok(entries) = parent.read_dir() {
+
for entry in entries.flatten() {
+
let name = entry.file_name();
+
if name.to_string_lossy().starts_with(&temp_prefix) {
+
let _ = std::fs::remove_dir_all(entry.path());
+
}
+
}
+
}
+
+
// Check if we need to update (verify files actually exist, not just metadata)
+
if let Some(metadata) = &existing_metadata {
+
if metadata.record_cid == record_cid {
+
// Verify that the output directory actually exists and has the expected files
+
let has_all_files = output_dir.exists() && {
+
// Count actual files on disk (excluding metadata)
+
let mut actual_file_count = 0;
+
if let Ok(entries) = std::fs::read_dir(&output_dir) {
+
for entry in entries.flatten() {
+
let name = entry.file_name();
+
if !name.to_string_lossy().starts_with(".wisp-metadata") {
+
if entry.path().is_file() {
+
actual_file_count += 1;
+
}
+
}
+
}
+
}
+
+
// Compare with expected file count from metadata
+
let expected_count = metadata.file_cids.len();
+
actual_file_count > 0 && actual_file_count >= expected_count
+
};
+
+
if has_all_files {
+
println!("Site is already up to date!");
+
return Ok(());
+
} else {
+
println!("Site metadata exists but files are missing, re-downloading...");
+
}
+
}
+
}
+
+
// Create temporary directory for atomic update
+
// Place temp dir in parent directory to avoid issues with non-existent output_dir
+
let parent = output_dir.parent().unwrap_or_else(|| std::path::Path::new("."));
+
let temp_dir_name = format!(
+
".tmp-{}-{}",
+
output_dir.file_name().unwrap_or_else(|| std::ffi::OsStr::new("site")).to_string_lossy(),
+
chrono::Utc::now().timestamp()
+
);
+
let temp_dir = parent.join(temp_dir_name);
+
std::fs::create_dir_all(&temp_dir).into_diagnostic()?;
+
+
println!("Downloading files...");
+
let mut downloaded = 0;
+
let mut reused = 0;
+
+
// Download files recursively (using expanded root)
+
let download_result = download_directory(
+
&expanded_root,
+
&temp_dir,
+
&pds_url,
+
did.as_str(),
+
&new_blob_map,
+
&existing_file_cids,
+
&output_dir,
+
String::new(),
+
&mut downloaded,
+
&mut reused,
+
)
+
.await;
+
+
// If download failed, clean up temp directory
+
if let Err(e) = download_result {
+
let _ = std::fs::remove_dir_all(&temp_dir);
+
return Err(e);
+
}
+
+
println!(
+
"Downloaded {} files, reused {} files",
+
downloaded, reused
+
);
+
+
// Save metadata
+
let metadata = SiteMetadata::new(record_cid, new_file_cids);
+
metadata.save(&temp_dir)?;
+
+
// Move files from temp to output directory
+
let output_abs = std::fs::canonicalize(&output_dir).unwrap_or_else(|_| output_dir.clone());
+
let current_dir = std::env::current_dir().into_diagnostic()?;
+
+
// Special handling for pulling to current directory
+
if output_abs == current_dir {
+
// Move files from temp to current directory
+
for entry in std::fs::read_dir(&temp_dir).into_diagnostic()? {
+
let entry = entry.into_diagnostic()?;
+
let dest = current_dir.join(entry.file_name());
+
+
// Remove existing file/dir if it exists
+
if dest.exists() {
+
if dest.is_dir() {
+
std::fs::remove_dir_all(&dest).into_diagnostic()?;
+
} else {
+
std::fs::remove_file(&dest).into_diagnostic()?;
+
}
+
}
+
+
// Move from temp to current dir
+
std::fs::rename(entry.path(), dest).into_diagnostic()?;
+
}
+
+
// Clean up temp directory
+
std::fs::remove_dir_all(&temp_dir).into_diagnostic()?;
+
} else {
+
// If output directory exists and has content, remove it first
+
if output_dir.exists() {
+
std::fs::remove_dir_all(&output_dir).into_diagnostic()?;
+
}
+
+
// Ensure parent directory exists
+
if let Some(parent) = output_dir.parent() {
+
if !parent.as_os_str().is_empty() && !parent.exists() {
+
std::fs::create_dir_all(parent).into_diagnostic()?;
+
}
+
}
+
+
// Rename temp to final location
+
match std::fs::rename(&temp_dir, &output_dir) {
+
Ok(_) => {},
+
Err(e) => {
+
// Clean up temp directory on failure
+
let _ = std::fs::remove_dir_all(&temp_dir);
+
return Err(miette::miette!("Failed to move temp directory: {}", e));
+
}
+
}
+
}
+
+
println!("✓ Site pulled successfully to {}", output_dir.display());
+
+
Ok(())
+
}
+
+
/// Recursively download a directory with concurrent downloads
+
fn download_directory<'a>(
+
dir: &'a Directory<'_>,
+
output_dir: &'a Path,
+
pds_url: &'a Url,
+
did: &'a str,
+
new_blob_map: &'a HashMap<String, (jacquard_common::types::blob::BlobRef<'static>, String)>,
+
existing_file_cids: &'a HashMap<String, String>,
+
existing_output_dir: &'a Path,
+
path_prefix: String,
+
downloaded: &'a mut usize,
+
reused: &'a mut usize,
+
) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<()>> + Send + 'a>> {
+
Box::pin(async move {
+
use futures::stream::{self, StreamExt};
+
+
// Collect download tasks and directory tasks separately
+
struct DownloadTask {
+
path: String,
+
output_path: PathBuf,
+
blob: jacquard_common::types::blob::BlobRef<'static>,
+
base64: bool,
+
gzip: bool,
+
}
+
+
struct CopyTask {
+
path: String,
+
from: PathBuf,
+
to: PathBuf,
+
}
+
+
let mut download_tasks = Vec::new();
+
let mut copy_tasks = Vec::new();
+
let mut dir_tasks = Vec::new();
+
+
for entry in &dir.entries {
+
let entry_name = entry.name.as_str();
+
let current_path = if path_prefix.is_empty() {
+
entry_name.to_string()
+
} else {
+
format!("{}/{}", path_prefix, entry_name)
+
};
+
+
match &entry.node {
+
EntryNode::File(file) => {
+
let output_path = output_dir.join(entry_name);
+
+
// Check if file CID matches existing
+
let should_copy = if let Some((_blob_ref, new_cid)) = new_blob_map.get(&current_path) {
+
if let Some(existing_cid) = existing_file_cids.get(&current_path) {
+
if existing_cid == new_cid {
+
let existing_path = existing_output_dir.join(&current_path);
+
if existing_path.exists() {
+
copy_tasks.push(CopyTask {
+
path: current_path.clone(),
+
from: existing_path,
+
to: output_path.clone(),
+
});
+
true
+
} else {
+
false
+
}
+
} else {
+
false
+
}
+
} else {
+
false
+
}
+
} else {
+
false
+
};
+
+
if !should_copy {
+
use jacquard_common::IntoStatic;
+
// File needs to be downloaded
+
download_tasks.push(DownloadTask {
+
path: current_path,
+
output_path,
+
blob: file.blob.clone().into_static(),
+
base64: file.base64.unwrap_or(false),
+
gzip: file.encoding.as_ref().map(|e| e.as_str() == "gzip").unwrap_or(false),
+
});
+
}
+
}
+
EntryNode::Directory(subdir) => {
+
let subdir_path = output_dir.join(entry_name);
+
dir_tasks.push((subdir.as_ref().clone(), subdir_path, current_path));
+
}
+
EntryNode::Subfs(_) => {
+
println!(" ⚠ Skipping subfs node at {} (should have been expanded)", current_path);
+
}
+
EntryNode::Unknown(_) => {
+
println!(" ⚠ Skipping unknown node type for {}", current_path);
+
}
+
}
+
}
+
+
// Execute copy tasks (fast, do them all)
+
for task in copy_tasks {
+
std::fs::copy(&task.from, &task.to).into_diagnostic()?;
+
*reused += 1;
+
println!(" ✓ Reused {}", task.path);
+
}
+
+
// Execute download tasks with concurrency limit (20 concurrent downloads)
+
const DOWNLOAD_CONCURRENCY: usize = 20;
+
+
let pds_url_clone = pds_url.clone();
+
let did_str = did.to_string();
+
+
let download_results: Vec<miette::Result<(String, PathBuf, Vec<u8>)>> = stream::iter(download_tasks)
+
.map(|task| {
+
let pds = pds_url_clone.clone();
+
let did_copy = did_str.clone();
+
+
async move {
+
println!(" ↓ Downloading {}", task.path);
+
let data = download::download_and_decompress_blob(
+
&pds,
+
&task.blob,
+
&did_copy,
+
task.base64,
+
task.gzip,
+
)
+
.await?;
+
+
Ok::<_, miette::Report>((task.path, task.output_path, data))
+
}
+
})
+
.buffer_unordered(DOWNLOAD_CONCURRENCY)
+
.collect()
+
.await;
+
+
// Write downloaded files to disk
+
for result in download_results {
+
let (path, output_path, data) = result?;
+
std::fs::write(&output_path, data).into_diagnostic()?;
+
*downloaded += 1;
+
println!(" ✓ Downloaded {}", path);
+
}
+
+
// Recursively process directories
+
for (subdir, subdir_path, current_path) in dir_tasks {
+
std::fs::create_dir_all(&subdir_path).into_diagnostic()?;
+
+
download_directory(
+
&subdir,
+
&subdir_path,
+
pds_url,
+
did,
+
new_blob_map,
+
existing_file_cids,
+
existing_output_dir,
+
current_path,
+
downloaded,
+
reused,
+
)
+
.await?;
+
}
+
+
Ok(())
+
})
+
}
+
+
/// Recursively expand subfs nodes in a directory tree by fetching and merging subfs records
+
async fn expand_subfs_in_pull<'a>(
+
directory: &Directory<'a>,
+
pds_url: &Url,
+
_did: &str,
+
) -> miette::Result<Directory<'static>> {
+
use crate::place_wisp::subfs::SubfsRecord;
+
use jacquard_common::types::value::from_data;
+
use jacquard_common::IntoStatic;
+
+
// Recursively fetch ALL subfs records (including nested ones)
+
let mut all_subfs_map: HashMap<String, crate::place_wisp::subfs::Directory> = HashMap::new();
+
let mut to_fetch = subfs_utils::extract_subfs_uris(directory, String::new());
+
+
if to_fetch.is_empty() {
+
return Ok((*directory).clone().into_static());
+
}
+
+
println!("Found {} subfs records, fetching recursively...", to_fetch.len());
+
let client = reqwest::Client::new();
+
+
// Keep fetching until we've resolved all subfs (including nested ones)
+
let mut iteration = 0;
+
const MAX_ITERATIONS: usize = 10; // Prevent infinite loops
+
+
while !to_fetch.is_empty() && iteration < MAX_ITERATIONS {
+
iteration += 1;
+
println!(" Iteration {}: fetching {} subfs records...", iteration, to_fetch.len());
+
+
let mut fetch_tasks = Vec::new();
+
+
for (uri, path) in to_fetch.clone() {
+
let client = client.clone();
+
let pds_url = pds_url.clone();
+
+
fetch_tasks.push(async move {
+
let parts: Vec<&str> = uri.trim_start_matches("at://").split('/').collect();
+
if parts.len() < 3 {
+
return Err(miette::miette!("Invalid subfs URI: {}", uri));
+
}
+
+
let _did = parts[0];
+
let collection = parts[1];
+
let rkey = parts[2];
+
+
if collection != "place.wisp.subfs" {
+
return Err(miette::miette!("Expected place.wisp.subfs collection, got: {}", collection));
+
}
+
+
use jacquard::api::com_atproto::repo::get_record::GetRecord;
+
use jacquard_common::types::string::Rkey as RkeyType;
+
use jacquard_common::types::ident::AtIdentifier;
+
use jacquard_common::types::string::{RecordKey, Did as DidType};
+
+
let rkey_parsed = RkeyType::new(rkey).into_diagnostic()?;
+
let did_parsed = DidType::new(_did).into_diagnostic()?;
+
+
let request = GetRecord::new()
+
.repo(AtIdentifier::Did(did_parsed))
+
.collection(CowStr::from("place.wisp.subfs"))
+
.rkey(RecordKey::from(rkey_parsed))
+
.build();
+
+
let response = client
+
.xrpc(pds_url)
+
.send(&request)
+
.await
+
.into_diagnostic()?;
+
+
let record_output = response.into_output().into_diagnostic()?;
+
let subfs_record: SubfsRecord = from_data(&record_output.value).into_diagnostic()?;
+
let subfs_record_static = subfs_record.into_static();
+
+
Ok::<_, miette::Report>((path, subfs_record_static))
+
});
+
}
+
+
let results: Vec<_> = futures::future::join_all(fetch_tasks).await;
+
+
// Process results and find nested subfs
+
let mut newly_fetched = Vec::new();
+
for result in results {
+
match result {
+
Ok((path, record)) => {
+
println!(" ✓ Fetched subfs at {}", path);
+
+
// Check for nested subfs in this record
+
let nested_subfs = extract_subfs_from_subfs_dir(&record.root, path.clone());
+
newly_fetched.extend(nested_subfs);
+
+
all_subfs_map.insert(path, record.root);
+
}
+
Err(e) => {
+
eprintln!(" โš ๏ธ Failed to fetch subfs: {}", e);
+
}
+
}
+
}
+
+
// Update to_fetch with only the NEW subfs we haven't fetched yet
+
to_fetch = newly_fetched
+
.into_iter()
+
.filter(|(_, path)| !all_subfs_map.contains_key(path))
+
.collect();
+
}
+
+
if !to_fetch.is_empty() {
+
return Err(miette::miette!("Max iterations reached while fetching nested subfs"));
+
}
+
+
println!(" Total subfs records fetched: {}", all_subfs_map.len());
+
+
// Now replace all subfs nodes with their content
+
Ok(replace_subfs_with_content(directory.clone(), &all_subfs_map, String::new()))
+
}
+
+
/// Extract subfs URIs from a subfs::Directory
+
fn extract_subfs_from_subfs_dir(
+
directory: &crate::place_wisp::subfs::Directory,
+
current_path: String,
+
) -> Vec<(String, String)> {
+
let mut uris = Vec::new();
+
+
for entry in &directory.entries {
+
let full_path = if current_path.is_empty() {
+
entry.name.to_string()
+
} else {
+
format!("{}/{}", current_path, entry.name)
+
};
+
+
match &entry.node {
+
crate::place_wisp::subfs::EntryNode::Subfs(subfs_node) => {
+
uris.push((subfs_node.subject.to_string(), full_path.clone()));
+
}
+
crate::place_wisp::subfs::EntryNode::Directory(subdir) => {
+
let nested = extract_subfs_from_subfs_dir(subdir, full_path);
+
uris.extend(nested);
+
}
+
_ => {}
+
}
+
}
+
+
uris
+
}
+
+
/// Recursively replace subfs nodes with their actual content
+
fn replace_subfs_with_content(
+
directory: Directory,
+
subfs_map: &HashMap<String, crate::place_wisp::subfs::Directory>,
+
current_path: String,
+
) -> Directory<'static> {
+
use jacquard_common::IntoStatic;
+
+
let new_entries: Vec<Entry<'static>> = directory
+
.entries
+
.into_iter()
+
.flat_map(|entry| {
+
let full_path = if current_path.is_empty() {
+
entry.name.to_string()
+
} else {
+
format!("{}/{}", current_path, entry.name)
+
};
+
+
match entry.node {
+
EntryNode::Subfs(subfs_node) => {
+
// Check if we have this subfs record
+
if let Some(subfs_dir) = subfs_map.get(&full_path) {
+
let flat = subfs_node.flat.unwrap_or(true); // Default to flat merge
+
+
if flat {
+
// Flat merge: hoist subfs entries into parent
+
println!(" Merging subfs {} (flat)", full_path);
+
let converted_entries: Vec<Entry<'static>> = subfs_dir
+
.entries
+
.iter()
+
.map(|subfs_entry| convert_subfs_entry_to_fs(subfs_entry.clone().into_static()))
+
.collect();
+
+
converted_entries
+
} else {
+
// Nested: create a directory with the subfs name
+
println!(" Merging subfs {} (nested)", full_path);
+
let converted_entries: Vec<Entry<'static>> = subfs_dir
+
.entries
+
.iter()
+
.map(|subfs_entry| convert_subfs_entry_to_fs(subfs_entry.clone().into_static()))
+
.collect();
+
+
vec![Entry::new()
+
.name(entry.name.into_static())
+
.node(EntryNode::Directory(Box::new(
+
Directory::new()
+
.r#type(CowStr::from("directory"))
+
.entries(converted_entries)
+
.build()
+
)))
+
.build()]
+
}
+
} else {
+
// Subfs not found, skip with warning
+
eprintln!(" โš ๏ธ Subfs not found: {}", full_path);
+
vec![]
+
}
+
}
+
EntryNode::Directory(dir) => {
+
// Recursively process subdirectories
+
vec![Entry::new()
+
.name(entry.name.into_static())
+
.node(EntryNode::Directory(Box::new(
+
replace_subfs_with_content(*dir, subfs_map, full_path)
+
)))
+
.build()]
+
}
+
EntryNode::File(_) => {
+
vec![entry.into_static()]
+
}
+
EntryNode::Unknown(_) => {
+
vec![entry.into_static()]
+
}
+
}
+
})
+
.collect();
+
+
Directory::new()
+
.r#type(CowStr::from("directory"))
+
.entries(new_entries)
+
.build()
+
}
+
+
/// Convert a subfs entry to a fs entry (they have the same structure but different types)
+
fn convert_subfs_entry_to_fs(subfs_entry: crate::place_wisp::subfs::Entry<'static>) -> Entry<'static> {
+
use jacquard_common::IntoStatic;
+
+
let node = match subfs_entry.node {
+
crate::place_wisp::subfs::EntryNode::File(file) => {
+
EntryNode::File(Box::new(
+
File::new()
+
.r#type(file.r#type.into_static())
+
.blob(file.blob.into_static())
+
.encoding(file.encoding.map(|e| e.into_static()))
+
.mime_type(file.mime_type.map(|m| m.into_static()))
+
.base64(file.base64)
+
.build()
+
))
+
}
+
crate::place_wisp::subfs::EntryNode::Directory(dir) => {
+
let converted_entries: Vec<Entry<'static>> = dir
+
.entries
+
.into_iter()
+
.map(|e| convert_subfs_entry_to_fs(e.into_static()))
+
.collect();
+
+
EntryNode::Directory(Box::new(
+
Directory::new()
+
.r#type(dir.r#type.into_static())
+
.entries(converted_entries)
+
.build()
+
))
+
}
+
crate::place_wisp::subfs::EntryNode::Subfs(_nested_subfs) => {
+
// Nested subfs should have been expanded already - if we get here, it means expansion failed
+
// Treat it like a directory reference that should have been expanded
+
eprintln!(" โš ๏ธ Warning: unexpanded nested subfs at path, treating as empty directory");
+
EntryNode::Directory(Box::new(
+
Directory::new()
+
.r#type(CowStr::from("directory"))
+
.entries(vec![])
+
.build()
+
))
+
}
+
crate::place_wisp::subfs::EntryNode::Unknown(unknown) => {
+
EntryNode::Unknown(unknown)
+
}
+
};
+
+
Entry::new()
+
.name(subfs_entry.name.into_static())
+
.node(node)
+
.build()
+
}
+
+375
cli/src/redirects.rs
···
use regex::Regex;
use std::collections::HashMap;
use std::fs;
use std::path::Path;

/// Maximum number of redirect rules to prevent DoS attacks
const MAX_REDIRECT_RULES: usize = 1000;

#[derive(Debug, Clone)]
pub struct RedirectRule {
    #[allow(dead_code)]
    pub from: String,
    pub to: String,
    pub status: u16,
    #[allow(dead_code)]
    pub force: bool,
    pub from_pattern: Regex,
    pub from_params: Vec<String>,
    pub query_params: Option<HashMap<String, String>>,
}

#[derive(Debug)]
pub struct RedirectMatch {
    pub target_path: String,
    pub status: u16,
    pub force: bool,
}

/// Parse a _redirects file into an array of redirect rules
pub fn parse_redirects_file(content: &str) -> Vec<RedirectRule> {
    let lines = content.lines();
    let mut rules = Vec::new();

    for (line_num, line_raw) in lines.enumerate() {
        if line_raw.trim().is_empty() || line_raw.trim().starts_with('#') {
            continue;
        }

        // Enforce max rules limit
        if rules.len() >= MAX_REDIRECT_RULES {
            eprintln!(
                "Redirect rules limit reached ({}), ignoring remaining rules",
                MAX_REDIRECT_RULES
            );
            break;
        }

        match parse_redirect_line(line_raw.trim()) {
            Ok(Some(rule)) => rules.push(rule),
            Ok(None) => continue,
            Err(e) => {
                eprintln!(
                    "Failed to parse redirect rule on line {}: {} ({})",
                    line_num + 1,
                    line_raw,
                    e
                );
            }
        }
    }

    rules
}

/// Parse a single redirect rule line
/// Format: /from [query_params] /to [status] [conditions]
fn parse_redirect_line(line: &str) -> Result<Option<RedirectRule>, String> {
    let parts: Vec<&str> = line.split_whitespace().collect();

    if parts.len() < 2 {
        return Ok(None);
    }

    let mut idx = 0;
    let from = parts[idx];
    idx += 1;

    let mut status = 301; // Default status
    let mut force = false;
    let mut query_params: HashMap<String, String> = HashMap::new();

    // Parse query parameters that come before the destination path
    while idx < parts.len() {
        let part = parts[idx];

        // If it starts with / or http, it's the destination path
        if part.starts_with('/') || part.starts_with("http://") || part.starts_with("https://") {
            break;
        }

        // If it contains = and comes before the destination, it's a query param
        if part.contains('=') {
            let split_index = part.find('=').unwrap();
            let key = &part[..split_index];
            let value = &part[split_index + 1..];

            if !key.is_empty() && !value.is_empty() {
                query_params.insert(key.to_string(), value.to_string());
            }
            idx += 1;
        } else {
            break;
        }
    }

    // Next part should be the destination
    if idx >= parts.len() {
        return Ok(None);
    }

    let to = parts[idx];
    idx += 1;

    // Parse remaining parts for status code
    for part in parts.iter().skip(idx) {
        // Check for status code (with optional ! for force)
        if let Some(stripped) = part.strip_suffix('!') {
            if let Ok(s) = stripped.parse::<u16>() {
                force = true;
                status = s;
            }
        } else if let Ok(s) = part.parse::<u16>() {
            status = s;
        }
        // Note: We're ignoring conditional redirects (Country, Language, Cookie, Role) for now
        // They can be added later if needed
    }

    // Parse the 'from' pattern
    let (pattern, params) = convert_path_to_regex(from)?;

    Ok(Some(RedirectRule {
        from: from.to_string(),
        to: to.to_string(),
        status,
        force,
        from_pattern: pattern,
        from_params: params,
        query_params: if query_params.is_empty() {
            None
        } else {
            Some(query_params)
        },
    }))
}

/// Convert a path pattern with placeholders and splats to a regex
/// Examples:
/// /blog/:year/:month/:day -> captures year, month, day
/// /news/* -> captures splat
fn convert_path_to_regex(pattern: &str) -> Result<(Regex, Vec<String>), String> {
    let mut params = Vec::new();
    let mut regex_str = String::from("^");

    // Split by query string if present
    let path_part = pattern.split('?').next().unwrap_or(pattern);

    // Escape special regex characters except * and :
    let mut escaped = String::new();
    for ch in path_part.chars() {
        match ch {
            '.' | '+' | '^' | '$' | '{' | '}' | '(' | ')' | '|' | '[' | ']' | '\\' => {
                escaped.push('\\');
                escaped.push(ch);
            }
            _ => escaped.push(ch),
        }
    }

    // Replace :param with named capture groups
    let param_regex = Regex::new(r":([a-zA-Z_][a-zA-Z0-9_]*)").map_err(|e| e.to_string())?;
    let mut last_end = 0;
    let mut result = String::new();

    for cap in param_regex.captures_iter(&escaped) {
        let m = cap.get(0).unwrap();
        result.push_str(&escaped[last_end..m.start()]);
        result.push_str("([^/?]+)");
        params.push(cap[1].to_string());
        last_end = m.end();
    }
    result.push_str(&escaped[last_end..]);
    escaped = result;

    // Replace * with splat capture
    if escaped.contains('*') {
        escaped = escaped.replace('*', "(.*)");
        params.push("splat".to_string());
    }

    regex_str.push_str(&escaped);

    // Make trailing slash optional
    if !regex_str.ends_with(".*") {
        regex_str.push_str("/?");
    }

    regex_str.push('$');

    let pattern = Regex::new(&regex_str).map_err(|e| e.to_string())?;

    Ok((pattern, params))
}

/// Match a request path against redirect rules
pub fn match_redirect_rule(
    request_path: &str,
    rules: &[RedirectRule],
    query_params: Option<&HashMap<String, String>>,
) -> Option<RedirectMatch> {
    // Normalize path: ensure leading slash
    let normalized_path = if request_path.starts_with('/') {
        request_path.to_string()
    } else {
        format!("/{}", request_path)
    };

    for rule in rules {
        // Check query parameter conditions first (if any)
        if let Some(required_params) = &rule.query_params {
            if let Some(actual_params) = query_params {
                let query_matches = required_params.iter().all(|(key, expected_value)| {
                    if let Some(actual_value) = actual_params.get(key) {
                        // If expected value is a placeholder (:name), any value is acceptable
                        if expected_value.starts_with(':') {
                            return true;
                        }
                        // Otherwise it must match exactly
                        actual_value == expected_value
                    } else {
                        false
                    }
                });

                if !query_matches {
                    continue;
                }
            } else {
                // Rule requires query params but none provided
                continue;
            }
        }

        // Match the path pattern
        if let Some(captures) = rule.from_pattern.captures(&normalized_path) {
            let mut target_path = rule.to.clone();

            // Replace captured parameters
            for (i, param_name) in rule.from_params.iter().enumerate() {
                if let Some(param_value) = captures.get(i + 1) {
                    let value = param_value.as_str();

                    if param_name == "splat" {
                        target_path = target_path.replace(":splat", value);
                    } else {
                        target_path = target_path.replace(&format!(":{}", param_name), value);
                    }
                }
            }

            // Handle query parameter replacements
            if let Some(required_params) = &rule.query_params {
                if let Some(actual_params) = query_params {
                    for (key, placeholder) in required_params {
                        if placeholder.starts_with(':') {
                            if let Some(actual_value) = actual_params.get(key) {
                                let param_name = &placeholder[1..];
                                target_path = target_path.replace(
                                    &format!(":{}", param_name),
                                    actual_value,
                                );
                            }
                        }
                    }
                }
            }

            // Preserve query string for 200, 301, 302 redirects (unless target already has one)
            if [200, 301, 302].contains(&rule.status)
                && query_params.is_some()
                && !target_path.contains('?')
            {
                if let Some(params) = query_params {
                    if !params.is_empty() {
                        let query_string: String = params
                            .iter()
                            .map(|(k, v)| format!("{}={}", k, v))
                            .collect::<Vec<_>>()
                            .join("&");
                        target_path = format!("{}?{}", target_path, query_string);
                    }
                }
            }

            return Some(RedirectMatch {
                target_path,
                status: rule.status,
                force: rule.force,
            });
        }
    }

    None
}

/// Load redirect rules from a _redirects file
pub fn load_redirect_rules(directory: &Path) -> Vec<RedirectRule> {
    let redirects_path = directory.join("_redirects");

    if !redirects_path.exists() {
        return Vec::new();
    }

    match fs::read_to_string(&redirects_path) {
        Ok(content) => parse_redirects_file(&content),
        Err(e) => {
            eprintln!("Failed to load _redirects file: {}", e);
            Vec::new()
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_parse_simple_redirect() {
        let content = "/old-path /new-path";
        let rules = parse_redirects_file(content);
        assert_eq!(rules.len(), 1);
        assert_eq!(rules[0].from, "/old-path");
        assert_eq!(rules[0].to, "/new-path");
        assert_eq!(rules[0].status, 301);
        assert!(!rules[0].force);
    }

    #[test]
    fn test_parse_with_status() {
        let content = "/temp /target 302";
        let rules = parse_redirects_file(content);
        assert_eq!(rules[0].status, 302);
    }

    #[test]
    fn test_parse_force_redirect() {
        let content = "/force /target 301!";
        let rules = parse_redirects_file(content);
        assert!(rules[0].force);
    }

    #[test]
    fn test_match_exact_path() {
        let rules = parse_redirects_file("/old-path /new-path");
        let m = match_redirect_rule("/old-path", &rules, None);
        assert!(m.is_some());
        assert_eq!(m.unwrap().target_path, "/new-path");
    }

    #[test]
    fn test_match_splat() {
        let rules = parse_redirects_file("/news/* /blog/:splat");
        let m = match_redirect_rule("/news/2024/01/15/post", &rules, None);
        assert!(m.is_some());
        assert_eq!(m.unwrap().target_path, "/blog/2024/01/15/post");
    }

    #[test]
    fn test_match_placeholders() {
        let rules = parse_redirects_file("/blog/:year/:month/:day /posts/:year-:month-:day");
        let m = match_redirect_rule("/blog/2024/01/15", &rules, None);
        assert!(m.is_some());
        assert_eq!(m.unwrap().target_path, "/posts/2024-01-15");
    }
}
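Based on the parser above, a `_redirects` file this implementation would accept might look like the sketch below. The paths are made up for illustration; note that only whole-line comments are supported (a `#` after a rule is just an ignored token), and at most 1000 rules are read.

```
# Simple redirect (status defaults to 301)
/old-path /new-path

# Explicit status code
/temp /target 302

# SPA-style rewrite: serve /index.html but keep the requested URL
/app/* /index.html 200

# "!" forces the redirect even when the source file exists
/legacy /new 301!

# Placeholders and splats
/blog/:year/:month/:day /posts/:year-:month-:day
/news/* /blog/:splat

# Query-parameter condition before the destination (value fills :id)
/store id=:id /products/:id
```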
+375
cli/src/serve.rs
···
use crate::pull::pull_site;
use crate::redirects::{load_redirect_rules, match_redirect_rule, RedirectRule};
use axum::{
    Router,
    extract::Request,
    response::{Response, IntoResponse, Redirect},
    http::{StatusCode, Uri},
};
use jacquard::CowStr;
use jacquard::api::com_atproto::sync::subscribe_repos::{SubscribeRepos, SubscribeReposMessage};
use jacquard_common::types::string::Did;
use jacquard_common::xrpc::{SubscriptionClient, TungsteniteSubscriptionClient};
use miette::IntoDiagnostic;
use n0_future::StreamExt;
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use tokio::sync::RwLock;
use tower::Service;
use tower_http::compression::CompressionLayer;
use tower_http::services::ServeDir;

/// Shared state for the server
#[derive(Clone)]
struct ServerState {
    did: CowStr<'static>,
    rkey: CowStr<'static>,
    output_dir: PathBuf,
    last_cid: Arc<RwLock<Option<String>>>,
    redirect_rules: Arc<RwLock<Vec<RedirectRule>>>,
}

/// Serve a site locally with real-time firehose updates
pub async fn serve_site(
    input: CowStr<'static>,
    rkey: CowStr<'static>,
    output_dir: PathBuf,
    port: u16,
) -> miette::Result<()> {
    println!("Serving site {} from {} on port {}...", rkey, input, port);

    // Resolve handle to DID if needed
    use jacquard_identity::PublicResolver;
    use jacquard::prelude::IdentityResolver;

    let resolver = PublicResolver::default();
    let did = if input.starts_with("did:") {
        Did::new(&input).into_diagnostic()?
    } else {
        // It's a handle, resolve it
        let handle = jacquard_common::types::string::Handle::new(&input).into_diagnostic()?;
        resolver.resolve_handle(&handle).await.into_diagnostic()?
    };

    println!("Resolved to DID: {}", did.as_str());

    // Create output directory if it doesn't exist
    std::fs::create_dir_all(&output_dir).into_diagnostic()?;

    // Initial pull of the site
    println!("Performing initial pull...");
    let did_str = CowStr::from(did.as_str().to_string());
    pull_site(did_str.clone(), rkey.clone(), output_dir.clone()).await?;

    // Load redirect rules
    let redirect_rules = load_redirect_rules(&output_dir);
    if !redirect_rules.is_empty() {
        println!("Loaded {} redirect rules from _redirects", redirect_rules.len());
    }

    // Create shared state
    let state = ServerState {
        did: did_str.clone(),
        rkey: rkey.clone(),
        output_dir: output_dir.clone(),
        last_cid: Arc::new(RwLock::new(None)),
        redirect_rules: Arc::new(RwLock::new(redirect_rules)),
    };

    // Start firehose listener in background
    let firehose_state = state.clone();
    tokio::spawn(async move {
        if let Err(e) = watch_firehose(firehose_state).await {
            eprintln!("Firehose error: {}", e);
        }
    });

    // Create HTTP server with gzip compression and redirect handling
    let serve_dir = ServeDir::new(&output_dir).precompressed_gzip();

    let app = Router::new()
        .fallback(move |req: Request| {
            let state = state.clone();
            let mut serve_dir = serve_dir.clone();
            async move {
                handle_request_with_redirects(req, state, &mut serve_dir).await
            }
        })
        .layer(CompressionLayer::new());

    let addr = format!("0.0.0.0:{}", port);
    let listener = tokio::net::TcpListener::bind(&addr)
        .await
        .into_diagnostic()?;

    println!("\n✓ Server running at http://localhost:{}", port);
    println!(" Watching for updates on the firehose...\n");

    axum::serve(listener, app).await.into_diagnostic()?;

    Ok(())
}

/// Handle a request with redirect support
async fn handle_request_with_redirects(
    req: Request,
    state: ServerState,
    serve_dir: &mut ServeDir,
) -> Response {
    let uri = req.uri().clone();
    let path = uri.path();
    let method = req.method().clone();

    // Parse query parameters
    let query_params = uri.query().map(|q| {
        let mut params = HashMap::new();
        for pair in q.split('&') {
            if let Some((key, value)) = pair.split_once('=') {
                params.insert(key.to_string(), value.to_string());
            }
        }
        params
    });

    // Check for redirect rules
    let redirect_rules = state.redirect_rules.read().await;
    if let Some(redirect_match) = match_redirect_rule(path, &redirect_rules, query_params.as_ref()) {
        let is_force = redirect_match.force;
        drop(redirect_rules); // Release the lock

        // If not forced, check if the file exists first
        if !is_force {
            // Try to serve the file normally first
            let test_req = Request::builder()
                .uri(uri.clone())
                .method(&method)
                .body(axum::body::Body::empty())
                .unwrap();

            match serve_dir.call(test_req).await {
                Ok(response) if response.status().is_success() => {
                    // File exists and was served successfully, return it
                    return response.into_response();
                }
                _ => {
                    // File doesn't exist or error, apply redirect
                }
            }
        }

        // Handle different status codes
        match redirect_match.status {
            200 => {
                // Rewrite: serve the target file but keep the URL the same
                if let Ok(target_uri) = redirect_match.target_path.parse::<Uri>() {
                    let new_req = Request::builder()
                        .uri(target_uri)
                        .method(&method)
                        .body(axum::body::Body::empty())
                        .unwrap();

                    match serve_dir.call(new_req).await {
                        Ok(response) => response.into_response(),
                        Err(_) => StatusCode::INTERNAL_SERVER_ERROR.into_response(),
                    }
                } else {
                    StatusCode::INTERNAL_SERVER_ERROR.into_response()
                }
            }
            301 => {
                // Permanent redirect
                Redirect::permanent(&redirect_match.target_path).into_response()
            }
            302 => {
                // Temporary redirect
                Redirect::temporary(&redirect_match.target_path).into_response()
            }
            404 => {
                // Custom 404 page
                if let Ok(target_uri) = redirect_match.target_path.parse::<Uri>() {
                    let new_req = Request::builder()
                        .uri(target_uri)
                        .method(&method)
                        .body(axum::body::Body::empty())
                        .unwrap();

                    match serve_dir.call(new_req).await {
                        Ok(mut response) => {
                            *response.status_mut() = StatusCode::NOT_FOUND;
                            response.into_response()
                        }
                        Err(_) => StatusCode::NOT_FOUND.into_response(),
                    }
                } else {
                    StatusCode::NOT_FOUND.into_response()
                }
            }
            _ => {
                // Unsupported status code, fall through to normal serving
                match serve_dir.call(req).await {
                    Ok(response) => response.into_response(),
                    Err(_) => StatusCode::INTERNAL_SERVER_ERROR.into_response(),
                }
            }
        }
    } else {
        drop(redirect_rules);
        // No redirect match, serve normally
        match serve_dir.call(req).await {
            Ok(response) => response.into_response(),
            Err(_) => StatusCode::NOT_FOUND.into_response(),
        }
    }
}

/// Watch the firehose for updates to the specific site
fn watch_firehose(state: ServerState) -> std::pin::Pin<Box<dyn std::future::Future<Output = miette::Result<()>> + Send>> {
    Box::pin(async move {
        use jacquard_identity::PublicResolver;
        use jacquard::prelude::IdentityResolver;

        // Resolve DID to PDS URL
        let resolver = PublicResolver::default();
        let did = Did::new(&state.did).into_diagnostic()?;
        let pds_url = resolver.pds_for_did(&did).await.into_diagnostic()?;

        println!("[PDS] Resolved DID to PDS: {}", pds_url);

        // Convert HTTP(S) URL to WebSocket URL
        let mut ws_url = pds_url.clone();
        let scheme = if pds_url.scheme() == "https" { "wss" } else { "ws" };
        ws_url.set_scheme(scheme)
            .map_err(|_| miette::miette!("Failed to set WebSocket scheme"))?;

        println!("[PDS] Connecting to {}...", ws_url);

        // Create subscription client
        let client = TungsteniteSubscriptionClient::from_base_uri(ws_url);

        // Subscribe to the PDS firehose
        let params = SubscribeRepos::new().build();

        let stream = client.subscribe(&params).await.into_diagnostic()?;
        println!("[PDS] Connected! Watching for updates...");

        // Convert to typed message stream
        let (_sink, mut messages) = stream.into_stream();

        loop {
            match messages.next().await {
                Some(Ok(msg)) => {
                    if let Err(e) = handle_firehose_message(&state, msg).await {
                        eprintln!("[PDS] Error handling message: {}", e);
                    }
                }
                Some(Err(e)) => {
                    eprintln!("[PDS] Stream error: {}", e);
                    // Try to reconnect after a delay
                    tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;
                    return Box::pin(watch_firehose(state)).await;
                }
                None => {
                    println!("[PDS] Stream ended, reconnecting...");
                    tokio::time::sleep(tokio::time::Duration::from_secs(5)).await;
                    return Box::pin(watch_firehose(state)).await;
                }
            }
        }
    })
}

/// Handle a firehose message
async fn handle_firehose_message<'a>(
    state: &ServerState,
    msg: SubscribeReposMessage<'a>,
) -> miette::Result<()> {
    match msg {
        SubscribeReposMessage::Commit(commit_msg) => {
            // Check if this commit is from our DID
            if commit_msg.repo.as_str() != state.did.as_str() {
                return Ok(());
            }

            // Check if any operation affects our site
            let target_path = format!("place.wisp.fs/{}", state.rkey);
            let has_site_update = commit_msg.ops.iter().any(|op| op.path.as_ref() == target_path);

            if has_site_update {
                // Debug: log all operations for this commit
                println!("[Debug] Commit has {} ops for {}", commit_msg.ops.len(), state.rkey);
                for op in &commit_msg.ops {
                    if op.path.as_ref() == target_path {
                        println!("[Debug] - {} {}", op.action.as_ref(), op.path.as_ref());
                    }
                }
            }

            if has_site_update {
                // Use the commit CID as the version tracker
                let commit_cid = commit_msg.commit.to_string();

                // Check if this is a new commit
                let should_update = {
                    let last_cid = state.last_cid.read().await;
                    Some(commit_cid.clone()) != *last_cid
                };

                if should_update {
                    // Check operation types
                    let has_create_or_update = commit_msg.ops.iter().any(|op| {
                        op.path.as_ref() == target_path &&
                        (op.action.as_ref() == "create" || op.action.as_ref() == "update")
                    });
                    let has_delete = commit_msg.ops.iter().any(|op| {
                        op.path.as_ref() == target_path && op.action.as_ref() == "delete"
                    });

                    // If there's a create/update, pull the site (even if there's also a delete in the same commit)
                    if has_create_or_update {
                        println!("\n[Update] Detected change to site {} (commit: {})", state.rkey, commit_cid);
                        println!("[Update] Pulling latest version...");

                        // Pull the updated site
                        match pull_site(
                            state.did.clone(),
                            state.rkey.clone(),
                            state.output_dir.clone(),
                        )
                        .await
                        {
                            Ok(_) => {
                                // Update last CID
                                let mut last_cid = state.last_cid.write().await;
                                *last_cid = Some(commit_cid);

                                // Reload redirect rules
                                let new_redirect_rules = load_redirect_rules(&state.output_dir);
                                let mut redirect_rules = state.redirect_rules.write().await;
                                *redirect_rules = new_redirect_rules;

                                println!("[Update] ✓ Site updated successfully!\n");
                            }
                            Err(e) => {
                                eprintln!("[Update] Failed to pull site: {}", e);
                            }
                        }
                    } else if has_delete {
                        // Only a delete, no create/update
                        println!("\n[Update] Site {} was deleted", state.rkey);

                        // Update last CID so we don't process this commit again
                        let mut last_cid = state.last_cid.write().await;
                        *last_cid = Some(commit_cid);
                    }
                }
            }
        }
        _ => {
            // Ignore identity and account messages
        }
    }

    Ok(())
}
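The query-string handling inside `handle_request_with_redirects` is simple enough to lift out and run in isolation. The standalone sketch below mirrors that loop exactly (same `split('&')` / `split_once('=')` logic, no percent-decoding) and shows one behaviour worth knowing when writing redirect rules: a valueless key like `?flag` is silently dropped.

```rust
use std::collections::HashMap;

// Mirrors the query parsing in handle_request_with_redirects:
// pairs without '=' are skipped, and a repeated key keeps its last value.
fn parse_query(q: &str) -> HashMap<String, String> {
    let mut params = HashMap::new();
    for pair in q.split('&') {
        if let Some((key, value)) = pair.split_once('=') {
            params.insert(key.to_string(), value.to_string());
        }
    }
    params
}

fn main() {
    let params = parse_query("id=42&flag&id=7");
    // "flag" has no '=', so only "id" survives, with the last value winning
    assert_eq!(params.len(), 1);
    assert_eq!(params.get("id").map(String::as_str), Some("7"));
    println!("ok");
}
```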
+336
cli/src/subfs_utils.rs
···
use jacquard_common::types::string::AtUri;
use jacquard_common::types::blob::BlobRef;
use jacquard_common::IntoStatic;
use jacquard::client::{Agent, AgentSession, AgentSessionExt};
use jacquard::prelude::IdentityResolver;
use miette::IntoDiagnostic;
use std::collections::HashMap;

use crate::place_wisp::fs::{Directory as FsDirectory, EntryNode as FsEntryNode};
use crate::place_wisp::subfs::SubfsRecord;

/// Extract all subfs URIs from a directory tree with their mount paths
pub fn extract_subfs_uris(directory: &FsDirectory, current_path: String) -> Vec<(String, String)> {
    let mut uris = Vec::new();

    for entry in &directory.entries {
        let full_path = if current_path.is_empty() {
            entry.name.to_string()
        } else {
            format!("{}/{}", current_path, entry.name)
        };

        match &entry.node {
            FsEntryNode::Subfs(subfs_node) => {
                // Found a subfs node - store its URI and mount path
                uris.push((subfs_node.subject.to_string(), full_path.clone()));
            }
            FsEntryNode::Directory(subdir) => {
                // Recursively search subdirectories
                let sub_uris = extract_subfs_uris(subdir, full_path);
                uris.extend(sub_uris);
            }
            FsEntryNode::File(_) => {
                // Files don't contain subfs
            }
            FsEntryNode::Unknown(_) => {
                // Skip unknown nodes
            }
        }
    }

    uris
}

/// Fetch a subfs record from the PDS
pub async fn fetch_subfs_record(
    agent: &Agent<impl AgentSession + IdentityResolver>,
    uri: &str,
) -> miette::Result<SubfsRecord<'static>> {
    // Parse URI: at://did/collection/rkey
    let parts: Vec<&str> = uri.trim_start_matches("at://").split('/').collect();

    if parts.len() < 3 {
        return Err(miette::miette!("Invalid subfs URI: {}", uri));
    }

    let _did = parts[0];
    let collection = parts[1];
    let _rkey = parts[2];

    if collection != "place.wisp.subfs" {
        return Err(miette::miette!("Expected place.wisp.subfs collection, got: {}", collection));
    }

    // Construct AT-URI for fetching
    let at_uri = AtUri::new(uri).into_diagnostic()?;

    // Fetch the record
    let response = agent.get_record::<SubfsRecord>(&at_uri).await.into_diagnostic()?;
    let record_output = response.into_output().into_diagnostic()?;

    Ok(record_output.value.into_static())
}

/// Merge blob maps from subfs records into the main blob map
/// Returns the total number of blobs merged from all subfs records
pub async fn merge_subfs_blob_maps(
    agent: &Agent<impl AgentSession + IdentityResolver>,
    subfs_uris: Vec<(String, String)>,
    main_blob_map: &mut HashMap<String, (BlobRef<'static>, String)>,
) -> miette::Result<usize> {
    let mut total_merged = 0;

    println!("Fetching {} subfs records for blob reuse...", subfs_uris.len());

    // Fetch all subfs records in parallel (but with some concurrency limit)
    use futures::stream::{self, StreamExt};

    let subfs_results: Vec<_> = stream::iter(subfs_uris)
        .map(|(uri, mount_path)| async move {
            match fetch_subfs_record(agent, &uri).await {
                Ok(record) => Some((record, mount_path)),
                Err(e) => {
                    eprintln!(" ⚠️ Failed to fetch subfs {}: {}", uri, e);
                    None
                }
            }
        })
        .buffer_unordered(5)
        .collect()
        .await;

    // Convert subfs Directory to fs Directory for blob extraction
    // Note: We need to extract blobs from the subfs record's root
    for result in subfs_results {
        if let Some((subfs_record, mount_path)) = result {
            // Extract blobs from this subfs record's root
            // The blob_map module works with fs::Directory, but subfs::Directory has the same structure
            // We need to convert or work directly with the entries

            let subfs_blob_map = extract_subfs_blobs(&subfs_record.root, mount_path.clone());
            let count = subfs_blob_map.len();

            for (path, blob_info) in subfs_blob_map {
                main_blob_map.insert(path, blob_info);
            }

            total_merged += count;
            println!(" ✓ Merged {} blobs from subfs at {}", count, mount_path);
        }
    }

    Ok(total_merged)
}

/// Extract blobs from a subfs directory (works with subfs::Directory)
/// Returns a map of file paths to their blob refs and CIDs
fn extract_subfs_blobs(
    directory: &crate::place_wisp::subfs::Directory,
    current_path: String,
) -> HashMap<String, (BlobRef<'static>, String)> {
    let mut blob_map = HashMap::new();

    for entry in &directory.entries {
        let full_path = if current_path.is_empty() {
            entry.name.to_string()
        } else {
            format!("{}/{}", current_path, entry.name)
        };

        match &entry.node {
            crate::place_wisp::subfs::EntryNode::File(file_node) => {
                let blob_ref = &file_node.blob;
                let cid_string = blob_ref.blob().r#ref.to_string();
                blob_map.insert(
                    full_path,
                    (blob_ref.clone().into_static(), cid_string)
                );
            }
            crate::place_wisp::subfs::EntryNode::Directory(subdir) => {
                let sub_map = extract_subfs_blobs(subdir, full_path);
                blob_map.extend(sub_map);
            }
            crate::place_wisp::subfs::EntryNode::Subfs(_nested_subfs) => {
                // Nested subfs - these should be resolved recursively in the main flow
                // For now, we skip them (they'll be fetched separately)
                eprintln!(" ⚠️ Found nested subfs at {}, skipping (should be fetched separately)", full_path);
            }
            crate::place_wisp::subfs::EntryNode::Unknown(_) => {
                // Skip unknown nodes
            }
        }
    }

    blob_map
}

/// Count total files in a directory tree
pub fn count_files_in_directory(directory: &FsDirectory) -> usize {
    let mut count = 0;

    for entry in &directory.entries {
        match &entry.node {
            FsEntryNode::File(_) => count += 1,
            FsEntryNode::Directory(subdir) => {
                count += count_files_in_directory(subdir);
            }
            FsEntryNode::Subfs(_) => {
                // Subfs nodes don't count towards the main manifest file count
            }
            FsEntryNode::Unknown(_) => {}
        }
    }

    count
}

/// Estimate JSON size of a directory tree
pub fn estimate_directory_size(directory: &FsDirectory) -> usize {
    // Serialize to JSON and measure
    match serde_json::to_string(directory) {
        Ok(json) => json.len(),
        Err(_) => 0,
    }
}

/// Information about a directory that could be split into a subfs record
#[derive(Debug)]
pub struct SplittableDirectory {
    pub path: String,
    pub directory: FsDirectory<'static>,
    pub size: usize,
    pub file_count: usize,
}

/// Find large directories that could be split into subfs records
/// Returns directories sorted by size (largest first)
pub fn find_large_directories(directory: &FsDirectory, current_path: String) -> Vec<SplittableDirectory> {
    let mut result = Vec::new();

    for entry in &directory.entries {
        if let FsEntryNode::Directory(subdir) = &entry.node {
            let dir_path = if current_path.is_empty() {
                entry.name.to_string()
            } else {
                format!("{}/{}", current_path, entry.name)
            };

            let size = estimate_directory_size(subdir);
            let file_count = count_files_in_directory(subdir);

            result.push(SplittableDirectory {
                path: dir_path.clone(),
                directory: (*subdir.clone()).into_static(),
                size,
                file_count,
            });

            // Recursively find subdirectories
            let subdirs = find_large_directories(subdir, dir_path);
            result.extend(subdirs);
        }
    }

    // Sort by size (largest first)
    result.sort_by(|a, b| b.size.cmp(&a.size));

    result
}

/// Replace a directory with a subfs node in the tree
pub fn replace_directory_with_subfs(
    directory: FsDirectory<'static>,
    target_path: &str,
    subfs_uri: &str,
    flat: bool,
) -> miette::Result<FsDirectory<'static>> {
    use jacquard_common::CowStr;
    use crate::place_wisp::fs::{Entry, Subfs};

    let path_parts: Vec<&str> = target_path.split('/').collect();

    if path_parts.is_empty() {
        return Err(miette::miette!("Cannot replace root directory"));
    }

    // Parse the subfs URI and make it owned/'static
    let at_uri = AtUri::new_cow(jacquard_common::CowStr::from(subfs_uri.to_string())).into_diagnostic()?;

    // If this is a root-level directory
    if path_parts.len() == 1 {
        let target_name = path_parts[0];
        let new_entries: Vec<Entry> = directory.entries.into_iter().map(|entry| {
            if entry.name == target_name {
                // Replace this directory with a subfs node
                Entry::new()
                    .name(entry.name)
                    .node(FsEntryNode::Subfs(Box::new(
                        Subfs::new()
                            .r#type(CowStr::from("subfs"))
                            .subject(at_uri.clone())
                            .flat(Some(flat))
                            .build()
                    )))
                    .build()
            } else {
                entry
            }
        }).collect();

        return Ok(FsDirectory::new()
            .r#type(CowStr::from("directory"))
            .entries(new_entries)
            .build());
    }

    // Recursively navigate to parent directory
    let first_part = path_parts[0];
    let remaining_path = path_parts[1..].join("/");

    let new_entries: Vec<Entry> = directory.entries.into_iter().filter_map(|entry| {
        if entry.name == first_part {
            if let FsEntryNode::Directory(subdir) = entry.node {
                // Recursively process this subdirectory
                match replace_directory_with_subfs((*subdir).into_static(), &remaining_path, subfs_uri, flat) {
                    Ok(updated_subdir) => {
                        Some(Entry::new()
                            .name(entry.name)
                            .node(FsEntryNode::Directory(Box::new(updated_subdir)))
                            .build())
                    }
                    Err(_) => None, // Skip entries that fail to update
                }
            } else {
                Some(entry)
            }
        } else {
            Some(entry)
        }
    }).collect();

    Ok(FsDirectory::new()
        .r#type(CowStr::from("directory"))
        .entries(new_entries)
        .build())
}

/// Delete a subfs record from the PDS
pub async fn delete_subfs_record(
    agent: &Agent<impl AgentSession + IdentityResolver>,
    uri: &str,
) -> miette::Result<()> {
    use jacquard_common::types::uri::RecordUri;

    // Construct AT-URI and convert to RecordUri
    let at_uri = AtUri::new(uri).into_diagnostic()?;
    let record_uri: RecordUri<'_, crate::place_wisp::subfs::SubfsRecordRecord> = RecordUri::try_from_uri(at_uri).into_diagnostic()?;

    let rkey = record_uri.rkey()
        .ok_or_else(|| miette::miette!("Invalid subfs URI: missing rkey"))?
        .clone();

    agent.delete_record::<SubfsRecord>(rkey).await.into_diagnostic()?;

    Ok(())
}
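For orientation, the tree these helpers walk can be sketched as JSON. This is an illustrative reconstruction inferred from the builder calls above (the `subject` and `flat` fields come from the `Subfs` builder, `type`/`entries` from `Directory`); the DID, rkey, and blob ref shown are hypothetical, and the actual lexicon definitions in `lexicons/` are authoritative.

```
{
  "root": {
    "type": "directory",
    "entries": [
      {
        "name": "index.html",
        "node": { "type": "file", "blob": { "$type": "blob", "...": "blob ref here" } }
      },
      {
        "name": "assets",
        "node": {
          "type": "subfs",
          "subject": "at://did:plc:example/place.wisp.subfs/3kexamplerkey",
          "flat": false
        }
      }
    ]
  }
}
```

When `flat` is true, the subfs root's entries are merged directly into the mount point; otherwise they are nested under a directory named after the entry, matching the two branches in `replace_subfs_with_content`.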
-14
cli/test_headers.rs
···
use http::Request;

fn main() {
    let builder = Request::builder()
        .header(http::header::CONTENT_TYPE, "*/*")
        .header(http::header::CONTENT_TYPE, "application/octet-stream");

    let req = builder.body(()).unwrap();

    println!("Content-Type headers:");
    for value in req.headers().get_all(http::header::CONTENT_TYPE) {
        println!(" {:?}", value);
    }
}
+90
crates.nix
···
{...}: {
  perSystem = {
    pkgs,
    config,
    lib,
    inputs',
    ...
  }: {
    # declare projects
    nci.projects."wisp-place-cli" = {
      path = ./cli;
      export = false;
    };
    nci.toolchains.mkBuild = _:
      with inputs'.fenix.packages;
        combine [
          minimal.rustc
          minimal.cargo
          targets.x86_64-pc-windows-gnu.latest.rust-std
          targets.x86_64-unknown-linux-gnu.latest.rust-std
          targets.aarch64-apple-darwin.latest.rust-std
          targets.aarch64-unknown-linux-gnu.latest.rust-std
        ];
    # configure crates
    nci.crates."wisp-cli" = {
      profiles = {
        dev.runTests = false;
        release.runTests = false;
      };
      targets."x86_64-unknown-linux-gnu" = let
        targetPkgs = pkgs.pkgsCross.gnu64;
        targetCC = targetPkgs.stdenv.cc;
        targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
      in rec {
        default = true;
        depsDrvConfig.mkDerivation = {
          nativeBuildInputs = [targetCC];
        };
        depsDrvConfig.env = rec {
          TARGET_CC = "${targetCC.targetPrefix}cc";
          "CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
        };
        drvConfig = depsDrvConfig;
      };
      targets."x86_64-pc-windows-gnu" = let
        targetPkgs = pkgs.pkgsCross.mingwW64;
        targetCC = targetPkgs.stdenv.cc;
        targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
      in rec {
        depsDrvConfig.mkDerivation = {
          nativeBuildInputs = [targetCC];
          buildInputs = with targetPkgs; [windows.pthreads];
        };
        depsDrvConfig.env = rec {
          TARGET_CC = "${targetCC.targetPrefix}cc";
          "CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
        };
        drvConfig = depsDrvConfig;
      };
      targets."aarch64-apple-darwin" = let
        targetPkgs = pkgs.pkgsCross.aarch64-darwin;
        targetCC = targetPkgs.stdenv.cc;
        targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
      in rec {
        depsDrvConfig.mkDerivation = {
          nativeBuildInputs = [targetCC];
        };
        depsDrvConfig.env = rec {
          TARGET_CC = "${targetCC.targetPrefix}cc";
          "CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
        };
        drvConfig = depsDrvConfig;
      };
      targets."aarch64-unknown-linux-gnu" = let
        targetPkgs = pkgs.pkgsCross.aarch64-multiplatform;
        targetCC = targetPkgs.stdenv.cc;
        targetCargoEnvVarTarget = targetPkgs.stdenv.hostPlatform.rust.cargoEnvVarTarget;
      in rec {
        depsDrvConfig.mkDerivation = {
          nativeBuildInputs = [targetCC];
        };
        depsDrvConfig.env = rec {
          TARGET_CC = "${targetCC.targetPrefix}cc";
          "CARGO_TARGET_${targetCargoEnvVarTarget}_LINKER" = TARGET_CC;
+
};
+
drvConfig = depsDrvConfig;
+
};
+
};
+
};
+
}
+318
flake.lock
···
+
{
+
"nodes": {
+
"crane": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1758758545,
+
"narHash": "sha256-NU5WaEdfwF6i8faJ2Yh+jcK9vVFrofLcwlD/mP65JrI=",
+
"owner": "ipetkov",
+
"repo": "crane",
+
"rev": "95d528a5f54eaba0d12102249ce42f4d01f4e364",
+
"type": "github"
+
},
+
"original": {
+
"owner": "ipetkov",
+
"ref": "v0.21.1",
+
"repo": "crane",
+
"type": "github"
+
}
+
},
+
"dream2nix": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
],
+
"purescript-overlay": "purescript-overlay",
+
"pyproject-nix": "pyproject-nix"
+
},
+
"locked": {
+
"lastModified": 1754978539,
+
"narHash": "sha256-nrDovydywSKRbWim9Ynmgj8SBm8LK3DI2WuhIqzOHYI=",
+
"owner": "nix-community",
+
"repo": "dream2nix",
+
"rev": "fbec3263cb4895ac86ee9506cdc4e6919a1a2214",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nix-community",
+
"repo": "dream2nix",
+
"type": "github"
+
}
+
},
+
"fenix": {
+
"inputs": {
+
"nixpkgs": [
+
"nixpkgs"
+
],
+
"rust-analyzer-src": "rust-analyzer-src"
+
},
+
"locked": {
+
"lastModified": 1762584108,
+
"narHash": "sha256-wZUW7dlXMXaRdvNbaADqhF8gg9bAfFiMV+iyFQiDv+Y=",
+
"owner": "nix-community",
+
"repo": "fenix",
+
"rev": "32f3ad3b6c690061173e1ac16708874975ec6056",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nix-community",
+
"repo": "fenix",
+
"type": "github"
+
}
+
},
+
"flake-compat": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1696426674,
+
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
+
"owner": "edolstra",
+
"repo": "flake-compat",
+
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
+
"type": "github"
+
},
+
"original": {
+
"owner": "edolstra",
+
"repo": "flake-compat",
+
"type": "github"
+
}
+
},
+
"mk-naked-shell": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1681286841,
+
"narHash": "sha256-3XlJrwlR0nBiREnuogoa5i1b4+w/XPe0z8bbrJASw0g=",
+
"owner": "90-008",
+
"repo": "mk-naked-shell",
+
"rev": "7612f828dd6f22b7fb332cc69440e839d7ffe6bd",
+
"type": "github"
+
},
+
"original": {
+
"owner": "90-008",
+
"repo": "mk-naked-shell",
+
"type": "github"
+
}
+
},
+
"nci": {
+
"inputs": {
+
"crane": "crane",
+
"dream2nix": "dream2nix",
+
"mk-naked-shell": "mk-naked-shell",
+
"nixpkgs": [
+
"nixpkgs"
+
],
+
"parts": "parts",
+
"rust-overlay": "rust-overlay",
+
"treefmt": "treefmt"
+
},
+
"locked": {
+
"lastModified": 1762582646,
+
"narHash": "sha256-MMzE4xccG+8qbLhdaZoeFDUKWUOn3B4lhp5dZmgukmM=",
+
"owner": "90-008",
+
"repo": "nix-cargo-integration",
+
"rev": "0993c449377049fa8868a664e8290ac6658e0b9a",
+
"type": "github"
+
},
+
"original": {
+
"owner": "90-008",
+
"repo": "nix-cargo-integration",
+
"type": "github"
+
}
+
},
+
"nixpkgs": {
+
"locked": {
+
"lastModified": 1762361079,
+
"narHash": "sha256-lz718rr1BDpZBYk7+G8cE6wee3PiBUpn8aomG/vLLiY=",
+
"owner": "nixos",
+
"repo": "nixpkgs",
+
"rev": "ffcdcf99d65c61956d882df249a9be53e5902ea5",
+
"type": "github"
+
},
+
"original": {
+
"owner": "nixos",
+
"ref": "nixpkgs-unstable",
+
"repo": "nixpkgs",
+
"type": "github"
+
}
+
},
+
"parts": {
+
"inputs": {
+
"nixpkgs-lib": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762440070,
+
"narHash": "sha256-xxdepIcb39UJ94+YydGP221rjnpkDZUlykKuF54PsqI=",
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"rev": "26d05891e14c88eb4a5d5bee659c0db5afb609d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"type": "github"
+
}
+
},
+
"parts_2": {
+
"inputs": {
+
"nixpkgs-lib": [
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762440070,
+
"narHash": "sha256-xxdepIcb39UJ94+YydGP221rjnpkDZUlykKuF54PsqI=",
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"rev": "26d05891e14c88eb4a5d5bee659c0db5afb609d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "hercules-ci",
+
"repo": "flake-parts",
+
"type": "github"
+
}
+
},
+
"purescript-overlay": {
+
"inputs": {
+
"flake-compat": "flake-compat",
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"nixpkgs"
+
],
+
"slimlock": "slimlock"
+
},
+
"locked": {
+
"lastModified": 1728546539,
+
"narHash": "sha256-Sws7w0tlnjD+Bjck1nv29NjC5DbL6nH5auL9Ex9Iz2A=",
+
"owner": "thomashoneyman",
+
"repo": "purescript-overlay",
+
"rev": "4ad4c15d07bd899d7346b331f377606631eb0ee4",
+
"type": "github"
+
},
+
"original": {
+
"owner": "thomashoneyman",
+
"repo": "purescript-overlay",
+
"type": "github"
+
}
+
},
+
"pyproject-nix": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1752481895,
+
"narHash": "sha256-luVj97hIMpCbwhx3hWiRwjP2YvljWy8FM+4W9njDhLA=",
+
"owner": "pyproject-nix",
+
"repo": "pyproject.nix",
+
"rev": "16ee295c25107a94e59a7fc7f2e5322851781162",
+
"type": "github"
+
},
+
"original": {
+
"owner": "pyproject-nix",
+
"repo": "pyproject.nix",
+
"type": "github"
+
}
+
},
+
"root": {
+
"inputs": {
+
"fenix": "fenix",
+
"nci": "nci",
+
"nixpkgs": "nixpkgs",
+
"parts": "parts_2"
+
}
+
},
+
"rust-analyzer-src": {
+
"flake": false,
+
"locked": {
+
"lastModified": 1762438844,
+
"narHash": "sha256-ApIKJf6CcMsV2nYBXhGF95BmZMO/QXPhgfSnkA/rVUo=",
+
"owner": "rust-lang",
+
"repo": "rust-analyzer",
+
"rev": "4bf516ee5a960c1e2eee9fedd9b1c9e976a19c86",
+
"type": "github"
+
},
+
"original": {
+
"owner": "rust-lang",
+
"ref": "nightly",
+
"repo": "rust-analyzer",
+
"type": "github"
+
}
+
},
+
"rust-overlay": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762569282,
+
"narHash": "sha256-vINZAJpXQTZd5cfh06Rcw7hesH7sGSvi+Tn+HUieJn8=",
+
"owner": "oxalica",
+
"repo": "rust-overlay",
+
"rev": "a35a6144b976f70827c2fe2f5c89d16d8f9179d8",
+
"type": "github"
+
},
+
"original": {
+
"owner": "oxalica",
+
"repo": "rust-overlay",
+
"type": "github"
+
}
+
},
+
"slimlock": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"dream2nix",
+
"purescript-overlay",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1688756706,
+
"narHash": "sha256-xzkkMv3neJJJ89zo3o2ojp7nFeaZc2G0fYwNXNJRFlo=",
+
"owner": "thomashoneyman",
+
"repo": "slimlock",
+
"rev": "cf72723f59e2340d24881fd7bf61cb113b4c407c",
+
"type": "github"
+
},
+
"original": {
+
"owner": "thomashoneyman",
+
"repo": "slimlock",
+
"type": "github"
+
}
+
},
+
"treefmt": {
+
"inputs": {
+
"nixpkgs": [
+
"nci",
+
"nixpkgs"
+
]
+
},
+
"locked": {
+
"lastModified": 1762410071,
+
"narHash": "sha256-aF5fvoZeoXNPxT0bejFUBXeUjXfHLSL7g+mjR/p5TEg=",
+
"owner": "numtide",
+
"repo": "treefmt-nix",
+
"rev": "97a30861b13c3731a84e09405414398fbf3e109f",
+
"type": "github"
+
},
+
"original": {
+
"owner": "numtide",
+
"repo": "treefmt-nix",
+
"type": "github"
+
}
+
}
+
},
+
"root": "root",
+
"version": 7
+
}
+59
flake.nix
···
+
{
+
inputs.nixpkgs.url = "github:nixos/nixpkgs/nixpkgs-unstable";
+
inputs.nci.url = "github:90-008/nix-cargo-integration";
+
inputs.nci.inputs.nixpkgs.follows = "nixpkgs";
+
inputs.parts.url = "github:hercules-ci/flake-parts";
+
inputs.parts.inputs.nixpkgs-lib.follows = "nixpkgs";
+
inputs.fenix = {
+
url = "github:nix-community/fenix";
+
inputs.nixpkgs.follows = "nixpkgs";
+
};
+
+
outputs = inputs @ {
+
parts,
+
nci,
+
...
+
}:
+
parts.lib.mkFlake {inherit inputs;} {
+
systems = ["x86_64-linux" "aarch64-darwin"];
+
imports = [
+
nci.flakeModule
+
./crates.nix
+
];
+
perSystem = {
+
pkgs,
+
config,
+
...
+
}: let
+
crateOutputs = config.nci.outputs."wisp-cli";
+
mkRenamedPackage = name: pkg: isWindows: pkgs.runCommand name {} ''
+
mkdir -p $out/bin
+
if [ -f ${pkg}/bin/wisp-cli.exe ]; then
+
cp ${pkg}/bin/wisp-cli.exe $out/bin/${name}
+
elif [ -f ${pkg}/bin/wisp-cli ]; then
+
cp ${pkg}/bin/wisp-cli $out/bin/${name}
+
else
+
echo "Error: Could not find wisp-cli binary in ${pkg}/bin/"
+
ls -la ${pkg}/bin/ || true
+
exit 1
+
fi
+
'';
+
in {
+
devShells.default = crateOutputs.devShell;
+
packages.default = crateOutputs.packages.release;
+
packages.wisp-cli-x86_64-linux = mkRenamedPackage "wisp-cli-x86_64-linux" crateOutputs.packages.release false;
+
packages.wisp-cli-aarch64-linux = mkRenamedPackage "wisp-cli-aarch64-linux" crateOutputs.allTargets."aarch64-unknown-linux-gnu".packages.release false;
+
packages.wisp-cli-x86_64-windows = mkRenamedPackage "wisp-cli-x86_64-windows.exe" crateOutputs.allTargets."x86_64-pc-windows-gnu".packages.release true;
+
packages.wisp-cli-aarch64-darwin = mkRenamedPackage "wisp-cli-aarch64-darwin" crateOutputs.allTargets."aarch64-apple-darwin".packages.release false;
+
packages.all = pkgs.symlinkJoin {
+
name = "wisp-cli-all";
+
paths = [
+
config.packages.wisp-cli-x86_64-linux
+
config.packages.wisp-cli-aarch64-linux
+
config.packages.wisp-cli-x86_64-windows
+
config.packages.wisp-cli-aarch64-darwin
+
];
+
};
+
};
+
};
+
}
-34
hosting-service/.dockerignore
···
-
# Dependencies
-
node_modules
-
-
# Environment files
-
.env
-
.env.*
-
!.env.example
-
-
# Git
-
.git
-
.gitignore
-
-
# Cache
-
cache
-
-
# Documentation
-
*.md
-
!README.md
-
-
# Logs
-
*.log
-
npm-debug.log*
-
bun-debug.log*
-
-
# OS files
-
.DS_Store
-
Thumbs.db
-
-
# IDE
-
.vscode
-
.idea
-
*.swp
-
*.swo
-
*~
-6
hosting-service/.env.example
···
-
# Database
-
DATABASE_URL=postgres://postgres:postgres@localhost:5432/wisp
-
-
# Server
-
PORT=3001
-
BASE_HOST=wisp.place
-8
hosting-service/.gitignore
···
-
node_modules/
-
cache/
-
.env
-
.env.local
-
*.log
-
dist/
-
build/
-
.DS_Store
-31
hosting-service/Dockerfile
···
-
# Use official Bun image
-
FROM oven/bun:1.3 AS base
-
-
# Set working directory
-
WORKDIR /app
-
-
# Copy package files
-
COPY package.json bun.lock ./
-
-
# Install dependencies
-
RUN bun install --frozen-lockfile --production
-
-
# Copy source code
-
COPY src ./src
-
-
# Create cache directory
-
RUN mkdir -p ./cache/sites
-
-
# Set environment variables (can be overridden at runtime)
-
ENV PORT=3001
-
ENV NODE_ENV=production
-
-
# Expose the application port
-
EXPOSE 3001
-
-
# Health check
-
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
-
CMD bun -e "fetch('http://localhost:3001/health').then(r => r.ok ? process.exit(0) : process.exit(1)).catch(() => process.exit(1))"
-
-
# Start the application
-
CMD ["bun", "src/index.ts"]
-123
hosting-service/EXAMPLE.md
···
-
# HTML Path Rewriting Example
-
-
This document demonstrates how HTML path rewriting works when serving sites via the `/s/:identifier/:site/*` route.
-
-
## Problem
-
-
When you create a static site with absolute paths like `/style.css` or `/images/logo.png`, these paths work fine when served from the root domain. However, when served from a subdirectory like `/s/alice.bsky.social/mysite/`, these absolute paths break because they resolve to the server root instead of the site root.
-
-
## Solution
-
-
The hosting service automatically rewrites absolute paths in HTML files to work correctly in the subdirectory context.
-
-
## Example
-
-
**Original HTML file (index.html):**
-
```html
-
<!DOCTYPE html>
-
<html>
-
<head>
-
<meta charset="UTF-8">
-
<title>My Site</title>
-
<link rel="stylesheet" href="/style.css">
-
<link rel="icon" href="/favicon.ico">
-
<script src="/app.js"></script>
-
</head>
-
<body>
-
<header>
-
<img src="/images/logo.png" alt="Logo">
-
<nav>
-
<a href="/">Home</a>
-
<a href="/about">About</a>
-
<a href="/contact">Contact</a>
-
</nav>
-
</header>
-
-
<main>
-
<h1>Welcome</h1>
-
<img src="/images/hero.jpg"
-
srcset="/images/hero.jpg 1x, /images/hero@2x.jpg 2x"
-
alt="Hero">
-
-
<form action="/submit" method="post">
-
<input type="text" name="email">
-
<button>Submit</button>
-
</form>
-
</main>
-
-
<footer>
-
<a href="https://example.com">External Link</a>
-
<a href="#top">Back to Top</a>
-
</footer>
-
</body>
-
</html>
-
```
-
-
**When accessed via `/s/alice.bsky.social/mysite/`, the HTML is rewritten to:**
-
```html
-
<!DOCTYPE html>
-
<html>
-
<head>
-
<meta charset="UTF-8">
-
<title>My Site</title>
-
<link rel="stylesheet" href="/s/alice.bsky.social/mysite/style.css">
-
<link rel="icon" href="/s/alice.bsky.social/mysite/favicon.ico">
-
<script src="/s/alice.bsky.social/mysite/app.js"></script>
-
</head>
-
<body>
-
<header>
-
<img src="/s/alice.bsky.social/mysite/images/logo.png" alt="Logo">
-
<nav>
-
<a href="/s/alice.bsky.social/mysite/">Home</a>
-
<a href="/s/alice.bsky.social/mysite/about">About</a>
-
<a href="/s/alice.bsky.social/mysite/contact">Contact</a>
-
</nav>
-
</header>
-
-
<main>
-
<h1>Welcome</h1>
-
<img src="/s/alice.bsky.social/mysite/images/hero.jpg"
-
srcset="/s/alice.bsky.social/mysite/images/hero.jpg 1x, /s/alice.bsky.social/mysite/images/hero@2x.jpg 2x"
-
alt="Hero">
-
-
<form action="/s/alice.bsky.social/mysite/submit" method="post">
-
<input type="text" name="email">
-
<button>Submit</button>
-
</form>
-
</main>
-
-
<footer>
-
<a href="https://example.com">External Link</a>
-
<a href="#top">Back to Top</a>
-
</footer>
-
</body>
-
</html>
-
```
-
-
## What's Preserved
-
-
Notice that:
-
- ✅ Absolute paths are rewritten: `/style.css` → `/s/alice.bsky.social/mysite/style.css`
-
- ✅ External URLs are preserved: `https://example.com` stays the same
-
- ✅ Anchors are preserved: `#top` stays the same
-
- ✅ Rewriting only touches site-internal absolute paths, so it won't break your site
-
-
## Supported Attributes
-
-
The rewriter handles these HTML attributes:
-
- `src` - images, scripts, iframes, videos, audio
-
- `href` - links, stylesheets
-
- `action` - forms
-
- `data` - objects
-
- `poster` - video posters
-
- `srcset` - responsive images
-
-
## Testing Your Site
-
-
To test if your site works with path rewriting:
-
-
1. Upload your site to your PDS as a `place.wisp.fs` record
-
2. Access it via: `https://hosting.wisp.place/s/YOUR_HANDLE/SITE_NAME/`
-
3. Check that all resources load correctly
-
-
If you're using relative paths already (like `./style.css` or `../images/logo.png`), they'll work without any rewriting.
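The transformation shown above can be sketched as a small TypeScript function. This is an illustrative approximation only (the actual service rewrites HTML attribute by attribute, including `srcset`, rather than using a single regex):

```typescript
// Rewrite site-internal absolute URLs to live under a /s/<identifier>/<site>/ prefix.
// Illustrative sketch; a production rewriter should parse HTML, not regex it.
function rewriteAbsolutePaths(html: string, identifier: string, site: string): string {
  const prefix = `/s/${identifier}/${site}`;
  // Match src/href/action/poster/data attributes whose value starts with "/",
  // but leave protocol-relative URLs ("//…") and root anchors ("/#…") alone.
  return html.replace(
    /\b(src|href|action|poster|data)=(["'])\/(?![/#])/g,
    (_m, attr, quote) => `${attr}=${quote}${prefix}/`
  );
}
```

External URLs, `data:` URIs, and plain `#anchor` links never begin with a bare `/` after the quote, so they fall through untouched.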
-130
hosting-service/README.md
···
-
# Wisp Hosting Service
-
-
Minimal microservice for hosting static sites stored on the AT Protocol. Built with Hono and Bun.
-
-
## Features
-
-
- **Custom Domain Hosting**: Serve verified custom domains
-
- **Wisp.place Subdomains**: Serve registered `*.wisp.place` subdomains
-
- **DNS Hash Routing**: Support DNS verification via `hash.dns.wisp.place`
-
- **Direct File Serving**: Access sites via `/s/:identifier/:site/*` (no DB lookup)
-
- **Firehose Worker**: Listens to AT Protocol firehose for new `place.wisp.fs` records
-
- **Automatic Caching**: Downloads and caches sites locally on first access or firehose event
-
- **SSRF Protection**: Hardened fetch with timeout, size limits, and private IP blocking
-
-
## Routes
-
-
1. **Custom Domains** (`/*`)
-
- Serves verified custom domains (example.com)
-
- DB lookup: `custom_domains` table
-
-
2. **Wisp Subdomains** (`/*.wisp.place/*`)
-
- Serves registered subdomains (alice.wisp.place)
-
- DB lookup: `domains` table
-
-
3. **DNS Hash Routing** (`/hash.dns.wisp.place/*`)
-
- DNS verification routing for custom domains
-
- DB lookup: `custom_domains` by hash
-
-
4. **Direct Serving** (`/s.wisp.place/:identifier/:site/*`)
-
- Direct access without DB lookup
-
- `:identifier` can be DID or handle
-
- Fetches from PDS if not cached
-
- **Automatic HTML path rewriting**: Absolute paths (`/style.css`) are rewritten to prefixed paths (`/s/:identifier/:site/style.css`)
-
-
## Setup
-
-
```bash
-
# Install dependencies
-
bun install
-
-
# Copy environment file
-
cp .env.example .env
-
-
# Run in development
-
bun run dev
-
-
# Run in production
-
bun run start
-
```
-
-
## Environment Variables
-
-
- `DATABASE_URL` - PostgreSQL connection string
-
- `PORT` - HTTP server port (default: 3001)
-
- `BASE_HOST` - Base domain (default: wisp.place)
-
-
## Architecture
-
-
- **Hono**: Minimal web framework
-
- **Postgres**: Database for domain/site lookups
-
- **AT Protocol**: Decentralized storage
-
- **Jetstream**: Firehose consumer for real-time updates
-
- **Bun**: Runtime and file serving
-
-
## Cache Structure
-
-
```
-
cache/sites/
-
did:plc:abc123/
-
sitename/
-
index.html
-
style.css
-
assets/
-
logo.png
-
```
-
-
## Health Check
-
-
```bash
-
curl http://localhost:3001/health
-
```
-
-
Returns firehose connection status and last event time.
-
-
## HTML Path Rewriting
-
-
When serving sites via the `/s/:identifier/:site/*` route, HTML files are automatically processed to rewrite absolute paths to work correctly in the subdirectory context.
-
-
**What gets rewritten:**
-
- `src` attributes (images, scripts, iframes)
-
- `href` attributes (links, stylesheets)
-
- `action` attributes (forms)
-
- `poster`, `data` attributes (media)
-
- `srcset` attributes (responsive images)
-
-
**What's preserved:**
-
- External URLs (`https://example.com/style.css`)
-
- Protocol-relative URLs (`//cdn.example.com/script.js`)
-
- Data URIs (`data:image/png;base64,...`)
-
- Anchors (`/#section`)
-
- Already relative paths (`./style.css`, `../images/logo.png`)
-
-
**Example:**
-
```html
-
<!-- Original HTML -->
-
<link rel="stylesheet" href="/style.css">
-
<img src="/images/logo.png">
-
-
<!-- Served at /s/did:plc:abc123/mysite/ becomes -->
-
<link rel="stylesheet" href="/s/did:plc:abc123/mysite/style.css">
-
<img src="/s/did:plc:abc123/mysite/images/logo.png">
-
```
-
-
This ensures sites work correctly when served from subdirectories without requiring manual path adjustments.
-
-
## Security
-
-
### SSRF Protection
-
-
All external HTTP requests are protected against Server-Side Request Forgery (SSRF) attacks:
-
-
- **5-second timeout** on all requests
-
- **Size limits**: 1MB for JSON, 10MB default, 100MB for file blobs
-
- **Blocked private IP ranges**:
-
- Loopback (127.0.0.0/8, ::1)
-
- Private networks (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16)
-
- Link-local (169.254.0.0/16, fe80::/10)
-
- Cloud metadata endpoints (169.254.169.254)
-
- **Protocol validation**: Only HTTP/HTTPS allowed
-
- **Streaming with size enforcement**: Prevents memory exhaustion from large responses
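The blocked-range policy above can be sketched as a simple IPv4 check. This is a deliberately simplified illustration (a hypothetical helper; real SSRF protection must also resolve hostnames, handle IPv6, and re-check every redirect target):

```typescript
// Return true if an IPv4 address falls in a range the hardened fetch refuses to contact.
// Simplified sketch for illustration; not the service's actual implementation.
function isBlockedIPv4(ip: string): boolean {
  const octets = ip.split(".").map(Number);
  if (octets.length !== 4 || octets.some((o) => !Number.isInteger(o) || o < 0 || o > 255)) {
    return true; // refuse anything that is not a well-formed IPv4 address
  }
  const [a, b] = octets;
  return (
    a === 127 ||                          // loopback 127.0.0.0/8
    a === 10 ||                           // private 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) ||  // private 172.16.0.0/12
    (a === 192 && b === 168) ||           // private 192.168.0.0/16
    (a === 169 && b === 254)              // link-local, incl. metadata 169.254.169.254
  );
}
```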
-375
hosting-service/bun.lock
···
-
{
-
"lockfileVersion": 1,
-
"workspaces": {
-
"": {
-
"name": "wisp-hosting-service",
-
"dependencies": {
-
"@atproto/api": "^0.17.4",
-
"@atproto/identity": "^0.4.9",
-
"@atproto/lexicon": "^0.5.1",
-
"@atproto/sync": "^0.1.36",
-
"@atproto/xrpc": "^0.7.5",
-
"@hono/node-server": "^1.19.6",
-
"hono": "^4.10.4",
-
"mime-types": "^2.1.35",
-
"multiformats": "^13.4.1",
-
"postgres": "^3.4.5",
-
},
-
"devDependencies": {
-
"@types/bun": "^1.3.1",
-
"@types/mime-types": "^2.1.4",
-
"@types/node": "^22.10.5",
-
"tsx": "^4.19.2",
-
},
-
},
-
},
-
"packages": {
-
"@atproto/api": ["@atproto/api@0.17.4", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/lexicon": "^0.5.1", "@atproto/syntax": "^0.4.1", "@atproto/xrpc": "^0.7.5", "await-lock": "^2.2.2", "multiformats": "^9.9.0", "tlds": "^1.234.0", "zod": "^3.23.8" } }, ""],
-
-
"@atproto/common": ["@atproto/common@0.4.12", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@ipld/dag-cbor": "^7.0.3", "cbor-x": "^1.5.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "pino": "^8.21.0" } }, ""],
-
-
"@atproto/common-web": ["@atproto/common-web@0.4.3", "", { "dependencies": { "graphemer": "^1.4.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "zod": "^3.23.8" } }, ""],
-
-
"@atproto/crypto": ["@atproto/crypto@0.4.4", "", { "dependencies": { "@noble/curves": "^1.7.0", "@noble/hashes": "^1.6.1", "uint8arrays": "3.0.0" } }, ""],
-
-
"@atproto/identity": ["@atproto/identity@0.4.9", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/crypto": "^0.4.4" } }, ""],
-
-
"@atproto/lexicon": ["@atproto/lexicon@0.5.1", "", { "dependencies": { "@atproto/common-web": "^0.4.3", "@atproto/syntax": "^0.4.1", "iso-datestring-validator": "^2.2.2", "multiformats": "^9.9.0", "zod": "^3.23.8" } }, ""],
-
-
"@atproto/repo": ["@atproto/repo@0.8.10", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/common-web": "^0.4.3", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@ipld/dag-cbor": "^7.0.0", "multiformats": "^9.9.0", "uint8arrays": "3.0.0", "varint": "^6.0.0", "zod": "^3.23.8" } }, ""],
-
-
"@atproto/sync": ["@atproto/sync@0.1.36", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/identity": "^0.4.9", "@atproto/lexicon": "^0.5.1", "@atproto/repo": "^0.8.10", "@atproto/syntax": "^0.4.1", "@atproto/xrpc-server": "^0.9.5", "multiformats": "^9.9.0", "p-queue": "^6.6.2", "ws": "^8.12.0" } }, "sha512-HyF835Bmn8ps9BuXkmGjRrbgfv4K3fJdfEvXimEhTCntqIxQg0ttmOYDg/WBBmIRfkCB5ab+wS1PCGN8trr+FQ=="],
-
-
"@atproto/syntax": ["@atproto/syntax@0.4.1", "", {}, ""],
-
-
"@atproto/xrpc": ["@atproto/xrpc@0.7.5", "", { "dependencies": { "@atproto/lexicon": "^0.5.1", "zod": "^3.23.8" } }, ""],
-
-
"@atproto/xrpc-server": ["@atproto/xrpc-server@0.9.5", "", { "dependencies": { "@atproto/common": "^0.4.12", "@atproto/crypto": "^0.4.4", "@atproto/lexicon": "^0.5.1", "@atproto/xrpc": "^0.7.5", "cbor-x": "^1.5.1", "express": "^4.17.2", "http-errors": "^2.0.0", "mime-types": "^2.1.35", "rate-limiter-flexible": "^2.4.1", "uint8arrays": "3.0.0", "ws": "^8.12.0", "zod": "^3.23.8" } }, ""],
-
-
"@cbor-extract/cbor-extract-darwin-arm64": ["@cbor-extract/cbor-extract-darwin-arm64@2.2.0", "", { "os": "darwin", "cpu": "arm64" }, ""],
-
-
"@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.25.11", "", { "os": "aix", "cpu": "ppc64" }, "sha512-Xt1dOL13m8u0WE8iplx9Ibbm+hFAO0GsU2P34UNoDGvZYkY8ifSiy6Zuc1lYxfG7svWE2fzqCUmFp5HCn51gJg=="],
-
-
"@esbuild/android-arm": ["@esbuild/android-arm@0.25.11", "", { "os": "android", "cpu": "arm" }, "sha512-uoa7dU+Dt3HYsethkJ1k6Z9YdcHjTrSb5NUy66ZfZaSV8hEYGD5ZHbEMXnqLFlbBflLsl89Zke7CAdDJ4JI+Gg=="],
-
-
"@esbuild/android-arm64": ["@esbuild/android-arm64@0.25.11", "", { "os": "android", "cpu": "arm64" }, "sha512-9slpyFBc4FPPz48+f6jyiXOx/Y4v34TUeDDXJpZqAWQn/08lKGeD8aDp9TMn9jDz2CiEuHwfhRmGBvpnd/PWIQ=="],
-
-
"@esbuild/android-x64": ["@esbuild/android-x64@0.25.11", "", { "os": "android", "cpu": "x64" }, "sha512-Sgiab4xBjPU1QoPEIqS3Xx+R2lezu0LKIEcYe6pftr56PqPygbB7+szVnzoShbx64MUupqoE0KyRlN7gezbl8g=="],
-
-
"@esbuild/darwin-arm64": ["@esbuild/darwin-arm64@0.25.11", "", { "os": "darwin", "cpu": "arm64" }, "sha512-VekY0PBCukppoQrycFxUqkCojnTQhdec0vevUL/EDOCnXd9LKWqD/bHwMPzigIJXPhC59Vd1WFIL57SKs2mg4w=="],
-
-
"@esbuild/darwin-x64": ["@esbuild/darwin-x64@0.25.11", "", { "os": "darwin", "cpu": "x64" }, "sha512-+hfp3yfBalNEpTGp9loYgbknjR695HkqtY3d3/JjSRUyPg/xd6q+mQqIb5qdywnDxRZykIHs3axEqU6l1+oWEQ=="],
-
-
"@esbuild/freebsd-arm64": ["@esbuild/freebsd-arm64@0.25.11", "", { "os": "freebsd", "cpu": "arm64" }, "sha512-CmKjrnayyTJF2eVuO//uSjl/K3KsMIeYeyN7FyDBjsR3lnSJHaXlVoAK8DZa7lXWChbuOk7NjAc7ygAwrnPBhA=="],
-
-
"@esbuild/freebsd-x64": ["@esbuild/freebsd-x64@0.25.11", "", { "os": "freebsd", "cpu": "x64" }, "sha512-Dyq+5oscTJvMaYPvW3x3FLpi2+gSZTCE/1ffdwuM6G1ARang/mb3jvjxs0mw6n3Lsw84ocfo9CrNMqc5lTfGOw=="],
-
-
"@esbuild/linux-arm": ["@esbuild/linux-arm@0.25.11", "", { "os": "linux", "cpu": "arm" }, "sha512-TBMv6B4kCfrGJ8cUPo7vd6NECZH/8hPpBHHlYI3qzoYFvWu2AdTvZNuU/7hsbKWqu/COU7NIK12dHAAqBLLXgw=="],
-
-
"@esbuild/linux-arm64": ["@esbuild/linux-arm64@0.25.11", "", { "os": "linux", "cpu": "arm64" }, "sha512-Qr8AzcplUhGvdyUF08A1kHU3Vr2O88xxP0Tm8GcdVOUm25XYcMPp2YqSVHbLuXzYQMf9Bh/iKx7YPqECs6ffLA=="],
-
-
"@esbuild/linux-ia32": ["@esbuild/linux-ia32@0.25.11", "", { "os": "linux", "cpu": "ia32" }, "sha512-TmnJg8BMGPehs5JKrCLqyWTVAvielc615jbkOirATQvWWB1NMXY77oLMzsUjRLa0+ngecEmDGqt5jiDC6bfvOw=="],
-
-
"@esbuild/linux-loong64": ["@esbuild/linux-loong64@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-DIGXL2+gvDaXlaq8xruNXUJdT5tF+SBbJQKbWy/0J7OhU8gOHOzKmGIlfTTl6nHaCOoipxQbuJi7O++ldrxgMw=="],
-
-
"@esbuild/linux-mips64el": ["@esbuild/linux-mips64el@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-Osx1nALUJu4pU43o9OyjSCXokFkFbyzjXb6VhGIJZQ5JZi8ylCQ9/LFagolPsHtgw6himDSyb5ETSfmp4rpiKQ=="],
-
-
"@esbuild/linux-ppc64": ["@esbuild/linux-ppc64@0.25.11", "", { "os": "linux", "cpu": "ppc64" }, "sha512-nbLFgsQQEsBa8XSgSTSlrnBSrpoWh7ioFDUmwo158gIm5NNP+17IYmNWzaIzWmgCxq56vfr34xGkOcZ7jX6CPw=="],
-
-
"@esbuild/linux-riscv64": ["@esbuild/linux-riscv64@0.25.11", "", { "os": "linux", "cpu": "none" }, "sha512-HfyAmqZi9uBAbgKYP1yGuI7tSREXwIb438q0nqvlpxAOs3XnZ8RsisRfmVsgV486NdjD7Mw2UrFSw51lzUk1ww=="],
-
-
"@esbuild/linux-s390x": ["@esbuild/linux-s390x@0.25.11", "", { "os": "linux", "cpu": "s390x" }, "sha512-HjLqVgSSYnVXRisyfmzsH6mXqyvj0SA7pG5g+9W7ESgwA70AXYNpfKBqh1KbTxmQVaYxpzA/SvlB9oclGPbApw=="],
-
-
"@esbuild/linux-x64": ["@esbuild/linux-x64@0.25.11", "", { "os": "linux", "cpu": "x64" }, "sha512-HSFAT4+WYjIhrHxKBwGmOOSpphjYkcswF449j6EjsjbinTZbp8PJtjsVK1XFJStdzXdy/jaddAep2FGY+wyFAQ=="],
-
-
"@esbuild/netbsd-arm64": ["@esbuild/netbsd-arm64@0.25.11", "", { "os": "none", "cpu": "arm64" }, "sha512-hr9Oxj1Fa4r04dNpWr3P8QKVVsjQhqrMSUzZzf+LZcYjZNqhA3IAfPQdEh1FLVUJSiu6sgAwp3OmwBfbFgG2Xg=="],
-
-
"@esbuild/netbsd-x64": ["@esbuild/netbsd-x64@0.25.11", "", { "os": "none", "cpu": "x64" }, "sha512-u7tKA+qbzBydyj0vgpu+5h5AeudxOAGncb8N6C9Kh1N4n7wU1Xw1JDApsRjpShRpXRQlJLb9wY28ELpwdPcZ7A=="],
-
-
"@esbuild/openbsd-arm64": ["@esbuild/openbsd-arm64@0.25.11", "", { "os": "openbsd", "cpu": "arm64" }, "sha512-Qq6YHhayieor3DxFOoYM1q0q1uMFYb7cSpLD2qzDSvK1NAvqFi8Xgivv0cFC6J+hWVw2teCYltyy9/m/14ryHg=="],
-
-
"@esbuild/openbsd-x64": ["@esbuild/openbsd-x64@0.25.11", "", { "os": "openbsd", "cpu": "x64" }, "sha512-CN+7c++kkbrckTOz5hrehxWN7uIhFFlmS/hqziSFVWpAzpWrQoAG4chH+nN3Be+Kzv/uuo7zhX716x3Sn2Jduw=="],
-
-
"@esbuild/openharmony-arm64": ["@esbuild/openharmony-arm64@0.25.11", "", { "os": "none", "cpu": "arm64" }, "sha512-rOREuNIQgaiR+9QuNkbkxubbp8MSO9rONmwP5nKncnWJ9v5jQ4JxFnLu4zDSRPf3x4u+2VN4pM4RdyIzDty/wQ=="],
-
-
"@esbuild/sunos-x64": ["@esbuild/sunos-x64@0.25.11", "", { "os": "sunos", "cpu": "x64" }, "sha512-nq2xdYaWxyg9DcIyXkZhcYulC6pQ2FuCgem3LI92IwMgIZ69KHeY8T4Y88pcwoLIjbed8n36CyKoYRDygNSGhA=="],
-
-
"@esbuild/win32-arm64": ["@esbuild/win32-arm64@0.25.11", "", { "os": "win32", "cpu": "arm64" }, "sha512-3XxECOWJq1qMZ3MN8srCJ/QfoLpL+VaxD/WfNRm1O3B4+AZ/BnLVgFbUV3eiRYDMXetciH16dwPbbHqwe1uU0Q=="],
-
-
"@esbuild/win32-ia32": ["@esbuild/win32-ia32@0.25.11", "", { "os": "win32", "cpu": "ia32" }, "sha512-3ukss6gb9XZ8TlRyJlgLn17ecsK4NSQTmdIXRASVsiS2sQ6zPPZklNJT5GR5tE/MUarymmy8kCEf5xPCNCqVOA=="],
-
-
"@esbuild/win32-x64": ["@esbuild/win32-x64@0.25.11", "", { "os": "win32", "cpu": "x64" }, "sha512-D7Hpz6A2L4hzsRpPaCYkQnGOotdUpDzSGRIv9I+1ITdHROSFUWW95ZPZWQmGka1Fg7W3zFJowyn9WGwMJ0+KPA=="],
-
-
"@hono/node-server": ["@hono/node-server@1.19.6", "", { "peerDependencies": { "hono": "^4" } }, "sha512-Shz/KjlIeAhfiuE93NDKVdZ7HdBVLQAfdbaXEaoAVO3ic9ibRSLGIQGkcBbFyuLr+7/1D5ZCINM8B+6IvXeMtw=="],
-
-
"@ipld/dag-cbor": ["@ipld/dag-cbor@7.0.3", "", { "dependencies": { "cborg": "^1.6.0", "multiformats": "^9.5.4" } }, ""],
-
-
"@noble/curves": ["@noble/curves@1.9.7", "", { "dependencies": { "@noble/hashes": "1.8.0" } }, ""],
-
-
"@noble/hashes": ["@noble/hashes@1.8.0", "", {}, ""],
-
-
"@types/bun": ["@types/bun@1.3.1", "", { "dependencies": { "bun-types": "1.3.1" } }, "sha512-4jNMk2/K9YJtfqwoAa28c8wK+T7nvJFOjxI4h/7sORWcypRNxBpr+TPNaCfVWq70tLCJsqoFwcf0oI0JU/fvMQ=="],
-
-
"@types/mime-types": ["@types/mime-types@2.1.4", "", {}, "sha512-lfU4b34HOri+kAY5UheuFMWPDOI+OPceBSHZKp69gEyTL/mmJ4cnU6Y/rlme3UL3GyOn6Y42hyIEw0/q8sWx5w=="],
-
-
"@types/node": ["@types/node@22.18.12", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-BICHQ67iqxQGFSzfCFTT7MRQ5XcBjG5aeKh5Ok38UBbPe5fxTyE+aHFxwVrGyr8GNlqFMLKD1D3P2K/1ks8tog=="],
-
-
"@types/react": ["@types/react@19.2.2", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA=="],
-
-
"abort-controller": ["abort-controller@3.0.0", "", { "dependencies": { "event-target-shim": "^5.0.0" } }, ""],
-
-
"accepts": ["accepts@1.3.8", "", { "dependencies": { "mime-types": "~2.1.34", "negotiator": "0.6.3" } }, ""],
-
-
"array-flatten": ["array-flatten@1.1.1", "", {}, ""],
-
-
"atomic-sleep": ["atomic-sleep@1.0.0", "", {}, ""],
-
-
"await-lock": ["await-lock@2.2.2", "", {}, ""],
-
-
"base64-js": ["base64-js@1.5.1", "", {}, ""],
-
-
"body-parser": ["body-parser@1.20.3", "", { "dependencies": { "bytes": "3.1.2", "content-type": "~1.0.5", "debug": "2.6.9", "depd": "2.0.0", "destroy": "1.2.0", "http-errors": "2.0.0", "iconv-lite": "0.4.24", "on-finished": "2.4.1", "qs": "6.13.0", "raw-body": "2.5.2", "type-is": "~1.6.18", "unpipe": "1.0.0" } }, ""],
-
-
"buffer": ["buffer@6.0.3", "", { "dependencies": { "base64-js": "^1.3.1", "ieee754": "^1.2.1" } }, ""],
-
-
"bun-types": ["bun-types@1.3.1", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-NMrcy7smratanWJ2mMXdpatalovtxVggkj11bScuWuiOoXTiKIu2eVS1/7qbyI/4yHedtsn175n4Sm4JcdHLXw=="],
-
-
"bytes": ["bytes@3.1.2", "", {}, ""],
-
-
"call-bind-apply-helpers": ["call-bind-apply-helpers@1.0.2", "", { "dependencies": { "es-errors": "^1.3.0", "function-bind": "^1.1.2" } }, ""],
-
-
"call-bound": ["call-bound@1.0.4", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "get-intrinsic": "^1.3.0" } }, ""],
-
-
"cbor-extract": ["cbor-extract@2.2.0", "", { "dependencies": { "node-gyp-build-optional-packages": "5.1.1" }, "optionalDependencies": { "@cbor-extract/cbor-extract-darwin-arm64": "2.2.0" }, "bin": { "download-cbor-prebuilds": "bin/download-prebuilds.js" } }, ""],
-
-
"cbor-x": ["cbor-x@1.6.0", "", { "optionalDependencies": { "cbor-extract": "^2.2.0" } }, ""],
-
-
"cborg": ["cborg@1.10.2", "", { "bin": "cli.js" }, ""],
-
-
"content-disposition": ["content-disposition@0.5.4", "", { "dependencies": { "safe-buffer": "5.2.1" } }, ""],
-
-
"content-type": ["content-type@1.0.5", "", {}, ""],
-
-
"cookie": ["cookie@0.7.1", "", {}, ""],
-
-
"cookie-signature": ["cookie-signature@1.0.6", "", {}, ""],
-
-
"csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],
-
-
"debug": ["debug@2.6.9", "", { "dependencies": { "ms": "2.0.0" } }, ""],
-
-
"depd": ["depd@2.0.0", "", {}, ""],
-
-
"destroy": ["destroy@1.2.0", "", {}, ""],
-
-
"detect-libc": ["detect-libc@2.1.2", "", {}, ""],
-
-
"dunder-proto": ["dunder-proto@1.0.1", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.1", "es-errors": "^1.3.0", "gopd": "^1.2.0" } }, ""],
-
-
"ee-first": ["ee-first@1.1.1", "", {}, ""],
-
-
"encodeurl": ["encodeurl@2.0.0", "", {}, ""],
-
-
"es-define-property": ["es-define-property@1.0.1", "", {}, ""],
-
-
"es-errors": ["es-errors@1.3.0", "", {}, ""],
-
-
"es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, ""],
-
-
"esbuild": ["esbuild@0.25.11", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.25.11", "@esbuild/android-arm": "0.25.11", "@esbuild/android-arm64": "0.25.11", "@esbuild/android-x64": "0.25.11", "@esbuild/darwin-arm64": "0.25.11", "@esbuild/darwin-x64": "0.25.11", "@esbuild/freebsd-arm64": "0.25.11", "@esbuild/freebsd-x64": "0.25.11", "@esbuild/linux-arm": "0.25.11", "@esbuild/linux-arm64": "0.25.11", "@esbuild/linux-ia32": "0.25.11", "@esbuild/linux-loong64": "0.25.11", "@esbuild/linux-mips64el": "0.25.11", "@esbuild/linux-ppc64": "0.25.11", "@esbuild/linux-riscv64": "0.25.11", "@esbuild/linux-s390x": "0.25.11", "@esbuild/linux-x64": "0.25.11", "@esbuild/netbsd-arm64": "0.25.11", "@esbuild/netbsd-x64": "0.25.11", "@esbuild/openbsd-arm64": "0.25.11", "@esbuild/openbsd-x64": "0.25.11", "@esbuild/openharmony-arm64": "0.25.11", "@esbuild/sunos-x64": "0.25.11", "@esbuild/win32-arm64": "0.25.11", "@esbuild/win32-ia32": "0.25.11", "@esbuild/win32-x64": "0.25.11" }, "bin": "bin/esbuild" }, "sha512-KohQwyzrKTQmhXDW1PjCv3Tyspn9n5GcY2RTDqeORIdIJY8yKIF7sTSopFmn/wpMPW4rdPXI0UE5LJLuq3bx0Q=="],
-
-
"escape-html": ["escape-html@1.0.3", "", {}, ""],
-
-
"etag": ["etag@1.8.1", "", {}, ""],
-
-
"event-target-shim": ["event-target-shim@5.0.1", "", {}, ""],
-
-
"eventemitter3": ["eventemitter3@4.0.7", "", {}, ""],
-
-
"events": ["events@3.3.0", "", {}, ""],
-
-
"express": ["express@4.21.2", "", { "dependencies": { "accepts": "~1.3.8", "array-flatten": "1.1.1", "body-parser": "1.20.3", "content-disposition": "0.5.4", "content-type": "~1.0.4", "cookie": "0.7.1", "cookie-signature": "1.0.6", "debug": "2.6.9", "depd": "2.0.0", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "etag": "~1.8.1", "finalhandler": "1.3.1", "fresh": "0.5.2", "http-errors": "2.0.0", "merge-descriptors": "1.0.3", "methods": "~1.1.2", "on-finished": "2.4.1", "parseurl": "~1.3.3", "path-to-regexp": "0.1.12", "proxy-addr": "~2.0.7", "qs": "6.13.0", "range-parser": "~1.2.1", "safe-buffer": "5.2.1", "send": "0.19.0", "serve-static": "1.16.2", "setprototypeof": "1.2.0", "statuses": "2.0.1", "type-is": "~1.6.18", "utils-merge": "1.0.1", "vary": "~1.1.2" } }, ""],
-
-
"fast-redact": ["fast-redact@3.5.0", "", {}, ""],
-
-
"finalhandler": ["finalhandler@1.3.1", "", { "dependencies": { "debug": "2.6.9", "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "on-finished": "2.4.1", "parseurl": "~1.3.3", "statuses": "2.0.1", "unpipe": "~1.0.0" } }, ""],
-
-
"forwarded": ["forwarded@0.2.0", "", {}, ""],
-
-
"fresh": ["fresh@0.5.2", "", {}, ""],
-
-
"fsevents": ["fsevents@2.3.3", "", { "os": "darwin" }, "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw=="],
-
-
"function-bind": ["function-bind@1.1.2", "", {}, ""],
-
-
"get-intrinsic": ["get-intrinsic@1.3.0", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "es-define-property": "^1.0.1", "es-errors": "^1.3.0", "es-object-atoms": "^1.1.1", "function-bind": "^1.1.2", "get-proto": "^1.0.1", "gopd": "^1.2.0", "has-symbols": "^1.1.0", "hasown": "^2.0.2", "math-intrinsics": "^1.1.0" } }, ""],
-
-
"get-proto": ["get-proto@1.0.1", "", { "dependencies": { "dunder-proto": "^1.0.1", "es-object-atoms": "^1.0.0" } }, ""],
-
-
"get-tsconfig": ["get-tsconfig@4.13.0", "", { "dependencies": { "resolve-pkg-maps": "^1.0.0" } }, "sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ=="],
-
-
"gopd": ["gopd@1.2.0", "", {}, ""],
-
-
"graphemer": ["graphemer@1.4.0", "", {}, ""],
-
-
"has-symbols": ["has-symbols@1.1.0", "", {}, ""],
-
-
"hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, ""],
-
-
"hono": ["hono@4.10.4", "", {}, "sha512-YG/fo7zlU3KwrBL5vDpWKisLYiM+nVstBQqfr7gCPbSYURnNEP9BDxEMz8KfsDR9JX0lJWDRNc6nXX31v7ZEyg=="],
-
-
"http-errors": ["http-errors@2.0.0", "", { "dependencies": { "depd": "2.0.0", "inherits": "2.0.4", "setprototypeof": "1.2.0", "statuses": "2.0.1", "toidentifier": "1.0.1" } }, ""],
-
-
"iconv-lite": ["iconv-lite@0.4.24", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3" } }, ""],
-
-
"ieee754": ["ieee754@1.2.1", "", {}, ""],
-
-
"inherits": ["inherits@2.0.4", "", {}, ""],
-
-
"ipaddr.js": ["ipaddr.js@1.9.1", "", {}, ""],
-
-
"iso-datestring-validator": ["iso-datestring-validator@2.2.2", "", {}, ""],
-
-
"math-intrinsics": ["math-intrinsics@1.1.0", "", {}, ""],
-
-
"media-typer": ["media-typer@0.3.0", "", {}, ""],
-
-
"merge-descriptors": ["merge-descriptors@1.0.3", "", {}, ""],
-
-
"methods": ["methods@1.1.2", "", {}, ""],
-
-
"mime": ["mime@1.6.0", "", { "bin": "cli.js" }, ""],
-
-
"mime-db": ["mime-db@1.52.0", "", {}, ""],
-
-
"mime-types": ["mime-types@2.1.35", "", { "dependencies": { "mime-db": "1.52.0" } }, ""],
-
-
"ms": ["ms@2.0.0", "", {}, ""],
-
-
"multiformats": ["multiformats@13.4.1", "", {}, ""],
-
-
"negotiator": ["negotiator@0.6.3", "", {}, ""],
-
-
"node-gyp-build-optional-packages": ["node-gyp-build-optional-packages@5.1.1", "", { "dependencies": { "detect-libc": "^2.0.1" }, "bin": { "node-gyp-build-optional-packages": "bin.js", "node-gyp-build-optional-packages-optional": "optional.js", "node-gyp-build-optional-packages-test": "build-test.js" } }, ""],
-
-
"object-inspect": ["object-inspect@1.13.4", "", {}, ""],
-
-
"on-exit-leak-free": ["on-exit-leak-free@2.1.2", "", {}, ""],
-
-
"on-finished": ["on-finished@2.4.1", "", { "dependencies": { "ee-first": "1.1.1" } }, ""],
-
-
"p-finally": ["p-finally@1.0.0", "", {}, ""],
-
-
"p-queue": ["p-queue@6.6.2", "", { "dependencies": { "eventemitter3": "^4.0.4", "p-timeout": "^3.2.0" } }, ""],
-
-
"p-timeout": ["p-timeout@3.2.0", "", { "dependencies": { "p-finally": "^1.0.0" } }, ""],
-
-
"parseurl": ["parseurl@1.3.3", "", {}, ""],
-
-
"path-to-regexp": ["path-to-regexp@0.1.12", "", {}, ""],
-
-
"pino": ["pino@8.21.0", "", { "dependencies": { "atomic-sleep": "^1.0.0", "fast-redact": "^3.1.1", "on-exit-leak-free": "^2.1.0", "pino-abstract-transport": "^1.2.0", "pino-std-serializers": "^6.0.0", "process-warning": "^3.0.0", "quick-format-unescaped": "^4.0.3", "real-require": "^0.2.0", "safe-stable-stringify": "^2.3.1", "sonic-boom": "^3.7.0", "thread-stream": "^2.6.0" }, "bin": "bin.js" }, ""],
-
-
"pino-abstract-transport": ["pino-abstract-transport@1.2.0", "", { "dependencies": { "readable-stream": "^4.0.0", "split2": "^4.0.0" } }, ""],
-
-
"pino-std-serializers": ["pino-std-serializers@6.2.2", "", {}, ""],
-
-
"postgres": ["postgres@3.4.7", "", {}, ""],
-
-
"process": ["process@0.11.10", "", {}, ""],
-
-
"process-warning": ["process-warning@3.0.0", "", {}, ""],
-
-
"proxy-addr": ["proxy-addr@2.0.7", "", { "dependencies": { "forwarded": "0.2.0", "ipaddr.js": "1.9.1" } }, ""],
-
-
"qs": ["qs@6.13.0", "", { "dependencies": { "side-channel": "^1.0.6" } }, ""],
-
-
"quick-format-unescaped": ["quick-format-unescaped@4.0.4", "", {}, ""],
-
-
"range-parser": ["range-parser@1.2.1", "", {}, ""],
-
-
"rate-limiter-flexible": ["rate-limiter-flexible@2.4.2", "", {}, ""],
-
-
"raw-body": ["raw-body@2.5.2", "", { "dependencies": { "bytes": "3.1.2", "http-errors": "2.0.0", "iconv-lite": "0.4.24", "unpipe": "1.0.0" } }, ""],
-
-
"readable-stream": ["readable-stream@4.7.0", "", { "dependencies": { "abort-controller": "^3.0.0", "buffer": "^6.0.3", "events": "^3.3.0", "process": "^0.11.10", "string_decoder": "^1.3.0" } }, ""],
-
-
"real-require": ["real-require@0.2.0", "", {}, ""],
-
-
"resolve-pkg-maps": ["resolve-pkg-maps@1.0.0", "", {}, "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw=="],
-
-
"safe-buffer": ["safe-buffer@5.2.1", "", {}, ""],
-
-
"safe-stable-stringify": ["safe-stable-stringify@2.5.0", "", {}, ""],
-
-
"safer-buffer": ["safer-buffer@2.1.2", "", {}, ""],
-
-
"send": ["send@0.19.0", "", { "dependencies": { "debug": "2.6.9", "depd": "2.0.0", "destroy": "1.2.0", "encodeurl": "~1.0.2", "escape-html": "~1.0.3", "etag": "~1.8.1", "fresh": "0.5.2", "http-errors": "2.0.0", "mime": "1.6.0", "ms": "2.1.3", "on-finished": "2.4.1", "range-parser": "~1.2.1", "statuses": "2.0.1" } }, ""],
-
-
"serve-static": ["serve-static@1.16.2", "", { "dependencies": { "encodeurl": "~2.0.0", "escape-html": "~1.0.3", "parseurl": "~1.3.3", "send": "0.19.0" } }, ""],
-
-
"setprototypeof": ["setprototypeof@1.2.0", "", {}, ""],
-
-
"side-channel": ["side-channel@1.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3", "side-channel-list": "^1.0.0", "side-channel-map": "^1.0.1", "side-channel-weakmap": "^1.0.2" } }, ""],
-
-
"side-channel-list": ["side-channel-list@1.0.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3" } }, ""],
-
-
"side-channel-map": ["side-channel-map@1.0.1", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3" } }, ""],
-
-
"side-channel-weakmap": ["side-channel-weakmap@1.0.2", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3", "side-channel-map": "^1.0.1" } }, ""],
-
-
"sonic-boom": ["sonic-boom@3.8.1", "", { "dependencies": { "atomic-sleep": "^1.0.0" } }, ""],
-
-
"split2": ["split2@4.2.0", "", {}, ""],
-
-
"statuses": ["statuses@2.0.1", "", {}, ""],
-
-
"string_decoder": ["string_decoder@1.3.0", "", { "dependencies": { "safe-buffer": "~5.2.0" } }, ""],
-
-
"thread-stream": ["thread-stream@2.7.0", "", { "dependencies": { "real-require": "^0.2.0" } }, ""],
-
-
"tlds": ["tlds@1.261.0", "", { "bin": "bin.js" }, ""],
-
-
"toidentifier": ["toidentifier@1.0.1", "", {}, ""],
-
-
"tsx": ["tsx@4.20.6", "", { "dependencies": { "esbuild": "~0.25.0", "get-tsconfig": "^4.7.5" }, "optionalDependencies": { "fsevents": "~2.3.3" }, "bin": "dist/cli.mjs" }, "sha512-ytQKuwgmrrkDTFP4LjR0ToE2nqgy886GpvRSpU0JAnrdBYppuY5rLkRUYPU1yCryb24SsKBTL/hlDQAEFVwtZg=="],
-
-
"type-is": ["type-is@1.6.18", "", { "dependencies": { "media-typer": "0.3.0", "mime-types": "~2.1.24" } }, ""],
-
-
"uint8arrays": ["uint8arrays@3.0.0", "", { "dependencies": { "multiformats": "^9.4.2" } }, ""],
-
-
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
-
-
"unpipe": ["unpipe@1.0.0", "", {}, ""],
-
-
"utils-merge": ["utils-merge@1.0.1", "", {}, ""],
-
-
"varint": ["varint@6.0.0", "", {}, ""],
-
-
"vary": ["vary@1.1.2", "", {}, ""],
-
-
"ws": ["ws@8.18.3", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": ">=5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, ""],
-
-
"zod": ["zod@3.25.76", "", {}, ""],
-
-
"@atproto/api/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@atproto/common/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@atproto/common-web/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@atproto/lexicon/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@atproto/repo/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@atproto/sync/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"@ipld/dag-cbor/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
-
"send/encodeurl": ["encodeurl@1.0.2", "", {}, ""],
-
-
"send/ms": ["ms@2.1.3", "", {}, ""],
-
-
"uint8arrays/multiformats": ["multiformats@9.9.0", "", {}, ""],
-
}
-
}
-28
hosting-service/package.json
···
-
{
-
"name": "wisp-hosting-service",
-
"version": "1.0.0",
-
"type": "module",
-
"scripts": {
-
"dev": "tsx watch src/index.ts",
-
"build": "tsc",
-
"start": "tsx src/index.ts"
-
},
-
"dependencies": {
-
"@atproto/api": "^0.17.4",
-
"@atproto/identity": "^0.4.9",
-
"@atproto/lexicon": "^0.5.1",
-
"@atproto/sync": "^0.1.36",
-
"@atproto/xrpc": "^0.7.5",
-
"@hono/node-server": "^1.19.6",
-
"hono": "^4.10.4",
-
"mime-types": "^2.1.35",
-
"multiformats": "^13.4.1",
-
"postgres": "^3.4.5"
-
},
-
"devDependencies": {
-
"@types/bun": "^1.3.1",
-
"@types/mime-types": "^2.1.4",
-
"@types/node": "^22.10.5",
-
"tsx": "^4.19.2"
-
}
-
}
-60
hosting-service/src/index.ts
···
-
import app from './server';
-
import { serve } from '@hono/node-server';
-
import { FirehoseWorker } from './lib/firehose';
-
import { logger } from './lib/observability';
-
import { mkdirSync, existsSync } from 'fs';
-
-
const PORT = process.env.PORT ? parseInt(process.env.PORT) : 3001;
-
const CACHE_DIR = './cache/sites';
-
-
// Ensure cache directory exists
-
if (!existsSync(CACHE_DIR)) {
-
mkdirSync(CACHE_DIR, { recursive: true });
-
console.log('Created cache directory:', CACHE_DIR);
-
}
-
-
// Start firehose worker with observability logger
-
const firehose = new FirehoseWorker((msg, data) => {
-
logger.info(msg, data);
-
});
-
-
firehose.start();
-
-
// Add health check endpoint
-
app.get('/health', (c) => {
-
const firehoseHealth = firehose.getHealth();
-
return c.json({
-
status: 'ok',
-
firehose: firehoseHealth,
-
});
-
});
-
-
// Start HTTP server with Node.js adapter
-
const server = serve({
-
fetch: app.fetch,
-
port: PORT,
-
});
-
-
console.log(`
-
Wisp Hosting Service
-
-
Server: http://localhost:${PORT}
-
Health: http://localhost:${PORT}/health
-
Cache: ${CACHE_DIR}
-
Firehose: Connected to Firehose
-
`);
-
-
// Graceful shutdown
-
process.on('SIGINT', async () => {
-
console.log('\n🛑 Shutting down...');
-
firehose.stop();
-
server.close();
-
process.exit(0);
-
});
-
-
process.on('SIGTERM', async () => {
-
console.log('\n🛑 Shutting down...');
-
firehose.stop();
-
server.close();
-
process.exit(0);
-
});
-44
hosting-service/src/lexicon/index.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import {
-
type Auth,
-
type Options as XrpcOptions,
-
Server as XrpcServer,
-
type StreamConfigOrHandler,
-
type MethodConfigOrHandler,
-
createServer as createXrpcServer,
-
} from '@atproto/xrpc-server'
-
import { schemas } from './lexicons.js'
-
-
export function createServer(options?: XrpcOptions): Server {
-
return new Server(options)
-
}
-
-
export class Server {
-
xrpc: XrpcServer
-
place: PlaceNS
-
-
constructor(options?: XrpcOptions) {
-
this.xrpc = createXrpcServer(schemas, options)
-
this.place = new PlaceNS(this)
-
}
-
}
-
-
export class PlaceNS {
-
_server: Server
-
wisp: PlaceWispNS
-
-
constructor(server: Server) {
-
this._server = server
-
this.wisp = new PlaceWispNS(server)
-
}
-
}
-
-
export class PlaceWispNS {
-
_server: Server
-
-
constructor(server: Server) {
-
this._server = server
-
}
-
}
-141
hosting-service/src/lexicon/lexicons.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import {
-
type LexiconDoc,
-
Lexicons,
-
ValidationError,
-
type ValidationResult,
-
} from '@atproto/lexicon'
-
import { type $Typed, is$typed, maybe$typed } from './util.js'
-
-
export const schemaDict = {
-
PlaceWispFs: {
-
lexicon: 1,
-
id: 'place.wisp.fs',
-
defs: {
-
main: {
-
type: 'record',
-
description: 'Virtual filesystem manifest for a Wisp site',
-
record: {
-
type: 'object',
-
required: ['site', 'root', 'createdAt'],
-
properties: {
-
site: {
-
type: 'string',
-
},
-
root: {
-
type: 'ref',
-
ref: 'lex:place.wisp.fs#directory',
-
},
-
fileCount: {
-
type: 'integer',
-
minimum: 0,
-
maximum: 1000,
-
},
-
createdAt: {
-
type: 'string',
-
format: 'datetime',
-
},
-
},
-
},
-
},
-
file: {
-
type: 'object',
-
required: ['type', 'blob'],
-
properties: {
-
type: {
-
type: 'string',
-
const: 'file',
-
},
-
blob: {
-
type: 'blob',
-
accept: ['*/*'],
-
maxSize: 1000000,
-
description: 'Content blob ref',
-
},
-
encoding: {
-
type: 'string',
-
enum: ['gzip'],
-
description: 'Content encoding (e.g., gzip for compressed files)',
-
},
-
mimeType: {
-
type: 'string',
-
description: 'Original MIME type before compression',
-
},
-
base64: {
-
type: 'boolean',
-
description:
-
'True if blob content is base64-encoded (used to bypass PDS content sniffing)',
-
},
-
},
-
},
-
directory: {
-
type: 'object',
-
required: ['type', 'entries'],
-
properties: {
-
type: {
-
type: 'string',
-
const: 'directory',
-
},
-
entries: {
-
type: 'array',
-
maxLength: 500,
-
items: {
-
type: 'ref',
-
ref: 'lex:place.wisp.fs#entry',
-
},
-
},
-
},
-
},
-
entry: {
-
type: 'object',
-
required: ['name', 'node'],
-
properties: {
-
name: {
-
type: 'string',
-
maxLength: 255,
-
},
-
node: {
-
type: 'union',
-
refs: ['lex:place.wisp.fs#file', 'lex:place.wisp.fs#directory'],
-
},
-
},
-
},
-
},
-
},
-
} as const satisfies Record<string, LexiconDoc>
-
export const schemas = Object.values(schemaDict) satisfies LexiconDoc[]
-
export const lexicons: Lexicons = new Lexicons(schemas)
-
-
export function validate<T extends { $type: string }>(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType: true,
-
): ValidationResult<T>
-
export function validate<T extends { $type?: string }>(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType?: false,
-
): ValidationResult<T>
-
export function validate(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType?: boolean,
-
): ValidationResult {
-
return (requiredType ? is$typed : maybe$typed)(v, id, hash)
-
? lexicons.validate(`${id}#${hash}`, v)
-
: {
-
success: false,
-
error: new ValidationError(
-
`Must be an object with "${hash === 'main' ? id : `${id}#${hash}`}" $type property`,
-
),
-
}
-
}
-
-
export const ids = {
-
PlaceWispFs: 'place.wisp.fs',
-
} as const
-85
hosting-service/src/lexicon/types/place/wisp/fs.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import { type ValidationResult, BlobRef } from '@atproto/lexicon'
-
import { CID } from 'multiformats'
-
import { validate as _validate } from '../../../lexicons'
-
import { type $Typed, is$typed as _is$typed, type OmitKey } from '../../../util'
-
-
const is$typed = _is$typed,
-
validate = _validate
-
const id = 'place.wisp.fs'
-
-
export interface Record {
-
$type: 'place.wisp.fs'
-
site: string
-
root: Directory
-
fileCount?: number
-
createdAt: string
-
[k: string]: unknown
-
}
-
-
const hashRecord = 'main'
-
-
export function isRecord<V>(v: V) {
-
return is$typed(v, id, hashRecord)
-
}
-
-
export function validateRecord<V>(v: V) {
-
return validate<Record & V>(v, id, hashRecord, true)
-
}
-
-
export interface File {
-
$type?: 'place.wisp.fs#file'
-
type: 'file'
-
/** Content blob ref */
-
blob: BlobRef
-
/** Content encoding (e.g., gzip for compressed files) */
-
encoding?: 'gzip'
-
/** Original MIME type before compression */
-
mimeType?: string
-
/** True if blob content is base64-encoded (used to bypass PDS content sniffing) */
-
base64?: boolean
-
}
-
-
const hashFile = 'file'
-
-
export function isFile<V>(v: V) {
-
return is$typed(v, id, hashFile)
-
}
-
-
export function validateFile<V>(v: V) {
-
return validate<File & V>(v, id, hashFile)
-
}
-
-
export interface Directory {
-
$type?: 'place.wisp.fs#directory'
-
type: 'directory'
-
entries: Entry[]
-
}
-
-
const hashDirectory = 'directory'
-
-
export function isDirectory<V>(v: V) {
-
return is$typed(v, id, hashDirectory)
-
}
-
-
export function validateDirectory<V>(v: V) {
-
return validate<Directory & V>(v, id, hashDirectory)
-
}
-
-
export interface Entry {
-
$type?: 'place.wisp.fs#entry'
-
name: string
-
node: $Typed<File> | $Typed<Directory> | { $type: string }
-
}
-
-
const hashEntry = 'entry'
-
-
export function isEntry<V>(v: V) {
-
return is$typed(v, id, hashEntry)
-
}
-
-
export function validateEntry<V>(v: V) {
-
return validate<Entry & V>(v, id, hashEntry)
-
}
-82
hosting-service/src/lexicon/util.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
-
import { type ValidationResult } from '@atproto/lexicon'
-
-
export type OmitKey<T, K extends keyof T> = {
-
[K2 in keyof T as K2 extends K ? never : K2]: T[K2]
-
}
-
-
export type $Typed<V, T extends string = string> = V & { $type: T }
-
export type Un$Typed<V extends { $type?: string }> = OmitKey<V, '$type'>
-
-
export type $Type<Id extends string, Hash extends string> = Hash extends 'main'
-
? Id
-
: `${Id}#${Hash}`
-
-
function isObject<V>(v: V): v is V & object {
-
return v != null && typeof v === 'object'
-
}
-
-
function is$type<Id extends string, Hash extends string>(
-
$type: unknown,
-
id: Id,
-
hash: Hash,
-
): $type is $Type<Id, Hash> {
-
return hash === 'main'
-
? $type === id
-
: // $type === `${id}#${hash}`
-
typeof $type === 'string' &&
-
$type.length === id.length + 1 + hash.length &&
-
$type.charCodeAt(id.length) === 35 /* '#' */ &&
-
$type.startsWith(id) &&
-
$type.endsWith(hash)
-
}
-
-
export type $TypedObject<
-
V,
-
Id extends string,
-
Hash extends string,
-
> = V extends {
-
$type: $Type<Id, Hash>
-
}
-
? V
-
: V extends { $type?: string }
-
? V extends { $type?: infer T extends $Type<Id, Hash> }
-
? V & { $type: T }
-
: never
-
: V & { $type: $Type<Id, Hash> }
-
-
export function is$typed<V, Id extends string, Hash extends string>(
-
v: V,
-
id: Id,
-
hash: Hash,
-
): v is $TypedObject<V, Id, Hash> {
-
return isObject(v) && '$type' in v && is$type(v.$type, id, hash)
-
}
-
-
export function maybe$typed<V, Id extends string, Hash extends string>(
-
v: V,
-
id: Id,
-
hash: Hash,
-
): v is V & object & { $type?: $Type<Id, Hash> } {
-
return (
-
isObject(v) &&
-
('$type' in v ? v.$type === undefined || is$type(v.$type, id, hash) : true)
-
)
-
}
-
-
export type Validator<R = unknown> = (v: unknown) => ValidationResult<R>
-
export type ValidatorParam<V extends Validator> =
-
V extends Validator<infer R> ? R : never
-
-
/**
-
* Utility function that allows to convert a "validate*" utility function into a
-
* type predicate.
-
*/
-
export function asPredicate<V extends Validator>(validate: V) {
-
return function <T>(v: T): v is T & ValidatorParam<V> {
-
return validate(v).success
-
}
-
}
-126
hosting-service/src/lib/db.ts
···
-
import postgres from 'postgres';
-
import { createHash } from 'crypto';
-
-
const sql = postgres(
-
process.env.DATABASE_URL || 'postgres://postgres:postgres@localhost:5432/wisp',
-
{
-
max: 10,
-
idle_timeout: 20,
-
}
-
);
-
-
export interface DomainLookup {
-
did: string;
-
rkey: string | null;
-
}
-
-
export interface CustomDomainLookup {
-
id: string;
-
domain: string;
-
did: string;
-
rkey: string | null;
-
verified: boolean;
-
}
-
-
-
-
export async function getWispDomain(domain: string): Promise<DomainLookup | null> {
-
const key = domain.toLowerCase();
-
-
// Query database
-
const result = await sql<DomainLookup[]>`
-
SELECT did, rkey FROM domains WHERE domain = ${key} LIMIT 1
-
`;
-
const data = result[0] || null;
-
-
return data;
-
}
-
-
export async function getCustomDomain(domain: string): Promise<CustomDomainLookup | null> {
-
const key = domain.toLowerCase();
-
-
// Query database
-
const result = await sql<CustomDomainLookup[]>`
-
SELECT id, domain, did, rkey, verified FROM custom_domains
-
WHERE domain = ${key} AND verified = true LIMIT 1
-
`;
-
const data = result[0] || null;
-
-
return data;
-
}
-
-
export async function getCustomDomainByHash(hash: string): Promise<CustomDomainLookup | null> {
-
// Query database
-
const result = await sql<CustomDomainLookup[]>`
-
SELECT id, domain, did, rkey, verified FROM custom_domains
-
WHERE id = ${hash} AND verified = true LIMIT 1
-
`;
-
const data = result[0] || null;
-
-
return data;
-
}
-
-
export async function upsertSite(did: string, rkey: string, displayName?: string) {
-
try {
-
// Only set display_name if provided (not undefined/null/empty)
-
const cleanDisplayName = displayName && displayName.trim() ? displayName.trim() : null;
-
-
await sql`
-
INSERT INTO sites (did, rkey, display_name, created_at, updated_at)
-
VALUES (${did}, ${rkey}, ${cleanDisplayName}, EXTRACT(EPOCH FROM NOW()), EXTRACT(EPOCH FROM NOW()))
-
ON CONFLICT (did, rkey)
-
DO UPDATE SET
-
display_name = CASE
-
WHEN EXCLUDED.display_name IS NOT NULL THEN EXCLUDED.display_name
-
ELSE sites.display_name
-
END,
-
updated_at = EXTRACT(EPOCH FROM NOW())
-
`;
-
} catch (err) {
-
console.error('Failed to upsert site', err);
-
}
-
}
-
-
/**
-
* Generate a numeric lock ID from a string key
-
* PostgreSQL advisory locks use bigint (64-bit signed integer)
-
*/
-
function stringToLockId(key: string): bigint {
-
const hash = createHash('sha256').update(key).digest('hex');
-
// Take first 16 hex characters (64 bits) and convert to bigint
-
const hashNum = BigInt('0x' + hash.substring(0, 16));
-
// Keep within signed int64 range
-
return hashNum & 0x7FFFFFFFFFFFFFFFn;
-
}
-
-
/**
-
* Acquire a distributed lock using PostgreSQL advisory locks
-
* Returns true if lock was acquired, false if already held by another instance
-
* Lock is automatically released when the transaction ends or connection closes
-
*/
-
export async function tryAcquireLock(key: string): Promise<boolean> {
-
const lockId = stringToLockId(key);
-
-
try {
-
const result = await sql`SELECT pg_try_advisory_lock(${Number(lockId)}) as acquired`;
-
return result[0]?.acquired === true;
-
} catch (err) {
-
console.error('Failed to acquire lock', { key, error: err });
-
return false;
-
}
-
}
-
-
/**
-
* Release a distributed lock
-
*/
-
export async function releaseLock(key: string): Promise<void> {
-
const lockId = stringToLockId(key);
-
-
try {
-
await sql`SELECT pg_advisory_unlock(${Number(lockId)})`;
-
} catch (err) {
-
console.error('Failed to release lock', { key, error: err });
-
}
-
}
-
-
export { sql };
-261
hosting-service/src/lib/firehose.ts
···
-
import { existsSync, rmSync } from 'fs';
-
import { getPdsForDid, downloadAndCacheSite, extractBlobCid, fetchSiteRecord } from './utils';
-
import { upsertSite, tryAcquireLock, releaseLock } from './db';
-
import { safeFetch } from './safe-fetch';
-
import { isRecord, validateRecord } from '../lexicon/types/place/wisp/fs';
-
import { Firehose } from '@atproto/sync';
-
import { IdResolver } from '@atproto/identity';
-
-
const CACHE_DIR = './cache/sites';
-
-
export class FirehoseWorker {
-
private firehose: Firehose | null = null;
-
private idResolver: IdResolver;
-
private isShuttingDown = false;
-
private lastEventTime = Date.now();
-
-
constructor(
-
private logger?: (msg: string, data?: Record<string, unknown>) => void,
-
) {
-
this.idResolver = new IdResolver();
-
}
-
-
private log(msg: string, data?: Record<string, unknown>) {
-
const log = this.logger || console.log;
-
log(`[FirehoseWorker] ${msg}`, data || {});
-
}
-
-
start() {
-
this.log('Starting firehose worker');
-
this.connect();
-
}
-
-
stop() {
-
this.log('Stopping firehose worker');
-
this.isShuttingDown = true;
-
-
if (this.firehose) {
-
this.firehose.destroy();
-
this.firehose = null;
-
}
-
}
-
-
private connect() {
-
if (this.isShuttingDown) return;
-
-
this.log('Connecting to AT Protocol firehose');
-
-
this.firehose = new Firehose({
-
idResolver: this.idResolver,
-
service: 'wss://bsky.network',
-
filterCollections: ['place.wisp.fs'],
-
handleEvent: async (evt: any) => {
-
this.lastEventTime = Date.now();
-
-
// Watch for write events
-
if (evt.event === 'create' || evt.event === 'update') {
-
const record = evt.record;
-
-
// If the write is a valid place.wisp.fs record
-
if (
-
evt.collection === 'place.wisp.fs' &&
-
isRecord(record) &&
-
validateRecord(record).success
-
) {
-
this.log('Received place.wisp.fs event', {
-
did: evt.did,
-
event: evt.event,
-
rkey: evt.rkey,
-
});
-
-
try {
-
await this.handleCreateOrUpdate(evt.did, evt.rkey, record, evt.cid?.toString());
-
} catch (err) {
-
this.log('Error handling event', {
-
did: evt.did,
-
event: evt.event,
-
rkey: evt.rkey,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
-
}
-
} else if (evt.event === 'delete' && evt.collection === 'place.wisp.fs') {
-
this.log('Received delete event', {
-
did: evt.did,
-
rkey: evt.rkey,
-
});
-
-
try {
-
await this.handleDelete(evt.did, evt.rkey);
-
} catch (err) {
-
this.log('Error handling delete', {
-
did: evt.did,
-
rkey: evt.rkey,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
}
-
}
-
},
-
onError: (err: any) => {
-
this.log('Firehose error', {
-
error: err instanceof Error ? err.message : String(err),
-
stack: err instanceof Error ? err.stack : undefined,
-
fullError: err,
-
});
-
console.error('Full firehose error:', err);
-
},
-
});
-
-
this.firehose.start();
-
this.log('Firehose started');
-
}
-
-
private async handleCreateOrUpdate(did: string, site: string, record: any, eventCid?: string) {
-
this.log('Processing create/update', { did, site });
-
-
// Record is already validated in handleEvent
-
const fsRecord = record;
-
-
const pdsEndpoint = await getPdsForDid(did);
-
if (!pdsEndpoint) {
-
this.log('Could not resolve PDS for DID', { did });
-
return;
-
}
-
-
this.log('Resolved PDS', { did, pdsEndpoint });
-
-
// Verify record exists on PDS and fetch its CID
-
let verifiedCid: string;
-
try {
-
const result = await fetchSiteRecord(did, site);
-
-
if (!result) {
-
this.log('Record not found on PDS, skipping cache', { did, site });
-
return;
-
}
-
-
verifiedCid = result.cid;
-
-
// Verify event CID matches PDS CID (prevent cache poisoning)
-
if (eventCid && eventCid !== verifiedCid) {
-
this.log('CID mismatch detected - potential spoofed event', {
-
did,
-
site,
-
eventCid,
-
verifiedCid
-
});
-
return;
-
}
-
-
this.log('Record verified on PDS', { did, site, cid: verifiedCid });
-
} catch (err) {
-
this.log('Failed to verify record on PDS', {
-
did,
-
site,
-
error: err instanceof Error ? err.message : String(err),
-
});
-
return;
-
}
-
-
// Cache the record with verified CID (uses atomic swap internally)
-
// All instances cache locally for edge serving
-
await downloadAndCacheSite(did, site, fsRecord, pdsEndpoint, verifiedCid);
-
-
// Acquire distributed lock only for database write to prevent duplicate writes
-
-		const lockKey = `db:upsert:${did}:${site}`;
-		const lockAcquired = await tryAcquireLock(lockKey);
-
-		if (!lockAcquired) {
-			this.log('Another instance is writing to DB, skipping upsert', { did, site });
-			this.log('Successfully processed create/update (cached locally)', { did, site });
-			return;
-		}
-
-		try {
-			// Upsert site to database (only one instance does this)
-			await upsertSite(did, site, fsRecord.site);
-			this.log('Successfully processed create/update (cached + DB updated)', { did, site });
-		} finally {
-			// Always release lock, even if DB write fails
-			await releaseLock(lockKey);
-		}
-	}
-
-	private async handleDelete(did: string, site: string) {
-		this.log('Processing delete', { did, site });
-
-		// All instances should delete their local cache (no lock needed)
-		const pdsEndpoint = await getPdsForDid(did);
-		if (!pdsEndpoint) {
-			this.log('Could not resolve PDS for DID', { did });
-			return;
-		}
-
-		// Verify record is actually deleted from PDS
-		try {
-			const recordUrl = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(site)}`;
-			const recordRes = await safeFetch(recordUrl);
-
-			if (recordRes.ok) {
-				this.log('Record still exists on PDS, not deleting cache', {
-					did,
-					site,
-				});
-				return;
-			}
-
-			this.log('Verified record is deleted from PDS', {
-				did,
-				site,
-				status: recordRes.status,
-			});
-		} catch (err) {
-			this.log('Error verifying deletion on PDS', {
-				did,
-				site,
-				error: err instanceof Error ? err.message : String(err),
-			});
-		}
-
-		// Delete cache
-		this.deleteCache(did, site);
-
-		this.log('Successfully processed delete', { did, site });
-	}
-
-	private deleteCache(did: string, site: string) {
-		const cacheDir = `${CACHE_DIR}/${did}/${site}`;
-
-		if (!existsSync(cacheDir)) {
-			this.log('Cache directory does not exist, nothing to delete', {
-				did,
-				site,
-			});
-			return;
-		}
-
-		try {
-			rmSync(cacheDir, { recursive: true, force: true });
-			this.log('Cache deleted', { did, site, path: cacheDir });
-		} catch (err) {
-			this.log('Failed to delete cache', {
-				did,
-				site,
-				path: cacheDir,
-				error: err instanceof Error ? err.message : String(err),
-			});
-		}
-	}
-
-	getHealth() {
-		const isConnected = this.firehose !== null;
-		const timeSinceLastEvent = Date.now() - this.lastEventTime;
-
-		return {
-			connected: isConnected,
-			lastEventTime: this.lastEventTime,
-			timeSinceLastEvent,
-			healthy: isConnected && timeSinceLastEvent < 300000, // 5 minutes
-		};
-	}
-}
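The upsert above follows an acquire-then-finally-release shape so that exactly one instance writes to the database while every instance keeps its local cache. The sketch below is a minimal in-memory stand-in for `tryAcquireLock`/`releaseLock` (the real implementations live elsewhere in the repo and presumably use a shared store so separate processes can coordinate); `withLock` is a hypothetical helper name, shown only to illustrate the pattern.

```typescript
// In-memory stand-in: a Set of currently held lock keys.
const locks = new Set<string>();

async function tryAcquireLock(key: string): Promise<boolean> {
	if (locks.has(key)) return false; // another holder, caller should skip
	locks.add(key);
	return true;
}

async function releaseLock(key: string): Promise<void> {
	locks.delete(key);
}

// Acquire, run the critical section, and always release — even on throw.
async function withLock(key: string, work: () => Promise<void>): Promise<boolean> {
	if (!(await tryAcquireLock(key))) return false;
	try {
		await work();
	} finally {
		await releaseLock(key);
	}
	return true;
}
```

The `finally` block mirrors the handler's "always release lock, even if DB write fails" comment: a thrown `upsertSite` must not leave the lock held forever.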
-147
hosting-service/src/lib/html-rewriter.ts
···
-/**
- * Safely rewrites absolute paths in HTML to be relative to a base path
- * Only processes common HTML attributes and preserves external URLs, data URIs, etc.
- */
-
-const REWRITABLE_ATTRIBUTES = [
-	'src',
-	'href',
-	'action',
-	'data',
-	'poster',
-	'srcset',
-] as const;
-
-/**
- * Check if a path should be rewritten
- */
-function shouldRewritePath(path: string): boolean {
-	// Don't rewrite empty paths
-	if (!path) return false;
-
-	// Don't rewrite external URLs (http://, https://, //)
-	if (path.startsWith('http://') || path.startsWith('https://') || path.startsWith('//')) {
-		return false;
-	}
-
-	// Don't rewrite data URIs or other schemes (except file paths)
-	if (path.includes(':') && !path.startsWith('./') && !path.startsWith('../')) {
-		return false;
-	}
-
-	// Don't rewrite pure anchors or paths that start with /#
-	if (path.startsWith('#') || path.startsWith('/#')) return false;
-
-	// Don't rewrite relative paths (./ or ../)
-	if (path.startsWith('./') || path.startsWith('../')) return false;
-
-	// Rewrite absolute paths (/)
-	return true;
-}
-
-/**
- * Rewrite a single path
- */
-function rewritePath(path: string, basePath: string): string {
-	if (!shouldRewritePath(path)) {
-		return path;
-	}
-
-	// Handle absolute paths: /file.js -> /base/file.js
-	if (path.startsWith('/')) {
-		return basePath + path.slice(1);
-	}
-
-	// At this point, only plain filenames without ./ or ../ prefix should reach here
-	// But since we're filtering those in shouldRewritePath, this shouldn't happen
-	return path;
-}
-
-/**
- * Rewrite srcset attribute (can contain multiple URLs)
- * Format: "url1 1x, url2 2x" or "url1 100w, url2 200w"
- */
-function rewriteSrcset(srcset: string, basePath: string): string {
-	return srcset
-		.split(',')
-		.map(part => {
-			const trimmed = part.trim();
-			const spaceIndex = trimmed.indexOf(' ');
-
-			if (spaceIndex === -1) {
-				// No descriptor, just URL
-				return rewritePath(trimmed, basePath);
-			}
-
-			const url = trimmed.substring(0, spaceIndex);
-			const descriptor = trimmed.substring(spaceIndex);
-			return rewritePath(url, basePath) + descriptor;
-		})
-		.join(', ');
-}
-
-/**
- * Rewrite absolute paths in HTML content
- * Uses simple regex matching for safety (no full HTML parsing)
- */
-export function rewriteHtmlPaths(html: string, basePath: string): string {
-	// Ensure base path ends with /
-	const normalizedBase = basePath.endsWith('/') ? basePath : basePath + '/';
-
-	let rewritten = html;
-
-	// Rewrite each attribute type
-	// Use more specific patterns to prevent ReDoS attacks
-	for (const attr of REWRITABLE_ATTRIBUTES) {
-		if (attr === 'srcset') {
-			// Special handling for srcset - use possessive quantifiers via atomic grouping simulation
-			// Limit whitespace to reasonable amount (max 5 spaces) to prevent ReDoS
-			const srcsetRegex = new RegExp(
-				`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
-				'gi'
-			);
-			rewritten = rewritten.replace(srcsetRegex, (match, value) => {
-				const rewrittenValue = rewriteSrcset(value, normalizedBase);
-				return `${attr}="${rewrittenValue}"`;
-			});
-		} else {
-			// Regular attributes with quoted values
-			// Limit whitespace to prevent catastrophic backtracking
-			const doubleQuoteRegex = new RegExp(
-				`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}"([^"]*)"`,
-				'gi'
-			);
-			const singleQuoteRegex = new RegExp(
-				`\\b${attr}[ \\t]{0,5}=[ \\t]{0,5}'([^']*)'`,
-				'gi'
-			);
-
-			rewritten = rewritten.replace(doubleQuoteRegex, (match, value) => {
-				const rewrittenValue = rewritePath(value, normalizedBase);
-				return `${attr}="${rewrittenValue}"`;
-			});
-
-			rewritten = rewritten.replace(singleQuoteRegex, (match, value) => {
-				const rewrittenValue = rewritePath(value, normalizedBase);
-				return `${attr}='${rewrittenValue}'`;
-			});
-		}
-	}
-
-	return rewritten;
-}
-
-/**
- * Check if content is HTML based on content or filename
- */
-export function isHtmlContent(
-	filepath: string,
-	contentType?: string
-): boolean {
-	if (contentType && contentType.includes('text/html')) {
-		return true;
-	}
-
-	const ext = filepath.toLowerCase().split('.').pop();
-	return ext === 'html' || ext === 'htm';
-}
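The rewrite rules in this file boil down to a single decision per path: only root-absolute paths get the site's base path prepended; external URLs, other schemes, anchors, and relative paths pass through. A condensed standalone sketch of those rules (`rewritePathSketch` is an illustrative name, not a function from the repo):

```typescript
// One-function summary of the shouldRewritePath/rewritePath rules above.
function rewritePathSketch(path: string, basePath: string): string {
	const base = basePath.endsWith('/') ? basePath : basePath + '/';
	if (!path) return path;
	if (/^(https?:)?\/\//.test(path)) return path; // external URL or protocol-relative
	if (path.includes(':') && !path.startsWith('./') && !path.startsWith('../')) {
		return path; // data:, mailto:, and other schemes
	}
	if (path.startsWith('#') || path.startsWith('/#')) return path; // anchors
	if (path.startsWith('./') || path.startsWith('../')) return path; // already relative
	if (path.startsWith('/')) return base + path.slice(1); // root-absolute -> prefixed
	return path; // bare filename, leave alone
}
```

So on a site served under `/u/site/`, `<script src="/app.js">` becomes `/u/site/app.js`, while CDN URLs and `#fragment` links are untouched.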
-326
hosting-service/src/lib/observability.ts
···
-// DIY Observability for Hosting Service
-import type { Context } from 'hono'
-
-// Types
-export interface LogEntry {
-	id: string
-	timestamp: Date
-	level: 'info' | 'warn' | 'error' | 'debug'
-	message: string
-	service: string
-	context?: Record<string, any>
-	traceId?: string
-	eventType?: string
-}
-
-export interface ErrorEntry {
-	id: string
-	timestamp: Date
-	message: string
-	stack?: string
-	service: string
-	context?: Record<string, any>
-	count: number
-	lastSeen: Date
-}
-
-export interface MetricEntry {
-	timestamp: Date
-	path: string
-	method: string
-	statusCode: number
-	duration: number
-	service: string
-}
-
-// In-memory storage with rotation
-const MAX_LOGS = 5000
-const MAX_ERRORS = 500
-const MAX_METRICS = 10000
-
-const logs: LogEntry[] = []
-const errors: Map<string, ErrorEntry> = new Map()
-const metrics: MetricEntry[] = []
-
-// Helper to generate unique IDs
-let logCounter = 0
-let errorCounter = 0
-
-function generateId(prefix: string, counter: number): string {
-	return `${prefix}-${Date.now()}-${counter}`
-}
-
-// Helper to extract event type from message
-function extractEventType(message: string): string | undefined {
-	const match = message.match(/^\[([^\]]+)\]/)
-	return match ? match[1] : undefined
-}
-
-// Log collector
-export const logCollector = {
-	log(level: LogEntry['level'], message: string, service: string, context?: Record<string, any>, traceId?: string) {
-		const entry: LogEntry = {
-			id: generateId('log', logCounter++),
-			timestamp: new Date(),
-			level,
-			message,
-			service,
-			context,
-			traceId,
-			eventType: extractEventType(message)
-		}
-
-		logs.unshift(entry)
-
-		// Rotate if needed
-		if (logs.length > MAX_LOGS) {
-			logs.splice(MAX_LOGS)
-		}
-
-		// Also log to console for compatibility
-		const contextStr = context ? ` ${JSON.stringify(context)}` : ''
-		const traceStr = traceId ? ` [trace:${traceId}]` : ''
-		console[level === 'debug' ? 'log' : level](`[${service}] ${message}${contextStr}${traceStr}`)
-	},
-
-	info(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-		this.log('info', message, service, context, traceId)
-	},
-
-	warn(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-		this.log('warn', message, service, context, traceId)
-	},
-
-	error(message: string, service: string, error?: any, context?: Record<string, any>, traceId?: string) {
-		const ctx = { ...context }
-		if (error instanceof Error) {
-			ctx.error = error.message
-			ctx.stack = error.stack
-		} else if (error) {
-			ctx.error = String(error)
-		}
-		this.log('error', message, service, ctx, traceId)
-
-		// Also track in errors
-		errorTracker.track(message, service, error, context)
-	},
-
-	debug(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-		if (process.env.NODE_ENV !== 'production') {
-			this.log('debug', message, service, context, traceId)
-		}
-	},
-
-	getLogs(filter?: { level?: string; service?: string; limit?: number; search?: string; eventType?: string }) {
-		let filtered = [...logs]
-
-		if (filter?.level) {
-			filtered = filtered.filter(log => log.level === filter.level)
-		}
-
-		if (filter?.service) {
-			filtered = filtered.filter(log => log.service === filter.service)
-		}
-
-		if (filter?.eventType) {
-			filtered = filtered.filter(log => log.eventType === filter.eventType)
-		}
-
-		if (filter?.search) {
-			const search = filter.search.toLowerCase()
-			filtered = filtered.filter(log =>
-				log.message.toLowerCase().includes(search) ||
-				JSON.stringify(log.context).toLowerCase().includes(search)
-			)
-		}
-
-		const limit = filter?.limit || 100
-		return filtered.slice(0, limit)
-	},
-
-	clear() {
-		logs.length = 0
-	}
-}
-
-// Error tracker with deduplication
-export const errorTracker = {
-	track(message: string, service: string, error?: any, context?: Record<string, any>) {
-		const key = `${service}:${message}`
-
-		const existing = errors.get(key)
-		if (existing) {
-			existing.count++
-			existing.lastSeen = new Date()
-			if (context) {
-				existing.context = { ...existing.context, ...context }
-			}
-		} else {
-			const entry: ErrorEntry = {
-				id: generateId('error', errorCounter++),
-				timestamp: new Date(),
-				message,
-				service,
-				context,
-				count: 1,
-				lastSeen: new Date()
-			}
-
-			if (error instanceof Error) {
-				entry.stack = error.stack
-			}
-
-			errors.set(key, entry)
-
-			// Rotate if needed
-			if (errors.size > MAX_ERRORS) {
-				const oldest = Array.from(errors.keys())[0]
-				if (oldest !== undefined) {
-					errors.delete(oldest)
-				}
-			}
-		}
-	},
-
-	getErrors(filter?: { service?: string; limit?: number }) {
-		let filtered = Array.from(errors.values())
-
-		if (filter?.service) {
-			filtered = filtered.filter(err => err.service === filter.service)
-		}
-
-		// Sort by last seen (most recent first)
-		filtered.sort((a, b) => b.lastSeen.getTime() - a.lastSeen.getTime())
-
-		const limit = filter?.limit || 100
-		return filtered.slice(0, limit)
-	},
-
-	clear() {
-		errors.clear()
-	}
-}
-
-// Metrics collector
-export const metricsCollector = {
-	recordRequest(path: string, method: string, statusCode: number, duration: number, service: string) {
-		const entry: MetricEntry = {
-			timestamp: new Date(),
-			path,
-			method,
-			statusCode,
-			duration,
-			service
-		}
-
-		metrics.unshift(entry)
-
-		// Rotate if needed
-		if (metrics.length > MAX_METRICS) {
-			metrics.splice(MAX_METRICS)
-		}
-	},
-
-	getMetrics(filter?: { service?: string; timeWindow?: number }) {
-		let filtered = [...metrics]
-
-		if (filter?.service) {
-			filtered = filtered.filter(m => m.service === filter.service)
-		}
-
-		if (filter?.timeWindow) {
-			const cutoff = Date.now() - filter.timeWindow
-			filtered = filtered.filter(m => m.timestamp.getTime() > cutoff)
-		}
-
-		return filtered
-	},
-
-	getStats(service?: string, timeWindow: number = 3600000) {
-		const filtered = this.getMetrics({ service, timeWindow })
-
-		if (filtered.length === 0) {
-			return {
-				totalRequests: 0,
-				avgDuration: 0,
-				p50Duration: 0,
-				p95Duration: 0,
-				p99Duration: 0,
-				errorRate: 0,
-				requestsPerMinute: 0
-			}
-		}
-
-		const durations = filtered.map(m => m.duration).sort((a, b) => a - b)
-		const totalDuration = durations.reduce((sum, d) => sum + d, 0)
-		const errors = filtered.filter(m => m.statusCode >= 400).length
-
-		const p50 = durations[Math.floor(durations.length * 0.5)]
-		const p95 = durations[Math.floor(durations.length * 0.95)]
-		const p99 = durations[Math.floor(durations.length * 0.99)]
-
-		const timeWindowMinutes = timeWindow / 60000
-
-		return {
-			totalRequests: filtered.length,
-			avgDuration: Math.round(totalDuration / filtered.length),
-			p50Duration: Math.round(p50 ?? 0),
-			p95Duration: Math.round(p95 ?? 0),
-			p99Duration: Math.round(p99 ?? 0),
-			errorRate: (errors / filtered.length) * 100,
-			requestsPerMinute: Math.round(filtered.length / timeWindowMinutes)
-		}
-	},
-
-	clear() {
-		metrics.length = 0
-	}
-}
-
-// Hono middleware for request timing
-export function observabilityMiddleware(service: string) {
-	return async (c: Context, next: () => Promise<void>) => {
-		const startTime = Date.now()
-
-		await next()
-
-		const duration = Date.now() - startTime
-		const { pathname } = new URL(c.req.url)
-
-		metricsCollector.recordRequest(
-			pathname,
-			c.req.method,
-			c.res.status,
-			duration,
-			service
-		)
-	}
-}
-
-// Hono error handler
-export function observabilityErrorHandler(service: string) {
-	return (err: Error, c: Context) => {
-		const { pathname } = new URL(c.req.url)
-
-		logCollector.error(
-			`Request failed: ${c.req.method} ${pathname}`,
-			service,
-			err,
-			{ statusCode: c.res.status || 500 }
-		)
-
-		return c.text('Internal Server Error', 500)
-	}
-}
-
-// Export singleton logger for easy access
-export const logger = {
-	info: (message: string, context?: Record<string, any>) =>
-		logCollector.info(message, 'hosting-service', context),
-	warn: (message: string, context?: Record<string, any>) =>
-		logCollector.warn(message, 'hosting-service', context),
-	error: (message: string, error?: any, context?: Record<string, any>) =>
-		logCollector.error(message, 'hosting-service', error, context),
-	debug: (message: string, context?: Record<string, any>) =>
-		logCollector.debug(message, 'hosting-service', context)
-}
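`getStats()` computes its latency percentiles with the nearest-rank method: sort durations ascending and index at `floor(n * q)`. Isolated as a helper (an illustrative name, not a function in the file), with a guard for the empty case:

```typescript
// Nearest-rank percentile, as used by getStats() above:
// sort ascending, index at floor(n * q), fall back to the last
// element (or 0) when the index is out of range.
function percentile(durations: number[], q: number): number {
	const sorted = [...durations].sort((a, b) => a - b);
	return sorted[Math.floor(sorted.length * q)] ?? sorted[sorted.length - 1] ?? 0;
}
```

For small samples this is coarse — with four requests, p95 and p99 are both just the slowest request — which is fine for a dashboard over an in-memory ring of recent metrics.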
-187
hosting-service/src/lib/safe-fetch.ts
···
-/**
- * SSRF-hardened fetch utility
- * Prevents requests to private networks, localhost, and enforces timeouts/size limits
- */
-
-const BLOCKED_IP_RANGES = [
-	/^127\./, // 127.0.0.0/8 - Loopback
-	/^10\./, // 10.0.0.0/8 - Private
-	/^172\.(1[6-9]|2\d|3[01])\./, // 172.16.0.0/12 - Private
-	/^192\.168\./, // 192.168.0.0/16 - Private
-	/^169\.254\./, // 169.254.0.0/16 - Link-local
-	/^::1$/, // IPv6 loopback
-	/^fe80:/, // IPv6 link-local
-	/^fc00:/, // IPv6 unique local
-	/^fd00:/, // IPv6 unique local
-];
-
-const BLOCKED_HOSTS = [
-	'localhost',
-	'metadata.google.internal',
-	'169.254.169.254',
-];
-
-const FETCH_TIMEOUT = 120000; // 120 seconds
-const FETCH_TIMEOUT_BLOB = 120000; // 2 minutes for blob downloads
-const MAX_RESPONSE_SIZE = 10 * 1024 * 1024; // 10MB
-const MAX_JSON_SIZE = 1024 * 1024; // 1MB
-const MAX_BLOB_SIZE = 100 * 1024 * 1024; // 100MB
-const MAX_REDIRECTS = 10;
-
-function isBlockedHost(hostname: string): boolean {
-	const lowerHost = hostname.toLowerCase();
-
-	if (BLOCKED_HOSTS.includes(lowerHost)) {
-		return true;
-	}
-
-	for (const pattern of BLOCKED_IP_RANGES) {
-		if (pattern.test(lowerHost)) {
-			return true;
-		}
-	}
-
-	return false;
-}
-
-export async function safeFetch(
-	url: string,
-	options?: RequestInit & { maxSize?: number; timeout?: number }
-): Promise<Response> {
-	const timeoutMs = options?.timeout ?? FETCH_TIMEOUT;
-	const maxSize = options?.maxSize ?? MAX_RESPONSE_SIZE;
-
-	// Parse and validate URL
-	let parsedUrl: URL;
-	try {
-		parsedUrl = new URL(url);
-	} catch (err) {
-		throw new Error(`Invalid URL: ${url}`);
-	}
-
-	if (!['http:', 'https:'].includes(parsedUrl.protocol)) {
-		throw new Error(`Blocked protocol: ${parsedUrl.protocol}`);
-	}
-
-	const hostname = parsedUrl.hostname;
-	if (isBlockedHost(hostname)) {
-		throw new Error(`Blocked host: ${hostname}`);
-	}
-
-	const controller = new AbortController();
-	const timeoutId = setTimeout(() => controller.abort(), timeoutMs);
-
-	try {
-		const response = await fetch(url, {
-			...options,
-			signal: controller.signal,
-			redirect: 'follow',
-		});
-
-		const contentLength = response.headers.get('content-length');
-		if (contentLength && parseInt(contentLength, 10) > maxSize) {
-			throw new Error(`Response too large: ${contentLength} bytes`);
-		}
-
-		return response;
-	} catch (err) {
-		if (err instanceof Error && err.name === 'AbortError') {
-			throw new Error(`Request timeout after ${timeoutMs}ms`);
-		}
-		throw err;
-	} finally {
-		clearTimeout(timeoutId);
-	}
-}
-
-export async function safeFetchJson<T = any>(
-	url: string,
-	options?: RequestInit & { maxSize?: number; timeout?: number }
-): Promise<T> {
-	const maxJsonSize = options?.maxSize ?? MAX_JSON_SIZE;
-	const response = await safeFetch(url, { ...options, maxSize: maxJsonSize });
-
-	if (!response.ok) {
-		throw new Error(`HTTP ${response.status}: ${response.statusText}`);
-	}
-
-	const reader = response.body?.getReader();
-	if (!reader) {
-		throw new Error('No response body');
-	}
-
-	const chunks: Uint8Array[] = [];
-	let totalSize = 0;
-
-	try {
-		while (true) {
-			const { done, value } = await reader.read();
-			if (done) break;
-
-			totalSize += value.length;
-			if (totalSize > maxJsonSize) {
-				throw new Error(`Response exceeds max size: ${maxJsonSize} bytes`);
-			}
-
-			chunks.push(value);
-		}
-	} finally {
-		reader.releaseLock();
-	}
-
-	const combined = new Uint8Array(totalSize);
-	let offset = 0;
-	for (const chunk of chunks) {
-		combined.set(chunk, offset);
-		offset += chunk.length;
-	}
-
-	const text = new TextDecoder().decode(combined);
-	return JSON.parse(text);
-}
-
-export async function safeFetchBlob(
-	url: string,
-	options?: RequestInit & { maxSize?: number; timeout?: number }
-): Promise<Uint8Array> {
-	const maxBlobSize = options?.maxSize ?? MAX_BLOB_SIZE;
-	const timeoutMs = options?.timeout ?? FETCH_TIMEOUT_BLOB;
-	const response = await safeFetch(url, { ...options, maxSize: maxBlobSize, timeout: timeoutMs });
-
-	if (!response.ok) {
-		throw new Error(`HTTP ${response.status}: ${response.statusText}`);
-	}
-
-	const reader = response.body?.getReader();
-	if (!reader) {
-		throw new Error('No response body');
-	}
-
-	const chunks: Uint8Array[] = [];
-	let totalSize = 0;
-
-	try {
-		while (true) {
-			const { done, value } = await reader.read();
-			if (done) break;
-
-			totalSize += value.length;
-			if (totalSize > maxBlobSize) {
-				throw new Error(`Blob exceeds max size: ${maxBlobSize} bytes`);
-			}
-
-			chunks.push(value);
-		}
-	} finally {
-		reader.releaseLock();
-	}
-
-	const combined = new Uint8Array(totalSize);
-	let offset = 0;
-	for (const chunk of chunks) {
-		combined.set(chunk, offset);
-		offset += chunk.length;
-	}
-
-	return combined;
-}
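The first line of defense in `safeFetch` is a plain hostname check against a deny list and a set of private-range patterns. A self-contained sketch of that check (`isPrivateTarget` is an illustrative name; the ranges mirror the `BLOCKED_IP_RANGES` table above). Note this is a simplification: matching on the hostname string alone does not cover a public name that resolves to a private address, so a full SSRF defense would also validate after DNS resolution.

```typescript
// Private/loopback/link-local patterns, matched against the lowercased hostname.
const PRIVATE_RANGES: RegExp[] = [
	/^127\./, // loopback
	/^10\./, // RFC 1918
	/^172\.(1[6-9]|2\d|3[01])\./, // 172.16.0.0/12
	/^192\.168\./, // RFC 1918
	/^169\.254\./, // link-local (cloud metadata lives here)
	/^::1$/, /^fe80:/, /^fc00:/, /^fd00:/, // IPv6 equivalents
];
const DENY_HOSTS = ['localhost', 'metadata.google.internal', '169.254.169.254'];

function isPrivateTarget(hostname: string): boolean {
	const h = hostname.toLowerCase();
	return DENY_HOSTS.includes(h) || PRIVATE_RANGES.some((re) => re.test(h));
}
```

The `172.16.0.0/12` pattern is the subtle one: it must accept second octets 16–31 only, so `172.20.x.x` is blocked while `172.32.x.x` is not.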
-27
hosting-service/src/lib/types.ts
···
-import type { BlobRef } from '@atproto/api';
-
-export interface WispFsRecord {
-	$type: 'place.wisp.fs';
-	site: string;
-	root: Directory;
-	fileCount?: number;
-	createdAt: string;
-}
-
-export interface File {
-	$type?: 'place.wisp.fs#file';
-	type: 'file';
-	blob: BlobRef;
-}
-
-export interface Directory {
-	$type?: 'place.wisp.fs#directory';
-	type: 'directory';
-	entries: Entry[];
-}
-
-export interface Entry {
-	$type?: 'place.wisp.fs#entry';
-	name: string;
-	node: File | Directory | { $type: string };
-}
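Concretely, a `place.wisp.fs` record is a named tree of entries whose leaves carry blob references. An illustrative literal matching the interfaces above (the blob ref here is a stand-in plain object with a placeholder CID, not a real `BlobRef` instance):

```typescript
// Example record shape for a one-file site. Values are illustrative only.
const exampleRecord = {
	$type: 'place.wisp.fs' as const,
	site: 'my-site',
	fileCount: 1,
	createdAt: new Date(0).toISOString(),
	root: {
		type: 'directory' as const,
		entries: [
			{
				name: 'index.html',
				node: {
					type: 'file' as const,
					// Stand-in for a BlobRef; a real record carries a CID link here.
					blob: { $type: 'blob', ref: { $link: 'bafy...' }, mimeType: 'text/html', size: 123 },
				},
			},
		],
	},
};
```

Directories nest by placing another `{ type: 'directory', entries: [...] }` as an entry's `node`, which is what the recursive walk in `cacheFiles` traverses.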
-169
hosting-service/src/lib/utils.test.ts
···
-import { describe, test, expect } from 'bun:test'
-import { sanitizePath, extractBlobCid } from './utils'
-import { CID } from 'multiformats'
-
-describe('sanitizePath', () => {
-	test('allows normal file paths', () => {
-		expect(sanitizePath('index.html')).toBe('index.html')
-		expect(sanitizePath('css/styles.css')).toBe('css/styles.css')
-		expect(sanitizePath('images/logo.png')).toBe('images/logo.png')
-		expect(sanitizePath('js/app.js')).toBe('js/app.js')
-	})
-
-	test('allows deeply nested paths', () => {
-		expect(sanitizePath('assets/images/icons/favicon.ico')).toBe('assets/images/icons/favicon.ico')
-		expect(sanitizePath('a/b/c/d/e/f.txt')).toBe('a/b/c/d/e/f.txt')
-	})
-
-	test('removes leading slashes', () => {
-		expect(sanitizePath('/index.html')).toBe('index.html')
-		expect(sanitizePath('//index.html')).toBe('index.html')
-		expect(sanitizePath('///index.html')).toBe('index.html')
-		expect(sanitizePath('/css/styles.css')).toBe('css/styles.css')
-	})
-
-	test('blocks parent directory traversal', () => {
-		expect(sanitizePath('../etc/passwd')).toBe('etc/passwd')
-		expect(sanitizePath('../../etc/passwd')).toBe('etc/passwd')
-		expect(sanitizePath('../../../etc/passwd')).toBe('etc/passwd')
-		expect(sanitizePath('css/../../../etc/passwd')).toBe('css/etc/passwd')
-	})
-
-	test('blocks directory traversal in middle of path', () => {
-		expect(sanitizePath('images/../../../etc/passwd')).toBe('images/etc/passwd')
-		// Note: sanitizePath only filters out ".." segments, doesn't resolve paths
-		expect(sanitizePath('a/b/../c')).toBe('a/b/c')
-		expect(sanitizePath('a/../b/../c')).toBe('a/b/c')
-	})
-
-	test('removes current directory references', () => {
-		expect(sanitizePath('./index.html')).toBe('index.html')
-		expect(sanitizePath('././index.html')).toBe('index.html')
-		expect(sanitizePath('css/./styles.css')).toBe('css/styles.css')
-		expect(sanitizePath('./css/./styles.css')).toBe('css/styles.css')
-	})
-
-	test('removes empty path segments', () => {
-		expect(sanitizePath('css//styles.css')).toBe('css/styles.css')
-		expect(sanitizePath('css///styles.css')).toBe('css/styles.css')
-		expect(sanitizePath('a//b//c')).toBe('a/b/c')
-	})
-
-	test('blocks null bytes', () => {
-		// Null bytes cause the entire segment to be filtered out
-		expect(sanitizePath('index.html\0.txt')).toBe('')
-		expect(sanitizePath('test\0')).toBe('')
-		// Null byte in middle segment
-		expect(sanitizePath('css/bad\0name/styles.css')).toBe('css/styles.css')
-	})
-
-	test('handles mixed attacks', () => {
-		expect(sanitizePath('/../../../etc/passwd')).toBe('etc/passwd')
-		expect(sanitizePath('/./././../etc/passwd')).toBe('etc/passwd')
-		expect(sanitizePath('//../../.\0./etc/passwd')).toBe('etc/passwd')
-	})
-
-	test('handles edge cases', () => {
-		expect(sanitizePath('')).toBe('')
-		expect(sanitizePath('/')).toBe('')
-		expect(sanitizePath('//')).toBe('')
-		expect(sanitizePath('.')).toBe('')
-		expect(sanitizePath('..')).toBe('')
-		expect(sanitizePath('../..')).toBe('')
-	})
-
-	test('preserves valid special characters in filenames', () => {
-		expect(sanitizePath('file-name.html')).toBe('file-name.html')
-		expect(sanitizePath('file_name.html')).toBe('file_name.html')
-		expect(sanitizePath('file.name.html')).toBe('file.name.html')
-		expect(sanitizePath('file (1).html')).toBe('file (1).html')
-		expect(sanitizePath('file@2x.png')).toBe('file@2x.png')
-	})
-
-	test('handles Unicode characters', () => {
-		expect(sanitizePath('ๆ–‡ไปถ.html')).toBe('ๆ–‡ไปถ.html')
-		expect(sanitizePath('ั„ะฐะนะป.html')).toBe('ั„ะฐะนะป.html')
-		expect(sanitizePath('ใƒ•ใ‚กใ‚คใƒซ.html')).toBe('ใƒ•ใ‚กใ‚คใƒซ.html')
-	})
-})
-
-describe('extractBlobCid', () => {
-	const TEST_CID = 'bafkreid7ybejd5s2vv2j7d4aajjlmdgazguemcnuliiyfn6coxpwp2mi6y'
-
-	test('extracts CID from IPLD link', () => {
-		const blobRef = { $link: TEST_CID }
-		expect(extractBlobCid(blobRef)).toBe(TEST_CID)
-	})
-
-	test('extracts CID from typed BlobRef with CID object', () => {
-		const cid = CID.parse(TEST_CID)
-		const blobRef = { ref: cid }
-		const result = extractBlobCid(blobRef)
-		expect(result).toBe(TEST_CID)
-	})
-
-	test('extracts CID from typed BlobRef with IPLD link', () => {
-		const blobRef = {
-			ref: { $link: TEST_CID }
-		}
-		expect(extractBlobCid(blobRef)).toBe(TEST_CID)
-	})
-
-	test('extracts CID from untyped BlobRef', () => {
-		const blobRef = { cid: TEST_CID }
-		expect(extractBlobCid(blobRef)).toBe(TEST_CID)
-	})
-
-	test('returns null for invalid blob ref', () => {
-		expect(extractBlobCid(null)).toBe(null)
-		expect(extractBlobCid(undefined)).toBe(null)
-		expect(extractBlobCid({})).toBe(null)
-		expect(extractBlobCid('not-an-object')).toBe(null)
-		expect(extractBlobCid(123)).toBe(null)
-	})
-
-	test('returns null for malformed objects', () => {
-		expect(extractBlobCid({ wrongKey: 'value' })).toBe(null)
-		expect(extractBlobCid({ ref: 'not-a-cid' })).toBe(null)
-		expect(extractBlobCid({ ref: {} })).toBe(null)
-	})
-
-	test('handles nested structures from AT Proto API', () => {
-		// Real structure from AT Proto
-		const blobRef = {
-			$type: 'blob',
-			ref: CID.parse(TEST_CID),
-			mimeType: 'text/html',
-			size: 1234
-		}
-		expect(extractBlobCid(blobRef)).toBe(TEST_CID)
-	})
-
-	test('handles BlobRef with additional properties', () => {
-		const blobRef = {
-			ref: { $link: TEST_CID },
-			mimeType: 'image/png',
-			size: 5678,
-			someOtherField: 'value'
-		}
-		expect(extractBlobCid(blobRef)).toBe(TEST_CID)
-	})
-
-	test('prioritizes checking IPLD link first', () => {
-		// Direct $link takes precedence
-		const directLink = { $link: TEST_CID }
-		expect(extractBlobCid(directLink)).toBe(TEST_CID)
-	})
-
-	test('handles CID v0 format', () => {
-		const cidV0 = 'QmZ4tDuvesekSs4qM5ZBKpXiZGun7S2CYtEZRB3DYXkjGx'
-		const blobRef = { $link: cidV0 }
-		expect(extractBlobCid(blobRef)).toBe(cidV0)
-	})
-
-	test('handles CID v1 format', () => {
-		const cidV1 = 'bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi'
-		const blobRef = { $link: cidV1 }
-		expect(extractBlobCid(blobRef)).toBe(cidV1)
-	})
-})
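The test suite pins down `sanitizePath` precisely: it splits on `/`, drops empty segments, `.`, `..`, and any segment containing a null byte, then rejoins — it filters rather than resolves, which is why `a/b/../c` becomes `a/b/c` instead of `a/c`. A minimal implementation consistent with every case above (`sanitizePathSketch` is an illustrative name; the real function lives in `utils.ts`):

```typescript
// Filter-based sanitization: no path resolution, just segment dropping.
function sanitizePathSketch(p: string): string {
	return p
		.split('/')
		.filter(
			(seg) =>
				seg !== '' && // collapses // and strips leading /
				seg !== '.' && // current-directory references
				seg !== '..' && // traversal attempts
				!seg.includes('\0') // null-byte smuggling
		)
		.join('/');
}
```

Dropping `..` outright (instead of resolving it) is the safer choice for a cache-path sanitizer: the output can never climb above the cache root, at the cost of mangling paths that legitimately contain traversal, which uploaded site trees should not.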
-392
hosting-service/src/lib/utils.ts
···
-import { AtpAgent } from '@atproto/api';
-import type { Record as WispFsRecord, Directory, Entry, File } from '../lexicon/types/place/wisp/fs';
-import { existsSync, mkdirSync, readFileSync, rmSync } from 'fs';
-import { writeFile, readFile, rename } from 'fs/promises';
-import { safeFetchJson, safeFetchBlob } from './safe-fetch';
-import { CID } from 'multiformats';
-
-const CACHE_DIR = './cache/sites';
-const CACHE_TTL = 14 * 24 * 60 * 60 * 1000; // 14 days cache TTL
-
-interface CacheMetadata {
-	recordCid: string;
-	cachedAt: number;
-	did: string;
-	rkey: string;
-}
-
-interface IpldLink {
-	$link: string;
-}
-
-interface TypedBlobRef {
-	ref: CID | IpldLink;
-}
-
-interface UntypedBlobRef {
-	cid: string;
-}
-
-function isIpldLink(obj: unknown): obj is IpldLink {
-	return typeof obj === 'object' && obj !== null && '$link' in obj && typeof (obj as IpldLink).$link === 'string';
-}
-
-function isTypedBlobRef(obj: unknown): obj is TypedBlobRef {
-	return typeof obj === 'object' && obj !== null && 'ref' in obj;
-}
-
-function isUntypedBlobRef(obj: unknown): obj is UntypedBlobRef {
-	return typeof obj === 'object' && obj !== null && 'cid' in obj && typeof (obj as UntypedBlobRef).cid === 'string';
-}
-
-export async function resolveDid(identifier: string): Promise<string | null> {
-	try {
-		// If it's already a DID, return it
-		if (identifier.startsWith('did:')) {
-			return identifier;
-		}
-
-		// Otherwise, resolve the handle using agent's built-in method
-		const agent = new AtpAgent({ service: 'https://public.api.bsky.app' });
-		const response = await agent.resolveHandle({ handle: identifier });
-		return response.data.did;
-	} catch (err) {
-		console.error('Failed to resolve identifier', identifier, err);
-		return null;
-	}
-}
-
-export async function getPdsForDid(did: string): Promise<string | null> {
-	try {
-		let doc;
-
-		if (did.startsWith('did:plc:')) {
-			doc = await safeFetchJson(`https://plc.directory/${encodeURIComponent(did)}`);
-		} else if (did.startsWith('did:web:')) {
-			const didUrl = didWebToHttps(did);
-			doc = await safeFetchJson(didUrl);
-		} else {
-			console.error('Unsupported DID method', did);
-			return null;
-		}
-
-		const services = doc.service || [];
-		const pdsService = services.find((s: any) => s.id === '#atproto_pds');
-
-		return pdsService?.serviceEndpoint || null;
-	} catch (err) {
-		console.error('Failed to get PDS for DID', did, err);
-		return null;
-	}
-}
-
-function didWebToHttps(did: string): string {
-	const didParts = did.split(':');
-	if (didParts.length < 3 || didParts[0] !== 'did' || didParts[1] !== 'web') {
-		throw new Error('Invalid did:web format');
-	}
-
-	const domain = didParts[2];
-	const pathParts = didParts.slice(3);
-
-	if (pathParts.length === 0) {
-		return `https://${domain}/.well-known/did.json`;
-	} else {
-		const path = pathParts.join('/');
-		return `https://${domain}/${path}/did.json`;
-	}
-}
-
-export async function fetchSiteRecord(did: string, rkey: string): Promise<{ record: WispFsRecord; cid: string } | null> {
-	try {
-		const pdsEndpoint = await getPdsForDid(did);
-		if (!pdsEndpoint) return null;
-
-		const url = `${pdsEndpoint}/xrpc/com.atproto.repo.getRecord?repo=${encodeURIComponent(did)}&collection=place.wisp.fs&rkey=${encodeURIComponent(rkey)}`;
-		const data = await safeFetchJson(url);
-
-		return {
-			record: data.value as WispFsRecord,
-			cid: data.cid || ''
-		};
-	} catch (err) {
-		console.error('Failed to fetch site record', did, rkey, err);
-		return null;
-	}
-}
-
-export function extractBlobCid(blobRef: unknown): string | null {
-	if (isIpldLink(blobRef)) {
-		return blobRef.$link;
-	}
-
-	if (isTypedBlobRef(blobRef)) {
-		const ref = blobRef.ref;
-
-		const cid = CID.asCID(ref);
-		if (cid) {
-			return cid.toString();
-		}
-
-		if (isIpldLink(ref)) {
-			return ref.$link;
-		}
-	}
-
-	if (isUntypedBlobRef(blobRef)) {
-		return blobRef.cid;
-	}
-
-	return null;
-}
-
-export async function downloadAndCacheSite(did: string, rkey: string, record: WispFsRecord, pdsEndpoint: string, recordCid: string): Promise<void> {
-	console.log('Caching site', did, rkey);
-
-	if (!record.root) {
-		console.error('Record missing root directory:', JSON.stringify(record, null, 2));
-		throw new Error('Invalid record structure: missing root directory');
-	}
-
-	if (!record.root.entries || !Array.isArray(record.root.entries)) {
-		console.error('Record root missing entries array:', JSON.stringify(record.root, null, 2));
-		throw new Error('Invalid record structure: root missing entries array');
-	}
-
-	// Use a temporary directory with timestamp to avoid collisions
-	const tempSuffix = `.tmp-${Date.now()}-${Math.random().toString(36).slice(2, 9)}`;
-	const tempDir = `${CACHE_DIR}/${did}/${rkey}${tempSuffix}`;
-	const finalDir = `${CACHE_DIR}/${did}/${rkey}`;
-
-	try {
-		// Download to temporary directory
-		await cacheFiles(did, rkey, record.root.entries, pdsEndpoint, '', tempSuffix);
-		await saveCacheMetadata(did, rkey, recordCid, tempSuffix);
-
-		// Atomically replace old cache with new cache
-		// On POSIX systems (Linux/macOS), rename is atomic
-		if (existsSync(finalDir)) {
-			// Rename old directory to backup
-			const backupDir = `${finalDir}.old-${Date.now()}`;
-			await rename(finalDir, backupDir);
-
-			try {
-				// Rename new directory to final location
-				await rename(tempDir, finalDir);
-
-				// Clean up old backup
-				rmSync(backupDir, { recursive: true, force: true });
-			} catch (err) {
-				// If rename failed, restore backup
-				if (existsSync(backupDir) && !existsSync(finalDir)) {
-					await rename(backupDir, finalDir);
-				}
-				throw err;
-			}
-		} else {
-			// No existing cache, just rename temp to final
-			await rename(tempDir, finalDir);
-		}
-
-		console.log('Successfully cached site atomically', did, rkey);
-	} catch (err) {
-		// Clean up temp directory on failure
-		if (existsSync(tempDir)) {
-			rmSync(tempDir, { recursive: true, force: true });
-		}
-		throw err;
-	}
-}
-
-async function cacheFiles(
-	did: string,
-	site: string,
-	entries: Entry[],
-	pdsEndpoint: string,
-	pathPrefix: string,
-	dirSuffix: string = ''
-): Promise<void> {
-	// Collect all file blob download tasks first
-	const downloadTasks: Array<() => Promise<void>> = [];
-
-	function collectFileTasks(
-		entries: Entry[],
-		currentPathPrefix: string
-	) {
-		for (const entry of entries) {
-			const currentPath = currentPathPrefix ? `${currentPathPrefix}/${entry.name}` : entry.name;
-			const node = entry.node;
-
-			if ('type' in node && node.type === 'directory' && 'entries' in node) {
-				collectFileTasks(node.entries, currentPath);
-			} else if ('type' in node && node.type === 'file' && 'blob' in node) {
-				const fileNode = node as File;
-				downloadTasks.push(() => cacheFileBlob(
-					did,
-					site,
-					currentPath,
-					fileNode.blob,
-					pdsEndpoint,
-					fileNode.encoding,
-					fileNode.mimeType,
-					fileNode.base64,
-					dirSuffix
-				));
-			}
-		}
-	}
-
-	collectFileTasks(entries, pathPrefix);
-
-	// Execute downloads concurrently with a limit of 3 at a time
-	const concurrencyLimit = 3;
-	for (let i = 0; i < downloadTasks.length; i += concurrencyLimit) {
-		const batch = downloadTasks.slice(i, i + concurrencyLimit);
-		await Promise.all(batch.map(task => task()));
-	}
-}
-
-async function cacheFileBlob(
-	did: string,
-	site: string,
-	filePath: string,
-	blobRef: any,
-	pdsEndpoint: string,
-	encoding?: 'gzip',
-	mimeType?: string,
-	base64?: boolean,
-	dirSuffix: string = ''
-): Promise<void> {
-	const cid = extractBlobCid(blobRef);
-	if (!cid) {
-		console.error('Could not extract CID from blob', blobRef);
-		return;
-	}
-
-	const blobUrl = `${pdsEndpoint}/xrpc/com.atproto.sync.getBlob?did=${encodeURIComponent(did)}&cid=${encodeURIComponent(cid)}`;
-
-	// Allow up to 100MB per file blob, with 2 minute timeout
-	let content = await safeFetchBlob(blobUrl, { maxSize: 100 * 1024 * 1024, timeout: 120000 });
-
-	console.log(`[DEBUG] ${filePath}: fetched ${content.length} bytes, base64=${base64}, encoding=${encoding}, mimeType=${mimeType}`);
-
-	// If content is base64-encoded, decode it back to binary (gzipped or not)
-	if (base64) {
-		const originalSize = content.length;
-		// The content from the blob is base64 text, decode it directly to binary
-		const buffer = Buffer.from(content);
-		const base64String = buffer.toString('ascii'); // Use ascii for base64 text, not utf-8
-		console.log(`[DEBUG] ${filePath}: base64 string first 100 chars: ${base64String.substring(0, 100)}`);
-		content = Buffer.from(base64String, 'base64');
-		console.log(`[DEBUG] ${filePath}: decoded from ${originalSize} bytes to ${content.length} bytes`);
-
-		// Check if it's actually gzipped by looking at magic bytes
-		if (content.length >= 2) {
-			const magic = content[0] === 0x1f && content[1] === 0x8b;
-			const byte0 = content[0];
-			const byte1 = content[1];
-			console.log(`[DEBUG] ${filePath}: has gzip magic bytes: ${magic} (0x${byte0?.toString(16)}, 0x${byte1?.toString(16)})`);
-		}
-	}
-
-	const cacheFile = `${CACHE_DIR}/${did}/${site}${dirSuffix}/${filePath}`;
-	const fileDir = cacheFile.substring(0, cacheFile.lastIndexOf('/'));
-
-	if (fileDir && !existsSync(fileDir)) {
-		mkdirSync(fileDir, { recursive: true });
-	}
-
-	await writeFile(cacheFile, content);
-
-	// Store metadata if file is compressed
-	if (encoding === 'gzip' && mimeType) {
-		const metaFile = `${cacheFile}.meta`;
-		await writeFile(metaFile, JSON.stringify({ encoding, mimeType }));
-		console.log('Cached file', filePath, content.length, 'bytes (gzipped,', mimeType + ')');
-	} else {
-		console.log('Cached file', filePath, content.length, 'bytes');
-	}
-}
-
-/**
- * Sanitize a file path to prevent directory traversal attacks
- * Removes any path segments that attempt to go up directories
-
*/
-
export function sanitizePath(filePath: string): string {
-
// Remove leading slashes
-
let cleaned = filePath.replace(/^\/+/, '');
-
-
// Split into segments and filter out dangerous ones
-
const segments = cleaned.split('/').filter(segment => {
-
// Remove empty segments
-
if (!segment || segment === '.') return false;
-
// Remove parent directory references
-
if (segment === '..') return false;
-
// Remove segments with null bytes
-
if (segment.includes('\0')) return false;
-
return true;
-
});
-
-
// Rejoin the safe segments
-
return segments.join('/');
-
}
-
-
export function getCachedFilePath(did: string, site: string, filePath: string): string {
-
const sanitizedPath = sanitizePath(filePath);
-
return `${CACHE_DIR}/${did}/${site}/${sanitizedPath}`;
-
}
-
-
export function isCached(did: string, site: string): boolean {
-
return existsSync(`${CACHE_DIR}/${did}/${site}`);
-
}
-
-
async function saveCacheMetadata(did: string, rkey: string, recordCid: string, dirSuffix: string = ''): Promise<void> {
-
const metadata: CacheMetadata = {
-
recordCid,
-
cachedAt: Date.now(),
-
did,
-
rkey
-
};
-
-
const metadataPath = `${CACHE_DIR}/${did}/${rkey}${dirSuffix}/.metadata.json`;
-
const metadataDir = metadataPath.substring(0, metadataPath.lastIndexOf('/'));
-
-
if (!existsSync(metadataDir)) {
-
mkdirSync(metadataDir, { recursive: true });
-
}
-
-
await writeFile(metadataPath, JSON.stringify(metadata, null, 2));
-
}
-
-
async function getCacheMetadata(did: string, rkey: string): Promise<CacheMetadata | null> {
-
try {
-
const metadataPath = `${CACHE_DIR}/${did}/${rkey}/.metadata.json`;
-
if (!existsSync(metadataPath)) return null;
-
-
const content = await readFile(metadataPath, 'utf-8');
-
return JSON.parse(content) as CacheMetadata;
-
} catch (err) {
-
console.error('Failed to read cache metadata', err);
-
return null;
-
}
-
}
-
-
export async function isCacheValid(did: string, rkey: string, currentRecordCid?: string): Promise<boolean> {
-
const metadata = await getCacheMetadata(did, rkey);
-
if (!metadata) return false;
-
-
// Check if cache has expired (14 days TTL)
-
const cacheAge = Date.now() - metadata.cachedAt;
-
if (cacheAge > CACHE_TTL) {
-
console.log('[Cache] Cache expired for', did, rkey);
-
return false;
-
}
-
-
// If current CID is provided, verify it matches
-
if (currentRecordCid && metadata.recordCid !== currentRecordCid) {
-
console.log('[Cache] CID mismatch for', did, rkey, 'cached:', metadata.recordCid, 'current:', currentRecordCid);
-
return false;
-
}
-
-
return true;
-
}
-453
hosting-service/src/server.ts
···
-import { Hono } from 'hono';
-import { getWispDomain, getCustomDomain, getCustomDomainByHash } from './lib/db';
-import { resolveDid, getPdsForDid, fetchSiteRecord, downloadAndCacheSite, getCachedFilePath, isCached, sanitizePath } from './lib/utils';
-import { rewriteHtmlPaths, isHtmlContent } from './lib/html-rewriter';
-import { existsSync, readFileSync } from 'fs';
-import { lookup } from 'mime-types';
-import { logger, observabilityMiddleware, observabilityErrorHandler, logCollector, errorTracker, metricsCollector } from './lib/observability';
-
-const BASE_HOST = process.env.BASE_HOST || 'wisp.place';
-
-/**
- * Validate site name (rkey) to prevent injection attacks
- * Must match AT Protocol rkey format
- */
-function isValidRkey(rkey: string): boolean {
-  if (!rkey || typeof rkey !== 'string') return false;
-  if (rkey.length < 1 || rkey.length > 512) return false;
-  if (rkey === '.' || rkey === '..') return false;
-  if (rkey.includes('/') || rkey.includes('\\') || rkey.includes('\0')) return false;
-  const validRkeyPattern = /^[a-zA-Z0-9._~:-]+$/;
-  return validRkeyPattern.test(rkey);
-}
-
-// Helper to serve files from cache
-async function serveFromCache(did: string, rkey: string, filePath: string) {
-  // Default to index.html if path is empty or ends with /
-  let requestPath = filePath || 'index.html';
-  if (requestPath.endsWith('/')) {
-    requestPath += 'index.html';
-  }
-
-  const cachedFile = getCachedFilePath(did, rkey, requestPath);
-
-  if (existsSync(cachedFile)) {
-    const content = readFileSync(cachedFile);
-    const metaFile = `${cachedFile}.meta`;
-
-    console.log(`[DEBUG SERVE] ${requestPath}: file size=${content.length} bytes, metaFile exists=${existsSync(metaFile)}`);
-
-    // Check if file has compression metadata
-    if (existsSync(metaFile)) {
-      const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-      console.log(`[DEBUG SERVE] ${requestPath}: meta=${JSON.stringify(meta)}`);
-
-      // Check actual content for gzip magic bytes
-      if (content.length >= 2) {
-        const hasGzipMagic = content[0] === 0x1f && content[1] === 0x8b;
-        const byte0 = content[0];
-        const byte1 = content[1];
-        console.log(`[DEBUG SERVE] ${requestPath}: has gzip magic bytes=${hasGzipMagic} (0x${byte0?.toString(16)}, 0x${byte1?.toString(16)})`);
-      }
-
-      if (meta.encoding === 'gzip' && meta.mimeType) {
-        // Don't serve already-compressed media formats with Content-Encoding: gzip
-        // These formats (video, audio, images) are already compressed and the browser
-        // can't decode them if we add another layer of compression
-        const alreadyCompressedTypes = [
-          'video/', 'audio/', 'image/jpeg', 'image/jpg', 'image/png',
-          'image/gif', 'image/webp', 'application/pdf'
-        ];
-
-        const isAlreadyCompressed = alreadyCompressedTypes.some(type =>
-          meta.mimeType.toLowerCase().startsWith(type)
-        );
-
-        if (isAlreadyCompressed) {
-          // Decompress the file before serving
-          console.log(`[DEBUG SERVE] ${requestPath}: decompressing already-compressed media type`);
-          const { gunzipSync } = await import('zlib');
-          const decompressed = gunzipSync(content);
-          console.log(`[DEBUG SERVE] ${requestPath}: decompressed from ${content.length} to ${decompressed.length} bytes`);
-          return new Response(decompressed, {
-            headers: {
-              'Content-Type': meta.mimeType,
-            },
-          });
-        }
-
-        // Serve gzipped content with proper headers (for HTML, CSS, JS, etc.)
-        console.log(`[DEBUG SERVE] ${requestPath}: serving as gzipped with Content-Encoding header`);
-        return new Response(content, {
-          headers: {
-            'Content-Type': meta.mimeType,
-            'Content-Encoding': 'gzip',
-          },
-        });
-      }
-    }
-
-    // Serve non-compressed files normally
-    const mimeType = lookup(cachedFile) || 'application/octet-stream';
-    return new Response(content, {
-      headers: {
-        'Content-Type': mimeType,
-      },
-    });
-  }
-
-  // Try index.html for directory-like paths
-  if (!requestPath.includes('.')) {
-    const indexFile = getCachedFilePath(did, rkey, `${requestPath}/index.html`);
-    if (existsSync(indexFile)) {
-      const content = readFileSync(indexFile);
-      const metaFile = `${indexFile}.meta`;
-
-      // Check if file has compression metadata
-      if (existsSync(metaFile)) {
-        const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-        if (meta.encoding === 'gzip' && meta.mimeType) {
-          return new Response(content, {
-            headers: {
-              'Content-Type': meta.mimeType,
-              'Content-Encoding': 'gzip',
-            },
-          });
-        }
-      }
-
-      return new Response(content, {
-        headers: {
-          'Content-Type': 'text/html; charset=utf-8',
-        },
-      });
-    }
-  }
-
-  return new Response('Not Found', { status: 404 });
-}
-
-// Helper to serve files from cache with HTML path rewriting for sites.wisp.place routes
-async function serveFromCacheWithRewrite(
-  did: string,
-  rkey: string,
-  filePath: string,
-  basePath: string
-) {
-  // Default to index.html if path is empty or ends with /
-  let requestPath = filePath || 'index.html';
-  if (requestPath.endsWith('/')) {
-    requestPath += 'index.html';
-  }
-
-  const cachedFile = getCachedFilePath(did, rkey, requestPath);
-
-  if (existsSync(cachedFile)) {
-    const metaFile = `${cachedFile}.meta`;
-    let mimeType = lookup(cachedFile) || 'application/octet-stream';
-    let isGzipped = false;
-
-    // Check if file has compression metadata
-    if (existsSync(metaFile)) {
-      const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-      if (meta.encoding === 'gzip' && meta.mimeType) {
-        mimeType = meta.mimeType;
-        isGzipped = true;
-      }
-    }
-
-    // Check if this is HTML content that needs rewriting
-    // Note: For gzipped HTML with path rewriting, we need to decompress, rewrite, and serve uncompressed
-    // This is a trade-off for the sites.wisp.place domain which needs path rewriting
-    if (isHtmlContent(requestPath, mimeType)) {
-      let content: string;
-      if (isGzipped) {
-        const { gunzipSync } = await import('zlib');
-        const compressed = readFileSync(cachedFile);
-        content = gunzipSync(compressed).toString('utf-8');
-      } else {
-        content = readFileSync(cachedFile, 'utf-8');
-      }
-      const rewritten = rewriteHtmlPaths(content, basePath);
-      return new Response(rewritten, {
-        headers: {
-          'Content-Type': 'text/html; charset=utf-8',
-        },
-      });
-    }
-
-    // Non-HTML files: serve gzipped content as-is with proper headers
-    const content = readFileSync(cachedFile);
-    if (isGzipped) {
-      // Don't serve already-compressed media formats with Content-Encoding: gzip
-      const alreadyCompressedTypes = [
-        'video/', 'audio/', 'image/jpeg', 'image/jpg', 'image/png',
-        'image/gif', 'image/webp', 'application/pdf'
-      ];
-
-      const isAlreadyCompressed = alreadyCompressedTypes.some(type =>
-        mimeType.toLowerCase().startsWith(type)
-      );
-
-      if (isAlreadyCompressed) {
-        // Decompress the file before serving
-        const { gunzipSync } = await import('zlib');
-        const decompressed = gunzipSync(content);
-        return new Response(decompressed, {
-          headers: {
-            'Content-Type': mimeType,
-          },
-        });
-      }
-
-      return new Response(content, {
-        headers: {
-          'Content-Type': mimeType,
-          'Content-Encoding': 'gzip',
-        },
-      });
-    }
-    return new Response(content, {
-      headers: {
-        'Content-Type': mimeType,
-      },
-    });
-  }
-
-  // Try index.html for directory-like paths
-  if (!requestPath.includes('.')) {
-    const indexFile = getCachedFilePath(did, rkey, `${requestPath}/index.html`);
-    if (existsSync(indexFile)) {
-      const metaFile = `${indexFile}.meta`;
-      let isGzipped = false;
-
-      if (existsSync(metaFile)) {
-        const meta = JSON.parse(readFileSync(metaFile, 'utf-8'));
-        if (meta.encoding === 'gzip') {
-          isGzipped = true;
-        }
-      }
-
-      // HTML needs path rewriting, so decompress if needed
-      let content: string;
-      if (isGzipped) {
-        const { gunzipSync } = await import('zlib');
-        const compressed = readFileSync(indexFile);
-        content = gunzipSync(compressed).toString('utf-8');
-      } else {
-        content = readFileSync(indexFile, 'utf-8');
-      }
-      const rewritten = rewriteHtmlPaths(content, basePath);
-      return new Response(rewritten, {
-        headers: {
-          'Content-Type': 'text/html; charset=utf-8',
-        },
-      });
-    }
-  }
-
-  return new Response('Not Found', { status: 404 });
-}
-
-// Helper to ensure site is cached
-async function ensureSiteCached(did: string, rkey: string): Promise<boolean> {
-  if (isCached(did, rkey)) {
-    return true;
-  }
-
-  // Fetch and cache the site
-  const siteData = await fetchSiteRecord(did, rkey);
-  if (!siteData) {
-    logger.error('Site record not found', null, { did, rkey });
-    return false;
-  }
-
-  const pdsEndpoint = await getPdsForDid(did);
-  if (!pdsEndpoint) {
-    logger.error('PDS not found for DID', null, { did });
-    return false;
-  }
-
-  try {
-    await downloadAndCacheSite(did, rkey, siteData.record, pdsEndpoint, siteData.cid);
-    logger.info('Site cached successfully', { did, rkey });
-    return true;
-  } catch (err) {
-    logger.error('Failed to cache site', err, { did, rkey });
-    return false;
-  }
-}
-
-const app = new Hono();
-
-// Add observability middleware
-app.use('*', observabilityMiddleware('hosting-service'));
-
-// Error handler
-app.onError(observabilityErrorHandler('hosting-service'));
-
-// Main site serving route
-app.get('/*', async (c) => {
-  const url = new URL(c.req.url);
-  const hostname = c.req.header('host') || '';
-  const rawPath = url.pathname.replace(/^\//, '');
-  const path = sanitizePath(rawPath);
-
-  // Check if this is sites.wisp.place subdomain
-  if (hostname === `sites.${BASE_HOST}` || hostname === `sites.${BASE_HOST}:${process.env.PORT || 3000}`) {
-    // Sanitize the path FIRST to prevent path traversal
-    const sanitizedFullPath = sanitizePath(rawPath);
-
-    // Extract identifier and site from sanitized path: did:plc:123abc/sitename/file.html
-    const pathParts = sanitizedFullPath.split('/');
-    if (pathParts.length < 2) {
-      return c.text('Invalid path format. Expected: /identifier/sitename/path', 400);
-    }
-
-    const identifier = pathParts[0];
-    const site = pathParts[1];
-    const filePath = pathParts.slice(2).join('/');
-
-    // Additional validation: identifier must be a valid DID or handle format
-    if (!identifier || identifier.length < 3 || identifier.includes('..') || identifier.includes('\0')) {
-      return c.text('Invalid identifier', 400);
-    }
-
-    // Validate site parameter exists
-    if (!site) {
-      return c.text('Site name required', 400);
-    }
-
-    // Validate site name (rkey)
-    if (!isValidRkey(site)) {
-      return c.text('Invalid site name', 400);
-    }
-
-    // Resolve identifier to DID
-    const did = await resolveDid(identifier);
-    if (!did) {
-      return c.text('Invalid identifier', 400);
-    }
-
-    // Ensure site is cached
-    const cached = await ensureSiteCached(did, site);
-    if (!cached) {
-      return c.text('Site not found', 404);
-    }
-
-    // Serve with HTML path rewriting to handle absolute paths
-    const basePath = `/${identifier}/${site}/`;
-    return serveFromCacheWithRewrite(did, site, filePath, basePath);
-  }
-
-  // Check if this is a DNS hash subdomain
-  const dnsMatch = hostname.match(/^([a-f0-9]{16})\.dns\.(.+)$/);
-  if (dnsMatch) {
-    const hash = dnsMatch[1];
-    const baseDomain = dnsMatch[2];
-
-    if (!hash) {
-      return c.text('Invalid DNS hash', 400);
-    }
-
-    if (baseDomain !== BASE_HOST) {
-      return c.text('Invalid base domain', 400);
-    }
-
-    const customDomain = await getCustomDomainByHash(hash);
-    if (!customDomain) {
-      return c.text('Custom domain not found or not verified', 404);
-    }
-
-    if (!customDomain.rkey) {
-      return c.text('Domain not mapped to a site', 404);
-    }
-
-    const rkey = customDomain.rkey;
-    if (!isValidRkey(rkey)) {
-      return c.text('Invalid site configuration', 500);
-    }
-
-    const cached = await ensureSiteCached(customDomain.did, rkey);
-    if (!cached) {
-      return c.text('Site not found', 404);
-    }
-
-    return serveFromCache(customDomain.did, rkey, path);
-  }
-
-  // Route 2: Registered subdomains - /*.wisp.place/*
-  if (hostname.endsWith(`.${BASE_HOST}`)) {
-    const domainInfo = await getWispDomain(hostname);
-    if (!domainInfo) {
-      return c.text('Subdomain not registered', 404);
-    }
-
-    if (!domainInfo.rkey) {
-      return c.text('Domain not mapped to a site', 404);
-    }
-
-    const rkey = domainInfo.rkey;
-    if (!isValidRkey(rkey)) {
-      return c.text('Invalid site configuration', 500);
-    }
-
-    const cached = await ensureSiteCached(domainInfo.did, rkey);
-    if (!cached) {
-      return c.text('Site not found', 404);
-    }
-
-    return serveFromCache(domainInfo.did, rkey, path);
-  }
-
-  // Route 1: Custom domains - /*
-  const customDomain = await getCustomDomain(hostname);
-  if (!customDomain) {
-    return c.text('Custom domain not found or not verified', 404);
-  }
-
-  if (!customDomain.rkey) {
-    return c.text('Domain not mapped to a site', 404);
-  }
-
-  const rkey = customDomain.rkey;
-  if (!isValidRkey(rkey)) {
-    return c.text('Invalid site configuration', 500);
-  }
-
-  const cached = await ensureSiteCached(customDomain.did, rkey);
-  if (!cached) {
-    return c.text('Site not found', 404);
-  }
-
-  return serveFromCache(customDomain.did, rkey, path);
-});
-
-// Internal observability endpoints (for admin panel)
-app.get('/__internal__/observability/logs', (c) => {
-  const query = c.req.query();
-  const filter: any = {};
-  if (query.level) filter.level = query.level;
-  if (query.service) filter.service = query.service;
-  if (query.search) filter.search = query.search;
-  if (query.eventType) filter.eventType = query.eventType;
-  if (query.limit) filter.limit = parseInt(query.limit as string);
-  return c.json({ logs: logCollector.getLogs(filter) });
-});
-
-app.get('/__internal__/observability/errors', (c) => {
-  const query = c.req.query();
-  const filter: any = {};
-  if (query.service) filter.service = query.service;
-  if (query.limit) filter.limit = parseInt(query.limit as string);
-  return c.json({ errors: errorTracker.getErrors(filter) });
-});
-
-app.get('/__internal__/observability/metrics', (c) => {
-  const query = c.req.query();
-  const timeWindow = query.timeWindow ? parseInt(query.timeWindow as string) : 3600000;
-  const stats = metricsCollector.getStats('hosting-service', timeWindow);
-  return c.json({ stats, timeWindow });
-});
-
-export default app;
-28
hosting-service/tsconfig.json
···
-{
-  "compilerOptions": {
-    /* Base Options */
-    "esModuleInterop": true,
-    "skipLibCheck": true,
-    "target": "es2022",
-    "allowJs": true,
-    "resolveJsonModule": true,
-    "moduleDetection": "force",
-    "isolatedModules": true,
-    "verbatimModuleSyntax": true,
-
-    /* Strictness */
-    "strict": true,
-    "noUncheckedIndexedAccess": true,
-    "noImplicitOverride": true,
-    "forceConsistentCasingInFileNames": true,
-
-    /* Transpiling with TypeScript */
-    "module": "ESNext",
-    "moduleResolution": "bundler",
-    "outDir": "dist",
-    "sourceMap": true,
-
-    /* Code doesn't run in DOM */
-    "lib": ["es2022"],
-  }
-}
-51
lexicons/fs.json
···
-{
-  "lexicon": 1,
-  "id": "place.wisp.fs",
-  "defs": {
-    "main": {
-      "type": "record",
-      "description": "Virtual filesystem manifest for a Wisp site",
-      "record": {
-        "type": "object",
-        "required": ["site", "root", "createdAt"],
-        "properties": {
-          "site": { "type": "string" },
-          "root": { "type": "ref", "ref": "#directory" },
-          "fileCount": { "type": "integer", "minimum": 0, "maximum": 1000 },
-          "createdAt": { "type": "string", "format": "datetime" }
-        }
-      }
-    },
-    "file": {
-      "type": "object",
-      "required": ["type", "blob"],
-      "properties": {
-        "type": { "type": "string", "const": "file" },
-        "blob": { "type": "blob", "accept": ["*/*"], "maxSize": 1000000, "description": "Content blob ref" },
-        "encoding": { "type": "string", "enum": ["gzip"], "description": "Content encoding (e.g., gzip for compressed files)" },
-        "mimeType": { "type": "string", "description": "Original MIME type before compression" },
-        "base64": { "type": "boolean", "description": "True if blob content is base64-encoded (used to bypass PDS content sniffing)" }
-      }
-    },
-    "directory": {
-      "type": "object",
-      "required": ["type", "entries"],
-      "properties": {
-        "type": { "type": "string", "const": "directory" },
-        "entries": {
-          "type": "array",
-          "maxLength": 500,
-          "items": { "type": "ref", "ref": "#entry" }
-        }
-      }
-    },
-    "entry": {
-      "type": "object",
-      "required": ["name", "node"],
-      "properties": {
-        "name": { "type": "string", "maxLength": 255 },
-        "node": { "type": "union", "refs": ["#file", "#directory"] }
-      }
-    }
-  }
-}
+15 -45
package.json
···
 {
-  "name": "elysia-static",
+  "name": "@wisp/monorepo",
   "version": "1.0.50",
+  "private": true,
+  "workspaces": [
+    "packages/@wisp/*",
+    "apps/main-app",
+    "apps/hosting-service"
+  ],
   "scripts": {
     "test": "bun test",
-    "dev": "bun run --watch src/index.ts",
-    "start": "bun run src/index.ts",
-    "build": "bun build --compile --target bun --outfile server src/index.ts"
-  },
-  "dependencies": {
-    "@atproto/api": "^0.17.3",
-    "@atproto/lex-cli": "^0.9.5",
-    "@atproto/oauth-client-node": "^0.3.9",
-    "@atproto/xrpc-server": "^0.9.5",
-    "@elysiajs/cors": "^1.4.0",
-    "@elysiajs/eden": "^1.4.3",
-    "@elysiajs/openapi": "^1.4.11",
-    "@elysiajs/opentelemetry": "^1.4.6",
-    "@elysiajs/static": "^1.4.2",
-    "@radix-ui/react-dialog": "^1.1.15",
-    "@radix-ui/react-label": "^2.1.7",
-    "@radix-ui/react-radio-group": "^1.3.8",
-    "@radix-ui/react-slot": "^1.2.3",
-    "@radix-ui/react-tabs": "^1.1.13",
-    "@tanstack/react-query": "^5.90.2",
-    "class-variance-authority": "^0.7.1",
-    "clsx": "^2.1.1",
-    "elysia": "latest",
-    "iron-session": "^8.0.4",
-    "lucide-react": "^0.546.0",
-    "react": "^19.2.0",
-    "react-dom": "^19.2.0",
-    "tailwind-merge": "^3.3.1",
-    "tailwindcss": "4",
-    "tw-animate-css": "^1.4.0",
-    "typescript": "^5.9.3",
-    "zlib": "^1.0.5"
-  },
-  "devDependencies": {
-    "@types/react": "^19.2.2",
-    "@types/react-dom": "^19.2.1",
-    "bun-plugin-tailwind": "^0.1.2",
-    "bun-types": "latest"
-  },
-  "module": "src/index.js",
-  "trustedDependencies": [
-    "core-js",
-    "protobufjs"
-  ]
+    "dev": "bun run --watch apps/main-app/src/index.ts",
+    "start": "bun run apps/main-app/src/index.ts",
+    "build": "bun build --compile --target bun --outfile server apps/main-app/src/index.ts",
+    "screenshot": "bun run apps/main-app/scripts/screenshot-sites.ts",
+    "hosting:dev": "cd apps/hosting-service && npm run dev",
+    "hosting:build": "cd apps/hosting-service && npm run build",
+    "hosting:start": "cd apps/hosting-service && npm run start"
+  }
 }
+31
packages/@wisp/atproto-utils/package.json
···
+{
+  "name": "@wisp/atproto-utils",
+  "version": "1.0.0",
+  "private": true,
+  "type": "module",
+  "main": "./src/index.ts",
+  "types": "./src/index.ts",
+  "exports": {
+    ".": {
+      "types": "./src/index.ts",
+      "default": "./src/index.ts"
+    },
+    "./blob": {
+      "types": "./src/blob.ts",
+      "default": "./src/blob.ts"
+    },
+    "./compression": {
+      "types": "./src/compression.ts",
+      "default": "./src/compression.ts"
+    },
+    "./subfs": {
+      "types": "./src/subfs.ts",
+      "default": "./src/subfs.ts"
+    }
+  },
+  "dependencies": {
+    "@atproto/api": "^0.14.1",
+    "@wisp/lexicons": "workspace:*",
+    "multiformats": "^13.3.1"
+  }
+}
+108
packages/@wisp/atproto-utils/src/blob.ts
···
+import type { BlobRef } from "@atproto/lexicon";
+import type { Directory, File } from "@wisp/lexicons/types/place/wisp/fs";
+import { CID } from 'multiformats/cid';
+import { sha256 } from 'multiformats/hashes/sha2';
+import * as raw from 'multiformats/codecs/raw';
+import { createHash } from 'crypto';
+import * as mf from 'multiformats';
+
+/**
+ * Compute CID (Content Identifier) for blob content
+ * Uses the same algorithm as AT Protocol: CIDv1 with raw codec and SHA-256
+ * Based on @atproto/common/src/ipld.ts sha256RawToCid implementation
+ */
+export function computeCID(content: Buffer): string {
+  // Use node crypto to compute sha256 hash (same as AT Protocol)
+  const hash = createHash('sha256').update(content).digest();
+  // Create digest object from hash bytes
+  const digest = mf.digest.create(sha256.code, hash);
+  // Create CIDv1 with raw codec
+  const cid = CID.createV1(raw.code, digest);
+  return cid.toString();
+}
+
+/**
+ * Extract blob information from a directory tree
+ * Returns a map of file paths to their blob refs and CIDs
+ */
+export function extractBlobMap(
+  directory: Directory,
+  currentPath: string = ''
+): Map<string, { blobRef: BlobRef; cid: string }> {
+  const blobMap = new Map<string, { blobRef: BlobRef; cid: string }>();
+
+  for (const entry of directory.entries) {
+    const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
+    if ('type' in entry.node && entry.node.type === 'file') {
+      const fileNode = entry.node as File;
+      // AT Protocol SDK returns BlobRef class instances, not plain objects
+      // The ref is a CID instance that can be converted to string
+      if (fileNode.blob && fileNode.blob.ref) {
+        const cidString = fileNode.blob.ref.toString();
+        blobMap.set(fullPath, {
+          blobRef: fileNode.blob,
+          cid: cidString
+        });
+      }
+    } else if ('type' in entry.node && entry.node.type === 'directory') {
+      const subMap = extractBlobMap(entry.node as Directory, fullPath);
+      subMap.forEach((value, key) => blobMap.set(key, value));
+    }
+    // Skip subfs nodes - they don't contain blobs in the main tree
+  }
+
+  return blobMap;
+}
+
+interface IpldLink {
+  $link: string;
+}
+
+interface TypedBlobRef {
+  ref: CID | IpldLink;
+}
+
+interface UntypedBlobRef {
+  cid: string;
+}
+
+function isIpldLink(obj: unknown): obj is IpldLink {
+  return typeof obj === 'object' && obj !== null && '$link' in obj && typeof (obj as IpldLink).$link === 'string';
+}
+
+function isTypedBlobRef(obj: unknown): obj is TypedBlobRef {
+  return typeof obj === 'object' && obj !== null && 'ref' in obj;
+}
+
+function isUntypedBlobRef(obj: unknown): obj is UntypedBlobRef {
+  return typeof obj === 'object' && obj !== null && 'cid' in obj && typeof (obj as UntypedBlobRef).cid === 'string';
+}
+
+/**
+ * Extract CID from a blob reference (handles multiple blob ref formats)
+ */
+export function extractBlobCid(blobRef: unknown): string | null {
+  if (isIpldLink(blobRef)) {
+    return blobRef.$link;
+  }
+
+  if (isTypedBlobRef(blobRef)) {
+    const ref = blobRef.ref;
+
+    const cid = CID.asCID(ref);
+    if (cid) {
+      return cid.toString();
+    }
+
+    if (isIpldLink(ref)) {
+      return ref.$link;
+    }
+  }
+
+  if (isUntypedBlobRef(blobRef)) {
+    return blobRef.cid;
+  }
+
+  return null;
+}
+95
packages/@wisp/atproto-utils/src/compression.ts
···
+import { gzipSync } from 'zlib';
+
+/**
+ * Determine if a file should be gzip compressed based on its MIME type and filename
+ */
+export function shouldCompressFile(mimeType: string, fileName?: string): boolean {
+  // Never compress _redirects file - it needs to be plain text for the hosting service
+  if (fileName && (fileName.endsWith('/_redirects') || fileName === '_redirects')) {
+    return false;
+  }
+
+  // Compress text-based files and uncompressed audio formats
+  const compressibleTypes = [
+    'text/html',
+    'text/css',
+    'text/javascript',
+    'application/javascript',
+    'application/json',
+    'image/svg+xml',
+    'text/xml',
+    'application/xml',
+    'text/plain',
+    'application/x-javascript',
+    // Uncompressed audio formats (WAV, AIFF, etc.)
+    'audio/wav',
+    'audio/wave',
+    'audio/x-wav',
+    'audio/aiff',
+    'audio/x-aiff'
+  ];
+
+  // Check if mime type starts with any compressible type
+  return compressibleTypes.some(type => mimeType.startsWith(type));
+}
+
+/**
+ * Determines if a MIME type should benefit from gzip compression.
+ * Returns true for text-based web assets (HTML, CSS, JS, JSON, XML, SVG).
+ * Returns false for already-compressed formats (images, video, audio, PDFs).
+ */
+export function shouldCompressMimeType(mimeType: string | undefined): boolean {
+  if (!mimeType) return false;
+
+  const mime = mimeType.toLowerCase();
+
+  // Text-based web assets and uncompressed audio that benefit from compression
+  const compressibleTypes = [
+    'text/html',
+    'text/css',
+    'text/javascript',
+    'application/javascript',
+    'application/x-javascript',
+    'text/xml',
+    'application/xml',
+    'application/json',
+    'text/plain',
+    'image/svg+xml',
+    // Uncompressed audio formats
+    'audio/wav',
+    'audio/wave',
+    'audio/x-wav',
+    'audio/aiff',
+    'audio/x-aiff',
+  ];
+
+  if (compressibleTypes.some(type => mime === type || mime.startsWith(type))) {
+    return true;
+  }
+
+  // Already-compressed formats that should NOT be double-compressed
+  const alreadyCompressedPrefixes = [
+    'video/',
+    'audio/',
+    'image/',
+    'application/pdf',
+    'application/zip',
+    'application/gzip',
+  ];
+
+  if (alreadyCompressedPrefixes.some(prefix => mime.startsWith(prefix))) {
+    return false;
+  }
+
+  // Default to not compressing for unknown types
+  return false;
+}
+
+/**
+ * Compress a file using gzip with deterministic output
+ */
+export function compressFile(content: Buffer): Buffer {
+  return gzipSync(content, {
+    level: 9
+  });
+}
+8
packages/@wisp/atproto-utils/src/index.ts
···
+// Blob utilities
+export { computeCID, extractBlobMap, extractBlobCid } from './blob';
+
+// Compression utilities
+export { shouldCompressFile, shouldCompressMimeType, compressFile } from './compression';
+
+// Subfs utilities
+export { extractSubfsUris } from './subfs';
+31
packages/@wisp/atproto-utils/src/subfs.ts
···
+import type { Directory } from "@wisp/lexicons/types/place/wisp/fs";
+
+/**
+ * Extract all subfs URIs from a directory tree with their mount paths
+ */
+export function extractSubfsUris(
+  directory: Directory,
+  currentPath: string = ''
+): Array<{ uri: string; path: string }> {
+  const uris: Array<{ uri: string; path: string }> = [];
+
+  for (const entry of directory.entries) {
+    const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
+    if ('type' in entry.node) {
+      if (entry.node.type === 'subfs') {
+        // Subfs node with subject URI
+        const subfsNode = entry.node as any;
+        if (subfsNode.subject) {
+          uris.push({ uri: subfsNode.subject, path: fullPath });
+        }
+      } else if (entry.node.type === 'directory') {
+        // Recursively search subdirectories
+        const subUris = extractSubfsUris(entry.node as Directory, fullPath);
+        uris.push(...subUris);
+      }
+    }
+  }
+
+  return uris;
+}
+9
packages/@wisp/atproto-utils/tsconfig.json
···
+{
+  "extends": "../../../tsconfig.json",
+  "compilerOptions": {
+    "outDir": "./dist",
+    "rootDir": "./src"
+  },
+  "include": ["src/**/*"],
+  "exclude": ["node_modules", "dist"]
+}
+14
packages/@wisp/constants/package.json
···
+
{
+
"name": "@wisp/constants",
+
"version": "1.0.0",
+
"private": true,
+
"type": "module",
+
"main": "./src/index.ts",
+
"types": "./src/index.ts",
+
"exports": {
+
".": {
+
"types": "./src/index.ts",
+
"default": "./src/index.ts"
+
}
+
}
+
}
+32
packages/@wisp/constants/src/index.ts
···
+/**
+ * Shared constants for wisp.place
+ */
+
+// Domain configuration
+export const getBaseHost = () => {
+  if (typeof Bun !== 'undefined') {
+    return Bun.env.BASE_DOMAIN || "wisp.place";
+  }
+  return process.env.BASE_DOMAIN || "wisp.place";
+};
+
+export const BASE_HOST = getBaseHost();
+
+// File size limits
+export const MAX_SITE_SIZE = 300 * 1024 * 1024; // 300MB
+export const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100MB
+export const MAX_FILE_COUNT = 1000;
+
+// Cache configuration
+export const CACHE_TTL_MS = 5 * 60 * 1000; // 5 minutes
+
+// Fetch timeouts and limits
+export const FETCH_TIMEOUT_MS = 30000; // 30 seconds
+export const MAX_JSON_SIZE = 10 * 1024 * 1024; // 10MB
+export const MAX_BLOB_SIZE = MAX_FILE_SIZE; // Use file size limit
+
+// Directory limits (AT Protocol lexicon constraints)
+export const MAX_ENTRIES_PER_DIRECTORY = 500;
+
+// Compression settings
+export const GZIP_COMPRESSION_LEVEL = 9;
+9
packages/@wisp/constants/tsconfig.json
···
+
{
+
"extends": "../../../tsconfig.json",
+
"compilerOptions": {
+
"outDir": "./dist",
+
"rootDir": "./src"
+
},
+
"include": ["src/**/*"],
+
"exclude": ["node_modules", "dist"]
+
}
+29
packages/@wisp/database/package.json
···
+
{
+
"name": "@wisp/database",
+
"version": "1.0.0",
+
"private": true,
+
"type": "module",
+
"main": "./src/index.ts",
+
"types": "./src/index.ts",
+
"exports": {
+
".": {
+
"types": "./src/index.ts",
+
"default": "./src/index.ts"
+
},
+
"./types": {
+
"types": "./src/types.ts",
+
"default": "./src/types.ts"
+
}
+
},
+
"dependencies": {
+
"postgres": "^3.4.5"
+
},
+
"peerDependencies": {
+
"bun": "^1.0.0"
+
},
+
"peerDependenciesMeta": {
+
"bun": {
+
"optional": true
+
}
+
}
+
}
+22
packages/@wisp/database/src/index.ts
···
+
/**
+
* Shared database utilities for wisp.place
+
*
+
* This package provides database query functions that work across both
+
* main-app (Bun SQL) and hosting-service (postgres) environments.
+
*
+
* The actual database client is passed in by the consuming application.
+
*/
+
+
export * from './types';
+
+
// Re-export types
+
export type {
+
DomainLookup,
+
CustomDomainLookup,
+
SiteRecord,
+
OAuthState,
+
OAuthSession,
+
OAuthKey,
+
CookieSecret,
+
AdminUser
+
} from './types';
+56
packages/@wisp/database/src/types.ts
···
+
/**
+
* Shared database types used across main-app and hosting-service
+
*/
+
+
export interface DomainLookup {
+
did: string;
+
rkey: string | null;
+
}
+
+
export interface CustomDomainLookup {
+
id: string;
+
domain: string;
+
did: string;
+
rkey: string | null;
+
verified: boolean;
+
}
+
+
export interface SiteRecord {
+
did: string;
+
rkey: string;
+
display_name?: string;
+
created_at?: number;
+
updated_at?: number;
+
}
+
+
export interface OAuthState {
+
key: string;
+
data: string;
+
created_at?: number;
+
expires_at?: number;
+
}
+
+
export interface OAuthSession {
+
sub: string;
+
data: string;
+
updated_at?: number;
+
expires_at?: number;
+
}
+
+
export interface OAuthKey {
+
kid: string;
+
jwk: string;
+
created_at?: number;
+
}
+
+
export interface CookieSecret {
+
id: string;
+
secret: string;
+
created_at?: number;
+
}
+
+
export interface AdminUser {
+
username: string;
+
password_hash: string;
+
created_at?: number;
+
}
+9
packages/@wisp/database/tsconfig.json
···
+
{
+
"extends": "../../../tsconfig.json",
+
"compilerOptions": {
+
"outDir": "./dist",
+
"rootDir": "./src"
+
},
+
"include": ["src/**/*"],
+
"exclude": ["node_modules", "dist"]
+
}
+34
packages/@wisp/fs-utils/package.json
···
+
{
+
"name": "@wisp/fs-utils",
+
"version": "1.0.0",
+
"private": true,
+
"type": "module",
+
"main": "./src/index.ts",
+
"types": "./src/index.ts",
+
"exports": {
+
".": {
+
"types": "./src/index.ts",
+
"default": "./src/index.ts"
+
},
+
"./path": {
+
"types": "./src/path.ts",
+
"default": "./src/path.ts"
+
},
+
"./tree": {
+
"types": "./src/tree.ts",
+
"default": "./src/tree.ts"
+
},
+
"./manifest": {
+
"types": "./src/manifest.ts",
+
"default": "./src/manifest.ts"
+
},
+
"./subfs-split": {
+
"types": "./src/subfs-split.ts",
+
"default": "./src/subfs-split.ts"
+
}
+
},
+
"dependencies": {
+
"@atproto/api": "^0.14.1",
+
"@wisp/lexicons": "workspace:*"
+
}
+
}
+12
packages/@wisp/fs-utils/src/index.ts
···
+
// Path utilities
+
export { sanitizePath, normalizePath } from './path';
+
+
// Tree processing
+
export type { UploadedFile, FileUploadResult, ProcessedDirectory } from './tree';
+
export { processUploadedFiles, updateFileBlobs, countFilesInDirectory, collectFileCidsFromEntries } from './tree';
+
+
// Manifest creation
+
export { createManifest } from './manifest';
+
+
// Subfs splitting utilities
+
export { estimateDirectorySize, findLargeDirectories, replaceDirectoryWithSubfs } from './subfs-split';
+27
packages/@wisp/fs-utils/src/manifest.ts
···
+
import type { Record, Directory } from "@wisp/lexicons/types/place/wisp/fs";
+
import { validateRecord } from "@wisp/lexicons/types/place/wisp/fs";
+
+
/**
+
* Create the manifest record for a site
+
*/
+
export function createManifest(
+
siteName: string,
+
root: Directory,
+
fileCount: number
+
): Record {
+
const manifest = {
+
$type: 'place.wisp.fs' as const,
+
site: siteName,
+
root,
+
fileCount,
+
createdAt: new Date().toISOString()
+
};
+
+
// Validate the manifest before returning
+
const validationResult = validateRecord(manifest);
+
if (!validationResult.success) {
+
throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
+
}
+
+
return manifest;
+
}
+29
packages/@wisp/fs-utils/src/path.ts
···
+
/**
+
* Sanitize a file path to prevent directory traversal attacks
+
* Removes any path segments that attempt to go up directories
+
*/
+
export function sanitizePath(filePath: string): string {
+
// Remove leading slashes
+
let cleaned = filePath.replace(/^\/+/, '');
+
+
// Split into segments and filter out dangerous ones
+
const segments = cleaned.split('/').filter(segment => {
+
// Remove empty segments
+
if (!segment || segment === '.') return false;
+
// Remove parent directory references
+
if (segment === '..') return false;
+
// Remove segments with null bytes
+
if (segment.includes('\0')) return false;
+
return true;
+
});
+
+
// Rejoin the safe segments
+
return segments.join('/');
+
}
+
+
/**
+
* Normalize a path by removing leading base folder names
+
*/
+
export function normalizePath(path: string): string {
+
return path.replace(/^[^\/]*\//, '');
+
}
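
To illustrate the traversal-protection behavior, here is a self-contained sketch with the two function bodies from above copied inline:

```typescript
// Copied from @wisp/fs-utils/src/path.ts for a runnable example
function sanitizePath(filePath: string): string {
    // Remove leading slashes, then drop empty, '.', '..', and null-byte segments
    const cleaned = filePath.replace(/^\/+/, '');
    const segments = cleaned.split('/').filter((segment) => {
        if (!segment || segment === '.') return false;
        if (segment === '..') return false;
        if (segment.includes('\0')) return false;
        return true;
    });
    return segments.join('/');
}

function normalizePath(path: string): string {
    // Strip the leading base folder name, if any
    return path.replace(/^[^\/]*\//, '');
}

const safe = sanitizePath('/../site/./pages//about.html'); // → 'site/pages/about.html'
const stripped = normalizePath('dist/index.html'); // → 'index.html'
```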
+113
packages/@wisp/fs-utils/src/subfs-split.ts
···
+
import type { Directory } from "@wisp/lexicons/types/place/wisp/fs";
+
+
/**
+
* Estimate the JSON size of a directory tree
+
*/
+
export function estimateDirectorySize(directory: Directory): number {
+
return JSON.stringify(directory).length;
+
}
+
+
/**
+
* Count files in a directory tree
+
*/
+
export function countFilesInDirectory(directory: Directory): number {
+
let count = 0;
+
for (const entry of directory.entries) {
+
if ('type' in entry.node && entry.node.type === 'file') {
+
count++;
+
} else if ('type' in entry.node && entry.node.type === 'directory') {
+
count += countFilesInDirectory(entry.node as Directory);
+
}
+
}
+
return count;
+
}
+
+
/**
+
* Find all directories in a tree with their paths and sizes
+
*/
+
export function findLargeDirectories(directory: Directory, currentPath: string = ''): Array<{
+
path: string;
+
directory: Directory;
+
size: number;
+
fileCount: number;
+
}> {
+
const result: Array<{ path: string; directory: Directory; size: number; fileCount: number }> = [];
+
+
for (const entry of directory.entries) {
+
if ('type' in entry.node && entry.node.type === 'directory') {
+
const dirPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
const dir = entry.node as Directory;
+
const size = estimateDirectorySize(dir);
+
const fileCount = countFilesInDirectory(dir);
+
+
result.push({ path: dirPath, directory: dir, size, fileCount });
+
+
// Recursively find subdirectories
+
const subdirs = findLargeDirectories(dir, dirPath);
+
result.push(...subdirs);
+
}
+
}
+
+
return result;
+
}
+
+
/**
+
* Replace a directory with a subfs node in the tree
+
*/
+
export function replaceDirectoryWithSubfs(
+
directory: Directory,
+
targetPath: string,
+
subfsUri: string
+
): Directory {
+
const pathParts = targetPath.split('/');
+
const targetName = pathParts[pathParts.length - 1];
+
const parentPath = pathParts.slice(0, -1).join('/');
+
+
// If this is a root-level directory
+
if (pathParts.length === 1) {
+
const newEntries = directory.entries.map(entry => {
+
if (entry.name === targetName && 'type' in entry.node && entry.node.type === 'directory') {
+
return {
+
name: entry.name,
+
node: {
+
$type: 'place.wisp.fs#subfs' as const,
+
type: 'subfs' as const,
+
subject: subfsUri,
+
flat: false // Preserve directory structure
+
}
+
};
+
}
+
return entry;
+
});
+
+
return {
+
$type: 'place.wisp.fs#directory' as const,
+
type: 'directory' as const,
+
entries: newEntries
+
};
+
}
+
+
// Recursively navigate to parent directory
+
const newEntries = directory.entries.map(entry => {
+
if ('type' in entry.node && entry.node.type === 'directory') {
+
// Descend only into the child whose name matches the next path segment;
+
// a raw startsWith() check would also match sibling names that are prefixes (e.g. 'app' vs 'app2')
+
if (entry.name === pathParts[0]) {
+
const remainingPath = pathParts.slice(1).join('/');
+
return {
+
name: entry.name,
+
node: {
+
...replaceDirectoryWithSubfs(entry.node as Directory, remainingPath, subfsUri),
+
$type: 'place.wisp.fs#directory' as const
+
}
+
};
+
}
+
}
+
return entry;
+
});
+
+
return {
+
$type: 'place.wisp.fs#directory' as const,
+
type: 'directory' as const,
+
entries: newEntries
+
};
+
}
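
A quick sketch of the size/count helpers above applied to a tiny hand-built tree (the `Dir` type is loosened here so the example is self-contained):

```typescript
// Simplified stand-in for the lexicon Directory type
interface Dir {
    type: 'directory';
    entries: { name: string; node: Dir | { type: 'file' } }[];
}

// Mirrors estimateDirectorySize: JSON length approximates record size
function estimateDirectorySize(d: Dir): number {
    return JSON.stringify(d).length;
}

// Mirrors countFilesInDirectory: recursive file count
function countFilesInDirectory(d: Dir): number {
    let count = 0;
    for (const entry of d.entries) {
        if (entry.node.type === 'file') count++;
        else count += countFilesInDirectory(entry.node);
    }
    return count;
}

const tree: Dir = {
    type: 'directory',
    entries: [
        { name: 'index.html', node: { type: 'file' } },
        {
            name: 'assets',
            node: {
                type: 'directory',
                entries: [{ name: 'app.js', node: { type: 'file' } }]
            }
        }
    ]
};

const fileTotal = countFilesInDirectory(tree); // → 2
const approxSize = estimateDirectorySize(tree);
```

In the real package these sizes decide when a directory is split out into a `place.wisp.subfs` record.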
+241
packages/@wisp/fs-utils/src/tree.ts
···
+
import type { BlobRef } from "@atproto/api";
+
import type { Directory, Entry, File } from "@wisp/lexicons/types/place/wisp/fs";
+
+
export interface UploadedFile {
+
name: string;
+
content: Buffer;
+
mimeType: string;
+
size: number;
+
compressed?: boolean;
+
base64Encoded?: boolean;
+
originalMimeType?: string;
+
}
+
+
export interface FileUploadResult {
+
hash: string;
+
blobRef: BlobRef;
+
encoding?: 'gzip';
+
mimeType?: string;
+
base64?: boolean;
+
}
+
+
export interface ProcessedDirectory {
+
directory: Directory;
+
fileCount: number;
+
}
+
+
/**
+
* Process uploaded files into a directory structure
+
*/
+
export function processUploadedFiles(files: UploadedFile[]): ProcessedDirectory {
+
const entries: Entry[] = [];
+
let fileCount = 0;
+
+
// Group files by directory
+
const directoryMap = new Map<string, UploadedFile[]>();
+
+
for (const file of files) {
+
// Skip undefined/null files (defensive)
+
if (!file || !file.name) {
+
console.error('Skipping undefined or invalid file in processUploadedFiles');
+
continue;
+
}
+
+
// Remove any base folder name from the path
+
const normalizedPath = file.name.replace(/^[^\/]*\//, '');
+
+
// Skip files in .git directories
+
if (normalizedPath.startsWith('.git/') || normalizedPath === '.git') {
+
continue;
+
}
+
+
const parts = normalizedPath.split('/');
+
+
if (parts.length === 1) {
+
// Root level file
+
entries.push({
+
name: parts[0],
+
node: {
+
$type: 'place.wisp.fs#file' as const,
+
type: 'file' as const,
+
blob: undefined as any // Will be filled after upload
+
}
+
});
+
fileCount++;
+
} else {
+
// File in subdirectory
+
const dirPath = parts.slice(0, -1).join('/');
+
if (!directoryMap.has(dirPath)) {
+
directoryMap.set(dirPath, []);
+
}
+
directoryMap.get(dirPath)!.push({
+
...file,
+
name: normalizedPath
+
});
+
}
+
}
+
+
// Process subdirectories
+
for (const [dirPath, dirFiles] of directoryMap) {
+
const dirEntries: Entry[] = [];
+
+
for (const file of dirFiles) {
+
const fileName = file.name.split('/').pop()!;
+
dirEntries.push({
+
name: fileName,
+
node: {
+
$type: 'place.wisp.fs#file' as const,
+
type: 'file' as const,
+
blob: undefined as any // Will be filled after upload
+
}
+
});
+
fileCount++;
+
}
+
+
// Build nested directory structure
+
const pathParts = dirPath.split('/');
+
let currentEntries = entries;
+
+
for (let i = 0; i < pathParts.length; i++) {
+
const part = pathParts[i];
+
const isLast = i === pathParts.length - 1;
+
+
let existingEntry = currentEntries.find(e => e.name === part);
+
+
if (!existingEntry) {
+
const newDir = {
+
$type: 'place.wisp.fs#directory' as const,
+
type: 'directory' as const,
+
entries: isLast ? dirEntries : []
+
};
+
+
existingEntry = {
+
name: part,
+
node: newDir
+
};
+
currentEntries.push(existingEntry);
+
} else if ('entries' in existingEntry.node && isLast) {
+
(existingEntry.node as any).entries.push(...dirEntries);
+
}
+
+
if (existingEntry && 'entries' in existingEntry.node) {
+
currentEntries = (existingEntry.node as any).entries;
+
}
+
}
+
}
+
+
const result = {
+
directory: {
+
$type: 'place.wisp.fs#directory' as const,
+
type: 'directory' as const,
+
entries
+
},
+
fileCount
+
};
+
+
return result;
+
}
+
+
/**
+
* Update file blobs in directory structure after upload
+
* Uses path-based matching to correctly match files in nested directories
+
* Filters out files that were not successfully uploaded
+
*/
+
export function updateFileBlobs(
+
directory: Directory,
+
uploadResults: FileUploadResult[],
+
filePaths: string[],
+
currentPath: string = '',
+
successfulPaths?: Set<string>
+
): Directory {
+
const updatedEntries = directory.entries.map(entry => {
+
if ('type' in entry.node && entry.node.type === 'file') {
+
// Build the full path for this file
+
const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
+
// If successfulPaths is provided, skip files that weren't successfully uploaded
+
if (successfulPaths && !successfulPaths.has(fullPath)) {
+
return null; // Filter out failed files
+
}
+
+
// Find exact match in filePaths (need to handle normalized paths)
+
const fileIndex = filePaths.findIndex((path) => {
+
// Normalize both paths by removing leading base folder
+
const normalizedUploadPath = path.replace(/^[^\/]*\//, '');
+
const normalizedEntryPath = fullPath;
+
return normalizedUploadPath === normalizedEntryPath || path === fullPath;
+
});
+
+
if (fileIndex !== -1 && uploadResults[fileIndex]) {
+
const result = uploadResults[fileIndex];
+
const blobRef = result.blobRef;
+
+
return {
+
...entry,
+
node: {
+
$type: 'place.wisp.fs#file' as const,
+
type: 'file' as const,
+
blob: blobRef,
+
...(result.encoding && { encoding: result.encoding }),
+
...(result.mimeType && { mimeType: result.mimeType }),
+
...(result.base64 && { base64: result.base64 })
+
}
+
};
+
} else {
+
console.error(`Could not find blob for file: ${fullPath}`);
+
return null; // Filter out files without blobs
+
}
+
} else if ('type' in entry.node && entry.node.type === 'directory') {
+
const dirPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
+
return {
+
...entry,
+
node: updateFileBlobs(entry.node as Directory, uploadResults, filePaths, dirPath, successfulPaths)
+
};
+
}
+
return entry;
+
}).filter(entry => entry !== null) as Entry[]; // Remove null entries (failed files)
+
+
const result = {
+
$type: 'place.wisp.fs#directory' as const,
+
type: 'directory' as const,
+
entries: updatedEntries
+
};
+
+
return result;
+
}
+
+
/**
+
* Count files in a directory tree
+
*/
+
export function countFilesInDirectory(directory: Directory): number {
+
let count = 0;
+
for (const entry of directory.entries) {
+
if ('type' in entry.node && entry.node.type === 'file') {
+
count++;
+
} else if ('type' in entry.node && entry.node.type === 'directory') {
+
count += countFilesInDirectory(entry.node as Directory);
+
}
+
}
+
return count;
+
}
+
+
/**
+
* Recursively collect file CIDs from entries for incremental update tracking
+
*/
+
export function collectFileCidsFromEntries(entries: Entry[], pathPrefix: string, fileCids: Record<string, string>): void {
+
for (const entry of entries) {
+
const currentPath = pathPrefix ? `${pathPrefix}/${entry.name}` : entry.name;
+
const node = entry.node;
+
+
if ('type' in node && node.type === 'directory' && 'entries' in node) {
+
collectFileCidsFromEntries(node.entries, currentPath, fileCids);
+
} else if ('type' in node && node.type === 'file' && 'blob' in node) {
+
const fileNode = node as File;
+
// Extract CID from blob ref
+
if (fileNode.blob && fileNode.blob.ref) {
+
const cid = fileNode.blob.ref.toString();
+
fileCids[currentPath] = cid;
+
}
+
}
+
}
+
}
+9
packages/@wisp/fs-utils/tsconfig.json
···
+
{
+
"extends": "../../../tsconfig.json",
+
"compilerOptions": {
+
"outDir": "./dist",
+
"rootDir": "./src"
+
},
+
"include": ["src/**/*"],
+
"exclude": ["node_modules", "dist"]
+
}
+25
packages/@wisp/lexicons/README.md
···
+
# @wisp/lexicons
+
+
Shared AT Protocol lexicon definitions and generated TypeScript types for the wisp.place project.
+
+
## Contents
+
+
- `/lexicons` - Source lexicon JSON definitions
+
- `/src` - Generated TypeScript types and validation functions
+
+
## Usage
+
+
```typescript
+
import { ids, lexicons } from '@wisp/lexicons';
+
import type { PlaceWispFs } from '@wisp/lexicons/types/place/wisp/fs';
+
```
+
+
## Code Generation
+
+
To regenerate types from lexicon definitions:
+
+
```bash
+
npm run codegen
+
```
+
+
This uses `@atproto/lex-cli` to generate TypeScript types from the JSON schemas in `/lexicons`.
+59
packages/@wisp/lexicons/lexicons/fs.json
···
+
{
+
"lexicon": 1,
+
"id": "place.wisp.fs",
+
"defs": {
+
"main": {
+
"type": "record",
+
"description": "Virtual filesystem manifest for a Wisp site",
+
"record": {
+
"type": "object",
+
"required": ["site", "root", "createdAt"],
+
"properties": {
+
"site": { "type": "string" },
+
"root": { "type": "ref", "ref": "#directory" },
+
"fileCount": { "type": "integer", "minimum": 0, "maximum": 1000 },
+
"createdAt": { "type": "string", "format": "datetime" }
+
}
+
}
+
},
+
"file": {
+
"type": "object",
+
"required": ["type", "blob"],
+
"properties": {
+
"type": { "type": "string", "const": "file" },
+
"blob": { "type": "blob", "accept": ["*/*"], "maxSize": 1000000000, "description": "Content blob ref" },
+
"encoding": { "type": "string", "enum": ["gzip"], "description": "Content encoding (e.g., gzip for compressed files)" },
+
"mimeType": { "type": "string", "description": "Original MIME type before compression" },
+
"base64": { "type": "boolean", "description": "True if blob content is base64-encoded (used to bypass PDS content sniffing)" }
+
}
+
},
+
"directory": {
+
"type": "object",
+
"required": ["type", "entries"],
+
"properties": {
+
"type": { "type": "string", "const": "directory" },
+
"entries": {
+
"type": "array",
+
"maxLength": 500,
+
"items": { "type": "ref", "ref": "#entry" }
+
}
+
}
+
},
+
"entry": {
+
"type": "object",
+
"required": ["name", "node"],
+
"properties": {
+
"name": { "type": "string", "maxLength": 255 },
+
"node": { "type": "union", "refs": ["#file", "#directory", "#subfs"] }
+
}
+
},
+
"subfs": {
+
"type": "object",
+
"required": ["type", "subject"],
+
"properties": {
+
"type": { "type": "string", "const": "subfs" },
+
"subject": { "type": "string", "format": "at-uri", "description": "AT-URI pointing to a place.wisp.subfs record containing this subtree." },
+
"flat": { "type": "boolean", "description": "If true (default), the subfs record's root entries are merged (flattened) into the parent directory, replacing the subfs entry. If false, the subfs entries are placed in a subdirectory with the subfs entry's name. Flat merging is useful for splitting large directories across multiple records while maintaining a flat structure." }
+
}
+
}
+
}
+
}
+76
packages/@wisp/lexicons/lexicons/settings.json
···
+
{
+
"lexicon": 1,
+
"id": "place.wisp.settings",
+
"defs": {
+
"main": {
+
"type": "record",
+
"description": "Configuration settings for a static site hosted on wisp.place",
+
"key": "any",
+
"record": {
+
"type": "object",
+
"properties": {
+
"directoryListing": {
+
"type": "boolean",
+
"description": "Enable directory listing mode for paths that resolve to directories without an index file. Incompatible with spaMode.",
+
"default": false
+
},
+
"spaMode": {
+
"type": "string",
+
"description": "File to serve for all routes (e.g., 'index.html'). When set, enables SPA mode where all non-file requests are routed to this file. Incompatible with directoryListing and custom404.",
+
"maxLength": 500
+
},
+
"custom404": {
+
"type": "string",
+
"description": "Custom 404 error page file path. Incompatible with directoryListing and spaMode.",
+
"maxLength": 500
+
},
+
"indexFiles": {
+
"type": "array",
+
"description": "Ordered list of files to try when serving a directory. Defaults to ['index.html'] if not specified.",
+
"items": {
+
"type": "string",
+
"maxLength": 255
+
},
+
"maxLength": 10
+
},
+
"cleanUrls": {
+
"type": "boolean",
+
"description": "Enable clean URL routing. When enabled, '/about' will attempt to serve '/about.html' or '/about/index.html' automatically.",
+
"default": false
+
},
+
"headers": {
+
"type": "array",
+
"description": "Custom HTTP headers to set on responses",
+
"items": {
+
"type": "ref",
+
"ref": "#customHeader"
+
},
+
"maxLength": 50
+
}
+
}
+
}
+
},
+
"customHeader": {
+
"type": "object",
+
"description": "Custom HTTP header configuration",
+
"required": ["name", "value"],
+
"properties": {
+
"name": {
+
"type": "string",
+
"description": "HTTP header name (e.g., 'Cache-Control', 'X-Frame-Options')",
+
"maxLength": 100
+
},
+
"value": {
+
"type": "string",
+
"description": "HTTP header value",
+
"maxLength": 1000
+
},
+
"path": {
+
"type": "string",
+
"description": "Optional glob pattern to apply this header to specific paths (e.g., '*.html', '/assets/*'). If not specified, applies to all paths.",
+
"maxLength": 500
+
}
+
}
+
}
+
}
+
}
+59
packages/@wisp/lexicons/lexicons/subfs.json
···
+
{
+
"lexicon": 1,
+
"id": "place.wisp.subfs",
+
"defs": {
+
"main": {
+
"type": "record",
+
"description": "Virtual filesystem subtree referenced by place.wisp.fs records. When a subfs entry is expanded, its root entries are merged (flattened) into the parent directory, allowing large directories to be split across multiple records while maintaining a flat structure.",
+
"record": {
+
"type": "object",
+
"required": ["root", "createdAt"],
+
"properties": {
+
"root": { "type": "ref", "ref": "#directory" },
+
"fileCount": { "type": "integer", "minimum": 0, "maximum": 1000 },
+
"createdAt": { "type": "string", "format": "datetime" }
+
}
+
}
+
},
+
"file": {
+
"type": "object",
+
"required": ["type", "blob"],
+
"properties": {
+
"type": { "type": "string", "const": "file" },
+
"blob": { "type": "blob", "accept": ["*/*"], "maxSize": 1000000000, "description": "Content blob ref" },
+
"encoding": { "type": "string", "enum": ["gzip"], "description": "Content encoding (e.g., gzip for compressed files)" },
+
"mimeType": { "type": "string", "description": "Original MIME type before compression" },
+
"base64": { "type": "boolean", "description": "True if blob content is base64-encoded (used to bypass PDS content sniffing)" }
+
}
+
},
+
"directory": {
+
"type": "object",
+
"required": ["type", "entries"],
+
"properties": {
+
"type": { "type": "string", "const": "directory" },
+
"entries": {
+
"type": "array",
+
"maxLength": 500,
+
"items": { "type": "ref", "ref": "#entry" }
+
}
+
}
+
},
+
"entry": {
+
"type": "object",
+
"required": ["name", "node"],
+
"properties": {
+
"name": { "type": "string", "maxLength": 255 },
+
"node": { "type": "union", "refs": ["#file", "#directory", "#subfs"] }
+
}
+
},
+
"subfs": {
+
"type": "object",
+
"required": ["type", "subject"],
+
"properties": {
+
"type": { "type": "string", "const": "subfs" },
+
"subject": { "type": "string", "format": "at-uri", "description": "AT-URI pointing to another place.wisp.subfs record for nested subtrees. When expanded, the referenced record's root entries are merged (flattened) into the parent directory, allowing recursive splitting of large directory structures." }
+
}
+
}
+
}
+
}
+
+44
packages/@wisp/lexicons/package.json
···
+
{
+
"name": "@wisp/lexicons",
+
"version": "1.0.0",
+
"private": true,
+
"type": "module",
+
"main": "./src/index.ts",
+
"types": "./src/index.ts",
+
"exports": {
+
".": {
+
"types": "./src/index.ts",
+
"default": "./src/index.ts"
+
},
+
"./types/place/wisp/fs": {
+
"types": "./src/types/place/wisp/fs.ts",
+
"default": "./src/types/place/wisp/fs.ts"
+
},
+
"./types/place/wisp/settings": {
+
"types": "./src/types/place/wisp/settings.ts",
+
"default": "./src/types/place/wisp/settings.ts"
+
},
+
"./types/place/wisp/subfs": {
+
"types": "./src/types/place/wisp/subfs.ts",
+
"default": "./src/types/place/wisp/subfs.ts"
+
},
+
"./lexicons": {
+
"types": "./src/lexicons.ts",
+
"default": "./src/lexicons.ts"
+
},
+
"./util": {
+
"types": "./src/util.ts",
+
"default": "./src/util.ts"
+
}
+
},
+
"scripts": {
+
"codegen": "lex gen-server ./src ./lexicons"
+
},
+
"dependencies": {
+
"@atproto/lexicon": "^0.5.1",
+
"@atproto/xrpc-server": "^0.9.5"
+
},
+
"devDependencies": {
+
"@atproto/lex-cli": "^0.9.5"
+
}
+
}
+44
packages/@wisp/lexicons/src/index.ts
···
+
/**
+
* GENERATED CODE - DO NOT MODIFY
+
*/
+
import {
+
type Auth,
+
type Options as XrpcOptions,
+
Server as XrpcServer,
+
type StreamConfigOrHandler,
+
type MethodConfigOrHandler,
+
createServer as createXrpcServer,
+
} from '@atproto/xrpc-server'
+
import { schemas } from './lexicons'
+
+
export function createServer(options?: XrpcOptions): Server {
+
return new Server(options)
+
}
+
+
export class Server {
+
xrpc: XrpcServer
+
place: PlaceNS
+
+
constructor(options?: XrpcOptions) {
+
this.xrpc = createXrpcServer(schemas, options)
+
this.place = new PlaceNS(this)
+
}
+
}
+
+
export class PlaceNS {
+
_server: Server
+
wisp: PlaceWispNS
+
+
constructor(server: Server) {
+
this._server = server
+
this.wisp = new PlaceWispNS(server)
+
}
+
}
+
+
export class PlaceWispNS {
+
_server: Server
+
+
constructor(server: Server) {
+
this._server = server
+
}
+
}
+364
packages/@wisp/lexicons/src/lexicons.ts
···
+
/**
+
* GENERATED CODE - DO NOT MODIFY
+
*/
+
import {
+
type LexiconDoc,
+
Lexicons,
+
ValidationError,
+
type ValidationResult,
+
} from '@atproto/lexicon'
+
import { type $Typed, is$typed, maybe$typed } from './util'
+
+
export const schemaDict = {
+
PlaceWispFs: {
+
lexicon: 1,
+
id: 'place.wisp.fs',
+
defs: {
+
main: {
+
type: 'record',
+
description: 'Virtual filesystem manifest for a Wisp site',
+
record: {
+
type: 'object',
+
required: ['site', 'root', 'createdAt'],
+
properties: {
+
site: {
+
type: 'string',
+
},
+
root: {
+
type: 'ref',
+
ref: 'lex:place.wisp.fs#directory',
+
},
+
fileCount: {
+
type: 'integer',
+
minimum: 0,
+
maximum: 1000,
+
},
+
createdAt: {
+
type: 'string',
+
format: 'datetime',
+
},
+
},
+
},
+
},
+
file: {
+
type: 'object',
+
required: ['type', 'blob'],
+
properties: {
+
type: {
+
type: 'string',
+
const: 'file',
+
},
+
blob: {
+
type: 'blob',
+
accept: ['*/*'],
+
maxSize: 1000000000,
+
description: 'Content blob ref',
+
},
+
encoding: {
+
type: 'string',
+
enum: ['gzip'],
+
description: 'Content encoding (e.g., gzip for compressed files)',
+
},
+
mimeType: {
+
type: 'string',
+
description: 'Original MIME type before compression',
+
},
+
base64: {
+
type: 'boolean',
+
description:
+
'True if blob content is base64-encoded (used to bypass PDS content sniffing)',
+
},
+
},
+
},
+
directory: {
+
type: 'object',
+
required: ['type', 'entries'],
+
properties: {
+
type: {
+
type: 'string',
+
const: 'directory',
+
},
+
entries: {
+
type: 'array',
+
maxLength: 500,
+
items: {
+
type: 'ref',
+
ref: 'lex:place.wisp.fs#entry',
+
},
+
},
+
},
+
},
+
entry: {
+
type: 'object',
+
required: ['name', 'node'],
+
properties: {
+
name: {
+
type: 'string',
+
maxLength: 255,
+
},
+
node: {
+
type: 'union',
+
refs: [
+
'lex:place.wisp.fs#file',
+
'lex:place.wisp.fs#directory',
+
'lex:place.wisp.fs#subfs',
+
],
+
},
+
},
+
},
+
subfs: {
+
type: 'object',
+
required: ['type', 'subject'],
+
properties: {
+
type: {
+
type: 'string',
+
const: 'subfs',
+
},
+
subject: {
+
type: 'string',
+
format: 'at-uri',
+
description:
+
'AT-URI pointing to a place.wisp.subfs record containing this subtree.',
+
},
+
flat: {
+
type: 'boolean',
+
description:
+
"If true (default), the subfs record's root entries are merged (flattened) into the parent directory, replacing the subfs entry. If false, the subfs entries are placed in a subdirectory with the subfs entry's name. Flat merging is useful for splitting large directories across multiple records while maintaining a flat structure.",
+
},
+
},
+
},
+
},
+
},
+
PlaceWispSettings: {
+
lexicon: 1,
+
id: 'place.wisp.settings',
+
defs: {
+
main: {
+
type: 'record',
+
description:
+
'Configuration settings for a static site hosted on wisp.place',
+
key: 'any',
+
record: {
+
type: 'object',
+
properties: {
+
directoryListing: {
+
type: 'boolean',
+
description:
+
'Enable directory listing mode for paths that resolve to directories without an index file. Incompatible with spaMode.',
+
default: false,
+
},
+
spaMode: {
+
type: 'string',
+
description:
+
"File to serve for all routes (e.g., 'index.html'). When set, enables SPA mode where all non-file requests are routed to this file. Incompatible with directoryListing and custom404.",
+
maxLength: 500,
+
},
+
custom404: {
+
type: 'string',
+
description:
+
'Custom 404 error page file path. Incompatible with directoryListing and spaMode.',
+
maxLength: 500,
+
},
+
indexFiles: {
+
type: 'array',
+
description:
+
"Ordered list of files to try when serving a directory. Defaults to ['index.html'] if not specified.",
+
items: {
+
type: 'string',
+
maxLength: 255,
+
},
+
maxLength: 10,
+
},
+
cleanUrls: {
+
type: 'boolean',
+
description:
+
"Enable clean URL routing. When enabled, '/about' will attempt to serve '/about.html' or '/about/index.html' automatically.",
+
default: false,
+
},
+
headers: {
+
type: 'array',
+
description: 'Custom HTTP headers to set on responses',
+
items: {
+
type: 'ref',
+
ref: 'lex:place.wisp.settings#customHeader',
+
},
+
maxLength: 50,
+
},
+
},
+
},
+
},
+
customHeader: {
+
type: 'object',
+
description: 'Custom HTTP header configuration',
+
required: ['name', 'value'],
+
properties: {
+
name: {
+
type: 'string',
+
description:
+
"HTTP header name (e.g., 'Cache-Control', 'X-Frame-Options')",
+
maxLength: 100,
+
},
+
value: {
+
type: 'string',
+
description: 'HTTP header value',
+
maxLength: 1000,
+
},
+
path: {
+
type: 'string',
+
description:
+
"Optional glob pattern to apply this header to specific paths (e.g., '*.html', '/assets/*'). If not specified, applies to all paths.",
+
maxLength: 500,
+
},
+
},
+
},
+
},
+
},
+
PlaceWispSubfs: {
+
lexicon: 1,
+
id: 'place.wisp.subfs',
+
defs: {
+
main: {
+
type: 'record',
+
description:
+
'Virtual filesystem subtree referenced by place.wisp.fs records. When a subfs entry is expanded, its root entries are merged (flattened) into the parent directory, allowing large directories to be split across multiple records while maintaining a flat structure.',
+
record: {
+
type: 'object',
+
required: ['root', 'createdAt'],
+
properties: {
+
root: {
+
type: 'ref',
+
          ref: 'lex:place.wisp.subfs#directory',
        },
        fileCount: {
          type: 'integer',
          minimum: 0,
          maximum: 1000,
        },
        createdAt: {
          type: 'string',
          format: 'datetime',
        },
      },
    },
  },
  file: {
    type: 'object',
    required: ['type', 'blob'],
    properties: {
      type: {
        type: 'string',
        const: 'file',
      },
      blob: {
        type: 'blob',
        accept: ['*/*'],
        maxSize: 1000000000,
        description: 'Content blob ref',
      },
      encoding: {
        type: 'string',
        enum: ['gzip'],
        description: 'Content encoding (e.g., gzip for compressed files)',
      },
      mimeType: {
        type: 'string',
        description: 'Original MIME type before compression',
      },
      base64: {
        type: 'boolean',
        description:
          'True if blob content is base64-encoded (used to bypass PDS content sniffing)',
      },
    },
  },
  directory: {
    type: 'object',
    required: ['type', 'entries'],
    properties: {
      type: {
        type: 'string',
        const: 'directory',
      },
      entries: {
        type: 'array',
        maxLength: 500,
        items: {
          type: 'ref',
          ref: 'lex:place.wisp.subfs#entry',
        },
      },
    },
  },
  entry: {
    type: 'object',
    required: ['name', 'node'],
    properties: {
      name: {
        type: 'string',
        maxLength: 255,
      },
      node: {
        type: 'union',
        refs: [
          'lex:place.wisp.subfs#file',
          'lex:place.wisp.subfs#directory',
          'lex:place.wisp.subfs#subfs',
        ],
      },
    },
  },
  subfs: {
    type: 'object',
    required: ['type', 'subject'],
    properties: {
      type: {
        type: 'string',
        const: 'subfs',
      },
      subject: {
        type: 'string',
        format: 'at-uri',
        description:
          "AT-URI pointing to another place.wisp.subfs record for nested subtrees. When expanded, the referenced record's root entries are merged (flattened) into the parent directory, allowing recursive splitting of large directory structures.",
      },
    },
  },
} as const satisfies Record<string, LexiconDoc>

export const schemas = Object.values(schemaDict) satisfies LexiconDoc[]
export const lexicons: Lexicons = new Lexicons(schemas)

export function validate<T extends { $type: string }>(
  v: unknown,
  id: string,
  hash: string,
  requiredType: true,
): ValidationResult<T>
export function validate<T extends { $type?: string }>(
  v: unknown,
  id: string,
  hash: string,
  requiredType?: false,
): ValidationResult<T>
export function validate(
  v: unknown,
  id: string,
  hash: string,
  requiredType?: boolean,
): ValidationResult {
  return (requiredType ? is$typed : maybe$typed)(v, id, hash)
    ? lexicons.validate(`${id}#${hash}`, v)
    : {
        success: false,
        error: new ValidationError(
          `Must be an object with "${hash === 'main' ? id : `${id}#${hash}`}" $type property`,
        ),
      }
}

export const ids = {
  PlaceWispFs: 'place.wisp.fs',
  PlaceWispSettings: 'place.wisp.settings',
  PlaceWispSubfs: 'place.wisp.subfs',
} as const
+110
packages/@wisp/lexicons/src/types/place/wisp/fs.ts
···
/**
 * GENERATED CODE - DO NOT MODIFY
 */
import { type ValidationResult, BlobRef } from '@atproto/lexicon'
import { CID } from 'multiformats/cid'
import { validate as _validate } from '../../../lexicons'
import { type $Typed, is$typed as _is$typed, type OmitKey } from '../../../util'

const is$typed = _is$typed,
  validate = _validate
const id = 'place.wisp.fs'

export interface Main {
  $type: 'place.wisp.fs'
  site: string
  root: Directory
  fileCount?: number
  createdAt: string
  [k: string]: unknown
}

const hashMain = 'main'

export function isMain<V>(v: V) {
  return is$typed(v, id, hashMain)
}

export function validateMain<V>(v: V) {
  return validate<Main & V>(v, id, hashMain, true)
}

export {
  type Main as Record,
  isMain as isRecord,
  validateMain as validateRecord,
}

export interface File {
  $type?: 'place.wisp.fs#file'
  type: 'file'
  /** Content blob ref */
  blob: BlobRef
  /** Content encoding (e.g., gzip for compressed files) */
  encoding?: 'gzip'
  /** Original MIME type before compression */
  mimeType?: string
  /** True if blob content is base64-encoded (used to bypass PDS content sniffing) */
  base64?: boolean
}

const hashFile = 'file'

export function isFile<V>(v: V) {
  return is$typed(v, id, hashFile)
}

export function validateFile<V>(v: V) {
  return validate<File & V>(v, id, hashFile)
}

export interface Directory {
  $type?: 'place.wisp.fs#directory'
  type: 'directory'
  entries: Entry[]
}

const hashDirectory = 'directory'

export function isDirectory<V>(v: V) {
  return is$typed(v, id, hashDirectory)
}

export function validateDirectory<V>(v: V) {
  return validate<Directory & V>(v, id, hashDirectory)
}

export interface Entry {
  $type?: 'place.wisp.fs#entry'
  name: string
  node: $Typed<File> | $Typed<Directory> | $Typed<Subfs> | { $type: string }
}

const hashEntry = 'entry'

export function isEntry<V>(v: V) {
  return is$typed(v, id, hashEntry)
}

export function validateEntry<V>(v: V) {
  return validate<Entry & V>(v, id, hashEntry)
}

export interface Subfs {
  $type?: 'place.wisp.fs#subfs'
  type: 'subfs'
  /** AT-URI pointing to a place.wisp.subfs record containing this subtree. */
  subject: string
  /** If true (default), the subfs record's root entries are merged (flattened) into the parent directory, replacing the subfs entry. If false, the subfs entries are placed in a subdirectory with the subfs entry's name. Flat merging is useful for splitting large directories across multiple records while maintaining a flat structure. */
  flat?: boolean
}

const hashSubfs = 'subfs'

export function isSubfs<V>(v: V) {
  return is$typed(v, id, hashSubfs)
}

export function validateSubfs<V>(v: V) {
  return validate<Subfs & V>(v, id, hashSubfs)
}
+65
packages/@wisp/lexicons/src/types/place/wisp/settings.ts
···
/**
 * GENERATED CODE - DO NOT MODIFY
 */
import { type ValidationResult, BlobRef } from '@atproto/lexicon'
import { CID } from 'multiformats/cid'
import { validate as _validate } from '../../../lexicons'
import { type $Typed, is$typed as _is$typed, type OmitKey } from '../../../util'

const is$typed = _is$typed,
  validate = _validate
const id = 'place.wisp.settings'

export interface Main {
  $type: 'place.wisp.settings'
  /** Enable directory listing mode for paths that resolve to directories without an index file. Incompatible with spaMode. */
  directoryListing: boolean
  /** File to serve for all routes (e.g., 'index.html'). When set, enables SPA mode where all non-file requests are routed to this file. Incompatible with directoryListing and custom404. */
  spaMode?: string
  /** Custom 404 error page file path. Incompatible with directoryListing and spaMode. */
  custom404?: string
  /** Ordered list of files to try when serving a directory. Defaults to ['index.html'] if not specified. */
  indexFiles?: string[]
  /** Enable clean URL routing. When enabled, '/about' will attempt to serve '/about.html' or '/about/index.html' automatically. */
  cleanUrls: boolean
  /** Custom HTTP headers to set on responses */
  headers?: CustomHeader[]
  [k: string]: unknown
}

const hashMain = 'main'

export function isMain<V>(v: V) {
  return is$typed(v, id, hashMain)
}

export function validateMain<V>(v: V) {
  return validate<Main & V>(v, id, hashMain, true)
}

export {
  type Main as Record,
  isMain as isRecord,
  validateMain as validateRecord,
}

/** Custom HTTP header configuration */
export interface CustomHeader {
  $type?: 'place.wisp.settings#customHeader'
  /** HTTP header name (e.g., 'Cache-Control', 'X-Frame-Options') */
  name: string
  /** HTTP header value */
  value: string
  /** Optional glob pattern to apply this header to specific paths (e.g., '*.html', '/assets/*'). If not specified, applies to all paths. */
  path?: string
}

const hashCustomHeader = 'customHeader'

export function isCustomHeader<V>(v: V) {
  return is$typed(v, id, hashCustomHeader)
}

export function validateCustomHeader<V>(v: V) {
  return validate<CustomHeader & V>(v, id, hashCustomHeader)
}
+107
packages/@wisp/lexicons/src/types/place/wisp/subfs.ts
···
/**
 * GENERATED CODE - DO NOT MODIFY
 */
import { type ValidationResult, BlobRef } from '@atproto/lexicon'
import { CID } from 'multiformats/cid'
import { validate as _validate } from '../../../lexicons'
import { type $Typed, is$typed as _is$typed, type OmitKey } from '../../../util'

const is$typed = _is$typed,
  validate = _validate
const id = 'place.wisp.subfs'

export interface Main {
  $type: 'place.wisp.subfs'
  root: Directory
  fileCount?: number
  createdAt: string
  [k: string]: unknown
}

const hashMain = 'main'

export function isMain<V>(v: V) {
  return is$typed(v, id, hashMain)
}

export function validateMain<V>(v: V) {
  return validate<Main & V>(v, id, hashMain, true)
}

export {
  type Main as Record,
  isMain as isRecord,
  validateMain as validateRecord,
}

export interface File {
  $type?: 'place.wisp.subfs#file'
  type: 'file'
  /** Content blob ref */
  blob: BlobRef
  /** Content encoding (e.g., gzip for compressed files) */
  encoding?: 'gzip'
  /** Original MIME type before compression */
  mimeType?: string
  /** True if blob content is base64-encoded (used to bypass PDS content sniffing) */
  base64?: boolean
}

const hashFile = 'file'

export function isFile<V>(v: V) {
  return is$typed(v, id, hashFile)
}

export function validateFile<V>(v: V) {
  return validate<File & V>(v, id, hashFile)
}

export interface Directory {
  $type?: 'place.wisp.subfs#directory'
  type: 'directory'
  entries: Entry[]
}

const hashDirectory = 'directory'

export function isDirectory<V>(v: V) {
  return is$typed(v, id, hashDirectory)
}

export function validateDirectory<V>(v: V) {
  return validate<Directory & V>(v, id, hashDirectory)
}

export interface Entry {
  $type?: 'place.wisp.subfs#entry'
  name: string
  node: $Typed<File> | $Typed<Directory> | $Typed<Subfs> | { $type: string }
}

const hashEntry = 'entry'

export function isEntry<V>(v: V) {
  return is$typed(v, id, hashEntry)
}

export function validateEntry<V>(v: V) {
  return validate<Entry & V>(v, id, hashEntry)
}

export interface Subfs {
  $type?: 'place.wisp.subfs#subfs'
  type: 'subfs'
  /** AT-URI pointing to another place.wisp.subfs record for nested subtrees. When expanded, the referenced record's root entries are merged (flattened) into the parent directory, allowing recursive splitting of large directory structures. */
  subject: string
}

const hashSubfs = 'subfs'

export function isSubfs<V>(v: V) {
  return is$typed(v, id, hashSubfs)
}

export function validateSubfs<V>(v: V) {
  return validate<Subfs & V>(v, id, hashSubfs)
}
+82
packages/@wisp/lexicons/src/util.ts
···
/**
 * GENERATED CODE - DO NOT MODIFY
 */

import { type ValidationResult } from '@atproto/lexicon'

export type OmitKey<T, K extends keyof T> = {
  [K2 in keyof T as K2 extends K ? never : K2]: T[K2]
}

export type $Typed<V, T extends string = string> = V & { $type: T }
export type Un$Typed<V extends { $type?: string }> = OmitKey<V, '$type'>

export type $Type<Id extends string, Hash extends string> = Hash extends 'main'
  ? Id
  : `${Id}#${Hash}`

function isObject<V>(v: V): v is V & object {
  return v != null && typeof v === 'object'
}

function is$type<Id extends string, Hash extends string>(
  $type: unknown,
  id: Id,
  hash: Hash,
): $type is $Type<Id, Hash> {
  return hash === 'main'
    ? $type === id
    : // $type === `${id}#${hash}`
      typeof $type === 'string' &&
        $type.length === id.length + 1 + hash.length &&
        $type.charCodeAt(id.length) === 35 /* '#' */ &&
        $type.startsWith(id) &&
        $type.endsWith(hash)
}

export type $TypedObject<
  V,
  Id extends string,
  Hash extends string,
> = V extends {
  $type: $Type<Id, Hash>
}
  ? V
  : V extends { $type?: string }
    ? V extends { $type?: infer T extends $Type<Id, Hash> }
      ? V & { $type: T }
      : never
    : V & { $type: $Type<Id, Hash> }

export function is$typed<V, Id extends string, Hash extends string>(
  v: V,
  id: Id,
  hash: Hash,
): v is $TypedObject<V, Id, Hash> {
  return isObject(v) && '$type' in v && is$type(v.$type, id, hash)
}

export function maybe$typed<V, Id extends string, Hash extends string>(
  v: V,
  id: Id,
  hash: Hash,
): v is V & object & { $type?: $Type<Id, Hash> } {
  return (
    isObject(v) &&
    ('$type' in v ? v.$type === undefined || is$type(v.$type, id, hash) : true)
  )
}

export type Validator<R = unknown> = (v: unknown) => ValidationResult<R>
export type ValidatorParam<V extends Validator> =
  V extends Validator<infer R> ? R : never

/**
 * Utility function that allows to convert a "validate*" utility function into a
 * type predicate.
 */
export function asPredicate<V extends Validator>(validate: V) {
  return function <T>(v: T): v is T & ValidatorParam<V> {
    return validate(v).success
  }
}
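A sketch of how the generated `asPredicate` helper is meant to compose with the `validate*` functions above. The `Validator` and `asPredicate` definitions are copied from the diff; the inline `validateEntry` is a hypothetical stand-in (the real one comes from the generated lexicon code and validates against the schema):

```typescript
// Types copied from packages/@wisp/lexicons/src/util.ts above
type ValidationResult<R = unknown> =
  | { success: true; value: R }
  | { success: false; error: Error }

type Validator<R = unknown> = (v: unknown) => ValidationResult<R>
type ValidatorParam<V extends Validator> = V extends Validator<infer R> ? R : never

// Wraps a "validate*" function into a boolean type guard
function asPredicate<V extends Validator>(validate: V) {
  return function <T>(v: T): v is T & ValidatorParam<V> {
    return validate(v).success
  }
}

// Hypothetical stand-in for a generated validator (e.g. validateEntry)
const validateEntry: Validator<{ name: string }> = (v) =>
  typeof v === 'object' && v !== null && typeof (v as any).name === 'string'
    ? { success: true, value: v as { name: string } }
    : { success: false, error: new Error('invalid entry') }

const isEntry = asPredicate(validateEntry)

console.log(isEntry({ name: 'index.html' })) // true
console.log(isEntry({}))                     // false
```

Inside the guarded branch, TypeScript narrows the value to the validator's result type, so callers can read `v.name` without a cast.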
+11
packages/@wisp/lexicons/tsconfig.json
···
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src",
    "declaration": true,
    "declarationMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
+34
packages/@wisp/observability/package.json
···
{
  "name": "@wisp/observability",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "main": "./src/index.ts",
  "types": "./src/index.ts",
  "exports": {
    ".": {
      "types": "./src/index.ts",
      "default": "./src/index.ts"
    },
    "./core": {
      "types": "./src/core.ts",
      "default": "./src/core.ts"
    },
    "./middleware/elysia": {
      "types": "./src/middleware/elysia.ts",
      "default": "./src/middleware/elysia.ts"
    },
    "./middleware/hono": {
      "types": "./src/middleware/hono.ts",
      "default": "./src/middleware/hono.ts"
    }
  },
  "peerDependencies": {
    "hono": "^4.0.0"
  },
  "peerDependenciesMeta": {
    "hono": {
      "optional": true
    }
  }
}
+368
packages/@wisp/observability/src/core.ts
···
/**
 * Core observability types and collectors
 * Framework-agnostic logging, error tracking, and metrics collection
 */

// ============================================================================
// Types
// ============================================================================

export interface LogEntry {
  id: string
  timestamp: Date
  level: 'info' | 'warn' | 'error' | 'debug'
  message: string
  service: string
  context?: Record<string, any>
  traceId?: string
  eventType?: string
}

export interface ErrorEntry {
  id: string
  timestamp: Date
  message: string
  stack?: string
  service: string
  context?: Record<string, any>
  count: number
  lastSeen: Date
}

export interface MetricEntry {
  timestamp: Date
  path: string
  method: string
  statusCode: number
  duration: number
  service: string
}

export interface LogFilter {
  level?: string
  service?: string
  limit?: number
  search?: string
  eventType?: string
}

export interface ErrorFilter {
  service?: string
  limit?: number
}

export interface MetricFilter {
  service?: string
  timeWindow?: number
}

export interface MetricStats {
  totalRequests: number
  avgDuration: number
  p50Duration: number
  p95Duration: number
  p99Duration: number
  errorRate: number
  requestsPerMinute: number
}

// ============================================================================
// Configuration
// ============================================================================

const MAX_LOGS = 5000
const MAX_ERRORS = 500
const MAX_METRICS = 10000

// ============================================================================
// Storage
// ============================================================================

const logs: LogEntry[] = []
const errors: Map<string, ErrorEntry> = new Map()
const metrics: MetricEntry[] = []

// ============================================================================
// Helpers
// ============================================================================

let logCounter = 0
let errorCounter = 0

function generateId(prefix: string, counter: number): string {
  return `${prefix}-${Date.now()}-${counter}`
}

function extractEventType(message: string): string | undefined {
  const match = message.match(/^\[([^\]]+)\]/)
  return match ? match[1] : undefined
}

// ============================================================================
// Log Collector
// ============================================================================

export const logCollector = {
  log(
    level: LogEntry['level'],
    message: string,
    service: string,
    context?: Record<string, any>,
    traceId?: string
  ) {
    const entry: LogEntry = {
      id: generateId('log', logCounter++),
      timestamp: new Date(),
      level,
      message,
      service,
      context,
      traceId,
      eventType: extractEventType(message)
    }

    logs.unshift(entry)

    // Rotate if needed
    if (logs.length > MAX_LOGS) {
      logs.splice(MAX_LOGS)
    }

    // Also log to console for compatibility
    const contextStr = context ? ` ${JSON.stringify(context)}` : ''
    const traceStr = traceId ? ` [trace:${traceId}]` : ''
    console[level === 'debug' ? 'log' : level](`[${service}] ${message}${contextStr}${traceStr}`)
  },

  info(message: string, service: string, context?: Record<string, any>, traceId?: string) {
    this.log('info', message, service, context, traceId)
  },

  warn(message: string, service: string, context?: Record<string, any>, traceId?: string) {
    this.log('warn', message, service, context, traceId)
  },

  error(
    message: string,
    service: string,
    error?: any,
    context?: Record<string, any>,
    traceId?: string
  ) {
    const ctx = { ...context }
    if (error instanceof Error) {
      ctx.error = error.message
      ctx.stack = error.stack
    } else if (error) {
      ctx.error = String(error)
    }
    this.log('error', message, service, ctx, traceId)

    // Also track in errors
    errorTracker.track(message, service, error, context)
  },

  debug(message: string, service: string, context?: Record<string, any>, traceId?: string) {
    const env = typeof Bun !== 'undefined' ? Bun.env.NODE_ENV : process.env.NODE_ENV;
    if (env !== 'production') {
      this.log('debug', message, service, context, traceId)
    }
  },

  getLogs(filter?: LogFilter) {
    let filtered = [...logs]

    if (filter?.level) {
      filtered = filtered.filter(log => log.level === filter.level)
    }

    if (filter?.service) {
      filtered = filtered.filter(log => log.service === filter.service)
    }

    if (filter?.eventType) {
      filtered = filtered.filter(log => log.eventType === filter.eventType)
    }

    if (filter?.search) {
      const search = filter.search.toLowerCase()
      filtered = filtered.filter(log =>
        log.message.toLowerCase().includes(search) ||
        (log.context ? JSON.stringify(log.context).toLowerCase().includes(search) : false)
      )
    }

    const limit = filter?.limit || 100
    return filtered.slice(0, limit)
  },

  clear() {
    logs.length = 0
  }
}

// ============================================================================
// Error Tracker
// ============================================================================

export const errorTracker = {
  track(message: string, service: string, error?: any, context?: Record<string, any>) {
    const key = `${service}:${message}`

    const existing = errors.get(key)
    if (existing) {
      existing.count++
      existing.lastSeen = new Date()
      if (context) {
        existing.context = { ...existing.context, ...context }
      }
    } else {
      const entry: ErrorEntry = {
        id: generateId('error', errorCounter++),
        timestamp: new Date(),
        message,
        service,
        context,
        count: 1,
        lastSeen: new Date()
      }

      if (error instanceof Error) {
        entry.stack = error.stack
      }

      errors.set(key, entry)

      // Rotate if needed
      if (errors.size > MAX_ERRORS) {
        const oldest = Array.from(errors.keys())[0]
        if (oldest !== undefined) {
          errors.delete(oldest)
        }
      }
    }
  },

  getErrors(filter?: ErrorFilter) {
    let filtered = Array.from(errors.values())

    if (filter?.service) {
      filtered = filtered.filter(err => err.service === filter.service)
    }

    // Sort by last seen (most recent first)
    filtered.sort((a, b) => b.lastSeen.getTime() - a.lastSeen.getTime())

    const limit = filter?.limit || 100
    return filtered.slice(0, limit)
  },

  clear() {
    errors.clear()
  }
}

// ============================================================================
// Metrics Collector
// ============================================================================

export const metricsCollector = {
  recordRequest(
    path: string,
    method: string,
    statusCode: number,
    duration: number,
    service: string
  ) {
    const entry: MetricEntry = {
      timestamp: new Date(),
      path,
      method,
      statusCode,
      duration,
      service
    }

    metrics.unshift(entry)

    // Rotate if needed
    if (metrics.length > MAX_METRICS) {
      metrics.splice(MAX_METRICS)
    }
  },

  getMetrics(filter?: MetricFilter) {
    let filtered = [...metrics]

    if (filter?.service) {
      filtered = filtered.filter(m => m.service === filter.service)
    }

    if (filter?.timeWindow) {
      const cutoff = Date.now() - filter.timeWindow
      filtered = filtered.filter(m => m.timestamp.getTime() > cutoff)
    }

    return filtered
  },

  getStats(service?: string, timeWindow: number = 3600000): MetricStats {
    const filtered = this.getMetrics({ service, timeWindow })

    if (filtered.length === 0) {
      return {
        totalRequests: 0,
        avgDuration: 0,
        p50Duration: 0,
        p95Duration: 0,
        p99Duration: 0,
        errorRate: 0,
        requestsPerMinute: 0
      }
    }

    const durations = filtered.map(m => m.duration).sort((a, b) => a - b)
    const totalDuration = durations.reduce((sum, d) => sum + d, 0)
    const errors = filtered.filter(m => m.statusCode >= 400).length

    const p50 = durations[Math.floor(durations.length * 0.5)]
    const p95 = durations[Math.floor(durations.length * 0.95)]
    const p99 = durations[Math.floor(durations.length * 0.99)]

    const timeWindowMinutes = timeWindow / 60000

    return {
      totalRequests: filtered.length,
      avgDuration: Math.round(totalDuration / filtered.length),
      p50Duration: Math.round(p50 ?? 0),
      p95Duration: Math.round(p95 ?? 0),
      p99Duration: Math.round(p99 ?? 0),
      errorRate: (errors / filtered.length) * 100,
      requestsPerMinute: Math.round(filtered.length / timeWindowMinutes)
    }
  },

  clear() {
    metrics.length = 0
  }
}

// ============================================================================
// Logger Factory
// ============================================================================

/**
 * Create a service-specific logger instance
 */
export function createLogger(service: string) {
  return {
    info: (message: string, context?: Record<string, any>) =>
      logCollector.info(message, service, context),
    warn: (message: string, context?: Record<string, any>) =>
      logCollector.warn(message, service, context),
    error: (message: string, error?: any, context?: Record<string, any>) =>
      logCollector.error(message, service, error, context),
    debug: (message: string, context?: Record<string, any>) =>
      logCollector.debug(message, service, context)
  }
}
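The percentile selection in `getStats` above picks the value at index `floor(length * p)` of the ascending-sorted durations. A minimal sketch of that rule in isolation, with made-up sample durations (not from the source):

```typescript
// Illustrative request durations in ms (hypothetical sample, not from the source)
const durations = [12, 5, 30, 8, 21, 100, 7, 15, 9, 11].sort((a, b) => a - b)
// sorted: [5, 7, 8, 9, 11, 12, 15, 21, 30, 100]

// Same selection rule as getStats: value at index floor(length * p)
const pct = (p: number) => durations[Math.floor(durations.length * p)] ?? 0

console.log(pct(0.5))  // 12  (index 5 of 10)
console.log(pct(0.95)) // 100 (index 9, the max of this small sample)
```

Note this is the nearest-rank style of percentile: with small samples, p95 and p99 often land on the same (maximum) element, which is why the collector keeps up to `MAX_METRICS = 10000` entries.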
+11
packages/@wisp/observability/src/index.ts
···
/**
 * @wisp/observability
 * Framework-agnostic observability package with Elysia and Hono middleware
 */

// Export everything from core
export * from './core'

// Note: Middleware should be imported from specific subpaths:
// - import { observabilityMiddleware } from '@wisp/observability/middleware/elysia'
// - import { observabilityMiddleware, observabilityErrorHandler } from '@wisp/observability/middleware/hono'
+49
packages/@wisp/observability/src/middleware/elysia.ts
···
import { metricsCollector, logCollector } from '../core'

/**
 * Elysia middleware for observability
 * Tracks request metrics and logs errors
 */
export function observabilityMiddleware(service: string) {
  return {
    beforeHandle: ({ request }: any) => {
      // Store start time on request object
      (request as any).__startTime = Date.now()
    },
    afterHandle: ({ request, set }: any) => {
      const duration = Date.now() - ((request as any).__startTime || Date.now())
      const url = new URL(request.url)

      metricsCollector.recordRequest(
        url.pathname,
        request.method,
        set.status || 200,
        duration,
        service
      )
    },
    onError: ({ request, error, set }: any) => {
      const duration = Date.now() - ((request as any).__startTime || Date.now())
      const url = new URL(request.url)

      metricsCollector.recordRequest(
        url.pathname,
        request.method,
        set.status || 500,
        duration,
        service
      )

      // Don't log 404 errors
      const statusCode = set.status || 500
      if (statusCode !== 404) {
        logCollector.error(
          `Request failed: ${request.method} ${url.pathname}`,
          service,
          error,
          { statusCode }
        )
      }
    }
  }
}
+44
packages/@wisp/observability/src/middleware/hono.ts
···
import type { Context } from 'hono'
import { metricsCollector, logCollector } from '../core'

/**
 * Hono middleware for observability
 * Tracks request metrics
 */
export function observabilityMiddleware(service: string) {
  return async (c: Context, next: () => Promise<void>) => {
    const startTime = Date.now()

    await next()

    const duration = Date.now() - startTime
    const { pathname } = new URL(c.req.url)

    metricsCollector.recordRequest(
      pathname,
      c.req.method,
      c.res.status,
      duration,
      service
    )
  }
}

/**
 * Hono error handler for observability
 * Logs errors with context
 */
export function observabilityErrorHandler(service: string) {
  return (err: Error, c: Context) => {
    const { pathname } = new URL(c.req.url)

    logCollector.error(
      `Request failed: ${c.req.method} ${pathname}`,
      service,
      err,
      { statusCode: c.res.status || 500 }
    )

    return c.text('Internal Server Error', 500)
  }
}
+9
packages/@wisp/observability/tsconfig.json
···
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
+14
packages/@wisp/safe-fetch/package.json
···
{
  "name": "@wisp/safe-fetch",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "main": "./src/index.ts",
  "types": "./src/index.ts",
  "exports": {
    ".": {
      "types": "./src/index.ts",
      "default": "./src/index.ts"
    }
  }
}
+187
packages/@wisp/safe-fetch/src/index.ts
···
/**
 * SSRF-hardened fetch utility
 * Prevents requests to private networks, localhost, and enforces timeouts/size limits
 */

const BLOCKED_IP_RANGES = [
  /^127\./, // 127.0.0.0/8 - Loopback
  /^10\./, // 10.0.0.0/8 - Private
  /^172\.(1[6-9]|2\d|3[01])\./, // 172.16.0.0/12 - Private
  /^192\.168\./, // 192.168.0.0/16 - Private
  /^169\.254\./, // 169.254.0.0/16 - Link-local
  /^::1$/, // IPv6 loopback
  /^fe80:/, // IPv6 link-local
  /^fc00:/, // IPv6 unique local
  /^fd00:/, // IPv6 unique local
];

const BLOCKED_HOSTS = [
  'localhost',
  'metadata.google.internal',
  '169.254.169.254',
];

const FETCH_TIMEOUT = 120000; // 120 seconds
const FETCH_TIMEOUT_BLOB = 120000; // 2 minutes for blob downloads
const MAX_RESPONSE_SIZE = 10 * 1024 * 1024; // 10MB
const MAX_JSON_SIZE = 1024 * 1024; // 1MB
const MAX_BLOB_SIZE = 500 * 1024 * 1024; // 500MB
const MAX_REDIRECTS = 10;

function isBlockedHost(hostname: string): boolean {
  const lowerHost = hostname.toLowerCase();

  if (BLOCKED_HOSTS.includes(lowerHost)) {
    return true;
  }

  for (const pattern of BLOCKED_IP_RANGES) {
    if (pattern.test(lowerHost)) {
      return true;
    }
  }

  return false;
}

export async function safeFetch(
  url: string,
  options?: RequestInit & { maxSize?: number; timeout?: number }
): Promise<Response> {
  const timeoutMs = options?.timeout ?? FETCH_TIMEOUT;
  const maxSize = options?.maxSize ?? MAX_RESPONSE_SIZE;

  // Parse and validate URL
  let parsedUrl: URL;
  try {
    parsedUrl = new URL(url);
  } catch (err) {
    throw new Error(`Invalid URL: ${url}`);
  }

  if (!['http:', 'https:'].includes(parsedUrl.protocol)) {
    throw new Error(`Blocked protocol: ${parsedUrl.protocol}`);
  }

  const hostname = parsedUrl.hostname;
  if (isBlockedHost(hostname)) {
    throw new Error(`Blocked host: ${hostname}`);
  }

  const controller = new AbortController();
  const timeoutId = setTimeout(() => controller.abort(), timeoutMs);

  try {
    const response = await fetch(url, {
      ...options,
      signal: controller.signal,
      redirect: 'follow',
    });

    const contentLength = response.headers.get('content-length');
    if (contentLength && parseInt(contentLength, 10) > maxSize) {
      throw new Error(`Response too large: ${contentLength} bytes`);
    }

    return response;
  } catch (err) {
    if (err instanceof Error && err.name === 'AbortError') {
      throw new Error(`Request timeout after ${timeoutMs}ms`);
    }
    throw err;
  } finally {
    clearTimeout(timeoutId);
  }
}

export async function safeFetchJson<T = any>(
  url: string,
  options?: RequestInit & { maxSize?: number; timeout?: number }
): Promise<T> {
  const maxJsonSize = options?.maxSize ?? MAX_JSON_SIZE;
  const response = await safeFetch(url, { ...options, maxSize: maxJsonSize });

  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }

  const reader = response.body?.getReader();
  if (!reader) {
    throw new Error('No response body');
  }

  const chunks: Uint8Array[] = [];
  let totalSize = 0;

  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;

      totalSize += value.length;
      if (totalSize > maxJsonSize) {
        throw new Error(`Response exceeds max size: ${maxJsonSize} bytes`);
      }

      chunks.push(value);
    }
  } finally {
    reader.releaseLock();
  }

  const combined = new Uint8Array(totalSize);
  let offset = 0;
  for (const chunk of chunks) {
    combined.set(chunk, offset);
    offset += chunk.length;
  }

  const text = new TextDecoder().decode(combined);
  return JSON.parse(text);
}

export async function safeFetchBlob(
  url: string,
  options?: RequestInit & { maxSize?: number; timeout?: number }
): Promise<Uint8Array> {
  const maxBlobSize = options?.maxSize ?? MAX_BLOB_SIZE;
  const timeoutMs = options?.timeout ?? FETCH_TIMEOUT_BLOB;
  const response = await safeFetch(url, { ...options, maxSize: maxBlobSize, timeout: timeoutMs });

  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }

  const reader = response.body?.getReader();
  if (!reader) {
    throw new Error('No response body');
  }

  const chunks: Uint8Array[] = [];
  let totalSize = 0;

  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;

      totalSize += value.length;
      if (totalSize > maxBlobSize) {
        throw new Error(`Blob exceeds max size: ${maxBlobSize} bytes`);
      }

      chunks.push(value);
    }
  } finally {
    reader.releaseLock();
  }

  const combined = new Uint8Array(totalSize);
  let offset = 0;
  for (const chunk of chunks) {
    combined.set(chunk, offset);
    offset += chunk.length;
  }

  return combined;
}
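The host-blocking logic in `safeFetch` can be exercised without any network calls. This sketch restates `isBlockedHost` with the same `BLOCKED_HOSTS` list and the IPv4 patterns copied from `BLOCKED_IP_RANGES` above (IPv6 patterns omitted here for brevity):

```typescript
// Patterns copied from BLOCKED_IP_RANGES in safe-fetch (IPv4 subset)
const BLOCKED_IP_RANGES = [
  /^127\./,                     // loopback
  /^10\./,                      // private
  /^172\.(1[6-9]|2\d|3[01])\./, // 172.16.0.0/12 private
  /^192\.168\./,                // private
  /^169\.254\./,                // link-local
]

const BLOCKED_HOSTS = ['localhost', 'metadata.google.internal', '169.254.169.254']

function isBlockedHost(hostname: string): boolean {
  const h = hostname.toLowerCase()
  return BLOCKED_HOSTS.includes(h) || BLOCKED_IP_RANGES.some((p) => p.test(h))
}

console.log(isBlockedHost('169.254.169.254')) // true  (cloud metadata endpoint)
console.log(isBlockedHost('172.20.0.1'))      // true  (inside 172.16.0.0/12)
console.log(isBlockedHost('172.32.0.1'))      // false (outside the /12)
console.log(isBlockedHost('example.com'))     // false
```

One caveat worth noting: because `safeFetch` passes `redirect: 'follow'`, the hostname check applies only to the initial URL, not to redirect targets, and the check matches hostname strings rather than resolved IPs (so DNS rebinding is out of scope for this layer).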
+9
packages/@wisp/safe-fetch/tsconfig.json
···
{
  "extends": "../../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
-820
public/admin/admin.tsx
···
import { StrictMode, useState, useEffect } from 'react'
import { createRoot } from 'react-dom/client'
import './styles.css'

// Types
interface LogEntry {
  id: string
  timestamp: string
  level: 'info' | 'warn' | 'error' | 'debug'
  message: string
  service: string
  context?: Record<string, any>
  eventType?: string
}

interface ErrorEntry {
  id: string
  timestamp: string
  message: string
  stack?: string
  service: string
  count: number
  lastSeen: string
}

interface MetricsStats {
  totalRequests: number
  avgDuration: number
  p50Duration: number
  p95Duration: number
  p99Duration: number
  errorRate: number
  requestsPerMinute: number
}

// Helper function to format Unix timestamp from database
function formatDbDate(timestamp: number | string): Date {
  const num = typeof timestamp === 'string' ? parseFloat(timestamp) : timestamp
  return new Date(num * 1000) // Convert seconds to milliseconds
}

// Login Component
function Login({ onLogin }: { onLogin: () => void }) {
  const [username, setUsername] = useState('')
  const [password, setPassword] = useState('')
  const [error, setError] = useState('')
  const [loading, setLoading] = useState(false)

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault()
    setError('')
    setLoading(true)

    try {
      const res = await fetch('/api/admin/login', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ username, password }),
        credentials: 'include'
      })

      if (res.ok) {
        onLogin()
      } else {
        setError('Invalid credentials')
      }
    } catch (err) {
      setError('Failed to login')
    } finally {
      setLoading(false)
    }
  }

  return (
    <div className="min-h-screen bg-gray-950 flex items-center justify-center p-4">
      <div className="w-full max-w-md">
        <div className="bg-gray-900 border border-gray-800 rounded-lg p-8 shadow-xl">
          <h1 className="text-2xl font-bold text-white mb-6">Admin Login</h1>
          <form onSubmit={handleSubmit} className="space-y-4">
            <div>
              <label className="block text-sm font-medium text-gray-300 mb-2">
                Username
              </label>
              <input
                type="text"
                value={username}
                onChange={(e) => setUsername(e.target.value)}
                className="w-full px-3 py-2 bg-gray-800 border border-gray-700 rounded text-white focus:outline-none focus:border-blue-500"
                required
              />
            </div>
            <div>
              <label className="block text-sm font-medium text-gray-300 mb-2">
                Password
              </label>
              <input
                type="password"
                value={password}
                onChange={(e) => setPassword(e.target.value)}
                className="w-full px-3 py-2 bg-gray-800 border border-gray-700 rounded text-white focus:outline-none focus:border-blue-500"
                required
              />
            </div>
            {error && (
              <div className="text-red-400 text-sm">{error}</div>
            )}
            <button
              type="submit"
              disabled={loading}
              className="w-full bg-blue-600 hover:bg-blue-700 disabled:bg-gray-700 text-white font-medium py-2 px-4 rounded transition-colors"
            >
              {loading ? 'Logging in...' : 'Login'}
            </button>
          </form>
        </div>
      </div>
    </div>
  )
}

// Dashboard Component
function Dashboard() {
  const [tab, setTab] = useState('overview')
  const [logs, setLogs] = useState<LogEntry[]>([])
  const [errors, setErrors] = useState<ErrorEntry[]>([])
  const [metrics, setMetrics] = useState<any>(null)
  const [database, setDatabase] = useState<any>(null)
  const [sites, setSites] = useState<any>(null)
  const [health, setHealth] = useState<any>(null)
  const [autoRefresh, setAutoRefresh] = useState(true)

  // Filters
  const [logLevel, setLogLevel] = useState('')
  const [logService, setLogService] = useState('')
  const [logSearch, setLogSearch] = useState('')
  const [logEventType, setLogEventType] = useState('')

  const fetchLogs = async () => {
    const params = new URLSearchParams()
    if (logLevel) params.append('level', logLevel)
    if (logService) params.append('service', logService)
    if (logSearch) params.append('search', logSearch)
    if (logEventType) params.append('eventType', logEventType)
    params.append('limit', '100')

    const res = await fetch(`/api/admin/logs?${params}`, { credentials: 'include' })
    if (res.ok) {
      const data = await res.json()
      setLogs(data.logs)
    }
  }

  const fetchErrors = async () => {
    const res = await fetch('/api/admin/errors', { credentials: 'include' })
    if (res.ok) {
      const data = await res.json()
      setErrors(data.errors)
    }
  }

  const fetchMetrics = async () => {
-
const res = await fetch('/api/admin/metrics', { credentials: 'include' })
-
if (res.ok) {
-
const data = await res.json()
-
setMetrics(data)
-
}
-
}
-
-
const fetchDatabase = async () => {
-
const res = await fetch('/api/admin/database', { credentials: 'include' })
-
if (res.ok) {
-
const data = await res.json()
-
setDatabase(data)
-
}
-
}
-
-
const fetchSites = async () => {
-
const res = await fetch('/api/admin/sites', { credentials: 'include' })
-
if (res.ok) {
-
const data = await res.json()
-
setSites(data)
-
}
-
}
-
-
const fetchHealth = async () => {
-
const res = await fetch('/api/admin/health', { credentials: 'include' })
-
if (res.ok) {
-
const data = await res.json()
-
setHealth(data)
-
}
-
}
-
-
const logout = async () => {
-
await fetch('/api/admin/logout', { method: 'POST', credentials: 'include' })
-
window.location.reload()
-
}
-
-
useEffect(() => {
-
fetchMetrics()
-
fetchDatabase()
-
fetchHealth()
-
fetchLogs()
-
fetchErrors()
-
fetchSites()
-
}, [])
-
-
useEffect(() => {
-
fetchLogs()
-
}, [logLevel, logService, logSearch, logEventType])
-
-
useEffect(() => {
-
if (!autoRefresh) return
-
-
const interval = setInterval(() => {
-
if (tab === 'overview') {
-
fetchMetrics()
-
fetchHealth()
-
} else if (tab === 'logs') {
-
fetchLogs()
-
} else if (tab === 'errors') {
-
fetchErrors()
-
} else if (tab === 'database') {
-
fetchDatabase()
-
} else if (tab === 'sites') {
-
fetchSites()
-
}
-
}, 5000)
-
-
return () => clearInterval(interval)
-
}, [tab, autoRefresh, logLevel, logService, logSearch, logEventType])
-
-
const formatDuration = (ms: number) => {
-
if (ms < 1000) return `${ms}ms`
-
return `${(ms / 1000).toFixed(2)}s`
-
}
-
-
const formatUptime = (seconds: number) => {
-
const hours = Math.floor(seconds / 3600)
-
const minutes = Math.floor((seconds % 3600) / 60)
-
return `${hours}h ${minutes}m`
-
}
-
-
return (
-
<div className="min-h-screen bg-gray-950 text-white">
-
{/* Header */}
-
<div className="bg-gray-900 border-b border-gray-800 px-6 py-4">
-
<div className="flex items-center justify-between">
-
<h1 className="text-2xl font-bold">Wisp.place Admin</h1>
-
<div className="flex items-center gap-4">
-
<label className="flex items-center gap-2 text-sm text-gray-400">
-
<input
-
type="checkbox"
-
checked={autoRefresh}
-
onChange={(e) => setAutoRefresh(e.target.checked)}
-
className="rounded"
-
/>
-
Auto-refresh
-
</label>
-
<button
-
onClick={logout}
-
className="px-4 py-2 bg-gray-800 hover:bg-gray-700 rounded text-sm"
-
>
-
Logout
-
</button>
-
</div>
-
</div>
-
</div>
-
-
{/* Tabs */}
-
<div className="bg-gray-900 border-b border-gray-800 px-6">
-
<div className="flex gap-1">
-
{['overview', 'logs', 'errors', 'database', 'sites'].map((t) => (
-
<button
-
key={t}
-
onClick={() => setTab(t)}
-
className={`px-4 py-3 text-sm font-medium capitalize transition-colors ${
-
tab === t
-
? 'text-white border-b-2 border-blue-500'
-
: 'text-gray-400 hover:text-white'
-
}`}
-
>
-
{t}
-
</button>
-
))}
-
</div>
-
</div>
-
-
{/* Content */}
-
<div className="p-6">
-
{tab === 'overview' && (
-
<div className="space-y-6">
-
{/* Health */}
-
{health && (
-
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">Uptime</div>
-
<div className="text-2xl font-bold">{formatUptime(health.uptime)}</div>
-
</div>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">Memory Used</div>
-
<div className="text-2xl font-bold">{health.memory.heapUsed} MB</div>
-
</div>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">RSS</div>
-
<div className="text-2xl font-bold">{health.memory.rss} MB</div>
-
</div>
-
</div>
-
)}
-
-
{/* Metrics */}
-
{metrics && (
-
<div>
-
<h2 className="text-xl font-bold mb-4">Performance Metrics</h2>
-
<div className="space-y-4">
-
{/* Overall */}
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<h3 className="text-lg font-semibold mb-3">Overall (Last Hour)</h3>
-
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
-
<div>
-
<div className="text-sm text-gray-400">Total Requests</div>
-
<div className="text-xl font-bold">{metrics.overall.totalRequests}</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Avg Duration</div>
-
<div className="text-xl font-bold">{metrics.overall.avgDuration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">P95 Duration</div>
-
<div className="text-xl font-bold">{metrics.overall.p95Duration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Error Rate</div>
-
<div className="text-xl font-bold">{metrics.overall.errorRate.toFixed(2)}%</div>
-
</div>
-
</div>
-
</div>
-
-
{/* Main App */}
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<h3 className="text-lg font-semibold mb-3">Main App</h3>
-
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
-
<div>
-
<div className="text-sm text-gray-400">Requests</div>
-
<div className="text-xl font-bold">{metrics.mainApp.totalRequests}</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Avg</div>
-
<div className="text-xl font-bold">{metrics.mainApp.avgDuration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">P95</div>
-
<div className="text-xl font-bold">{metrics.mainApp.p95Duration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Req/min</div>
-
<div className="text-xl font-bold">{metrics.mainApp.requestsPerMinute}</div>
-
</div>
-
</div>
-
</div>
-
-
{/* Hosting Service */}
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<h3 className="text-lg font-semibold mb-3">Hosting Service</h3>
-
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
-
<div>
-
<div className="text-sm text-gray-400">Requests</div>
-
<div className="text-xl font-bold">{metrics.hostingService.totalRequests}</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Avg</div>
-
<div className="text-xl font-bold">{metrics.hostingService.avgDuration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">P95</div>
-
<div className="text-xl font-bold">{metrics.hostingService.p95Duration}ms</div>
-
</div>
-
<div>
-
<div className="text-sm text-gray-400">Req/min</div>
-
<div className="text-xl font-bold">{metrics.hostingService.requestsPerMinute}</div>
-
</div>
-
</div>
-
</div>
-
</div>
-
</div>
-
)}
-
</div>
-
)}
-
-
{tab === 'logs' && (
-
<div className="space-y-4">
-
<div className="flex gap-4">
-
<select
-
value={logLevel}
-
onChange={(e) => setLogLevel(e.target.value)}
-
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
-
>
-
<option value="">All Levels</option>
-
<option value="info">Info</option>
-
<option value="warn">Warn</option>
-
<option value="error">Error</option>
-
<option value="debug">Debug</option>
-
</select>
-
<select
-
value={logService}
-
onChange={(e) => setLogService(e.target.value)}
-
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
-
>
-
<option value="">All Services</option>
-
<option value="main-app">Main App</option>
-
<option value="hosting-service">Hosting Service</option>
-
</select>
-
<select
-
value={logEventType}
-
onChange={(e) => setLogEventType(e.target.value)}
-
className="px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
-
>
-
<option value="">All Event Types</option>
-
<option value="DNS Verifier">DNS Verifier</option>
-
<option value="Auth">Auth</option>
-
<option value="User">User</option>
-
<option value="Domain">Domain</option>
-
<option value="Site">Site</option>
-
<option value="File Upload">File Upload</option>
-
<option value="Sync">Sync</option>
-
<option value="Maintenance">Maintenance</option>
-
<option value="KeyRotation">Key Rotation</option>
-
<option value="Cleanup">Cleanup</option>
-
<option value="Cache">Cache</option>
-
<option value="FirehoseWorker">Firehose Worker</option>
-
</select>
-
<input
-
type="text"
-
value={logSearch}
-
onChange={(e) => setLogSearch(e.target.value)}
-
placeholder="Search logs..."
-
className="flex-1 px-3 py-2 bg-gray-900 border border-gray-800 rounded text-white"
-
/>
-
</div>
-
-
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
-
<div className="max-h-[600px] overflow-y-auto">
-
<table className="w-full text-sm">
-
<thead className="bg-gray-800 sticky top-0">
-
<tr>
-
<th className="px-4 py-2 text-left">Time</th>
-
<th className="px-4 py-2 text-left">Level</th>
-
<th className="px-4 py-2 text-left">Service</th>
-
<th className="px-4 py-2 text-left">Event Type</th>
-
<th className="px-4 py-2 text-left">Message</th>
-
</tr>
-
</thead>
-
<tbody>
-
{logs.map((log) => (
-
<tr key={log.id} className="border-t border-gray-800 hover:bg-gray-800">
-
<td className="px-4 py-2 text-gray-400 whitespace-nowrap">
-
{new Date(log.timestamp).toLocaleTimeString()}
-
</td>
-
<td className="px-4 py-2">
-
<span
-
className={`px-2 py-1 rounded text-xs font-medium ${
-
log.level === 'error'
-
? 'bg-red-900 text-red-200'
-
: log.level === 'warn'
-
? 'bg-yellow-900 text-yellow-200'
-
: log.level === 'info'
-
? 'bg-blue-900 text-blue-200'
-
: 'bg-gray-700 text-gray-300'
-
}`}
-
>
-
{log.level}
-
</span>
-
</td>
-
<td className="px-4 py-2 text-gray-400">{log.service}</td>
-
<td className="px-4 py-2">
-
{log.eventType && (
-
<span className="px-2 py-1 bg-purple-900 text-purple-200 rounded text-xs font-medium">
-
{log.eventType}
-
</span>
-
)}
-
</td>
-
<td className="px-4 py-2">
-
<div>{log.message}</div>
-
{log.context && Object.keys(log.context).length > 0 && (
-
<div className="text-xs text-gray-500 mt-1">
-
{JSON.stringify(log.context)}
-
</div>
-
)}
-
</td>
-
</tr>
-
))}
-
</tbody>
-
</table>
-
</div>
-
</div>
-
</div>
-
)}
-
-
{tab === 'errors' && (
-
<div className="space-y-4">
-
<h2 className="text-xl font-bold">Recent Errors</h2>
-
<div className="space-y-3">
-
{errors.map((error) => (
-
<div key={error.id} className="bg-gray-900 border border-red-900 rounded-lg p-4">
-
<div className="flex items-start justify-between mb-2">
-
<div className="flex-1">
-
<div className="font-semibold text-red-400">{error.message}</div>
-
<div className="text-sm text-gray-400 mt-1">
-
Service: {error.service} • Count: {error.count} • Last seen:{' '}
-
{new Date(error.lastSeen).toLocaleString()}
-
</div>
-
</div>
-
</div>
-
{error.stack && (
-
<pre className="text-xs text-gray-500 bg-gray-950 p-2 rounded mt-2 overflow-x-auto">
-
{error.stack}
-
</pre>
-
)}
-
</div>
-
))}
-
{errors.length === 0 && (
-
<div className="text-center text-gray-500 py-8">No errors found</div>
-
)}
-
</div>
-
</div>
-
)}
-
-
{tab === 'database' && database && (
-
<div className="space-y-6">
-
{/* Stats */}
-
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">Total Sites</div>
-
<div className="text-3xl font-bold">{database.stats.totalSites}</div>
-
</div>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">Wisp Subdomains</div>
-
<div className="text-3xl font-bold">{database.stats.totalWispSubdomains}</div>
-
</div>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg p-4">
-
<div className="text-sm text-gray-400 mb-1">Custom Domains</div>
-
<div className="text-3xl font-bold">{database.stats.totalCustomDomains}</div>
-
</div>
-
</div>
-
-
{/* Recent Sites */}
-
<div>
-
<h3 className="text-lg font-semibold mb-3">Recent Sites</h3>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
-
<table className="w-full text-sm">
-
<thead className="bg-gray-800">
-
<tr>
-
<th className="px-4 py-2 text-left">Site Name</th>
-
<th className="px-4 py-2 text-left">Subdomain</th>
-
<th className="px-4 py-2 text-left">DID</th>
-
<th className="px-4 py-2 text-left">RKey</th>
-
<th className="px-4 py-2 text-left">Created</th>
-
<th className="px-4 py-2 text-left">PDSls</th>
-
</tr>
-
</thead>
-
<tbody>
-
{database.recentSites.map((site: any, i: number) => (
-
<tr key={i} className="border-t border-gray-800">
-
<td className="px-4 py-2">{site.display_name || 'Untitled'}</td>
-
<td className="px-4 py-2">
-
{site.subdomain ? (
-
<a
-
href={`https://${site.subdomain}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:underline"
-
>
-
{site.subdomain}
-
</a>
-
) : (
-
<span className="text-gray-500">No domain</span>
-
)}
-
</td>
-
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
-
{site.did.slice(0, 20)}...
-
</td>
-
<td className="px-4 py-2 text-gray-400">{site.rkey || 'self'}</td>
-
<td className="px-4 py-2 text-gray-400">
-
{formatDbDate(site.created_at).toLocaleDateString()}
-
</td>
-
<td className="px-4 py-2">
-
<a
-
href={`https://pdsls.dev/at://${site.did}/place.wisp.fs/${site.rkey || 'self'}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:text-blue-300 transition-colors"
-
title="View on PDSls.dev"
-
>
-
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
-
</svg>
-
</a>
-
</td>
-
</tr>
-
))}
-
</tbody>
-
</table>
-
</div>
-
</div>
-
-
{/* Recent Domains */}
-
<div>
-
<h3 className="text-lg font-semibold mb-3">Recent Custom Domains</h3>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
-
<table className="w-full text-sm">
-
<thead className="bg-gray-800">
-
<tr>
-
<th className="px-4 py-2 text-left">Domain</th>
-
<th className="px-4 py-2 text-left">DID</th>
-
<th className="px-4 py-2 text-left">Verified</th>
-
<th className="px-4 py-2 text-left">Created</th>
-
</tr>
-
</thead>
-
<tbody>
-
{database.recentDomains.map((domain: any, i: number) => (
-
<tr key={i} className="border-t border-gray-800">
-
<td className="px-4 py-2">{domain.domain}</td>
-
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
-
{domain.did.slice(0, 20)}...
-
</td>
-
<td className="px-4 py-2">
-
<span
-
className={`px-2 py-1 rounded text-xs ${
-
domain.verified
-
? 'bg-green-900 text-green-200'
-
: 'bg-yellow-900 text-yellow-200'
-
}`}
-
>
-
{domain.verified ? 'Yes' : 'No'}
-
</span>
-
</td>
-
<td className="px-4 py-2 text-gray-400">
-
{formatDbDate(domain.created_at).toLocaleDateString()}
-
</td>
-
</tr>
-
))}
-
</tbody>
-
</table>
-
</div>
-
</div>
-
</div>
-
)}
-
-
{tab === 'sites' && sites && (
-
<div className="space-y-6">
-
{/* All Sites */}
-
<div>
-
<h3 className="text-lg font-semibold mb-3">All Sites</h3>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
-
<table className="w-full text-sm">
-
<thead className="bg-gray-800">
-
<tr>
-
<th className="px-4 py-2 text-left">Site Name</th>
-
<th className="px-4 py-2 text-left">Subdomain</th>
-
<th className="px-4 py-2 text-left">DID</th>
-
<th className="px-4 py-2 text-left">RKey</th>
-
<th className="px-4 py-2 text-left">Created</th>
-
<th className="px-4 py-2 text-left">PDSls</th>
-
</tr>
-
</thead>
-
<tbody>
-
{sites.sites.map((site: any, i: number) => (
-
<tr key={i} className="border-t border-gray-800 hover:bg-gray-800">
-
<td className="px-4 py-2">{site.display_name || 'Untitled'}</td>
-
<td className="px-4 py-2">
-
{site.subdomain ? (
-
<a
-
href={`https://${site.subdomain}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:underline"
-
>
-
{site.subdomain}
-
</a>
-
) : (
-
<span className="text-gray-500">No domain</span>
-
)}
-
</td>
-
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
-
{site.did.slice(0, 30)}...
-
</td>
-
<td className="px-4 py-2 text-gray-400">{site.rkey || 'self'}</td>
-
<td className="px-4 py-2 text-gray-400">
-
{formatDbDate(site.created_at).toLocaleString()}
-
</td>
-
<td className="px-4 py-2">
-
<a
-
href={`https://pdsls.dev/at://${site.did}/place.wisp.fs/${site.rkey || 'self'}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:text-blue-300 transition-colors"
-
title="View on PDSls.dev"
-
>
-
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
-
</svg>
-
</a>
-
</td>
-
</tr>
-
))}
-
</tbody>
-
</table>
-
</div>
-
</div>
-
-
{/* Custom Domains */}
-
<div>
-
<h3 className="text-lg font-semibold mb-3">Custom Domains</h3>
-
<div className="bg-gray-900 border border-gray-800 rounded-lg overflow-hidden">
-
<table className="w-full text-sm">
-
<thead className="bg-gray-800">
-
<tr>
-
<th className="px-4 py-2 text-left">Domain</th>
-
<th className="px-4 py-2 text-left">Verified</th>
-
<th className="px-4 py-2 text-left">DID</th>
-
<th className="px-4 py-2 text-left">RKey</th>
-
<th className="px-4 py-2 text-left">Created</th>
-
<th className="px-4 py-2 text-left">PDSls</th>
-
</tr>
-
</thead>
-
<tbody>
-
{sites.customDomains.map((domain: any, i: number) => (
-
<tr key={i} className="border-t border-gray-800 hover:bg-gray-800">
-
<td className="px-4 py-2">
-
{domain.verified ? (
-
<a
-
href={`https://${domain.domain}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:underline"
-
>
-
{domain.domain}
-
</a>
-
) : (
-
<span className="text-gray-400">{domain.domain}</span>
-
)}
-
</td>
-
<td className="px-4 py-2">
-
<span
-
className={`px-2 py-1 rounded text-xs ${
-
domain.verified
-
? 'bg-green-900 text-green-200'
-
: 'bg-yellow-900 text-yellow-200'
-
}`}
-
>
-
{domain.verified ? 'Yes' : 'Pending'}
-
</span>
-
</td>
-
<td className="px-4 py-2 text-gray-400 font-mono text-xs">
-
{domain.did.slice(0, 30)}...
-
</td>
-
<td className="px-4 py-2 text-gray-400">{domain.rkey || 'self'}</td>
-
<td className="px-4 py-2 text-gray-400">
-
{formatDbDate(domain.created_at).toLocaleString()}
-
</td>
-
<td className="px-4 py-2">
-
<a
-
href={`https://pdsls.dev/at://${domain.did}/place.wisp.fs/${domain.rkey || 'self'}`}
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-blue-400 hover:text-blue-300 transition-colors"
-
title="View on PDSls.dev"
-
>
-
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 6H6a2 2 0 00-2 2v10a2 2 0 002 2h10a2 2 0 002-2v-4M14 4h6m0 0v6m0-6L10 14" />
-
</svg>
-
</a>
-
</td>
-
</tr>
-
))}
-
</tbody>
-
</table>
-
</div>
-
</div>
-
</div>
-
)}
-
</div>
-
</div>
-
)
-
}
-
-
// Main App
-
function App() {
-
const [authenticated, setAuthenticated] = useState(false)
-
const [checking, setChecking] = useState(true)
-
-
useEffect(() => {
-
fetch('/api/admin/status', { credentials: 'include' })
-
.then((res) => res.json())
-
.then((data) => {
-
setAuthenticated(data.authenticated)
-
setChecking(false)
-
})
-
.catch(() => {
-
setChecking(false)
-
})
-
}, [])
-
-
if (checking) {
-
return (
-
<div className="min-h-screen bg-gray-950 flex items-center justify-center">
-
<div className="text-white">Loading...</div>
-
</div>
-
)
-
}
-
-
if (!authenticated) {
-
return <Login onLogin={() => setAuthenticated(true)} />
-
}
-
-
return <Dashboard />
-
}
-
-
createRoot(document.getElementById('root')!).render(
-
<StrictMode>
-
<App />
-
</StrictMode>
-
)
-13
public/admin/index.html
···
-
<!DOCTYPE html>
-
<html lang="en">
-
<head>
-
<meta charset="UTF-8" />
-
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Admin Dashboard - Wisp.place</title>
-
<link rel="stylesheet" href="./styles.css" />
-
</head>
-
<body>
-
<div id="root"></div>
-
<script type="module" src="./admin.tsx"></script>
-
</body>
-
</html>
-1
public/admin/styles.css
···
-
@import "tailwindcss";
-46
public/components/ui/badge.tsx
···
-
import * as React from "react"
-
import { Slot } from "@radix-ui/react-slot"
-
import { cva, type VariantProps } from "class-variance-authority"
-
-
import { cn } from "@public/lib/utils"
-
-
const badgeVariants = cva(
-
"inline-flex items-center justify-center rounded-full border px-2 py-0.5 text-xs font-medium w-fit whitespace-nowrap shrink-0 [&>svg]:size-3 gap-1 [&>svg]:pointer-events-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive transition-[color,box-shadow] overflow-hidden",
-
{
-
variants: {
-
variant: {
-
default:
-
"border-transparent bg-primary text-primary-foreground [a&]:hover:bg-primary/90",
-
secondary:
-
"border-transparent bg-secondary text-secondary-foreground [a&]:hover:bg-secondary/90",
-
destructive:
-
"border-transparent bg-destructive text-white [a&]:hover:bg-destructive/90 focus-visible:ring-destructive/20 dark:focus-visible:ring-destructive/40 dark:bg-destructive/60",
-
outline:
-
"text-foreground [a&]:hover:bg-accent [a&]:hover:text-accent-foreground",
-
},
-
},
-
defaultVariants: {
-
variant: "default",
-
},
-
}
-
)
-
-
function Badge({
-
className,
-
variant,
-
asChild = false,
-
...props
-
}: React.ComponentProps<"span"> &
-
VariantProps<typeof badgeVariants> & { asChild?: boolean }) {
-
const Comp = asChild ? Slot : "span"
-
-
return (
-
<Comp
-
data-slot="badge"
-
className={cn(badgeVariants({ variant }), className)}
-
{...props}
-
/>
-
)
-
}
-
-
export { Badge, badgeVariants }
-60
public/components/ui/button.tsx
···
-
import * as React from "react"
-
import { Slot } from "@radix-ui/react-slot"
-
import { cva, type VariantProps } from "class-variance-authority"
-
-
import { cn } from "@public/lib/utils"
-
-
const buttonVariants = cva(
-
"inline-flex items-center justify-center gap-2 whitespace-nowrap rounded-md text-sm font-medium transition-all disabled:pointer-events-none disabled:opacity-50 [&_svg]:pointer-events-none [&_svg:not([class*='size-'])]:size-4 shrink-0 [&_svg]:shrink-0 outline-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
-
{
-
variants: {
-
variant: {
-
default: "bg-primary text-primary-foreground hover:bg-primary/90",
-
destructive:
-
"bg-destructive text-white hover:bg-destructive/90 focus-visible:ring-destructive/20 dark:focus-visible:ring-destructive/40 dark:bg-destructive/60",
-
outline:
-
"border bg-background shadow-xs hover:bg-accent hover:text-accent-foreground dark:bg-input/30 dark:border-input dark:hover:bg-input/50",
-
secondary:
-
"bg-secondary text-secondary-foreground hover:bg-secondary/80",
-
ghost:
-
"hover:bg-accent hover:text-accent-foreground dark:hover:bg-accent/50",
-
link: "text-primary underline-offset-4 hover:underline",
-
},
-
size: {
-
default: "h-9 px-4 py-2 has-[>svg]:px-3",
-
sm: "h-8 rounded-md gap-1.5 px-3 has-[>svg]:px-2.5",
-
lg: "h-10 rounded-md px-6 has-[>svg]:px-4",
-
icon: "size-9",
-
"icon-sm": "size-8",
-
"icon-lg": "size-10",
-
},
-
},
-
defaultVariants: {
-
variant: "default",
-
size: "default",
-
},
-
}
-
)
-
-
function Button({
-
className,
-
variant,
-
size,
-
asChild = false,
-
...props
-
}: React.ComponentProps<"button"> &
-
VariantProps<typeof buttonVariants> & {
-
asChild?: boolean
-
}) {
-
const Comp = asChild ? Slot : "button"
-
-
return (
-
<Comp
-
data-slot="button"
-
className={cn(buttonVariants({ variant, size, className }))}
-
{...props}
-
/>
-
)
-
}
-
-
export { Button, buttonVariants }
-92
public/components/ui/card.tsx
···
-
import * as React from "react"
-
-
import { cn } from "@public/lib/utils"
-
-
function Card({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card"
-
className={cn(
-
"bg-card text-card-foreground flex flex-col gap-6 rounded-xl border py-6 shadow-sm",
-
className
-
)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardHeader({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-header"
-
className={cn(
-
"@container/card-header grid auto-rows-min grid-rows-[auto_auto] items-start gap-2 px-6 has-data-[slot=card-action]:grid-cols-[1fr_auto] [.border-b]:pb-6",
-
className
-
)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardTitle({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-title"
-
className={cn("leading-none font-semibold", className)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardDescription({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-description"
-
className={cn("text-muted-foreground text-sm", className)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardAction({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-action"
-
className={cn(
-
"col-start-2 row-span-2 row-start-1 self-start justify-self-end",
-
className
-
)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardContent({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-content"
-
className={cn("px-6", className)}
-
{...props}
-
/>
-
)
-
}
-
-
function CardFooter({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="card-footer"
-
className={cn("flex items-center px-6 [.border-t]:pt-6", className)}
-
{...props}
-
/>
-
)
-
}
-
-
export {
-
Card,
-
CardHeader,
-
CardFooter,
-
CardTitle,
-
CardAction,
-
CardDescription,
-
CardContent,
-
}
-141
public/components/ui/dialog.tsx
···
-
import * as React from "react"
-
import * as DialogPrimitive from "@radix-ui/react-dialog"
-
import { XIcon } from "lucide-react"
-
-
import { cn } from "@public/lib/utils"
-
-
function Dialog({
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Root>) {
-
return <DialogPrimitive.Root data-slot="dialog" {...props} />
-
}
-
-
function DialogTrigger({
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Trigger>) {
-
return <DialogPrimitive.Trigger data-slot="dialog-trigger" {...props} />
-
}
-
-
function DialogPortal({
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Portal>) {
-
return <DialogPrimitive.Portal data-slot="dialog-portal" {...props} />
-
}
-
-
function DialogClose({
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Close>) {
-
return <DialogPrimitive.Close data-slot="dialog-close" {...props} />
-
}
-
-
function DialogOverlay({
-
className,
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Overlay>) {
-
return (
-
<DialogPrimitive.Overlay
-
data-slot="dialog-overlay"
-
className={cn(
-
"data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 fixed inset-0 z-50 bg-black/50",
-
className
-
)}
-
{...props}
-
/>
-
)
-
}
-
-
function DialogContent({
-
className,
-
children,
-
showCloseButton = true,
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Content> & {
-
showCloseButton?: boolean
-
}) {
-
return (
-
<DialogPortal data-slot="dialog-portal">
-
<DialogOverlay />
-
<DialogPrimitive.Content
-
data-slot="dialog-content"
-
className={cn(
-
"bg-background data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 fixed top-[50%] left-[50%] z-50 grid w-full max-w-[calc(100%-2rem)] translate-x-[-50%] translate-y-[-50%] gap-4 rounded-lg border p-6 shadow-lg duration-200 sm:max-w-lg",
-
className
-
)}
-
{...props}
-
>
-
{children}
-
{showCloseButton && (
-
<DialogPrimitive.Close
-
data-slot="dialog-close"
-
className="ring-offset-background focus:ring-ring data-[state=open]:bg-accent data-[state=open]:text-muted-foreground absolute top-4 right-4 rounded-xs opacity-70 transition-opacity hover:opacity-100 focus:ring-2 focus:ring-offset-2 focus:outline-hidden disabled:pointer-events-none [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4"
-
>
-
<XIcon />
-
<span className="sr-only">Close</span>
-
</DialogPrimitive.Close>
-
)}
-
</DialogPrimitive.Content>
-
</DialogPortal>
-
)
-
}
-
-
function DialogHeader({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="dialog-header"
-
className={cn("flex flex-col gap-2 text-center sm:text-left", className)}
-
{...props}
-
/>
-
)
-
}
-
-
function DialogFooter({ className, ...props }: React.ComponentProps<"div">) {
-
return (
-
<div
-
data-slot="dialog-footer"
-
className={cn(
-
"flex flex-col-reverse gap-2 sm:flex-row sm:justify-end",
-
className
-
)}
-
{...props}
-
/>
-
)
-
}
-
-
function DialogTitle({
-
className,
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Title>) {
-
return (
-
<DialogPrimitive.Title
-
data-slot="dialog-title"
-
className={cn("text-lg leading-none font-semibold", className)}
-
{...props}
-
/>
-
)
-
}
-
-
function DialogDescription({
-
className,
-
...props
-
}: React.ComponentProps<typeof DialogPrimitive.Description>) {
-
return (
-
<DialogPrimitive.Description
-
data-slot="dialog-description"
-
className={cn("text-muted-foreground text-sm", className)}
-
{...props}
-
/>
-
)
-
}
-
-
export {
-
Dialog,
-
DialogClose,
-
DialogContent,
-
DialogDescription,
-
DialogFooter,
-
DialogHeader,
-
DialogOverlay,
-
DialogPortal,
-
DialogTitle,
-
DialogTrigger,
-
}
-21
public/components/ui/input.tsx
···
import * as React from "react"

import { cn } from "@public/lib/utils"

function Input({ className, type, ...props }: React.ComponentProps<"input">) {
  return (
    <input
      type={type}
      data-slot="input"
      className={cn(
        "file:text-foreground placeholder:text-muted-foreground selection:bg-primary selection:text-primary-foreground dark:bg-input/30 border-input h-9 w-full min-w-0 rounded-md border bg-transparent px-3 py-1 text-base shadow-xs transition-[color,box-shadow] outline-none file:inline-flex file:h-7 file:border-0 file:bg-transparent file:text-sm file:font-medium disabled:pointer-events-none disabled:cursor-not-allowed disabled:opacity-50 md:text-sm",
        "focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px]",
        "aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
        className
      )}
      {...props}
    />
  )
}

export { Input }
-22
public/components/ui/label.tsx
···
import * as React from "react"
import * as LabelPrimitive from "@radix-ui/react-label"

import { cn } from "@public/lib/utils"

function Label({
  className,
  ...props
}: React.ComponentProps<typeof LabelPrimitive.Root>) {
  return (
    <LabelPrimitive.Root
      data-slot="label"
      className={cn(
        "flex items-center gap-2 text-sm leading-none font-medium select-none group-data-[disabled=true]:pointer-events-none group-data-[disabled=true]:opacity-50 peer-disabled:cursor-not-allowed peer-disabled:opacity-50",
        className
      )}
      {...props}
    />
  )
}

export { Label }
-45
public/components/ui/radio-group.tsx
···
"use client"

import * as React from "react"
import * as RadioGroupPrimitive from "@radix-ui/react-radio-group"
import { CircleIcon } from "lucide-react"

import { cn } from "@public/lib/utils"

function RadioGroup({
  className,
  ...props
}: React.ComponentProps<typeof RadioGroupPrimitive.Root>) {
  return (
    <RadioGroupPrimitive.Root
      data-slot="radio-group"
      className={cn("grid gap-3", className)}
      {...props}
    />
  )
}

function RadioGroupItem({
  className,
  ...props
}: React.ComponentProps<typeof RadioGroupPrimitive.Item>) {
  return (
    <RadioGroupPrimitive.Item
      data-slot="radio-group-item"
      className={cn(
        "border-input text-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive dark:bg-input/30 aspect-square size-4 shrink-0 rounded-full border shadow-xs transition-[color,box-shadow] outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
        className
      )}
      {...props}
    >
      <RadioGroupPrimitive.Indicator
        data-slot="radio-group-indicator"
        className="relative flex items-center justify-center"
      >
        <CircleIcon className="fill-primary absolute top-1/2 left-1/2 size-2 -translate-x-1/2 -translate-y-1/2" />
      </RadioGroupPrimitive.Indicator>
    </RadioGroupPrimitive.Item>
  )
}

export { RadioGroup, RadioGroupItem }
-64
public/components/ui/tabs.tsx
···
import * as React from "react"
import * as TabsPrimitive from "@radix-ui/react-tabs"

import { cn } from "@public/lib/utils"

function Tabs({
  className,
  ...props
}: React.ComponentProps<typeof TabsPrimitive.Root>) {
  return (
    <TabsPrimitive.Root
      data-slot="tabs"
      className={cn("flex flex-col gap-2", className)}
      {...props}
    />
  )
}

function TabsList({
  className,
  ...props
}: React.ComponentProps<typeof TabsPrimitive.List>) {
  return (
    <TabsPrimitive.List
      data-slot="tabs-list"
      className={cn(
        "bg-muted text-muted-foreground inline-flex h-9 w-fit items-center justify-center rounded-lg p-[3px]",
        className
      )}
      {...props}
    />
  )
}

function TabsTrigger({
  className,
  ...props
}: React.ComponentProps<typeof TabsPrimitive.Trigger>) {
  return (
    <TabsPrimitive.Trigger
      data-slot="tabs-trigger"
      className={cn(
        "data-[state=active]:bg-background dark:data-[state=active]:text-foreground focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:outline-ring dark:data-[state=active]:border-input dark:data-[state=active]:bg-input/30 text-foreground dark:text-muted-foreground inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-2 py-1 text-sm font-medium whitespace-nowrap transition-[color,box-shadow] focus-visible:ring-[3px] focus-visible:outline-1 disabled:pointer-events-none disabled:opacity-50 data-[state=active]:shadow-sm [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
        className
      )}
      {...props}
    />
  )
}

function TabsContent({
  className,
  ...props
}: React.ComponentProps<typeof TabsPrimitive.Content>) {
  return (
    <TabsPrimitive.Content
      data-slot="tabs-content"
      className={cn("outline-none", className)}
      {...props}
    />
  )
}

export { Tabs, TabsList, TabsTrigger, TabsContent }
-1396
public/editor/editor.tsx
···
import { useState, useEffect } from 'react'
import { createRoot } from 'react-dom/client'
import { Button } from '@public/components/ui/button'
import {
  Card,
  CardContent,
  CardDescription,
  CardHeader,
  CardTitle
} from '@public/components/ui/card'
import { Input } from '@public/components/ui/input'
import { Label } from '@public/components/ui/label'
import {
  Tabs,
  TabsContent,
  TabsList,
  TabsTrigger
} from '@public/components/ui/tabs'
import { Badge } from '@public/components/ui/badge'
import {
  Dialog,
  DialogContent,
  DialogDescription,
  DialogHeader,
  DialogTitle,
  DialogFooter
} from '@public/components/ui/dialog'
import {
  Globe,
  Upload,
  ExternalLink,
  CheckCircle2,
  XCircle,
  AlertCircle,
  Loader2,
  Trash2,
  RefreshCw,
  Settings
} from 'lucide-react'
import { RadioGroup, RadioGroupItem } from '@public/components/ui/radio-group'

import Layout from '@public/layouts'

interface UserInfo {
  did: string
  handle: string
}

interface Site {
  did: string
  rkey: string
  display_name: string | null
  created_at: number
  updated_at: number
}

interface CustomDomain {
  id: string
  domain: string
  did: string
  rkey: string
  verified: boolean
  last_verified_at: number | null
  created_at: number
}

interface WispDomain {
  domain: string
  rkey: string | null
}

function Dashboard() {
  // User state
  const [userInfo, setUserInfo] = useState<UserInfo | null>(null)
  const [loading, setLoading] = useState(true)

  // Sites state
  const [sites, setSites] = useState<Site[]>([])
  const [sitesLoading, setSitesLoading] = useState(true)
  const [isSyncing, setIsSyncing] = useState(false)

  // Domains state
  const [wispDomain, setWispDomain] = useState<WispDomain | null>(null)
  const [customDomains, setCustomDomains] = useState<CustomDomain[]>([])
  const [domainsLoading, setDomainsLoading] = useState(true)

  // Site configuration state
  const [configuringSite, setConfiguringSite] = useState<Site | null>(null)
  const [selectedDomain, setSelectedDomain] = useState<string>('')
  const [isSavingConfig, setIsSavingConfig] = useState(false)
  const [isDeletingSite, setIsDeletingSite] = useState(false)

  // Upload state
  const [siteMode, setSiteMode] = useState<'existing' | 'new'>('existing')
  const [selectedSiteRkey, setSelectedSiteRkey] = useState<string>('')
  const [newSiteName, setNewSiteName] = useState('')
  const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
  const [isUploading, setIsUploading] = useState(false)
  const [uploadProgress, setUploadProgress] = useState('')
  const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
  const [uploadedCount, setUploadedCount] = useState(0)

  // Custom domain modal state
  const [addDomainModalOpen, setAddDomainModalOpen] = useState(false)
  const [customDomain, setCustomDomain] = useState('')
  const [isAddingDomain, setIsAddingDomain] = useState(false)
  const [verificationStatus, setVerificationStatus] = useState<{
    [id: string]: 'idle' | 'verifying' | 'success' | 'error'
  }>({})
  const [viewDomainDNS, setViewDomainDNS] = useState<string | null>(null)

  // Wisp domain claim state
  const [wispHandle, setWispHandle] = useState('')
  const [isClaimingWisp, setIsClaimingWisp] = useState(false)
  const [wispAvailability, setWispAvailability] = useState<{
    available: boolean | null
    checking: boolean
  }>({ available: null, checking: false })

  // Fetch user info on mount
  useEffect(() => {
    fetchUserInfo()
    fetchSites()
    fetchDomains()
  }, [])

  // Auto-switch to 'new' mode if no sites exist
  useEffect(() => {
    if (!sitesLoading && sites.length === 0 && siteMode === 'existing') {
      setSiteMode('new')
    }
  }, [sites, sitesLoading, siteMode])

  const fetchUserInfo = async () => {
    try {
      const response = await fetch('/api/user/info')
      const data = await response.json()
      setUserInfo(data)
    } catch (err) {
      console.error('Failed to fetch user info:', err)
    } finally {
      setLoading(false)
    }
  }

  const fetchSites = async () => {
    try {
      const response = await fetch('/api/user/sites')
      const data = await response.json()
      setSites(data.sites || [])
    } catch (err) {
      console.error('Failed to fetch sites:', err)
    } finally {
      setSitesLoading(false)
    }
  }

  const syncSites = async () => {
    setIsSyncing(true)
    try {
      const response = await fetch('/api/user/sync', {
        method: 'POST'
      })
      const data = await response.json()
      if (data.success) {
        console.log(`Synced ${data.synced} sites from PDS`)
        // Refresh sites list
        await fetchSites()
      }
    } catch (err) {
      console.error('Failed to sync sites:', err)
      alert('Failed to sync sites from PDS')
    } finally {
      setIsSyncing(false)
    }
  }

  const fetchDomains = async () => {
    try {
      const response = await fetch('/api/user/domains')
      const data = await response.json()
      setWispDomain(data.wispDomain)
      setCustomDomains(data.customDomains || [])
    } catch (err) {
      console.error('Failed to fetch domains:', err)
    } finally {
      setDomainsLoading(false)
    }
  }

  const getSiteUrl = (site: Site) => {
    // Check if this site is mapped to the wisp.place domain
    if (wispDomain && wispDomain.rkey === site.rkey) {
      return `https://${wispDomain.domain}`
    }

    // Check if this site is mapped to any custom domain
    const customDomain = customDomains.find((d) => d.rkey === site.rkey)
    if (customDomain) {
      return `https://${customDomain.domain}`
    }

    // Default fallback URL
    if (!userInfo) return '#'
    return `https://sites.wisp.place/${site.did}/${site.rkey}`
  }

  const getSiteDomainName = (site: Site) => {
    if (wispDomain && wispDomain.rkey === site.rkey) {
      return wispDomain.domain
    }

    const customDomain = customDomains.find((d) => d.rkey === site.rkey)
    if (customDomain) {
      return customDomain.domain
    }

    return `sites.wisp.place/${site.did}/${site.rkey}`
  }

  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
    if (e.target.files && e.target.files.length > 0) {
      setSelectedFiles(e.target.files)
    }
  }

  const handleUpload = async () => {
    const siteName = siteMode === 'existing' ? selectedSiteRkey : newSiteName

    if (!siteName) {
      alert(siteMode === 'existing' ? 'Please select a site' : 'Please enter a site name')
      return
    }

    setIsUploading(true)
    setUploadProgress('Preparing files...')

    try {
      const formData = new FormData()
      formData.append('siteName', siteName)

      if (selectedFiles) {
        for (let i = 0; i < selectedFiles.length; i++) {
          formData.append('files', selectedFiles[i])
        }
      }

      setUploadProgress('Uploading to AT Protocol...')
      const response = await fetch('/wisp/upload-files', {
        method: 'POST',
        body: formData
      })

      const data = await response.json()
      if (data.success) {
        setUploadProgress('Upload complete!')
        setSkippedFiles(data.skippedFiles || [])
        setUploadedCount(data.uploadedCount || data.fileCount || 0)
        setSelectedSiteRkey('')
        setNewSiteName('')
        setSelectedFiles(null)

        // Refresh sites list
        await fetchSites()

        // Reset form - give more time if there are skipped files
        const resetDelay = data.skippedFiles && data.skippedFiles.length > 0 ? 4000 : 1500
        setTimeout(() => {
          setUploadProgress('')
          setSkippedFiles([])
          setUploadedCount(0)
          setIsUploading(false)
        }, resetDelay)
      } else {
        throw new Error(data.error || 'Upload failed')
      }
    } catch (err) {
      console.error('Upload error:', err)
      alert(
        `Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
      setIsUploading(false)
      setUploadProgress('')
    }
  }

  const handleAddCustomDomain = async () => {
    if (!customDomain) {
      alert('Please enter a domain')
      return
    }

    setIsAddingDomain(true)
    try {
      const response = await fetch('/api/domain/custom/add', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ domain: customDomain })
      })

      const data = await response.json()
      if (data.success) {
        setCustomDomain('')
        setAddDomainModalOpen(false)
        await fetchDomains()

        // Automatically show DNS configuration for the newly added domain
        setViewDomainDNS(data.id)
      } else {
        throw new Error(data.error || 'Failed to add domain')
      }
    } catch (err) {
      console.error('Add domain error:', err)
      alert(
        `Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
    } finally {
      setIsAddingDomain(false)
    }
  }

  const handleVerifyDomain = async (id: string) => {
    setVerificationStatus({ ...verificationStatus, [id]: 'verifying' })

    try {
      const response = await fetch('/api/domain/custom/verify', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ id })
      })

      const data = await response.json()
      if (data.success && data.verified) {
        setVerificationStatus({ ...verificationStatus, [id]: 'success' })
        await fetchDomains()
      } else {
        setVerificationStatus({ ...verificationStatus, [id]: 'error' })
        if (data.error) {
          alert(`Verification failed: ${data.error}`)
        }
      }
    } catch (err) {
      console.error('Verify domain error:', err)
      setVerificationStatus({ ...verificationStatus, [id]: 'error' })
      alert(
        `Verification failed: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
    }
  }

  const handleDeleteCustomDomain = async (id: string) => {
    if (!confirm('Are you sure you want to remove this custom domain?')) {
      return
    }

    try {
      const response = await fetch(`/api/domain/custom/${id}`, {
        method: 'DELETE'
      })

      const data = await response.json()
      if (data.success) {
        await fetchDomains()
      } else {
        throw new Error('Failed to delete domain')
      }
    } catch (err) {
      console.error('Delete domain error:', err)
      alert(
        `Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
    }
  }

  const handleConfigureSite = (site: Site) => {
    setConfiguringSite(site)

    // Determine current domain mapping
    if (wispDomain && wispDomain.rkey === site.rkey) {
      setSelectedDomain('wisp')
    } else {
      const customDomain = customDomains.find((d) => d.rkey === site.rkey)
      if (customDomain) {
        setSelectedDomain(customDomain.id)
      } else {
        setSelectedDomain('none')
      }
    }
  }

  const handleSaveSiteConfig = async () => {
    if (!configuringSite) return

    setIsSavingConfig(true)
    try {
      if (selectedDomain === 'wisp') {
        // Map to wisp.place domain
        const response = await fetch('/api/domain/wisp/map-site', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ siteRkey: configuringSite.rkey })
        })
        const data = await response.json()
        if (!data.success) throw new Error('Failed to map site')
      } else if (selectedDomain === 'none') {
        // Unmap from all domains
        // Unmap wisp domain if this site was mapped to it
        if (wispDomain && wispDomain.rkey === configuringSite.rkey) {
          await fetch('/api/domain/wisp/map-site', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ siteRkey: null })
          })
        }

        // Unmap from custom domains
        const mappedCustom = customDomains.find(
          (d) => d.rkey === configuringSite.rkey
        )
        if (mappedCustom) {
          await fetch(`/api/domain/custom/${mappedCustom.id}/map-site`, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ siteRkey: null })
          })
        }
      } else {
        // Map to a custom domain
        const response = await fetch(
          `/api/domain/custom/${selectedDomain}/map-site`,
          {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ siteRkey: configuringSite.rkey })
          }
        )
        const data = await response.json()
        if (!data.success) throw new Error('Failed to map site')
      }

      // Refresh domains to get updated mappings
      await fetchDomains()
      setConfiguringSite(null)
    } catch (err) {
      console.error('Save config error:', err)
      alert(
        `Failed to save configuration: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
    } finally {
      setIsSavingConfig(false)
    }
  }

  const handleDeleteSite = async () => {
    if (!configuringSite) return

    if (!confirm(`Are you sure you want to delete "${configuringSite.display_name || configuringSite.rkey}"? This action cannot be undone.`)) {
      return
    }

    setIsDeletingSite(true)
    try {
      const response = await fetch(`/api/site/${configuringSite.rkey}`, {
        method: 'DELETE'
      })

      const data = await response.json()
      if (data.success) {
        // Refresh sites list
        await fetchSites()
        // Refresh domains in case this site was mapped
        await fetchDomains()
        setConfiguringSite(null)
      } else {
        throw new Error(data.error || 'Failed to delete site')
      }
    } catch (err) {
      console.error('Delete site error:', err)
      alert(
        `Failed to delete site: ${err instanceof Error ? err.message : 'Unknown error'}`
      )
    } finally {
      setIsDeletingSite(false)
    }
  }

  const checkWispAvailability = async (handle: string) => {
    const trimmedHandle = handle.trim().toLowerCase()
    if (!trimmedHandle) {
      setWispAvailability({ available: null, checking: false })
      return
    }

    setWispAvailability({ available: null, checking: true })
    try {
      const response = await fetch(`/api/domain/check?handle=${encodeURIComponent(trimmedHandle)}`)
      const data = await response.json()
      setWispAvailability({ available: data.available, checking: false })
    } catch (err) {
      console.error('Check availability error:', err)
      setWispAvailability({ available: false, checking: false })
    }
  }

  const handleClaimWispDomain = async () => {
    const trimmedHandle = wispHandle.trim().toLowerCase()
    if (!trimmedHandle) {
      alert('Please enter a handle')
      return
    }

    setIsClaimingWisp(true)
    try {
      const response = await fetch('/api/domain/claim', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ handle: trimmedHandle })
      })

      const data = await response.json()
      if (data.success) {
        setWispHandle('')
        setWispAvailability({ available: null, checking: false })
        await fetchDomains()
      } else {
        throw new Error(data.error || 'Failed to claim domain')
      }
    } catch (err) {
      console.error('Claim domain error:', err)
      const errorMessage = err instanceof Error ? err.message : 'Unknown error'

      // Handle "Already claimed" error more gracefully
      if (errorMessage.includes('Already claimed')) {
        alert('You have already claimed a wisp.place subdomain. Please refresh the page.')
        await fetchDomains()
      } else {
        alert(`Failed to claim domain: ${errorMessage}`)
      }
    } finally {
      setIsClaimingWisp(false)
    }
  }

  if (loading) {
    return (
      <div className="w-full min-h-screen bg-background flex items-center justify-center">
        <Loader2 className="w-8 h-8 animate-spin text-primary" />
      </div>
    )
  }

  return (
    <div className="w-full min-h-screen bg-background">
      {/* Header */}
      <header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
        <div className="container mx-auto px-4 py-4 flex items-center justify-between">
          <div className="flex items-center gap-2">
            <div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
              <Globe className="w-5 h-5 text-primary-foreground" />
            </div>
            <span className="text-xl font-semibold text-foreground">
              wisp.place
            </span>
          </div>
          <div className="flex items-center gap-3">
            <span className="text-sm text-muted-foreground">
              {userInfo?.handle || 'Loading...'}
            </span>
          </div>
        </div>
      </header>

      <div className="container mx-auto px-4 py-8 max-w-6xl w-full">
        <div className="mb-8">
          <h1 className="text-3xl font-bold mb-2">Dashboard</h1>
          <p className="text-muted-foreground">
            Manage your sites and domains
          </p>
        </div>

        <Tabs defaultValue="sites" className="space-y-6 w-full">
          <TabsList className="grid w-full grid-cols-3 max-w-md">
            <TabsTrigger value="sites">Sites</TabsTrigger>
            <TabsTrigger value="domains">Domains</TabsTrigger>
            <TabsTrigger value="upload">Upload</TabsTrigger>
          </TabsList>

          {/* Sites Tab */}
          <TabsContent value="sites" className="space-y-4 min-h-[400px]">
            <Card>
              <CardHeader>
                <div className="flex items-center justify-between">
                  <div>
                    <CardTitle>Your Sites</CardTitle>
                    <CardDescription>
                      View and manage all your deployed sites
                    </CardDescription>
                  </div>
                  <Button
                    variant="outline"
                    size="sm"
                    onClick={syncSites}
                    disabled={isSyncing || sitesLoading}
                  >
                    <RefreshCw
                      className={`w-4 h-4 mr-2 ${isSyncing ? 'animate-spin' : ''}`}
                    />
                    Sync from PDS
                  </Button>
                </div>
              </CardHeader>
              <CardContent className="space-y-4">
                {sitesLoading ? (
                  <div className="flex items-center justify-center py-8">
                    <Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
                  </div>
                ) : sites.length === 0 ? (
                  <div className="text-center py-8 text-muted-foreground">
                    <p>No sites yet. Upload your first site!</p>
                  </div>
                ) : (
                  sites.map((site) => (
                    <div
                      key={`${site.did}-${site.rkey}`}
                      className="flex items-center justify-between p-4 border border-border rounded-lg hover:bg-muted/50 transition-colors"
                    >
                      <div className="flex-1">
                        <div className="flex items-center gap-3 mb-2">
                          <h3 className="font-semibold text-lg">
                            {site.display_name || site.rkey}
                          </h3>
                          <Badge
                            variant="secondary"
                            className="text-xs"
                          >
                            active
                          </Badge>
                        </div>
                        <a
                          href={getSiteUrl(site)}
                          target="_blank"
                          rel="noopener noreferrer"
                          className="text-sm text-accent hover:text-accent/80 flex items-center gap-1"
                        >
                          {getSiteDomainName(site)}
                          <ExternalLink className="w-3 h-3" />
                        </a>
                      </div>
                      <Button
                        variant="outline"
                        size="sm"
                        onClick={() => handleConfigureSite(site)}
                      >
                        <Settings className="w-4 h-4 mr-2" />
                        Configure
                      </Button>
                    </div>
                  ))
                )}
              </CardContent>
            </Card>
          </TabsContent>

          {/* Domains Tab */}
          <TabsContent value="domains" className="space-y-4 min-h-[400px]">
            <Card>
              <CardHeader>
                <CardTitle>wisp.place Subdomain</CardTitle>
                <CardDescription>
                  Your free subdomain on the wisp.place network
                </CardDescription>
              </CardHeader>
              <CardContent>
                {domainsLoading ? (
                  <div className="flex items-center justify-center py-4">
                    <Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
                  </div>
                ) : wispDomain ? (
                  <>
                    <div className="flex flex-col gap-2 p-4 bg-muted/50 rounded-lg">
                      <div className="flex items-center gap-2">
                        <CheckCircle2 className="w-5 h-5 text-green-500" />
                        <span className="font-mono text-lg">
                          {wispDomain.domain}
                        </span>
                      </div>
                      {wispDomain.rkey && (
                        <p className="text-xs text-muted-foreground ml-7">
                          → Mapped to site: {wispDomain.rkey}
                        </p>
                      )}
                    </div>
                    <p className="text-sm text-muted-foreground mt-3">
                      {wispDomain.rkey
                        ? 'This domain is mapped to a specific site'
                        : 'This domain is not mapped to any site yet. Configure it from the Sites tab.'}
                    </p>
                  </>
                ) : (
                  <div className="space-y-4">
                    <div className="p-4 bg-muted/30 rounded-lg">
                      <p className="text-sm text-muted-foreground mb-4">
                        Claim your free wisp.place subdomain
                      </p>
                      <div className="space-y-3">
                        <div className="space-y-2">
                          <Label htmlFor="wisp-handle">Choose your handle</Label>
                          <div className="flex gap-2">
                            <div className="flex-1 relative">
                              <Input
                                id="wisp-handle"
                                placeholder="mysite"
                                value={wispHandle}
                                onChange={(e) => {
                                  setWispHandle(e.target.value)
                                  if (e.target.value.trim()) {
                                    checkWispAvailability(e.target.value)
                                  } else {
                                    setWispAvailability({ available: null, checking: false })
                                  }
                                }}
                                disabled={isClaimingWisp}
                                className="pr-24"
                              />
                              <span className="absolute right-3 top-1/2 -translate-y-1/2 text-sm text-muted-foreground">
                                .wisp.place
                              </span>
                            </div>
                          </div>
                        </div>
                        {wispAvailability.checking && (
                          <p className="text-xs text-muted-foreground flex items-center gap-1">
                            <Loader2 className="w-3 h-3 animate-spin" />
                            Checking availability...
                          </p>
                        )}
                        {!wispAvailability.checking && wispAvailability.available === true && (
                          <p className="text-xs text-green-600 flex items-center gap-1">
                            <CheckCircle2 className="w-3 h-3" />
                            Available
                          </p>
                        )}
                        {!wispAvailability.checking && wispAvailability.available === false && (
                          <p className="text-xs text-red-600 flex items-center gap-1">
                            <XCircle className="w-3 h-3" />
                            Not available
                          </p>
                        )}
                        <Button
                          onClick={handleClaimWispDomain}
                          disabled={!wispHandle.trim() || isClaimingWisp || wispAvailability.available !== true}
                          className="w-full"
                        >
                          {isClaimingWisp ? (
                            <>
                              <Loader2 className="w-4 h-4 mr-2 animate-spin" />
                              Claiming...
                            </>
                          ) : (
                            'Claim Subdomain'
                          )}
                        </Button>
                      </div>
                    </div>
                  </div>
                )}
              </CardContent>
            </Card>

            <Card>
              <CardHeader>
                <CardTitle>Custom Domains</CardTitle>
                <CardDescription>
                  Bring your own domain with DNS verification
                </CardDescription>
              </CardHeader>
              <CardContent className="space-y-4">
                <Button
                  onClick={() => setAddDomainModalOpen(true)}
                  className="w-full"
                >
                  Add Custom Domain
                </Button>

                {domainsLoading ? (
                  <div className="flex items-center justify-center py-4">
                    <Loader2 className="w-6 h-6 animate-spin text-muted-foreground" />
                  </div>
                ) : customDomains.length === 0 ? (
                  <div className="text-center py-4 text-muted-foreground text-sm">
                    No custom domains added yet
                  </div>
                ) : (
                  <div className="space-y-2">
                    {customDomains.map((domain) => (
                      <div
                        key={domain.id}
                        className="flex items-center justify-between p-3 border border-border rounded-lg"
                      >
                        <div className="flex flex-col gap-1 flex-1">
                          <div className="flex items-center gap-2">
                            {domain.verified ? (
                              <CheckCircle2 className="w-4 h-4 text-green-500" />
                            ) : (
                              <XCircle className="w-4 h-4 text-red-500" />
                            )}
                            <span className="font-mono">
                              {domain.domain}
                            </span>
                          </div>
                          {domain.rkey && domain.rkey !== 'self' && (
                            <p className="text-xs text-muted-foreground ml-6">
                              → Mapped to site: {domain.rkey}
                            </p>
                          )}
                        </div>
                        <div className="flex items-center gap-2">
                          <Button
                            variant="outline"
                            size="sm"
                            onClick={() => setViewDomainDNS(domain.id)}
                          >
                            View DNS
                          </Button>
                          {domain.verified ? (
                            <Badge variant="secondary">
                              Verified
                            </Badge>
                          ) : (
                            <Button
                              variant="outline"
                              size="sm"
                              onClick={() => handleVerifyDomain(domain.id)}
                              disabled={verificationStatus[domain.id] === 'verifying'}
                            >
                              {verificationStatus[domain.id] === 'verifying' ? (
                                <>
                                  <Loader2 className="w-3 h-3 mr-1 animate-spin" />
                                  Verifying...
                                </>
                              ) : (
                                'Verify DNS'
                              )}
                            </Button>
                          )}
                          <Button
                            variant="ghost"
                            size="sm"
                            onClick={() => handleDeleteCustomDomain(domain.id)}
                          >
                            <Trash2 className="w-4 h-4" />
                          </Button>
                        </div>
                      </div>
                    ))}
                  </div>
                )}
              </CardContent>
            </Card>
          </TabsContent>

          {/* Upload Tab */}
          <TabsContent value="upload" className="space-y-4 min-h-[400px]">
            <Card>
              <CardHeader>
                <CardTitle>Upload Site</CardTitle>
                <CardDescription>
                  Deploy a new site from a folder or Git repository
                </CardDescription>
              </CardHeader>
              <CardContent className="space-y-6">
                <div className="space-y-4">
                  <RadioGroup
                    value={siteMode}
                    onValueChange={(value) => setSiteMode(value as 'existing' | 'new')}
                    disabled={isUploading}
                  >
                    <div className="flex items-center space-x-2">
                      <RadioGroupItem value="existing" id="existing" />
                      <Label htmlFor="existing" className="cursor-pointer">
                        Update existing site
                      </Label>
                    </div>
                    <div className="flex items-center space-x-2">
                      <RadioGroupItem value="new" id="new" />
                      <Label htmlFor="new" className="cursor-pointer">
                        Create new site
                      </Label>
                    </div>
                  </RadioGroup>

                  {siteMode === 'existing' ? (
                    <div className="space-y-2">
                      <Label htmlFor="site-select">Select Site</Label>
                      {sitesLoading ? (
                        <div className="flex items-center justify-center py-4">
                          <Loader2 className="w-5 h-5 animate-spin text-muted-foreground" />
                        </div>
                      ) : sites.length === 0 ? (
                        <div className="p-4 border border-dashed rounded-lg text-center text-sm text-muted-foreground">
                          No sites available. Create a new site instead.
                        </div>
                      ) : (
                        <select
                          id="site-select"
                          className="flex h-10 w-full rounded-md border border-input bg-background px-3 py-2 text-sm ring-offset-background file:border-0 file:bg-transparent file:text-sm file:font-medium placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
                          value={selectedSiteRkey}
                          onChange={(e) => setSelectedSiteRkey(e.target.value)}
                          disabled={isUploading}
                        >
                          <option value="">Select a site...</option>
                          {sites.map((site) => (
                            <option key={site.rkey} value={site.rkey}>
                              {site.display_name || site.rkey}
                            </option>
                          ))}
                        </select>
                      )}
                    </div>
                  ) : (
                    <div className="space-y-2">
                      <Label htmlFor="new-site-name">New Site Name</Label>
                      <Input
                        id="new-site-name"
                        placeholder="my-awesome-site"
                        value={newSiteName}
                        onChange={(e) => setNewSiteName(e.target.value)}
                        disabled={isUploading}
                      />
                    </div>
                  )}

                  <p className="text-xs text-muted-foreground">
                    File limits: 100MB per file, 300MB total
                  </p>
                </div>

                <div className="grid md:grid-cols-2 gap-4">
                  <Card className="border-2 border-dashed hover:border-accent transition-colors cursor-pointer">
                    <CardContent className="flex flex-col items-center justify-center p-8 text-center">
                      <Upload className="w-12 h-12 text-muted-foreground mb-4" />
                      <h3 className="font-semibold mb-2">
                        Upload Folder
                      </h3>
                      <p className="text-sm text-muted-foreground mb-4">
                        Drag and drop or click to upload your static site files
                      </p>
                      <input
                        type="file"
                        id="file-upload"
                        multiple
                        onChange={handleFileSelect}
                        className="hidden"
                        {...(({ webkitdirectory: '', directory: '' } as any))}
                        disabled={isUploading}
                      />
                      <label htmlFor="file-upload">
                        <Button
                          variant="outline"
                          type="button"
                          onClick={() => document.getElementById('file-upload')?.click()}
                          disabled={isUploading}
                        >
                          Choose Folder
                        </Button>
                      </label>
                      {selectedFiles && selectedFiles.length > 0 && (
                        <p className="text-sm text-muted-foreground mt-3">
                          {selectedFiles.length} files selected
                        </p>
                      )}
                    </CardContent>
                  </Card>

                  <Card className="border-2 border-dashed opacity-50">
                    <CardContent className="flex flex-col items-center justify-center p-8 text-center">
                      <Globe className="w-12 h-12 text-muted-foreground mb-4" />
                      <h3 className="font-semibold mb-2">
                        Connect Git Repository
                      </h3>
                      <p className="text-sm text-muted-foreground mb-4">
                        Link your GitHub, GitLab, or any Git repository
                      </p>
                      <Badge variant="secondary">Coming soon!</Badge>
                    </CardContent>
                  </Card>
                </div>

                {uploadProgress && (
                  <div className="space-y-3">
                    <div className="p-4 bg-muted rounded-lg">
                      <div className="flex items-center gap-2">
                        <Loader2 className="w-4 h-4 animate-spin" />
                        <span className="text-sm">{uploadProgress}</span>
                      </div>
                    </div>

                    {skippedFiles.length > 0 && (
                      <div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
                        <div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
                          <AlertCircle className="w-4 h-4 mt-0.5 flex-shrink-0" />
                          <div className="flex-1">
                            <span className="font-medium">
                              {skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
                            </span>
                            {uploadedCount > 0 && (
                              <span className="text-sm ml-2">
                                ({uploadedCount} uploaded successfully)
                              </span>
                            )}
                          </div>
                        </div>
                        <div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
                          {skippedFiles.slice(0, 5).map((file, idx) => (
                            <div key={idx} className="text-xs">
                              <span className="font-mono">{file.name}</span>
                              <span className="text-muted-foreground"> - {file.reason}</span>
                            </div>
                          ))}
                          {skippedFiles.length > 5 && (
                            <div className="text-xs text-muted-foreground">
                              ...and {skippedFiles.length - 5} more
                            </div>
                          )}
                        </div>
                      </div>
                    )}
                  </div>
                )}

                <Button
                  onClick={handleUpload}
                  className="w-full"
                  disabled={
                    (siteMode === 'existing' ? !selectedSiteRkey : !newSiteName) ||
                    isUploading ||
                    (siteMode === 'existing' && (!selectedFiles || selectedFiles.length === 0))
                  }
                >
                  {isUploading ? (
                    <>
                      <Loader2 className="w-4 h-4 mr-2 animate-spin" />
                      Uploading...
                    </>
                  ) : (
                    <>
                      {siteMode === 'existing' ? (
                        'Update Site'
                      ) : (
                        selectedFiles && selectedFiles.length > 0
                          ? 'Upload & Deploy'
                          : 'Create Empty Site'
                      )}
                    </>
                  )}
                </Button>
              </CardContent>
            </Card>
          </TabsContent>
        </Tabs>
      </div>

      {/* Add Custom Domain Modal */}
      <Dialog open={addDomainModalOpen} onOpenChange={setAddDomainModalOpen}>
        <DialogContent className="sm:max-w-lg">
          <DialogHeader>
            <DialogTitle>Add Custom Domain</DialogTitle>
            <DialogDescription>
              Enter your domain name. After adding, you'll see the DNS records to configure.
            </DialogDescription>
          </DialogHeader>
          <div className="space-y-4 py-4">
            <div className="space-y-2">
              <Label htmlFor="new-domain">Domain Name</Label>
              <Input
                id="new-domain"
                placeholder="example.com"
                value={customDomain}
                onChange={(e) => setCustomDomain(e.target.value)}
              />
              <p className="text-xs text-muted-foreground">
                After adding, click "View DNS" to see the records you need to configure.
              </p>
            </div>
          </div>
          <DialogFooter className="flex-col sm:flex-row gap-2">
            <Button
              variant="outline"
              onClick={() => {
                setAddDomainModalOpen(false)
                setCustomDomain('')
              }}
              className="w-full sm:w-auto"
              disabled={isAddingDomain}
            >
              Cancel
            </Button>
            <Button
              onClick={handleAddCustomDomain}
              disabled={!customDomain || isAddingDomain}
              className="w-full sm:w-auto"
-
>
-
{isAddingDomain ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Adding...
-
</>
-
) : (
-
'Add Domain'
-
)}
-
</Button>
-
</DialogFooter>
-
</DialogContent>
-
</Dialog>
-
-
{/* Site Configuration Modal */}
-
<Dialog
-
open={configuringSite !== null}
-
onOpenChange={(open) => !open && setConfiguringSite(null)}
-
>
-
<DialogContent className="sm:max-w-lg">
-
<DialogHeader>
-
<DialogTitle>Configure Site Domain</DialogTitle>
-
<DialogDescription>
-
Choose which domain this site should use
-
</DialogDescription>
-
</DialogHeader>
-
{configuringSite && (
-
<div className="space-y-4 py-4">
-
<div className="p-3 bg-muted/30 rounded-lg">
-
<p className="text-sm font-medium mb-1">Site:</p>
-
<p className="font-mono text-sm">
-
{configuringSite.display_name ||
-
configuringSite.rkey}
-
</p>
-
</div>
-
-
<RadioGroup
-
value={selectedDomain}
-
onValueChange={setSelectedDomain}
-
>
-
{wispDomain && (
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="wisp" id="wisp" />
-
<Label
-
htmlFor="wisp"
-
className="flex-1 cursor-pointer"
-
>
-
<div className="flex items-center justify-between">
-
<span className="font-mono text-sm">
-
{wispDomain.domain}
-
</span>
-
<Badge variant="secondary" className="text-xs ml-2">
-
Free
-
</Badge>
-
</div>
-
</Label>
-
</div>
-
)}
-
-
{customDomains
-
.filter((d) => d.verified)
-
.map((domain) => (
-
<div
-
key={domain.id}
-
className="flex items-center space-x-2"
-
>
-
<RadioGroupItem
-
value={domain.id}
-
id={domain.id}
-
/>
-
<Label
-
htmlFor={domain.id}
-
className="flex-1 cursor-pointer"
-
>
-
<div className="flex items-center justify-between">
-
<span className="font-mono text-sm">
-
{domain.domain}
-
</span>
-
<Badge
-
variant="outline"
-
className="text-xs ml-2"
-
>
-
Custom
-
</Badge>
-
</div>
-
</Label>
-
</div>
-
))}
-
-
<div className="flex items-center space-x-2">
-
<RadioGroupItem value="none" id="none" />
-
<Label htmlFor="none" className="flex-1 cursor-pointer">
-
<div className="flex flex-col">
-
<span className="text-sm">Default URL</span>
-
<span className="text-xs text-muted-foreground font-mono break-all">
-
sites.wisp.place/{configuringSite.did}/
-
{configuringSite.rkey}
-
</span>
-
</div>
-
</Label>
-
</div>
-
</RadioGroup>
-
</div>
-
)}
-
<DialogFooter className="flex flex-col sm:flex-row sm:justify-between gap-2">
-
<Button
-
variant="destructive"
-
onClick={handleDeleteSite}
-
disabled={isSavingConfig || isDeletingSite}
-
className="sm:mr-auto"
-
>
-
{isDeletingSite ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Deleting...
-
</>
-
) : (
-
<>
-
<Trash2 className="w-4 h-4 mr-2" />
-
Delete Site
-
</>
-
)}
-
</Button>
-
<div className="flex flex-col sm:flex-row gap-2 w-full sm:w-auto">
-
<Button
-
variant="outline"
-
onClick={() => setConfiguringSite(null)}
-
disabled={isSavingConfig || isDeletingSite}
-
className="w-full sm:w-auto"
-
>
-
Cancel
-
</Button>
-
<Button
-
onClick={handleSaveSiteConfig}
-
disabled={isSavingConfig || isDeletingSite}
-
className="w-full sm:w-auto"
-
>
-
{isSavingConfig ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Saving...
-
</>
-
) : (
-
'Save'
-
)}
-
</Button>
-
</div>
-
</DialogFooter>
-
</DialogContent>
-
</Dialog>
-
-
{/* View DNS Records Modal */}
-
<Dialog
-
open={viewDomainDNS !== null}
-
onOpenChange={(open) => !open && setViewDomainDNS(null)}
-
>
-
<DialogContent className="sm:max-w-lg">
-
<DialogHeader>
-
<DialogTitle>DNS Configuration</DialogTitle>
-
<DialogDescription>
-
Add these DNS records to your domain provider
-
</DialogDescription>
-
</DialogHeader>
-
{viewDomainDNS && userInfo && (
-
<>
-
{(() => {
-
const domain = customDomains.find(
-
(d) => d.id === viewDomainDNS
-
)
-
if (!domain) return null
-
-
return (
-
<div className="space-y-4 py-4">
-
<div className="p-3 bg-muted/30 rounded-lg">
-
<p className="text-sm font-medium mb-1">
-
Domain:
-
</p>
-
<p className="font-mono text-sm">
-
{domain.domain}
-
</p>
-
</div>
-
-
<div className="space-y-3">
-
<div className="p-3 bg-background rounded border border-border">
-
<div className="flex justify-between items-start mb-2">
-
<span className="text-xs font-semibold text-muted-foreground">
-
TXT Record (Verification)
-
</span>
-
</div>
-
<div className="font-mono text-xs space-y-2">
-
<div>
-
<span className="text-muted-foreground">
-
Name:
-
</span>{' '}
-
<span className="select-all">
-
_wisp.{domain.domain}
-
</span>
-
</div>
-
<div>
-
<span className="text-muted-foreground">
-
Value:
-
</span>{' '}
-
<span className="select-all break-all">
-
{userInfo.did}
-
</span>
-
</div>
-
</div>
-
</div>
-
-
<div className="p-3 bg-background rounded border border-border">
-
<div className="flex justify-between items-start mb-2">
-
<span className="text-xs font-semibold text-muted-foreground">
-
CNAME Record (Pointing)
-
</span>
-
</div>
-
<div className="font-mono text-xs space-y-2">
-
<div>
-
<span className="text-muted-foreground">
-
Name:
-
</span>{' '}
-
<span className="select-all">
-
{domain.domain}
-
</span>
-
</div>
-
<div>
-
<span className="text-muted-foreground">
-
Value:
-
</span>{' '}
-
<span className="select-all">
-
{domain.id}.dns.wisp.place
-
</span>
-
</div>
-
</div>
-
<p className="text-xs text-muted-foreground mt-2">
-
Some DNS providers may require you to use @ or leave the name blank for the root domain.
-
</p>
-
</div>
-
</div>
-
-
<div className="p-3 bg-muted/30 rounded-lg">
-
<p className="text-xs text-muted-foreground">
-
💡 After configuring DNS, click "Verify DNS"
-
to check if everything is set up correctly.
-
DNS changes can take a few minutes to
-
propagate.
-
</p>
-
</div>
-
</div>
-
)
-
})()}
-
</>
-
)}
-
<DialogFooter>
-
<Button
-
variant="outline"
-
onClick={() => setViewDomainDNS(null)}
-
className="w-full sm:w-auto"
-
>
-
Close
-
</Button>
-
</DialogFooter>
-
</DialogContent>
-
</Dialog>
-
</div>
-
)
-
}
-
-
const root = createRoot(document.getElementById('elysia')!)
-
root.render(
-
<Layout className="gap-6">
-
<Dashboard />
-
</Layout>
-
)
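
The DNS modal above tells users to create a TXT record at `_wisp.<domain>` containing their DID and a CNAME from the domain to `<id>.dns.wisp.place`. A sketch of those expected records as data, taken from the modal's own text (the helper name and record shape are hypothetical, not part of the codebase):

```typescript
// Hypothetical helper: builds the two records the DNS modal describes.
// `did` is the user's AT Protocol DID; `domainId` is the custom-domain id.
interface DnsRecord {
    type: 'TXT' | 'CNAME'
    name: string
    value: string
}

function expectedDnsRecords(domain: string, did: string, domainId: string): DnsRecord[] {
    return [
        // Verification: proves the domain owner controls this DID.
        { type: 'TXT', name: `_wisp.${domain}`, value: did },
        // Pointing: routes traffic to the wisp.place edge.
        { type: 'CNAME', name: domain, value: `${domainId}.dns.wisp.place` }
    ]
}

console.log(expectedDnsRecords('example.com', 'did:plc:abc123', 'dom1'))
```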
-13
public/editor/index.html
···
-
<!doctype html>
-
<html lang="en">
-
<head>
-
<meta charset="UTF-8" />
-
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Elysia Static</title>
-
<link rel="icon" type="image/x-icon" href="../favicon.ico">
-
</head>
-
<body>
-
<div id="elysia"></div>
-
<script type="module" src="./editor.tsx"></script>
-
</body>
-
</html>
public/favicon.ico

This is a binary file and will not be displayed.

-14
public/favicon.svg
···
-
<?xml version="1.0" encoding="utf-8"?>
-
<svg width="64" height="64" viewBox="0 0 64 64" xmlns="http://www.w3.org/2000/svg" role="img" aria-label="Centered large wisp on black background">
-
<!-- black background -->
-
<rect width="64" height="64" fill="#000000"></rect>
-
-
<!-- outer faint glow -->
-
<circle cx="32" cy="32" r="14" fill="none" stroke="#7CF5D8" stroke-opacity="0.35" stroke-width="2.6"></circle>
-
-
<!-- bright halo -->
-
<circle cx="32" cy="32" r="10" fill="none" stroke="#CFF8EE" stroke-opacity="0.95" stroke-width="2.4"></circle>
-
-
<!-- bright core -->
-
<circle cx="32" cy="32" r="4" fill="#FFFFFF"></circle>
-
</svg>
-13
public/index.html
···
-
<!doctype html>
-
<html lang="en">
-
<head>
-
<meta charset="UTF-8" />
-
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Elysia Static</title>
-
<link rel="icon" type="image/x-icon" href="./favicon.ico">
-
</head>
-
<body>
-
<div id="elysia"></div>
-
<script type="module" src="./index.tsx"></script>
-
</body>
-
</html>
-336
public/index.tsx
···
-
import { useState, useRef, useEffect } from 'react'
-
import { createRoot } from 'react-dom/client'
-
import {
-
ArrowRight,
-
Shield,
-
Zap,
-
Globe,
-
Lock,
-
Code,
-
Server
-
} from 'lucide-react'
-
-
import Layout from '@public/layouts'
-
import { Button } from '@public/components/ui/button'
-
import { Card } from '@public/components/ui/card'
-
-
function App() {
-
const [showForm, setShowForm] = useState(false)
-
const inputRef = useRef<HTMLInputElement>(null)
-
-
useEffect(() => {
-
if (showForm) {
-
setTimeout(() => inputRef.current?.focus(), 500)
-
}
-
}, [showForm])
-
-
return (
-
<>
-
<div className="min-h-screen">
-
{/* Header */}
-
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
-
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
-
<div className="flex items-center gap-2">
-
<div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
-
<Globe className="w-5 h-5 text-primary-foreground" />
-
</div>
-
<span className="text-xl font-semibold text-foreground">
-
wisp.place
-
</span>
-
</div>
-
<div className="flex items-center gap-3">
-
<Button
-
variant="ghost"
-
size="sm"
-
onClick={() => setShowForm(true)}
-
>
-
Sign In
-
</Button>
-
<Button
-
size="sm"
-
className="bg-accent text-accent-foreground hover:bg-accent/90"
-
>
-
Get Started
-
</Button>
-
</div>
-
</div>
-
</header>
-
-
{/* Hero Section */}
-
<section className="container mx-auto px-4 py-20 md:py-32">
-
<div className="max-w-4xl mx-auto text-center">
-
<div className="inline-flex items-center gap-2 px-4 py-2 rounded-full bg-accent/10 border border-accent/20 mb-8">
-
<span className="w-2 h-2 bg-accent rounded-full animate-pulse"></span>
-
<span className="text-sm text-accent-foreground">
-
Built on AT Protocol
-
</span>
-
</div>
-
-
<h1 className="text-5xl md:text-7xl font-bold text-balance mb-6 leading-tight">
-
Your Website. Your Control. Lightning Fast.
-
</h1>
-
-
<p className="text-xl md:text-2xl text-muted-foreground text-balance mb-10 leading-relaxed max-w-3xl mx-auto">
-
Host static sites in your AT Protocol account. You
-
keep ownership and control. We just serve them fast
-
through our CDN.
-
</p>
-
-
<div className="max-w-md mx-auto relative">
-
<div
-
className={`transition-all duration-500 ease-in-out ${
-
showForm
-
? 'opacity-0 -translate-y-5 pointer-events-none'
-
: 'opacity-100 translate-y-0'
-
}`}
-
>
-
<Button
-
size="lg"
-
className="bg-primary text-primary-foreground hover:bg-primary/90 text-lg px-8 py-6 w-full"
-
onClick={() => setShowForm(true)}
-
>
-
Log in with AT Proto
-
<ArrowRight className="ml-2 w-5 h-5" />
-
</Button>
-
</div>
-
-
<div
-
className={`transition-all duration-500 ease-in-out absolute inset-0 ${
-
showForm
-
? 'opacity-100 translate-y-0'
-
: 'opacity-0 translate-y-5 pointer-events-none'
-
}`}
-
>
-
<form
-
onSubmit={async (e) => {
-
e.preventDefault()
-
try {
-
const handle =
-
inputRef.current?.value
-
const res = await fetch(
-
'/api/auth/signin',
-
{
-
method: 'POST',
-
headers: {
-
'Content-Type':
-
'application/json'
-
},
-
body: JSON.stringify({
-
handle
-
})
-
}
-
)
-
if (!res.ok)
-
throw new Error(
-
'Request failed'
-
)
-
const data = await res.json()
-
if (data.url) {
-
window.location.href = data.url
-
} else {
-
alert('Unexpected response')
-
}
-
} catch (error) {
-
console.error(
-
'Login failed:',
-
error
-
)
-
alert('Authentication failed')
-
}
-
}}
-
className="space-y-3"
-
>
-
<input
-
ref={inputRef}
-
type="text"
-
name="handle"
-
placeholder="Enter your handle (e.g., alice.bsky.social)"
-
className="w-full py-4 px-4 text-lg bg-input border border-border rounded-lg focus:outline-none focus:ring-2 focus:ring-accent"
-
/>
-
<button
-
type="submit"
-
className="w-full bg-accent hover:bg-accent/90 text-accent-foreground font-semibold py-4 px-6 text-lg rounded-lg inline-flex items-center justify-center transition-colors"
-
>
-
Continue
-
<ArrowRight className="ml-2 w-5 h-5" />
-
</button>
-
</form>
-
</div>
-
</div>
-
</div>
-
</section>
-
-
{/* How It Works */}
-
<section className="container mx-auto px-4 py-16 bg-muted/30">
-
<div className="max-w-3xl mx-auto text-center">
-
<h2 className="text-3xl md:text-4xl font-bold mb-8">
-
How it works
-
</h2>
-
<div className="space-y-6 text-left">
-
<div className="flex gap-4 items-start">
-
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
-
01
-
</div>
-
<div>
-
<h3 className="text-xl font-semibold mb-2">
-
Upload your static site
-
</h3>
-
<p className="text-muted-foreground">
-
Your HTML, CSS, and JavaScript files are
-
stored in your AT Protocol account as
-
gzipped blobs and a manifest record.
-
</p>
-
</div>
-
</div>
-
<div className="flex gap-4 items-start">
-
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
-
02
-
</div>
-
<div>
-
<h3 className="text-xl font-semibold mb-2">
-
We serve it globally
-
</h3>
-
<p className="text-muted-foreground">
-
Wisp.place reads your site from your
-
account and delivers it through our CDN
-
for fast loading anywhere.
-
</p>
-
</div>
-
</div>
-
<div className="flex gap-4 items-start">
-
<div className="text-4xl font-bold text-accent/40 min-w-[60px]">
-
03
-
</div>
-
<div>
-
<h3 className="text-xl font-semibold mb-2">
-
You stay in control
-
</h3>
-
<p className="text-muted-foreground">
-
Update or remove your site anytime
-
through your AT Protocol account. No
-
lock-in, no middleman ownership.
-
</p>
-
</div>
-
</div>
-
</div>
-
</div>
-
</section>
-
-
{/* Features Grid */}
-
<section id="features" className="container mx-auto px-4 py-20">
-
<div className="text-center mb-16">
-
<h2 className="text-4xl md:text-5xl font-bold mb-4 text-balance">
-
Why Wisp.place?
-
</h2>
-
<p className="text-xl text-muted-foreground text-balance max-w-2xl mx-auto">
-
Static site hosting that respects your ownership
-
</p>
-
</div>
-
-
<div className="grid md:grid-cols-2 lg:grid-cols-3 gap-6 max-w-6xl mx-auto">
-
{[
-
{
-
icon: Shield,
-
title: 'You Own Your Content',
-
description:
-
'Your site lives in your AT Protocol account. Move it to another service anytime, or take it offline yourself.'
-
},
-
{
-
icon: Zap,
-
title: 'CDN Performance',
-
description:
-
'We cache and serve your site from edge locations worldwide for fast load times.'
-
},
-
{
-
icon: Lock,
-
title: 'No Vendor Lock-in',
-
description:
-
'Your data stays in your account. Switch providers or self-host whenever you want.'
-
},
-
{
-
icon: Code,
-
title: 'Simple Deployment',
-
description:
-
'Upload your static files and we handle the rest. No complex configuration needed.'
-
},
-
{
-
icon: Server,
-
title: 'AT Protocol Native',
-
description:
-
'Built for the decentralized web. Your site has a verifiable identity on the network.'
-
},
-
{
-
icon: Globe,
-
title: 'Custom Domains',
-
description:
-
'Use your own domain name or a wisp.place subdomain. Your choice, either way.'
-
}
-
].map((feature, i) => (
-
<Card
-
key={i}
-
className="p-6 hover:shadow-lg transition-shadow border-2 bg-card"
-
>
-
<div className="w-12 h-12 rounded-lg bg-accent/10 flex items-center justify-center mb-4">
-
<feature.icon className="w-6 h-6 text-accent" />
-
</div>
-
<h3 className="text-xl font-semibold mb-2 text-card-foreground">
-
{feature.title}
-
</h3>
-
<p className="text-muted-foreground leading-relaxed">
-
{feature.description}
-
</p>
-
</Card>
-
))}
-
</div>
-
</section>
-
-
{/* CTA Section */}
-
<section className="container mx-auto px-4 py-20">
-
<div className="max-w-3xl mx-auto text-center bg-accent/5 border border-accent/20 rounded-2xl p-12">
-
<h2 className="text-3xl md:text-4xl font-bold mb-4">
-
Ready to deploy?
-
</h2>
-
<p className="text-xl text-muted-foreground mb-8">
-
Host your static site on your own AT Protocol
-
account today
-
</p>
-
<Button
-
size="lg"
-
className="bg-accent text-accent-foreground hover:bg-accent/90 text-lg px-8 py-6"
-
onClick={() => setShowForm(true)}
-
>
-
Get Started
-
<ArrowRight className="ml-2 w-5 h-5" />
-
</Button>
-
</div>
-
</section>
-
-
{/* Footer */}
-
<footer className="border-t border-border/40 bg-muted/20">
-
<div className="container mx-auto px-4 py-8">
-
<div className="text-center text-sm text-muted-foreground">
-
<p>
-
Built by{' '}
-
<a
-
href="https://bsky.app/profile/nekomimi.pet"
-
target="_blank"
-
rel="noopener noreferrer"
-
className="text-accent hover:text-accent/80 transition-colors font-medium"
-
>
-
@nekomimi.pet
-
</a>
-
</p>
-
</div>
-
</div>
-
</footer>
-
</div>
-
</>
-
)
-
}
-
-
const root = createRoot(document.getElementById('elysia')!)
-
root.render(
-
<Layout className="gap-6">
-
<App />
-
</Layout>
-
)
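
Step 01 of the landing page above says site files are stored in the user's AT Protocol account as gzipped blobs plus a manifest record. Purely as an illustration of the compression half (this is not the service's actual upload code), the round-trip with Node's built-in `zlib`:

```typescript
import { gzipSync, gunzipSync } from 'node:zlib'

// Illustrative only: a site file is compressed before being stored
// as a blob, and decompressed when served back.
const original = Buffer.from('<!doctype html><html><body>hi</body></html>')
const compressed = gzipSync(original)
const restored = gunzipSync(compressed)

console.log(restored.equals(original)) // true
```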
-27
public/layouts/index.tsx
···
-
import type { PropsWithChildren } from 'react'
-
-
import { QueryClientProvider, QueryClient } from '@tanstack/react-query'
-
import clsx from 'clsx'
-
-
import '@public/styles/global.css'
-
-
const client = new QueryClient()
-
-
interface LayoutProps extends PropsWithChildren {
-
className?: string
-
}
-
-
export default function Layout({ children, className }: LayoutProps) {
-
return (
-
<QueryClientProvider client={client}>
-
<div
-
className={clsx(
-
'flex flex-col items-center w-full min-h-screen',
-
className
-
)}
-
>
-
{children}
-
</div>
-
</QueryClientProvider>
-
)
-
}
-8
public/lib/api.ts
···
-
import { treaty } from '@elysiajs/eden'
-
-
import type { app } from '@server'
-
-
// Use the current host instead of hardcoded localhost
-
const apiHost = typeof window !== 'undefined' ? window.location.origin : 'http://localhost:8000'
-
-
export const api = treaty<typeof app>(apiHost)
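
The host selection above prefers the browser's origin and falls back to the local dev server outside a browser (e.g. during SSR or tests). Isolated as a standalone function for clarity (the name `resolveApiHost` is hypothetical):

```typescript
// Mirrors the apiHost logic above: use the current page origin when
// available, otherwise fall back to the local dev server URL.
function resolveApiHost(origin?: string): string {
    return origin ?? 'http://localhost:8000'
}

console.log(resolveApiHost('https://wisp.place')) // "https://wisp.place"
console.log(resolveApiHost()) // "http://localhost:8000"
```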
-6
public/lib/utils.ts
···
-
import { clsx, type ClassValue } from "clsx"
-
import { twMerge } from "tailwind-merge"
-
-
export function cn(...inputs: ClassValue[]) {
-
return twMerge(clsx(inputs))
-
}
-12
public/onboarding/index.html
···
-
<!doctype html>
-
<html lang="en">
-
<head>
-
<meta charset="UTF-8" />
-
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
-
<title>Get Started - wisp.place</title>
-
</head>
-
<body>
-
<div id="elysia"></div>
-
<script type="module" src="./onboarding.tsx"></script>
-
</body>
-
</html>
-467
public/onboarding/onboarding.tsx
···
-
import { useState, useEffect } from 'react'
-
import { createRoot } from 'react-dom/client'
-
import { Button } from '@public/components/ui/button'
-
import {
-
Card,
-
CardContent,
-
CardDescription,
-
CardHeader,
-
CardTitle
-
} from '@public/components/ui/card'
-
import { Input } from '@public/components/ui/input'
-
import { Label } from '@public/components/ui/label'
-
import { Globe, Upload, CheckCircle2, Loader2, AlertCircle } from 'lucide-react'
-
import Layout from '@public/layouts'
-
-
type OnboardingStep = 'domain' | 'upload' | 'complete'
-
-
function Onboarding() {
-
const [step, setStep] = useState<OnboardingStep>('domain')
-
const [handle, setHandle] = useState('')
-
const [isCheckingAvailability, setIsCheckingAvailability] = useState(false)
-
const [isAvailable, setIsAvailable] = useState<boolean | null>(null)
-
const [domain, setDomain] = useState('')
-
const [isClaimingDomain, setIsClaimingDomain] = useState(false)
-
const [claimedDomain, setClaimedDomain] = useState('')
-
-
const [siteName, setSiteName] = useState('')
-
const [selectedFiles, setSelectedFiles] = useState<FileList | null>(null)
-
const [isUploading, setIsUploading] = useState(false)
-
const [uploadProgress, setUploadProgress] = useState('')
-
const [skippedFiles, setSkippedFiles] = useState<Array<{ name: string; reason: string }>>([])
-
const [uploadedCount, setUploadedCount] = useState(0)
-
-
// Check domain availability as user types
-
useEffect(() => {
-
if (!handle || handle.length < 3) {
-
setIsAvailable(null)
-
setDomain('')
-
return
-
}
-
-
const timeoutId = setTimeout(async () => {
-
setIsCheckingAvailability(true)
-
try {
-
const response = await fetch(
-
`/api/domain/check?handle=${encodeURIComponent(handle)}`
-
)
-
const data = await response.json()
-
setIsAvailable(data.available)
-
setDomain(data.domain || '')
-
} catch (err) {
-
console.error('Error checking availability:', err)
-
setIsAvailable(false)
-
} finally {
-
setIsCheckingAvailability(false)
-
}
-
}, 500)
-
-
return () => clearTimeout(timeoutId)
-
}, [handle])
-
-
const handleClaimDomain = async () => {
-
if (!handle || !isAvailable) return
-
-
setIsClaimingDomain(true)
-
try {
-
const response = await fetch('/api/domain/claim', {
-
method: 'POST',
-
headers: { 'Content-Type': 'application/json' },
-
body: JSON.stringify({ handle })
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
setClaimedDomain(data.domain)
-
setStep('upload')
-
} else {
-
throw new Error(data.error || 'Failed to claim domain')
-
}
-
} catch (err) {
-
console.error('Error claiming domain:', err)
-
const errorMessage = err instanceof Error ? err.message : 'Unknown error'
-
-
// Handle "Already claimed" error - redirect to editor
-
if (errorMessage.includes('Already claimed')) {
-
alert('You have already claimed a wisp.place subdomain. Redirecting to editor...')
-
window.location.href = '/editor'
-
} else {
-
alert(`Failed to claim domain: ${errorMessage}`)
-
}
-
} finally {
-
setIsClaimingDomain(false)
-
}
-
}
-
-
const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
-
if (e.target.files && e.target.files.length > 0) {
-
setSelectedFiles(e.target.files)
-
}
-
}
-
-
const handleUpload = async () => {
-
if (!siteName) {
-
alert('Please enter a site name')
-
return
-
}
-
-
setIsUploading(true)
-
setUploadProgress('Preparing files...')
-
-
try {
-
const formData = new FormData()
-
formData.append('siteName', siteName)
-
-
if (selectedFiles) {
-
for (let i = 0; i < selectedFiles.length; i++) {
-
formData.append('files', selectedFiles[i])
-
}
-
}
-
-
setUploadProgress('Uploading to AT Protocol...')
-
const response = await fetch('/wisp/upload-files', {
-
method: 'POST',
-
body: formData
-
})
-
-
const data = await response.json()
-
if (data.success) {
-
setUploadProgress('Upload complete!')
-
setSkippedFiles(data.skippedFiles || [])
-
setUploadedCount(data.uploadedCount || data.fileCount || 0)
-
-
// If there are skipped files, show them briefly before redirecting
-
if (data.skippedFiles && data.skippedFiles.length > 0) {
-
setTimeout(() => {
-
window.location.href = `https://${claimedDomain}`
-
}, 3000) // Give more time to see skipped files
-
} else {
-
setTimeout(() => {
-
window.location.href = `https://${claimedDomain}`
-
}, 1500)
-
}
-
} else {
-
throw new Error(data.error || 'Upload failed')
-
}
-
} catch (err) {
-
console.error('Upload error:', err)
-
alert(
-
`Upload failed: ${err instanceof Error ? err.message : 'Unknown error'}`
-
)
-
setIsUploading(false)
-
setUploadProgress('')
-
}
-
}
-
-
const handleSkipUpload = () => {
-
// Redirect to editor without uploading
-
window.location.href = '/editor'
-
}
-
-
return (
-
<div className="w-full min-h-screen bg-background">
-
{/* Header */}
-
<header className="border-b border-border/40 bg-background/80 backdrop-blur-sm sticky top-0 z-50">
-
<div className="container mx-auto px-4 py-4 flex items-center justify-between">
-
<div className="flex items-center gap-2">
-
<div className="w-8 h-8 bg-primary rounded-lg flex items-center justify-center">
-
<Globe className="w-5 h-5 text-primary-foreground" />
-
</div>
-
<span className="text-xl font-semibold text-foreground">
-
wisp.place
-
</span>
-
</div>
-
</div>
-
</header>
-
-
<div className="container mx-auto px-4 py-12 max-w-2xl">
-
{/* Progress indicator */}
-
<div className="mb-8">
-
<div className="flex items-center justify-center gap-2 mb-4">
-
<div
-
className={`w-8 h-8 rounded-full flex items-center justify-center ${
-
step === 'domain'
-
? 'bg-primary text-primary-foreground'
-
: 'bg-green-500 text-white'
-
}`}
-
>
-
{step === 'domain' ? (
-
'1'
-
) : (
-
<CheckCircle2 className="w-5 h-5" />
-
)}
-
</div>
-
<div className="w-16 h-0.5 bg-border"></div>
-
<div
-
className={`w-8 h-8 rounded-full flex items-center justify-center ${
-
step === 'upload'
-
? 'bg-primary text-primary-foreground'
-
: step === 'domain'
-
? 'bg-muted text-muted-foreground'
-
: 'bg-green-500 text-white'
-
}`}
-
>
-
{step === 'complete' ? (
-
<CheckCircle2 className="w-5 h-5" />
-
) : (
-
'2'
-
)}
-
</div>
-
</div>
-
<div className="text-center">
-
<h1 className="text-2xl font-bold mb-2">
-
{step === 'domain' && 'Claim Your Free Domain'}
-
{step === 'upload' && 'Deploy Your First Site'}
-
{step === 'complete' && 'All Set!'}
-
</h1>
-
<p className="text-muted-foreground">
-
{step === 'domain' &&
-
'Choose a subdomain on wisp.place'}
-
{step === 'upload' &&
-
'Upload your site or start with an empty one'}
-
{step === 'complete' && 'Redirecting to your site...'}
-
</p>
-
</div>
-
</div>
-
-
{/* Domain registration step */}
-
{step === 'domain' && (
-
<Card>
-
<CardHeader>
-
<CardTitle>Choose Your Domain</CardTitle>
-
<CardDescription>
-
Pick a unique handle for your free *.wisp.place
-
subdomain
-
</CardDescription>
-
</CardHeader>
-
<CardContent className="space-y-4">
-
<div className="space-y-2">
-
<Label htmlFor="handle">Your Handle</Label>
-
<div className="flex gap-2">
-
<div className="relative flex-1">
-
<Input
-
id="handle"
-
placeholder="my-awesome-site"
-
value={handle}
-
onChange={(e) =>
-
setHandle(
-
e.target.value
-
.toLowerCase()
-
.replace(/[^a-z0-9-]/g, '')
-
)
-
}
-
className="pr-10"
-
/>
-
{isCheckingAvailability && (
-
<Loader2 className="absolute right-3 top-1/2 -translate-y-1/2 w-4 h-4 animate-spin text-muted-foreground" />
-
)}
-
{!isCheckingAvailability &&
-
isAvailable !== null && (
-
<div
-
className={`absolute right-3 top-1/2 -translate-y-1/2 ${
-
isAvailable
-
? 'text-green-500'
-
: 'text-red-500'
-
}`}
-
>
-
{isAvailable ? '✓' : '✗'}
-
</div>
-
)}
-
</div>
-
</div>
-
{domain && (
-
<p className="text-sm text-muted-foreground">
-
Your domain will be:{' '}
-
<span className="font-mono">{domain}</span>
-
</p>
-
)}
-
{isAvailable === false && handle.length >= 3 && (
-
<p className="text-sm text-red-500">
-
This handle is not available or invalid
-
</p>
-
)}
-
</div>
-
-
<Button
-
onClick={handleClaimDomain}
-
disabled={
-
!isAvailable ||
-
isClaimingDomain ||
-
isCheckingAvailability
-
}
-
className="w-full"
-
>
-
{isClaimingDomain ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Claiming Domain...
-
</>
-
) : (
-
<>Claim Domain</>
-
)}
-
</Button>
-
</CardContent>
-
</Card>
-
)}
-
-
{/* Upload step */}
-
{step === 'upload' && (
-
<Card>
-
<CardHeader>
-
<CardTitle>Deploy Your Site</CardTitle>
-
<CardDescription>
-
Upload your static site files or start with an empty
-
site (you can upload later)
-
</CardDescription>
-
</CardHeader>
-
<CardContent className="space-y-6">
-
<div className="p-4 bg-green-500/10 border border-green-500/20 rounded-lg">
-
<div className="flex items-center gap-2 text-green-600 dark:text-green-400">
-
<CheckCircle2 className="w-4 h-4" />
-
<span className="font-medium">
-
Domain claimed: {claimedDomain}
-
</span>
-
</div>
-
</div>
-
-
<div className="space-y-2">
-
<Label htmlFor="site-name">Site Name</Label>
-
<Input
-
id="site-name"
-
placeholder="my-site"
-
value={siteName}
-
onChange={(e) => setSiteName(e.target.value)}
-
/>
-
<p className="text-xs text-muted-foreground">
-
A unique identifier for this site in your account
-
</p>
-
</div>
-
-
<div className="space-y-2">
-
<Label>Upload Files (Optional)</Label>
-
<div className="border-2 border-dashed border-border rounded-lg p-8 text-center hover:border-accent transition-colors">
-
<Upload className="w-12 h-12 text-muted-foreground mx-auto mb-4" />
-
<input
-
type="file"
-
id="file-upload"
-
multiple
-
onChange={handleFileSelect}
-
className="hidden"
-
{...(({ webkitdirectory: '', directory: '' } as any))}
-
/>
-
<label
-
htmlFor="file-upload"
-
className="cursor-pointer"
-
>
-
<Button
-
variant="outline"
-
type="button"
-
onClick={() =>
-
document
-
.getElementById('file-upload')
-
?.click()
-
}
-
>
-
Choose Folder
-
</Button>
-
</label>
-
{selectedFiles && selectedFiles.length > 0 && (
-
<p className="text-sm text-muted-foreground mt-3">
-
{selectedFiles.length} files selected
-
</p>
-
)}
-
</div>
-
<p className="text-xs text-muted-foreground">
-
Supported: HTML, CSS, JS, images, fonts, and more
-
</p>
-
<p className="text-xs text-muted-foreground">
-
Limits: 100MB per file, 300MB total
-
</p>
-
</div>
-
-
{uploadProgress && (
-
<div className="space-y-3">
-
<div className="p-4 bg-muted rounded-lg">
-
<div className="flex items-center gap-2">
-
<Loader2 className="w-4 h-4 animate-spin" />
-
<span className="text-sm">
-
{uploadProgress}
-
</span>
-
</div>
-
</div>
-
-
{skippedFiles.length > 0 && (
-
<div className="p-4 bg-yellow-500/10 border border-yellow-500/20 rounded-lg">
-
<div className="flex items-start gap-2 text-yellow-600 dark:text-yellow-400 mb-2">
-
<AlertCircle className="w-4 h-4 mt-0.5 flex-shrink-0" />
-
<div className="flex-1">
-
<span className="font-medium">
-
{skippedFiles.length} file{skippedFiles.length > 1 ? 's' : ''} skipped
-
</span>
-
{uploadedCount > 0 && (
-
<span className="text-sm ml-2">
-
({uploadedCount} uploaded successfully)
-
</span>
-
)}
-
</div>
-
</div>
-
<div className="ml-6 space-y-1 max-h-32 overflow-y-auto">
-
{skippedFiles.slice(0, 5).map((file, idx) => (
-
<div key={idx} className="text-xs">
-
<span className="font-mono">{file.name}</span>
-
<span className="text-muted-foreground"> - {file.reason}</span>
-
</div>
-
))}
-
{skippedFiles.length > 5 && (
-
<div className="text-xs text-muted-foreground">
-
...and {skippedFiles.length - 5} more
-
</div>
-
)}
-
</div>
-
</div>
-
)}
-
</div>
-
)}
-
-
<div className="flex gap-3">
-
<Button
-
onClick={handleSkipUpload}
-
variant="outline"
-
className="flex-1"
-
disabled={isUploading}
-
>
-
Skip for Now
-
</Button>
-
<Button
-
onClick={handleUpload}
-
className="flex-1"
-
disabled={!siteName || isUploading}
-
>
-
{isUploading ? (
-
<>
-
<Loader2 className="w-4 h-4 mr-2 animate-spin" />
-
Uploading...
-
</>
-
) : (
-
<>
-
{selectedFiles && selectedFiles.length > 0
-
? 'Upload & Deploy'
-
: 'Create Empty Site'}
-
</>
-
)}
-
</Button>
-
</div>
-
</CardContent>
-
</Card>
-
)}
-
</div>
-
</div>
-
)
-
}
-
-
const root = createRoot(document.getElementById('elysia')!)
-
root.render(
-
<Layout>
-
<Onboarding />
-
</Layout>
-
)
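
The onboarding form above lowercases the handle and strips disallowed characters inline in its `onChange` handler, and only checks availability for handles of three or more characters. Extracted as standalone helpers (both names are hypothetical):

```typescript
// Hypothetical extraction of the inline onChange logic above:
// handles are lowercase alphanumerics and hyphens only.
function sanitizeHandle(raw: string): string {
    return raw.toLowerCase().replace(/[^a-z0-9-]/g, '')
}

// Mirrors the availability check's minimum-length guard.
function isCheckableHandle(handle: string): boolean {
    return handle.length >= 3
}

console.log(sanitizeHandle('My Awesome_Site!')) // "myawesomesite"
console.log(isCheckableHandle('ab')) // false
```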
-166
public/styles/global.css
···
-
@import "tailwindcss";
-
@import "tw-animate-css";
-
-
@custom-variant dark (&:is(.dark *));
-
-
:root {
-
/* Warm beige background inspired by Sunset design #E9DDD8 */
-
--background: oklch(0.90 0.012 35);
-
/* Very dark brown text for strong contrast #2A2420 */
-
--foreground: oklch(0.18 0.01 30);
-
-
/* Slightly lighter card background */
-
--card: oklch(0.93 0.01 35);
-
--card-foreground: oklch(0.18 0.01 30);
-
-
--popover: oklch(0.93 0.01 35);
-
--popover-foreground: oklch(0.18 0.01 30);
-
-
/* Dark brown primary inspired by #645343 */
-
--primary: oklch(0.35 0.02 35);
-
--primary-foreground: oklch(0.95 0.01 35);
-
-
/* Bright pink accent for links #FFAAD2 */
-
--accent: oklch(0.78 0.15 345);
-
--accent-foreground: oklch(0.18 0.01 30);
-
-
/* Medium taupe secondary inspired by #867D76 */
-
--secondary: oklch(0.52 0.015 30);
-
--secondary-foreground: oklch(0.95 0.01 35);
-
-
/* Light warm muted background */
-
--muted: oklch(0.88 0.01 35);
-
--muted-foreground: oklch(0.42 0.015 30);
-
-
--border: oklch(0.75 0.015 30);
-
--input: oklch(0.92 0.01 35);
-
--ring: oklch(0.72 0.08 15);
-
-
--destructive: oklch(0.577 0.245 27.325);
-
--destructive-foreground: oklch(0.985 0 0);
-
-
--chart-1: oklch(0.78 0.15 345);
-
--chart-2: oklch(0.32 0.04 285);
-
--chart-3: oklch(0.56 0.08 220);
-
--chart-4: oklch(0.85 0.02 130);
-
--chart-5: oklch(0.93 0.03 85);
-
-
--radius: 0.75rem;
-
--sidebar: oklch(0.985 0 0);
-
--sidebar-foreground: oklch(0.145 0 0);
-
--sidebar-primary: oklch(0.205 0 0);
-
--sidebar-primary-foreground: oklch(0.985 0 0);
-
--sidebar-accent: oklch(0.97 0 0);
-
--sidebar-accent-foreground: oklch(0.205 0 0);
-
--sidebar-border: oklch(0.922 0 0);
-
--sidebar-ring: oklch(0.708 0 0);
-
}
-
-
.dark {
-
/* #413C58 - violet background for dark mode */
-
--background: oklch(0.28 0.04 285);
-
/* #F2E7C9 - parchment text */
-
--foreground: oklch(0.93 0.03 85);
-
-
--card: oklch(0.32 0.04 285);
-
--card-foreground: oklch(0.93 0.03 85);
-
-
--popover: oklch(0.32 0.04 285);
-
--popover-foreground: oklch(0.93 0.03 85);
-
-
/* #FFAAD2 - pink primary in dark mode */
-
--primary: oklch(0.78 0.15 345);
-
--primary-foreground: oklch(0.32 0.04 285);
-
-
--accent: oklch(0.78 0.15 345);
-
--accent-foreground: oklch(0.32 0.04 285);
-
-
--secondary: oklch(0.56 0.08 220);
-
--secondary-foreground: oklch(0.93 0.03 85);
-
-
--muted: oklch(0.38 0.03 285);
-
--muted-foreground: oklch(0.75 0.02 85);
-
-
--border: oklch(0.42 0.03 285);
-
--input: oklch(0.42 0.03 285);
-
--ring: oklch(0.78 0.15 345);
-
-
--destructive: oklch(0.577 0.245 27.325);
-
--destructive-foreground: oklch(0.985 0 0);
-
-
--chart-1: oklch(0.78 0.15 345);
-
--chart-2: oklch(0.93 0.03 85);
-
--chart-3: oklch(0.56 0.08 220);
-
--chart-4: oklch(0.85 0.02 130);
-
--chart-5: oklch(0.32 0.04 285);
-
--sidebar: oklch(0.205 0 0);
-
--sidebar-foreground: oklch(0.985 0 0);
-
--sidebar-primary: oklch(0.488 0.243 264.376);
-
--sidebar-primary-foreground: oklch(0.985 0 0);
-
--sidebar-accent: oklch(0.269 0 0);
-
--sidebar-accent-foreground: oklch(0.985 0 0);
-
--sidebar-border: oklch(0.269 0 0);
-
--sidebar-ring: oklch(0.439 0 0);
-
}
-
-
@theme inline {
-
/* optional: --font-sans, --font-serif, --font-mono if they are applied in the layout.tsx */
-
--color-background: var(--background);
-
--color-foreground: var(--foreground);
-
--color-card: var(--card);
-
--color-card-foreground: var(--card-foreground);
-
--color-popover: var(--popover);
-
--color-popover-foreground: var(--popover-foreground);
-
--color-primary: var(--primary);
-
--color-primary-foreground: var(--primary-foreground);
-
--color-secondary: var(--secondary);
-
--color-secondary-foreground: var(--secondary-foreground);
-
--color-muted: var(--muted);
-
--color-muted-foreground: var(--muted-foreground);
-
--color-accent: var(--accent);
-
--color-accent-foreground: var(--accent-foreground);
-
--color-destructive: var(--destructive);
-
--color-destructive-foreground: var(--destructive-foreground);
-
--color-border: var(--border);
-
--color-input: var(--input);
-
--color-ring: var(--ring);
-
--color-chart-1: var(--chart-1);
-
--color-chart-2: var(--chart-2);
-
--color-chart-3: var(--chart-3);
-
--color-chart-4: var(--chart-4);
-
--color-chart-5: var(--chart-5);
-
--radius-sm: calc(var(--radius) - 4px);
-
--radius-md: calc(var(--radius) - 2px);
-
--radius-lg: var(--radius);
-
--radius-xl: calc(var(--radius) + 4px);
-
--color-sidebar: var(--sidebar);
-
--color-sidebar-foreground: var(--sidebar-foreground);
-
--color-sidebar-primary: var(--sidebar-primary);
-
--color-sidebar-primary-foreground: var(--sidebar-primary-foreground);
-
--color-sidebar-accent: var(--sidebar-accent);
-
--color-sidebar-accent-foreground: var(--sidebar-accent-foreground);
-
--color-sidebar-border: var(--sidebar-border);
-
--color-sidebar-ring: var(--sidebar-ring);
-
}
-
-
@layer base {
-
* {
-
@apply border-border outline-ring/50;
-
}
-
body {
-
@apply bg-background text-foreground;
-
}
-
}
-
-
@keyframes arrow-bounce {
-
0%, 100% {
-
transform: translateX(0);
-
}
-
50% {
-
transform: translateX(4px);
-
}
-
}
-
-
.arrow-animate {
-
animation: arrow-bounce 1.5s ease-in-out infinite;
-
}
-46
scripts/change-admin-password.ts
···
-
// Change admin password
-
import { adminAuth } from '../src/lib/admin-auth'
-
import { db } from '../src/lib/db'
-
import { randomBytes, createHash } from 'crypto'
-
-
// Get username and new password from command line
-
const username = process.argv[2]
-
const newPassword = process.argv[3]
-
-
if (!username || !newPassword) {
-
console.error('Usage: bun run scripts/change-admin-password.ts <username> <new-password>')
-
process.exit(1)
-
}
-
-
if (newPassword.length < 8) {
-
console.error('Password must be at least 8 characters')
-
process.exit(1)
-
}
-
-
// Hash password
-
function hashPassword(password: string, salt: string): string {
-
return createHash('sha256').update(password + salt).digest('hex')
-
}
-
-
function generateSalt(): string {
-
return randomBytes(32).toString('hex')
-
}
-
-
// Initialize
-
await adminAuth.init()
-
-
// Check if user exists
-
const result = await db`SELECT username FROM admin_users WHERE username = ${username}`
-
if (result.length === 0) {
-
console.error(`Admin user '${username}' not found`)
-
process.exit(1)
-
}
-
-
// Update password
-
const salt = generateSalt()
-
const passwordHash = hashPassword(newPassword, salt)
-
-
await db`UPDATE admin_users SET password_hash = ${passwordHash}, salt = ${salt} WHERE username = ${username}`
-
-
console.log(`โœ“ Password updated for admin user '${username}'`)
-
process.exit(0)
-31
scripts/create-admin.ts
···
-
// Quick script to create admin user with randomly generated password
-
import { adminAuth } from '../src/lib/admin-auth'
-
import { randomBytes } from 'crypto'
-
-
// Generate a secure random password
-
function generatePassword(length: number = 20): string {
-
const chars = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'
-
const bytes = randomBytes(length)
-
let password = ''
-
for (let i = 0; i < length; i++) {
-
password += chars[bytes[i] % chars.length]
-
}
-
return password
-
}
-
-
const username = 'admin'
-
const password = generatePassword(20)
-
-
await adminAuth.init()
-
await adminAuth.createAdmin(username, password)
-
-
console.log('\nโ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
-
console.log('โ•‘ ADMIN USER CREATED SUCCESSFULLY โ•‘')
-
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
-
console.log(`Username: ${username}`)
-
console.log(`Password: ${password}`)
-
console.log('\nโš ๏ธ IMPORTANT: Save this password securely!')
-
console.log('This password will not be shown again.\n')
-
console.log('Change it with: bun run scripts/change-admin-password.ts admin NEW_PASSWORD\n')
-
-
process.exit(0)
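A side note on `generatePassword` above: indexing with `bytes[i] % chars.length` carries a slight modulo bias, since 256 is not a multiple of the 70-character alphabet (bytes 210–255 wrap around and favor the first 46 characters). A hedged sketch of an unbiased variant using rejection sampling — the function name is mine, not from the codebase:

```typescript
import { randomBytes } from 'crypto'

// Same alphabet as the scripts above.
const chars = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'

// Sketch: rejection-sampling variant of generatePassword. Bytes at or above
// the largest multiple of chars.length (here 210) are discarded instead of
// wrapped, which removes the modulo bias at the cost of occasional re-draws.
function generatePasswordUnbiased(length: number = 20): string {
	const max = 256 - (256 % chars.length) // 210: largest usable byte value + 1
	let password = ''
	while (password.length < length) {
		for (const byte of randomBytes(length)) {
			if (byte < max && password.length < length) {
				password += chars[byte % chars.length]
			}
		}
	}
	return password
}
```

For a 20-character admin password the bias is negligible in practice, so this is a refinement rather than a necessary fix.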
-159
src/index.ts
···
-
import { Elysia } from 'elysia'
-
import { cors } from '@elysiajs/cors'
-
import { openapi, fromTypes } from '@elysiajs/openapi'
-
import { staticPlugin } from '@elysiajs/static'
-
-
import type { Config } from './lib/types'
-
import { BASE_HOST } from './lib/constants'
-
import {
-
createClientMetadata,
-
getOAuthClient,
-
getCurrentKeys,
-
cleanupExpiredSessions,
-
rotateKeysIfNeeded
-
} from './lib/oauth-client'
-
import { authRoutes } from './routes/auth'
-
import { wispRoutes } from './routes/wisp'
-
import { domainRoutes } from './routes/domain'
-
import { userRoutes } from './routes/user'
-
import { siteRoutes } from './routes/site'
-
import { csrfProtection } from './lib/csrf'
-
import { DNSVerificationWorker } from './lib/dns-verification-worker'
-
import { logger, logCollector, observabilityMiddleware } from './lib/observability'
-
import { promptAdminSetup } from './lib/admin-auth'
-
import { adminRoutes } from './routes/admin'
-
-
const config: Config = {
-
domain: (Bun.env.DOMAIN ?? `https://${BASE_HOST}`) as Config['domain'],
-
clientName: Bun.env.CLIENT_NAME ?? 'wisp.place'
-
}
-
-
// Initialize admin setup (prompt if no admin exists)
-
await promptAdminSetup()
-
-
const client = await getOAuthClient(config)
-
-
// Periodic maintenance: cleanup expired sessions and rotate keys
-
// Run every hour
-
const runMaintenance = async () => {
-
console.log('[Maintenance] Running periodic maintenance...')
-
await cleanupExpiredSessions()
-
await rotateKeysIfNeeded()
-
}
-
-
// Run maintenance on startup
-
runMaintenance()
-
-
// Schedule maintenance to run every hour
-
setInterval(runMaintenance, 60 * 60 * 1000)
-
-
// Start DNS verification worker (runs every 10 minutes)
-
const dnsVerifier = new DNSVerificationWorker(
-
10 * 60 * 1000, // 10 minutes
-
(msg, data) => {
-
logCollector.info(`[DNS Verifier] ${msg}`, 'main-app', data ? { data } : undefined)
-
}
-
)
-
-
dnsVerifier.start()
-
logger.info('DNS Verifier Started - checking custom domains every 10 minutes')
-
-
export const app = new Elysia()
-
.use(openapi({
-
references: fromTypes()
-
}))
-
// Observability middleware
-
.onBeforeHandle(observabilityMiddleware('main-app').beforeHandle)
-
.onAfterHandle((ctx) => {
-
observabilityMiddleware('main-app').afterHandle(ctx)
-
// Security headers middleware
-
const { set } = ctx
-
// Prevent clickjacking attacks
-
set.headers['X-Frame-Options'] = 'DENY'
-
// Prevent MIME type sniffing
-
set.headers['X-Content-Type-Options'] = 'nosniff'
-
// Strict Transport Security (HSTS) - enforce HTTPS
-
set.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains'
-
// Referrer policy - limit referrer information
-
set.headers['Referrer-Policy'] = 'strict-origin-when-cross-origin'
-
// Content Security Policy
-
set.headers['Content-Security-Policy'] =
-
"default-src 'self'; " +
-
"script-src 'self' 'unsafe-inline' 'unsafe-eval'; " +
-
"style-src 'self' 'unsafe-inline'; " +
-
"img-src 'self' data: https:; " +
-
"font-src 'self' data:; " +
-
"connect-src 'self' https:; " +
-
"frame-ancestors 'none'; " +
-
"base-uri 'self'; " +
-
"form-action 'self'"
-
// Additional security headers
-
set.headers['X-XSS-Protection'] = '1; mode=block'
-
set.headers['Permissions-Policy'] = 'geolocation=(), microphone=(), camera=()'
-
})
-
.onError(observabilityMiddleware('main-app').onError)
-
.use(csrfProtection())
-
.use(authRoutes(client))
-
.use(wispRoutes(client))
-
.use(domainRoutes(client))
-
.use(userRoutes(client))
-
.use(siteRoutes(client))
-
.use(adminRoutes())
-
.use(
-
await staticPlugin({
-
prefix: '/'
-
})
-
)
-
.get('/client-metadata.json', (c) => {
-
return createClientMetadata(config)
-
})
-
.get('/jwks.json', async (c) => {
-
const keys = await getCurrentKeys()
-
if (!keys.length) return { keys: [] }
-
-
return {
-
keys: keys.map((k) => {
-
const jwk = k.publicJwk ?? k
-
// strip private-key material (the EC/OKP private scalar `d`) before publishing
-
const { d, ...pub } = jwk
-
return pub
-
})
-
}
-
})
-
.get('/api/health', () => {
-
const dnsVerifierHealth = dnsVerifier.getHealth()
-
return {
-
status: 'ok',
-
timestamp: new Date().toISOString(),
-
dnsVerifier: dnsVerifierHealth
-
}
-
})
-
.get('/api/admin/test', () => {
-
return { message: 'Admin routes test works!' }
-
})
-
.post('/api/admin/verify-dns', async () => {
-
try {
-
await dnsVerifier.trigger()
-
return {
-
success: true,
-
message: 'DNS verification triggered'
-
}
-
} catch (error) {
-
return {
-
success: false,
-
error: error instanceof Error ? error.message : String(error)
-
}
-
}
-
})
-
.use(cors({
-
origin: config.domain,
-
credentials: true,
-
methods: ['GET', 'POST', 'DELETE', 'PUT', 'PATCH', 'OPTIONS'],
-
allowedHeaders: ['Content-Type', 'Authorization', 'Origin', 'X-Forwarded-Host'],
-
exposeHeaders: ['Content-Type'],
-
maxAge: 86400 // 24 hours
-
}))
-
.listen(8000)
-
-
console.log(
-
`๐ŸฆŠ Elysia is running at ${app.server?.hostname}:${app.server?.port}`
-
)
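The `/jwks.json` handler above publishes stored keys, so any private JWK members must be removed before the set goes out. A sketch of that stripping step as a standalone helper (the name `toPublicJwk` is mine; member names follow RFC 7517/7518 — `d` is the EC/OKP private part, and `p`, `q`, `dp`, `dq`, `qi` are RSA private CRT parameters):

```typescript
type Jwk = Record<string, unknown>

// Sketch: return a copy of a JWK with all private members removed,
// suitable for exposure in a public JWK Set.
function toPublicJwk(jwk: Jwk): Jwk {
	const { d, p, q, dp, dq, qi, ...pub } = jwk
	return pub
}
```

Note that a bare `const { ...pub } = jwk` copies every member, private ones included; the explicit destructure is what does the work.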
-44
src/lexicons/index.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import {
-
type Auth,
-
type Options as XrpcOptions,
-
Server as XrpcServer,
-
type StreamConfigOrHandler,
-
type MethodConfigOrHandler,
-
createServer as createXrpcServer,
-
} from '@atproto/xrpc-server'
-
import { schemas } from './lexicons.js'
-
-
export function createServer(options?: XrpcOptions): Server {
-
return new Server(options)
-
}
-
-
export class Server {
-
xrpc: XrpcServer
-
place: PlaceNS
-
-
constructor(options?: XrpcOptions) {
-
this.xrpc = createXrpcServer(schemas, options)
-
this.place = new PlaceNS(this)
-
}
-
}
-
-
export class PlaceNS {
-
_server: Server
-
wisp: PlaceWispNS
-
-
constructor(server: Server) {
-
this._server = server
-
this.wisp = new PlaceWispNS(server)
-
}
-
}
-
-
export class PlaceWispNS {
-
_server: Server
-
-
constructor(server: Server) {
-
this._server = server
-
}
-
}
-127
src/lexicons/lexicons.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import {
-
type LexiconDoc,
-
Lexicons,
-
ValidationError,
-
type ValidationResult,
-
} from '@atproto/lexicon'
-
import { type $Typed, is$typed, maybe$typed } from './util.js'
-
-
export const schemaDict = {
-
PlaceWispFs: {
-
lexicon: 1,
-
id: 'place.wisp.fs',
-
defs: {
-
main: {
-
type: 'record',
-
description: 'Virtual filesystem manifest for a Wisp site',
-
record: {
-
type: 'object',
-
required: ['site', 'root', 'createdAt'],
-
properties: {
-
site: {
-
type: 'string',
-
},
-
root: {
-
type: 'ref',
-
ref: 'lex:place.wisp.fs#directory',
-
},
-
fileCount: {
-
type: 'integer',
-
minimum: 0,
-
maximum: 1000,
-
},
-
createdAt: {
-
type: 'string',
-
format: 'datetime',
-
},
-
},
-
},
-
},
-
file: {
-
type: 'object',
-
required: ['type', 'blob'],
-
properties: {
-
type: {
-
type: 'string',
-
const: 'file',
-
},
-
blob: {
-
type: 'blob',
-
accept: ['*/*'],
-
maxSize: 1000000,
-
description: 'Content blob ref',
-
},
-
},
-
},
-
directory: {
-
type: 'object',
-
required: ['type', 'entries'],
-
properties: {
-
type: {
-
type: 'string',
-
const: 'directory',
-
},
-
entries: {
-
type: 'array',
-
maxLength: 500,
-
items: {
-
type: 'ref',
-
ref: 'lex:place.wisp.fs#entry',
-
},
-
},
-
},
-
},
-
entry: {
-
type: 'object',
-
required: ['name', 'node'],
-
properties: {
-
name: {
-
type: 'string',
-
maxLength: 255,
-
},
-
node: {
-
type: 'union',
-
refs: ['lex:place.wisp.fs#file', 'lex:place.wisp.fs#directory'],
-
},
-
},
-
},
-
},
-
},
-
} as const satisfies Record<string, LexiconDoc>
-
export const schemas = Object.values(schemaDict) satisfies LexiconDoc[]
-
export const lexicons: Lexicons = new Lexicons(schemas)
-
-
export function validate<T extends { $type: string }>(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType: true,
-
): ValidationResult<T>
-
export function validate<T extends { $type?: string }>(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType?: false,
-
): ValidationResult<T>
-
export function validate(
-
v: unknown,
-
id: string,
-
hash: string,
-
requiredType?: boolean,
-
): ValidationResult {
-
return (requiredType ? is$typed : maybe$typed)(v, id, hash)
-
? lexicons.validate(`${id}#${hash}`, v)
-
: {
-
success: false,
-
error: new ValidationError(
-
`Must be an object with "${hash === 'main' ? id : `${id}#${hash}`}" $type property`,
-
),
-
}
-
}
-
-
export const ids = {
-
PlaceWispFs: 'place.wisp.fs',
-
} as const
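To make the schema concrete, here is a minimal object shaped like the `place.wisp.fs` record defined above. The values are invented for illustration, and the blob is elided — a real record carries a full `BlobRef` with a CID:

```typescript
// A one-file site: root directory -> entry "index.html" -> file node.
const record = {
	$type: 'place.wisp.fs',
	site: 'my-site',
	fileCount: 1,
	createdAt: '2024-01-01T00:00:00.000Z',
	root: {
		type: 'directory',
		entries: [
			{
				name: 'index.html',
				node: { type: 'file', blob: {} as unknown } // BlobRef elided
			}
		]
	}
}
```

The schema's limits (1,000 files per manifest, 500 entries per directory, 1 MB per blob) bound how large such a record can grow.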
-85
src/lexicons/types/place/wisp/fs.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
import { type ValidationResult, BlobRef } from '@atproto/lexicon'
-
import { CID } from 'multiformats/cid'
-
import { validate as _validate } from '../../../lexicons'
-
import { type $Typed, is$typed as _is$typed, type OmitKey } from '../../../util'
-
-
const is$typed = _is$typed,
-
validate = _validate
-
const id = 'place.wisp.fs'
-
-
export interface Main {
-
$type: 'place.wisp.fs'
-
site: string
-
root: Directory
-
fileCount?: number
-
createdAt: string
-
[k: string]: unknown
-
}
-
-
const hashMain = 'main'
-
-
export function isMain<V>(v: V) {
-
return is$typed(v, id, hashMain)
-
}
-
-
export function validateMain<V>(v: V) {
-
return validate<Main & V>(v, id, hashMain, true)
-
}
-
-
export {
-
type Main as Record,
-
isMain as isRecord,
-
validateMain as validateRecord,
-
}
-
-
export interface File {
-
$type?: 'place.wisp.fs#file'
-
type: 'file'
-
/** Content blob ref */
-
blob: BlobRef
-
}
-
-
const hashFile = 'file'
-
-
export function isFile<V>(v: V) {
-
return is$typed(v, id, hashFile)
-
}
-
-
export function validateFile<V>(v: V) {
-
return validate<File & V>(v, id, hashFile)
-
}
-
-
export interface Directory {
-
$type?: 'place.wisp.fs#directory'
-
type: 'directory'
-
entries: Entry[]
-
}
-
-
const hashDirectory = 'directory'
-
-
export function isDirectory<V>(v: V) {
-
return is$typed(v, id, hashDirectory)
-
}
-
-
export function validateDirectory<V>(v: V) {
-
return validate<Directory & V>(v, id, hashDirectory)
-
}
-
-
export interface Entry {
-
$type?: 'place.wisp.fs#entry'
-
name: string
-
node: $Typed<File> | $Typed<Directory> | { $type: string }
-
}
-
-
const hashEntry = 'entry'
-
-
export function isEntry<V>(v: V) {
-
return is$typed(v, id, hashEntry)
-
}
-
-
export function validateEntry<V>(v: V) {
-
return validate<Entry & V>(v, id, hashEntry)
-
}
-82
src/lexicons/util.ts
···
-
/**
-
* GENERATED CODE - DO NOT MODIFY
-
*/
-
-
import { type ValidationResult } from '@atproto/lexicon'
-
-
export type OmitKey<T, K extends keyof T> = {
-
[K2 in keyof T as K2 extends K ? never : K2]: T[K2]
-
}
-
-
export type $Typed<V, T extends string = string> = V & { $type: T }
-
export type Un$Typed<V extends { $type?: string }> = OmitKey<V, '$type'>
-
-
export type $Type<Id extends string, Hash extends string> = Hash extends 'main'
-
? Id
-
: `${Id}#${Hash}`
-
-
function isObject<V>(v: V): v is V & object {
-
return v != null && typeof v === 'object'
-
}
-
-
function is$type<Id extends string, Hash extends string>(
-
$type: unknown,
-
id: Id,
-
hash: Hash,
-
): $type is $Type<Id, Hash> {
-
return hash === 'main'
-
? $type === id
-
: // $type === `${id}#${hash}`
-
typeof $type === 'string' &&
-
$type.length === id.length + 1 + hash.length &&
-
$type.charCodeAt(id.length) === 35 /* '#' */ &&
-
$type.startsWith(id) &&
-
$type.endsWith(hash)
-
}
-
-
export type $TypedObject<
-
V,
-
Id extends string,
-
Hash extends string,
-
> = V extends {
-
$type: $Type<Id, Hash>
-
}
-
? V
-
: V extends { $type?: string }
-
? V extends { $type?: infer T extends $Type<Id, Hash> }
-
? V & { $type: T }
-
: never
-
: V & { $type: $Type<Id, Hash> }
-
-
export function is$typed<V, Id extends string, Hash extends string>(
-
v: V,
-
id: Id,
-
hash: Hash,
-
): v is $TypedObject<V, Id, Hash> {
-
return isObject(v) && '$type' in v && is$type(v.$type, id, hash)
-
}
-
-
export function maybe$typed<V, Id extends string, Hash extends string>(
-
v: V,
-
id: Id,
-
hash: Hash,
-
): v is V & object & { $type?: $Type<Id, Hash> } {
-
return (
-
isObject(v) &&
-
('$type' in v ? v.$type === undefined || is$type(v.$type, id, hash) : true)
-
)
-
}
-
-
export type Validator<R = unknown> = (v: unknown) => ValidationResult<R>
-
export type ValidatorParam<V extends Validator> =
-
V extends Validator<infer R> ? R : never
-
-
/**
-
* Utility function that allows to convert a "validate*" utility function into a
-
* type predicate.
-
*/
-
export function asPredicate<V extends Validator>(validate: V) {
-
return function <T>(v: T): v is T & ValidatorParam<V> {
-
return validate(v).success
-
}
-
}
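The generated `is$type` above avoids allocating a `${id}#${hash}` string on every call: it checks the length, the `#` at the expected position, the prefix, and the suffix of the existing string, which together imply exact equality. A standalone copy with the same logic, for illustration:

```typescript
// Standalone copy of the generated is$type check. For non-"main" hashes,
// length + '#' position + prefix + suffix together cover every character,
// so the checks are equivalent to $type === `${id}#${hash}` without the
// string allocation.
function is$type(t: unknown, id: string, hash: string): boolean {
	return hash === 'main'
		? t === id
		: typeof t === 'string' &&
			t.length === id.length + 1 + hash.length &&
			t.charCodeAt(id.length) === 35 /* '#' */ &&
			t.startsWith(id) &&
			t.endsWith(hash)
}
```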
-208
src/lib/admin-auth.ts
···
-
// Admin authentication system
-
import { db } from './db'
-
import { randomBytes, createHash } from 'crypto'
-
-
interface AdminUser {
-
id: number
-
username: string
-
password_hash: string
-
created_at: Date
-
}
-
-
interface AdminSession {
-
sessionId: string
-
username: string
-
expiresAt: Date
-
}
-
-
// In-memory session storage
-
const sessions = new Map<string, AdminSession>()
-
const SESSION_DURATION = 24 * 60 * 60 * 1000 // 24 hours
-
-
// Hash password using SHA-256 with salt
-
function hashPassword(password: string, salt: string): string {
-
return createHash('sha256').update(password + salt).digest('hex')
-
}
-
-
// Generate random salt
-
function generateSalt(): string {
-
return randomBytes(32).toString('hex')
-
}
-
-
// Generate session ID
-
function generateSessionId(): string {
-
return randomBytes(32).toString('hex')
-
}
-
-
// Generate a secure random password
-
function generatePassword(length: number = 20): string {
-
const chars = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789!@#$%^&*'
-
const bytes = randomBytes(length)
-
let password = ''
-
for (let i = 0; i < length; i++) {
-
password += chars[bytes[i] % chars.length]
-
}
-
return password
-
}
-
-
export const adminAuth = {
-
// Initialize admin table
-
async init() {
-
await db`
-
CREATE TABLE IF NOT EXISTS admin_users (
-
id SERIAL PRIMARY KEY,
-
username TEXT UNIQUE NOT NULL,
-
password_hash TEXT NOT NULL,
-
salt TEXT NOT NULL,
-
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
-
)
-
`
-
},
-
-
// Check if any admin exists
-
async hasAdmin(): Promise<boolean> {
-
const result = await db`SELECT COUNT(*) as count FROM admin_users`
-
return result[0].count > 0
-
},
-
-
// Create admin user
-
async createAdmin(username: string, password: string): Promise<boolean> {
-
try {
-
const salt = generateSalt()
-
const passwordHash = hashPassword(password, salt)
-
-
await db`INSERT INTO admin_users (username, password_hash, salt) VALUES (${username}, ${passwordHash}, ${salt})`
-
-
console.log(`โœ“ Admin user '${username}' created successfully`)
-
return true
-
} catch (error) {
-
console.error('Failed to create admin user:', error)
-
return false
-
}
-
},
-
-
// Verify admin credentials
-
async verify(username: string, password: string): Promise<boolean> {
-
try {
-
const result = await db`SELECT password_hash, salt FROM admin_users WHERE username = ${username}`
-
-
if (result.length === 0) {
-
return false
-
}
-
-
const { password_hash, salt } = result[0]
-
const hash = hashPassword(password, salt as string)
-
return hash === password_hash
-
} catch (error) {
-
console.error('Failed to verify admin:', error)
-
return false
-
}
-
},
-
-
// Create session
-
createSession(username: string): string {
-
const sessionId = generateSessionId()
-
const expiresAt = new Date(Date.now() + SESSION_DURATION)
-
-
sessions.set(sessionId, {
-
sessionId,
-
username,
-
expiresAt
-
})
-
-
// Clean up expired sessions
-
this.cleanupSessions()
-
-
return sessionId
-
},
-
-
// Verify session
-
verifySession(sessionId: string): AdminSession | null {
-
const session = sessions.get(sessionId)
-
-
if (!session) {
-
return null
-
}
-
-
if (session.expiresAt.getTime() < Date.now()) {
-
sessions.delete(sessionId)
-
return null
-
}
-
-
return session
-
},
-
-
// Delete session
-
deleteSession(sessionId: string) {
-
sessions.delete(sessionId)
-
},
-
-
// Cleanup expired sessions
-
cleanupSessions() {
-
const now = Date.now()
-
for (const [sessionId, session] of sessions.entries()) {
-
if (session.expiresAt.getTime() < now) {
-
sessions.delete(sessionId)
-
}
-
}
-
}
-
}
-
-
// Ensure an admin user exists on startup (auto-creates one with a random password if missing)
-
export async function promptAdminSetup() {
-
await adminAuth.init()
-
-
const hasAdmin = await adminAuth.hasAdmin()
-
if (hasAdmin) {
-
return
-
}
-
-
// Skip prompt if SKIP_ADMIN_SETUP is set
-
if (process.env.SKIP_ADMIN_SETUP === 'true') {
-
console.log('\nโ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
-
console.log('โ•‘ ADMIN SETUP REQUIRED โ•‘')
-
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
-
console.log('No admin user found.')
-
console.log('Create one with: bun run scripts/create-admin.ts\n')
-
return
-
}
-
-
console.log('\n===========================================')
-
console.log(' ADMIN SETUP REQUIRED')
-
console.log('===========================================\n')
-
console.log('No admin user found. Creating one automatically...\n')
-
-
// Auto-generate admin credentials with random password
-
const username = 'admin'
-
const password = generatePassword(20)
-
-
await adminAuth.createAdmin(username, password)
-
-
console.log('โ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—')
-
console.log('โ•‘ ADMIN USER CREATED SUCCESSFULLY โ•‘')
-
console.log('โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n')
-
console.log(`Username: ${username}`)
-
console.log(`Password: ${password}`)
-
console.log('\nโš ๏ธ IMPORTANT: Save this password securely!')
-
console.log('This password will not be shown again.\n')
-
console.log('Change it with: bun run scripts/change-admin-password.ts admin NEW_PASSWORD\n')
-
}
-
-
// Elysia middleware to protect admin routes
-
export function requireAdmin({ cookie, set }: any) {
-
const sessionId = cookie.admin_session?.value
-
-
if (!sessionId) {
-
set.status = 401
-
return { error: 'Unauthorized' }
-
}
-
-
const session = adminAuth.verifySession(sessionId)
-
if (!session) {
-
set.status = 401
-
return { error: 'Unauthorized' }
-
}
-
-
// Session is valid, continue
-
return
-
}
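One refinement worth noting for `adminAuth.verify` above: comparing hex digests with `===` short-circuits on the first differing character, which can leak timing information. A sketch of the same scheme (`sha256(password + salt)`, hex-encoded) with a constant-time comparison — `verifyHash` is my name for the comparison step, not a function in the codebase:

```typescript
import { createHash, timingSafeEqual } from 'crypto'

// Same hash scheme as adminAuth: sha256 over password + salt, hex digest.
function hashPassword(password: string, salt: string): string {
	return createHash('sha256').update(password + salt).digest('hex')
}

// Sketch: constant-time digest comparison. timingSafeEqual throws on
// unequal lengths, so the length check guards the call.
function verifyHash(password: string, salt: string, storedHex: string): boolean {
	const candidate = Buffer.from(hashPassword(password, salt), 'hex')
	const stored = Buffer.from(storedHex, 'hex')
	return candidate.length === stored.length && timingSafeEqual(candidate, stored)
}
```

Since both sides are SHA-256 digests the lengths always match in practice; the guard only matters if the stored value is malformed.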
-4
src/lib/constants.ts
···
-
export const BASE_HOST = Bun.env.BASE_DOMAIN || "wisp.place";
-
export const MAX_SITE_SIZE = 300 * 1024 * 1024; // 300 MB
-
export const MAX_FILE_SIZE = 100 * 1024 * 1024; // 100 MB
-
export const MAX_FILE_COUNT = 2000;
-81
src/lib/csrf.test.ts
···
-
import { describe, test, expect } from 'bun:test'
-
import { verifyRequestOrigin } from './csrf'
-
-
describe('verifyRequestOrigin', () => {
-
test('should accept matching origin and host', () => {
-
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
-
expect(verifyRequestOrigin('http://localhost:8000', ['localhost:8000'])).toBe(true)
-
expect(verifyRequestOrigin('https://app.example.com', ['app.example.com'])).toBe(true)
-
})
-
-
test('should accept origin matching one of multiple allowed hosts', () => {
-
const allowedHosts = ['example.com', 'app.example.com', 'localhost:8000']
-
expect(verifyRequestOrigin('https://example.com', allowedHosts)).toBe(true)
-
expect(verifyRequestOrigin('https://app.example.com', allowedHosts)).toBe(true)
-
expect(verifyRequestOrigin('http://localhost:8000', allowedHosts)).toBe(true)
-
})
-
-
test('should reject non-matching origin', () => {
-
expect(verifyRequestOrigin('https://evil.com', ['example.com'])).toBe(false)
-
expect(verifyRequestOrigin('https://fake-example.com', ['example.com'])).toBe(false)
-
expect(verifyRequestOrigin('https://example.com.evil.com', ['example.com'])).toBe(false)
-
})
-
-
test('should reject empty origin', () => {
-
expect(verifyRequestOrigin('', ['example.com'])).toBe(false)
-
})
-
-
test('should reject invalid URL format', () => {
-
expect(verifyRequestOrigin('not-a-url', ['example.com'])).toBe(false)
-
expect(verifyRequestOrigin('javascript:alert(1)', ['example.com'])).toBe(false)
-
expect(verifyRequestOrigin('file:///etc/passwd', ['example.com'])).toBe(false)
-
})
-
-
test('should handle different protocols correctly', () => {
-
// Same host, different protocols should match (we only check host)
-
expect(verifyRequestOrigin('http://example.com', ['example.com'])).toBe(true)
-
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
-
})
-
-
test('should handle port numbers correctly', () => {
-
expect(verifyRequestOrigin('http://localhost:3000', ['localhost:3000'])).toBe(true)
-
expect(verifyRequestOrigin('http://localhost:3000', ['localhost:8000'])).toBe(false)
-
expect(verifyRequestOrigin('http://localhost', ['localhost'])).toBe(true)
-
})
-
-
test('should handle subdomains correctly', () => {
-
expect(verifyRequestOrigin('https://sub.example.com', ['sub.example.com'])).toBe(true)
-
expect(verifyRequestOrigin('https://sub.example.com', ['example.com'])).toBe(false)
-
})
-
-
test('should handle case sensitivity (exact match required)', () => {
-
// URL host is automatically lowercased by URL parser
-
expect(verifyRequestOrigin('https://EXAMPLE.COM', ['example.com'])).toBe(true)
-
expect(verifyRequestOrigin('https://example.com', ['example.com'])).toBe(true)
-
// But allowed hosts are case-sensitive
-
expect(verifyRequestOrigin('https://example.com', ['EXAMPLE.COM'])).toBe(false)
-
})
-
-
test('should handle trailing slashes in origin', () => {
-
expect(verifyRequestOrigin('https://example.com/', ['example.com'])).toBe(true)
-
})
-
-
test('should handle paths in origin (host extraction)', () => {
-
expect(verifyRequestOrigin('https://example.com/path/to/page', ['example.com'])).toBe(true)
-
expect(verifyRequestOrigin('https://evil.com/example.com', ['example.com'])).toBe(false)
-
})
-
-
test('should reject when allowed hosts is empty', () => {
-
expect(verifyRequestOrigin('https://example.com', [])).toBe(false)
-
})
-
-
test('should handle IPv4 addresses', () => {
-
expect(verifyRequestOrigin('http://127.0.0.1:8000', ['127.0.0.1:8000'])).toBe(true)
-
expect(verifyRequestOrigin('http://192.168.1.1', ['192.168.1.1'])).toBe(true)
-
})
-
-
test('should handle IPv6 addresses', () => {
-
expect(verifyRequestOrigin('http://[::1]:8000', ['[::1]:8000'])).toBe(true)
-
expect(verifyRequestOrigin('http://[2001:db8::1]', ['[2001:db8::1]'])).toBe(true)
-
})
-
})
-80
src/lib/csrf.ts
···
-
import { Elysia } from 'elysia'
-
import { logger } from './logger'
-
-
/**
-
* CSRF Protection using Origin/Host header verification
-
* Based on Lucia's recommended approach for cookie-based authentication
-
*
-
* This validates that the Origin header matches the Host header for
-
* state-changing requests (POST, PUT, DELETE, PATCH).
-
*/
-
-
/**
-
* Verify that the request origin matches the expected host
-
* @param origin - The Origin header value
-
* @param allowedHosts - Array of allowed host values
-
* @returns true if origin is valid, false otherwise
-
*/
-
export function verifyRequestOrigin(origin: string, allowedHosts: string[]): boolean {
-
if (!origin) {
-
return false
-
}
-
-
try {
-
const originUrl = new URL(origin)
-
const originHost = originUrl.host
-
-
return allowedHosts.some(host => originHost === host)
-
} catch {
-
// Invalid URL
-
return false
-
}
-
}
-
-
/**
-
* CSRF Protection Middleware for Elysia
-
*
-
* Validates Origin header against Host header for non-GET requests
-
* to prevent CSRF attacks when using cookie-based authentication.
-
*
-
* Usage:
-
* ```ts
-
* import { csrfProtection } from './lib/csrf'
-
*
-
* new Elysia()
-
* .use(csrfProtection())
-
* .post('/api/protected', handler)
-
* ```
-
*/
-
export const csrfProtection = () => {
-
return new Elysia({ name: 'csrf-protection' })
-
.onBeforeHandle(({ request, set }) => {
-
const method = request.method.toUpperCase()
-
-
// Only protect state-changing methods
-
if (['GET', 'HEAD', 'OPTIONS'].includes(method)) {
-
return
-
}
-
-
// Get headers
-
const originHeader = request.headers.get('Origin')
-
// Use X-Forwarded-Host if behind a proxy, otherwise use Host
-
const hostHeader = request.headers.get('X-Forwarded-Host') || request.headers.get('Host')
-
-
// Validate origin matches host
-
if (!originHeader || !hostHeader || !verifyRequestOrigin(originHeader, [hostHeader])) {
-
logger.warn('[CSRF] Request blocked', {
-
method,
-
origin: originHeader,
-
host: hostHeader,
-
path: new URL(request.url).pathname
-
})
-
-
set.status = 403
-
return {
-
error: 'CSRF validation failed',
-
message: 'Request origin does not match host'
-
}
-
}
-
})
-
}
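For reference, the origin check in the deleted middleware reduces to a pure host comparison; a standalone sketch with the same logic, no Elysia dependency:

```typescript
// Sketch of verifyRequestOrigin: parse the Origin header and compare its
// host (hostname + optional port) against an allowlist of hosts.
function verifyRequestOrigin(origin: string, allowedHosts: string[]): boolean {
    if (!origin) return false
    try {
        const originHost = new URL(origin).host
        return allowedHosts.some(host => originHost === host)
    } catch {
        // Malformed Origin header -> reject
        return false
    }
}

console.log(verifyRequestOrigin('http://127.0.0.1:8000', ['127.0.0.1:8000'])) // true
console.log(verifyRequestOrigin('not-a-url', ['wisp.place'])) // false
```

Because `URL.host` includes the port and bracketed IPv6 literals, the same comparison covers the IPv6 cases exercised in the tests above.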
-580
src/lib/db.ts
···
-
import { NodeOAuthClient, type ClientMetadata } from "@atproto/oauth-client-node";
-
import { SQL } from "bun";
-
import { JoseKey } from "@atproto/jwk-jose";
-
import { BASE_HOST } from "./constants";
-
-
export const db = new SQL(
-
process.env.NODE_ENV === 'production'
-
? process.env.DATABASE_URL || (() => {
-
throw new Error('DATABASE_URL environment variable is required in production');
-
})()
-
: process.env.DATABASE_URL || "postgres://postgres:postgres@localhost:5432/wisp"
-
);
-
-
await db`
-
CREATE TABLE IF NOT EXISTS oauth_states (
-
key TEXT PRIMARY KEY,
-
data TEXT NOT NULL,
-
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
-
)
-
`;
-
-
await db`
-
CREATE TABLE IF NOT EXISTS oauth_sessions (
-
sub TEXT PRIMARY KEY,
-
data TEXT NOT NULL,
-
updated_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
-
expires_at BIGINT NOT NULL DEFAULT EXTRACT(EPOCH FROM NOW()) + 2592000
-
)
-
`;
-
-
await db`
-
CREATE TABLE IF NOT EXISTS oauth_keys (
-
kid TEXT PRIMARY KEY,
-
jwk TEXT NOT NULL,
-
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
-
)
-
`;
-
-
// Domains table maps subdomain -> DID
-
await db`
-
CREATE TABLE IF NOT EXISTS domains (
-
domain TEXT PRIMARY KEY,
-
did TEXT UNIQUE NOT NULL,
-
rkey TEXT,
-
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
-
)
-
`;
-
-
// Add columns if they don't exist (for existing databases)
-
try {
-
await db`ALTER TABLE domains ADD COLUMN IF NOT EXISTS rkey TEXT`;
-
} catch (err) {
-
// Column might already exist, ignore
-
}
-
-
try {
-
await db`ALTER TABLE oauth_sessions ADD COLUMN IF NOT EXISTS expires_at BIGINT NOT NULL DEFAULT EXTRACT(EPOCH FROM NOW()) + 2592000`;
-
} catch (err) {
-
// Column might already exist, ignore
-
}
-
-
try {
-
await db`ALTER TABLE oauth_keys ADD COLUMN IF NOT EXISTS created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())`;
-
} catch (err) {
-
// Column might already exist, ignore
-
}
-
-
try {
-
await db`ALTER TABLE oauth_states ADD COLUMN IF NOT EXISTS expires_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()) + 3600`;
-
} catch (err) {
-
// Column might already exist, ignore
-
}
-
-
// Custom domains table for BYOD (bring your own domain)
-
await db`
-
CREATE TABLE IF NOT EXISTS custom_domains (
-
id TEXT PRIMARY KEY,
-
domain TEXT UNIQUE NOT NULL,
-
did TEXT NOT NULL,
-
rkey TEXT,
-
verified BOOLEAN DEFAULT false,
-
last_verified_at BIGINT,
-
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW())
-
)
-
`;
-
-
// Migrate existing tables to make rkey nullable and remove default
-
try {
-
await db`ALTER TABLE custom_domains ALTER COLUMN rkey DROP NOT NULL`;
-
} catch (err) {
-
// Column might already be nullable, ignore
-
}
-
try {
-
await db`ALTER TABLE custom_domains ALTER COLUMN rkey DROP DEFAULT`;
-
} catch (err) {
-
// Default might already be removed, ignore
-
}
-
-
// Sites table - cache of place.wisp.fs records from PDS
-
await db`
-
CREATE TABLE IF NOT EXISTS sites (
-
did TEXT NOT NULL,
-
rkey TEXT NOT NULL,
-
display_name TEXT,
-
created_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
-
updated_at BIGINT DEFAULT EXTRACT(EPOCH FROM NOW()),
-
PRIMARY KEY (did, rkey)
-
)
-
`;
-
-
const RESERVED_HANDLES = new Set([
-
"www",
-
"api",
-
"admin",
-
"static",
-
"public",
-
"preview"
-
]);
-
-
export const isValidHandle = (handle: string): boolean => {
-
const h = handle.trim().toLowerCase();
-
if (h.length < 3 || h.length > 63) return false;
-
if (!/^[a-z0-9-]+$/.test(h)) return false;
-
if (h.startsWith('-') || h.endsWith('-')) return false;
-
if (h.includes('--')) return false;
-
if (RESERVED_HANDLES.has(h)) return false;
-
return true;
-
};
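These handle rules are easy to exercise in isolation; a self-contained copy showing what they accept and reject:

```typescript
// Same rules as isValidHandle: 3-63 chars, lowercase alphanumerics and
// hyphens, no leading/trailing/double hyphen, no reserved subdomains.
const RESERVED = new Set(['www', 'api', 'admin', 'static', 'public', 'preview'])

const isValidHandleSketch = (handle: string): boolean => {
    const h = handle.trim().toLowerCase()
    if (h.length < 3 || h.length > 63) return false
    if (!/^[a-z0-9-]+$/.test(h)) return false
    if (h.startsWith('-') || h.endsWith('-')) return false
    if (h.includes('--')) return false
    if (RESERVED.has(h)) return false
    return true
}

console.log(isValidHandleSketch('My-Site')) // true (normalized to lowercase)
console.log(isValidHandleSketch('api'))     // false (reserved)
console.log(isValidHandleSketch('a--b'))    // false (double hyphen)
```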
-
-
export const toDomain = (handle: string): string => `${handle.toLowerCase()}.${BASE_HOST}`;
-
-
export const getDomainByDid = async (did: string): Promise<string | null> => {
-
const rows = await db`SELECT domain FROM domains WHERE did = ${did}`;
-
return rows[0]?.domain ?? null;
-
};
-
-
export const getWispDomainInfo = async (did: string) => {
-
const rows = await db`SELECT domain, rkey FROM domains WHERE did = ${did}`;
-
return rows[0] ?? null;
-
};
-
-
export const getDidByDomain = async (domain: string): Promise<string | null> => {
-
const rows = await db`SELECT did FROM domains WHERE domain = ${domain.toLowerCase()}`;
-
return rows[0]?.did ?? null;
-
};
-
-
export const isDomainAvailable = async (handle: string): Promise<boolean> => {
-
const h = handle.trim().toLowerCase();
-
if (!isValidHandle(h)) return false;
-
const domain = toDomain(h);
-
const rows = await db`SELECT 1 FROM domains WHERE domain = ${domain} LIMIT 1`;
-
return rows.length === 0;
-
};
-
-
export const isDomainRegistered = async (domain: string) => {
-
const domainLower = domain.toLowerCase().trim();
-
-
// Check wisp.place subdomains
-
const wispDomain = await db`
-
SELECT did, domain, rkey FROM domains WHERE domain = ${domainLower}
-
`;
-
-
if (wispDomain.length > 0) {
-
return {
-
registered: true,
-
type: 'wisp' as const,
-
domain: wispDomain[0].domain,
-
did: wispDomain[0].did,
-
rkey: wispDomain[0].rkey
-
};
-
}
-
-
// Check custom domains
-
const customDomain = await db`
-
SELECT id, domain, did, rkey, verified FROM custom_domains WHERE domain = ${domainLower}
-
`;
-
-
if (customDomain.length > 0) {
-
return {
-
registered: true,
-
type: 'custom' as const,
-
domain: customDomain[0].domain,
-
did: customDomain[0].did,
-
rkey: customDomain[0].rkey,
-
verified: customDomain[0].verified
-
};
-
}
-
-
return { registered: false };
-
};
-
-
export const claimDomain = async (did: string, handle: string): Promise<string> => {
-
const h = handle.trim().toLowerCase();
-
if (!isValidHandle(h)) throw new Error('invalid_handle');
-
const domain = toDomain(h);
-
try {
-
await db`
-
INSERT INTO domains (domain, did)
-
VALUES (${domain}, ${did})
-
`;
-
} catch (err) {
-
// Unique constraint violations -> already taken or DID already claimed
-
throw new Error('conflict');
-
}
-
return domain;
-
};
-
-
export const updateDomain = async (did: string, handle: string): Promise<string> => {
-
const h = handle.trim().toLowerCase();
-
if (!isValidHandle(h)) throw new Error('invalid_handle');
-
const domain = toDomain(h);
-
try {
-
const rows = await db`
-
UPDATE domains SET domain = ${domain}
-
WHERE did = ${did}
-
RETURNING domain
-
`;
-
if (rows.length > 0) return rows[0].domain as string;
-
// No existing row, behave like claim
-
return await claimDomain(did, handle);
-
} catch (err) {
-
// Unique constraint violations -> already taken by someone else
-
throw new Error('conflict');
-
}
-
};
-
-
export const updateWispDomainSite = async (did: string, siteRkey: string | null): Promise<void> => {
-
await db`
-
UPDATE domains
-
SET rkey = ${siteRkey}
-
WHERE did = ${did}
-
`;
-
};
-
-
export const getWispDomainSite = async (did: string): Promise<string | null> => {
-
const rows = await db`SELECT rkey FROM domains WHERE did = ${did}`;
-
return rows[0]?.rkey ?? null;
-
};
-
-
// Session timeout configuration (30 days in seconds)
-
const SESSION_TIMEOUT = 30 * 24 * 60 * 60; // 2592000 seconds
-
// OAuth state timeout (1 hour in seconds)
-
const STATE_TIMEOUT = 60 * 60; // 3600 seconds
-
-
const stateStore = {
-
async set(key: string, data: any) {
-
console.debug('[stateStore] set', key)
-
const expiresAt = Math.floor(Date.now() / 1000) + STATE_TIMEOUT;
-
await db`
-
INSERT INTO oauth_states (key, data, created_at, expires_at)
-
VALUES (${key}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
-
ON CONFLICT (key) DO UPDATE SET data = EXCLUDED.data, expires_at = ${expiresAt}
-
`;
-
},
-
async get(key: string) {
-
console.debug('[stateStore] get', key)
-
const now = Math.floor(Date.now() / 1000);
-
const result = await db`
-
SELECT data, expires_at
-
FROM oauth_states
-
WHERE key = ${key}
-
`;
-
if (!result[0]) return undefined;
-
-
// Check if expired
-
const expiresAt = Number(result[0].expires_at);
-
if (expiresAt && now > expiresAt) {
-
console.debug('[stateStore] State expired, deleting', key);
-
await db`DELETE FROM oauth_states WHERE key = ${key}`;
-
return undefined;
-
}
-
-
return JSON.parse(result[0].data);
-
},
-
async del(key: string) {
-
console.debug('[stateStore] del', key)
-
await db`DELETE FROM oauth_states WHERE key = ${key}`;
-
}
-
};
-
-
const sessionStore = {
-
async set(sub: string, data: any) {
-
console.debug('[sessionStore] set', sub)
-
const expiresAt = Math.floor(Date.now() / 1000) + SESSION_TIMEOUT;
-
await db`
-
INSERT INTO oauth_sessions (sub, data, updated_at, expires_at)
-
VALUES (${sub}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
-
ON CONFLICT (sub) DO UPDATE SET
-
data = EXCLUDED.data,
-
updated_at = EXTRACT(EPOCH FROM NOW()),
-
expires_at = ${expiresAt}
-
`;
-
},
-
async get(sub: string) {
-
console.debug('[sessionStore] get', sub)
-
const now = Math.floor(Date.now() / 1000);
-
const result = await db`
-
SELECT data, expires_at
-
FROM oauth_sessions
-
WHERE sub = ${sub}
-
`;
-
if (!result[0]) return undefined;
-
-
// Check if expired
-
const expiresAt = Number(result[0].expires_at);
-
if (expiresAt && now > expiresAt) {
-
console.log('[sessionStore] Session expired, deleting', sub);
-
await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
-
return undefined;
-
}
-
-
return JSON.parse(result[0].data);
-
},
-
async del(sub: string) {
-
console.debug('[sessionStore] del', sub)
-
await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
-
}
-
};
-
-
export { sessionStore };
-
-
// Cleanup expired sessions and states
-
export const cleanupExpiredSessions = async () => {
-
const now = Math.floor(Date.now() / 1000);
-
try {
-
const sessionsDeleted = await db`
-
DELETE FROM oauth_sessions WHERE expires_at < ${now}
-
`;
-
const statesDeleted = await db`
-
DELETE FROM oauth_states WHERE expires_at IS NOT NULL AND expires_at < ${now}
-
`;
-
console.log(`[Cleanup] Deleted ${sessionsDeleted.length} expired sessions and ${statesDeleted.length} expired states`);
-
return { sessions: sessionsDeleted.length, states: statesDeleted.length };
-
} catch (err) {
-
console.error('[Cleanup] Failed to cleanup expired data:', err);
-
return { sessions: 0, states: 0 };
-
}
-
};
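Both stores and the cleanup job above key expiry off epoch seconds; the check itself reduces to a small pure function:

```typescript
// Expiry bookkeeping in epoch seconds, matching the stores above.
const SESSION_TIMEOUT = 30 * 24 * 60 * 60 // 30 days = 2592000 s
const STATE_TIMEOUT = 60 * 60             // 1 hour  = 3600 s

const nowSeconds = (): number => Math.floor(Date.now() / 1000)

// A row is expired once the current time passes its expires_at.
// A zero/missing expires_at is treated as "never expires", as above.
function isExpired(expiresAt: number, now: number = nowSeconds()): boolean {
    return Boolean(expiresAt) && now > expiresAt
}

const createdAt = 1_700_000_000
console.log(isExpired(createdAt + STATE_TIMEOUT, createdAt + 10))   // false
console.log(isExpired(createdAt + STATE_TIMEOUT, createdAt + 7200)) // true
```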
-
-
export const createClientMetadata = (config: { domain: `http://${string}` | `https://${string}`, clientName: string }): ClientMetadata => {
-
const isLocalDev = process.env.LOCAL_DEV === 'true';
-
-
if (isLocalDev) {
-
// Loopback client for local development
-
// For loopback, scopes and redirect_uri must be in client_id query string
-
const redirectUri = 'http://127.0.0.1:8000/api/auth/callback';
-
const scope = 'atproto transition:generic';
-
const params = new URLSearchParams();
-
params.append('redirect_uri', redirectUri);
-
params.append('scope', scope);
-
-
return {
-
client_id: `http://localhost?${params.toString()}`,
-
client_name: config.clientName,
-
client_uri: config.domain,
-
redirect_uris: [redirectUri],
-
grant_types: ['authorization_code', 'refresh_token'],
-
response_types: ['code'],
-
application_type: 'web',
-
token_endpoint_auth_method: 'none',
-
scope: scope,
-
dpop_bound_access_tokens: false,
-
subject_type: 'public'
-
};
-
}
-
-
// Production client with private_key_jwt
-
return {
-
client_id: `${config.domain}/client-metadata.json`,
-
client_name: config.clientName,
-
client_uri: config.domain,
-
logo_uri: `${config.domain}/logo.png`,
-
tos_uri: `${config.domain}/tos`,
-
policy_uri: `${config.domain}/policy`,
-
redirect_uris: [`${config.domain}/api/auth/callback`],
-
grant_types: ['authorization_code', 'refresh_token'],
-
response_types: ['code'],
-
application_type: 'web',
-
token_endpoint_auth_method: 'private_key_jwt',
-
token_endpoint_auth_signing_alg: "ES256",
-
scope: "atproto transition:generic",
-
dpop_bound_access_tokens: true,
-
jwks_uri: `${config.domain}/jwks.json`,
-
subject_type: 'public',
-
authorization_signed_response_alg: 'ES256'
-
};
-
};
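For the local-dev branch above, the loopback client_id carries the redirect URI and scope as URL-encoded query parameters; the construction is just URLSearchParams:

```typescript
// Build the loopback client_id the local-dev branch above produces.
const redirectUri = 'http://127.0.0.1:8000/api/auth/callback'
const scope = 'atproto transition:generic'

const params = new URLSearchParams()
params.append('redirect_uri', redirectUri)
params.append('scope', scope)

// The PDS parses these back out of the client_id query string.
const clientId = `http://localhost?${params.toString()}`
console.log(clientId)
```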
-
-
const persistKey = async (key: JoseKey) => {
-
const priv = key.privateJwk;
-
if (!priv) return;
-
const kid = key.kid ?? crypto.randomUUID();
-
await db`
-
INSERT INTO oauth_keys (kid, jwk, created_at)
-
VALUES (${kid}, ${JSON.stringify(priv)}, EXTRACT(EPOCH FROM NOW()))
-
ON CONFLICT (kid) DO UPDATE SET jwk = EXCLUDED.jwk
-
`;
-
};
-
-
const loadPersistedKeys = async (): Promise<JoseKey[]> => {
-
const rows = await db`SELECT kid, jwk, created_at FROM oauth_keys ORDER BY kid`;
-
const keys: JoseKey[] = [];
-
for (const row of rows) {
-
try {
-
const obj = JSON.parse(row.jwk);
-
const key = await JoseKey.fromImportable(obj as any, (obj as any).kid);
-
keys.push(key);
-
} catch (err) {
-
console.error('Could not parse stored JWK', err);
-
}
-
}
-
return keys;
-
};
-
-
const ensureKeys = async (): Promise<JoseKey[]> => {
-
let keys = await loadPersistedKeys();
-
const needed: string[] = [];
-
for (let i = 1; i <= 3; i++) {
-
const kid = `key${i}`;
-
if (!keys.some(k => k.kid === kid)) needed.push(kid);
-
}
-
for (const kid of needed) {
-
const newKey = await JoseKey.generate(['ES256'], kid);
-
await persistKey(newKey);
-
keys.push(newKey);
-
}
-
keys.sort((a, b) => (a.kid ?? '').localeCompare(b.kid ?? ''));
-
return keys;
-
};
-
-
// Load keys from database every time (stateless - safe for horizontal scaling)
-
export const getCurrentKeys = async (): Promise<JoseKey[]> => {
-
return await loadPersistedKeys();
-
};
-
-
// Key rotation - rotate keys older than 30 days (monthly rotation)
-
const KEY_MAX_AGE = 30 * 24 * 60 * 60; // 30 days in seconds
-
-
export const rotateKeysIfNeeded = async (): Promise<boolean> => {
-
const now = Math.floor(Date.now() / 1000);
-
const cutoffTime = now - KEY_MAX_AGE;
-
-
try {
-
// Find keys older than 30 days
-
const oldKeys = await db`
-
SELECT kid, created_at FROM oauth_keys
-
WHERE created_at IS NOT NULL AND created_at < ${cutoffTime}
-
ORDER BY created_at ASC
-
`;
-
-
if (oldKeys.length === 0) {
-
console.log('[KeyRotation] No keys need rotation');
-
return false;
-
}
-
-
console.log(`[KeyRotation] Found ${oldKeys.length} key(s) older than 30 days, rotating oldest key`);
-
-
// Rotate the oldest key
-
const oldestKey = oldKeys[0];
-
const oldKid = oldestKey.kid;
-
-
// Generate new key with same kid
-
const newKey = await JoseKey.generate(['ES256'], oldKid);
-
await persistKey(newKey);
-
-
console.log(`[KeyRotation] Rotated key ${oldKid}`);
-
-
return true;
-
} catch (err) {
-
console.error('[KeyRotation] Failed to rotate keys:', err);
-
return false;
-
}
-
};
-
-
export const getOAuthClient = async (config: { domain: `http://${string}` | `https://${string}`, clientName: string }) => {
-
const keys = await ensureKeys();
-
-
return new NodeOAuthClient({
-
clientMetadata: createClientMetadata(config),
-
keyset: keys,
-
stateStore,
-
sessionStore
-
});
-
};
-
-
export const getCustomDomainsByDid = async (did: string) => {
-
const rows = await db`SELECT * FROM custom_domains WHERE did = ${did} ORDER BY created_at DESC`;
-
return rows;
-
};
-
-
export const getCustomDomainInfo = async (domain: string) => {
-
const rows = await db`SELECT * FROM custom_domains WHERE domain = ${domain.toLowerCase()}`;
-
return rows[0] ?? null;
-
};
-
-
export const getCustomDomainByHash = async (hash: string) => {
-
const rows = await db`SELECT * FROM custom_domains WHERE id = ${hash}`;
-
return rows[0] ?? null;
-
};
-
-
export const getCustomDomainById = async (id: string) => {
-
const rows = await db`SELECT * FROM custom_domains WHERE id = ${id}`;
-
return rows[0] ?? null;
-
};
-
-
export const claimCustomDomain = async (did: string, domain: string, hash: string, rkey: string | null = null) => {
-
const domainLower = domain.toLowerCase();
-
try {
-
await db`
-
INSERT INTO custom_domains (id, domain, did, rkey, verified, created_at)
-
VALUES (${hash}, ${domainLower}, ${did}, ${rkey}, false, EXTRACT(EPOCH FROM NOW()))
-
`;
-
return { success: true, hash };
-
} catch (err) {
-
console.error('Failed to claim custom domain', err);
-
throw new Error('conflict');
-
}
-
};
-
-
export const updateCustomDomainRkey = async (id: string, rkey: string | null) => {
-
const rows = await db`
-
UPDATE custom_domains
-
SET rkey = ${rkey}
-
WHERE id = ${id}
-
RETURNING *
-
`;
-
return rows[0] ?? null;
-
};
-
-
export const updateCustomDomainVerification = async (id: string, verified: boolean) => {
-
const rows = await db`
-
UPDATE custom_domains
-
SET verified = ${verified}, last_verified_at = EXTRACT(EPOCH FROM NOW())
-
WHERE id = ${id}
-
RETURNING *
-
`;
-
return rows[0] ?? null;
-
};
-
-
export const deleteCustomDomain = async (id: string) => {
-
await db`DELETE FROM custom_domains WHERE id = ${id}`;
-
};
-
-
export const getSitesByDid = async (did: string) => {
-
const rows = await db`SELECT * FROM sites WHERE did = ${did} ORDER BY created_at DESC`;
-
return rows;
-
};
-
-
export const upsertSite = async (did: string, rkey: string, displayName?: string) => {
-
try {
-
// Only set display_name if provided (not undefined/null/empty)
-
const cleanDisplayName = displayName && displayName.trim() ? displayName.trim() : null;
-
-
await db`
-
INSERT INTO sites (did, rkey, display_name, created_at, updated_at)
-
VALUES (${did}, ${rkey}, ${cleanDisplayName}, EXTRACT(EPOCH FROM NOW()), EXTRACT(EPOCH FROM NOW()))
-
ON CONFLICT (did, rkey)
-
DO UPDATE SET
-
display_name = CASE
-
WHEN EXCLUDED.display_name IS NOT NULL THEN EXCLUDED.display_name
-
ELSE sites.display_name
-
END,
-
updated_at = EXTRACT(EPOCH FROM NOW())
-
`;
-
return { success: true };
-
} catch (err) {
-
console.error('Failed to upsert site', err);
-
return { success: false, error: err };
-
}
-
};
-
-
export const deleteSite = async (did: string, rkey: string) => {
-
try {
-
await db`DELETE FROM sites WHERE did = ${did} AND rkey = ${rkey}`;
-
return { success: true };
-
} catch (err) {
-
console.error('Failed to delete site', err);
-
return { success: false, error: err };
-
}
-
};
-190
src/lib/dns-verification-worker.ts
···
-
import { verifyCustomDomain } from './dns-verify';
-
import { db } from './db';
-
-
interface VerificationStats {
-
totalChecked: number;
-
verified: number;
-
failed: number;
-
errors: number;
-
}
-
-
export class DNSVerificationWorker {
-
private interval: Timer | null = null;
-
private isRunning = false;
-
private lastRunTime: number | null = null;
-
private stats: VerificationStats = {
-
totalChecked: 0,
-
verified: 0,
-
failed: 0,
-
errors: 0,
-
};
-
-
constructor(
-
private checkIntervalMs: number = 60 * 60 * 1000, // 1 hour default
-
private onLog?: (message: string, data?: any) => void
-
) {}
-
-
private log(message: string, data?: any) {
-
if (this.onLog) {
-
this.onLog(message, data);
-
}
-
}
-
-
async start() {
-
if (this.isRunning) {
-
this.log('DNS verification worker already running');
-
return;
-
}
-
-
this.isRunning = true;
-
this.log('Starting DNS verification worker', {
-
intervalMinutes: this.checkIntervalMs / 60000,
-
});
-
-
// Run immediately on start
-
await this.verifyAllDomains();
-
-
// Then run on interval
-
this.interval = setInterval(() => {
-
this.verifyAllDomains();
-
}, this.checkIntervalMs);
-
}
-
-
stop() {
-
if (this.interval) {
-
clearInterval(this.interval);
-
this.interval = null;
-
}
-
this.isRunning = false;
-
this.log('DNS verification worker stopped');
-
}
-
-
private async verifyAllDomains() {
-
this.log('Starting DNS verification check');
-
const startTime = Date.now();
-
-
const runStats: VerificationStats = {
-
totalChecked: 0,
-
verified: 0,
-
failed: 0,
-
errors: 0,
-
};
-
-
try {
-
// Get all custom domains (both verified and pending)
-
const domains = await db<Array<{
-
id: string;
-
domain: string;
-
did: string;
-
verified: boolean;
-
}>>`
-
SELECT id, domain, did, verified FROM custom_domains
-
`;
-
-
if (!domains || domains.length === 0) {
-
this.log('No custom domains to check');
-
this.lastRunTime = Date.now();
-
return;
-
}
-
-
const verifiedCount = domains.filter(d => d.verified).length;
-
const pendingCount = domains.filter(d => !d.verified).length;
-
this.log(`Checking ${domains.length} custom domains (${verifiedCount} verified, ${pendingCount} pending)`);
-
-
// Verify each domain
-
for (const row of domains) {
-
runStats.totalChecked++;
-
const { id, domain, did, verified: wasVerified } = row;
-
-
try {
-
// Extract hash from id (SHA256 of did:domain)
-
const expectedHash = id.substring(0, 16);
-
-
// Verify DNS records
-
const result = await verifyCustomDomain(domain, did, expectedHash);
-
-
if (result.verified) {
-
// Update verified status and last_verified_at timestamp
-
await db`
-
UPDATE custom_domains
-
SET verified = true,
-
last_verified_at = EXTRACT(EPOCH FROM NOW())
-
WHERE id = ${id}
-
`;
-
runStats.verified++;
-
if (!wasVerified) {
-
this.log(`Domain newly verified: ${domain}`, { did });
-
} else {
-
this.log(`Domain re-verified: ${domain}`, { did });
-
}
-
} else {
-
// Mark domain as unverified or keep it pending
-
await db`
-
UPDATE custom_domains
-
SET verified = false,
-
last_verified_at = EXTRACT(EPOCH FROM NOW())
-
WHERE id = ${id}
-
`;
-
runStats.failed++;
-
if (wasVerified) {
-
this.log(`Domain verification failed (was verified): ${domain}`, {
-
did,
-
error: result.error,
-
found: result.found,
-
});
-
} else {
-
this.log(`Domain still pending: ${domain}`, {
-
did,
-
error: result.error,
-
found: result.found,
-
});
-
}
-
}
-
} catch (error) {
-
runStats.errors++;
-
this.log(`Error verifying domain: ${domain}`, {
-
did,
-
error: error instanceof Error ? error.message : String(error),
-
});
-
}
-
}
-
-
// Update cumulative stats
-
this.stats.totalChecked += runStats.totalChecked;
-
this.stats.verified += runStats.verified;
-
this.stats.failed += runStats.failed;
-
this.stats.errors += runStats.errors;
-
-
const duration = Date.now() - startTime;
-
this.lastRunTime = Date.now();
-
-
this.log('DNS verification check completed', {
-
duration: `${duration}ms`,
-
...runStats,
-
});
-
} catch (error) {
-
this.log('Fatal error in DNS verification worker', {
-
error: error instanceof Error ? error.message : String(error),
-
});
-
}
-
}
-
-
getHealth() {
-
return {
-
isRunning: this.isRunning,
-
lastRunTime: this.lastRunTime,
-
intervalMs: this.checkIntervalMs,
-
stats: this.stats,
-
healthy: this.isRunning && (
-
this.lastRunTime === null ||
-
Date.now() - this.lastRunTime < this.checkIntervalMs * 2
-
),
-
};
-
}
-
-
// Manual trigger for testing
-
async trigger() {
-
this.log('Manual DNS verification triggered');
-
await this.verifyAllDomains();
-
}
-
}
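The getHealth heuristic above (worker running, and last run within twice the check interval) can be verified in isolation:

```typescript
// Same health rule as DNSVerificationWorker.getHealth above.
function workerHealthy(
    isRunning: boolean,
    lastRunTime: number | null,
    intervalMs: number,
    now: number = Date.now()
): boolean {
    return isRunning && (lastRunTime === null || now - lastRunTime < intervalMs * 2)
}

const hour = 60 * 60 * 1000
console.log(workerHealthy(true, Date.now(), hour))            // true: just ran
console.log(workerHealthy(true, Date.now() - 3 * hour, hour)) // false: missed two cycles
console.log(workerHealthy(false, Date.now(), hour))           // false: stopped
```

A `null` lastRunTime counts as healthy so the worker isn't flagged before its first run completes.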
-156
src/lib/dns-verify.ts
···
-
import { promises as dns } from 'dns'
-
-
/**
-
* Result of a domain verification process
-
*/
-
export interface VerificationResult {
-
/** Whether the verification was successful */
-
verified: boolean
-
/** Error message if verification failed */
-
error?: string
-
/** DNS records found during verification */
-
found?: {
-
/** TXT records found (used for domain verification) */
-
txt?: string[]
-
/** CNAME record found (used for domain pointing) */
-
cname?: string
-
}
-
}
-
-
/**
-
* Verify domain ownership via TXT record at _wisp.{domain}
-
* Expected format: did:plc:xxx or did:web:xxx
-
*/
-
export const verifyDomainOwnership = async (
-
domain: string,
-
expectedDid: string
-
): Promise<VerificationResult> => {
-
try {
-
const txtDomain = `_wisp.${domain}`
-
-
console.log(`[DNS Verify] Checking TXT record for ${txtDomain}`)
-
console.log(`[DNS Verify] Expected DID: ${expectedDid}`)
-
-
// Query TXT records
-
const records = await dns.resolveTxt(txtDomain)
-
-
// Log what we found
-
const foundTxtValues = records.map((record) => record.join(''))
-
console.log(`[DNS Verify] Found TXT records:`, foundTxtValues)
-
-
// TXT records come as arrays of strings (for multi-part records)
-
// We need to join them and check if any match the expected DID
-
for (const record of records) {
-
const txtValue = record.join('')
-
if (txtValue === expectedDid) {
-
console.log(`[DNS Verify] ✓ TXT record matches!`)
-
return { verified: true, found: { txt: foundTxtValues } }
-
}
-
}
-
-
console.log(`[DNS Verify] ✗ TXT record does not match`)
-
return {
-
verified: false,
-
error: `TXT record at ${txtDomain} does not match expected DID. Expected: ${expectedDid}`,
-
found: { txt: foundTxtValues }
-
}
-
} catch (err: any) {
-
console.log(`[DNS Verify] ✗ TXT lookup error:`, err.message)
-
if (err.code === 'ENOTFOUND' || err.code === 'ENODATA') {
-
return {
-
verified: false,
-
error: `No TXT record found at _wisp.${domain}`,
-
found: { txt: [] }
-
}
-
}
-
return {
-
verified: false,
-
error: `DNS lookup failed: ${err.message}`,
-
found: { txt: [] }
-
}
-
}
-
}
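Note the join above: dns.resolveTxt returns each TXT record as an array of string chunks (long records are split into pieces), so the chunks must be concatenated before comparing against the expected DID. A minimal illustration with hypothetical record data:

```typescript
// dns.resolveTxt() shape: string[][] — one inner array per TXT record,
// possibly split into multiple chunks. Joining restores the full value.
const records: string[][] = [
    ['did:plc:abcd', '1234'], // one record split into two chunks
    ['v=spf1 -all']           // an unrelated single-chunk record
]

const values = records.map(record => record.join(''))
console.log(values) // ['did:plc:abcd1234', 'v=spf1 -all']

const expectedDid = 'did:plc:abcd1234' // hypothetical DID
console.log(values.some(v => v === expectedDid)) // true
```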
-
-
/**
-
* Verify CNAME record points to the expected hash target
-
* For custom domains, we expect: domain CNAME -> {hash}.dns.wisp.place
-
*/
-
export const verifyCNAME = async (
-
domain: string,
-
expectedHash: string
-
): Promise<VerificationResult> => {
-
try {
-
console.log(`[DNS Verify] Checking CNAME record for ${domain}`)
-
const expectedTarget = `${expectedHash}.dns.wisp.place`
-
console.log(`[DNS Verify] Expected CNAME: ${expectedTarget}`)
-
-
// Resolve CNAME for the domain
-
const cname = await dns.resolveCname(domain)
-
-
// Log what we found
-
const foundCname =
-
cname.length > 0
-
? cname[0]?.toLowerCase().replace(/\.$/, '')
-
: null
-
console.log(`[DNS Verify] Found CNAME:`, foundCname || 'none')
-
-
if (cname.length === 0 || !foundCname) {
-
console.log(`[DNS Verify] ✗ No CNAME record found`)
-
return {
-
verified: false,
-
error: `No CNAME record found for ${domain}`,
-
found: { cname: '' }
-
}
-
}
-
-
// Check if CNAME points to the expected target
-
const actualTarget = foundCname
-
-
if (actualTarget === expectedTarget.toLowerCase()) {
-
console.log(`[DNS Verify] ✓ CNAME record matches!`)
-
return { verified: true, found: { cname: actualTarget } }
-
}
-
-
console.log(`[DNS Verify] ✗ CNAME record does not match`)
-
return {
-
verified: false,
-
error: `CNAME for ${domain} points to ${actualTarget}, expected ${expectedTarget}`,
-
found: { cname: actualTarget }
-
}
-
} catch (err: any) {
-
console.log(`[DNS Verify] ✗ CNAME lookup error:`, err.message)
-
if (err.code === 'ENOTFOUND' || err.code === 'ENODATA') {
-
return {
-
verified: false,
-
error: `No CNAME record found for ${domain}`,
-
found: { cname: '' }
-
}
-
}
-
return {
-
verified: false,
-
error: `DNS lookup failed: ${err.message}`,
-
found: { cname: '' }
-
}
-
}
-
}
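The comparison above normalizes the resolved CNAME by lowercasing and stripping any trailing root dot before matching it against the expected target; in isolation:

```typescript
// Normalize a CNAME target the way verifyCNAME above does:
// lowercase and drop a trailing root dot before comparing.
const normalizeCname = (target: string): string =>
    target.toLowerCase().replace(/\.$/, '')

// Hypothetical 16-char hash, as used for {hash}.dns.wisp.place targets.
const expected = 'a1b2c3d4e5f6a7b8.dns.wisp.place'
console.log(normalizeCname('A1B2C3D4E5F6A7B8.DNS.Wisp.Place.') === expected) // true
```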
-
-
/**
-
* Verify both TXT and CNAME records for a custom domain
-
*/
-
export const verifyCustomDomain = async (
-
domain: string,
-
expectedDid: string,
-
expectedHash: string
-
): Promise<VerificationResult> => {
-
const txtResult = await verifyDomainOwnership(domain, expectedDid)
-
if (!txtResult.verified) {
-
return txtResult
-
}
-
-
const cnameResult = await verifyCNAME(domain, expectedHash)
-
if (!cnameResult.verified) {
-
return cnameResult
-
}
-
-
return { verified: true }
-
}
-46
src/lib/logger.ts
···
-
// Secure logging utility - only verbose in development mode
-
const isDev = process.env.NODE_ENV !== 'production';
-
-
export const logger = {
-
// Always log these (safe for production)
-
info: (...args: any[]) => {
-
console.log(...args);
-
},
-
-
// Only log in development (may contain sensitive info)
-
debug: (...args: any[]) => {
-
if (isDev) {
-
console.debug(...args);
-
}
-
},
-
-
// Warning logging (always logged but may be sanitized in production)
-
warn: (message: string, context?: Record<string, any>) => {
-
if (isDev) {
-
console.warn(message, context);
-
} else {
-
console.warn(message);
-
}
-
},
-
-
// Safe error logging - sanitizes in production
-
error: (message: string, error?: any) => {
-
if (isDev) {
-
// Development: log full error details
-
console.error(message, error);
-
} else {
-
// Production: log only the message, not error details
-
console.error(message);
-
}
-
},
-
-
// Log error with context but sanitize sensitive data in production
-
errorWithContext: (message: string, context?: Record<string, any>, error?: any) => {
-
if (isDev) {
-
console.error(message, context, error);
-
} else {
-
// In production, only log the message
-
console.error(message);
-
}
-
}
-
};
-250
src/lib/oauth-client.ts
···
-
import { NodeOAuthClient, type ClientMetadata } from "@atproto/oauth-client-node";
-
import { JoseKey } from "@atproto/jwk-jose";
-
import { db } from "./db";
-
import { logger } from "./logger";
-
-
// Session timeout configuration (30 days in seconds)
-
const SESSION_TIMEOUT = 30 * 24 * 60 * 60; // 2592000 seconds
-
// OAuth state timeout (1 hour in seconds)
-
const STATE_TIMEOUT = 60 * 60; // 3600 seconds
-
-
const stateStore = {
-
async set(key: string, data: any) {
-
console.debug('[stateStore] set', key)
-
const expiresAt = Math.floor(Date.now() / 1000) + STATE_TIMEOUT;
-
await db`
-
INSERT INTO oauth_states (key, data, created_at, expires_at)
-
VALUES (${key}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
-
ON CONFLICT (key) DO UPDATE SET data = EXCLUDED.data, expires_at = ${expiresAt}
-
`;
-
},
-
async get(key: string) {
-
console.debug('[stateStore] get', key)
-
const now = Math.floor(Date.now() / 1000);
-
const result = await db`
-
SELECT data, expires_at
-
FROM oauth_states
-
WHERE key = ${key}
-
`;
-
if (!result[0]) return undefined;
-
-
// Check if expired
-
const expiresAt = Number(result[0].expires_at);
-
if (expiresAt && now > expiresAt) {
-
console.debug('[stateStore] State expired, deleting', key);
-
await db`DELETE FROM oauth_states WHERE key = ${key}`;
-
return undefined;
-
}
-
-
return JSON.parse(result[0].data);
-
},
-
async del(key: string) {
-
console.debug('[stateStore] del', key)
-
await db`DELETE FROM oauth_states WHERE key = ${key}`;
-
}
-
};
-
-
const sessionStore = {
-
-  async set(sub: string, data: any) {
-    console.debug('[sessionStore] set', sub)
-    const expiresAt = Math.floor(Date.now() / 1000) + SESSION_TIMEOUT;
-    await db`
-      INSERT INTO oauth_sessions (sub, data, updated_at, expires_at)
-      VALUES (${sub}, ${JSON.stringify(data)}, EXTRACT(EPOCH FROM NOW()), ${expiresAt})
-      ON CONFLICT (sub) DO UPDATE SET
-        data = EXCLUDED.data,
-        updated_at = EXTRACT(EPOCH FROM NOW()),
-        expires_at = ${expiresAt}
-    `;
-  },
-  async get(sub: string) {
-    console.debug('[sessionStore] get', sub)
-    const now = Math.floor(Date.now() / 1000);
-    const result = await db`
-      SELECT data, expires_at
-      FROM oauth_sessions
-      WHERE sub = ${sub}
-    `;
-    if (!result[0]) return undefined;
-
-    // Check if expired
-    const expiresAt = Number(result[0].expires_at);
-    if (expiresAt && now > expiresAt) {
-      logger.debug('[sessionStore] Session expired, deleting', sub);
-      await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
-      return undefined;
-    }
-
-    return JSON.parse(result[0].data);
-  },
-  async del(sub: string) {
-    console.debug('[sessionStore] del', sub)
-    await db`DELETE FROM oauth_sessions WHERE sub = ${sub}`;
-  }
-};
-
-export { sessionStore };
-
-// Cleanup expired sessions and states
-export const cleanupExpiredSessions = async () => {
-  const now = Math.floor(Date.now() / 1000);
-  try {
-    const sessionsDeleted = await db`
-      DELETE FROM oauth_sessions WHERE expires_at < ${now}
-    `;
-    const statesDeleted = await db`
-      DELETE FROM oauth_states WHERE expires_at IS NOT NULL AND expires_at < ${now}
-    `;
-    logger.info(`[Cleanup] Deleted ${sessionsDeleted.length} expired sessions and ${statesDeleted.length} expired states`);
-    return { sessions: sessionsDeleted.length, states: statesDeleted.length };
-  } catch (err) {
-    logger.error('[Cleanup] Failed to cleanup expired data', err);
-    return { sessions: 0, states: 0 };
-  }
-};
-
-export const createClientMetadata = (config: { domain: `http://${string}` | `https://${string}`, clientName: string }): ClientMetadata => {
-  const isLocalDev = Bun.env.LOCAL_DEV === 'true';
-
-  if (isLocalDev) {
-    // Loopback client for local development
-    // For loopback, scopes and redirect_uri must be in client_id query string
-    const redirectUri = 'http://127.0.0.1:8000/api/auth/callback';
-    const scope = 'atproto transition:generic';
-    const params = new URLSearchParams();
-    params.append('redirect_uri', redirectUri);
-    params.append('scope', scope);
-
-    return {
-      client_id: `http://localhost?${params.toString()}`,
-      client_name: config.clientName,
-      client_uri: `https://wisp.place`,
-      redirect_uris: [redirectUri],
-      grant_types: ['authorization_code', 'refresh_token'],
-      response_types: ['code'],
-      application_type: 'web',
-      token_endpoint_auth_method: 'none',
-      scope: scope,
-      dpop_bound_access_tokens: false,
-      subject_type: 'public'
-    };
-  }
-
-  // Production client with private_key_jwt
-  return {
-    client_id: `${config.domain}/client-metadata.json`,
-    client_name: config.clientName,
-    client_uri: `https://wisp.place`,
-    logo_uri: `${config.domain}/logo.png`,
-    tos_uri: `${config.domain}/tos`,
-    policy_uri: `${config.domain}/policy`,
-    redirect_uris: [`${config.domain}/api/auth/callback`],
-    grant_types: ['authorization_code', 'refresh_token'],
-    response_types: ['code'],
-    application_type: 'web',
-    token_endpoint_auth_method: 'private_key_jwt',
-    token_endpoint_auth_signing_alg: "ES256",
-    scope: "atproto transition:generic",
-    dpop_bound_access_tokens: true,
-    jwks_uri: `${config.domain}/jwks.json`,
-    subject_type: 'public',
-    authorization_signed_response_alg: 'ES256'
-  };
-};
-
-const persistKey = async (key: JoseKey) => {
-  const priv = key.privateJwk;
-  if (!priv) return;
-  const kid = key.kid ?? crypto.randomUUID();
-  await db`
-    INSERT INTO oauth_keys (kid, jwk, created_at)
-    VALUES (${kid}, ${JSON.stringify(priv)}, EXTRACT(EPOCH FROM NOW()))
-    ON CONFLICT (kid) DO UPDATE SET jwk = EXCLUDED.jwk
-  `;
-};
-
-const loadPersistedKeys = async (): Promise<JoseKey[]> => {
-  const rows = await db`SELECT kid, jwk, created_at FROM oauth_keys ORDER BY kid`;
-  const keys: JoseKey[] = [];
-  for (const row of rows) {
-    try {
-      const obj = JSON.parse(row.jwk);
-      const key = await JoseKey.fromImportable(obj as any, (obj as any).kid);
-      keys.push(key);
-    } catch (err) {
-      logger.error('[OAuth] Could not parse stored JWK', err);
-    }
-  }
-  return keys;
-};
-
-const ensureKeys = async (): Promise<JoseKey[]> => {
-  let keys = await loadPersistedKeys();
-  const needed: string[] = [];
-  for (let i = 1; i <= 3; i++) {
-    const kid = `key${i}`;
-    if (!keys.some(k => k.kid === kid)) needed.push(kid);
-  }
-  for (const kid of needed) {
-    const newKey = await JoseKey.generate(['ES256'], kid);
-    await persistKey(newKey);
-    keys.push(newKey);
-  }
-  keys.sort((a, b) => (a.kid ?? '').localeCompare(b.kid ?? ''));
-  return keys;
-};
-
-// Load keys from database every time (stateless - safe for horizontal scaling)
-export const getCurrentKeys = async (): Promise<JoseKey[]> => {
-  return await loadPersistedKeys();
-};
-
-// Key rotation - rotate keys older than 30 days (monthly rotation)
-const KEY_MAX_AGE = 30 * 24 * 60 * 60; // 30 days in seconds
-
-export const rotateKeysIfNeeded = async (): Promise<boolean> => {
-  const now = Math.floor(Date.now() / 1000);
-  const cutoffTime = now - KEY_MAX_AGE;
-
-  try {
-    // Find keys older than 30 days
-    const oldKeys = await db`
-      SELECT kid, created_at FROM oauth_keys
-      WHERE created_at IS NOT NULL AND created_at < ${cutoffTime}
-      ORDER BY created_at ASC
-    `;
-
-    if (oldKeys.length === 0) {
-      logger.debug('[KeyRotation] No keys need rotation');
-      return false;
-    }
-
-    logger.info(`[KeyRotation] Found ${oldKeys.length} key(s) older than 30 days, rotating oldest key`);
-
-    // Rotate the oldest key
-    const oldestKey = oldKeys[0];
-    const oldKid = oldestKey.kid;
-
-    // Generate new key with same kid
-    const newKey = await JoseKey.generate(['ES256'], oldKid);
-    await persistKey(newKey);
-
-    logger.info(`[KeyRotation] Rotated key ${oldKid}`);
-
-    return true;
-  } catch (err) {
-    logger.error('[KeyRotation] Failed to rotate keys', err);
-    return false;
-  }
-};
-
-export const getOAuthClient = async (config: { domain: `http://${string}` | `https://${string}`, clientName: string }) => {
-  const keys = await ensureKeys();
-
-  return new NodeOAuthClient({
-    clientMetadata: createClientMetadata(config),
-    keyset: keys,
-    stateStore,
-    sessionStore
-  });
-};
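The rotation policy in the deleted `rotateKeysIfNeeded` (rotate the single oldest key once it passes the 30-day cutoff) can be sketched as a pure helper. This is an illustrative extraction, not code from the repo: `KeyRow`, `selectKeyToRotate`, and `nowSeconds` are hypothetical names, and `createdAt` is assumed to be a unix timestamp in seconds like the `EXTRACT(EPOCH FROM NOW())` values stored above.

```typescript
// 30 days in seconds, matching the deleted KEY_MAX_AGE constant
const KEY_MAX_AGE_SECONDS = 30 * 24 * 60 * 60

interface KeyRow {
  kid: string
  createdAt: number // unix seconds
}

// Returns the kid of the oldest key past the cutoff, or null if none qualify.
function selectKeyToRotate(rows: KeyRow[], nowSeconds: number): string | null {
  const cutoff = nowSeconds - KEY_MAX_AGE_SECONDS
  const stale = rows
    .filter(r => r.createdAt < cutoff)
    .sort((a, b) => a.createdAt - b.createdAt) // oldest first
  return stale.length > 0 ? stale[0].kid : null
}
```

Keeping this decision pure makes the "at most one key per run" behavior easy to test without a database.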
-335
src/lib/observability.ts
···
-// DIY Observability - Logs, Metrics, and Error Tracking
-// Types
-export interface LogEntry {
-  id: string
-  timestamp: Date
-  level: 'info' | 'warn' | 'error' | 'debug'
-  message: string
-  service: string
-  context?: Record<string, any>
-  traceId?: string
-  eventType?: string
-}
-
-export interface ErrorEntry {
-  id: string
-  timestamp: Date
-  message: string
-  stack?: string
-  service: string
-  context?: Record<string, any>
-  count: number // How many times this error occurred
-  lastSeen: Date
-}
-
-export interface MetricEntry {
-  timestamp: Date
-  path: string
-  method: string
-  statusCode: number
-  duration: number // in milliseconds
-  service: string
-}
-
-export interface DatabaseStats {
-  totalSites: number
-  totalDomains: number
-  totalCustomDomains: number
-  recentSites: any[]
-  recentDomains: any[]
-}
-
-// In-memory storage with rotation
-const MAX_LOGS = 5000
-const MAX_ERRORS = 500
-const MAX_METRICS = 10000
-
-const logs: LogEntry[] = []
-const errors: Map<string, ErrorEntry> = new Map()
-const metrics: MetricEntry[] = []
-
-// Helper to generate unique IDs
-let logCounter = 0
-let errorCounter = 0
-
-function generateId(prefix: string, counter: number): string {
-  return `${prefix}-${Date.now()}-${counter}`
-}
-
-// Helper to extract event type from message
-function extractEventType(message: string): string | undefined {
-  const match = message.match(/^\[([^\]]+)\]/)
-  return match ? match[1] : undefined
-}
-
-// Log collector
-export const logCollector = {
-  log(level: LogEntry['level'], message: string, service: string, context?: Record<string, any>, traceId?: string) {
-    const entry: LogEntry = {
-      id: generateId('log', logCounter++),
-      timestamp: new Date(),
-      level,
-      message,
-      service,
-      context,
-      traceId,
-      eventType: extractEventType(message)
-    }
-
-    logs.unshift(entry)
-
-    // Rotate if needed
-    if (logs.length > MAX_LOGS) {
-      logs.splice(MAX_LOGS)
-    }
-
-    // Also log to console for compatibility
-    const contextStr = context ? ` ${JSON.stringify(context)}` : ''
-    const traceStr = traceId ? ` [trace:${traceId}]` : ''
-    console[level === 'debug' ? 'log' : level](`[${service}] ${message}${contextStr}${traceStr}`)
-  },
-
-  info(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-    this.log('info', message, service, context, traceId)
-  },
-
-  warn(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-    this.log('warn', message, service, context, traceId)
-  },
-
-  error(message: string, service: string, error?: any, context?: Record<string, any>, traceId?: string) {
-    const ctx = { ...context }
-    if (error instanceof Error) {
-      ctx.error = error.message
-      ctx.stack = error.stack
-    } else if (error) {
-      ctx.error = String(error)
-    }
-    this.log('error', message, service, ctx, traceId)
-
-    // Also track in errors
-    errorTracker.track(message, service, error, context)
-  },
-
-  debug(message: string, service: string, context?: Record<string, any>, traceId?: string) {
-    if (process.env.NODE_ENV !== 'production') {
-      this.log('debug', message, service, context, traceId)
-    }
-  },
-
-  getLogs(filter?: { level?: string; service?: string; limit?: number; search?: string; eventType?: string }) {
-    let filtered = [...logs]
-
-    if (filter?.level) {
-      filtered = filtered.filter(log => log.level === filter.level)
-    }
-
-    if (filter?.service) {
-      filtered = filtered.filter(log => log.service === filter.service)
-    }
-
-    if (filter?.eventType) {
-      filtered = filtered.filter(log => log.eventType === filter.eventType)
-    }
-
-    if (filter?.search) {
-      const search = filter.search.toLowerCase()
-      filtered = filtered.filter(log =>
-        log.message.toLowerCase().includes(search) ||
-        (log.context ? JSON.stringify(log.context).toLowerCase().includes(search) : false)
-      )
-    }
-
-    const limit = filter?.limit || 100
-    return filtered.slice(0, limit)
-  },
-
-  clear() {
-    logs.length = 0
-  }
-}
-
-// Error tracker with deduplication
-export const errorTracker = {
-  track(message: string, service: string, error?: any, context?: Record<string, any>) {
-    const key = `${service}:${message}`
-
-    const existing = errors.get(key)
-    if (existing) {
-      existing.count++
-      existing.lastSeen = new Date()
-      if (context) {
-        existing.context = { ...existing.context, ...context }
-      }
-    } else {
-      const entry: ErrorEntry = {
-        id: generateId('error', errorCounter++),
-        timestamp: new Date(),
-        message,
-        service,
-        context,
-        count: 1,
-        lastSeen: new Date()
-      }
-
-      if (error instanceof Error) {
-        entry.stack = error.stack
-      }
-
-      errors.set(key, entry)
-
-      // Rotate if needed
-      if (errors.size > MAX_ERRORS) {
-        const oldest = Array.from(errors.keys())[0]
-        errors.delete(oldest)
-      }
-    }
-  },
-
-  getErrors(filter?: { service?: string; limit?: number }) {
-    let filtered = Array.from(errors.values())
-
-    if (filter?.service) {
-      filtered = filtered.filter(err => err.service === filter.service)
-    }
-
-    // Sort by last seen (most recent first)
-    filtered.sort((a, b) => b.lastSeen.getTime() - a.lastSeen.getTime())
-
-    const limit = filter?.limit || 100
-    return filtered.slice(0, limit)
-  },
-
-  clear() {
-    errors.clear()
-  }
-}
-
-// Metrics collector
-export const metricsCollector = {
-  recordRequest(path: string, method: string, statusCode: number, duration: number, service: string) {
-    const entry: MetricEntry = {
-      timestamp: new Date(),
-      path,
-      method,
-      statusCode,
-      duration,
-      service
-    }
-
-    metrics.unshift(entry)
-
-    // Rotate if needed
-    if (metrics.length > MAX_METRICS) {
-      metrics.splice(MAX_METRICS)
-    }
-  },
-
-  getMetrics(filter?: { service?: string; timeWindow?: number }) {
-    let filtered = [...metrics]
-
-    if (filter?.service) {
-      filtered = filtered.filter(m => m.service === filter.service)
-    }
-
-    if (filter?.timeWindow) {
-      const cutoff = Date.now() - filter.timeWindow
-      filtered = filtered.filter(m => m.timestamp.getTime() > cutoff)
-    }
-
-    return filtered
-  },
-
-  getStats(service?: string, timeWindow: number = 3600000) {
-    const filtered = this.getMetrics({ service, timeWindow })
-
-    if (filtered.length === 0) {
-      return {
-        totalRequests: 0,
-        avgDuration: 0,
-        p50Duration: 0,
-        p95Duration: 0,
-        p99Duration: 0,
-        errorRate: 0,
-        requestsPerMinute: 0
-      }
-    }
-
-    const durations = filtered.map(m => m.duration).sort((a, b) => a - b)
-    const totalDuration = durations.reduce((sum, d) => sum + d, 0)
-    const errors = filtered.filter(m => m.statusCode >= 400).length
-
-    const p50 = durations[Math.floor(durations.length * 0.5)]
-    const p95 = durations[Math.floor(durations.length * 0.95)]
-    const p99 = durations[Math.floor(durations.length * 0.99)]
-
-    const timeWindowMinutes = timeWindow / 60000
-
-    return {
-      totalRequests: filtered.length,
-      avgDuration: Math.round(totalDuration / filtered.length),
-      p50Duration: Math.round(p50),
-      p95Duration: Math.round(p95),
-      p99Duration: Math.round(p99),
-      errorRate: (errors / filtered.length) * 100,
-      requestsPerMinute: Math.round(filtered.length / timeWindowMinutes)
-    }
-  },
-
-  clear() {
-    metrics.length = 0
-  }
-}
-
-// Elysia middleware for request timing
-export function observabilityMiddleware(service: string) {
-  return {
-    beforeHandle: ({ request }: any) => {
-      // Store start time on request object
-      (request as any).__startTime = Date.now()
-    },
-    afterHandle: ({ request, set }: any) => {
-      const duration = Date.now() - ((request as any).__startTime || Date.now())
-      const url = new URL(request.url)
-
-      metricsCollector.recordRequest(
-        url.pathname,
-        request.method,
-        set.status || 200,
-        duration,
-        service
-      )
-    },
-    onError: ({ request, error, set }: any) => {
-      const duration = Date.now() - ((request as any).__startTime || Date.now())
-      const url = new URL(request.url)
-
-      metricsCollector.recordRequest(
-        url.pathname,
-        request.method,
-        set.status || 500,
-        duration,
-        service
-      )
-
-      logCollector.error(
-        `Request failed: ${request.method} ${url.pathname}`,
-        service,
-        error,
-        { statusCode: set.status || 500 }
-      )
-    }
-  }
-}
-
-// Export singleton logger for easy access
-export const logger = {
-  info: (message: string, context?: Record<string, any>) =>
-    logCollector.info(message, 'main-app', context),
-  warn: (message: string, context?: Record<string, any>) =>
-    logCollector.warn(message, 'main-app', context),
-  error: (message: string, error?: any, context?: Record<string, any>) =>
-    logCollector.error(message, 'main-app', error, context),
-  debug: (message: string, context?: Record<string, any>) =>
-    logCollector.debug(message, 'main-app', context)
-}
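The percentile math in the deleted `getStats` uses a nearest-rank index, `Math.floor(n * p)`, on the sorted durations. A standalone sketch of that formula (the `percentile` helper is a hypothetical name introduced here, not part of the file; the out-of-range guard at the end is an addition of this sketch, since `Math.floor(n * 1.0)` would index past the array):

```typescript
// Nearest-rank percentile, mirroring getStats: sort ascending, index at floor(n * p).
// Not an interpolated percentile; adjacent ranks can share a value.
function percentile(durations: number[], p: number): number {
  const sorted = [...durations].sort((a, b) => a - b)
  return sorted[Math.floor(sorted.length * p)] ?? sorted[sorted.length - 1]
}
```

With ten samples, p50 lands on the 6th value (index 5) and p95/p99 both land on the last, which matches how the in-memory stats skew high for small windows.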
-90
src/lib/sync-sites.ts
···
-import { Agent } from '@atproto/api'
-import type { OAuthSession } from '@atproto/oauth-client-node'
-import { upsertSite } from './db'
-
-/**
- * Sync sites from user's PDS into the database cache
- * - Fetches all place.wisp.fs records from AT Protocol repo
- * - Validates record structure
- * - Backfills into sites table
- */
-export async function syncSitesFromPDS(
-  did: string,
-  session: OAuthSession
-): Promise<{ synced: number; errors: string[] }> {
-  console.log(`[Sync] Starting site sync for ${did}`)
-
-  const agent = new Agent((url, init) => session.fetchHandler(url, init))
-  const errors: string[] = []
-  let synced = 0
-
-  try {
-    // List all records in the place.wisp.fs collection
-    console.log('[Sync] Fetching place.wisp.fs records from PDS')
-    const records = await agent.com.atproto.repo.listRecords({
-      repo: did,
-      collection: 'place.wisp.fs',
-      limit: 100 // Adjust if users might have more sites
-    })
-
-    console.log(`[Sync] Found ${records.data.records.length} records`)
-
-    // Process each record
-    for (const record of records.data.records) {
-      try {
-        const { uri, value } = record
-
-        // Extract rkey from URI (at://did/collection/rkey)
-        const rkey = uri.split('/').pop()
-        if (!rkey) {
-          errors.push(`Invalid URI format: ${uri}`)
-          continue
-        }
-
-        // Validate record structure
-        if (!value || typeof value !== 'object') {
-          errors.push(`Invalid record value for ${rkey}`)
-          continue
-        }
-
-        const siteValue = value as any
-
-        // Check for required fields
-        if (siteValue.$type !== 'place.wisp.fs') {
-          errors.push(
-            `Invalid $type for ${rkey}: ${siteValue.$type}`
-          )
-          continue
-        }
-
-        if (!siteValue.site || typeof siteValue.site !== 'string') {
-          errors.push(`Missing or invalid site name for ${rkey}`)
-          continue
-        }
-
-        // Upsert into database
-        const displayName = siteValue.site
-        await upsertSite(did, rkey, displayName)
-
-        console.log(
-          `[Sync] ✓ Synced site: ${displayName} (${rkey})`
-        )
-        synced++
-      } catch (err) {
-        const errorMsg = `Error processing record: ${err instanceof Error ? err.message : 'Unknown error'}`
-        console.error(`[Sync] ${errorMsg}`)
-        errors.push(errorMsg)
-      }
-    }
-
-    console.log(
-      `[Sync] Complete: ${synced} synced, ${errors.length} errors`
-    )
-    return { synced, errors }
-  } catch (err) {
-    const errorMsg = `Failed to fetch records from PDS: ${err instanceof Error ? err.message : 'Unknown error'}`
-    console.error(`[Sync] ${errorMsg}`)
-    errors.push(errorMsg)
-    return { synced, errors }
-  }
-}
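The sync loop treats the last path segment of the record URI (`at://<did>/<collection>/<rkey>`) as the rkey, via `uri.split('/').pop()`. A standalone version of just that step, under the same assumption (`extractRkey` is a hypothetical name introduced for this sketch):

```typescript
// Pull the rkey off an AT URI of the form at://<did>/<collection>/<rkey>.
// Mirrors the sync loop: the last '/'-separated segment is the rkey, and an
// empty segment (e.g. a trailing slash) is treated as invalid.
function extractRkey(uri: string): string | null {
  const rkey = uri.split('/').pop()
  return rkey && rkey.length > 0 ? rkey : null
}
```

A caller would push an "Invalid URI format" error and `continue` on a `null` result, as the deleted loop does.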
-10
src/lib/types.ts
···
-/**
- * Configuration for the Wisp client
- * @typeParam Config
- */
-export type Config = {
-  /** The base domain URL with HTTP or HTTPS protocol */
-  domain: `http://${string}` | `https://${string}`,
-  /** Name of the client application */
-  clientName: string
-};
-38
src/lib/wisp-auth.ts
···
-import { Did } from "@atproto/api";
-import { NodeOAuthClient } from "@atproto/oauth-client-node";
-import type { OAuthSession } from "@atproto/oauth-client-node";
-import { Cookie } from "elysia";
-import { logger } from "./logger";
-
-export interface AuthenticatedContext {
-  did: Did;
-  session: OAuthSession;
-}
-
-export const authenticateRequest = async (
-  client: NodeOAuthClient,
-  cookies: Record<string, Cookie<unknown>>
-): Promise<AuthenticatedContext | null> => {
-  try {
-    const did = cookies.did?.value as Did;
-    if (!did) return null;
-
-    const session = await client.restore(did, "auto");
-    return session ? { did, session } : null;
-  } catch (err) {
-    logger.error('[Auth] Authentication error', err);
-    return null;
-  }
-};
-
-export const requireAuth = async (
-  client: NodeOAuthClient,
-  cookies: Record<string, Cookie<unknown>>
-): Promise<AuthenticatedContext> => {
-  const auth = await authenticateRequest(client, cookies);
-  if (!auth) {
-    throw new Error('Authentication required');
-  }
-  return auth;
-};
-639
src/lib/wisp-utils.test.ts
···
-import { describe, test, expect } from 'bun:test'
-import {
-  shouldCompressFile,
-  compressFile,
-  processUploadedFiles,
-  createManifest,
-  updateFileBlobs,
-  type UploadedFile,
-  type FileUploadResult,
-} from './wisp-utils'
-import type { Directory } from '../lexicons/types/place/wisp/fs'
-import { gunzipSync } from 'zlib'
-import { BlobRef } from '@atproto/api'
-import { CID } from 'multiformats/cid'
-
-// Helper function to create a valid CID for testing
-// Using a real valid CID from actual AT Protocol usage
-const TEST_CID_STRING = 'bafkreid7ybejd5s2vv2j7d4aajjlmdgazguemcnuliiyfn6coxpwp2mi6y'
-
-function createMockBlobRef(mimeType: string, size: number): BlobRef {
-  // Create a properly formatted CID
-  const cid = CID.parse(TEST_CID_STRING)
-  return new BlobRef(cid, mimeType, size)
-}
-
-describe('shouldCompressFile', () => {
-  test('should compress HTML files', () => {
-    expect(shouldCompressFile('text/html')).toBe(true)
-    expect(shouldCompressFile('text/html; charset=utf-8')).toBe(true)
-  })
-
-  test('should compress CSS files', () => {
-    expect(shouldCompressFile('text/css')).toBe(true)
-  })
-
-  test('should compress JavaScript files', () => {
-    expect(shouldCompressFile('text/javascript')).toBe(true)
-    expect(shouldCompressFile('application/javascript')).toBe(true)
-    expect(shouldCompressFile('application/x-javascript')).toBe(true)
-  })
-
-  test('should compress JSON files', () => {
-    expect(shouldCompressFile('application/json')).toBe(true)
-  })
-
-  test('should compress SVG files', () => {
-    expect(shouldCompressFile('image/svg+xml')).toBe(true)
-  })
-
-  test('should compress XML files', () => {
-    expect(shouldCompressFile('text/xml')).toBe(true)
-    expect(shouldCompressFile('application/xml')).toBe(true)
-  })
-
-  test('should compress plain text files', () => {
-    expect(shouldCompressFile('text/plain')).toBe(true)
-  })
-
-  test('should NOT compress images', () => {
-    expect(shouldCompressFile('image/png')).toBe(false)
-    expect(shouldCompressFile('image/jpeg')).toBe(false)
-    expect(shouldCompressFile('image/jpg')).toBe(false)
-    expect(shouldCompressFile('image/gif')).toBe(false)
-    expect(shouldCompressFile('image/webp')).toBe(false)
-  })
-
-  test('should NOT compress videos', () => {
-    expect(shouldCompressFile('video/mp4')).toBe(false)
-    expect(shouldCompressFile('video/webm')).toBe(false)
-  })
-
-  test('should NOT compress already compressed formats', () => {
-    expect(shouldCompressFile('application/zip')).toBe(false)
-    expect(shouldCompressFile('application/gzip')).toBe(false)
-    expect(shouldCompressFile('application/pdf')).toBe(false)
-  })
-
-  test('should NOT compress fonts', () => {
-    expect(shouldCompressFile('font/woff')).toBe(false)
-    expect(shouldCompressFile('font/woff2')).toBe(false)
-    expect(shouldCompressFile('font/ttf')).toBe(false)
-  })
-})
-
-describe('compressFile', () => {
-  test('should compress text content', () => {
-    const content = Buffer.from('Hello, World! '.repeat(100))
-    const compressed = compressFile(content)
-
-    expect(compressed.length).toBeLessThan(content.length)
-
-    // Verify we can decompress it back
-    const decompressed = gunzipSync(compressed)
-    expect(decompressed.toString()).toBe(content.toString())
-  })
-
-  test('should compress HTML content significantly', () => {
-    const html = `
-      <!DOCTYPE html>
-      <html>
-        <head><title>Test</title></head>
-        <body>
-          ${'<p>Hello World!</p>\n'.repeat(50)}
-        </body>
-      </html>
-    `
-    const content = Buffer.from(html)
-    const compressed = compressFile(content)
-
-    expect(compressed.length).toBeLessThan(content.length)
-
-    // Verify decompression
-    const decompressed = gunzipSync(compressed)
-    expect(decompressed.toString()).toBe(html)
-  })
-
-  test('should handle empty content', () => {
-    const content = Buffer.from('')
-    const compressed = compressFile(content)
-    const decompressed = gunzipSync(compressed)
-    expect(decompressed.toString()).toBe('')
-  })
-
-  test('should produce deterministic compression', () => {
-    const content = Buffer.from('Test content')
-    const compressed1 = compressFile(content)
-    const compressed2 = compressFile(content)
-
-    expect(compressed1.toString('base64')).toBe(compressed2.toString('base64'))
-  })
-})
-
-describe('processUploadedFiles', () => {
-  test('should process single root-level file', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'index.html',
-        content: Buffer.from('<html></html>'),
-        mimeType: 'text/html',
-        size: 13,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(1)
-    expect(result.directory.type).toBe('directory')
-    expect(result.directory.entries).toHaveLength(1)
-    expect(result.directory.entries[0].name).toBe('index.html')
-
-    const node = result.directory.entries[0].node
-    expect('blob' in node).toBe(true) // It's a file node
-  })
-
-  test('should process multiple root-level files', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'index.html',
-        content: Buffer.from('<html></html>'),
-        mimeType: 'text/html',
-        size: 13,
-      },
-      {
-        name: 'styles.css',
-        content: Buffer.from('body {}'),
-        mimeType: 'text/css',
-        size: 7,
-      },
-      {
-        name: 'script.js',
-        content: Buffer.from('console.log("hi")'),
-        mimeType: 'application/javascript',
-        size: 17,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(3)
-    expect(result.directory.entries).toHaveLength(3)
-
-    const names = result.directory.entries.map(e => e.name)
-    expect(names).toContain('index.html')
-    expect(names).toContain('styles.css')
-    expect(names).toContain('script.js')
-  })
-
-  test('should process files with subdirectories', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'dist/index.html',
-        content: Buffer.from('<html></html>'),
-        mimeType: 'text/html',
-        size: 13,
-      },
-      {
-        name: 'dist/css/styles.css',
-        content: Buffer.from('body {}'),
-        mimeType: 'text/css',
-        size: 7,
-      },
-      {
-        name: 'dist/js/app.js',
-        content: Buffer.from('console.log()'),
-        mimeType: 'application/javascript',
-        size: 13,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(3)
-    expect(result.directory.entries).toHaveLength(3) // index.html, css/, js/
-
-    // Check root has index.html (after base folder removal)
-    const indexEntry = result.directory.entries.find(e => e.name === 'index.html')
-    expect(indexEntry).toBeDefined()
-
-    // Check css directory exists
-    const cssDir = result.directory.entries.find(e => e.name === 'css')
-    expect(cssDir).toBeDefined()
-    expect('entries' in cssDir!.node).toBe(true)
-
-    if ('entries' in cssDir!.node) {
-      expect(cssDir!.node.entries).toHaveLength(1)
-      expect(cssDir!.node.entries[0].name).toBe('styles.css')
-    }
-
-    // Check js directory exists
-    const jsDir = result.directory.entries.find(e => e.name === 'js')
-    expect(jsDir).toBeDefined()
-    expect('entries' in jsDir!.node).toBe(true)
-  })
-
-  test('should handle deeply nested subdirectories', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'dist/deep/nested/folder/file.txt',
-        content: Buffer.from('content'),
-        mimeType: 'text/plain',
-        size: 7,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(1)
-
-    // Navigate through the directory structure (base folder removed)
-    const deepDir = result.directory.entries.find(e => e.name === 'deep')
-    expect(deepDir).toBeDefined()
-    expect('entries' in deepDir!.node).toBe(true)
-
-    if ('entries' in deepDir!.node) {
-      const nestedDir = deepDir!.node.entries.find(e => e.name === 'nested')
-      expect(nestedDir).toBeDefined()
-
-      if (nestedDir && 'entries' in nestedDir.node) {
-        const folderDir = nestedDir.node.entries.find(e => e.name === 'folder')
-        expect(folderDir).toBeDefined()
-
-        if (folderDir && 'entries' in folderDir.node) {
-          expect(folderDir.node.entries).toHaveLength(1)
-          expect(folderDir.node.entries[0].name).toBe('file.txt')
-        }
-      }
-    }
-  })
-
-  test('should remove base folder name from paths', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'dist/index.html',
-        content: Buffer.from('<html></html>'),
-        mimeType: 'text/html',
-        size: 13,
-      },
-      {
-        name: 'dist/css/styles.css',
-        content: Buffer.from('body {}'),
-        mimeType: 'text/css',
-        size: 7,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    // After removing 'dist/', we should have index.html and css/ at root
-    expect(result.directory.entries.find(e => e.name === 'index.html')).toBeDefined()
-    expect(result.directory.entries.find(e => e.name === 'css')).toBeDefined()
-    expect(result.directory.entries.find(e => e.name === 'dist')).toBeUndefined()
-  })
-
-  test('should handle empty file list', () => {
-    const files: UploadedFile[] = []
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(0)
-    expect(result.directory.entries).toHaveLength(0)
-  })
-
-  test('should handle multiple files in same subdirectory', () => {
-    const files: UploadedFile[] = [
-      {
-        name: 'dist/assets/image1.png',
-        content: Buffer.from('png1'),
-        mimeType: 'image/png',
-        size: 4,
-      },
-      {
-        name: 'dist/assets/image2.png',
-        content: Buffer.from('png2'),
-        mimeType: 'image/png',
-        size: 4,
-      },
-    ]
-
-    const result = processUploadedFiles(files)
-
-    expect(result.fileCount).toBe(2)
-
-    const assetsDir = result.directory.entries.find(e => e.name === 'assets')
-    expect(assetsDir).toBeDefined()
-
-    if ('entries' in assetsDir!.node) {
-      expect(assetsDir!.node.entries).toHaveLength(2)
-      const names = assetsDir!.node.entries.map(e => e.name)
-      expect(names).toContain('image1.png')
-      expect(names).toContain('image2.png')
-    }
-  })
-})
-
-describe('createManifest', () => {
-  test('should create valid manifest', () => {
-    const root: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [],
-    }
-
-    const manifest = createManifest('example.com', root, 0)
-
-    expect(manifest.$type).toBe('place.wisp.fs')
-    expect(manifest.site).toBe('example.com')
-    expect(manifest.root).toBe(root)
-    expect(manifest.fileCount).toBe(0)
-    expect(manifest.createdAt).toBeDefined()
-
-    // Verify it's a valid ISO date string
-    const date = new Date(manifest.createdAt)
-    expect(date.toISOString()).toBe(manifest.createdAt)
-  })
-
-  test('should create manifest with file count', () => {
-    const root: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [],
-    }
-
-    const manifest = createManifest('test-site', root, 42)
-
-    expect(manifest.fileCount).toBe(42)
-    expect(manifest.site).toBe('test-site')
-  })
-
-  test('should create manifest with populated directory', () => {
-    const mockBlob = createMockBlobRef('text/html', 100)
-
-    const root: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'index.html',
-          node: {
-            $type: 'place.wisp.fs#file',
-            type: 'file',
-            blob: mockBlob,
-          },
-        },
-      ],
-    }
-
-    const manifest = createManifest('populated-site', root, 1)
-
-    expect(manifest).toBeDefined()
-    expect(manifest.site).toBe('populated-site')
-    expect(manifest.root.entries).toHaveLength(1)
-  })
-})
-
-describe('updateFileBlobs', () => {
-  test('should update single file blob at root', () => {
-    const directory: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'index.html',
-          node: {
-            $type: 'place.wisp.fs#file',
-            type: 'file',
-            blob: undefined as any,
-          },
-        },
-      ],
-    }
-
-    const mockBlob = createMockBlobRef('text/html', 100)
-    const uploadResults: FileUploadResult[] = [
-      {
-        hash: TEST_CID_STRING,
-        blobRef: mockBlob,
-        mimeType: 'text/html',
-      },
-    ]
-
-    const filePaths = ['index.html']
-
-    const updated = updateFileBlobs(directory, uploadResults, filePaths)
-
-    expect(updated.entries).toHaveLength(1)
-    const fileNode = updated.entries[0].node
-
-    if ('blob' in fileNode) {
-      expect(fileNode.blob).toBeDefined()
-      expect(fileNode.blob.mimeType).toBe('text/html')
-      expect(fileNode.blob.size).toBe(100)
-    } else {
-      throw new Error('Expected file node')
-    }
-  })
-
-  test('should update files in nested directories', () => {
-    const directory: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'css',
-          node: {
-            $type: 'place.wisp.fs#directory',
-            type: 'directory',
-            entries: [
-              {
-                name: 'styles.css',
-                node: {
-                  $type: 'place.wisp.fs#file',
-                  type: 'file',
-                  blob: undefined as any,
-                },
-              },
-            ],
-          },
-        },
-      ],
-    }
-
-    const mockBlob = createMockBlobRef('text/css', 50)
-    const uploadResults: FileUploadResult[] = [
-      {
-        hash: TEST_CID_STRING,
-        blobRef: mockBlob,
-        mimeType: 'text/css',
-        encoding: 'gzip',
-      },
-    ]
-
-    const filePaths = ['css/styles.css']
-
-    const updated = updateFileBlobs(directory, uploadResults, filePaths)
-
-    const cssDir = updated.entries[0]
-    expect(cssDir.name).toBe('css')
-
-    if ('entries' in cssDir.node) {
-      const cssFile = cssDir.node.entries[0]
-      expect(cssFile.name).toBe('styles.css')
-
-      if ('blob' in cssFile.node) {
-        expect(cssFile.node.blob.mimeType).toBe('text/css')
-        if ('encoding' in cssFile.node) {
-          expect(cssFile.node.encoding).toBe('gzip')
-        }
-      } else {
-        throw new Error('Expected file node')
-      }
-    } else {
-      throw new Error('Expected directory node')
-    }
-  })
-
-  test('should handle normalized paths with base folder removed', () => {
-    const directory: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'index.html',
-          node: {
-            $type: 'place.wisp.fs#file',
-            type: 'file',
-            blob: undefined as any,
-          },
-        },
-      ],
-    }
-
-    const mockBlob = createMockBlobRef('text/html', 100)
-    const uploadResults: FileUploadResult[] = [
-      {
-        hash: TEST_CID_STRING,
-        blobRef: mockBlob,
-      },
-    ]
-
-    // Path includes base folder that should be normalized
-    const filePaths = ['dist/index.html']
-
-    const updated = updateFileBlobs(directory, uploadResults, filePaths)
-
-    const fileNode = updated.entries[0].node
-    if ('blob' in fileNode) {
-      expect(fileNode.blob).toBeDefined()
-    } else {
-      throw new Error('Expected file node')
-    }
-  })
-
-  test('should preserve file metadata (encoding, mimeType, base64)', () => {
-    const directory: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'data.json',
-          node: {
-            $type: 'place.wisp.fs#file',
-            type: 'file',
-            blob: undefined as any,
-          },
-        },
-      ],
-    }
-
-    const mockBlob = createMockBlobRef('application/json', 200)
-    const uploadResults: FileUploadResult[] = [
-      {
-        hash: TEST_CID_STRING,
-        blobRef: mockBlob,
-        mimeType: 'application/json',
-        encoding: 'gzip',
-        base64: true,
-      },
-    ]
-
-    const filePaths = ['data.json']
-
-    const updated = updateFileBlobs(directory, uploadResults, filePaths)
-
-    const fileNode = updated.entries[0].node
-    if ('blob' in fileNode && 'mimeType' in fileNode && 'encoding' in fileNode && 'base64' in fileNode) {
-      expect(fileNode.mimeType).toBe('application/json')
-      expect(fileNode.encoding).toBe('gzip')
-      expect(fileNode.base64).toBe(true)
-    } else {
-      throw new Error('Expected file node with metadata')
-    }
-  })
-
-  test('should handle multiple files at different directory levels', () => {
-    const directory: Directory = {
-      $type: 'place.wisp.fs#directory',
-      type: 'directory',
-      entries: [
-        {
-          name: 'index.html',
-          node: {
-            $type: 'place.wisp.fs#file',
-            type: 'file',
-            blob: undefined as any,
-          },
-        },
-        {
-          name: 'assets',
-          node: {
-            $type: 'place.wisp.fs#directory',
-            type: 'directory',
-            entries: [
-              {
-                name: 'logo.svg',
-                node: {
-                  $type: 'place.wisp.fs#file',
-                  type: 'file',
-                  blob: undefined as any,
-                },
-              },
-            ],
-          },
-        },
-      ],
-    }
-
-    const htmlBlob = createMockBlobRef('text/html', 100)
-    const svgBlob = createMockBlobRef('image/svg+xml', 500)
-
-    const uploadResults: FileUploadResult[] = [
-      {
-        hash: TEST_CID_STRING,
-        blobRef: htmlBlob,
-      },
-      {
-        hash: TEST_CID_STRING,
-        blobRef: svgBlob,
-      },
-    ]
-
-    const filePaths = ['index.html', 'assets/logo.svg']
-
-    const updated = updateFileBlobs(directory, uploadResults, filePaths)
-
-    // Check root file
-    const indexNode = updated.entries[0].node
-    if ('blob' in indexNode) {
-      expect(indexNode.blob.mimeType).toBe('text/html')
-    }
-
-    // Check nested file
-    const assetsDir = updated.entries[1]
-    if ('entries' in assetsDir.node) {
-      const logoNode = assetsDir.node.entries[0].node
-      if ('blob' in logoNode) {
-        expect(logoNode.blob.mimeType).toBe('image/svg+xml')
-      }
-    }
-  })
-})
-241
src/lib/wisp-utils.ts
···
-
import type { BlobRef } from "@atproto/api";
-
import type { Record, Directory, File, Entry } from "../lexicons/types/place/wisp/fs";
-
import { validateRecord } from "../lexicons/types/place/wisp/fs";
-
import { gzipSync } from 'zlib';
-
-
export interface UploadedFile {
-
name: string;
-
content: Buffer;
-
mimeType: string;
-
size: number;
-
compressed?: boolean;
-
originalMimeType?: string;
-
}
-
-
export interface FileUploadResult {
-
hash: string;
-
blobRef: BlobRef;
-
encoding?: 'gzip';
-
mimeType?: string;
-
base64?: boolean;
-
}
-
-
export interface ProcessedDirectory {
-
directory: Directory;
-
fileCount: number;
-
}
-
-
/**
-
* Determine if a file should be gzip compressed based on its MIME type
-
*/
-
export function shouldCompressFile(mimeType: string): boolean {
-
// Compress text-based files
-
const compressibleTypes = [
-
'text/html',
-
'text/css',
-
'text/javascript',
-
'application/javascript',
-
'application/json',
-
'image/svg+xml',
-
'text/xml',
-
'application/xml',
-
'text/plain',
-
'application/x-javascript'
-
];
-
-
// Check if mime type starts with any compressible type
-
return compressibleTypes.some(type => mimeType.startsWith(type));
-
}
-
-
/**
-
* Compress a file using gzip
-
*/
-
export function compressFile(content: Buffer): Buffer {
-
return gzipSync(content, { level: 9 });
-
}
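A standalone sketch of the compression behaviour above (it mirrors the MIME-type check rather than importing the exported helpers), showing the gzip round-trip with Node's zlib:

```typescript
import { gzipSync, gunzipSync } from "node:zlib";

// Mirror of shouldCompressFile: only text-like MIME types are compressed.
const compressibleTypes = ["text/html", "text/css", "application/json", "image/svg+xml"];
function shouldCompress(mimeType: string): boolean {
    return compressibleTypes.some((type) => mimeType.startsWith(type));
}

// Level 9 trades CPU for the smallest payload, matching compressFile above.
const original = Buffer.from("<html><body>hello</body></html>".repeat(50));
const compressed = gzipSync(original, { level: 9 });
const restored = gunzipSync(compressed);
```

Note that `startsWith` means a MIME type with parameters like `text/html; charset=utf-8` still matches.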
-
-
/**
-
* Process uploaded files into a directory structure
-
*/
-
export function processUploadedFiles(files: UploadedFile[]): ProcessedDirectory {
-
const entries: Entry[] = [];
-
let fileCount = 0;
-
-
// Group files by directory
-
const directoryMap = new Map<string, UploadedFile[]>();
-
-
for (const file of files) {
-
// Remove any base folder name from the path
-
const normalizedPath = file.name.replace(/^[^\/]*\//, '');
-
const parts = normalizedPath.split('/');
-
-
if (parts.length === 1) {
-
// Root level file
-
entries.push({
-
name: parts[0],
-
node: {
-
$type: 'place.wisp.fs#file' as const,
-
type: 'file' as const,
-
blob: undefined as any // Will be filled after upload
-
}
-
});
-
fileCount++;
-
} else {
-
// File in subdirectory
-
const dirPath = parts.slice(0, -1).join('/');
-
if (!directoryMap.has(dirPath)) {
-
directoryMap.set(dirPath, []);
-
}
-
directoryMap.get(dirPath)!.push({
-
...file,
-
name: normalizedPath
-
});
-
}
-
}
-
-
// Process subdirectories
-
for (const [dirPath, dirFiles] of directoryMap) {
-
const dirEntries: Entry[] = [];
-
-
for (const file of dirFiles) {
-
const fileName = file.name.split('/').pop()!;
-
dirEntries.push({
-
name: fileName,
-
node: {
-
$type: 'place.wisp.fs#file' as const,
-
type: 'file' as const,
-
blob: undefined as any // Will be filled after upload
-
}
-
});
-
fileCount++;
-
}
-
-
// Build nested directory structure
-
const pathParts = dirPath.split('/');
-
let currentEntries = entries;
-
-
for (let i = 0; i < pathParts.length; i++) {
-
const part = pathParts[i];
-
const isLast = i === pathParts.length - 1;
-
-
let existingEntry = currentEntries.find(e => e.name === part);
-
-
if (!existingEntry) {
-
const newDir = {
-
$type: 'place.wisp.fs#directory' as const,
-
type: 'directory' as const,
-
entries: isLast ? dirEntries : []
-
};
-
-
existingEntry = {
-
name: part,
-
node: newDir
-
};
-
currentEntries.push(existingEntry);
-
} else if ('entries' in existingEntry.node && isLast) {
-
(existingEntry.node as any).entries.push(...dirEntries);
-
}
-
-
if (existingEntry && 'entries' in existingEntry.node) {
-
currentEntries = (existingEntry.node as any).entries;
-
}
-
}
-
}
-
-
const result = {
-
directory: {
-
$type: 'place.wisp.fs#directory' as const,
-
type: 'directory' as const,
-
entries
-
},
-
fileCount
-
};
-
-
return result;
-
}
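The grouping step above can be sketched in isolation (a simplified reimplementation working on plain path strings, not the `UploadedFile` objects): root-level names go straight into entries, deeper paths are bucketed by their parent directory.

```typescript
// Group normalized paths the way processUploadedFiles does.
function groupPaths(paths: string[]) {
    const rootFiles: string[] = [];
    const directoryMap = new Map<string, string[]>();
    for (const raw of paths) {
        // Strip a single leading base folder, e.g. "dist/a/b" -> "a/b".
        const normalized = raw.replace(/^[^\/]*\//, "");
        const parts = normalized.split("/");
        if (parts.length === 1) {
            rootFiles.push(parts[0]);
        } else {
            const dirPath = parts.slice(0, -1).join("/");
            const bucket = directoryMap.get(dirPath) ?? [];
            bucket.push(parts[parts.length - 1]);
            directoryMap.set(dirPath, bucket);
        }
    }
    return { rootFiles, directoryMap };
}

const grouped = groupPaths(["dist/index.html", "dist/assets/logo.svg", "dist/assets/app.js"]);
```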
-
-
/**
-
* Create the manifest record for a site
-
*/
-
export function createManifest(
-
siteName: string,
-
root: Directory,
-
fileCount: number
-
): Record {
-
const manifest = {
-
$type: 'place.wisp.fs' as const,
-
site: siteName,
-
root,
-
fileCount,
-
createdAt: new Date().toISOString()
-
};
-
-
// Validate the manifest before returning
-
const validationResult = validateRecord(manifest);
-
if (!validationResult.success) {
-
throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
-
}
-
-
return manifest;
-
}
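For reference, the record shape `createManifest` produces looks like this (a standalone sketch; the lexicon-generated `validateRecord` step is skipped here since this snippet does not import the generated types):

```typescript
// Shape of a place.wisp.fs manifest record, per createManifest above.
const root = {
    $type: "place.wisp.fs#directory" as const,
    type: "directory" as const,
    entries: [] as unknown[]
};

const manifest = {
    $type: "place.wisp.fs" as const,
    site: "my-site", // illustrative site name
    root,
    fileCount: 0,
    createdAt: new Date().toISOString()
};
```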
-
-
/**
-
* Update file blobs in directory structure after upload
-
* Uses path-based matching to correctly match files in nested directories
-
*/
-
export function updateFileBlobs(
-
directory: Directory,
-
uploadResults: FileUploadResult[],
-
filePaths: string[],
-
currentPath: string = ''
-
): Directory {
-
const updatedEntries = directory.entries.map(entry => {
-
if ('type' in entry.node && entry.node.type === 'file') {
-
// Build the full path for this file
-
const fullPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
-
-
// Find exact match in filePaths (need to handle normalized paths)
-
const fileIndex = filePaths.findIndex((path) => {
-
// Normalize both paths by removing leading base folder
-
const normalizedUploadPath = path.replace(/^[^\/]*\//, '');
-
const normalizedEntryPath = fullPath;
-
return normalizedUploadPath === normalizedEntryPath || path === fullPath;
-
});
-
-
if (fileIndex !== -1 && uploadResults[fileIndex]) {
-
const result = uploadResults[fileIndex];
-
const blobRef = result.blobRef;
-
-
return {
-
...entry,
-
node: {
-
$type: 'place.wisp.fs#file' as const,
-
type: 'file' as const,
-
blob: blobRef,
-
...(result.encoding && { encoding: result.encoding }),
-
...(result.mimeType && { mimeType: result.mimeType }),
-
...(result.base64 && { base64: result.base64 })
-
}
-
};
-
} else {
-
console.error(`โŒ BLOB MATCHING ERROR: Could not find blob for file: ${fullPath}`);
-
console.error(` Available paths:`, filePaths.slice(0, 10), filePaths.length > 10 ? `... and ${filePaths.length - 10} more` : '');
-
}
-
} else if ('type' in entry.node && entry.node.type === 'directory') {
-
const dirPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
-
return {
-
...entry,
-
node: updateFileBlobs(entry.node as Directory, uploadResults, filePaths, dirPath)
-
};
-
}
-
return entry;
-
}) as Entry[];
-
-
const result = {
-
$type: 'place.wisp.fs#directory' as const,
-
type: 'directory' as const,
-
entries: updatedEntries
-
};
-
-
return result;
-
}
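The path-matching rule inside `updateFileBlobs` can be isolated as follows (a mirror of the comparison in the `findIndex` callback, not the exported function): an uploaded path matches an entry path either exactly, or after dropping a single leading base folder.

```typescript
// Mirror of the matching rule in updateFileBlobs.
function matchesEntry(uploadPath: string, entryPath: string): boolean {
    const normalizedUploadPath = uploadPath.replace(/^[^\/]*\//, "");
    return normalizedUploadPath === entryPath || uploadPath === entryPath;
}
```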
-305
src/routes/admin.ts
···
-
// Admin API routes
-
import { Elysia, t } from 'elysia'
-
import { adminAuth, requireAdmin } from '../lib/admin-auth'
-
import { logCollector, errorTracker, metricsCollector } from '../lib/observability'
-
import { db } from '../lib/db'
-
-
export const adminRoutes = () =>
-
new Elysia({ prefix: '/api/admin' })
-
// Login
-
.post(
-
'/login',
-
async ({ body, cookie, set }) => {
-
const { username, password } = body
-
-
const valid = await adminAuth.verify(username, password)
-
if (!valid) {
-
set.status = 401
-
return { error: 'Invalid credentials' }
-
}
-
-
const sessionId = adminAuth.createSession(username)
-
-
// Set cookie
-
cookie.admin_session.set({
-
value: sessionId,
-
httpOnly: true,
-
secure: process.env.NODE_ENV === 'production',
-
sameSite: 'lax',
-
maxAge: 24 * 60 * 60 // 24 hours
-
})
-
-
return { success: true }
-
},
-
{
-
body: t.Object({
-
username: t.String(),
-
password: t.String()
-
})
-
}
-
)
-
-
// Logout
-
.post('/logout', ({ cookie }) => {
-
const sessionId = cookie.admin_session?.value
-
if (sessionId && typeof sessionId === 'string') {
-
adminAuth.deleteSession(sessionId)
-
}
-
cookie.admin_session.remove()
-
return { success: true }
-
})
-
-
// Check auth status
-
.get('/status', ({ cookie }) => {
-
const sessionId = cookie.admin_session?.value
-
if (!sessionId || typeof sessionId !== 'string') {
-
return { authenticated: false }
-
}
-
-
const session = adminAuth.verifySession(sessionId)
-
if (!session) {
-
return { authenticated: false }
-
}
-
-
return {
-
authenticated: true,
-
username: session.username
-
}
-
})
-
-
// Get logs (protected)
-
.get('/logs', async ({ query, cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
const filter: any = {}
-
-
if (query.level) filter.level = query.level
-
if (query.service) filter.service = query.service
-
if (query.search) filter.search = query.search
-
if (query.eventType) filter.eventType = query.eventType
-
if (query.limit) filter.limit = parseInt(query.limit as string)
-
-
// Get logs from main app
-
const mainLogs = logCollector.getLogs(filter)
-
-
// Get logs from hosting service
-
let hostingLogs: any[] = []
-
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
-
const params = new URLSearchParams()
-
if (query.level) params.append('level', query.level as string)
-
if (query.service) params.append('service', query.service as string)
-
if (query.search) params.append('search', query.search as string)
-
if (query.eventType) params.append('eventType', query.eventType as string)
-
params.append('limit', String(filter.limit || 100))
-
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/logs?${params}`)
-
if (response.ok) {
-
const data = await response.json()
-
hostingLogs = data.logs
-
}
-
} catch (err) {
-
// Hosting service might not be running
-
}
-
-
// Merge and sort by timestamp
-
const allLogs = [...mainLogs, ...hostingLogs].sort((a, b) =>
-
new Date(b.timestamp).getTime() - new Date(a.timestamp).getTime()
-
)
-
-
return { logs: allLogs.slice(0, filter.limit || 100) }
-
})
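The merge performed by the `/logs` handler above can be sketched on its own (a simplified version over a minimal log shape): combine both services' logs, sort newest first, then apply the limit.

```typescript
// Merge logs from two services, newest first, capped at `limit`.
interface LogLine { timestamp: string; message: string }

function mergeLogs(a: LogLine[], b: LogLine[], limit = 100): LogLine[] {
    return [...a, ...b]
        .sort((x, y) => new Date(y.timestamp).getTime() - new Date(x.timestamp).getTime())
        .slice(0, limit);
}

const merged = mergeLogs(
    [{ timestamp: "2024-01-01T00:00:02Z", message: "main" }],
    [{ timestamp: "2024-01-01T00:00:05Z", message: "hosting" }]
);
```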
-
-
// Get errors (protected)
-
.get('/errors', async ({ query, cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
const filter: any = {}
-
-
if (query.service) filter.service = query.service
-
if (query.limit) filter.limit = parseInt(query.limit as string)
-
-
// Get errors from main app
-
const mainErrors = errorTracker.getErrors(filter)
-
-
// Get errors from hosting service
-
let hostingErrors: any[] = []
-
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
-
const params = new URLSearchParams()
-
if (query.service) params.append('service', query.service as string)
-
params.append('limit', String(filter.limit || 100))
-
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/errors?${params}`)
-
if (response.ok) {
-
const data = await response.json()
-
hostingErrors = data.errors
-
}
-
} catch (err) {
-
// Hosting service might not be running
-
}
-
-
// Merge and sort by last seen
-
const allErrors = [...mainErrors, ...hostingErrors].sort((a, b) =>
-
new Date(b.lastSeen).getTime() - new Date(a.lastSeen).getTime()
-
)
-
-
return { errors: allErrors.slice(0, filter.limit || 100) }
-
})
-
-
// Get metrics (protected)
-
.get('/metrics', async ({ query, cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
const timeWindow = query.timeWindow
-
? parseInt(query.timeWindow as string)
-
: 3600000 // 1 hour default
-
-
const mainAppStats = metricsCollector.getStats('main-app', timeWindow)
-
const overallStats = metricsCollector.getStats(undefined, timeWindow)
-
-
// Get hosting service stats from its own endpoint
-
let hostingServiceStats = {
-
totalRequests: 0,
-
avgDuration: 0,
-
p50Duration: 0,
-
p95Duration: 0,
-
p99Duration: 0,
-
errorRate: 0,
-
requestsPerMinute: 0
-
}
-
-
try {
-
const hostingPort = process.env.HOSTING_PORT || '3001'
-
const response = await fetch(`http://localhost:${hostingPort}/__internal__/observability/metrics?timeWindow=${timeWindow}`)
-
if (response.ok) {
-
const data = await response.json()
-
hostingServiceStats = data.stats
-
}
-
} catch (err) {
-
// Hosting service might not be running
-
}
-
-
return {
-
overall: overallStats,
-
mainApp: mainAppStats,
-
hostingService: hostingServiceStats,
-
timeWindow
-
}
-
})
-
-
// Get database stats (protected)
-
.get('/database', async ({ cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
try {
-
// Get total counts
-
const allSitesResult = await db`SELECT COUNT(*) as count FROM sites`
-
const wispSubdomainsResult = await db`SELECT COUNT(*) as count FROM domains WHERE domain LIKE '%.wisp.place'`
-
const customDomainsResult = await db`SELECT COUNT(*) as count FROM custom_domains WHERE verified = true`
-
-
// Get recent sites (including those without domains)
-
const recentSites = await db`
-
SELECT
-
s.did,
-
s.rkey,
-
s.display_name,
-
s.created_at,
-
d.domain as subdomain
-
FROM sites s
-
LEFT JOIN domains d ON s.did = d.did AND s.rkey = d.rkey AND d.domain LIKE '%.wisp.place'
-
ORDER BY s.created_at DESC
-
LIMIT 10
-
`
-
-
// Get recent domains
-
const recentDomains = await db`SELECT domain, did, rkey, verified, created_at FROM custom_domains ORDER BY created_at DESC LIMIT 10`
-
-
return {
-
stats: {
-
totalSites: allSitesResult[0].count,
-
totalWispSubdomains: wispSubdomainsResult[0].count,
-
totalCustomDomains: customDomainsResult[0].count
-
},
-
recentSites: recentSites,
-
recentDomains: recentDomains
-
}
-
} catch (error) {
-
set.status = 500
-
return {
-
error: 'Failed to fetch database stats',
-
message: error instanceof Error ? error.message : String(error)
-
}
-
}
-
})
-
-
// Get sites listing (protected)
-
.get('/sites', async ({ query, cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
const limit = query.limit ? parseInt(query.limit as string) : 50
-
const offset = query.offset ? parseInt(query.offset as string) : 0
-
-
try {
-
const sites = await db`
-
SELECT
-
s.did,
-
s.rkey,
-
s.display_name,
-
s.created_at,
-
d.domain as subdomain
-
FROM sites s
-
LEFT JOIN domains d ON s.did = d.did AND s.rkey = d.rkey AND d.domain LIKE '%.wisp.place'
-
ORDER BY s.created_at DESC
-
LIMIT ${limit} OFFSET ${offset}
-
`
-
-
const customDomains = await db`
-
SELECT
-
domain,
-
did,
-
rkey,
-
verified,
-
created_at
-
FROM custom_domains
-
ORDER BY created_at DESC
-
LIMIT ${limit} OFFSET ${offset}
-
`
-
-
return {
-
sites: sites,
-
customDomains: customDomains
-
}
-
} catch (error) {
-
set.status = 500
-
return {
-
error: 'Failed to fetch sites',
-
message: error instanceof Error ? error.message : String(error)
-
}
-
}
-
})
-
-
// Get system health (protected)
-
.get('/health', ({ cookie, set }) => {
-
const check = requireAdmin({ cookie, set })
-
if (check) return check
-
-
const uptime = process.uptime()
-
const memory = process.memoryUsage()
-
-
return {
-
uptime: Math.floor(uptime),
-
memory: {
-
heapUsed: Math.round(memory.heapUsed / 1024 / 1024), // MB
-
heapTotal: Math.round(memory.heapTotal / 1024 / 1024), // MB
-
rss: Math.round(memory.rss / 1024 / 1024) // MB
-
},
-
timestamp: new Date().toISOString()
-
}
-
})
-
-112
src/routes/auth.ts
···
-
import { Elysia } from 'elysia'
-
import { NodeOAuthClient } from '@atproto/oauth-client-node'
-
import { getSitesByDid, getDomainByDid } from '../lib/db'
-
import { syncSitesFromPDS } from '../lib/sync-sites'
-
import { authenticateRequest } from '../lib/wisp-auth'
-
import { logger } from '../lib/observability'
-
-
export const authRoutes = (client: NodeOAuthClient) => new Elysia()
-
.post('/api/auth/signin', async (c) => {
-
let handle = 'unknown'
-
try {
-
const body = c.body as { handle: string }
-
handle = body.handle
-
logger.info('Sign-in attempt', { handle })
-
const state = crypto.randomUUID()
-
const url = await client.authorize(handle, { state })
-
logger.info('Authorization URL generated', { handle })
-
return { url: url.toString() }
-
} catch (err) {
-
logger.error('Signin error', err, { handle })
-
console.error('[Auth] Full error:', err)
-
return { error: 'Authentication failed', details: err instanceof Error ? err.message : String(err) }
-
}
-
})
-
.get('/api/auth/callback', async (c) => {
-
try {
-
const params = new URLSearchParams(c.query)
-
-
// client.callback() validates the state parameter internally
-
// It will throw an error if state validation fails (CSRF protection)
-
const { session } = await client.callback(params)
-
-
if (!session) {
-
logger.error('[Auth] OAuth callback failed: no session returned')
-
return c.redirect('/?error=auth_failed')
-
}
-
-
const cookieSession = c.cookie
-
cookieSession.did.value = session.did
-
-
// Sync sites from PDS to database cache
-
logger.debug('[Auth] Syncing sites from PDS for', session.did)
-
try {
-
const syncResult = await syncSitesFromPDS(session.did, session)
-
logger.debug(`[Auth] Sync complete: ${syncResult.synced} sites synced`)
-
if (syncResult.errors.length > 0) {
-
logger.debug('[Auth] Sync errors:', syncResult.errors)
-
}
-
} catch (err) {
-
logger.error('[Auth] Failed to sync sites', err)
-
// Don't fail auth if sync fails, just log it
-
}
-
-
// Check if user has any sites or domain
-
const sites = await getSitesByDid(session.did)
-
const domain = await getDomainByDid(session.did)
-
-
// If no sites and no domain, redirect to onboarding
-
if (sites.length === 0 && !domain) {
-
return c.redirect('/onboarding')
-
}
-
-
return c.redirect('/editor')
-
} catch (err) {
-
// This catches state validation failures and other OAuth errors
-
logger.error('[Auth] OAuth callback error', err)
-
return c.redirect('/?error=auth_failed')
-
}
-
})
-
.post('/api/auth/logout', async (c) => {
-
try {
-
const cookieSession = c.cookie
-
const did = cookieSession.did?.value
-
-
// Clear the session cookie
-
cookieSession.did.value = ''
-
cookieSession.did.maxAge = 0
-
-
// If we have a DID, try to revoke the OAuth session
-
if (did && typeof did === 'string') {
-
try {
-
await client.revoke(did)
-
logger.debug('[Auth] Revoked OAuth session for', did)
-
} catch (err) {
-
logger.error('[Auth] Failed to revoke session', err)
-
// Continue with logout even if revoke fails
-
}
-
}
-
-
return { success: true }
-
} catch (err) {
-
logger.error('[Auth] Logout error', err)
-
return { error: 'Logout failed' }
-
}
-
})
-
.get('/api/auth/status', async (c) => {
-
try {
-
const auth = await authenticateRequest(client, c.cookie)
-
-
if (!auth) {
-
return { authenticated: false }
-
}
-
-
return {
-
authenticated: true,
-
did: auth.did
-
}
-
} catch (err) {
-
logger.error('[Auth] Status check error', err)
-
return { authenticated: false }
-
}
-
})
-346
src/routes/domain.ts
···
-
import { Elysia } from 'elysia'
-
import { requireAuth, type AuthenticatedContext } from '../lib/wisp-auth'
-
import { NodeOAuthClient } from '@atproto/oauth-client-node'
-
import { Agent } from '@atproto/api'
-
import {
-
claimDomain,
-
getDomainByDid,
-
isDomainAvailable,
-
isDomainRegistered,
-
isValidHandle,
-
toDomain,
-
updateDomain,
-
getCustomDomainInfo,
-
getCustomDomainById,
-
claimCustomDomain,
-
deleteCustomDomain,
-
updateCustomDomainVerification,
-
updateWispDomainSite,
-
updateCustomDomainRkey
-
} from '../lib/db'
-
import { createHash } from 'crypto'
-
import { verifyCustomDomain } from '../lib/dns-verify'
-
import { logger } from '../lib/logger'
-
-
export const domainRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/api/domain' })
-
// Public endpoints (no auth required)
-
.get('/check', async ({ query }) => {
-
try {
-
const handle = (query.handle || "")
-
.trim()
-
.toLowerCase();
-
-
if (!isValidHandle(handle)) {
-
return {
-
available: false,
-
reason: "invalid"
-
};
-
}
-
-
const available = await isDomainAvailable(handle);
-
return {
-
available,
-
domain: toDomain(handle)
-
};
-
} catch (err) {
-
logger.error('[Domain] Check error', err);
-
return {
-
available: false
-
};
-
}
-
})
-
.get('/registered', async ({ query, set }) => {
-
try {
-
const domain = (query.domain || "").trim().toLowerCase();
-
-
if (!domain) {
-
set.status = 400;
-
return { error: 'Domain parameter required' };
-
}
-
-
const result = await isDomainRegistered(domain);
-
-
// For Caddy on-demand TLS: 200 = allow, 404 = deny
-
if (result.registered) {
-
set.status = 200;
-
return result;
-
} else {
-
set.status = 404;
-
return { registered: false };
-
}
-
} catch (err) {
-
logger.error('[Domain] Registered check error', err);
-
set.status = 500;
-
return { error: 'Failed to check domain' };
-
}
-
})
-
// Authenticated endpoints (require auth)
-
.derive(async ({ cookie }) => {
-
const auth = await requireAuth(client, cookie)
-
return { auth }
-
})
-
.post('/claim', async ({ body, auth }) => {
-
try {
-
const { handle } = body as { handle?: string };
-
const normalizedHandle = (handle || "").trim().toLowerCase();
-
-
if (!isValidHandle(normalizedHandle)) {
-
throw new Error("Invalid handle");
-
}
-
-
// ensure user hasn't already claimed
-
const existing = await getDomainByDid(auth.did);
-
if (existing) {
-
throw new Error("Already claimed");
-
}
-
-
// claim in DB
-
let domain: string;
-
try {
-
domain = await claimDomain(auth.did, normalizedHandle);
-
} catch (err) {
-
throw new Error("Handle taken");
-
}
-
-
// write place.wisp.domain record rkey = self
-
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
-
await agent.com.atproto.repo.putRecord({
-
repo: auth.did,
-
collection: "place.wisp.domain",
-
rkey: "self",
-
record: {
-
$type: "place.wisp.domain",
-
domain,
-
createdAt: new Date().toISOString(),
-
} as any,
-
validate: false,
-
});
-
-
return { success: true, domain };
-
} catch (err) {
-
logger.error('[Domain] Claim error', err);
-
throw new Error(`Failed to claim: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
-
.post('/update', async ({ body, auth }) => {
-
try {
-
const { handle } = body as { handle?: string };
-
const normalizedHandle = (handle || "").trim().toLowerCase();
-
-
if (!isValidHandle(normalizedHandle)) {
-
throw new Error("Invalid handle");
-
}
-
-
const desiredDomain = toDomain(normalizedHandle);
-
const current = await getDomainByDid(auth.did);
-
-
if (current === desiredDomain) {
-
return { success: true, domain: current };
-
}
-
-
let domain: string;
-
try {
-
domain = await updateDomain(auth.did, normalizedHandle);
-
} catch (err) {
-
throw new Error("Handle taken");
-
}
-
-
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init));
-
await agent.com.atproto.repo.putRecord({
-
repo: auth.did,
-
collection: "place.wisp.domain",
-
rkey: "self",
-
record: {
-
$type: "place.wisp.domain",
-
domain,
-
createdAt: new Date().toISOString(),
-
} as any,
-
validate: false,
-
});
-
-
return { success: true, domain };
-
} catch (err) {
-
logger.error('[Domain] Update error', err);
-
throw new Error(`Failed to update: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
-
.post('/custom/add', async ({ body, auth }) => {
-
try {
-
const { domain } = body as { domain: string };
-
const domainLower = domain.toLowerCase().trim();
-
-
// Enhanced domain validation
-
// 1. Length check (RFC 1035: labels 1-63 chars, total max 253)
-
if (!domainLower || domainLower.length < 3 || domainLower.length > 253) {
-
throw new Error('Invalid domain: must be 3-253 characters');
-
}
-
-
// 2. Basic format validation
-
// - Must contain at least one dot (require TLD)
-
// - Valid characters: a-z, 0-9, hyphen, dot
-
// - No consecutive dots, no leading/trailing dots or hyphens
-
const domainPattern = /^(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z]{2,}$/;
-
if (!domainPattern.test(domainLower)) {
-
throw new Error('Invalid domain format');
-
}
-
-
// 3. Validate each label (part between dots)
-
const labels = domainLower.split('.');
-
for (const label of labels) {
-
if (label.length === 0 || label.length > 63) {
-
throw new Error('Invalid domain: label length must be 1-63 characters');
-
}
-
if (label.startsWith('-') || label.endsWith('-')) {
-
throw new Error('Invalid domain: labels cannot start or end with hyphen');
-
}
-
}
-
-
// 4. TLD validation (require valid TLD, block single-char TLDs and numeric TLDs)
-
const tld = labels[labels.length - 1];
-
if (tld.length < 2 || /^\d+$/.test(tld)) {
-
throw new Error('Invalid domain: TLD must be at least 2 characters and not all numeric');
-
}
-
-
// 5. Homograph attack protection - block domains with mixed scripts or confusables
-
// Block non-ASCII characters (Punycode domains should be pre-converted)
-
if (!/^[a-z0-9.-]+$/.test(domainLower)) {
-
throw new Error('Invalid domain: only ASCII alphanumeric, dots, and hyphens allowed');
-
}
-
-
// 6. Block localhost, internal IPs, and reserved domains
-
const blockedDomains = [
-
'localhost',
-
'example.com',
-
'example.org',
-
'example.net',
-
'test',
-
'invalid',
-
'local'
-
];
-
const blockedPatterns = [
-
/^(?:10|127|172\.(?:1[6-9]|2[0-9]|3[01])|192\.168)\./, // Private IPs
-
/^(?:\d{1,3}\.){3}\d{1,3}$/, // Any IP address
-
];
-
-
if (blockedDomains.includes(domainLower)) {
-
throw new Error('Invalid domain: reserved or blocked domain');
-
}
-
-
for (const pattern of blockedPatterns) {
-
if (pattern.test(domainLower)) {
-
throw new Error('Invalid domain: IP addresses not allowed');
-
}
-
}
-
-
// Check if already exists
-
const existing = await getCustomDomainInfo(domainLower);
-
if (existing) {
-
throw new Error('Domain already claimed');
-
}
-
-
// Create hash for ID
-
const hash = createHash('sha256').update(`${auth.did}:${domainLower}`).digest('hex').substring(0, 16);
-
-
// Store in database only
-
await claimCustomDomain(auth.did, domainLower, hash);
-
-
return {
-
success: true,
-
id: hash,
-
domain: domainLower,
-
verified: false
-
};
-
} catch (err) {
-
logger.error('[Domain] Custom domain add error', err);
-
throw new Error(`Failed to add domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
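The format checks in the handler above reduce to the following sketch (same RFC 1035-style pattern; the reserved-domain and IP blocklists are omitted, so this covers format only):

```typescript
// Lowercase labels of letters/digits/hyphens, at least one dot, alpha TLD.
const domainPattern = /^(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+[a-z]{2,}$/;

function isPlausibleDomain(input: string): boolean {
    const domain = input.toLowerCase().trim();
    // RFC 1035: total length capped at 253 characters.
    return domain.length >= 3 && domain.length <= 253 && domainPattern.test(domain);
}
```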
-
.post('/custom/verify', async ({ body, auth }) => {
-
try {
-
const { id } = body as { id: string };
-
-
// Get domain from database
-
const domainInfo = await getCustomDomainById(id);
-
if (!domainInfo) {
-
throw new Error('Domain not found');
-
}
-
-
// Verify DNS records (TXT + CNAME)
-
logger.debug(`[Domain] Verifying custom domain: ${domainInfo.domain}`);
-
const result = await verifyCustomDomain(domainInfo.domain, auth.did, id);
-
-
// Update verification status in database
-
await updateCustomDomainVerification(id, result.verified);
-
-
return {
-
success: true,
-
verified: result.verified,
-
error: result.error,
-
found: result.found
-
};
-
} catch (err) {
-
logger.error('[Domain] Custom domain verify error', err);
-
throw new Error(`Failed to verify domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
-
.delete('/custom/:id', async ({ params, auth }) => {
-
try {
-
const { id } = params;
-
-
// Verify ownership before deleting
-
const domainInfo = await getCustomDomainById(id);
-
if (!domainInfo) {
-
throw new Error('Domain not found');
-
}
-
-
if (domainInfo.did !== auth.did) {
-
throw new Error('Unauthorized: You do not own this domain');
-
}
-
-
// Delete from database
-
await deleteCustomDomain(id);
-
-
return { success: true };
-
} catch (err) {
-
logger.error('[Domain] Custom domain delete error', err);
-
throw new Error(`Failed to delete domain: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
-
.post('/wisp/map-site', async ({ body, auth }) => {
-
try {
-
const { siteRkey } = body as { siteRkey: string | null };
-
-
// Update wisp.place domain to point to this site
-
await updateWispDomainSite(auth.did, siteRkey);
-
-
return { success: true };
-
} catch (err) {
-
logger.error('[Domain] Wisp domain map error', err);
-
throw new Error(`Failed to map site: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
})
-
.post('/custom/:id/map-site', async ({ params, body, auth }) => {
-
try {
-
const { id } = params;
-
const { siteRkey } = body as { siteRkey: string | null };
-
-
// Verify ownership before updating
-
const domainInfo = await getCustomDomainById(id);
-
if (!domainInfo) {
-
throw new Error('Domain not found');
-
}
-
-
if (domainInfo.did !== auth.did) {
-
throw new Error('Unauthorized: You do not own this domain');
-
}
-
-
// Update custom domain to point to this site
-
await updateCustomDomainRkey(id, siteRkey);
-
-
return { success: true };
-
} catch (err) {
-
logger.error('[Domain] Custom domain map error', err);
-
throw new Error(`Failed to map site: ${err instanceof Error ? err.message : 'Unknown error'}`);
-
}
-
});
-60
src/routes/site.ts
···
-
import { Elysia } from 'elysia'
-
import { requireAuth } from '../lib/wisp-auth'
-
import { NodeOAuthClient } from '@atproto/oauth-client-node'
-
import { Agent } from '@atproto/api'
-
import { deleteSite } from '../lib/db'
-
import { logger } from '../lib/logger'
-
-
export const siteRoutes = (client: NodeOAuthClient) =>
-
new Elysia({ prefix: '/api/site' })
-
.derive(async ({ cookie }) => {
-
const auth = await requireAuth(client, cookie)
-
return { auth }
-
})
-
.delete('/:rkey', async ({ params, auth }) => {
-
const { rkey } = params
-
-
if (!rkey) {
-
return {
-
success: false,
-
error: 'Site rkey is required'
-
}
-
}
-
-
try {
-
// Create agent with OAuth session
-
const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
-
-
// Delete the record from AT Protocol
-
try {
-
await agent.com.atproto.repo.deleteRecord({
-
repo: auth.did,
-
collection: 'place.wisp.fs',
-
rkey: rkey
-
})
-
logger.info(`[Site] Deleted site ${rkey} from PDS for ${auth.did}`)
-
} catch (err) {
-
logger.error(`[Site] Failed to delete site ${rkey} from PDS`, err)
-
throw new Error('Failed to delete site from AT Protocol')
-
}
-
-
// Delete from database
-
const result = await deleteSite(auth.did, rkey)
-
if (!result.success) {
-
throw new Error('Failed to delete site from database')
-
}
-
-
logger.info(`[Site] Successfully deleted site ${rkey} for ${auth.did}`)
-
-
return {
-
success: true,
-
message: 'Site deleted successfully'
-
}
-
} catch (err) {
-
logger.error('[Site] Delete error', err)
-
return {
-
success: false,
-
error: err instanceof Error ? err.message : 'Failed to delete site'
-
}
-
}
-
})
-100
src/routes/user.ts
···
-import { Elysia } from 'elysia'
-import { requireAuth } from '../lib/wisp-auth'
-import { NodeOAuthClient } from '@atproto/oauth-client-node'
-import { Agent } from '@atproto/api'
-import { getSitesByDid, getDomainByDid, getCustomDomainsByDid, getWispDomainInfo } from '../lib/db'
-import { syncSitesFromPDS } from '../lib/sync-sites'
-import { logger } from '../lib/logger'
-
-export const userRoutes = (client: NodeOAuthClient) =>
-    new Elysia({ prefix: '/api/user' })
-        .derive(async ({ cookie }) => {
-            const auth = await requireAuth(client, cookie)
-            return { auth }
-        })
-        .get('/status', async ({ auth }) => {
-            try {
-                // Check if user has any sites
-                const sites = await getSitesByDid(auth.did)
-
-                // Check if user has claimed a domain
-                const domain = await getDomainByDid(auth.did)
-
-                return {
-                    did: auth.did,
-                    hasSites: sites.length > 0,
-                    hasDomain: !!domain,
-                    domain: domain || null,
-                    sitesCount: sites.length
-                }
-            } catch (err) {
-                logger.error('[User] Status error', err)
-                throw new Error('Failed to get user status')
-            }
-        })
-        .get('/info', async ({ auth }) => {
-            try {
-                // Get user's handle from AT Protocol
-                const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
-
-                let handle = 'unknown'
-                try {
-                    const profile = await agent.getProfile({ actor: auth.did })
-                    handle = profile.data.handle
-                } catch (err) {
-                    logger.error('[User] Failed to fetch profile', err)
-                }
-
-                return {
-                    did: auth.did,
-                    handle
-                }
-            } catch (err) {
-                logger.error('[User] Info error', err)
-                throw new Error('Failed to get user info')
-            }
-        })
-        .get('/sites', async ({ auth }) => {
-            try {
-                const sites = await getSitesByDid(auth.did)
-                return { sites }
-            } catch (err) {
-                logger.error('[User] Sites error', err)
-                throw new Error('Failed to get sites')
-            }
-        })
-        .get('/domains', async ({ auth }) => {
-            try {
-                // Get wisp.place subdomain with mapping
-                const wispDomainInfo = await getWispDomainInfo(auth.did)
-
-                // Get custom domains
-                const customDomains = await getCustomDomainsByDid(auth.did)
-
-                return {
-                    wispDomain: wispDomainInfo ? {
-                        domain: wispDomainInfo.domain,
-                        rkey: wispDomainInfo.rkey || null
-                    } : null,
-                    customDomains
-                }
-            } catch (err) {
-                logger.error('[User] Domains error', err)
-                throw new Error('Failed to get domains')
-            }
-        })
-        .post('/sync', async ({ auth }) => {
-            try {
-                logger.debug('[User] Manual sync requested for', auth.did)
-                const result = await syncSitesFromPDS(auth.did, auth.session)
-
-                return {
-                    success: true,
-                    synced: result.synced,
-                    errors: result.errors
-                }
-            } catch (err) {
-                logger.error('[User] Sync error', err)
-                throw new Error('Failed to sync sites')
-            }
-        })
-307
src/routes/wisp.ts
···
-import { Elysia } from 'elysia'
-import { requireAuth, type AuthenticatedContext } from '../lib/wisp-auth'
-import { NodeOAuthClient } from '@atproto/oauth-client-node'
-import { Agent } from '@atproto/api'
-import {
-    type UploadedFile,
-    type FileUploadResult,
-    processUploadedFiles,
-    createManifest,
-    updateFileBlobs,
-    shouldCompressFile,
-    compressFile
-} from '../lib/wisp-utils'
-import { upsertSite } from '../lib/db'
-import { logger } from '../lib/observability'
-import { validateRecord } from '../lexicons/types/place/wisp/fs'
-import { MAX_SITE_SIZE, MAX_FILE_SIZE, MAX_FILE_COUNT } from '../lib/constants'
-
-function isValidSiteName(siteName: string): boolean {
-    if (!siteName || typeof siteName !== 'string') return false;
-
-    // Length check (AT Protocol rkey limit)
-    if (siteName.length < 1 || siteName.length > 512) return false;
-
-    // Check for path traversal
-    if (siteName === '.' || siteName === '..') return false;
-    if (siteName.includes('/') || siteName.includes('\\')) return false;
-    if (siteName.includes('\0')) return false;
-
-    // AT Protocol rkey format: alphanumeric, dots, dashes, underscores, tildes, colons
-    // Based on NSID format rules
-    const validRkeyPattern = /^[a-zA-Z0-9._~:-]+$/;
-    if (!validRkeyPattern.test(siteName)) return false;
-
-    return true;
-}
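The removed `isValidSiteName` guard doubles as the record-key check, since the site name is used directly as the `place.wisp.fs` rkey. A standalone version of the same rule (1-512 characters from `[a-zA-Z0-9._~:-]`, with `.` and `..` reserved), shown here as a hypothetical `isValidRkey` helper:

```typescript
// Mirrors the deleted guard: AT Protocol rkey character set, length bounds,
// and the two reserved names. The explicit '/' and '\0' checks in the
// original are redundant with this character class, but harmless.
function isValidRkey(name: string): boolean {
    if (typeof name !== 'string') return false
    if (name.length < 1 || name.length > 512) return false
    if (name === '.' || name === '..') return false
    return /^[a-zA-Z0-9._~:-]+$/.test(name)
}
```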
-
-export const wispRoutes = (client: NodeOAuthClient) =>
-    new Elysia({ prefix: '/wisp' })
-        .derive(async ({ cookie }) => {
-            const auth = await requireAuth(client, cookie)
-            return { auth }
-        })
-        .post(
-            '/upload-files',
-            async ({ body, auth }) => {
-                const { siteName, files } = body as {
-                    siteName: string;
-                    files: File | File[]
-                };
-
-                try {
-                    if (!siteName) {
-                        throw new Error('Site name is required')
-                    }
-
-                    if (!isValidSiteName(siteName)) {
-                        throw new Error('Invalid site name: must be 1-512 characters and contain only alphanumeric, dots, dashes, underscores, tildes, and colons')
-                    }
-
-                    // Check if files were provided
-                    const hasFiles = files && (Array.isArray(files) ? files.length > 0 : !!files);
-
-                    if (!hasFiles) {
-                        // Create agent with OAuth session
-                        const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
-
-                        // Create empty manifest
-                        const emptyManifest = {
-                            $type: 'place.wisp.fs',
-                            site: siteName,
-                            root: {
-                                type: 'directory',
-                                entries: []
-                            },
-                            fileCount: 0,
-                            createdAt: new Date().toISOString()
-                        };
-
-                        // Validate the manifest
-                        const validationResult = validateRecord(emptyManifest);
-                        if (!validationResult.success) {
-                            throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
-                        }
-
-                        // Use site name as rkey
-                        const rkey = siteName;
-
-                        const record = await agent.com.atproto.repo.putRecord({
-                            repo: auth.did,
-                            collection: 'place.wisp.fs',
-                            rkey: rkey,
-                            record: emptyManifest
-                        });
-
-                        await upsertSite(auth.did, rkey, siteName);
-
-                        return {
-                            success: true,
-                            uri: record.data.uri,
-                            cid: record.data.cid,
-                            fileCount: 0,
-                            siteName
-                        };
-                    }
-
-                    // Create agent with OAuth session
-                    const agent = new Agent((url, init) => auth.session.fetchHandler(url, init))
-
-                    // Convert File objects to UploadedFile format
-                    // Elysia gives us File objects directly, handle both single file and array
-                    const fileArray = Array.isArray(files) ? files : [files];
-                    const uploadedFiles: UploadedFile[] = [];
-                    const skippedFiles: Array<{ name: string; reason: string }> = [];
-
-                    for (let i = 0; i < fileArray.length; i++) {
-                        const file = fileArray[i];
-
-                        // Skip files that are too large (limit to 100MB per file)
-                        const maxSize = MAX_FILE_SIZE; // 100MB
-                        if (file.size > maxSize) {
-                            skippedFiles.push({
-                                name: file.name,
-                                reason: `file too large (${(file.size / 1024 / 1024).toFixed(2)}MB, max 100MB)`
-                            });
-                            continue;
-                        }
-
-                        const arrayBuffer = await file.arrayBuffer();
-                        const originalContent = Buffer.from(arrayBuffer);
-                        const originalMimeType = file.type || 'application/octet-stream';
-
-                        // Compress and base64 encode ALL files
-                        const compressedContent = compressFile(originalContent);
-                        // Base64 encode the gzipped content to prevent PDS content sniffing
-                        const base64Content = Buffer.from(compressedContent.toString('base64'), 'utf-8');
-                        const compressionRatio = (compressedContent.length / originalContent.length * 100).toFixed(1);
-                        logger.info(`Compressing ${file.name}: ${originalContent.length} -> ${compressedContent.length} bytes (${compressionRatio}%), base64: ${base64Content.length} bytes`);
-
-                        uploadedFiles.push({
-                            name: file.name,
-                            content: base64Content,
-                            mimeType: originalMimeType,
-                            size: base64Content.length,
-                            compressed: true,
-                            originalMimeType
-                        });
-                    }
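The per-file encoding in the loop above is gzip followed by base64: every file is compressed, then the gzipped bytes are base64-encoded so the PDS receives opaque ASCII and cannot content-sniff (or run an image pipeline over) what is actually HTML or CSS. A self-contained sketch of that round trip using Node's built-in `zlib` (function names here are illustrative; the repo's own helper is `compressFile` in `wisp-utils`):

```typescript
import { gzipSync, gunzipSync } from 'node:zlib'

// Upload side: gzip, then store the base64 text as the blob's UTF-8 bytes.
function encodeForUpload(original: Buffer): Buffer {
    const gzipped = gzipSync(original)
    return Buffer.from(gzipped.toString('base64'), 'utf-8')
}

// Serving side reverses it: base64-decode, then gunzip.
function decodeFromBlob(blob: Buffer): Buffer {
    return gunzipSync(Buffer.from(blob.toString('utf-8'), 'base64'))
}
```

Note the size accounting consequence: `size` is taken from the base64 output, which is roughly 4/3 of the gzipped length, so the 300MB site limit below is checked against the encoded blobs, not the original files.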
-
-                    // Check total size limit (300MB)
-                    const totalSize = uploadedFiles.reduce((sum, file) => sum + file.size, 0);
-                    const maxTotalSize = MAX_SITE_SIZE; // 300MB
-
-                    if (totalSize > maxTotalSize) {
-                        throw new Error(`Total upload size ${(totalSize / 1024 / 1024).toFixed(2)}MB exceeds 300MB limit`);
-                    }
-
-                    // Check file count limit (2000 files)
-                    if (uploadedFiles.length > MAX_FILE_COUNT) {
-                        throw new Error(`File count ${uploadedFiles.length} exceeds ${MAX_FILE_COUNT} files limit`);
-                    }
-
-                    if (uploadedFiles.length === 0) {
-                        // Create empty manifest
-                        const emptyManifest = {
-                            $type: 'place.wisp.fs',
-                            site: siteName,
-                            root: {
-                                type: 'directory',
-                                entries: []
-                            },
-                            fileCount: 0,
-                            createdAt: new Date().toISOString()
-                        };
-
-                        // Validate the manifest
-                        const validationResult = validateRecord(emptyManifest);
-                        if (!validationResult.success) {
-                            throw new Error(`Invalid manifest: ${validationResult.error?.message || 'Validation failed'}`);
-                        }
-
-                        // Use site name as rkey
-                        const rkey = siteName;
-
-                        const record = await agent.com.atproto.repo.putRecord({
-                            repo: auth.did,
-                            collection: 'place.wisp.fs',
-                            rkey: rkey,
-                            record: emptyManifest
-                        });
-
-                        await upsertSite(auth.did, rkey, siteName);
-
-                        return {
-                            success: true,
-                            uri: record.data.uri,
-                            cid: record.data.cid,
-                            fileCount: 0,
-                            siteName,
-                            skippedFiles,
-                            message: 'Site created but no valid web files were found to upload'
-                        };
-                    }
-
-                    // Process files into directory structure
-                    const { directory, fileCount } = processUploadedFiles(uploadedFiles);
-
-                    // Upload files as blobs in parallel
-                    // For compressed files, we upload as octet-stream and store the original MIME type in metadata
-                    // For text/html files, we also use octet-stream as a workaround for PDS image pipeline issues
-                    const uploadPromises = uploadedFiles.map(async (file, i) => {
-                        try {
-                            // If compressed, always upload as octet-stream
-                            // Otherwise, workaround: PDS incorrectly processes text/html through image pipeline
-                            const uploadMimeType = file.compressed || file.mimeType.startsWith('text/html')
-                                ? 'application/octet-stream'
-                                : file.mimeType;
-
-                            const compressionInfo = file.compressed ? ' (gzipped)' : '';
-                            logger.info(`[File Upload] Uploading file: ${file.name} (original: ${file.mimeType}, sending as: ${uploadMimeType}, ${file.size} bytes${compressionInfo})`);
-
-                            const uploadResult = await agent.com.atproto.repo.uploadBlob(
-                                file.content,
-                                {
-                                    encoding: uploadMimeType
-                                }
-                            );
-
-                            const returnedBlobRef = uploadResult.data.blob;
-
-                            // Use the blob ref exactly as returned from PDS
-                            return {
-                                result: {
-                                    hash: returnedBlobRef.ref.toString(),
-                                    blobRef: returnedBlobRef,
-                                    ...(file.compressed && {
-                                        encoding: 'gzip' as const,
-                                        mimeType: file.originalMimeType || file.mimeType,
-                                        base64: true
-                                    })
-                                },
-                                filePath: file.name,
-                                sentMimeType: file.mimeType,
-                                returnedMimeType: returnedBlobRef.mimeType
-                            };
-                        } catch (uploadError) {
-                            logger.error('Upload failed for file', uploadError);
-                            throw uploadError;
-                        }
-                    });
-
-                    // Wait for all uploads to complete
-                    const uploadedBlobs = await Promise.all(uploadPromises);
-
-                    // Extract results and file paths in correct order
-                    const uploadResults: FileUploadResult[] = uploadedBlobs.map(blob => blob.result);
-                    const filePaths: string[] = uploadedBlobs.map(blob => blob.filePath);
-
-                    // Update directory with file blobs
-                    const updatedDirectory = updateFileBlobs(directory, uploadResults, filePaths);
-
-                    // Create manifest
-                    const manifest = createManifest(siteName, updatedDirectory, fileCount);
-
-                    // Use site name as rkey
-                    const rkey = siteName;
-
-                    let record;
-                    try {
-                        record = await agent.com.atproto.repo.putRecord({
-                            repo: auth.did,
-                            collection: 'place.wisp.fs',
-                            rkey: rkey,
-                            record: manifest
-                        });
-                    } catch (putRecordError: any) {
-                        logger.error('Failed to create record on PDS', putRecordError);
-                        throw putRecordError;
-                    }
-
-                    // Store site in database cache
-                    await upsertSite(auth.did, rkey, siteName);
-
-                    const result = {
-                        success: true,
-                        uri: record.data.uri,
-                        cid: record.data.cid,
-                        fileCount,
-                        siteName,
-                        skippedFiles,
-                        uploadedCount: uploadedFiles.length
-                    };
-
-                    return result;
-                } catch (error) {
-                    logger.error('Upload error', error, {
-                        message: error instanceof Error ? error.message : 'Unknown error',
-                        name: error instanceof Error ? error.name : undefined
-                    });
-                    throw new Error(`Failed to upload files: ${error instanceof Error ? error.message : 'Unknown error'}`);
-                }
-            }
-        )
-40
testDeploy/index.html
···
-<!DOCTYPE html>
-<html lang="en">
-<head>
-    <meta charset="UTF-8">
-    <meta name="viewport" content="width=device-width, initial-scale=1.0">
-    <title>Wisp.place Test Site</title>
-    <style>
-        body {
-            font-family: system-ui, -apple-system, sans-serif;
-            max-width: 800px;
-            margin: 4rem auto;
-            padding: 0 2rem;
-            line-height: 1.6;
-        }
-        h1 {
-            color: #333;
-        }
-        .info {
-            background: #f0f0f0;
-            padding: 1rem;
-            border-radius: 8px;
-            margin: 2rem 0;
-        }
-    </style>
-</head>
-<body>
-    <h1>Hello from Wisp.place!</h1>
-    <p>This is a test deployment using the wisp-cli and Tangled Spindles CI/CD.</p>
-
-    <div class="info">
-        <h2>About this deployment</h2>
-        <p>This site was deployed to the AT Protocol using:</p>
-        <ul>
-            <li>Wisp.place CLI (Rust)</li>
-            <li>Tangled Spindles CI/CD</li>
-            <li>AT Protocol for decentralized hosting</li>
-        </ul>
-    </div>
-</body>
-</html>
+1 -1
tsconfig.json
···
/* Modules */
"module": "ES2022" /* Specify what module code is generated. */,
// "rootDir": "./", /* Specify the root folder within your source files. */
-"moduleResolution": "node" /* Specify how TypeScript looks up a file from a given module specifier. */,
+"moduleResolution": "bundler" /* Specify how TypeScript looks up a file from a given module specifier. */,
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
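The `node` → `bundler` switch aligns type checking with how Bun (used throughout this repo) actually resolves modules: `moduleResolution: "bundler"` honors `package.json` `exports` conditions and allows extensionless relative imports, which the legacy `node` (node10) mode rejects. A minimal compatible fragment, assuming no other options change:

```jsonc
{
    "compilerOptions": {
        "module": "ES2022",
        "moduleResolution": "bundler"
    }
}
```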