# Fiscal Clone 3.0

Turbopack-first rebuild of a fiscal.ai-style terminal with Vercel AI SDK integration.

## Stack

- Next.js 16 App Router
- Bun runtime/tooling
- Elysia route layer mounted in Next Route Handlers
- Turbopack for `dev` and `build`
- Better Auth (email/password + magic link)
- Drizzle ORM (SQLite) + Better Auth Drizzle adapter
- Internal API routes via Elysia app module (`lib/server/api/app.ts`)
- Eden Treaty for type-safe frontend API calls
- Workflow DevKit Postgres World for durable background task execution
- SQLite-backed app domain storage (watchlist, holdings, filings, task projection, insights)
- Vercel AI SDK (`ai`) with Zhipu (`zhipu-ai-provider`) via the Coding API (`https://api.z.ai/api/coding/paas/v4`)

## Run locally

```bash
bun install
cp .env.example .env
bun run dev
```

Open [http://localhost:3000](http://localhost:3000).

`bun run dev` is the local-safe entrypoint. It bootstraps the local SQLite schema from `drizzle/` when needed, forces Better Auth to a localhost origin, uses same-origin API calls, and falls back to local SQLite and the Workflow local runtime even if `.env` still contains deployment-oriented values. If port `3000` is already in use and you did not set `PORT`, it automatically picks the next open local port and keeps Better Auth in sync with that port.

On macOS, `bun run dev` also auto-detects Homebrew SQLite and enables native `sqlite-vec` when `/opt/homebrew/opt/sqlite/lib/libsqlite3.dylib` or `/usr/local/opt/sqlite/lib/libsqlite3.dylib` exists. If no compatible SQLite library is found, the app falls back to table-backed vector storage and search still works. See [doc/sqlite-vec-local-setup.md](doc/sqlite-vec-local-setup.md).

If you need raw `next dev` behavior without those overrides, use:

```bash
bun run dev:next
```

The default local database path is `data/fiscal.sqlite` via `DATABASE_URL=file:data/fiscal.sqlite`.
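The macOS auto-detection described above amounts to checking the two Homebrew library paths and falling back when neither exists. A minimal sketch (illustrative only; the function name and fallback signaling are assumptions, not the app's actual code):

```typescript
import { existsSync } from "node:fs";

// Candidate paths from the README: Apple Silicon and Intel Homebrew prefixes.
const HOMEBREW_SQLITE_CANDIDATES = [
  "/opt/homebrew/opt/sqlite/lib/libsqlite3.dylib",
  "/usr/local/opt/sqlite/lib/libsqlite3.dylib",
];

// Returns the first existing SQLite library path, or null to signal the
// table-backed vector-storage fallback (search still works in that mode).
function detectHomebrewSqlite(
  candidates: string[] = HOMEBREW_SQLITE_CANDIDATES,
): string | null {
  for (const path of candidates) {
    if (existsSync(path)) return path;
  }
  return null;
}

console.log(detectHomebrewSqlite());
```

A `null` result here is not an error: it simply means `sqlite-vec` will not load natively and the table-backed path is used instead.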
## Production build

```bash
bun run db:migrate
bun run build
bun run start
```

## Verification

Run the required repo checks before considering work complete:

```bash
bun fmt
bun lint
bun typecheck
cargo test -p fiscal-xbrl-core --manifest-path rust/Cargo.toml
bun run validate:taxonomy-packs
```

Use `bun run test` for Vitest-based tests when needed. Do not use `bun test` directly.

## Browser E2E tests

Install Playwright's Chromium browser once:

```bash
bun run test:e2e:install
```

Run the suite:

```bash
bun run test:e2e
```

Useful variants:

```bash
bun run test:e2e:headed
bun run test:e2e:ui
```

The Playwright web server boot path uses an isolated SQLite database at `data/e2e.sqlite`, forces local Better Auth origins for the test port, and stores artifacts under `output/playwright/`. On macOS, `bun run e2e:webserver` uses the same Homebrew SQLite auto-detection so `sqlite-vec` can load natively when available.

## Docker deployment

```bash
cp .env.example .env
docker compose up --build -d
```

For local Docker, the host port mapping comes from `docker-compose.override.yml` (default `http://localhost:3000` via `APP_PORT`).

The app calls Zhipu directly via the AI SDK for extraction and report generation. Zhipu always targets the Coding API endpoint (`https://api.z.ai/api/coding/paas/v4`).

On container startup, the app applies Drizzle migrations automatically before launching Next.js. The app stores SQLite data in the Docker volume `fiscal_sqlite_data` (mounted at `/app/data`) and workflow world data in the Postgres volume `workflow_postgres_data`.

Container startup runs:

1. `workflow-postgres-setup` (idempotent Workflow world bootstrap)
2. Programmatic Drizzle migrations for the SQLite app tables
3. Next.js server boot

Docker images use Bun (`oven/bun:1.3.5-alpine`) for build and runtime. Docker builds use BuildKit cache mounts for Bun downloads and `.next/cache`, so repeated server-side builds can reuse dependency and Next/Turbopack caches on the same builder.
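The three startup steps, together with the `RUN_*_ON_START` toggles covered below, reduce to a small gating decision. A sketch of that logic (illustrative only; step names and the default-enabled assumption are mine, not the real entrypoint):

```typescript
// Each startup step may be gated by an environment flag; the server boot never is.
type Step = { name: string; enabledFlag?: string };

const steps: Step[] = [
  { name: "workflow-postgres-setup", enabledFlag: "RUN_WORKFLOW_SETUP_ON_START" },
  { name: "drizzle-migrations", enabledFlag: "RUN_DB_MIGRATIONS_ON_START" },
  { name: "next-server-boot" }, // always runs
];

// Assumption for this sketch: a gated step runs unless its flag is set to
// something other than "true".
function enabledSteps(env: Record<string, string | undefined>): string[] {
  return steps
    .filter((s) => !s.enabledFlag || (env[s.enabledFlag] ?? "true") === "true")
    .map((s) => s.name);
}

// With both toggles disabled, only the Next.js boot remains.
console.log(enabledSteps({
  RUN_WORKFLOW_SETUP_ON_START: "false",
  RUN_DB_MIGRATIONS_ON_START: "false",
}));
```

Because `workflow-postgres-setup` is idempotent and the migrations are programmatic, leaving both toggles enabled is safe across repeated container restarts.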
Optional runtime toggles:

- `RUN_WORKFLOW_SETUP_ON_START=true` keeps `workflow-postgres-setup` enabled at container boot.
- `RUN_DB_MIGRATIONS_ON_START=true` keeps SQLite migrations enabled at container boot.

## Coolify deployment

This compose setup is compatible with Coolify as-is (it uses named Docker volumes, not host bind mounts).

Required environment variables in Coolify:

- `DATABASE_URL=file:/app/data/fiscal.sqlite`
- `BETTER_AUTH_SECRET=`
- `BETTER_AUTH_BASE_URL=https://fiscal.b11studio.xyz`
- `BETTER_AUTH_TRUSTED_ORIGINS=https://fiscal.b11studio.xyz`
- `WORKFLOW_TARGET_WORLD=@workflow/world-postgres`
- `WORKFLOW_POSTGRES_URL=postgres://workflow:workflow@workflow-postgres:5432/workflow`
- Optional: `WORKFLOW_POSTGRES_WORKER_CONCURRENCY=10`
- Optional: `WORKFLOW_POSTGRES_JOB_PREFIX=fiscal_`

Operational constraints for Coolify:

- Keep this service to a single instance/replica. SQLite is file-based and not appropriate for multi-replica shared-write deployments.
- Ensure both named volumes are persisted (`fiscal_sqlite_data`, `workflow_postgres_data`).
- Keep `WORKFLOW_POSTGRES_URL` explicit so Workflow does not fall back to `DATABASE_URL` (SQLite).
- The app's `/api/health` endpoint probes Workflow backend connectivity and returns a non-200 status when the Workflow world is unavailable.
- Keep `Include Source Commit in Build` disabled so the Docker layer cache stays reusable between commits.
- Keep Docker cleanup threshold-based rather than aggressive; otherwise Coolify will discard the build cache.
- Keep repeated builds pinned to the same builder/server when possible so the Docker layer cache and BuildKit cache mounts remain warm.

Emergency rollback path:

1. Set `WORKFLOW_TARGET_WORLD=local`
2. Remove/disable `WORKFLOW_POSTGRES_URL`
3. Redeploy

## Environment

Use root `.env` or root `.env.local`:

```env
# leave blank for same-origin API
NEXT_PUBLIC_API_URL=

DATABASE_URL=file:data/fiscal.sqlite

# Optional native sqlite-vec setup
# macOS local dev/e2e auto-detects Homebrew SQLite if present.
# SQLITE_CUSTOM_LIB_PATH=/opt/homebrew/opt/sqlite/lib/libsqlite3.dylib
# SQLITE_VEC_EXTENSION_PATH=/absolute/path/to/node_modules/sqlite-vec-darwin-arm64/vec0.dylib

BETTER_AUTH_SECRET=replace-with-a-long-random-secret
BETTER_AUTH_BASE_URL=http://localhost:3000
BETTER_AUTH_TRUSTED_ORIGINS=http://localhost:3000

ZHIPU_API_KEY=
ZHIPU_MODEL=glm-5
# optional generation tuning
AI_TEMPERATURE=0.2

SEC_USER_AGENT=Fiscal Clone

# Rust XBRL sidecar
FISCAL_XBRL_BIN=rust/target/release/fiscal-xbrl
FISCAL_XBRL_CACHE_DIR=.cache/xbrl
XBRL_ENGINE_TIMEOUT_MS=45000
FISCAL_TAXONOMY_DIR=rust/taxonomy

# local dev default
WORKFLOW_TARGET_WORLD=local

# docker / production runtime
WORKFLOW_POSTGRES_URL=postgres://workflow:workflow@workflow-postgres:5432/workflow
WORKFLOW_POSTGRES_WORKER_CONCURRENCY=10
WORKFLOW_POSTGRES_JOB_PREFIX=fiscal_
RUN_WORKFLOW_SETUP_ON_START=true
RUN_DB_MIGRATIONS_ON_START=true

# Optional local-world fallback
WORKFLOW_LOCAL_DATA_DIR=.workflow-data
WORKFLOW_LOCAL_QUEUE_CONCURRENCY=100
```

`ZHIPU_API_KEY` is required for AI workloads (extraction and report generation); missing or invalid credentials fail AI tasks. `ZHIPU_BASE_URL` is deprecated and ignored; the runtime always uses `https://api.z.ai/api/coding/paas/v4`.

`bun run dev` will still normalize the Better Auth origin, same-origin API routing, the SQLite path, and the Workflow runtime for localhost boot if this file contains deployment values.

For the Rust XBRL sidecar, `FISCAL_XBRL_BIN` should point to the built `fiscal-xbrl` executable, `FISCAL_XBRL_CACHE_DIR` controls local filing cache storage, `XBRL_ENGINE_TIMEOUT_MS` bounds sidecar execution time, and `FISCAL_TAXONOMY_DIR` overrides the taxonomy pack directory when needed. Build the sidecar with `bun run build:sidecar` or `cargo build --manifest-path rust/Cargo.toml --release --bin fiscal-xbrl`.

## API surface

All endpoints below are defined in Elysia at `lib/server/api/app.ts` and exposed via `app/api/[[...slugs]]/route.ts`.
- `ALL /api/auth/*` (Better Auth handler)
- `GET /api/health`
- `GET /api/me`
- `GET|POST /api/watchlist`
- `DELETE /api/watchlist/:id`
- `GET|POST /api/portfolio/holdings`
- `PATCH|DELETE /api/portfolio/holdings/:id`
- `GET /api/portfolio/summary`
- `POST /api/portfolio/refresh-prices`
- `POST /api/portfolio/insights/generate`
- `GET /api/portfolio/insights/latest`
- `GET /api/financials/company`
- `GET /api/filings`
- `POST /api/filings/sync`
- `POST /api/filings/:accessionNumber/analyze`
- `GET /api/tasks`
- `GET /api/tasks/:taskId`
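For frontend code the project uses Eden Treaty, but the endpoints above can also be exercised with plain `fetch`. The small helper below fills `:param` segments in the route templates; it is illustrative only and not part of the app, and the example parameter values are placeholders:

```typescript
// Substitute `:param` segments in a route template, URL-encoding each value.
// e.g. apiPath("/api/tasks/:taskId", { taskId: "42" }) → "/api/tasks/42"
function apiPath(template: string, params: Record<string, string> = {}): string {
  return template.replace(/:(\w+)/g, (_match, name: string) => {
    const value = params[name];
    if (value === undefined) throw new Error(`missing route param: ${name}`);
    return encodeURIComponent(value);
  });
}

// Usage sketch (same-origin base URL, per the README's local setup):
// await fetch(apiPath("/api/watchlist/:id", { id: "placeholder-id" }), { method: "DELETE" });
console.log(apiPath("/api/filings/:accessionNumber/analyze", { accessionNumber: "0000000000-00-000000" }));
```

Throwing on a missing parameter keeps a typo in a route template from silently producing a request to a literal `:param` path.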