# Fiscal Clone 3.0
Turbopack-first rebuild of a fiscal.ai-style terminal with Vercel AI SDK integration.
## Stack
- Next.js 16 App Router
- Bun runtime/tooling
- Elysia route layer mounted in Next Route Handlers
- Turbopack for `dev` and `build`
- Better Auth (email/password + magic link)
- Drizzle ORM (SQLite) + Better Auth Drizzle adapter
- Internal API routes via Elysia app module (`lib/server/api/app.ts`)
- Eden Treaty for type-safe frontend API calls
- Workflow DevKit Postgres World for background task execution durability
- SQLite-backed app domain storage (watchlist, holdings, filings, task projection, insights)
- Vercel AI SDK (`ai`) with Zhipu (`zhipu-ai-provider`) via Coding API (`https://api.z.ai/api/coding/paas/v4`)
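Eden Treaty gives the frontend typed calls against the Elysia app. A minimal sketch of how such a client could resolve its base URL, assuming (per the Environment section) that a blank `NEXT_PUBLIC_API_URL` means same-origin — the helper name is hypothetical, not from the repo:

```typescript
// Hypothetical helper: pick the API base URL for an Eden Treaty client.
// A blank NEXT_PUBLIC_API_URL falls back to the current origin (same-origin API).
function apiBaseUrl(env: { NEXT_PUBLIC_API_URL?: string }, origin: string): string {
  const configured = env.NEXT_PUBLIC_API_URL?.trim();
  return configured ? configured : origin;
}
```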
## Run locally
```bash
bun install
bun run db:generate
bun run db:migrate
bun run dev
```
Open [http://localhost:3000](http://localhost:3000).
The default database path is `data/fiscal.sqlite`, set via `DATABASE_URL=file:data/fiscal.sqlite`.
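The `DATABASE_URL` uses a `file:` URL scheme; as an illustration (assumed behavior, not the repo's actual resolver), stripping the scheme yields the on-disk SQLite path:

```typescript
// Illustration only: map a `file:` DATABASE_URL to a filesystem path.
function sqlitePath(databaseUrl: string): string {
  return databaseUrl.startsWith("file:") ? databaseUrl.slice("file:".length) : databaseUrl;
}
```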
## Production build
```bash
bun run db:migrate
bun run build
bun run start
```
## Browser E2E tests
Install Playwright's Chromium browser once:
```bash
bun run test:e2e:install
```
Run the suite:
```bash
bun run test:e2e
```
Useful variants:
```bash
bun run test:e2e:headed
bun run test:e2e:ui
```
The Playwright web server boot path uses an isolated SQLite database at `data/e2e.sqlite`, forces local Better Auth origins for the test port, and stores artifacts under `output/playwright/`.
## Docker deployment
```bash
cp .env.example .env
docker compose up --build -d
```
For local Docker, host port mapping comes from `docker-compose.override.yml` (default `http://localhost:3000` via `APP_PORT`).
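A sketch of what that override could look like (the service name `app` is an assumption; only the `APP_PORT` mapping is stated above):

```yaml
# Sketch of docker-compose.override.yml: maps APP_PORT on the host
# (default 3000) to the app container's port 3000.
services:
  app:
    ports:
      - "${APP_PORT:-3000}:3000"
```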
The app calls Zhipu directly through the AI SDK for extraction and report generation.
Zhipu always targets the Coding API endpoint (`https://api.z.ai/api/coding/paas/v4`).
The app stores SQLite data in the Docker volume `fiscal_sqlite_data` (mounted at `/app/data`) and workflow world data in the Postgres volume `workflow_postgres_data`.
Container startup runs, in order:
1. `workflow-postgres-setup` (idempotent Workflow world bootstrap)
2. Drizzle migrations for the SQLite app tables, applied automatically before Next.js launches
3. Next.js server boot
Docker images use Bun (`oven/bun:1.3.5-alpine`) for build and runtime.
## Coolify deployment
This compose setup is compatible with Coolify as-is (it uses named Docker volumes, not host bind mounts).
Required environment variables in Coolify:
- `DATABASE_URL=file:/app/data/fiscal.sqlite`
- `BETTER_AUTH_SECRET=<long-random-secret>`
- `BETTER_AUTH_BASE_URL=https://fiscal.b11studio.xyz`
- `BETTER_AUTH_TRUSTED_ORIGINS=https://fiscal.b11studio.xyz`
- `WORKFLOW_TARGET_WORLD=@workflow/world-postgres`
- `WORKFLOW_POSTGRES_URL=postgres://workflow:workflow@workflow-postgres:5432/workflow`
- Optional: `WORKFLOW_POSTGRES_WORKER_CONCURRENCY=10`
- Optional: `WORKFLOW_POSTGRES_JOB_PREFIX=fiscal_`
Operational constraints for Coolify:
- Keep this service to a single instance/replica. SQLite is file-based and not appropriate for multi-replica shared-write deployments.
- Ensure both named volumes are persisted (`fiscal_sqlite_data`, `workflow_postgres_data`).
- Keep `WORKFLOW_POSTGRES_URL` explicit so Workflow does not fall back to `DATABASE_URL` (SQLite).
- The app's `/api/health` endpoint probes Workflow backend connectivity and returns a non-200 status when the Workflow world is unavailable.
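The health contract above can be sketched as follows (assumed shape, not the actual handler; the 503 code is an assumption — the doc only guarantees non-200):

```typescript
// Sketch: /api/health reports healthy only when the Workflow backend is
// reachable, so an orchestrator like Coolify can restart or stop routing
// to a container whose Workflow world is down.
function healthResponse(workflowReachable: boolean): { status: number; ok: boolean } {
  return workflowReachable
    ? { status: 200, ok: true }
    : { status: 503, ok: false }; // 503 is assumed; the doc only says non-200
}
```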
Emergency rollback path:
1. Set `WORKFLOW_TARGET_WORLD=local`
2. Remove/disable `WORKFLOW_POSTGRES_URL`
3. Redeploy
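The rollback steps amount to changing which world the workflow layer selects. A sketch of the assumed selection logic (hypothetical helper; the actual resolution lives inside Workflow DevKit):

```typescript
// Sketch: the Postgres world is used only when both the target-world flag and
// the connection URL are present; otherwise execution falls back to "local",
// matching the rollback steps above.
type WorkflowEnv = { WORKFLOW_TARGET_WORLD?: string; WORKFLOW_POSTGRES_URL?: string };

function selectWorkflowWorld(env: WorkflowEnv): string {
  if (env.WORKFLOW_TARGET_WORLD === "@workflow/world-postgres" && env.WORKFLOW_POSTGRES_URL) {
    return "@workflow/world-postgres";
  }
  return "local";
}
```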
## Environment
Use a root `.env` or `.env.local` file:
```env
# leave blank for same-origin API
NEXT_PUBLIC_API_URL=
DATABASE_URL=file:data/fiscal.sqlite
BETTER_AUTH_SECRET=replace-with-a-long-random-secret
BETTER_AUTH_BASE_URL=https://fiscal.b11studio.xyz
BETTER_AUTH_TRUSTED_ORIGINS=https://fiscal.b11studio.xyz
ZHIPU_API_KEY=
ZHIPU_MODEL=glm-5
# optional generation tuning
AI_TEMPERATURE=0.2
SEC_USER_AGENT=Fiscal Clone <support@fiscal.local>
WORKFLOW_TARGET_WORLD=@workflow/world-postgres
WORKFLOW_POSTGRES_URL=postgres://workflow:workflow@workflow-postgres:5432/workflow
WORKFLOW_POSTGRES_WORKER_CONCURRENCY=10
WORKFLOW_POSTGRES_JOB_PREFIX=fiscal_
# Optional local-world fallback
WORKFLOW_LOCAL_DATA_DIR=.workflow-data
WORKFLOW_LOCAL_QUEUE_CONCURRENCY=100
```
`ZHIPU_API_KEY` is required for AI workloads (extraction and report generation); AI tasks fail if the credential is missing or invalid.
`ZHIPU_BASE_URL` is deprecated and ignored; runtime always uses `https://api.z.ai/api/coding/paas/v4`.
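The pinning behavior can be sketched like this (assumed enforcement; the helper name is hypothetical):

```typescript
// Sketch: the Coding API endpoint is hard-coded at runtime, so a stale
// ZHIPU_BASE_URL left in the environment has no effect.
const ZHIPU_CODING_API = "https://api.z.ai/api/coding/paas/v4";

function zhipuBaseUrl(_env: { ZHIPU_BASE_URL?: string }): string {
  return ZHIPU_CODING_API; // deprecated ZHIPU_BASE_URL is deliberately ignored
}
```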
## API surface
All endpoints below are defined in Elysia at `lib/server/api/app.ts` and exposed via `app/api/[[...slugs]]/route.ts`.
- `ALL /api/auth/*` (Better Auth handler)
- `GET /api/health`
- `GET /api/me`
- `GET|POST /api/watchlist`
- `DELETE /api/watchlist/:id`
- `GET|POST /api/portfolio/holdings`
- `PATCH|DELETE /api/portfolio/holdings/:id`
- `GET /api/portfolio/summary`
- `POST /api/portfolio/refresh-prices`
- `POST /api/portfolio/insights/generate`
- `GET /api/portfolio/insights/latest`
- `GET /api/financials/company`
- `GET /api/filings`
- `POST /api/filings/sync`
- `POST /api/filings/:accessionNumber/analyze`
- `GET /api/tasks`
- `GET /api/tasks/:taskId`
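Routes with path parameters (e.g. `:accessionNumber`) need URL-encoding on the client side. A hypothetical helper, not from the repo, illustrating the analyze-filing route above:

```typescript
// Hypothetical client-side helper: build the path for
// POST /api/filings/:accessionNumber/analyze, URL-encoding the parameter.
function filingAnalyzePath(accessionNumber: string): string {
  return `/api/filings/${encodeURIComponent(accessionNumber)}/analyze`;
}
```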