remove legacy backend stack and stale deployment docs after rebuild
26
.env.example
@@ -1,28 +1,10 @@
# PostgreSQL
DATABASE_URL=postgres://postgres:postgres@localhost:5432/fiscal
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=fiscal
POSTGRES_HOST=localhost

# API service
PORT=3001
NODE_ENV=development
FRONTEND_URL=http://localhost:3000
BETTER_AUTH_SECRET=replace-with-strong-random-secret
BETTER_AUTH_BASE_URL=http://localhost:3001
SEC_USER_AGENT=Fiscal Clone <support@example.com>

# Frontend
NEXT_PUBLIC_API_URL=http://localhost:3001
# In Coolify this must be the public backend URL (e.g. https://api.fiscal.example.com)
# Optional API override. Leave empty to use same-origin internal API routes.
NEXT_PUBLIC_API_URL=

# OpenClaw / ZeroClaw (OpenAI-compatible)
OPENCLAW_BASE_URL=http://localhost:4000
OPENCLAW_API_KEY=replace-with-your-agent-key
OPENCLAW_MODEL=zeroclaw

# Queue tuning
TASK_HEARTBEAT_SECONDS=15
TASK_STALE_SECONDS=120
TASK_MAX_ATTEMPTS=3
# SEC API etiquette
SEC_USER_AGENT=Fiscal Clone <support@fiscal.local>
@@ -1,9 +0,0 @@
# Better Auth Migration (Archived)

This document described the pre-2.0 incremental migration path.

The codebase has been rebuilt for Fiscal Clone 2.0. Use these sources instead:
- `README.md` for runtime and setup
- `backend/src/auth.ts` for Better Auth configuration
- `backend/src/db/migrate.ts` for current schema
- `docs/REBUILD_DECISIONS.md` for architecture rationale
78
COOLIFY.md
@@ -1,78 +0,0 @@
# Coolify Deployment (Fiscal Clone 2.0)

This repository is deployable on Coolify using the root `docker-compose.yml`.

## What gets deployed

- `frontend` (Next.js)
- `backend` (Elysia API + Better Auth)
- `worker` (durable async job processor)
- `postgres` (database)

`backend` and `worker` auto-run migrations on startup:
- `bun run src/db/migrate.ts`
- then start API/worker process

## Coolify setup

1. Create a **Docker Compose** app in Coolify.
2. Connect this repository.
3. Use compose file: `/docker-compose.yml`.
4. Add public domains:
   - `frontend` service on port `3000` (example: `https://fiscal.example.com`)
   - `backend` service on port `3001` (example: `https://api.fiscal.example.com`)

## Required environment variables

Set these in Coolify before deploy:

```env
POSTGRES_USER=postgres
POSTGRES_PASSWORD=<strong-password>
POSTGRES_DB=fiscal

DATABASE_URL=postgres://postgres:<strong-password>@postgres:5432/fiscal

# Public URLs
FRONTEND_URL=https://fiscal.example.com
BETTER_AUTH_BASE_URL=https://api.fiscal.example.com
NEXT_PUBLIC_API_URL=https://api.fiscal.example.com

# Security
BETTER_AUTH_SECRET=<openssl rand -base64 32>
SEC_USER_AGENT=Fiscal Clone <ops@your-domain.com>

# Optional OpenClaw/ZeroClaw integration
OPENCLAW_BASE_URL=https://your-openclaw-endpoint
OPENCLAW_API_KEY=<token>
OPENCLAW_MODEL=zeroclaw

# Optional queue tuning
TASK_HEARTBEAT_SECONDS=15
TASK_STALE_SECONDS=120
TASK_MAX_ATTEMPTS=3
```
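The `<openssl rand -base64 32>` placeholder above is meant to be filled in from a shell. A minimal sketch, assuming `openssl` is available locally:

```shell
# Generate a value for BETTER_AUTH_SECRET (32 random bytes, base64-encoded)
SECRET=$(openssl rand -base64 32)
echo "BETTER_AUTH_SECRET=$SECRET"
```

Paste the printed value into the Coolify environment variable rather than committing it anywhere.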

## Important build note

`NEXT_PUBLIC_API_URL` is compiled into the frontend bundle at build time. If you change it, trigger a new deploy/rebuild.

The frontend includes a safety fallback: if `NEXT_PUBLIC_API_URL` is accidentally set to an internal host like `http://backend:3001`, browser calls will fall back to `https://api.<frontend-host>`.
This is a fallback only; keep `NEXT_PUBLIC_API_URL` correct in Coolify.
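The fallback mapping can be sketched in shell; the hostnames here are hypothetical examples, not values the app hardcodes:

```shell
# Shows how the fallback derives the API host from the frontend host
# when NEXT_PUBLIC_API_URL points at an internal container name.
NEXT_PUBLIC_API_URL="http://backend:3001"   # accidental internal hostname
FRONTEND_HOST="fiscal.example.com"          # assumed frontend domain
case "$NEXT_PUBLIC_API_URL" in
  http://backend:*) API_URL="https://api.${FRONTEND_HOST}" ;;
  *)                API_URL="$NEXT_PUBLIC_API_URL" ;;
esac
echo "$API_URL"
```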

## Post-deploy checks

1. API health:
   ```bash
   curl -f https://api.fiscal.example.com/api/health
   ```
2. Frontend loads and auth screens render.
3. Create user, add watchlist symbol, queue filing sync.
4. Confirm background tasks move `queued -> running -> completed` in dashboard.

## Common pitfalls

- `NEXT_PUBLIC_API_URL` left as internal hostname (`http://backend:3001`) causes auth/API failures until fallback or proper config is applied.
- `FRONTEND_URL` missing/incorrect causes CORS/session issues.
- `BETTER_AUTH_BASE_URL` must be the public backend URL, not the internal container hostname.
- Deploying frontend and backend on unrelated domains can cause cookie/session headaches. Prefer same root domain (e.g. `fiscal.example.com` + `api.fiscal.example.com`).
@@ -1,172 +0,0 @@
# Dependency Updates Summary
Generated: 2026-02-19
Last Updated: 2026-02-20

---

## Build System Updates (2026-02-20)

### npm Version Update:
- **Backend Dockerfile**: Updated to install npm@latest globally during build
  - Previous: npm 10.8.2 (bundled with Node 20)
  - Updated: npm 11.10.1
  - Change: Added `npm install -g npm@latest` after bun installation
  - Benefit: Latest npm features, bug fixes, and security updates

### Tailwind CSS v4 Compatibility Fixes:
- **Frontend PostCSS Configuration**: Updated to use new Tailwind v4 plugin
  - Previous: `tailwindcss: {}` in postcss.config.js
  - Updated: `'@tailwindcss/postcss': {}` in postcss.config.js
  - Added: `@tailwindcss/postcss: ^4.0.0` to devDependencies
  - Fixes: Build error "The PostCSS plugin has moved to a separate package"
  - Compatibility: Tailwind CSS v4.2.0

---

## Backend Dependencies Updates

### MAJOR VERSION UPDATES:

1. **zod**: 3.24.1 → 4.3.6
   - Major version bump
   - Potential breaking changes in validation API

2. **bcryptjs**: 2.4.3 → 3.0.3
   - Major version bump
   - Potential breaking changes in hashing API

3. **@types/bcryptjs**: 2.4.6 → 3.0.0
   - TypeScript types update to match bcryptjs v3

### MINOR/PATCH UPDATES:

4. **jsonwebtoken**: 9.0.2 → 9.0.3
   - Patch update, should be compatible

### ALREADY UP-TO-DATE:
- @elysiajs/cors: 1.4.1 ✓
- @elysiajs/swagger: 1.3.1 ✓
- elysia: 1.4.25 ✓
- pg: 8.18.0 ✓
- postgres: 3.4.8 ✓
- dotenv: 17.3.1 ✓
- @types/pg: 8.16.0 ✓
- @types/jsonwebtoken: 9.0.10 ✓

---

## Frontend Dependencies Updates

### MAJOR VERSION UPDATES:

1. **next-auth**: 5.0.0-beta.25 → 5.0.0-beta.30
   - Updated to latest v5 beta (not downgraded to v4)
   - Maintains feature set, gets bug fixes
   - Decision: Keep on v5 beta as it's the future version

2. **recharts**: 2.15.4 → 3.7.0
   - Major version bump
   - Potential breaking changes in chart components

3. **tailwind-merge**: 2.6.1 → 3.5.0
   - Major version bump
   - Potential breaking changes in class merging logic

4. **@types/node**: 22.12.0 → 25.3.0
   - Major version bump
   - Updated TypeScript definitions for Node.js

5. **tailwindcss**: 3.4.19 → 4.2.0
   - Major version bump
   - Tailwind CSS v4 has significant changes to configuration and API
   - May require updating tailwind.config.js configuration

### MINOR/PATCH UPDATES:

6. **react**: 19.1.0 → 19.2.4
   - Patch update, should be compatible

7. **react-dom**: 19.1.0 → 19.2.4
   - Patch update, should be compatible

### ALREADY UP-TO-DATE:
- next: 16.1.6 ✓
- lucide-react: 0.574.0 ✓
- date-fns: 4.1.0 ✓
- @radix-ui/react-dialog: 1.1.15 ✓
- @radix-ui/react-dropdown-menu: 2.1.16 ✓
- @radix-ui/react-slot: 1.2.4 ✓
- @radix-ui/react-tabs: 1.1.13 ✓
- @radix-ui/react-toast: 1.2.15 ✓
- class-variance-authority: 0.7.1 ✓
- clsx: 2.1.1 ✓
- typescript: 5.9.3 ✓
- postcss: 8.5.6 ✓
- autoprefixer: 10.4.24 ✓
- @types/react: 19.2.14 ✓
- @types/react-dom: 19.2.3 ✓

---

## Known Compatibility Issues & Recommendations

### High Priority:
1. **tailwindcss v4**: This is a major update with significant changes. The project's configuration files (tailwind.config.js, postcss.config.js) may need updates. Note: Tailwind v4 is designed to be backward compatible with v3 configs in most cases.
   - **FIXED (2026-02-20)**: Updated postcss.config.js to use `@tailwindcss/postcss` plugin instead of `tailwindcss` directly. Added `@tailwindcss/postcss: ^4.0.0` to devDependencies.

### Medium Priority:
2. **zod v4**: Check if validation schemas need updates.
3. **bcryptjs v3**: Check if password hashing/verification code needs updates.
4. **recharts v3**: Chart components may need updates to match new API.

### Low Priority:
5. **@types/node v25**: Usually compatible, but check for any Node.js API changes.

---

## Testing Checklist

After updating dependencies:
- [ ] Run `npm install` in both backend and frontend
- [ ] Run `npm run build` in frontend
- [ ] Test authentication flow (next-auth)
- [ ] Test API endpoints
- [ ] Test database connections
- [ ] Test all UI components and charts
- [ ] Run any existing tests
- [ ] Check console for deprecation warnings
- [ ] Test Docker build (`docker-compose build`)

---

## Rollback Plan

If issues arise after updates:
1. Revert package.json changes
2. Run `npm install` to restore previous versions
3. Commit and push revert

To revert the update commit without rewriting history: `git revert HEAD`
Or restore from backup: `cp package.json.backup package.json`
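The revert flow can be exercised safely in a throwaway repository first; the package.json contents below are illustrative stand-ins, not the project's real manifest:

```shell
# Demonstrate the rollback in a scratch repo with a fake dependency bump
tmp=$(mktemp -d) && cd "$tmp" && git init -q
git config user.email dev@example.com
git config user.name dev
echo '"zod": "^3.24.1"' > package.json
git add package.json && git commit -qm "baseline"
echo '"zod": "^4.3.6"' > package.json
git add package.json && git commit -qm "bump deps"
git revert --no-edit HEAD >/dev/null   # undoes the bump in a new commit
cat package.json                        # back to the baseline line
```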

---

## Summary

**Total dependencies updated:**
- Backend: 4 dependencies (2 major, 1 patch, 1 type update)
- Frontend: 7 dependencies (5 major, 2 patch)

**Changes committed:** Yes (commit 1573e07)
**Changes pushed to remote:** Yes

**Next steps:**
1. Pull latest changes: `git pull`
2. Install dependencies in both directories:
   - Backend: `cd backend && bun install` (or `npm install`)
   - Frontend: `cd frontend && npm install`
3. Test build: `npm run build` (frontend)
4. Run the testing checklist above
5. Monitor for deprecation warnings or errors

**If issues occur:** Refer to rollback plan above or reach out for assistance.
@@ -1,592 +0,0 @@
# Deploy Fiscal Clone via Gitea to Coolify

Deploy your fiscal-clone project using your self-hosted Git service.

## Quick Start

### 1. Push to Gitea

```bash
cd /data/workspace/fiscal-clone
git init
git add .
git commit -m "Initial commit: Fiscal Clone production ready"
git remote add gitea https://git.b11studio.xyz/francy51/fiscal-clone.git
git push -u gitea main
```

### 2. Deploy to Coolify

In Coolify dashboard:

1. **Create New Application**
   - Type: Docker Compose
   - Name: `fiscal-clone`
   - Source: Git Repository
   - Repository URL: `git@git.b11studio.xyz:francy51/fiscal-clone.git`
   - Branch: `main`
   - Build Context: `/`
   - Docker Compose File: `docker-compose.yml`

2. **Configure Environment Variables**

   ```
   DATABASE_URL=postgres://postgres:your-password@postgres:5432/fiscal
   POSTGRES_USER=postgres
   POSTGRES_PASSWORD=your-password
   POSTGRES_DB=fiscal

   JWT_SECRET=your-jwt-secret-here

   GITHUB_ID=your-github-oauth-id
   GITHUB_SECRET=your-github-oauth-secret

   GOOGLE_ID=your-google-oauth-id
   GOOGLE_SECRET=your-google-oauth-secret

   NEXT_PUBLIC_API_URL=https://api.fiscal.yourdomain.com
   ```

3. **Configure Domains**

   ```
   Frontend: https://fiscal.b11studio.xyz
   Backend API: https://api.fiscal.b11studio.xyz
   ```

4. **Deploy**

   Click "Deploy" button in Coolify

## Configuration Details

### Backend Application

**Service 1: PostgreSQL**
- Image: postgres:16-alpine
- Database Name: fiscal
- User: postgres
- Password: (set in environment)
- Volumes: postgres_data

**Service 2: Backend**
- Build Context: ./backend
- Dockerfile: Dockerfile
- Ports: 3001
- Environment: All backend env vars
- Health Check: /api/health
- Depends On: postgres

**Service 3: Frontend**
- Build Context: ./frontend
- Dockerfile: Dockerfile
- Ports: 3000
- Environment: NEXT_PUBLIC_API_URL
- Depends On: backend

### Traefik Configuration

Coolify automatically configures routing. Labels included in docker-compose.yml.

## Environment Variables Reference

### Required Variables

| Variable | Description | Example |
|-----------|-------------|----------|
| DATABASE_URL | PostgreSQL connection string | postgres://postgres:password@postgres:5432/fiscal |
| JWT_SECRET | JWT signing secret | generate-secure-random-string |
| NEXT_PUBLIC_API_URL | Backend API URL | https://api.fiscal.yourdomain.com |

### OAuth Configuration

**GitHub OAuth:**
- Create OAuth app: https://github.com/settings/developers
- Callback URL: https://fiscal.b11studio.xyz/api/auth/callback/github

**Google OAuth:**
- Create OAuth app: https://console.cloud.google.com/apis/credentials
- Callback URL: https://fiscal.b11studio.xyz/api/auth/callback/google

## Deployment Steps Detailed

### Step 1: Prepare Repository

```bash
# Navigate to project
cd /data/workspace/fiscal-clone

# Initialize Git
git init

# Add all files
git add .

# Create initial commit
git commit -m "feat: Initial release

- SEC filings extraction with EDGAR API
- Portfolio analytics with Yahoo Finance
- NextAuth.js authentication (GitHub, Google, Email)
- Real-time stock price updates
- PostgreSQL with automatic P&L calculations
- Recharts visualizations
- OpenClaw AI integration endpoints"

# Add remote
git remote add gitea https://git.b11studio.xyz/francy51/fiscal-clone.git

# Push to Gitea
git push -u gitea main
```

### Step 2: Create Gitea Repository

Option A: Use Gitea web interface
1. Access: https://git.b11studio.xyz
2. Login (admin account)
3. Create repository: `fiscal-clone`
4. Make it private (recommended)
5. Clone HTTPS URL: `https://git.b11studio.xyz/francy51/fiscal-clone.git`

Option B: Create via API
```bash
# Create repo using Gitea API
curl -X POST \
  -H "Authorization: token YOUR_GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "fiscal-clone",
    "description": "Fiscal.ai clone - Financial filings and portfolio analytics",
    "private": true,
    "auto_init": false
  }' \
  https://git.b11studio.xyz/api/v1/user/repos
```

### Step 3: Configure Coolify Application

1. **Navigate to Coolify Dashboard**
   - https://coolify.b11studio.xyz

2. **Create Application**
   - Click "New Application"
   - Name: `fiscal-clone-full-stack`
   - Type: Docker Compose
   - Source: Git Repository

3. **Git Configuration**
   - Repository URL: `git@git.b11studio.xyz:francy51/fiscal-clone.git`
   - Branch: `main`
   - Docker Compose File: `docker-compose.yml`
   - Build Context: `/`
   - Environment ID: Select or create new

4. **Add Environment Variables**
   - Click "Add Variable"
   - Add each required variable from reference table

5. **Configure Domains**
   - Click "Domains"
   - Add domain: `fiscal.b11studio.xyz`
   - Select frontend service
   - Generate SSL certificate

6. **Deploy**
   - Click "Deploy" button
   - Monitor deployment logs

## Post-Deployment Verification

### 1. Check Application Status

In Coolify dashboard:
- Verify all services are running
- Check deployment logs
- Verify health checks passing

### 2. Test Endpoints

```bash
# Test backend health
curl https://api.fiscal.b11studio.xyz/api/health

# Test frontend
curl https://fiscal.b11studio.xyz

# Test API documentation
curl https://api.fiscal.b11studio.xyz/swagger
```

### 3. Run Database Migrations

```bash
# In Coolify terminal
docker-compose exec backend bun run db:migrate
```

### 4. Verify Database Connection

```bash
# Check database connectivity
docker-compose exec backend bun -e "console.log(await db\`SELECT 1\`)"
```

### 5. Create First User

1. Access frontend: https://fiscal.b11studio.xyz
2. Click "Sign Up"
3. Create account
4. Log in

### 6. Add to Watchlist

1. Go to "Watchlist"
2. Add a stock (e.g., AAPL)
3. Wait for SEC filings to be fetched

## Troubleshooting

### Deployment Fails

**Issue:** Docker build fails
```
Check:
- Build context is correct (/)
- docker-compose.yml syntax is valid
- Environment variables are set
```

**Solution:**
- Redeploy application
- Check Coolify logs
- Verify Gitea repository is accessible

### Database Connection Failed

**Issue:** Backend cannot connect to database
```
Check:
- DATABASE_URL format is correct
- PostgreSQL service is running
- Network connectivity between containers
```

**Solution:**
- Restart PostgreSQL container
- Verify database credentials
- Check environment variables

### Frontend Cannot Connect to Backend

**Issue:** API errors in browser console
```
Check:
- NEXT_PUBLIC_API_URL is set correctly
- Backend is running
- CORS is configured correctly
```

**Solution:**
- Check backend logs
- Verify frontend environment variables
- Clear browser cache

### Authentication Fails

**Issue:** OAuth providers not working
```
Check:
- OAuth client IDs and secrets are correct
- Callback URLs match Coolify domain
- Redirect URIs are correct
```

**Solution:**
- Update OAuth configuration in Coolify
- Verify GitHub/Google console settings
- Check NextAuth configuration

## CI/CD with Gitea Actions

### Enable Gitea Actions in Coolify

In Coolify environment variables for backend:
```
GITEA__actions__ENABLED=true
GITEA__actions__DEFAULT_ACTIONS_URL=https://git.b11studio.xyz
```

### Create Workflow File

Create `.gitea/workflows/deploy.yml` in fiscal-clone:

```yaml
name: Deploy to Coolify
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger Coolify Deployment
        run: |
          curl -X POST \
            -H "Content-Type: application/json" \
            -H "Authorization: Bearer ${{ secrets.COOLIFY_API_TOKEN }}" \
            -d '{
              "resource_id": "your-resource-id",
              "source": "main"
            }' \
            https://coolify.b11studio.xyz/api/v1/resources/${{ secrets.COOLIFY_RESOURCE_ID }}/deploy
```

### Store Secrets

Add secrets in Gitea:
1. `COOLIFY_API_TOKEN` - Your Coolify API token
2. `COOLIFY_RESOURCE_ID` - The Coolify resource ID

## Monitoring & Alerts

### Set Up Discord Alerts

1. Create Discord webhook URL for deployment channel
2. Add to fiscal-clone monitoring

### Health Check Endpoint

```
GET /api/health

Response:
{
  "status": "ok",
  "timestamp": "2026-02-15T23:51:00.000Z",
  "version": "1.0.0",
  "database": "connected"
}
```

## Maintenance

### Updates

```bash
# Pull latest changes from Gitea
git pull origin main

# Re-deploy in Coolify (automatic webhook triggers)
# Or manual redeploy
```

### Backups

Coolify provides automated PostgreSQL backups. To back up manually:

```bash
# Export database
docker-compose exec postgres pg_dump -U postgres -d fiscal > backup.sql

# Restore database
docker-compose exec -T postgres psql -U postgres -d fiscal < backup.sql
```
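When scripting the manual `pg_dump` step above, a date-stamped filename avoids overwriting earlier dumps. The naming scheme here is an assumption, not something Coolify enforces:

```shell
# Build a date-stamped filename for the pg_dump redirect above
STAMP=$(date +%Y%m%d-%H%M%S)
BACKUP_FILE="fiscal-${STAMP}.sql"
echo "$BACKUP_FILE"
# e.g. docker-compose exec postgres pg_dump -U postgres -d fiscal > "$BACKUP_FILE"
```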

### Monitoring

Set up monitoring in `/data/workspace/memory/coolify-integration.md`:

```markdown
## Applications
- fiscal-backend - https://api.fiscal.b11studio.xyz
- fiscal-frontend - https://fiscal.b11studio.xyz

## Monitoring
- API health: /api/health
- Database status: Coolify metrics
- Deployment logs: Coolify dashboard

## Alerts
- Discord: #alerts channel
- Email: Optional SMTP configuration
```

## Security Considerations

### Database
- PostgreSQL 16-alpine (latest stable)
- Strong random password required
- No exposed ports (only internal)
- Coolify provides SSL/TLS

### API
- JWT tokens with 30-day expiration
- CORS configured for allowed origins
- Rate limiting recommended (add Nginx/Cloudflare)

### Authentication
- Secure OAuth callback URLs
- HTTPS only
- 2FA recommended for sensitive operations

### Secrets Management

Never commit secrets to repository. Use Coolify environment variables:

❌ DO NOT COMMIT:
- API keys
- Database passwords
- JWT secrets
- OAuth client secrets
- SMTP credentials

✅ DO USE:
- Coolify environment variables
- Encrypted secrets in Coolify
- Separate config files for local development

## Scaling

### Horizontal Scaling

Add more application replicas:

1. In Coolify application settings
2. Click "Resources"
3. Adjust replica count

### Database Scaling

For high load, consider:

1. Separate PostgreSQL instance
2. Connection pooling optimization
3. Read replicas for queries

### Caching

Redis is included in docker-compose.yml. Configure application caching:

```javascript
// In backend services (assumes a connected `redis` client is in scope)
const cache = {
  get: async (key) => {
    // Try Redis first; treat any Redis error as a cache miss
    try {
      const value = await redis.get(key);
      return value;
    } catch (err) {
      return null;
    }
  },
  set: async (key, value, ttl = 3600) => {
    await redis.setex(key, ttl, value);
  }
};
```

## Rollback Procedure

### Rollback to Previous Version

1. In Gitea, tag previous version:
   ```bash
   git tag v1.0.0
   git push origin v1.0.0
   ```

2. In Coolify, select previous commit:
   - Application Settings → Deployments
   - Select previous deployment
   - Click "Rollback"

### Emergency Rollback

If critical issues occur:

1. Stop application in Coolify
2. Deploy known good version
3. Restore database from backup
4. Verify functionality
5. Document incident

## API Usage

### SEC Filings API

```
GET /api/filings/{ticker}
GET /api/filings
POST /api/filings/refresh/{ticker}
```

### Portfolio API

```
GET /api/portfolio/{userId}
GET /api/portfolio/{userId}/summary
POST /api/portfolio
PUT /api/portfolio/{id}
DELETE /api/portfolio/{id}
```

### Watchlist API

```
GET /api/watchlist/{userId}
POST /api/watchlist
DELETE /api/watchlist/{id}
```

## Support

### Documentation Links

- **Fiscal Clone README:** `/data/workspace/fiscal-clone/README.md`
- **Gitea Docs:** https://docs.gitea.io/en-us/
- **Coolify Docs:** https://docs.coolify.io/
- **NextAuth Docs:** https://next-auth.js.org/
- **Elysia Docs:** https://elysiajs.com/
- **PostgreSQL:** https://www.postgresql.org/docs/

### Troubleshooting Commands

```bash
# Check container logs
docker-compose logs -f backend

# Check all containers
docker-compose ps

# Restart specific service
docker-compose restart backend

# Check database connection
docker-compose exec backend bun run db:migrate

# View resource usage
docker stats

# Clean up resources
docker system prune -f
```

## Success Criteria

Deployment is successful when:

- [ ] All containers are running
- [ ] Health check passes: `/api/health`
- [ ] Frontend loads at domain
- [ ] Can create account and login
- [ ] Can add stocks to watchlist
- [ ] SEC filings are being fetched
- [ ] Database migrations completed
- [ ] No errors in logs

---

**Deployment Guide Version:** 1.0
**Last Updated:** 2026-02-15
**Status:** Ready for deployment
@@ -1,5 +0,0 @@
# Direct Coolify Deployment

Use `/Users/francescobrassesco/Coding/fiscal clone/fiscal-clone/COOLIFY.md` as the canonical deployment guide for Fiscal Clone 2.0.

This file is retained only as a compatibility entrypoint.
45
README.md
@@ -1,30 +1,16 @@
# Fiscal Clone 3.0 (Turbopack Rebuild)
# Fiscal Clone 3.0

Ground-up rebuild into a single Next.js 16 application that runs with Turbopack and internal API routes.
Turbopack-first rebuild of a fiscal.ai-style terminal with OpenClaw integration.

## What changed
## Stack

- Removed hard runtime dependency on the external backend for core app workflows.
- Added internal `app/api/*` services for watchlist, portfolio, filings, tasks, and health.
- Added durable local data store at runtime (`frontend/data/store.json`).
- Added async task engine with retry support for:
  - `sync_filings`
  - `refresh_prices`
  - `analyze_filing`
  - `portfolio_insights`
- Added OpenClaw integration through OpenAI-compatible `/v1/chat/completions`.
- Enforced Turbopack for both development and production builds.
- Next.js 16 App Router
- Turbopack for `dev` and `build`
- Internal API routes (`app/api/*`)
- Durable local task engine and JSON data store
- OpenClaw/ZeroClaw analysis via OpenAI-compatible chat endpoint

## Architecture

- `frontend/`: full app (UI + API + task engine)
- `frontend/app/api/*`: route handlers
- `frontend/lib/server/*`: storage, task processors, SEC/pricing adapters, OpenClaw client
- `frontend/data/store.json`: generated local runtime state (git-ignored)

The legacy `backend/` folder is retained in-repo but no longer required for the rebuilt local workflow.

## Run
## Run locally

```bash
cd frontend
@@ -32,9 +18,9 @@ npm install
npm run dev
```

Open: [http://localhost:3000](http://localhost:3000)
Open [http://localhost:3000](http://localhost:3000).

## Build (Turbopack)
## Production build

```bash
cd frontend
@@ -42,18 +28,21 @@ npm run build
npm run start
```

## OpenClaw setup
## Environment

Set in environment (for example `frontend/.env.local`):
Use root `.env` or `frontend/.env.local`:

```env
# leave blank for same-origin API
NEXT_PUBLIC_API_URL=

OPENCLAW_BASE_URL=http://localhost:4000
OPENCLAW_API_KEY=your_key
OPENCLAW_MODEL=zeroclaw
SEC_USER_AGENT=Fiscal Clone <support@fiscal.local>
```
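Leaving `NEXT_PUBLIC_API_URL` blank makes the client build relative URLs, which resolve against the same origin that served the page. A minimal sketch of that path construction:

```shell
# Empty NEXT_PUBLIC_API_URL yields relative (same-origin) API paths
NEXT_PUBLIC_API_URL=""
echo "${NEXT_PUBLIC_API_URL}/api/health"   # relative path, resolved same-origin
```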

If OpenClaw is not configured, the app falls back to local analysis responses so task flows remain testable.
If OpenClaw is unset, the app uses local fallback analysis so task workflows still run.

## API surface

@@ -1,17 +0,0 @@
FROM node:20-alpine

WORKDIR /app

RUN npm install -g bun && npm install -g npm@latest

COPY package.json bun.lock* ./
RUN bun install --frozen-lockfile || bun install

COPY . .

ENV NODE_ENV=production
ENV PORT=3001

EXPOSE 3001

CMD ["bun", "run", "src/index.ts"]
158
backend/bun.lock
@@ -1,158 +0,0 @@
|
||||
{
  "lockfileVersion": 1,
  "configVersion": 1,
  "workspaces": {
    "": {
      "name": "fiscal-backend",
      "dependencies": {
        "@elysiajs/cors": "^1.4.1",
        "@elysiajs/swagger": "^1.3.1",
        "better-auth": "^1.4.18",
        "dotenv": "^17.3.1",
        "elysia": "^1.4.25",
        "pg": "^8.18.0",
        "postgres": "^3.4.8",
        "zod": "^4.3.6",
      },
      "devDependencies": {
        "@types/pg": "^8.16.0",
        "bun-types": "latest",
      },
    },
  },
  "packages": {
    "@better-auth/core": ["@better-auth/core@1.4.18", "", { "dependencies": { "@standard-schema/spec": "^1.0.0", "zod": "^4.3.5" }, "peerDependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.21", "better-call": "1.1.8", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1" } }, "sha512-q+awYgC7nkLEBdx2sW0iJjkzgSHlIxGnOpsN1r/O1+a4m7osJNHtfK2mKJSL1I+GfNyIlxJF8WvD/NLuYMpmcg=="],

    "@better-auth/telemetry": ["@better-auth/telemetry@1.4.18", "", { "dependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.21" }, "peerDependencies": { "@better-auth/core": "1.4.18" } }, "sha512-e5rDF8S4j3Um/0LIVATL2in9dL4lfO2fr2v1Wio4qTMRbfxqnUDTa+6SZtwdeJrbc4O+a3c+IyIpjG9Q/6GpfQ=="],

    "@better-auth/utils": ["@better-auth/utils@0.3.0", "", {}, "sha512-W+Adw6ZA6mgvnSnhOki270rwJ42t4XzSK6YWGF//BbVXL6SwCLWfyzBc1lN2m/4RM28KubdBKQ4X5VMoLRNPQw=="],

    "@better-fetch/fetch": ["@better-fetch/fetch@1.1.21", "", {}, "sha512-/ImESw0sskqlVR94jB+5+Pxjf+xBwDZF/N5+y2/q4EqD7IARUTSpPfIo8uf39SYpCxyOCtbyYpUrZ3F/k0zT4A=="],

    "@borewit/text-codec": ["@borewit/text-codec@0.2.1", "", {}, "sha512-k7vvKPbf7J2fZ5klGRD9AeKfUvojuZIQ3BT5u7Jfv+puwXkUBUT5PVyMDfJZpy30CBDXGMgw7fguK/lpOMBvgw=="],

    "@elysiajs/cors": ["@elysiajs/cors@1.4.1", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-lQfad+F3r4mNwsxRKbXyJB8Jg43oAOXjRwn7sKUL6bcOW3KjUqUimTS+woNpO97efpzjtDE0tEjGk9DTw8lqTQ=="],

    "@elysiajs/swagger": ["@elysiajs/swagger@1.3.1", "", { "dependencies": { "@scalar/themes": "^0.9.52", "@scalar/types": "^0.0.12", "openapi-types": "^12.1.3", "pathe": "^1.1.2" }, "peerDependencies": { "elysia": ">= 1.3.0" } }, "sha512-LcbLHa0zE6FJKWPWKsIC/f+62wbDv3aXydqcNPVPyqNcaUgwvCajIi+5kHEU6GO3oXUCpzKaMsb3gsjt8sLzFQ=="],

    "@noble/ciphers": ["@noble/ciphers@2.1.1", "", {}, "sha512-bysYuiVfhxNJuldNXlFEitTVdNnYUc+XNJZd7Qm2a5j1vZHgY+fazadNFWFaMK/2vye0JVlxV3gHmC0WDfAOQw=="],

    "@noble/hashes": ["@noble/hashes@2.0.1", "", {}, "sha512-XlOlEbQcE9fmuXxrVTXCTlG2nlRXa9Rj3rr5Ue/+tX+nmkgbX720YHh0VR3hBF9xDvwnb8D2shVGOwNx+ulArw=="],

    "@scalar/openapi-types": ["@scalar/openapi-types@0.1.1", "", {}, "sha512-NMy3QNk6ytcCoPUGJH0t4NNr36OWXgZhA3ormr3TvhX1NDgoF95wFyodGVH8xiHeUyn2/FxtETm8UBLbB5xEmg=="],

    "@scalar/themes": ["@scalar/themes@0.9.86", "", { "dependencies": { "@scalar/types": "0.1.7" } }, "sha512-QUHo9g5oSWi+0Lm1vJY9TaMZRau8LHg+vte7q5BVTBnu6NuQfigCaN+ouQ73FqIVd96TwMO6Db+dilK1B+9row=="],

    "@scalar/types": ["@scalar/types@0.0.12", "", { "dependencies": { "@scalar/openapi-types": "0.1.1", "@unhead/schema": "^1.9.5" } }, "sha512-XYZ36lSEx87i4gDqopQlGCOkdIITHHEvgkuJFrXFATQs9zHARop0PN0g4RZYWj+ZpCUclOcaOjbCt8JGe22mnQ=="],

    "@sinclair/typebox": ["@sinclair/typebox@0.34.48", "", {}, "sha512-kKJTNuK3AQOrgjjotVxMrCn1sUJwM76wMszfq1kdU4uYVJjvEWuFQ6HgvLt4Xz3fSmZlTOxJ/Ie13KnIcWQXFA=="],

    "@standard-schema/spec": ["@standard-schema/spec@1.1.0", "", {}, "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w=="],

    "@tokenizer/inflate": ["@tokenizer/inflate@0.4.1", "", { "dependencies": { "debug": "^4.4.3", "token-types": "^6.1.1" } }, "sha512-2mAv+8pkG6GIZiF1kNg1jAjh27IDxEPKwdGul3snfztFerfPGI1LjDezZp3i7BElXompqEtPmoPx6c2wgtWsOA=="],

    "@tokenizer/token": ["@tokenizer/token@0.3.0", "", {}, "sha512-OvjF+z51L3ov0OyAU0duzsYuvO01PH7x4t6DJx+guahgTnBHkhJdG7soQeTSFLWN3efnHyibZ4Z8l2EuWwJN3A=="],

    "@types/node": ["@types/node@25.3.0", "", { "dependencies": { "undici-types": "~7.18.0" } }, "sha512-4K3bqJpXpqfg2XKGK9bpDTc6xO/xoUP/RBWS7AtRMug6zZFaRekiLzjVtAoZMquxoAbzBvy5nxQ7veS5eYzf8A=="],

    "@types/pg": ["@types/pg@8.16.0", "", { "dependencies": { "@types/node": "*", "pg-protocol": "*", "pg-types": "^2.2.0" } }, "sha512-RmhMd/wD+CF8Dfo+cVIy3RR5cl8CyfXQ0tGgW6XBL8L4LM/UTEbNXYRbLwU6w+CgrKBNbrQWt4FUtTfaU5jSYQ=="],

    "@unhead/schema": ["@unhead/schema@1.11.20", "", { "dependencies": { "hookable": "^5.5.3", "zhead": "^2.2.4" } }, "sha512-0zWykKAaJdm+/Y7yi/Yds20PrUK7XabLe9c3IRcjnwYmSWY6z0Cr19VIs3ozCj8P+GhR+/TI2mwtGlueCEYouA=="],

    "better-auth": ["better-auth@1.4.18", "", { "dependencies": { "@better-auth/core": "1.4.18", "@better-auth/telemetry": "1.4.18", "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.21", "@noble/ciphers": "^2.0.0", "@noble/hashes": "^2.0.0", "better-call": "1.1.8", "defu": "^6.1.4", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1", "zod": "^4.3.5" }, "peerDependencies": { "@lynx-js/react": "*", "@prisma/client": "^5.0.0 || ^6.0.0 || ^7.0.0", "@sveltejs/kit": "^2.0.0", "@tanstack/react-start": "^1.0.0", "@tanstack/solid-start": "^1.0.0", "better-sqlite3": "^12.0.0", "drizzle-kit": ">=0.31.4", "drizzle-orm": ">=0.41.0", "mongodb": "^6.0.0 || ^7.0.0", "mysql2": "^3.0.0", "next": "^14.0.0 || ^15.0.0 || ^16.0.0", "pg": "^8.0.0", "prisma": "^5.0.0 || ^6.0.0 || ^7.0.0", "react": "^18.0.0 || ^19.0.0", "react-dom": "^18.0.0 || ^19.0.0", "solid-js": "^1.0.0", "svelte": "^4.0.0 || ^5.0.0", "vitest": "^2.0.0 || ^3.0.0 || ^4.0.0", "vue": "^3.0.0" }, "optionalPeers": ["@lynx-js/react", "@prisma/client", "@sveltejs/kit", "@tanstack/react-start", "@tanstack/solid-start", "better-sqlite3", "drizzle-kit", "drizzle-orm", "mongodb", "mysql2", "next", "pg", "prisma", "react", "react-dom", "solid-js", "svelte", "vitest", "vue"] }, "sha512-bnyifLWBPcYVltH3RhS7CM62MoelEqC6Q+GnZwfiDWNfepXoQZBjEvn4urcERC7NTKgKq5zNBM8rvPvRBa6xcg=="],

    "better-call": ["better-call@1.1.8", "", { "dependencies": { "@better-auth/utils": "^0.3.0", "@better-fetch/fetch": "^1.1.4", "rou3": "^0.7.10", "set-cookie-parser": "^2.7.1" }, "peerDependencies": { "zod": "^4.0.0" }, "optionalPeers": ["zod"] }, "sha512-XMQ2rs6FNXasGNfMjzbyroSwKwYbZ/T3IxruSS6U2MJRsSYh3wYtG3o6H00ZlKZ/C/UPOAD97tqgQJNsxyeTXw=="],

    "bun-types": ["bun-types@1.3.9", "", { "dependencies": { "@types/node": "*" } }, "sha512-+UBWWOakIP4Tswh0Bt0QD0alpTY8cb5hvgiYeWCMet9YukHbzuruIEeXC2D7nMJPB12kbh8C7XJykSexEqGKJg=="],

    "cookie": ["cookie@1.1.1", "", {}, "sha512-ei8Aos7ja0weRpFzJnEA9UHJ/7XQmqglbRwnf2ATjcB9Wq874VKH9kfjjirM6UhU2/E5fFYadylyhFldcqSidQ=="],

    "debug": ["debug@4.4.3", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="],

    "defu": ["defu@6.1.4", "", {}, "sha512-mEQCMmwJu317oSz8CwdIOdwf3xMif1ttiM8LTufzc3g6kR+9Pe236twL8j3IYT1F7GfRgGcW6MWxzZjLIkuHIg=="],

    "dotenv": ["dotenv@17.3.1", "", {}, "sha512-IO8C/dzEb6O3F9/twg6ZLXz164a2fhTnEWb95H23Dm4OuN+92NmEAlTrupP9VW6Jm3sO26tQlqyvyi4CsnY9GA=="],

    "elysia": ["elysia@1.4.25", "", { "dependencies": { "cookie": "^1.1.1", "exact-mirror": "^0.2.7", "fast-decode-uri-component": "^1.0.1", "memoirist": "^0.4.0" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "@types/bun": ">= 1.2.0", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["@types/bun", "typescript"] }, "sha512-liKjavH99Gpzrv9cDil6uYWmPuqESfPFV1FIaFSd3iNqo3y7e29sN43VxFIK8tWWnyi6eDAmi2SZk8hNAMQMyg=="],

    "exact-mirror": ["exact-mirror@0.2.7", "", { "peerDependencies": { "@sinclair/typebox": "^0.34.15" }, "optionalPeers": ["@sinclair/typebox"] }, "sha512-+MeEmDcLA4o/vjK2zujgk+1VTxPR4hdp23qLqkWfStbECtAq9gmsvQa3LW6z/0GXZyHJobrCnmy1cdeE7BjsYg=="],

    "fast-decode-uri-component": ["fast-decode-uri-component@1.0.1", "", {}, "sha512-WKgKWg5eUxvRZGwW8FvfbaH7AXSh2cL+3j5fMGzUMCxWBJ3dV3a7Wz8y2f/uQ0e3B6WmodD3oS54jTQ9HVTIIg=="],

    "file-type": ["file-type@21.3.0", "", { "dependencies": { "@tokenizer/inflate": "^0.4.1", "strtok3": "^10.3.4", "token-types": "^6.1.1", "uint8array-extras": "^1.4.0" } }, "sha512-8kPJMIGz1Yt/aPEwOsrR97ZyZaD1Iqm8PClb1nYFclUCkBi0Ma5IsYNQzvSFS9ib51lWyIw5mIT9rWzI/xjpzA=="],

    "hookable": ["hookable@5.5.3", "", {}, "sha512-Yc+BQe8SvoXH1643Qez1zqLRmbA5rCL+sSmk6TVos0LWVfNIB7PGncdlId77WzLGSIB5KaWgTaNTs2lNVEI6VQ=="],

    "ieee754": ["ieee754@1.2.1", "", {}, "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="],

    "jose": ["jose@6.1.3", "", {}, "sha512-0TpaTfihd4QMNwrz/ob2Bp7X04yuxJkjRGi4aKmOqwhov54i6u79oCv7T+C7lo70MKH6BesI3vscD1yb/yzKXQ=="],

    "kysely": ["kysely@0.28.11", "", {}, "sha512-zpGIFg0HuoC893rIjYX1BETkVWdDnzTzF5e0kWXJFg5lE0k1/LfNWBejrcnOFu8Q2Rfq/hTDTU7XLUM8QOrpzg=="],

    "memoirist": ["memoirist@0.4.0", "", {}, "sha512-zxTgA0mSYELa66DimuNQDvyLq36AwDlTuVRbnQtB+VuTcKWm5Qc4z3WkSpgsFWHNhexqkIooqpv4hdcqrX5Nmg=="],

    "ms": ["ms@2.1.3", "", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="],

    "nanoid": ["nanoid@5.1.6", "", { "bin": { "nanoid": "bin/nanoid.js" } }, "sha512-c7+7RQ+dMB5dPwwCp4ee1/iV/q2P6aK1mTZcfr1BTuVlyW9hJYiMPybJCcnBlQtuSmTIWNeazm/zqNoZSSElBg=="],

    "nanostores": ["nanostores@1.1.0", "", {}, "sha512-yJBmDJr18xy47dbNVlHcgdPrulSn1nhSE6Ns9vTG+Nx9VPT6iV1MD6aQFp/t52zpf82FhLLTXAXr30NuCnxvwA=="],

    "openapi-types": ["openapi-types@12.1.3", "", {}, "sha512-N4YtSYJqghVu4iek2ZUvcN/0aqH1kRDuNqzcycDxhOUpg7GdvLa2F3DgS6yBNhInhv2r/6I0Flkn7CqL8+nIcw=="],

    "pathe": ["pathe@1.1.2", "", {}, "sha512-whLdWMYL2TwI08hn8/ZqAbrVemu0LNaNNJZX73O6qaIdCTfXutsLhMkjdENX0qhsQ9uIimo4/aQOmXkoon2nDQ=="],

    "pg": ["pg@8.18.0", "", { "dependencies": { "pg-connection-string": "^2.11.0", "pg-pool": "^3.11.0", "pg-protocol": "^1.11.0", "pg-types": "2.2.0", "pgpass": "1.0.5" }, "optionalDependencies": { "pg-cloudflare": "^1.3.0" }, "peerDependencies": { "pg-native": ">=3.0.1" }, "optionalPeers": ["pg-native"] }, "sha512-xqrUDL1b9MbkydY/s+VZ6v+xiMUmOUk7SS9d/1kpyQxoJ6U9AO1oIJyUWVZojbfe5Cc/oluutcgFG4L9RDP1iQ=="],

    "pg-cloudflare": ["pg-cloudflare@1.3.0", "", {}, "sha512-6lswVVSztmHiRtD6I8hw4qP/nDm1EJbKMRhf3HCYaqud7frGysPv7FYJ5noZQdhQtN2xJnimfMtvQq21pdbzyQ=="],

    "pg-connection-string": ["pg-connection-string@2.11.0", "", {}, "sha512-kecgoJwhOpxYU21rZjULrmrBJ698U2RxXofKVzOn5UDj61BPj/qMb7diYUR1nLScCDbrztQFl1TaQZT0t1EtzQ=="],

    "pg-int8": ["pg-int8@1.0.1", "", {}, "sha512-WCtabS6t3c8SkpDBUlb1kjOs7l66xsGdKpIPZsg4wR+B3+u9UAum2odSsF9tnvxg80h4ZxLWMy4pRjOsFIqQpw=="],

    "pg-pool": ["pg-pool@3.11.0", "", { "peerDependencies": { "pg": ">=8.0" } }, "sha512-MJYfvHwtGp870aeusDh+hg9apvOe2zmpZJpyt+BMtzUWlVqbhFmMK6bOBXLBUPd7iRtIF9fZplDc7KrPN3PN7w=="],

    "pg-protocol": ["pg-protocol@1.11.0", "", {}, "sha512-pfsxk2M9M3BuGgDOfuy37VNRRX3jmKgMjcvAcWqNDpZSf4cUmv8HSOl5ViRQFsfARFn0KuUQTgLxVMbNq5NW3g=="],

    "pg-types": ["pg-types@2.2.0", "", { "dependencies": { "pg-int8": "1.0.1", "postgres-array": "~2.0.0", "postgres-bytea": "~1.0.0", "postgres-date": "~1.0.4", "postgres-interval": "^1.1.0" } }, "sha512-qTAAlrEsl8s4OiEQY69wDvcMIdQN6wdz5ojQiOy6YRMuynxenON0O5oCpJI6lshc6scgAY8qvJ2On/p+CXY0GA=="],

    "pgpass": ["pgpass@1.0.5", "", { "dependencies": { "split2": "^4.1.0" } }, "sha512-FdW9r/jQZhSeohs1Z3sI1yxFQNFvMcnmfuj4WBMUTxOrAyLMaTcE1aAMBiTlbMNaXvBCQuVi0R7hd8udDSP7ug=="],

    "postgres": ["postgres@3.4.8", "", {}, "sha512-d+JFcLM17njZaOLkv6SCev7uoLaBtfK86vMUXhW1Z4glPWh4jozno9APvW/XKFJ3CCxVoC7OL38BqRydtu5nGg=="],

    "postgres-array": ["postgres-array@2.0.0", "", {}, "sha512-VpZrUqU5A69eQyW2c5CA1jtLecCsN2U/bD6VilrFDWq5+5UIEVO7nazS3TEcHf1zuPYO/sqGvUvW62g86RXZuA=="],

    "postgres-bytea": ["postgres-bytea@1.0.1", "", {}, "sha512-5+5HqXnsZPE65IJZSMkZtURARZelel2oXUEO8rH83VS/hxH5vv1uHquPg5wZs8yMAfdv971IU+kcPUczi7NVBQ=="],

    "postgres-date": ["postgres-date@1.0.7", "", {}, "sha512-suDmjLVQg78nMK2UZ454hAG+OAW+HQPZ6n++TNDUX+L0+uUlLywnoxJKDou51Zm+zTCjrCl0Nq6J9C5hP9vK/Q=="],

    "postgres-interval": ["postgres-interval@1.2.0", "", { "dependencies": { "xtend": "^4.0.0" } }, "sha512-9ZhXKM/rw350N1ovuWHbGxnGh/SNJ4cnxHiM0rxE4VN41wsg8P8zWn9hv/buK00RP4WvlOyr/RBDiptyxVbkZQ=="],

    "rou3": ["rou3@0.7.12", "", {}, "sha512-iFE4hLDuloSWcD7mjdCDhx2bKcIsYbtOTpfH5MHHLSKMOUyjqQXTeZVa289uuwEGEKFoE/BAPbhaU4B774nceg=="],

    "set-cookie-parser": ["set-cookie-parser@2.7.2", "", {}, "sha512-oeM1lpU/UvhTxw+g3cIfxXHyJRc/uidd3yK1P242gzHds0udQBYzs3y8j4gCCW+ZJ7ad0yctld8RYO+bdurlvw=="],

    "split2": ["split2@4.2.0", "", {}, "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg=="],

    "strtok3": ["strtok3@10.3.4", "", { "dependencies": { "@tokenizer/token": "^0.3.0" } }, "sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg=="],

    "token-types": ["token-types@6.1.2", "", { "dependencies": { "@borewit/text-codec": "^0.2.1", "@tokenizer/token": "^0.3.0", "ieee754": "^1.2.1" } }, "sha512-dRXchy+C0IgK8WPC6xvCHFRIWYUbqqdEIKPaKo/AcTUNzwLTK6AH7RjdLWsEZcAN/TBdtfUw3PYEgPr5VPr6ww=="],

    "type-fest": ["type-fest@4.41.0", "", {}, "sha512-TeTSQ6H5YHvpqVwBRcnLDCBnDOHWYu7IvGbHT6N8AOymcr9PJGjc1GTtiWZTYg0NCgYwvnYWEkVChQAr9bjfwA=="],

    "uint8array-extras": ["uint8array-extras@1.5.0", "", {}, "sha512-rvKSBiC5zqCCiDZ9kAOszZcDvdAHwwIKJG33Ykj43OKcWsnmcBRL09YTU4nOeHZ8Y2a7l1MgTd08SBe9A8Qj6A=="],

    "undici-types": ["undici-types@7.18.2", "", {}, "sha512-AsuCzffGHJybSaRrmr5eHr81mwJU3kjw6M+uprWvCXiNeN9SOGwQ3Jn8jb8m3Z6izVgknn1R0FTCEAP2QrLY/w=="],

    "xtend": ["xtend@4.0.2", "", {}, "sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ=="],

    "zhead": ["zhead@2.2.4", "", {}, "sha512-8F0OI5dpWIA5IGG5NHUg9staDwz/ZPxZtvGVf01j7vHqSyZ0raHY+78atOVxRqb73AotX22uV1pXt3gYSstGag=="],

    "zod": ["zod@4.3.6", "", {}, "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg=="],

    "@scalar/themes/@scalar/types": ["@scalar/types@0.1.7", "", { "dependencies": { "@scalar/openapi-types": "0.2.0", "@unhead/schema": "^1.11.11", "nanoid": "^5.1.5", "type-fest": "^4.20.0", "zod": "^3.23.8" } }, "sha512-irIDYzTQG2KLvFbuTI8k2Pz/R4JR+zUUSykVTbEMatkzMmVFnn1VzNSMlODbadycwZunbnL2tA27AXed9URVjw=="],

    "@scalar/themes/@scalar/types/@scalar/openapi-types": ["@scalar/openapi-types@0.2.0", "", { "dependencies": { "zod": "^3.23.8" } }, "sha512-waiKk12cRCqyUCWTOX0K1WEVX46+hVUK+zRPzAahDJ7G0TApvbNkuy5wx7aoUyEk++HHde0XuQnshXnt8jsddA=="],

    "@scalar/themes/@scalar/types/zod": ["zod@3.25.76", "", {}, "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ=="],
  }
}
@@ -1,64 +0,0 @@
services:
  postgres:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-postgres}
      POSTGRES_DB: ${POSTGRES_DB:-fiscal}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U ${POSTGRES_USER:-postgres} -d ${POSTGRES_DB:-fiscal}']
      interval: 5s
      timeout: 5s
      retries: 10

  backend:
    build:
      context: .
      dockerfile: Dockerfile
    restart: unless-stopped
    command: ['sh', '-c', 'bun run src/db/migrate.ts && bun run src/index.ts']
    environment:
      DATABASE_URL: ${DATABASE_URL:-postgres://postgres:postgres@postgres:5432/fiscal}
      PORT: ${PORT:-3001}
      FRONTEND_URL: ${FRONTEND_URL:-http://localhost:3000}
      BETTER_AUTH_SECRET: ${BETTER_AUTH_SECRET:-local-dev-better-auth-secret-change-me}
      BETTER_AUTH_BASE_URL: ${BETTER_AUTH_BASE_URL:-http://localhost:3001}
      SEC_USER_AGENT: ${SEC_USER_AGENT:-Fiscal Clone <support@example.com>}
      OPENCLAW_BASE_URL: ${OPENCLAW_BASE_URL:-}
      OPENCLAW_API_KEY: ${OPENCLAW_API_KEY:-}
      OPENCLAW_MODEL: ${OPENCLAW_MODEL:-zeroclaw}
      TASK_HEARTBEAT_SECONDS: ${TASK_HEARTBEAT_SECONDS:-15}
      TASK_STALE_SECONDS: ${TASK_STALE_SECONDS:-120}
      TASK_MAX_ATTEMPTS: ${TASK_MAX_ATTEMPTS:-3}
    depends_on:
      postgres:
        condition: service_healthy

  worker:
    build:
      context: .
      dockerfile: Dockerfile
    restart: unless-stopped
    command: ['sh', '-c', 'bun run src/db/migrate.ts && bun run src/worker.ts']
    environment:
      DATABASE_URL: ${DATABASE_URL:-postgres://postgres:postgres@postgres:5432/fiscal}
      PORT: ${PORT:-3001}
      FRONTEND_URL: ${FRONTEND_URL:-http://localhost:3000}
      BETTER_AUTH_SECRET: ${BETTER_AUTH_SECRET:-local-dev-better-auth-secret-change-me}
      BETTER_AUTH_BASE_URL: ${BETTER_AUTH_BASE_URL:-http://localhost:3001}
      SEC_USER_AGENT: ${SEC_USER_AGENT:-Fiscal Clone <support@example.com>}
      OPENCLAW_BASE_URL: ${OPENCLAW_BASE_URL:-}
      OPENCLAW_API_KEY: ${OPENCLAW_API_KEY:-}
      OPENCLAW_MODEL: ${OPENCLAW_MODEL:-zeroclaw}
      TASK_HEARTBEAT_SECONDS: ${TASK_HEARTBEAT_SECONDS:-15}
      TASK_STALE_SECONDS: ${TASK_STALE_SECONDS:-120}
      TASK_MAX_ATTEMPTS: ${TASK_MAX_ATTEMPTS:-3}
    depends_on:
      postgres:
        condition: service_healthy

volumes:
  postgres_data:
@@ -1,26 +0,0 @@
{
  "name": "fiscal-backend",
  "version": "2.0.0",
  "private": true,
  "scripts": {
    "dev": "bun run --watch src/index.ts",
    "dev:worker": "bun run --watch src/worker.ts",
    "start": "bun run src/index.ts",
    "start:worker": "bun run src/worker.ts",
    "db:migrate": "bun run src/db/migrate.ts"
  },
  "dependencies": {
    "@elysiajs/cors": "^1.4.1",
    "@elysiajs/swagger": "^1.3.1",
    "better-auth": "^1.4.18",
    "dotenv": "^17.3.1",
    "elysia": "^1.4.25",
    "pg": "^8.18.0",
    "postgres": "^3.4.8",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "@types/pg": "^8.16.0",
    "bun-types": "latest"
  }
}
@@ -1,38 +0,0 @@
import { betterAuth } from 'better-auth';
import { Pool } from 'pg';
import { env } from './config';

const pool = new Pool({
  connectionString: env.DATABASE_URL,
  max: 20,
  idleTimeoutMillis: 30_000
});

export const auth = betterAuth({
  secret: env.BETTER_AUTH_SECRET,
  baseURL: env.BETTER_AUTH_BASE_URL,
  database: pool,
  trustedOrigins: env.FRONTEND_ORIGINS,
  emailAndPassword: {
    enabled: true,
    autoSignIn: true
  },
  user: {
    modelName: 'users',
    additionalFields: {
      name: {
        type: 'string',
        required: false
      },
      image: {
        type: 'string',
        required: false
      }
    }
  },
  advanced: {
    database: {
      generateId: false
    }
  }
});
@@ -1,47 +0,0 @@
import * as dotenv from 'dotenv';
import { z } from 'zod';

dotenv.config();

const schema = z.object({
  NODE_ENV: z.enum(['development', 'test', 'production']).default('development'),
  PORT: z.coerce.number().int().positive().default(3001),
  DATABASE_URL: z.string().optional(),
  POSTGRES_USER: z.string().default('postgres'),
  POSTGRES_PASSWORD: z.string().default('postgres'),
  POSTGRES_HOST: z.string().default('localhost'),
  POSTGRES_DB: z.string().default('fiscal'),
  FRONTEND_URL: z.string().default('http://localhost:3000'),
  BETTER_AUTH_SECRET: z.string().min(16).default('local-dev-better-auth-secret-change-me-1234'),
  BETTER_AUTH_BASE_URL: z.string().url().default('http://localhost:3001'),
  SEC_USER_AGENT: z.string().default('Fiscal Clone <support@fiscal.local>'),
  OPENCLAW_BASE_URL: z.preprocess(
    (value) => (typeof value === 'string' && value.trim() === '' ? undefined : value),
    z.string().url().optional()
  ),
  OPENCLAW_API_KEY: z.preprocess(
    (value) => (typeof value === 'string' && value.trim() === '' ? undefined : value),
    z.string().optional()
  ),
  OPENCLAW_MODEL: z.string().default('zeroclaw'),
  TASK_HEARTBEAT_SECONDS: z.coerce.number().int().positive().default(15),
  TASK_STALE_SECONDS: z.coerce.number().int().positive().default(120),
  TASK_MAX_ATTEMPTS: z.coerce.number().int().positive().default(3)
});

const parsed = schema.safeParse(process.env);

if (!parsed.success) {
  console.error('Invalid environment configuration', parsed.error.flatten().fieldErrors);
  throw new Error('Invalid environment variables');
}

const rawEnv = parsed.data;
const databaseUrl = rawEnv.DATABASE_URL
  ?? `postgres://${rawEnv.POSTGRES_USER}:${rawEnv.POSTGRES_PASSWORD}@${rawEnv.POSTGRES_HOST}:5432/${rawEnv.POSTGRES_DB}`;

export const env = {
  ...rawEnv,
  DATABASE_URL: databaseUrl,
  FRONTEND_ORIGINS: rawEnv.FRONTEND_URL.split(',').map((origin) => origin.trim()).filter(Boolean)
};
@@ -1,13 +0,0 @@
import postgres from 'postgres';
import { env } from '../config';

export const db = postgres(env.DATABASE_URL, {
  max: 20,
  idle_timeout: 20,
  connect_timeout: 10,
  prepare: true
});

export async function closeDb() {
  await db.end({ timeout: 5 });
}
@@ -1,256 +0,0 @@
import { db } from './index';

async function migrate() {
  console.log('Running database migrations...');

  await db`CREATE EXTENSION IF NOT EXISTS pgcrypto`;

  await db`
    CREATE TABLE IF NOT EXISTS users (
      id SERIAL PRIMARY KEY,
      email TEXT UNIQUE NOT NULL,
      email_verified BOOLEAN NOT NULL DEFAULT FALSE,
      name TEXT,
      image TEXT,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `;

  await db`ALTER TABLE users ADD COLUMN IF NOT EXISTS email_verified BOOLEAN NOT NULL DEFAULT FALSE`;
  await db`ALTER TABLE users ADD COLUMN IF NOT EXISTS image TEXT`;

  await db`
    DO $$
    BEGIN
      IF EXISTS (
        SELECT 1
        FROM information_schema.columns
        WHERE table_name = 'users'
          AND column_name = 'password'
      ) THEN
        EXECUTE 'ALTER TABLE users ALTER COLUMN password DROP NOT NULL';
      END IF;
    END
    $$
  `;

  await db`
    CREATE TABLE IF NOT EXISTS session (
      id TEXT PRIMARY KEY,
      user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
      token TEXT NOT NULL UNIQUE,
      expires_at TIMESTAMPTZ NOT NULL,
      ip_address TEXT,
      user_agent TEXT,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `;

  await db`
    CREATE TABLE IF NOT EXISTS account (
      id TEXT PRIMARY KEY,
      user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
      account_id TEXT NOT NULL,
      provider_id TEXT NOT NULL,
      access_token TEXT,
      refresh_token TEXT,
      access_token_expires_at TIMESTAMPTZ,
      refresh_token_expires_at TIMESTAMPTZ,
      scope TEXT,
      id_token TEXT,
      password TEXT,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      UNIQUE(user_id, provider_id, account_id)
    )
  `;

  await db`
    CREATE TABLE IF NOT EXISTS verification (
      id TEXT PRIMARY KEY,
      identifier TEXT NOT NULL,
      value TEXT NOT NULL,
      expires_at TIMESTAMPTZ NOT NULL,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `;

  await db`
    CREATE TABLE IF NOT EXISTS filings (
      id BIGSERIAL PRIMARY KEY,
      ticker VARCHAR(12) NOT NULL,
      filing_type VARCHAR(20) NOT NULL,
      filing_date DATE NOT NULL,
      accession_number VARCHAR(40) NOT NULL UNIQUE,
      cik VARCHAR(20) NOT NULL,
      company_name TEXT NOT NULL,
      filing_url TEXT,
      metrics JSONB,
      analysis JSONB,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `;

  await db`ALTER TABLE filings ADD COLUMN IF NOT EXISTS filing_url TEXT`;
  await db`ALTER TABLE filings ADD COLUMN IF NOT EXISTS metrics JSONB`;
  await db`ALTER TABLE filings ADD COLUMN IF NOT EXISTS analysis JSONB`;
  await db`ALTER TABLE filings ADD COLUMN IF NOT EXISTS updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()`;

  await db`
    DO $$
    BEGIN
      IF EXISTS (
        SELECT 1
        FROM information_schema.columns
        WHERE table_name = 'filings'
          AND column_name = 'key_metrics'
      ) THEN
        EXECUTE 'UPDATE filings SET metrics = COALESCE(metrics, key_metrics) WHERE metrics IS NULL';
      END IF;

      IF EXISTS (
        SELECT 1
        FROM information_schema.columns
        WHERE table_name = 'filings'
          AND column_name = 'insights'
      ) THEN
        EXECUTE $migrate$
          UPDATE filings
          SET analysis = COALESCE(analysis, jsonb_build_object('legacyInsights', insights))
          WHERE analysis IS NULL
            AND insights IS NOT NULL
        $migrate$;
      END IF;
    END
    $$
  `;

  await db`
    CREATE TABLE IF NOT EXISTS watchlist (
      id BIGSERIAL PRIMARY KEY,
      user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
      ticker VARCHAR(12) NOT NULL,
      company_name TEXT NOT NULL,
      sector VARCHAR(120),
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      UNIQUE(user_id, ticker)
    )
  `;

  await db`
    CREATE TABLE IF NOT EXISTS holdings (
      id BIGSERIAL PRIMARY KEY,
      user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
      ticker VARCHAR(12) NOT NULL,
      shares NUMERIC(20, 4) NOT NULL,
      avg_cost NUMERIC(12, 4) NOT NULL,
      current_price NUMERIC(12, 4),
      market_value NUMERIC(20, 4) GENERATED ALWAYS AS ((COALESCE(current_price, avg_cost) * shares)) STORED,
      gain_loss NUMERIC(20, 4) GENERATED ALWAYS AS (((COALESCE(current_price, avg_cost) - avg_cost) * shares)) STORED,
      gain_loss_pct NUMERIC(12, 4) GENERATED ALWAYS AS (
        CASE
          WHEN avg_cost > 0 THEN (((COALESCE(current_price, avg_cost) - avg_cost) / avg_cost) * 100)
          ELSE 0
        END
      ) STORED,
      last_price_at TIMESTAMPTZ,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      UNIQUE(user_id, ticker)
    )
  `;

  await db`
    DO $$
    BEGIN
      IF EXISTS (
        SELECT 1
        FROM information_schema.tables
        WHERE table_name = 'portfolio'
      ) THEN
        EXECUTE $migrate$
          INSERT INTO holdings (
            user_id,
            ticker,
            shares,
            avg_cost,
            current_price,
            last_price_at,
            created_at,
            updated_at
          )
          SELECT
            user_id,
            ticker,
            shares,
            avg_cost,
            current_price,
            last_updated,
            created_at,
            NOW()
          FROM portfolio
          ON CONFLICT (user_id, ticker) DO NOTHING
        $migrate$;
      END IF;
    END
    $$
  `;

  await db`
    CREATE TABLE IF NOT EXISTS portfolio_insights (
      id BIGSERIAL PRIMARY KEY,
      user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
      provider TEXT NOT NULL,
      model TEXT NOT NULL,
      content TEXT NOT NULL,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `;

  await db`
    CREATE TABLE IF NOT EXISTS long_tasks (
      id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
      task_type TEXT NOT NULL,
      status TEXT NOT NULL,
      priority INTEGER NOT NULL DEFAULT 50,
      payload JSONB NOT NULL,
      result JSONB,
      error TEXT,
      attempts INTEGER NOT NULL DEFAULT 0,
      max_attempts INTEGER NOT NULL DEFAULT 3,
      scheduled_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      started_at TIMESTAMPTZ,
      heartbeat_at TIMESTAMPTZ,
      finished_at TIMESTAMPTZ,
      created_by INTEGER REFERENCES users(id) ON DELETE SET NULL,
      created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
      CONSTRAINT long_tasks_status_check CHECK (status IN ('queued', 'running', 'completed', 'failed'))
    )
  `;

  await db`CREATE INDEX IF NOT EXISTS idx_users_email ON users(email)`;
  await db`CREATE INDEX IF NOT EXISTS idx_session_token ON session(token)`;
  await db`CREATE INDEX IF NOT EXISTS idx_session_user ON session(user_id)`;
  await db`CREATE INDEX IF NOT EXISTS idx_account_user ON account(user_id)`;
  await db`CREATE INDEX IF NOT EXISTS idx_watchlist_user ON watchlist(user_id)`;
  await db`CREATE INDEX IF NOT EXISTS idx_holdings_user ON holdings(user_id)`;
  await db`CREATE INDEX IF NOT EXISTS idx_filings_ticker_date ON filings(ticker, filing_date DESC)`;
  await db`CREATE INDEX IF NOT EXISTS idx_filings_accession ON filings(accession_number)`;
  await db`CREATE INDEX IF NOT EXISTS idx_portfolio_insights_user ON portfolio_insights(user_id, created_at DESC)`;
  await db`CREATE INDEX IF NOT EXISTS idx_long_tasks_status_sched ON long_tasks(status, scheduled_at, priority DESC, created_at)`;
  await db`CREATE INDEX IF NOT EXISTS idx_long_tasks_user ON long_tasks(created_by, created_at DESC)`;

  console.log('Migrations completed successfully.');
}

migrate()
  .then(() => process.exit(0))
  .catch((error) => {
    console.error('Migration failed', error);
    process.exit(1);
  });
@@ -1,58 +0,0 @@
import { Elysia } from 'elysia';
import { cors } from '@elysiajs/cors';
import { swagger } from '@elysiajs/swagger';
import { env } from './config';
import { db } from './db';
import { betterAuthRoutes } from './routes/better-auth';
import { filingsRoutes } from './routes/filings';
import { meRoutes } from './routes/me';
import { openclawRoutes } from './routes/openclaw';
import { portfolioRoutes } from './routes/portfolio';
import { taskRoutes } from './routes/tasks';
import { watchlistRoutes } from './routes/watchlist';

const app = new Elysia({ prefix: '/api' })
  .use(cors({
    origin: env.FRONTEND_ORIGINS,
    credentials: true,
    allowedHeaders: ['Content-Type', 'Authorization'],
    methods: ['GET', 'POST', 'PUT', 'PATCH', 'DELETE', 'OPTIONS']
  }))
  .use(swagger({
    documentation: {
      info: {
        title: 'Fiscal Clone API',
        version: '2.0.0',
        description: 'Futuristic fiscal intelligence API with durable jobs and OpenClaw integration.'
      }
    }
  }))
  .use(betterAuthRoutes)
  .use(meRoutes)
  .use(watchlistRoutes)
  .use(portfolioRoutes)
  .use(filingsRoutes)
  .use(openclawRoutes)
  .use(taskRoutes)
  .get('/health', async () => {
    const queueRows = await db`
      SELECT status, COUNT(*)::int AS count
      FROM long_tasks
      GROUP BY status
    `;

    return {
      status: 'ok',
      version: '2.0.0',
      timestamp: new Date().toISOString(),
      queue: queueRows.reduce<Record<string, number>>((acc, row) => {
        acc[row.status] = row.count;
        return acc;
      }, {})
    };
  });

app.listen(env.PORT);

console.log(`Fiscal backend listening on http://localhost:${app.server?.port}`);
console.log(`Swagger docs: http://localhost:${app.server?.port}/swagger`);
@@ -1,7 +0,0 @@
import { Elysia } from 'elysia';
import { auth } from '../auth';

export const betterAuthRoutes = new Elysia({ prefix: '/auth' })
  .all('/*', async ({ request }) => {
    return await auth.handler(request);
  });
@@ -1,16 +0,0 @@
import { UnauthorizedError } from '../session';

export function toHttpError(set: { status: number }, error: unknown) {
  if (error instanceof UnauthorizedError) {
    set.status = 401;
    return { error: error.message };
  }

  if (error instanceof Error) {
    set.status = 500;
    return { error: error.message };
  }

  set.status = 500;
  return { error: 'Unexpected error' };
}
@@ -1,107 +0,0 @@
import { Elysia, t } from 'elysia';
import { db } from '../db';
import { requireSessionUser } from '../session';
import { enqueueTask } from '../tasks/repository';
import { toHttpError } from './error';

export const filingsRoutes = new Elysia({ prefix: '/filings' })
  .get('/', async ({ request, set, query }) => {
    try {
      await requireSessionUser(request);
      const tickerFilter = query.ticker?.trim().toUpperCase();
      const limit = Number(query.limit ?? 50);
      const safeLimit = Number.isFinite(limit) ? Math.min(Math.max(limit, 1), 200) : 50;

      const rows = tickerFilter
        ? await db`
            SELECT *
            FROM filings
            WHERE ticker = ${tickerFilter}
            ORDER BY filing_date DESC, created_at DESC
            LIMIT ${safeLimit}
          `
        : await db`
            SELECT *
            FROM filings
            ORDER BY filing_date DESC, created_at DESC
            LIMIT ${safeLimit}
          `;

      return { filings: rows };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    query: t.Object({
      ticker: t.Optional(t.String()),
      limit: t.Optional(t.Numeric())
    })
  })
  .get('/:accessionNumber', async ({ request, set, params }) => {
    try {
      await requireSessionUser(request);
      const rows = await db`
        SELECT *
        FROM filings
        WHERE accession_number = ${params.accessionNumber}
        LIMIT 1
      `;

      if (!rows[0]) {
        set.status = 404;
        return { error: 'Filing not found' };
      }

      return { filing: rows[0] };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      accessionNumber: t.String({ minLength: 8 })
    })
  })
  .post('/sync', async ({ request, set, body }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'sync_filings',
        payload: {
          ticker: body.ticker.trim().toUpperCase(),
          limit: body.limit ?? 20
        },
        createdBy: user.id,
        priority: 90
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    body: t.Object({
      ticker: t.String({ minLength: 1, maxLength: 12 }),
      limit: t.Optional(t.Number({ minimum: 1, maximum: 50 }))
    })
  })
  .post('/:accessionNumber/analyze', async ({ request, set, params }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'analyze_filing',
        payload: {
          accessionNumber: params.accessionNumber
        },
        createdBy: user.id,
        priority: 65
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      accessionNumber: t.String({ minLength: 8 })
    })
  });
@@ -1,13 +0,0 @@
import { Elysia } from 'elysia';
import { requireSessionUser } from '../session';
import { toHttpError } from './error';

export const meRoutes = new Elysia({ prefix: '/me' })
  .get('/', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      return { user };
    } catch (error) {
      return toHttpError(set, error);
    }
  });
@@ -1,53 +0,0 @@
import { Elysia, t } from 'elysia';
import { env } from '../config';
import { requireSessionUser } from '../session';
import { enqueueTask } from '../tasks/repository';
import { toHttpError } from './error';

export const openclawRoutes = new Elysia({ prefix: '/ai' })
  .get('/status', async ({ request, set }) => {
    try {
      await requireSessionUser(request);
      return {
        configured: Boolean(env.OPENCLAW_BASE_URL && env.OPENCLAW_API_KEY),
        baseUrl: env.OPENCLAW_BASE_URL ?? null,
        model: env.OPENCLAW_MODEL
      };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/portfolio-insights', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'portfolio_insights',
        payload: { userId: user.id },
        createdBy: user.id,
        priority: 70
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/filing-insights', async ({ request, set, body }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'analyze_filing',
        payload: { accessionNumber: body.accessionNumber },
        createdBy: user.id,
        priority: 65
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    body: t.Object({
      accessionNumber: t.String({ minLength: 8 })
    })
  });
@@ -1,197 +0,0 @@
import { Elysia, t } from 'elysia';
import { db } from '../db';
import { requireSessionUser } from '../session';
import { enqueueTask } from '../tasks/repository';
import { toHttpError } from './error';

export const portfolioRoutes = new Elysia({ prefix: '/portfolio' })
  .get('/holdings', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const holdings = await db`
        SELECT
          id,
          user_id,
          ticker,
          shares,
          avg_cost,
          current_price,
          market_value,
          gain_loss,
          gain_loss_pct,
          last_price_at,
          created_at,
          updated_at
        FROM holdings
        WHERE user_id = ${user.id}
        ORDER BY market_value DESC, ticker ASC
      `;

      return { holdings };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/holdings', async ({ request, set, body }) => {
    try {
      const user = await requireSessionUser(request);
      const ticker = body.ticker.trim().toUpperCase();

      const rows = await db`
        INSERT INTO holdings (
          user_id,
          ticker,
          shares,
          avg_cost,
          current_price
        ) VALUES (
          ${user.id},
          ${ticker},
          ${body.shares},
          ${body.avgCost},
          ${body.currentPrice ?? null}
        )
        ON CONFLICT (user_id, ticker)
        DO UPDATE SET
          shares = EXCLUDED.shares,
          avg_cost = EXCLUDED.avg_cost,
          current_price = COALESCE(EXCLUDED.current_price, holdings.current_price),
          updated_at = NOW()
        RETURNING *
      `;

      return { holding: rows[0] };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    body: t.Object({
      ticker: t.String({ minLength: 1, maxLength: 12 }),
      shares: t.Number({ minimum: 0.0001 }),
      avgCost: t.Number({ minimum: 0.0001 }),
      currentPrice: t.Optional(t.Number({ minimum: 0 }))
    })
  })
  .patch('/holdings/:id', async ({ request, set, params, body }) => {
    try {
      const user = await requireSessionUser(request);
      const rows = await db`
        UPDATE holdings
        SET
          shares = COALESCE(${body.shares ?? null}, shares),
          avg_cost = COALESCE(${body.avgCost ?? null}, avg_cost),
          current_price = COALESCE(${body.currentPrice ?? null}, current_price),
          updated_at = NOW()
        WHERE id = ${params.id}
          AND user_id = ${user.id}
        RETURNING *
      `;

      if (!rows[0]) {
        set.status = 404;
        return { error: 'Holding not found' };
      }

      return { holding: rows[0] };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      id: t.Numeric()
    }),
    body: t.Object({
      shares: t.Optional(t.Number({ minimum: 0.0001 })),
      avgCost: t.Optional(t.Number({ minimum: 0.0001 })),
      currentPrice: t.Optional(t.Number({ minimum: 0 }))
    })
  })
  .delete('/holdings/:id', async ({ request, set, params }) => {
    try {
      const user = await requireSessionUser(request);
      const rows = await db`
        DELETE FROM holdings
        WHERE id = ${params.id}
          AND user_id = ${user.id}
        RETURNING id
      `;

      if (!rows[0]) {
        set.status = 404;
        return { error: 'Holding not found' };
      }

      return { success: true };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      id: t.Numeric()
    })
  })
  .get('/summary', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const rows = await db`
        SELECT
          COUNT(*)::int AS positions,
          COALESCE(SUM(market_value), 0)::numeric AS total_value,
          COALESCE(SUM(gain_loss), 0)::numeric AS total_gain_loss,
          COALESCE(SUM(shares * avg_cost), 0)::numeric AS total_cost_basis,
          COALESCE(AVG(gain_loss_pct), 0)::numeric AS avg_return_pct
        FROM holdings
        WHERE user_id = ${user.id}
      `;

      return { summary: rows[0] };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/refresh-prices', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'refresh_prices',
        payload: { userId: user.id },
        createdBy: user.id,
        priority: 80
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/insights/generate', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await enqueueTask({
        taskType: 'portfolio_insights',
        payload: { userId: user.id },
        createdBy: user.id,
        priority: 70
      });

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .get('/insights/latest', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const rows = await db`
        SELECT id, user_id, provider, model, content, created_at
        FROM portfolio_insights
        WHERE user_id = ${user.id}
        ORDER BY created_at DESC
        LIMIT 1
      `;

      return { insight: rows[0] ?? null };
    } catch (error) {
      return toHttpError(set, error);
    }
  });
@@ -1,40 +0,0 @@
import { Elysia, t } from 'elysia';
import { requireSessionUser } from '../session';
import { getTaskById, listRecentTasks } from '../tasks/repository';
import { toHttpError } from './error';

export const taskRoutes = new Elysia({ prefix: '/tasks' })
  .get('/', async ({ request, set, query }) => {
    try {
      const user = await requireSessionUser(request);
      const limit = Number(query.limit ?? 20);
      const safeLimit = Number.isFinite(limit) ? Math.min(Math.max(limit, 1), 50) : 20;
      const tasks = await listRecentTasks(user.id, safeLimit);
      return { tasks };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    query: t.Object({
      limit: t.Optional(t.Numeric())
    })
  })
  .get('/:taskId', async ({ request, set, params }) => {
    try {
      const user = await requireSessionUser(request);
      const task = await getTaskById(params.taskId, user.id);

      if (!task) {
        set.status = 404;
        return { error: 'Task not found' };
      }

      return { task };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      taskId: t.String()
    })
  });
@@ -1,80 +0,0 @@
import { Elysia, t } from 'elysia';
import { db } from '../db';
import { requireSessionUser } from '../session';
import { toHttpError } from './error';

export const watchlistRoutes = new Elysia({ prefix: '/watchlist' })
  .get('/', async ({ request, set }) => {
    try {
      const user = await requireSessionUser(request);
      const watchlist = await db`
        SELECT id, user_id, ticker, company_name, sector, created_at
        FROM watchlist
        WHERE user_id = ${user.id}
        ORDER BY created_at DESC
      `;

      return { items: watchlist };
    } catch (error) {
      return toHttpError(set, error);
    }
  })
  .post('/', async ({ request, set, body }) => {
    try {
      const user = await requireSessionUser(request);
      const ticker = body.ticker.trim().toUpperCase();

      const rows = await db`
        INSERT INTO watchlist (
          user_id,
          ticker,
          company_name,
          sector
        ) VALUES (
          ${user.id},
          ${ticker},
          ${body.companyName.trim()},
          ${body.sector?.trim() || null}
        )
        ON CONFLICT (user_id, ticker)
        DO UPDATE SET
          company_name = EXCLUDED.company_name,
          sector = EXCLUDED.sector
        RETURNING *
      `;

      return { item: rows[0] };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    body: t.Object({
      ticker: t.String({ minLength: 1, maxLength: 12 }),
      companyName: t.String({ minLength: 1, maxLength: 200 }),
      sector: t.Optional(t.String({ maxLength: 120 }))
    })
  })
  .delete('/:id', async ({ request, set, params }) => {
    try {
      const user = await requireSessionUser(request);
      const rows = await db`
        DELETE FROM watchlist
        WHERE id = ${params.id}
          AND user_id = ${user.id}
        RETURNING id
      `;

      if (!rows[0]) {
        set.status = 404;
        return { error: 'Watchlist item not found' };
      }

      return { success: true };
    } catch (error) {
      return toHttpError(set, error);
    }
  }, {
    params: t.Object({
      id: t.Numeric()
    })
  });
@@ -1,61 +0,0 @@
import { env } from '../config';

type ChatCompletionResponse = {
  choices?: Array<{
    message?: {
      content?: string;
    };
  }>;
};

export class OpenClawService {
  isConfigured() {
    return Boolean(env.OPENCLAW_BASE_URL && env.OPENCLAW_API_KEY);
  }

  async runAnalysis(prompt: string, systemPrompt?: string) {
    if (!this.isConfigured()) {
      return {
        provider: 'local-fallback',
        model: env.OPENCLAW_MODEL,
        text: 'OpenClaw/ZeroClaw is not configured. Set OPENCLAW_BASE_URL and OPENCLAW_API_KEY to enable live AI analysis.'
      };
    }

    const response = await fetch(`${env.OPENCLAW_BASE_URL}/v1/chat/completions`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${env.OPENCLAW_API_KEY}`
      },
      body: JSON.stringify({
        model: env.OPENCLAW_MODEL,
        temperature: 0.2,
        messages: [
          systemPrompt
            ? { role: 'system', content: systemPrompt }
            : null,
          { role: 'user', content: prompt }
        ].filter(Boolean)
      })
    });

    if (!response.ok) {
      const body = await response.text();
      throw new Error(`OpenClaw request failed (${response.status}): ${body.slice(0, 200)}`);
    }

    const payload = await response.json() as ChatCompletionResponse;
    const text = payload.choices?.[0]?.message?.content?.trim();

    if (!text) {
      throw new Error('OpenClaw returned an empty response');
    }

    return {
      provider: 'openclaw',
      model: env.OPENCLAW_MODEL,
      text
    };
  }
}
@@ -1,72 +0,0 @@
import { db } from '../db';

const YAHOO_BASE = 'https://query1.finance.yahoo.com/v8/finance/chart';

export class PriceService {
  async getQuote(ticker: string): Promise<number | null> {
    const normalizedTicker = ticker.trim().toUpperCase();

    try {
      const response = await fetch(`${YAHOO_BASE}/${normalizedTicker}?interval=1d&range=1d`, {
        headers: {
          'User-Agent': 'Mozilla/5.0 (compatible; FiscalClone/2.0)'
        }
      });

      if (!response.ok) {
        return null;
      }

      const payload = await response.json() as {
        chart?: {
          result?: Array<{ meta?: { regularMarketPrice?: number } }>;
        };
      };

      const price = payload.chart?.result?.[0]?.meta?.regularMarketPrice;

      return typeof price === 'number' ? price : null;
    } catch {
      return null;
    }
  }

  async refreshHoldingsPrices(userId?: number) {
    const holdings = userId
      ? await db`SELECT DISTINCT ticker FROM holdings WHERE user_id = ${userId}`
      : await db`SELECT DISTINCT ticker FROM holdings`;

    let updatedCount = 0;

    for (const holding of holdings) {
      const price = await this.getQuote(holding.ticker);

      if (price === null) {
        continue;
      }

      if (userId) {
        await db`
          UPDATE holdings
          SET current_price = ${price}, last_price_at = NOW(), updated_at = NOW()
          WHERE user_id = ${userId} AND ticker = ${holding.ticker}
        `;
      } else {
        await db`
          UPDATE holdings
          SET current_price = ${price}, last_price_at = NOW(), updated_at = NOW()
          WHERE ticker = ${holding.ticker}
        `;
      }

      updatedCount += 1;

      await Bun.sleep(120);
    }

    return {
      updatedCount,
      totalTickers: holdings.length
    };
  }
}
@@ -1,208 +0,0 @@
import { env } from '../config';
import type { FilingMetrics, FilingType } from '../types';

type TickerDirectoryRecord = {
  cik_str: number;
  ticker: string;
  title: string;
};

type RecentFilingsPayload = {
  filings?: {
    recent?: {
      accessionNumber?: string[];
      filingDate?: string[];
      form?: string[];
      primaryDocument?: string[];
    };
  };
  cik?: string;
  name?: string;
};

type CompanyFactsPayload = {
  facts?: {
    'us-gaap'?: Record<string, { units?: Record<string, Array<{ val?: number; end?: string; filed?: string }>> }>;
  };
};

export type SecFiling = {
  ticker: string;
  cik: string;
  companyName: string;
  filingType: FilingType;
  filingDate: string;
  accessionNumber: string;
  filingUrl: string | null;
};

const SUPPORTED_FORMS: FilingType[] = ['10-K', '10-Q', '8-K'];
const TICKER_CACHE_TTL_MS = 1000 * 60 * 60 * 24;
const FACTS_CACHE_TTL_MS = 1000 * 60 * 10;

export class SecService {
  private tickerCache: Map<string, TickerDirectoryRecord> = new Map();
  private tickerCacheLoadedAt = 0;
  private factsCache: Map<string, { loadedAt: number; metrics: FilingMetrics }> = new Map();

  private async fetchJson<T>(url: string): Promise<T> {
    const response = await fetch(url, {
      headers: {
        'User-Agent': env.SEC_USER_AGENT,
        Accept: 'application/json'
      }
    });

    if (!response.ok) {
      throw new Error(`SEC request failed (${response.status}) for ${url}`);
    }

    return await response.json() as T;
  }

  private async ensureTickerCache() {
    const isFresh = Date.now() - this.tickerCacheLoadedAt < TICKER_CACHE_TTL_MS;

    if (isFresh && this.tickerCache.size > 0) {
      return;
    }

    const payload = await this.fetchJson<Record<string, TickerDirectoryRecord>>('https://www.sec.gov/files/company_tickers.json');
    const nextCache = new Map<string, TickerDirectoryRecord>();

    for (const record of Object.values(payload)) {
      nextCache.set(record.ticker.toUpperCase(), record);
    }

    this.tickerCache = nextCache;
    this.tickerCacheLoadedAt = Date.now();
  }

  async resolveTicker(ticker: string) {
    await this.ensureTickerCache();

    const normalizedTicker = ticker.trim().toUpperCase();
    const record = this.tickerCache.get(normalizedTicker);

    if (!record) {
      throw new Error(`Ticker ${normalizedTicker} was not found in SEC directory`);
    }

    return {
      ticker: normalizedTicker,
      cik: String(record.cik_str),
      companyName: record.title
    };
  }

  async fetchRecentFilings(ticker: string, limit = 20): Promise<SecFiling[]> {
    const company = await this.resolveTicker(ticker);
    const cikPadded = company.cik.padStart(10, '0');

    const payload = await this.fetchJson<RecentFilingsPayload>(`https://data.sec.gov/submissions/CIK${cikPadded}.json`);
    const recent = payload.filings?.recent;

    if (!recent) {
      return [];
    }

    const forms = recent.form ?? [];
    const accessionNumbers = recent.accessionNumber ?? [];
    const filingDates = recent.filingDate ?? [];
    const primaryDocuments = recent.primaryDocument ?? [];
    const filings: SecFiling[] = [];

    for (let i = 0; i < forms.length; i += 1) {
      const filingType = forms[i] as FilingType;

      if (!SUPPORTED_FORMS.includes(filingType)) {
        continue;
      }

      const accessionNumber = accessionNumbers[i];

      if (!accessionNumber) {
        continue;
      }

      const compactAccession = accessionNumber.replace(/-/g, '');
      const documentName = primaryDocuments[i];
      const filingUrl = documentName
        ? `https://www.sec.gov/Archives/edgar/data/${Number(company.cik)}/${compactAccession}/${documentName}`
        : null;

      filings.push({
        ticker: company.ticker,
        cik: company.cik,
        companyName: payload.name ?? company.companyName,
        filingType,
        filingDate: filingDates[i] ?? new Date().toISOString().slice(0, 10),
        accessionNumber,
        filingUrl
      });

      if (filings.length >= limit) {
        break;
      }
    }

    return filings;
  }

  private pickLatestFact(payload: CompanyFactsPayload, tag: string): number | null {
    const unitCollections = payload.facts?.['us-gaap']?.[tag]?.units;

    if (!unitCollections) {
      return null;
    }

    const preferredUnits = ['USD', 'USD/shares'];

    for (const unit of preferredUnits) {
      const series = unitCollections[unit];
      if (!series?.length) {
        continue;
      }

      const best = [...series]
        .filter((item) => typeof item.val === 'number')
        .sort((a, b) => {
          const aDate = Date.parse(a.filed ?? a.end ?? '1970-01-01');
          const bDate = Date.parse(b.filed ?? b.end ?? '1970-01-01');
          return bDate - aDate;
        })[0];

      if (best?.val !== undefined) {
        return best.val;
      }
    }

    return null;
  }

  async fetchMetrics(cik: string): Promise<FilingMetrics> {
    const normalized = cik.padStart(10, '0');
    const cached = this.factsCache.get(normalized);

    if (cached && Date.now() - cached.loadedAt < FACTS_CACHE_TTL_MS) {
      return cached.metrics;
    }

    const payload = await this.fetchJson<CompanyFactsPayload>(`https://data.sec.gov/api/xbrl/companyfacts/CIK${normalized}.json`);

    const metrics: FilingMetrics = {
      revenue: this.pickLatestFact(payload, 'Revenues'),
      netIncome: this.pickLatestFact(payload, 'NetIncomeLoss'),
      totalAssets: this.pickLatestFact(payload, 'Assets'),
      cash: this.pickLatestFact(payload, 'CashAndCashEquivalentsAtCarryingValue'),
      debt: this.pickLatestFact(payload, 'LongTermDebt')
    };

    this.factsCache.set(normalized, {
      loadedAt: Date.now(),
      metrics
    });

    return metrics;
  }
}
@@ -1,30 +0,0 @@
import { auth } from './auth';
import type { SessionUser } from './types';

export class UnauthorizedError extends Error {
  constructor(message = 'Authentication required') {
    super(message);
    this.name = 'UnauthorizedError';
  }
}

export async function requireSessionUser(request: Request): Promise<SessionUser> {
  const session = await auth.api.getSession({ headers: request.headers });

  if (!session?.user?.id) {
    throw new UnauthorizedError();
  }

  const userId = Number(session.user.id);

  if (!Number.isFinite(userId)) {
    throw new UnauthorizedError('Invalid session user id');
  }

  return {
    id: userId,
    email: session.user.email,
    name: session.user.name ?? null,
    image: session.user.image ?? null
  };
}
@@ -1,201 +0,0 @@
import { z } from 'zod';
import { db } from '../db';
import { OpenClawService } from '../services/openclaw';
import { PriceService } from '../services/prices';
import { SecService } from '../services/sec';
import type { LongTaskRecord, TaskType } from '../types';

const secService = new SecService();
const priceService = new PriceService();
const openClawService = new OpenClawService();

const syncFilingsPayload = z.object({
  ticker: z.string().min(1),
  limit: z.number().int().positive().max(50).default(20)
});

const refreshPricesPayload = z.object({
  userId: z.number().int().positive().optional()
});

const analyzeFilingPayload = z.object({
  accessionNumber: z.string().min(8)
});

const portfolioInsightsPayload = z.object({
  userId: z.number().int().positive()
});

async function processSyncFilings(task: LongTaskRecord) {
  const { ticker, limit } = syncFilingsPayload.parse(task.payload);
  const filings = await secService.fetchRecentFilings(ticker, limit);
  const metrics = filings.length > 0
    ? await secService.fetchMetrics(filings[0].cik)
    : null;

  let touched = 0;

  for (const filing of filings) {
    await db`
      INSERT INTO filings (
        ticker,
        filing_type,
        filing_date,
        accession_number,
        cik,
        company_name,
        filing_url,
        metrics,
        updated_at
      ) VALUES (
        ${filing.ticker},
        ${filing.filingType},
        ${filing.filingDate},
        ${filing.accessionNumber},
        ${filing.cik},
        ${filing.companyName},
        ${filing.filingUrl},
        ${metrics},
        NOW()
      )
      ON CONFLICT (accession_number)
      DO UPDATE SET
        filing_type = EXCLUDED.filing_type,
        filing_date = EXCLUDED.filing_date,
        filing_url = EXCLUDED.filing_url,
        metrics = COALESCE(EXCLUDED.metrics, filings.metrics),
        updated_at = NOW()
    `;

    touched += 1;
  }

  return {
    ticker: ticker.toUpperCase(),
    filingsFetched: filings.length,
    recordsUpserted: touched,
    metrics
  };
}

async function processRefreshPrices(task: LongTaskRecord) {
  const { userId } = refreshPricesPayload.parse(task.payload);
  const result = await priceService.refreshHoldingsPrices(userId);

  return {
    scope: userId ? `user:${userId}` : 'global',
    ...result
  };
}

async function processAnalyzeFiling(task: LongTaskRecord) {
  const { accessionNumber } = analyzeFilingPayload.parse(task.payload);

  const rows = await db`
    SELECT *
    FROM filings
    WHERE accession_number = ${accessionNumber}
    LIMIT 1
  `;

  const filing = rows[0];

  if (!filing) {
    throw new Error(`Filing ${accessionNumber} was not found`);
  }

  const prompt = [
    'You are a fiscal research assistant focused on regulatory signals.',
    `Analyze this SEC filing from ${filing.company_name} (${filing.ticker}).`,
    `Form: ${filing.filing_type}`,
    `Filed: ${filing.filing_date}`,
    `Metrics JSON: ${JSON.stringify(filing.metrics ?? {})}`,
    'Return concise sections: Thesis, Red Flags, Follow-up Questions, Portfolio Impact.'
  ].join('\n');

  const analysis = await openClawService.runAnalysis(prompt, 'Use concise institutional analyst language.');

  await db`
    UPDATE filings
    SET analysis = ${analysis},
        updated_at = NOW()
    WHERE accession_number = ${accessionNumber}
  `;

  return {
    accessionNumber,
    analysis
  };
}

async function processPortfolioInsights(task: LongTaskRecord) {
  const { userId } = portfolioInsightsPayload.parse(task.payload);

  const holdings = await db`
    SELECT
      ticker,
      shares,
      avg_cost,
      current_price,
      market_value,
      gain_loss,
      gain_loss_pct
    FROM holdings
    WHERE user_id = ${userId}
    ORDER BY market_value DESC
  `;

  const summaryRows = await db`
    SELECT
      COUNT(*)::int AS positions,
      COALESCE(SUM(market_value), 0)::numeric AS total_value,
      COALESCE(SUM(gain_loss), 0)::numeric AS total_gain_loss,
      COALESCE(AVG(gain_loss_pct), 0)::numeric AS avg_return_pct
    FROM holdings
    WHERE user_id = ${userId}
  `;

  const summary = summaryRows[0] ?? {
    positions: 0,
    total_value: 0,
    total_gain_loss: 0,
    avg_return_pct: 0
  };

  const prompt = [
    'Generate portfolio intelligence with actionable recommendations.',
    `Portfolio summary: ${JSON.stringify(summary)}`,
    `Holdings: ${JSON.stringify(holdings)}`,
    'Respond with: 1) Portfolio health score (0-100), 2) top 3 risks, 3) top 3 opportunities, 4) next actions in 7 days.'
  ].join('\n');

  const insight = await openClawService.runAnalysis(prompt, 'Act as a risk-aware buy-side analyst.');

  await db`
    INSERT INTO portfolio_insights (user_id, model, provider, content)
    VALUES (${userId}, ${insight.model}, ${insight.provider}, ${insight.text})
  `;

  return {
    userId,
    summary,
    insight
  };
}

const processors: Record<TaskType, (task: LongTaskRecord) => Promise<Record<string, unknown>>> = {
  sync_filings: processSyncFilings,
  refresh_prices: processRefreshPrices,
  analyze_filing: processAnalyzeFiling,
  portfolio_insights: processPortfolioInsights
};

export async function processTask(task: LongTaskRecord) {
  const processor = processors[task.task_type];

  if (!processor) {
    throw new Error(`No processor registered for task ${task.task_type}`);
  }

  return await processor(task);
}
@@ -1,168 +0,0 @@
import { db } from '../db';
import { env } from '../config';
import type { LongTaskRecord, TaskType } from '../types';

type EnqueueTaskInput = {
  taskType: TaskType;
  payload: Record<string, unknown>;
  createdBy?: number;
  priority?: number;
  scheduledAt?: Date;
  maxAttempts?: number;
};

export async function enqueueTask(input: EnqueueTaskInput) {
  const task = await db<LongTaskRecord[]>`
    INSERT INTO long_tasks (
      task_type,
      status,
      priority,
      payload,
      max_attempts,
      scheduled_at,
      created_by
    ) VALUES (
      ${input.taskType},
      'queued',
      ${input.priority ?? 50},
      ${input.payload},
      ${input.maxAttempts ?? env.TASK_MAX_ATTEMPTS},
      ${input.scheduledAt ?? new Date()},
      ${input.createdBy ?? null}
    )
    RETURNING *
  `;

  return task[0];
}

export async function getTaskById(taskId: string, userId?: number) {
  const rows = userId
    ? await db<LongTaskRecord[]>`
        SELECT *
        FROM long_tasks
        WHERE id = ${taskId}
          AND (created_by IS NULL OR created_by = ${userId})
        LIMIT 1
      `
    : await db<LongTaskRecord[]>`
        SELECT *
        FROM long_tasks
        WHERE id = ${taskId}
        LIMIT 1
      `;

  return rows[0] ?? null;
}

export async function listRecentTasks(userId: number, limit = 20) {
  return await db<LongTaskRecord[]>`
    SELECT *
    FROM long_tasks
    WHERE created_by = ${userId}
    ORDER BY created_at DESC
    LIMIT ${limit}
  `;
}

export async function claimNextTask() {
  const staleSeconds = env.TASK_STALE_SECONDS;

  return await db.begin(async (tx) => {
    await tx`
      UPDATE long_tasks
      SET status = 'queued',
          heartbeat_at = NULL,
          started_at = NULL,
          updated_at = NOW(),
          error = COALESCE(error, 'Task lease expired and was re-queued')
      WHERE status = 'running'
        AND heartbeat_at IS NOT NULL
        AND heartbeat_at < NOW() - (${staleSeconds}::text || ' seconds')::interval
        AND attempts < max_attempts
    `;

    await tx`
      UPDATE long_tasks
      SET status = 'failed',
          finished_at = NOW(),
          updated_at = NOW(),
          error = COALESCE(error, 'Task lease expired and max attempts reached')
      WHERE status = 'running'
        AND heartbeat_at IS NOT NULL
        AND heartbeat_at < NOW() - (${staleSeconds}::text || ' seconds')::interval
        AND attempts >= max_attempts
    `;

    const rows = await tx<LongTaskRecord[]>`
      WITH candidate AS (
        SELECT id
        FROM long_tasks
        WHERE status = 'queued'
          AND scheduled_at <= NOW()
        ORDER BY priority DESC, created_at ASC
        FOR UPDATE SKIP LOCKED
        LIMIT 1
      )
      UPDATE long_tasks t
      SET status = 'running',
          started_at = COALESCE(t.started_at, NOW()),
          heartbeat_at = NOW(),
          attempts = t.attempts + 1,
          updated_at = NOW()
      FROM candidate
      WHERE t.id = candidate.id
      RETURNING t.*
    `;

    return rows[0] ?? null;
  });
}

export async function heartbeatTask(taskId: string) {
  await db`
    UPDATE long_tasks
    SET heartbeat_at = NOW(),
        updated_at = NOW()
    WHERE id = ${taskId}
      AND status = 'running'
  `;
}

export async function completeTask(taskId: string, result: Record<string, unknown>) {
  await db`
    UPDATE long_tasks
    SET status = 'completed',
        result = ${result},
        error = NULL,
        finished_at = NOW(),
        heartbeat_at = NOW(),
        updated_at = NOW()
    WHERE id = ${taskId}
  `;
}

export async function failTask(task: LongTaskRecord, reason: string, retryDelaySeconds = 20) {
  const canRetry = task.attempts < task.max_attempts;

  if (canRetry) {
    await db`
      UPDATE long_tasks
      SET status = 'queued',
          error = ${reason},
          scheduled_at = NOW() + (${retryDelaySeconds}::text || ' seconds')::interval,
          updated_at = NOW()
      WHERE id = ${task.id}
    `;
    return;
  }

  await db`
    UPDATE long_tasks
    SET status = 'failed',
        error = ${reason},
        finished_at = NOW(),
        updated_at = NOW()
    WHERE id = ${task.id}
  `;
}
@@ -1,52 +0,0 @@
import { env } from '../config';
import { claimNextTask, completeTask, failTask, heartbeatTask } from './repository';
import { processTask } from './processors';

let keepRunning = true;

export function stopWorkerLoop() {
  keepRunning = false;
}

function normalizeError(error: unknown) {
  if (error instanceof Error) {
    return `${error.name}: ${error.message}`;
  }

  return String(error);
}

export async function runWorkerLoop() {
  console.log('[worker] started');

  while (keepRunning) {
    const task = await claimNextTask();

    if (!task) {
      await Bun.sleep(700);
      continue;
    }

    console.log(`[worker] claimed task ${task.id} (${task.task_type})`);

    const heartbeatTimer = setInterval(() => {
      void heartbeatTask(task.id).catch((error) => {
        console.error(`[worker] heartbeat failed for ${task.id}`, error);
      });
    }, env.TASK_HEARTBEAT_SECONDS * 1000);

    try {
      const result = await processTask(task);
      await completeTask(task.id, result);
      console.log(`[worker] completed task ${task.id}`);
    } catch (error) {
      const normalized = normalizeError(error);
      console.error(`[worker] failed task ${task.id}`, normalized);
      await failTask(task, normalized);
    } finally {
      clearInterval(heartbeatTimer);
    }
  }

  console.log('[worker] stopping');
}
@@ -1,78 +0,0 @@
export type FilingType = '10-K' | '10-Q' | '8-K';

export type FilingMetrics = {
  revenue: number | null;
  netIncome: number | null;
  totalAssets: number | null;
  cash: number | null;
  debt: number | null;
};

export type FilingRecord = {
  id: number;
  ticker: string;
  filing_type: FilingType;
  filing_date: string;
  accession_number: string;
  cik: string;
  company_name: string;
  filing_url: string | null;
  metrics: FilingMetrics | null;
  analysis: Record<string, unknown> | null;
  created_at: string;
  updated_at: string;
};

export type HoldingRecord = {
  id: number;
  user_id: number;
  ticker: string;
  shares: string;
  avg_cost: string;
  current_price: string | null;
  market_value: string;
  gain_loss: string;
  gain_loss_pct: string;
  last_price_at: string | null;
  created_at: string;
  updated_at: string;
};

export type WatchlistRecord = {
  id: number;
  user_id: number;
  ticker: string;
  company_name: string;
  sector: string | null;
  created_at: string;
};

export type TaskType = 'sync_filings' | 'refresh_prices' | 'analyze_filing' | 'portfolio_insights';

export type TaskStatus = 'queued' | 'running' | 'completed' | 'failed';

export type LongTaskRecord = {
  id: string;
  task_type: TaskType;
  status: TaskStatus;
  priority: number;
  payload: Record<string, unknown>;
  result: Record<string, unknown> | null;
  error: string | null;
  attempts: number;
  max_attempts: number;
  scheduled_at: string;
  started_at: string | null;
  heartbeat_at: string | null;
  finished_at: string | null;
  created_by: number | null;
  created_at: string;
  updated_at: string;
};

export type SessionUser = {
  id: number;
  email: string;
  name: string | null;
  image: string | null;
};
@@ -1,19 +0,0 @@
import { runWorkerLoop, stopWorkerLoop } from './tasks/worker-loop';
import { closeDb } from './db';

const shutdown = async (signal: string) => {
  console.log(`[worker] received ${signal}`);
  stopWorkerLoop();
  await Bun.sleep(250);
  await closeDb();
  process.exit(0);
};

process.on('SIGINT', () => void shutdown('SIGINT'));
process.on('SIGTERM', () => void shutdown('SIGTERM'));

runWorkerLoop().catch(async (error) => {
  console.error('[worker] fatal error', error);
  await closeDb();
  process.exit(1);
});
@@ -1,101 +1,21 @@
services:
  postgres:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: ${POSTGRES_USER:-postgres}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-postgres}
      POSTGRES_DB: ${POSTGRES_DB:-fiscal}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    expose:
      - '5432'
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U ${POSTGRES_USER:-postgres} -d ${POSTGRES_DB:-fiscal}']
      interval: 5s
      timeout: 5s
      retries: 10

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    restart: unless-stopped
    command: ['sh', '-c', 'bun run src/db/migrate.ts && bun run src/index.ts']
    env_file:
      - path: ./.env
        required: false
      - path: ../.env
        required: false
    environment:
      DATABASE_URL: ${DATABASE_URL:-postgres://postgres:postgres@postgres:5432/fiscal}
      PORT: ${PORT:-3001}
      POSTGRES_HOST: postgres
      FRONTEND_URL: ${FRONTEND_URL:-http://localhost:3000}
      BETTER_AUTH_SECRET: ${BETTER_AUTH_SECRET:-local-dev-better-auth-secret-change-me}
      BETTER_AUTH_BASE_URL: ${BETTER_AUTH_BASE_URL:-http://localhost:3001}
      SEC_USER_AGENT: ${SEC_USER_AGENT:-Fiscal Clone <support@example.com>}
      OPENCLAW_BASE_URL: ${OPENCLAW_BASE_URL:-}
      OPENCLAW_API_KEY: ${OPENCLAW_API_KEY:-}
      OPENCLAW_MODEL: ${OPENCLAW_MODEL:-zeroclaw}
      TASK_HEARTBEAT_SECONDS: ${TASK_HEARTBEAT_SECONDS:-15}
      TASK_STALE_SECONDS: ${TASK_STALE_SECONDS:-120}
      TASK_MAX_ATTEMPTS: ${TASK_MAX_ATTEMPTS:-3}
    expose:
      - '3001'
    depends_on:
      postgres:
        condition: service_healthy
    healthcheck:
      test: ['CMD-SHELL', 'wget -q --spider http://localhost:3001/api/health || exit 1']
      interval: 30s
      timeout: 10s
      retries: 3

  worker:
    build:
      context: ./backend
      dockerfile: Dockerfile
    restart: unless-stopped
    command: ['sh', '-c', 'bun run src/db/migrate.ts && bun run src/worker.ts']
    env_file:
      - path: ./.env
        required: false
      - path: ../.env
        required: false
    environment:
      DATABASE_URL: ${DATABASE_URL:-postgres://postgres:postgres@postgres:5432/fiscal}
      PORT: ${PORT:-3001}
      POSTGRES_HOST: postgres
      FRONTEND_URL: ${FRONTEND_URL:-http://localhost:3000}
      BETTER_AUTH_SECRET: ${BETTER_AUTH_SECRET:-local-dev-better-auth-secret-change-me}
      BETTER_AUTH_BASE_URL: ${BETTER_AUTH_BASE_URL:-http://localhost:3001}
      SEC_USER_AGENT: ${SEC_USER_AGENT:-Fiscal Clone <support@example.com>}
      OPENCLAW_BASE_URL: ${OPENCLAW_BASE_URL:-}
      OPENCLAW_API_KEY: ${OPENCLAW_API_KEY:-}
      OPENCLAW_MODEL: ${OPENCLAW_MODEL:-zeroclaw}
      TASK_HEARTBEAT_SECONDS: ${TASK_HEARTBEAT_SECONDS:-15}
      TASK_STALE_SECONDS: ${TASK_STALE_SECONDS:-120}
      TASK_MAX_ATTEMPTS: ${TASK_MAX_ATTEMPTS:-3}
    depends_on:
      postgres:
        condition: service_healthy

  frontend:
  app:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      args:
        NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-http://localhost:3001}
        NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-}
    restart: unless-stopped
    env_file:
      - path: ./.env
        required: false
    environment:
      PORT: 3000
      HOSTNAME: 0.0.0.0
      NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-http://localhost:3001}
      NEXT_PUBLIC_API_URL: ${NEXT_PUBLIC_API_URL:-}
      OPENCLAW_BASE_URL: ${OPENCLAW_BASE_URL:-}
      OPENCLAW_API_KEY: ${OPENCLAW_API_KEY:-}
      OPENCLAW_MODEL: ${OPENCLAW_MODEL:-zeroclaw}
      SEC_USER_AGENT: ${SEC_USER_AGENT:-Fiscal Clone <support@fiscal.local>}
    expose:
      - '3000'
    ports:
      - '3000:3000'
    depends_on:
      - backend

volumes:
  postgres_data:
@@ -1,78 +0,0 @@
# Fiscal Clone Rebuild Decisions

This document records the ground-up design choices for the 2026 rebuild so every major decision is explicit and reviewable.

## 1) Architecture: split frontend and API
- Decision: keep `Next.js` in `frontend/` and a dedicated high-throughput API in `backend/`.
- Why: clean separation for scaling and deployment; web rendering and data ingestion do not contend for resources.
- Tradeoff: more services to run locally.

## 2) Runtime choice: Bun + Elysia for API
- Decision: use the Bun runtime with Elysia for low overhead and fast cold/warm request handling.
- Why: strong performance profile for IO-heavy workloads (quotes, SEC fetch, queue polling).
- Tradeoff: narrower ecosystem compatibility than plain Node in some libraries.

## 3) Auth standard: Better Auth only
- Decision: use Better Auth end-to-end and remove legacy JWT/NextAuth patterns.
- Why: single auth surface across API and Next.js clients, DB-backed sessions, less custom auth code.
- Tradeoff: schema must align closely with Better Auth expectations.

## 4) Persistence: PostgreSQL as source of truth
- Decision: keep Postgres for all domain entities and task durability.
- Why: transactional consistency, mature operational tooling, simple backup/restore.
- Tradeoff: queue throughput is lower than specialized brokers at massive scale.

## 5) Long-running jobs: durable DB queue
- Decision: implement a durable `long_tasks` queue table plus a dedicated worker process.
- Why: supports multi-minute jobs, retries, result persistence, and survives API restarts.
- Tradeoff: custom queue logic is more code than dropping in a broker library.

## 6) Async-first API for heavy workflows
- Decision: filing sync, filing analysis, and portfolio insights are queued and polled via `/api/tasks/:id`.
- Why: avoids request timeouts and keeps the UX responsive.
- Tradeoff: frontend must handle job lifecycle states.
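
The client-side lifecycle this decision imposes can be sketched as a small poller. This is an illustrative sketch, not the repository's frontend code: `fetchTask`, `TaskView`, and the polling defaults are assumptions, though the status values mirror the `TaskStatus` union from the deleted types module.

```typescript
type TaskStatus = 'queued' | 'running' | 'completed' | 'failed';

interface TaskView {
  id: string;
  status: TaskStatus;
  result: Record<string, unknown> | null;
  error: string | null;
}

// A task is done polling once it reaches a terminal state.
function isTerminal(status: TaskStatus): boolean {
  return status === 'completed' || status === 'failed';
}

// Poll until the task settles or the attempt budget runs out.
// `fetchTask` is injected (e.g. a wrapper around GET /api/tasks/:id)
// so the loop is testable without a network.
async function pollTask(
  fetchTask: (id: string) => Promise<TaskView>,
  id: string,
  { intervalMs = 1500, maxPolls = 200 } = {}
): Promise<TaskView> {
  for (let i = 0; i < maxPolls; i++) {
    const task = await fetchTask(id);
    if (isTerminal(task.status)) return task;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Task ${id} did not settle within the polling budget`);
}
```

Injecting the fetcher keeps the lifecycle logic independent of the transport, which is what makes the "frontend must handle job lifecycle states" tradeoff manageable.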

## 7) AI integration contract for OpenClaw/ZeroClaw
- Decision: use an adapter that targets an OpenAI-compatible chat endpoint (`OPENCLAW_BASE_URL`) with model override (`OPENCLAW_MODEL`).
- Why: works with OpenClaw/ZeroClaw deployments while keeping provider lock-in low.
- Tradeoff: advanced provider-specific features are not exposed in v1.
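
A minimal sketch of that adapter contract, assuming the conventional OpenAI-compatible `/v1/chat/completions` path; `buildChatRequest` is a hypothetical helper, not the repository's actual OpenClaw service.

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Assemble the URL and body for an OpenAI-compatible chat completion call.
// Only standard fields (model, messages) are used, which is what keeps the
// contract portable across OpenClaw/ZeroClaw deployments.
function buildChatRequest(
  baseUrl: string,
  model: string,
  prompt: string,
  systemPrompt?: string
): { url: string; body: { model: string; messages: ChatMessage[] } } {
  const messages: ChatMessage[] = [];
  if (systemPrompt) messages.push({ role: 'system', content: systemPrompt });
  messages.push({ role: 'user', content: prompt });

  return {
    // Strip a trailing slash so the joined path is well-formed.
    url: `${baseUrl.replace(/\/$/, '')}/v1/chat/completions`,
    body: { model, messages }
  };
}
```

Sticking to the lowest-common-denominator request shape is the concrete form of the "low lock-in" rationale above.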

## 8) SEC ingestion strategy
- Decision: fetch filings from the SEC submissions API and enrich with company facts metrics.
- Why: stable machine-readable endpoints with less brittle parsing than HTML scraping.
- Tradeoff: facts can lag specific filing publication timing.

## 9) Market pricing strategy
- Decision: use the Yahoo Finance chart endpoint for quote snapshots and periodic refresh.
- Why: good coverage and straightforward integration for portfolio mark-to-market.
- Tradeoff: endpoint reliability/quotas can vary; provider abstraction retained for future switch.
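
The mark-to-market arithmetic implied by this decision and the `HoldingRecord` shape (Postgres numeric columns surfaced as strings) can be sketched as follows; `markToMarket` is an illustrative helper, not the repository's actual pricing service.

```typescript
// Recompute a holding's derived columns from a fresh quote.
// shares/avgCost arrive as strings (numeric columns over the wire) and the
// derived values are returned as fixed-precision strings to match.
function markToMarket(
  shares: string,
  avgCost: string,
  currentPrice: number
): { market_value: string; gain_loss: string; gain_loss_pct: string } {
  const qty = Number(shares);
  const cost = Number(avgCost);
  const marketValue = qty * currentPrice;
  const costBasis = qty * cost;
  const gainLoss = marketValue - costBasis;
  // Guard against a zero cost basis (e.g. gifted shares) to avoid NaN/Infinity.
  const gainLossPct = costBasis === 0 ? 0 : (gainLoss / costBasis) * 100;

  return {
    market_value: marketValue.toFixed(2),
    gain_loss: gainLoss.toFixed(2),
    gain_loss_pct: gainLossPct.toFixed(2)
  };
}
```

For example, 10 shares at an average cost of 100 repriced to 110 yields a market value of 1100.00 and a 10.00% gain.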

## 10) API shape: domain modules + strict schemas
- Decision: organize routes by domain (`portfolio`, `watchlist`, `filings`, `ai`, `tasks`) with Zod-style schema validation via Elysia types.
- Why: predictable contract boundaries and safer payload handling.
- Tradeoff: slight boilerplate cost.

## 11) Security posture
- Decision: all business endpoints require authenticated session resolution through the Better Auth session API.
- Why: prevents cross-user data access and removes implicit trust in client-supplied user IDs.
- Tradeoff: each protected route performs auth/session checks.

## 12) Frontend rendering model
- Decision: use the Next.js App Router with client-heavy dashboards where live polling is required.
- Why: server rendering for the shell + interactive client zones for real-time task/market updates.
- Tradeoff: more client-side state management in dashboard screens.

## 13) Design language: terminal-futurist UI system
- Decision: build a terminal-inspired design with grid scanlines, mono + geometric type pairing, and a neon cyan/green accent palette.
- Why: matches the requested futuristic terminal aesthetic while remaining readable.
- Tradeoff: highly stylized branding may not fit conservative enterprise environments.

## 14) Performance defaults
- Decision: optimize for fewer round trips (batched fetches), async processing, indexed SQL, and paginated list endpoints.
- Why: improves p95 latency under concurrent load.
- Tradeoff: slightly more complex query/service code.

## 15) Operations model
- Decision: run three processes in production: frontend, backend API, backend worker.
- Why: isolates web traffic from heavy background processing and enables independent scaling.
- Tradeoff: additional deployment/health-check wiring.
@@ -2,25 +2,22 @@ FROM node:20-alpine AS base

WORKDIR /app

# Install dependencies
FROM base AS deps
COPY package.json ./
RUN npm install

# Build
FROM base AS builder
ARG NEXT_PUBLIC_API_URL=http://backend:3001
ARG NEXT_PUBLIC_API_URL=
ENV NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN mkdir -p public && npm run build

# Production
FROM base AS runner
WORKDIR /app

ENV NODE_ENV=production
ARG NEXT_PUBLIC_API_URL=http://backend:3001
ARG NEXT_PUBLIC_API_URL=
ENV NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL}

RUN addgroup --system --gid 1001 nodejs
@@ -19,7 +19,7 @@ export default function SignUpPage() {
        )}
      >
        <p className="text-sm text-[color:var(--terminal-muted)]">
          For production deployment you can reintroduce Better Auth, but the rebuilt stack is intentionally self-contained for fast iteration.
          For production deployment you can reintroduce full multi-user authentication, but this rebuild is intentionally self-contained for fast iteration.
        </p>

        <Link href="/" className="mt-6 block">
@@ -126,7 +126,7 @@ function FilingsPageContent() {
      )}
    >
      {liveTask ? (
        <Panel title="Active Task" subtitle={`${liveTask.task_type} is processing in worker.`}>
        <Panel title="Active Task" subtitle={`${liveTask.task_type} is processing in the task engine.`}>
          <div className="flex items-center justify-between gap-3 rounded-lg border border-[color:var(--line-weak)] bg-[color:var(--panel-soft)] px-3 py-2">
            <p className="text-sm text-[color:var(--terminal-bright)]">{liveTask.id}</p>
            <StatusPill status={liveTask.status} />
@@ -3,7 +3,7 @@ import type { Metadata } from 'next';

export const metadata: Metadata = {
  title: 'Fiscal Clone',
  description: 'Futuristic fiscal intelligence terminal powered by Better Auth and durable AI tasks.'
  description: 'Futuristic fiscal intelligence terminal with durable tasks and OpenClaw integration.'
};

export default function RootLayout({ children }: { children: React.ReactNode }) {