Compare commits

..

2 Commits

8e25b54cc2 docs: update CLAUDE.md for v9b release
Updated documentation to reflect v9b (PostgreSQL) release:

**Version Update:**
- v9a → v9b (PostgreSQL Migration complete)
- Tech Stack: SQLite → PostgreSQL 16 (Alpine)
- 60+ protected endpoints (was 44)

**New Features Documented:**
- PostgreSQL migration (auto-migrate from SQLite)
- Export: CSV, JSON, ZIP (with photos)
- Admin: edit prompts, set email/PIN
- All API endpoints aligned (11 fixes)

**Environment Variables:**
- Added DB_* variables (PostgreSQL connection)
- Added ANTHROPIC_API_KEY (alternative to OpenRouter)

**Important Hints:**
- Updated: PostgreSQL migrations instead of SQLite safe_alters
- Added: RealDictCursor usage for dict-like row access
- Added: PostgreSQL boolean syntax (true/false not 1/0)

**New Section: v9b Migration – Lessons Learned**
- Docker build optimization (removed apt-get)
- Empty date string handling
- Boolean field conversion
- API endpoint consistency audit

**Roadmap Adjustment:**
- v9c: Tier System (was in v9b)
- v9d: OAuth2 Connectors (was in v9c)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 21:39:14 +01:00
1db780858b fix: align all API endpoints between frontend and backend
Fixed 11 critical endpoint mismatches found during codebase audit.

**Renamed Endpoints (consistency):**
- /api/ai/analyze/{slug} → /api/insights/run/{slug}
- /api/ai/analyze-pipeline → /api/insights/pipeline
- /api/auth/password-reset-request → /api/auth/forgot-password
- /api/auth/password-reset-confirm → /api/auth/reset-password
- /api/admin/test-email → /api/admin/email/test

**Added Missing Endpoints:**
- POST /api/auth/pin (change PIN/password for current user)
- PUT /api/admin/profiles/{id}/permissions (set permissions)
- PUT /api/admin/profiles/{id}/email (set email)
- PUT /api/admin/profiles/{id}/pin (admin set PIN)
- GET /api/admin/email/status (check SMTP config)
- PUT /api/prompts/{id} (edit prompt templates, admin only)
- GET /api/export/json (export all data as JSON)
- GET /api/export/zip (export data + photos as ZIP)

**Updated:**
- Added imports: json, zipfile, Response
- Fixed admin email test endpoint to accept dict body

All frontend API calls now have matching backend implementations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 17:07:41 +01:00
2 changed files with 304 additions and 27 deletions

View File

@@ -10,7 +10,7 @@
 |-----------|-------------|---------|
 | Frontend  | React 18 + Vite + PWA | Node 20 |
 | Backend   | FastAPI (Python) | Python 3.12 |
-| Database  | SQLite (v9a) → PostgreSQL (planned for v9b) | - |
+| Database  | PostgreSQL 16 (Alpine) | v9b |
 | Container | Docker + Docker Compose | - |
 | Webserver | nginx (reverse proxy) | Alpine |
 | Auth      | Token-based + bcrypt | - |
@@ -53,36 +53,38 @@ mitai-jinkendo/
 └── CLAUDE.md                # This file
 ```
-## Current version: v9a
+## Current version: v9b
 ### What is implemented:
 - ✅ Multi-user with email + password login (bcrypt)
-- ✅ Auth middleware on ALL endpoints (44 endpoints protected)
+- ✅ Auth middleware on ALL endpoints (60+ endpoints protected)
 - ✅ Rate limiting (login: 5/min, reset: 3/min)
 - ✅ CORS configurable via ALLOWED_ORIGINS in .env
 - ✅ Admin/user roles, AI limits, export permissions
 - ✅ Weight, circumferences, caliper (4 formulas), nutrition, activity
 - ✅ FDDB CSV import (nutrition), Apple Health CSV import (activity)
 - ✅ AI analysis: 6 single prompts + 3-stage pipeline (parallel)
-- ✅ Configurable prompts with template variables
+- ✅ Configurable prompts with template variables (admin can edit)
 - ✅ History with 5 tabs + period filter + AI per section
 - ✅ Dashboard with key figures, goal progress, combo chart
 - ✅ Assistant mode (step-by-step measurement)
 - ✅ PWA (iPhone home screen), Jinkendo Ensō logo
 - ✅ Email (SMTP) for password recovery
-- ✅ Admin panel: manage users, AI limits, email test
+- ✅ Admin panel: manage users, AI limits, email test, set PIN/email
 - ✅ Multi-environment: prod (mitai.jinkendo.de) + dev (dev.mitai.jinkendo.de)
 - ✅ Gitea CI/CD with auto-deploy to Raspberry Pi 5
+- ✅ PostgreSQL 16 migration (fully migrated from SQLite)
+- ✅ Export: CSV, JSON, ZIP (with photos)
+- ✅ Automatic SQLite→PostgreSQL migration on container start
-### What comes in v9b:
+### What comes in v9c:
-- 🔲 PostgreSQL migration (currently still SQLite)
 - 🔲 Self-registration with email confirmation
 - 🔲 Freemium tier system (free/basic/premium/selfhosted)
 - 🔲 Automatic 14-day trial
 - 🔲 Invite links for beta users
 - 🔲 Admin can set tiers manually
-### What comes in v9c:
+### What comes in v9d:
 - 🔲 OAuth2 scaffolding for fitness connectors
 - 🔲 Strava connector
 - 🔲 Withings connector (scale)
@@ -116,20 +118,24 @@ docker compose -f docker-compose.dev-env.yml build --no-cache
 docker compose -f docker-compose.dev-env.yml up -d
 ```
-## Database schema (SQLite, v9a)
+## Database schema (PostgreSQL 16, v9b)
 ### Key tables:
-- `profiles` users (role, pin_hash/bcrypt, email, auth_type, ai_enabled)
+- `profiles` users (role, pin_hash/bcrypt, email, auth_type, ai_enabled, tier)
 - `sessions` auth tokens with expiry date
 - `weight_log` weight entries (profile_id, date, weight)
 - `circumference_log` 8 circumference points
 - `caliper_log` skinfold measurement, 4 methods
 - `nutrition_log` calories + macros (from FDDB CSV)
 - `activity_log` training (from Apple Health or manual)
+- `photos` progress photos
 - `ai_insights` AI analyses (scope = prompt slug)
 - `ai_prompts` configurable prompts with templates (11 prompts)
 - `ai_usage` AI calls per day per profile
-## Auth flow (v9a)
+**Schema file:** `backend/schema.sql` (complete PostgreSQL schema)
+**Migration script:** `backend/migrate_to_postgres.py` (SQLite→PostgreSQL, automatic)
+## Auth flow (v9b)
 ```
 Login screen → email + password → token in localStorage
 Token → X-Auth-Token header → backend require_auth()
@@ -146,29 +152,44 @@ SHA256 passwords → automatically migrated to bcrypt on login
 ## Environment variables (.env)
 ```
-OPENROUTER_API_KEY=   # AI calls
+# Database (PostgreSQL)
+DB_HOST=postgres
+DB_PORT=5432
+DB_NAME=mitai_prod
+DB_USER=mitai_prod
+DB_PASSWORD=          # REQUIRED
+# AI
+OPENROUTER_API_KEY=   # AI calls (optional; alternative: ANTHROPIC_API_KEY)
 OPENROUTER_MODEL=anthropic/claude-sonnet-4
-SMTP_HOST=            # email
+ANTHROPIC_API_KEY=    # direct Anthropic API (optional)
+# Email
+SMTP_HOST=            # email (for password recovery)
 SMTP_PORT=587
 SMTP_USER=
 SMTP_PASS=
 SMTP_FROM=
+# App
 APP_URL=https://mitai.jinkendo.de
 ALLOWED_ORIGINS=https://mitai.jinkendo.de
 DATA_DIR=/app/data
 PHOTOS_DIR=/app/photos
+ENVIRONMENT=production
 ```
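The backend reads these variables at startup. A minimal sketch of collecting the `DB_*` settings with the defaults from the sample above (`db_settings` is an illustrative helper, not a function from the codebase):

```python
import os

def db_settings() -> dict:
    """Collect PostgreSQL connection settings from the DB_* environment
    variables. Defaults mirror the .env sample; DB_PASSWORD is required."""
    password = os.getenv("DB_PASSWORD")
    if not password:
        raise RuntimeError("DB_PASSWORD is required (see .env)")
    return {
        "host": os.getenv("DB_HOST", "postgres"),
        "port": int(os.getenv("DB_PORT", "5432")),
        "dbname": os.getenv("DB_NAME", "mitai_prod"),
        "user": os.getenv("DB_USER", "mitai_prod"),
        "password": password,
    }

os.environ["DB_PASSWORD"] = "example-secret"  # demo value only
settings = db_settings()
print(settings["dbname"])
```

The resulting dict matches the keyword arguments `psycopg2.connect()` accepts, which keeps the .env names and the connection code aligned.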
 ## Important notes for Claude Code
 1. **Ports always 3002/8002 (prod) or 3099/8099 (dev)** – never change
 2. **npm install** (not npm ci) – no package-lock.json present
-3. **SQLite safe_alters** – always add new columns via the safe_alters list
+3. **PostgreSQL migrations** – schema changes go in `backend/schema.sql`, then rebuild the container
 4. **Pipeline prompts** have the slug prefix `pipeline_` – never show them as single analyses
 5. **dayjs.week()** needs a plugin – use native JS ISO week calculation instead
 6. **useNavigate()** only in React components, not in helper functions
 7. **Use api.js** for all API calls – injects the token automatically
 8. **Use bcrypt** for all new password operations
 9. **session=Depends(require_auth)** as a separate parameter – never embed it in Header()
+10. **Use RealDictCursor** – `get_cursor(conn)` instead of `conn.cursor()` for dict-like row access
 ## Code-Style
 - React: functional components, hooks
@@ -437,10 +458,47 @@ def endpoint(x_profile_id: Optional[str] = Header(default=None),
 <Bar fill="#1D9E75"/>
 ```
-### Adding new SQLite columns
+### PostgreSQL boolean syntax
 ```python
-# Add to the _safe_alters list (NOT a direct ALTER TABLE):
-_safe_alters = [
-    ("profiles", "neue_spalte TEXT DEFAULT NULL"),
-]
+# ❌ Wrong (SQLite syntax):
+cur.execute("SELECT * FROM ai_prompts WHERE active=1")
+# ✅ Correct (PostgreSQL):
+cur.execute("SELECT * FROM ai_prompts WHERE active=true")
 ```
+### RealDictCursor for dict-like row access
+```python
+# ❌ Wrong:
+cur = conn.cursor()
+cur.execute("SELECT COUNT(*) FROM weight_log")
+count = cur.fetchone()[0]        # tuple index
+# ✅ Correct:
+cur = get_cursor(conn)           # returns RealDictCursor
+cur.execute("SELECT COUNT(*) as count FROM weight_log")
+count = cur.fetchone()['count']  # dict key
+```
+## v9b migration – lessons learned
+### PostgreSQL migration (SQLite → PostgreSQL)
+**Problem:** Docker build hung for 30+ minutes on `apt-get install postgresql-client`
+**Solution:** Removed all apt-get dependencies – pure Python solution with psycopg2-binary
+**Problem:** Empty date strings (`''`) caused PostgreSQL errors
+**Solution:** The migration script converts empty strings to NULL for DATE columns
+**Problem:** Boolean fields (SQLite INTEGER 0/1 vs. PostgreSQL BOOLEAN)
+**Solution:** The migration converts automatically; the backend uses `active=true` instead of `active=1`
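The two data fixes above can be sketched as a single row-conversion step. `convert_row` and the column sets are illustrative; the real logic lives in `backend/migrate_to_postgres.py`, which is not shown here:

```python
def convert_row(row: dict, date_cols: set, bool_cols: set) -> dict:
    """Normalize one SQLite row for PostgreSQL insertion."""
    out = {}
    for col, val in row.items():
        if col in date_cols and val == '':
            out[col] = None        # PostgreSQL rejects '' for DATE columns
        elif col in bool_cols and val in (0, 1):
            out[col] = bool(val)   # SQLite stores booleans as INTEGER 0/1
        else:
            out[col] = val
    return out

converted = convert_row({"date": "", "active": 1, "weight": 80.5},
                        date_cols={"date"}, bool_cols={"active"})
print(converted)  # {'date': None, 'active': True, 'weight': 80.5}
```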
+### API endpoint consistency (March 2026)
+**Problem:** 11 critical endpoint mismatches found between frontend and backend
+**Resolved:**
+- AI endpoints made consistent: `/api/insights/run/{slug}`, `/api/insights/pipeline`
+- Password reset: `/api/auth/forgot-password`, `/api/auth/reset-password`
+- Admin endpoints: `/permissions`, `/email`, `/pin` sub-routes
+- Export: JSON + ZIP endpoints added
+- Prompt editing: PUT endpoint for admins
+**Tool:** a full audit via the Explore agent is recommended for larger changes
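The audit itself boils down to a set comparison between the paths the frontend calls and the routes the backend registers. A minimal sketch (function name and sample paths are illustrative):

```python
def endpoint_mismatches(frontend_calls: set, backend_routes: set) -> dict:
    """Report frontend calls without a backend route, and backend routes
    the frontend never calls."""
    return {
        "missing_in_backend": sorted(frontend_calls - backend_routes),
        "unused_in_frontend": sorted(backend_routes - frontend_calls),
    }

result = endpoint_mismatches(
    {"/api/insights/pipeline", "/api/auth/forgot-password", "/api/export/zip"},
    {"/api/insights/pipeline", "/api/auth/forgot-password"},
)
print(result)  # {'missing_in_backend': ['/api/export/zip'], 'unused_in_frontend': []}
```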

View File

@@ -1,11 +1,11 @@
-import os, csv, io, uuid
+import os, csv, io, uuid, json, zipfile
 from pathlib import Path
 from typing import Optional
 from datetime import datetime
 from fastapi import FastAPI, HTTPException, UploadFile, File, Header, Query, Depends
 from fastapi.middleware.cors import CORSMiddleware
-from fastapi.responses import StreamingResponse, FileResponse
+from fastapi.responses import StreamingResponse, FileResponse, Response
 from pydantic import BaseModel
 import aiofiles
 import bcrypt
@@ -852,7 +852,7 @@ def _prepare_template_vars(data: dict) -> dict:
     return vars

-@app.post("/api/ai/analyze/{slug}")
+@app.post("/api/insights/run/{slug}")
 async def analyze_with_prompt(slug: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Run AI analysis with specified prompt template."""
     pid = get_pid(x_profile_id)
@@ -909,7 +909,7 @@ async def analyze_with_prompt(slug: str, x_profile_id: Optional[str]=Header(defa
     inc_ai_usage(pid)
     return {"scope": slug, "content": content}

-@app.post("/api/ai/analyze-pipeline")
+@app.post("/api/insights/pipeline")
 async def analyze_pipeline(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Run 3-stage pipeline analysis."""
     pid = get_pid(x_profile_id)
@@ -1053,6 +1053,32 @@ def list_prompts(session: dict=Depends(require_auth)):
         cur.execute("SELECT * FROM ai_prompts WHERE active=true AND slug NOT LIKE 'pipeline_%' ORDER BY sort_order")
         return [r2d(r) for r in cur.fetchall()]

+@app.put("/api/prompts/{prompt_id}")
+def update_prompt(prompt_id: str, data: dict, session: dict=Depends(require_admin)):
+    """Update AI prompt template (admin only)."""
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        updates = []
+        values = []
+        if 'name' in data:
+            updates.append('name=%s')
+            values.append(data['name'])
+        if 'description' in data:
+            updates.append('description=%s')
+            values.append(data['description'])
+        if 'template' in data:
+            updates.append('template=%s')
+            values.append(data['template'])
+        if 'active' in data:
+            updates.append('active=%s')
+            values.append(data['active'])
+        if updates:
+            cur.execute(f"UPDATE ai_prompts SET {', '.join(updates)}, updated=CURRENT_TIMESTAMP WHERE id=%s",
+                        values + [prompt_id])
+    return {"ok": True}
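The handler above builds its parameterized `SET` clause from a whitelist of keys. The same pattern, factored into a standalone helper for illustration (`build_update` is hypothetical, not part of the codebase; the caller still appends the row id to the values before executing):

```python
def build_update(table: str, allowed: tuple, data: dict, key: str = "id"):
    """Build a parameterized UPDATE from a whitelist of columns, mirroring
    the pattern in update_prompt. Returns (None, []) if nothing matched."""
    updates, values = [], []
    for col in allowed:
        if col in data:
            updates.append(f"{col}=%s")
            values.append(data[col])
    if not updates:
        return None, []
    # The caller appends the row id: cur.execute(sql, values + [row_id])
    sql = f"UPDATE {table} SET {', '.join(updates)} WHERE {key}=%s"
    return sql, values

sql, vals = build_update("ai_prompts", ("name", "template", "active"),
                         {"name": "Weekly", "active": True})
print(sql)  # UPDATE ai_prompts SET name=%s, active=%s WHERE id=%s
```

Whitelisting the column names keeps the f-string safe: only the values travel as `%s` parameters, never user-controlled identifiers.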
 @app.get("/api/ai/usage")
 def get_ai_usage(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Get AI usage stats for current profile."""
@@ -1145,7 +1171,22 @@ def auth_status():
     """Health check endpoint."""
     return {"status": "ok", "service": "mitai-jinkendo", "version": "v9b"}

-@app.post("/api/auth/password-reset-request")
+@app.post("/api/auth/pin")
+def change_pin(req: dict, session: dict=Depends(require_auth)):
+    """Change PIN/password for current user."""
+    pid = session['profile_id']
+    new_pin = req.get('pin', '')
+    if len(new_pin) < 4:
+        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")
+    new_hash = hash_pin(new_pin)
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))
+    return {"ok": True}
+
+@app.post("/api/auth/forgot-password")
 @limiter.limit("3/minute")
 async def password_reset_request(req: PasswordResetRequest, request: Request):
     """Request password reset email."""
@@ -1203,7 +1244,7 @@ Dein Mitai Jinkendo Team
     return {"ok": True, "message": "Falls die E-Mail existiert, wurde ein Reset-Link gesendet."}

-@app.post("/api/auth/password-reset-confirm")
+@app.post("/api/auth/reset-password")
 def password_reset_confirm(req: PasswordResetConfirm):
     """Confirm password reset with token."""
     with get_db() as conn:
@@ -1264,9 +1305,79 @@ def admin_update_profile(pid: str, data: AdminProfileUpdate, session: dict=Depen
     return {"ok": True}

-@app.post("/api/admin/test-email")
-def admin_test_email(email: str, session: dict=Depends(require_admin)):
+@app.put("/api/admin/profiles/{pid}/permissions")
+def admin_set_permissions(pid: str, data: dict, session: dict=Depends(require_admin)):
+    """Admin: Set profile permissions."""
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        updates = []
+        values = []
+        if 'ai_enabled' in data:
+            updates.append('ai_enabled=%s')
+            values.append(data['ai_enabled'])
+        if 'ai_limit_day' in data:
+            updates.append('ai_limit_day=%s')
+            values.append(data['ai_limit_day'])
+        if 'export_enabled' in data:
+            updates.append('export_enabled=%s')
+            values.append(data['export_enabled'])
+        if 'role' in data:
+            updates.append('role=%s')
+            values.append(data['role'])
+        if updates:
+            cur.execute(f"UPDATE profiles SET {', '.join(updates)} WHERE id=%s", values + [pid])
+    return {"ok": True}
+
+@app.put("/api/admin/profiles/{pid}/email")
+def admin_set_email(pid: str, data: dict, session: dict=Depends(require_admin)):
+    """Admin: Set profile email."""
+    email = data.get('email', '').strip().lower()
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("UPDATE profiles SET email=%s WHERE id=%s", (email if email else None, pid))
+    return {"ok": True}
+
+@app.put("/api/admin/profiles/{pid}/pin")
+def admin_set_pin(pid: str, data: dict, session: dict=Depends(require_admin)):
+    """Admin: Set profile PIN/password."""
+    new_pin = data.get('pin', '')
+    if len(new_pin) < 4:
+        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")
+    new_hash = hash_pin(new_pin)
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))
+    return {"ok": True}
+
+@app.get("/api/admin/email/status")
+def admin_email_status(session: dict=Depends(require_admin)):
+    """Admin: Check email configuration status."""
+    smtp_host = os.getenv("SMTP_HOST")
+    smtp_user = os.getenv("SMTP_USER")
+    smtp_pass = os.getenv("SMTP_PASS")
+    app_url = os.getenv("APP_URL", "http://localhost:3002")
+    configured = bool(smtp_host and smtp_user and smtp_pass)
+    return {
+        "configured": configured,
+        "smtp_host": smtp_host or "",
+        "smtp_user": smtp_user or "",
+        "app_url": app_url
+    }
+
+@app.post("/api/admin/email/test")
+def admin_test_email(data: dict, session: dict=Depends(require_admin)):
     """Admin: Send test email."""
+    email = data.get('to', '')
+    if not email:
+        raise HTTPException(400, "E-Mail-Adresse fehlt")
     try:
         import smtplib
         from email.mime.text import MIMEText
@@ -1348,3 +1459,111 @@ def export_csv(x_profile_id: Optional[str]=Header(default=None), session: dict=D
         media_type="text/csv",
         headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.csv"}
     )
+
+@app.get("/api/export/json")
+def export_json(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
+    """Export all data as JSON."""
+    pid = get_pid(x_profile_id)
+    # Check export permission
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("SELECT export_enabled FROM profiles WHERE id=%s", (pid,))
+        prof = cur.fetchone()
+        if not prof or not prof['export_enabled']:
+            raise HTTPException(403, "Export ist für dieses Profil deaktiviert")
+    # Collect all data
+    data = {}
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
+        data['profile'] = r2d(cur.fetchone())
+        cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
+        data['weight'] = [r2d(r) for r in cur.fetchall()]
+        cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
+        data['circumferences'] = [r2d(r) for r in cur.fetchall()]
+        cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
+        data['caliper'] = [r2d(r) for r in cur.fetchall()]
+        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
+        data['nutrition'] = [r2d(r) for r in cur.fetchall()]
+        cur.execute("SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
+        data['activity'] = [r2d(r) for r in cur.fetchall()]
+        cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
+        data['insights'] = [r2d(r) for r in cur.fetchall()]
+    json_str = json.dumps(data, indent=2, default=str)
+    return Response(
+        content=json_str,
+        media_type="application/json",
+        headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.json"}
+    )
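`json.dumps(..., default=str)` matters here: `RealDictCursor` rows can contain `date` and `Decimal` values that the stdlib JSON encoder rejects, and `default=str` stringifies them. A minimal illustration:

```python
import json
from datetime import date
from decimal import Decimal

# A row shaped like what RealDictCursor returns for weight_log.
row = {"date": date(2026, 3, 18), "weight": Decimal("80.5")}

# Without default=str this raises TypeError; with it, both values
# are passed through str() before encoding.
encoded = json.dumps(row, default=str)
print(encoded)  # {"date": "2026-03-18", "weight": "80.5"}
```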
+@app.get("/api/export/zip")
+def export_zip(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
+    """Export all data as ZIP (JSON + photos)."""
+    pid = get_pid(x_profile_id)
+    # Check export permission
+    with get_db() as conn:
+        cur = get_cursor(conn)
+        cur.execute("SELECT export_enabled FROM profiles WHERE id=%s", (pid,))
+        prof = cur.fetchone()
+        if not prof or not prof['export_enabled']:
+            raise HTTPException(403, "Export ist für dieses Profil deaktiviert")
+    # Create ZIP in memory
+    zip_buffer = io.BytesIO()
+    with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zf:
+        # Add JSON data
+        data = {}
+        with get_db() as conn:
+            cur = get_cursor(conn)
+            cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
+            data['profile'] = r2d(cur.fetchone())
+            cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
+            data['weight'] = [r2d(r) for r in cur.fetchall()]
+            cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
+            data['circumferences'] = [r2d(r) for r in cur.fetchall()]
+            cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
+            data['caliper'] = [r2d(r) for r in cur.fetchall()]
+            cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
+            data['nutrition'] = [r2d(r) for r in cur.fetchall()]
+            cur.execute("SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
+            data['activity'] = [r2d(r) for r in cur.fetchall()]
+            cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
+            data['insights'] = [r2d(r) for r in cur.fetchall()]
+        zf.writestr("data.json", json.dumps(data, indent=2, default=str))
+        # Add photos if they exist
+        with get_db() as conn:
+            cur = get_cursor(conn)
+            cur.execute("SELECT * FROM photos WHERE profile_id=%s ORDER BY date", (pid,))
+            photos = [r2d(r) for r in cur.fetchall()]
+        for i, photo in enumerate(photos):
+            photo_path = Path(PHOTOS_DIR) / photo['path']
+            if photo_path.exists():
+                zf.write(photo_path, f"photos/{photo['date'] or i}_{photo_path.name}")
+    zip_buffer.seek(0)
+    return StreamingResponse(
+        iter([zip_buffer.getvalue()]),
+        media_type="application/zip",
+        headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.zip"}
+    )
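A consumer of this export can read the archive back with the stdlib alone. A small round-trip sketch using a ZIP shaped like the one the endpoint produces (the entry names inside are illustrative sample data):

```python
import io
import json
import zipfile

# Build a tiny in-memory archive shaped like the /api/export/zip payload:
# one data.json plus a photos/ folder.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("data.json", json.dumps({"weight": [{"date": "2026-03-18", "weight": 80.5}]}))
    zf.writestr("photos/2026-03-18_front.jpg", b"\xff\xd8demo")  # fake JPEG bytes

# Read it back the way a consumer of the export would.
buf.seek(0)
with zipfile.ZipFile(buf) as zf:
    data = json.loads(zf.read("data.json"))
    photo_names = [n for n in zf.namelist() if n.startswith("photos/")]
print(data["weight"][0]["weight"], photo_names)
```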