Compare commits
No commits in common: `8e25b54cc21d72f3bc289e4ca168becf739214ef` and `3d58a2db8e26d48d3aef2bc2e150614c73e8bfff` have entirely different histories.
8e25b54cc2 ... 3d58a2db8e

CLAUDE.md (96 changed lines)
@@ -10,7 +10,7 @@
 |-----------|-------------|---------|
 | Frontend | React 18 + Vite + PWA | Node 20 |
 | Backend | FastAPI (Python) | Python 3.12 |
-| Database | PostgreSQL 16 (Alpine) | v9b |
+| Database | SQLite (v9a) → PostgreSQL (planned for v9b) | - |
 | Container | Docker + Docker Compose | - |
 | Webserver | nginx (Reverse Proxy) | Alpine |
 | Auth | Token-based + bcrypt | - |
@@ -53,38 +53,36 @@ mitai-jinkendo/
 └── CLAUDE.md            # This file
 ```
 
-## Current version: v9b
+## Current version: v9a
 
 ### What is implemented:
 - ✅ Multi-user with e-mail + password login (bcrypt)
-- ✅ Auth middleware on ALL endpoints (60+ endpoints protected)
+- ✅ Auth middleware on ALL endpoints (44 endpoints protected)
 - ✅ Rate limiting (login: 5/min, reset: 3/min)
 - ✅ CORS configurable via ALLOWED_ORIGINS in .env
 - ✅ Admin/user roles, AI limits, export permissions
 - ✅ Weight, circumferences, caliper (4 formulas), nutrition, activity
 - ✅ FDDB CSV import (nutrition), Apple Health CSV import (activity)
 - ✅ AI analysis: 6 single prompts + 3-stage pipeline (parallel)
-- ✅ Configurable prompts with template variables (admin can edit)
+- ✅ Configurable prompts with template variables
 - ✅ History with 5 tabs + time-range filter + AI per section
 - ✅ Dashboard with key figures, goal progress, combo chart
 - ✅ Assistant mode (step-by-step measurement)
 - ✅ PWA (iPhone home screen), Jinkendo Ensō logo
 - ✅ E-mail (SMTP) for password recovery
-- ✅ Admin panel: manage users, AI limits, e-mail test, set PIN/e-mail
+- ✅ Admin panel: manage users, AI limits, e-mail test
 - ✅ Multi-environment: prod (mitai.jinkendo.de) + dev (dev.mitai.jinkendo.de)
 - ✅ Gitea CI/CD with auto-deploy to a Raspberry Pi 5
-- ✅ PostgreSQL 16 migration (fully migrated from SQLite)
-- ✅ Export: CSV, JSON, ZIP (with photos)
-- ✅ Automatic SQLite→PostgreSQL migration on container start
 
-### Coming in v9c:
+### Coming in v9b:
+- 🔲 PostgreSQL migration (currently still SQLite)
 - 🔲 Self-registration with e-mail confirmation
 - 🔲 Freemium tier system (free/basic/premium/selfhosted)
 - 🔲 Automatic 14-day trial
 - 🔲 Invitation links for beta users
 - 🔲 Admin can set tiers manually
 
-### Coming in v9d:
+### Coming in v9c:
 - 🔲 OAuth2 scaffolding for fitness connectors
 - 🔲 Strava connector
 - 🔲 Withings connector (scale)
@@ -118,24 +116,20 @@ docker compose -f docker-compose.dev-env.yml build --no-cache
 docker compose -f docker-compose.dev-env.yml up -d
 ```
 
-## Database schema (PostgreSQL 16, v9b)
+## Database schema (SQLite, v9a)
 ### Important tables:
-- `profiles` – users (role, pin_hash/bcrypt, email, auth_type, ai_enabled, tier)
+- `profiles` – users (role, pin_hash/bcrypt, email, auth_type, ai_enabled)
 - `sessions` – auth tokens with expiry date
 - `weight_log` – weight entries (profile_id, date, weight)
 - `circumference_log` – 8 circumference points
 - `caliper_log` – skinfold measurement, 4 methods
 - `nutrition_log` – calories + macros (from FDDB CSV)
 - `activity_log` – training (from Apple Health or manual)
-- `photos` – progress photos
 - `ai_insights` – AI evaluations (scope = prompt slug)
 - `ai_prompts` – configurable prompts with templates (11 prompts)
 - `ai_usage` – AI calls per day per profile
 
-**Schema file:** `backend/schema.sql` (complete PostgreSQL schema)
-**Migration script:** `backend/migrate_to_postgres.py` (SQLite→PostgreSQL, automatic)
-
-## Auth flow (v9b)
+## Auth flow (v9a)
 ```
 Login screen → e-mail + password → token in localStorage
 Token → X-Auth-Token header → backend require_auth()
@@ -152,44 +146,29 @@ SHA256 passwords → automatically migrated to bcrypt on login
 
 ## Environment variables (.env)
 ```
-# Database (PostgreSQL)
-DB_HOST=postgres
-DB_PORT=5432
-DB_NAME=mitai_prod
-DB_USER=mitai_prod
-DB_PASSWORD=          # REQUIRED
-
-# AI
-OPENROUTER_API_KEY=   # AI calls (optional, alternatively ANTHROPIC_API_KEY)
+OPENROUTER_API_KEY=   # AI calls
 OPENROUTER_MODEL=anthropic/claude-sonnet-4
-ANTHROPIC_API_KEY=    # direct Anthropic API (optional)
-
-# Email
-SMTP_HOST=            # e-mail (for recovery)
+SMTP_HOST=            # e-mail
 SMTP_PORT=587
 SMTP_USER=
 SMTP_PASS=
 SMTP_FROM=
 
-# App
 APP_URL=https://mitai.jinkendo.de
 ALLOWED_ORIGINS=https://mitai.jinkendo.de
 DATA_DIR=/app/data
 PHOTOS_DIR=/app/photos
-ENVIRONMENT=production
 ```
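The v9b side of this hunk introduces the `DB_*` variables; reading them amounts to collecting connection settings with the defaults shown and failing fast on the one value marked REQUIRED. A minimal stdlib sketch, assuming the defaults from the block above; the helper name `db_config` is illustrative, not taken from the codebase.

```python
import os

def db_config() -> dict:
    """Collect PostgreSQL connection settings from the environment (v9b layout)."""
    cfg = {
        "host": os.getenv("DB_HOST", "postgres"),
        "port": int(os.getenv("DB_PORT", "5432")),
        "dbname": os.getenv("DB_NAME", "mitai_prod"),
        "user": os.getenv("DB_USER", "mitai_prod"),
        "password": os.getenv("DB_PASSWORD", ""),
    }
    if not cfg["password"]:
        # DB_PASSWORD is marked REQUIRED in .env — refuse to start without it.
        raise RuntimeError("DB_PASSWORD is required")
    return cfg
```

The resulting dict matches the keyword arguments `psycopg2.connect()` accepts, so it can be splatted straight into the connection call.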
 ## Important notes for Claude Code
 1. **Ports are always 3002/8002 (prod) or 3099/8099 (dev)** – never change them
 2. **npm install** (not npm ci) – there is no package-lock.json
-3. **PostgreSQL migrations** – schema changes go into `backend/schema.sql`, then rebuild the container
+3. **SQLite safe_alters** – always add new columns via the safe_alters list
 4. **Pipeline prompts** have the slug prefix `pipeline_` – never show them as single analyses
 5. **dayjs.week()** needs a plugin – use a native JS ISO week calculation instead
 6. **useNavigate()** only inside React components, not in helper functions
 7. **Use api.js** for all API calls – it injects the token automatically
 8. **Use bcrypt** for all new password operations
 9. **session=Depends(require_auth)** as a separate parameter – never embed it in Header()
-10. **Use RealDictCursor** – `get_cursor(conn)` instead of `conn.cursor()` for dict-like row access
 
 ## Code style
 - React: functional components, hooks
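Note 8 (bcrypt for all new password operations) pairs with the transparent upgrade mentioned in the hunk above: legacy SHA-256 hashes are re-hashed with bcrypt on the next successful login. A stdlib-only sketch of the detection half of that logic; the actual re-hash uses the `bcrypt` package (`bcrypt.hashpw(password.encode(), bcrypt.gensalt())`), which is only referenced here, and the function names are illustrative.

```python
import hashlib

def is_legacy_sha256(stored: str) -> bool:
    """Legacy hashes are bare 64-char hex SHA-256 digests; bcrypt hashes start with $2."""
    return len(stored) == 64 and all(c in "0123456789abcdef" for c in stored)

def verify_legacy(password: str, stored: str) -> bool:
    """Check a password against a legacy SHA-256 hash. On success the caller
    re-hashes with bcrypt and overwrites pin_hash, completing the migration."""
    return hashlib.sha256(password.encode()).hexdigest() == stored
```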
@@ -458,47 +437,10 @@ def endpoint(x_profile_id: Optional[str] = Header(default=None),
 <Bar fill="#1D9E75"/>
 ```
 
-### PostgreSQL boolean syntax
+### Adding new SQLite columns
 ```python
-# ❌ Wrong (SQLite syntax):
-cur.execute("SELECT * FROM ai_prompts WHERE active=1")
-
-# ✅ Right (PostgreSQL):
-cur.execute("SELECT * FROM ai_prompts WHERE active=true")
+# Add to the _safe_alters list (NOT a direct ALTER TABLE):
+_safe_alters = [
+    ("profiles", "neue_spalte TEXT DEFAULT NULL"),
+]
 ```
 
-### RealDictCursor for dict-like row access
-```python
-# ❌ Wrong:
-cur = conn.cursor()
-cur.execute("SELECT COUNT(*) FROM weight_log")
-count = cur.fetchone()[0]  # tuple index
-
-# ✅ Right:
-cur = get_cursor(conn)  # returns RealDictCursor
-cur.execute("SELECT COUNT(*) as count FROM weight_log")
-count = cur.fetchone()['count']  # dict key
-```
-
-## v9b migration – lessons learned
-
-### PostgreSQL migration (SQLite → PostgreSQL)
-**Problem:** the Docker build hung for 30+ minutes on `apt-get install postgresql-client`
-**Solution:** removed all apt-get dependencies; pure-Python solution with psycopg2-binary
-
-**Problem:** empty date strings (`''`) caused PostgreSQL errors
-**Solution:** the migration script converts empty strings to NULL for DATE columns
-
-**Problem:** boolean fields (SQLite INTEGER 0/1 vs PostgreSQL BOOLEAN)
-**Solution:** the migration converts them automatically; the backend uses `active=true` instead of `active=1`
-
-### API endpoint consistency (March 2026)
-**Problem:** found 11 critical endpoint mismatches between frontend and backend
-**Resolved:**
-- AI endpoints made consistent: `/api/insights/run/{slug}`, `/api/insights/pipeline`
-- Password reset: `/api/auth/forgot-password`, `/api/auth/reset-password`
-- Admin endpoints: `/permissions`, `/email`, `/pin` sub-routes
-- Export: JSON + ZIP endpoints added
-- Prompt editing: PUT endpoint for admins
-
-**Tool:** a full audit via an Explore agent is recommended for larger changes
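The two data fixes from the lessons learned (empty date strings become NULL, SQLite `0`/`1` becomes a real boolean) boil down to a per-column value normalizer applied while copying rows. A sketch under the assumption that the target column type is known from the PostgreSQL schema; the function name and type labels are illustrative, not from `migrate_to_postgres.py` itself.

```python
def normalize_value(value, pg_type: str):
    """Convert a raw SQLite value so it can be inserted into PostgreSQL."""
    if pg_type == "date" and value == "":
        return None            # empty date strings break PostgreSQL DATE columns
    if pg_type == "boolean" and value in (0, 1):
        return bool(value)     # SQLite stores booleans as INTEGER 0/1
    return value
```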

backend/main.py (235 changed lines)

@@ -1,11 +1,11 @@
-import os, csv, io, uuid, json, zipfile
+import os, csv, io, uuid
 from pathlib import Path
 from typing import Optional
 from datetime import datetime
 
 from fastapi import FastAPI, HTTPException, UploadFile, File, Header, Query, Depends
 from fastapi.middleware.cors import CORSMiddleware
-from fastapi.responses import StreamingResponse, FileResponse, Response
+from fastapi.responses import StreamingResponse, FileResponse
 from pydantic import BaseModel
 import aiofiles
 import bcrypt
@@ -852,7 +852,7 @@ def _prepare_template_vars(data: dict) -> dict:
 
     return vars
 
-@app.post("/api/insights/run/{slug}")
+@app.post("/api/ai/analyze/{slug}")
 async def analyze_with_prompt(slug: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Run AI analysis with specified prompt template."""
     pid = get_pid(x_profile_id)
@@ -909,7 +909,7 @@ async def analyze_with_prompt(slug: str, x_profile_id: Optional[str]=Header(defa
     inc_ai_usage(pid)
     return {"scope": slug, "content": content}
 
-@app.post("/api/insights/pipeline")
+@app.post("/api/ai/analyze-pipeline")
 async def analyze_pipeline(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Run 3-stage pipeline analysis."""
     pid = get_pid(x_profile_id)
@@ -1053,32 +1053,6 @@ def list_prompts(session: dict=Depends(require_auth)):
         cur.execute("SELECT * FROM ai_prompts WHERE active=true AND slug NOT LIKE 'pipeline_%' ORDER BY sort_order")
         return [r2d(r) for r in cur.fetchall()]
 
-@app.put("/api/prompts/{prompt_id}")
-def update_prompt(prompt_id: str, data: dict, session: dict=Depends(require_admin)):
-    """Update AI prompt template (admin only)."""
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        updates = []
-        values = []
-        if 'name' in data:
-            updates.append('name=%s')
-            values.append(data['name'])
-        if 'description' in data:
-            updates.append('description=%s')
-            values.append(data['description'])
-        if 'template' in data:
-            updates.append('template=%s')
-            values.append(data['template'])
-        if 'active' in data:
-            updates.append('active=%s')
-            values.append(data['active'])
-
-        if updates:
-            cur.execute(f"UPDATE ai_prompts SET {', '.join(updates)}, updated=CURRENT_TIMESTAMP WHERE id=%s",
-                        values + [prompt_id])
-
-    return {"ok": True}
-
 @app.get("/api/ai/usage")
 def get_ai_usage(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
     """Get AI usage stats for current profile."""
@@ -1171,22 +1145,7 @@ def auth_status():
     """Health check endpoint."""
     return {"status": "ok", "service": "mitai-jinkendo", "version": "v9b"}
 
-@app.post("/api/auth/pin")
-def change_pin(req: dict, session: dict=Depends(require_auth)):
-    """Change PIN/password for current user."""
-    pid = session['profile_id']
-    new_pin = req.get('pin', '')
-    if len(new_pin) < 4:
-        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")
-
-    new_hash = hash_pin(new_pin)
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))
-
-    return {"ok": True}
-
-@app.post("/api/auth/forgot-password")
+@app.post("/api/auth/password-reset-request")
 @limiter.limit("3/minute")
 async def password_reset_request(req: PasswordResetRequest, request: Request):
     """Request password reset email."""
@@ -1244,7 +1203,7 @@ Dein Mitai Jinkendo Team
 
     return {"ok": True, "message": "Falls die E-Mail existiert, wurde ein Reset-Link gesendet."}
 
-@app.post("/api/auth/reset-password")
+@app.post("/api/auth/password-reset-confirm")
 def password_reset_confirm(req: PasswordResetConfirm):
     """Confirm password reset with token."""
     with get_db() as conn:
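The request/confirm pair above implies a short-lived, single-use reset token between the two endpoints. A stdlib sketch of that token lifecycle; the in-memory store and the 1-hour TTL are assumptions for illustration, not the backend's actual storage.

```python
import secrets
from datetime import datetime, timedelta, timezone

# Hypothetical stand-in for persisted reset tokens: token -> (profile_id, expires)
RESET_TOKENS: dict[str, tuple] = {}

def issue_reset_token(profile_id: str, ttl_hours: int = 1) -> str:
    """Create the token that the reset e-mail links to (TTL is assumed)."""
    token = secrets.token_urlsafe(32)
    RESET_TOKENS[token] = (profile_id, datetime.now(timezone.utc) + timedelta(hours=ttl_hours))
    return token

def consume_reset_token(token: str):
    """Return the profile_id once, or None if unknown/expired; tokens are single-use."""
    entry = RESET_TOKENS.pop(token, None)
    if entry is None or entry[1] < datetime.now(timezone.utc):
        return None
    return entry[0]
```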
@@ -1305,79 +1264,9 @@ def admin_update_profile(pid: str, data: AdminProfileUpdate, session: dict=Depen
 
     return {"ok": True}
 
-@app.put("/api/admin/profiles/{pid}/permissions")
-def admin_set_permissions(pid: str, data: dict, session: dict=Depends(require_admin)):
-    """Admin: Set profile permissions."""
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        updates = []
-        values = []
-        if 'ai_enabled' in data:
-            updates.append('ai_enabled=%s')
-            values.append(data['ai_enabled'])
-        if 'ai_limit_day' in data:
-            updates.append('ai_limit_day=%s')
-            values.append(data['ai_limit_day'])
-        if 'export_enabled' in data:
-            updates.append('export_enabled=%s')
-            values.append(data['export_enabled'])
-        if 'role' in data:
-            updates.append('role=%s')
-            values.append(data['role'])
-
-        if updates:
-            cur.execute(f"UPDATE profiles SET {', '.join(updates)} WHERE id=%s", values + [pid])
-
-    return {"ok": True}
-
-@app.put("/api/admin/profiles/{pid}/email")
-def admin_set_email(pid: str, data: dict, session: dict=Depends(require_admin)):
-    """Admin: Set profile email."""
-    email = data.get('email', '').strip().lower()
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        cur.execute("UPDATE profiles SET email=%s WHERE id=%s", (email if email else None, pid))
-
-    return {"ok": True}
-
-@app.put("/api/admin/profiles/{pid}/pin")
-def admin_set_pin(pid: str, data: dict, session: dict=Depends(require_admin)):
-    """Admin: Set profile PIN/password."""
-    new_pin = data.get('pin', '')
-    if len(new_pin) < 4:
-        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")
-
-    new_hash = hash_pin(new_pin)
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))
-
-    return {"ok": True}
-
-@app.get("/api/admin/email/status")
-def admin_email_status(session: dict=Depends(require_admin)):
-    """Admin: Check email configuration status."""
-    smtp_host = os.getenv("SMTP_HOST")
-    smtp_user = os.getenv("SMTP_USER")
-    smtp_pass = os.getenv("SMTP_PASS")
-    app_url = os.getenv("APP_URL", "http://localhost:3002")
-
-    configured = bool(smtp_host and smtp_user and smtp_pass)
-
-    return {
-        "configured": configured,
-        "smtp_host": smtp_host or "",
-        "smtp_user": smtp_user or "",
-        "app_url": app_url
-    }
-
-@app.post("/api/admin/email/test")
-def admin_test_email(data: dict, session: dict=Depends(require_admin)):
+@app.post("/api/admin/test-email")
+def admin_test_email(email: str, session: dict=Depends(require_admin)):
     """Admin: Send test email."""
-    email = data.get('to', '')
-    if not email:
-        raise HTTPException(400, "E-Mail-Adresse fehlt")
-
     try:
         import smtplib
         from email.mime.text import MIMEText
@@ -1459,111 +1348,3 @@ def export_csv(x_profile_id: Optional[str]=Header(default=None), session: dict=D
         media_type="text/csv",
         headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.csv"}
     )
 
-@app.get("/api/export/json")
-def export_json(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
-    """Export all data as JSON."""
-    pid = get_pid(x_profile_id)
-
-    # Check export permission
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        cur.execute("SELECT export_enabled FROM profiles WHERE id=%s", (pid,))
-        prof = cur.fetchone()
-        if not prof or not prof['export_enabled']:
-            raise HTTPException(403, "Export ist für dieses Profil deaktiviert")
-
-    # Collect all data
-    data = {}
-    with get_db() as conn:
-        cur = get_cursor(conn)
-
-        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
-        data['profile'] = r2d(cur.fetchone())
-
-        cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
-        data['weight'] = [r2d(r) for r in cur.fetchall()]
-
-        cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
-        data['circumferences'] = [r2d(r) for r in cur.fetchall()]
-
-        cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
-        data['caliper'] = [r2d(r) for r in cur.fetchall()]
-
-        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
-        data['nutrition'] = [r2d(r) for r in cur.fetchall()]
-
-        cur.execute("SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
-        data['activity'] = [r2d(r) for r in cur.fetchall()]
-
-        cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
-        data['insights'] = [r2d(r) for r in cur.fetchall()]
-
-    json_str = json.dumps(data, indent=2, default=str)
-    return Response(
-        content=json_str,
-        media_type="application/json",
-        headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.json"}
-    )
-
-@app.get("/api/export/zip")
-def export_zip(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
-    """Export all data as ZIP (JSON + photos)."""
-    pid = get_pid(x_profile_id)
-
-    # Check export permission
-    with get_db() as conn:
-        cur = get_cursor(conn)
-        cur.execute("SELECT export_enabled FROM profiles WHERE id=%s", (pid,))
-        prof = cur.fetchone()
-        if not prof or not prof['export_enabled']:
-            raise HTTPException(403, "Export ist für dieses Profil deaktiviert")
-
-    # Create ZIP in memory
-    zip_buffer = io.BytesIO()
-    with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zf:
-        # Add JSON data
-        data = {}
-        with get_db() as conn:
-            cur = get_cursor(conn)
-
-            cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
-            data['profile'] = r2d(cur.fetchone())
-
-            cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
-            data['weight'] = [r2d(r) for r in cur.fetchall()]
-
-            cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
-            data['circumferences'] = [r2d(r) for r in cur.fetchall()]
-
-            cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
-            data['caliper'] = [r2d(r) for r in cur.fetchall()]
-
-            cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
-            data['nutrition'] = [r2d(r) for r in cur.fetchall()]
-
-            cur.execute("SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
-            data['activity'] = [r2d(r) for r in cur.fetchall()]
-
-            cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
-            data['insights'] = [r2d(r) for r in cur.fetchall()]
-
-        zf.writestr("data.json", json.dumps(data, indent=2, default=str))
-
-        # Add photos if they exist
-        with get_db() as conn:
-            cur = get_cursor(conn)
-            cur.execute("SELECT * FROM photos WHERE profile_id=%s ORDER BY date", (pid,))
-            photos = [r2d(r) for r in cur.fetchall()]
-
-        for i, photo in enumerate(photos):
-            photo_path = Path(PHOTOS_DIR) / photo['path']
-            if photo_path.exists():
-                zf.write(photo_path, f"photos/{photo['date'] or i}_{photo_path.name}")
-
-    zip_buffer.seek(0)
-    return StreamingResponse(
-        iter([zip_buffer.getvalue()]),
-        media_type="application/zip",
-        headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.zip"}
-    )