feat: add ZIP import functionality
- Backend: POST /api/import/zip endpoint with validation and rollback
- CSV import with ON CONFLICT DO NOTHING for duplicate detection
- Photo import with existence check
- AI insights import
- Frontend: file upload UI in SettingsPage
- Import summary showing count per category
- Full transaction rollback on error

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
parent e10e9d7eb9
commit 115d975335

CLAUDE.md | 145

@@ -502,3 +502,148 @@ count = cur.fetchone()['count']  # Dict key
- Prompt editing: PUT endpoint for admins

**Tool:** a full audit via the Explore agent is recommended for larger changes

## Export/Import Specification (v9c)

### ZIP Export Structure

```
mitai-export-{name}-{YYYY-MM-DD}.zip
├── README.txt               ← explains the format + version number
├── profile.json             ← profile data (without the password hash)
├── data/
│   ├── weight.csv           ← weight history
│   ├── circumferences.csv   ← circumferences (8 measurement points)
│   ├── caliper.csv          ← caliper measurements
│   ├── nutrition.csv        ← nutrition data
│   └── activity.csv         ← activities
├── insights/
│   └── ai_insights.json     ← AI analyses (all saved ones)
└── photos/
    ├── {date}_{id}.jpg      ← progress photos
    └── ...
```

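The layout above can be assembled entirely in memory with the standard `zipfile` module. This is a minimal illustrative sketch with placeholder file contents, not the app's actual export code:

```python
import io
import json
import zipfile

# Placeholder profile payload; real exports carry the full profile and stats.
profile = {"export_version": "2", "app": "Mitai Jinkendo", "profile": {"name": "Lars"}}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    # Entry names define the directory structure inside the archive
    zf.writestr("README.txt", "Mitai Jinkendo – data export\nVersion: 2\n")
    zf.writestr("profile.json", json.dumps(profile, indent=2))
    zf.writestr("data/weight.csv", "id;date;weight;note;source;created\n")
    zf.writestr("insights/ai_insights.json", "[]")
    zf.writestr("photos/2026-03-18_1.jpg", b"")  # placeholder photo bytes

# Reading it back, as an importer would
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    names = zf.namelist()
    version = json.loads(zf.read("profile.json"))["export_version"]
```

The importer below relies on exactly these entry names (`profile.json`, `data/*.csv`, `insights/ai_insights.json`, `photos/`).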
### CSV Format (all files)

```
- Separator: semicolon (;) – Excel/LibreOffice compatible
- Encoding: UTF-8 with BOM (for Windows Excel)
- Date format: YYYY-MM-DD
- Decimal separator: period (.)
- First line: header
- Null values: empty (not "null" or "NULL")
```

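Python's `csv` module produces this format directly; the `utf-8-sig` codec adds the BOM on encode and strips it on decode. A minimal round-trip sketch (sample rows are invented for illustration):

```python
import csv
import io

# Sample rows in the export convention: strings throughout, empty string for null
rows = [
    {"id": "1", "date": "2026-03-18", "weight": "84.2", "note": "", "source": "manual", "created": "2026-03-18"},
    {"id": "2", "date": "2026-03-19", "weight": "", "note": "scale broken", "source": "manual", "created": "2026-03-19"},
]
fieldnames = ["id", "date", "weight", "note", "source", "created"]

# Write: semicolon-separated, header first
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames, delimiter=";")
writer.writeheader()
writer.writerows(rows)

# 'utf-8-sig' prepends the BOM that Windows Excel expects
data = buf.getvalue().encode("utf-8-sig")

# Read back: 'utf-8-sig' strips the BOM again
reader = csv.DictReader(io.StringIO(data.decode("utf-8-sig")), delimiter=";")
parsed = list(reader)
```

Empty cells come back as empty strings, so importers must map `""` to `NULL` explicitly (as the backend code below does with `float(x) if x else None`).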
### weight.csv Columns

```
id;date;weight;note;source;created
```

### circumferences.csv Columns

```
id;date;waist;hip;chest;neck;upper_arm;thigh;calf;forearm;note;created
```

### caliper.csv Columns

```
id;date;chest;abdomen;thigh;tricep;subscapular;suprailiac;midaxillary;method;bf_percent;note;created
```

### nutrition.csv Columns

```
id;date;meal_name;kcal;protein;fat;carbs;fiber;note;source;created
```

### activity.csv Columns

```
id;date;name;type;duration_min;kcal;heart_rate_avg;heart_rate_max;distance_km;note;source;created
```

### profile.json Structure

```json
{
  "export_version": "2",
  "export_date": "2026-03-18",
  "app": "Mitai Jinkendo",
  "profile": {
    "name": "Lars",
    "email": "lars@stommer.com",
    "sex": "m",
    "height": 178,
    "birth_year": 1980,
    "goal_weight": 82,
    "goal_bf_pct": 14,
    "avatar_color": "#1D9E75",
    "auth_type": "password",
    "session_days": 30,
    "ai_enabled": true,
    "tier": "selfhosted"
  },
  "stats": {
    "weight_entries": 150,
    "nutrition_entries": 300,
    "activity_entries": 45,
    "photos": 12
  }
}
```

### ai_insights.json Structure

```json
[
  {
    "id": "uuid",
    "scope": "gesamt",
    "created": "2026-03-18T10:00:00",
    "result": "AI analysis text..."
  }
]
```

### README.txt Content

```
Mitai Jinkendo – data export
Version: 2
Exported on: YYYY-MM-DD
Profile: {name}

Contents:
- profile.json: profile data and settings
- data/*.csv: measurement data (semicolon-separated, UTF-8)
- insights/: AI analyses (JSON)
- photos/: progress photos (JPEG)

Import:
This export can be re-imported into Mitai Jinkendo via
Einstellungen → Import → "Mitai Backup importieren".

Format version 2 (since v9b):
All CSV files are encoded as UTF-8 with BOM.
Separator: semicolon (;)
Date format: YYYY-MM-DD
```

### Import Function (to be implemented)

**Endpoint:** `POST /api/import/zip`

**Behavior:**

- Accepts a ZIP file (multipart/form-data)
- Detects the export_version from profile.json
- Imports missing entries only (no duplicates)
- Photos are not overwritten if they already exist
- Returns a summary: how many entries were imported per category
- On error: full rollback (all or nothing)

**Duplicate detection:**

```python
# INSERT ... ON CONFLICT (profile_id, date) DO NOTHING
# weight: UNIQUE (profile_id, date)
# nutrition: UNIQUE (profile_id, date, meal_name)
# activity: UNIQUE (profile_id, date, name)
# caliper: UNIQUE (profile_id, date)
# circumferences: UNIQUE (profile_id, date)
```

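The skip-and-count pattern can be demonstrated end to end with SQLite, which also supports `ON CONFLICT ... DO NOTHING` (the backend itself runs this against PostgreSQL with `%s` placeholders and `cur.rowcount`). A minimal sketch with an invented table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE weight_log (
        profile_id TEXT, date TEXT, weight REAL,
        UNIQUE (profile_id, date)
    )
""")

# Second row collides with the first on (profile_id, date) and is skipped
rows = [("p1", "2026-03-18", 84.2), ("p1", "2026-03-18", 99.9), ("p1", "2026-03-19", 84.0)]

imported = 0
for row in rows:
    before = conn.total_changes
    conn.execute("""
        INSERT INTO weight_log (profile_id, date, weight)
        VALUES (?, ?, ?)
        ON CONFLICT (profile_id, date) DO NOTHING
    """, row)
    if conn.total_changes > before:  # row actually inserted → count it
        imported += 1
```

Here `imported` ends up at 2: the conflicting duplicate neither raises nor overwrites, and the existing value wins.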
**Frontend:** new button in SettingsPage:

```
[ZIP exportieren] [JSON exportieren] [Backup importieren]
```

backend/main.py | 245

@@ -1722,3 +1722,248 @@ Datumsformat: YYYY-MM-DD
        media_type="application/zip",
        headers={"Content-Disposition": f"attachment; filename=(unknown)"}
    )

# ── Import ZIP ──────────────────────────────────────────────────
@app.post("/api/import/zip")
async def import_zip(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Import data from a ZIP export file.

    - Validates the export format
    - Imports missing entries only (ON CONFLICT DO NOTHING)
    - Imports photos
    - Returns an import summary
    - Full rollback on error
    """
    pid = get_pid(x_profile_id)

    # Read uploaded file
    content = await file.read()
    zip_buffer = io.BytesIO(content)

    try:
        with zipfile.ZipFile(zip_buffer, 'r') as zf:
            # 1. Validate profile.json
            if 'profile.json' not in zf.namelist():
                raise HTTPException(400, "Ungültiger Export: profile.json fehlt")

            profile_data = json.loads(zf.read('profile.json').decode('utf-8'))
            export_version = profile_data.get('export_version', '1')

            # Stats tracker
            stats = {
                'weight': 0,
                'circumferences': 0,
                'caliper': 0,
                'nutrition': 0,
                'activity': 0,
                'photos': 0,
                'insights': 0
            }

            with get_db() as conn:
                cur = get_cursor(conn)

                try:
                    # 2. Import weight.csv
                    if 'data/weight.csv' in zf.namelist():
                        csv_data = zf.read('data/weight.csv').decode('utf-8-sig')
                        reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
                        for row in reader:
                            cur.execute("""
                                INSERT INTO weight_log (profile_id, date, weight, note, source, created)
                                VALUES (%s, %s, %s, %s, %s, %s)
                                ON CONFLICT (profile_id, date) DO NOTHING
                            """, (
                                pid,
                                row['date'],
                                float(row['weight']) if row['weight'] else None,
                                row.get('note', ''),
                                row.get('source', 'import'),
                                row.get('created') or datetime.now()  # empty CSV cell → now
                            ))
                            if cur.rowcount > 0:
                                stats['weight'] += 1

                    # 3. Import circumferences.csv
                    if 'data/circumferences.csv' in zf.namelist():
                        csv_data = zf.read('data/circumferences.csv').decode('utf-8-sig')
                        reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
                        for row in reader:
                            # Map CSV columns to DB columns
                            cur.execute("""
                                INSERT INTO circumference_log (
                                    profile_id, date, c_waist, c_hip, c_chest, c_neck,
                                    c_arm, c_thigh, c_calf, notes, created
                                )
                                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                                ON CONFLICT (profile_id, date) DO NOTHING
                            """, (
                                pid,
                                row['date'],
                                float(row['waist']) if row.get('waist') else None,
                                float(row['hip']) if row.get('hip') else None,
                                float(row['chest']) if row.get('chest') else None,
                                float(row['neck']) if row.get('neck') else None,
                                float(row['upper_arm']) if row.get('upper_arm') else None,
                                float(row['thigh']) if row.get('thigh') else None,
                                float(row['calf']) if row.get('calf') else None,
                                row.get('note', ''),
                                row.get('created') or datetime.now()  # empty CSV cell → now
                            ))
                            if cur.rowcount > 0:
                                stats['circumferences'] += 1

                    # 4. Import caliper.csv
                    if 'data/caliper.csv' in zf.namelist():
                        csv_data = zf.read('data/caliper.csv').decode('utf-8-sig')
                        reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
                        for row in reader:
                            cur.execute("""
                                INSERT INTO caliper_log (
                                    profile_id, date, sf_chest, sf_abdomen, sf_thigh,
                                    sf_triceps, sf_subscap, sf_suprailiac, sf_axilla,
                                    sf_method, body_fat_pct, notes, created
                                )
                                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                                ON CONFLICT (profile_id, date) DO NOTHING
                            """, (
                                pid,
                                row['date'],
                                float(row['chest']) if row.get('chest') else None,
                                float(row['abdomen']) if row.get('abdomen') else None,
                                float(row['thigh']) if row.get('thigh') else None,
                                float(row['tricep']) if row.get('tricep') else None,
                                float(row['subscapular']) if row.get('subscapular') else None,
                                float(row['suprailiac']) if row.get('suprailiac') else None,
                                float(row['midaxillary']) if row.get('midaxillary') else None,
                                row.get('method', 'jackson3'),
                                float(row['bf_percent']) if row.get('bf_percent') else None,
                                row.get('note', ''),
                                row.get('created') or datetime.now()  # empty CSV cell → now
                            ))
                            if cur.rowcount > 0:
                                stats['caliper'] += 1

                    # 5. Import nutrition.csv
                    if 'data/nutrition.csv' in zf.namelist():
                        csv_data = zf.read('data/nutrition.csv').decode('utf-8-sig')
                        reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
                        for row in reader:
                            cur.execute("""
                                INSERT INTO nutrition_log (
                                    profile_id, date, kcal, protein_g, fat_g, carbs_g, source, created
                                )
                                VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
                                ON CONFLICT (profile_id, date) DO NOTHING
                            """, (
                                pid,
                                row['date'],
                                float(row['kcal']) if row.get('kcal') else None,
                                float(row['protein']) if row.get('protein') else None,
                                float(row['fat']) if row.get('fat') else None,
                                float(row['carbs']) if row.get('carbs') else None,
                                row.get('source', 'import'),
                                row.get('created') or datetime.now()  # empty CSV cell → now
                            ))
                            if cur.rowcount > 0:
                                stats['nutrition'] += 1

                    # 6. Import activity.csv
                    if 'data/activity.csv' in zf.namelist():
                        csv_data = zf.read('data/activity.csv').decode('utf-8-sig')
                        reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
                        for row in reader:
                            cur.execute("""
                                INSERT INTO activity_log (
                                    profile_id, date, activity_type, duration_min,
                                    kcal_active, hr_avg, hr_max, distance_km, notes, source, created
                                )
                                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                                ON CONFLICT DO NOTHING
                            """, (
                                pid,
                                row['date'],
                                row.get('type', 'Training'),
                                float(row['duration_min']) if row.get('duration_min') else None,
                                float(row['kcal']) if row.get('kcal') else None,
                                float(row['heart_rate_avg']) if row.get('heart_rate_avg') else None,
                                float(row['heart_rate_max']) if row.get('heart_rate_max') else None,
                                float(row['distance_km']) if row.get('distance_km') else None,
                                row.get('note', ''),
                                row.get('source', 'import'),
                                row.get('created') or datetime.now()  # empty CSV cell → now
                            ))
                            if cur.rowcount > 0:
                                stats['activity'] += 1

                    # 7. Import ai_insights.json
                    if 'insights/ai_insights.json' in zf.namelist():
                        insights_data = json.loads(zf.read('insights/ai_insights.json').decode('utf-8'))
                        for insight in insights_data:
                            cur.execute("""
                                INSERT INTO ai_insights (profile_id, scope, content, created)
                                VALUES (%s, %s, %s, %s)
                            """, (
                                pid,
                                insight['scope'],
                                insight['result'],
                                insight.get('created') or datetime.now()
                            ))
                            stats['insights'] += 1

                    # 8. Import photos
                    photo_files = [f for f in zf.namelist() if f.startswith('photos/') and not f.endswith('/')]
                    for photo_file in photo_files:
                        # Extract date from filename (format: YYYY-MM-DD_N.jpg)
                        filename = Path(photo_file).name
                        parts = filename.split('_')
                        photo_date = parts[0] if len(parts) > 0 else datetime.now().strftime('%Y-%m-%d')

                        # Generate new ID and path
                        photo_id = str(uuid.uuid4())
                        ext = Path(filename).suffix
                        new_filename = f"{photo_id}{ext}"
                        target_path = PHOTOS_DIR / new_filename

                        # Check if a photo already exists for this date
                        cur.execute("""
                            SELECT id FROM photos
                            WHERE profile_id = %s AND date = %s
                        """, (pid, photo_date))

                        if cur.fetchone() is None:
                            # Write photo file
                            with open(target_path, 'wb') as f:
                                f.write(zf.read(photo_file))

                            # Insert DB record
                            cur.execute("""
                                INSERT INTO photos (id, profile_id, date, path, created)
                                VALUES (%s, %s, %s, %s, %s)
                            """, (photo_id, pid, photo_date, new_filename, datetime.now()))
                            stats['photos'] += 1

                    # Commit transaction
                    conn.commit()

                except Exception as e:
                    # Rollback on any error
                    conn.rollback()
                    raise HTTPException(500, f"Import fehlgeschlagen: {str(e)}")

            return {
                "ok": True,
                "message": "Import erfolgreich",
                "stats": stats,
                "total": sum(stats.values())
            }

    except zipfile.BadZipFile:
        raise HTTPException(400, "Ungültige ZIP-Datei")
    except HTTPException:
        raise  # keep original status codes (400/500) instead of re-wrapping
    except Exception as e:
        raise HTTPException(500, f"Import-Fehler: {str(e)}")

@@ -1,5 +1,5 @@
 import { useState } from 'react'
-import { Save, Download, Trash2, Plus, Check, Pencil, X, LogOut, Shield, Key } from 'lucide-react'
+import { Save, Download, Upload, Trash2, Plus, Check, Pencil, X, LogOut, Shield, Key } from 'lucide-react'
 import { useProfile } from '../context/ProfileContext'
 import { useAuth } from '../context/AuthContext'
 import { Avatar } from './ProfileSelect'

@@ -123,6 +123,73 @@ export default function SettingsPage() {
  // editingId: string ID of profile being edited, or 'new' for new profile, or null
  const [editingId, setEditingId] = useState(null)
  const [saved, setSaved] = useState(false)
  const [importing, setImporting] = useState(false)
  const [importMsg, setImportMsg] = useState(null)

  const handleImport = async (e) => {
    const file = e.target.files?.[0]
    if (!file) return

    if (!confirm(`Backup "${file.name}" importieren? Vorhandene Einträge werden nicht überschrieben.`)) {
      e.target.value = '' // Reset file input
      return
    }

    setImporting(true)
    setImportMsg(null)

    try {
      const formData = new FormData()
      formData.append('file', file)

      const token = localStorage.getItem('bodytrack_token') || ''
      const pid = localStorage.getItem('bodytrack_active_profile') || ''

      const res = await fetch('/api/import/zip', {
        method: 'POST',
        headers: {
          'X-Auth-Token': token,
          'X-Profile-Id': pid
        },
        body: formData
      })

      const data = await res.json()

      if (!res.ok) {
        throw new Error(data.detail || 'Import fehlgeschlagen')
      }

      // Show success message with stats
      const stats = data.stats
      const lines = []
      if (stats.weight > 0) lines.push(`${stats.weight} Gewicht`)
      if (stats.circumferences > 0) lines.push(`${stats.circumferences} Umfänge`)
      if (stats.caliper > 0) lines.push(`${stats.caliper} Caliper`)
      if (stats.nutrition > 0) lines.push(`${stats.nutrition} Ernährung`)
      if (stats.activity > 0) lines.push(`${stats.activity} Aktivität`)
      if (stats.photos > 0) lines.push(`${stats.photos} Fotos`)
      if (stats.insights > 0) lines.push(`${stats.insights} KI-Analysen`)

      setImportMsg({
        type: 'success',
        text: `✓ Import erfolgreich: ${lines.join(', ')}`
      })

      // Refresh data (in case new entries were added)
      await refreshProfiles()

    } catch (err) {
      setImportMsg({
        type: 'error',
        text: `✗ ${err.message}`
      })
    } finally {
      setImporting(false)
      e.target.value = '' // Reset file input
      setTimeout(() => setImportMsg(null), 5000)
    }
  }

  const handleSave = async (form, profileId) => {
    const data = {}

@@ -307,6 +374,55 @@ export default function SettingsPage() {
        </p>
      </div>

      {/* Import */}
      <div className="card section-gap">
        <div className="card-title">Backup importieren</div>
        <p style={{fontSize:13,color:'var(--text2)',marginBottom:12,lineHeight:1.6}}>
          Importiere einen ZIP-Export zurück in <strong>{activeProfile?.name}</strong>.
          Vorhandene Einträge werden nicht überschrieben.
        </p>
        <div style={{display:'flex',flexDirection:'column',gap:8}}>
          {!canExport && (
            <div style={{padding:'10px 12px',background:'#FCEBEB',borderRadius:8,
                fontSize:13,color:'#D85A30',marginBottom:8}}>
              🔒 Import ist für dein Profil nicht freigeschaltet. Bitte den Admin kontaktieren.
            </div>
          )}
          {canExport && (
            <>
              <label className="btn btn-primary btn-full"
                  style={{cursor:importing?'wait':'pointer',opacity:importing?0.6:1}}>
                <input type="file" accept=".zip" onChange={handleImport}
                    disabled={importing}
                    style={{display:'none'}}/>
                {importing ? (
                  <>Importiere...</>
                ) : (
                  <>
                    <Upload size={14}/> ZIP-Backup importieren
                  </>
                )}
              </label>
              {importMsg && (
                <div style={{
                  padding:'10px 12px',
                  background: importMsg.type === 'success' ? '#E1F5EE' : '#FCEBEB',
                  borderRadius:8,
                  fontSize:12,
                  color: importMsg.type === 'success' ? 'var(--accent)' : '#D85A30',
                  lineHeight:1.4
                }}>
                  {importMsg.text}
                </div>
              )}
            </>
          )}
        </div>
        <p style={{fontSize:11,color:'var(--text3)',marginTop:8}}>
          Der Import erkennt automatisch das Format und importiert nur neue Einträge.
        </p>
      </div>

      {saved && (
        <div style={{position:'fixed',bottom:80,left:'50%',transform:'translateX(-50%)',
            background:'var(--accent)',color:'white',padding:'8px 20px',borderRadius:20,